
"NSFWImage" (Built-in Classifier)

Determine whether an image is considered not safe for work.

Classes

This classifier distinguishes two classes: "NotSafe" and "Safe".

Details

Examples


Basic Examples  (2)

Determine whether the input image contains pornographic content (not safe):

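A minimal sketch of the call, assuming img holds the example image (the original inline image is not reproduced here):

In[1]:= Classify["NSFWImage", img]
Out[1]= "NotSafe"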

Obtain the probabilities for different classes:

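The probabilities are returned as an Association; the numeric values below are illustrative placeholders, not actual model output:

In[2]:= Classify["NSFWImage", img, "Probabilities"]
Out[2]= <|"NotSafe" -> 0.97, "Safe" -> 0.03|>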

Obtain a ClassifierFunction for this classifier:

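Calling Classify with only the classifier name returns the classifier itself; its display is abbreviated here:

In[1]:= cf = Classify["NSFWImage"]
Out[1]= ClassifierFunction[ … ]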

Use the classifier function to classify an example:

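A sketch, again assuming img stands in for an example image:

In[2]:= cf[img]
Out[2]= "NotSafe"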

Classify many examples at once:

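A sketch with hypothetical images img1, img2, img3; the resulting class labels are illustrative:

In[3]:= cf[{img1, img2, img3}]
Out[3]= {"NotSafe", "Safe", "NotSafe"}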

Scope  (1)

Load the ClassifierFunction corresponding to the built-in classifier:

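As in the basic examples, the classifier function is obtained from the classifier name alone; its display is abbreviated here:

In[1]:= cf = Classify["NSFWImage"]
Out[1]= ClassifierFunction[ … ]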

Obtain the possible classes:

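A sketch using the Information property "Classes" (ClassifierInformation serves the same purpose in older versions):

In[2]:= Information[cf, "Classes"]
Out[2]= {"NotSafe", "Safe"}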

Options  (2)

ClassPriors  (1)

Use a custom ClassPriors for the possible outputs:

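A sketch assuming img is an example image; the prior values and the resulting class are illustrative:

In[1]:= Classify["NSFWImage", img, ClassPriors -> <|"Safe" -> 0.9, "NotSafe" -> 0.1|>]
Out[1]= "Safe"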

IndeterminateThreshold  (1)

Use a custom IndeterminateThreshold:

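A sketch assuming img; when no class probability exceeds the threshold, Indeterminate is returned:

In[1]:= Classify["NSFWImage", img, IndeterminateThreshold -> 0.9]
Out[1]= Indeterminate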