"NSFWImage" (Built-in Classifier)
Determine whether an image is considered not safe for work.
Classes


Details

- This classifier is based on the neural net model Yahoo Open NSFW Model V1.
- This classifier may download resources that will be stored in your local object store at $LocalBase; these can be listed using LocalObjects[] and removed using ResourceRemove (see the sketch after this list).
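A minimal sketch of inspecting the local object store; which resources appear depends on what has already been downloaded:

  (* list resources cached in the local object store at $LocalBase *)
  LocalObjects[]
  (* a cached resource can then be removed with ResourceRemove
     applied to the corresponding ResourceObject *)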
Examples
Basic Examples (2): Summary of the most common use cases
Determine whether the input image contains pornographic content (not safe):
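A minimal sketch of this call; the original example image is not recoverable, so a standard test image from ExampleData stands in for it, and the returned class label depends on the classifier:

  img = ExampleData[{"TestImage", "Mandrill"}];
  Classify["NSFWImage", img]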

Obtain the probabilities for different classes:
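For instance (again using a stand-in test image), request the "Probabilities" property:

  Classify["NSFWImage", ExampleData[{"TestImage", "Mandrill"}], "Probabilities"]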

Obtain a ClassifierFunction for this classifier:
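A sketch of this step, assuming the built-in classifier is retrieved by passing its name to Classify:

  cf = Classify["NSFWImage"]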

Use the classifier function to classify an example:
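For example, applying the cf obtained above to a stand-in test image:

  cf[ExampleData[{"TestImage", "House"}]]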

Classify many examples at once:
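A sketch using the same cf on a list of stand-in test images, which yields one class label per image:

  cf[{ExampleData[{"TestImage", "Mandrill"}], ExampleData[{"TestImage", "House"}]}]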

Scope (1): Survey of the scope of standard use cases
Load the ClassifierFunction corresponding to the built-in classifier:
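A minimal sketch, assuming the classifier is loaded by name and then applied to a stand-in test image:

  cf = Classify["NSFWImage"];
  cf[ExampleData[{"TestImage", "Peppers"}]]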

Options (2): Common values & functionality for each option
ClassPriors (1)
Use a custom ClassPriors for the possible outputs:
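A sketch of setting priors; the class labels used as association keys below are an assumption, and the prior values are purely illustrative:

  img = ExampleData[{"TestImage", "Mandrill"}];
  (* hypothetical class labels; replace with the labels reported by
     Classify["NSFWImage", img, "Probabilities"] *)
  Classify["NSFWImage", img, ClassPriors -> <|False -> 0.9, True -> 0.1|>]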

IndeterminateThreshold (1)
Use a custom IndeterminateThreshold:
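A sketch with an illustrative threshold; the result is Indeterminate unless the highest class probability exceeds the threshold:

  Classify["NSFWImage", ExampleData[{"TestImage", "Mandrill"}], IndeterminateThreshold -> 0.9]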
