NudeNet: Neural Nets for Nudity Classification, Detection and selective censoring

Fork differences:

  • Only the default classifier is available.
  • The classifier no longer emits the "Initializer block1_conv1_bn/keras_learning_phase:0 appears in graph inputs and will not be treated as constant value/weight" warning on load.
  • It only works on images.
  • The classifier is included in the project itself.
  • Only the v2 model (the original repo's default) is available, so upstream's v2 is simply the main model here.

An uncensored version of the demo image can be found at https://i.imgur.com/rga6845.jpg (NSFW)

Classifier classes:

Class name   Description
safe         Image is not sexually explicit
unsafe       Image is sexually explicit

As self-hostable API service

# Classifier
docker run -it -p8080:8080 notaitech/nudenet:classifier

# See fastDeploy-file_client.py for running predictions via fastDeploy's REST endpoints 
wget https://raw.githubusercontent.com/notAI-tech/fastDeploy/master/cli/fastDeploy-file_client.py
# Single input
python fastDeploy-file_client.py --file PATH_TO_YOUR_IMAGE

# Client side batching
python fastDeploy-file_client.py --dir PATH_TO_FOLDER --ext jpg

Note: a golang example is available in notAI-tech#63 (comment), thanks to Preetham Kamidi
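
The bundled fastDeploy-file_client.py is the reference client. For orientation only, a direct HTTP call might look like the sketch below; the /sync endpoint path and the base64 payload shape are assumptions about a typical fastDeploy deployment, not a documented contract of this image, so check the client script for the exact route and format.

import base64
import requests

# Read and base64-encode the image, as the fastDeploy file client does.
with open("path_to_your_image.jpg", "rb") as f:
    encoded = base64.b64encode(f.read()).decode()

# Assumed synchronous prediction endpoint on the container started above;
# fastDeploy-file_client.py is authoritative for route and payload.
response = requests.post(
    "http://localhost:8080/sync",
    json={"data": {"your_image.jpg": encoded}},
    timeout=30,
)
print(response.json())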

As Python module

Installation:

pip install -U git+https://github.com/platelminto/NudeNet

Classifier Usage:

# Import module
from nudenet import NudeClassifier

# initialize classifier (downloads the checkpoint file automatically the first time)
classifier = NudeClassifier()

# Classify single image
classifier.classify('path_to_image_1')
# Returns {'path_to_image_1': {'safe': PROBABILITY, 'unsafe': PROBABILITY}}
# Classify multiple images (batch prediction)
# batch_size is optional; defaults to 4
classifier.classify(['path_to_image_1', 'path_to_image_2'], batch_size=BATCH_SIZE)
# Returns {'path_to_image_1': {'safe': PROBABILITY, 'unsafe': PROBABILITY},
#          'path_to_image_2': {'safe': PROBABILITY, 'unsafe': PROBABILITY}}
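
The returned probabilities can be thresholded to act on the results. A minimal sketch, assuming a folder of images; the 0.5 cutoff and the extension filter are illustrative choices for this example, not part of the API:

from pathlib import Path
from nudenet import NudeClassifier

classifier = NudeClassifier()

# Collect image paths; the extension filter is illustrative.
images = [str(p) for p in Path("images").iterdir()
          if p.suffix.lower() in {".jpg", ".jpeg", ".png"}]

# Batch-classify; returns {path: {'safe': p, 'unsafe': p}, ...}
results = classifier.classify(images, batch_size=4)

# Flag anything whose 'unsafe' probability crosses the illustrative 0.5 cutoff.
flagged = [path for path, scores in results.items() if scores["unsafe"] >= 0.5]
print(f"{len(flagged)} of {len(images)} images flagged as unsafe")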
