How to recognize adult content programmatically?

I am currently developing a website for a client. It consists of users who can upload images that will be displayed in the gallery on the site.

The problem is that when a user uploads an image, we obviously need to check that it is safe for the website (not pornographic or otherwise explicit). However, my client does not want to manually approve each uploaded image, as that would be time-consuming and user images would not appear online instantly.

I am writing my code in PHP. If necessary, I can switch to ASP.NET or C#. Is there any way this can be done?

7 answers

2019 update

Much has changed since this original answer was posted in 2013, the main thing being machine learning. Several libraries and APIs are now available for programmatic adult-content detection:

Google Cloud Vision API, which uses the same models Google uses for SafeSearch.

NSFWJS uses TensorFlow.js and claims to achieve ~90% accuracy; it is open source under the MIT license.

Yahoo has a solution called Open NSFW, licensed under the BSD 2-clause license.
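To give a feel for what these APIs return: Google's Cloud Vision SafeSearch annotation, for example, reports graded likelihood labels rather than a simple boolean. Below is a minimal Python sketch of the thresholding step only; the commented-out client call assumes the google-cloud-vision package and valid credentials, and the POSSIBLE cutoff is my own assumption, not a Google recommendation.

```python
# Deciding from SafeSearch-style likelihood labels. The real call would
# look roughly like (requires google-cloud-vision and credentials):
#   from google.cloud import vision
#   client = vision.ImageAnnotatorClient()
#   annotation = client.safe_search_detection(image=...).safe_search_annotation
# Here we only model the decision step on the returned labels.

LIKELIHOOD_ORDER = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def is_image_safe(adult: str, racy: str, threshold: str = "POSSIBLE") -> bool:
    """Reject the upload if either label reaches the threshold likelihood."""
    limit = LIKELIHOOD_ORDER.index(threshold)
    return (LIKELIHOOD_ORDER.index(adult) < limit
            and LIKELIHOOD_ORDER.index(racy) < limit)

print(is_image_safe("VERY_UNLIKELY", "UNLIKELY"))  # True
print(is_image_safe("LIKELY", "VERY_UNLIKELY"))    # False
```

How strict to make the threshold is a product decision: lowering it to UNLIKELY rejects more borderline images at the cost of more false positives.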

Original 2013 answer

There is a JavaScript library called nude.js for this, although I have never used it. Here is a demo of it.

There is also PORNsweeper.

Another option is to outsource the moderation work using something like Amazon Mechanical Turk, a crowdsourced platform that "enables computer programs to co-ordinate the use of human intelligence to perform tasks which computers are unable to do". You would basically pay a small amount per moderated item and have an actual human moderate the content for you.

The only other solution I can think of is to make the images community-moderated: users flag inappropriate entries/images, and if nobody wants to review them manually, flagged images are simply removed after a certain number of flags.
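The flag-and-threshold scheme just described can be sketched in a few lines. This is a toy in-memory version; the dict, the flag_image helper, and the limit of three flags are all illustrative assumptions, and a real site would persist flags in its database.

```python
# Minimal sketch of flag-based community moderation.
FLAG_LIMIT = 3  # number of flags after which an image is auto-hidden (assumption)

flags = {}  # image_id -> number of flags received so far

def flag_image(image_id: str) -> bool:
    """Record one flag; return True if the image should now be hidden."""
    flags[image_id] = flags.get(image_id, 0) + 1
    return flags[image_id] >= FLAG_LIMIT

# Three different users flag the same upload:
hidden = [flag_image("img_42") for _ in range(3)]
print(hidden)  # [False, False, True]
```

A hidden image could then go into a manual review queue rather than being deleted outright, so a few malicious flags cannot permanently remove legitimate content.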



There is a free API that detects adult content (porn, NSFW):

https://market.mashape.com/purelabs/sensitive-image-detection

We use it in our production environment, and I would say it works very well. There are some false positives, though; they seem to prefer marking an image as unsafe when they are not sure.


If you are looking for an API-based solution, you can check out Sightengine.com.

It is an automated solution for detecting things like adult content, violence, celebrities, etc. in images and videos.

Here is an example in PHP using the SDK:

<?php
$client = new SightengineClient('YourApplicationID', 'YourAPIKey');
$output = $client->check('nudity')->image('https://sightengine.com/assets/img/examples/example2.jpg');

Then the result will return the classification:

{
  "status": "success",
  "request": {
    "id": "req_VjyxevVQYXQZ1HMbnwtn",
    "timestamp": 1471762434.0244,
    "operations": 1
  },
  "nudity": {
    "raw": 0.000757,
    "partial": 0.000763,
    "safe": 0.999243
  },
  "media": {
    "id": "med_KWmB2GQZ29N4MVpVdq5K",
    "uri": "https://sightengine.com/assets/img/examples/example2.jpg"
  }
}

Take a look at the documentation for more details: https://sightengine.com/docs/#nudity-detection (disclaimer: I work there)
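Since the response is plain JSON, acting on it takes only a few lines in any language. Here is a Python sketch using the sample response above; the accept_upload helper and the 0.5 cutoff on the safe score are my own assumptions, not part of the Sightengine API.

```python
import json

# The sample response shown above, abridged to the fields we act on.
response = json.loads("""
{
  "status": "success",
  "nudity": { "raw": 0.000757, "partial": 0.000763, "safe": 0.999243 }
}
""")

def accept_upload(result: dict, min_safe: float = 0.5) -> bool:
    """Accept only successful checks whose 'safe' score clears the cutoff."""
    return (result.get("status") == "success"
            and result.get("nudity", {}).get("safe", 0.0) >= min_safe)

print(accept_upload(response))  # True
```

Rejected uploads could be queued for human review instead of being discarded, which softens the impact of false positives.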


Microsoft Azure has a very cool API called Computer Vision, which you can use for free (either through the UI or programmatically) and which has tons of documentation, including for PHP.

It has some surprisingly accurate (and sometimes humorous) results.

Besides detecting adult and "racy" material, it will read text, guess ages, determine dominant colors, and so on.

You can try it at azure.microsoft.com.

Sample output from a "racy" image:

Description: { "tags": [ "person", "man", "young", "woman", "holding", "surfing", "board", "hair", "laying", "boy", "standing", "water", "cutting", "white", "beach", "people", "bed" ], "captions": [ { "text": "a man and a woman taking a selfie", "confidence": 0.133149087 } ] }
Tags: [ { "name": "person", "confidence": 0.9997446 }, { "name": "man", "confidence": 0.9587285 }, { "name": "wall", "confidence": 0.9546831 }, { "name": "swimsuit", "confidence": 0.499717563 } ]
Image format: "Jpeg"
Image dimensions: 1328 x 2000
Clip art type: 0
Line drawing type: 0
Black and white: false
Adult content: true
Adult score: 0.9845981
Racy: true
Racy score: 0.964191854
Categories: [ { "name": "people_baby", "score": 0.4921875 } ]
Faces: [ { "age": 37, "gender": "Female", "faceRectangle": { "top": 317, "left": 1554, "width": 232, "height": 232 } } ]
Dominant color background: "Brown"
Dominant color foreground: "Black"
Accent color: #0D8CBE

It all depends on the level of accuracy you are looking for. Simple skin-tone detection (as in nude.js) will get you 60-80% accuracy on a large sample set; for something more accurate, say 90-95%, you will need a specialized computer vision system with a model that is retrained over time. For the latter, you can check out http://clarifai.com or https://scanii.com (which I work on).
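To make that accuracy trade-off concrete, here is roughly what a naive skin-tone detector boils down to. This is a toy Python sketch, not nude.js itself; the RGB rule is a commonly cited skin-classification heuristic, and the 40% skin threshold is an arbitrary assumption.

```python
def is_skin_pixel(r: int, g: int, b: int) -> bool:
    # A widely cited RGB skin-tone heuristic; deliberately crude.
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def looks_nsfw(pixels, skin_fraction: float = 0.4) -> bool:
    """Flag an image if more than `skin_fraction` of its pixels look like skin."""
    if not pixels:
        return False
    skin = sum(1 for p in pixels if is_skin_pixel(*p))
    return skin / len(pixels) > skin_fraction

# A mostly skin-toned "image" versus a mostly blue one:
skin_img = [(210, 150, 120)] * 80 + [(30, 30, 200)] * 20
blue_img = [(30, 30, 200)] * 100
print(looks_nsfw(skin_img), looks_nsfw(blue_img))  # True False
```

The weakness is obvious from the code: beach photos, portraits, and wood textures all score high on skin fraction, which is exactly why this approach tops out well below the accuracy of a trained vision model.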


The example below will not give you 100% accurate results, but it should help at least a little and works out of the box.

<?php
$url = 'http://server.com/image.png';
$data = json_decode(file_get_contents('http://api.rest7.com/v1/detect_nudity.php?url=' . $url));
if (@$data->success !== 1) {
    die('Failed');
}
echo 'Contains nudity? ' . $data->nudity . '<br>';
echo 'Nudity percentage: ' . $data->nudity_percentage . '<br>';

I recently found that I needed a system to detect adult content.

In the end, I created this project, an API that can easily be deployed to Heroku (or anywhere else you can run a Docker container) and lets you scan images for adult content.

It is based on the open-source open_nsfw model, which has been trained to detect images that are not suitable for work. The project above is basically a Python API on top of open_nsfw, ready for deployment.

