How to determine if an image is explicit

I am looking for a way to determine whether an image is explicit (not safe for work) or not.

I'm currently looking for an API that can do this, but so far I have not had any success.

One idea was to use the Google Search API: query the image's URL and check whether it still appears in the results when SafeSearch is enabled. However, this will not work for an image that was added before the crawler has indexed it.
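To make the idea concrete, here is a minimal sketch of building such a probe against the Google Custom Search JSON API with SafeSearch turned on. The `api_key` and `cx` (custom search engine ID) values are placeholders you would obtain from the Google developer console; whether an unindexed or filtered image is distinguishable from the response is exactly the limitation noted above.

```python
from urllib.parse import urlencode

def build_safesearch_probe(image_url, api_key, cx):
    """Build a Custom Search JSON API request URL that searches for the
    image with SafeSearch filtering enabled. If the image is indexed but
    missing from the filtered results, it may have been classified as
    explicit. api_key and cx are placeholder credentials."""
    params = {
        "key": api_key,
        "cx": cx,
        "q": image_url,
        "searchType": "image",  # restrict results to image search
        "safe": "active",       # turn SafeSearch filtering on
    }
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)

# Example (no request is actually sent here):
url = build_safesearch_probe("http://example.com/photo.jpg", "MY_KEY", "MY_CX")
```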

Alternatively, I'm looking for pointers on what to look for in an image to judge how SFW it is. Any suggestions regarding shapes, colors, or patterns?

3 answers

As promised: a SafeSearch paper from Google researchers, along with a patent for your research, both obtained from this blog post.


One of my colleagues led the development of porn-classification technology at one of the largest Internet companies. I'll share what he told me about building the filter.

  • The definition of what counts as explicit differs across jurisdictions, so what is considered explicit in the United States may not be in other parts of the world, and vice versa. Models therefore have to take the user's locale into account.
  • , , -. , , , , .
  • - , . .
  • , 8 , , . - , .
  • . PhD , , , .



Related academic work includes papers by 林語君 and 曾宏偉, and by Rigan Ap-apid on nudity detection.

There is also a commercial product, the Porn Detection Stick, whose maker describes it this way:

"The Porn Detection Stick uses advanced image analysis algorithms that classify images as potentially dangerous by defining facial features, flesh colors, image backs, body shapes, etc."
