PHP script or API for filtering inappropriate content

I am launching a website where aspiring authors can publish their writing. Naturally, some of these entries may be "sensitive" in nature: sometimes they contain a lot of abusive language, sometimes they are malicious, sometimes they deal with suicide. This is not the norm, but it happens.

Of course, we want our users to be free to post the content they like, but the problem is with advertisers, namely Google AdSense, which regularly sends us warnings about pieces of content that contain too much swearing or are defamatory. The latest was a piece someone wrote about cutting themselves, which Google considered too "tragic."

In general, I could write a script that scans the text and flags abusive words, racist terms, anti-gay rhetoric, and so on (a rough sketch of that approach is below). But for the life of me I can't figure out how to detect a "tragic" text.
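For the word-list part, a minimal sketch (the word list, threshold, and function name here are placeholders I made up for illustration, not from any particular library) might look like this:

    <?php
    // Naive word-list filter: counts how many flagged terms appear in a text.
    // The list below is a placeholder; a real deployment would use a much
    // larger, curated list, probably with per-term weights.
    function countFlaggedTerms(string $text, array $flaggedTerms): int
    {
        $hits = 0;
        foreach ($flaggedTerms as $term) {
            // \b word boundaries so "ass" does not match "class"; /i for case-insensitivity
            $pattern = '/\b' . preg_quote($term, '/') . '\b/i';
            $hits += preg_match_all($pattern, $text);
        }
        return $hits;
    }

    $flagged   = ['badword1', 'badword2', 'slur1'];          // placeholder list
    $entryText = 'the submitted entry text goes here';

    if (countFlaggedTerms($entryText, $flagged) > 3) {
        // hold the entry for human review instead of publishing it directly
    }

Something like this catches obvious profanity, but it is exactly the "tragic" category that a word list cannot express.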

So, the question is twofold:

  • Is there a PHP class / function / API that does a good job of detecting inappropriate content?
  • Any ideas on how to automatically detect suicidal or self-harming content?
1 answer

You could train a Bayesian filter on examples of what "tragic" content looks like: much like a spam filter, but for the categories of content you want to separate out. I have used this PHP library and it works well: https://github.com/Dachande663/PHP-Classifier

It also fits well into a moderation workflow with human flagging.
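As a rough illustration of the idea (a hand-rolled sketch, not the PHP-Classifier API; check that library's README for its actual interface), a minimal naive Bayes text classifier could look like this:

    <?php
    // Minimal naive Bayes text classifier, for illustration only.
    // Train it on hand-labelled examples ("tragic" vs "ok"), then score new entries.
    class TinyBayes
    {
        private array $wordCounts = [];   // label => word => count
        private array $docCounts  = [];   // label => number of training docs
        private array $vocab      = [];   // all words seen during training

        private function tokenize(string $text): array
        {
            preg_match_all("/[a-z']+/i", mb_strtolower($text), $m);
            return $m[0];
        }

        public function train(string $label, string $text): void
        {
            $this->docCounts[$label] = ($this->docCounts[$label] ?? 0) + 1;
            foreach ($this->tokenize($text) as $word) {
                $this->wordCounts[$label][$word] = ($this->wordCounts[$label][$word] ?? 0) + 1;
                $this->vocab[$word] = true;
            }
        }

        public function classify(string $text): string
        {
            $totalDocs = array_sum($this->docCounts);
            $vocabSize = count($this->vocab);
            $best = '';
            $bestScore = -INF;
            foreach ($this->docCounts as $label => $docs) {
                // log prior + sum of log likelihoods with Laplace smoothing
                $score = log($docs / $totalDocs);
                $labelTotal = array_sum($this->wordCounts[$label] ?? []);
                foreach ($this->tokenize($text) as $word) {
                    $count = $this->wordCounts[$label][$word] ?? 0;
                    $score += log(($count + 1) / ($labelTotal + $vocabSize));
                }
                if ($score > $bestScore) {
                    $bestScore = $score;
                    $best = $label;
                }
            }
            return $best;
        }
    }

    $clf = new TinyBayes();
    $clf->train('tragic', 'i want to hurt myself and cannot go on');
    $clf->train('ok', 'my first short story about a summer holiday');
    // ...you will need many more labelled examples for this to be useful

    $label = $clf->classify('an entry submitted by a user');
    // if $label === 'tragic', queue the entry for human review

The important part is the training data: posts your moderators have already flagged as "tragic" are the examples the filter learns from, which is why it pairs well with human moderation.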

