I am launching a website where aspiring authors can publish their writing. Naturally, some of these submissions may be "sensitive" in nature: sometimes they contain a lot of abusive language, sometimes they are malicious, and sometimes they deal with suicide. This is not the norm, but it happens.
Of course, we want our users to be free to post the content they like, but the problem is with advertisers, namely Google AdSense, which regularly sends us warnings about pieces of content that contain too much swearing or are defamatory. The latest was a piece someone wrote about cutting themselves, which Google considers too "tragic."
In principle, I could write scripts that scan the text for abusive words, racist terms, anti-gay rhetoric, and so on. But I can't for the life of me figure out how to detect "tragic" text.
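For reference, here is roughly the kind of naive word-list scan I have in mind (the blacklist terms and the threshold are made-up placeholders, not a real list I use):

```php
<?php
// Counts how many blacklisted terms appear in a piece of text.
// A post is flagged for manual review when the count passes a threshold.
function countFlaggedTerms(string $text, array $blacklist): int
{
    $count = 0;
    foreach ($blacklist as $term) {
        // \b word boundaries so e.g. "ass" doesn't match inside "assess";
        // preg_quote escapes any regex metacharacters in the term.
        $pattern = '/\b' . preg_quote($term, '/') . '\b/i';
        $count += preg_match_all($pattern, $text);
    }
    return $count;
}

$blacklist = ['badword1', 'badword2']; // placeholder terms
$post = 'Some user-submitted text...';

if (countFlaggedTerms($post, $blacklist) >= 3) { // arbitrary threshold
    echo "Flag this post for manual review\n";
}
```

This works tolerably for profanity, but a keyword approach like this obviously can't tell a "tragic" personal account apart from an innocuous one.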
So the question is two-fold:
- Is there a PHP class, function, or API that does a good job of detecting inappropriate content?
- Any ideas on how to automatically detect suicidal or self-harming content?