Google’s PC Police Algorithm

The PC police are expanding their watch over the Internet. No longer will you have true freedom of speech, as Google and other search engines are working to block your toxic words from being published.

I tested Google’s new algorithm to see if my word choices would be blocked. Here is a sentence I wrote that was 2% likely to be perceived as toxic.

“Those who accept media bias without consideration find themselves following unhealthy trends.”

I then decided to make the comment more opinionated to grab the attention of the reader and found my words were 97% likely to be perceived as toxic.

“Those who accept media bias without consideration find themselves following idiots.”

Here is the winning version of my statement that was 0% likely to be perceived as toxic.

“Those who accept media bias without consideration find themselves following trends.”

I next tried a few religious comments. The following statement was 34% likely to be perceived as toxic.

“Shows about Jews should be banded from the media.”

After changing the word “banded” to “band” (the intended word appears to be “banned”), the statement was 18% likely to be perceived as toxic.

“Shows about Jews should be band from the media.”

I then swapped the word “Jews” for “Muslim” and then “Christian,” which dropped the likelihood of the statement being perceived as toxic to 1% in each case.
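The post doesn’t name the scoring tool, but Google’s Perspective API (announced in early 2017) exposes a comment-analysis endpoint that returns exactly this kind of toxicity probability. Here is a minimal sketch of what one of the requests above might look like, assuming that API; the placeholder key and the example sentence are illustrative.

```python
import json

# Assumed endpoint: Google's Perspective API comment analyzer.
# "YOUR_API_KEY" is a hypothetical placeholder, not a real credential.
ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/"
    "comments:analyze?key=YOUR_API_KEY"
)

def toxicity_request(text):
    """Build the JSON body asking for a TOXICITY score for `text`."""
    return {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
    }

payload = toxicity_request(
    "Those who accept media bias without consideration "
    "find themselves following trends."
)
body = json.dumps(payload)

# An actual call would POST `body` to ANALYZE_URL and read the returned
# score, e.g. response["attributeScores"]["TOXICITY"]["summaryScore"]["value"],
# which is the 0–1 probability reported as a percentage in this post.
```

Each sentence variant above would be sent as its own request, and the returned summary score (0.02, 0.97, 0.0, and so on) is what the percentages in this experiment reflect.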

It was apparent that the algorithm is based on machine learning, which draws on potentially biased sources. The more sources labeling certain words as toxic, the stronger the learned bias being policed becomes.

In other words, if you fill the Internet with documents, stories and news briefs stating how hateful the word “gismo” is, you can actually shift the algorithm toward classifying any use of the word as toxic.
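This poisoning effect can be shown with a toy model. In the sketch below, a word’s “toxicity” is simply the fraction of training documents containing it that are labeled toxic; real systems are far more sophisticated, but the mechanism is the same. The corpus and the `word_toxicity` helper are invented for illustration.

```python
def word_toxicity(docs, word):
    """docs: list of (text, is_toxic) pairs.
    Returns the fraction of docs containing `word` that are labeled toxic."""
    containing = [toxic for text, toxic in docs if word in text.split()]
    return sum(containing) / len(containing) if containing else 0.0

corpus = [
    ("the gismo works great", False),
    ("what a lovely gismo", False),
    ("you are an idiot", True),
]
before = word_toxicity(corpus, "gismo")  # 0.0 -- only benign usage so far

# Flood the corpus with documents that use "gismo" in toxic contexts.
corpus += [("gismo people are awful", True)] * 1000
after = word_toxicity(corpus, "gismo")   # ~0.998 -- the word now looks toxic

print(before, after)
```

The word itself never changed meaning; only the volume of labeled examples around it did, which is exactly the leverage the paragraph above describes.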

While it’s unlikely a group of caring people will produce 20 million articles treating the word “gismo” as a hate word just to change algorithm results, some might consider sidelining their competition by turning its important phrases into hate words.

I think we’re at a turning point and need to leave ethical and moral decisions to man, not machines. Then again, can you really trust them?

© 2017 by CJ Powers