
Allowing AI to “Shape Public Discourse” is Dangerous to Humanity

(by Igor Chudov | Substack) – Last August, I reported on a WEF agenda article proposing to create an AI system that would search the entire Internet for wrong and dangerous ideas, broadly defined by the WEF as COVID misinformation, hate, conspiracy theories, climate change denial, and more.

This quote from the WEF’s agenda article explains its intentions:

While AI provides speed and scale and human moderators provide precision, their combined efforts are still not enough to proactively detect harm before it reaches platforms. To achieve proactivity, trust and safety teams must understand that abusive content doesn’t start and stop on their platforms. Before reaching mainstream platforms, threat actors congregate in the darkest corners of the web to define new keywords, share URLs to resources and discuss new dissemination tactics at length. These secret places where terrorists, hate groups, child predators and disinformation agents freely communicate can provide a trove of information for teams seeking to keep their users safe.

My post about the WEF’s plans was entirely fact-based and used the WEF’s agenda article as its main source. It was not a far-fetched conspiracy theory built on a concoction of disjointed facts pulled from various sources. I am not in the business of creating such theories! I only report on current news – even when the news is crazy – and try to explain it in plain and accurate terms.

And yet, even though the WEF said it, the idea of an AI engine proactively scanning websites for undesirable ideas seemed so fanciful that it was almost impossible to imagine it being implemented.

Until 2023, that is.

Now, Google is developing an AI-based tool to offer a “cross-service database of terrorist items,” with the help of the United Nations-supported “Tech Against Terrorism.”
