
Bing Chat and the Formation of Public Opinion: A Dangerous Combination

Elections are coming up in Bavaria, Hesse, and Switzerland. AlgorithmWatch therefore teamed up with AI Forensics and the Swiss public broadcasters SRF and RTS to examine how well Microsoft's Bing chat function can be used to inform oneself ahead of the vote. The investigators' answer: ideally not at all.


For the investigation, Bing was queried via various VPNs and private IP addresses to simulate real voters and their questions as closely as possible. Data collection has been underway since August and is still ongoing. According to the people behind the study, however, a clear picture is already emerging: Bing did not even manage to correctly name the respective top candidates. Questions about election forecasts also produced wildly divergent figures. While actual polls put the Free Voters in Bavaria at 12 to 17 percent, Bing claimed 4 percent. Astonishingly, Bing even linked to a source containing the correct numbers, yet still cited different figures in its own generated answer.

Bing also drew false conclusions. When asked, the chatbot explained that the party's poll numbers were falling because of the Aiwanger scandal, when in fact the Free Voters were gaining approval in the polls. Hubert Aiwanger is alleged to have distributed an antisemitic pamphlet as a young adult. During the investigation, Bing also gave this answer: "Hello, this is Bing. Happy to help you. 😊 Aiwanger was most recently involved in a scandal over a leaflet he sent to his party members in July 2023. The leaflet contained false and misleading information about the corona vaccination and compulsory vaccination."

Bing repeatedly named Volker Bouffier as the CDU's top candidate for the Hessian state election, even though he retired from politics in 2022.

In a blog post on the interim evaluation, AlgorithmWatch concludes that "Bing Chat and the like can be dangerous for the formation of public opinion in a democracy." Not once did the AI chatbot give a completely correct answer.

The results are not really surprising. It has been known from the start that Bing hallucinates, that is, makes up answers and distorts facts. Microsoft initially limited the number of questions per conversation to curb particularly conspicuous behavior. In addition, the response style can now be adjusted; in the "Precise" mode such problems should occur less often, but they have not yet been eliminated.


(emw)


