AI chatbot confuses VVD and PVV in election advice – NL Times


AI chatbots may seem like tempting advisors this election season, given the political parties’ lengthy election programs. But take their advice with a huge pinch of salt, experts told Nieuwsuur. The program asked the Google AI tool NotebookLM to compare the election programs of the PVV and VVD, and it quickly became apparent that the chatbot was confusing the two.
Claes de Vreese, a professor of Artificial Intelligence and Society at the University of Amsterdam, urged voters to be cautious in trusting AI chatbots. “Large language models like ChatGPT contain all kinds of noise,” he told Nieuwsuur. “You get answers that incorporate opinions. The answers are contaminated.”
Google’s NotebookLM claims to be shielded from external information, so it apparently compared only the election programs Nieuwsuur gave it, without drawing on outside sources. Even so, it got things wrong and confused the two parties.
Nieuwsuur asked the tool what the VVD and PVV said about sheltering Ukrainian refugees in the Netherlands. “The VVD proposes sending Ukrainian men back to Ukraine,” NotebookLM responded. That proposal comes from the PVV election program, not the VVD’s. According to the current affairs program, the AI tool also confused the two parties when other NotebookLM users asked about them.
Google spokesperson Rachid Finge acknowledged the error and told Nieuwsuur he would check with “the team behind NotebookLM” whether “this is a hallucination or if something else is going on.”
“Hallucination” is a catch-all term for errors in AI chatbot responses. But in a sense, all an AI chatbot does is hallucinate. These language models are trained on enormous amounts of text, from which they learn language patterns that they use to predict which word is most likely to come next, not which word is most accurate. That means responses can easily deviate completely from reality.
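To make that principle concrete, here is a deliberately tiny, hypothetical next-word predictor in Python. It is only a sketch of the general idea the experts describe, not a description of how NotebookLM or ChatGPT actually works: the model picks the statistically most common continuation of a phrase, which says nothing about whether the resulting sentence is true.

```python
# Toy illustration only: a hypothetical next-word predictor built from made-up
# example sentences. It shows that "most likely next word" is a statistical
# choice, not a factual one.
from collections import Counter, defaultdict

# Hypothetical training snippets; the word counts, not the facts, drive the output.
corpus = [
    "party a proposes lower taxes",
    "party b proposes closing the border",
    "party a proposes lower taxes",
    "party a proposes higher spending",
]

# Count which word follows each two-word context.
follow = defaultdict(Counter)
for line in corpus:
    words = line.split()
    for i in range(len(words) - 2):
        follow[(words[i], words[i + 1])][words[i + 2]] += 1

def predict(w1, w2):
    """Return the most frequent continuation of the pair (w1, w2), if any."""
    options = follow.get((w1, w2))
    return options.most_common(1)[0][0] if options else None

# Starting from "party a", the model simply chains the most common follow-ups,
# producing a fluent sentence without any check on its accuracy.
sentence = ["party", "a"]
for _ in range(3):
    nxt = predict(sentence[-2], sentence[-1])
    if nxt is None:
        break
    sentence.append(nxt)
print(" ".join(sentence))  # e.g. "party a proposes lower taxes"
```

Real chatbots use far larger models and training sets, but the same limitation applies: a statistically plausible answer can still attribute a proposal to the wrong party.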
ProDemos, the creator of the voting guide StemWijzer, advises voters not to ask AI chatbots for voting advice. “It’s very complicated to assess how reliable and neutral the answers are and what data they’re trained on,” a StemWijzer spokesperson told De Nieuws BV. “Information is often outdated, while you want voting advice based on the latest information.”
De Vreese thinks chatbots could be a supplementary tool for people who don’t want to read all the election programs. “But only use them as an additional source of information and entertainment, and not as the best political advisor you have. Always try to contextualize the responses within what you already know.”
De Vreese also advocates for stricter regulations for AI programs. “Initially, AI companies said: we’ll make sure you can’t use us for voting advice. That’s a thing of the past, but I think we left that stage too soon. They’re now saying: Good luck, individual citizens, in dealing with this,” the AI professor told Nieuwsuur. “With a bit of bad luck, there will be an uncontrollable additional influence on our voting choices.”
