ChatGPT Hit With Lawsuit For Portraying A Man As A Child Murderer

ChatGPT is becoming for AI chatbots what Google became for online search. The service is the most popular in its segment, consistently topping the free-apps chart in Apple’s App Store, for example. However, its run has not been controversy-free. Due to a serious error in one of its outputs, a man has filed a lawsuit against ChatGPT after the chatbot claimed he had murdered his children.

You may not remember, but there was a time when ChatGPT didn’t have internet access. Each platform update also brought a more recent knowledge cutoff for its training data. So, if you asked about anything that happened after that cutoff date, you would likely get a wrong answer or an error message.

Man files lawsuit against ChatGPT for claiming he murdered his children

During that time, Arve Hjalmar Holmen asked the chatbot for everything it knew about him. The man had an unpleasant surprise when ChatGPT claimed he was convicted of murdering his children, in addition to attempting to kill a third. The platform also mentioned that Holmen was serving a 21-year sentence in Norway.

Apparently, ChatGPT suffered “hallucinations” while generating the output. Hallucinations occur when an AI platform “invents” false information and presents it as real, undermining the reliability of its output. In this case, the chatbot allegedly mixed the fabricated claims with accurate details about the man’s life: the reply correctly identified Holmen’s hometown, the number of his children, and their genders.

It’s possible that the lack of internet access at the time prevented ChatGPT from performing further checks before sending the output. The platform could also have conflated Holmen with another person who has a similar name. Either way, the response clearly did not sit well with him.

Noyb, an Austrian privacy advocacy group, filed the complaint on Holmen’s behalf with Datatilsynet, Norway’s data protection authority, accusing OpenAI of portraying him as a convicted murderer. The complaint claims that OpenAI’s service violated EU data privacy requirements.

ChatGPT’s output allegedly violated the EU’s data privacy requirements

“The GDPR is clear. Personal data must be accurate. And if it isn’t, users have the right to have it changed to reflect the truth,” stated Joakim Söderberg, a lawyer at Noyb. “Showing ChatGPT users a tiny disclaimer that the chatbot can make mistakes clearly isn’t enough. You can’t just spread false information and, in the end, add a small disclaimer saying that everything you said may just not be true.”

Noyb is asking the regulator to fine OpenAI and to order the removal of the defamatory information about Holmen. It also demands improvements to ChatGPT to prevent similar problems. That said, OpenAI may have addressed the latter point long ago: the same prompt that once listed Holmen as a child murderer now only returns news about the lawsuit.
