ChatGPT reportedly accused an innocent man of murdering his children




More than two years have passed since ChatGPT exploded onto the world stage, and for all the ways OpenAI has advanced the technology, several obstacles remain. One of the biggest: hallucinations, or confidently presenting false information as fact. Now the Austrian advocacy group Noyb has filed its second complaint against OpenAI over such hallucinations, citing a specific example in which ChatGPT falsely described a Norwegian man as a murderer.

To make matters worse, when the man asked ChatGPT what it knew about him, it claimed he had been sentenced to 21 years in prison for killing two of his children and attempting to kill the third. The hallucination was also sprinkled with real details, including the number of children he has, their genders and the name of his hometown.

Noyb argues that this response puts OpenAI in violation of the GDPR. "The GDPR is clear. Personal data has to be accurate. And if it's not, users have the right to have it changed to reflect the truth," said Joakim Söderberg, data protection lawyer at Noyb. "Showing ChatGPT users a tiny disclaimer that the chatbot can make mistakes clearly isn't enough. You can't just spread false information and in the end add a small disclaimer saying that everything you said may just not be true."

Other prominent ChatGPT hallucinations have included accusing a man of fraud and embezzlement, a court reporter of child abuse and a law professor of sexual harassment, as reported by numerous publications.

Noyb's first complaint against OpenAI over hallucinations, filed in April 2024, focused on a public figure's incorrect date of birth (so no murder accusation, but still inaccurate). OpenAI rejected the complainant's request to erase or update his date of birth, claiming it couldn't change information already in the system, only block its use in response to certain prompts. Instead, ChatGPT appends a disclaimer that it "can make mistakes."

Yes, there's something to the adage that everyone makes mistakes; it's why erasers exist. But when it comes to a chatbot this widely used, does that logic really apply? We'll see whether OpenAI responds to Noyb's latest complaint.

This article originally appeared on Engadget at https://www.engadget.com/ai/chatgpt-reportly-accted-innocent-man-f-murding-his-children-120057654.html?


