OpenAI challenged in Europe over ChatGPT hallucinations

The European Centre for Digital Rights, or noyb (none of your business), has filed a complaint against OpenAI over ChatGPT's inability to provide factual information about people.

Noyb states that “in the EU, the GDPR [General Data Protection Regulation] requires that information about individuals is accurate and that they have full access to the information stored, as well as information about the source.”

However, the organization claims that OpenAI “openly admits that it’s unable to correct incorrect information on ChatGPT” and that “factual accuracy in large language models remains an area of active research.”

ChatGPT’s tendency to hallucinate, and OpenAI’s inability to guarantee that the chatbot provides accurate information to users, have raised alarm bells for the organization.

Although made-up information may be “tolerable when a student uses ChatGPT to help them with their homework, it is unacceptable when it comes to information about individuals,” noyb states.

According to the organization, OpenAI is either unwilling or unable to correct inaccurate information about individuals. One example noyb provides concerns an unnamed public figure whose date of birth ChatGPT reported incorrectly; OpenAI refused the complainant’s request to rectify or erase the data.

“Although the GDPR gives users the right to ask companies for a copy of all personal data that is processed about them, OpenAI failed to disclose any information about the data processed, its sources or recipients,” noyb said.

The GDPR grants users the right to ask for rectification of inaccurate data. In noyb’s view, OpenAI is therefore in breach of the regulation.

EU law also requires that all personal data be accurate, so ChatGPT’s propensity to hallucinate amounts, in the organization’s view, to a further GDPR violation.

“Making up false information is quite problematic in itself. But when it comes to false information about individuals, there can be serious consequences. It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law when processing data about individuals,” said Maartje de Graaf, data protection lawyer at noyb.

Noyb deems this a structural issue, referencing a recent New York Times report which found that chatbots fabricate information between 3 percent and 27 percent of the time.

The data protection organization “is asking the Austrian data protection authority (DSB) to investigate OpenAI’s data processing and the measures taken to ensure the accuracy of personal data.”

Noyb is also asking the DSB to order OpenAI to “comply with the complainant's access request and to bring its processing in line with the GDPR.”

Finally, the data protection organization wants the DSB to impose a fine on OpenAI to “ensure future compliance.”