European data protection advocacy group noyb has sued OpenAI for failing to correct inaccurate information generated by ChatGPT. The group claims that OpenAI's failure to ensure the accuracy of personal data processed by its service violates the European Union's General Data Protection Regulation (GDPR).
“Fabricating false information is already a big problem, but when it comes to false information about individuals, the consequences can be serious,” said Maartje de Graaf, data protection lawyer at noyb.
“At this point, it is clear that chatbots like ChatGPT cannot comply with EU law when processing personal data. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. Technology must follow legal requirements, not the other way around.”
GDPR requires personal data to be accurate, and individuals have the right to correct inaccurate data and to access information about the data processed and its source. However, OpenAI openly acknowledges that it cannot correct inaccurate information generated by ChatGPT or disclose the source of the data used to train its models.
“Factual accuracy in large language models remains an area of active research,” OpenAI has stated.
The advocacy group highlights a New York Times report that found chatbots like ChatGPT “invent information at least 3 percent of the time – and as high as 27 percent.” In its complaint against OpenAI, noyb cites an example in which ChatGPT repeatedly provided an incorrect date of birth for the complainant, a public figure, despite requests to correct it.
“Although the complainant's date of birth as provided by ChatGPT is incorrect, OpenAI refused our request to rectify or erase the data, claiming that correction was not possible,” noyb said.
OpenAI argued that it could filter or block data on certain prompts, such as the complainant's name, but not without blocking all information about the complainant entirely. The company also failed to respond adequately to the complainant's access request, which the GDPR requires companies to honour.
“The obligation to comply with access requests applies to all companies. It is clearly possible to keep a record of the training data used to at least learn about its source,” De Graaf said. “It seems that every time an 'innovation' occurs, another group of companies thinks that their products do not have to comply with the law.”
European privacy watchdogs are already scrutinizing ChatGPT for inaccuracies, with the Italian data protection authority imposing temporary restrictions on OpenAI's data processing in March 2023 and the European Data Protection Board setting up a task force on ChatGPT.
In its complaint, noyb asks the Austrian data protection authority to investigate OpenAI's data processing and the measures it takes to ensure the accuracy of personal data processed by its large language models. The advocacy group also asks the authority to order OpenAI to comply with the complainant's access request, bring its processing into line with the GDPR, and impose a fine to ensure future compliance.
Read the full complaint here (PDF)
(Photo by Eleonora Francesca Grotto)