Sam Altman says OpenAI could leave the EU because of AI regulations

Photo credit: Jonathan Kemper

Sam Altman, CEO of OpenAI, says his company could exit Europe if regulations around AI become too strict.

The European Union is currently considering what could become the first set of rules worldwide to regulate the development of artificial intelligence. Under the proposal, companies deploying generative AI tools like ChatGPT would have to disclose any copyrighted material used to develop their systems. Altman says his company will “try to comply” but may be forced to leave the region.

“The current draft of the EU AI Act would be over-regulating, but we have heard it’s going to get pulled back,” Altman said in an interview with Reuters. “They are still talking about it. There’s so much they could do, like change the definition of general-purpose AI systems. There are a lot of things that could be done.”

The EU AI Act aims to classify AI systems into three risk categories. Some systems, such as the social scoring tools used in China, would be deemed an “unacceptable risk” for violating fundamental rights and banned outright. High-risk AI systems, meanwhile, would have to conform to common standards designed to increase transparency and oversight of AI models. Altman’s concern is that ChatGPT would qualify as “high risk” under the current definition.

“If we can comply, we will, and if we can’t, we’ll cease operating,” Altman told his audience at a panel discussion hosted by University College London. “We will try. But there are technical limits to what’s possible.” The large language model that powers ChatGPT was trained on vast datasets pulled from the internet. Researchers have been able to extract verbatim text sequences from the model’s training data.

“These extracted examples include (public) personally identifiable information (names, phone numbers and email addresses), IRC conversations, code and 128-bit UUIDs,” security researchers disclosed. They studied LLMs to see what results appear when the models are queried with prompts designed to elicit that information.