What about outside the EU?
The GDPR, the EU’s General Data Protection Regulation, is the bloc’s most famous tech export, and it has been copied everywhere from California to India.
The EU’s approach to AI, which targets the most dangerous uses of AI, is one that most developed countries broadly agree on. If Europeans can create a coherent way to regulate the technology, it could serve as a model for other countries hoping to do the same.
“In their compliance with the EU AI Act, US companies will also raise their standards for American consumers in terms of transparency and accountability,” says Marc Rotenberg, head of the Center for AI and Digital Policy, a nonprofit organization that tracks AI policy.
The bill is also being closely watched by the Biden administration. The US is home to some of the world’s largest AI labs, such as Google AI, Meta, and OpenAI, and it leads a number of global rankings in AI research, so the White House wants to know how any regulation would apply to these companies. For now, influential US government figures such as National Security Adviser Jake Sullivan, Commerce Secretary Gina Raimondo, and Lynne Parker, who is leading the White House’s AI effort, have welcomed Europe’s efforts to regulate AI.
“This is a sharp contrast to how the United States viewed the growth of the GDPR,” says Rotenberg.
Despite some inevitable caution, the US has good reasons to welcome the law. It is deeply concerned about China’s growing influence in tech. For America, the official stance is that maintaining Western dominance over technology is a matter of whether “democratic values” prevail, and it wants to keep the EU close as a “like-minded ally.”
What are the biggest challenges?
It is currently technically impossible to comply with certain requirements of the bill. The first draft requires that data sets be error-free and that humans be able to “fully understand” how an AI system works. But the data sets used to train AI systems are vast, and checking that they are completely error-free would take thousands of hours of human work, even if verifying such a thing were possible at all. And today’s neural networks are so complex that even their creators do not fully understand how they reach their conclusions.
Tech companies are also deeply uneasy about provisions that would require them to give external auditors or regulators access to their source code and algorithms in order to enforce the law.