The European Union has developed an artificial intelligence strategy aimed at streamlining research and regulation while building a trustworthy environment for the technology. The European Union’s approach is to implement a legal framework that addresses fundamental rights and safety risks, adopt rules to address liability issues, and revise sectoral safety legislation accordingly. The new framework gives developers, deployers, and users a certain amount of clarity by allowing them to intervene where existing legislation does not cover an issue.
Artificial intelligence can be used in critical infrastructures such as manufacturing and transportation. The technology can also be used in education and vocational training, for example, in preparing, administering, and scoring exams. Robotic technologies are already being used in medical products, enabling robot-assisted medical procedures. Law enforcement agencies can use this technology, but they should not interfere with the general public’s fundamental rights (e.g., free speech, religious beliefs, privacy). State and federal courts can use it to assist in comparing and evaluating evidence. Biometric technology can also be used in conjunction with artificial intelligence.
On April 21, 2021, the European Commission published a proposed set of rules to regulate the use of artificial intelligence within its jurisdiction. The Artificial Intelligence Act takes a risk-based approach, using a pyramid of criticality to assess risk across four levels: minimal risk, limited risk, high risk, and unacceptable risk. The proposed legislation establishes a new enforcement body, the European Artificial Intelligence Board (“EAIB”), composed of national supervisors with oversight powers. The Artificial Intelligence Act will have an extraterritorial effect on all providers, marketers, or distributors whose products or services reach the European Union’s market. The regulation defines artificial intelligence as a collection of software development frameworks that include machine learning, expert and logic systems, and statistical methods.
The Artificial Intelligence Act prohibits the following system activities: (1) using subliminal techniques to change an individual’s actions in a way that could cause psychological or physical injury; (2) targeting a group’s vulnerabilities based on age or physical or mental disability in a way that would cause psychological or physical harm; (3) enabling a government to utilize general-purpose social credit scoring; or (4) allowing law enforcement to use real-time remote biometric identification in public locations, except in certain time-limited public-safety situations.
The Artificial Intelligence Act also outlines penalties for violations. For example, non-conformity or non-compliance with the legislation could lead to fines of up to thirty million Euros or six percent of worldwide annual turnover, whichever is greater. The cap for high-risk AI applications is twenty million Euros or four percent of turnover. In addition, submitting inaccurate, incomplete, or misleading information to national competent authorities is subject to a fine of up to ten million Euros or two percent of turnover.
As of now, the new legislation remains a proposal working its way through the European Parliament. If it passes, it will be subject to a two-year implementation period. We certainly hope that state, federal, and international agencies will assist in implementing comprehensive legislation for this new technology.
Our artificial intelligence lawyers have been prosecuting and defending legal actions in state and federal courts and are available to speak with clients. Our law firm assists clients in matters related to artificial intelligence, cybersecurity, privacy, and the applicable state, federal, or international laws. Please contact our law firm to speak with an artificial intelligence attorney at your earliest convenience.