Published: 2023-12-02
Experts analyze the strengths and weaknesses of Biden's AI executive order
President Biden's wide-ranging executive order on artificial intelligence has AI experts tracking how agencies craft and implement regulations
The wide-ranging executive order on artificial intelligence (AI) safety and testing requirements signed by President Joe Biden has AI experts tracking how agencies fill in the gaps left by the broadly written order as its directives move into the rulemaking process.
Biden’s executive order instructed federal agencies to begin crafting rules and standards on an expansive set of topics, from watermarking AI-generated content and rooting out bias in AI systems to addressing national security concerns and setting testing protocols for generative AI tools.
While some of those regulatory areas had relatively clear direction under the executive order, others were less defined and will be shaped primarily by agencies as they craft regulations in the weeks and months ahead. All of this will factor into how businesses and consumers alike interact with AI going forward.
In the near term, the most relevant items for AI firms will be the security testing and reporting requirements for companies with large-scale models, for operators of computing clusters with a certain amount of computing power, and for cloud computing providers that offer substantial computing power to foreign customers.
All of those reporting requirements are expected to take effect quickly and will weigh more heavily on the biggest AI firms and cloud computing providers than on smaller AI companies, which won’t meet the reporting thresholds. In the longer term, however, as other regulations proposed by the EO come into force and agencies begin acting, the biggest AI companies may be better positioned to absorb the cost of compliance, while smaller companies may struggle.
Some of the rules, such as those applying to cloud computing providers, could have implications for the confidentiality and security of those companies’ clients, depending on how the rulemaking process proceeds.
The reporting requirements for cloud computing providers, in particular, are underdeveloped and under-explained. If providers must now police users who request large amounts of computing power to determine whether they are building models with malicious cyber-enabled capabilities, then, depending on how the regulations shake out, those companies may have to scrutinize the activities of their larger customers directly.
The executive order could have privacy implications for businesses and their clients, depending on how rulemaking proceeds.
The executive order was fairly comprehensive overall but lacked a more focused treatment of how different content types call for different AI-detection solutions.
Detecting AI-generated content in video, music, photos, and text requires distinct strategies and cannot be approached with a one-size-fits-all solution. While the executive order discussed watermarking, watermarking is not considered a foolproof strategy.
Intellectual property (IP) and copyright rules could impact the design and use of AI tools, potentially limiting startups.
The IP and copyright section of the executive order signals that there will be more controls and limitations on what can be created and on how these tools work, which may have a chilling effect.
Biden’s executive order should be viewed as a signal rather than a finished rulebook, since federal agencies will primarily be responsible for crafting the rules.
More details on the regulations and their implementation will be unveiled in the coming months, but for now, the executive order serves as a starting point.
The landscape of AI regulation is bound to change, and there will be more clarity on how the rules will be put into practice.
It is an exciting and transformative time for the AI industry, with the potential for significant impact on businesses and society as a whole.