White House to Force Companies to Share Artificial-Intelligence Data

https://prospect.org/economy/2023-10-31-white-house-companies-artificial-intelligence-data/

"To realize the promise of AI and avoid the risk, we need to govern this technology." With that statement, Joe Biden signed an executive order on Monday outlining the federal government's approach to regulating artificial intelligence. The executive order's breadth reflects how the White House recognizes that the use of artificial intelligence across the economy is not a tech sector marketing pitch. Instead, the document acknowledges that the adoption of artificial intelligence could deliver immense benefits for the public and productivity gains in the economy, while it's just as possible that unchecked AI tools could exacerbate already-existing concerns over cybersecurity, data privacy, discrimination, and the exploitation of consumers. That balancing act, maximizing the promise of the technology while minimizing the risk, is at the heart of the administration's strategy.
The biggest impediment to such an approach is actually knowing what the tech companies have in mind. So the administration turned to a novel application of existing law to make sure that the AI era doesn't feature the usual move-fast-and-break-things ethos of Silicon Valley, where forgiveness is sought instead of permission. Using powers under the Defense Production Act, the government will require any company building an AI model that has national security, national economic security, or national public health and safety implications to notify the federal government and give up the results of all risk assessments and safety tests. That's broad enough to encompass virtually any large-scale AI model.
Those assessments, known as red-teaming, involve tests to identify flaws and vulnerabilities in AI models. They are usually closely guarded corporate secrets. The Prospect wrote last month about how autonomous-vehicle companies operating in California refused to release safety information to the public via public records requests. The state Department of Motor Vehicles has subsequently suspended one company, Cruise, from operating on California roads, after it withheld video of a pedestrian crash. Under this executive order, within 90 days, risk and safety information will have to be presented to the federal government in advance. It firmly designates who holds the regulatory power for AI, and subordinates the tech companies to the regulators. Tech companies were not told in advance about this provision, according to published reports.
The National Institute of Standards and Technology (NIST), located within the Commerce Department, will establish standards AI systems must reach before being publicly released. That includes standards for red-team testing and availability for so-called testbeds to support the practices. The standard-setting also includes expansions of NIST's existing risk management and software development guidelines, as well as guidance for how to audit AI capabilities. A relatively obscure agency, NIST will take on new significance with this responsibility. NIST has 270 days to develop these standards, in coordination with the Secretary of Commerce and the Departments of Energy and Homeland Security.
snip
1 reply
White House to Force Companies to Share Artificial-Intelligence Data (Original Post)
Celerity
Oct 2023
leftstreet (40,675 posts)
1. No. Just no
There's absolutely no justification for using the Defense Production Act. None. Citing vague "national security" concerns is the road to killing open source AND competition.
No