The incoming Trump administration will likely make many changes to existing policies in the coming year, and AI regulation will not be exempt. That is expected to include repealing the AI executive order issued by current president Joe Biden.
Biden's order established government oversight agencies and encouraged model developers to implement safety standards. While the executive order's rules are aimed at model developers, its repeal may present challenges for enterprises. Some companies, such as Trump ally Elon Musk's xAI, could benefit from a repeal, while others will face issues. These could include a patchwork of state regulations, less open sharing of data sources, less government-funded research, and more emphasis placed on voluntary responsible AI programs.
Patchwork of local regulations
Prior to signing the EO, policymakers conducted several listening tours and held hearings with industry leaders to determine how best to regulate the technology. Under the Democratic-controlled Senate, there was a strong possibility AI regulations could move forward, but insiders believe the appetite for federal rules around AI has cooled significantly.
Speaking at the ScaleUp: AI Conference in New York, Gaurab Bansal said the lack of federal oversight of AI may lead states to write their own policies. Because there is a feeling that neither party in Congress will regulate AI, Bansal said, states may follow the playbook of California's SB 1047, which reached Governor Gavin Newsom's desk before he vetoed it. Yann LeCun, Meta's chief AI scientist, praised Newsom's decision to veto the bill. Still, Bansal said states are more likely than not to pass similar legislation. "Enterprises must have standards to ensure consistency, but it is bad if there are different standards in different areas," he said.
Dean Ball, a research fellow at George Mason University's Mercatus Center, said companies may find it difficult to navigate different regulations.
"Those laws may well create a complex compliance regime and a patchwork of laws for both AI developers and companies hoping to use AI," Ball said.
Voluntary Responsible AI
Industry-led responsible AI has existed for a long time, and the pressure on companies to be accountable and fair could increase as customers demand a focus on safety. Model developers and enterprise users should spend time developing standards and implementing responsible AI policies that comply with laws such as the European Union's AI Act.
At the ScaleUp: AI conference, Sarah Bird, Microsoft's chief product officer for responsible AI, said that many developers, their customers, and Microsoft itself are preparing their systems to comply with the EU's AI Act. Even if no law governs AI, Bird said, it is still a good idea to build responsible AI and safety in from the beginning.
Bird said the AI Act will even be useful for startups. "A lot of what it asks you to do at the high level is just common sense," she added. "If you are building models, you need to control the data that goes into them. You should also test them." Compliance is easier for smaller organizations starting from scratch, she said, so they should invest early in a solution to govern their data over time.
Jason Corso, a professor of robotics and co-founder of Voxel51, told VentureBeat that the Biden EO encouraged a great deal of openness among model developers. "We can't know the full impact of a single sample on a model that presents a high level of bias risk," he said, so businesses using models may be at risk if there is no governance around how models, and the data that trains them, are used.
Fewer research dollars
AI companies are currently attracting significant investor interest, but the government has long supported research that some investors deem too risky. Corso said the Trump administration may decide not to invest in AI research in order to save money.
"I'm worried about the lack of government resources to support these high-risk early-stage projects," Corso said.
A new administration does not mean that money will not be allocated to AI. The Trump administration has not yet confirmed whether it will abolish the newly established AI Safety Institute or other AI oversight offices, and the Biden administration did guarantee budgets through 2025.
"A pending question that must be part of Trump's replacement of the Biden EO is how to organize the authorities and allocate the dollars appropriated under the AI Initiative Act. This bill is the source of many of the authorities and activities Biden has tasked to agencies like NIST, and funding is set to continue in 2025. With these dollars already allocated, many activities will likely continue in some form. What form that takes, however, is yet to be revealed," said Matt Mittelsteadt, research fellow at the Mercatus Center.
The next administration's AI policy will become clearer in January, but enterprises should prepare for whatever comes next.