Artificial Intelligence (AI)

Should AI be regulated?

October 10, 2023

Yes, but how it is regulated is important.

AI is not just one thing but encompasses many different tools and systems. The United States currently regulates AI as distinct products and applications, but the technology's complexity leaves some areas of government responsibility less clear-cut. AI also poses significant ethical risks for users, developers, and society at large.

At the federal level, legislation regulating AI writ large does not yet exist. The German Marshall Fund has an AI policy tracker that maps federal actions and initiatives on AI, comparing their goals, applications, enforcement, timing and legal outcomes. The National Conference of State Legislatures also has a webpage tracker that summarizes enacted legislation and adopted resolutions in the states so far.

Key Takeaways

  1. AI needs to be regulated, but how remains a question.

  2. While federal AI policy is still being developed, President Biden issued an Executive Order that establishes new standards for AI safety and security, protects Americans’ privacy, advances equity and civil rights, stands up for consumers and workers, promotes innovation and competition, advances American leadership around the world, and ensures responsible and effective government use of AI.

  3. Individual states and cities have enacted their own laws and rules regulating AI and how it is used.