Elon Musk’s X sues Minnesota to stop law banning ‘deepfakes’ generated by AI


Summary

Legal concerns

The lawsuit argues the law's vagueness could lead to unnecessary censorship and criminal liability for sharing deepfakes near elections.

First Amendment issues

X's complaint cites violations of the First and Fourteenth Amendments, stating the law's ambiguity threatens core political speech.

Other states involved

Over 20 other states have enacted similar deepfake legislation, reflecting growing concerns over the impact of AI on democracy.




Full story

A Minnesota law that took effect in 2023 banned the use of technology like artificial intelligence to create images known as “deepfakes.” Now, social media platform X is challenging the statute in federal court. The legislation specifically targets anyone who creates AI-generated content during election season with the purpose of influencing voters.

X sues Minnesota over state law

X Corp., a company owned by Elon Musk, filed the complaint on Wednesday, April 23, alleging the law is unconstitutional and violates the platform’s free speech rights. Minnesota Attorney General Keith Ellison, D, is named as a defendant.

The complaint claims the statute “violates the First and Fourteenth Amendments of the United States Constitution, because its requirements are so vague and unintelligible that social media platforms cannot understand what the statute permits and what it prohibits, which will lead to blanket censorship, including of fully protected, core political speech.”


The law defines deepfakes as video, audio or still images of a person who has not consented to the content, particularly when the imagery is “so realistic that a reasonable person would believe it depicts speech or conduct of an individual who did not in fact engage in such speech or conduct.”

The law focuses in particular on candidates running for elected office, where deepfakes could sway how people vote.

Law prohibits sharing of deepfakes

The lawsuit also challenges the criminal liability associated with breaking the law. As written, anyone who shares a deepfake within 90 days of an election could face criminal penalties if the intent was to harm a candidate’s reputation or influence the outcome of an election.

Court documents state, “Under this enforcement system, platforms that keep up content presenting a close call under the statute run the risk of criminal penalties, but there is no penalty for erring on the side of too much censorship.”

X argues the law will, “result in the censorship of wide swaths of valuable political speech and commentary and will limit the type of ‘uninhibited, robust, and wide-open’ ‘debate on public issues’ that core First Amendment protections are designed to ensure.”

X argues law is vague and violates federal law

The company also raises the law’s vagueness and Section 230, which protects social media companies from liability for content posted by their users. As in two earlier, unsuccessful lawsuits brought by a Minnesota lawmaker and a social media content creator, X wants the judge to block the law from being enforced through a preliminary injunction.

Musk has been a vocal advocate for freedom of speech, a topic he pushed when he bought Twitter in 2022. He later renamed the social media platform to X and changed the company’s content moderation policies. 

Minnesota isn’t the only state with a law banning deepfakes; more than 20 other states have similar legislation in place. Most were signed into law in the last three years, according to Public Citizen. The group’s tracker claims AI “poses a myriad of serious threats to our democracy,” with deepfakes among the biggest.

Jake Larsen (Video Editor) contributed to this report.

Why this story matters

The lawsuit by X Corp against Minnesota's new deepfake legislation raises significant concerns about free speech and the implications of regulating AI-generated content during elections.

Free speech

This theme is crucial as the legislation is claimed to infringe upon First Amendment rights, potentially introducing challenges to political discourse.

Election integrity

The law seeks to protect electoral processes by restricting misleading AI-generated content, which is significant in maintaining the trust of voters.

Legal implications

The case could set precedential legal standards regarding the regulation of technology and speech, particularly in the context of rapidly evolving AI.

Get the big picture

Synthesized coverage insights across 26 media outlets

Context corner

The Minnesota deepfake law reflects a growing concern about the influence of artificial intelligence and misinformation on electoral processes. Historically, the use of technology in campaigns has evolved, from traditional media to digital platforms, raising challenges in content authenticity and voter manipulation.

Policy impact

New policies regarding deepfake regulation could lead to significant changes in how social media platforms operate, potentially increasing compliance costs and affecting content moderation practices. Platforms may face heightened scrutiny for user-generated content to avoid legal repercussions.

Underreported

The complexities of technological implementation and the effectiveness of monitoring deepfakes are under-reported aspects. There's limited discussion on how social media platforms will comply with these laws without facing undue liability or how users might adapt to new restrictions on content sharing.

Bias comparison

  • Not enough coverage from media outlets on the left to provide a bias comparison.
  • Media outlets in the center maintained a detached tone, focusing on the lawsuit's legal arguments and Musk's "free speech absolutist" views.
  • Media outlets on the right emphasized "free speech concerns," aligning with a broader narrative of individual liberties threatened by regulation.

Media landscape


Key points from the Right

  • Elon Musk's social media platform X filed a lawsuit against Minnesota over its law banning AI-generated deepfakes in elections, claiming it violates free speech protections under the First Amendment and is impermissibly vague.
  • Minnesota's law threatens criminal liability for platforms if they misjudge content under this law, raising concerns about censorship.
  • X asked the federal judge for a permanent injunction against the law, stating it would lead to censorship of political speech and commentary, according to the company’s complaint.


Powered by Ground News™