DOJ cracks down on AI threats ahead of 2024 election
LISA MONACO
DEPUTY ATTORNEY GENERAL
“Our democratic process and the public servants who protect it have been under attack like never before. As threats evolve and spread today, those threats are being supercharged by advanced technologies.”
[LAUREN TAYLOR]
ARTIFICIAL INTELLIGENCE CONTINUES TO BE A SIGNIFICANT CONCERN AS THE 2024 ELECTION APPROACHES. ACCORDING TO A NEW Elon University Poll, MORE THAN THREE-QUARTERS OF AMERICANS BELIEVE AI ABUSES WILL IMPACT THE ELECTION’S OUTCOME.
73% of Americans believe it’s “very” or “somewhat” likely AI will infiltrate and manipulate social media, including through the use of fake accounts and bots to spread misinformation.
70% suspect that AI-generated fake video and audio information will blur the lines between truth and deception.
62% say AI will be used to convince certain voters to skip voting.
78% say at least one of these AI abuses will be used, and more than half think all of them are likely to happen.
AI COULD BE USED FOR GOOD, BUT AMERICANS, BY AN EIGHT-TO-ONE MARGIN, BELIEVE IT WILL HARM ELECTIONS RATHER THAN HELP.
AS THE STAKES RISE, SO DO THE CONSEQUENCES.
FEDERAL PROSECUTORS ARE INCREASING THEIR EFFORTS TO COMBAT ELECTION-RELATED CRIMES INVOLVING AI. DEPUTY ATTORNEY GENERAL LISA MONACO ANNOUNCED THIS WEEK THAT A JUSTICE DEPARTMENT TASK FORCE WILL SEEK TOUGHER SENTENCES FOR CRIMES IN WHICH AI IS USED, INCLUDING THREATS AGAINST ELECTION WORKERS AND VOTER SUPPRESSION.
LISA MONACO
DEPUTY ATTORNEY GENERAL
“These advanced tools are providing new avenues for bad actors to hide their identities and obscure sources of violent threats. They’re providing new avenues to misinform and threaten voters through deepfakes, spreading altered video or cloned audio, impersonating trusted voices. And they’re providing new avenues to recruit and radicalize with incendiary social media content that accelerates online hate and harassment.”
[LAUREN TAYLOR]
THIS POLICY CHANGE AIMS TO ADDRESS THE GROWING CHALLENGES POSED BY AI TOOLS IN THE LEAD-UP TO THE 2024 PRESIDENTIAL ELECTION. THESE TOOLS CAN EASILY MIMIC POLITICIANS’ VOICES AND LIKENESSES, MAKING IT EASIER TO SPREAD FALSE INFORMATION. THE NEW GUIDELINES WILL TARGET CASES IN WHICH AI MAKES THESE CRIMES MORE DANGEROUS AND IMPACTFUL.
A RECENT EXAMPLE INVOLVED AN AI-GENERATED ROBOCALL IN NEW HAMPSHIRE, IMITATING PRESIDENT JOE BIDEN AND URGING VOTERS TO SKIP THE PRIMARY. THE ROBOCALL WAS CREATED BY A NEW ORLEANS MAGICIAN FOR A POLITICAL CONSULTANT WORKING FOR MINNESOTA REP. DEAN PHILLIPS, A DEMOCRATIC CHALLENGER TO BIDEN.
MERRICK GARLAND
ATTORNEY GENERAL
“Fulfilling that charge means confronting the full range of threats to our elections. That includes continuing our work through this task force, our U.S. Attorney’s offices, and our FBI offices across the country to investigate, disrupt, and combat unlawful threats against those who administer our elections.”
[LAUREN TAYLOR]
U.S. OFFICIALS ARE ALSO CONCERNED ABOUT FOREIGN ADVERSARIES USING AI TO SPREAD ELECTION DISINFORMATION. IN DECEMBER, SENIOR OFFICIALS SIMULATED A SCENARIO IN WHICH CHINESE OPERATIVES CREATED AN AI-GENERATED VIDEO SHOWING A SENATE CANDIDATE DESTROYING BALLOTS.
THE JUSTICE DEPARTMENT FACES PRESSURE FROM ELECTION OFFICIALS TO INVESTIGATE THE SURGE OF THREATS AND HARASSMENT THEY HAVE RECEIVED, MUCH OF IT STEMMING FROM FALSE CLAIMS OF FRAUD IN THE 2020 ELECTION. IN MARCH, AN OHIO MAN WAS SENTENCED TO MORE THAN TWO YEARS IN PRISON FOR MAKING DEATH THREATS AGAINST AN ARIZONA ELECTION OFFICIAL.
TO STAY UP TO DATE ON 2024 ELECTION COVERAGE DOWNLOAD THE STRAIGHT ARROW NEWS APP OR VISIT US AT SAN DOT COM.
FOR STRAIGHT ARROW NEWS, I’M LAUREN TAYLOR.