
UN adviser on AI: Deepfake tech is ‘greatest threat to democracy’

Aug 23, 2023


As the 2024 election cycle begins to unfold, the role of artificial intelligence (AI) is gaining prominence, and so are concerns about its potential impact on democratic processes. The United Nations (UN) has expressed apprehension, particularly over the proliferation of deepfake technology, which some within the organization have labeled “the greatest threat to democracy.”

Neil Sahota, an AI adviser for the UN, has warned that the upcoming presidential race will likely see a surge in the use of deepfakes — highly convincing manipulated audio and video content. This technology, capable of creating deceptive media featuring public figures, poses a significant challenge to ensuring the integrity of information disseminated during campaigns. Sahota also emphasized that addressing this issue presents a complex dilemma with no straightforward solution.

“A lot of people — and I think those in the media too, are calling the 2024 election ‘the deepfake election’ that is probably going to be marred by tons and tons of deepfakes,” Sahota said. “Not much can be done right now to stop any of that.”

Echoing these concerns, the UN Security Council has cautioned that, if left unchecked, AI could become a threat to humanity on par with the dangers of nuclear warfare. In response, the UN is working to develop and deploy software designed to detect deepfakes. Currently, debunking such fabricated content can take several days, giving it time to spread widely and manipulate public opinion before it is disproven.

“If someone releases a very damaging deepfake video two days before the election, that may not be enough time to counteract it and prove it and get people to believe that,” Sahota said.

Reports indicate that deepfakes are already influencing people’s decision-making. According to DeepMedia, an estimated 500,000 audio and video clips featuring synthetic content will be shared over the course of 2023.

This is partly due to the increased accessibility of the tools required to create such media. The cost of cloning a voice, for instance, has dropped from roughly $10,000 a year ago to just a few dollars today.

To combat the rising use of deepfakes for misinformation, some industry leaders, such as OpenAI, have taken steps to prevent users from generating images of political figures like former President Donald Trump or President Joe Biden. That approach, however, has not deterred smaller AI startups from allowing users to create such images, with some offering the ability for free.

“It’s going to be very difficult for voters to distinguish the real from the fake. And you could just imagine how either Trump supporters or Biden supporters could use this technology to make the opponent look bad,” said Darrell West, senior fellow at the Brookings Institution’s Center for Technology Innovation.

Amid these developments, the Federal Election Commission (FEC) has announced its intention to establish regulations governing the use of AI in electoral campaigns ahead of the 2024 elections. While concerns have been raised that such regulations could inadvertently infringe upon free speech rights, FEC Commissioner Allen Dickerson has said the agency will aim to craft rules that target genuinely fraudulent activity without hindering protected expression.

“Precision of regulation is a requirement in our work,” Dickerson said. “And if the commission has authority to act in this area, I hope that commentators will also demonstrate that it is possible to tailor a regulation to truly fraudulent activity without slowing protected expression.”


SHANNON LONGWORTH: THE 20-24 ELECTION CYCLE IS ALREADY HERE – AND ARTIFICIAL INTELLIGENCE IS EXPECTED TO PLAY A BIG ROLE.
IT MAY ALSO BE THE GREATEST THREAT TO OUR DEMOCRACY – AT LEAST ACCORDING TO THE UNITED NATIONS.

AN A-I ADVISOR FOR THE U-N PREDICTS DEEP FAKES WILL BE A CONSTANT DURING THE 20-24 PRESIDENTIAL RACE – AND SAYS THERE’S REALLY NOT MUCH THAT CAN BE DONE TO STOP IT.

NEIL SAHOTA, UNITED NATIONS ARTIFICIAL INTELLIGENCE ADVISOR: “A lot of people—and I think those in the media too, are calling the 2024 election ‘the deepfake election’ that is probably going to be marred by tons and tons of deepfakes. Not much can be done right now to stop any of that.”

LONGWORTH: IF LEFT UNCHECKED, THE UN SECURITY COUNCIL BELIEVES A.I. COULD POSE A THREAT TO HUMANITY ON PAR WITH THE DANGERS OF NUCLEAR WAR.

THE U-N IS NOW WORKING TO ROLL OUT SOFTWARE TO DETECT DEEP FAKES.
CURRENTLY DEBUNKING THIS KIND OF GENERATED CONTENT CAN TAKE DAYS.

OFFICIALS SAY DEEP FAKES ALREADY MANIPULATE PEOPLE’S DECISIONS.
TECH COMPANY ‘DEEP MEDIA’ ESTIMATES ABOUT 500-THOUSAND AUDIO AND VIDEO CLIPS WILL BE SHARED IN 2023.

THAT’S PARTLY BECAUSE SO MANY PEOPLE HAVE ACCESS TO THE TOOLS.
CLONING A VOICE USED TO COST AS MUCH AS TEN THOUSAND DOLLARS LAST YEAR.
NOW THERE ARE COMPANIES OFFERING THE SERVICE FOR ONLY A FEW BUCKS.

SOME INDUSTRY LEADERS LIKE ‘OPEN A.I.’ ARE ATTEMPTING TO COMBAT THIS BY BLOCKING USERS FROM CREATING IMAGES OF POLITICAL FIGURES LIKE DONALD TRUMP OR JOE BIDEN.
BUT THAT HASN’T STOPPED SMALLER A.I. STARTUPS FROM ALLOWING THE CREATION OF IMAGES DEPICTING THESE FIGURES, WITH SOME APPS GIVING USERS THE ABILITY TO MAKE THEM FOR FREE.

MEANWHILE, THE FEDERAL ELECTION COMMISSION HAS SAID IT INTENDS TO PROCEED WITH ITS OWN QUOTE “FULL RULEMAKING” FOR THE USE OF ARTIFICIAL INTELLIGENCE IN ELECTORAL CAMPAIGNS AHEAD OF 2024.

THE F-E-C’S COMMISSIONER HAS SAID ANY PROPOSED SAFEGUARDS AGAINST DEEPFAKES WILL BE TAILORED TO CURB ONLY TRULY FRAUDULENT ACTIVITY WITHOUT INFRINGING UPON PROTECTED FREE SPEECH.

FOR MORE INFORMATION ON THIS TOPIC, HEAD OVER TO S-A-N DOT COM AND CHECK OUT MY OTHER STORY ON ARTIFICIAL INTELLIGENCE IN POLITICS, WHERE I FURTHER BREAK DOWN HOW SYNTHETIC MEDIA COULD SHAPE THE 2024 PRESIDENTIAL ELECTION.