Microsoft’s AI language interpreter could be boon for cybercriminals

Microsoft unveiled a new Teams AI interpreter on Tuesday, Nov. 19. The program can replicate a user’s voice in near-real-time in nine different languages.

The company said it plans to expand to 31 languages in the future. The nine languages currently supported reportedly include English, French, German, Italian, Chinese, Korean, Japanese, Portuguese and Spanish.

The feature is currently available to a select group of users. Company officials said access will expand to more customers with a Microsoft 365 Copilot license in 2025.

The company touts the new technology as a way to hold international phone calls or meetings without the expense of a human translator.

However, the new feature isn’t quite perfect yet, with Microsoft admitting the Teams AI interpreter may not be 100% accurate.

Still, critics say that’s not their biggest concern about the AI feature. They said they’re afraid the technology could open the door to fraudsters.

Security analysts warn that hackers could exploit the technology, noting audio deepfakes are already a problem and impersonation scams reportedly cost Americans more than $1 billion in 2023.

In 2024, scammers used deepfake technology to set up a fake Teams video conference call and steal $25 million from a multinational firm.

An anonymous threat analyst group is already skeptical of Microsoft’s new technology, posting on X, “Ever be North Korean but want to sound American? It’s now possible,” apparently poking fun at the Big Tech company.

However, the group’s concerns may not just be talk. A recent report by SecureWorks warned of North Korean hackers applying for IT jobs at companies across the United States, United Kingdom and Australia, in an attempt to steal company secrets.

Cybersecurity experts urge companies and organizations worried about impersonation scams to opt out of the new Microsoft feature. They said companies should use the generic voice simulator option instead.

Microsoft users will reportedly have to give consent through privacy settings before the AI interpreter can use voice simulation during a meeting.

They can opt out of the voice replication by disabling it in settings. The interpreter will then use a generic default voice instead.

[JACK AYLMER]

MICROSOFT UNVEILED A NEW TEAMS A-I INTERPRETER TUESDAY THAT CAN REPLICATE YOUR VOICE IN NEAR-REAL-TIME IN NINE DIFFERENT LANGUAGES.

THE FEATURE IS CURRENTLY BEING TESTED BY A SELECT GROUP OF USERS AND WILL EXPAND TO MORE CUSTOMERS WITH A MICROSOFT 365 COPILOT LICENSE NEXT YEAR.

THE COMPANY TOUTS IT AS A LESS EXPENSIVE WAY TO HAVE INTERNATIONAL PHONE CALLS WITHOUT THE EXPENSE OF A HUMAN TRANSLATOR.

HOWEVER, IT’S NOT QUITE PERFECT YET. MICROSOFT ADMITS, THE TEAMS A-I INTERPRETER MAY NOT BE 100 PERCENT ACCURATE.

CRITICS SAY THAT’S NOT THEIR BIGGEST CONCERN.

FEARING THE A-I TECH COULD OPEN A PANDORA’S BOX OF FRAUDSTERS.

SECURITY ANALYSTS WARN OF POTENTIAL HACKERS USING THE TECHNOLOGY.

NOTING AUDIO DEEPFAKES ARE ALREADY A PROBLEM AND IMPERSONATION SCAMS REPORTEDLY COST AMERICANS MORE THAN ONE BILLION DOLLARS LAST YEAR.

THIS YEAR, SCAMMERS USED DEEPFAKE TECH TO SET UP A FAKE TEAMS VIDEO CONFERENCE CALL AND STEAL 25 MILLION DOLLARS FROM A MULTINATIONAL FINANCE FIRM.

AN ANONYMOUS THREAT ANALYST GROUP IS ALREADY SKEPTICAL OF MICROSOFT’S NEW TECH.

POSTING ON X:

“EVER BE NORTH KOREAN BUT WANT TO SOUND AMERICAN? IT’S NOW POSSIBLE.”

AND THEIR CONCERNS MAY NOT JUST BE TALK.

A RECENT REPORT BY SECUREWORKS WARNED OF NORTH KOREAN HACKERS APPLYING FOR I-T JOBS AT COMPANIES ACROSS THE U-S, U-K AND AUSTRALIA, IN AN ATTEMPT TO STEAL COMPANY SECRETS.

CYBERSECURITY EXPERTS URGE COMPANIES AND ORGANIZATIONS WORRIED ABOUT IMPERSONATION SCAMS TO OPT OUT OF THE FEATURE AND USE THE GENERIC VOICE SIMULATOR OPTION INSTEAD.

FOR MORE ON THIS STORY– DOWNLOAD THE STRAIGHT ARROW NEWS APP OR VISIT SAN DOT COM.

FOR STRAIGHT ARROW NEWS– I’M JACK AYLMER.