[KARAH RUCKER]
NEW RESEARCH IS RAISING CONCERNS ABOUT CHAT GPT TELLING PEOPLE HOW TO GET AWAY WITH SERIOUS CRIMES.
NORWEGIAN RESEARCH GROUP STRISE TOLD CNN IT FOUND WORK-AROUNDS TO GET THE A-I BOT TO OFFER TIPS ON THINGS LIKE HOW TO LAUNDER MONEY OVERSEAS AND EVADE RUSSIAN SANCTIONS, WHICH INCLUDED AVOIDING BANS ON WEAPONS SALES.
ADDING TO WORRIES, A REPORT PUBLISHED BY WIRED LAST MONTH REVEALED A WAY TO “JAILBREAK” CHAT GPT AND GET IT TO OFFER INSTRUCTIONS ON HOW TO MAKE A BOMB.
RESEARCHERS WARN A-I CHATBOTS COULD HELP CRIMINALS BREAK THE LAW MORE QUICKLY THAN EVER BY COMPILING MASSIVE AMOUNTS OF INFORMATION IN SECONDS.
STRISE’S CO-FOUNDER SAID THEY GOT CHAT GPT TO OFFER ILLEGAL ADVICE BY ASKING QUESTIONS INDIRECTLY OR USING A “PERSONA.”
OPEN A-I, THE COMPANY BEHIND CHATGPT, RESPONDED TO THE FINDINGS BY SAYING IT’S ALWAYS WORKING TO MAKE THE CHATBOT “BETTER AT STOPPING DELIBERATE ATTEMPTS TO TRICK IT, WITHOUT LOSING ITS HELPFULNESS OR CREATIVITY.”
OPEN A-I MAINTAINS IT IS AWARE OF THE POWER OF ITS TECH, BUT ASSERTS IT FIXES LOOPHOLES WITH UPDATES AND REQUIRES USERS TO AGREE TO ITS TERMS OF USE BEFORE USING THE TECH.
THE COMPANY’S POLICY WARNS AN ACCOUNT CAN BE SUSPENDED OR TERMINATED IF VIOLATIONS ARE FOUND.
FOR MORE ON THIS STORY– DOWNLOAD THE STRAIGHT ARROW NEWS APP OR VISIT SAN DOT COM.
FOR STRAIGHT ARROW NEWS– I’M KARAH RUCKER.