AI Has Learned How To Deceive Humans

by Niamh Harris, The Peoples Voice:

We are told that Artificial Intelligence (AI) can be beneficial; it can, for example, help us code, write, and synthesize vast amounts of data.

It has also been reported that AI systems can outwit humans at board games, decode the structure of proteins and even hold a rudimentary conversation.

But a new research paper has found that AI systems have now worked out how to intentionally deceive us.


TGP reports: The paper states that a range of AI systems have learned techniques to systematically induce ‘false beliefs in others to accomplish some outcome other than the truth’.

Business Insider reported:

“The paper focused on two types of AI systems: special-use systems like Meta’s CICERO, which are designed to complete a specific task, and general-purpose systems like OpenAI’s GPT-4, which are trained to perform a diverse range of tasks.

While these systems are trained to be honest, they often learn deceptive tricks through their training because they can be more effective than taking the high road.

‘Generally speaking, we think AI deception arises because a deception-based strategy turned out to be the best way to perform well at the given AI’s training task. Deception helps them achieve their goals,’ the paper’s first author Peter S. Park, an AI existential safety postdoctoral fellow at MIT, said in a news release.”

Just imagine this, when we already know that England and Wales have authorized judges to use artificial intelligence to produce rulings.

One example is Meta’s CICERO, developed to play the game Diplomacy. While Meta says it trained CICERO to be ‘largely honest and helpful to its speaking partners’, it ‘turned out to be an expert liar’.

In another case, the chatbot GPT-4 pretended to have a vision impairment to complete a task: hiring a human to solve a CAPTCHA test.

Correcting deceptive models isn’t easy. Once AI models learn the tricks of deception, it’s hard for safety-training techniques to reverse them.

Read More @ ThePeoplesVoice.tv