Ross: The fundamental problem with the new artificial intelligence

I read Steven Johnson’s excellent article in The New York Times about a new artificial intelligence system called GPT-3.

GPT-3 stands for Generative Pre-trained Transformer 3, an artificial intelligence program that mimics brain synapses and is hosted on a supercomputer in Iowa. The machine reads the Internet 24/7, digests its content, maps speech patterns, and teaches itself to write original prose in response to any question.

It learns by teaching itself to complete partial sentences, similar to how Microsoft Outlook offers to complete your email responses.

But GPT-3 goes much further. It can write original short stories. It has even been trained to write film scripts.

Imagine being able to pick a topic, style, and tone — from Northwest Nice to Five Jalapenos — and in less than a second, a grammatically perfect paragraph appears.

The excerpts from the Times article are staggering compared to what I saw a few years ago.

But there is a fundamental problem with this technology, and that is the temptation to use it. If you’re just producing fiction for entertainment, great. However, you know trouble is ahead when you read the software’s license terms: the designers specifically prohibit using the technology to determine who gets a credit card, payday loan, job, or apartment. It is also prohibited to use it to generate spam, promote “pseudo-pharmaceuticals,” or influence the political process.

Except there is no mention of how to enforce this. Since computers cannot feel fear, shame, pain, poverty, loss, or death, they have no motivation to control themselves. All control will have to be imposed by humans, and those humans, I guarantee, have every intention of using these machines to do every one of those forbidden things.

GPT-3 will also be used to generate high school essays, then college essays, and eventually human experts will simply read an AI-generated script off their teleprompter glasses.

Of course, ultimately this will be used to make decisions, like: do we stick with conventional weapons, or is it time for nuclear weapons?

I want to say here and now: that would be bad. I want that on the record before this segment gets handed over to GPT-3.

Time may be running out. After reading the article, I also read some of the comments.

One from a New Jersey reader named “Archer” reads:

“I have no objection to any of this. I’m sick of reading the scribbles of carbon-based writers.”

Like I said, that came from a New Jersey reader named Archer…unless it didn’t.

Listen to Seattle’s Morning News with Dave Ross and Colleen O’Brien weekday mornings from 5 to 9 a.m. on KIRO Newsradio, 97.3 FM. Subscribe to the podcast here.
