Deepfakes From OpenAI GPT-2 Algorithm

OpenAI's Generative Pre-trained Transformer 2 (GPT-2) is capable of generating text from short writing samples. How good is it? Maybe too good. As OpenAI explains in its announcement:

Our model, called GPT-2 (a successor to GPT), was trained simply to predict the next word in 40GB of Internet text. Due to our concerns about malicious applications of the technology, we are not releasing the trained model. As an experiment in responsible disclosure, we are instead releasing a much smaller model for researchers to experiment with, as well as a technical paper.

GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text. The diversity of the dataset causes this simple goal to contain naturally occurring demonstrations of many tasks across diverse domains. GPT-2 is a direct scale-up of GPT, with more than 10X the parameters and trained on more than 10X the amount of data.

GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation.

(Via Better Language Models and Their Implications.)
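
The smaller released model is enough to see that "prime the model with an input and have it generate a lengthy continuation" behavior for yourself. Below is a minimal sketch, assuming the publicly available small GPT-2 weights loaded through the third-party Hugging Face transformers library (not OpenAI's own release code), with top-k sampling roughly like what the published samples used:

from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Load the small released GPT-2 model (the full 1.5-billion-parameter model was withheld).
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# GPT-2's single training objective: predict the next word given all the previous words.
# Generation simply applies that prediction over and over to extend whatever prompt you give it.
prompt = "In a shocking finding, scientists discovered"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

output_ids = model.generate(
    input_ids,
    max_length=100,                       # prompt plus continuation, measured in tokens
    do_sample=True,                       # sample rather than always taking the likeliest word
    top_k=40,                             # restrict sampling to the 40 most likely next words
    pad_token_id=tokenizer.eos_token_id,  # avoid the missing-pad-token warning
)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Every run produces a different continuation of the same prompt; truncating the sampling to the likeliest words is one of the knobs that keeps the output coherent rather than rambling.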

Better yet, let the always informative and entertaining Siraj Raval explain it to you in his video.

Science fiction fans are not shocked to read that AIs can write as well as we do. In "Studio 5, The Stars," a 1971 story by JG Ballard, a verse transcriber (the "VT set") is used to write poetry on demand:

"Do you mean she wrote these herself?" I nodded. "It has been done that way. In fact the method enjoyed quite a vogue for twenty or thirty centuries. Shakespeare tried it, Milton, Keats and Shelley - it worked reasonably well for them."

"But not now," Tony said. "Not since the VT set. How can you compete with an IBM heavy-duty logomatic analogue?"

"...Hold on," I told him. I was pasting down one of Xero's satirical pastiches of Rupert Brooke and was six lines short. I handed Tony the master tape and he played it into the IBM, set the meter, rhyme scheme, verbal pairs, and then switched on, waited for the tape to chunter out of the delivery head, tore off six lines and passed them back to me. I didn't even need to read them.

For the next two hours we worked hard, at dusk had completed over 1,000 lines and broke off for a well-earned drink.

(Story submitted 2/15/2019)

Related News Stories - ("Artificial Intelligence")

Datagrid Model Generation Perfect For Eternal Cities Of Science Fiction
'... there was enough flexibility to allow for wide variation.' - Arthur C. Clarke, 1956.

Amazon Echo And Google Home Should Have Morality Software
'The Dwoskin Morality Rating-Computer could 'spot the slightest tendency to deviation' from the social norm...' - Kendall Foster Crossen, 1953.

Fishy Facial Recognition Now Possible
'Palenkis can identify random line patterns better than any other species in the universe.' - Frank Herbert, 1969.

 
