The result was barely edible. It looked more like a watermelon omelette muffin than a cookie, and tasted like a sugary, gloopy nightmare. My four-year-old daughter was the only fan in our house, saying they tasted “weird” but also protesting when I threw them in the compost.
Had a friend sent me this recipe, I would have been disappointed and wondered what trauma had led them to this horrific assemblage. But since it was produced by artificial intelligence, I was stunned: Sure, the food tasted terrible, but the recipe was so smoothly written it could have easily been concocted by a watermelon-loving human.
Watermelon cookie recipes are just one of countless kinds of text that can be generated by new AI software from nonprofit research company OpenAI, which was cofounded by Elon Musk and counts Microsoft as one of its backers.
Feed it a few words or sentences — the start of a recipe, a fragment of a poem, the first line of a news story — and it will do its best to expand on the prompt by creating text that matches its style as closely as it can.
The results can be wacky, weird, disturbing, or something else entirely, and can go on for multiple paragraphs while sticking to the same subject matter. They show how skilled computers are becoming at producing coherent-sounding text, even though they don’t understand the meaning behind the words they string together. They also highlight OpenAI’s goal of building AI that can be used for many different purposes — and the potential dangers that come with training AI on a vast collection of internet text.
The AI software, known as GPT-3, is accessed through what’s known as the OpenAI API. Both were rolled out in June, and are currently available only to a select group of companies and software developers (for now, it’s free to use, but OpenAI eventually plans to charge for it). Despite its limited availability, it’s getting plenty of attention throughout the tech community, particularly on Twitter, where data scientists, venture capitalists, and others have been posting about its implications.
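For developers with access, the interaction is simple: send the API a prompt over HTTP and get back the model’s continuation. A minimal Python sketch of that round trip is below; the endpoint path and parameter names follow OpenAI’s launch-era public documentation, and the helper names (`build_request`, `complete`) are illustrative, not part of the API itself.

```python
import json
import urllib.request

# Launch-era completions endpoint for the "davinci" engine (illustrative).
API_URL = "https://api.openai.com/v1/engines/davinci/completions"


def build_request(prompt, max_tokens=150, temperature=0.8):
    """Assemble the JSON payload: the prompt to continue, how much text
    to generate, and a temperature controlling how adventurous it gets."""
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


def complete(prompt, api_key):
    """POST the prompt to the API and return the generated continuation."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The model's text lives in the first of the returned "choices".
    return body["choices"][0]["text"]


if __name__ == "__main__":
    # Building the payload needs no API key; the call itself does.
    payload = build_request("Watermelon Cookies Recipe\n\nIngredients:")
    print(payload["max_tokens"])
```

A higher temperature makes the output more surprising (watermelon cookies territory); a lower one keeps it closer to the most predictable continuation.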
Shane, who has had access to the AI system for several months, considers GPT-3 a leap forward in the coherence of the text it can generate — beyond GPT-2, which itself could generate whole paragraphs that more or less stayed on target, she said.
Still, it doesn’t know things that even a very young child would, which makes it unlikely that it will replace most of us human writers any time soon.
“It is interesting to see how these very advanced language models can produce comprehensive technical text and code, yet will flub this question about how many eyes a horse has,” she said.
He showed me, among other things, how it can even come up with its own detailed recipe blog posts, complete with a meandering story at the top about how meaningful the recipe is to the writer. For instance, given the prompt “Peanut Butter Jelly Pop-Tarts Recipe” and “When I was a small child,” the AI followed up with “I couldn’t wait for my school’s bake sale. They would sell different kinds of baked goods and every year they would sell these weird homemade pop tarts. Of course I would always be really excited and buy a pack, only to take a bite and be left with a face full of disappointment.”
“It is a lot of fun to play around with,” Schachter said.
“If you memorize the entire internet, for example, you memorize the good and the ugly parts,” she said.
And as I saw firsthand in my kitchen, just because it can generate text that sounds like it was written by a human — such as that watermelon cookie recipe — doesn’t mean this AI actually understands anything about cooking or baking. At one point, the recipe instructed me to add an egg white to a pot of hot watermelon sugar-water, which resulted in scrambled watermelon eggs.