Move Over Keats, Poetry Has a New King in AI

Meet the refined AI that creates rhyming poems, songs, and long-form content. Its ability to adapt to and transform different types of poetry has impressed authors and poets alike.

OpenAI’s updated model in the GPT-3 family of AI-powered large language models, “text-davinci-003”, can handle much more complex instructions, produce longer-form content, and create rhyming poems and songs.

The improved AI can generate rhyming songs, limericks, and poetry at a level that earlier GPT-3 models could not match.

GPT-3 (which stands for “generative pre-trained transformer”) auto-completes your text based on prompts from a human writer.

Prompting is a new verb

Several commentators on Hacker News extolled the AI and were genuinely surprised after GPT-3 wrote a short rhyming poem explaining Albert Einstein’s theory of relativity.

When GPT-3 was asked to re-write the poem in the style of John Keats, it came back with impressive lines like “Mass doth affect the shape of time.”

Art, poetry, and lyrics are increasingly being created with the help of AI prompting – the art of telling an AI what to do.

Here is the prompt that got our attention: “Write a short rhyming poem explaining Einstein’s theory of general relativity in easy but accurate terms” 

Albert Einstein as imagined by AI (Midjourney).

If you want to understand Einstein’s thought
It’s not that hard if you give it a shot

General Relativity is the name of the game
Where space and time cannot remain the same

Mass affects the curvature of space
Which affects the flow of time’s race

An object’s motion will be affected
By the distortion that is detected

The closer you are to a large mass
The slower time will seem to pass

The farther away you may be
Time will speed up for you to see
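
For readers curious how a prompt like the one above actually reaches the model, here is a minimal sketch using OpenAI’s Python client as it existed at the time of text-davinci-003. The model name and prompt text come from this article; the API key placeholder and the sampling parameters are illustrative assumptions, not a definitive recipe.

```python
# Minimal sketch: sending the article's prompt to text-davinci-003 via
# OpenAI's (legacy) Completion endpoint. Values other than the model name
# and prompt are illustrative choices.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

response = openai.Completion.create(
    model="text-davinci-003",
    prompt=(
        "Write a short rhyming poem explaining Einstein's theory of "
        "general relativity in easy but accurate terms"
    ),
    max_tokens=200,   # enough room for a short poem
    temperature=0.7,  # some creative variation between runs
)

print(response.choices[0].text.strip())
```

Each run can return a different poem, because the model samples from its next-word probabilities rather than retrieving a stored verse.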

GPT-3, which was launched in 2020, shot to prominence for its ability to compose text in various styles with human-like proficiency, thanks to extensive training on text scraped from the Internet and data pulled from books.

Around the same time, college student Liam Porr got GPT-3 to produce a fake blog post that landed on the No. 1 spot on Hacker News.

Using statistical regularities to “think”

GPT-3 searches for patterns in data. It was trained on a huge corpus of text, which it mined for statistical regularities. These regularities are generally unknown to humans, but they are stored as billions of weighted connections between the different nodes in GPT-3’s neural network.

There is no human input involved in this process. The AI looks for and finds patterns on its own, then uses them to complete text prompts. If one inputs “fire” into GPT-3, the AI will conclude, based on its weights, that words like “truck” and “alarm” are likely to follow. It also knows that words like “lucid” or “elvish” are far less likely to follow once “fire” has been entered into the prompt.
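
To make the idea of statistical regularities concrete, here is a toy sketch – our illustration, not GPT-3’s actual architecture – in which a tiny table of word-pair counts stands in for the model’s billions of weighted connections. The miniature corpus and the word “fire” are chosen to mirror the example above.

```python
# Toy illustration of next-word probabilities learned from text.
# A tiny bigram count table stands in for GPT-3's weighted connections.
from collections import Counter, defaultdict

corpus = (
    "the fire truck raced to the fire alarm "
    "the fire alarm rang before the fire truck arrived"
).split()

# Count which word follows which in the corpus.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

# Turn the counts after "fire" into probabilities.
counts = following["fire"]
total = sum(counts.values())
for word, count in counts.most_common():
    print(f"P({word!r} | 'fire') = {count / total:.2f}")

# "truck" and "alarm" dominate; a word like "elvish" never appears
# after "fire" in this corpus, so its probability here is zero.
```

GPT-3 does something analogous at a vastly larger scale, conditioning on long stretches of preceding text rather than on a single word.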

The creation of poetry with a machine is not really a new phenomenon. As early as 1845, inventors were building contraptions to produce expressive verse automatically.

However, experts note that GPT-3’s latest update solves an array of problems that come with integrating knowledge on a host of subjects and styles into one model that writes coherent text. 

OpenAI quietly upgraded GPT-3, making text-davinci-003 a notable public improvement. Rumors and whispers of a powerful successor, called GPT-4, have emerged, although OpenAI has not commented on it.

The information and speculation circulating on the matter have led some to believe that GPT-4 will be able to pass the Turing test.

/MetaNews

Image credits: Shutterstock, CC images, Midjourney, Unsplash.
