The GPT-3 Zero Shot approach

TL;DR: For many tasks you don’t need to provide GPT-3 with examples because it already understands what you want.

If you look closely at the documentation and prompts for GPT-3 provided by OpenAI you’ll notice that a number of them don’t require any examples to show the model what you want. This is because in the world of text that GPT-3 was trained on, it was able to learn a number of useful tasks simply through observation.

In my GPT-3 training sessions I encourage starting with a zero-shot approach (using no examples) to build a prompt. This doesn’t work for every case, but it often yields surprisingly good results. Examples can sometimes confuse the model – especially if they’re not clearly delineated in the prompt. When you don’t need any examples at all, that’s ideal.

Here are some of the zero-shot prompts that work with GPT-3:


Putting the letters TL;DR: after a block of text will usually give you a summary.

A simple one-word prompt for summarization


Since GPT-3 understands what a blog post or an article looks like, you can just add “Tags:” after text to get a list of tags that apply to the passage.
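Both of these tricks come down to appending a short cue to the end of the text. As a minimal sketch in Python (the helper name here is my own, not part of any API), the prompt string you’d send to the model might be built like this:

```python
def suffix_prompt(text: str, cue: str) -> str:
    """Build a zero-shot prompt by appending a cue like "TL;DR:" or
    "Tags:" on its own line after the input text."""
    return text.rstrip() + "\n\n" + cue


article = "GPT-3 learned many tasks just by observing patterns in text."
summary_prompt = suffix_prompt(article, "TL;DR:")
tag_prompt = suffix_prompt(article, "Tags:")
```

The model then continues the text after the cue, producing a summary or a tag list.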

Text Sentiment

Text sentiment is something that GPT-3 doesn’t require any examples to understand. Creating a prompt for sentiment analysis is as simple as writing:

Text: It was a wonderful day!
Sentiment (Positive, Neutral, Negative): Positive
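As a sketch, this pattern can be generated (and the model’s completion interpreted) programmatically. The function names below are illustrative, not part of any library:

```python
LABELS = ("Positive", "Neutral", "Negative")


def sentiment_prompt(text: str) -> str:
    # Listing the allowed labels in parentheses constrains the answer space.
    return f"Text: {text}\nSentiment ({', '.join(LABELS)}):"


def parse_sentiment(completion: str) -> str:
    # Completions often lead with whitespace or add punctuation; normalize
    # before matching against the known labels.
    words = completion.strip().split()
    if not words:
        return "Unknown"
    word = words[0].rstrip(".,")
    return word if word in LABELS else "Unknown"


prompt = sentiment_prompt("It was a wonderful day!")
label = parse_sentiment(" Positive")  # e.g. if the model completes " Positive"
```

Anything the model returns outside the three labels falls back to "Unknown", which is a reasonable guard since completions aren’t guaranteed to follow the format.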


Classification

You can get GPT-3 to apply a category to text using a similar zero-shot prompt:

Story: The Hobbit
Genre: Fantasy
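The same two-line pattern generalizes to any pair of field labels. A small sketch (the function name is my own):

```python
def classify_prompt(field: str, value: str, target: str) -> str:
    """Zero-shot classification: supply one labeled field and ask the
    model to fill in another, e.g.
    classify_prompt("Story", "The Hobbit", "Genre")."""
    return f"{field}: {value}\n{target}:"
```

Because the labels themselves carry the instruction, no task description or examples are needed.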

Question answering

All it takes to get GPT-3 to answer a question is a format that it recognizes:

Q: How tall is the Eiffel Tower?
A: 324 meters (1063 feet)
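A minimal sketch of building that prompt (helper name my own). When sending this to a completion endpoint, a newline stop sequence is a common way to keep the answer to a single line:

```python
def qa_prompt(question: str) -> str:
    # The "Q:"/"A:" format appears often enough in training text that the
    # model recognizes it without examples.
    return f"Q: {question}\nA:"
```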

Language translation

GPT-3 understands a number of languages besides English. All you have to do to get it to perform simple translations is use a prompt like this:

English: Let's go to the beach.
Vietnamese: Đi biển đi.
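A sketch of the same idea as a reusable builder (names are illustrative); naming both languages tells the model which direction to translate:

```python
def translation_prompt(text: str,
                       source: str = "English",
                       target: str = "Vietnamese") -> str:
    # The language labels act as the entire instruction.
    return f"{source}: {text}\n{target}:"
```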

Discovering new zero-shot capabilities 

GPT-3 has been trained on so much data it’s possible we’ll never discover all of the zero-shot tasks it’s able to perform. One way to uncover some of them is to look at patterns in text that occur frequently, like following an article with “Tags.” 

Another method is to look for words or characters that act as strong signals to GPT-3. For example: writing “Names for tech companies” followed by “1.” makes it pretty clear to GPT-3 that you want a list.

Other words to play with are pairs like “Problem” and “Solution” (like you’d see in online test examples) or “Raw” and “Formatted.” These pairs clarify for GPT-3 what you’re expecting and help it accomplish a task without any additional instructions.
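A sketch of the list-seeding trick, along with a small parser for the numbered list the model typically returns (all names here are my own inventions):

```python
def list_prompt(topic: str) -> str:
    # Seeding the prompt with "1." signals that a numbered list follows.
    return f"{topic}\n1."


def parse_numbered_list(completion: str) -> list[str]:
    """Parse a completion like " Acme\n2. Globex\n3. Initech" that
    continues the seeded "1." item."""
    items = []
    # Re-attach the "1." we seeded so every item parses the same way.
    for line in ("1." + completion).splitlines():
        line = line.strip()
        if line and line[0].isdigit():
            items.append(line.split(".", 1)[1].strip())
    return items
```

The same parsing idea applies to label pairs: once you know the format you asked for, extracting the answer is just string handling.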

Look around the web for text patterns that can be turned into zero-shot prompts. Pay close attention to sites like Wikipedia, Reddit and online references for clues to what GPT-3 is capable of doing.