[BLANK]

OpenAI’s gigantic GPT-3 hints at the limits of language models for AI


Publication Title
ZDNet
Publication/Creation Date
June 1 2020
Creators/Contributors
Tiernan Ray (creator)
OpenAI (contributor)
Persuasive Intent
Information
Description

 "Like GPT-2 and other Transformer-based programs, GPT-3 is trained on the Common Crawl data set, a corpus of almost a trillion words of texts scraped from the Web. 'The dataset and model size are about two orders of magnitude larger than those used for GPT-2.'"

HCI Platform
Other
Location on Body
Not On The Body
Marketing Keywords
OpenAI, OpenAI GPT-3, OpenAI GPT-2
Source
https://www.zdnet.com/article/openais-gigantic-gpt-3-hints-at-the-limits-of-language-models-for-ai/

Date archived
June 1 2020
Last edited
September 30 2020
How to cite this entry
Tiernan Ray. (June 1 2020). "OpenAI’s gigantic GPT-3 hints at the limits of language models for AI". ZDNet. CBS Interactive Inc. Fabric of Digital Life. https://fabricofdigitallife.com/index.php/Detail/objects/4577