GPT-3 is not merely an autocomplete program like the one in Google’s search bar!

Developed by OpenAI, the research lab Elon Musk co-founded, GPT-3 is an autoregressive language model that uses deep learning to produce human-like text. It is currently the largest artificial intelligence language model, and it is mired in debate over whether it is a step closer to AGI (Artificial General Intelligence), or even the first step toward creating that sort of superintelligence.

GPT-3 (Generative Pre-trained Transformer 3) is the third in a series of autocomplete tools designed by OpenAI. The program was trained on a huge corpus of text, and what it learned is stored as billions of weighted connections between the nodes of its neural network. It finds patterns in that text without any guidance, then uses them to complete text prompts. If you input the word “fire” into GPT-3, the program knows, based on the weights in its network, that words like “alarm” and “water” are much more likely to follow than “soil” or “forests.”
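Under the hood, this is next-token prediction: the model assigns a probability to every token in its vocabulary, and the likelier continuations win. GPT-3’s weights are not publicly available, but the same mechanism can be illustrated with its open predecessor GPT-2 through the Hugging Face transformers library. A minimal sketch (the prompt and the top-5 cutoff are arbitrary choices):

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# GPT-3 itself is API-only, so we demonstrate the mechanism with GPT-2.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

inputs = tokenizer("fire", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Turn the final position's logits into a probability distribution
# over the vocabulary, then list the five most likely next tokens.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, k=5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode([idx.item()])!r}: {p.item():.4f}")
```

The exact ranking will differ from GPT-3’s, since GPT-2 is a far smaller model, but the principle, weights in the network making some continuations more probable than others, is identical.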

Training Data Behind the AI Tool GPT-3

GPT-3 was trained with 175 billion parameters, more than 100 times as many as its predecessor and roughly ten times as many as comparable programs, and it completes a mind-boggling array of autocomplete tasks with astonishing fluency!
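Those multiples check out against the published parameter counts: GPT-2, its predecessor, had 1.5 billion parameters, and Microsoft’s Turing-NLG, the largest comparable model at the time, had 17 billion. A quick sanity check:

```python
# Published parameter counts from the respective announcements.
GPT3_PARAMS = 175e9        # GPT-3 (Brown et al., 2020)
GPT2_PARAMS = 1.5e9        # GPT-2, its predecessor
TURING_NLG_PARAMS = 17e9   # Microsoft Turing-NLG (2020)

print(f"GPT-3 vs GPT-2:      {GPT3_PARAMS / GPT2_PARAMS:.0f}x")        # ~117x
print(f"GPT-3 vs Turing-NLG: {GPT3_PARAMS / TURING_NLG_PARAMS:.1f}x")  # ~10.3x
```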

The dataset GPT-3 was trained on:

• The English Wikipedia, spanning some 6 million articles, which makes up only 0.6 percent of its training data (a figure you can verify from the published token counts, as shown in the sketch after this list).

• Digitized books and a wide range of web links, including news articles, recipes, poetry, coding manuals, fanfiction, religious prophecy, and whatever else imaginable!

• Every kind of text, good and bad, that has been uploaded to the internet, including potentially harmful conspiracy theories, racist screeds, pseudoscientific textbooks, and the manifestos of mass shooters.
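As a rough check on that 0.6 percent figure, the GPT-3 paper (“Language Models are Few-Shot Learners”, Brown et al., 2020) reports per-source token counts, and a few lines of Python reproduce the proportions. The counts below come from the paper; the formatting is just for illustration:

```python
# Training-data token counts (in billions) as reported in the GPT-3 paper.
token_counts = {
    "Common Crawl (filtered)": 410,
    "WebText2": 19,
    "Books1": 12,
    "Books2": 55,
    "English Wikipedia": 3,
}

total = sum(token_counts.values())  # ~499 billion tokens
for source, tokens in token_counts.items():
    print(f"{source:<24} {tokens:>4}B  {100 * tokens / total:5.1f}%")
# English Wikipedia: 3 / 499 ≈ 0.6% of all training tokens.
```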

#data science #gpt-3 #ai
