Natural Language Generation: GPT-2 and Hugging Face

In this tutorial, we'll learn to use Hugging Face and GPT-2 to train a language model for use with TensorFlow.
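As a minimal sketch of the workflow such a tutorial covers (assuming the transformers library; "gpt2" is the small pretrained checkpoint), loading GPT-2 with the TensorFlow classes and sampling a continuation looks like this:

```python
# Minimal sketch: load pretrained GPT-2 with Hugging Face's TensorFlow
# classes and sample a short continuation.
from transformers import GPT2Tokenizer, TFGPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = TFGPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("Natural language generation", return_tensors="tf")
outputs = model.generate(inputs["input_ids"], max_length=30, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```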

Top 6 Alternatives To Hugging Face, Natural Language Processing

With Hugging Face raising $40 million in funding, NLP has the potential to give us a smarter world ahead.

GPT-3: The Next Revolution in Artificial Intelligence (AI)

Generative Pre-trained Transformer 3, also referred to as GPT-3, is the next big revolution in artificial intelligence (AI). In 2018, a startup, OpenAI, was the f…

Everything GPT-2: 2. Architecture Comprehensive

The existing resources for GPT-2's architecture are very good, but they are written for researchers, so I will provide you with a tailored concept map of all the areas you will need to know before jumping in.
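For orientation, the key shape parameters of the small GPT-2 are easy to inspect from code; this sketch assumes the transformers library, whose default configuration matches the small checkpoint:

```python
# The default GPT2Config matches the small GPT-2 checkpoint: 12 transformer
# blocks, 12 attention heads, 768-dim embeddings, 1024-token context window.
from transformers import GPT2Config

config = GPT2Config()
print(config.n_layer, config.n_head, config.n_embd)  # 12 12 768
print(config.vocab_size, config.n_positions)         # 50257 1024
```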

How to create an AI that chats like you

Use your WhatsApp and Telegram data to train and chat with a GPT-2 neural network. The goal of this guide is to build a system capable of chatting like you, using your own WhatsApp and Telegram chats as an ML dataset.
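A hypothetical first step (the file name and line format are assumptions; real exports vary by app and locale) is flattening a chat export into a plain training file:

```python
# Hypothetical sketch: turn an exported WhatsApp chat into "Sender: text"
# lines for language-model fine-tuning. The regex targets a common export
# format ("12/31/21, 10:15 PM - Alice: message"); adjust for your locale.
import re

lines = []
with open("whatsapp_export.txt", encoding="utf-8") as f:
    for raw in f:
        m = re.match(r".+? - (.+?): (.+)", raw.strip())
        if m:
            sender, text = m.groups()
            lines.append(f"{sender}: {text}")

with open("train.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```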

NLP-Video Summarization with Watson and GPT

In this project, I explored the power of GPT-2 (which has around 1.5 billion parameters) and can only imagine the power of the more recent GPT-3, which has 175 billion parameters and can write anything from software code to artistic poems!

Fine-Tuning GPT-2 for Magic: The Gathering Flavour Text Generation

In this article, I will share a method for fine-tuning the 117M-parameter GPT-2 model on a corpus of Magic: The Gathering card flavour texts to create a flavour text generator. This will all be captured in a Colab notebook, so you can copy and edit it to create generators for your own tasks!
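One common Colab recipe for the 117M checkpoint uses the gpt-2-simple package (the article's exact toolchain may differ, and the corpus file name here is an assumption):

```python
# Sketch: fine-tune the 117M GPT-2 on a text corpus with gpt-2-simple,
# then sample from the tuned model.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="117M")       # fetch pretrained weights

sess = gpt2.start_tf_sess()
gpt2.finetune(sess, "mtg_flavour.txt",      # one flavour text per line
              model_name="117M", steps=500)

gpt2.generate(sess, length=40)
```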

Is GPT-3 the "Adam" of Natural Language?

GPT-3 is a good start toward human-like natural language performance. Perhaps a better analogy might be the “Homo habilis” [1] of natural language AI.

Got Writer’s Block? It’s PlotJam to the Rescue!

Using GPT-2 to create plot summaries of books that don’t exist … yet. In this post, I’ll show you how Artificial Intelligence (AI) and Machine Learning (ML) can be used to help you get a start on that novel you always wanted to write.
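A minimal sketch of the generation step (the prompt format is an assumption, and a model fine-tuned on real book blurbs would replace the stock "gpt2" checkpoint):

```python
# Sketch: prompt a GPT-2 checkpoint for a plot summary via the
# text-generation pipeline. Fine-tuning on actual blurbs would come first.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Title: The Clockmaker's Daughter. Plot:",
                   max_length=60, num_return_sequences=1)
print(result[0]["generated_text"])
```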

The ABBA explainer to BERT and GPT-2

For the life of me, I couldn’t understand how BERT or GPT-2 worked. I read articles; followed diagrams; squinted at equations; watched recorded classes; read code documentation; and still struggled to make sense of it all. It wasn’t the math that made it hard.

Generate Fresh Movie Stories for your Favorite Genre with Deep Learning

Fine-tuning GPT-2 to generate stories based on genres. A sample of generated output: “After discovering time travel, the Earth’s inhabitants now live in futuristic cities, which are controlled by the government, for the duration of a decade. The government plans to send two elite teams of scientists to the city, in order to investigate the origin of these machines and discover the existence of the ‘God’.”
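One way to condition generation on genre (a hypothetical data-prep sketch; the tag format and field names are assumptions, not the article's exact scheme) is to prepend a genre tag to each training example:

```python
# Hypothetical sketch: prefix each plot with a genre tag so a fine-tuned
# GPT-2 learns to condition on it; at inference you prompt with "<sci-fi>".
samples = [
    {"genre": "sci-fi", "plot": "After discovering time travel, ..."},
    {"genre": "horror", "plot": "The old house at the end of the street ..."},
]

with open("train.txt", "w", encoding="utf-8") as f:
    for s in samples:
        f.write(f"<{s['genre']}> {s['plot']} <|endoftext|>\n")
```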

Methods and Plugins to Spot Deepfakes and AI-Generated Text

With the emergence of incredibly powerful machine learning technologies, such as Deepfakes and Generative Neural Networks, it is much easier now to spread false information. In this article, we will briefly introduce deepfakes and generative neural networks, as well as a few ways to spot AI-generated content and protect yourself against misinformation.
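One detection heuristic, popularized by tools like GLTR, is that machine-generated text tends to look "too predictable" to a language model. A rough sketch (no threshold is given because it is corpus-dependent):

```python
# Sketch: score a passage's average token log-likelihood under GPT-2.
# Unusually high predictability can hint at machine generation (GLTR-style);
# this is a heuristic, not a reliable detector on its own.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def avg_log_likelihood(text):
    ids = tokenizer(text, return_tensors="pt")["input_ids"]
    with torch.no_grad():
        out = model(ids, labels=ids)   # loss = mean negative log-likelihood
    return -out.loss.item()            # higher = more predictable

print(avg_log_likelihood("The quick brown fox jumps over the lazy dog."))
```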

Generating Text with Hugging…

This is part 3 of an ongoing series on language models, starting with defining neural machine translation and exploring the transformer model architecture.
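For the generation step itself, the usual knobs are sampling parameters such as top-k and nucleus (top-p) sampling; a minimal sketch with the transformers library:

```python
# Sketch: sample a continuation from GPT-2 with top-k / nucleus sampling,
# the two truncation strategies most generation tutorials walk through.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

ids = tokenizer.encode("Neural machine translation", return_tensors="pt")
out = model.generate(ids, max_length=40, do_sample=True, top_k=50, top_p=0.95)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```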

From Transformers to GPT-2

Experimenting with different specifications for NMT. Some basic knowledge from part 1 is necessary in order to follow this post, so I’d recommend checking it out if you need a primer.

Fine-Tuning GPT2 on Colab GPU… For Free!

Leveraging Google Colab’s GPU to fine-tune pretrained GPT2. Models these days are very big, and most of us don’t have the resources to train them from scratch.
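Before fine-tuning, it's worth confirming that Colab actually assigned you a GPU (Runtime > Change runtime type > GPU) and moving the model onto it; a quick sketch:

```python
# Sketch: check for a CUDA device in Colab and place GPT-2 on it.
import torch
from transformers import GPT2LMHeadModel

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)
print("Fine-tuning on:", device)
```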

How to Spot Deepfakes and AI-Generated Text

In this article, we will briefly introduce deepfakes and generative neural networks, as well as a few ways to spot AI-generated content and protect yourself against misinformation.

Let the machine write next!

Text generation using a vanilla LSTM, attention, and GPT-2. Objective: automatically generate new sentences that continue given input sentences.
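As a point of comparison for the GPT-2 results, the "vanilla LSTM" baseline is typically a small next-token model; a minimal Keras sketch (the vocabulary size is a placeholder value):

```python
# Sketch of a vanilla LSTM next-token language model in Keras; trained on
# (sequence, next-token) pairs, it generates text one token at a time.
import tensorflow as tf

vocab_size = 10000                                   # placeholder value
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 128),      # token -> 128-d vector
    tf.keras.layers.LSTM(256),                       # summarize the sequence
    tf.keras.layers.Dense(vocab_size, activation="softmax"),  # next-token probs
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
```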

GPT-3, a giant step for Deep Learning and NLP? - KDnuggets

Recently, OpenAI announced a new successor to their language model, GPT-3, which is now the largest model trained so far, with 175 billion parameters.