Current Transformer-based models such as GPT-2 and GPT-3 show impressive results on text generation (predicting the next probable word from the preceding sequence of words). These models can produce long, creative, and cohesive texts, but they usually generate in only one direction, from left to right. I wondered whether there is a way to generate text in both directions: given a start phrase (for example, “text generation is cool”), to see what story unfolds around it. XLNet turned out to be the solution: because it is trained on all permutations of the factorization order of the input sequence, this model can generate text in any direction.

In this article we will not study the internal principles of XLNet in detail (an excellent brief explanation can be found here). Instead, we’ll start experimenting right away: we will practice a little masked word prediction with XLNet, try to implement top-K bidirectional generation, and then implement a more efficient approach that combines beam search with top-K sampling.
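Before touching the model itself, the core idea behind top-K sampling can be sketched in plain NumPy. The function `top_k_sample` and the toy logits below are illustrative, not part of the Transformers API: at each step we keep only the k highest-scoring tokens, renormalize their probabilities with a softmax, and sample from that restricted set.

```python
import numpy as np

def top_k_sample(logits, k=5, rng=None):
    """Sample one token id from the k highest-scoring logits.

    logits: 1-D array of unnormalized scores, one per vocabulary token.
    """
    rng = rng or np.random.default_rng()
    top_ids = np.argsort(logits)[-k:]      # indices of the k best tokens
    top_logits = logits[top_ids]
    probs = np.exp(top_logits - top_logits.max())
    probs /= probs.sum()                   # softmax over the top k only
    return int(rng.choice(top_ids, p=probs))

# Toy vocabulary of 5 tokens: with k=2, sampling is confined
# to the two highest-scoring ids (1 and 3).
logits = np.array([0.1, 2.0, -1.0, 3.5, 0.7])
token_id = top_k_sample(logits, k=2)
```

Restricting sampling to the top k tokens keeps the output diverse while cutting off the long tail of unlikely words, which is exactly what we will rely on later for bidirectional generation.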

By the end of the article we will have a generator capable of creating text like the following from a start phrase (highlighted in bold):

Following up on my initial thoughts: text generation is cool! It works great for creating blog header, title etc. You will need Word 2013

Install needed modules

Let’s begin. We will conduct all our experiments in a Google Colab notebook (with a GPU runtime), which is available at this link, so the only module we need to install is the excellent Transformers library. This library provides a simple interface to XLNet, as well as to many other Transformer-based models.
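Assuming a fresh Colab runtime, the installation is a single pip command (the version is left unpinned here; the article’s notebook may pin a specific release):

```shell
pip install transformers
```

In a Colab cell, prefix the command with `!` to run it through the shell.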


Build a bidirectional text generator with XLNet