Natural Language Processing using TensorFlow - Tokenization (Zero to Hero, part 1)

Welcome to Zero to Hero for Natural Language Processing using TensorFlow! If you’re not an expert on AI or ML, don’t worry – we’re taking the concepts of NLP and teaching them from first principles with our host Laurence Moroney (@lmoroney).

In this first lesson we’ll talk about how to represent words in a way that a computer can process them, with a view to later training a neural network to understand their meaning.
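In TensorFlow this is done with the `Tokenizer`, which assigns each distinct word an integer index. A rough pure-Python sketch of that idea, using made-up example sentences:

```python
# Build a word index: each distinct word maps to an integer.
# A minimal sketch of what TensorFlow's Tokenizer does under the hood.
sentences = [
    "I love my dog",
    "I love my cat",
]

word_index = {}
for sentence in sentences:
    for word in sentence.lower().split():
        if word not in word_index:
            # Indices start at 1; 0 is conventionally reserved for padding.
            word_index[word] = len(word_index) + 1

print(word_index)
# {'i': 1, 'love': 2, 'my': 3, 'dog': 4, 'cat': 5}
```

The real `Tokenizer` also handles punctuation stripping and word frequency, but the core output is the same kind of word-to-index dictionary.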

Sequencing - Turning sentences into data (NLP Zero to Hero, part 2)

In this video you’ll take that to the next step – creating sequences of numbers from your sentences, and using tools to process them to make them ready for teaching neural networks.

Training a model to recognize sentiment in text (NLP Zero to Hero, part 3)

In the last couple of episodes you saw how to tokenize text into numeric values and how to use tools in TensorFlow to sequence and pad that text. Now that we’ve gotten the preprocessing out of the way, we can next look at how to build a classifier to recognize sentiment in text.
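The classifier in the video uses an embedding layer, a pooling step, and a dense output layer. Here is a toy pure-Python sketch of that forward pass, with entirely made-up weights, just to show the shape of the computation:

```python
import math

# Toy embedding table: word index -> 2-d vector (made-up values).
embeddings = {
    1: [0.0, 0.0],   # padding / OOV
    2: [0.9, 0.1],   # e.g. a "positive" word
    3: [-0.8, 0.2],  # e.g. a "negative" word
}
weights = [1.0, 0.5]  # dense-layer weights (arbitrary, untrained)
bias = 0.0

def sentiment(sequence):
    # 1. Look up an embedding vector for each token.
    vectors = [embeddings[t] for t in sequence]
    # 2. Average-pool the vectors across the sequence.
    pooled = [sum(dim) / len(vectors) for dim in zip(*vectors)]
    # 3. Dense layer + sigmoid gives a probability of positive sentiment.
    logit = sum(w * x for w, x in zip(weights, pooled)) + bias
    return 1 / (1 + math.exp(-logit))

print(sentiment([2, 2]))  # closer to 1: "positive" words
print(sentiment([3, 3]))  # closer to 0: "negative" words
```

In training, backpropagation adjusts the embedding vectors and dense weights so that words appearing in positive examples drift toward embeddings that push the output toward 1, and vice versa.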

ML with Recurrent Neural Networks (NLP Zero to Hero - Part 4)

Welcome to this episode in Natural Language Processing Zero to Hero with TensorFlow. In the previous videos in this series you saw how to tokenize text, and use sequences of tokens to train a neural network. In the next videos we’ll look at how neural networks can generate text and even write poetry, beginning with an introduction to Recurrent Neural Networks (RNNs).
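The core idea of an RNN is a hidden state carried from one token to the next. A minimal single-unit cell in pure Python, with arbitrary illustrative weights:

```python
import math

# A single-unit recurrent cell: the hidden state h carries information
# from earlier tokens forward through the sequence.
w_x, w_h = 0.5, 0.9  # input and recurrent weights (arbitrary values)

def rnn(inputs):
    h = 0.0
    for x in inputs:
        # Each step mixes the current input with the previous hidden state.
        h = math.tanh(w_x * x + w_h * h)
    return h

# The final state depends on what came earlier in the sequence,
# even though the later inputs are identical.
print(rnn([1.0, 0.0, 0.0]))  # nonzero: the first input still echoes
print(rnn([0.0, 0.0, 0.0]))  # zero throughout
```

This also shows the weakness the next episode addresses: with each step the early signal is squashed again through `tanh`, so its influence fades over long sequences.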

Long Short-Term Memory for NLP (NLP Zero to Hero - Part 5)

Welcome to episode 5 of our Natural Language Processing with TensorFlow series. In this video we’re going to take a look at how to manage the understanding of context in language across longer sentences, where a word early in the sentence can determine the meaning and semantics of the end of the sentence. We’ll use something called an LSTM, or Long Short-Term Memory, to achieve this.
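An LSTM adds a separate cell state — a longer-term memory — managed by gates. One step of a single-unit LSTM in pure Python, with all weights fixed at illustrative values:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# One LSTM step for a single unit. The cell state c is the long-term
# memory; the gates decide what to forget, what to write, and what to
# expose as output. All weights here are arbitrary (fixed at 1.0).
def lstm_step(x, h, c):
    f = sigmoid(x + h + 1.0)   # forget gate: how much old memory to keep
    i = sigmoid(x + h)         # input gate: how much new info to write
    g = math.tanh(x + h)       # candidate memory content
    o = sigmoid(x + h)         # output gate
    c = f * c + i * g          # update the long-term cell state
    h = o * math.tanh(c)       # new hidden state (short-term output)
    return h, c

h = c = 0.0
for x in [1.0, 0.0, 0.0, 0.0]:
    h, c = lstm_step(x, h, c)
print(h, c)  # the early input still influences the state steps later
```

Because the cell state is updated additively (`f * c + i * g`) rather than repeatedly squashed, information from early in the sequence can survive many steps — which is exactly what long-range context in language needs.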

Training an AI to create poetry (NLP Zero to Hero - Part 6)

Through this series so far you’ve been learning the basics of NLP using TensorFlow. You saw how to tokenize and then sequence text, preparing it to train neural networks. You saw how sentiment in text can be represented with embeddings, and how the semantics of text over long stretches might be learned using recurrent neural networks and LSTMs. In this video we’ll put all of that together into a fun scenario – creating a model and training it on the lyrics to traditional Irish songs.
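Generation works by predicting the next word from the words so far, appending it, and repeating. In the video a trained network does the predicting; in this sketch a hand-written lookup table (seeded with a line in the spirit of the song lyrics used in the course) stands in for the model so the generation loop itself is visible:

```python
# Greedy next-word generation loop. A trained model would output a
# probability distribution over the vocabulary; this toy table just
# maps each word to a single "most likely" successor.
next_word = {
    'in': 'the', 'the': 'town', 'town': 'of', 'of': 'athy',
}

def generate(seed, n_words):
    words = seed.split()
    for _ in range(n_words):
        last = words[-1]
        if last not in next_word:
            break  # the toy "model" has nothing more to predict
        words.append(next_word[last])
    return ' '.join(words)

print(generate('in', 4))  # 'in the town of athy'
```

With a real model, each appended word changes the input for the next prediction, which is how a network trained on song lyrics can spin out whole new verses from a short seed phrase.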

#tensorflow #ai #machine-learning #data-science #python
