What is BERT (Bidirectional Encoder Representations from Transformers), and how is it used to solve NLP tasks? This video provides a very simple explanation. I am not going to go into the details of how the transformer-based architecture works; instead I will give an overview so you understand how BERT is used in NLP tasks. In the coding section we will generate sentence and word embeddings using BERT for some sample text.

We will cover various topics such as,

  • Word2vec vs. BERT
  • How BERT is trained on the masked language model and next sentence prediction tasks (see the sketch right after this list)
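To make the masked language modeling objective concrete, here is a minimal sketch that asks a pretrained BERT to fill in a masked token. It uses the Hugging Face transformers fill-mask pipeline purely as an illustration; it is not part of this tutorial's code, which uses TensorFlow Hub:

```python
# Illustration only (Hugging Face transformers, not the video's TF Hub code).
from transformers import pipeline

# bert-base-uncased ships with its pretrained masked-LM head.
fill = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the [MASK] token from both the left and right context.
for pred in fill("The capital of France is [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```

The top predictions are plausible completions such as "paris", ranked by probability; because BERT is bidirectional, it conditions on the words both before and after the mask.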

⭐️ Timestamps ⭐️

  • 00:00 Introduction
  • 00:39 Theory
  • 11:00 Coding in TensorFlow

Code: https://github.com/codebasics/deep-learning-keras-tf-tutorial/blob/master/46_BERT_intro/bert_intro.ipynb
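If you want a taste of what the notebook does before opening it, here is a minimal sketch of generating sentence and word embeddings. The TF Hub preprocessor and encoder handles are assumptions based on the standard bert_en_uncased models; the notebook linked above is the authoritative version:

```python
import tensorflow_hub as hub
import tensorflow_text  # registers the text ops the preprocess model needs

# Assumed TF Hub handles for the BERT preprocessor and encoder.
preprocess = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
encoder = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4")

sentences = ["nice movie indeed", "I love python programming"]

# Tokenize, pad, and add [CLS]/[SEP] markers, then run the encoder.
outputs = encoder(preprocess(sentences))

# One 768-dim vector per sentence.
print(outputs["pooled_output"].shape)    # (2, 768)
# One 768-dim contextual vector per token position (seq length 128 by default).
print(outputs["sequence_output"].shape)  # (2, 128, 768)
```

pooled_output gives one fixed-size vector per sentence (handy for classification), while sequence_output gives a contextual embedding for every token position.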

#deep-learning #data-science
