Alright, in the previous post we learned to tokenize a sentence and turn the tokens into sequences. You may have noticed that the sequences differ in length, but a model needs inputs of the same length. Padding saves us from this problem!
'pad_sequences' pads the sequences to a uniform length. By default, 0's are added at the beginning of any list that is shorter than the longest one. Using its arguments, 'pad_sequences' can also pad at the end of a sentence instead of the beginning, or pad to a desired length while truncating sequences that are too long.
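A minimal sketch of these options, assuming TensorFlow is installed (the sample sequences are made up for illustration):

```python
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Hypothetical tokenized sentences of different lengths
sequences = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]

# Default: pre-padding with 0's up to the longest sequence
padded = pad_sequences(sequences)
print(padded)
# [[0 1 2 3]
#  [0 0 4 5]
#  [6 7 8 9]]

# padding='post' adds the 0's at the end instead
post = pad_sequences(sequences, padding='post')
print(post)
# [[1 2 3 0]
#  [4 5 0 0]
#  [6 7 8 9]]

# maxlen forces a fixed length; truncating='post' cuts tokens
# from the end of sequences that are longer than maxlen
trunc = pad_sequences(sequences, maxlen=3, truncating='post')
print(trunc)
# [[1 2 3]
#  [0 4 5]
#  [6 7 8]]
```

Note that the default is truncating='pre', which would instead drop tokens from the beginning of a too-long sequence.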

#tensorflow #naturallanguageprocessing #nlp #machine-learning #python

NLP with TensorFlow - Padding Sentences