Natural Language Processing and Word Embeddings

This article is based on week 2 of the Sequence Models course on Coursera. In it, I try to summarise and explain the concepts of word representation and word embeddings.
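As a minimal illustration of the idea, the sketch below maps word ids to dense vectors with a Keras embedding layer. The vocabulary size, embedding dimension, and token ids are hypothetical, chosen only for the example.

```python
import tensorflow as tf

# Minimal sketch of a word-embedding lookup, assuming a hypothetical
# vocabulary of 10,000 words mapped to 300-dimensional dense vectors.
embedding = tf.keras.layers.Embedding(input_dim=10000, output_dim=300)

word_ids = tf.constant([[12, 47, 3051]])  # hypothetical token ids for one sentence
vectors = embedding(word_ids)             # dense representation, shape (1, 3, 300)
print(vectors.shape)
```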

Tale of the Alhambra, Sequence Models and Literature

A quick hands-on with sequence models using BRNN/LSTM in TensorFlow. In this post we cover the BRNN (Bidirectional Recurrent Neural Network, also sometimes referred to as a Bidirectional Recursive Neural Network) and the LSTM (Long Short-Term Memory).
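For context, a bidirectional LSTM can be assembled in a few lines of Keras. This is a minimal sketch, not the post's actual model; the input shape (100 time steps, 8 features) and the binary output are assumptions for illustration.

```python
import tensorflow as tf

# Minimal sketch of a bidirectional LSTM classifier in TensorFlow/Keras,
# assuming hypothetical inputs: sequences of 100 time steps with 8 features
# and a single binary label per sequence.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(100, 8)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),  # reads the sequence forwards and backwards
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

Wrapping the LSTM in `Bidirectional` simply runs one copy of the layer over the sequence in each direction and concatenates the outputs.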

Farewell RNNs, Welcome TCNs

How Temporal Convolutional Networks are gaining favor for sequence modeling, applied to stock trend prediction. Disclaimer: this article assumes that readers have preliminary knowledge of the intuition and architecture of LSTM neural networks.
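The sketch below shows the basic TCN ingredient: causal, dilated 1D convolutions in place of recurrence. It is an illustrative example under assumed inputs (windows of 60 prices, one feature), not the architecture used in the article.

```python
import tensorflow as tf

# Minimal TCN-style sketch for next-step stock trend prediction, assuming
# hypothetical inputs: windows of 60 daily closing prices with one feature.
# Causal, dilated 1D convolutions take the place of an LSTM's recurrence.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(60, 1)),
    tf.keras.layers.Conv1D(32, 3, padding="causal", dilation_rate=1, activation="relu"),
    tf.keras.layers.Conv1D(32, 3, padding="causal", dilation_rate=2, activation="relu"),
    tf.keras.layers.Conv1D(32, 3, padding="causal", dilation_rate=4, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1),  # predicted next value (or trend score)
])
model.compile(optimizer="adam", loss="mse")
```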

Markov and Hidden Markov Model

A stochastic process is a collection of random variables indexed by some mathematical set; that is, each random variable of the stochastic process is uniquely associated with an element of the index set.
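A simple concrete case is a Markov chain, where the next state depends only on the current one. The sketch below simulates a two-state chain with an assumed, hypothetical transition matrix.

```python
import numpy as np

# Minimal sketch of a two-state Markov chain, assuming a hypothetical
# transition matrix P, where P[i, j] is the probability of moving from
# state i to state j.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

rng = np.random.default_rng(seed=0)
state = 0
trajectory = [state]
for _ in range(10):
    # The Markov property: the next state depends only on the current state.
    state = int(rng.choice(2, p=P[state]))
    trajectory.append(state)
print(trajectory)
```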

Artificial Neural Networks — Recurrent Neural Networks

Remembering the history and predicting the future with neural networks: an intuition behind recurrent neural networks.