Disclaimer: this article assumes that readers have preliminary knowledge of the intuition and architecture of LSTM neural networks.


Overview

  1. Background of Deep Learning in FTS
  2. Noteworthy Data Preprocessing Practices for FTS
  3. Temporal Convolutional Network Architecture
  4. Example Application of Temporal Convolutional Networks in FTS
  • Knowledge-Driven Stock Trend Prediction and Explanation via TCN

1. Background

Financial Time Series (FTS) modelling is a practice with a long history, one that first revolutionised algorithmic trading in the early 1970s. The analysis of FTS has traditionally been divided into two categories: fundamental analysis and technical analysis. Both practices were called into question by the Efficient Market Hypothesis (EMH). The EMH, highly disputed since its initial publication in 1970, hypothesizes that stock prices are ultimately unpredictable. This has not deterred research attempting to model FTS with linear, non-linear, and ML-based models, as outlined below.

Due to the nonstationary, nonlinear, and high-noise characteristics of financial time series, traditional statistical models have difficulty predicting them with high precision. Hence, in recent years increasing attempts have been made to apply deep learning to stock market forecasting, though the results remain far from perfect. A few representative studies are listed in the timeline below.
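As a concrete, heavily simplified illustration of the nonstationarity claim above, the sketch below contrasts an augmented Dickey-Fuller test on raw closing prices with one on log returns. It is not taken from any of the cited papers; the CSV file name and `close` column are hypothetical, and it assumes pandas and statsmodels are available.

```python
# Minimal sketch: raw prices typically fail an ADF stationarity test,
# while log returns (first differences of log prices) usually pass it.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

prices = pd.read_csv("close_prices.csv")["close"]   # hypothetical daily closes
log_returns = np.log(prices).diff().dropna()         # differencing removes the trend

for name, series in [("prices", prices), ("log returns", log_returns)]:
    stat, p_value, *_ = adfuller(series.dropna())
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {p_value:.3f}")

# A large p-value for prices (unit root not rejected) and a small one for
# returns is the usual pattern that motivates differencing before modelling.
```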


2013

Lin et al. proposed a stock prediction method using a support vector machine to build a two-part feature-selection and prediction model, and showed that it generalizes better than conventional methods.

2014

Wanjawa et al. proposed an artificial neural network, a feed-forward multilayer perceptron trained with error backpropagation, to predict stock prices. The results showed that the model could predict prices in a typical stock market.

2017

Enter **LSTM**: a surge of studies applying LSTM neural networks to time-series data followed.

Zhao et al. added a time-weighted function to an LSTM neural network, and the results surpassed those of other models.

2018

Zhang et al. later combined a convolutional neural network (CNN) and a recurrent neural network (RNN) to propose a new architecture, the deep and wide area neural network (DWNN). The results show that the DWNN model can reduce the prediction mean squared error by 30% compared with a general RNN model.

Ha et al. used a CNN to develop a quantitative stock-selection strategy for determining stock trends and then predicted stock prices with an LSTM, yielding a hybrid neural network model for quantitative timing strategies aimed at increasing profits.

Jiang et al. used an LSTM neural network and an RNN to construct models and found that the LSTM was better suited to stock forecasting.

2019

Jin et al. incorporated investor sentiment into the model analysis and introduced empirical mode decomposition (EMD) combined with LSTM to obtain more accurate stock forecasts. The attention-based LSTM model is common in speech and image recognition but is rarely used in finance (a generic sketch of such a model follows this timeline).

Radford et al. introduced GPT-2, a precursor to the now much-discussed GPT-3. GPT-2's goal is to design a multitask learner, and it utilizes a combination of pretraining and supervised fine-tuning to achieve more flexible forms of transfer. It has 1542M parameters, far more than comparable models of the time.
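To make the attention-based LSTM mentioned under Jin et al. more concrete, below is a minimal PyTorch sketch of one common formulation: the LSTM outputs are pooled with learned attention weights before a final regression head. This is a generic illustration, not the cited authors' implementation; the layer sizes and input shapes are hypothetical.

```python
# Generic attention-over-LSTM sketch (not the implementation from Jin et al.).
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.attn = nn.Linear(hidden_size, 1)   # scores each time step
        self.head = nn.Linear(hidden_size, 1)   # next-step return/price prediction

    def forward(self, x):                        # x: (batch, time, n_features)
        outputs, _ = self.lstm(x)                # (batch, time, hidden_size)
        scores = self.attn(outputs)              # (batch, time, 1)
        weights = torch.softmax(scores, dim=1)   # attention over time steps
        context = (weights * outputs).sum(dim=1) # weighted sum of hidden states
        return self.head(context).squeeze(-1)    # (batch,)

# Smoke test on random data: 8 windows of 30 days with 5 features each (hypothetical).
model = AttentionLSTM(n_features=5)
x = torch.randn(8, 30, 5)
print(model(x).shape)   # torch.Size([8])
```

The softmax over time steps lets the model weight informative days more heavily than a plain last-hidden-state readout would.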
