In this short note, we describe a Jump Unit that can be used to fit a step function with a simple neural network. Our motivation comes from quantitative finance problems where discontinuities often appear.

Discontinuous functions are a common occurrence in financial instruments. For instance, the graph below shows the price of a typical five-year fixed-rate bond with a semi-annual coupon. We have set the coupon rate higher than the discount rate, so the value of the bond stays above its par value of $100. If you are not familiar with bond pricing, a good primer is available here.

(Image by Author)

The important things to notice for our purposes are the jumps that occur at each coupon payment date. This happens simply because money cannot be made out of thin air. The “wealth” of the security owner remains the same immediately before and after the coupon. As such, we have:

value before coupon = value after coupon + coupon cash.
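This conservation relation is easy to verify numerically. The sketch below prices the remaining cash flows of a bond like the one plotted above; the coupon rate, discount rate, and continuous-compounding convention are illustrative assumptions, not taken from the article's notebook.

```python
import numpy as np

def bond_value(t, coupon_rate=0.06, discount_rate=0.03,
               face=100.0, maturity=5.0, freq=2):
    """Value at time t of the bond's remaining cash flows.

    Assumes continuous discounting; a coupon paid exactly at t is
    treated as already received (so bond_value(t) is the post-coupon value).
    """
    coupon = face * coupon_rate / freq
    pay_times = np.arange(1, int(maturity * freq) + 1) / freq
    remaining = pay_times[pay_times > t]          # coupons still to come
    value = sum(coupon * np.exp(-discount_rate * (s - t)) for s in remaining)
    value += face * np.exp(-discount_rate * (maturity - t))
    return value

# Just before the first coupon date, the value exceeds the
# just-after value by exactly the coupon cash ($3 here):
before = bond_value(0.5 - 1e-9)
after = bond_value(0.5)
print(before - after)  # ≈ 3.0
```

The `1e-9` offset is just a way of evaluating the left and right limits of the value function at the coupon date; the $3 gap is the semi-annual coupon on a 6% annual rate.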

Similar jumps occur in the values of more complicated path-dependent financial derivatives on exercise dates. A classic example here is a *Bermudan swaption*, a popular instrument used to manage mortgage prepayment risk. Bermudan-style options can be exercised on a predetermined schedule of dates, and the option's value may jump on each of them.

(Image by Author)

To keep things concrete, we will focus on the sub-problem of learning a piecewise constant function with a single downward jump. We generate the training data as shown in the graph above. The reader is invited to follow along with our code in this Jupyter notebook.
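The linked notebook is not reproduced here, but training data of this kind can be generated along the following lines; the jump location, plateau levels, and sample count are placeholder choices, not the article's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def step_target(x, jump_at=0.5, high=1.0, low=0.0):
    # Piecewise constant target: `high` to the left of the jump,
    # `low` to the right, i.e. a single downward step at `jump_at`.
    return np.where(x < jump_at, high, low)

# Sample inputs uniformly and evaluate the target function on them.
x_train = rng.uniform(0.0, 1.0, size=200)
y_train = step_target(x_train)
```

A network with smooth activations will struggle to match the discontinuity at `jump_at` exactly, which is the behavior the Jump Unit is designed to address.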
