The history of functional programming can be traced back to the lambda calculus, a mathematical language invented by Alonzo Church in the 1930s. In some ways, the lambda calculus is the first programming language, and its principles can still be found in modern functional languages, and even in Java. The first principle is that the main objects of the language are functions, and functions in the mathematical sense of the word: something that takes an input and returns an output.

Such a function always returns the same output for the same input. This property is known as statelessness: the function has no memory of previous inputs.

This property is so important that it has been given several names, such as purity (pure functions) or referential transparency. One consequence of this property is that there are no variables, because variables exist to preserve the memory of past events. It follows that there are also no assignments and no traditional loops, because a loop depends on some variable repeatedly changing its value.
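To make purity concrete, here is a minimal sketch in Java. The names `square` and `counterUp` are mine, for illustration only: `square` is pure, while `counterUp` is impure because it reads and mutates external state.

```java
public class PurityDemo {
    static int counter = 0;

    // Pure: the result depends only on the argument, so the same
    // input always produces the same output.
    static int square(int x) {
        return x * x;
    }

    // Impure: the result also depends on the mutable field "counter",
    // so two calls with the same argument can return different values.
    static int counterUp(int x) {
        counter++;
        return x + counter;
    }

    public static void main(String[] args) {
        System.out.println(square(4));      // always 16
        System.out.println(counterUp(4));   // 5 on the first call
        System.out.println(counterUp(4));   // 6 on the second call
    }
}
```

A pure function like `square` is referentially transparent: any call to it can be replaced by its result without changing the program's behavior, which is exactly what the assignment to `counter` breaks in `counterUp`.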

Let’s see a simple example. Consider the two Java methods below:
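The original code listing is not included here, so the following is a reconstruction based on the description in the text: an iterative `arraySum()` using the variables `sum` and `i`, and a recursive overload with an extra `start` parameter.

```java
public class ArraySum {

    // Iterative version: two variables, "sum" and "i", so it needs
    // only constant extra space regardless of the array's length.
    static int arraySum(int[] array) {
        int sum = 0;
        for (int i = 0; i < array.length; i++) {
            sum += array[i];
        }
        return sum;
    }

    // Recursive version: returns the sum of the elements whose index
    // is greater than or equal to "start". No variable is ever
    // reassigned, but every call takes a frame on the stack, so the
    // extra space used is linear in the array's length.
    static int arraySum(int[] array, int start) {
        if (start >= array.length) {
            return 0;
        } else {
            return array[start] + arraySum(array, start + 1);
        }
    }

    public static void main(String[] args) {
        int[] values = {1, 2, 3, 4};
        System.out.println(arraySum(values));      // iterative: 10
        System.out.println(arraySum(values, 0));   // recursive: 10
    }
}
```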

On the left-hand side, there is a method called arraySum(). It takes an array of integers as a parameter and simply sums all the values and returns the sum. Now, how can we do the same thing without variables or loops? We can use recursion.

In recursion, the method calls itself with slightly different parameters. In the right-hand-side method, I have used an extra parameter called ‘start’. This method returns the sum of the elements of the array whose index is greater than or equal to start. In the else block, it returns the element at index start plus the result of calling the same method with an increased value of start. This achieves the same effect as the more natural method on the left side, and it does so without any variables.

Just because we can replace iteration with recursion does not mean that we should. In particular, we should not do it in Java, because Java is not optimized for recursion: the JVM performs no tail-call optimization. The example on the left-hand side requires only constant space to perform the sum, and this constant space is taken by the two variables “sum” and “i”.

On the other hand, the method on the right-hand side requires linear extra space: an amount of memory proportional to the size of the input array. This memory is used by the recursion, because every recursive call to the method takes some space on the stack. It means that if the input array is too large, the iterative version on the left will work fine, while the recursive version on the right will run into a stack overflow.
