Hi Folks! I hope you are all safe during the COVID-19 pandemic and learning new tools and tech while staying at home. I have just started learning Flink, a very prominent Big Data framework for stream processing. Flink is a distributed framework built on the streaming-first principle: it is a true stream-processing engine that implements batch processing as a special case. In this blog, we will look at the basic anatomy of a Flink program, so that we understand the basic structure of a Flink application and how to start writing one.

Let’s explore the steps involved in setting up a streaming application in Flink with a simple example. In this example, we will read messages as text from a socket text stream and then filter out any line that is a number. The Flink application for this use case is accomplished in five steps, as shown below.
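Before walking through the steps one by one, here is a minimal end-to-end sketch of the whole program. The class name, hostname, and port are my own placeholders, and I read "filter out if it is a number" as dropping numeric lines; adjust to your setup.

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SocketNumberFilter {

    public static void main(String[] args) throws Exception {
        // Step 1: set up the stream execution environment
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Step 2: create a source that reads lines of text from a socket
        // (hostname/port are placeholders; feed it locally with `nc -lk 9999`)
        DataStream<String> lines = env.socketTextStream("localhost", 9999);

        // Step 3: transformation, dropping any line that is a number
        DataStream<String> nonNumbers =
                lines.filter(line -> !isNumeric(line.trim()));

        // Step 4: sink, printing the surviving lines to stdout
        nonNumbers.print();

        // Step 5: trigger execution; nothing runs until execute() is called
        env.execute("Socket Number Filter");
    }

    // Plain-Java helper: true if the whole string parses as a number
    static boolean isNumeric(String s) {
        if (s.isEmpty()) {
            return false;
        }
        try {
            Double.parseDouble(s);
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }
}
```

To try it, start a text source with `nc -lk 9999` in one terminal, run the job, and type a mix of words and numbers; only the non-numeric lines should be printed.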

Step 1: Set Up the Execution Environment

The very first step is to let Flink know the right environment for the application, i.e., whether the streaming application will run locally or connect to a cluster of machines. So, we need to create a stream execution environment.

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

StreamExecutionEnvironment executionEnvironment =
       StreamExecutionEnvironment.getExecutionEnvironment();
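getExecutionEnvironment() picks the right environment automatically based on where the program runs, but you can also request one explicitly. A small sketch of both variants; the host, port, and jar path below are placeholder assumptions:

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EnvironmentVariants {
    public static void main(String[] args) {
        // Explicit local environment: runs the job in the current JVM,
        // which is handy for debugging inside an IDE
        StreamExecutionEnvironment localEnv =
                StreamExecutionEnvironment.createLocalEnvironment();

        // Explicit remote environment: submits the job to a running cluster
        // (host, port, and jar path are placeholder assumptions)
        StreamExecutionEnvironment remoteEnv =
                StreamExecutionEnvironment.createRemoteEnvironment(
                        "flink-master-host", 8081, "path/to/your-job.jar");
    }
}
```

In most programs you should stick with getExecutionEnvironment(), since the same jar then works unchanged both in the IDE and on a cluster.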

