Event-driven architecture requires forethought: creating the scaffolding to integrate popular event-streaming platforms can become complicated. Spring Cloud Stream is a framework for building message-driven microservice applications. It provides binder implementations for various message brokers, such as RabbitMQ, Apache Kafka, Kafka Streams, and Amazon Kinesis. The framework simplifies things and lets us build message publication and consumption for different platforms by keeping the implementation details of the chosen platform out of our code, behind familiar Spring idioms and interfaces.
In this post we will be using the Apache Kafka message broker.
Jay Kreps chose to name the software after the author Franz Kafka because it is “a system optimized for writing”, and he liked Kafka’s work.
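To sketch how the binder abstraction keeps broker-specific details out of the code, the broker address and the topic bindings live in configuration rather than in Java. The function and topic names below are hypothetical placeholders, not part of this post's final code; `<functionName>-in-0` is Spring Cloud Stream's binding-name convention for the first input of a function bean:

```yaml
spring:
  cloud:
    function:
      definition: sendWelcomeEmail   # name of the function bean to bind (hypothetical)
    stream:
      bindings:
        sendWelcomeEmail-in-0:
          destination: user-created  # Kafka topic name (placeholder)
      kafka:
        binder:
          brokers: localhost:9092
```

Swapping Kafka for another supported broker would mean changing the binder dependency and this configuration, not the application code.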
In my previous article, we used Spring Application Events to trigger email sending each time a customer makes a reservation.
Now we will build on top of that to look at the Spring Cloud Stream architecture. We will split that application into two microservices that communicate over messages: a User service that handles user administration, and a Notification service that is in charge of sending different notifications, in this case emails.
The User service will be the message producer (publisher): it will publish a message to a Kafka topic each time a new user account is created. The Notification service will be the message consumer: it will consume messages from the topic and react to them, without ever being explicitly called by the User service.
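To make the consumer side of this split concrete, here is a minimal, framework-free sketch. Spring Cloud Stream's functional model binds plain `java.util.function` types to topics, so the handler below has the same shape a real Notification service bean would have; the event record, class names, and message text are all hypothetical, and the actual Kafka wiring (the `@Bean` declaration and binder configuration) is omitted:

```java
import java.util.function.Consumer;

// Hypothetical payload the User service would publish on account creation.
record UserCreatedEvent(String email) {}

class NotificationHandlers {

    // What the consumer does with each event, separated out so it is easy
    // to exercise directly in a test.
    static String welcomeMessage(UserCreatedEvent event) {
        return "Sending welcome email to " + event.email();
    }

    // In the real Notification service this Consumer would be exposed as a
    // @Bean; Spring Cloud Stream would then bind it to the Kafka topic
    // configured under spring.cloud.stream.bindings.
    static final Consumer<UserCreatedEvent> sendWelcomeEmail =
            event -> System.out.println(welcomeMessage(event));
}

public class PubSubSketch {
    public static void main(String[] args) {
        // A delivered message ultimately results in a call to accept():
        NotificationHandlers.sendWelcomeEmail.accept(
                new UserCreatedEvent("jane@example.com"));
    }
}
```

The important point is that the handler never references the User service or Kafka directly; it only declares what to do with an event, and the framework routes messages to it.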
By using messaging for communication, we can also add new functionality that reacts to changes in the User service simply by listening for messages on the topic, and we allow our services to scale independently.
#spring #spring-cloud #programming #java #kafka