Leverage MongoDB's change streams feature to capture every change as it occurs, in real time.
Many different databases are used at Adform, each tailored to specific requirements, but common to all of these use cases is the need for a consistent interchange of data between the data stores. Keeping the origin of that data and its copies consistent manually is a tedious task, and after enough rounds of duplication the origin may no longer be the source of truth. Each service's need for its own copy of the data is also dictated by loose coupling and performance: it wouldn't be practical to be constantly impacted by every change made in the source system. The answer is an event-based architecture, which keeps every change consistent and makes it possible to restore the sequence of changes for a particular entity. For these reasons, we decided to use the publisher/subscriber model. MongoDB's change streams saved the day, finally letting us say farewell to the far more complex practice of oplog tailing.
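A change stream delivers each write as a structured change event. As a rough sketch (not Adform's actual code), the handler below models the fields a change event carries and how it might be mapped to a publishable message. The event shape mirrors MongoDB's change event document (`operationType`, `fullDocument`, the affected namespace), but the `PublishedMessage` type and the per-collection topic naming are assumptions for illustration.

```scala
// Minimal model of a MongoDB change event, mirroring the fields of the
// change event document: operationType, the affected namespace, the
// document key, and the full document for inserts/replaces.
case class ChangeEvent(
  operationType: String,               // "insert", "update", "replace", "delete"
  db: String,
  collection: String,
  documentKey: String,                 // stringified _id
  fullDocument: Option[Map[String, String]]
)

// The message we would hand to the publisher side of the pub/sub model.
case class PublishedMessage(topic: String, key: String, payload: String)

object ChangeStreamMapper {
  // Map a change event to a message on a per-collection topic, keyed by the
  // document id so subscribers can replay changes for a particular entity.
  def toMessage(e: ChangeEvent): PublishedMessage = {
    val payload = e.operationType match {
      case "delete" => """{"deleted":true}"""
      case _ =>
        e.fullDocument
          .map(d => d.map { case (k, v) => s""""$k":"$v"""" }.mkString("{", ",", "}"))
          .getOrElse("{}")
    }
    PublishedMessage(s"${e.db}.${e.collection}", e.documentKey, payload)
  }
}
```

In a real service the events would come from `collection.watch()` in the MongoDB driver; this sketch only shows the mapping step, which is the part worth unit-testing in isolation.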
Akka gRPC provides support for building streaming gRPC servers and clients on top of Akka Streams and Akka HTTP.

Features of Akka gRPC:
- A code generator that starts from protobuf service definitions and produces:
  - model classes;
  - the service API as a Scala trait using Akka Stream Sources;
  - on the server side, code to create an Akka HTTP route based on your implementation of the service;
  - on the client side, a client for the service.
- A gRPC runtime implementation that uses Akka's HTTP/2 support on the server side and grpc-netty-shaded on the client side.
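Code generation starts from a protobuf service definition. A minimal sketch of such a definition (the `GreeterService` name and its messages are illustrative, not taken from a specific project):

```protobuf
syntax = "proto3";

package helloworld;

// From a service like this, Akka gRPC generates a Scala trait with one
// method per rpc; streaming rpcs are expressed as Akka Stream Sources.
service GreeterService {
  rpc SayHello (HelloRequest) returns (HelloReply) {}
  rpc StreamHellos (HelloRequest) returns (stream HelloReply) {}
}

message HelloRequest {
  string name = 1;
}

message HelloReply {
  string message = 1;
}
```

You then implement the generated trait on the server side and expose it as an Akka HTTP route, while the generated client offers the same methods for the calling side.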
As the world grows, so does the data, and analysing that data has become important. But how will you do it? How will you work with data whose size is u
In my previous blogs, I discussed the basics of akka-streams and materialization. Now let's dig deeper into Graphs in Akka Streams. Graphs Till now
The coronavirus pandemic has brought the world to a standstill. Countries are under major lockdowns. Schools, colleges, theatres, gyms, clubs, and all other public
Hello friends, I hope you are all staying safe during the COVID-19 pandemic and learning new tools and tech while at home. In our last blog post on Akka Cluster, we learnt about the configuration we need in order to form an Akka Cluster, but we didn't see it in action. In this blog post, we will see a cluster in action.
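As a reminder of the kind of configuration the previous post covered, here is a minimal sketch of an `application.conf` for a local two-seed-node cluster. The `ClusterSystem` name, the localhost addresses and ports, and the use of Artery remoting are assumptions for illustration, not a prescribed setup:

```hocon
akka {
  actor {
    # Use the cluster-aware actor provider.
    provider = cluster
  }
  remote.artery {
    canonical.hostname = "127.0.0.1"
    canonical.port = 2551   # each node in the cluster gets its own port
  }
  cluster {
    # New nodes contact the seed nodes to join the cluster.
    seed-nodes = [
      "akka://ClusterSystem@127.0.0.1:2551",
      "akka://ClusterSystem@127.0.0.1:2552"
    ]
  }
}
```

Starting two JVMs with this configuration (each with its own `canonical.port`) is enough to watch the nodes discover each other and form a cluster.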