Apache Kafka is a cool product, but if you are thinking about using it for event sourcing you should think again.
Kafka is a great tool for delivering messages between producers and consumers, and its optional topic durability lets you store your messages permanently. Forever, if you'd like.
So, if your messages are events, you can use Kafka as an event store or an event log, but it really isn't a suitable tool for event sourcing.
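As a concrete illustration, permanent retention is an ordinary per-topic setting. A minimal sketch, assuming a local broker at localhost:9092 and a topic named "orders" (both illustrative): setting retention.ms to -1 tells Kafka never to expire messages on that topic.

```shell
# Disable time-based retention on the "orders" topic (keep events forever).
kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name orders \
  --alter --add-config retention.ms=-1
```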
When a service receives a command requesting a state change of an event-sourced entity, it first needs to recreate the current state of that entity. It does so by loading all previous events for that particular entity ID from the event store and re-applying them to the object, fast-forwarding it into its current state.
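The rehydration step above can be sketched in a few lines. This is a minimal illustration, not any particular framework's API: the Order class and the event names are made up for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Order:
    status: str = "new"
    items: list = field(default_factory=list)

    def apply(self, event):
        # Fast-forward the entity by applying one event at a time.
        if event["type"] == "ItemAdded":
            self.items.append(event["item"])
        elif event["type"] == "OrderShipped":
            self.status = "shipped"

def rehydrate(events):
    # Replay all past events, in their original order, onto a fresh entity.
    order = Order()
    for event in events:
        order.apply(event)
    return order

history = [
    {"type": "ItemAdded", "item": "book"},
    {"type": "ItemAdded", "item": "pen"},
    {"type": "OrderShipped"},
]
order = rehydrate(history)  # current state: shipped, two items
```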
Loading events for a particular entity like this is not easy in Kafka. Topics are usually centered around, and partitioned by, entity types such as "Orders", "Payments", or "Notifications", so we would have to go through all events for all orders and filter them by ID just to load a single "Order" entity. Although this is possible, it's not very practical.
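To make the inefficiency concrete, here is the scan-and-filter pattern reduced to plain Python over an in-memory list standing in for the "Orders" topic (the field names are illustrative). Every event in the topic must be read and deserialized, only to discard all but a handful:

```python
def load_entity_events(topic_events, entity_id):
    # Scan the ENTIRE topic, keeping only events for one entity ID.
    # With Kafka, this means consuming every record in the partition.
    return [e for e in topic_events if e["entity_id"] == entity_id]

orders_topic = [
    {"entity_id": "42", "type": "ItemAdded"},
    {"entity_id": "7",  "type": "ItemAdded"},   # unrelated order, read anyway
    {"entity_id": "42", "type": "OrderShipped"},
]
events = load_entity_events(orders_topic, "42")  # 2 of 3 records kept
```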
One alternative would be to have one topic per entity, but then we would probably end up with thousands of topics, and on top of that, the subscribing downstream services would need a way to automatically discover the newly created topic for each new entity. Again, not very practical.
When our entity's state has been recreated, it's time to execute the business logic requested by the incoming command. If the business logic fails, we return an error to the client, but if it succeeds, a new event is emitted. In that case, we must be able to save the new event to our event store with a guarantee that no other event has been stored for this particular entity ID in the meantime; otherwise we would risk breaking the consistency of our domain objects.
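This guarantee is usually implemented as an optimistic-concurrency check on append: the writer states which version of the entity it loaded, and the store rejects the write if another event has arrived since. A minimal in-memory sketch of that check, which a plain Kafka producer cannot express, with illustrative names throughout:

```python
class ConcurrencyError(Exception):
    """Raised when another event was appended since the entity was loaded."""

class InMemoryEventStore:
    def __init__(self):
        self._streams = {}  # entity_id -> ordered list of events

    def load(self, entity_id):
        # Return the event history plus the current stream version.
        events = self._streams.get(entity_id, [])
        return events, len(events)

    def append(self, entity_id, event, expected_version):
        stream = self._streams.setdefault(entity_id, [])
        if len(stream) != expected_version:
            # Someone else stored an event in the meantime: reject the write
            # instead of silently corrupting the entity's history.
            raise ConcurrencyError(
                f"expected version {expected_version}, found {len(stream)}"
            )
        stream.append(event)

store = InMemoryEventStore()
events, version = store.load("order-42")          # version 0, no events yet
store.append("order-42", {"type": "ItemAdded"}, expected_version=version)
```

A second append with the stale `expected_version=0` would now raise ConcurrencyError, which is exactly the conflict signal the command handler needs in order to retry against fresh state.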