
Knime Analytics Platform provides its users with a way to consume messages from Apache Kafka and publish the transformed results back to Kafka. This lets users easily integrate their Knime workflows with a distributed streaming publish-subscribe system.

With Knime 3.6 and later, users get a Kafka extension with three new nodes:

1. Kafka Connector

2. Kafka Consumer

3. Kafka Producer

A user familiar with Apache Kafka would already know what these three nodes do in a Knime workflow.

Kafka Extension to Knime

With the Kafka Extension, Knime users receive three new nodes:

  • Kafka Connector Node: Connects the workflow to a Kafka cluster. This node takes a list of host:port pairs to establish the connection.

  • Kafka Consumer Node: Once the Kafka Connector node is configured, the Kafka Consumer node reads messages from a topic in the connected cluster and provides them as a table for downstream nodes in the workflow.

  • Kafka Producer Node: Once the Kafka Connector node is configured, the Kafka Producer node publishes records from its input to a topic in the connected cluster. To configure this node, set properties such as the client ID, the topic to publish records to, and the input column containing the records to publish. The user can also choose how messages are sent: Synchronously, Asynchronously, or Fire and Forget.
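
The three delivery modes differ in how much the sender waits for from the broker. The sketch below illustrates the idea with a hypothetical `StubProducer` standing in for a real Kafka client (a real client would need a running broker); the `send`-returns-a-future contract mirrors common Kafka client libraries, and the topic and server names are made up for illustration.

```python
from concurrent.futures import Future


class StubProducer:
    """Hypothetical stand-in for a real Kafka producer client."""

    def __init__(self, bootstrap_servers):
        # As with the Kafka Connector node, the connection is
        # configured as a list of host:port pairs.
        self.bootstrap_servers = bootstrap_servers
        self.log = []

    def send(self, topic, value, callback=None):
        # A real client delivers asynchronously over the network;
        # the stub "delivers" immediately and records the message.
        future = Future()
        self.log.append((topic, value))
        future.set_result({"topic": topic, "offset": len(self.log) - 1})
        if callback is not None:
            callback(future.result())
        return future


producer = StubProducer(["broker1:9092", "broker2:9092"])

# Synchronous: block until the record is acknowledged.
meta = producer.send("events", "row-1").result(timeout=5)

# Asynchronous: continue immediately; a callback handles the result later.
producer.send("events", "row-2", callback=lambda m: print("acked", m["offset"]))

# Fire and forget: send and ignore the outcome entirely (fastest, least safe).
producer.send("events", "row-3")
```

Fire and forget gives the highest throughput but offers no delivery guarantee; synchronous sending is the safest and slowest, with asynchronous sending in between.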


Knime Meets Apache Kafka