Kafka with Spring Cloud Stream on Docker – part 2

This is a continuation of part 1 on Kafka and Spring Cloud Stream on Docker. In this post, we will set up a Spring Boot project, configure a binder for Kafka, and produce messages. Before we delve deeper, it is important to understand a few concepts in Spring Cloud Stream and how it works.

When a microservice is ready to publish a message, it publishes it through a source. A source defines the contract between the message producer and the message destination.

A channel is an abstraction over the actual queue or topic. A channel name is always associated with a target queue name in configuration, which makes it easy to switch queues by changing configuration.

A binder is Spring’s code that talks to a specific messaging platform, such as RabbitMQ or Kafka.

To get started, add the @EnableBinding annotation to the bootstrap class of the Spring Boot project you created in part 1. This turns the Spring Boot project into a Spring Cloud Stream project.
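A minimal sketch of what this looks like, assuming the bootstrap class from part 1 is named KafkaDemoApplication (the class and package names here are illustrative; use your own) and that we use the framework-provided Source interface, which binds a default channel named output:

```java
package com.example.kafkademo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;

@SpringBootApplication
@EnableBinding(Source.class) // binds the default "output" channel for publishing
public class KafkaDemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaDemoApplication.class, args);
    }
}
```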

Next, configure your application.yml as below.
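A sketch of such a configuration, assuming the output channel, the KafkaDemoTopic topic, and Kafka/Zookeeper running locally as set up in part 1:

```yaml
spring:
  cloud:
    stream:
      bindings:
        output:                        # channel name from the Source binding
          destination: KafkaDemoTopic  # Kafka topic the channel maps to
      kafka:
        binder:
          brokers: localhost   # Kafka broker host; default port is assumed
          zkNodes: localhost   # Zookeeper host; default port is assumed
```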

The above binding maps a channel called output to a Kafka topic called KafkaDemoTopic. It specifies Kafka as the underlying messaging platform; you can switch this to RabbitMQ, ActiveMQ, etc., depending on your actual implementation. It also specifies where Kafka and Zookeeper are running. A couple of things to watch out for:

You may get a connection error if the port specifications are not correct. I didn’t specify ports, since Spring already knows the default ports for Zookeeper (2181) and Kafka (9092).

A message-conversion error is also common if you forget to add message converters.
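One way to address this (an assumption on my part, not shown in the original post) is to declare a content type on the binding so Spring Cloud Stream knows how to convert outgoing payloads:

```yaml
spring:
  cloud:
    stream:
      bindings:
        output:
          content-type: application/json  # convert payloads to JSON on the way out
```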

Once you have the above set up, create a message producer that posts messages to the topic, as below.
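A sketch of such a producer; the class name and payload are illustrative assumptions. It injects the Source binding and sends messages through its output channel:

```java
package com.example.kafkademo;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Component;

@Component
public class KafkaMessageProducer {

    private final Source source;

    @Autowired
    public KafkaMessageProducer(Source source) {
        this.source = source;
    }

    public void publish(String payload) {
        // Sends the message to the "output" channel, which the
        // configuration binds to the KafkaDemoTopic topic.
        source.output().send(MessageBuilder.withPayload(payload).build());
    }
}
```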

That’s all you have to do. Check your Topic for messages.
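One way to check the topic is Kafka’s console consumer. This assumes the Kafka CLI tools are available (e.g., inside the broker container from part 1) and that the broker is reachable at localhost:9092; adjust for your Docker setup:

```shell
kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic KafkaDemoTopic \
  --from-beginning
```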

Please watch the video below for additional details.

Download working source code from kafkaspringboot folder.