Spring Boot & Apache Kafka: Publishing and Consuming Messages from Topics | ChargeAhead LLC | Skillshare

Spring Boot & Apache Kafka: Publishing and Consuming Messages from Topics

ChargeAhead LLC, Technologist


Lessons in This Class

12 Lessons (11m)
    • 1. Introduction

    • 2. Course Roadmap

    • 3. Audience

    • 4. Prerequisites

    • 5. Tools

    • 6. Demo: Installing Kafka

    • 7. Demo: Spring Boot Integrating with Apache Kafka

    • 8. Demo: Creating a Kafka Producer

    • 9. Demo: Creating a Kafka Consumer

    • 10. Demo: Creating REST API to Post Messages to Kafka Topic

    • 11. Demo: Flow in Action

    • 12. Summary






About This Class

In this course we will install Apache Kafka, a very popular open source event streaming platform. We will use the command line tools to create a topic, create a producer, publish messages, and consume messages. Next we will create a Spring Boot app and integrate it with Apache Kafka: we will create a producer to publish messages to a Kafka topic and a consumer to consume messages from it. Finally, we will create a REST endpoint to post messages to the Apache Kafka topic.

Meet Your Teacher

ChargeAhead LLC






1. Introduction: Hello, my name is Bianca Jane, and welcome to this course on Spring Boot and Apache Kafka: publishing and consuming messages from topics. Apache Kafka is a very popular open source stream processing platform. It is a distributed, scalable log framework which can handle trillions of log messages or events every day. Events, or logs, in Kafka are organized as topics. In this course, we will see how we can publish and consume messages from Apache Kafka topics using Spring Boot.

2. Course Roadmap: In this course, we will first talk about the audience for this course, the prerequisites required, and the tools needed to follow along. Next, we will install Apache Kafka on an Ubuntu Linux machine. Then, using the command line, we will create a Kafka topic, create a producer, publish some messages, and then consume them in a consumer. If you already have access to a Kafka instance, you can skip this section and move on to the next clip. Next, we will create our Spring Boot app and see how easy it is to integrate Apache Kafka with Spring Boot. We will create a producer to publish messages to an Apache Kafka topic. Next, we will create a consumer to consume messages from that topic. Finally, we will create a REST API to post messages to Apache Kafka and see everything in action.

3. Audience: This course is for developers who want to create a Spring Boot app that works with Apache Kafka. It is also suitable for architects who want to understand Apache Kafka topics and the producer-consumer model. This is a beginner level course, and I will explain everything as we go along.

4. Prerequisites: There are not a whole lot of prerequisites for this course. A basic understanding of Spring Boot would be good to have, though. We will create a Spring Boot app from scratch, and it should be easy to follow along. You should also have some basic Java concepts. I will go step by step, and it should be very easy for you to follow along.

5. Tools: I will be using Spring Tool Suite for the demos. Spring Tool Suite is a flavor of Eclipse optimized for Spring projects, so I find it very efficient and productive for creating Spring projects. These programs can also be written using just a notepad; however, an IDE provides many nice features like IntelliSense, compile checks, ease of running, etc., so I would always recommend that you get familiar with and use a good IDE. If you are more comfortable with and want to use some other IDE like IntelliJ IDEA, please feel free to do so. I will, of course, be using Java (OpenJDK) to demonstrate; I will be using OpenJDK version 11, and it is always good to use the latest version of Java, as it brings with it new features, fixes, and optimizations. We will also be needing Apache Kafka, and I will show you how to download and install it. You can find the code for the demos in this course at the following GitHub URL.

6. Demo: Installing Kafka: First, let's type "download Apache Kafka" in Google and choose the download link from kafka.apache.org. Let's go to the binary download for Kafka 2.7.0 (Scala 2.13), click on the middle link, and save the file. Here is the file in my local folder. Let's double-click it and extract it to a location of your choice; I like keeping mine in the tools directory. Once the extraction is complete, let's go to the extracted folder, where we see a bunch of folders like bin, config, etc. Let's open a new terminal at this location. First, let me sudo as a superuser. Kafka requires ZooKeeper to run, and ZooKeeper comes along with the download, so the first step is to start ZooKeeper. The command is bin/zookeeper-server-start.sh, pointing it to the config/zookeeper.properties file. Again, all these files came with the download. ZooKeeper starts successfully. Next, let's start Kafka. Let's open another terminal in the same location.
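For reference, the two startup commands described in this demo, written out as a sketch (run from the extracted Kafka directory; the script and config file names are the ones shipped with the Kafka 2.x download):

```shell
# Terminal 1: start ZooKeeper first -- Kafka requires it, and it ships with the download
bin/zookeeper-server-start.sh config/zookeeper.properties

# Terminal 2: start the Kafka broker itself (listens on port 9092 by default)
bin/kafka-server-start.sh config/server.properties
```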
Again, log in as a superuser. Let's run the command bin/kafka-server-start.sh config/server.properties. Kafka starts at its default port of 9092. Now, using the command line, let's create a new Kafka topic. Again, let's open another terminal in the same location and log in as a superuser. The command is bin/kafka-topics.sh --create --topic followed by the name of the topic (let's call it my-topic) and --bootstrap-server localhost:9092. The topic is created. Now let's start the console producer, which will produce the messages. Enter the command bin/kafka-console-producer.sh --topic my-topic --bootstrap-server localhost:9092. Now whatever we type in this window will be posted as messages to our topic, my-topic. But let's start the consumer first. Let's open another terminal, log in as a superuser, and run the command bin/kafka-console-consumer.sh --topic my-topic --bootstrap-server localhost:9092. All right, the consumer is up and listening. In the producer window, let's type hello world. This gets published to my-topic, and the consumer listening to my-topic gets it and displays it. We can keep producing more messages, and they go to the consumer.

7. Demo: Spring Boot Integrating with Apache Kafka: Here I have Spring Tool Suite open. Let's go to File > New > Other, go under Spring, and then choose Spring Starter Project. Click Next. Let's give it a name, kafka-producer-consumer. The type is Maven, the packaging is Jar, and the Java version is 11. Click Next. For the frameworks, let's search for "web" and choose Spring Web, then search for "kafka" and choose Spring for Apache Kafka. Let's expand the project and open the pom.xml file, and here we can see the two dependencies we added for Web and Kafka. Next, let's create a Kafka producer.

8. Demo: Creating a Kafka Producer: Let's right-click on the package, choose New > Package, and give it the name service. Let's right-click on that package, choose New > Class, and call it Producer. This class is going to publish messages to Kafka; you publish messages to topics in Kafka. Let's create a static final String for the topic, and name the topic my-topic. Next, let's autowire the KafkaTemplate provided by Spring, which provides convenience methods to publish to Kafka. It takes in two strings: the name of the topic we want to publish to and the message. Let's name it kafkaTemplate and fix the imports. Let's create a method publishToTopic which takes in the message we want to post to the topic. Inside, let's put a System.out printing "publishing to topic" with the topic name, and then, using our KafkaTemplate, send our message to the topic. That is all it takes to create a producer; Spring Boot does all the hard work behind the scenes. Next, let's create a consumer.

9. Demo: Creating a Kafka Consumer: One thing I would do in a production application is separate the publisher and the consumer into two separate Spring Boot projects or services, so that they can be deployed and scaled separately. However, to keep it simple and easy to understand, I am keeping them as part of the same project. So here, let's right-click on the package, choose New > Class, call it Consumer, and mark it with the @Service annotation. Now let's add the @KafkaListener annotation, which, as its name suggests, listens for Kafka messages on the specified topic: here, the my-topic topic our producer is publishing to, with the group ID my-group. The method consumeFromTopic takes in the message received from the topic. Inside, we can do anything with the message: perform aggregation with previous messages, transform it, push it further to another Kafka topic, etc. Here we just do a System.out to show that we got the message. With that, we have our producer and consumer. Let's create an API.

10.
Demo: Creating REST API to Post Messages to Kafka Topic: Let's right-click on the package, choose New > Package, and give it the name controller. Right-click on that package, choose New > Class, and give it the name KafkaController, annotated with the @RestController annotation. Let's autowire the Producer class we created earlier and import it. Let's create a POST call on the /post URI to post messages to Kafka. First, let's add a request mapping of /kafkaapp for the main controller. Let's create a method sendMessage which takes in the request param message, which is of type String. Inside, let's call the producer's publishToTopic method, which takes in the message to be published. Finally, let's open our application.properties file, where we tell the producer and consumer about the location of the Apache Kafka instance. Here I have pasted a few properties; let me go over them with you. server.port specifies the port our app will listen on. Next, we have a few spring.kafka.consumer properties to indicate the bootstrap servers, i.e., the Kafka server, which is running locally on my machine at localhost:9092. We indicate the group ID as my-group, the same one we used in our @KafkaListener. For auto-offset-reset we have used earliest. This flag tells Kafka where to start reading offsets in case you do not have any commits yet; in other words, if you have not registered any offset yet, it will start either from the earliest or from the latest offset. For the key and value we are deserializing using the StringDeserializer, and likewise serializing with strings for the producer. For the producer we are, again, telling it the Apache Kafka location. This ties the producer and consumer to Kafka. That is it. Let's start our app: right-click and choose Run As > Spring Boot App. Looks like there was a problem starting the app: it was not able to locate the autowired Producer in the controller. So let's go back to our producer class.
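Putting the last three demos together, the producer, consumer, and controller might look like the sketch below. Class, method, topic, and URI names follow the narration (so, e.g., the /kafkaapp mapping and the my-topic name are as heard in the video, not verified against the instructor's repository), the imports assume the spring-boot-starter-web and spring-kafka dependencies added earlier, and each class would live in its own file:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

// Producer.java -- publishes messages to the Kafka topic
@Service
class Producer {
    private static final String TOPIC = "my-topic";

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void publishToTopic(String message) {
        System.out.println("Publishing to topic " + TOPIC);
        kafkaTemplate.send(TOPIC, message);
    }
}

// Consumer.java -- listens on the same topic
@Service
class Consumer {
    @KafkaListener(topics = "my-topic", groupId = "my-group")
    public void consumeFromTopic(String message) {
        // We could aggregate, transform, or forward the message here;
        // the demo just prints it.
        System.out.println("Consumed message: " + message);
    }
}

// KafkaController.java -- REST endpoint that posts messages to the topic
@RestController
@RequestMapping("/kafkaapp")
class KafkaController {
    @Autowired
    private Producer producer;

    @PostMapping("/post")
    public void sendMessage(@RequestParam("message") String message) {
        producer.publishToTopic(message);
    }
}
```

With the app running, a POST to http://localhost:8888/kafkaapp/post?message=hello should make the producer publish the message and the consumer print it.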
Ah, I had missed adding the @Service annotation, so Spring could not locate it. Let's add that and start the app again. This time it starts successfully, and we see the consumer getting launched with our topic and group. Now, to test it, we need a client to execute the POST call.

11. Demo: Flow in Action: You can use any app like Postman. Here, on Firefox, if you go to the add-ons, you can see that I am using the RESTED client. In my personal experience I have found the RESTED client very easy to use; it stores a history of REST calls, etc. To get it, you can go to your plugins section and search for RESTED, then click here; since I already have it installed, it gives me the option to remove it, but you would see something like Install. After that, you will see these angle brackets in your browser, which you can click to launch the RESTED client. So let's change the method to POST, put in the URL for our app, http://localhost:8888/kafkaapp/post, and using a question mark pass the request param message with a value, say, hello. It was successful, with a 200 status code. Let's go back to our Spring Boot app, and we see that the producer successfully posted the message to the topic, and the consumer picked it up from there and printed it. Let's go back and enter a few more messages: a second message, 200 status code; a third message, which is successful again. Going back to the console, we see these messages being published to the topic and the consumer getting them and printing them to the console.

12. Summary: In this course, we first downloaded and installed Apache Kafka. Next, using the command line, we created a Kafka topic, created a producer, published some messages, and then consumed them in a consumer. Next, we created a Spring Boot app with a producer to publish messages to a Kafka topic, and then a consumer which listens to the topic and consumes the messages.
We saw how Spring Boot does the heavy lifting and makes it extremely easy for us to integrate Kafka in our app. Thanks for watching.
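For completeness, the application.properties entries walked through in the REST API demo might look like the following sketch (the key names are standard Spring Boot Kafka properties; the port, group ID, and broker address are the values used in the course):

```properties
# Port our app will listen on
server.port=8888

# Consumer: where Kafka is running, which group to join,
# and where to start reading when no offset has been committed yet
spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=my-group
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer

# Producer: again telling it the Kafka location, plus string serializers
spring.kafka.producer.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
```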