I recently got very interested in event streaming architecture and its wide adoption in banking, healthcare, government, and beyond. Naturally, my first stop was learning about Apache Kafka. It's fascinating how fast, scalable, and reliable this piece of software actually is.
That's why I decided to seek out some lessons and tutorials and create my own (very amateurish and simple) event streaming platform.
So, here's a very high-level overview of the Kafka architecture:
Credit to: Finematics
You don't see ZooKeeper on this diagram, but that's fine: it's being removed in Kafka 4.0 in favor of KRaft. I won't go into detail explaining the software; I am far, far, far from an expert.
-
This was my first project with Docker Compose, but there was already a ready-made .yml file for Kafka and ZooKeeper, so it didn't take much effort.
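I won't paste my exact file, but a minimal sketch of such a compose file typically looks something like this (I'm assuming the commonly used Confluent images here, and a single broker reachable from the host):

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"          # expose the broker to clients on the host
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1   # single-broker setup
```

Once this is up with docker compose up, clients on the host can talk to the broker at localhost:9092.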
-
The screenshot above shows my producer (on the left) and my consumer (on the right). The middle terminal is used to manually send a message to test the whole pipeline. As you can see, it works: on both sides of the frame there's confirmation that the message was received and stored.
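If you want to reproduce that manual test, Kafka ships with a console producer. Assuming the broker container is named kafka and the topic is called comments (both just placeholders for this example), something like this works:

```
docker exec -it kafka kafka-console-producer --bootstrap-server localhost:9092 --topic comments
```

Every line you type after that gets published as a message to the topic, which the consumer then picks up.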
-
This was also my first project in Go, using Go Fiber, which is like Express.js in the context of Golang. It lets you define routes for handling different HTTP methods, among many other features.
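As a tiny illustration, here's a sketch of how a Fiber route is defined; the /comment endpoint and the port are made up for the example, not taken from my project:

```go
package main

import (
	"log"

	"github.com/gofiber/fiber/v2"
)

func main() {
	app := fiber.New()

	// A POST route; in a setup like mine, a handler like this would
	// hand the request body off to a Kafka producer.
	app.Post("/comment", func(c *fiber.Ctx) error {
		return c.SendString("comment received: " + string(c.Body()))
	})

	log.Fatal(app.Listen(":3000"))
}
```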
-
I've also used Sarama in this project, a Go client library for Kafka that interacts with the cluster and produces and consumes messages to and from Kafka topics.
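To show what that looks like, here's a minimal sketch of a synchronous Sarama producer, assuming a broker on localhost:9092 and a hypothetical comments topic (the library now lives at github.com/IBM/sarama; older tutorials use github.com/Shopify/sarama):

```go
package main

import (
	"log"

	"github.com/IBM/sarama"
)

func main() {
	config := sarama.NewConfig()
	// SyncProducer requires this so SendMessage can report success.
	config.Producer.Return.Successes = true

	// Connect to the broker exposed by the compose setup.
	producer, err := sarama.NewSyncProducer([]string{"localhost:9092"}, config)
	if err != nil {
		log.Fatalln("failed to start producer:", err)
	}
	defer producer.Close()

	// Publish a single message to the (hypothetical) "comments" topic.
	partition, offset, err := producer.SendMessage(&sarama.ProducerMessage{
		Topic: "comments",
		Value: sarama.StringEncoder("hello kafka"),
	})
	if err != nil {
		log.Fatalln("failed to send message:", err)
	}
	log.Printf("message stored in partition %d at offset %d", partition, offset)
}
```

The consumer side is symmetric: sarama.NewConsumer plus ConsumePartition gives you a channel of incoming messages to range over.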