This project demonstrates a real-time data pipeline using Kafka for messaging between two databases. It consists of two demos:
- `demo-db-to-kafka`: Produces student data from a source database and publishes it to a Kafka topic named `student`.
- `demo-kafka-to-db`: Consumes student data from the `student` topic and inserts it into a destination database.
- Every 2 seconds, 5 student records are read from the *source* database and published to the Kafka topic as a single batch (see the producer sketch below).
- Each batch is consumed and inserted into the *destination* database as soon as it arrives from the Kafka topic.
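As a rough illustration, the producer side could look like the following Spring Kafka sketch. The `Student` type, `StudentRepository`, and `findNextBatch` method are hypothetical names assumed for this example, not taken from the project's actual code:

```java
import java.util.List;

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Minimal sketch of the producer loop. Student, StudentRepository,
// and findNextBatch are illustrative assumptions; the real project
// defines its own entity and data-access layer.
@Component
public class StudentBatchProducer {

    private final KafkaTemplate<String, Student> kafkaTemplate;
    private final StudentRepository repository;

    public StudentBatchProducer(KafkaTemplate<String, Student> kafkaTemplate,
                                StudentRepository repository) {
        this.kafkaTemplate = kafkaTemplate;
        this.repository = repository;
    }

    // Fires every 2 seconds and publishes a batch of 5 records,
    // matching the cadence described above.
    @Scheduled(fixedRate = 2000)
    public void publishBatch() {
        List<Student> batch = repository.findNextBatch(5);
        batch.forEach(student -> kafkaTemplate.send("student", student));
    }
}
```

Note that `@Scheduled` only takes effect when `@EnableScheduling` is present on a configuration class.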
- Left window: Shows the Kafka consumer listening for incoming data.
- Right window: Shows the consumed records being inserted into the *destination* database (a consumer sketch follows the video).
Demo video: `Students.data.sent.from.DB.to.another.DB.through.Kafka.mp4`
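On the consuming side, a minimal Spring Kafka listener along these lines would cover the behavior shown in the video. Again, `Student`, `StudentRepository`, and the `groupId` value are assumptions made for this sketch:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Minimal sketch of the consumer. Student, StudentRepository,
// and the group id are illustrative assumptions.
@Component
public class StudentConsumer {

    private final StudentRepository destinationRepository;

    public StudentConsumer(StudentRepository destinationRepository) {
        this.destinationRepository = destinationRepository;
    }

    // Called for each record arriving on the "student" topic;
    // the record is written straight to the destination database.
    @KafkaListener(topics = "student", groupId = "demo-kafka-to-db")
    public void consume(Student student) {
        destinationRepository.save(student);
    }
}
```

This assumes a JSON (or similar) deserializer is configured for the `Student` payload in the consumer's Kafka properties.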
Ensure the following tools are installed to run the project smoothly:
- Java 21.0.3: Required to build and run the Spring Boot applications.
- Apache Maven 3.9.8: Used for dependency management and building the project.
- Spring Boot 3.3.3: Framework for building the Kafka producer and consumer demos.
- Docker 26.1.4: Necessary to set up MySQL, Zookeeper, and Kafka services in containers.
Follow these steps to run the project locally:
- Clone the repository:
  ```bash
  git clone https://github.com/Ahmad-AlDeeb/demo-db-to-kafka-to-db.git
  ```
- Navigate to the project directory:
  ```bash
  cd demo-db-to-kafka-to-db
  ```
- Start Docker services: Make sure Docker is running, then start the required services (MySQL, Zookeeper, Kafka) using:
  ```bash
  docker-compose up -d
  ```
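Optionally, `docker-compose ps` can be used to check that the MySQL, Zookeeper, and Kafka containers are up before moving on.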
- Build and run the producer demo:
  ```bash
  mvn -f ./demo-db-to-kafka clean install spring-boot:run
  ```
- Build and run the consumer demo: Open another terminal in the same directory and run:
  ```bash
  mvn -f ./demo-kafka-to-db clean install spring-boot:run
  ```
- Stop the services: After testing, stop both demos by pressing `CTRL + C` in each terminal.
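To also shut down the Docker services once you are done, `docker-compose down` stops and removes the MySQL, Zookeeper, and Kafka containers.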