This was made as a learning project to experiment with event-driven architecture patterns. It uses Kafka as the message broker and includes two components: a server that consumes messages and pushes them over a WebSocket to a simple frontend (which it also serves), where each message is logged in the UI as it arrives, and a producer that publishes messages to Kafka for the server to consume.
- Install Node.js
- Install all node modules: `npm run install:all`
- Set up and run Kafka: `npm run start:kafka`
- Build and run the server/consumer: `npm run build:start:server`
- Open the website at `localhost:8080` in your browser. Note the logging of messages received.
- In a separate terminal window, build and run the producer: `npm run build:start:producer`. Note how counters are now being logged on the website.
You should now also be able to open the browser app from a separate device on your local network (provided your machine's firewall allows it) at `[serving machine's IP]:8080`.
There you have it! You are running a full event-driven system! Initially, the producer simply pushes an interval counter value to Kafka, but you can play with any part of this system to see how the pieces work and affect each other:
- The server/consumer
- The producer
- The website
- or even the Kafka setup
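The producer's interval counter mentioned above can be sketched as below. All names here (`makeCounter`, `tick`, the topic) are illustrative and not taken from this repo, and the Kafka call shown in the comment assumes the producer uses kafkajs:

```javascript
// Sketch of the producer's interval counter (illustrative, not this repo's
// actual code). A real producer would wrap tick() in setInterval and publish
// each value to Kafka, e.g. with kafkajs:
//   producer.send({ topic: 'counters', messages: [{ value: tick() }] })
function makeCounter() {
  let n = 0;
  // Each call returns the next counter value as a string,
  // since Kafka message values are sent as bytes/strings.
  return function tick() {
    n += 1;
    return String(n);
  };
}

const tick = makeCounter();
console.log(tick(), tick(), tick()); // 1 2 3
```

Everything downstream (the consumer, the WebSocket push, the browser log) just reacts to these values as they arrive, which is what makes the system event-driven.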
For example, an initial setting to look at in the server/consumer logic is the `fromBeginning` parameter used when setting up the consumer. It is currently set to `false`, but if it's set to `true`, the consumer will fetch all of the counter values that the producer published before the consumer started running and print them all at once in the web browser (if the browser is running first). However, if this setting is `true` but the browser is opened after the server is started, it will only see the newest counters.
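The behavior that `fromBeginning` controls (in kafkajs it is an option to `consumer.subscribe({ topic, fromBeginning })`, assuming that library is in use here) can be illustrated with a toy in-memory log. This is a simulation of the semantics, not the real Kafka API:

```javascript
// Toy in-memory log illustrating fromBeginning semantics: with
// fromBeginning=true a new subscriber replays the whole retained log;
// with false it only sees messages appended after it subscribes.
class ToyLog {
  constructor() {
    this.messages = [];
    this.subscribers = [];
  }
  publish(value) {
    this.messages.push(value);
    for (const cb of this.subscribers) cb(value);
  }
  subscribe(cb, { fromBeginning = false } = {}) {
    if (fromBeginning) this.messages.forEach(cb); // replay history first
    this.subscribers.push(cb); // then receive new messages as they arrive
  }
}

const log = new ToyLog();
log.publish('1');
log.publish('2');

const replayed = [];
log.subscribe((v) => replayed.push(v), { fromBeginning: true });

const latestOnly = [];
log.subscribe((v) => latestOnly.push(v), { fromBeginning: false });

log.publish('3');
console.log(replayed);   // [ '1', '2', '3' ]
console.log(latestOnly); // [ '3' ]
```

This is why a browser opened before the server (with `fromBeginning: true`) sees the whole history at once, while one opened later only sees what arrives after the server's consumer subscribed.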
There is also session logic in the server (using `express-session`), allowing you to play with user state and session management, for example deciding which consumed messages to deliver to each user, or when each user should start receiving them.
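One way to use that session state is per-session gating of consumed messages. The sketch below is independent of express itself and uses only the idea that `express-session` gives each browser a stable session id (`req.sessionID`); the `SessionGate` class and its method names are hypothetical, not from this repo:

```javascript
// Hypothetical per-session gate: the server records the counter value at
// which each session opted in, then only forwards newer messages to that
// session's WebSocket. Keyed by the express-session session id.
class SessionGate {
  constructor() {
    this.startedAt = new Map(); // sessionId -> counter value at opt-in
  }
  // Record the counter value current when this session started listening.
  start(sessionId, currentCounter) {
    this.startedAt.set(sessionId, currentCounter);
  }
  // Deliver only messages produced after the session opted in.
  shouldDeliver(sessionId, counter) {
    const since = this.startedAt.get(sessionId);
    return since !== undefined && counter > since;
  }
}

const gate = new SessionGate();
gate.start('session-a', 5);
console.log(gate.shouldDeliver('session-a', 6)); // true
console.log(gate.shouldDeliver('session-a', 4)); // false
console.log(gate.shouldDeliver('session-b', 6)); // false (never opted in)
```

In the real server, the consumer callback would check `gate.shouldDeliver(...)` before writing a message to each connected client's WebSocket.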