MONGODB
Start server: mongod --port 27018 --dbpath /Users/mauromarini/Documents/hackathon/mongo_data --replSet "hackathon"
Connect to server: mongo --port 27018
Make the MongoDB node primary: rs.initiate()
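MongoDB change streams only work against a replica set, which is why rs.initiate() is needed even for a single node. A minimal sketch of watching the replica set from Python with pymongo (the database and collection names "hackathon" and "events" are placeholders, not taken from the project):

# Sketch: watch a MongoDB change stream on the single-node replica set.
# Database/collection names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27018/?replicaSet=hackathon")
collection = client["hackathon"]["events"]

# Block and print every insert/update/delete as it happens.
with collection.watch() as stream:
    for change in stream:
        print(change["operationType"], change.get("fullDocument"))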
KAFKA *** change the IP address (192.168.1.28) in the commands below to your own
Start zookeeper: zookeeper-server-start /usr/local/etc/kafka/zookeeper.properties
Start server Kafka: kafka-server-start /usr/local/etc/kafka/server.properties
Create topic test: kafka-topics --create --zookeeper 192.168.1.28:2181 --replication-factor 1 --partitions 1 --topic test
Producer console: kafka-console-producer --broker-list 192.168.1.28:9092 --topic test
Consumer console: kafka-console-consumer --bootstrap-server 192.168.1.28:9092 --topic test --from-beginning
Start producer script: python /Users/mauromarini/Documents/code_kafka_changestreams/kafkaproducer.py
Start consumer script: python /Users/mauromarini/Documents/code_kafka_changestreams/kafkaconsumer.py
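The contents of kafkaproducer.py and kafkaconsumer.py are not reproduced here; a minimal sketch of the same pattern with the kafka-python package, using the "test" topic and broker address from the commands above (message contents are placeholders):

# Sketch: minimal producer along the lines of kafkaproducer.py.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="192.168.1.28:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("test", {"hello": "kafka"})
producer.flush()

# Sketch: minimal consumer along the lines of kafkaconsumer.py.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "test",
    bootstrap_servers="192.168.1.28:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)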
CREATE THESE 2 TOPICS STATICALLY:
- data_in
- mongo_in
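The two topics above can be created with the same kafka-topics --create command shown for the test topic, or programmatically; a sketch with kafka-python's admin client (broker address taken from the commands above):

# Sketch: create the two required topics programmatically.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="192.168.1.28:9092")
admin.create_topics([
    NewTopic(name="data_in", num_partitions=1, replication_factor=1),
    NewTopic(name="mongo_in", num_partitions=1, replication_factor=1),
])
admin.close()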
CONFIGURE POSTGRESQL:
SQL.postegre_creation.sql
START PYTHON SCRIPTS IN THIS ORDER:
1. Main.py from its folder (this simulates data arrival)
2. postgresql.postgreconsumer.py (to read the Kafka stream)
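A sketch of what step 2 might look like: read the Kafka stream and insert each message into PostgreSQL with psycopg2. The topic name "data_in", the table/column names ("messages", "payload"), and the connection parameters are assumptions, not taken from postgreconsumer.py or SQL.postegre_creation.sql.

# Sketch: consume a Kafka topic and write each message into PostgreSQL.
# Topic, table, column, and connection details are placeholders.
import json
import psycopg2
from kafka import KafkaConsumer

conn = psycopg2.connect(dbname="hackathon", user="postgres", host="localhost")
consumer = KafkaConsumer(
    "data_in",
    bootstrap_servers="192.168.1.28:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

with conn.cursor() as cur:
    for message in consumer:
        cur.execute("INSERT INTO messages (payload) VALUES (%s)", [json.dumps(message.value)])
        conn.commit()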