Description:
For this project, we developed an application that displays information from three different sources: the NASA API, a simulator backed by Elasticsearch and Kafka, and data scraped from two websites with a web scraper. The application collects and stores data in an Elasticsearch database. The data is transferred through Kafka, which can handle large volumes and decouples our data sources from the database. In addition, the system includes a real-time alert component that uses Socket.io to notify users of critical events. The front end is a Single Page Application (SPA) built with ReactJS. The back end is written in NodeJS, following a micro-services approach and a Lambda architecture. A minimal sketch of this data flow follows the tech-stack list below.
- API Service with NodeJS
- Redis
- Elasticsearch
- CloudKarafka
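To make the data flow concrete, here is a minimal sketch of a consumer that reads messages from Kafka and indexes them into Elasticsearch, using the kafkajs and @elastic/elasticsearch NodeJS clients. Every name in it (topic, index, group id, environment variables) is an assumption for illustration, not the project's actual configuration:

```javascript
// Minimal sketch only — topic, index, group id, and env var names are
// assumptions, not the project's actual configuration.
const { Kafka } = require("kafkajs");
const { Client } = require("@elastic/elasticsearch");

const kafka = new Kafka({
  clientId: "bigdata-consumer",            // hypothetical client id
  brokers: [process.env.KAFKA_BROKERS],    // e.g. a CloudKarafka broker URL
  // A real CloudKarafka connection also needs SASL/SSL credentials.
});
const es = new Client({ node: process.env.ELASTICSEARCH_URL });

async function run() {
  const consumer = kafka.consumer({ groupId: "events-group" }); // hypothetical
  await consumer.connect();
  await consumer.subscribe({ topic: "events", fromBeginning: false });

  await consumer.run({
    // Index every Kafka message as a document in Elasticsearch.
    eachMessage: async ({ message }) => {
      const event = JSON.parse(message.value.toString());
      await es.index({ index: "events", document: event });
    },
  });
}

run().catch(console.error);
```

The real-time alert component could hook into the same eachMessage handler, emitting critical events to connected clients over Socket.io.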
Clone the project

Important: you need the .env files in order to run the project.

```bash
git clone https://github.com/dolev146/BigData-Cloud-Computing-project.git
```
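The .env files are not committed to the repository, so obtain them from the project maintainers. Purely as a hypothetical illustration (none of these variable names are confirmed by the repository), an .env for this stack might look like:

```
# Hypothetical variable names — the real .env files may differ
NASA_API_KEY=...
KAFKA_BROKERS=...
ELASTICSEARCH_URL=...
REDIS_URL=...
```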
Go to the project directory

```bash
cd BigData-Cloud-Computing-project
```
Set Up Docker
Note: the Dockerfiles target Apple M1 (arm64). If you are not on an M1 machine, modify the Dockerfiles to use x86 instead of arm64, as sketched below.
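For example (illustrative only — the base image and exact lines in the project's Dockerfiles may differ), the change amounts to swapping the platform pin:

```dockerfile
# Apple M1 (arm64) version:
FROM --platform=linux/arm64 node:16

# On an Intel/AMD machine, use the x86_64 platform instead:
FROM --platform=linux/amd64 node:16
```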
Make sure the Docker engine is running on your computer, then run the following commands:

```bash
docker-compose build
docker-compose up
```
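What docker-compose builds and starts is defined by the project's docker-compose.yml. As a hedged sketch of what a compose file for this stack (Redis, Elasticsearch, and the NodeJS API service) could look like — the service names, images, ports, and build paths here are all assumptions:

```yaml
version: "3.8"
services:
  redis:
    image: redis:alpine
    ports:
      - "6379:6379"
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.9
    environment:
      - discovery.type=single-node   # single-node dev setup
    ports:
      - "9200:9200"
  api:
    build: ./api                     # assumed path to the API service
    depends_on:
      - redis
      - elasticsearch
```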
Set Up Scraper
Navigate to the scraper folder:

```bash
cd scraper
```

Install dependencies:

```bash
npm i
```

Start the scraper:

```bash
npm start
```
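The scraper's implementation lives in this folder; as a rough sketch of what a NodeJS scraper in this setup could look like (the URL, the selectors, and even the choice of axios and cheerio are assumptions, not necessarily what the project uses):

```javascript
const axios = require("axios");
const cheerio = require("cheerio");

// Hypothetical target URL and selectors, for illustration only.
async function scrape() {
  const { data: html } = await axios.get("https://example.com/news");
  const $ = cheerio.load(html);
  const items = [];
  $("article h2").each((_, el) => {
    items.push({ title: $(el).text().trim() });
  });
  return items; // in the real project, results would be published to Kafka
}

scrape().then((items) => console.log(items));
```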
Set Up the Frontend

Navigate to the frontend folder:

```bash
cd frontend
```

Install dependencies:

```bash
npm i
```

Run the front end:

```bash
npm run dev
```

Then press "o" in the terminal to open the front end in your browser.
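Once the dev server is running, the SPA can receive the real-time alerts mentioned in the description. Below is a minimal sketch of a React component subscribing via socket.io-client — the server URL, event name, and payload shape are assumptions:

```jsx
import { useEffect, useState } from "react";
import { io } from "socket.io-client";

export default function Alerts() {
  const [alerts, setAlerts] = useState([]);

  useEffect(() => {
    const socket = io("http://localhost:4000"); // assumed backend URL
    socket.on("alert", (event) => {             // assumed event name
      setAlerts((prev) => [...prev, event]);
    });
    return () => {
      socket.disconnect();                      // clean up on unmount
    };
  }, []);

  return (
    <ul>
      {alerts.map((a, i) => (
        <li key={i}>{JSON.stringify(a)}</li>
      ))}
    </ul>
  );
}
```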