If you plan to work on CueSearch code and make changes, this documentation will give you a high-level overview of the components used and how to modify them.
CueSearch has a multi-service architecture, with the following services:
- Frontend: a single-page application written in ReactJS. Its code can be found in the ui folder and runs on http://localhost:3000/.
- API: based on Django (a Python framework) and exposes a REST API. It is the main service, responsible for connections, authentication and anomaly detection.
- ElasticSearch for giving search suggestions (a rough query sketch follows this list).
- Celery to execute the tasks asynchronously. Tasks like hourly indexing are handled by Celery.
- Celery beat scheduler to trigger the scheduled tasks.
- Redis to handle the task queue of Celery.
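To make the ElasticSearch piece a little more concrete, here is a minimal sketch of a search-suggestion query against a local ElasticSearch instance. It assumes the 8.x Python client and the default port 9200; the index and field names are hypothetical, not taken from the CueSearch code.

```python
# Illustrative only: a minimal suggestion query against a local ElasticSearch.
# Index name, field name and port are assumptions, not CueSearch internals.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed default ES port

def suggest(prefix: str, size: int = 5):
    """Return documents whose (hypothetical) 'value' field matches the prefix."""
    response = es.search(
        index="search-suggestions",  # hypothetical index name
        query={"match_phrase_prefix": {"value": prefix}},
        size=size,
    )
    return [hit["_source"] for hit in response["hits"]["hits"]]

if __name__ == "__main__":
    print(suggest("reven"))
```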
Get the code by cloning our open-source GitHub repo:
git clone https://github.com/cuebook/CueSearch.git
cd CueSearch
docker-compose -f docker-compose-dev.yml --env-file .env.dev up --build
docker-compose's build command will pull several components and install them locally, so this will take a few minutes to complete.
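Once the containers are up, you can sanity-check the two HTTP endpoints mentioned on this page. The snippet below is only a convenience; it assumes nothing beyond the frontend and backend URLs documented here.

```python
# Quick sanity check that the dev stack is reachable.
# Only the two URLs documented on this page are assumed.
import urllib.request
import urllib.error

for name, url in [("frontend", "http://localhost:3000/"),
                  ("backend API", "http://localhost:8000/")]:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(f"{name}: HTTP {resp.status} at {url}")
    except urllib.error.HTTPError as exc:
        # The server is up but returned an error status.
        print(f"{name}: HTTP {exc.code} at {url}")
    except OSError as exc:
        print(f"{name}: not reachable at {url} ({exc})")
```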
The code for the backend is in the /api directory. As mentioned in the overview, it is based on the Django framework.
Configure the environment variables for the backend server as you need:
ENVIRONMENT=dev
NODE_ENV=development
CHOKIDAR_USEPOLLING=true
API_URL=http://localhost:8000
REDIS_BROKER_URL=redis://localhost:6379/0
CELERY_RESULT_BACKEND=redis
NGINX_API_URL=http://localhost:8000
NGINX_UI_URL=http://localhost:3030
## Central DB SETTINGS
#POSTGRES_DB_HOST=localhost
#POSTGRES_DB_USERNAME=postgres
#POSTGRES_DB_PASSWORD=postgres
#POSTGRES_DB_SCHEMA=cuesearch
#POSTGRES_DB_PORT=5432
## SUPERUSER'S VARIABLE
DJANGO_SUPERUSER_USERNAME=User
DJANGO_SUPERUSER_PASSWORD=admin
DJANGO_SUPERUSER_EMAIL=admin@domain.com
## AUTHENTICATION
IS_AUTHENTICATION_REQUIRED=False
Change the values based on your running PostgreSQL instance. If you do not wish to use PostgreSQL as your database for development, leave the POSTGRES_* lines commented out and CueSearch will create a SQLite database file at api/db/db.sqlite3.
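For context, a Django settings module typically switches between the two databases based on these variables, roughly along the lines of the simplified sketch below. This is not the exact CueSearch settings.py; the variable names come from the env file above, the rest is illustrative.

```python
# Simplified sketch of how DATABASES might be derived from the env vars above.
# The real settings.py in /api may differ.
import os
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

if os.environ.get("POSTGRES_DB_HOST"):
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "HOST": os.environ["POSTGRES_DB_HOST"],
            "USER": os.environ.get("POSTGRES_DB_USERNAME", "postgres"),
            "PASSWORD": os.environ.get("POSTGRES_DB_PASSWORD", "postgres"),
            "NAME": os.environ.get("POSTGRES_DB_SCHEMA", "cuesearch"),
            "PORT": os.environ.get("POSTGRES_DB_PORT", "5432"),
        }
    }
else:
    # Falls back to the SQLite file mentioned above (api/db/db.sqlite3).
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.sqlite3",
            "NAME": BASE_DIR / "db" / "db.sqlite3",
        }
    }
```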
The backend server can be accessed at http://localhost:8000/.
CueSearch uses Celery for executing asynchronous tasks like anomaly detection. Three components are needed to run an asynchronous task: Redis, Celery and Celery Beat. Redis is used as the message queue by Celery, so the Redis server should be running before the Celery services are started. Celery Beat is the scheduler and is responsible for triggering the scheduled tasks. Celery workers execute the tasks.
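To make the division of labour concrete, here is a minimal sketch of how a Celery app, a beat schedule and an hourly task fit together, assuming the Redis broker URL from the env file above. The task and app names are illustrative, not the actual CueSearch task names.

```python
# Minimal sketch: Celery app wired to the Redis broker, with a beat schedule
# that triggers an hourly indexing task. Names are illustrative only.
import os
from celery import Celery
from celery.schedules import crontab

app = Celery(
    "cuesearch",
    broker=os.environ.get("REDIS_BROKER_URL", "redis://localhost:6379/0"),
    backend="redis://localhost:6379/0",  # result backend, cf. CELERY_RESULT_BACKEND
)

@app.task
def index_search_suggestions():
    """Hypothetical hourly task: rebuild the search-suggestion index."""
    ...

# Celery Beat reads this schedule and enqueues the task every hour;
# a Celery worker then picks it up from the Redis queue and executes it.
app.conf.beat_schedule = {
    "hourly-indexing": {
        "task": index_search_suggestions.name,
        "schedule": crontab(minute=0),  # at the top of every hour
    },
}
```

With Redis already running, a worker (celery -A <app> worker) and a beat process (celery -A <app> beat) pointed at the same app are enough to exercise this setup; the dev docker-compose file takes care of starting these services for you.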
At the moment, we have test cases only for the backend service; test cases for the UI are on our roadmap.
The backend API and services are tested using pytest. To run the test cases, exec into the cueo-backend container and run the command
pytest
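As a point of reference, a pytest test for a Django REST endpoint tends to look something like the following. The endpoint path and the use of pytest-django and Django REST framework's test client are assumptions for illustration, not a description of the actual CueSearch test suite.

```python
# Illustrative pytest test; assumes pytest-django and DRF's APIClient,
# plus a hypothetical /api/search/ endpoint.
import pytest
from rest_framework.test import APIClient

@pytest.mark.django_db  # marker provided by pytest-django
def test_suggestions_endpoint_responds():
    client = APIClient()
    response = client.get("/api/search/")  # hypothetical endpoint path
    assert response.status_code == 200
```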