
ML summarisation using LLM with docker, nginx, RabbitMQ, gunicorn & flask.


pogpog/summarisation-docker-rabbitmq


Background Processing with RabbitMQ, Python and Flask

A simple job worker that uses RabbitMQ for job management. A full write-up and explanation is available here.

Quick start

Start processes

Use the following command to start all three services (the RabbitMQ broker, the Flask server, and the worker):

% docker-compose up -d
Starting rabbitmq-job-worker_worker_1 ... done
Starting rabbitmq-job-worker_rabbitmq_1 ... done
Starting rabbitmq-job-worker_server_1 ... done
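
The three containers above correspond to three Compose services. A minimal `docker-compose.yml` along these lines would produce that output; the image names, build context, commands, and port mappings below are illustrative assumptions, not the repository's actual file:

```yaml
version: "3"

services:
  rabbitmq:
    image: rabbitmq:3-management
    ports:
      - "5672:5672"    # AMQP
      - "15672:15672"  # management UI

  server:
    build: .
    command: gunicorn -b 0.0.0.0:5000 app:app   # hypothetical entry point
    ports:
      - "5002:5000"    # host port 5002, matching the curl example below
    depends_on:
      - rabbitmq

  worker:
    build: .
    command: python worker.py   # hypothetical worker script
    depends_on:
      - rabbitmq
```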

Run jobs

To see all of this in action, hit the /add-job/hey or /add-job/hello endpoint on localhost and watch the messages flow through:

% curl localhost:5002/add-job/hey
[x] Sent: hey
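
The producer side of this endpoint can be sketched as a small Flask app that publishes each message to a RabbitMQ queue with pika. The queue name `jobs`, the hostname `rabbitmq`, and the route handler below are assumptions for illustration, not the repository's actual code:

```python
# Minimal sketch of the producer, assuming a `jobs` queue and a RabbitMQ
# host named `rabbitmq` (the docker-compose service name).
from flask import Flask

app = Flask(__name__)


def publish(message: str) -> None:
    """Send one message to the `jobs` queue."""
    import pika  # imported lazily so the app can be inspected without a broker

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
    channel = connection.channel()
    channel.queue_declare(queue="jobs", durable=True)
    channel.basic_publish(exchange="", routing_key="jobs", body=message)
    connection.close()


@app.route("/add-job/<message>")
def add_job(message):
    publish(message)
    return f"[x] Sent: {message}\n"
```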

Check jobs

The worker container logs should show the job being executed:

% docker logs rabbitmq-job-worker_worker_1
...
[*] Sleeping for 10 seconds.
[*] Connecting to server ...
[*] Waiting for messages.
[x] Received b'hey'
hey there
[x] Done
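
The log above matches the shape of a standard blocking pika consumer. A sketch of such a worker, under the same assumptions as before (a `jobs` queue and a `rabbitmq` hostname); the initial sleep gives RabbitMQ time to boot before the worker connects:

```python
# Minimal sketch of the worker, assuming the same `jobs` queue and
# `rabbitmq` hostname as the producer.
import time


def callback(ch, method, properties, body):
    """Handle one job: do the toy 'work', then acknowledge the message."""
    print(f"[x] Received {body!r}")
    print(f"{body.decode()} there")
    ch.basic_ack(delivery_tag=method.delivery_tag)
    print("[x] Done")


def main():
    import pika  # imported lazily so callback() is testable without a broker

    print("[*] Sleeping for 10 seconds.")
    time.sleep(10)  # wait for RabbitMQ to finish starting
    print("[*] Connecting to server ...")
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
    channel = connection.channel()
    channel.queue_declare(queue="jobs", durable=True)
    print("[*] Waiting for messages.")
    channel.basic_consume(queue="jobs", on_message_callback=callback)
    channel.start_consuming()


if __name__ == "__main__":
    main()
```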
