This project focuses on developing a complete Spanish conversational AI model and integrating it into a cloud service, exposing it through a flexible and scalable API. The entire API backend is written in Python with Flask, and the other microservices are served with Uvicorn. The conversational model is built on GPT-2. I developed all of this infrastructure during my internship at Giant.
Thanks to this, you can integrate the conversational model into any of your projects with just a few lines of code. Below is a Python example of how to call the model from your applications.
```python
import requests

while True:
    text = input("Human: ")
    # Send the question to the Agata API; "id" identifies the conversation.
    r = requests.post(
        'http://0.0.0.0:3000/agata/conversate/en/',
        json={"id": "1", "question": text},
    )
    if r.status_code == 200:
        data = r.json()
        print(f'Agata: {data["answer"]}')
```
Here is an example of a conversation with Agata:
```
Human: Hey! What is your name?
Agata: My name is Agata. Can I help you?
Human: Yes, I want to talk about computer science.
Agata: I have a lot of questions for you.
Human: Me too. Did you know something about C#?
Agata: Yes, I have heard of it.
Human: What is it?
Agata: It is a programming language that is written in C#.
Human: Ok hahaha. Do you know how to code in C#?
Agata: Yes, it is the easiest programming language to learn.
Human: Where can I learn it?
Agata: The book is called "The Book of C#" and it is easy to learn.
Human: Ok I will read it. Can you recommend me another programming language?
```
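The request loop shown earlier can be wrapped in a small helper so that the integration really does come down to a couple of lines. The endpoint URL and JSON fields below are taken from the example above; the `AgataClient` name itself is just an illustration, not part of the project.

```python
import requests

class AgataClient:
    """Illustrative wrapper around the Agata conversation endpoint."""

    def __init__(self, base_url="http://0.0.0.0:3000", conv_id="1"):
        self.base_url = base_url
        self.conv_id = conv_id

    def payload(self, question):
        # Build the JSON body used by /agata/conversate/<lang>/
        return {"id": self.conv_id, "question": question}

    def ask(self, question, lang="en"):
        r = requests.post(
            f"{self.base_url}/agata/conversate/{lang}/",
            json=self.payload(question),
        )
        r.raise_for_status()
        return r.json()["answer"]

if __name__ == "__main__":
    client = AgataClient()
    print("Agata:", client.ask("Hey! What is your name?"))
```

With this in place, a conversation is just `client = AgataClient()` followed by `client.ask(...)` calls that reuse the same conversation id.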
Here is an example of how to start working with Agata Cloud. First clone the repository, then start the services; during setup, the trained models and the heaviest files are downloaded.
```shell
git clone https://github.com/CristianValero/AgataBot.git
cd AgataBot
docker-compose up
```
Once this process has finished, the entire infrastructure is deployed and ready to work with.
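Since downloading the models can take a while, it can be useful to poll the API until it answers before pointing clients at it. Below is a minimal sketch that probes the conversation endpoint from the example above; the `wait_until_ready` helper is hypothetical, not part of AgataBot.

```python
import time

def wait_until_ready(check, timeout=60.0, interval=2.0):
    """Call `check()` until it returns True or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval)
    return False

def agata_is_up():
    # Hypothetical probe: one POST against the endpoint from the example.
    import requests
    try:
        r = requests.post(
            "http://0.0.0.0:3000/agata/conversate/en/",
            json={"id": "health", "question": "ping"},
            timeout=5,
        )
        return r.status_code == 200
    except requests.RequestException:
        return False

if __name__ == "__main__":
    if wait_until_ready(agata_is_up):
        print("Agata is ready")
```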
These are some of the tools used to develop AgataBot. You will also need many of these libraries if you want to use this technology in your own deployment.
- GPT-2 - Main deep learning model
- TensorFlow - Framework for the internal models
- PyTorch - Framework for the internal models
- NumPy - Used for internal maths
- Flask - Python API framework
- Uvicorn - ASGI server for the async microservices
If you don't use Docker, you will need to create a separate environment for each microservice that makes up the infrastructure, and install the required libraries in each one so that everything works correctly. You can install everything through the requirements.txt file. These are some of the common libraries that AgataBot Cloud uses:
```
tensorflow-gpu==1.15
torch
numpy
flask
flask_restful
uvicorn
```
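Creating those per-microservice environments can also be scripted. Below is a sketch using only the Python standard library; the service names in the example are made up for illustration, and the real services in the repository may be laid out differently.

```python
import subprocess
import sys
from pathlib import Path

def setup_commands(env_dir, requirements="requirements.txt"):
    """Return the commands that create a venv and install its requirements."""
    python = str(Path(env_dir) / "bin" / "python")
    return [
        [sys.executable, "-m", "venv", str(env_dir)],
        [python, "-m", "pip", "install", "-r", requirements],
    ]

def setup_environment(env_dir, requirements="requirements.txt"):
    for cmd in setup_commands(env_dir, requirements):
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Illustrative service names only; adjust to the actual repository layout.
    for service in ("api", "model"):
        setup_environment(f".venv-{service}")
```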
These are the people who have participated in the development of this project.
- Cristian Valero Abundio - Research & Development - titianvalero
- Tell others about this project 📢
- Buy Cristian a beer 🍺 or a coffee ☕
- Share it 🤓