
A complete example integrating a generic chatbot front-end, a Rasa chatbot service, and a Qanary-driven question answering system.

WSE-research/Rasa-Qanary-integration-example


Integration Example of a Rasa Chatbot and a Qanary Question Answering System

The sole purpose of this repository is to provide a complete example of the integration of a generic chatbot UI, a Rasa chatbot, and a Qanary-driven question answering system (note that the actual quality of the QA system is not the focus). The integration is done using a Rasa custom action that passes the intended questions to the Qanary question answering system. As a frontend, we use a generic chatbot UI here. However, there are no specific requirements for the frontend; hence, you might use any other Rasa-compatible frontend (e.g., a custom chatbot UI) as well.
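The glue code for this integration can be sketched as follows. Inside a Rasa custom action's run() method, a request like the one below could forward the user's question to the Qanary pipeline. The host and port follow the docker-compose defaults used in this project, and the endpoint path is Qanary's standard API for starting a QA process with a textual question; the exact payload used in this repository may differ, so treat this as an illustrative assumption.

```python
# Sketch (an assumption, not code from this repository) of how a Rasa custom
# action could forward a user's question to the Qanary pipeline.
import urllib.parse
import urllib.request

QANARY_URL = "http://localhost:8080/startquestionansweringwithtextquestion"

def build_qanary_request(question: str, components: list[str]) -> urllib.request.Request:
    """Build the POST request that triggers a Qanary QA process."""
    body = urllib.parse.urlencode(
        {"question": question, "componentlist[]": components},
        doseq=True,  # repeat componentlist[] once per component
    ).encode()
    return urllib.request.Request(QANARY_URL, data=body, method="POST")

# Inside the custom action's run() method, this request would be sent with
# urllib.request.urlopen(req) and the result rendered for the chatbot UI.
req = build_qanary_request(
    "When was Albert Einstein born?",
    ["AutomationServiceComponent",
     "BirthDataQueryBuilderWikidata",
     "SparqlExecuterComponent"],
)
```

The component list mirrors the execution order orchestrated by the Qanary system: named entity recognition first, then query building, then query execution.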

Use Case

The use case of the configured system is to compute the answer to a question about the birth date and birthplace of a person. For example:

  • Where and when was Albert Einstein born?

  • Where was Albert Einstein born?

  • When was Albert Einstein born?

The system is configured to answer the question using the following steps:

  1. Identify the first name and last name of the person using an NER/NED component

  2. Create a SPARQL query to query Wikidata for the birth date of the person

  3. Query Wikidata for the birth date of the person

  4. Prepare an HTML answer to the question to be displayed in the chatbot UI
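Steps 2 and 3 can be illustrated with a minimal sketch of the kind of SPARQL query a component like BirthDataQueryBuilderWikidata might produce. The Wikidata identifiers are real (P569 is the date-of-birth property, P19 the place-of-birth property, and Q937 is Albert Einstein), but the helper function and the exact query shape are illustrative assumptions, not the component's actual output.

```python
# Illustrative sketch (not the component's actual code): build a SPARQL query
# for a person's birth date and birthplace on Wikidata.

def build_birth_query(entity_id: str) -> str:
    """Return a SPARQL query fetching birth date and birthplace.

    entity_id would come from the NER/NED step (step 1), e.g. "Q937".
    """
    # wikibase: and bd: prefixes are predefined on the Wikidata endpoint.
    return f"""
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?birthDate ?birthPlaceLabel WHERE {{
  wd:{entity_id} wdt:P569 ?birthDate ;    # date of birth
                 wdt:P19  ?birthPlace .   # place of birth
  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
}}
""".strip()

query = build_birth_query("Q937")  # Q937 = Albert Einstein
```

Step 3 would then send this query to the public Wikidata SPARQL endpoint (https://query.wikidata.org/), and step 4 would format the bindings as HTML.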


Example interaction: "When was Albert Einstein born?"

Big Picture

The following figure shows the architecture.

graph TD

    Chatbot[<b>General Purpose Chatbot UI</b><br>provides an interactive frontend on port 3000]

    subgraph Rasa[Rasa Chatbot framework]
        direction TB
        RasaBackend[<b>Rasa chatbot backend</b><br>provides NLU and stories]
        RasaCustomAction[<b>Rasa Custom Action</b><br>provides a custom implementation for handing specific questions to the question answering functionality]
    end

    subgraph Qanary[Qanary question answering framework]
        direction TB
        QanarySystem[<b>Qanary System</b><br>orchestrates the connected Qanary components as demanded by the Rasa custom action]
        AutomationServiceComponent["<b>AutomationServiceComponent<br>(Qanary component)</b><br>identifies names of persons in<br> a given question using a pre-trained<br> model (simple one, but functional)"]
        BirthDataQueryBuilderWikidata["<b>BirthDataQueryBuilderWikidata<br>(Qanary component)</b><br>using the identified names,<br> it creates a SPARQL query that can<br> be used to fetch the requested<br> data from Wikidata"]
        SparqlExecuterComponent["<b>SparqlExecuterComponent<br>(Qanary component)</b><br>executes the previously<br> computed SPARQL query on<br> Wikidata to fetch the<br> requested data"]
    end

    QanaryTriplestore["<b>Qanary Triplestore</b><br>stores all information for a question in a specific graph<br>(i.e., a global QA process memory) <br>here: a demo service of the WSE research group is used (a Stardog triplestore)"]

    Wikidata["<b>Wikidata knowledge graph</b><br><a href='https://query.wikidata.org/'>public SPARQL endpoint<br>for querying RDF data</a>"]

    MLflow["<b>MLflow</b><br>logging of the training and usage<br> of the AutomationServiceComponent <br>(available on port 5000)"]
    AutomationServiceComponent-->|<span>logs results</span>|MLflow

    Chatbot -->|<span>calls RESTful endpoint at port 5005</span>| RasaBackend
    RasaBackend -->|<span>calls RESTful endpoint at port 5055</span>| RasaCustomAction
    RasaCustomAction -->|<span>calls RESTful endpoint at port 8080</span>| QanarySystem

    QanarySystem -->|<span>1. calls</span>| AutomationServiceComponent
    QanarySystem -->|<span>2. calls</span>| BirthDataQueryBuilderWikidata
    QanarySystem -->|<span>3. calls</span>| SparqlExecuterComponent

    AutomationServiceComponent-->|<span>interact using SPARQL</span>| QanaryTriplestore
    BirthDataQueryBuilderWikidata-->|<span>interact using SPARQL</span>| QanaryTriplestore
    SparqlExecuterComponent-->|<span>interact using SPARQL</span>| QanaryTriplestore

    SparqlExecuterComponent-->|<span>interact using SPARQL</span>| Wikidata


    classDef subgraphClass fill:#FFF,stroke:#999,stroke-width:1px,font-size:15px,font-weight:bold;
    classDef boxClass font-size:100%;
    linkStyle default stroke-width:2px,stroke:darkgray,fill:#FFFFFF00,color:black;

    class Qanary,Rasa subgraphClass
    class Chatbot,RasaBackend,RasaCustomAction,QanarySystem,AutomationServiceComponent,BirthDataQueryBuilderWikidata,SparqlExecuterComponent boxClass

Starting the complete system using docker-compose

All components are orchestrated using a docker-compose file. The configuration is available at docker-compose.yml. Use the following commands to build and start all components of this project.

docker-compose build
docker-compose up

The configuration parameters are available in the file .env.

⚠️
The first time you start the system, it takes a while to download all required Docker images. Additionally, please be aware that starting the Rasa chatbot service takes a while (you might check the availability of the components using the URLs shown below).
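While the services are starting, a quick reachability check can be scripted. The following sketch (a convenience helper, not part of the repository) probes the default ports from the docker-compose configuration described above.

```python
# Check whether the services are already reachable on their default ports
# (ports taken from the docker-compose defaults described in this README).
import socket

SERVICES = {
    "chatbot UI": 3000,
    "Rasa backend": 5005,
    "Rasa custom action": 5055,
    "Qanary system": 8080,
    "MLflow": 5000,
}

def is_up(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in SERVICES.items():
    state = "up" if is_up("localhost", port) else "not yet reachable"
    print(f"{name} (port {port}): {state}")
```

A successful TCP connection only shows that the port is open; the Rasa backend in particular may need additional time after that before it answers requests.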

Troubleshooting: no gpu support

GPU support is enabled by default in the file docker-compose.yml as it is useful for speeding up the training process of the AutomationServiceComponent. If you get the following error message, you need to upgrade your Docker version or remove the GPU support from the docker-compose file.

ERROR: for automation_component  device_requests param is not supported in API versions < 1.40

Additionally, we provide a docker-compose file without GPU requests at docker-compose-nogpu.yml. To start the system without GPU support, use the following commands.

docker-compose -f docker-compose-nogpu.yml build
docker-compose -f docker-compose-nogpu.yml up

Troubleshooting: new components' versions

The docker-compose file is configured to use the latest versions of the components. The components' actual versions are shown in the console. If you want to use a newly published version of a component, you might need to force docker-compose to pull the Docker images.

docker-compose pull

Troubleshooting: Pipeline crashes due to Virtuoso errors

When first executing docker-compose up, the Qanary pipeline may try to access the Virtuoso instance before it has started. In this case, it helps to restart the pipeline.

Components of the architecture

The following components are integrated into this example. Note that the components are not part of this repository (links are provided for each component). In the following, the port numbers refer to the default configuration of the docker-compose file.
