This repository contains endpoints for various Terrarium LLM workflows.
To build and run the API, cd into the repository root, then:

run: docker build -t gollm .
run: docker run -p 8000:8000 -e OPENAI_API_KEY=$OPENAI_API_KEY gollm
Once the API has been started, the /configure endpoint will consume a JSON payload with the structure:

{ research_paper: str, amr: obj }

The API will return a model configuration candidate with the structure:

{ response: obj }

where `response` contains the AMR populated with configuration values.
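As a minimal client sketch, a request to /configure might look like the following. This assumes the API is served on localhost:8000 and that the endpoint accepts a POST with a JSON body; neither detail is confirmed above, so adjust the URL and method to match the actual server.

```python
# Minimal client sketch for the /configure endpoint.
# ASSUMPTIONS: the API runs on localhost:8000 and /configure
# accepts a POST with a JSON body (not confirmed by this README).
import json
import urllib.request


def build_configure_payload(research_paper: str, amr: dict) -> bytes:
    # Matches the documented structure: { research_paper: str, amr: obj }
    return json.dumps({"research_paper": research_paper, "amr": amr}).encode("utf-8")


def configure(research_paper: str, amr: dict, base_url: str = "http://localhost:8000") -> dict:
    req = urllib.request.Request(
        base_url + "/configure",
        data=build_configure_payload(research_paper, amr),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # `response` holds the AMR populated with configuration values
        return json.loads(resp.read())["response"]
```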
Note: This is a WIP; it is unoptimized and is currently being used as a test case for integrating LLM features with Terrarium.
Once the API has been started, the /model_card endpoint will consume a JSON payload with the structure:

{ research_paper: str }

The API will return a model card in JSON format:

{ response: obj }
Note: This is a WIP.
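A corresponding client sketch for /model_card, under the same unconfirmed assumptions (localhost:8000, POST with a JSON body):

```python
# Minimal client sketch for the /model_card endpoint.
# ASSUMPTIONS: the API runs on localhost:8000 and /model_card
# accepts a POST with a JSON body (not confirmed by this README).
import json
import urllib.request


def build_model_card_payload(research_paper: str) -> bytes:
    # Matches the documented structure: { research_paper: str }
    return json.dumps({"research_paper": research_paper}).encode("utf-8")


def model_card(research_paper: str, base_url: str = "http://localhost:8000") -> dict:
    req = urllib.request.Request(
        base_url + "/model_card",
        data=build_model_card_payload(research_paper),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # `response` holds the generated model card object
        return json.loads(resp.read())["response"]
```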