@GWSzeto - Thanks a bunch for having a look. I've updated the README.md for the demo to clarify some of this.
Let me know if this doesn't cover your questions.
-
Looked over the code in the iACEui project and wanted to do a writeup of my understanding of the key pieces, how the project is set up, and how it runs.
Key Pieces
There are 3 main services that drive the ACE Framework, plus a frontend to help set up the system.
The 3 main services are the Postgres DB, RabbitMQ, and the API, all of which are orchestrated through Docker Compose.
The Postgres DB is responsible for storing the identity, reasoning, and ancestral prompts, which serve as the system messages, along with the control bus and data bus prompts, which serve as rules for the LLM to follow.
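As a rough sketch of what that storage might look like (I'm assuming a SQLAlchemy model here; the table, column, and connection-string names are my own guesses, not the project's actual schema):

```python
# Hypothetical per-layer prompt table, assuming SQLAlchemy is used for the Postgres DB.
# Table, column, and connection-string names are illustrative, not iACEui's actual schema.
from sqlalchemy import Column, Integer, String, Text, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class LayerPrompts(Base):
    __tablename__ = "layer_prompts"  # assumed name

    id = Column(Integer, primary_key=True)
    layer_name = Column(String, nullable=False)         # e.g. "aspirational"
    identity_prompt = Column(Text, nullable=False)      # system message: who this layer is
    reasoning_prompt = Column(Text, nullable=False)     # system message: how the layer should reason
    ancestral_prompt = Column(Text, nullable=False)     # system message shared across the agent
    data_bus_prompt = Column(Text, nullable=False)      # rules for messages put on the data bus
    control_bus_prompt = Column(Text, nullable=False)   # rules for messages put on the control bus

engine = create_engine("postgresql+psycopg2://ace:ace@localhost:5432/ace")  # assumed DSN
Session = sessionmaker(bind=engine)
```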
RabbitMQ serves as the pipeline for the control and data buses, carrying the messages passed between layers. To route messages through the layers properly, each layer is responsible for handling 4 queues: 2 subscriber queues for messages from the data and control buses, and 2 publisher queues for messages to the data and control buses. The subscribed data queue receives messages from the layer below, while the subscribed control queue receives messages from the layer above; for the publisher queues it is the reverse.
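If I'm reading the bus wiring correctly, one layer's queue setup looks roughly like the sketch below (the queue names and the pika-based approach are my assumptions, not necessarily how iACEui declares them):

```python
# Hypothetical sketch of one layer's four-queue setup, using pika.
# Queue names and the layer names are assumptions for illustration.
import pika

LAYER = "agent_model"          # this layer
UPPER = "global_strategy"      # layer above (sends control messages down)
LOWER = "executive_function"   # layer below (sends data messages up)

conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
ch = conn.channel()

# Subscriber queues: control messages arrive from the layer above,
# data messages arrive from the layer below.
ch.queue_declare(queue=f"{UPPER}.control.{LAYER}", durable=True)
ch.queue_declare(queue=f"{LOWER}.data.{LAYER}", durable=True)

# Publisher queues: this layer sends control messages down and data messages up.
ch.queue_declare(queue=f"{LAYER}.control.{LOWER}", durable=True)
ch.queue_declare(queue=f"{LAYER}.data.{UPPER}", durable=True)

def on_control(chn, method, properties, body):
    # Guidance from the layer above arrives here.
    print("control message:", body.decode())
    chn.basic_ack(delivery_tag=method.delivery_tag)

def on_data(chn, method, properties, body):
    # Results/telemetry from the layer below arrive here.
    print("data message:", body.decode())
    chn.basic_ack(delivery_tag=method.delivery_tag)

ch.basic_consume(queue=f"{UPPER}.control.{LAYER}", on_message_callback=on_control)
ch.basic_consume(queue=f"{LOWER}.data.{LAYER}", on_message_callback=on_data)
ch.start_consuming()
```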
The API exposes multiple types of endpoints, covering starting the system, testing it, and setting/getting its prompts.
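A minimal FastAPI sketch of that surface; only the /mission endpoint is named in my reading of the project, so the /prompts route, payload shapes, and the in-memory/print stand-ins are assumptions of mine:

```python
# Rough sketch of the API surface. Only /mission is named above; the /prompts route,
# payload fields, and the stand-in helpers are assumptions for illustration.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Prompts(BaseModel):
    layer_name: str
    identity_prompt: str
    reasoning_prompt: str
    ancestral_prompt: str
    data_bus_prompt: str
    control_bus_prompt: str

class Mission(BaseModel):
    mission: str

PROMPT_STORE: dict[str, dict] = {}  # stand-in for the Postgres table

def save_prompts_to_db(p: Prompts) -> None:
    PROMPT_STORE[p.layer_name] = p.model_dump()

def publish_to_control_bus(layer: str, body: str) -> None:
    print(f"publish to {layer} control queue: {body}")  # stand-in for a RabbitMQ publish

@app.post("/prompts")  # assumed route: store one layer's prompts
def set_prompts(prompts: Prompts):
    save_prompts_to_db(prompts)
    return {"status": "stored", "layer": prompts.layer_name}

@app.post("/mission")  # named above: publishes the mission to the aspirational layer
def start_mission(m: Mission):
    publish_to_control_bus("aspirational", m.mission)
    return {"status": "mission published"}
```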
How it's set up
On initial setup, the Svelte frontend is served to allow the user to set the identity, reasoning, ancestral, control bus, and data bus prompts through the API, thus populating the Postgres DB.
Once the DB has been populated, the system is ready to be started through the /mission endpoint.
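Putting the setup flow together against the API sketch above (again, everything except /mission is assumed):

```python
# Hypothetical setup-then-start flow; normally the Svelte frontend makes these calls.
import requests

API = "http://localhost:8000"  # assumed address for the API service

# 1. Populate a layer's prompts (repeated for every layer) to fill the Postgres DB.
requests.post(f"{API}/prompts", json={
    "layer_name": "aspirational",
    "identity_prompt": "You are the aspirational layer of an ACE agent...",
    "reasoning_prompt": "Reason carefully about every incoming message...",
    "ancestral_prompt": "You are one layer of a layered cognitive architecture...",
    "data_bus_prompt": "Data bus messages must summarize outcomes...",
    "control_bus_prompt": "Control bus messages must give directives to the layer below...",
})

# 2. With the DB populated, start the system by publishing a mission.
requests.post(f"{API}/mission", json={"mission": "Keep the simulated robot charged and safe."})
```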
How it runs
The system starts through the /mission endpoint in the API, where the mission statement is published to the aspirational layer.
The aspirational layer receives the message through its subscribed control bus queue. Then, using that message along with the identity, reasoning, and ancestral prompts as the system messages, it generates reasoning about the incoming message.
Next, using both the message and the generated reasoning, with only the identity and ancestral prompts as the system message, it generates data and control messages separately for the publisher data and control bus queues.
Once the control and data messages are published, they are picked up by the adjacent lower and upper layers (if they exist), thus continuing the cycle.
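Here's how I picture one layer handling a message, as a sketch; the two-pass flow matches the description above, but the client, model name, placement of the bus prompts, and the publish stand-in are assumptions on my part:

```python
# Sketch of the aspirational layer's two-pass generation. The flow (reason first, then
# emit separate data and control messages) is from the description above; the OpenAI
# client, model name, prompt placement, and publish stand-in are assumptions.
from dataclasses import dataclass
from openai import OpenAI

client = OpenAI()

@dataclass
class Prompts:
    identity_prompt: str
    reasoning_prompt: str
    ancestral_prompt: str
    data_bus_prompt: str
    control_bus_prompt: str

def generate(system_prompts: list[str], user_content: str) -> str:
    messages = [{"role": "system", "content": p} for p in system_prompts]
    messages.append({"role": "user", "content": user_content})
    resp = client.chat.completions.create(model="gpt-4", messages=messages)
    return resp.choices[0].message.content

def publish(bus: str, body: str) -> None:
    print(f"[{bus}] {body[:80]}")  # stand-in for publishing to the RabbitMQ queues above

def handle_control_message(prompts: Prompts, incoming: str) -> None:
    # Pass 1: reasoning -- identity + reasoning + ancestral prompts as system messages.
    reasoning = generate(
        [prompts.identity_prompt, prompts.reasoning_prompt, prompts.ancestral_prompt],
        incoming,
    )

    # Pass 2: generation -- only identity + ancestral prompts as system messages,
    # with the incoming message and the reasoning as context. I'm assuming the
    # data/control bus prompts ride along in the user content as the rules to follow.
    data_message = generate(
        [prompts.identity_prompt, prompts.ancestral_prompt],
        f"{prompts.data_bus_prompt}\n\nMessage: {incoming}\nReasoning: {reasoning}",
    )
    control_message = generate(
        [prompts.identity_prompt, prompts.ancestral_prompt],
        f"{prompts.control_bus_prompt}\n\nMessage: {incoming}\nReasoning: {reasoning}",
    )

    # The data message goes up to the layer above; the control message goes down to the layer below.
    publish("data_bus", data_message)
    publish("control_bus", control_message)
```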
Would love to get critique on my current understanding so far.
@samgriek