
Local LLMs into main #1351

Merged 219 commits into main on Oct 25, 2023
Commits
4e62d61
fix
rounak610 Aug 11, 2023
7ea2b3c
fixing the toolkit config in iteration workflow
Aug 17, 2023
bff3814
List file s3 fix (#1076)
luciferlinx101 Aug 17, 2023
34c3503
workflow changes
Aug 18, 2023
a8c37d4
minor seed file fix
Aug 18, 2023
ef8055b
frontend fixes (#1079)
Fluder-Paradyne Aug 18, 2023
1aa9436
fixed api_bug (#1080)
Aryan-Singh-14 Aug 18, 2023
3ff1c45
add one condition (#1082)
Fluder-Paradyne Aug 18, 2023
280ea14
api fix (#1087)
Fluder-Paradyne Aug 18, 2023
90eb841
Tools error fix (#1093)
rounak610 Aug 21, 2023
97cbc07
webhooks frontend + api calls complete almost
namansleeps2 Aug 21, 2023
ee53fd0
Merge remote-tracking branch 'origin/dev' into dev
rounak610 Aug 22, 2023
e19a7e9
Tool-LTM(Updated) (#1039)
AdityaSharma13064 Aug 22, 2023
3e4b797
Toolkit configuration fix (#1102)
sayan1101 Aug 23, 2023
1b1a12e
webhooks compplete frontend
namansleeps2 Aug 23, 2023
0ea717b
schedule agent fix (#1104)
rounak610 Aug 23, 2023
84085a1
Models superagi (#936)
jedan2506 Aug 23, 2023
5c5a193
Models superagi (#1108)
jedan2506 Aug 23, 2023
de14f7f
Changes for no receiver address
Aug 24, 2023
83463b3
Merge pull request #1111 from TransformerOptimus/email_return_statement
Tarraann Aug 24, 2023
2fb1fe1
made changes to github helper
Aug 24, 2023
fc3c616
Models superagi (#1112)
jedan2506 Aug 24, 2023
1d5d066
Models superagi (#1117)
jedan2506 Aug 25, 2023
5360f01
Merge pull request #1114 from TransformerOptimus/github_add_file
Tarraann Aug 25, 2023
bfde461
Models fixes (#1118)
jedan2506 Aug 25, 2023
57f741a
\n bug resolved (#1122)
luciferlinx101 Aug 25, 2023
8b01357
PDF and DOCX support in Write File - Feature Improvement, close #548 …
Arkajit-Datta Aug 25, 2023
556072e
Revert "PDF and DOCX support in Write File - Feature Improvement, clo…
luciferlinx101 Aug 25, 2023
8655121
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Aug 28, 2023
05719c1
expose port
Fluder-Paradyne Aug 28, 2023
1ee2878
Merge pull request #1133 from TransformerOptimus/expose_docker_port_dev
Tarraann Aug 28, 2023
54caf01
latest safetensors breaking in macs (#1134)
Fluder-Paradyne Aug 28, 2023
b754873
Changes in save template (#1120)
rounak610 Aug 28, 2023
9f52ebd
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Aug 28, 2023
50dbec2
Main to dev sync (#1139)
luciferlinx101 Aug 28, 2023
737268b
Models fixes (#1126)
jedan2506 Aug 28, 2023
3d180f3
added filters in the webhooks
rounak610 Aug 28, 2023
898d40f
fix
rounak610 Aug 28, 2023
5d63f8f
added filters in the webhooks
rounak610 Aug 28, 2023
f9a989d
Models fixes (#1145)
jedan2506 Aug 29, 2023
a0079eb
Jira Bug Fix
jagtarcontlo Aug 29, 2023
8778b58
Jira Bug Fix 2.0
jagtarcontlo Aug 29, 2023
6a2b781
Jira Bug Fix 3.0
jagtarcontlo Aug 29, 2023
b5c15e8
Merge pull request #1146 from TransformerOptimus/jira_bug_fix
jagtarcontlo Aug 29, 2023
dc84435
Merge branch 'dev' of github.com:TransformerOptimus/SuperAGI into web…
namansleeps2 Aug 29, 2023
8eb134e
added filters in the webhooks
rounak610 Aug 29, 2023
7d90ea0
Merge branch 'webhook_dev' into webhooks_new
rounak610 Aug 29, 2023
aa1146c
Models fixes (#1147)
jedan2506 Aug 29, 2023
127cd49
Bug fix model redirection (#1148)
luciferlinx101 Aug 29, 2023
68bf955
adding of filters table and edit functionality is on the way
namansleeps2 Aug 29, 2023
d00124d
Merge remote-tracking branch 'origin/webhooks_new' into webhooks_new
namansleeps2 Aug 29, 2023
5006fe2
added tool config for dalle
Aug 30, 2023
eec7853
removed model dependency on dalle tool
Aug 30, 2023
42f79ad
Merge pull request #1150 from TransformerOptimus/dalle_api_fix
Tarraann Aug 30, 2023
f3ab0f4
Remove hardcoded creds
Aug 30, 2023
fb561a2
fixed env error
Aug 30, 2023
18687a8
removed refactoring from main
Aug 30, 2023
c973ca9
removed refactoring
Aug 30, 2023
863873f
removed refactoring
Aug 30, 2023
c0feae9
handled error
Aug 30, 2023
fc70c5b
stop agent from executing if model is not found (#1156)
Fluder-Paradyne Aug 30, 2023
86cfaaf
entity details (#1158)
sayan1101 Aug 30, 2023
ae27ff4
Metric frontend (#1152)
namansleeps Aug 30, 2023
4ef811b
added filters in webhooks
rounak610 Aug 30, 2023
41180c8
Merge branch 'webhooks_new' of https://github.com/TransformerOptimus/…
rounak610 Aug 30, 2023
df718e5
added filters in webhooks
rounak610 Aug 30, 2023
1ca7e3e
minor changes
namansleeps2 Aug 30, 2023
6c0dca4
Merge branch 'webhooks_new' of github.com:TransformerOptimus/SuperAGI…
namansleeps2 Aug 30, 2023
0e15c68
webhooks complete
namansleeps2 Aug 30, 2023
ab37d67
minor changes for PR
namansleeps2 Aug 30, 2023
5c14715
minor changes for PR
namansleeps2 Aug 30, 2023
196fd43
Publish agent template to marketplace (#1106)
rounak610 Aug 30, 2023
1dd9611
added filters in webhooks
rounak610 Aug 30, 2023
a382161
Merge branch 'webhooks_new' of https://github.com/TransformerOptimus/…
rounak610 Aug 30, 2023
576f31d
resolving conflicts
namansleeps2 Aug 30, 2023
5f50b66
added filters in webhooks
rounak610 Aug 30, 2023
7a29b3e
Merge branch 'webhooks_new' of https://github.com/TransformerOptimus/…
rounak610 Aug 30, 2023
eb48d88
resolving conflicts
namansleeps2 Aug 30, 2023
0ed457f
Merge remote-tracking branch 'origin/webhooks_new' into webhooks_new
namansleeps2 Aug 30, 2023
f0659e5
added filters in the webhooks
rounak610 Aug 31, 2023
955b8ed
lint issue fixed
Aug 31, 2023
7e15b6e
bug fix of prev PR
namansleeps2 Aug 31, 2023
d069216
Merge remote-tracking branch 'origin/webhooks_new' into webhooks_new
namansleeps2 Aug 31, 2023
5ef2768
Merge pull request #1154 from TransformerOptimus/update_hardcoded_creds
Tarraann Aug 31, 2023
d7a2121
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Aug 31, 2023
6fd5390
fix for new run and edit agent
Aug 31, 2023
a4148fb
error handling
Aug 31, 2023
5378cbd
Merge pull request #1165 from TransformerOptimus/new_run_fix
Tarraann Aug 31, 2023
d741fa8
added filters in webhooks
rounak610 Aug 31, 2023
62dd221
Merge branch 'webhooks_new' of https://github.com/TransformerOptimus/…
rounak610 Aug 31, 2023
f71bba4
fix for knowledge search tool
Aug 31, 2023
e38391e
Docker digitalocean deployment
Aug 31, 2023
256c977
changed branch name
Aug 31, 2023
1f5e6d5
added filters in the webhooks
rounak610 Aug 31, 2023
7254014
Merge pull request #1168 from TransformerOptimus/knowledge_model_fix
Tarraann Aug 31, 2023
ea5175f
changes
Aug 31, 2023
2419bc3
removed region
Aug 31, 2023
b7e3b5a
added button
Aug 31, 2023
e493034
change in branch
Aug 31, 2023
5fc2526
Merge pull request #1170 from TransformerOptimus/docker-digitalocean
Tarraann Aug 31, 2023
a56dbe6
added filters in the webhooks
rounak610 Aug 31, 2023
7a08bd6
Update conftest.py
Fluder-Paradyne Aug 31, 2023
946a4a0
Added filters in the webhooks (#1140)
rounak610 Aug 31, 2023
d0fd847
Models calls logs dev (#1174)
jedan2506 Aug 31, 2023
b2709b0
models scroll fix, format of log timestamp fix, adding of loader to m…
namansleeps Sep 1, 2023
e4e7d2c
Update app.yaml (#1179)
Tarraann Sep 1, 2023
73a657c
fixes related to webhooks
rounak610 Sep 2, 2023
0d4728c
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Sep 2, 2023
1e25a5c
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Sep 2, 2023
e24feb0
fixes for webhooks
rounak610 Sep 4, 2023
52932fd
Fixes for webhooks (#1181)
rounak610 Sep 4, 2023
245eead
bugs by qa (#1178)
namansleeps Sep 4, 2023
827f6ca
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Sep 4, 2023
311f344
Fix for schedule agent (#1184)
Tarraann Sep 4, 2023
517a36e
Entity fix (#1185)
sayan1101 Sep 5, 2023
f9c160f
fixes for webhooks
rounak610 Sep 5, 2023
9d12c43
fixes for webhooks
rounak610 Sep 5, 2023
84931c9
Merge remote-tracking branch 'origin/dev' into webhooks_new
rounak610 Sep 5, 2023
135e1dc
Merge pull request #1187 from TransformerOptimus/webhooks_new
Tarraann Sep 5, 2023
da89860
fix added for index state (#1188)
Tarraann Sep 5, 2023
9387fdc
API bug fixes for SDK (#1189)
jagtarcontlo Sep 5, 2023
021698b
Main to dev sync v12 (#1193)
luciferlinx101 Sep 6, 2023
6817934
added button
Sep 6, 2023
d0f9b1e
Merge pull request #1197 from TransformerOptimus/update_digitalocean_…
Tarraann Sep 6, 2023
2be6ffd
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Sep 6, 2023
8437909
GitHub pull request tools (#1190)
Sep 6, 2023
26f6a1d
PDF and DOCX support in Write File - Feature Improvement, close #548 …
Arkajit-Datta Sep 6, 2023
d1c05d8
minor documentation fix
Sep 6, 2023
19fbda6
Design bugs (#1199)
namansleeps Sep 6, 2023
589341e
fetching token limit from db
Sep 7, 2023
3153def
Revert "PDF and DOCX support in Write File - Feature Improvement, clo…
luciferlinx101 Sep 7, 2023
a70e715
Unit Test Fix (#1203)
luciferlinx101 Sep 7, 2023
cb7e1bf
adding of docs and and discord link correction (#1205)
namansleeps Sep 7, 2023
3677584
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Sep 7, 2023
6024fd8
openai error handling
rounak610 Sep 7, 2023
6c3fd8c
error_handling
rounak610 Sep 8, 2023
5c72543
api call only when agent is running
namansleeps2 Sep 8, 2023
195e0fc
Merge remote-tracking branch 'origin/error_handling' into error_handling
namansleeps2 Sep 8, 2023
d7613d7
Feature : Wait block for agent workflow (#1186)
luciferlinx101 Sep 8, 2023
571c588
minor changes (#1213)
jagtarcontlo Sep 8, 2023
f8d158b
error handling
rounak610 Sep 8, 2023
46e1b90
error handling
rounak610 Sep 8, 2023
9445d29
error handling
rounak610 Sep 8, 2023
d7c01a2
error handling
rounak610 Sep 8, 2023
3166d7a
error handling
rounak610 Sep 10, 2023
55d0dd9
fix
rounak610 Sep 10, 2023
3fe6684
fix
rounak610 Sep 10, 2023
e887ffa
fix
rounak610 Sep 10, 2023
55435d2
error handling
rounak610 Sep 10, 2023
b9341ed
models changes (#1207)
jedan2506 Sep 11, 2023
80489eb
error handling
rounak610 Sep 11, 2023
17ee9ee
models marketplace changes (#1219)
jedan2506 Sep 11, 2023
f8bd122
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Sep 11, 2023
6592d8e
minor changes
namansleeps2 Sep 11, 2023
b1fed77
error handling
rounak610 Sep 11, 2023
1fe6a41
error handling
rounak610 Sep 12, 2023
9d88d7d
removing single qoutes (#1224)
namansleeps Sep 12, 2023
6ba8bbc
apm changes (#1222)
jedan2506 Sep 12, 2023
7e5e305
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Sep 12, 2023
99d0271
list tool fix
rounak610 Sep 12, 2023
d8c57f7
list tool fix
rounak610 Sep 12, 2023
307169c
PR CHANGES
namansleeps2 Sep 12, 2023
9519aca
entity fix for dev (#1230)
sayan1101 Sep 13, 2023
9525f74
Merge branch 'dev' of https://github.com/TransformerOptimus/SuperAGI …
rounak610 Sep 14, 2023
5d6e726
frontend changes (#1231)
jedan2506 Sep 14, 2023
5d8772d
read_tool_fix
rounak610 Sep 14, 2023
6b86162
Merge pull request #1236 from TransformerOptimus/read_tool_fix
Tarraann Sep 14, 2023
e830bd7
Merge pull request #1226 from TransformerOptimus/list_tool_fix
Tarraann Sep 14, 2023
11e7213
fix
rounak610 Sep 14, 2023
c6f93a2
Merge pull request #1232 from TransformerOptimus/error_handling_3
Tarraann Sep 14, 2023
eb9b8ec
waiting block frontend (#1233)
jedan2506 Sep 14, 2023
6845224
Dev Fixes (#1242)
jedan2506 Sep 15, 2023
f969813
read tool fix
rounak610 Sep 15, 2023
edeaf46
Merge pull request #1245 from TransformerOptimus/read_tool_fix_2
rounak610 Sep 15, 2023
c96ae0e
Maintaining dev (#1244)
jedan2506 Sep 15, 2023
3b5a868
added logs (#1246)
luciferlinx101 Sep 15, 2023
eaf44a8
error_handling fix (#1247)
rounak610 Sep 17, 2023
f118e90
Feature first login src (#1241)
luciferlinx101 Sep 18, 2023
0db35bb
apollo NoneType bug fix (#1238)
sayan1101 Sep 18, 2023
2ccbd4b
Mixpanel integration (#1256)
namansleeps Sep 18, 2023
f465a55
Models Marketplace bug fix for dev (#1266)
jedan2506 Sep 20, 2023
1883061
Fix 1257 dev (#1269)
Fluder-Paradyne Sep 21, 2023
96a5f31
add cache layer (#1275)
Fluder-Paradyne Sep 22, 2023
ebbe3fc
Fix api dev (#1283)
Fluder-Paradyne Sep 22, 2023
83d09f7
mixpanel changes (#1285)
namansleeps Sep 25, 2023
1c2425d
rename error_handling.py to error_handler.py (#1287)
Fluder-Paradyne Sep 27, 2023
1224489
Analytics login (#1258)
namansleeps Sep 28, 2023
90c45e7
calendar issues fixed
Sep 28, 2023
14ffef2
Merge pull request #1290 from TransformerOptimus/delete_event_calenda…
Tarraann Sep 29, 2023
c46765d
append fle tool bug fixed (#1294)
Tarraann Sep 29, 2023
9720c65
adding cookie in access token (#1301)
namansleeps Oct 3, 2023
f772a38
local_llms
Oct 3, 2023
427a04f
local_llms
rounak610 Oct 3, 2023
874635c
local_llms
Oct 4, 2023
d931ac1
local_llms
Oct 4, 2023
e746c1e
local_llms
Oct 4, 2023
2cdd551
fixes
rounak610 Oct 4, 2023
f8d6084
models error fixed (#1308)
namansleeps Oct 4, 2023
9e7e686
local_llms
rounak610 Oct 4, 2023
b1ddffb
local_llms
rounak610 Oct 4, 2023
581b174
Merge branch 'local_llm_final' of https://github.com/TransformerOptim…
rounak610 Oct 4, 2023
43732f7
local_llms
rounak610 Oct 4, 2023
882c197
local_llms
rounak610 Oct 4, 2023
ab1d96c
local_llms
rounak610 Oct 5, 2023
a363b0a
frontend_changes
rounak610 Oct 9, 2023
6a5fa43
Merge branch 'local_llm_final' into dev
rounak610 Oct 9, 2023
6271d8e
local_llms
rounak610 Oct 9, 2023
fea8fc5
local_llms
rounak610 Oct 10, 2023
648f530
local_llms
rounak610 Oct 10, 2023
5734b8c
local_llms
rounak610 Oct 10, 2023
dd2b04a
local_llms_frontend
rounak610 Oct 10, 2023
6ee4359
fixes
rounak610 Oct 18, 2023
b72447e
fixes
rounak610 Oct 18, 2023
9a1b7ad
fixes
rounak610 Oct 18, 2023
84c8dc6
fixes
rounak610 Oct 18, 2023
6b70539
Merge branch 'main' into local_llm_final
rounak610 Oct 25, 2023
254e772
merged main into local_llm_final
rounak610 Oct 25, 2023
38b3cb8
merged main into local_llm_final
rounak610 Oct 25, 2023
76ae7d1
local llms
rounak610 Oct 25, 2023
Files changed

2 changes: 1 addition & 1 deletion gui/pages/Content/APM/ApmDashboard.js
@@ -76,7 +76,7 @@ export default function ApmDashboard() {
  const fetchData = async () => {
    try {
      const [metricsResponse, agentsResponse, activeRunsResponse, toolsUsageResponse] = await Promise.all([getMetrics(), getAllAgents(), getActiveRuns(), getToolsUsage()]);
-     const models = ['gpt-4', 'gpt-3.5-turbo', 'gpt-3.5-turbo-16k', 'gpt-4-32k', 'google-palm-bison-001'];
+     const models = ['gpt-4', 'gpt-3.5-turbo', 'gpt-3.5-turbo-16k', 'gpt-4-32k', 'google-palm-bison-001', 'replicate-llama13b-v2-chat'];

      assignDefaultDataPerModel(metricsResponse.data.agent_details.model_metrics, models);
      assignDefaultDataPerModel(metricsResponse.data.tokens_details.model_metrics, models);
1 change: 1 addition & 0 deletions gui/pages/_app.js
@@ -61,6 +61,7 @@ export default function App() {
      });
  }


  const installFromMarketplace = () => {
    const toolkitName = localStorage.getItem('toolkit_to_install') || null;
    const agentTemplateId = localStorage.getItem('agent_to_install') || null;
11 changes: 10 additions & 1 deletion main.py
@@ -50,6 +50,7 @@
from superagi.llms.replicate import Replicate
from superagi.llms.hugging_face import HuggingFace
from superagi.models.agent_template import AgentTemplate
from superagi.models.models_config import ModelsConfig
from superagi.models.organisation import Organisation
from superagi.models.types.login_request import LoginRequest
from superagi.models.types.validate_llm_api_key_request import ValidateAPIKeyRequest
@@ -215,6 +216,13 @@ def register_toolkit_for_master_organisation():
            Organisation.id == marketplace_organisation_id).first()
        if marketplace_organisation is not None:
            register_marketplace_toolkits(session, marketplace_organisation)

    def local_llm_model_config():
        existing_models_config = session.query(ModelsConfig).filter(ModelsConfig.org_id == default_user.organisation_id, ModelsConfig.provider == 'Local LLM').first()
        if existing_models_config is None:
            models_config = ModelsConfig(org_id=default_user.organisation_id, provider='Local LLM', api_key="EMPTY")
            session.add(models_config)
            session.commit()

    IterationWorkflowSeed.build_single_step_agent(session)
    IterationWorkflowSeed.build_task_based_agents(session)
@@ -238,7 +246,8 @@ def register_toolkit_for_master_organisation():
    # AgentWorkflowSeed.doc_search_and_code(session)
    # AgentWorkflowSeed.build_research_email_workflow(session)
    replace_old_iteration_workflows(session)

    local_llm_model_config()

    if env != "PROD":
        register_toolkit_for_all_organisation()
    else:
28 changes: 28 additions & 0 deletions migrations/versions/9270eb5a8475_local_llms.py
@@ -0,0 +1,28 @@
"""local_llms

Revision ID: 9270eb5a8475
Revises: 3867bb00a495
Create Date: 2023-10-04 09:26:33.865424

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '9270eb5a8475'
down_revision = '3867bb00a495'
branch_labels = None
depends_on = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('models', sa.Column('context_length', sa.Integer(), nullable=True))
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column('models', 'context_length')
    # ### end Alembic commands ###
1 change: 1 addition & 0 deletions requirements.txt
@@ -158,3 +158,4 @@ google-generativeai==0.1.0
unstructured==0.8.1
ai21==1.2.6
typing-extensions==4.5.0
llama_cpp_python==0.2.7
38 changes: 38 additions & 0 deletions superagi/helper/llm_loader.py
@@ -0,0 +1,38 @@
from llama_cpp import Llama
from llama_cpp import LlamaGrammar
from superagi.config.config import get_config
from superagi.lib.logger import logger


class LLMLoader:
    _instance = None
    _model = None
    _grammar = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super(LLMLoader, cls).__new__(cls)
        return cls._instance

    def __init__(self, context_length):
        self.context_length = context_length

    @property
    def model(self):
        if self._model is None:
            try:
                self._model = Llama(
                    model_path="/app/local_model_path", n_ctx=self.context_length)
            except Exception as e:
                logger.error(e)
        return self._model

    @property
    def grammar(self):
        if self._grammar is None:
            try:
                self._grammar = LlamaGrammar.from_file(
                    "superagi/llms/grammar/json.gbnf")
            except Exception as e:
                logger.error(e)
        return self._grammar
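
Note (illustrative, not part of the diff): a minimal sketch of how this loader behaves, assuming a GGUF model has been placed at the hard-coded /app/local_model_path and the llama_cpp_python pin from requirements.txt is installed.

# Usage sketch for the singleton loader above.
# Assumes a local GGUF model is mounted at /app/local_model_path inside the container.
from superagi.helper.llm_loader import LLMLoader

loader_a = LLMLoader(context_length=4096)
loader_b = LLMLoader(context_length=4096)
assert loader_a is loader_b              # __new__ hands back the same instance both times

model = loader_a.model                   # first access lazily loads the Llama model
grammar = loader_a.grammar               # first access lazily parses json.gbnf
if model is None or grammar is None:
    print("Model or grammar failed to load; errors are written to the SuperAGI logger.")
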
1 change: 1 addition & 0 deletions superagi/jobs/agent_executor.py
@@ -1,6 +1,7 @@
from datetime import datetime, timedelta

from sqlalchemy.orm import sessionmaker
from superagi.llms.local_llm import LocalLLM

import superagi.worker
from superagi.agent.agent_iteration_step_handler import AgentIterationStepHandler
25 changes: 25 additions & 0 deletions superagi/llms/grammar/json.gbnf
@@ -0,0 +1,25 @@
root ::= object
value ::= object | array | string | number | ("true" | "false" | "null") ws

object ::=
"{" ws (
string ":" ws value
("," ws string ":" ws value)*
)? "}" ws

array ::=
"[" ws (
value
("," ws value)*
)? "]" ws

string ::=
"\"" (
[^"\\] |
"\\" (["\\/bfnrt] | "u" [0-9a-fA-F] [0-9a-fA-F] [0-9a-fA-F] [0-9a-fA-F]) # escapes
)* "\"" ws

number ::= ("-"? ([0-9] | [1-9] [0-9]*)) ("." [0-9]+)? ([eE] [-+]? [0-9]+)? ws

# Optional space: by convention, applied in this grammar after literal chars when allowed
ws ::= ([ \t\n] ws)?
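
Note (illustrative, not part of the diff): this grammar is what the new LocalLLM wrapper passes to llama-cpp-python so that completions are constrained to valid JSON. A standalone sketch of the same pattern, with a placeholder model path:

# Sketch: forcing JSON output with the bundled grammar and llama-cpp-python.
# MODEL_PATH is a placeholder; point it at any local GGUF chat model.
from llama_cpp import Llama, LlamaGrammar

MODEL_PATH = "/app/local_model_path"

grammar = LlamaGrammar.from_file("superagi/llms/grammar/json.gbnf")
llm = Llama(model_path=MODEL_PATH, n_ctx=4096)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Describe today's task as a JSON object."}],
    grammar=grammar,      # output must match the grammar's root ::= object rule
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
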
1 change: 1 addition & 0 deletions superagi/llms/llm_model_factory.py
@@ -1,4 +1,5 @@
from superagi.llms.google_palm import GooglePalm
from superagi.llms.local_llm import LocalLLM
from superagi.llms.openai import OpenAi
from superagi.llms.replicate import Replicate
from superagi.llms.hugging_face import HuggingFace
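
Note: the excerpt above only shows the added LocalLLM import; the factory's dispatch logic falls outside the diff. As a rough idea of the kind of provider-to-class mapping such a factory performs, here is an illustrative sketch. The function name, parameters, and provider strings other than 'Local LLM' (which main.py seeds above) are placeholders, not the repository's actual code.

# Illustrative only: build_llm and its arguments are hypothetical names.
from superagi.llms.google_palm import GooglePalm
from superagi.llms.hugging_face import HuggingFace
from superagi.llms.local_llm import LocalLLM
from superagi.llms.openai import OpenAi
from superagi.llms.replicate import Replicate


def build_llm(provider: str, api_key: str, model=None, **kwargs):
    """Map a provider name to its LLM wrapper class (hypothetical helper)."""
    providers = {
        "OpenAi": OpenAi,
        "Google Palm": GooglePalm,
        "Replicate": Replicate,
        "Hugging Face": HuggingFace,
        "Local LLM": LocalLLM,        # the provider string seeded by local_llm_model_config()
    }
    if provider not in providers:
        raise ValueError(f"Unknown model provider: {provider}")
    return providers[provider](api_key=api_key, model=model, **kwargs)
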
92 changes: 92 additions & 0 deletions superagi/llms/local_llm.py
@@ -0,0 +1,92 @@
from superagi.config.config import get_config
from superagi.lib.logger import logger
from superagi.llms.base_llm import BaseLlm
from superagi.helper.llm_loader import LLMLoader


class LocalLLM(BaseLlm):
    def __init__(self, temperature=0.6, max_tokens=get_config("MAX_MODEL_TOKEN_LIMIT"), top_p=1,
                 frequency_penalty=0,
                 presence_penalty=0, number_of_results=1, model=None, api_key='EMPTY', context_length=4096):
        """
        Args:
            model (str): The model.
            temperature (float): The temperature.
            max_tokens (int): The maximum number of tokens.
            top_p (float): The top p.
            frequency_penalty (float): The frequency penalty.
            presence_penalty (float): The presence penalty.
            number_of_results (int): The number of results.
        """
        self.model = model
        self.api_key = api_key
        self.temperature = temperature
        self.max_tokens = max_tokens
        self.top_p = top_p
        self.frequency_penalty = frequency_penalty
        self.presence_penalty = presence_penalty
        self.number_of_results = number_of_results
        self.context_length = context_length

        llm_loader = LLMLoader(self.context_length)
        self.llm_model = llm_loader.model
        self.llm_grammar = llm_loader.grammar

    def chat_completion(self, messages, max_tokens=get_config("MAX_MODEL_TOKEN_LIMIT")):
        """
        Call the chat completion.

        Args:
            messages (list): The messages.
            max_tokens (int): The maximum number of tokens.

        Returns:
            dict: The response.
        """
        try:
            if self.llm_model is None or self.llm_grammar is None:
                logger.error("Model not found.")
                return {"error": "Model loading error", "message": "Model not found. Please check your model path and try again."}
            else:
                response = self.llm_model.create_chat_completion(messages=messages, functions=None, function_call=None, temperature=self.temperature, top_p=self.top_p,
                                                                 max_tokens=int(max_tokens), presence_penalty=self.presence_penalty, frequency_penalty=self.frequency_penalty, grammar=self.llm_grammar)
                content = response["choices"][0]["message"]["content"]
                logger.info(content)
                return {"response": response, "content": content}

        except Exception as exception:
            logger.info("Exception:", exception)
            return {"error": "ERROR", "message": "Error: "+str(exception)}

    def get_source(self):
        """
        Get the source.

        Returns:
            str: The source.
        """
        return "Local LLM"

    def get_api_key(self):
        """
        Returns:
            str: The API key.
        """
        return self.api_key

    def get_model(self):
        """
        Returns:
            str: The model.
        """
        return self.model

    def get_models(self):
        """
        Returns:
            list: The models.
        """
        return self.model

    def verify_access_key(self, api_key):
        return True
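
Note (illustrative, not part of the diff): a minimal end-to-end sketch of the new wrapper, assuming SuperAGI's config provides MAX_MODEL_TOKEN_LIMIT and a GGUF model sits at the hard-coded /app/local_model_path. The model name passed in below is only a label.

# Sketch: calling the LocalLLM wrapper added by this PR.
from superagi.llms.local_llm import LocalLLM

llm = LocalLLM(model="local-gguf-model", context_length=4096)   # model name is illustrative

result = llm.chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "List two colours as JSON."},
    ],
    max_tokens=256,
)

if "error" in result:
    print(result["message"])     # e.g. the model file was not found at the expected path
else:
    print(result["content"])     # grammar-constrained JSON text
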