Releases: letta-ai/letta
0.2.4
This release includes bugfixes (including major bugfixes for autogen) and a number of new features:
- Custom presets, which allow customization of the set of function calls MemGPT can make
- Integration with LanceDB for archival storage, contributed by @PrashantDixit0 (see the sketch after this list)
- Integration with vLLM OpenAI compatible endpoints
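For the LanceDB integration, archival passages are stored as embedded vectors in a local LanceDB table. Below is a minimal, illustrative sketch of that storage pattern using the `lancedb` Python client directly; the table name, field names, toy 3-dimensional vectors, and on-disk path are assumptions for the example, not MemGPT's actual connector.

```python
# Illustrative sketch of vector storage/retrieval with LanceDB.
# Not MemGPT's actual connector: the table name, fields, path, and the toy
# 3-dimensional embeddings below are hypothetical.
import lancedb

db = lancedb.connect("/tmp/memgpt-lancedb-demo")  # hypothetical location

# Each archival passage is stored alongside its embedding vector.
passages = [
    {"vector": [0.1, 0.2, 0.3], "text": "MemGPT keeps long-term facts in archival memory."},
    {"vector": [0.4, 0.1, 0.2], "text": "LanceDB is an embedded vector database."},
]
table = db.create_table("archival_memory", data=passages)

# Retrieval is a nearest-neighbor search over the stored embeddings.
query_vector = [0.1, 0.2, 0.25]
results = table.search(query_vector).limit(1).to_pandas()
print(results["text"][0])
```

In normal use none of this is written by hand; MemGPT's archival storage layer handles it once LanceDB is selected as the backend.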
What's Changed
- Set service context for llama index in `local.py` by @sarahwooders in #462
- Update functions.md by @cpacker in #461
- Fix linking functions from `~/.memgpt/functions` by @cpacker in #463
- Add d20 function example to readthedocs by @cpacker in #464
- Move `webui` backend to new openai completions endpoint by @cpacker in #468
- updated websocket protocol and server by @cpacker in #473
- Lancedb by @PrashantDixit0 in #455
- Docs: Fix typos by @sahusiddharth in #477
- Remove .DS_Store from agents list by @cpacker in #485
- Fix #487 (summarize call uses OpenAI even with local LLM config) by @cpacker in #488
- patch web UI by @cpacker in #484
- ANNA (an acronym for Adaptive Neural Network Assistant), a personal research assistant, by @agiletechnologist in #494
- vLLM support by @cpacker in #492
- Add error handling during linking imports by @cpacker in #495
- Fixes bugs with AutoGen implementation and examples by @cpacker in #498
- [version] bump version to 0.2.4 by @sarahwooders in #497
New Contributors
- @PrashantDixit0 made their first contribution in #455
- @sahusiddharth made their first contribution in #477
- @agiletechnologist made their first contribution in #494
Full Changelog: 0.2.3...0.2.4
0.2.3
Updates
- Updated MemGPT and Agent Configs: This release changes how MemGPT and agent configurations are stored. The new format tracks which settings and which MemGPT version an agent was saved with, improving cross-version compatibility for agents.
- If you've been using a prior version of MemGPT, you may need to re-run `memgpt configure` to update your configuration settings to be compatible with this version.
- Configurable Presets: Presets have been refactored to allow developers to customize the set of functions and system prompts MemGPT uses.
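Since presets bundle a system prompt with a set of functions, the building block is a user-defined function of the kind MemGPT links from `~/.memgpt/functions` (see #463 and the d20 example in #464 in the 0.2.4 notes above). The snippet below is a minimal sketch assuming the docstring-driven convention from the MemGPT function docs, where the first parameter is the agent instance and the docstring supplies the description shown to the model; exact signature requirements may differ between versions.

```python
# Minimal sketch of a user-defined function that a custom preset could expose
# to the agent. Assumes the docstring-driven convention described in MemGPT's
# function docs (first argument is the agent instance, docstring becomes the
# function description); treat the name and signature details as illustrative.
import random


def roll_d20(self) -> str:
    """Roll a 20-sided die and report the outcome.

    Returns:
        str: A human-readable description of the roll.
    """
    return f"You rolled a {random.randint(1, 20)}"
```

A preset then names which of these functions (alongside the built-in memory functions) and which system prompt an agent starts with.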
What's Changed
- Configurable presets to support easy extension of MemGPT's function set by @cpacker in #420
- WebSocket interface and basic `server.py` process by @cpacker in #399
- patch `getargspec` error by @cpacker in #440
- always cast `config.context_window` to `int` before use by @cpacker in #444
- Refactor config + determine LLM via `config.model_endpoint_type` by @sarahwooders in #422
- Update config to include `memgpt_version` and re-run configuration for old versions on `memgpt run` by @sarahwooders in #450
- Add load and load_and_attach functions to memgpt autogen agent. by @wrmedford in #430
- Update documentation [local LLMs, presets] by @cpacker in #453
- When default_mode_endpoint has a value, it needs to become model_endp… by @kfsone in #452
- Upgrade workflows to Python 3.11 by @sarahwooders in #441
New Contributors
Full Changelog: 0.2.2...0.2.3
0.2.2
What's Changed
- Fix MemGPTAgent attach docs error by @anjaleeps in #427
- [fix] remove asserts for `OPENAI_API_BASE` by @sarahwooders in #432
- Patch for #434 (context window value not used by `memgpt run`) by @cpacker in #435
- Patch for #428 (context window not passed to summarize calls) by @cpacker in #433
- [version] bump release to 0.2.2 by @cpacker in #436
New Contributors
- @anjaleeps made their first contribution in #427
Full Changelog: 0.2.1...0.2.2
0.2.1
This is a release to replace the yanked 0.2.0 release, which had critical bugs.
What's Changed
- [version] bump version to 0.2.0 by @sarahwooders in #410
- Fix main.yml to not rely on requirements.txt by @vivi in #411
- Hotfix openai create all with context_window kwarg by @vivi in #413
- Fix agent load for old agent config files by @sarahwooders in #412
- Patch local LLMs with context_window by @cpacker in #416
- Fix model configuration for when `config.model == "local"` previously by @sarahwooders in #415
- Throw more informative error when local model envs are/are not set by @sarahwooders in #418
- [version] bump version to 0.2.1 by @sarahwooders in #417
Full Changelog: 0.2.0...0.2.1
0.2.0
This release includes updated documentation, integration with vector databases (pgvector), and many bug fixes!
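The pgvector support stores archival memory in Postgres and retrieves it with a vector similarity query. The sketch below shows that underlying pattern with `psycopg2` and raw SQL; the connection string, table layout, and toy 3-dimensional vectors are hypothetical, and in practice MemGPT's storage connector manages this rather than the user.

```python
# Illustrative sketch of pgvector-backed archival storage (not MemGPT's actual
# connector). Assumes a local Postgres with the pgvector extension installed;
# the credentials, table layout, and 3-dim embeddings below are hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=memgpt user=memgpt password=memgpt host=localhost")
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute(
    "CREATE TABLE IF NOT EXISTS archival_memory ("
    " id serial PRIMARY KEY, text text, embedding vector(3));"
)
cur.execute(
    "INSERT INTO archival_memory (text, embedding) VALUES (%s, %s::vector);",
    ("MemGPT keeps long-term facts in archival memory.", "[0.1, 0.2, 0.3]"),
)

# Nearest-neighbor retrieval: `<->` is pgvector's L2 distance operator.
cur.execute(
    "SELECT text FROM archival_memory ORDER BY embedding <-> %s::vector LIMIT 1;",
    ("[0.1, 0.2, 0.25]",),
)
print(cur.fetchone()[0])
conn.commit()
```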
What's Changed
- Patch runtime error with personas by @cpacker in #221
- Gracefully catch errors when running agent.step() by @vivi in #216
- update max new tokens by @web3wes in #182
- Added db load ability by @mr-sk in #106
- Using SPR to Compress System Prompts by @tractorjuice in #158
- Cli bug fixes (loading human/persona text, azure setup, local setup) by @sarahwooders in #222
- Support for MemGPT + Autogen + Local LLM by @vivi in #231
- `__len__` needs to be implemented in all memory classes by @cpacker in #236
- Update README for CLI changes by @sarahwooders in #207
- Allow MemGPT to read/write text files + make HTTP requests by @cpacker in #174
- fixed load loading from wrong directory by @cpacker in #237
- await async_get_embeddings_with_backoff by @vivi in #239
- Add basic tests that are run on PR/main by @cpacker in #228
- fix: LocalArchivalMemory prints ref_doc_info only if not using EmptyIndex by @goetzrobin in #240
- Allow loading in a directory non-recursively by @vivi in #246
- Fix typos in functions spec by @cpacker in #268
- fix typo in the base system prompt by @yubozhao in #189
- Patch summarize when running with local llms by @cpacker in #213
- Improvements to JSON handling for local LLMs by @cpacker in #269
- Update openai_tools.py by @tractorjuice in #159
- Add more stop tokens by @cpacker in #288
- Don't prompt for selecting existing agent if there is a `--persona/human/model` flag by @sarahwooders in #289
- strip '/' and use osp.join (Windows support) by @cpacker in #283
- Make CLI agent flag errors more clear, and dont throw error if flags dont contradict existing agent config by @sarahwooders in #290
- VectorDB support (pgvector) for archival memory by @sarahwooders in #226
- try to patch hanging test by @cpacker in #295
- I made dump showing more messages and added a count (the last x) by @oderwat in #204
- I added commands to shape the conversation: by @oderwat in #218
- I added a "/retry" command to retry for getting another answer. by @oderwat in #188
- make timezone local by default by @cpacker in #298
- FIx #261 by @danx0r in #300
- Add grammar-based sampling (for webui, llamacpp, and koboldcpp) by @cpacker in #293
- fix: import PostgresStorageConnector only if postgres is selected as … by @goetzrobin in #310
- Don't import postgres storage if not specified in config by @sarahwooders in #318
- Aligned code with README for using Azure embeddings to load documents by @dividor in #308
- Fix: imported wrong storage connector by @sarahwooders in #320
- Remove embeddings as argument in archival_memory.insert by @cpacker in #284
- Create docs pages by @cpacker in #328
- patch in-chat command info by @cpacker in #332
- Bug fix grammar_name not being defined causes a crash by @borewik in #326
- cleanup #326 by @cpacker in #333
- Stop the app from repeating the user message in normal use by @oderwat in #304
- Remove redundant docs from README by @sarahwooders in #334
- Add autogen+localllm docs by @vivi in #335
- Add `memgpt version` command and package version by @sarahwooders in #336
- add ollama support by @cpacker in #314
- Better interface output for function calls by @vivi in #296
- Better error message printing for function call failing by @vivi in #291
- Fixing some dict value checking for function_call by @nuaimat in #249
- Specify model inference and embedding endpoint separately by @sarahwooders in #286
- Fix config tests by @sarahwooders in #343
- Avoid throwing error for older `~/.memgpt/config` files due to missing section `archival_storage` by @sarahwooders in #344
- Dependency management by @sarahwooders in #337
- Relax verify_first_message_correctness to accept any function call by @vivi in #340
- Update `poetry.lock` by @sarahwooders in #346
- Add autogen example that lets you chat with docs by @vivi in #342
- add gpt-4-turbo by @cpacker in #349
- Revert relaxing verify_first_message_correctness, still add archival_memory_search as an exception by @vivi in #350
- Bump version to 0.1.18 by @vivi in #351
- Remove `requirements.txt` and `requirements_local.txt` by @sarahwooders in #358
- disable pretty exceptions by @cpacker in #367
- Updated documentation for users by @cpacker in #365
- Create pull_request_template.md by @cpacker in #368
- Add pymemgpt-nightly workflow by @vivi in #373
- Update lmstudio.md by @cpacker in #382
- Update lmstudio.md to show the Prompt Formatting Option by @MSZ-MGS in #384
- Swap asset location from #384 by @cpacker in #385
- Update poetry with `pg8000` and include `pgvector` in docs by @sarahwooders in #390
- Allow overriding config location with `MEMGPT_CONFIG_PATH` by @sarahwooders in #383
- Always default to local embeddings if not OpenAI or Azure by @sarahwooders in #387
- Add support for larger archival memory stores by @sarahwooders in #359
- Replace `memgpt run` flags error with warning + remove custom embedding endpoint option + add agent create time by @sarahwooders in #364
- Update webui.md by @cpacker in #397
- Update webui.md by @cpacker in #398
- softpass test when keys are missing by @cpacker in #369
- Use `~/.memgpt/config` to set questionary defaults in `memgpt configure` by @sarahwooders in #389
- Simple docker. by @BobKerns in #393
- Return empty list if archival memory search over empty local index by @sarahwooders in #402
- Remove AsyncAgent and async from cli by @vivi in #400
- I added some json repairs that helped me with malformed messages by @oderwat in #341
- Fix max tokens constant by @cpacker in #374
New Contributors
- @web3wes made their first contribution in #182
- @mr-sk made their first contribution in #106
- @goetzrobin made their first contribution in #240
- @yubozhao made their first contribution in #189
- @oderwat made their first contribution in #204
- @danx0r made their first contribution in #300
- @dividor made their first contribution in #308
- @borewik made their first contribution in #326
- @nuaimat made their first contribution in #249
- @MSZ-MGS made their first contribution in #384
- @BobKerns made their first contribution in #393
Full Changelog: 0.1.15...0.2.0
0.1.15
0.1.14
What's Changed
- Changes to lmstudio to fix JSON decode error by @raisindetre in #208
New Contributors
- @raisindetre made their first contribution in #208
Full Changelog: 0.1.13...0.1.14
0.1.13
0.1.12
What's Changed
- Integration with AutoGen workflow by @QZGao in #126
- Support loading data into archival with Llama Index connectors by @sarahwooders in #146
- Add synchronous memgpt agent by @vivi in #156
- LM Studio inference server support by @cpacker in #167
- add missing pip requirement `llama_index` by @almontasser in #166
- hotfix for broken sync agent code by @cpacker in #171
- use urljoin instead of path.join by @cpacker in #173
- Fix typos by @HKABIG in #172
- black patch on outstanding files that were causing workflow fails on PRs by @cpacker in #193
- Allow recursive blobs by @wrmedford in #186
- azure typo patch by @cpacker in #192
- New wrapper for Zephyr models + little fix in memory.py by @v-kamelowy in #183
- Update agent.py by @tractorjuice in #185
- Summary hotfix by @vivi in #195
- added more clear warnings for when OPENAI_API_BASE and BACKEND_TYPE are not set by @cpacker in #202
- Hotfix memory bug from async refactor by @vivi in #203
- Refactoring CLI to use config file, connect to Llama Index data sources, and allow for multiple agents by @sarahwooders in #154
- Refactor autogen agent to use sync memgpt, add notebook example by @vivi in #157
New Contributors
- @almontasser made their first contribution in #166
- @HKABIG made their first contribution in #172
- @wrmedford made their first contribution in #186
- @v-kamelowy made their first contribution in #183
- @tractorjuice made their first contribution in #185
Full Changelog: 0.1.6...0.1.12
0.1.6
What's Changed
- Add llamaindex example (chat with MemGPT over llamaindex documentation) by @cpacker in #2
- Add llamaindex example (chat with MemGPT over llamaindex documentation) by @cpacker in #3
- Update README.md by @sarahwooders in #9
- Add flag for preloading files by @vivi in #10
- Example SQL integration with MemGPT by @ShishirPatil in #4
- Clean up requirements.txt by @daneah in #16
- Update README.md by @bllchmbrs in #23
- Update README.md by @eltociear in #17
- fixed bug where persistence manager was not saving in demo CLI by @cpacker in #27
- made CONTRIBUTION.md by @0Armaan025 in #29
- [Feature] support line breaks by @QZGao in #30
- support generating embeddings on the fly by @vivi in #34
- allow bypassing message check by @cpacker in #43
- add csv support for preloading files into archival memory by @vivi in #45
- Simplify load by @cpacker in #46
- bare bones github actions workflow by @cpacker in #47
- fix typo by @Rudra-Ji in #49
- Passable gpt-3.5-turbo support by @cpacker in #58
- Add Autogen MemGPT agent by @vivi in #64
- Add pdf support by @vivi in #71
- azure support by @cpacker in #79
- change pop, since agent.messages is locked by @cpacker in #84
- autosave on /exit by @cpacker in #83
- Parallelize embedding generation by @vivi in #85
- fix: typos in memgpt/autogen/memgpt_agent.py by @shresthasurav in #95
- Add local LLM support (with function calling) by @cpacker in #97
- CLI overhaul by @vivi in #96
- Fix typo in memgpt_base.txt #89 by @rabbabansh in #93
- Create python package with poetry by @sarahwooders in #104
- fix runtime error by @cpacker in #105
- Fix memgpt_dir circular import by @vivi in #110
- hotfix for airoboros prompt formatting template by @cpacker in #113
- add wrapper for dolphin mistral + inner monologue wrapper by @cpacker in #116
- Cleanup by @cpacker in #117
- Revert main by @vivi in #120
- cleanup local LLM wrappers by @vivi in #121
- Refactored AutoGen integration + added examples folder by @cpacker in #123
- Patch azure support by @vivi in #140
- fix runtime error in AutoGen agent by @cpacker in #143
- Update to consistent formatting by @sarahwooders in #144
- powershell instructions by @cpacker in #145
- Add pre-commit file that includes whitespace formatting by @sarahwooders in #147
New Contributors
- @cpacker made their first contribution in #2
- @sarahwooders made their first contribution in #9
- @vivi made their first contribution in #10
- @ShishirPatil made their first contribution in #4
- @daneah made their first contribution in #16
- @bllchmbrs made their first contribution in #23
- @eltociear made their first contribution in #17
- @0Armaan025 made their first contribution in #29
- @QZGao made their first contribution in #30
- @Rudra-Ji made their first contribution in #49
- @shresthasurav made their first contribution in #95
- @rabbabansh made their first contribution in #93
Full Changelog: https://github.com/cpacker/MemGPT/commits/0.1.6