# Pending rename: llm.nvim -> model.nvim (#37)

Pinned announcement by gsuuon in Announcements
I'm moving this plugin to `model.nvim` to avoid a name collision with huggingface's plugin since their rename[1]. This means existing users will need to change `require('llm..)` to `require('model..)`. I'll be re-exporting the top-level `require('llm')` with a warning for now, but if you're importing any of the utils, providers, etc. (i.e. if you've written your own prompts) you'll need to do a `:%s/require('llm/require('model/`. This comes with a chat feature, command changes, and several fixes/improvements.

You can check out the changes early in the model branch. The previous version is available as the `llm` tag if you're not ready to switch yet.

## Chat
*(video: model_reason_siblings.mp4)*

`:Mchat [name]`
A big value add of using LLMs in Neovim is that it's a great editor for text - the chat experience shouldn't be any different. Chat runs in a normal editable buffer, so it's possible to tune things like the temperature or easily edit an assistant response and continue the conversation. Instead of asking the LLM to fix little simple mistakes, you can just fix the response and continue on. Since it's a normal buffer, you can also just copy-paste the entire conversation (or run `:Mchat [other chat prompt]`) and try again with a different model or settings.

## Commands
Along with the rename, all of the commands have been changed - they'll still work with their old names with a warning.
### Renamed

| Old | New |
| --- | --- |
| `Llm` | `Model` or `M` |
| `LlmShow` | `Mshow` |
| `LlmSelect` | `Mselect` |
| `LlmDelete` | `Mdelete` |
| `LlmCancel` | `Mcancel` |
### Removing

- `LlmMulti` - unchanged and planned to be removed

### New
- `Mchat [name]` - create a new chat buffer with the given chat prompt
- `Mchat` - run the current chat buffer (when `filetype=mchat`)
- `Myank` - yank the selected text with line numbers and filename, useful for dropping file context into a conversation
- `Mcount` - count the current buffer's tokens using tiktoken (requires python and tiktoken installed)

### Other
[1] also maybe I'd like to play around with non-large, non-language models