-
@JoseConseco moved here to keep the other issue on-topic, feel free to continue asking questions here or create a new discussion.

The markdown parsing would be part of the prompt's transform field - it's built into the code starter as an example. It'll automatically remove everything in the response segment that isn't in a markdown block, and everything else goes into the buffer.

About renaming LlmDelete - undo implies a sequence, delete is just removing the only completion there (there's no history, there's just one item). I'm not stuck on the naming though, feel free to open a discussion or issue to get feedback.
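For illustration, here's a minimal sketch of what such a transform could look like, assuming the `transform` field receives the completed response text and returns the text to keep (the code starter's built-in example may differ):

```lua
-- Hypothetical transform: keep only the bodies of fenced markdown code
-- blocks, dropping any surrounding prose. The signature (string in,
-- string out) is an assumption about how `transform` is called.
local function keep_markdown_code(response)
  local blocks = {}

  -- capture everything between an opening ```lang line and the closing fence
  for body in response:gmatch('```[^\n]*\n(.-)\n```') do
    table.insert(blocks, body)
  end

  if #blocks == 0 then
    return response -- no fenced code found, leave the response untouched
  end

  return table.concat(blocks, '\n\n')
end
```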
-
Thx, it worked ok. Still, it would be cool if we could redirect the output to a new popup window - e.g. if the user asks a question, the answer shouldn't pollute the current code.
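Not plugin code, but as a rough idea of what such a popup could look like, here's a minimal sketch that puts a response into a floating scratch window using stock Neovim APIs (`response_lines` is a hypothetical table of strings holding the model's answer):

```lua
-- Show text in a centered floating scratch window instead of writing it
-- into the current buffer; `response_lines` is a hypothetical argument.
local function show_in_popup(response_lines)
  local buf = vim.api.nvim_create_buf(false, true) -- unlisted scratch buffer
  vim.api.nvim_buf_set_lines(buf, 0, -1, false, response_lines)
  vim.bo[buf].filetype = 'markdown'

  local width = math.floor(vim.o.columns * 0.6)
  local height = math.floor(vim.o.lines * 0.6)

  vim.api.nvim_open_win(buf, true, {
    relative = 'editor',
    width = width,
    height = height,
    row = math.floor((vim.o.lines - height) / 2),
    col = math.floor((vim.o.columns - width) / 2),
    style = 'minimal',
    border = 'rounded',
  })

  -- close with q; the code can be yanked from the popup before closing
  vim.keymap.set('n', 'q', '<cmd>close<cr>', { buffer = buf, nowait = true })
end
```

That keeps the current buffer untouched: you can yank the code from the popup, close it with q, and paste it wherever it belongs.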
-
original: #13 (comment)
@gsuuon is there a way to output the AI message to a popup window?
llm_replace.mp4
The issue I have now - llama will output code with comments, and I can't seem to force it to output pure code only.
It would help if the output was written to a new popup window, where the user could copy the code, close the popup, and paste it into place...
I was also wondering: maybe LlmDelete should be renamed to LlmUndo, since that's what it's for.