
fix coloring of last n_batch of prompt, and refactor line input #221

Merged (4 commits) on Mar 19, 2023

Conversation

bitRAKE (Contributor) commented Mar 17, 2023

llama.cpp/main.cpp

Lines 980 to 983 in 7213110

// reset color to default if there is no pending user input
if (!input_noecho && params.use_color && embd_inp.size() == input_consumed) {
printf(ANSI_COLOR_RESET);
}

This can fail to colorize the last params.n_batch tokens of the prompt correctly, because embd has just been loaded with those tokens but they have not been printed yet.

@gjmulder gjmulder added the bug Something isn't working label Mar 17, 2023
sw (Contributor) commented Mar 19, 2023

I can confirm the bug; the PR fixes it and is certainly cleaner. Can you resolve the conflicts, and perhaps also take along the one-liner in #283, which is likewise color-related? Thanks.

ggerganov (Owner) commented:
@sw merge if you approve it

@sw sw merged commit 5c19c70 into ggerganov:master Mar 19, 2023
dmahurin pushed a commit to dmahurin/llama.cpp that referenced this pull request May 31, 2023
dmahurin pushed a commit to dmahurin/llama.cpp that referenced this pull request Jun 1, 2023
Deadsg pushed a commit to Deadsg/llama.cpp that referenced this pull request Dec 19, 2023
Labels: bug — Something isn't working
4 participants