docs: add execution permission to the llama-server when running tabby… #2549
Conversation
… on Linux

The user should give execution permission to the llama-server when running Tabby on a Linux standalone install. Otherwise, the application will crash due to a permission error.

```sh
The application panicked (crashed).
Message:  Failed to start llama-server <embedding> with command Command { std: "/home/<user>/tabby/dist/tabby/llama-server" "-m" "/home/<user>/.tabby/models/TabbyML/Nomic-Embed-Text/ggml/model.gguf" "--cont-batching" "--port" "30888" "-np" "1" "--log-disable" "--ctx-size" "4096" "-ngl" "9999" "--embedding" "--ubatch-size" "4096", kill_on_drop: true }: Permission denied (os error 13)
Location:
   crates/llama-cpp-server/src/supervisor.rs:80
```
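The crash above is the generic Linux behavior when a file lacks the execute bit: spawning it fails with `Permission denied` (errno 13). A minimal sketch reproducing and fixing that failure mode, using a throwaway script rather than the real Tabby binaries:

```shell
# Create a script without the execute bit; invoking it fails with
# "Permission denied", the same error the supervisor reports for llama-server.
f=$(mktemp)
printf '#!/bin/sh\necho ok\n' > "$f"
"$f" || echo "failed as expected"   # Permission denied until chmod +x

# The fix the docs change describes: grant execute permission.
chmod +x "$f"
"$f"                                # prints: ok
```

The same `chmod +x` applied to the extracted `llama-server` binary is what the documentation update adds.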
Walkthrough

The change updates the installation instructions for Tabby on Linux. Specifically, it modifies the command for making a file executable by adding `llama-server`.
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- website/docs/quick-start/installation/linux/index.mdx (2 hunks)
Additional comments not posted (1)
website/docs/quick-start/installation/linux/index.mdx (1)
34-34: LGTM!

The addition of `llama-server` to the `chmod +x` command ensures that both the `tabby` and `llama-server` files are made executable, preventing permission errors. However, please verify that the `llama-server` file exists in the repository.
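The review refers to the `chmod +x` line in the install docs. A sketch of the intended end state, mirroring the `/home/<user>/tabby/dist/tabby` layout from the crash log (the directory path and placeholder files here are assumptions for this demo, not taken from the docs file):

```shell
# Illustrative standalone-install layout; placeholder files stand in for the
# real release binaries.
dist=$(mktemp -d)
touch "$dist/tabby" "$dist/llama-server"

# The documented fix: apply the execute bit to both binaries, not only tabby.
chmod +x "$dist/tabby" "$dist/llama-server"
ls -l "$dist"
```

After this, the supervisor can spawn `llama-server` without hitting `Permission denied (os error 13)`.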
Thanks for the PR
Summary by CodeRabbit

- `llama-server` made executable alongside `tabby`.