ollama: Set default max_tokens for llama3.3 #40836

Triggered via pull request on January 23, 2025 17:19
Status: Success
Total duration: 19m 44s
Billable time: 12m
Artifacts

ci.yml (on: pull_request)
Jobs:
- (Linux) Run Clippy and tests: 15m 56s
- (macOS) Run Clippy and tests: 9m 7s
- Check Postgres and Protobuf migrations, mergability: 14s
- Check formatting and spelling: 24s
- (Linux) Build Remote Server: 1m 24s
- (Windows) Run Clippy and tests: 11m 44s
- Linux x86_x64 release bundle: 0s
- Linux arm64 release bundle: 0s
- Create a macOS bundle: 0s
- Auto release preview: 0s
Annotations

2 warnings:
- Check Postgres and Protobuf migrations, mergability: "No github_token supplied, API requests will be subject to stricter rate limiting"
- (Linux) Run Clippy and tests: "Failed to restore: EBADF: bad file descriptor, close"
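The first warning typically means a workflow step calls the GitHub API without authentication, so it falls under the lower unauthenticated rate limit. A minimal sketch of the usual fix is to pass the workflow's built-in token to the step; the action name and the exact input name (`github_token` vs. `token`) here are assumptions, not taken from this run's ci.yml, so check the failing action's documentation:

```yaml
# Hypothetical step in ci.yml. "some-org/some-action" and the
# "github_token" input name are placeholders for illustration only.
- name: Step that queries the GitHub API
  uses: some-org/some-action@v1
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
```

`secrets.GITHUB_TOKEN` is provided automatically to every workflow run, so no extra secret needs to be configured.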