Actions: li-plus/chatglm.cpp

CMake

240 workflow runs
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #273: Pull request #305 synchronize by li-plus
June 20, 2024 04:22 3m 34s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #272: Pull request #305 synchronize by li-plus
June 20, 2024 03:39 4m 42s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #271: Pull request #305 synchronize by li-plus
June 20, 2024 03:37 4m 22s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #270: Pull request #305 synchronize by li-plus
June 20, 2024 01:09 3m 52s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #269: Pull request #305 synchronize by li-plus
June 18, 2024 11:48 4m 26s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #268: Pull request #305 synchronize by li-plus
June 18, 2024 11:29 3m 42s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #267: Pull request #305 synchronize by li-plus
June 18, 2024 11:21 3m 0s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #266: Pull request #305 synchronize by li-plus
June 18, 2024 08:28 4m 13s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #265: Pull request #305 synchronize by li-plus
June 18, 2024 08:17 3m 27s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #264: Pull request #305 synchronize by li-plus
June 16, 2024 12:51 3m 35s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #263: Pull request #305 synchronize by li-plus
June 15, 2024 03:31 4m 24s dev
Disable shared library by default. Set default max_length in api server.
CMake #261: Pull request #317 opened by li-plus
June 14, 2024 12:51 4m 26s glm4
Fix regex lookahead for code input tokenization (#314)
CMake #260: Commit c9a4a70 pushed by li-plus
June 14, 2024 07:52 4m 10s main
Fix regex lookahead for code input tokenization
CMake #259: Pull request #314 opened by li-plus
June 14, 2024 07:46 4m 9s glm4
Use apply_chat_template to calculate tokens (#309)
CMake #258: Commit 6d671d2 pushed by li-plus
June 13, 2024 11:05 5m 4s main
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #256: Pull request #305 synchronize by li-plus
June 13, 2024 07:43 5m 14s dev
Update py interface for GLM4 (#306)
CMake #255: Commit 6e8bf84 pushed by li-plus
June 13, 2024 02:29 4m 55s main
Update py interface
CMake #254: Pull request #306 opened by li-plus
June 13, 2024 02:29 4m 48s glm4
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
CMake #253: Pull request #305 opened by li-plus
June 13, 2024 02:25 4m 52s dev
Add ChatGLM model type & tokenizer to pybind (#304)
CMake #252: Commit 1003af9 pushed by li-plus
June 13, 2024 02:19 4m 53s main
Add ChatGLM model type & tokenizer to pybind
CMake #251: Pull request #304 opened by li-plus
June 13, 2024 02:19 3m 55s glm4
Support ChatGLM4 conversation mode (#303)
CMake #250: Commit 598b38e pushed by li-plus
June 13, 2024 01:33 4m 51s main
Support ChatGLM4 conversation mode
CMake #249: Pull request #303 synchronize by li-plus
June 12, 2024 16:33 6m 3s glm4