
Actions: li-plus/chatglm.cpp

Python package

257 workflow runs

Apply flash attention on vision encoder (#339)
Python package #273: Commit 60c89b7 pushed by li-plus
July 31, 2024 04:30 4m 25s main
Apply flash attention on vision encoder
Python package #272: Pull request #339 opened by li-plus
July 31, 2024 04:18 4m 20s flash-attn
Fix compilation on metal (#337)
Python package #271: Commit 606eb1b pushed by li-plus
July 30, 2024 05:53 4m 54s main
Fix compilation on metal
Python package #270: Pull request #337 synchronize by li-plus
July 30, 2024 05:43 6m 26s fix-metal
Fix compilation on metal
Python package #269: Pull request #337 synchronize by li-plus
July 30, 2024 05:43 4m 51s fix-metal
Fix compilation on metal
Python package #268: Pull request #337 opened by li-plus
July 25, 2024 10:08 4m 50s fix-metal
Support GLM4V (#336)
Python package #267: Commit 0f7a8a9 pushed by li-plus
July 25, 2024 04:32 5m 4s main
Support GLM4V
Python package #266: Pull request #336 synchronize by li-plus
July 25, 2024 04:16 4m 20s glm4v
Support GLM4V
Python package #265: Pull request #336 synchronize by li-plus
July 25, 2024 03:37 5m 54s glm4v
Support GLM4V
Python package #264: Pull request #336 synchronize by li-plus
July 25, 2024 03:30 4m 44s glm4v
Support GLM4V
Python package #263: Pull request #336 synchronize by li-plus
July 25, 2024 03:21 5m 47s glm4v
Support GLM4V
Python package #262: Pull request #336 synchronize by li-plus
July 25, 2024 03:20 5m 19s glm4v
Support GLM4V
Python package #261: Pull request #336 synchronize by li-plus
July 25, 2024 03:09 4m 45s glm4v
Support GLM4V
Python package #260: Pull request #336 synchronize by li-plus
July 25, 2024 03:01 7m 24s glm4v
Support GLM4V
Python package #259: Pull request #336 synchronize by li-plus
July 25, 2024 02:57 4m 42s glm4v
Support GLM4V
Python package #258: Pull request #336 opened by li-plus
July 25, 2024 02:04 4m 19s glm4v
Fix nan by rescheduling attention scaling (#322)
Python package #257: Commit f86777c pushed by li-plus
June 24, 2024 05:15 4m 48s main
Fix nan by rescheduling attention scaling
Python package #256: Pull request #322 opened by li-plus
June 24, 2024 03:23 4m 52s fix-nan
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp. (#305)
Python package #255: Commit e9989b5 pushed by li-plus
June 21, 2024 02:20 5m 19s main
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
Python package #254: Pull request #305 synchronize by li-plus
June 21, 2024 02:14 5m 42s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
Python package #253: Pull request #305 synchronize by li-plus
June 20, 2024 06:03 4m 56s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
Python package #252: Pull request #305 synchronize by li-plus
June 20, 2024 05:13 5m 17s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
Python package #251: Pull request #305 synchronize by li-plus
June 20, 2024 04:25 4m 10s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
Python package #250: Pull request #305 synchronize by li-plus
June 20, 2024 04:22 4m 53s dev
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
Python package #249: Pull request #305 synchronize by li-plus
June 20, 2024 03:39 5m 34s dev