
multiheadattention int8 quantization #336


build: succeeded Oct 15, 2024 in 13m 49s

Set up job: 4s
Run actions/checkout@v4: 12s
cache-openmp: 1s
openmp: 0s
openmp-arm64: 0s
openmp-arm64e: 0s
openmp-simulator-x86_64: 0s
openmp-simulator-arm64: 0s
openmp-merge-fat-library: 0s
install-openmp: 0s
moltenvk: 6s
arm64: 3m 11s
arm64e: 2m 36s
simulator-x86_64: 4m 11s
simulator-arm64: 3m 25s
Post cache-openmp: 0s
Post Run actions/checkout@v4: 0s
Complete job: 0s