
mtb0x1/llama2.rs.wasm


What is this

This is a dirty demo of llama2.c in wasm; you can run the demo locally.

This demo relies on the notable implementations highlighted here, under the notable forks -> Rust section.

How

  1. Download the release tarball and untar it (see the sketch after this list).
  2. Run python3 -m http.server 8080 in the www folder.
  3. Open http://127.0.0.1:8080/ in your browser.
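A minimal end-to-end sketch of these steps as a shell script. The release asset name and URL are assumptions; substitute the actual tarball from the repository's Releases page:

# hypothetical asset name and URL; check the Releases page for the real one
wget https://github.com/mtb0x1/llama2.rs.wasm/releases/latest/download/release.tar.gz
tar -xzf release.tar.gz
cd www
python3 -m http.server 8080
# then open http://127.0.0.1:8080/ in your browser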

or

  1. Run wasm-pack build --release --target web --out-dir www/pkg/ --verbose
  2. Download the models:
mkdir -p stories/
wget -P stories/ https://huggingface.co/karpathy/tinyllamas/resolve/main/stories15M.bin
wget -P stories/ https://huggingface.co/karpathy/tinyllamas/resolve/main/stories42M.bin
wget -P stories/ https://huggingface.co/karpathy/tinyllamas/resolve/main/stories110M.bin
for i in port*/www/
do
    # symlink the models into each port's www folder;
    # use a subshell so the cd does not leak into the next iteration
    (
        cd "$i"
        ln -s ../../stories/stories15M.bin .
        ln -s ../../stories/stories42M.bin .
        ln -s ../../stories/stories110M.bin .
    )
done
  3. Run python3 -m http.server 8080 in the www folder of the port you want to try (see the sketch after this list).
  4. Open http://127.0.0.1:8080/ in your browser.
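When building from source, each port ships its own www folder. A minimal sketch for serving one of them, assuming you want port1 (any of the port directories listed under Credit works the same way):

cd port1/www
python3 -m http.server 8080
# then open http://127.0.0.1:8080/ in your browser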

or

Check out the live demo.

Credit

  1. Port1: A dirty and minimal port of @Gaxler's llama2.rs.
  2. Port2: A dirty and minimal port of @Leo-du's llama2.rs.
  3. Port3: A dirty and minimal port of @danielgrittner's llama2-rs.
  4. Port4: A dirty and minimal port of @lintian06's llama2.rs.
  5. Port5: A dirty and minimal port of @rahoua's pecca.rs.
  6. Port6: A dirty and minimal port of @flaneur2020's llama2.rs.