Commit

Update readme.md
stefanache authored Sep 8, 2024
1 parent 8797075 commit 9ca40c9
Showing 1 changed file with 10 additions and 0 deletions.
10 changes: 10 additions & 0 deletions python/llamafile_llava/readme.md
@@ -1,3 +1,13 @@
The Ollama server sometimes proved too slow, so I decided to build a workshop around the llamafile server on Windows 10 (Pro).

You can read about downloading the **llava-v1.5-7b-q4llamfile** file [here](https://python.langchain.com/v0.2/docs/integrations/llms/llamafile/).

After downloading, rename the file (change its extension) from **llava-v1.5-7b-q4llamfile** to **llava-v1.5-7b-q4llamfile.exe** so that Windows recognizes it as an executable.
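
For illustration, a minimal Python sketch of this rename step (the download folder and exact filename below are assumptions; adjust them to your own setup):

```python
from pathlib import Path

# Assumed download location and filename; change these to match your machine.
src = Path(r"C:\Users\you\Downloads\llava-v1.5-7b-q4llamfile")

# Append .exe so Windows treats the file as an executable.
dst = src.with_name(src.name + ".exe")
src.rename(dst)
print(f"Renamed to {dst}")
```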

Start the server by double-clicking **_1.RUN_server.bat**, which runs **app.py**.
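
The actual **app.py** is not shown here; assuming it simply launches the renamed llamafile in server mode, a rough sketch (the executable name is an assumption) could look like this:

```python
import subprocess

# Assumed executable name; use whatever you named the renamed llamafile.
LLAMAFILE_EXE = "llava-v1.5-7b-q4llamfile.exe"

# Start the llamafile in server mode without opening a browser window.
# --server and --nobrowser are standard llamafile flags; the web UI and API
# are then reachable at http://localhost:8080/ by default.
subprocess.run([LLAMAFILE_EXE, "--server", "--nobrowser"], check=True)
```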

Wait for the server to finish loading, then open **http://localhost:8080/** in your browser, where you can send it messages. Afterwards, minimize the window to the taskbar but do not close it, because the server has to keep running.
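
Besides the browser UI, the running server can be queried programmatically. A minimal sketch using the llama.cpp-style `/completion` endpoint that llamafile exposes (the prompt and sampling parameters are only examples):

```python
import requests

# The llamafile server embeds the llama.cpp HTTP server, which exposes a
# /completion endpoint on the same port as the web UI.
resp = requests.post(
    "http://localhost:8080/completion",
    json={"prompt": "Q: What is LLaVA? A:", "n_predict": 64, "temperature": 0.2},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["content"])
```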

Once the llamafile server is running, you can launch the client by double-clicking the **** file.
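
The client file name is not given above. One plausible shape for such a client, using the LangChain integration linked earlier (the prompt is just an example), is:

```python
from langchain_community.llms.llamafile import Llamafile

# Llamafile defaults to http://localhost:8080, matching the server started above.
llm = Llamafile()

print(llm.invoke("Describe the LLaVA model in one sentence."))
```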

