| tags |
| --- |
| llm |
How to run the Vigogne model locally
Vigogne is an LLM based on LLaMA 2, but fine-tuned on French data. As I work mostly in French, it could be useful to me: the models I can currently run locally are all in English.
The information I found online is scarce and not easy to follow, so here is a step-by-step tutorial you can follow. I'm using pipenv almost everywhere now, it's so easy :-)
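The steps below assume the `llm` command-line tool is already available. If it isn't, installing it into a pipenv environment could look something like this (this is my own setup sketch, adapt it to yours):

pipenv install llm
pipenv shell

With `llm` on your path, install the llama.cpp plugin so it can load GGUF models: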
llm install -U llm-llama-cpp
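Then download the quantized GGUF weights for Vigogne 2 7B Chat from Hugging Face: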
wget https://huggingface.co/TheBloke/Vigogne-2-7B-Chat-GGUF/resolve/main/vigogne-2-7b-chat.Q4_K_M.gguf
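Register the downloaded file with the llama-cpp plugin, giving it the shorter alias `vigogne`: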
llm llama-cpp add-model vigogne-2-7b-chat.Q4_K_M.gguf -a vigogne
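Finally, make it the default model so you don't need to pass `-m vigogne` on every invocation: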
llm models default vigogne
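You can then check that everything works by sending it a prompt (the question here is just an example of mine):

llm "Peux-tu m'expliquer ce qu'est un grand modèle de langage ?"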