# Qwen3-0.6B

An Ollama flake for NixOS: run Qwen3-0.6B locally with ollama or llama.cpp.
## Setup

Enter the dev shell provided by the flake:

```sh
nix develop
```
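For reference, a minimal sketch of what a `flake.nix` like this might contain (the system, input pin, and package names are assumptions; the actual flake may differ):

```nix
{
  description = "Dev shell for running Qwen3-0.6B with ollama and llama.cpp";

  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }:
    let
      # Hypothetical single-system flake; the real one may cover more systems.
      system = "x86_64-linux";
      pkgs = nixpkgs.legacyPackages.${system};
    in {
      # `nix develop` drops you into a shell with both runtimes on PATH.
      devShells.${system}.default = pkgs.mkShell {
        packages = [
          pkgs.ollama     # the `ollama` CLI and server
          pkgs.llama-cpp  # provides `llama-cli` (and `llama-server`)
        ];
      };
    };
}
```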
## Usage

### ollama

Start the server in the background, then pull and run the model:

```sh
ollama serve &
ollama run qwen3:0.6b
```
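While `ollama serve` is running, the model can also be queried over Ollama's HTTP API, which listens on port 11434 by default:

```sh
# One-off, non-streaming generation via the REST API
curl http://localhost:11434/api/generate -d '{
  "model": "qwen3:0.6b",
  "prompt": "Hello",
  "stream": false
}'
```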
### llama.cpp

Download the quantized GGUF weights from Hugging Face, then chat interactively:

```sh
curl -L -O https://huggingface.co/Qwen/Qwen3-0.6B-GGUF/resolve/main/Qwen3-0.6B-Q8_0.gguf
llama-cli -m Qwen3-0.6B-Q8_0.gguf -p "Hello" -cnv
```
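If the dev shell also exposes llama.cpp's `llama-server` (an assumption; check which binaries the flake's llama.cpp package installs), the same GGUF file can be served over an OpenAI-compatible HTTP API:

```sh
# Serve the model over HTTP (llama-server defaults to port 8080)
llama-server -m Qwen3-0.6B-Q8_0.gguf --port 8080 &

# Query the OpenAI-compatible chat completions endpoint
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```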