# My Ollama flake for NixOS

## Qwen3-0.6B

Run Qwen3-0.6B locally with Ollama or llama.cpp.

## Setup

Enter the flake's dev shell, which makes the `ollama` and `llama-cli` commands available:

```sh
nix develop
```
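For reference, a minimal `flake.nix` providing such a dev shell might look like the sketch below. This is an illustration, not the repo's actual flake, which may pin different inputs or systems; `ollama` and `llama-cpp` are the package names currently in nixpkgs:

```nix
{
  description = "Dev shell with ollama and llama.cpp";

  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }:
    let
      # Assumption: adjust for your platform (e.g. aarch64-linux, aarch64-darwin).
      system = "x86_64-linux";
      pkgs = nixpkgs.legacyPackages.${system};
    in {
      # `nix develop` drops you into this shell with both tools on PATH.
      devShells.${system}.default = pkgs.mkShell {
        packages = [ pkgs.ollama pkgs.llama-cpp ];
      };
    };
}
```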

## Usage

### Ollama

Start the Ollama server in the background, then pull and run the model:

```sh
ollama serve &
ollama run qwen3:0.6b
```

### llama.cpp

Download the Q8_0 GGUF quantization from Hugging Face, then start an interactive chat (`-cnv` enables conversation mode):

```sh
curl -L -O https://huggingface.co/Qwen/Qwen3-0.6B-GGUF/resolve/main/Qwen3-0.6B-Q8_0.gguf
llama-cli -m Qwen3-0.6B-Q8_0.gguf -p "Hello" -cnv
```