Ollama source code:
https://github.com/ollama/ollama/tree/v0.9.5?tab=readme-ov-file
Install Ollama
Download (Windows installer): https://ollama.com/download/OllamaSetup.exe
- Install to a specific directory:
OllamaSetup.exe /DIR="D:\Ollama"
- Show a model's details:
ollama show gemma3
- List installed models:
ollama list
- List currently running models:
ollama ps
- Stop a running model:
ollama stop gemma3
- Allow access from other machines (cmd):
set OLLAMA_HOST=0.0.0.0:11434
- In PowerShell:
$env:OLLAMA_HOST="0.0.0.0:11434"
- Or set it as a system environment variable:
OLLAMA_HOST=0.0.0.0:11434
- Start the Ollama server:
ollama serve
- Run the gemma3 model:
ollama run gemma3
- Run the deepseek-r1 model:
ollama run deepseek-r1
- Access the local Ollama server:
http://localhost:11434/
- List installed model tags via the API:
http://localhost:11434/api/tags
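The `/api/tags` endpoint returns a JSON object listing the locally installed models. A minimal sketch of consuming that response in Python; the sample payload below is illustrative (the names and sizes are made up), not an actual server reply, and the exact fields can vary by Ollama version:

```python
import json

# Illustrative stand-in for a GET http://localhost:11434/api/tags reply.
sample = json.loads("""
{
  "models": [
    {"name": "gemma3:latest", "size": 3338801804},
    {"name": "deepseek-r1:latest", "size": 4683075271}
  ]
}
""")

def list_models(payload: dict) -> list[str]:
    """Return the installed model names from an /api/tags-style response."""
    return [m["name"] for m in payload.get("models", [])]

print(list_models(sample))  # ['gemma3:latest', 'deepseek-r1:latest']
```

In practice you would fetch the payload with any HTTP client (e.g. `curl http://localhost:11434/api/tags`) and feed the parsed JSON to the same helper.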
- Browse the model library:
https://ollama.com/library
Model library
Model | Parameters | Size | Download |
---|---|---|---|
Gemma 3 | 1B | 815MB | ollama run gemma3:1b |
Gemma 3 | 4B | 3.3GB | ollama run gemma3 |
Gemma 3 | 12B | 8.1GB | ollama run gemma3:12b |
Gemma 3 | 27B | 17GB | ollama run gemma3:27b |
QwQ | 32B | 20GB | ollama run qwq |
DeepSeek-R1 | 7B | 4.7GB | ollama run deepseek-r1 |
DeepSeek-R1 | 671B | 404GB | ollama run deepseek-r1:671b |
Llama 4 | 109B | 67GB | ollama run llama4:scout |
Llama 4 | 400B | 245GB | ollama run llama4:maverick |
Llama 3.3 | 70B | 43GB | ollama run llama3.3 |
Llama 3.2 | 3B | 2.0GB | ollama run llama3.2 |
Llama 3.2 | 1B | 1.3GB | ollama run llama3.2:1b |
Llama 3.2 Vision | 11B | 7.9GB | ollama run llama3.2-vision |
Llama 3.2 Vision | 90B | 55GB | ollama run llama3.2-vision:90b |
Llama 3.1 | 8B | 4.7GB | ollama run llama3.1 |
Llama 3.1 | 405B | 231GB | ollama run llama3.1:405b |
Phi 4 | 14B | 9.1GB | ollama run phi4 |
Phi 4 Mini | 3.8B | 2.5GB | ollama run phi4-mini |
Mistral | 7B | 4.1GB | ollama run mistral |
Moondream 2 | 1.4B | 829MB | ollama run moondream |
Neural Chat | 7B | 4.1GB | ollama run neural-chat |
Starling | 7B | 4.1GB | ollama run starling-lm |
Code Llama | 7B | 3.8GB | ollama run codellama |
Llama 2 Uncensored | 7B | 3.8GB | ollama run llama2-uncensored |
LLaVA | 7B | 4.5GB | ollama run llava |
Granite-3.3 | 8B | 4.9GB | ollama run granite3.3 |
- Generate a completion with gemma3:
curl http://localhost:11434/api/generate -d '{ "model": "gemma3", "prompt":"Why is the sky blue?" }'
- Generate a completion with deepseek-r1:
curl http://localhost:11434/api/generate -d '{ "model": "deepseek-r1", "prompt":"Why is the sky blue?" }'
- Chat with gemma3 via the chat API:
curl http://localhost:11434/api/chat -d '{ "model": "gemma3", "messages": [ { "role": "user", "content": "why is the sky blue?" } ] }'
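By default the `generate` and `chat` endpoints stream their reply as newline-delimited JSON: each line is an object carrying a `response` fragment, with `"done": true` on the final line. A minimal sketch of stitching the fragments back together; the stream lines below are illustrative, not real model output:

```python
import json

def collect_response(ndjson_lines):
    """Reassemble a streamed /api/generate reply from its NDJSON lines."""
    text = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        text.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(text)

# Illustrative stream fragments (not actual server output):
stream = [
    '{"model":"gemma3","response":"The sky ","done":false}',
    '{"model":"gemma3","response":"is blue because...","done":true}',
]
print(collect_response(stream))  # The sky is blue because...
```

Passing `"stream": false` in the request body instead returns a single JSON object, which is often simpler for scripting.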
You should have at least 8 GB of RAM to run 7B models, 16 GB to run 13B models, and 32 GB to run 33B models.
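The guideline above can be expressed as a small helper. This is only a sketch of the rule of thumb stated in these notes, not an official requirement; larger models (70B and up) need correspondingly more memory:

```python
def min_ram_gb(params_b: float) -> int:
    """Rough RAM guideline: 8 GB for up to 7B parameters,
    16 GB for up to 13B, 32 GB for up to 33B."""
    if params_b <= 7:
        return 8
    if params_b <= 13:
        return 16
    if params_b <= 33:
        return 32
    # Beyond the stated guideline: size RAM to the model file instead.
    raise ValueError("model too large for this rule of thumb")

print(min_ram_gb(7))   # 8
print(min_ram_gb(13))  # 16
```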