ollama: Update curl proxy. (#508)

Signed-off-by: fengding <feng1.ding@intel.com>
Author: feng-intel
Date: 2024-09-04 09:36:54 +08:00
Committed by: GitHub
parent 2a53e2547d
commit f510b69eac


@@ -48,7 +48,7 @@ All of your local models are automatically served on localhost:11434. Run ollama
Send an application/json request to the API endpoint of Ollama to interact.
```bash
-curl http://localhost:11434/api/generate -d '{
+curl --noproxy "*" http://localhost:11434/api/generate -d '{
"model": "llama3",
"prompt":"Why is the sky blue?"
}'
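
For context, `--noproxy "*"` tells curl to bypass any configured HTTP proxy for this request; without it, a shell where `http_proxy`/`https_proxy` is set would route the localhost call through the proxy, which typically cannot reach the local Ollama server. A minimal alternative sketch, assuming you prefer to exempt local addresses via the standard `no_proxy` environment variable instead of the flag:

```bash
# Exempt local addresses from any configured proxy, then call Ollama directly.
# (Only relevant if http_proxy/https_proxy are set in this shell.)
export no_proxy=localhost,127.0.0.1
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?"
}'
```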