DeepSeek local model config

Hello,

I tried to run a DeepSeek LLM on my local system, but the model I downloaded does not show up in the model list.

1 Like

@arddashti You can run DeepSeek models locally with Ollama or GPT4All.
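
For example, here is a minimal Python sketch for checking what Ollama actually has installed. It assumes Ollama is running with its default local API on port 11434, and the `deepseek-r1:7b` tag mentioned in the comment is only an example, not necessarily the model you downloaded:

```python
# Minimal sketch: ask a local Ollama server which models it has installed.
# Assumes Ollama's default REST API at http://localhost:11434 (adjust if yours differs).
import json
import urllib.request

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # endpoint that lists locally pulled models


def list_local_models(url: str = OLLAMA_TAGS_URL) -> list[str]:
    """Return the names of the models Ollama reports as pulled locally."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        payload = json.load(resp)
    return [m["name"] for m in payload.get("models", [])]


if __name__ == "__main__":
    names = list_local_models()
    print("Models Ollama can see:", names)
    # If no DeepSeek entry appears, the download never registered with Ollama,
    # which would explain why it is missing from the model list.
    if not any("deepseek" in n.lower() for n in names):
        print("No DeepSeek model found; try pulling one, e.g. `ollama pull deepseek-r1:7b`.")
```

If the DeepSeek model is not in that output, pull it into Ollama first; a model downloaded by other means (for example raw weights from Hugging Face) will not appear in Ollama's list on its own.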

2 Likes

In a terminal or Windows PowerShell, check whether Ollama is running at this address:
http://localhost:11434/
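
If you would rather check from a script than a browser, here is a small Python sketch. The address is the same one linked above, and the expected plain-text reply is based on Ollama's default behavior; adjust the URL if your install uses a different port:

```python
# Minimal sketch: check whether the Ollama server answers at its default address.
import urllib.error
import urllib.request

OLLAMA_ROOT_URL = "http://localhost:11434/"


def ollama_is_running(url: str = OLLAMA_ROOT_URL) -> bool:
    """Return True if something responds at the Ollama address."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            body = resp.read().decode("utf-8", errors="replace").strip()
            print("Server replied:", body)  # typically the plain-text "Ollama is running"
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    print("Ollama reachable:", ollama_is_running())
```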


5 Likes