Running DeepSeek LLM Locally Inside Zeus
Posted: Fri May 30, 2025 4:01 am
The Zeus IDE has scripts for AI 'code explain' and 'ask a question' prompts as described here: https://www.zeusedit.com/zBB3/viewtopic.php?t=8402
IMPORTANT: The standard Zeus 'autocomplete' script can't be used with most DeepSeek models as they don't support Fill-in-the-Middle (FIM) operation.
To use DeepSeek for autocomplete, refer to the Zeus deepseek_complete.py script, which uses a specific DeepSeek model to implement autocomplete.
NOTE: These Zeus scripts can also be used as templates, making it easy to create other types of AI prompts.
One set of those Zeus scripts runs the Ollama LLM locally, and those scripts can be used to run the DeepSeek model.
Here are the instructions on how to do this:
1. Download and install Ollama found here: https://ollama.com/download/windows
NOTE: The Ollama installer configures the Ollama application to autostart. If this is not desired, turn off autostart by first killing the Ollama server
using Task Manager, then using Windows search to find 'Startup Apps' and turning off the Ollama application.
2. Pip will need to be installed, and this can be done from inside Zeus by using the Tools, DOS Shell menu and running the following commands:
[b]a.[/b] Run this command to install pip:
Code:

python.exe -m ensurepip
[b]b.[/b] Upgrade pip to the latest version using this command line:
Code:

python -m pip install --upgrade --force-reinstall --no-cache-dir pip
3. With pip installed use the Zeus Tools, DOS Command Line menu and use the following command line to install the Ollama Python package.
Code:

pip install ollama
4. Copy the details of the DeepSeek model to be installed from this page: https://ollama.com/library/deepseek-r1
5. The model is installed using the Ollama pull command by specifying the model's name and version number.
So for example, this would be the command used to install the DeepSeek model:
Code:

ollama pull deepseek-r1:7b
NOTE: The OLLAMA_MODELS environment variable can be used to specify the folder used to store these models:
Code:

set OLLAMA_MODELS=c:\ollama\models
6. Check that the model has been installed using the list command:
Code:

ollama list
7. Open the Zeus ollama_zeus.py file found in the Zeus zScript folder and update the model details. The model is referenced in a few locations, which can be located by searching for this text:
Code:

model='codellama:7b-instruct'):
Assuming the deepseek-r1:7b model was downloaded, change the code at each of those locations to use the new model details:
Code:

model='deepseek-r1:7b'):
This line of code defines the model used by the script, and it can be modified to suit.
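To make the edit concrete, here is a sketch of the change: the function name below is hypothetical (the real functions are in ollama_zeus.py), and the only thing that changes is the default model tag.

```python
# Hypothetical sketch of the edit in ollama_zeus.py -- the function name
# is illustrative; only the default model tag changes.

# Before:
# def run_prompt(prompt, model='codellama:7b-instruct'):
#     ...

# After:
def run_prompt(prompt, model='deepseek-r1:7b'):
    # The model tag must exactly match a tag shown by 'ollama list'.
    return (prompt, model)

print(run_prompt('Explain this code.')[1])  # deepseek-r1:7b
```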
8. Start the local Ollama service:
Code:

ollama serve
9. Use the following page for instructions on how to use these Zeus AI scripts: https://www.zeusedit.com/zBB3/viewtopic.php?t=8402
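For reference, the Zeus scripts talk to the local server through the ollama Python package installed in step 3. A minimal sketch of the call shape is shown below; the helper function and prompt text are illustrative, and an actual ollama.chat() call only succeeds with 'ollama serve' running and the model pulled.

```python
# Minimal sketch of querying the local model with the ollama Python
# package (assumes 'ollama serve' is running and deepseek-r1:7b is pulled).
def build_chat_args(model, prompt):
    # Shape of the keyword arguments that ollama.chat() expects.
    return {
        'model': model,
        'messages': [{'role': 'user', 'content': prompt}],
    }

args = build_chat_args('deepseek-r1:7b', 'Explain what this function does.')

# With the server running, the call itself would be:
#   import ollama
#   reply = ollama.chat(**args)
#   print(reply['message']['content'])
print(args['model'])  # deepseek-r1:7b
```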