Ollama LLM Chat Code Assistant

jussij
Site Admin
Posts: 2650
Joined: Fri Aug 13, 2004 5:10 pm

Ollama LLM Chat Code Assistant

Post by jussij »

NOTE: The latest version of Zeus includes scripts that allow it to integrate with Gemini AI, OpenAI and Ollama.

These features can be found on the new Macros, AI menu shown below:
ai-macros.png (the Macros, AI menu)
IMPORTANT: Before trying to use this macro, make sure you are running the most recent version of Zeus: https://www.zeusedit.com/download.html

Consider a Python file that contains the following text:
prompt.png (the comment line used as the prompt)
With the cursor on the comment line shown in the image, running the auto-complete macro produces this response:
response.png (the generated code)
NOTE: Instructions on how to run these macros can be found in the header section of each macro script found in the Zeus zScript folder.
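The completion shown in the screenshots amounts to code along these lines (a sketch of typical model output for that prompt, not the macro itself):

```python
# Write a function to generate the nth fibonacci number.

def fibonacci(n):
    if n <= 1:
        return n
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(10))  # → 55
```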

Legacy Macro
NOTE: The Ollama Chat macro shown below is now legacy as it has been superseded by the macros that come with the installer.

NOTE: Instructions on how to run the macro can be found in the header section of the macro script code. The macro should be saved to the ollama-chat.py file in the Zeus zScript folder.
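As a quick sanity check before running the macro, you can probe whether the Ollama server is up. The helper below is illustrative only and is not part of the macro; it assumes the server's default port 11434:

```python
import urllib.request
import urllib.error

def ollama_running(host="http://localhost:11434"):
    # a running Ollama server answers a plain GET / with "Ollama is running"
    try:
        with urllib.request.urlopen(host, timeout=2) as response:
            return response.status == 200
    except (urllib.error.URLError, OSError):
        # connection refused or timed out: the server is not reachable
        return False

print(ollama_running())  # True once 'ollama serve' is up
```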

Here is the Ollama LLM Chat Code Assistant macro script that produces this code completion behaviour:

Code: Select all

#
#        Name: Ollama LLM Chat Code Assistant
#
#  Description: This macro sends the question under the cursor to a local LLM and embeds
#               the code response inside the document.
#
#  Setup:
#
#     1. Download and install ollama found here: https://ollama.com/download/windows
#
#     2. From inside Zeus open a DOS command line prompt using the Tools menu, and then
#        run the following commands.
#
#     3. Install the ollama Python package by running this command:
#
#            pip install ollama
#
#     4. Pick a model to be installed: https://ollama.com/library
#
#     5. Install the model:
#
#            ollama run codellama:7b-instruct
#
#     6. Make sure ollama server is running:
#
#            ollama serve
#
#  Usage:
#     To use, load the macro (i.e. the F9 key), position the cursor and then run the
#     macro using Macro Execute (i.e. the F8 key).
#
#     Alternatively use the Options, Editor Options menu and in the Keyboard section bind
#     the macro to the keyboard.
#
#  Example: Open a Python file inside Zeus and enter this single comment line into
#           that file.
#
#               # Write a function to generate the nth fibonacci number.
#
#         With the cursor somewhere on that line run this macro.
#
#         This should result in the following output being entered into the file:
#
#               # Write a function to generate the nth fibonacci number.
#
#               def fibonacci(n):
#                   if n <= 1:
#                       return n
#                   else:
#                       return fibonacci(n-1) + fibonacci(n-2)
#
import os
import re
import time
import zeus
import ollama
import asyncio

def code_blocks_to_string(markdown_text):
    pattern = re.compile(r'```(.*?)```', re.DOTALL)
    code_blocks = pattern.findall(markdown_text)

    # findall returns a list (never None); an empty list means no fenced blocks
    if code_blocks:
        separator = '\n'
        return separator.join(code_blocks)

    return markdown_text

def ollama_code(question):
    # set the model to use: https://ollama.com/library
    model='codellama:7b-instruct'

    language = zeus.macro_tag('$language').decode('utf8')

    # set the prompt: https://ollama.com/blog/how-to-prompt-code-llama
    #                 https://www.llama.com/docs/how-to-guides/prompting/
    prompt = 'You are an expert programmer that writes simple, concise {} code. Return the code without any additional explanation. If you cannot produce an accurate result, return nothing.'.format(language)

    messages = [
        {
          'role': 'system',
          'content': prompt,
        },
        {
          'role': 'user',
          'content': question,
        },
    ]

    # helps debugging
    #zeus.message_box(prompt, str(messages))

    try:
        zeus.message('Starting ollama chat....')

        response = ollama.chat(model, messages, stream=True)

        # yield briefly and clear the wait cursor
        zeus.yieldTask(100, 1)

        result = ""
        status = "Thinking (ESC to cancel)."
        message = status
        VK_ESCAPE = 27   # 0x1B virtual-key code

        for chunk in response:
            result = result + chunk['message']['content']

            # yield briefly and clear the wait cursor
            zeus.yieldTask(100, 1)

            message = message + '.'

            if len(message) > 150:
                message = status


            if zeus.key_down(VK_ESCAPE) == 1:
                zeus.message("Operation cancelled by user.")
                return False, ''

            zeus.message(message)

        zeus.message('')

    except Exception:
        zeus.message("Failed to connect to ollama. Use 'ollama serve' to make sure the service is running.")
        zeus.beep()
        return False, ''

    return True, result

def key_macro():
    # macro only works for documents
    document = zeus.is_document()

    # macro only works for read/write documents.
    locked = zeus.is_read_only()

    if (locked == 1) or (document == 0):
        # can't run the macro on the current document
        zeus.message("This macro only works with named, writable documents.")
        zeus.beep()
        return 0

    window_id = zeus.get_window_id()

    question = b''

    insert_at = 0
    if zeus.is_marked():
        first = zeus.get_marked_top()
        last  = zeus.get_marked_bottom()

        for index in range(first, last + 1):
            question += zeus.get_line_text(index)
            question += b'\n'

        insert_at = last
        zeus.MarkHide()
    else:
        question  = zeus.get_line_text()
        insert_at = zeus.get_line_pos()

    if len(question) == 0:
        zeus.message('No question found. Place the cursor on a line of text containing the question you wish to ask Ollama.')
        zeus.beep()
        return

    completed, response = ollama_code(question.decode('utf8'))

    if completed == True:
        # make sure we return to the original window
        zeus.window_activate(window_id)

        code = code_blocks_to_string(response).lstrip('\n')

        #zeus.message_box("response: ", response)
        #zeus.message_box("code: ", code)

        # insert the response
        zeus.line_insert(insert_at + 1, code)

key_macro() # run the macro
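Outside Zeus, the markdown extraction helper used by the macro can be exercised on its own. Here is a standalone sketch (with an explicit empty-list check, since re.findall returns a list, not None):

```python
import re

def code_blocks_to_string(markdown_text):
    # pull the contents of any fenced code blocks out of a model reply;
    # if the reply contains no fenced blocks, return it unchanged
    code_blocks = re.findall(r'`{3}(.*?)`{3}', markdown_text, re.DOTALL)
    if code_blocks:
        return '\n'.join(code_blocks)
    return markdown_text

print(code_blocks_to_string("no fences here"))  # → no fences here
```

Note that the captured text still includes any language tag the model placed after the opening fence, which matches the behaviour of the macro above.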