Integrating Ollama into Live Helper Chat with tool calls support

This sample uses the llama3-groq-tool-use model; you can use any other model as well.

I have also had good experience with the https://ollama.com/library/mistral-nemo model.

  • You will need a working Ollama model and a running Ollama server. A quick way to check both is shown below.
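
If you still need to set that up, here is a minimal sketch; it assumes Ollama's default port 11434 and the model name used in this sample:

```shell
# Pull the model used in this sample (any other tool-capable model works too).
ollama pull llama3-groq-tool-use

# Confirm the server is running and the model is listed.
curl http://localhost:11434/api/tags
```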

Installation

  • Import the Rest API.
  • Import the Bot and choose the just-imported Rest API in the import window. You can verify tool-call support with the probe shown below.
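
Before pointing the bot at your server, you may want to confirm the model actually emits tool calls. Below is a minimal probe against Ollama's /api/chat endpoint; the get_weather function is a hypothetical example for testing, not part of the imported Rest API:

```shell
# Hypothetical tool definition, used only to trigger a tool call.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3-groq-tool-use",
  "messages": [{"role": "user", "content": "What is the weather in Vilnius?"}],
  "stream": false,
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Get the current weather for a city",
      "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"]
      }
    }
  }]
}'
```

A tool-capable model should answer with a `tool_calls` array in the response message instead of plain text.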

Important

Forwarding the Ollama port from WSL to Windows

From Ubuntu, inside the Windows WSL layer, open the Ollama systemd service file:

```shell
vim /etc/systemd/system/ollama.service
```

Append `Environment="OLLAMA_HOST=0.0.0.0"` to the `[Service]` section:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
Environment="OLLAMA_HOST=0.0.0.0"

[Install]
WantedBy=default.target
```

Now reload systemd and restart the service:

```shell
systemctl daemon-reload
service ollama restart
```

From the Windows command line, find out the IP address of the WSL layer:

```shell
C:\Users\remdex>wsl.exe hostname -I
172.29.52.196 172.17.0.1
```

Edit the Windows hosts file C:\Windows\System32\drivers\etc\hosts and add the following line:

```
172.29.52.196 wsl
```

From an elevated Windows command prompt, forward the port from Windows to the WSL layer:

```shell
netsh interface portproxy set v4tov4 listenport=11434 listenaddress=* connectport=11434 connectaddress=wsl
```
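
To verify the rule took effect, or to remove it later, netsh can list and delete portproxy entries:

```shell
# List active port-forwarding rules.
netsh interface portproxy show v4tov4

# Remove the rule if no longer needed.
netsh interface portproxy delete v4tov4 listenport=11434 listenaddress=*
```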

Now you can access the Ollama service from Windows at http://your-pc-ip:11434/.
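
As a quick connectivity check from Windows (your-pc-ip stands for your machine's actual address, as above):

```shell
curl http://your-pc-ip:11434/api/tags
```

If the forwarding works, this returns the same model list as on the WSL side.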