Open source LLM - Ollama

I followed the instructions in this guide: How to leverage open source LLMs locally via Ollama | KNIME

The OpenAI Chat Model Connector works fine as displayed in the workflow.

However, when I try to use the OpenAI Embeddings Connector and the OpenAI LLM Connector for more complex scenarios, they do not work the way it was shown during the AI-KNIME Short Course.

Please suggest alternatives to the OpenAI APIs that were used in the Exercises.

Hey there and welcome to the Forum!

You are right - Ollama provides an OpenAI-compatible API for chat / instruct models, but unfortunately not for embeddings.
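
For reference, here is a minimal sketch of how the chat side can be reached through that OpenAI-compatible endpoint. It assumes Ollama is running locally on its default port 11434 and that a model such as llama3 has already been pulled - both the port and the model name are placeholders you may need to adjust:

```python
from openai import OpenAI

# Point the standard OpenAI client at Ollama's OpenAI-compatible endpoint.
# Ollama ignores the API key, but the client requires a non-empty value.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # placeholder: any chat/instruct model pulled into Ollama
    messages=[{"role": "user", "content": "Summarize what KNIME is in one sentence."}],
)
print(response.choices[0].message.content)
```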

See this post, which contains a link to potential alternatives:

In general, you can use any embeddings model, provided the API you use to access it is compatible with the OpenAI embeddings API.
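
To illustrate the general idea, here is a rough sketch of what such a call looks like against any server exposing an OpenAI-compatible /v1/embeddings route. The base URL and model name below are placeholders for whatever alternative you end up using:

```python
from openai import OpenAI

def embed(texts, base_url, model, api_key="not-needed-locally"):
    """Send texts to any OpenAI-compatible /v1/embeddings endpoint."""
    client = OpenAI(base_url=base_url, api_key=api_key)
    response = client.embeddings.create(model=model, input=texts)
    return [item.embedding for item in response.data]

# Placeholder values: swap in the URL and model of your embeddings server.
vectors = embed(
    ["KNIME workflows can call local embeddings models."],
    base_url="http://localhost:8080/v1",
    model="some-embedding-model",
)
print(len(vectors[0]))  # dimensionality of the returned embedding
```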



Thought I'd let you know that I looked into this again and found that there now seems to be experimental support for an OpenAI-compatible endpoint in Ollama:

Have not yet had a chance to test it out though…
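
In case someone wants to try it before I do, and assuming this refers to the embeddings side of the OpenAI-compatible API, here is a rough, untested guess at what the call would look like (the model name nomic-embed-text and the default port 11434 are assumptions):

```python
from openai import OpenAI

# Same OpenAI-compatible base URL as for chat; the key is ignored by Ollama.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.embeddings.create(
    model="nomic-embed-text",  # assumption: an embeddings model pulled into Ollama
    input="Testing Ollama's experimental OpenAI-compatible embeddings endpoint.",
)
print(len(response.data[0].embedding))
```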
