I followed the instructions on this link: How to leverage open source LLMs locally via Ollama | KNIME
The OpenAI Chat Model Connector works fine as displayed in the workflow.
However, when I try to use the OpenAI Embeddings Connector and OpenAI LLM Connector for more complex scenarios, they do not work the same way as was shown during the AI-KNIME Short Course.
Please suggest alternatives to the OpenAI APIs that were used in the Exercises.
Hey there and welcome to the Forum!
You are right - Ollama provides an OpenAI-compatible API for chat/instruct models, but unfortunately not for embeddings.
See this post that contains a link to find potential alternatives:
Glad you managed to work it out.
Regarding cheaper embedding models:
You should be able to use any embedding model provided that it uses the structure of the OpenAI Embeddings API Endpoint:
https://platform.openai.com/docs/api-reference/embeddings
Unfortunately that may rule out some providers - e.g. I tried running local embedding models using Ollama, but since Ollama's embeddings endpoint is not OpenAI-compatible (yet), it doesn't work.
As far as I know Deepseek currently does not hav…
In general you can use any embeddings model provided the API you use to access it is OpenAI embeddings compatible.
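To illustrate what "OpenAI embeddings compatible" means in practice, here is a rough sketch using only the Python standard library. The base URL, API key, and model name are placeholders for whichever provider you pick - the only thing that matters is that the request and response follow the shapes documented in the OpenAI Embeddings API reference:

```python
import json
import urllib.request


def build_embeddings_request(base_url: str, api_key: str,
                             model: str, texts: list[str]):
    """Build a POST request in the shape the OpenAI Embeddings
    API expects: {"model": ..., "input": [...]}."""
    return urllib.request.Request(
        base_url.rstrip("/") + "/embeddings",
        data=json.dumps({"model": model, "input": texts}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )


def parse_embeddings_response(body: dict) -> list[list[float]]:
    """Compatible providers answer with
    {"data": [{"embedding": [...]}, ...]} -- one entry per input."""
    return [item["embedding"] for item in body["data"]]


# Example usage (URL and model are placeholders, not a real provider):
# req = build_embeddings_request("https://api.example.com/v1",
#                                "MY_API_KEY", "some-embedding-model",
#                                ["hello", "world"])
# with urllib.request.urlopen(req) as resp:
#     vectors = parse_embeddings_response(json.load(resp))
```

Any provider that accepts that request and returns that response shape should be a drop-in replacement.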
system closed this topic on July 7, 2025: This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.
Thought I'd let you know that I looked into this again and found that there now seems to be experimental support for an OpenAI-compatible endpoint in Ollama:
# OpenAI compatibility
> [!NOTE]
> OpenAI compatibility is experimental and is subject to major adjustments including breaking changes. For fully-featured access to the Ollama API, see the Ollama [Python library](https://github.com/ollama/ollama-python), [JavaScript library](https://github.com/ollama/ollama-js) and [REST API](https://github.com/ollama/ollama/blob/main/docs/api.md).
Ollama provides experimental compatibility with parts of the [OpenAI API](https://platform.openai.com/docs/api-reference) to help connect existing applications to Ollama.
## Usage
### OpenAI Python library
```python
from openai import OpenAI

client = OpenAI(
    base_url='http://localhost:11434/v1/',

    # required but ignored
    api_key='ollama',
)
```
Have not yet had a chance to test it out though…
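In case anyone wants to experiment, here is a minimal sketch of what a raw request against that endpoint could look like. Note the assumptions: whether the experimental compatibility layer actually covers a `/v1/embeddings` route is unverified, and the model name `nomic-embed-text` is just an example of a local embedding model you might have pulled:

```python
import json
import urllib.request

# Sketch only -- assumes Ollama's experimental OpenAI compatibility
# also exposes /v1/embeddings, which I have not verified.
req = urllib.request.Request(
    "http://localhost:11434/v1/embeddings",
    data=json.dumps({
        # "nomic-embed-text" is an example model name, not a given
        "model": "nomic-embed-text",
        "input": "Hello from KNIME",
    }).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # required but ignored, mirroring the api_key above
        "Authorization": "Bearer ollama",
    },
)

# To actually send it (needs a running Ollama server):
# with urllib.request.urlopen(req) as resp:
#     vector = json.load(resp)["data"][0]["embedding"]
```

If that call succeeds, the response should follow the OpenAI embeddings shape, which is exactly what an OpenAI-compatible consumer would expect.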