I followed the instructions on this link: How to leverage open source LLMs locally via Ollama | KNIME
The OpenAI Chat Model Connector works fine as displayed in the workflow.
However, when I try to use the OpenAI Embeddings Connector and the OpenAI LLM Connector for more complex scenarios, they do not work the way they were shown during the AI-KNIME Short Course.
Please suggest alternatives to the OpenAI APIs that were used in the Exercises.
Hey there and welcome to the Forum!
You are right: Ollama provides an OpenAI-compatible API for chat/instruct models, but unfortunately not for embeddings.
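To illustrate what "OpenAI-compatible for chat" means in practice, here is a minimal sketch (stdlib only, no OpenAI client) that POSTs an OpenAI-style chat request to Ollama's local endpoint. The base URL, model name, and dummy API key are assumptions — Ollama's default port is 11434 and it ignores the key, but OpenAI-style clients expect one to be sent:

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(base_url: str, model: str, prompt: str) -> str:
    """POST the request and return the assistant's reply text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_request(model, prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the API key, but OpenAI-style clients send one
            "Authorization": "Bearer ollama",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# Usage (needs a running Ollama with the model pulled, e.g. `ollama pull llama3`):
# print(chat("http://localhost:11434/v1", "llama3", "Hello!"))
```

Any node or client that lets you override the OpenAI base URL can be pointed at this endpoint the same way.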
See this post that contains a link to find potential alternatives:
Glad you managed to work it out.
Regarding cheaper embedding models:
You should be able to use any embedding model, provided it follows the structure of the OpenAI Embeddings API endpoint:
https://platform.openai.com/docs/api-reference/embeddings
Unfortunately, that may rule out some providers. For example, I tried running local embedding models with Ollama, but since Ollama's embeddings endpoint is not (yet) OpenAI-compatible, it doesn't work.
As far as I know, Deepseek currently does not hav…
In general you can use any embeddings model provided the API you use to access it is OpenAI embeddings compatible.
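As a quick compatibility check, this sketch shows the request and response shape the OpenAI embeddings endpoint (`POST /v1/embeddings`) uses — a provider only qualifies as "OpenAI-compatible" for embeddings if it accepts and returns this structure. The model name and the numeric values in the sample response are placeholders:

```python
def build_embeddings_request(model: str, texts: list[str]) -> dict:
    """OpenAI-style embeddings request body: a model name plus the input texts."""
    return {"model": model, "input": texts}

def extract_vectors(response: dict) -> list[list[float]]:
    """Pull the vectors out of an OpenAI-style embeddings response.

    response["data"] is a list of
    {"object": "embedding", "index": i, "embedding": [...]} items.
    """
    return [item["embedding"]
            for item in sorted(response["data"], key=lambda d: d["index"])]

# Example response shape (values are placeholders, not real embeddings):
sample_response = {
    "object": "list",
    "model": "text-embedding-3-small",
    "data": [{"object": "embedding", "index": 0, "embedding": [0.01, -0.02]}],
    "usage": {"prompt_tokens": 5, "total_tokens": 5},
}
```

If a provider's embeddings endpoint deviates from this shape, a generic OpenAI client (or a node expecting it) will fail to parse the response.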
system closed the topic on July 7, 2025, 5:50pm.
This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.