I tested several local URLs and models, but I always receive the same error message ‘invalid input’. I wonder if there is a problem with the current implementation (Ollama version 0.3.14).
Maybe someone can also try and give feedback.
What does work is creating such vector stores with GPT4All and the KNIME nodes (Chroma and FAISS) and then using them with Ollama.
Hi @mlauber71, thanks for checking out the blog post.
What you’re trying to do is unfortunately not possible using the OpenAI Authenticator node.
In Step 2 of the blog post, I also explained why:
… Next, we drag and drop the OpenAI Chat Model Connector node, which we can use to connect to Ollama’s chat, instruct and code models. We need this specific connector node (and not the OpenAI LLM Connector) because Ollama has built-in compatibility only with the OpenAI Chat Completions API, which is what the OpenAI Chat Model Connector node uses under the hood.
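For anyone curious what that compatibility means in practice: the Chat Completions call is just an HTTP POST against Ollama’s OpenAI-style endpoint. Here is a minimal sketch, assuming Ollama runs locally on its default port 11434 and that a model such as `llama3` has been pulled (both are assumptions about your local setup, not something the connector node requires):

```python
import json
from urllib import request

# Ollama exposes an OpenAI-compatible Chat Completions endpoint at
# /v1/chat/completions (default local port 11434 on a standard install).
URL = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "llama3",  # hypothetical model name; use one you have pulled
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is KNIME?"},
    ],
}

def build_request(url: str, body: dict) -> request.Request:
    """Build the kind of POST the OpenAI Chat Model Connector issues under the hood."""
    return request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(URL, payload)
print(req.get_method(), req.full_url)
# To actually send it (requires a running Ollama server):
# with request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])
```

This is why the OpenAI Authenticator approach fails: only this Chat Completions route is implemented on Ollama’s side, so nodes that rely on other OpenAI endpoints report errors like the ‘invalid input’ above.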
If we wish to leverage Ollama’s vision and embeddings models, we can do so using the nodes of the KNIME REST Client Extension.
To connect to embedding models available via Ollama, you can use the POST Request node (note: make sure that the model you want to use actually supports creating embeddings).
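For reference, the request body you would configure in the POST Request node can be sketched like this. The endpoint path `/api/embeddings` and the model name `nomic-embed-text` are assumptions based on a typical local Ollama setup; adjust both to your installation:

```python
import json

# Ollama's embeddings endpoint on a default local install (an assumption).
ENDPOINT = "http://localhost:11434/api/embeddings"

def embedding_body(model: str, text: str) -> str:
    """JSON body to put into the POST Request node's Request Body tab."""
    return json.dumps({"model": model, "prompt": text})

body = embedding_body("nomic-embed-text", "KNIME is a data analytics platform.")
print(body)
# In KNIME: set the POST Request node's URL to ENDPOINT, add a
# Content-Type: application/json header, and send this body per row.
# The response JSON contains an "embedding" field holding the vector.
```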
To show how this would work, a couple of months ago I built a workflow to connect to embedding models and vision models from Ollama. Find it below:
Hi @tescnovonesis, nice to see that with a POST Request you were able to obtain the embeddings.
Are you asking what you can do with embeddings? There are many uses: the most common one in the context of GenAI is for populating vector stores, and subsequently retrieving documents in RAG systems.
However, that’s not the only option. In a traditional data science setting, you could use those vectors to feed a traditional ML model and obtain predictions, perform topic modeling or even use embeddings to plot semantic overlaps in your text documents - just to mention a few applications.
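To illustrate that last point: once you have embedding vectors, you can measure semantic overlap between documents with cosine similarity. A dependency-free sketch with made-up toy vectors (real Ollama embeddings have hundreds of dimensions):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" for three documents
doc_a = [0.9, 0.1, 0.0]
doc_b = [0.8, 0.2, 0.1]
doc_c = [0.0, 0.1, 0.9]

print(round(cosine_similarity(doc_a, doc_b), 3))  # high: semantically close
print(round(cosine_similarity(doc_a, doc_c), 3))  # low: unrelated content
```

The same similarity score is what a vector store computes under the hood when it retrieves documents for a RAG pipeline.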
@tescnovonesis I have modified my original workflow to now use GPT4All to create Vector stores and store them with Chroma and FAISS.
If you want a more Pythonesque version that can also work on KNIME 4, you can take a look at my article and the examples mentioned there, as well as my LLM collection on the Hub.
Also make sure to check out KNIME’s collection about the use of GenAI.
If you have the extension installed, you can go to your KNIME installation folder.
In the plugins folder there should be a folder named org.knime.python.llm_5.3.2.v202409031801 (the version number may differ).
Inside that folder you can find the Python code behind the extension.