Hello,
I am currently exploring this sample workflow: [Create a Vector Store].
I have successfully used the OpenAI nodes to connect to Ollama hosted on my localhost, utilizing the Llama 3.1 model for simple queries and prompts.
I am now looking at the next step, which involves the embedding nodes. Does anyone know whether the current OpenAI Embedding Connectors can be pointed at Ollama? If so, which open-source models available on Ollama support embeddings? I understand that Llama 3.1 is an LLM, not an embedding model. My goal is to test a use case that connects an open-source embedding model through the OpenAI Embedding Connectors, via Ollama hosted locally.
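To make the use case concrete, here is a minimal Python sketch of the kind of call I have in mind. It assumes Ollama's OpenAI-compatible `/v1` API on its default port 11434, and the `nomic-embed-text` model, which is one of the embedding models listed in the Ollama library; please correct me if the connectors need something different:

```python
import json
import urllib.request

# Assumptions: Ollama running locally on its default port 11434,
# exposing the OpenAI-compatible /v1 API, with an embedding model
# (here `nomic-embed-text`) already pulled via `ollama pull`.
OLLAMA_BASE_URL = "http://localhost:11434/v1"


def build_embedding_request(
    text: str, model: str = "nomic-embed-text"
) -> urllib.request.Request:
    """Build an OpenAI-style embeddings request aimed at the local Ollama server."""
    return urllib.request.Request(
        f"{OLLAMA_BASE_URL}/embeddings",
        data=json.dumps({"model": model, "input": text}).encode(),
        headers={
            "Content-Type": "application/json",
            # OpenAI connectors require an API key; Ollama ignores its value.
            "Authorization": "Bearer ollama",
        },
    )


def embed(text: str) -> list[float]:
    """Send the request and return the embedding vector."""
    with urllib.request.urlopen(build_embedding_request(text)) as resp:
        return json.load(resp)["data"][0]["embedding"]
```

If this shape is right, I would expect the OpenAI Embedding Connectors to work the same way just by overriding the base URL to point at the local Ollama instance.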
Any assistance would be greatly appreciated.