Vector Stores LLM connection issues

Hi,

While trying to follow along with creating a vector store using local LLMs, I have hit a snag that I hope one of the more technical people in the community can explain.

I am using LM Studio as my source for local models, and with the OpenAI nodes I have no issue connecting and running the models in normal chat-type workflows.

However, I have just tried to do the same with the FAISS Vector Store Creator node using the Nomic Embed model, and I have hit a problem I cannot get my head around.

Using the OpenAI nodes I can see and point to the model in the LM Studio repository. However, when I connect to the FAISS node I get an error: Execute failed: Error code: 400 - {'error': "'input' field must be a string or an array of strings"}
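For anyone curious what that 400 error means: OpenAI-compatible embedding endpoints (including the one LM Studio exposes locally) require the `input` field of the request body to be either a single string or an array of strings. The node is presumably sending something else in that field. A minimal sketch of the constraint, with the model name `nomic-embed-text` used purely as a placeholder:

```python
import json

def build_embeddings_payload(model, text_input):
    # OpenAI-style /v1/embeddings endpoints require 'input' to be a
    # string or an array of strings; anything else produces the 400
    # error quoted above.
    valid = isinstance(text_input, str) or (
        isinstance(text_input, list)
        and all(isinstance(t, str) for t in text_input)
    )
    if not valid:
        raise TypeError("'input' must be a string or an array of strings")
    return json.dumps({"model": model, "input": text_input})

# A well-formed request body for a batch of document chunks:
payload = build_embeddings_payload(
    "nomic-embed-text", ["first chunk", "second chunk"]
)
```

So the failure is on the request side, not the model side, which matches the observation below that the same model works fine through a different embedding node.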

But if I use the GPT4All embedding node and point to the exact same model in the LM Studio repository, then the FAISS node runs without issue. So I know the model is working.

I also tried the same thing with the Chroma Vector Store node and got the same error with OpenAI, while GPT4All works there too.

Any ideas would be welcome, as I would prefer to use only the one source of models.

Currently it is not possible to use the OpenAI nodes to create vector stores. I use GPT4All for that, although you can use the OpenAI nodes later, once the store has been created.


@mlauber71
Thanks, I had the suspicion that would be the answer :)

I can use GPT4All for the vector stores, then, at least here at home.

