Hi,
Whilst trying to follow along with creating a vector store using local LLMs I have hit a snag that I wonder if one of the more technical people in the community can explain.
I am using LM Studio as my source for local models, and using the OpenAI nodes I have no issue connecting to and running the models in normal Chat-type workflows.
However, I have just tried to do the same with the FAISS Vector Store Creator node using the Nomic Embed model, and I have hit a problem I cannot get my head around.
Using the OpenAI nodes I can see and point to the model in the LM Studio repository. However, when I connect to the FAISS node I get an error: Execute failed: Error code: 400 - {"error": "'input' field must be a string or an array of strings"}
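From the wording of that 400, it looks like LM Studio's embeddings endpoint only accepts the `input` field as a string or an array of strings, while some OpenAI client code pre-tokenizes the text and sends arrays of token IDs instead (which the real OpenAI API accepts). This is only my guess at what the server is validating; the model name and token IDs below are made up for illustration:

```python
# A request body the endpoint should accept: "input" is an array of strings.
valid_body = {
    "model": "nomic-embed-text",  # hypothetical model name
    "input": ["first chunk of text", "second chunk of text"],
}

# A body it would reject: "input" is an array of token-ID arrays, not strings.
invalid_body = {
    "model": "nomic-embed-text",
    "input": [[101, 2034, 2072]],  # made-up token IDs
}

def input_ok(body):
    """Mimic the check implied by the 400: string, or list of strings."""
    i = body["input"]
    return isinstance(i, str) or (
        isinstance(i, list) and all(isinstance(s, str) for s in i)
    )

print(input_ok(valid_body))    # True
print(input_ok(invalid_body))  # False
```

If that guess is right, it would explain why the same model works through one embedding node but not another: the two nodes may simply serialize the request differently.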
But if I use the GPT4All embedding node and point to the exact same model in the LM Studio repository, then the FAISS node runs without issue, so I know the model is working.
I also tried the same with the Chroma Vector Store node and got the same error with the OpenAI embeddings, while GPT4All works there as well.
Any ideas would be welcome, as I would prefer to use only the one source of models.