KNIME AI Extension and local LLMs

Have you ever thought about what happens to your data when you send it through an #LLM-driven #chatbot across the internet? @mlauber71 shows how to blend #KNIME with #GPT4All in a chatbot data app, allowing anyone to run models locally and safely inside #lowcode workflows. Enjoy the data story!

PS: :date: #HELPLINE. Want to discuss your article? Need help structuring your story? Make a date with the editors of Low Code for Data Science via Calendly → Calendly - Blog Writer


Has anyone found a way to retain the prompt/response history as a frame of reference for subsequent questions? At this point I am essentially collecting and combining all of the prior prompts and responses into one long string to retain context, roughly as in the sketch below. Are there better solutions for retaining history for follow-up questions?
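
For reference, this is a minimal Python sketch of what I mean. `query_local_llm` is just a stand-in for whatever call your workflow makes to the local model; it is not a real KNIME or GPT4All function.

```python
# Keep conversation history and prepend it to each new question
# so the model can resolve follow-up references.
history = []  # list of (question, answer) pairs collected so far

def ask_with_context(question, query_local_llm):
    # Rebuild the running transcript from previous turns.
    transcript = "\n".join(
        f"User: {q}\nAssistant: {a}" for q, a in history
    )
    # Prepend the transcript (if any) to the new question.
    if transcript:
        prompt = f"{transcript}\nUser: {question}\nAssistant:"
    else:
        prompt = f"User: {question}\nAssistant:"
    answer = query_local_llm(prompt)
    history.append((question, answer))
    return answer
```

It works, but the string grows with every turn and eventually hits the model's context window, which is why I am asking about better options.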

@iCFO agreed, it would be great if this worked the same way as with the OpenAI connector, where you can use your own vector store. GPT4All does support such features, but they are not (yet) available in the KNIME nodes.
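
For context, "using your own vector store" boils down to retrieving the most similar documents before prompting. Here is an illustrative-only Python sketch with a tiny in-memory store; the `embed` function is a placeholder for whatever local embedding model you would run, not part of the KNIME nodes or the GPT4All API.

```python
import numpy as np

def embed(text):
    # Placeholder embedding: hash words into a fixed-size bag-of-words vector.
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

documents = [
    "KNIME workflows can be exported as data apps.",
    "GPT4All runs quantized language models on local hardware.",
    "Vector stores hold document embeddings for similarity search.",
]
doc_vectors = np.vstack([embed(d) for d in documents])

def retrieve(question, top_k=2):
    # Rank documents by cosine similarity to the question embedding.
    scores = doc_vectors @ embed(question)
    best = np.argsort(scores)[::-1][:top_k]
    return [documents[i] for i in best]

question = "How do I search my own documents?"
context = "\n".join(retrieve(question))
prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```

With the OpenAI connector the retrieval step is handled by the vector store nodes; for GPT4All this kind of lookup would currently have to be built manually.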


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.