Hi everyone,
AI assistance while developing workflows has become a “must‑have” in today’s research environment.
In KNIME the K‑AI assistant is already integrated, but it currently relies on external services such as OpenAI’s ChatGPT. Because our institute’s compliance rules prohibit the use of these external APIs, I would like to propose a configuration option in the KNIME Preferences that lets the K‑AI assistant use a locally hosted or on‑premise LLM instead.
Enabling a self‑hosted model for K‑AI would:
- meet internal data‑security and compliance policies,
- keep all processing and data within the institute’s infrastructure, and
- broaden KNIME’s applicability for organisations with strict regulatory requirements.
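On a technical note, many self‑hosted LLM servers (for example Ollama, vLLM, or LocalAI) already expose an OpenAI‑compatible chat‑completions API, so a single configurable base URL in the Preferences might cover most of these setups. A minimal sketch of the idea, where the local port and model name are purely illustrative assumptions:

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build an OpenAI-compatible chat-completions request.

    `base_url` would come from the proposed KNIME Preferences field;
    the port and model name used below are illustrative only.
    """
    url = base_url.rstrip("/") + "/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# The same client code works against a local server instead of api.openai.com,
# simply by swapping the base URL:
url, body = build_chat_request("http://localhost:11434", "llama3", "Hello")
```

The point is that no protocol change would be required on KNIME’s side; the assistant would just talk to a different host.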
Thank you for considering this request.
Best regards,
Lars