KAI - Please add feature

Hi -
One of the best modern features in KNIME is the option to use K-AI. However, for those of us behind locked-down company firewalls, K-AI remains a "wish to have", as we cannot connect to the Internet for LLMs or updates.

Is it not possible to have a section somewhere in Preferences where you can set your own URL to an LLM? (As I can do with the LLM nodes, for instance.) I am guessing it is currently hard-coded to OpenAI or one of the other external LLM providers.

But we have our own locally deployed LLM, which could give us at least most of the features K-AI provides, even if it is not fine-tuned on all the KNIME documentation.

This would be a godsend for the many data-security-restricted companies (or individuals, for that matter) that want to use KNIME to its fullest.

Got my vote for this. I'm pretty sure there's some RAG behind K-AI as well, so I'm not sure how well models will perform without access to that…


True, but anything is better than nothing


I am adding my vote on this as well. Our enterprise only allows access to LLMs that we approve, and without the ability to point K-AI at our desired endpoint this is an unusable feature.

Hey hey,

Business Hub Enterprise Edition already allows you to use your own K-AI backend (today we support a number of OpenAI-compatible models and plan to add more, e.g. Gemini).

We are working on making this feature available for the Basic and Standard Editions as well.

Cheers,

Christian


Hi Christian,

We are only using the desktop KNIME Analytics Platform; we do not have any Hub version and have no plans for a Hub. There needs to be a way either to point K-AI at an internal LLM from the desktop or to turn off K-AI support somehow. At this point, since it uses the Community Hub, we have had to block access to hub.knime.com completely, making it unusable for everything. If there is a particular IP address and port we could block to stop K-AI while allowing other functionality, that would be good in the short term. There needs to be a way to disable this feature in an enterprise setting.

Would that be on a per-instance level (e.g. turned off per user), or how would you want to enforce that setting for your users? Also, would it be OK if you could point K-AI to your own LLM, e.g. via Hub Teams?

Hi Christian,

Well, we deploy KNIME packaged via the Software Center. If there were a setting we could configure in knime.ini or something like that, it would be turned off for everyone who deploys from the Software Center, and since those users do not have admin rights, it would stay off. It would also be acceptable to us if it were possible to just block K-AI's network connection at the firewall. We blocked hub.knime.com, but we would like to be more refined and block just the AI path.

If we could point K-AI to our own LLM via a setting somewhere that a user could not change, again something like knime.ini, we could work with that too.
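For illustration of what an admin-locked default could look like: KNIME AP is Eclipse-based, and Eclipse supports pre-seeding preference defaults for a deployment via a plugin-customization file. This is only a sketch under that assumption; the bundle ID and preference key below are hypothetical placeholders, not documented KNIME settings.

```ini
# plugin_customization.ini -- shipped read-only with the Software Center package.
# Format: <bundle-id>/<preference-key>=<value>
# NOTE: "org.knime.example.ai/enabled" is a hypothetical placeholder key;
# check the KNIME documentation for the actual preference name.
org.knime.example.ai/enabled=false
```

Eclipse picks such a file up when knime.ini contains the program argument `-pluginCustomization plugin_customization.ini` (placed before `-vmargs`), which would apply the defaults to every installation deployed from the package.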

From a security standpoint, we cannot have people with access to a public AI potentially sending out something confidential or restricted by mistake. Since K-AI has access to the open workflows, it could potentially see confidential data, and that is the problem.

Steve


I understand your use case and concerns; that is pretty much the motivation why we wanted to give our Business Hub customers the possibility to select their own AI provider for K-AI. We have also been thinking about making this capability available for Community Hub Teams subscribers. In both cases, no question or K-AI data will ever touch "our" AI; it will only go through the backend you provide.

The logic behind K-AI cannot be implemented as an AP feature (for many reasons); it is part of the Hub infrastructure, meaning you cannot simply switch the backend in AP without a Hub. However, you can disable K-AI in the APs you provide through your Software Center, as described here: KNIME Analytics Platform User Guide