Hi,
As we are not allowed to use Internet-based LLMs, we are using local deployments on our network.
In a separate post I believe we have confirmed that the OpenAI node configs are correct (Open AI connector - reopen of issue - #3 by mgirdwood) - however, when the prompt is sent there is a connection error.
We had the same issue with Alteryx and found a way around it by adding a Windows PowerShell call within that workflow, and we now get the expected returns to the prompts.
Does anyone have any ideas on how this can be accomplished in a KNIME workflow?
Thanks
Hey @mgirdwood,
I think this blog post might be what you're looking for:
You mention you are using the OpenAI nodes, so you need to make sure your internal deployment supports OpenAI's chat completions API.
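One quick way to check compatibility is to send your internal endpoint a request in OpenAI's chat-completions shape and see whether it answers with a `choices` list. A minimal sketch of that request body (the model name and base URL here are placeholders, not your actual deployment's values):

```python
import json

def chat_completions_payload(model, prompt):
    # Minimal request body in OpenAI's chat-completions format.
    # "local-model" below is only an example name for an internal deployment.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# POSTing this JSON to <base-url>/v1/chat/completions should return a
# JSON object containing a "choices" list if the deployment is compatible.
body = json.dumps(chat_completions_payload("local-model", "Hello"))
```

If the endpoint rejects this shape, the KNIME OpenAI nodes will not be able to talk to it regardless of the connector configuration.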
TL
Thanks for the reply @thor_landstrom
However, I cannot use Ollama (the PowerShell in the example is simply to start Ollama) or GPT4All.
What I am trying to find is a way to call a PowerShell script from within a workflow. I have the script that we wrote for Alteryx, and I would hope it could be used in KNIME somehow.
@mgirdwood,
I think you may be looking for this node:
Also a similar thread:
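If the linked node doesn't fit your setup, a KNIME Python Script node can also shell out to PowerShell directly. A minimal sketch, assuming `powershell.exe` is on the PATH and that running an unsigned local script with `-ExecutionPolicy Bypass` is acceptable in your environment (the script path is a placeholder):

```python
import subprocess

def build_powershell_command(script_path, *args):
    # Argument list for a non-interactive PowerShell invocation.
    # -NoProfile skips profile loading; -ExecutionPolicy Bypass lets an
    # unsigned local script run (assumption: acceptable in this network).
    return ["powershell.exe", "-NoProfile", "-ExecutionPolicy", "Bypass",
            "-File", script_path, *args]

def run_powershell(script_path, *args):
    # Run the script and return its stdout; raises CalledProcessError
    # on a non-zero exit code.
    result = subprocess.run(
        build_powershell_command(script_path, *args),
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

Inside a Python Script node you could then feed the returned stdout into the output table, which would let you reuse the script you already wrote for Alteryx.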
TL
Hi @mgirdwood,
Did you try the steps below yet? (There are many different ways you can do this, in Eclipse and in KNIME.)
Option 1:
How to configure Proxy Settings in Eclipse -mkyong
Step 1:

Step 2:
Step 3:
Select Manual, then add your proxies (with authentication if necessary).
Option 2:
You can also do it in the knime.ini file in your installation directory (only touch it if you know what you are doing).
How do I configure the proxy settings so that Eclipse can download new plugins? - Stackoverflow
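For the knime.ini route, KNIME runs on the JVM, so the standard Java proxy system properties apply. A sketch of the lines one might append after `-vmargs` (host, port, and exclusion list are placeholders for your network's values):

```ini
-Dhttp.proxyHost=proxy.example.com
-Dhttp.proxyPort=8080
-Dhttps.proxyHost=proxy.example.com
-Dhttps.proxyPort=8080
-Dhttp.nonProxyHosts=localhost|127.0.0.1
```

Note that for a purely internal LLM endpoint you may instead want it listed in `nonProxyHosts`, so requests to it bypass the proxy entirely.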