Agent Prompter calls AI tool with wrong parameter names

Hello,

I’ve created an AI tool that has two input variables: userQuery (string) and number-chunks (integer). I connected the tool to the “Agent Prompter” node using the “Workflow to Tool” node.

When I execute the Agent Prompter node, I get the following error message:

ERROR CloseablePythonNodeProxy agents._tool:Missing configuration parameters: userQuery-1951, number-chunks-1952

Can someone help with this?


I have now found that the problem only occurs in connection with the Gemini LLM Selector. If I connect the input of the Agent Prompter to the OpenAI LLM Selector, the error does not occur.

@sommer_sun there is already a ticket for this (AP-24608).


I know that @ActionAndi played around with agents as well - initially unsuccessfully - however, after tweaking the default prompt that comes with the Agent Prompter node, I believe he got it working.

There was quite some discussion on this parameter topic in the thread @mlauber71 linked further up; quality, I think, definitely depends on which model is being used (and probably also on which provider the model is from).

If I had to guess, I'd say that this functionality was optimized for OpenAI models (which would explain why e.g. @ActionAndi had to modify the system message to make it work with Gemini) and that model quality plays a crucial role. For example, I was not able to get a more complex agent working with smaller models that run locally, primarily because at some stage they get the parameter names confused due to the number suffix - very similar to what you are reporting above.
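To make the suspected failure mode concrete, here is a minimal sketch (not KNIME's actual implementation; the suffixed names and the validation logic are assumptions based on the error message above): the tool's parameters seem to be exposed to the model with numeric suffixes, and if the model answers with the bare names instead, the tool call fails validation with a "missing configuration parameters" style error.

```python
# Parameter names as (hypothetically) exposed to the model,
# matching the suffixes seen in the error message above
expected_params = {"userQuery-1951", "number-chunks-1952"}

def missing_params(tool_call_args: dict) -> list:
    """Return expected parameter names absent from the model's tool call."""
    return sorted(expected_params - tool_call_args.keys())

# A model that keeps the suffixes passes the check...
good_call = {"userQuery-1951": "what is KNIME?", "number-chunks-1952": 5}
print(missing_params(good_call))  # []

# ...while a model that drops them (as smaller models tend to) does not,
# reproducing the "Missing configuration parameters" symptom
bad_call = {"userQuery": "what is KNIME?", "number-chunks": 5}
print(missing_params(bad_call))  # ['number-chunks-1952', 'userQuery-1951']
```

This would also be consistent with the observation that a stronger model (or a tweaked system message spelling out the exact parameter names) makes the problem go away.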
