I’ve created an AI tool that has two input variables: userQuery (string) and number-chunks (integer). I connected the tool to the “Agent Prompter” node using the “Workflow to Tool” node.
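For context, here is roughly what those two inputs would look like as an OpenAI-style function-calling parameter schema. This is only a sketch of the common tool/function-calling format — I don't know the exact structure KNIME generates internally, and the descriptions are made up. One thing I'd double-check is whether the hyphen in number-chunks is accepted by every provider's schema validation, since that differs between APIs:

```python
import json

# Hypothetical JSON-schema-style description of the tool's two inputs.
# The exact structure KNIME emits may differ; this only mirrors the
# common OpenAI-style tool-calling parameter format.
tool_params = {
    "type": "object",
    "properties": {
        "userQuery": {
            "type": "string",
            "description": "The user's question (illustrative description)",
        },
        "number-chunks": {
            "type": "integer",
            "description": "How many chunks to retrieve (illustrative description)",
        },
    },
    "required": ["userQuery", "number-chunks"],
}

print(json.dumps(tool_params, indent=2))
```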
I have now found that the problem only occurs in connection with the Gemini LLM Selector. If I connect the input of the Agent Prompter to the OpenAI LLM Selector, the error does not occur.
I know that @ActionAndi experimented with agents as well, initially without success; however, after tweaking the default prompt that comes with the Agent Prompter node, I believe it worked out.
There was quite some discussion of this parameter topic in the thread @mlauber71 linked further up. Quality, I think, definitely depends on which model is being used (and probably also on which provider the model comes from).
If I had to guess, I'd say that this functionality was optimized for OpenAI models (which would explain why e.g. @ActionAndi had to modify the system message to make it work with Gemini) and that model quality plays a crucial role. For example, I was not able to make a more complex agent work with smaller models that run locally, primarily because at some stage they get the parameter names confused due to the number suffix - very similar to what you are reporting above.
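To illustrate the suffix issue: duplicate parameter names typically get disambiguated by appending a numeric suffix, so a model ends up juggling near-identical names like userQuery and userQuery_1. The function below is purely an illustrative sketch of that mechanism, not KNIME's actual implementation — but it shows why smaller models have more chances to mix the names up:

```python
def dedupe_param_names(names):
    """Disambiguate duplicate parameter names by appending a numeric
    suffix (illustrative sketch of the mechanism, not the actual
    KNIME implementation). Duplicates become name_1, name_2, ..."""
    seen = {}
    result = []
    for name in names:
        if name not in seen:
            seen[name] = 0
            result.append(name)
        else:
            seen[name] += 1
            result.append(f"{name}_{seen[name]}")
    return result

# Two tools exposing a parameter with the same name: the agent now has
# to tell "userQuery" and "userQuery_1" apart in its tool calls.
print(dedupe_param_names(["userQuery", "number-chunks", "userQuery"]))
# → ['userQuery', 'number-chunks', 'userQuery_1']
```

In my experience, the larger hosted models cope with this fine, while smaller local models start swapping the suffixed variants at some point.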