LLM Prompter Azure OpenAI 400 error

Hi,

I get a 400 error on the LLM Prompter node when calling an o4-mini deployment on Azure AI Foundry.

Execute failed: Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}

I've tried the Azure API versions 2024-12-01-preview and 2025-01-01-preview in the Azure OpenAI Authenticator node, in case that made a difference, but it didn't.

Not sure where to go from here since it looks like KNIME is sending an incorrect parameter name.
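For context on what the error means: Azure OpenAI's reasoning models (the o-series, including o4-mini) reject the legacy `max_tokens` parameter and require `max_completion_tokens` in the chat-completions request body. As a minimal illustration (outside KNIME, and with a hypothetical helper function, not anything from the KNIME nodes), the request payload would need to be built like this:

```python
# Sketch of a chat-completions request body for Azure OpenAI.
# o-series reasoning models (e.g. o4-mini) reject "max_tokens";
# the token cap must be sent as "max_completion_tokens" instead.
# build_chat_payload is a hypothetical helper for illustration.

def build_chat_payload(messages, token_cap, reasoning_model=True):
    """Build a chat-completions request body, choosing the token-limit
    parameter name based on the model family."""
    payload = {"messages": messages}
    if reasoning_model:
        # Newer parameter, required by o-series reasoning models
        payload["max_completion_tokens"] = token_cap
    else:
        # Legacy parameter, accepted by older chat models
        payload["max_tokens"] = token_cap
    return payload

payload = build_chat_payload([{"role": "user", "content": "Hello"}], 2048)
print(payload["max_completion_tokens"])  # 2048
print("max_tokens" in payload)           # False
```

The 400 error above indicates the client (here, the LLM Prompter node) was still sending the legacy parameter name, which is a client-side fix rather than something an API-version change in the Authenticator node can work around.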

Hi @Vexatious_Outlier (great name, by the way :smiley: ) -

Are you still running into this problem? If so, I can get one of our AI devs to take a deeper look…

By name and nature!

Looks like the problem has been resolved by the update. I'm not getting a useful response out of it yet, but since there's no error it's likely now PEBKAC.

Thanks,

Rob

Additional: As predicted above, it's working fine now. After a brief initial test that produced no error but also no response, I extended the number of output tokens and everything is working as expected. Well done, Team KNIME!

