I get a 400 error on the LLM Prompter node when calling an o4-mini deployment on Azure AI Foundry.
Execute failed: Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}
I’ve tried the Azure API versions 2024-12-01-preview and 2025-01-01-preview in the Azure OpenAI Authenticator node in case that made a difference, but it didn’t.
Not sure where to go from here since it looks like KNIME is sending an incorrect parameter name.
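For anyone hitting the same thing: the error boils down to a renamed request field. A minimal sketch of the two request bodies, using the field names from the error message above (the message content and token count are just placeholders):

```python
# Reasoning models such as o4-mini reject the older "max_tokens" field
# and expect "max_completion_tokens" instead.

legacy_body = {
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 512,  # o4-mini rejects this with a 400 "unsupported_parameter"
}

reasoning_body = {
    "messages": [{"role": "user", "content": "Hello"}],
    "max_completion_tokens": 512,  # the field name reasoning models expect
}
```

Changing the API version in the Authenticator doesn't affect which field name the node sends, which would explain why that made no difference.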
Looks like the problem has been resolved by the update. Not getting a useful response out of it yet, but since there is no error it’s likely now PEBKAC.
Thanks,
Rob
Additional: As predicted above, it’s working fine now. After a brief initial test where there was no error but also no response, I extended the number of output tokens and everything is working as expected. Well done, Team KNIME!