First of all, kudos for the very detailed explanation of what you’ve done so far and what you’ve tried - it makes it really easy and enjoyable to help.
I think in general you are on the right track with your setup, including passing in the model name via a flow variable.
The only thing I can see in the screenshots that might be causing your problem is the use of the LLM Prompter vs. the Chat Model Prompter:
As far as I know, the LLM Prompter uses the completions API structure, while the Chat Model Prompter uses the chat completions structure. If you try to use a model that requires chat completions in the LLM Prompter, you may get exactly that error.
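To make the difference concrete, here is a rough sketch of the two request shapes for a typical OpenAI-style API (this is not KNIME code, just an illustration; the endpoint, key and model name are placeholders):

```python
import requests

API_BASE = "https://your-provider.example/v1"        # placeholder endpoint
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # placeholder key

# Completions-style request (roughly what the LLM Prompter corresponds to):
# a single free-form prompt string.
completions_payload = {
    "model": "your-model-name",
    "prompt": "Summarize the following text: ...",
}
# requests.post(f"{API_BASE}/completions", headers=HEADERS, json=completions_payload)

# Chat-completions-style request (roughly what the Chat Model Prompter corresponds to):
# a list of role/content messages instead of one prompt string.
chat_payload = {
    "model": "your-model-name",
    "messages": [{"role": "user", "content": "Summarize the following text: ..."}],
}
# requests.post(f"{API_BASE}/chat/completions", headers=HEADERS, json=chat_payload)
```

A model that only serves the chat endpoint will typically reject the first shape.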
When making the change, be mindful that the Chat Model Prompter expects a conversation as input - i.e. a table with a role column (ai or human) and a message column (with your prompt). A basic example like the one below should help you work that out:
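For illustration, a minimal conversation table could look like this (just a sketch of the structure, not your actual data - build it e.g. with a Table Creator node, and use whatever column names you then select in the Chat Model Prompter’s configuration):

| Role  | Message                                      |
|-------|----------------------------------------------|
| human | What does the attached report cover?         |
| ai    | It covers Q3 sales figures by region.        |
| human | Summarize the key findings in three bullets. |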
Could I possibly give you my API key for you to try it on your end? I could send it via a private message (e-mail) and then delete the key after your attempt - what do you think?
If you agree, could you provide me with your email?
By the way, I am using the free Google Gemini Student account. I don’t think that’s a problem.
Before you set me up with an API key, try one more thing:
In the chat model selection where you pass in the flow variable, try switching the model selection at the top from "default models" to "all models".
I think "default models" triggers some sort of validation against a list of known models, and if the model in your flow variable is not on that list, it errors out.
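If you want to double-check which model names your key can actually see, here is a minimal sketch that lists them via Google’s OpenAI-compatible endpoint (the base URL and environment variable are assumptions on my side - adjust them to however you connect):

```python
import os
import requests

# List the model IDs the API reports for your key, so you can verify that the
# value in your flow variable matches one of them exactly.
API_BASE = "https://generativelanguage.googleapis.com/v1beta/openai"  # assumed Gemini OpenAI-compat base URL
API_KEY = os.environ["GEMINI_API_KEY"]  # assumed env var holding your key

resp = requests.get(f"{API_BASE}/models", headers={"Authorization": f"Bearer {API_KEY}"})
resp.raise_for_status()
for model in resp.json().get("data", []):
    print(model["id"])
```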
I’ll send you a PM in case you want me to have a crack at it.