First of all, kudos for the very detailed explanation of what you've done so far and what you've tried - it makes it really easy and enjoyable to help.
I think in general you are on the right track with your setup, including passing in the model name via a flow variable.
The only thing I can see in the screenshots that might be causing your problem is the use of the LLM Prompter vs. the Chat Model Prompter:
As far as I know, the LLM Prompter uses the completions API structure, while the Chat Model Prompter uses the chat completions structure. If you try to use a model that expects chat completions with the LLM Prompter, you may get exactly that error.
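To make that difference concrete, here is a rough sketch of the two payload shapes in OpenAI-style terms - purely illustrative, the KNIME nodes build the actual requests for you, and field names can differ for other providers:

```python
# Illustrative only: the rough shape of the two request styles.

# Completions-style (what the LLM Prompter roughly corresponds to):
completion_request = {
    "model": "my-model",                       # hypothetical model name
    "prompt": "Summarize the following text: ...",
}

# Chat-completions-style (what the Chat Model Prompter roughly corresponds to):
chat_request = {
    "model": "my-model",
    "messages": [
        {"role": "user", "content": "Summarize the following text: ..."},
    ],
}
```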
When making the change, be mindful that the Chat Model Prompter expects a conversation as input - that is, a table with a role column (ai or human) and a message column (with your prompt). This basic example should help you work that out:
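As a rough sketch of what that input table could look like, here is a Python Script node snippet that builds it with pandas - the column names ("Role", "Message") and the role value ("human") are assumptions based on the description above, so adjust them to whatever your Chat Model Prompter configuration expects:

```python
import knime.scripting.io as knio  # Python Script node I/O (assumes the current KNIME scripting API)
import pandas as pd

# A single-turn "conversation": one human message carrying the prompt.
# Column names and role values are assumptions - match them to the
# columns your Chat Model Prompter node is configured to read.
conversation = pd.DataFrame(
    {
        "Role": ["human"],
        "Message": ["Summarize the following text: ..."],
    }
)

knio.output_tables[0] = knio.Table.from_pandas(conversation)
```

You could of course build the same two-column table with a Table Creator node instead of Python - the point is just the role/message structure the node expects.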