'Temperature' parameter in HF Hub LLM Selector node does not affect responses

Issue Summary

Changing the Temperature parameter in the HF Hub LLM Selector node has no observable effect on the responses generated by the model: for a given prompt, the output remains identical and deterministic, regardless of the parameter’s value.

Detailed Description

I tested several Hugging Face models through the HF Hub LLM Selector node, assigning the Temperature parameter values ranging from 0.1 to 2. For the same input (prompt), every model always produced exactly the same output, no matter which value was set. This indicates that the Temperature parameter is not affecting the model’s inference process.
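
To help isolate whether the problem sits in the node or in the underlying service, the same comparison can be run directly against the Hugging Face Inference API. Here is a minimal sketch, assuming `huggingface_hub` is installed; the model ID and token are placeholders, not taken from the shared workflow:

```python
# Call the Hugging Face Inference API directly, outside KNIME, to check
# whether temperature changes the output for the same prompt.
from huggingface_hub import InferenceClient

# Placeholder model ID and token -- substitute your own.
client = InferenceClient(model="mistralai/Mistral-7B-Instruct-v0.2", token="hf_...")

prompt = "Write one sentence about autumn."
for temperature in (0.1, 1.0, 2.0):
    output = client.text_generation(
        prompt,
        max_new_tokens=50,
        do_sample=True,        # sampling must be enabled for temperature to matter
        temperature=temperature,
    )
    print(f"T={temperature}: {output}")
```

If the outputs vary here but stay identical in the node, the parameter is likely being dropped somewhere between the node's dialog and the API call.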

Expected Behavior

When the value of the Temperature parameter is increased, the model is expected to produce more creative, diverse, and less predictable responses. When the value is decreased, the responses should be more consistent and focused.
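
For context, temperature rescales the model's logits before sampling: values below 1 sharpen the token distribution toward the most likely token, while values above 1 flatten it, making less likely tokens more probable. A small illustrative sketch (not the node's actual implementation):

```python
import numpy as np

def softmax_with_temperature(logits: np.ndarray, temperature: float) -> np.ndarray:
    """Convert logits to a probability distribution, scaled by temperature."""
    scaled = logits / temperature
    scaled -= scaled.max()               # subtract max for numerical stability
    probs = np.exp(scaled)
    return probs / probs.sum()

logits = np.array([2.0, 1.0, 0.5])
print(softmax_with_temperature(logits, 0.1))  # near one-hot: almost deterministic
print(softmax_with_temperature(logits, 2.0))  # much flatter: more diverse samples
```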

Current Behavior

Changing the Temperature parameter results in no change to the model’s response: the outputs are always identical, as if the parameter did not exist.
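
One hypothesis worth checking (not confirmed for this node): in the Hugging Face generation stack, temperature only takes effect when sampling is enabled; under the default greedy decoding it is silently ignored and the output is fully deterministic, which matches this symptom exactly. A sketch with `transformers` (the model name is just an example):

```python
# Temperature has no effect unless do_sample=True; greedy decoding ignores it.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = tok("The weather today is", return_tensors="pt")

greedy = model.generate(**inputs, max_new_tokens=20, temperature=2.0)   # ignored
sampled = model.generate(**inputs, max_new_tokens=20, do_sample=True, temperature=2.0)
print(tok.decode(greedy[0]))
print(tok.decode(sampled[0]))
```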

Additional Information

  • An example workflow demonstrating the issue has been shared.
  • This behavior has been consistently replicated across multiple models.

System Information

  • Operating System: macOS Sequoia 15.4.1

Created ticket ID: QA-1131

Internal ticket ID: AP-24529
Summary: ‘Temperature’ parameter in HF Hub LLM Selector node does not affect responses
Fix version(s): 5.5.0