Description
What's happening?
I wrote a script to test temperature support for Gemini models:
import os

import requests

api_key = os.environ["GEMINI_API_KEY"]
base_url = "https://generativelanguage.googleapis.com/v1beta"

models_response = requests.get(f"{base_url}/models?key={api_key}")
models = models_response.json()

# Find the longest model name so the output columns line up
max_name_len = max(len(model["name"]) for model in models.get("models", []))

for model in models.get("models", []):
    model_name = model["name"]
    padded_name = model_name.ljust(max_name_len)
    has_temperature = "temperature" in model
    has_max_temp = "maxTemperature" in model
    if has_temperature:
        default = model["temperature"]
        max_temp = model.get("maxTemperature", "unspecified")
        output = f"{padded_name}: temperature supported (default={default}, max={max_temp})"
        # Probe models whose max temperature is unspecified
        if not has_max_temp:
            supported_methods = model.get("supportedGenerationMethods", [])
            if "generateContent" in supported_methods:
                payload = {
                    "contents": [{"role": "user", "parts": [{"text": "hi"}]}],
                    "generationConfig": {"temperature": 2.0},
                }
                r = requests.post(
                    f"{base_url}/{model_name}:generateContent?key={api_key}",
                    json=payload,
                )
                if r.status_code == 200:
                    output += " - test request with temperature=2.0 succeeded"
                else:
                    output += " - test request with temperature=2.0 failed"
            else:
                output += " - skipped test (generateContent not supported)"
        print(output)
    else:
        print(f"{padded_name}: temperature not supported")
It produced the following output:
models/gemini-2.5-flash : temperature supported (default=1, max=2)
models/gemini-2.5-pro : temperature supported (default=1, max=2)
models/gemini-2.0-flash : temperature supported (default=1, max=2)
models/gemini-2.0-flash-001 : temperature supported (default=1, max=2)
models/gemini-2.0-flash-exp-image-generation : temperature supported (default=1, max=2)
models/gemini-2.0-flash-lite-001 : temperature supported (default=1, max=2)
models/gemini-2.0-flash-lite : temperature supported (default=1, max=2)
models/gemini-exp-1206 : temperature supported (default=1, max=2)
models/gemini-2.5-flash-preview-tts : temperature supported (default=1, max=2)
models/gemini-2.5-pro-preview-tts : temperature supported (default=1, max=2)
models/gemma-3-1b-it : temperature supported (default=1, max=unspecified) - test request with temperature=2.0 succeeded
models/gemma-3-4b-it : temperature supported (default=1, max=unspecified) - test request with temperature=2.0 succeeded
models/gemma-3-12b-it : temperature supported (default=1, max=unspecified) - test request with temperature=2.0 succeeded
models/gemma-3-27b-it : temperature supported (default=1, max=unspecified) - test request with temperature=2.0 succeeded
models/gemma-3n-e4b-it : temperature supported (default=1, max=unspecified) - test request with temperature=2.0 succeeded
models/gemma-3n-e2b-it : temperature supported (default=1, max=unspecified) - test request with temperature=2.0 succeeded
models/gemini-flash-latest : temperature supported (default=1, max=2)
models/gemini-flash-lite-latest : temperature supported (default=1, max=2)
models/gemini-pro-latest : temperature supported (default=1, max=2)
models/gemini-2.5-flash-lite : temperature supported (default=1, max=2)
models/gemini-2.5-flash-image : temperature supported (default=1, max=1)
models/gemini-2.5-flash-preview-09-2025 : temperature supported (default=1, max=2)
models/gemini-2.5-flash-lite-preview-09-2025 : temperature supported (default=1, max=2)
models/gemini-3-pro-preview : temperature supported (default=1, max=2)
models/gemini-3-flash-preview : temperature supported (default=1, max=2)
models/gemini-3-pro-image-preview : temperature supported (default=1, max=1)
models/nano-banana-pro-preview : temperature supported (default=1, max=1)
models/gemini-robotics-er-1.5-preview : temperature supported (default=1, max=2)
models/gemini-2.5-computer-use-preview-10-2025 : temperature supported (default=1, max=2)
models/deep-research-pro-preview-12-2025 : temperature supported (default=1, max=2)
models/gemini-embedding-001 : temperature not supported
models/aqa : temperature supported (default=0.2, max=unspecified) - skipped test (generateContent not supported)
models/imagen-4.0-generate-preview-06-06 : temperature not supported
models/imagen-4.0-ultra-generate-preview-06-06 : temperature not supported
models/imagen-4.0-generate-001 : temperature not supported
models/imagen-4.0-ultra-generate-001 : temperature not supported
models/imagen-4.0-fast-generate-001 : temperature not supported
models/veo-2.0-generate-001 : temperature not supported
models/veo-3.0-generate-001 : temperature not supported
models/veo-3.0-fast-generate-001 : temperature not supported
models/veo-3.1-generate-preview : temperature not supported
models/veo-3.1-fast-generate-preview : temperature not supported
models/gemini-2.5-flash-native-audio-latest : temperature supported (default=1, max=2)
models/gemini-2.5-flash-native-audio-preview-09-2025: temperature supported (default=1, max=2)
models/gemini-2.5-flash-native-audio-preview-12-2025: temperature supported (default=1, max=2)
I see some discrepancies between this output and what Big-AGI exposes in its model options dialog. For example, the temperature slider is greyed out for Gemini 3 Pro Preview, even though the API reports that it supports temperature (default=1, max=2).
Would it be possible (and not too difficult) for Big-AGI to derive temperature support for Gemini models from the response of the "/models" API endpoint?
If a given model has "temperature" defined, then the model supports temperature, and the "temperature" value is the default. If "maxTemperature" is also defined, it specifies the maximum temperature; otherwise, assume a maximum of 2.0.
If "temperature" is not defined for a given model, it can be assumed that temperature is not supported, and the slider can be greyed out.
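The proposed mapping could be sketched roughly as follows (a minimal illustration in the same style as the test script above; the helper name and return shape are hypothetical, not Big-AGI's actual code, which is TypeScript):

```python
def temperature_caps(model: dict) -> dict:
    """Derive temperature support from one entry of the /models response.

    Hypothetical helper implementing the rules proposed above:
      - "temperature" present    -> supported; its value is the default
      - "maxTemperature" present -> that is the max; otherwise assume 2.0
      - "temperature" absent     -> not supported (grey out the slider)
    """
    if "temperature" not in model:
        return {"supported": False}
    return {
        "supported": True,
        "default": model["temperature"],
        "max": model.get("maxTemperature", 2.0),
    }


# Examples matching the output listing above:
print(temperature_caps({"name": "models/gemini-3-pro-preview",
                        "temperature": 1, "maxTemperature": 2}))
# -> {'supported': True, 'default': 1, 'max': 2}
print(temperature_caps({"name": "models/gemma-3-1b-it", "temperature": 1}))
# -> {'supported': True, 'default': 1, 'max': 2.0}
print(temperature_caps({"name": "models/imagen-4.0-generate-001"}))
# -> {'supported': False}
```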
Where does this happen?
Big-AGI Pro (big-agi.com)
Impact on your workflow
Medium - Workaround exists
Environment (if applicable)
No response
Additional context
No response