Clarification Needed: Accessing All Available Retell Models via API

Hi Retell team,

I’m integrating with the Retell API and had a question about how to retrieve the full list of models supported by Retell.

Right now, when I call `list-retell-llms`, it appears to return the Retell LLM configurations saved in my account, along with their `model` values. That helps for account-specific configs, but it does not seem to provide a global list of all models currently supported by Retell.

My use case is that I want to show users the full set of models available in Retell (GPT, Gemini, Claude, realtime, and so on), rather than only the subset already present in our existing LLM configs.

Could you please clarify:

1. Is there an API endpoint that returns the global list of all currently supported Retell models?
2. Should we rely on the documentation enum values as the source of truth, or is there another supported approach?
3. If `list-retell-llms` is only account-scoped, is there any plan to expose a separate "list available models" endpoint?

We noticed that the models returned from our account configs only reflect (and repeat) what has already been created, so we want to make sure we are handling this correctly.
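For context, here is roughly how we are pulling the account-scoped list today. This is a minimal sketch: the endpoint path, auth header, and response shape are our reading of the public docs, and `RETELL_API_KEY` handling is up to the caller.

```python
import json
import urllib.request


def distinct_models(llms: list[dict]) -> list[str]:
    # Deduplicate the `model` values across the saved LLM configs.
    return sorted({llm["model"] for llm in llms if llm.get("model")})


def list_account_models(api_key: str) -> list[str]:
    # GET /list-retell-llms returns this account's saved configurations,
    # not a global model catalog -- hence the limited, repeated values we see.
    req = urllib.request.Request(
        "https://api.retellai.com/list-retell-llms",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return distinct_models(json.load(resp))
```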

Thanks,

Hi @business

1. There is no global endpoint to retrieve all supported models.

2. The OpenAPI spec enum values are your source of truth. The `LLMModel` schema defines these text LLM models:

  • gpt-4.1, gpt-4.1-mini, gpt-4.1-nano

  • gpt-5, gpt-5-mini, gpt-5-nano, gpt-5.1, gpt-5.2, gpt-5.4, gpt-5.4-mini, gpt-5.4-nano

  • claude-4.5-sonnet, claude-4.6-sonnet, claude-4.5-haiku

  • gemini-2.5-flash, gemini-2.5-flash-lite, gemini-3.0-flash

And these speech-to-speech (realtime) models via `s2s_model`:

  • gpt-realtime-1.5, gpt-realtime, gpt-realtime-mini

3. `list-retell-llms` is indeed account-scoped: it returns your saved LLM configurations, not a global model catalog.
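If it helps, one way to follow this guidance is to mirror the spec enums in a constant and flag which entries your account already uses. A sketch under those assumptions: the model values are the ones quoted above, the `model`/`s2s_model` field names follow the schemas mentioned, and the constant must be updated by hand whenever the spec changes.

```python
# Mirror of the OpenAPI enum values quoted above (maintained by hand).
TEXT_LLM_MODELS = [
    "gpt-4.1", "gpt-4.1-mini", "gpt-4.1-nano",
    "gpt-5", "gpt-5-mini", "gpt-5-nano", "gpt-5.1", "gpt-5.2",
    "gpt-5.4", "gpt-5.4-mini", "gpt-5.4-nano",
    "claude-4.5-sonnet", "claude-4.6-sonnet", "claude-4.5-haiku",
    "gemini-2.5-flash", "gemini-2.5-flash-lite", "gemini-3.0-flash",
]
S2S_MODELS = ["gpt-realtime-1.5", "gpt-realtime", "gpt-realtime-mini"]


def model_catalog(account_llms: list[dict]) -> list[dict]:
    # Build the full catalog, flagging models already used by a saved config.
    used = {llm.get("model") or llm.get("s2s_model") for llm in account_llms}
    return [
        {"model": m, "realtime": m in S2S_MODELS, "in_use": m in used}
        for m in TEXT_LLM_MODELS + S2S_MODELS
    ]
```

This lets a UI show every supported model (text and realtime) while still marking which ones the account has configured.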

Thank you