Hi Retell team,
I’m integrating with the Retell API and had a question about how to retrieve the full list of models supported by Retell.
Right now, when I call list-retell-llms, it returns the Retell LLM configurations saved in my account, along with their model values. That works for account-specific configs, but it does not appear to provide a global list of all models currently supported by Retell.
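To illustrate, this is roughly how we derive the model list today: we deduplicate the model values found in the configs returned by list-retell-llms (the field names and sample values below are just our assumptions about the response shape, not something from your docs):

```python
def dedupe_models(llm_configs):
    """Collect the distinct `model` values from a list of LLM config dicts,
    skipping configs that have no model set."""
    return sorted({cfg["model"] for cfg in llm_configs if cfg.get("model")})

# Hypothetical configs showing the kind of repetition we see in practice:
configs = [
    {"llm_id": "llm_1", "model": "gpt-4o"},
    {"llm_id": "llm_2", "model": "gpt-4o"},
    {"llm_id": "llm_3", "model": "claude-3.5-sonnet"},
]
print(dedupe_models(configs))
# Only models already used in existing configs ever appear here.
```

As you can see, any model our users have never created a config for is simply invisible to us with this approach.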
My use case is that I want to show users the full set of models available in Retell (GPT, Gemini, Claude, realtime, etc.), rather than only the subset already present in our existing LLM configs.
Could you please clarify:
1. Is there an API endpoint that returns the global list of all currently supported Retell models?
2. Should we rely on the documentation enum values as the source of truth, or is there another supported approach?
3. If list-retell-llms is only account-scoped, is there any plan to expose a separate “list available models” endpoint?
We noticed that the models returned from our account configs are limited to (and duplicate) whatever has already been created, so we want to make sure we are handling this correctly.
Thanks,