Getting a “resource not found” error on an agent that was working a minute ago

This is the second time today that an agent has randomly turned into “retell llm” and become unusable. I’ve tried to update the agent through the API, but it isn’t accepting any requests. I’ve also seen others with this same issue, so I’d like to know if there is a fix.

Hi @dapperdev2

Thanks for reporting this. Could you please share your agent ID? This will help us investigate the issue further.

Thank you!

agent id: agent_8dabe0e8237de6ca3c012aed75

Hi @dapperdev2

Thank you for the details. I’ve forwarded them to our team for review, and we’ll get back to you as soon as we have an update.

Thank you for your patience!

Thank you! I even hooked up a number and was testing calls this morning, and everything was working great. I read somewhere it could be the LLM, but I can’t update anything through the API either.

Hi @dapperdev2

The issue is that the LLM configuration linked to the agent agent_8dabe0e8237de6ca3c012aed75 was deleted. Unfortunately, a deleted LLM configuration cannot be recovered. Is it possible that something in your workflow is inadvertently deleting the LLM? For example, do you have any automation scripts that send DELETE requests to the Retell LLM endpoints?
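If it helps while you audit your workflow, here is a minimal sketch of how you could check which LLM an agent currently references before touching anything. The endpoint path (`/get-agent/{agent_id}`) and the `response_engine` field in the response are assumptions based on my reading of the Retell API docs, so please verify them against the current API reference; the request is built but not sent here.

```python
# Sketch (assumed endpoint/shape): look up the llm_id an agent references
# via the Get Agent endpoint, so you can confirm whether that LLM still exists.

API_BASE = "https://api.retellai.com"  # assumed base URL
agent_id = "agent_8dabe0e8237de6ca3c012aed75"

url = f"{API_BASE}/get-agent/{agent_id}"
headers = {"Authorization": "Bearer YOUR_RETELL_API_KEY"}
# resp = requests.get(url, headers=headers)  # send with your HTTP client

# A typical response body links the agent to its LLM (shape assumed):
example_response = {
    "agent_id": agent_id,
    "response_engine": {"type": "retell-llm", "llm_id": "llm_example123"},
}
llm_id = example_response["response_engine"]["llm_id"]
```

Logging this `llm_id` in your automation before any create/delete step makes it much easier to spot when an agent ends up pointing at an LLM that was later removed.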

Thank you!

Thanks for the clarification!

This agent and its LLM configuration were originally created manually, not through my automation workflow. After reviewing my setup, I don’t have any DELETE requests or cleanup scripts that would intentionally remove an LLM configuration.

Earlier today, I was troubleshooting some LLM ID issues and testing new configurations, so it’s possible the agent was inadvertently linked to a temporary or invalid LLM ID that was later removed, even though I don’t really see how that’s possible, as I only use /create-agent.

My main concern now is whether there is any way to recover or access the prompt/configuration that was previously associated with this deleted LLM, as that’s the most valuable part for me.

If recovery isn’t possible, is there any logging, version history, or export that might still contain the previous prompt or configuration?

Appreciate any help you can provide.

Hi @dapperdev2

Thanks for the additional context and I’m really sorry this happened, especially after you had everything working.

When a Retell LLM is deleted, its internal prompt/configuration is permanently removed from our system. We don’t keep a recoverable backup, version history, or export of deleted LLM content, but I’ll check with the product team just to be sure, since similar issues have been raised before.

Going forward, we strongly recommend keeping your prompts and configs in your own version control, and using Update Retell LLM instead of creating and deleting LLMs when iterating.
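As a rough illustration of that pattern, here is a sketch of updating an existing LLM in place while keeping the prompt in your own repo. The endpoint path (`/update-retell-llm/{llm_id}`), the PATCH method, and the `general_prompt` field name are assumptions, please confirm them against the API reference; the request is built but not sent.

```python
# Sketch (assumed endpoint/field names): update an existing Retell LLM
# in place instead of deleting and recreating it, so the llm_id your
# agent references never disappears.
import json

API_BASE = "https://api.retellai.com"  # assumed base URL
llm_id = "llm_example"  # the LLM your agent already references

# Keep the prompt in version control and load it from disk, e.g.:
# prompt = open("prompts/scheduling_agent.txt").read()
prompt = "You are a helpful scheduling assistant."

url = f"{API_BASE}/update-retell-llm/{llm_id}"
payload = json.dumps({"general_prompt": prompt})
headers = {
    "Authorization": "Bearer YOUR_RETELL_API_KEY",
    "Content-Type": "application/json",
}
# requests.patch(url, data=payload, headers=headers)  # PATCH, never DELETE+POST
```

Because the `llm_id` stays stable, every agent pointing at it keeps working while you iterate on the prompt.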

Thank you!

Thank you! If I’m using automations to create new agents, doesn’t each new agent need a different LLM assigned to it? My workflow, which seems to be working now, builds the LLM payload, creates the LLM, builds the agent payload, and then creates the agent.

Hi @dapperdev2

No, each new agent does not need a different LLM. Multiple agents can share/reference the same retell-llm configuration (the same llm_id). You only need separate LLMs if you want different prompts or behaviors. When iterating, we recommend updating the existing LLM via Update Retell LLM rather than creating and deleting new ones.
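To sketch what that looks like in your automation, here are two hypothetical create-agent payloads referencing the same llm_id. The `response_engine` shape (`type: "retell-llm"` plus `llm_id`) and the other field names are assumptions based on my reading of the Create Agent schema, so double-check them against the docs; the agent names and voice are placeholders.

```python
# Sketch (assumed payload shape): two agents sharing one LLM configuration.
# Only create a second LLM when the agents need genuinely different
# prompts/behavior.
shared_llm_id = "llm_shared_example"  # placeholder llm_id

def agent_payload(name: str, llm_id: str) -> dict:
    """Build a hypothetical Create Agent request body pointing at llm_id."""
    return {
        "agent_name": name,
        "response_engine": {"type": "retell-llm", "llm_id": llm_id},
        "voice_id": "11labs-Adrian",  # placeholder voice
    }

sales = agent_payload("sales-agent", shared_llm_id)
support = agent_payload("support-agent", shared_llm_id)
```

In your workflow this means the “create the LLM” step can run once (or not at all), and the per-agent steps only build and POST the agent payload with the existing llm_id.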

Thank you!