Needs more LLM options
Pretty decent service, but I would like to see more options for supported LLMs. Let us choose whether we want to use the experimental or preview models. Also, allow sequential fallback to other models if the request to the primary model fails, so we can spend credits on our preferred model rather than on multiple models at once.
It's good, but not quite all the way to where it should be.

zquestz
Apr 22, 2025
Experimental models will not be supported, as they have usage limits that prevent them from being added to our platform. There simply aren't enough API calls allowed to service our customers with those models. That is why they are not production-ready yet.
Sequential fallback is an interesting idea, and it can easily be done when integrating against our API, but we haven't thought about adding it to our graphical products. We have been doing this for 8 years, and this is the first time that suggestion has come up. It is definitely worth thinking about, but we will likely be focusing on other priorities in the short term.
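For anyone integrating against the API directly, the sequential fallback the reviewer asks for can be sketched client-side in a few lines. This is a hypothetical illustration, not vendor code: the model names and the `call_model` placeholder stand in for whatever completion call your client library provides.

```python
# Sketch of client-side sequential fallback: try the preferred model
# first, and only move to the next model if the request fails, so
# credits are spent on one model per request rather than several.
# call_model and the model names are hypothetical placeholders.

def call_model(model: str, prompt: str) -> str:
    """Placeholder for a real API call; replace with your client library."""
    raise NotImplementedError


def complete_with_fallback(prompt, models, call=call_model):
    """Return (model, response) from the first model that succeeds."""
    errors = {}
    for model in models:
        try:
            return model, call(model, prompt)
        except Exception as exc:  # a real client would catch specific error types
            errors[model] = exc
    raise RuntimeError(f"all models failed: {errors}")
```

A real integration would likely narrow the `except` clause to retryable errors (timeouts, rate limits) so that, say, an invalid prompt fails fast instead of being retried against every model.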