Q: Why no Claude?
Claude is mysteriously absent from the long list of LLMs. You mention Anthropic integration, but I don't see how I can integrate it. One other thing: some LLMs (NousResearch, Nous, Meta Llama 3.1, Llama Guard, Open Hermes) say 'OpenRouter is currently limiting your rate of generation. To address this, please contact OpenRouter for assistance with the rate limit', and some don't work at all (DeepSeek), even the first time you try them. Shouldn't they be greyed out if they are API-only?
Felipe_TriploAI
A: Hey Zeeky! Thanks for the question.
The short answer is cost.
We prioritize sustainability, and if we can't cover the expenses, we risk failing everyone who relies on Claude and our other models altogether.
Additionally, we're not in the business of selling lifetime access to premium models. Our allowances are designed to help non-tech-savvy users and, frankly, our diverse model offerings cover nearly all use cases.
That said, you can use your own API keys for OpenAI, Anthropic, and/or OpenRouter to access nearly all available models.
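For context, calling OpenRouter directly with your own key looks roughly like this (a minimal sketch against OpenRouter's OpenAI-compatible chat completions endpoint; the model slug and prompt are illustrative only, and this is not Triplo AI's internal code):

```python
import os
import requests

# Minimal sketch: bring-your-own-key request to OpenRouter's
# OpenAI-compatible chat completions endpoint.
# Model slug and message content are illustrative only.
api_key = os.environ["OPENROUTER_API_KEY"]  # your own key

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    },
    json={
        "model": "anthropic/claude-3.5-sonnet",  # illustrative model slug
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```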
Regarding rate limitations on OpenRouter: the only one applied (by OpenRouter itself) is on Gemini, since it's experimental. We, Triplo AI, do not interfere in ANY way with your communication with the model providers (none of them).
Feel free to reach out if you have more questions.
Take care,
Felipe