Q: Attention Middle East and South Asia - Introducing Mistral Small 3 24B and Mistral Saba 24B to the Triplo AI Allowance
🌐 Exciting News! 💡 We just added two fresh language models to our allowance: Mistral Small 3 24B and Mistral Saba 24B!
Mistral Small, with its 24B parameters, is optimized for low-latency performance across various AI tasks. It boasts an impressive 81% accuracy on the MMLU benchmark, competing with larger models like Llama 3.3 70B and Qwen 32B, all while operating at three times the speed on equivalent hardware!
On the other hand, Mistral Saba, also a 24B-parameter model, is tailored for the Middle East and South Asia. Trained on curated regional datasets, it delivers accurate and contextually relevant responses, supporting multiple Indian-origin languages such as Tamil and Malayalam, as well as Arabic.
Join us in celebrating this expansion and exploring the possibilities that Mistral Small and Mistral Saba bring to Triplo AI!
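For readers who prefer the BYOK route, both models can also be reached directly with your own Mistral key, since Mistral's platform exposes an OpenAI-style chat-completions endpoint. The sketch below is only illustrative, not Triplo AI's internal code: the base URL and the model identifiers `mistral-small-latest` and `mistral-saba-latest` are assumptions drawn from Mistral's public documentation, so verify the exact names against their current model list.

```python
# Illustrative BYOK sketch: querying the two new models directly via
# Mistral's OpenAI-compatible endpoint. Model IDs and base URL are assumed.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_MISTRAL_API_KEY",        # your own key, not Triplo's allowance
    base_url="https://api.mistral.ai/v1",  # Mistral's OpenAI-compatible API
)

for model in ("mistral-small-latest", "mistral-saba-latest"):
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "ما هي عاصمة عُمان؟"}],  # Arabic test prompt
        max_tokens=128,
    )
    print(model, "->", resp.choices[0].message.content)
```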

Felipe_TriploAI
Feb 19, 2025
A: It's a pleasure to give our South Asian and Arabic-speaking community access to these excellent models.
This reinforces our model-agnostic position and our commitment to always serve you with the best available models, as long as doing so doesn't hurt our sustainability.
You can check all the available models under Triplo AI's allowance and BYOK at go.triplo.ai/allowance.
Enjoy!
Felipe and the Elbruz Tech Giants

Verified purchaser
Awesome. Please tell me we will get o1/o3 access through either BYOK or Triplo's built-in allowance soon? They're much better than o1-preview/o1-mini, and the other companies just haven't demonstrated the same leadership, so we have to rely on what we know will be good for business when we don't have time to research other companies' models. Keeping access to the best possible OpenAI API is key for me!
The o1 and o3 APIs diverge slightly from the standard OpenAI Chat Completions interface; their integration is ready, and they'll be available in our next release.
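For BYOK users wondering what "diverge slightly" looks like in practice, the differences are mostly at the parameter level. A minimal sketch with the OpenAI Python SDK follows, under the assumption that your key has access to an o1/o3-class model; exact parameter support varies by model and API version, so treat it as illustrative rather than definitive.

```python
# Hedged sketch: a reasoning-model call vs. a classic chat completion.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

resp = client.chat.completions.create(
    model="o3-mini",  # reasoning model; assumed available on your account
    messages=[
        # o1/o3 models use a "developer" message where older models used "system"
        {"role": "developer", "content": "Answer concisely."},
        {"role": "user", "content": "Summarize the key points of this note ..."},
    ],
    max_completion_tokens=1024,  # replaces max_tokens for reasoning models
    reasoning_effort="medium",   # o1/o3-specific knob, absent from older models
    # temperature/top_p are generally not accepted by these models
)
print(resp.choices[0].message.content)
```

In short, a client mostly needs to swap `max_tokens` for `max_completion_tokens`, use the newer message roles, and drop the sampling parameters the reasoning models reject.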