OmniOps Rolls Out Saudi’s First Inference-as-a-Service Platform, Bunyan
Built on OmniOps' AI software, the platform allows companies to run large language models and build AI apps without being tied to specific hardware.

Saudi Arabia-based artificial intelligence (AI) infrastructure provider OmniOps has launched Bunyan, the Kingdom’s first sovereign inference-as-a-service platform. The announcement followed a strategic meeting with HE Eng. Abdullah Amer Alswaha, KSA Minister of Communications and Information Technology.
The newly launched Bunyan delivers an end-to-end AI infrastructure stack supporting text, vision, and speech applications, with a strong focus on data sovereignty and regulatory compliance. Built on OmniOps' proprietary AI infrastructure software, the platform enables graphics processing unit (GPU)-agnostic deployment, allowing enterprises to run large language models (LLMs) and create AI applications without being locked into specific hardware.
Founded by Mohamed Altassan in Saudi Arabia in 2024, OmniOps provides cloud and high-performance computing solutions. Its latest platform, Bunyan, is hosted on hardware from providers like NVIDIA and Groq and supports both cloud and on-premise deployment to give enterprises greater control over performance and compliance.
In an interview with Inc. Arabia, OmniOps’ CEO Altassan and Chief Product Officer (CPO) Mehdi Tantaoui explained that while many companies in the region are ramping up AI adoption, few fully grasp the demands of running models in production environments. “One of the biggest misconceptions is that deploying an AI model into production is the finish line, when in reality, it’s just the beginning,” they explained. “Many organizations underestimate the complexity and cost of inference at scale. They assume the infrastructure that worked during experimentation or training will scale linearly in production, but that’s rarely the case.”
This is where inference-as-a-service—Bunyan’s core proposition—steps in. Instead of requiring companies to manage the infrastructure, cost optimization, and hardware orchestration behind running AI models, the platform offers a managed layer that handles the full lifecycle of inference at scale.
Altassan and Tantaoui explained that another oversight by companies adopting AI solutions is the lack of data readiness. “Many organizations invest heavily in AI without ensuring their data pipelines are clean, structured, or even accessible in real time. If your data isn’t production-grade, your inference output won’t be reliable, no matter how advanced the model is,” they noted.
“Blind spots include ignoring hardware inefficiencies, over-provisioning GPUs, and lacking observability into inference performance,” they added. “There’s also a tendency to treat inference as a black box, when in fact, optimizing latency, throughput, and cost per inference requires deep visibility and smart orchestration. That’s exactly the gap we’re solving with Bunyan, by enabling scalable, cost-efficient, and compliant inference-as-a-service designed with local needs in mind.”
They point out that one major advantage Bunyan offers is flexibility, eliminating the hardware lock-in that often slows down AI adoption. “Organizations today spend weeks designing and thinking about which GPU brand and type to acquire in order to enable their AI practitioners to build or use AI,” they told us. “With Bunyan, it doesn’t matter which brand you choose; we guarantee a seamless experience and a level of abstraction that removes that burden and accelerates AI adoption across the board.”
It’s this emphasis on sovereignty that the founders believe differentiates Bunyan in the market. “It’s a crowded market where many global companies are offering AI software similar to ours,” the duo told us. “But Bunyan has been built to guarantee AI and data sovereignty. The product is evolving rapidly, and our ambition is to compete globally and offer value that accelerates AI adoption everywhere.”
Looking ahead, Altassan and Tantaoui see the future of AI evolving beyond simple automation, with intelligence becoming more autonomous and proactive. “We know that research is now heavily invested in artificial general intelligence (AGI),” they said. “We aspire to be a global leader in that space and build AI infrastructure and software that supports such advanced intelligence. A lot of our research is focused on agentic AI, [and] we believe that, in the mid-term, the majority of the market will revolve around active agents that can autonomously act on your behalf to perform repetitive daily/hourly tasks without being prompted.”
As for other founders dealing with the challenges of building AI infrastructure, Altassan and Tantaoui shared candid advice from the frontlines. “Start with clarity. Don’t build infrastructure for its own sake, but design it then build it to serve actual workloads. Focus on interoperability, efficiency, and scalability from day one.”
“In this region especially, where data sovereignty and regulatory compliance are key, founders should think about infrastructure not just as technical plumbing, but as a strategic layer,” they added, advising other startups not to go it alone. “The ecosystem is growing fast, and organizations like OmniOps and platforms like Bunyan exist to help you deploy, manage, and scale AI infrastructure without reinventing the wheel. You don’t need a fleet of DevOps engineers to get started; you just need the right partner and a clear roadmap.”
Pictured in the lead image is the OmniOps leadership team with HE Abdullah Amer Alswaha, Saudi Minister of Communications and Information Technology. Image courtesy OmniOps.