
NVIDIA to buy AI edge-cloud management platform Run:ai

News from last week: NVIDIA has entered into a definitive agreement to acquire Israel-based Run:ai, a provider of Kubernetes-based compute management and orchestration software. The US chip firm, whose stock value has climbed steeply in line with demand for graphics processing units (GPUs) for modish AI computing workloads, said the deal will help customers make more efficient use of their AI resources across cloud and edge compute engines. The size of the deal has not been disclosed.

Bloomberg wrote last week that the Santa Clara firm had posted its best weekly stock performance since last May, after its shares soared 15 percent to add nearly $290 billion to the firm’s market capitalization. “The surge comes after firms like Meta Platforms, Alphabet, and Microsoft pledged billions in AI investments,” said Bloomberg. Run:ai makes a “sophisticated scheduling” developer platform to optimise the performance of GPU-driven compute infrastructure.

This enables enterprise customers to better manage and orchestrate sundry generative AI, recommender system, search engine, and other workloads in distributed GPU clusters in edge-cloud computing infrastructure. Run:ai supports all popular Kubernetes variants and integrates with third-party AI tools and frameworks, it said. “Run:ai customers include some of the world’s largest enterprises across multiple industries,” said a statement.

The Run:ai platform includes: a centralised interface to manage shared compute infrastructure; functionality to add users, curate teams, and provide controlled access to cluster resources; the ability to pool GPUs and share computing power (“from fractions of GPUs to multiple GPUs or [GPU] nodes… on different clusters”); and optimised GPU cluster resource utilisation.
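Run:ai’s scheduler itself is proprietary, but the fractional-GPU idea it describes — carving one physical GPU into shares that several jobs can consume — can be illustrated with a toy allocator. The sketch below is purely hypothetical (all names and the first-fit policy are this article’s assumptions, not Run:ai’s implementation):

```python
from dataclasses import dataclass, field

@dataclass
class GPU:
    """A single GPU with capacity 1.0, shareable in fractions."""
    name: str
    free: float = 1.0
    jobs: list = field(default_factory=list)

def allocate(gpus, job, fraction):
    """First-fit placement: put `job` on the first GPU with enough
    free capacity. `fraction` may be < 1.0 (a shared slice of a GPU)
    or exactly 1.0 (a whole GPU)."""
    for gpu in gpus:
        if gpu.free + 1e-9 >= fraction:
            gpu.free -= fraction
            gpu.jobs.append((job, fraction))
            return gpu.name
    return None  # no capacity left; a real scheduler would queue the job

pool = [GPU("gpu-0"), GPU("gpu-1")]
print(allocate(pool, "train-a", 0.5))   # gpu-0
print(allocate(pool, "infer-b", 0.25))  # gpu-0 (shares the card with train-a)
print(allocate(pool, "train-c", 1.0))   # gpu-1
print(allocate(pool, "train-d", 0.5))   # None — pool exhausted
```

The point of the sketch is the accounting, not the policy: once GPU capacity is tracked as a divisible quantity rather than a whole device, small inference jobs can co-locate on one card while full-GPU training jobs still get exclusive access.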

NVIDIA said it will continue to offer Run:ai’s products under the same business model “for the immediate future”, and further invest in the Run:ai product roadmap. Run:ai tools will be enabled on NVIDIA’s DGX Cloud, described as an “AI platform co-engineered with leading clouds” with an “integrated full-stack service… for generative AI”. Customers using the firm’s HGX and DGX platforms will also get access to the Run:ai platform for their AI workloads.

Run:ai’s solutions are already integrated with certain NVIDIA software. 

NVIDIA stated: “Together with Run:ai, NVIDIA will enable customers to have a single fabric that accesses GPU solutions anywhere. Customers can expect to benefit from better GPU utilisation, improved management of GPU infrastructure and greater flexibility from the open architecture.”

Omri Geller, co-founder and chief executive at Run:ai, said: “Run:ai has been a close collaborator with NVIDIA since 2020 and we share a passion for helping our customers make the most of their infrastructure. We’re thrilled to join NVIDIA and look forward to continuing our journey together.”

ABOUT AUTHOR

James Blackman
James Blackman has been writing about the technology and telecoms sectors for over a decade. He has edited and contributed to a number of European news outlets and trade titles. He has also worked at telecoms company Huawei, leading media activity for its devices business in Western Europe. He is based in London.