Infosys, a global leader in next-generation digital services and consulting, has announced the launch of an Infosys Cobalt offering – its applied AI cloud, built on NVIDIA DGX A100 systems.
The new Infosys applied AI cloud will serve as an AI center of excellence for the company. It will enable developers and project teams at Infosys to quickly and easily access AI hardware and software stacks across both private and public clouds, and to build contextualized services that deliver AI-first business processes for enterprises.
NVIDIA DGX A100 systems will provide the infrastructure and the advanced compute power needed for more than 100 project teams to run machine learning and deep learning workloads simultaneously.
NVIDIA Multi-Instance GPU (MIG) technology will enable Infosys to improve infrastructure efficiency and maximize utilization of each DGX A100 system. Using Infosys edge AI, teams can also process AI workloads centrally or locally on any device with minimal latency.
As a service delivery partner in the NVIDIA Partner Network, Infosys will also be able to build NVIDIA DGX A100-powered, on-premises AI clouds for enterprises, providing access to cognitive services, licensed and open-source AI software-as-a-service (SaaS), pre-built AI platforms, solutions, models and edge capabilities.
Balakrishna D.R., Senior VP and Head of AI & Automation Services at Infosys, said: “Infosys applied AI cloud, powered by NVIDIA DGX A100 systems, can help enterprises to quickly build on the opportunity, while scaling with new technological advancements.”
“Many organizations are eager to infuse their business with AI but lack the strategic platform on which they can pool expertise and scale the computing resources needed to build mission-critical AI applications,” said Charlie Boyle, Vice President and General Manager of DGX Systems at NVIDIA.
“Working with Infosys, we’re helping organizations everywhere build their own AI centers of excellence, powered by NVIDIA DGX A100 and NVIDIA DGX POD infrastructure, to speed the ROI of AI investments,” added Boyle.