VMware senior vice president and general manager of the cloud platform business unit, Krish Prasad, said this "will change the nature of enterprise computing."
VMware's focus has been to make it easy to run any application anywhere. So the company built Kubernetes into VMware (Tanzu) so that containers could be run without requiring a separate platform.
And the VMware multi-cloud strategy covers on-premises infrastructure as well as AWS, Azure, Google Cloud, Alibaba Cloud, IBM Cloud and Oracle Cloud, he added.
What's now being added is the ability to run cloud-native AI applications on VMware Cloud Foundation with Tanzu, explained Nvidia head of enterprise computing Manuvir Das.
This means data analytics, machine learning training and inference, and professional visualisation software can all be run in virtual machines or containers. The GPUs at the hardware layer are "plumbed through the VMware stack in a first-class manner," he said.
The AI software available on Nvidia's NGC hub will be integrated into VMware vSphere, VMware Cloud Foundation and VMware Tanzu, helping accelerate AI adoption, extend existing infrastructure for AI, and manage all applications with a single set of operations.
Consequently, AI applications don't need special infrastructure (as long as the servers are GPU-equipped), which makes it easier to put the application where the data lives, said Prasad.
This approach AI-enables entire data centres, he added.
The integration of Nvidia NGC with VMware vSphere and VMware Cloud Foundation will simplify the deployment and management of AI for the most demanding workloads, according to the companies.
Industries such as healthcare, financial services, retail and manufacturing will be able to develop and deploy AI workloads using containers and virtual machines on the same platform as their enterprise applications, even across a hybrid cloud.
NGC software is supported on pre-tested Nvidia A100-powered servers expected from leading system manufacturers, including Dell Technologies, Hewlett Packard Enterprise (HPE) and Lenovo.
The other part of Nvidia and VMware's ongoing collaboration involves the latter's Project Monterey. This is a push involving multiple industry partners to create the infrastructure needed for next-generation applications.
The Nvidia-VMware piece announced today concerns the combination of VMware Cloud Foundation and Nvidia's BlueField-2 DPU (data processing unit, aka SmartNIC).
The idea is to offload certain functions such as network, security, storage and host management services from the CPU to the DPU, even for bare metal Linux and Windows servers. Not only are those functions performed more efficiently by the DPU, but freeing up CPU cycles means better application performance.
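The benefit of that offload can be illustrated with a toy model (this is purely illustrative arithmetic, not VMware or Nvidia code, and the core counts are assumptions): if infrastructure services consume a slice of the host's CPU cores, moving them to the DPU returns that slice to applications.

```python
# Toy model (illustrative only): why offloading infrastructure
# services from the host CPU to a DPU frees cycles for applications.
# The core counts below are assumptions for the sketch.

def app_share(total_cores: int, infra_cores_on_cpu: int) -> float:
    """Fraction of host CPU cores left for application workloads."""
    return (total_cores - infra_cores_on_cpu) / total_cores

# Assume a 32-core host where networking, security, storage and
# host management services take 8 cores when run on the CPU.
before = app_share(32, 8)   # services run on the CPU
after = app_share(32, 0)    # services offloaded to the DPU

print(f"app share before offload: {before:.0%}")  # 75%
print(f"app share after offload:  {after:.0%}")   # 100%
```

The model ignores the efficiency gains Das mentions (the DPU's hardware acceleration), so it understates the overall improvement; it only captures the reclaimed CPU cycles.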
This approach means security services under ESXi can be distributed around the hardware, said Prasad. Locating such services at the perimeter of each app rather than at the perimeter of the data centre is "a huge improvement," he observed.
Das described BlueField-2 as a software-defined data centre, accelerated on a chip. It combines Arm cores with the ConnectX-6 Dx network adaptor from Mellanox (now owned by Nvidia), supporting InfiniBand and 200Gbps Ethernet, all operating within an isolated security domain.
Supported storage functions include NVMeOF, deduplication, compression and encryption.
As an example of the performance gains that are possible, moving IPsec processing to BlueField-2 gives an eightfold performance improvement, according to Das.
There is already an extensive ecosystem of developers familiar with ConnectX, and that experience is directly applicable to BlueField-2, he added.
BlueField-2 slots into the hardware layer and becomes a first-class citizen within the VMware stack.
"We are partnering with Nvidia to bring AI to every enterprise; a true democratisation of one of the most powerful technologies," said VMware CEO Pat Gelsinger.
"We're also collaborating to define a new architecture for the hybrid cloud – one purpose built to support the needs and demands of the next generation of applications. Together, we're positioned to help every enterprise accelerate their use of breakthrough applications to drive their business."
Nvidia founder and CEO Jensen Huang said "AI and machine learning have quickly expanded from research labs to data centres in companies across virtually every industry and geography.
"Nvidia and VMware will help customers transform every enterprise data centre into an accelerated AI supercomputer. Nvidia DPUs will give companies the ability to build secure, programmable, software-defined data centres that can accelerate all enterprise applications at exceptional value."