Once, computers had a single processor worth naming: the CPU, or central processing unit. It was the brain of your computer, and the name gives it away; this processor sat at the centre, with the storage, the peripherals, and all other components either feeding into it or receiving commands from it.
Then, of course, the GPU, or graphics processing unit, came into its own. Originally, GPUs offloaded complex video algorithms and floating-point maths to a dedicated device, delivering ever greater levels of graphical detail for gaming, CAD, and other visually intensive applications. However, their massively parallel architecture made GPUs tremendously effective at accelerating all kinds of computing tasks, not only rendering graphics, and today GPUs are essential for artificial intelligence, deep learning, big data analytics, cryptography, and more.
Data centres, at both the physical and virtual level, are now defined in terms of CPU, GPU, and RAM more than any other characteristic. Or at least they were until now: with vast computing resources processing vast amounts of data, a third category of processor has emerged, one VMware CEO Raghu Raghuram says will become ubiquitous within the next decade of computing.
This is the DPU, or data processing unit, which represents one of the three major pillars of computing going forward, according to Nvidia CEO Jensen Huang: “The CPU is for general-purpose computing, the GPU is for accelerated computing, and the DPU, which moves data around the data centre, does data processing.”
So what is a DPU?
According to Nvidia, it is a system on a chip that combines three key elements:
- An industry-standard, high-performance, software-programmable, multi-core CPU, typically based on the widely used ARM architecture, tightly coupled to the other SoC components.
- A high-performance network interface capable of parsing, processing and efficiently transferring data at line rate, or the speed of the rest of the network, to GPUs and CPUs.
- A rich set of flexible and programmable acceleration engines that offload work from the host and improve application performance for AI and machine learning, security, telecommunications, and storage, among others.
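To make the offload idea concrete, here is a toy sketch, not a real DPU API: a stand-in function plays the role of the DPU's acceleration engines, parsing and filtering network traffic before it ever reaches the host, so the host CPU only spends cycles on application work. All names here (`dpu_offload`, `host_application`, the sample addresses) are invented for illustration.

```python
# Toy model of DPU-style offload: the "DPU" filters packets at the
# network edge; the host CPU only processes pre-screened traffic.
from dataclasses import dataclass

@dataclass
class Packet:
    src: str        # source address
    payload: bytes  # application data

def dpu_offload(packets, blocked_sources):
    """Stand-in for the DPU's engines: parse and drop unwanted
    traffic before it consumes host CPU cycles."""
    return [p for p in packets if p.src not in blocked_sources]

def host_application(packets):
    """The host CPU's job shrinks to pure application work."""
    return sum(len(p.payload) for p in packets)

packets = [
    Packet("10.0.0.1", b"hello"),
    Packet("10.0.0.66", b"blocked"),
    Packet("10.0.0.2", b"world"),
]
clean = dpu_offload(packets, blocked_sources={"10.0.0.66"})
print(host_application(clean))  # prints 10: bytes seen by the host
```

In a real deployment the filtering runs in hardware at line rate on the DPU's network interface; the point of the sketch is only the division of labour between the DPU and the host CPU.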
You might not have heard of a DPU before - and you’d be in the majority. However, VMware is betting you’ll hear a lot more about DPUs in the future. “DPUs are just starting out,” said Raghu Raghuram. “Anytime you want to introduce a new processor into an architecture takes planning. We will start seeing servers from OEMs with DPUs, perhaps as an optional component, but then as standard in coming years.”
This is the architectural change for the next decade, Raghuram says. AMD and Nvidia are working on their DPUs, and major OEMs like HP, Dell, and Lenovo are starting to adopt them.
In fact, VMware NSX - the company’s network virtualisation platform - will run on DPUs as part of vSphere 8. “This is transformative for security,” said VMware senior vice president of networking and advanced security Tom Gillis. “NSX puts a baby firewall everywhere; putting it on the DPU makes it faster.”
So, take note: DPUs are in your future; perhaps not this year or even next year, but the supply chain is already in gear, and VMware is positioning itself as the toolset to help you manage your CPU, GPU, and DPU environments, whether they run in the cloud or on-premises.
You'll hear more about this in iTWire’s coverage of VMware Explore and Raghuram’s vision for the next decade of computing.
Hear Jensen Huang speak about it here: