Spending on switches deployed in AI back-end networks used to connect accelerated servers is forecast to approach US$80 billion over the next five years, nearly doubling the total data centre switch market opportunity, according to market research firm Dell’Oro Group.
Nvidia's next-generation H100 Tensor Core GPUs and Quantum-2 InfiniBand networking are now widely available, both in Microsoft Azure and in more than 50 systems from partners including Asus, Atos, Dell Technologies, Gigabyte, HPE, Lenovo, and Supermicro.
HPC and AI vendor Nvidia has introduced an upgraded GPU, a new workgroup server, and a next-generation networking technology.