Displaying items by tag: Inferencing

Thursday, 06 April 2023 03:01

Nvidia DGX H100 tops MLPerf testing round

GPU maker Nvidia says its H100 Tensor Core GPUs running in DGX H100 systems delivered the highest performance in every test of AI inferencing in the latest MLPerf benchmarking round.

Published in Hardware

MLPerf Inference provides an indication of how well systems perform at inferencing tasks such as image classification and speech recognition. Nvidia's A100 achieved the highest performance per accelerator in the Data Center Offline and Server tests.

Published in Hardware

IBM's new z16 mainframe features an integrated on-chip AI accelerator, enabling AI inferencing on real-time transactions.

Published in Hardware
Thursday, 22 October 2020 15:21

Nvidia GPUs set inferencing benchmark records

GPU vendor Nvidia set records in the latest round of MLPerf Inference benchmarking.

Published in Hardware
Thursday, 13 September 2018 16:07

Nvidia launches platform for hyperscale inferencing

The Nvidia TensorRT Hyperscale Inference Platform is designed to allow hyperscale data centres to support inferencing tasks such as speech recognition and image search more efficiently.

Published in Hardware
