Modern electric vehicles process 25 GB of data per hour — for comparison, an entire DVD holds 4.7 GB. In this context, high latency increases the risk of accidents, so the machine's response time needs to be as low as possible, said Romeo Lazăr, Sales Manager for Eastern Europe and Russia at Corning, at the DigitALL 2024 conference organized by Energynomics.
At the same time, Artificial Intelligence is already helping to uncover new levels of efficiency, but the trade-off is a massive increase in demand for bandwidth, as well as for energy and cooling water. To illustrate the pace of AI development: ChatGPT was trained using 10,000 Nvidia GPUs on an Azure supercomputer, and it is estimated that future AI models could require up to 10 million GPUs.
NVIDIA dominates the GPU market for machine learning (ML), with a share of roughly 90%. AI training tasks require large clusters of GPUs — up to 32,000 — driving the need for power efficiency, cooling, and high bandwidth in data centers and the cloud.
Romeo Lazăr presented the impact of these developments on power requirements and the types of data cables that can be used effectively in the long term.
“You may be happy with 40G now but planning to upgrade to 100G four years from now. Or maybe you are working with 400G and have your eyes set on 800G in five years. Migration will always vary based on your timeline and the technologies available in the market,” he stressed. “Corning’s EDGE8 structured cabling solutions will support your transition needs, whether we are talking about Ethernet or InfiniBand.”
DigitALL 2024 was organized by Energynomics in partnership with reputable organizations such as the CIO Council, with the support of our partners: Corning, Datacor, Eaton Electric, Enevo Group, Procesio, Siemens, Sixt, Vertiv.