Spirent Report: AI and High-Speed Ethernet (HSE) Across Data Center, Telecom, and Enterprise Networking

Bangalore – October 24, 2024 – Spirent Communications, a leading provider of test and assurance solutions for next-generation devices and networks, today released its inaugural report on the high-speed Ethernet (HSE) market and the impact of AI advancements on data center, telecom and enterprise networking. AI’s influence cannot be overstated as it radically transforms data centers and interconnects, surpassing the impact of traditional cloud applications. The Spirent impact report “The Future of High-Speed Ethernet Across Data Center, Telecom, and Enterprise Networking” offers a comprehensive look at key drivers, market impacts, and predictions for what comes next, and includes insights from over 340 HSE engagements supported by Spirent in the past year.

“As the market focuses on the power and promise of AI, there is tremendous pressure to move faster, push the boundaries of speed, and relentlessly pursue every competitive edge available in the market,” says Aniket Khosla, Vice President of Wireline Product Management at Spirent. “AI is driving an inflection point in the market and there is strong demand to understand and get ahead of the trends driving this.” 

“As an independent vendor that both service providers and network equipment manufacturers (NEMs) rely on to validate cutting-edge Ethernet technology and new infrastructure, Spirent has a unique perspective on the market. Our new report highlights the pace of innovation in the HSE market, AI’s data deluge pushing changes to Ethernet, and steps the industry is taking to meet the resulting challenges,” explains Khosla.

The most promising HSE trends identified by the report include: 

  • HSE port shipments continue to accelerate* – In 2023, suppliers shipped more than 70 million HSE ports, with volume expected to explode to more than 240 million ports between 2024 and 2026. Ahead of traditional demand curves, markets are already looking to 1.6T Ethernet to pursue AI-driven opportunities as soon as next year.
  • Ramp up of higher speeds – AI is reshaping the data center and the interconnect ecosystem around it, making it necessary to rearchitect networks to support new performance and scalability requirements. As a result, the market will continue to see rapid migration to 400/800G and beyond.
  • AI fabric requires new testing approaches – AI data center performance testing traditionally calls for test cases that generate AI workloads on real servers, an extremely expensive undertaking. As a result, new cost-efficient approaches are emerging that stress test AI data center networking by emulating realistic xPU (GPU and other accelerator) workload traffic.
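
The report does not prescribe a specific emulation method, but the idea can be illustrated with a minimal sketch. The example below assumes a ring all-reduce collective among a hypothetical set of eight emulated xPUs and an arbitrary 10 GiB gradient size, and computes the flow set a test tool would need to replay on the fabric instead of running real training servers; it is a conceptual illustration, not Spirent's methodology.

```python
# Illustrative sketch only -- not Spirent's methodology. It models the traffic
# pattern of a ring all-reduce among N emulated accelerators (xPUs), a common
# collective in AI training fabrics, so a test tool could replay equivalent
# flows without racks of real servers.

def ring_allreduce_flows(num_xpus: int, gradient_bytes: int):
    """Return (src, dst, bytes) flows for one ring all-reduce.

    Each xPU sends its gradient shard around the ring twice
    (reduce-scatter + all-gather), i.e. 2 * (N - 1) / N of the
    gradient volume per NIC.
    """
    shard = gradient_bytes / num_xpus
    flows = []
    for step in range(2 * (num_xpus - 1)):   # reduce-scatter, then all-gather
        for src in range(num_xpus):
            dst = (src + 1) % num_xpus       # each xPU sends to its ring neighbour
            flows.append((src, dst, shard))
    return flows


if __name__ == "__main__":
    # Hypothetical scenario: 8 emulated xPUs exchanging 10 GiB of gradients.
    flows = ring_allreduce_flows(num_xpus=8, gradient_bytes=10 * 2**30)
    per_nic = sum(b for s, _, b in flows if s == 0)
    print(f"{len(flows)} flows, {per_nic / 2**30:.1f} GiB sent per NIC per iteration")
```

A real emulation would add burstiness, compute/communication overlap, and congestion-control behavior, but even this simple flow model shows how synthetic traffic can stand in for live AI workloads.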

While hyperscalers are migrating to 800G, enterprises are not waiting on future developments to make progress, and telecom operators are throwing out traditional playbooks to meet customers where they are in ambitious deployment cycles. Key insights include: 

  • The need for speed – The report anticipates that 800G, while still gaining traction, will soon be complemented by 1.6T Ethernet in an effort to meet near-term needs, as AI models grow in complexity and size, requiring more bandwidth and speed.
  • AI inference: edge capacity will grow – Significant amounts of AI traffic will be generated at the edge, prompting the need for early capacity upgrades in access and transport networks. Early forecasts suggest edge locations will require additional capacity, with far-edge sites needing 25-50G speed-grade upgrades, mid-edge sites 100-200G, and near-edge sites 400G, potentially with a faster refresh cycle to 800G.
  • RoCEv2 in the back-end data center – RDMA over Converged Ethernet version 2 (RoCEv2) is a crucial enabler of high-performance, low-latency networking, allowing devices to read and write each other's memory directly over standard, routable Ethernet/IP networks. The report highlights the growing adoption of RoCEv2 in back-end data centers for AI interconnect fabrics.
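
The report itself does not go into protocol-level detail; as a rough, self-contained illustration of why RoCEv2 fits standard Ethernet fabrics, the sketch below packs an InfiniBand Base Transport Header (BTH) and sends it in an ordinary UDP datagram on the IANA-assigned RoCEv2 port 4791. The opcode, queue pair, and sequence number are placeholder values; a real RoCEv2 frame is produced by the RDMA NIC and also carries an invariant CRC.

```python
# Minimal sketch, for illustration only: RoCEv2 carries an InfiniBand Base
# Transport Header (BTH) inside a routable UDP/IP datagram, which is what lets
# RDMA traffic traverse ordinary Ethernet/IP fabrics. Values are placeholders,
# not a valid RDMA exchange.
import socket
import struct

ROCEV2_UDP_PORT = 4791  # IANA-assigned UDP destination port for RoCEv2


def build_bth(opcode: int, dest_qp: int, psn: int, pkey: int = 0xFFFF) -> bytes:
    """Pack a 12-byte Base Transport Header (field layout per the InfiniBand spec)."""
    flags = 0x00                                    # SE/M/pad-count/transport-version bits, all zero here
    word1 = struct.pack("!BBH", opcode, flags, pkey)
    word2 = struct.pack("!I", dest_qp & 0xFFFFFF)   # 8 reserved bits + 24-bit destination QP
    word3 = struct.pack("!I", psn & 0xFFFFFF)       # ack-req/reserved bits + 24-bit packet sequence number
    return word1 + word2 + word3


if __name__ == "__main__":
    # Placeholder opcode (0x0A is RC "RDMA WRITE Only"), arbitrary QP and PSN.
    bth = build_bth(opcode=0x0A, dest_qp=0x1234, psn=1)
    payload = bth + b"emulated RDMA payload"
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Sending to localhost just demonstrates the encapsulation; real RoCEv2
    # frames are generated in hardware by the RDMA-capable NIC.
    sock.sendto(payload, ("127.0.0.1", ROCEV2_UDP_PORT))
    print(f"sent {len(payload)}-byte UDP datagram to port {ROCEV2_UDP_PORT}")
```

Because the transport rides on plain UDP/IP, back-end AI fabrics can reuse existing Ethernet switching, provided the network is engineered for the lossless, low-latency behavior RDMA expects.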
