NVIDIA maintains dominant position in 2020 market for cloud and data center AI processors

The company held an 80.6% share of the fast-growing business, but alternative suppliers are poised to challenge its lead

Leading graphics processing unit (GPU) supplier NVIDIA Corp. maintained its dominant position in the global market for artificial intelligence (AI) processors used in the cloud and in data centers in 2020, with an 80.6% share of global revenue, according to Omdia.

NVIDIA generated cloud and data center AI processor revenue totaling $3.2 billion in 2020, up from $1.8 billion in 2019, as reported by Omdia’s AI Processors for Cloud and Data Center Forecast Report. The company continued to benefit from its supremacy in the market for GPU-derived chips, which currently represent the leading type of AI processor employed in cloud and data center equipment, including servers, workstations and expansion cards.

“NVIDIA in 2020 continued to capitalize on its strong incumbent position in GPU-derived chips to maintain its leadership position in cloud and data center AI processors,” said Jonathan Cassell, principal analyst, advanced computing, at Omdia. “With their capability to accelerate deep-learning applications, GPU-based semiconductors became the first type of AI processor widely employed for AI acceleration. And as the leading supplier of GPU-derived chips, NVIDIA has established itself and bolstered its position as the AI processor market leader for the key cloud and data center market.”

The market for AI processors is undergoing rapid growth, attracting a flood of suppliers vying to challenge NVIDIA’s leadership. Global market revenue for cloud and data center AI processors rose 79% to reach $4 billion in 2020. Revenue is expected to soar by a factor of nine to reach $37.6 billion in 2026, according to Omdia.

Figure: AI processors for cloud and data center, revenue forecast, world markets, 2019–26

During the past few years, competing suppliers ranging from small startups to major semiconductor vendors have entered the AI processor market with a variety of chips: their own types of GPU-based devices, programmable devices, and new varieties of semiconductors designed specifically to accelerate deep learning.

“Despite the onslaught of new competitors and new types of chips, NVIDIA’s GPU-based devices have remained the default choice for cloud hyperscalers and on-premises data centers, partly because of their familiarity to users,” Cassell noted. “NVIDIA’s Compute Unified Device Architecture (CUDA) Toolkit is used nearly universally by the AI software development community, giving the company’s GPU-derived chips a huge advantage in the market. However, Omdia predicts that other chip suppliers will gain significant market share in the coming years as market acceptance increases for alternative GPU-based chips and other types of AI processors.”

In its definition of AI processors, Omdia includes only those chips that integrate distinct subsystems dedicated to AI processing. These devices include GPU-derived AI application-specific standard products (GPU-derived AI ASSPs), proprietary-core AI application-specific standard products (proprietary-core AI ASSPs), AI application-specific integrated circuits (AI ASICs) and field-programmable gate arrays (FPGAs). While central processing unit (CPU) chips like Intel's Xeon are extensively used for AI acceleration in cloud and data center operations, Omdia does not include these devices in its AI processor analysis.

The other top players in the cloud and data center AI processor market include:

  • Second-ranked Xilinx, which offers field-programmable gate array (FPGA) products commonly used for AI inferencing in cloud and data center servers.
  • Third-ranked Google, whose Tensor Processing Unit (TPU) AI ASIC is employed extensively in its own hyperscale cloud operations.
  • Fourth-placed Intel, which is supplying its Habana AI proprietary-core AI ASSPs and its FPGA products for AI cloud and data center servers.
  • Fifth-ranked AMD, which is offering GPU-derived AI ASSPs for cloud and data center servers.
