According to Omdia’s AI Processors for Cloud and Data Centre Forecast Report, NVIDIA is the market leader in AI processors for cloud and data centre environments.
The report finds that NVIDIA accounted for an impressive 80.6 percent of global AI processor revenue in 2020, totalling $3.2 billion, up from $1.8 billion in 2019.
The report refers to NVIDIA’s market dominance as ‘supremacy in the market for GPU-derived chips’, which are frequently used in servers, workstations, and expansion cards in cloud equipment.
According to Jonathan Cassell, Omdia’s principal analyst for advanced computing, NVIDIA employs critical strategies to sustain its growth.
“GPU-based semiconductors became the first type of AI processor widely used for AI acceleration due to their ability to accelerate deep-learning applications. And as the leading supplier of GPU-derived chips, NVIDIA has established and strengthened its position as the market leader for AI processors in the critical cloud and data centre markets,” Cassell explains.
NVIDIA’s dominance is also fueling intense market competition as suppliers compete for a piece of the $4 billion cloud and data centre AI processor market. By 2026, total market revenue could reach $37.6 billion.
“Despite an onslaught of new competitors and new types of chips, NVIDIA’s GPU-based devices have remained the default choice for cloud hyperscalers and on-premises data centres, in part due to their user familiarity,” Cassell explains.
Cassell points to the NVIDIA Compute Unified Device Architecture (CUDA) Toolkit, which is widely used in the community of artificial intelligence software developers. That familiarity naturally drives demand for NVIDIA’s associated products, such as its GPU chips.
However, market competition will only intensify in the future as the market shifts away from GPU-derived chips and toward other types of AI processors.
Other major market players in the cloud and data centre AI processor market, according to Omdia’s research, include Xilinx, Google, Intel, and AMD.
Xilinx came in second place, behind NVIDIA. Xilinx manufactures field-programmable gate array (FPGA) products that are widely used in cloud and data centre servers for AI inference.
Google came in third place. Its own hyperscale cloud operations heavily rely on its Tensor Processing Unit (TPU) AI ASIC.
Intel came in fourth place. Its Habana-branded proprietary-core AI application-specific standard products (ASSPs) and its FPGA products are optimised for AI in cloud and data centre servers.
AMD ranked fifth with its GPU-derived AI ASSPs for cloud and data centre servers.