As global enterprises undergo digital transformation, demand for AI computing power is surging, shaping a GPU market dominated by non-customized mainstream specifications. This trend has pushed the major cloud providers (Hyperscalers) to create their own AI chips to bolster their services, particularly in generative AI, a segment that presents significant business prospects and has become the focal point of competition.
This intense competition has propelled eight leading Hyperscalers in the US and China to develop AI chips in-house. However, US chip export bans on China are widening the disparity in AI resources and chip technology between Chinese and US cloud providers, according to DIGITIMES Research.
The booming cloud market is driving substantial computing demand. Pairing central processing units (CPUs) with dedicated chips such as GPUs offers benefits including offloading heavy computing loads from the CPU, accelerating overall computing speeds, achieving low latency, and maintaining smooth, stable cloud service quality.
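To make the offloading concept concrete, below is a minimal, generic CUDA sketch of how a host CPU hands a data-parallel task to a GPU accelerator and collects the result. It is purely illustrative and assumes a standard CUDA toolchain; it is not tied to any CSP's in-house chip or software stack.

```cuda
// Minimal sketch of CPU-to-GPU offloading: the host (CPU) prepares data,
// hands the parallel work to the GPU, and retrieves the result.
// Names and sizes here are illustrative only.
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles one element, so the array is processed in parallel.
__global__ void vectorAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                 // 1M elements
    size_t bytes = n * sizeof(float);

    // Host (CPU) buffers
    float *h_a = new float[n], *h_b = new float[n], *h_c = new float[n];
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Device (GPU) buffers: data is copied once, then processed on the accelerator
    float *d_a, *d_b, *d_c;
    cudaMalloc(&d_a, bytes); cudaMalloc(&d_b, bytes); cudaMalloc(&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Offload: launch enough thread blocks to cover all n elements
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(d_a, d_b, d_c, n);
    cudaDeviceSynchronize();

    // Copy the result back to the host
    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", h_c[0]);         // expect 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    delete[] h_a; delete[] h_b; delete[] h_c;
    return 0;
}
```

The same division of labor underlies cloud AI services: the CPU orchestrates requests while the dedicated accelerator absorbs the bulk of the arithmetic, which is what keeps latency low and service quality stable.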
Table 1: Cloud computing workloads and chips to satisfy their demand
Table 2: US- and China-based public CSPs' in-house designed ICs and their applications
Table 3: Benefits of large-scale public CSPs pushing in-house developed chips
Table 4: US- and China-based public CSPs' internal teams or subsidiaries for chip R&D
Table 5: Services US- and China-based public CSPs provide with in-house developed chips
Chart 1: US- and China-based public CSPs' announcements of in-house developed chips, 2015-2023
Chart 2: US- and China-based public CSPs' in-house developed AI inferencing chips, 2015-2023