Rising AI Chip Demand Spurs Shortage of 12-inch Wafers and Challenges in HBM

TapTechNews, May 20th news. According to industry insiders cited by the Commercial Times, surging demand for AI chips and the growing area of silicon interposers are reducing the number of dies that can be cut from each 12-inch wafer, further worsening the supply shortage of CoWoS packaging.


Chips Get Larger

TrendForce estimates that Nvidia's B series (including GB200, B100, B200) will consume more CoWoS packaging capacity.

TapTechNews previously reported that TSMC has raised its CoWoS capacity targets for 2024: monthly capacity is expected to approach 40,000 wafers by the end of the year, an increase of more than 150% over total 2023 capacity, and capacity may nearly double again in 2025.
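As a quick sanity check on the reported figures, the arithmetic below infers a rough 2023 baseline from the stated growth rate. This is an illustration only: the article compares a monthly 2024 figure against 2023 totals loosely, so the inferred baseline is an assumption.

```python
# Hedged arithmetic check of the reported CoWoS capacity figures.
end_2024_monthly = 40_000  # wafers/month, reported year-end 2024 target
growth_vs_2023 = 1.5       # "more than 150%" increase over 2023

# Assumed baseline implied by the growth figure (treating both as monthly rates).
implied_2023_monthly = end_2024_monthly / (1 + growth_vs_2023)   # ~16,000 wafers/month
projected_2025_monthly = end_2024_monthly * 2                    # "nearly double again"

print(implied_2023_monthly, projected_2025_monthly)
```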

However, the interposers of Nvidia's newly released B100 and B200 chips are larger than before, meaning fewer chips can be cut from each 12-inch wafer, so CoWoS capacity still cannot keep up with GPU demand.
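The effect of a larger interposer on per-wafer yield can be sketched with the classic dies-per-wafer approximation. The interposer dimensions below are hypothetical placeholders, not Nvidia's actual specifications; the point is only that a modest increase in die area sharply cuts the count from a 300 mm (12-inch) wafer.

```python
import math

def dies_per_wafer(die_w_mm: float, die_h_mm: float, wafer_d_mm: float = 300) -> int:
    """Rough dies-per-wafer estimate (ignores scribe lines and edge exclusion)."""
    die_area = die_w_mm * die_h_mm
    r = wafer_d_mm / 2
    # Classic approximation: gross wafer area over die area,
    # minus a correction for partial dies lost along the wafer edge.
    return int(math.pi * r**2 / die_area - math.pi * wafer_d_mm / math.sqrt(2 * die_area))

# Hypothetical interposer sizes for illustration only.
smaller = dies_per_wafer(40, 45)  # ~1800 mm^2 interposer
larger = dies_per_wafer(50, 57)   # ~2850 mm^2 interposer
print(smaller, larger)  # the larger interposer yields roughly half as many dies
```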

HBM

Industry insiders said that HBM is also a major bottleneck, as the number of EUV layers is gradually increasing. SK Hynix, the HBM market-share leader, applied a single EUV layer at its 1α node; it began shifting to 1β this year, and its use of EUV may increase three- to fourfold.


Beyond the rising technical difficulty, each HBM generation also stacks more DRAM dies: HBM2 stacks 4 to 8 DRAM dies, HBM3/HBM3E increases that to 8 to 12, and HBM4 will raise it to 16.
