Of course, power demand is set to continue expanding rapidly as the supply chain increases its production capacity while demand remains high. TSMC has already confirmed its target to double its CoWoS capacity again in 2025 (see Data S1, sheet 2). This could mean that the total power demand associated with devices produced using TSMC’s CoWoS capacity will also double from 2024 to 2025—just as it did from 2023 to 2024 (Figure 1), when TSMC similarly doubled its CoWoS capacity. At this rate, the cumulative power demand of AI accelerator modules produced in 2023, 2024, and 2025 could reach 12.8 GW by the end of 2025. At the level of complete AI systems, this figure would rise to 23 GW, surpassing the electricity consumption of Bitcoin mining and approaching half of total data center electricity consumption (excluding crypto mining) in 2024.

However, with the industry transitioning from CoWoS-S to CoWoS-L as the main packaging technology for AI accelerators, continued suboptimal yield rates for this new packaging technology may slow both device production and the associated growth in power demand. Moreover, although demand for TSMC’s CoWoS capacity exceeded supply in both 2023 and 2024, it is not guaranteed that this trend will persist throughout 2025. Several factors could slow demand for AI hardware, such as waning enthusiasm for AI applications.

Additionally, AI hardware may face new bottlenecks in the manufacturing and deployment process. While limited CoWoS capacity has constrained AI accelerator production and power demand over the past two years, export controls and sanctions driven by geopolitical tensions could introduce new disruptions in the AI hardware supply chain. Chinese companies have already faced restrictions on the type of AI hardware they can import—constraints that spurred efficiency-focused workarounds, most notably the release of Chinese tech company DeepSeek’s R1 model.
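The arithmetic behind these projections can be made explicit. The following sketch is a back-of-envelope illustration, not a reproduction of the authors’ model: it assumes power demand exactly doubles each production year and works backward from the cumulative figures stated above (12.8 GW for modules, 23 GW for systems) to the implied per-year values. The variable names and the fixed module-to-system overhead factor are assumptions introduced here for illustration.

```python
# Back-of-envelope sketch (illustrative assumptions, not the source's model).
# If demand doubles each year and the 2023-2025 cumulative module total is
# 12.8 GW, the implied 2023 baseline is 12.8 / (1 + 2 + 4) GW.
CUMULATIVE_MODULES_GW = 12.8  # cumulative module power demand, end of 2025 (from text)
CUMULATIVE_SYSTEMS_GW = 23.0  # cumulative system-level power demand (from text)
GROWTH = 2.0                  # assumed year-over-year doubling

base_2023 = CUMULATIVE_MODULES_GW / sum(GROWTH**i for i in range(3))
per_year_gw = [round(base_2023 * GROWTH**i, 2) for i in range(3)]  # 2023, 2024, 2025

# Assume system-level demand scales module demand by a fixed overhead factor,
# as implied by the ratio of the two cumulative figures.
system_factor = CUMULATIVE_SYSTEMS_GW / CUMULATIVE_MODULES_GW

print(per_year_gw)               # implied per-year module demand in GW
print(round(system_factor, 2))   # module-to-system overhead factor
```

Under these assumptions, the 2023 baseline works out to roughly 1.8 GW of modules, doubling to about 3.7 GW in 2024 and 7.3 GW in 2025, with system-level demand carrying roughly 1.8x overhead for cooling, networking, and other supporting equipment.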
This large language model reportedly achieves performance comparable to that of OpenAI’s ChatGPT, but was claimed to have done so using less advanced hardware combined with software innovations. Such innovations can reduce the computational and energy costs of AI. At the same time, they do not necessarily change the “bigger is better” dynamic that has driven AI models to unprecedented sizes in recent years. Any reduction in AI power demand from efficiency gains may be negated by rebound effects, such as incentivizing greater use or the application of more computational resources to improve performance. Furthermore, multiple regions attempting to develop their own AI solutions may, paradoxically, increase overall AI hardware demand. Tech companies may also struggle to deploy AI hardware, given that Google already faced a “power capacity crisis” while attempting to expand data center capacity. For now, researchers will have to continue navigating limited data availability to determine what TSMC’s expanding CoWoS capacity means for the future power demand of AI.
this post was submitted on 29 May 2025