Google’s growing push into AI hardware with its tensor processing units (TPUs) is raising expectations for major gains among South Korea’s semiconductor leaders, Samsung Electronics and SK hynix. As Google seeks to supply its TPUs to other tech giants — including Meta, which is considering adopting them for its 2027 data centers — demand for high-bandwidth memory (HBM) is projected to surge.
Developed with Broadcom, Google’s TPUs are positioned as a cost-effective alternative to Nvidia’s dominant GPUs. Industry estimates suggest TPUs can be up to 80% cheaper than Nvidia’s flagship H100 while still delivering competitive AI performance. Each TPU requires six to eight HBM modules, tying TPU adoption directly to rising memory demand.
SK hynix currently supplies HBM3E chips for Google’s latest TPU, Ironwood, and is expected to deliver next-generation 12-layer HBM for upcoming TPU models. Analysts say expanding TPU use will intensify the existing HBM supply shortage, lifting both selling prices and shipment volumes to the benefit of SK hynix and Samsung alike.
The rapid growth of AI data centers is also expected to boost demand for traditional DRAM products, while Samsung’s improved foundry yields at advanced nodes position it as an increasingly competitive manufacturing alternative. Analysts note that Google’s expanding AI ecosystem could support higher memory shipments, stronger foundry utilization, and even increased Galaxy device sales powered by Google’s Gemini AI.
Samsung’s planned chip plant in Taylor, Texas, which will produce sub-2nm chips, is also expected to benefit from a larger AI hardware market.