Breaking news

Samsung and AMD sign $3 billion HBM3E sales agreement, but there are provisos

--

In October 2023, at its Samsung Memory Tech Day 2023 event, Samsung announced its next-generation HBM3E, code-named Shinebolt. In February 2024, Samsung announced it had developed the industry's first HBM3E 12H DRAM, a 12-layer stack with a capacity of 36GB, making it the highest-bandwidth, highest-capacity HBM product to date. Samsung has since begun providing samples to customers and plans to start mass production in the second half of 2024.

According to foreign media reports, Samsung has signed a new agreement worth US$3 billion with chipmaker AMD to supply HBM3E 12H DRAM, which is expected to be used in the Instinct MI350 series AI chips. Under the agreement, Samsung will also purchase AMD GPUs in exchange for the HBM products, though the specific products and quantities are not yet clear.

Earlier market reports indicated that AMD plans to launch the Instinct MI350 series, an upgraded version of its Instinct MI300 series AI chips, in the second half of 2024. The series is fabricated on leading foundry TSMC's 4nm process to deliver higher compute performance at lower power consumption. Because it adopts 12-layer stacked HBM3E, both transmission bandwidth and memory capacity increase.

According to Samsung's official figures, HBM3E 12H DRAM delivers bandwidth of up to 1,280GB/s, and its 36GB capacity is 50% higher than the previous-generation 8-layer stacked product. Thanks to advanced thermal compression non-conductive film (TC NCF) technology, the 12-layer stack has the same height as the 8-layer stack, meeting the requirements of current HBM packaging. The technology also improves the stack's thermal performance by using bumps of different sizes between the dies: during bonding, smaller bumps are used in signal-transmission areas, while larger bumps are placed in areas requiring heat dissipation, which also helps improve product yield.
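The capacity figures quoted above are internally consistent; a quick back-of-envelope sketch confirms the 50% uplift. The per-die density of 24Gb (3GB) is an assumption for illustration and is not stated in the article:

```python
# Hypothetical sanity check of the published HBM3E stack capacities.
GB_PER_DIE = 3                  # assumed 24Gb (3GB) DRAM die per layer
layers_12h, layers_8h = 12, 8

cap_12h = layers_12h * GB_PER_DIE   # 12 layers -> 36 GB, as reported
cap_8h = layers_8h * GB_PER_DIE     # 8 layers  -> 24 GB (previous generation)
uplift = cap_12h / cap_8h - 1       # fractional capacity gain

print(cap_12h, cap_8h, f"{uplift:.0%}")  # 36 24 50%
```

The same die density across both stacks is what makes the 12-layer product's gain exactly 12/8 − 1 = 50%.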

According to Samsung, in artificial-intelligence applications HBM3E 12H DRAM is expected to raise average large-model training speed by 34% compared with HBM3E 8H DRAM, and to support more than 11.5 times as many inference-service users. Market participants noted that the deal is separate from Samsung's wafer-foundry negotiations, so earlier reports that AMD would outsource some next-generation CPU/GPU production to Samsung's foundry are unrelated to this transaction.

(First image source: Flickr/Jamie McCall CC By 2.0)





