SK hynix plans to reduce its legacy DRAM production to 20% by the fourth quarter of 2024, responding to increased supply and ...
SK hynix unveils the industry's first 16-Hi HBM3E memory, offering up to 48GB per stack for AI GPUs with even more AI memory in the future.
TL;DR: NVIDIA CEO Jensen Huang has asked SK hynix to expedite the supply of its next-generation HBM4 memory, originally planned for the second half of 2025, by six months. NVIDIA currently uses SK ...
There are lots of ways that we might build out the memory capacity and memory bandwidth of compute engines to drive AI and ...
SK Group Chairman Chey Tae-won said that Nvidia (NASDAQ:NVDA) CEO Jensen Huang asked SK hynix to speed up the supply of its next-generation high-bandwidth memory chips, HBM4, by six months.
SK hynix, the world's second-largest memory chip maker, is racing to meet explosive demand for the HBM chips that are used to process vast amounts of data to train AI, including from Nvidia ...
Samsung Electronics and SK hynix are advancing next-generation memory technology with "ultra-low-temperature" etching, a ...
"... and the memory bandwidth was being expanded with more bandwidth and more energy efficiency at the same time. Compounding all of those effects led to super-acceleration of processing capabilities." SK ...
SK hynix claims an 18% improvement in training ... of 16 vertically stacked DRAM dies (16-Hi), each packing 3GB of memory (16 × 3GB = 48GB per stack). Those are some monumental upgrades, generation on generation.
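The capacity figures above follow from simple stack arithmetic. The sketch below checks the stated 16-Hi configuration against the 48GB-per-stack figure; the stacks-per-GPU value is a hypothetical illustration, not something reported in these snippets:

```python
# Back-of-the-envelope HBM capacity check, based on the figures above.
DIES_PER_STACK = 16   # "16-Hi": 16 vertically stacked DRAM dies
GB_PER_DIE = 3        # 48GB per stack / 16 dies = 3GB per die

stack_capacity_gb = DIES_PER_STACK * GB_PER_DIE
print(stack_capacity_gb)  # → 48

# Hypothetical accelerator with 8 HBM stacks (illustrative only):
stacks_per_gpu = 8
print(stacks_per_gpu * stack_capacity_gb)  # → 384
```

The same arithmetic explains why taller stacks matter: holding die density fixed, moving from 12-Hi to 16-Hi raises per-stack capacity by a third without adding more stacks to the package.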