High Bandwidth Memory (HBM) is the type of DRAM commonly used in data center GPUs such as NVIDIA's H200 and AMD's MI325X. High Bandwidth Flash (HBF) is a stack of flash chips with an HBM interface. What ...
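To make the capacity-versus-bandwidth distinction between an HBM stack and an HBF-style stack concrete, here is a minimal sketch of how per-stack figures are derived. Every number in it (interface width, pin rate, die count, die density) is an illustrative assumption, not a published HBM or HBF specification.

```python
# Illustrative sketch only: the stack parameters below are assumptions,
# not vendor specifications. They show how per-stack bandwidth and
# capacity fall out of an HBM-style wide interface plus stacked dies.

def stack_bandwidth_gbs(io_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s = I/O width (bits) * pin rate (Gb/s) / 8."""
    return io_width_bits * pin_rate_gbps / 8

def stack_capacity_gb(dies: int, gb_per_die: float) -> float:
    """Per-stack capacity in GB = number of stacked dies * capacity per die."""
    return dies * gb_per_die

# Assumed HBM-class DRAM stack: 1024-bit interface, 6.4 Gb/s pins, 8 x 3 GB dies.
hbm_bw = stack_bandwidth_gbs(1024, 6.4)   # ~819 GB/s
hbm_cap = stack_capacity_gb(8, 3)         # 24 GB

# Assumed HBF-style flash stack: same interface width, slower effective
# pin rate, but much denser NAND dies (numbers are purely hypothetical).
hbf_bw = stack_bandwidth_gbs(1024, 1.6)   # ~205 GB/s
hbf_cap = stack_capacity_gb(16, 32)       # 512 GB

print(f"HBM stack: {hbm_bw:.0f} GB/s, {hbm_cap:.0f} GB")
print(f"HBF stack: {hbf_bw:.0f} GB/s, {hbf_cap:.0f} GB")
```

The exact figures will differ in practice, but the shape of the trade is the point: flash dies are far denser than DRAM dies, so an HBF stack targets capacity where an HBM stack targets bandwidth.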
SK Hynix and Taiwan’s TSMC have established an ‘AI Semiconductor Alliance’. SK Hynix has emerged as a strong player in the high-bandwidth memory (HBM) market due to the generative artificial ...
At AMD’s Financial Analyst Day earlier this month (which was actually more interesting than it initially sounds), AMD finally confirmed that it was looking to use high-bandwidth memory (HBM) in an ...
JEDEC is still finalizing the HBM4 memory specifications, with Rambus teasing its next-gen HBM4 memory controller aimed at AI and data center markets, continuing to expand ...
If the HPC and AI markets need anything right now, it is not more compute but rather more memory capacity at a very high bandwidth. We have plenty of compute in current GPU and FPGA accelerators, but ...
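The "plenty of compute, not enough memory" argument can be sanity-checked with a rough roofline-style calculation. The accelerator figures below (1,000 TFLOP/s of compute, 4 TB/s of HBM bandwidth) are assumptions for illustration, not any specific product.

```python
# Rough roofline-style arithmetic (illustrative, assumed numbers) showing why
# adding compute without adding memory bandwidth does little for memory-bound
# AI workloads such as LLM token generation.

peak_compute_tflops = 1000.0   # assumed accelerator peak, in TFLOP/s
mem_bandwidth_tbps  = 4.0      # assumed HBM bandwidth, in TB/s

# Arithmetic intensity (FLOPs per byte moved) needed to keep the compute busy.
balance_point = peak_compute_tflops / mem_bandwidth_tbps   # 250 FLOP/byte

# LLM decode is dominated by matrix-vector work: each 2-byte fp16 weight read
# from HBM is used for roughly 2 FLOPs (one multiply, one add).
decode_intensity = 2.0 / 2.0   # ~1 FLOP per byte

utilization = decode_intensity / balance_point
print(f"Balance point: {balance_point:.0f} FLOP/byte")
print(f"Decode intensity: {decode_intensity:.1f} FLOP/byte "
      f"-> ~{utilization:.1%} of peak compute usable")
```

Under these assumed figures, such a workload could keep well under 1% of the peak compute busy, which is why more memory capacity and bandwidth, rather than more FLOPs, is the bottleneck the snippet describes.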
As SK hynix leads and Samsung lags, Micron positions itself as a strong contender in the high-bandwidth memory market for generative AI. Micron Technology (Nasdaq:MU) has started shipping samples of ...
TL;DR: SK hynix CEO Kwak Noh-Jung unveiled the "Full Stack AI Memory Creator" vision at the SK AI Summit 2025, emphasizing collaboration to overcome AI memory challenges. SK hynix aims to lead AI ...
The surge in demand for memory stocks underscores the sector’s growth. The burgeoning need for high-bandwidth memory (HBM), essential for Nvidia's (NASDAQ:NVDA) AI processors, is driving investor interest.