Despite strong AI demand, Micron's financial pressures and Nvidia's delays create uncertainty.
Micron is a Strong Buy with improved operating results and room for growth in the memory cycle upswing.
Rambus announces the industry's first HBM4 controller IP to accelerate next-generation AI workloads.
SK hynix VP said the company wants to become the "total AI memory provider" for AI GPUs with its HBM memory, and it's continuing on that (dominant) path.
The standard for high-bandwidth memory limits design freedom at many levels, but that is required for interoperability.
Taiwan-based Phison, best known for its flash memory technology, has developed a technology that extends GPU memory for AI training and tuning.
Memory bandwidth is a major challenge in designing processing chips for data-intensive applications. HBM helps to alleviate such problems. Chips such as GPUs, application accelerators, and vector ...
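The bandwidth pressure described above can be illustrated with a simple roofline-style check: a kernel is memory-bound when its arithmetic intensity falls below the machine balance. The peak figures and the `bound` helper below are illustrative assumptions for the sketch, not any vendor's published numbers.

```python
# Back-of-envelope roofline check showing why memory bandwidth, not
# compute, often limits data-intensive kernels on accelerators.
# All numbers are illustrative assumptions, not vendor specifications.

def bound(flops: float, bytes_moved: float,
          peak_flops: float, peak_bw: float) -> str:
    """Classify a kernel as compute- or memory-bound.

    A kernel is memory-bound when its arithmetic intensity
    (FLOPs per byte moved) is below the machine balance
    (peak FLOP/s divided by peak bytes/s).
    """
    intensity = flops / bytes_moved   # FLOPs performed per byte moved
    balance = peak_flops / peak_bw    # machine balance in FLOPs/byte
    return "memory-bound" if intensity < balance else "compute-bound"

# Hypothetical accelerator: 100 TFLOP/s compute, 2 TB/s HBM bandwidth.
PEAK_FLOPS = 100e12
PEAK_BW = 2e12

# A float32 vector add does 1 FLOP per 12 bytes (two loads, one store),
# so its intensity (~0.083) sits far below the balance (50).
print(bound(flops=1, bytes_moved=12,
            peak_flops=PEAK_FLOPS, peak_bw=PEAK_BW))  # memory-bound
```

Raising HBM bandwidth lowers the machine balance, which is exactly how it moves such low-intensity kernels closer to the compute-bound regime.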
It has become a well-known fact these days that the switches used to interconnect distributed systems are not the ...
GPUs have become increasingly important for several large software firms such as AWS, Google, and OpenAI, as the demand for generative AI continues to grow steadily.
The battle for token speed is intensifying as SambaNova, Cerebras, and Groq push the limits of inference performance.
Nevertheless, HBM is perceived as a factor in the 2025 price increases. Recent market concerns have been fueled by delays in the delivery of Nvidia's next-generation Blackwell GPU solutions.
Samsung, NEO Semiconductor, Meta, FADU, and Western Digital made major NAND flash, SSD, and DRAM announcements at the 2024 FMS.