Samsung last month unveiled a SOCAMM2 LPDDR5-based memory module designed specifically for AI data center platforms.
TL;DR: NVIDIA is negotiating with Samsung, SK hynix, and Micron to use SOCAMM memory modules in its Project DIGITS follow-up. SOCAMM offers better energy efficiency and more I/O channels than current ...
Nvidia, SK Hynix, Samsung and Micron reportedly working on new SOCAMM memory standard for AI PCs
Nvidia is reportedly teaming up with memory manufacturers SK hynix, Micron, and Samsung to create a new memory standard that is small in size but big on performance, according to a report via ...
Nvidia's revolutionary memory format for AI GPUs could come to other platforms, rivaling LPDDR6 on the horizon
Nvidia scrapped SOCAMM 1 after repeated failures to meet expectations. SOCAMM 2 promises faster transfer speeds, reaching 9,600 MT/s. LPDDR6 adoption discussions signal scalability beyond ...
TAIPEI, July 23, 2025 /PRNewswire/ -- Nvidia is preparing to enter a new phase in the memory market by planning to deploy between 600,000 and 800,000 SOCAMM modules in 2025. This initiative positions ...
TL;DR: NVIDIA is ramping up production of LPDDR-based SOCAMM memory, targeting 600,000 to 800,000 units in 2025 for AI PC and server products. SOCAMM offers superior power efficiency, modular upgrades ...
Nvidia is moving to disrupt the memory market again, planning to deploy between 600,000 and 800,000 SOCAMM modules in 2025, and has reportedly handed the entire contract to Micron. According to DigiTimes Asia, the ...