SK Hynix revealed at a press conference held in Seoul, South Korea on Thursday that it is working on a solid state drive with an unprecedented 300TB capacity. The announcement came as part of a broader portfolio of products and technologies designed to advance both data center and on-device AI capabilities.
Market researchers cited by SK Hynix project that in the AI era, the total amount of data generated worldwide (both human- and AI-generated) will grow from 15 ZB in 2014 to 660 ZB in 2030. This huge amount of data needs to be stored somewhere, and this is where 100TB HDDs and 300TB SSDs come into play.
For now, little is known about SK Hynix's 300TB SSD beyond the company's expectation that demand for high-capacity, high-performance storage will skyrocket in the coming years, requiring both high-capacity drives and high-performance all-flash arrays for a variety of applications.
The most we can do is speculate about SK Hynix's 300TB SSD. The company may be developing a rival to Samsung's PBSSD concept, which is currently limited to machines that can store up to 240TB of data; SK Hynix's system would store 300TB. These machines are designed to offer a competitive balance of capacity, performance per TB, reliability, and energy efficiency.
Alternatively, SK Hynix's 300TB SSD initiative could be a rival to Nimbus Data's 3.5-inch ExaDrive product, which can store up to 100TB (for now). However, such SSDs offer considerably lower performance per TB and remain niche products, so there are reasonable doubts about this possibility.
Finally, it could be a custom-built PCIe card SSD, but again, even a 300TB drive with a PCIe 6.0 x16 interface would have fairly low performance per TB, making it a niche product (and we are talking about a 300TB SSD here).
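To illustrate why performance per TB drops off at such capacities, here is a rough back-of-the-envelope sketch. The bandwidth figures are approximations of per-direction PCIe link throughput, and the drive configurations are hypothetical comparisons, not announced products:

```python
# Back-of-the-envelope comparison of sequential bandwidth per terabyte.
# Assumptions: PCIe 6.0 x16 delivers roughly 128 GB/s per direction, and
# PCIe 4.0 x4 roughly 8 GB/s; real drives rarely saturate the link.

def perf_per_tb(link_gb_s: float, capacity_tb: float) -> float:
    """Return link bandwidth divided by capacity, in GB/s per TB."""
    return link_gb_s / capacity_tb

# Hypothetical 300TB card limited by a PCIe 6.0 x16 link
big = perf_per_tb(128, 300)
# Typical 4TB consumer drive on a PCIe 4.0 x4 link
small = perf_per_tb(8, 4)

print(f"300TB @ PCIe 6.0 x16: {big:.2f} GB/s per TB")   # ~0.43
print(f"4TB   @ PCIe 4.0 x4:  {small:.2f} GB/s per TB") # 2.00
```

Even with a next-generation interface, the 300TB drive ends up with a small fraction of the per-terabyte bandwidth of an ordinary consumer SSD, which is why such drives tend to target capacity-oriented niches rather than performance workloads.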
In addition to the 300TB SSD, SK Hynix is working on a variety of products serving data center AI training and inference (HBM4, HBM4E, CXL pooled memory solutions, processing-in-memory solutions), edge AI devices (LPDDR6, GDDR7, PIM), and on-device AI inference (LPDDR6, GDDR7, high-capacity DDR5).