Here’s why SSDs over 100TB will play an important role in ultra-large language models in the near future


  • Kioxia reveals a new project called AiSAQ that aims to replace RAM with SSDs for AI data processing
  • Larger SSDs (read: 100TB+) could improve RAG at a lower cost than using memory alone
  • No timeline was given, but expect Kioxia’s competitors to offer similar technology

Large language models often generate plausible but factually incorrect results – in other words, they make things up. These “hallucinations” can undermine the reliability of critical tasks such as medical diagnosis, legal analysis, financial reporting and scientific research.

Retrieval-augmented generation (RAG) alleviates this problem by integrating external data sources, allowing LLMs to access real-time information during generation, which reduces errors and improves contextual accuracy by anchoring outputs in current data. Implementing RAG effectively requires significant memory and storage resources, especially for vector data and large-scale indices. Traditionally, this data is stored in DRAM, which, although fast, is both expensive and limited in capacity.
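To make the DRAM-versus-SSD trade-off concrete, here is a minimal Python sketch: it memory-maps an embedding matrix from a file with NumPy, so the operating system pages vectors in from the SSD on demand instead of pinning the whole index in RAM. The file name, dimensions, and brute-force scan are all illustrative assumptions, not Kioxia's implementation; systems in the AiSAQ/DiskANN mold use approximate-nearest-neighbour graph indices so each query touches only a handful of SSD pages rather than scanning everything.

```python
import numpy as np

# --- Build a small on-disk vector store (illustrative sizes) ---
# In a DRAM-based RAG pipeline the embedding matrix lives entirely in memory;
# here it lives in a file, and the OS pages vectors in from the SSD as needed.
DIM, N = 384, 100_000                      # embedding width, corpus size (assumed)
rng = np.random.default_rng(0)

index = np.memmap("vectors.dat", dtype=np.float32, mode="w+", shape=(N, DIM))
index[:] = rng.standard_normal((N, DIM)).astype(np.float32)  # stand-in embeddings
index.flush()

# --- Query time: reopen read-only, as a serving process would ---
index = np.memmap("vectors.dat", dtype=np.float32, mode="r", shape=(N, DIM))

def top_k(query: np.ndarray, k: int = 5) -> np.ndarray:
    """Cosine-similarity search over the SSD-backed memmap.

    Note: this brute-force scan still reads every vector; a real
    disk-based ANN index would visit only a few pages per query.
    """
    norms = np.linalg.norm(index, axis=1) * np.linalg.norm(query)
    scores = index @ query / np.maximum(norms, 1e-9)
    return np.argsort(scores)[-k:][::-1]   # IDs of the k best matches

query = rng.standard_normal(DIM).astype(np.float32)
print(top_k(query))  # document IDs to retrieve and feed into the LLM prompt
```

The practical point of the sketch is the capacity model: the index scales with the size of the SSD rather than with installed DRAM, which is what makes 100TB-class drives attractive for very large RAG corpora.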
