- Kioxia reveals a new project called AiSAQ that aims to replace RAM with SSDs for AI data processing
- Larger SSDs (read: 100TB+) could improve RAG at a lower cost than using memory alone
- No timeline was given, but expect Kioxia’s competitors to offer similar technology
Large language models often generate plausible but factually incorrect results – in other words, they make things up. These “hallucinations” can undermine the reliability of critical tasks such as medical diagnosis, legal analysis, financial reporting and scientific research.
Retrieval augmented generation (RAG) alleviates this problem by integrating external data sources, allowing LLMs to pull in relevant, up-to-date information at generation time. This reduces errors and, by anchoring outputs in current data, improves contextual accuracy. Implementing RAG effectively, however, demands significant memory and storage resources, particularly for vector data and large-scale indexes. Traditionally this data is held in DRAM, which is fast but expensive and limited in capacity.
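At its simplest, RAG means fetching the documents most relevant to a query and feeding them to the model alongside the question. The following is a minimal sketch of that retrieve-then-prompt loop, using placeholder embeddings and made-up documents rather than any particular vendor's stack:

```python
import numpy as np

# Hypothetical corpus: each document has an embedding vector and its text.
doc_texts = ["Kioxia announced AiSAQ at CES.", "DRAM prices rose last quarter."]

def embed(text: str) -> np.ndarray:
    """Stand-in for a real embedding model (deterministic per text for the demo)."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.random(384, dtype=np.float32)

doc_vecs = np.stack([embed(t) for t in doc_texts])  # in practice, a huge vector index

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents whose embeddings are most similar to the query."""
    q = embed(query)
    scores = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    top = np.argsort(scores)[::-1][:k]
    return [doc_texts[i] for i in top]

def build_prompt(question: str) -> str:
    """Ground the LLM prompt in retrieved context instead of parametric memory alone."""
    context = "\n".join(retrieve(question, k=1))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("What did Kioxia announce at CES?"))
```

In a real deployment the placeholder embed function would be an actual embedding model and doc_vecs would be an index holding millions or billions of vectors, which is exactly where the memory question arises.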
To meet these challenges, ServeTheHome reports that at this year’s CES, Japanese memory giant Kioxia introduced AiSAQ – All-in-Storage ANNS (approximate nearest neighbor search) with Product Quantization – which uses high-capacity SSDs to store vector data and indexes. Kioxia claims that AiSAQ significantly reduces DRAM usage compared to DiskANN, providing a more cost-effective and scalable approach to supporting large AI models.
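The building block behind approaches like this, product quantization, can be sketched in a few lines: split each vector into sub-vectors and replace each sub-vector with the ID of its nearest codebook centroid, so only a few bytes per vector need to sit in fast memory while the full-precision vectors live on the SSD. This is an illustration of the general technique, not Kioxia's implementation; the dimensions and codebook sizes are arbitrary:

```python
import numpy as np

D, M, K = 128, 8, 256          # vector dim, sub-vectors per vector, centroids per codebook
SUB = D // M                   # dimensionality of each sub-vector

rng = np.random.default_rng(0)
vectors = rng.standard_normal((10_000, D)).astype(np.float32)   # full vectors -> SSD

# One codebook per sub-space (k-means training omitted; random centroids for brevity).
codebooks = rng.standard_normal((M, K, SUB)).astype(np.float32)

def pq_encode(v: np.ndarray) -> np.ndarray:
    """Compress a D-dim float vector into M one-byte centroid IDs."""
    codes = np.empty(M, dtype=np.uint8)
    for m in range(M):
        sub = v[m * SUB:(m + 1) * SUB]
        dists = np.linalg.norm(codebooks[m] - sub, axis=1)
        codes[m] = np.argmin(dists)
    return codes

codes = np.stack([pq_encode(v) for v in vectors])   # 8 bytes/vector vs 512 bytes full precision
print(codes.shape, codes.nbytes, vectors.nbytes)    # (10000, 8) 80000 5120000
```

Systems in the DiskANN family typically keep these compressed codes in DRAM and the full vectors and graph on SSD; Kioxia's claim for AiSAQ is that it pushes essentially all of that data into storage.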
More accessible and more cost-effective
Moving to SSD storage makes it possible to handle larger data sets without the high costs associated with heavy DRAM usage.
Although accessing data from SSDs introduces slightly higher latency than DRAM, the trade-off brings lower system costs and improved scalability, which in turn can deliver better performance and model accuracy, because larger datasets provide a richer basis for learning and inference.
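The scale of the potential saving is easy to estimate. For a corpus of one billion 768-dimensional float32 embeddings (figures chosen purely for illustration, not taken from Kioxia), holding everything in DRAM needs roughly 3 TB, whereas keeping the full vectors on SSD and only compact product-quantization codes in memory cuts the DRAM footprint to tens of gigabytes:

```python
N, D = 1_000_000_000, 768        # illustrative corpus size and embedding dimension
full = N * D * 4                 # float32 vectors held entirely in DRAM, in bytes
pq_codes = N * 32                # e.g. 32-byte product-quantization codes per vector

print(f"All-DRAM index : {full / 1e12:.1f} TB")      # ~3.1 TB
print(f"PQ codes in RAM: {pq_codes / 1e9:.0f} GB")   # ~32 GB; full vectors stay on SSD
```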
Using high-capacity SSDs, AiSAQ meets RAG’s storage demands while contributing to the broader goal of making advanced AI technologies more accessible and cost-effective. Kioxia hasn’t revealed when it plans to bring AiSAQ to market, but it’s a safe bet that competitors such as Micron and SK Hynix have something similar in the works.
ServeTheHome concludes: “Nowadays, everything is AI, and Kioxia is pushing it too. In reality, RAG is going to be a significant part of many applications, and if there is an application that needs to access a lot of data, but is not used as frequently, that would be a great opportunity for something like Kioxia AiSAQ.”