- Phison's SSD strategy cuts estimated AI training costs from $3 million to $100,000
- aiDAPTIV+ software offloads AI workloads from GPUs to SSDs effectively
- SSDs could replace expensive GPUs in large-scale AI model training
Developing AI models has become increasingly expensive as their size and complexity grow, requiring massive compute resources, with GPUs playing a central role in handling the workload.
Phison, a key player in portable SSDs, has unveiled a new solution that aims to dramatically reduce the cost of training a 1 trillion parameter model by shifting part of the GPU processing load to SSDs, cutting estimated operational expenses from $3 million to just $100,000.
Phison's strategy is to pair its aiDAPTIV+ software with high-performance SSDs to handle certain AI processing tasks traditionally managed by GPUs, while also incorporating the NVIDIA GH200 Superchip to boost performance and keep costs manageable.
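Phison has not published the internals of aiDAPTIV+, so the sketch below is only a minimal illustration of the general idea behind SSD offloading: keep most model weights on disk and stage one layer at a time into fast memory, trading GPU/HBM capacity for storage I/O. The layer sizes, file layout, and use of NumPy memory-mapped files are assumptions for demonstration, not Phison's actual implementation or API.

```python
import os
import tempfile
import numpy as np

# Hypothetical toy model dimensions (illustrative only).
LAYER_SHAPE = (1024, 1024)
NUM_LAYERS = 8

# "Park" every layer's weights in files on disk, standing in for SSD-resident tensors.
ssd_dir = tempfile.mkdtemp()
layer_files = []
for i in range(NUM_LAYERS):
    path = os.path.join(ssd_dir, f"layer_{i}.npy")
    np.save(path, np.random.randn(*LAYER_SHAPE).astype(np.float32))
    layer_files.append(path)

def forward(x: np.ndarray) -> np.ndarray:
    """Toy forward pass that loads one layer at a time from disk.

    Only a single layer's weights occupy fast memory at any moment, which is the
    basic trade-off behind SSD offloading: less GPU memory needed, more I/O traffic.
    """
    for path in layer_files:
        w = np.load(path, mmap_mode="r")  # memory-map instead of loading eagerly
        x = np.tanh(x @ w)                # placeholder activation
    return x

out = forward(np.random.randn(1, 1024).astype(np.float32))
print("output shape:", out.shape)
```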
AI model growth and the next milestone
Phison expects the AI industry to reach the 1 trillion parameter milestone before 2026.
According to the company, model sizes have grown quickly, going from 69 billion parameters with Llama 2 (2023) to 405 billion with Llama 3.1 (2024), followed by DeepSeek R1's 671 billion parameters (2025).
If this trend continues, a trillion-parameter model could be unveiled before the end of 2025, marking a significant jump in AI capabilities.
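A quick back-of-the-envelope check of that trend, using only the three data points quoted above, shows why a trillion-parameter model looks plausible on this timeline; the extrapolation is naive and purely illustrative.

```python
# Parameter counts cited in the article (illustrative check, not a survey).
models = [
    ("Llama 2 (2023)", 69e9),
    ("Llama 3.1 (2024)", 405e9),
    ("DeepSeek R1 (2025)", 671e9),
]

# Year-over-year growth factors between the cited models.
for (prev_name, prev_params), (name, params) in zip(models, models[1:]):
    print(f"{prev_name} -> {name}: x{params / prev_params:.2f} growth")

# Naively apply the most recent growth factor one more time.
last_factor = models[-1][1] / models[-2][1]
projected = models[-1][1] * last_factor
print(f"Naive next step: ~{projected / 1e12:.2f} trillion parameters")
```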
In addition, Phison estimates that its solution can considerably reduce the number of GPUs needed to train large-scale AI models by offloading some of the heaviest GPU processing tasks, an approach that could cut training costs to just 3% of current projections (97% savings), or less than 1/25 of current operational expenses.
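A quick arithmetic check of the $3 million versus $100,000 figures shows they line up with the savings framing above; the dollar amounts are simply those quoted in the article.

```python
# Sanity check on the cost figures quoted in the article (USD).
baseline_cost = 3_000_000   # estimated GPU-only training cost
offload_cost = 100_000      # estimated cost with SSD offloading

fraction = offload_cost / baseline_cost
print(f"Remaining cost: {fraction:.1%} of the baseline")              # ~3.3%
print(f"Savings: {1 - fraction:.1%}")                                 # ~96.7%
print(f"Reduction factor: {baseline_cost / offload_cost:.0f}x")       # 30x, i.e. less than 1/25
```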
Phison has already collaborated with MAINGEAR to launch AI workstations powered by the Intel Xeon W7-3455, signaling its commitment to reshaping AI hardware.
As companies look for cost-effective ways to train massive AI models, innovations in SSD technology could play a crucial role in driving efficiency gains, while external hard drive options remain relevant for long-term data storage.
The push for cheaper AI training solutions has grown since DeepSeek made headlines earlier this year, when its DeepSeek R1 model showed that cutting-edge AI could be developed at a fraction of the usual cost, using 95% fewer chips and reportedly only $6 million for training.
Via TweakTown