- Floating data centers aim to bypass network bottlenecks with offshore deployment
- Samsung model connects directly to shore power for faster AI scaling
- Offshore barges could significantly reduce data center deployment times
Samsung Heavy Industries has unveiled a large floating data center ship model that could support AI tools on a global scale.
Samsung and OpenAI signed a letter of intent in October 2025 for a comprehensive partnership, including the development of a floating data center.
The design of this data center is specifically intended to host future versions of systems like OpenAI’s ChatGPT on an aquatic platform.
Offshore deployment strategy for AI infrastructure
This vessel would sit offshore, connecting directly to electricity and cooling supplied by nearby coastal energy resources.
Samsung says the concept cuts the typical multi-year buildout for terrestrial data centers into a much shorter time frame.
The project is being developed with Mousterian Corp., a Dallas-based infrastructure developer focused on high-density AI computing.
The floating model aims to reduce the time needed to ensure AI workloads are powered and cooled.
Instead of waiting for new grid connections, the system docks near existing thermal or nuclear power plants.
This approach views the coastline as a deployment zone for digital infrastructure, and the barges carry fully liquid-cooled data halls that can scale based on demand.
Developers say “speed to power is the new moat” for AI tools and cloud operators.
Anyone who can bring compute and power online quickly gains a real advantage over slower rivals.
This is why the partnership says this strategy can move capacity delivery from years to quarters for some sites.
The Dallas-based partner says the floating data center initiative aims to deliver more than 1.5 GW of capacity within about three years.
“Speed access to electricity is the new divide. We have built thoughtful partnerships with some of the world’s largest conglomerates, enabling us to deliver over 1,500 MW of capacity over the next 3 years,” said Min Suh, CEO of Mousterian Corp.
Reaching that figure would require multiple barge-based projects, each subject to local electricity and network constraints.
Each ship would host thousands of servers designed for AI training and inference workloads.
The 1.5 GW target also depends on approvals, construction speed and water availability near baseload plants.
Some analysts doubt whether this pace can be sustained in practice, and maritime data centers still face large-scale technical, regulatory and economic hurdles.
Operational risks and uncertainties at sea
Although floating data centers solve some of the problems associated with land-based data centers, they also introduce new challenges.
Experts fear these facilities could create new cybersecurity, physical access and long-term reliability risks.
Saltwater environments, exposure to storms, and emergency response times significantly complicate operations.
Maintenance and fiber optic links are also more complex at sea than on land.
Additionally, claims to deliver 1.5 GW in 36 months are based on unproven timelines for shipbuilding, permitting and tenant onboarding.
Market demand for AI tools and data centers is real, but whether floating facilities can meet it remains uncertain.
The model may add a niche option rather than overhaul how most AI calculations are housed, and the real test will be how many barges actually come online as planned.
Via Dallas Innovates