Samsung and M3 to Develop Global Floating Data Centers

In an era where digital infrastructure is colliding with the physical limits of land and power, Dominic Jainy stands at the forefront of the next architectural shift. As an expert in large-scale technical integration, he has spent years dissecting how artificial intelligence and machine learning demand increasingly creative cooling and energy solutions. The recent collaboration between Samsung Heavy Industries and M3 to build gigawatt-scale floating data centers represents more than just an engineering feat; it is a fundamental pivot in how we conceive of data real estate. By leveraging maritime expertise to solve the bottlenecks of urban density and power scarcity, this partnership signals the arrival of a new, institutional-grade asset class that could redefine the global digital map.

Samsung Heavy Industries and M3 are collaborating on institutional-grade floating data centers. How does Samsung’s expertise in maritime engineering and large-scale fabrication help de-risk these projects, and which technical hurdles are overcome by using a shipbuilding approach rather than traditional land-based construction?

The involvement of Samsung Heavy Industries brings a level of institutional credibility that is often missing in experimental infrastructure projects. Their massive shipbuilding scale and deep balance sheet allow them to treat a data center like a high-spec maritime vessel, which significantly de-risks the fabrication process through standardized, shipyard-based assembly. By moving construction away from unpredictable land-based sites and into a controlled engineering environment, they can bypass local labor shortages and the logistical headaches of hauling heavy materials to remote plots. This approach overcomes the traditional hurdles of site preparation and “stick-built” delays, offering a path to delivery on timelines that conventional land-based methods simply cannot match. It essentially turns a complex construction project into a repeatable, industrial manufacturing process.

Project Zeus in Houston is designed as a 500MW hybrid floating and land-based campus. What are the primary logistical advantages of situating a data center adjacent to a power plant, and how do these “hybrid” configurations optimize power delivery and cooling compared to isolated facilities?

Project Zeus represents a masterclass in proximity-based efficiency by situating its 500MW capacity right next to a CCGT power plant. The primary advantage is the dramatic reduction in transmission losses and the elimination of the years-long wait times for grid connections that plague isolated land facilities. In a hybrid configuration, the facility can draw massive amounts of power directly from the source while utilizing the nearby water body for highly efficient, large-scale cooling loops. This synergy allows for a much denser power profile, supporting high-performance computing loads that would otherwise overwhelm the cooling capacity of a standard landlocked data center. By H1 2028, this model will likely prove that combining maritime flexibility with existing power utility footprints is the fastest way to scale.
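The value of co-location can be sketched with simple I²R line-loss arithmetic. The figures below (a 345 kV line, 0.03 Ω/km, a single-conductor model that ignores power factor) are illustrative assumptions, not details from the project:

```python
# Illustrative I^2*R transmission-loss comparison for co-located versus
# remote siting. Voltage and line resistance are assumed values, and the
# single-conductor model is a deliberate simplification.

def loss_fraction(power_w: float, voltage_v: float,
                  ohms_per_km: float, distance_km: float) -> float:
    """Fraction of transmitted power dissipated as resistive line loss."""
    current = power_w / voltage_v                    # A, ignoring power factor
    loss_w = current ** 2 * ohms_per_km * distance_km
    return loss_w / power_w

P, V, R = 500e6, 345e3, 0.03   # 500 MW load, 345 kV line, 0.03 ohm/km
print(f"adjacent (1 km):  {loss_fraction(P, V, R, 1):.4%}")
print(f"remote (100 km): {loss_fraction(P, V, R, 100):.2%}")
```

Under these assumptions the remote site loses roughly a hundred times more power in the line than the adjacent one, before even counting the grid-connection queue.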

Space for digital infrastructure is increasingly at a premium in major global hubs. How do floating facilities like Project Sandpiper in California address the high cost of land, and what specific site-origination criteria must be met to ensure these aquatic platforms are both scalable and environmentally viable?

Project Sandpiper, with its ambitious 860MW target near San Jose, tackles the California land crisis by effectively creating its own real estate on the water. In markets where land prices are astronomical and zoning laws are restrictive, moving offshore or into industrial waterways provides an “out-of-the-box” solution to space scarcity. For these platforms to be viable, site-origination teams must look for protected aquatic zones with deep-water access and proximity to existing fiber backbones. Environmental viability is maintained by using the water’s natural thermal mass for cooling, which reduces the facility’s overall carbon footprint and water consumption compared to traditional evaporative cooling towers. As we look toward the 2029 launch, the focus remains on ensuring these platforms can scale horizontally without disrupting local maritime ecosystems.
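The scale of water-based cooling can be estimated from the heat-balance relation Q = ṁ·c·ΔT. A minimal sketch, assuming the full 860 MW IT load is rejected as heat, a 10 K allowable temperature rise, and fresh-water properties (none of which are stated in the article):

```python
# Rough sizing of a once-through water cooling loop.
# Assumptions (not from the article): the entire electrical load is
# rejected as heat, a 10 K coolant temperature rise, fresh-water values.

SPECIFIC_HEAT = 4186.0   # J/(kg*K), water
DENSITY = 1000.0         # kg/m^3, approximate

def required_flow_m3_per_s(heat_load_w: float, delta_t_k: float) -> float:
    """Volume flow needed to carry heat_load_w at a delta_t_k rise."""
    mass_flow = heat_load_w / (SPECIFIC_HEAT * delta_t_k)  # kg/s
    return mass_flow / DENSITY                             # m^3/s

# Project Sandpiper's stated 860 MW target as the heat load:
flow = required_flow_m3_per_s(860e6, 10.0)
print(f"{flow:.1f} m^3/s")   # roughly 20.5 m^3/s
```

Even this crude estimate shows why a large water body is attractive: tens of cubic metres per second is trivial for a bay but would dominate the water budget of an evaporative-cooled land site.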

Different firms are exploring various power sources for maritime data centers, including wave energy and offshore wind integration. How do these diverse energy solutions impact the overall reliability of a gigawatt-scale pipeline, and what steps are taken to ensure consistent uptime when transitioning to floating, eco-friendly energy models?

Integrating renewable sources like wave energy or offshore wind is essential to the long-term sustainability of the data center market, but it introduces variability in power delivery. To maintain the rigorous uptime required for institutional-grade facilities, developers are pursuing eco-friendly energy sources as primary or secondary supplies without sacrificing reliability. This often involves modular designs, such as Aikido’s floating wind platforms, which can be paired with battery storage or hybrid connections to land-based grids to smooth out intermittent generation. Ensuring 99.999% reliability in a floating environment requires sophisticated orchestration of these diverse energy inputs, often managed by AI-driven power management systems that can switch sources in milliseconds. This transition is less about replacing the grid and more about creating a resilient, self-sustaining energy ecosystem at sea.
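The "five nines" target and the value of redundant feeds reduce to simple availability arithmetic. The sketch below uses illustrative source availabilities (a 97% renewable feed, a 99.5% grid tie) that are assumptions, not figures from the article:

```python
# Back-of-the-envelope availability math behind a "99.999%" uptime target.
# The individual source availabilities below are illustrative assumptions.

MINUTES_PER_YEAR = 365.25 * 24 * 60

def downtime_minutes(availability: float) -> float:
    """Annual downtime budget implied by an availability target."""
    return (1.0 - availability) * MINUTES_PER_YEAR

def parallel_availability(*sources: float) -> float:
    """Availability when any one of several independent sources suffices."""
    unavailability = 1.0
    for a in sources:
        unavailability *= (1.0 - a)
    return 1.0 - unavailability

print(f"{downtime_minutes(0.99999):.2f} min/yr")      # ~5.26 min/yr
# Two imperfect feeds: offshore wind at 97%, a grid tie at 99.5%
print(f"{parallel_availability(0.97, 0.995):.6f}")    # 0.999850
```

The point of the second function is the one made in the answer above: no single floating energy source needs five nines on its own, provided the orchestration layer can fail over between independent feeds fast enough that their outages do not overlap.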

M3’s leadership includes veterans from previous floating data center ventures. Based on past attempts to commercialize this technology, what are the most critical “lessons learned” regarding tenant sourcing and capital markets access that will determine the long-term success of this new asset class?

The most critical lesson learned from earlier ventures is that technical ingenuity isn’t enough; you need the backing of industrial giants and clear access to capital markets to move beyond the pilot stage. By partnering with Samsung, M3 is signaling to investors that floating data centers are no longer a “niche” experiment but a “highly credible” asset class with a validated delivery path. Previous attempts struggled because they lacked the scale to attract hyperscale tenants who require hundreds of megawatts of guaranteed capacity. The current strategy focuses on building massive pipelines—like the gigawatt-scale goals mentioned—to ensure that the economics of scale work in favor of the developer. Success now depends on proving to institutional investors that these maritime assets are as durable and “bankable” as any traditional warehouse or office building on land.

What is your forecast for floating data centers?

I forecast that floating data centers will evolve from a “last resort” for land-constrained hubs like Singapore and California into a primary strategy for rapid AI deployment globally. Over the next decade, we will see the emergence of “data flotillas” that can be towed to areas of high demand, effectively decoupling digital growth from local real estate constraints. As shipbuilding techniques continue to merge with modular IT design, these facilities will become the standard for high-density, liquid-cooled workloads. We are moving toward a world where the most powerful computers on earth are not hidden in deserts or warehouses, but are instead stationed off the coasts of our major cities, powered and cooled by the very oceans they overlook.
