Can a Data Center Be Built Next to a School?

With the relentless expansion of AI and blockchain technologies demanding unprecedented computational power, the physical infrastructure that supports our digital world is evolving at a breakneck pace. We’re joined by Dominic Jainy, an IT professional whose work in artificial intelligence and machine learning gives him a unique lens on the global demand for data centers. Today, we’ll explore a fascinating case study in Provo, Utah: how communities are adapting to this growth, the critical decisions behind powering these digital fortresses, the innovative methods used to keep them cool, and why forgotten industrial sites are becoming the new frontier for digital real estate.

Provo recently adopted a data center overlay zone. What specific community concerns, such as the facility’s proximity to a nearby charter school, does this zoning address, and how are developers navigating these new requirements to gain project approval?

It’s a classic example of a city proactively managing industrial growth while protecting its community fabric. Provo’s data center overlay zone is a forward-thinking tool to guide development. The primary concern addressed here is proximity to sensitive locations, specifically schools. The ordinance stipulates that data centers can’t be built within 200 feet of a school property, which is a very direct way to create a buffer. In this case, the developer B+F Timpanogos faced a challenge with a charter school located nearby. Their solution is quite clever and shows a willingness to collaborate; since they own both parcels of land, they are simply adjusting the property lines to create the required 200-foot distance. This kind of creative compliance is exactly what cities hope for—it allows for significant investment while respecting the established community guidelines.
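To make the setback requirement concrete, here is a minimal sketch of how a 200-foot buffer check could be expressed in code, using the shapely geometry library and entirely invented parcel coordinates; it illustrates the geometry involved, not the developer’s actual survey or the real Provo parcels.

```python
# Minimal sketch of a zoning setback check -- hypothetical coordinates,
# not the actual Provo parcels. Requires the `shapely` package.
from shapely.geometry import Polygon

SETBACK_FT = 200  # buffer required by the overlay zone

# Invented footprints in a local coordinate system measured in feet.
school_parcel = Polygon([(0, 0), (400, 0), (400, 300), (0, 300)])
data_center_footprint = Polygon([(650, 0), (1100, 0), (1100, 350), (650, 350)])

separation = school_parcel.distance(data_center_footprint)
compliant = separation >= SETBACK_FT

print(f"Separation: {separation:.0f} ft -> {'compliant' if compliant else 'too close'}")

# Adjusting a property line amounts to redrawing one of these polygons
# so that the measured separation clears the 200 ft threshold.
```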

This project plans to connect to the Provo Power grid after initial plans for on-site generation were dropped. Could you explain the technical and financial trade-offs in this decision and elaborate on the choice of natural gas for backup power?

Pivoting from on-site generation to a grid connection is a major strategic shift, and it almost always comes down to balancing capital expenditure against operational complexity and risk. Building your own power plant is incredibly expensive upfront and adds layers of permitting and maintenance that can delay a project for years. By connecting directly to Provo Power, the developers can get operational faster and lean on the reliability of an established utility, which frees up a significant portion of that $280 million investment for the core data center technology. As for backup, natural gas is the industry standard for a reason. It’s reliable, readily available, and its generators can pick up the load within seconds of a grid outage, keeping the 30MW of IT capacity stable.
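To put rough numbers on the backup question, the sketch below estimates total facility load and an N+1 generator count from the stated 30MW of IT capacity; the PUE of 1.3, the 3MW generator unit size, and the N+1 redundancy scheme are illustrative assumptions, not figures from the project.

```python
# Back-of-envelope backup-power sizing for a 30 MW IT load.
# PUE, generator unit size, and N+1 redundancy are illustrative assumptions,
# not figures from the Provo project.
import math

IT_LOAD_MW = 30.0        # stated IT capacity
ASSUMED_PUE = 1.3        # assumed power usage effectiveness
GEN_UNIT_MW = 3.0        # assumed rating of each natural-gas generator

facility_load_mw = IT_LOAD_MW * ASSUMED_PUE           # IT load plus cooling and overheads
base_units = math.ceil(facility_load_mw / GEN_UNIT_MW)
total_units = base_units + 1                           # N+1: one spare unit

print(f"Estimated facility load: {facility_load_mw:.1f} MW")
print(f"Generators required (N+1): {total_units} x {GEN_UNIT_MW:.0f} MW")
```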

The facility will use a closed-loop cooling system to reduce water consumption. Can you walk us through how this system works and compare its environmental and operational impact to other common data center cooling methods in a region like Utah?

In an arid state like Utah, water is gold, so choosing a closed-loop system is both environmentally responsible and operationally smart. Think of it like the radiator in your car: a liquid coolant circulates in a sealed loop, absorbing heat from the servers and then releasing it to the outside air through heat exchangers, without any water evaporating into the atmosphere. That stands in stark contrast to older, water-intensive approaches such as evaporative cooling towers, which shed heat by continuously evaporating water and can consume millions of gallons a year. By using a closed-loop system, this 132,000-square-foot facility drastically reduces its water footprint, which is a huge win for sustainability and long-term operational costs in a water-scarce region.
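For a sense of scale, the back-of-envelope sketch below estimates how much water a comparably sized evaporatively cooled facility might consume in a year; the assumed water usage effectiveness of 1.8 liters per kWh of IT energy and the 60% average utilization are illustrative figures, not measurements from this project.

```python
# Rough estimate of annual water use for an evaporatively cooled facility
# of comparable size. WUE and utilization are assumptions, not project data.
IT_CAPACITY_MW = 30.0
AVG_UTILIZATION = 0.6          # assumed average load factor
WUE_L_PER_KWH = 1.8            # assumed water usage effectiveness (L per kWh of IT energy)
HOURS_PER_YEAR = 8760
LITERS_PER_GALLON = 3.785

it_energy_kwh = IT_CAPACITY_MW * 1000 * AVG_UTILIZATION * HOURS_PER_YEAR
water_liters = it_energy_kwh * WUE_L_PER_KWH
water_gallons = water_liters / LITERS_PER_GALLON

print(f"IT energy: {it_energy_kwh / 1e6:.0f} GWh/yr")
print(f"Estimated evaporative water use: {water_gallons / 1e6:.0f} million gallons/yr")
# A closed loop rejects the same heat through sealed heat exchangers,
# so its evaporative consumption is close to zero.
```

Under those assumptions the evaporative design lands in the tens of millions of gallons per year, while the closed loop’s evaporative losses are essentially nil.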

With a $280 million investment planned to convert a vacant warehouse, what makes this former Novell campus property an ideal site for a 30MW data center? Please detail the specific site characteristics and infrastructure advantages that justify this redevelopment.

This project is a perfect illustration of adaptive reuse in the tech sector. The site, formerly part of the Novell campus, has several inherent advantages that make it a prime candidate for a data center. First, redeveloping the existing 50,000-square-foot warehouse on a nine-acre plot is often much faster than a ground-up construction project. Second, being part of a former tech campus suggests that the site likely has excellent fiber connectivity and robust power infrastructure nearby, which are the lifeblood of any data center. The massive $280 million investment signals that this isn’t just a cosmetic update; it’s a complete transformation into a high-density facility capable of scaling up to 30MW, which is a significant amount of power. It’s about leveraging the bones of an old industrial site to build a state-of-the-art digital factory.

What is your forecast for data center development in growing secondary markets like Provo, especially regarding the redevelopment of existing industrial properties?

My forecast is that this trend is not just going to continue; it’s going to accelerate dramatically. Primary data center markets are becoming saturated, expensive, and difficult to build in. Secondary markets like Provo offer a compelling alternative with more available land, affordable power, and often a more welcoming regulatory environment. The strategy of redeveloping vacant industrial properties is particularly brilliant. It breathes new economic life into forgotten buildings, shortens construction timelines, and promotes sustainability by reusing existing structures. As the demand for AI and data processing continues its exponential rise, we are going to see a gold rush for these underutilized industrial shells in cities all across the country. This Provo project is a blueprint for the future.
