Can a Data Center Be Built Next to a School?

With the relentless expansion of AI and blockchain technologies demanding unprecedented computational power, the physical infrastructure that supports our digital world is evolving at a breakneck pace. We’re joined by Dominic Jainy, an IT professional whose work at the intersection of artificial intelligence and machine learning provides a unique lens on the global demand for data centers. Today, we’ll explore a fascinating case study in Provo, Utah, delving into how communities are adapting to this growth, the critical decisions behind powering these digital fortresses, the innovative methods used to keep them cool, and why forgotten industrial sites are becoming the new frontier for digital real estate.

Provo recently adopted a data center overlay zone. What specific community concerns, such as the facility’s proximity to a nearby charter school, does this zoning address, and how are developers navigating these new requirements to gain project approval?

It’s a classic example of a city proactively managing industrial growth while protecting its community fabric. Provo’s data center overlay zone is a forward-thinking tool to guide development. The primary concern addressed here is proximity to sensitive locations, specifically schools. The ordinance stipulates that data centers can’t be built within 200 feet of a school property, which is a very direct way to create a buffer. In this case, the developer B+F Timpanogos faced a challenge with a charter school located nearby. Their solution is quite clever and shows a willingness to collaborate; since they own both parcels of land, they are simply adjusting the property lines to create the required 200-foot distance. This kind of creative compliance is exactly what cities hope for—it allows for significant investment while respecting the established community guidelines.

This project plans to connect to the Provo Power grid after initial plans for on-site generation were dropped. Could you explain the technical and financial trade-offs in this decision and elaborate on the choice of natural gas for backup power?

Pivoting from on-site generation to a grid connection is a major strategic shift, and it almost always comes down to balancing capital expenditure against operational complexity and risk. Building your own power plant is incredibly expensive upfront and adds layers of permitting and maintenance that can delay a project for years. By connecting directly to Provo Power, the developers can get operational faster and leverage the reliability of an established utility. This frees up a significant portion of that $280 million investment for the core data center technology. As for backup, natural gas is the industry standard for a reason. It’s reliable and readily available, and, paired with UPS systems that bridge the seconds it takes the generators to spin up, it prevents any service interruption during a grid outage, keeping the 30MW of IT capacity stable.
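To make the backup discussion concrete, here is a rough sizing sketch. The 30MW figure comes from the article; the individual generator rating and the N+1 redundancy scheme are illustrative assumptions, not the project's actual design.

```python
import math

# Illustrative backup-power sizing, not the project's actual design:
# cover the full 30 MW IT load with natural-gas generators, plus one
# spare unit (N+1 redundancy), a common data-center practice.
IT_LOAD_MW = 30.0
GENSET_MW = 2.5          # assumed rating of a single natural-gas genset

n_needed = math.ceil(IT_LOAD_MW / GENSET_MW)   # units required to carry the load
n_total = n_needed + 1                         # N+1: one extra unit for redundancy

print(f"{n_total} gensets ({n_needed} to carry the load, 1 spare)")
```

The point of N+1 is that any single generator can fail, or be taken offline for maintenance, without reducing capacity below the full IT load.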

The facility will use a closed-loop cooling system to reduce water consumption. Can you walk us through how this system works and compare its environmental and operational impact to other common data center cooling methods in a region like Utah?

In an arid state like Utah, water is gold, so choosing a closed-loop system is both environmentally responsible and operationally smart. Think of it like the radiator in your car; a liquid coolant circulates in a sealed loop, absorbing heat from the servers and then releasing it to the outside air through heat exchangers, without any water evaporating into the atmosphere. This stands in stark contrast to older, water-intensive methods like evaporative cooling towers, which shed heat by evaporating water and can consume millions of gallons a year. By using a closed-loop system, this 132,000-square-foot facility drastically reduces its water footprint, which is a huge win for sustainability and long-term operational costs in a water-scarce region.

With a $280 million investment planned to convert a vacant warehouse, what makes this former Novell campus property an ideal site for a 30MW data center? Please detail the specific site characteristics and infrastructure advantages that justify this redevelopment.

This project is a perfect illustration of adaptive reuse in the tech sector. The site, formerly part of the Novell campus, has several inherent advantages that make it a prime candidate for a data center. First, redeveloping the existing 50,000-square-foot warehouse on a nine-acre plot is often much faster than a ground-up construction project. Second, being part of a former tech campus suggests that the site likely has excellent fiber connectivity and robust power infrastructure nearby, which are the lifeblood of any data center. The massive $280 million investment signals that this isn’t just a cosmetic update; it’s a complete transformation into a high-density facility capable of scaling up to 30MW, which is a significant amount of power. It’s about leveraging the bones of an old industrial site to build a state-of-the-art digital factory.

What is your forecast for data center development in growing secondary markets like Provo, especially regarding the redevelopment of existing industrial properties?

My forecast is that this trend is not just going to continue; it’s going to accelerate dramatically. Primary data center markets are becoming saturated, expensive, and difficult to build in. Secondary markets like Provo offer a compelling alternative with more available land, affordable power, and often a more welcoming regulatory environment. The strategy of redeveloping vacant industrial properties is particularly brilliant. It breathes new economic life into forgotten buildings, shortens construction timelines, and promotes sustainability by reusing existing structures. As the demand for AI and data processing continues its exponential rise, we are going to see a gold rush for these underutilized industrial shells in cities all across the country. This Provo project is a blueprint for the future.
