
The Data Center Industry Is Measuring the Wrong Thing



The data center industry has a messaging problem. Not because it lacks strong fundamentals, but because it keeps answering the wrong question.

When a project is proposed, the industry responds with national statistics: trillions in GDP contribution, millions of jobs supported, and billions in tax revenue generated. These numbers are not wrong, but they are not persuasive.


Communities are not asking about national impact. They are asking about local outcomes: What do we get, here, from this project?

Suhail Y. Tayeb. PHOTO: Courtesy NYU

That question has become the center of the data center debate, and it is where the industry is losing ground. Critics point to generous tax incentives, hidden infrastructure costs and limited permanent job creation. In many cases, those concerns are justified.

The issue is not the criticism itself. It is the lack of a consistent way to evaluate it. The industry speaks in aggregate, while opposition speaks in specifics, and specifics win.

A resident does not experience GDP. A resident experiences a utility bill. A school district does not budget based on projections. It runs on tax receipts and budget gaps. And local governments have to fund and manage the roads, substations, water systems and emergency services that support these projects, regardless of how the economic projections are framed.

Without a clear way to connect a specific project to these local realities, the debate becomes abstract on one side and tangible on the other. That imbalance is driving opposition.

The path forward is not more messaging. It is better measurement.

Because, right now, the industry is measuring the wrong thing.

Every data center project should be evaluated through a local balance sheet. Not a national model or a marketing deck, but a simple, project-level accounting of what flows into and out of a community over time.

What are the inflows? What are the outflows? And when do they occur?

On one side of the ledger are the benefits, including property taxes, equipment taxes, construction employment, permanent jobs, indirect economic activity and infrastructure improvements. On the other side are the costs, including tax incentives, public infrastructure investments, strain on power systems, potential ratepayer impact, water usage constraints and municipal service demands.

Timing matters as much as magnitude. Construction jobs arrive early and disappear quickly, while infrastructure costs are often front-loaded. Tax revenues may ramp slowly or be offset by incentives for years, and permanent employment is smaller but longer-lived.
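That timing dynamic can be illustrated with a toy model. The sketch below uses entirely hypothetical figures (all values in millions of dollars) to show how front-loaded infrastructure costs and slow-ramping tax revenues can leave a community in the red for years even when the project is net positive over its full life:

```python
# Toy local balance sheet with hypothetical figures (all values in $M).
# Inflows and outflows are listed per year over a 10-year horizon to show
# how front-loaded costs and slow-ramping revenues shape the net position.

def cumulative_net(inflows, outflows):
    """Return the year-by-year cumulative net position for a project."""
    net = []
    total = 0.0
    for inn, out in zip(inflows, outflows):
        total += inn - out
        net.append(round(total, 1))
    return net

# Hypothetical inflows: construction-phase wages and spending in years 1-2,
# then modest receipts until tax abatements expire after year 6.
inflows = [12.0, 12.0, 2.0, 2.0, 2.0, 2.0, 8.0, 8.0, 8.0, 8.0]

# Hypothetical outflows: substation and road upgrades front-loaded,
# then steady municipal service costs.
outflows = [15.0, 10.0, 3.0, 3.0, 3.0, 3.0, 3.0, 3.0, 3.0, 3.0]

balance = cumulative_net(inflows, outflows)
print(balance)
# [-3.0, -1.0, -2.0, -3.0, -4.0, -5.0, 0.0, 5.0, 10.0, 15.0]
```

Under these made-up numbers the project ends solidly positive, yet the community carries a deficit for six years, which is exactly the near-term strain the aggregate framing hides.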

Without aligning these timelines, it is easy to overstate benefits and understate costs. This is where much of the current tension originates.

A project can look positive in aggregate while creating near-term strain at the local level. Or it can generate long-term value but fail to communicate that fact clearly enough to survive the approval process.

In markets like Northern Virginia, data centers have generated outsize local tax revenues relative to service demand, helping fund schools and infrastructure. In other regions, poorly structured incentives and infrastructure obligations have created fiscal strain and political backlash.

Right now, there is no standardized way to present this information. Each project is framed differently, assumptions are inconsistent, and key variables are often opaque. That creates skepticism. And skepticism turns into opposition.

A project that cannot clearly articulate its local balance sheet is more likely to face delays or cancellation. In a market where timing is critical, that uncertainty carries real financial consequences.

The industry has solved far more complex problems. It has engineered hyperscale infrastructure, optimized power usage, and built global networks of compute capacity. It can solve this.

What is missing is not capability. It is alignment on what should be measured and how it should be communicated.

The next phase of data center development will require a shift from selling economic potential to underwriting community value. That means moving beyond national narratives and building project-level transparency into the development process.

It also means accepting that not every project will look attractive under this lens. Some will show clear positive impact, while others will reveal imbalances that need to be addressed.

That is not a weakness. That is discipline.

Other asset classes have already gone through similar transitions. Infrastructure, energy and public-private partnerships have long relied on detailed, project-level analysis. Data centers are now reaching that same point.

At the Center for the Sustainable Built Environment at New York University, we are beginning to explore what a standardized local balance sheet for data centers could look like. The goal is to establish a common framework that allows developers, communities and policymakers to evaluate projects using the same assumptions.

This is not about proving that data centers are good or bad. It is about making their impact legible.

Until communities can clearly see what they are gaining, they will focus on what they might lose. And, until the industry can demonstrate local value, it will struggle to deliver projects at the pace demand requires.

The question is no longer whether data centers create economic value. The question is whether that value can be clearly defined, fairly distributed, and demonstrated at the community level.

The projects that move forward will not be the ones that promise the most. They will be the ones that can show the numbers.

That is the balance sheet that matters.

Suhail Y. Tayeb is clinical assistant professor at New York University’s Schack Institute of Real Estate and director of the Center for the Sustainable Built Environment.