In the age of artificial intelligence, public utilities face an unexpected new problem: ghost data centers. On the surface, the idea may seem absurd: why (and how) would anyone fake something as complex as a data center? But as demand for AI rises alongside the need for more computing power, speculation around data center development is creating chaos, especially in places like Northern Virginia, the data center capital of the world. In this evolving landscape, utilities are being bombarded with power requests from real estate developers who may or may not actually build the infrastructure they claim.
Fake data centers have become a pressing bottleneck in scaling data infrastructure to keep up with computing demand, because they prevent capital from flowing where it is actually needed. Any organization that can help solve this problem, perhaps by taking advantage of AI to solve a problem AI itself created, will have a huge advantage.
The mirage of gigawatt-scale demand
Dominion Energy, the largest utility in Northern Virginia, has received requests totaling 50 GW of power from data center projects. Delivered continuously, that is more energy than Iceland consumes in an entire year.
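As a rough back-of-the-envelope check, assume the full 50 GW ran around the clock and take Iceland’s annual electricity consumption to be roughly 19 TWh (an outside figure, not cited in the article): 50 GW × 8,760 hours ≈ 438 TWh per year, more than 20 times Iceland’s entire annual usage.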
But many of these requests are speculative or outright false. Developers are eyeing potential sites and staking claims on power capacity long before they have the capital, or any concrete plan, to break ground. In fact, it is estimated that up to 90% of these requests are entirely fake.
In the early days of the data center boom, utilities never had to worry about spurious demand. Companies like Amazon, Google and Microsoft, called “hyperscalers” because they run data centers containing hundreds of thousands of servers, made direct requests for power, and utilities simply delivered it. But now, the scramble to secure power capacity has led to an influx of requests from lesser-known developers and speculators with questionable track records. Utilities that traditionally dealt with a handful of power-hungry customers are suddenly inundated with requests for capacity that would dwarf their entire grids.
Utilities struggle to distinguish fact from fiction
The challenge utilities face is not just technical; it is existential. They are charged with determining what is real and what is not, and they are not well equipped to do so. Historically, utilities have been slow-moving, risk-averse enterprises. Now they are being asked to screen speculators, many of whom are simply playing a real estate game, hoping to flip their power allotments once the market heats up.
Utilities have teams charged with economic development, but those teams are not accustomed to handling dozens of speculative requests at once. It’s like a land rush in which only a small fraction of those claiming stakes plan to build anything tangible. The result? Paralysis. Utilities are reluctant to allocate power when they do not know which projects will actually be built, which slows down the entire development cycle.
A wall of capital
There is no shortage of capital flowing into the data center space, but that abundance is part of the problem. When capital is easy to access, speculation follows. In a way, this resembles the better-mousetrap problem: too many players chasing an oversupplied market. The influx of speculators creates hesitation not only within utilities but also in local communities, which must decide whether to grant permits for land use and infrastructure development.
Adding to the complexity, data centers are not just for AI. AI is certainly driving the surge in demand, but there is also a continuing need for cloud computing. Developers are building data centers to accommodate both, and distinguishing between the two is becoming increasingly difficult, especially when projects mix AI hype with traditional cloud infrastructure.
What is real?
The legitimate players — the aforementioned Amazon, Google and Microsoft — are building real data centers, and many are adopting strategies such as “behind the meter” deals with renewable energy providers or building microgrids to avoid grid interconnection bottlenecks. But as real projects multiply, fake ones multiply as well. Developers with little experience in the space are trying to cash in, resulting in an increasingly chaotic environment for utilities.
The problem isn’t just the financial risk — although the capital required to build a single gigawatt-scale campus can easily run to several billion dollars — it’s the sheer complexity of developing infrastructure at this scale. A 6-GW campus sounds impressive, but the financial and engineering realities make it nearly impossible to build in a reasonable time frame. Yet speculators tout these massive numbers, hoping to secure power capacity now and flip the project later.
Why the grid can’t keep up with data center demand
As utilities struggle to distinguish fact from fiction, the grid itself is becoming a bottleneck. McKinsey recently estimated that global demand for data centers could reach 152 GW by 2030, adding 250 terawatt-hours of new electricity demand. In the United States, data centers alone could account for 8% of total power demand by 2030, an astonishing figure considering how little demand has grown over the past two decades.
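For a sense of how those two units relate, a rough conversion that treats the load as perfectly steady: 250 TWh spread across a year works out to 250 TWh ÷ 8,760 hours ≈ 28.5 GW of average continuous draw.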
The grid, however, is not ready for this influx. Interconnection and transmission problems are rampant, with estimates suggesting the US could run out of power capacity between 2027 and 2029 if alternative solutions are not found. Developers are increasingly turning to on-site generation, such as gas turbines or microgrids, to sidestep interconnection bottlenecks, but these stopgaps only highlight the grid’s limitations.
Conclusion: Utilities are the gatekeepers
The real bottleneck is not a lack of capital (trust me, there is plenty of capital here) or even technology — it is the ability of utilities to act as gatekeepers, determining who is real and who is playing the speculative game. Without a robust process for vetting developers, the grid risks being overwhelmed by projects that will never materialize. The era of fake data centers has arrived, and until utilities adapt, the entire industry may struggle to keep up with real demand.
In this chaotic environment, it’s not just about allocating power; it’s about utilities learning to navigate a new frontier of speculation so that enterprises (and AI) can thrive.
Sophie Bakalar is a partner at Collaborative Fund.