There has been a lot of talk about the growing energy demands coming from data centers. Big data and AI have been positioned as a bit of a threat. I might be dating myself to say this, but the current climate brings to mind the iconic scene from Raiders of the Lost Ark with AI as the 12-foot boulder threatening to crush our hero and spelling certain disaster.
Yes, AI training is expected to drive a 160% increase in data center power demand by 2030, but meeting that new source of demand is both necessary and important. A slowdown in digital infrastructure development could exacerbate cybersecurity risks and threaten global competitiveness.
It’s time to set shock and awe aside and dig in on solutions, because AI promises huge advantages for the way we work, our economy and domestic security. We need AI to flourish. We can turn the power requirements of data centers into an opportunity for utility companies, local communities and the grid. Done right, a data center should be a 100-year asset, benefiting local communities as a source of economic prosperity and long-term growth, akin to railroad expansion in the late 1800s.
Reframing “will serve”
One emerging area of discussion is the will-serve model, under which electric utilities have a duty to serve demand as it comes. Put in place during a simpler time, the will-serve model is not suited to an era in which some customers’ needs are growing exponentially faster than others’.
Today, a developer can come to a utility and say they are building a hyperscale data center that will need as much energy as a small power plant. Unfortunately, in the current climate, many such requests come from speculative investors tying up power without an actual project or true demand behind it. The utility is still required to put the request in its demand queue.
Building the infrastructure to respond to these requests could take years, with costs shared among all ratepayers. If the hypothetical data center project falls through, those customers are left paying for unneeded infrastructure.
A more sustainable and scalable approach
There is a future in which data centers have access to the energy they need without harming communities or driving up costs. The way to get there is co-serving: data centers are asking for a lot, and to get it they need to give a lot in return.
With a co-serve model, data centers and utilities can work together to close the gap between supply and demand. This approach spreads risk across utility companies and developers, while minimizing costs to local retail and residential customers.
Framing up a co-serve partnership
I see a few ways data centers and utilities can move toward a more collaborative approach to help solve current demand challenges.
- Spread financial risk more equitably. Shareholders and residential ratepayers shouldn’t be left holding the bag for million- or billion-dollar generation and transmission investments if a speculative data center project falls through and doesn’t take the capacity. Once the utility has booked the capacity, the customer should have to pay some portion regardless of whether the project gets pulled. A more equitable approach to risk sharing could also include developers financing feasibility and environmental impact studies upfront.
- Contribute to construction. Let’s face it, private companies can get things done faster than regulated utilities. Data center developers, with requisite expertise in land acquisition, can help secure rights of way and build critical infrastructure like transmission lines and substations. We have greater capital deployment flexibility and can make infrastructure investment decisions relatively quickly.
- Embrace creative approaches to rate paying. Today, everyone bears the cost burden when new energy infrastructure is built. That’s set to change as utility companies explore creative tariffs requiring commercial and industrial customers to pay more. Data center developers should embrace these modernized approaches to cost sharing. Paying our fair share will only serve to speed up development and position the industry as good neighbors.
- Clear a path to peak shaving. Data centers have backup generators to ensure continuous power supply, and these sit idle most of the time. That’s a lot of untapped power. Using microgrids, we’ll be able to run those generators during peak demand periods to alleviate pressure on the grid and help ensure households and other businesses stay powered (a simple dispatch sketch follows below). Unfortunately, air quality permits have grown increasingly complex, which makes peak shaving a challenging solution to deploy universally.
Power providers, the Environmental Protection Agency and the data center industry need to work toward an agreement that takes into consideration various fuel sources, the emissions of each, and opportunities to create permitting structures and regulations that allow power providers flexibility in supply.
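To make the peak-shaving idea concrete, here is a minimal sketch, in Python, of how a microgrid controller might dispatch on-site backup generators whenever site demand crosses a grid-import threshold. The hourly load profile, the threshold and the generator capacity are illustrative assumptions, not figures from any particular utility, data center or study.

```python
# Minimal peak-shaving sketch. All numbers below are illustrative assumptions.

# Hypothetical hourly site load in MW over one day.
HOURLY_LOAD_MW = [42, 40, 39, 41, 47, 55, 63, 70, 74, 76, 75, 72,
                  70, 71, 73, 77, 80, 78, 69, 60, 54, 49, 45, 43]

GRID_THRESHOLD_MW = 65      # assumed grid-import level to stay under at peak
GENERATOR_CAPACITY_MW = 15  # assumed capacity of the otherwise-idle backup fleet

def dispatch(load_mw, threshold_mw, gen_capacity_mw):
    """Return (grid_import_mw, generator_output_mw) for each hour."""
    schedule = []
    for load in load_mw:
        excess = max(0.0, load - threshold_mw)     # demand above the threshold
        gen_output = min(excess, gen_capacity_mw)  # shave as much as the generators allow
        grid_import = load - gen_output            # the remainder still comes from the grid
        schedule.append((grid_import, gen_output))
    return schedule

if __name__ == "__main__":
    schedule = dispatch(HOURLY_LOAD_MW, GRID_THRESHOLD_MW, GENERATOR_CAPACITY_MW)
    shaved_mwh = sum(gen for _, gen in schedule)
    peak_import = max(grid for grid, _ in schedule)
    print(f"Energy shaved from the grid: {shaved_mwh:.0f} MWh")
    print(f"Peak grid import after shaving: {peak_import:.0f} MW "
          f"(raw peak was {max(HOURLY_LOAD_MW)} MW)")
```

In this toy example the backup fleet trims the site’s peak grid draw from 80 MW to 65 MW. A real dispatch decision would also weigh fuel costs, ramp times and the run-hour limits written into air quality permits, which is exactly where the permitting flexibility described above comes in.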
Between 2022 and 2030, the demand for power stemming from data centers will double, according to Goldman Sachs Research. Add to that the growth in manufacturing and the electrification of appliances and cars, and you have a spike in demand that the US hasn’t experienced since the early 2000s. An investment of $50 billion will be required to build new generation capacity for data centers alone.
Addressing the challenge will require close collaboration. A co-serve model sets the stage for long-term growth, a more resilient energy grid, and a more equitable distribution of risk across utilities and developers.
The AI train has left the station. Let’s stay on it, keep it on the rails, reap the rewards and stay ahead. The nation’s economic competitiveness and national security depend on it.