The Cost of Powering AI: Distributional Impacts and an Operational Standard for Self-Powered Deployments

Abstract

Artificial intelligence (AI) is scaling amid grid congestion, volatile peak pricing, and uneven water availability. New data-center load can shift public costs (scarcity-hour prices, capacity upgrades, marginal emissions, and water withdrawals) onto disadvantaged communities that already carry elevated energy burdens. This paper treats AI as an electrical load and proposes an operational, auditable Self-Powered AI Standard. Deployments must (i) provide attributable additional clean supply (AACS), (ii) match consumption hour by hour within the balancing area (HCC/EHCC, with a Clean Matching Shortfall, CMS), and (iii) maintain firm self-supply availability (FSSA) while demonstrating near-zero Scarcity-Adjusted Import Exposure (SAIE) during top-decile price/CO₂ hours. Using public indicators (wholesale prices, interconnection queues, marginal emissions, congestion, and water context), we construct a screen for “where an extra MW hurts,” and outline siting and operation patterns (islanded training parks; grid-tied inference with islanding), portfolio-level 24/7 matching, and a Levelized Cost of AI Power (LCAP) for self-supplied stacks. A telemetry case study demonstrates end-to-end computation of AACS, HCC/EHCC/CMS, FSSA, and SAIE. The framework enables compute to grow without leaning on the public grid during scarcity and aligns reporting with standard telemetry and open datasets.
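The abstract describes these metrics only at a high level. As an illustration, the minimal Python sketch below shows one plausible way to compute hourly clean coverage (HCC), Clean Matching Shortfall (CMS), and Scarcity-Adjusted Import Exposure (SAIE) from aligned hourly telemetry. The function name, the same-hour matching rule, and the use of top-decile price hours as the scarcity proxy are assumptions made for this sketch, not the paper's definitions.

from dataclasses import dataclass
from typing import Sequence

@dataclass
class HourlyMetrics:
    hcc: float   # hourly clean coverage: same-hour matched clean MWh / load MWh
    cms: float   # clean matching shortfall: unmatched load MWh over the period
    saie: float  # share of load served by grid imports in top-decile price hours

def compute_metrics(load_mwh: Sequence[float],
                    clean_supply_mwh: Sequence[float],
                    grid_import_mwh: Sequence[float],
                    price: Sequence[float]) -> HourlyMetrics:
    """All inputs are aligned hourly series for one balancing area (assumption)."""
    assert len(load_mwh) == len(clean_supply_mwh) == len(grid_import_mwh) == len(price)
    # Hour-by-hour matching: clean supply covers load only within the same hour.
    matched = [min(l, c) for l, c in zip(load_mwh, clean_supply_mwh)]
    total_load = sum(load_mwh)
    hcc = sum(matched) / total_load if total_load else 1.0
    cms = total_load - sum(matched)
    # Scarcity proxy (assumption): hours at or above the 90th-percentile price.
    cutoff = sorted(price)[int(0.9 * len(price))]
    scarce = [i for i, p in enumerate(price) if p >= cutoff]
    scarce_load = sum(load_mwh[i] for i in scarce)
    saie = (sum(grid_import_mwh[i] for i in scarce) / scarce_load) if scarce_load else 0.0
    return HourlyMetrics(hcc=hcc, cms=cms, saie=saie)

if __name__ == "__main__":
    m = compute_metrics(
        load_mwh=[10, 10, 10, 10],
        clean_supply_mwh=[12, 8, 10, 4],
        grid_import_mwh=[0, 2, 0, 6],
        price=[30, 40, 35, 400],  # one scarcity hour
    )
    print(m)  # HourlyMetrics(hcc=0.8, cms=8.0, saie=0.6)

Note that near-zero SAIE in this framing means a deployment imports essentially nothing from the grid during the priciest hours, which is the operational condition the standard is intended to make auditable.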