Energy
February 10, 2026 · Negotiate The Future
Realistic Environmental Accounting
Measuring the true costs and benefits of large-scale compute
AI infrastructure requires energy. Datacenters, semiconductor fabrication, and the networks that connect them will expand electricity demand significantly over the next decade. That is a fact. The question is what we do with it.
Two responses dominate the current discourse, and both are wrong. The first is moral panic — treating AI energy consumption as inherently disqualifying, without accounting for the productivity gains and emission reductions that well-deployed AI can enable. The second is PR minimization — corporate claims that AI will be carbon-neutral by some future date, without credible plans or honest accounting for near-term grid impact.
Negotiate the Future supports responsible growth: building the compute capacity society will need, while ensuring that near-term reliance on imperfect energy sources is paired with genuine decarbonization commitments. That means standardized reporting on datacenter energy and water use, strong incentives for renewable-powered compute, and procurement and permitting reforms that speed clean energy buildout.
Carbon accounting for AI must reflect time and place. Grid carbon intensity varies by region and by hour: a datacenter drawing peak load on a coal-heavy grid has a very different footprint than one running overnight on hydropower. One-size-fits-all annual averages obscure exactly the choices that matter most.
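To make the point concrete, here is a minimal sketch of hourly versus flat-average accounting. All figures are invented for illustration, not real grid data: the intensity series, load schedules, and the `emissions_g` helper are hypothetical, but the structure mirrors how time-resolved accounting works in practice.

```python
def emissions_g(load_kwh_by_hour, intensity_g_per_kwh_by_hour):
    """Sum hourly load (kWh) times hourly grid carbon intensity (gCO2/kWh)."""
    return sum(load * intensity
               for load, intensity in zip(load_kwh_by_hour,
                                          intensity_g_per_kwh_by_hour))

# Hypothetical 4-hour window: intensity is high at peak, low overnight.
intensity     = [700, 650, 200, 150]   # gCO2/kWh (made-up values)
peak_load     = [1000, 1000, 0, 0]     # kWh: compute scheduled at peak hours
offpeak_load  = [0, 0, 1000, 1000]     # kWh: same total energy, shifted overnight

print(emissions_g(peak_load, intensity) / 1000)     # 1350.0 kgCO2
print(emissions_g(offpeak_load, intensity) / 1000)  # 350.0 kgCO2

# A flat average intensity reports the same figure for both schedules,
# hiding the nearly 4x difference that scheduling actually made:
avg = sum(intensity) / len(intensity)               # 425.0 gCO2/kWh
print(sum(peak_load) * avg / 1000)                  # 850.0 kgCO2 either way
```

The same total energy consumption produces very different emissions depending on when it is drawn, which is precisely what a single averaged intensity figure erases.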