EcoThink: When AI Learns to Think Less (and Achieve More)
Opening: Why this matters now

For all the breathless talk about AI scaling, a quieter, less glamorous curve is rising just as fast: energy consumption. Training large models was the original villain, but inference, the act of actually using AI, is becoming the real cost center. Billions of queries, each wrapped in unnecessarily elaborate reasoning chains, quietly compound into a global carbon problem. ...