Sasha Luccioni: sustainable AI is impossible without data on emissions and real-world usage
Sasha Luccioni believes the conversation about sustainable AI is still happening almost blindly: companies lack proper data on emissions, energy consumption, and real-world usage.

Sasha Luccioni, a researcher in sustainable AI, says the industry discusses the environmental footprint of the technology almost blindly. Without data on energy consumption, water usage, and real-world use cases, companies cannot see where AI actually delivers value and where it simply adds to the infrastructure burden.
Industry Works Blind
According to Luccioni, the main market failure lies not only in the growth of data centers but also in the lack of basic reporting. Businesses already face pressure from employees, boards of directors, and ESG teams: if a company deploys Copilot, chatbots, or media generation, it must understand how this affects its climate goals. Yet many clients of AI services don't know where their models physically run, which power grids those data centers draw from, or what their indirect carbon footprint is.
The problem affects not only private companies. Governments and energy agencies also lack figures to plan new capacity and assess the consequences of the next wave of data center construction. In Europe, the topic is already embedded in the regulatory agenda around AI, and some countries are watching new data center operators more closely.
If AI load cannot be separated from the rest of cloud infrastructure, any talk of "green" growth quickly turns into guessing.
Not Every AI is Needed
Luccioni has a separate complaint: the market's habit of selling large universal models as the solution to every task. In practice, many companies don't need a heavy LLM for every request. For searching corporate documents, classification, filtering, speech-to-text, or narrow analytical tasks, simpler models are often sufficient. They are cheaper, faster, and require less computation, reducing energy load without noticeable quality loss in the specific scenario.

A good reference point here is telemetry. If a provider shows how many tokens come in and go out, a company can see which types of requests dominate: simple text, image generation, or deep research. Model choice then becomes an engineering decision, not a purchase of "the biggest, just in case."
By this logic, sustainability doesn't start with banning AI but with an honest comparison of task, cost, and resource intensity of each tool.
"We still need data on energy and water to make informed decisions,"
Luccioni says.
In practice, this means:

- light models for search, classification, and routine tasks
- more powerful LLMs only for complex analysis
- separate accounting for image and video generation
- choosing locations with cleaner energy
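The routing logic these levers imply can be sketched as a simple dispatcher. All model names, task labels, and thresholds below are illustrative assumptions, not real provider endpoints or anything Luccioni herself ships:

```python
# Hypothetical sketch: route each request to the cheapest adequate model class,
# using the kind of telemetry Luccioni describes (task type, token counts).
from dataclasses import dataclass

@dataclass
class Request:
    task: str           # e.g. "classify", "search", "generate_image", "deep_analysis"
    prompt_tokens: int  # input size, as reported by provider telemetry

# Tasks that small, cheap models usually handle well (assumed categories).
LIGHT_TASKS = {"classify", "search", "speech_to_text", "filter"}

def choose_model(req: Request) -> str:
    """Pick a model class by task type and size, not 'biggest by default'."""
    if req.task in LIGHT_TASKS:
        return "small-encoder"   # cheap, fast, low energy
    if req.task == "generate_image":
        return "image-model"     # accounted for separately, per the list above
    if req.task == "deep_analysis" or req.prompt_tokens > 4000:
        return "large-llm"       # reserved for genuinely complex work
    return "mid-llm"             # sensible default for everything else

print(choose_model(Request("classify", 200)))        # small-encoder
print(choose_model(Request("deep_analysis", 6000)))  # large-llm
```

The point is not these exact thresholds but the shape of the decision: telemetry makes the request mix visible, and the router turns that visibility into lower energy use.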
Transparency as Leverage
Luccioni says she misses a very simple interface feature: a meter in ChatGPT or Claude that would show the energy consumed, the emissions produced, and the source of that energy after each query. Such an indicator would make the environmental cost of AI visible to users and give companies a concrete basis for purchasing decisions. Until this happens, sustainability remains a topic for PR slides rather than everyday product and infrastructure management.
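The arithmetic behind such a meter is straightforward: energy per query times the carbon intensity of the grid serving the data center. The figures below are made up for illustration; no provider currently publishes them, which is exactly Luccioni's point:

```python
# Hypothetical sketch of a per-query footprint estimate. Both input figures
# (Wh per 1k tokens, grid gCO2e/kWh) are illustrative assumptions.
def query_footprint(energy_wh_per_1k_tokens: float,
                    total_tokens: int,
                    grid_intensity_g_per_kwh: float) -> dict:
    """Estimate energy (Wh) and emissions (gCO2e) for a single query."""
    energy_wh = energy_wh_per_1k_tokens * total_tokens / 1000
    emissions_g = energy_wh / 1000 * grid_intensity_g_per_kwh
    return {"energy_wh": round(energy_wh, 3), "emissions_g": round(emissions_g, 3)}

# A 1,500-token exchange on an assumed 3 Wh / 1k-token model,
# served from a grid at an assumed 400 gCO2e/kWh:
print(query_footprint(3.0, 1500, 400))  # {'energy_wh': 4.5, 'emissions_g': 1.8}
```

The same two inputs also show why deployment region matters: on a cleaner grid (say, 50 gCO2e/kWh) the identical query produces roughly an eighth of the emissions.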
Against this backdrop, together with former Salesforce Sustainability Lead Boris Gamazaychikov, she is launching the Sustainable AI Group. The group's task is to help businesses understand which levers truly reduce harm: model choice, deployment region, compute provider, task type, and electricity source. If a major player starts honestly disclosing such metrics and betting on renewable energy, this could become a competitive advantage rather than a weakness.
What This Means
The environmental discussion around AI is gradually shifting from general fears to measurable questions: how much energy is spent, where exactly models run, which water sources and power grids they rely on, and whether a given class of models is even necessary for a specific task in practice. The next step for the market is to make this data visible to clients and regulators. Without it, sustainable AI will remain a beautiful promise that cannot be verified.