AI can reduce industrial greenhouse‑gas emissions if its energy and process-optimization benefits exceed the emissions from the computing required to run it. Those efficiency gains depend on where the AI operates. When deployed close to the process, AI can act in real time on operating parameters like material inputs, heat, pressure and process flow. When AI is in the cloud, latency reduces the benefits of this real-time optimization.
The Middle East is emerging as a test bed for AI at a national scale, across education, industrial automation, and energy systems, making the region a powerful lens for understanding AI’s benefits and potential climate impact. At a closed‑door dialogue during Abu Dhabi Sustainability Week, Climate Investment (CI) and our partner ADNOC posed a central question: could AI become the first industrial revolution to lower global emissions rather than accelerate them?
Global AI demand is driving a centralized, large-scale data center expansion with insatiable requirements for electricity and cooling. On the current "build capacity at any cost" trajectory, AI risks increasing emissions significantly, consistent with prior industrial revolutions. Recent analysis suggests that AI-driven data center expansion could require more than 300 GW of additional power capacity by 2030.(1)
AI is evolving from chatbots to agentic systems capable of observing operating conditions, generating action plans, and executing multistep tasks. These systems increasingly interact with operational software, industrial controllers, and sensor networks. We heard compelling examples of this shift during our roundtable.
For industrial control, however, AI requires low-latency proximity to physical assets. Real-time, closed-loop optimization depends on sub-second feedback. Heat flux, torque, flow, and pressure evolve in milliseconds. Cloud-based inference models cannot reliably manage these processes due to latency, bandwidth, and data sovereignty constraints.(2) For example, closed-loop kiln temperature control in cement production requires response cycles that cloud inference cannot guarantee.
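The latency argument above can be made concrete with a simple budget check: a closed-loop controller is only viable if one full sense-infer-actuate cycle fits inside the control period. The sketch below uses entirely hypothetical timings (the 100 ms period, the 150 ms cloud round trip) purely for illustration; real plants would measure these values.

```python
# Illustrative latency-budget check for a closed-loop industrial controller.
# All timing figures are hypothetical placeholders, not measured values.

CONTROL_PERIOD_MS = 100  # e.g. a 10 Hz loop on a kiln temperature controller


def loop_is_feasible(sense_ms: float, inference_ms: float,
                     network_rtt_ms: float, actuate_ms: float) -> bool:
    """The loop is feasible only if a full sense -> infer -> actuate
    cycle completes within the control period."""
    total = sense_ms + inference_ms + network_rtt_ms + actuate_ms
    return total <= CONTROL_PERIOD_MS


# Edge deployment: inference runs next to the asset, negligible network hop.
edge_ok = loop_is_feasible(sense_ms=5, inference_ms=20,
                           network_rtt_ms=1, actuate_ms=10)

# Cloud deployment: same model, but a ~150 ms round trip to a remote region.
cloud_ok = loop_is_feasible(sense_ms=5, inference_ms=20,
                            network_rtt_ms=150, actuate_ms=10)

print(edge_ok, cloud_ok)  # the edge loop fits the budget; the cloud loop misses it
```

The point is structural, not numerical: no amount of model quality recovers a deadline already consumed by the network round trip, which is why on-site inference is a functional requirement for these loops.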
Further, many hard-to-abate sectors operate in remote or intermittently connected environments. Mines, refineries, chemical plants, offshore platforms, and vessels cannot depend exclusively on cloud connectivity. This makes on-premises inference a functional requirement rather than an optimization preference.(3) Offshore platforms, for example, rely on on-site control systems because satellite latency is incompatible with real-time, safety-critical operations.
The shift is therefore architectural. AI will move from "intelligence delivered as a cloud service" to "intelligence embedded in the physical fabric of industry." Asset-specific data remains local and protected, and the value of this data grows over time. On-site models are also smaller, more efficient, and require significantly less power.
These dynamics create two competing emissions curves. One reflects rising emissions from expanding compute infrastructure. The other represents avoided emissions from improved efficiency, reduced waste, and optimized industrial processes. AI reduces emissions only if the second curve overtakes the first.
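The crossover between the two curves can be stated as a simple inequality: AI is net climate-positive in a given year only when avoided emissions exceed compute emissions. The sketch below uses made-up placeholder figures (not forecasts) to show how such a crossover would be computed.

```python
# Stylized comparison of the two emissions curves described above.
# All figures are invented placeholders for illustration, not projections.


def compute_emissions(year: int) -> float:
    """Mt CO2e per year from expanding compute infrastructure:
    grows geometrically with the buildout."""
    return 10 * (1.3 ** (year - 2025))


def avoided_emissions(year: int) -> float:
    """Mt CO2e per year avoided through process optimization:
    compounds as industrial deployments scale."""
    return 4 * (year - 2024) ** 2


for year in range(2025, 2031):
    net = compute_emissions(year) - avoided_emissions(year)
    status = "net reduction" if net < 0 else "net increase"
    print(year, round(net, 1), status)
```

Under these placeholder assumptions the avoided-emissions curve overtakes compute emissions after the first year; the real question for investors is which deployment architectures pull that crossover point earlier.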
For CI, the convergence of agentic AI, edge execution, and industrial decarbonization defines a new investment aperture. Industrial optimization requires intelligence within the process. Progress occurs when AI can adjust a kiln’s temperature profile, tune a compressor load, reduce flaring, optimize thermal cycles, or coordinate a fleet under real-world, real-time operational constraints.
This points to clear priorities for disciplined capital investment.
CI supports the data center buildout that enables new AI-driven industrial and emissions-reducing solutions. However, smaller inference models operating at the edge will determine whether this industrial revolution realizes its decarbonization potential and becomes the first to reduce global carbon emissions.(4)
(1) McKinsey & Company, The Cost of Compute: A $7 Trillion Race to Scale Data Centers (2024). Estimates that AI-driven data center expansion could require >300 GW of new power capacity by 2030 and trillions in global investment.
(2) IEEE Industrial Electronics Society; ACM Edge Computing; MIT CSAIL. Research on latency, edge inference, and autonomous industrial systems.
(3) McKinsey Global Institute, Artificial Intelligence and the Future of Industrial Efficiency. Analysis of AI deployment in hard-to-abate sectors.
(4) International Energy Agency, Electricity 2024 and Data Centers and Data Transmission Networks. Projections on global data center electricity demand driven by AI, estimating usage to more than double to around 945 TWh by 2030.
Disclaimer
This publication contains certain forward-looking statements – that is, statements related to future, not past events and circumstances – which may relate to the ambitions, aims, targets, plans and objectives of OGCI Climate Investments LLP or its subsidiaries ("CI") and/or its member companies. These use expressions such as "accelerate", "advance", "aim", "ambition", "commit", "expect", "plans", "strive", "target" and "will" or similar expressions intended to identify such forward-looking statements. Forward-looking statements involve risk and uncertainty because they relate to events and depend on circumstances that will or may occur in the future and are outside of the control of CI and/or its member companies. Actual results or outcomes may differ from those expressed in such statements, depending on a variety of factors. CI does not undertake to publicly update or revise these forward-looking statements, even if experience or future changes make it clear that the projected performance, conditions or events expressed or implied therein will not be realized. © 2026, OGCI Climate Investments LLP. All rights reserved.