OpenAI, the maker of ChatGPT, and AMD have signed a multi-year deal under which AMD will supply the chips powering OpenAI's future AI systems. As part of the deal, OpenAI receives warrants to buy up to 10% of AMD's shares (about 160 million shares) at a nominal exercise price. The warrants vest in stages, and only as specific performance and deployment milestones are met.
OpenAI plans to bring an initial 1 gigawatt of computing capacity online using AMD's new Instinct MI450 chips in the second half of 2026. Over time, the deployment could grow to as much as 6 gigawatts of AI computing power.
The move shows OpenAI’s plan to reduce its heavy dependence on Nvidia. Nvidia remains an important partner, as it has already agreed to provide up to 10 gigawatts of computing power under its own deal with OpenAI. The AMD agreement is not exclusive, which means OpenAI can still work with other chip makers in the future.
AMD CEO Lisa Su noted in an interview:
“You need partnerships like this that really bring the ecosystem together to ensure that, you know, we can really get the best technologies, you know, out there…So we’re super excited about the opportunities here.”
Numbers That Matter: The $100B Power Play Behind OpenAI’s AI Engine
Experts believe the AMD–OpenAI deal could bring AMD tens of billions of dollars in new revenue each year, and more than $100 billion in new revenue for AMD and its customers over four years.
After the announcement, AMD's stock jumped more than 30% in trading. Nvidia's shares, on the other hand, dipped slightly as investors weighed new competition in the AI chip market.
AMD currently has about 1.62 billion shares outstanding. The warrants granted to OpenAI vest only if AMD hits specific stock-price and performance milestones, with the final tranche requiring AMD's stock to reach $600 per share. These terms show how large the partnership could become and how much confidence investors now place in AMD's growing role in AI hardware.
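As a quick sanity check, the roughly 10% figure falls straight out of the two share counts reported above. A minimal sketch in Python; both inputs are the article's own figures, not fresh data:

```python
# Sanity check: what stake do the warrants represent?
warrant_shares = 160e6        # warrants cover about 160 million shares
shares_outstanding = 1.62e9   # AMD's roughly 1.62 billion shares outstanding

stake = warrant_shares / shares_outstanding
print(f"Potential OpenAI stake: {stake:.1%}")  # ~9.9%, i.e. roughly 10%
```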
Chip Chess: AMD, Nvidia, and OpenAI’s Strategic Power Moves
Nvidia's earlier deal with OpenAI included up to 10 gigawatts of computing systems. The new AMD partnership doesn't replace Nvidia; it expands OpenAI's supply options. The rollout is expected to span several years, with the first systems planned for 2026.
- READ MORE: NVIDIA Stock Surges on $100B OpenAI and $5B Intel Deals: Driving Sustainable AI Computing
However, there are risks. AMD must prove that its chips can perform as well as Nvidia’s in speed, power efficiency, and reliability. There are also challenges in scaling up production, securing parts, and meeting OpenAI’s demanding timelines.
The warrants are split into parts (“tranches”) tied to both AMD’s stock performance and the rollout of AI systems. That means OpenAI’s potential ownership depends on how well AMD performs.
This deal impacts each of the companies involved:
- OpenAI gains a second major chip supplier, reducing its risk of relying on one company. It also strengthens ties with AMD through possible ownership, helping it expand its AI computing capacity over time.
- AMD earns a major boost in reputation and a long-term client in OpenAI. The deal supports AMD’s AI growth strategy and could help it compete with Nvidia. But it also adds pressure to meet production goals, manage costs, and hit strict performance targets.
- Nvidia faces stronger competition in the AI chip space. This could affect its prices and profit margins over time. To stay ahead, Nvidia will likely focus on improving chip efficiency, system integration, and value-added services while monitoring demand shifts between itself and AMD.
RELATED: TSMC Dominates AI Chip Market with Record Sales—But Can It Tackle Its Rising Emissions?
The Carbon Cost of Intelligence: AI’s Growing Energy Appetite
While this deal is a big step in business and technology, it also raises environmental, social, and governance (ESG) concerns — especially around power use and emissions.
Wired for Power: How 6 Gigawatts Could Change AI’s Footprint
AI data centers use huge amounts of electricity. The International Energy Agency (IEA) says power demand from global data centers could more than double by 2030, reaching around 945 terawatt-hours — about the same as Japan’s total power use today. In developed countries, data centers could drive over 20% of all electricity demand growth.
Deloitte estimates that in 2025, data centers will use around 536 terawatt-hours of power — about 2% of the world’s total. By 2030, this could exceed 1,000 terawatt-hours.
Some studies suggest AI systems alone could account for nearly half of all data center energy use by late 2025, drawing about 23 gigawatts of power, comparable to the total electricity demand of entire countries.
If global AI hardware demand hits between 5.3 and 9.4 gigawatts in 2025, total energy use could reach 46 to 82 terawatt-hours, similar to the annual electricity consumption of Switzerland or Finland. On that scale, OpenAI's planned 6-gigawatt deployment with AMD could by itself rival the entire projected 2025 AI hardware footprint, depending on how efficiently it runs.
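The conversion behind those terawatt-hour estimates is simple arithmetic: a continuous load multiplied by the roughly 8,760 hours in a year. A minimal sketch, assuming full utilization (real fleets run below 100%, so these are upper bounds):

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_twh(gigawatts: float, utilization: float = 1.0) -> float:
    """Annual energy (TWh) for a load of `gigawatts` running at `utilization`."""
    return gigawatts * HOURS_PER_YEAR * utilization / 1000.0

print(annual_twh(5.3))  # ~46 TWh: low end of the 2025 estimate
print(annual_twh(9.4))  # ~82 TWh: high end
print(annual_twh(6.0))  # ~53 TWh: OpenAI's full 6 GW AMD build-out at full load
```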
A single high-end training node with eight GPUs can draw up to 8.4 kilowatts of power when training AI models like ChatGPT. Scaled across thousands of nodes, total power use becomes massive.
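Scaling that single-node figure up makes the point concrete. The 10,000-node cluster below is a hypothetical chosen for illustration, not a number from the deal:

```python
NODE_POWER_KW = 8.4  # peak draw of one eight-GPU training node, per the figure above

def cluster_power_mw(num_nodes: int) -> float:
    """Total peak power (MW) for a cluster of identical training nodes."""
    return num_nodes * NODE_POWER_KW / 1000.0

# A hypothetical 10,000-node training cluster, for scale:
print(f"{cluster_power_mw(10_000):.0f} MW")  # ~84 MW, before cooling and networking overhead
```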
- INTERESTING READ: ChatGPT, Gemini, and DeepSeek Are on an AI Race – But at What Climate Cost? A Comparison
Silicon and Sustainability: The Hidden Cost of Making AI Chips
AI chips also affect the environment during manufacturing. Producing GPUs requires mining rare minerals, refining metals, and making semiconductors — all of which use a lot of energy and create waste.
Studies show that while power use has the largest climate impact, making the chips themselves also causes issues like mineral depletion, water pollution, and toxic waste. Some estimates say training advanced AI models can use up to 4,600 times more energy than older machine-learning systems.
If AI adoption continues to grow quickly, its total electricity use could increase 24 times by 2030. Because of this, researchers and companies are exploring ways to make AI more energy-efficient.
Smaller and optimized models can cut energy use by nearly 28% without much loss in accuracy. Streamlining data and removing extra model layers can lower energy needs by more than 90% in some cases.
By one estimate, using more efficient AI models in the U.S. could save about 16.25 terawatt-hours of power in 2025, roughly what two nuclear plants produce in a year. By 2028, the savings could reach 41.8 terawatt-hours, equal to about seven plants. Figures like these show how model choice alone can substantially cut the energy use of data centers and make AI more sustainable.
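Plant-count equivalences like these depend on assumed reactor size and uptime. A rough sketch, assuming a typical large reactor of about 1 gigawatt running at a 90% capacity factor (both assumptions, not figures from the study):

```python
REACTOR_GW = 1.0        # assumed nameplate capacity of a typical large reactor
CAPACITY_FACTOR = 0.9   # assumed uptime; the US nuclear fleet averages around 90%
HOURS_PER_YEAR = 24 * 365

plant_twh = REACTOR_GW * CAPACITY_FACTOR * HOURS_PER_YEAR / 1000.0  # ~7.9 TWh/year

print(16.25 / plant_twh)  # ~2.1 plants, matching the 2025 estimate
print(41.8 / plant_twh)   # ~5.3 plants; "seven" implies smaller or less utilized plants
```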
Greening the Grid: Can AMD, Nvidia, and OpenAI Align AI with ESG?
From an ESG standpoint, the AMD–OpenAI deal puts pressure on all three companies — OpenAI, AMD, and Nvidia — to act responsibly as AI expands. They are expected to:
- Disclose how much energy and emissions come from their AI systems.
- Use renewable energy or carbon offsets to power their data centers.
- Build strong governance rules to ensure fairness, privacy, and transparency in AI use.
- Be accountable to investors, regulators, and the public about their environmental and social impacts.
Some experts recommend that companies fully integrate ESG principles into AI projects — assessing environmental and social risks early, applying strong oversight, and aligning goals with long-term sustainability.
The AMD–OpenAI deal marks a new chapter in the AI hardware race. It could reshape how computing power is built, supplied, and shared among tech leaders. But as AI infrastructure grows, so will its energy demands. Balancing performance with sustainability will be one of the biggest challenges facing big tech in the years ahead.