World Bank’s $200M Bond Links Carbon Credits and Clean Cookstoves in Ghana

The World Bank has settled a US$200 million Clean Cooking Outcome Bond to support clean cooking in Ghana. This bond was priced in early December 2025 and is set to mature in March 2032. It aims to make cleaner cooking accessible to more than one million people in Ghana.

The funding will help distribute hundreds of thousands of improved cookstoves. These stoves reduce the need for wood and charcoal fuel. This bond is part of an emerging class of “outcome-linked” financial products. It ties investor returns partly to measurable development outcomes.

In this case, the outcomes are better health, fewer emissions, and greater access to modern cooking systems. The deal attracted over ten global investors, which shows strong demand for sustainable finance.

How the Clean Cooking Outcome Bond Works

The Clean Cooking Outcome Bond is fully principal-protected and is issued by the World Bank’s International Bank for Reconstruction and Development (IBRD). It is structured with a fixed return plus a variable component linked to the success of the Ghana clean cooking project.

It is expected to mobilize about $30.5 million of private capital. Under the bond terms, money that would usually go to investors is instead “frontloaded” to support clean cooking projects in Ghana.

Standard Chartered Bank provides this frontloaded amount through a hedge transaction. These funds are then used to distribute cookstoves through UpEnergy, a renewable energy project developer. At a glance, the initiative has the following attributes.

[Chart: World Bank carbon credit bond, Ghana]

Investors can also earn additional returns based on carbon credits generated by the use of cleaner cookstoves. Experts expect these credits to qualify as Internationally Transferred Mitigation Outcomes (ITMOs) under Article 6 of the Paris Agreement. Countries can trade or count ITMOs, which are verified emissions reductions, toward their national climate goals.

The variable return tied to carbon credits makes this bond different from regular World Bank bonds of similar maturity. Investors accept a lower fixed return in exchange for potential earnings from the carbon credit component. This structure aligns financial incentives with measurable environmental outputs.
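To illustrate the mechanics, here is a minimal sketch of how an outcome-linked payoff can be structured. All numbers are hypothetical; the bond’s actual coupon, credit pricing, and any caps are set in the deal documents and are not public in this detail.

```python
# Illustrative outcome-linked bond payoff (all figures hypothetical).
# Investors accept a reduced fixed coupon; a variable component is paid
# only to the extent the project generates verified carbon credits.

def outcome_bond_return(principal, fixed_rate, credits_issued,
                        price_per_credit, variable_cap):
    """Annual return = reduced fixed coupon + capped carbon-credit component."""
    fixed_coupon = principal * fixed_rate
    variable = min(credits_issued * price_per_credit, variable_cap)
    return fixed_coupon + variable

# Hypothetical holding: $1M principal, 2% reduced coupon, 50,000 attributable
# credits at $10 each, with the variable payout capped at $40,000.
print(outcome_bond_return(1_000_000, 0.02, 50_000, 10.0, 40_000))  # 60000.0
```

If the project underdelivers credits, the variable term shrinks toward zero while the principal remains protected at maturity.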

Kris Atkinson, Fixed Income Portfolio Manager at Fidelity International, remarked:

“We are delighted to support the World Bank’s new Clean Cooking Outcome Bond through our sustainable fixed income fund range, including our flagship Climate Transition and Social Bond strategies. This innovative, outcome-linked structure aligns with our commitment to mobilising capital towards impactful sustainable solutions, helping expand access to cleaner cooking technologies for households in Ghana while contributing to meaningful emissions reductions.”

What $200 Million Delivers on the Ground in Ghana

The bond will raise funds to help distribute electric cookstoves and better charcoal cookstoves throughout Ghana. From 2025 to 2028, the country will roll out about 415,000 cleaner cooking devices.

The project is planned to make clean cooking accessible to about 1.3 million people. It will replace inefficient traditional cooking methods with modern solutions.

The distribution includes:

  • Electric cookstoves for homes with grid access.
  • Improved biomass cookstoves for homes without reliable electricity.

Traditional cooking methods — often using fuelwood and charcoal — are still widespread in Ghana. Around 75% of Ghanaians, or about 26 million people, continue to rely on solid biomass fuels for daily cooking. Wood accounts for roughly 31% of household cooking fuel use, and charcoal for about 23%.

Household air pollution from traditional stoves contributes to serious health problems. In Ghana, researchers link air pollution to an estimated 28,000 premature deaths each year, mainly among women and children.

Cleaner cookstoves can significantly reduce smoke and toxic emissions in homes, lowering health risks for families.

Why Clean Cooking Matters: Health, Climate, and Livelihoods

Access to clean cooking technologies remains low in many parts of the world. More than 2 billion people globally still use traditional biomass fuels like wood and charcoal for cooking. This creates health, gender, and environmental challenges, especially in Sub-Saharan Africa.

[Chart: People gaining access to clean cooking, by region]
Source: IEA

In Ghana, only about 23% of households cook with clean fuels or technologies. Most still use polluting methods. Access varies widely between urban and rural areas, with rural households far less likely to use clean cooking fuels. Household air pollution is a major contributor to respiratory illnesses and other diseases.

The lack of modern cookstoves also affects women and girls disproportionately. They often spend significant time collecting firewood or charcoal, which reduces the time available for education and paid work. Cleaner cookstoves can help reduce these burdens and improve the quality of life.

Cleaner cooking also has climate benefits. Traditional biomass cooking contributes to deforestation and greenhouse gas emissions.

Ghana can reduce its reliance on fuelwood by using more electric and high-efficiency stoves. This change will help lower emissions and lessen deforestation pressures. The carbon credits generated through the project show how local environmental improvements can align with global climate efforts.

Why Global Investors Backed This Bond

The Clean Cooking Outcome Bond attracted a broad group of investors from around the world. Participation came from North America, Europe, and regions including Africa, Asia, and the Pacific. This geographic mix reflects growing global interest in innovative, sustainable finance products.

Notable investors included major firms like Mackenzie Investments, Nuveen, and Rathbones. Others were RBC BlueBay Asset Management, Skandia, Velliv, Fidelity, and Legal & General. ZEP-RE stood out as the first African investor in a World Bank outcome bond.

Investors highlighted the strength of the bond’s verification process and its comprehensive impact reporting. Many pointed to the bond’s mix of principal protection and clear results as the main reasons to participate.

The transaction also demonstrated how Article 6 of the Paris Agreement can channel private finance toward large-scale climate mitigation outcomes.

The World Bank has now issued six outcome bonds, with more than 25 investors participating across those instruments. This growing track record shows rising interest in outcome-linked financing, especially in areas that deliver both social and environmental benefits.

Clean Cooking Meets Carbon Markets

Investment in clean cooking has been rising as awareness of health, environmental, and social impacts grows. In Africa, clean cooking investment reached around US$675 million in 2023, the highest recorded to date. This represented roughly 10% year-on-year growth. LPG infrastructure and related equipment made up the majority of these investments.

Progress toward universal access to clean cooking remains uneven. Each year, around 13 million people in sub-Saharan Africa gain access to clean cooking solutions. This rise comes from better policies and new technologies. However, this pace must increase to meet global targets for access by 2030.

A Blueprint for Scaling Clean Cooking Finance

The World Bank’s US$200 million Clean Cooking Outcome Bond illustrates how new financial tools can support sustainable development. By connecting investor returns to clear climate and social outcomes, it draws private capital to areas that traditional financing often misses.

The project in Ghana could serve as a model for similar outcome-linked investments elsewhere. If successful, it might help close the US$8 billion annual funding gap for achieving universal access to clean cooking by 2030, a target set by the Clean Cooking Alliance and other global partners.

[Chart: Annual investment needed for clean cooking by 2030]
Source: IEA

The bond’s success highlights how sustainable finance and carbon credit innovations can deliver measurable change, connecting investor capital with real-world solutions to problems that affect millions of lives.

Mark Carney Admits Canada Will Miss 2030 and 2035 Climate Targets as Policy Rollbacks Slow Progress

Canada’s climate strategy reached a turning point this week. Prime Minister Mark Carney openly acknowledged that the country will not meet its greenhouse gas emissions targets for 2030 and 2035 under current policies. The admission marked a rare moment of candor from a government that has already rolled back several flagship climate measures since taking office in March.

Speaking to Radio-Canada, Carney said it was “clear” Canada would fall short of its goals unless policies change. The statement landed heavily because Canada’s climate targets are not aspirational. They are written into law and closely watched by investors, provinces, and international partners.

At the same time, the government continues to defend its long-term strategy. Officials argue that energy investments, industrial decarbonization, and revised methane rules will eventually bend the emissions curve downward. However, recent data, audits, and projections suggest the gap between ambition and reality remains wide.

Canada’s 1% Emission Cuts Fall Short of Net-Zero Goals

Canada committed in 2021 to cut emissions 40% to 45% below 2005 levels by 2030. It also pledged to reach a net-zero electricity grid by 2035. Later, the country strengthened its longer-term ambition, submitting a nationally determined contribution (NDC) in February 2025 that raised the 2035 reduction goal to 45%–50%.

Yet progress has been slow. Between 2005 and 2023, Canada reduced emissions by only about 8.5%. That translates to about a 1% annual reduction. While progress is real, it falls far short of what climate science demands.

  • A global net-zero-by-2050 pathway requires annual emissions cuts closer to 4%. At the current pace, Canada simply cannot close the gap by 2030 or 2035.
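A quick back-of-envelope calculation, sketched below, shows why the gap is so hard to close: starting from an 8.5% cut by 2023, reaching 40% below 2005 levels by 2030 requires annual reductions far above the historical pace. The compounding assumption and figures are illustrative, not an official projection.

```python
# Back-of-envelope: compound annual reduction rate needed from 2023
# to reach a 40% cut below 2005 levels by 2030 (illustrative assumptions).

def required_annual_cut(cut_so_far, target_cut, years_left):
    # Fraction of the 2023 emissions level that may remain in 2030:
    remaining_ratio = (1 - target_cut) / (1 - cut_so_far)
    # Convert to a constant annual reduction rate over the remaining years.
    return 1 - remaining_ratio ** (1 / years_left)

# 8.5% achieved between 2005 and 2023; seven years left to 2030.
print(f"{required_annual_cut(0.085, 0.40, 7):.1%}")  # 5.9%
```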

High Emissions, Heavy Footprint

Canada ranked 11th worldwide in total emissions in 2022, contributing about 1.4% of the global total.

[Chart: Canada emissions]
Source: Climate Change Tracker

Yet per capita emissions tell a different story. Canadians emit between 14 and 20.5 tonnes of CO₂-equivalent per person each year, depending on the metric used. That places Canada among the highest per capita emitters in the OECD.

This imbalance matters. While Canada’s global share may seem modest, its emissions intensity remains extremely high. When combined with other mid-sized emitters, Canada’s footprint contributes meaningfully to global warming.

Additionally, under the policies currently in place, emissions are expected to fall only about 21% below 2005 levels by 2030. In short, Canada is moving in the right direction, but far too slowly.

[Chart: Canada emissions projections]
Source: Canadian Government

Policy Reversals Shake Climate Confidence

Since taking office, Carney’s minority government has shifted course on several climate policies. Most notably, it eliminated the consumer-facing carbon tax on fuels in April 2025. Canadians no longer pay a carbon charge at the pump or on home heating fuels.

The government also scrapped a proposed emissions cap for the oil and gas sector. These moves were framed as part of a broader effort to turn Canada into an “energy superpower,” especially as trade tensions with the United States intensified under President Donald Trump.

However, the policy pivot triggered political fallout. Steven Guilbeault, who served as environment minister under former Prime Minister Justin Trudeau, resigned after Ottawa reached a deal with Alberta that relaxed certain climate rules to support pipeline development. In his resignation statement, Guilbeault warned that environmental priorities were being sidelined.

Meanwhile, current Environment Minister Julie Dabrusin struck a more optimistic tone. She said Canada’s 2030 and 2035 targets remain achievable, directly contradicting Carney’s public assessment. The mixed messaging has added uncertainty to Canada’s climate outlook.

Fresh Methane Rules: Strong Signal, Longer Timeline

One area where the government doubled down is methane regulation. Recently, Ottawa announced new rules aimed at cutting oil and gas methane emissions by 75% by 2035. Carney’s government extended the deadline by five years, offering companies more time to comply. It also fulfills a promise Carney made during the election campaign to strengthen existing methane standards.

The press release from the Government of Canada projects that from 2028, when the new regulations take effect, through 2040, the measures will cut a total of 304 million tonnes of CO₂-equivalent and 1,593 kilotonnes of volatile organic compounds (VOCs).

  • These reductions are expected to prevent around $36.3 billion in climate-related damages and deliver $257 million in health benefits to Canadians.

Methane may not stay in the atmosphere as long as carbon dioxide, but it packs a powerful punch. Over 20 years, methane can trap roughly 80 times more heat than CO₂. Oil and gas facilities account for about half of Canada’s methane emissions, largely due to venting, flaring, and leaks across production infrastructure.

Thus, stronger methane controls could deliver fast climate benefits.
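The arithmetic behind that punch is simple: converting methane to CO₂-equivalent using the 20-year global warming potential cited above (roughly 80) makes even modest methane cuts look large in near-term climate terms. The sketch below is illustrative.

```python
# Convert methane reductions to CO2-equivalent over a 20-year horizon,
# using the GWP20 of roughly 80 cited in the article.

def ch4_to_co2e(methane_tonnes, gwp20=80):
    """Tonnes of CO2-equivalent avoided per tonnes of methane cut."""
    return methane_tonnes * gwp20

# Illustrative: cutting 1,000 t of methane avoids ~80,000 t CO2e near-term.
print(ch4_to_co2e(1_000))  # 80000
```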

[Chart: Canada methane emissions]
Source: Climate Change Tracker

Canada’s Carbon Pricing at Crossroads?

Carbon pricing remains a central pillar of Canada’s climate framework, even after recent changes. For large industrial emitters, the Output-Based Pricing System (OBPS) remains operational. Under this system, facilities receive emissions limits based on output. Companies that exceed their limits must pay, while those that perform better earn tradable credits.
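The settlement logic of an output-based system can be sketched in a few lines. The benchmark and carbon price below are placeholders; actual OBPS benchmarks vary by sector and the federal price follows a published schedule.

```python
# Simplified output-based pricing settlement (illustrative numbers only).
# A facility's emissions limit scales with its production; emitting above
# the limit costs money, emitting below it earns tradable credits.

def obps_settlement(emissions_t, output_units, benchmark_t_per_unit, carbon_price):
    """Positive result = compliance cost owed; negative = value of credits earned."""
    allowed = output_units * benchmark_t_per_unit
    excess = emissions_t - allowed
    return excess * carbon_price

# Facility producing 100,000 units with a 1.0 t/unit benchmark at $80/t:
print(obps_settlement(120_000, 100_000, 1.0, 80))  # owes 1600000
print(obps_settlement(90_000, 100_000, 1.0, 80))   # earns credits: -800000
```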

[Chart: Canada carbon price]

The OBPS aims to cut emissions without driving industries offshore. It tries to balance climate ambition with global competitiveness.

However, the removal of the consumer carbon tax shifted the burden almost entirely onto industry. The federal government plans to review the carbon pricing benchmark in 2026. That review could reshape how the OBPS functions and how stringent it becomes.

This moment is critical. Without clear rules and long-term certainty, carbon pricing risks becoming too weak to drive meaningful change. On the other hand, a strong and predictable system could help Canada decarbonize while protecting jobs and investment.

[Chart: Canada carbon credit price]

Is the Window Narrowing for Action?

Carney’s admission removes any illusion that Canada is on track. The challenge now is whether the government uses this moment to reset policy or continues to rely on long-term promises.

New methane rules, industrial carbon pricing, and clean energy investments offer pathways forward. Yet auditors, analysts, and climate institutes agree that current measures lack the scale and speed required.

Canada stands at an economic and environmental crossroads. Stronger climate policy could position the country as a credible player in the global energy transition. Weak or uncertain action, by contrast, risks higher emissions, lost competitiveness, and missed opportunities.

The targets are still on the books. The question is whether Canada is willing to match them with the policies needed to make them real.

Lithium Economics: Why Surge Battery Metals’ Nevada Project Stands Out

Disseminated on behalf of Surge Battery Metals Inc.

The race for North America’s lithium supply is about more than size. It’s about efficiency and economics. In this contest, Surge Battery Metals (TSXV: NILI) shines. Its Nevada North Lithium Project (NNLP) combines strong geology with solid financial metrics. This junior company could build one of the most profitable lithium operations in the U.S.

The latest preliminary economic assessment (PEA) shows Nevada North ranks among the top lithium claystone projects in the Western Hemisphere. It hosts an Inferred Resource of 11.24 Mt LCE at 3,010 ppm Li, the highest grade in the U.S.

Nevada North Lithium Project (NNLP) is the highest-grade U.S. lithium clay asset

[Chart: Surge Battery Metals (NILI) resource description]
Source: Surge Battery Metals

The U.S. Lithium Imperative

As the U.S. aims for energy independence, a domestic lithium supply chain is essential. Demand for lithium carbonate equivalent (LCE) is set to grow fivefold by 2035. Battery gigafactories need a steady, low-cost supply.

Nevada is at the center of this change. It’s home to Tesla’s Gigafactory and has strong mining infrastructure. Surge Battery Metals’ Nevada North Project benefits from easy access to highways, water, and skilled workers. This reduces capital risk and environmental impact.

[Chart: Surge Battery Metals lithium project]
Source: Surge Battery Metals

Breaking Down the Economics

  • Evolution Mining JV Strengthens NNLP’s Tier 1 Potential

In September, Surge signed a JV LOI with Evolution Mining (ASX: EVN) for the Nevada North Lithium Project. Under the deal, Surge holds 77% and Evolution 23%, with Evolution funding up to C$10 million for the Preliminary Feasibility Study to earn up to 32.5%.

Following this, on December 9, the company announced receiving C$3 million in initial funding. The payment was made under NNL’s updated operating agreement. As a result, Evolution’s stake in the joint venture rose to 25.85%, up by 2.85%. And Surge Battery Metals USA Inc. now holds the remaining 74.15%.

Evolution adds 880 acres of private land and 21,000 acres of high-potential ground, expanding NNLP’s footprint. The JV will now focus on advancing the Pre-Feasibility Study, building on the strong 2025 PEA results.

  • Exceptional Profitability Profile

The Preliminary Economic Assessment (PEA) confirmed NNLP’s strong potential with a 42-year mine life, 82% recovery rate, and average annual output of 86,000 t/y LCE, peaking at 109,000 t/y. The project’s after-tax NPV₈ reached about US$9.2 billion with a 23% IRR — impressive for an early-stage asset.

Even at lower prices of US$20,000/t LCE, it stays profitable. The 22.8% IRR shows efficient capital use and a quick payback for investors.
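For readers unfamiliar with the headline metric, the sketch below shows how an after-tax NPV at an 8% discount rate is computed in principle. The cash flows are placeholders chosen only to be in the neighborhood of a large, long-life project; they are not Surge’s actual PEA model.

```python
# How an "NPV at 8%" figure is computed in principle (placeholder cash
# flows, not the actual PEA model for Nevada North).

def npv(rate, cashflows):
    """Discount annual cash flows (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical: $1.8B capex up front, then $400M/year of after-tax free
# cash flow over 40 years of operations.
flows = [-1_800_000_000] + [400_000_000] * 40
print(f"NPV8 = ${npv(0.08, flows) / 1e9:.1f}B")  # NPV8 = $3.0B
```

The IRR is simply the discount rate at which this sum crosses zero; a 23% IRR means the cash flows would still break even under a far steeper discount.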

Surge also secured an expanded BLM exploration permit, increasing its allowable disturbance from 5 to 250 acres. This approval enables its largest drill program yet, targeting an upgrade of most resources to measured and indicated by mid-2026.

[Chart: NILI profitability]
Source: Surge Battery Metals

These figures highlight the project’s resilience, even with conservative pricing. Simply put, it competes well with established lithium producers.

The low operating cost of US$5,097/t LCE sets NILI apart. Most North American peers operate at US$8,000–10,000/t due to higher energy costs. Another significant advantage comes from its mineralogy and process design: its lithium-bearing clays allow high recoveries with simple leaching, cutting reagent use and waste.

[Chart: lithium production]

  • Long-Life Asset

Preliminary models suggest a mine life spanning decades based on only part of the drilled area. Further exploration could expand the resource, boosting longevity and cash flow. Long-life assets are valuable in a volatile market. A project that keeps low costs for years can deliver value through price cycles.

  • Capital Efficiency

Unlike expensive spodumene or brine projects needing billions upfront, NILI’s Nevada North benefits from its U.S. location and modular setup. With easy transport access and favorable permitting, capital needs are lower.

Capital expenditures are expected to stay below US$2 billion. This eases financing through partnerships, grants, or U.S. Department of Energy incentives for domestic lithium production.

Why Economics Matter More Than Ever

The lithium market is maturing. Not every discovery will become a mine. Investors now seek projects that thrive in weak markets. And Nevada North meets this demand.

  • By 2035, global lithium demand is expected to hit 3.56 million metric tonnes, while supply will reach only 3.16 million tonnes.

[Chart: lithium supply and demand]

Its strong economics show that Surge Battery Metals aims to build a profitable, sustainable operation—not chase hype. Pilot testing and metallurgical studies confirm scalable processing methods. These programs help turn PEA numbers into reality and reduce project risk.

Sustainability and Strategic Value

NILI aligns with ESG goals. The company plans to use renewable energy for processing and recycle water to limit environmental impact. Unlike South American brine mines facing water issues, or hard-rock projects relying on carbon-heavy crushing, Nevada North uses a low-impact clay-based process. This supports U.S. sustainability and clean energy goals.

Clean Chemistry, High Potential

NNLP sits in an ancient volcanic basin similar to the McDermitt Caldera that hosts Thacker Pass. Over time, rhyolitic ash settled into a lake, forming lithium-rich clay beds now preserved near the surface, ideal for low-cost open-pit mining.

Crucially, NNLP’s clean chemistry sets it apart. Its clays contain low magnesium (2–4%) and less hectorite, making acid leaching easier, reagent use lower, and processing simpler. Early tests already produced battery-grade lithium carbonate without complex steps, proving strong scalability.

With grades topping 3,000 ppm, well above most clay projects, NNLP naturally fits into the lowest-cost tier once in production.

Driving Real Shareholder Value

Surge Battery Metals offers investors a U.S.-based Tier 1 lithium project with great potential. By 2030, global lithium demand is expected to hit 2.4 million tonnes of LCE, nearly four times the current amount. As electric vehicles, grid storage, and clean infrastructure grow, more lithium projects will be needed to meet this demand.

As government policies reshape the landscape quickly, U.S.-backed projects like Nevada North are well positioned for gains.

Share Structure

[Chart: NILI share structure]
Source: Surge Battery Metals

For Surge Battery Metals, Nevada North is more than a mine – it’s a value engine. Surge is debt-free and had approximately C$6–7 million in cash as of October 2025, providing a 12-month runway at its current burn rate of around C$300,000 per month.

[Chart: NILI cash position]

Investors in lithium often face two extremes: overpriced speculations and undervalued strong assets. NILI falls into the second group. Nevada North may be key to America’s lithium independence. If it keeps growing and uses U.S. critical mineral policies, its impact could be significant.

Looking Ahead

The next few months are crucial. Surge Battery Metals will drill more to expand high-grade zones, update its resource, and start prefeasibility studies. Each step will enhance the project’s world-class potential.

Surge Battery Metals stands out among lithium juniors. While many face weak prices, its economics are strong now and will remain so in the future.

For investors seeking clean energy opportunities, the Nevada North Lithium Project stands out. It offers strong economics and sustainable design. This project isn’t just another venture; it’s a blueprint for the future of lithium mining.

DISCLAIMER 

New Era Publishing Inc. and/or CarbonCredits.com (“We” or “Us”) are not securities dealers or brokers, investment advisers, or financial advisers, and you should not rely on the information herein as investment advice. Surge Battery Metals Inc. (“Company”) made a one-time payment of $50,000 to provide marketing services for a term of two months. None of the owners, members, directors, or employees of New Era Publishing Inc. and/or CarbonCredits.com currently hold, or have any beneficial ownership in, any shares, stocks, or options of the companies mentioned.

This article is informational only and is solely for use by prospective investors in determining whether to seek additional information. It does not constitute an offer to sell or a solicitation of an offer to buy any securities. Examples that we provide of share price increases pertaining to a particular issuer from one referenced date to another represent arbitrarily chosen time periods and are no indication whatsoever of future stock prices for that issuer and are of no predictive value.

Our stock profiles are intended to highlight certain companies for your further investigation; they are not stock recommendations or an offer or sale of the referenced securities. The securities issued by the companies we profile should be considered high-risk; if you do invest despite these warnings, you may lose your entire investment. Please do your own research before investing, including reviewing the companies’ SEDAR+ and SEC filings, press releases, and risk disclosures.

It is our policy that information contained in this profile was provided by the company, extracted from SEDAR+ and SEC filings, company websites, and other publicly available sources. We believe the sources and information are accurate and reliable but we cannot guarantee them.

CAUTIONARY STATEMENT AND FORWARD-LOOKING INFORMATION

Certain statements contained in this news release may constitute “forward-looking information” within the meaning of applicable securities laws. Forward-looking information generally can be identified by words such as “anticipate,” “expect,” “estimate,” “forecast,” “plan,” and similar expressions suggesting future outcomes or events. Forward-looking information is based on current expectations of management; however, it is subject to known and unknown risks, uncertainties, and other factors that may cause actual results to differ materially from those anticipated.

These factors include, without limitation, statements relating to the Company’s exploration and development plans, the potential of its mineral projects, financing activities, regulatory approvals, market conditions, and future objectives. Forward-looking information involves numerous risks and uncertainties and actual results might differ materially from results suggested in any forward-looking information. These risks and uncertainties include, among other things, market volatility, the state of financial markets for the Company’s securities, fluctuations in commodity prices, operational challenges, and changes in business plans.

Forward-looking information is based on several key expectations and assumptions, including, without limitation, that the Company will continue with its stated business objectives and will be able to raise additional capital as required. Although management of the Company has attempted to identify important factors that could cause actual results to differ materially, there may be other factors that cause results not to be as anticipated, estimated, or intended.

There can be no assurance that such forward-looking information will prove to be accurate, as actual results and future events could differ materially. Accordingly, readers should not place undue reliance on forward-looking information. Additional information about risks and uncertainties is contained in the Company’s management’s discussion and analysis and annual information form for the year ended December 31, 2024, copies of which are available on SEDAR+ at www.sedarplus.ca.

The forward-looking information contained herein is expressly qualified in its entirety by this cautionary statement. Forward-looking information reflects management’s current beliefs and is based on information currently available to the Company. The forward-looking information is made as of the date of this news release, and the Company assumes no obligation to update or revise such information to reflect new events or circumstances except as may be required by applicable law.

Tesla Tests Driverless Robotaxis in Austin While Analysts Predict 1 Million Units by 2035, Sending Stocks Up

Tesla (TSLA) is making big progress in testing driverless robotaxis on public roads and attracting attention from analysts and investors. The company started testing its self-driving cars in Austin, Texas, on December 15, with no human safety monitor on board, a milestone Tesla’s leaders said would happen by year’s end. This shift represents a key part of the EV giant’s long-term strategy for autonomous vehicles and future mobility services.

At the same time, Wall Street firms, including Morgan Stanley, are issuing forecasts about Tesla’s robotaxi plans and their potential impact on the company’s future. Analysts calculate the scale of robotaxi fleets and potential valuation effects over the next decade.

These changes have kept Tesla’s stock in the spotlight for investors and the market, even with challenges in electric vehicle sales growth.

Driverless Robotaxis Hit Austin Streets

Tesla (TSLA) began testing its self-driving cars on public roads in Austin, Texas, with no human drivers or safety monitors in the front seats. CEO Elon Musk confirmed that fully driverless tests are happening and sees this as an important step toward commercial operation.

Earlier in 2025, Tesla had already launched a limited robotaxi service in Austin using modified Model Y vehicles. Initially, these vehicles included a human safety monitor in the passenger seat to observe system performance.

Over the months, Tesla grew its service area and fleet size. By December 2025, reports showed about 31 active robotaxis operating in the city.

Recent tests without monitors show progress. However, they are still for internal validation, not for daily commercial use. Tesla confirmed that tests aren’t open to paying customers yet. The company hasn’t provided a specific date for when fully autonomous rides will be available to the public.

The Technology Behind Tesla’s Autonomous Effort

Tesla’s autonomous driving push relies on its Full Self-Driving (FSD) software and onboard sensors. The FSD system manages various driving situations using cameras and neural network processing. This camera-centric approach differs from some competitors that rely on additional sensors such as LiDAR for redundancy.

In June 2025, Tesla shared its Q2 tech update. The company boosted AI training by adding tens of thousands of GPUs at its Gigafactory in Texas. This expansion supports improvements in FSD, where the company reported its first autonomous delivery. A Model Y drove itself without human help for 30 minutes.

Vehicles with FSD software need regulatory approval to drive on their own. In the Austin pilot, removing physical safety monitors marks progress toward that goal. Achieving fully reliable, unsupervised autonomy is still a challenge. This is true, especially when it comes to safety standards and different road conditions.

Wall Street Eyes Tesla’s Robotaxi Potential, Sending Stock Near Record Highs

Tesla’s autonomous ambitions are closely watched by financial analysts. Morgan Stanley recently shared forecasts suggesting Tesla could greatly expand its robotaxi presence over the next 10 years.

The bank says Tesla might have 1 million robotaxis on the road by 2035. These will operate in various cities as part of its autonomous fleet plan.

Morgan Stanley’s analysis sees active robotaxi units growing in 2026. However, the first fleets will be small compared to the long-term plan. The forecasts show the possible size of the autonomous vehicle market. They also highlight Tesla’s role in this growth. However, there are uncertainties tied to technology and regulations.

Stock markets have reacted to these developments. Tesla’s stock price nearly hit record highs. It rose almost 5% during trading sessions. Investors were excited about progress in driverless testing and the promise of future autonomous revenue. Analysts say Tesla’s value might go up more if its autonomous services and AI products perform well.

[Chart: Tesla stock price, December]

Tesla’s Vision for Autonomous Mobility Services

Tesla’s robotaxi initiative fits into its broader vision of mobility services and artificial intelligence (AI)‑driven transport. The company plans to launch purpose-built autonomous vehicles, such as the Cybercab, which will lack traditional controls like steering wheels and pedals. Mass production is targeted for April 2026.

Tesla sees a future where owners can add their cars to a decentralized robotaxi network, boosting fleet availability and usage. If adopted at scale, this strategy could shift parts of Tesla’s revenue profile away from vehicle sales toward recurring service revenues. The global robotaxi market could exceed $45 billion by 2030, as shown below.

robotaxi market 2030
Source: MarketsandMarkets

Analysts say that major technical, regulatory, and safety issues still stand in the way of robotaxis operating widely and making a profit. Building public trust, meeting varied local regulations, and demonstrating consistent safety across different road environments will be key factors in future deployment.

Tesla vs Competitors and Safety Regulations

Tesla is not alone in the autonomous vehicle race. Alphabet’s Waymo has been operating fully autonomous services in multiple cities for several years and continues to expand.

Waymo operates about 2,500 robotaxis across multiple cities, has logged millions of paid autonomous rides, and already meets higher autonomy standards in some regions. In comparison, Tesla operates around 31 robotaxis in Austin, with plans to expand to several major U.S. cities by 2026.

Waymo Robotaxi Fleet and CO₂ Avoidance by City

Tesla chose camera-centric sensing over multi-sensor arrays, a decision that reflects its focus on scalability and cost. Critics and some experts argue that adding LiDAR or other sensors could improve safety and performance under challenging conditions.

Regulators also play an important role. In some states, pilot autonomous driving services are permitted under special testing allowances, but widespread commercial use needs approval from both state and federal agencies to ensure vehicles meet safety and operational standards.

What’s Next for Tesla’s Driverless Fleets

Tesla’s move to test robotaxis without onboard safety monitors in Austin marks a clear technical milestone, though it is not yet a commercial service. The company’s next steps will likely focus on scaling test fleets, improving software robustness, and navigating regulatory approvals to allow expanded operations in other cities in 2026 and beyond.

Morgan Stanley and other analysts think robotaxis might play a big role in Tesla’s growth. They could boost service revenue as traditional vehicle sales slow down. However, forecasts at this stage remain based on long‑range assumptions about adoption, pricing, and regulatory landscapes.

Investor sentiment has been mixed. Stock movements show excitement about tech advances but also worry about short-term vehicle sales and profit pressures in the auto industry.

Overall, Tesla’s autonomous ambitions continue to shape its corporate strategy and public profile. The speed of robotaxi rollout, along with improvements in Full Self-Driving software and AI, will be key to seeing if the company can shift from an EV maker to a driverless mobility platform.

EU Carbon Prices Hit 2025 Highs as 2040 Climate Target Tightens the Market

European carbon prices have risen notably in late 2025, reaching €83.79 per tonne on December 15, up 3% monthly and 30% year-over-year. The price of carbon permits in the EU’s main emissions trading system (EU ETS) recently hit multi-month highs. Companies are getting ready for compliance deadlines, and markets are responding to stricter policy signals.

EU carbon price benchmarks have been testing higher levels, reflecting increased demand and confidence that the emissions cap will tighten. Prices remain below the record highs of over €100 per tonne seen in 2023, but they still send strong policy signals to cut greenhouse gas emissions.

The EU ETS sets a limit on emissions from sectors like power generation and heavy industry. Companies can trade allowances to meet their needs. As the cap is lowered over time, the number of available permits decreases, pushing prices up. This system encourages companies to reduce emissions in cost-effective ways and supports the EU’s climate goals.
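
As a rough illustration of the cap-and-trade arithmetic described above, the sketch below shows how a compliance obligation translates into cost: a firm must surrender one allowance per tonne emitted, so its bill scales with the allowance price. The emitter size is an assumption for illustration; only the prices come from this article.

```python
# Illustrative cap-and-trade compliance arithmetic. The 100,000-tonne
# emitter is an assumed example; the 2025 price is from this article
# and the 2030 price is BNEF's forecast.

def compliance_cost(emissions_t: float, allowance_price_eur: float) -> float:
    """Cost of surrendering one allowance per tonne of CO2e emitted."""
    return emissions_t * allowance_price_eur

emissions = 100_000  # tonnes CO2e per year (assumed emitter)
for year, price in [(2025, 83.79), (2030, 149.0)]:
    cost = compliance_cost(emissions, price)
    print(f"{year}: €{cost:,.0f} to cover {emissions:,} tonnes")
```

The same unabated footprint costs nearly twice as much at the forecast 2030 price, which is exactly the incentive the declining cap is designed to create.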

In 2023, trading in EU ETS allowances reached a record turnover of €783 billion, making it the world’s largest carbon market by traded value.

Higher carbon prices raise the cost of emitting greenhouse gases. This change can affect investment choices and operational practices. These price signals affect other EU climate policies.

EU carbon price Dec 2025
Data from EU ETS auctions (EEX/ICE)

One key policy is the Carbon Border Adjustment Mechanism (CBAM). Starting in 2026, CBAM will apply similar carbon costs to specific imported goods, with certificate purchases for 2026 imports beginning in 2027.

CBAM aims to prevent “carbon leakage,” where production shifts to countries without strong carbon pricing, potentially undercutting emissions reductions in the EU.

Why EU Carbon Prices Are Rising Again

Several factors are driving the upward trend in EU carbon prices. Markets anticipate a tighter supply of carbon allowances as the EU strengthens its emissions caps.

Also, regulatory changes boost price momentum. This includes expanding covered sectors and strengthening the Market Stability Reserve, which absorbs excess allowances.

Research groups say average carbon prices may keep rising as we approach the decade’s end, based on current policies. BNEF forecasts ETS prices at €149 per tonne by 2030.
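
For a sense of what the BNEF figure implies, a quick back-of-the-envelope check computes the compound annual growth rate between the €83.79 December 2025 price quoted above and the €149 forecast for 2030, assuming a five-year horizon between the two points:

```python
# Implied compound annual growth rate (CAGR) between the current
# EU ETS price and BNEF's 2030 forecast. Five-year horizon assumed.
current_price = 83.79   # EUR/tonne, December 2025 (from this article)
forecast_price = 149.0  # EUR/tonne, BNEF 2030 forecast
years = 5

cagr = (forecast_price / current_price) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 12% per year
```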

EU ETS carbon price forecast BNEF

Demand for allowances increases as industries near compliance deadlines. Companies need to surrender enough permits to match their emissions. This compliance buying can push prices higher in the short term.

EU Climate Target: 90% Emissions Cut by 2040

The European Union has made a provisional political agreement on a new climate target for 2040. This comes alongside recent carbon market developments.

Lawmakers from the European Parliament and the Council agreed to reduce net greenhouse gas emissions. They set a binding goal of a 90% cut compared to 1990 levels. This target acts as an intermediate step toward the EU’s long-term aim of climate neutrality by 2050.

The agreed text includes certain flexibility mechanisms. Starting in 2036, member states can use high-quality international carbon credits. These credits can help meet up to 5% of the 2040 target. Yet, strict rules will ensure they support environmental goals. The agreement also confirms a one-year delay in applying the EU ETS to the buildings and road transport sectors.

The new target will guide future legislation across energy, industrial, and climate policy, balancing deep greenhouse gas cuts with competitiveness and social fairness.

Key Elements of the 2040 Target Deal

The provisional agreement reflects several key decisions:

  • A binding target to cut net greenhouse gas emissions by 90 percent by 2040 compared with 1990 levels.
  • Flexibility options to help member states meet the target, including limited use of international carbon credits.
  • Enhanced provisions for domestic permanent carbon removals under the EU ETS.
  • A reinforced mechanism for reviewing progress regularly and proposing adjustments where needed.
  • A delayed start for ETS2, the system covering buildings and road transport, from 2027 to 2028.

These elements aim to guide the EU’s climate policy. They also recognize the economic and social challenges of reducing emissions significantly.

How Carbon Pricing Fits with the 2040 Target

Carbon pricing through the EU ETS remains central to achieving the bloc’s climate goals. The price of allowances directly influences the cost of emitting CO₂-equivalent greenhouse gases. As the emissions cap gets stricter, the EU ETS will cover more emissions and sectors. This will create greater incentives for reducing emissions.

EU climate net zero goal
Source: European Commission

Using international carbon credits in the 2040 framework shows how carbon market tools fit into long-term climate planning. Under the agreement, up to 5 percent of the total emissions cut for 2040 may come from high-quality international carbon credits. This approach gives member states more flexibility while maintaining a strong domestic reduction pathway.
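
To put the 5 percent ceiling in rough tonnage terms, the sketch below applies it to an assumed 1990 baseline. The 4.7 Gt CO₂e figure is an illustrative round number for EU 1990 net emissions, not a figure from this article, so the result should be read only as an order of magnitude:

```python
# Rough sizing of the international-credit flexibility in the 2040 deal.
# The 1990 baseline below is an assumed illustrative figure.
baseline_1990 = 4.7e9               # tonnes CO2e (assumed)
target_cut = 0.90 * baseline_1990   # total reduction required by 2040
credit_ceiling = 0.05 * target_cut  # up to 5% of the cut via credits

print(f"Total 2040 cut: {target_cut / 1e9:.2f} Gt CO2e")
print(f"Credit ceiling: {credit_ceiling / 1e6:.0f} Mt CO2e")
```

Even at 5 percent, the credit allowance would amount to a few hundred million tonnes per year, which is why the deal attaches strict quality rules to it.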

Carbon pricing trends also interact with other EU measures. The CBAM adds carbon costs to some imported products. This pushes trading partners to adopt carbon pricing or emissions reduction policies that meet EU standards.

What Higher Carbon Prices Mean for Industry and Policy

A rising carbon price affects many aspects of the European economy. High-emission industries might face higher costs. This could push them to invest in cleaner technologies.

Meanwhile, policymakers aim to balance climate ambition with economic competitiveness and fairness. Flexibility mechanisms and phased implementation schedules help industries adapt. They also protect jobs and energy security.

The upcoming 2040 climate target will shape future rules, investment choices, and carbon pricing. It will also affect the EU ETS design and climate policies. Regular progress reviews will help the European Commission and member states check performance. They can also change policies if needed.

What Comes Next for EU Carbon Markets

The provisional agreement on a 90 percent emissions cut by 2040 represents a significant milestone in EU climate policy. It provides a long-term signal to markets, investors, and industries about the direction of the region’s climate ambition. At the same time, carbon prices rising to multi-month highs show how market mechanisms can support decarbonization goals.

These developments show how the EU blends regulatory goals with market tools. This mix aims for significant emissions cuts in the bloc. Continued policy implementation, periodic review, and alignment with international carbon market standards will shape how effectively these goals are met in the coming years.

NVIDIA Stock Rebounds as AI Product Launch and Data Center Demand Restore Confidence

NVIDIA shares rebounded to $176.12 (+0.63%) on December 15, 2025, following a mix of product updates, demand signals from China, and revised outlooks from Wall Street analysts. The recovery followed weeks of market ups and downs. These were tied to worries about valuations, export limits, and uncertainty in the tech sector.

The latest movement reflects renewed confidence in NVIDIA’s position in artificial intelligence hardware and software. Investors reacted to the launch of Nemotron 3, a new family of AI models, noted early demand for the H200 data center chip, and weighed updated forecasts suggesting ongoing growth in AI-driven computing.

NVIDIA’s recent updates highlight its role in shaping technology and infrastructure for modern AI. Short-term market swings may still happen, but the company stays focused on innovation.

Nemotron 3 Pushes NVIDIA Deeper Into AI Software

One key driver of the stock rebound was the launch of Nemotron 3, which expands NVIDIA’s growing portfolio of AI software models designed for enterprise use. Nemotron 3 brings a new set of open-source AI models for agentic use cases. These include:

  • Nano: 30B total parameters, 3B active
  • Super: about 100B
  • Ultra: around 500B

The hybrid Mixture-of-Experts (MoE) architecture combines Mamba and Transformer elements. It offers up to 4x more token throughput than Nemotron 2 Nano. With a 1M-token context window, it generates 60% fewer reasoning tokens. This model excels in coding, math, long-context tasks, and multi-agent reasoning, all while lowering inference costs.

Jensen Huang, founder and CEO of NVIDIA, remarked:

“Open innovation is the foundation of AI progress. With Nemotron, we’re transforming advanced AI into an open platform that gives developers the transparency and efficiency they need to build agentic systems at scale.”

The launch reinforces NVIDIA’s shift from being seen only as a chipmaker to a full-stack AI company. Hardware remains central, but software now plays a larger role in driving long-term revenue. Analysts view this approach as a way to create more stable income streams beyond cyclical chip demand.

Nemotron 3 also supports NVIDIA’s ecosystem strategy. The chip company boosts customer reliance by tightly integrating AI models with its chips and platforms.

Nvidia stock price

H200 Chip Demand Shows Strength Despite Constraints

Another factor supporting NVIDIA’s stock was fresh discussion around demand for its H200 data center chip. The H200 is one of NVIDIA’s most advanced AI accelerators, targeting high-performance workloads such as large-scale AI training and inference.

Market signals suggest that demand remains strong, including interest from Chinese cloud and research clients. While U.S. export controls limit the types of chips NVIDIA can sell to China, modified versions of its products continue to find buyers.

Notably, President Trump greenlighted H200 sales to “approved” Chinese customers, with the U.S. government taking a 25% share of the revenue. The move reverses Biden-era restrictions following talks with Xi Jinping, though Senate Democrats labeled it “dangerous” for national security.

The H200 builds on the success of the earlier H100 platform. It offers faster memory and better performance for large models. This matters as AI models grow in size and complexity.

Strong interest in the H200 indicates that global demand for AI infrastructure remains high. Data centers continue to invest heavily as AI adoption spreads across industries. This trend supports NVIDIA’s revenue outlook even as geopolitical risks remain.

Wall Street Forecasts Turn More Balanced

Wall Street analysts also played a role in the stock’s rebound. Several firms updated their forecasts after recent pullbacks in NVIDIA’s stock price. Some analysts were cautious about valuation. Others pointed out strong earnings visibility and market leadership.

Updated forecasts reflect expectations that AI spending will remain a priority for large technology firms, governments, and enterprises. Per McKinsey & Company estimates, AI-related data center demand may reach up to $8 trillion by 2030. And NVIDIA continues to dominate the market for AI accelerators used in data centers.

investments for AI-related data center capacity 2030

Analysts also noted that revenue growth may normalize after years of rapid expansion. However, they still expect NVIDIA to grow faster than most semiconductor peers.

The more balanced tone from Wall Street helped stabilize investor sentiment. Rather than focusing only on risks, forecasts now reflect both strong fundamentals and realistic growth assumptions.

NVIDIA’s Role in the Global AI Buildout

NVIDIA sits at the center of the global AI infrastructure buildout. Its chips power many of the world’s largest AI models. Its software tools support developers, researchers, and enterprises.

As AI adoption grows, demand extends beyond tech companies. Industries such as healthcare, finance, energy, and manufacturing increasingly rely on AI for efficiency and decision-making.

This broad demand base helps reduce reliance on any single sector. It also supports longer-term growth even if consumer technology spending slows.

However, challenges remain. Competition is increasing, and governments are tightening rules on technology exports. Moreover, data center energy use is under scrutiny as AI is expected to send power demand sharply higher. These pressures make sustainability and efficiency more important to NVIDIA’s strategy.

Datacenter growth will drive power demand from 2024 to 2030

Efficiency and Sustainability Take Center Stage

NVIDIA has expanded its focus on sustainability as its technology footprint grows. Data centers powered by AI chips consume large amounts of energy. This creates both environmental and cost concerns.

The company aims to improve energy efficiency across its products. Newer GPUs deliver more performance per watt than earlier generations. This means customers can run larger workloads using less energy.

NVIDIA focuses on clean electricity. The company now uses 100% renewable energy for its offices and data centers. This shift helps cut Scope 1 and 2 emissions and lowers its carbon footprint.

NVIDIA nvda Carbon emissions
Source: NVIDIA

The firm has set science-based targets aligned with limiting global warming to 1.5°C, including a 50% cut in Scope 1 and 2 emissions by FY 2030 (against an FY 2023 baseline). NVIDIA also aims for a 75% cut by 2030 in use-phase emissions intensity per petaflop of computing. That target addresses most of its lifecycle emissions, which come mainly from end-user applications rather than manufacturing.
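
One way to read a per-petaflop intensity target: at constant absolute emissions, a 75% cut in emissions per unit of compute means roughly four times the compute can be delivered. A minimal check of that arithmetic:

```python
# What a 75% cut in emissions intensity (per petaflop) implies:
# at constant absolute emissions, deliverable compute scales by
# 1 / (1 - cut).
intensity_cut = 0.75
compute_multiplier = 1 / (1 - intensity_cut)
print(f"Compute per unit of emissions: {compute_multiplier:.0f}x")  # 4x
```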

NVIDIA also works with data center partners to improve cooling systems and power management. Better design reduces wasted energy and lowers emissions tied to electricity use.

The company also designs AI systems that support climate research. These systems help model weather patterns, climate risks, and energy systems. While indirect, such applications show how AI can support environmental goals.

Sustainability now plays a larger role in how investors evaluate technology firms. NVIDIA’s focus on efficiency and emissions aligns with this shift, even as demand for computing power continues to rise.

What NVIDIA’s Stock Recovery Tells Investors

NVIDIA’s stock rebound reflects a mix of short-term and long-term factors. The Nemotron 3 launch highlights software growth. H200 demand points to continued strength in AI infrastructure. Analyst updates suggest a more stable outlook after rapid gains earlier in the year.

At the same time, the company faces real constraints. Export rules, competition, and energy use remain key risks. Growth may slow compared to recent years, but the scale of AI adoption still supports expansion.

For investors, the latest developments suggest that NVIDIA remains a central player in AI. Market expectations may be more measured, but confidence in its long-term role remains intact.

As AI reshapes industries, NVIDIA’s mix of hardware, software, and sustainability efforts will continue to shape its position in global markets.

Google’s 3,500-Tonne Carbon Removal Deal with Ebb Signals Growing Confidence in Ocean-Based Climate Solutions

Ebb has taken a major step forward in the carbon removal space by signing its first carbon removal offtake agreement with Google. Under this prepurchase deal, Ebb will remove 3,500 tonnes of carbon dioxide from the atmosphere. While the volume is modest, the signal is powerful. It shows growing confidence in ocean-based carbon removal and, more importantly, in Ebb’s strategy of scaling climate solutions through existing industrial infrastructure.

This agreement follows closely on the heels of Ebb’s announcement of a landmark partnership with the Saudi Water Authority (SWA), the world’s largest desalination operator. By deploying its technology across SWA’s facilities, Ebb estimates it could enable up to 85 million tonnes of annual CO₂ removal capacity at full scale.

These two deals, taken together, demonstrate that carbon removal can scale faster and at a lower cost when it is integrated into industrial systems that already operate at massive volumes.

How Ebb’s Ocean Carbon Removal Technology Works

Ebb focuses on ocean alkalinity enhancement, a method that accelerates a natural carbon storage process. In nature, the ocean absorbs carbon dioxide from the air and converts it into bicarbonate, a stable form of carbon that can remain stored in seawater for thousands of years. Over time, this process has already absorbed about 30% of all human-made CO₂ emissions since the Industrial Revolution.

The company speeds up this natural mechanism using an electrochemical system. The result is safe, durable carbon storage that aligns with the ocean’s existing chemistry. Importantly, the company does not rely on building new, standalone facilities. Instead, it integrates directly into desalination plants.

Turning Desalination Waste Into Climate Value

Desalination plants produce fresh water by removing salt from seawater. However, this process also generates large volumes of brine, a highly concentrated salty waste stream. Globally, desalination facilities produce more than 100 million tonnes of brine every day.

Ebb intercepts this brine before it returns to the ocean. The brine then passes through its modular electrochemical system, which converts it into an alkaline solution. Once released back into the ocean, this solution increases the water’s ability to draw CO₂ from the atmosphere.

This approach delivers several benefits at once.

  • First, it enables large-scale carbon removal.
  • Second, it can increase freshwater yield at desalination plants.
  • Third, it produces valuable chemical co-products that can be reused within the plant or sold to other industries.

By transforming waste into multiple revenue streams, Ebb makes carbon removal more economically attractive for its partners.

ebb carbon ocean removal
Source: EBB

Infrastructure Integration Changes the Game

One of the biggest barriers to carbon removal is cost. Building new infrastructure from scratch requires time, capital, and regulatory approvals. Ebb avoids many of these challenges by integrating with existing systems.

Desalination plants process hundreds of millions of tonnes of seawater every day. This scale creates a massive opportunity. According to Ebb, current global desalination capacity could support billions of tonnes of carbon removal per year if fully leveraged.

As a result, integration reduces both deployment costs and operational complexity. It also allows Ebb to scale faster than many other carbon removal pathways. This infrastructure-first model sits at the heart of the company’s partnership with SWA.

Strengthening Google’s Carbon Removal Strategy

Google’s decision to purchase carbon removals from Ebb reflects its broader climate strategy and its commitment to reach net-zero emissions across its operations and value chain by 2030. However, as Google’s business continues to expand, its overall emissions have also moved higher.

  • In 2024, Google’s total ambition-based emissions reached 11.5 million tonnes of CO₂ equivalent, marking a 51% increase compared to 2019.

During the same period, combined Scope 1 and Scope 2 emissions rose sharply, driven largely by the rapid growth of energy-intensive data centers. Meanwhile, Scope 3 emissions from the supply chain remained the largest contributor, totaling 8.4 million tonnes of CO₂ equivalent.

google scope emissions
Source: Google

Google has long backed early-stage carbon removal through offtake agreements. In 2024 alone, it signed 16 new deals worth more than $100 million, covering about 728,300 tonnes of CO₂e. This lifted Google’s total removal portfolio to roughly 782,400 tonnes—a fourteen-fold jump from the previous year.

In this context, the Ebb agreement fits squarely into Google’s strategy. While the company keeps pushing decarbonization, it is also investing in high-quality carbon removal to tackle emissions that remain hard to cut in the near term.

For Google, the deal offers access to durable removals with clear monitoring and storage pathways. For Ebb, it validates a scalable, globally replicable model—highlighting how industrial partnerships can speed up carbon removal at scale.

Beyond Carbon: Creating Additional Value

Ebb’s technology does more than remove carbon. The ocean alkalinity process also produces an acid co-product. Rather than treating this as waste, Ebb is exploring ways to turn it into value.

The company has been working with X, the Moonshot Factory, to explore innovative uses for this acid stream. These efforts underline Ebb’s broader vision: carbon removal should not exist in isolation. Instead, it should support water security, industrial efficiency, and sustainable chemical production.

This multi-benefit approach strengthens the business case and reduces reliance on carbon credit revenue alone. Concisely, through these purchases, Google aims not only to neutralize its own remaining emissions but also to help push promising technologies toward commercial scale.

mCDR: Unlocking the Ocean’s Carbon Removal Potential

The ocean holds the largest accessible carbon reservoir on Earth—over 50 times the pre-industrial atmospheric carbon and 20 times that stored in global plants and soils. It currently absorbs roughly 10 Gt CO₂ per year, about a quarter of human-caused emissions, through natural air-sea gas exchange.

Marine carbon dioxide removal (mCDR) aims to accelerate this process by deliberately transferring CO₂ from the atmosphere to the ocean. So far, 578,000 tonnes of CO₂ removals have been secured via mCDR offtake agreements, though only 0.3% of these credits—mainly from Ocean Alkalinity Enhancement projects—have been formally issued.

While most carbon removal efforts have focused on land, the ocean’s capacity to safely store CO₂ at scale surpasses terrestrial solutions. Ocean methods offer exceptional permanence, with deep storage or bicarbonate forms lasting over 1,000 years. Rigorous monitoring, reporting, and verification (MRV) ensures reliability, aligning with ICVCM-like standards and supporting premium credit pricing.

Marks & Spencer and Schneider Electric Partner to Cut Supply Chain (Scope 3) Emissions

Marks & Spencer and Schneider Electric have launched a new partnership to help reduce carbon emissions in global supply chains. The initiative, called RE:Spark, aims to increase the use of renewable electricity by suppliers across M&S’s network. It combines software, clean energy purchases, and advisory support.

Schneider Electric provides technology and services. Marks & Spencer (M&S) brings its large supplier base and sustainability goals. The effort reflects a growing corporate focus on cutting emissions deep in the value chain.

The partnership builds on broader work by both companies to cut greenhouse gas emissions. These efforts aim to support their long-term climate targets and influence industry change.

Katharine Beacham, Marks & Spencer’s head of sustainability and materials in fashion, home and beauty, remarked:

“By acting as a facilitator, we can help our suppliers build networks and resilience for the long term — sparking a movement of change across the industry and beyond.”

RE:Spark in Action: Empowering Suppliers to Go Green

RE:Spark is designed to help suppliers adopt renewable energy and reduce emissions. It uses Schneider Electric’s Zeigo Hub, a digital tracking platform. Suppliers can submit emissions data, track progress, and access resources. The initiative also offers:

  • Clean energy guidance and advisory services to help suppliers switch to wind, solar, or other low-carbon power.
  • Regional engagement events to educate suppliers in markets such as Vietnam, Turkey, India, China, and Bangladesh.
  • Aggregated power purchase agreements (PPAs) that let smaller firms buy renewable energy together.

These steps aim to lower costs and increase access to clean electricity for suppliers that normally cannot secure renewable contracts on their own.

The program is planned to roll out over a three-year period. It initially focuses on high-impact regions of M&S’s fashion and food supply chains.

M&S Net Zero Targets: Tackling Scope 3 Emissions

Marks & Spencer is a major British retailer with long-standing sustainability commitments. Its Plan A strategy targets net zero emissions across its full value chain by 2040. This includes Scope 1, 2, and 3 emissions — meaning emissions from operations, energy use, and suppliers.

Marks & Spencer net zero roadmap
Source: Marks & Spencer

According to M&S’s own reports, around 95% of its carbon footprint comes from indirect Scope 3 emissions — mostly linked to products and supplier activity.

M&S reported total emissions of around 7.4 million tonnes of CO₂ in a recent baseline year. Most of this came from sourcing and manufacturing. The company has set medium- and long-term targets, including a 55% reduction in emissions by 2030 from a 2017 baseline.
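
For a rough sense of scale, the 55% target can be set against the baseline figure above. The article’s ~7.4 Mt figure and the 2017 target baseline may not refer to the same year, so this is purely illustrative:

```python
# Illustrative sizing of M&S's 2030 target, assuming the 55% cut
# applies against the ~7.4 Mt CO2 baseline cited in the article.
baseline = 7.4e6   # tonnes CO2 (baseline-year figure from the article)
reduction = 0.55   # 55% cut targeted by 2030
target_2030 = baseline * (1 - reduction)
print(f"Implied 2030 ceiling: {target_2030 / 1e6:.2f} Mt CO2")
```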

Marks & Spencer ghg emissions 2024
Source: Marks & Spencer

M&S has begun taking action across its supply chain and logistics. It added 85 lower-emission vehicles to its fleet. This includes five zero-emission electric heavy goods vehicles and compressed natural gas trucks.

These trucks can cut CO₂ emissions by up to 85% compared to diesel in some cases. About 10% of its transport fleet now runs on zero or lower-emission technology.

Despite business growth in recent years, M&S also reported a rise in emissions. In one period, the company said its emissions increased by 6% even as revenue grew by 9%. Most retailers face the challenge of balancing growth with climate targets.

Schneider Electric Leads Supplier Decarbonization

Schneider Electric is a global leader in energy management and automation. It has set its own climate goals, including a plan to reach net zero emissions across its entire value chain by 2050.

Schneider Electric Net-Zero Commitment
Source: Schneider Electric

The company aims for a 25% cut in value-chain emissions by 2030. This includes its own emissions and those from its suppliers. Schneider calls this its “Zero Carbon Project.”

A major part of Schneider Electric’s strategy is helping suppliers decarbonize. Under its Zero Carbon Project, the company has worked with its top 1,000 suppliers to reduce emissions. They represent a large part of its Scope 3 footprint.

Schneider reported a 42% average cut in emissions from participating suppliers. They also helped about 700 suppliers measure and define their carbon footprints. Its near-term 2025 net-zero targets are as follows:

Schneider Electric 2025 targets
Source: Schneider Electric

Schneider plans to balance residual emissions through high‑quality carbon removal credits. Its strategy includes investing in nature‑based and engineered removals, such as direct air capture, to match residual operational emissions by 2030 and support full value‑chain net zero by 2050.

Schneider’s approach combines data analytics, ambition setting, and direct support. It focuses on helping suppliers grasp emissions and take action, avoiding strict top-down rules.

Scope 3 Emissions: The Hidden Climate Challenge

Many companies now focus on emissions beyond their own factories and offices. These outside emissions are called Scope 3 emissions. Scope 3 includes all the carbon produced by suppliers and partners. It also covers emissions from the production of raw materials, transport, and other steps before a product reaches the customer.

Scope 3 emissions are often much larger than a company’s direct emissions. In 2023, corporate disclosures showed that Scope 3 supply chain emissions were about 26x greater than emissions from a company’s own operations (Scopes 1 and 2). This means that most of a company’s climate impact comes from its value chain, not from its own buildings or vehicles.
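
The 26x ratio translates directly into a share of total footprint: if Scope 3 is 26 times Scopes 1 and 2 combined, it accounts for about 96% of the total. A quick check of that arithmetic:

```python
# If Scope 3 emissions are 26x a company's combined Scope 1+2
# emissions, Scope 3's share of the total footprint is 26 / (26 + 1).
ratio = 26
scope3_share = ratio / (ratio + 1)
print(f"Scope 3 share of total: {scope3_share:.1%}")  # about 96%
```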

Because Scope 3 emissions are so large, reducing them is critical for companies to meet net-zero goals. Yet many companies do not fully measure or control these emissions. Only a small share of firms that report emissions actually set specific targets to cut Scope 3 emissions. This means that most corporate climate plans are missing the biggest piece of their carbon footprint.

For retailers and manufacturers, this problem is especially strong. Industry analysts note that for many firms, 70% to 90% of total emissions come from Scope 3 activities rather than direct operations. Take, for instance, the case of Nike’s Scope 3 emissions below, which represent over 90% of its total carbon footprint. 

Nike Scope 3 emissions
Source: Nike

These high shares occur because raw materials, supplier processes, packaging, and transport often require much more energy and carbon than company‑owned facilities. Suppliers also often lack the tools and financing to switch quickly to clean power.

Decarbonizing supply chains can also reduce business risk. If companies do not address Scope 3 emissions, they may face higher costs in the future. Many countries are now requiring reports on indirect emissions. They also aim to reduce these emissions.

Investors and customers are also more likely to choose companies with stronger climate action plans. Companies that engage suppliers and track performance with digital tools can accelerate progress and make climate work more transparent. These steps make it easier for both large brands and smaller suppliers to reduce emissions together.

RE:Spark and the Global Push for Corporate Climate Action

Marks & Spencer and Schneider Electric are part of a larger trend of corporate climate action. Many companies now set science-based targets to align with global climate goals.

However, Scope 3 remains a major challenge across sectors. New reporting guidelines and frameworks are now available, helping companies measure emissions more accurately and set goals more effectively.

Tools like Schneider’s Zeigo Hub reflect trends toward digital solutions for emissions tracking and supplier engagement. More research shows that collaborative programs help companies expand climate action beyond their own operations.

Early adopters often inspire peers and suppliers to take action as well. Efforts like RE:Spark aim to make supply chain decarbonization more practical and accessible, especially for smaller suppliers.

M&S and Schneider Electric plan to expand the program over the next three years. The initiative will initially focus on key regions and high‑impact supply chain segments. As suppliers engage with renewable power procurement and emissions tracking, the partners expect to accelerate progress toward the companies’ net-zero goals.

The success of this model may inspire other brands to launch similar programs. Analysts and sustainability advocates will watch whether RE:Spark leads to measurable emissions cuts across global supply chains. If it does, this approach could become a broader template for corporate climate action.

CATL’s Multi-Country Expansion Redefines Europe’s Battery Supply Chain and Workforce

CATL’s rapid growth across Hungary, Germany, and Spain marks a major shift in how the company operates in Europe. It is no longer only supplying batteries from abroad. Instead, it is becoming deeply involved in Europe’s industrial and workforce ecosystem. Through new factories, training partnerships, and community programs, the company is building a long-term European presence that supports local economies and clean-energy goals.

EUROPE battery storage

Hungary: Debrecen Plant Nears Launch and Strengthens the EV Supply Chain

CATL’s new battery cell factory in Debrecen is moving into its final phase before full operation. The greenfield site is set to play a central role in Europe’s EV supply chain. When it opens, it will deliver 40 GWh of annual capacity, all of which is already fully booked by customers. Mass cell production is expected to begin in early 2026.

While the cell lines prepare for launch, module assembly has already been running for more than a year. The plant has produced more than 120,000 battery modules, enough to power over 30,000 electric vehicles across Europe. The number of employees is also rising quickly, and CATL expects the local workforce to reach 1,500 people by Q1 2026.

Matt Shen, Managing Director of CATL Germany and Hungary, said:

“Our Debrecen investment is a major step towards strengthening CATL’s European presence. We are planning for the long term, bringing our most advanced and sustainable manufacturing technologies to Hungary.”

A Facility Designed for High Environmental Standards

Environmental protection is central to the Debrecen plant’s design. CATL built the facility to meet Europe’s strictest environmental requirements, along with additional Hungarian regulations. Several achievements highlight this commitment:

  • Energy use was reduced by almost 30 percent compared with the earlier IPPC permit.

  • Potable water demand was cut to one-third, supported by water-saving cooling technologies.

  • ISO 14001 certification was achieved in October 2025, confirming strong environmental management.

  • Greening activities were launched around the site, improving local biodiversity.

CATL already operates ten carbon-neutral plants worldwide. The company expects Debrecen to reach carbon-neutral status within two years of opening, using renewable electricity and installing on-site solar capacity.

Building Local Talent and Creating a Stable Industrial Base

The battery giant has been hiring steadily since 2023. The Debrecen site now employs more than 1,000 people, with two-thirds coming from Debrecen and nearby regions. Recruitment covers a broad range of functions, including production, logistics, quality, finance, IT, and HR.

According to Alexandra Kitta, Head of Recruitment at CATL Debrecen, the company aims to create a modern and stable workplace with strong learning opportunities. CATL offers competitive salaries along with cultural and professional training programs. Employees also gain access to international expertise while building skills in advanced battery technology.

Strengthening Community Connections

Beyond manufacturing, CATL is investing in Debrecen’s cultural and social life. The company supports major local events and brings new traditions to the region, such as the Chinese Lantern Festival and the Mid-Autumn Festival. Community programs focus on children and environmental protection, reflecting its commitment to building long-term relationships with residents.

Germany: Developing Battery Skills Through Training, Industry Links, and the Dual System

Germany plays a major role in CATL’s European strategy. The company is investing heavily in workforce development, technology testing, and partnerships with educational institutions.

New IHK Certificate Course Builds Battery Expertise

At the end of 2025, CATL introduced the IHK-certified course “Basic Battery Technology for Trainees.” This two-week program gives second-year trainees foundational knowledge in areas such as battery safety, sustainability, cell manufacturing, and industry standards.

Nineteen trainees joined the first class. Over time, the course will open to participants from outside the company. Supported through the BatterieMD network, the program includes both hands-on training and digital learning modules that cover the full battery value chain.

This initiative is also an important step toward creating a dedicated battery-technology career path within Germany’s dual vocational system. Despite rising industry demand, Germany still does not have a standardized training track for battery specialists. CATL’s efforts could help shape a modern curriculum that combines theory and practice.

Expanding Testing Capacity in Thuringia

CATL’s training efforts support a growing physical presence in Thuringia. The company began battery cell production in Arnstadt in 2022, its first plant outside China. These cells now power high-performance European vehicles.

At the same time, it is doubling the capacity of its large testing center, which is already certified by Volkswagen for both cell and module testing. The company has invested EUR 1.8 billion in its German operations and employs more than 1,700 people. Training programs focus on chemical processes, Industry 4.0 technologies, and workforce localization.

Deepening Training Partnerships

CATL runs a vocational training center at Erfurter Kreuz and collaborates with key partners, including TÜV Süd, IHK, Debrecen Vocational Training Center, University of Debrecen, and University of Miskolc. Dual study programs and in-house training help build a strong pipeline of skilled workers for Europe’s growing battery sector.

Spain: New 50 GWh LFP Gigafactory with Stellantis

CATL’s expansion reached another milestone with the groundbreaking of a new gigafactory in Zaragoza, Spain. The project is a 50:50 joint venture with Stellantis and will use lithium iron phosphate (LFP) technology. With a capacity of 50 GWh, the plant represents one of Europe’s largest battery investments to date.

Production is expected to begin in late 2026. When fully operational, the factory will supply battery packs for up to one million electric vehicles each year, helping cut more than 30 million tons of CO₂ over their lifetime.

The project includes an investment of up to EUR 4.1 billion and will create more than 4,000 direct jobs. Thousands of indirect jobs are also expected as suppliers and service providers expand around the site.

This gigafactory strengthens Europe’s battery value chain and reflects CATL’s evolution from supplying Europe to operating “in Europe, for Europe.” The Spanish plant will primarily serve Stellantis brands, while the combined Hungary and Spain operations will support a stable European customer base.

Europe’s Battery Storage Market Accelerates

Wood Mackenzie expects Europe’s battery storage capacity to climb from about 11 GW in 2024 to 16 GW in 2025, a 45% jump. The firm also projects steady growth through the next decade, with deployments rising at a 9% annual pace and reaching roughly 35 GW by 2034.

battery storage market Europe
Source: Wood Mackenzie

In this space, Germany will remain the largest market, supported by strong utility-scale and commercial demand. But the region also faces grid bottlenecks, more than 500 GW of connection requests, and rising revenue pressure as more projects come online.

Orbital Data Center Guide: Everything You Need to Know About This Next-Gen Space Computing Technology

Orbital data centers are a radical rethinking of where and how we process the world's data. Instead of building ever-bigger campuses on Earth, companies are designing computing facilities to operate in low Earth orbit and beyond. These systems draw on near-constant solar power, shed heat by radiating it to space, and can process satellite data on-site. This could tackle some major challenges that terrestrial data centers face today.

The idea has moved quickly from theory to concrete plans. Axiom Space, for example, is planning to deploy orbital data center (ODC) nodes to the International Space Station by 2027. Google has joined the race with Project Suncatcher. This initiative aims to create solar-powered AI data centers in orbit. 

Google’s plan includes launching prototype satellites around 2027 equipped with Tensor Processing Units (TPUs). They will run on continuous sunlight and use laser-based communication systems.

The company says orbital solar panels could produce up to 8x more energy than those on Earth. They also believe costs may match those of land-based data centers by the mid-2030s.

Market analysts expect fast growth, with the orbital data center market rising to tens of billions of dollars by 2035, a compound annual growth rate of about 67%. This surge comes from high demand for AI computing, new satellite data, and the push to lower data centers' environmental impact.

The next sections will explore the technical, environmental, commercial, and geopolitical factors driving this change. They will explain why the decade ahead might determine whether orbital data centers stay niche or become key global infrastructure.

The Rise of Orbital Data Centers

The digital world is expanding at a pace never seen before. Every day, businesses, governments, and individuals generate massive volumes of data. The demand for data processing and storage is skyrocketing. This is driven by AI training models needing a lot of computational power and satellite networks sending terabytes of images back to Earth.

Traditional terrestrial data centers have carried the load so far, but they are reaching their limits. The sheer scale of energy consumption, cooling requirements, and land use is making it harder to sustain growth. This pressure has given rise to a bold alternative: orbital data centers.

The Limits of Earth-Based Data Centers

On Earth, data centers are already among the most energy-intensive types of infrastructure. In the U.S. alone, power demand from these facilities is projected to climb from 17 gigawatts in 2022 to 35 gigawatts by 2030.

The industry could see $2 trillion in global capital spending in the next five years. Half of that will be in the United States. Data centers use more than just electricity. They consume millions of gallons of water each year for cooling. They also take up large areas of land and release a lot of carbon dioxide. This environmental footprint clashes with global climate goals. Many areas are facing water scarcity and grid issues.

The physical expansion of Earth-based data centers also creates tensions with local communities. In parts of the U.S. and Europe, new projects face pushback. This is due to land use issues, water stress, and rising electricity costs tied to large-scale digital infrastructure. As demand continues to rise, these conflicts are expected to grow sharper.

Enter Space-Based Computing

Orbital data centers aim to solve these problems by placing processing power in space. These systems draw solar energy directly in orbit rather than relying on terrestrial power grids, so they can operate without interruption from weather or the day-night cycle.

Orbital centers could cut the environmental impact by eliminating the need for land and water resources. This is a major advantage over Earth-based centers.

The concept is not entirely new. Experiments have already proven that computers can operate reliably in space. Hewlett Packard Enterprise (HPE) grabbed attention with its Spaceborne Computer project, which showed that regular hardware can work well on the International Space Station (ISS). This early step suggested that data centers could scale to orbital operations, given radiation protection, heat dissipation systems, and reliable networking.

Market Potential

What was once science fiction is now a market on the cusp of rapid growth. Analysts expect the orbital data center industry to grow from $1.77 billion in 2029 to $39.09 billion by 2035. This shows a remarkable compound annual growth rate (CAGR) of 67.4%.

orbital data center market growth 2035

Notes: Shows rapid industry expansion with a CAGR of 67.4% driven by AI demands and sustainability
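As a sanity check on the projection above, the quoted endpoints do imply the stated growth rate. The small sketch below recomputes the CAGR from the $1.77 billion (2029) and $39.09 billion (2035) figures; the helper function is ours, not from any cited report.

```python
# Sanity check on the quoted market projection:
# $1.77B (2029) growing to $39.09B (2035) should imply the ~67.4% CAGR cited above.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate, returned as a fraction."""
    return (end_value / start_value) ** (1 / years) - 1

rate = cagr(1.77, 39.09, 2035 - 2029)
print(f"Implied CAGR: {rate:.1%}")  # roughly 67%, matching the analyst figure
```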

This surge is fueled by multiple drivers:

  • The insatiable demand for AI and machine learning workloads.
  • The explosion of satellite constellations generating enormous amounts of data.
  • The urgent need for more sustainable, climate-conscious computing.
  • Advances in reusable rockets and space-based solar power systems, making orbital deployment increasingly feasible.

Cost Comparison

Based on the Lumen Orbit white paper, the cost comparison between orbital and terrestrial data centers is dramatic: 

Over a 10-year span, a 40 MW terrestrial data center would cost about US$167 million, covering energy (~$140M), cooling ($7M), backup power ($20M), water use, etc. Meanwhile, an equivalent orbital setup would cost only ~US$8.2 million, factoring in $5M for launch, $2M for a solar array, and $1.2M for radiation shielding.

This implies space‑based data centers could be roughly 20× cheaper to operate over that timeframe.
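The arithmetic behind that "roughly 20x" figure can be reproduced directly from the line items quoted above. This is just a re-addition of the white paper's published numbers; the dictionary labels are our own.

```python
# Re-adding the 10-year cost line items quoted from the Lumen Orbit white paper
# (values in millions of US$; labels are ours).

terrestrial = {"energy": 140, "cooling": 7, "backup_power": 20}        # ~$167M total
orbital = {"launch": 5, "solar_array": 2, "radiation_shielding": 1.2}  # ~$8.2M total

t_total = sum(terrestrial.values())
o_total = sum(orbital.values())
print(f"Terrestrial: ${t_total}M, Orbital: ${o_total}M")
print(f"Cost ratio: ~{t_total / o_total:.0f}x")  # roughly 20x, as stated
```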

Cost Comparison for a 10-year Cycle: Terrestrial vs. Orbital Data Centers

From Vision to Reality

The next few years will be critical in proving the viability of orbital data centers. Companies such as Axiom Space, Google, Starcloud, and China’s ADA Space are already preparing demonstration missions and initial deployments. These projects aim to test hardware. They also show investors and customers that orbital facilities offer real performance benefits.

Axiom Space plans to send an orbital data center module to the ISS by 2027 and aims to grow this with its commercial space station platform. Starcloud launched a GPU-powered satellite in late 2025 to test high-performance computing in orbit. ADA Space is pursuing a bold plan for a 2,800-satellite constellation. This shows how quickly orbital infrastructure can grow.

These efforts mark a shift from theoretical feasibility studies to practical implementation. If they succeed, they could change global digital infrastructure. This would lead to a future where computing isn’t just on Earth, but spread across land and space.

Environmental Promise and Sustainability Benefits

One of the strongest arguments for orbital data centers is their potential to ease the environmental strain created by traditional facilities. Terrestrial data centers use about 1–2% of the world's electricity, a share that is rising as AI adoption accelerates. Cooling systems alone can consume up to 40% of a facility's power needs.

In addition, many data centers require hundreds of acres of land and millions of gallons of water each year for heat management. These pressures have made the sector a target of regulatory scrutiny and community pushback. Companies think they can greatly reduce these impacts by moving some computing infrastructure into orbit.

Unlimited Access to Solar Power

The most obvious advantage of orbital data centers is access to continuous solar energy. Unlike solar farms on Earth, orbital arrays face no interruptions from weather or night and can deliver a constant, highly efficient power supply. Starcloud and others are planning large solar grids that could stretch up to 2.5 miles and aim to power big orbital data facilities.

For high-performance computing, like AI training, this power source offers faster, cheaper, and more sustainable processing than what we can achieve on Earth.
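A back-of-envelope estimate shows where an advantage of this size comes from. The figures below are illustrative assumptions of ours, not numbers from Google or Starcloud: the solar constant in Earth orbit, a clear-sky peak for a ground panel, and a typical terrestrial capacity factor once night, weather, and panel angle are averaged in.

```python
# Back-of-envelope: orbital vs. ground solar yield per unit panel area.
# All inputs are illustrative assumptions, not figures from the companies cited.

ORBITAL_IRRADIANCE = 1361      # W/m^2 solar constant, near-continuous in a dawn-dusk orbit
GROUND_PEAK = 1000             # W/m^2 at noon under a clear sky
GROUND_CAPACITY_FACTOR = 0.17  # average fraction of peak a typical site delivers

ground_average = GROUND_PEAK * GROUND_CAPACITY_FACTOR  # ~170 W/m^2 averaged over a year
advantage = ORBITAL_IRRADIANCE / ground_average
print(f"Orbital advantage per unit panel area: ~{advantage:.0f}x")
```

With these assumptions the ratio lands near 8x, consistent with the "up to 8x more energy" figure Google cites earlier in this piece; a sunnier site or a less favorable orbit would shift it either way.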

Cooling in the Vacuum of Space

Cooling is one of the biggest sources of energy waste in terrestrial facilities. Conventional centers use fans, air conditioning, or liquid systems. These can make up almost half of their electricity use.

In orbit, the vacuum creates a unique thermal environment. Traditional convection cooling doesn't work, so heat must leave through radiation.

Engineers are creating specialized radiator panels, heat pipes, and phase-change materials to control thermal loads in orbital data centers. These systems are complex, but they eliminate energy-hungry water and air cooling, which could lead to big efficiency gains at scale.

Reduced Land and Water Use

Space-based facilities free up land on Earth. This helps avoid conflicts with farming, city growth, or conservation. This is especially important in places like Northern Virginia and Dublin. Data center growth there has led to community pushback about land use and strain on infrastructure.

Orbital centers also sidestep water usage, a growing concern in drought-prone regions such as the American West. By comparison, large data centers use millions of gallons of water each year for cooling. This puts pressure on local water supplies.

Carbon Footprint Reduction

Orbital data centers could reduce dependence on fossil fuel-based electricity by tapping directly into abundant solar power in space. They also eliminate the carbon emissions tied to land clearing and cooling infrastructure.

Research backed by the European Commission shows that orbital data centers may be environmentally friendly. They could offer computing power with a lower carbon footprint than data centers on Earth.

The Trade-Off: Launches and Space Debris

The sustainability equation is not without complications. Rocket launches needed for orbital infrastructure still release a lot of emissions. This includes black carbon particles, which can build up in the upper atmosphere.

Large-scale deployment might increase this footprint. However, reusable launch systems like SpaceX’s Falcon 9 are lowering the cost and emissions for sending hardware to orbit.

Another concern is space debris. With tens of thousands of new satellites projected for launch by 2030, orbital traffic is experiencing significant congestion. Large solar arrays and data hubs represent big targets for collisions with debris traveling at speeds of up to 28,000 kilometers per hour.

Mitigation strategies are key. Debris shields, active cleanup missions, and smart orbital slot management will help. These steps ensure that sustainability gains aren’t lost to new environmental hazards in orbit.

Environmental Impact Comparison: Terrestrial vs. Orbital Data Centers

A Net Positive, if Challenges Are Managed

Taken together, orbital data centers promise major environmental benefits by reducing energy demand, land use, and water consumption on Earth. Launch emissions and orbital debris are concerns, but technology and smart rules can create a positive outcome. If these facilities can scale well, they could be a major innovation in sustainability for digital infrastructure.

Market Landscape and Key Players

The orbital data center market is new, but companies and partnerships are paving the way. These players include space station developers, satellite operators, cloud and hardware firms, and telecom companies. They all compete for a share of a potential multibillion-dollar industry by the mid-2030s.

Their strategies vary, but all aim to address the same challenges while providing real benefits compared to land-based options:

  • powering,
  • cooling, and
  • protecting computing infrastructure in the harsh conditions of space.

Axiom Space: Building the First Orbital Data Hub

Axiom Space is among the most prominent U.S. firms advancing orbital data center capabilities. The company is best known for Axiom Station, a commercial station intended to succeed the International Space Station, and plans to integrate orbital data center (ODC) modules into its future projects. Axiom has received funding to boost its research and development, including $5.5 million from the Texas Space Commission.

The company focuses on "Earth independence," meaning data can be stored and processed in orbit without relying on ground-based cloud systems.

Axiom teamed up with Kepler Communications and Skyloom Global. Together, they added optical inter-satellite links (OISLs). This upgrade allows fast data transfers between orbit and ground. The first ODC nodes will launch on the ISS by 2027. This will be one of the earliest real-world tests of orbital computing.

Kepler network Axiom
Source: Axiom

Starcloud: Scaling Solar-Powered Computing

Another key innovator is Starcloud, formerly known as Lumen Orbit. The startup has raised over $21 million in seed funding, one of the biggest early investment rounds for a Y Combinator graduate. The company's vision is bold: solar panel grids up to 2.5 miles wide powering megawatt-scale data centers in orbit.

Starcloud’s first demo mission successfully launched in November 2025 on a SpaceX Falcon 9. It carried a 132-pound (60-kilogram) satellite with an NVIDIA data-center-grade GPU. This mission is designed to prove that space can handle demanding computing tasks such as AI inference and training.

orbital data center network architecture
Source: Lumen Orbit

If they succeed, the company thinks orbital facilities could be cheaper than Earth-based data centers. This is especially true for processing satellite data and AI tasks that work better near the source.

Google’s Project Suncatcher: Sustainable AI Computing

Project Suncatcher represents the first major move by a global tech giant into orbital computing. Google will launch solar-powered satellites equipped with custom TPUs, using laser communication to connect orbital clusters to Earth.

Google’s research shows that solar collection in orbit could be up to eight times more efficient than on Earth. This could provide sustainable AI computing on a large scale. If the prototypes succeed, Google expects to expand toward operational orbital nodes in the early 2030s.

China’s ADA Space: An Ambitious Constellation

China has emerged as a powerful competitor through ADA Space (Guoxing Aerospace). In May 2025, the company launched 12 AI-enabled satellites, the first step in a plan for 2,800 satellites. Each satellite delivers 744 tera operations per second (TOPS) of computing power, runs 8-billion-parameter AI models, and features 100 Gbps laser inter-satellite links.

This project highlights China’s strategic intent to dominate orbital computing. ADA Space processes data in orbit, especially for astronomy and remote sensing. This helps reduce bandwidth issues and speeds up response times. The constellation is more than just a business. It also helps China with its national security and space goals.

PowerBank: A Strategic Contributor

PowerBank Corporation, in partnership with Orbit AI, is developing the Orbital Cloud, a network of AI-enabled orbital data centers. The system combines satellite communication, on-orbit AI computing, and blockchain verification, providing resilient, censorship-resistant services independent of ground networks.

PowerBank supplies advanced solar energy systems, adaptive energy management, and thermal control technologies to power and maintain these orbital compute nodes. The first satellite, DeStarlink Genesis‑1, launched in December 2025, marking the start of the network, with additional nodes planned through 2026 and beyond.

This initiative positions PowerBank at the intersection of renewable energy, AI, and space infrastructure. Analysts estimate the combined market for orbital infrastructure, in-orbit computing, and satellite services could exceed USD 700 billion over the next decade.

OrbitsEdge + HPE: Modular Racks in Orbit

OrbitsEdge has partnered with Hewlett Packard Enterprise (HPE) to design modular, satellite-based data centers. Their SatFrame satellite bus can hold standard 19-inch server racks. It can also scale to support larger hardware.

HPE’s Edgeline Converged Edge Systems show how traditional IT hardware companies are adjusting ground technology for space. This modular approach could allow incremental scaling. This has lower risks compared to large, one-time deployments.

NTT/JSAT Space Compass: Beyond-5G Integration

Japan’s NTT Corporation and Sky Perfect JSAT have teamed up in the Space Compass joint venture. This shows how telecom companies see orbital computing as key to future networks. Their plan connects land, air, and space communication systems. This will support Beyond-5G and 6G connectivity.

NTT JSAT Space integrating network
Source: NTT

The venture plans to use high-speed optical transmission to connect these layers. This will provide seamless global cloud services. As such, orbital data centers will serve as the backbone for low-latency processing.

European Efforts: Feasibility and Sustainability

Europe is also exploring orbital data centers, though it is earlier in the process. Thales Alenia Space, with help from the European Commission, studied the technology and environmental impact of these systems.

Europe may be behind the U.S. and China in commercial deployments. But its focus on sustainability and regulation might set global standards.

Funding and Strategic Backing

Behind these companies is a growing web of investors and government support. Venture firms like Y Combinator, NFX, FUSE, and Soma Capital are backing startups such as Starcloud. Also, major tech investors like Sequoia and Andreessen Horowitz are interested in orbital computing ventures.

Even the CIA’s venture capital arm, In-Q-Tel, has backed projects in this space. This shows how important orbital data centers are for defense and intelligence.

These investments show confidence that orbital computing will evolve from experimental missions into commercial infrastructure. As demonstration projects show their worth, the sector may attract bigger funding rounds. We could also see more partnerships among aerospace, telecom, and cloud computing giants.

Technical and Operational Challenges

While orbital data centers promise enormous benefits, turning the concept into reality requires solving a set of tough technical and operational problems. Space is a harsh and unforgiving place. Radiation, extreme temperatures, and micrometeoroids constantly threaten electronic systems.

Also, the costs of starting, running, and growing orbital infrastructure create challenges that land-based competitors don’t encounter.

Radiation: The Need for Hardened Components

One of the most critical issues is radiation. Electronic components in space face cosmic rays and charged particles. These can lead to single-event effects (SEEs), memory corruption, and system failures. Commercial off-the-shelf (COTS) hardware, while cheaper and more advanced, is highly vulnerable in orbit.

Radiation-hardened (rad-hard) electronics are tougher. However, they cost more, deliver less computing performance, and are often years behind the newest commercial chips.

Google’s Project Suncatcher acknowledges that radiation hardening is key to long-term reliability. This is especially true when using large arrays of TPUs in orbit. The company is testing AI chips that can handle faults. They are also using adaptive software like RedNet AI’s error-correction model, which aims to reduce radiation damage.

Thermal management is equally vital. Google's research notes that even with constant solar power, radiating waste heat in a vacuum remains a major challenge. Its proposed solution involves kilometer-scale radiator panels and phase-change systems, technologies also being studied by Starcloud and ADA Space.

Innovative approaches are emerging. Researchers have created methods like RedNet, a system designed for deep neural networks. Rather than depending solely on rad-hard hardware, RedNet exploits the varying sensitivity of AI model layers to manage radiation-induced errors.

Correcting errors in the model’s weak spots leads to nearly zero error rates. This also speeds up inference by 33% compared to traditional methods. Such hybrid strategies could allow orbital data centers to balance cost, reliability, and performance.

Thermal Management in the Vacuum of Space

Cooling is another major hurdle. On Earth, data centers rely on air and liquid cooling systems to dissipate heat. In space, convection does not work in a vacuum — all heat must be radiated away. This is much less efficient and needs special systems, such as large radiator panels, heat pipes, and phase-change materials.

As data centers scale to megawatt power levels, the challenge becomes more extreme. Companies like Starcloud envision orbital facilities with cooling systems stretching kilometers across to shed excess heat.

Designing these systems to run reliably for years without maintenance makes them more complex and expensive. Before orbital data centers can manage workloads like Earth’s biggest facilities, solving thermal management is key.
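The scale problem can be sketched with the Stefan-Boltzmann law, which governs how much heat a surface can radiate in vacuum. The radiator temperature, emissivity, and heat load below are illustrative assumptions of ours, not any company's design figures.

```python
# Why megawatt-class orbital facilities need enormous radiators: in vacuum,
# a surface sheds heat only via P = epsilon * sigma * A * T^4 (Stefan-Boltzmann).
# Temperatures, emissivity, and the 40 MW load are illustrative assumptions.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9   # assumed high-emissivity radiator coating
T_RADIATOR = 300   # K, assumed radiator surface temperature
T_SPACE = 4        # K, deep-space background (negligible in practice)

def radiator_area(waste_heat_watts: float) -> float:
    """One-sided radiator area (m^2) needed to reject the given heat load."""
    flux = EMISSIVITY * SIGMA * (T_RADIATOR**4 - T_SPACE**4)  # W rejected per m^2
    return waste_heat_watts / flux

# A 40 MW facility rejecting essentially all of its power as heat:
area = radiator_area(40e6)
print(f"~{area:,.0f} m^2 of radiator, roughly {area**0.5:,.0f} m on a side")
```

Under these assumptions a 40 MW load already needs on the order of a tenth of a square kilometer of radiator; sun loading, two-sided panels, and lower operating temperatures push real designs toward the kilometer scale the companies describe.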

Space Debris and the Kessler Risk

The growing density of satellites and debris in low Earth orbit (LEO) poses a serious risk. NASA scientist Donald Kessler first described Kessler Syndrome in 1978. It’s a chain reaction in which collisions create more debris, causing even more collisions.

Large orbital data centers, with expansive solar panels and radiator arrays, would be particularly vulnerable. Even tiny fragments, less than a centimeter, can travel up to 28,000 kilometers per hour. They can destroy sensitive equipment.

Operators will need to incorporate shields, redundant systems, and debris-avoidance maneuvers. Still, the risk of serious damage is a big concern for long-term orbital infrastructure.

Launch Economics and In-Space Assembly

Getting heavy, complex systems into orbit is expensive. Current launch costs range from about $7.5 million to $67 million per mission, depending on payload size and orbit. Reusable rockets, like SpaceX’s Falcon 9 and Starship, are cutting costs. However, setting up gigawatt-scale facilities may still need hundreds of tons of hardware.

One solution is in-space assembly and modular construction. Instead of launching a full facility at once, companies could launch smaller modules and piece them together in orbit. This incremental approach spreads costs across multiple missions and reduces risk. Longer term, in-space manufacturing could cut costs further by using materials sourced from the Moon or asteroids.
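Spreading a facility across multiple launches can be sketched with simple campaign arithmetic. The total mass, per-launch capacity, and per-launch cost below are illustrative assumptions chosen within the ranges cited above.

```python
# Rough launch-campaign math for modular, multi-mission assembly.
def campaign_cost(total_mass_tons, tons_per_launch, cost_per_launch):
    """Number of launches (ceiling) and total launch cost for a campaign."""
    launches = -(-total_mass_tons // tons_per_launch)  # ceiling division
    return launches, launches * cost_per_launch

# Assumed: 400 tons of hardware, 100 tons per heavy-lift launch, $10M per launch.
n, cost = campaign_cost(400, 100, 10_000_000)
print(n, cost)  # 4 launches, $40,000,000
```

Because each module flies separately, a single launch failure costs one module rather than the whole facility, which is the risk-reduction argument behind incremental assembly.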

Maintenance Hurdles and Redundancy Needs

Unlike Earth-based facilities, orbital data centers cannot rely on technicians to swap out failing components. Repairs need robotic missions or astronauts. Both options are expensive and complicated. To mitigate this, orbital facilities will need high levels of redundancy and fault tolerance.

This means designing systems that keep operating despite component failures. That approach adds weight, cost, and complexity. Maintenance will remain a major hurdle for commercial success until autonomous repair systems improve.
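A small reliability calculation illustrates why redundancy helps despite the added mass. The unit reliability and redundancy levels below are assumed examples, not figures from any operator.

```python
# Availability of a k-of-n redundant system with independent unit failures.
from math import comb

def system_availability(n, k, unit_reliability):
    """Probability that at least k of n independent units survive the mission."""
    return sum(
        comb(n, i) * unit_reliability**i * (1 - unit_reliability)**(n - i)
        for i in range(k, n + 1)
    )

# One unit at 90% mission reliability vs. five units where any three suffice:
print(f"{system_availability(1, 1, 0.90):.4f}")  # 0.9000
print(f"{system_availability(5, 3, 0.90):.4f}")  # 0.9914
```

Carrying two spare units turns a 10% chance of mission-ending failure into under 1%, at the price of the extra launch mass the text describes.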

Applications and Use Cases

For orbital data centers to thrive, they need to show clear benefits compared to land-based facilities. Building in space has high costs and risks. However, some applications could benefit so much from orbital infrastructure that using it will become necessary. Early use cases such as powering artificial intelligence and securing defense systems show how the market might grow.

AI Training with Continuous Solar Energy

One of the most promising applications is training large AI models, including large language models (LLMs). These workloads need a lot of computing power and constant energy. They often test the limits of Earth’s grids.

Orbital data centers can use constant solar energy in space. They don’t face day-night cycles or weather issues. This enables uninterrupted operations, potentially lowering costs and accelerating model development.

Starcloud’s 2025 mission will test AI training and inference using NVIDIA GPUs in space. This will provide 100 times more computing power than past space demos. In the long run, gigawatt-scale orbital clusters might serve as special platforms for training large AI systems. They could take on the energy-heavy tasks that currently happen on Earth.

Earth Observation and Satellite Data Processing

Today, satellite constellations produce terabytes of data every day. A lot of this data needs to be sent to ground stations for processing. This creates bandwidth bottlenecks and latency issues that limit real-time applications. Orbital data centers solve this problem by processing data in space. They send only useful insights back to Earth.

Research from Tsinghua University shows that using inter-satellite links for data processing can boost system capacity significantly. This is much more effective than traditional downlink methods. This could lead to quicker wildfire detection, better disaster response, and improved environmental monitoring. Plus, it would lower transmission costs.
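The bandwidth argument can be made concrete with simple arithmetic. The raw data volume and reduction ratio below are assumptions for illustration, not figures from the Tsinghua study.

```python
# Illustrative downlink savings from processing imagery in orbit instead of
# transmitting raw frames to ground stations.
def downlink_gb_per_day(raw_gb_per_day, onboard_reduction_ratio):
    """Data volume still needing downlink after onboard processing."""
    return raw_gb_per_day / onboard_reduction_ratio

raw = 5_000  # assumed: a constellation producing 5 TB/day of raw imagery
# Sending only detected events/insights instead of raw frames, e.g. 100:1:
print(downlink_gb_per_day(raw, 100))  # 50.0 GB/day instead of 5 TB/day
```

Shrinking the downlink by two orders of magnitude is what frees ground-station capacity and enables the near-real-time wildfire and disaster alerts described above.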

Defense and National Security

Defense is another high-value use case. Orbital data centers can support missile defense systems, autonomous weapons, and intelligence gathering. Here, even a fraction of a second can make a big difference. Processing data in orbit offers ultra-low latency and global coverage. These benefits aren’t achievable with just terrestrial infrastructure.

Security is also enhanced. Orbital centers are naturally isolated from many physical and cyber threats. Axiom Space has pointed out “Earth independence” as a key feature of its orbital cloud services. This means defense applications stay functional even if ground networks fail.

Disaster Recovery and Data Backup

Orbital and lunar data centers also hold potential for disaster recovery. Lonestar Data Holdings has already shown a lunar payload that can store and retrieve encrypted data. This makes the Moon an ideal backup site. Storing important information in orbit can help protect against natural disasters, political issues, or cyberattacks that could harm data centers on Earth.

This concept echoes the “Library of Alexandria” concern — the idea that without off-world backups, humanity risks losing irreplaceable knowledge in a catastrophe. Orbital data centers may become the ultimate safeguard for digital civilization.

Hybrid Cloud Models

In the near term, orbital facilities are unlikely to replace terrestrial data centers. Instead, they are expected to operate in hybrid systems, where workloads are distributed between Earth and orbit. Advanced optical communication networks will let data flow easily between the two. Placement will be optimized for speed, cost, and security needs.

For instance, AI training could happen in orbit while customer-facing apps stay on Earth, and satellite data could be processed in orbit before being integrated into cloud platforms on the ground. This hybrid model may prove the most commercially viable, offering the best of both worlds.

Regulatory, Political, and Security Landscape

As orbital data centers move closer to reality, questions of governance, law, and security loom large. Orbital data centers don’t fit neatly under national laws. Instead, they exist in a complex mix of international treaties, national rules, and global competition. Companies venturing into this space must navigate overlapping — and often unclear — legal frameworks.

Data Privacy in Orbit

One of the most pressing issues is data privacy. There is no treaty that specifically governs personal data protection in space. Current laws like the EU’s General Data Protection Regulation (GDPR) apply to any company handling data from EU citizens, no matter where they are located.

U.S. laws also apply to orbital operations. These include HIPAA for health data, the Gramm-Leach-Bliley Act for financial data, and state laws like California’s Consumer Privacy Act (CCPA). Companies must comply with several regulatory rules at the same time, even when in orbit.

Export Controls and Security Restrictions

Orbital data centers must follow rules from the International Traffic in Arms Regulations (ITAR) and the Export Administration Regulations (EAR). Advanced computing systems, radiation-hardened electronics, and satellite technologies are often seen as dual-use or defense-related. This limits international collaboration and may prevent certain partnerships or data-sharing arrangements.

Governments are likely to impose further restrictions as orbital computing becomes strategically significant, particularly for defense and intelligence applications.

Orbital Debris, Licensing, and Traffic Management

The growth of orbital infrastructure adds to existing concerns about space debris. Regulators like the U.S. Federal Communications Commission (FCC) and the International Telecommunication Union (ITU) require companies to develop end-of-life disposal and collision avoidance plans.

Large orbital data centers, with their big solar panels and radiator arrays, will have strict requirements. Licensing rules, debris control, and orbital slot allocation will increase costs and make operations more complex.

Geopolitical Competition

Finally, geopolitics is shaping the orbital data center race. The United States leads through companies like Axiom Space, Google, and Starcloud, supported by NASA, the Department of Defense, and venture funding.

China, however, is also making big investments in ADA Space’s planned 2,800-satellite constellation. This shows how important orbital computing is for both business and military use.

This rivalry could speed up innovation, but it might also lead to fragmented systems. The U.S. and Chinese orbital clouds may operate side by side.

If governments see orbital computing as a key asset, national security might become more important than working together commercially. This could widen the gap between competitors.

Global Competition Growing: China, Elon Musk, and Amazon Enter the Race

The race to build orbital data centers and AI supercomputers in space is expanding fast. Major tech companies and national programs are now pursuing high-performance computing in orbit.

China is moving quickly. Companies like Zhongke Tiansuan (Comospace) have run space computers on Jilin‑1 satellites for over 1,000 days. Research groups, including the Three-Body Computing Constellation, have launched satellite clusters performing multi-trillion operations per second. The country plans a centralized space data center in dawn-dusk orbit with over one gigawatt of power, rolling out in phases toward a full-scale orbital megacenter by 2035.

Elon Musk’s xAI and SpaceX are exploring AI payloads on Starlink satellites, which could enable distributed orbital computing. Reusable rockets help lower costs and speed deployment. Blue Origin is also developing related technology.

Amazon also aims to extend AWS cloud and AI services into space. Its “Leo” satellite initiative seeks to integrate orbital computing with Earth-based networks, positioning Amazon against both traditional cloud providers and new orbital competitors.

This global competition now includes both state-backed programs and private companies. China’s rapid satellite deployment, Musk’s launch advantages, and Amazon’s cloud ecosystem give each player unique strengths.

As these systems move from prototypes to operational networks, the race will drive innovation, influence regulations, and reshape global computing infrastructure over the next decade. Here is what we can expect in the coming years.

Future Outlook

Orbital data centers are still in their early days. However, the path forward looks clearer as technology improves and pilot missions prepare to launch. If costs and risks can be contained, the sector could move from proof-of-concept to mainstream adoption within the next decade.

2025–2030: Demonstrations to Early Operations

The late 2020s will mark a turning point for orbital computing. Axiom’s ISS deployment and Starcloud’s 2025 GPU mission are leading the way. Also, Google’s Project Suncatcher brings strong commercial support. China’s ADA Space constellation is also rolling out in phases, beginning with AI-enabled satellites that can process data directly in space. 

Between 2025 and 2030, these demonstrations will test whether AI training and continuous solar power can coexist in orbit. By 2030, the first orbital data centers should be handling some commercial tasks. They will show their worth for AI, Earth observation, and defense uses.

2030–2035: Scaling to Gigawatt-Class Facilities

In the early 2030s, orbital data centers are expected to scale dramatically, shifting from small demo payloads to large gigawatt-class clusters. These facilities will use huge solar arrays that stretch for kilometers, along with advanced thermal management systems.

At this stage, orbital computing could play key roles in AI training, global cloud services, and secure government operations. The massive market growth will make orbital infrastructure a key part of the global data economy.

Google’s modeling shows that the cost per kilowatt-year might match Earth-based centers soon. This could be a tipping point for broad adoption. Beyond that, lunar storage projects like Lonestar’s may expand humanity’s computing footprint even further.
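The kind of cost crossover Google’s modeling points to can be sketched with a simple amortization. Every input below (launch cost per kilogram, hardware mass per kilowatt, lifetime) is an illustrative assumption, not a figure from Google’s analysis.

```python
# Rough launch-cost contribution to orbital compute, per kilowatt-year.
def orbital_cost_per_kw_year(launch_cost_per_kg, kg_per_kw, lifetime_years):
    """Launch cost amortized over hardware lifetime, per kW of capacity."""
    return launch_cost_per_kg * kg_per_kw / lifetime_years

# Assumed: launch falls to $200/kg, a solar-plus-compute module masses
# 10 kg per kW, and the hardware operates for 10 years:
print(orbital_cost_per_kw_year(200, 10, 10))  # $200 per kW-year from launch
```

Under assumptions like these, the launch premium shrinks to the same order as terrestrial electricity and cooling costs, which is the tipping-point logic behind the crossover claim.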

Beyond 2035: Lunar Storage and Off-World Infrastructure

After the mid-2030s, orbital data centers may expand to the Moon and deep space. Companies are looking into lunar data centers. They see the Moon as a top choice for storing and backing up digital assets. These lunar outposts could be disaster recovery sites, research hubs, and steps toward interplanetary computing.

As the sector matures, consolidation around successful players is likely. Startups like Starcloud could grow by partnering with, or being acquired by, established aerospace and cloud computing companies.

Meanwhile, hybrid Earth-orbit cloud models will likely become the norm. Orbital nodes will work alongside ground data centers, a setup well suited to energy-heavy or time-sensitive tasks.

The Future of Orbital Computing

Orbital data centers represent one of the boldest ideas in digital infrastructure. By moving computing power into space, they offer solutions to the pressing limitations of terrestrial facilities — from soaring energy consumption and water use to the physical constraints of land and grid capacity.

By harnessing continuous solar power, advanced thermal management, and tight integration with satellite networks, orbital data centers could change how and where we process data.

If successful, the first orbital data center launch aboard the ISS in 2027 could be remembered as the start of a new era. What once sounded like science fiction is now on the threshold of becoming mainstream — with the potential to transform not only the digital economy but also the environmental footprint of global computing.