Energy per query: 1.4 Wh

CO2 per query: 0.77 g

Water per query: 5 mL

Processing location: Alibaba Cloud (China / Global)

Provider: Alibaba

Category: Text / Chat

Grid carbon intensity: 550 g CO2/kWh (30% renewable)

How does Qwen 3 235B compare?

[Chart: energy per query (0 to 1.4 Wh) for LLaMA 3.2 1B, Gemini 1.5 Pro, GPT-4.1 Nano, and Qwen 3 235B]

Detailed Breakdown

Energy Consumption

Qwen 3 235B uses a Mixture-of-Experts (MoE) architecture with ~22B active parameters out of 235B total. Energy is estimated at ~1.4 Wh per short query. When served via Alibaba Cloud, it runs on infrastructure comparable in efficiency to major US cloud providers. When self-hosted (the model is open-source), energy use depends entirely on the deployment hardware.
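To illustrate why the MoE design matters for per-query energy, a common rule of thumb (an assumption here, not a figure from this page) puts a transformer forward pass at roughly 2 FLOPs per active parameter per generated token:

```python
# Rough forward-pass compute: ~2 FLOPs per active parameter per token
# (a rule-of-thumb assumption, not a measured figure).
FLOPS_PER_PARAM = 2

def flops_per_token(active_params: float) -> float:
    """Approximate forward-pass FLOPs for one generated token."""
    return FLOPS_PER_PARAM * active_params

moe = flops_per_token(22e9)     # Qwen 3 235B: ~22B active parameters
dense = flops_per_token(235e9)  # hypothetical dense model at full 235B

print(f"MoE:   {moe:.2e} FLOPs/token")
print(f"Dense: {dense:.2e} FLOPs/token")
print(f"MoE needs ~{dense / moe:.1f}x less compute per token")
```

Under this approximation, routing to ~22B of 235B parameters cuts per-token compute by roughly an order of magnitude versus a dense model of the same size, which is why the per-query energy estimate is far lower than the parameter count alone would suggest.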

Power Source & Carbon

Alibaba Cloud serves Qwen primarily from data centres in China, with a grid carbon intensity of approximately 550 gCO₂/kWh. As an open-source model, it can also be self-hosted globally.

Water Usage

Estimated at approximately 5.2 mL per query when served from Chinese data centres, reflecting both the energy requirements and regional cooling infrastructure.

What does your Qwen 3 235B usage cost the planet?

Use our calculator to estimate your personal environmental footprint based on how often you use Qwen 3 235B.

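A back-of-the-envelope version of such an estimate, using only the per-query figures from this page (1.4 Wh, 0.77 g CO2, 5 mL water), might look like this sketch:

```python
# Per-query figures taken from this page (estimates, not measurements).
ENERGY_WH = 1.4   # Wh per query
CO2_G = 0.77      # g CO2 per query
WATER_ML = 5.0    # mL per query

def yearly_footprint(queries_per_day: float) -> dict:
    """Estimate a year's footprint from average daily query volume."""
    queries = queries_per_day * 365
    return {
        "energy_kwh": queries * ENERGY_WH / 1000,
        "co2_kg": queries * CO2_G / 1000,
        "water_l": queries * WATER_ML / 1000,
    }

# Example: 20 queries a day for a year
print(yearly_footprint(20))
```

At 20 queries a day, that works out to roughly 10 kWh of energy, 5.6 kg of CO2, and 37 L of water per year under these assumptions.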

Frequently Asked Questions

How much energy does Qwen 3 235B use per query?

Each Qwen 3 235B query consumes approximately 1.4 Wh of energy, roughly 5x the energy of a traditional Google search (~0.3 Wh).

What is Qwen 3 235B's carbon footprint?

Based on the carbon intensity of Alibaba Cloud (China / Global), each query produces approximately 0.77 g of CO2. The grid in this region has a carbon intensity of 550 g CO2/kWh with 30% renewable energy.
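The arithmetic behind this figure is simply energy per query (in kWh) multiplied by the grid's carbon intensity:

```python
# Carbon per query = energy per query (kWh) x grid carbon intensity (g CO2/kWh).
energy_wh = 1.4            # per-query energy estimate from this page
intensity_g_per_kwh = 550  # Alibaba Cloud (China) grid estimate

co2_g = (energy_wh / 1000) * intensity_g_per_kwh
print(f"{co2_g:.2f} g CO2 per query")  # prints "0.77 g CO2 per query"
```

The same formula explains why self-hosting on a cleaner grid would lower the footprint: halving the carbon intensity halves the per-query CO2.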

How much water does Qwen 3 235B use?

Each query consumes approximately 5 mL of water, primarily used for cooling the data centers that process the request.

How does Qwen 3 235B compare to a Google search?

A Qwen 3 235B query uses roughly 5x more energy than a Google search: approximately 0.3 Wh for a Google search versus 1.4 Wh for Qwen 3 235B.
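The exact ratio from these two estimates is slightly under 5:

```python
google_wh = 0.3  # estimated energy per Google search
qwen_wh = 1.4    # estimated energy per Qwen 3 235B query

print(f"~{qwen_wh / google_wh:.1f}x")  # prints "~4.7x", rounded to 5x above
```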

Technical Details

Architecture: Transformer Mixture-of-Experts (decoder-only)

Parameters: 235B

Context window: 128,000 tokens

Release date: 2025-04-28

Open source: Yes

Training data cutoff: 2025-03