Qwen3 235B A22B Thinking 2507

Tags: Instruct · Reasoning · Tools
Qwen3-235B-A22B-Thinking-2507, part of the Qwen3-235B series released in 2025, is an open-weight Mixture-of-Experts large language model with 235 billion total parameters, of which roughly 22 billion are activated per token. It excels at logical reasoning, mathematics, science, coding, and academic benchmarks, achieving state-of-the-art results among open-source thinking models. It also demonstrates strong instruction following, tool use, and text generation, with improved alignment to human preferences. With support for up to 262,144 tokens of context, it is optimized for step-by-step reasoning, agentic workflows, and complex multilingual tasks, making it one of the most capable open-weight variants available.
Provider | Context Size | Throughput | Latency | Input Cost | Output Cost

Usage

Generate your API key and query the model through the OpenAI-compatible interface, as in the example below. The preference parameter lets you define the routing strategy. For more details, see the documentation.
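The snippet below is a minimal sketch using the official openai Python package against an OpenAI-compatible endpoint. The base URL, the API_KEY environment variable, the exact model identifier, and the preference value shown here are placeholder assumptions; substitute the values from your account and the documentation.

```python
import os

from openai import OpenAI

# Placeholder endpoint and key variable; replace with the values your
# provider documents for the OpenAI-compatible interface.
client = OpenAI(
    base_url="https://api.example.com/v1",
    api_key=os.environ["API_KEY"],
)

response = client.chat.completions.create(
    # Model identifier is assumed; check the model listing for the exact string.
    model="Qwen3-235B-A22B-Thinking-2507",
    messages=[
        {"role": "user", "content": "Prove that the square root of 2 is irrational."}
    ],
    # The routing preference is provider-specific; passing it via extra_body is
    # an assumption about how this OpenAI-compatible API accepts extra fields.
    extra_body={"preference": "throughput"},
)

print(response.choices[0].message.content)
```

Because the interface is OpenAI-compatible, any client that speaks the Chat Completions API can be pointed at the endpoint by swapping the base URL and API key.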
