Tongyi DeepResearch 30B A3B: Model 24/100 via "agentic-long-horizon-research-execution"
Tongyi DeepResearch is an agentic large language model developed by Tongyi Lab, with 30 billion total parameters of which only about 3 billion are activated per token. It is optimized for long-horizon, deep information-seeking tasks...
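For intuition, the sketch below shows what a long-horizon information-seeking loop generally looks like: the model alternates between choosing an action and observing tool output until it decides it can answer. This is a minimal generic sketch, not Tongyi DeepResearch's actual interface; `call_model`, `run_tool`, the action schema, and the step budget are hypothetical placeholders.

```python
from dataclasses import dataclass, field

def call_model(history):
    """Hypothetical stand-in for the LLM call; a real agent queries the model here."""
    # Toy policy: search first, then answer once any evidence has been observed.
    if any(role == "observation" for role, _ in history):
        return {"type": "answer", "content": "synthesized answer from gathered evidence"}
    return {"type": "tool", "name": "web_search", "query": history[0][1]}

def run_tool(action):
    """Hypothetical tool executor (e.g. web search, page fetch)."""
    return f"results for: {action['query']}"

@dataclass
class ResearchAgent:
    """Minimal think -> act -> observe loop for long-horizon information seeking."""
    max_steps: int = 30                          # step budget for the long horizon
    history: list = field(default_factory=list)

    def run(self, question: str) -> str:
        self.history.append(("question", question))
        for _ in range(self.max_steps):
            action = call_model(self.history)    # model decides: take an action or answer
            if action["type"] == "answer":
                return action["content"]
            self.history.append(("observation", run_tool(action)))
        return "step budget exhausted"

print(ResearchAgent().run("What is sparse activation?"))
```

Because every step pays a full model inference, per-token efficiency compounds over the dozens of steps such a loop can run, which is where the sparse-activation design below pays off.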
Unique: A 30B-parameter model that activates only about 3B parameters per token, enabling efficient long-horizon agentic loops without the computational cost of full-parameter activation. The sparse, MoE-style activation pattern allows the model to maintain extended reasoning chains while keeping inference latency competitive with much smaller models.
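To make the sparse-activation idea concrete, here is a minimal top-k mixture-of-experts routing sketch in NumPy: a gate scores the experts, only the top-k run for a given token, and their outputs are combined with renormalized gate weights. The expert count, layer shapes, and gating details are illustrative assumptions, not the model's published architecture.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route one token representation through its top-k experts (softmax-gated)."""
    logits = x @ gate_w                        # gate score per expert: (num_experts,)
    top = np.argsort(logits)[-top_k:]          # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                   # renormalize gate scores over selected experts
    # Only the selected experts run; the rest stay inactive for this token.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, num_experts = 8, 4
gate_w = rng.normal(size=(d, num_experts))
# Each "expert" here is a tiny feed-forward layer; real experts are full MLP blocks.
expert_ws = [rng.normal(size=(d, d)) for _ in range(num_experts)]
experts = [lambda x, w=w: np.tanh(x @ w) for w in expert_ws]

token = rng.normal(size=d)
out = moe_forward(token, gate_w, experts)
print(out.shape)  # (8,)
```

The point of the pattern is that per-token compute scales with `top_k` experts rather than with total parameters, which is why a 30B-total model can run at roughly the cost of its ~3B active slice.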
vs others: More efficient than dense 30B models for research tasks thanks to sparse activation; retains deeper reasoning capability than 7B-13B models while avoiding the latency penalties of 70B+ dense models.