Capability
Intelligent Model Selection for Gemini API
19 artifacts provide this capability.
Top Matches
via “Knowledge Distillation from Gemini Models with Capability Preservation”
Google's efficient open model, competitive above its weight class.
Unique: Distillation specifically targets reasoning and instruction-following capabilities from Gemini rather than generic language modeling, using synthetic data generation and response ranking to preserve complex reasoning patterns in a much smaller model.
vs others: Achieves 70B-class reasoning performance at 27B scale more effectively than the standard distillation approaches used for Llama 2 or Mistral, because it leverages Gemini's superior reasoning as the teacher model rather than distilling from same-scale peers.
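The core mechanism behind teacher-student distillation like this is training the smaller model to match the larger model's output distribution. The sketch below is a generic illustration of that objective, not the actual Gemini/Gemma training pipeline: it computes the standard temperature-softened KL-divergence distillation loss between hypothetical teacher and student logits (all function and variable names here are illustrative assumptions).

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, averaged over examples.

    Scaled by T^2 so gradient magnitudes stay comparable across temperatures
    (the convention from Hinton et al.'s distillation formulation).
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    )
    return float(np.mean(kl) * temperature ** 2)

# A student matching the teacher exactly incurs ~zero loss;
# a student that disagrees is penalized.
teacher = np.array([[1.0, 2.0, 3.0]])
aligned_student = np.array([[1.0, 2.0, 3.0]])
wrong_student = np.array([[3.0, 2.0, 1.0]])
print(distillation_loss(aligned_student, teacher))  # near 0
print(distillation_loss(wrong_student, teacher))    # positive
```

In a capability-preserving setup like the one described above, the teacher distribution would come from Gemini responses on synthetic reasoning prompts, with response ranking used to filter which teacher outputs the student is trained to match.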