Capability
Model Compression Through Pruning and Distillation
3 artifacts provide this capability.
DeepSpeed: Microsoft's distributed training library, known for the ZeRO optimizer, trillion-parameter scale, and RLHF support.
Unique: Combines structured pruning with knowledge distillation; supports both unstructured and structured sparsity patterns, with automatic fine-tuning to recover the accuracy lost to pruning (see the sketch below)
vs others: More integrated than chaining separate pruning and distillation tools; the automatic fine-tuning stage reduces manual effort
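To make the combined recipe concrete, here is a minimal PyTorch sketch of the general technique, not the artifact's actual API: structurally prune a copy of a trained model, then fine-tune it against the original as a distillation teacher. The architecture, sparsity amount, temperature, and loss weighting are illustrative assumptions.

```python
# Minimal sketch: structured pruning + knowledge-distillation fine-tuning.
# All hyperparameters and the toy model below are illustrative assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.nn.utils.prune as prune

# Teacher: the original dense model (assumed already trained).
teacher = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
teacher.eval()

# Student: a copy of the teacher that we prune, then fine-tune.
student = copy.deepcopy(teacher)
student.train()

# Structured pruning: zero out 30% of output rows per layer by L2 norm.
for module in student.modules():
    if isinstance(module, nn.Linear):
        prune.ln_structured(module, name="weight", amount=0.3, n=2, dim=0)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend soft-target KL distillation with the hard-label CE loss."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Fine-tuning loop: recover pruned-away accuracy using the teacher's
# soft targets (random stand-in data here; use real training data).
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)
for _ in range(100):
    x = torch.randn(32, 784)
    y = torch.randint(0, 10, (32,))
    with torch.no_grad():
        t_logits = teacher(x)
    loss = distillation_loss(student(x), t_logits, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Make the pruning permanent by folding the masks into the weights.
for module in student.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")
```

Note that the pruning masks stay applied throughout fine-tuning, so pruned rows remain zero while the surviving weights adapt; prune.remove only folds the masks in at the end.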