Capability
Fine-Tuning on Downstream Chinese NLP Tasks
10 artifacts provide this capability.
via “fine-tuning for task-specific multilingual adaptation”
fill-mask model. 6,313,411 downloads.
Unique: Fine-tuning uses the 2.5 TB multilingual pretraining run as initialization, enabling effective adaptation with 10-100x less labeled data than training from scratch; a unified vocabulary across 101 languages lets a single fine-tuned model handle multiple languages.
vs. others: Requires 10-100x less labeled data than training language-specific models from scratch, and retains cross-lingual transfer better than language-specific BERT variants when fine-tuned on multilingual data.
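A minimal sketch of what this fine-tuning workflow typically looks like with a multilingual fill-mask checkpoint and a Chinese classification task, using the Hugging Face Transformers Trainer. The checkpoint name, dataset, and hyperparameters below are illustrative assumptions, not details taken from this listing.

```python
# Sketch: fine-tune a multilingual masked-LM checkpoint on a Chinese
# sentence-classification task. Names and hyperparameters are assumptions.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

model_name = "xlm-roberta-base"  # assumed multilingual fill-mask checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
# The pretrained encoder weights are reused; only a fresh classification head
# is initialized, which is what keeps the labeled-data requirement small.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical Chinese sentiment dataset with "text" and "label" columns.
dataset = load_dataset("seamew/ChnSentiCorp")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="xlmr-chinese-clf",
    learning_rate=2e-5,              # small LR: adapt the encoder, don't retrain it
    num_train_epochs=3,
    per_device_train_batch_size=16,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,             # enables dynamic padding via the default collator
)
trainer.train()
```

Because the vocabulary and encoder are shared across languages, the same script can be pointed at non-Chinese labeled data, and the resulting model typically transfers to related languages better than a monolingual baseline.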