Capability
Cross Lingual Knowledge Transfer
6 artifacts provide this capability.
via “cross-lingual transfer learning with shared vocabulary”
Translation model. 717,998 downloads.
Unique: a shared 32K SentencePiece vocabulary spanning 101 languages lets cross-lingual attention patterns transfer knowledge from high-resource to low-resource pairs. Unlike language-pair-specific models, a single encoder learns a unified multilingual representation space through C4 pretraining.
vs others: broader language coverage than mBART (50 languages), with a unified vocabulary; unlike separate bilingual models, it enables zero-shot translation between unseen language pairs.
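The shared-vocabulary mechanism above can be sketched with a toy, purely illustrative subword tokenizer (the vocabulary and `tokenize` helper below are hypothetical, not the artifact's actual SentencePiece model): because every language is segmented against one inventory of pieces, all languages land in a single ID space, and any piece that appears in two languages gets the same embedding, which is what lets knowledge flow between pairs.

```python
# Toy illustration (hypothetical vocabulary, not the real 32K model):
# one shared subword inventory maps every language into a single ID
# space; pieces common to two languages ("▁in" below) get identical IDs,
# so their learned representations are shared across language pairs.

PIECES = ["<unk>", "▁the", "▁cat", "▁sat", "▁die", "▁Katze", "▁saß", "▁in"]
VOCAB = {p: i for i, p in enumerate(PIECES)}

def tokenize(text, vocab=VOCAB, max_len=8):
    """Greedy longest-match subword segmentation with <unk> fallback."""
    s = "▁" + text.replace(" ", "▁")   # SentencePiece-style word marker
    ids, i = [], 0
    while i < len(s):
        for l in range(min(max_len, len(s) - i), 0, -1):
            if s[i:i + l] in vocab:    # take the longest known piece
                ids.append(vocab[s[i:i + l]])
                i += l
                break
        else:                          # no piece matched this position
            ids.append(vocab["<unk>"])
            i += 1
    return ids

en = tokenize("the cat sat in")       # English sentence
de = tokenize("die Katze saß in")     # German sentence, same ID space
print(en, de, set(en) & set(de))      # overlap comes from the shared piece
```

In the real model the same effect operates at scale: subwords shared across scripts and cognate vocabularies give low-resource languages embeddings that were already shaped by high-resource training data.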