Capability
Neural Machine Translation With Task Prefix Conditioning
8 artifacts provide this capability.
Top Matches
via “fine-tuning on custom tasks with task-prefix adaptation”
Translation model. 2,270,077 downloads.
Unique: Task-prefix conditioning enables multi-task fine-tuning in a single model with no architectural changes; prefixes are plain-text prompts that condition generation, so no explicit task-specific heads or adapters are needed (see the sketch below).
vs others: More compute-efficient than training from scratch; the task-prefix approach is simpler than adapter-based fine-tuning but less parameter-efficient than LoRA, since all model weights are updated rather than a small set of low-rank matrices.
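
The sketch below illustrates the mechanics of task-prefix conditioning using the Hugging Face transformers library with a T5-style seq2seq model; the `t5-small` checkpoint and the two prefixes are illustrative assumptions, not details of the artifact listed above.

```python
# Minimal sketch of task-prefix conditioning (assumption: a T5-style
# model loaded via Hugging Face transformers; "t5-small" is an
# illustrative checkpoint, not the artifact from this listing).
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

def run_task(prefix: str, text: str) -> str:
    # The prefix is ordinary text prepended to the input; the same
    # weights serve every task, with no task-specific heads or adapters.
    inputs = tokenizer(prefix + text, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Two tasks, one model: only the prefix changes.
print(run_task("translate English to German: ", "The house is wonderful."))
print(run_task("summarize: ", "Task prefixes let one model handle many tasks."))
```

Fine-tuning on custom tasks follows the same pattern: each training example is prepended with its task prefix, so a single set of weights learns to switch behavior on the prefix alone.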