bart-large-cnn ("cnn-dailymail-domain-optimized-summarization-with-journalistic-style-transfer")
Summarization model. 1,966,142 downloads.
Unique: Fine-tuned on 300K+ CNN/DailyMail news article-summary pairs, learning journalistic conventions (inverted pyramid, entity preservation, lead generation) that generic summarization models lack. The domain specialization is baked into the model weights through supervised fine-tuning on real news data, not through prompt engineering or post-processing.
vs others: Achieves higher ROUGE scores on the CNN/DailyMail benchmark than generic T5 or GPT-2 baselines; produces more journalistically coherent summaries than extractive methods; more specialized than general-purpose BART, with faster inference than larger domain-specific models like PEGASUS-large.
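The ROUGE comparison above rests on unigram/n-gram overlap between generated and reference summaries. As an illustration only (this is not the official evaluation script, and real benchmarks use stemming, ROUGE-2, and ROUGE-L as well), a minimal ROUGE-1 F1 can be sketched in pure Python:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap ROUGE-1 F1 between a candidate summary and a reference.
    Illustrative sketch: whitespace tokenization, no stemming or stopword handling."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# An identical summary scores 1.0; one with no shared words scores 0.0.
print(rouge1_f1("the cat sat on the mat", "the cat sat on the mat"))
```

Higher scores under this family of metrics indicate greater lexical agreement with the human-written reference summaries, which is what the CNN/DailyMail benchmark measures.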