emotion-english-distilroberta-base — multi-class emotion classification from English text
Text-classification model by Jochen Hartmann. 724,277 downloads.
Unique: Uses DistilRoBERTa (a knowledge-distilled RoBERTa) rather than full RoBERTa or BERT, cutting model size by roughly 40% while keeping 7-class emotion granularity. Fine-tuned on Twitter/Reddit corpora (informal, emoji-rich, sarcasm-heavy text) rather than generic sentiment datasets, so it handles social-media edge cases better. Exposes the standard Hugging Face transformers pipeline interface, allowing seamless integration with text-embeddings-inference servers and cloud deployments (Azure, AWS SageMaker).
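Because the model uses the standard text-classification pipeline, loading and querying it takes only a few lines. A minimal sketch, assuming the transformers package is installed and the Hub id j-hartmann/emotion-english-distilroberta-base is reachable; the example sentence is invented:

```python
from transformers import pipeline

# Load the classifier through the standard text-classification pipeline.
# Hub id assumed here: j-hartmann/emotion-english-distilroberta-base.
classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
    top_k=None,  # return scores for all seven emotion classes, not just the top one
)

# For a single input string the pipeline returns a list with one entry:
# a list of {'label': ..., 'score': ...} dicts, sorted by probability.
scores = classifier("I can't believe they cancelled my favorite show")[0]
for entry in scores:
    print(f"{entry['label']}: {entry['score']:.3f}")
```

Dropping `top_k=None` returns only the single highest-scoring class, which is the more common setting when you just need one emotion label per text.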
Smaller and faster than full RoBERTa-based emotion models (40% fewer parameters) while maintaining competitive accuracy on social-media text; more granular than binary sentiment classifiers (7 emotion classes vs. positive/negative); and more accessible than proprietary APIs (open-source, no rate limits, can run on-device).