Capability
Long-Context Processing with 1M-Token Support (InternLM2.5)
20 artifacts provide this capability.
Top Matches
via “128K token context window for long-document processing”
Ultra-lightweight 1B model for on-device AI.
Unique: a 128K context window on a 1B model enables long-document processing on edge devices. Most 1B models offer only 2K-4K context windows, and larger models with 128K context typically require cloud deployment.
vs. others: far larger context than typical 1B models (which average 2K-4K tokens), enabling document-level tasks; it matches the 128K context of Llama 3.2 11B/90B while being small enough to deploy on mobile.
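To make the context-window comparison concrete, here is a minimal sketch that estimates whether a document fits in a given window. The helper name, the ~1.3 tokens-per-word ratio, and the output reserve are illustrative assumptions, not values from any specific tokenizer or model.

```python
def fits_in_context(num_words: int, context_tokens: int,
                    tokens_per_word: float = 1.3,      # rough English average; assumption
                    reserve_for_output: int = 2048) -> bool:
    """Estimate whether a document of `num_words` fits in a model's
    context window, leaving room for the generated response."""
    est_tokens = int(num_words * tokens_per_word)
    return est_tokens + reserve_for_output <= context_tokens

# A 50,000-word report overflows a typical 4K window on a small model...
print(fits_in_context(50_000, 4_096))     # → False
# ...but fits comfortably in a 128K window (131,072 tokens).
print(fits_in_context(50_000, 131_072))   # → True
```

Under these assumptions, a 2K-4K window caps input at roughly 1,500-3,000 words, while 128K accommodates book-length documents in a single pass.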