CoPE Collection CoPE is a drop-in enhancement of RoPE that delivers consistent gains within the training context and during long-context extrapolation. • 9 items • Updated 18 days ago • 2
CoPE: Clipped RoPE as A Scalable Free Lunch for Long Context LLMs Paper • 2602.05258 • Published 19 days ago • 7
Accurate Failure Prediction in Agents Does Not Imply Effective Failure Prevention Paper • 2602.03338 • Published 21 days ago • 26
Llama-3.1-FoundationAI-SecurityLLM-Reasoning-8B Technical Report Paper • 2601.21051 • Published 26 days ago • 13
Quartet II: Accurate LLM Pre-Training in NVFP4 by Improved Unbiased Gradient Estimation Paper • 2601.22813 • Published 25 days ago • 56
EEG Foundation Models: Progresses, Benchmarking, and Open Problems Paper • 2601.17883 • Published 29 days ago • 20
OCRVerse: Towards Holistic OCR in End-to-End Vision-Language Models Paper • 2601.21639 • Published 26 days ago • 50
Self-Improving Pretraining: using post-trained models to pretrain better models Paper • 2601.21343 • Published 26 days ago • 17
CGPT: Cluster-Guided Partial Tables with LLM-Generated Supervision for Table Retrieval Paper • 2601.15849 • Published Jan 22 • 14
AVMeme Exam: A Multimodal Multilingual Multicultural Benchmark for LLMs' Contextual and Cultural Knowledge and Thinking Paper • 2601.17645 • Published about 1 month ago • 23
Linear representations in language models can change dramatically over a conversation Paper • 2601.20834 • Published 26 days ago • 21
Article Introducing Waypoint-1: Real-time interactive video diffusion • Overworld • Jan 20 • 40
The Assistant Axis: Situating and Stabilizing the Default Persona of Language Models Paper • 2601.10387 • Published Jan 15 • 12
LucaOne Collection Generalized biological foundation model with unified nucleic acid and protein language (Nature Machine Intelligence), https://github.com/LucaOne/LucaOne • 6 items • Updated Dec 31, 2025 • 2
TokSuite: Measuring the Impact of Tokenizer Choice on Language Model Behavior Paper • 2512.20757 • Published Dec 23, 2025 • 18