Goekdeniz-Guelmez's Collections

JOSIE-MoE

updated about 23 hours ago

JOSIE models using a custom dynamic Mixture-of-Experts architecture.

  • DynaMoE: Dynamic Token-Level Expert Activation with Layer-Wise Adaptive Capacity for Mixture-of-Experts Neural Networks

    Paper • 2603.01697 • Published 2 days ago
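The description and the paper title point to token-level dynamic expert activation: rather than routing every token to a fixed top-k of experts, each token activates only as many experts as its routing distribution warrants, subject to a per-layer cap. The paper's actual algorithm is not reproduced here; the sketch below is only a rough illustration of that general idea, and the function name, probability threshold, and `max_experts` cap are all hypothetical.

```python
import math


def softmax(xs):
    """Numerically stable softmax over a list of router logits."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]


def dynamic_route(logits, threshold=0.5, max_experts=4):
    """For each token, activate the smallest set of experts whose
    cumulative router probability reaches `threshold`, capped at
    `max_experts` (a stand-in for layer-wise adaptive capacity).

    `logits` is a list of per-token rows, one logit per expert.
    Returns (masks, counts): a boolean activation mask per token
    and the number of experts each token actually uses.
    """
    masks, counts = [], []
    for row in logits:
        probs = softmax(row)
        # Experts ordered by routing probability, highest first.
        order = sorted(range(len(row)), key=lambda i: -probs[i])
        chosen, cum = [], 0.0
        for i in order:
            chosen.append(i)
            cum += probs[i]
            if cum >= threshold or len(chosen) == max_experts:
                break
        masks.append([i in chosen for i in range(len(row))])
        counts.append(len(chosen))
    return masks, counts
```

Under this kind of scheme, a token the router is confident about (one sharply peaked logit) activates a single expert, while an ambiguous token with a flat routing distribution spreads across several, so compute scales with routing uncertainty rather than being fixed per token.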