soft-mixture-of-experts

PyTorch implementation of Soft MoE, introduced by Google Brain in "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)

MIT License

soft-mixture-of-experts - 0.2.0 Latest Release

Published by fkodom about 1 year ago

What's Changed

New Contributors

Full Changelog: https://github.com/fkodom/soft-mixture-of-experts/compare/0.1.0...0.2.0

soft-mixture-of-experts - 0.1.0

Published by fkodom about 1 year ago

First full release for PyPI

  • SoftMoE layer
  • Transformer layers with Soft MoE
  • ViT models, with and without Soft MoE
  • Ablations from the paper
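
The core of the release is the SoftMoE layer. As a rough illustration of what it computes, here is a minimal sketch in plain PyTorch, following the paper's description: tokens are softly dispatched to expert "slots" via learnable slot embeddings, each expert processes its slots, and the outputs are softly combined back per token. This is an illustrative reimplementation under assumed hyperparameter names (`num_experts`, `slots_per_expert`), not the package's actual API:

```python
import torch
import torch.nn as nn


class SoftMoESketch(nn.Module):
    """Minimal Soft MoE layer sketch (not the package's implementation)."""

    def __init__(self, dim: int, num_experts: int = 4, slots_per_expert: int = 1):
        super().__init__()
        self.num_experts = num_experts
        self.slots_per_expert = slots_per_expert
        # One learnable slot embedding per (expert, slot) pair.
        self.phi = nn.Parameter(torch.randn(dim, num_experts * slots_per_expert))
        # Each expert is a small MLP.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        logits = x @ self.phi                 # (batch, tokens, experts * slots)
        dispatch = logits.softmax(dim=1)      # normalize over tokens -> slot inputs
        combine = logits.softmax(dim=-1)      # normalize over slots -> token outputs
        slots = dispatch.transpose(1, 2) @ x  # (batch, experts * slots, dim)
        slots = slots.view(
            x.size(0), self.num_experts, self.slots_per_expert, x.size(-1)
        )
        # Each expert processes only its own slots.
        outs = torch.stack(
            [expert(slots[:, i]) for i, expert in enumerate(self.experts)], dim=1
        )
        outs = outs.flatten(1, 2)             # (batch, experts * slots, dim)
        return combine @ outs                 # (batch, tokens, dim)


layer = SoftMoESketch(dim=64, num_experts=4, slots_per_expert=2)
y = layer(torch.randn(2, 16, 64))  # output keeps the input token shape
```

Because dispatch and combine weights are dense softmaxes rather than hard top-k routing, every token contributes to every slot and the layer is fully differentiable, which is the paper's central departure from sparse MoE.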
soft-mixture-of-experts - 0.1.0rc1

Published by fkodom about 1 year ago

First prerelease candidate for PyPI

Related Projects