maximal update parametrization (µP)
MIT License
Published by thegregyang over 2 years ago
Initial release.
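µP's point is that optimal hyperparameters found on a small model transfer to wider ones. As a rough, hedged illustration only (plain Python arithmetic, not this library's actual API), the widely stated µP scaling rules for Adam on "matrix-like" hidden weights can be sketched as multipliers relative to a base width; the function name `mup_multipliers` is hypothetical:

```python
import math

def mup_multipliers(base_width: int, width: int) -> dict:
    """Sketch of µP width-scaling rules relative to a base model (Adam case).

    Assumed rules, stated informally in the µP literature:
      - hidden-weight init std scales like 1/sqrt(fan_in)
      - hidden-weight Adam learning rate scales like 1/fan_in
      - output logits are multiplied by base_fan_in/fan_in
    This is an illustrative sketch, not the mup package's interface.
    """
    ratio = width / base_width
    return {
        "init_std_mult": 1 / math.sqrt(ratio),  # rescale init std
        "adam_lr_mult": 1 / ratio,              # rescale Adam LR
        "output_mult": 1 / ratio,               # rescale output logits
    }

# Widening 4x: init std halves, Adam LR and output multiplier shrink 4x.
m = mup_multipliers(256, 1024)
```

In practice one tunes hyperparameters at `base_width` and applies such multipliers when scaling up, rather than re-tuning the wide model.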