Common PyTorch Modules
Repo for WWW 2022 paper: Progressively Optimized Bi-Granular Document Representation for Scalable...
Secure AI Solutions
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Foundation Architecture for (M)LLMs
Community for applying LLMs to robotics and a robot simulator with ChatGPT integration
Building modular LMs with parameter-efficient fine-tuning.
A unified wrapper for various ML frameworks - to have one uniform scikit-learn format for predict...
MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.
Educational resources for AI systems.
Workshop materials for building intelligent solutions with OpenAI
Speeds up long-context LLM inference by computing attention with approximate and dynamic sparsity,...