Tutel MoE: An Optimized Mixture-of-Experts Implementation
A library for reproducible deep learning.
The ORBIT dataset is a collection of videos of objects in clean and cluttered scenes recorded by ...
Project for open-sourcing research efforts on Backward Compatibility in Machine Learning
Foundation Architecture for (M)LLMs
multilspy is an LSP client library in Python intended to be used to build applications around langu...
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
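To illustrate the idea behind loralib: a LoRA layer keeps the pretrained weight W frozen and learns a low-rank update BA, so the effective weight becomes W + BA and only the small rank-r factors are trained. A minimal sketch of typical usage follows; the layer sizes and rank are arbitrary example values, not taken from the repository.

```python
import torch
import torch.nn as nn
import loralib as lora

# Swap a standard nn.Linear for a LoRA-augmented linear layer.
# The frozen weight W stays intact; only the rank-r factors A and B are trained.
model = nn.Sequential(
    lora.Linear(768, 768, r=8),  # 768 and r=8 are arbitrary example values
    nn.ReLU(),
    lora.Linear(768, 10, r=8),
)

# Freeze everything except the LoRA parameters before fine-tuning.
lora.mark_only_lora_as_trainable(model)

# After training, save only the (small) LoRA weights.
torch.save(lora.lora_state_dict(model), "lora_checkpoint.pt")
```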
[NeurIPS'24 Spotlight] To speed up Long-context LLMs' inference, approximate and dynamic sparse c...
Windows Agent Arena (WAA) 🪟 is a scalable OS platform for testing and benchmarking of multi-modal...
ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
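For ONNX Runtime, the usual Python entry point is an InferenceSession. A minimal inference sketch is shown below; the model path, input shape, and execution provider are placeholder assumptions rather than details from the repository.

```python
import numpy as np
import onnxruntime as ort

# Load an exported ONNX model; "model.onnx" is a placeholder path.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Look up the model's expected input name and feed a dummy tensor of a plausible shape.
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)  # example image-like input

outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)
```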
Repo for WWW 2022 paper: Progressively Optimized Bi-Granular Document Representation for Scalable...
VPTQ: a flexible, extreme low-bit quantization algorithm
FS-Mol is a Few-Shot Learning Dataset of Molecules, containing molecular compounds with measurem...