SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
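As background on the INT8 format named above (this is a self-contained NumPy sketch of the general technique, not Neural Compressor's API), symmetric per-tensor INT8 quantization maps a float tensor onto the signed 8-bit grid with a single scale factor:

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric per-tensor INT8 quantization: map [-amax, amax] to [-127, 127]."""
    amax = np.abs(x).max()
    scale = amax / 127.0 if amax > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an FP32 approximation of the original tensor."""
    return q.astype(np.float32) * scale

x = np.array([-2.0, -0.5, 0.0, 1.0, 2.0], dtype=np.float32)
q, scale = quantize_int8(x)
x_hat = dequantize_int8(q, scale)
# Rounding error is bounded by scale / 2 per element (no clipping occurs here)
assert np.all(np.abs(x - x_hat) <= scale / 2 + 1e-6)
```

Lower-bit formats (INT4, FP4, NF4) follow the same quantize/dequantize pattern with a coarser grid, which is why they trade more accuracy for a smaller footprint.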
Apache-2.0 License
Published by ftian1 almost 3 years ago
Features
Validated Configurations
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/neural-compressor.git) | `$ git clone https://github.com/intel/neural-compressor.git` |
| Binary | [Pip](https://pypi.org/project/neural-compressor) | `$ pip install neural-compressor` |
| Binary | [Conda](https://anaconda.org/intel/neural-compressor) | `$ conda install neural-compressor -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 almost 3 years ago
Features
Productivity
Ecosystem
Validated Configurations
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/neural-compressor.git) | `$ git clone https://github.com/intel/neural-compressor.git` |
| Binary | [Pip](https://pypi.org/project/neural-compressor) | `$ pip install neural-compressor` |
| Binary | [Conda](https://anaconda.org/intel/neural-compressor) | `$ conda install neural-compressor -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 almost 3 years ago
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) v1.7 release highlights:
Features
Ecosystem
Documentation
Validated Configurations
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/neural-compressor.git) | `$ git clone https://github.com/intel/neural-compressor.git` |
| Binary | [Pip](https://pypi.org/project/neural-compressor) | `$ pip install neural-compressor` |
| Binary | [Conda](https://anaconda.org/intel/neural-compressor) | `$ conda install neural-compressor -c conda-forge -c intel` |

Please feel free to contact INC Maintainers if you have any questions.
Published by ftian1 about 3 years ago
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) v1.7 release highlights:
Features
Productivity
Ecosystem
Documentation
Validated Configurations
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/neural-compressor.git) | `$ git clone https://github.com/intel/neural-compressor.git` |
| Binary | [Pip](https://pypi.org/project/neural-compressor) | `$ pip install neural-compressor` |
| Binary | [Conda](https://anaconda.org/intel/neural-compressor) | `$ conda install neural-compressor -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 about 3 years ago
Intel® Low Precision Optimization Tool v1.6 release highlights:
Pruning:
Quantization:
User Experience:
New Models:
Documentation:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 about 3 years ago
Intel® Low Precision Optimization Tool v1.5.1 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 over 3 years ago
Intel® Low Precision Optimization Tool v1.5 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 over 3 years ago
Intel® Low Precision Optimization Tool v1.4.1 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 over 3 years ago
Intel® Low Precision Optimization Tool v1.4 release highlights:
Quantization
Pruning
Model Zoo
User Experience
Extended Capabilities
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 over 3 years ago
Intel® Low Precision Optimization Tool v1.3 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 over 3 years ago
Intel® Low Precision Optimization Tool v1.3 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 over 3 years ago
Intel® Low Precision Optimization Tool v1.2.1 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 over 3 years ago
Intel® Low Precision Optimization Tool v1.2 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 almost 4 years ago
Intel® Low Precision Optimization Tool v1.1 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lpot.git) | `$ git clone https://github.com/intel/lpot.git` |
| Binary | [Pip](https://pypi.org/project/lpot) | `$ pip install lpot` |
| Binary | [Conda](https://anaconda.org/intel/lpot) | `$ conda install lpot -c conda-forge -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 almost 4 years ago
Intel® Low Precision Optimization Tool v1.0 release highlights:
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lp-opt-tool.git) | `$ git clone https://github.com/intel/lp-opt-tool.git` |
| Binary | [Pip](https://pypi.org/project/ilit) | `$ pip install ilit` |
| Binary | [Conda](https://anaconda.org/intel/ilit) | `$ conda install ilit -c intel` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 about 4 years ago
Intel® Low Precision Optimization Tool v1.0 beta release highlights:
TensorFlow Model | Category |
---|---|
ResNet50 V1 | Image Recognition |
ResNet50 V1.5 | Image Recognition |
ResNet101 | Image Recognition |
Inception V1 | Image Recognition |
Inception V2 | Image Recognition |
Inception V3 | Image Recognition |
Inception V4 | Image Recognition |
ResNetV2_50 | Image Recognition |
ResNetV2_101 | Image Recognition |
ResNetV2_152 | Image Recognition |
Inception ResNet V2 | Image Recognition |
SSD ResNet50 V1 | Object Detection |
Wide & Deep | Recommendation |
VGG16 | Image Recognition |
VGG19 | Image Recognition |
Style_transfer | Style Transfer |
PyTorch Model | Category |
---|---|
BERT-Large RTE | Language Translation |
BERT-Large QNLI | Language Translation |
BERT-Large CoLA | Language Translation |
BERT-Base SST-2 | Language Translation |
BERT-Base RTE | Language Translation |
BERT-Base STS-B | Language Translation |
BERT-Base CoLA | Language Translation |
BERT-Base MRPC | Language Translation |
DLRM | Recommendation |
BERT-Large MRPC | Language Translation |
ResNext101_32x8d | Image Recognition |
BERT-Large SQUAD | Language Translation |
ResNet50 V1.5 | Image Recognition |
ResNet18 | Image Recognition |
Inception V3 | Image Recognition |
YOLO V3 | Object Detection |
Peleenet | Image Recognition |
ResNest50 | Image Recognition |
SE_ResNext50_32x4d | Image Recognition |
ResNet50 V1.5 QAT | Image Recognition |
ResNet18 QAT | Image Recognition |
MXNet Model | Category |
---|---|
ResNet50 V1 | Image Recognition |
MobileNet V1 | Image Recognition |
MobileNet V2 | Image Recognition |
SSD-ResNet50 | Object Detection |
SqueezeNet V1 | Image Recognition |
ResNet18 | Image Recognition |
Inception V3 | Image Recognition |
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lp-opt-tool.git) | `$ git clone https://github.com/intel/lp-opt-tool.git` |
| Binary | [Pip](https://pypi.org/project/ilit) | `$ pip install ilit` |
| Binary | [Conda](https://anaconda.org/intel/ilit) | `$ conda config --add channels intel`<br>`$ conda install ilit` |

Please feel free to contact [email protected] if you have any questions.
Published by ftian1 about 4 years ago
Intel® Low Precision Optimization Tool (iLiT) is an open-source Python library intended to deliver a unified low-precision inference solution across multiple Intel-optimized DL frameworks on both CPU and GPU. It supports automatic accuracy-driven tuning strategies, along with additional objectives such as performance, model size, and memory footprint. It is also easily extensible with new backends, tuning strategies, metrics, and objectives.
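The accuracy-driven tuning idea can be illustrated with a toy loop (plain NumPy; the function names and the weight-error "accuracy" proxy are illustrative, not iLiT's actual API): try progressively lower bit widths and keep the most aggressive configuration whose accuracy stays within a tolerance of the FP32 baseline.

```python
import numpy as np

def fake_quantize(x: np.ndarray, n_bits: int) -> np.ndarray:
    """Symmetric fake-quantization: quantize to n_bits, return dequantized FP32."""
    qmax = 2 ** (n_bits - 1) - 1
    scale = np.abs(x).max() / qmax
    return np.clip(np.round(x / scale), -qmax, qmax) * scale

def tune(weights: np.ndarray, tolerance: float = 0.02):
    """Accuracy-driven search: lower the bit width while the proxy metric holds.

    The 'accuracy' proxy here is 1 minus the mean relative weight error; a real
    tuner would instead evaluate the model on a calibration/evaluation dataset.
    """
    baseline = 1.0  # FP32 is the reference by definition
    best_bits, best_weights = None, None
    for bits in (8, 4, 2):  # candidate configurations, most to least precise
        wq = fake_quantize(weights, bits)
        acc = 1.0 - np.abs(weights - wq).mean() / np.abs(weights).mean()
        if baseline - acc <= tolerance:
            best_bits, best_weights = bits, wq  # passes: remember, try lower
        else:
            break  # accuracy budget exceeded; stop searching
    return best_bits, best_weights

rng = np.random.default_rng(0)
w = rng.standard_normal(1000).astype(np.float32)
bits, wq = tune(w)  # INT8 typically fits a 2% budget; INT4 typically does not
```

A real tuner adds per-layer decisions, multiple strategies (e.g. bayesian, random), and secondary objectives such as latency, but the stop-when-accuracy-drops control flow is the same.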
Model | Framework | Model | Framework | Model | Framework |
---|---|---|---|---|---|
ResNet50 V1 | MXNet | BERT-Large RTE | PyTorch | ResNet18 | PyTorch |
MobileNet V1 | MXNet | BERT-Large QNLI | PyTorch | ResNet50 V1 | TensorFlow |
MobileNet V2 | MXNet | BERT-Large CoLA | PyTorch | ResNet50 V1.5 | TensorFlow |
SSD-ResNet50 | MXNet | BERT-Base SST-2 | PyTorch | ResNet101 | TensorFlow |
SqueezeNet V1 | MXNet | BERT-Base RTE | PyTorch | Inception V1 | TensorFlow |
ResNet18 | MXNet | BERT-Base STS-B | PyTorch | Inception V2 | TensorFlow |
Inception V3 | MXNet | BERT-Base CoLA | PyTorch | Inception V3 | TensorFlow |
DLRM | PyTorch | BERT-Base MRPC | PyTorch | Inception V4 | TensorFlow |
BERT-Large MRPC | PyTorch | ResNet101 | PyTorch | Inception ResNet V2 | TensorFlow |
BERT-Large SQUAD | PyTorch | ResNet50 V1.5 | PyTorch | SSD ResNet50 V1 | TensorFlow |
| Channel | Links | Install Command |
|---|---|---|
| Source | [Github](https://github.com/intel/lp-opt-tool.git) | `$ git clone https://github.com/intel/lp-opt-tool.git` |
| Binary | [Pip](https://pypi.org/project/ilit) | `$ pip install ilit` |
| Binary | [Conda](https://anaconda.org/intel/ilit) | `$ conda config --add channels intel`<br>`$ conda install ilit` |

Please feel free to contact [email protected] if you have any questions.