The backend behind the LLM-Perf Leaderboard
Apache-2.0 License
This repository is no longer actively developed: the LLM Perf backend is now maintained in optimum-benchmark/llm_perf.