Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
MIT License
An ecosystem of Rust libraries for working with large language models
Blazingly fast LLM inference.
Minimalist ML framework for Rust