LLM inference in C/C++
MIT License
maid_llm is a Dart implementation of llama.cpp used by the Mobile Artificial Intelligence Distribution
Llama.cpp in Unity, straightforward and clean
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema ...
Run LLMs locally. A clojure wrapper for llama.cpp.
LLaMA-2 in native Go
Fork of the Fedora spec file for llama.cpp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.
Tutorial on training and evaluating LLMs, as well as using RAG, Agents, and Chains to build entertainin...
Python bindings for llama.cpp
Node.js bindings for llama.cpp
LSP server leveraging LLMs for code completion (and more?)
A simple-to-use, open source GUI for local AI chat on desktop and mobile. Powered by llama.cpp.
Yet another `llama.cpp` Rust wrapper
llama.cpp LLM Provider