🦙 Ollama interfaces for Neovim
MIT License
Published by jpmcb 10 months ago
Welcome! This is the initial release of the nvim-llama plugin, featuring deep integration with the Ollama project.
Enjoy! 🦙 👋🏼
Full Changelog: https://github.com/jpmcb/nvim-llama/commits/v0.0.1
Related projects:
- A web interface for chatting with your local LLMs via the Ollama API
- This repository contains a web application designed to execute relatively compact, locally-operat...
- This repository demonstrates how to do inference with llama-2-7b-chat using llama.cpp on a machin...
- A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leav...
- LightLLM is a Python-based LLM (Large Language Model) inference and serving framework, notable fo...
- Node Llama Cpp wrapper for Node.js
- vnc-lm is a Discord bot that lets you talk with and configure language models in your server. It ...
- llama.go is like llama.cpp in pure Go!
- LLaMA 7B with CUDA acceleration implemented in Rust. Minimal GPU memory needed!
- A simple "Be My Eyes" web app with a llama.cpp/llava backend
- Chat with your favourite LLaMA models in a native macOS app
- Run any large language model behind a unified API
- WebAssembly binding for llama.cpp, enabling in-browser LLM inference
- LLaMA Server combines the power of LLaMA C++ with the beauty of Chatbot UI.
- Yet another operator for running large language models on Kubernetes with ease. Powered by Ollama! 🐫
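Several of the projects above, like nvim-llama itself, talk to a local model through Ollama's HTTP API. As a rough sketch of what such an integration sends, here is how a request body for Ollama's documented `/api/generate` endpoint can be built (the default `localhost:11434` address and the `llama2` model name are assumptions about a typical local install):

```python
import json

# Default address of a local Ollama server (assumption: standard install).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str, stream: bool = False) -> bytes:
    """Serialize a JSON request body for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": stream}
    return json.dumps(payload).encode("utf-8")

body = build_generate_request("llama2", "Why is the sky blue?")

# The body can then be POSTed with any HTTP client, e.g.:
# req = urllib.request.Request(OLLAMA_URL, data=body,
#                              headers={"Content-Type": "application/json"})
# With stream=False, the JSON response carries the full completion in its
# "response" field.
```

Setting `"stream": false` asks the server for one complete JSON object instead of a stream of partial chunks, which keeps client code simple for non-interactive use.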