This tool uses an LLM to generate code from tests, and verifies that the tests pass.
Published by JamesVorder 3 months ago
This release is the first functional version of the tool, and was used to generate a basic rock, paper, scissors game engine.
Full Changelog: https://github.com/JamesVorder/python-tddpp/commits/v0.0.1
Query LLM with Chain-of-Thought
World’s first and simplest AI-oriented programming language using Ollama.
Experimental front-end client library for interacting with llama.cpp
LLM Benchmark for Throughput via Ollama (Local LLMs)
LLaMA Server combines the power of LLaMA C++ with the beauty of Chatbot UI.
Self-hosted llmapi server that makes accessing LLMs easy!
Get up and running with Llama 3.2, Mistral, Gemma 2, and other large language models.
Yet another `llama.cpp` Rust wrapper
LightLLM is a Python-based LLM (Large Language Model) inference and serving framework, notable fo...
LLM-powered code documentation generation
Local first semantic code search and chat powered by vector embeddings and LLMs
The simplest way to run LLaMA on your local machine
A simple, intuitive toolkit for quickly implementing LLM powered applications.
Python library for the instruction and reliable validation of structured outputs (JSON) of Large ...
Running Llama 2 and other open-source LLMs locally on CPU for document Q&A