"A simple and lightweight client-server program for interfacing with local LLMs using ollama, and LLMs in groq using groq api."
MIT License
## Installation

Clone the repository and move into the backend directory:

```shell
git clone https://github.com/im-pramesh10/LocalPrompt.git
cd LocalPrompt/backend
```

Create a virtual environment:

```shell
# Windows
python -m venv venv

# Linux/macOS
python3 -m venv venv

# or, with virtualenv
virtualenv venv
```

Activate the virtual environment:

```shell
# Windows
.\venv\Scripts\activate

# Linux/macOS
source venv/bin/activate
```

Install the dependencies and start the server:

```shell
pip install -r requirements.txt
python simple_async_server.py
```
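Once the server is running, a client can send it prompts over HTTP. The sketch below is illustrative only: the address, route, and JSON body shape are placeholder assumptions, not taken from this project — check `simple_async_server.py` for the actual host, port, and request format.

```python
import json
import urllib.request

# Placeholder address: confirm the host, port, and route in simple_async_server.py.
SERVER_URL = "http://localhost:8000/chat"

def encode_prompt(prompt):
    # The {"prompt": ...} body shape is an assumption; adjust to match the server.
    return json.dumps({"prompt": prompt}).encode("utf-8")

def ask(prompt):
    """POST a prompt to the LocalPrompt backend and return the decoded JSON reply."""
    req = urllib.request.Request(
        SERVER_URL,
        data=encode_prompt(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```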
## Connecting a different LLM via Ollama

To connect LocalPrompt to a different LLM served by Ollama, set up your custom model function in the `api_call.py` file inside the `backend` folder. Until you do, the server responds with a placeholder:

```
{'response':'You need to set up your custom model function inside api_call.py file inside backend folder'}
```
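A custom model function for `api_call.py` might look like the sketch below. It calls Ollama's local `/api/generate` endpoint; the function name, signature, and the `llama3` model name are illustrative assumptions, since the actual structure expected by this project's `api_call.py` is not shown here.

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(prompt, model="llama3"):
    # "stream": False asks Ollama for one JSON object instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def call_custom_model(prompt, model="llama3"):
    """Send a prompt to a locally running Ollama instance and return its reply."""
    data = json.dumps(build_generate_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    # Mirror the {'response': ...} shape shown above.
    return {"response": body.get("response", "")}
```

Swap the default `model` value for any model you have pulled with `ollama pull`.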
> [!NOTE]
> Make sure to restart the server after each change.