CDoc lets you chat with your documents using local large language models (LLMs), combining Ollama, ChromaDB, and LangChain for offline, secure, and efficient information extraction. It is aimed at researchers, developers, and professionals who need quick insights from their documents.
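To make the moving parts concrete, here is a minimal, self-contained sketch of the retrieval step in a pipeline like CDoc's. This is not CDoc's actual code: CDoc stores nomic-embed-text embeddings in ChromaDB and retrieves by vector similarity via LangChain, while this toy version splits a document into chunks and ranks them by simple word overlap to stand in for similarity search.

```python
# Toy illustration of retrieval in a RAG pipeline (not CDoc's real code):
# split a document into chunks, then return the chunk most relevant to a
# query. Word overlap stands in for the embedding similarity CDoc uses.

def chunk(text):
    """Split a document into sentence-level chunks."""
    return [s.strip() for s in text.split(".") if s.strip()]

def score(query, chunk_text):
    """Crude relevance score: count of words shared with the query."""
    return len(set(query.lower().split()) & set(chunk_text.lower().split()))

def retrieve(query, chunks, k=1):
    """Return the k chunks most relevant to the query."""
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:k]

doc = ("Ollama runs models locally. ChromaDB stores embeddings. "
       "LangChain glues the pipeline together")
chunks = chunk(doc)
print(retrieve("where are embeddings stored", chunks))
```

In the real system the retrieved chunks are then passed to the LLM (llama3 here) as context for answering the question.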
Key Features:
Target Users:
Prerequisites:
Installation Steps:
Clone the repository:
git clone https://github.com/ChatDocDev/CDoc
Navigate to the project directory:
cd CDoc
Open the project directory in VSCode (or any other code editor):
code .
Install dependencies from requirements.txt
pip install -r requirements.txt
Pull the required models from Ollama
Download and install Ollama if it is not already installed
Open a terminal and run these commands to pull the required models onto your local machine
For llama3
ollama pull llama3:latest
For nomic-embed-text
ollama pull nomic-embed-text:latest
Ensure both models are downloaded:
ollama ls
Serve Ollama
ollama serve
Go to http://localhost:11434 in your browser
and you should see "Ollama is running"
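If you prefer to check from a script rather than the browser, a small stdlib-only Python check like the following works (assumption: Ollama's default port 11434; it simply tests that an HTTP server answers at the root URL):

```python
# Quick reachability check for the local Ollama server (default port 11434).
# Returns False instead of raising when nothing is listening.
import urllib.request
import urllib.error

def ollama_up(url="http://localhost:11434", timeout=2):
    """Return True if an HTTP server answers at `url`, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("Ollama reachable:", ollama_up())
```

The same check, pointed at a different port, can be reused for the other local services started below.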
BACKEND
Go to the backend directory
cd backend
Create a db folder for storing ChromaDB files
mkdir db
Start the ChromaDB server:
chroma run --path db --port 8001
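To confirm the ChromaDB server came up, you can query its heartbeat endpoint. A hedged stdlib-only sketch (assumptions: port 8001 as started above; the heartbeat path is /api/v2/heartbeat in newer Chroma releases and /api/v1/heartbeat in older ones, so both are tried):

```python
# Sanity check for the local ChromaDB server started with
# `chroma run --path db --port 8001`. Tries both known heartbeat paths
# and returns None instead of raising if the server is unreachable.
import json
import urllib.request
import urllib.error

def chroma_heartbeat(base="http://localhost:8001", timeout=2):
    """Return the decoded heartbeat payload, or None if unreachable."""
    for path in ("/api/v2/heartbeat", "/api/v1/heartbeat"):
        try:
            with urllib.request.urlopen(base + path, timeout=timeout) as resp:
                return json.load(resp)
        except (urllib.error.URLError, OSError, ValueError):
            continue
    return None

print(chroma_heartbeat())
```

A non-None result means the vector store is ready for the backend to connect to.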
Open a new terminal, go into the backend folder (hint: cd backend) and run the backend server:
python backend.py
FRONTEND
Open a new terminal and go to the frontend folder
cd frontend
Run frontend.py
streamlit run frontend.py