Chrome Extension to Summarize or Chat with Web Pages/Local Documents Using locally running LLMs. Keep all of your data and conversations private. 🔐
MIT License
DistiLlama is a Chrome extension that leverages a locally running LLM to summarize and chat with web pages and local documents.
One of the things I was experimenting with was how to use a locally running LLM instance for various tasks, and summarization (tl;dr) was at the top of my list. It was key that all calls to the LLM stay local and that all data remain private.
This project utilizes Ollama as the locally running LLM instance. Ollama is a great project that is easy to set up and use. I highly recommend checking it out.
To generate the summary, I am using the following approach:
Prerequisites:

Start the Ollama server with origins allowed so the extension can reach it:

```sh
OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve
```

Pull a model:

```sh
ollama pull llama2:latest
# or
ollama pull mistral:latest
```
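The Ollama server started above exposes a simple HTTP API on the configured host and port. As a hedged illustration (not DistiLlama's actual code), a summarization call against Ollama's `/api/generate` endpoint could look like this; the prompt wording and the `buildSummaryRequest`/`summarize` helper names are assumptions for this sketch:

```javascript
// Build the JSON payload for a summarization request to Ollama's
// /api/generate endpoint. The prompt template here is illustrative,
// not the one DistiLlama ships with.
function buildSummaryRequest(pageText, model = 'llama2:latest') {
  return {
    model,
    prompt: `Summarize the following page in a few sentences:\n\n${pageText}`,
    stream: false, // ask for a single JSON response instead of a token stream
  };
}

// Send the request (requires the `ollama serve` command above to be running).
async function summarize(pageText) {
  const res = await fetch('http://127.0.0.1:11435/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildSummaryRequest(pageText)),
  });
  const data = await res.json();
  return data.response; // Ollama returns the generated text in `response`
}
```

Because everything goes to `127.0.0.1`, no page content ever leaves your machine.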
To build and load the extension:

1. Clone this repo.
2. Install pnpm: `npm install -g pnpm`
3. Install dependencies: `pnpm install`
4. Build the extension: `pnpm dev`
5. Open `chrome://extensions/` in Chrome, enable Developer mode, click **Load unpacked**, and select the `dist` folder from the base of the cloned project.