tinker-chat

MIT License

This chat app works with OpenAI's GPT models or with your own locally hosted LLM.

GPT from OpenAI

To use GPT from OpenAI, set the environment variable OPENAI_API_KEY to your API key.
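For example (the key below is a placeholder for illustration, not a real credential):

```shell
# Placeholder value; substitute your own OpenAI API key.
export OPENAI_API_KEY="sk-your-key-here"
```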

Local LLM

To run a local LLM with llama.cpp's inference server, first load a quantized model such as Phi-3 Mini, e.g.:

/path/to/llama.cpp/server -m Phi-3-mini-4k-instruct-q4.gguf

Before launching the demo, point the app at the local server by setting the environment variable OPENAI_API_BASE:

export OPENAI_API_BASE=http://127.0.0.1:8080
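With that variable set, an OpenAI-compatible client resolves its chat endpoint against the local server instead of api.openai.com. A minimal sketch of that resolution (the `/v1/chat/completions` path is the standard OpenAI-compatible route, which llama.cpp's server also exposes; the exact logic inside this app is an assumption):

```shell
# Use OPENAI_API_BASE when set; otherwise fall back to the official API host.
export OPENAI_API_BASE=http://127.0.0.1:8080
ENDPOINT="${OPENAI_API_BASE:-https://api.openai.com}/v1/chat/completions"
echo "$ENDPOINT"
```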

Demo

With Node.js >= v18 installed, run:

npm install
npm start

then open http://localhost:5000 in a web browser.

Related Projects