An experimental Ollama client for the terminal
GPL-3.0 License
A wee experimental terminal-based chat client, built to mess around with the Python Ollama library.
There are lots of good, comprehensive terminal-based LLM clients out there. Most of them are feature-rich and very busy, and many concentrate on cloud-hosted backends that raise privacy concerns.
I wanted to toy with something simple, direct, to the point, and locally based.
It's also a tinker toy: a way to experiment with the Ollama Python library it's built around, and with how I can make use of a locally-hosted LLM.