TinyLLM

Setup and run a local LLM and Chatbot using consumer grade hardware.

MIT License

TinyLLM - v0.14.1 - Chatbot Baseprompt

Published by jasonacox 8 months ago

  • Fixed a bug so baseprompt updates correctly respond to saved Settings and new sessions.
  • Updated baseprompt to include the date and guidance for complex and open-ended questions.
  • Added a TZ local timezone environment variable to ensure the correct date appears in the baseprompt.
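
The TZ handling can be sketched as follows (a minimal illustration, not TinyLLM's actual code; it assumes TZ holds an IANA name such as America/Los_Angeles):

```python
import os
from datetime import datetime
from zoneinfo import ZoneInfo

# Read the TZ environment variable (IANA name), falling back to UTC,
# so the date injected into the baseprompt matches the local timezone.
try:
    tz = ZoneInfo(os.environ.get("TZ", "UTC"))
except Exception:
    tz = ZoneInfo("UTC")

today = datetime.now(tz).strftime("%A, %B %d, %Y")
```

The formatted date string can then be interpolated into the baseprompt template at session start.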

Full Changelog: https://github.com/jasonacox/TinyLLM/compare/v0.14.0...v0.14.1

TinyLLM - v0.14.0 - Chatbot Controls

Published by jasonacox 8 months ago

  • Added the ability to change the LLM Temperature and MaxTokens in Settings.
  • Added an optional read-only mode for prompt settings that allows viewing but prevents changes (PROMPT_RO=true).
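
Following the docker invocation pattern used elsewhere in these notes, the read-only mode could be enabled like this (an illustrative fragment; other flags omitted for brevity):

```shell
docker run \
    -d \
    -p 5000:5000 \
    -e PROMPT_RO="true" \
    --name chatbot \
    jasonacox/chatbot
```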

Full Changelog: https://github.com/jasonacox/TinyLLM/compare/v0.13.0...v0.14.0

TinyLLM - v0.13.0 - Use Weaviate for RAG

Published by jasonacox 8 months ago

What's Changed

  • Moved from Qdrant to Weaviate. This externalizes the sentence transformation work and lets the chatbot run as a smaller service. Activate by setting WEAVIATE_HOST to the address of the DB.
  • Added "References" text to the output from /rag queries.
  • Added ONESHOT environment variable; when true, conversation threading is removed so each query is answered as a standalone session.
  • Added RAG_ONLY environment variable; when true, all queries are directed to the default RAG database as set by WEAVIATE_LIBRARY.
  • See https://github.com/jasonacox/TinyLLM/pull/5

docker run \
    -d \
    -p 5000:5000 \
    -e PORT=5000 \
    -e OPENAI_API_BASE="http://localhost:8000/v1" \
    -e ONESHOT="true" \
    -e RAG_ONLY="false" \
    -e WEAVIATE_HOST="localhost" \
    -e WEAVIATE_LIBRARY="tinyllm" \
    -v $PWD/.tinyllm:/app/.tinyllm \
    --name chatbot \
    --restart unless-stopped \
    jasonacox/chatbot

Full Changelog: https://github.com/jasonacox/TinyLLM/compare/v0.12.5...v0.13.0

TinyLLM - v0.12.5 - Chatbot LLM Model

Published by jasonacox 8 months ago

  • Added logic to poll the LLM for its model list. If only one model is available, use it; otherwise verify that the user-requested model is available.
  • Chatbot UI now shows the model name and adds responsive elements to display better on mobile devices.
  • Added encoding of user prompts to correctly display HTML code in the Chatbot.
  • Fixed the chat.py CLI chatbot to handle user/assistant prompts for vLLM.

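
The model-selection logic can be sketched in Python (a hypothetical helper mirroring the described behavior, not TinyLLM's actual code):

```python
def select_model(available, requested=None):
    """Pick the model to use from the LLM server's model list:
    if only one model is served, use it; otherwise verify that
    the user-requested model is actually available."""
    if len(available) == 1:
        return available[0]
    if requested in available:
        return requested
    raise ValueError(f"model {requested!r} not offered by server: {available}")
```

In practice the `available` list would come from polling the server's /v1/models endpoint at startup.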
TinyLLM - v0.12.4 - Chatbot Fixes

Published by jasonacox 8 months ago

  • Added encoding of user prompts to correctly display HTML code in the Chatbot.
  • Fixed the chat.py CLI chatbot to handle user/assistant prompts for vLLM.
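
The prompt-encoding fix amounts to HTML-escaping user input before it reaches the page, so pasted markup renders as text rather than being interpreted; a minimal sketch using the Python standard library:

```python
import html

def encode_prompt(text: str) -> str:
    # Escape HTML-special characters (<, >, &, quotes) so user input
    # is displayed verbatim in the chat window instead of being
    # parsed as markup.
    return html.escape(text)
```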

Full Changelog: https://github.com/jasonacox/TinyLLM/compare/v0.12.3...v0.12.4

TinyLLM - v0.12.3 - Extract from URL

Published by jasonacox 9 months ago

  • Bug fix for handle_url_prompt() to extract text from URL.

Full Changelog: https://github.com/jasonacox/TinyLLM/compare/v0.12.2...v0.12.3

TinyLLM - v0.12.2 - Misc Improvements

Published by jasonacox 9 months ago

v0.12.2

  • Sped up command functions using aiohttp.
  • Fixed prompt_expand for the rag command.
  • Added a topic option to the /news command.

v0.12.1 - Performance Improvements

  • Sped up user prompt echo: immediately send the prompt to the chat window instead of waiting for the LLM stream to start.
  • Optimized message handling dispatch using async.
  • Use AsyncOpenAI for non-streamed queries.
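
The async dispatch pattern can be sketched as below (`fetch_command` is a hypothetical stand-in for the chatbot's command handlers, which in the real code perform network I/O via aiohttp):

```python
import asyncio

async def fetch_command(name: str) -> str:
    # Stand-in for an aiohttp-backed command handler (e.g. /news, /rag);
    # a real handler would await network I/O here.
    await asyncio.sleep(0)
    return f"{name}: done"

async def dispatch(commands):
    # Run all command handlers concurrently instead of sequentially,
    # so one slow request does not block the others.
    return await asyncio.gather(*(fetch_command(c) for c in commands))

results = asyncio.run(dispatch(["news", "weather"]))
```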

Full Changelog: https://github.com/jasonacox/TinyLLM/compare/v0.12.0...v0.12.2

TinyLLM - v0.12.0 - FastAPI and Uvicorn

Published by jasonacox 9 months ago

Chatbot - v0.12.0 - FastAPI and Uvicorn

  • Ported the Chatbot to FastAPI and Uvicorn, an async, high-speed ASGI web server stack (https://github.com/jasonacox/TinyLLM/issues/3).
  • Added a /stats page to display configuration settings and current stats (optionally ?format=json).
  • Updated the UI to help enforce focus on the text entry box.
  • Moved prompts.json and the Sentence Transformer model location to a ./.tinyllm directory for Docker support.

TinyLLM - v0.11.4 - Stats Page

Published by jasonacox 9 months ago

  • Added a /stats URL to the Chatbot for settings and current status information.
  • Updated the Chatbot HTML to set focus on the user textbox.
  • Moved prompts.json and the Sentence Transformer models into the .tinyllm directory.

Full Changelog: https://github.com/jasonacox/TinyLLM/compare/v0.11.3...v0.11.4

TinyLLM - v0.11.3 - Optimize for Docker

Published by jasonacox 9 months ago

Full Changelog: https://github.com/jasonacox/TinyLLM/commits/v0.11.3