A Slack bot that helps you to summarize long messages and threads.
MIT License
Features

- `/tlds` command: you can choose whether the summary is visible to everyone or is ephemeral and only visible to you.
- `Generate summary` message action: the summary is ephemeral and only visible to you.
- `Generate public summary` message action: the summary is visible to everyone.
- `/tlds-help` command: shows the available commands and options.

Setup

1. Create a Slack app using the `manifest.yml` file.
2. Copy the `example.env` file to `.env` and fill in the values.
3. Create `.secrets.toml` and `settings.toml` based on the `example.*.toml` files.

Run the following command to build and run the Docker container:
docker-compose up
If you want to run the container in the background, use the `-d` flag:
docker-compose up -d
If you have Ollama running on the host, the container can reach it via the hostname `host.docker.internal`. For example, if Ollama is listening on port 11434 (the default), setting `base_url` to `http://host.docker.internal:11434` will allow the container to access Ollama.
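As a hypothetical sketch (the real key names are defined by the `example.*.toml` files, so check those before copying), the host-side Ollama endpoint would be configured along these lines:

```toml
# Hypothetical settings sketch — the actual key names come from example.*.toml.
# Point the bot at an Ollama instance running on the Docker host.
base_url = "http://host.docker.internal:11434"
```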
The `docker-compose.yml` file also contains a commented-out Ollama service. Running Ollama on the host or as a standalone service can offer more flexibility, but if you prefer the bundled service, uncomment that section and set `base_url` to `http://ollama:11434`.
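For reference, a bundled Ollama service might look roughly like the sketch below; the service and volume names here are assumptions, so compare against the actual commented-out section in `docker-compose.yml`:

```yaml
# Hypothetical sketch — the real commented-out section in docker-compose.yml
# may differ in service name, ports, or volumes.
services:
  ollama:
    image: ollama/ollama       # official Ollama image
    ports:
      - "11434:11434"          # default Ollama port
    volumes:
      - ollama:/root/.ollama   # persist downloaded models

volumes:
  ollama:
```

With this variant, `base_url` becomes `http://ollama:11434` because containers on the same Compose network resolve each other by service name.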
If you want to run the server without Docker, you can use Poetry:
poetry install
poetry run python -m tlds
Use the `/tlds` command or the `Generate summary` / `Generate public summary` message actions to summarize messages and threads.