yalla

A tiny LLM agent with minimal dependencies, focused on local inference.

MIT License

