Access llamafile localhost models via LLM
Access fireworks.ai models via API
Run evals using LLM
Access large language models from the command-line
LLM plugin for models hosted by Anyscale Endpoints
LLM plugin for interacting with the Claude 3 family of models
LLM plugin adding support for the MPT-30B language model
LLM plugin to access Google's Gemini family of models
LLM plugin to run an IPython interpreter or notebook in the LLM virtual environment and use the L...
Access the Cohere Command R family of models
Answer questions against collections stored in LLM using Retrieval Augmented Generation
"llm python" is a command to run a Python interpreter in the LLM virtual environment
Run models distributed as GGUF files using LLM
LLM plugin for models hosted by OpenRouter
Use LLM to generate and execute commands in your shell
Run Llama 2 using MLX on macOS
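The entries above all describe plugins for the LLM command-line tool, which are installed into the same environment as the `llm` CLI itself. A minimal usage sketch follows; the specific plugin name `llm-gemini` and model ID `gemini-1.5-flash` are assumptions for illustration, not taken from the list above:

```shell
# Install a plugin into the environment that provides the llm CLI
# (llm-gemini is an assumed name for the Gemini plugin listed above)
llm install llm-gemini

# Confirm the plugin registered itself
llm plugins

# List the models the plugin added, then prompt one by its model ID
llm models
llm -m gemini-1.5-flash "Say hello"
```

The same `llm install <plugin>` / `llm -m <model-id>` pattern applies to the other model-provider plugins in the list; most remote providers also require an API key to be configured first (typically via `llm keys set <provider>`).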