This is a Grafana plugin designed to showcase using the LLM functionality available in `@grafana/experimental`. Under the hood this uses the `grafana-llm-app` plugin to proxy requests to the LLM provider.
To get started you'll need a few things:

- an `OPENAI_API_KEY` environment variable

Then run the following to build the plugin:

```bash
npm install
npm run dev
```
Finally, use docker compose to run a Grafana instance with access to the plugin:

```bash
docker compose up
```
You should then be able to access Grafana on http://localhost:3000.
Head to the LLM Examples plugin page to see some uses of LLMs in action!
The Grafana container in docker-compose is provisioned with the `grafana-llm-app` plugin installed (using `GF_INSTALL_PLUGINS`) and configured with your OpenAI key (using `provisioning/plugins/apps.yaml`).
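For reference, the provisioning file might look roughly like the sketch below. The exact field names are an assumption based on the `grafana-llm-app` settings and may differ between plugin versions; the key point is that the OpenAI key is read from the environment rather than committed to the file.

```yaml
apiVersion: 1

apps:
  - type: grafana-llm-app
    # Flip to true to simulate the LLM plugin being unavailable.
    disabled: false
    jsonData:
      openAI:
        url: https://api.openai.com
    secureJsonData:
      # Populated from the OPENAI_API_KEY environment variable.
      openAIKey: $OPENAI_API_KEY
```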
This plugin makes use of the `@grafana/experimental` package to make requests to OpenAI via the `grafana-llm-app` plugin, which provides an authenticating proxy and handles streaming responses using Grafana Live.

Take a look at `src/pages/ExamplePage.tsx` to see how to make requests and use responses. You can also take a look at `extensions/panelExplainer.tsx` to see how to add a plugin extension utilizing the same APIs.
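The typical flow looks something like the sketch below: check that the LLM plugin is available, then stream a chat completion through it. This is a minimal illustration, not a copy of the example pages; `explainPanel` and its arguments are hypothetical, and the `llms.openai` call shapes are assumed from `@grafana/experimental` and may vary between versions.

```typescript
import { llms } from '@grafana/experimental';

// Hypothetical helper: stream an LLM explanation of a panel's JSON,
// invoking onUpdate with the accumulated response text as it arrives.
async function explainPanel(panelJson: string, onUpdate: (text: string) => void) {
  // enabled() reports whether grafana-llm-app is installed, enabled,
  // and configured, so callers can degrade gracefully.
  const enabled = await llms.openai.enabled();
  if (!enabled) {
    onUpdate('LLM features are not available in this Grafana instance.');
    return;
  }

  // streamChatCompletions returns an RxJS Observable of response chunks
  // delivered over Grafana Live; accumulateContent folds the streamed
  // deltas into the full reply so far.
  llms.openai
    .streamChatCompletions({
      model: 'gpt-3.5-turbo',
      messages: [
        { role: 'system', content: 'Explain this Grafana panel to the user.' },
        { role: 'user', content: panelJson },
      ],
    })
    .pipe(llms.openai.accumulateContent())
    .subscribe(onUpdate);
}
```

Checking `enabled()` first is what keeps the UI working when the LLM plugin is disabled, which you can try out via the `disabled` toggle described below.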
You can also toggle the value of `disabled` for the `grafana-llm-app` plugin in `provisioning/plugins/apps.yaml` to see what happens when the LLM plugin is unavailable.
To add LLM functionality to your own plugin you'll need to do the following:

- depend on `@grafana/experimental>=1.7.0` and make use of the `llms` module
- ensure the `grafana-llm-app` plugin is installed and configured in the Grafana instance. This can be done using the `GF_INSTALL_PLUGINS` environment variable at startup time - see `docker-compose.yaml` for an example
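The relevant part of such a compose file might look like the sketch below. This is illustrative rather than a copy of this repo's `docker-compose.yaml`; the image tag and port mapping are assumptions.

```yaml
services:
  grafana:
    image: grafana/grafana:latest
    environment:
      # Install grafana-llm-app when the container starts.
      - GF_INSTALL_PLUGINS=grafana-llm-app
      # Forwarded from the host so provisioning can reference it.
      - OPENAI_API_KEY=$OPENAI_API_KEY
    ports:
      - 3000:3000
```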