# Cloudflare Workers AI LLM Playground with Nuxt 💬
Demo: https://hub-chat.nuxt.dev
## Overview
This project is a chat interface for interacting with the text generation models supported by Cloudflare Workers AI. It lets users tune LLM parameters and toggle response streaming; it handles both streaming and non-streaming responses, renders markdown in replies, and includes a dark mode.
Read the blog post on how I created this LLM playground.
## Features
- Select the text generation model to interact with
- Set different LLM parameters (temperature, max tokens, system prompt, top_p, top_k, etc.)
- Toggle LLM response streaming on/off
- Handle streaming and non-streaming LLM responses on both server and client sides
- Parse and display markdown in LLM responses
- Auto-scroll chat container as responses are streamed
- Dark mode support
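Handling a streamed response on the client boils down to reading the response body incrementally and appending each chunk to the current chat message. A minimal sketch in TypeScript (the helper name and callback are illustrative, not this project's actual code):

```typescript
// Read a streamed LLM response chunk by chunk, invoking a callback per chunk.
// `onChunk` is where UI work happens, e.g. appending text and auto-scrolling.
async function readStream(
  stream: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void,
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let full = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // `stream: true` keeps multi-byte characters intact across chunk boundaries
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text);
  }
  return full;
}
```

In the app, the non-streaming path skips this entirely and renders the response in one piece once the request resolves.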
## Technologies Used

- Nuxt: Vue.js framework for the application foundation
- Nuxt UI: Module for creating a sleek and responsive interface
- Nuxt MDC: For parsing and displaying chat messages
- NuxtHub: Deployment and administration platform for Nuxt, powered by Cloudflare
## Prerequisites

- Cloudflare Account: Required for using Workers AI models and deploying the project on Cloudflare Pages
- NuxtHub Account: For managing NuxtHub apps and using AI in development
You can deploy and manage this application with a free Cloudflare account and a free NuxtHub account.
## Setup

- Clone the repository and install the dependencies with pnpm:

  ```bash
  pnpm i
  ```

- Link your NuxtHub project to use AI models in development (you will be prompted to create a project if you don't have one):

  ```bash
  npx nuxthub link
  ```

- Start the application in development mode:

  ```bash
  pnpm dev
  ```
Open http://localhost:3000 in your browser.
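With streaming toggled on, the chat endpoint emits server-sent events that the client must parse line by line. A minimal parsing sketch, assuming events of the form `data: {"response":"..."}` terminated by `data: [DONE]` (this event shape is an assumption to verify against the model you use):

```typescript
// Parse one SSE line from a streamed Workers AI-style response into a text token.
// Returns null for blank lines, comments, malformed events, and the end sentinel.
function parseSseLine(line: string): string | null {
  if (!line.startsWith("data:")) return null; // ignore comments and blank lines
  const payload = line.slice("data:".length).trim();
  if (payload === "[DONE]") return null;      // end-of-stream sentinel
  try {
    const event = JSON.parse(payload) as { response?: string };
    return event.response ?? null;
  } catch {
    return null;                              // skip malformed events
  }
}
```

The non-streaming path needs none of this: the server returns the full completion as a single JSON body.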
## Deployment

### NuxtHub Admin
- Push your code to a GitHub repository.
- Link the repository with NuxtHub.
- Deploy from the Admin console.
Learn more about Git integration
### NuxtHub CLI

Deploy via the NuxtHub CLI:

```bash
npx nuxthub deploy
```
Learn more about CLI deployment
## License
This project is licensed under the MIT License. See the LICENSE file for details.