worker-fal-ai-proxy

This template is a simple fal.ai proxy for Cloudflare Workers. It is intended for server-side integration, so your fal.ai credentials stay on the Worker instead of being bundled into client code.

src/index.ts is the content of the Workers script.
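To give a rough idea of what the proxy does, here is a minimal sketch of a Workers fetch handler that forwards incoming requests to fal.ai using the key from the environment. This is not a copy of the template's actual src/index.ts; the x-fal-target-url header name follows the fal.ai proxy convention, but treat the details as assumptions and check the real source.

export interface Env {
  FAL_KEY: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // The fal client proxy sends the upstream fal.ai URL in a dedicated header.
    const targetUrl = request.headers.get('x-fal-target-url');
    if (!targetUrl) {
      return new Response('Missing x-fal-target-url header', { status: 400 });
    }

    // Forward the request, swapping in the server-side FAL_KEY.
    const headers = new Headers(request.headers);
    headers.set('Authorization', `Key ${env.FAL_KEY}`);
    headers.delete('x-fal-target-url');

    return fetch(targetUrl, {
      method: request.method,
      headers,
      body: request.method === 'GET' || request.method === 'HEAD' ? undefined : request.body,
    });
  },
};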

Setup

To use this template, use one of the following commands:

$ npx wrangler generate worker-fal-ai-proxy https://github.com/denizcdemirci/worker-fal-ai-proxy
# or
$ yarn wrangler generate worker-fal-ai-proxy https://github.com/denizcdemirci/worker-fal-ai-proxy
# or
$ pnpm wrangler generate worker-fal-ai-proxy https://github.com/denizcdemirci/worker-fal-ai-proxy

This template uses fal.ai Key-Based Authentication and requires one key, which you can create from your fal.ai dashboard.

Before publishing your script, you need to edit the wrangler.toml file and add your fal.ai key as FAL_KEY. More information about configuring and publishing your script can be found in the documentation.
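For reference, the relevant wrangler.toml entry looks roughly like the sketch below (the value is a placeholder; for production you may prefer storing the key as a secret via wrangler secret put FAL_KEY rather than as a plain-text variable):

[vars]
FAL_KEY = "your-fal-ai-key"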

Once you are ready, you can publish your script by running one of the following commands:

$ npm run deploy
# or
$ yarn run deploy
# or
$ pnpm run deploy

Configure the client

To use the proxy, you need to configure the client to use the proxy endpoint. You can do that by setting the proxyUrl option in the client configuration:

import * as fal from '@fal-ai/serverless-client';

fal.config({
  proxyUrl: 'https://your-worker.workers.dev',
});
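With proxyUrl set, client calls are routed through your Worker instead of reaching fal.ai directly. For example (the model id and input below are purely illustrative):

// Inside an async function (or with top-level await):
const result = await fal.subscribe('fal-ai/fast-sdxl', {
  input: { prompt: 'a cute robot holding a paintbrush' },
});
console.log(result);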

For more information, please refer to the fal.ai documentation.

Notes

Although this proxy is a convenient way to keep FAL_KEY hidden, please note that the endpoint itself will be publicly available. I recommend adding your own authentication structure and updating the headers in src/index.ts. You can also attach your auth token to client requests using requestMiddleware. Here is an example:

fal.config({
  requestMiddleware: async (request) => {
    request.headers = {
      ...request.headers,
      Authorization: 'your_auth_token',
    };

    return request;
  },
});
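On the Worker side, a matching check in src/index.ts could look like the sketch below; the AUTH_TOKEN binding and the plain string comparison are assumptions to adapt to your own auth setup.

// Hypothetical helper for src/index.ts; AUTH_TOKEN is an assumed environment binding.
function isAuthorized(request: Request, env: { AUTH_TOKEN: string }): boolean {
  return request.headers.get('Authorization') === env.AUTH_TOKEN;
}

// Inside the fetch handler, reject requests that do not carry the expected token:
// if (!isAuthorized(request, env)) return new Response('Unauthorized', { status: 401 });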
Related Projects