# 🧰 @grammyjs helpers for @Vercel

MIT License

Collection of useful methods to run your bot on Vercel.

```shell
npm i vercel-grammy
```

```js
import {/* methods */} from "vercel-grammy"
```
```js
import {Bot} from "grammy"
import {getURL} from "vercel-grammy"

const url = getURL({path: "api/index"})

const bot = new Bot(/* token */)

await bot.api.setWebhook(url)
```
```js
// Anywhere in your code
getHost() // *.vercel.app (from `process.env.VERCEL_URL`)

// In your function handler
export default ({headers}) => {
    getHost({headers}) // domain.com (from `x-forwarded-host` header)
}
```
```js
// Anywhere in your code
getURL({path: "api/index"}) // https://*.vercel.app/api/index

// In your function handler
export default ({headers}) => {
    getURL({headers, path: "api/index"}) // https://domain.com/api/index
}
```
```js
// Anywhere in your code
bot.api.setWebhook(getURL({path: "api/index"}))

// As a function handler
export default setWebhookCallback(bot, {path: "api/index"})
```
Note that this will only work with Vercel Edge Functions:
```js
// As a function handler
export default webhookStream(bot) // instead of webhookCallback(bot)

export const config = {
    runtime: "edge"
}
```
When you deploy a project to Vercel, it is built in one of these environments:

- `production` — default for the `main` or `master` branch
- `preview` — for all other branches in your repository
- `development` — when using the `vercel dev` command

In the early stages of bot development, it is enough to set a webhook on the main (production) domain, such as `project.vercel.app`.

However, if you want to test new changes without stopping the bot, you can simply use a separate (test) bot (for example, `@awesome_beta_bot`) and set its webhook to the URL of the branch — `project-git-branch-username.vercel.app`.

But what if you have several branches with different changes and want to test them without creating a separate bot for each or managing webhooks manually?

Q: You made a separate plugin for this, didn't you?
A: 😏

Thanks to the Vercel build step, we can run some code before a new version of the bot is published, and no one will stop us from using it.

Just add this code to a new JavaScript file:
```js
import {Bot} from "grammy"
import {getURL} from "vercel-grammy"

const {VERCEL_ENV} = process.env

// List of allowed environments
const allowedEnvs = [
    "production",
    "preview"
]

// Exit in case of an unsuitable environment
if (!allowedEnvs.includes(VERCEL_ENV)) process.exit()

const bot = new Bot(/* token */)

// Webhook URL generation
const url = getURL({path: "api/index"})

// Installing the webhook
await bot.api.setWebhook(url)
```
And specify the path to it in the `vercel.json` file:

```json
{
    "buildCommand": "node path/to/new/file.js"
}
```
By the way, you can manage tokens for each environment (or even branch) in the project settings.
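For instance, a minimal sketch of picking a token per environment at build time (the env var names `BOT_TOKEN` and `BOT_TOKEN_PREVIEW` are assumptions, not part of the library):

```js
// Sketch (assumed env var names BOT_TOKEN / BOT_TOKEN_PREVIEW):
// use a different bot token for preview deployments, so test branches
// talk to a separate bot instead of the production one.
const selectToken = ({VERCEL_ENV, BOT_TOKEN, BOT_TOKEN_PREVIEW}) =>
    VERCEL_ENV === "preview" && BOT_TOKEN_PREVIEW
        ? BOT_TOKEN_PREVIEW
        : BOT_TOKEN

const token = selectToken(process.env)
```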
By default, Vercel limits the invocation time for your code:

- 10 seconds for Serverless Functions
  - 60 seconds on the Pro plan
  - 900 seconds on the Enterprise plan
- 25 seconds for Edge Functions
  - 1,000 seconds with a streaming response

So, without streaming (and paying), you can get up to 25 seconds with the default grammY `webhookCallback` adapter on Edge Functions.
On the other hand, we also have a time limit for responding to incoming requests from Telegram — 60 seconds, after which the request is considered unsuccessful and will be retried, which you probably don't want.

To get around these limitations, you can proxy the request before it reaches the function, following this scheme:

- within 60 seconds, a `200` status is returned to Telegram to prevent a retry
- the function itself can keep running for the remaining 940 seconds

Q: What proxy server is suitable for this?
A: I don't know, but I made one 🙂
Source: ProlongRequest

Endpoint: https://prolong-request.fly.dev

Reference:

- `/domain.com`
- `/http://domain.com`
- `/https://domain.com`
- `/https://domain.com/path/to/file.txt`
- `/https://domain.com/route?with=parameters`

It also supports any HTTP method and transmits raw headers and body.
Just prepend the proxy endpoint to the webhook URL:

```
https://prolong-request.fly.dev/https://*.vercel.app/api/index
```

Or do it automatically:

```js
const proxy = "https://prolong-request.fly.dev"

const url = getURL({path: "api/index"})

bot.api.setWebhook(`${proxy}/${url}`)
```
And use a streaming response in the webhook handler:

```js
export default webhookStream(bot, {
    timeoutMilliseconds: 999 // here you can also control the timeout
})

export const config = {
    runtime: "edge"
}
```
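For intuition, the streaming trick can be sketched roughly as follows. This is an illustration of the idea only, not the library's actual implementation: respond immediately with a stream, keep writing filler chunks so the connection stays open, and close the stream once the real update processing finishes.

```js
// Illustration only: keep an HTTP response open by streaming filler chunks
// while some long-running work (e.g. handling an update) completes.
const streamWhile = (work, {chunk = " ", intervalMilliseconds = 1000} = {}) =>
    new Response(new ReadableStream({
        async start(controller) {
            const encoder = new TextEncoder()
            const timer = setInterval(
                () => controller.enqueue(encoder.encode(chunk)),
                intervalMilliseconds
            )
            try {
                await work() // the real handler would process the update here
            } finally {
                clearInterval(timer)
                controller.close()
            }
        }
    }))
```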
### `getHost([options])`

- `options` (`object`, optional) — Options for hostname
  - `headers` (`Headers`, optional) — Headers from the incoming request
  - `header` (`string`, optional) — Header name which contains the hostname
  - `fallback` (`string`, optional) — Fallback hostname (`process.env.VERCEL_URL` by default)
- Returns: `string` — Target hostname

This method generates a hostname from the options passed to it.
### `getURL([options])`

- `options` (`object`, optional) — Options for URL
  - `path` (`string`, optional) — Path to the function that receives updates
  - `host` (`string`, optional) — Hostname without protocol (replaces `getHost` options)
  - `...options` (`object`, optional) — Options for `getHost`
- Returns: `string` — Target URL

This method generates a URL from the options passed to it.
### `setWebhookCallback(bot[, options])`

- `bot` (`Bot`, required) — grammY bot instance
- `options` (`object`, optional) — Options for webhooks
  - `url` (`string`, optional) — URL for webhooks (replaces `getURL` options)
  - `onError` (`"throw" | "return"`, optional) — Strategy for handling errors
  - `allowedEnvs` (`array`, optional) — List of environments where this method is allowed
  - `...options` (`object`, optional) — Options for `bot.api.setWebhook`
  - `...options` (`object`, optional) — Options for `getURL`
- Returns: `() => Promise<Response>` — Target callback method

Callback factory for the grammY `bot.api.setWebhook` method.
### `webhookStream(bot[, options])`

- `bot` (`Bot`, required) — grammY bot instance
- `options` (`object`, optional) — Options for stream
  - `chunk` (`string`, optional) — Content for chunks
  - `intervalMilliseconds` (`number`, optional) — Interval for writing chunks to the stream
  - `...options` (`object`, optional) — Options for `webhookCallback`
- Returns: `() => Response` — Target callback method

Callback factory for streaming webhook response.
### `jsonResponse(value[, options])`

- `value` (`any`, required) — Serializable value
- `options` (`object`, optional) — Options for JSON response
  - `replacer` (`(string | number)[] | null | undefined`, optional)
  - `space` (`string | number | undefined`, optional)
  - `...options` (`ResponseInit`, optional)
- Returns: `Response` — Target JSON Response

This method generates `Response` objects for JSON.
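For reference, given the options above, `jsonResponse` behaves roughly like the following sketch (an approximation, not the library's source):

```js
// Approximation of jsonResponse: serialize the value with the given
// replacer/space and wrap it in a Response, passing through the
// remaining ResponseInit fields (status, headers, etc.).
const jsonResponseSketch = (value, {replacer, space, ...init} = {}) =>
    new Response(JSON.stringify(value, replacer, space), {
        headers: {"content-type": "application/json"},
        ...init
    })
```

So, for example, `jsonResponse({ok: true}, {status: 200})` produces a JSON `Response` suitable for returning from a function handler.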
Made with 💜 by Vladislav Ponomarev