Incredibly fast JavaScript runtime, bundler, test runner, and package manager, all in one
Published by Jarred-Sumner about 2 years ago
To upgrade:
bun upgrade
To install:
curl https://bun.sh/install | bash
Run the install script (you can run it multiple times):
curl https://bun.sh/install | bash
This fixes a regression introduced in Bun v0.1.12 where fetch() with a body would fail to write the body when the body was small. This also adds keepalive: false and timeout: false as options to fetch.
Please see Bun v0.1.12's release notes, as a lot more changed there.
Published by Jarred-Sumner about 2 years ago
bun install gets faster & more reliable

Previously, bun install on Linux would crash or hang at least 1 out of every 80,000 cold package installs. It was particularly bad for download speeds < 100 mbps. Sometimes there were DNS hostname resolution issues as well (which caused a separate crash). If you were on a Linux kernel earlier than v5.5 or had a low memlock limit (e.g. using Ubuntu 18.04), it just wouldn't work at all.
Bun v0.1.12 fixes that.
To end-to-end test this, I had a computer cold install 512 MB of node_modules for 2 hours in a loop (at least 800,000 cold package installs) and it installed successfully every time without crashing or hanging (unlike Bun v0.1.11 and earlier).
The first run installed 100 times and the second 1,000 times.
Four different things caused stability issues:
getaddrinfo
Many of these issues applied to macOS as well (excluding the memlock limit one)
There are still important missing features in bun install, including support for git dependencies, github dependencies, npm: package aliasing, and workspaces. Those are just not implemented yet, which is different from stability issues in existing features.
fetch() gets faster & more reliable

The eventing code for Bun's HTTP client has been rewritten to use uSockets. That, along with changes to concurrent task scheduling and HTTP keep-alive, made sending HTTP requests in Bun faster & more reliable.
Bun's fetch() can send 155,000 requests per second on Linux x64. The performance of fetch in Bun seems to be within 25% of optimized native HTTP benchmarking tools like oha and bombardier (autocannon, the more popular choice, only reaches 60k req/s).
read in bun:ffi

read in bun:ffi lets you read pointers without creating a new DataView. This helps write faster libraries.
import {read, ptr} from 'bun:ffi';
import {it, expect} from 'bun:test';
it("read", () => {
const buffer = new BigInt64Array(16);
const dataView = new DataView(buffer.buffer);
const addr = ptr(buffer);
for (let i = 0; i < buffer.length; i++) {
buffer[i] = BigInt(i);
expect(read.intptr(addr, i * 8)).toBe(
Number(dataView.getBigInt64(i * 8, true))
);
expect(read.ptr(addr, i * 8)).toBe(
Number(dataView.getBigUint64(i * 8, true))
);
expect(read.f64(addr, i * 8)).toBe(dataView.getFloat64(i * 8, true));
expect(read.i64(addr, i * 8)).toBe(dataView.getBigInt64(i * 8, true));
expect(read.u64(addr, i * 8)).toBe(dataView.getBigUint64(i * 8, true));
}
for (let i = 0; i < buffer.byteLength - 4; i++) {
// read is intended to behave like DataView
// but instead of doing
// new DataView(toArrayBuffer(myPtr)).getInt8(0, true)
// you can do
// read.i8(myPtr, 0)
expect(read.i8(addr, i)).toBe(dataView.getInt8(i, true));
expect(read.i16(addr, i)).toBe(dataView.getInt16(i, true));
expect(read.i32(addr, i)).toBe(dataView.getInt32(i, true));
expect(read.u8(addr, i)).toBe(dataView.getUint8(i, true));
expect(read.u16(addr, i)).toBe(dataView.getUint16(i, true));
expect(read.u32(addr, i)).toBe(dataView.getUint32(i, true));
expect(read.f32(addr, i)).toBe(dataView.getFloat32(i, true));
}
});
Implementing this involved a change to WebKit's DOMJIT to enable 52-bit integer arguments (pointers).

This involved an unlikely but potentially breaking change to the pointer representation in bun:ffi. Previously, bun:ffi pointers stored the memory address at the end of a JavaScript double (bit-casted a double to a 64-bit signed integer) and now the value is stored in the integer part of the double. This would only be a breaking change for libraries relying on the pointer representation in napi.
String.prototype.replace gets 2x faster in Safari & Bun, thanks to @Constellation. It affects code like this:

myString.replace("foo", "baz")
PRs:
This version of Bun updates to the latest WebKit as of September 17th, 2022 (which includes these PRs)
crypto.getRandomValues and crypto.randomUUID

crypto.getRandomValues now uses BoringSSL's optimized random functions.
Note: this screenshot was taken before Bun's version was bumped to v0.1.12, which is why it shows v0.1.11 there.
Buffer.from([123], "utf8")
Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.11...bun-v0.1.12
Published by Jarred-Sumner about 2 years ago
Benchmark: https://github.com/SaltyAom/bun-http-framework-benchmark
This was run on Linux
Image credit: @Kapsonfire-DE
Bun's runtime now supports a plugin API. This lets you import .svelte, .vue, .yaml, .scss, .less and other file extensions that Bun doesn't implement a builtin loader for.

The API is loosely based on esbuild's plugin API.
This code snippet lets you import .mdx files in Bun:
import { plugin } from "bun";
import { renderToStaticMarkup } from "react-dom/server";
// Their esbuild plugin runs in Bun (without esbuild)
import mdx from "@mdx-js/esbuild";
plugin(mdx());
// Usage
import Foo from "./bar.mdx";
console.log(renderToStaticMarkup(<Foo />));
This lets you import .yaml files:
import { plugin } from "bun";
plugin({
name: "YAML",
setup(builder) {
const { load } = require("js-yaml");
const { readFileSync } = require("fs");
// Run this function on any import that ends with .yaml or .yml
builder.onLoad({ filter: /\.(yaml|yml)$/ }, (args) => {
// Read the YAML file from disk
const text = readFileSync(args.path, "utf8");
// parse the YAML file with js-yaml
const exports = load(text);
return {
// Copy the keys and values from the parsed YAML file into the ESM module namespace object
exports,
// we're returning an object
loader: "object",
};
});
},
});
We're planning on supporting browser builds with this plugin API as well (run at transpilation time)
Node compatibility:
- napi_add_finalizer f023b89b732db0aff24445acbbe39c366d13118d
- NAPI_MODULE_INIT() wasn't implemented correctly in Bun and that has been fixed
- import assert and import process did not behave as expected (assert wasn't returning a function). This has been fixed
- "node:module"'s createRequire function wasn't requiring non-napi modules correctly

macOS event loop internals moved to a more reliable polling mechanism:
- setTimeout CPU usage drops by 50% https://github.com/oven-sh/bun/commit/c1734c6ec5ef709ee4126b3474c7bee0a377a1fa (Before: 90%, After: 33% - still more work to do here)
- fetch could hit race conditions when called enough times in quick succession. The race conditions have been fixed.

More:
- Request and Response in macros e0b35b3086b00fb27f950a72a082b360a3dad891
- clearTimeout on Linux e6a1209c53adb3056263b894d774b30ee70a3188

Bun has a long-term commitment to performance. On macOS, React server-side rendering gets around 2x faster.
Coming up next in performance improvements: a new HTTP server implementation. Not far enough along for this release, but experiments are showing progress.
extends by @yepitschunked in https://github.com/oven-sh/bun/pull/1147
Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.10...bun-v0.1.11
Published by Jarred-Sumner about 2 years ago
- bun dev to not send HTTP bodies
- [native code] for too many things 5eeb704f25200af7ad8819c35bcfd16b8b1bff49
- bun dev when used with Next.js e45ddc086fe6b3e7a32aa45607f5e3d570998137
- Buffer.compare https://github.com/oven-sh/bun/commit/d150a2f4ddc10597e4531fd3c55b62bb0ecbf02c
- TextDecoder 2.5x faster e3c2a95e5ff4e6c6b63839f4773cc3f5aeadddc8
- WebSocket bdf733973c72b8e156cd1cf1c6c8a8b4649fedbe
- Request, Response and TextDecoder globals not read-only 0f45386673fbf4f33b6e61b17ea49b69697ec79a

Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.9...bun-v0.1.10
Published by Jarred-Sumner about 2 years ago
- Reliability improvements (bun install and fetch() mostly)
- Bun.serve gets about 20% faster outside of "hello world" benchmarks due to optimizing how Headers are copied/read and faster generated bindings
- require("buffer") and require("process") now point to Bun's implementation instead of a browserify polyfill (thanks @zhuzilin)
- Fixed a bug causing setTimeout or setInterval to not keep the process alive (thanks @zhuzilin)
- ptr() in bun:ffi is now JIT-optimized
- TextEncoder.encodeInto can be a 1.5x perf boost up to around a 5x perf boost
The hash() function in this microbenchmark calls ptr().
DOMJIT is a JavaScriptCore API that gives 3rd-party embedders low-level access to the JIT compiler/assembler to optimize native functions and getters/setters. Safari leverages DOMJIT to make commonly-accessed properties like element.parentNode faster.
Bun is beginning to use DOMJIT now, starting in two places:
- the ptr() function in bun:ffi
- TextEncoder.encodeInto
To better support Bun's use case, I extended DOMJIT to support Typed Array arguments, and added support for specifying more kinds of side effects that enable/disable optimizations.
At Bun's compile-time, Bun now code-generates C++ binding classes for JavaScript objects implemented in Zig. Previously, Bun mostly used the JavaScriptCore C API.
Using JavaScriptCore's C++ API improves performance and is important for making the garbage collector better aware of Bun's usage. But, writing bindings manually can be very repetitive.
Given a class definition in JavaScript like this:
define({
name: "Response",
construct: true,
finalize: true,
JSType: "0b11101110",
klass: {
json: {
fn: "constructJSON",
},
// rest of the code
},
proto: {
url: {
getter: "getURL",
cache: true,
},
text: { fn: "getText" },
json: { fn: "getJSON" },
arrayBuffer: { fn: "getArrayBuffer" },
blob: { fn: "getBlob" },
clone: { fn: "doClone", length: 1 },
// rest of the code
},
})
Bun generates the corresponding C++ binding code.
This approach is inspired by WebIDL bindings which both Safari and Chromium use.
This screenshot is with a simulated bandwidth limit and no throttling of max http connections. Previously, bun had a tendency to hang in situations like this.
IntPrimtiive -> IntPrimitive by @ryanrussell in https://github.com/oven-sh/bun/pull/1046
Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.8...bun-v0.1.9
Published by Jarred-Sumner about 2 years ago
A huge thank you to @zhuzilin for all their help on this release. @zhuzilin fixed 4 crashes!
bun link lets you symlink a folder to node_modules. It works like npm link.
fs.copyFileSync gets 2x to 10x faster.
require.resolve works at runtime now instead of only build-time.

WebSocket is more reliable now. Previously the garbage collector would attempt to free it when the socket was still open.

bun:ffi's toBuffer and toArrayBuffer functions now support a function pointer to a destructor so that native code can perform cleanup without needing to go through a FinalizationRegistry.

console.log on a TypedArray now logs the values for its type (instead of bytes).

console.log(MessageEvent) is more useful now.
More:
- setInterval wouldn't cause the process to stay alive, and now that is fixed thanks to @zhuzilin
- bun install gets a symlink backend, which you probably don't want to use in most cases. It's used internally if you do file:./ as a dependency, which some packages do
- process.revision returns the git sha used to build bun

Typos:
Misc:
Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.7...bun-v0.1.8
Published by Jarred-Sumner about 2 years ago
bun init quickly starts a new, empty project that uses Bun (similar to npm init). bun init is a new subcommand in bun.
bun install now supports private npm registries & scoped (authenticated) packages. Thank you @soneymathew for your help with this.
bun install now supports lifecycle hooks for the project-level package.json (not dependencies). It runs postinstall scripts for your app's package.json, but ignores dependencies' lifecycle hooks. This lets you use husky, lint-staged, and other postinstall-dependent tools.
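For example, a project-level hook like this (package name and script contents are hypothetical) now runs on bun install, while postinstall scripts inside node_modules are still skipped:

```json
{
  "name": "my-app",
  "scripts": {
    "postinstall": "husky install"
  },
  "devDependencies": {
    "husky": "^8.0.0"
  }
}
```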
More new stuff:
- express is partially supported, thanks to @zhuzilin and @evanwashere. There is a lot more work to be done - it's not fast yet and it logs a spurious error on request, but it is better than not working
- bun create now lets you specify a start command so that you can say how to run the program in the output
- process.revision has the git sha that bun was built with
- bun:ffi and a couple other places, thanks to @sno2
- Buffer.isBuffer no longer checks that this is the Buffer constructor
- Bun.Transpiler no longer does bun-specific transforms when it shouldn't
- latin1 and binary encodings in bun's Buffer implementation

Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.6...bun-v0.1.7
Published by Jarred-Sumner about 2 years ago
- baseline builds: these are separate builds of bun for Linux x64 and macOS x64 which do not use AVX/AVX2 instructions. You can install them with the install script. This was one of the most common issues people ran into.
- util.TextEncoder by @soneymathew in https://github.com/oven-sh/bun/pull/844
- -profile builds of bun include debug symbols

Thanks to upgrading WebKit:
- ${{}} from the if block in GHA by @rgoomar in https://github.com/oven-sh/bun/pull/863
- @vscode/dev-container-cli by @kidonng in https://github.com/oven-sh/bun/pull/912
Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.5...bun-v0.1.6
Published by Jarred-Sumner about 2 years ago
This release is mostly just bug fixes. There is also a Linux arm64 build available (not for Android arm64 yet, but this should work for Raspberry Pis).
- require is not defined bug
- bun install hangs
- "url" polyfill @SheetJSDev in https://github.com/oven-sh/bun/pull/772
- node:http server polyfill (this is not optimized yet, do not expect good performance from this version) by @evanwashere in https://github.com/oven-sh/bun/pull/572
- bun add @scoped/package @alexkuz in https://github.com/oven-sh/bun/pull/760
- bun install with BUN_CONFIG_REGISTRY @SheetJSDev in https://github.com/oven-sh/bun/pull/823
Two new flags added to bun install
:
--no-progress Disable the progress bar
--no-verify Skip verifying integrity of newly downloaded packages
node:fs @sno2 in https://github.com/oven-sh/bun/pull/807
Misc:
Other:
- blank template by @foyzulkarim in https://github.com/oven-sh/bun/pull/727
- atob and btoa by @thislooksfun in https://github.com/oven-sh/bun/pull/748
Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.4...bun-v0.1.5
Published by github-actions[bot] about 2 years ago
This canary release of Bun corresponds to commit 37edd5a6e389265738e89265bcbdf2999cb81a49.
Published by Jarred-Sumner over 2 years ago
Fixed a bug in fetch(url) that frequently caused crashes.

Typos, README, examples:
- .md readability improvements by @ryanrussell in https://github.com/oven-sh/bun/pull/597
- debug in readme by @biw in https://github.com/oven-sh/bun/pull/189
- blank template by @SheetJSDev in https://github.com/oven-sh/bun/pull/523
Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.3...bun-v0.1.4
Published by Jarred-Sumner over 2 years ago
- alert(), confirm(), prompt() are new globals, thanks to @sno2
- console.log() sometimes adding an "n" to non-BigInt numbers, thanks to @FinnRG
- subarray() console.log bug
- console.log(request) prints something instead of nothing
- performance.now() returning nanoseconds instead of milliseconds @Pruxis

All the PRs:
- connectedWebSocketContext() by @ryanrussell in https://github.com/oven-sh/bun/pull/459
- Output.flushes before Global.exit and Global.crash by @r00ster91 in https://github.com/oven-sh/bun/pull/535
Full Changelog: https://github.com/oven-sh/bun/compare/bun-v0.1.2...bun-v0.1.3
Published by Jarred-Sumner over 2 years ago
bun install:
- error: NotSameFileSystem

bun.js:
- randomUUID in crypto module by @WebReflection in https://github.com/Jarred-Sumner/bun/pull/254
- napi_get_version should return the Node-API version by @kjvalencik in https://github.com/Jarred-Sumner/bun/pull/392

bun dev:
- bun dev on linux on some machines (@egoarka) https://github.com/Jarred-Sumner/bun/pull/316
Examples:
Landing page:
Internal:
README:
Safari's implementation broken link by @F3n67u in https://github.com/Jarred-Sumner/bun/pull/257
Full Changelog: https://github.com/Jarred-Sumner/bun/compare/bun-v0.1.1...bun-v0.1.2
Published by Jarred-Sumner over 2 years ago
- ReadableStream, WritableStream, TransformStream and more. This was a massive project. There are still some reliability things to fix, but I'm very happy with the performance, and I think you will be too.
- WebSocket is now a global, powered by a custom WebSocket client
- require()

Full Changelog: https://github.com/Jarred-Sumner/bun/compare/bun-v0.0.83...bun-v0.1.0
Published by Jarred-Sumner over 2 years ago
Thanks to:
- CFunction abstraction
- Replaced "wrangler@beta" with "wrangler" in the examples for bun add
bun:sqlite
bun:sqlite is a high-performance builtin SQLite module for bun.js. It tends to be around 3x faster than the popular better-sqlite3 npm package.
Note: in the benchmark I tweeted earlier, better-sqlite3 always returned arrays of arrays rather than arrays of objects, which was inconsistent with what bun:sqlite & deno's x/sqlite were doing.
import { Database } from "bun:sqlite";
const db = new Database("mydb.sqlite");
db.run(
"CREATE TABLE IF NOT EXISTS foo (id INTEGER PRIMARY KEY AUTOINCREMENT, greeting TEXT)"
);
db.run("INSERT INTO foo (greeting) VALUES (?)", "Welcome to bun!");
db.run("INSERT INTO foo (greeting) VALUES (?)", "Hello World!");
// get the first row
db.query("SELECT * FROM foo").get();
// { id: 1, greeting: "Welcome to bun!" }
// get all rows
db.query("SELECT * FROM foo").all();
// [
// { id: 1, greeting: "Welcome to bun!" },
// { id: 2, greeting: "Hello World!" },
// ]
// get all rows matching a condition
db.query("SELECT * FROM foo WHERE greeting = ?").all("Welcome to bun!");
// [
// { id: 1, greeting: "Welcome to bun!" },
// ]
// get the first row matching a named condition
db.query("SELECT * FROM foo WHERE greeting = $greeting").get({
$greeting: "Welcome to bun!",
});
// { id: 1, greeting: "Welcome to bun!" }
There are more detailed docs in Bun's README. bun:sqlite's API is loosely based on @joshuawise's better-sqlite3.
bun:ffi

CFunction lets you call native library functions from a function pointer. It works like dlopen, but it's for cases where you already have the function pointer, so you don't need to open a library. This is useful for:
import {CFunction} from 'bun:ffi';
const myNativeLibraryGetVersion: number | bigint = /* Somehow you got this function pointer */
const getVersion = new CFunction({
returns: "cstring",
args: [],
// ptr is required
// this is where the function pointer goes!
ptr: myNativeLibraryGetVersion,
});
getVersion();
getVersion.close();
linkSymbols is like CFunction, except for when there are multiple functions. It returns the same object as dlopen, except ptr is required and there is no path.
import { linkSymbols } from "bun:ffi";
const [majorPtr, minorPtr, patchPtr] = getVersionPtrs();
const lib = linkSymbols({
// Unlike with dlopen(), the names here can be whatever you want
getMajor: {
returns: "cstring",
args: [],
// Since this doesn't use dlsym(), you have to provide a valid ptr
// That ptr could be a number or a bigint
// An invalid pointer will crash your program.
ptr: majorPtr,
},
getMinor: {
returns: "cstring",
args: [],
ptr: minorPtr,
},
getPatch: {
returns: "cstring",
args: [],
ptr: patchPtr,
},
});
const [major, minor, patch] = [
lib.symbols.getMajor(),
lib.symbols.getMinor(),
lib.symbols.getPatch(),
];
new CString(ptr) should be a little faster due to using a more optimized function for getting the length of a string.
Running require.resolve("my-module") in Bun.js will now resolve the path to the module. Previously, this was not supported.
In browsers, it becomes the absolute filepath at build-time. In node, it's left in without any changes.
Internally, Bun's JavaScript transpiler transforms it to:
// input:
require.resolve("my-module");
// output
import.meta.resolveSync("my-module");
You can see this for yourself by running bun build ./file.js --platform=bun
"node:module"
module polyfillNode's "module"
module lets you create require functions from ESM modules.
Bun now has a polyfill that implements a subset of the "module"
module.
Normally, require() in bun transforms statically at build-time to an ESM import statement. That doesn't work as well for Node-API (napi) modules because they cannot be statically analyzed by a JavaScript parser (since they're not JavaScript). For napi modules, bun uses a dynamic require function, and the "module" module exports a way to create those using the same interface as in Node.js.
import { createRequire } from "module";
// this also works:
//import {createRequire} from 'node:module';
var require = createRequire(import.meta.url);
require.resolve("my-module");
// dynamic require is supported for:
// - .json files
// - .node files (napi modules)
require("my-napi-module");
This is mostly intended for improving Node-API compatibility with modules loaded from ESM. As an extra, you can also use require() this way for .json files.
Bun.Transpiler now supports passing objects to macros.
import { Transpiler } from "bun";
import { parseCookie } from "my-cookie-lib";
import { Database } from "bun:sqlite";
const transpiler = new Transpiler();
const db = new Database("mydb.sqlite");
export default {
fetch(req) {
const transpiled = transpiler.transformSync(
`
import {getUser} from 'macro:./get-user';
export function Hello({name}) {
return <div>Hello {name}</div>;
}
export const HelloCurrentUser = <Hello {...getUser()} />;
`,
// passing contextual data to Bun.Transpiler
{
userId: parseCookie(req.headers.get("Cookie")).userId,
db: db,
}
);
return new Response(transpiled, {
headers: { "Content-Type": "application/javascript" },
});
},
};
Then, in get-user.js:
// db, userId is now accessible in macros
export function getUser(expr, { db, userId }) {
// we can use it to query the database while transpiling
return db.query("SELECT * FROM users WHERE id = ? LIMIT 1").get(userId);
}
That inlines the returned current user into the JavaScript source code, producing output equivalent to this:
export function Hello({ name }) {
return <div>Hello {name}</div>;
}
// notice that the current user is inlined rather than a function call
export const HelloCurrentUser = <Hello name="Jarred" />;
Buffer.from(arrayBuffer, byteOffset, length) now works as expected (thanks to @kriszyp for reporting)

Published by Jarred-Sumner over 2 years ago
Node-API is 1.75x - 3x faster in Bun compared to Node.js 18 (in call overhead)
Getters & setters:
Simple function calls:
Just like in Node.js, to load a Node-API module, use require('my-npm-package') or use process.dlopen.
90% of the API is implemented, though it is certainly buggy right now.
The following functions have been added:
- import.meta.resolveSync synchronously runs the module resolver for the currently-referenced file
- import.meta.require synchronously loads .node or .json modules and works with dynamic paths. This doesn't use ESM and doesn't run the transpiler, which is why regular js files are not supported. This is mostly an implementation detail for how require works for Node-API modules, but it could also be used outside of that if you want
- Bun.gzipSync, Bun.gunzipSync, Bun.inflateSync, and Bun.deflateSync expose native bindings to zlib-cloudflare. On macOS aarch64, gzipSync is ~3x faster than in Node. This isn't wired up to the "zlib" polyfill in bun yet

Additionally:
- __dirname is now supported for all targets (including browsers)
- __filename is now supported for all targets (including browsers)
- Buffer.byteLength is now implemented

Several packages using Node-API also use detect-libc. Bun polyfills detect-libc because bun doesn't support child_process yet and this improves performance a little.
new.target is referenced outside of a constructor d1ea51e9f2bfecd696224f4d715a8955a6300440

Published by Jarred-Sumner over 2 years ago
"bun:ffi"
is a new bun.js core module that lets you use third-party native libraries written in languages that support the C ABI (Zig, Rust, C/C++ etc). It's like a foreign function interface API but fasterBuffer
(like in Node.js) is now a global, but the implementation is incomplete - see tracking issue. If you import "buffer"
, it continues to use the browser polyfill so this shouldn't be a breaking changeTextEncoder
& TextDecoder
thanks to some fixes to the vectorization (SIMD) codeTypedArray.from
. JavaScriptCore's implementation of TypedArray.from
uses the code path for JS iterators when it could instead use an optimized code path for copying elements from an array, like V8 does. I have filed an upstream bug with WebKit about this, but I expect to do a more thorough fix for this in Bun and upstream that. For now, Bun reuses TypedArray.prototype.set
when possibleUint8Array.fill
Bun.Transpiler
gets an API for removing & replacing exportsSHA512
, SHA256
, SHA128
, and more are now exposed in the "bun"
module and the Bun
global. They use BoringSSL's optimized hashing functions.new Response(Bun.file(path))
stop()
function. Before, there was no way to stop it without terminating the process πThe next large project for bun is a production bundler Tracking issue
The "bun:ffi"
core module lets you efficiently call native libraries from JavaScript. It works with languages that support the C ABI (Zig, Rust, C/C++, C#, Nim, Kotlin, etc).
Get the locally-installed SQLite version number:
import { dlopen, CString, ptr, suffix, FFIType } from "bun:ffi";
const sqlite3Path = process.env.SQLITE3_PATH || `libsqlite3.${suffix}`;
const {
symbols: { sqlite3_libversion },
} = dlopen(sqlite3Path, {
sqlite3_libversion: {
returns: "cstring",
},
});
console.log("SQLite version", sqlite3_libversion());
FFI is really exciting because there is no runtime-specific code. You don't have to write a Bun FFI module (that isn't a thing). Use JavaScript to write bindings to native libraries installed with homebrew, with your linux distro's package manager or elsewhere. You can also write bindings to your own native code.
FFI has a reputation of being slower than runtime-specific APIs like napi, but that's not true for bun:ffi.
Bun embeds a small C compiler that generates code on-demand and converts types between JavaScript & native code inline. A lot of overhead in native libraries comes from function calls that validate & convert types, so moving that to just-in-time compiled C using engine-specific implementation details makes that faster. Those C functions are called directly; there is no extra wrapper on the native code side of things.
Some bun:ffi use cases:
- "ndarray" package from JavaScript (ideally via ndarray's C API and not just embedding Python in bun)

Later (not yet), bun:ffi will be integrated with bun's bundler and that will enable things like:
A lot of Node.js' Buffer module is now implemented natively in Bun.js, but it's not complete yet.
Here is a comparison of how long various functions take.
Bun.Transpiler
For code transpiled with Bun.Transpiler, you can now remove and/or replace exports with a different value.
const transpiler = new Bun.Transpiler({
exports: {
replace: {
// Next.js does this
getStaticProps: ["__N_SSG", true],
},
eliminate: ["localVarToRemove"],
},
treeShaking: true,
trimUnusedImports: true,
});
const code = `
import fs from "fs";
export var localVarToRemove = fs.readFileSync("/etc/passwd");
import * as CSV from "my-csv-parser";
export function getStaticProps() {
return {
props: { rows: CSV.parse(fs.readFileSync("./users-list.csv", "utf8")) },
};
}
export function Page({ rows }) {
return (
<div>
<h1>My page</h1>
<p>
<a href="/about">About</a>
</p>
<p>
<a href="/users">Users</a>
</p>
<div>
{rows.map((columns, index) => (
<span key={index}>{columns.join(" | ")} </span>
))}
</div>
</div>
);
}
`;
console.log(transpiler.transformSync(code));
Which outputs (this is the automatic React transform):
export var __N_SSG = true;
export function Page({ rows }) {
return jsxDEV("div", {
children: [
jsxDEV("h1", {
children: "My page"
}, undefined, false, undefined, this),
jsxDEV("p", {
children: jsxDEV("a", {
href: "/about",
children: "About"
}, undefined, false, undefined, this)
}, undefined, false, undefined, this),
jsxDEV("p", {
children: jsxDEV("a", {
href: "/users",
children: "Users"
}, undefined, false, undefined, this)
}, undefined, false, undefined, this),
jsxDEV("div", {
children: rows.map((columns, index) => jsxDEV("span", {
children: [
columns.join(" | "),
" "
]
}, index, true, undefined, this))
}, undefined, false, undefined, this)
]
}, undefined, true, undefined, this);
}
server.stop() lets you stop bun's HTTP server.

Hashing functions powered by BoringSSL:
import {
SHA1,
MD5,
MD4,
SHA224,
SHA512,
SHA384,
SHA256,
SHA512_256,
} from "bun";
// hash the string and return as a Uint8Array
SHA1.hash("123456");
MD5.hash("123456");
MD4.hash("123456");
SHA224.hash("123456");
SHA512.hash("123456");
SHA384.hash("123456");
SHA256.hash("123456");
SHA512_256.hash("123456");
// output as a hex string
SHA1.hash(new Uint8Array(42), "hex");
MD5.hash(new Uint8Array(42), "hex");
MD4.hash(new Uint8Array(42), "hex");
SHA224.hash(new Uint8Array(42), "hex");
SHA512.hash(new Uint8Array(42), "hex");
SHA384.hash(new Uint8Array(42), "hex");
SHA256.hash(new Uint8Array(42), "hex");
SHA512_256.hash(new Uint8Array(42), "hex");
// incrementally update the hashing function value and convert it at the end to a hex string
// similar to node's API in require('crypto')
// this is not wired up yet to bun's "crypto" polyfill, but it really should be
new SHA1().update(new Uint8Array(42)).digest("hex");
new MD5().update(new Uint8Array(42)).digest("hex");
new MD4().update(new Uint8Array(42)).digest("hex");
new SHA224().update(new Uint8Array(42)).digest("hex");
new SHA512().update(new Uint8Array(42)).digest("hex");
new SHA384().update(new Uint8Array(42)).digest("hex");
new SHA256().update(new Uint8Array(42)).digest("hex");
new SHA512_256().update(new Uint8Array(42)).digest("hex");
Published by Jarred-Sumner over 2 years ago
You can now import from "bun" in bun.js. You can still use the global Bun.
Before:
await Bun.write("output.txt", Bun.file("input.txt"))
After:
import {write, file} from 'bun';
await write("output.txt", file("input.txt"))
This isn't a breaking change; you can still use Bun as a global, same as before.
Bun's JavaScript printer replaces the "bun" import specifier with globalThis.Bun.
var {write, file} = globalThis.Bun;
await write("output.txt", file("input.txt"))
You'll probably want to update types too:
bun add bun-types
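After installing, one way to pick the types up in a TypeScript project is via the types field in tsconfig.json (a sketch; adjust to your own setup):

```jsonc
// tsconfig.json (fragment)
{
  "compilerOptions": {
    "types": ["bun-types"]
  }
}
```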
- isFile and isDirectory functions added to fs.stat()
- Fix involving ! and the comma operator - 4de7978b2763d95ddcded6ccda9b7b80cca7e8f1
- Fix for bun bun with --platform=bun set - 43b18663fdc763c24ae8fa0940019f2aee37c6aa
- When a Bun.file is not found and the error handler is not run, the default status code is now 404
- Fix for when a Bun.file is sent with sendfile and the request aborts or errors
- The FileBlob sent via sendfile is now handled after the callback completes instead of before
- .ts and .tsx files are served with the text/javascript MIME type by default instead of MPEG2 Video

Published by Jarred-Sumner over 2 years ago
To upgrade:
bun upgrade
- bun-types npm package. A PR for @types/bun is awaiting review (feel free to nudge them).
- Bun.serve() is a fast HTTP & HTTPS server that supports the Request and Response Web APIs. The server is a fork of uWebSockets.
- Bun.write() - one API for writing files, pipes, and copying files, leveraging the fastest system calls available for the input & platform. It uses the Blob Web API.
- Bun.mmap(path) lets you read files as a live-updating Uint8Array via the mmap(2) syscall. Thank you @evanwashere!!
- Bun.hash(bufferOrString) exposes fast non-cryptographic hashing functions. Useful for things like ETag, not for passwords.
- Bun.allocUnsafe(length) creates a new Uint8Array ~3.5x faster than new Uint8Array(), but it is not zero-initialized. This is similar to Node's Buffer.allocUnsafe, though without the memory pool currently.
- URL
- examples folder
- fs.read() and fs.write(), and some more tests for fs
- Response.redirect(), Response.json(), and Response.error() have been added
- import.meta.url is now a file:// url string
- Bun.resolve and Bun.resolveSync let you resolve module specifiers the same as import does. It throws a ResolveError on failure (same as import).
- Bun.stderr and Bun.stdout now return a Blob
- SharedArrayBuffer is now enabled (thanks @evanwashere!)

Going forward, Bun will first try to rely on WebKit/Safari's implementations of Web APIs rather than writing new ones. This will improve Web API compatibility while reducing bun's scope, without compromising performance.
These Web APIs are now available in bun.js and powered by Safari's implementation:
- URL
- URLSearchParams
- ErrorEvent
- Event
- EventTarget
- DOMException
- Headers (now uses WebKit's implementation instead of a custom one)
- AbortSignal (not wired up to fetch or fs yet; it exists but is not very useful)
- AbortController

Also added:

- reportError (does not dispatch an "error" event yet)

Additionally, all the builtin constructors in bun now have a .prototype property (this was missing before).
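Since these are the standard WHATWG interfaces, code written against them is portable. A quick sketch using the Event and EventTarget globals (the "ping" event name is made up for illustration):

```javascript
// Event and EventTarget are standard Web APIs (also global in modern Node),
// so the same code runs in bun.js unchanged.
const target = new EventTarget();
const seen = [];

target.addEventListener("ping", (event) => {
  seen.push(event.type);
});

target.dispatchEvent(new Event("ping"));
target.dispatchEvent(new Event("ping"));

console.log(seen); // [ 'ping', 'ping' ]
```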
Bun.serve - fast HTTP server

For a hello world HTTP server that writes "bun!", Bun.serve serves about 2.5x more requests per second than Node.js on Linux:
| Requests per second | Runtime |
| --- | --- |
| ~64,000 | Node 16 |
| ~160,000 | Bun |
Bigger is better
Bun:
Bun.serve({
fetch(req: Request) {
return new Response(`bun!`);
},
port: 3000,
});
Node:
require("http")
.createServer((req, res) => res.end("bun!"))
.listen(8080);
Two ways to start an HTTP server with bun.js:

export default an object with a fetch function

If the file used to start bun has a default export with a fetch function, it will start the HTTP server.
// hi.js
export default {
fetch(req) {
return new Response("HI!");
},
};
// bun ./hi.js
fetch receives a Request object and must return either a Response or a Promise<Response>. In a future version, it might receive additional arguments for things like cookies.
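Because a Promise<Response> is accepted, async handlers work naturally. A sketch (the /hello route is made up for illustration):

```javascript
// A fetch handler object of the shape bun expects for its HTTP server.
// Since a Promise<Response> is allowed, the handler can be async.
const handler = {
  async fetch(req) {
    const { pathname } = new URL(req.url);
    if (pathname === "/hello") {
      // e.g. await a database call or another fetch() here
      return new Response("HI!");
    }
    return new Response("Not found", { status: 404 });
  },
};

// Starting the server with it would look like:
// Bun.serve({ ...handler, port: 3000 });
```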
Bun.serve starts the HTTP server explicitly

Bun.serve({
fetch(req) {
return new Response("HI!");
},
});
For error handling, you get an error function.

If development: true and error is not defined or doesn't return a Response, you will get an exception page with a stack trace:
It will hopefully make it easier to debug issues with bun until bun gets debugger support. This error page is based on what bun dev does.

If the error function returns a Response, it will be served instead.
Bun.serve({
fetch(req) {
throw new Error("woops!");
},
error(error: Error) {
return new Response("Uh oh!!\n" + error.toString(), { status: 500 });
},
});
If the error function itself throws and development is false, a generic 500 page will be shown.

Currently, there is no way to stop the HTTP server once started, but that will be added in a future version.
The interface for Bun.serve is based on what Cloudflare Workers does.
Bun.write lets you write, copy or pipe files automatically, using the fastest system calls compatible with the input and platform.
interface Bun {
write(
destination: string | number | FileBlob,
input: string | FileBlob | Blob | ArrayBufferView
): Promise<number>;
}
| Output | Input | System Call | Platform |
| --- | --- | --- | --- |
| file | file | copy_file_range | Linux |
| file | pipe | sendfile | Linux |
| pipe | pipe | splice | Linux |
| terminal | file | sendfile | Linux |
| terminal | terminal | sendfile | Linux |
| socket | file or pipe | sendfile (if http, not https) | Linux |
| file (path, doesn't exist) | file (path) | clonefile | macOS |
| file | file | fcopyfile | macOS |
| file | Blob or string | write | macOS |
| file | Blob or string | write | Linux |
All this complexity is handled by a single function.
// Write "Hello World" to output.txt
await Bun.write("output.txt", "Hello World");
// log a file to stdout
await Bun.write(Bun.stdout, Bun.file("input.txt"));
// write the HTTP response body to disk
await Bun.write("index.html", await fetch("http://example.com"));
// this does the same thing
await Bun.write(Bun.file("index.html"), await fetch("http://example.com"));
// copy input.txt to output.txt
await Bun.write("output.txt", Bun.file("input.txt"));
- Fixed a bug that caused require to produce incorrect output depending on how the module was used
- Fix for bun dev related to HMR & websockets - https://github.com/Jarred-Sumner/bun/commit/daeede28dbb3c6b5bb43250ca4f3524cd728ca4c
- fs.openSync now supports mode and flags - https://github.com/Jarred-Sumner/bun/commit/c73fcb073109405e1ccc30299bd9f8bef2791435
- fs.read and fs.write were incorrectly returning the output of the fs/promises versions; this is fixed
- Fix for when an fs function received a TypedArray as input - https://github.com/Jarred-Sumner/bun/commit/614f64ba82947f7c2d3bf2dcc4e4993f6446e9b9. This also improves the performance of sending ArrayBuffers to native a little
- Response's constructor previously read statusCode instead of status. This was incorrect and has been fixed.
- fs.stat reported incorrect information on macOS x64; this is fixed
- Bun.Transpiler and Bun.unsafe - 29a759a65512278f1c20d1089ba05dbae268ef24

Published by Jarred-Sumner over 2 years ago
To upgrade:
bun upgrade
This release adds source maps for JS output, several Web APIs to bun.js (and HTMLRewriter), parser support for ES2022 syntax, improves parsing speed for long strings, makes bun dev 10% faster on macOS, and fixes a lot of crashes.
Note: the WASM build is not quite ready yet, but I'm working on it! The main thing I'm uncertain of is whether to release it with Bun.Transpiler as the API, or with a partially-implemented esbuild-compatible API. So if you would like to use it and have opinions there, do say.
Thanks to source maps, errors in bun.js show the source code instead of the transpiled code now:
Note the incorrect line numbers and the missing if (true) branch: bun's transpiler removed the dead code, but that can make it harder to read the code.
New APIs:

- HTMLRewriter - the fast jQuery-like streaming HTML parser API from Cloudflare Workers, powered by LOLHTML (the same parser Cloudflare Workers uses)
- setTimeout, setInterval, clearTimeout, and clearInterval
- Response.clone, Request.clone, Request.bodyUsed, Response.bodyUsed, Response.headers
- atob, btoa
- Blob, including Blob.text(), Blob.slice(), Blob.arrayBuffer(), and Request.blob(), Response.blob()
- console.trace() is now implemented
- Bun.inspect lets you format like console.log but get back a string
- Bun.unsafe.arrayBufferToString(buffer) lets you quickly & unsafely cast an ArrayBuffer-like object to a string. Useful when you've already encoded the string.

Misc:
- console.log support for JSX!
- ResolveError.prototype.toString() and BuildError.prototype.toString() work as expected
- console.log support for Request, Response, Headers
- FetchEvent.respondWith will automatically await any promises now
- console {time, count, timeEnd, profile, profileEnd, countReset}
- bun pm cache prints the cache dir path

Bun.Transpiler:

- The autoImportJSX flag lets you enable or disable auto-importing the jsxImportSource.
- The allowBunRuntime flag lets you disable importing bun's runtime code. This is useful if you want to use bun as just a transpiler. It is disabled by default, as I would expect people to use Bun.Transpiler mostly for environments other than bun itself.

I started using the debug build of mimalloc (bun's memory allocator) for the debug builds of bun, and that uncovered a few otherwise difficult-to-reproduce crashes.