functorch is JAX-like composable function transforms for PyTorch.
BSD-3-Clause License
Published by zou3519 over 1 year ago
As of PyTorch 2.0, we have deprecated the functorch module in favor of the new torch.func module in PyTorch.
functorch started as an out-of-tree library here at the pytorch/functorch repository. Our goal has always been to upstream functorch directly into PyTorch and provide it as a core PyTorch library.
As the final step of the upstream, we’ve decided to migrate from being a top-level package (functorch) to being a part of PyTorch to reflect how the function transforms are integrated directly into PyTorch core. As of PyTorch 2.0, we are deprecating import functorch and ask that users migrate to the newest APIs, which we will maintain going forward. import functorch will be kept around to maintain backwards compatibility for at least one year.
Please see https://pytorch.org/docs/2.0/func.migrating.html for a full guide to the migration. The TL;DR is that the functorch function transforms now live in the torch.func module.
We'll continue to publish functorch binaries on PyPI as well to maintain backwards compatibility, but note that as of PyTorch 1.13, installing PyTorch automatically allows you to import functorch.
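As a sketch of what the migration looks like in practice, using the torch.func names described in the guide above (the lambda and tensor values here are illustrative only):

```python
import torch
from torch.func import grad  # was: from functorch import grad

# grad(f) returns a function that computes the gradient of a
# scalar-valued function f with respect to its first argument.
g = grad(lambda x: (x ** 2).sum())
print(g(torch.tensor([1.0, 2.0])))  # tensor([2., 4.])
```

The transform names are unchanged; in most cases the migration is just swapping the import.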
Published by zou3519 almost 2 years ago
We’re excited to announce that, as a first step towards closer integration with PyTorch, functorch has moved to inside the PyTorch library and no longer requires the installation of a separate functorch package. After installing PyTorch via conda or pip, you’ll be able to import functorch in your program.
functorch will no longer have a separate version number (and instead the version number will match PyTorch’s; 1.13 for the current release).
If you're upgrading from an older version of functorch (functorch 0.1.x or 0.2.x), then you may need to uninstall functorch first via pip uninstall functorch.
We've maintained backwards compatibility for pip install functorch: this command works for PyTorch 1.13 and will continue to work for the foreseeable future until we do a proper deprecation. This is helpful if you're maintaining a library that supports multiple versions of PyTorch and/or functorch. The actual mechanics of this are that the functorch pip wheel is just a dummy package that lists torch==1.13 as a dependency.
Please refer to the PyTorch release notes for a detailed changelog.
Published by zou3519 about 2 years ago
We’re excited to present the functorch 0.2.1 minor bug-fix release, compatible with PyTorch 1.12.1. Please see here for installation instructions.
- masked_fill (#946)
- searchsorted (#966)

Published by zou3519 over 2 years ago
Inspired by Google JAX, functorch is a library that offers composable vmap (vectorization) and autodiff transforms. It enables advanced autodiff use cases that would otherwise be tricky to express in PyTorch, such as computing per-sample gradients, running ensembles of models on a single machine, and efficiently computing Jacobians and Hessians.
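For instance, per-sample gradients fall out of composing grad with vmap. A minimal sketch, written with the torch.func names (identical to functorch's); the linear model and batch sizes are made up for illustration:

```python
import torch
from torch.func import grad, vmap

# Per-example squared-error loss for a toy linear model
# (hypothetical example, not from the release notes).
def loss(weight, x, y):
    return (x @ weight - y) ** 2

weight = torch.randn(3)
xs = torch.randn(8, 3)  # batch of 8 examples
ys = torch.randn(8)

# grad(loss) differentiates the loss for one example w.r.t. weight;
# vmap maps that over the batch dimension of xs and ys, producing
# one gradient per example in a single vectorized call.
per_sample_grads = vmap(grad(loss), in_dims=(None, 0, 0))(weight, xs, ys)
print(per_sample_grads.shape)  # torch.Size([8, 3])
```

Without vmap, this would require a Python loop over the batch or manual tricks with batched modules.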
We’re excited to announce functorch 0.2.0 with a number of improvements and new experimental features.
functorch's Linux binaries are compatible with all PyTorch 1.12.0 binaries aside from the PyTorch 1.12.0 cu102 binary; functorch will raise an error if it is used with an incompatible PyTorch binary. This is due to a bug in PyTorch (https://github.com/pytorch/pytorch/issues/80489); in previous versions of PyTorch, it was possible to build a single Linux binary for functorch that worked with all PyTorch Linux binaries. This will be fixed in the next PyTorch (and functorch) minor release.
We significantly improved coverage for functorch.jvp (our forward-mode autodiff API) and other APIs that rely on it (functorch.jacfwd and functorch.hessian).
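A minimal sketch of the jvp-based APIs, shown with the torch.func names (the same as functorch's); the function f is illustrative only:

```python
import torch
from torch.func import jacfwd, hessian

# jacfwd builds a Jacobian out of forward-mode (jvp) calls;
# hessian composes forward-mode over reverse-mode.
def f(x):
    return x.sin().sum()

x = torch.randn(4)
print(jacfwd(f)(x))   # gradient of sum(sin(x)), i.e. cos(x)
print(hessian(f)(x))  # 4x4 matrix with -sin(x) on the diagonal
```

Forward-mode shines when a function has more outputs than inputs, and hessian uses it for the outer differentiation by default.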
Given a function f, functionalize(f) returns a new function without mutations (with caveats). This is useful for constructing traces of PyTorch functions without in-place operations. For example, you can use make_fx(functionalize(f)) to construct a mutation-free trace of a PyTorch function. To learn more, please see the documentation.
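A minimal sketch of the idea, using the current torch.func.functionalize spelling (it lived under functorch.experimental at the time); the function f is a made-up example:

```python
import torch
from torch.func import functionalize
from torch.fx.experimental.proxy_tensor import make_fx

# f mutates an intermediate tensor in place.
def f(x):
    y = x.clone()
    y.add_(1)  # in-place op
    return y

x = torch.randn(3)
# functionalize(f) computes the same values without the mutation...
out = functionalize(f)(x)

# ...so tracing it with make_fx yields a graph of out-of-place ops
# (add instead of add_), which downstream compilers can consume.
graph = make_fx(functionalize(f))(torch.randn(3))
print(graph.code)
```

Note the "with caveats" above: mutations of function inputs and some aliasing patterns have restrictions; see the documentation for details.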
There are now official functorch pip wheels for Windows.
Note that this is not an exhaustive list of changes, e.g. changes to pytorch/pytorch can fix bugs in functorch or improve our transform coverage. Here we include user-facing changes that were committed to pytorch/functorch.
- functorch.experimental.functionalize (#236, #720, and more)
- torch.norm (#708)
- Added disable_autograd_tracking to the make_functional variants. This is useful if you’re not using torch.autograd (#701)
- torch.nn.functional.mse_loss (#860)
- torch.autograd.functional and functorch transforms (#849)

Published by zou3519 over 2 years ago
We’re excited to present the functorch 0.1.1 minor bug-fix release, compatible with PyTorch 1.11. Please see here for installation instructions.
- Fixed jvp with vmap (#603)
- jvp now works when called inside autograd.Function (#607)
- make_functional (and variants) now work with models that do parameter sharing (also known as weight tying) (#620)
- nn.functional.silu, nn.functional.prelu, nn.functional.glu (#677, #609, #665)
- vmap support for nn.functional.group_norm, binomial, torch.multinomial, Tensor.to (#685, #670, #672, #649)

Published by zou3519 over 2 years ago
We’re excited to announce the first beta release of functorch. Heavily inspired by Google JAX, functorch is a library that adds composable function transforms to PyTorch. It aims to provide composable vmap (vectorization) and autodiff transforms that work with PyTorch modules and PyTorch autograd with good eager-mode performance.
Composable function transforms can help with a number of use cases that are tricky to do in PyTorch today, such as computing per-sample gradients, running model ensembles on a single machine, and efficiently computing Jacobians and Hessians (including batched ones).
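The composability is the key point: transforms nest. A minimal sketch of a batched Jacobian, written with the torch.func names (the same transforms functorch provides); the function f is illustrative only:

```python
import torch
from torch.func import jacrev, vmap

# jacrev(f) gives the Jacobian of f at one input;
# wrapping it in vmap gives per-example Jacobians for a whole batch.
def f(x):
    return x * x.sum()

xs = torch.randn(16, 3)            # batch of 16 inputs
batched_jac = vmap(jacrev(f))(xs)  # shape (16, 3, 3)
print(batched_jac.shape)
```

Expressing this in stock PyTorch autograd requires either a loop over the batch or careful manual bookkeeping.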
For more details, please see our documentation, tutorials, and installation instructions.