# Bayesian Adaptive Direct Search (BADS)

BADS is a Bayesian adaptive direct search optimization algorithm for model fitting in MATLAB, released under the GPL-3.0 license.
## What is it

BADS is a fast hybrid Bayesian optimization algorithm designed to solve difficult optimization problems, in particular those related to fitting computational models (e.g., via maximum likelihood estimation). The original BADS paper was presented at NeurIPS in 2017 [1].

BADS has been extensively tested for fitting behavioral, cognitive, and neural models, and is currently used in many computational labs around the world.
In our benchmark with real model-fitting problems, BADS performed on par with or better than many common and state-of-the-art MATLAB optimizers, such as `fminsearch`, `fmincon`, and `cmaes` [1].
BADS is recommended when no gradient information is available, and the objective function is non-analytical or noisy, for example evaluated through numerical approximation or via simulation.
BADS requires no specific tuning and runs off-the-shelf like other built-in MATLAB optimizers such as `fminsearch`.
## Installation

Download the latest version of BADS as a ZIP file.
- To install BADS, unzip the file where you want it and run the script `install.m`.
- To check that the installation works, run `bads('test')`.

## Quick start

The BADS interface is similar to that of other MATLAB optimizers. The basic usage is:
```matlab
[X,FVAL] = bads(FUN,X0,LB,UB,PLB,PUB);
```
with input parameters:
- `FUN`, a function handle to the objective function to minimize (typically, the negative log likelihood of a dataset and model, for a given input parameter vector);
- `X0`, the starting point of the optimization (a row vector);
- `LB` and `UB`, hard lower and upper bounds;
- `PLB` and `PUB`, plausible lower and upper bounds, that is, a box where you would expect to find almost all solutions.

The output parameters are:
- `X`, the found optimum;
- `FVAL`, the (estimated) function value at the optimum.

For more usage examples, see `bads_examples.m`. You can also type `help bads` to display the documentation.
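As a minimal sketch of the interface described above (the objective and bounds here are illustrative toy choices, not part of the package; see `bads_examples.m` for the official examples), a complete call might look like:

```matlab
% Illustrative sketch: minimize a simple noiseless objective with BADS.
fun = @(x) sum(x.^2, 2);            % toy objective (sphere function); x is a row vector
x0  = [1, 1];                       % starting point
lb  = [-20, -20]; ub = [20, 20];    % hard lower/upper bounds
plb = [-5, -5];   pub = [5, 5];     % plausible lower/upper bounds
[x, fval] = bads(fun, x0, lb, ub, plb, pub);
```

In a typical model-fitting application, `fun` would instead return the negative log likelihood of your model for the parameter vector `x`.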
For practical recommendations, such as how to set `LB` and `UB`, and any other questions, check out the FAQ on the BADS wiki.
Note: BADS is a semi-local optimization algorithm: it can escape local minima better than many other methods, but it can still get stuck. The best performance for BADS is obtained by running the algorithm multiple times from distinct starting points (see here).
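A multi-start strategy can be sketched as follows. This is a minimal illustration, not part of BADS itself: the toy objective, bounds, and restart loop are all assumptions, and the starting points are simply drawn uniformly inside the plausible box.

```matlab
% Illustrative multi-start sketch: run BADS from several random starting
% points and keep the best solution found across runs.
fun = @(x) sum(x.^2, 2);            % toy objective (sphere function)
lb  = [-20, -20]; ub = [20, 20];    % hard bounds
plb = [-5, -5];   pub = [5, 5];     % plausible bounds
nRuns = 10;                         % number of restarts (problem-dependent)
bestFval = Inf; bestX = [];
for iRun = 1:nRuns
    % Draw a random starting point uniformly inside the plausible box
    x0 = plb + rand(1, numel(plb)) .* (pub - plb);
    [x, fval] = bads(fun, x0, lb, ub, plb, pub);
    if fval < bestFval
        bestFval = fval; bestX = x;
    end
end
```

For noisy objectives, note that the returned `fval` is itself an estimate, so comparing runs by raw `fval` is only a heuristic.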
## How does it work

BADS follows a mesh adaptive direct search (MADS) procedure for function minimization that alternates poll steps and search steps (see Fig 1).

*Fig 1: BADS procedure.*
See here for a visualization of several optimizers at work, including BADS.
See our paper for more details [1].
## Troubleshooting and comments

If you have trouble doing something with BADS, post in the acerbilab Discussions forum.

This project is under active development. If you find a bug, or anything that needs correction, please let us know.
## Citation

You can cite BADS in your work with something along the lines of:

> *We optimized the log likelihoods of our models using Bayesian adaptive direct search (BADS; Acerbi and Ma, 2017). BADS alternates between a series of fast, local Bayesian optimization steps and a systematic, slower exploration of a mesh grid.*

BibTeX entry:

```bibtex
@article{acerbi2017practical,
    title={Practical {B}ayesian Optimization for Model Fitting with {B}ayesian Adaptive Direct Search},
    author={Acerbi, Luigi and Ma, Wei Ji},
    journal={Advances in Neural Information Processing Systems},
    volume={30},
    pages={1834--1844},
    year={2017}
}
```

Besides formal citations, you can demonstrate your appreciation for BADS in the following ways:
## License

BADS is released under the terms of the GNU General Public License v3.0.