For an up-to-date list of our software, visit our group's GitHub profile.
Bayesian Adaptive Direct Search (BADS)
GitHub page
BADS is a fast Bayesian optimization algorithm designed to solve difficult optimization problems, in particular those arising when fitting computational models (e.g., via maximum-likelihood estimation). In benchmarks with real model-fitting problems, BADS performed on par with or better than many common and state-of-the-art MATLAB optimizers, such as fminsearch, fmincon, and cmaes. BADS has become the default optimization method in many computational labs around the world, and has been applied to a variety of problems in computational and cognitive neuroscience, and more recently in economics and engineering.
BADS is recommended when no gradient information is available and the objective function is non-analytical or noisy, for example when it is evaluated through numerical approximation or via simulation. BADS requires no specific tuning, comes with extensive documentation and examples, and runs off-the-shelf like other built-in MATLAB optimizers such as fminsearch.
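As a minimal usage sketch in MATLAB: the toy objective and all bound values below are illustrative, while the calling convention with hard and plausible bounds follows the toolbox documentation.

    % Noisy toy objective standing in for a model's negative log-likelihood
    % (hypothetical; replace with your own model-fitting objective).
    fun = @(x) sum(x.^2) + 0.5*randn();      % noisy quadratic
    x0  = [1, 1];                            % starting point
    lb  = [-10, -10];  ub  = [10, 10];       % hard lower/upper bounds
    plb = [-3, -3];    pub = [3, 3];         % plausible bounds (region where the
                                             % solution is expected to lie)
    [x, fval] = bads(fun, x0, lb, ub, plb, pub);

The plausible bounds are a distinctive feature of the interface: they need not contain the optimum with certainty, but help BADS initialize and scale its search.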
Reference:
Acerbi, L. & Ma, W. J. (2017). Practical Bayesian optimization for model fitting with Bayesian Adaptive Direct Search. Advances in Neural Information Processing Systems, 30.
Variational Bayesian Monte Carlo (VBMC)
GitHub page: MATLAB, Python
VBMC is an approximate inference method designed to fit and evaluate computational models with a limited budget of potentially noisy likelihood evaluations. Specifically, VBMC simultaneously computes:
- an approximate posterior distribution of the model parameters; and
- an approximation of the model evidence (marginal likelihood), a metric used for Bayesian model selection.
Extensive benchmarks on both artificial test problems and a large number of real model-fitting problems from computational and cognitive neuroscience show that VBMC generally outperforms, often by a large margin, alternative methods for sample-efficient Bayesian inference. VBMC runs with virtually no tuning, comes with extensive documentation and examples, and is very easy to set up (especially if you are already familiar with our optimization toolbox, BADS).
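A minimal usage sketch in MATLAB, assuming a toy two-dimensional Gaussian target: the target density and bound values are illustrative, while the interface with plausible bounds mirrors that of BADS, per the toolbox documentation.

    % Log joint density (log-likelihood + log-prior) of a toy 2-D model;
    % here a standard bivariate normal (hypothetical stand-in for your model).
    fun = @(x) -0.5*sum(x.^2) - log(2*pi);
    x0  = [0, 0];                            % starting point
    lb  = [-10, -10];  ub  = [10, 10];       % hard bounds
    plb = [-3, -3];    pub = [3, 3];         % plausible bounds
    % vp is the variational posterior; elbo approximates the log model evidence.
    [vp, elbo, elbo_sd] = vbmc(fun, x0, lb, ub, plb, pub);
    Xs = vbmc_rnd(vp, 1e4);                  % draw samples from the approximate posterior

Note that, unlike an optimizer, the target function here returns the log joint density, and the output is a full posterior approximation rather than a point estimate.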
References:
Acerbi, L. (2018). Variational Bayesian Monte Carlo. Advances in Neural Information Processing Systems, 31.
Acerbi, L. (2020). Variational Bayesian Monte Carlo with noisy likelihoods. Advances in Neural Information Processing Systems, 33.
Inverse Binomial Sampling (IBS)
GitHub page
IBS is a technique to obtain unbiased, efficient estimates of the log-likelihood of a model by simulation. IBS simultaneously yields both a normally distributed, unbiased estimate of the log-likelihood, and a calibrated estimate of its variance.
The typical scenario is one in which you have a simulator, that is, a model from which you can randomly draw synthetic observations (for a given parameter vector), but whose log-likelihood you cannot evaluate analytically or numerically. In other words, IBS affords likelihood-based inference for models without explicit likelihood functions (also known as implicit models or simulators). Unlike other approaches for simulator-based inference, IBS does not reduce the data to summary statistics, but computes the log-likelihood of the entire data set.
IBS is commonly used as part of an algorithm for maximum-likelihood estimation or Bayesian inference, and thanks to its properties it combines very well with the BADS and VBMC methods from our group (described above on this page); see the sketch below for the core idea.
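As an illustration, here is a minimal from-scratch MATLAB sketch of the core estimator for a data set of discrete trial responses. The simulator handle and all names are hypothetical, and the per-trial estimates follow the definitions in the reference below; the toolbox implementation should be preferred in practice.

    function [nll, nll_var] = ibs_nll(simulate, theta, stimuli, responses)
    % IBS_NLL  Unbiased IBS estimate of a model's negative log-likelihood.
    %   simulate(theta, s) draws one synthetic response for stimulus s
    %   (hypothetical simulator interface).
    nll = 0; nll_var = 0;
    for t = 1:numel(responses)
        k = 1;                              % draws until the first match
        while simulate(theta, stimuli(t)) ~= responses(t)
            k = k + 1;                      % keep simulating until a match
        end
        if k > 1
            j = 1:(k-1);
            nll     = nll + sum(1 ./ j);        % trial estimate of -log p(response)
            nll_var = nll_var + sum(1 ./ j.^2); % calibrated variance of the estimate
        end
    end
    end

A real implementation would also vectorize the loop over trials and allow early stopping after a maximum number of draws, as discussed in the reference.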
Reference:
van Opheusden, B., Acerbi, L., & Ma, W. J. (2020). Unbiased and efficient log-likelihood estimation with inverse binomial sampling. PLoS Computational Biology, 16(12), e1008483.
Optimization Visualization Demo (OptimViz)
GitHub page
This demo visualizes several MATLAB derivative-free optimizers at work on standard test functions. The optimization algorithms visualized in the demo include BADS, fminsearch, fmincon, genetic algorithms (ga), multi-level coordinate search (MCS), and CMA-ES. This code is purely for demonstration purposes and does not represent a proper benchmark.
For some of our studies, we or our collaborators collected psychophysical data from human participants (pictured: data from a multisensory perception task). When applicable, datasets for published studies can be found on the Publications page, under each article entry.