Resources

Machine learning algorithms and statistical software developed by our group, available as well-documented toolboxes.

For an up-to-date list of our software, visit our group's GitHub profile.

Bayesian Adaptive Direct Search (BADS)

GitHub page

BADS is a fast Bayesian optimization algorithm designed to solve difficult optimization problems, in particular those that arise when fitting computational models (e.g., via maximum likelihood estimation). In benchmarks with real model-fitting problems, BADS performed on par with or better than many other common and state-of-the-art MATLAB optimizers, such as fminsearch, fmincon, and cmaes. BADS has become the default optimization method in many computational labs around the world, and has been applied to a variety of problems in computational and cognitive neuroscience, and more recently in economics and engineering.

BADS is recommended when no gradient information is available and the objective function is non-analytical or noisy, for example when it is evaluated through numerical approximation or simulation. BADS requires no specific tuning, comes with extensive documentation and examples, and runs off-the-shelf like other built-in MATLAB optimizers such as fminsearch.
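As a rough sketch of the intended workflow, the example below minimizes a noisy toy objective with bads, using the six-argument calling convention (starting point, hard bounds, plausible bounds) documented in the BADS repository; the toy objective is only an illustration, and the exact interface and options should be checked against the toolbox documentation.

    % Minimal sketch: minimize a noisy 2-D toy objective with BADS
    % (requires the BADS toolbox on the MATLAB path). The objective stands
    % in for, e.g., a negative log-likelihood evaluated via simulation.
    fun = @(x) sum(x.^2) + 0.1*randn();   % noisy toy objective
    x0  = [1, 2];                         % starting point
    lb  = [-10, -10];  ub  = [10, 10];    % hard bounds
    plb = [-3, -3];    pub = [3, 3];      % plausible region for the optimum
    [x_opt, fval] = bads(fun, x0, lb, ub, plb, pub);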

Reference:

  1. Acerbi L, Ma WJ (2017). Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search. In Proc. Advances in Neural Information Processing Systems 30 (NeurIPS '17), Long Beach, USA. | Link | arXiv (article + supplement)

Variational Bayesian Monte Carlo (VBMC)

GitHub page: MATLAB, Python

VBMC is an approximate inference method designed to fit and evaluate computational models with a limited budget of potentially noisy likelihood evaluations. Specifically, VBMC simultaneously computes:

  • an approximate posterior distribution of the model parameters;
  • an approximation (technically, an approximate lower bound) of the log model evidence, also known as the log marginal likelihood, a metric used for Bayesian model selection (e.g., via Bayes factors).

Extensive benchmarks on both artificial test problems and a large number of real model-fitting problems from computational and cognitive neuroscience show that VBMC generally — and often vastly — outperforms alternative methods for sample-efficient Bayesian inference. VBMC runs with virtually no tuning, comes with extensive documentation and examples, and is very easy to set up (especially if you are already familiar with our optimization toolbox, BADS).
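As a minimal sketch of how VBMC is typically called in MATLAB (the PyVBMC interface is analogous), assuming the six-argument calling convention and the vbmc_rnd sampler described in the VBMC documentation; the toy log-joint below is purely illustrative and not from the original materials:

    % Minimal sketch: approximate posterior and ELBO for a 2-D toy problem
    % (requires the VBMC toolbox on the MATLAB path). The target function
    % returns the log joint, i.e., log likelihood plus log prior.
    logjoint = @(x) -0.5*sum(x.^2) - numel(x)/2*log(2*pi);   % toy Gaussian target
    x0  = [0.5, -0.5];                     % starting point
    lb  = [-10, -10];  ub  = [10, 10];     % hard bounds
    plb = [-3, -3];    pub = [3, 3];       % plausible bounds
    [vp, elbo, elbo_sd] = vbmc(logjoint, x0, lb, ub, plb, pub);
    Xs = vbmc_rnd(vp, 1e5);                % samples from the variational posterior
    % elbo approximates (a lower bound on) the log model evidence.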

References:

  1. Acerbi L (2018). Variational Bayesian Monte Carlo. In Proc. Advances in Neural Information Processing Systems 31 (NeurIPS '18), Montréal, Canada. | Link | arXiv
  2. Acerbi L (2019). An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo. In Proc. Machine Learning Research 96: 1-10. 1st Symposium on Advances in Approximate Bayesian Inference, Montréal, Canada. | Link
  3. Acerbi L (2020). Variational Bayesian Monte Carlo with Noisy Likelihoods. In Proc. Advances in Neural Information Processing Systems 33 (NeurIPS '20). | arXiv
  4. Huggins B*, Li C*, Tobaben M*, Aarnos MJ, Acerbi L (2023). PyVBMC: Efficient Bayesian inference in Python. arXiv preprint. | arXiv
  5. Li C, Clarté G, Acerbi L (2023). Fast post-process Bayesian inference with Sparse Variational Bayesian Monte Carlo. arXiv preprint. | arXiv
     
Inverse Binomial Sampling (IBS)

GitHub page

IBS is a technique to obtain unbiased, efficient estimates of the log-likelihood of a model by simulation. IBS simultaneously yields a normally distributed, unbiased estimate of the log-likelihood and a calibrated estimate of its variance.

The typical scenario is one in which you have a simulator, that is, a model from which you can randomly draw synthetic observations (for a given parameter vector), but whose log-likelihood you cannot evaluate analytically or numerically. In other words, IBS affords likelihood-based inference for models without explicit likelihood functions (also known as implicit models or simulators). Unlike other approaches to simulator-based inference, IBS does not reduce the data to summary statistics, but computes the log-likelihood of the entire data set.

IBS is commonly used as part of an algorithm for maximum-likelihood estimation or Bayesian inference, and due to its properties it combines very well with the BADS and VBMC methods from our group (described above on this page).
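To convey the core idea, here is a minimal MATLAB sketch of the basic per-trial IBS estimator (a didactic illustration, not the toolbox implementation, which adds vectorization, multiple repeats, and variance estimates). For each trial, synthetic responses are drawn from the simulator until one matches the observed response, and the estimated log-likelihood is minus the harmonic sum determined by the number of draws needed; the function and argument names below are hypothetical.

    % Minimal sketch of the core IBS estimator of the data-set log-likelihood.
    % simulate_fun(theta, design_row) returns one simulated response; resp and
    % design contain the observed responses and trial designs (one row per trial).
    function logl = ibs_basic(simulate_fun, theta, resp, design)
        logl = 0;
        for t = 1:size(resp, 1)
            k = 1;                            % draws until the first match
            while ~isequal(simulate_fun(theta, design(t,:)), resp(t,:))
                k = k + 1;
            end
            % Unbiased estimate of log p(resp_t | theta): minus the harmonic
            % sum 1 + 1/2 + ... + 1/(k-1) (zero if the first draw matches).
            logl = logl - sum(1 ./ (1:(k-1)));
        end
    end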

Reference:

  1. van Opheusden B*, Acerbi L*, Ma WJ (2020). Unbiased and Efficient Log-Likelihood Estimation with Inverse Binomial Sampling. PLoS Computational Biology 16(12): e1008483. DOI: 10.1371/journal.pcbi.1008483 (* equal contribution) | Link

Optimization Visualization Demo (OptimViz)

GitHub page

This demo visualizes several MATLAB derivative-free optimizers at work on standard test functions. The optimization algorithms visualized in the demo include BADS, fminsearch, fmincon, genetic algorithms (ga), multi-level coordinate search (MCS), and CMA-ES. This code is purely for demonstration purposes and does not represent a proper benchmark.

Data

For some of our studies, we or our collaborators collected psychophysical data from human participants (for example, data from a multisensory perception task). When applicable, datasets for published studies can be found on the Publications page, under each article entry.