LibBi released

Sunday, June 23rd, 2013

After four years of work, I’m very happy to announce that LibBi is now available as open source software.

LibBi is used for state-space modelling and Bayesian inference on high-performance computer hardware, including multi-core CPUs, many-core GPUs (graphics processing units) and distributed-memory clusters.

The staple methods of LibBi are based on sequential Monte Carlo (SMC), also known as particle filtering. These methods include particle Markov chain Monte Carlo (PMCMC) and SMC². Other methods include the extended Kalman filter and some parameter optimisation routines.
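To give a flavour of the particle filtering at the heart of these methods, here is a minimal bootstrap particle filter for a one-dimensional linear-Gaussian state-space model. This is an illustrative sketch only, not LibBi's implementation: the model, function name and parameters are hypothetical, and LibBi itself handles far more general models on GPUs and clusters.

```python
import numpy as np

def bootstrap_filter(y, n_particles=1000, sigma_x=1.0, sigma_y=1.0, rng=None):
    """Estimate the filtered mean of x_t for the toy model
    x_t = 0.9 x_{t-1} + N(0, sigma_x^2),  y_t = x_t + N(0, sigma_y^2)."""
    rng = rng or np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, n_particles)      # initial particle cloud
    means = []
    for yt in y:
        # Propagate each particle through the transition model.
        x = 0.9 * x + rng.normal(0.0, sigma_x, n_particles)
        # Weight particles by the observation likelihood (log-space for stability).
        logw = -0.5 * ((yt - x) / sigma_y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.sum(w * x)))     # weighted estimate of the filtered mean
        # Multinomial resampling to combat weight degeneracy.
        idx = rng.choice(n_particles, n_particles, p=w)
        x = x[idx]
    return means
```

PMCMC and SMC² build on this primitive, using the particle filter's likelihood estimate inside an outer sampler over parameters.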

LibBi consists of a C++ template library, as well as a parser and compiler, written in Perl, for its own modelling language.

Find out more on the external LibBi site. A good place to start, as a user, is this introductory paper. If you are interested in getting involved in development of the software, please contact me.

New papers

Wednesday, February 29th, 2012

Given recent interest, I’ve added one preprint and one older workshop paper; see below.

Murray, L. M.; Jones, E. M. & Parslow, J. (2012). On collapsed state-space models and the particle marginal Metropolis-Hastings sampler. In review. [arXiv]

Murray, L. M. (2011). GPU acceleration of the particle filter: The Metropolis resampler. Distributed machine learning and sparse representation with massive data-sets (DMMD 2011). [arXiv]

Bayesian Learning of Continuous-Time Dynamical Systems

Saturday, June 27th, 2009

I’ve posted the final version of my PhD thesis, "Bayesian Learning of Continuous-Time Dynamical Systems, with Applications in Functional Magnetic Resonance Imaging" to the research page. Now assessed, corrected and passed!

Note that this may serve as a useful manual for some of the detail behind the algorithms of the dysii Dynamic Systems Library.

dysii 1.4.0 released

Wednesday, December 17th, 2008

Version 1.4.0 of the dysii Dynamic Systems Library has been released. This is a major new release with a number of additional features and performance enhancements; it also consolidates the code base and matures much of the API.

Particular new features include:

  • The kernel forward-backward and two-filter smoothers, suitable for fast, large-scale approximate inference in continuous-time stochastic models, as documented in my recent PhD thesis.
  • An overhauled kd-tree implementation, featuring distributed partitioning and dual-tree and self-tree evaluations, particularly useful for the new smoothers above.
  • An improved stochastic Runge-Kutta method and a new Euler-Maruyama method for integrating stochastic differential equations.
  • Performance improvements resulting from continued profiling, including more aggressive inlining and reduced reliance on virtual function calls.
  • A new installation guide, available in the INSTALL.txt file of the distribution. Also note that with Boost 1.35 now released, dysii no longer requires the latest CVS of Boost, making it much simpler to install.
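As an aside, the Euler-Maruyama scheme listed above is the simplest way to integrate a stochastic differential equation. The sketch below applies it to an Ornstein-Uhlenbeck process; it illustrates the scheme only, and is not dysii's actual C++ API (the function name and parameters are hypothetical).

```python
import numpy as np

def euler_maruyama(x0, theta, mu, sigma, dt, n_steps, rng=None):
    """Integrate dX = theta*(mu - X) dt + sigma dW from X(0) = x0,
    using fixed step size dt over n_steps steps."""
    rng = rng or np.random.default_rng(42)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over one step
        x[k + 1] = x[k] + theta * (mu - x[k]) * dt + sigma * dw
    return x
```

Higher-order stochastic Runge-Kutta schemes refine this by using additional intermediate evaluations of the drift and diffusion per step, at the cost of more work per step.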

Full details are included in the VERSION.txt file of the distribution.

A couple of example applications built on dysii are also expected to be released within days. These should provide an excellent starting point for anyone wishing to use the library in their own work.
