My documentation for final evaluation of GSoC 2020

Introduction

In this Google Summer of Code program, I, along with my mentors, aimed to implement the PolyChord nested sampling algorithm and integrate it with Turing using the Julia language. The version 0.5.0 release of NestedSamplers.jl includes our work to date. It allows users to choose the random staggering, slicing, and random slicing (or PolyChord-style) proposal algorithms in NestedSamplers.jl. Much of this work was inspired by dynesty and its modular approach to nested sampling, which Julia's multiple dispatch made even more effective. The majority of the code for the proposal algorithms has already been merged into NestedSamplers.jl. One major improvement that would greatly increase its usage is merging ns.jl with the Turing.jl package.
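The slicing and random slicing proposals are built on slice sampling (Neal, 2003), the same family of methods dynesty uses. As a rough, stdlib-only sketch of the underlying idea (illustrative code, not the package's actual implementation), a single 1-D slice-sampling update with stepping-out and shrinkage looks like:

```julia
using Random
Random.seed!(7)

# One 1-D slice-sampling update (Neal 2003): draw a height under the
# density, step out an interval that brackets the slice, then shrink it.
function slice_step(logdensity, x; w=1.0)
    logy = logdensity(x) + log(rand())      # slice height under the density
    # Step out an interval [L, R] of width w until it brackets the slice.
    L = x - w * rand()
    R = L + w
    while logdensity(L) > logy; L -= w; end
    while logdensity(R) > logy; R += w; end
    # Shrink: sample uniformly in [L, R], narrowing on rejected points.
    while true
        x1 = L + rand() * (R - L)
        logdensity(x1) > logy && return x1
        x1 < x ? (L = x1) : (R = x1)
    end
end

# Example: draw from a standard normal via repeated slice updates.
logpdf_normal(x) = -x^2 / 2

function draw_normal(n)
    x = 0.0
    xs = Float64[]
    for _ in 1:n
        x = slice_step(logpdf_normal, x)
        push!(xs, x)
    end
    return xs
end

xs = draw_normal(5000)
println(sum(xs) / length(xs))   # sample mean, should be near 0
```

Roughly speaking, the Slice and RSlice proposals in NestedSamplers.jl generalize this kind of update to many dimensions, slicing along axes or random directions within the likelihood-constrained region.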

Blog post

The work done in this project is described in this blog post, with links to all of the merged code and with illustrations.

Citation

The version 0.5.0 release of NestedSamplers.jl can be cited as: Miles Lucas, Saranjeet Kaur, Hong Ge, & Cameron Pfiffer. (2020, July 22). TuringLang/NestedSamplers.jl: v0.5.0 (Version v0.5.0). Zenodo.

What is NestedSamplers.jl?

NestedSamplers.jl is a Julia implementation of the nested sampling algorithm, which allows the user to generate samples from posterior distributions and to estimate the model evidence.
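To make the evidence estimate concrete, here is a deliberately minimal, stdlib-only toy version of the nested sampling loop on a 1-D problem. The constrained prior draw uses plain rejection sampling, which the package replaces with its bounds and proposal machinery; none of this is NestedSamplers.jl code.

```julia
using Random
Random.seed!(42)

# 1-D toy problem: uniform prior on [0, 1], Gaussian likelihood.
loglike(θ) = -(θ - 0.5)^2 / (2 * 0.1^2)

function toy_nested_sampling(loglike; nlive=100, niter=1000)
    live = rand(nlive)               # draw live points from the prior
    logls = loglike.(live)
    Z = 0.0
    X_prev = 1.0                     # prior volume starts at 1
    for i in 1:niter
        X = exp(-i / nlive)          # expected shrinkage of the prior volume
        imin = argmin(logls)         # worst live point defines the contour
        Z += exp(logls[imin]) * (X_prev - X)   # accumulate evidence
        # Replace the worst point with a prior draw subject to L(θ) > L_min
        # (plain rejection sampling -- fine for this 1-D toy).
        Lmin = logls[imin]
        θ = rand()
        while loglike(θ) <= Lmin
            θ = rand()
        end
        live[imin], logls[imin] = θ, loglike(θ)
        X_prev = X
    end
    # Add the contribution of the remaining live points.
    Z += X_prev * sum(exp, logls) / nlive
    return Z
end

Z = toy_nested_sampling(loglike)
# Analytic evidence: ∫₀¹ exp(-(θ - 0.5)² / 0.02) dθ ≈ 0.1 * √(2π) ≈ 0.2507
println(Z)
```

The estimate lands close to the analytic value; the discarded "dead" points, weighted by their evidence contributions, are what the sampler later resamples into posterior draws.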

Illustration

Below is an illustration of how a user would use our code to sample from a 25-dimensional correlated multivariate normal likelihood with a uniform prior on [-5, 5] for each variable. This is adapted from one of the illustrations in our blog.

using Random
Random.seed!(1225);
using NestedSamplers
using LinearAlgebra
using StatsBase: sample, Weights
using MCMCChains: Chains

ndims = 25    # number of dimensions
C = Matrix(1.0I, ndims, ndims)    # start from the identity matrix
C[C .== 0] .= 0.4    # set off-diagonal terms to 0.4 (correlated variables)
Cinv = inv(C)    # the precision matrix
lnorm = -0.5 * (log(2π) * ndims + log(det(C)))    # log-normalization constant

# 25-D correlated multivariate normal log-likelihood function
function logl(x)
    return -0.5 * (x' * (Cinv * x)) + lnorm
end

# prior transform: map unit-cube samples `u` to a uniform prior on [-5, 5] for each variable
function prior_transform(u)
    return @. 10.0 * u - 5.0
end

model = NestedModel(logl, prior_transform)    # create the model

spl = Nested(ndims, 500, bounds=Bounds.Ellipsoid, proposal=Proposals.RSlice())    # create the sampler

chain = sample(model, spl; dlogz=0.01, chain_type=Chains)    # run the sampler and return a Chains object
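As a quick sanity check on the listing above, the prior transform should map the corners and center of the unit cube to the edges and center of the [-5, 5] box. A tiny standalone snippet (the transform is repeated here so it runs on its own):

```julia
# Same transform as in the listing above: unit cube -> uniform on [-5, 5].
prior_transform(u) = @. 10.0 * u - 5.0

println(prior_transform([0.0, 0.5, 1.0]))  # → [-5.0, 0.0, 5.0]
```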

Current state of the project

  1. Pull requests that got merged:
  2. Pull requests that are still open:
  • ns.jl#1333: This integrates NestedSamplers.jl with Turing.jl. It is a work in progress, and the last Google Summer of Code commit for this pull request is noted here.
  • Proposals.HSlice#45: This implements the Hamiltonian slicing proposal algorithm in NestedSamplers.jl. It is a work in progress, and the last Google Summer of Code commit for this pull request is noted here.

Challenges

Several challenges were faced throughout the project. These included, but were not limited to, build errors, compilation errors, and errors while refactoring the code. My mentors very patiently helped point out and fix these errors.

Learnings

I learnt how open-source code evolves, and I learnt a lot about the Julia programming language as well. Getting NestedSamplers.jl to integrate with Turing.jl is teaching me a great deal about the internals of Turing.jl; this, for me, is a unique and challenging learning experience.

Further Improvements

I hope to keep improving and extending NestedSamplers.jl, using it where possible throughout my further studies. Some planned improvements:

  • Completing the integration of NestedSamplers.jl with Turing.jl. This will make it easier for the user to experiment with nested samplers without changing their modelling code.
  • Creating more documents (and/or blog posts) which illustrate the use of this sampler in various scenarios.
  • Enabling the NestedSamplers.jl to perform dynamic nested sampling, which may improve the computational efficiency and the sampling accuracy, compared to static nested sampling.
  • Providing more advanced bounds and proposal alternatives in NestedSamplers.jl.

Conference

I got to learn more about the Julia community by attending the online JuliaCon 2020 this year. It included many excellent talks and workshops, which offered a wide variety of things to learn.

Acknowledgements

I would like to express my sincere gratitude to Hong Ge for creating the Turing project and giving me the opportunity to extend NestedSamplers.jl. Without his support I wouldn't have been able to meet the entire Turing team and learn so much from them. Miles Lucas has built NestedSamplers.jl in such a way that it is very easy to navigate the package and add more functionality; he has very patiently helped me at various stages by providing the information necessary to implement more code in the package. Cameron Pfiffer has been an excellent guide whenever I have been stuck, helping me with many of my doubts and with fixing errors in the code. Despite the significant timezone difference, all of them have always been there to answer my questions and provide suggestions. Finally, I thank the Google Summer of Code program for funding this project throughout the summer months.
