… using Stan & HMC

Here, I sample from an Ising-like model: I treat the random variables as continuous, between -1 and 1, and add a term to the pseudo-likelihood that resembles a beta log density.
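Concretely, the unnormalised log pseudo-density coded below works out to

$$
\log p(m) = \frac{1}{T}\sum_{i=2}^{n-1}\sum_{j=2}^{n-1} m_{i,j}\left(m_{i-1,j} + m_{i+1,j} + m_{i,j-1} + m_{i,j+1}\right)
+ (\alpha - 1)\sum_{i,j}\left[\log\frac{1 + m_{i,j}}{2} + \log\frac{1 - m_{i,j}}{2}\right],
$$

so each rescaled spin (m + 1)/2 gets a symmetric Beta(alpha, alpha)-style weight.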

functions {
    // unnormalised log pseudo-density: nearest-neighbour coupling plus a
    // symmetric Beta(alpha, alpha)-like term on the rescaled spins (m + 1)/2
    real log_p(matrix m, real T, real alpha) {
        int n = rows(m);
        // each interior spin times its four neighbours (up, left, down, right)
        return( (1/T) * sum(m[2:(n-1), 2:(n-1)] .* m[1:(n-2), 2:(n-1)] +
                            m[2:(n-1), 2:(n-1)] .* m[2:(n-1), 1:(n-2)] +
                            m[2:(n-1), 2:(n-1)] .* m[3:n    , 2:(n-1)] +
                            m[2:(n-1), 2:(n-1)] .* m[2:(n-1), 3:n    ]) +
                // (alpha - 1) * [log((1 + m)/2) + log((1 - m)/2)], elementwise
                sum( log(m/2 + 0.5)*(alpha - 1) + log(0.5 - m/2)*(alpha - 1) ));
    }
}
data {
    int n;       // lattice size: the field is n x n
    real T;      // temperature
    real alpha;  // shape of the beta-like term
}
parameters {
    matrix<lower = -1, upper = 1>[n, n] m;
}
model {
    target += log_p(m, T, alpha);
}

The matrix terms are essentially a vectorised product-sum of nearest neighbour spins.
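For reference, here is a minimal sketch of how the model could be compiled and sampled with CmdStanPy (any Stan interface would do; the file name, lattice size, temperature and alpha values are purely illustrative):

    # sketch: HMC sampling of the Ising-like model via CmdStanPy
    from cmdstanpy import CmdStanModel

    model = CmdStanModel(stan_file="ising.stan")      # the program above, saved to a file
    fit = model.sample(
        data={"n": 32, "T": 2.0, "alpha": 1.5},       # illustrative values
        chains=4,
        iter_sampling=1000,
    )
    draws = fit.stan_variable("m")                    # shape: (num_draws, n, n)
    mean_field = draws.mean(axis=0)                   # posterior-mean spin per site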

Results:

2019

Efficient Gaussian Process Computation

I’ll try to give examples of efficient Gaussian process computation here, such as the vec trick (Kronecker product trick), efficient Toeplitz and circulant matrix computations, RTS smoothing and Kalman filtering using state-space representations, and so on.
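As a taste of the first of those, here is a small NumPy sketch (not from the post itself) of the standard identity behind the vec trick, vec(A X B) = (Bᵀ ⊗ A) vec(X), which lets you avoid ever forming the Kronecker product:

    import numpy as np

    def vec(M):
        return M.reshape(-1, order="F")       # column-major (mathematical) vec

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((4, 4))
    X = rng.standard_normal((3, 4))

    lhs = np.kron(B.T, A) @ vec(X)            # naive: builds the 12 x 12 Kronecker matrix
    rhs = vec(A @ X @ B)                      # vec trick: two small matrix products
    assert np.allclose(lhs, rhs)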

~1 min read

Gaussian Process Middle C

The first of my experiments on audio modelling using Gaussian processes. Here, I construct a GP that, when sampled, plays middle C the way a grand piano would.
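A rough NumPy sketch of the idea, not the post's actual construction: the kernel, lengthscale and sample rate below are my own guesses, and a single cosine-modulated squared-exponential kernel only gives a noisy tone at middle C's fundamental (~261.6 Hz), not a piano timbre.

    import numpy as np
    from scipy.io import wavfile

    fs = 8000                           # assumed sample rate (Hz)
    t = np.arange(2048) / fs            # ~0.26 s of audio
    f0 = 261.63                         # middle C fundamental (Hz)

    # assumed kernel: squared-exponential envelope times a cosine at f0
    tau = t[:, None] - t[None, :]
    K = np.exp(-0.5 * (tau / 0.05) ** 2) * np.cos(2 * np.pi * f0 * tau)
    K += 1e-6 * np.eye(len(t))          # jitter for a stable Cholesky

    L = np.linalg.cholesky(K)
    audio = L @ np.random.default_rng(1).standard_normal(len(t))   # one GP sample path
    wavfile.write("middle_c.wav", fs, (audio / np.abs(audio).max()).astype(np.float32))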

~1 min read

2018
