I'm interested in the effective sample size of my Gibbs sample.

Plotting the autocorrelation function, I notice that the even lags have much higher autocorrelation than the odd lags. But if you skip the odd lags (or the even lags), the ACF looks smooth. It's interesting to consider what may have caused this.

I'm also wondering if R's effectiveSize (from the coda package) will still work correctly.
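
One cheap way to poke at this is with synthetic data showing the same even/odd pattern. A minimal R sketch, assuming (purely as a stand-in for the real Gibbs output) a chain where x[t] depends only on x[t-2], which gives large even-lag and near-zero odd-lag autocorrelations:

```r
## Stand-in for the Gibbs output (an assumption, not the original model):
## x[t] depends on x[t-2] only, so even-lag autocorrelations are large and
## odd-lag ones are near zero -- the same even/odd pattern described above.
library(coda)

set.seed(1)
n <- 10000
x <- numeric(n)
for (t in 3:n) x[t] <- 0.9 * x[t - 2] + rnorm(1)

acf(x, lag.max = 40)                      # sawtooth: even lags high, odd lags ~0
acf(x[seq(1, n, by = 2)], lag.max = 20)   # keep every other draw: smooth ACF

## effectiveSize() (coda) estimates the spectral density at zero via an AR fit,
## so an oscillating ACF should be handled in principle; comparing the full and
## thinned chains is a cheap sanity check.
effectiveSize(mcmc(x))
effectiveSize(mcmc(x[seq(1, n, by = 2)]))
```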
I am wondering whether, instead of running MCMC and hoping that it has "mixed" ("achieved stationarity"), there are approaches based on directly computing (or approximating) the principal left eigenvector of the transition matrix, i.e. its stationary distribution.
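
In the discrete case this is just linear algebra. A minimal R sketch on a made-up 3-state chain (the matrix is purely illustrative): the stationary distribution is the left eigenvector of P with eigenvalue 1, i.e. the principal right eigenvector of t(P).

```r
## Toy 3-state transition matrix (made up for illustration).
P <- matrix(c(0.5, 0.3, 0.2,
              0.1, 0.6, 0.3,
              0.2, 0.2, 0.6),
            nrow = 3, byrow = TRUE)

## The stationary distribution pi satisfies pi %*% P = pi, so it is the
## eigenvalue-1 right eigenvector of t(P), normalized to sum to 1.
e      <- eigen(t(P))
v      <- Re(e$vectors[, which.max(Re(e$values))])
pi_hat <- v / sum(v)
pi_hat
pi_hat %*% P   # check: equals pi_hat
```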

Of course, when the parameters live in a continuous space S, this "matrix" is indexed by S × S, i.e. it is really a transition kernel... so the "principal eigenvector" becomes a "principal eigenfunction". Functional analysts, how do you compute this?

If it helps, we might want to choose a sparse proposal (such as the one corresponding to Gibbs sampling, in which all transitions that change more than one parameter have probability density zero).
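
To make that sparsity concrete, here is a small R sketch on a made-up target over two binary variables (an assumption for illustration only): the random-scan Gibbs kernel assigns probability zero to any move that flips both coordinates at once, and power iteration on the resulting sparse kernel approximates the principal left eigenvector, recovering the target.

```r
## Made-up target over two binary variables, with dependence between them.
states <- as.matrix(expand.grid(x1 = 0:1, x2 = 0:1))  # rows: 00, 10, 01, 11
p <- c(0.4, 0.1, 0.1, 0.4)                            # target pi, in row order
p <- p / sum(p)

## Random-scan Gibbs kernel: pick a coordinate uniformly at random, then
## resample it from its full conditional. Any move that changes more than one
## coordinate gets probability zero, so the kernel is sparse.
n <- nrow(states)
P <- matrix(0, n, n)
for (a in 1:n) for (b in 1:n) {            # a = current state, b = next state
  for (i in 1:2) {
    j <- 3 - i                             # the coordinate held fixed when updating i
    if (states[a, j] == states[b, j]) {    # b agrees with a off coordinate i
      block   <- which(states[, j] == states[a, j])
      P[a, b] <- P[a, b] + 0.5 * p[b] / sum(p[block])
    }
  }
}
round(P, 3)   # zeros exactly where both coordinates would have to change

## Power iteration on the left: v <- v P until it stops moving.
v <- rep(1 / n, n)
for (k in 1:200) v <- as.vector(v %*% P)
round(v, 4)   # matches the target p
```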
