This is a rough translation of my original post. To reach a bilingual audience, I actually used Google Translate, hehehe!
I am surprised by the ability and flexibility of Stan to express complex statistical models. As you may know, Stan is a declarative probabilistic programming language: statistical models are specified with a simple syntax, in a way very similar to how we would write them on a blackboard or on paper. Stan is built on Bayesian principles and uses several methods to sample from posterior distributions, among the best known being Hamiltonian Monte Carlo, the No-U-Turn Sampler (NUTS), and Metropolis-Hastings (I am not going to go into detail on these now; there are lots of references about them on the internet).
In this post we will see how to define a hierarchical logistic model. Suppose that each binary observation \(y_i \in \{0,1\}\) has an associated level \(j_i \in \{1,2,\dots,J\}\) and a vector of covariates (predictors) \(x_i \in \mathbb{R}^K\). We will therefore assume that the observations are realizations of independent Bernoulli processes, conditioned on a normal hierarchical process. That is:
\([Y | S] \sim Bernoulli(S)\)
Now, \(S\) can be described as follows:
\(S_{i,j} = \beta^t_j x_i + \epsilon\)
It reads as follows: the variable \(S\) for observation \(i\) at level \(j\) is equal to \(\beta^t_j x_i + \epsilon\). That is, \(\beta_j\) is the vector of coefficients associated with the response \(S_i\) at level \(j\).
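To make the generative structure concrete, here is a minimal sketch of how data consistent with this model could be simulated in Python. Everything in it (the sizes N, J, K, the variable names X and data, and the column names codes and obs) is my own assumption, chosen only so the objects line up with the PyStan call further down.

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
N, J, K = 500, 2, 3                            # hypothetical sizes, for illustration only

mu = rng.normal(0, 1, size=K)                  # shared means of the coefficients
sigma = 0.5                                    # common scale of the hierarchy
beta = rng.normal(mu, sigma, size=(J, K))      # one coefficient vector per level: beta_jk ~ Normal(mu_k, sigma)

X = pd.DataFrame(rng.normal(size=(N, K)))      # design matrix
levels = rng.integers(1, J + 1, size=N)        # level j_i of each observation (1-based, as Stan expects)

eta = np.einsum('nk,nk->n', X.values, beta[levels - 1])   # linear predictor x_i' beta_{j_i}
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))           # Bernoulli draws through the inverse-logit link

data = pd.DataFrame({'codes': levels, 'obs': y})          # matches data.codes / data.obs used below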
In Stan this is expressed as:
hierarquical_log_model = u"""
data {
  int<lower=0> N;                   // number of observations
  int<lower=0> J;                   // number of levels
  int<lower=1> K;                   // number of covariates
  matrix[N, K] x;                   // design matrix
  int<lower=1,upper=J> level[N];    // level of each observation
  int<lower=0,upper=1> y[N];        // binary observations
}
parameters {
  real mu[K];                       // shared means of the coefficients
  vector[K] beta[J];                // coefficient vector for each level
  real<lower=0,upper=100> sigma;    // common scale for all coefficients
}
model {
  for (k in 1:K) {
    mu[k] ~ normal(0, 100);
    for (j in 1:J)
      beta[j,k] ~ normal(mu[k], sigma);
  }
  for (i in 1:N)
    y[i] ~ bernoulli_logit(x[i] * beta[level[i]]);
}
"""
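Note that bernoulli_logit applies the inverse-logit link internally, so the sampling statement for y[i] in the model block is equivalent to:

\(y_i \sim \mathrm{Bernoulli}\big(\mathrm{logit}^{-1}(x_i \, \beta_{j_i})\big)\)

which is how the linear predictor \(S\) defined above is connected to the binary observations.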
To run the model in Python using PyStan:
import pystan

varying_intercept_model = pystan.StanModel(model_code=hierarquical_log_model)

multilevel_linear_data = {'N': X.shape[0],
                          'J': 2,
                          'K': X.shape[1],
                          'level': data.codes.values,   # level of each observation
                          'x': X.values,
                          'y': data.obs.values}

%time fit = varying_intercept_model.sampling(data=multilevel_linear_data, iter=1000, chains=1)
sample = fit.extract(permuted=True)
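Once sampling finishes, the draws returned by extract are NumPy arrays keyed by parameter name, with the first dimension indexing the draws. A minimal sketch of how the fit could be inspected (the shapes follow from the declarations in the Stan program above):

print(fit)                                # summary table: posterior means, sd, effective sample size, R-hat

beta_hat = sample['beta'].mean(axis=0)    # shape (J, K): posterior mean coefficients per level
mu_hat = sample['mu'].mean(axis=0)        # shape (K,): posterior mean of the shared means
print(beta_hat)
print(mu_hat)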
Resources:
Originally taken from: https://mc-stan.org/docs/2_18/stan-users-guide/basic-motivation.html
Very good tutorial: https://mc-stan.org/users/documentation/case-studies/radon.html