Description

The current parameterization of the gamma distribution results in correlated parameters that bias estimates when doing gamma regression. An alternative parameterization was discussed on the forums at https://discourse.mc-stan.org/t/posterior-estimates-of-rate-and-shape-of-gamma-distribution-are-dependent/3220/14?u=spinkney.

I decided to look at three of the parameterizations using this problem from the forums: https://discourse.mc-stan.org/t/gamma-regression-in-stan-vs-frequentist-approach/16274/11.

Frequentist fit for comparison given in the forums
All models were run on a Mac M1 Pro Max using cmdstanr with 4 parallel chains and everything else left at the defaults.
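As a rough sketch of the cmdstanr workflow (the simulated data, the data-list names, and the file name model1.stan are placeholders of mine, not the actual forum data or files):

library(cmdstanr)

# simulated stand-in for the forum data set (the real data are in the linked thread)
set.seed(1)
N <- 100
X <- cbind(1, rnorm(N))                            # intercept plus one covariate
mu <- as.vector(exp(X %*% c(0.5, 0.3)))
y <- rgamma(N, shape = mu^2 / 2, rate = mu / 2)    # gamma outcome with mean mu

stan_data <- list(N = N, K = ncol(X), y = y, X = X)

mod <- cmdstan_model("model1.stan")   # same call for the other two programs
fit <- mod$sample(
  data = stan_data,
  chains = 4,
  parallel_chains = 4                 # everything else left at the defaults
)
fit$summary("betas")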
Model 1: Current Stan parameterization
Notes: Lots of warnings during warmup about "Shape parameter[2] is inf".
Total execution time: 1.0 seconds
data {
  int N;           // the number of observations
  int K;           // the number of columns in the model matrix
  vector[N] y;     // the response
  matrix[N, K] X;  // the model matrix
}
parameters {
  vector[K] betas;    // the regression parameters
  real<lower=0> phi;  // the variance parameter
}
transformed parameters {
  vector[N] mu = exp(X * betas);     // the expected values (exp of the linear predictor)
  vector[N] alpha = mu .* mu / phi;  // shape parameter for the gamma distribution
  vector[N] beta = mu / phi;         // rate parameter for the gamma distribution
}
model {
  betas ~ normal(0, 4);
  y ~ gamma(alpha, beta);
}
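For reference, the standard gamma identities give the moments implied by this shape/rate choice:

E[y_i] = \frac{\alpha_i}{\beta_i} = \mu_i, \qquad \operatorname{Var}[y_i] = \frac{\alpha_i}{\beta_i^2} = \phi

so the variance is constant across observations, and the shape \alpha_i = \mu_i^2 / \phi grows quadratically in the mean. Whenever \mu_i = \exp(x_i' \beta) overflows during warmup the shape becomes infinite, which is likely the source of the "Shape parameter[2] is inf" warnings.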
Model 2: Max Mantei's suggested parameterization
Notes: Lots of warnings during warmup about "shape parameter is 0, but must be positive finite".
Total execution time: 0.6 seconds
data {
  int N;           // the number of observations
  int K;           // the number of columns in the model matrix
  vector[N] y;     // the response
  matrix[N, K] X;  // the model matrix
}
parameters {
  vector[K] betas;       // the regression parameters
  real<lower=0> invphi;  // the inverse dispersion (shape) parameter
}
transformed parameters {
  vector[N] mu = exp(X * betas);  // the expected values (exp of the linear predictor)
  vector[N] beta = invphi / mu;   // rate parameter for the gamma distribution
}
model {
  betas ~ normal(0, 4);  // priors for the regression coefficients following Gelman 2008
  invphi ~ exponential(1);
  y ~ gamma(invphi, beta);
}
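Again using the standard identities, with shape invphi and rate \beta_i = \texttt{invphi} / \mu_i the implied moments are

E[y_i] = \mu_i, \qquad \operatorname{Var}[y_i] = \frac{\mu_i^2}{\texttt{invphi}}

so the coefficient of variation is constant at 1/\sqrt{\texttt{invphi}} and, unlike Model 1, the shape argument no longer depends on \mu_i. The remaining warmup warnings presumably occur when invphi underflows to 0.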
Model 3: JohnnyZoom4H's suggested parameterization
Notes: No warnings!
Total execution time: 0.4 seconds
functions {
  // log density of x ~ gamma(tau, tau / mu), i.e. shape tau and rate tau / mu,
  // summed over all observations
  real johnnys_gamma_lpdf(vector x, real tau, vector mu) {
    int N = num_elements(x);
    return (tau - 1) * sum(log(x)) + tau * sum(log(tau) - log(mu))
           - N * lgamma(tau) - sum(x * tau ./ mu);
  }
}
data {
  int N;           // the number of observations
  int K;           // the number of columns in the model matrix
  vector[N] y;     // the response
  matrix[N, K] X;  // the model matrix
}
parameters {
  vector[K] betas;    // the regression parameters
  real<lower=0> tau;  // the shape (inverse dispersion) parameter
}
transformed parameters {
  vector[N] mu = exp(X * betas);  // the expected values (exp of the linear predictor)
}
model {
  betas ~ normal(0, 4);  // priors for the regression coefficients following Gelman 2008
  tau ~ exponential(1);
  y ~ johnnys_gamma(tau, mu);
}
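As a quick check, johnnys_gamma_lpdf is just the log density of y_i \sim \text{Gamma}(\tau, \tau / \mu_i) summed over observations, i.e. the same likelihood as Model 2 written without building the length-N rate vector:

\log p(x \mid \tau, \mu) = \sum_{i=1}^{N} \left[ (\tau - 1)\log x_i + \tau \log\frac{\tau}{\mu_i} - \log\Gamma(\tau) - \frac{\tau x_i}{\mu_i} \right]
                         = (\tau - 1)\sum_i \log x_i + \tau \sum_i \left(\log\tau - \log\mu_i\right) - N \log\Gamma(\tau) - \tau \sum_i \frac{x_i}{\mu_i},

which matches the return statement term by term.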
Summary
Clearly, parameterizations 2 and 3 are superior for this regression. I believe 3 is the best, and implementing it in C++ with analytic derivatives may increase efficiency further.