Setting up a hierarchical model using R2jags

I'm working on a project for an introductory Bayesian analysis course, and I'm also fairly new to using R regularly. We are supposed to build a hierarchical model using a data set we found or put together. I put together a data set to analyze how much a nation's social freedoms and its wealth affect its happiness. The data set looks like this:

      Country Country.ID Year Happiness.Score   GDP.PPP Status
1 Afghanistan          1 2015           3.575  1766.593      1
2 Afghanistan          1 2016           3.360  1757.023      1
3 Afghanistan          1 2017           3.794  1758.466      1
4     Albania          2 2015           4.959 10971.044      2
5     Albania          2 2016           4.655 11356.717      2
6     Albania          2 2017           4.644 11803.282      2

Status is taken from Freedom House's rating, which I converted to an integer: 1 means not free, 2 means partly free, and 3 means a free society. For my first model, I don't want to include the GDP data.
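
To be concrete, the recoding just maps the three Freedom House labels onto 1, 2, and 3. A minimal illustration is below; the object names and label strings are placeholders rather than my actual column names:

# Illustrative sketch of the Status recoding (placeholder names, not my real columns)
fh.levels <- c("Not Free", "Partly Free", "Free")
raw.status <- c("Not Free", "Partly Free", "Partly Free", "Free")  # toy input
Status.int <- match(raw.status, fh.levels)  # returns 1 2 2 3
Status.int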

I put together code that looked the same as an example we saw in class, but I've run into trouble, and I'm not sure how to fix it.

Here is how I set everything up, from making the lists to specifying the model and priors and running the jags command:

#Making lists and assigning loop numbers
Num.Obs <- 460
Country <- impute.set1[,2]           # Country.ID
Year <- impute.set1[,3]              # Year
Happiness.Score <- impute.set1[,4]   # Happiness.Score
GDP.PPP <- impute.set1[,5]           # GDP.PPP (not used in this first model)
Status <- unique(impute.set1[,6])    # unique Status codes
Num.Status <- 3

#Modeling
model1 <- function(){
  #Data Model
  for (i in 1:Num.Obs){
    #Observations at Country Level
    Happiness.Score[i] ~ dnorm(mu[i], tau.Happiness.Score)
    #Random intercept and slope (alpha is the intercept, beta1 the slope on Year, both indexed by country)
    mu[i] <- alpha[Country[i]] + beta1[Country[i]]*Year[i]
  }
  #Priors
  for (j in 1:Num.Status){
    alpha[j] ~ dnorm(mu.alpha, tau.alpha)
    beta1[j] ~ dnorm(mu.beta1[Status[j]], tau.beta1)
  }
  mu.alpha ~ dnorm(0, 0.01)
  tau.alpha ~ dnorm(0, 0.01)
  sigma.alpha ~ dunif(0, 10)

  mu.beta1[1] ~ dnorm(0, 0.01)
  mu.beta1[2] ~ dnorm(0, 0.01)
  mu.beta1[3] ~ dnorm(0, 0.01)
  tau.beta1 ~ dnorm(0, 0.01)
  sigma.beta1 ~ dunif(0, 10)

  tau.Happiness.Score ~ dnorm(0, 0.01)
  sigma.Happiness.Score ~ dnorm(0, 0.01)
}

#Listing my lists and parameters I want to save for final use
MyVars1data <- list(Country=Country, Year=Year, Status=Status, Happiness.Score=Happiness.Score,
                    Num.Obs=Num.Obs, Num.Status=Num.Status)
params <- c("mu.alpha", "mu.beta1[1]", "mu.beta1[2]", "mu.beta1[3]", "sigma.Happiness.Score")

#Implement Jags
m1 <- jags(MyVars1data,
           model.file = model1,
           parameters.to.save = params,
           n.chains = 3,
           inits = NULL,
           n.iter = 10000,
           n.burnin = 5000,
           progress.bar = "text")

I end up with an error that looks like this:

Error in jags.model(model.file, data = data, inits = init.values, n.chains = n.chains,  : 
  RUNTIME ERROR:
Compilation error on line 5.
Index out of range taking subset of  alpha
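
Since the error mentions line 5 of the compiled model, one thing I can do is write the model function out to a text file and look at which statement that corresponds to. If I understand correctly, R2jags does this internally with R2WinBUGS::write.model, but that detail is an assumption on my part; a manual sketch:

# Write the model function to a file so I can see what "line 5" refers to
library(R2WinBUGS)  # should come in as a dependency of R2jags; provides write.model()
model.path <- file.path(tempdir(), "model1.txt")
write.model(model1, model.path)
cat(readLines(model.path), sep = "\n")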

I have 460 observations in my data set. Looking through the loops I made, I think the index should run from 1 to 460. I believe the error is on this line: mu[i] <- alpha[Country[i]] + beta1[Country[i]]*Year[i]. Am I looking in the wrong place?
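
To check my own reasoning, here is a quick sanity check I can run on the objects passed to jags, comparing the lengths and index ranges against the loop bounds in the model. The comments describe what the model expects, not values I have already confirmed:

# Compare the index variables with the loop bounds used in the model
length(Happiness.Score)   # should equal Num.Obs (460)
range(Country)            # the maximum is the largest index used for alpha[] and beta1[]
length(Status)            # after unique(), this has one entry per status level
Num.Status                # alpha[] and beta1[] only receive priors for indices 1:Num.Status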

Can anyone help me to debug this? Any help would be greatly appreciated.
