Linear model with a decreasing-amplitude sine wave in R


I am attempting to create a linear model that incorporates a sine wave with decreasing amplitude over time.

I am able to successfully incorporate a simple sine wave into my model using the following code:

library(ggplot2)

#sine and cosine terms for a 12-month seasonal cycle (stan$num is the month index)
xc <- cos(2*pi*stan$num/12)
xs <- sin(2*pi*stan$num/12)

#other variables 

Time_bb <- as.Date(stan$C_Month_Year, format = "%m/%d/%y")
Count1_bb <- stan$stan_calls

#fit the complete model, then extract the fitted values
cos.m_bb <- lm(Count1_bb ~ Time_bb + xc + xs)
cos.f_bb <- fitted(cos.m_bb)

stan$fit_m <- cos.f_bb

p5 <- ggplot(stan, aes(x = Time_bb, y = stan_calls)) +
  geom_line(aes(y = fit_m), color = 'red') +
  geom_smooth(method = "lm", se = FALSE) +
  geom_point(size = 1)
p5
  

This code results in the following graph:

[plot: linear model incorporating a regular sine wave]

However, I believe I would get a better model fit if I could incorporate an amplitude that decreases over time. I'm at a loss for how to code this properly and, more importantly, how to control the magnitude of the decrease in amplitude.
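
For what it's worth, the kind of structure I have in mind is a damped seasonal term, along the lines of the sketch below (my own rough attempt, assuming stan$num is the month index used above and lambda is a decay rate I would have to pick by hand):

# rough sketch: multiply the seasonal terms by an exponential envelope
# exp(-lambda * num); a larger lambda makes the amplitude shrink faster
lambda <- 0.02   # assumed decay rate, chosen by hand

xc_d <- exp(-lambda * stan$num) * cos(2*pi*stan$num/12)
xs_d <- exp(-lambda * stan$num) * sin(2*pi*stan$num/12)

cos.m_damped <- lm(Count1_bb ~ Time_bb + xc_d + xs_d)

But hard-coding lambda like this still leaves me guessing at its value, which is part of what I'm asking below.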

I attempted some of the suggestions detailed in Drawing sine wave with increasing Amplitude and frequency over time, using the following code:

#sine curve equation resolving 12 months 

f   <- 12            # samples per cycle of the carrier
f_c <- 1             # carrier frequency (cycles per unit of t)
T   <- 1/f           # sample spacing (note: this masks R's built-in T shorthand for TRUE)
t   <- seq(0, 12, T) # 145 time points covering 12 carrier cycles
A   <- t             # envelope: amplitude grows linearly with t

carrier <- cos(2*pi*f_c*t)   # base cosine wave
out <- A*carrier             # apply the linearly growing envelope

out <- rev(out)              # reverse so the amplitude decreases over time instead
out <- out[1:65]             # keep one value per row of stan (65 monthly observations)

#other variables 
Time_bb <- as.Date(stan$C_Month_Year, format = "%m/%d/%y")
Count1_bb <- stan$stan_calls

#complete model 
library(broom)
cos.m_bb <- lm(Count1_bb ~ Time_bb + out)
tidy(cos.m_bb, conf.int = TRUE)

#fitted complete model 
cos.f_bb <- fitted(cos.m_bb)

#simpler model on the damped wave alone, plus its residuals
fit.lm_bb <- lm(Count1_bb ~ out)
stan$fit_bb <- fitted(fit.lm_bb)
stan$fit.res_bb <- resid(fit.lm_bb)

stan$fit_m <- cos.f_bb

p5 <- ggplot(stan, aes(x = Time_bb, y = stan_calls)) +
  geom_line(aes(y = fit_m), color = 'red') +
  geom_point()
p5

I was able to create the following graph:

[plot: linear model with decreasing amplitude]

However, I don't understand how to properly manipulate f, T, f_c, t, and A to obtain a better model fit. Any help would be greatly appreciated!
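
One idea I had (I'm not sure it's the right approach) is to skip hand-tuning those constants and instead estimate the decay rate directly with nls(); a rough sketch, assuming stan$num is the month index used earlier and that the starting values are in the right ballpark:

# hypothetical: fit trend + damped cosine in one step, estimating the
# amplitude a, decay rate k, and phase phi from the data
damped.m <- nls(stan_calls ~ b0 + b1*num + a*exp(-k*num)*cos(2*pi*num/12 + phi),
                data = stan,
                start = list(b0 = 0.1, b1 = 0, a = 0.05, k = 0.05, phi = 0))
summary(damped.m)

(nls() can be touchy about starting values, so those would likely need tweaking.)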

The head of the data frame I am using looks like this:

head(stan)

 C_Month_Year stan_calls
1       7/1/14  0.1154295
2       8/1/14  0.2049913
3       9/1/14  0.1786142
4      10/1/14  0.1453100
5      11/1/14  0.1238671
6      12/1/14  0.1289842
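
(stan$num, used in the sine/cosine terms above, isn't shown in this head; I'm treating it as a sequential month counter. If it had to be rebuilt, something like this would do it, assuming the rows are consecutive months with no gaps:)

# assumption: num is a 1, 2, 3, ... index over consecutive months
stan$num <- seq_len(nrow(stan))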

