gist martinv13/6ebee3fc9c5f4808a7039ad6e39966f2
Created December 7, 2018 12:41
Rolling weighted linear model with exponential window computed in linear time.
# This gist demonstrates the use of a linear-time algorithm to compute an
# exponentially weighted rolling linear regression between two time series.

# benchmark rolling linear model using R's lm function
# output: 2 regression coefficients + R2, i.e. 3 values per row
# alpha: decay coefficient (each weight is (1 - alpha) times the next, more recent one)
# miny: minimum number of samples required to compute an output value
rlm <- function(y, x, alpha, miny = 10) {
  n <- length(y)
  res <- matrix(nrow = n, ncol = 3)
  for (i in miny:n) {
    w <- rep(1 - alpha, i)^((i - 1):0)  # newest sample gets weight 1
    m <- lm(y[1:i] ~ x[1:i], weights = w)
    s <- summary(m)
    res[i, ] <- c(s$coefficients[2, 1], s$coefficients[1, 1], s$r.squared)
  }
  res
}
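For readers who want to cross-check the math outside R, here is a NumPy sketch of the same O(n^2) reference fit (a hypothetical port, not part of the original gist); it recomputes the weighted sums explicitly at each step instead of calling `lm`:

```python
import numpy as np

def rlm_naive(y, x, alpha, miny=10):
    """O(n^2) reference: refit the exponentially weighted regression
    on the full history at every step; weights are (1-alpha)**age,
    so the newest sample gets weight 1."""
    n = len(y)
    res = np.full((n, 3), np.nan)  # columns: slope, intercept, R2
    for i in range(miny, n + 1):
        w = (1 - alpha) ** np.arange(i - 1, -1, -1)
        xs, ys = x[:i], y[:i]
        mx = np.average(xs, weights=w)
        my = np.average(ys, weights=w)
        b1 = np.sum(w * (xs - mx) * (ys - my)) / np.sum(w * (xs - mx) ** 2)
        b0 = my - b1 * mx
        ssr = np.sum(w * (b0 + b1 * xs - my) ** 2)   # explained sum of squares
        sse = np.sum(w * (ys - b0 - b1 * xs) ** 2)   # residual sum of squares
        res[i - 1] = (b1, b0, ssr / (ssr + sse))
    return res
```

On noiseless linear data the recovered slope and intercept match the generating line and R2 is 1, which is a quick sanity check before comparing against the fast version.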
# linear-time implementation of the rolling exponentially weighted linear model
# arguments and outputs: same as above
rlm2 <- function(y, x, alpha, miny = 10) {
  n <- length(y)
  res <- matrix(nrow = n, ncol = 3)
  sw <- 0    # running sum of weights
  swx <- 0   # running weighted sum of x
  swy <- 0   # running weighted sum of y
  swxy <- 0  # running weighted sum of x*y
  swx2 <- 0  # running weighted sum of x^2
  swy2 <- 0  # running weighted sum of y^2
  for (i in 1:n) {
    sw   <- 1 + sw * (1 - alpha)
    swx  <- x[i] + swx * (1 - alpha)
    swy  <- y[i] + swy * (1 - alpha)
    swxy <- x[i] * y[i] + swxy * (1 - alpha)
    swx2 <- x[i]^2 + swx2 * (1 - alpha)
    swy2 <- y[i]^2 + swy2 * (1 - alpha)
    if (i >= miny) {
      mx <- swx / sw
      my <- swy / sw
      # since swx = sw * mx and swy = sw * my, the cross terms cancel
      # (see the discussion below), leaving the usual weighted LS slope
      b1 <- (swxy - swx * my) / (swx2 - swx * mx)
      b0 <- my - b1 * mx
      ssr <- sw*b0^2 + swx2*b1^2 + sw*my^2 + 2*swx*b0*b1 - 2*sw*b0*my - 2*swx*b1*my
      sse <- sw*b0^2 + swx2*b1^2 + swy2 + 2*swx*b0*b1 - 2*swy*b0 - 2*swxy*b1
      r2 <- ssr / (ssr + sse)
      res[i, ] <- c(b1, b0, r2)
    }
  }
  res
}
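The same recurrence translated to NumPy, as a sketch (a hypothetical port, not from the gist), using the simplified slope formula discussed in the comment thread below. The key point is that each decayed sum is updated in O(1), so the whole pass is O(n):

```python
import numpy as np

def rlm_fast(y, x, alpha, miny=10):
    """O(n): carry exponentially decayed sums of 1, x, y, x*y, x^2, y^2
    and read the regression coefficients off them at each step."""
    n = len(y)
    res = np.full((n, 3), np.nan)  # columns: slope, intercept, R2
    sw = swx = swy = swxy = swx2 = swy2 = 0.0
    d = 1 - alpha  # per-step decay factor
    for i in range(n):
        sw   = 1.0         + sw   * d
        swx  = x[i]        + swx  * d
        swy  = y[i]        + swy  * d
        swxy = x[i] * y[i] + swxy * d
        swx2 = x[i] ** 2   + swx2 * d
        swy2 = y[i] ** 2   + swy2 * d
        if i + 1 >= miny:
            mx, my = swx / sw, swy / sw
            # weighted LS slope; cross terms cancel because swx = sw*mx
            b1 = (swxy - swx * my) / (swx2 - swx * mx)
            b0 = my - b1 * mx
            ssr = sw*b0**2 + swx2*b1**2 + sw*my**2 + 2*swx*b0*b1 - 2*sw*b0*my - 2*swx*b1*my
            sse = sw*b0**2 + swx2*b1**2 + swy2 + 2*swx*b0*b1 - 2*swy*b0 - 2*swxy*b1
            res[i] = (b1, b0, ssr / (ssr + sse))
    return res
```

A usage note: the final row should agree, up to floating-point error, with a direct weighted fit over the full history using weights (1-alpha)**age.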
# test equality of the output of the two functions
# on a random sample with linear correlation
n <- 1000
x <- runif(n) * 200
y <- x * .5 + rnorm(n, 30)
alpha <- 0.03
rm1 <- rlm(y, x, alpha)
rm2 <- rlm2(y, x, alpha)
comp <- abs(rm1 - rm2)
sum(comp > 1e-10, na.rm = TRUE)  # should be 0
Sorry for the late answer. I believe you are correct; this part can be simplified. I did the calculation myself based on the plain linear regression formulas, which can be found easily, including on Wikipedia.
sw*mx*my - swx*my = 0, since sw*mx*my = swx*my (because mx <- swx / sw). Is there a place I can read where this calculation was taken from?
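For reference, a sketch of the cancellation from the standard weighted least-squares formulas (the kind found in the Wikipedia articles on simple linear regression and weighted least squares), writing S_w, S_x, S_y, S_xy, S_x2 for the gist's running sums sw, swx, swy, swxy, swx2:

```latex
% Weighted least-squares slope, with \bar{x} = S_x / S_w, \ \bar{y} = S_y / S_w:
b_1 = \frac{\sum_i w_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i w_i (x_i - \bar{x})^2}
    = \frac{S_{xy} - S_x \bar{y} - S_y \bar{x} + S_w \bar{x}\bar{y}}
           {S_{x^2} - 2 S_x \bar{x} + S_w \bar{x}^2}
% Since S_x = S_w \bar{x} and S_y = S_w \bar{y}, we have
%   S_w \bar{x}\bar{y} = S_x \bar{y} = S_y \bar{x},
% so the numerator collapses to S_{xy} - S_x \bar{y}
% and the denominator to S_{x^2} - S_x \bar{x}:
b_1 = \frac{S_{xy} - S_x \bar{y}}{S_{x^2} - S_x \bar{x}},
\qquad b_0 = \bar{y} - b_1 \bar{x}.
```

This is exactly the simplification pointed out in the comment above: the three cross terms in the code's original numerator are all equal, so two of them cancel.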