Strange


Total Posts: 1434 
Joined: Jun 2004 


Trying to save a few hours of my time here. By any chance, does anyone have a piece of python or C code for exponential least squares regression? I'll owe you beers and dinners.
I don't interest myself in 'why?'. I think more often in terms of 'when?'... sometimes 'where?'. And always 'how much?'


kloc


Total Posts: 14 
Joined: May 2017 


For python, scipy.optimize.least_squares(...) should be straightforward to use:
https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.least_squares.html
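For instance, fitting y = a * exp(b * x) with it might look like this (the toy data, starting values and names here are made up for illustration, not from scipy's docs):

```python
import numpy as np
from scipy.optimize import least_squares

# Toy data from y = 2 * exp(0.5 * x) plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 50)
y = 2.0 * np.exp(0.5 * x) + rng.normal(scale=0.1, size=x.size)

def residuals(params):
    # Residual vector for the model a * exp(b * x); least_squares minimizes
    # the sum of its squares.
    a, b = params
    return a * np.exp(b * x) - y

fit = least_squares(residuals, x0=[1.0, 0.1])
a_hat, b_hat = fit.x
```

The recovered `fit.x` should land close to the true (2.0, 0.5).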



Rashomon


Total Posts: 182 
Joined: Mar 2011 


Strange, looks like you just take logs of your data first and then run an lm.
http://mathworld.wolfram.com/LeastSquaresFittingExponential.html
http://www.math.usm.edu/lambers/mat419/lecture13.pdf
If you have functions for matrix transpose and matrix inversion then you can construct a hat matrix by formula rather than calling a lib.
Remember that the least squares formula is an algebraic way of getting the geometric projection, so pre-transforming the data by logs makes sense in that light.
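In python terms, the log trick comes down to something like this (noiseless toy data, names are mine; note it needs y > 0):

```python
import numpy as np

# If y = a * exp(b * x), then log(y) = log(a) + b * x, so a plain
# degree-1 polynomial fit of log(y) against x recovers b and log(a).
x = np.linspace(0.0, 4.0, 50)
y = 3.0 * np.exp(0.7 * x)

b_hat, log_a = np.polyfit(x, np.log(y), 1)  # returns [slope, intercept]
a_hat = np.exp(log_a)
```

One caveat worth knowing: the log transform changes the error structure, so this minimizes squared error in log-space, not in the original units.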



Strange


Total Posts: 1434 
Joined: Jun 2004 


Sorry, miswrote (love autocorrect). I am looking for rolling exponentially weighted least squares. I.e. it's a regular OLS, but with the data being exponentially weighted along the time axis as it rolls along.
Since I want it in the form of y = a + b * x, it's simply b = ewcov(x,y)/ewvar(x), a = ewma(y) - ewma(x) * b, etc. I am hoping that someone has a piece of code that does it in a recursive form so it would be quick. 
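A minimal recursive sketch of those formulas (the decay factor, the de-biasing detail and all names are my own choices, not a vetted implementation):

```python
import numpy as np

def ew_regression(x, y, lam=0.94):
    """Rolling exponentially weighted OLS of y = a + b*x.

    One O(1) update per observation: keep EWMAs of x, y, x*x and x*y,
    then b = ewcov(x, y) / ewvar(x) and a = ewma(y) - ewma(x) * b.
    """
    mx = my = mxx = mxy = 0.0
    w = 0.0  # running weight sum, used to de-bias the early estimates
    a_out, b_out = [], []
    for xi, yi in zip(x, y):
        w = lam * w + (1.0 - lam)
        mx = lam * mx + (1.0 - lam) * xi
        my = lam * my + (1.0 - lam) * yi
        mxx = lam * mxx + (1.0 - lam) * xi * xi
        mxy = lam * mxy + (1.0 - lam) * xi * yi
        # De-biased moments, then slope/intercept from the EW cov and var.
        ex, ey, exx, exy = mx / w, my / w, mxx / w, mxy / w
        var = exx - ex * ex
        b = (exy - ex * ey) / var if var > 1e-12 else 0.0
        a = ey - b * ex
        a_out.append(a)
        b_out.append(b)
    return np.array(a_out), np.array(b_out)
```

On an exact line y = 1 + 2x the recursion settles at a = 1, b = 2 once there is any variance in x.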
I don't interest myself in 'why?'. I think more often in terms of 'when?'... sometimes 'where?'. And always 'how much?'

jr


Total Posts: 1 
Joined: Apr 2017 


Sklearn's LinearRegression lets you fit with user-specified sample weights. It's batch-only, though. http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LinearRegression.html#sklearn.linear_model.LinearRegression 
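A sketch of that with made-up toy data (the weight vector is the only EW-specific part; the decay factor is arbitrary):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Exponential weights: the newest observation gets weight 1, and each
# step back in time decays the weight by lam.
lam = 0.94
x = np.arange(50, dtype=float)
y = 1.0 + 2.0 * x  # exact toy line
w = lam ** np.arange(len(x) - 1, -1, -1)  # oldest point gets lam**49

model = LinearRegression().fit(x[:, None], y, sample_weight=w)
```

Since this refits the whole window each time, it is O(n) per step where Strange's recursive version would be O(1).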



Rashomon


Total Posts: 182 
Joined: Mar 2011 


Rolling, in R:
data <- data.frame(x1 = data$x1, x2 = data$x2, t = data$t)
data$x1 / 2 ^ data$t -> data$x1
data$x2 / 2 ^ data$t -> data$x2
lm(x1 ~ x2, data = data)
Adjust the 2 to something else, or replace the -data$t with end(data$t) - data$t or some other howFarBack(data$t) function.
Searching around a bit I found R's dlm:
https://stats.stackexchange.com/questions/9931/exponentially-weighted-moving-linear-regression
https://stat.ethz.ch/pipermail/r-sig-finance/2012q1/009313.html
They are calling it a "poor man's Kalman filter" (good name!).
The source to dlm ("dynamic linear model") has more than I wrote above.
https://www.rdocumentation.org/packages/dlm/versions/1.1-5/topics/dlmFilter
https://www.rdocumentation.org/packages/dlm/versions/1.1-5/source
Sorry Strange, I don't speak python, but hopefully you can read my code (including the right-assignment ->).
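For the python-only readers, a rough translation of the halving-weight idea in the R snippet above (all names and the toy data are made up; the weights are the only substantive part):

```python
import numpy as np

# Down-weight each point by 2**(how far back in time it is), then do a
# plain weighted least-squares fit of x1 = intercept + slope * x2.
t = np.arange(40, dtype=float)   # time index, newest observation last
x2 = np.sin(t / 3.0)             # toy regressor
x1 = 0.5 + 1.5 * x2              # toy response, exact relation
w = 2.0 ** (t - t[-1])           # weight 1 now, halved per step back

# Weighted OLS: scale the rows of [1, x2] and x1 by sqrt(w), then solve
# the ordinary least-squares problem on the scaled system.
X = np.column_stack([np.ones_like(x2), x2])
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(X * sw[:, None], x1 * sw, rcond=None)
intercept, slope = coef
```

Halving per step is very aggressive; swapping `2.0` for something like `1.05` gives a much longer effective memory, same as adjusting the 2 in the R version.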

