Comments on Scipy Stats Project: Sparse Least Squares Solver

jseabold (2010-03-14 16:04):

Sure. The form comes up from time to time while I'm working on some estimators, where the right-hand side is block diagonal and the left-hand side can be a long column vector. To be a little more concrete, consider the following toy example.

import scikits.statsmodels as sm
from scipy.linalg import block_diag
import numpy as np

data = sm.datasets.longley.Load()

# Fair warning: the design of the datasets objects may change soon!

exog = sm.add_constant(data.exog, prepend=True)
endog = data.endog

# Now we can do

X = block_diag(*[exog]*3)
Y = np.tile(endog, 3)  # stack endog three times so Y matches X's row count

results = sm.OLS(Y, X).fit()
results.params.reshape(3, -1)

In this case, it's really just a way to do many OLS regressions at the same time. I was doing some Monte Carlo experiments on the model Y = XB, where Y was a 98*1000 x 1 vector and X was a big block-diagonal matrix, and instead of looping and doing OLS, I just did one big sparse least squares. Building the sparse block diagonal was kind of slow, though, and I wonder whether it couldn't be optimized.

Stefan van der Walt (2010-03-14 15:35):

Hi, Skipper!
Could you explain what you mean by "whenever the RHS of a least squares can be written in block diagonal form"?
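The "one big sparse least squares" that Skipper describes can be sketched with scipy.sparse.block_diag and the sparse solver lsqr. This is an illustrative reconstruction, not the author's actual Monte Carlo code; the block sizes here are small stand-ins, and the data is synthetic.

```python
# Illustrative sketch (assumed, not the author's code): solve many OLS
# regressions at once by stacking them into one sparse block-diagonal system.
import numpy as np
from scipy.sparse import block_diag, csr_matrix
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
n_obs, n_params, n_blocks = 98, 7, 5   # small stand-in for the 98 x 1000 run

# One common design matrix, repeated down the block diagonal.
exog = rng.standard_normal((n_obs, n_params))
true_beta = rng.standard_normal((n_blocks, n_params))

# Stack the per-block responses into one long column vector.
Y = np.concatenate([exog @ b for b in true_beta])

# Sparse block-diagonal design: (n_blocks*n_obs) x (n_blocks*n_params).
X = block_diag([csr_matrix(exog)] * n_blocks, format="csr")

# One sparse least-squares solve replaces n_blocks separate OLS fits.
beta_hat = lsqr(X, Y)[0].reshape(n_blocks, n_params)
```

Each row of beta_hat is the coefficient vector of one of the stacked regressions, matching what results.params.reshape(3, -1) gives in the dense toy example above.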