f******y posts: 2971 | 1 Suppose we have two random variables, X and Y, whose means are both very small.
I can get the slope by linear regression lm(Y~X);
I can also do PCA,
data = data.frame(X=X, Y=Y);
princomp(data);
I expected the slope of the first PC vector to be very close to the slope
given by linear regression, but when I tried it in R the results were very different.
Can anyone explain? | a***d posts: 336 | 2 What do you mean by the slope of the first PC vector?
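The discrepancy the OP reports is easy to reproduce outside R. The thread uses R (`lm`, `princomp`), but the linear algebra is the same in any language; here is a minimal sketch in Python/NumPy with made-up simulated data (the slope 0.5 and noise level are illustrative assumptions, not from the thread). OLS minimizes vertical errors, so its slope is cov(x, y)/var(x); the first PC is the top eigenvector of the covariance matrix, which minimizes orthogonal errors, so the two slopes generally differ:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(0, 1, n)
y = 0.5 * x + rng.normal(0, 1, n)  # hypothetical data: true slope 0.5 plus noise

# OLS slope (what lm(Y~X) estimates): cov(x, y) / var(x)
ols_slope = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# First principal component: eigenvector of the 2x2 covariance matrix
# with the largest eigenvalue; its "slope" is the ratio of its components.
cov = np.cov(x, y)
vals, vecs = np.linalg.eigh(cov)
pc1 = vecs[:, np.argmax(vals)]
pca_slope = pc1[1] / pc1[0]

print(ols_slope, pca_slope)  # the two slopes differ substantially
```

With this setup the OLS slope comes out near 0.5 while the PC1 slope is much steeper, because the noise in y inflates var(y) and tilts the dominant variance direction.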
| j******n posts: 2206 | | t**c posts: 539 | 4 May I ask, what is the relationship between PCA and regression? | z*****n posts: 413 | 5 I don't think there is a specific relationship between PCA and regression.
But the covariates of a regression can be replaced by PCs if the X matrix has
strong collinearity, and PCA is a good way to reduce the number of covariates.
I doubt the meaning of what the OP did. For y = mu + b * x, b can be treated
as a scaling of the x vector in n-space. The first component of (x, y) should
be the vector x + y, and this is nonsense if you haven't standardized your data
before PCA. |
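z*****n's standardization point can be checked directly: for standardized, positively correlated variables the covariance matrix becomes the correlation matrix [[1, r], [r, 1]], whose top eigenvector is proportional to (1, 1), i.e. the direction x + y, with slope exactly 1 regardless of r. A hedged sketch in Python/NumPy (the simulated scales and coefficients are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(0, 3, n)                  # deliberately larger scale than y
y = 0.2 * x + rng.normal(0, 0.5, n)      # positively correlated with x

# Standardize each variable to zero mean and unit variance
xs = (x - x.mean()) / x.std(ddof=1)
ys = (y - y.mean()) / y.std(ddof=1)

cov = np.cov(xs, ys)                     # now (numerically) the correlation matrix
vals, vecs = np.linalg.eigh(cov)
pc1 = vecs[:, np.argmax(vals)]

# For standardized, positively correlated data, PC1 is proportional to
# (1, 1), so its slope is 1 -- while the OLS slope would equal r != 1.
print(pc1[1] / pc1[0])
```

So even after standardization the PC1 slope (1) and the OLS slope (r) agree only when the correlation is perfect, which is the heart of the OP's confusion.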