J*****n Posts: 4859 | 1 This confuses me.
I have two series, x and y.
When I run y = ax + b, I got:
Call:
lm(formula = y ~ x)
Residuals:
Min 1Q Median 3Q Max
-38161 -5115 -19 4550 40688
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 1.300e+03 1.221e+03 1.065 0.29
x 1.058e+00 4.398e-02 24.057 <2e-16 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 11210 on 84 degrees of freedom
Multiple R-squared: 0.8733, Adjusted R-squared: 0.8717
F-statistic: 578.7 on 1 and 84 DF, p-value: < 2.2e-16
The slope is 1.058 with standard error 0.044, and the intercept is not significant. I think in
this case I can say x is an unbiased estimator of y.
Then I ran the regression the other way, x = a'y + b', and got:
Call:
lm(formula = x ~ y)
Residuals:
Min 1Q Median 3Q Max
-35130 -5516 -1025 5854 30005
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) -1.573e+03 1.072e+03 -1.467 0.146
y 8.254e-01 3.431e-02 24.057 <2e-16 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 9903 on 84 degrees of freedom
Multiple R-squared: 0.8733, Adjusted R-squared: 0.8717
F-statistic: 578.7 on 1 and 84 DF, p-value: < 2.2e-16
Here, I noticed that the slope is 0.825 and its standard error is 0.034. Even allowing 2
sigma, the slope is quite different from 1. In this case, we can't say y is
an unbiased estimator of x.
These two conclusions seem to contradict each other. Am I missing something?
Thank you. | n*****n Posts: 3123 | 2 I don't know what you mean by "unbiased estimator" here.
I think they are consistent. x and y are significantly associated according
to either model. | J*****n Posts: 4859 | 3
x is the predicted output of some model built to predict y.
To my mind, y should be y = 1 * x + 0 + some error term. (*)
and also x = 1 * y + 0 + some error term. (**)
My regressions above showed that
y = 1.058 * x + some error term, and this 1.058's standard error is 0.044. So I see that (*)
holds.
However, the second regression showed that
x = 0.825 * y + 0 + some error term, and this 0.825's standard error is 0.034, thus (**)
fails.
I am confused by this fact.
Thank you for your reply.
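For what it's worth, the two printouts are internally consistent: for simple least squares the two slopes always multiply to R-squared, and 1.058 * 0.8254 ≈ 0.873 is exactly the reported R-squared, so the reverse slope can equal 1/1.058 only when R-squared is 1. A minimal sketch of this identity in Python with synthetic data (the original series aren't posted, so the numbers here are made up):

```python
import numpy as np

# Synthetic stand-in for the two series (the real x and y aren't posted).
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = x + rng.normal(scale=0.4, size=200)  # true relation y = 1*x + 0 + noise

# Simple-regression slopes in both directions.
cov_xy = np.cov(x, y)[0, 1]
b_yx = cov_xy / np.var(x, ddof=1)   # slope from lm(y ~ x)
b_xy = cov_xy / np.var(y, ddof=1)   # slope from lm(x ~ y)
r2 = np.corrcoef(x, y)[0, 1] ** 2

# The product of the two slopes is exactly R-squared, so
# b_xy = 1 / b_yx only in the noiseless case R-squared = 1.
print(b_yx * b_xy, r2)
```

Since b_yx = r*sy/sx and b_xy = r*sx/sy, their product is r^2 whatever the data are.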
【Quoting n*****n's reply above】
| l***q Posts: 208 | 4 I think you have an endogeneity problem; then the zero conditional mean
assumption is violated and hence the estimate is not unbiased. | n*****n Posts: 3123 | 5 For the model Y = a*x + b, you want to test the joint hypothesis H0: a = 1 and b = 0.
Obviously, you did the wrong tests: the two separate tests "H01: a = 1" and "H02: b = 0" are not
equivalent to the joint test H0: a = 1 and b = 0.
I believe the results from the two models (Y = a*x + b, and x = a*Y + b) are the same if
you do the right tests.
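A sketch of that joint test with made-up data (numpy only; under the joint H0: a = 1, b = 0 the restricted model is simply y = x, so the F statistic compares its residual sum of squares against the unrestricted fit's):

```python
import numpy as np

# Made-up data satisfying the null: y = 1*x + 0 + noise.
rng = np.random.default_rng(1)
n = 86                                   # same n as the thread (84 df + 2)
x = rng.normal(scale=5.0, size=n)
y = x + rng.normal(size=n)

# Unrestricted OLS fit of y = b + a*x.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rss_u = np.sum((y - X @ beta) ** 2)

# Restricted model under the joint H0 (a = 1, b = 0): predictions are just x.
rss_r = np.sum((y - x) ** 2)

# F statistic for q = 2 restrictions and n - 2 residual degrees of freedom.
q, df = 2, n - 2
F = ((rss_r - rss_u) / q) / (rss_u / df)
print(F)  # compare with the F(2, 84) 5% critical value, roughly 3.1
```

Testing a = 1 and b = 0 one at a time, as in the `summary()` printouts, answers a different question from this single two-restriction test.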
【Quoting J*****n's post above】
| t**c Posts: 539 | 6 hand~ That's what I thought too.
【Quoting n*****n's reply above】
|