{\displaystyle {\begin{aligned}\mathbf {y} &={\begin{bmatrix}\mathbf {X} &\mathbf {K} \end{bmatrix}}{\begin{bmatrix}{\hat {\boldsymbol {\beta }}}\\{\hat {\boldsymbol {\gamma }}}\end{bmatrix}},\\{}\Rightarrow {\begin{bmatrix}{\hat {\boldsymbol {\beta }}}\\{\hat {\boldsymbol {\gamma }}}\end{bmatrix}}&={\begin{bmatrix}\mathbf {X} &\mathbf {K} \end{bmatrix}}^{-1}\mathbf {y} ={\begin{bmatrix}\left(\mathbf {X} ^{\top }\mathbf {X} \right)^{-1}\mathbf {X} ^{\top }\\\left(\mathbf {K} ^{\top }\mathbf {K} \right)^{-1}\mathbf {K} ^{\top }\end{bmatrix}}\mathbf {y} .\end{aligned}}}
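As a check on this decomposition, here is a minimal NumPy sketch (synthetic data; the construction of K as an orthonormal basis of the orthogonal complement of the column space of X is an assumption for illustration) verifying that the first block of the joint solve equals the usual OLS estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2
X = rng.normal(size=(n, p))

# K: orthonormal basis of the orthogonal complement of col(X),
# taken from the full QR decomposition, so that X.T @ K == 0.
Q, _ = np.linalg.qr(X, mode="complete")
K = Q[:, p:]

y = rng.normal(size=n)
coef = np.linalg.solve(np.hstack([X, K]), y)      # [beta_hat; gamma_hat]
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)      # (X'X)^{-1} X'y
assert np.allclose(coef[:p], beta_hat)
```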
{\displaystyle \mathbf {X} ={\begin{bmatrix}X_{11}&X_{12}&\cdots &X_{1p}\\X_{21}&X_{22}&\cdots &X_{2p}\\\vdots &\vdots &\ddots &\vdots \\X_{n1}&X_{n2}&\cdots &X_{np}\end{bmatrix}},\qquad {\boldsymbol {\beta }}={\begin{bmatrix}\beta _{1}\\\beta _{2}\\\vdots \\\beta _{p}\end{bmatrix}},\qquad \mathbf {y} ={\begin{bmatrix}y_{1}\\y_{2}\\\vdots \\y_{n}\end{bmatrix}}.}
Using either of these equations to predict the weight of a 5' 6" (1.6764 m) woman gives similar values: 62.94 kg with rounding vs. 62.98 kg without rounding. Thus a seemingly small variation in the data has a real effect on the coefficients but a small effect on the results of the equation.
The theorem can be used to establish a number of theoretical results. For example, having a regression with a constant and another regressor is equivalent to subtracting the means from the dependent variable and the regressor and then running the regression for the de-meaned variables but without the constant term.
{\displaystyle s^{2}={\frac {{\hat {\varepsilon }}^{\mathrm {T} }{\hat {\varepsilon }}}{n-p}}={\frac {(My)^{\mathrm {T} }My}{n-p}}={\frac {y^{\mathrm {T} }M^{\mathrm {T} }My}{n-p}}={\frac {y^{\mathrm {T} }My}{n-p}}={\frac {S({\hat {\beta }})}{n-p}},\qquad {\hat {\sigma }}^{2}={\frac {n-p}{n}}\;s^{2}}
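A minimal NumPy sketch of the two variance estimates, on synthetic data (all names here are illustrative), showing the exact relation σ̂² = (n − p)/n · s²:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 30, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
s2 = resid @ resid / (n - p)        # unbiased estimator s^2
sigma2_mle = resid @ resid / n      # MLE estimator, biased downward
assert np.isclose(sigma2_mle, (n - p) / n * s2)
```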
As a concrete example where regressors are non-linearly dependent yet estimation may still be consistent, we might suspect the response depends linearly both on a value and its square; in which case we would include one regressor whose value is just the square of another regressor. In that case, the model would be quadratic in the second regressor, but is nonetheless still considered a linear model because it remains linear in the parameters β.
This highlights a common error: this example is an abuse of OLS which inherently requires that the errors in the independent variable (in this case height) are zero or at least negligible. The initial rounding to nearest inch plus any actual measurement errors constitute a finite and non-negligible error.
An important consideration when carrying out statistical inference using regression models is how the data were sampled. In this example, the data are averages rather than measurements on individual women. The fit of the model is very good, but this does not imply that the weight of an individual woman can be predicted with high accuracy based only on her height.
A scatterplot of the data will suggest the form and strength of the relationship between the dependent variable and regressors. It might also reveal outliers, heteroscedasticity, and other aspects of the data that may complicate the interpretation of a fitted regression model. The scatterplot suggests that the relationship is strong and can be approximated as a quadratic function.
Second, for each explanatory variable of interest, one wants to know whether its estimated coefficient differs significantly from zero—that is, whether this particular explanatory variable in fact has explanatory power in predicting the response variable. Here the null hypothesis is that the true coefficient is zero.
Regressors do not have to be independent for estimation to be consistent e.g. they may be non-linearly dependent. Short of perfect multicollinearity, parameter estimates may still be consistent; however, as multicollinearity rises the standard error around such estimates increases and reduces the precision of such estimates.
Residuals against the explanatory variables in the model. A non-linear relation between these variables suggests that the linearity of the conditional mean function may not hold. Different levels of variability in the residuals for different levels of the explanatory variables suggests possible heteroscedasticity.
There are several different frameworks in which the linear regression model can be cast in order to make the OLS technique applicable. Each of these settings produces the same formulas and same results. The only difference is the interpretation and the assumptions which have to be imposed in order for the method to give meaningful results. The choice of the applicable framework depends mostly on the nature of the data in hand, and on the inference task which has to be performed.
This example also demonstrates that coefficients determined by these calculations are sensitive to how the data is prepared. The heights were originally given rounded to the nearest inch and have been converted and rounded to the nearest centimetre. Since the conversion factor is one inch to 2.54 cm, this is not an exact conversion.
The Chow test is used to test whether two subsamples both have the same underlying true coefficient values. The sum of squared residuals of regressions on each of the subsets and on the combined data set are compared by computing an F-statistic; if this exceeds a critical value, the null hypothesis of no difference between the two subsets is rejected; otherwise, it is accepted.
{\displaystyle {\hat {\beta }}^{c}=R(R^{\operatorname {T} }X^{\operatorname {T} }XR)^{-1}R^{\operatorname {T} }X^{\operatorname {T} }y+{\Big (}I_{p}-R(R^{\operatorname {T} }X^{\operatorname {T} }XR)^{-1}R^{\operatorname {T} }X^{\operatorname {T} }X{\Big )}Q(Q^{\operatorname {T} }Q)^{-1}c,}
If the t-statistic is larger than a predetermined value, the null hypothesis is rejected and the variable is found to have explanatory power, with its coefficient significantly different from zero. Otherwise, the null hypothesis of a zero value of the true coefficient is accepted.
Geometrically, this is seen as the sum of the squared distances, parallel to the axis of the dependent variable, between each data point in the set and the corresponding point on the regression surface—the smaller the differences, the better the model fits the data. The resulting estimator can be expressed by a simple formula, especially in the case of a simple linear regression, in which there is a single regressor on the right side of the regression equation.
This assumption is not needed for the validity of the OLS method, although certain additional finite-sample properties can be established when it does hold (especially in the area of hypothesis testing). Also when the errors are normal, the OLS estimator is equivalent to the maximum likelihood estimator (MLE).
Another way of looking at it is to consider the regression line to be a weighted average of the lines passing through the combination of any two points in the dataset. Although this method of calculation is more computationally expensive, it provides better intuition about OLS.
Two hypothesis tests are particularly widely used. First, one wants to know if the estimated regression equation is any better than simply predicting that all values of the response variable equal its sample mean (if not, it is said to have no explanatory power). The null hypothesis of no explanatory value of the estimated regression is tested using an F-test.
{\displaystyle R^{2}={\frac {\sum ({\hat {y}}_{i}-{\overline {y}})^{2}}{\sum (y_{i}-{\overline {y}})^{2}}}={\frac {y^{\mathrm {T} }P^{\mathrm {T} }LPy}{y^{\mathrm {T} }Ly}}=1-{\frac {y^{\mathrm {T} }My}{y^{\mathrm {T} }Ly}}=1-{\frac {\rm {RSS}}{\rm {TSS}}}}
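A short sketch of the R² = 1 − RSS/TSS computation on synthetic data (the data and names are illustrative; the design matrix includes a constant so the statistic lies between 0 and 1):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # constant included
y = 1.0 + 2.0 * X[:, 1] + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
rss = np.sum((y - X @ beta_hat) ** 2)   # residual sum of squares
tss = np.sum((y - y.mean()) ** 2)       # total sum of squares
r2 = 1 - rss / tss
```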
When there is perfect multicollinearity, it is no longer possible to obtain unique estimates for the coefficients to the related regressors; estimation for these parameters cannot converge (thus, it cannot be consistent).
{\displaystyle {\begin{aligned}{\widehat {\beta }}&={\frac {\sum _{i=1}^{n}{(x_{i}-{\bar {x}})(y_{i}-{\bar {y}})}}{\sum _{i=1}^{n}{(x_{i}-{\bar {x}})^{2}}}}\\{\widehat {\alpha }}&={\bar {y}}-{\widehat {\beta }}\,{\bar {x}}\ ,\end{aligned}}}
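These simple-regression formulas translate directly into code. A minimal sketch (the small data vectors are made up for illustration), cross-checked against a generic least squares fit:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
alpha_hat = y.mean() - beta_hat * x.mean()

# agrees with a generic degree-1 least squares fit
assert np.allclose(np.polyfit(x, y, 1), [beta_hat, alpha_hat])
```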
If the data matrix X contains only two variables, a constant and a scalar regressor x_i, then this is called the "simple regression model". This case is often considered in beginner statistics classes, as it provides much simpler formulas suitable even for manual calculation. The parameters are commonly denoted as (α, β).
In the previous section the least squares estimator β̂ was obtained as a value that minimizes the sum of squared residuals of the model. However it is also possible to derive the same estimator from other approaches. In all cases the formula for the OLS estimator remains the same, β̂ = (X^T X)^{-1} X^T y; the only difference is in how we interpret this result.
Here σ² is a parameter which determines the variance of each observation; this is considered a nuisance parameter in the model, although usually it is also estimated. If this assumption is violated then the OLS estimates are still valid, but no longer efficient. It is customary to split this assumption into two parts: homoscedasticity and no autocorrelation.
Ordinary least squares analysis often includes the use of diagnostic plots designed to detect departures of the data from the assumed form of the model. These are some of the common diagnostic plots:
{\displaystyle {\hat {\beta }}^{c}={\hat {\beta }}-(X^{\operatorname {T} }X)^{-1}Q{\Big (}Q^{\operatorname {T} }(X^{\operatorname {T} }X)^{-1}Q{\Big )}^{-1}(Q^{\operatorname {T} }{\hat {\beta }}-c).}
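A minimal sketch of this constrained estimator on synthetic data (the constraint "coefficients sum to one" is a made-up example), verifying that the constraint holds exactly after the adjustment:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 50, 3
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

Q = np.ones((p, 1))          # hypothetical constraint: Q' beta = c
c = np.array([1.0])

XtX_inv = np.linalg.inv(X.T @ X)
adj = XtX_inv @ Q @ np.linalg.solve(Q.T @ XtX_inv @ Q, Q.T @ beta_hat - c)
beta_c = beta_hat - adj
assert np.allclose(Q.T @ beta_c, c)   # constraint satisfied exactly
```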
If we are willing to allow biased estimators, and consider the class of estimators that are proportional to the sum of squared residuals (SSR) of the model, then the best (in the sense of the mean squared error) estimator in this class is SSR/(n − p + 2), which even beats the Cramér–Rao bound in the case when there is only one regressor (p = 1).
{\displaystyle {\hat {\boldsymbol {\beta }}}={\boldsymbol {\beta }}+\left(\mathbf {X} ^{\operatorname {T} }\mathbf {X} \right)^{-1}\mathbf {X} ^{\operatorname {T} }{\boldsymbol {\varepsilon }}.}
As a result, the fitted parameters are not the best estimates they are presumed to be. Though not totally spurious, the error in the estimation will depend upon the relative size of the x and y errors.
{\displaystyle {\hat {y}}_{j}^{(j)}-{\hat {y}}_{j}=x_{j}^{\mathrm {T} }{\hat {\beta }}^{(j)}-x_{j}^{\operatorname {T} }{\hat {\beta }}=-{\frac {h_{j}}{1-h_{j}}}\,{\hat {\varepsilon }}_{j}}
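This leverage identity can be verified numerically against a brute-force refit. A minimal sketch with synthetic data (all names illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 20, 2
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

H = X @ np.linalg.solve(X.T @ X, X.T)   # hat matrix; h_j = H[j, j]
resid = y - H @ y
j = 0
keep = np.arange(n) != j                # refit without observation j
beta_j = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]

lhs = X[j] @ beta_j - H[j] @ y          # yhat_j^{(j)} - yhat_j
rhs = -H[j, j] / (1 - H[j, j]) * resid[j]
assert np.isclose(lhs, rhs)
```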
The OLS estimator is identical to the MLE under the normality assumption for the error terms. This normality assumption has historical importance, as it provided the basis for the early work in linear regression analysis by Yule and Pearson.
While this may look innocuous in the middle of the data range it could become significant at the extremes or in the case where the fitted model is used to project outside the data range (extrapolation).
The log-likelihood is calculated under the assumption that the errors follow a normal distribution. Even though the assumption is not very reasonable, this statistic may still find its use in conducting LR tests.
Importantly, the normality assumption applies only to the error terms; contrary to a popular misconception, the response (dependent) variable is not required to be normally distributed.
Usually the observations with high leverage ought to be scrutinized more carefully, in case they are erroneous, or outliers, or in some other way atypical of the rest of the dataset.
Assuming the system cannot be solved exactly (the number of equations n is much larger than the number of unknowns p), we are looking for a solution that could provide the smallest discrepancy between the right- and left-hand sides. In other words, we are looking for the solution that satisfies
{\displaystyle S({\boldsymbol {\beta }})=\sum _{i=1}^{n}\left|y_{i}-\sum _{j=1}^{p}X_{ij}\beta _{j}\right|^{2}=\left\|\mathbf {y} -\mathbf {X} {\boldsymbol {\beta }}\right\|^{2}.}
Residuals against explanatory variables not in the model. Any relation of the residuals to these variables would suggest considering these variables for inclusion in the model.
These asymptotic distributions can be used for prediction, testing hypotheses, constructing other estimators, etc. As an example consider the problem of prediction. Suppose x_0 is some point within the domain of distribution of the regressors, and one wants to know what the response variable would have been at that point.
Okun's law in macroeconomics states that in an economy the GDP growth should depend linearly on the changes in the unemployment rate. Here the ordinary least squares method is used to construct the regression line describing this law.
The Durbin–Watson statistic tests whether there is any evidence of serial correlation between the residuals. As a rule of thumb, a value smaller than 2 is evidence of positive correlation.
If the calculated F-value is found to be large enough to exceed its critical value for the pre-chosen level of significance, the null hypothesis is rejected and the alternative hypothesis, that the regression has explanatory power, is accepted.
This theorem establishes optimality only in the class of linear unbiased estimators, which is quite restrictive. Depending on the distribution of the error terms ε, other, non-linear estimators may provide better results than OLS.
The original inches can be recovered by Round(x/0.0254) and then re-converted to metric without rounding. If this is done the results become:
Since x_i is a p-vector, the number of moment conditions is equal to the dimension of the parameter vector β, and thus the system is exactly identified. This is the so-called classical GMM case, when the estimator does not depend on the choice of the weighting matrix.
This framework makes all the assumptions listed earlier simpler and easier to interpret. It also allows one to state asymptotic results (as the sample size n → ∞), which are understood as a theoretical possibility of fetching new independent observations from the data generating process.
The adjusted R² is a slightly modified version of R², designed to penalize for the excess number of regressors which do not add to the explanatory power of the regression. This statistic is always smaller than R², can decrease as new regressors are added, and even be negative for poorly fitting models.
It is common to assess the goodness-of-fit of the OLS regression by comparing how much the initial variation in the sample can be reduced by regressing onto X.
If the strict exogeneity does not hold (as is the case with many time series models, where exogeneity is assumed only with respect to the past shocks but not the future ones), then these estimators will be biased in finite samples.
{\displaystyle {\hat {\boldsymbol {\beta }}}=\left(\mathbf {X} ^{\operatorname {T} }\mathbf {X} \right)^{-1}\mathbf {X} ^{\operatorname {T} }\mathbf {y} .}
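A minimal NumPy sketch of this estimator on synthetic data (names and data are illustrative), computed both via the normal equations and via the numerically more stable least squares routine:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Normal equations (X'X) beta = X'y, solved directly ...
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
# ... or via a least squares solver, which avoids forming X'X.
beta_lstsq = np.linalg.lstsq(X, y, rcond=None)[0]
assert np.allclose(beta_hat, beta_lstsq)
```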
The Akaike information criterion and the Schwarz criterion are both used for model selection. Generally when comparing two alternative models, smaller values of one of these criteria will indicate a better model.
However, generally we also want to know how close those estimates might be to the true values of parameters. In other words, we want to construct the interval estimates.
It was assumed from the beginning of this article that this matrix is of full rank, and it was noted that when the rank condition fails, β will not be identifiable.
The properties listed so far are all valid regardless of the underlying distribution of the error terms. However, if you are willing to assume that the normality assumption holds (that is, that ε ~ N(0, σ²I_n)), then additional properties of the OLS estimators can be stated.
One of the lines of difference in interpretation is whether to treat the regressors as random variables, or as predefined constants. In the first case (random design) the regressors x_i are random and sampled together with the y_i's from some population, as in an observational study.
{\displaystyle \left(\mathbf {X} ^{\operatorname {T} }\mathbf {X} \right){\hat {\boldsymbol {\beta }}}=\mathbf {X} ^{\operatorname {T} }\mathbf {y} \ .}
Its p-value indicates the probability that the hypothesis is indeed true. Note that when the errors are not normal this statistic becomes invalid, and other tests such as the Wald test or the LR test should be used.
The mean response is the quantity x_0^T β, whereas the predicted response is x_0^T β̂.
{\displaystyle \left({\hat {y}}_{0}-y_{0}\right)\ {\xrightarrow {d}}\ {\mathcal {N}}\left(0,\;\sigma ^{2}x_{0}^{\mathrm {T} }Q_{xx}^{-1}x_{0}\right),}
{\displaystyle A={\begin{bmatrix}1&-0.731354\\1&-0.707107\\1&-0.615661\\1&\ 0.052336\\1&0.309017\\1&0.438371\end{bmatrix}}}
{\displaystyle {\hat {\beta }}^{(j)}-{\hat {\beta }}=-{\frac {1}{1-h_{j}}}(X^{\mathrm {T} }X)^{-1}x_{j}^{\mathrm {T} }{\hat {\varepsilon }}_{j}\,,}
In order for R² to be meaningful, the matrix X of data on regressors must contain a column vector of ones to represent the constant whose coefficient is the regression intercept. In that case, R² will always be a number between 0 and 1, with values close to 1 indicating a good degree of fit.
Sometimes the variables and corresponding parameters in the regression can be logically split into two groups, so that the regression takes the form
We can use the least squares mechanism to figure out the equation of a two body orbit in polar base co-ordinates. The equation typically used is
Unlike the Gauss–Markov theorem, this result establishes optimality among both linear and non-linear estimators, but only in the case of normally distributed error terms.
{\displaystyle {\hat {\beta }}=\operatorname {argmin} _{b\in \mathbb {R} ^{p}}S(b)=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }y\ .}
{\displaystyle ({\hat {\sigma }}^{2}-\sigma ^{2})\ {\xrightarrow {d}}\ {\mathcal {N}}\left(0,\;\operatorname {E} \left[\varepsilon _{i}^{4}\right]-\sigma ^{4}\right).}
{\displaystyle {\hat {\mathbf {r} }}:=\mathbf {y} -\mathbf {X} {\hat {\boldsymbol {\beta }}}=\mathbf {K} {\hat {\boldsymbol {\gamma }}}.}
In this case least squares estimation is equivalent to minimizing the sum of squared residuals of the model subject to the constraint A.
{\displaystyle {\hat {\varepsilon }}=y-{\hat {y}}=y-X{\hat {\beta }}=My=M(X\beta +\varepsilon )=(MX)\beta +M\varepsilon =M\varepsilon .}
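A minimal sketch of the annihilator matrix M = I − P on synthetic data (names illustrative), checking the two properties used in this derivation, My = ε̂ and MX = 0:

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 15, 3
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

P = X @ np.linalg.solve(X.T @ X, X.T)   # projection onto col(X)
M = np.eye(n) - P                       # annihilator matrix
resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
assert np.allclose(M @ y, resid)        # My equals the OLS residuals
assert np.allclose(M @ X, 0)            # MX = 0
```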
{\displaystyle {\widehat {\operatorname {s.\!e.} }}({\hat {\beta }}_{j})={\sqrt {s^{2}\left(X^{\operatorname {T} }X\right)_{jj}^{-1}}}}
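In code, the coefficient standard errors are the square roots of the diagonal of s²(X'X)^{-1}. A minimal sketch on synthetic data (names illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 80, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
s2 = resid @ resid / (n - p)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))  # one s.e. per coefficient
```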
{\displaystyle {\hat {\boldsymbol {\beta }}}={\underset {\boldsymbol {\beta }}{\operatorname {arg\,min} }}\,S({\boldsymbol {\beta }}),}
For practical purposes, this distinction is often unimportant, since estimation and inference are carried out while conditioning on X. All results stated in this article are within the random design framework.
Note that the original strict exogeneity assumption implies a far richer set of moment conditions than stated above. In particular, this assumption implies that for any vector-function ƒ, the moment condition E[ƒ(x_i)·ε_i] = 0 will hold.
Such a matrix can always be found, although generally it is not unique. The second formula coincides with the first in case when X^T X is invertible.
R² is the coefficient of determination indicating goodness-of-fit of the regression. This statistic will be equal to one if the fit is perfect, and to zero when the regressors X have no explanatory power whatsoever.
This assumption may be violated in the context of time series data, panel data, cluster samples, hierarchical data, repeated measures data, longitudinal data, and other data with dependencies. In such cases generalized least squares provides a better alternative than OLS.
OLS can handle non-linear relationships by introducing the regressor HEIGHT².
Otherwise, the null hypothesis of no explanatory power is accepted.
(rank condition), consistent for the variance estimate of the residuals when regressors have finite fourth moments and—by the Gauss–Markov theorem—optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated.
From the properties of MLE, we can infer that the OLS estimator is asymptotically efficient (in the sense of attaining the Cramér–Rao bound for variance) if the normality assumption is satisfied.
The variance in the prediction of the independent variable as a function of the dependent variable is given in the article Polynomial least squares.
where X⁺ denotes the Moore–Penrose pseudoinverse matrix of X. This formulation highlights the point that estimation can be carried out if, and only if, there is no perfect multicollinearity between the explanatory variables (which would cause the Gram matrix to have no inverse).
Large values of the t-statistic indicate that the null hypothesis can be rejected and that the corresponding coefficient is not zero. The second column, p-value, expresses the results of the hypothesis test as a significance level.
{\displaystyle {\hat {\beta }}={\rm {arg}}\min _{\beta }\,\lVert \mathbf {y} -\mathbf {X} {\boldsymbol {\beta }}\rVert ,}
The rows of X, denoting the values of all the independent variables associated with a particular value of the dependent variable, are the regressor vectors.
{\displaystyle ({\hat {\beta }}-\beta )\ {\xrightarrow {d}}\ {\mathcal {N}}{\big (}0,\;\sigma ^{2}Q_{xx}^{-1}{\big )},}
The standard error of the j-th coefficient involves the j-th diagonal element of this matrix. The estimate of this standard error is obtained by replacing the unknown quantity σ² with its estimate s².
When this assumption is violated the regressors are called linearly dependent or perfectly multicollinear.
Usually, it is also assumed that the regressors have finite moments up to at least the second moment. Then the matrix Q_xx = E[ x_i x_i^T ] is finite and positive semi-definite.
7091:
15612:
15187:
15159:
15079:
15027:
14987:
14770:
14701:
14674:
14647:
14590:
10714:
6969:
The classical model focuses on the "finite sample" estimation and inference, meaning that the number of observations n is fixed. This contrasts with the other approaches, which study the asymptotic behavior of OLS.
{\displaystyle {\hat {\beta }}\ \sim \ {\mathcal {N}}{\big (}\beta ,\ \sigma ^{2}(X^{\mathrm {T} }X)^{-1}{\big )}.}
7361:
3136:
299:
13835:
8886:
8555:
8345:
8220:
7889:
7722:
1445:
14500:
4610:
3892:
350:
288:
108:
83:
12933:
tries to test the hypothesis that all coefficients (except the intercept) are equal to zero. This statistic has
7301:. If the errors have infinite variance then the OLS estimates will also have infinite variance (although by the
7052:. The exogeneity assumption is critical for the OLS theory. If it holds then the regressor variables are called
{\displaystyle S(b)=\sum _{i=1}^{n}(y_{i}-x_{i}^{\operatorname {T} }b)^{2}=(y-Xb)^{\operatorname {T} }(y-Xb),}
However it may happen that adding the restriction A makes β identifiable, in which case one would like to find the formula for the estimator; it is given by the expression displayed above.
which is equivalent to regression on a constant; it simply subtracts the mean from a variable.)
OLS estimation can be viewed as a projection onto the linear space spanned by the regressors. (Here each of X_1, …, X_p refers to a column of the data matrix.)
In such case the value of the regression coefficient β cannot be learned, although prediction of y values is still possible for new values of the regressors that lie in the same linearly dependent subspace.
6689:
4702:
579:
428:
10488:
15867:
15682:
14485:
12885:
12466:
12393:
8509:
5825:
2625:
371:
14914:
14436:
{\displaystyle y_{i}=\beta _{1}\ x_{i1}+\beta _{2}\ x_{i2}+\cdots +\beta _{p}\ x_{ip}+\varepsilon _{i},}
{\displaystyle \cos(\theta -\theta _{0})=\cos(\theta )\cos(\theta _{0})+\sin(\theta )\sin(\theta _{0})}
Residuals against the preceding residual. This plot may identify serial correlations in the residuals.
In practice, n is assumed to be "large enough" so that the true distribution of the OLS estimator is close to its asymptotic limit.
{\displaystyle \operatorname {Var} [\,{\hat {\beta }}\mid X\,]=\sigma ^{2}\left(X^{\operatorname {T} }X\right)^{-1}=\sigma ^{2}Q.}
{\displaystyle (\mathbf {y} -\mathbf {X} {\hat {\boldsymbol {\beta }}})\cdot \mathbf {X} \mathbf {v} }
{\displaystyle b={\begin{bmatrix}0.21220\\0.21958\\0.24741\\0.45071\\0.52883\\0.56820\end{bmatrix}}.}
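Using the matrix A and vector b given in this example, the orbital parameters can be recovered with an ordinary least squares solve. A minimal sketch (variable names illustrative; the recovered values match those quoted in this example, x ≈ 0.43478, y ≈ 0.30435, p ≈ 2.3000, e ≈ 0.70001):

```python
import numpy as np

# Design matrix A and observation vector b as given above.
A = np.array([[1, -0.731354], [1, -0.707107], [1, -0.615661],
              [1,  0.052336], [1,  0.309017], [1,  0.438371]])
b = np.array([0.21220, 0.21958, 0.24741, 0.45071, 0.52883, 0.56820])

x, yc = np.linalg.lstsq(A, b, rcond=None)[0]   # solves A'A [x y]' = A'b
p_orbit = 1 / x                                # semi-latus rectum p
e_orbit = p_orbit * yc                         # eccentricity e
```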
To analyze which observations are influential we remove a specific j-th observation and consider how much the estimated quantities are going to change (similarly to the jackknife method).
{\displaystyle y_{i}=\mathbf {x} _{i}^{\operatorname {T} }{\boldsymbol {\beta }}+\varepsilon _{i},}
Clearly the predicted response is a random variable; its distribution can be derived from that of β̂. The residual vector y − Xβ̂ is the shortest of all possible vectors y − Xβ, that is, the variance of the residuals is the minimum possible. This is illustrated at the right.
15687:
15565:
15529:
15457:
15347:
15329:
14490:
13716:
13684:
13527:
13032:
12690:
11866:
8717:
8706:
8326:
7672:
7542:
7502:
7045:
6852:
5669:
For mathematicians, OLS is an approximate solution to an overdetermined system of linear equations Xβ ≈ y, where β is the unknown.
5143:
5131:
3439:
547:
524:
391:
262:
185:
78:
57:
2938:
2524:
1700:
1642:
1513:
1423:
15428:
14799:
9852:
Suppose it is known that the coefficients in the regression satisfy a system of linear equations
8931:
8705:
for the model, and thus is optimal in the class of all unbiased estimators. Note that unlike the
7473:
6991:
6893:
6887:
5127:
will always be a number between 0 and 1, with values close to 1 indicating a good degree of fit.
885:
591:
421:
314:
14582:
14576:
13415:
13315:
13199:
12983:
12304:
12262:
12220:
11871:
The following data set gives average heights and weights for
American women aged 30–39 (source:
8828:
8702:
6821:
These moment conditions state that the regressors should be uncorrelated with the errors. Since
6701:
4709:
is defined as a ratio of "explained" variance to the "total" variance of the dependent variable
1738:
15893:
15781:
15607:
{\displaystyle {\hat {\sigma }}_{j}=\left({\hat {\sigma }}^{2}\left[(X^{\mathrm {T} }X)^{-1}\right]_{jj}\right)^{\frac {1}{2}}}
11830:
7298:
6984:. The linear functional form must coincide with the form of the actual data-generating process.
6922:
5951:{\displaystyle (\mathbf {y} -\mathbf {X} {\hat {\boldsymbol {\beta }}})^{\top }\mathbf {X} =0.}
4675:
is used more often, since it is more convenient for the hypothesis testing. The square root of
1535:
1467:
1410:{\displaystyle \mathbf {y} =\mathbf {X} {\boldsymbol {\beta }}+{\boldsymbol {\varepsilon }},\,}
1210:
1058:
520:
278:
215:
14691:
14664:
14637:
14572:
13225:
is the radius of how far the object is from one of the bodies. In the equation the parameters
10533:
goes to infinity. While the sample size is necessarily finite, it is customary to assume that
2546:
1664:
15862:
15822:
15786:
15771:
15722:
15666:
15493:
15265:
14979:
13412:
First we need to represent e and p in a linear form. So we are going to rewrite the equation
13273:
12954:
10522:
8937:
7063:
6942:
5810:
is just a certain linear combination of the vectors of regressors. Thus, the residual vector
2602:
Such a system usually has no exact solution, so the goal is instead to find the coefficients
1859:
484:
366:
62:
9223:-th observation resulting from omitting that observation from the dataset will be equal to
2092:
15827:
15766:
15753:
15702:
15602:
15524:
15503:
15477:
15223:
Heij, Christiaan; Boer, Paul; Franses, Philip H.; Kloek, Teun; van Dijk, Herman K. (2004).
12775:, can decrease as new regressors are added, and even be negative for poorly fitting models:
12751:
12724:
12697:-values smaller than 0.05 are taken as evidence that the population coefficient is nonzero.
12141:
11902:
11539:
11189:
10545:
7755:
7454:
7302:
7077:
7056:. If it does not, then those regressors that are correlated with the error term are called
6832:-vector, the number of moment conditions is equal to the dimension of the parameter vector
5869:
5658:
5612:
5585:
4996:
4713:, in the cases where the regression sum of squares equals the sum of squares of residuals:
2958:
2116:
1314:
891:
711:
535:
512:
496:
480:
386:
376:
257:
225:
180:
159:
67:
12715:, and will never decrease if additional regressors are added, even if they are irrelevant.
8922:, the fact which comes in useful when constructing the t- and F-tests for the regression.
7040:
The immediate consequence of the exogeneity assumption is that the errors have mean zero:
3899:
between the explanatory variables (which would cause the gram matrix to have no inverse).
8:
15845:
15776:
15661:
15628:
15580:
15570:
15549:
15544:
15423:
15405:
15390:
15321:
15197:
14731:
12961:
12927:
tell us how much of the initial variation in the sample were explained by the regression.
10772:
10768:
6926:
4280:
575:
304:
205:
200:
154:
103:
93:
38:
15286:
15857:
15761:
15750:
15575:
15261:
15148:
14071:
13997:
13950:
13903:
13387:
13367:
13248:
13228:
11780:
11268:
10777:
Using this asymptotic distribution, approximate two-sided confidence intervals for the
9824:
8851:
7596:
7578:
7294:
7270:
5094:
5059:
4668:
4663:. The two estimators are quite similar in large samples; the first estimator is always
2918:
2718:
2582:
2009:
1986:
1619:
1565:
1493:
1267:
1168:
767:
691:
599:
504:
409:
138:
123:
12131:{\displaystyle w_{i}=\beta _{1}+\beta _{2}h_{i}+\beta _{3}h_{i}^{2}+\varepsilon _{i}.}
7758:, meaning that their expected values coincide with the true values of the parameters:
7348:
provides a better alternative than the OLS. Another expression for autocorrelation is
6197:
5683:
is the unknown. Assuming the system cannot be solved exactly (the number of equations
15852:
15812:
15539:
15449:
15440:
15273:
15247:
15228:
15209:
15183:
15155:
15075:
15023:
14983:
14972:
14766:
14697:
14670:
14663:
Hofmann-Wellenhof, Bernhard; Lichtenegger, Herbert; Wasle, Elmar (20 November 2007).
14643:
14586:
14568:
12899:
12425:
11011:
10443:
7441:
7306:
7163:
4070:
4002:
3896:
555:
488:
404:
195:
98:
52:
13265:
are used to determine the path of the orbit. We have measured the following data.
15508:
15375:
9847:
8998:
8377:(BLUE). Efficiency should be understood as if we were to find some other estimator
7278:
5112:
4066:
616:
220:
149:
13830:. We use the original two-parameter form to represent our observational data as:
13513:{\displaystyle {\frac {1}{r(\theta )}}={\frac {1}{p}}-{\frac {e}{p}}\cos(\theta )}
12711:
have no explanatory power whatsoever. This is a biased estimate of the population
8873:, which even beats the Cramér–Rao bound in case when there is only one regressor (
2543:
contains information on the data points. The first column is populated with ones,
448:
15791:
15697:
15638:
15633:
14967:
11822:
11031:
is also consistent and asymptotically normal (provided that the fourth moment of
7600:
7314:
7254:
6329:
The equation and solution of linear least squares are thus described as follows:
5961:
A geometrical interpretation of these equations is that the vector of residuals,
5829:
5800:
3437:, and thus assesses the degree of fit between the actual data and the model. The
2003:
563:
492:
381:
88:
12600:
columns are testing whether any of the coefficients might be equal to zero. The
15735:
15201:
12196:
11842:
{\displaystyle \operatorname {E} [\,{\hat {\beta }}\mid X\,]=\beta ,\quad \operatorname {E} [\,s^{2}\mid X\,]=\sigma ^{2}.}
5578:
4664:
3662:
456:
133:
This hypothesis is tested by computing the coefficient's t-statistic, as the ratio of the coefficient estimate to its standard error.
We can show that under the model assumptions, the least squares estimator for β is consistent (that is, β̂ converges in probability to β) and asymptotically normal.
15882:
15807:
15337:
15317:
14606:
13103:
11214:
This expression for the constrained estimator is valid as long as the matrix X^T X is invertible.
8975:, meaning that it represents a linear combination of the dependent variables
8801:{\displaystyle s^{2}\ \sim \ {\frac {\sigma ^{2}}{n-p}}\cdot \chi _{n-p}^{2}}
8339:
7632:
7462:
6961:. All results stated in this article are within the random design framework.
3130:
1559:
559:
500:
452:
252:
128:
10142:
will not be identifiable. However it may happen that adding the restriction
8373:
is efficient in the class of linear unbiased estimators. This is called the
1766:. Without the intercept, the fitted line is forced to cross the origin when
503:: minimizing the sum of the squares of the differences between the observed
14666:
GNSS – Global
Navigation Satellite Systems: GPS, GLONASS, Galileo, and more
12945:) distribution under the null hypothesis and normality assumption, and its
11853:
difference between the two subsets is rejected; otherwise, it is accepted.
10449:
Since we have not made any assumption about the distribution of error term
8986:, and generally are unequal. The observations with high weights are called
8335:
6977:
of OLS, and in which the behavior at a large number of samples is studied.
6697:
6002:
3908:
2027:
543:
118:
14544:
13094:
Residuals to a quadratic fit for correctly and incorrectly converted data.
8990:
because they have a more pronounced effect on the value of the estimator.
7428:{\displaystyle \varepsilon \mid X\sim {\mathcal {N}}(0,\sigma ^{2}I_{n}).}
2915:
below. This minimization problem has a unique solution, provided that the
15395:
14545:"What is a complete list of the usual assumptions for linear regression?"
12592:
12201:
12020:
11906:
11838:
8982:. The weights in this linear combination are functions of the regressors
7870:
7337:
7319:
7085:
6010:
5889:
In other words, the gradient equations at the minimum can be written as:
5832:
3841:
3092:
1973:{\displaystyle \sum _{j=1}^{p}x_{ij}\beta _{j}=y_{i},\ (i=1,2,\dots ,n),}
164:
113:
27:
Method for estimating the unknown parameters in a linear regression model
13090:
11789:
9730:
will be numerically identical to the residuals and the OLS estimate for
7440:(MLE), and therefore it is asymptotically efficient in the class of all
11825:
of no explanatory value of the estimated regression is tested using an
7341:
7265:
is a parameter which determines the variance of each observation. This
6954:
4020:
3406:-th observation, measures the vertical distance between the data point
3163:
468:
7293:
in each observation. When this requirement is violated this is called
6116:{\displaystyle \mathbf {y} -\mathbf {X} {\boldsymbol {\hat {\beta }}}}
5242:
The least squares estimates in this case are given by simple formulas
15015:
12950:
11849:
6693:
3620:
539:
528:
15182:(2nd ed.). New York: Oxford University Press. pp. 48–113.
13823:{\displaystyle \tan \theta _{0}=\sin(\theta _{0})/\cos(\theta _{0})}
13017:
woman can be predicted with high accuracy based only on her height.
11909:
of the data, the relationship is slightly curved but close to linear
11428:
11094:
10626:
8516:, other, non-linear estimators may provide better results than OLS.
6973:
is fixed. This contrasts with the other approaches, which study the
15246:(3rd ed.). Hoboken, NJ: John Wiley & Sons. pp. 8–47.
14893:"Assumptions of multiple regression: Correcting two misconceptions"
8993:
To analyze which observations are influential we remove a specific
8549:), then additional properties of the OLS estimators can be stated.
3665:, and therefore this function possesses a unique global minimum at
571:
15227:(1st ed.). Oxford: Oxford University Press. pp. 76–115.
14662:
11332:{\displaystyle {\hat {y}}_{0}=x_{0}^{\mathrm {T} }{\hat {\beta }}}
9219:-th observation. Similarly, the change in the predicted value for
7457:, an additional assumption is imposed — that all observations are
523:
can be expressed by a simple formula, especially in the case of a
13521:
12865:{\displaystyle {\overline {R}}^{2}=1-{\frac {n-1}{n-p}}(1-R^{2})}
12683:
12206:
8583:
is normally distributed, with mean and variance as given before:
7869:
If the strict exogeneity does not hold (as is the case with many
1820:
in the second regressor, but none-the-less is still considered a
508:
8842:. However it was shown that there are no unbiased estimators of
7448:
2153:{\displaystyle \mathbf {X} {\boldsymbol {\beta }}=\mathbf {y} ,}
1639:
Typically, a constant term is included in the set of regressors
15225:
Econometric
Methods with Applications in Business and Economics
15022:(Second ed.). New York: J. Wiley & Sons. p. 319.
11826:
11536:
which allows construct confidence intervals for mean response
10456:, it is impossible to infer the distribution of the estimators
9895:{\displaystyle A\colon \quad Q^{\operatorname {T} }\beta =c,\,}
9560:{\displaystyle y=X_{1}\beta _{1}+X_{2}\beta _{2}+\varepsilon ,}
8498:{\displaystyle \operatorname {Var} -\operatorname {Var} \geq 0}
6151:{\displaystyle \mathbf {y} -\mathbf {X} {\boldsymbol {\beta }}}
5152:
contains only two variables, a constant and a scalar regressor
681:{\displaystyle \left\{\mathbf {x} _{i},y_{i}\right\}_{i=1}^{n}}
12028:. The regression model then becomes a multiple linear model:
6241:), the residual vector should satisfy the following equation:
15272:(4th ed.). Mason, OH: Cengage Learning. pp. 22–67.
15242:
Hill, R. Carter; Griffiths, William E.; Lim, Guay C. (2008).
15208:(Fifth ed.). Boston: McGraw-Hill Irwin. pp. 55–96.
7048:), and that the regressors are uncorrelated with the errors:
3122:{\displaystyle \mathbf {X} ^{\operatorname {T} }\mathbf {y} }
3080:{\displaystyle \mathbf {X} ^{\operatorname {T} }\mathbf {X} }
14014:
is constructed by the first column being the coefficient of
12674:
follows a
Student-t distribution. Under weaker conditions,
9001:). It can be shown that the change in the OLS estimator for
7360:. It is sometimes additionally assumed that the errors have
2624:
which fit the equations "best", in the sense of solving the
14862:
14860:
14823:
14821:
14384:{\displaystyle {\binom {x}{y}}={\binom {0.43478}{0.30435}}}
7617:} is nonstationary, OLS results are often spurious unless {
5994:{\displaystyle \mathbf {y} -X{\hat {\boldsymbol {\beta }}}}
5232:{\displaystyle y_{i}=\alpha +\beta x_{i}+\varepsilon _{i}.}
4053:; this is a projection matrix onto the space orthogonal to
877:{\displaystyle \mathbf {x} _{i}=\left^{\operatorname {T} }}
567:
14890:
14715:
14713:
14636:
Ghilani, Charles D.; Paul r. Wolf, Ph. D. (12 June 2006).
12663:{\displaystyle t={\hat {\beta }}_{j}/{\hat {\sigma }}_{j}}
5570:; the only difference is in how we interpret this result.
15316:
14891:
Williams, M. N; Grajales, C. A. G; Kurkiewicz, D (2013).
7062:, and the OLS estimator becomes biased. In such case the
6713:
4667:, while the second estimator is biased but has a smaller
460:
14857:
14818:
13189:{\displaystyle r(\theta )={\frac {p}{1-e\cos(\theta )}}}
7289:, which means that the error term has the same variance
6704:
for variance) if the normality assumption is satisfied.
2911:
A justification for choosing this criterion is given in
2521:(Note: for a linear model as above, not all elements in
1490:
vectors of the response variables and the errors of the
538:
for the level-one fixed effects when the regressors are
14710:
12452:
column gives the least squares estimates of parameters
{\displaystyle \beta _{j}\in {\bigg [}\ {\hat {\beta }}_{j}-q_{1-\alpha /2}^{{\mathcal {N}}(0,1)}{\sqrt {{\hat {\sigma }}^{2}\left[Q_{xx}^{-1}\right]_{jj}}},\ \ {\hat {\beta }}_{j}+q_{1-\alpha /2}^{{\mathcal {N}}(0,1)}{\sqrt {{\hat {\sigma }}^{2}\left[Q_{xx}^{-1}\right]_{jj}}}\ {\bigg ]}}
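A minimal sketch of this asymptotic confidence interval on synthetic data (names illustrative; the 97.5% normal quantile is hardcoded to keep the example NumPy-only):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / n            # MLE variance estimate
q = 1.959964                              # N(0,1) quantile for 95% two-sided
half = q * np.sqrt(sigma2_hat * np.diag(np.linalg.inv(X.T @ X)))
ci = np.column_stack([beta_hat - half, beta_hat + half])
```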
7166:. In such case the value of the regression coefficient
5868:
in this case can be interpreted as the coefficients of
2082:{\displaystyle \beta _{1},\beta _{2},\dots ,\beta _{p}}
1367:. This model can also be written in matrix notation as
14263:
14144:
13121:
9699:
8890:
8559:
8384:
8349:
8224:
8044:
8037:
In particular, the standard error of each coefficient
7893:
7726:
7239:{\displaystyle \operatorname {Var} =\sigma ^{2}I_{n},}
6541:
6496:
6444:
6388:
6360:
5062:
5006:
4628:
2579:. Only the other columns contain actual data. So here
2449:
2376:
2186:
1165:, as introduced previously, is a column vector of the
574:. Under the additional assumption that the errors are
When only one dependent variable is being modeled, a
11572:
11542:
11377:
11345:
11277:
11223:
11192:
11047:
10819:
10787:
10717:
10593:
10554:
10491:
10462:
10159:
9948:
9861:
9746:
9698:
9669:
9498:
9232:
9014:
8948:
8889:
8729:
8592:
8558:
8421:
8383:
8348:
8257:
8223:
8096:
8043:
7926:
7892:
7767:
7725:
7370:
7186:
7094:
7000:
6929:. This approach allows for more natural study of the
6873:, which results in the moment equation posted above.
1609:{\displaystyle \mathbf {x} _{i}^{\operatorname {T} }}
566:. Under these conditions, the method of OLS provides
515:. Some sources consider OLS to be linear regression.
507:(values of the variable being observed) in the input
15222:
12689:, expresses the results of the hypothesis test as a
10415:) matrix such that the matrix is non-singular, and
9806:{\displaystyle M_{1}y=M_{1}X_{2}\beta _{2}+\eta \,,}
7461:. This means that all observations are taken from a
7336:. This assumption may be violated in the context of
6964:
6892:
There are several different frameworks in which the
6839:
Note that the original strict exogeneity assumption
5520:
In the previous section the least squares estimator
582:
that outperforms any non-linear unbiased estimator.
14635:
14520:"The Origins of Ordinary Least Squares Assumptions"
13364:We need to find the least-squares approximation of
8248:is uncorrelated with the residuals from the model:
7297:, in such case a more efficient estimator would be
4274:Using these residuals we can estimate the value of
1636:-th observations on all the explanatory variables.
15094:
15074:. New York: Oxford University Press. p. 33.
15069:
15020:Linear Statistical Inference and its Applications
14375:
14362:
14350:
14337:
13868:
13855:
11673:
10980:
10912:
10835:
10343:
10255:
10070:
10020:
9215:is the vector of regressors corresponding to the
8846:with variance smaller than that of the estimator
8107:
7098:
6107:
5137:
4652:{\displaystyle \scriptstyle {\hat {\sigma }}^{2}}
15880:
15241:
12670:. If the errors ε follow a normal distribution,
11260:{\displaystyle y_{0}=x_{0}^{\mathrm {T} }\beta }
9723:{\displaystyle \scriptstyle {\hat {\beta }}_{2}}
8068:{\displaystyle \scriptstyle {\hat {\beta }}_{j}}
7095:
6933:of the estimators. In the other interpretation (
6707:
5730:
3373:is a "candidate" value for the parameter vector
2599:is equal to the number of regressors plus one).
1558:matrix of regressors, also sometimes called the
15070:Davidson, Russell; MacKinnon, James G. (1993).
14897:Practical Assessment, Research & Evaluation
14041:and the second column being the coefficient of
10753:{\displaystyle Q_{xx}=X^{\operatorname {T} }X.}
9939:estimator can be given by an explicit formula:
8217:It can also be easily shown that the estimator
6716:case the OLS estimator can also be viewed as a
3971:{\displaystyle {\hat {y}}=X{\hat {\beta }}=Py,}
3162:is the coefficient vector of the least-squares
1311:accounts for the influences upon the responses
511:and the output of the (linear) function of the
15196:
14639:Adjustment Computations: Spatial Data Analysis
8401:{\displaystyle \scriptstyle {\tilde {\beta }}}
6183:{\displaystyle {\hat {\boldsymbol {\gamma }}}}
3700:, which can be given by the explicit formula:
1762:corresponding to this regressor is called the
531:on the right side of the regression equation.
15302:
14950:
14948:
14946:
14745:
10690:
10645:
9663:states that in this regression the residuals
8680:
8623:
7449:Independent and identically distributed (iid)
7131:
7101:
7080:. Mathematically, this means that the matrix
6851:will hold. However it can be shown using the
6797:
6737:
6720:estimator arising from the moment conditions
3155:{\displaystyle {\hat {\boldsymbol {\beta }}}}
1233:vector of unknown parameters; and the scalar
429:
15270:Introductory Econometrics: A Modern Approach
15146:Burnham, Kenneth P.; David Anderson (2002).
15129:
15127:
14581:. New York: John Wiley & Sons. pp.
13893:{\displaystyle A^{T}A{\binom {x}{y}}=A^{T}b}
8907:{\displaystyle \scriptstyle {\hat {\beta }}}
8576:{\displaystyle \scriptstyle {\hat {\beta }}}
8366:{\displaystyle \scriptstyle {\hat {\beta }}}
8241:{\displaystyle \scriptstyle {\hat {\beta }}}
7910:{\displaystyle \scriptstyle {\hat {\beta }}}
7743:{\displaystyle \scriptstyle {\hat {\beta }}}
5761:
5740:
1853:
1457:{\displaystyle {\boldsymbol {\varepsilon }}}
15065:
15063:
11027:Similarly, the least squares estimator for
8925:
8811:The variance of this estimator is equal to
7710:
7476:. The list of assumptions in this case is:
6990:. The errors in the regression should have
5687:is much larger than the number of unknowns
15309:
15295:
15260:
14943:
14797:
14760:
14567:
14496:Numerical methods for linear least squares
13020:
12678:is asymptotically normal. Large values of
11456:
11122:
10656:
10438:of the linear regression model parameters
10429:
8334:assumption (that is, the errors should be
7170:cannot be learned, although prediction of
6949:is sampled conditionally on the values of
5515:
4577:
4024:because it "puts a hat" onto the variable
3463:)) is a measure of the overall model fit:
915:, is a linear function of the regressors:
436:
422:
15177:
15150:Model Selection and Multi-Model Inference
15124:
14763:Standard Mathematical Tables and Formulae
14613:. Princeton University Press. p. 15.
11873:The World Almanac and Book of Facts, 1975
11856:
9891:
9841:
9799:
9484:
9398:
9153:
8485:
8466:
8450:
8431:
8301:
8267:
7955:
7936:
7836:
7819:
7796:
7777:
7693:, with a finite matrix of second moments
7206:
7196:
7128:
7106:
7020:
7010:
6911:are random and sampled together with the
6794:
6742:
5739:
5479:
3738:
2684:
2668:
1406:
1122:
491:model (with fixed level-one effects of a
15072:Estimation and Inference in Econometrics
15060:
14693:GPS: Theory, Algorithms and Applications
13089:
12960:
12140:
11901:
6941:are treated as known constants set by a
6123:is the shortest of all possible vectors
5577:
5049:{\textstyle L=I_{n}-{\frac {1}{n}}J_{n}}
1260:represents unobserved random variables (
447:
15818:Numerical smoothing and differentiation
15133:
15118:
15054:
15042:
15002:
14966:
14954:
14937:
14878:
14866:
14851:
14839:
14827:
14812:
14785:
14719:
14623:
14605:
14426:{\displaystyle p={\frac {1}{x}}=2.3000}
14119:{\displaystyle {\frac {1}{r(\theta )}}}
11863:Simple linear regression § Example
9431:From the properties of the hat matrix,
9204:-th diagonal element of the hat matrix
8942:As was mentioned before, the estimator
8314:{\displaystyle \operatorname {Cov} =0.}
7459:independent and identically distributed
6466:
6450:
6410:
6394:
6304:
6284:
6171:
6144:
6104:
6038:
5982:
5918:
5757:
5654:This section may need to be cleaned up.
5636:refers to a column of the data matrix.)
3642:which minimizes this sum is called the
3344:
3287:
3274:
3179:
3143:
3005:
2880:
2752:
2692:
2679:
2644:
2610:
2364:
2135:
1836:
1450:
1399:
1391:
1193:
1102:
570:estimation when the errors have finite
14:
15881:
14734:Practical Regression and Anova using R
13075:Converted to metric without rounding.
7453:In some applications, especially with
6811:{\displaystyle \mathrm {E} {\big }=0.}
6688:The OLS estimator is identical to the
3861:
3133:of regressand by regressors. Finally,
2617:{\displaystyle {\boldsymbol {\beta }}}
1843:{\displaystyle {\boldsymbol {\beta }}}
1200:{\displaystyle {\boldsymbol {\beta }}}
15290:
14765:. Chapman&Hall/CRC. p. 626.
12980:Residuals against the fitted values,
11774:
11014:of standard normal distribution, and
9685:{\displaystyle {\hat {\varepsilon }}}
8519:
7033:{\displaystyle \operatorname {E} =0.}
6683:
4028:. Another matrix, closely related to
15353:Iteratively reweighted least squares
14978:. Harvard University Press. p.
13126:
11784:
10514:{\displaystyle {\hat {\sigma }}^{2}}
9925:×1 vector of known constants, where
7571:
6888:Linear regression § Assumptions
6855:that the optimal choice of function
5639:
15014:
14746:Kenney, J.; Keeping, E. S. (1963).
13122:Another example with less real data
13061:Converted to metric with rounding.
12913:, standard error of the error term.
12604:-statistic is calculated simply as
11867:Linear least squares § Example
11038:exists) with limiting distribution
7066:may be used to carry out inference.
5820:will have the smallest length when
24:
15371:Pearson product-moment correlation
15171:
14689:
14464:{\displaystyle e=p\cdot y=0.70001}
14366:
14341:
13859:
12721:is a slightly modified version of
11706:
11651:
11605:
11478:
11440:
11311:
11248:
11123:
11106:
11024:-th diagonal element of a matrix.
10890:
10739:
10638:
10359:
10333:
10323:
10294:
10284:
10242:
10232:
10203:
10193:
10092:
10043:
10030:
9994:
9874:
9343:
9299:
9128:
9095:
8854:) estimator in this class will be
8656:
8616:
8172:
8108:
8101:
7983:
7810:
7768:
7385:
7001:
6781:
6731:
6641:
6610:
6588:
6557:
6194:with the assumption that a matrix
5932:
5724:
5721:
5718:
4975:
4972:
4969:
4964:
4961:
4958:
4931:
4911:
4878:
4855:
4843:
4466:
4428:
4416:
4378:
4325:
3802:
3776:
3576:
3535:
3338:
3307:
3235:
3204:
3109:
3067:
3024:
2987:
2675:
2672:
2669:
2665:
2662:
2659:
1601:
1096:
869:
25:
15905:
14088:is the values for the respective
13520:. Furthermore, one could fit for
10521:. Nevertheless, we can apply the
10434:The least squares estimators are
6965:Classical linear regression model
4093:), and relate to the data matrix
3693:{\displaystyle b={\hat {\beta }}}
2961:, given by solving the so-called
15851:
14798:Akbarzadeh, Vahab (7 May 2014).
13713:and in the extra basis function
11788:
7145:{\displaystyle \Pr \!{\big }=1.}
7064:method of instrumental variables
6655:
6636:
6616:
6605:
6583:
6563:
6552:
6529:
6507:
6500:
6371:
6364:
6344:
6297:
6277:
6269:
6255:
6213:
6205:
6139:
6131:
6097:
6089:
6059:
6054:
6031:
6023:
5969:
5938:
5911:
5903:
5752:
5744:
5644:
4687:standard error of the regression
3333:
3313:
3302:
3241:
3230:
3210:
3199:
3115:
3104:
3073:
3062:
3030:
3019:
2993:
2982:
2943:
2875:
2867:
2529:
2437:
2174:
2143:
2130:
1828:still linear in the parameters (
1801:{\displaystyle x_{i}={\vec {0}}}
1647:
1591:
1518:
1428:
1386:
1378:
1360:{\displaystyle \mathbf {x} _{i}}
1347:
1304:{\displaystyle \varepsilon _{i}}
1253:{\displaystyle \varepsilon _{i}}
1158:{\displaystyle \mathbf {x} _{i}}
1145:
1086:
793:
757:{\displaystyle \mathbf {x} _{i}}
744:
633:
483:method for choosing the unknown
403:
15178:Dougherty, Christopher (2002).
15139:
15112:
15100:
15088:
15048:
15036:
15008:
14996:
14960:
14931:
14907:
14884:
14872:
14845:
14833:
14806:
14791:
14779:
14754:
14739:
14725:
14690:Xu, Guochang (5 October 2007).
14501:Nonlinear system identification
11361:{\displaystyle {\hat {\beta }}}
10803:{\displaystyle {\hat {\beta }}}
10570:{\displaystyle {\hat {\beta }}}
10478:{\displaystyle {\hat {\beta }}}
9937:constrained least squares (CLS)
9868:
8964:{\displaystyle {\hat {\beta }}}
8075:is equal to square root of the
7809:
5861:{\displaystyle {\hat {\beta }}}
5542:{\displaystyle {\hat {\beta }}}
4536:
3916:) from the regression will be
2912:
2435:
2362:
784:parameters (regressors), i.e.,
585:
351:Least-squares spectral analysis
289:Generalized estimating equation
109:Multinomial logistic regression
84:Vector generalized linear model
15107:Davidson & MacKinnon (1993
15095:Davidson & MacKinnon (1993
14683:
14656:
14629:
14617:
14599:
14561:
14537:
14512:
14110:
14104:
14061:{\displaystyle {\frac {e}{p}}}
14034:{\displaystyle {\frac {1}{p}}}
13987:{\displaystyle {\frac {e}{p}}}
13940:{\displaystyle {\frac {1}{p}}}
13817:
13804:
13790:
13777:
13732:
13726:
13700:
13694:
13668:
13655:
13646:
13640:
13628:
13615:
13606:
13600:
13588:
13569:
13543:
13537:
13507:
13501:
13463:
13457:
13428:
13422:
13328:
13322:
13212:
13206:
13180:
13174:
13147:
13141:
12993:
12859:
12840:
12648:
12624:
12511:
12483:
12469:of each coefficient estimate:
11756:{\displaystyle y_{0}\in \left}
11683:
11668:
11656:
11617:
11390:
11352:
11323:
11285:
11083:
11058:
11048:
10922:
10907:
10895:
10850:
10794:
10615:
10603:
10594:
10561:
10499:
10469:
10368:
10351:
10306:
10276:
10215:
10185:
10167:
10115:
10103:
10084:
10052:
10035:
10003:
9986:
9977:
9956:
9707:
9676:
9406:
9354:
9325:
9319:
9312:
9273:
9258:
9252:
9240:
9141:
9105:
9086:
9049:
9035:
9029:
9022:
8955:
8897:
8666:
8647:
8599:
8566:
8486:
8473:
8463:
8451:
8438:
8428:
8391:
8375:best linear unbiased estimator
8356:
8302:
8289:
8274:
8264:
8231:
8143:
8131:
8121:
8052:
7956:
7943:
7933:
7900:
7837:
7816:
7797:
7784:
7774:
7733:
7719:assumption the OLS estimators
7691:martingale difference sequence
7419:
7390:
7364:conditional on the regressors:
7207:
7193:
7119:
7113:
7021:
7007:
6881:
6469:
6453:
6436:
6413:
6397:
6307:
6287:
6259:
6217:
6201:
6174:
6047:
6041:
6019:
5985:
5928:
5921:
5899:
5852:
5707:
5533:
5486:
5458:
5416:
5409:
5387:
5359:
5353:
5331:
5328:
5322:
5300:
5138:Simple linear regression model
4819:
4792:
4778:
4752:
4742:
4691:standard error of the equation
4636:
4611:statistical degrees of freedom
4544:
4516:
4510:
4501:
4373:
4363:
4337:
4317:
4234:
4225:
4219:
4204:
4183:
4159:
4138:
3950:
3932:
3785:
3768:
3762:
3756:
3716:
3684:
3596:
3581:
3572:
3556:
3544:
3509:
3482:
3476:
3277:
3182:
3146:
3008:
2885:
2862:
2756:
2748:
2696:
2688:
2647:
1964:
1934:
1792:
568:minimum-variance mean-unbiased
459:states that in an economy the
13:
1:
15266:"The Simple Regression Model"
14573:"Classical Linear Regression"
14506:
13738:{\displaystyle \sin(\theta )}
13706:{\displaystyle \cos(\theta )}
13549:{\displaystyle \cos(\theta )}
12149:The output from most popular
10584:) and asymptotically normal:
9737:in the following regression:
9470:, and observations with high
6876:
6708:Generalized method of moments
5573:
5119:to be meaningful, the matrix
4018:is also sometimes called the
3364:
2715:where the objective function
596:Suppose the data consists of
527:, in which there is a single
170:Nonlinear mixed-effects model
15841:Regression analysis category
15731:Response surface methodology
15180:Introduction to Econometrics
14750:. van Nostrand. p. 187.
12907:Standard error of regression
12894:Akaike information criterion
12794:
12705:coefficient of determination
10781:-th component of the vector
10762:
8827:, which does not attain the
8716:will be proportional to the
8508:in the sense that this is a
7529:no perfect multicollinearity
7438:maximum likelihood estimator
6690:maximum likelihood estimator
5000:for the dependent variable,
4813:
4772:
4703:coefficient of determination
2950:{\displaystyle \mathbf {X} }
2536:{\displaystyle \mathbf {X} }
1728:{\displaystyle i=1,\dots ,n}
1654:{\displaystyle \mathbf {X} }
1525:{\displaystyle \mathbf {X} }
1435:{\displaystyle \mathbf {y} }
580:maximum likelihood estimator
7:
15713:Frisch–Waugh–Lovell theorem
15683:Mean and predicted response
14474:
13556:with an extra parameter as
13407:
12153:will look similar to this:
9660:Frisch–Waugh–Lovell theorem
8701:This estimator reaches the
8510:nonnegative-definite matrix
7671:is of full rank, and hence
7309:techniques are recommended.
3893:Moore–Penrose pseudoinverse
708:includes a scalar response
578:with zero mean, OLS is the
372:Mean and predicted response
10:
15910:
15363:Correlation and dependence
15244:Principles of Econometrics
15154:(2nd ed.). Springer.
14915:"Memento on EViews Output"
13681:, which is linear in both
13434:{\displaystyle r(\theta )}
13334:{\displaystyle r(\theta )}
13218:{\displaystyle r(\theta )}
13030:
13027:Errors-in-variables models
13024:
13002:{\displaystyle {\hat {y}}}
12321:{\displaystyle \beta _{3}}
12279:{\displaystyle \beta _{2}}
12237:{\displaystyle \beta _{1}}
11860:
11778:
10766:
10529:properties as sample size
9845:
8935:
8929:
7879:variance-covariance matrix
6885:
5141:
4659:, is the MLE estimate for
4617:, is the OLS estimate for
4010:spanned by the columns of
1755:{\displaystyle \beta _{1}}
589:
Estimation

The parameter vector β is estimated by minimizing the sum of squared residuals (also called the error sum of squares or the residual sum of squares), which measures the squared distance between the vector of observations y and the hyperplane of points of the form Xβ:

{\displaystyle S({\boldsymbol {\beta }})=\sum _{i=1}^{n}{\bigl (}y_{i}-\mathbf {x} _{i}^{\mathrm {T} }{\boldsymbol {\beta }}{\bigr )}^{2}=(\mathbf {y} -\mathbf {X} {\boldsymbol {\beta }})^{\mathrm {T} }(\mathbf {y} -\mathbf {X} {\boldsymbol {\beta }}),}

where the objective function S is quadratic in β with a positive-definite Hessian, so it possesses a unique global minimum. Setting the gradient of S to zero yields the normal equations, whose solution is the OLS estimator:

{\displaystyle (\mathbf {X} ^{\mathrm {T} }\mathbf {X} ){\hat {\boldsymbol {\beta }}}=\mathbf {X} ^{\mathrm {T} }\mathbf {y} \quad \Rightarrow \quad {\hat {\boldsymbol {\beta }}}=(\mathbf {X} ^{\mathrm {T} }\mathbf {X} )^{-1}\mathbf {X} ^{\mathrm {T} }\mathbf {y} .}

This expression is valid as long as the p columns of the matrix X are linearly independent, so that X^{T}X is invertible; the matrix (X^{T}X)^{-1}X^{T} is then a Moore–Penrose pseudoinverse of X. When the columns of X are nearly linearly dependent, the regression suffers from multicollinearity and the estimates become numerically unstable.
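To make the formulas concrete, here is a minimal Python sketch on made-up simulated data (the design matrix, true coefficients and noise level are arbitrary choices, not taken from this article). It computes the estimate both from the normal equations and with a numerically more stable least-squares solver:

import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # constant + two regressors
beta_true = np.array([1.0, 2.0, -0.5])                          # assumed for the simulation
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Normal equations: (X'X) beta = X'y
beta_ne = np.linalg.solve(X.T @ X, X.T @ y)

# Same estimate via an orthogonalization-based solver (preferred numerically)
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta_ne)   # close to (1.0, 2.0, -0.5)
print(beta_ls)   # identical to beta_ne up to floating-point error

Solving the normal equations directly squares the condition number of X, which is why library routines based on QR or SVD factorizations are generally preferred.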
After we have estimated β, the fitted values (or predicted values) from the regression are

{\displaystyle {\hat {\mathbf {y} }}=\mathbf {X} {\hat {\boldsymbol {\beta }}}=P\mathbf {y} ,}

where P = X(X^{T}X)^{-1}X^{T} is the projection matrix onto the space spanned by the columns of X; it is also sometimes called the hat matrix. Its complement M = I_n − P is the annihilator matrix. Both matrices P and M are symmetric and idempotent (meaning that P² = P and M² = M), and M annihilates the regressors: MX = 0. The residuals from the regression are

{\displaystyle {\hat {\boldsymbol {\varepsilon }}}=\mathbf {y} -{\hat {\mathbf {y} }}=M\mathbf {y} .}

The denominator n − p in the unbiased variance estimate s² = ε̂^{T}ε̂/(n − p) is the statistical degrees of freedom of the regression. The first quantity, s², is the OLS estimate for σ², whereas the second, σ̂² = ε̂^{T}ε̂/n, is the MLE estimate for σ². The square root s of s² is called the standard error of the regression (also the standard error of the equation, or regression standard error); s² is closely related to the mean squared error of the fit and to the reduced chi-squared statistic.

It is common to assess the goodness of fit by the coefficient of determination R², the ratio between the "explained" and the "total" variation of the dependent variable y:

{\displaystyle R^{2}=1-{\frac {{\hat {\boldsymbol {\varepsilon }}}^{\mathrm {T} }{\hat {\boldsymbol {\varepsilon }}}}{\mathbf {y} ^{\mathrm {T} }L\mathbf {y} }},}

where the total sum of squares for the dependent variable is TSS = y^{T}Ly, with L = I_n − (1/n)J_n the centering matrix and J_n the n×n matrix of ones. (For R² to be meaningful, the matrix X must contain a column of ones, whose coefficient is the intercept.)

Simple linear regression model

Main articles: Simple linear regression; Polynomial least squares

If the data matrix X contains only two columns, a constant and a single scalar regressor, the estimator reduces to the familiar closed-form intercept and slope formulas of simple linear regression, expressed in terms of sample means, variances and covariances.
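The algebraic identities above are easy to verify numerically; this short check (on arbitrary simulated data, continuing the sketch style used earlier) confirms that P and M are idempotent, that M annihilates X, and that the residuals are orthogonal to the regressors:

import numpy as np

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(30), rng.normal(size=(30, 2))])
y = rng.normal(size=30)

P = X @ np.linalg.solve(X.T @ X, X.T)   # projection (hat) matrix
M = np.eye(30) - P                      # annihilator matrix
resid = M @ y                           # residuals

print(np.allclose(P @ P, P), np.allclose(M @ M, M))  # True True (idempotent)
print(np.allclose(M @ X, 0))                         # True (MX = 0)
print(np.allclose(X.T @ resid, 0))                   # True (residuals orthogonal to X)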
Alternative derivations

The same estimator β̂ arises from several other estimation principles, each with its own interpretation.

Orthogonal projections

Main article: Orthogonal projections

For mathematicians, OLS is an approximate solution to the overdetermined system of linear equations Xβ ≈ y. The vector of responses y is projected orthogonally onto the linear subspace of n-dimensional Euclidean space spanned by the columns of X. The predicted quantity ŷ = Xβ̂ is the point of this subspace closest to y, and the residual vector ε̂ = y − ŷ is orthogonal to the subspace: its inner product with every vector of the form Xv is equal to zero, which is precisely what the normal equations state. This gives the vector decomposition

{\displaystyle \mathbf {y} ={\hat {\mathbf {y} }}+{\hat {\boldsymbol {\varepsilon }}},\qquad {\hat {\mathbf {y} }}^{\mathrm {T} }{\hat {\boldsymbol {\varepsilon }}}=0,}

of y along the column space of X and its orthogonal complement.
Maximum likelihood

Main article: Maximum likelihood estimator

The OLS estimator is identical to the maximum likelihood estimator under the normality assumption for the error terms: if ε | X is normally distributed with zero mean and variance σ²I_n, the log-likelihood of the linear regression model is maximized over β exactly where S(β) is minimized, so the MLE of β coincides with β̂, and the MLE estimate of σ² is σ̂² = ε̂^{T}ε̂/n.

Generalized method of moments

Main article: Generalized method of moments

OLS can also be derived as the generalized method of moments (GMM) estimator based on the moment condition

{\displaystyle \mathrm {E} {\bigl [}\,\mathbf {x} _{i}{\bigl (}y_{i}-\mathbf {x} _{i}^{\mathrm {T} }{\boldsymbol {\beta }}{\bigr )}\,{\bigr ]}=0,}

where the objective function is a quadratic form in the corresponding sample moments. Since the model is exactly identified, the sample moment condition is solved exactly by β̂ and the choice of GMM weighting matrix is irrelevant.
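The two variance estimates differ only in their denominators; the following sketch (arbitrary simulated data, with true σ² = 4 by construction) contrasts the unbiased s² with the MLE σ̂²:

import numpy as np

rng = np.random.default_rng(2)
n, p = 40, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([0.5, 1.0, -1.0]) + rng.normal(scale=2.0, size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
ssr = resid @ resid                # sum of squared residuals

s2 = ssr / (n - p)                 # unbiased estimate of sigma^2
sigma2_mle = ssr / n               # MLE estimate, biased towards zero
print(s2, sigma2_mle)              # sigma2_mle is always the smaller of the two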
Assumptions

There are several frameworks in which the linear regression model can be cast in order to make the OLS technique applicable: the regressors may be treated as fixed, as in a designed experiment, or conditioned upon, as in an observational study. The choice of framework mainly affects which asymptotic properties hold and how the asymptotic behavior of the estimator is derived. In the simplest cross-sectional data setting the data are assumed to be iid observations; a time series model instead requires stationarity and ergodicity of the underlying stochastic process, with co-integrating regressions forming a special case with its own theory.

Correct specification

The linear functional form must coincide with the form of the actual data generating process.

Strict exogeneity

The errors in the regression should have conditional mean zero:

{\displaystyle \mathrm {E} [\,{\boldsymbol {\varepsilon }}\mid \mathbf {X} \,]=0.}

By the law of total expectation, strict exogeneity implies that the errors have mean zero, E[ε_i] = 0 for all i, and that the regressors are uncorrelated with the errors, E[X^{T}ε] = 0.
No linear dependence

The regressors in X must all be linearly independent: X must have full column rank p (almost surely, when the regressors are random), so that X^{T}X is of full rank, and hence positive-definite and invertible. When this assumption is violated, the regressors are called perfectly multicollinear and the parameter vector β is not identified.

Spherical errors

The errors are assumed to have the conditional variance-covariance matrix

{\displaystyle \operatorname {Var} [\,{\boldsymbol {\varepsilon }}\mid \mathbf {X} \,]=\sigma ^{2}I_{n},}

where I_n is the n×n identity matrix and σ² is a parameter of the model, usually considered a nuisance parameter. This assumption combines homoscedasticity (every error term has the same variance σ²) and the absence of autocorrelation between observations (E[ε_iε_j | X] = 0 for i ≠ j). If homoscedasticity fails (heteroscedasticity), weighted least squares or robust estimation techniques are recommended; if the errors exhibit serial correlation, generalized least squares is appropriate.

Normality

It is sometimes additionally assumed that the errors have a normal distribution conditional on the regressors. This assumption is not needed to compute the estimates, but it permits exact finite-sample inference.
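The full-rank condition can be checked directly; in this sketch (hypothetical data in which one column is deliberately constructed as a multiple of another) the matrix rank and the condition number reveal the problem:

import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=20)
X_bad = np.column_stack([np.ones(20), x, 2.0 * x])            # third column = 2 * second
X_ok = np.column_stack([np.ones(20), x, rng.normal(size=20)])

print(np.linalg.matrix_rank(X_bad))   # 2 < 3: perfectly multicollinear, X'X singular
print(np.linalg.matrix_rank(X_ok))    # 3: full column rank
print(np.linalg.cond(X_ok))           # moderate; very large values signal near-collinearity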
Finite sample properties

First of all, under the strict exogeneity assumption the OLS estimator β̂ is unbiased: E[β̂ | X] = β. Adding the spherical errors assumption, its variance-covariance matrix is

{\displaystyle \operatorname {Var} [\,{\hat {\boldsymbol {\beta }}}\mid \mathbf {X} \,]=\sigma ^{2}(\mathbf {X} ^{\mathrm {T} }\mathbf {X} )^{-1},}

and replacing σ² with its estimate s² yields the estimated standard errors of the individual coefficients. The Gauss–Markov theorem states that under the spherical errors assumption β̂ is efficient in the class of linear unbiased estimators: for any other estimator that would be linear in y and unbiased, the difference between its variance-covariance matrix and that of β̂ is a nonnegative-definite matrix, in the sense that β̂ is the best linear unbiased estimator (BLUE).
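A small Monte Carlo experiment (simulated under an assumed true β and σ, purely illustrative) makes the unbiasedness claim and the covariance formula tangible:

import numpy as np

rng = np.random.default_rng(4)
n, sigma = 30, 1.5
X = np.column_stack([np.ones(n), rng.normal(size=n)])    # fixed design, held constant across replications
beta = np.array([2.0, -1.0])
V_theory = sigma**2 * np.linalg.inv(X.T @ X)             # sigma^2 (X'X)^{-1}

draws = np.array([
    np.linalg.solve(X.T @ X, X.T @ (X @ beta + rng.normal(scale=sigma, size=n)))
    for _ in range(20000)
])
print(draws.mean(axis=0))   # close to (2, -1): unbiased
print(np.cov(draws.T))      # close to V_theory
print(V_theory)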
Assuming normality

The properties listed so far are valid regardless of the distribution of the error terms. If, in addition, the normality assumption holds (that is, ε | X ~ N(0, σ²I_n)), then stronger finite sample properties follow: β̂ | X is exactly normally distributed with mean β and variance σ²(X^{T}X)^{-1}; the estimators β̂ and s² are independent; and the scaled variance estimate (n − p)s²/σ² will be proportional to a draw from the chi-squared distribution with n − p degrees of freedom. Under normality the OLS estimator β̂ reaches the Cramér–Rao bound and is therefore efficient among all unbiased estimators, not merely among the linear ones. The MLE variance estimator σ̂², which does not attain the Cramér–Rao bound and is biased, nevertheless has smaller mean squared error than the unbiased s² (and the estimator with denominator n − p + 2 has smaller mean squared error still).
Large sample properties

The least squares estimates are point estimates of the linear regression model parameters. To judge how close they might be to the true values, one needs interval estimates; and since no assumption about the distribution of the error term is made in general, the exact sampling distribution of β̂ and σ̂² cannot be derived. Instead one studies their asymptotic properties as sample size n → ∞. Under weak regularity conditions, for instance iid observations with finite fourth moments and Q_xx = E[x_i x_i^{T}] finite and invertible, the OLS estimator is consistent (β̂ converges in probability to β, by the law of large numbers) and asymptotically normal:

{\displaystyle {\sqrt {n}}\,({\hat {\boldsymbol {\beta }}}-{\boldsymbol {\beta }})\ \xrightarrow {d} \ {\mathcal {N}}{\bigl (}0,\;\sigma ^{2}Q_{xx}^{-1}{\bigr )},}

a consequence of the central limit theorem. The estimators s² and σ̂² are likewise consistent for σ².
Confidence intervals

Main articles: Confidence interval; Prediction interval

Using this asymptotic distribution, an approximate two-sided confidence interval for the j-th component of the vector β̂ can be constructed as

{\displaystyle \beta _{j}\in {\Bigl [}\,{\hat {\beta }}_{j}\pm q_{1-\alpha /2}\,s\,{\sqrt {{\bigl [}(\mathbf {X} ^{\mathrm {T} }\mathbf {X} )^{-1}{\bigr ]}_{jj}}}\,{\Bigr ]}}

at the 1 − α confidence level, where q denotes the quantile function of the standard normal distribution (under the exact normal theory, quantiles of Student's t distribution with n − p degrees of freedom are used instead). Similarly, at a new point x₀ the mean response x₀^{T}β is the quantity being estimated by the predicted response ŷ₀ = x₀^{T}β̂. A confidence interval for the mean response has half-width proportional to s·sqrt(x₀^{T}(X^{T}X)^{-1}x₀), whereas a prediction interval for the not-yet-observed value y₀ = x₀^{T}β + ε₀ is wider, with half-width proportional to s·sqrt(1 + x₀^{T}(X^{T}X)^{-1}x₀), since it must also account for the variance of the new error term.
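The interval formula translates directly into code; this sketch (simulated data, with the t quantile taken from scipy.stats) builds 95% confidence intervals for both coefficients:

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, p = 60, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 3.0]) + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
s2 = resid @ resid / (n - p)
se = np.sqrt(s2 * np.diag(XtX_inv))         # standard errors of the coefficients

alpha = 0.05
q = stats.t.ppf(1 - alpha / 2, df=n - p)    # Student's t quantile with n - p degrees of freedom
for j in range(p):
    print(f"beta_{j}: {beta_hat[j]:.3f} +/- {q * se[j]:.3f}")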
Hypothesis testing

Main article: Hypothesis testing

Two kinds of hypothesis tests accompany nearly every reported OLS fit. First, one wants to know whether the estimated regression equation is any better than simply predicting every value of the response variable by its sample mean; the null hypothesis of no explanatory power is tested with the F-statistic of the regression, whose p-value is compared with the chosen significance level, and the alternative hypothesis that the regression has explanatory power is accepted when the null is rejected. For an individual coefficient, the ratio of the estimate to its standard error forms a t-statistic, which is compared with the corresponding critical value at the 1 − α confidence level. In addition, the Chow test can be used to test whether two subsamples share the same underlying true coefficient values.
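Both statistics are short computations; this sketch (illustrative simulated data, with p-values from scipy.stats) reports the per-coefficient t-tests and the overall F-test:

import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, p = 80, 3                                  # p counts the constant as well
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([0.2, 1.5, 0.0]) + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
resid = y - X @ b
s2 = resid @ resid / (n - p)

# t-statistics: coefficient divided by its standard error
t_stats = b / np.sqrt(s2 * np.diag(XtX_inv))
p_vals = 2 * stats.t.sf(np.abs(t_stats), df=n - p)

# F-test of the joint null that all slope coefficients are zero
tss = np.sum((y - y.mean()) ** 2)
ssr = resid @ resid
F = ((tss - ssr) / (p - 1)) / (ssr / (n - p))
print(t_stats, p_vals, F, stats.f.sf(F, p - 1, n - p))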
Partitioned regression

Main article: Frisch–Waugh–Lovell theorem

Sometimes the variables and the corresponding parameters in the regression can be logically split into two groups, so that the regression becomes

{\displaystyle \mathbf {y} =\mathbf {X} _{1}{\boldsymbol {\beta }}_{1}+\mathbf {X} _{2}{\boldsymbol {\beta }}_{2}+{\boldsymbol {\varepsilon }},}

where X₁ and X₂ have dimensions n×p₁ and n×p₂, and β₁ and β₂ are p₁×1 and p₂×1 vectors, with p₁ + p₂ = p. The Frisch–Waugh–Lovell theorem states that the OLS estimate of β₂ in the full regression coincides with the OLS estimate in the regression of residuals on residuals,

{\displaystyle M_{1}\mathbf {y} =M_{1}\mathbf {X} _{2}{\boldsymbol {\beta }}_{2}+{\boldsymbol {\eta }},}

where M₁ = I_n − X₁(X₁^{T}X₁)^{-1}X₁^{T} is the annihilator matrix for the regressors X₁.

Constrained estimation

Main article: Ridge regression

Suppose it is known that the coefficients in the regression satisfy a system of linear equations

{\displaystyle Q^{\mathrm {T} }{\boldsymbol {\beta }}=c,}

where Q is a p×q matrix of full rank, and q < p, and c is a known q×1 vector. In this case the constrained least squares estimator minimizes S(β) subject to the constraint, and can be expressed through the unconstrained estimator as

{\displaystyle {\hat {\boldsymbol {\beta }}}^{c}={\hat {\boldsymbol {\beta }}}-(\mathbf {X} ^{\mathrm {T} }\mathbf {X} )^{-1}Q{\bigl (}Q^{\mathrm {T} }(\mathbf {X} ^{\mathrm {T} }\mathbf {X} )^{-1}Q{\bigr )}^{-1}{\bigl (}Q^{\mathrm {T} }{\hat {\boldsymbol {\beta }}}-c{\bigr )}.}

Ridge regression addresses the related problem where the coefficients are penalized rather than exactly constrained.
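The Frisch–Waugh–Lovell theorem is an exact algebraic identity, so a numerical check reproduces it to machine precision (arbitrary simulated design; X₂ here is a single column):

import numpy as np

rng = np.random.default_rng(7)
n = 100
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])   # first group, including the constant
X2 = rng.normal(size=(n, 1))                             # second group: one regressor
y = X1 @ np.array([1.0, 2.0]) + 0.7 * X2[:, 0] + rng.normal(size=n)

# Full regression on [X1 X2]
beta_full = np.linalg.lstsq(np.hstack([X1, X2]), y, rcond=None)[0]

# Residual-on-residual regression: M1*y on M1*X2
M1 = np.eye(n) - X1 @ np.linalg.solve(X1.T @ X1, X1.T)
beta2_fwl = np.linalg.lstsq(M1 @ X2, M1 @ y, rcond=None)[0]

print(beta_full[-1], beta2_fwl[0])   # equal up to floating-point error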
Influential observations

Main articles: Influential observation; Leverage (statistics)

Some observations may exert a disproportionate influence on the estimates: removing a single such point can change β̂ substantially. A basic diagnostic is the leverage of the i-th observation,

{\displaystyle h_{i}=\mathbf {x} _{i}^{\mathrm {T} }(\mathbf {X} ^{\mathrm {T} }\mathbf {X} )^{-1}\mathbf {x} _{i},}

the i-th diagonal element of the projection matrix P. These quantities lie between 0 and 1, and they sum up to p, so that on average their value is p/n. Observations whose leverage markedly exceeds this average are called leverage points, and observations with high leverage deserve individual scrutiny. The effect of deleting the i-th observation can be computed without refitting the model (a shortcut related to the jackknife method), since

{\displaystyle {\hat {\boldsymbol {\beta }}}^{(i)}-{\hat {\boldsymbol {\beta }}}=-{\frac {1}{1-h_{i}}}\,(\mathbf {X} ^{\mathrm {T} }\mathbf {X} )^{-1}\mathbf {x} _{i}\,{\hat {\varepsilon }}_{i},}

where β̂^{(i)} is the estimate computed with the i-th observation removed and ε̂_i is the i-th residual.
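The deletion formula can be verified against a brute-force refit; in this sketch (simulated data) both routes give the same leave-one-out estimate, and the leverages sum to p:

import numpy as np

rng = np.random.default_rng(8)
n = 25
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, -2.0]) + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
resid = y - X @ beta
h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)   # leverages: diagonal of the hat matrix

i = 0                                         # delete the first observation
shortcut = beta - XtX_inv @ X[i] * resid[i] / (1 - h[i])
refit = np.linalg.lstsq(np.delete(X, i, axis=0), np.delete(y, i), rcond=None)[0]
print(shortcut, refit)        # identical up to floating-point error
print(h.sum(), X.shape[1])    # leverages sum to p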
Example with real data

This example uses the data set of average heights and weights for American women aged 30–39; the dependent variable is weight in kilograms and the regressor is height in metres. Since the scatter of the points is visibly curved, the model includes a quadratic term:

{\displaystyle \mathrm {Weight} _{i}=\beta _{1}+\beta _{2}\,\mathrm {Height} _{i}+\beta _{3}\,\mathrm {Height} _{i}^{2}+\varepsilon _{i}.}

The output from most popular statistical packages will look similar to this:

Method: Least squares
Dependent variable: WEIGHT

Variable   Coefficient
Const      128.8128
Height     -143.1620
Height²    61.9603

so that the fitted regression is

{\displaystyle {\widehat {\mathrm {Weight} }}=128.8128-143.1620\,\mathrm {Height} +61.9603\,\mathrm {Height} ^{2}.}

Alongside each coefficient, the output reports its standard errors, t-statistic and p-value, and the fit as a whole is summarized by R-squared and Adjusted R-squared, the S.E. of regression, the model, residual and total sums of squares (Model sum-of-sq., Residual sum-of-sq., Total sum-of-sq.), the Log-likelihood, the Durbin–Watson statistic, the Akaike criterion and the Schwarz criterion, and the F-statistic with its p-value (F-stat). (The remaining numerical entries of the table are not recoverable in this copy.)
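The same fit is easy to reproduce in code. The data below are synthetic stand-ins: heights on a regular grid with weights generated from the quoted fitted equation plus small noise, not the measured values from the article's table, so only the mechanics carry over:

import numpy as np

rng = np.random.default_rng(9)
height = np.linspace(1.47, 1.83, 15)                    # metres (a grid, not the article's data)
weight = (128.8128 - 143.1620 * height + 61.9603 * height**2
          + rng.normal(scale=0.25, size=15))            # synthetic, built from the quoted fit

X = np.column_stack([np.ones_like(height), height, height**2])
beta, *_ = np.linalg.lstsq(X, weight, rcond=None)
print(beta)                  # near (128.81, -143.16, 61.96) by construction

resid = weight - X @ beta
n, p = X.shape
s = np.sqrt(resid @ resid / (n - p))                    # S.E. of regression
r2 = 1 - (resid @ resid) / np.sum((weight - weight.mean()) ** 2)
print(s, r2)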
In this table: the Coefficient column contains the least squares estimates of the parameters β_j; the Std. errors column shows their estimated standard errors; the t-statistic and p-value columns report the tests of the hypothesis that each true coefficient is zero, with p-values conventionally compared against a significance level such as 0.05; R-squared is the coefficient of determination, and Adjusted R-squared a variant penalized for the number of regressors; S.E. of regression is the standard error of regression, an estimate of the standard deviation of the error term; the total, model and residual sums of squares describe the decomposition of the variation of the dependent variable into explained and unexplained parts; the Log-likelihood, Akaike information criterion and Schwarz criterion are used for comparing alternative model specifications; and the Durbin–Watson statistic tests for serial correlation in the residuals.

Residuals plot: plotting the residuals against the fitted values is a standard check of the fit; systematic patterns in this plot indicate misspecification, and a funnel shape indicates heteroscedasticity.

Sensitivity to rounding

Main articles: Errors-in-variables models; Quantization error model

In this example the heights were originally recorded rounded to the nearest inch and converted to metres using 1 in = 2.54 cm; this is a source of quantization error in the regressor. When errors in the independent variable are not negligible, errors-in-variables models should be used in place of OLS.
Problem statement

Suppose the available measurements are values of a distance r observed at several angles θ (in degrees), assumed to follow the conic-section law

{\displaystyle r(\theta )={\frac {p}{1-e\cos(\theta )}},}

where p is the semi-latus rectum and e is the eccentricity. Although the model is nonlinear in p and e, it can be rewritten with an extra parameter as

{\displaystyle {\frac {1}{r(\theta )}}={\frac {1}{p}}-{\frac {e}{p}}\cos(\theta ),}

which is linear in both transformed parameters x = 1/p and y = e/p. (If the orientation of the conic were also unknown, a sin(θ) term, used to extract the orientation angle, would enter the model in the same linear way.)

Writing one such equation per observation produces an overdetermined linear system A[x y]^{T} = b, whose i-th row is (1, −cos(θ_i)) with right-hand side b_i = 1/r_i. On solving we get the least squares values of x and y, and hence the estimates p = 1/x and e = y/x for the given data.
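A sketch of this linearization in code; the angles and the assumed true values p = 2.0, e = 0.5 below are invented for illustration (they are not the article's measurements), and because the synthetic data are noise-free the fit recovers them exactly:

import numpy as np

p_true, e_true = 2.0, 0.5
theta = np.deg2rad(np.array([30.0, 50.0, 70.0, 100.0, 120.0, 140.0]))  # angles given in degrees
r = p_true / (1 - e_true * np.cos(theta))       # distances from the conic-section law

# Linearized system: 1/r = x - y*cos(theta), with x = 1/p and y = e/p
A = np.column_stack([np.ones_like(theta), -np.cos(theta)])
b = 1.0 / r
x, y = np.linalg.lstsq(A, b, rcond=None)[0]

p_hat, e_hat = 1.0 / x, y / x
print(p_hat, e_hat)   # 2.0 and 0.5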
See also

Bayesian least squares
Fama–MacBeth regression
Nonlinear least squares
15906:
15895:
15894:Least squares
15892:
15890:
15887:
15886:
15884:
15869:
15866:
15864:
15861:
15859:
15854:
15849:
15847:
15844:
15842:
15839:
15838:
15835:
15829:
15826:
15824:
15821:
15819:
15816:
15814:
15811:
15809:
15808:Curve fitting
15806:
15805:
15803:
15799:
15793:
15790:
15788:
15785:
15783:
15780:
15778:
15775:
15773:
15770:
15768:
15765:
15763:
15760:
15759:
15757:
15755:
15754:approximation
15752:
15748:
15742:
15739:
15737:
15734:
15732:
15729:
15728:
15726:
15724:
15720:
15714:
15711:
15709:
15706:
15704:
15701:
15699:
15696:
15694:
15691:
15689:
15686:
15684:
15681:
15680:
15678:
15674:
15668:
15665:
15663:
15660:
15656:
15653:
15651:
15648:
15646:
15645:
15637:
15636:
15635:
15632:
15630:
15627:
15626:
15624:
15620:
15614:
15611:
15609:
15606:
15604:
15601:
15600:
15598:
15596:
15592:
15582:
15579:
15577:
15574:
15572:
15569:
15567:
15564:
15563:
15561:
15557:
15551:
15548:
15546:
15543:
15541:
15538:
15536:
15533:
15531:
15530:Nonparametric
15528:
15526:
15523:
15522:
15520:
15516:
15510:
15507:
15505:
15502:
15500:
15497:
15495:
15492:
15491:
15489:
15485:
15479:
15476:
15474:
15471:
15469:
15466:
15464:
15461:
15459:
15456:
15455:
15453:
15451:
15447:
15444:
15442:
15436:
15430:
15427:
15425:
15422:
15420:
15417:
15415:
15412:
15411:
15409:
15407:
15403:
15397:
15394:
15392:
15389:
15386:
15385:Kendall's tau
15383:
15381:
15377:
15374:
15372:
15369:
15368:
15366:
15364:
15360:
15354:
15351:
15349:
15346:
15344:
15341:
15339:
15338:Least squares
15336:
15335:
15333:
15331:
15327:
15323:
15319:
15318:Least squares
15312:
15307:
15305:
15300:
15298:
15293:
15292:
15289:
15281:
15275:
15271:
15267:
15263:
15259:
15255:
15249:
15245:
15240:
15236:
15230:
15226:
15221:
15217:
15211:
15207:
15203:
15199:
15195:
15191:
15189:0-19-877643-8
15185:
15181:
15176:
15175:
15163:
15161:0-387-95364-7
15157:
15152:
15151:
15142:
15135:
15134:Amemiya (1985
15130:
15128:
15120:
15119:Amemiya (1985
15115:
15108:
15103:
15096:
15091:
15083:
15081:0-19-506011-3
15077:
15073:
15066:
15064:
15056:
15055:Amemiya (1985
15051:
15044:
15043:Amemiya (1985
15039:
15031:
15029:0-471-70823-2
15025:
15021:
15017:
15011:
15004:
15003:Amemiya (1985
14999:
14991:
14989:9780674005600
14985:
14981:
14976:
14975:
14969:
14963:
14956:
14955:Hayashi (2000
14951:
14949:
14947:
14939:
14938:Hayashi (2000
14934:
14916:
14910:
14902:
14898:
14894:
14887:
14880:
14879:Hayashi (2000
14875:
14868:
14867:Hayashi (2000
14863:
14861:
14853:
14852:Hayashi (2000
14848:
14841:
14840:Hayashi (2000
14836:
14829:
14828:Hayashi (2000
14824:
14822:
14814:
14813:Hayashi (2000
14809:
14801:
14794:
14787:
14786:Hayashi (2000
14782:
14774:
14772:0-8493-2479-3
14768:
14764:
14757:
14749:
14742:
14736:
14735:
14728:
14721:
14720:Hayashi (2000
14716:
14714:
14705:
14703:9783540727156
14699:
14695:
14694:
14686:
14678:
14676:9783211730171
14672:
14668:
14667:
14659:
14651:
14649:9780471697282
14645:
14641:
14640:
14632:
14625:
14624:Hayashi (2000
14620:
14612:
14608:
14602:
14594:
14592:0-471-31101-4
14588:
14584:
14580:
14579:
14574:
14570:
14564:
14550:
14546:
14540:
14525:
14521:
14515:
14511:
14502:
14499:
14497:
14494:
14492:
14489:
14487:
14484:
14482:
14479:
14478:
14472:
14471:
14458:
14455:
14452:
Example with real data

The following data set gives average heights and weights for American women aged 30–39 (source: The World Almanac and Book of Facts, 1975).

Height (m):   1.47   1.50   1.52   1.55   1.57   1.60   1.63   1.65   1.68   1.70   1.73   1.75   1.78   1.80   1.83
Weight (kg): 52.21  53.12  54.48  55.84  57.20  58.57  59.93  61.29  63.11  64.47  66.28  68.10  69.92  72.19  74.46

OLS can accommodate a curved relationship between the response and a regressor as long as the model remains linear in the parameters: introducing the squared regressor HEIGHT² as a second explanatory variable gives the multiple linear model

    w_i = \beta_1 + \beta_2 h_i + \beta_3 h_i^2 + \varepsilon_i.

The output from a typical statistical package looks similar to this:

Dependent variable: WEIGHT     Method: Least squares     Observations: 15

             Value        Std error    t-statistic    p-value
Const        128.8128     16.3083       7.8986        0.0000
Height      −143.1620     19.8332      −7.2183        0.0000
Height²       61.9603      6.0084      10.3122        0.0000

R-squared            0.9989     S.E. of regression    0.2516
Adjusted R-squared   0.9987     Model sum-of-sq.      692.61
Log-likelihood       1.0890     Residual sum-of-sq.   0.7595
Durbin–Watson stat.  2.1013     Total sum-of-sq.      693.37
Akaike criterion     0.2548     F-statistic           5471.2
Schwarz criterion    0.3964     p-value (F-stat)      0.0000

In this table, the Value column contains the least-squares estimates \hat\beta_j, and the Std error column shows their estimated standard errors \hat\sigma_j = \sqrt{s^2 [(X^\top X)^{-1}]_{jj}}. Each t-statistic, t_j = \hat\beta_j / \hat\sigma_j, tests the null hypothesis that the corresponding true coefficient is zero; here all three p-values are essentially zero, so each regressor has real explanatory power. R-squared is the coefficient of determination, and the adjusted version corrects it for the number of regressors used: \bar R^2 = 1 - \frac{n-1}{n-p}(1 - R^2).

Sensitivity to rounding

The heights in this data set were originally reported rounded to the nearest inch, then converted to metres and rounded to the nearest centimetre. Redoing the conversion without the final rounding changes the fitted equation from

    w = 128.8128 - 143.162\,h + 61.96033\,h^2    (rounded heights)

to

    w = 119.0205 - 131.5076\,h + 58.5046\,h^2    (unrounded heights).
Problem

A Kepler orbit is described in polar coordinates by

    r(\theta) = \frac{p}{1 - e\cos\theta},

where p is the semi-latus rectum and e is the eccentricity of the orbit. Six observations of the orbit are available:

θ (in degrees)   43      45      52      93      108     116
r(θ)             4.7126  4.5542  4.0419  2.2187  1.8910  1.7599

Find the least-squares estimates of p and e.

Solution

Although r(θ) is not linear in the unknowns p and e, taking reciprocals linearizes the equation:

    \frac{1}{r(\theta)} = \frac{1}{p} - \frac{e}{p}\cos\theta = x - y\cos\theta,

where x = 1/p and y = e/p. Each observation contributes one linear equation x - y cos θ_i = 1/r_i, so the system to be solved in the least-squares sense is A (x, y)^\top = b with

    A = \begin{bmatrix} 1 & -0.731354 \\ 1 & -0.707107 \\ 1 & -0.615661 \\ 1 & 0.052336 \\ 1 & 0.309017 \\ 1 & 0.438371 \end{bmatrix}, \qquad b = \begin{bmatrix} 0.21220 \\ 0.21958 \\ 0.24741 \\ 0.45071 \\ 0.52883 \\ 0.56820 \end{bmatrix},

where the second column of A holds −cos θ_i and the entries of b are the reciprocals 1/r_i. Solving the normal equations A^\top A (x, y)^\top = A^\top b gives

    \hat x = 0.43478, \qquad \hat y = 0.30435,

so p = 1/\hat x = 2.3000 and e = p\,\hat y = 0.70001.

If the apsides of the orbit are rotated by an unknown angle θ₀, so that r(\theta) = p / (1 - e\cos(\theta - \theta_0)), the problem reduces to the same linear form by expanding \cos(\theta - \theta_0) = \cos\theta_0 \cos\theta + \sin\theta_0 \sin\theta: the model 1/r(\theta) = x - c\cos\theta - s\sin\theta is still linear in its parameters x, c, and s, and the rotation angle is recovered from \tan\theta_0 = s/c.