(or the tuning parameter) inherently involved in its construction. While it does not completely discard any of the components, it exerts a shrinkage effect over all of them in a continuous manner, so that the extent of shrinkage is higher for the low-variance components and lower for the high-variance components. Frank and Friedman (1993) conclude that, for the purpose of prediction itself, the ridge estimator, owing to its smooth shrinkage effect, is perhaps a better choice than the PCR estimator, which has a discrete shrinkage effect.

, which is probably more suited for addressing the multicollinearity problem and for performing dimension reduction, the above criteria actually attempt to improve the prediction and estimation efficiency of the PCR estimator by involving both the outcome and the covariates in the process of selecting the principal components to be used in the regression step. Alternative approaches with similar goals include selection of the principal components based on
exactly equivalent to the classical PCR based on a primal formulation. However, for arbitrary (and possibly non-linear) kernels, this primal formulation may become intractable owing to the infinite dimensionality of the associated feature map. Thus classical PCR becomes practically infeasible in that case, but kernel PCR based on the dual formulation still remains valid and computationally scalable.
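For the linear kernel, this primal–dual equivalence can be checked numerically. The sketch below (synthetic data; variable names are ours and not taken from any library's PCR implementation) compares classical PCR fitted values, computed in the primal via PCA of X, with kernel PCR fitted values computed in the dual via the eigendecomposition of K = XXᵀ:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 50, 5, 2
X = rng.normal(size=(n, p))
X -= X.mean(axis=0)                      # centred data matrix
Y = rng.normal(size=n)

# Primal (classical PCR): PCA on X, then regress Y on the first k components
U, s, Vt = np.linalg.svd(X, full_matrices=False)
W = X @ Vt[:k].T                         # n x k matrix of principal components
fit_primal = W @ np.linalg.lstsq(W, Y, rcond=None)[0]

# Dual (kernel PCR with the linear kernel): eigendecomposition of K = X X^T
K = X @ X.T
vals, vecs = np.linalg.eigh(K)
top = np.argsort(vals)[::-1][:k]         # indices of the k largest eigenvalues
Uk = vecs[:, top]
fit_dual = Uk @ np.linalg.lstsq(Uk, Y, rcond=None)[0]

print(np.allclose(fit_primal, fit_dual))
```

Both fits are orthogonal projections of Y onto the same k-dimensional column space, which is why the two formulations agree exactly in the linear-kernel case.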
(PLS) estimator. Similar to PCR, PLS also uses derived covariates of lower dimensions. However unlike PCR, the derived covariates for PLS are obtained based on using both the outcome as well as the covariates. While PCR seeks the high variance directions in the space of the covariates, PLS seeks the directions in the covariate space that are most useful for the prediction of the outcome.

{\displaystyle \operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{k})=\sigma ^{2}\;V_{k}(W_{k}^{T}W_{k})^{-1}V_{k}^{T}=\sigma ^{2}\;V_{k}\;\operatorname {diag} \left(\lambda _{1}^{-1},\ldots ,\lambda _{k}^{-1}\right)V_{k}^{T}=\sigma ^{2}\sideset {}{}\sum _{j=1}^{k}{\frac {\mathbf {v} _{j}\mathbf {v} _{j}^{T}}{\lambda _{j}}}.}
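The variance expression above can be verified numerically; this sketch (synthetic data, with an assumed error variance σ²) checks that σ²V_k(W_kᵀW_k)⁻¹V_kᵀ equals the eigenvector sum form:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k, sigma2 = 40, 4, 2, 1.5
X = rng.normal(size=(n, p))
X -= X.mean(axis=0)

# Spectral decomposition of X^T X: eigenvalues lambda_j and directions v_j
lam, V = np.linalg.eigh(X.T @ X)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]
Vk = V[:, :k]
Wk = X @ Vk                                   # first k principal components

var_matrix = sigma2 * Vk @ np.linalg.inv(Wk.T @ Wk) @ Vk.T
var_sum = sigma2 * sum(np.outer(V[:, j], V[:, j]) / lam[j] for j in range(k))
print(np.allclose(var_matrix, var_sum))
```

The agreement follows because W_kᵀW_k = V_kᵀXᵀXV_k = diag(λ₁, …, λ_k).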
hence the corresponding principal components and principal component directions could be infinite-dimensional as well. Therefore, these quantities are often practically intractable under the kernel machine setting. Kernel PCR essentially works around this problem by considering an equivalent dual formulation based on using the

that involves the observations for the explanatory variables only. Therefore, the resulting PCR estimator obtained from using these principal components as covariates need not necessarily have satisfactory predictive performance for the outcome. A somewhat similar estimator that tries to address this

so obtained. It can be easily shown that this is the same as regressing the outcome vector on the corresponding principal components (which are finite-dimensional in this case), as defined in the context of the classical PCR. Thus, for the linear kernel, the kernel PCR based on a dual formulation is

Clearly, kernel PCR has a discrete shrinkage effect on the eigenvectors of K', quite similar to the discrete shrinkage effect of classical PCR on the principal components, as discussed earlier. However, the feature map associated with the chosen kernel could potentially be infinite-dimensional, and

covariates that turn out to be the most correlated with the outcome (based on the degree of significance of the corresponding estimated regression coefficients) are selected for further use. A conventional PCR, as described earlier, is then performed, but now it is based on only the

. Since the smaller eigenvalues do not contribute significantly to the cumulative sum, the corresponding principal components may continue to be dropped as long as the desired threshold limit is not exceeded. The same criteria may also be used for addressing the

{\displaystyle {\widehat {\boldsymbol {\beta }}}_{L}=\arg \min _{{\boldsymbol {\beta }}_{*}\in \mathbb {R} ^{p}}\|\mathbf {Y} -\mathbf {X} {\boldsymbol {\beta }}_{*}\|^{2}\quad {\text{ subject to }}\quad L_{(p-k)}^{T}{\boldsymbol {\beta }}_{*}=\mathbf {0} .}
{\displaystyle \min _{{\boldsymbol {\beta }}_{*}\in \mathbb {R} ^{p}}\left\|\mathbf {Y} -\mathbf {X} {\boldsymbol {\beta }}_{*}\right\|^{2}\quad {\text{ subject to }}\quad {\boldsymbol {\beta }}_{*}\perp \{\mathbf {v} _{k+1},\ldots ,\mathbf {v} _{p}\}.}
{\displaystyle \operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} })-\operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{k})=\sigma ^{2}\sideset {}{}\sum _{j=k+1}^{p}{\frac {\mathbf {v} _{j}\mathbf {v} _{j}^{T}}{\lambda _{j}}}.}
{\displaystyle \operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{p})=\operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} })=\sigma ^{2}\sideset {}{}\sum _{j=1}^{p}{\frac {\mathbf {v} _{j}\mathbf {v} _{j}^{T}}{\lambda _{j}}}.}

was proposed. In a spirit similar to that of PLS, it attempts to obtain derived covariates of lower dimensions based on a criterion that involves both the outcome as well as the covariates. The method starts by performing a set of

. In general, they may be estimated using the unrestricted least squares estimates obtained from the original full model. Park (1981) however provides a slightly modified set of estimates that may be better suited for this purpose.

. The estimated regression coefficients (having the same dimension as the number of selected eigenvectors) along with the corresponding selected eigenvectors are then used for predicting the outcome for a future observation. In

. PCR can aptly deal with such situations by excluding some of the low-variance principal components in the regression step. In addition, by usually regressing on only a subset of all the principal components, PCR can result in

of the associated kernel matrix. Under the linear regression model (which corresponds to choosing the kernel function as the linear kernel), this amounts to considering a spectral decomposition of the corresponding

{\displaystyle \min _{{\boldsymbol {\beta }}_{*}\in \mathbb {R} ^{p}}\|\mathbf {Y} -\mathbf {X} {\boldsymbol {\beta }}_{*}\|^{2}\quad {\text{ subject to }}\quad L_{(p-k)}^{T}{\boldsymbol {\beta }}_{*}=\mathbf {0} }
for the explanatory variables to obtain the principal components, and then (usually) select a subset, based on some appropriate criteria, of the principal components so obtained for further use.
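The overall procedure can be sketched end to end as follows (a minimal illustration on synthetic data; the function name `pcr` and all parameter choices are ours, not from any library):

```python
import numpy as np

def pcr(X, Y, k):
    """Sketch of principal component regression:
    centre the data, run PCA, regress Y on the first k principal
    components, then map the coefficients back via beta_k = V_k gamma_k."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean()
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Vk = Vt[:k].T                                 # selected directions
    W = Xc @ Vk                                   # derived covariates
    gamma = np.linalg.lstsq(W, Yc, rcond=None)[0]
    return Vk @ gamma                             # p-dimensional estimator

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 4))
beta = np.array([1.0, -2.0, 0.5, 0.0])
Y = X @ beta + 0.1 * rng.normal(size=60)
# With k = p, no component is discarded and PCR reduces to ordinary
# least squares on the centred data.
print(pcr(X, Y, k=4))
```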
{\displaystyle \forall k\in \{1,\ldots ,p\}:\quad \operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} })-\operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{k})\succeq 0,}
{\displaystyle \forall j\in \{1,\ldots ,p\}:\quad \operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} })-\operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{j})\succeq 0,}
{\displaystyle \sum _{i=1}^{n}\left\|\mathbf {x} _{i}-V_{k}\mathbf {x} _{i}^{k}\right\|^{2}={\begin{cases}\sum _{j=k+1}^{p}\lambda _{j}&1\leqslant k<p\\0&k=p\end{cases}}}
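This identity (the total squared reconstruction error of the rank-k PCA approximation equals the sum of the discarded eigenvalues) can be checked numerically on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, k = 30, 5, 2
X = rng.normal(size=(n, p))
X -= X.mean(axis=0)

lam, V = np.linalg.eigh(X.T @ X)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]
Vk = V[:, :k]

# sum_i || x_i - V_k x_i^k ||^2, with x_i^k = V_k^T x_i
recon_err = ((X - X @ Vk @ Vk.T) ** 2).sum()
print(np.allclose(recon_err, lam[k:].sum()))
```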
{\displaystyle \operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} })=\operatorname {MSE} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} }),}

This issue can be effectively addressed by using a PCR estimator obtained by excluding the principal components corresponding to these small eigenvalues.

{\displaystyle \operatorname {MSE} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} })-\operatorname {MSE} ({\widehat {\boldsymbol {\beta }}}_{k})\succeq 0}
{\displaystyle \operatorname {MSE} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} })-\operatorname {MSE} ({\widehat {\boldsymbol {\beta }}}_{k})\succeq 0}
{\displaystyle \operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} })-\operatorname {Var} ({\widehat {\boldsymbol {\beta }}}_{k})\succeq 0}

through substantially lowering the effective number of parameters characterizing the underlying model. This can be particularly useful in settings with

Thus, when only a proper subset of all the principal components are selected for regression, the PCR estimator so obtained is based on a hard form of

evaluated at the corresponding pairs of covariate vectors. The pairwise inner products so obtained may therefore be represented in the form of a
issue whereby the principal components corresponding to the smaller eigenvalues may be ignored as long as the threshold limit is maintained.
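A minimal sketch of such thresholding (the function name and the threshold value are illustrative, not prescribed by the text):

```python
import numpy as np

def select_k(eigenvalues, threshold=0.90):
    """Smallest k for which the k largest eigenvalues account for at
    least `threshold` of the total sum (i.e. of the total variance)."""
    lam = np.sort(np.asarray(eigenvalues, dtype=float))[::-1]
    frac = np.cumsum(lam) / lam.sum()
    return int(np.searchsorted(frac, threshold) + 1)

print(select_k([5.0, 3.0, 1.5, 0.4, 0.1], threshold=0.90))
```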
, so that one can be linearly predicted from the others with a non-trivial degree of accuracy. Consequently, the columns of the data matrix

Since the PCR estimator typically uses only a subset of all the principal components for regression, it can be viewed as some sort of a

(with dimension equal to the total number of covariates) for estimating the regression coefficients characterizing the original model.

{\displaystyle {\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} }=(\mathbf {X} ^{T}\mathbf {X} )^{-1}\mathbf {X} ^{T}\mathbf {Y} }

with the understanding that instead of the original set of covariates, the predictors are now given by the vector (potentially

) as covariates in the model and discards the remaining low variance components (corresponding to the lower eigenvalues of

, Park (1981) proposes the following guideline for selecting the principal components to be used for regression: Drop the

{\displaystyle \Lambda _{(p-k)}^{1/2}=\operatorname {diag} \left(\lambda _{k+1}^{1/2},\ldots ,\lambda _{p}^{1/2}\right).}

The fitting process for obtaining the PCR estimator involves regressing the response vector on the derived data matrix

. Also, through appropriate selection of the principal components to be used for regression, PCR can lead to efficient

Eric Bair; Trevor Hastie; Debashis Paul; Robert Tibshirani (2006). "Prediction by Supervised Principal Components".

the outcome, the principal components with low variances may also be important, in some cases even more important.

is such that the excluded principal components correspond to the smaller eigenvalues, thereby resulting in lower

Sung H. Park (1981). "Collinearity and Optimal Restrictions on Regression Parameters for Estimating Responses".

on the low variance components nullifying their contribution completely in the original model. In contrast, the

Given the constrained minimization problem as defined above, consider the following generalized version of it:

, the number of principal components to be used, through appropriate thresholding on the cumulative sum of the

Ildiko E. Frank & Jerome H. Friedman (1993). "A Statistical View of Some Chemometrics Regression Tools".

under such situations. The variance expressions above indicate that these small eigenvalues have the maximum

data matrix corresponding to the observations for the selected covariates. The number of covariates used:

Practical implementation of this guideline of course requires estimates for the unknown model parameters

that usually retains the high variance principal components (corresponding to the higher eigenvalues of

{\displaystyle {\widehat {\boldsymbol {\beta }}}_{p}={\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} }}
{\displaystyle \;\operatorname {Var} \left({\boldsymbol {\varepsilon }}\right)=\sigma ^{2}I_{n\times n}}

. One typically uses only a subset of all the principal components for regression, making PCR a kind of

{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}=V_{k}{\widehat {\gamma }}_{k}\in \mathbb {R} ^{p}}
{\displaystyle {\widehat {\gamma }}_{k}=(W_{k}^{T}W_{k})^{-1}W_{k}^{T}\mathbf {Y} \in \mathbb {R} ^{k}}

losing its full column rank structure. More quantitatively, one or more of the smaller eigenvalues of

{\displaystyle \Lambda _{p\times p}=\operatorname {diag} \left[\lambda _{1},\ldots ,\lambda _{p}\right]=\operatorname {diag} \left[\delta _{1}^{2},\ldots ,\delta _{p}^{2}\right]=\Delta ^{2}}

Now regress the observed vector of outcomes on the selected principal components as covariates, using

{\displaystyle \mathbf {X} _{n\times p}=\left(\mathbf {x} _{1},\ldots ,\mathbf {x} _{n}\right)^{T}}
{\displaystyle \lambda _{j}<(p\sigma ^{2})/{\boldsymbol {\beta }}^{T}{\boldsymbol {\beta }}.}
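Park's condition can be applied componentwise; the sketch below (a hypothetical helper, with β and σ² taken as given rather than estimated) flags which principal components the guideline would drop:

```python
import numpy as np

def park_drop(lam, beta, sigma2):
    """Park (1981) guideline: drop the j-th principal component
    iff lambda_j < p * sigma^2 / (beta^T beta)."""
    beta = np.asarray(beta, dtype=float)
    cutoff = len(beta) * sigma2 / (beta @ beta)
    return [bool(l < cutoff) for l in np.asarray(lam, dtype=float)]

# Three components; only the one with a very small eigenvalue is dropped.
print(park_drop([4.0, 1.0, 0.01], beta=[1.0, 1.0, 1.0], sigma2=0.5))
```

In practice the unknown β and σ² would be replaced by estimates, e.g. the unrestricted least squares estimates mentioned in the text.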
In PCR, instead of regressing the dependent variable on the explanatory variables directly, the

for predicting the outcome based on the covariates. However, it can be easily generalized to a

{\displaystyle \mathbf {Y} =\mathbf {X} {\boldsymbol {\beta }}+{\boldsymbol {\varepsilon }},\;}

(or univariate regressions) wherein the outcome vector is regressed separately on each of the

In order to ensure efficient estimation and prediction performance of PCR as an estimator of

, based on using the mean squared error as the performance criteria. In addition, any given

{\displaystyle \operatorname {E} \left({\boldsymbol {\varepsilon }}\right)=\mathbf {0} \;}
{\displaystyle k\in \{1,\ldots ,p\},V_{(p-k)}^{T}{\boldsymbol {\beta }}\neq \mathbf {0} }
{\displaystyle \sum _{i=1}^{n}\left\|\mathbf {x} _{i}-L_{k}\mathbf {z} _{i}\right\|^{2}}

of K' is obtained. Kernel PCR then proceeds by (usually) selecting a subset of all the
problem which arises when two or more of the explanatory variables are close to being
of the explanatory variables) are selected as regressors. However, for the purpose of
criteria. Often, the principal components are also selected based on their degree of

Jolliffe, Ian T. (1982). "A note on the Use of Principal Components in Regression".

of the selected principal component directions, and consequently restricts it to be

and then regressing the outcome vector on a selected subset of the eigenvectors of

{\displaystyle \mathbf {x} _{i}^{k}=V_{k}^{T}\mathbf {x} _{i}\in \mathbb {R} ^{k}}
{\displaystyle \mathbf {x} _{i}\in \mathbb {R} ^{p}\;\;\forall \;\;1\leq i\leq n}

(the eigenvectors corresponding to the selected principal components) to get the

{\displaystyle \mathbf {z} _{i}=\mathbf {x} _{i}^{k}=V_{k}^{T}\mathbf {x} _{i},}

. PCR is another technique that may be used for the same purpose of estimating
{\displaystyle \mathbf {Y} _{n\times 1}=\left(y_{1},\ldots ,y_{n}\right)^{T}}

this vector back to the scale of the actual covariates, using the selected

selected principal components as covariates is equivalent to carrying out

that correspond to the observations for these covariates tend to become

of the PCR estimator has a lower variance compared to that of the same

Unlike the criteria based on the cumulative sum of the eigenvalues of

When all the principal components are selected for regression so that

{\displaystyle V_{(p-k)}^{T}{\boldsymbol {\beta }}_{*}=\mathbf {0} ,}

among the feature maps for the observed covariate vectors and these

denotes the unknown parameter vector of regression coefficients and

{\displaystyle \mathbf {z} _{i}\in \mathbb {R} ^{k}(1\leq i\leq n)}

denote the vector of estimated regression coefficients obtained by

. It turns out that it is only sufficient to compute the pairwise

{\displaystyle {\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} }}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{\mathrm {ols} }}

Fundamental characteristics and applications of the PCR estimator

{\displaystyle V_{(p-k)}^{T}{\boldsymbol {\beta }}=\mathbf {0} }

denote the corresponding orthonormal set of eigenvectors. Then,

{\displaystyle \lambda _{1}\geq \cdots \geq \lambda _{p}\geq 0}
{\displaystyle L_{(p-k)}^{*}=V_{(p-k)}\Lambda _{(p-k)}^{1/2},}
{\displaystyle \Delta _{p\times p}=\operatorname {diag} \left[\delta _{1},\ldots ,\delta _{p}\right]}

, based on the data. One frequently used approach for this is

. This centering step is crucial (at least for the columns of
The PCR method may be broadly divided into three major steps:

{\displaystyle \delta _{1}\geq \cdots \geq \delta _{p}\geq 0}

) to get a vector of estimated regression coefficients (with

In addition, the principal components are obtained from the

PCR starts by performing a PCA on the centered data matrix

{\displaystyle {\boldsymbol {\beta }}\in \mathbb {R} ^{p}}

symmetric non-negative definite matrix also known as the

{\displaystyle {\widehat {\boldsymbol {\beta }}}_{L^{*}}}

Optimality of PCR among a class of regularized estimators

to each other. Thus in the regression step, performing a

denoting the non-negative eigenvalues (also known as the

turns out to be a special case of this setting when the

The classical PCR method as described above is based on

and the subsequent number of principal components used:

on the variance of the least squares estimator, thereby

matrix with orthonormal columns consisting of the first

estimator exerts a smooth shrinkage effect through the

may be viewed as the data matrix obtained by using the

(or univariate regressions) separately on each of the

equal to the number of selected principal components).
to be used for regression are usually selected using

{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{L}}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{L}}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}}
{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}}
{\displaystyle W_{p}=\mathbf {X} V_{p}=\mathbf {X} V}
in the covariates, but instead it can belong to the

achieves the minimum prediction error is given by:

the estimator significantly when they are close to

Journal of the Royal Statistical Society, Series C
Then the optimal choice of the restriction matrix

denotes the regularized solution to the following

dimensional principal components provide the best
2006 a variant of the classical PCR known as the

{\displaystyle {\widehat {\boldsymbol {\beta }}}}
{\displaystyle V_{(p-k)}=\left[\mathbf {v} _{k+1},\ldots ,\mathbf {v} _{p}\right]_{p\times (p-k)}.}

The constraint may be equivalently written as:
get(s) very close or become(s) exactly equal to

and the number of covariates respectively, with

Journal of the American Statistical Association

covariates taken one at a time. Then, for some

Quite clearly, the resulting optimal estimator

principal component directions as columns, and

Since the ordinary least squares estimator is

that constrains the resulting solution to the

, then the PCR estimator is equivalent to the

selected principal components as a covariate.
denotes any full column rank matrix of order

{\displaystyle \mathbf {X} \mathbf {X} ^{T}}
{\displaystyle \mathbf {X} \mathbf {X} ^{T}}
{\displaystyle \mathbf {X} ^{T}\mathbf {X} }
{\displaystyle \mathbf {X} ^{T}\mathbf {X} }
{\displaystyle \mathbf {X} ^{T}\mathbf {X} }
{\displaystyle \mathbf {X} ^{T}\mathbf {X} }
{\displaystyle \mathbf {X} ^{T}\mathbf {X} }
{\displaystyle \mathbf {X} \mathbf {v} _{j}}
{\displaystyle \mathbf {X} ^{T}\mathbf {X} }
{\displaystyle \mathbf {X} ^{T}\mathbf {X} }
{\displaystyle {\boldsymbol {\varepsilon }}}

denotes the corresponding observed outcome.

One major use of PCR lies in overcoming the
setting, the vector of covariates is first

issue through its very construction is the

matrix having orthonormal columns, for any

, two or more of the covariates are highly

The primary goal is to obtain an efficient

denote the vector of observed outcomes and

of the outcome based on the assumed model.

Often the principal components with higher

The Oxford Dictionary of Statistical Terms

is then simply given by the PCR estimator

{\displaystyle \mathbf {X} =U\Delta V^{T}}
(PCA). More specifically, PCR is used for

denote the corresponding solution. Thus

dimensional derived covariates. Thus the

of the ordinary least squares estimator.

indicates that a square symmetric matrix

. This is easily seen from the fact that

instead of using the original covariates

denotes the vector of random errors with

of the explanatory variables are used as

of the outcome vector on these selected

setting can now be implemented by first

associated with any arbitrary (possibly

denotes one set of observations for the
. Thus in that case, the corresponding
{\displaystyle W_{k}=\mathbf {X} V_{k}}

) since PCR involves the use of PCA on

are simply given by the values of the

without ever explicitly computing the

actually enables us to operate in the

{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}

for which the corresponding estimator

procedure. More specifically, for any
{\displaystyle k\in \{1,\ldots ,p-1\}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}

covariate and the respective entry of

. Harvard University Press. pp.

, corresponds to one feature (may be

{\displaystyle L_{k}\mathbf {z} _{i}}
{\displaystyle k\in \{1,\ldots ,p\}.}

principal components as its columns.
{\displaystyle k\in \{1,\ldots ,m\}}
{\displaystyle m\in \{1,\ldots ,p\}}
{\displaystyle m\in \{1,\ldots ,p\}}
{\displaystyle k\in \{1,\ldots ,p\}}

PCR may also be used for performing

{\displaystyle k\in \{1,\ldots ,p\}}
{\displaystyle k\in \{1,\ldots ,p\}}
{\displaystyle k\in \{1,\ldots ,p\}}
{\displaystyle k\in \{1,\ldots ,p\}}
{\displaystyle k\in \{1,\ldots ,p\}}
{\displaystyle j\in \{1,\ldots ,p\}}
{\displaystyle \sigma ^{2}>0\;\;}

principal component if and only if

each of the covariate observations
since the principal components are

, this technique is also known as

so obtained and then performing a
principal components is given by:

regression of the response vector
Following centering, the standard
Generalization to kernel settings

In general, PCR is essentially a

{\displaystyle 1\leqslant k<p}
{\displaystyle 1\leqslant k<p}
Partial least squares regression

the actual covariates using the

Reproducing Kernel Hilbert Space
{\displaystyle \mathbf {x} _{i}}
{\displaystyle \mathbf {v} _{j}}
denote the size of the observed
standard linear regression model
left and right singular vectors

Least-squares spectral analysis
Generalized estimating equation
Multinomial logistic regression
Vector generalized linear model
10.1080/00401706.1993.10485033
{\displaystyle V\Lambda V^{T}}
{\displaystyle V_{p\times p}=[\mathbf {v} _{1},\ldots ,\mathbf {v} _{p}]}
{\displaystyle U_{n\times p}=[\mathbf {u} _{1},\ldots ,\mathbf {u} _{p}]}

so that all of them have zero

of observed covariates where,

principal component regression
(K, say) with respect to the

). Thus it exerts a discrete

Now suppose that for a given

compared to that of the same

{\displaystyle p\times (p-k)}

, the final PCR estimator of
2605:{\displaystyle j^{\text{th}}}
2579:principal component direction
484:procedure and also a type of
159:Nonlinear mixed-effects model
9556:Principal component analysis
9266:so obtained is known as the
8934:{\displaystyle \mathbf {X} }
6408:to the excluded directions.
5883:may be achieved by choosing
5672:{\displaystyle \mathbf {X} }
5657:to the observed data matrix
5484:{\displaystyle L_{k}=V_{k},}
5350:Then, it can be shown that
5185:Suppose now that we want to
4980:{\displaystyle \mathbf {X} }
4954:{\displaystyle \mathbf {X} }
4927:Addressing multicollinearity
4614:
4365:
4166:
3297:{\displaystyle \mathbf {Y} }
2635:{\displaystyle \lambda _{j}}
2107:{\displaystyle \mathbf {X} }
1923:{\displaystyle \mathbf {X} }
1764:{\displaystyle \mathbf {X} }
1747:singular value decomposition
1697:{\displaystyle \mathbf {X} }
1520:{\displaystyle \mathbf {X} }
1179:{\displaystyle \mathbf {X} }
1157:{\displaystyle \mathbf {Y} }
1114:{\displaystyle \mathbf {X} }
1092:{\displaystyle \mathbf {X} }
1062:{\displaystyle \mathbf {X} }
1020:{\displaystyle \mathbf {Y} }
992:{\displaystyle \mathbf {Y} }
947:{\displaystyle \mathbf {X} }
499:corresponding to the higher
455:principal component analysis
7:
9549:
9300:underlying regression model
8739:{\displaystyle \sigma ^{2}}
7720:We have already seen that
1505:regression which, assuming
531:high-dimensional covariates
453:technique that is based on
361:Mean and predicted response
10:
9850:
9797:Principles of Econometrics
9746:10.1198/016214505000000628
9420:standard linear regression
8470:it is still possible that
5491:the matrix with the first
4915:. Consequently, any given
4884:{\displaystyle A\succeq 0}
3594:multiple linear regression
1904:denoting the non-negative
508:variance-covariance matrix
154:Linear mixed-effects model
9470:{\displaystyle n\times n}
9372:{\displaystyle n\times n}
9306:setting is essentially a
9286:) of the covariates. The
9081:{\displaystyle n\times m}
8975:simple linear regressions
8269:. Then the corresponding
7567:, then the corresponding
6925:{\displaystyle L_{(p-k)}}
6597:{\displaystyle L_{(p-k)}}
5131:{\displaystyle p\times k}
3638:simple linear regressions
3397:based on using the first
2961:{\displaystyle n\times k}
2782:{\displaystyle p\times k}
2118:The principal components:
838:denote the corresponding
320:Least absolute deviations
9290:is then assumed to be a
9221:positive-definite kernel
9204:need not necessarily be
9176:are usually selected by
8910:regularization parameter
8127:would also have a lower
7510:, we additionally have:
6027:constrained minimization
3815:and also observing that
2968:matrix having the first
2515:respectively denote the
2088:of vectors denoting the
68:Generalized linear model
9392:appropriately centering
9308:linear regression model
9225:linear regression model
9194:linear regression model
The principle

The PCR method may be broadly divided into three major steps:

1. Perform PCA on the observed data matrix for the explanatory variables to obtain the principal components, and then (usually) select, based on some appropriate criteria, a subset of the principal components so obtained for further use.
2. Regress the observed vector of outcomes on the selected principal components as covariates, using ordinary least squares regression (linear regression) to get a vector of estimated regression coefficients, with dimension equal to the number of selected principal components.
3. Transform this vector back to the scale of the actual covariates, using the selected PCA loadings (the eigenvectors corresponding to the selected principal components), to get the final PCR estimator, with dimension equal to the total number of covariates, for estimating the regression coefficients characterizing the original model.
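The three steps above can be sketched directly with NumPy. This is a minimal illustration and not part of the article; all variable names and the simulated data are ours:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 5, 2          # sample size, number of covariates, retained components

X = rng.standard_normal((n, p))
Y = X @ np.arange(1.0, p + 1) + rng.standard_normal(n)

# Step 0: centre the outcome and each column of X (zero empirical means).
X = X - X.mean(axis=0)
Y = Y - Y.mean()

# Step 1: PCA via the SVD X = U diag(delta) V^T; keep the first k loadings.
U, delta, Vt = np.linalg.svd(X, full_matrices=False)
V_k = Vt[:k].T               # p x k matrix of selected PCA loadings

# Step 2: OLS regression of Y on the first k principal components W_k = X V_k.
W_k = X @ V_k
gamma_k = np.linalg.lstsq(W_k, Y, rcond=None)[0]

# Step 3: transform back to the scale of the covariates.
beta_pcr = V_k @ gamma_k     # the PCR estimator of beta (a p-vector)
```

Note that the columns of `W_k` are orthogonal, so `W_k.T @ W_k` is the diagonal matrix of the top k squared singular values.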
Details of the method

Data representation: Let Y (n×1) denote the vector of observed outcomes and X (n×p) denote the corresponding data matrix of observed covariates, where n and p denote the size of the observed sample and the number of covariates respectively, with n ≥ p. Each of the n rows of X denotes one set of observations for the p-dimensional covariate, and the respective entry of Y denotes the corresponding observed outcome.

Data pre-processing: Assume that Y and each of the p columns of X have already been centered so that all of them have zero empirical means. This centering step is crucial (at least for the columns of X), since PCA is sensitive to the centering of the data.

Underlying model: Following centering, the standard linear regression model for Y on X can be represented as Y = X β + ε, where β ∈ R^p denotes the unknown parameter vector of regression coefficients and ε denotes the vector of random errors with E(ε) = 0 and Var(ε) = σ² I (n×n) for some unknown variance parameter σ² > 0.

Objective: The primary goal is to obtain an efficient estimator of the parameter β based on the data. One frequently used approach is ordinary least squares regression which, assuming X has full column rank, gives the unbiased estimator β̂_ols = (X^T X)^{-1} X^T Y of β. PCR is another technique that may be used for the same purpose of estimating β.

PCA step: PCR starts by performing a PCA on the centered data matrix X. For this, let X = U Δ V^T denote the singular value decomposition of X, where Δ (p×p) = diag(δ_1, …, δ_p) with δ_1 ≥ … ≥ δ_p ≥ 0 denoting the non-negative singular values of X, while the columns of U (n×p) = [u_1, …, u_p] and V (p×p) = [v_1, …, v_p] are both orthonormal sets of vectors denoting the left and right singular vectors of X respectively.

The principal components: V Λ V^T gives the spectral decomposition of X^T X, where Λ (p×p) = diag(λ_1, …, λ_p) = diag(δ_1², …, δ_p²) with λ_1 ≥ … ≥ λ_p ≥ 0 denoting the non-negative eigenvalues (also known as the principal values) of X^T X, while the columns of V denote the corresponding orthonormal set of eigenvectors. Then, X v_j and v_j respectively denote the j-th principal component and the j-th principal component direction (or PCA loading) corresponding to the j-th largest principal value λ_j, for each j ∈ {1, …, p}.

Derived covariates: For any k ∈ {1, …, p}, let V_k denote the p×k matrix whose columns are the first k columns of V, and let W_k = X V_k = [X v_1, …, X v_2, …, X v_k] denote the n×k matrix having the first k principal components as its columns. W = W_p = X V then denotes the full matrix of transformed (derived) covariates.

The PCR estimator: Let γ̂_k = (W_k^T W_k)^{-1} W_k^T Y denote the vector of estimated regression coefficients obtained by ordinary least squares regression of the response vector Y on the data matrix W_k. Then, for any k ∈ {1, …, p}, the final PCR estimator of β based on using the first k principal components is given by β̂_k = V_k γ̂_k.
Fundamental characteristics and applications of the PCR estimator

Two basic properties

The regression step involves ordinary least squares regression of the outcome on the k selected principal components. Since the principal components (the columns of W_k) are mutually orthogonal to each other, this regression is equivalent to k independent simple linear regressions (or univariate regressions) of the outcome on each of the k selected principal components separately. When all the principal components are selected for regression, so that k = p, the PCR estimator is equivalent to the ordinary least squares estimator: β̂_p = β̂_ols.

Variance reduction

For any k ∈ {1, …, p}, the variance of β̂_k is given by

Var(β̂_k) = σ² V_k (W_k^T W_k)^{-1} V_k^T = σ² V_k diag(λ_1^{-1}, …, λ_k^{-1}) V_k^T = σ² Σ_{j=1}^{k} v_j v_j^T / λ_j.

In particular: Var(β̂_ols) = σ² Σ_{j=1}^{p} v_j v_j^T / λ_j. Hence for all k ∈ {1, …, p−1}:

Var(β̂_ols) − Var(β̂_k) = σ² Σ_{j=k+1}^{p} v_j v_j^T / λ_j ⪰ 0,

since each v_j v_j^T ⪰ 0 and each λ_j > 0 (writing A ⪰ 0 to mean that the matrix A is non-negative definite). Thus, for all k ∈ {1, …, p}, Var(β̂_ols) − Var(β̂_k) ⪰ 0. Consequently, any given linear form of the PCR estimator has a lower variance compared to that of the same linear form of the ordinary least squares estimator.
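Both basic properties are easy to verify numerically. In the sketch below (ours, not from the article), the orthogonality of the principal components reduces the regression step to componentwise univariate fits, and with k = p the PCR estimator reproduces ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 60, 4
X = rng.standard_normal((n, p)); X -= X.mean(axis=0)
Y = X @ rng.standard_normal(p) + rng.standard_normal(n); Y -= Y.mean()

U, delta, Vt = np.linalg.svd(X, full_matrices=False)
W = X @ Vt.T                        # all p principal components (orthogonal columns)

# Orthogonality: regressing Y on W decouples into p simple linear regressions,
# gamma_j = <w_j, Y> / ||w_j||^2, with ||w_j||^2 = lambda_j = delta_j^2.
gamma = (W.T @ Y) / (delta ** 2)

# With k = p the PCR estimator coincides with ordinary least squares.
beta_pcr_full = Vt.T @ gamma
beta_ols = np.linalg.lstsq(X, Y, rcond=None)[0]
```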
Addressing multicollinearity

Under multicollinearity, two or more of the covariates are highly correlated, so that one can be linearly predicted from the others with a non-trivial degree of accuracy. Consequently, the columns of the data matrix X that correspond to the observations for these covariates tend to become linearly dependent and therefore, X tends to become rank deficient, losing its full column rank structure. More quantitatively, one or more of the smaller eigenvalues of X^T X get(s) very close to, or become(s) exactly equal to, 0 under such situations. The variance expressions above indicate that these small eigenvalues have the maximum inflation effect on the variance of the least squares estimator, thereby destabilizing the estimator significantly when they are close to 0. This issue can be effectively addressed through using a PCR estimator obtained by excluding the principal components corresponding to these small eigenvalues.
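The variance-inflation mechanism can be seen directly from the variance formula above. The following sketch (ours, with simulated near-collinear data) compares the total variance, trace of σ² Σ_j v_j v_j^T / λ_j, of the least squares estimator with that of a PCR estimator that drops the near-zero eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2 = 200, 1.0
z = rng.standard_normal(n)
# Two nearly collinear covariates plus one independent covariate.
X = np.column_stack([z, z + 1e-3 * rng.standard_normal(n),
                     rng.standard_normal(n)])
X -= X.mean(axis=0)

lam, V = np.linalg.eigh(X.T @ X)        # eigenvalues in ascending order
lam, V = lam[::-1], V[:, ::-1]          # sort descending

# Var(beta_hat_k) = sigma^2 * sum_{j<=k} v_j v_j^T / lambda_j
def pcr_cov(k):
    return sigma2 * sum(np.outer(V[:, j], V[:, j]) / lam[j] for j in range(k))

var_ols = np.trace(pcr_cov(3))   # includes the tiny eigenvalue: huge variance
var_pcr = np.trace(pcr_cov(2))   # drops it: stable variance
```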
Dimension reduction

PCR may also be used for performing dimension reduction. To see this, let L_k denote any p×k matrix with orthonormal columns, for any k ∈ {1, …, p}. Suppose now that we want to approximate each of the covariate observations x_i through the rank-k linear transformation L_k^T x_i for some k-dimensional derived covariate (k ≤ p). Then, it can be shown that the total reconstruction error

Σ_{i=1}^{n} ‖ x_i − L_k L_k^T x_i ‖²

is minimized at L_k = V_k, the matrix with the first k principal component directions as columns, and the corresponding optimal k-dimensional derived covariates are given by the first k principal components. The first k principal components therefore provide the best linear approximation of rank k to the observed data matrix X in the sense of minimizing the reconstruction error, and the minimum value of the error is given by Σ_{j=k+1}^{p} λ_j. Thus any potential dimension reduction may be achieved by choosing k, the number of principal components to be used, through appropriate thresholding on the cumulative sum of the eigenvalues of X^T X: since the smaller eigenvalues do not contribute significantly to the cumulative sum, the corresponding principal components may be discarded without incurring much loss of information.
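The optimality of V_k and the value of the minimum reconstruction error can be checked numerically. In the sketch below (ours), the error achieved by the PCA loadings equals the sum of the discarded eigenvalues, while a random orthonormal L_k does strictly worse:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, k = 80, 5, 2
X = rng.standard_normal((n, p)); X -= X.mean(axis=0)

U, delta, Vt = np.linalg.svd(X, full_matrices=False)
lam = delta ** 2                          # eigenvalues of X^T X
V_k = Vt[:k].T

# Rank-k reconstruction through L_k = V_k and its total squared error.
X_hat = X @ V_k @ V_k.T
err_pca = np.sum((X - X_hat) ** 2)        # equals lambda_{k+1} + ... + lambda_p

# Any other orthonormal L_k (here a random one via QR) does no better.
Q, _ = np.linalg.qr(rng.standard_normal((p, k)))
err_other = np.sum((X - X @ Q @ Q.T) ** 2)
```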
Regularization effect

Since the PCR estimator typically uses only a subset of all the principal components for regression, it can be viewed as some sort of a regularized procedure. More specifically, for any k ∈ {1, …, p}, the PCR estimator β̂_k denotes the solution to the following constrained minimization problem:

minimize over β* ∈ R^p:  ‖ Y − X β* ‖²,  subject to  L_{(p−k)}^T β* = 0,

where L_{(p−k)} denotes the p×(p−k) matrix whose columns are the last p−k principal component directions [v_{k+1}, …, v_p]. The constraint thus forces the solution to be orthogonal to the excluded directions; equivalently, the minimization is performed over the subspace spanned by the first k principal component directions. The number k of retained components plays the role of the tuning parameter controlling the amount of regularization: smaller values of k correspond to heavier regularization, while k = p corresponds to no regularization at all (ordinary least squares).
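The constrained-minimization characterization can be verified numerically. In this sketch (ours), the PCR estimator satisfies the orthogonality constraint exactly, and no other feasible point (any vector of the form V_k a) achieves a smaller residual sum of squares:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, k = 50, 4, 2
X = rng.standard_normal((n, p)); X -= X.mean(axis=0)
Y = X @ rng.standard_normal(p) + rng.standard_normal(n); Y -= Y.mean()

U, delta, Vt = np.linalg.svd(X, full_matrices=False)
V_k, V_rest = Vt[:k].T, Vt[k:].T

W_k = X @ V_k
beta_k = V_k @ np.linalg.lstsq(W_k, Y, rcond=None)[0]   # PCR estimator

# The constraint holds: beta_k is orthogonal to the excluded directions.
constraint = V_rest.T @ beta_k                          # ~ zero vector

# beta_k minimises ||Y - X b||^2 over the constrained set {V_k a : a in R^k}.
rss_pcr = np.sum((Y - X @ beta_k) ** 2)
a = rng.standard_normal(k)                              # an arbitrary feasible point
rss_other = np.sum((Y - X @ (V_k @ a)) ** 2)
```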
Efficiency of the PCR estimator

We have already seen that for any k ∈ {1, …, p}, Var(β̂_ols) − Var(β̂_k) ⪰ 0, so the PCR estimator never has a higher variance than the least squares estimator. However, unlike the least squares estimator, the PCR estimator is biased in general, since it nullifies the contribution of the excluded principal component directions. Now suppose that for a given k, the true parameter satisfies L_{(p−k)}^T β = 0, i.e., β actually lies in the span of the first k principal component directions. Then the corresponding PCR estimator β̂_k is unbiased for β, and since it also has a smaller variance, we additionally have:

MSE(β̂_ols) − MSE(β̂_k) ⪰ 0  for that particular k,

where MSE denotes the mean squared error (matrix) of the corresponding estimator. This then implies that β̂_k would be a more efficient estimator of β compared to β̂_ols. Even when the constraint does not hold exactly, it is still possible that the squared bias introduced by excluding some of the principal components is more than compensated by the accompanying reduction in variance, so that the PCR estimator would also have a lower mean squared error compared to that of the least squares estimator; this is especially likely when the excluded eigenvalues are small, since the corresponding variance terms 1/λ_j are then large.
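A small Monte Carlo simulation (ours, under the favourable assumption that the true β lies in the span of the leading principal component direction) illustrates the efficiency gain:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, k, reps = 100, 5, 2, 300
z = rng.standard_normal(n)
# Strongly correlated design: small eigenvalues inflate the OLS variance.
X = np.column_stack([z + 0.05 * rng.standard_normal(n) for _ in range(p)])
X -= X.mean(axis=0)

U, delta, Vt = np.linalg.svd(X, full_matrices=False)
V_k = Vt[:k].T
beta = Vt[0]                      # true beta lies in the span of v_1: PCR is unbiased

mse_ols = mse_pcr = 0.0
for _ in range(reps):
    Y = X @ beta + rng.standard_normal(n)
    b_ols = np.linalg.lstsq(X, Y, rcond=None)[0]
    b_pcr = V_k @ np.linalg.lstsq(X @ V_k, Y, rcond=None)[0]
    mse_ols += np.sum((b_ols - beta) ** 2) / reps
    mse_pcr += np.sum((b_pcr - beta) ** 2) / reps
```

Here the gap between the two average squared errors is exactly the dropped variance terms σ² Σ_{j>k} 1/λ_j, up to Monte Carlo noise.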
Shrinkage effect of PCR

In general, PCR is essentially a shrinkage estimator that usually retains the high-variance principal components (corresponding to the larger eigenvalues of X^T X) as covariates in the model, and discards the remaining low-variance components (corresponding to the lower eigenvalues of X^T X). Thus it exerts a discrete shrinkage effect on the low-variance components, nullifying their contribution completely in the original model. In contrast, the ridge regression estimator operates on the same eigen-decomposition of X^T X but exerts a smooth shrinkage effect through its regularization parameter, shrinking every component continuously, the low-variance ones more heavily than the high-variance ones.

Generalizations: supervised selection of the principal components

In the classical PCR described above, the principal components to be used for regression are usually selected by cross-validation or by examining the eigenvalues alone, without any reference to the outcome. Supervised variants of PCR instead involve the outcome in the selection step. One such approach, known as supervised PCR, first screens the covariates by performing p simple linear regressions (univariate regressions) of the outcome on each of the p covariates separately, retains the m covariates (for some m ≤ p) that appear most strongly associated with the outcome, and then carries out PCA and the subsequent regression step based on the resulting n×m data matrix of the selected covariates only. A related alternative is the partial least squares (PLS) estimator, whose derived covariates are constructed using both the outcome and the covariates.
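In principal component coordinates, both estimators multiply each component of the least squares solution by a shrinkage factor: 1 or 0 for PCR (discrete), λ_j / (λ_j + δ) for ridge regression with parameter δ (smooth). The sketch below (ours) makes this concrete and checks the ridge factors against the closed-form ridge estimator:

```python
import numpy as np

rng = np.random.default_rng(6)
n, p, k, delta_reg = 100, 5, 3, 10.0
X = rng.standard_normal((n, p)); X -= X.mean(axis=0)
Y = X @ rng.standard_normal(p) + rng.standard_normal(n); Y -= Y.mean()

U, sv, Vt = np.linalg.svd(X, full_matrices=False)
lam = sv ** 2
c = Vt @ np.linalg.lstsq(X, Y, rcond=None)[0]   # OLS solution in PC coordinates

# PCR: a discrete 0/1 shrinkage of each component.
f_pcr = (np.arange(p) < k).astype(float)
beta_pcr = Vt.T @ (f_pcr * c)

# Ridge: a smooth shrinkage factor lambda_j / (lambda_j + delta_reg).
f_ridge = lam / (lam + delta_reg)
beta_ridge = Vt.T @ (f_ridge * c)

# beta_ridge agrees with the closed form (X^T X + delta_reg I)^{-1} X^T Y.
beta_ridge_direct = np.linalg.solve(X.T @ X + delta_reg * np.eye(p), X.T @ Y)
```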
6398:regularization
6394:
6393:
6382:
6377:
6374:
6371:
6368:
6365:
6362:
6359:
6354:
6348:
6343:
6338:
6335:
6332:
6327:
6324:
6321:
6316:
6310:
6305:
6300:
6297:
6294:
6291:
6288:
6284:
6269:
6268:
6257:
6253:
6249:
6244:
6239:
6232:
6227:
6224:
6221:
6218:
6215:
6211:
6196:
6195:
6184:
6181:
6176:
6171:
6166:
6163:
6160:
6155:
6152:
6149:
6144:
6139:
6136:
6131:
6126:
6112:
6107:
6101:
6096:
6090:
6086:
6082:
6077:
6068:
6063:
6058:
6053:
6048:
6042:
6012:
6005:
6002:
5978:
5975:
5972:
5969:
5966:
5950:
5947:
5929:
5923:
5918:
5892:
5877:
5876:
5863:
5858:
5855:
5852:
5849:
5847:
5844:
5843:
5840:
5837:
5834:
5831:
5828:
5825:
5821:
5817:
5811:
5806:
5803:
5800:
5797:
5794:
5790:
5786:
5785:
5783:
5778:
5773:
5768:
5762:
5757:
5752:
5745:
5741:
5737:
5732:
5727:
5721:
5714:
5709:
5706:
5703:
5699:
5686:is given by:
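The variance comparison above can be checked numerically. The following sketch (not part of the article; the data, σ² and k are arbitrary illustrative choices) builds both variance matrices from the spectral formula and verifies that their difference is non-negative definite:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k, sigma2 = 50, 5, 2, 1.0

X = rng.standard_normal((n, p))
lam, V = np.linalg.eigh(X.T @ X)      # eigenvalues in ascending order
lam, V = lam[::-1], V[:, ::-1]        # reorder so lambda_1 >= ... >= lambda_p

def var_pcr(k):
    # sigma^2 * sum_{j=1}^{k} v_j v_j^T / lambda_j
    return sigma2 * sum(np.outer(V[:, j], V[:, j]) / lam[j] for j in range(k))

# Var(beta_ols) corresponds to k = p; the difference should be PSD.
diff = var_pcr(p) - var_pcr(k)
eigs = np.linalg.eigvalsh(diff)
print(eigs.min() >= -1e-10)           # non-negative definite up to roundoff
```

The smallest eigenvalue of the difference is non-negative (up to floating-point roundoff), confirming Var(β̂_ols) ⪰ Var(β̂_k).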
However, unlike the OLS estimator, the PCR estimator is in general biased when k < p. Writing V_{(p−k)} = [v_{k+1}, …, v_p] for the matrix of excluded eigenvectors, one obtains

{\displaystyle \operatorname {E} ({\widehat {\boldsymbol {\beta }}}_{k})=\mathbf {V} _{k}\mathbf {V} _{k}^{T}{\boldsymbol {\beta }}={\boldsymbol {\beta }}-\mathbf {V} _{(p-k)}\mathbf {V} _{(p-k)}^{T}{\boldsymbol {\beta }},}

so that β̂_k is unbiased for β if and only if V_{(p−k)}^{T} β = 0, that is, if and only if β is orthogonal to every excluded eigenvector direction.

The mean squared error matrix MSE(β̂_k) combines this bias with the variance derived above. Comparing it with MSE(β̂_ols) trades the variance saved by discarding the components j ∈ {k + 1, …, p} against the squared bias they introduce: the difference MSE(β̂_ols) − MSE(β̂_k) is non-negative definite if and only if

{\displaystyle \sum _{j=k+1}^{p}\lambda _{j}\,(\mathbf {v} _{j}^{T}{\boldsymbol {\beta }})^{2}\leq \sigma ^{2},}

for which λ_j < σ²/(β^{T}β) for each excluded index j is a simple sufficient condition. Under severe multicollinearity, where the smallest eigenvalues are nearly zero, this condition is easily satisfied, and the PCR estimator then improves upon OLS in mean squared error as well.
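A minimal numerical sketch of the estimator itself (illustrative, with synthetic data): regress Y on the first k principal components W_k = X V_k, map back through β̂_k = V_k γ̂_k, and check two consequences of the construction — with k = p, PCR reproduces OLS, and for k < p it equals the projection of the OLS estimator onto span{v_1, …, v_k}:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 100, 6, 3
X = rng.standard_normal((n, p))
Y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

lam, V = np.linalg.eigh(X.T @ X)
V = V[:, ::-1]                          # columns v_1, ..., v_p (eigenvalues descending)

def pcr(k):
    W = X @ V[:, :k]                    # n x k matrix of the first k principal components
    gamma = np.linalg.solve(W.T @ W, W.T @ Y)
    return V[:, :k] @ gamma             # beta_k = V_k gamma_k

beta_ols = np.linalg.solve(X.T @ X, X.T @ Y)
print(np.allclose(pcr(p), beta_ols))                              # k = p recovers OLS
print(np.allclose(pcr(k), V[:, :k] @ (V[:, :k].T @ beta_ols)))    # projection identity
```

The projection identity β̂_k = V_k V_k^T β̂_ols follows directly from the definitions and underlies the bias formula above.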
Optimality of PCR among a class of regularized estimators

The PCR estimator also admits a constrained least squares characterization. For each k ∈ {1, …, p},

{\displaystyle {\widehat {\boldsymbol {\beta }}}_{k}=\arg \min _{{\boldsymbol {\beta }}^{*}\in \mathbb {R} ^{p}\,:\,\mathbf {V} _{(p-k)}^{T}{\boldsymbol {\beta }}^{*}=\mathbf {0} }\left\|\mathbf {Y} -\mathbf {X} {\boldsymbol {\beta }}^{*}\right\|^{2}.}

In other words, PCR is ordinary least squares restricted to the k-dimensional subspace spanned by v_1, …, v_k: the estimator is constrained to be orthogonal to the excluded eigenvector directions, and the exclusion of the low variance principal components thereby acts as a form of regularization. More generally, for 1 ⩽ k < p one may consider minimizers of ‖Y − Xβ*‖² subject to L^{T}β* = 0 for an arbitrary p × (p − k) matrix L of full column rank; the PCR estimator corresponds to the particular choice L = V_{(p−k)}.
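The constrained characterization can be verified numerically (a sketch with synthetic data): β̂_k satisfies the constraint V_{(p−k)}^T β̂_k = 0, and the first-order optimality condition of the restricted least squares problem, V_k^T X^T (Y − X β̂_k) = 0, holds within the permitted subspace:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, k = 80, 5, 2
X = rng.standard_normal((n, p))
Y = rng.standard_normal(n)

lam, V = np.linalg.eigh(X.T @ X)
V = V[:, ::-1]
Vk, Vrest = V[:, :k], V[:, k:]          # kept and excluded eigenvector directions

W = X @ Vk
beta_k = Vk @ np.linalg.solve(W.T @ W, W.T @ Y)

print(np.allclose(Vrest.T @ beta_k, 0))               # constraint: orthogonal to v_{k+1..p}
print(np.allclose(Vk.T @ X.T @ (Y - X @ beta_k), 0))  # stationarity within span(V_k)
```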
Generalization: kernel PCR

Classical PCR operates on the p × p matrix X^{T}X and, like classical PCA, can only exploit linear structure among the covariates. Kernel PCR extends the method to potentially non-linear settings by means of the kernel trick: the observations are implicitly mapped into a high dimensional (potentially infinite dimensional) feature space through the feature map associated with a symmetric, non-negative definite kernel k(·, ·), and PCA is then performed in this feature space. Computationally, all required quantities are obtained from the n × n kernel matrix K with entries K_{ij} = k(x_i, x_j): its eigenvectors yield the coordinates of the principal components of the transformed data, and the regression step proceeds on these derived covariates exactly as in classical PCR. For the linear kernel k(x, x*) = x^{T}x*, the kernel matrix reduces to K = XX^{T} and kernel PCR coincides with classical PCR.
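The linear-kernel case can be checked directly (a sketch with synthetic data): the PCR fit with k components is the projection of Y onto the top-k left singular directions of X, which one obtains equally from the eigenvectors of the n × n kernel matrix K = XX^T:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, k = 60, 4, 2
X = rng.standard_normal((n, p))
Y = rng.standard_normal(n)

# Classical PCR fitted values: regress Y on W_k = X V_k.
lam, V = np.linalg.eigh(X.T @ X)
V = V[:, ::-1]
W = X @ V[:, :k]
fit_classical = W @ np.linalg.solve(W.T @ W, W.T @ Y)

# Kernel PCR with the linear kernel: project Y onto the top-k eigenvectors of K = X X^T.
K = X @ X.T
mu, U = np.linalg.eigh(K)
U = U[:, ::-1][:, :k]
fit_kernel = U @ (U.T @ Y)

print(np.allclose(fit_classical, fit_kernel))   # the two fits coincide
```

Both computations describe the same orthogonal projection, so the eigenvector sign and ordering conventions of `eigh` do not affect the result.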
References

Jolliffe, Ian T. (1982). "A note on the use of principal components in regression". Journal of the Royal Statistical Society, Series C. 31 (3): 300–303. JSTOR 2348005.
Frank, Ildiko E.; Friedman, Jerome H. (1993). "A Statistical View of Some Chemometrics Regression Tools". Technometrics. 35 (2): 109–135.
Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. Oxford University Press. ISBN 0-19-920613-9.
Theil, Henri (1971). Principles of Econometrics. Wiley. pp. 46–55.