CCA is now a cornerstone of multivariate statistics and multi-view learning, and a great number of interpretations and extensions have been proposed, such as probabilistic CCA, sparse CCA, multi-view CCA, Deep CCA, and DeepGeoCCA. Unfortunately, perhaps because of its popularity, the literature can
form (corresponding to datasets and their sample covariance matrices). These two forms are almost exact analogues of each other, which is why their distinction is often overlooked, but they can behave very differently in high dimensional settings. We next give explicit mathematical definitions for
One can also use canonical-correlation analysis to produce a model equation which relates two sets of variables, for example a set of performance measures and a set of explanatory variables, or a set of outputs and a set of inputs. Constraint restrictions can be imposed on such a model to ensure it
Visualization of the results of canonical correlation is usually through bar plots of the coefficients of the two sets of variables for the pairs of canonical variates showing significant correlation. Some authors suggest that they are best visualized by plotting them as heliographs, a circular
{\displaystyle (a_{k},b_{k})={\underset {a,b}{\operatorname {argmax} }}\operatorname {corr} (a^{T}X,b^{T}Y)\quad {\text{ subject to }}\operatorname {cov} (a^{T}X,a_{j}^{T}X)=\operatorname {cov} (b^{T}Y,b_{j}^{T}Y)=0{\text{ for }}j=1,\dots ,k-1}
{\displaystyle \left(c^{T}\Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1/2}\right)(d)\leq \left(c^{T}\Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1/2}\Sigma _{YY}^{-1/2}\Sigma _{YX}\Sigma _{XX}^{-1/2}c\right)^{1/2}\left(d^{T}d\right)^{1/2},}
A typical use for canonical correlation in the experimental context is to take two sets of variables and see what is common among the two sets. For example, in psychological testing, one could take two well established multidimensional
and may also be negative. The regression view of CCA also provides a way to construct a latent variable probabilistic generative model for CCA, with uncorrelated hidden variables representing shared and non-shared variability.
of significance can be treated as special cases of canonical-correlation analysis, which is the general procedure for investigating the relationships between two sets of variables." The method was first introduced by
. By seeing how the MMPI-2 factors relate to the NEO factors, one could gain insight into what dimensions were common between the tests and how much variance was shared. For example, one might find that an
be inconsistent with notation, we attempt to highlight such inconsistencies in this article to help the reader make best use of the existing literature and techniques available.
. Then one seeks vectors maximizing the same correlation subject to the constraint that they are to be uncorrelated with the first pair of canonical variables; this gives the
{\displaystyle \rho \leq {\frac {\left(c^{T}\Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1}\Sigma _{YX}\Sigma _{XX}^{-1/2}c\right)^{1/2}}{\left(c^{T}c\right)^{1/2}}}.}
Knyazev, A.V.; Argentati, M.E. (2002), "Principal Angles between Subspaces in an A-Based Scalar Product: Algorithms and Perturbation Estimates",
Hardoon, D. R.; Szedmak, S.; Shawe-Taylor, J. (2004). "Canonical Correlation Analysis: An Overview with Application to Learning Methods".
Representation-Constrained Canonical Correlation Analysis: A Hybridization of Canonical Correlation and Principal Component Analyses
"CCA-Zoo: A collection of Regularized, Deep Learning based, Kernel, and Probabilistic CCA methods in a scikit-learn style framework"
reflects theoretical requirements or intuitively obvious conditions. This type of model is known as a maximum correlation model.
{\displaystyle \chi ^{2}=-\left(p-1-{\frac {1}{2}}(m+n+1)\right)\ln \prod _{j=i}^{\min\{m,n\}}(1-{\widehat {\rho }}_{j}^{2}),}
{\displaystyle \rho ={\frac {c^{T}\Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1/2}d}{{\sqrt {c^{T}c}}{\sqrt {d^{T}d}}}}.}
, which illustrates that canonical-correlation analysis treats correlated and anticorrelated variables similarly.
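This symmetry can be checked numerically. A minimal sketch with NumPy on synthetic, illustrative data (the helper `first_canonical_corr` is hypothetical, not from the article): the first canonical correlation of a variable with itself and with its negation is 1 in both cases.

```python
import numpy as np

def first_canonical_corr(X, Y):
    """First sample canonical correlation: the largest singular value
    of the whitened cross-covariance matrix."""
    n = X.shape[0]
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    Sxx, Syy = Xc.T @ Xc / n, Yc.T @ Yc / n
    Sxy = Xc.T @ Yc / n
    # Whiten with Cholesky factors; singular values are unchanged by
    # the choice of (inverse) square root.
    Lx, Ly = np.linalg.cholesky(Sxx), np.linalg.cholesky(Syy)
    M = np.linalg.solve(Lx, Sxy) @ np.linalg.inv(Ly).T
    return np.linalg.svd(M, compute_uv=False)[0]

rng = np.random.default_rng(0)
x = rng.standard_normal((500, 1))
print(round(first_canonical_corr(x, x), 6))   # perfectly correlated: 1.0
print(round(first_canonical_corr(x, -x), 6))  # perfectly anticorrelated: also 1.0
```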
Each row can be tested for significance with the following method. Since the correlations are sorted, saying that row
. The CCA-Zoo library implements CCA extensions, such as probabilistic CCA, sparse CCA, multi-view CCA, and Deep CCA.
that have a maximum correlation with each other. T. R. Knapp notes that "virtually all of the commonly encountered
{\displaystyle \Sigma _{YY}^{1/2}=V_{Y}D_{Y}^{1/2}V_{Y}^{\top },\qquad V_{Y}D_{Y}V_{Y}^{\top }=\Sigma _{YY}.}
{\displaystyle \Sigma _{XX}^{1/2}=V_{X}D_{X}^{1/2}V_{X}^{\top },\qquad V_{X}D_{X}V_{X}^{\top }=\Sigma _{XX},}
{\displaystyle \rho ={\frac {a^{T}\Sigma _{XY}b}{{\sqrt {a^{T}\Sigma _{XX}a}}{\sqrt {b^{T}\Sigma _{YY}b}}}}.}
are logically zero (and estimated that way as well), the product over the terms after this point is irrelevant.
for small angles, leading to very inaccurate computation of highly correlated principal vectors in finite
Knapp, T. R. (1978). "Canonical correlation analysis: A general parametric significance-testing system".
10355:"Discriminant Correlation Analysis: Real-Time Feature Level Fusion for Multimodal Biometric Recognition"
1084:- understanding the differences between these objects is crucial for interpretation of the technique.
are simultaneously transformed in such a way that the cross-correlation between the whitened vectors
of decreasing magnitudes. Orthogonality is guaranteed by the symmetry of the correlation matrices.
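The eigenvalue route described here can be sketched in NumPy. The data below are hypothetical: one variable is shared between the two sets, so exactly one canonical correlation is (numerically) 1 while the rest reflect sampling noise.

```python
import numpy as np

def inv_sqrt(S):
    # Symmetric inverse square root via eigendecomposition.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(w ** -0.5) @ V.T

rng = np.random.default_rng(1)
Z = rng.standard_normal((1000, 5))
X, Y = Z[:, :3], Z[:, 2:]              # column 2 appears in both sets
S = np.cov(np.hstack([X, Y]), rowvar=False)
Sxx, Syy, Sxy = S[:3, :3], S[3:, 3:], S[:3, 3:]

# Eigenvalues of Sxx^{-1/2} Sxy Syy^{-1} Syx Sxx^{-1/2} are the
# squared canonical correlations, taken in decreasing order.
M = inv_sqrt(Sxx) @ Sxy @ np.linalg.inv(Syy) @ Sxy.T @ inv_sqrt(Sxx)
rhos = np.sqrt(np.clip(np.sort(np.linalg.eigh(M)[0])[::-1], 0, None))
print(rhos.round(2))   # first entry is 1.0; the rest are near 0
```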
7877:"A whitening approach to probabilistic canonical correlation analysis for omics data integration"
7827:"Canonical Correlation Analysis: Use of Composite Heliographs for Representing Multiple Patterns"
is diagonal. The canonical correlations are then interpreted as regression coefficients linking
7550:"Nonlinear measures of association with kernel canonical correlation analysis and applications"
{\displaystyle \Sigma _{YY}^{-1/2}\Sigma _{YX}\Sigma _{XX}^{-1}\Sigma _{XY}\Sigma _{YY}^{-1/2}}
{\displaystyle \Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1}\Sigma _{YX}\Sigma _{XX}^{-1/2}}
{\displaystyle \Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1}\Sigma _{YX}\Sigma _{XX}^{-1/2}}
7422:. The Twelfth International Conference on Learning Representations (ICLR 2024, spotlight).
Tofallis, C. (1999). "Model Building with Multiple Dependent Variables and Constraints".
dimension accounted for a substantial amount of shared variance between the two tests.
Canonical correlation analysis of high-dimensional data with very small sample support
Sieranoja, S.; Sahidullah, Md; Kinnunen, T.; Komulainen, J.; Hadid, A. (July 2018).
7374:
10366:
10239:
10194:
9958:
9945:
9838:
9813:
9747:
9679:
9557:
9165:
9058:
8991:
8904:
8851:
8670:
8541:
8335:
8219:
8134:
8101:
7968:
7908:
7898:
7847:
7811:
7799:
7748:
7744:
7672:
7617:
7564:
7522:
7497:
7470:
7419:
Deep Geodesic Canonical Correlation Analysis for Covariance-Based Neuroimaging Data
A note on the ordinal canonical-correlation analysis of two sets of ranking scores
format with ray like bars, with each half representing the two sets of variables.
of the correlation matrix of X and Y corresponding to the highest singular value.
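That construction can be traced step by step in NumPy on illustrative synthetic data (`inv_sqrt` is a hypothetical helper for the symmetric inverse square root): take the top singular vectors of the whitened cross-covariance, undo the change of coordinates to get the weight vectors, and confirm that the realised correlation of the canonical pair equals the top singular value.

```python
import numpy as np

def inv_sqrt(S):
    w, V = np.linalg.eigh(S)
    return V @ np.diag(w ** -0.5) @ V.T

rng = np.random.default_rng(2)
n = 2000
shared = rng.standard_normal((n, 1))
X = np.hstack([shared + 0.5 * rng.standard_normal((n, 1)),
               rng.standard_normal((n, 1))])
Y = np.hstack([shared + 0.5 * rng.standard_normal((n, 1)),
               rng.standard_normal((n, 1))])
Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
Sxx, Syy = Xc.T @ Xc / n, Yc.T @ Yc / n
Sxy = Xc.T @ Yc / n

# c and d are the top left/right singular vectors of the whitened
# cross-covariance matrix.
U, s, Vt = np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy))
a, b = inv_sqrt(Sxx) @ U[:, 0], inv_sqrt(Syy) @ Vt[0]   # undo the whitening
u, v = Xc @ a, Yc @ b
# The correlation of the canonical pair equals the top singular value.
print(round(s[0], 4), round(np.corrcoef(u, v)[0, 1], 4))
```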
1409:. In practice, we would estimate the covariance matrix based on sampled data from
Ju, Ce; Kobler, Reinmar J; Tang, Liyao; Guan, Cuntai; Kawanabe, Motoaki (2024).
7459:"Simultaneous canonical correlation analysis with invariant canonical loadings"
program), in Journal of Applied Economic Sciences 4(1), 2009, pp. 115–124
6219:, so that the first (and only in this example) pair of canonical variables is
6034:, so that the first (and only in this example) pair of canonical variables is
2018 IEEE 3rd International Conference on Signal and Image Processing (ICSIP)
are treated as elements of a vector space with an inner product given by the
the population problem and highlight the different objects in the so-called
program), in Journal of Quantitative Economics 7(2), 2009, pp. 173–199
Härdle, Wolfgang; Simar, Léopold (2007). "Canonical Correlation Analysis".
{\displaystyle \Sigma _{YY}^{-1}\Sigma _{YX}\Sigma _{XX}^{-1}\Sigma _{XY},}
form (corresponding to random vectors and their covariance matrices) or in
{\displaystyle \Sigma _{XX}^{-1}\Sigma _{XY}\Sigma _{YY}^{-1}\Sigma _{YX}}
6724:, correspondingly. In this interpretation, the random variables, entries
{\displaystyle \Sigma _{YY}=\operatorname {Cov} (Y,Y)=\operatorname {E} }
{\displaystyle \Sigma _{XX}=\operatorname {Cov} (X,X)=\operatorname {E} }
Haghighat, Mohammad; Abdel-Mottaleb, Mohamed; Alhalabi, Wadee (2016).
correlations will be identically 1 and hence the test is meaningless.
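The row-wise test statistic quoted earlier in this section is straightforward to compute. A sketch with SciPy, using hypothetical estimated correlations and, as in the text, p for the number of observations (the function name is illustrative):

```python
import numpy as np
from scipy.stats import chi2

def cca_significance(rhos, p, m, n):
    """For each row i, the chi-square statistic and p-value for the
    hypothesis that canonical correlations i, i+1, ... are all zero."""
    rhos = np.asarray(rhos, dtype=float)
    results = []
    for i in range(1, min(m, n) + 1):
        # chi2 = -(p - 1 - (m+n+1)/2) * ln prod_{j>=i} (1 - rho_j^2)
        stat = -(p - 1 - (m + n + 1) / 2) * np.log(np.prod(1 - rhos[i - 1:] ** 2))
        df = (m - i + 1) * (n - i + 1)   # degrees of freedom for row i
        results.append((stat, chi2.sf(stat, df)))
    return results

for stat, pval in cca_significance([0.8, 0.3, 0.05], p=100, m=3, n=4):
    print(f"chi2 = {stat:8.3f}   p-value = {pval:.4f}")
```

Since the correlations are sorted, the statistics decrease down the rows, and testing stops at the first row that is not significant.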
are collinear. In addition, the maximum of correlation is attained if
2255:; these are often more straightforward to interpret than the weights.
for statistical hypothesis testing in canonical correlation analysis.
is zero implies all further correlations are also zero. If we have
7434:"Statistical Learning with Sparsity: the Lasso and Generalizations"
{\displaystyle \Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1/2}d}
{\displaystyle \Sigma _{YY}^{-1/2}\Sigma _{YX}\Sigma _{XX}^{-1/2}c}
{\displaystyle \Sigma _{YY}^{-1/2}\Sigma _{YX}\Sigma _{XX}^{-1/2}c}
among the variables, then canonical-correlation analysis will find
{\displaystyle \rho =\operatorname {corr} (a_{k}^{T}X,b_{k}^{T}Y)}
7836:. Lecture Notes in Computer Science. Vol. 4045. p. 93.
7732:"Audiovisual Synchrony Detection with Optimized Audio Features"
Yang Song, Peter J. Schreier, David Ramírez, and Tanuj Hasija
{\displaystyle \operatorname {E} (X)=\operatorname {E} (Y)=0}
Canonical-correlation analysis seeks a sequence of vectors
Way of inferring information from cross-covariance matrices
Whitening and probabilistic canonical correlation analysis
on a correlation matrix. It is available as a function in
7498:"A spectral algorithm for learning Hidden Markov Models"
IEEE Transactions on Information Forensics and Security
{\displaystyle \Sigma _{XY}=\operatorname {cov} (X,Y)}
for the pair of subspaces spanned by the entries of
can be obtained from the eigen-decomposition (or by
2236:{\displaystyle \Sigma _{XX}a_{k},\Sigma _{YY}b_{k}}
7782:Journal of the Royal Statistical Society, Series D
7335:(1936). "Relations Between Two Sets of Variates".
{\displaystyle V=d^{T}\Sigma _{YY}^{-1/2}Y=b^{T}Y}
{\displaystyle U=c^{T}\Sigma _{XX}^{-1/2}X=a^{T}X}
4560:Reversing the change of coordinates, we have that
6866:{\displaystyle \operatorname {cov} (x_{i},y_{j})}
2302:for any pair of (vector-shaped) random variables
1402:{\displaystyle \operatorname {cov} (x_{i},y_{j})}
3988:Another way of viewing this computation is that
7548:Huang, S. Y.; Lee, M. H.; Hsiao, C. K. (2009).
5189:as macro CanCorr shipped with the main software
{\displaystyle \Sigma _{YY}^{-1}\Sigma _{YX}a;}
5728:Note that in the small sample size limit with
{\displaystyle \Sigma _{XX}^{-1}\Sigma _{XY}b}
908:List of datasets for machine-learning research
7596:Chapman, James; Wang, Hao-Ting (2021-12-18).
7557:Journal of Statistical Planning and Inference
5807:Minnesota Multiphasic Personality Inventory
3981:). The subsequent pairs are found by using
3855:with the maximum eigenvalue for the matrix
7825:Degani, A.; Shafto, M.; Olson, L. (2006).
7496:Hsu, D.; Kakade, S. M.; Zhang, T. (2012).
7234:Regularized canonical correlation analysis
{\displaystyle \operatorname {corr} (U,V)}
6880:The definition of the canonical variables
{\displaystyle Y=(y_{1},\dots ,y_{m})^{T}}
{\displaystyle X=(x_{1},\dots ,x_{n})^{T}}
6164:are perfectly anticorrelated, then, e.g.,
5234:, alternative algorithms are available in
5207:on a correlation matrix is related to the
{\displaystyle Y=(y_{1},\dots ,y_{m})^{T}}
{\displaystyle X=(x_{1},\dots ,x_{n})^{T}}
1088:Population CCA definition via correlations
1056:the mathematical concept was published by
7834:Diagrammatic Representation and Inference
7264:Applied Multivariate Statistical Analysis
6875:Covariance#Relationship to inner products
5584:which is asymptotically distributed as a
5302:independent observations in a sample and
1583:{\displaystyle b_{k}\in \mathbb {R} ^{m}}
1514:{\displaystyle a_{k}\in \mathbb {R} ^{n}}
979:, is a way of inferring information from
6920:is then equivalent to the definition of
4935:The canonical variables are defined by:
1831:. This procedure may be continued up to
7939:Discriminant Correlation Analysis (DCA)
7505:Journal of Computer and System Sciences
5905:{\displaystyle \operatorname {E} (X)=0}
5243:linear-algebra function subspace_angles
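The connection can be checked with SciPy's `subspace_angles` on synthetic, illustrative data: the cosines of the principal angles between the column spaces of the centred data matrices are the sample canonical correlations, and a column shared by both sets forces the first one to 1.

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(3)
n = 500
shared = rng.standard_normal((n, 1))
X = np.hstack([shared, rng.standard_normal((n, 1))])
Y = np.hstack([shared, rng.standard_normal((n, 2))])
Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)

# Canonical correlations = cosines of the principal angles between
# span(Xc) and span(Yc).
rhos = np.sort(np.cos(subspace_angles(Xc, Yc)))[::-1]
print(rhos.round(3))   # first entry is 1.0
```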
5982:are perfectly correlated, then, e.g.,
5384:{\displaystyle i=1,\dots ,\min\{m,n\}}
5331:{\displaystyle {\widehat {\rho }}_{i}}
5144:and several other packages, including
2581:{\displaystyle d=\Sigma _{YY}^{1/2}b,}
2523:{\displaystyle c=\Sigma _{XX}^{1/2}a,}
7692:, J. T. Kent and J. M. Bibby (1979).
2342:. The target function to maximize is
1449:(i.e. from a pair of data matrices).
7647:SIAM Journal on Scientific Computing
7022:CCA can also be viewed as a special
5760:then we are guaranteed that the top
1052:in 1936, although in the context of
7875:Jendoubi, T.; Strimmer, K. (2018).
5670:. Since all the correlations from
1828:second pair of canonical variables
7209:Generalized canonical correlation
5338:is the estimated correlation for
3720:There is equality if the vectors
1822:first pair of canonical variables
1590:) such that the random variables
7244:Partial least squares regression
1737:. The (scalar) random variables
7602:Journal of Open Source Software
7457:Gu, Fei; Wu, Hao (2018-04-01).
5411:th row, the test statistic is:
5252:FileExchange function subspacea
7749:10.1109/SIPROCESS.2018.8600424
6314:Connection to principal angles
5640:{\displaystyle (m-i+1)(n-i+1)}
2468:The first step is to define a
969:canonical-correlation analysis
18:Canonical correlation analysis
6968:. The canonical correlations
6284:We notice that in both cases
4300:Reciprocally, there is also:
2177:. The 'dual' sets of vectors
7282:10.1007/978-3-540-72244-1_14
7239:Singular value decomposition
7229:Linear discriminant analysis
7224:Principal component analysis
5205:singular value decomposition
5193:Julia (programming language)
5113:singular value decomposition
2291:{\displaystyle \Sigma _{XY}}
1812:{\displaystyle V=b_{1}^{T}Y}
1771:{\displaystyle U=a_{1}^{T}X}
5698:{\displaystyle \min\{m,n\}}
4040:The solution is therefore:
4035:
2152:{\displaystyle a_{k},b_{k}}
1859:{\displaystyle \min\{m,n\}}
977:canonical variates analysis
7569:10.1016/j.jspi.2008.10.011
7527:10.1016/j.jcss.2011.12.025
"Essai sur la géométrie à
7319:10.1037/0033-2909.85.2.410
5111:CCA can be computed using
1653:{\displaystyle b_{k}^{T}Y}
1618:{\displaystyle a_{k}^{T}X}
10371:10.1109/TIFS.2016.2569061
7904:10.1186/s12859-018-2572-9
7677:10.1137/S1064827500377332
7475:10.1007/s41237-017-0042-8
7349:10.1093/biomet/28.3-4.321
7026:where the random vectors
5140:as the standard function
Cauchy–Schwarz inequality
2246:canonical loading vectors
1312:{\displaystyle n\times m}
983:. If we have two vectors
981:cross-covariance matrices
7973:10.1162/0899766042321814
7024:whitening transformation
5753:{\displaystyle p<n+m}
7804:10.1111/1467-9884.00195
7399:Bull. Soc. Math. France
7191:{\displaystyle Y^{CCA}}
7158:{\displaystyle X^{CCA}}
7125:{\displaystyle Y^{CCA}}
7092:{\displaystyle X^{CCA}}
5863:{\displaystyle X=x_{1}}
5179:and in statsmodels, as
4028:are the left and right
2300:cross-covariance matrix
1082:canonical decomposition
1071:, CCA can be viewed in
1067:Like its sister method
7307:Psychological Bulletin
6964:with respect to this
6273:{\displaystyle V=-Y=X}
5203:CCA computation using
1970: subject to
7694:Multivariate Analysis
6791:{\displaystyle y_{j}}
6744:{\displaystyle x_{i}}
6085:{\displaystyle V=Y=X}
5785:{\displaystyle m+n-p}
4768:is an eigenvector of
4583:is an eigenvector of
4323:is an eigenvector of
4063:is an eigenvector of
1541:{\displaystyle b_{k}}
1472:{\displaystyle a_{k}}
1347:{\displaystyle (i,j)}
1236:, one may define the
. pp. 377–381.
. pp. 321–330.
7219:Angles between flats
6684:for the entries of
6212:{\displaystyle b=-1}
6117:{\displaystyle Y=-X}
5213:angles between flats
5197:MultivariateStats.jl
2162:canonical directions
2119:The sets of vectors
1054:angles between flats
7852:10.1007/11783183_11
7659:2002SJSC...23.2008K
7623:10.21105/joss.03823
7614:2021JOSS....6.3823C
6303:{\displaystyle U=V}
6238:{\displaystyle U=X}
6183:{\displaystyle a=1}
6053:{\displaystyle U=X}
6027:{\displaystyle b=1}
6001:{\displaystyle a=1}
5935:{\displaystyle Y=X}
5228:computer arithmetic
5177:Cross decomposition
4877:is proportional to
4690:is proportional to
4466:is proportional to
4206:is proportional to
1033:linear combinations
7951:Neural Computation
7881:BMC Bioinformatics
7385:
7188:
7155:
7122:
7089:
7056:
7036:
6996:
6954:
6934:
6910:
6890:
6863:
6808:
6788:
6761:
6741:
6714:
6694:
6666:
6584:
6498:
6438:
6373:
6300:
6270:
6235:
6209:
6180:
6154:
6134:
6114:
6082:
6050:
6024:
5998:
5972:
5952:
5932:
5902:
5860:
5782:
5750:
5715:
5695:
5660:
5648:degrees of freedom
5637:
5571:
5541:
5401:
5381:
5328:
5292:
5272:
Hypothesis testing

Each row of canonical correlations can be tested for significance. Since the correlations are sorted in decreasing order, concluding that row i is zero implies that all later correlations are also zero. With p independent observations in a sample and estimated correlations \hat{\rho}_i for i = 1, ..., min(m, n), the test statistic for the i-th row is

\chi^2 = -\left(p - 1 - \frac{m + n + 1}{2}\right) \ln \prod_{j=i}^{\min(m,n)} (1 - \hat{\rho}_j^2),

which for large p is asymptotically distributed as a chi-squared with (m - i + 1)(n - i + 1) degrees of freedom.
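Bartlett's chi-squared approximation for testing whether the i-th and all subsequent canonical correlations are zero can be sketched as follows (a minimal standard-library sketch; the function name and arguments are illustrative, with `rhos` the estimated canonical correlations, `p` the number of observations, and `m`, `n` the dimensions of the two variable sets):

```python
import math

def bartlett_statistic(rhos, p, m, n, i):
    """Chi-squared statistic for H0: the i-th (1-based) and all later
    canonical correlations are zero (Bartlett's approximation)."""
    log_term = sum(math.log(1.0 - r * r) for r in rhos[i - 1:])
    stat = -(p - 1 - (m + n + 1) / 2.0) * log_term
    # Asymptotic chi-squared degrees of freedom for the i-th row
    dof = (m - i + 1) * (n - i + 1)
    return stat, dof

stat, dof = bartlett_statistic([0.9, 0.3], p=100, m=2, n=2, i=1)
```

Comparing `stat` against the chi-squared distribution with `dof` degrees of freedom then gives the significance level.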
Connection to principal angles

Assuming that the vectors X and Y have zero expected values, i.e., they are centered, canonical-correlation analysis can be viewed as a problem of angles between subspaces: the canonical correlations equal the cosines of the principal angles between the column spaces spanned by the data for X and Y, and the canonical variables U and V are the corresponding pairs of principal vectors.
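The relationship between canonical correlations and principal angles can be checked numerically. A minimal NumPy sketch (assuming data matrices with observations in rows; all names are illustrative): orthonormalize the column spaces with a QR factorization, then the singular values of the product of the bases are the cosines of the principal angles.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
Y = rng.standard_normal((100, 2))
X -= X.mean(axis=0)  # center so the zero-mean assumption holds
Y -= Y.mean(axis=0)

# Orthonormal bases of the two column spaces
Qx, _ = np.linalg.qr(X)
Qy, _ = np.linalg.qr(Y)

# Singular values of Qx^T Qy are the cosines of the principal angles,
# which coincide with the sample canonical correlations.
cosines = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
```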
In statistics, canonical-correlation analysis (CCA), also called canonical variates analysis, is a way of inferring information from cross-covariance matrices. If we have two vectors X = (X_1, ..., X_n) and Y = (Y_1, ..., Y_m) of random variables, and there are correlations among the variables, then canonical-correlation analysis will find linear combinations of X and Y that have maximum correlation with each other. T. R. Knapp notes that "virtually all of the commonly encountered parametric tests of significance can be treated as special cases of canonical-correlation analysis". The method was first introduced by Harold Hotelling in 1936, although in the context of angles between flats the mathematical concept was published by Camille Jordan in 1875.

Given two column vectors X and Y of random variables with finite second moments, one may define the cross-covariance Σ_XY = cov(X, Y) to be the n × m matrix whose (i, j) entry is the covariance cov(x_i, y_j).
Practical uses

A typical use for canonical correlation in the experimental context is to take two sets of variables and see what is common between the two sets. For example, in psychological testing one could administer two well-established multidimensional personality tests, such as the Minnesota Multiphasic Personality Inventory (MMPI-2), to the same group of subjects; by seeing how the canonical factors of the two tests relate, one can gain insight into which dimensions are common between the tests and how much variance is shared.
Implementation

CCA can be computed using a singular value decomposition on a correlation matrix, and it is available as a function in most statistical packages, including Python's scikit-learn (in the cross decomposition module) and Julia's MultivariateStats.jl. Computing CCA this way is related to computing the cosines of angles between flats; the cosine is ill-conditioned for small angles, which can make the computation of highly correlated canonical variables inaccurate in finite-precision computer arithmetic. Alternative, more accurate algorithms are available to fix this trouble.
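A from-scratch sketch of the SVD-based computation in NumPy (illustrative, not a library API): whiten both sets with the inverse square roots of their sample covariances, then take the SVD of Σ_XX^{-1/2} Σ_XY Σ_YY^{-1/2}; its singular values are the canonical correlations and the back-transformed singular vectors are the canonical directions.

```python
import numpy as np

def cca(X, Y):
    """Sample canonical correlations and directions via whitening + SVD.

    X and Y hold observations in rows. Returns the canonical
    correlations and the direction matrices A (for X) and B (for Y).
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1)
    Syy = Y.T @ Y / (n - 1)
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):
        # Symmetric inverse square root via eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Rx, Ry = inv_sqrt(Sxx), inv_sqrt(Syy)
    U, s, Vt = np.linalg.svd(Rx @ Sxy @ Ry)
    A = Rx @ U     # columns are canonical directions a_k
    B = Ry @ Vt.T  # columns are canonical directions b_k
    return s[: min(X.shape[1], Y.shape[1])], A, B

# Two views sharing one latent signal z: the first canonical
# correlation should be close to 1, the second close to 0.
rng = np.random.default_rng(1)
z = rng.standard_normal((500, 1))
X = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 1))])
Y = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 1))])
rho, A, B = cca(X, Y)
```

Note that, as discussed above, this direct whitening approach inherits the ill-conditioning of the covariance square roots; production implementations prefer QR- or SVD-based preprocessing of the data matrices.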
7801:
7796:
7791:
7787:
7783:
7776:
7768:
7764:
7760:
7754:
7750:
7746:
7742:
7741:
7733:
7726:
7720:
7716:
7713:
7707:
7699:
7695:
7691:
7685:
7678:
7674:
7669:
7664:
7660:
7656:
7652:
7648:
7641:
7633:
7629:
7624:
7619:
7615:
7611:
7607:
7603:
7599:
7592:
7578:on 2017-03-13
7574:
7570:
7566:
7562:
7558:
7551:
7544:
7536:
7532:
7528:
7524:
7519:
7514:
7510:
7506:
7499:
7492:
7484:
7480:
7476:
7472:
7468:
7464:
7460:
7453:
7439:
7435:
7429:
7421:
7420:
7412:
7404:
7400:
7396:
7382:
7372:
7366:
7358:
7354:
7350:
7346:
7342:
7338:
7334:
7333:Hotelling, H.
7328:
7320:
7316:
7312:
7308:
7301:
7293:
7287:
7283:
7279:
7274:
7269:
7265:
7258:
7254:
7245:
7242:
7240:
7237:
7235:
7232:
7230:
7227:
7225:
7222:
7220:
7217:
7215:
7212:
7210:
7207:
7206:
7200:
7183:
7180:
7177:
7173:
7150:
7147:
7144:
7140:
7117:
7114:
7111:
7107:
7084:
7081:
7078:
7074:
7053:
7033:
7025:
7015:
7013:
7009:
6990:
6987:
6984:
6978:
6975:
6967:
6966:inner product
6951:
6931:
6923:
6907:
6887:
6878:
6876:
6855:
6851:
6847:
6842:
6838:
6831:
6828:
6821:
6805:
6783:
6779:
6758:
6736:
6732:
6711:
6691:
6683:
6682:inner product
6679:
6678:Gram matrices
6658:
6654:
6650:
6644:
6638:
6632:
6629:
6626:
6620:
6617:
6614:
6609:
6606:
6576:
6572:
6568:
6562:
6556:
6550:
6547:
6544:
6538:
6535:
6532:
6527:
6524:
6511:
6495:
6492:
6486:
6480:
6474:
6468:
6462:
6451:
6433:
6423:
6419:
6415:
6412:
6409:
6404:
6400:
6393:
6390:
6368:
6358:
6354:
6350:
6347:
6344:
6339:
6335:
6328:
6325:
6311:
6297:
6294:
6291:
6267:
6264:
6261:
6258:
6255:
6252:
6232:
6229:
6226:
6206:
6203:
6200:
6197:
6177:
6174:
6171:
6151:
6131:
6111:
6108:
6105:
6102:
6094:
6079:
6076:
6073:
6070:
6067:
6047:
6044:
6041:
6021:
6018:
6015:
5995:
5992:
5989:
5969:
5949:
5929:
5926:
5923:
5915:
5914:
5913:
5899:
5896:
5890:
5884:
5873:
5855:
5851:
5847:
5844:
5830:
5826:
5822:
5820:
5816:
5812:
5808:
5804:
5793:
5779:
5776:
5773:
5770:
5767:
5747:
5744:
5741:
5738:
5735:
5726:
5712:
5689:
5686:
5683:
5657:
5649:
5631:
5628:
5625:
5622:
5619:
5610:
5607:
5604:
5601:
5598:
5587:
5568:
5560:
5555:
5548:
5545:
5538:
5535:
5524:
5521:
5518:
5507:
5504:
5501:
5497:
5493:
5490:
5486:
5479:
5476:
5473:
5470:
5467:
5459:
5456:
5451:
5448:
5445:
5442:
5438:
5434:
5431:
5426:
5422:
5414:
5413:
5412:
5398:
5375:
5372:
5369:
5360:
5357:
5354:
5351:
5348:
5345:
5323:
5316:
5313:
5289:
5269:
5253:
5249:
5246:
5244:
5240:
5237:
5236:
5235:
5233:
5229:
5226:
5222:
5218:
5214:
5210:
5206:
5198:
5194:
5191:
5188:
5185:
5182:
5178:
5174:
5170:
5167:
5165:
5161:
5158:
5155:
5151:
5147:
5143:
5139:
5136:
5133:
5129:
5125:
5121:
5118:
5117:
5116:
5114:
5090:
5085:
5081:
5077:
5074:
5069:
5065:
5061:
5058:
5053:
5050:
5040:
5036:
5032:
5029:
5022:
5021:
5006:
5001:
4997:
4993:
4990:
4985:
4981:
4977:
4974:
4969:
4966:
4956:
4952:
4948:
4945:
4938:
4937:
4936:
4918:
4913:
4910:
4900:
4897:
4892:
4889:
4864:
4857:
4843:
4838:
4835:
4825:
4822:
4817:
4814:
4804:
4801:
4791:
4788:
4783:
4780:
4755:
4748:
4734:
4731:
4726:
4723:
4713:
4710:
4705:
4702:
4677:
4670:
4653:
4650:
4640:
4637:
4632:
4629:
4619:
4616:
4606:
4603:
4598:
4595:
4570:
4563:
4562:
4561:
4544:
4539:
4535:
4531:
4528:
4523:
4520:
4510:
4507:
4497:
4493:
4489:
4486:
4481:
4478:
4453:
4446:
4430:
4426:
4422:
4419:
4414:
4411:
4401:
4398:
4388:
4385:
4380:
4377:
4367:
4364:
4354:
4350:
4346:
4343:
4338:
4335:
4310:
4303:
4302:
4301:
4284:
4279:
4275:
4271:
4268:
4263:
4260:
4250:
4247:
4237:
4233:
4229:
4226:
4221:
4218:
4193:
4186:
4170:
4166:
4162:
4159:
4154:
4151:
4141:
4138:
4128:
4125:
4120:
4117:
4107:
4104:
4094:
4090:
4086:
4083:
4078:
4075:
4050:
4043:
4042:
4041:
4033:
4031:
4015:
3995:
3986:
3984:
3980:
3962:
3958:
3954:
3951:
3946:
3943:
3933:
3930:
3920:
3917:
3912:
3909:
3899:
3896:
3886:
3882:
3878:
3875:
3870:
3867:
3854:
3838:
3818:
3813:
3809:
3805:
3802:
3797:
3794:
3784:
3781:
3771:
3767:
3763:
3760:
3755:
3752:
3727:
3704:
3697:
3693:
3689:
3684:
3680:
3675:
3671:
3666:
3659:
3655:
3651:
3646:
3642:
3637:
3633:
3629:
3626:
3621:
3618:
3608:
3605:
3595:
3592:
3587:
3584:
3574:
3571:
3561:
3557:
3553:
3550:
3545:
3542:
3532:
3528:
3523:
3516:
3513:
3506:
3505:
3490:
3485:
3481:
3477:
3472:
3468:
3463:
3459:
3454:
3447:
3443:
3439:
3434:
3430:
3425:
3421:
3417:
3414:
3409:
3406:
3396:
3393:
3383:
3379:
3375:
3372:
3367:
3364:
3354:
3350:
3346:
3343:
3338:
3335:
3325:
3322:
3312:
3308:
3304:
3301:
3296:
3293:
3283:
3279:
3274:
3269:
3263:
3256:
3250:
3246:
3242:
3239:
3234:
3231:
3221:
3218:
3208:
3204:
3200:
3197:
3192:
3189:
3179:
3175:
3170:
3162:
3161:
3160:
3158:
3139:
3131:
3126:
3122:
3114:
3109:
3105:
3097:
3092:
3088:
3084:
3081:
3076:
3073:
3063:
3060:
3050:
3046:
3042:
3039:
3034:
3031:
3021:
3017:
3010:
3007:
3000:
2999:
2998:
2981:
2976:
2973:
2965:
2955:
2951:
2945:
2941:
2935:
2931:
2926:
2916:
2912:
2906:
2902:
2898:
2893:
2889:
2883:
2879:
2875:
2870:
2866:
2862:
2857:
2854:
2842:
2841:
2840:
2823:
2818:
2815:
2807:
2797:
2793:
2787:
2783:
2777:
2773:
2768:
2758:
2754:
2748:
2744:
2740:
2735:
2731:
2725:
2721:
2717:
2712:
2708:
2704:
2699:
2696:
2684:
2683:
2682:
2680:
2662:
2658:
2654:
2649:
2646:
2619:
2615:
2611:
2606:
2603:
2575:
2572:
2567:
2563:
2559:
2554:
2551:
2543:
2540:
2533:
2532:
2517:
2514:
2509:
2505:
2501:
2496:
2493:
2485:
2482:
2475:
2474:
2473:
2471:
2452:
2444:
2439:
2436:
2426:
2422:
2414:
2409:
2406:
2396:
2392:
2384:
2379:
2376:
2366:
2362:
2355:
2352:
2345:
2344:
2343:
2329:
2309:
2301:
2283:
2280:
2256:
2254:
2253:
2248:
2247:
2228:
2224:
2218:
2215:
2207:
2202:
2198:
2192:
2189:
2176:
2175:
2170:
2169:
2164:
2163:
2144:
2140:
2136:
2131:
2127:
2103:
2100:
2097:
2094:
2091:
2088:
2085:
2082:
2079:
2071:
2068:
2062:
2057:
2052:
2048:
2044:
2041:
2036:
2032:
2025:
2022:
2019:
2013:
2008:
2003:
1999:
1995:
1992:
1987:
1983:
1976:
1973:
1961:
1956:
1952:
1948:
1945:
1940:
1936:
1929:
1926:
1920:
1917:
1914:
1910:
1905:
1897:
1893:
1889:
1884:
1880:
1869:
1868:
1867:
1850:
1847:
1844:
1830:
1829:
1824:
1823:
1806:
1801:
1796:
1792:
1788:
1785:
1765:
1760:
1755:
1751:
1747:
1744:
1721:
1716:
1711:
1707:
1703:
1700:
1695:
1690:
1686:
1679:
1676:
1673:
1670:
1663:
1660:maximize the
1647:
1642:
1637:
1633:
1612:
1607:
1602:
1598:
1575:
1565:
1560:
1556:
1533:
1529:
1506:
1496:
1491:
1487:
1464:
1460:
1450:
1436:
1416:
1391:
1387:
1383:
1378:
1374:
1367:
1364:
1357:
1354:entry is the
1338:
1335:
1332:
1321:
1306:
1303:
1300:
1277:
1274:
1271:
1265:
1262:
1259:
1254:
1251:
1239:
1235:
1232:
1228:
1210:
1200:
1196:
1192:
1189:
1186:
1181:
1177:
1170:
1167:
1145:
1135:
1131:
1127:
1124:
1121:
1116:
1112:
1105:
1102:
1095:
1085:
1083:
1078:
1074:
1070:
1065:
1061:
1059:
1055:
1051:
1046:
1042:
1038:
1034:
1030:
1026:
1021:
1017:
1010:
1006:
1001:
997:
990:
986:
982:
978:
974:
970:
966:
951:
946:
944:
939:
937:
932:
931:
929:
928:
921:
918:
914:
911:
910:
909:
906:
904:
901:
900:
894:
893:
886:
883:
881:
878:
876:
873:
871:
868:
866:
863:
861:
858:
856:
853:
852:
846:
845:
838:
835:
833:
830:
828:
825:
823:
820:
818:
815:
813:
810:
808:
805:
803:
800:
799:
793:
792:
785:
782:
780:
777:
775:
772:
770:
767:
766:
760:
759:
752:
749:
747:
744:
742:
741:Crowdsourcing
739:
737:
734:
733:
727:
726:
717:
714:
713:
712:
709:
707:
704:
702:
699:
697:
694:
693:
690:
685:
684:
676:
673:
671:
670:Memtransistor
668:
666:
663:
661:
658:
654:
651:
650:
649:
646:
644:
641:
637:
634:
632:
629:
627:
624:
622:
619:
618:
617:
614:
612:
609:
607:
604:
602:
599:
595:
592:
591:
590:
587:
583:
580:
578:
575:
573:
570:
568:
565:
564:
563:
560:
558:
555:
553:
552:Deep learning
550:
548:
545:
544:
541:
536:
535:
528:
525:
523:
520:
518:
516:
512:
510:
507:
506:
503:
498:
497:
488:
487:Hidden Markov
485:
483:
480:
478:
475:
474:
473:
470:
469:
466:
461:
460:
453:
450:
448:
445:
443:
440:
438:
435:
433:
430:
428:
425:
423:
420:
418:
415:
413:
410:
409:
406:
401:
400:
393:
390:
388:
385:
383:
379:
377:
374:
372:
369:
367:
365:
361:
359:
356:
354:
351:
349:
346:
345:
342:
337:
336:
329:
326:
324:
321:
319:
316:
314:
311:
309:
306:
304:
301:
299:
296:
294:
292:
288:
284:
283:Random forest
281:
279:
276:
274:
271:
270:
269:
266:
264:
261:
259:
256:
255:
248:
247:
242:
241:
233:
227:
226:
219:
216:
214:
211:
209:
206:
204:
201:
199:
196:
194:
191:
189:
186:
184:
181:
179:
176:
174:
171:
169:
168:Data cleaning
166:
164:
161:
159:
156:
154:
151:
149:
146:
144:
141:
139:
136:
134:
131:
130:
124:
123:
116:
113:
111:
108:
106:
103:
101:
98:
96:
93:
91:
88:
86:
83:
81:
80:Meta-learning
78:
76:
73:
71:
68:
66:
63:
61:
58:
56:
53:
52:
46:
45:
42:
37:
34:
33:
29:
28:
19:
In statistics, canonical-correlation analysis (CCA), also called canonical variates analysis, is a way of inferring information from cross-covariance matrices. If we have two vectors X = (X_1, …, X_n) and Y = (Y_1, …, Y_m) of random variables, and there are correlations among the variables, then canonical-correlation analysis will find linear combinations of X and Y that have maximum correlation with each other. The method was first introduced by Harold Hotelling in 1936, although in the context of angles between flats the mathematical concept was published by Camille Jordan in 1875.

Definition

Given two column vectors X = (x_1, …, x_n)^T and Y = (y_1, …, y_m)^T of random variables with finite second moments, one may define the cross-covariance Σ_XY = cov(X, Y) to be the n × m matrix whose (i, j) entry is the covariance cov(x_i, y_j). In practice, these covariance matrices are estimated from sampled data for X and Y (i.e. from a pair of data matrices).

Canonical-correlation analysis seeks vectors a ∈ R^n and b ∈ R^m such that the random variables a^T X and b^T Y maximize the correlation ρ = corr(a^T X, b^T Y). The (scalar) random variables U = a^T X and V = b^T Y are the first pair of canonical variables. One then seeks vectors maximizing the same correlation subject to the constraint that they be uncorrelated with the first pair of canonical variables; this gives the second pair of canonical variables. This procedure may be continued up to min{m, n} times.

The sets of vectors a_k, b_k are called canonical directions or weight vectors, or simply weights. The "dual" sets of vectors Σ_XX a_k and Σ_YY b_k are called canonical loading vectors, or simply loadings; these are often more straightforward to interpret than the weights.

Computation

Derivation

Let Σ_XX = cov(X, X) and Σ_YY = cov(Y, Y). The quantity to maximize is

ρ = a^T Σ_XY b / (sqrt(a^T Σ_XX a) · sqrt(b^T Σ_YY b)).

The first step is a change of basis: define c = Σ_XX^{1/2} a and d = Σ_YY^{1/2} b, where Σ_XX^{1/2} = V_X D_X^{1/2} V_X^T is obtained from the eigen-decomposition Σ_XX = V_X D_X V_X^T (and similarly Σ_YY^{1/2} = V_Y D_Y^{1/2} V_Y^T). Thus

ρ = c^T Σ_XX^{−1/2} Σ_XY Σ_YY^{−1/2} d / (sqrt(c^T c) · sqrt(d^T d)).

By the Cauchy–Schwarz inequality, this quantity is bounded above, with equality when d is collinear with Σ_YY^{−1/2} Σ_YX Σ_XX^{−1/2} c. The maximum correlation is attained when c is the eigenvector with the largest eigenvalue of the matrix Σ_XX^{−1/2} Σ_XY Σ_YY^{−1} Σ_YX Σ_XX^{−1/2} (see Rayleigh quotient). The subsequent pairs are found from eigenvalues of decreasing magnitude; orthogonality is guaranteed by the symmetry of the correlation matrices. Equivalently, c and d are the left and right singular vectors of Σ_XX^{−1/2} Σ_XY Σ_YY^{−1/2} corresponding to the largest singular value.

Solution

The solution is therefore:

c is an eigenvector of Σ_XX^{−1/2} Σ_XY Σ_YY^{−1} Σ_YX Σ_XX^{−1/2},
d is proportional to Σ_YY^{−1/2} Σ_YX Σ_XX^{−1/2} c.

Reciprocally, there is also:

d is an eigenvector of Σ_YY^{−1/2} Σ_YX Σ_XX^{−1} Σ_XY Σ_YY^{−1/2},
c is proportional to Σ_XX^{−1/2} Σ_XY Σ_YY^{−1/2} d.

Reversing the change of coordinates, we have that:

a is an eigenvector of Σ_XX^{−1} Σ_XY Σ_YY^{−1} Σ_YX,
b is proportional to Σ_YY^{−1} Σ_YX a,
b is an eigenvector of Σ_YY^{−1} Σ_YX Σ_XX^{−1} Σ_XY,
a is proportional to Σ_XX^{−1} Σ_XY b.

The canonical variables are defined by:

U = c^T Σ_XX^{−1/2} X = a^T X,
V = d^T Σ_YY^{−1/2} Y = b^T Y.
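The whitening-and-SVD recipe for the solution can be sketched in a few lines of numpy. This is an illustrative implementation, not a library API; the function name `canoncorr_np` and the synthetic data are ours, and sample covariances are used in place of population ones.

```python
import numpy as np

def canoncorr_np(X, Y):
    """Sample canonical correlations and weight vectors.

    Whiten each block with the inverse square root of its covariance,
    then take the SVD of the whitened cross-covariance: the singular
    values are the canonical correlations, and mapping the singular
    vectors back through the inverse square roots gives the weights.
    Rows of X and Y are paired observations.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    p = X.shape[0]
    Sxx = X.T @ X / (p - 1)
    Syy = Y.T @ Y / (p - 1)
    Sxy = X.T @ Y / (p - 1)

    def inv_sqrt(S):
        # symmetric inverse square root via eigen-decomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    c, rho, dT = np.linalg.svd(M)
    A = inv_sqrt(Sxx) @ c      # a_k = Sxx^{-1/2} c_k
    B = inv_sqrt(Syy) @ dT.T   # b_k = Syy^{-1/2} d_k
    return rho, A, B

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
Y = X @ rng.standard_normal((3, 2)) + 0.5 * rng.standard_normal((500, 2))
rho, A, B = canoncorr_np(X, Y)
```

The returned `rho` is sorted in decreasing order, and the first pair of weight vectors reproduces the largest attainable correlation between `X @ A[:, 0]` and `Y @ B[:, 0]`.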
Implementation

CCA can be computed using a singular value decomposition on a correlation matrix. It is available as a function in:

MATLAB as canoncorr (also in Octave);
R as the standard function cancor and several other packages, including CCA and vegan; CCP for statistical hypothesis testing in canonical correlation analysis;
SAS as proc cancorr;
Python in the library scikit-learn, as cross decomposition, and in statsmodels, as CanCorr;
SPSS, as a CanCorr macro shipped in the main package;
Julia in the MultivariateStats.jl package.

CCA computation using a singular value decomposition on a correlation matrix is related to the cosine of the angles between flats. The cosine function is ill-conditioned for small angles, leading to very inaccurate computation of highly correlated canonical variables in finite-precision computer arithmetic. To fix this trouble, alternative algorithms are available in SciPy as linalg.subspace_angles and in MATLAB as subspace.

Hypothesis testing

Each row can be tested for significance with the following method. Since the correlations are sorted, saying that row i is zero implies all further correlations are also zero. If we have p independent observations in a sample and ρ̂_i is the estimated correlation for i = 1, …, min{m, n}, then for the i-th row the test statistic is

χ² = −(p − 1 − (m + n + 1)/2) ln ∏_{j=i}^{min{m,n}} (1 − ρ̂_j²),

which is asymptotically distributed as a chi-squared distribution with (m − i + 1)(n − i + 1) degrees of freedom for large p. Note that in the small-sample-size limit, with p < n + m, the top m + n − p correlations are guaranteed to be identically 1, and the test is then meaningless.
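Bartlett's chi-squared statistic for testing that the i-th and all smaller canonical correlations vanish can be sketched directly from the formula. This is an illustrative helper (the name `bartlett_chi2` is ours); it returns the statistic and degrees of freedom, leaving the p-value lookup to a chi-squared table or library.

```python
import numpy as np

def bartlett_chi2(rho_hat, p, m, n, i):
    """Chi-squared statistic for H0: canonical correlations i, i+1, ...
    are all zero, given p observations and variable counts m and n
    (i is 1-based). Returns (statistic, degrees_of_freedom)."""
    rho_hat = np.asarray(rho_hat, dtype=float)  # estimated correlations
    prod = np.prod(1.0 - rho_hat[i - 1:] ** 2)
    stat = -(p - 1 - (m + n + 1) / 2.0) * np.log(prod)
    dof = (m - i + 1) * (n - i + 1)
    return stat, dof

# Example: two estimated correlations from p = 200 observations,
# m = 2 and n = 3 variables; test that all correlations are zero (i = 1).
stat, dof = bartlett_chi2([0.9, 0.3], p=200, m=2, n=3, i=1)
```

Larger values of `stat` relative to a chi-squared distribution with `dof` degrees of freedom give stronger evidence against the null.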
Examples

Let X = x_1 be a random variable with zero expected value, i.e., E(X) = 0.

If Y = X, then X and Y are perfectly correlated, and, e.g., a = 1 and b = 1, so that the first (and, in this example, only) pair of canonical variables is U = X and V = Y = X.

If Y = −X, then X and Y are perfectly anticorrelated, and, e.g., a = 1 and b = −1, so that the first (and, in this example, only) pair of canonical variables is U = X and V = −Y = X.

We notice that in both cases U = V, which illustrates that canonical-correlation analysis treats correlated and anticorrelated variables similarly.
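A quick numerical check of the anticorrelated case: with Y = −X and weights a = 1, b = −1, the first canonical correlation is still 1, since the analysis is insensitive to the sign of the weight vectors.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)  # zero-mean sample for X
y = -x                         # perfectly anticorrelated Y

# canonical variables with a = 1, b = -1: U = x, V = -y = x
rho1 = abs(np.corrcoef(1.0 * x, -1.0 * y)[0, 1])
```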
Connection to principal angles

Assuming that X = (x_1, …, x_n)^T and Y = (y_1, …, y_m)^T have zero expected values, i.e., E(X) = E(Y) = 0, their covariance matrices Σ_XX = Cov(X, X) = E[X X^T] and Σ_YY = Cov(Y, Y) = E[Y Y^T] can be viewed as Gram matrices in an inner product for the entries of X and Y, correspondingly. In this interpretation, the random variables, entries x_i of X and y_j of Y, are treated as elements of a vector space with an inner product given by the covariance cov(x_i, y_j); see Covariance#Relationship to inner products.

The definition of the canonical variables U and V is then equivalent to the definition of principal vectors for the pair of subspaces spanned by the entries of X and Y with respect to this inner product, and the canonical correlations corr(U, V) are equal to the cosines of the principal angles.

Whitening and probabilistic canonical correlation analysis

CCA can also be viewed as a special whitening transformation in which the random vectors X and Y are simultaneously transformed so that the cross-correlation between the whitened vectors X^CCA and Y^CCA is diagonal. The canonical correlations are then interpreted as regression coefficients linking X^CCA and Y^CCA, and may also be negative. The regression view of CCA also provides a way to construct a latent-variable probabilistic generative model for CCA, with uncorrelated hidden variables representing shared and non-shared variability.
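For centered data matrices, the cosines of the principal angles between the two column spaces coincide with the sample canonical correlations. A minimal numpy sketch of the numerically stable recipe (orthonormalize each block with a QR decomposition, then take singular values, as in SciPy's linalg.subspace_angles); the data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((300, 3))
Y = X @ rng.standard_normal((3, 2)) + rng.standard_normal((300, 2))
X = X - X.mean(axis=0)  # center both blocks
Y = Y - Y.mean(axis=0)

# Orthonormal bases for the column spaces of X and Y.
Qx, _ = np.linalg.qr(X)
Qy, _ = np.linalg.qr(Y)

# Singular values of Qx^T Qy are the cosines of the principal angles,
# which equal the sample canonical correlations of X and Y.
cosines = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
```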