CCA is now a cornerstone of multivariate statistics and multi-view learning, and a great number of interpretations and extensions have been proposed, such as probabilistic CCA, sparse CCA, multi-view CCA, Deep CCA, and DeepGeoCCA. Unfortunately, perhaps because of its popularity, the literature can
form (corresponding to datasets and their sample covariance matrices). These two forms are almost exact analogues of each other, which is why their distinction is often overlooked, but they can behave very differently in high dimensional settings. We next give explicit mathematical definitions for
One can also use canonical-correlation analysis to produce a model equation which relates two sets of variables, for example a set of performance measures and a set of explanatory variables, or a set of outputs and a set of inputs. Constraint restrictions can be imposed on such a model to ensure it
Visualization of the results of canonical correlation is usually through bar plots of the coefficients of the two sets of variables for the pairs of canonical variates showing significant correlation. Some authors suggest that they are best visualized by plotting them as heliographs, a circular
2098:{\displaystyle (a_{k},b_{k})={\underset {a,b}{\operatorname {argmax} }}\operatorname {corr} (a^{T}X,b^{T}Y)\quad {\text{ subject to }}\operatorname {cov} (a^{T}X,a_{j}^{T}X)=\operatorname {cov} (b^{T}Y,b_{j}^{T}Y)=0{\text{ for }}j=1,\dots ,k-1}
3485:{\displaystyle \left(c^{T}\Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1/2}\right)(d)\leq \left(c^{T}\Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1/2}\Sigma _{YY}^{-1/2}\Sigma _{YX}\Sigma _{XX}^{-1/2}c\right)^{1/2}\left(d^{T}d\right)^{1/2},}
A typical use for canonical correlation in the experimental context is to take two sets of variables and see what is common between the two sets. For example, in psychological testing, one could take two well-established multidimensional
and may also be negative. The regression view of CCA also provides a way to construct a latent variable probabilistic generative model for CCA, with uncorrelated hidden variables representing shared and non-shared variability.
of significance can be treated as special cases of canonical-correlation analysis, which is the general procedure for investigating the relationships between two sets of variables." The method was first introduced by
5802:. By seeing how the MMPI-2 factors relate to the NEO factors, one could gain insight into what dimensions were common between the tests and how much variance was shared. For example, one might find that an
be inconsistent with notation, we attempt to highlight such inconsistencies in this article to help the reader make best use of the existing literature and techniques available.
1814:. Then one seeks vectors maximizing the same correlation subject to the constraint that they are to be uncorrelated with the first pair of canonical variables; this gives the
3699:{\displaystyle \rho \leq {\frac {\left(c^{T}\Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1}\Sigma _{YX}\Sigma _{XX}^{-1/2}c\right)^{1/2}}{\left(c^{T}c\right)^{1/2}}}.}
Knyazev, A.V.; Argentati, M.E. (2002), "Principal Angles between Subspaces in an A-Based Scalar Product: Algorithms and Perturbation Estimates",
Hardoon, D. R.; Szedmak, S.; Shawe-Taylor, J. (2004). "Canonical Correlation Analysis: An Overview with Application to Learning Methods".
Representation-Constrained Canonical Correlation Analysis: A Hybridization of Canonical Correlation and Principal Component Analyses
7587:"CCA-Zoo: A collection of Regularized, Deep Learning based, Kernel, and Probabilistic CCA methods in a scikit-learn style framework"
reflects theoretical requirements or intuitively obvious conditions. This type of model is known as a maximum correlation model.
5563:{\displaystyle \chi ^{2}=-\left(p-1-{\frac {1}{2}}(m+n+1)\right)\ln \prod _{j=i}^{\min\{m,n\}}(1-{\widehat {\rho }}_{j}^{2}),}
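As a quick arithmetic sketch of this statistic (a hypothetical stdlib-only helper, not from any named package; `rhos` are the estimated canonical correlations, `p` the number of observations, `m` and `n` the sizes of the two variable sets, and `i` the 1-based row being tested):

```python
import math

def bartlett_statistic(rhos, p, m, n, i):
    """Chi-squared statistic for testing that canonical correlations
    i, i+1, ..., min(m, n) are all zero, following the formula above.
    Returns the statistic and its (m - i + 1)(n - i + 1) degrees of
    freedom. Illustrative sketch, not a library routine."""
    coeff = -(p - 1 - 0.5 * (m + n + 1))
    stat = coeff * sum(math.log(1.0 - r * r) for r in rhos[i - 1:])
    dof = (m - i + 1) * (n - i + 1)
    return stat, dof
```

Testing a later row drops the leading (largest) correlations from the product, so the statistic shrinks as `i` grows.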
3134:{\displaystyle \rho ={\frac {c^{T}\Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1/2}d}{{\sqrt {c^{T}c}}{\sqrt {d^{T}d}}}}.}
6299:, which illustrates that the canonical-correlation analysis treats correlated and anticorrelated variables similarly.
Each row can be tested for significance with the following method. Since the correlations are sorted, saying that row
5172:. The CCA-Zoo library implements CCA extensions, such as probabilistic CCA, sparse CCA, multi-view CCA, and Deep CCA.
that have a maximum correlation with each other. T. R. Knapp notes that "virtually all of the commonly encountered
2976:{\displaystyle \Sigma _{YY}^{1/2}=V_{Y}D_{Y}^{1/2}V_{Y}^{\top },\qquad V_{Y}D_{Y}V_{Y}^{\top }=\Sigma _{YY}.}
2818:{\displaystyle \Sigma _{XX}^{1/2}=V_{X}D_{X}^{1/2}V_{X}^{\top },\qquad V_{X}D_{X}V_{X}^{\top }=\Sigma _{XX},}
2447:{\displaystyle \rho ={\frac {a^{T}\Sigma _{XY}b}{{\sqrt {a^{T}\Sigma _{XX}a}}{\sqrt {b^{T}\Sigma _{YY}b}}}}.}
are logically zero (and estimated that way as well), the product over the terms after this point is irrelevant.
for small angles, leading to very inaccurate computation of highly correlated principal vectors in finite
Knapp, T. R. (1978). "Canonical correlation analysis: A general parametric significance-testing system".
10344:"Discriminant Correlation Analysis: Real-Time Feature Level Fusion for Multimodal Biometric Recognition"
1073:- understanding the differences between these objects is crucial for interpretation of the technique.
are simultaneously transformed in such a way that the cross-correlation between the whitened vectors
of decreasing magnitudes. Orthogonality is guaranteed by the symmetry of the correlation matrices.
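This eigenvalue formulation can be checked numerically; a sketch on synthetic data (all variable names are our own, not from any library): the eigenvalues of the symmetric matrix built from the covariance blocks equal the squared singular values of the whitened cross-covariance, i.e. the squared canonical correlations.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 2))                  # shared latent signal
X = np.hstack([Z, rng.normal(size=(500, 1))])
Y = np.hstack([Z @ rng.normal(size=(2, 2)), rng.normal(size=(500, 1))])

Xc, Yc = X - X.mean(0), Y - Y.mean(0)
Sxx, Syy, Sxy = Xc.T @ Xc, Yc.T @ Yc, Xc.T @ Yc   # common scale cancels

def inv_sqrt(S):
    w, V = np.linalg.eigh(S)                   # symmetric positive definite
    return V @ np.diag(w ** -0.5) @ V.T

K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
rho = np.linalg.svd(K, compute_uv=False)       # canonical correlations
M = inv_sqrt(Sxx) @ Sxy @ np.linalg.inv(Syy) @ Sxy.T @ inv_sqrt(Sxx)
evals = np.sort(np.linalg.eigvalsh(M))[::-1]   # eigenvalues, descending
assert np.allclose(evals, rho ** 2)
```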
7866:"A whitening approach to probabilistic canonical correlation analysis for omics data integration"
7816:"Canonical Correlation Analysis: Use of Composite Heliographs for Representing Multiple Patterns"
is diagonal. The canonical correlations are then interpreted as regression coefficients linking
7539:"Nonlinear measures of association with kernel canonical correlation analysis and applications"
4427:{\displaystyle \Sigma _{YY}^{-1/2}\Sigma _{YX}\Sigma _{XX}^{-1}\Sigma _{XY}\Sigma _{YY}^{-1/2}}
4167:{\displaystyle \Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1}\Sigma _{YX}\Sigma _{XX}^{-1/2}}
3959:{\displaystyle \Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1}\Sigma _{YX}\Sigma _{XX}^{-1/2}}
7411:. The Twelfth International Conference on Learning Representations (ICLR 2024, spotlight).
Tofallis, C. (1999). "Model Building with Multiple Dependent Variables and Constraints".
dimension accounted for a substantial amount of shared variance between the two tests.
Canonical correlation analysis of high-dimensional data with very small sample support
Sieranoja, S.; Sahidullah, Md; Kinnunen, T.; Komulainen, J.; Hadid, A. (July 2018).
Deep Geodesic Canonical Correlation Analysis for Covariance-Based Neuroimaging Data
A note on the ordinal canonical-correlation analysis of two sets of ranking scores
format with ray-like bars, with each half representing the two sets of variables.
of the correlation matrix of X and Y corresponding to the highest singular value.
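In sample form, this recipe is a few lines of linear algebra; a minimal sketch (the function name and conventions here are our own, not a particular library's API): whiten each set with the inverse square root of its sample covariance, then take the singular values of the whitened cross-covariance matrix.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations of two data matrices with samples in rows.

    Illustrative sketch of the SVD recipe described above, assuming
    both sample covariance matrices are nonsingular.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0] - 1
    Sxx, Syy, Sxy = Xc.T @ Xc / n, Yc.T @ Yc / n, Xc.T @ Yc / n

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)          # S is symmetric positive definite
        return V @ np.diag(w ** -0.5) @ V.T

    # Singular values of Sxx^{-1/2} Sxy Syy^{-1/2} are the correlations.
    return np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy),
                         compute_uv=False)
```

With `Y` an exact copy of `X`, every canonical correlation comes out as 1, matching the degenerate case discussed elsewhere in the article.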
1398:. In practice, we would estimate the covariance matrix based on sampled data from
Ju, Ce; Kobler, Reinmar J; Tang, Liyao; Guan, Cuntai; Kawanabe, Motoaki (2024).
7448:"Simultaneous canonical correlation analysis with invariant canonical loadings"
program) – in Journal of Applied Economic Sciences 4(1), 2009, pp. 115–124
6208:, so that the first (and only in this example) pair of canonical variables is
6023:, so that the first (and only in this example) pair of canonical variables is
2018 IEEE 3rd International Conference on Signal and Image Processing (ICSIP)
are treated as elements of a vector space with an inner product given by the
the population problem and highlight the different objects in the so-called
program) – in Journal of Quantitative Economics 7(2), 2009, pp. 173–199
Härdle, Wolfgang; Simar, Léopold (2007). "Canonical Correlation Analysis".
4838:{\displaystyle \Sigma _{YY}^{-1}\Sigma _{YX}\Sigma _{XX}^{-1}\Sigma _{XY},}
form (corresponding to random vectors and their covariance matrices) or in
4650:{\displaystyle \Sigma _{XX}^{-1}\Sigma _{XY}\Sigma _{YY}^{-1}\Sigma _{YX}}
6713:, correspondingly. In this interpretation, the random variables, entries
6658:{\displaystyle \Sigma _{YY}=\operatorname {Cov} (Y,Y)=\operatorname {E} }
6576:{\displaystyle \Sigma _{XX}=\operatorname {Cov} (X,X)=\operatorname {E} }
Haghighat, Mohammad; Abdel-Mottaleb, Mohamed; Alhalabi, Wadee (2016).
correlations will be identically 1 and hence the test is meaningless.
are collinear. In addition, the maximum of correlation is attained if
2244:; these are often more straightforward to interpret than the weights.
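Concretely, a loading is just the correlation between an original variable and a canonical variate; a small sketch (the helper name is hypothetical, and the Y-set loadings are computed the same way):

```python
import numpy as np

def canonical_loadings(X, u):
    """Correlation of each column of the data matrix X (samples in
    rows) with a canonical variate u -- the loadings described above.
    Illustrative helper, not a library routine."""
    Xc = X - X.mean(axis=0)
    uc = u - u.mean()
    num = Xc.T @ uc
    den = np.sqrt((Xc ** 2).sum(axis=0) * (uc ** 2).sum())
    return num / den
```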
for statistical hypothesis testing in canonical correlation analysis.
is zero implies all further correlations are also zero. If we have
7423:"Statistical Learning with Sparsity: the Lasso and Generalizations"
4539:{\displaystyle \Sigma _{XX}^{-1/2}\Sigma _{XY}\Sigma _{YY}^{-1/2}d}
4279:{\displaystyle \Sigma _{YY}^{-1/2}\Sigma _{YX}\Sigma _{XX}^{-1/2}c}
3813:{\displaystyle \Sigma _{YY}^{-1/2}\Sigma _{YX}\Sigma _{XX}^{-1/2}c}
among the variables, then canonical-correlation analysis will find
1719:{\displaystyle \rho =\operatorname {corr} (a_{k}^{T}X,b_{k}^{T}Y)}
7825:. Lecture Notes in Computer Science. Vol. 4045. p. 93.
7721:"Audiovisual Synchrony Detection with Optimized Audio Features"
Yang Song, Peter J. Schreier, David Ramírez, and Tanuj Hasija
6490:{\displaystyle \operatorname {E} (X)=\operatorname {E} (Y)=0}
Canonical-correlation analysis seeks a sequence of vectors
Way of inferring information from cross-covariance matrices
Whitening and probabilistic canonical correlation analysis
on a correlation matrix. It is available as a function in
7487:"A spectral algorithm for learning Hidden Markov Models"
IEEE Transactions on Information Forensics and Security
1275:{\displaystyle \Sigma _{XY}=\operatorname {cov} (X,Y)}
for the pair of subspaces spanned by the entries of
can be obtained from the eigen-decomposition (or by
2225:{\displaystyle \Sigma _{XX}a_{k},\Sigma _{YY}b_{k}}
7771:Journal of the Royal Statistical Society, Series D
7324:(1936). "Relations Between Two Sets of Variates".
5085:{\displaystyle V=d^{T}\Sigma _{YY}^{-1/2}Y=b^{T}Y}
5001:{\displaystyle U=c^{T}\Sigma _{XX}^{-1/2}X=a^{T}X}
5000:
4912:
4858:
4837:
4749:
4728:
4671:
4649:
4564:
4549:Reversing the change of coordinates, we have that
6855:{\displaystyle \operatorname {cov} (x_{i},y_{j})}
2291:for any pair of (vector-shaped) random variables
1391:{\displaystyle \operatorname {cov} (x_{i},y_{j})}
3977:Another way of viewing this computation is that
7537:Huang, S. Y.; Lee, M. H.; Hsiao, C. K. (2009).
5178:as macro CanCorr shipped with the main software
4729:{\displaystyle \Sigma _{YY}^{-1}\Sigma _{YX}a;}
5717:Note that in the small sample size limit with
4913:{\displaystyle \Sigma _{XX}^{-1}\Sigma _{XY}b}
7585:Chapman, James; Wang, Hao-Ting (2021-12-18).
7546:Journal of Statistical Planning and Inference
5796:Minnesota Multiphasic Personality Inventory
3970:). The subsequent pairs are found by using
3844:with the maximum eigenvalue for the matrix
7814:Degani, A.; Shafto, M.; Olson, L. (2006).
7485:Hsu, D.; Kakade, S. M.; Zhang, T. (2012).
7223:Regularized canonical correlation analysis
6988:{\displaystyle \operatorname {corr} (U,V)}
6869:The definition of the canonical variables
6430:{\displaystyle Y=(y_{1},\dots ,y_{m})^{T}}
6365:{\displaystyle X=(x_{1},\dots ,x_{n})^{T}}
6153:are perfectly anticorrelated, then, e.g.,
5223:, alternative algorithms are available in
5196:on a correlation matrix is related to the
1207:{\displaystyle Y=(y_{1},\dots ,y_{m})^{T}}
1142:{\displaystyle X=(x_{1},\dots ,x_{n})^{T}}
1077:Population CCA definition via correlations
1045:the mathematical concept was published by
7823:Diagrammatic Representation and Inference
7253:Applied Multivariate Statistical Analysis
6864:Covariance#Relationship to inner products
5573:which is asymptotically distributed as a
5291:independent observations in a sample and
1572:{\displaystyle b_{k}\in \mathbb {R} ^{m}}
1503:{\displaystyle a_{k}\in \mathbb {R} ^{n}}
968:, is a way of inferring information from
6909:is then equivalent to the definition of
4924:The canonical variables are defined by:
1820:. This procedure may be continued up to
7928:Discriminant Correlation Analysis (DCA)
7494:Journal of Computer and System Sciences
5894:{\displaystyle \operatorname {E} (X)=0}
5232:linear-algebra function subspace_angles
5971:are perfectly correlated, then, e.g.,
5373:{\displaystyle i=1,\dots ,\min\{m,n\}}
5320:{\displaystyle {\widehat {\rho }}_{i}}
5133:and several other packages, including
2570:{\displaystyle d=\Sigma _{YY}^{1/2}b,}
2512:{\displaystyle c=\Sigma _{XX}^{1/2}a,}
7681:, J. T. Kent and J. M. Bibby (1979).
2331:. The target function to maximize is
1438:(i.e. from a pair of data matrices).
7636:SIAM Journal on Scientific Computing
7011:CCA can also be viewed as a special
5749:then we are guaranteed that the top
1041:in 1936, although in the context of
7864:Jendoubi, T.; Strimmer, K. (2018).
5659:. Since all the correlations from
2659:{\displaystyle \Sigma _{YY}^{1/2}}
2616:{\displaystyle \Sigma _{XX}^{1/2}}
1817:second pair of canonical variables
7198:Generalized canonical correlation
5327:is the estimated correlation for
3709:There is equality if the vectors
1811:first pair of canonical variables
1579:) such that the random variables
7233:Partial least squares regression
1726:. The (scalar) random variables
7591:Journal of Open Source Software
7446:Gu, Fei; Wu, Hao (2018-04-01).
5400:th row, the test statistic is:
5241:FileExchange function subspacea
7738:10.1109/SIPROCESS.2018.8600424
6303:Connection to principal angles
5629:{\displaystyle (m-i+1)(n-i+1)}
2457:The first step is to define a
958:canonical-correlation analysis
6957:. The canonical correlations
6273:We notice that in both cases
4289:Reciprocally, there is also:
2166:. The 'dual' sets of vectors
7271:10.1007/978-3-540-72244-1_14
7228:Singular value decomposition
7218:Linear discriminant analysis
7213:Principal component analysis
5194:singular value decomposition
5182:Julia (programming language)
5102:singular value decomposition
2280:{\displaystyle \Sigma _{XY}}
1801:{\displaystyle V=b_{1}^{T}Y}
1760:{\displaystyle U=a_{1}^{T}X}
5687:{\displaystyle \min\{m,n\}}
4029:The solution is therefore:
2141:{\displaystyle a_{k},b_{k}}
1848:{\displaystyle \min\{m,n\}}
966:canonical variates analysis
7558:10.1016/j.jspi.2008.10.011
7516:10.1016/j.jcss.2011.12.025
"Essai sur la géométrie à
7308:10.1037/0033-2909.85.2.410
5100:CCA can be computed using
1642:{\displaystyle b_{k}^{T}Y}
1607:{\displaystyle a_{k}^{T}X}
10360:10.1109/TIFS.2016.2569061
7893:10.1186/s12859-018-2572-9
7666:10.1137/S1064827500377332
7464:10.1007/s41237-017-0042-8
7338:10.1093/biomet/28.3-4.321
7015:where the random vectors
5129:as the standard function
Cauchy–Schwarz inequality
2235:canonical loading vectors
1301:{\displaystyle n\times m}
972:. If we have two vectors
970:cross-covariance matrices
7962:10.1162/0899766042321814
7013:whitening transformation
5742:{\displaystyle p<n+m}
7793:10.1111/1467-9884.00195
7388:Bull. Soc. Math. France
7180:{\displaystyle Y^{CCA}}
7147:{\displaystyle X^{CCA}}
7114:{\displaystyle Y^{CCA}}
7081:{\displaystyle X^{CCA}}
5852:{\displaystyle X=x_{1}}
5168:and in statsmodels, as
4017:are the left and right
2289:cross-covariance matrix
1071:canonical decomposition
1060:, CCA can be viewed in
1056:Like its sister method
7296:Psychological Bulletin
6953:with respect to this
6947:
6927:
6903:
6883:
6856:
6801:
6781:
6754:
6734:
6707:
6687:
6659:
6577:
6491:
6431:
6366:
6293:
6263:
6262:{\displaystyle V=-Y=X}
5192:CCA computation using
1959: subject to
7683:Multivariate Analysis
6780:{\displaystyle y_{j}}
6733:{\displaystyle x_{i}}
6074:{\displaystyle V=Y=X}
5774:{\displaystyle m+n-p}
4757:is an eigenvector of
4572:is an eigenvector of
4312:is an eigenvector of
4052:is an eigenvector of
1530:{\displaystyle b_{k}}
1461:{\displaystyle a_{k}}
1336:{\displaystyle (i,j)}
1225:, one may define the
. pp. 377–381.
. pp. 321–330.
7208:Angles between flats
6673:for the entries of
6201:{\displaystyle b=-1}
6106:{\displaystyle Y=-X}
5202:angles between flats
5186:MultivariateStats.jl
2151:canonical directions
2108:The sets of vectors
1043:angles between flats
7841:10.1007/11783183_11
7648:2002SJSC...23.2008K
7612:10.21105/joss.03823
7603:2021JOSS....6.3823C
6292:{\displaystyle U=V}
6227:{\displaystyle U=X}
6172:{\displaystyle a=1}
6042:{\displaystyle U=X}
6016:{\displaystyle b=1}
5990:{\displaystyle a=1}
5924:{\displaystyle Y=X}
5217:computer arithmetic
5166:Cross decomposition
4866:is proportional to
4679:is proportional to
4455:is proportional to
4195:is proportional to
1022:linear combinations
7940:Neural Computation
7870:BMC Bioinformatics
7374:
7177:
7144:
7111:
7078:
7045:
7025:
6985:
6943:
6923:
6899:
6879:
6852:
6797:
6777:
6750:
6730:
6703:
6683:
6655:
6573:
6487:
6427:
6362:
6289:
6259:
6224:
6198:
6169:
6143:
6123:
6103:
6071:
6039:
6013:
5987:
5961:
5941:
5921:
5891:
5849:
5771:
5739:
5704:
5684:
5649:
Hypothesis testing

Each row can be tested for significance with the following method. Since the correlations are sorted in decreasing order, saying that row $i$ is zero implies all further correlations are also zero. If we have $p$ independent observations in a sample and $\widehat{\rho}_i$ is the estimated correlation for $i = 1, \dots, \min\{m, n\}$, then for the $i$-th row the test statistic is

$\chi^2 = -\left(p - 1 - \tfrac{1}{2}(m + n + 1)\right) \ln \prod_{j=i}^{\min\{m,n\}} (1 - \widehat{\rho}_j^{\,2}),$

which is asymptotically distributed as a chi-squared with $(m - i + 1)(n - i + 1)$ degrees of freedom for large $p$.
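As a minimal sketch (not from the article; the function name and interface are our own), the statistic and its degrees of freedom are direct to compute once the estimated correlations are in hand:

```python
import numpy as np

def cca_chi2(rho, p, m, n, i=1):
    """Chi-squared statistic for H0: canonical correlations i, i+1, ... are zero.

    rho  : estimated canonical correlations, sorted in decreasing order
    p    : number of independent observations
    m, n : numbers of variables in the two sets
    Returns the (asymptotic) test statistic and its degrees of freedom.
    """
    rho = np.asarray(rho, dtype=float)
    stat = -(p - 1 - (m + n + 1) / 2.0) * np.log(np.prod(1.0 - rho[i - 1:] ** 2))
    df = (m - i + 1) * (n - i + 1)
    return stat, df
```

Testing the first row uses all estimated correlations; testing the second drops the first, so both the statistic and the degrees of freedom shrink.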
Assuming that $X$ and $Y$ have zero expected values, their covariance matrices can be viewed as Gram matrices in an inner product for their entries. In this interpretation, the canonical variables $U$ and $V$ correspond to principal vectors for the pair of subspaces spanned by the entries of $X$ and $Y$, and the canonical correlation $\operatorname{corr}(U, V)$ is equal to the cosine of the corresponding principal angle.
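For centered data matrices this correspondence is easy to check numerically: the cosines of the principal angles between the two column spaces, obtained by the QR-plus-SVD recipe, coincide with the sample canonical correlations. A minimal sketch, with names of our own choosing:

```python
import numpy as np

def cos_principal_angles(A, B):
    """Cosines of the principal angles between the column spaces of A and B,
    computed from the SVD of Q_A^T Q_B.  For centered data matrices these
    are exactly the sample canonical correlations."""
    Qa, _ = np.linalg.qr(A)  # orthonormal basis of col(A)
    Qb, _ = np.linalg.qr(B)  # orthonormal basis of col(B)
    return np.linalg.svd(Qa.T @ Qb, compute_uv=False)
```

For two planes in $\mathbb{R}^3$ that share one direction and meet at 60° in the other, the cosines come out as 1 and 1/2.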
Given two column vectors $X = (x_1, \dots, x_n)^T$ and $Y = (y_1, \dots, y_m)^T$ of random variables with finite second moments, one may define the cross-covariance $\Sigma_{XY} = \operatorname{cov}(X, Y)$ to be the $n \times m$ matrix whose $(i, j)$ entry is the covariance $\operatorname{cov}(x_i, y_j)$. In practice, we would estimate the covariance matrix based on sampled data from $X$ and $Y$ (i.e. from a pair of data matrices).
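The sample estimate of the cross-covariance block can be sketched in a few lines of NumPy (an illustration, not from the article; the function name is our own):

```python
import numpy as np

def cross_covariance(X, Y):
    """Sample estimate of Sigma_XY from paired data matrices.

    Rows are observations, columns are variables; uses the unbiased
    1/(N-1) normalization, matching np.cov's default."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    return Xc.T @ Yc / (X.shape[0] - 1)
```

The off-diagonal $n \times m$ block of the joint covariance matrix of the stacked variables gives the same result, which is a quick way to sanity-check the estimate.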
If we have two vectors $X = (X_1, \dots, X_n)$ and $Y = (Y_1, \dots, Y_m)$ of random variables, and there are correlations among the variables, then canonical-correlation analysis will find linear combinations of $X$ and $Y$ which have maximum correlation with each other. T. R. Knapp notes that "virtually all of the commonly encountered parametric tests of significance can be treated as special cases of canonical-correlation analysis". The method was first introduced by Harold Hotelling in 1936, although in the context of angles between flats the mathematical concept was published by Camille Jordan in 1875.
See also: RV coefficient
Practical uses

A typical use for canonical correlation in the experimental context is to take two sets of variables and see what is common among the two sets. For example, in psychological testing, one could take two well-established multidimensional personality tests such as the Minnesota Multiphasic Personality Inventory (MMPI-2) and the NEO. By seeing how the MMPI-2 factors relate to the NEO factors, one could gain insight into what dimensions were common between the tests and how much variance was shared.
Implementation

CCA can be computed using singular value decomposition on a correlation matrix. It is available as a function in, among others, MATLAB (canoncorr), R (the standard function cancor and several packages), SAS (proc cancorr), Python (the cross decomposition module in the library scikit-learn, and statsmodels), and Julia (the MultivariateStats.jl package).

CCA computation using singular value decomposition on a correlation matrix is related to the cosine of the angles between flats. The cosine function is ill-conditioned for small angles, leading to very inaccurate computation of highly correlated principal vectors in finite-precision computer arithmetic. To fix this trouble, alternative algorithms are available.
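As a sketch of the SVD route in plain NumPy (illustrative only: the function name and interface are ours, and real implementations add regularization and rank checks):

```python
import numpy as np

def cca_svd(X, Y):
    """Canonical correlations and weights via the SVD of the whitened
    cross-covariance Sxx^{-1/2} Sxy Syy^{-1/2}.

    Rows of X and Y are paired observations; both covariance blocks are
    assumed well-conditioned (no regularization here)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1)
    Syy = Y.T @ Y / (n - 1)
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):  # inverse symmetric square root via eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(w ** -0.5) @ V.T

    Kx, Ky = inv_sqrt(Sxx), inv_sqrt(Syy)
    U, rho, Vt = np.linalg.svd(Kx @ Sxy @ Ky)
    # correlations, a-weights (columns), b-weights (columns)
    return np.clip(rho, 0.0, 1.0), Kx @ U, Ky @ Vt.T
```

When $Y$ is an invertible linear transform of $X$, every canonical correlation is 1, which makes a convenient smoke test.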
7709:
7705:
7702:
7696:
7688:
7684:
7680:
7674:
7667:
7663:
7658:
7653:
7649:
7645:
7641:
7637:
7630:
7622:
7618:
7613:
7608:
7604:
7600:
7596:
7592:
7588:
7581:
7567:on 2017-03-13
7563:
7559:
7555:
7551:
7547:
7540:
7533:
7525:
7521:
7517:
7513:
7508:
7503:
7499:
7495:
7488:
7481:
7473:
7469:
7465:
7461:
7457:
7453:
7449:
7442:
7428:
7424:
7418:
7410:
7409:
7401:
7393:
7389:
7385:
7371:
7361:
7355:
7347:
7343:
7339:
7335:
7331:
7327:
7323:
7322:Hotelling, H.
7317:
7309:
7305:
7301:
7297:
7290:
7282:
7276:
7272:
7268:
7263:
7258:
7254:
7247:
7243:
7234:
7231:
7229:
7226:
7224:
7221:
7219:
7216:
7214:
7211:
7209:
7206:
7204:
7201:
7199:
7196:
7195:
7189:
7172:
7169:
7166:
7162:
7139:
7136:
7133:
7129:
7106:
7103:
7100:
7096:
7073:
7070:
7067:
7063:
7042:
7022:
7014:
7004:
7002:
6998:
6979:
6976:
6973:
6967:
6964:
6956:
6955:inner product
6940:
6920:
6912:
6896:
6876:
6867:
6865:
6844:
6840:
6836:
6831:
6827:
6820:
6817:
6810:
6794:
6772:
6768:
6747:
6725:
6721:
6700:
6680:
6672:
6671:inner product
6668:
6667:Gram matrices
6647:
6643:
6639:
6633:
6627:
6621:
6618:
6615:
6609:
6606:
6603:
6598:
6595:
6565:
6561:
6557:
6551:
6545:
6539:
6536:
6533:
6527:
6524:
6521:
6516:
6513:
6500:
6484:
6481:
6475:
6469:
6463:
6457:
6451:
6440:
6422:
6412:
6408:
6404:
6401:
6398:
6393:
6389:
6382:
6379:
6357:
6347:
6343:
6339:
6336:
6333:
6328:
6324:
6317:
6314:
6300:
6286:
6283:
6280:
6256:
6253:
6250:
6247:
6244:
6241:
6221:
6218:
6215:
6195:
6192:
6189:
6186:
6166:
6163:
6160:
6140:
6120:
6100:
6097:
6094:
6091:
6083:
6068:
6065:
6062:
6059:
6056:
6036:
6033:
6030:
6010:
6007:
6004:
5984:
5981:
5978:
5958:
5938:
5918:
5915:
5912:
5904:
5903:
5902:
5888:
5885:
5879:
5873:
5862:
5844:
5840:
5836:
5833:
5819:
5815:
5811:
5809:
5805:
5801:
5797:
5793:
5782:
5768:
5765:
5762:
5759:
5756:
5736:
5733:
5730:
5727:
5724:
5715:
5701:
5678:
5675:
5672:
5646:
5638:
5620:
5617:
5614:
5611:
5608:
5599:
5596:
5593:
5590:
5587:
5576:
5557:
5549:
5544:
5537:
5534:
5527:
5524:
5513:
5510:
5507:
5496:
5493:
5490:
5486:
5482:
5479:
5475:
5468:
5465:
5462:
5459:
5456:
5448:
5445:
5440:
5437:
5434:
5431:
5427:
5423:
5420:
5415:
5411:
5403:
5402:
5401:
5387:
5364:
5361:
5358:
5349:
5346:
5343:
5340:
5337:
5334:
5312:
5305:
5302:
5278:
5258:
5242:
5238:
5235:
5233:
5229:
5226:
5225:
5224:
5222:
5218:
5215:
5211:
5207:
5203:
5199:
5195:
5187:
5183:
5180:
5177:
5174:
5171:
5167:
5163:
5159:
5156:
5154:
5150:
5147:
5144:
5140:
5136:
5132:
5128:
5125:
5122:
5118:
5114:
5110:
5107:
5106:
5105:
5103:
5079:
5074:
5070:
5066:
5063:
5058:
5054:
5050:
5047:
5042:
5039:
5029:
5025:
5021:
5018:
5011:
5010:
4995:
4990:
4986:
4982:
4979:
4974:
4970:
4966:
4963:
4958:
4955:
4945:
4941:
4937:
4934:
4927:
4926:
4925:
4907:
4902:
4899:
4889:
4886:
4881:
4878:
4853:
4846:
4832:
4827:
4824:
4814:
4811:
4806:
4803:
4793:
4790:
4780:
4777:
4772:
4769:
4744:
4737:
4723:
4720:
4715:
4712:
4702:
4699:
4694:
4691:
4666:
4659:
4642:
4639:
4629:
4626:
4621:
4618:
4608:
4605:
4595:
4592:
4587:
4584:
4559:
4552:
4551:
4550:
4533:
4528:
4524:
4520:
4517:
4512:
4509:
4499:
4496:
4486:
4482:
4478:
4475:
4470:
4467:
4442:
4435:
4419:
4415:
4411:
4408:
4403:
4400:
4390:
4387:
4377:
4374:
4369:
4366:
4356:
4353:
4343:
4339:
4335:
4332:
4327:
4324:
4299:
4292:
4291:
4290:
4273:
4268:
4264:
4260:
4257:
4252:
4249:
4239:
4236:
4226:
4222:
4218:
4215:
4210:
4207:
4182:
4175:
4159:
4155:
4151:
4148:
4143:
4140:
4130:
4127:
4117:
4114:
4109:
4106:
4096:
4093:
4083:
4079:
4075:
4072:
4067:
4064:
4039:
4032:
4031:
4030:
4022:
4020:
4004:
3984:
3975:
3973:
3969:
3951:
3947:
3943:
3940:
3935:
3932:
3922:
3919:
3909:
3906:
3901:
3898:
3888:
3885:
3875:
3871:
3867:
3864:
3859:
3856:
3843:
3827:
3807:
3802:
3798:
3794:
3791:
3786:
3783:
3773:
3770:
3760:
3756:
3752:
3749:
3744:
3741:
3716:
3693:
3686:
3682:
3678:
3673:
3669:
3664:
3660:
3655:
3648:
3644:
3640:
3635:
3631:
3626:
3622:
3618:
3615:
3610:
3607:
3597:
3594:
3584:
3581:
3576:
3573:
3563:
3560:
3550:
3546:
3542:
3539:
3534:
3531:
3521:
3517:
3512:
3505:
3502:
3495:
3494:
3479:
3474:
3470:
3466:
3461:
3457:
3452:
3448:
3443:
3436:
3432:
3428:
3423:
3419:
3414:
3410:
3406:
3403:
3398:
3395:
3385:
3382:
3372:
3368:
3364:
3361:
3356:
3353:
3343:
3339:
3335:
3332:
3327:
3324:
3314:
3311:
3301:
3297:
3293:
3290:
3285:
3282:
3272:
3268:
3263:
3258:
3252:
3245:
3239:
3235:
3231:
3228:
3223:
3220:
3210:
3207:
3197:
3193:
3189:
3186:
3181:
3178:
3168:
3164:
3159:
3151:
3150:
3149:
3147:
3128:
3120:
3115:
3111:
3103:
3098:
3094:
3086:
3081:
3077:
3073:
3070:
3065:
3062:
3052:
3049:
3039:
3035:
3031:
3028:
3023:
3020:
3010:
3006:
2999:
2996:
2989:
2988:
2987:
2970:
2965:
2962:
2954:
2944:
2940:
2934:
2930:
2924:
2920:
2915:
2905:
2901:
2895:
2891:
2887:
2882:
2878:
2872:
2868:
2864:
2859:
2855:
2851:
2846:
2843:
2831:
2830:
2829:
2812:
2807:
2804:
2796:
2786:
2782:
2776:
2772:
2766:
2762:
2757:
2747:
2743:
2737:
2733:
2729:
2724:
2720:
2714:
2710:
2706:
2701:
2697:
2693:
2688:
2685:
2673:
2672:
2671:
2669:
2651:
2647:
2643:
2638:
2635:
2608:
2604:
2600:
2595:
2592:
2564:
2561:
2556:
2552:
2548:
2543:
2540:
2532:
2529:
2522:
2521:
2506:
2503:
2498:
2494:
2490:
2485:
2482:
2474:
2471:
2464:
2463:
2462:
2460:
2441:
2433:
2428:
2425:
2415:
2411:
2403:
2398:
2395:
2385:
2381:
2373:
2368:
2365:
2355:
2351:
2344:
2341:
2334:
2333:
2332:
2318:
2298:
2290:
2272:
2269:
2245:
2243:
2242:
2237:
2236:
2217:
2213:
2207:
2204:
2196:
2191:
2187:
2181:
2178:
2165:
2164:
2159:
2158:
2153:
2152:
2133:
2129:
2125:
2120:
2116:
2092:
2089:
2086:
2083:
2080:
2077:
2074:
2071:
2068:
2060:
2057:
2051:
2046:
2041:
2037:
2033:
2030:
2025:
2021:
2014:
2011:
2008:
2002:
1997:
1992:
1988:
1984:
1981:
1976:
1972:
1965:
1962:
1950:
1945:
1941:
1937:
1934:
1929:
1925:
1918:
1915:
1909:
1906:
1903:
1899:
1894:
1886:
1882:
1878:
1873:
1869:
1858:
1857:
1856:
1839:
1836:
1833:
1819:
1818:
1813:
1812:
1795:
1790:
1785:
1781:
1777:
1774:
1754:
1749:
1744:
1740:
1736:
1733:
1710:
1705:
1700:
1696:
1692:
1689:
1684:
1679:
1675:
1668:
1665:
1662:
1659:
1652:
1649:maximize the
1636:
1631:
1626:
1622:
1601:
1596:
1591:
1587:
1564:
1554:
1549:
1545:
1522:
1518:
1495:
1485:
1480:
1476:
1453:
1449:
1439:
1425:
1405:
1380:
1376:
1372:
1367:
1363:
1356:
1353:
1346:
1343:entry is the
1327:
1324:
1321:
1310:
1295:
1292:
1289:
1266:
1263:
1260:
1254:
1251:
1248:
1243:
1240:
1228:
1224:
1221:
1217:
1199:
1189:
1185:
1181:
1178:
1175:
1170:
1166:
1159:
1156:
1134:
1124:
1120:
1116:
1113:
1110:
1105:
1101:
1094:
1091:
1084:
1074:
1072:
1067:
1063:
1059:
1054:
1050:
1048:
1044:
1040:
1035:
1031:
1027:
1023:
1019:
1015:
1010:
1006:
999:
995:
990:
986:
979:
975:
971:
967:
963:
959:
955:
940:
935:
933:
928:
926:
921:
920:
918:
917:
910:
907:
903:
900:
899:
898:
895:
893:
890:
889:
883:
882:
875:
872:
870:
867:
865:
862:
860:
857:
855:
852:
850:
847:
845:
842:
841:
835:
834:
827:
824:
822:
819:
817:
814:
812:
809:
807:
804:
802:
799:
797:
794:
792:
789:
788:
782:
781:
774:
771:
769:
766:
764:
761:
759:
756:
755:
749:
748:
741:
738:
736:
733:
731:
730:Crowdsourcing
728:
726:
723:
722:
716:
715:
706:
703:
702:
701:
698:
696:
693:
691:
688:
686:
683:
682:
679:
674:
673:
665:
662:
660:
659:Memtransistor
657:
655:
652:
650:
647:
643:
640:
639:
638:
635:
633:
630:
626:
623:
621:
618:
616:
613:
611:
608:
607:
606:
603:
601:
598:
596:
593:
591:
588:
584:
581:
580:
579:
576:
572:
569:
567:
564:
562:
559:
557:
554:
553:
552:
549:
547:
544:
542:
541:Deep learning
539:
537:
534:
533:
530:
525:
524:
517:
514:
512:
509:
507:
505:
501:
499:
496:
495:
492:
487:
486:
477:
476:Hidden Markov
474:
472:
469:
467:
464:
463:
462:
459:
458:
455:
450:
449:
442:
439:
437:
434:
432:
429:
427:
424:
422:
419:
417:
414:
412:
409:
407:
404:
402:
399:
398:
395:
390:
389:
382:
379:
377:
374:
372:
368:
366:
363:
361:
358:
356:
354:
350:
348:
345:
343:
340:
338:
335:
334:
331:
326:
325:
318:
315:
313:
310:
308:
305:
303:
300:
298:
295:
293:
290:
288:
285:
283:
281:
277:
273:
272:Random forest
270:
268:
265:
263:
260:
259:
258:
255:
253:
250:
248:
245:
244:
237:
236:
231:
230:
222:
216:
215:
208:
205:
203:
200:
198:
195:
193:
190:
188:
185:
183:
180:
178:
175:
173:
170:
168:
165:
163:
160:
158:
157:Data cleaning
155:
153:
150:
148:
145:
143:
140:
138:
135:
133:
130:
128:
125:
123:
120:
119:
113:
112:
105:
102:
100:
97:
95:
92:
90:
87:
85:
82:
80:
77:
75:
72:
70:
69:Meta-learning
67:
65:
62:
60:
57:
55:
52:
50:
47:
45:
42:
41:
35:
34:
31:
26:
23:
22:
18:
17:
Definition

Given two column vectors $X = (x_1, \dots, x_n)^T$ and $Y = (y_1, \dots, y_m)^T$ of random variables with finite second moments, one may define the cross-covariance $\Sigma_{XY} = \operatorname{cov}(X, Y)$ to be the $n \times m$ matrix whose $(i, j)$ entry is the covariance $\operatorname{cov}(x_i, y_j)$. In practice, we would estimate the covariance matrix based on sampled data from $X$ and $Y$ (i.e. from a pair of data matrices).

Canonical-correlation analysis seeks a sequence of vectors $a_k \in \mathbb{R}^n$ and $b_k \in \mathbb{R}^m$ such that the random variables $a_k^T X$ and $b_k^T Y$ maximize the correlation $\rho = \operatorname{corr}(a_k^T X, b_k^T Y)$. The (scalar) random variables $U = a_1^T X$ and $V = b_1^T Y$ are the first pair of canonical variables. Then one seeks vectors maximizing the same correlation subject to the constraint that they are uncorrelated with the first pair of canonical variables; this gives the second pair of canonical variables. This procedure may be continued up to $\min\{m, n\}$ times:

$$(a_k, b_k) = \underset{a,b}{\operatorname{argmax}}\ \operatorname{corr}(a^T X, b^T Y) \quad \text{subject to } \operatorname{cov}(a^T X, a_j^T X) = \operatorname{cov}(b^T Y, b_j^T Y) = 0 \text{ for } j = 1, \dots, k-1.$$

The sets of vectors $a_k$, $b_k$ are called canonical weights, or simply weights. The 'dual' sets of vectors $\Sigma_{XX} a_k$, $\Sigma_{YY} b_k$ are called canonical loadings, or simply loadings.
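The sample form of the definition starts from a pair of data matrices. A minimal sketch (not part of the original article; variable names are illustrative) of estimating the covariance blocks $\Sigma_{XX}$, $\Sigma_{YY}$, and $\Sigma_{XY}$ from paired observations, using NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Paired samples: p observations of X (n variables) and Y (m variables).
p, n, m = 500, 3, 2
X = rng.normal(size=(p, n))
Y = X[:, :m] + 0.5 * rng.normal(size=(p, m))  # correlated with X by construction

# Joint sample covariance of the stacked vector (X, Y); its diagonal blocks
# estimate Sigma_XX and Sigma_YY, and the off-diagonal block estimates
# the n x m cross-covariance Sigma_XY.
S = np.cov(np.hstack([X, Y]), rowvar=False)
Sxx, Syy = S[:n, :n], S[n:, n:]
Sxy = S[:n, n:]
print(Sxy.shape)  # (3, 2)
```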
Computation

Derivation

Let $\Sigma_{XY}$ be the cross-covariance matrix for any pair of (vector-shaped) random variables $X$ and $Y$. The target function to maximize is

$$\rho = \frac{a^T \Sigma_{XY} b}{\sqrt{a^T \Sigma_{XX} a}\,\sqrt{b^T \Sigma_{YY} b}}.$$

The first step is to perform a change of basis and define

$$c = \Sigma_{XX}^{1/2} a, \qquad d = \Sigma_{YY}^{1/2} b,$$

where $\Sigma_{XX}^{1/2}$ and $\Sigma_{YY}^{1/2}$ can be obtained from the eigen-decompositions

$$\Sigma_{XX}^{1/2} = V_X D_X^{1/2} V_X^\top \text{ with } V_X D_X V_X^\top = \Sigma_{XX}, \qquad \Sigma_{YY}^{1/2} = V_Y D_Y^{1/2} V_Y^\top \text{ with } V_Y D_Y V_Y^\top = \Sigma_{YY}.$$

Thus

$$\rho = \frac{c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d}{\sqrt{c^T c}\,\sqrt{d^T d}}.$$

By the Cauchy–Schwarz inequality,

$$\left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2}\right)(d) \leq \left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} \Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c\right)^{1/2} \left(d^T d\right)^{1/2},$$

$$\rho \leq \frac{\left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2} c\right)^{1/2}}{\left(c^T c\right)^{1/2}}.$$

There is equality if the vectors $d$ and $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$ are collinear. In addition, the maximum of correlation is attained if $c$ is the eigenvector with the maximum eigenvalue for the matrix $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$ (see Rayleigh quotient). The subsequent pairs are found by using eigenvalues of decreasing magnitudes. Orthogonality is guaranteed by the symmetry of the correlation matrices.

Another way of viewing this computation is that $c$ and $d$ are the left and right singular vectors of the correlation matrix of $X$ and $Y$ corresponding to the highest singular value.
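The change of basis above relies on the symmetric matrix square root $\Sigma_{XX}^{1/2} = V_X D_X^{1/2} V_X^\top$. A small numerical sketch (not from the original article; matrix values are illustrative) of this construction in NumPy, verifying that the result squares back to the original covariance:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
Sxx = A @ A.T + 4 * np.eye(4)   # a symmetric positive-definite "covariance"

# Eigen-decomposition Sxx = V D V^T, so Sxx^{1/2} = V D^{1/2} V^T,
# exactly as in the derivation above.
D, V = np.linalg.eigh(Sxx)
Sxx_half = V @ np.diag(np.sqrt(D)) @ V.T

print(np.allclose(Sxx_half @ Sxx_half, Sxx))  # True
```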
Solution

The solution is therefore:

$c$ is an eigenvector of $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$;
$d$ is proportional to $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$.

Reversely, there is also:

$a$ is an eigenvector of $\Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX}$;
$b$ is a proportional eigenvector of $\Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY}$.

The canonical variables are defined by:

$$U = c^T \Sigma_{XX}^{-1/2} X = a^T X, \qquad V = d^T \Sigma_{YY}^{-1/2} Y = b^T Y.$$

Implementation

CCA can be computed using singular value decomposition on a correlation matrix. It is available as a function in MATLAB (canoncorr, also available in Octave), in R (the standard function cancor and several packages, including CCA and vegan; CCP is available for statistical hypothesis testing in canonical correlation analysis), in SAS (proc cancorr), in SPSS, and in Python (in the library scikit-learn, as cross decomposition, and in statsmodels, as CanCorr).

CCA computation using singular value decomposition on a correlation matrix is related to the cosine of the angles between flats. The cosine function is ill-conditioned for small angles, leading to very inaccurate computation of highly correlated principal vectors in finite precision computer arithmetic. To fix this trouble, alternative algorithms are available in SciPy and MATLAB.
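The whitening-plus-SVD route described above can be sketched directly in NumPy. This is an illustrative implementation under the stated assumptions (sample covariances from centered data, full-rank covariance blocks), not the reference algorithm of any particular library:

```python
import numpy as np

def cca(X, Y):
    """Canonical correlations and weights via whitening + SVD (a sketch)."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    p = len(X)
    Sxx = X.T @ X / (p - 1)
    Syy = Y.T @ Y / (p - 1)
    Sxy = X.T @ Y / (p - 1)

    def inv_sqrt(S):
        # S^{-1/2} from the eigen-decomposition, as in the derivation.
        D, V = np.linalg.eigh(S)
        return V @ np.diag(D ** -0.5) @ V.T

    Wx, Wy = inv_sqrt(Sxx), inv_sqrt(Syy)
    C, rho, Dt = np.linalg.svd(Wx @ Sxy @ Wy)  # c, d are singular vectors
    a = Wx @ C       # columns a_k = Sxx^{-1/2} c_k
    b = Wy @ Dt.T    # columns b_k = Syy^{-1/2} d_k
    return rho, a, b

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 3))
Y = np.hstack([X[:, :1] + 0.1 * rng.normal(size=(1000, 1)),
               rng.normal(size=(1000, 1))])
rho, a, b = cca(X, Y)
U, V = X @ a[:, 0], Y @ b[:, 0]
# corr(U, V) reproduces the top singular value rho[0]
```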
Hypothesis testing

Each row can be tested for significance with the following method. Since the correlations are sorted, saying that row $i$ is zero implies all further correlations are also zero. If we have $p$ independent observations in a sample and $\widehat{\rho}_i$ is the estimated correlation for $i = 1, \dots, \min\{m,n\}$, then for the $i$-th row the test statistic is

$$\chi^2 = -\left(p - 1 - \tfrac{1}{2}(m + n + 1)\right) \ln \prod_{j=i}^{\min\{m,n\}} \left(1 - \widehat{\rho}_j^{\,2}\right),$$

which is asymptotically distributed as a chi-squared distribution with $(m - i + 1)(n - i + 1)$ degrees of freedom for large $p$. Since all the correlations from $\min\{m,n\}$ to $p$ are logically zero (and estimated that way also), the product for the terms after this point is irrelevant. Note that in the small sample size limit with $p < n + m$, we are guaranteed that the top $m + n - p$ correlations will be identically 1, and hence the test is meaningless.
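The chi-squared statistic above is straightforward to evaluate. A hedged sketch (not from the original article; function and variable names are illustrative), assuming SciPy for the chi-squared tail probability:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_pvalues(rho, p, n, m):
    """P-values for H0: the i-th and all later canonical correlations are zero.

    rho: estimated canonical correlations; p: sample size;
    n, m: dimensions of X and Y.  Implements the chi-squared statistic above.
    """
    k = min(m, n)
    rho = np.asarray(rho, dtype=float)
    pvals = []
    for i in range(1, k + 1):
        stat = -(p - 1 - (m + n + 1) / 2) * np.log(np.prod(1 - rho[i - 1:k] ** 2))
        df = (m - i + 1) * (n - i + 1)
        pvals.append(chi2.sf(stat, df))
    return pvals

# A strong first correlation and a negligible second one:
pv = bartlett_pvalues([0.9, 0.05], p=200, n=2, m=2)
# pv[0] is tiny (first correlation significant), pv[1] is large
```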
Practical uses

A typical use for canonical correlation in the experimental context is to take two sets of variables and see what is common among the two sets. For example, in psychological testing, one could take two well established multidimensional personality tests such as the Minnesota Multiphasic Personality Inventory (MMPI-2) and the NEO. By seeing how the MMPI-2 factors relate to the NEO factors, one could gain insight into what dimensions were common between the tests and how much variance was shared. For example, one might find that an extraversion or neuroticism dimension accounted for a substantial amount of shared variance between the two tests.

One can also use canonical-correlation analysis to produce a model equation which relates two sets of variables, for example a set of performance measures and a set of explanatory variables, or a set of outputs and a set of inputs. Constraint restrictions can be imposed on such a model to ensure it reflects theoretical requirements or intuitively obvious conditions. This type of model is known as a maximum correlation model.

Visualization of the results of canonical correlation is usually through bar plots of the coefficients of the two sets of variables for the pairs of canonical variates showing significant correlation. Some authors suggest that they are best visualized by plotting them as heliographs, a circular format with ray-like bars, with each half representing the two sets of variables.
Examples

Let $X = x_1$ be a random variable with zero expected value, i.e., $\operatorname{E}(X) = 0$.

If $Y = X$, i.e., $X$ and $Y$ are perfectly correlated, then, e.g., $a = 1$ and $b = 1$, so that the first (and only, in this example) pair of canonical variables is $U = X$ and $V = Y = X$.

If $Y = -X$, i.e., $X$ and $Y$ are perfectly anticorrelated, then, e.g., $a = 1$ and $b = -1$, so that the first pair of canonical variables is $U = X$ and $V = -Y = X$.

We notice that in both cases $U = V$, which illustrates that canonical-correlation analysis treats correlated and anticorrelated variables similarly.
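The anticorrelated case can be checked numerically. A minimal sketch (not part of the original article; data are simulated) showing that the canonical pair for $Y = -X$ is perfectly correlated even though $X$ and $Y$ have correlation $-1$:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
X = x.reshape(-1, 1)
Y = -X                     # perfectly anticorrelated with X

# With a = 1 and b = -1, U = a*X and V = b*Y satisfy U = V = X,
# so the canonical correlation is +1 while corr(X, Y) = -1.
a, b = 1.0, -1.0
U, V = a * X[:, 0], b * Y[:, 0]
print(np.corrcoef(U, V)[0, 1])                 # ~ 1.0
print(np.corrcoef(X[:, 0], Y[:, 0])[0, 1])     # ~ -1.0
```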
Connection to principal angles

Assuming that $X$ and $Y$ have zero expected values, i.e., $\operatorname{E}(X) = \operatorname{E}(Y) = 0$, their covariance matrices $\Sigma_{XX} = \operatorname{Cov}(X, X) = \operatorname{E}[X X^T]$ and $\Sigma_{YY} = \operatorname{Cov}(Y, Y) = \operatorname{E}[Y Y^T]$ can be viewed as Gram matrices in an inner product for the entries of $X$ and $Y$, correspondingly. In this interpretation, the random variables, entries $x_i$ of $X$ and $y_j$ of $Y$, are treated as elements of a vector space with an inner product given by the covariance $\operatorname{cov}(x_i, y_j)$; see the relationship of covariance to inner products.

The definition of the canonical variables $U$ and $V$ is then equivalent to the definition of principal vectors for the pair of subspaces spanned by the entries of $X$ and $Y$ with respect to this inner product. The canonical correlations $\operatorname{corr}(U, V)$ are equal to the cosine of the principal angles.

Whitening and probabilistic canonical correlation analysis

CCA can also be viewed as a special whitening transformation where the random vectors $X$ and $Y$ are simultaneously transformed in such a way that the cross-correlation between the whitened vectors $X^{CCA}$ and $Y^{CCA}$ is diagonal. The canonical correlations are then interpreted as regression coefficients linking $X^{CCA}$ and $Y^{CCA}$, and may also be negative. The regression view of CCA also provides a way to construct a latent variable probabilistic generative model for CCA.

References

Hotelling, H. (1936). "Relations Between Two Sets of Variates". Biometrika. 28 (3/4): 321–377. JSTOR 2333955.
Jordan, C. (1875). "Essai sur la géométrie à n dimensions". Bull. Soc. Math. France. 3: 103.
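The principal-angle view has a direct sample analogue: the canonical correlations of a pair of data matrices are the cosines of the principal angles between the column spans of the centered matrices. A sketch (data and names are illustrative) using SciPy's subspace_angles, one of the numerically robust alternatives mentioned under Implementation:

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 3))
Y = np.hstack([X[:, :1], rng.normal(size=(200, 1))])  # shares one column with X

# Principal angles between the column spans of the centered data matrices;
# their cosines are the sample canonical correlations.
Xc, Yc = X - X.mean(0), Y - Y.mean(0)
angles = subspace_angles(Xc, Yc)        # returned in descending order
canon_corr = np.cos(angles)[::-1]       # largest correlation first
# the shared column forces the top canonical correlation to 1
```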