Regression analysis

A properly conducted regression analysis will include an assessment of how well the assumed form is matched by the observed data, but it can only do so within the range of values of the independent variables actually available. This means that any extrapolation is particularly reliant on the assumptions being made about the structural form of the regression relationship. If this knowledge includes the fact that the dependent variable cannot go outside a certain range of values, this can be made use of in selecting the model, even if the observed dataset has no values particularly near such bounds. The implications of this step of choosing an appropriate functional form for the regression can be great when extrapolation is considered. At a minimum, it can ensure that any extrapolation arising from a fitted model is "realistic" (or in accord with what is known).

(or polyserial correlations) between the categorical variables. Such procedures differ in the assumptions made about the distribution of the variables in the population. If the variable is positive with low values and represents the repetition of the occurrence of an event, then count models like the Poisson regression or the negative binomial model may be used.
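As a hedged illustration of the count-model option just mentioned, the sketch below fits a Poisson regression with statsmodels on synthetic count data; the data, variable names, and settings are illustrative assumptions, not taken from the source.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    counts = rng.poisson(np.exp(0.3 + 0.7 * x))    # synthetic event counts

    X = sm.add_constant(x)                          # intercept plus one regressor
    poisson_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
    print(poisson_fit.params)                       # estimated coefficients on the log scale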
applications and on some calculators. While many statistical software packages can perform various types of nonparametric and robust regression, these methods are less standardized. Different software packages implement different methods, and a method with a given name may be implemented differently
in 1809. Legendre and Gauss both applied the method to the problem of determining, from astronomical observations, the orbits of bodies about the Sun (mostly comets, but also later the then newly discovered minor planets). Gauss published a further development of the theory of least squares in 1821,
between the independent and dependent variables. Importantly, regressions by themselves only reveal relationships between a dependent variable and a collection of independent variables in a fixed dataset. To use regressions for prediction or to infer causal relationships, respectively, a researcher
are sometimes more difficult to interpret if the model's assumptions are violated. For example, if the error term does not have a normal distribution, then in small samples the estimated parameters will not follow normal distributions, which complicates inference. With relatively large samples, however, a
[Figure: In the middle, the interpolated straight line represents the best balance between the points above and below this line. The dotted lines represent the two extreme lines. The first curves represent the estimated values. The outer curves represent a prediction for a new measurement.]
in the class of linear unbiased estimators. Because these classical assumptions are unlikely to hold exactly, practitioners have developed a variety of methods to maintain some or all of these desirable properties in real-world settings. For example, modeling
must carefully justify why existing relationships have predictive power for a new context or why a relationship between two variables has a causal interpretation. The latter is especially important when researchers hope to estimate causal relationships using
Performing extrapolation relies strongly on the regression assumptions. The further the extrapolation goes outside the data, the more room there is for the model to fail due to differences between the assumptions and the sample data or the true values.
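One simple safeguard that follows from this point is to flag candidate predictions whose inputs lie outside the observed range of the regressors before trusting them. The function below is a minimal sketch; its name and interface are assumptions, not from the source.

    import numpy as np

    def flag_extrapolation(X_train, X_new):
        """Mark rows of X_new that fall outside the per-column range of X_train."""
        lo, hi = X_train.min(axis=0), X_train.max(axis=0)
        return np.any((X_new < lo) | (X_new > hi), axis=1)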
By itself, a regression is simply a calculation using the data. In order to interpret the output of regression as a meaningful statistical quantity that measures real-world relationships, researchers often rely on a number of classical
is the number of observations needed to reach the desired precision if the model had only one independent variable. For example, a researcher is building a linear regression model using a dataset that contains 1000 patients (N).
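Using the heuristic N = m^n conjectured by Good and Hardin (stated elsewhere in this article), the 1000-patient example works out as follows; this is a minimal sketch using only the numbers already given in the text.

    import math

    N, m = 1000, 5                      # sample size and observations needed per variable
    n_max = math.log(N) / math.log(m)   # about 4.29
    print(math.floor(n_max))            # at most 4 independent variables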
\hat{\sigma}_{\beta_0} = \hat{\sigma}_{\varepsilon}\sqrt{\frac{1}{n} + \frac{\bar{x}^{2}}{\sum (x_{i}-\bar{x})^{2}}} = \hat{\sigma}_{\beta_1}\sqrt{\frac{\sum x_{i}^{2}}{n}}.
The response variable may be non-continuous ("limited" to lie on some subset of the real line). For binary (zero or one) variables, if analysis proceeds with least-squares linear regression, the model is called the
that represents the uncertainty may accompany the point prediction. Such intervals tend to expand rapidly as the values of the independent variable(s) move outside the range covered by the observed data.
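A minimal sketch of this behaviour with statsmodels on synthetic data (all names, values, and settings here are illustrative assumptions): the prediction interval at a point far outside the observed x-range comes out much wider than at an interior point.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 10, size=80)
    y = 2.0 + 1.5 * x + rng.normal(scale=2.0, size=x.size)

    res = sm.OLS(y, sm.add_constant(x)).fit()

    x_new = np.array([5.0, 25.0])       # one interior point, one far outside the data
    pred = res.get_prediction(sm.add_constant(x_new))
    print(pred.summary_frame(alpha=0.05)[["obs_ci_lower", "obs_ci_upper"]])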
, regression in which the predictor (independent variable) or response variables are curves, images, graphs, or other complex data objects, regression methods accommodating various types of missing data,
in the 19th century to describe a biological phenomenon. The phenomenon was that the heights of descendants of tall ancestors tend to regress down towards a normal average (a phenomenon also known as
is largely focused on developing techniques that allow researchers to make reasonable conclusions in real-world settings, where classical assumptions do not hold exactly.
There are no generally agreed methods for relating the number of observations versus the number of independent variables in the model. One method conjectured by Good and Hardin is
When the model function is not linear in the parameters, the sum of squares must be minimized by an iterative procedure. This introduces many complications, which are summarized in Differences between linear and non-linear least squares.
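As a hedged sketch of such an iterative fit, scipy's curve_fit can be used; the exponential model form and the synthetic data below are illustrative assumptions rather than anything taken from the source.

    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b, c):
        # nonlinear in the parameters a, b, c
        return a * np.exp(-b * x) + c

    rng = np.random.default_rng(2)
    xdata = np.linspace(0.0, 4.0, 50)
    ydata = model(xdata, 2.5, 1.3, 0.5) + 0.2 * rng.normal(size=xdata.size)

    popt, pcov = curve_fit(model, xdata, ydata, p0=[1.0, 1.0, 0.0])
    perr = np.sqrt(np.diag(pcov))   # approximate standard errors of the estimates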
of the dependent variable when the independent variables take on a given set of values. Less common forms of regression use slightly different procedures to estimate alternative
Returning our attention to the straight line case: Given a random sample from the population, we estimate the population parameters and obtain the sample linear regression model:
Interpretations of these diagnostic tests rest heavily on the model's assumptions. Although examination of the residuals can be used to invalidate a model, the results of a
type models may be used when the sample is not randomly selected from the population of interest. An alternative to such procedures is linear regression based on
methods for regression, regression in which the predictor variables are measured with error, regression with more predictor variables than observations, and
of the response variable is Gaussian, but the joint distribution need not be. In this respect, Fisher's assumption is closer to Gauss's formulation of 1821.
However, this does not cover the full set of modeling errors that may be made: in particular, the assumption of a particular form for the relation between
Under the further assumption that the population error term is normally distributed, the researcher can use these estimated standard errors to create
\sum_{i} \hat{e}_{i}^{2} = \sum_{i}\left(\hat{Y}_{i} - \left(\hat{\beta}_{0} + \hat{\beta}_{1}X_{1i} + \hat{\beta}_{2}X_{2i}\right)\right)^{2} = 0
Although the parameters of a regression model are usually estimated using the method of least squares, other methods which have been used include:
to distinguish the estimate from the true (unknown) parameter value that generated the data. Using this estimate, the researcher can then use the
for prediction or to assess the accuracy of the model in explaining the data. Whether the researcher is intrinsically interested in the estimate
Sufficient data must be available to estimate a regression model. For example, suppose that a researcher has access to
model is a standard method of estimating a joint relationship between several binary dependent variables and some independent variables. For
that minimizes the sum of squared differences between the true data and that line (or hyperplane). For specific mathematical reasons (see
be able to reconstruct any of the independent variables by adding and multiplying the remaining independent variables. As discussed in
"The coefficient of determination R-squared is more informative than SMAPE, MAE, MAPE, MSE and RMSE in regression analysis evaluation"
in different packages. Specialized regression software has been developed for use in fields such as survey analysis and neuroimaging.
(MSE) of the regression. The denominator is the sample size reduced by the number of model parameters estimated from the same data,

\hat{\sigma}_{\beta_1} = \hat{\sigma}_{\varepsilon}\sqrt{\frac{1}{\sum (x_{i}-\bar{x})^{2}}}
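A minimal numpy sketch of these two quantities for the straight-line case; the function name and interface are assumptions, while the formulas are the ones shown above and nearby in the article.

    import numpy as np

    def slope_standard_error(x, y, b0, b1):
        """Root-MSE with n - 2 degrees of freedom and the standard error of the slope."""
        resid = y - (b0 + b1 * x)
        sigma_hat = np.sqrt(np.sum(resid ** 2) / (x.size - 2))
        se_b1 = sigma_hat / np.sqrt(np.sum((x - x.mean()) ** 2))
        return sigma_hat, se_b1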
Regression analysis is primarily used for two conceptually distinct purposes. First, regression analysis is widely used for
A handful of conditions are sufficient for the least-squares estimator to possess desirable properties: in particular, the
\sum_{i=1}^{n}\sum_{k=1}^{p} x_{ij}x_{ik}\,\hat{\beta}_{k} = \sum_{i=1}^{n} x_{ij}y_{i},\qquad j=1,\dots,p.
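In matrix notation these are the normal equations, written elsewhere in the article as (X^T X) beta-hat = X^T y. A minimal numpy sketch of solving them directly follows; the names and the simulated data are illustrative assumptions.

    import numpy as np

    def ols_via_normal_equations(X, y):
        """Solve (X'X) beta = X'y for the least-squares coefficient vector."""
        return np.linalg.solve(X.T @ X, X.T @ y)

    rng = np.random.default_rng(3)
    X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
    y = X @ np.array([1.0, 2.0, -0.5]) + 0.1 * rng.normal(size=50)
    beta_hat = ols_via_normal_equations(X, y)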
This is still linear regression; although the expression on the right hand side is quadratic in the independent variable
Fotheringham, AS; Wong, DWS (1 January 1991). "The modifiable areal unit problem in multivariate statistical analysis".
Under the assumption that the population error term has a constant variance, the estimate of that variance is given by:
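For the straight-line (two-parameter) case used in this part of the article, that estimator takes the form below, where SSR denotes the sum of squared residuals:

\hat{\sigma}_{\varepsilon}^{2} = \frac{SSR}{n-2}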
\widehat{\beta}_{1} = \frac{\sum (x_{i}-\bar{x})(y_{i}-\bar{y})}{\sum (x_{i}-\bar{x})^{2}}
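A minimal numpy sketch of the closed-form estimates; the companion intercept formula, beta0-hat = y-bar minus beta1-hat times x-bar, appears elsewhere in the article, and the function name is an assumption.

    import numpy as np

    def simple_ols(x, y):
        """Closed-form least-squares intercept and slope for y = b0 + b1*x + error."""
        b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
        b0 = y.mean() - b1 * x.mean()
        return b0, b1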
standard errors, among other techniques. When rows of data correspond to locations in space, the choice of how to model
Regression methods continue to be an area of active research. In recent decades, new methods have been developed for
to calculate regressions. Before 1970, it sometimes took up to 24 hours to receive the result from one regression.
In practice, researchers first select a model they would like to estimate and then use their chosen method (e.g.,
Distance metric learning, in which a meaningful distance metric for the given input space is learned.
, a set of simultaneous linear equations in the parameters, which are solved to yield the parameter estimators,
In multiple linear regression, there are several independent variables or functions of independent variables.
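A hedged sketch of such a fit: the design matrix below stacks an intercept column, x, and x squared, so the model is quadratic in x but still linear in the parameters. The simulated data and coefficient values are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.uniform(-2.0, 2.0, size=100)
    y = 1.0 + 0.5 * x - 0.8 * x**2 + 0.3 * rng.normal(size=x.size)

    X = np.column_stack([np.ones_like(x), x, x**2])   # columns: 1, x, x^2
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)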
must be specified. Sometimes the form of this function is based on knowledge about the relationship between
that most closely fits the data according to a specific mathematical criterion. For example, the method of
that does not rely on the data. If no such knowledge is available, a flexible or convenient form for
2046: 1592: 1366: 972: 740: 610: 342: 331: 295: 202: 7613:). If the researcher decides that five observations are needed to precisely define a straight line ( 6205:{\displaystyle y_{i}=\beta _{1}x_{i1}+\beta _{2}x_{i2}+\cdots +\beta _{p}x_{ip}+\varepsilon _{i},\,} 2717: 2623: 13620: 13319: 13031: 12927: 12758: 12583: 12350: 12213: 11898: 11863: 11827: 11612: 11054: 10963: 10922: 10834: 10525: 10364: 10158: 10036: 10000: 9928: 9818: 9800: 9643: 8900: 7823: 7759: 7733: 7723: 7333: 7274: 6510:{\displaystyle \varepsilon _{i}=y_{i}-{\hat {\beta }}_{1}x_{i1}-\cdots -{\hat {\beta }}_{p}x_{ip}.} 4172: 4126: 4077: 3994: 1596: 1570: 1535: 1486: 994: 914: 837: 755: 585: 547: 542: 502: 497: 403: 274: 197: 90: 69: 7480:
For such reasons and others, some analysts consider it unwise to undertake extrapolation.
7020: 6816: 1485:) or estimate the conditional expectation across a broader collection of non-linear models (e.g., 13891: 13818: 13650: 12492: 12105: 12045: 11982: 11620: 11604: 11342: 11204: 11194: 11044: 10958: 9899: 9688: 9673: 9326: 9321: 9221: 9089: 8870: 7899: 4965: 4649: 3800: 2712: 1522: 1466: 941: 790: 690: 517: 433: 326: 6344: 5413: 5360: 4622: 4595: 4252: 4225: 1619:) to estimate the parameters of that model. Regression models involve the following components: 13914: 13886: 13665: 13389: 12875: 12786: 12530: 12460: 12253: 12190: 11945: 11832: 10829: 10726: 10633: 10512: 10411: 10252: 10078: 9943: 9933: 9884: 9648: 9408: 9127: 9122: 8332: 7919: 7818: 7377: 7258: 7143: 7088: 7042: 4957: 4394: 4006: 3844: 3710: 3483: 2657: 2365: 1823: 1616: 1549:). For Galton, regression had only this biological meaning, but his work was later extended by 1454: 1121: 1097: 999: 760: 735: 695: 507: 290: 285: 227: 8529: 6308: 13871: 13655: 13190: 13091: 12984: 12833: 12740: 12725: 12555: 12497: 12440: 12266: 12159: 12068: 11794: 11678: 11537: 11529: 11419: 11411: 11226: 11122: 11100: 11059: 11024: 10991: 10937: 10912: 10867: 10806: 10766: 10568: 10391: 10333: 10293: 10257: 10242: 10193: 10137: 9964: 9678: 9663: 9628: 9316: 9216: 9084: 8289: 8205:"The goodness of fit of regression formulae, and the distribution of regression coefficients" 7974: 7869: 7793: 7729: 7502: 7306: 5600: 3542: 1526: 1075: 897: 849: 705: 620: 492: 378: 74: 35: 9546: 8714:
Data Fitting and Uncertainty (A practical introduction to weighted least squares and beyond)
6838: 6218: 3716: 3568: 2469: 2371: 2343: 2085:
that most closely fits the data. To carry out regression analysis, the form of the function
1860: 1637: 13846: 13630: 13495: 13461: 13409: 13234: 13229: 13175: 13131: 13021: 12959: 12848: 12843: 12478: 12053: 12002: 11978: 11940: 11858: 11837: 11789: 11668: 11646: 11615: 11524: 11401: 11352: 11270: 11243: 11199: 11155: 10917: 10693: 10573: 10298: 10237: 10224: 10173: 10073: 9995: 9974: 9948: 9698: 9653: 9099: 9044: 8890: 8885: 7879: 7397: 7349: 7321: 6908: 6369: 6050: 6038: 5545: 4932: 4568: 4198: 4135: 4083: 4048: 4021: 4002: 3998: 3953: 3919: 3453: 2135: 2108: 2012: 1911: 1905: 1880: 1833: 1798: 1756: 1718: 1664: 1530: 1422: 1004: 954: 398: 388: 269: 237: 192: 171: 79: 8406:
Principles and Procedures of Statistics with Special Reference to the Biological Sciences.
, Firmin Didot, Paris, 1805. "Sur la MÊthode des moindres quarrÊs" appears as an appendix.
Meade, Nigel; Islam, Towhidul (1995). "Prediction intervals for growth curve forecasts".
7889: 7864: 7718: 7472: 7419: 7345: 7341: 7309:
can be invoked such that hypothesis testing may proceed using asymptotic approximations.
6030: 5664: 5638: 3702: 3594: 3489: 3459: 3015: 2770: 2036:
are assumed to be free of error. This important assumption is often overlooked, although
1600: 1562: 1510: 1478: 1474: 1107: 1043: 1014: 919: 745: 664: 650: 625: 527: 487: 316: 217: 212: 166: 115: 105: 9757: 8106:(Galton uses the term "regression" in this paper, which discusses the height of humans.) 7633:), then the maximum number of independent variables the model can support is 4, because 6793: 4893:, is the difference between the value of the dependent variable predicted by the model, 4817:{\displaystyle {\widehat {y}}_{i}={\widehat {\beta }}_{0}+{\widehat {\beta }}_{1}x_{i}.} 3486:. Alternatively, one can visualize infinitely many 3-dimensional planes that go through 13478: 13061: 12910: 12639: 12450: 12304: 12200: 12149: 12025: 11922: 11906: 11883: 11660: 11394: 11377: 11337: 11248: 11143: 11105: 11076: 11036: 10996: 10942: 10859: 10545: 10540: 10328: 10232: 10221: 10046: 9658: 9236: 8827: 8774: 8747: 8387: 8314: 8237: 8224: 8204: 8185: 8141: 8072: 7784: 7616: 7596: 7575: 7555: 7535: 7415: 7381: 7373: 7242:{\displaystyle \mathbf {{\hat {\boldsymbol {\beta }}}=(X^{\top }X)^{-1}X^{\top }Y} .\,} 7068: 6935: 6888: 6868: 6378: 6288: 6268: 6248: 6064: 5577: 5442: 5393: 4708: 4378:{\displaystyle y_{i}=\beta _{0}+\beta _{1}x_{i}+\varepsilon _{i},\quad i=1,\dots ,n.\!} 4178: 4160: 3522: 2821: 2162: 2088: 1691: 1558: 1450: 1406: 1085: 1009: 795: 590: 421: 150: 135: 6285:-th independent variable. If the first independent variable takes the value 1 for all 13640: 13635: 13563: 13512: 13286: 13266: 13254: 13214: 13185: 13153: 13071: 12964: 12932: 12828: 12768: 12634: 12545: 12515: 12507: 12327: 12318: 12243: 12174: 12030: 12015: 11990: 11878: 11819: 11685: 11673: 11299: 11216: 11160: 11083: 10927: 10849: 10628: 10502: 10323: 10283: 10010: 9920: 9911: 9724: 9712: 9516: 9168: 9039: 9032: 8830:– how linear regression mistakes can appear when Y-range is much smaller than X-range 8779: 8717: 8675: 8650: 8621: 8594: 8525: 8474: 8446: 8391: 8350: 8268: 8257: 8029: 7938: 7884: 7874: 7803: 7778: 7385: 7353: 6056: 5539: 4120: 3982: 3857:
Deviations from the model have an expected value of zero, conditional on covariates:
3744: 3621:
in the model. Moreover, to estimate a least squares model, the independent variables
2903:. Suppose further that the researcher wants to estimate a bivariate linear model via 2337: 1584: 1462: 1446: 1402: 1394: 1178: 1021: 934: 730: 700: 645: 640: 595: 537: 416: 207: 110: 64: 8555: 6780:{\displaystyle \mathbf {(X^{\top }X){\hat {\boldsymbol {\beta }}}={}X^{\top }Y} ,\,} 5347:{\displaystyle {\widehat {\beta }}_{0}={\bar {y}}-{\widehat {\beta }}_{1}{\bar {x}}} 13805: 13558: 13517: 13329: 13304: 13116: 13046: 12890: 12675: 12570: 12525: 12289: 12276: 12169: 12144: 12078: 12010: 11888: 11496: 11389: 11322: 11235: 11182: 11001: 10872: 10666: 10550: 10465: 10432: 9979: 9846: 9469: 9459: 9266: 9060: 9010: 9005: 8948: 8936: 8769: 8759: 8694: 8515: 8507: 8379: 8347:
Geographically weighted regression: the analysis of spatially varying relationships
8304: 8252: 8232: 8216: 8177: 8133: 8094:(Galton uses the term "reversion" in this paper, which discusses the size of peas.) 8062: 7894: 7834: 7357: 5112:
In the case of simple regression, the formulas for the least squares estimates are
3944: 1632: 1628: 1604: 1566: 1501: 1206: 959: 909: 819: 803: 773: 635: 630: 580: 570: 468: 232: 161: 8821: 2340:, different forms of regression analysis provide tools to estimate the parameters 2333:
to be a reasonable approximation for the statistical process generating the data.
13866: 13568: 13371: 13364: 13299: 13239: 13121: 13111: 13041: 13014: 12999: 12954: 12944: 12895: 12487: 12231: 12093: 12020: 11695: 11569: 11542: 11519: 11488: 11115: 11110: 11064: 10794: 10445: 10262: 10168: 10109: 10104: 9582: 9526: 9348: 8990: 8910: 8575: 7978: 7698: 7451:
the range of values in the dataset used for model-fitting is known informally as
7277:
of the estimated parameters. Commonly used checks of goodness of fit include the
7270: 7269:
Once a regression model has been constructed, it may be important to confirm the
6034: 5387: 4014:
can lead to reasonable estimates when the independent variables are measured with errors.
3986: 3124:
that explain the data equally well: any combination can be chosen that satisfies
1234: 1038: 904: 844: 393: 100: 11977: 13553: 13339: 13276: 13261: 13244: 13202: 13004: 12885: 12753: 12730: 12720: 12436: 12431: 10894: 10824: 10470: 10206: 9556: 9521: 9511: 9336: 9094: 8920: 8732: 8309: 8089: 7859: 7365: 3456:. To understand why there are infinitely many options, note that the system of 1542: 1254: 785: 522: 145: 4132:
In linear regression, the model specification is that the dependent variable,
13954: 13861: 13740: 13548: 13522: 13399: 13359: 13324: 13314: 13294: 13036: 12994: 12969: 12917: 12905: 12900: 12809: 12593: 12560: 12423: 12384: 12195: 12164: 11628: 11582: 11187: 10889: 10716: 10480: 10475: 10278: 9808: 9788: 9501: 9481: 9398: 9077: 8613: 8181: 7798: 7755: 7463: 7453: 7361: 7324:
or are variables constrained to fall only in a certain range, often arise in
3516: 3117:{\displaystyle ({\hat {\beta }}_{0},{\hat {\beta }}_{1},{\hat {\beta }}_{2})} 2904: 2361: 1470: 1173: 1102: 984: 715: 600: 264: 140: 27:
Set of statistical processes for estimating the relationships among variables
8158: 8067: 8050: 1557:
to a more general statistical context. In the work of Yule and Pearson, the
13881: 13309: 13158: 13148: 13126: 13056: 12949: 12863: 12535: 12468: 12445: 12360: 11690: 10986: 10884: 10819: 10761: 10746: 10683: 10638: 9587: 9418: 8833: 8783: 8698: 8602: 8579: 8466: 8159: 8117: 8092:. "Typical laws of heredity", Nature 15 (1877), 492–495, 512–514, 532–533. 7337: 7325: 4960:. This method obtains parameter estimates that minimize the sum of squared 4108: 3978: 1554: 130: 8552:
Proc. International Conference on Computer Analysis of Images and Patterns
7726:, requires a large number of observations and is computationally intensive 2040:
can be used when the independent variables are assumed to contain errors.
13876: 13573: 13271: 13180: 13143: 12707: 12578: 12540: 12223: 12124: 11986: 11799: 11766: 11258: 11175: 11170: 10814: 10771: 10751: 10731: 10721: 10490: 9866: 9683: 9454: 9363: 9358: 8980: 8958: 8764: 8511: 8409: 7808: 7763: 4107:
within geographic units can have important consequences. The subfield of
1588: 1497: 979: 473: 176: 125: 8318: 7372:
may be used when the dependent variable is only sometimes observed, and
3005:{\displaystyle Y_{i}=\beta _{0}+\beta _{1}X_{1i}+\beta _{2}X_{2i}+e_{i}} 13502: 13224: 11424: 10904: 10604: 10535: 10485: 10460: 10380: 9577: 9536: 9531: 9444: 9353: 9261: 9173: 9153: 8228: 8189: 8168: 8145: 8076: 3814: in this section. Unsourced material may be challenged and removed. 1493: 1458: 1128: 824: 750: 8423: 8104:
Francis Galton. Presidential address, Section H, Anthropology. (1885)
7958: 7285:
and hypothesis testing. Statistical significance can be checked by an
5528:{\displaystyle {\hat {\sigma }}_{\varepsilon }^{2}={\frac {SSR}{n-2}}} 1504:. Second, in some situations regression analysis can be used to infer 30: 13600: 13419: 12989: 12870: 11577: 11429: 11049: 10844: 10756: 10741: 10736: 10701: 9572: 9541: 9439: 9283: 9246: 9183: 9137: 9132: 9117: 8264: 7278: 5594: 1550: 1505: 1287: 1068: 8520: 8383: 8345:
Fotheringham, A. Stewart; Brunsdon, Chris; Charlton, Martin (2002).
8220: 8163: 8137: 8121: 7424: 5104: 3789: 13466: 13219: 12853: 12823: 11093: 10711: 10588: 10583: 10578: 9474: 9306: 8815: 3452:
and are therefore valid solutions that minimize the sum of squared
2466:. A given regression method will ultimately provide an estimate of 2179:
is chosen. For example, a simple univariate regression may propose
13138: 12598: 12299: 9597: 9434: 9388: 9311: 9211: 9206: 9158: 8545:"Human age estimation by metric learning for regression problems" 7828: 3482:
equations is to be solved for 3 unknowns, which makes the system
1753:
directly observed in data and are often denoted using the scalar
1063: 12778: 7980:
Nouvelles mÊthodes pour la dÊtermination des orbites des comètes
7762:
and multiple regression using least squares can be done in some
3617:
appears often in regression analysis, and is referred to as the
1715:, which are observed in data and often denoted using the scalar 13433: 12520: 11501: 11475: 11455: 10706: 10497: 9612: 9592: 9464: 9256: 8746:
Chicco, Davide; Warrens, Matthijs J.; Jurman, Giuseppe (2021).
8665:
Applied Regression Analysis, Linear Models and Related Methods.
7717:, which is more robust in the presence of outliers, leading to 7409: 7301: 7297: 7290: 7286: 5094:{\displaystyle {\widehat {\beta }}_{0},{\widehat {\beta }}_{1}} 3038:
data points, then they could find infinitely many combinations
2838:
rows of data with one dependent and two independent variables:
1661:, which are observed in data and are often denoted as a vector 814: 8011:
Theoria combinationis observationum erroribus minimis obnoxiae
7336:. Nonlinear models for binary dependent variables include the 2660:, least squares is widely used because the estimated function 13595: 10349: 9413: 9393: 9383: 9378: 9373: 9368: 9331: 9163: 8707:
Regression Analysis — Theory, Methods, and Applications
1058: 1053: 780: 8344: 2773:) are useful when researchers want to model other functions 13721:
13527: 10440: 9403: 7995:
Chapter 1 of: Angrist, J. D., & Pischke, J. S. (2008).
1561:
of the response and explanatory variables is assumed to be
8740:
Operations and Production Systems with Multiple Objectives
4129:
for a derivation of these formulas and a numerical example
2250:{\displaystyle f(X_{i},\beta )=\beta _{0}+\beta _{1}X_{i}} 1500:, where its use has substantial overlap with the field of 9787: 8473:(3rd ed.). Hoboken, New Jersey: Wiley. p. 211. 6061:
In the more general multiple regression model, there are
4074:
clustered standard errors, geographic weighted regression
1421:
in machine learning parlance) and one or more error-free
8729:
Many Regression Algorithms, One Unified Model: A Review.
6712:
In matrix notation, the normal equations are written as
6375:
The least squares parameter estimates are obtained from
3851:
The sample is representative of the population at large.
2656:
will depend on context and their goals. As described in
1346:
8455:
page 274 section 9.7.4 "interpolation vs extrapolation"
7997:
Mostly Harmless Econometrics: An Empiricist's Companion
7678:{\displaystyle {\frac {\log 1000}{\log 5}}\approx 4.29} 7404:
Differences between linear and non-linear least squares
3997:
assumptions imply that the parameter estimates will be
2584:{\displaystyle {\hat {Y_{i}}}=f(X_{i},{\hat {\beta }})} 1569:
in his works of 1922 and 1925. Fisher assumed that the
8731:
Neural Networks, vol. 69, Sept. 2015, pp. 60–79.
8245: 2326:{\displaystyle Y_{i}=\beta _{0}+\beta _{1}X_{i}+e_{i}} 8369: 8349:(Reprint ed.). Chichester, England: John Wiley. 8333:
Regressions: Why Are Economists Obessessed with Them?
7707:
Percentage regression, for situations where reducing
7642: 7619: 7599: 7578: 7558: 7538: 7505: 7494: 7175: 7146: 7117: 7091: 7071: 7045: 7023: 6987: 6958: 6938: 6911: 6891: 6871: 6841: 6819: 6796: 6721: 6533: 6404: 6381: 6347: 6311: 6291: 6271: 6251: 6221: 6090: 6067: 5823: 5703: 5667: 5641: 5603: 5580: 5548: 5471: 5445: 5416: 5396: 5363: 5275: 5121: 5049: 4977: 4935: 4899: 4837: 4737: 4711: 4684: 4652: 4625: 4598: 4571: 4433: 4397: 4286: 4255: 4228: 4201: 4181: 4138: 4086: 4051: 4024: 3956: 3922: 3863: 3854:
The independent variables are measured with no error.
3753: 3719: 3627: 3597: 3571: 3545: 3525: 3492: 3462: 3258: 3130: 3044: 3018: 2913: 2844: 2824: 2779: 2720: 2666: 2626: 2597: 2524: 2492: 2472: 2394: 2374: 2346: 2263: 2185: 2165: 2138: 2111: 2091: 2049: 2015: 1944: 1914: 1883: 1863: 1836: 1801: 1759: 1721: 1694: 1667: 1640: 12705: 12262:
Autoregressive conditional heteroskedasticity (ARCH)
8162:; Yule, G.U.; Blanchard, Norman; Lee, Alice (1903). 7774: 2459:{\displaystyle \sum _{i}(Y_{i}-f(X_{i},\beta ))^{2}} 1587:, regression involving correlated responses such as 8745: 8471:
Common Errors in Statistics (And How to Avoid Them)
11724: 8256: 7677: 7625: 7605: 7584: 7564: 7544: 7524: 7241: 7158: 7132: 7103: 7077: 7057: 7031: 7009: 6973: 6944: 6924: 6897: 6877: 6857: 6827: 6805: 6779: 6701: 6509: 6387: 6360: 6333: 6297: 6277: 6257: 6237: 6204: 6073: 6018: 5806: 5679: 5653: 5627: 5586: 5566: 5527: 5451: 5431: 5402: 5378: 5346: 5260: 5093: 5039:Minimization of this function results in a set of 5028: 4948: 4921: 4885: 4816: 4717: 4697: 4668: 4638: 4611: 4584: 4554: 4415: 4377: 4268: 4241: 4214: 4187: 4151: 4099: 4064: 4037: 3969: 3935: 3906: 3768: 3735: 3693: 3609: 3583: 3557: 3531: 3504: 3474: 3444: 3244: 3116: 3030: 3004: 2895: 2830: 2807: 2757: 2703: 2648: 2612: 2583: 2507: 2478: 2458: 2380: 2352: 2325: 2249: 2171: 2151: 2124: 2097: 2077: 2043:The researchers' goal is to estimate the function 2028: 1998: 1927: 1896: 1869: 1849: 1814: 1772: 1734: 1700: 1680: 1646: 1445:). The most common form of regression analysis is 7930: 7209: 7192: 7183: 6748: 6739: 6723: 6395:normal equations. The residual can be written as 4551: 4374: 1908:that may stand in for un-modeled determinants of 1449:, in which one finds the line (or a more complex 13952: 8251: 7754:All major statistical software packages perform 4929:, and the true value of the dependent variable, 11810:Multivariate adaptive regression splines (MARS) 8709:, Springer-Verlag, Berlin, 2011 (4th printing). 8605:(1987). "Regression and correlation analysis," 8028:. Kendall/Hunt Publishing Company. p. 59. 5108:Illustration of linear regression on a data set 4195:data points there is one independent variable: 1788:, different terminologies are used in place of 8591:Evan J. Williams, "I. Regression," pp. 523–41. 7959:Criticism and Influence Analysis in Regression 7312: 4886:{\displaystyle e_{i}=y_{i}-{\widehat {y}}_{i}} 1465:), this allows the researcher to estimate the 1341:List of datasets for machine-learning research 12794: 12691: 10365: 9773: 8849: 8542: 8500:Journal of Modern Applied Statistical Methods 7831:(a linear least squares estimation algorithm) 7410:Prediction (interpolation and extrapolation) 7133:{\displaystyle {\hat {\boldsymbol {\beta }}}} 6974:{\displaystyle {\hat {\boldsymbol {\beta }}}} 4016:Heteroscedasticity-consistent standard errors 1374: 441: 8863: 8733:https://doi.org/10.1016/j.neunet.2015.05.005 8636:Journal of Business and Economic Statistics, 5029:{\displaystyle SSR=\sum _{i=1}^{n}e_{i}^{2}} 8644: 7924: 7572:is the number of independent variables and 2336:Once researchers determine their preferred 1999:{\displaystyle Y_{i}=f(X_{i},\beta )+e_{i}} 13756:Centers for Disease Control and Prevention 12801: 12787: 12698: 12684: 10410: 10372: 10358: 9780: 9766: 8856: 8842: 8684: 8597:, "II. Analysis of Variance," pp. 541–554. 
8465: 8051:"Kinship and Correlation (reprinted 1989)" 3694:{\displaystyle (X_{1i},X_{2i},...,X_{ki})} 2257:, suggesting that the researcher believes 1381: 1367: 448: 434: 34:Regression line for 50 random points in a 13716:Centre for Disease Prevention and Control 13706:Center for Disease Control and Prevention 11023: 8773: 8763: 8519: 8308: 8236: 8066: 7360:with more than two values, there are the 7238: 6776: 6698: 6201: 3830:Learn how and when to remove this message 3779: 2388:that minimizes the sum of squared errors 8584:International Encyclopedia of Statistics 8493: 8259:Statistical Methods for Research Workers 8209:Journal of the Royal Statistical Society 8126:Journal of the Royal Statistical Society 7423: 7320:, which are response variables that are 5694:of the parameter estimates are given by 5103: 2704:{\displaystyle f(X_{i},{\hat {\beta }})} 1576:In the 1950s and 1960s, economists used 1521:The earliest form of regression was the 29: 13761:Health departments in the United States 10289:Numerical smoothing and differentiation 8607:New Palgrave: A Dictionary of Economics 8287: 7991: 7989: 7968: 7934:Statistical Models: Theory and Practice 7845:Multivariate adaptive regression spline 7391: 7352:with more than two values there is the 7180: 7121: 6962: 6745: 6044: 5635:if an intercept is used. In this case, 3012:. If the researcher only has access to 2765:. However, alternative variants (e.g., 14: 13953: 13766:Council on Education for Public Health 12336:Kaplan–Meier estimator (product limit) 8425:Probability, Statistics and Estimation 8421: 8202: 8048: 8023: 3981:with one another. Mathematically, the 1401:is a set of statistical processes for 13824:Professional degrees of public health 13731:Ministry of Health and Family Welfare 12782: 12679: 12409: 11976: 11723: 11022: 10792: 10409: 10353: 9761: 8837: 8822:What is multiple regression used for? 8496:"Least Squares Percentage Regression" 8002: 3747:and therefore that a unique solution 2896:{\displaystyle (Y_{i},X_{1i},X_{2i})} 38:around the line y=1.5x+2 (not shown) 13921: 13814:Bachelor of Science in Public Health 12646: 12346:Accelerated failure time (AFT) model 9824:Iteratively reweighted least squares 9694:Generative adversarial network (GAN) 8828:Regression of Weakly Correlated Data 8116: 7986: 4114: 3812:adding citations to reliable sources 3783: 2364:(including its most common variant, 2009:Note that the independent variables 1795:Most regression models propose that 1541:The term "regression" was coined by 13933: 13082:Workers' right to access the toilet 12923:Human right to water and sanitation 12658: 11941:Analysis of variance (ANOVA, anova) 10793: 8582:, ed. (1978), "Linear Hypotheses," 7931:David A. Freedman (27 April 2009). 7758:regression analysis and inference. 7747:For a more comprehensive list, see 7461:this range of the data is known as 7443:variable given known values of the 7010:{\displaystyle {\hat {\beta }}_{j}} 4705:is an error term and the subscript 4423:to the preceding regression gives: 3847:. These assumptions often include: 3539:distinct parameters, one must have 1790:dependent and independent variables 1610: 1336:Glossary of artificial intelligence 24: 12036:Cochran–Mantel–Haenszel statistics 10662:Pearson product-moment correlation 9842:Pearson product-moment correlation 8727:Stulp, Freek, and Olivier Sigaud. 8569: 8026:Second-Semester Applied Statistics 7711:errors is deemed more appropriate. 
7495:Power and sample size calculations 7226: 7200: 6764: 6731: 4922:{\displaystyle {\widehat {y}}_{i}} 4725:indexes a particular observation. 1578:electromechanical desk calculators 1565:. This assumption was weakened by 25: 13972: 13355:Commercial determinants of health 12808: 8791: 8618:Alternative Methods of Regression 8404:Steel, R.G.D, and Torrie, J. H., 7957:R. Dennis Cook; Sanford Weisberg 4592:, it is linear in the parameters 3943:is constant across observations ( 13932: 13920: 13909: 13908: 12938:National public health institute 12657: 12645: 12633: 12620: 12619: 12410: 10322: 9732: 9731: 9711: 8672:Applied Nonparametric Regression 8645:Draper, N.R.; Smith, H. (1998). 8335:March 2006. Accessed 2011-12-03. 7850:Multivariate normal distribution 7814:Fraction of variance unexplained 7777: 7689: 7289:of the overall fit, followed by 7231: 7222: 7216: 7213: 7205: 7196: 7189: 7025: 6821: 6769: 6760: 6754: 6736: 6727: 4698:{\displaystyle \varepsilon _{i}} 3907:{\displaystyle E(e_{i}|X_{i})=0} 3788: 415: 13335:Open-source healthcare software 13077:Sociology of health and illness 12751:Associative (causal) forecasts 12295:Least-squares spectral analysis 8536: 8487: 8459: 8443:Statistical methods of analysis 8435: 8415: 8398: 8363: 8338: 8325: 8281: 8196: 8164:"The Law of Ancestral Heredity" 8152: 8110: 8098: 7855:Pearson correlation coefficient 7265:Category:Regression diagnostics 4349: 4167:(but need not be linear in the 3799:needs additional citations for 3769:{\displaystyle {\hat {\beta }}} 2808:{\displaystyle f(X_{i},\beta )} 2613:{\displaystyle {\hat {\beta }}} 2508:{\displaystyle {\hat {\beta }}} 2078:{\displaystyle f(X_{i},\beta )} 363:Least-squares spectral analysis 301:Generalized estimating equation 121:Multinomial logistic regression 96:Vector generalized linear model 13696:Caribbean Public Health Agency 13508:Sexually transmitted infection 13405:Statistical hypothesis testing 13166:Occupational safety and health 13067:Sexual and reproductive health 12980:Occupational safety and health 11276:Mean-unbiased minimum-variance 10379: 9644:Recurrent neural network (RNN) 9634:Differentiable neural computer 8818:– basic history and references 8632:Calculating Interval Forecasts 8122:"On the Theory of Correlation" 8083: 8042: 8017: 7965:, Vol. 13. (1982), pp. 313–361 7951: 7937:. Cambridge University Press. 7913: 7252: 7124: 6995: 6965: 6609: 6479: 6438: 5965: 5941: 5934: 5912: 5896: 5860: 5831: 5791: 5784: 5762: 5740: 5711: 5622: 5604: 5561: 5549: 5479: 5423: 5370: 5338: 5304: 5246: 5239: 5217: 5209: 5203: 5181: 5178: 5172: 5150: 4956:. One method of estimation is 3916:The variance of the residuals 3895: 3881: 3867: 3760: 3713:, this condition ensures that 3688: 3628: 3515:More generally, to estimate a 3427: 3423: 3398: 3363: 3341: 3331: 3316: 3306: 3276: 3217: 3182: 3160: 3138: 3111: 3099: 3077: 3055: 3045: 2890: 2845: 2802: 2783: 2758:{\displaystyle E(Y_{i}|X_{i})} 2752: 2738: 2724: 2698: 2692: 2670: 2649:{\displaystyle {\hat {Y_{i}}}} 2640: 2604: 2578: 2572: 2550: 2538: 2499: 2447: 2443: 2424: 2405: 2208: 2189: 2072: 2053: 1980: 1961: 756:Relevance vector machine (RVM) 13: 1: 13350:Social determinants of health 12589:Geographic information system 11805:Simultaneous equations models 9689:Variational autoencoder (VAE) 9649:Long short-term memory (LSTM) 8916:Computational learning theory 7999:. Princeton University Press. 
7906: 7840:Modifiable areal unit problem 7281:, analyses of the pattern of 6885:element of the column vector 6055:For a numerical example, see 1935:or random statistical noise: 1457:computes the unique line (or 1245:Computational learning theory 809:Expectation–maximization (EM) 182:Nonlinear mixed-effects model 13410:Analysis of variance (ANOVA) 13171:Human factors and ergonomics 12736:Decomposition of time series 11772:Coefficient of determination 11383:Uniformly most powerful test 10312:Regression analysis category 10202:Response surface methodology 9669:Convolutional neural network 8649:(3rd ed.). John Wiley. 7920:Necessary Condition Analysis 7749:List of statistical software 7032:{\displaystyle \mathbf {X} } 6828:{\displaystyle \mathbf {X} } 1483:Necessary Condition Analysis 1405:the relationships between a 1202:Coefficient of determination 1049:Convolutional neural network 761:Support vector machine (SVM) 7: 13591:Good manufacturing practice 13395:Randomized controlled trial 12341:Proportional hazards models 12285:Spectral density estimation 12267:Vector autoregression (VAR) 11701:Maximum posterior estimator 10933:Randomized controlled trial 10184:Frisch–Waugh–Lovell theorem 10154:Mean and predicted response 9664:Multilayer perceptron (MLP) 8804:Encyclopedia of Mathematics 8647:Applied Regression Analysis 7770: 7742: 7318:Limited dependent variables 7313:Limited dependent variables 4669:{\displaystyle \beta _{2}.} 4045:to change across values of 1534:including a version of the 1353:Outline of machine learning 1250:Empirical risk minimization 384:Mean and predicted response 10: 13977: 13661:Theory of planned behavior 13586:Good agricultural practice 13491:Public health surveillance 13383:epidemiological statistics 13027:Public health intervention 12717:Historical data forecasts 12101:Multivariate distributions 10521:Average absolute deviation 9834:Correlation and dependence 9740:Artificial neural networks 9654:Gated recurrent unit (GRU) 8880:Differentiable programming 8372:Environment and Planning A 8310:10.1214/088342305000000331 8024:Mogull, Robert G. (2004). 7746: 7703:Bayesian linear regression 7413: 7395: 7370:Censored regression models 7293:of individual parameters. 
7262: 7256: 6361:{\displaystyle \beta _{1}} 6054: 6048: 5432:{\displaystyle {\bar {y}}} 5379:{\displaystyle {\bar {x}}} 4639:{\displaystyle \beta _{1}} 4612:{\displaystyle \beta _{0}} 4269:{\displaystyle \beta _{1}} 4242:{\displaystyle \beta _{0}} 4124: 4118: 3983:variance–covariance matrix 2038:errors-in-variables models 1547:regression toward the mean 1516: 990:Feedforward neural network 741:Artificial neural networks 177:Linear mixed-effects model 13904: 13839: 13798: 13783:World Toilet Organization 13778:World Health Organization 13685: 13674: 13611: 13536: 13452: 13380: 13345:Public health informatics 13285: 13090: 13052:Right to rest and leisure 12881:Globalization and disease 12816: 12749: 12715: 12615: 12569: 12506: 12459: 12422: 12418: 12405: 12377: 12359: 12326: 12317: 12275: 12222: 12183: 12132: 12123: 12089:Structural equation model 12044: 12001: 11997: 11972: 11931: 11897: 11851: 11818: 11780: 11747: 11743: 11719: 11659: 11568: 11487: 11451: 11442: 11425:Score/Lagrange multiplier 11410: 11363: 11308: 11234: 11225: 11035: 11031: 11018: 10977: 10951: 10903: 10858: 10840:Sample size determination 10805: 10801: 10788: 10692: 10647: 10621: 10603: 10559: 10511: 10431: 10422: 10418: 10405: 10387: 10307: 10271: 10220: 10192: 10179:Minimum mean-square error 10146: 10092: 10066:Decomposition of variance 10064: 10029: 9988: 9970:Growth curve (statistics) 9957: 9939:Generalized least squares 9919: 9908: 9875: 9832: 9799: 9707: 9621: 9565: 9494: 9427: 9299: 9199: 9192: 9146: 9110: 9073:Artificial neural network 9053: 8929: 8896:Automatic differentiation 8869: 8816:Earliest Uses: Regression 7734:interval predictor models 7715:Least absolute deviations 7159:{\displaystyle p\times 1} 7104:{\displaystyle n\times 1} 7058:{\displaystyle n\times p} 4416:{\displaystyle x_{i}^{2}} 3565:distinct data points. If 2767:least absolute deviations 1525:, which was published by 973:Artificial neural network 343:Least absolute deviations 13829:Schools of public health 13621:Diffusion of innovations 13320:Health impact assessment 13032:Public health laboratory 12928:Management of depression 12759:Simple linear regression 12584:Environmental statistics 12106:Elliptical distributions 11899:Generalized linear model 11828:Simple linear regression 11598:Hodges–Lehmann estimator 11055:Probability distribution 10964:Stochastic approximation 10526:Coefficient of variation 10037:Generalized linear model 9929:Simple linear regression 9819:Non-linear least squares 9801:Computational statistics 8901:Neuromorphic engineering 8864:Differentiable computing 8742:. John Wiley & Sons. 8609:, v. 4, pp. 120–23. 8469:; Hardin, J. W. (2009). 8422:Rouaud, Mathieu (2013). 8049:Galton, Francis (1989). 
7963:Sociological Methodology 7824:Generalized linear model 7760:Simple linear regression 7724:Nonparametric regression 7334:linear probability model 7275:statistical significance 6334:{\displaystyle x_{i1}=1} 4173:simple linear regression 4127:simple linear regression 1597:nonparametric regression 1571:conditional distribution 1487:nonparametric regression 1282:Journals and conferences 1229:Mathematical foundations 1139:Temporal difference (TD) 995:Recurrent neural network 915:Conditional random field 838:Dimensionality reduction 586:Dimensionality reduction 548:Quantum machine learning 543:Neuromorphic engineering 503:Self-supervised learning 498:Semi-supervised learning 91:Generalized linear model 13892:Social hygiene movement 13819:Doctor of Public Health 13651:Social cognitive theory 13453:Infectious and epidemic 13235:Fecal–oral transmission 12244:Cross-correlation (XCF) 11852:Non-standard predictors 11286:Lehmann–ScheffĂŠ theorem 10959:Adaptive clinical trial 9674:Residual neural network 9090:Artificial Intelligence 8705:A. Sen, M. Srivastava, 8554:: 74–82. Archived from 8290:"Fisher and Regression" 7900:Linear trend estimation 7525:{\displaystyle N=m^{n}} 6265:-th observation on the 6081:independent variables: 5628:{\displaystyle (n-p-1)} 3558:{\displaystyle N\geq k} 3252:, all of which lead to 2713:conditional expectation 2620:or the predicted value 1708:denotes a row of data). 1523:method of least squares 1467:conditional expectation 691:Apprenticeship learning 13887:Germ theory of disease 13666:Transtheoretical model 12640:Mathematics portal 12461:Engineering statistics 12369:Nelson–Aalen estimator 11946:Analysis of covariance 11833:Ordinary least squares 11757:Pearson product-moment 11161:Statistical functional 11072:Empirical distribution 10905:Controlled experiments 10634:Frequency distribution 10412:Descriptive statistics 10329:Mathematics portal 10253:Orthogonal polynomials 10079:Analysis of covariance 9944:Weighted least squares 9934:Ordinary least squares 9885:Ordinary least squares 8752:PeerJ Computer Science 8738:Malakooti, B. (2013). 8699:10.1002/for.3980140502 8687:Journal of Forecasting 8630:Chatfield, C. (1993) " 8543:YangJing Long (2009). 8288:Aldrich, John (2005). 8182:10.1093/biomet/2.2.211 7819:Function approximation 7679: 7627: 7607: 7586: 7566: 7546: 7526: 7447:variables. 
Prediction 7430: 7378:polychoric correlation 7259:Regression diagnostics 7243: 7160: 7134: 7105: 7079: 7059: 7033: 7011: 6975: 6946: 6926: 6899: 6879: 6859: 6858:{\displaystyle x_{ij}} 6829: 6807: 6781: 6703: 6644: 6575: 6554: 6511: 6389: 6362: 6335: 6299: 6279: 6259: 6239: 6238:{\displaystyle x_{ij}} 6206: 6075: 6049:For a derivation, see 6020: 5808: 5681: 5661:so the denominator is 5655: 5629: 5588: 5568: 5529: 5453: 5433: 5404: 5380: 5348: 5262: 5109: 5095: 5030: 5010: 4958:ordinary least squares 4950: 4923: 4887: 4818: 4719: 4699: 4670: 4640: 4613: 4586: 4556: 4417: 4379: 4270: 4243: 4222:, and two parameters, 4216: 4189: 4153: 4101: 4066: 4039: 4018:allow the variance of 3971: 3937: 3908: 3780:Underlying assumptions 3770: 3737: 3736:{\displaystyle X^{T}X} 3711:ordinary least squares 3695: 3611: 3585: 3584:{\displaystyle N>k} 3559: 3533: 3506: 3476: 3446: 3246: 3118: 3032: 3006: 2897: 2832: 2809: 2759: 2705: 2658:ordinary least squares 2650: 2614: 2585: 2509: 2480: 2479:{\displaystyle \beta } 2460: 2382: 2381:{\displaystyle \beta } 2366:ordinary least squares 2354: 2353:{\displaystyle \beta } 2327: 2251: 2173: 2153: 2126: 2099: 2079: 2030: 2000: 1929: 1898: 1871: 1870:{\displaystyle \beta } 1851: 1816: 1774: 1736: 1702: 1682: 1648: 1647:{\displaystyle \beta } 1617:ordinary least squares 1455:ordinary least squares 1240:Bias–variance tradeoff 1122:Reinforcement learning 1098:Spiking neural network 508:Reinforcement learning 422:Mathematics portal 348:Iteratively reweighted 39: 18:Statistical regression 13771:Public Health Service 13656:Social norms approach 13646:PRECEDE–PROCEED model 13092:Preventive healthcare 12985:Pharmaceutical policy 12834:Chief Medical Officer 12726:Exponential smoothing 12556:Population statistics 12498:System identification 12232:Autocorrelation (ACF) 12160:Exponential smoothing 12074:Discriminant analysis 12069:Canonical correlation 11933:Partition of variance 11795:Regression validation 11639:(Jonckheere–Terpstra) 11538:Likelihood-ratio test 11227:Frequentist inference 11139:Location–scale family 11060:Sampling distribution 11025:Statistical inference 10992:Cross-sectional study 10979:Observational studies 10938:Randomized experiment 10767:Stem-and-leaf display 10569:Central limit theorem 10294:System identification 10258:Chebyshev polynomials 10243:Numerical integration 10194:Design of experiments 10138:Regression validation 9965:Polynomial regression 9890:Partial least squares 9629:Neural Turing machine 9217:Human image synthesis 8824:– Multiple regression 8799:"Regression analysis" 8494:Tofallis, C. (2009). 8203:Fisher, R.A. (1922). 
8068:10.1214/ss/1177012581 7870:Regression validation 7730:Scenario optimization 7680: 7628: 7608: 7587: 7567: 7547: 7527: 7427: 7414:Further information: 7350:categorical variables 7322:categorical variables 7307:central limit theorem 7273:of the model and the 7244: 7161: 7135: 7106: 7080: 7060: 7034: 7012: 6976: 6947: 6927: 6925:{\displaystyle y_{i}} 6900: 6880: 6860: 6830: 6808: 6782: 6704: 6624: 6555: 6534: 6512: 6390: 6363: 6336: 6300: 6280: 6260: 6240: 6207: 6076: 6039:population parameters 6021: 5809: 5682: 5656: 5630: 5589: 5569: 5567:{\displaystyle (n-p)} 5530: 5454: 5434: 5405: 5381: 5349: 5263: 5107: 5096: 5031: 4990: 4951: 4949:{\displaystyle y_{i}} 4924: 4888: 4819: 4720: 4700: 4671: 4641: 4614: 4587: 4585:{\displaystyle x_{i}} 4557: 4418: 4380: 4271: 4244: 4217: 4215:{\displaystyle x_{i}} 4190: 4169:independent variables 4154: 4152:{\displaystyle y_{i}} 4102: 4100:{\displaystyle e_{i}} 4067: 4065:{\displaystyle X_{i}} 4040: 4038:{\displaystyle e_{i}} 3972: 3970:{\displaystyle e_{i}} 3938: 3936:{\displaystyle e_{i}} 3909: 3771: 3738: 3696: 3612: 3586: 3560: 3534: 3507: 3477: 3447: 3247: 3119: 3033: 3007: 2898: 2833: 2810: 2760: 2706: 2651: 2615: 2586: 2510: 2481: 2461: 2383: 2368:) finds the value of 2355: 2328: 2252: 2174: 2154: 2152:{\displaystyle X_{i}} 2127: 2125:{\displaystyle Y_{i}} 2100: 2080: 2031: 2029:{\displaystyle X_{i}} 2001: 1930: 1928:{\displaystyle Y_{i}} 1899: 1897:{\displaystyle e_{i}} 1872: 1852: 1850:{\displaystyle X_{i}} 1817: 1815:{\displaystyle Y_{i}} 1786:fields of application 1775: 1773:{\displaystyle e_{i}} 1737: 1735:{\displaystyle Y_{i}} 1703: 1683: 1681:{\displaystyle X_{i}} 1659:independent variables 1649: 1627:, often denoted as a 1439:explanatory variables 1423:independent variables 1076:Neural radiance field 898:Structured prediction 621:Structured prediction 493:Unsupervised learning 379:Regression validation 358:Bayesian multivariate 75:Polynomial regression 36:Gaussian distribution 33: 13847:Sara Josephine Baker 13746:Public Health Agency 13631:Health communication 13496:Disease surveillance 13462:Asymptomatic carrier 13444:Statistical software 13132:Preventive nutrition 12960:Medical anthropology 12849:Environmental health 12479:Probabilistic design 12064:Principal components 11907:Exponential families 11859:Nonlinear regression 11838:General linear model 11800:Mixed effects models 11790:Errors and residuals 11767:Confounding variable 11669:Bayesian probability 11647:Van der Waerden test 11637:Ordered alternative 11402:Multiple comparisons 11281:Rao–Blackwellization 11244:Estimating equations 11200:Statistical distance 10918:Factorial experiment 10451:Arithmetic-Geometric 10299:Moving least squares 10238:Approximation theory 10174:Studentized residual 10164:Errors and residuals 10159:Gauss–Markov theorem 10074:Analysis of variance 9996:Nonlinear regression 9975:Segmented regression 9949:General linear model 9867:Confounding variable 9814:Linear least squares 9720:Computer programming 9699:Graph neural network 9274:Text-to-video models 9252:Text-to-image models 9100:Large language model 9085:Scientific computing 8891:Statistical manifold 8886:Information geometry 8765:10.7717/peerj-cs.623 8586:. Free Press, v. 1, 8512:10.2139/ssrn.1406472 8445:, World Scientific. 8441:Chiang, C.L, (2003) 8263:(Twelfth ed.). 
See also: Anscombe's quartet, Curve fitting, Estimation theory, Forecasting, Local regression, Prediction interval, Quasi-variance, Robust regression, Segmented regression, Signal processing, Stepwise regression, Taxicab geometry
Fisher 1498:forecasting 1144:Multi-agent 1081:Transformer 980:Autoencoder 736:Naive Bayes 474:data mining 332:Regularized 296:Generalized 228:Least angle 126:Mixed logit 13692:Caribbean 13569:Processing 13503:Quarantine 13425:Student's 13225:Sanitation 12859:History of 12531:Demography 12249:ARMA model 12054:Regression 11631:(Friedman) 11592:(Wilcoxon) 11530:Normality 11520:Lilliefors 11467:Student's 11343:Resampling 11217:Robustness 11205:divergence 11195:Efficiency 11133:(monotone) 11128:Likelihood 11045:Population 10878:Stratified 10830:Population 10649:Dependence 10605:Count data 10536:Percentile 10513:Dispersion 10446:Arithmetic 10381:Statistics 10147:Background 10110:Mallows's 9725:Technology 9578:EleutherAI 9537:Fei-Fei Li 9532:Yann LeCun 9445:Q-learning 9428:Decisional 9354:IBM Watson 9262:Midjourney 9154:TensorFlow 9001:Activation 8954:Regression 8949:Clustering 8169:Biometrika 7907:References 7709:percentage 7263:See also: 6932:, and the 6790:where the 6037:about the 5595:regressors 4427:parabola: 4165:parameters 4078:Newey–West 4003:consistent 1494:prediction 1459:hyperplane 1435:covariates 1431:predictors 1427:regressors 1403:estimating 1129:Q-learning 1027:Restricted 825:Mean shift 774:Clustering 751:Perceptron 679:regression 581:Clustering 576:Regression 371:Background 275:Non-linear 257:Estimation 13872:John Snow 13799:Education 13789:Full list 13677:education 13601:ISO 22000 13554:Chemistry 13467:Epidemics 13420:ROC curve 13230:Emergency 13010:Radiation 12990:Pollution 12974:Ministers 12871:Euthenics 11912:Logistic 11679:posterior 11605:Rank sum 11353:Jackknife 11348:Bootstrap 11166:Bootstrap 11101:Parameter 11050:Statistic 10845:Statistic 10757:Run chart 10742:Pie chart 10737:Histogram 10727:Fan chart 10702:Bar chart 10584:L-moments 10471:Geometric 10222:Numerical 9608:MIT CSAIL 9573:Anthropic 9542:Andrew Ng 9440:AlphaZero 9284:VideoPoet 9247:AlphaFold 9184:MindSpore 9138:SpiNNaker 9133:Memristor 9040:Diffusion 9016:Rectifier 8996:Batchnorm 8976:Attention 8971:Adversary 8809:EMS Press 8614:Dodge, Y. 
8392:153979055 8265:Edinburgh 7670:≈ 7661:⁡ 7650:⁡ 7283:residuals 7279:R-squared 7227:⊤ 7214:− 7201:⊤ 7184:^ 7181:β 7151:× 7125:^ 7122:β 7096:× 7050:× 6996:^ 6993:β 6966:^ 6963:β 6765:⊤ 6749:^ 6746:β 6732:⊤ 6687:… 6626:∑ 6610:^ 6607:β 6557:∑ 6536:∑ 6480:^ 6477:β 6470:− 6467:⋯ 6464:− 6439:^ 6436:β 6429:− 6407:ε 6350:β 6190:ε 6164:β 6157:⋯ 6132:β 6106:β 5989:∑ 5974:β 5966:^ 5963:σ 5935:¯ 5926:− 5910:∑ 5897:¯ 5868:ε 5861:^ 5858:σ 5840:β 5832:^ 5829:σ 5785:¯ 5776:− 5760:∑ 5748:ε 5741:^ 5738:σ 5720:β 5712:^ 5709:σ 5672:− 5617:− 5611:− 5556:− 5517:− 5487:ε 5480:^ 5477:σ 5424:¯ 5371:¯ 5339:¯ 5321:^ 5318:β 5311:− 5305:¯ 5284:^ 5281:β 5240:¯ 5231:− 5215:∑ 5204:¯ 5195:− 5173:¯ 5164:− 5148:∑ 5130:^ 5127:β 5080:^ 5077:β 5058:^ 5055:β 4992:∑ 4962:residuals 4908:^ 4872:^ 4862:− 4790:^ 4787:β 4768:^ 4765:β 4746:^ 4687:ε 4655:β 4628:β 4601:β 4540:… 4513:ε 4485:β 4462:β 4449:β 4363:… 4338:ε 4315:β 4302:β 4258:β 4231:β 4007:efficient 3761:^ 3758:β 3602:− 3550:≥ 3454:residuals 3399:^ 3396:β 3364:^ 3361:β 3342:^ 3339:β 3329:− 3317:^ 3298:∑ 3277:^ 3261:∑ 3218:^ 3215:β 3183:^ 3180:β 3161:^ 3158:β 3139:^ 3100:^ 3097:β 3078:^ 3075:β 3056:^ 3053:β 2968:β 2942:β 2929:β 2800:β 2693:^ 2690:β 2641:^ 2605:^ 2602:β 2573:^ 2570:β 2539:^ 2500:^ 2497:β 2474:β 2441:β 2419:− 2397:∑ 2376:β 2348:β 2292:β 2279:β 2229:β 2216:β 2206:β 2070:β 1978:β 1865:β 1642:β 1551:Udny Yule 1288:ECML PKDD 1270:VC theory 1217:ROC curve 1149:Self-play 1069:DeepDream 910:Bayes net 701:Ensembles 482:Paradigms 238:Segmented 13955:Category 13915:Category 13614:sciences 13549:Additive 13220:Safe sex 13191:Medicine 13105:Theories 12876:Genomics 12854:Eugenics 12844:Deviance 12824:Auxology 12626:Category 12319:Survival 12196:Johansen 11919:Binomial 11874:Isotonic 11461:(normal) 11106:location 10913:Blocking 10868:Sampling 10747:Q–Q plot 10712:Box plot 10694:Graphics 10589:Skewness 10579:Kurtosis 10551:Variance 10481:Heronian 10476:Harmonic 10052:Logistic 10042:Binomial 10021:Isotonic 10016:Quantile 9716:Portals 9475:Auto-GPT 9307:Word2vec 9111:Hardware 9028:Datasets 8930:Concepts 8784:34307865 8674:(1990), 8521:2299/965 8319:20061201 8255:(1954). 8120:(1897). 7771:See also 7743:Software 7532:, where 7368:models. 5459:values. 4829:residual 3999:unbiased 3987:diagonal 3776:exists. 3701:must be 1824:function 1601:Bayesian 1563:Gaussian 1527:Legendre 1443:features 1415:response 711:Boosting 560:Problems 353:Bayesian 291:Weighted 286:Ordinary 218:Isotonic 213:Quantile 13927:Commons 13840:History 13737:Canada 13712:Europe 13196:Nursing 13176:Hygiene 13139:Hygiene 12864:Liberal 12817:General 12710:methods 12652:Commons 12599:Kriging 12484:Process 12441:studies 12300:Wavelet 12133:General 11300:Plug-in 11094:L space 10873:Cluster 10574:Moments 10392:Outline 10047:Poisson 9598:Meta AI 9435:AlphaGo 9419:PanGu-ÎŁ 9389:ChatGPT 9364:Granite 9312:Seq2seq 9291:Whisper 9212:WaveNet 9207:AlexNet 9179:Flux.jl 9159:PyTorch 9011:Sigmoid 9006:Softmax 8871:General 8811:, 2001 8775:8279135 8530:1406472 8238:1084801 8229:2341124 8190:2331683 8146:2979746 8077:2245330 7829:Kriging 7701:, e.g. 7459:outside 7436:predict 7384:or the 7291:t-tests 7017:. 
Thus 6341:, then 6245:is the 5386:is the 4163:of the 1877:, with 1688:(where 1517:History 1477:(e.g., 1411:outcome 1293:NeurIPS 1110:(ECRAM) 1064:AlexNet 706:Bagging 312:Partial 151:Poisson 13727:India 13702:China 13574:Safety 13255:Worker 12521:Census 12111:Normal 12059:Manova 11879:Robust 11629:2-way 11621:1-way 11459:-test 11130:  10707:Biplot 10498:Median 10491:Lehmer 10433:Center 10011:Robust 9613:Huawei 9593:OpenAI 9495:People 9465:MuZero 9327:Gemini 9322:Claude 9257:DALL-E 9169:Theano 8782:  8772:  8720:  8678:  8653:  8624:  8528:  8477:  8449:  8390:  8353:  8317:  8271:  8235:  8227:  8188:  8144:  8075:  8032:  7941:  7449:within 7356:. For 7344:. The 7338:probit 7302:F-test 7298:t-test 7287:F-test 7111:, and 6865:, the 6672:  6215:where 5357:where 4525:  4005:, and 3743:is an 1633:vector 1629:scalar 1086:Vision 942:RANSAC 820:OPTICS 815:DBSCAN 799:-means 606:AutoML 270:Linear 208:Robust 131:Probit 57:Models 13752:U.S. 13596:HACCP 13545:Food 13437:-test 13429:-test 13015:Light 13000:Water 12145:Trend 11674:prior 11616:anova 11505:-test 11479:-test 11471:-test 11378:Power 11323:Pivot 11116:shape 11111:scale 10561:Shape 10541:Range 10486:Heinz 10461:Cubic 10397:Index 9679:Mamba 9450:SARSA 9414:LLaMA 9409:BLOOM 9394:GPT-J 9384:GPT-4 9379:GPT-3 9374:GPT-2 9369:GPT-1 9332:LaMDA 9164:Keras 8559:(PDF) 8548:(PDF) 8429:(PDF) 8388:S2CID 8315:JSTOR 8293:(PDF) 8225:JSTOR 8186:JSTOR 8142:JSTOR 8073:JSTOR 4159:is a 4076:, or 1830:) of 1822:is a 1531:Gauss 1419:label 1308:IJCAI 1134:SARSA 1093:Mamba 1059:LeNet 1054:U-Net 880:t-SNE 804:Fuzzy 781:BIRCH 317:Total 233:Local 13528:WASH 13484:List 13472:List 13005:Soil 12378:Test 11578:Sign 11430:Wald 10503:Mode 10441:Mean 9791:and 9603:Mila 9404:PaLM 9337:Bard 9317:BERT 9300:Text 9279:Sora 8780:PMID 8718:ISBN 8676:ISBN 8667:Sage 8651:ISBN 8622:ISBN 8578:and 8526:SSRN 8475:ISBN 8447:ISBN 8351:ISBN 8269:ISBN 8030:ISBN 7939:ISBN 7673:4.29 7653:1000 7487:and 7418:and 7364:and 7340:and 6524:are 6520:The 5690:The 5574:for 5388:mean 4827:The 4646:and 4249:and 4125:See 3977:are 3576:> 2132:and 1857:and 1745:The 1711:The 1657:The 1623:The 1591:and 1553:and 1496:and 1318:JMLR 1303:ICLR 1298:ICML 1184:RLHF 1000:LSTM 786:CURE 472:and 12995:Air 11558:BIC 11553:AIC 10126:BIC 10121:AIC 9344:NMT 9227:OCR 9222:HWR 9174:JAX 9128:VPU 9123:TPU 9118:IPU 8942:SGD 8770:PMC 8760:doi 8695:doi 8634:," 8516:hdl 8508:doi 8380:doi 8305:doi 8233:PMC 8217:doi 8178:doi 8134:doi 8063:doi 7658:log 7647:log 7300:or 7140:is 7085:is 7039:is 6981:is 6905:is 6835:is 5597:or 4966:SSR 3810:by 3707:not 2769:or 1751:not 1631:or 1489:). 1481:or 1441:or 1413:or 1393:In 1044:SOM 1034:GAN 1010:ESN 1005:GRU 950:-NN 885:SDL 875:PGD 870:PCA 865:NMF 860:LDA 855:ICA 850:CCA 726:-NN 13957:: 8807:, 8801:, 8778:. 8768:. 8754:. 8750:. 8691:14 8689:. 8639:11 8620:. 8616:, 8550:. 8524:. 8514:. 8502:. 8498:. 8453:- 8408:, 8386:. 8376:23 8374:. 8313:. 8301:20 8299:. 8295:. 8231:. 8223:. 8213:85 8211:. 8207:. 8184:. 8172:. 8166:. 8140:. 8130:60 8128:. 8124:. 8071:. 8057:. 8053:. 7988:^ 7977:. 7961:, 7470:A 7406:. 7328:. 7065:, 6372:. 6305:, 6041:. 5687:. 5101:. 4968:: 4964:, 4831:, 4619:, 4276:: 4001:, 3947:). 2907:: 2815:. 1792:. 1599:, 1538:. 1513:. 1437:, 1433:, 1429:, 1397:, 1313:ML 13791:) 13787:( 13435:Z 13427:t 12976:) 12972:( 12802:e 12795:t 12788:v 12699:e 12692:t 12685:v 11503:G 11477:F 11469:t 11457:Z 11176:V 11171:U 10373:e 10366:t 10359:v 10114:p 10112:C 9858:) 9849:( 9781:e 9774:t 9767:v 8857:e 8850:t 8843:v 8786:. 8762:: 8756:7 8735:. 8724:. 8701:. 8697:: 8659:. 8532:. 
8518:: 8510:: 8504:7 8483:. 8394:. 8382:: 8359:. 8321:. 8307:: 8277:. 8241:. 8219:: 8192:. 8180:: 8174:2 8148:. 8136:: 8079:. 8065:: 8059:4 8038:. 7947:. 7751:. 7685:. 7664:5 7621:m 7601:N 7593:( 7580:m 7560:n 7540:N 7518:n 7514:m 7510:= 7507:N 7489:X 7485:Y 7445:X 7441:Y 7236:. 7232:Y 7223:X 7217:1 7210:) 7206:X 7197:X 7193:( 7190:= 7154:1 7148:p 7099:1 7093:n 7073:Y 7053:p 7047:n 7026:X 7003:j 6940:j 6918:i 6914:y 6893:Y 6873:i 6851:j 6848:i 6844:x 6822:X 6801:j 6798:i 6774:, 6770:Y 6761:X 6755:= 6740:) 6737:X 6728:X 6724:( 6696:. 6693:p 6690:, 6684:, 6681:1 6678:= 6675:j 6669:, 6664:i 6660:y 6654:j 6651:i 6647:x 6641:n 6636:1 6633:= 6630:i 6622:= 6617:k 6598:k 6595:i 6591:x 6585:j 6582:i 6578:x 6572:p 6567:1 6564:= 6561:k 6551:n 6546:1 6543:= 6540:i 6505:. 6500:p 6497:i 6493:x 6487:p 6459:1 6456:i 6452:x 6446:1 6424:i 6420:y 6416:= 6411:i 6383:p 6354:1 6329:1 6326:= 6321:1 6318:i 6314:x 6293:i 6273:j 6253:i 6231:j 6228:i 6224:x 6199:, 6194:i 6186:+ 6181:p 6178:i 6174:x 6168:p 6160:+ 6154:+ 6149:2 6146:i 6142:x 6136:2 6128:+ 6123:1 6120:i 6116:x 6110:1 6102:= 6097:i 6093:y 6069:p 6014:. 6008:n 6002:2 5997:i 5993:x 5978:1 5956:= 5946:2 5942:) 5932:x 5921:i 5917:x 5913:( 5904:2 5894:x 5885:+ 5880:n 5877:1 5851:= 5844:0 5796:2 5792:) 5782:x 5771:i 5767:x 5763:( 5756:1 5731:= 5724:1 5675:2 5669:n 5649:1 5646:= 5643:p 5623:) 5620:1 5614:p 5608:n 5605:( 5582:p 5562:) 5559:p 5553:n 5550:( 5520:2 5514:n 5509:R 5506:S 5503:S 5497:= 5492:2 5447:y 5421:y 5398:x 5368:x 5336:x 5328:1 5302:y 5296:= 5291:0 5251:2 5247:) 5237:x 5226:i 5222:x 5218:( 5210:) 5201:y 5190:i 5186:y 5182:( 5179:) 5170:x 5159:i 5155:x 5151:( 5142:= 5137:1 5087:1 5070:, 5065:0 5022:2 5017:i 5013:e 5007:n 5002:1 4999:= 4996:i 4988:= 4985:R 4982:S 4979:S 4942:i 4938:y 4915:i 4905:y 4879:i 4869:y 4857:i 4853:y 4849:= 4844:i 4840:e 4812:. 4807:i 4803:x 4797:1 4780:+ 4775:0 4758:= 4753:i 4743:y 4713:i 4691:i 4664:. 4659:2 4632:1 4605:0 4578:i 4574:x 4549:. 4546:n 4543:, 4537:, 4534:1 4531:= 4528:i 4522:, 4517:i 4509:+ 4504:2 4499:i 4495:x 4489:2 4481:+ 4476:i 4472:x 4466:1 4458:+ 4453:0 4445:= 4440:i 4436:y 4409:2 4404:i 4400:x 4372:. 4369:n 4366:, 4360:, 4357:1 4354:= 4351:i 4347:, 4342:i 4334:+ 4329:i 4325:x 4319:1 4311:+ 4306:0 4298:= 4293:i 4289:y 4262:1 4235:0 4208:i 4204:x 4183:n 4145:i 4141:y 4093:i 4089:e 4058:i 4054:X 4031:i 4027:e 3989:. 3963:i 3959:e 3929:i 3925:e 3902:0 3899:= 3896:) 3891:i 3887:X 3882:| 3876:i 3872:e 3868:( 3865:E 3833:) 3827:( 3822:) 3818:( 3804:. 3731:X 3726:T 3722:X 3689:) 3684:i 3681:k 3677:X 3673:, 3670:. 3667:. 3664:. 
3661:, 3656:i 3653:2 3649:X 3645:, 3640:i 3637:1 3633:X 3629:( 3605:k 3599:N 3579:k 3573:N 3553:k 3547:N 3527:k 3500:2 3497:= 3494:N 3470:2 3467:= 3464:N 3440:0 3437:= 3432:2 3428:) 3424:) 3419:i 3416:2 3412:X 3406:2 3389:+ 3384:i 3381:1 3377:X 3371:1 3354:+ 3349:0 3332:( 3324:i 3314:Y 3307:( 3302:i 3294:= 3289:2 3284:i 3274:e 3265:i 3238:i 3235:2 3231:X 3225:2 3208:+ 3203:i 3200:1 3196:X 3190:1 3173:+ 3168:0 3151:= 3146:i 3136:Y 3112:) 3107:2 3090:, 3085:1 3068:, 3063:0 3046:( 3026:2 3023:= 3020:N 2998:i 2994:e 2990:+ 2985:i 2982:2 2978:X 2972:2 2964:+ 2959:i 2956:1 2952:X 2946:1 2938:+ 2933:0 2925:= 2920:i 2916:Y 2891:) 2886:i 2883:2 2879:X 2875:, 2870:i 2867:1 2863:X 2859:, 2854:i 2850:Y 2846:( 2826:N 2803:) 2797:, 2792:i 2788:X 2784:( 2781:f 2753:) 2748:i 2744:X 2739:| 2733:i 2729:Y 2725:( 2722:E 2699:) 2684:, 2679:i 2675:X 2671:( 2668:f 2636:i 2632:Y 2579:) 2564:, 2559:i 2555:X 2551:( 2548:f 2545:= 2534:i 2530:Y 2452:2 2448:) 2444:) 2438:, 2433:i 2429:X 2425:( 2422:f 2414:i 2410:Y 2406:( 2401:i 2319:i 2315:e 2311:+ 2306:i 2302:X 2296:1 2288:+ 2283:0 2275:= 2270:i 2266:Y 2243:i 2239:X 2233:1 2225:+ 2220:0 2212:= 2209:) 2203:, 2198:i 2194:X 2190:( 2187:f 2167:f 2145:i 2141:X 2118:i 2114:Y 2093:f 2073:) 2067:, 2062:i 2058:X 2054:( 2051:f 2022:i 2018:X 1992:i 1988:e 1984:+ 1981:) 1975:, 1970:i 1966:X 1962:( 1959:f 1956:= 1951:i 1947:Y 1921:i 1917:Y 1890:i 1886:e 1843:i 1839:X 1826:( 1808:i 1804:Y 1780:. 1766:i 1762:e 1742:. 1728:i 1724:Y 1696:i 1674:i 1670:X 1654:. 1382:e 1375:t 1368:v 948:k 797:k 724:k 682:) 670:( 449:e 442:t 435:v 20:)

Statistical regression models (a brief illustrative fitting sketch in Python follows this list):
Linear regression
Simple regression
Polynomial regression
General linear model
Generalized linear model
Vector generalized linear model
Discrete choice
Binomial regression
Binary regression
Logistic regression
Multinomial logistic regression
Mixed logit
Probit model
Multinomial probit
Ordered logit
Ordered probit
Poisson regression
Multilevel model
Fixed effects
Random effects
Linear mixed-effects model
Nonlinear mixed-effects model
Nonlinear regression
Nonparametric regression
Semiparametric regression
Robust regression
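
For readers who want to see how a few of the model families listed above are fit in practice, the following is a minimal illustrative sketch, assuming Python with NumPy and scikit-learn available; the synthetic data, coefficient values, and variable names are assumptions chosen only for demonstration, not part of the article.

    # Minimal sketch (assumed setup): fit an ordinary linear regression and a
    # logistic regression (one generalized linear model) to synthetic data.
    import numpy as np
    from sklearn.linear_model import LinearRegression, LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))  # two independent variables (regressors)

    # Continuous response for ordinary (multiple) linear regression.
    y_linear = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200)
    linear_fit = LinearRegression().fit(X, y_linear)
    print("linear intercept and coefficients:", linear_fit.intercept_, linear_fit.coef_)

    # Binary response for logistic regression.
    prob = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * X[:, 0])))  # assumed true success probability
    y_binary = rng.binomial(1, prob)
    logistic_fit = LogisticRegression().fit(X, y_binary)
    print("logistic intercept and coefficients:", logistic_fit.intercept_, logistic_fit.coef_)

Up to sampling noise, the estimated coefficients should be close to the values assumed when generating the data (about 2.0 and -0.5 for the linear model, about 1.5 and 0 for the logistic model).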
