
Poisson regression


Poisson regression is a generalized linear model form of regression analysis used to model count data and contingency tables. It assumes the response variable Y has a Poisson distribution, and that the logarithm of its expected value can be modeled by a linear combination of unknown parameters. A Poisson regression model is sometimes known as a log-linear model, especially when used to model contingency tables. Poisson regression models are generalized linear models with the logarithm as the (canonical) link function, and the Poisson distribution function as the assumed probability distribution of the response.

Regression models

If \mathbf{x} \in \mathbb{R}^{n} is a vector of independent variables, then the model takes the form

    \log(\operatorname{E}(Y\mid \mathbf{x})) = \alpha + \boldsymbol{\beta}'\mathbf{x},

where \alpha \in \mathbb{R} and \boldsymbol{\beta} \in \mathbb{R}^{n}. Sometimes this is written more compactly as

    \log(\operatorname{E}(Y\mid \mathbf{x})) = \boldsymbol{\theta}'\mathbf{x},

where \mathbf{x} is now an (n + 1)-dimensional vector consisting of n independent variables concatenated to the number one. Here \theta is simply \alpha concatenated to \beta. Thus, when given a Poisson regression model \theta and an input vector \mathbf{x}, the predicted mean of the associated Poisson distribution is given by

    \operatorname{E}(Y\mid \mathbf{x}) = e^{\boldsymbol{\theta}'\mathbf{x}}.

If Y_i are independent observations with corresponding values \mathbf{x}_i of the predictor variables, then \theta can be estimated by maximum likelihood. The maximum-likelihood estimates lack a closed-form expression and must be found by numerical methods. The probability surface for maximum-likelihood Poisson regression is always concave, making Newton–Raphson or other gradient-based methods appropriate estimation techniques.

Interpretation of coefficients

Suppose we have a model with a single predictor, that is, n = 1:

    \log(\operatorname{E}(Y\mid x)) = \alpha + \beta x.

Suppose we compute the predicted values at point (Y_2, x_2) and (Y_1, x_1):

    \log(\operatorname{E}(Y_2\mid x_2)) = \alpha + \beta x_2,
    \log(\operatorname{E}(Y_1\mid x_1)) = \alpha + \beta x_1.

By subtracting the first from the second:

    \log(\operatorname{E}(Y_2\mid x_2)) - \log(\operatorname{E}(Y_1\mid x_1)) = \beta(x_2 - x_1).

Suppose now that x_2 = x_1 + 1. We obtain

    \log(\operatorname{E}(Y_2\mid x_2)) - \log(\operatorname{E}(Y_1\mid x_1)) = \beta.

So the coefficient of the model is to be interpreted as the increase in the logarithm of the count of the outcome variable when the independent variable increases by 1. By applying the rules of logarithms,

    \log\left(\frac{\operatorname{E}(Y_2\mid x_2)}{\operatorname{E}(Y_1\mid x_1)}\right) = \beta,
    \frac{\operatorname{E}(Y_2\mid x_2)}{\operatorname{E}(Y_1\mid x_1)} = e^{\beta},
    \operatorname{E}(Y_2\mid x_2) = e^{\beta}\,\operatorname{E}(Y_1\mid x_1).

That is, when the independent variable increases by 1, the outcome variable is multiplied by the exponentiated coefficient. The exponentiated coefficient is also called the incidence ratio.

Often, the object of interest is the average partial effect or average marginal effect \partial\operatorname{E}(Y\mid x)/\partial x, which is interpreted as the change in the outcome Y for a one-unit change in the independent variable x. The average partial effect in the Poisson model for a continuous x can be shown to be

    \frac{\partial\operatorname{E}(Y\mid x)}{\partial x} = \exp(\theta'x)\,\beta,

which can be estimated using the coefficient estimates from the Poisson model \hat{\theta} = (\hat{\alpha}, \hat{\beta}) together with the observed values of x.
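As a numerical illustration of this multiplicative interpretation, the sketch below simulates counts from a one-predictor model and fits a Poisson generalized linear model, reading the exponentiated slope as the incidence ratio. It is an illustrative sketch, not part of the original text, and assumes the NumPy and statsmodels packages are available.

```python
# Illustrative sketch (not from the article): simulate a one-predictor Poisson
# model and read the fitted slope as an incidence ratio.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
x = rng.uniform(0.0, 2.0, size=n)
alpha, beta = 0.3, 0.5                        # values chosen for the simulation
y = rng.poisson(np.exp(alpha + beta * x))     # E[Y | x] = exp(alpha + beta * x)

X = sm.add_constant(x)                        # prepend the column of ones
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()

print(fit.params)                # estimates of (alpha, beta)
print(np.exp(fit.params[1]))     # incidence ratio: factor applied to E[Y] per unit of x
```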
Maximum likelihood-based parameter estimation

Given a Poisson regression model \theta and an input vector \mathbf{x}, the mean of the predicted Poisson distribution is, as stated above,

    \lambda := \operatorname{E}(Y\mid x) = e^{\theta'x},

and thus the Poisson distribution's probability mass function is given by

    p(y\mid x;\theta) = \frac{\lambda^{y}}{y!}\,e^{-\lambda} = \frac{e^{y\theta'x}\,e^{-e^{\theta'x}}}{y!}.

Now suppose we are given a data set consisting of m vectors x_i \in \mathbb{R}^{n+1}, i = 1, \ldots, m, along with a set of m observed counts y_1, \ldots, y_m \in \mathbb{N}. Then, for a given set of parameters \theta, the probability of attaining this particular set of data is given by

    p(y_1,\ldots,y_m \mid x_1,\ldots,x_m;\theta) = \prod_{i=1}^{m}\frac{e^{y_i\theta'x_i}\,e^{-e^{\theta'x_i}}}{y_i!}.
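For concreteness, this joint probability can be evaluated directly, but the product of many small factors underflows in floating point, which is one practical reason to work with its logarithm, introduced next. The sketch below is illustrative only; it assumes NumPy and SciPy, and none of the variable names come from the article.

```python
# Illustrative sketch: the product-form probability underflows for realistic
# sample sizes, while the log-likelihood remains a workable number.
import numpy as np
from scipy.special import gammaln            # log(y!) = gammaln(y + 1)

rng = np.random.default_rng(1)
m = 1000
X = np.column_stack([np.ones(m), rng.normal(size=m)])   # rows are x_i with leading 1
theta = np.array([0.2, 0.4])
y = rng.poisson(np.exp(X @ theta))

def log_likelihood(theta, X, y):
    # ell(theta | X, y) = sum_i ( y_i * theta'x_i - exp(theta'x_i) - log(y_i!) )
    eta = X @ theta
    return np.sum(y * eta - np.exp(eta) - gammaln(y + 1))

def joint_probability(theta, X, y):
    # prod_i exp(y_i * theta'x_i) * exp(-exp(theta'x_i)) / y_i!
    eta = X @ theta
    return np.prod(np.exp(y * eta - np.exp(eta) - gammaln(y + 1)))

print(log_likelihood(theta, X, y))     # finite and easy to work with
print(joint_probability(theta, X, y))  # 0.0 here: the product underflows
```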
By the method of maximum likelihood, we wish to find the set of parameters \theta that makes this probability as large as possible. To do this, the equation is first rewritten as a likelihood function in terms of \theta:

    L(\theta\mid X,Y) = \prod_{i=1}^{m}\frac{e^{y_i\theta'x_i}\,e^{-e^{\theta'x_i}}}{y_i!}.

Note that the expression on the right-hand side has not actually changed. A formula in this form is typically difficult to work with; instead, one uses the log-likelihood:

    \ell(\theta\mid X,Y) = \log L(\theta\mid X,Y) = \sum_{i=1}^{m}\left(y_i\theta'x_i - e^{\theta'x_i} - \log(y_i!)\right).

Notice that the parameters \theta only appear in the first two terms of each term in the summation. Therefore, given that we are only interested in finding the best value for \theta, we may drop the y_i! and simply write

    \ell(\theta\mid X,Y) = \sum_{i=1}^{m}\left(y_i\theta'x_i - e^{\theta'x_i}\right).

To find a maximum, we need to solve the equation \partial\ell(\theta\mid X,Y)/\partial\theta = 0, which has no closed-form solution. However, the negative log-likelihood, -\ell(\theta\mid X,Y), is a convex function, and so standard convex optimization techniques such as gradient descent can be applied to find the optimal value of \theta.
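Because the negative log-likelihood is convex, a few Newton–Raphson steps on the reduced log-likelihood above usually converge quickly. The following is a minimal sketch in plain NumPy, not a reference implementation; the simulated data, starting point and stopping rule are all illustrative assumptions.

```python
# Minimal Newton-Raphson sketch for maximizing ell(theta) = sum_i (y_i*theta'x_i - exp(theta'x_i)).
import numpy as np

rng = np.random.default_rng(2)
m = 2000
X = np.column_stack([np.ones(m), rng.normal(size=m)])   # rows are x_i with leading 1
theta_true = np.array([0.5, -0.7])
y = rng.poisson(np.exp(X @ theta_true))

theta = np.zeros(X.shape[1])                 # illustrative starting point
for _ in range(25):
    mu = np.exp(X @ theta)                   # fitted means exp(theta'x_i)
    grad = X.T @ (y - mu)                    # gradient of the log-likelihood
    hess = -(X * mu[:, None]).T @ X          # Hessian: -X' diag(mu) X (negative definite)
    step = np.linalg.solve(hess, grad)
    theta = theta - step                     # Newton update: theta - H^{-1} g
    if np.max(np.abs(step)) < 1e-10:         # illustrative stopping rule
        break

print(theta)    # close to theta_true on this simulated sample
```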
Poisson regression in practice

Poisson regression may be appropriate when the dependent variable is a count, for instance of events such as the arrival of a telephone call at a call centre. The events must be independent in the sense that the arrival of one call will not make another more or less likely, but the probability per unit time of events is understood to be related to covariates such as time of day.

"Exposure" and offset

Poisson regression may also be appropriate for rate data, where the rate is a count of events divided by some measure of that unit's exposure (a particular unit of observation). For example, biologists may count the number of tree species in a forest: events would be tree observations, exposure would be unit area, and rate would be the number of species per unit area. Demographers may model death rates in geographic areas as the count of deaths divided by person-years. More generally, event rates can be calculated as events per unit time, which allows the observation window to vary for each unit. In these examples, exposure is respectively unit area, person-years and unit time. In Poisson regression this is handled as an offset. If the rate is count/exposure, multiplying both sides of the equation by exposure moves it to the right side of the equation. When both sides of the equation are then logged, the final model contains log(exposure) as a term that is added to the regression coefficients:

    \log\left(\frac{\operatorname{E}(Y\mid x)}{\text{exposure}}\right) = \log(\operatorname{E}(Y\mid x)) - \log(\text{exposure}) = \theta'x - \log(\text{exposure}),

which implies

    \log(\operatorname{E}(Y\mid x)) = \theta'x + \log(\text{exposure}).

This logged variable, log(exposure), is called the offset variable and enters on the right-hand side of the equation with a parameter estimate (for log(exposure)) constrained to 1.
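The offset construction can be exercised directly: if counts are generated with varying exposure, passing log(exposure) as an offset recovers the rate coefficients while keeping the offset's coefficient fixed at 1. The sketch below assumes the statsmodels GLM interface and its offset argument; the names and data are illustrative, not from the article.

```python
# Sketch of rate data with an offset: Y_i ~ Poisson(exposure_i * exp(theta'x_i)).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 3000
x = rng.normal(size=n)
exposure = rng.uniform(0.5, 10.0, size=n)    # e.g. person-years observed per unit
rate = np.exp(0.1 + 0.8 * x)                 # events per unit of exposure
y = rng.poisson(exposure * rate)

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Poisson(),
             offset=np.log(exposure)).fit()
print(fit.params)    # recovers roughly (0.1, 0.8); the offset coefficient is not estimated
```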
Overdispersion and zero inflation

A characteristic of the Poisson distribution is that its mean is equal to its variance. In certain circumstances, it will be found that the observed variance is greater than the mean; this is known as overdispersion and indicates that the model is not appropriate. A common reason is the omission of relevant explanatory variables, or dependent observations. Under some circumstances, the problem of overdispersion can be solved by using quasi-likelihood estimation or a negative binomial distribution instead.

Negative binomial regression is a popular generalization of Poisson regression because it loosens the highly restrictive assumption that the variance is equal to the mean made by the Poisson model. The traditional negative binomial regression model is based on the Poisson-gamma mixture distribution. This model is popular because it models the Poisson heterogeneity with a gamma distribution.

Ver Hoef and Boveng described the difference between quasi-Poisson (also called overdispersion with quasi-likelihood) and negative binomial (equivalent to gamma-Poisson) as follows: if E(Y) = μ, the quasi-Poisson model assumes var(Y) = θμ while the gamma-Poisson assumes var(Y) = μ(1 + κμ), where θ is the quasi-Poisson overdispersion parameter and κ is the shape parameter of the negative binomial distribution. With large μ and substantial extra-Poisson variation, the negative binomial weights are capped at 1/κ. Ver Hoef and Boveng discussed an example where they selected between the two by plotting mean squared residuals vs. the mean.

Another common problem with Poisson regression is excess zeros: if there are two processes at work, one determining whether there are zero events or any events, and a Poisson process determining how many events there are, there will be more zeros than a Poisson regression would predict. An example would be the distribution of cigarettes smoked in an hour by members of a group where some individuals are non-smokers. A zero-inflated model may function better in these cases.

On the contrary, underdispersion may pose an issue for parameter estimation.
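A simple diagnostic for the overdispersion described above is the Pearson dispersion statistic, the sum of squared Pearson residuals divided by the residual degrees of freedom, which should be near 1 for a well-specified Poisson model. The sketch below, assuming NumPy and statsmodels, simulates Poisson-gamma counts and shows the statistic rising well above 1; it is illustrative, not part of the original text.

```python
# Overdispersion check sketch: Pearson dispersion statistic for a Poisson GLM
# fitted to counts that actually come from a Poisson-gamma mixture.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 4000
x = rng.normal(size=n)
mu = np.exp(0.2 + 0.6 * x)
heterogeneity = rng.gamma(shape=2.0, scale=0.5, size=n)   # mean 1, adds extra variance
y = rng.poisson(mu * heterogeneity)

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
mu_hat = fit.fittedvalues
dispersion = np.sum((y - mu_hat) ** 2 / mu_hat) / (n - X.shape[1])
print(dispersion)    # near 1 for well-specified Poisson data; well above 1 here
```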
8024: 7924: 7914: 7833: 7778: 6934: 6883: 6868: 6858: 6727: 6599: 6566: 6392: 6347: 6177: 417: 1782:{\displaystyle \log(\operatorname {E} (Y_{2}\mid x_{2}))-\log(\operatorname {E} (Y_{1}\mid x_{1}))=\beta } 8051: 7866: 7446: 7278: 7079: 7003: 6304: 6058: 5727: 5191: 4191: 360: 4904:"Is eliciting dependency worth the effort? A study for the multivariate Poisson-Gamma probability model" 4064:
Extensions

Regularized Poisson regression

When estimating the parameters for Poisson regression, one typically tries to find values for \theta that maximize the likelihood of an expression of the form

    \sum_{i=1}^{m}\log(p(y_i;e^{\theta'x_i})),

where m is the number of examples in the data set, and p(y_i;e^{\theta'x_i}) is the probability mass function of the Poisson distribution with the mean set to e^{\theta'x_i}. Regularization can be added to this optimization problem by instead maximizing

    \sum_{i=1}^{m}\log(p(y_i;e^{\theta'x_i})) - \lambda\left\|\theta\right\|_{2}^{2},

for some positive constant \lambda. This technique, similar to ridge regression, can reduce overfitting.

Use in survival analysis

Poisson regression creates proportional hazards models, one class of survival analysis: see proportional hazards models for descriptions of Cox models.

See also

Zero-inflated model
Poisson distribution
Fixed-effect Poisson model
Partial likelihood methods for panel data § Pooled QMLE for Poisson models
Control function (econometrics) § Endogeneity in Poisson regression

References

Nelder, J. A. (1974). "Log Linear Models for Contingency Tables: A Generalization of Classical Least Squares". Journal of the Royal Statistical Society, Series C (Applied Statistics): 323–329. doi:10.2307/2347125.
Frome, Edward L. (1983). "The Analysis of Rates Using Poisson Regression Models". Biometrics: 665–674. doi:10.2307/2531094.
Paternoster, R.; Brame, R. (1997). "Multiple routes to delinquency? A test of developmental and general theories of crime". Criminology: 45–84. doi:10.1111/j.1745-9125.1997.tb00870.x.
Ver Hoef, Jay M.; Boveng, Peter L. (2007). "Quasi-Poisson vs. Negative Binomial Regression: How should we model overdispersed count data?". Ecology. 88 (11): 2766–2772. doi:10.1890/07-0043.1.
Berk, R.; MacDonald, J. (2008). "Overdispersion and Poisson regression". Journal of Quantitative Criminology. (3): 269–284. doi:10.1007/s10940-008-9048-4.
Perperoglou, Aris (2011). "Fitting survival data with penalized Poisson regression". Statistical Methods & Applications. (4): 451–462. doi:10.1007/s10260-011-0172-1.
Schwarzenegger, Rafael; Quigley, John; Walls, Lesley (2021). "Is eliciting dependency worth the effort? A study for the multivariate Poisson-Gamma probability model". Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability. doi:10.1177/1748006X211059417.
Greene, William H. (2003). Econometric Analysis (5th ed.). Prentice-Hall. ISBN 978-0130661890.
Greene, William H. (2008). "Models for Event Counts and Duration". Econometric Analysis (8th ed.). Upper Saddle River: Prentice Hall. ISBN 978-0-13-600383-0.
Wooldridge, Jeffrey (2010). Econometric Analysis of Cross Section and Panel Data (2nd ed.). Cambridge, Massachusetts: The MIT Press. p. 726.
Cameron, A. C.; Trivedi, P. K. (1998). Regression Analysis of Count Data. Cambridge University Press. ISBN 978-0-521-63201-0.
Christensen, Ronald (1997). Log-linear Models and Logistic Regression. Springer Texts in Statistics (2nd ed.). New York: Springer-Verlag. ISBN 978-0-387-98247-2.
Gouriéroux, Christian. "The Econometrics of Discrete Positive Variables: the Poisson Model". Econometrics of Qualitative Dependent Variables. New York: Cambridge University Press. pp. 270–283. ISBN 978-0-521-58985-7.
Hilbe, J. M. (2007). Negative Binomial Regression. Cambridge University Press. ISBN 978-0-521-85772-7.
Jones, Andrew M.; et al. (2013). "Models for count data". Applied Health Economics. London: Routledge. pp. 295–341. ISBN 978-0-415-67682-3.
Myers, Raymond H.; et al. (2010). "Logistic and Poisson Regression Models". Generalized Linear Models With Applications in Engineering and the Sciences (2nd ed.). New Jersey: Wiley. pp. 176–183. ISBN 978-0-470-45463-3.
4578:λ 4537:θ 4529:λ 4526:− 4501:θ 4470:⁡ 4447:∑ 4407:θ 4354:θ 4277:θ 4246:⁡ 4223:∑ 4171:model or 4116:), where 3944:⁡ 3938:− 3928:θ 3910:⁡ 3904:− 3892:∣ 3883:⁡ 3874:⁡ 3849:∣ 3840:⁡ 3827:⁡ 3791:θ 3775:∣ 3766:⁡ 3757:⁡ 3678:∣ 3675:θ 3669:ℓ 3666:− 3637:θ 3634:∂ 3617:∣ 3614:θ 3608:ℓ 3605:∂ 3555:θ 3546:− 3529:θ 3493:∑ 3474:∣ 3471:θ 3465:ℓ 3395:⁡ 3389:− 3370:θ 3361:− 3344:θ 3308:∑ 3289:∣ 3286:θ 3277:⁡ 3256:∣ 3253:θ 3247:ℓ 3174:θ 3165:− 3141:θ 3102:∏ 3083:∣ 3080:θ 2993:θ 2984:− 2960:θ 2921:∏ 2911:θ 2892:… 2876:∣ 2860:… 2809:∈ 2793:… 2750:… 2713:∈ 2654:θ 2645:− 2628:θ 2608:λ 2605:− 2581:λ 2569:θ 2560:∣ 2508:θ 2490:∣ 2481:⁡ 2472:λ 2401:^ 2398:β 2386:^ 2383:α 2368:^ 2365:θ 2339:β 2324:θ 2317:⁡ 2302:∂ 2277:∂ 2185:∂ 2160:∂ 2106:∣ 2090:⁡ 2082:β 2058:∣ 2042:⁡ 2014:β 1986:∣ 1970:⁡ 1949:∣ 1933:⁡ 1903:β 1876:∣ 1860:⁡ 1839:∣ 1823:⁡ 1809:⁡ 1777:β 1755:∣ 1739:⁡ 1730:⁡ 1724:− 1705:∣ 1689:⁡ 1680:⁡ 1592:− 1576:β 1554:∣ 1538:⁡ 1529:⁡ 1523:− 1504:∣ 1488:⁡ 1479:⁡ 1440:β 1434:α 1412:∣ 1396:⁡ 1387:⁡ 1351:β 1345:α 1323:∣ 1307:⁡ 1298:⁡ 1174:β 1168:α 1151:∣ 1142:⁡ 1133:⁡ 1064:θ 965:θ 944:∣ 935:⁡ 887:θ 864:α 844:β 824:θ 757:θ 738:∣ 729:⁡ 720:⁡ 682:∈ 678:β 652:∈ 649:α 613:β 605:α 588:∣ 579:⁡ 570:⁡ 528:∈ 471:logarithm 215:Segmented 7765:Logistic 7755:Binomial 7734:Isotonic 7729:Quantile 7421:Category 7114:Survival 6991:Johansen 6714:Binomial 6669:Isotonic 6256:(normal) 5901:location 5708:Blocking 5663:Sampling 5542:Q–Q plot 5507:Box plot 5489:Graphics 5384:Skewness 5374:Kurtosis 5346:Variance 5276:Heronian 5271:Harmonic 5035:(2000). 4965:10883925 4914:(5): 5. 4883:18051645 4602:See also 4540:‖ 4534:‖ 4504:′ 4410:′ 4357:′ 4280:′ 4066:variance 4011:exposure 3980:offset() 3951:exposure 3931:′ 3917:exposure 3859:exposure 3794:′ 3737:exposure 3558:′ 3532:′ 3373:′ 3347:′ 3177:′ 3144:′ 2996:′ 2963:′ 2695:vectors 2657:′ 2631:′ 2511:′ 2327:′ 969:′ 761:′ 617:′ 449:form of 330:Bayesian 268:Weighted 263:Ordinary 195:Isotonic 190:Quantile 7760:Poisson 7447:Commons 7394:Kriging 7279:Process 7236:studies 7095:Wavelet 6928:General 6095:Plug-in 5889:L space 5668:Cluster 5369:Moments 5187:Outline 5025:1633357 4863:Bibcode 4855:Ecology 4768:2531094 4677:2347125 4389:of the 4385:is the 4032:poisson 2772:values 473:of its 289:Partial 128:Poisson 7724:Robust 7316:Census 6906:Normal 6854:Manova 6674:Robust 6424:2-way 6416:1-way 6254:-test 5925:  5502:Biplot 5293:Median 5286:Lehmer 5228:Center 5134:  5115:  5096:  5077:  5073:–944. 5047:  5023:  5013:  4994:  4963:  4955:  4881:  4834:  4766:  4721:  4717:–752. 4675:  4317:where 4190:: see 4163:Other 4026:family 3999:offset 3741:offset 3724:events 786:where 641:where 465:has a 247:Linear 185:Robust 108:Probit 34:Models 6940:Trend 6469:prior 6411:anova 6300:-test 6274:-test 6266:-test 6173:Power 6118:Pivot 5911:shape 5906:scale 5356:Shape 5336:Range 5281:Heinz 5256:Cubic 5192:Index 4961:S2CID 4832:S2CID 4764:JSTOR 4750:(3): 4673:JSTOR 4659:(3): 445:is a 294:Total 210:Local 7504:and 7173:Test 6373:Sign 6225:Wald 5298:Mode 5236:Mean 5132:ISBN 5113:ISBN 5094:ISBN 5075:ISBN 5045:ISBN 5011:ISBN 4992:ISBN 4953:ISSN 4879:PMID 4719:ISBN 4108:) = 4100:) = 4092:) = 4038:link 1238:and 1023:are 669:and 457:and 7839:BIC 7834:AIC 6353:BIC 6348:AIC 5071:906 4945:doi 4916:doi 4912:237 4871:doi 4824:doi 4795:doi 4756:doi 4752:pp. 4715:740 4665:doi 4661:pp. 4467:log 4243:log 4044:log 4005:log 3987:glm 3974:in 3972:GLM 3941:log 3907:log 3871:log 3824:log 3754:log 3392:log 3274:log 2314:exp 1806:log 1727:log 1677:log 1526:log 1476:log 1384:log 1295:log 1130:log 996:If 717:log 567:log 515:If 437:In 8069:: 5039:. 
5021:MR 5019:. 4959:. 4951:. 4941:20 4939:. 4910:. 4906:. 4877:. 4869:. 4859:88 4857:. 4853:. 4830:. 4820:24 4818:. 4791:35 4789:. 4785:. 4762:. 4748:39 4746:. 4742:. 4671:. 4657:23 4655:. 4651:. 4598:. 4146:κΟ 4114:κΟ 4102:θΟ 4014:)) 3714:. 3236:: 3063:: 2475::= 2441:. 2141:. 1284:: 1119:: 876:. 441:, 7827:p 7825:C 7571:) 7562:( 7494:e 7487:t 7480:v 6298:G 6272:F 6264:t 6252:Z 5971:V 5966:U 5168:e 5161:t 5154:v 5140:. 5121:. 5102:. 5083:. 5053:. 5027:. 5000:. 4967:. 4947:: 4924:. 4918:: 4891:. 4873:: 4865:: 4838:. 4826:: 4803:. 4797:: 4770:. 4758:: 4727:. 4679:. 4667:: 4555:, 4550:2 4545:2 4523:) 4520:) 4513:i 4509:x 4496:e 4492:; 4487:i 4483:y 4479:( 4476:p 4473:( 4462:m 4457:1 4454:= 4451:i 4419:i 4415:x 4402:e 4373:) 4366:i 4362:x 4349:e 4345:; 4340:i 4336:y 4332:( 4329:p 4319:m 4302:, 4299:) 4296:) 4289:i 4285:x 4272:e 4268:; 4263:i 4259:y 4255:( 4252:p 4249:( 4238:m 4233:1 4230:= 4227:i 4209:θ 4154:Îş 4150:Îź 4142:Îź 4138:θ 4136:/ 4134:Îź 4122:Îş 4118:θ 4110:Îź 4106:Y 4098:Y 4094:Îź 4090:Y 4088:( 4086:E 4050:) 4047:) 4041:= 4035:( 4029:= 4023:, 4020:x 4017:+ 4008:( 4002:( 3996:~ 3993:y 3990:( 3976:R 3955:) 3947:( 3935:x 3924:= 3921:) 3913:( 3901:) 3898:) 3895:x 3889:Y 3886:( 3880:E 3877:( 3868:= 3864:) 3855:) 3852:x 3846:Y 3843:( 3837:E 3831:( 3798:x 3787:= 3784:) 3781:) 3778:x 3772:Y 3769:( 3763:E 3760:( 3712:θ 3690:) 3687:Y 3684:, 3681:X 3672:( 3646:0 3643:= 3629:) 3626:Y 3623:, 3620:X 3611:( 3579:. 3575:) 3567:i 3563:x 3550:e 3541:i 3537:x 3523:i 3519:y 3514:( 3508:m 3503:1 3500:= 3497:i 3489:= 3486:) 3483:Y 3480:, 3477:X 3468:( 3451:i 3447:y 3443:θ 3439:θ 3422:. 3418:) 3414:) 3411:! 3406:i 3402:y 3398:( 3382:i 3378:x 3365:e 3356:i 3352:x 3338:i 3334:y 3329:( 3323:m 3318:1 3315:= 3312:i 3304:= 3301:) 3298:Y 3295:, 3292:X 3283:( 3280:L 3271:= 3268:) 3265:Y 3262:, 3259:X 3250:( 3213:. 3207:! 3202:i 3198:y 3186:i 3182:x 3169:e 3161:e 3153:i 3149:x 3135:i 3131:y 3126:e 3117:m 3112:1 3109:= 3106:i 3098:= 3095:) 3092:Y 3089:, 3086:X 3077:( 3074:L 3061:θ 3053:θ 3032:. 3026:! 3021:i 3017:y 3005:i 3001:x 2988:e 2980:e 2972:i 2968:x 2954:i 2950:y 2945:e 2936:m 2931:1 2928:= 2925:i 2917:= 2914:) 2908:; 2903:m 2899:x 2895:, 2889:, 2884:1 2880:x 2871:m 2867:y 2863:, 2857:, 2852:1 2848:y 2844:( 2841:p 2828:θ 2813:N 2804:m 2800:y 2796:, 2790:, 2785:1 2781:y 2770:m 2756:m 2753:, 2747:, 2744:1 2741:= 2738:i 2734:, 2729:1 2726:+ 2723:n 2718:R 2708:i 2704:x 2693:m 2673:! 2670:y 2661:x 2649:e 2641:e 2635:x 2624:y 2620:e 2613:= 2601:e 2594:! 
2591:y 2585:y 2575:= 2572:) 2566:; 2563:x 2557:y 2554:( 2551:p 2520:, 2515:x 2503:e 2499:= 2496:) 2493:x 2487:Y 2484:( 2478:E 2455:x 2451:θ 2428:x 2407:) 2392:, 2377:( 2374:= 2336:) 2332:x 2320:( 2311:= 2305:x 2297:) 2294:x 2290:| 2286:Y 2283:( 2280:E 2251:x 2231:x 2211:Y 2188:x 2180:) 2177:x 2173:| 2169:Y 2166:( 2163:E 2119:) 2114:1 2110:x 2101:1 2097:Y 2093:( 2087:E 2078:e 2074:= 2071:) 2066:2 2062:x 2053:2 2049:Y 2045:( 2039:E 2010:e 2006:= 1999:) 1994:1 1990:x 1981:1 1977:Y 1973:( 1967:E 1962:) 1957:2 1953:x 1944:2 1940:Y 1936:( 1930:E 1900:= 1896:) 1889:) 1884:1 1880:x 1871:1 1867:Y 1863:( 1857:E 1852:) 1847:2 1843:x 1834:2 1830:Y 1826:( 1820:E 1813:( 1774:= 1771:) 1768:) 1763:1 1759:x 1750:1 1746:Y 1742:( 1736:E 1733:( 1721:) 1718:) 1713:2 1709:x 1700:2 1696:Y 1692:( 1686:E 1683:( 1654:1 1651:+ 1646:1 1642:x 1638:= 1633:2 1629:x 1605:) 1600:1 1596:x 1587:2 1583:x 1579:( 1573:= 1570:) 1567:) 1562:1 1558:x 1549:1 1545:Y 1541:( 1535:E 1532:( 1520:) 1517:) 1512:2 1508:x 1499:2 1495:Y 1491:( 1485:E 1482:( 1448:1 1444:x 1437:+ 1431:= 1428:) 1425:) 1420:1 1416:x 1407:1 1403:Y 1399:( 1393:E 1390:( 1359:2 1355:x 1348:+ 1342:= 1339:) 1336:) 1331:2 1327:x 1318:2 1314:Y 1310:( 1304:E 1301:( 1272:) 1267:1 1263:x 1259:, 1254:1 1250:Y 1246:( 1226:) 1221:2 1217:x 1213:, 1208:2 1204:Y 1200:( 1177:x 1171:+ 1165:= 1162:) 1159:) 1155:x 1148:Y 1145:( 1139:E 1136:( 1107:1 1104:= 1101:n 1042:i 1037:x 1009:i 1005:Y 980:. 974:x 959:e 955:= 952:) 948:x 941:Y 938:( 932:E 908:x 814:n 810:n 795:x 770:, 766:x 752:= 749:) 746:) 742:x 735:Y 732:( 726:E 723:( 692:n 687:R 656:R 626:, 622:x 608:+ 602:= 599:) 596:) 592:x 585:Y 582:( 576:E 573:( 538:n 533:R 524:x 463:Y 426:e 419:t 412:v
