Empirical Bayes methods enable the use of auxiliary empirical data, from observations of related parameters, in the development of a Bayes estimator. This is done under the assumption that the estimated parameters are obtained from a common prior. For example, if independent observations of different parameters are performed, then the estimation performance of a particular parameter can sometimes be improved by using data from other observations.

, where W is the weighted rating and C is the average rating of all films. So, in simpler terms, the fewer ratings/votes cast for a film, the more that film's Weighted Rating will skew towards the average across all films, while films with many ratings/votes will have a rating approaching its pure arithmetic average rating.
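This skew-toward-the-mean behavior is easy to sketch in code. The following is a minimal illustration of the weighted-rating formula W = (Rv + Cm)/(v + m); the values chosen for C and m below are illustrative assumptions, not IMDb's actual parameters.

```python
def weighted_rating(R, v, C, m):
    """Weighted rating W = (R*v + C*m) / (v + m): a blend of the film's own
    mean rating R (weight = its v votes) and the all-film mean C (weight = m)."""
    return (R * v + C * m) / (v + m)

# Illustrative values only: C and m here are assumptions, not IMDb's parameters.
few_votes = weighted_rating(R=9.0, v=10, C=7.0, m=25000)       # pulled toward C
many_votes = weighted_rating(R=9.0, v=500000, C=7.0, m=25000)  # stays near R
```

With only 10 votes the rating collapses almost entirely to the global mean, while half a million votes leave it close to the film's own average.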
Risk functions are chosen depending on how one measures the distance between the estimate and the unknown parameter. The MSE is the most common risk function in use, primarily due to its simplicity. However, alternative risk functions are also occasionally used. The following are several examples of

For example, if Σ=σ/2, then the deviation of 4 measurements combined matches the deviation of the prior (assuming that errors of measurements are independent). And the weights α, β in the formula for the posterior match this: the weight of the prior is 4 times the weight of the measurement. Combining

The use of an improper prior means that the Bayes risk is undefined (since the prior is not a probability distribution and we cannot take an expectation under it). As a consequence, it is no longer meaningful to speak of a Bayes estimator that minimizes the Bayes risk. Nevertheless, in many cases,

By contrast, generalized Bayes rules often have undefined Bayes risk in the case of improper priors. These rules are often inadmissible and the verification of their admissibility can be difficult. For example, the generalized Bayes estimator of a location parameter θ based on Gaussian samples

Conjugate priors are especially useful for sequential estimation, where the posterior of the current measurement is used as the prior in the next measurement. In sequential estimation, unless a conjugate prior is used, the posterior distribution typically becomes more complex with each added

, for which the resulting posterior distribution also belongs to the same family. This is an important property, since the Bayes estimator, as well as its statistical properties (variance, confidence interval, etc.), can all be derived from the posterior distribution.

{\displaystyle L(\theta ,{\widehat {\theta }})={\begin{cases}a|\theta -{\widehat {\theta }}|,&{\mbox{for }}\theta -{\widehat {\theta }}\geq 0\\b|\theta -{\widehat {\theta }}|,&{\mbox{for }}\theta -{\widehat {\theta }}<0\end{cases}}}

is typically well-defined and finite. Recall that, for a proper prior, the Bayes estimator minimizes the posterior expected loss. When the prior is improper, an estimator which minimizes the posterior expected loss is referred to as a

Compare to the example of binomial distribution: there the prior has the weight of (σ/Σ)²−1 measurements. One can see that the exact weight does depend on the details of the distribution, but when σ≫Σ, the difference becomes small.

, since Bayes' theorem can only be applied when all distributions are proper. However, it is not uncommon for the resulting "posterior" to be a valid probability distribution. In this case, the posterior expected loss

be a sequence of Bayes estimators of θ based on an increasing number of measurements. We are interested in analyzing the asymptotic performance of this sequence of estimators, i.e., the performance of

which is claimed to give "a true Bayesian estimate". The following Bayesian formula was initially used to calculate a weighted average score for the Top 250, though the formula has since changed:

bits of the new information. In applications, one often knows very little about fine details of the prior distribution; in particular, there is no reason to assume that it coincides with B(

{\displaystyle L(\theta ,{\widehat {\theta }})={\begin{cases}0,&{\mbox{for }}|\theta -{\widehat {\theta }}|<K\\L,&{\mbox{for }}|\theta -{\widehat {\theta }}|\geq K.\end{cases}}}

, the confidence of the average rating surpasses the confidence of the mean vote for all films (C), and the weighted Bayesian rating (W) approaches a straight average (R). The closer

) exactly. In such a case, one possible interpretation of this calculation is: "there is a non-pathological prior distribution with the mean value 0.5 and the standard deviation
, with weights in this weighted average being α=σ², β=Σ². Moreover, the squared posterior deviation is Σ²σ²/(Σ²+σ²). In other words, the prior is combined with the measurement in
IMDb's approach ensures that a film with only a few ratings, all at 10, would not rank above "The Godfather", for example, with a 9.2 average from over 500,000 ratings.
is small, the prior information is still relevant to the decision problem and affects the estimate. To see the relative weight of the prior information, assume that
If a Bayes rule is unique then it is admissible. For example, as stated above, under mean squared error (MSE) the Bayes rule is unique and therefore admissible.

If θ belongs to a continuous (non-discrete) set, and if the risk function R(θ,δ) is continuous in θ for every δ, then all Bayes rules are admissible.
Another example of the same phenomenon is the case when the prior estimate and a measurement are normally distributed. If the prior is centered at

m = weight given to the prior estimate (in this case, the number of votes IMDb deemed necessary for the average rating to approach statistical validity)

, i.e., a prior distribution which does not imply a preference for any particular value of the unknown parameter. One can still define a function

, of all real numbers) for which every real number is equally likely. Yet, in some sense, such a "distribution" seems like a natural choice for a

; in particular, the prior plays the same role as 4 measurements made in advance. In general, the prior has the weight of (σ/Σ)² measurements.

; in this case each measurement brings in 1 new bit of information; the formula above shows that the prior information has the same weight as

, the effect of the prior probability on the posterior is negligible. Moreover, if δ is the Bayes estimator under MSE risk, then it is

) where θ denotes the probability for success. Assuming θ is distributed according to the conjugate prior, which in this case is the
{\displaystyle {\widehat {\theta }}(x)={\frac {\sigma ^{2}}{\sigma ^{2}+\tau ^{2}}}\mu +{\frac {\tau ^{2}}{\sigma ^{2}+\tau ^{2}}}x.}
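The normal-prior posterior mean above is a convex combination of the prior mean and the observation. Here is a minimal numeric sketch (function name and values are ours, not from the source):

```python
def normal_posterior_mean(x, mu, tau2, sigma2):
    """Posterior mean for x | theta ~ N(theta, sigma2) with prior
    theta ~ N(mu, tau2): w*mu + (1-w)*x where w = sigma2/(sigma2+tau2)."""
    w = sigma2 / (sigma2 + tau2)   # weight on the prior mean
    return w * mu + (1.0 - w) * x

# Equal prior and measurement variance: the estimate is the midpoint.
est = normal_posterior_mean(x=2.0, mu=0.0, tau2=1.0, sigma2=1.0)
```

As the measurement variance σ² shrinks relative to τ², the estimate moves toward the raw observation x, as the formula predicts.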
, or a point close to it depending on the curvature and properties of the posterior distribution. Small values of the parameter

if it minimizes the Bayes risk among all estimators. Equivalently, the estimator which minimizes the posterior expected loss

(MLE). The relations between the maximum likelihood and Bayes estimators can be shown in the following simple example.

However, occasionally this can be a restrictive requirement. For example, there is no distribution (covering the set,
{\displaystyle E=\int {L(a-\theta )p(\theta |x)d\theta }={\frac {1}{p(x)}}\int L(a-\theta )f(x-\theta )d\theta .}
is sometimes chosen for simplicity. A conjugate prior is defined as a prior distribution belonging to some
measurement, and the Bayes estimator cannot usually be calculated without resorting to numerical methods.
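The closed-form convenience of conjugacy can be sketched with the standard Beta–Bernoulli pair (the observation sequence below is made up for illustration):

```python
# Sequential conjugate updating: a Beta(a, b) prior for Bernoulli data stays
# Beta after every observation, so each posterior becomes the next prior.
a, b = 1.0, 1.0                      # Beta(1, 1) = uniform prior (illustrative)
observations = [1, 0, 1, 1, 0, 1]    # made-up Bernoulli outcomes
for x in observations:
    a, b = a + x, b + (1 - x)        # closed-form posterior update
posterior_mean = a / (a + b)         # Bayes estimator of theta under MSE
```

No numerical integration is ever needed: the posterior after any number of observations is still a Beta distribution with updated parameters.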
{\displaystyle \int L(a-\theta )f(x_{1}-\theta )d\theta =\int L(a-x_{1}-\theta ')f(-\theta ')d\theta '.}
{\displaystyle {\sqrt {n}}(\delta _{n}-\theta _{0})\to N\left(0,{\frac {1}{I(\theta _{0})}}\right),}
{\displaystyle {\widehat {\sigma }}_{m}^{2}={\frac {1}{n}}\sum {(x_{i}-{\widehat {\mu }}_{m})^{2}}.}
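The empirical-Bayes moment estimates can be sketched as follows, assuming the section's setting x_i | θ_i ~ N(θ_i, 1) so that the known sampling variance is K = 1 (function name and data are ours):

```python
def eb_prior_moments(xs, K=1.0):
    """Estimate the prior's moments from the marginal sample:
    mu_pi-hat = mu_m-hat and sigma_pi^2-hat = sigma_m^2-hat - K,
    where K is the known sampling variance (K = 1 for N(theta_i, 1) data)."""
    n = len(xs)
    mu_m = sum(xs) / n
    var_m = sum((x - mu_m) ** 2 for x in xs) / n
    return mu_m, max(var_m - K, 0.0)   # floor at 0 to keep a valid variance

mu_pi, var_pi = eb_prior_moments([0.0, 2.0, 4.0], K=1.0)
```

Flooring at zero is a practical guard: with small samples the raw difference σ̂_m² − K can come out negative.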
The following is a simple example of parametric empirical Bayes estimation. Given past observations

is the most widely used and validated. Other loss functions are used in statistics, particularly in
{\displaystyle p(\theta |x)={\frac {p(x|\theta )p(\theta )}{\int p(x|\theta )p(\theta )d\theta }}.}
{\displaystyle {\widehat {\theta }}(X)={\frac {(a+n)\max {(\theta _{0},x_{1},...,x_{n})}}{a+n-1}}.}

), the posterior distribution is known to be B(a+x,b+n-x). Thus, the Bayes estimator under MSE is
uses a formula for calculating and comparing the ratings of films by its users, including their
{\displaystyle p(\theta |x)={\frac {p(x|\theta )p(\theta )}{p(x)}}={\frac {f(x-\theta )}{p(x)}}}
, and if we assume a normal prior (which is a conjugate prior in this case), we conclude that

, then the posterior is also Pareto distributed, and the Bayes estimator under MSE is given by
{\displaystyle \theta _{n+1}\sim N({\widehat {\mu }}_{\pi },{\widehat {\sigma }}_{\pi }^{2})}

in this case, especially when no other more subjective information is available. This yields

, then the posterior is also Gamma distributed, and the Bayes estimator under MSE is given by
Using the MSE as risk, the Bayes estimate of the unknown parameter is simply the mean of the
If there is no inherent reason to prefer one prior probability distribution over another, a
To this end, it is customary to regard θ as a deterministic parameter whose true value is
from the posterior distribution, and is a generalization of the previous loss function:
, but this would not be a proper probability distribution since it has infinite mass,
{\displaystyle \delta _{n}(x)={\frac {a+b}{a+b+n}}E+{\frac {n}{a+b+n}}\delta _{MLE}.}
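The Bayes estimator for the binomial parameter is a convex combination of the prior mean and the MLE, which equals (a+x)/(a+b+n). A quick numeric check (function name is ours):

```python
def bayes_binomial(x, n, a, b):
    """Posterior-mean estimate for x ~ Binomial(n, theta) with a Beta(a, b)
    prior, written as a blend of the prior mean and the MLE x/n."""
    prior_mean = a / (a + b)
    mle = x / n
    w = (a + b) / (a + b + n)        # prior weight, shrinking as n grows
    return w * prior_mean + (1 - w) * mle

est = bayes_binomial(x=7, n=10, a=2, b=2)   # equals (2 + 7) / (2 + 2 + 10)
```

As n grows the prior weight (a+b)/(a+b+n) vanishes and the estimate approaches the MLE, matching the n → ∞ remark in the text.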
{\displaystyle \sigma _{\pi }^{2}=\sigma _{m}^{2}-\sigma _{f}^{2}=\sigma _{m}^{2}-K.}
{\displaystyle {\frac {\alpha }{\alpha +\beta }}B+{\frac {\beta }{\alpha +\beta }}b}
(described in the "Generalized Bayes estimator" section above) is inadmissible for
), the posterior density of θ is approximately normal. In other words, for large

, then the posterior is also Normal and the Bayes estimator under MSE is given by

such alternatives. We denote the posterior generalized distribution function by
{\displaystyle {\widehat {\sigma }}_{\pi }^{2}={\widehat {\sigma }}_{m}^{2}-K.}

In this case it can be shown that the generalized Bayes estimator has the form
{\displaystyle L(\theta ,{\widehat {\theta }})=a|\theta -{\widehat {\theta }}|}
{\displaystyle {\widehat {\theta }}(x)=E[\theta |x]=\int \theta \,p(\theta |x)\,d\theta .}
We can then use the past observations to determine the mean and variance of
{\displaystyle {\widehat {\theta }}(X)={\frac {n{\overline {X}}+a}{n+b}}.}
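This Gamma–Poisson posterior mean (nX̄+a)/(n+b), taking a as the Gamma shape and b as the rate, can be checked with a one-liner (function name and data are ours):

```python
def poisson_gamma_estimate(xs, a, b):
    """Posterior mean (sum(x) + a) / (n + b) for Poisson observations x_i
    with a Gamma(a, b) prior (shape a, rate b); the posterior is
    Gamma(a + sum(x), b + n)."""
    n = len(xs)
    return (sum(xs) + a) / (n + b)

est = poisson_gamma_estimate([2, 3, 4], a=1.0, b=1.0)   # (9 + 1) / (3 + 1)
```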
R = average rating for the movie as a number from 1 to 10 (mean) = (Rating)

n → ∞, the Bayes estimator (in the described problem) is close to the MLE.

has thus far been assumed to be a true probability distribution, in that
Bayesian
Estimation and Experimental Design in Linear Regression Models
Admissible decision rule § Bayes rules and generalized Bayes rules

(5th printing ed.). Cambridge: Cambridge University Press. p. 172.

the same way as if it were an extra measurement to take into account.
Another estimator which is asymptotically normal and efficient is the
6024:. The following are some specific examples of admissibility theorems.
Another "linear" loss function, which assigns different "weights"
Another "linear" loss function, which assigns different "weights"
The most common risk function used for
Bayesian estimation is the

also minimizes the Bayes risk and therefore is a Bayes estimator.
{\displaystyle {\widehat {\mu }}_{m}={\frac {1}{n}}\sum {x_{i}},}

. Under specific conditions, for large samples (large values of

function. An alternative way of formulating an estimator within
{\displaystyle {\widehat {\mu }}_{\pi }={\widehat {\mu }}_{m},}

which depends on unknown parameters. For example, suppose that

are recommended, in order to use the mode as an approximation (

, which are not probability distributions, are referred to as
The following loss function is trickier: it yields either the
{\displaystyle F({\widehat {\theta }}(x)|X)={\frac {a}{a+b}}.}
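Under this asymmetric linear loss the Bayes estimate is the posterior quantile at level a/(a+b). A sample-based sketch, assuming `samples` are draws from the posterior (function name is ours):

```python
def asymmetric_loss_estimate(samples, a, b):
    """Bayes estimate under asymmetric linear loss (weights a, b): the
    posterior quantile at level a / (a + b), taken over posterior samples."""
    s = sorted(samples)
    k = round((a / (a + b)) * (len(s) - 1))   # nearest-rank quantile index
    return s[k]

# With a = b the rule reduces to the posterior median.
median = asymmetric_loss_estimate(list(range(101)), a=1, b=1)
```

Increasing b relative to a pushes the estimate downward, since overestimation is penalized more heavily.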
where the expectation is taken over the joint distribution of
then an estimator which minimizes the posterior expected loss

, which are assumed to be known. In particular, suppose that
{\displaystyle F({\widehat {\theta }}(x)|X)={\tfrac {1}{2}}.}
). Equivalently, it maximizes the posterior expectation of a

(the number of ratings for the film) is to zero, the closer
{\displaystyle \delta _{n}=\delta _{n}(x_{1},\ldots ,x_{n})}
{\displaystyle {\widehat {\theta }}={\widehat {\theta }}(x)}
, which yields the posterior median as the Bayes' estimate:

which gives the weight of prior information equal to 1/(4

{\displaystyle E_{\pi }(L(\theta ,{\widehat {\theta }}))}
Finally, we obtain the estimated moments of the prior,
{\displaystyle \int {L(\theta ,a)p(\theta |x)d\theta }}

Chichester: John Wiley & Sons. pp. 38–117.
{\displaystyle \delta _{n}(x)=E[\theta |x]={\frac {a+x}{a+b+n}}.}
Let θ be an unknown random variable, and suppose that
{\displaystyle x_{i}|\theta _{i}\sim N(\theta _{i},1)}
Bias of an estimator § Median-unbiased estimators
C = the mean vote across the whole pool (currently 7.0)

with deviation Σ, and the measurement is centered at
Consider the estimator of θ based on binomial sample
Other loss functions can be conceived, although the
{\displaystyle x|\theta \sim N(\theta ,\sigma ^{2})}
{\displaystyle E(L(\theta ,{\widehat {\theta }})|x)}
{\displaystyle {\frac {4}{4+n}}V+{\frac {n}{4+n}}v}
Bayes rules having finite Bayes risk are typically
{\displaystyle \sigma _{m}^{2}=E_{\pi }[\sigma _{f}^{2}(\theta )]+E_{\pi }[(\mu _{f}(\theta )-\mu _{m})^{2}],}
{\displaystyle \int {p(\theta )d\theta }=\infty .}
: this defines the risk function as a function of
Statistical decision theory and Bayesian Analysis

v = number of votes/ratings for the movie = (votes)
Following are some examples of conjugate priors.

are the moments of the conditional distribution
This is a definition, and not an application of
{\displaystyle L(\theta ,{\widehat {\theta }})}
. Thus, the minimizing expression is given by
{\displaystyle x_{i}|\theta \sim U(0,\theta )}
is taken over the probability distribution of
, so that the optimal estimator has the form
The generalized Bayes estimator is the value
{\displaystyle \theta \sim Pa(\theta _{0},a)}
{\displaystyle \theta \sim N(\mu ,\tau ^{2})}
Pilz, Jürgen (1991). "Bayesian estimation".
{\displaystyle x_{i}|\theta \sim P(\theta )}
(2nd ed.). New York: Springer-Verlag.
The MLE in this case is x/n and so we get,
that minimizes this expression for a given
7281:Lehmann and Casella (1998), Theorem 5.2.4.
{\displaystyle \sigma _{f}^{2}(\theta )=K}
approaches to empirical Bayes estimation.
one can define the posterior distribution
{\displaystyle \int p(\theta )d\theta =1.}
{\displaystyle \mu _{\pi }=\mu _{m}\,\!,}
{\displaystyle \mu _{f}(\theta )=\theta }
{\displaystyle p(x|\theta )=f(x-\theta )}
Probability Theory: The Logic of Science
Lehmann and Casella (1998), section 6.8
. It follows that the Bayes estimator δ
It is common to use the improper prior
to over- or under-estimation. It yields a
. As the number of ratings surpasses
, then all Bayes rules are admissible.
{\displaystyle \mu _{m}=E_{\pi }\,\!,}
A Bayes estimator derived through the
This is identical to (1), except that
{\displaystyle \mathrm {MSE} =E\left[({\widehat {\theta }}(x)-\theta )^{2}\right],}
Lehmann and Casella, Definition 4.2.9
{\displaystyle W={Rv+Cm \over v+m}\ }
Practical example of Bayes estimators
results in the posterior centered at
A typical example is estimation of a
Bayes estimators for conjugate priors
Lehmann, E. L.; Casella, G. (1998).
The last equation implies that, for
, from which the Bayes estimator of
{\displaystyle f(x_{i}|\theta _{i})}
{\displaystyle \sigma _{f}(\theta )}
{\displaystyle f(x_{i}|\theta _{i})}
Posterior median and other quantiles
Minimum mean square error estimation
{\displaystyle {\widehat {\theta }}}
{\displaystyle {\widehat {\theta }}}
{\displaystyle {\widehat {\theta }}}
{\displaystyle x_{1},x_{2},\ldots }
{\displaystyle x_{1},\ldots ,x_{n}}
{\displaystyle \sigma _{\pi }\,\!.}
{\displaystyle x_{1},\ldots ,x_{n}}
. This is equivalent to minimizing
Lehmann and Casella, Theorem 4.1.1
, one is interested in estimating
{\displaystyle \theta \sim G(a,b)}
{\displaystyle \mu _{f}(\theta )}
{\displaystyle a(x)=a_{0}+x.\,\!}
be the value minimizing (1) when
with a loss function of the type
{\displaystyle \sigma _{m}^{2}}
{\displaystyle \sigma _{m}\,\!}
{\displaystyle \mu _{\pi }\,\!}
(2)
(1)
is a location parameter, i.e.,
A "linear" loss function, with
{\displaystyle x_{1},...,x_{n}}
{\displaystyle x_{1},...,x_{n}}
maximum a posteriori estimation
)-1 bits of new information."
Recursive Bayesian estimation
the posterior is centered at
{\displaystyle \theta _{n+1}}
{\displaystyle \theta _{n+1}}
{\displaystyle a-x_{1}=a_{0}}
, such as squared error. The
Suppose an unknown parameter
Generalized expected utility
maximum likelihood estimator
{\displaystyle \mu _{m}\,\!}
First, we estimate the mean
is normal with unknown mean
{\displaystyle p(\theta )=1}
{\displaystyle L(a-\theta )}
{\displaystyle p(\theta )=1}
Generalized Bayes estimators
(based on some measurements
Encyclopedia of Mathematics
Berger (1980), section 4.5.
{\displaystyle \theta _{0}}
{\displaystyle \delta _{n}}
{\displaystyle \theta _{i}}
generalized Bayes estimator
, and the prior is normal,
generalized Bayes estimator
(2nd ed.). Springer.
Theory of Point Estimation
measurements with average
in a binomial distribution
Empirical Bayes estimators
{\displaystyle p(\theta )}
Alternative risk functions
converges in distribution
empirical Bayes estimator
{\displaystyle x|\theta }
minimum mean square error
Minimum mean square error
weighted arithmetic mean
On the other hand, when
asymptotically efficient
Admissible decision rule
{\displaystyle \mu _{m}}
law of total expectation
{\displaystyle a,b>0}
. The MSE is defined by
Internet Movie Database
asymptotically unbiased
{\displaystyle x_{n+1}}
's have a common prior
{\displaystyle x_{n+1}}
{\displaystyle a-x_{1}}
{\displaystyle x+a_{0}}
{\displaystyle \theta }
The prior distribution
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
posterior expected loss
in the following way.
empirical Bayes method
Empirical Bayes method
, and if the prior is
, and if the prior is
posterior distribution
Jaynes, E.T. (2007).
samples with density
Asymptotic efficiency
law of total variance
has been replaced by

{\displaystyle x_{1}}
{\displaystyle a_{0}}
{\displaystyle a_{0}}
non-informative prior
uniformly distributed
This is known as the
7391:"Bayesian estimator"
7131:
7106:
7081:
7056:
7031:
6967:
6958:Top Rated 250 Titles
6882:
6804:
6618:
6515:
6480:Example: estimating
4685:{\displaystyle \pi }
4676:
4644:
4615:
4604:{\displaystyle \pi }
4595:
4584:{\displaystyle \pi }
3926:, for some constant
3897:
3870:
3808:
3785:
3774:{\displaystyle a(x)}
468:{\displaystyle \pi }
7170:with weight vector
7143:{\displaystyle C\ }
7118:{\displaystyle m\ }
7093:{\displaystyle v\ }
7068:{\displaystyle R\ }
7043:{\displaystyle W\ }
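The symbols above (W, R, v, m, C) are the ingredients of the weighted-rating formula described in this example: the fewer votes a film has, the more its rating is pulled toward the mean rating C across all films. A minimal sketch, assuming the standard IMDb form W = (v·R + m·C)/(v + m); the function name is illustrative:

```python
def weighted_rating(R, v, m, C):
    """Bayesian-style weighted rating: with few votes (v << m) the result
    is pulled toward the global mean C; with many votes it approaches R."""
    return (v / (v + m)) * R + (m / (v + m)) * C

# A film rated 9.0 on only 10 votes is pulled close to the global mean C = 7.0,
# while the same rating on 100,000 votes is left almost unchanged.
few = weighted_rating(R=9.0, v=10, m=1000, C=7.0)
many = weighted_rating(R=9.0, v=100000, m=1000, C=7.0)
```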
6329:normal distribution
6069:; this is known as
6001:can be calculated.
4033:, we must minimize
3999:{\displaystyle x=0}
3953:. To see this, let
919:(MSE), also called
528:be an estimator of
452:is known to have a
418:Bayesian statistics
397:that minimizes the
with deviation σ,
6733:
6598:
6453:Fisher information
6430:
6301:
6267:
6240:
6165:
6117:
6071:Stein's phenomenon
6059:
6031:If θ belongs to a
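The Stein's phenomenon referenced above is usually demonstrated with the James–Stein estimator: for the mean of a p-variate normal with p ≥ 3, shrinking the raw observation toward the origin dominates the observation itself in mean square error. A minimal sketch (positive-part refinement omitted; names illustrative):

```python
def james_stein(x, sigma2=1.0):
    """James-Stein shrinkage for the mean of a p-variate normal (p >= 3):
    shrinks the observation x toward the origin by a data-dependent factor."""
    p = len(x)
    if p < 3:
        raise ValueError("James-Stein shrinkage requires dimension p >= 3")
    norm2 = sum(xi * xi for xi in x)
    factor = 1.0 - (p - 2) * sigma2 / norm2
    return [factor * xi for xi in x]

# In 4 dimensions with ||x||^2 = 16 the shrinkage factor is 1 - 2/16 = 0.875
est = james_stein([2.0, 2.0, 2.0, 2.0])
```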
4801:maximum likelihood
4789:
4743:
4714:
4682:
4662:
4630:
4601:
4581:
4561:
4544:. Assume that the
3278:location parameter
3250:
3174:
3039:
3004:
2952:
2906:
2859:
2829:mean squared error
1841:Pareto distributed
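The Pareto prior mentioned here is conjugate for a uniform likelihood U(0, θ): combining a Pareto(θ₀, a) prior with one observation x yields a Pareto(max(θ₀, x), a + 1) posterior, since the posterior density is proportional to θ^−(a+2) on θ ≥ max(θ₀, x). A sketch of that update (function name illustrative):

```python
def pareto_uniform_update(theta0, a, x):
    """Conjugate update: theta ~ Pareto(theta0, a) prior, x | theta ~ U(0, theta).
    The posterior is Pareto again, with scale max(theta0, x) and shape a + 1."""
    return max(theta0, x), a + 1

# An observation above the prior scale raises the scale; one below leaves it.
scale, shape = pareto_uniform_update(theta0=2.0, a=3, x=5.0)
```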
1829:
1770:
1714:
1630:
1586:
1529:
1473:
1338:
1290:
1230:
1179:(MMSE) estimator.
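The claim that the posterior mean minimizes the mean square error follows from a one-line decomposition of the posterior expected loss:

```latex
\operatorname{E}\!\left[(\theta-a)^{2}\mid x\right]
  = \operatorname{Var}(\theta\mid x)
  + \bigl(\operatorname{E}[\theta\mid x]-a\bigr)^{2},
```

which is minimized by choosing a = E[θ | x], the posterior mean; the first term does not depend on a.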
1162:
1046:
1026:
1003:
921:squared error risk
454:prior distribution
ISBN 978-0-521-59271-0.
7139:
7114:
7089:
7064:
7050:= weighted rating
7039:
7010:
7006:
6925:
6901:
6847:
6823:
6712:
6673:
6593:
6498:Beta distribution
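For estimating a binomial proportion p, the Beta distribution is the conjugate prior: a Beta(a, b) prior combined with x successes in n trials gives a Beta(a + x, b + n − x) posterior, whose mean (a + x)/(a + b + n) is the Bayes estimate under squared error loss. A minimal sketch of that update:

```python
from fractions import Fraction

def beta_binomial_update(a, b, successes, trials):
    """Conjugate update: Beta(a, b) prior + binomial data -> Beta posterior.
    Returns the posterior parameters and the posterior-mean estimate of p."""
    a_post = a + successes
    b_post = b + (trials - successes)
    p_hat = Fraction(a_post, a_post + b_post)
    return a_post, b_post, p_hat

# Uniform prior Beta(1, 1); 7 successes in 10 trials -> Beta(8, 4) posterior
a_post, b_post, p_hat = beta_binomial_update(1, 1, 7, 10)
```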
6420:
6346:
5908:
5886:
5738:
5711:
5671:
5649:
4984:Next, we use the
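In the parametric empirical Bayes setting the prior's moments can be estimated from the marginal sample itself, using the law of total variance to separate observation noise from prior spread: E[x] = μ and Var(x) = σ² + τ². A hedged sketch for the normal case (names illustrative, not the article's notation):

```python
from statistics import mean, variance

def empirical_bayes_normal(xs, sigma2):
    """Parametric empirical Bayes for normal means: theta_i ~ N(mu, tau2),
    x_i | theta_i ~ N(theta_i, sigma2). Prior moments are estimated from the
    marginal sample, then each x_i is shrunk toward the estimated prior mean."""
    mu_hat = mean(xs)
    tau2_hat = max(variance(xs) - sigma2, 0.0)  # law of total variance, truncated at 0
    b = tau2_hat / (tau2_hat + sigma2)          # shrinkage factor in [0, 1]
    return [mu_hat + b * (x - mu_hat) for x in xs]

# Two observations with noise variance 25: half the marginal spread is noise,
# so each point is shrunk halfway toward the sample mean.
est = empirical_bayes_normal([0.0, 10.0], 25.0)
```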
4948:
4916:
4891:
4843:
4823:
4198:{\displaystyle a}
3794:{\displaystyle x}
3689:
3555:
3514:
3169:
2862:{\displaystyle p}
2833:robust statistics
1598:Gamma distributed
1546:random variables
1465:
1420:
1368:
1199:parametric family
1087:
1049:{\displaystyle x}
967:
917:mean square error
884:{\displaystyle x}
854:{\displaystyle x}
813:
762:
733:
674:
620:
580:
506:
491:
375:estimation theory
Berger, James O.
6870:this prior with
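The weighting in this example (a prior with deviation σ/2 carries the weight of four measurements) is ordinary precision weighting for a normal prior combined with normal measurement noise: the posterior mean is a precision-weighted average of the prior mean and the sample mean. A minimal sketch (names illustrative):

```python
def normal_posterior_mean(prior_mean, prior_sd, measurements, meas_sd):
    """Posterior mean for a normal mean with a normal prior: a
    precision-weighted average of the prior mean and the data mean."""
    prior_prec = 1.0 / prior_sd**2
    data_prec = len(measurements) / meas_sd**2
    data_mean = sum(measurements) / len(measurements)
    return (prior_prec * prior_mean + data_prec * data_mean) / (prior_prec + data_prec)

# prior_sd = meas_sd / 2, so the prior counts like four measurements: a single
# reading of 10.0 against a prior at 0.0 moves the estimate only to 2.0,
# and four readings of 10.0 split the difference at 5.0.
one = normal_posterior_mean(0.0, 0.5, [10.0], 1.0)
four = normal_posterior_mean(0.0, 0.5, [10.0] * 4, 1.0)
```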
5774:For example, if
865:If the prior is
775:is said to be a
5482:; we then have
There are both parametric and non-parametric approaches to empirical Bayes estimation.
3051:improper priors
1195:conjugate prior
1191:
1189:Conjugate prior
777:Bayes estimator
754:
753:
751:
748:
747:
746:. An estimator
383:Bayes estimator
379:decision theory
4371:non-parametric
3189:Bayes' theorem
Posterior mode
Posterior mean
633:is defined as
402:expected value
7377:
7375:0-471-91732-X
7371:
7367:
7362:
7358:
7356:0-387-98502-6
7352:
7348:
7343:
7339:
7335:
7331:
7329:0-387-96098-8
7325:
7321:
7317:
7313:
7312:
7301:
7296:
7287:
7278:
7269:
7261:
7255:
7251:
7244:
7235:
7233:
7223:
7219:
7209:
7206:
7204:
7201:
7200:
7194:
7191:
7189:
7185:
7181:
7177:
7173:
7169:
7165:
7161:
7157:
7134:
7127:
7109:
7102:
7084:
7077:
7059:
7052:
7034:
7027:
7026:
7025:
7002:
6999:
6996:
6991:
6988:
6985:
6982:
6979:
6973:
6970:
6963:
6962:
6961:
6959:
6955:
6945:
6941:
6927:
6921:
6918:
6915:
6911:
6906:
6903:
6897:
6894:
6891:
6887:
6877:
6873:
6867:
6865:
6849:
6843:
6840:
6837:
6833:
6828:
6825:
6819:
6816:
6813:
6809:
6799:
6795:
6791:
6786:
6784:
6780:
6776:
6772:
6768:
6764:
6760:
6756:
6751:
6749:
6730:
6725:
6722:
6719:
6715:
6708:
6705:
6702:
6699:
6696:
6692:
6687:
6681:
6675:
6669:
6666:
6663:
6660:
6657:
6652:
6649:
6646:
6640:
6634:
6626:
6622:
6614:
6613:
6612:
6595:
6589:
6586:
6583:
6580:
6577:
6572:
6569:
6566:
6560:
6554:
6546:
6540:
6537:
6531:
6523:
6519:
6511:
6510:
6509:
6507:
6503:
6499:
6495:
6491:
6483:
6477:
6475:
6470:
6468:
6465:under MSE is
6463:
6454:
6446:
6427:
6423:
6411:
6407:
6400:
6396:
6391:
6388:
6384:
6380:
6369:
6365:
6361:
6356:
6352:
6343:
6334:
6333:
6332:
6330:
6326:
6322:
6318:
6314:
6296:
6292:
6282:
6280:
6262:
6258:
6232:
6228:
6224:
6221:
6218:
6213:
6209:
6200:
6196:
6192:
6187:
6183:
6159:
6149:
6145:
6138:
6130:
6114:
6111:
6106:
6102:
6098:
6093:
6089:
6074:
6072:
6056:
6053:
6050:
6037:
6034:
6030:
6027:
6026:
6025:
6023:
6017:
6010:Admissibility
6002:
5986:
5983:
5980:
5976:
5953:
5950:
5947:
5943:
5917:
5912:
5905:
5902:
5895:
5890:
5883:
5880:
5870:
5867:
5862:
5859:
5856:
5852:
5828:
5825:
5820:
5816:
5809:
5806:
5801:
5797:
5786:
5782:
5758:
5755:
5752:
5747:
5742:
5735:
5732:
5725:
5720:
5715:
5708:
5705:
5694:
5680:
5675:
5668:
5665:
5658:
5653:
5646:
5643:
5632:
5631:
5630:
5613:
5610:
5607:
5602:
5597:
5593:
5589:
5584:
5579:
5575:
5571:
5566:
5561:
5557:
5553:
5548:
5543:
5539:
5531:
5517:
5510:
5506:
5502:
5497:
5493:
5485:
5484:
5483:
5469:
5466:
5460:
5452:
5447:
5443:
5422:
5419:
5413:
5405:
5401:
5375:
5371:
5360:
5356:
5349:
5326:
5318:
5314:
5290:
5282:
5278:
5254:
5246:
5236:
5232:
5228:
5222:
5214:
5210:
5198:
5194:
5190:
5181:
5173:
5168:
5164:
5155:
5151:
5147:
5142:
5137:
5133:
5125:
5111:
5100:
5092:
5088:
5079:
5075:
5071:
5066:
5062:
5054:
5053:
5052:
5036:
5031:
5027:
5018:
5000:
4996:
4987:
4968:
4962:
4952:
4945:
4942:
4935:
4930:
4926:
4918:
4913:
4910:
4905:
4900:
4895:
4888:
4885:
4874:
4860:
4854:
4850:
4845:
4840:
4837:
4832:
4827:
4820:
4817:
4806:
4805:
4804:
4802:
4784:
4780:
4776:
4773:
4770:
4765:
4761:
4736:
4732:
4724:and variance
4707:
4703:
4693:
4679:
4659:
4652:
4648:
4640:and variance
4623:
4619:
4598:
4578:
4556:
4552:
4529:
4526:
4523:
4519:
4496:
4493:
4490:
4486:
4460:
4456:
4445:
4441:
4434:
4412:
4408:
4404:
4401:
4398:
4393:
4389:
4374:
4372:
4368:
4363:
4361:
4358:is called an
4357:
4351:
4325:
4322:
4319:
4314:
4310:
4306:
4300:
4294:
4287:
4286:
4285:
4269:
4265:
4261:
4256:
4252:
4248:
4245:
4223:
4219:
4215:
4212:
4192:
4168:
4164:
4161:
4157:
4150:
4147:
4143:
4137:
4130:
4127:
4123:
4118:
4114:
4110:
4107:
4101:
4098:
4095:
4092:
4089:
4083:
4080:
4075:
4071:
4064:
4058:
4055:
4052:
4046:
4043:
4036:
4035:
4034:
4018:
4014:
3993:
3990:
3987:
3965:
3961:
3938:
3934:
3911:
3907:
3903:
3900:
3876:
3873:
3853:
3850:
3844:
3841:
3838:
3832:
3826:
3823:
3820:
3814:
3811:
3804:
3803:
3802:
3788:
3765:
3759:
3736:
3733:
3730:
3724:
3721:
3718:
3712:
3706:
3703:
3700:
3694:
3691:
3682:
3676:
3672:
3667:
3663:
3660:
3654:
3646:
3640:
3634:
3631:
3628:
3622:
3618:
3615:
3609:
3598:
3595:
3592:
3586:
3580:
3573:
3572:
3571:
3548:
3542:
3534:
3531:
3528:
3522:
3516:
3507:
3501:
3493:
3487:
3481:
3473:
3467:
3461:
3455:
3447:
3441:
3434:
3433:
3432:
3418:
3415:
3409:
3403:
3394:
3377:
3374:
3371:
3365:
3362:
3356:
3348:
3342:
3322:
3299:
3296:
3293:
3287:
3279:
3269:
3267:
3246:
3243:
3237:
3229:
3223:
3217:
3214:
3211:
3205:
3201:
3194:
3193:
3192:
3190:
3171:
3165:
3162:
3156:
3150:
3144:
3136:
3130:
3127:
3119:
3113:
3107:
3099:
3093:
3087:
3081:
3073:
3067:
3060:
3059:
3058:
3054:
3052:
3033:
3027:
3020:
3001:
2995:
2991:
2988:
2982:
2976:
2972:
2965:
2964:
2963:
2949:
2946:
2940:
2934:
2926:
2922:
2903:
2900:
2897:
2894:
2888:
2882:
2879:
2872:
2871:
2870:
2856:
2846:
2836:
2834:
2830:
2804:
2801:
2798:
2787:
2784:
2778:
2775:
2758:
2755:
2748:
2745:
2734:
2731:
2725:
2722:
2705:
2702:
2696:
2691:
2682:
2679:
2673:
2670:
2664:
2657:
2656:
2640:
2637:
2634:
2614:
2611:
2608:
2600:
2596:
2595:
2575:
2569:
2566:
2563:
2559:
2554:
2548:
2537:
2528:
2525:
2516:
2509:
2488:
2485:
2479:
2476:
2470:
2467:
2455:
2444:
2441:
2435:
2432:
2424:
2417:
2414:
2408:
2405:
2399:
2396:
2384:
2373:
2370:
2364:
2361:
2353:
2347:
2342:
2333:
2330:
2324:
2321:
2315:
2308:
2307:
2303:
2287:
2284:
2281:
2278:
2275:
2267:
2266:
2251:
2245:
2242:
2236:
2230:
2219:
2210:
2207:
2198:
2191:
2169:
2166:
2160:
2157:
2149:
2146:
2137:
2134:
2128:
2125:
2119:
2112:
2111:
2095:
2092:
2089:
2081:
2080:
2078:
2068:
2054:
2025:
2019:
2016:
2013:
2010:
2007:
1996:
1992:
1988:
1985:
1982:
1979:
1976:
1971:
1967:
1963:
1958:
1954:
1940:
1937:
1934:
1925:
1919:
1910:
1907:
1897:
1896:
1877:
1874:
1869:
1865:
1858:
1855:
1852:
1849:
1842:
1823:
1820:
1817:
1811:
1808:
1805:
1795:
1791:
1783:
1765:
1761:
1757:
1754:
1751:
1748:
1745:
1740:
1736:
1727:
1726:
1711:
1705:
1702:
1699:
1694:
1691:
1683:
1678:
1672:
1666:
1657:
1654:
1644:
1643:
1624:
1621:
1618:
1612:
1609:
1606:
1599:
1580:
1574:
1571:
1568:
1558:
1554:
1545:
1542:
1524:
1520:
1516:
1513:
1510:
1507:
1504:
1499:
1495:
1486:
1485:
1470:
1467:
1459:
1455:
1451:
1446:
1442:
1435:
1431:
1425:
1422:
1414:
1410:
1406:
1401:
1397:
1390:
1386:
1380:
1374:
1365:
1362:
1352:
1351:
1330:
1326:
1322:
1319:
1313:
1310:
1307:
1282:
1278:
1274:
1271:
1265:
1262:
1259:
1251:
1243:
1227:
1219:
1211:
1210:
1209:
1206:
1202:
1200:
1196:
1190:
1180:
1178:
1159:
1156:
1153:
1146:
1138:
1132:
1128:
1125:
1122:
1116:
1108:
1102:
1099:
1093:
1084:
1081:
1071:
1070:
1069:
1067:
1057:
1043:
1023:
1000:
996:
990:
982:
979:
973:
964:
961:
951:
947:
944:
926:
925:
924:
922:
918:
912:
897:
895:
891:
878:
868:
863:
861:
848:
824:
810:
807:
801:
798:
792:
786:
778:
759:
756:
730:
727:
704:
696:
671:
668:
662:
659:
653:
645:
641:
617:
614:
603:
599:
598:loss function
577:
574:
568:
565:
559:
551:
535:
512:
503:
500:
494:
488:
485:
462:
455:
439:
425:
423:
419:
415:
411:
407:
406:loss function
403:
400:
396:
395:decision rule
392:
388:
384:
380:
376:
364:
359:
357:
352:
350:
345:
344:
342:
341:
336:
331:
326:
325:
324:
323:
318:
315:
313:
310:
308:
305:
304:
303:
302:
298:
297:
292:
289:
287:
284:
283:
282:
281:
277:
276:
271:
268:
266:
263:
261:
258:
257:
256:
255:
251:
250:
245:
242:
240:
237:
235:
232:
230:
227:
225:
222:
221:
220:
219:
215:
214:
209:
206:
204:
201:
199:
196:
194:
191:
190:
189:
188:
184:
183:
178:
175:
173:
170:
168:
165:
163:
160:
158:
157:Cox's theorem
155:
153:
150:
148:
145:
143:
140:
138:
135:
133:
130:
129:
128:
127:
123:
122:
119:
115:
111:
107:
104:
103:
99:
95:
94:
91:
88:
87:
83:
82:
73:
70:
62:
59:November 2009
52:
48:
42:
41:
35:
30:
21:
20:
9720:
9708:
9689:
9682:
9594:Econometrics
9544: /
9527:Chemometrics
9504:Epidemiology
9497: /
9470:Applications
9312:ARIMA model
9259:Q-statistic
9208:Stationarity
9104:Multivariate
9047: /
9043: /
9041:Multivariate
9039: /
8979: /
8975: /
8753:
8749:Bayes factor
8648:Signed rank
8560:
8534:
8526:
8514:
8209:Completeness
8045:Cohort study
7943:Opinion poll
7878:Missing data
7865:Study design
7820:Scatter plot
7742:Scatter plot
7735:Spearman's Ď
7697:Grouped data
7394:
7365:
7346:
7319:
7300:IMDb Top 250
7295:
7286:
7277:
7268:
7249:
7243:
7222:
7192:
7187:
7183:
7179:
7175:
7171:
7167:
7163:
7158:is just the
7155:
7153:
7023:
6951:
6942:
6875:
6871:
6868:
6863:
6793:
6789:
6787:
6782:
6778:
6774:
6770:
6766:
6762:
6758:
6754:
6752:
6747:
6745:
6610:
6505:
6501:
6493:
6489:
6487:
6481:
6471:
6461:
6444:
6442:
6316:
6312:
6283:
6278:
6080:
6041:
6033:discrete set
6019:
5773:
5628:
5269:
4983:
4694:
4380:
4364:
4359:
4353:
4184:
3892:
3866:for a given
3751:
3569:
3395:
3275:
3265:
3262:
3186:
3055:
3016:
2920:
2918:
2848:
2826:
2045:
1207:
1203:
1192:
1176:
1174:
1063:
1015:
920:
914:
893:
892:is called a
870:
864:
840:
776:
693:, where the
601:
549:
431:
409:
387:Bayes action
386:
382:
372:
307:Bayes factor
65:
56:
37:
9722:WikiProject
9637:Cartography
9599:Jurimetrics
9551:Reliability
9282:Time domain
9261:(LjungâBox)
9183:Time-series
9061:Categorical
9045:Time-series
9037:Categorical
8972:(Bernoulli)
8807:Correlation
8787:Correlation
8583:JarqueâBera
8555:Chi-squared
8317:M-estimator
8270:Asymptotics
8214:Sufficiency
7981:Interaction
7893:Replication
7873:Effect size
7830:Violin plot
7810:Radar chart
7790:Forest plot
7780:Correlogram
7730:Kendall's Ď
5019:to compute
4988:to compute
695:expectation
552:), and let
408:(i.e., the
51:introducing
9738:Categories
9589:Demography
9307:ARMA model
9112:Regression
8689:(Friedman)
8650:(Wilcoxon)
8588:Normality
8578:Lilliefors
8525:Student's
8401:Resampling
8275:Robustness
8263:divergence
8253:Efficiency
8191:(monotone)
8186:Likelihood
8103:Population
7936:Stratified
7888:Population
7707:Dependence
7663:Count data
7594:Percentile
7571:Dispersion
7504:Arithmetic
7439:Statistics
7309:References
7154:Note that
6277:for large
6022:admissible
6014:See also:
6005:Properties
5051:such that
4803:approach:
4799:using the
4367:parametric
2843:See also:
602:Bayes risk
428:Definition
252:Estimators
124:Background
110:Likelihood
34:references
9744:Estimator
8970:Logistic
8737:posterior
8663:Rank sum
8411:Jackknife
8406:Bootstrap
8224:Bootstrap
8159:Parameter
8108:Statistic
7903:Statistic
7815:Run chart
7800:Pie chart
7795:Histogram
7785:Fan chart
7760:Bar chart
7642:L-moments
7529:Geometric
7401:EMS Press
6844:β
6838:α
6834:β
6820:β
6814:α
6810:α
6716:δ
6682:θ
6623:δ
6547:θ
6520:δ
6451:) is the
6408:θ
6378:→
6366:θ
6362:−
6353:δ
6293:θ
6259:δ
6222:…
6197:δ
6184:δ
6160:θ
6115:…
5968:based on
5944:θ
5913:π
5906:^
5903:σ
5891:π
5884:^
5881:μ
5868:∼
5853:θ
5817:θ
5807:∼
5798:θ
5753:−
5736:^
5733:σ
5716:π
5709:^
5706:σ
5669:^
5666:μ
5654:π
5647:^
5644:μ
5608:−
5594:σ
5576:σ
5572:−
5558:σ
5544:π
5540:σ
5507:μ
5498:π
5494:μ
5461:θ
5444:σ
5435:and that
5423:θ
5414:θ
5402:μ
5372:θ
5327:θ
5315:σ
5291:θ
5279:μ
5233:μ
5229:−
5223:θ
5211:μ
5199:π
5182:θ
5165:σ
5156:π
5134:σ
5101:θ
5089:μ
5080:π
5063:μ
5028:σ
4997:μ
4946:^
4943:μ
4936:−
4919:∑
4889:^
4886:σ
4846:∑
4821:^
4818:μ
4774:…
4733:σ
4704:μ
4680:π
4653:π
4649:σ
4624:π
4620:μ
4599:π
4579:π
4553:θ
4511:based on
4487:θ
4457:θ
4402:…
4249:−
4216:−
4162:θ
4148:θ
4144:−
4128:θ
4124:−
4111:−
4099:∫
4093:θ
4084:θ
4081:−
4059:θ
4056:−
4044:∫
3854:θ
3845:θ
3842:−
3827:θ
3824:−
3812:∫
3734:θ
3725:θ
3722:−
3707:θ
3704:−
3692:∫
3664:θ
3647:θ
3635:θ
3632:−
3619:∫
3599:θ
3596:−
3535:θ
3532:−
3494:θ
3482:θ
3448:θ
3410:θ
3378:θ
3375:−
3357:θ
3323:θ
3300:θ
3297:−
3247:θ
3230:θ
3212:θ
3202:∫
3166:θ
3157:θ
3145:θ
3128:∫
3120:θ
3108:θ
3074:θ
3034:θ
2999:∞
2992:θ
2983:θ
2973:∫
2941:θ
2898:θ
2889:θ
2880:∫
2799:≥
2788:^
2785:θ
2779:−
2776:θ
2766:for
2735:^
2732:θ
2726:−
2723:θ
2713:for
2683:^
2680:θ
2671:θ
2529:^
2526:θ
2480:^
2477:θ
2471:−
2468:θ
2463:for
2445:^
2442:θ
2436:−
2433:θ
2415:≥
2409:^
2406:θ
2400:−
2397:θ
2392:for
2374:^
2371:θ
2365:−
2362:θ
2334:^
2331:θ
2322:θ
2211:^
2208:θ
2170:^
2167:θ
2161:−
2158:θ
2138:^
2135:θ
2126:θ
2017:−
1955:θ
1911:^
1908:θ
1866:θ
1853:∼
1850:θ
1824:θ
1809:∼
1806:θ
1687:¯
1658:^
1655:θ
1610:∼
1607:θ
1581:θ
1572:∼
1569:θ
1456:τ
1443:σ
1432:τ
1423:μ
1411:τ
1398:σ
1387:σ
1366:^
1363:θ
1327:τ
1320:μ
1311:∼
1308:θ
1279:σ
1272:θ
1263:∼
1260:θ
1228:θ
1157:θ
1139:θ
1129:θ
1126:∫
1109:θ
1085:^
1082:θ
1024:θ
983:θ
980:−
965:^
962:θ
871:for each
841:for each
811:^
808:θ
799:θ
760:^
757:θ
731:^
728:θ
705:θ
672:^
669:θ
660:θ
646:π
618:^
615:θ
578:^
575:θ
566:θ
536:θ
504:^
501:θ
489:^
486:θ
463:π
440:θ
399:posterior
391:estimator
152:Coherence
106:Posterior
9684:Category
9377:Survival
9254:Johansen
8977:Binomial
8932:Isotonic
8519:(normal)
8164:location
7971:Blocking
7926:Sampling
7805:QâQ plot
7770:Box plot
7752:Graphics
7647:Skewness
7637:Kurtosis
7609:Variance
7539:Heronian
7534:Harmonic
7318:(1985).
7197:See also
5015:and the
4165:′
4151:′
4131:′
3019:measures
2302:quantile
1780:are iid
900:Examples
867:improper
118:Evidence
9710:Commons
9657:Kriging
9542:Process
9499:studies
9358:Wavelet
9191:General
8358:Plug-in
8152:L space
7931:Cluster
7632:Moments
7450:Outline
7403:, 2001
7338:0804611
7024:where:
6864:exactly
6327:to the
6323:and it
4377:Example
3315:. Here
3272:Example
1544:Poisson
414:utility
47:improve
9579:Census
9169:Normal
9117:Manova
8937:Robust
8687:2-way
8679:1-way
8517:-test
8188:
7765:Biplot
7556:Median
7549:Lehmer
7491:Center
7372:
7353:
7336:
7326:
7256:
7186:is to
7172:(v, m)
7138:
7113:
7088:
7063:
7038:
7009:
6443:where
6175:. Let
5270:where
1242:Normal
475:. Let
389:is an
36:, but
9203:Trend
8732:prior
8674:anova
8563:-test
8537:-test
8529:-test
8436:Power
8381:Pivot
8174:shape
8169:scale
7619:Shape
7599:Range
7544:Heinz
7519:Cubic
7455:Index
7214:Notes
6492:~b(θ,
3017:Such
596:be a
404:of a
385:or a
114:Prior
9436:Test
8636:Sign
8488:Wald
7561:Mode
7499:Mean
7370:ISBN
7351:ISBN
7324:ISBN
7254:ISBN
7166:and
6952:The
6798:then
6455:of θ
6127:are
6054:>
5306:and
4369:and
2746:<
2638:>
2612:>
2486:<
2285:>
2093:>
1539:are
1036:and
381:, a
377:and
8616:BIC
8611:AIC
7162:of
6767:a+b
6129:iid
1947:max
1728:If
1541:iid
1487:If
1240:is
1212:If
604:of
420:is
393:or
373:In
9740::
7399:,
7393:,
7334:MR
7332:.
7231:^
6500:B(
6469:.
6447:(θ
6331::
6281:.
6073:.
3393:.
3268:.
3053:.
2904:1.
2835:.
2653:):
2067:.
1244:,
1068:,
1056:.
896:.
424:.
116:á
112:Ă
108:=
8561:G
8535:F
8527:t
8515:Z
8234:V
8229:U
7431:e
7424:t
7417:v
7378:.
7359:.
7340:.
7262:.
7188:C
7184:W
7180:v
7176:m
7168:C
7164:R
7156:W
7135:C
7110:m
7085:v
7060:R
7035:W
7003:m
7000:+
6997:v
6992:m
6989:C
6986:+
6983:v
6980:R
6974:=
6971:W
6928:v
6922:n
6919:+
6916:4
6912:n
6907:+
6904:V
6898:n
6895:+
6892:4
6888:4
6876:v
6872:n
6850:b
6841:+
6829:+
6826:B
6817:+
6794:b
6790:B
6783:d
6779:d
6775:b
6773:,
6771:a
6763:b
6761:=
6759:a
6755:n
6748:n
6731:.
6726:E
6723:L
6720:M
6709:n
6706:+
6703:b
6700:+
6697:a
6693:n
6688:+
6685:]
6679:[
6676:E
6670:n
6667:+
6664:b
6661:+
6658:a
6653:b
6650:+
6647:a
6641:=
6638:)
6635:x
6632:(
6627:n
6596:.
6590:n
6587:+
6584:b
6581:+
6578:a
6573:x
6570:+
6567:a
6561:=
6558:]
6555:x
6551:|
6544:[
6541:E
6538:=
6535:)
6532:x
6529:(
6524:n
6506:b
6504:,
6502:a
6494:n
6490:x
6482:p
6462:n
6457:0
6449:0
6445:I
6428:,
6424:)
6417:)
6412:0
6404:(
6401:I
6397:1
6392:,
6389:0
6385:(
6381:N
6375:)
6370:0
6357:n
6349:(
6344:n
6317:n
6313:n
6297:0
6279:n
6263:n
6238:)
6233:n
6229:x
6225:,
6219:,
6214:1
6210:x
6206:(
6201:n
6193:=
6188:n
6163:)
6156:|
6150:i
6146:x
6142:(
6139:f
6112:,
6107:2
6103:x
6099:,
6094:1
6090:x
6057:2
6051:p
5987:1
5984:+
5981:n
5977:x
5954:1
5951:+
5948:n
5923:)
5918:2
5896:,
5874:(
5871:N
5863:1
5860:+
5857:n
5832:)
5829:1
5826:,
5821:i
5813:(
5810:N
5802:i
5793:|
5787:i
5783:x
5759:.
5756:K
5748:2
5743:m
5726:=
5721:2
5681:,
5676:m
5659:=
5614:.
5611:K
5603:2
5598:m
5590:=
5585:2
5580:f
5567:2
5562:m
5554:=
5549:2
5518:,
5511:m
5503:=
5470:K
5467:=
5464:)
5458:(
5453:2
5448:f
5420:=
5417:)
5411:(
5406:f
5381:)
5376:i
5367:|
5361:i
5357:x
5353:(
5350:f
5330:)
5324:(
5319:f
5294:)
5288:(
5283:f
5255:,
5252:]
5247:2
5243:)
5237:m
5226:)
5220:(
5215:f
5207:(
5204:[
5195:E
5191:+
5188:]
5185:)
5179:(
5174:2
5169:f
5161:[
5152:E
5148:=
5143:2
5138:m
5112:,
5107:]
5104:)
5098:(
5093:f
5085:[
5076:E
5072:=
5067:m
5037:2
5032:m
5001:m
4969:.
4963:2
4959:)
4953:m
4931:i
4927:x
4923:(
4914:n
4911:1
4906:=
4901:2
4896:m
4861:,
4855:i
4851:x
4841:n
4838:1
4833:=
4828:m
4785:n
4781:x
4777:,
4771:,
4766:1
4762:x
4737:m
4708:m
4660:.
4557:i
4530:1
4527:+
4524:n
4520:x
4497:1
4494:+
4491:n
4466:)
4461:i
4452:|
4446:i
4442:x
4438:(
4435:f
4413:n
4409:x
4405:,
4399:,
4394:1
4390:x
4326:.
4323:x
4320:+
4315:0
4311:a
4307:=
4304:)
4301:x
4298:(
4295:a
4270:0
4266:a
4262:=
4257:1
4253:x
4246:a
4224:1
4220:x
4213:a
4193:a
4169:.
4158:d
4155:)
4141:(
4138:f
4135:)
4119:1
4115:x
4108:a
4105:(
4102:L
4096:=
4090:d
4087:)
4076:1
4072:x
4068:(
4065:f
4062:)
4053:a
4050:(
4047:L
4019:1
4015:x
3994:0
3991:=
3988:x
3966:0
3962:a
3939:0
3935:a
3912:0
3908:a
3904:+
3901:x
3877:.
3874:x
3851:d
3848:)
3839:x
3836:(
3833:f
3830:)
3821:a
3818:(
3815:L
3789:x
3769:)
3766:x
3763:(
3760:a
3737:.
3731:d
3728:)
3719:x
3716:(
3713:f
3710:)
3701:a
3698:(
3695:L
3686:)
3683:x
3680:(
3677:p
3673:1
3668:=
3661:d
3658:)
3655:x
3651:|
3644:(
3641:p
3638:)
3629:a
3626:(
3623:L
3616:=
3613:]
3610:x
3606:|
3602:)
3593:a
3590:(
3587:L
3584:[
3581:E
3552:)
3549:x
3546:(
3543:p
3538:)
3529:x
3526:(
3523:f
3517:=
3511:)
3508:x
3505:(
3502:p
3497:)
3491:(
3488:p
3485:)
3478:|
3474:x
3471:(
3468:p
3462:=
3459:)
3456:x
3452:|
3445:(
3442:p
3419:1
3416:=
3413:)
3407:(
3404:p
3381:)
3372:x
3369:(
3366:f
3363:=
3360:)
3353:|
3349:x
3346:(
3343:p
3303:)
3294:a
3291:(
3288:L
3244:d
3241:)
3238:x
3234:|
3227:(
3224:p
3221:)
3218:a
3215:,
3209:(
3206:L
3172:.
3163:d
3160:)
3154:(
3151:p
3148:)
3141:|
3137:x
3134:(
3131:p
3123:)
3117:(
3114:p
3111:)
3104:|
3100:x
3097:(
3094:p
3088:=
3085:)
3082:x
3078:|
3071:(
3068:p
3037:)
3031:(
3028:p
3002:.
2996:=
2989:d
2986:)
2980:(
2977:p
2950:1
2947:=
2944:)
2938:(
2935:p
2921:R
2901:=
2895:d
2892:)
2886:(
2883:p
2857:p
2805:.
2802:K
2795:|
2772:|
2759:,
2756:L
2749:K
2742:|
2719:|
2706:,
2703:0
2697:{
2692:=
2689:)
2674:,
2668:(
2665:L
2641:0
2635:L
2615:0
2609:K
2576:.
2570:b
2567:+
2564:a
2560:a
2555:=
2552:)
2549:X
2545:|
2541:)
2538:x
2535:(
2520:(
2517:F
2489:0
2456:,
2452:|
2429:|
2425:b
2418:0
2385:,
2381:|
2358:|
2354:a
2348:{
2343:=
2340:)
2325:,
2319:(
2316:L
2288:0
2282:b
2279:,
2276:a
2252:.
2246:2
2243:1
2237:=
2234:)
2231:X
2227:|
2223:)
2220:x
2217:(
2202:(
2199:F
2177:|
2154:|
2150:a
2147:=
2144:)
2129:,
2123:(
2120:L
2096:0
2090:a
2055:F
2026:.
2020:1
2014:n
2011:+
2008:a
2002:)
1997:n
1993:x
1989:,
1986:.
1983:.
1980:.
1977:,
1972:1
1968:x
1964:,
1959:0
1951:(
1944:)
1941:n
1938:+
1935:a
1932:(
1926:=
1923:)
1920:X
1917:(
1881:)
1878:a
1875:,
1870:0
1862:(
1859:a
1856:P
1827:)
1821:,
1818:0
1815:(
1812:U
1802:|
1796:i
1792:x
1766:n
1762:x
1758:,
1755:.
1752:.
1749:.
1746:,
1741:1
1737:x
1712:.
1706:b
1703:+
1700:n
1695:a
1692:+
1684:X
1679:n
1673:=
1670:)
1667:X
1664:(
1628:)
1625:b
1622:,
1619:a
1616:(
1613:G
1584:)
1578:(
1575:P
1565:|
1559:i
1555:x
1525:n
1521:x
1517:,
1514:.
1511:.
1508:.
1505:,
1500:1
1496:x
1471:.
1468:x
1460:2
1452:+
1447:2
1436:2
1426:+
1415:2
1407:+
1402:2
1391:2
1381:=
1378:)
1375:x
1372:(
1336:)
1331:2
1323:,
1317:(
1314:N
1288:)
1283:2
1275:,
1269:(
1266:N
1256:|
1252:x
1224:|
1220:x
1160:.
1154:d
1150:)
1147:x
1143:|
1136:(
1133:p
1123:=
1120:]
1117:x
1113:|
1106:[
1103:E
1100:=
1097:)
1094:x
1091:(
1044:x
1001:,
997:]
991:2
987:)
977:)
974:x
971:(
956:(
952:[
948:E
945:=
941:E
938:S
935:M
879:x
849:x
828:)
825:x
821:|
817:)
802:,
796:(
793:L
790:(
787:E
681:)
678:)
663:,
657:(
654:L
651:(
642:E
584:)
569:,
563:(
560:L
550:x
516:)
513:x
510:(
495:=
362:e
355:t
348:v
72:)
66:(
61:)
57:(
43:.
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.