
Bayesian inference


{\displaystyle {\begin{aligned}p({\boldsymbol {\theta }}\mid \mathbf {E} ,{\boldsymbol {\alpha }})&={\frac {p(\mathbf {E} \mid {\boldsymbol {\theta }},{\boldsymbol {\alpha }})}{p(\mathbf {E} \mid {\boldsymbol {\alpha }})}}\cdot p({\boldsymbol {\theta }}\mid {\boldsymbol {\alpha }})\\&={\frac {p(\mathbf {E} \mid {\boldsymbol {\theta }},{\boldsymbol {\alpha }})}{\int p(\mathbf {E} \mid {\boldsymbol {\theta }},{\boldsymbol {\alpha }})p({\boldsymbol {\theta }}\mid {\boldsymbol {\alpha }})\,d{\boldsymbol {\theta }}}}\cdot p({\boldsymbol {\theta }}\mid {\boldsymbol {\alpha }}),\end{aligned}}}

"In the first chapters of this work, prior distributions with finite support and the corresponding Bayes procedures were used to establish some of the main theorems relating to the comparison of experiments. Bayes procedures with respect to more general prior distributions have played a very important role in the development of statistics, including its asymptotic theory." "There are many problems where a glance at posterior distributions, for suitable priors, yields immediately interesting information. Also, this technique can hardly be avoided in sequential analysis."

Ian Hacking noted that traditional "Dutch book" arguments did not specify Bayesian updating: they left open the possibility that non-Bayesian updating rules could avoid Dutch books. Hacking wrote: "And neither the Dutch book argument nor any other in the personalist arsenal of proofs of the probability axioms entails the dynamic assumption. Not one entails Bayesianism. So the personalist requires the dynamic assumption to be Bayesian. It is true that in consistency a personalist could abandon the Bayesian model of learning from experience. Salt could lose its savour."

This correctly estimates the variance, due to the facts that (1) the average of normally distributed random variables is also normally distributed, and (2) the predictive distribution of a normally distributed data point with unknown mean and variance, using conjugate or uninformative priors, has a Student's t-distribution. In Bayesian statistics, however, the posterior predictive distribution can always be determined exactly, or at least to an arbitrary level of precision when numerical methods are used.

The jury convicted, but the case went to appeal on the basis that no means of accumulating evidence had been provided for jurors who did not wish to use Bayes' theorem. The Court of Appeal upheld the conviction, but it also gave the opinion that "To introduce Bayes' Theorem, or any similar method, into a criminal trial plunges the jury into inappropriate and unnecessary realms of theory and complexity, deflecting them from their proper task."

{\displaystyle p(\theta \mid \mathbf {X} ,\alpha )={\frac {p(\theta ,\mathbf {X} ,\alpha )}{p(\mathbf {X} ,\alpha )}}={\frac {p(\mathbf {X} \mid \theta ,\alpha )p(\theta ,\alpha )}{p(\mathbf {X} \mid \alpha )p(\alpha )}}={\frac {p(\mathbf {X} \mid \theta ,\alpha )p(\theta \mid \alpha )}{p(\mathbf {X} \mid \alpha )}}\propto p(\mathbf {X} \mid \theta ,\alpha )p(\theta \mid \alpha ).}

If the prior distribution is a conjugate prior, such that the prior and posterior distributions come from the same family, it can be seen that both prior and posterior predictive distributions also come from the same family of compound distributions. The only difference is that the posterior predictive distribution uses the updated values of the hyperparameters (applying the Bayesian update rules given in the
expected that if the site were inhabited during the early medieval period, then 1% of the pottery would be glazed and 50% of its area decorated, whereas if it had been inhabited in the late medieval period then 81% would be glazed and 5% of its area decorated. How confident can the archaeologist be in the date of inhabitation as fragments are unearthed?
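A minimal sketch of this calculation in code is shown below. It grid-approximates the posterior over the century of habitation, using the linear variation of glaze and decoration with time that is assumed later in this example; the grid size, random seed and loop structure are illustrative choices, not part of the original presentation.

```python
import numpy as np

# Grid over the century of habitation c, uniform prior f_C(c) on [11, 16].
c = np.linspace(11, 16, 501)
belief = np.ones_like(c) / len(c)

def p_glazed(cc):
    return 0.01 + (0.81 - 0.01) * (cc - 11) / 5.0   # 1% glazed at c = 11, 81% at c = 16

def p_decorated(cc):
    return 0.50 - (0.50 - 0.05) * (cc - 11) / 5.0   # 50% decorated at c = 11, 5% at c = 16

rng = np.random.default_rng(0)
true_c = 15.2                              # the article's simulation uses c = 15.2 (about the year 1420)
for _ in range(50):                        # 50 unearthed fragments
    glazed = rng.random() < p_glazed(true_c)
    decorated = rng.random() < p_decorated(true_c)
    pg, pd = p_glazed(c), p_decorated(c)
    like = (pg if glazed else 1 - pg) * (pd if decorated else 1 - pd)
    belief = belief * like                 # Bayes' rule: posterior proportional to prior times likelihood
    belief /= belief.sum()                 # renormalize over the grid

print("P(15th century) =", belief[(c >= 15) & (c < 16)].sum())
```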
By calculating the area under the relevant portion of the graph for 50 trials, the archaeologist can say that there is practically no chance the site was inhabited in the 11th and 12th centuries, about 1% chance that it was inhabited during the 13th century, 63% chance during the 14th century and 36% during the 15th century.

Bayes' theorem is applied successively to all evidence presented, with the posterior from one stage becoming the prior for the next. The benefit of a Bayesian approach is that it gives the juror an unbiased, rational mechanism for combining evidence. It may be appropriate to explain Bayes' theorem to jurors in

"Under some conditions, all admissible procedures are either Bayes procedures or limits of Bayes procedures (in various senses). These remarkable results, at least in their original form, are due essentially to Wald. They are useful because the property of being Bayes is easier to analyze than admissibility."
While conceptually simple, Bayesian methods can be mathematically and numerically challenging. Probabilistic programming languages (PPLs) implement functions to easily build Bayesian models together with efficient automatic inference methods. This helps separate the model building from the inference,
Suppose there are two full bowls of cookies. Bowl #1 has 10 chocolate chip and 30 plain cookies, while bowl #2 has 20 of each. Our friend Fred picks a bowl at random, and then picks a cookie at random. We may assume there is no reason to believe Fred treats one bowl differently from another, likewise
Gardner-Medwin argues that if the posterior probability of guilt is to be computed by Bayes' theorem, the prior probability of guilt must be known. This will depend on the incidence of the crime, which is an unusual piece of evidence to consider in a criminal trial. Consider the following three propositions:
A geometric visualisation of Bayes' theorem. In the table, the values 2, 3, 6 and 9 give the relative weights of each corresponding condition and case. The figures denote the cells of the table involved in each metric, the probability being the fraction of each figure that is shaded. This shows that
who established in two seminal research papers (1963 and 1965) when and under what circumstances the asymptotic behaviour of the posterior is guaranteed. His 1963 paper treats, like Doob (1949), the finite case and comes to a satisfactory conclusion. However, if the random variable has an infinite but
in his famous book from 1933. Kolmogorov underlines the importance of conditional probability by writing "I wish to call attention to ... and especially the theory of conditional probabilities and conditional expectations ..." in the Preface. Bayes' theorem determines the posterior distribution
By parameterizing the space of models, the belief in all models may be updated in a single step. The distribution of belief over the model space may then be thought of as a distribution of belief over the parameter space. The distributions in this section are expressed as continuous, represented by
methods, which removed many of the computational problems, and an increasing interest in nonstandard, complex applications. Despite the growth of Bayesian research, most undergraduate teaching is still based on frequentist statistics. Nonetheless, Bayesian methods are widely accepted and used, such as
An archaeologist is working at a site thought to be from the medieval period, between the 11th and the 16th century. However, it is uncertain exactly when in this period the site was inhabited. Fragments of pottery are found, some of which are glazed and some of which are decorated. It is
currents in Bayesian practice. In the objective or "non-informative" current, the statistical analysis depends only on the model assumed, the data analyzed, and the method assigning the prior, which differs from one objective Bayesian practitioner to another. In the subjective or "informative"
If the existence of the crime is not in doubt, only the identity of the culprit, it has been suggested that the prior should be uniform over the qualifying population. For example, if 1,000 people could have committed the crime, the prior probability of guilt would be 1/1000.
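For illustration only (the likelihood ratio below is invented for this sketch, not taken from any case), a single piece of evidence that is 100 times more probable if the defendant is guilty than if innocent would update such a 1/1000 prior to a posterior probability of guilt of roughly 9%:

{\displaystyle P(G\mid E)={\frac {P(E\mid G)\,P(G)}{P(E\mid G)\,P(G)+P(E\mid \neg G)\,P(\neg G)}}={\frac {100\times 0.001}{100\times 0.001+1\times 0.999}}\approx 0.091.}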
{\displaystyle {\begin{aligned}P(H\mid E)&={\frac {P(E\mid H)P(H)}{P(E)}}\\&={\frac {P(E\mid H)P(H)}{P(E\mid H)P(H)+P(E\mid \neg H)P(\neg H)}}\\&={\frac {1}{1+\left({\frac {1}{P(H)}}-1\right){\frac {P(E\mid \neg H)}{P(E\mid H)}}}}\end{aligned}}}
the distribution of a new, unobserved data point. That is, instead of a fixed point as a prediction, a distribution over possible points is returned. Only this way is the entire posterior distribution of the parameter(s) used. By comparison, prediction in
6596: 7965: 11144:
Evaristo, Jaivime; McDonnell, Jeffrey J.; Scholl, Martha A.; Bruijnzeel, L. Adrian; Chun, Kwok P. (2016-01-01). "Insights into plant water uptake from xylem-water isotope measurements in two tropical catchments with contrasting moisture conditions".
4642: 5729: 6463:
continued to work on the case of infinite countable probability spaces. To summarise, there may be insufficient trials to suppress the effects of the initial choice, and especially for large (but finite) systems the convergence might be very slow.
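The point can be illustrated with a small simulation, sketched below under an assumed Beta-Bernoulli model (the priors, sample sizes and seed are arbitrary): with few trials the two analysts' posteriors still reflect their different priors, while after many trials they essentially coincide.

```python
import numpy as np

# Two analysts start from different Beta(a, b) priors for the same Bernoulli parameter.
rng = np.random.default_rng(1)
true_p = 0.7
priors = {"sceptical": (1.0, 9.0), "enthusiastic": (9.0, 1.0)}

for n in (10, 100, 10_000):
    heads = int((rng.random(n) < true_p).sum())
    # Conjugate update: posterior is Beta(a + heads, b + n - heads); report its mean.
    means = {name: (a + heads) / (a + b + n) for name, (a, b) in priors.items()}
    print(n, {k: round(v, 3) for k, v in means.items()})
```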
6687: 2192:
is close to 1; that is, the hypothesis is quite likely given the evidence. If that term is very large, much larger than 1, then the hypothesis, given the evidence, is quite unlikely. If the hypothesis (without consideration of evidence) is unlikely, then
2482:'s rule, which applies Bayes' rule to the case where the evidence itself is assigned a probability. The additional hypotheses needed to uniquely require Bayesian updating have been deemed to be substantial, complicated, and unsatisfactory. 7587:{\displaystyle {\begin{aligned}P(H_{1}\mid E)&={\frac {P(E\mid H_{1})\,P(H_{1})}{P(E\mid H_{1})\,P(H_{1})\;+\;P(E\mid H_{2})\,P(H_{2})}}\\\\\ &={\frac {0.75\times 0.5}{0.75\times 0.5+0.5\times 0.5}}\\\\\ &=0.6\end{aligned}}} 3086: 8592: 4537: 2091: 4644:
It quantifies the agreement between data and expert opinion, in a geometric sense that can be made precise. If the marginal likelihood is 0 then there is no agreement between the data and expert opinion and Bayes' rule cannot be
(MAP), and then plugging this estimate into the formula for the distribution of a data point. This has the disadvantage that it does not account for any uncertainty in the value of the parameter, and hence will underestimate the variance of the predictive distribution.

Since Bayesian model comparison is aimed at selecting the model with the highest posterior probability, this methodology is also referred to as the maximum a posteriori (MAP) selection rule or the MAP probability rule.
where the aim is to select one model from a set of competing models that most closely represents the underlying process that generated the observed data. In Bayesian model comparison, the model with the highest
6074: 9411:, rejecting the belief, commonly held by Bayesians, that high likelihood achieved by a series of Bayesian updates would prove the hypothesis beyond any reasonable doubt, or even with likelihood greater than 0. 8973: 7804: 9201:
is the theory of prediction based on observations; for example, predicting the next symbol based upon a given series of symbols. The only assumption is that the environment follows some unknown but computable
3384: 8136: 2341: 766: 5490:
from the prior distribution. Uniqueness requires continuity assumptions. Bayes' theorem can be generalized to include improper prior distributions such as the uniform distribution on the real line. Modern
7969: 1930: 2452: 6511: 10968:
Foreman, L. A.; Smith, A. F. M., and Evett, I. W. (1997). "Bayesian analysis of deoxyribonucleic acid profiling data in forensic identification applications (with discussion)".
9582:
current, the specification of the prior depends on the belief (that is, propositions on which the analysis is prepared to act), which can summarize information from experts, previous studies, etc.
5979:. That is, if the model were true, the evidence would be more likely than is predicted by the current state of belief. The reverse applies for a decrease in belief. If the belief does not change, 5308: 6918:{\displaystyle p({\tilde {x}}|\mathbf {X} ,\alpha )=\int p({\tilde {x}},\theta \mid \mathbf {X} ,\alpha )\,d\theta =\int p({\tilde {x}}\mid \theta )p(\theta \mid \mathbf {X} ,\alpha )\,d\theta .} 7811: 7322: 5012: 3698: 1507: 4557: 4243: 5633: 3566: 3160: 2490:
If evidence is simultaneously used to update belief over a set of exclusive and exhaustive propositions, Bayesian inference may be thought of as acting on this belief distribution as a whole.
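A minimal sketch of such a whole-distribution update, with invented numbers for three mutually exclusive and exhaustive models:

```python
import numpy as np

prior = np.array([0.5, 0.3, 0.2])          # P(M_m) for the three candidate models
likelihood = np.array([0.10, 0.40, 0.70])  # P(E | M_m) for the observed evidence E
posterior = prior * likelihood             # unnormalized posterior for every model at once
posterior /= posterior.sum()               # divide by P(E) = sum over m of P(E | M_m) P(M_m)
print(posterior)                           # approximately [0.161, 0.387, 0.452]
```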
3634: 6611: 9407:
epistemology, because it presupposes what it attempts to justify. According to this view, a rational interpretation of Bayesian inference would see it merely as a probabilistic version of
5475: 4468: 9034:"An important area of investigation in the development of admissibility ideas has been that of conventional sampling-theory procedures, and many interesting results have been obtained." 9566:
backwards from observations to parameters, or from effects to causes). After the 1920s, "inverse probability" was largely supplanted by a collection of methods that came to be called
7312: 7250: 3686: 3656: 3500: 3439: 3417: 2584: 4419: 4169: 9264:
Bayesian inference can be used by jurors to coherently accumulate the evidence for and against a defendant, and to see whether, in totality, it meets their personal threshold for "
2953: 10861:
Robinson, Mark D.; McCarthy, Davis J.; Smyth, Gordon K. "edgeR: a Bioconductor package for differential expression analysis of digital gene expression data". Bioinformatics.
4339: 2522:
in the general formulation of Bayesian inference. Although this diagram shows discrete models and events, the continuous case may be visualized similarly using probability densities.
2684: 2260: 7168: 1983: 2155: 1988: 9501: 7672: 6408: 2878: 2839: 12248: 10895:
Kurtz, David M.; Esfahani, Mohammad S.; Scherer, Florian; Soo, Joanne; Jin, Michael C.; Liu, Chih Long; Newman, Aaron M.; Dührsen, Ulrich; Hüttmann, Andreas (2019-07-25).
6746: 6164: 5054: 4370: 4273: 2793: 7058:
Intuitively, it seems clear that the answer should be more than a half, since there are more plain cookies in bowl #1. The precise answer is given by Bayes' theorem. Let
2942: 2190: 2126: 1427: 1392: 1056: 944: 9002:
Wald characterized admissible procedures as Bayesian procedures (and limits of Bayesian procedures), making the Bayesian formalism a central technique in such areas of
8543: 6234: 5340: 5109: 1276: 1241:
does not appear anywhere in the symbol, unlike for all the other factors) and hence does not factor into determining the relative probabilities of different hypotheses.
7630: 6361: 6277: 2720: 5074: 5032: 4197: 4123: 2608: 2520: 8873: 6199: 6123: 1453: 7110: 7083: 5182: 3597: 2751: 2635: 3569: 2907: 2220: 1357: 1215: 832: 6076:. That is, the evidence is independent of the model. If the model were true, the evidence would be exactly as likely as predicted by the current state of belief. 5386: 5155: 1304: 8587: 8567: 7712: 7188: 6320: 6300: 5360: 5202: 5129: 4293: 4097: 1497: 1473: 1328: 1239: 1184: 1164: 1144: 1124: 1098: 1076: 1015: 992: 970: 903: 881: 858: 789: 11101:
Ogle, Kiona; Tucker, Colin; Cable, Jessica M. (2014-01-01). "Beyond simple linear mixing models: process-based isotope partitioning of ecological processes".
4479: 9250:(Continuous Individualized Risk Index), where serial measurements are incorporated to update a Bayesian model which is primarily built from prior knowledge. 8849:
A computer simulation of the changing belief as 50 fragments are unearthed is shown on the graph. In the simulation, the site was inhabited around 1420, or
6482:. The usefulness of a conjugate prior is that the corresponding posterior distribution will be in the same family, and the calculation may be expressed in 2157:, about 50% likely - equally likely or not likely. If that term is very small, close to zero, then the probability of the hypothesis, given the evidence, 9374:
are both true, but in this case he argues that a jury should acquit, even though they know that they will be letting some guilty people go free. See also
9424:
is sometimes interpreted as an application of Bayesian inference. In this view, Bayes' rule guides (or should guide) the updating of probabilities about
4968:
This is expressed in words as "posterior is proportional to likelihood times prior", or sometimes as "posterior = likelihood times prior, over evidence".
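In the notation used elsewhere in this article, the same statement reads

{\displaystyle p(\theta \mid \mathbf {X} ,\alpha )\propto p(\mathbf {X} \mid \theta ,\alpha )\,p(\theta \mid \alpha ),\qquad p(\mathbf {X} \mid \alpha )=\int p(\mathbf {X} \mid \theta ,\alpha )\,p(\theta \mid \alpha )\,d\theta ,}

where the evidence p(X | α) is the normalizing constant and does not depend on θ.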
3309: 11262: 677: 8840:{\displaystyle f_{C}(c\mid E=e)={\frac {P(E=e\mid C=c)}{P(E=e)}}f_{C}(c)={\frac {P(E=e\mid C=c)}{\int _{11}^{16}{P(E=e\mid C=c)f_{C}(c)dc}}}f_{C}(c)} 10524:
Fatermans, J.; Van Aert, S.; den Dekker, A.J. (2019). "The maximum a posteriori probability rule for atom column detection from HAADF STEM images".
11771: 9072:
of the model. When two competing models are a priori considered to be equiprobable, the ratio of their posterior probabilities corresponds to the
1834: 13677: 5885: 2353: 10217: 2461:
Bayesian updating is widely used and computationally convenient. However, it is not the only updating rule that might be considered rational.
14182: 12221: 11585: 9198: 309: 3444: 1221:
or "model evidence". This factor is the same for all possible hypotheses being considered (as is evident from the fact that the hypothesis
14332: 9156:
community for these reasons; a number of applications allow many demographic and evolutionary parameters to be estimated simultaneously.
8886: 7717: 5982: 100: 1146:
fixed, it indicates the compatibility of the evidence with the given hypothesis. The likelihood function is a function of the evidence,
13956: 12597: 12234: 6494:
It is often desired to use a posterior distribution to estimate a parameter or variable. Several methods of Bayesian estimation select
9246:
applications, including differential gene expression analysis. Bayesian inference is also used in a general cancer risk model, called
2265: 12227: 9025:"In decision theory, a quite general method for proving admissibility consists in exhibiting a procedure as a unique Bayes solution." 11007: 5617:{\displaystyle p({\tilde {x}}\mid \mathbf {X} ,\alpha )=\int p({\tilde {x}}\mid \theta )p(\theta \mid \mathbf {X} ,\alpha )d\theta } 13730: 10955: 9031:"A useful fact is that any Bayes decision rule obtained by taking a proper prior over the whole parameter space must be admissible" 4060:{\displaystyle p(\mathbf {E} \mid {\boldsymbol {\theta }},{\boldsymbol {\alpha }})=\prod _{k}p(e_{k}\mid {\boldsymbol {\theta }}).} 182: 11074:
Cai, X.Q.; Wu, X.Y.; Zhou, X. (2009). "Stochastic scheduling subject to breakdown-repeat breakdowns with incomplete information".
9585:
In the 1980s, there was a dramatic growth in research and applications of Bayesian methods, mostly attributed to the discovery of
9399:
have rejected the idea of Bayesian rationalism, i.e. using Bayes' rule to make epistemological inferences: it is prone to the same
8494:{\displaystyle P(E={\bar {G}}{\bar {D}}\mid C=c)=((1-0.01)-{\frac {0.81-0.01}{16-11}}(c-11))(0.5+{\frac {0.5-0.05}{16-11}}(c-11))} 5394: 14169: 9172: 8546: 6419: 3099: 6501:
For one-dimensional problems, a unique median exists for practical continuous problems. The posterior median is attractive as a
5805:
article), while the prior predictive distribution uses the values of the hyperparameters that appear in the prior distribution.
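A small numerical sketch of the conjugate case, using the standard Beta-Bernoulli pairing with illustrative hyperparameters and data (not taken from the article):

```python
# Beta(a, b) prior on a success probability; Bernoulli observations.
a, b = 2.0, 2.0
data = [1, 0, 1, 1, 1, 0]
heads, n = sum(data), len(data)

prior_pred = a / (a + b)               # prior predictive P(next observation = 1)
post_pred = (a + heads) / (a + b + n)  # posterior predictive: same form, updated hyperparameters
print(prior_pred, post_pred)           # 0.5 and 0.6
```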
3394:
probability densities, as this is the usual situation. The technique is, however, equally applicable to discrete distributions.
12176: 9450: 9206:. It is a formal inductive framework that combines two well-studied principles of inductive inference: Bayesian statistics and 3299:{\displaystyle P(M\mid \mathbf {E} )={\frac {P(\mathbf {E} \mid M)}{\sum _{m}{P(\mathbf {E} \mid M_{m})P(M_{m})}}}\cdot P(M),} 12163: 12125: 12078: 12025: 11988: 11962: 11947: 11928: 11895: 11869: 11850: 11824: 11785: 11761: 11733: 11710: 11683: 11572: 11049: 10990: 10713: 10576:
Bessiere, P.; Mazer, E.; Ahuactzin, J. M.; Mekhnacha, K. (2013). Bayesian Programming (1st ed.). Chapman and Hall/CRC.
10472: 10444: 10415: 9904: 9778: 9604: 9455: 12158:
Samaniego, Francisco J. (2010). A Comparison of the Bayesian and Frequentist Approaches to Estimation. New York: Springer.
2347: 12592: 12292: 10096:
Robins, James; Wasserman, Larry (2000). "Conditioning, likelihood, and coherence: A review of some foundational concepts".
9482: 9247: 5207: 9091:
allowing practitioners to focus on their specific problems and leaving PPLs to handle the computational details for them.
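As a hedged illustration of this division of labour, a simple coin-bias model might be written in a PPL roughly as follows. This sketch assumes the PyMC library is installed; Stan, NumPyro and other PPLs are analogous, and the exact API can differ between versions.

```python
import pymc as pm

data = [1, 0, 1, 1, 0, 1, 1, 1]              # illustrative binary observations

with pm.Model():
    p = pm.Beta("p", alpha=1.0, beta=1.0)    # prior on the unknown success probability
    pm.Bernoulli("y", p=p, observed=data)    # likelihood of the observed data
    idata = pm.sample(1000, chains=2)        # the PPL handles posterior sampling automatically

print(float(idata.posterior["p"].mean()))    # posterior mean of p
```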
6447:(i.e., corresponding to a die with infinite many faces) the 1965 paper demonstrates that for a dense subset of priors the 13196: 12344: 8306:{\displaystyle P(E={\bar {G}}D\mid C=c)=((1-0.01)-{\frac {0.81-0.01}{16-11}}(c-11))(0.5-{\frac {0.5-0.05}{16-11}}(c-11))} 4974: 12213: 9132:
techniques since the late 1950s. There is also an ever-growing connection between Bayesian methods and simulation-based
6282:
The former follows directly from Bayes' theorem. The latter can be derived by applying the first rule to the event "not
4207: 11557: 11520: 10587: 9990: 3511: 3105: 1935: 3602: 13979: 13871: 12153: 12109: 12062: 11913: 11597: 11363: 10176: 9757: 9149: 8127:{\displaystyle P(E=G{\bar {D}}\mid C=c)=(0.01+{\frac {0.81-0.01}{16-11}}(c-11))(0.5+{\frac {0.5-0.05}{16-11}}(c-11))} 7596:
Before we observed the cookie, the probability we assigned for Fred having chosen bowl #1 was the prior probability,
6716: 5865: 5843: 5790: 5504: 391: 357: 302: 265: 7806:
as evidence. Assuming linear variation of glaze and decoration with time, and that these variables are independent,
5836: 14621: 14616: 14584: 14157: 14031: 9555: 6456: 6439: 192: 14215: 13876: 13621: 12992: 12582: 12237:, in: S. Bernecker and D. Pritchard (eds.), Routledge Companion to Epistemology. London: Routledge 2010, 609–620. 9521: 9477: 9445: 9277: 8876: 6601: 6591:{\displaystyle {\tilde {\theta }}=\operatorname {E} =\int \theta \,p(\theta \mid \mathbf {X} ,\alpha )\,d\theta } 6508:
If there exists a finite mean for the posterior distribution, then the posterior mean is a method of estimation.
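The common point estimates can all be read off one posterior. The sketch below grid-approximates a posterior from an illustrative flat prior and binomial likelihood (not data from the article) and reports the posterior mean, median and MAP value.

```python
import numpy as np

theta = np.linspace(0.0, 1.0, 1001)
prior = np.ones_like(theta)                     # flat prior on [0, 1]
heads, n = 7, 10
like = theta**heads * (1 - theta)**(n - heads)  # binomial likelihood, up to a constant
post = prior * like
post /= post.sum()

mean = np.sum(theta * post)                            # posterior mean
median = theta[np.searchsorted(np.cumsum(post), 0.5)]  # posterior median (grid approximation)
map_est = theta[np.argmax(post)]                       # maximum a posteriori value
print(mean, median, map_est)                           # about 0.667, 0.676, 0.700
```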
6448: 6423: 5752: 4436: 218: 95: 17: 11720:
Phillips, L. D.; Edwards, Ward (October 2008). "Chapter 6: Conservatism in a Simple Probability Inference Task (
11490: 9218:
is the sum of the probabilities of all programs (for a universal computer) that compute something starting with
7055:
for the cookies. The cookie turns out to be a plain one. How probable is it that Fred picked it out of bowl #1?
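The answer follows from a direct evaluation of Bayes' theorem with the bowl contents given above; a short sketch:

```python
p_h1 = p_h2 = 0.5          # Fred is equally likely to have picked either bowl
p_plain_h1 = 30 / 40       # bowl #1 holds 30 plain cookies out of 40
p_plain_h2 = 20 / 40       # bowl #2 holds 20 plain cookies out of 40

p_h1_given_plain = (p_plain_h1 * p_h1) / (p_plain_h1 * p_h1 + p_plain_h2 * p_h2)
print(p_h1_given_plain)    # 0.6, as computed in the article
```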
5034:
can be very high, or the Bayesian model retains certain hierarchical structure formulated from the observations
14266: 13478: 13285: 13174: 13132: 11549: 11198:"Comparison of Parameter Estimation Methods in Stochastic Chemical Kinetic Models: Examples in Systems Biology" 9301: 9050: 7255: 7193: 156: 13206: 9109: 2474:
Indeed, there are non-Bayesian updating rules that also avoid Dutch books (as discussed in the literature on "
14611: 14509: 13468: 12371: 12186: 11842: 11702: 10788:
Hutter, Marcus; He, Yang-Hui; Ormerod, Thomas C (2007). "On Universal Prediction and Bayesian Confirmation".
10149:
Choudhuri, Nidhan; Ghosal, Subhashis; Roy, Anindya (2005-01-01). "Bayesian Methods for Function Estimation".
9573:
In the 20th century, the ideas of Laplace were further developed in two different directions, giving rise to
6478:
In parameterized form, the prior distribution is often assumed to come from a family of distributions called
5627: 3669: 3639: 3483: 3422: 3400: 2529: 9833:
Lee, Se Yoon (2021). "Gibbs sampler and coordinate ascent variational inference: A set-theoretical review".
7960:{\displaystyle P(E=GD\mid C=c)=(0.01+{\frac {0.81-0.01}{16-11}}(c-11))(0.5-{\frac {0.5-0.05}{16-11}}(c-11))} 4652:
is the distribution of the parameter(s) after taking into account the observed data. This is determined by
4389: 4133: 14060: 14009: 13994: 13984: 13853: 13725: 13692: 13518: 13473: 13303: 11750:
For a full report on the history of Bayesian statistics and the debates with frequentist approaches, read
11287:
Schütz, N.; Holschneider, M. (2011). "Detection of trend changes in time series using Bayesian inference".
9634: 9389:
is a movement that advocates for Bayesian inference as a means of justifying the rules of inductive logic.
4637:{\displaystyle p(\mathbf {X} \mid \alpha )=\int p(\mathbf {X} \mid \theta )p(\theta \mid \alpha )d\theta .} 4421:. The prior distribution might not be easily determined; in such a case, one possibility may be to use the 295: 187: 125: 5724:{\displaystyle p({\tilde {x}}\mid \alpha )=\int p({\tilde {x}}\mid \theta )p(\theta \mid \alpha )d\theta } 14606: 14572: 14404: 14205: 14129: 13430: 13184: 12853: 12317: 12181: 12039: 12005: 11805: 10279: 9769:
Gelman, Andrew; Carlin, John B.; Stern, Hal S.; Dunson, David B.; Vehtari, Aki; Rubin, Donald B. (2013).
9534:(1701–1761), who proved that probabilistic limits could be placed on an unknown event. However, it was 8984: 6709: 5783: 4971:
In practice, for almost all complex Bayesian models used in machine learning, the posterior distribution
4298: 10284:"Admissible Bayes Character of T-, R-, and Other Fully Invariant Tests for Multivariate Normal Problems" 6682:{\displaystyle \{\theta _{\text{MAP}}\}\subset \arg \max _{\theta }p(\theta \mid \mathbf {X} ,\alpha ).} 2640: 2225: 14289: 14261: 14256: 14004: 13763: 13669: 13649: 13557: 13268: 13086: 12569: 12441: 10953:
Dawid, A. P. and Mortera, J. (1996) "Coherent Analysis of Forensic Identification Evidence".
9396: 9259: 9160: 7115: 177: 146: 10192: 803:
below). Often there are competing hypotheses, and the task is to determine which is the most probable.
14021: 13789: 13510: 13435: 13364: 13293: 13213: 13201: 13071: 13059: 13052: 12760: 12481: 9644: 9265: 9085: 5482: 2131: 239: 120: 10847: 10159: 9192: 7635: 6366: 2844: 2805: 14504: 14271: 14134: 13819: 13784: 13748: 13533: 12975: 12884: 12843: 12755: 12446: 12285: 12254: 11447: 10228: 9629: 9586: 9547: 9308:
Gardner-Medwin argues that the criterion on which a verdict in a criminal trial should be based is
9203: 9136:
techniques since complex models cannot be processed in closed form by a Bayesian analysis, while a
9044: 8996: 8992: 5830: 5491: 2587: 260: 172: 10349: 9230:
is sampled, the universal prior and Bayes' theorem can be used to predict the yet unseen parts of
6722: 6128: 5037: 4346: 4256: 2756: 14413: 14026: 13966: 13903: 13541: 13525: 13263: 13125: 13115: 12965: 12879: 11775: 9639: 9472: 9121: 6430:
independent of the initial prior under some conditions firstly outlined and rigorously proven by
5478: 428: 12035: 11486: 11415: 9792: 2912: 2160: 2096: 1397: 1362: 1026: 914: 14451: 14381: 14174: 14111: 13866: 13753: 12750: 12647: 12554: 12433: 12332: 11603:
Edwards, Ward (1968). "Conservatism in Human Information Processing". In Kleinmuntz, B. (ed.).
11442: 11024: 10842: 10154: 9567: 9488: 9439: 6483: 6204: 5847: 5744: 5313: 5082: 4649: 3663: 2475: 1246: 151: 8506: 7599: 6418:
Consider the behaviour of a belief distribution as it is updated a large number of times with
6325: 6239: 2689: 14476: 14418: 14361: 14187: 14080: 13989: 13715: 13599: 13458: 13450: 13340: 13332: 13147: 13043: 13021: 12980: 12945: 12912: 12858: 12833: 12788: 12727: 12687: 12489: 12312: 11671: 10246: 9624: 9619: 9509: 9433: 9386: 9064:
given the data is selected. The posterior probability of a model depends on the evidence, or
9061: 9003: 6427: 6279:. This can be interpreted to mean that hard convictions are insensitive to counter-evidence. 5059: 5017: 4551: 4429: 4182: 4108: 3081:{\displaystyle P(M\mid E)={\frac {P(E\mid M)}{\sum _{m}{P(E\mid M_{m})P(M_{m})}}}\cdot P(M).} 2723: 2593: 2505: 948: 647: 420: 400: 54: 8852: 6169: 6093: 5494:
methods have boosted the importance of Bayes' theorem including cases with improper priors.
2093:
If that term is approximately 1, then the probability of the hypothesis given the evidence,
1435: 415:
becomes available. Fundamentally, Bayesian inference uses prior knowledge, in the form of a
14399: 13974: 13923: 13899: 13861: 13779: 13758: 13710: 13589: 13567: 13536: 13445: 13322: 13273: 13191: 13164: 13120: 13076: 12838: 12614: 12494: 12013: 11633: 11464: 11306: 11209: 11154: 10807: 10752: 10629: 10393:(see p. 309 of Chapter 6.7 "Admissibility", and pp. 17–18 of Chapter 1.8 "Complete Classes" 9535: 9375: 9015: 9007: 7088: 7061: 6691:
There are examples where no maximum is attained, in which case the set of MAP estimates is
5735: 5160: 3575: 2729: 2613: 2610:
represent the current state of belief for this process. Each model is represented by event
651: 523: 464: 234: 115: 85: 12230:, in: J. Dancy et al. (eds.), A Companion to Epistemology. Oxford: Blackwell 2010, 93–106. 12193: 10897:"Dynamic Risk Profiling Using Serial Tumor Biomarkers for Personalized Outcome Prediction" 2883: 2196: 1333: 1191: 808: 670:
for the observed data. Bayesian inference computes the posterior probability according to
8: 14546: 14471: 14394: 14075: 13839: 13832: 13794: 13702: 13682: 13654: 13387: 13253: 13248: 13238: 13230: 13048: 13009: 12899: 12889: 12798: 12577: 12533: 12451: 12376: 12278: 11003: 9563: 9559: 9543: 9191:, XEAMS, and others. Spam classification is treated in more detail in the article on the 9129: 9105: 9065: 7682: 5794: 5771: 5767: 5763: 5365: 5134: 4543: 4473: 1281: 1218: 1102: 663: 432: 66: 58: 38: 12118:
The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation
12017: 11637: 11310: 11213: 11158: 10811: 10756: 10633: 8999:
statistical procedure is either a Bayesian procedure or a limit of Bayesian procedures.
4532:{\displaystyle \operatorname {L} (\theta \mid \mathbf {X} )=p(\mathbf {X} \mid \theta )} 2086:{\displaystyle \left({\tfrac {1}{P(H)}}-1\right){\tfrac {P(E\mid \neg H)}{P(E\mid H)}}.} 14560: 14371: 14225: 14121: 14070: 13946: 13843: 13827: 13804: 13315: 13298: 13258: 13169: 13064: 13026: 12997: 12957: 12917: 12863: 12780: 12466: 12461: 11884: 11657: 11468: 11330: 11296: 11238: 11197: 11178: 10931: 10896: 10841:
Gács, Peter; Vitányi, Paul M. B. (2 December 2010). "Raymond J. Solomonoff 1926-2009".
10823: 10797: 10770: 10742: 10653: 10559: 10533: 10506: 10113: 10078: 10037: 9961: 9860: 9842: 9815: 9707: 9609: 9464:
Bayesian inference is used to estimate parameters in stochastic chemical kinetic models
9295:
The use of Bayes' theorem by jurors is controversial. In the United Kingdom, a defence
9133: 9011: 8572: 8552: 7697: 7173: 6305: 6285: 6085: 5748: 5345: 5187: 5114: 4383: 4278: 4082: 3162:, it can be shown by induction that repeated application of the above is equivalent to 2479: 1482: 1458: 1313: 1224: 1169: 1149: 1129: 1109: 1083: 1061: 1000: 977: 955: 888: 866: 843: 774: 655: 283: 208: 110: 12259: 10617: 10168: 9894: 9670: 5014:
is not obtained in a closed form distribution, mainly because the parameter space for
14555: 14466: 14436: 14428: 14248: 14239: 14164: 14095: 13951: 13936: 13911: 13799: 13740: 13606: 13594: 13220: 13137: 13081: 13004: 12848: 12770: 12549: 12423: 12240: 12159: 12149: 12145: 12121: 12105: 12074: 12058: 12050: 12021: 11984: 11958: 11943: 11924: 11909: 11891: 11879: 11865: 11846: 11820: 11812: 11781: 11757: 11729: 11706: 11679: 11661: 11649: 11593: 11568: 11553: 11545: 11516: 11472: 11359: 11350: 11322: 11243: 11225: 11182: 11170: 11126: 11118: 11045: 10986: 10936: 10918: 10709: 10645: 10563: 10551: 10468: 10440: 10411: 10384: 10172: 10117: 9996: 9986: 9965: 9953: 9900: 9864: 9774: 9753: 9539: 9421: 9069: 9068:, which reflects the probability that the data is generated by the model, and on the 8880: 7112:
to bowl #2. It is given that the bowls are identical from Fred's point of view, thus
6938: 6502: 6444: 6435: 5762:
In some instances, frequentist statistics can work around this problem. For example,
5486: 3659: 2945: 836: 796: 671: 667: 659: 534: 517: 416: 404: 278: 213: 90: 62: 11334: 10510: 9819: 9711: 7686:
Example results for archaeology example. This simulation was generated using c=15.2.
435:. Bayesian inference has found application in a wide range of activities, including 14491: 14446: 14210: 14197: 14090: 14065: 13999: 13931: 13809: 13417: 13310: 13243: 13156: 13103: 12922: 12793: 12587: 12471: 12386: 12353: 12205: 12199: 11800:
The following books are listed in ascending order of probabilistic sophistication:
11641: 11581: 11452: 11395: 11314: 11233: 11217: 11162: 11110: 11083: 10926: 10908: 10827: 10815: 10774: 10760: 10684: 10637: 10547: 10543: 10498: 10364: 10328: 10295: 10164: 10109: 10105: 10068: 10027: 9943: 9933: 9852: 9807: 9745: 9725: 9699: 9591: 9404: 9207: 9168: 7190:
is the observation of a plain cookie. From the contents of the bowls, we know that
6495: 1476: 368: 328: 105: 11645: 10657: 9856: 14408: 14152: 14014: 13941: 13616: 13490: 13463: 13440: 13409: 13036: 13031: 12985: 12715: 12366: 12217: 11804:
Stone, JV (2013), "Bayes' Rule: A Tutorial Introduction to Bayesian Analysis",
11613: 11460: 11039: 11011: 9675: 9137: 9056: 8569:
is discovered, Bayes' theorem is applied to update the degree of belief for each
6479: 6473: 5802: 5798: 5734:
Bayesian theory calls for the use of the posterior predictive distribution to do
2498: 463:, Bayesian inference is closely related to subjective probability, often called " 460: 141: 13898: 12088: 11687: 11263:"The Tadpole Bayesian Model for Detecting Trend Changes in Financial Quotations" 9280:, replacing multiplication with addition, might be easier for a jury to handle. 909:, corresponds to new data that were not used in computing the prior probability. 14357: 14352: 12815: 12745: 12391: 12139:
Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference
11456: 11318: 11114: 10913: 9408: 9400: 9336:– the known facts and testimony could have arisen if the defendant is innocent. 9296: 9269: 9243: 9167:. Applications which make use of Bayesian inference for spam filtering include 9145: 8879:
asserts here the asymptotic convergence to the "true" distribution because the
7632:, which was 0.5. After observing the cookie, we must revise the probability to 6460: 6431: 4653: 4422: 4200: 3504: 2526:
Suppose a process is generating independent and identically distributed events
12057:. Wiley Classics Library. 2004. (Originally published (1970) by McGraw-Hill.) 10819: 10333: 10316: 10300: 10283: 10073: 10056: 10032: 10015: 10000: 9690:
Hacking, Ian (December 1967). "Slightly More Realistic Personal Probability".
9128:. Bayesian inference techniques have been a fundamental part of computerized 5747:
often involves finding an optimum point estimate of the parameter(s)—e.g., by
4476:, especially when viewed as a function of the parameter(s), sometimes written 475: 14600: 14514: 14481: 14344: 14305: 14116: 14085: 13549: 13503: 13108: 12810: 12637: 12401: 12396: 12210: 11972: 11754:
Bayesians Versus Frequentists A Philosophical Debate on Statistical Reasoning
11229: 11174: 11122: 10922: 10502: 10403: 10369: 10131: 9957: 9551: 9330:– the known facts and testimony could have arisen if the defendant is guilty. 9153: 9125: 6705: 6452: 4432:
is the distribution of the observed data conditional on its parameters, i.e.
4248: 4174: 4100: 12264:— Informal introduction with many examples, ebook (PDF) freely available at 10595: 10350:"Minimax Confidence Sets for the Mean of a Multivariate Normal Distribution" 3388: 1017:
is observed. This is what we want to know: the probability of a hypothesis
14456: 14389: 14366: 14281: 13611: 12907: 12805: 12740: 12682: 12667: 12604: 12559: 11976: 11834: 11694: 11653: 11621: 11326: 11247: 11130: 11087: 10940: 10649: 10555: 10460: 10432: 9614: 9531: 9468: 9273: 9180: 9073: 8988: 255: 9554:. Early Bayesian inference, which used uniform priors following Laplace's 5972:{\textstyle {\frac {P(E\mid M)}{P(E)}}>1\Rightarrow P(E\mid M)>P(E)} 5507:
is the distribution of a new data point, marginalized over the posterior:
4425:
to obtain a prior distribution before updating it with newer observations.
4386:
is the distribution of the parameter(s) before any data is observed, i.e.
14499: 14461: 14144: 14045: 13907: 13720: 13687: 13179: 13096: 13091: 12735: 12692: 12672: 12652: 12642: 12411: 12134: 11777:
Bernoulli's Fallacy: Statistical Illogic and the Crisis of Modern Science
11617: 10489:
Stoica, P.; Selen, Y. (2004). "A review of information criterion rules".
9948: 9392: 9317: 9164: 9163:, Bayesian inference has been used to develop algorithms for identifying 6426:
gives that in the limit of infinite trials, the posterior converges to a
2464: 440: 412: 10641: 8975:
is finite (see above section on asymptotic behaviour of the posterior).
2485: 13345: 12825: 12525: 12456: 12406: 12381: 12301: 11022:
Gardner-Medwin, A. (2005) "What Probability Should the Jury Address?".
10422:(From "Chapter 12 Posterior Distributions and Bayes Solutions", p. 324) 10082: 10057:"On the asymptotic behavior of Bayes estimates in the discrete case II" 10041: 9938: 9921: 9429: 9425: 9176: 5739: 2468: 444: 424: 12206:
Mathematical Notes on Bayesian Statistics and Markov Chain Monte Carlo
11221: 11166: 10765: 10730: 9980: 9811: 5630:
is the distribution of a new data point, marginalized over the prior:
3473:{\displaystyle p({\boldsymbol {\theta }}\mid {\boldsymbol {\alpha }})} 3090:
Upon observation of further evidence, this procedure may be repeated.
13498: 13350: 12970: 12765: 12677: 12662: 12657: 12622: 11400: 11383: 10689: 10672: 10136:
Pitman's measure of closeness: A comparison of statistical estimators
10016:"On the asymptotic behavior of Bayes' estimates in the discrete case" 9500:
The problem considered by Bayes in Proposition 9 of his essay, "
9184: 9152:
schemes. Recently Bayesian inference has gained popularity among the
6692: 6434:
in 1948, namely if the random variable in consideration has a finite
4126: 12012:. Springer Series in Statistics (Second ed.). Springer-Verlag. 10265: 10263: 7170:, and the two must add up to 1, so both are equal to 0.5. The event 6069:{\textstyle {\frac {P(E\mid M)}{P(E)}}=1\Rightarrow P(E\mid M)=P(E)} 5076:. In such situations, we need to resort to approximation techniques. 13014: 12632: 12509: 12504: 12499: 11433:
Wolpert, R. L. (2004). "A Conversation with James O. Berger".
10983:
Interpreting Evidence: Evaluating Forensic Science in the Courtroom
10538: 9847: 9703: 9226:
and any computable but unknown probability distribution from which
8968:{\displaystyle \{GD,G{\bar {D}},{\bar {G}}D,{\bar {G}}{\bar {D}}\}} 7799:{\displaystyle \{GD,G{\bar {D}},{\bar {G}}D,{\bar {G}}{\bar {D}}\}} 6438:. The more general results were obtained later by the statistician 5779: 5756: 2343:
and relevant probabilities can be compared directly to each other.
1166:, while the posterior probability is a function of the hypothesis, 448: 408: 11975:; Carlin, John B.; Stern, Hal S.; Dunson, David B.; Vehtari, Aki; 11952: 11878: 11567:(Second (updated printing 2007) ed.). Pearson Prentice–Hall. 11301: 10802: 10747: 6698:
There are other methods of estimation that minimize the posterior
3419:
span the parameter space. Let the initial prior distribution over
1499:, is a valid likelihood, Bayes' rule can be rewritten as follows: 14519: 14220: 11666:
Chapter: Conservatism in Human Information Processing (excerpted)
11143: 10260: 9320: 9314:
probability of the evidence, given that the defendant is innocent
9188: 3379:{\displaystyle P(\mathbf {E} \mid M)=\prod _{k}{P(e_{k}\mid M)}.} 436: 11562: 9920:
Taraldsen, Gunnar; Tufto, Jarle; Lindqvist, Bo H. (2021-07-24).
9538:(1749–1827) who introduced (as Principle VI) what is now called 9276:
are more widely understood than probabilities. Alternatively, a
8978: 6748:(that is independent of previous observations) is determined by 2346:
One quick and easy way to remember the equation would be to use
2336:{\displaystyle {\tfrac {P(E\mid \neg H)}{P(E\mid H)\cdot P(H)}}} 761:{\displaystyle P(H\mid E)={\frac {P(E\mid H)\cdot P(H)}{P(E)}},} 14441: 13422: 13396: 13376: 12627: 12418: 11540:
Aster, Richard; Borchers, Brian, and Thurber, Clifford (2012).
7714:(century) is to be calculated, with the discrete set of events 11724:(1966) 72: 346-354)". In Jie W. Weiss; David J. Weiss (eds.). 10729:
Rathmanner, Samuel; Hutter, Marcus; Ormerod, Thomas C (2011).
9899:. Internet Archive. Chichester  ; New York : Wiley. 383: 380: 12270: 11624:(eds.). "Judgment under uncertainty: Heuristics and biases". 10871: 9502:
An Essay Towards Solving a Problem in the Doctrine of Chances
9237: 452: 46: 12265: 11935:
Updated classic textbook. Bayesian theory clearly presented.
11565:
Mathematical Statistics, Volume 1: Basic and Selected Topics
10618:"Probabilistic machine learning and artificial intelligence" 9791:
de Carvalho, Miguel; Page, Garritt; Barney, Bradley (2019).
8987:
justification of the use of Bayesian inference was given by
6712:
using the sampling distribution ("frequentist statistics").
1925:{\displaystyle P(E)=P(E\mid H)P(H)+P(E\mid \neg H)P(\neg H)} 12361: 10523: 10347: 6700: 5775: 4069: 2262:
is much larger than 1 and this term can be approximated as
407:
is used to update the probability for a hypothesis as more
377: 346: 343: 337: 9436:
problems with incomplete information by Cai et al. (2009).
9283: 6489: 5789:
Both types of predictive distributions have the form of a
9210:. Solomonoff's universal prior probability of any prefix 4372:, a new data point whose distribution is to be predicted. 3389:
Parametric formulation: motivating the formal description
2447:{\displaystyle P(E\cap H)=P(E\mid H)P(H)=P(H\mid E)P(E).} 456: 11833: 9432:. The Bayesian inference has also been applied to treat 9346:
Gardner-Medwin argues that the jury should believe both
6455:
no asymptotic convergence. Later in the 1980s and 1990s
2222:
is small (but not necessarily astronomically small) and
11971: 11726:
A Science of Decision Making:The Legacy of Ward Edwards
11270:
R&R Journal of Statistics and Mathematical Sciences
10894: 10728: 10317:"Invariant Proper Bayes Tests for Exponential Families" 9919: 6422:
trials. For sufficiently nice prior probabilities, the
840:, is the estimate of the probability of the hypothesis 496:. Similar reasoning can be used to show that P(¬A|B) = 9726:"Bayes' Theorem (Stanford Encyclopedia of Philosophy)" 8509: 5985: 5888: 2270: 2230: 2136: 2032: 1998: 10278: 10153:. Bayesian Thinking. Vol. 25. pp. 373–414. 9790: 8991:, who proved that every unique Bayesian procedure is 8889: 8855: 8595: 8575: 8555: 8318: 8139: 7972: 7814: 7720: 7700: 7638: 7602: 7320: 7258: 7196: 7176: 7118: 7091: 7064: 6754: 6725: 6614: 6600:
Taking a value with the greatest probability defines
6514: 6369: 6328: 6308: 6288: 6242: 6207: 6172: 6131: 6096: 5636: 5513: 5397: 5368: 5348: 5316: 5303:{\displaystyle P_{X,Y}(dx,dy)=P_{Y}^{x}(dy)P_{X}(dx)} 5210: 5190: 5163: 5137: 5117: 5085: 5062: 5040: 5020: 4977: 4662: 4560: 4482: 4439: 4392: 4349: 4301: 4281: 4259: 4210: 4185: 4136: 4111: 4085: 3983: 3696: 3672: 3642: 3605: 3578: 3514: 3486: 3447: 3425: 3403: 3312: 3168: 3108: 2956: 2915: 2886: 2847: 2808: 2759: 2732: 2692: 2643: 2616: 2596: 2532: 2508: 2486:
Inference over exclusive and exhaustive possibilities
2356: 2268: 2228: 2199: 2163: 2134: 2099: 1991: 1938: 1837: 1505: 1485: 1461: 1438: 1400: 1365: 1336: 1316: 1284: 1249: 1227: 1194: 1172: 1152: 1132: 1112: 1086: 1064: 1029: 1003: 980: 958: 917: 891: 869: 846: 811: 777: 680: 431:. Bayesian updating is particularly important in the 392: 358: 349: 340: 14183:
Autoregressive conditional heteroskedasticity (ARCH)
11693: 6413: 2799:. These must sum to 1, but are otherwise arbitrary. 2456: 374: 334: 12115: 9504:", is the posterior distribution for the parameter 9366:, but the reverse is not true. It is possible that 9144:allow for efficient simulation algorithms like the 5007:{\displaystyle p(\theta \mid \mathbf {X} ,\alpha )} 371: 331: 13645: 11953:Carlin, Bradley P. & Louis, Thomas A. (2008). 11883: 11349: 11286: 10148: 9784: 8967: 8867: 8839: 8581: 8561: 8537: 8493: 8305: 8126: 7959: 7798: 7706: 7666: 7624: 7586: 7306: 7244: 7182: 7162: 7104: 7077: 6917: 6740: 6681: 6590: 6402: 6355: 6314: 6294: 6271: 6228: 6193: 6158: 6117: 6068: 5971: 5770:in frequentist statistics when constructed from a 5723: 5616: 5469: 5380: 5354: 5334: 5302: 5196: 5176: 5149: 5123: 5103: 5068: 5048: 5026: 5006: 4960: 4636: 4531: 4462: 4413: 4364: 4333: 4287: 4267: 4238:{\displaystyle \theta \sim p(\theta \mid \alpha )} 4237: 4191: 4163: 4117: 4099:, a data point in general. This may in fact be a 4091: 4059: 3969: 3680: 3650: 3628: 3591: 3560: 3494: 3472: 3433: 3411: 3378: 3298: 3154: 3080: 2936: 2901: 2872: 2833: 2787: 2745: 2714: 2678: 2629: 2602: 2578: 2514: 2446: 2335: 2254: 2214: 2184: 2149: 2120: 2085: 1977: 1924: 1823: 1491: 1467: 1447: 1421: 1386: 1351: 1322: 1298: 1270: 1233: 1209: 1178: 1158: 1138: 1118: 1092: 1070: 1050: 1009: 986: 964: 938: 897: 875: 852: 826: 783: 760: 12010:Statistical Decision Theory and Bayesian Analysis 11955:Bayesian Methods for Data Analysis, Third Edition 11918: 11811: 10731:"A Philosophical Treatise of Universal Induction" 10408:Asymptotic Methods in Statistical Decision Theory 9835:Communications in Statistics – Theory and Methods 9242:Bayesian inference has been applied in different 3561:{\displaystyle \mathbf {E} =(e_{1},\dots ,e_{n})} 3155:{\displaystyle \mathbf {E} =(e_{1},\dots ,e_{n})} 2802:Suppose that the process is observed to generate 14598: 11859: 11563:Bickel, Peter J. & Doksum, Kjell A. (2001). 11384:"When did Bayesian Inference Become 'Bayesian'?" 10673:"When did Bayesian inference become "Bayesian"?" 9522:History of statistics § Bayesian statistics 9260:Jurimetrics § Bayesian analysis of evidence 7694:The degree of belief in the continuous variable 6641: 3629:{\displaystyle p(e\mid {\boldsymbol {\theta }})} 423:Bayesian inference is an important technique in 13731:Multivariate adaptive regression splines (MARS) 12004: 11921:Introduction to Bayesian Inference and Decision 11719: 11377: 11375: 11196:Gupta, Ankur; Rawlings, James B. (April 2014). 11100: 11067: 10787: 10130: 10098:Journal of the American Statistical Association 10095: 9826: 9495: 4656:, which forms the heart of Bayesian inference: 3502:is a set of parameters to the prior itself, or 11004:Bayes' Theorem and Weighing Evidence by Juries 10383: 10247:"Posterior Predictive Distribution Stat Slide" 9979:Robert, Christian P.; Casella, George (2004). 9458:investigate the brain as a Bayesian mechanism. 6932: 6410:", from which the result immediately follows. 470: 12286: 12034: 11611: 11602: 11195: 10981:Robertson, B. and Vignaux, G. A. 
(1995) 9978: 8979:In frequentist statistics and decision theory 1394:, both in the numerator, affect the value of 303: 11479: 11408: 11372: 10314: 9099: 9079: 8962: 8890: 8883:corresponding to the discrete set of events 7793: 7721: 6704:(expected-posterior loss) with respect to a 6628: 6615: 5470:{\displaystyle P_{X}^{y}(A)=E(1_{A}(X)|Y=y)} 2867: 2854: 2828: 2815: 2782: 2760: 12177:"Bayesian approach to statistical problems" 11998: 11839:Scientific Reasoning: The Bayesian Approach 11699:Scientific Reasoning: the Bayesian Approach 11491:"A Bayesian mathematical statistics primer" 11426: 11341: 10840: 10706:Bayesian Computation with R, Second edition 10488: 10348:Hwang, J. T. & Casella, George (1982). 5878: 5797:). In fact, if the prior distribution is a 4550:) is the distribution of the observed data 4463:{\displaystyle p(\mathbf {X} \mid \theta )} 12331: 12293: 12279: 11817:Understanding Uncertainty, Revised Edition 11590:Bayesian Inference in Statistical Analysis 11504: 10703: 10615: 10585: 10459: 10431: 10218:"Introduction to Bayesian Decision Theory" 9877: 9238:Bioinformatics and healthcare applications 9055:Bayesian methodology also plays a role in 7456: 7452: 5811: 310: 296: 12944: 12068: 11751: 11688:Link to Fragmentary Edition of March 1996 11542:Parameter Estimation and Inverse Problems 11446: 11399: 11300: 11260: 11237: 11073: 11064:Howson & Urbach (2005), Jaynes (2003) 10930: 10912: 10846: 10801: 10764: 10746: 10688: 10537: 10368: 10332: 10299: 10158: 10072: 10031: 9947: 9937: 9922:"Improper priors and improper posteriors" 9846: 9312:the probability of guilt, but rather the 7482: 7432: 7385: 7307:{\displaystyle P(E\mid H_{2})=20/40=0.5.} 7245:{\displaystyle P(E\mid H_{1})=30/40=0.75} 6905: 6839: 6581: 6554: 6451:is not applicable. In this case there is 5866:Learn how and when to remove this message 3923: 12090:Probability Theory: The Logic of Science 11957:. Boca Raton, FL: Chapman and Hall/CRC. 11728:. Oxford University Press. p. 536. 11676:Probability Theory: The Logic of Science 11513:Pattern Recognition and Machine Learning 11485: 11414: 11381: 10970:Journal of the Royal Statistical Society 10956:Journal of the Royal Statistical Society 10670: 10054: 10013: 9880:Foundations of the Theory of Probability 9461:Bayesian inference in ecological studies 9381: 9299:explained Bayes' theorem to the jury in 9282: 9115: 7681: 5829:This section includes a list of general 4129:of the data point's distribution, i.e., 4070:Formal description of Bayesian inference 3093: 2497: 480:P(A|B) P(B) = P(B|A) P(A) i.e. P(A|B) = 474: 183:Integrated nested Laplace approximations 12098:Kendall's Advanced Theory of Statistics 11770: 11605:Formal Representation of Human Judgment 11432: 11347: 10708:. New York, Dordrecht, etc.: Springer. 10193:"Maximum A Posteriori (MAP) Estimation" 10134:; Keating, J. P.; Mason, R. L. (1993). 9793:"On the geometry of Bayesian inference" 9773:, Third Edition. Chapman and Hall/CRC. 
9689: 9120:Bayesian inference has applications in 8547:independent and identically distributed 6490:Estimates of parameters and predictions 6420:independent and identically distributed 5477:Existence and uniqueness of the needed 4047: 4007: 3999: 3953: 3945: 3928: 3916: 3908: 3894: 3886: 3859: 3851: 3816: 3808: 3788: 3764: 3756: 3724: 3708: 3681:{\displaystyle {\boldsymbol {\theta }}} 3674: 3651:{\displaystyle {\boldsymbol {\theta }}} 3644: 3619: 3570:independent and identically distributed 3495:{\displaystyle {\boldsymbol {\alpha }}} 3488: 3463: 3455: 3434:{\displaystyle {\boldsymbol {\theta }}} 3427: 3412:{\displaystyle {\boldsymbol {\theta }}} 3405: 3100:independent and identically distributed 2579:{\displaystyle E_{n},\ n=1,2,3,\ldots } 14: 14599: 14257:Kaplan–Meier estimator (product limit) 11510: 11037: 10402: 9451:Bayesian tool for methylation analysis 7677: 5497: 4414:{\displaystyle p(\theta \mid \alpha )} 4164:{\displaystyle x\sim p(x\mid \theta )} 2493: 433:dynamic analysis of a sequence of data 14330: 13897: 13644: 12943: 12713: 12330: 12274: 11981:Bayesian Data Analysis, Third Edition 10855: 10244: 10061:The Annals of Mathematical Statistics 10020:The Annals of Mathematical Statistics 9605:Bayesian approaches to brain function 9456:Bayesian approaches to brain function 4376: 4203:of the parameter distribution, i.e., 795:whose probability may be affected by 528: 14567: 14267:Accelerated failure time (AFT) model 12202:from Queen Mary University of London 12200:Introduction to Bayesian probability 12096:O'Hagan, A. and Forster, J. (2003). 11942:. Fourth Edition (2012), John Wiley 11940:Bayesian Statistics: An Introduction 9892: 9483:Bayesian inference in motor learning 9104:See the separate Knowledge entry on 5815: 2686:are specified to define the models. 883:, the current evidence, is observed. 14579: 13862:Analysis of variance (ANOVA, anova) 12714: 12242:Stanford Encyclopedia of Philosophy 11906:Introduction to Bayesian Statistics 10985:. John Wiley and Sons. Chichester. 10671:Fienberg, Stephen E. (2006-03-01). 9896:Probability based on Radon measures 9832: 9542:and used it to address problems in 9442:is used to search for lost objects. 9428:conditional on new observations or 9253: 6467: 5111:be the conditional distribution of 4334:{\displaystyle x_{1},\ldots ,x_{n}} 2753:. Before the first inference step, 1985:This focuses attention on the term 24: 13957:Cochran–Mantel–Haenszel statistics 12583:Pearson product-moment correlation 11890:(third ed.). Addison-Wesley. 11862:Statistics: A Bayesian Perspective 11744: 11722:Journal of Experimental Psychology 9926:Scandinavian Journal of Statistics 9038: 6530: 6079: 5835:it lacks sufficient corresponding 4483: 2679:{\displaystyle P(E_{n}\mid M_{m})} 2597: 2509: 2285: 2255:{\displaystyle {\tfrac {1}{P(H)}}} 2047: 1960: 1913: 1898: 1782: 1693: 1678: 1439: 25: 14633: 12169: 12141:, San Mateo, CA: Morgan Kaufmann. 10467:. Chapman and Hall. p. 433. 10439:. Chapman and Hall. p. 432. 10321:Annals of Mathematical Statistics 10288:Annals of Mathematical Statistics 10269:Bickel & Doksum (2001, p. 32) 7163:{\displaystyle P(H_{1})=P(H_{2})} 6717:posterior predictive distribution 6498:from the posterior distribution. 6414:Asymptotic behaviour of posterior 5791:compound probability distribution 5505:posterior predictive distribution 5204:. 
The joint distribution is then the product of the individual conditional probabilities of the evidence variables, since these are treated as independent given c. Assume a uniform prior of f_C(c) = 0.2, and that trials are independent and identically distributed. When a new fragment of type e is discovered, Bayes' theorem is applied to update the degree of belief in each value of c:

    f_{C}(c\mid E=e)={\frac {P(E=e\mid C=c)}{P(E=e)}}\,f_{C}(c)={\frac {P(E=e\mid C=c)}{\int P(E=e\mid C=c)\,f_{C}(c)\,dc}}\,f_{C}(c).

Repeating this update as each new fragment is examined gives the degree of belief after all of the observations. In a simulation in which the data are generated with c = 15.2, the posterior concentrates around that value as fragments accumulate. The Bernstein–von Mises theorem asserts the asymptotic convergence of the posterior to the "true" value here, because the probability space corresponding to the discrete set of possible evidence events is finite.
9858: 9854: 9849: 9844: 9840: 9836: 9829: 9821: 9817: 9813: 9809: 9805: 9801: 9794: 9787: 9780: 9776: 9772: 9766: 9759: 9758:0-19-824860-1 9755: 9751: 9747: 9742: 9727: 9721: 9713: 9709: 9705: 9701: 9697: 9693: 9686: 9678: 9677: 9672: 9666: 9662: 9646: 9643: 9641: 9638: 9636: 9633: 9631: 9628: 9626: 9623: 9621: 9618: 9616: 9613: 9611: 9608: 9606: 9603: 9602: 9595: 9593: 9588: 9583: 9580: 9576: 9571: 9569: 9565: 9561: 9557: 9553: 9552:jurisprudence 9549: 9545: 9541: 9537: 9533: 9529: 9523: 9513: 9511: 9507: 9503: 9490: 9486: 9484: 9481: 9479: 9476: 9474: 9470: 9466: 9463: 9460: 9457: 9454: 9452: 9449: 9447: 9444: 9441: 9438: 9435: 9431: 9427: 9423: 9419: 9418: 9412: 9410: 9409:falsification 9406: 9403:as any other 9402: 9398: 9394: 9390: 9388: 9379: 9377: 9373: 9369: 9365: 9361: 9357: 9353: 9349: 9341: 9338: 9335: 9332: 9329: 9326: 9325: 9324: 9322: 9319: 9315: 9311: 9306: 9304: 9303: 9298: 9293: 9285: 9281: 9279: 9275: 9271: 9267: 9261: 9251: 9249: 9245: 9235: 9233: 9229: 9225: 9222:. Given some 9221: 9217: 9213: 9209: 9208:Occam's Razor 9205: 9200: 9196: 9194: 9190: 9186: 9182: 9178: 9174: 9170: 9166: 9162: 9157: 9155: 9154:phylogenetics 9151: 9147: 9143: 9139: 9135: 9131: 9127: 9123: 9113: 9111: 9107: 9092: 9087: 9077: 9075: 9071: 9067: 9063: 9058: 9052: 9046: 9033: 9030: 9027: 9024: 9021: 9020: 9019: 9017: 9013: 9009: 9005: 9000: 8998: 8994: 8990: 8986: 8976: 8953: 8941: 8935: 8932: 8923: 8917: 8908: 8902: 8899: 8896: 8893: 8882: 8878: 8862: 8859: 8856: 8847: 8831: 8823: 8819: 8811: 8808: 8802: 8794: 8790: 8783: 8780: 8777: 8774: 8771: 8768: 8765: 8759: 8753: 8748: 8744: 8735: 8732: 8729: 8726: 8723: 8720: 8717: 8711: 8705: 8699: 8691: 8687: 8677: 8674: 8671: 8665: 8657: 8654: 8651: 8648: 8645: 8642: 8639: 8633: 8627: 8621: 8618: 8615: 8612: 8609: 8601: 8597: 8576: 8556: 8548: 8532: 8529: 8523: 8515: 8511: 8501: 8482: 8479: 8476: 8467: 8464: 8461: 8456: 8453: 8450: 8444: 8441: 8429: 8426: 8423: 8414: 8411: 8408: 8403: 8400: 8397: 8391: 8385: 8382: 8379: 8370: 8364: 8361: 8358: 8355: 8346: 8334: 8328: 8325: 8319: 8294: 8291: 8288: 8279: 8276: 8273: 8268: 8265: 8262: 8256: 8253: 8241: 8238: 8235: 8226: 8223: 8220: 8215: 8212: 8209: 8203: 8197: 8194: 8191: 8182: 8176: 8173: 8170: 8167: 8164: 8155: 8149: 8146: 8140: 8115: 8112: 8109: 8100: 8097: 8094: 8089: 8086: 8083: 8077: 8074: 8062: 8059: 8056: 8047: 8044: 8041: 8036: 8033: 8030: 8024: 8021: 8015: 8009: 8006: 8003: 8000: 7991: 7985: 7982: 7979: 7973: 7948: 7945: 7942: 7933: 7930: 7927: 7922: 7919: 7916: 7910: 7907: 7895: 7892: 7889: 7880: 7877: 7874: 7869: 7866: 7863: 7857: 7854: 7848: 7842: 7839: 7836: 7833: 7830: 7827: 7824: 7821: 7815: 7807: 7784: 7772: 7766: 7763: 7754: 7748: 7739: 7733: 7730: 7727: 7724: 7701: 7692: 7684: 7675: 7658: 7655: 7650: 7646: 7639: 7614: 7610: 7603: 7594: 7577: 7574: 7572: 7554: 7551: 7548: 7545: 7542: 7539: 7536: 7531: 7528: 7525: 7519: 7517: 7494: 7490: 7483: 7474: 7470: 7466: 7463: 7457: 7453: 7444: 7440: 7433: 7424: 7420: 7416: 7413: 7407: 7397: 7393: 7386: 7377: 7373: 7369: 7366: 7360: 7354: 7352: 7344: 7341: 7336: 7332: 7325: 7301: 7298: 7295: 7291: 7287: 7284: 7276: 7272: 7268: 7265: 7259: 7239: 7236: 7233: 7229: 7225: 7222: 7214: 7210: 7206: 7203: 7197: 7177: 7152: 7148: 7141: 7138: 7130: 7126: 7119: 7097: 7093: 7070: 7066: 7056: 7047: 7040: 7036: 7032: 7028: 7025: 7022: 7019: 7018: 7014: 7011: 7008: 7006: 7002: 7001: 6998: 6995: 6992: 6990: 6987: 6985: 6981: 6980: 6975: 6968: 6963: 6958: 6953: 6944: 6943: 6940: 6925: 6912: 6909: 6906: 6899: 6896: 6888: 6885: 6879: 6873: 6870: 6861: 6852: 6849: 
6846: 6843: 6840: 6833: 6830: 6822: 6819: 6816: 6807: 6798: 6795: 6792: 6786: 6783: 6764: 6755: 6729: 6718: 6713: 6711: 6707: 6706:loss function 6703: 6702: 6696: 6694: 6689: 6676: 6670: 6667: 6659: 6656: 6650: 6645: 6637: 6634: 6631: 6619: 6607: 6605: 6598: 6585: 6582: 6575: 6572: 6564: 6561: 6555: 6551: 6548: 6545: 6539: 6533: 6527: 6518: 6506: 6504: 6499: 6497: 6487: 6485: 6481: 6475: 6465: 6462: 6458: 6454: 6453:almost surely 6450: 6446: 6441: 6437: 6433: 6429: 6425: 6421: 6411: 6397: 6394: 6388: 6385: 6382: 6376: 6373: 6370: 6350: 6347: 6341: 6335: 6332: 6329: 6309: 6289: 6280: 6266: 6263: 6257: 6249: 6243: 6223: 6220: 6214: 6208: 6188: 6185: 6179: 6173: 6153: 6150: 6144: 6141: 6138: 6132: 6112: 6109: 6103: 6097: 6087: 6077: 6060: 6054: 6051: 6045: 6042: 6039: 6033: 6027: 6024: 6015: 6009: 6001: 5998: 5995: 5989: 5963: 5957: 5954: 5948: 5945: 5942: 5936: 5930: 5927: 5918: 5912: 5904: 5901: 5898: 5892: 5870: 5867: 5859: 5856:February 2012 5849: 5845: 5839: 5838: 5832: 5827: 5818: 5817: 5809: 5806: 5804: 5800: 5796: 5793:(as does the 5792: 5787: 5785: 5781: 5777: 5774:with unknown 5773: 5769: 5765: 5760: 5758: 5754: 5750: 5746: 5741: 5737: 5718: 5715: 5709: 5706: 5703: 5697: 5691: 5688: 5679: 5670: 5667: 5664: 5658: 5655: 5646: 5637: 5629: 5625: 5611: 5608: 5602: 5599: 5591: 5588: 5582: 5576: 5573: 5564: 5555: 5552: 5549: 5543: 5540: 5532: 5523: 5514: 5506: 5502: 5501: 5495: 5493: 5488: 5484: 5480: 5461: 5458: 5455: 5444: 5436: 5432: 5425: 5422: 5416: 5408: 5403: 5399: 5375: 5372: 5369: 5349: 5327: 5322: 5318: 5294: 5291: 5283: 5279: 5272: 5269: 5261: 5256: 5252: 5248: 5242: 5239: 5236: 5233: 5230: 5222: 5219: 5216: 5212: 5191: 5169: 5165: 5144: 5141: 5138: 5118: 5096: 5091: 5087: 5078: 5063: 5021: 4998: 4995: 4987: 4984: 4978: 4970: 4955: 4949: 4946: 4943: 4937: 4931: 4928: 4925: 4922: 4911: 4908: 4899: 4896: 4885: 4877: 4874: 4871: 4865: 4859: 4856: 4853: 4850: 4839: 4833: 4824: 4818: 4812: 4809: 4798: 4790: 4787: 4784: 4778: 4772: 4769: 4766: 4763: 4752: 4746: 4737: 4734: 4723: 4715: 4712: 4704: 4701: 4695: 4689: 4683: 4680: 4672: 4669: 4663: 4655: 4651: 4647: 4631: 4628: 4625: 4619: 4616: 4613: 4607: 4601: 4598: 4587: 4584: 4581: 4575: 4572: 4561: 4553: 4549: 4545: 4541: 4523: 4520: 4509: 4506: 4495: 4492: 4486: 4475: 4454: 4451: 4440: 4431: 4427: 4424: 4405: 4402: 4399: 4393: 4385: 4381: 4380: 4353: 4343: 4326: 4322: 4318: 4315: 4312: 4307: 4303: 4282: 4253: 4250: 4229: 4226: 4223: 4217: 4214: 4211: 4202: 4186: 4179: 4176: 4155: 4152: 4149: 4143: 4140: 4137: 4128: 4112: 4105: 4102: 4086: 4079: 4078: 4067: 4054: 4043: 4038: 4034: 4027: 4022: 4018: 4014: 4003: 3995: 3984: 3960: 3949: 3938: 3935: 3924: 3912: 3901: 3890: 3882: 3871: 3868: 3855: 3847: 3836: 3830: 3828: 3812: 3801: 3798: 3784: 3773: 3760: 3752: 3741: 3735: 3733: 3720: 3712: 3701: 3689: 3665: 3661: 3615: 3612: 3606: 3584: 3580: 3571: 3550: 3546: 3542: 3539: 3536: 3531: 3527: 3520: 3507: 3506: 3459: 3448: 3395: 3386: 3373: 3366: 3363: 3358: 3354: 3347: 3341: 3337: 3333: 3327: 3324: 3313: 3293: 3287: 3281: 3278: 3266: 3262: 3255: 3247: 3243: 3239: 3228: 3222: 3218: 3209: 3206: 3195: 3189: 3178: 3175: 3169: 3144: 3140: 3136: 3133: 3130: 3125: 3121: 3114: 3102:observations 3101: 3091: 3088: 3075: 3069: 3063: 3060: 3048: 3044: 3037: 3029: 3025: 3021: 3018: 3012: 3006: 3002: 2993: 2990: 2987: 2981: 2975: 2969: 2966: 2963: 2957: 2949: 2947: 2928: 2925: 2922: 2916: 2893: 2887: 2862: 2858: 2851: 2848: 2823: 2819: 2812: 2809: 2800: 2798: 2774: 2770: 2763: 2738: 2734: 2725: 2704: 2700: 2693: 2668: 2664: 2660: 2655: 2651: 2644: 2622: 
2618: 2589: 2573: 2570: 2567: 2564: 2561: 2558: 2555: 2552: 2549: 2543: 2538: 2534: 2500: 2491: 2483: 2481: 2477: 2472: 2470: 2466: 2462: 2454: 2441: 2435: 2429: 2423: 2420: 2417: 2411: 2408: 2402: 2396: 2390: 2387: 2384: 2378: 2375: 2369: 2366: 2363: 2357: 2349: 2344: 2323: 2317: 2314: 2308: 2305: 2302: 2296: 2288: 2282: 2279: 2273: 2242: 2236: 2232: 2206: 2200: 2176: 2173: 2170: 2164: 2141: 2138: 2112: 2109: 2106: 2100: 2080: 2070: 2067: 2064: 2058: 2050: 2044: 2041: 2035: 2027: 2023: 2020: 2010: 2004: 2000: 1993: 1972: 1969: 1963: 1954: 1951: 1945: 1939: 1916: 1907: 1901: 1895: 1892: 1886: 1883: 1877: 1871: 1865: 1862: 1859: 1853: 1850: 1844: 1838: 1805: 1802: 1799: 1793: 1785: 1779: 1776: 1770: 1763: 1759: 1756: 1747: 1741: 1737: 1731: 1727: 1724: 1720: 1715: 1713: 1696: 1687: 1681: 1675: 1672: 1666: 1663: 1657: 1651: 1645: 1642: 1639: 1633: 1625: 1619: 1613: 1610: 1607: 1601: 1595: 1593: 1576: 1570: 1562: 1556: 1550: 1547: 1544: 1538: 1532: 1530: 1522: 1519: 1516: 1510: 1486: 1478: 1462: 1442: 1430: 1413: 1410: 1407: 1401: 1378: 1375: 1372: 1366: 1343: 1337: 1317: 1293: 1289: 1285: 1265: 1262: 1256: 1250: 1243: 1228: 1220: 1201: 1195: 1188: 1173: 1153: 1133: 1113: 1105: 1104: 1087: 1080: 1065: 1042: 1039: 1036: 1030: 1023: 1020: 1004: 997: 981: 974: 959: 951: 950: 930: 927: 924: 918: 911: 908: 892: 885: 870: 862: 847: 839: 838: 818: 812: 805: 802: 798: 794: 778: 771: 770: 769: 755: 746: 740: 732: 726: 723: 717: 714: 711: 705: 699: 693: 690: 687: 681: 673: 669: 665: 661: 657: 653: 649: 639: 633: 630: 629: 625: 622: 613: 606: 605:P(H|¬E)·P(¬E) 598: 597: 594: 591: 587: 584: 582:= P(E|H)·P(H) 580: 577: 572: 571: 566: 557: 550: 540: 539: 536: 525: 519: 501:P(B|¬A) P(¬A) 477: 468: 466: 462: 458: 454: 450: 446: 442: 438: 434: 430: 426: 422: 418: 414: 410: 406: 402: 398: 397: 388: 364: 363: 354: 324: 313: 308: 306: 301: 299: 294: 293: 291: 290: 285: 280: 275: 274: 273: 272: 267: 264: 262: 259: 257: 254: 253: 252: 251: 247: 246: 241: 238: 236: 233: 232: 231: 230: 226: 225: 220: 217: 215: 212: 210: 207: 206: 205: 204: 200: 199: 194: 191: 189: 186: 184: 181: 179: 176: 174: 171: 170: 169: 168: 164: 163: 158: 155: 153: 150: 148: 145: 143: 140: 139: 138: 137: 133: 132: 127: 124: 122: 119: 117: 114: 112: 109: 107: 106:Cox's theorem 104: 102: 99: 97: 94: 92: 89: 87: 84: 82: 79: 78: 77: 76: 72: 71: 68: 64: 60: 56: 53: 52: 48: 44: 43: 40: 37: 36: 32: 31: 19: 14583: 14571: 14552: 14545: 14457:Econometrics 14407: / 14390:Chemometrics 14367:Epidemiology 14360: / 14333:Applications 14175:ARIMA model 14122:Q-statistic 14071:Stationarity 13967:Multivariate 13910: / 13906: / 13904:Multivariate 13902: / 13842: / 13838: / 13612:Bayes factor 13581: 13511:Signed rank 13423: 13397: 13389: 13377: 13072:Completeness 12908:Cohort study 12806:Opinion poll 12741:Missing data 12728:Study design 12683:Scatter plot 12605:Scatter plot 12598:Spearman's ρ 12560:Grouped data 12260: 12241: 12180: 12138: 12135:Pearl, Judea 12117: 12101: 12097: 12089: 12070: 12054: 12043: 12009: 11980: 11954: 11939: 11920: 11905: 11885: 11861: 11838: 11835:Colin Howson 11816: 11799: 11776: 11753: 11725: 11721: 11698: 11675: 11665: 11629: 11625: 11622:Amos Tversky 11604: 11589: 11564: 11541: 11512: 11506: 11497: 11481: 11438: 11434: 11428: 11419: 11410: 11394:(1): 1–40 . 
11391: 11387: 11355: 11343: 11292: 11288: 11282: 11273: 11269: 11256: 11205: 11201: 11191: 11150: 11146: 11139: 11106: 11102: 11096: 11079: 11075: 11069: 11060: 11040: 11033: 11025:Significance 11023: 11018: 10998: 10982: 10977: 10969: 10964: 10954: 10949: 10904: 10900: 10890: 10879:. Retrieved 10875: 10866: 10857: 10836: 10793: 10789: 10783: 10738: 10734: 10724: 10705: 10699: 10680: 10676: 10666: 10625: 10621: 10611: 10600:. Retrieved 10596:the original 10591: 10581: 10572: 10529: 10525: 10519: 10497:(4): 36–47. 10494: 10490: 10484: 10464: 10455: 10436: 10427: 10407: 10398: 10388: 10379: 10360: 10356: 10343: 10324: 10320: 10310: 10291: 10287: 10274: 10253: 10240: 10229:the original 10224: 10216:Yu, Angela. 10211: 10200:. Retrieved 10196: 10187: 10150: 10144: 10135: 10126: 10101: 10097: 10091: 10064: 10060: 10050: 10023: 10019: 10009: 9985:. Springer. 9981: 9974: 9929: 9925: 9915: 9895: 9888: 9879: 9873: 9838: 9834: 9828: 9803: 9799: 9786: 9770: 9765: 9749: 9741: 9730:. Retrieved 9720: 9695: 9691: 9685: 9674: 9665: 9615:Epistemology 9584: 9578: 9574: 9572: 9532:Thomas Bayes 9527: 9525: 9505: 9499: 9469:econophysics 9397:David Miller 9391: 9385: 9371: 9367: 9363: 9359: 9355: 9351: 9347: 9345: 9339: 9333: 9327: 9313: 9309: 9307: 9300: 9294: 9290: 9274:betting odds 9263: 9241: 9231: 9227: 9223: 9219: 9215: 9211: 9197: 9181:SpamAssassin 9158: 9141: 9119: 9103: 9095:Applications 9089: 9074:Bayes factor 9070:prior belief 9054: 9001: 8989:Abraham Wald 8982: 8848: 8502: 7808: 7693: 7689: 7595: 7057: 7053: 7045: 7038: 7034: 7004: 6996: 6988: 6983: 6966: 6956: 6714: 6699: 6697: 6690: 6603: 6599: 6507: 6500: 6493: 6477: 6417: 6281: 6089: 5882: 5862: 5853: 5834: 5807: 5788: 5761: 5733: 5391: 4552:marginalized 4547: 3690: 3503: 3396: 3392: 3097: 3089: 2950: 2880:, the prior 2801: 2796: 2795:is a set of 2525: 2489: 2473: 2463: 2460: 2345: 1431: 1309: 1101: 1078: 1018: 995: 972: 947: 906: 860: 835: 800: 792: 645: 618: 611: 604: 592: 586:P(¬H|E)·P(E) 585: 578: 573:Has evidence 322: 321: 256:Bayes factor 80: 14585:WikiProject 14500:Cartography 14462:Jurimetrics 14414:Reliability 14145:Time domain 14124:(Ljung–Box) 14046:Time-series 13924:Categorical 13908:Time-series 13900:Categorical 13835:(Bernoulli) 13670:Correlation 13650:Correlation 13446:Jarque–Bera 13418:Chi-squared 13180:M-estimator 13133:Asymptotics 13077:Sufficiency 12844:Interaction 12756:Replication 12736:Effect size 12693:Violin plot 12673:Radar chart 12653:Forest plot 12643:Correlogram 12593:Kendall's τ 11864:. Duxbury. 11618:Paul Slovic 11351:"Chapter 3" 10327:: 270–283. 10254:stat.sc.edu 9548:reliability 9430:experiments 9393:Karl Popper 9318:frequentist 9316:(akin to a 9165:e-mail spam 9134:Monte Carlo 6608:estimates: 6484:closed form 5848:introducing 5738:, i.e., to 4654:Bayes' rule 4075:Definitions 2841:. For each 2465:Ian Hacking 2128:, is about 656:antecedents 652:consequence 599:No evidence 579:P(H|E)·P(E) 485:P(B|A) P(A) 441:engineering 413:information 14601:Categories 14452:Demography 14170:ARMA model 13975:Regression 13552:(Friedman) 13513:(Wilcoxon) 13451:Normality 13441:Lilliefors 13388:Student's 13264:Resampling 13138:Robustness 13126:divergence 13116:Efficiency 13054:(monotone) 13049:Likelihood 12966:Population 12799:Stratified 12751:Population 12570:Dependence 12526:Count data 12457:Percentile 12434:Dispersion 12367:Arithmetic 12302:Statistics 12137:. (1988). 11796:Elementary 11695:Howson, C. 11550:0123850487 10881:2019-08-11 10602:2020-01-02 10539:1902.05809 10461:Cox, D. R. 10433:Cox, D. R. 10280:Kiefer, J. 
10202:2017-06-02 10001:1159112760 9848:2008.01006 9732:2014-01-05 9698:(4): 316. 9671:"Bayesian" 9652:References 9579:subjective 9530:refers to 9426:hypotheses 9177:Bogofilter 9148:and other 9140:structure 9049:See also: 8997:admissible 8993:admissible 6443:countable 5831:references 5487:Kolmogorov 4474:likelihood 4103:of values. 2586:, but the 2469:Dutch book 1103:likelihood 793:hypothesis 560:hypothesis 553:hypothesis 542:Hypothesis 522:See also: 445:philosophy 425:statistics 201:Estimators 73:Background 59:Likelihood 13833:Logistic 13600:posterior 13526:Rank sum 13274:Jackknife 13269:Bootstrap 13087:Bootstrap 13022:Parameter 12971:Statistic 12766:Statistic 12678:Run chart 12663:Pie chart 12658:Histogram 12648:Fan chart 12623:Bar chart 12505:L-moments 12392:Geometric 12187:EMS Press 11662:143452957 11592:, Wiley, 11473:120094454 11443:CiteSeerX 11302:1104.3448 11230:0001-1541 11183:131588159 11175:1099-1085 11123:1939-5582 10923:1097-4172 10843:CiteSeerX 10803:0709.1516 10748:1105.5721 10564:104419861 10532:: 81–91. 10155:CiteSeerX 10118:120767108 9966:237736986 9958:0303-6898 9865:220935477 9657:Citations 9575:objective 9526:The term 9302:R v Adams 9270:odds form 9185:SpamBayes 8957:¯ 8945:¯ 8927:¯ 8912:¯ 8775:∣ 8745:∫ 8727:∣ 8649:∣ 8613:∣ 8480:− 8465:− 8454:− 8427:− 8412:− 8401:− 8392:− 8383:− 8356:∣ 8350:¯ 8338:¯ 8292:− 8277:− 8266:− 8257:− 8239:− 8224:− 8213:− 8204:− 8195:− 8168:∣ 8159:¯ 8113:− 8098:− 8087:− 8060:− 8045:− 8034:− 8001:∣ 7995:¯ 7946:− 7931:− 7920:− 7911:− 7893:− 7878:− 7867:− 7834:∣ 7788:¯ 7776:¯ 7758:¯ 7743:¯ 7656:∣ 7552:× 7540:× 7529:× 7467:∣ 7417:∣ 7370:∣ 7342:∣ 7269:∣ 7207:∣ 6910:θ 6900:α 6889:∣ 6886:θ 6874:θ 6871:∣ 6865:~ 6850:∫ 6844:θ 6834:α 6823:∣ 6820:θ 6811:~ 6796:∫ 6787:α 6768:~ 6733:~ 6671:α 6660:∣ 6657:θ 6646:θ 6638:⁡ 6632:⊂ 6620:θ 6586:θ 6576:α 6565:∣ 6562:θ 6552:θ 6549:∫ 6540:θ 6534:⁡ 6522:~ 6519:θ 6386:∣ 6374:− 6333:− 6142:∣ 6043:∣ 6031:⇒ 5999:∣ 5946:∣ 5934:⇒ 5902:∣ 5719:θ 5710:α 5707:∣ 5704:θ 5692:θ 5689:∣ 5683:~ 5668:∫ 5659:α 5656:∣ 5650:~ 5612:θ 5603:α 5592:∣ 5589:θ 5577:θ 5574:∣ 5568:~ 5553:∫ 5544:α 5533:∣ 5527:~ 5064:θ 5022:θ 4999:α 4988:∣ 4985:θ 4950:α 4947:∣ 4944:θ 4932:α 4926:θ 4923:∣ 4909:∝ 4900:α 4897:∣ 4878:α 4875:∣ 4872:θ 4860:α 4854:θ 4851:∣ 4825:α 4813:α 4810:∣ 4791:α 4785:θ 4773:α 4767:θ 4764:∣ 4738:α 4716:α 4702:θ 4684:α 4673:∣ 4670:θ 4629:θ 4620:α 4617:∣ 4614:θ 4602:θ 4599:∣ 4585:∫ 4576:α 4573:∣ 4524:θ 4521:∣ 4496:∣ 4493:θ 4487:⁡ 4455:θ 4452:∣ 4406:α 4403:∣ 4400:θ 4357:~ 4316:… 4230:α 4227:∣ 4224:θ 4215:∼ 4212:θ 4187:α 4156:θ 4153:∣ 4141:∼ 4127:parameter 4113:θ 4048:θ 4044:∣ 4019:∏ 4008:α 4000:θ 3996:∣ 3954:α 3950:∣ 3946:θ 3936:⋅ 3929:θ 3917:α 3913:∣ 3909:θ 3895:α 3887:θ 3883:∣ 3869:∫ 3860:α 3852:θ 3848:∣ 3817:α 3813:∣ 3809:θ 3799:⋅ 3789:α 3785:∣ 3765:α 3757:θ 3753:∣ 3725:α 3713:∣ 3709:θ 3675:θ 3645:θ 3636:for some 3620:θ 3616:∣ 3540:… 3489:α 3464:α 3460:∣ 3456:θ 3428:θ 3406:θ 3364:∣ 3338:∏ 3325:∣ 3279:⋅ 3240:∣ 3219:∑ 3207:∣ 3179:∣ 3134:… 3061:⋅ 3022:∣ 3003:∑ 2991:∣ 2967:∣ 2926:∣ 2852:∈ 2813:∈ 2661:∣ 2598:Ω 2574:… 2510:Ω 2421:∣ 2388:∣ 2367:∩ 2315:⋅ 2306:∣ 2286:¬ 2283:∣ 2174:∣ 2110:∣ 2068:∣ 2048:¬ 2045:∣ 2021:− 1961:¬ 1914:¬ 1899:¬ 1896:∣ 1863:∣ 1803:∣ 1783:¬ 1780:∣ 1757:− 1694:¬ 1679:¬ 1676:∣ 1643:∣ 1611:∣ 1548:∣ 1520:∣ 1440:¬ 1411:∣ 1376:∣ 1040:∣ 928:∣ 863:the data 724:⋅ 715:∣ 691:∣ 551:Satisfies 403:in which 101:Coherence 55:Posterior 14547:Category 14240:Survival 14117:Johansen 13840:Binomial 13795:Isotonic 13382:(normal) 13027:location 12834:Blocking 12789:Sampling 12668:Q–Q plot 12633:Box plot 12615:Graphics 12510:Skewness 
12500:Kurtosis 12472:Variance 12402:Heronian 12397:Harmonic 12214:Archived 12046:. Wiley. 12042:(1994). 12008:(1985). 11979:(2013). 11815:(2013). 11654:17835457 11607:. Wiley. 11588:(1973). 11489:(2006). 11335:11460968 11327:21928962 11248:27429455 11131:24640543 11008:Archived 10941:31280963 10650:26017444 10556:30991277 10511:17338979 10406:(1986). 10387:(1986). 9820:88521802 9712:14344339 9598:See also 9528:Bayesian 9358:and not- 9350:and not- 6928:Examples 6602:maximum 6457:Freedman 5780:variance 5757:variance 5157:and let 4645:applied. 4548:evidence 3480:, where 1831:because 1475:"), the 994:, i.e., 907:evidence 801:evidence 799:(called 558:Violates 547:Evidence 449:medicine 409:evidence 67:Evidence 14573:Commons 14520:Kriging 14405:Process 14362:studies 14221:Wavelet 14054:General 13221:Plug-in 13015:L space 12794:Cluster 12495:Moments 12313:Outline 12189:, 2001 12014:Bibcode 11678:, CUP. 11674:(2003) 11634:Bibcode 11626:Science 11534:Sources 11498:Icots-7 11465:2082155 11307:Bibcode 11239:4946376 11210:Bibcode 11155:Bibcode 10932:7380118 10828:1500830 10808:Bibcode 10775:2499910 10753:Bibcode 10735:Entropy 10630:Bibcode 10083:2238150 10042:2238346 9748:(1989) 9516:History 9321:p-value 9189:Mozilla 7003:Choc, ¬ 6982:Plain, 6363:, then 6236:, then 5844:improve 5740:predict 2944:. From 2722:is the 662:and a " 654:of two 619:P(¬E) = 510:⁠ 498:⁠ 494:⁠ 482:⁠ 437:science 362:-zee-ən 14442:Census 14032:Normal 13980:Manova 13800:Robust 13550:2-way 13542:1-way 13380:-test 13051:  12628:Biplot 12419:Median 12412:Lehmer 12354:Center 12162:  12152:  12124:  12108:  12077:  12061:  12024:  11987:  11961:  11946:  11927:  11912:  11894:  11868:  11849:  11823:  11784:  11760:  11732:  11709:  11682:  11660:  11652:  11596:  11571:  11556:  11548:  11519:  11471:  11463:  11445:  11362:  11333:  11325:  11246:  11236:  11228:  11181:  11173:  11129:  11121:  11048:  10989:  10939:  10929:  10921:  10872:"CIRI" 10845:  10826:  10773:  10712:  10658:216356 10656:  10648:  10622:Nature 10562:  10554:  10509:  10471:  10443:  10414:  10175:  10157:  10116:  10081:  10040:  9999:  9989:  9964:  9956:  9903:  9863:  9818:  9777:  9756:  9710:  9564:infers 9550:, and 9169:CRM114 7568:  7513:  7020:Total 6977:Total 6950:Cookie 5833:, but 5362:given 5131:given 4249:vector 4199:, the 4175:vector 4125:, the 4101:vector 3977:where 3508:. Let 3306:where 2547:  1455:("not 946:, the 905:, the 861:before 834:, the 768:where 631:Total 621:1−P(E) 568:Total 455:, and 14066:Trend 13595:prior 13537:anova 13426:-test 13400:-test 13392:-test 13299:Power 13244:Pivot 13037:shape 13032:scale 12482:Shape 12462:Range 12407:Heinz 12382:Cubic 12318:Index 12251:(PDF) 11658:S2CID 11494:(PDF) 11469:S2CID 11331:S2CID 11297:arXiv 11266:(PDF) 11179:S2CID 10824:S2CID 10798:arXiv 10771:S2CID 10743:arXiv 10683:(1). 10654:S2CID 10560:S2CID 10534:arXiv 10507:S2CID 10353:(PDF) 10250:(PDF) 10232:(PDF) 10221:(PDF) 10114:S2CID 10079:JSTOR 10038:JSTOR 9962:S2CID 9861:S2CID 9843:arXiv 9816:S2CID 9796:(PDF) 9708:S2CID 9415:Other 9272:, as 9173:DSPAM 6693:empty 6606:(MAP) 6166:. 
If 6125:then 3666:over 1126:with 1079:given 1019:given 996:after 973:given 650:as a 453:sport 396:-zhən 63:Prior 14299:Test 13499:Sign 13351:Wald 12424:Mode 12362:Mean 12160:ISBN 12150:ISBN 12122:ISBN 12106:ISBN 12075:ISBN 12059:ISBN 12022:ISBN 11985:ISBN 11959:ISBN 11944:ISBN 11925:ISBN 11910:ISBN 11892:ISBN 11866:ISBN 11847:ISBN 11821:ISBN 11782:ISBN 11758:ISBN 11730:ISBN 11707:ISBN 11680:ISBN 11650:PMID 11594:ISBN 11584:and 11569:ISBN 11554:ISBN 11546:ISBN 11517:ISBN 11360:ISBN 11323:PMID 11244:PMID 11226:ISSN 11171:ISSN 11127:PMID 11119:ISSN 11046:ISBN 10987:ISBN 10937:PMID 10919:ISSN 10901:Cell 10710:ISBN 10646:PMID 10552:PMID 10469:ISBN 10441:ISBN 10412:ISBN 10173:ISBN 9997:OCLC 9987:ISBN 9954:ISSN 9901:ISBN 9775:ISBN 9754:ISBN 9577:and 9420:The 9395:and 9370:and 9248:CIRI 9124:and 8863:15.2 8457:0.05 8404:0.01 8398:0.81 8386:0.01 8269:0.05 8216:0.01 8210:0.81 8198:0.01 8090:0.05 8037:0.01 8031:0.81 8022:0.01 7923:0.05 7870:0.01 7864:0.81 7855:0.01 7537:0.75 7526:0.75 7302:0.5. 7252:and 7240:0.75 6946:Bowl 6715:The 6701:risk 6459:and 6221:> 6201:and 5955:> 5928:> 5778:and 5776:mean 5766:and 5626:The 5503:The 4648:The 4542:The 4428:The 4382:The 1932:and 1359:and 1263:> 797:data 658:: a 593:P(E) 512:etc. 507:P(B) 491:P(B) 13479:BIC 13474:AIC 11642:doi 11630:185 11453:doi 11396:doi 11315:doi 11234:PMC 11218:doi 11163:doi 11111:doi 11084:doi 10927:PMC 10909:doi 10905:178 10816:doi 10794:384 10761:doi 10685:doi 10638:doi 10626:521 10544:doi 10530:201 10499:doi 10365:doi 10329:doi 10296:doi 10165:doi 10106:doi 10069:doi 10028:doi 9944:hdl 9934:doi 9853:doi 9808:doi 9700:doi 9310:not 9142:may 9006:as 8533:0.2 8451:0.5 8442:0.5 8263:0.5 8254:0.5 8084:0.5 8075:0.5 7917:0.5 7908:0.5 7578:0.6 7555:0.5 7549:0.5 7543:0.5 7532:0.5 7029:80 7015:30 6642:max 6635:arg 6624:MAP 6090:If 5751:or 5342:of 3441:be 2726:in 1479:of 601:¬E 467:". 457:law 411:or 394:BAY 365:or 360:BAY 14603:: 12185:, 12179:, 12053:, 12038:; 12020:. 11845:. 11705:. 11690:). 11664:. 11656:. 11648:. 11640:. 11628:. 11620:; 11616:; 11552:, 11496:. 11467:. 11461:MR 11459:. 11451:. 11439:19 11437:. 11390:. 11386:. 11374:^ 11354:. 11329:. 11321:. 11313:. 11305:. 11293:84 11291:. 11272:. 11268:. 11242:. 11232:. 11224:. 11216:. 11206:60 11204:. 11200:. 11177:. 11169:. 11161:. 11151:30 11149:. 11125:. 11117:. 11107:24 11105:. 11080:57 11078:. 11006:. 10935:. 10925:. 10917:. 10903:. 10899:. 10874:. 10822:. 10814:. 10806:. 10792:. 10769:. 10759:. 10751:. 10739:13 10737:. 10733:. 10679:. 10675:. 10652:. 10644:. 10636:. 10624:. 10620:. 10590:. 10558:. 10550:. 10542:. 10528:. 10505:. 10495:21 10493:. 10361:10 10359:. 10355:. 10325:40 10323:. 10319:. 10292:36 10290:. 10286:. 10262:^ 10252:. 10223:. 10195:. 10171:. 10163:. 10112:. 10102:95 10100:. 10077:. 10065:36 10063:. 10059:. 10036:. 10024:34 10022:. 10018:. 9995:. 9960:. 9952:. 9942:. 9930:49 9928:. 9924:. 9859:. 9851:. 9839:51 9837:. 9814:. 9804:14 9802:. 9798:. 9706:. 9696:34 9694:. 9673:. 9594:. 9570:. 9512:. 9378:. 9195:. 9187:, 9183:, 9179:, 9175:, 9171:, 9010:, 8983:A 8754:16 8749:11 8589:: 8483:11 8468:11 8462:16 8430:11 8415:11 8409:16 8295:11 8280:11 8274:16 8242:11 8227:11 8221:16 8116:11 8101:11 8095:16 8063:11 8048:11 8042:16 7949:11 7934:11 7928:16 7896:11 7881:11 7875:16 7296:40 7288:20 7234:40 7226:30 7026:40 7023:40 7012:20 7009:10 6997:50 6993:20 6989:30 6964:#2 6954:#1 6695:. 6505:. 6486:. 3688:: 3658:. 2948:: 2350:: 1973:1. 1306:.) 
674:: 640:1 575:E 562:¬H 451:, 447:, 443:, 439:, 384:ən 378:eɪ 338:eɪ 65:÷ 61:× 57:= 13424:G 13398:F 13390:t 13378:Z 13097:V 13092:U 12294:e 12287:t 12280:v 12130:. 12112:. 12093:. 12083:. 12065:. 12030:. 12016:: 11993:. 11967:. 11933:. 11900:. 11874:. 11855:. 11829:. 11790:. 11766:. 11738:. 11715:. 11686:( 11644:: 11636:: 11577:. 11525:. 11500:. 11475:. 11455:: 11404:. 11398:: 11392:1 11368:. 11337:. 11317:: 11309:: 11299:: 11274:2 11250:. 11220:: 11212:: 11185:. 11165:: 11157:: 11133:. 11113:: 11090:. 11086:: 11054:. 10993:. 10943:. 10911:: 10884:. 10851:. 10830:. 10818:: 10810:: 10800:: 10777:. 10763:: 10755:: 10745:: 10718:. 10693:. 10687:: 10681:1 10660:. 10640:: 10632:: 10605:. 10566:. 10546:: 10536:: 10513:. 10501:: 10479:) 10477:. 10449:. 10420:. 10373:. 10367:: 10337:. 10331:: 10304:. 10298:: 10256:. 10205:. 10181:. 10167:: 10120:. 10108:: 10085:. 10071:: 10044:. 10030:: 10003:. 9968:. 9946:: 9936:: 9909:. 9867:. 9855:: 9845:: 9822:. 9810:: 9781:. 9760:. 9735:. 9714:. 9702:: 9506:a 9372:C 9368:B 9364:C 9360:B 9356:A 9352:B 9348:A 9340:C 9334:B 9328:A 9232:x 9228:x 9224:p 9220:p 9216:x 9212:p 8963:} 8954:D 8942:G 8936:, 8933:D 8924:G 8918:, 8909:D 8903:G 8900:, 8897:D 8894:G 8891:{ 8860:= 8857:c 8835:) 8832:c 8829:( 8824:C 8820:f 8812:c 8809:d 8806:) 8803:c 8800:( 8795:C 8791:f 8787:) 8784:c 8781:= 8778:C 8772:e 8769:= 8766:E 8763:( 8760:P 8739:) 8736:c 8733:= 8730:C 8724:e 8721:= 8718:E 8715:( 8712:P 8706:= 8703:) 8700:c 8697:( 8692:C 8688:f 8681:) 8678:e 8675:= 8672:E 8669:( 8666:P 8661:) 8658:c 8655:= 8652:C 8646:e 8643:= 8640:E 8637:( 8634:P 8628:= 8625:) 8622:e 8619:= 8616:E 8610:c 8607:( 8602:C 8598:f 8577:c 8557:e 8530:= 8527:) 8524:c 8521:( 8516:C 8512:f 8489:) 8486:) 8477:c 8474:( 8445:+ 8439:( 8436:) 8433:) 8424:c 8421:( 8389:) 8380:1 8377:( 8374:( 8371:= 8368:) 8365:c 8362:= 8359:C 8347:D 8335:G 8329:= 8326:E 8323:( 8320:P 8301:) 8298:) 8289:c 8286:( 8251:( 8248:) 8245:) 8236:c 8233:( 8201:) 8192:1 8189:( 8186:( 8183:= 8180:) 8177:c 8174:= 8171:C 8165:D 8156:G 8150:= 8147:E 8144:( 8141:P 8122:) 8119:) 8110:c 8107:( 8078:+ 8072:( 8069:) 8066:) 8057:c 8054:( 8025:+ 8019:( 8016:= 8013:) 8010:c 8007:= 8004:C 7992:D 7986:G 7983:= 7980:E 7977:( 7974:P 7955:) 7952:) 7943:c 7940:( 7905:( 7902:) 7899:) 7890:c 7887:( 7858:+ 7852:( 7849:= 7846:) 7843:c 7840:= 7837:C 7831:D 7828:G 7825:= 7822:E 7819:( 7816:P 7794:} 7785:D 7773:G 7767:, 7764:D 7755:G 7749:, 7740:D 7734:G 7731:, 7728:D 7725:G 7722:{ 7702:C 7662:) 7659:E 7651:1 7647:H 7643:( 7640:P 7620:) 7615:1 7611:H 7607:( 7604:P 7575:= 7546:+ 7520:= 7500:) 7495:2 7491:H 7487:( 7484:P 7480:) 7475:2 7471:H 7464:E 7461:( 7458:P 7454:+ 7450:) 7445:1 7441:H 7437:( 7434:P 7430:) 7425:1 7421:H 7414:E 7411:( 7408:P 7403:) 7398:1 7394:H 7390:( 7387:P 7383:) 7378:1 7374:H 7367:E 7364:( 7361:P 7355:= 7348:) 7345:E 7337:1 7333:H 7329:( 7326:P 7299:= 7292:/ 7285:= 7282:) 7277:2 7273:H 7266:E 7263:( 7260:P 7237:= 7230:/ 7223:= 7220:) 7215:1 7211:H 7204:E 7201:( 7198:P 7178:E 7158:) 7153:2 7149:H 7145:( 7142:P 7139:= 7136:) 7131:1 7127:H 7123:( 7120:P 7098:2 7094:H 7071:1 7067:H 7046:E 7044:| 7042:1 7039:H 7037:( 7035:P 7005:E 6984:E 6970:2 6967:H 6960:1 6957:H 6913:. 6907:d 6903:) 6897:, 6893:X 6883:( 6880:p 6877:) 6862:x 6856:( 6853:p 6847:= 6841:d 6837:) 6831:, 6827:X 6817:, 6808:x 6802:( 6799:p 6793:= 6790:) 6784:, 6780:X 6775:| 6765:x 6759:( 6756:p 6730:x 6677:. 
6674:) 6668:, 6664:X 6654:( 6651:p 6629:} 6616:{ 6583:d 6579:) 6573:, 6569:X 6559:( 6556:p 6546:= 6543:] 6537:[ 6531:E 6528:= 6398:0 6395:= 6392:) 6389:E 6383:M 6380:( 6377:P 6371:1 6351:0 6348:= 6345:) 6342:M 6339:( 6336:P 6330:1 6310:M 6290:M 6267:1 6264:= 6261:) 6258:E 6254:| 6250:M 6247:( 6244:P 6224:0 6218:) 6215:E 6212:( 6209:P 6189:1 6186:= 6183:) 6180:M 6177:( 6174:P 6154:0 6151:= 6148:) 6145:E 6139:M 6136:( 6133:P 6113:0 6110:= 6107:) 6104:M 6101:( 6098:P 6064:) 6061:E 6058:( 6055:P 6052:= 6049:) 6046:M 6040:E 6037:( 6034:P 6028:1 6025:= 6019:) 6016:E 6013:( 6010:P 6005:) 6002:M 5996:E 5993:( 5990:P 5967:) 5964:E 5961:( 5958:P 5952:) 5949:M 5943:E 5940:( 5937:P 5931:1 5922:) 5919:E 5916:( 5913:P 5908:) 5905:M 5899:E 5896:( 5893:P 5869:) 5863:( 5858:) 5854:( 5840:. 5716:d 5713:) 5701:( 5698:p 5695:) 5680:x 5674:( 5671:p 5665:= 5662:) 5647:x 5641:( 5638:p 5609:d 5606:) 5600:, 5596:X 5586:( 5583:p 5580:) 5565:x 5559:( 5556:p 5550:= 5547:) 5541:, 5537:X 5524:x 5518:( 5515:p 5465:) 5462:y 5459:= 5456:Y 5452:| 5448:) 5445:X 5442:( 5437:A 5433:1 5429:( 5426:E 5423:= 5420:) 5417:A 5414:( 5409:y 5404:X 5400:P 5376:y 5373:= 5370:Y 5350:X 5328:y 5323:X 5319:P 5298:) 5295:x 5292:d 5289:( 5284:X 5280:P 5276:) 5273:y 5270:d 5267:( 5262:x 5257:Y 5253:P 5249:= 5246:) 5243:y 5240:d 5237:, 5234:x 5231:d 5228:( 5223:Y 5220:, 5217:X 5213:P 5192:X 5170:X 5166:P 5145:x 5142:= 5139:X 5119:Y 5097:x 5092:Y 5088:P 5043:X 5002:) 4996:, 4992:X 4982:( 4979:p 4956:. 4953:) 4941:( 4938:p 4935:) 4929:, 4919:X 4915:( 4912:p 4903:) 4893:X 4889:( 4886:p 4881:) 4869:( 4866:p 4863:) 4857:, 4847:X 4843:( 4840:p 4834:= 4828:) 4822:( 4819:p 4816:) 4806:X 4802:( 4799:p 4794:) 4788:, 4782:( 4779:p 4776:) 4770:, 4760:X 4756:( 4753:p 4747:= 4741:) 4735:, 4731:X 4727:( 4724:p 4719:) 4713:, 4709:X 4705:, 4699:( 4696:p 4690:= 4687:) 4681:, 4677:X 4667:( 4664:p 4632:. 4626:d 4623:) 4611:( 4608:p 4605:) 4595:X 4591:( 4588:p 4582:= 4579:) 4569:X 4565:( 4562:p 4539:. 4527:) 4517:X 4513:( 4510:p 4507:= 4504:) 4500:X 4490:( 4484:L 4470:. 4458:) 4448:X 4444:( 4441:p 4409:) 4397:( 4394:p 4354:x 4341:. 4327:n 4323:x 4319:, 4313:, 4308:1 4304:x 4283:n 4262:X 4245:. 4233:) 4221:( 4218:p 4171:. 4159:) 4150:x 4147:( 4144:p 4138:x 4087:x 4055:. 4052:) 4039:k 4035:e 4031:( 4028:p 4023:k 4015:= 4012:) 4004:, 3992:E 3988:( 3985:p 3961:, 3958:) 3942:( 3939:p 3925:d 3921:) 3905:( 3902:p 3899:) 3891:, 3879:E 3875:( 3872:p 3864:) 3856:, 3844:E 3840:( 3837:p 3831:= 3821:) 3805:( 3802:p 3793:) 3781:E 3777:( 3774:p 3769:) 3761:, 3749:E 3745:( 3742:p 3736:= 3729:) 3721:, 3717:E 3705:( 3702:p 3624:) 3613:e 3610:( 3607:p 3585:i 3581:e 3556:) 3551:n 3547:e 3543:, 3537:, 3532:1 3528:e 3524:( 3521:= 3517:E 3468:) 3452:( 3449:p 3374:. 3370:) 3367:M 3359:k 3355:e 3351:( 3348:P 3342:k 3334:= 3331:) 3328:M 3321:E 3317:( 3314:P 3294:, 3291:) 3288:M 3285:( 3282:P 3272:) 3267:m 3263:M 3259:( 3256:P 3253:) 3248:m 3244:M 3236:E 3232:( 3229:P 3223:m 3213:) 3210:M 3203:E 3199:( 3196:P 3190:= 3187:) 3183:E 3176:M 3173:( 3170:P 3150:) 3145:n 3141:e 3137:, 3131:, 3126:1 3122:e 3118:( 3115:= 3111:E 3076:. 
3073:) 3070:M 3067:( 3064:P 3054:) 3049:m 3045:M 3041:( 3038:P 3035:) 3030:m 3026:M 3019:E 3016:( 3013:P 3007:m 2997:) 2994:M 2988:E 2985:( 2982:P 2976:= 2973:) 2970:E 2964:M 2961:( 2958:P 2932:) 2929:E 2923:M 2920:( 2917:P 2897:) 2894:M 2891:( 2888:P 2868:} 2863:m 2859:M 2855:{ 2849:M 2829:} 2824:n 2820:E 2816:{ 2810:E 2783:} 2780:) 2775:m 2771:M 2767:( 2764:P 2761:{ 2739:m 2735:M 2710:) 2705:m 2701:M 2697:( 2694:P 2674:) 2669:m 2665:M 2656:n 2652:E 2648:( 2645:P 2623:m 2619:M 2571:, 2568:3 2565:, 2562:2 2559:, 2556:1 2553:= 2550:n 2544:, 2539:n 2535:E 2442:. 2439:) 2436:E 2433:( 2430:P 2427:) 2424:E 2418:H 2415:( 2412:P 2409:= 2406:) 2403:H 2400:( 2397:P 2394:) 2391:H 2385:E 2382:( 2379:P 2376:= 2373:) 2370:H 2364:E 2361:( 2358:P 2327:) 2324:H 2321:( 2318:P 2312:) 2309:H 2303:E 2300:( 2297:P 2292:) 2289:H 2280:E 2277:( 2274:P 2246:) 2243:H 2240:( 2237:P 2233:1 2210:) 2207:H 2204:( 2201:P 2180:) 2177:E 2171:H 2168:( 2165:P 2142:2 2139:1 2116:) 2113:E 2107:H 2104:( 2101:P 2081:. 2074:) 2071:H 2065:E 2062:( 2059:P 2054:) 2051:H 2042:E 2039:( 2036:P 2028:) 2024:1 2014:) 2011:H 2008:( 2005:P 2001:1 1994:( 1970:= 1967:) 1964:H 1958:( 1955:P 1952:+ 1949:) 1946:H 1943:( 1940:P 1920:) 1917:H 1911:( 1908:P 1905:) 1902:H 1893:E 1890:( 1887:P 1884:+ 1881:) 1878:H 1875:( 1872:P 1869:) 1866:H 1860:E 1857:( 1854:P 1851:= 1848:) 1845:E 1842:( 1839:P 1809:) 1806:H 1800:E 1797:( 1794:P 1789:) 1786:H 1777:E 1774:( 1771:P 1764:) 1760:1 1751:) 1748:H 1745:( 1742:P 1738:1 1732:( 1728:+ 1725:1 1721:1 1716:= 1700:) 1697:H 1691:( 1688:P 1685:) 1682:H 1673:E 1670:( 1667:P 1664:+ 1661:) 1658:H 1655:( 1652:P 1649:) 1646:H 1640:E 1637:( 1634:P 1629:) 1626:H 1623:( 1620:P 1617:) 1614:H 1608:E 1605:( 1602:P 1596:= 1580:) 1577:E 1574:( 1571:P 1566:) 1563:H 1560:( 1557:P 1554:) 1551:H 1545:E 1542:( 1539:P 1533:= 1526:) 1523:E 1517:H 1514:( 1511:P 1487:H 1463:H 1443:H 1417:) 1414:E 1408:H 1405:( 1402:P 1382:) 1379:H 1373:E 1370:( 1367:P 1347:) 1344:H 1341:( 1338:P 1318:H 1294:0 1290:/ 1286:0 1266:0 1260:) 1257:E 1254:( 1251:P 1229:H 1205:) 1202:E 1199:( 1196:P 1186:. 1174:H 1154:E 1134:H 1114:E 1088:H 1066:E 1046:) 1043:H 1037:E 1034:( 1031:P 1005:E 982:E 960:H 934:) 931:E 925:H 922:( 919:P 893:E 871:E 848:H 822:) 819:H 816:( 813:P 779:H 756:, 750:) 747:E 744:( 741:P 736:) 733:H 730:( 727:P 721:) 718:H 712:E 709:( 706:P 700:= 697:) 694:E 688:H 685:( 682:P 555:H 504:/ 488:/ 387:/ 381:ʒ 375:b 372:ˈ 369:/ 353:/ 350:n 347:ə 344:i 341:z 335:b 332:ˈ 329:/ 325:( 311:e 304:t 297:v 20:)
