Info-metrics

Not to be confused with Informetrics.

Info-metrics is an interdisciplinary approach to scientific modeling, inference, and efficient information processing. It is the science of modeling, reasoning, and drawing inferences under conditions of noisy and limited information. From the point of view of the sciences, this framework is at the intersection of information theory, statistical methods of inference, applied mathematics, computer science, econometrics, complexity theory, decision analysis, modeling, and the philosophy of science.
Info-metrics provides a constrained optimization framework for tackling under-determined or ill-posed problems, in which the available information is not sufficient to identify a unique solution. Such problems are very common across the sciences: the available information is incomplete, limited, noisy, and uncertain. Info-metrics is useful for modelling, information processing, theory building, and inference problems across the scientific spectrum. The info-metrics framework can also be used to test hypotheses about competing theories or causal mechanisms.

History

Info-metrics evolved from the classical maximum entropy formalism, which is based on the work of Shannon. Early contributions were mostly in the natural and mathematical/statistical sciences. Since the mid-1980s, and especially since the mid-1990s, the maximum entropy approach was generalized and extended to handle a larger class of problems in the social and behavioral sciences, especially for complex problems and data. The word 'info-metrics' was coined in 2009 by Amos Golan, right before the interdisciplinary Info-Metrics Institute was inaugurated.
Preliminary definitions

Consider a random variable X that can result in one of K distinct outcomes. The probability of each outcome x_k is p_k = p(x_k), with p_k \in [0,1] for k = 1, 2, \ldots, K; thus P is a K-dimensional probability distribution defined for X with \sum_k p_k = 1. Define the informational content of a single outcome x_k to be

    h(x_k) = h(p_k) = \log_2 (1/p_k)

(e.g., Shannon). Observing an outcome at the tails of the distribution (a rare event) provides much more information than observing another, more probable, outcome. The entropy is the expected information content of an outcome of the random variable X whose probability distribution is P:

    H(P) = \sum_{k=1}^{K} p_k \log_2 (1/p_k) = -\sum_{k=1}^{K} p_k \log_2 (p_k) = \mathrm{E}\left[ \log_2 (1/p(X)) \right]

where p_k \log_2 (p_k) \equiv 0 if p_k = 0, and \mathrm{E} is the expectation operator.
The basic info-metrics problem

Consider the problem of modeling and inferring the unobserved probability distribution of some K-dimensional discrete random variable given just the mean (expected value) of that variable. We also know that the probabilities are nonnegative and normalized (i.e., they sum up to exactly 1). For all K > 2 the problem is underdetermined. Within the info-metrics framework, the solution is to maximize the entropy of the random variable subject to the two constraints: mean and normalization. This yields the usual maximum entropy solution. The solutions to that problem can be extended and generalized in several ways. First, one can use another entropy instead of Shannon's entropy. Second, the same approach can be used for continuous random variables, for all types of conditional models (e.g., regression, inequality, and nonlinear models), and for many constraints. Third, priors can be incorporated within that framework. Fourth, the same framework can be extended to accommodate greater uncertainty: uncertainty about the observed values and/or uncertainty about the model itself. Last, the same basic framework can be used to develop new models/theories, validate these models using all available information, and test statistical hypotheses about the model.
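One standard way to see the form of the maximum entropy solution is through the Lagrangian of this constrained problem (a sketch; the source states only the resulting solution):

    \mathcal{L} = -\sum_k p_k \log_2 p_k - \lambda \left( \sum_k p_k x_k - y \right) - \mu \left( \sum_k p_k - 1 \right)

Setting \partial \mathcal{L} / \partial p_k = -\log_2 p_k - 1/\ln 2 - \lambda x_k - \mu = 0 and solving gives

    p_k = 2^{-\lambda x_k} / \Omega

where the partition function \Omega = \sum_k 2^{-\lambda x_k} absorbs \mu, and \lambda is chosen so that the mean constraint holds. This is exactly the exponential form that appears in the die example below.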
Examples

Six-sided die

Here inference is based on information resulting from repeated independent experiments. The following example is attributed to Boltzmann and was further popularized by Jaynes. Consider a six-sided die, where tossing the die is the event and the distinct outcomes are the numbers 1 through 6 on the upper face of the die. The experiment is the independent repetition of tossing the same die. Suppose you only observe the empirical mean value, y, of N tosses of a six-sided die. Given that information, you want to infer the probabilities that a specific value of the face will show up in the next toss of the die. You also know that the sum of the probabilities must be 1. Maximizing the entropy (and using log base 2) subject to these two constraints (mean and normalization) yields the most uninformed solution:

    maximize_{P}   H(p) = -\sum_{k=1}^{6} p_k \log_2 (p_k)
    subject to     \sum_k p_k x_k = y  and  \sum_k p_k = 1

for x_k = k, k = 1, 2, \ldots, 6. The solution is

    \hat{p}_k = \frac{2^{-\hat{\lambda} x_k}}{\sum_{k=1}^{6} 2^{-\hat{\lambda} x_k}} \equiv \frac{2^{-\lambda x_k}}{\Omega}

where \hat{p}_k is the inferred probability of event k, \hat{\lambda} is the inferred Lagrange multiplier associated with the mean constraint, and \Omega is the partition (normalization) function.
If the die is fair, with a mean of 3.5, you would expect all faces to be equally likely and the probabilities to be equal. This is what the maximum entropy solution gives. If the die is unfair (or loaded), with a mean of 4, the resulting maximum entropy solution is p_k = (0.103, 0.123, 0.146, 0.174, 0.207, 0.247). For comparison, minimizing the least squares criterion \sum_{k=1}^{6} p_k^2 instead of maximizing the entropy yields p_k(LS) = (0.095, 0.124, 0.152, 0.181, 0.210, 0.238).
Some cross-disciplinary examples

Rainfall prediction: Using the expected daily rainfall (arithmetic mean), the maximum entropy framework can be used to infer and forecast the daily rainfall distribution.
Portfolio management: Suppose there is a portfolio manager who needs to allocate some assets, or assign portfolio weights to different assets, while taking into account the investor's constraints and preferences. Using these preferences and constraints, as well as the observed information, such as the market mean return and the covariances of each asset over some time period, the entropy maximization framework can be used to find the optimal portfolio weights. In this case, the entropy of the portfolio represents its diversity. This framework can be modified to include other constraints, such as minimal variance or maximal diversity. That model involves inequalities and can be further generalized to include short sales. More such examples and related code can be found on the Foundations of Info-Metrics website (info-metrics.org). An extensive list of work related to info-metrics can be found here: http://info-metrics.org/bibliography.html
See also

Information theory
Entropy
Principle of maximum entropy
Inference
Statistical inference
Constrained optimization
References

1. Golan, Amos (2018). Foundations of Info-metrics: Modeling, Inference, and Imperfect Information. Oxford University Press.
2. Shannon, Claude (1948). "A Mathematical Theory of Communication". Bell System Technical Journal. 27: 379–423.
3. Bera, Anil K.; Park, Sung Y. (2008). "Optimal portfolio diversification using the maximum entropy principle". Econometric Reviews. 27 (4–6): 484–512.
4. "Portfolio Allocation – Foundations of Info-Metrics". info-metrics.org. Retrieved 2017-11-07.
5. "Info-Metrics Institute: Information-Theoretic Data Analysis and Exposition | American University, Washington, D.C." american.edu. Retrieved 2017-11-07.
6. "Center for Science of Information NSF STC". soihub.org. Retrieved 2017-11-07.
Further reading

Classics

Rudolf Clausius. "XI. On the nature of the motion which we call heat". The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 14(91):108–127, 1857.
Ludwig Boltzmann. "Further studies on the thermal equilibrium of gas molecules (Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen)". Sitzungsberichte der Akademie der Wissenschaften, Mathematische-Naturwissenschaftliche Klasse, pages 275–370, 1872.
J. W. Gibbs. Elementary Principles in Statistical Mechanics. New Haven, CT: Yale University Press, 1902.
C. E. Shannon. "A mathematical theory of communication". Bell System Technical Journal, 27:379–423, 1948.
Y. Alhassid and R. D. Levine. "Experimental and inherent uncertainties in the information theoretic approach". Chemical Physics Letters, 73(1):16–20, 1980.
I. Csiszar. "Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems". The Annals of Statistics, 19:2032–2066, 1991.
Jan M. Van Campenhout and Thomas M. Cover. "Maximum entropy and conditional probability". IEEE Transactions on Information Theory, IT-27(4), 1981.
A. Caticha. "Relative Entropy and Inductive Inference". 2004.
A. Caticha. "Lectures on probability, entropy, and statistical physics". MaxEnt, Sao Paulo, Brazil, 2008.
R. B. Ash. Information Theory. Interscience, New York, 1965.
David Donoho, Hossein Kakavand, and James Mammen. "The simplest solution to an underdetermined system of linear equations". In Information Theory, 2006 IEEE International Symposium on, pages 1924–1928. IEEE, 2007.

Basic books and research monographs

Amos Golan. Foundations of Info-metrics: Modeling, Inference, and Imperfect Information. Oxford University Press, 2018.
A. Golan. "Information and entropy econometrics – a review and synthesis". Foundations and Trends in Econometrics, 2(1–2):1–145, 2008.
R. D. Levine and M. Tribus. The Maximum Entropy Formalism. MIT Press, Cambridge, MA, 1979.
J. N. Kapur. Maximum Entropy Models in Science and Engineering. Wiley, 1993.
J. Harte. Maximum Entropy and Ecology: A Theory of Abundance, Distribution and Energetics. Oxford U Press, 2011.
A. Golan, G. Judge, and D. Miller. Maximum Entropy Econometrics: Robust Estimation with Limited Data. John Wiley & Sons, 1996.
E. T. Jaynes. Probability Theory: The Logic of Science. Cambridge University Press, 2003.

Other representative applications

J. R. Banavar, A. Maritan, and I. Volkov. "Applications of the principle of maximum entropy: from physics to ecology". Journal of Physics: Condensed Matter, 22(6), 2010.
Anil K. Bera and Sung Y. Park. "Optimal portfolio diversification using the maximum entropy principle". Econometric Reviews, 27(4–6):484–512, 2008.
Bhati, B. Buyuksahin, and A. Golan. "Image reconstruction: An information theoretic approach". American Statistical Association Proceedings, 2005.
Peter W. Buchen and Michael Kelly. "The maximum entropy distribution of an asset inferred from option prices". Journal of Financial and Quantitative Analysis, 31(1):143–159, 1996.
Randall C. Campbell and R. Carter Hill. "Predicting multinomial choices using maximum entropy". Economics Letters, 64(3):263–269, 1999.
Ariel Caticha and Amos Golan. "An entropic framework for modeling economies". Physica A: Statistical Mechanics and its Applications, 408:149–163, 2014.
Marsha Courchane, Amos Golan, and David Nickerson. "Estimation and evaluation of loan discrimination: An informational approach". Journal of Housing Research, 11(1):67–90, 2000.
Marco Frittelli. "The minimal entropy martingale measure and the valuation problem in incomplete markets". Mathematical Finance, 10(1):39–52, 2000.
Tsukasa Fujiwara and Yoshio Miyahara. "The minimal entropy martingale measures for geometric Lévy processes". Finance and Stochastics, 7(4):509–531, 2003.
D. Glennon and A. Golan. "A Markov model of bank failure estimated using an information-theoretic approach". Report, US Treasury, 2003.
A. Golan. "A multivariable stochastic theory of size distribution of firms with empirical evidence". Advances in Econometrics, 10:1–46, 1994.
A. Golan. "Modcomp model of compensation's effect on personnel retention – an information theoretic approach". Report, US Navy, February 2003.
Amos Golan and Volker Dose. "A generalized information theoretical approach to tomographic reconstruction". Journal of Physics A: Mathematical and General, 34(7):1271–1283, 2001.
Bart Haegeman and Rampal S. Etienne. "Entropy maximization and the spatial distribution of species". The American Naturalist, 175(4):E74–E90, 2010.
U. V. Toussaint, A. Golan, and V. Dose. "Maximum entropy decomposition of quadruple mass spectra". Journal of Vacuum Science and Technology A, 22(2):401–406, 2004.

External links

Info-Metrics Institute: http://info-metrics.org/
Center for Science of Information NSF STC: soihub.org