
Rate–distortion theory


where h(D) = (1/2) log_2(2πeD) is the differential entropy of a Gaussian random variable with variance D. This lower bound is extensible to sources with memory and other distortion measures. One important feature of the SLB is that it is asymptotically tight in the low-distortion regime for a wide class of sources, and in some occasions it actually coincides with the rate–distortion function. Shannon lower bounds can generally be found if the distortion between any two numbers can be expressed as a function of the difference between the values of these two numbers.
Rate–distortion theory gives an analytical expression for how much compression can be achieved using lossy compression methods. Many of the existing audio, speech, image, and video compression techniques have transforms, quantization, and bit-rate allocation procedures that capitalize on the general shape of rate–distortion functions.
Rate–distortion theory tells us that 'no compression system exists that performs outside the gray area'. The closer a practical compression system is to the red (lower) bound, the better it performs. As a general rule, this bound can only be attained by increasing the coding block length parameter.
This rate–distortion function holds only for Gaussian memoryless sources. It is known that the Gaussian source is the most "difficult" source to encode: for a given mean square error, it requires the greatest number of bits. The performance of a practical compression system working on, say, images may well be below the R(D) lower bound shown.
An analytical expression for the rate–distortion function is often difficult to obtain, except in some instances for which we next offer two of the best-known examples. The rate–distortion function of any source is known to obey several fundamental properties, the most important one being that it is a continuous, monotonically decreasing convex (U) function of D.
The notion of distortion is a subject of ongoing discussion. In the simplest case (which is actually used in most cases), the distortion is defined as the expected value of the square of the difference between input and output signal (i.e., the mean squared error).
When working with stationary sources with memory, it is necessary to modify the definition of the rate distortion function and it must be understood in the sense of a limit taken over sequences of increasing lengths.
The Blahut–Arimoto algorithm, co-invented by Richard Blahut, is an elegant iterative technique for numerically obtaining rate–distortion functions of arbitrary finite input/output alphabet sources, and much work has been done to extend it to more general problem instances.

When the mean squared error is used as the distortion measure, the expected distortion for a test channel Q_{Y|X} is

D_Q = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} P_{X,Y}(x,y)\,(x-y)^2\,dx\,dy = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} Q_{Y\mid X}(y\mid x)\,P_X(x)\,(x-y)^2\,dx\,dy.

Rate–distortion theory addresses the problem of determining the minimal number of bits per symbol, as measured by the rate R, that should be communicated over a channel so that the source (input signal) can be approximately reconstructed at the receiver (output signal) without exceeding an expected distortion D.

Perceptual models exist but are often not easy to include in rate–distortion theory. In image and video compression, the human perception models are less well developed, and their inclusion is mostly limited to the JPEG and MPEG weighting (quantization) matrices. In audio compression, perceptual models (and therefore perceptual distortion measures) are relatively well developed and routinely used in compression techniques such as MP3.

For example, for a memoryless Gaussian source with variance σ_x² under squared-error distortion, R(D) = (1/2) log_2(σ_x²/D) for 0 ≤ D ≤ σ_x², and R(D) = 0 for D > σ_x².
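The alternating updates at the heart of the Blahut–Arimoto algorithm can be sketched as follows. This is a minimal NumPy sketch under our own naming conventions (function signature and parameters are ours, not prescribed by the article); the Lagrange multiplier beta selects one point on the R(D) curve.

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=200):
    """Approximate one (rate, distortion) point on R(D) for a discrete source.

    p_x:  source distribution, shape (nx,)
    dist: distortion matrix d(x, y), shape (nx, ny)
    beta: Lagrange multiplier trading rate against distortion
    """
    ny = dist.shape[1]
    q_y = np.full(ny, 1.0 / ny)                  # output marginal, refined each pass
    for _ in range(n_iter):
        w = q_y * np.exp(-beta * dist)           # unnormalized Q(y|x), shape (nx, ny)
        q_yx = w / w.sum(axis=1, keepdims=True)  # optimal test channel for current q_y
        q_y = p_x @ q_yx                         # re-estimate the output marginal
    joint = p_x[:, None] * q_yx
    d_avg = float((joint * dist).sum())          # expected distortion D_Q
    # mutual information I(Y;X) in bits, guarding the 0 * log(0) cells
    rate = float(np.sum(np.where(joint > 0, joint * np.log2(q_yx / q_y), 0.0)))
    return rate, d_avg

# Bernoulli(0.5) source with Hamming distortion, where R(D) = 1 - H_b(D) is known.
p = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0], [1.0, 0.0]])
rate, dist_val = blahut_arimoto(p, d, beta=3.0)
```

For this symmetric example the iteration's output can be checked against the closed form: the returned rate equals 1 − H_b(D) at the returned distortion D.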
PyRated is a very simple Python package to do the most basic calculation in rate-distortion theory: the determination of the "codebook" and the transmission rate, given a utility function (distortion matrix) and a Lagrange multiplier.
If the source entropy is H bits/symbol and the channel capacity is C (where C < H), then H − C bits/symbol will be lost when transmitting this information over the given channel. For the user to have any hope of reconstructing the source with a maximum distortion D, we must impose the requirement that the information lost in transmission does not exceed the maximum tolerable loss of H − R(D) bits/symbol.
The shape of the function in the examples is typical: even measured rate–distortion functions in real life tend to have very similar forms.
The conditional entropy of Y given X is

H(Y\mid X) = -\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} Q_{Y\mid X}(y\mid x)\,P_X(x)\,\log_2\big(Q_{Y\mid X}(y\mid x)\big)\,dx\,dy.
Rate–distortion theory tells us that at least R(D) bits/symbol of information from the source must reach the user. We also know from Shannon's channel coding theorem that if the source entropy is H bits/symbol, a channel of capacity C < H cannot deliver it losslessly.
R(D) = H_b(p) − H_b(D) for 0 ≤ D ≤ min(p, 1 − p), and R(D) = 0 for D > min(p, 1 − p), where H_b denotes the binary entropy function.
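The closed form above is straightforward to evaluate. The following is a small sketch (helper names are ours, not from the article) that encodes the two-branch formula directly:

```python
import math

def binary_entropy(p: float) -> float:
    """H_b(p) in bits; the endpoints are defined as 0 by continuity."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bernoulli_rd(p: float, D: float) -> float:
    """R(D) for a Bernoulli(p) source under Hamming distortion."""
    if D >= min(p, 1.0 - p):
        return 0.0                                # zero rate suffices beyond min(p, 1-p)
    return binary_entropy(p) - binary_entropy(D)  # H_b(p) - H_b(D)
```

At D = 0 this recovers the lossless rate H_b(p) (1 bit/symbol for p = 0.5), and the function decreases to zero exactly when D reaches min(p, 1 − p).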
Although analytical solutions to this problem are scarce, there are upper and lower bounds to these functions, including the famous Shannon lower bound (SLB).
In the case of squared error and memoryless sources, the SLB states that for arbitrary sources with finite differential entropy h(X),

R(D) \geq h(X) - h(D),

where h(D) is the differential entropy of a Gaussian random variable with variance D.
The mutual information can be understood as a measure for the 'prior' uncertainty the receiver has about the sender's signal (H(Y)), diminished by the uncertainty that is left after receiving information about the sender's signal (H(Y∣X)). Of course, the decrease in uncertainty is due to the communicated amount of information, which is I(Y;X):

I(Y;X) = H(Y) - H(Y\mid X).
As the above equations show, calculating a rate–distortion function requires the stochastic description of the input X in terms of the PDF P_X(x), and then aims at finding the conditional PDF Q_{Y∣X}(y∣x) that minimizes the rate for a given distortion D*.
The functions that relate the rate and distortion are found as the solution of the following minimization problem:

\inf_{Q_{Y\mid X}(y\mid x)} I_Q(Y;X) \quad \text{subject to} \quad D_Q \leq D^*.

Here Q_{Y∣X}(y∣x), sometimes called a test channel, is the conditional probability density function (PDF) of the communication channel output (compressed signal) Y for a given input (original signal) X, and I_Q(Y;X) is the mutual information between Y and X.
For stationary sources with memory the rate–distortion function is defined as a limit,

R(D) = \lim_{n\to\infty} R_n(D), \qquad R_n(D) = \frac{1}{n} \inf_{Q_{Y^n\mid X^n}\in\mathcal{Q}} I(Y^n, X^n),

with

\mathcal{Q} = \{ Q_{Y^n\mid X^n}(Y^n\mid X^n, X_0) : E \leq D \},

where superscripts denote a complete sequence up to that time and the subscript 0 indicates initial state.
Suppose we want to transmit information about a source to the user with a distortion not exceeding D.
Branch of information theory which provides the theoretical foundations for lossy data compression
5224: 4595: 4482: 4438: 4251: 4234: 4224: 3721: 3204: 1049: 182: 1805: 1293: 1098: 629: 4849: 4600: 4384: 4229: 2699: 276: 5121: 4041: 2567: 1936: 5253: 3263: 2679: 3872: 3816: 655:. Typical distortion functions are the Hamming distortion and the Squared-error distortion. 546:, watching pictures and video) the distortion measure should preferably be modeled on human 4937: 4399: 4361: 4182: 3730: 3696: 3465:
Good (scalar) quantizers and dequantizers can operate at distances from the rate–distortion function that are practically relevant.
2683: 2651: 2153: 2126: 1654: 990:{\displaystyle \inf _{Q_{Y\mid X}(y\mid x)}I_{Q}(Y;X){\text{ subject to }}D_{Q}\leq D^{*}.} 559: 446: 383: 356: 329: 302: 65: 45: 3907: 3775: 1264: 8: 5168: 5059: 5018: 5003: 4972: 4967: 4876: 4783: 4716: 4685: 4670: 4453: 3842: 3195: 3169: 2688: 1330: 50: 4071: 5241: 5211: 5190: 5096: 5028: 4922: 4610: 4426: 4416: 4311: 4291: 4286: 4008: 3997: 3231: 3177: 2547: 2266: 2200: 2180: 1997: 1977: 1166: 1146: 1139: 1078: 1058: 609: 535: 295: 272: 60: 22: 4822: 5185: 5173: 5155: 5023: 4907: 4844: 4690: 4605: 4561: 4522: 4204: 4096: 4088: 4078: 4049: 3757: 3156:{\displaystyle {\mathcal {Q}}=\{Q_{Y^{n}\mid X^{n}}(Y^{n}\mid X^{n},X_{0}):E\leq D\}} 2274: 567: 539: 3503: 1781:{\displaystyle \inf _{Q_{Y\mid X}(y\mid x)}E]{\text{ subject to }}I_{Q}(Y;X)\leq R.} 1653:
The problem can also be formulated as a distortion–rate function, where we find the infimum over achievable distortions for a given rate constraint.
The relevant expression for the distortion–rate function is

\inf_{Q_{Y\mid X}(y\mid x)} D_Q \quad \text{subject to} \quad I_Q(Y;X) \leq R.
Most lossy compression techniques operate on data that will be perceived by human consumers (listening to music, watching pictures and video), so the distortion measure should preferably be modeled on human perception and perhaps aesthetics: much like the use of probability in lossless compression, distortion measures can ultimately be identified with loss functions as used in Bayesian estimation and decision theory.
The maximum tolerable loss is H − R(D) bits/symbol; this means that the channel capacity must be at least as large as R(D).
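As a numeric illustration (the specific numbers are ours, chosen for this sketch, not taken from the article): a unit-variance memoryless Gaussian source that must be reconstructed within mean squared error D = 0.25 requires R(D) = 1 bit/symbol, so any channel used for transmission must have capacity of at least 1 bit/symbol.

```python
import math

def gaussian_rate(variance: float, D: float) -> float:
    """R(D) in bits/symbol for a memoryless Gaussian source, squared error."""
    return 0.0 if D >= variance else 0.5 * math.log2(variance / D)

# Minimum channel capacity C needed to reconstruct within D = 0.25:
required_capacity = gaussian_rate(1.0, 0.25)   # 1.0 bit/symbol
```

Loosening the distortion target lowers the required capacity; once D reaches the source variance, no information needs to be sent at all.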
3254: 2791: 1446:{\displaystyle H(Y)=-\int _{-\infty }^{\infty }P_{Y}(y)\log _{2}(P_{Y}(y))\,dy} 515: 4119: 5363: 5178: 5126: 4793: 4788: 4763: 4695: 4316: 4214: 3954: 563: 55: 5299: 4266: 4241: 4142: 3454: 80: 4100: 5258: 5136: 4932: 4808: 4758: 4128:, given a utility function (distortion matrix) and a Lagrange multiplier 3975: 3461:
Nevertheless, even at unit blocklengths one can often find good (scalar) quantizers and dequantizers that operate at distances from the rate–distortion function that are practically relevant.
The two formulations lead to functions which are inverses of each other.
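For the Gaussian example this inverse relationship can be checked in closed form. A small sketch (function names are ours): R(D) = (1/2) log_2(σ²/D) inverts to D(R) = σ² · 2^(−2R).

```python
import math

def rate_from_distortion(sigma2: float, D: float) -> float:
    """R(D) for a memoryless Gaussian source with squared-error distortion."""
    return 0.0 if D >= sigma2 else 0.5 * math.log2(sigma2 / D)

def distortion_from_rate(sigma2: float, R: float) -> float:
    """D(R), the inverse of R(D) on 0 < D <= sigma2."""
    return sigma2 * 2.0 ** (-2.0 * R)
```

Round-tripping any distortion 0 < D ≤ σ² through the two functions returns the original value, confirming that the two formulations are inverses of each other.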
555: 3763: 5315: 5106: 5101: 4988: 4947: 4753: 3998:"Rethinking Lossy Compression: The Rate-Distortion-Perception Tradeoff" 551: 547: 30: 3170:
Memoryless (independent) Gaussian source with squared-error distortion

If we assume that X is a Gaussian random variable with variance σ_x², and if we assume that successive samples of the signal X are stochastically independent (or equivalently, the source is memoryless, or the signal is uncorrelated), we find the following analytical expression for the rate–distortion function:

R(D) = \begin{cases} \frac{1}{2}\log_2(\sigma_x^2/D), & \text{if } 0 \leq D \leq \sigma_x^2 \\ 0, & \text{if } D > \sigma_x^2. \end{cases}
In the definition of the rate–distortion function, D_Q and D* are the distortion between X and Y for a given Q_{Y∣X}(y∣x) and the prescribed maximum distortion, respectively. When we use the mean squared error as distortion measure, we have (for amplitude-continuous signals) the double-integral expression for D_Q given earlier.
Berger, Toby (1971). Rate Distortion Theory: A Mathematical Basis for Data Compression. Prentice Hall. ISBN 978-0-13-753103-5.
Memoryless (independent) Bernoulli source with Hamming distortion

The rate-distortion function of a Bernoulli random variable with Hamming distortion is given by:

R(D) = \begin{cases} H_b(p) - H_b(D), & 0 \leq D \leq \min(p, 1-p) \\ 0, & D > \min(p, 1-p), \end{cases}

where H_b denotes the binary entropy function.
Blau, Y.; Michaeli, T. (2019). "Rethinking Lossy Compression: The Rate-Distortion-Perception Tradeoff". Proceedings of the International Conference on Machine Learning. PMLR. pp. 675–685.
Distortion functions measure the cost of representing a symbol x by an approximated symbol x̂. Typical distortion functions are the Hamming distortion and the squared-error distortion.

Hamming distortion: d(x, \hat{x}) = \begin{cases} 0, & \text{if } x = \hat{x} \\ 1, & \text{if } x \neq \hat{x} \end{cases}

Squared-error distortion: d(x, \hat{x}) = (x - \hat{x})^2
where H(Y) and H(Y∣X) are the entropy of the output signal Y and the conditional entropy of the output signal given the input signal, respectively:

H(Y) = -\int_{-\infty}^{\infty} P_Y(y)\,\log_2(P_Y(y))\,dy
In rate–distortion theory, the rate is usually understood as the number of bits per data sample to be stored or transmitted.
The following figure shows what this function looks like:

[Figure: R(D) for a memoryless Gaussian source; the rate grows without bound as D → 0 and falls to zero at D = σ_x².]
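In lieu of the plot, the curve can be tabulated directly from the formula (a sketch; σ² = 1 is chosen here purely for illustration):

```python
import math

def gaussian_rate(variance: float, D: float) -> float:
    """R(D) for a memoryless Gaussian source under squared-error distortion."""
    return 0.0 if D >= variance else 0.5 * math.log2(variance / D)

# The curve falls monotonically and hits zero exactly at D = sigma^2:
curve = [(D, gaussian_rate(1.0, D)) for D in (0.0625, 0.125, 0.25, 0.5, 1.0)]
# → [(0.0625, 2.0), (0.125, 1.5), (0.25, 1.0), (0.5, 0.5), (1.0, 0.0)]
```

Halving the distortion costs exactly half a bit per sample, which is the characteristic shape the figure illustrates.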
4477: 4443: 3228:, and if we assume that successive samples of the signal 2861:{\displaystyle R(D)=\lim _{n\rightarrow \infty }R_{n}(D)} 575: 526: 562:, distortion measures can ultimately be identified with 4120:"PyRated: a python package for rate distortion theory" 3542: 3910: 3875: 3845: 3819: 3778: 3764:
Connecting rate-distortion theory to channel capacity
3733: 3699: 3521: 3475: 3275: 3234: 3207: 3180: 3014: 2880: 2807: 2718: 2654: 2606: 2570: 2550: 2286: 2223: 2203: 2183: 2156: 2129: 2061: 2020: 2000: 1980: 1939: 1889: 1843: 1808: 1666: 1462: 1342: 1296: 1267: 1192: 1169: 1149: 1101: 1081: 1061: 1006: 892: 799: 669: 632: 612: 476: 449: 413: 386: 359: 332: 305: 3965:
3963: – decision algorithm used in video compression 4062: 2123:In the definition of the rate–distortion function, 196:. Unsourced material may be challenged and removed. 4070: 3925: 3896: 3857: 3831: 3793: 3745: 3712: 3682: 3492: 3438: 3240: 3220: 3186: 3155: 2994: 2860: 2764: 2667: 2640: 2592: 2556: 2533: 2257: 2209: 2189: 2169: 2142: 2112: 2047: 2006: 1986: 1966: 1925: 1868: 1829: 1780: 1642: 1445: 1317: 1282: 1250: 1175: 1155: 1129: 1087: 1067: 1040: 989: 867: 777: 647: 618: 498: 462: 435: 399: 372: 345: 318: 3945: – Class of algorithms in information theory 3469:on—say—images, may well be below the 5361: 4138:VcDemo Image and Video Compression Learning Tool 3772:. Rate–distortion theory tells us that at least 3647: 3603: 2914: 2824: 1668: 894: 518:in his foundational work on information theory. 299:Rate distortion encoder and decoder. An encoder 2600:, and then aims at finding the conditional PDF 275:which provides the theoretical foundations for 4158: 147: 4172: 3995: 3150: 3025: 878: 4039: 4027: 3978: – Type of signal in signal processing 788: 4165: 4151: 4040:Cover, Thomas M.; Thomas, Joy A. (2012) . 2648:that minimize rate for a given distortion 1048:, sometimes called a test channel, is the 154: 140: 4012: 3727:Plot of the rate-distortion function for 2761: 2521: 2514: 2396: 2389: 1630: 1623: 1436: 1247: 256:Learn how and when to remove this message 3951: – Compact encoding of digital data 1251:{\displaystyle I(Y;X)=H(Y)-H(Y\mid X)\,} 294: 4117: 601: 525:is usually understood as the number of 5362: 4068: 514:Rate–distortion theory was created by 4146: 3512:with Hamming distortion is given by: 1325:are the entropy of the output signal 658: 2765:{\displaystyle R(D)\geq h(X)-h(D)\,} 2641:{\displaystyle Q_{Y\mid X}(y\mid x)} 2269:as distortion measure, we have (for 2258:{\displaystyle Q_{Y\mid X}(y\mid x)} 1075:for a given input (original signal) 1041:{\displaystyle Q_{Y\mid X}(y\mid x)} 538:). 
However, since we know that most compressed data will be perceived by human consumers, the distortion measure should preferably be modeled on human perception.
Cover, Thomas M.; Thomas, Joy A. (2012). "10. Rate Distortion Theory". Elements of Information Theory (2nd ed.). Wiley. ISBN 978-1-118-58577-1.
The encoded sequence 128:Shannon–Hartley theorem 3927: 3898: 3897:{\displaystyle H-R(D)} 3859: 3833: 3832:{\displaystyle C<H} 3795: 3747: 3714: 3684: 3494: 3440: 3242: 3222: 3188: 3157: 2996: 2862: 2766: 2669: 2642: 2594: 2558: 2535: 2259: 2211: 2191: 2171: 2144: 2114: 2049: 2008: 1988: 1968: 1927: 1870: 1831: 1782: 1741: subject to  1644: 1447: 1319: 1284: 1252: 1177: 1157: 1131: 1089: 1069: 1042: 991: 958: subject to  869: 779: 649: 620: 507: 500: 464: 437: 401: 374: 347: 320: 277:lossy data compression 269:Rate–distortion theory 102:Rate–distortion theory 5254:Kolmogorov complexity 5122:Video characteristics 4499:LZ77 + Huffman + ANS 4069:Berger, Toby (1971). 3928: 3899: 3860: 3834: 3805:bits/symbol, and the 3796: 3748: 3746:{\displaystyle p=0.5} 3715: 3713:{\displaystyle H_{b}} 3685: 3495: 3441: 3264:analytical expression 3243: 3223: 3198:random variable with 3189: 3158: 2997: 2863: 2767: 2670: 2668:{\displaystyle D^{*}} 2643: 2595: 2559: 2536: 2260: 2212: 2192: 2172: 2170:{\displaystyle D^{*}} 2145: 2143:{\displaystyle D_{Q}} 2115: 2050: 2009: 1989: 1969: 1928: 1871: 1832: 1783: 1645: 1448: 1320: 1285: 1253: 1178: 1158: 1132: 1090: 1070: 1043: 992: 870: 780: 650: 621: 501: 465: 463:{\displaystyle X^{n}} 438: 402: 400:{\displaystyle g_{n}} 375: 373:{\displaystyle Y^{n}} 348: 346:{\displaystyle X^{n}} 321: 319:{\displaystyle f_{n}} 298: 271:is a major branch of 5344:Compression software 4938:Compression artifact 4894:Psychoacoustic model 3926:{\displaystyle R(D)} 3908: 3873: 3843: 3817: 3794:{\displaystyle R(D)} 3776: 3731: 3697: 3519: 3473: 3273: 3232: 3205: 3178: 3012: 2878: 2805: 2716: 2684:minimization problem 2652: 2604: 2568: 2564:in terms of the PDF 2548: 2284: 2221: 2201: 2181: 2154: 2127: 2059: 2018: 2014:at the sender, then 1998: 1978: 1937: 1887: 1841: 1806: 1664: 1460: 1340: 1294: 1283:{\displaystyle H(Y)} 1265: 1190: 1167: 1147: 1099: 1079: 1059: 1004: 890: 797: 667: 630: 610: 602:Distortion functions 566:as used in Bayesian 560:lossless compression 
474: 447: 411: 384: 357: 330: 303: 190:improve this article 66:Directed information 46:Differential entropy 5334:Compression formats 4973:Texture compression 4968:Standard test image 4784:Silence compression 3858:{\displaystyle H-C} 3500:lower bound shown. 3425: 3387: 3339: 3258:, or the signal is 2707:Shannon lower bound 2441: 2423: 2335: 2317: 1522: 1504: 1378: 1331:conditional entropy 326:encodes a sequence 51:Conditional entropy 5375:Information theory 5242:Information theory 5097:Display resolution 4923:Chroma subsampling 4312:Byte pair encoding 4257:Shannon–Fano–Elias 3923: 3894: 3855: 3829: 3791: 3743: 3710: 3680: 3675: 3490: 3446:    3436: 3431: 3411: 3373: 3325: 3238: 3218: 3184: 3174:If we assume that 3153: 2992: 2959: 2858: 2838: 2762: 2665: 2638: 2590: 2554: 2531: 2424: 2406: 2318: 2300: 2275:continuous signals 2267:mean squared error 2255: 2207: 2187: 2167: 2140: 2110: 2045: 2004: 1984: 1964: 1923: 1866: 1827: 1778: 1704: 1640: 1505: 1487: 1443: 1361: 1315: 1280: 1248: 1173: 1153: 1140:mutual information 1127: 1085: 1065: 1038: 987: 930: 865: 775: 770: 659:Hamming distortion 645: 616: 536:mean squared error 508: 496: 460: 433: 397: 370: 343: 316: 273:information theory 61:Mutual information 23:Information theory 5357: 5356: 5206: 5205: 5156:Deblocking filter 5054: 5053: 4902: 4901: 4711: 4710: 4556: 4555: 4084:978-0-13-753103-5 4077:. Prentice Hall. 
4055:978-1-118-58577-1 3403: 3359: 3307: 3241:{\displaystyle X} 3187:{\displaystyle X} 2913: 2911: 2823: 2790:, co-invented by 2682:solution to this 2557:{\displaystyle X} 2210:{\displaystyle Y} 2190:{\displaystyle X} 2007:{\displaystyle X} 1987:{\displaystyle Y} 1742: 1667: 1176:{\displaystyle X} 1156:{\displaystyle Y} 1088:{\displaystyle X} 1068:{\displaystyle Y} 959: 893: 851: 821: 765: 748: 733: 716: 691: 642: 619:{\displaystyle x} 540:lossy compression 487: 424: 266: 265: 258: 240: 164: 163: 5382: 5370:Data compression 5342: 5341: 5332: 5331: 5161:Lapped transform 5065: 5064: 4943:Image resolution 4928:Coding tree unit 4913: 4912: 4722: 4721: 4567: 4566: 4188: 4187: 4174:Data compression 4167: 4160: 4153: 4144: 4143: 4134: 4105: 4104: 4076: 4066: 4060: 4059: 4037: 4031: 4025: 4019: 4018: 4016: 4002: 3993: 3966: 3949:Data compression 3932: 3930: 3929: 3924: 3903: 3901: 3900: 3895: 3864: 3862: 3861: 3856: 3838: 3836: 3835: 3830: 3807:channel capacity 3800: 3798: 3797: 3792: 3760: 3752: 3750: 3749: 3744: 3719: 3717: 3716: 3711: 3709: 3708: 3689: 3687: 3686: 3681: 3679: 3676: 3672: 3628: 3576: 3575: 3554: 3553: 3499: 3497: 3496: 3491: 3489: 3457: 3445: 3443: 3442: 3437: 3435: 3434: 3424: 3419: 3404: 3401: 3386: 3381: 3360: 3357: 3344: 3338: 3333: 3318: 3317: 3308: 3300: 3247: 3245: 3244: 3239: 3227: 3225: 3224: 3219: 3217: 3216: 3193: 3191: 3190: 3185: 3162: 3160: 3159: 3154: 3137: 3136: 3124: 3123: 3096: 3095: 3083: 3082: 3070: 3069: 3057: 3056: 3055: 3054: 3042: 3041: 3021: 3020: 3001: 2999: 2998: 2993: 2988: 2987: 2975: 2974: 2958: 2957: 2956: 2947: 2946: 2945: 2944: 2932: 2931: 2912: 2904: 2890: 2889: 2867: 2865: 2864: 2859: 2848: 2847: 2837: 2771: 2769: 2768: 2763: 2674: 2672: 2671: 2666: 2664: 2663: 2647: 2645: 2644: 2639: 2622: 2621: 2599: 2597: 2596: 2591: 2580: 2579: 2563: 2561: 2560: 2555: 2540: 2538: 2537: 2532: 2513: 2512: 2482: 2481: 2457: 2456: 2440: 2435: 2422: 2417: 2388: 2387: 2351: 2350: 2334: 2329: 2316: 2311: 2296: 2295: 2264: 2262: 2261: 2256: 
2239: 2238: 2216: 2214: 2213: 2208: 2196: 2194: 2193: 2188: 2176: 2174: 2173: 2168: 2166: 2165: 2149: 2147: 2146: 2141: 2139: 2138: 2119: 2117: 2116: 2111: 2054: 2052: 2051: 2046: 2013: 2011: 2010: 2005: 1993: 1991: 1990: 1985: 1973: 1971: 1970: 1965: 1932: 1930: 1929: 1924: 1875: 1873: 1872: 1867: 1865: 1861: 1836: 1834: 1833: 1828: 1787: 1785: 1784: 1779: 1753: 1752: 1743: 1740: 1720: 1719: 1703: 1687: 1686: 1649: 1647: 1646: 1641: 1604: 1603: 1582: 1581: 1563: 1562: 1538: 1537: 1521: 1516: 1503: 1498: 1452: 1450: 1449: 1444: 1423: 1422: 1407: 1406: 1388: 1387: 1377: 1372: 1324: 1322: 1321: 1316: 1289: 1287: 1286: 1281: 1257: 1255: 1254: 1249: 1182: 1180: 1179: 1174: 1162: 1160: 1159: 1154: 1136: 1134: 1133: 1128: 1111: 1110: 1094: 1092: 1091: 1086: 1074: 1072: 1071: 1066: 1047: 1045: 1044: 1039: 1022: 1021: 996: 994: 993: 988: 983: 982: 970: 969: 960: 957: 940: 939: 929: 913: 912: 874: 872: 871: 866: 864: 863: 858: 854: 853: 852: 844: 823: 822: 814: 784: 782: 781: 776: 774: 773: 767: 766: 758: 749: 746: 735: 734: 726: 717: 714: 693: 692: 684: 654: 652: 651: 646: 644: 643: 635: 625: 623: 622: 617: 505: 503: 502: 497: 495: 494: 489: 488: 480: 469: 467: 466: 461: 459: 458: 442: 440: 439: 434: 432: 431: 426: 425: 417: 406: 404: 403: 398: 396: 395: 379: 377: 376: 371: 369: 368: 352: 350: 349: 344: 342: 341: 325: 323: 322: 317: 315: 314: 261: 254: 250: 247: 241: 239: 198: 174: 166: 156: 149: 142: 118:Channel capacity 76:Relative entropy 33: 19: 18: 5390: 5389: 5385: 5384: 5383: 5381: 5380: 5379: 5360: 5359: 5358: 5353: 5320: 5304: 5288: 5269:Rate–distortion 5202: 5131: 5050: 4977: 4898: 4803: 4799:Sub-band coding 4707: 4632:Predictive type 4627: 4552: 4519:LZSS + Huffman 4469:LZ77 + Huffman 4458: 4368: 4304:Dictionary type 4298: 4200:Adaptive coding 4177: 4171: 4114: 4109: 4108: 4085: 4067: 4063: 4056: 4038: 4034: 4026: 4022: 4000: 3994: 3990: 3985: 3964: 3939: 3909: 3906: 3905: 3874: 3871: 3870: 3844: 3841: 3840: 3818: 3815: 3814: 3777: 3774: 3773: 3766: 3732: 3729: 
3728: 3704: 3700: 3698: 3695: 3694: 3674: 3673: 3650: 3639: 3630: 3629: 3606: 3589: 3571: 3567: 3549: 3545: 3541: 3537: 3520: 3517: 3516: 3506: 3479: 3474: 3471: 3470: 3430: 3429: 3420: 3415: 3400: 3398: 3389: 3388: 3382: 3377: 3356: 3354: 3340: 3334: 3329: 3313: 3309: 3299: 3292: 3291: 3274: 3271: 3270: 3233: 3230: 3229: 3212: 3208: 3206: 3203: 3202: 3179: 3176: 3175: 3172: 3132: 3128: 3119: 3115: 3091: 3087: 3078: 3074: 3065: 3061: 3050: 3046: 3037: 3033: 3032: 3028: 3016: 3015: 3013: 3010: 3009: 2983: 2979: 2970: 2966: 2952: 2951: 2940: 2936: 2927: 2923: 2922: 2918: 2917: 2903: 2885: 2881: 2879: 2876: 2875: 2843: 2839: 2827: 2806: 2803: 2802: 2717: 2714: 2713: 2659: 2655: 2653: 2650: 2649: 2611: 2607: 2605: 2602: 2601: 2575: 2571: 2569: 2566: 2565: 2549: 2546: 2545: 2508: 2504: 2477: 2473: 2446: 2442: 2436: 2428: 2418: 2410: 2383: 2379: 2340: 2336: 2330: 2322: 2312: 2304: 2291: 2287: 2285: 2282: 2281: 2228: 2224: 2222: 2219: 2218: 2202: 2199: 2198: 2182: 2179: 2178: 2161: 2157: 2155: 2152: 2151: 2134: 2130: 2128: 2125: 2124: 2060: 2057: 2056: 2019: 2016: 2015: 1999: 1996: 1995: 1979: 1976: 1975: 1938: 1935: 1934: 1888: 1885: 1884: 1851: 1847: 1842: 1839: 1838: 1807: 1804: 1803: 1748: 1744: 1739: 1715: 1711: 1676: 1672: 1671: 1665: 1662: 1661: 1593: 1589: 1577: 1573: 1558: 1554: 1527: 1523: 1517: 1509: 1499: 1491: 1461: 1458: 1457: 1418: 1414: 1402: 1398: 1383: 1379: 1373: 1365: 1341: 1338: 1337: 1295: 1292: 1291: 1266: 1263: 1262: 1191: 1188: 1187: 1168: 1165: 1164: 1148: 1145: 1144: 1106: 1102: 1100: 1097: 1096: 1080: 1077: 1076: 1060: 1057: 1056: 1011: 1007: 1005: 1002: 1001: 978: 974: 965: 961: 956: 935: 931: 902: 898: 897: 891: 888: 887: 881: 859: 843: 842: 835: 831: 830: 813: 812: 798: 795: 794: 791: 769: 768: 757: 756: 745: 743: 737: 736: 725: 724: 713: 711: 701: 700: 683: 682: 668: 665: 664: 661: 634: 633: 631: 628: 627: 611: 608: 607: 604: 572:decision theory 490: 479: 478: 477: 475: 472: 471: 454: 450: 448: 445: 444: 427: 416: 415: 414: 412: 409: 408: 
391: 387: 385: 382: 381: 364: 360: 358: 355: 354: 337: 333: 331: 328: 327: 310: 306: 304: 301: 300: 293: 262: 251: 245: 242: 199: 197: 187: 175: 160: 17: 12: 11: 5: 5388: 5378: 5377: 5372: 5355: 5354: 5352: 5351: 5336: 5325: 5322: 5321: 5319: 5318: 5312: 5310: 5306: 5305: 5303: 5302: 5296: 5294: 5290: 5289: 5287: 5286: 5281: 5276: 5271: 5266: 5261: 5256: 5251: 5250: 5249: 5239: 5234: 5233: 5232: 5227: 5216: 5214: 5208: 5207: 5204: 5203: 5201: 5200: 5199: 5198: 5193: 5183: 5182: 5181: 5176: 5171: 5163: 5158: 5153: 5148: 5142: 5140: 5133: 5132: 5130: 5129: 5124: 5119: 5114: 5109: 5104: 5099: 5094: 5093: 5092: 5087: 5082: 5071: 5069: 5062: 5056: 5055: 5052: 5051: 5049: 5048: 5047: 5046: 5041: 5036: 5031: 5021: 5016: 5011: 5006: 5001: 4996: 4991: 4985: 4983: 4979: 4978: 4976: 4975: 4970: 4965: 4960: 4955: 4950: 4945: 4940: 4935: 4930: 4925: 4919: 4917: 4910: 4904: 4903: 4900: 4899: 4897: 4896: 4891: 4886: 4885: 4884: 4879: 4874: 4869: 4864: 4854: 4853: 4852: 4842: 4841: 4840: 4835: 4825: 4820: 4814: 4812: 4805: 4804: 4802: 4801: 4796: 4791: 4786: 4781: 4776: 4771: 4766: 4761: 4756: 4751: 4750: 4749: 4744: 4739: 4728: 4726: 4719: 4713: 4712: 4709: 4708: 4706: 4705: 4703:Psychoacoustic 4700: 4699: 4698: 4693: 4688: 4680: 4679: 4678: 4673: 4668: 4663: 4658: 4648: 4647: 4646: 4635: 4633: 4629: 4628: 4626: 4625: 4624: 4623: 4618: 4613: 4603: 4598: 4593: 4592: 4591: 4586: 4575: 4573: 4571:Transform type 4564: 4558: 4557: 4554: 4553: 4551: 4550: 4549: 4548: 4540: 4539: 4538: 4535: 4527: 4526: 4525: 4517: 4516: 4515: 4507: 4506: 4505: 4497: 4496: 4495: 4487: 4486: 4485: 4480: 4475: 4466: 4464: 4460: 4459: 4457: 4456: 4451: 4446: 4441: 4436: 4431: 4430: 4429: 4424: 4414: 4409: 4404: 4403: 4402: 4392: 4387: 4382: 4376: 4374: 4370: 4369: 4367: 4366: 4365: 4364: 4359: 4354: 4349: 4344: 4339: 4334: 4329: 4324: 4314: 4308: 4306: 4300: 4299: 4297: 4296: 4295: 4294: 4289: 4284: 4279: 4269: 4264: 4259: 4254: 4249: 4244: 4239: 4238: 4237: 4232: 4227: 4217: 4212: 4207: 4202: 4196: 4194: 
Rate–distortion functions

Shannon lower bound: for difference distortion measures, the rate–distortion function is bounded below by

    R(D) ≥ h(X) − h(D),

where h(X) is the differential entropy of the source and h(D) is the differential entropy of a Gaussian random variable with variance D.

Memoryless (independent) Bernoulli source with Hamming distortion: for a Bernoulli(p) source,

    R(D) = H_b(p) − H_b(D)   for 0 ≤ D ≤ min(p, 1 − p),
    R(D) = 0                 for D > min(p, 1 − p),

where H_b denotes the binary entropy function.

Memoryless (independent) Gaussian source with squared-error distortion: for a Gaussian source with variance σ_x²,

    R(D) = (1/2) log₂(σ_x² / D)   for 0 ≤ D ≤ σ_x²,
    R(D) = 0                      for D > σ_x².

See also

Decorrelation
Sphere packing
White noise
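The closed-form rate–distortion functions for the Bernoulli source (Hamming distortion) and the Gaussian source (squared-error distortion) are easy to evaluate numerically. The following is a minimal sketch; the function names are illustrative, not from any particular library.

```python
import math


def rate_distortion_bernoulli(p: float, d: float) -> float:
    """R(D) in bits for a Bernoulli(p) source under Hamming distortion."""
    def h_b(q: float) -> float:
        # Binary entropy function; H_b(0) = H_b(1) = 0 by convention.
        if q in (0.0, 1.0):
            return 0.0
        return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

    if d >= min(p, 1 - p):
        return 0.0  # the distortion target is achievable at zero rate
    return h_b(p) - h_b(d)


def rate_distortion_gaussian(variance: float, d: float) -> float:
    """R(D) in bits for a memoryless Gaussian source under squared error."""
    if d >= variance:
        return 0.0  # reproducing the mean alone already meets the target
    return 0.5 * math.log2(variance / d)


# A Gaussian source costs 0.5 bit/sample for each halving of the distortion:
print(rate_distortion_gaussian(1.0, 0.25))   # 1.0 bit
print(rate_distortion_bernoulli(0.5, 0.11))  # about 0.5 bit
```

Note how both functions hit zero once the allowed distortion exceeds the source's own spread (min(p, 1 − p) or σ_x²), matching the flat region of the rate–distortion curve.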
