Shannon–Fano coding

In Fano's method, the symbols are arranged in order from most probable to least probable, and then divided into two sets whose total probabilities are as close as possible to being equal. All symbols then have the first digits of their codes assigned; symbols in the first set receive "0" and symbols in the second set receive "1". As long as any sets with more than one member remain, the same process is repeated on those sets to determine successive digits of their codes. When a set has been reduced to one symbol, that symbol's code is complete and will not form the prefix of any other symbol's code.
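The recursive splitting described above can be sketched in a few lines of Python. This is an illustrative sketch rather than a reference implementation: it assumes the symbols are already sorted by decreasing probability, and the helper names (`fano_code`, `split_point`) are invented for this example.

```python
def split_point(weights):
    # Index k that splits weights into weights[:k] and weights[k:] with totals as equal as possible.
    total, running = sum(weights), 0
    best_k, best_diff = 1, float("inf")
    for k in range(1, len(weights)):
        running += weights[k - 1]
        diff = abs(running - (total - running))
        if diff < best_diff:
            best_k, best_diff = k, diff
    return best_k


def fano_code(symbols, weights, prefix=""):
    # Return {symbol: codeword}; weights may be probabilities or raw frequency counts.
    if len(symbols) == 1:
        return {symbols[0]: prefix or "0"}
    k = split_point(weights)
    codes = fano_code(symbols[:k], weights[:k], prefix + "0")        # first set gets "0"
    codes.update(fano_code(symbols[k:], weights[k:], prefix + "1"))  # second set gets "1"
    return codes


# The five-symbol source used in the example below (counts 15, 7, 6, 6, 5):
print(fano_code(["A", "B", "C", "D", "E"], [15, 7, 6, 6, 5]))
# {'A': '00', 'B': '01', 'C': '10', 'D': '110', 'E': '111'}
```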
Huffman coding is almost as computationally simple and produces prefix codes that always achieve the lowest possible expected codeword length, under the constraint that each symbol is represented by a code formed of an integral number of bits. This constraint is often unneeded, since the codes will be
Fano's method divides the source symbols into two sets ("0" and "1") with probabilities as close to 1/2 as possible. Then those sets are themselves divided in two, and so on, until each set contains only one symbol. The codeword for that symbol is the string of "0"s and "1"s that records which half of the divides
In this case D and E have the lowest frequencies and so are allocated 0 and 1 respectively and grouped together with a combined probability of 0.282. The lowest pair now are B and C, so they are allocated 0 and 1 and grouped together with a combined probability of 0.333. This leaves BC and DE now
The algorithm produces fairly efficient variable-length encodings; when the two smaller sets produced by a partitioning are in fact of equal probability, the one bit of information used to distinguish them is used most efficiently. Unfortunately, Shannon–Fano coding does not always produce optimal
Once the codeword lengths have been determined, we must choose the codewords themselves. One method is to pick codewords in order from most probable to least probable symbols, picking each codeword to be the lexicographically first word of the correct length that maintains the prefix-free property.
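This procedure can be written out directly in Python. The sketch below is illustrative only: it assumes all probabilities are nonzero, and the function names (`shannon_lengths`, `first_free_codeword`, `shannon_code`) are invented for this example rather than taken from any reference.

```python
import math

def shannon_lengths(probs):
    # Word lengths l_i = ceil(-log2 p_i) for probabilities p_i.
    return [math.ceil(-math.log2(p)) for p in probs]

def first_free_codeword(length, assigned):
    # Lexicographically first word of the given length that keeps the code prefix-free.
    for n in range(2 ** length):
        word = format(n, "0{}b".format(length))
        if all(not word.startswith(c) and not c.startswith(word) for c in assigned):
            return word
    raise ValueError("no prefix-free codeword of length %d left" % length)

def shannon_code(symbols, probs):
    # Assign codewords in order from most probable to least probable symbol.
    order = sorted(range(len(symbols)), key=lambda i: -probs[i])
    lengths = shannon_lengths(probs)
    codes, assigned = {}, []
    for i in order:
        word = first_free_codeword(lengths[i], assigned)
        codes[symbols[i]] = word
        assigned.append(word)
    return codes

probs = [15/39, 7/39, 6/39, 6/39, 5/39]   # the example source used below
print(shannon_code(["A", "B", "C", "D", "E"], probs))
# {'A': '00', 'B': '010', 'C': '011', 'D': '100', 'E': '101'}
```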
There are several reasons for this mixup. For one thing, in the discussion of his coding scheme, Shannon mentions Fano's scheme and calls it "substantially the same" (Shannon, 1948, p. 17). For another, both Shannon's and Fano's coding schemes are similar in the sense that they both are efficient,
We can pick codewords in order, choosing the lexicographically first word of the correct length that maintains the prefix-free property. Clearly A gets the codeword 00. To maintain the prefix-free property, B's codeword may not start with 00, so the lexicographically first available word of length 3 is 010.
With this division, A and B will each have a code that starts with a 0 bit, and the C, D, and E codes will all start with a 1, as shown in Figure b. Subsequently, the left half of the tree gets a new division between A and B, which puts A on a leaf with code 00 and B on a leaf with code 01.
Arithmetic coding can produce greater overall compression than either Huffman or Shannon–Fano coding, since it can encode in fractional numbers of bits which more closely approximate the actual information content of the symbol. However, arithmetic coding has not superseded Huffman the way that Huffman supersedes Shannon–Fano.

A few years later, David A. Huffman (1952) gave a different algorithm that always produces an optimal tree for any given symbol probabilities. While Fano's Shannon–Fano tree is created by dividing from the root to the leaves, the Huffman algorithm works in the opposite direction, merging from the leaves to the root.
Around 1948, both Claude E. Shannon (1948) and Robert M. Fano (1949) independently proposed two different source coding algorithms for an efficient description of a discrete memoryless source. Unfortunately, in spite of being different, both schemes became known under the same name Shannon–Fano coding.
The Imploding algorithm is actually a combination of two distinct algorithms. The first algorithm compresses repeated byte sequences using a sliding dictionary. The second algorithm is used to compress the encoding of the sliding dictionary output, using multiple Shannon–Fano trees.
All symbols are sorted by frequency, from left to right (shown in Figure a). Putting the dividing line between symbols B and C results in a total of 22 in the left group and a total of 17 in the right group. This minimizes the difference in totals between the two groups.
with the lowest probabilities so 0 and 1 are prepended to their codes and they are combined. This then leaves just A and BCDE, which have 0 and 1 prepended respectively and are then combined. This leaves us with a single node and our algorithm is complete.
However, Shannon–Fano codes have an expected codeword length within 1 bit of optimal. Fano's method usually produces encodings with shorter expected lengths than Shannon's method, but Shannon's method is easier to analyse theoretically.
This example shows the construction of a Shannon–Fano code for a small alphabet. There are 5 different source symbols. Suppose 39 symbols in total have been observed with the following frequencies, from which we can estimate the symbol probabilities: A: 15 (≈ 0.385), B: 7 (≈ 0.179), C: 6 (≈ 0.154), D: 6 (≈ 0.154), E: 5 (≈ 0.128).
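As a quick check (not part of the original article), the estimated probabilities and the entropy of this source can be computed in Python; the entropy works out to about 2.186 bits per symbol, which is the lower bound the codes below are measured against.

```python
import math

counts = {"A": 15, "B": 7, "C": 6, "D": 6, "E": 5}   # 39 observed symbols
total = sum(counts.values())
probs = {s: c / total for s, c in counts.items()}     # estimated probabilities

# Entropy of the estimated distribution, in bits per symbol.
entropy = -sum(p * math.log2(p) for p in probs.values())
print({s: round(p, 3) for s, p in probs.items()})  # {'A': 0.385, 'B': 0.179, 'C': 0.154, 'D': 0.154, 'E': 0.128}
print(round(entropy, 3))                           # 2.186
```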
The left part of the list is assigned the binary digit 0, and the right part is assigned the digit 1. This means that the codes for the symbols in the first part will all start with 0, and the codes in the second part will all start with 1.
After four division procedures, a tree of codes results. In the final tree, the three symbols with the highest frequencies have all been assigned 2-bit codes, and the two symbols with lower counts have 3-bit codes, as shown in the table below:

Symbol:  A   B   C   D    E
Code:    00  01  10  110  111
Note that although the codewords under the two methods are different, the word lengths are the same. We have lengths of 2 bits for A, and 3 bits for B, C, D and E, giving an average length of

$\frac{2\,\text{bits}\cdot 15+3\,\text{bits}\cdot(7+6+6+5)}{39\,\text{symbols}}\approx 2.62$ bits per symbol,

which is within one bit of the entropy.
Recursively apply steps 3 and 4 to each of the two halves, subdividing groups and adding bits to the codes until each symbol has become a corresponding code leaf on the tree.
packed end-to-end in long sequences. If we consider groups of codes at a time, symbol-by-symbol Huffman coding is only optimal if the probabilities of the symbols are independent and are some power of a half, i.e., $1/2^k$.

For Shannon's method, the word lengths satisfy $l_i=\lceil -\log_2 p_i\rceil \leq -\log_2 p_i+1$, so the expected word length satisfies
$$\mathbb{E}L=\sum_{i=1}^{n}p_i l_i\leq \sum_{i=1}^{n}p_i(-\log_2 p_i+1)=-\sum_{i=1}^{n}p_i\log_2 p_i+\sum_{i=1}^{n}p_i=H(X)+1.$$
Hence the Shannon–Fano code is always within one bit of the optimal expected word length.
prefix codes; the set of probabilities {0.35, 0.17, 0.17, 0.16, 0.15} is an example of one that will be assigned non-optimal codes by Shannon–Fano coding.

One common way of choosing the codewords uses the binary expansion of the cumulative probabilities. This method was proposed in Shannon's "A Mathematical Theory of Communication" (1948), his article introducing the field of information theory.
Sort the lists of symbols according to frequency, with the most frequently occurring symbols at the left and the least common at the right.
Divide the list into two parts, with the total frequency counts of the left part being as close to the total of the right as possible.
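For the example frequencies used later (15, 7, 6, 6, 5), a quick illustrative calculation shows which split this step selects; the variable names here are arbitrary.

```python
# Frequencies sorted from most to least common, as in the example below.
freqs = [("A", 15), ("B", 7), ("C", 6), ("D", 6), ("E", 5)]
total = sum(c for _, c in freqs)

# Try every split point and report how balanced the two halves are.
for k in range(1, len(freqs)):
    left = sum(c for _, c in freqs[:k])
    print(freqs[:k], freqs[k:], "difference:", abs(left - (total - left)))
# The smallest difference (22 vs 17, difference 5) is the split between B and C.
```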
A Shannon–Fano tree is built according to a specification designed to define an effective code table. The actual algorithm is simple:
Neither Shannon–Fano algorithm is guaranteed to generate an optimal code. For this reason, Shannon–Fano codes are almost never used;
Create a new internal node with these two nodes as children and with probability equal to the sum of the two nodes' probabilities.
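The merging procedure that this step is part of can be sketched in Python as follows. This is an illustrative implementation, not the article's own: it uses the standard-library `heapq` module as the priority queue, and the dictionary-based node representation and the name `huffman_code` are assumptions of this sketch.

```python
import heapq
import itertools

def huffman_code(freqs):
    # Build a Huffman code from {symbol: frequency} by repeatedly merging the two
    # lowest-frequency nodes and prepending '0'/'1' to the codes of their leaves.
    counter = itertools.count()              # tie-breaker so heapq never compares dicts
    # Each heap entry: (frequency, tie_breaker, {symbol: partial_code})
    heap = [(f, next(counter), {s: ""}) for s, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f0, _, codes0 = heapq.heappop(heap)  # two nodes of lowest frequency
        f1, _, codes1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes0.items()}        # prepend 0 to one subtree
        merged.update({s: "1" + c for s, c in codes1.items()})  # and 1 to the other
        heapq.heappush(heap, (f0 + f1, next(counter), merged))
    return heap[0][2]

print(huffman_code({"A": 15, "B": 7, "C": 6, "D": 6, "E": 5}))
# A gets a 1-bit code and B, C, D, E get 3-bit codes (the exact codewords depend on tie-breaking).
```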
Shannon–Fano, both because arithmetic coding is more computationally expensive and because it is covered by multiple patents.
We see that Fano's method, with an average length of 2.28, has outperformed Shannon's method, with an average length of 2.62.
We see that the Huffman code has outperformed both types of Shannon–Fano code, which had expected lengths of 2.62 and 2.28.
Shannon's method starts by deciding on the lengths of all the codewords, then picks a prefix code with those word lengths.
A second method makes use of cumulative probabilities. First, the probabilities are written in decreasing order $p_1\geq p_2\geq \cdots \geq p_n$. Then, the cumulative probabilities are defined as $c_1=0$ and $c_i=\sum_{j=1}^{i-1}p_j$ for $i\geq 2$, and the codeword for symbol $i$ is taken to be the first $l_i$ binary digits in the binary expansion of $c_i$.
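A minimal Python sketch of this construction, assuming the probabilities are already sorted in decreasing order; the function name and the digit-extraction loop are illustrative, not from the article.

```python
import math

def shannon_cumulative_code(probs):
    # Codeword for symbol i = first ceil(-log2 p_i) bits of the binary expansion of
    # the cumulative probability c_i = p_1 + ... + p_(i-1).
    codes = []
    c = 0.0
    for p in probs:
        length = math.ceil(-math.log2(p))
        # Binary expansion of c to `length` places: repeatedly double and take the integer part.
        word, frac = "", c
        for _ in range(length):
            frac *= 2
            bit, frac = int(frac), frac - int(frac)
            word += str(bit)
        codes.append(word)
        c += p
    return codes

print(shannon_cumulative_code([15/39, 7/39, 6/39, 6/39, 5/39]))
# ['00', '011', '100', '101', '110']
```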
4354: 3974: 3768: 3556: 3005: 244: 4451: 3487: 3230: 2761:
The code lengths for the different characters this time are 1 bit for A and 3 bits for all other characters.
Regarding the confusion in the two different codes being referred to by the same name, Krajči et al. write:
It is shown by Krajči et al. that the expected length of Fano's method is bounded above by $\mathbb{E}L\leq H(X)+1-p_{\min}$, where $p_{\min}=\min_i p_i$ is the probability of the least common symbol.
This results in a length of 1 bit for A and 3 bits each for B, C, D and E, giving an average length of

$\frac{1\,\text{bit}\cdot 15+3\,\text{bits}\cdot(7+6+6+5)}{39\,\text{symbols}}\approx 2.23$ bits per symbol.
This results in lengths of 2 bits for A, B and C, and 3 bits each for D and E, giving an average length of

$\frac{2\,\text{bits}\cdot(15+7+6)+3\,\text{bits}\cdot(6+5)}{39\,\text{symbols}}\approx 2.28$ bits per symbol.
Shannon–Fano codes are suboptimal in the sense that they do not always achieve the lowest possible expected codeword length, as Huffman coding does.
by Cover and Thomas, Goldie and Pinch, Jones and Jones, and Han and Kobayashi. It is called Shannon coding by Yeung.
or frequency counts so that each symbol's relative frequency of occurrence is known.
based on a set of symbols and their probabilities (estimated or measured).
For the Shannon–Fano code, we need to calculate the desired word lengths $l_i=\lceil -\log_2 p_i\rceil$.
We use the same frequencies as for the Shannon–Fano example above, viz: A: 15, B: 7, C: 6, D: 6, E: 5.
Prepend 0 and 1 respectively to any code already assigned to these nodes
Fano's (1949) method, using binary division of probabilities, is called Shannon–Fano coding by Salomon and Gupta. It is called Fano coding by Krajči et al.
Remove the two nodes of lowest probability or frequency from the queue
Shannon's (1948) method, using predefined word lengths, is called Shannon–Fano coding
The remaining node is the root node and the tree is complete.
For a given list of symbols, develop a corresponding list of probabilities or frequency counts.
Alternatively, we can use the cumulative probability method.
Shannon's source coding theorem says that any code must have an average length of at least the entropy $H(X)=-\sum_{i=1}^{n}p_i\log_2 p_i$.
it fell on. This method was proposed in a later (in print) technical report by Fano (1949).
prefix-free coding schemes with a similar performance.
010. Continuing like this, we get the following code: A = 00, B = 010, C = 011, D = 100, E = 101.
Fano's version of Shannon–Fano coding is used in the IMPLODE compression method, which is part of the ZIP file format.

References

Shannon, C.E. (July 1948). "A Mathematical Theory of Communication [reprint with corrections]". Bell System Technical Journal. 27: 379–423. doi:10.1002/j.1538-7305.1948.tb01338.x.
Fano, R.M. (1949). "The transmission of information". Technical Report No. 65. Cambridge (Mass.), USA: Research Laboratory of Electronics at MIT.
Huffman, D. (1952). "A Method for the Construction of Minimum-Redundancy Codes". Proceedings of the IRE. 40 (9): 1098–1101. doi:10.1109/JRPROC.1952.273898.
Stanislav Krajči, Chin-Fu Liu, Ladislav Mikeš and Stefan M. Moser (2015), "Performance analysis of Fano coding", 2015 IEEE International Symposium on Information Theory (ISIT).
Thomas M. Cover and Joy A. Thomas (2006), Elements of Information Theory (2nd ed.), Wiley–Interscience. "Historical Notes" to Chapter 5.
Charles M. Goldie and Richard G. E. Pinch (1991), Communication Theory, Cambridge University Press. Section 1.6.
Gareth A. Jones and J. Mary Jones (2012), Information and Coding Theory, Springer. Section 3.4.
Te Sun Han and Kingo Kobayashi (2007), Mathematics of Information and Coding, American Mathematical Society. Subsection 3.7.1.
Raymond W Yeung (2002), A First Course in Information Theory, Springer. Subsection 3.2.2.
David Salomon (2013), Data Compression: The Complete Reference, Springer. Section 2.6.
Prakash C. Gupta (2006), Data Communications and Computer Networks, Phi Publishing. Subsection 1.11.5.
Ze-Nian Li; Mark S. Drew; Jiangchuan Liu (9 April 2014). Fundamentals of Multimedia. Springer Science & Business Media. ISBN 978-3-319-05290-8.
Kaur, Sandeep; Singh, Sukhjeet (May 2016). "Entropy Coding and Different Coding Techniques". Journal of Network Communications and Emerging Technologies. 6 (5): 5. S2CID 212439287.
"APPNOTE.TXT - .ZIP File Format Specification". PKWARE Inc. 2007-09-28. Retrieved 2008-01-06.