uncertainty as to the original signal's value. If the receiver has some information about the random process that generates the noise, one can in principle recover the information in the original signal by considering all possible states of the noise process. In the case of the
Shannon–Hartley theorem, the noise is assumed to be generated by a Gaussian process with a known variance. Since the variance of a Gaussian process is equivalent to its power, it is conventional to call this variance the noise power.
symbol pulse, with each slightly different level being assigned a different meaning or bit sequence. Taking into account both noise and bandwidth limitations, however, there is a limit to the amount of information that can be transferred by a signal of a bounded power, even when sophisticated multi-level encoding techniques are used.
Such a channel is called the
Additive White Gaussian Noise channel, because Gaussian noise is added to the signal; "white" means equal amounts of noise at all frequencies within the channel bandwidth. Such noise can arise both from random sources of energy and also from coding and measurement error
1404:
Bandwidth and noise affect the rate at which information can be transmitted over an analog channel. Bandwidth limitations alone do not impose a cap on the maximum information rate because it is still possible for the signal to take on an indefinitely large number of different voltage levels on each
790:
Hartley argued that the maximum number of distinguishable pulse levels that can be transmitted and received reliably over a communications channel is limited by the dynamic range of the signal amplitude and the precision with which the receiver can distinguish amplitude levels. Specifically, if the
1400:
If there were such a thing as a noise-free analog channel, one could transmit unlimited amounts of error-free data over it per unit of time (Note that an infinite-bandwidth analog channel could not transmit unlimited amounts of error-free data absent infinite signal power). Real channels, however,
1942:
noise. This formula's way of introducing frequency-dependent noise cannot describe all continuous-time noise processes. For example, consider a noise process consisting of adding a random wave whose amplitude is 1 or −1 at any point in time, and a channel that adds such a wave to the source
1408:
In the channel considered by the
Shannon–Hartley theorem, noise and signal are combined by addition. That is, the receiver measures a signal that is equal to the sum of the signal encoding the desired information and a continuous random variable that represents the noise. This addition creates
(1 + 10^−3) = 1443 bit/s. These values are typical of the received ranging signals of the GPS, where the navigation message is sent at 50 bit/s (below the channel capacity for the given S/N), and whose bandwidth is spread to around 1 MHz by a pseudo-noise multiplication before transmission.
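The arithmetic in this example can be double-checked with a short sketch (an illustration added here, not part of the original text; the variable names are ad hoc):

```python
import math

B = 1e6          # bandwidth: 1 MHz
snr_db = -30.0   # signal deeply buried in noise
snr = 10 ** (snr_db / 10)      # linear S/N = 10^-3
C = B * math.log2(1 + snr)     # exact Shannon-Hartley capacity
C_approx = 1.44 * B * snr      # low-SNR approximation 1.44*B*S/N
print(round(C), round(C_approx))
```

Both numbers come out near the 1443 bit/s quoted above; the small spread is rounding in the 1.44 constant.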
1943:
signal. Such a wave's frequency components are highly dependent. Though such a noise may have a high power, it is fairly easy to transmit a continuous signal with much less power than one would need if the underlying noise was a sum of independent noises in each frequency band.
1199:
symbols per second. Some authors refer to it as a capacity. But such an errorless channel is an idealization, and if M is chosen small enough to make the noisy channel nearly errorless, the result is necessarily less than the
Shannon capacity of the noisy channel of bandwidth
2816:
As stated above, channel capacity is proportional to the bandwidth of the channel and to the logarithm of SNR. This means channel capacity can be increased linearly either by increasing the channel's bandwidth given a fixed SNR requirement or, with fixed bandwidth, by using
2430:
1249:
versus levels of noise interference and data corruption. The proof of the theorem shows that a randomly constructed error-correcting code is essentially as good as the best possible code; the theorem is proved through the statistics of such random codes.
2035:
1385:
the probability of error at the receiver increases without bound as the rate is increased. So no useful information can be transmitted beyond the channel capacity. The theorem does not address the rare situation in which rate and capacity are equal.
1413:
at the sender and receiver respectively. Since sums of independent
Gaussian random variables are themselves Gaussian random variables, this conveniently simplifies analysis, if one assumes that such error sources are also Gaussian and independent.
2808:
What is the channel capacity for a signal having a 1 MHz bandwidth, received with an SNR of −30 dB? That means a signal deeply buried in noise. −30 dB means S/N = 10^−3. It leads to a maximal rate of information of 10^6 log2
1329:
there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small. This means that theoretically, it is possible to transmit information nearly without error up to nearly a limit of
1793:
1600:
pulse levels can be literally sent without any confusion. More levels are needed to allow for redundant coding and error correction, but the net data rate that can be approached with coding is equivalent to using that
1521:
1168:
The concept of an error-free capacity awaited Claude
Shannon, who built on Hartley's observations about a logarithmic measure of information and Nyquist's observations about the effect of bandwidth limitations.
2286:
2376:
230:
in the presence of the noise interference, assuming that the signal power is bounded, and that the
Gaussian noise process is characterized by a known power or power spectral density. The law is named after
- gives an entertaining and thorough introduction to Shannon theory, including two proofs of the noisy-channel coding theorem. This text also discusses state-of-the-art methods from coding theory, such as
394:
2651:
2772:
{\displaystyle \log _{2}\left(1+{\frac {S}{N}}\right)={\frac {1}{\ln 2}}\cdot \ln \left(1+{\frac {S}{N}}\right)\approx {\frac {1}{\ln 2}}\cdot {\frac {S}{N}}\approx 1.44\cdot {S \over N};}
1568:
1397:
in
Hartley's line rate formula in terms of a signal-to-noise ratio, but achieving reliability through error-correction coding rather than through reliably distinguishable pulse levels.
1241:
during World War II provided the next big step in understanding how much information could be reliably communicated through noisy channels. Building on
Hartley's foundation, Shannon's
{\displaystyle \log _{2}\left(1+{\frac {S}{N}}\right)\approx \log _{2}{\frac {S}{N}}={\frac {\ln 10}{\ln 2}}\cdot \log _{10}{\frac {S}{N}}\approx 3.32\cdot \log _{10}{\frac {S}{N}},}
844:
1132:
1996:
921:
1141:
should depend on the noise statistics of the channel, or how the communication could be made reliable even when individual symbol pulses could not be reliably distinguished to
3244:
2723:
660:
605:
developed the concept of channel capacity, based in part on the ideas of
Nyquist and Hartley, and then formulated a complete theory of information and its transmission.
2420:
1655:
is the total power of the received signal and noise together. A generalization of the above equation for the case where the additive noise is not white (or that the
as a communications system. At the time, these concepts were powerful breakthroughs individually, but they were not part of a comprehensive theory. In the 1940s,
566:
1197:
1004:
960:
Hartley then combined the above quantification with Nyquist's observation that the number of independent pulses that could be put through a channel of bandwidth
The square root effectively converts the power ratio back to a voltage ratio, so the number of levels is approximately proportional to the ratio of signal
787:
bits per second). This method, later known as Hartley's law, became an important precursor for Shannon's more sophisticated notion of channel capacity.
1393:
channel subject to Gaussian noise. It connects Hartley's result with Shannon's channel capacity theorem in a form that is equivalent to specifying the
619:
In 1927, Nyquist determined that the number of independent pulses that could be put through a telegraph channel per unit time is limited to twice the
2797:
If the requirement is to transmit at 50 kbit/s, and a bandwidth of 10 kHz is used, then the minimum S/N required is given by 50000 = 10000 log2
1693:
1687:
is not constant with frequency over the bandwidth) is obtained by treating the channel as many narrow, independent Gaussian channels in parallel:
3064:
1436:
171:
2786:
If the SNR is 20 dB, and the bandwidth available is 4 kHz, which is appropriate for telephone communications, then C = 4000 log2
2221:
2297:
(CNR) of the communication signal to the noise and interference at the receiver (expressed as a linear power ratio, not as logarithmic
2976:
1426:
Comparing the channel capacity to the information rate from Hartley's law, we can find the effective number of distinguishable levels
3056:
334:
3176:
3149:
3003:
510:
is the average received signal power over the bandwidth (in case of a carrier-modulated passband transmission, often denoted
2825:
improves, but at the cost of the SNR requirement. Thus, there is an exponential rise in the SNR requirement if one adopts a
2608:
2839:
31:
103:
2731:
597:
developed a handful of fundamental ideas related to the transmission of information, particularly in the context of the
in which case the capacity is logarithmic in power and approximately linear in bandwidth (not quite linear, since
3249:
2826:
791:
amplitude of the transmitted signal is restricted to the range of [−A ... +A] volts, and the precision of the receiver is ±Δ
2865:
. Nyquist published his results in 1928 as part of his paper "Certain topics in Telegraph Transmission Theory".
538:
is the average power of the noise and interference over the bandwidth, measured in watts (or volts squared); and
88:
3209:
2783:
At an SNR of 0 dB (signal power = noise power) the capacity in bit/s is equal to the bandwidth in hertz.
1955:
1951:
1010:
620:
477:
303:
227:
193:
1580:
This similarity in form between Shannon's capacity and Hartley's law should not be interpreted to mean that
805:
192:
tells the maximum rate at which information can be transmitted over a communications channel of a specified
from a statistical description of a channel, and establishes that given a noisy channel with capacity
2689:
1145:
levels; with Gaussian noise statistics, system designers had to choose a very conservative value of
3205:
(101) = 26.63 kbit/s. Note that the value of S/N = 100 is equivalent to an SNR of 20 dB.
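A quick numeric check of this telephone-channel example (an illustrative sketch, not part of the original text):

```python
import math

# 4 kHz telephone channel at 20 dB SNR, i.e. S/N = 100
C = 4000 * math.log2(1 + 100)
print(C / 1000)  # in kbit/s, close to the 26.63 quoted above
```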
1057:, in bit/s. Other times it is quoted in this more quantitative form, as an achievable line rate of
2010:
For large or small and constant signal-to-noise ratios, the capacity formula can be approximated:
629:
3254:
3219:
2818:
573:
512:
211:
2656:
In this low-SNR approximation, capacity is independent of bandwidth if the noise is white, of
2391:
1958:
channel capacity with the power-limited regime and bandwidth-limited regime indicated. Here,
1389:
The Shannon–Hartley theorem establishes what that channel capacity is for a finite-bandwidth
1246:
1234:
1034:
602:
569:
271:
232:
(1 + S/N), so C/B = 5; then S/N = 2^5 − 1 = 31, corresponding to an SNR of 14.91 dB (10 × log10(31)).
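Rearranging C = B log2(1 + S/N) for the minimum S/N, as this example does, can be sketched as follows (illustrative only; min_snr_linear is a made-up helper name):

```python
import math

def min_snr_linear(rate_bps: float, bandwidth_hz: float) -> float:
    """Smallest linear S/N for which B*log2(1 + S/N) reaches the rate."""
    return 2 ** (rate_bps / bandwidth_hz) - 1

snr = min_snr_linear(50_000, 10_000)   # 2**5 - 1 = 31
snr_db = 10 * math.log10(snr)          # about 14.91 dB
print(snr, snr_db)
```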
In the simple version above, the signal and noise are fully uncorrelated, in which case
pulses per second, to arrive at his quantitative measure for achievable line rate.
are subject to limitations imposed by both finite bandwidth and nonzero noise.
279:
215:
2211:
increases with bandwidth, imparting a logarithmic effect). This is called the
3228:
{\displaystyle C=\int _{0}^{B}\log _{2}\left(1+{\frac {S(f)}{N(f)}}\right)df}
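One way to read this integral numerically is a Riemann sum over narrow sub-bands (a sketch; both spectral-density functions below are invented for illustration):

```python
import math

def capacity_colored(s_psd, n_psd, bandwidth_hz, steps=10_000):
    """Approximate C = integral_0^B log2(1 + S(f)/N(f)) df by a midpoint sum."""
    df = bandwidth_hz / steps
    total = 0.0
    for i in range(steps):
        f = (i + 0.5) * df
        total += math.log2(1 + s_psd(f) / n_psd(f)) * df
    return total

B = 1e4
# Flat signal density, noise density rising linearly across the band.
C = capacity_colored(lambda f: 1e-3, lambda f: 1e-7 * (1 + f / B), B)
print(C)
```

With white noise (a constant n_psd) the sum reduces to B·log2(1 + S/N), which gives a handy sanity check.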
1574:
594:
590:
236:
208:
73:
3159:
2821:
that need a very high SNR to operate. As the modulation rate increases, the
3186:
737:
614:
431:
222:
for such a communication link, a bound on the maximum amount of error-free
98:
27:
Theorem that tells the maximum rate at which information can be transmitted
223:
3202:
On-line textbook: Information Theory, Inference, and Learning Algorithms
1009:
Hartley's law is sometimes quoted as just a proportionality between the
3213:
953:
is the pulse rate, also known as the symbol rate, in symbols/second or
48:
1516:{\displaystyle 2B\log _{2}(M)=B\log _{2}\left(1+{\frac {S}{N}}\right)}
776:
During 1928, Hartley formulated a way to quantify information and its
2902:
851:
777:
598:
1421:
1172:
Hartley's rate result can be viewed as the capacity of an errorless
3110:
An Introduction to Information Theory: symbols, signals & noise
{\displaystyle C\approx 0.332\cdot B\cdot \mathrm {SNR\ (in\ dB)} }
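To see how good this decibel-based shortcut is at high SNR, compare it with the exact formula (an illustrative sketch, not from the original text; the bandwidth and SNR values are assumptions):

```python
import math

B = 4000            # hypothetical bandwidth in Hz
snr_db = 40         # strong signal, so the approximation applies
snr = 10 ** (snr_db / 10)

exact = B * math.log2(1 + snr)
approx = 0.332 * B * snr_db    # C ~= 0.332 * B * SNR(in dB)
print(exact, approx)
```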
858:
that could be sent, Hartley constructed a measure of the line rate
577:
485:
2371:{\displaystyle \mathrm {SNR\ (in\ dB)} =10\log _{10}{S \over N}.}
850:
By taking information per pulse in bit/pulse to be the base-2-
2844:
980:
481:
1223:
1624:
1220:, which is the Hartley–Shannon result that followed later.
954:
2598:
then the capacity is linear in power. This is called the
{\displaystyle C=B\log _{2}\left(1+{\frac {S}{N}}\right)}
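The theorem's formula is easy to evaluate directly; the sketch below is an illustration added here (the function name shannon_capacity is not from the article):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bit/s."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# e.g. a 3 kHz channel at 30 dB (S/N = 1000) supports just under 30 kbit/s
print(shannon_capacity(3000, 1000))
```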
2686:
watts per hertz, in which case the total noise power is
278:
of data that can be communicated at an arbitrarily low
226:
per time unit that can be transmitted with a specified
{\displaystyle C\approx 1.44\cdot B\cdot {S \over N}.}
2951:
Information Theory; and its Engineering Applications
1245:(1948) describes the maximum possible efficiency of
302:through an analog communication channel subject to
3107:
{\displaystyle C\approx 1.44\cdot {S \over N_{0}}}
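This bandwidth-independence in the power-limited regime can be seen numerically (a sketch; the signal power and noise-density values are assumed for illustration):

```python
import math

S = 1.0    # received signal power in watts (assumed)
N0 = 1.0   # noise spectral density in watts per hertz (assumed)

def awgn_capacity(bandwidth_hz: float) -> float:
    # White noise: total noise power grows as N = B * N0
    return bandwidth_hz * math.log2(1 + S / (bandwidth_hz * N0))

# Widening the band keeps raising capacity, but only toward the
# power-limited ceiling 1.44 * S / N0 = 1/ln(2) here.
for B in (1e3, 1e6, 1e9):
    print(B, awgn_capacity(B))
```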
2766:
2717:
2678:
2645:
2587:
2414:
2370:
2280:
2196:
1990:
1926:
1904:
1870:
1839:
1813:
1787:
1677:
1647:
1613:
1592:
1562:
1515:
1374:
1342:
1318:
1289:
1269:
1212:
1191:
1157:
1126:
1069:
1049:
1025:
998:
972:
945:
915:
838:
volts, then the maximum number of distinct pulses
756:
727:
704:
692:is the pulse frequency (in pulses per second) and
684:
654:
560:
530:
502:
468:
446:
414:
388:
319:
294:
262:
2866:"Certain topics in telegraph transmission theory"
Comparison of Shannon's capacity to Hartley's law
741:, and transmitting at the limiting pulse rate of
3226:
3167:Wozencraft, John M.; Jacobs, Irwin Mark (1965).
3166:
2424:), applying the approximation to the logarithm:
1137:Hartley did not work out exactly how the number
3065:Proceedings of the Institute of Radio Engineers
2006:can be scaled proportionally for other values.
1416:
3139:
{\displaystyle M={\sqrt {1+{\frac {S}{N}}}}.}
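This effective-levels formula can be evaluated in one line (an illustrative sketch added here, not part of the original text):

```python
import math

def effective_levels(snr_linear: float) -> float:
    """M = sqrt(1 + S/N): distinguishable levels implied by the capacity."""
    return math.sqrt(1 + snr_linear)

M = effective_levels(100)   # 20 dB SNR
print(M, math.log2(M))      # about 10 levels, roughly 3.3 bits per pulse
```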
165:
3140:Taub, Herbert; Schilling, Donald L. (1986).
2829:; however, the spectral efficiency improves.
3041:. Urbana, IL: University of Illinois Press.
3001:
1938:Note: the theorem only applies to Gaussian
1277:and information transmitted at a line rate
242:
712:is the bandwidth (in hertz). The quantity
172:
158:
3002:Dunlop, John; Smith, D. Geoffrey (1998).
2981:(2nd ed.). Thomson Delmar Learning.
2013:
1253:Shannon's theorem shows how to compute a
Noisy channel coding theorem and capacity
584:
3169:Principles of Communications Engineering
3057:"Communication in the presence of noise"
3035:The Mathematical Theory of Communication
1950:
Frequency-dependent (colored noise) case
1033:, in Hertz and what today is called the
516:), measured in watts (or volts squared);
488:bandwidth in case of a bandpass signal);
3051:
3028:
2974:
2913:
2863:
282:using an average received signal power
247:The Shannon–Hartley theorem states the
14:
3227:
3105:
2857:
1847:is the bandwidth of the channel in Hz;
839:{\displaystyle M=1+{A \over \Delta V}}
623:of the channel. In symbolic notation,
2386:Similarly, when the SNR is small (if
2381:
{\displaystyle R\leq 2B\log _{2}(M).}
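Hartley's law itself is a one-liner (a sketch added for illustration; hartley_rate is an invented name):

```python
import math

def hartley_rate(bandwidth_hz: float, levels: int) -> float:
    """Hartley's law: line rate R = 2B * log2(M) in bit/s."""
    return 2 * bandwidth_hz * math.log2(levels)

# A 3 kHz line that reliably distinguishes 4 amplitude levels
# carries at most 12 kbit/s.
print(hartley_rate(3000, 4))
```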
434:(information rate, sometimes denoted
2947:
2029:), the logarithm is approximated by
1991:{\displaystyle {\frac {S}{N_{0}}}=1}
916:{\displaystyle R=f_{p}\log _{2}(M),}
218:. The theorem establishes Shannon's
3142:Principles of Communication Systems
1353:The converse is also important. If
854:of the number of distinct messages
454:) excluding error-correction codes;
430:, a theoretical upper bound on the
104:Limiting density of discrete points
24:
2978:Introduction to Telecommunications
2954:(3rd ed.). New York: Pitman.
2935:10.1002/j.1538-7305.1928.tb01236.x
2329:
2326:
2320:
2317:
2308:
2305:
2302:
2271:
2268:
2262:
2259:
2250:
2247:
2244:
827:
25:
3266:
3220:MIT News article on Shannon Limit
3195:
1946:
115:Asymptotic equipartition property
2840:Nyquist–Shannon sampling theorem
771:
47:
32:Nyquist–Shannon sampling theorem
2914:Hartley, R. V. L. (July 1928).
608:
131:Shannon's source coding theorem
3210:low-density parity-check codes
3106:Pierce, John Robinson (1980).
3099:
3045:
3022:
3005:Telecommunications Engineering
2995:
2968:
2941:
2907:
2718:{\displaystyle N=B\cdot N_{0}}
2332:
2314:
2274:
2256:
1899:
1893:
1865:
1859:
1768:
1762:
1754:
1748:
1465:
1459:
1118:
1112:
907:
901:
766:signalling at the Nyquist rate
200:. It is an application of the
89:Conditional mutual information
13:
1:
3132:
2923:Bell System Technical Journal
2916:"Transmission of Information"
2864:Nyquist, Harry (April 1928).
1577:to noise standard deviation.
1165:to achieve a low error rate.
304:additive white Gaussian noise
1243:noisy channel coding theorem
1230:Noisy-channel coding theorem
735:later came to be called the
655:{\displaystyle f_{p}\leq 2B}
204:to the archetypal case of a
202:noisy-channel coding theorem
141:Noisy-channel coding theorem
7:
2893:10.1109/T-AIEE.1928.5055024
2833:
2777:
1912:is the noise power spectrum
Implications of the theorem
10:
3271:
3078:10.1109/JRPROC.1949.232969
1227:
612:
270:, meaning the theoretical
29:
3240:Telecommunication theory
2975:Gokhale, Anu A. (2004).
2850:
2819:higher-order modulations
2415:{\displaystyle S/N\ll 1}
2213:bandwidth-limited regime
1247:error-correcting methods
Statement of the theorem
30:Not to be confused with
2018:When the SNR is large (
589:During the late 1920s,
190:Shannon–Hartley theorem
146:Shannon–Hartley theorem
3250:Theorems in statistics
Bandwidth-limited case
1375:{\displaystyle R>C}
1344:
1320:
1319:{\displaystyle R<C}
1291:
1271:
1214:
Historical development
574:carrier-to-noise ratio
562:
532:
504:
470:
448:
416:
390:
321:
296:
264:
212:communications channel
120:Rate–distortion theory
2769:
2720:
2681:
2679:{\displaystyle N_{0}}
946:{\displaystyle f_{p}}
918:
841:
764:pulses per second as
759:
730:
707:
687:
685:{\displaystyle f_{p}}
657:
570:signal-to-noise ratio
2948:Bell, D. A. (1962).
(1 + 100) = 4000 log2
2732:
2690:
2663:
2609:
2600:power-limited regime
2431:
2392:
2298:
2222:
2036:
1962:
1918:
1905:{\displaystyle N(f)}
1887:
1871:{\displaystyle S(f)}
1853:
1831:
1805:
1694:
1661:
1633:
1605:
1584:
1528:
1437:
1360:
1334:
1304:
1281:
1261:
1204:
1180:
1149:
1084:
1061:
1041:
1017:
987:
964:
930:
869:
806:
782:data signalling rate
745:
716:
696:
669:
630:
544:
522:
494:
460:
438:
406:
335:
311:
286:
254:
84:Directed information
64:Differential entropy
3095:on 8 February 2010.
2885:1928TAIEE..47..617N
2823:spectral efficiency
1934:is frequency in Hz.
1825:in bits per second;
1717:
1678:{\displaystyle S/N}
1648:{\displaystyle S+N}
561:{\displaystyle S/N}
274:upper bound on the
196:in the presence of
69:Conditional entropy
3235:Information theory
2897:Also 2002 Reprint
2764:
2715:
2676:
2643:
2585:
2412:
Power-limited case
2368:
2278:
2194:
2008:
1988:
1940:stationary process
1924:
1902:
1868:
1837:
1811:
1785:
1703:
1675:
1645:
1621:in Hartley's law.
1611:
1590:
1560:
1513:
1372:
1340:
1316:
1287:
1267:
1239:information theory
1237:'s development of
1210:
1192:{\displaystyle 2B}
1189:
1155:
1124:
1067:
1047:
1023:
999:{\displaystyle 2B}
996:
970:
943:
913:
836:
757:{\displaystyle 2B}
754:
728:{\displaystyle 2B}
725:
702:
682:
652:
558:
528:
500:
480:of the channel in
466:
444:
412:
386:
317:
292:
260:
186:information theory
3178:978-0-471-96240-3
3151:978-0-07-062956-1
1927:{\displaystyle f}
1840:{\displaystyle B}
1814:{\displaystyle C}
1772:
1614:{\displaystyle M}
1593:{\displaystyle M}
1555:
1553:
1506:
1350:bits per second.
1343:{\displaystyle C}
1290:{\displaystyle R}
1270:{\displaystyle C}
1213:{\displaystyle B}
1158:{\displaystyle M}
1077:bits per second:
1070:{\displaystyle R}
1050:{\displaystyle R}
1035:digital bandwidth
1026:{\displaystyle B}
973:{\displaystyle B}
834:
705:{\displaystyle B}
531:{\displaystyle N}
503:{\displaystyle S}
469:{\displaystyle B}
447:{\displaystyle I}
415:{\displaystyle C}
379:
320:{\displaystyle N}
295:{\displaystyle S}
263:{\displaystyle C}
3262:
3190:
3163:
3126:
3125:
3113:
3103:
3097:
3096:
3094:
3088:. Archived from
3061:
3055:(January 1949).
3049:
3043:
3042:
3040:
3026:
3020:
3019:
2999:
2993:
2992:
2972:
2966:
2965:
2945:
2939:
2938:
2920:
2911:
2905:
2903:10.1109/5.989873
2896:
2870:
2861:
2658:spectral density
1823:channel capacity
1255:channel capacity
1176:-ary channel of
1011:analog bandwidth
424:channel capacity
306:(AWGN) of power
301:
299:
298:
293:
276:information rate
269:
267:
266:
261:
249:channel capacity
220:channel capacity
3144:. McGraw-Hill.
3135:
3130:
3129:
3122:
3104:
3100:
3092:
3059:
3050:
3046:
3038:
3027:
3023:
3016:
3000:
2996:
2989:
2973:
2969:
2962:
2946:
2942:
2918:
2912:
2908:
2868:
2862:
2858:
2853:
1391:continuous-time
780:(also known as
428:bits per second
206:continuous-time
External links
3194:
3192:
3191:
3177:
3164:
3150:
3136:
3134:
3131:
3128:
3127:
3120:
3098:
3053:Shannon, C. E.
3044:
3030:Shannon, C. E.
3021:
3014:
2994:
2987:
2967:
2960:
2940:
2929:(3): 535–563.
2906:
2855:
2854:
2852:
2849:
2848:
2847:
2842:
2835:
2832:
2831:
2830:
2827:16QAM or 64QAM
Approximations
1945:
1936:
1935:
1923:
1913:
1901:
1898:
1895:
1892:
1882:
1880:power spectrum
1878:is the signal
1867:
1864:
1861:
1858:
1848:
1836:
1826:
1235:Claude Shannon
1228:Main article:
613:Main article:
610:
607:
603:Claude Shannon
233:Claude Shannon
216:Gaussian noise
3121:0-486-24061-4
3117:
3112:
3111:
3102:
3091:
3087:
3083:
3079:
3075:
3071:
3067:
3066:
3058:
3054:
3048:
3037:
3036:
3031:
3025:
3017:
3015:0-7487-4044-9
3011:
3008:. CRC Press.
3007:
3006:
2998:
2990:
2988:1-4018-5648-9
2984:
2980:
2979:
2971:
2963:
2961:9780273417576
2957:
2953:
2952:
2944:
2936:
2932:
2928:
2924:
2917:
2910:
2904:
2900:
2894:
2890:
2886:
2882:
2879:(2): 617–44.
1575:RMS amplitude
772:Hartley's law
769:
767:
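Hartley's combined rate can be computed in a couple of lines. A minimal sketch, with the 3 kHz / 8-level figures chosen purely for illustration:

```python
import math

def hartley_rate(bandwidth_hz: float, levels: int) -> float:
    """Hartley's achievable line rate R = 2B * log2(M) in bit/s,
    using Nyquist's maximum pulse rate f_p = 2B pulses per second."""
    return 2 * bandwidth_hz * math.log2(levels)

# Illustrative: a 3 kHz channel whose receiver distinguishes M = 8 levels
# gives 6000 pulses/s at 3 bit/pulse.
print(hartley_rate(3000, 8))  # 18000.0 bit/s
```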
Noisy channel coding theorem and capacity

Claude Shannon's development of information theory during World War II provided the next big step in understanding how much information could be reliably communicated through noisy channels. Building on Hartley's foundation, Shannon's noisy channel coding theorem (1948) describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption. The proof of the theorem shows that a randomly constructed error-correcting code is essentially as good as the best possible code; the theorem is proved through the statistics of such random codes.

Shannon's theorem shows how to compute a channel capacity from a statistical description of a channel, and establishes that, given a noisy channel with capacity C and information transmitted at a line rate R, then if

R < C

there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate below a limiting rate of C bits per second.

The converse is also important. If

R > C

the probability of error at the receiver increases without bound as the rate is increased, so no useful information can be transmitted beyond the channel capacity. The theorem does not address the rare situation in which rate and capacity are equal.

The Shannon–Hartley theorem establishes what that channel capacity is for a finite-bandwidth continuous-time channel subject to Gaussian noise. It connects Hartley's result with Shannon's channel capacity theorem in a form that is equivalent to specifying the M in Hartley's line rate formula in terms of a signal-to-noise ratio, but achieving reliability through error-correction coding rather than through reliably distinguishable pulse levels.
Comparison of Shannon's capacity to Hartley's law

Comparing the channel capacity to the information rate from Hartley's law, we can find the effective number of distinguishable levels M:

2B log2(M) = B log2(1 + S/N)

M = √(1 + S/N).

The square root effectively converts the power ratio back to a voltage ratio, so the number of levels is approximately proportional to the ratio of signal RMS amplitude to noise standard deviation.

This similarity in form between Shannon's capacity and Hartley's law should not be interpreted to mean that M pulse levels can be literally sent without any confusion. More levels are needed to allow for redundant coding and error correction, but the net data rate that can be approached with coding is equivalent to using that M in Hartley's law.
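The effective number of levels M = √(1 + S/N) is easy to evaluate; a short sketch, with the S/N = 31 value chosen because it recurs in the examples below:

```python
import math

def effective_levels(snr_linear: float) -> float:
    """Effective number of distinguishable levels M = sqrt(1 + S/N),
    obtained by equating Hartley's rate 2B*log2(M) with Shannon's
    capacity B*log2(1 + S/N)."""
    return math.sqrt(1 + snr_linear)

# At S/N = 31 (about 14.9 dB), roughly 5.66 effective levels:
print(round(effective_levels(31), 3))  # 5.657
```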
Frequency-dependent (colored noise) case

In the simple version above, the signal and noise are fully uncorrelated, in which case S + N is the total power of the received signal and noise together. A generalization of the above equation for the case where the additive noise is not white (or that the S/N is not constant with frequency over the bandwidth) is obtained by treating the channel as many narrow, independent Gaussian channels in parallel:

C = ∫₀ᴮ log2(1 + S(f)/N(f)) df

where

C is the channel capacity in bits per second;
B is the bandwidth of the channel in hertz;
S(f) is the signal power spectrum;
N(f) is the noise power spectrum; and
f is frequency in hertz.

Note: the theorem only applies to Gaussian stationary process noise. This formula's way of introducing frequency-dependent noise cannot describe all continuous-time noise processes. For example, consider a noise process consisting of adding a random wave whose amplitude is 1 or −1 at any point in time, and a channel that adds such a wave to the source signal. Such a wave's frequency components are highly dependent. Though such a noise may have a high power, it is fairly easy to transmit a continuous signal with much less power than one would need if the underlying noise was a sum of independent noises in each frequency band.
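The parallel-narrowband view translates directly into a numerical integration. A minimal sketch, where the flat signal spectrum and the linearly rising noise density are hypothetical shapes chosen only to exercise the formula:

```python
import math

def capacity_colored(S, N, bandwidth_hz, steps=10000):
    """Approximate C = integral from 0 to B of log2(1 + S(f)/N(f)) df
    by a midpoint Riemann sum. S and N are callables mapping frequency
    in Hz to power spectral density."""
    df = bandwidth_hz / steps
    total = 0.0
    for i in range(steps):
        f = (i + 0.5) * df  # midpoint of the i-th narrow sub-band
        total += math.log2(1 + S(f) / N(f)) * df
    return total

# Hypothetical spectra: flat signal, noise density doubling across the band.
S = lambda f: 1.0
N = lambda f: 0.001 * (1 + f / 4000)
print(round(capacity_colored(S, N, 4000)), "bit/s")
```

With flat (white) noise the sum collapses back to B log2(1 + S/N), which gives a useful sanity check on the implementation.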
Approximations

For large or small and constant signal-to-noise ratios, the capacity formula can be approximated.

Bandwidth-limited case

When the SNR is large (S/N ≫ 1), the logarithm is approximated by

log2(1 + S/N) ≈ log2(S/N) = (ln 10 / ln 2) ⋅ log10(S/N) ≈ 3.32 ⋅ log10(S/N),

in which case the capacity is logarithmic in power and approximately linear in bandwidth (not quite linear, since N increases with bandwidth, imparting a logarithmic effect). This is called the bandwidth-limited regime.

C ≈ 0.332 ⋅ B ⋅ SNR (in dB)

where

SNR (in dB) = 10 log10(S/N).

Power-limited case

Similarly, when the SNR is small (S/N ≪ 1), applying the approximation to the logarithm gives

log2(1 + S/N) = (1/ln 2) ⋅ ln(1 + S/N) ≈ (1/ln 2) ⋅ S/N ≈ 1.44 ⋅ S/N;

then the capacity is linear in power. This is called the power-limited regime.

C ≈ 1.44 ⋅ B ⋅ S/N.

In this low-SNR approximation, capacity is independent of bandwidth if the noise is white, of spectral density N0 watts per hertz, in which case the total noise power is N = B ⋅ N0:

C ≈ 1.44 ⋅ S/N0.
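Both approximations can be compared against the exact formula. A sketch, with the 1 MHz bandwidth and the 30 dB / −20 dB SNR points chosen as illustrative endpoints of the two regimes:

```python
import math

B = 1e6  # illustrative 1 MHz bandwidth

# Bandwidth-limited regime (high SNR): C ~ 0.332 * B * SNR_dB
snr_db = 30
exact = B * math.log2(1 + 10 ** (snr_db / 10))
approx = 0.332 * B * snr_db
print(round(exact), round(approx))  # within about 0.1% of each other

# Power-limited regime (low SNR): C ~ 1.44 * B * (S/N)
snr = 0.01  # -20 dB
exact_low = B * math.log2(1 + snr)
approx_low = 1.44 * B * snr
print(round(exact_low), round(approx_low))
```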
Examples

1. At a SNR of 0 dB (signal power = noise power) the capacity in bit/s is equal to the bandwidth in hertz.
2. If the SNR is 20 dB, and the bandwidth available is 4 kHz, which is appropriate for telephone communications, then C = 4000 log2(1 + 100) = 4000 log2(101) = 26.63 kbit/s. Note that the value of S/N = 100 is equivalent to the SNR of 20 dB.
3. If the requirement is to transmit at 50 kbit/s, and a bandwidth of 10 kHz is used, then the minimum S/N required is given by 50000 = 10000 log2(1 + S/N), so C/B = 5, giving S/N = 2⁵ − 1 = 31, corresponding to an SNR of 14.91 dB (10 × log10(31)).
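Example 3 above solves the capacity formula for the required signal-to-noise ratio; the inversion is straightforward to verify:

```python
import math

# To carry 50 kbit/s in a 10 kHz band, the required S/N satisfies
# 50000 = 10000 * log2(1 + S/N)  =>  S/N = 2**(50000/10000) - 1
snr_required = 2 ** (50000 / 10000) - 1
print(snr_required)                             # 31.0
print(round(10 * math.log10(snr_required), 2))  # 14.91 (dB)
```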
See also

Eb/N0
Nyquist rate

External links

On-line textbook: Information Theory, Inference, and Learning Algorithms, by David MacKay, gives an entertaining and thorough introduction to Shannon theory, including state-of-the-art methods from coding theory, such as low-density parity-check codes and Turbo codes.