
Exercise 6.21 (Solution on p. 256.)
Derive the maximum-entropy results, both the numeric aspect (entropy equals $\log_2 K$) and the theoretical one (equally likely symbols maximize entropy). Derive the value of the minimum-entropy alphabet.

Example 6.1
A four-symbol alphabet has the following probabilities.
$$\Pr[a_0] = \frac{1}{2} \qquad \Pr[a_1] = \frac{1}{4} \qquad \Pr[a_2] = \frac{1}{8} \qquad \Pr[a_3] = \frac{1}{8}$$
Note that these probabilities sum to one, as they should. As $\frac{1}{2} = 2^{-1}$, $\log_2 \frac{1}{2} = -1$. The entropy of this alphabet equals
$$\begin{aligned}
H(A) &= -\left( \frac{1}{2}\log_2\frac{1}{2} + \frac{1}{4}\log_2\frac{1}{4} + \frac{1}{8}\log_2\frac{1}{8} + \frac{1}{8}\log_2\frac{1}{8} \right) \\
     &= -\left( \frac{1}{2}(-1) + \frac{1}{4}(-2) + \frac{1}{8}(-3) + \frac{1}{8}(-3) \right) \\
     &= 1.75 \text{ bits}
\end{aligned} \tag{6.51}$$
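A quick numerical check of this example, and of the equally-likely case in Exercise 6.21, takes only a few lines. The sketch below is illustrative rather than part of the text; the function name entropy is our own.

# A minimal sketch: evaluate the entropy sum of (6.51) directly.
import math

def entropy(probs):
    """Shannon entropy H(A) = -sum_k Pr[a_k] log2 Pr[a_k], in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to one"
    # Zero-probability symbols contribute nothing (p log2 p -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1/2, 1/4, 1/8, 1/8]))       # 1.75, matching (6.51)

# Equally likely symbols attain the maximum, log2 K (Exercise 6.21):
K = 4
print(entropy([1 / K] * K), math.log2(K))  # 2.0 2.0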

6.21 Source Coding Theorem²⁸

The significance of an alphabet's entropy rests in how we can represent it with a sequence of bits. Bit sequences form the "coin of the realm" in digital communications: they are the universal way of representing symbolic-valued signals. We convert back and forth between symbols and bit sequences with what is known as a codebook: a table that associates each symbol with a bit sequence. In creating this table, we must be able to assign a unique bit sequence to each symbol so that we can go between symbols and bit sequences without error.

note: The name codebook for the symbol-to-bit-sequence table may conjure the notion of hiding information from others. There is no relation to cryptology, which comprises mathematically provable methods of securing information. The codebook terminology was developed during the beginnings of information theory just after World War II.

As we shall explore in some detail elsewhere, digital communication is the transmission of symbolic-valued signals from one place to another. When faced with the problem, for example, of sending a file across the Internet, we must first represent each character by a bit sequence. Because we want to send the file quickly, we want to use as few bits as possible. However, we don't want to use so few bits that the receiver cannot determine what each character was from the bit sequence. For example, we could use one bit for every character: file transmission would be fast but useless, because a single bit can distinguish only two characters and the codebook would therefore create errors.

Shannon proved in his monumental work what we today call the Source Coding Theorem. Let $B(a_k)$ denote the number of bits used to represent the symbol $a_k$. The average number of bits $B(A)$ required to represent the entire alphabet equals $\sum_{k=1}^{K} B(a_k) \Pr[a_k]$. The Source Coding Theorem states that the average number of bits needed to accurately represent the alphabet need only satisfy
$$H(A) \leq B(A) < H(A) + 1 \tag{6.52}$$

Thus, the alphabet's entropy specifies to within one bit how many bits must be used, on average, to send the alphabet. The smaller an alphabet's entropy, the fewer bits are required for digital transmission of files expressed in that alphabet.
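To make the bound in (6.52) concrete, here is a minimal sketch that computes $B(A)$ for one possible prefix codebook for the alphabet of Example 6.1; the bit assignments are illustrative, not taken from the text. Because every symbol probability is a power of $\frac{1}{2}$, this codebook meets the entropy lower bound exactly.

# A sketch of the codebook idea: average codeword length B(A) versus
# the Source Coding Theorem bound H(A) <= B(A) < H(A) + 1.
import math

probs = {"a0": 1/2, "a1": 1/4, "a2": 1/8, "a3": 1/8}
# One illustrative prefix code (an assumption, not the text's codebook):
codebook = {"a0": "0", "a1": "10", "a2": "110", "a3": "111"}

# B(A) = sum_k B(a_k) Pr[a_k], where B(a_k) is the codeword length.
B = sum(len(codebook[s]) * p for s, p in probs.items())
H = -sum(p * math.log2(p) for p in probs.values())

print(H, B)            # 1.75 1.75: the lower bound is met exactly
assert H <= B < H + 1  # the guarantee of (6.52)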

²⁸ This content is available online at http://cnx.org/content/m0091/2.13/.

