
6.23 Subtleties of Coding³¹

In the Huffman code, the bit sequences that represent individual symbols can have differing lengths, so the bitstream index m does not increase in lock step with the symbol-valued signal's index n. To capture how often bits must be transmitted to keep up with the source's production of symbols, we can only compute averages. If our source code averages B̄(A) bits/symbol and symbols are produced at a rate R, the average bit rate equals B̄(A)R, and this quantity determines the bit interval duration T.

Exercise 6.23
Calculate the relation between T and the average bit rate B̄(A)R.

(Solution on p. 257.)
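
To make the averaging concrete, here is a minimal sketch in Python; the four-symbol code, its probabilities, and the symbol rate are hypothetical values chosen only for illustration, not taken from the text.

```python
# A minimal sketch (not from the text): how the average code length
# B̄(A) and the symbol rate R determine the average bit rate. All
# numbers below are hypothetical, chosen only for illustration.

lengths = {"a0": 1, "a1": 2, "a2": 3, "a3": 3}           # bits per symbol
probs   = {"a0": 0.5, "a1": 0.25, "a2": 0.125, "a3": 0.125}

# Average code length in bits/symbol: B̄(A) = sum of Pr[a] * length(a).
B_bar = sum(probs[a] * lengths[a] for a in lengths)       # 1.75 bits/symbol

R = 1000.0                # assumed symbol rate, symbols per second
bit_rate = B_bar * R      # average bit rate B̄(A)R = 1750 bits per second

# On average one bit must be sent every 1/(B̄(A)R) seconds, which is
# what sets the bit interval duration T (compare with Exercise 6.23).
T = 1.0 / bit_rate        # ≈ 5.71e-4 seconds
print(B_bar, bit_rate, T)
```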

A subtlety of source coding is whether we need "commas" in the bitstream. When we use an unequal number of bits to represent symbols, how does the receiver determine when symbols begin and end? If you created a source code that required a separation marker in the bitstream between symbols, it would be very inefficient: you would essentially be requiring an extra symbol in the transmission stream.

note: A good example of this need is the Morse code: between each letter, the telegrapher needs to insert a pause to inform the receiver when letter boundaries occur.

As shown in Example 6.3, no commas are placed in the bitstream, yet you can unambiguously decode the sequence of symbols from the bitstream. Huffman showed that his (maximally efficient) code had the prefix property: no symbol's code begins another symbol's code. Once a code has the prefix property, the bitstream is partially self-synchronizing: once the receiver knows where the bitstream starts, it can assign a unique and correct symbol sequence to the bitstream (see the decoding sketch at the end of this section).

Exercise 6.24 (Solution on p. 257.)
Sketch an argument that prefix coding, whether derived from a Huffman code or not, will provide unique decoding when an unequal number of bits/symbol are used in the code.

However, having a prefix code does not guarantee total synchronization: after hopping into the middle of a bitstream, can we always find the correct symbol boundaries? This self-synchronization issue weighs against the use of efficient source coding algorithms.

Exercise 6.25 (Solution on p. 257.)
Show by example that a bitstream produced by a Huffman code is not necessarily self-synchronizing. Are fixed-length codes self-synchronizing?

Another issue is bit errors induced by the digital channel; when they occur (and they will), synchronization can easily be lost even if the receiver began "in synch" with the source. Despite the small probabilities of error offered by good signal set design and the matched filter, an infrequent error can devastate the ability to translate a bitstream into a symbolic signal. We need ways of reducing reception errors without demanding that pe be smaller.

Example 6.4
The first electrical communications system, the telegraph, was digital. When first deployed in 1844, it communicated text over wireline connections using a binary code, the Morse code, to represent individual letters. To send a message from one place to another, telegraph operators would tap the message using a telegraph key to another operator, who would relay it on to the next operator, presumably bringing it closer to its destination. In short, the telegraph relied on a network not unlike the basics of modern computer networks. To say it presaged modern communications would be an understatement. It was also far ahead of some needed technologies, namely the Source Coding Theorem. The Morse code, shown in Figure 6.19, was not a prefix code. To separate the codes for each letter, Morse code required that a space (a pause) be inserted between letters. In information theory, that space counts as another code letter, which means that the Morse code encoded text with a three-letter source code: dots, dashes, and space. The resulting source code is not within a bit of entropy and is grossly inefficient (about 25%). Figure 6.19 also shows a Huffman code for English text, which, as we know, is efficient.
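
The sketch below makes the prefix-property discussion concrete. The code table is hypothetical (the actual code of Example 6.3 is not reproduced here); what matters is only that it is prefix-free, so it illustrates both comma-free decoding and how synchronization can be lost.

```python
# A minimal sketch (not from the text) of comma-free decoding with a
# prefix code, and of how synchronization can be lost. The code table
# below is hypothetical; no codeword begins another codeword.

codewords = {"a0": "0", "a1": "10", "a2": "110", "a3": "111"}
decoder = {bits: symbol for symbol, bits in codewords.items()}

def decode(bitstream: str) -> list[str]:
    """Scan left to right, emitting a symbol as soon as the buffered
    bits match a codeword; the prefix property makes each match unique."""
    symbols, buffer = [], ""
    for bit in bitstream:
        buffer += bit
        if buffer in decoder:
            symbols.append(decoder[buffer])
            buffer = ""
    return symbols

stream = "0" + "110" + "10" + "111"  # a0 a2 a1 a3, with no commas needed
print(decode(stream))                # ['a0', 'a2', 'a1', 'a3']

# Hopping into the middle of the stream (skipping the first bit) still
# decodes cleanly, but into the wrong symbols: nothing in the bitstream
# warns the receiver that it is out of synch.
print(decode(stream[1:]))            # ['a2', 'a1', 'a3']

# A single channel bit error (here, the first bit flipped) does the
# same kind of damage from that point onward.
print(decode("1" + stream[1:]))      # ['a3', 'a0', 'a1', 'a3']
```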

³¹ This content is available online at http://cnx.org/content/m0093/2.16/.

