Review of “The Information: A History, A Theory, A Flood” by James Gleick


Note: This review is by my husband Jim.

Not surprisingly, the subject of James Gleick’s The Information is the field of knowledge known as “Information Theory.” The theory’s origin can be traced to a seminal article written in 1948 by Claude Shannon, an engineer then employed by Bell Laboratories. The article, entitled “A Mathematical Theory of Communication,” appeared in two parts in the Bell System Technical Journal. Shannon focused on how to optimize the amount of information a sender wants to transmit. His theory is important because, inter alia, its application greatly improves the speed and volume of content that can be transmitted electronically. As Gleick points out:

“Satellite television channels, pocket music players, efficient camera and telephones and countless other modern appurtenances depend on coding algorithms to compress numbers—sequences of bits—and those algorithms trace their lineage to Shannon’s original 1948 paper.”

Claude Shannon also invented a number of mazes, games, and parlor tricks.

But getting a feel for how the theory works or why it is so important isn’t easy, so Gleick takes the reader on a 180-page historical tour of various earlier forms of communication between remote sites. For example, Europeans were amazed to find that tribes of sub-Saharan Africa were able to send remarkably detailed messages to one another by means of drums. The fact that their languages were tonal (like Mandarin, but unlike any European language) facilitated their “translation” into drum sounds.

In another example of comparatively long-range communication, European war fleets were able to transmit messages by way of visual flag signals, but the range of possible messages was limited to a few pre-arranged commands. By the late 18th century, the French were able to send messages long distances by way of “telegraphs.” The first devices known as telegraphs were chains of signaling devices, such as semaphores, spaced within sight of one another. Signals could be relayed from one device to the next, but complicated messages were difficult to transmit because there was no known efficient method to encode a message succinctly.

Demonstration of the semaphore telegraph designed by Claude Chappe

The invention of the electrical telegraph provided the opportunity to send signals much faster than the visual “telegraphs.” However, it was not until an efficient code like the one developed by Samuel Morse came into general use that the transmission of complex or long messages became practical.
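The efficiency of Morse’s scheme is easy to see in a few lines of code. Here is a minimal sketch (my own illustration, not from the book, using only a small excerpt of the Morse table rather than the full alphabet): the most frequent English letters, E and T, get the shortest codes, so typical messages encode compactly.

```python
# Small excerpt of International Morse Code.
# Frequent letters get short codes; rare letters get long ones.
MORSE = {
    "E": ".", "T": "-",        # the two most common English letters
    "A": ".-", "I": "..", "N": "-.", "O": "---",
    "Q": "--.-", "Z": "--..",  # rare letters need four symbols
}

def to_morse(text: str) -> str:
    """Encode a message, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in text.upper())

print(to_morse("tea"))   # prints "- . .-"
```

A message full of common letters ends up shorter on the wire than one full of rare letters, which is the whole point of the design.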

Just why Morse Code is efficient is related to a well-defined concept conventionally called the “entropy of a message,” or “Shannon entropy”: the lower a message’s entropy, the more it can be compressed. In practice, this means removing as much extraneous data as possible from a message to shorten it, but without a loss of meaning. Most of you will be aware of this process even without knowing the history and theory behind it. The meaning of “I lv u” is clear, and it takes less space than “I love you.” Conventions such as “twitter-speak” allow for even more economy: when someone says only “OMG” you know what that person is communicating, and six characters have been saved.
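Shannon made this intuition precise: the entropy of a message is the average number of bits per symbol needed to encode it, given how often each symbol occurs. A short Python sketch (my own illustration, not from the book) computes it directly from the standard formula:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information, in bits per symbol, of a message."""
    counts = Counter(message)
    n = len(message)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

# A perfectly repetitive message carries no information per symbol;
# four equally likely symbols need two bits each.
print(shannon_entropy("aaaa"))  # prints 0.0
print(shannon_entropy("abcd"))  # prints 2.0
```

Predictable, redundant text has low entropy, which is exactly why it can be abbreviated without losing meaning.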

Morse Code Table

The initial thrust of Shannon’s theorizing was to condense the quantity of data to be transmitted over telephone lines, greatly enhancing the capacity of the lines to carry ideas (content) without increasing the physical infrastructure needed to transmit them. But the concept of quantifying the extraction of information from raw data soon flowed from telephone engineering into other fields such as psychology, genetics, and quantum physics.

Gleick also discusses the tension between the concepts of information and meaning. Although Shannon famously said that meaning is “irrelevant to the engineering problem,” meaning remains the thing humans most want to convey or transmit in communication. The problem remains a sticky philosophical one, and Gleick does a nice job of analyzing it, although he does not solve it.

Gleick is a master of elucidating daunting scientific concepts. Just like his earlier book, Chaos, The Information brings to light an intellectually challenging set of ideas and makes them understandable to the layman. Bravo!

Rating: 5/5

Published in hardcover by Pantheon Books, a division of Random House, Inc. 2011; Published in paperback by Vintage Books, a division of Random House, 2012



5 Responses to Review of “The Information: A History, A Theory, A Flood” by James Gleick

  1. Beth F says:

    Wow a 5/5 rating. I like reading nonfiction, but I can’t really see myself reading this one. Glad Jim loved it.

  2. BermudaOnion says:

    I wonder if a lamebrain layman like me could understand that book.

  3. Stefanie says:

    Isn’t this a fantastic book? I read it not long after it first came out and really liked it. Gleick does a great job of breaking everything down and helping it all make sense. Good review!

  4. Great review! Way to go Mr. Rhapsody!

  5. All right, I will try it! If you really promise that it’s accessible to the layman. I do think that it’s fascinating to consider the many different ways humans have tried to get information across to each other — it’s an easy need to identify, with no easy solutions.
