Celebrating Claude Shannon
February 27, 2015
Edge, a website dedicated to knowledge and discussion, poses a question each year in the hope of provoking thoughtful conversation. This year’s question, for example, is “What do you think about machines that think?” Very timely. However, I’m here to recall a nugget from 2012, when journalism professor and author of The Wikipedia Revolution Andrew Lih penned an answer to that year’s question, “What is your favorite deep, elegant, or beautiful explanation?”
Lih’s response is titled “Information Is the Resolution of Uncertainty.” I suggest you read the whole post—it isn’t that long. It traces the beginning of the information age to 1948, when unsung mathematician, engineer, and cryptographer Claude Shannon coined that definition: “Information is the resolution of uncertainty.” Shannon’s perspective was informed by his experiences during World War II, when then-new technologies vastly complicated the work of protecting, and eavesdropping upon, critical information. Lih writes:
“As long as something can be relayed that resolves uncertainty, that is the fundamental nature of information. While this sounds surprisingly obvious, it was an important point, given how many different languages people speak and how one utterance could be meaningful to one person, and unintelligible to another. Until Shannon’s theory was formulated, it was not known how to compensate for these types of ‘psychological factors’ appropriately. Shannon built on the work of fellow researchers Ralph Hartley and Harry Nyquist to reveal that coding and symbols were the key to resolving whether two sides of a communication had a common understanding of the uncertainty being resolved.
“Shannon then considered: what was the simplest resolution of uncertainty? To him, it was the flip of the coin—heads or tails, yes or no—as an event with only two outcomes. Shannon concluded that any type of information, then, could be encoded as a series of fundamental yes or no answers. Today, we know these answers as bits of digital information—ones and zeroes—that represent everything from email text, digital photos, compact disc music or high definition video.”
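The coin-flip idea in the passage above has a precise form in Shannon’s 1948 paper: the entropy formula, which measures information in bits as the average number of yes-or-no answers needed to resolve an outcome. As a quick, hedged illustration (the function name and examples below are mine, not Lih’s or Shannon’s), a short Python sketch:

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits: the average number of yes/no questions
    needed to resolve the uncertainty of one outcome."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip is the simplest resolution of uncertainty:
# exactly one bit of information.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so each flip resolves
# less uncertainty and carries less than one bit.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469 bits
```

This is why the fair coin is the natural unit: any distribution’s information content can be expressed in terms of these two-outcome events, which is exactly the bit Lih describes.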
It does sound simple, but this binary approach to information was a unique and startling perspective. Lih traces every digital invention that shapes our modern world back to this foundational insight. Claude Shannon went on to influence his students as a professor at MIT, many of whom later built such things as digital modems (and later wireless communications), computer graphics, data compression, and artificial intelligence. Lih laments that Shannon does not get the credit he deserves, and would like him to be widely remembered as the father of the information age.
Cynthia Murrell, February 27, 2015
Sponsored by ArnoldIT.com, developer of Augmentext