Bit By Bit, 'The Information' Reveals Everything

The Information, written by James Gleick, covers nearly everything — jungle drums, language, Morse code, telegraphy, telephony, quantum mechanics, thermodynamics, genetics and more — as it relates to information, which he describes as the "fundamental core of things." Information theory, Gleick argues, has become an overarching concept for our times, a common thread that scientists across many disciplines now recognize in their work.

Gleick's book spans centuries and continents, but one figure anchors the story for almost 400 pages: Claude Shannon, an engineer and mathematician who worked at Bell Labs in the mid-20th century. Shannon created what is now called information theory, Gleick tells Robert Siegel on All Things Considered:

"He was the first person to use the word 'bit' as a scientific unit of measuring this funny abstract thing that until this point in time scientists had not thought of as a measurable scientific quantity."

Bits are most commonly recognized as the 1s and 0s that enable computers to store and share information, but in this context a bit can also be thought of as a yes/no, either/or, or on/off switch. Gleick describes the bit as "the irreducible quantum of information," upon which all things are built.
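
To see the bit working as a unit of measure, here is a minimal Python sketch (my own illustration, not something from the book): singling out one of n equally likely possibilities takes log2(n) yes/no answers, which is to say log2(n) bits.

    import math

    # A bit answers one yes/no question. Singling out one of n equally
    # likely outcomes takes log2(n) such answers: the bit as a yardstick.
    def bits_needed(n_outcomes: int) -> float:
        return math.log2(n_outcomes)

    print(bits_needed(2))    # a coin flip: 1.0 bit
    print(bits_needed(26))   # one letter of the alphabet: ~4.7 bits
    print(bits_needed(256))  # one byte's worth of choices: 8.0 bits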

Just as Isaac Newton took vague words like "force" and "mass" and turned them into precise mathematical definitions, "information" now carries a specific scientific meaning: a quantity measured in bits.

"Binary yes or no choices are at the root of things," Gleick explains. The physicist John Archibald Wheeler coined an epigram to encapsulate the concept behind information theory: "It from bit." It described the idea that the smallest particle of every piece of matter is a binary question, a 1 or a 0. From these pieces of information, other things could develop — like DNA, matter and living organisms. The field of information theory, in addition to creating new meanings for words like "information," also builds upon knowledge from other scientific disciplines such as thermodynamics, even though the result may be a little tough to understand.

James Gleick also wrote Chaos: Making a New Science, which popularized the idea of the butterfly effect. His books have been finalists for the Pulitzer Prize and the National Book Award. (Photo credit: Phylis Rose)

"When Claude Shannon first wrote his paper and made a connection between information and the thermodynamic concept of entropy, a rumor started around Bell Labs that the great atomic physicist John von Neumann had suggested to Shannon, 'Just use the word entropy — no one will know what you're talking about, and everyone will be scared to doubt you.' "

Though it may be a difficult subject to conceptualize, entropy has a deep connection to information science, Gleick says. Entropy is associated with disorder in thermodynamic systems, and analogously so in informational systems. Though it may seem paradoxical to link information to disorder, Gleick explains that each new bit of information is a surprise: if you already knew what a message contained, it would carry no information.

"Information equals disorder, disorder equals entropy and a lot of physicists have been both scratching their heads and making scientific progress ever since," Gleick says.

In the everyday — not scientific — sense, an object like the moon only seems to contain information when we perceive it and develop thoughts about it, whether that's the man in the moon, the moon being made of cheese or the moon driving people to madness. But Gleick says that even without our perceiving it, the moon is more than just matter — it still has its own bits of intrinsic information.

"It sounds mystical, and I can't pretend that I fully understand it either, but it's just one of the many ways in which scientists have discovered a conception of information that helps them solve problems in a whole range of disciplines."

Copyright 2023 NPR. To see more, visit https://www.npr.org.
