The Science of Information: From Language to Black Holes [TTC Video]
18 February 2016, 17:55
Course No 1301 | M4V, AVC, 640x360 | AAC, 128 kbps, 2 Ch | 24x30 mins | + PDF Guidebook | 14.35GB
The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every kind. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe. It is the key that unites fields as different as linguistics, cryptography, neuroscience, genetics, economics, and quantum mechanics. And the fact that information bears no necessary connection to meaning makes it a profound puzzle that people with a passion for philosophy have pondered for centuries.
Little wonder that an entirely new science has arisen that is devoted to deepening our understanding of information and our ability to use it. Called information theory, this field has been responsible for path-breaking insights such as the following:
- What is information? In 1948, mathematician Claude Shannon boldly captured the essence of information with a definition that doesn’t invoke abstract concepts such as meaning or knowledge. In Shannon’s revolutionary view, information is simply the ability to distinguish reliably among possible alternatives.
- The bit: Atomic theory has the atom. Information theory has the bit: the basic unit of information. Proposed by Shannon’s colleague at Bell Labs, John Tukey, bit stands for “binary digit”—0 or 1 in binary notation, which can be implemented with a simple on/off switch. Everything from books to black holes can be measured in bits.
- Redundancy: Redundancy in information may seem like mere inefficiency, but it is a crucial feature of information of all types, including languages and DNA, since it provides built-in error correction for mistakes and noise. Redundancy is also the key to breaking secret codes.
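The error-correcting role of redundancy described above can be illustrated with a minimal sketch: a hypothetical 3× repetition code, in which every bit is transmitted three times and the receiver takes a majority vote, so any single flipped copy is corrected. (The repetition code is the simplest example of the idea; it is not a scheme the course text names.)

```python
def encode(bits):
    """Repeat each bit three times: [1, 0] -> [1, 1, 1, 0, 0, 0]."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority-vote each group of three copies back to a single bit."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1                      # noise flips one transmitted bit
assert decode(sent) == message    # the redundancy corrects the error
```

The price of this protection is efficiency: three channel bits carry one message bit, which is exactly the trade-off between redundancy and compactness that runs through the course.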
Building on these and other fundamental principles, information theory spawned the digital revolution of today, just as the discoveries of Galileo and Newton laid the foundation for the scientific revolution four centuries ago. Technologies for computing, telecommunication, and encryption are now common, and it’s easy to forget that these powerful technologies and techniques had their own Galileos and Newtons.
The Science of Information: From Language to Black Holes covers the exciting concepts, history, and applications of information theory in 24 challenging and eye-opening half-hour lectures taught by Professor Benjamin Schumacher of Kenyon College. A prominent physicist and award-winning educator at one of the nation’s top liberal arts colleges, Professor Schumacher is also a pioneer in the field of quantum information, which is the latest exciting development in this dynamic scientific field.
Professor Schumacher introduces the essential mathematical ideas that govern the subject—concepts that can be understood by anyone with a background in high school math. But it is not necessary to follow the equations to appreciate the remarkable story that Dr. Schumacher tells.
A New View of Reality
Clearly, information has been around a long time. In human terms, language, writing, art, music, and mathematics are perfect examples; so are Morse code, Mendelian genetics, and radio signals—all originating before 1900. But a series of conceptual breakthroughs in the 20th century united what seemed like unrelated phenomena and led to a dramatic new way of looking at reality. The Science of Information takes you on this stimulating intellectual journey, in which some of the key figures include:
- Claude Shannon: Shannon plays a key role throughout the course as the dominant figure in the early decades of information theory, making major contributions in computer science, cryptography, genetics, and other areas. His crucial 1948 paper was the “shot heard ’round the world” for the information revolution.
- Alan Turing: The genius behind the decryption of the Nazi Enigma code during World War II, Turing invented the principle of the modern digital computer, and he demonstrated an inherent limitation of all computers by proving that the notorious “halting problem” is fundamentally unsolvable.
- John A. Wheeler: One of the greatest physicists of the 20th century, Wheeler had a passion for the most fundamental questions of science, which led him to conceive the famous slogan, “It from bit,” meaning that all of physical reality emerges from information. He was also Professor Schumacher’s mentor.
In addition, you study the contributions of other pioneers, such as John Kelly, who used information theory to devise an influential strategy for betting and investing; David Huffman, who blazed the trail in data compression, now used in formats such as JPEG and MP3; and Gregory Chaitin, who pursued computer algorithms for information theory, hypothesizing a celebrated yet uncomputable number called Omega. You also explore the pivotal contributions of pre-20th-century thinkers including Charles Babbage, Ada Lovelace, Samuel F. B. Morse, and Joseph Fourier.
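Huffman's idea, mentioned above, can be sketched in a few lines: frequent symbols receive short codewords and rare symbols long ones, built by repeatedly merging the two lightest subtrees. This is a minimal illustration of the technique, not the exact algorithm as presented in the lectures or as used inside JPEG or MP3.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix code: frequent symbols get shorter codewords."""
    # One weighted leaf per symbol; the counter breaks frequency ties.
    heap = [(freq, i, {sym: ""}) for i, (sym, freq)
            in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two lightest subtrees, prefixing 0/1 to their codes.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
# 'a' occurs most often, so its codeword is no longer than any other.
assert all(len(codes["a"]) <= len(codes[s]) for s in codes)
```

Because no codeword is a prefix of another, the compressed bitstream can be decoded unambiguously without separators.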
The Laws of Information at Work
With lucid explanations and imaginative graphics, Professor Schumacher shows you the world through an extraordinary set of lenses. “If we wear our information-colored glasses,” he says, “we will see the laws of information at work all around us, in a hundred different ways.” The course illustrates this with examples such as:
- Money: Today most money exists as electronic account data. But even in ancient times, money was a record-keeping device—in other words, information. Precious metal coins had a cryptographic function: to make it hard to counterfeit messages of economic agreement and obligation.
- Privacy: The search for guaranteed privacy has only one refuge—the quantum realm. Professor Schumacher explains how perfectly secure communication can be achieved using entangled quantum bits, or qubits (a term he coined). Such systems are now in use.
- Games: The parlor game 20 Questions obviously involves the exchange of information. But why is the number of questions 20? Why not 10 or 30? The answer lies in the connection between information and entropy, which in this case measures the total number of possible solutions to the game.
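The 20 Questions example above has a tidy arithmetic core: each yes/no answer yields at most one bit, halving the remaining candidates, so n questions can distinguish at most 2**n possibilities. A short sketch of that reasoning (the round numbers below are illustrative, not taken from the lectures):

```python
import math

# Each yes/no answer yields at most one bit, halving the candidates,
# so n questions can distinguish at most 2**n possibilities.
questions = 20
print(2 ** questions)            # 1048576 distinguishable objects

# Conversely, the entropy of N equally likely possibilities is
# log2(N) bits: the minimum number of questions needed.
candidates = 1_000_000
print(math.ceil(math.log2(candidates)))  # 20 questions suffice
```

Twenty questions, then, is just enough to single out one object from roughly a million candidates, which is why the game is hard but winnable.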
Dr. Schumacher also shows you how information theory can provide answers to profound scientific questions. What is the information content of the genome? The human brain? A black hole? The universe? Time and again, the concepts and laws of information reveal breathtaking insights into the workings of nature, even as they lay the foundation of astounding new technologies.
One final example: 12 billion miles from Earth, a spacecraft built with 1970s technology is racing through interstellar space, never to return. From that distance, the sun is a very bright star and Earth is a pale blue dot. Voyager 1’s radio transmitter is about as strong as a cell phone tower on Earth, which typically can’t reach phones more than a few miles away. Yet we continue, to this day, to receive data from Voyager. How is that possible? The Science of Information explains this amazing feat, along with so much more.
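The scale of the Voyager feat can be sketched with the inverse-square law and only the distances quoted above (12 billion miles for Voyager, "a few miles" for a cell tower); the five-mile figure is an assumption, and this back-of-the-envelope estimate ignores antenna gains and everything else the course actually explains.

```python
# Received signal power falls off as 1 / distance**2, so the extra
# attenuation relative to a short cell-tower link is the square of
# the distance ratio.
tower_range_miles = 5.0     # assumed "few miles" for a cell tower
voyager_miles = 12e9        # distance quoted in the text

ratio = voyager_miles / tower_range_miles
attenuation = ratio ** 2
print(f"{attenuation:.1e}")  # roughly 10**18 times weaker
```

Closing a link that is some eighteen orders of magnitude weaker is possible only because of giant receiving antennas and, above all, the error-correcting codes that the course describes.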