Is Information Fundamental? — The Nature of Reality

Authored by pbs.org and submitted by On_Too_Much_Adderall

What if the fundamental “stuff” of the universe isn’t matter or energy, but information?

That’s the idea some theorists are pursuing as they search for ever-more elegant and concise descriptions of the laws that govern our universe. Could our universe, in all its richness and diversity, really be just a bunch of bits?

To understand the buzz over information, we have to start at the beginning: What is information?


“It doesn’t matter whether something consists of equations, words, images or sounds—you can encode any of that in strings of zeroes and ones,” or bits, says Scott Aaronson, associate professor of electrical engineering and computer science at MIT. Your computer is doing it right now, using tiny magnets, capacitors, and transistors to store billions or trillions of binary digits. “These might have been hard concepts for people to understand a century ago, but because of the computer revolution, we deal with these concepts all the time,” says Aaronson. In an age when a USB drive dangles from every keychain and an iPhone strains the seams of every pocket, it isn’t such a leap to agree that anything can be expressed in information.
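
To make that concrete, here is a minimal Python sketch, with an invented sentence, number, and tiny "image," showing how words, numbers, and pixel grids all reduce to strings of zeroes and ones:

```python
# Illustrative only: the sentence, number, and 2x2 "image" below are made up.

def to_bits(data: bytes) -> str:
    """Render raw bytes as a string of zeroes and ones."""
    return "".join(f"{byte:08b}" for byte in data)

# A sentence becomes bits via a character encoding (UTF-8 here).
print(to_bits("E = mc^2".encode("utf-8")))

# A number becomes bits via its binary representation.
print(f"{299_792_458:b}")   # the speed of light in m/s, as a bit string

# A tiny 2x2 grayscale "image" is just a grid of numbers, so it becomes bits too.
image = [[0, 255], [128, 64]]
print(to_bits(bytes(pixel for row in image for pixel in row)))
```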

To some theorists, though, information is more than just a description of our universe and the stuff in it: it is the most basic currency of existence, occupying what theorist Paul Davies terms the “ontological basement” of reality.

The rules of quantum information provide the most “compact” description of physics, says Vlatko Vedral, professor of quantum information theory at the University of Oxford and the National University of Singapore. “Information, it seems to me, requires fewer assumptions about anything else we could postulate. As soon as you talk about matter and energy, you have to write down the laws that govern matter and energy.”

Does this mean that our universe is made of information, as some headlines claim?

“It strikes me as a contentless question,” says Aaronson. “To say that matter and energy are important in physics is to say something with content.” You can imagine a universe barren of matter and energy, after all; specifying that our universe is furnished with both tells you something about it and distinguishes it from other possible universes. “But I don’t know how you could even conceive of a universe” without information, he says.

Yet, as a fresh way of thinking about, well, what the universe is about, information has touched off provocative work in computer science and theoretical astrophysics, apparently disparate fields that may share a deep link manifested by that cosmic Rosetta stone, the black hole. But before we dive into the black hole, let’s step back to take a deeper look at information itself.

All messages contain information, but not all messages are created equal. “Unexpected things have high information content,” says Vedral. Take a sunrise, for example. “If the sun rises tomorrow, you won’t see any newspaper writing about it. But of course if it didn’t rise, it would be a major event.”

We sense intuitively that “surprises,” like a missed sunrise, carry more information than routine events. Claude Shannon, widely considered the father of information theory, formalized this intuition by defining a quantity that’s now known as “Shannon entropy.” The Shannon entropy of a message is related to the sum of the logarithms of the probabilities of each bit taking on its particular value. That’s a mouthful, but, Vedral explains, it mathematically captures two important features of information: the value of surprises, and the fact that information is “additive”—that is, that the total information contained in two, three, four, or a billion unrelated events is equal to the sum of the information in each one.
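
A small Python sketch of both ideas, using made-up sunrise probabilities and fair coin flips purely for illustration:

```python
import math

def self_information(p: float) -> float:
    """Surprise of a single event, in bits: rare events carry more information."""
    return -math.log2(p)

def shannon_entropy(probabilities: list[float]) -> float:
    """Average information per message, in bits: H = -sum(p * log2(p))."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# The probabilities here are invented for the example.
print(self_information(0.999999))  # "the sun rose": ~0 bits, no news
print(self_information(0.000001))  # "the sun did not rise": ~20 bits, front-page news

# Additivity: two unrelated fair coin flips carry twice the entropy of one.
one_flip = shannon_entropy([0.5, 0.5])                  # 1 bit
two_flips = shannon_entropy([0.25, 0.25, 0.25, 0.25])   # 2 bits
print(one_flip, two_flips)
```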

Physicists describe entropy a little differently, often speaking in terms of the “disorder” of a system. More precisely, entropy is the number of different ways you can rearrange the littlest parts of a system and still get the same big system. A bucket full of red Legos, for instance, has high entropy. Shake it up, spin it around, and you still have what you began with: a bucket of red Legos. Assemble those same blocks into a Lego castle, though, and you’ve slashed the entropy; moving a single block nets you a different “macroscopic” system.
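
A toy version of that counting, with an invented brick count and entropy measured in bits rather than in thermodynamic units (real thermodynamic entropy also carries Boltzmann's constant):

```python
import math

n_bricks = 100  # illustrative number

# Bucket: any ordering of the (microscopically distinguishable) bricks still looks
# like "a bucket of red Legos," so every permutation counts as the same macrostate.
bucket_arrangements = math.factorial(n_bricks)
print(math.log2(bucket_arrangements))   # ~525 bits: high entropy

# Castle: each brick has one required position, so only one arrangement
# reproduces the same castle.
castle_arrangements = 1
print(math.log2(castle_arrangements))   # 0 bits: low entropy
```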

Whichever perspective you choose, the essential result is the same. Take the paragraph you’re reading right now, for instance, with its many different letters, punctuation marks, and spaces arranged in a very particular order. It contains more information, and thus has higher entropy, than a paragraph of the same length made up of a single character repeated over and over, even though the two contain the same number of characters. Entropy, then, provides a measure of not just the length but the information content of a message.
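
A rough sketch of that comparison, estimating entropy from character frequencies; the stand-in sentence and the repeated-character string are invented here:

```python
import math
from collections import Counter

def empirical_entropy(text: str) -> float:
    """Estimate bits per character from the observed character frequencies."""
    counts = Counter(text)
    total = len(text)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

varied = "Take the paragraph you are reading right now, for instance."
uniform = "a" * len(varied)   # same number of characters, one repeated symbol

print(empirical_entropy(varied))    # a few bits per character
print(empirical_entropy(uniform))   # 0.0 bits per character
```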

Now imagine trying to build the ultimate hard drive, one that holds the maximum amount of information allowed by physics. Why should physics place a limit on the information storage capacity of this hypothetical hard drive? Thinking it over from a purely classical perspective, it seems that you could store an infinite amount of information. But when we add quantum mechanics to the mix, we introduce fundamental limits on the accuracy of our measurements. These limits cause entropy to max out at about 10^69 bits per square meter. “If you tried to pack information more densely than that,” says Aaronson, “your hard drive would collapse into a black hole.” That’s not just a whimsical footnote. Black holes, it turns out, are the universe’s very best information repositories. (Of course, Aaronson points out, they don’t make very practical hard drives, unless you’re willing to wait about 10^70 years to retrieve your data.)
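
A back-of-the-envelope check of that figure, assuming the standard Bekenstein–Hawking relation (entropy equals horizon area divided by four times the Planck length squared, in nats):

```python
import math

planck_length = 1.616e-35           # meters
area = 1.0                          # one square meter

entropy_nats = area / (4 * planck_length**2)
entropy_bits = entropy_nats / math.log(2)   # divide by ln(2) to convert nats to bits

print(f"{entropy_bits:.2e} bits per square meter")   # roughly 1.4e69
```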

But there’s something odd about the way the entropy of a black hole grows. As physicists Stephen Hawking and Jacob Bekenstein discovered in the 1970s, the entropy of a black hole increases with the black hole’s two-dimensional surface area, as defined by an imaginary spherical shell with radius R_s, the black hole’s Schwarzschild radius. This is bizarre; you would expect the amount of information you can pack into any object, like a book or a hard drive, to grow with the three-dimensional volume of the object, not its surface area.
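
For reference, the standard form of that area law (the constants are not spelled out in the article itself):

```latex
% Bekenstein-Hawking entropy: proportional to the horizon's area, not its volume.
S_{\mathrm{BH}} = \frac{k_B\, c^3}{4\, G\, \hbar}\, A,
\qquad
A = 4\pi R_s^2,
\qquad
R_s = \frac{2 G M}{c^2}
```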

This discrepancy is more than just theoretical arcana. To physicists, it suggests that the fundamental laws of physics may have a simpler representation in two dimensions rather than the traditional three. In 1997, the Argentinian physicist Juan Maldacena, now at the Institute for Advanced Study, took advantage of this idea to work out a mathematical “duality” between a universe like ours and one with fewer spatial dimensions, a single time dimension, and no gravity. This provides a handy mathematical back door; problems that are difficult to solve in one description may shake out easily in the other.

To some theorists, the duality isn’t just mathematical. The universe as we experience it, they say, may actually be the projection of information encoded on some distant cosmic boundary. Where this boundary lies and how the projection occurs are still open questions, but these theorists argue that our reality may be, in essence, a hologram analogous to the silvery images on museum store postcards.

We have information theory to thank for this peculiar plot twist. But if it’s hard to imagine a practical application for this “holographic principle,” it’s far easier to see how quantum information is changing computing. That’s because quantum information has different basic properties from classical information. The bits that make up classical information can be either one or zero. But the “qubits” that make up quantum information can exist in a superposition of the “one” and “zero” states; in a sense, they can take on both states at once. To maintain this superposition, though, qubits must exist in perfect isolation. As soon as that isolation is disturbed, the superposition crumbles.
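
Here is a minimal classical simulation of a single qubit, assuming the textbook state-vector picture; the variable names and the 50/50 superposition are chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng()

# |0> and |1> as basis vectors; an equal superposition of both.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])
qubit = (zero + one) / np.sqrt(2)       # amplitude 1/sqrt(2) for each outcome

# Measuring disturbs the superposition: the outcome is random, with probabilities
# given by the squared amplitudes, and the state collapses to the result.
probabilities = np.abs(qubit) ** 2      # [0.5, 0.5]
outcome = rng.choice([0, 1], p=probabilities)
qubit = zero if outcome == 0 else one   # the superposition is gone

print(outcome, qubit)
```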

“Quantum information is like the information in a dream,” explained Charles Bennett, a quantum information scientist at IBM Research, in a recent talk at the annual meeting of the American Association for the Advancement of Science. “In describing it, you change your memory of it.” This may not sound like a desirable quality in a computer, but in combination with entanglement (the uncannily strong correlations quantum mechanics allows between separated qubits), it can be exploited to dramatically speed up certain types of calculations and to send perfectly secure encrypted messages. As Steve Girvin, a theoretical physicist at Yale University, points out, it can also be used to generate genuinely random numbers suitable for encryption keys. Quantum cryptography is already being used commercially for some bank transfers and other highly secure transmissions.

“This second quantum revolution—the revolution of information—is a complete surprise,” says Girvin. “It took decades to come to grips with the weirdness and realize that the information of quantum mechanical systems is different than the information content of classical systems, and being uncertain about something can actually be good instead of bad.”

Quantum information is useful stuff—but what is it telling us about the essential nature of our reality? Some thinkers argue that it suggests our entire universe is itself a quantum computer. “I like this image,” says Vedral, while admitting that the analogy is imperfect. “You could ask, can I treat the rest of the universe as something that I can program in the way that I program my [ordinary] computer?” No, says Vedral. “You’re still limited by the laws of physics, and you have a certain amount of resources which are finite. There are computations you will never be able to execute.” The one computation this cosmic quantum computer is uniquely suited to carry out is that of its own evolution.

Whether information is a useful strategy of thought or something deeper, we still don’t know. “We’re still struggling with what our theories are really telling us,” says Vedral. “You have to take a leap of imagination.”

arXiv: Computational capacity of the universe

Treating the entire universe as a computer, MIT “quantum mechanic” Seth Lloyd derives the information storage and operational limits of the universe.

FQXi: It From Bit or Bit From It?

Read prize-winning essays from the Foundational Questions Institute’s 2013 essay contest on the theme of information and its role in reality.

Information and the Nature of Reality

With contributions from physicists, philosophers, theologians, and biologists, this volume of essays looks at the meaning and significance of information from multiple perspectives.

jattyrr on February 4th, 2018 at 11:04 UTC »

That was a fantastic article! Btw according to the article you could retrieve that information from the black hole, but it would take about 10^70 years.

Gbc_Legion1150 on February 4th, 2018 at 10:50 UTC »

Like a legit black hole?

chasebrendon on February 4th, 2018 at 10:40 UTC »

I’m guessing memory sticks are some way from carrying a black hole warning sign?