Information Theory Quotes

There are 199 quotes

"Information exists; it's not in a special medium. It always has to be in some physical medium, but you really have to consider information in your theory." - Daniel Dennett
"We know at present there is no naturalistic explanation that produces information, not natural selection, not self-organizational processes, not pure chance, but we do know of a cause which is capable of producing information, and that is intelligence."
"The reason we age is that it's the analog information in the body that's lost over time, not the digital."
"We have access to a lot more things that biologists don't. And sure we can use information theory and complexity theory to try and analyze the critters' behaviors and their brains."
"Whenever we find information, whether it's in a section of software or a paragraph in a book... and we trace that information back to its ultimate source, we always come to a mind, not a material process."
"The uncertainty principle is a statement about how much information we are ever able to extract from a quantum system."
"DNA is a particular machine that encodes an unlimited amount of information in chemical form and has figured out a way to replicate itself."
"Information regarding whatever object falls into a black hole is not lost because that information gets stored holographically outside the black hole itself."
"The basic rule of physics is nothing really ever gets lost; information never gets lost."
"Information maps onto a space where we can make predictions. It is information that allows us to be conscious beings, it is information that allows us to be living beings."
"No room can ever have as much information as is implicit in [the universe]. The most accurate description we can ever have is in terms of a number of degrees of freedom which is proportional to the area and not the volume."
"The world is pixelated, not voxelated. The degrees of freedom of the world, the most accurate description we can ever have, is in terms of a number of degrees of freedom which is proportional to the area and not the volume."
"I read a really interesting paper recently... It was an academic paper detailing the differences between Echo Chambers and epistemic Bubbles."
"Living organisms only really make sense when often when they're described in terms of information and the management of information."
"He's the father of information theory, he wrote this absolutely seminal, super important paper about transmitting information and storing it and encoding it and things like that. It really gave birth to the information age and a lot of the foundations for computer science."
"Life wants to get the most information possible around its surroundings and complexity is in fact the ability to gather and exchange and preserve the most information possible."
"It's an extension of classical information theory with complex numbers and if you run through that math you get some very interesting results."
"What we think is objectively real is nothing more than an emergent construct of underlying mathematical informational processes."
"The new view on gravity has to do with information."
"Information is information, not matter or energy. No materialism that does not admit this can survive at the present day."
"Entropy can be considered as a measurement of the amount of hidden information in a system."
"Information might in some sense, according to this argument, be in some sense the very basic layer of our understanding of the universe."
"The holographic principle is built on an observation... that there is a connection between the geometry of space and time and its information content."
"We know from our experience that information always arises from an intelligent source."
"In all codes and languages, there are vastly more ways of arranging characters that will generate gibberish than there are arrangements that will generate meaningful sequences."
"Information, especially in a digital form, always comes from an intelligent source."
"Shannon had founded a new, far-reaching theory; the ideas he began to explore would form the cornerstone of what we now call information theory."
"For any lossless compression method, the fundamental lower bound on the average length per symbol is the entropy of the source distribution."
"This was a key part of Shannon's Source Coding Theorem."
"The more unpredictable a distribution is, the higher the information entropy is for the distribution."
"The Shannon limit or Shannon capacity... is like the speed of light."
"The question of how much we can compress a message fundamentally is also asking about how much information is contained in that message."
"Information is really the most fundamental thing, and it gives rise to the physical - a notion he pithily summarized with the expression 'it from bit'." - John Archibald Wheeler
"Shannon entropy can be thought of as the amount of hidden information in a system - or more precisely the amount of information we can hope to gain by making a measurement on the system."
"The number of bits of information that a system can contain is by definition the logarithm to the base 2 of the number of states."
"The information is primary, and the matter is secondary."
"The shape itself is designed to manipulate fields of quantum information at the very smallest level."
"If you're considering DNA is information then yes mutations can add information and they do."
"Physicists have come to the idea that information is underlying, it's fundamental to the universe."
"Life is the only thing in the universe that can store and pass along information, reproduce itself, and evolve."
"We've been accidentally dumping information into the galaxy for over half a century."
"All information is produced by someone, but it’s also produced for a purpose."
"Human DNA is a vast multi-dimensional library of information."
"We have lots of amazing coded information and symmetry... but there's a divine orchestration."
"The modern information age would require another idea, one that would finally pin down the nature of information and its relationship to the order and disorder of the universe."
"In this paper Shannon did something absolutely incredible, he took the vague and mysterious concepts of information and managed to pin it down."
"The bit is the smallest quantity of information, it is highly significant because it's the fundamental item, it is the smallest unit of information in which there is sufficient discrimination to communicate anything at all."
"But information isn't just something humans create, we're beginning to understand that this concept lies at the heart not only of 21st-century human society but also at the heart of the physical world itself."
"The discovery that information is running the show in life suggests a designing intelligence."
"Animals survive their physical form, while humans survive the information in their consciousness."
"Your DNA is an antenna receiving information from somewhere."
"If we're living in a simulation, it's all information that's being rendered."
"It's just different combinations of information, the loss of information and just the conservation of information that was already there."
"The key to the mystery of the origin of life was actually information, it was code."
"Reality is just information, comprehended through the lens of information."
"I'm saying that a mind is the only known thing that's capable of producing information."
"The only source capable of producing information is intelligence."
"We know of a cause that can produce lots of new information and it's a mind or an intelligence."
"Geopolitics is this wonderful game of fractal information."
"Entanglement is the second property that gives quantum information a really unique difference."
"Information is negative entropy... the universe is not driven by matter and energy but information at the edges."
"I fundamentally believe that if information gets processed a certain way, then it is conscious."
"Almost all information in the universe is in the form of black holes."
"If everything is entangled, maybe we're entangled in terms of information."
"Data is really the basic building blocks of our information, our knowledge, and our wisdom."
"Black holes contain, by far, most of the entropy in the universe, and require most information to fully describe."
"Information will always, if it's used properly, constrain outcomes. It selects certain outcomes and removes others."
"Our whole notion of what scientific explanation is might change. It might be all informational, virtual already."
"You can't lose information. What goes in must come out."
"Information is indestructible, yeah it might change shape but it can never be lost."
"Information has started to crop up in area after area of physics at a very fundamental level."
"Maybe information is not just some coincidental little thing, but maybe information is really, truly fundamental."
"Information is everywhere; it's never destroyed. It's always there; you just have to tap into it."
"Consciousness is the fundamental information structure of the universe, information moving through its cycles, feed forward feedback information throughout the whole network of creation."
"The laws of nature preserve information entirely, so all the details that make up you and the story of your grandmother's life are immortal."
"Shannon did something absolutely incredible. He took the vague, mysterious concept of information and managed to pin it down."
"In his paper, Shannon showed that a single binary digit, one of these ones or zeros, is a fundamental unit of information."
"The humble bit turned out to be an enormously powerful idea."
"The power of the bit lay in its universality."
"Whatever is the minimum between the cause information and the effect information is carried forward as the cause-effect information of the mechanism being in its current state."
"Reality is made of information; the quantum world expresses information which is non-reproducible, therefore knowable only from the inside."
"Information is completely contextual. It is never absolute."
"Information is really a mutual entropy, a shared entropy. It's a difference of entropies."
"There is something about information theory that presents a barrier, a hurdle to acceptance."
"The reason it's called the holographic principle is because a hologram generates a realistic-looking three-dimensional image. But all the information needed to describe the hologram is contained on a two-dimensional surface."
"Coded information doesn't come about by physics and chemistry. It has to come about from intelligence."
"the great challenge raised in the 70s and 80s which has been solved i would say in the last two or three years to an extent is that challenge does the information get destroyed we think it doesn't we think it comes out and it's encoded in the hawking radiation."
"Information gives rise to every 'it', every particle, every field of force, even the space-time continuum itself."
"Entropy is used in a lot of cryptography and information theory, it's really important."
"So, why is it that on the quantum level, information seems to be doing just this?"
"Information always comes from a mind."
"...there's reason to believe the universe has a finite set of information in it."
"Information theory is a powerful framework for reasoning about the way that data can contain information."
"Shannon both invented information theory and solved some of its most important problems."
"Shannon's proposal was to define information content as the log of one over the probability of the outcome."
"Information content increases as the probability of the event gets smaller."
"...the expectation of X has to equal the expected expectation of X given some information."
"Gravity does not lead to the loss of information after all."
"Everything from nothing: information theory reveals the equivalence between the totality of all information and the nothingness of zero information."
"Whenever we're dealing with information, whenever we find information and we trace it back to its source, we always come to a mind, not a material process."
"To get an evolutionary process going anywhere in the universe, you still need to solve this fundamental information problem."
"Entropy is not just a property of a system... it's a property of a system together with your state of knowledge about the system."
"What the logarithm measures is the number of bits of information."
"There must be some description that you can use for A which contains all possible experimental information that you could do on A if you're never interested in also looking at B."
"Entropy is essentially synonymous with information. Entropy means information and surprisal."
"Consciousness arises from the integrated information processing that occurs in the brain."
"Life is really about information and logic and information processing."
"From bit symbolizes the idea that every item of the physical world has at bottom, in most instances, an immaterial source and explanation." - John Archibald Wheeler
"Our deeply ingrained belief in flowing time and an unknown future is not due to the non-existence of the future, but rather physical limitations on how systems are allowed to process information."
"I think it's really interesting is the biomechanics where they intersect with information Theory."
"Information cannot travel back in time. If it could, then you could have grandfather paradoxes where you can have something and not something at the same time in the same sense."
"Welcome back, so I'm really excited today to tell you about the Shannon Nyquist sampling theorem."
"It's more accurate to think of this as the maximum speed information can move at."
"Aging is simply loss of information."
"Information causes change. If it doesn't, it's not information."
"Randomness is your enemy. Randomness is called error. Randomness is called noise. You want signal, not noise."
"The number of combinations of ways that you can order information and then package these is much more than the time would allow in our universe."
"Information theory is about entropy, mutual information, entropy maximization, and how this relates to neuron spiking."
"Matter and energy are derivative; information is primary."
"Teleportation is really a cornerstone of quantum information theory and variations on the idea also often come up."
"So at least in this case, the one where we understand quantum gravity the best, it seems clear that a black hole does not destroy quantum information."
"Consciousness emerges because it's the simplest mechanism to train self-organizing information processing systems."
"We can explain the world by the composition of informationally independent pieces, independent mechanisms."
"The holographic principle in gravity says that the horizon degrees of freedom of black holes capture all information of the bulk."
"Information is a physical quantity as real as energy, as real as temperature, as real as light."
"The smallest lossless prediction or otherwise known as the minimum description length."
"If it's a quantum book written in qubits where these pages are very highly entangled, there's still a lot of information in the book, but you can't read it the way I just described."
"It's a whole different way of dealing with information. There's quantum information."
"Entanglement actually allows teleportation of information quantum information from one location to another."
"Information theory was invented by Claude Shannon to solve communication problems."
"When information flows into a black hole, its surface area increases in proportion to the amount of information."
"In physics, information is preserved and everything can be traced back to its roots."
"The relative entropy is a measure of distance between probability distributions."
"Quantum information cannot be deleted; it is a pretty big part of quantum mechanics: information is indestructible."
"Consciousness is perhaps fundamental, and that the underlying sort of substance of the universe is information of which our consciousness is part of."
"Information is not lost in black hole evaporation; stuff falls in, it evaporates, and you could in principle recover the information that fell in."
"Coded symbolic information needs a set of instructions to tell the system how to build another one."
"Information has its own laws and its own miracles."
"So that will be 1. So a coin flip has one bit of information."
"Information can only be generated by an intelligent agency; it doesn't come accidentally."
"Understanding the non-covalent structure of DNA is so critical to understanding information storage and information transfer."
"The power of superposition is very enormous by just having 300 electrons; it contains a lot of information."
"DNA functions in the information storage and RNA functions as information retrieval."
"The information paradox... could revolutionize our understanding of information entropy and the fabric of spacetime itself."
"All known laws of physics require such extrinsic inputs of information about the features of the system."
"The latent variables are encoding high-level semantic information about the scene."
"Maximize the mutual information between our data and the learned representations."
"This one simple linear flow of information has now become more of a circular flow of information."
"We move from the physical aspect of reality, energy, to something more subtle which is information."
"Bitcoin is a breakthrough in information theory that allows people to conduct provable transactions without reference to trusted third parties."
"The scope of possible information is infinite."
"Consciousness is a digital information system."
"Shannon's noisy Channel coding theorem says that if you add redundancy in a cunning way, you don't actually need to add very much redundancy to be able to achieve arbitrarily low probability of error."
"There is zero scientific evidence for materialism; in fact, there's evidence that reality is made of information in quantum mechanics."
"Negative probabilities work when we lack which way information."
"If the original signal is band-limited, then it's possible to sample in a way that preserves all the information."
"Signals that are band-limited can be sampled in a way that preserves all the information."
"Sampling preserves all of the information."
"These purely computational information theoretic ideas actually have physical consequences."
"If X and Y are independent, then conditioning on X gives us no information about Y."
"The biggest enigma in the origin of life is the emergence of the information processing mechanisms."
"Information processing is going to keep on being an important way of carrying forward our understanding."
"My own personal interest in information theory is not only in the solutions that we come up with but also in the beautiful questions that the field asks."
"The right way to measure information content is with log base 2 of 1 over P of an outcome."
"Shannon says we should be able to achieve more than 10-fold compression of that bent coin file."
"The entropy may often be much smaller than 1 bit per character."
"For any channel, you can work out its capacity by maximizing mutual information."
"The capacity of the binary erasure channel is 1 minus F."
"The only cause now in operation that explains the effect of information is intelligence."
"What soul is, is another level of information."
"The soul is information on a higher arc."
"The Fischer information actually gives us a notion of distance in model space."
"In information theory, the cross entropy between two probability distributions over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set."
"The entropy is a sensible measure of expected or average information content."
"The general claim is that the Shannon information content of an outcome is a sensible way to measure how much information that outcome gives you."
"For BCH codes, we do not have a control on the information length K."
"Entropy is a measure of uncertainty."
"The shortest path tells us something super interesting about the minimal amount of information required to propagate that motif in time and space."
"Information theory is the scientific study of the quantification, storage, and communication of digital information."
"If you could transmit information faster than the speed of light, then from another person's point of view, you'd transmit information backward in time."
"Information is defined as the capacity to reduce uncertainty."
"The self-information of an event is defined as the logarithm of one divided by the probability of that event."
"The entropy of a variable can quantify the uncertainty or the entropy of individual variables."
"The mutual information is the amount of information that each of these variables know about the other."
"The uncertainty left about one variable after knowing the other is what we call the conditional entropy."
"The entropy of a discrete random variable is a measure of its uncertainty."
"Minimizing uncertainty, or very equivalently, maximizing information, that's what learning is about."
"Aging is essentially what's driving all of these things is a loss of information."
"Increasing the amount of information predicted actually results in better representations."
"The mutual information is a measure of how much dependence there is between the two random variables."
"The capacity of the channel Q is the maximum over all possible input distributions PX of the mutual information between the input and the output."
"Reliable, that is virtually error-free communication, is possible over this channel at any rate up to the capacity of the channel."
"We are energy, we are information."
"At the end of the day, no matter what media we use to represent information, it all reduces to zeros and ones."
"The noisy channel coding theorem... it's an existence proof, but it doesn't give you the recipe of finding that code."
"If the source rate is less than the channel capacity per unit time, then there exists a coding scheme for which the source output can be transmitted over this noisy channel and reconstructed with an arbitrarily low probability of error."
"Instead of having a large lookup table of codewords, one can simply have a generator matrix."
"Claude Shannon is the father of information theory."
"Minimizing the contrastive loss is equivalent to maximizing the lower bound of the mutual information between two views."
"Intelligence is all about information processing."