So I wrote a bunch of stuff in response (included below in case you're
interested). But in retrospect I think that my issue is that there are
multiple definitions of "information" at play here, and the video
switches between them. Here are some possible ways to define "information":
1) Information is a measure of the number of underlying degrees of
freedom in a physical system.
2) Information is a measure of the minimal compressed size of a complete
description of a physical system.
3) Information is a measure of the size of the smallest *physically
feasible* compressed representation of the complete description of a
physical system.
4) Information is a measure of the degree of one's (possibly incomplete)
knowledge about the complete state of a physical system.
5) Information is the amount of one's ignorance as to the complete state
of a physical system.
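To make the contrast concrete, here's a toy sketch of my own (pure Python) of the difference between a definition-(2)-style measure and a definition-(5)-style measure. zlib is just a stand-in for an idealized compressor, since true minimal compressed size (Kolmogorov complexity) is uncomputable:

```python
import math
import zlib

# Definition (2)-flavoured: compressed size of a *complete* description
# of one particular microstate.
description = b"up down up down " * 64   # a highly regular microstate
assert len(zlib.compress(description)) < len(description)  # regularity compresses

# Definition (5)-flavoured: Shannon entropy of *our* probability
# distribution over the system's possible states, i.e. our ignorance.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))        # 0 bits: state fully known, no ignorance
print(shannon_entropy([0.25] * 4))   # 2 bits: four equally likely states
```

Note that the two measures answer different questions: the first is about one state, the second about a distribution over states, which is why sliding between them causes trouble.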
For what it's worth, definition (1) is the one I'm used to (see here
https://en.wikipedia.org/wiki/Physical_information#Classical_versus_quantum_information),
although the video doesn't seem to be using it at all.
Anyway, here's where specifically I think the video gets into trouble.
As near as I can tell, it makes an argument that switches which
definition of information it's using partway through:
1. information is the same thing as entropy
- true only for definition (5)
2. the entropy of the universe is always increasing
3. therefore the information of the universe is always increasing
- since it relies on point 1, also true only for definition (5)
4. deterministic physics cannot increase information
- true for definition (2), false for (3), (4) and (5)
5. therefore non-determinism (like QM) must be responsible for the
increase in information
- since there is no definition of information true for both points 3
and 4, this is an unfounded conclusion
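Point 4, read with definition-(2)-style counting, can be made concrete: a deterministic, reversible dynamics is a bijection on microstates, so it never merges or creates states, and the number of bits needed to single out the current state can't grow. A toy sketch (the 8-state system and the permutation are made up):

```python
import math

# A deterministic, reversible one-step update rule on 8 microstates
# is just a permutation (a hypothetical example):
perm = [3, 0, 7, 1, 5, 2, 6, 4]

states = set(range(8))
next_states = {perm[s] for s in states}

# The map never merges states, so the number of reachable states -- and
# hence the bits needed to say which state we're in -- stays the same:
print(len(states), len(next_states))   # 8 8
print(math.log2(len(next_states)))     # 3.0 bits, same as before the step
```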
Hopefully that conveys my confusion with the video's claims more
clearly? If you have another way of interpreting the video's argument
that makes more sense, I'd be interested to hear it.
Also, there's a nice and readable discussion of the relationship between
information and entropy here:
https://en.wikipedia.org/wiki/Physical_information#Physical_information_and_entropy
In fairness, it does closely match some parts of what the video is
trying to go for, but it has the advantage that it keeps the same
definition of "information" and doesn't end up with strange (and AFAIK
false) conclusions about QM being necessary for the second law of
thermodynamics to exist.
I dunno, maybe I'm just being unnecessarily grumpy about nit-picking
this. Information *is* a really cool and current topic in theoretical
physics, and the video does give semi-correct introductions to some of
the relevant ideas in an entertaining way. I just wish it were more
concerned with being accurate.
(old stuff I typed out follows)
On 1/10/2017 1:15 PM, clipka wrote:
>
> Is it truly confusion, or could it actually be insight?
>
I think it's confusion. Or, at least, even if the authors know their
stuff I think the video is very confusing.
This is not to say that it's complete bunk! There are very deep
connections between physics, entropy, and information, and AFAIK some
prominent physicists expect it to be an up-and-coming area of study in
the future of theoretical physics. So it's not that the video doesn't
look into some real and interesting topics, it's just that I think
someone watching it would be likely to be misled in some important details.
For instance: From the video I think someone would probably get the
impression that information and entropy are exactly the same thing, but
I think it's actually better to think of them as opposites. That is,
one very useful way to think of entropy is as a measure of the
uncertainty/ignorance of the underlying state of a physical system,
which is to say a measure of your *lack* of information about the
system. Which is sort of the opposite of what it seems the video is
trying to say. (note: after hastily typing this I realized that there
were multiple definitions of information at play, so probably ignore
this paragraph)
Maybe it's just an issue with me being confused by an otherwise
straightforward video, but certainly *I* have trouble making sense of
what it's saying in a way which isn't subtly false.
>
> Remember that quantum theory does away with the idea that there even
> /is/ such a thing as "the" state of the universe.
>
Hmm what do you mean? I was under the impression that quantum mechanics
describes the state of the universe perfectly well with a giant quantum
wave function? Not the notion of "state" that we're used to from
classical physics, but I was assuming that this would still count as a
"state".
>
> Also, from a quick glance on Wikipedia, it seems that there isn't really
> a clear consensus - let alone irrefutable proof - whether the amount of
> information in the universe is constant or not.
>
Certainly if you just take straight quantum mechanics then information
is conserved. My impression is that while there's not universal
agreement for cases outside of standard QM, the situations where
information might not be conserved are generally considered to be
problems which can hopefully be fixed with further analysis. Hence the
hand-wringing over the black hole information paradox.
But yeah, I probably overstated the amount of certainty in information
conservation. Nevertheless, I'm not aware of any seriously considered
physical theory which allows information to increase like the video
claims (taking "information" to mean the number of bits/qubits required
to completely specify the state of the universe). If you are aware of
such a theory I'd be super interested to hear about it.
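The sense in which straight quantum mechanics conserves information can be sketched in a few lines: unitary evolution preserves the norm of the state vector and is invertible, so no distinction between states is ever destroyed. A toy sketch in pure Python (the particular state and gate are just illustrative):

```python
import math

# A 2x2 unitary (the Hadamard gate) acting on a qubit's amplitude vector.
h = 1 / math.sqrt(2)
U = [[h, h], [h, -h]]

def apply(U, psi):
    return [sum(U[i][j] * psi[j] for j in range(2)) for i in range(2)]

def norm(psi):
    return math.sqrt(sum(abs(a) ** 2 for a in psi))

psi = [0.6, 0.8j]            # some normalised state
phi = apply(U, psi)
print(norm(psi), norm(phi))  # both ~1.0: unitary evolution preserves the norm

# Unitaries are invertible, so the earlier state is fully recoverable
# (the Hadamard gate happens to be its own inverse):
psi_back = apply(U, phi)     # recovers [0.6, 0.8j] up to rounding
```

Contrast this with the non-unitary parts of the story (measurement, or black hole evaporation in some treatments), which is where the conservation question actually gets contentious.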
>
> According to Heisenberg, shouldn't that be "uncertainly possible"? ;)
>
lol