CATEGORY:
Trick or Truth Essay Contest (2015)
TOPIC:
In Defense of Universe Hacking by Harlan Swyers
Author Harlan Swyers wrote on Jan. 16, 2015 @ 21:27 GMT
Essay Abstract: If we imagine the universe as a set of data and each observer as an outcome of a series of measurement operations, then we can view ourselves as a unique series of cryptographic keys reflecting a particular track in the data (like a bubble track in a cloud chamber). We can turn to computer science for examples of perfectly deterministic systems in which we regularly try to anonymize data rather than decipher it, and use insights from that field to understand physical problems. Onion routing, for example, uses Chaum mixes as part of a mix network to encrypt data so that each node in a chain knows only the node the data came from and the node it is to be sent to. This has implications for the information paradoxes surrounding black holes and for questions of computational complexity, and it suggests how decoherence approaches that include the environment as an active participant can avert the paradox and provide an additional means to connect mathematics and physics.
Author Bio: Hal Swyers holds an M.S. in Environmental Management from the University of Maryland University College. He studies physics and mathematics as a personal hobby; all content provided purely reflects his own opinions and should not be construed to reflect the opinions of others.
Download Essay PDF File
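A minimal, purely illustrative sketch of the onion-routing idea mentioned in the abstract (the relay names, keys, and the toy XOR "cipher" below are my own stand-ins, not real cryptography and not the essay's construction): each relay peels exactly one layer and learns only the next hop.

import hashlib

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a SHA-256 derived keystream.
    Applying it twice with the same key recovers the plaintext."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def build_onion(message: bytes, route):
    """route: list of (relay_name, relay_key) pairs in forwarding order.
    The innermost layer holds the message; each outer layer tells one relay
    only where to forward the remaining (still encrypted) blob."""
    next_hops = [name for name, _ in route[1:]] + ["DELIVER"]
    packet = message
    for (_, key), hop in zip(reversed(route), reversed(next_hops)):
        packet = xor_crypt(key, hop.encode() + b"|" + packet)
    return packet

def peel(packet: bytes, key: bytes):
    """One relay's view: strip its own layer and learn only the next hop."""
    plain = xor_crypt(key, packet)
    hop, _, inner = plain.partition(b"|")
    return hop.decode(), inner

route = [("relay1", b"key-one"), ("relay2", b"key-two"), ("relay3", b"key-three")]
packet = build_onion(b"hello universe", route)
for name, key in route:
    hop, packet = peel(packet, key)
    print(name, "forwards to", hop)   # each relay sees only its successor
print("delivered payload:", packet)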
Edwin Eugene Klingman wrote on Jan. 17, 2015 @ 23:51 GMT
Dear Harlan,
I believe that the Fourier series is at the root of many of today's physics concepts, most especially those that emphasize 'superposition'. I also very much like your description of algorithms as processes that do not change with changes in language and (ideally) produce the same result, and your interesting remarks on random numbers.
You discuss von Neumann's take on hidden variables and quantum incompleteness. As you know, having read and commented on my essay, I treat a local Stern-Gerlach model that does "describe the system exactly and with certainty", and I claim that Bell is erasing the hidden variable information with his hidden constraints. Also note that the energy-exchange process effectively dissipates precession energy, ending with the particle spin aligned with the local field. But this appears to me to represent a loss of information, represented in quantum mechanics as 'collapse of the wave function'. If that is the case, then I would expect 'loss of information' in a black hole.
In an earlier essay I treat information as a change in structure ('in-form'ing) effected by energy, as a record of the energy. This precludes the existence of information as a 'substance' or physical entity of any kind. And my current essay brings the existence of 'entanglement' into question. Finally, a number of cosmologists are questioning black holes in favor of ECOs (eternally collapsing objects). So, questioning the concepts of information, black holes, and entanglement, I have not worked on the "firewall" problem, but I believe you've represented well the current state of the problem. And I very much like the way you end your essay!
Thanks for reading my essay and commenting, and thanks for entering your own essay.
Edwin Eugene Klingman
Author Harlan Swyers replied on Jan. 19, 2015 @ 13:53 GMT
Edwin,
Thanks so much for the response. Likewise, thank you for reading the essay and your comments.
Best
Harlan
Bob Shour wrote on Jan. 18, 2015 @ 19:46 GMT
Dear Harlan Swyers
An algorithm as an invariant process is an interesting characterization. I have not before encountered Onion routing or Chaum mixing. They make interesting analogies. Your article left me wishing to know more about quantum mechanics. Thank you for writing your article, and thank you for commenting on mine.
Best wishes,
Bob Shour
Author Harlan Swyers replied on Jan. 19, 2015 @ 13:55 GMT
Bob,
Likewise, thank you for your comments. I think the concept of naturally occurring algorithms is something worth contemplating. Best regards and thanks again.
Harlan
Lawrence B Crowell wrote on Jan. 21, 2015 @ 19:43 GMT
Dear Hal Swyers,
For somebody who has no degree in physics, your paper is remarkable. It is at a high level.
The problem AMPS poses is that the entanglement between a black hole and its radiation continues to grow. This means a crisis occurs where the black hole has more entanglement information than is permitted by the Bekenstein bound. There were early ideas about black hole remnants that held a huge amount of quantum entanglement information. In order to save the situation, one of two things must be admitted. Either the black hole transfers its entanglement with the early Hawking radiation to the outside, which requires converting bipartite entanglements into tripartite entanglements; this is a form of cloning and is not unitary. Or, to save unitarity, it must be admitted that the equivalence principle breaks down, so that no more information can be transferred into the black hole.
I think that entanglement algebras are themselves uncertain. In quantum gravity there is an uncertainty as to whether an entanglement is bipartite or tripartite or GHZ entangled. I think this is a manifestation of cobordism with the structure of spacetime.
I will have an essay here, probably in the next batch that shows up, in which I discuss some of these matters. You might find it interesting.
Cheers LC
Author Harlan Swyers replied on Jan. 23, 2015 @ 01:10 GMT
Lawrence,
Thanks for the kind words! Indeed, a lot of personal time has been spent studying physics. I look forward to reading your essay; I am sure there will be some good discussion on this issue.
Best
Harlan
Lawrence B Crowell replied on Jan. 25, 2015 @ 00:49 GMT
I and some other people are working to find an equivalency, an isomorphism of sorts, between topological cobordisms of spacetime and entanglement geometries. The spacetime of importance is the AdS spacetime and the topological changes are local quantum fluctuations. These should be equivalent to the entanglement geometries of quantum states on the boundary.
Stay tuned, maybe this will work. Then again it might of course be just wrong.
LC
Lawrence B Crowell wrote on Jan. 29, 2015 @ 22:35 GMT
Dear Harlan,
I read your paper again. I am sufficiently impressed to give it the maximum score. This means my paper is now showing, which you might be interested in reading at:
http://fqxi.org/community/forum/topic/2320
Cheers LC
Author Harlan Swyers replied on Jan. 30, 2015 @ 01:42 GMT
Lawrence,
Thanks much! Looking forward to reading your paper!
Best
Harlan
Lawrence B Crowell replied on Jan. 31, 2015 @ 21:33 GMT
My essay is up on the essay contest list now. I don't talk about the firewall explicitly, but the brief discussion I gave on some of my work does address this.
Thanks, LC
Gary D. Simpson wrote on Jan. 31, 2015 @ 13:14 GMT
Harlan,
Many thanks. This is a well written and thoughtful essay.
The idea of an observer as the unique result of a set of measurement operations is sublime. It enables me to envision the universe and the observer as a single entity.
I sometimes struggle with bra-ket notation but I found your use of it to be very effective. I think it is the supporting text and thinking that makes it understandable.
I am something of a skeptic regarding information theory and such but you have made me reconsider this.
I love the ending.
Best Regards and Good Luck,
Gary Simpson
Author Harlan Swyers replied on Feb. 15, 2015 @ 14:51 GMT
Gary,
Thanks for the nice comments.
The best way I have found to understand bra-ket notation is to remember that the bra is always the (complex) conjugate of the ket.
In complex terms, if a ket represents a complex number like 3+4i, then the bra is 3-4i; or, for instance, if the ket is e^(ix), the bra is e^(-ix).
A bra paired with its conjugate ket gives the squared norm, unless an operator O is inserted between them that changes the ket into some other state, in which case you get the average (expected) value of the operator. In QM, because we are dealing with probabilities, the squared norm is always equal to one.
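To make this concrete, here is a small numerical sketch (my own illustration using NumPy; the state and the operator are arbitrary examples, not anything from the essay):

import numpy as np

# A ket is a column of complex amplitudes; here an arbitrary two-level state.
ket = np.array([3 + 4j, 1 - 2j])
ket = ket / np.linalg.norm(ket)        # normalize so that <psi|psi> = 1

# The bra is the complex conjugate (conjugate transpose) of the ket.
bra = ket.conj()

# <psi|psi>: the squared norm, approximately 1 for a normalized state.
print(bra @ ket)

# <psi|O|psi>: the average (expected) value of an operator O.
O = np.array([[1, 0], [0, -1]])        # Pauli-Z as an example operator
print((bra @ O @ ket).real)            # real-valued for a Hermitian operator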
Hopefully that helps.
Cheers!
Harlan
Christophe Tournayre wrote on Jan. 31, 2015 @ 15:36 GMT
Thank you for your essay. I found it very interesting, though I struggled with some parts (that's my fault, not yours).
At the end of your essay, you said : “This means our environment serves as the seemingly unhackable database of information. However, human nature being what it is…”
I have a question: If we want to make our environment even more unhackable, how would you approach the challenge?
Christophe Tournayre replied on Jan. 31, 2015 @ 16:02 GMT
Sorry, I wrote my question too quickly. It should have been: "what would make our environment even more unhackable?"
Author Harlan Swyers replied on Feb. 16, 2015 @ 14:50 GMT
Christophe,
Sorry for the late response.
The only way to make sense of a truly unhackable system would be for the system to be completely mutually exclusive from all others for all time. In this sense you can imagine two truly parallel systems with absolutely no interaction. This is a much stronger condition than mere independence. For instance, when people talk about parallel universes, they are often referring to two co-evolving independent states. True mutual exclusivity requires no commonality: the two systems share no common outcomes, so one system could never know the outcome of the other; it would effectively be unhackable.
Cheers!
Harlan
Christophe Tournayre replied on Feb. 28, 2015 @ 13:11 GMT
Harlan,
Thank you for responding. I know my question sounded stupid. I liked your approach but could not understand how you got to it.
Regards,
Christophe
Sylvain Poirier wrote on Feb. 8, 2015 @ 23:45 GMT
You wrote: "At first we might see a problem in the existence of Aspect’s machine, since here we have a system that can be defined and constructed with a relatively minimal number of statements, but yet is capable of producing strings of any possible length which can then be proved to be random through violations of Bell’s inequalities showing their quantum origin."
You are mixing 2 very...
view entire post
You wrote: "At first we might see a problem in the existence of Aspect’s machine, since here we have a system that can be defined and constructed with a relatively minimal number of statements, but yet is capable of producing strings of any possible length which can then be proved to be random through violations of Bell’s inequalities showing their quantum origin."
You are mixing two very different notions of randomness. The notion of quantum randomness is relative to the time when a device is in a specific state and ready to produce a number; that number will come at random in the sense that it is not yet determined, since repeating the same experiment with exact copies of the device in the same initial state may give different results. There is a physical indetermination in the choice among the many possible values the device can still produce.
On the other hand, you mentioned the notion of whether a given fixed number is a random number or not. This is a completely different question.
Indeed, by Chaitin's theorem it is not possible to prove that any specific large number is random; however, this does not change the fact that most large numbers are random. Concretely, when a quantum device is about to produce a number at random (random in the physical sense that quantum paradoxes establish: its value is not fixed in advance, and every possible value still has a real chance of being produced at that time), there is a provably high probability that the future, yet-undetermined output has the property of "being a random number" in the sense of Kolmogorov complexity.
This "high probability of being a random number" reflects the fact that the expected complexity of the output is high.
The point is that the provably high expected complexity of the future, undetermined output does not change the fact that, by Chaitin's theorem, the high complexity of any specific number among the possibilities cannot be formally proved.
To explain this "paradox" very simply (I do not even find it strange), imagine a function f that plays the role of Kolmogorov complexity, applied to a future output that may turn out to be either a or b with probability 1/2 each.
Then we know that the expected level of complexity is high: f(a) + f(b) > 2 is provable. However, for each specific possibility there is no way to prove its high complexity: neither f(a) > 1 nor f(b) > 1 is provable.
Nevertheless, we have a proof of f(a) + f(b) > 2 and thus of (f(a) > 1 or f(b) > 1).
But this provability of (f(a) > 1 or f(b) > 1) does not in any way contradict the fact that f(a) > 1 is unprovable and f(b) > 1 is unprovable, because the formal proof of (f(a) > 1 or f(b) > 1) gives no way to formally determine which of the two numbers f(a) and f(b) exceeds 1.
Still, there is a real mathematical truth about it, but it is not algorithmically computable. (This non-computability should not be confused with physical indetermination.)
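To make the complexity notion a bit more tangible, here is a rough sketch of my own (compressed length with zlib is only a crude, computable upper bound, since Kolmogorov complexity itself is uncomputable):

import os
import zlib

# A highly regular 10,000-byte string versus a "typical" string that is
# almost surely incompressible.
patterned = b"01" * 5000
typical = os.urandom(10000)

print(len(zlib.compress(patterned)))  # small: the pattern has a short description
print(len(zlib.compress(typical)))    # close to 10,000: no short description found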
If you are amazed that physical devices may produce random numbers, well, remember that a physical quantum device is NOT a classical deterministic Turing machine, which is all that Chaitin's theorem refers to. As I explained in my essay, I interpret quantum randomness as having a non-physical source (consciousness). See also my arguments against the hidden-variable approach.
You wrote: "As photons are generated by excited atoms, we can only state the exact polarization of photons is indeterminate prior to measurement. Since the polarization states are generated from some quantum source, then Aspect’s machine contains a component that is indescribable in the formal language that can describe its construction and prove its result."
It all depends on the precise experiment. Some kinds of excited atoms may have a spin that determines the polarization of the emitted photon. In practice the result is usually indeterminate because systems are made of many atoms whose spins are randomly distributed, full of entropy.
You wrote: "real numbers can be characterized as being rational functions, which are simply the ratio of polynomials"
Looking at the reference you gave, where this amazing result on the simple nature of real numbers comes from, the explanation is the following:
"the real numbers are a subset of the rational functions" where "By a rational function in the variable t, we will mean a function of the form p(t)/q(t), where p(t) and q(t) are polynomials with standard real coefficients", so that "for instance, the number 7 can be viewed as 7/1, where 7 and 1 are both polynomials of degree 0". So more generally, in this way we can get any real number u in the form of the rational function u/1, where u and 1 are seen as polynomials of degree 0 with real coefficients. Wow!
Author Harlan Swyers replied on Feb. 15, 2015 @ 14:31 GMT
Sylvain,
Thanks for the nice editorial.
I am quite aware of the difference, but perhaps wasn't clear in the write-up.
Aspect's machine is designed to signal whether two photons have coincident polarizations. The setup measures one set of polarizations out of a larger set. That means that, over a period of time, while the generation of photon pairs can be controlled to occur at some constant rate, the photon pairs that produce coincidences must fire at random. There can be no discernible pattern in the ones and zeros measured within any given interval at a specific polarization. The sequence is provably random because it can be shown to come from a quantum process; no classical means can discern this.
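As a quick sanity check on the Bell-inequality side of this, the standard CHSH arithmetic can be verified in a few lines (a sketch of my own using the textbook quantum prediction for polarization correlations, not a model of Aspect's actual apparatus):

import math

def E(a, b):
    """Quantum prediction for the polarization correlation between analyzers
    set at angles a and b for a polarization-entangled photon pair."""
    return math.cos(2 * (a - b))

# Analyzer settings (radians) that maximize the CHSH violation.
a1, a2 = 0.0, math.pi / 4
b1, b2 = math.pi / 8, 3 * math.pi / 8

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(S)   # ~2.828 = 2*sqrt(2), exceeding the local hidden-variable bound of 2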
I am sorry you read more into what I was saying, but if you check the reference to Sidney Coleman's video, these things should be clear to you.
Thank you for the comments.
Cheers!
Harlan
Sujatha Jagannathan wrote on Feb. 16, 2015 @ 09:13 GMT
Your work is tech-driven, and it goes on to describe very detailed structures of our environment in a greener way!
Sincerely,
Miss. Sujatha Jagannathan
Richard Lewis wrote on Feb. 16, 2015 @ 11:18 GMT
Dear Harlan,
I enjoyed reading your essay, particularly as I am interested in the phenomena of quantum entanglement.
In your essay you make the point that: 'This serves to enforce the locality on interacting fields, e.g. it ensures that faster than light interactions do not occur.'
To take the case of a photon interacting with an electron, my understanding is that during transmission the photon always travels at the speed of light, but at the point of interaction (observation) effects can occur faster than the speed of light. So non-local effects can occur at the point of measurement while preserving the fact that useful information cannot be transmitted faster than the speed of light.
Is this in line with your thinking on the subject of non-locality in quantum entanglement?
Regards
Richard
Author Harlan Swyers replied on Feb. 16, 2015 @ 14:39 GMT
Richard,
Thanks for the nice comments and reading the essay.
Sidney Coleman's video referenced in the paper does a nice job explaining that there are no faster-than-light effects. There is no signaling between entangled, spatially separated systems, and there is no interaction Hamiltonian; the systems are simply in an entangled state.
Classically, you would understand this situation as the case of a right and a left shoe placed in separate boxes and carried to opposite sides of the galaxy. When someone opened the box on one side of the galaxy and saw a right shoe, they would know instantly that the box on the other side contained a left shoe. There is no mystery here.
The quantum version of this requires that the system is not in a definite state prior to opening the boxes. Once an observation is made by one observer, the other observer cannot have made a contradictory observation. This places each local observer in a slightly privileged position, since their decisions affect the outcomes they can experience.
Again, there is no faster-than-light action here; it is merely a consistency requirement. This consistency requirement is the source of the Many Worlds Interpretation, which argues that it is entirely possible for the other observer to see an inconsistent outcome if they are in a separate branching universe. Regardless, such an outcome is embedded in the evolution of the overall wave function.
The point is that we ourselves are tied to an evolution that is a subsystem of the greater whole. Our reality is only formulated in the context of outcomes of earlier evolutions, and what we experience must be consistent with the evolution of the wave function within our particular subsystem.
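For anyone who wants to see the no-signaling point numerically, here is a small sketch (my own, with NumPy, using a standard Bell pair rather than anything specific from the essay): whichever basis one observer measures in, the other observer's local statistics are unchanged.

import numpy as np

# Bell state (|00> + |11>)/sqrt(2); rho is the pair's density matrix,
# with the first factor "Alice" and the second "Bob".
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def bob_after_alice(theta):
    """Bob's local state when Alice measures in a basis rotated by theta,
    averaged over her two outcomes (what Bob sees before any classical message)."""
    c, s = np.cos(theta), np.sin(theta)
    outcomes = [np.array([c, s], dtype=complex), np.array([-s, c], dtype=complex)]
    rho_after = np.zeros((4, 4), dtype=complex)
    for v in outcomes:
        P = np.kron(np.outer(v, v.conj()), np.eye(2))   # project Alice's outcome
        rho_after += P @ rho @ P
    # Trace out Alice to get Bob's 2x2 reduced density matrix.
    return np.einsum('abad->bd', rho_after.reshape(2, 2, 2, 2))

print(np.round(bob_after_alice(0.0), 6))     # ~[[0.5, 0], [0, 0.5]]
print(np.round(bob_after_alice(1.234), 6))   # identical: maximally mixed, no signal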
Hope that helps,
Harlan
Eckard Blumschein wrote on Feb. 22, 2015 @ 09:18 GMT
Harlan,
May I remind you of my simple "off" vs. "of" question which I don't expect to be answered by Lee Smolin himself?
Eckard
Joe Fisher wrote on Mar. 31, 2015 @ 15:11 GMT
Dear Mr. Swyers,
I thought that your engrossing essay was exceptionally well written and I do hope that it fares well in the competition.
I think Newton was wrong about abstract gravity; Einstein was wrong about abstract space/time, and Hawking was wrong about the explosive capability of NOTHING.
All I ask is that you give my essay WHY THE REAL UNIVERSE IS NOT MATHEMATICAL a fair reading and that you allow me to answer any objections you may leave in my comment box about it.
Joe Fisher