Greater Thoughts that Cannot Be Imagined

Most scientists do not enter their chosen fields because the work is easy. They do their science mainly because it is challenging and rewarding when triumphant. Yet few scientists will ever taste the sweet dew drops of triumph — real, world-changing success — in their lifetimes. So it is perhaps remarkable that the small delights in science are sustaining enough for the human soul to warrant persistence and hard endeavour in the face of mostly mediocre results and relatively few cutting-edge breakthroughs.

Still, I like to think that most scientists get a real kick out of re-discovering results that others before them have already uncovered. I do not think there is any diminution for a true scientist in having been late to a discovery and not having publication priority. In fact I believe this to be universally true for people who are drawn into science for aesthetic reasons, people who just want to get good at science for the fun of it and to better appreciate the beauty in this world. If you are of this kind you likely know exactly what I mean. You could tomorrow stumble upon some theorem proven hundreds of years ago by Gauss or Euler or Brahmagupta and still revel in the sweet taste of insight and understanding.

Going even further, I think such moments of true insight are essential in the flowering of scientific aesthetic sensibilities and the instilling of a love for science in young children, or young at heart adults. “So what?” that you make this discovery a few hundred years later than someone else? They had a birth head start on you! The victory is truly still yours. And “so what?” that you have a few extra giants’ shoulders to stand upon? You also saw through the haze and fog of much more information overload and Internet noise and thought-pollution, so you can savour the moment like the genius you are.

Such moments of private discovery go unrecorded and must surely occur many millions of times more frequently than genuinely new discoveries and breakthroughs. Nevertheless, every such transient, invisible moment in human history must also be a little boost to the general happiness and welfare of all of humanity. Although only that one person may feel vibrant from their private moment of insight, their radiance surely influences the microcosm of people around them.

I cannot count how many such moments I have had. They are more than I will probably admit, since I cannot easily admit to any! But I think they occur quite a lot, in very small ways. However, back in the mid-1990s I had what I thought was a truly significant glimpse into the infinite. Sadly it had absolutely nothing to do with my PhD research, so I could only hurriedly write rough notes on recycled printout paper during the small hours of the morning when sleep eluded my body. To this day I am still dreaming about the ideas I had back then, and still trying to piece something together to publish. But it is not easy. So I will be trying to leak out a bit of what is in my mind in some of these WordPress pages. Likely what will get written will be very sketchy and denuded of technical detail. But I figure if I put the thoughts out into the Web maybe, somehow, some bright young person will catch them via Internet osmosis of a sort, and take them to a higher level.

[Image: geons vs superstrings]

There are a lot of threads to knit together, and I hardly know where to start. I have already started writing perhaps half a dozen manuscripts, none finished, most very sketchy. And this current writing is yet another forum I have begun.

The latest bit of reading I was doing gave me a little shove to start this topic anew. It happens from time to time that I return to studying Clifford Geometric Algebra (“GA” for short). The round-about way this happened last week was this:

  • Weary from reading a Complex Analysis book that promised a lot but was starting to get tedious, I took a light break: a YouTube search for a physics talk turned up Twistors and Spinors talks by Sir Roger Penrose. (Twistor Theory is heavily based on Complex Analysis, so it was a natural search to do after finishing a few chapters of the mathematics book.)
  • Find out the Twistor Diagram efforts of Andrew Hodges have influenced Nima Arkani-Hamed and even Ed Witten to obtain new cool results crossing over twistor theory with superstring theory and scattering amplitude calculations (the “Amplituhedron” methods).
  • That stuff is ok to dip into, but it does not really advance my pet project of exploring topological geon theory. So I look for some more light reading and rediscover papers from the Cambridge Geometric Algebra Research Group (Lasenby, Doran, Gull). And I start re-reading Gull’s paper on electron paths and tunnelling and the Dirac theory inspired by David Hestenes’s work.
  • The Gull paper mentions criticisms of the Dirac theory that I had forgotten. In the geometric algebra it is clear that solving the Dirac equation gives not positively charged anti-electrons, but unphysical negative frequency solutions with negative charge and negative mass. So they are not positrons. It is provocative that the authors claim this problem is not fully resolved by second quantisation, but rather perhaps just gets glossed over. I’m not sure what to think of this. (If the negative frequencies get banished by second quantisation, why not just conclude that first quantisation is not nature’s real process?)
  • Still, whatever the flaws in Dirac theory, the electron paths paper has tantalising similarities with the Bohm pilot wave theory electron trajectories. And there is also a reference to the Statistical Interpretation of Quantum Mechanics (SIQM) due to Ballentine (and attributed also as Einstein’s preferred interpretation of QM).
  • It gets me thinking again of how GA might be helpful in my problems with topological geons. But I shelve this thought for a bit.
  • Reading Ballentine’s paper is pretty darn interesting. It dates from 1970, but it is super clear and easy to read. I love that in a paper. The gist of it is that an absolute minimalist interpretation of quantum mechanics would drop Copenhagen ideas and view the wave function as more like a description of what could happen in nature; that is, the wave functions are descriptions of statistical ensembles of identically prepared experiments or systems in nature. (Sure, no two systems are ever prepared in the exact same initial state, but that hardly matters when you are only doing statistics rather than precise deterministic modelling.)
  • So Ballentine was suggesting the wave functions are:
    1. not a complete description of an individual particle, but rather
    2. better thought of as a description of an ensemble of identically prepared states.
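The ensemble reading above can be sketched numerically. This is my own toy illustration, not anything from Ballentine’s paper: a two-outcome system whose amplitudes I simply pick as (3/5, 4/5), so the wave function fixes only the Born-rule statistics, and the "ensemble" is many identically prepared runs.

```python
import random

# Ensemble view (toy sketch): the wave function assigns only outcome
# statistics.  A two-outcome system with amplitudes (3/5, 4/5) predicts
# probabilities |a0|^2 = 0.36 and |a1|^2 = 0.64 via the Born rule.
amps = (3 / 5, 4 / 5)
probs = [a * a for a in amps]           # Born rule: p_i = |a_i|^2

random.seed(0)
runs = 100_000                          # identically prepared systems
hits = sum(random.random() < probs[1] for _ in range(runs))
freq = hits / runs                      # relative frequency of outcome 1

print(f"predicted {probs[1]:.2f}, observed {freq:.4f} over {runs} runs")
```

No single run is "described" by the wave function here; only the long-run frequencies are, which is the whole point of the SIQM reading.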

This is where I ended up, opening my editor to draft a OneOverEpsilon post.

So here’s the thing I like about the ensemble interpretation and how the geometric algebra reworking of Dirac theory adds a glimmer of clarity about what might be happening with the deep physics of our universe. For a start, the ensemble interpretation is transparently not a complete theoretical framework: since it is a statistical theory, it does not pretend to be a theory of reality. Whatever is responsible for the statistical behaviour of quantum systems is still an open question in SIQM. The Bohm-like trajectories that the geometric algebra solutions to the Dirac theory are able to compute as streamline plots are illuminating in this respect, since they seem to clearly show that what the Dirac wave equation is modelling is almost certainly not the behaviour of a single particle. (One could guess this from Schrödinger theory as well, but I guess physicists were already lured into believing in the literal wave-particle duality meme well before Bohm was able to influence anyone’s thinking.)

Also, it is possible (I do not really know for sure) that the negative frequency solutions in Dirac theory can be viewed as merely an artifact of the statistical ensemble framework. No single particle acts truly in accordance with the Dirac wave equation. So there is no real reason to get one’s pants in a twist about the awful appearance of negative frequencies.

(For those in-the-know: the Dirac theory negative frequency solutions turn out to have particle currents in the reverse spatial direction to their momenta, so that’s not a backwards-in-time propagating anti-particle, it is a forwards-in-time propagating negative-mass particle. That’s a particle that’d fall upwards in a gravitational field if the principle of equivalence holds universally. As an aside note: it is a bit funky that this cannot be tested experimentally since no one can yet clump enough anti-matter together to test which way it accelerates in a gravitational field. But I presume the sign of particle inertial mass can be checked in the lab, and, so far, all massive particles known to science are known to have positive inertial mass.)
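A one-line heuristic for the "current opposite to momentum" remark (this is just the group-velocity argument for a relativistic plane wave, not the full Dirac-current calculation):

```latex
% Take the negative-energy branch of the relativistic dispersion relation:
E_{-}(p) = -\sqrt{p^{2} + m^{2}}
% The packet's group velocity is then antiparallel to its momentum:
v_{g} = \frac{\partial E_{-}}{\partial p} = -\frac{p}{\sqrt{p^{2} + m^{2}}}
```

So a negative-frequency packet drifts against its own momentum, which is what makes the forwards-in-time, negative-mass reading at least superficially consistent.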

And as a model of reality the Dirac equation therefore has certain limitations and flaws. It can get some of the statistics correct for particular experiments, but a statistical model always has limits of applicability. This is neither a defence nor a critique of Dirac theory. My view is that it would be a bit naïve to regard Dirac theory as the theory of electrons, and naïve to think it should have no flaws. At best such wave-function models are merely a window frame for a particular narrow view out into our universe. Maybe I am guilty of a bit of sophistry or rhetoric here, but that’s ok for a WordPress blog I think … just puttin’ some ideas “out there”.

Then another interesting confluence is that one of Penrose’s big projects in Twistor theory was to do away with the negative frequency solutions in 2-Spinor theory. And I think, from recall, he succeeded in this some time ago with the extension of twistor space to include the two off-null halves. Now I do not know how this translates into real-valued geometric algebra, but in the papers of Doran, Lasenby and Gull you can find direct translations of twistor objects into geometric algebra over real numbers. So there has to be in there somewhere a translation of Penrose’s development in eliminating the negative frequencies.

So do you feel a new research paper on Dirac theory in the wind just there? Absolutely you should! Please go and write it for me will you? I have my students and daughters’ educations to deal with and do not have the free time to research off-topic too much. So I hope someone picks up on this stuff. Anyway, this is where maybe the GA reworking of Dirac theory can borrow from twistor theory to add a little bit more insight.

There’s another possible confluence with the main unsolved problem in twistor theory. The twistor programme has been held back (stalled?) for some 40 years by what Penrose whimsically calls the “googly problem”. As far as I can tell (I find it hard to fathom twistor theory, so I am not completely sure), the issue is one of trying to find self-dual solutions of Einstein’s vacuum equations: in essence, the problem of “finding right-handed interacting massless fields (positive helicity) using the same twistor conventions that give rise to left-handed fields (negative helicity)”. Penrose maybe has a solution dubbed Palatial Twistor Theory, which you might be able to read about here: “On the geometry of palatial twistor theory” by Roger Penrose, and also in lighter reading here: “Michael Atiyah’s Imaginative Mind” by Siobhan Roberts in Quanta Magazine.

If you do not want to read those articles then the synopsis, I think, is that twistor theory has some problematic issues in gravitation theory when it comes to chirality (handedness), which is indeed a problem since obtaining a closer connection between relativity and quantum theory was a prime motive behind the development of twistor theory. So if twistor theory cannot fully handle left and right-handed solutions to Einstein’s equations it might be said to have failed to fulfil one of its main animating purposes.

So ok, to my mind there might be something the geometric algebra translation of twistor theory can bring to bear on this problem, because general relativity is solved in fairly standard fashion with geometric algebra (that’s because GA is a mathematical framework for doing real space geometry, and handles Lorentzian metrics as simply as Euclidean; no artificially imposed complex analytic structure is required). So if the issues with twistor theory are reworked in geometric algebra then some bright spark should be able to do the job twistor theory was designed to do.

By the way, the great beauty and advantage Penrose sees in twistor theory is the grounding of twistor theory in complex numbers. The Geometric Algebra Research Group have pointed out that this is largely a delusion. It turns out that complex analysis and holomorphic functions are just a sector of full spacetime algebra. Spacetime algebra, and in fact higher dimensional GA, have a concept of monogenic functions which entirely subsume the holomorphic (analytic) functions of 2D complex analysis. Complex numbers are also completely recast for the better as encodings of even sub-algebras of the full Clifford–Geometric Algebra of real space. In other words, by switching languages to geometric algebra the difficulties that arise in twistor theory should (I think) be overcome, or at least clarified.
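The "complex numbers are an even subalgebra" claim can be made concrete in the smallest case, the plane algebra Cl(2,0) with basis {1, e1, e2, e12}. This is my own toy sketch (the function name and the tuple encoding are mine, not from the Cambridge group’s software):

```python
def gp(A, B):
    """Geometric product in Cl(2,0).
    A multivector is the tuple (s, x, y, b) = s + x*e1 + y*e2 + b*e12,
    with e1*e1 = e2*e2 = 1 and e12 = e1*e2 (so e12*e12 = -1)."""
    s1, x1, y1, b1 = A
    s2, x2, y2, b2 = B
    return (s1*s2 + x1*x2 + y1*y2 - b1*b2,   # scalar part
            s1*x2 + x1*s2 - y1*b2 + b1*y2,   # e1 part
            s1*y2 + x1*b2 + y1*s2 - b1*x2,   # e2 part
            s1*b2 + x1*y2 - y1*x2 + b1*s2)   # e12 (bivector) part

# The even subalgebra {s + b*e12} closes under the product, and the
# unit bivector squares to -1, so it plays the role of i:
e12 = (0, 0, 0, 1)
print(gp(e12, e12))                      # (-1, 0, 0, 0)

# (1 + 2*e12)*(3 + 4*e12) reproduces (1+2j)*(3+4j) = -5+10j:
print(gp((1, 0, 0, 2), (3, 0, 0, 4)))    # (-5, 0, 0, 10)
```

Nothing imaginary was postulated: the "i" is just an oriented unit area element of the real plane, which is the sense in which complex analysis sits inside the full real algebra.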

If you look at the Geometric Algebra Research Group papers you will see how doing quantum mechanics or twistor theory with complex numbers is really a very obscure way to do physics. Using complex analysis and matrix algebra tends to make everything a lot harder to interpret and more obscure. This is because matrix algebra is a type of encoding of geometric algebra, but it is not a favourable encoding, it hides the clear geometric meanings in the expressions of the theory.

*      *       *

So far all I have described is a breezy re-awakening of some old ideas floating around in my head. I rarely get time these days to sit down and hack these ideas into a reasonable shape. But there are more ideas I will try to write down later that are part of a patch-work that I think is worth exploring. It is perhaps sad that over the years I have lost the nerve to work on topological geon theory. Using spacetime topology to account for most of the strange features of quantum mechanics is however still my number one long term goal in life. Whether it will meet with success is hard to discern, and perhaps that is telling: if I had more confidence I would simply abandon my current job and dive recklessly head-first into geon theory.

Before I finish up this post I want to thus outline, very breezily and incompletely, the basic idea I had for topological geon theory. It is fairly simplistic in many ways. There is however new impetus from the past couple of years’ developments in the Black Hole firewall paradox debates: the key idea from this literature has been the “ER=EPR” correspondence hypothesis, which is that quantum entanglement (EPR) might be almost entirely explained in terms of spacetime wormholes (ER: Einstein-Rosen bridges). This ignited my interest because back in 1995/96 I had the idea that Planck scale wormholes in spacetime can allow all sorts of strange and gnarly advanced causation effects on the quantum (Planckian) space and time scales. It seemed clear to me that such “acausal” dynamics could account for a lot of the weird correlations and superpositions seen in quantum physics, and fairly simply so, by using pure geometry and topology. It was also clear that if advanced causation (backwards time travel or closed timelike curves) is admitted into physics, even if only at the Planck scale, then you cannot have a complete theory of predictive physics. Yet physics would be deterministic and basically like general relativity in the 4D block universe picture, but with particle physics phenomenology accounted for in topological properties of localised regions of spacetime (topological 4-geons). The idea, roughly speaking, is that fundamental particles are non-trivial topological regions of spacetime. Geons are not 3D slices of space, but are (hypothetically) fully 4-dimensional creatures of raw spacetime topology. Particles are not apart from spacetime. Particles are not “fields that live in spacetime”, no! Particles are part of spacetime. At least that was the initial idea of Geon Theory.

Wave mechanics, or even quantum field theory, are often perceived to be mysterious because they either have to be interpreted as non-deterministic (when one deals with “wave function collapse”) or as semi-deterministic but incomplete and statistical descriptions of fundamental processes. When physicists trace back where the source of all this mystery lies they are often led to some version of non-locality. And if you take non-locality at face value it does seem rather mysterious, given that all the models of fundamental physical processes involve discrete localised particle exchanges (Feynman diagrams or their stringy counterparts). One is forced to use tricks like sums over histories to obtain numerical calculations that agree with experiments. But no one understands why such calculational tricks are needed, and it leads to a plethora of strange interpretations, like Many Worlds Theory, Pilot Waves, and so on. A lot of these mysteries I think dissolve away when the ultimate source of non-locality is found to be deep non-trivial topology in spacetime which admits closed time-like curves (advanced causation, time travel). To most physicists such ideas appear nonsensical and outrageous. With good reason of course: it is very hard to make sense of a model of the world which allows time travel, as decades of sci-fi movies testify! But geon theory does not propose unconstrained advanced causation (information from the future influencing events in the past). On the contrary, geon theory is fundamentally limited in outrageousness by the assumption that the closed time-like curves are restricted to something like the Planck scale. I should add that this is a wide open field of research. No one has worked out much at all on the limits and applicability of geon theory. For any brilliant young physicists or mathematicians this is a fantastic open playground to explore.

The only active researcher I know in this field is Mark Hadley. It seemed amazing to me that after he published his thesis (also around 1994/95, independently of my own musings) no one seemed to take up his ideas and run with them. Not even Chris Isham, who refereed Hadley’s thesis. The write-up of Hadley’s thesis in New Scientist seemed to barely cause a micro-ripple in the theoretical physics literature. I am sure sociologists of science could explain why, but to me, at the time, having already discovered the same ideas, I was perplexed.

To date no one has explicitly spelt out how all of quantum mechanics can be derived from geon theory. Although Hadley, I surmise, completed 90% of this project! The final 10% is incredibly difficult though — it would necessitate deriving something like the Standard Model of particle physics from pure 4D spacetime topology — no easy feat when you consider high dimensional string theory has not really managed the same job despite hundreds of geniuses working on it for over 35 years. My thinking has been that string theory involves a whole lot of ad hockery and “code bloat”, to borrow a term from computer science! If string theory were recast in terms of topological geons living as part of spacetime, rather than as separate to spacetime, then I suspect great advances could be made. I really hope someone will see these hints and connections and do something momentous with them. Maybe some maverick like that surfer dude Garrett Lisi might be able to weigh in and provide some fire power?

In the meantime, geometric algebra has not yet been applied to geon theory, but GA blends in with these ideas since it seems, to me, to be the natural language for geometric physics. If particle phenomenology boils down to spacetime topology, then the spacetime algebra techniques should find exciting applications. The obstacle is that so far spacetime algebra has only been developed for physics in spaces with trivial topology.

Another connection is with “combinatorial spacetime” models — the collection of ideas for “building up spacetime” from discrete combinatorial structures (spin foams, causal networks, causal triangulations, and all that stuff). My thinking is that all these methods are unnecessary, but hint at interesting directions where geometry meets particle physics, because (I suspect) such combinatorial structure approaches to quantum gravity are really only gross approximations to the spacetime picture of topological geon theory. It is from the algebra which arises from non-trivial spacetime topology and its associated homology that (I suspect) combinatorial spacetime pictures derive their usefulness.

Naturally I think the combinatorial structure approaches are not fundamental. I think topology of spacetime is what is fundamental.

*      *       *

That probably covers enough of what I wanted to get off my chest for now. There is a lot more to write, but I need time to investigate these things so that I do not get too speculative and vague and vacuously philosophical.

What haunts me most nights when I try to dream up some new ideas to explore for geon theory (and desperately try to find some puzzles I can actually tackle) is not that someone will arrive at the right ideas before me, but simply that I never will get to understand them before I die. I do not want to be first. I just want to get there myself without knowing how anyone else has got to the new revolutionary insights into spacetime physics. I had the thrill of discovering geon theory by myself, independently of Mark Hadley, but now there has been this long hiatus and I am worried no one will forge the bridges from geon theory to particle physics while I am still alive.

I have this plan for what I will do when/if I do hear such news. It is the same method my brother Greg is using with Game of Thrones. He is on a GoT television and social media blackout until the books come out. He’s a G.R.R. Martin purist, you see. But he still wants to watch the TV adaptation later on for amusement (the books are waaayyy better! So he says.) It is surprisingly easy to enforce such a blackout. Sports fans will know how. Any follower of All Black rugby who misses an AB test match knows the skill of doing a media blackout until they get to watch their recording or replay. It’s impossible to watch an AB game if you know the result ahead of time. Rugby is darned exciting, but a 15-a-side game has too many stops and starts to warrant sitting through it all when you already know the result. But when you do not know the result the build-up and tension are terrific. I think US Americans have something similar in their version of football; since American Football has even more stop/start, it would be excruciatingly boring to sit through it all if you knew the result. But strangely intense when you do not know!

So knowing the result of a sports contest ahead of time is more catastrophic than a movie or book plot spoiler. It would be like that if there is a revolution in fundamental physics involving geon theory ideas. But I know I can do a physics news blackout fairly easily now that I am not lecturing in a physics department. And I am easily enough of an extreme introvert to be able to isolate my mind from the main ideas, all I need is a sniff, and I will then be able to work it all out for myself. It’s not like any ordinary friend of mine is going to be able to explain it to me!

If geon theory turns out to have any basis in reality I think the ideas that crack it all open to the light of truth will be among the few great ideas of my generation (the post Superstring generation) that could be imagined. If there are greater ideas I would be happy to know them in time, but with the bonus of not needing a physics news blackout! If it’s a result I could never have imagined then it’d be worth just savouring the triumph of others.


CC BY-NC-SA 4.0 (https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode)

Bohm and Beability

I write this being of sound mind and judgement … etc., etc., …

At this stage of life a dude like me can enter a debate about the foundations of quantum mechanics with little trepidation. There is a chance someone will put forward proposals that are just too technically difficult to understand, but there is a higher chance of getting either something useful out of the debate or obtaining some amusement and hilarity. The trick is to be a little detached and open-minded while retaining a decent dose of scepticism.

Inescapable Non-locality

Recently I was watching a lecture by Sheldon Goldstein (a venerable statesman of physics) who was speaking about John Stewart Bell’s contributions to the foundations of quantum mechanics. Bell was, like Einstein, sceptical of the conventional interpretations that gave either too big a role for “observers” and the “measurement process” or swept such issues aside by appealing to Many Worlds or some other fanciful untestable hypotheses.

What Bell ended up showing was a theory for a class of experiments that could prove the physics of our universe is fundamentally non-local. Bell was actually after experimental verification that we cannot have local hidden variable theories, hidden variables being things in physics that we cannot observe. Bell hated the idea of unobservable physics (and Einstein would have agreed; me too, but that’s irrelevant). The famous “Bell’s Inequalities” are a set of relations referring to experimental results that will give clearly different numbers for the outcomes of experiments depending on whether our universe’s physics is inherently non-local or classical-with-hidden-variables. The hidden variables are used to model the weirdness of quantum mechanics.

Hidden variable theories attempt to use classical physics, and possibly strict locality (no signals going faster than light, and even no propagation of information faster than light), to explain fundamental physical processes. David Bohm came up with the most complete ideas for hidden variables theories, but his, and all subsequent attempts, had some very strange features that seemed to be always needed in order to explain the results of the particular types of experiments that John Bell had devised. In Bohm’s theories he uses a feature called a Pilot Wave, which is an information carrying wave that physicists can only indirectly observe via its influence on experimental outcomes. We only get to see the statistics and probabilities induced by Bohm’s pilot waves. They spread out everywhere and thus link space-like separated regions of the universe between which no signals faster than light could ever travel. This has the character of non-locality but without requiring relativity-violating information signalling faster than light, so the hope was that one could use pilot waves to get a local hidden variables theory that would agree with experiments.
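To make the pilot wave idea a bit less abstract, here is a toy sketch of Bohm’s guidance law in the simplest setting I know: a free non-relativistic 1D Gaussian packet (so this is Schrödinger-style Bohmian mechanics, not the Dirac streamlines of the Gull paper). The guidance law is v = (ħ/m)·Im(∂ψ/∂x ÷ ψ); with ħ = m = σ = 1 it reduces for this packet to v(x,t) = x·t/(τ² + t²) with τ = 2mσ²/ħ = 2, and the exact trajectory is x(t) = x₀·√(1 + (t/τ)²). The code names and constants here are mine, purely illustrative.

```python
import math

# Bohm guidance law for a free spreading Gaussian packet (hbar=m=sigma=1).
# For that packet  v(x, t) = x * t / (TAU**2 + t**2),  TAU = 2*m*sigma**2/hbar,
# and the exact Bohmian trajectory is  x(t) = x0 * sqrt(1 + (t/TAU)**2).
TAU = 2.0

def v(x, t):
    """Velocity field from v = (hbar/m) * Im( dpsi/dx / psi )."""
    return x * t / (TAU**2 + t**2)

def trajectory(x0, t_end, dt=1e-3):
    """Euler-integrate dx/dt = v(x, t) starting from x0 at t = 0."""
    x, t = x0, 0.0
    while t < t_end:
        x += v(x, t) * dt
        t += dt
    return x

x_num = trajectory(x0=1.0, t_end=4.0)
x_exact = math.sqrt(1 + (4.0 / TAU) ** 2)    # = sqrt(5)
print(x_num, x_exact)                        # agree to ~1e-3
```

The trajectories fan out with the spreading packet and never cross, which is the qualitative signature the streamline plots in the Dirac-theory papers share.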

Goldstein tells us that Bell set out to show it was impossible to have a local hidden variables theory, but he ended up showing you could not have any local theory — at all! — all theories have to have some non-locality. Or rather, what the Bell Inequalities ended up proving (via numerous repeated experiments which measured conformance to the Bell inequalities) was that the physics in our universe could never be local, whatever theory one devises to model reality it has to be non-local. So it has to have some way for information to get from one region to another faster than light.
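The arithmetic behind that conclusion fits in a few lines. In the CHSH form of a Bell inequality, any local hidden-variable model must satisfy |S| ≤ 2, while quantum mechanics predicts the spin-singlet correlation E(a, b) = -cos(a - b) and, for well-chosen angles, S = 2√2:

```python
import math

# CHSH combination:  S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Local hidden variables force |S| <= 2; the quantum singlet gives
# E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

a, a_p = 0.0, math.pi / 2               # Alice's two settings
b, b_p = math.pi / 4, 3 * math.pi / 4   # Bob's two settings

S = E(a, b) - E(a, b_p) + E(a_p, b) + E(a_p, b_p)
print(abs(S))   # 2*sqrt(2) ~ 2.828, violating the classical bound of 2
```

The experiments measure those four correlations directly, and the measured S lands near 2√2, which is why no local theory, hidden variables or otherwise, can reproduce the data.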

That is what quantum mechanics assumes, but without giving us any mechanism to explain it. A lot of physicists would just say, “It’s just the way our world is”, or they might use some exotic fanciful physics, like Many Worlds, to try to explain non-locality.

History records that Bell’s theorems were tested in numerous types of experiments, some with photons, some with electrons, some with entire atoms, and all such experiments have confirmed quantum mechanics and non-locality and have disproven hidden variables and locality. For the record, one may still believe in hidden variables, but the point is that if even your hidden variables theory has to be non-local then you lose all the motivation for believing in hidden variables. Hidden variables were designed to try to avoid non-locality. That was almost the only reason for postulating hidden variables. Why would you want to build into the foundations of a theory something unobservable? Hidden variables were a desperation in this sense, a crazy idea designed to do mainly just one thing — remove non-locality. So Bell and the experiments showed this project had failed.

[Image: John Stewart Bell at CERN, 1982]

I like this photo of Bell from CERN in 1982 because it shows him at a blackboard that has a Bell Inequality calculation for an EPR type set-up. (Courtesy of: Christine Sutton, CERN, https://home.cern/about/updates/2014/11/fifty-years-bells-theorem)

Now would you agree so far? I hope not. Hidden variables are not much crazier than any of the “standard interpretations” of quantum mechanics, of which there are a few dozen varieties, all fairly epistemologically bizarre. Most other interpretations have postulates that are considerably more radical than hidden variables postulates. Indeed, one of the favourable things about a non-local hidden variables theory is that it would give the same predictions as quantum mechanics but without a terribly bizarre epistemology. Nevertheless, HV theories have fallen out of favour because people do not like nature to have hidden things that cannot be observed. This is perhaps an historical prejudice we have inherited from the school of logical positivism, and maybe for that reason we should be more willing to give it up! But the prejudice is quite persistent.

Quantum Theory without Observers

Goldstein raises some really interesting points when he starts to talk about the role of measurement and the role of observers. He points out that physicists are mistaken when they appeal to observers and some mysterious “measurement process” in their attempts to rectify the interpretations of quantum mechanics. It’s a great point that I have not heard mentioned very often before. According to Goldstein, a good theory of physics should not mention macroscopic entities like observers or measurement apparatus, because such things should be entirely dependent upon—and explained by—fundamental elementary processes.

This demand seems highly agreeable to me. It is a nice general Copernican principle to remove ourselves from the physics needed to explain our universe. And it is only a slightly stronger step to also remove the very vague and imprecise notion of “measurement”.

The trouble is that in basic quantum mechanics one deals with wave functions or quantum fields (more generally) that fundamentally cannot account for the appearance of our world of experience. The reason is that these tools only give us probabilities for all the various ways things can happen over time; we get probabilities and nothing else from quantum theory. What actually happens in time is not accounted for by just giving the probabilities. This is often called the “Measurement Problem” of quantum mechanics. It is not truly a problem. It is a fundamental incompleteness. The problem is that standard quantum theory has absolutely no mechanism for explaining the appearance of the classical reality that we observe.

So this helps explain why a lot of quantum interpretation philosophy injects the notions of “observer” and “measurement” into the foundations of physics. It seems to be necessary for providing an account of the real semi-classical appearance of our world. We are not all held in ghostly superpositions because we all observe and “measure” each other, constantly. Or maybe our body cells are enough, they are “observing each other” for us? Or maybe a large molecule has “observational power” and is sufficient? Goldstein, correctly IMHO, argues this is all bad philosophy. Our scientific effort should be spent on trying to complete quantum theory, or find a better, more complete theory or framework for fundamental physics.

Here’s Goldstein encapsulating this:

It’s not that you don’t want observers in physics. Observers are in the real world and physics better account for the fact that there are observers. But observers, and measurement, and vague notions like that, and, not just vague, even macroscopic notions, they just seem not to belong in the very formulation of what could be regarded as a fundamental physical theory.

There should be no axioms about “measurement”. Here is one passage that John Bell wrote about this:

The concept of measurement becomes so fuzzy on reflection that it is quite surprising to have it appearing in physical theory at the most fundamental level. … Does not any analysis of measurement require concepts more fundamental than measurement? And should not the fundamental theory be about these more fundamental concepts?

Rise of the Wormholes

I need to explain one more set of ideas before making the note for this post.

There is so much to write about ER=EPR, and I’ve written a few posts about it so far, but not enough. The gist of it, recall, is that the fuss in recent decades over the “Black Hole Information Paradox” and the “Black Hole Firewall” has been incredibly useful in leading a group of theoreticians towards a dim, inchoate understanding that the non-locality in quantum mechanics is somehow related to wormhole bridges in spacetime. Juan Maldacena and Leonard Susskind have pioneered this approach to understanding quantum information.

A lot of the weirdness of quantum mechanics turns out to be just geometry and topology of spacetime.

The “EPR” stands for the Einstein–Podolsky–Rosen–Bohm thought experiments, precisely the genesis of the ideas for which John Bell devised his Bell inequalities for testing quantum theory, and which show that physics involves fundamentally non-local correlations.
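For readers who have not seen it, a quick numerical sketch of the CHSH form of Bell’s inequality may help. The angle choices below are the standard textbook ones, and E(a, b) = -cos(a - b) is the quantum prediction for singlet-state spin correlations:

```python
import math

def E(a, b):
    # Quantum prediction for the spin correlation of a singlet pair
    # measured at analyser angles a and b (radians).
    return -math.cos(a - b)

# Standard CHSH angle choices.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# Any local hidden-variable theory must satisfy |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # quantum mechanics gives 2*sqrt(2), about 2.828
```

The quantum value exceeds the local bound of 2, which is the arithmetic heart of the claim that the correlations cannot be explained by any locally propagated influence.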

The “ER” stands for the Einstein–Rosen wormhole bridges. Wormholes are a science fiction device for time travel or fast interstellar travel. The idea is that you might imagine creating a spacetime wormhole by pinching off a thread of spacetime, like the beginnings of a black hole, but then reconnecting the pinched end somewhere else in space, perhaps a vast time or distance away, and keeping the pinched end open at this reconnection region. So the wormhole bridge becomes a short-cut in space or time between two perhaps vastly separated regions of spacetime.

It seems that if you have an extremal version of a wormhole that is essentially shrunk down to zero radius, so it cannot be traversed by any mass, then this minimalistic wormhole still acts as a conduit of information. These provide the non-local connections between spacelike separated points in spacetime. Basically the ends of the ER=EPR wormholes are like particles, and they are connected by a wormhole that cannot be traversed by any actual particle.

Entanglement and You

So now we come to the little note I wanted to make.

I agree with Goldstein that we ought not artificially inject the concept of an observer or a “measurement process” into the heart of quantum mechanics. We should avoid such desperate moves, and instead seek to expand our theory to encompass better explanations of the classical appearances in our world.

The interesting thing is that when we imagine how ER=EPR wormholes could influence our universe, by connecting past and future, we might end up with something much more profound than “observers” and “measurements”. We might end up with an understanding of how human consciousness and our psychological sense of the flow of time emerges from fundamental physics. All without needing to inject such transcendent notions into the physics. Leave the physics alone, let it be pristine, but get it correct and then maybe amazing things can emerge.

I do not have such a theory worked out. But I can give you the main idea. After all, I would like someone to be working on this, and I do not have the time or technical ability yet, so I do not want the world of science to wait for me to get my act together.

First: it would not surprise me if, in future, a heck of a lot of quantum theory “weirdness” were explained by ER=EPR-like principles. If you abstract a little and step back from any particular instance of “quantum weirdness” (like wave–particle duality or superposition or entanglement in any particular experiment), then what we really see is that most of the weirdness is due to non-locality. This might take various guises, but if there is one mechanism for non-locality then it is a good bet that something like this mechanism is at work behind most instances of non-locality that arise in quantum mechanics.

Second: the main way in which ER=EPR wormholes account for non-local effects is via pure information connecting regions of spacetime through the extremal wormholes. And what is interesting is that this makes a primitive form of time travel possible. Only information can “time travel” via these wormholes, but that might be enough to explain a lot of quantum mechanics.

Third: although it is unlikely that time travel effects can ever propagate up to macroscopic physics, because we just cannot engineer large enough wormholes, the statistical effects of the minimalistic ER=EPR wormholes might account for enough correlation between past and future that we could eventually prove, in principle, that information reaches us from our future, at least at the level of fundamental quantum processes.

Now here’s the more speculative part: I think what might emerge from such considerations is a renewed description of the old Block Universe concept from Einstein’s general relativity (GR). Recall that in GR time is placed on more or less an equal theoretical footing with space. This means past and future are all connected and exist whether we know it or not. Our future is “out there in time” and we just have not yet travelled into it. And we cannot travel back to our past because the bridges are not possible; the only wormhole bridges connecting past to future over macroscopic times are those minimal extremal ER=EPR wormholes that provide the universe with quantum entanglement phenomena and non-locality.

So I do not know what the consequences of such developments will be. But I can imagine some possibilities. One is that although we cannot access our future, or travel back to our past, the information from such regions of the Block Universe is tenuously connected to us nonetheless. Such connections are virtually impossible for us to exploit usefully, because we could never confirm what we are dealing with until the macroscopic future “arrives”, so to speak. So although we know it is not complete, we will still have to use quantum mechanics probability amplitude mathematics to make predictions about physics. In other words, quantum mechanics models our situation with respect to the world, not the actual state of the world from an atemporal Block Universe perspective. It’s the same problem with the time travel experiment conducted in 1994 in the laboratory of Günter Nimtz, whose lab sent analogue signals encoding Mozart’s 40th Symphony into the future (by a few nanoseconds).

For that experiment there are standard explanations using Maxwell’s theory of electromagnetism showing that no particles travel faster than light into the future. Nevertheless, Nimtz’s laboratory got a macroscopic recording of bits of Mozart’s 40th Symphony out of the back-end of a tunnelling apparatus before it was sent into the front-end. The interesting thing to me is not about violation of special relativity or causality. (You might think the physicists could violate causality: one of them could wait at the back-end and, on hearing Mozart come out, tell their colleague to send Beethoven instead, creating a paradox. But they could not do this, because they could not send that warning fast enough in real time to have their colleague send Beethoven’s Fifth instead of Mozart.) Sadly, that aspect of the experiment was the most controversial, but it was not the most interesting thing. Many commentators argued about the claimed violations of SR, and there are some good arguments about photon “group velocity” being able to transmit a signal faster than light without any individual photon needing to go faster than light.

(Actually many of Nimtz’s experiments used electron tunnelling, not photon tunnelling, but the general principles are the same.)

All the “wave packet” and “group velocity” explanations of Nimtz’s time travel experiments are, if you ask me, merely attempts to reconcile the observations with special relativity. They all, however, use collective phenomena: either waves or group packets. But we all know photons are not waves, they are particles (many still debate this, but bear with my argument). The wave behaviour of fundamental particles is in fact a manifestation of quantum mechanics. Maxwell’s theory is thus only phenomenological. It describes electromagnetic waves, and photons get interpreted (unfortunately) as modes of such waves. But this is mistaken. Photons collectively can behave as Maxwell’s waves, but Maxwell’s theory is describing a fictional reality; it only approximates what photons actually do. Photons do not, in Maxwell’s theory, impinge on detectors like discrete quanta. And yet we all know this is what light actually does! It violates Maxwell’s theory every day!
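One way to put a number on that discreteness: take a continuous classical Maxwell power and divide by the energy per quantum, E = hc/λ, and you get a countable arrival rate of photons. A small sketch (the source power and wavelength are illustrative values I chose):

```python
# Planck constant and speed of light (SI units).
H = 6.62607015e-34
C = 2.99792458e8

def photon_rate(power_watts, wavelength_m):
    # A continuous classical power, divided by the energy per quantum
    # E = h*c/lambda, gives a countable arrival rate of photons.
    return power_watts / (H * C / wavelength_m)

# Illustrative: a very dim 1 femtowatt source at 550 nm delivers only
# a few thousand quanta per second, arriving one by one at a detector.
print(photon_rate(1e-15, 550e-9))
```

At such low powers a detector registers individual clicks, which is precisely the granularity that the continuous-wave description papers over.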

So what I think is truly interesting about Nimtz’s experiments is that they were sensitive enough to give us a window into wormhole traversal. Quantum tunnelling is nothing more than information traversal through ER=EPR type wormholes. At least that’s my hypothesis. It is a non-classical effect, and Maxwell’s theory only accounts for it via the fiction that photons are waves. A wrong explanation can often fully explain the facts, of course!
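Whatever the correct underlying picture turns out to be, the standard quantum result being reinterpreted here is easy to state: for a particle of energy E below a rectangular barrier of height V0 and width L, the transmission probability falls off roughly exponentially in the barrier width. A sketch of the textbook opaque-barrier approximation (the electron energy and barrier size are illustrative values I picked; nothing here models the wormhole hypothesis itself):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electronvolt

def transmission(E_eV, V0_eV, width_m):
    """Textbook opaque-barrier approximation for tunnelling through a
    rectangular barrier: T ~ 16 (E/V0)(1 - E/V0) exp(-2 kappa L),
    with kappa = sqrt(2 m (V0 - E)) / hbar."""
    E, V0 = E_eV * EV, V0_eV * EV
    kappa = math.sqrt(2 * M_E * (V0 - E)) / HBAR
    return 16 * (E / V0) * (1 - E / V0) * math.exp(-2 * kappa * width_m)

# Illustrative: a 1 eV electron meeting a 2 eV barrier 1 nm wide.
print(transmission(1.0, 2.0, 1e-9))
```

The exponential suppression with width is the signature that classical particle mechanics forbids the crossing outright, which is why tunnelling is such a tempting candidate for a non-classical, information-only conduit.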

Letting Things Be

What Goldstein, and Bohm, and later John Stewart Bell wanted to do is explain the world. They knew quantum field theory does not explain the world. It does not tell us why things come to be what they are: why a measurement pointer ends up pointing in a particular direction rather than in any of the other superposed states of pointer orientation that the quantum theory tells us it ought to be in. Such definite outcomes are what John Bell referred to as “local beables”. Goldstein explains more in his seminar “John Bell and the Foundations of Quantum Mechanics”, Sesto, Italy, 2014 (https://www.youtube.com/watch?v=RGbpvKahbSY).

My favourite idea, one I have been entertaining for over twenty years, in fact ever since 1995 when I read Kip Thorne’s book about classical general relativity and wormholes, is that wormholes (or technically “closed timelike curves”) are where all the ingredients are for explaining quantum mechanics from a classical point of view. Standard twentieth century quantum theory does not admit wormholes. But if you ignore quantum theory and start again from classical dynamics, while allowing ER=EPR wormholes to exist, then I think most of quantum mechanics can be recovered without the need for unexplained axiomatic superpositions and wave-function collapse (the conventional explanation for “measurements” and classical appearances). In other words, quantum theory, like Maxwell’s EM theory, is only a convenient fictional model of our physics. You see, when you naturally have information going backwards and forwards in time you cannot avoid superpositions of state. But when a stable time-slice emerges or “crystallizes” out of this mess of acausal dynamics, then it should look like a measurement has occurred. Yet no such miracle happens; it simply emerges naturally from the atemporal dynamics. (I use the term “crystallize” advisedly here: it is not a literal crystallization, but something abstractly similar. George Ellis uses it in a slightly different take on the Block Universe concept, so I figure it is a fair term to use.)

Also, is it possible that atemporal dynamics will tend to statistically “crystallize” something like Bohm’s pilot wave guide potential? If you know a little about Bohmian mechanics you know the pilot wave is postulated as a real potential, something that just exists in our universe’s physics. Yet it has no analogue elsewhere: it is not a quantum field, it is not a classical field, it is what it is. But what if there is no need for such a postulate? How could it be avoided? My idea is that maybe the combined statistical effects of influences propagating forward and backward in time give rise to an effective potential much like the Bohm pilot wave or Schrödinger wave function. Either way, both constructs, in conventional or Bohmian quantum mechanics, might be just necessary fictions we need to describe, in one way or another, the proper complete Block Universe atemporal spacetime dynamics induced by the existence of spacetime wormholes. I could throw around other ideas, but the main one is that wormholes endow spacetime with a really gnarly, stringy sort of topology that has, so far, not been explored enough by physicists.

Classically you get non-locality when you allow wormholes. That’s the quickest summary I can give you. So I will end here.

*      *       *


CCL_BY-NC-SA(https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode)

Giving Your Equations a Nice Bath & Scrub

There’s a good book for beginning computer programmers I recently came across. All young kids wanting to write code professionally should check out Robert Martin’s book, “Clean Code: A Handbook of Agile Software Craftsmanship”. (Ideally get your kids to read this before the more advanced “Design Patterns” books.)

But is there such a guide for writing clean mathematics?

I could ask around on Mathforums or Quora, but instead I will suggest here some of my own tips for such a guide volume. What gave me the spark to write a wee blog about this was a couple of awesome “finds”. The first was Professor Tadashi Tokieda’s Numberphile clips and his AIMS lectures on topology and geometry (all available on YouTube). Tokieda plugs a couple of “good reads”, and this was the second treasure: V.I. Arnold’s lectures on Abel’s Theorem, typed up by his student V.B. Alekseev as “Abel’s Theorem in Problems and Solutions”, available in abridged format (minus solutions) in a translation by Julian Gilbey: “Abel’s Theorem Through Problems”.

Tadashi lecturing in South Africa. Clearer than Feynman?

Tokieda’s lectures and Arnold’s exposition style are perfect examples of “clean mathematics”.  What do I mean by this?

Firstly, what I absolutely do not mean is Bourbaki-style rigour and logical precision.  That’s not clean mathematics.  The more precision and rigour you demand, the denser and less comprehensible it all becomes, to the point where it is unreadable and hence useless.

I mean mathematics that is challenging for the mind (so interesting) and yet clear, understandable, and visualizable.  That last aspect is crucial.  If I cannot visualise an abstract idea then it has not been explained well and I have not understood it deeply.  We can easily visualize only 2D examples, or 3D if we struggle.  So how are higher dimensional ideas visualised?  Tokieda shows there is no need.  You can use the algebra perfectly well for higher dimensional cases, but always give the idea in 2D or 3D.

It’s amazing that 3D seems sufficient for most expositions.  With a low-dimensional example most of the essence of the general N-dimensional case can be explained in pictures.  Perhaps this is due to 3D being the most awkward dimension?  It’s just a pity we do not have native 4D vision centres in our brain (we actually do, it’s called memory, but it sadly does not lead to full 4D optical feature recognition).

Dr Tokieda shows how good pictures can be good proofs.  The mass of confusing algebra a good picture can replace is startling (if you are used to heavy symbolic algebra).  I would also add that Sir Roger Penrose and John Baez are two experts who make a lot of use of pictorial algebra, and that sort of stuff is every bit as rigorous as symbolic algebra; I would argue even more so.  How’s that?  Pictorial algebra is less prone to mistake and misinterpretation, precisely because our brains are wired to receive information visually, without the language symbol filters.  Thus whenever you choose instead to write proofs using formal symbolics you are reducing your writing to less rigour, because it is easier to make mistakes and to have your proof misread.

So now, in homage to Robert Martin’s programming style guide, here are some analogous sample chapter or section headings for a hypothetical book on writing clean mathematics.

Keep formal (numbered) definitions to a minimum

Whenever you need a formal definition you have failed the simplicity test.  A definition means you have not found a natural way to express or name a concept.  That’s really all definitions are: they set up names for concepts.

Occasionally advanced mathematics requires defining non-intuitive concepts, and these will require a formal approach, precisely because they are non-intuitive.  But otherwise, name objects and relations clearly and put the keywords in bold; then you can avoid cluttering up chapters with formal, boring-looking definition breaks.  The definitions should, if at all possible, flow naturally and be embedded in natural language paragraphs.

Do not write symbolic algebra when a picture will suffice

Most mathematicians have major hang-ups about providing misleading visual illustrations.  So my advice is do not make them misleading!  But you should use picture proofs anyway, whenever possible, just make sure they capture the essence and are generalisable to higher dimensions.  It is amazing how often this is possible.  If you doubt me, then just watch Tadashi Tokieda’s lectures linked to above.

Pro mathematicians will often think pictures are weak.  But the reality is the opposite.  Pictures are powerful.  Pictures should not sacrifice rigour.  It is the strong mathematician who can make their ideas so clear and pristine that a minimalistic picture will suffice to explain an idea of great abstract generality.  Mathematicians need to follow the physicists’ credo of inference: one specific, well-chosen example can suffice as an exemplar case covering infinitely many general cases.  The hard thing is choosing a good example.  It is an art.  A lot of mathematical writers seem to fail at this art, or not even try.

You do not have to use pictures in your research if you do not get much from them, but in your expositions, in your writing for the public, failing to use pictures is a disservice to your readers.

The problem with popular mathematics books is not the density of equations, it is the lack of pictures.  If for every equation you had a couple of nice illustrative pictures, then there would be no such thing as “too many equations”, even for a lay readership.  The same rule should apply to academic mathematics writing, with perhaps a reasonable allowance for a slightly higher symbol-to-picture ratio, because academically you might need to fill in a few gaps for rigour.

Rigour does not imply completeness

Mathematics should be rigorous, but not tediously so.  When gaps do not reduce clarity then you can avoid excessive equations.  Just write what the reader needs, do not fill in every gap for them.  And whenever a gap can be filled with a picture, use the picture rather than more lines of symbolic algebra.  So you do not need ruthless completeness.  Just provide enough for rigour to be inferred.

Novel writers know this.  If they set out to describe scenes completely they would never get past chapter one, probably not even past paragraph one.  And giving the reader too much information destroys the operation of their inner imagination and leads to the reader disconnecting from the story.

For every theorem provide many examples

The Definition-to-Theorem ratio should be low: for every couple of definitions there should be a bundle of nice theorems, otherwise the information content of your definitions has been poor.  More definitions than theorems means you’ve spent more of your words naming stuff than using stuff.  Likewise the Theorem-to-Example ratio should be low.  More theorems than examples means you’ve cheated the student by showing them a lot of abstract ideas with no practical use.  So show them plenty of practical uses so they do not feel cheated.

Write lucidly and for entertainment

This is related to the next heading, which is to write with a story narrative.  On a finer level, every sentence should be clear, use plain language, and carry minimal jargon.  Mathematics text should be every bit as descriptive and captivating as a great novel.  If you fail to write like a good journalist or novelist then you have failed to write clean mathematics.  Good mathematics should entertain the aficionado.  It does not have to be set like a literal murder mystery, with so many pop culture references and allusions that you lose all the technical content.  But for a mathematically literate reader you should be giving some sense of build-up in tension and then resolution.  Dangle some food in front of them and lead them to water.  People who pick up a mathematics book are not looking for sex, crime and drama, nor even for comedy, but you should give them elements of such things inside the mathematics.  Teasers like why we are doing this, what it will be used for, how it relates to physics or other sciences: these are your sex and crime and drama.  And for humour you can use mathematical characters, stories of real mathematicians.  It might not be laugh-out-loud funny, but there is always a way to amuse an interested reader, so find those ways.

Write with a Vision

I think a lot of mathematical texts are dry and suffer because they are presented “too close to the research”.  What a good mathematical writer should aim for is the essence of any kind of writing, which is to narrate a story.  Psychology tells us this is how average human beings best receive and remember information.  So in mathematics you need a grand vision of where you are going.  If instead you just want to write about your research, then do the rest of us a favour and keep it off the bookshelves!

If you want to tell a story about your research then tell the full story, some history, some drama in how you stumbled, but then found a way through the forest of abstractions, and how you triumphed in the end.

The problem with a lot of mathematics monographs is that they aim for comprehensive coverage of a topic.  But that’s a bad style guide.  Instead they should aim to provide tools to solve a class of problems.  And the narrative is how to get from scratch up to the tools needed to solve the basic problem and then a little more.  With lots of dangling temptations along the way.  The motivation then is the main problem to be solved, which is talked about up front, as a carrot, not left as an obscure mystery one must read the entire book through to find.  Murder mysteries start with the murder first, not last.

*      *      *

That’s enough for now. I should add to this list of guides later. I should follow my own advice too.

*      *      *

Licence:


CCL_BY-NC-SA(https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode)