Thursday, July 5, 2018

Going nowhere fast

Xiaolin Zeng for Quanta Magazine
Via aeon.co by Ben Allanach

In recent years, physicists have been watching the data coming in from the Large Hadron Collider (LHC) with a growing sense of unease. We’ve spent decades devising elaborate accounts for the behaviour of the quantum zoo of subatomic particles, the most basic building blocks of the known universe. The Standard Model is the high-water mark of our achievements to date, with some of its theoretical predictions verified to within a one-in-ten-billion chance of error – a simply astounding degree of accuracy. But it leaves many questions unanswered. For one, where does gravity come from? Why do matter particles always possess three ever-heavier copies, with peculiar patterns in their masses? What is dark matter, and why does the universe contain more matter than antimatter?

In the hope of solving some of these mysteries, physicists have been grafting elegant and exciting new mathematical structures onto the Standard Model. The programme follows an arc traced by fundamental physics since the time of Isaac Newton: the pursuit of unification, in which science strives to explain seemingly disparate ‘surface’ phenomena by identifying, theorising and ultimately proving their shared ‘bedrock’ origin. This top-down, reductive style of thinking has yielded many notable discoveries. Newton perceived that both an apple falling to the ground and the planets orbiting the sun could be explained by gravity. The physicist Paul Dirac came up with antimatter in 1928 by marrying quantum mechanics with Einstein’s special theory of relativity. And since the late 20th century, string theorists have been trying to reconcile gravity and quantum physics by conceiving of particles as tiny vibrating loops of string that exist in somewhere between 10 and 26 dimensions.


So when the European Organization for Nuclear Research (CERN) cranked up the LHC just outside Geneva for a second time in 2015, hopes for empirical validation were running high. The fruits of physicists’ most adventurous top-down thinking would finally be put to the test. In its first three-year run, the LHC had already notched up one astounding success: in 2012, CERN announced that the Higgs boson had been found, produced in high-energy, head-on collisions between protons. The new particle existed for just a fleeting fraction of a second before decaying into a pair of tell-tale photons at specific, signature energies. What set the scientific world alight was not the excitement of a new particle per se, but the fact it was a smoking gun for a theory about how matter gets its mass. Until the British physicist Peter Higgs and others came up with their hypothetical boson in 1964, the emerging mathematical model had predicted – against the evidence – that particles should have no mass at all. Eventually, half a century after the ‘fix’ was first proposed, the boson officially entered the subatomic bestiary, the last bit of the Standard Model to be experimentally verified.

This time, though, none of the more exotic particles and interactions that theorists hoped to see has been forthcoming. No ‘stop squarks’, no ‘gluinos’, no ‘neutralinos’. The null results are now encrusting the hull of the Standard Model, like barnacles on a beautiful old frigate, and dragging her down to the ocean floor. It looks like the centuries-long quest for top-down unification has stalled, and particle physics might have a full-blown crisis on its hands.

Behind the question of mass, an even bigger and uglier problem was lurking in the Standard Model: why is the Higgs boson so light? In experiments it weighed in at 125 times the mass of a proton. But calculations using the theory implied that it should be much bigger – roughly ten million billion times bigger, in fact.

This super-massive Higgs boson is meant to be the result of quantum fluctuations: ultra-heavy particle-antiparticle pairs, produced for a fleeting instant and then annihilated. Quantum fluctuations of ultra-heavy particle pairs should have a profound effect on the Higgs boson, whose mass is very sensitive to them. The other particles in the Standard Model are shielded from such quantum effects by certain mathematical symmetries – properties that don’t change under a transformation, like a square turned through 90 degrees – but the Higgs boson is the odd one out, and feels the influence very keenly.

Except that it doesn’t, because the mass of the Higgs appears to be so small. One logical option is that nature has chosen the initial value of the Higgs boson mass to precisely offset these quantum fluctuations, to an accuracy of one part in 10¹⁶. However, that possibility seems remote at best, because the initial value and the quantum fluctuation have nothing to do with each other. It would be akin to dropping a sharp pencil onto a table and having it land exactly upright, balanced on its point. In physics terms, the configuration of the pencil is unnatural or fine-tuned. Just as the movement of air or tiny vibrations should make the pencil fall over, the mass of the Higgs shouldn’t be so perfectly calibrated that it can cancel out the quantum fluctuations.
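
To get a feel for the coincidence involved, here is a back-of-envelope restatement of the numbers quoted above, in Python. It is purely illustrative: the masses are in units of the proton mass, taken straight from the text, and no actual quantum field theory is being done.

```python
# Back-of-envelope restatement of the fine-tuning described above (illustrative
# only, not a real calculation). Masses are in units of the proton mass.

measured_higgs = 125.0          # roughly what the LHC measures
naive_higgs = 125.0 * 1e16      # "ten million billion times bigger"

# If the initial value and the quantum correction are unrelated, they must
# cancel to this fractional accuracy to leave the measured value behind:
tuning = measured_higgs / naive_higgs
print(f"cancellation needed: about one part in {1 / tuning:.0e}")   # ~1e+16
```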

However, instead of an uncanny correspondence, maybe the naturalness problem with the Higgs boson could be explained away by a new, more foundational theory: supersymmetry. To grasp supersymmetry, we need to look a bit more closely at particles. Particles behave a bit like tiny spinning tops, although the amount of their spin is restricted. For example, all electrons in the universe have the same amount of spin; all photons have double this amount, and all Higgs bosons have no spin at all. The fundamental unit of spin is the spin of the electron. Other particles may only have spins equal to some whole number multiplied by the electron’s spin.

Supersymmetry is an idea that connects particles of different spins: it says they are different aspects of the same underlying object. Importantly, the large quantum fluctuations of particle-antiparticle pairs affect the Higgs boson in opposite directions: they make it lighter if the spin of the fluctuating particles is an odd-number multiple of the electron’s spin, and heavier if it is an even-number multiple. What this means is that supersymmetry can balance the quantum effects on the mass of the Higgs boson like a see-saw. On one side sit all of the odd-number spin particles, exactly balanced against the even-number spin particles on the other side. The overall effect is that the see-saw doesn’t move, and the Higgs boson experiences no huge quantum influences on its mass.

A major consequence of supersymmetry is that every particle we know about should have a copy (a ‘superpartner’) with exactly the same properties – except for two things. One, its spin should differ by one unit. And two, the superpartner should be heavier. The masses of the superpartners are not fixed, but the heavier one makes them, the less exact the cancellation between each particle and its superpartner, and the more the mass of the Higgs boson itself has to be fine-tuned. Superpartners with masses of around 1,000 times that of a proton still do the job reasonably well. But increase the mass by a factor of 10 and the theory goes back to looking quite unnatural.
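
The see-saw picture, and the way it degrades as superpartners get heavier, can be caricatured in a few lines of Python. This is a schematic toy under my own simplifying assumptions, not a real loop calculation: it only encodes the fact that the two sides of the see-saw enter with opposite signs, cancel exactly when the masses match, and leave a growing residue as the superpartner masses are pushed up.

```python
# Schematic toy of the supersymmetric see-saw (arbitrary units, hypothetical
# coupling): one side of the see-saw pushes the Higgs mass-squared up, the
# other pulls it down, and the net shift grows with the mass difference.

def higgs_mass_shift(boson_mass, fermion_mass, coupling=1.0):
    """Net quantum shift to the Higgs mass-squared from one particle/superpartner pair."""
    return coupling * (boson_mass**2 - fermion_mass**2)

print(higgs_mass_shift(1000.0, 1000.0))    # 0.0: equal masses, exact cancellation
print(higgs_mass_shift(10000.0, 1000.0))   # large residue: heavy superpartner, tuning returns
```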

By smashing protons together, the LHC should be able to produce these superpartners, provided they weigh around 1,000 times the mass of a proton. To do this, you convert the energy of the proton beams into the mass of the predicted superpartners, via Einstein’s equation of special relativity: E = mc² (energy equals mass multiplied by the speed of light squared). Each collision is a quantum process, however, which means it’s inherently random and you can’t predict exactly what will happen. But using the correct theory, you can calculate the relative probabilities of various outcomes. By measuring billions upon billions of collisions, you can then check the theory’s predictions against the relative frequencies of particles that are created.
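
As a rough sense of scale, the energy-to-mass bookkeeping can be sketched as follows. The numbers are illustrative (Run 2’s 13 TeV collision energy, the proton’s rest mass), and the answer is only an upper bound: in practice each collision involves just a fraction of the protons’ energy, so the real reach is considerably lower.

```python
# Crude upper bound on how heavy a pair of new particles could be, given the
# collision energy, using E = mc^2 in natural units. Illustrative only.

PROTON_MASS_GEV = 0.938            # proton rest mass, in GeV
COLLISION_ENERGY_GEV = 13000.0     # LHC Run 2 proton-proton collision energy

max_mass_per_particle = COLLISION_ENERGY_GEV / 2   # pair production splits the energy
print(f"at most ~{max_mass_per_particle / PROTON_MASS_GEV:.0f} proton masses each")
```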

As you can already tell, finding out what happens at the point of the protons colliding involves a lot of detective work. In this case, you try to check how often supersymmetric particles are produced by watching them decay into more ordinary particles. The positions of these byproducts are measured by huge detectors, machines placed around crossing points in the counter-rotating beams of the LHC that act like enormous three-dimensional cameras.

The signature of supersymmetric particles was meant to be the production of a heavy invisible particle, which could sneak through the detector like a thief, leaving no trace. These very weakly interacting particles are candidates for the origin of dark matter in the universe: the strange, invisible stuff that we know from cosmological measurements should be about five times more prevalent than ordinary matter. The red flag for their presence was meant to be the theft of momentum from a collision, meaning that the momentum before and after the collision doesn’t balance.
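
The momentum-balance test can be illustrated with a short sketch. This is a deliberately simplified version of what the real analyses do (they work with calibrated transverse momenta from the full detector and many more corrections); the function and the toy event below are my own inventions.

```python
# Hedged sketch of a missing-momentum check: if the visible particles' momenta
# do not sum to zero in the plane transverse to the beams, something invisible
# has carried the difference away.

import math

def missing_transverse_momentum(visible_particles):
    """visible_particles: list of (px, py) components in GeV.
    Returns the magnitude of the momentum imbalance."""
    total_px = sum(px for px, _ in visible_particles)
    total_py = sum(py for _, py in visible_particles)
    return math.hypot(total_px, total_py)

# Toy event: two visible sprays of particles that do not quite balance.
event = [(350.0, 20.0), (-150.0, -10.0)]
print(f"missing transverse momentum ~ {missing_transverse_momentum(event):.0f} GeV")
```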

My colleagues and I watched the LHC closely for such tell-tale signs of superpartners. None have been found. We started to ask whether we might have missed them somehow. Perhaps some of the particles being produced were too low in energy for the collisions to be observed. Or perhaps we were wrong about dark matter particles – maybe there was some other, unstable type of particle.

In the end, these ideas weren’t really a ‘get-out-of-jail-free’ card: using various experimental analysis techniques, we hunted them down and falsified them too. Another possibility was that the superpartners were a bit heavier than expected; so perhaps the mass of the Higgs boson did have some cancellation in it (one part in a few hundred, say). But as the data rolled in and the beam energy of the LHC was ramped up, supersymmetry became more and more squeezed as a solution to the Higgs boson naturalness problem.

The trouble is that it’s not clear when to give up on supersymmetry. True, as more data arrives from the LHC with no sign of superpartners, the heavier they would have to be if they existed, and the less they solve the problem. But there’s no obvious point at which one says ‘ah well, that’s it – now supersymmetry is dead’. Everyone has their own biased point in time at which they stop believing, at least enough to stop working on it. The LHC is still going and there’s still plenty of effort going into the search for superpartners, but many of my colleagues have moved on to new research topics. For the first 20 years of my scientific career, I cut my teeth on figuring out ways to detect the presence of superpartners in LHC data. Now I’ve all but dropped it as a research topic.

It could be that we got the wrong end of the stick with how we frame the puzzle of the Higgs boson. Perhaps we’re missing something from the mathematical framework with which we calculate its mass. Researchers have worked along these lines and so far come up with nothing, but that doesn’t mean there’s no solution. Another suspicion relates to the fact that the hypothesis of heavy particles relies on arguments based on a quantum theory of gravity – and such a theory has not yet been verified, although there are mathematically consistent constructions.

Perhaps the bleakest sign of a flaw in present approaches to particle physics is that the naturalness problem isn’t confined to the Higgs boson. Calculations tell us that the energy of empty space (inferred from cosmological measurements to be tiny) should be huge. A vacuum energy that large would make the outer reaches of the universe accelerate away from us at a catastrophic rate, when in fact observations of certain distant supernovae reveal only a gentle acceleration. Supersymmetry doesn’t fix this conflict. Many of us began to suspect that whatever solved this more difficult issue with the universe’s vacuum energy would also solve the other, milder one concerning the mass of the Higgs.

All these challenges arise because of physics’ adherence to reductive unification. Admittedly, the method has a distinguished pedigree. During my PhD and early career in the 1990s, it was all the rage among theorists, and the fiendishly complex mathematics of string theory was its apogee. But none of our top-down efforts seem to be yielding fruit. One of the difficulties of trying to get at underlying principles is that it requires us to make a lot of theoretical presuppositions, any one of which could end up being wrong. We were hoping by this stage to have measured the mass of some superpartners, which would have given us some data on which to pin our assumptions. But we haven’t found anything to measure.

Instead, many of us have switched from the old top-down style of working to a more humble, bottom-up approach. Instead of trying to drill down to the bedrock by coming up with a grand theory and testing it, now we’re just looking for any hints in the experimental data, and working bit by bit from there. If some measurement disagrees with the Standard Model’s predictions, we add an interacting particle with the right properties to explain it. Then we look at whether it’s consistent with all the other data. Finally, we ask how the particle and its interactions can be observed in the future, and how experiments should sieve the data in order to be able to test it.

The bottom-up method is much less ambitious than the top-down kind, but it has two advantages: it makes fewer assumptions about theory, and it’s tightly tethered to data. This doesn’t mean we need to give up on the old unification paradigm; it just suggests that we shouldn’t be so arrogant as to think we can unify physics right now, in a single step. It means incrementalism is to be preferred to absolutism – and that we should use empirical data to check and steer us at each step, rather than making grand claims that come crashing down when they’re finally confronted with experiment.

A test case for the bottom-up methodology is the bottom meson, a composite particle made of a fundamental particle called a bottom quark bound to a lighter quark. Bottom mesons appear to be decaying with the ‘wrong’ probabilities. Experiments at the LHC have measured billions of such decays, and it seems that the probability of getting a muon pair from particular interactions is about three-quarters of what the Standard Model says it should be. We can’t be totally sure yet that this effect is in strong disagreement with the Standard Model – more data is being analysed to make sure that the result is not a statistical fluke or some subtle systematic error.
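
A toy calculation shows why the statistics matter so much here. The counts below are hypothetical, not the experiments’ own numbers: they simply illustrate that a deficit of the size described only becomes convincing once enough decays have been recorded.

```python
# Toy significance estimate for a deficit in counted decays (hypothetical
# numbers, simple Poisson counting; real analyses are far more careful).

import math

def naive_significance(n_observed, n_expected):
    """Rough number of standard deviations by which the observed count falls short."""
    return (n_expected - n_observed) / math.sqrt(n_expected)

print(naive_significance(75, 100))      # ~2.5 sigma: intriguing, not conclusive
print(naive_significance(750, 1000))    # ~7.9 sigma: same ratio, far more decisive
```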

Some of us are busy speculating on what these findings might mean. Excitations of two different types of new, unobserved, exotic particles – known as Z-primes and leptoquarks, buried deep within the bottom mesons – could be responsible for the mesons misbehaving. However, the trouble is that one doesn’t know which (if either) type of particle is responsible. In order to check, ideally we’d produce them in LHC collisions and detect their decay products (these decay products should include muons with a certain energy). The LHC has a chance of producing Z-primes or leptoquarks, but it’s possible they’re just too heavy. In that case, one would need to build a higher-energy collider: an ambitious proposal for a machine with roughly seven times the beam energy of the LHC would be a good option.

In the meantime, my colleagues and I ask: ‘Why should the new particles be there?’ A new mathematical symmetry might be the answer for Z-primes: the symmetry requires the Z-prime to exist in order to hold. From this symmetry one then gets additional theoretical constraints, and also some predictions for likely experimental signatures that could be checked in future experiments. Often, the bottom mesons are predicted to decay in other ways with some probability – for example, to a combination called an antimuon-tau pair. The LHC experiments will be actively analysing their data for such signals in the future.

We began with an experimental signature (the particular bottom meson decays that disagree with Standard Model predictions), then we tried to ‘bung in’ a new hypothesised particle to explain it. The particle’s predictions must be compared with current data to check that the explanation is still viable. Then we started building an additional theoretical structure that predicts the existence of the particle, as well as its interactions. This theory will allow us to make predictions for future measurements of decays, as well as to search for the direct production of the new particle at the LHC. Only after any hints from these measurements and searches have been taken into account, and the models tweaked, might we want to embed the model in a larger, more unified theoretical structure. This should drive us progressively along the road to unification, rather than attempting to reach it in one almighty leap.

Source
