Monday, February 4, 2013

Diphoton Excess Gone?

If a rumor about the Higgs boson data is correct, the odds that the Higgs boson observed is truly just the Standard Model Higgs boson increase greatly. We will know for sure later this year.

Rumors say diphoton excess in early Higgs boson search data was a statistical fluke

The rumor in the physics community is that the excess in the number of diphoton decays over the Standard Model expectation (4.3 sigma in the first round of data), seen in the experiments that discovered the Higgs boson at the Large Hadron Collider (LHC), has disappeared or greatly subsided as the size of the data set grew. The diphoton excess was present in the originally released data from the CMS half of the experiment. But CMS did not release its second round of diphoton data when its sister ATLAS experiment did, and when it had originally planned to late last year.

[R]emember that CMS have not yet published the diphoton update with 13/fb at 8 TeV. Rumours revealed that this was because the excess had diminished. At the Edinburgh Higgs symposium some more details about the situation were given. The talks are not online but Matt Strassler who was there has told us that the results have now been deemed correct. It may be understandable that when the results are not quite what they hope for they will scrutinize them more carefully, but I find it wrong that they do not then publish the results once checked. It was clear that they intended to publish these plots at HCP2012 in November and would have done so if they showed a bigger excess. By not releasing them now they are introducing a bias in what is publicly known and theorists are left to draw conclusions based on the ATLAS results only which still show an over-excess in the diphoton channel.

I have previously discussed why an excess over the Standard Model prediction in the key diphoton discovery channel for a Higgs boson was not just possible but more likely than not in the first round of data after the Higgs boson was discovered, given the bias involved in releasing data soon after that channel met a certain statistical threshold.  While not every experimental data point so far in the Higgs boson search is exactly in line with the Standard Model prediction, only the diphoton decay rate has been significantly above the expected level.
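
To illustrate the kind of release-timing bias involved, here is a minimal toy Monte Carlo sketch in Python. All of the numbers in it (error sizes, number of updates, the 2 sigma release threshold) are made up for illustration and have nothing to do with the actual CMS analysis; the point is only that if results are preferentially released the first time a channel shows a large excess, the released values will on average overshoot the true signal strength even when the true value is exactly the Standard Model one.

    # Toy Monte Carlo of reporting bias. All numbers are hypothetical.
    # Assume the true diphoton signal strength is exactly the Standard Model
    # value (mu = 1), measured with a statistical error that shrinks as more
    # data arrives. Successive updates are treated as independent draws for
    # simplicity, which is not how cumulative data really behaves.
    import math
    import random

    random.seed(1)

    def run_experiment(n_updates=8, error_at_first_update=0.45, threshold_sigma=2.0):
        """Return the 'released' signal strength for one pseudo-experiment."""
        released = None
        for i in range(1, n_updates + 1):
            error = error_at_first_update / math.sqrt(i)  # error shrinks with luminosity
            measured_mu = random.gauss(1.0, error)        # true value is exactly 1
            excess_sigma = (measured_mu - 1.0) / error
            if released is None and excess_sigma > threshold_sigma:
                released = measured_mu                    # "publish" at the first big excess
        # if the threshold is never crossed, only the final measurement is released
        return released if released is not None else measured_mu

    results = [run_experiment() for _ in range(20000)]
    print("Average released signal strength: %.3f (true value is 1.000)"
          % (sum(results) / len(results)))

Running this gives an average released value noticeably above 1, which is the whole point: nothing needs to be wrong with any individual measurement for the publicly visible picture to be skewed toward an excess.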

Thus, it appears that the strongest piece of evidence in the early Higgs boson data pointing to beyond the Standard Model physics was just a statistical fluke.

New results from both experiments on the diphoton decay rates will be released later this year with 20/fb of data, however, so the only real harm caused by not releasing the CMS data will be the time wasted by theoretical physicists, and by lay people who like to think about and blog about physics, speculating about beyond the Standard Model physics that insiders already knew the data didn't support. (The LHC will be down for upgrades for about a year, so when this is released it will be the last update of experimental data in the LHC Higgs boson search for about a year and a half.)

The Higgs boson mass discrepancy is also probably experimental error

As noted previously in another post at this blog, it also seems likely that the discrepancy between the Higgs boson mass measurements is an experimental error. Two versions of the measurement were conducted at each experiment. Three of the four results produced a value very close to 126 GeV. One version of the experiment produced a value about 1 GeV lower.

There are a variety of circumstantial reasons to think that this outlier is wrong, rather than representing new physics, such as the existence of two neutral Higgs bosons with very similar masses.  For example, if there were two Higgs bosons of similar masses, one would expect the two kinds of measurements at each of the experiments to be similar to each other, forming two pairs of mass values, and this isn't what happened.

The proton size discrepancy in muonic hydrogen is probably experimental or calculation error

Some combination of experimental error and underestimated theoretical calculation uncertainty (which arises because the infinite series equal to the exact result are truncated to produce numerical approximations, with imperfectly estimated error bounds), or subtle omissions of important terms in the theoretical calculation, is overwhelmingly the likely source of the observed discrepancy between the proton radius measured in ordinary hydrogen and the proton radius measured in muonic hydrogen. (The muonic hydrogen measurement, which is about 4% different, is probably close to the correct value, which should be the same in both cases.)
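
For a sense of the scale of the tension, here is the back-of-the-envelope arithmetic, using approximately the values that had been published as of early 2013 (roughly 0.8775 ± 0.0051 fm from ordinary hydrogen and electron-proton scattering via CODATA, and roughly 0.8409 ± 0.0004 fm from the muonic hydrogen Lamb shift); treat the exact input figures as approximate.

    # Back-of-the-envelope significance of the proton radius discrepancy.
    # The input values are approximately the published ones circa 2013 and
    # are included here only for illustration.
    import math

    r_codata, err_codata = 0.8775, 0.0051   # fm, ordinary hydrogen + e-p scattering
    r_muonic, err_muonic = 0.8409, 0.0004   # fm, muonic hydrogen Lamb shift

    difference = r_codata - r_muonic
    combined_error = math.hypot(err_codata, err_muonic)

    print("Difference: %.4f fm (%.1f%% of the CODATA value)"
          % (difference, 100 * difference / r_codata))
    print("Tension: %.1f standard deviations" % (difference / combined_error))

The difference is a bit over 4% of the radius but roughly seven times the combined quoted error, which is why either the error bars or the theoretical inputs behind one of the two determinations are the natural suspects.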

A similar issue with experimental measurement or theoretical calculations probably accounts for the discrepancy between the measured and calculated values of the muon's magnetic moment.

Other ongoing constraints on BSM physics

The observed Higgs boson mass cures potential Standard Model equation pathologies

The Higgs boson mass that is observed means that the equations of the Standard Model continue to be unitary (i.e. they predict sets of possibilities whose probabilities sum to exactly 100%, rather than more or less than 100%) at all energy scales up to the Planck scale. It is also consistent with a vacuum that is at least "metastable" (i.e. stable everywhere for time periods approximating the current age of the universe or more, even if not absolutely perfectly stable). Neither of these conditions would have held for some of the Higgs boson masses other than 126 GeV that had not yet been ruled out before the discovery was made last year, and their failure would have made beyond the Standard Model physics a necessity.
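
Stated schematically (as a hedged summary of the standard conditions, not a derivation), the two requirements look like this, where S is the scattering matrix and λ(μ) is the running Higgs quartic coupling:

    % Unitarity: outcome probabilities sum to exactly one.
    S^\dagger S = \mathbb{1}
    \quad\Longrightarrow\quad
    \sum_f \big| \langle f \,|\, S \,|\, i \rangle \big|^2 = 1 .

    % Vacuum stability in terms of the running Higgs quartic coupling:
    \text{absolute stability:}\;\; \lambda(\mu) > 0 \;\;\text{for all}\;\; \mu \lesssim M_{\text{Pl}} ,
    \qquad
    \text{metastability:}\;\; \tau_{\text{tunneling}} \gg \tau_{\text{universe}}
    \;\;\text{even if}\;\; \lambda(\mu) \;\text{runs negative at high}\; \mu .

For a Higgs mass near 126 GeV, the quartic coupling runs to values near zero (and possibly slightly negative) close to the Planck scale, which is what the "metastable but not clearly unstable" characterization refers to.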

The BSM physics generically found in alternatives to the Standard Model is tightly constrained by experiment

Developing a beyond the Standard Model theory that produces a particle with exactly the Standard Model Higgs boson spin, diphoton, four lepton and other measurable decays at the measured Higgs boson mass profoundly constrains the experimentally discernible phenomenological consequences of any such model. Tevatron data, reinterpreted in light of the knowledge that there is probably a 126 GeV Higgs boson, is also consistent with the Standard Model prediction for Higgs boson decays to b quarks, although the Tevatron data was never strong enough by itself to establish that a 126 GeV Higgs boson existed on that basis alone.

Supersymmetry theories, generically, require at least two charged and three neutral Higgs bosons. Experiments are increasingly setting strict boundaries on the possible masses of the two or more charged SUSY Higgs bosons and of the two or more neutral Higgs bosons beyond the Standard Model expectation. Experiments are also setting increasingly stringent bounds on the minimum masses of supersymmetric superpartners and on the key SUSY parameter tan beta. There are enough moving parts in supersymmetry theories that the LHC can't absolutely rule out all possible supersymmetry theories.

But, it takes increasingly fine-tuned, unnatural and exotic versions of these theories to fit the data: (1) a Higgs boson with exactly the properties of a 126 GeV Standard Model Higgs boson, and (2) the absence of any other beyond the Standard Model particle of 1 TeV or less in mass. One can imagine exotic particles that would escape detection at LEP, Tevatron and the LHC given their particular experimental limitations known after the fact, but any particle of 1 TeV or less not detected by any of these experiments by the time that the LHC experiment is concluded has to be exotic indeed.

Since string theories generically have a low energy approximation in the form of supersymmetry theories, the theory space of experimentally possible string vacua is also highly constrained to quite exotic versions of those theories.

The Standard Model provides that phenomena including proton decay, flavor changing neutral currents, neutrinoless double beta decay, baryon number non-conservation, lepton number non-conservation, and "generations" of fermions beyond the Standard Model's three generations simply do not exist. But, many beyond the Standard Model theories predict that such phenomena exist but are rare or otherwise hard to observe. Ever tightening experimental limits on these phenomena, none of which have been discovered, and some of which (like four or more generations of fermions) are even definitively ruled out by experiment, increasingly disfavor competing models.

As previously noted at this blog, these constraints have even more power when combined. It isn't too hard, for example, to adjust supersymmetry theories so that their particles are too heavy to be directly detected at the LHC. But, supersymmetry theories with more massive superpartners generically also imply higher levels of neutrinoless double beta decay that experiments independent of the LHC can increasingly rule out.

Tightening the constraints on the minimum half-life of the proton and on the maximum rate of neutrinoless double beta decay by factors of about one hundred, which is something that should be possible within the next decade or so, would rule out a whole swath of significantly investigated beyond the Standard Model theories, to an extent similar to that of the LHC experiments ruling out beyond the Standard Model particles up to about 1 TeV in mass.

Bottom line: the Standard Model is unstoppable.

The bottom line is that the "worst case scenario" for theoretical physicists, in which the LHC experimentally completes the task of discovering the last of the Standard Model particles, the Higgs boson, while discovering no beyond the Standard Model physics, is looking increasingly probable.

If beyond the Standard Model physics exists, it appears to exist in the realm of gravitational physics, neutrino physics, and physics at energy scales far above a TeV, all of which are beyond the LHC's ability to detect, and not at the non-gravitational energy scales of roughly 1 TeV or less that the LHC can discern.

This greatly undermines the incentive for people who fund "big science" to spend lots of money in the near future on expensive new particle accelerators that can explore energies beyond those of the LHC, because it reduces the reason to suspect that there is anything interesting to discover at those energy scales.

This isn't just a problem for particle physicists either.

Issues For Theorists Related To Gravity and Dark Matter

Astrophysicists long ago ruled out all of the fundamental particles of the Standard Model, and all of its known stable composite particles, as dark matter candidates. The LHC is rigorously ruling out another whole swath of dark matter candidates, and the numerous direct dark matter detection experiments are increasingly eliminating similar candidates.

The LHC is also tightly constraining theories relevant to gravitational physics that call for extra dimensions, by limiting the scale and properties of those extra dimensions. This matters because extra dimensions play a central role in allowing gravity to be as weak as it is in a "theory of everything" that describes the three Standard Model forces and gravity as manifestations of a single unified force whose symmetries are broken at low energies. "Large" extra dimensions have been ruled out by experimental data from the LHC.
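
The way extra dimensions dilute gravity can be summarized by the standard large extra dimension relation, stated here only schematically and with order-one geometric factors omitted: the observed four-dimensional Planck scale M_Pl is built out of a much lower fundamental gravity scale M_* and the size R of n compact extra dimensions, so experimental limits on M_* and on graviton-type signatures translate into limits on R.

    % Schematic relation between the 4D Planck scale, a lower fundamental
    % gravity scale M_*, and n extra dimensions of common size R
    % (numerical factors of order one are omitted):
    M_{\text{Pl}}^{2} \;\sim\; M_{*}^{\,n+2}\, R^{\,n}
    \quad\Longrightarrow\quad
    R \;\sim\; \frac{1}{M_{*}} \left( \frac{M_{\text{Pl}}}{M_{*}} \right)^{2/n} .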

These tight and precise constraints on particle physics candidates for dark matter greatly constrain the properties of any dark matter candidates that do exist, since particles of suitable mass that interact via the weak force are pretty much completely ruled out (and strong and electromagnetic force interactions for these particles had already been largely ruled out). And they also make modifications of gravity more attractive relative to exotic dark matter, even though no known plausible modification of gravity can yet describe the phenomena observed in galactic clusters; some significant amount of at least non-exotic dim matter, which is found at high levels in galactic clusters but not in ordinary galaxies, must be present to explain the data so far.

Difficulties In Explaining The Matter-Antimatter Asymmetry

The data also complicate cosmology models by tightly constraining, up to quite high energies, the potential particle physics sources of the observed matter-antimatter asymmetry. The overwhelmingly most popular modification of the Standard Model's insistence on conservation of baryon number (B) and lepton number (L), made to address the matter-antimatter asymmetry that we observe in the universe, is to assume in a beyond the Standard Model theory that, rather than conserving B and L independently, the universe under the right circumstances merely conserves the quantity B-L. But, so far, experiments tightly constrain any violation of independent B and L conservation, and this constraint could conceivably reach the point where the experimentally permitted amount of B and L non-conservation (with only B-L conserved) is insufficient to account for the observed matter-antimatter asymmetry.
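
The reason B-L is the natural quantity to focus on is that, even within the Standard Model, high temperature "sphaleron" processes in the early universe violate B+L while conserving B-L, so a primordial B-L asymmetry generated by new physics gets partially reprocessed into a baryon asymmetry. A commonly quoted equilibrium relation (for three fermion generations and one Higgs doublet, and stated here only as an approximate rule of thumb) is:

    % Sphalerons conserve B - L and violate B + L; in chemical equilibrium
    % the surviving baryon asymmetry tracks the primordial B - L asymmetry
    % (coefficient for 3 generations and 1 Higgs doublet):
    \Delta(B - L) = 0 , \qquad B_{\text{final}} \;\simeq\; \tfrac{28}{79}\,(B - L)_{\text{initial}} .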

The one missing piece of the Standard Model that could yet address this issue is a determination of the CP violation phase of the PMNS matrix, which governs differences between the behavior of matter and antimatter for leptons in the same way that a similar CP violation phase of the CKM matrix governs this for quarks. But, if the CP-violating phase for leptons isn't enough to explain the matter-antimatter asymmetry in the universe (and many theorists are already guessing that this phase may be nearly maximal), and the observation that B number conservation and L number conservation are maintained in all experimentally observable contexts up to the energies the LHC can see continues to hold, then cosmologists may need radical new ideas to explain this observed feature of the universe without the help of beyond the Standard Model particle physics.
