Monday 26 August 2013

On the latest anomaly in LHCb

[...So, Resonaances is back with its trademark pessimism and frustration. But here is a post with a glimmer of hope and a bit more substance...]

LHCb recently reported an anomaly in the angular distribution of B0 → K*0 (→K+π-) μ+ μ- decays. The discreet charm of flavor physics is that even trying to understand which process is being studied may give you a serious migraine. So let's first translate to English. B0 is a pseudoscalar meson made of an anti-b and a d-quark that is easily found in the junk produced by LHC collisions. K*0, actually K*0(892) because they come in a variety of masses, is a vector meson made of an anti-s and a d-quark which promptly decays to an ordinary charged kaon and a pion. B0 → K*0(→K+π-) μ+ μ-, in the following simply referred to as B → K*μμ, is a rare decay occurring with a branching fraction of order 10^-7. Of course there's also the conjugate process where each particle is replaced with its anti-particle, and the two are lumped together in the LHCb analysis. Some properties of this decay have been studied before at the B-factories, the Tevatron, and the LHC, without finding anything unexpected. The new thing about the latest LHCb analysis is that they study the full-monty differential distribution with respect to the 4 variables characterizing this four-body decay: the 3 angles θK, θl, and φ (see the picture) and the invariant mass squared q^2 of the di-muon pair. A parametrization of that differential distribution is
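[A sketch of the standard form used in such analyses; the normalization and sign conventions below may differ slightly from those in the LHCb paper:]

\[
\frac{1}{d\Gamma/dq^2}\,\frac{d^4\Gamma}{dq^2\, d\cos\theta_l\, d\cos\theta_K\, d\phi} = \frac{9}{32\pi}\Big[\tfrac{3}{4}(1-F_L)\sin^2\theta_K + F_L\cos^2\theta_K + \tfrac{1}{4}(1-F_L)\sin^2\theta_K\cos 2\theta_l - F_L\cos^2\theta_K\cos 2\theta_l + S_3\sin^2\theta_K\sin^2\theta_l\cos 2\phi + S_4\sin 2\theta_K\sin 2\theta_l\cos\phi + S_5\sin 2\theta_K\sin\theta_l\cos\phi + S_6\sin^2\theta_K\cos\theta_l + S_7\sin 2\theta_K\sin\theta_l\sin\phi + S_8\sin 2\theta_K\sin 2\theta_l\sin\phi + S_9\sin^2\theta_K\sin^2\theta_l\sin 2\phi\Big]
\]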

Basically, LHCb measured all these Sn and FL coefficients as a function of q^2. The largest anomaly is observed at low q^2 in the parameter S5 (also presented as P5', which is S5 rescaled by a function of FL: P5' = S5/√[FL(1-FL)]). LHCb quantifies it as a 3.7 sigma deviation from the Standard Model in the region 4.3≤q^2≤8.68 GeV^2; this is downgraded to 2.5 sigma if the look-elsewhere effect is taken into account. Theorists fitting the data quote deviations between 1 and 4.5 sigma, depending on theoretical assumptions and on how the data are sliced and cooked.

The interesting question is whether new physics could be responsible for the anomaly. To go beyond a yes/no answer one has to, unfortunately, go through a few technicalities. At the parton level, the relevant process is the b→sμ+μ- decay. Theorists computing the B → K*μμ decay thus start from an effective interaction Lagrangian with 4-fermion and dipole operators involving the b- and s-quarks. The operators relevant for this process are
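[Sketching the operator basis in a commonly used normalization; the exact conventions in the original equation may differ:]

\[
\mathcal{L}_{\rm eff} = \frac{1}{\Lambda_{\rm ref}^2}\Big[\, C_7\, m_b\,(\bar{s}\,\sigma_{\mu\nu} P_R\, b)\, F^{\mu\nu} + C_9\,(\bar{s}\,\gamma_\mu P_L\, b)(\bar{\mu}\,\gamma^\mu \mu) + C_{10}\,(\bar{s}\,\gamma_\mu P_L\, b)(\bar{\mu}\,\gamma^\mu \gamma_5\, \mu) + \big(P_L \leftrightarrow P_R\ \text{for the primed}\ C_7',\, C_9',\, C_{10}'\big)\Big] + {\rm h.c.}
\]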

where Λref≈35 TeV. This set of operators allows one to describe the B → K*μμ decays in a completely model-independent way, whether within the Standard Model or in some new physics scenario. In the Standard Model a subset of these operators is generated (see the diagrams) with the coefficients C7, C9 and C10 of order 1 (the suppression scale of the effective operators is tens of TeV due to the loop suppression, and also due to the CKM suppression via the small Vts matrix element; this is why B → K*μμ is so sensitive to new physics). New physics could provide additional contributions to these 3 operators or produce the C' operators that are not generated in the Standard Model at all. For example, the tree-level exchange of a Z' boson coupled to leptons and, in a flavor-violating way, to quarks could affect C9 and C9'; the dipole operators C7 and C7' could be generated e.g. by loop diagrams with a charged Higgs boson, and so on. Now, all of these operators affect the angular distribution of B → K*μμ decays; in particular, they can shift the anomalous observable S5/P5'. But one should be careful not to screw up the other observables that remain in good agreement with the Standard Model. Moreover, the same operators also affect countless other processes in the B-meson sector, including the well-measured branching fractions for B → Xs γ and Bs→μμ decays. Thus, it is a non-trivial question whether a consistent solution to the anomaly can be found. The answer is that, indeed, there do exist regions in the parameter space where the fit to the data is much better than in the Standard Model. According to this paper, the best scenario is the one where new physics generates simultaneously C9 and C9', with the contribution to C9 similar in size but opposite in sign to the Standard Model effective contribution. Other combinations of the operators can also improve the fit, but the gain is less striking.
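To see roughly how such a Z' would enter, here is a back-of-the-envelope matching; the couplings g_bs and g_μ below are hypothetical placeholders, not taken from any particular model. A heavy Z' coupled as g_bs Z'_μ (s̄ γ^μ PL b) + g_μ Z'_μ (μ̄ γ^μ μ) generates at tree level

\[
\delta\mathcal{L} \simeq \frac{g_{bs}\, g_\mu}{M_{Z'}^2}\,(\bar{s}\,\gamma_\mu P_L\, b)(\bar{\mu}\,\gamma^\mu \mu) \quad\Rightarrow\quad \delta C_9 \simeq g_{bs}\, g_\mu\,\frac{\Lambda_{\rm ref}^2}{M_{Z'}^2},
\]

so an order-1 shift of C9 needs MZ'/√(g_bs g_μ) in the ballpark of Λref; with an O(1) muon coupling and a small flavor-violating quark coupling, the Z' itself can be far lighter than 35 TeV.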

So, the verdict is... well, at this point the anomaly is not utterly solid yet. One warning flag is that it shows up in a complicated angular analysis rather than in a clean and simple observable, which gives more opportunities for theorists and experimenters alike to commit a subtle error in the analysis. Moreover, in order to explain the anomaly, new physics contributions to the B → K*μμ amplitude need to be of the same order of magnitude as the Standard Model ones, which requires a certain degree of conspiracy. Most likely, the experimental data and the Standard Model predictions will approach each other when more data are analyzed, as has happened countless times in the past. Nevertheless, we're looking forward to future updates on B → K*μμ with a little more anticipation than usual. Note that the current LHCb analysis includes only the 7 TeV run data; the twice-as-large 8 TeV sample is still waiting to see the light...

[Most pictures stolen from Nicola Serra's talk at EPS] 

Tuesday 13 August 2013

A kingdom for a scale

The recent hiatus, so far the longest in the history of Résonaances, was caused by a unique combination of work, travel, frustration, depression, and sloth. Sorry :-|  A day may come when this blog will fall silent forever; but it is not this day ;)
 
After the first run of the LHC, particle physics finds itself in an unprecedented situation. During most of the history of the discipline we had a high energy scale that allowed us to organize our theoretical and experimental efforts. It first appeared back in the 1930s when Fermi wrote down his theory of weak interactions, which contained a 4-fermion operator mediating the beta decay of the neutron. For dimensional reasons, 4-fermion operators appear in the Lagrangian divided by an energy scale squared, and in the case of the Fermi operator this scale is what we now know as the electroweak scale v=174 GeV. This scale comes with well-defined physical consequences. Scattering amplitudes in the Fermi theory misbehave at energies above v, and some new physics must appear to regulate them. Later, several details of this picture were modified. In particular, it was found that the Fermi 4-fermion operator is a low-energy effective description of the exchange of a W boson between pairs of fermions. However, the argument for new physics near the electroweak scale remained in place, this time to regulate the scattering amplitudes of the W and Z bosons. That's why even before the LHC kicked off we could give an almost risk-free promise that it was going to discover something.
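To spell out the dimensional argument, schematically (with v defined as 174 GeV, so that GF = 1/(2√2 v²) ≈ 1.17×10^-5 GeV^-2), the Fermi interaction is something like

\[
\mathcal{L}_{\rm Fermi} \sim \frac{G_F}{\sqrt{2}}\,(\bar{p}\,\gamma^\mu(1-\gamma_5)\, n)\,(\bar{e}\,\gamma_\mu(1-\gamma_5)\,\nu_e) + {\rm h.c.},
\]

and any 2→2 amplitude built from it grows as ~ GF E² ~ (E/v)², so it hits the unitarity limit at energies of order v. That is where the W boson, and ultimately the Higgs, had to step in.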

Now things have changed dramatically. The LHC has explored the energy range up to about 1 TeV and definitively crossed the electroweak scale. The promised new physics phenomenon was found: a spin-0 boson coupled to mass, as predicted in the Standard Model. This little addition miraculously cures all the woes of the theory. Ignoring gravity, the Standard Model with the 125 GeV Higgs boson can be extended to arbitrarily high scales. Only the coupling of matter to gravity guarantees some new phenomena, like maybe strong gravitational effects and the production of black holes. But that should happen at the immensely high scale of 10^19 GeV, which we may never be able to reach in collider experiments. We are not sure if there is any other physical scale between the electroweak and the Planck scale. There's no well-defined energy frontier we can head toward. Particle physics no longer has a firm reference point. An artist's view of the current situation is this:

There's actually one important practical consequence. Regardless of how high-energy a collider we build next (30 TeV, 100 TeV, or 1000 TeV), we cannot be sure it will discover any new phenomena rather than just confirm the old theory in a new energy range.

This is not to say that the Standard Model must be valid all the way to the Planck scale. On the contrary, we have strong hints that it is not. The existence of dark matter, the observations of neutrino oscillations, the matter-antimatter asymmetry in the Universe, and cosmological inflation all require some physics beyond the Standard Model. However, none of the above points to a concrete scale where new phenomena must show up. The answers may be just around the corner and be revealed by run-II of the LHC. Or the answer may be due to Planck-scale physics and will never be directly explored; or else it may be due to very light and very weakly coupled degrees of freedom that should be probed by means other than colliders. For example, for dark matter particles we know theoretically motivated models with masses ranging from sub-eV (axions) to the GUT scale (wimpzillas), and no mass between these two extremes is clearly favored from the theory point of view. The case of neutrino oscillations is a bit different: as soon as we prove experimentally that neutrinos are Majorana particles, we will confirm the existence of a set of dimension-5 operators beyond the Standard Model, the so-called Weinberg operators of the form (HL)^2/Λ. Then the scale Λ is the maximum energy at which new physics (singlet Majorana neutrinos or something more complicated) has to show up. This is, however, little consolation, given that the scale emerging from neutrino experiments is Λ∼10^15 GeV, obviously beyond the direct reach of accelerators in the foreseeable future.
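For orientation, a one-line estimate of where that number comes from, taking v = 174 GeV and a representative neutrino mass of 0.05 eV as inputs: the Weinberg operator gives

\[
m_\nu \sim \frac{v^2}{\Lambda} \quad\Rightarrow\quad \Lambda \sim \frac{(174\ {\rm GeV})^2}{0.05\ {\rm eV}} \approx 6\times 10^{14}\ {\rm GeV},
\]

which is what is meant by Λ∼10^15 GeV above.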

So, while pushing up the energy frontier in accelerators will continue, I think that currently searching high and low for a new scale is the top priority. Indeed, increasing the collision energy has become an expensive and time-consuming endeavor; we will achieve an almost factor-of-2 increase in 2 years, and, optimistically, we can hope for another factor of 2 on a time scale of ∼25 years. On the other hand, indirect sensitivity to high scales via searches for higher-dimensional operators beyond the Standard Model can often be improved by orders of magnitude in the near future. The hope is that Fermi's trick will work again and we may discover the new scale indirectly, by means of experiments at much lower energies. There are literally hundreds of dimension-6 operators beyond the Standard Model that can be searched for in experiments. For example, operators involving the Higgs field would affect the measured Higgs couplings, and in this case the LHC and later the ILC can probe operators suppressed by scales up to ∼10 TeV. Flavor- and CP-violating processes offer an even more sensitive probe, with the typical sensitivity between 10 and 10^5 TeV. Who knows, maybe the recent anomaly in B→K*μμ decays is not yet another false alarm, but an effect of flavor-violating dimension-6 operators of the form
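[Guessing at the operator that belongs here; one representative choice, in line with the C9-type solution discussed above, would be]

\[
\frac{1}{\Lambda^2}\,(\bar{s}\,\gamma_\mu P_L\, b)(\bar{\mu}\,\gamma^\mu \mu) + {\rm h.c.}
\]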


with Λ of order 30 TeV. And if not, there are hundreds of other doors to knock on. Demonstrating the presence of a nearby new physics scale would surely bring back momentum to the particle physics program. At least, we would know where we stand, and how big a collider we must build to be guaranteed new physics. So yes, a kingdom, and on my part I'm adding the hand of a princess too...