Monday 4 March 2013

Collider Searches for Dark Matter

The quest to find non-gravitational evidence for dark matter proceeds along several different fronts.  The most direct approach is to look for dark matter particles scattering in detectors here on Earth.  Indirect searches, looking for cosmological signals of dark matter annihilation or decay, have provided a number of tantalising hints, of which the "line" signal at Fermi is the most recent.  However, directly producing dark matter at experiments like the LHC offers us the most control over the initial conditions and thus the least ambiguity in interpretation.

The problem with dark matter at collider experiments is that it is dark, i.e. it doesn't show up in the detectors.  To get around this problem we look for the production of other stuff as well as the dark matter itself.  We can then tell if the dark matter is there by seeing an apparent violation of conservation of momentum; the missing momentum is carried away by the unobserved dark matter particles.
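(As a rough illustration of the bookkeeping: the missing transverse momentum is just minus the vector sum of the transverse momenta of everything we do see.  The sketch below uses an invented one-jet event, purely for the sake of the example.)

```python
import math

def missing_pt(visible):
    """Missing transverse momentum: minus the vector sum of the
    visible particles' (px, py), in GeV."""
    px = -sum(p[0] for p in visible)
    py = -sum(p[1] for p in visible)
    return math.hypot(px, py)

# An invented event: one hard jet and nothing else visible.
# Whatever balances it must have escaped the detector unseen.
jet = (120.0, 15.0)
print(missing_pt([jet]))  # ~121 GeV of "missing" momentum
```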

The traditional approach to this type of search is to take a complete model of new physics (such as supersymmetry) and use that to model the production process.  So in SUSY, we produce gluinos or squarks, which then decay through several steps, producing dark matter and multiple Standard Model (SM) particles.  Indeed, even today the signal "jets and missing transverse momentum" is considered a characteristic SUSY search.

However, a couple of years ago an alternative and somewhat opposite approach started to gain popularity.  The new idea was to consider a much simpler model, with the dark matter the only new object beyond the SM.  In such a model all[1] couplings of dark matter to the visible sector are so-called non-renormalisable operators, that is they are suppressed by a high energy scale.  At that energy our theory stops making sense, which means it is incomplete.  However, it is still useful to work in such a theory, as the physics below the cut-off scale is relatively independent of the physics above it.  Specifically, many different high-energy models can have the same low-energy description.  This means that any conclusions drawn from such simplified models will be robust and of broad applicability.
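(For a purely illustrative example, take the dark matter to be a Dirac fermion χ coupling to quarks q; exact conventions vary between papers.  A typical non-renormalisable operator is then suppressed by two powers of the cut-off scale Λ, schematically

```latex
\mathcal{L}_{\rm int} \;\supset\; \frac{1}{\Lambda^{2}}\,(\bar{q}\,\Gamma\, q)\,(\bar{\chi}\,\Gamma\,\chi),
```

where Γ is some combination of Dirac matrices.  The higher the scale Λ, the weaker the interaction.)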

The number of possible models of this type is in principle infinite.  However, if we restrict ourselves to models coupling two visible sector particles to two dark matter ones, a good description of almost all known complete models, there are only tens of possible choices.  Clearly that is few enough that we can enumerate all the possibilities.

Recall that we need to produce a SM particle in addition to the dark matter.  This is most easily done through initial state radiation.  The simplest final state is the completely invisible one:
[Figure: production of dark matter χ from quarks q.  (Quarks are the constituents of protons.)]
The simplest observable final state comes from radiating e.g. a photon off one of our initial particles:
[Figure: as above, but with initial state radiation.  From Zhou et al.]
So we look for a SM particle recoiling against invisible stuff.  These signals are called "mono-X", where X is the visible particle.  Monojets and monophotons are the most venerable examples, as they can signal certain types of extra-dimensional models; but this dark matter program has also seen consideration of mono-Z, mono-W and even mono-Higgs signals.

As an aside, some of the earliest work I did in my Ph.D. was on stuff very similar to this, several years earlier.  I really kick myself that I didn't think of it then, or get involved when the first few papers in this area came out.

This brings me to a paper from a couple of weeks ago that is something of a summary of the work in this area.  By that I don't mean that it is a review, but rather that the authors combined the results of several different searches to place the most stringent limits on such simplified dark matter models.  What makes this particularly easy is that all the different searches are sufficiently distinct that they are statistically independent.  This means that the different limits can be combined using fairly elementary statistics instead of the full analysis tools of the LHC experimental collaborations.  (Note that Zhou et al. are theorists.)
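(Roughly speaking, independence means the likelihoods of the individual channels simply multiply, i.e. their log-likelihoods add.  The sketch below illustrates that idea with invented Poisson counting channels; it is not the paper's actual statistical procedure.)

```python
import math

def poisson_nll(observed, background, signal):
    """Negative log-likelihood of one counting channel."""
    mu = background + signal
    return mu - observed * math.log(mu) + math.lgamma(observed + 1)

# Invented, statistically independent channels:
# (observed events, expected background, expected signal)
channels = [
    (105, 100.0, 12.0),   # e.g. a monojet-like channel
    (22,   20.0,  3.0),   # e.g. a monophoton-like channel
    (8,     7.5,  1.2),   # e.g. a mono-Z-like channel
]

# Independence: the combined likelihood is the product of the
# individual likelihoods, so the negative logs simply add.
combined_nll = sum(poisson_nll(n, b, s) for n, b, s in channels)
print(combined_nll)
```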

The increase in sensitivity from combining searches depends on the model, but is typically tens of percent.  Most cases are dominated by one particular signal, which is why you don't gain more.  Still, it is an improvement, and an important one given where those limits are.  Consider the following results from the paper:
These are exclusion plots for two particular SM-dark matter couplings, called D5 and D8.  The horizontal axis is the dark matter mass, and the vertical axis the cut-off scale.  Note that larger values of the cut-off correspond to weaker interactions.  The blue/red/black/grey lines are exclusion contours; everything below a line is excluded.  The green line shows where these models reproduce the observed dark matter abundance.  The fact that the exclusion contours and the relic density line intersect means that improving the limits rules out viable models of dark matter.  Even moderate improvements are useful in this regard.
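(For reference, as these operators are usually written in this literature, if I remember the conventions correctly: D5 is the vector coupling and D8 the axial-vector coupling of Dirac dark matter to quarks, both suppressed by two powers of the cut-off scale, so the collider production rate falls like the fourth power of that scale:

```latex
\mathcal{O}_{D5} = \frac{(\bar{q}\gamma^{\mu} q)(\bar{\chi}\gamma_{\mu}\chi)}{\Lambda^{2}},
\qquad
\mathcal{O}_{D8} = \frac{(\bar{q}\gamma^{\mu}\gamma^{5} q)(\bar{\chi}\gamma_{\mu}\gamma^{5}\chi)}{\Lambda^{2}},
\qquad
\sigma_{\rm prod} \propto \frac{1}{\Lambda^{4}} .
```

This is why pushing the exclusion contours upwards in these plots corresponds to excluding weaker and weaker couplings.)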

[1] This is not actually true.  There are three possible exceptions, the so-called Higgs, vector and neutrino portals.  However, none of them are acceptable for such simple models, as they would lead to too much dark matter in the present Universe.
