First results from search for a dark light

The Heavy Photon Search at Jefferson Lab is looking for a hypothetical particle from a hidden “dark sector.”

HPS Silicon Vertex Tracker being assembled

In 2015, a group of researchers installed a particle detector just half of a millimeter away from an extremely powerful electron beam. The detector could either start them on a new search for a hidden world of particles and forces called the “dark sector”—or its sensitive parts could burn up in the beam.

Earlier this month, scientists presented the results from that very first test run at the Heavy Photon Search collaboration meeting at the US Department of Energy’s Thomas Jefferson National Accelerator Facility. To the scientists’ delight, the experiment is working flawlessly.

Dark sector particles could be the long-sought components of dark matter, the mysterious form of matter thought to be five times more abundant in the universe than regular matter. To be specific, HPS is looking for a dark-sector version of the photon, the elementary “particle of light” that carries the fundamental electromagnetic force in the Standard Model of particle physics.

Analogously, the dark photon would be the carrier of a force between dark-sector particles. But unlike the regular photon, the dark photon would have mass. That’s why it’s also called the heavy photon.

To search for dark photons, the HPS experiment uses a very intense, nearly continuous beam of highly energetic electrons from Jefferson Lab’s CEBAF accelerator. When slammed into a tungsten target, the electrons radiate energy that could potentially produce the mystery particles. Dark photons are believed to quickly decay into pairs of electrons and their antiparticles, positrons, which leave tracks in the HPS detector.

“Dark photons would show up as an anomaly in our data—a very narrow bump on a smooth background from other processes that produce electron-positron pairs,” says Omar Moreno from SLAC National Accelerator Laboratory, who led the analysis of the first data and presented the results at the collaboration meeting.
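
To make the idea of a bump hunt concrete, here is a minimal sketch in Python. Everything in it, from the exponential background to the narrow bump near 60 MeV, is invented for illustration; it is not the collaboration's analysis code.

```python
import numpy as np

# Toy bump hunt: scan an e+e- invariant-mass spectrum for a narrow excess
# over a smooth background. All numbers are invented for illustration.
rng = np.random.default_rng(42)

# Smooth, falling background plus a narrow "dark photon" bump near 60 MeV.
background = rng.exponential(scale=40.0, size=100_000)   # masses in MeV
signal = rng.normal(loc=60.25, scale=0.2, size=500)      # narrow resonance
masses = np.concatenate([background, signal])

counts, edges = np.histogram(masses, bins=np.arange(20.0, 100.0, 0.5))

# Estimate the background under each bin from its sidebands, then flag
# bins whose excess over that estimate is statistically significant.
for i in range(2, len(counts) - 2):
    sideband = 0.25 * (counts[i - 2] + counts[i - 1] + counts[i + 1] + counts[i + 2])
    if sideband > 0:
        z = (counts[i] - sideband) / np.sqrt(sideband)   # rough local significance
        if z > 5.0:
            print(f"candidate bump near {edges[i]:.2f} MeV (local z = {z:.1f})")
```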

The challenge is that, because of the high beam energy, the decay products emerge in a very narrow cone along the beam direction. To catch them, the detector must sit very close to the electron beam. But not too close: even small movements of the beam could swerve it into the detector. And even if the beam never directly hits the HPS apparatus, electrons interacting in the target can scatter into the detector and cause unwanted signals.

The HPS team implemented a number of precautions to make sure their detector could handle the potentially destructive beam conditions. They installed and carefully aligned a system to intercept any large beam motions, made the detector’s support structure movable to bring the detector close to the beam and measure the exact beam position, and installed a feedback system that would shut the beam down if its motions were too large. They also placed their whole setup in vacuum because interactions of the beam with gas molecules would create too much background. Finally, they cooled the detector to negative 30 degrees Fahrenheit to reduce the effects of radiation damage. These measures allowed the team to operate their experiment so close to the beam.

“That’s maybe as close as anyone has ever come to such a particle beam,” says John Jaros, head of the HPS group at SLAC, which built the innermost part of the HPS detector, the Silicon Vertex Tracker. “So, it was fairly exciting when we gradually decreased the distance between the detector and the beam for the first time and saw that everything worked as planned. A large part of that success lies with the beautiful beams Jefferson Lab provided.” 

SLAC’s Mathew Graham, who oversees the HPS analysis group, says, “In addition to figuring out if we can actually do the experiment, the first run also helped us understand the background signals in the experiment and develop the data analysis tools we need for our search for dark photons.”

So far, the team has seen no signs of dark photons. But to be fair, the data they analyzed came from just 1.7 days of accumulated running time. HPS collects data in short spurts when the CLAS experiment, which studies protons and neutrons using the same beam line, is not in use.

A second part of the analysis is still ongoing: The researchers are also closely inspecting the exact location, or vertex, from which an electron-positron pair emerges.

“If a dark photon lives long enough, it might make it out of the tungsten target where it was produced and travel some distance through the detector before it decays into an electron-positron pair,” Moreno says. The detector was specifically designed to observe such a signal.
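
The arithmetic behind that displaced-vertex signature fits in a few lines. In this sketch the lifetime, mass and energy are placeholder values chosen only to show how a measurable decay length arises; they are not HPS numbers.

```python
import numpy as np

# Displaced-vertex arithmetic with placeholder values (not HPS numbers):
# a long-lived particle's mean decay length is gamma * beta * c * tau.
rng = np.random.default_rng(0)

c = 3.0e8          # speed of light, m/s
tau = 1.0e-12      # assumed proper lifetime, seconds
mass = 0.060       # assumed dark-photon mass, GeV
energy = 2.3       # assumed energy carried by the dark photon, GeV

gamma = energy / mass                      # relativistic boost factor
beta = np.sqrt(1.0 - 1.0 / gamma**2)       # velocity in units of c
mean_length = gamma * beta * c * tau       # metres

# Decay positions follow an exponential distribution.
flight = rng.exponential(scale=mean_length, size=10_000)
print(f"mean decay length: {mean_length * 1e3:.1f} mm")
print(f"fraction decaying more than 1 cm past the target: {(flight > 0.01).mean():.0%}")
```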

Jefferson Lab has approved the HPS project for a total of 180 days of experimental time. Slowly but surely, HPS scientists are finding chances to use it.

LHC swings back into action

Protons are colliding once again in the Large Hadron Collider.

Overhead view of people sitting in front of two rows of computer screens

This morning at CERN, operators nudged two high-energy beams of protons onto a collision course inside the world’s largest and most energetic particle accelerator, the Large Hadron Collider. These first stable beams inside the LHC since the extended winter shutdown usher in another season of particle hunting.

The LHC’s 2017 run is scheduled to last until December 10. The improvements made during the winter break will ensure that scientists can continue to search for new physics and study rare subatomic phenomena. The machine exploits Albert Einstein’s principle that energy and matter are equivalent and enables physicists to transform ordinary protons into the rare massive particles that existed when our universe was still in its infancy.
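
A back-of-the-envelope comparison gives a sense of that equivalence. The 13 TeV figure is the LHC's collision energy; note that in any real collision only a fraction of it is available to create new particles, since the energy is shared among the protons' constituent quarks and gluons.

```python
# Energy-to-mass bookkeeping at the LHC, as a sense of scale. Only a
# fraction of the collision energy is available to make new particles,
# since it is shared among the protons' quarks and gluons.
proton_rest_energy_gev = 0.938     # proton mass in energy units, GeV
collision_energy_gev = 13_000.0    # 13 TeV collision energy
higgs_rest_energy_gev = 125.0      # Higgs boson mass in energy units, GeV

print(f"collision energy = {collision_energy_gev / proton_rest_energy_gev:,.0f} proton masses")
print(f"collision energy = {collision_energy_gev / higgs_rest_energy_gev:,.0f} Higgs masses")
```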

“Every time the protons collide, it’s like panning for gold,” says Richard Ruiz, a theorist at Durham University. “That’s why we need so much data. It’s very rare that the LHC produces something interesting like a Higgs boson, the subatomic equivalent of a huge gold nugget. We need to find lots of these rare particles so that we can measure their properties and be confident in our results.”

During the LHC’s four-month winter shutdown, engineers replaced one of its main dipole magnets and carried out essential upgrades and maintenance work. Meanwhile, the LHC experiments installed new hardware and revamped their detectors. Over the last several weeks, scientists and engineers have been performing the final checks and preparations for the first “stable beams” collisions.

“There’s no switch for the LHC that instantly turns it on,” says Guy Crockford, an LHC operator. “It’s a long process, and even if it’s all working perfectly, we still need to check and calibrate everything. There’s a lot of power stored in the beam and it can easily damage the machine if we’re not careful.”

In preparation for data-taking, the LHC operations team first did a cold checkout of the circuits and systems without beam and then performed a series of dress rehearsals with only a handful of protons racing around the machine.

“We set up the machine with low intensity beams that are safe enough that we could relax the safety interlocks and make all the necessary tweaks and adjustments,” Crockford says. “We then deliberately made the proton beams unstable to check that all the loose particles were caught cleanly. It’s a long and painstaking process, but we need complete confidence in our settings before ramping up the beam intensity to levels that could easily do damage to the machine.”

The LHC started collisions for physics with only three proton bunches per beam. Over the course of the next month, the operations team will gradually increase the number of proton bunches until they have 2760 per beam. The higher proton intensity greatly increases the rate of collisions, enabling the experiments to collect valuable data at a much faster rate.
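
A toy calculation gives a feel for what that ramp-up means. It assumes, for simplicity, that every bunch collides once per revolution at a given interaction point; the LHC's real filling scheme is more complicated.

```python
# Toy scaling of the bunch-crossing rate with bunch count, assuming every
# bunch collides once per revolution at a given interaction point.
revolution_freq_hz = 11_245    # LHC revolution frequency, about 11.2 kHz

for bunches in (3, 2760):
    crossings_per_s = bunches * revolution_freq_hz
    print(f"{bunches:>4} bunches -> roughly {crossings_per_s:,} bunch crossings per second")
```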

“We’re always trying to improve the machine and increase the number of collisions we deliver to the experiments,” Crockford says. “It’s a personal challenge to do a little better every year.”

The facts and nothing but the facts

At a recent workshop on blind analysis, researchers discussed how to keep their expectations out of their results.

Illustration of a scientist handing papers to another, blindfolded scientist

Scientific experiments are designed to determine facts about our world. But in complicated analyses, there’s a risk that researchers will unintentionally skew their results to match what they were expecting to find. To reduce or eliminate this potential bias, scientists apply a method known as “blind analysis.”

Blind studies are probably best known from their use in clinical drug trials, in which patients are kept in the dark about—or blind to—whether they’re receiving an actual drug or a placebo. This approach helps researchers judge whether their results stem from the treatment itself or from the patients’ belief that they are receiving it.

Particle physicists and astrophysicists do blind studies, too. The approach is particularly valuable when scientists search for extremely small effects hidden among background noise that point to the existence of something new, not accounted for in the current model. Examples include the much-publicized discoveries of the Higgs boson by experiments at CERN’s Large Hadron Collider and of gravitational waves by the Advanced LIGO detector.

“Scientific analyses are iterative processes, in which we make a series of small adjustments to theoretical models until the models accurately describe the experimental data,” says Elisabeth Krause, a postdoc at the Kavli Institute for Particle Astrophysics and Cosmology, which is jointly operated by Stanford University and the Department of Energy’s SLAC National Accelerator Laboratory. “At each step of an analysis, there is the danger that prior knowledge guides the way we make adjustments. Blind analyses help us make independent and better decisions.”

Krause was the main organizer of a recent workshop at KIPAC that looked into how blind analyses could be incorporated into next-generation astronomical surveys that aim to determine more precisely than ever what the universe is made of and how its components have driven cosmic evolution.

Black boxes and salt

One outcome of the workshop was a finding that there is no one-size-fits-all approach, says KIPAC postdoc Kyle Story, one of the event organizers. “Blind analyses need to be designed individually for each experiment.”

The way the blinding is done needs to leave researchers with enough information to allow a meaningful analysis, and it depends on the type of data coming out of a specific experiment.

A common approach is to base the analysis on only some of the data, excluding the part in which an anomaly is thought to be hiding. The excluded data is said to be in a “black box” or “hidden signal box.”

Take the search for the Higgs boson. Using data collected with the Large Hadron Collider until the end of 2011, researchers saw hints of a bump as a potential sign of a new particle with a mass of about 125 gigaelectronvolts. So when they looked at new data, they deliberately quarantined the mass range around this bump and focused on the remaining data instead.

They used that data to make sure they were working with a sufficiently accurate model. Then they “opened the box” and applied that same model to the untouched region. The bump turned out to be the long-sought Higgs particle.
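
In code, the "hidden signal box" idea is simple. The sketch below uses an invented toy spectrum and window; a real analysis would extrapolate a carefully validated background fit into the box rather than just count events.

```python
import numpy as np

# "Hidden signal box" blinding on an invented toy spectrum: quarantine the
# region where the signal is suspected, tune the background model on the
# rest, and only open the box once the analysis is frozen.
rng = np.random.default_rng(7)
masses = 100.0 + rng.exponential(scale=50.0, size=50_000)   # toy masses, GeV

blind_lo, blind_hi = 120.0, 130.0           # window around the suspected bump
in_box = (masses > blind_lo) & (masses < blind_hi)

sidebands = masses[~in_box]   # all the analysts see during development
hidden_box = masses[in_box]   # untouched until model and cuts are frozen

# ... develop the selection and background model on `sidebands` only ...

# "Opening the box": only now is the quarantined region inspected and
# compared with the frozen background prediction.
print(f"{hidden_box.size} events quarantined out of {masses.size} total")
```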

That worked well for the Higgs researchers. However, as scientists involved with the Large Underground Xenon experiment reported at the workshop, the “black box” method of blind analysis can cause problems if the data you’re expressly not looking at contains rare events crucial to figuring out your model in the first place.

LUX has recently completed one of the world’s most sensitive searches for WIMPs—hypothetical particles of dark matter, an invisible form of matter that is five times more prevalent than regular matter. LUX scientists have done a lot of work to guard LUX against background particles—building the detector in a cleanroom, filling it with thoroughly purified liquid, surrounding it with shielding and installing it under a mile of rock. But a few stray particles make it through nonetheless, and the scientists need to look at all of their data to find and eliminate them.

For that reason, LUX researchers chose a different blinding approach for their analyses. Instead of using a “black box,” they use a process called “salting.”

LUX scientists not involved in the most recent LUX analysis added fake events to the data—simulated signals that look just like real ones. Just like the patients in a blind drug trial, the LUX scientists didn’t know whether they were analyzing real or placebo data. Once the analysis was complete, the scientists who did the “salting” revealed which events were fake.
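
A minimal sketch of salting, with invented numbers: one person mixes simulated events into the data and keeps the key, and the analysts learn only afterward which events were salt.

```python
import numpy as np

# Salting sketch with invented numbers: fake events are mixed into the
# real data, and the key to which is which stays with the "salter".
rng = np.random.default_rng(123)

real_events = rng.uniform(0.0, 50.0, size=1_000)   # toy recoil energies, keV
salt_events = rng.normal(15.0, 2.0, size=20)       # injected WIMP-like fakes

mixed = np.concatenate([real_events, salt_events])
is_salt = np.concatenate([np.zeros(1_000, dtype=bool), np.ones(20, dtype=bool)])

order = rng.permutation(mixed.size)   # shuffle so the salt is not obvious
dataset, key = mixed[order], is_salt[order]

# Analysts work on `dataset`; `key` is revealed only after the analysis.
genuine = dataset[~key]
print(f"{key.sum()} salt events revealed; {genuine.size} real events remain")
```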

A similar technique was used by LIGO scientists, who eventually made the first detection of extremely tiny ripples in space-time called gravitational waves.

High-stakes astronomical surveys

The Blind Analysis workshop at KIPAC focused on future sky surveys that will make unprecedented measurements of dark energy and the Cosmic Microwave Background—observations that will help cosmologists better understand the evolution of our universe.

Dark energy is thought to be a force that is causing the universe to expand faster and faster as time goes by. The CMB is a faint microwave glow spread out over the entire sky. It is the oldest light in the universe, left over from the time the cosmos was only 380,000 years old.

To shed light on the mysterious properties of dark energy, the Dark Energy Science Collaboration is preparing to use data from the Large Synoptic Survey Telescope, which is under construction in Chile. With its unique 3.2-gigapixel camera, LSST will image billions of galaxies, the distribution of which is thought to be strongly influenced by dark energy.

“Blinding will help us look at the properties of galaxies picked for this analysis independent of the well-known cosmological implications of preceding studies,” DESC member Krause says. One way the collaboration plans on blinding its members to this prior knowledge is to distort the images of galaxies before they enter the analysis pipeline.
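
Here is a rough sketch of what catalog-level blinding could look like, assuming a simple multiplicative scheme; it is illustrative only, not the DESC pipeline.

```python
import numpy as np

# Catalog-level blinding sketch: scale measured galaxy shape parameters by
# a hidden factor so results cannot be compared with known cosmology until
# unblinding. Illustrative only, not the DESC pipeline.
rng = np.random.default_rng(2024)
ellipticities = rng.normal(0.0, 0.3, size=100_000)   # toy shape catalog

secret = np.random.default_rng().uniform(0.9, 1.1)   # drawn once, stored secretly
blinded = ellipticities * secret                     # what the pipeline sees

# ... the full analysis runs on `blinded` ...

unblinded = blinded / secret                         # applied only at the end
print("catalogs agree after unblinding:", np.allclose(unblinded, ellipticities))
```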

Not everyone in the scientific community is convinced that blinding is necessary. Blind analyses are more complicated to design than non-blind analyses and take more time to complete. Some scientists participating in blind analyses inevitably spend time looking at fake data, which can feel like a waste.

Yet others strongly advocate for going blind. KIPAC researcher Aaron Roodman, a particle-physicist-turned-astrophysicist, has been using blinding methods for the past 20 years.

“Blind analyses have already become pretty standard in the particle physics world,” he says. “They’ll also be crucial for taking bias out of next-generation cosmological surveys, particularly when the stakes are high. We’ll only build one LSST, for example, to provide us with unprecedented views of the sky.”

CERN unveils new linear accelerator

Linac 4 will replace an older accelerator as the first step in the complex that includes the LHC.

Linac 4, CERN's newest accelerator acquisition since the Large Hadron Collider (LHC), was inaugurated today. (M.Brice/CERN)

At a ceremony today, the CERN European research center inaugurated its newest accelerator.

Linac 4 will eventually become the first step in CERN’s accelerator chain, delivering proton beams to a wide range of experiments, including those at the Large Hadron Collider.

After an extensive testing period, Linac 4 will be connected to CERN’s accelerator complex during a long technical shutdown in 2019-20. Linac 4 will replace Linac 2, which was put into service in 1978. Linac 4 will feed the CERN accelerator complex with particle beams of higher energy.

“We are delighted to celebrate this remarkable accomplishment,” says CERN Director General Fabiola Gianotti. “Linac 4 is a modern injector and the first key element of our ambitious upgrade program, leading to the High-Luminosity LHC. This high-luminosity phase will considerably increase the potential of the LHC experiments for discovering new physics and measuring the properties of the Higgs particle in more detail.”

“This is an achievement not only for CERN, but also for the partners from many countries who contributed to designing and building this new machine,” says CERN Director for Accelerators and Technology Frédérick Bordry. “Today we also celebrate and thank the wide international collaboration that led this project, demonstrating once again what can be accomplished by bringing together the efforts of many nations.”

The linear accelerator is the first essential element of an accelerator chain. In the linear accelerator, the particles are produced and receive the initial acceleration. The density and intensity of the particle beams are also shaped in the linac. Linac 4 is an almost 90-meter-long machine sitting 12 meters below the ground. It took nearly 10 years to build it.

Linac 4 will send negative hydrogen ions (hydrogen atoms with an extra electron) to CERN’s Proton Synchrotron Booster, which further accelerates the ions and strips off their electrons, leaving bare protons. Linac 4 will bring the beam up to an energy of 160 million electronvolts, more than three times the energy of its predecessor. The increase in energy, together with the use of hydrogen ions, will make it possible to double the beam intensity delivered to the LHC, contributing to an increase in the luminosity of the LHC by 2021.

Luminosity is a measure of how many particle collisions occur within a defined amount of time. The peak luminosity of the LHC is planned to be increased by a factor of 5 by the year 2025. This will make it possible for the experiments to accumulate about 10 times more data over the period 2025 to 2035 than before.
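
The practical meaning of luminosity comes from a simple relation: the expected number of events of a given type equals the production cross section multiplied by the integrated luminosity. The round numbers below (a roughly 50-picobarn Higgs cross section, 40 inverse femtobarns of data) are illustrative.

```python
# Expected event count: N = cross section x integrated luminosity.
# Illustrative round numbers; 1 picobarn = 1,000 femtobarns.
sigma_fb = 50.0 * 1_000.0    # ~50 pb Higgs production cross section, in fb
lumi_inv_fb = 40.0           # integrated luminosity, in inverse femtobarns

n_expected = sigma_fb * lumi_inv_fb
print(f"roughly {n_expected:,.0f} Higgs bosons produced")   # ~2,000,000
```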

Editor's note: This article is based on a CERN press release.

Understanding the unknown universe

The authors of We Have No Idea remind us that there are still many unsolved mysteries in science.

What is dark energy? Why aren’t we made of antimatter? How many dimensions are there? 

These are a few of the many unanswered questions that Jorge Cham, creator of the online comic Piled Higher and Deeper, and Daniel Whiteson, an experimental particle physicist at the University of California, Irvine, explain in their new book, We Have No Idea. In the process, they remind readers of one key point: When it comes to our universe, there’s a lot we still don’t know. 

Jorge Cham

The duo started working together in 2008 after Whiteson reached out to Cham, asking if he’d be willing to help create physics cartoons. “I always thought physics was well connected to the way comics work,” Whiteson says. “Because, what’s a Feynman diagram but a little cartoon of particles hitting each other?” (Feynman diagrams are pictures commonly used in particle physics papers that represent the interactions of subatomic particles.)

Daniel Whiteson

Before working on this book, the pair made a handful of popular YouTube videos on topics like dark matter, extra dimensions and the Higgs boson. Many of these subjects are also covered in We Have No Idea.

One of the main motivators of this latest project was to address a “certain apathy toward science,” Cham says. “I think we both came into it having this feeling that the general public either thinks scientists have everything figured out, or they don't really understand what scientists are doing.” 

To get at this issue, the pair focused on topics that even someone without a science background could find compelling. “You don’t need 10 years of physics background to know [that] questions about how the universe started or what it’s made of are interesting,” Whiteson says. “We tried to find questions that were gut-level approachable.”

Another key theme of the book, the authors say, is the line between what science can and cannot tell us. While some of the possible solutions to the universe’s mysteries have testable predictions, others (such as string theory) currently do not. “We wanted questions that were accessible yet answerable,” says Whiteson. “We wanted to show people that there were deep, basic, simple questions that we all had, but that the answers were out there.” 

Many scientists are hard at work trying to fill the gaping holes in our knowledge about the universe. Particle physicists, for example, are exploring a number of these questions, such as those about the nature of antimatter and mass.

Artwork by Jorge Cham

Some lines of inquiry have brought different research communities together. Dark matter searches, for example, were primarily the realm of cosmologists, who probe large-scale structures of the universe. However, as the focus shifted to finding out what particle—or particles—dark matter was made of, this area of study started to attract astrophysicists as well. 

Why are people trying to answer these questions? “I think science is an expression of humanity and our curiosity to know the answers to basic questions we ask ourselves: Who are we? Why are we here? How does the world work?” Whiteson says. “On the other hand, questions like these lead to understanding, and understanding leads to being able to have greater power over the environment to solve our problems.”

In the very last chapter of the book, the authors explain the idea of a “testable universe,” or the parts of the universe that fall within the bounds of science. In the Stone Age, when humans had very few tools at their disposal, the testable universe was very small. But it grew as people built telescopes, satellites and particle colliders, and it continues to expand with ongoing advances in science and technology. “That’s the exciting thing,” Cham says. “Our ability to answer these questions is growing.”

Some mysteries of the universe still live in the realm of philosophy. But tomorrow, next year or a thousand years from now, a scientist may come along and devise an experiment that will be able to find the answers.   

“We’re in a special place in history when most of the world seems explained,” Whiteson says. Thousands of years ago, basic questions, such as why fire burns or where rain comes from, were still largely a mystery. “These days, all those mysteries seem answered, but the truth is, there’s a lot of mysteries left. [If] you want to make a massive imprint on human intellectual history, there’s plenty of room for that.”

Sterile neutrino search hits roadblock at reactors

A new result from the Daya Bay collaboration reveals both limitations and strengths of experiments studying antineutrinos at nuclear reactors.

As nuclear reactors burn through fuel, they produce a steady flow of particles called neutrinos. Neutrinos interact so rarely with other matter that they can flow past the steel and concrete of a power plant’s containment structures and keep on moving through anything else that gets in their way.

Physicists interested in studying these wandering particles have taken advantage of this fact by installing neutrino detectors nearby. A recent result using some of these detectors demonstrated both their limitations and strengths.

The reactor antineutrino anomaly

In 2011, a group of theorists noticed that several reactor-based neutrino experiments had been publishing the same surprising result: They weren’t detecting as many neutrinos as they thought they would.

Or rather, to be technically correct, they weren’t seeing as many antineutrinos as they thought they would; nuclear reactors actually produce the antimatter partners of the elusive particles. About 6 percent of the expected antineutrinos just weren’t showing up. They called it “the reactor antineutrino anomaly.”

The case of the missing neutrinos was a familiar one. In the 1960s, the Davis experiment, located in the Homestake Mine in South Dakota, reported a shortage of neutrinos coming from processes in the sun. Other experiments confirmed the finding. In 2001, the Sudbury Neutrino Observatory in Ontario demonstrated that the missing neutrinos weren’t missing at all; they had only undergone a bit of a costume change.

Neutrinos come in three types. Scientists discovered that neutrinos could transform from one type to another. The missing neutrinos had changed into a different type of neutrino that the Davis experiment couldn’t detect.

Since 2011, scientists have wondered whether the reactor antineutrino anomaly was a sign of an undiscovered type of neutrino, one that was even harder to detect, called a sterile neutrino.

A new result from the Daya Bay experiment in China not only casts doubt on that theory, it also casts doubt on the idea that scientists understand their model of reactor processes well enough at this time to use it to search for sterile neutrinos.

The word from Daya Bay

The Daya Bay experiment studies antineutrinos coming from six nuclear reactors on the southern coast of China, about 35 miles northeast of Hong Kong. The reactors are powered by the fission of uranium. Over time, the amount of uranium inside the reactor decreases while the amount of plutonium increases. The fuel is changed—or cycled—about every 18 months.

The main goal of the Daya Bay experiment was to look for the rarest of the known neutrino oscillations. It did that, making a groundbreaking discovery after just nine weeks of data-taking.

But that wasn’t the only goal of the experiment. “We realized right from the beginning that it is important for Daya Bay to address as many interesting physics problems as possible,” says Daya Bay co-spokesperson Kam-Biu Luk of the University of California, Berkeley and the US Department of Energy’s Lawrence Berkeley National Laboratory.

For this result, Daya Bay scientists took advantage of their enormous collection of antineutrino data to expand their investigation to the reactor antineutrino anomaly.

Using data from more than 2 million antineutrino interactions and information about when the power plants refreshed the uranium in each reactor, Daya Bay physicists compared the measurements of antineutrinos coming from different parts of the fuel cycle: early ones dominated by uranium through later ones dominated by both uranium and plutonium.

In theory, the type of fuel producing the antineutrinos should not affect the rate at which they transform into sterile neutrinos. According to Bob Svoboda, chair of the Department of Physics at the University of California, Davis, “a neutrino wouldn’t care how it got made.” But Daya Bay scientists found that the shortage of antineutrinos existed only in processes dominated by uranium.
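
The structure of that comparison can be sketched as a simple linear model: write the measured antineutrino yield per fission as a mix of uranium-235 and plutonium-239 contributions weighted by their evolving fission fractions, then fit the per-isotope yields. All numbers below are invented, and the real Daya Bay analysis involves far more ingredients.

```python
import numpy as np

# Fuel-evolution sketch with invented numbers: model the measured
# antineutrino yield per fission as a mix of U-235 and Pu-239
# contributions weighted by fission fractions that drift over the cycle.
u235_fraction = np.linspace(0.75, 0.50, 8)   # early to late in the fuel cycle
pu239_fraction = 1.0 - u235_fraction         # simplification: two isotopes only

true_u235, true_pu239 = 6.1, 4.4             # toy per-isotope yields
rng = np.random.default_rng(5)
measured = (u235_fraction * true_u235 + pu239_fraction * true_pu239
            + rng.normal(0.0, 0.02, size=8))  # add measurement noise

# Linear least squares recovers the per-isotope yields; a deficit
# concentrated in the U-235 term would point at the U-235 prediction.
A = np.column_stack([u235_fraction, pu239_fraction])
fit, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(f"fitted U-235 yield: {fit[0]:.2f}, fitted Pu-239 yield: {fit[1]:.2f}")
```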

Their conclusion is that, once again, the missing neutrinos aren’t actually missing. This time, the problem seems to stem from our understanding of how uranium burns in nuclear power plants. The predictions for how many antineutrinos the scientists should detect may simply have been too high.

“Most of the problem appears to come from the uranium-235 model (uranium-235 is a fissile isotope of uranium), not from the neutrinos themselves,” Svoboda says. “We don’t fully understand uranium, so we have to take any anomaly we measured with a grain of salt.”

This knock against the reactor antineutrino anomaly does not disprove the existence of sterile neutrinos. Other, non-reactor experiments have seen different possible signs of their influence. But it does put a damper on the only evidence of sterile neutrinos to have come from reactor experiments so far.

Other reactor neutrino experiments, such as NEOS in South Korea and PROSPECT in the United States, will fill in some missing details. NEOS scientists directly measured antineutrinos coming from reactors in the Hanbit nuclear power complex using a detector placed about 80 feet away, a distance some scientists believe is optimal for detecting sterile neutrinos should they exist. PROSPECT scientists will make the first precision measurement of antineutrinos coming from a highly enriched uranium core, one that does not produce plutonium as it burns.

A silver lining

The Daya Bay result offers the most detailed demonstration yet of scientists’ ability to use neutrino detectors to peer inside running nuclear reactors.

“As a study of reactors, this is a tour de force,” says theorist Alexander Friedland of SLAC National Accelerator Laboratory. “This is an explicit demonstration that the composition of the reactor fuel has an impact on the neutrinos.”

Some scientists are interested in monitoring nuclear power plants to find out if nuclear fuel is being diverted to build nuclear weapons.

“Suppose I declare my reactor produces 100 kilograms of plutonium per year,” says Adam Bernstein of the University of Hawaii and Lawrence Livermore National Laboratory. “Then I operate it in a slightly different way, and at the end of the year I have 120 kilograms.” That 20-kilogram surplus, left unmeasured, could potentially be moved into a weapons program.

Current monitoring techniques involve checking what goes into a nuclear power plant before the fuel cycle begins and then checking what comes out after it ends. In the meantime, what happens inside is a mystery.

Neutrino detectors allow scientists to understand what’s going on in a nuclear reactor in real time.

Scientists have known for decades that neutrino detectors could be useful for nuclear nonproliferation purposes. Scientists studying neutrinos at the Rovno Nuclear Power Plant in Ukraine first demonstrated that neutrino detectors could differentiate between uranium and plutonium fuel.

Most of the experiments have done this by looking at changes in the aggregate number of antineutrinos a detector records. Daya Bay showed that neutrino detectors could also track the plutonium inventory in nuclear fuel by studying the energy spectrum of the antineutrinos produced.

“The most likely use of neutrino detectors in the near future is in so-called ‘cooperative agreements,’ where a $20-million-scale neutrino detector is installed in the vicinity of a reactor site as part of a treaty,” Svoboda says. “The site can be monitored very reliably without having to make intrusive inspections that bring up issues of national sovereignty.”

Luk says he is dubious that the idea will take off, but he agrees that Daya Bay has shown that neutrino detectors can give an incredibly precise report. “This result is the best demonstration so far of using a neutrino detector to probe the heartbeat of a nuclear reactor.”

Mystery glow of Milky Way likely not dark matter

According to the Fermi LAT collaboration, the galaxy’s excessive gamma-ray glow likely comes from pulsars, the remains of collapsed ancient stars.

Artist's rendering of a spinning pulsar emitting gamma rays

A mysterious gamma-ray glow at the center of the Milky Way is most likely caused by pulsars, the incredibly dense, rapidly spinning cores of collapsed ancient stars that were up to 30 times more massive than the sun.

That’s the conclusion of a new analysis by an international team of astrophysicists on the Fermi LAT collaboration. The findings cast doubt on previous interpretations of the signal as a potential sign of dark matter, a form of matter that accounts for 85 percent of all matter in the universe but that so far has evaded detection.

“Our study shows that we don’t need dark matter to understand the gamma-ray emissions of our galaxy,” says Mattia Di Mauro from the Kavli Institute for Particle Astrophysics and Cosmology, a joint institute of Stanford University and the US Department of Energy's SLAC National Accelerator Laboratory. “Instead, we have identified a population of pulsars in the region around the galactic center, which sheds new light on the formation history of the Milky Way.”

Di Mauro led the analysis, which examined the glow using the Large Area Telescope on NASA’s Fermi Gamma-ray Space Telescope, an instrument that has been orbiting Earth since 2008. The LAT, a sensitive “eye” for gamma rays, the most energetic form of light, was conceived of and assembled at SLAC, which also hosts its operations center.

The collaboration’s findings, submitted to The Astrophysical Journal for publication, are available as a preprint.   

A mysterious glow

Dark matter is one of the biggest mysteries of modern physics. Researchers know that dark matter exists because it bends light from distant galaxies and affects how galaxies rotate. But they don’t know what the substance is made of. Most scientists believe it’s composed of yet-to-be-discovered particles that almost never interact with regular matter other than through gravity, making it very hard to detect them.

One way scientific instruments might catch a glimpse of dark matter particles is when the particles either decay or collide and destroy each other. “Widely studied theories predict that these processes would produce gamma rays,” says Seth Digel, head of KIPAC’s Fermi group. “We search for this radiation with the LAT in regions of the universe that are rich in dark matter, such as the center of our galaxy.”

Previous studies have indeed shown that there are more gamma rays coming from the galactic center than expected, fueling some scientific papers and media reports that suggest the signal might hint at long-sought dark matter particles. However, gamma rays are produced in a number of other cosmic processes, which must be ruled out before any conclusion about dark matter can be drawn. This is particularly challenging because the galactic center is extremely complex, and astrophysicists don’t know all the details of what’s going on in that region. 

Most of the Milky Way’s gamma rays originate in gas between the stars that is lit up by cosmic rays, charged particles produced in powerful star explosions called supernovae. This creates a diffuse gamma-ray glow that extends throughout the galaxy. Gamma rays are also produced by supernova remnants, pulsars—collapsed stars that emit “beams” of gamma rays like cosmic lighthouses—and more exotic objects that appear as points of light.  

“Two recent studies by teams in the US and the Netherlands have shown that the gamma-ray excess at the galactic center is speckled, not smooth as we would expect for a dark matter signal,” says KIPAC’s Eric Charles, who contributed to the new analysis. “Those results suggest the speckles may be due to point sources that we can’t see as individual sources with the LAT because the density of gamma-ray sources is very high and the diffuse glow is brightest at the galactic center.”

Remains of ancient stars

The new study takes the earlier analyses to the next level, demonstrating that the speckled gamma-ray signal is consistent with pulsars.

“Considering that about 70 percent of all point sources in the Milky Way are pulsars, they were the most likely candidates,” Di Mauro says. “But we used one of their physical properties to come to our conclusion. Pulsars have very distinct spectra—that is, their emissions vary in a specific way with the energy of the gamma rays they emit. Using the shape of these spectra, we were able to model the glow of the galactic center correctly with a population of about 1,000 pulsars and without introducing processes that involve dark matter particles.”
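
That spectral argument can be written down compactly: gamma-ray pulsars typically follow a power law with an exponential cutoff at a few gigaelectronvolts. The parameters below are textbook-typical values, not the ones fitted in the paper.

```python
import numpy as np

# Pulsar spectral shape: a power law with an exponential cutoff at a few
# GeV. Typical textbook parameters, not the values fitted in the paper.
def pulsar_spectrum(energy_gev, index=1.5, cutoff_gev=3.0):
    """dN/dE for a power law with an exponential cutoff (arbitrary norm)."""
    return energy_gev**(-index) * np.exp(-energy_gev / cutoff_gev)

energies = np.logspace(-0.5, 1.5, 50)   # about 0.3 to 30 GeV
population = 1_000                      # rough size of the modeled population
total_flux = population * pulsar_spectrum(energies)

# The observed excess is compared against shapes like this one (and
# against dark matter annihilation spectra) to see which fits better.
peak = energies[np.argmax(energies**2 * total_flux)]
print(f"toy E^2 dN/dE spectrum peaks near {peak:.1f} GeV")
```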

The team is now planning follow-up studies with radio telescopes to determine whether the identified sources are emitting their light as a series of brief light pulses—the trademark that gives pulsars their name.

Discoveries in the halo of stars around the center of the galaxy, the oldest part of the Milky Way, also reveal details about the evolution of our galactic home, just as ancient remains teach archaeologists about human history.

“Isolated pulsars have a typical lifetime of 10 million years, which is much shorter than the age of the oldest stars near the galactic center,” Charles says. “The fact that we can still see gamma rays from the identified pulsar population today suggests that the pulsars are in binary systems with companion stars, from which they leach energy. This extends the life of the pulsars tremendously.”    

Dark matter remains elusive

The new results add to other data that are challenging the interpretation of the gamma-ray excess as a dark matter signal.

“If the signal were due to dark matter, we would expect to see it also at the centers of other galaxies,” Digel says. “The signal should be particularly clear in dwarf galaxies orbiting the Milky Way. These galaxies have very few stars, typically don’t have pulsars and are held together because they have a lot of dark matter. However, we don’t see any significant gamma-ray emissions from them.”

The researchers believe that a recently discovered strong gamma-ray glow at the center of the Andromeda galaxy, the major galaxy closest to the Milky Way, may also be caused by pulsars rather than dark matter. 

But the last word may not have been spoken. Although the Fermi-LAT team studied a large area of 40 degrees by 40 degrees around the Milky Way’s galactic center (the diameter of the full moon is about half a degree), the extremely high density of sources in the innermost four degrees makes it very difficult to see individual ones and rule out a smooth, dark matter-like gamma-ray distribution, leaving some room for dark matter signals to hide.

This work was funded by NASA and the DOE Office of Science, as well as agencies and institutes in France, Italy, Japan and Sweden.

Editor's note: A version of this article was originally published by SLAC National Accelerator Laboratory.

#AskSymmetry Twitter chat with Tulika Bose

See Boston University physicist Tulika Bose's answers to readers’ questions about research at the Large Hadron Collider.

Freeze frame of physicist Tulika Bose
