Predicting the Subatomic Future

Lattice QCD on Hyak
Martin Savage, Professor of Physics, UW

In this profile, physics professor Martin Savage discusses his work with Lattice Quantum Chromodynamics, which:

  • Was awarded $750,000 by the NSF for computing equipment
  • Runs on Hyak in preparation for peta- and exascale computing
  • Operates at space-time scales of femtoseconds and femtometers

Savage's work aims to "get [us] to the stage where [we] can calculate reliably anything of importance in nuclear systems..."

Learn how eScience is helping UW faculty model nuclear reactions.

Martin Savage, Professor of Physics at UW, is a co-PI on a team awarded an NSF grant of ~$750k for Major Research Instrumentation (MRI)1. The "instrument" in this case is the team’s set of 140 Hyak nodes, 140TB of fast shared storage, and associated equipment, which the team uses to develop next-generation computational techniques for nuclear and particle physics. Hyak is indispensable as the team prepares to run at the petascale in the near future and at the exascale within the decade.


Hyak’s considerable capability supports the development of techniques required for precision calculations of nuclear reaction rates; ultimately, the goal is to calculate those rates more accurately than is possible today. The numerical solution of quantum chromodynamics, the theory underlying the strong force, has potential impact in multiple areas, and it is hoped that it will eventually refine the larger-scale calculations used to model complex nuclear systems, for instance in the design of nuclear reactors.

“You want to get to the stage where you can calculate reliably anything of importance in nuclear systems and in nuclear astrophysics. You want to develop a computational tool that will give you results with quantifiable uncertainties that can be systematically eliminated,” says Savage.

Strong Forces, Quarks, Gluons

Savage’s work explores the strong nuclear force among quarks and gluons, the subatomic particles that form atomic nuclei. The strong force between quarks is the most powerful force in nature, binding quarks and gluons together to form protons, neutrons, pions, and more exotic particles. It also gives rise to a force between the neutrons and protons that overpowers the electromagnetic repulsion between protons to bind the atomic nucleus together. The long-range nature of the electromagnetic force (combined with the Pauli exclusion principle, an entirely quantum phenomenon) eventually overcomes the short-range nature of the strong force to destabilize very large nuclei. (This underlies the process of fission and explains why superheavy elements have not been observed.)


Fig. 1  An atom is composed of a dense nucleus at its center surrounded by a diffuse cloud of electrons. The nucleus is made up of neutrons and protons that interact with each other via the short-distance strong force. The protons also interact with one another via the long-distance Coulomb (electromagnetic) force. The neutrons and protons are composite objects formed from quarks and gluons, whose dynamics are governed by the underlying theory called quantum chromodynamics.


Unlike some other fields of research, where large data sets remain to be explored to uncover empirical rules that describe the data, the underlying laws that determine nuclear physics were established and verified during the last century through substantial experimental and theoretical investigations. However, knowing the laws that govern the strong forces between quarks and gluons, and using them to make precise predictions for nuclear processes, are two different things. Today, researchers can use their knowledge of the laws to construct well-defined calculations with the goal of predicting nuclear reaction rates with precision. Savage notes, “We can’t get at some important reactions or structures experimentally [or analytically].”

Professor Savage uses Lattice Quantum Chromodynamics (Lattice QCD), a method physicists use to investigate the subatomic domain. Lattice QCD discretizes the space-time continuum into a four-dimensional grid (three spatial dimensions plus the dimension of time). The quarks live on the points of the grid, and the gluons live on the links between the points. The calculations involve taking “snapshots” of the quantum fluctuations in the quark and gluon fields as the particles interact.
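To make that layout concrete, here is a minimal toy sketch in Python of how such a grid can be represented. The lattice sizes, the cold-start initialization, and the plaquette observable below are illustrative assumptions for this article, not NPLQCD’s production code, which uses highly optimized parallel libraries.

    import numpy as np

    # Toy 4-D lattice: T time slices and L points in each spatial direction.
    L, T = 4, 8
    # One 3x3 complex SU(3) matrix per site and per direction (the gluon links).
    U = np.zeros((T, L, L, L, 4, 3, 3), dtype=complex)
    U[..., :, :] = np.eye(3)  # "cold start": every link set to the identity

    def plaquette(U, site, mu, nu):
        """Trace of the elementary 1x1 loop of links at `site` in the mu-nu plane."""
        # Neighboring sites, with periodic wrap-around at the lattice boundary
        # (the first four axes of U are the lattice axes, so U.shape[mu] works).
        fwd_mu = list(site); fwd_mu[mu] = (fwd_mu[mu] + 1) % U.shape[mu]
        fwd_nu = list(site); fwd_nu[nu] = (fwd_nu[nu] + 1) % U.shape[nu]
        loop = (U[tuple(site)][mu]                # step forward in mu
                @ U[tuple(fwd_mu)][nu]            # then forward in nu
                @ U[tuple(fwd_nu)][mu].conj().T   # back in mu
                @ U[tuple(site)][nu].conj().T)    # back in nu, closing the loop
        return np.trace(loop).real / 3.0

    print(plaquette(U, site=(0, 0, 0, 0), mu=0, nu=1))  # 1.0 for a cold start

Quark fields would similarly be stored on the grid points; a real calculation replaces the identity links with Monte Carlo “snapshots” of the fluctuating gluon field and averages observables over many of them.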

Fig. 2  The quantum fluctuations in the quark and gluon fields are central to the properties and dynamics of nuclear systems, and the gluon fields exhibit fluctuations in their topology. This movie shows the results of a lattice QCD calculation of the topological charge density in the gluon field in a finite volume of space as a function of time. Regions of light green correspond to negative fluctuations, while regions of blue correspond to positive fluctuations.


These simulations involve time and space on a scale that’s challenging for most of us to imagine. Current Lattice QCD calculations work in spatial extents of roughly 4×10⁻¹⁵ meters (four femtometers), and time is measured in femtoseconds. For context, a femtosecond is to a second what a second is to about 31.7 million years, and four femtometers are to a meter what a half-millimeter grain of sand is to the distance between the Earth and the Sun.
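The analogies are easy to verify with back-of-the-envelope arithmetic; here is a quick check (the grain-of-sand width is an assumed round number, not a figure from Savage):

    # Femtoseconds in a second vs. seconds in a year.
    fs_per_second = 1e15
    seconds_per_year = 365.25 * 24 * 3600
    print(fs_per_second / seconds_per_year)   # ~3.17e7, i.e. about 31.7 million years

    # Scaling a 4-femtometer lattice extent up by the Earth-Sun distance (1 AU).
    lattice_extent_m = 4e-15
    earth_sun_m = 1.496e11
    print(lattice_extent_m * earth_sun_m)     # ~6e-4 m: a half-millimeter sand grain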

Prep for Petascale & the Speed of Science

Having a high-performance computing (HPC) resource such as Hyak available locally was critical to Savage and his team winning the MRI NSF grant.

“Nuclear physics programs require exaflops machines. Our goal right now is to get ourselves prepared for the exascale via the petascale (for instance, the NSF NCSA Blue Waters facility on the campus of the University of Illinois at Urbana-Champaign).  Hyak is not an exascale or a petascale facility. But it’s set up in such a way that we can test and develop our ideas. We can look at how our codes scale with the number of compute cores. If we run a code on one node, then on 100 nodes, we want to know: is it running 100 times faster on 100 nodes, or 10 times faster? If it’s running only 10 times faster, then we’ve got a problem.”
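What Savage describes is a standard strong-scaling test: run the same fixed-size problem on increasing node counts and compare the speedup to the ideal. A minimal sketch of the bookkeeping, using made-up wall-clock timings rather than actual Hyak measurements:

    # Strong scaling: same total problem, increasing node counts.
    def scaling_report(timings):
        """timings: dict mapping node count -> wall-clock seconds for one run."""
        base_nodes = min(timings)
        base_time = timings[base_nodes]
        for nodes in sorted(timings):
            speedup = base_time / timings[nodes]
            efficiency = speedup / (nodes / base_nodes)
            print(f"{nodes:4d} nodes: {speedup:6.1f}x speedup, {efficiency:6.1%} efficiency")

    # An ideal code is ~100x faster on 100 nodes; a communication-bound code
    # that manages only ~10x is exactly the "problem" Savage describes.
    scaling_report({1: 1000.0, 10: 120.0, 100: 95.0})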

With a local HPC cluster, the process of trial and error is sped up dramatically compared to using shared facilities at national supercomputer centers. A local cluster also lets researchers stay focused on development rather than getting bogged down in writing, and waiting on, grant proposals for compute resources. “It’s amazing to have this kind of resource at the University…when you have a problem you can test it right away, without having to write a proposal and wait three months. Does it work? No, okay, let’s try something else.”

HPC is now central to the goals of nuclear physics, and at least six faculty members (permanent, affiliated, and adjunct), along with a number of postdoctoral fellows, graduate students, and undergraduates, make use of HPC resources available locally and nationally. The large-scale HPC efforts required to accomplish significant goals resemble large experimental collaborations rather than single-investigator programs, because a broad range of skills is required: expertise in nuclear physics, particle physics, applied mathematics, and computer science, along with a good understanding of the hardware. Savage is a member of the Nuclear Physics with Lattice QCD2 (NPLQCD) collaboration, and the Hyak computing resources are being used to achieve NPLQCD’s goals. Most members of NPLQCD are part of USQCD, the national Lattice QCD collaboration and a SciDAC collaboration.

The UW has a large research effort in nuclear physics, both experimental (CENPA) and theoretical. In addition to the Nuclear Theory Group in the physics department, of which Savage is a member, the national Institute for Nuclear Theory (INT) is located in the Physics Building, giving a combined total of seven permanent faculty members, several research faculty, and a number of postdoctoral fellows, graduate students, and undergraduates. The INT hosts workshops and meetings regularly, with 20 to 50 attendees being standard.

Tackling Complex Problems

Savage observes that resources like Hyak allow scientists to “tackle increasingly complex problems. They allow you to begin to simulate entire complex systems. To simulate a nuclear reactor all the way through to the cooling, you need high-performance computing. When it’s questions about energy issues, how to improve the efficiency of combustion, figuring out where waste is occurring, you have to get these things just right. These resources are expensive to build, expensive to run, but in the long run it can save you a large amount of money and allow you to build new facilities that work well and are safe. Computing is a tool that allows you to make progress in these areas. Unfortunately, the recent tragic events in Japan have made crystal clear the importance of acquiring the computational tools for complete system simulation. The ability to simulate all possible scenarios with reliable uncertainty quantification ahead of the deployment of the nuclear reactors in Fukushima would have provided important information for their design. I hope the University support for local computational resources continues, and I hope that they will see from our efforts that this is a good thing.”

Savage has witnessed first-hand the evolution of the UW-supported scientific computing infrastructure from the single compute core of a few years ago to the modern-day large-scale parallel computing environment (Hyak) that is necessary to tackle the most complex scientific challenges. The NPLQCD collaboration wishes to thank the UW eScience Institute team, particularly Chance Reschke, for making this UW-wide scientific research facility the success that it is, and moving the UW toward the frontier of scientific computing.

As Savage imagines the future, the excitement in his voice is clear: “Computing is changing how research gets done. The next 10 years are going to be phenomenal. Amazing things are going to happen. We just have to get ready.”

Footnotes

1 The MRI proposal funded by the NSF, which enabled the purchase of these nodes for Hyak, had David Kaplan (Director of the INT) as PI, with George Bertsch, Aurel Bulgac, Chance Reschke, and Martin Savage as co-PIs. The title of the proposal was “Acquisition of a computer cluster for density functional and lattice QCD investigations into nuclear fission and fusion.”

2 The members of the NPLQCD collaboration are Silas Beane (University of New Hampshire), Emmanuel Chang (University of Barcelona), William Detmold (College of William and Mary), Huey-Wen Lin (UW), Thomas Luu (Lawrence Livermore National Laboratory), Kostas Orginos (College of William and Mary), Assumpta Parreno (University of Barcelona), Aaron Torok (Indiana University), Andre Walker-Loud (Lawrence Berkeley Laboratory), and Martin Savage (UW).

Attachment: QCD_2 - Computer.m4v (17.3 MB)