- Why do we study particle physics?
- What are fundamental particles?
- What are the fundamental forces?
- What are particle accelerators used for?
- What is the LHC?
- How does the LHC work?
- What are some other accelerators/colliders?
- Are there other ways to study particles?
- What do the theories underlying particle physics look like?
- What is the Higgs boson exactly?
- What are some unanswered questions in particle physics?
- What is ‘new physics’?
- Some terms and useful links
Particle physics is the study of the fundamental particulate constituents of nature. Our knowledge of these constituents is important to understand the laws that shape our universe, how they manifest themselves, and why things are the way they are. Because of the scale at which its experiments are conducted and the diverse branches of physics and mathematics its theories draw from, particle physics has many applications – ranging from solid-state physics to medical diagnostics to distributed computing.
Fundamental, or elementary, particles are the smallest constituents of something and aren’t made up of even smaller particles. Particle physics addresses both matter and force in terms of particles, so fundamental particles come in two types as well. Bosons are force-carriers: if a particle is acted upon by a force, then it can also be understood as the particle interacting with the particulate carriers of that force. For example: if the weak force is acting on a particle, then the particle is interacting with W± and Z bosons, which are carriers of the weak force. Fermions are matter-makers: they come together under the influence of others like them as well as bosons to make up matter (or antimatter).
However, a deeper classification of bosons and fermions alludes to their collective properties when they accumulate in large numbers. All particles have a spin quantum number associated with their angular momentum, expressed in multiples of one-half (0, 1/2, 1, 3/2, …). All fermions have half-integer spins (1/2, 3/2, …) and their behaviour is thus defined by the rules of Fermi-Dirac statistics. All bosons have integer spins (0, 1, 2, …) and their behaviour is thus defined by the rules of Bose-Einstein statistics.
One commonly cited difference between the two statistics is that fermions obey Pauli’s exclusion principle while bosons don’t. The principle states that no two fermions in a system can simultaneously possess the same values of the four quantum numbers that define a particle’s state. And because bosons can possess such identical quantum numbers, they can form Bose-Einstein condensates.
There are four forces that occur naturally. In the descending order of their strength, they are the strong nuclear force (holds nuclei together), the electromagnetic force (acts between charged particles), the weak nuclear force (responsible for radioactivity) and gravitation (acts between things having mass). They are respectively mediated by gluons, photons, W±/Z bosons and the hypothetical gravitons. At higher energies, the electromagnetic and weak nuclear forces unify into one common electroweak force. Some physicists think that at even higher energies, beyond those we can access with the current state of experimental physics, all four forces will unify to form one ‘grand unified’ force. The theories that aspire to account for such a unification are called grand unified theories.
Accelerators are used to take advantage of three concepts that let us access the extremely small distances inhabited by particles.
The first is the special theory of relativity, formulated by Albert Einstein in 1905. Its most famous implications are that mass and energy can be converted into each other, and that objects become more massive as they travel at speeds closer to that of light. As a result, when an accelerator accelerates particles, they become more energetic, and the excess energy is used to create new particles. What particles can or can’t be created depends on the experimental conditions as well as what theoretical models allow. The output is also limited by the total energy available during the collision (13 TeV in the LHC’s case).
(Note: The mass-energy equivalence dictates that the energy and mass of a particle are related by a factor of c², c being the speed of light. Specifically, m = E/c². So, when a physicist says a particle weighs 1 electron-volt – a unit of energy relevant to particles – she’s actually saying it weighs 1 eV/c². Secondly: a common benchmark is the proton’s rest mass, about 938 million eV = 0.938 GeV.)
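For the curious, the eV-to-kilogram conversion is just m = E/c² in a few lines of Python (the constants below are the standard SI values):

```python
# Convert a particle's rest energy in eV to a mass in kilograms via m = E/c^2.

EV_TO_JOULE = 1.602176634e-19  # joules per electron-volt (exact, by SI definition)
C = 299_792_458.0              # speed of light in m/s (exact)

def ev_to_kg(energy_ev):
    """Mass in kg of a particle whose rest energy is energy_ev."""
    return energy_ev * EV_TO_JOULE / C**2

# The proton's rest energy of about 938 MeV corresponds to ~1.67e-27 kg.
proton_mass_kg = ev_to_kg(938e6)
print(f"proton mass ≈ {proton_mass_kg:.3e} kg")
```

Running this recovers the familiar textbook value of the proton mass, about 1.67 × 10⁻²⁷ kg.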
The second concept has to do with the principles of quantum mechanics, which allow heavier particles to break down into combinations of lighter particles in a process called decaying. How often they decay into which particles is dictated by parameters called coupling constants. Let’s say a heavier H particle decays into two P particles some of the time, two Q particles at other times, and never into any R particles. If the H and P coupling constant is higher than the H and Q coupling constant, then H will decay into two Ps proportionally more often than it will decay into two Qs. And the H and R coupling constant will be zero. Based on which particle physicists are trying to study, detectors stationed around the accelerator tunnel will be tuned to look for certain combinations of decay products.
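This channel-picking logic is easy to mimic with a toy simulation. Here is a sketch in Python, with entirely made-up branching fractions for the hypothetical H, P, Q and R particles:

```python
import random

# Hypothetical particle H with invented branching fractions (not real data):
# H → PP 60% of the time, H → QQ 40%, H → RR never (that coupling is zero).
BRANCHING = {"PP": 0.60, "QQ": 0.40, "RR": 0.00}

def decay(rng):
    """Pick a decay channel with probability equal to its branching fraction."""
    r = rng.random()
    cumulative = 0.0
    for channel, fraction in BRANCHING.items():
        cumulative += fraction
        if r < cumulative:
            return channel
    return channel  # guard against floating-point rounding

rng = random.Random(42)
counts = {ch: 0 for ch in BRANCHING}
for _ in range(100_000):
    counts[decay(rng)] += 1
print(counts)  # roughly 60,000 PP events, 40,000 QQ events, zero RR events
```

Over many simulated decays the channel counts settle into the ratio of the branching fractions – which is essentially how detectors accumulate evidence about coupling strengths.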
The third concept involves the wave-function. It’s a mathematical function whose magnitude, squared, gives the probability of a particle existing at a particular point in space and time, and it behaves like a wave – i.e. it possesses a frequency and a wavelength. The wavelength corresponds to the distance across which the particle will manifest itself. Since energy and frequency are directly proportional, energy and wavelength are inversely proportional. The implication is that the higher the energy an accelerator is able to push particles to, the smaller the distances it will be capable of probing.
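The energy-to-length conversion can be made concrete with the relation λ ≈ hc/E, which holds for highly relativistic particles (where E ≈ pc):

```python
# For a highly relativistic particle, E ≈ pc, so the de Broglie wavelength
# is λ = h/p ≈ hc/E: double the energy, halve the length scale probed.

H = 6.62607015e-34    # Planck constant in J·s (exact)
C = 299_792_458.0     # speed of light in m/s (exact)
EV = 1.602176634e-19  # joules per electron-volt (exact)

def probe_wavelength_m(energy_ev):
    """Length scale (in metres) probed by a particle of the given energy."""
    return H * C / (energy_ev * EV)

for e in (1e9, 1e12, 13e12):  # 1 GeV, 1 TeV, 13 TeV
    print(f"{e:.0e} eV → {probe_wavelength_m(e):.2e} m")
```

At 1 GeV the wavelength is about 1.2 × 10⁻¹⁵ m – roughly the size of a proton – and at the LHC’s 13 TeV it shrinks by another four orders of magnitude.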
The LHC is the Large Hadron Collider, the world’s most powerful particle accelerator. It accelerates two groups of protons to very high energies, within a tunnel measuring 27 km in circumference, and then smashes them head on. The collisions don’t happen at arbitrary locations but at one of four points around the tunnel – points around which particle detectors are installed. These are named:
- ATLAS – A Toroidal LHC Apparatus
- CMS – Compact Muon Solenoid
- LHCb – LHC-beauty (studies particles associated with the beauty quark/antiquark particles)
- ALICE – A Large Ion Collider Experiment
- TOTEM – Total Elastic and Diffractive Cross-section Measurement
- MoEDAL – Monopole and Exotics Detector at the LHC
- LHCf – LHC-forward (CERN: “uses particles thrown forward by LHC collisions to simulate cosmic rays”)
ATLAS and CMS overlap considerably in the kinds of particles and mechanisms they study, while the rest focus on other aspects. The last three – TOTEM, MoEDAL and LHCf – are also much smaller.
Housed at CERN, the European Organisation for Nuclear Research near Geneva on the Franco-Swiss border, the LHC was first opened for business in 2009. It functioned until early 2013 and then shut down for upgrades over the next two years or so, reopening in May 2015. Before the upgrades, it could accelerate two opposing beams of protons to 4,000 GeV (i.e., 4 TeV) each – bringing the total energy at the moment of collision to 4 TeV × 2 beams = 8 TeV. After the upgrades, the beam energy rose to 6.5 TeV and the collision energy to 13 TeV.
The LHC doesn’t – can’t – accelerate protons from rest to 6.5 TeV. That’s accomplished in stages. First, the linear accelerator LINAC 2 accelerates protons to 50 MeV (0.05 GeV) and injects them into the proton synchrotron booster, which gets them up to 1.4 GeV. The proton synchrotron then takes them to 25 GeV before the super proton synchrotron accelerates them to 450 GeV and sends them into the LHC, which boosts them to the final 6,500 GeV.
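The injection chain can be summarised in a few lines, along with the Lorentz factor γ at each stage – roughly, how many times more massive each proton effectively becomes. (The proton synchrotron, sitting between the booster and the super proton synchrotron, hands over the beam at a nominal 25 GeV.)

```python
# The LHC's proton injection chain, with the Lorentz factor at each stage:
# gamma = 1 + kinetic energy / rest energy, proton rest energy ≈ 0.938 GeV.

PROTON_REST_GEV = 0.938

stages = [
    ("Linac 2", 0.05),
    ("Proton synchrotron booster", 1.4),
    ("Proton synchrotron", 25.0),
    ("Super proton synchrotron", 450.0),
    ("LHC", 6500.0),
]

for name, energy_gev in stages:
    gamma = 1.0 + energy_gev / PROTON_REST_GEV
    print(f"{name:28s} {energy_gev:8.2f} GeV   γ ≈ {gamma:,.1f}")
```

By the final stage each proton is effectively some 7,000 times heavier than it is at rest – which is why ever-stronger magnetic fields are needed to keep the beam on its circular path.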
The LHC tunnel is about 3.8 metres wide, lies 50–175 metres underground and contains two smaller beam pipes that intersect at the four collision points. Each pipe carries one train of protons, accelerated antiparallel to the train in the other pipe. Each train contains about 320 trillion protons divided into just over 2,800 bunches. When the opposing trains are brought together – by then circling the ring about 11,000 times every second – two bunches are made to collide once every 25 nanoseconds.
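Two of these numbers follow directly from the geometry, assuming the protons move at essentially the speed of light (the ring’s precise circumference is 26,659 m):

```python
# Revolution frequency and bunch-crossing rate of the LHC, assuming the
# protons travel at essentially the speed of light around the ring.

C = 299_792_458.0         # speed of light in m/s
CIRCUMFERENCE = 26_659.0  # LHC ring circumference in metres

revs_per_second = C / CIRCUMFERENCE
print(f"revolutions per second ≈ {revs_per_second:,.0f}")  # ≈ 11,245

crossings_per_second = 1 / 25e-9  # one bunch crossing every 25 nanoseconds
print(f"bunch crossings per second ≈ {crossings_per_second:.0e}")  # 4e+07, i.e. 40 MHz
```

That 40 MHz bunch-crossing rate is the firehose the detectors and their trigger systems have to drink from.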
The acceleration itself is done by radio-frequency cavities, which give the protons a small electric kick on every lap until they reach 6.5 TeV. Keeping the beams on their circular path is the job of something called the Lorentz force: when a charged particle like a proton moves through a magnetic field, it experiences a force that makes it move in a curved path. So the LHC uses a gigantic architecture of over 1,600 magnets to steer the beams around the ring as well as keep them focused. These are superconducting electromagnets: their magnetic field strengths increase as more current is passed through them. At the same time, they also generate a lot of heat that could kill the superconducting character of the magnets and cause the magnetic fields to collapse, a phenomenon called quenching (one such incident occurred at the LHC during tests in 2008). To prevent this, the magnets are jacketed in a sleeve of liquid helium maintained at –271.25 ºC.
The LHC is a circular accelerator. Other well-known circular accelerators in operation are the Relativistic Heavy Ion Collider and the KEKB. There are also some plans to build a much larger version of the LHC to succeed it – one in China and one adjacent to its current location.
Linear accelerators (linacs) – where particles are accelerated in a straight line through a long tube – are preferred when lighter charged particles are to be accelerated. This is because, when a charged particle is bent through a magnetic field at close to the speed of light, it starts to emit energy in the form of synchrotron radiation. The emission’s power is inversely proportional to the fourth power of the particle’s mass (P ∝ m⁻⁴), which means it shoots up the lighter the particle becomes. To avoid this, a linac uses a series of voltage differences that successively push a particle like an electron to higher energies, without bending its path. A famous example of a linac is at the SLAC National Accelerator Laboratory.
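The fourth-power scaling is dramatic in practice. A quick comparison using the known electron and proton masses:

```python
# P ∝ 1/m^4 at fixed energy and bending radius, so the ratio of synchrotron
# power radiated by an electron to that radiated by a proton under the same
# conditions is (m_proton / m_electron)^4.

M_PROTON_MEV = 938.272    # proton rest energy in MeV
M_ELECTRON_MEV = 0.511    # electron rest energy in MeV

ratio = (M_PROTON_MEV / M_ELECTRON_MEV) ** 4
print(f"electron/proton synchrotron power ratio ≈ {ratio:.2e}")  # ≈ 1.14e+13
```

An electron radiates about ten trillion times more power than a proton of the same energy on the same circular path – which is why high-energy electron machines tend to be straight.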
Yes. Apart from particles produced on Earth by humans, they’re also streaming in toward us from space, originating from living stars, exploding stars, dead stars, black holes, gas clouds and a variety of other astrophysical phenomena. Capturing and studying these particles – and/or their decay products – provides us a peek into the distant reaches of the cosmos. Even conventional astronomy is a kind of application of particle physics because it works with photons. An emerging, ‘unconventional’ form of astronomy uses neutrinos instead.
The Standard Model of particle physics is the theory underlying the study of particle physics. It is built on three mathematical groups – SU(3), SU(2) and U(1) – with specific definitions (they are Lie groups, objects studied in differential geometry). Each group represents a gauge interaction, which we associate with the three fundamental forces acting at the subatomic level: strong nuclear, weak nuclear and electromagnetic (in that order). The number in brackets denotes each group’s degree, associated with the group’s degrees of freedom. The Standard Model combines the three groups in the direct product SU(3) × SU(2) × U(1). (The unified electroweak force is represented as the product SU(2) × U(1).)
In the 1960s, when the Standard Model was being pieced together, theoretical physicists found that they didn’t have a way to explain how the different fundamental particles became massive or why they had the masses that they did. In 1964, three groups of physicists working separately proposed a solution to this problem by introducing a new field. When a fundamental particle moved through this field, it would experience a retarding force mediated by the field’s boson. The strength of the force depended on the coupling constant between the fundamental particle and the boson.
This is called the ABEGHHK’tH mechanism, after the people involved in its formulation: Philip Warren Anderson, Robert Brout, Francois Englert, Gerald Guralnik, C.R. Hagen, Peter Higgs, Tom Kibble and Gerardus ‘t Hooft. However, the mechanism as well as the boson are more popularly associated with Higgs alone. By finding the boson in the machine’s first run, physicists working with the LHC were able to experimentally confirm the validity of the Higgs mechanism. A year later, in 2013, Englert and Higgs were awarded the Nobel Prize in Physics.
A more detailed ontology of the Higgs boson is here.
- What is dark matter?
- Hierarchy problem: Why is the force of gravity so much weaker than the other forces?
- Search for neutrinoless double-beta decay: Are neutrinos their own antiparticles?
- Strong-CP problem: The weak force violates CP symmetry, but the strong nuclear force – which, by the theory’s equations, could do so too – doesn’t. Why not?
- Antimatter problem: Why is there more matter than antimatter in the universe today though they were created in equal quantities after the Big Bang?
- Does the proton decay?
… among others.
‘New physics’ refers to a theory or theories that will be able to subsume the Standard Model and deal satisfactorily with the questions the model couldn’t answer (see previous section). There are many candidates for such theories. A particularly favoured one among them is called supersymmetry (SUSY) because it maintains naturalness (see next section) as well as because it is potentially discoverable at the energy levels that can be accessed using the LHC.
- Statistical significance – Say a dataset of 10 distinct numbers has an average value of A. Every number will deviate from A by some amount, so there are 10 deviations: D1, D2, D3, …, D9, D10. The root mean square of these deviations is the standard deviation, SD. Broadly speaking, SD defines the size of fluctuations within the dataset. In particle physics, the dataset could be composed of items referring to the outcomes of millions of collisions. When looking for a novel particle against this background, the corresponding outcome should be far afield of the dataset’s SD. The farther it is, the more statistically significant the finding is (pending the look-elsewhere effect). The SD is denoted by the symbol σ (sigma); typically, a 3σ outcome has about a 1-in-740 chance of being a random fluctuation and is necessary to claim the outcome as evidence. At 5σ, the odds of it being a random reading shrink to about 1-in-3.5 million, and the outcome becomes eligible to be claimed as a discovery.
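The sigma-to-odds conversion is a one-liner, assuming a Gaussian background and counting only one tail of the distribution (conventions differ on one tail versus two, which is why slightly different odds sometimes get quoted):

```python
from math import erfc, sqrt

# One-sided tail probability of a Gaussian at n standard deviations:
# p = 0.5 * erfc(n / sqrt(2)). Its reciprocal gives the "1-in-N" odds
# quoted for evidence (3σ) and discovery (5σ).

def one_in(n_sigma):
    p = 0.5 * erfc(n_sigma / sqrt(2))
    return 1 / p

print(f"3σ ≈ 1 in {one_in(3):,.0f}")  # ≈ 1 in 741
print(f"5σ ≈ 1 in {one_in(5):,.0f}")  # ≈ 1 in 3.5 million
```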
- Look elsewhere effect – To quote Matt Strassler from his blog: “Even when the probability of a particular statistical fluke, of a particular type, in a particular experiment seems to be very small indeed, we must remain cautious. There are hundreds of different types of experiments going on, collecting millions of data points and looking at the data in thousands of different ways. Is it really unlikely that someone, somewhere, will hit the jackpot, and see in their data an amazing statistical fluke that seems so impossible that it convincingly appears to be a new phenomenon? The probability of it happening depends on how many different experiments we include in calculating the probability … This is sometimes called the ‘look-elsewhere effect’; how many other places did you look before you found something that seems exceptional?”
- Luminosity – A measure of how many collisions an accelerator can deliver, expressed per unit area. Instantaneous luminosity is measured per second; integrated luminosity adds it up over a prolonged duration and indicates the total amount of collision data collected.
- Baseline – The distance travelled by a particle as it goes from the source to the detector. Long-baseline experiments are a common feature of neutrino physics.
- Neutrino oscillations – Neutrinos come in three flavours: electron, muon and tau. As they travel, neutrinos of one flavour can spontaneously transform into neutrinos of another flavour. These transformations are called neutrino oscillations. Their discovery prompted the realisation that neutrinos have mass; massless particles can’t oscillate.
- Chirality – The property of an object that makes it nonidentical to its mirror image.
- Channels – The modes of decay associated with a particle, and which detectors are built to listen in on.
- Gauge invariance – One of the simplest explanations (really) of this concept helms the Wikipedia article ‘Introduction to gauge theory‘: “A gauge theory is a type of theory in physics. Modern theories describe physical forces in terms of fields, e.g., the electromagnetic field, the gravitational field, and fields that describe forces between the elementary particles. A general feature of these field theories is that the fundamental fields cannot be directly measured; however, some associated quantities can be measured, such as charges, energies, and velocities. In field theories, different configurations of the unobservable fields can result in identical observable quantities. A transformation from one such field configuration to another is called a gauge transformation; the lack of change in the measurable quantities, despite the field being transformed, is a property called gauge invariance.”
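The textbook example is electromagnetism, where the four-potential can be shifted by the gradient of any function χ without changing the measurable field strengths:

```latex
% U(1) gauge invariance of electromagnetism: shifting the four-potential
% A_mu by the gradient of an arbitrary function chi leaves the field
% strength F_{mu nu} -- and hence the electric and magnetic fields -- unchanged.
A_\mu \;\to\; A'_\mu = A_\mu + \partial_\mu \chi, \qquad
F'_{\mu\nu} = \partial_\mu A'_\nu - \partial_\nu A'_\mu
            = F_{\mu\nu} + \partial_\mu \partial_\nu \chi - \partial_\nu \partial_\mu \chi
            = F_{\mu\nu}.
```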
- Lagrangian – Lagrangian mechanics is a reformulation of Isaac Newton’s classical mechanics that simplifies calculations when multiple objects and different kinds of forces are involved.
- Symmetry – When a system’s implicit properties don’t change when the system is subjected to some kind of transformation (such as rotation or being substituted with its mirror image).
- Naturalness – An aesthetic measure in particle physics that discourages imbalances of many orders of magnitude. For example, the fact that gravity is some 10³² times weaker than the weak nuclear force feels very unnatural, while the comparability of the strengths of the other forces feels more natural.
Some useful links
- Matt Strassler
- Pauline Gagnon
- Adam Falkowski
- Jackson Clarke
- Clara Nellist
- Matthew Buckley
- Tommaso Dorigo
- symmetry magazine
- Knocking on Heaven’s Door by Lisa Randall
- Beyond the God Particle by Leon Lederman & Christopher Hill
- String Theory and the Scientific Method by Richard Dawid