This week we learn why general relativity (GR) and quantum mechanics (QM) are allergic to each other. We also learn why string theory is a possible solution that integrates aspects of both QM and GR into a new *Gravity 3.0*.

There are lots of different ways of stating the Uncertainty Principle. One helpful way is to say that for every kind of physical observable, quantum physics introduces a random jitter about the classical value which cannot be turned off, even at absolute zero temperature. There is fundamental random uncertainty in every physical observable. But life is not supremely depressing: physicists might have to give up on classical determinism but we still have quantum determinism. We have equations that allow us to compute the wavefunctions of elementary particles, which in turn lets us compute the relative likelihood of different physically possible outcomes. Which one actually eventuates is up to random chance.

Anything not expressly forbidden by a physics principle can happen in quantum mechanics. However, and this is an important however -- major quantum fluctuations towards more outrageous outcomes are much rarer than minor ones. For example, I can compute for you the probability that all the oxygen molecules in this room will suddenly move and collect in the palm of my left hand in five seconds from now, but this probability is so tiny that you do not have to worry about it happening before the Earth gets burned up by our Sun becoming a red giant. Which will be long after we wreck our only spaceship's climate through slavish devotion to the profit motive.

How should you interpret the randomness of quantum mechanics? If you are really curious, do a physics degree and go into the exciting research field of quantum information theory. But if you just want a brief overview, here it is. Overall, quantum jitter about the classical value of any given physical observable does matter, but on macroscopic scales we are too big and clumsy to notice it. If on the other hand we have access to powerful lab equipment like a particle accelerator, we can see for ourselves how quantum physics changes life, and in particular our expectations on the strength of forces at higher energies than we are used to.

As we discussed last week, the Heisenberg Uncertainty Principle is very specific. It limits the precision of specific pairs of physical properties. One example is the pair of energy and time: if you measure the time interval more precisely then you know less about the energy, and vice versa. $$ \Delta E \Delta t \gtrsim h \,. $$

Now, we usually speak of energy in layperson physics contexts as if it were always and absolutely conserved: energy cannot be created or destroyed, only transformed from one form to another.

However, the teacher who parroted that line to you in high school was not telling the whole truth. First of all, there is only a principle of conservation of energy if your spacetime has a special property: it must be invariant (stay the same) if you translate things along in time. So in some spacetimes (most of them, actually), you will not obtain a principle of conservation of energy. But let us set that concern aside for now. Say that you do have an energy conservation principle, as is an excellent approximation for us here on Earth right now. Classical physics says that you must conserve overall energy at all times: it is absolutely verboten to do otherwise.

But what happens in quantum mechanics? Because of the Heisenberg Uncertainty Principle, we can tolerate small violations of energy conservation (energy uncertainty $\Delta E$) as long as the crime is done sufficiently quickly (in a time $\Delta t$)! In other words, overall energy is conserved over long timescales, but quantum mechanics allows you to violate it temporarily, over very short timescales. How short? Suppose that I wanted to conjure a virtual electron-positron pair out of the vacuum. Quantum uncertainty would let me get away with it for a timeframe of $$ \Delta t \lesssim {\frac{h}{2mc^2}} $$ which for an electron-positron pair is about $10^{-21}$ seconds. I obtained this formula by substituting $2mc^2$ in for the energy uncertainty: $mc^2$ for the electron and $mc^2$ for the positron, where $m$ is the particle mass and $c$ is the speed of light. But that is all the time I would have before I had to own up and obey energy conservation.
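To get a feel for the numbers, here is a quick back-of-the-envelope check of that timescale (a sketch in Python, with constants rounded to four significant figures):

```python
# Lifetime of a virtual electron-positron pair from the
# energy-time uncertainty relation: dt ~ h / (2 m c^2).
h = 6.626e-34      # Planck's constant, J s
m_e = 9.109e-31    # electron mass, kg
c = 2.998e8        # speed of light, m/s

delta_E = 2 * m_e * c**2   # energy "borrowed" to make the pair, J
delta_t = h / delta_E      # maximum lifetime allowed by uncertainty, s

print(f"Borrowed energy:        {delta_E:.2e} J")
print(f"Virtual pair lifetime:  {delta_t:.2e} s")
```

The lifetime comes out at a few times $10^{-21}$ seconds, as quoted above.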

Notice one very important fact about the time interval computed using the Heisenberg Uncertainty Principle formula -- it is inversely proportional to the energy of the particle created: $\Delta t \lesssim h/\Delta E$. This means that virtual particles with higher energy (e.g. rest energy locked up in their mass) are created out of the vacuum spontaneously quite a lot less often than lower-cost particles. This is one of the ways quantum mechanics expresses a preference for microscopic crimes. It is a bit like a wise old detective who might overlook a peccadillo as part of a package to help a teenage kid go straight, but who, when faced with a serious criminal, throws the book at them. Similarly, if you want to commit small energy crimes, quantum mechanics will not come after you too quickly; but if you try something really big then it will bop you on the head really fast.

*Virtual particles* that are created fleetingly only to vanish back into the vacuum (so that energy is conserved over macroscopic timescales) are different beasties from real particles. If you like, you can think of them as pretend or Pinocchio particles. In order for them to become real particles that live a natural lifespan (e.g. electrons, which last forever, or free neutrons, which decay in roughly a thousand seconds), they have to find energy from somewhere to turn from virtual particles into real ones.

(Note: the argument in this section is pretty advanced, and I would not expect you to be able to regurgitate it on the final exam.)

So because of quantum uncertainty, the vacuum (the state with no particles) is seething with virtual electron-positron pairs at all times! Not only that: since in quantum mechanics anything that is not expressly forbidden can happen, the vacuum is also replete with other virtual particle-antiparticle pairs as well! (Remember, *virtual* means that they have only a fleeting existence governed by the Uncertainty Principle before they have to disappear back into the vacuum.) This fact about the vacuum is a pretty interesting phenomenon, and it has some real physical consequences. The lightest particles available drive this physics the most strongly, because they are the least energetically expensive ones to create.

Let us now see how quantum mechanics affects the strength of coupling constants. Consider a lone electron, denoted by the dark blue disc in the cartoon picture below. The truth is that this lone electron is actually not alone. The electron sits in a vacuum: a sea of virtual particle-antiparticle pairs making fleeting appearances as allowed by Heisenberg Uncertainty. For the sake of expositional clarity, let us pretend for the moment that there are only electrons and positrons, to distill the issue to its purest form. Then our real electron (dark blue) is surrounded by virtual electron-positron pairs. Virtual electrons are denoted in the following cartoon by light blue discs, while virtual positrons a.k.a. anti-electrons are denoted by light yellow discs. (Note: colour is used in the cartoon just to keep track of which particle is which; it does not carry any physical significance.)

Our real electron is surrounded by virtual pairs. Close in, where the electric field from the real electron is strongest, is where you will find the biggest effect of virtual pairs. Why? Because having an electric field helps create virtual pairs via the electromagnetic force. When virtual electron-positron pairs are created, they are separated by a distance known as the electron Compton Wavelength
$$
\lambda_C = {\frac{h}{mc}}\,.
$$
The virtual electron-positron pairs are often referred to by physicists as virtual *dipoles* because they have a positive end and a negative end separated by a nonzero distance.
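If you want to check the size of that dipole separation yourself, here is a one-line computation of the electron Compton wavelength (a Python sketch using rounded CODATA constants):

```python
# Electron Compton wavelength: lambda_C = h / (m c).
h = 6.626e-34    # Planck's constant, J s
m_e = 9.109e-31  # electron mass, kg
c = 2.998e8      # speed of light, m/s

lambda_C = h / (m_e * c)
print(f"Electron Compton wavelength: {lambda_C:.3e} m")
```

The answer is about $2.4 \times 10^{-12}$ m -- around a hundred times smaller than a typical atom.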

Using the rules for electrostatic repulsion/attraction, the virtual electron is repelled by the real electron while the virtual positron is attracted by it. So virtual pairs in the vacuum are oriented so that positive charges in the virtual pairs cluster nearer the real electron than the compensating negative charges in the virtual pairs. This creates charge separation and the effect is that the virtual dipoles *screen* some of the charge of the real electron.

The screening effect is very mild at low energies, because the virtual e-e+ pairs have a very fleeting lifespan. But the deeper you dig in towards the real electron, the more of the real electron’s charge you can feel, because less of it is screened from you. And because digging deeper requires more momentum for the probe (cf. the de Broglie wavelength from last week), this is why the electromagnetic force grows stronger at higher energies. This phenomenon is called the running of the coupling. In plainer English: **the electromagnetic coupling constant is not constant!**
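As a toy illustration of this running, here is a sketch using the standard one-loop QED formula with only the electron in the loop. Be warned that this is deliberately stripped down: the measured inverse coupling at the Z mass is closer to 128, because muons, taus and quarks also screen the charge, and this sketch ignores them.

```python
import math

# One-loop QED running of the electromagnetic coupling, with ONLY
# the electron in the loop -- a deliberately simplified toy model.
alpha_0 = 1 / 137.036   # fine-structure constant at low energy
m_e = 0.000511          # electron mass, GeV

def alpha(Q):
    """Running coupling at energy scale Q in GeV (valid for Q > m_e)."""
    return alpha_0 / (1 - (2 * alpha_0 / (3 * math.pi)) * math.log(Q / m_e))

for Q in (0.001, 1.0, 91.19):   # 1 MeV, 1 GeV, the Z boson mass
    print(f"Q = {Q:8.3f} GeV  ->  1/alpha = {1 / alpha(Q):.1f}")
```

Even in this electron-only toy model, the inverse coupling visibly drops (the coupling strengthens) as the probe energy climbs from an MeV up to the Z mass.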

The situation is even more interesting for gravity: its coupling grows with energy in a more drastic way. We will have more to say about that shortly.

So, what do you see when you turn on a particle accelerator like the LHC? Indeed, the EM force gets stronger at high energy. Equivalently, its inverse strength gets weaker, as you can see from the blue line on the first graph below. But a very interesting (and technical) thing happens for the two nuclear forces: their strength actually weakens at high energy; equivalently, their inverse strength grows! The behaviours of the weak and strong nuclear forces are indicated by the green and red lines on the first graph. Gravity is so weak its coupling would be off the top of the page for most of the graph, and it dives down to a possible joining point with the other three forces only at a very high energy scale of the order of $10^{16}$ GeV or higher.

LEP is mentioned in this figure; it was the electron-positron accelerator that used to inhabit the same tunnel now used at CERN for the LHC experiment, and it taught us a great deal about the electroweak force. The second graph in the right hand panel is what you can arrange if you add supersymmetry to the mix, which is a hypothetical symmetry that I will mention a little more in early January. On both graphs, the part which is certain is the part left of the dotted line: that is what has been checked experimentally. Anything much to the right of the dotted line is conjecture at this point, based on extrapolation from lower energies.

When you put special relativity together with quantum mechanics, you get something called QFT, which stands for *Quantum Field Theory*. But Einstein's theory of General Relativity (GR) is a very different animal. Why is it so hard to unify QFT and GR? One reason is that the mathematical apparatus used for QFT is very different from the mathematical apparatus of GR. QFT, our quantum toolbox, gave us the Standard Model of Particle Physics. This covers matter and the three gauge-mediated forces. In particle-land, gravity is classical, and weak, and ignored. Einstein’s General Theory of Relativity, on the other hand, is an inherently relativistic and inherently classical theory of gravity. The GR toolbox is geometrical and involves very different math than the QFT toolbox suitable for the other three forces. GR describes spacetime as a smooth dynamical fabric -- which is warped by matter -- and which makes matter move.

The discussion in this section is the key message for this week. Let us consider a simpler stripped-down world in which there is only gravity. An interesting physical phenomenon to discuss is the scattering of gravitons - which occurs because gravitons actually feel gravity too: they have energy so gravity pulls on them. Remember that Ice Skater Analogy way back when we talked about particle physics? We described electric repulsion between two electrons in terms of exchanging a photon between them.

Similarly, we can look at gravitons feeling the gravitational force by considering exchange of a third graviton between two other gravitons. Sometimes this is drawn in a Feynman diagram like the one on the right above.

Using QFT + GR we can predict what the probability for graviton scattering should be. Once the mathematical dust settles, the probability turns out to be $$P_{\rm GR} \sim {\frac{G_N}{c^5\hbar}} E^2$$

This formula does not mean much until we realize what the symbols mean. $P$ on the left hand side of the equation stands for the probability of graviton scattering. On the right hand side, $G_N$ is Newton’s constant, $c$ is the speed of light, and $\hbar$ is the reduced Planck constant. The only variable on the right hand side of the equation is $E$, the energy you have available.

Let us try to make sense of the formula by taking some particular limits of it. In particular, let us look at the low-energy (small-$E$) limit and the high-energy (large-$E$) limit separately. In the low-energy limit, the probability of graviton scattering is very small, i.e. much less than 100%. This is perfectly fine: it just means that gravitons hardly scatter off each other at all. By contrast, in the high-energy limit
$$
{\frac{G_N E^{2}}{c^5\hbar}}\gg 1\,,
$$
we have a probability that is much greater than unity. **OOPS!** We have just obtained a probability for real physical things to happen which is greater than 100%!
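You can see how sharp this crossover is by plugging numbers into the schematic probability formula (a Python sketch; the three energies chosen are illustrative):

```python
# Schematic graviton-scattering probability P ~ G_N * E^2 / (c^5 * hbar),
# i.e. (E / E_Planck)^2, evaluated at a few illustrative energies.
G_N = 6.674e-11    # Newton's constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
GeV = 1.602e-10    # one GeV expressed in joules

def P(E_GeV):
    """Dimensionless combination G_N E^2 / (c^5 hbar), for E in GeV."""
    E = E_GeV * GeV                    # convert to joules
    return G_N * E**2 / (c**5 * hbar)

for E_GeV in (1e4, 1e19, 1e21):        # LHC-ish, Planck-ish, beyond
    print(f"E = {E_GeV:.0e} GeV  ->  P ~ {P(E_GeV):.1e}")
```

At LHC-like energies the "probability" is absurdly tiny; near $10^{19}$ GeV it reaches order one; beyond that it sails past 100%, which is the nonsense flagged above.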

This is not just a small error. It is a *deep theoretical emergency* reflecting the clash between the smoothness of classical spacetime and the random uncertainty of quantum mechanics. No credible fix for this problem exists within GR itself. This is the fundamental sickness of Einstein’s theory of gravity: it does not make any sense as a quantum theory. While General Relativity is a mighty fine classical theory of gravity -- consistent with all known solar system, galactic, and cosmological observations so far -- it fails to cut the mustard as a quantum theory.

The crossover point between where GR makes sense and where GR blows up in your face is known as the Planck energy: $$ E_P \sim \sqrt{{\frac{c^5 \hbar}{G_N}}} \,. $$ In four spacetime dimensions, the Planck energy is numerically about $10^{19}$ GeV. This is seventeen orders of magnitude greater than the energy tied up in the mass of a single Higgs boson. This is why most physicists do not expect that a human-built collider could ever produce Planck-scale physics directly in the lab. Instead, we must look for more indirect evidence about the nature of quantum gravity by using cosmological observations.
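And here is the Planck energy itself, computed from the formula above (a Python sketch with rounded constants):

```python
import math

# Planck energy E_P = sqrt(c^5 hbar / G_N), converted to GeV.
G_N = 6.674e-11    # Newton's constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
GeV = 1.602e-10    # one GeV expressed in joules

E_P = math.sqrt(c**5 * hbar / G_N)     # joules
print(f"Planck energy: {E_P:.2e} J = {E_P / GeV:.2e} GeV")
```

The answer lands at roughly $1.2 \times 10^{19}$ GeV, matching the order-of-magnitude quoted above.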

GR is not a consistent quantum theory of gravity. We therefore seek a Gravity 3.0 theory which does manage to consistently knit together GR and QFT. This is not an easy problem: even Albert Einstein was not smart enough to solve it before his death. Quantizing GR is something that got physicists -- including some extremely famous ones -- totally stuck for decades.

String theory is currently, in my professional opinion, and in the professional opinion of most of the world’s quantum gravity workers, the most technically convincing construction of a quantum gravity theory. It has the in-built advantage that you recover GR at low energy.

String theory was actually originally invented to explain the strong nuclear interaction, not quantum gravity. People of the day -- including my PhD thesis supervisor Prof. Leonard Susskind -- realized that perhaps an open string with quarks on its ends might explain quark confinement. Unfortunately, that dream was not realized (at least, not directly; stay tuned for future progress reports!). The fatal flaw in open string theory as a theory of the strong nuclear interaction was the presence of a very unexpected particle in the spectrum: a beastie with zero mass and spin two. Only years later did other physicists realize that this animal had the mass and spin of the graviton and also interacted like a graviton. And then string theory as a theory of quantum gravity was born.

The titanic clash between GR and QM is not something you can Band-Aid over. It is a deep theoretical emergency. What this means practically is that while GR is a great long-distance theory of gravity, it must be modified at short distances (high energies) to fix up the probability problem.

We now seek a theory of Gravity 3.0, which should have softer high-energy behaviour than GR does and which must knit together GR and QM. The most important principles are:

- It must reduce to known physics in well-understood regimes. This is called the Correspondence Principle.
- It must have no internal mathematical inconsistencies, known as anomalies.
- It must describe quantum gravity in a calculationally useful way.
- The aim is to be able to calculate physically interesting quantities for black holes and the Big Bang, not just to know that those two places in the universe are super-gnarly.

Relaxing our insistence that particles be the Legos of the universe is actually enough (!) to solve our titanic clash between GR and QM. String theory takes this seriously and hypothesizes that the most elementary Legos in the universe are strings, i.e. one-dimensional strands of energy. This is not like cat string: it is relativistic, and much much smaller. You can think of cat string as made of molecules, which are made of atoms, which are made of quarks and gluons and electrons, which in turn are made of fundamental string.

String theory is versatile because it proposes that the same basic ingredient (fundamental string) can wiggle in different oscillation patterns and therefore can describe subatomic particles of differing masses and spins (and charges). In particular, string theorists know how to represent all the particles of the Standard Model in terms of fundamental strings. String theory is economical because it requires only one basic type of ingredient. Strings can be open, i.e. have endpoints (like a skipping rope), or they can be closed, i.e. have no ends (like a rubber band).

Like guitar strings and organ pipes, you can have the string be in the **fundamental** mode, which is the lowest possible frequency of oscillation. (In the case of the organ pipe, it is the air inside it that is doing the waving; for the string it is the body of the string itself that waves.) The string could alternatively be in an **overtone**, which is a higher excitation that costs more energy. There is an infinite number of possible overtones, each characterized by a specific energy, but only one fundamental. Typically you will not be able to excite the overtones if you have a tiny energy budget. This is why strings look like particles at low energy. It is as if you took your glasses off and saw a blurry string which looked for all intents and purposes like a particle.

For the open string, the groundstate has zero mass and one oscillator. This single oscillator can point in any direction, which gives what physicists call a **vector** (something that points in one direction). The resulting object has zero mass and spin one. In other words, open string groundstates describe massless messenger bosons of the Standard Model like the photon and the gluon.

For the closed string, the groundstate has zero mass and two oscillators: one left-moving and one right-moving. These two oscillators can point in any directions. The resulting object has mass zero and spin two. This is the graviton missing from the Standard Model. The fact that closed strings have graviton groundstates is why string theorists can claim with a straight face that string theory is a theory of quantum gravity -- it is not some random gravity model from our imagination but reduces to Einstein's gravity theory at low energy.

Recall our Ice Skater Analogy for messenger bosons. What we saw when we discussed that analogy was that interactions happen at points in spacetime: i.e., at a specific place in space at a specific point in time.

What about strings? String interactions are naturally and inherently spread out in spacetime. Strings interact by smoothly splitting or joining. This softening makes Gravity 3.0 calculable! Booyah!!

Particles are pointy, or hard in physicists’ terms. Strings are extended, which makes their high-energy behaviour softer.

In particle QM, we learned that higher momentum means shorter quantum (de Broglie) wavelength, and therefore greater sensitivity. So more money means better resolution. Schematically, $$ \Delta x_{\rm min, particle} = {\frac{\hbar}{\Delta p}} $$ For strings, the behaviour has a qualitatively new ingredient: $$ \Delta x_{\rm min, string} = {\frac{\hbar}{\Delta p}} + {\frac{\hbar}{m_s^2 c^2}} \Delta p $$

String resolution follows particle resolution at low energy (this must happen, by the Correspondence Principle). But it then worsens again at high energy. Why? With a big enough energy budget, the string’s oscillator energy beats the tension energy, and the probe string gets floppier and fatter. This weakens its effectiveness as a probe.
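You can see this turnover numerically. A sketch, assuming the schematic resolution formula $\Delta x = \hbar/\Delta p + \hbar\,\Delta p/(m_s c)^2$ and working in units where $\hbar = c = m_s = 1$, so the turnover should sit at $\Delta p = 1$:

```python
# Schematic string resolution Delta_x(Dp) = 1/Dp + Dp in units where
# hbar = c = m_s = 1. Scan momenta to locate the best achievable resolution.
def delta_x(Dp):
    return 1.0 / Dp + Dp   # particle-like term + floppy-string term

# Log-spaced grid of probe momenta from 1e-3 to 1e3.
grid = [10 ** (k / 100) for k in range(-300, 301)]
best_Dp = min(grid, key=delta_x)

print(f"Turnover at Dp ~ {best_Dp:.3f}, best resolution Delta_x ~ {delta_x(best_Dp):.3f}")
```

The scan finds the minimum at $\Delta p = m_s c$ with $\Delta x_{\rm min} = 2\hbar/(m_s c)$: pushing the momentum past the turnover only makes the resolution worse.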

The most important qualitative conclusion from this is that there is a law of diminishing returns in string theory: if you keep cranking up the energy higher and higher beyond the turnover point (the valley in the red curve in the figure), you will not make a better experiment. Maybe the buck really does stop at string theory! There might not be any more onion layers.