Wikipedia defines nonrenormalizable theories in this way:
"Not all theories lend themselves to renormalization in the manner described above, with a finite supply of counterterms and all quantities becoming cutoff-independent at the end of the calculation. If the Lagrangian contains combinations of field operators of high enough dimension in energy units, the counterterms required to cancel all divergences proliferate to infinite number"
Suppose that we have a process into which particles with a total energy E enter. Is it really a problem that we need an "infinite number" of counterterms?
| k + q
| e- ___
| q / \ q
| ~~~~~~ ~~~~ ● X+ massive
| e+ \____/ charge
| -k
|
| virtual pair
|
e-
^ t
|
Above we have the vacuum polarization diagram. The momentum line q is the "input" which "disturbs" the process; a pure vacuum polarization loop has no incoming or outgoing particles. In the previous blog post we claimed that destructive interference cancels out the entire vacuum polarization loop integral unless a disturbance q ≠ 0 is present.
Hypothesis. Destructive interference cancels out all loop integrals if the "input" to them from external lines goes to zero.
Does the hypothesis solve the problem of an infinite number of counterterms?
Consider a theory where the interaction depends on the 4-momentum k: are the divergences really due to the input producing many Fourier components in another field?
In gravity, the mass-energy E interacts with other particles.
A Feynman diagram loop contains arbitrary values of the 4-momentum k. Can that cause a problem if we have many loops which are connected to each other?
Let us study the diagram below.
particle
• ---------------------------------------
| interaction
|
|
|
__ q + k + j
___/ \___ q + k
/ \___/ \
/ q + k - j \
q ~~ ~~ q
\_______________/
q - k
We assume that interactions in the diagram can depend on the 4-momentum. There are nested loops with arbitrary 4-momenta k and j.
Destructive interference almost entirely cancels any 4-momentum
|k| > |q|,
where | | denotes the Euclidean norm of the 4-momentum:
|(E, px, py, pz)| = sqrt(E² + px² + py² + pz²).
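As a minimal sketch in Python (our own illustration, natural units):

    import math

    def euclidean_norm(E, px, py, pz):
        # Euclidean norm of the 4-momentum (E, px, py, pz): every component
        # enters with a plus sign, unlike in the Minkowski norm.
        return math.sqrt(E**2 + px**2 + py**2 + pz**2)

    print(euclidean_norm(3, 1, 2, 2))   # sqrt(18) ≈ 4.243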
If q = 0, the cancellation is perfect. If q ≠ 0, then, in very rare cases, |k| can be huge. In those cases the nested, smaller loop will have a very large |j|, and a very large |j| interacts strongly with the particle.
The integral over j can have a very large value. Is that a problem? It should not be: if the probability of a large |k| is infinitesimal, we do not need to care much about a huge integral. In QFT there seems to be confusion about overlapping probabilities, as if a huge integral value in some extremely rare event with a probability
P < ε << 1
would somehow make P large, or even infinite.
Hypothesis. The huge value of the integral means that the process produces many Fourier components of a wave at the same time. The probabilities overlap.
The confusion is the same as in bremsstrahlung in electron scattering. The large value of the integral describes a complex wave which contains many Fourier components. If |k| is allowed to be large, why would the process produce just one Fourier component? More likely, it will produce a large number of Fourier components, just as happens in bremsstrahlung.
The produced wave can still be seen as "one wave", but it just happens to have many Fourier components. In classical bremsstrahlung, the complex wave certainly contains many Fourier components.
Black holes. If j has a lot of energy, then the particle may interact with a black hole. But that is extremely improbable.
What if the input q is of the Planck scale? Then we will have black holes in the diagram, and it may be that Feynman diagrams do not work at all. However, particles with a Planck scale energy are rare, or nonexistent in nature.
It looks like nonrenormalizable theories work fine. We may have a cascade of loops where the energy rises to the Planck scale, but those have an infinitesimal probability of occurring, and we can ignore them.
Hypothesis. "Nonrenormalizability" of gravity is not a problem at all.
There are other problems in the quantum field theory of gravity, though. The interaction is nonlinear and complicated.
Assaf Shomer (2007): entropy of a black hole
Assaf Shomer (2007) argues that the entropy of a black hole is too large for it to be describable with a renormalizable quantum field theory. He claims that a renormalizable QFT is asymptotically a conformal field theory, and the entropy in such a system must be smaller than that of a black hole.
The entropy of a black hole is roughly the same as the entropy of radiation of the wavelength

λ = 16 π rs

which we can use to feed and grow the black hole. Here rs is the Schwarzschild radius. The reverse process would be the hypothetical Hawking radiation.
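We can check this claim to an order of magnitude with a Python sketch (SI constants; the figure of roughly one k_B of entropy per photon is our own rough assumption):

    import math

    G, c = 6.674e-11, 2.998e8            # SI units
    hbar, k_B = 1.055e-34, 1.381e-23

    M = 2.0e30                           # roughly one solar mass, kg
    rs = 2 * G * M / c**2                # Schwarzschild radius

    # Bekenstein-Hawking entropy: S = k_B A / (4 l_p^2), where A = 4 pi rs^2
    l_p2 = hbar * G / c**3               # Planck length squared
    S_bh = k_B * 4 * math.pi * rs**2 / (4 * l_p2)

    # Number of photons of wavelength 16 pi rs carrying the mass-energy M c^2
    E_photon = 2 * math.pi * hbar * c / (16 * math.pi * rs)
    N = M * c**2 / E_photon

    print(S_bh / k_B)                    # ~ 1e77
    print(N)                             # ~ 1e77: the same order of magnitude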
Let us compare the entropy of a black hole of energy E to the entropy of a (classical) photon gas stored in a vessel of the same size and having the same energy E. A photon gas, presumably, is governed by a conformal field theory.
Wikipedia says that the energy density of a black body radiation photon gas is
dE / dV ~ T⁴,
where T is the absolute temperature. The entropy density is
dS / dV ~ T³.
Note that there is an error in the Wikipedia table for the entropy: the table claims that one can replace the volume V with the temperature T and derive a strange formula S = 16 σ / (3 c) T⁴, which does not depend on V.
Let the Schwarzschild radius rs vary. The energy of a black hole is
Ebh = C rs,
where C is a constant of nature. The entropy, according to Bekenstein and Hawking, is
Sbh ~ rs².
Let us then put photon gas worth Ebh = C rs into a vessel whose volume is
V ~ rs³.
The total energy inside the vessel is then
~ T⁴ rs³ ~ C rs,
which implies that the temperature
T ~ 1 / sqrt(rs).
The entropy inside the vessel is
S ~ rs³ T³
~ rs^(3/2).
We see that as rs grows, the entropy of a black hole grows as rs², while the entropy of a photon gas vessel of the same energy only grows as rs^1.5.
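We can verify the exponents with a minimal Python sketch (our own illustration; all constants of nature are set to 1, so only the scaling exponents are meaningful):

    # Black hole vs. photon gas vessel of the same energy, constants set to 1.
    for rs in [1.0, 10.0, 100.0, 1000.0]:
        E_bh = rs                # E_bh = C rs, with C = 1
        S_bh = rs**2             # Bekenstein-Hawking: S_bh ~ rs^2
        V = rs**3                # vessel volume ~ rs^3
        T = (E_bh / V)**0.25     # from E ~ T^4 V:  T ~ 1 / sqrt(rs)
        S_gas = V * T**3         # from S ~ T^3 V:  S_gas ~ rs^(3/2)
        print(rs, S_bh, S_gas)
    # The ratio S_bh / S_gas grows like sqrt(rs).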
Assaf Shomer has a calculation error in the paper. He claims that since d = 4, the entropy would scale as S ~ rs^(3/4). Shomer confused the energy and entropy densities with the total energy and entropy in the vessel.
Anyway, Shomer's main argument still seems to stand: the entropy of a black hole grows faster than the entropy of an equivalent photon gas vessel as we increase the total stored energy.
In this blog we have claimed that the ingoing matter "freezes" at the horizon of a black hole. The calculation above assumed that the vessel containing the photon gas has no freezing effect: time flows at the same rate everywhere in the vessel.
Assaf Shomer's argument suggests that a black hole cannot globally, in static spatial coordinates around a black hole, behave asymptotically like a conformal field theory.
But we are interested in local phenomena in freely falling coordinates, where particle energies are much less than the Planck energy. Thus, Shomer's argument does not prevent us from having a fruitful quantum field theory of gravity.
Destructive interference cancels large frequencies exponentially well
When an electron e- passes by a massive charge X+, the time variation of the electric field, in the comoving coordinates of the electron or in the comoving coordinates of X+, is something like
ΔE(t) = 1 / (1 + (v t)²).
Let us assume that v = 1.
The "disturbance" ΔE(t) then has a Fourier transform
For large frequencies, or 4-momenta, the Fourier component is absolutely negligible. If f = 100, the component is ~ 10⁻⁶³⁰.
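We can check the exponential decay numerically. The sketch below (Python with SciPy, our own illustration) uses the convention F(f) = ∫ ΔE(t) e^(−2 π i f t) dt:

    import numpy as np
    from scipy.integrate import quad

    # 1 / (1 + t^2) is even, so F(f) = 2 * Integral_0^inf cos(2 pi f t) / (1 + t^2) dt.
    for f in [0.5, 1.0, 2.0]:
        w = 2 * np.pi * f
        val, err = quad(lambda t: 1.0 / (1.0 + t * t), 0, np.inf,
                        weight='cos', wvar=w)
        print(f, 2 * val, np.pi * np.exp(-w))   # numeric vs. exact pi e^(-2 pi f)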
However, is this cancellation even too strong? Let us compare this to the renormalized vacuum polarization value. Is the contribution of |k| > 100 |q| absolutely negligible?
Hagen Kleinert (2013) calculates the effect of q ≠ 0 by using the q² derivative of Π₂(q²). The arbitrary loop 4-momentum is in his nomenclature p, not k as in our blog text. The contribution of large |k| seems to be a decreasing geometric series. It is not exponentially decreasing.
This is not paradoxical. Even if the "disturbance" is almost entirely free of high frequencies, its "impact" may be less so.
Is it possible that for some loop, the impact of a disturbance diverges, too? What would an ultraviolet divergence mean in such a case? If there are overlapping probabilities, can it mean an infinite energy for the generated wave?
Quantum gravity
Zvi Bern (2002) writes about divergences in quantum gravity. In Section 2.2 he states that gravity with matter typically diverges badly at one loop, and pure gravity at two loops.
Let us check if we can find a way to alleviate the problem.
There is another problem, too. The Feynman integral for just five loops of gravitons contains 10³⁰ terms! We have to find a simpler way to calculate the interaction. In our blog we hold the view that gravity is a combination of an attractive force and an increase in the inertia of a particle in a gravitational field. Could it be that calculations with inertia would be simpler than calculations with the metric tensor?
Ultraviolet divergence in a loop
On September 24, 2025 we were able to explain away an infrared divergence by claiming that the produced wave contains an infinite number of low-energy bremsstrahlung photons. Does the same principle work for an ultraviolet divergence?
If we hit a rubber membrane with an infinitely sharp hammer, the blow, presumably, creates the analogue of the Coulomb field of a point charge, which has an infinite energy. That is why the hammer is never allowed to be infinitely sharp: destructive interference must cancel the infinite energy. The hammer must be blunt.
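A toy Python sketch (our own illustration, not a field theory calculation) of why the hammer must be blunt: give the membrane a unit impulse of width sigma; by Parseval's theorem the energy in the Fourier spectrum equals the integral of the squared impulse, and it diverges as sigma → 0.

    import numpy as np

    # A unit-area Gaussian hammer blow of width sigma. By Parseval's theorem
    # the spectral energy Integral |F(f)|^2 df equals Integral blow(t)^2 dt,
    # which is 1 / (2 sigma sqrt(pi)) and diverges as sigma -> 0.
    for sigma in [1.0, 0.1, 0.01, 0.001]:
        t = np.linspace(-50 * sigma, 50 * sigma, 200001)
        blow = np.exp(-t**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
        energy = np.sum(blow**2) * (t[1] - t[0])
        print(sigma, energy)
    # An infinitely sharp hammer would pump an infinite energy into high frequencies.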
[Feynman diagram: two gravitons enter from the left, interact through virtual gravitons, and exit on the right; the time axis t points to the right]
Above we have a Feynman diagram. Let the lines represent gravitons and the vertices their gravity interaction. Two gravitons enter from the left, create new virtual gravitons, interact, and exit on the right.
The Feynman integral calculates the "4-point function", or the probability amplitude for the process to happen, assuming that the input flux from the left has a certain value.
The input gravitons coming from the left have some modest probability amplitudes. If the Feynman integral is infinite, that would mean that the output flux of the gravitons on the right is infinite. That is, we have created an infinite energy. This is clearly nonsensical. What is the problem?
The process has a classical limit. The input gravitons would be wave packets, and the output gravitons are wave packets. General relativity is supposed to conserve energy. We conclude that the ultraviolet-divergent Feynman integral miscalculates the process, and badly. Why does it miscalculate?
*** WORK IN PROGRESS ***