Wikipedia defines nonrenormalizable theories in this way:
"Not all theories lend themselves to renormalization in the manner described above, with a finite supply of counterterms and all quantities becoming cutoff-independent at the end of the calculation. If the Lagrangian contains combinations of field operators of high enough dimension in energy units, the counterterms required to cancel all divergences proliferate to infinite number"
Suppose that we have a process into which particles with a total energy E enter. Is it really a problem that we need an "infinite number" of counterterms?
| k + q
| e- ___
| q / \ q
| ~~~~~~ ~~~~ ● X+ massive
| e+ \____/ charge
| -k
|
| virtual pair
|
e-
^ t
|
Above we have the vacuum polarization diagram. The momentum line q is the "input" which disturbs the process; a pure vacuum polarization loop has no incoming or outgoing particles.
In the previous blog post we claimed that destructive interference cancels out the entire vacuum polarization loop integral unless an input q ≠ 0 disturbs it.
Hypothesis. Destructive interference cancels out all loop integrals if the "input" to them from external lines goes to zero.
Does the hypothesis solve the problem of an infinite number of counterterms?
Consider a theory where the interaction depends on the 4-momentum k: are the divergences really due to the input producing many Fourier components in another field?
In gravity, the mass-energy E interacts with other particles.
A Feynman diagram loop contains arbitrary values of the 4-momentum k. Can that cause a problem if we have many loops which are connected to each other?
Let us study the diagram below.
particle
• ---------------------------------------
| interaction
|
|
|
__ q + k + j
___/ \___ q + k
/ \___/ \
/ q + k - j \
q ~~ ~~ q
\_______________/
q - k
We assume that interactions in the diagram can depend on the 4-momentum. There are nested loops with arbitrary 4-momenta k and j.
Destructive interference almost entirely cancels any 4-momentum
|k| > |q|,
where | | denotes the Euclidean norm of the 4-momentum:
|(E, px, py, pz)| = sqrt(E² + px² + py² + pz²).
If q = 0, the cancellation is perfect. If q ≠ 0, then, in very rare cases, |k| can be huge. The nested, smaller loop will in those cases have a very large |j|. Very large |j| will have a huge interaction with the particle.
The integral over j can have a very large value. Is that a problem? It should not be. If the probability of a large |k| is infinitesimal, we do not need to care much about a huge integral. In QFT there seems to be confusion about overlapping probabilities: as if a huge integral value in some extremely rare event with a probability
P < ε << 1
would somehow make P large, or even infinite.
Hypothesis. The huge value of the integral means that the process produces many Fourier components of a wave at the same time. The probabilities overlap.
The confusion is the same as in bremsstrahlung in electron scattering. The large value of the integral describes a complex wave which contains many Fourier components. If |k| is allowed to be large, why would it produce just one Fourier component? More likely, it will produce a large number of Fourier components, just as happens in bremsstrahlung.
The produced wave can still be seen as "one wave", but it just happens to have many Fourier components. In classical bremsstrahlung, the complex wave certainly contains many Fourier components.
Black holes. If j has a lot of energy, then the particle may interact with a black hole. But that is extremely improbable.
What if the input q is of the Planck scale? Then we will have black holes in the diagram, and it may be that Feynman diagrams do not work at all. However, particles with a Planck scale energy are rare, or nonexistent in nature.
It looks like nonrenormalizable theories work fine. We may have a cascade of loops where the energy rises to the Planck scale, but those have an infinitesimal probability of occurring, and we can ignore them.
Hypothesis. "Nonrenormalizability" of gravity is not a problem at all.
There are other problems in the quantum field theory of gravity, though. The interaction is nonlinear and complicated.
Assaf Shomer (2007): entropy of a black hole
Assaf Shomer (2007) argues that the entropy of a black hole is too large for it to be describable with a renormalizable quantum field theory. He claims that a renormalizable QFT is asymptotically a conformal field theory, and the entropy in such a system must be smaller than that of a black hole.
The entropy of a black hole is roughly the same as the entropy of the radiation of wavelength
λ = 16 π rs
which we can use to feed and grow the black hole. Here rs is the Schwarzschild radius. The reverse process would be the hypothetical Hawking radiation.
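As a rough consistency check, here is a minimal Python sketch (SI units, an illustrative rs; the factor 16 π is taken from the formula above) which counts the photons of wavelength λ = 16 π rs needed to supply the black hole energy and compares the count to the Bekenstein-Hawking entropy:

```python
import numpy as np

G, hbar, c, kB = 6.674e-11, 1.055e-34, 2.998e8, 1.381e-23

rs  = 1.0e3                             # Schwarzschild radius in meters (illustrative)
E   = rs * c**4 / (2 * G)               # black hole energy, from rs = 2 G M / c^2
lam = 16 * np.pi * rs                   # wavelength of the feeding radiation
N   = E / (2 * np.pi * hbar * c / lam)  # photon count; photon energy = h c / lam

# Bekenstein-Hawking entropy: S = kB * A * c^3 / (4 * hbar * G), with A = 4 pi rs^2
S_bh = kB * np.pi * rs**2 * c**3 / (hbar * G)

print(S_bh / (kB * N))                  # ~ pi / 4: the same up to a factor of order one
```

The ratio comes out at π / 4 regardless of rs, which supports the "roughly the same" claim above.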
Let us compare the entropy of a black hole of energy E to the entropy of a (classical) photon gas stored in a vessel of the same size and having the same energy E. A photon gas, presumably, is governed by a conformal field theory.
Wikipedia says that the energy density of a black body radiation photon gas is
dE / dV ~ T⁴,
where T is the absolute temperature. The entropy density is
dS / dV ~ T³.
Note that there is an error in Wikipedia in the table for the entropy: the table claims that one can replace the volume V with the temperature T, and derive a strange formula S = 16 σ / (3 c) T⁴, which does not depend on V.
Let the Schwarzschild radius rs vary. The energy of a black hole is
Ebh = C rs,
where C is a constant of nature. The entropy, according to Bekenstein and Hawking, is
Sbh ~ rs².
Let us then put photon gas of energy Ebh = C rs into a vessel whose volume is
V ~ rs³.
The total energy inside the vessel is then
~ T⁴ rs³ ~ C rs,
which implies that the temperature
T ~ 1 / sqrt(rs).
The entropy inside the vessel is
S ~ rs³ * T³
~ rs^3/2.
We see that as rs grows, the entropy of a black hole grows as rs², while the entropy of a photon gas vessel of the same energy only grows as rs^1.5.
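The scaling argument can be checked symbolically. Below is a minimal sympy sketch, with all constants of proportionality set to 1 except C:

```python
import sympy as sp

rs, T, C = sp.symbols('r_s T C', positive=True)

# Energy density ~ T^4 in a vessel of volume ~ rs^3 must equal E_bh = C * rs:
T_sol = sp.solve(sp.Eq(T**4 * rs**3, C * rs), T)[0]
print(T_sol)                  # C**(1/4)/sqrt(r_s), i.e. T ~ 1 / sqrt(rs)

# Entropy density ~ T^3, so the total entropy in the vessel is ~ rs^3 * T^3:
S_gas = sp.simplify(rs**3 * T_sol**3)
print(S_gas)                  # C**(3/4)*r_s**(3/2), i.e. S ~ rs^1.5
```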
Assaf Shomer has a calculation error in the paper. He claims:
"Since d = 4, that would mean that S ~ rs^¾."
Shomer confused the density of energy and entropy with the total energy and entropy in the vessel.
Anyway, Shomer's main argument still seems to stand: the entropy of a black hole grows faster than the entropy of an equivalent photon gas vessel, if we increase the total stored energy.
In this blog we have claimed that the ingoing matter "freezes" at the horizon of a black hole. The calculation above assumed that the vessel containing the photon gas has no freezing effect: time flows at the same rate everywhere in the vessel.
Assaf Shomer's argument suggests that a black hole cannot globally, in static spatial coordinates around a black hole, behave asymptotically like a conformal field theory.
But we are interested in local phenomena in freely falling coordinates, where particle energies are much less than the Planck energy. Thus, Shomer's argument does not prevent us from having a fruitful quantum field theory of gravity.
Destructive interference cancels large frequencies exponentially well
When an electron e- passes by a massive charge X+, the time variation of the electric field, in the comoving coordinates of the electron or of X+, is something like
ΔE(t) = 1 / (1 + (v t)²).
Let us assume that v = 1.
The "disturbance" ΔE(t) then has a Fourier transform
For large frequencies, or 4-momenta, the Fourier component is absolutely negligible. If f = 100, the component is ~ 10⁻⁶³⁰.
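A quick numerical check of the exponential decay, assuming the transform convention ΔÊ(f) = ∫ ΔE(t) e^(-2πift) dt:

```python
import numpy as np
from scipy.integrate import quad

def fourier_component(f):
    # 1/(1 + t^2) is even, so the transform is twice the cosine integral
    # over [0, inf); quad's oscillatory 'cos' weight handles the tail.
    val, _ = quad(lambda t: 1.0 / (1.0 + t * t), 0, np.inf,
                  weight='cos', wvar=2 * np.pi * f)
    return 2 * val

for f in [0.5, 1.0, 2.0]:
    print(f, fourier_component(f), np.pi * np.exp(-2 * np.pi * f))

# At f = 100 the analytic value pi * exp(-200 pi) is far below float range:
print(np.log10(np.pi) - 200 * np.pi / np.log(10))   # ~ -272.4, i.e. ~ 1e-273
```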
However, is this cancellation even too strong? Let us compare this to the renormalized vacuum polarization value. Is the contribution of |k| > 100 |q| absolutely negligible?
Hagen Kleinert (2013) calculates the effect of q ≠ 0 by using the q² derivative of Π₂(q²).
The arbitrary loop 4-momentum is in his nomenclature p, not k as in our blog text. The contribution of large |p| seems to be a decreasing geometric series. It is not exponentially decreasing.
This is not paradoxical. Even if the "disturbance" is very much free of high frequencies, its "impact" may be less free.
Is it possible that for some loop, the impact of a disturbance diverges, too? What would an ultraviolet divergence mean in such a case? If there are overlapping probabilities, can it mean an infinite energy for the generated wave?
Quantum gravity
Zvi Bern (2002) writes about divergences in quantum gravity. In Section 2.2 he states that gravity with matter typically diverges badly at one loop, and pure gravity at two loops.
Let us check if we can find a way to alleviate the problem.
There is another problem, too. The Feynman integral for just five loops of gravitons contains 10³⁰ terms! We have to find a simpler way to calculate the interaction. In our blog we hold the view that gravity is a combination of an attractive force and an increase in the inertia of a particle in a gravitational field. Could it be that calculations with inertia would be simpler than calculations with the metric tensor?
Ultraviolet divergence in a loop
In the blog post of September 24, 2025 we were able to explain away the infrared divergence by claiming that the produced wave contains an infinite number of low-energy bremsstrahlung photons. Does the same principle work for an ultraviolet divergence?
If we hit a rubber membrane with an infinitely sharp hammer, then it, presumably, creates the Coulomb field of a point charge, which has an infinite energy. That is why the hammer is never allowed to be infinitely sharp: destructive interference must cancel the infinite energy. The hammer must be blunt.
[Feynman diagram: two gravitons enter from the left, interact through virtual gravitons, and exit on the right; time t runs to the right]
Above we have a Feynman diagram. Let the lines represent gravitons and the vertices their gravity interaction. Two gravitons enter from the left, create new virtual gravitons, interact, and exit on the right.
The Feynman integral calculates the "4-point function", or the probability amplitude for the process to happen, assuming that the input flux from the left has a certain value.
The input gravitons coming from the left have some modest probability amplitudes. If the Feynman integral is infinite, that would mean that the output flux of the gravitons on the right is infinite. That is, we have created an infinite energy. This is clearly nonsensical. What is the problem?
The process has a classical limit. The input gravitons could be wave packets, and the output gravitons are wave packets. General relativity is supposed to conserve energy. We conclude that the ultraviolet divergent Feynman integral miscalculates the process, and badly.
Let us imagine that, in the vacuum polarization diagram, q is a graviton which interacts with the electron and the positron gravitationally. In that case, the interaction is proportional to
|k|.
Maybe the Feynman integral diverges, even after renormalization? What does that mean, physically?
The contribution to the probability amplitude of the process might be such that each interval
n ≤ |k| < n + 1
contributes an equal amount. The sum is infinite. Then the classical probabilities must overlap. That would mean that the process at the same time launches many pairs with various momenta k. We have suggested that the pair, as a whole, is a boson. Many bosons with various k are launched at the same time. The right side of the diagram then would describe a simultaneous absorption of all these bosons. In this interpretation, the various k are not separate classical probabilities. They overlap. Then there is no divergence in the classical probability.
An analogy: a sharp hammer hit to a rubber membrane produces many Fourier components with a large |k|. The next hit absorbs them all.
Suppose that q produces a wave packet of a pair which carries the momentum q. The Fourier decomposition of the wave packet contains many different momenta k. Why would the sum of the classical probabilities for each k be less than 1, or even finite?
Destructive interference should almost completely cancel high 4-momenta. For classical waves, it probably establishes a cutoff.
Why does a Feynman integral diverge? It may be due to the following:
1. the integral does not take into account destructive interference, or
2. the integral does not understand that classical probabilities for various 4-momenta k in a loop overlap.
Classically, the infrared divergence in bremsstrahlung is easy to understand. But having a large amount of very high 4-momenta k would be strange. When an electron passes a charge X+ inside a material, the polarization of the material varies smoothly. The Fourier decomposition of the polarization will not contain high frequencies.
Classically, a hit with an infinitely sharp hammer will contain very high frequencies. The energy of the hit probably is infinite. If the Green's function is not completely absorbed, then it may "leak" lots of very high frequencies, and we might get a classical divergence. An infinite amount of energy is created.
We already discussed one type of a classical ultraviolet divergence. If an electron receives an instantaneous impulse, and its acceleration is infinite, it will radiate an infinite energy.
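To make this concrete: by the classical Larmor formula, a velocity kick Δv delivered over a time τ radiates an energy proportional to 1/τ, which diverges as the impulse becomes instantaneous. A minimal sketch (SI units, an illustrative Δv):

```python
import numpy as np

q, eps0, c = 1.602e-19, 8.854e-12, 2.998e8   # electron charge, vacuum permittivity, c
dv = 1.0e6                                   # velocity change in m/s (illustrative)

# Larmor power P = q^2 a^2 / (6 pi eps0 c^3); with a = dv/tau held for a time tau,
# the radiated energy E_rad = q^2 dv^2 / (6 pi eps0 c^3 tau) -> infinity as tau -> 0.
for tau in [1e-15, 1e-18, 1e-21]:
    E_rad = q**2 * dv**2 / (6 * np.pi * eps0 * c**3 * tau)
    print(tau, E_rad, "J")
```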
Why do Feynman diagrams operate with a single hit of the hammer? The method does work in some cases, but in the general setting it is prone to create an infinite energy, that is, to fail miserably.
Question. Is destructive interference the correct method to regularize and renormalize all Feynman integrals? It is suspicious that integrals of the type
∫ d⁴k / |k|ⁿ
only converge slowly, or not at all. Exponential convergence might be more realistic?
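To see how slow the convergence is, reduce the integral to a radial one: in 4 Euclidean dimensions ∫ d⁴k / |k|ⁿ = 2π² ∫ k³⁻ⁿ dk, since the solid angle of the unit 3-sphere is 2π². A small sketch of the per-octave contributions:

```python
import numpy as np
from scipy.integrate import quad

def shell(n, a, b):
    # contribution of the shell a <= |k| < b to the integral of d^4k / |k|^n
    val, _ = quad(lambda k: 2 * np.pi**2 * k**(3 - n), a, b)
    return val

for n in [4, 5]:
    print(n, [round(shell(n, 2.0**i, 2.0**(i + 1)), 3) for i in range(5)])
# n = 4: every octave contributes the same (2 pi^2 ln 2): a logarithmic divergence
# n = 5: octave contributions halve each time: a slowly converging geometric series
```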
In quantum mechanics, one natural "convergence mechanism" is the uncertainty relation which says that a particle can "borrow" an energy E at most for a time
t < ħ / (2 E).
The "convergence" in it is quite slow.
Conclusions
Let us close this long post. We will continue the study of ultraviolet divergences in an upcoming post.
We have to analyze more thoroughly what an ultraviolet divergent Feynman integral really means. It can remain divergent even after renormalization, if the input (like q ≠ 0 in vacuum polarization) "disturbs" the integral value enough.
Classically, an ultraviolet divergent approximation formula is a very bad approximation, since it generates an infinite energy from a finite input. Feynman integrals may in some cases calculate the processes wrong, regardless of regularization and renormalization procedures.
In quantum gravity, ultraviolet divergences seem to happen inevitably at two loops. Furthermore, the Einstein equations are so complicated that a five-loop Feynman integral may contain 10³⁰ terms. We have to find a simpler method to calculate interaction processes.