Yours truly thinks the graph is just showing the instantaneous radiogenic heat fluxes and does NOT account for the delay time for the heat energy to propagate to the surface. But the original caption does NOT specify what exactly the curves are.
A radioactive nucleus may decay now or in a gigayear. There is, in principle, no way to tell.
However the average decay rate is well defined: it is an exponential decay:
N = N_0*exp(-t/t_e) , where N is the current number of radioactive nuclei in a sample, N_0 is the initial number, t is the time since time zero, and t_e is the e-folding time: the time for the number of radioactive nuclei to decrease by a factor of 1/e, where e = 2.7182818 ... is the base of the natural logarithm.
For relatively large samples, the exponential decay law is virtually exact.
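The exponential decay law can be sketched in a few lines of code. This is just an illustration of the formula above (the function name and sample numbers are made up for the example):

```python
import math

def n_remaining(n0, t, t_e):
    """Average number of radioactive nuclei remaining after time t,
    given initial number n0 and e-folding time t_e (same time units)."""
    return n0 * math.exp(-t / t_e)

# After one e-folding time, the sample has dropped by a factor of 1/e.
print(n_remaining(1.0e6, 1.0, 1.0))  # ~ 1.0e6/e = 367879.44 ...
```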
The more commonly known decay timescale is the half-life: the time for the number of radioactive nuclei to decrease by one half.
The relationship between half-life and e-folding time is derived as follows:
(1/2)N_0 = N_0*exp(-t_h/t_e)
(1/2) = exp(-t_h/t_e)
ln(1/2) = -t_h/t_e
ln(2) = t_h/t_e
t_h = t_e*ln(2)
t_e = t_h/ln(2) ,
where ln(2) = 0.6931 ... is the natural logarithm of 2 (see Wikipedia: Natural logarithm of 2).
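The conversion between half-life and e-folding time is a one-liner either way. A small sketch (the uranium-238 half-life of ∼ 4.468 Gyr is used here as a plausible illustrative input):

```python
import math

def e_folding_from_half_life(t_half):
    """t_e = t_h/ln(2)."""
    return t_half / math.log(2)

def half_life_from_e_folding(t_e):
    """t_h = t_e*ln(2)."""
    return t_e * math.log(2)

# Example: uranium-238 has a half-life of ~ 4.468 Gyr.
t_e = e_folding_from_half_life(4.468)
print(round(t_e, 3))  # ~ 6.446 Gyr
```

Note t_e > t_h always, since ln(2) < 1.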
These radioactive isotopes were created before the formation of the Solar System in supernovae which ejected them into the interstellar medium (ISM) whence they got into the solar nebula.
It's curious to think that if nature had given us a significantly lower abundance of uranium-235, we would NEVER have had nuclear power, nuclear weapons, and, for better or worse, balance of terror.
The heat energy from the past is still in the Earth's interior because the rate of heat energy flux by heat conduction and convection is just very low for large rocky bodies.
To give a comparison number, the worldwide commercial energy consumption rate is ∼ 18 TW = 18*10**12 W circa 2018.
Since we can only project harvesting a tiny fraction of the total heat energy flow to the surface, it's clear that geothermal power will NEVER be a major source of commercial power, though it can be locally important.
For another comparison, let's do a Fermi problem. The basal metabolic rate (BMR) for a human is ∼ 100 W. This is the power needed just to exist, while doing nothing.
There are of order 10**10 humans, and so the total power needed by humans just to veg is ∼ 1 TW.
The Earth's internal heat also drives uplift that keeps the continents from eroding away and leaving the Earth a water world.
Also, volcanic outgassing provides carbon dioxide (CO_2) to the Earth's atmosphere over geological time, while the CO_2 sink of calcium carbonate (CaCO_3) formation removes it over geological time. So without volcanic outgassing, photosynthesis and biota, at least on land, would gradually turn off.
So despite the dangers of earthquakes and volcanoes, having an active primordial-radiogenic heat geology (see also Wikipedia: Earth's internal heat budget: Radiogenic heat: Primordial heat) is necessary to the biosphere as we know it.
The radiant flux absorbed by the Earth's atmosphere and Earth is on average 240 W/m**2. The part that is absorbed by the ground and ocean is 170 W/m**2 (see Wikipedia: Earth's energy budget).
These heat fluxes are much larger than the Earth's geothermal heat flux.
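To make the comparison concrete, here is a sketch using the solar fluxes from the text and an assumed mean geothermal heat flux of roughly 0.09 W/m**2 (an order-of-magnitude value consistent with Wikipedia: Earth's internal heat budget):

```python
absorbed_flux = 240.0    # W/m**2, mean absorbed solar flux (from the text)
ground_flux = 170.0      # W/m**2, part absorbed by ground and ocean
geothermal_flux = 0.09   # W/m**2, rough mean geothermal flux (assumed)

# Solar input beats geothermal by some 3 orders of magnitude.
print(round(absorbed_flux / geothermal_flux))  # 2667
```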
So solar irradiance powers the biosphere and keeps it warm.
Praise the Sun.
However, the smaller the rocky body or rocky-icy body, the faster heat conduction releases the heat energy to outer space, and this lowers its power to drive primordial-radiogenic heat geology.
The resulting situation is as follows. The Earth and Venus have significant primordial-radiogenic heat geology. Mars has less, but some. Smaller rocky bodies or rocky-icy bodies (e.g., the Moon, Mercury, and the asteroids) have virtually none.
So such short-lived radioactive isotopes probably did drive primordial-radiogenic heat geology for a while in the early Solar System, but they all decayed away and do NOT drive primordial-radiogenic heat geology now.