Radioisotope dating of rocks and meteorites is perhaps the most potent claimed proof for the supposed old age of the earth and the solar system. The absolute ages yielded by the radioisotope dating methods lend an apparent aura of certainty to the claimed millions and billions of years for formation of the earth's rocks. Many in both the scientific community and the general public around the world thus remain convinced of the earth's claimed great antiquity.
However, accurate radioisotopic age determinations require that the decay constants of the respective parent radionuclides be accurately known and constant in time. Ideally, the uncertainty in the decay constants should be negligible compared to, or at least commensurate with, the analytical uncertainties of the mass spectrometer measurements entering the radioisotope age calculations (Begemann et al. 2001). Based on the ongoing discussion in the conventional literature, this is clearly still not the case. The stunning improvements in the performance of mass spectrometers over the past four or so decades, starting with the landmark paper by Wasserburg et al. (1969), have not been accompanied by any comparable improvement in the accuracy of the decay constants (Begemann et al. 2001; Steiger and Jäger 1977), in spite of ongoing attempts (Miller 2012). The uncertainties associated with direct half-life determinations remain, in most cases, at the 1% level, which is significantly better than any radioisotope method for determining the ages of rock formations. Yet even uncertainties of only 1% in the half-lives lead to very significant discrepancies in the derived radioisotope ages. The recognition of an urgent need to improve the situation is not new (for example, Min et al. 2000; Renne, Karner, and Ludwig 1998); it continues to be voiced, at one time or another, by every group active in geo- or cosmochronology (Schmitz 2012).
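To make the scale of such discrepancies concrete, the short Python sketch below propagates a 1% half-life uncertainty through the standard single parent-daughter age equation, t = ln(1 + D/P)/λ with λ = ln 2/t½. The half-life value (roughly that often quoted for 87Rb) and the daughter/parent ratio used here are illustrative assumptions, not figures taken from the works cited above.

```python
import math

# Age equation for a single parent-daughter pair:
#   t = (1/lambda) * ln(1 + D/P),  lambda = ln(2) / half_life
# where D is the radiogenic daughter and P the remaining parent today.

def age_years(daughter_parent_ratio: float, half_life_years: float) -> float:
    """Model age in years from a radiogenic daughter/parent ratio."""
    decay_const = math.log(2) / half_life_years  # lambda, per year
    return math.log(1.0 + daughter_parent_ratio) / decay_const

# Assumed inputs for illustration: an ~48.8 Gyr half-life and a D/P ratio
# back-computed so the nominal age comes out at 4.5 Gyr.
half_life = 48.8e9  # years (assumed value)
ratio = math.exp(math.log(2) / half_life * 4.5e9) - 1.0

t_nominal = age_years(ratio, half_life)
t_high = age_years(ratio, half_life * 1.01)  # half-life +1%
t_low = age_years(ratio, half_life * 0.99)   # half-life -1%

print(f"nominal age:   {t_nominal / 1e9:.3f} Gyr")
print(f"+1% half-life: {t_high / 1e9:.3f} Gyr ({(t_high - t_nominal) / 1e6:+.0f} Myr)")
print(f"-1% half-life: {t_low / 1e9:.3f} Gyr ({(t_low - t_nominal) / 1e6:+.0f} Myr)")
```

Because the computed age scales linearly with the assumed half-life, a 1% shift in the half-life shifts a nominal 4.5-billion-year age by roughly 45 million years, which is far larger than the analytical precision of modern mass spectrometer measurements.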