The determination of the
absolute age of fossils has been invaluable to biologists as they trace the
history of life and the evolution of living organisms. One of the most common
techniques for determining these ages is radiometric dating, which is based on
studies of radioisotope decay carried out by physicists and chemists at the
turn of the twentieth century.
In 1896, during his work on uranium salts, Henri Becquerel, a teacher of Marie
Curie, serendipitously found that uranium spontaneously emitted radiation,
which led to his discovery of radioactivity. Building upon Becquerel's
findings, Ernest Rutherford, the father of nuclear physics, and his student
Frederick Soddy, working
at McGill University in 1902, found that the process of radioactive decay
changes atoms from one element (parent isotope) to another (daughter isotope)
at a constant rate. (Isotopes have the same number of protons in their nucleus
but a different number of neutrons, giving rise to the same element but with
different atomic masses.) Rutherford and Soddy predicted the time it would take
for one-half of the atoms of an isotope to decay, and this half-life (t½) is
unique to each isotope. Carbon-14 has a t½ of 5,730 years and has been used in
more recent times to determine the age of organic materials, such as wood,
bone, shells, and fabrics that are up to 75,000–80,000 years old. By contrast,
uranium-238, with a t½ of 4.5 billion years, is used to determine the age of
older fossil samples.
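Although the account above is historical, the arithmetic behind these age estimates is compact enough to illustrate. The short Python sketch below assumes the standard decay law N(t) = N0 · (1/2)^(t/t½) and the half-lives quoted above; the function name radiometric_age is an illustrative label, not something from the original text.

import math

def radiometric_age(parent_fraction, half_life_years):
    # Invert the decay law N(t) = N0 * (1/2) ** (t / t_half):
    #     t = t_half * log2(N0 / N)
    return half_life_years * math.log2(1.0 / parent_fraction)

# A sample retaining 25% of its original carbon-14 (two half-lives)
# dates to about 2 * 5,730 = 11,460 years.
print(radiometric_age(0.25, 5730))    # ~11460 years

# A sample in which half of the original uranium-238 remains dates to
# one half-life, about 4.5 billion years.
print(radiometric_age(0.50, 4.5e9))   # ~4.5e9 years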
Bertram Boltwood was a Yale University
radiochemist and pioneer in the study of radioisotopes. Initially a
correspondent and later a close friend of Rutherford, in 1907 he was the first
to apply Rutherford's principles to what would become radiometric dating. Using
the ratio of uranium-238 to its decay product lead-206, Boltwood estimated the
age of the Earth to be 2.2 billion years, roughly ten times older than previous
estimates but only about half the currently accepted value.
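For the uranium-lead case, the age is usually expressed in terms of the measured daughter-to-parent ratio rather than the surviving parent fraction. The sketch below is an illustration only: it uses the modern half-life of uranium-238 and assumes the mineral contained no lead-206 when it formed, and it is not a reconstruction of Boltwood's actual calculation, which predated accurate decay constants.

import math

def age_from_daughter_ratio(daughter_to_parent, half_life_years):
    # With no daughter isotope present at formation, P0 = P + D, so
    #     t = t_half * log2(1 + D / P)
    return half_life_years * math.log2(1.0 + daughter_to_parent)

# A lead-206/uranium-238 ratio of about 0.4 corresponds to roughly
# 2.2 billion years, the order of magnitude Boltwood reported.
print(age_from_daughter_ratio(0.4, 4.5e9))   # ~2.2e9 years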