The method was developed by Willard Libby in the late 1940s and soon became a standard tool for archaeologists.
The older a sample is, the less carbon-14 (¹⁴C) there is to be detected. Because the half-life of ¹⁴C (the period of time after which half of a given sample will have decayed) is about 5,730 years, the oldest dates that can be reliably measured by radiocarbon dating are around 50,000 years ago, although special preparation methods occasionally permit dating of older samples.
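The relationship between remaining ¹⁴C and sample age follows the exponential decay law, which can be sketched in a few lines of Python. The function name below is illustrative, not from the original text:

```python
import math

HALF_LIFE = 5730.0  # approximate half-life of carbon-14, in years

def age_from_fraction(remaining_fraction: float) -> float:
    """Estimate a sample's age from the fraction of its original 14C
    that remains, using N(t) = N0 * (1/2)**(t / HALF_LIFE)."""
    return -HALF_LIFE * math.log2(remaining_fraction)

# After one half-life, half the 14C remains:
#   age_from_fraction(0.5) -> 5730.0
# Near the ~50,000-year limit (about 8.7 half-lives), well under 1%
# of the original 14C survives, which is why older samples are so
# difficult to measure reliably.
```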
The idea behind radiocarbon dating is straightforward, but years of work were required to develop the technique to the point where accurate dates could be obtained.
Research has been ongoing since the 1960s to determine what the proportion of ¹⁴C in the atmosphere has been over the past fifty thousand years.
The resulting data, in the form of a calibration curve, is now used to convert a given measurement of radiocarbon in a sample into an estimate of the sample's calendar age.
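Conceptually, converting a radiocarbon measurement to a calendar age is a lookup on the calibration curve. The sketch below uses linear interpolation over a handful of hypothetical (radiocarbon age, calendar age) pairs; a real curve such as IntCal has thousands of points and includes uncertainties, which are omitted here:

```python
from bisect import bisect_left

# Hypothetical calibration points, sorted by radiocarbon age.
# These numbers are placeholders for illustration only.
CURVE = [(1000, 930), (1500, 1390), (2000, 1950), (2500, 2590)]

def calibrate(radiocarbon_age: float) -> float:
    """Linearly interpolate a calendar age from the calibration curve."""
    xs = [rc for rc, _ in CURVE]
    i = bisect_left(xs, radiocarbon_age)
    if i == 0:
        return CURVE[0][1]   # clamp below the curve's range
    if i == len(CURVE):
        return CURVE[-1][1]  # clamp above the curve's range
    (x0, y0), (x1, y1) = CURVE[i - 1], CURVE[i]
    return y0 + (y1 - y0) * (radiocarbon_age - x0) / (x1 - x0)
```

In practice this conversion is done with dedicated calibration software rather than hand-rolled interpolation, but the underlying idea is the same: the measured radiocarbon age is mapped through the empirically determined curve to a calendar-age estimate.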
Other corrections must be made to account for the proportion of ¹⁴C in different types of organisms (fractionation) and the varying levels of ¹⁴C throughout the biosphere (reservoir effects).
Additional complications come from the burning of fossil fuels such as coal and oil, and from the above-ground nuclear tests conducted in the 1950s and 1960s. Because the time it takes to convert biological materials to fossil fuels is substantially longer than the time it takes for their ¹⁴C to decay below detectable levels, fossil fuels contain almost no ¹⁴C, and burning them has lowered the proportion of ¹⁴C in the atmosphere. By contrast, the nuclear tests increased the amount of ¹⁴C in the atmosphere, which attained a maximum in 1963 of almost twice what it had been before the testing began.
Measurement of radiocarbon was originally done by beta-counting devices, which counted the amount of beta radiation emitted by decaying ¹⁴C atoms in a sample. More recently, accelerator mass spectrometry has become the method of choice; it counts all the ¹⁴C atoms in the sample, and not just the few that happen to decay during the measurements. It can therefore be used with much smaller samples (as small as individual plant seeds), and gives results much more quickly.
The development of radiocarbon dating has had a profound impact on archaeology.