This is a serious question because I genuinely don't know the answer.
We know that isotopes decay at a steady, predictable rate, and we can measure the proportions of the isotopes present today. But how does that help us determine a sample's age unless we also know the starting levels of the different isotopes?
For example, suppose some mythical isotope 23C decays into 21C with a half-life of one year, and we can see that a given sample is 90% 21C and 10% 23C. Then a year ago it would have been 80% 21C and 20% 23C; two years ago, 60% 21C and 40% 23C; three years ago, 20% 21C and 80% 23C. But how do we know whether the sample is one, two or three years old unless we know how much 23C there was to begin with?
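To spell out that arithmetic, here's a quick Python sketch; the isotope names, the one-year half-life, and the percentages are just my made-up example, not real data:

import math

HALF_LIFE_YEARS = 1.0  # my mythical 23C -> 21C half-life

def parent_fraction(years_ago, current_fraction):
    # Running decay backwards: the parent isotope doubles every half-life.
    return current_fraction * 2 ** (years_ago / HALF_LIFE_YEARS)

for years in (1, 2, 3):
    print(f"{years} year(s) ago: {parent_fraction(years, 0.10):.0%} 23C")
# prints 20% 23C, 40% 23C, 80% 23C -- matching the percentages above

def age_in_years(initial_fraction, current_fraction):
    # Solve current = initial * (1/2)**(t / half_life) for t.
    return HALF_LIFE_YEARS * math.log2(initial_fraction / current_fraction)

print(age_in_years(0.80, 0.10))  # 3.0, but only because I *assumed* 80% to start

The last line only recovers an age of three years because I fed it the 80% starting figure, and knowing that starting figure is exactly the part I don't understand.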
I realise real radioactive decay isn't always as convenient as one isotope decaying into another isotope of the same element, and with so many different decay processes going on there may be factors that help corroborate an age. But if someone could explain how that works (or point me to useful links), I'd appreciate it.
Here's a good place to start:
Radiometric Dating
The article assumes only some pre-calculus math.
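For reference, the core pre-calculus relationship an article like that builds on is the standard exponential decay law; this is textbook material, not anything specific to that particular article:

N(t) = N_0 \, e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{t_{1/2}}, \qquad t = \frac{1}{\lambda} \ln\frac{N_0}{N(t)}

Note that solving for t still involves N_0, the initial amount of the parent isotope, which is the crux of the question above: the article's job is to explain how that starting quantity can be pinned down in practice.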