Here is something that has always puzzled me about radiocarbon dating. (I have only a layman's understanding of the subject.)
The decay rate of the isotope C14 is known. If I can measure today's amount of C14 in a piece of wood, and I know how much C14 it contained at the time the plant died, I can work out how long the plant has been dead. So far so good.
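Just to make sure I have the mechanics right, here is the calculation as I understand it, written as a toy Python sketch (assuming the commonly quoted 5,730-year half-life; the function name and the 25% example are mine, purely for illustration):

```python
import math

HALF_LIFE_C14 = 5730.0  # years (commonly quoted half-life of C14)

def age_from_ratio(n_now, n_at_death):
    """Infer elapsed time from exponential decay: N(t) = N0 * 2**(-t / t_half)."""
    return HALF_LIFE_C14 / math.log(2) * math.log(n_at_death / n_now)

# Example: if only 25% of the original C14 remains, two half-lives have passed.
print(age_from_ratio(0.25, 1.0))  # ~11,460 years
```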
My problem is: How can I tell with certainty how much C14 was contained in a tree some 50,000 years ago? The amount of C14 in the wood is tied to the proportion of C14 in the atmosphere's carbon at the time. Apparently, this proportion is determined mostly by cosmic radiation, which replenishes the ever-decaying supply of C14 on Earth. That radiation has been observed to vary (slightly) over time, even within the half-century or so during which we have paid attention to it. Isn't it a rather brave extrapolation to claim that the amount of C14 on Earth hasn't changed significantly in the last 50,000 years? And wouldn't you also expect concentrations to vary locally on Earth at any given time, depending on radiation patterns and on how weather mixes C14 through the atmosphere?
There must be a simple answer to this question, but I have never heard anyone even bring up the issue: What is the basis for the assumption that the initial concentration of C14 in a dying organism was the same 50,000 years ago as it is today? And how do we estimate the error in a radiocarbon date, given the uncertainty in past C14 concentrations?
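To make the second question concrete: if the decay law above is right, then an error in the assumed initial C14 level translates directly into an age offset of roughly 8,267 years (the mean lifetime, i.e. the half-life divided by ln 2) times the fractional error, regardless of the sample's actual age. A rough sketch of what I mean, under the same assumptions as above:

```python
import math

HALF_LIFE_C14 = 5730.0  # years

def age_shift_from_initial_error(fractional_error):
    """How much older (or younger) a sample looks if the assumed initial
    C14 concentration is off by the given fraction (e.g. 0.05 = 5% too high)."""
    return HALF_LIFE_C14 / math.log(2) * math.log(1.0 + fractional_error)

# Assuming 5% more initial C14 than there really was makes every
# sample appear ~400 years older than it actually is.
print(age_shift_from_initial_error(0.05))  # ~403 years
```

So even a few percent of drift in the atmospheric concentration would seem to matter for precise dates, which is exactly why I am asking how that initial value is pinned down.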