I think my point didn't get across well.
Let me explain:
Any specimen that becomes fossilized begins decaying from whatever ratio of isotopes was present in the world at the moment of its creation. So for any antediluvian-era fossil (that is, from before Noah's flood), when there was very little radiation, very few isotopes would have been present on earth. Since all specimens after Noah's flood contain more isotopes than pre-flood ones, today's higher rate of isotope generation is treated as though it had always been constant. Based on this assumption, radiometric dating will always interpret a small amount of isotopes in a sample as what is "left over" from decay, rather than considering that it could instead be the amount actually generated, or the original amount created by radiation, at the specimen's formation.
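To make the dependence concrete, here is a minimal numeric sketch of the standard decay inversion, t = ln(N0/N) / λ. Everything in it is illustrative (the carbon-14 half-life is a real published value, but the isotope amounts and the `inferred_age` helper are hypothetical, not from any actual dating procedure); it only shows that the age a lab computes depends entirely on the initial amount N0 it assumes:

```python
import math

HALF_LIFE = 5730.0                    # carbon-14 half-life in years
DECAY_CONST = math.log(2) / HALF_LIFE # lambda in the decay law N(t) = N0 * exp(-lambda * t)

def inferred_age(measured, assumed_initial):
    """Invert the decay law: t = ln(N0 / N) / lambda."""
    return math.log(assumed_initial / measured) / DECAY_CONST

# A sample is measured today with 10 units of the isotope (made-up number).
measured = 10.0

# If the method assumes the sample started with 100 units, the low
# measurement reads as ~19,000 years of decay.
age_assuming_high_start = inferred_age(measured, assumed_initial=100.0)

# If the sample actually started with only 12 units (a low initial
# amount rather than heavy decay), the computed age is far smaller.
age_assuming_low_start = inferred_age(measured, assumed_initial=12.0)

print(round(age_assuming_high_start))  # large inferred age
print(round(age_assuming_low_start))   # much smaller inferred age
```

Same measurement, very different ages: the whole difference comes from the assumed starting amount, which is the point the paragraph above is arguing about.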