A nice theory is one side of the coin; in day-to-day practice, however, there are many more hurdles to clear before accurate and reliable results are obtained.
The following example, which is only seemingly exotic, shows what that means. It asks the question:
How accurate are precise data?
I have analyzed the raw data of three TRITON-generated data files of approximately the same length (~11.5 h), all measured on an Ames Nd standard sample. The data-acquisition parameters are said to be identical:
F4 and F12 obtained from J. Schwieters (Thermo, Bremen)
P12 obtained from G. Caro (IPG, Paris)
The following figure shows a plot of the 144/146Nd raw ratios for these three files:
Using the La Jolla standard value (RN = R144/146 = 1.3852), one observes a remarkable static offset of the measured raw ratio r144/146 for all three files:
The time scale is in seconds; the total length of each run is ~11.5 hours.
One option is the (questionable) explicit correction of the raw ratios before the fractionation-correction formulae are applied, as shown in this note:
This static offset is defined, per mass unit and using block means only, as

    offset(0) = (rN(0) − RN) / (RN · Δm),

with Δm = 2 for the 144/146 mass pair.
rN(0) is the raw ratio of the i144 and i146 ion currents at the very beginning of the data acquisition. The question is:
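As a rough sketch, the per-mass-unit static offset can be computed from the first block mean of the raw ratio. This assumes the offset is the relative deviation of rN(0) from RN, normalized by the two-mass-unit spacing of the 144/146 pair; the raw-ratio value used below is illustrative, not taken from the actual TRITON files:

```python
# Sketch: per-mass-unit static offset of the raw 144/146 ratio
# relative to the La Jolla reference value. The value r_N0 below
# is a hypothetical first block mean, not real file data.

R_N = 1.3852          # La Jolla reference ratio 144Nd/146Nd
DELTA_M = 2.0         # mass difference between 144 and 146 (amu)

def static_offset_per_amu(r_N0, R_N=R_N, delta_m=DELTA_M):
    """Relative static offset of the raw ratio, normalized per mass unit."""
    return (r_N0 / R_N - 1.0) / delta_m

r_N0 = 1.3839         # hypothetical block mean of the i144/i146 ratio
offset = static_offset_per_amu(r_N0)
print(f"static offset: {offset * 1e3:.3f} permil per amu")
```

A run whose raw ratio starts below the reference value yields a negative offset, as in this example.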
Does this offset have any influence on the fractionation-corrected "true" ratio?
I will offer two answers. The first is the so-to-speak "automatic" (or implicit) correction of mass-dependent static discrimination, which arises as a favorable side effect of fractionation correction, as shown in this section:
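To illustrate this implicit correction, the sketch below applies a standard exponential-law fractionation correction (normalizing 146/144 to 1/1.3852) to raw ratios carrying a hypothetical mass-dependent static discrimination. The discrimination factor, the "true" 143/144 ratio, and the form of the bias are all illustrative assumptions, not values from the files discussed above:

```python
import math

# Sketch: exponential-law fractionation correction absorbing a
# mass-dependent static discrimination of the raw ratios.
# All numerical inputs below are illustrative.

M143, M144, M146 = 142.9098, 143.9101, 145.9131  # atomic masses (amu)
R_NORM = 1.0 / 1.3852      # reference 146Nd/144Nd used for normalization
R_TRUE_143 = 0.511         # hypothetical "true" 143Nd/144Nd

def exp_correct(r_meas_143, r_meas_146):
    """Correct measured 143/144 via exponential law, using 146/144."""
    beta = math.log(R_NORM / r_meas_146) / math.log(M146 / M144)
    return r_meas_143 * (M143 / M144) ** beta

# Simulate a static, mass-dependent discrimination: each ratio i/144
# is biased by f**dm, with dm the mass difference to 144 (f hypothetical).
f = 1.0005
r_meas_146 = R_NORM * f ** (M146 - M144)
r_meas_143 = R_TRUE_143 * f ** (M143 - M144)

corrected = exp_correct(r_meas_143, r_meas_146)
print(corrected)   # close to R_TRUE_143: the static bias is absorbed
```

Because the simulated bias is itself mass-dependent, the normalization soaks it up almost entirely; only a small second-order residual remains, which is the "favorable side effect" referred to above.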
High-precision runs seem to require many more checks, and still rely on not really proven correction methods, before their high precision can translate into seriously high accuracy.