# Thread: Uncertainty of Measurement and Precision in Chemistry test

1. Hello all,
I know some of this but obviously not all, which is why I'm looking for help (i.e. a simple enough explanation for an idiot like me).

For a lot of the tests in the laboratory there is a target set at 0.10 µg/L, and the allowed uncertainty of measurement (UoM) for that target is <30% (0.03 µg/L).

I worked out how to calculate the uncertainty figure from the random and systematic errors, and it came out at 12.8%, so I thought that was that: well within spec. However, the result was flagged as outside the target because, as well as the UoM, the test precision is recorded on our QC chart, and that is running at 15.5%. I'm trying to understand how that can be. I know the expected/accepted precision target is 10%, so would that on its own be the reason for querying the UoM, or is it a combination of the two, i.e. -

0.10 (target) × 12.8% (UoM) × 15.5% (precision) = 0.13187 µg/L, i.e. applying the worst case of both the UoM and the precision to the 0.10 µg/L target, the deviation from target would be 0.032 µg/L, which is further away than the allowed 0.030 µg/L and so exceeds the 30% limit (0.03 µg/L). Or am I off track?
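To sanity-check the arithmetic, here is a minimal sketch in Python. It assumes the 12.8% UoM and 15.5% precision are both relative quantities at the 0.10 µg/L target (variable names are illustrative), and it compares the simple worst-case addition of the two against combining them in quadrature, which is the conventional way to merge independent uncertainty components; your lab's procedure may specify something different.

```python
target = 0.10          # ug/L target concentration
uom = 0.128            # 12.8% relative uncertainty of measurement
precision = 0.155      # 15.5% relative precision (e.g. %CV from the QC chart)
allowed = 0.30         # allowed 30% deviation, i.e. 0.030 ug/L at this target

# Worst case: simply add the two relative effects.
worst_case_dev = target * (uom + precision)

# Quadrature: combine independent relative components as sqrt(a^2 + b^2).
combined_rel = (uom**2 + precision**2) ** 0.5
quadrature_dev = target * combined_rel

print(f"worst-case deviation: {worst_case_dev:.4f} ug/L")
print(f"quadrature deviation: {quadrature_dev:.4f} ug/L")
print(f"allowed deviation:    {target * allowed:.3f} ug/L")
```

Under these assumptions the simple sum gives 0.0283 µg/L and the quadrature combination about 0.0201 µg/L, both inside the 0.030 µg/L limit, so it may be worth checking with whoever flagged the result exactly how they expect the two figures to be combined.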

I'd really appreciate an idiot's explanation from anyone, as I have to apply this to a number of other tests, and I had been expecting to calculate the UoM, get a % result and move on. Instead I'm possibly looking at calculating a result each time (as I've tried above) using both the UoM and precision figures, rather than just checking that the % UoM calculation is within specification.
