Perhaps our resident mathematicians can help settle an argument at work about the accuracy of fractions versus decimals. I'm saying that fractions are more exact. I say this because there are certain fractions that cannot be written in decimal form as accurately as the fraction they represent, such as .33333 repeating versus 1/3. The people against me say that although it appears as if fractions are more accurate, they are not in all cases. They say that in certain calculations decimals work more accurately than fractions. One guy said that calculating square roots is more exact using decimals, although I couldn't follow his reasoning.
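For what it's worth, here's a quick Python sketch of the contrast I mean, using the standard-library `fractions` module for exact rational arithmetic and ordinary floats for decimals (the specific numbers are just illustrative examples):

```python
from fractions import Fraction

# Exact rational arithmetic: 1/3 + 1/3 + 1/3 is exactly 1
exact = Fraction(1, 3) + Fraction(1, 3) + Fraction(1, 3)
print(exact)          # 1

# Floating-point decimals accumulate rounding error
approx = 0.1 + 0.1 + 0.1
print(approx)         # 0.30000000000000004
print(approx == 0.3)  # False

# Square roots are a different story: sqrt(2) is irrational,
# so NEITHER a fraction NOR a terminating decimal can be exact;
# both can only approximate it to some tolerance.
import math
print(math.sqrt(2))   # 1.4142135623730951 (an approximation either way)
```

Note that the square-root case doesn't really favor decimals over fractions: an irrational number has no exact representation in either form.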

I work with some smart people who do a lot of calculating in a day, so I have a feeling I'm wrong. What are the opinions of our experts here?