An interesting concept, precision... or, better, relevant precision... Take depth, for example. Just how deep am I if I'm 100' deep? Depending on body position, I can occupy nearly six feet of "depth" all at the same time, so any precision beyond about the nearest meter or yard is hardly worth messing with. That's why I liken much of our dive planning, especially when dealing with decompression algorithms, to measuring with a micrometer, marking with chalk and cutting with an ax... News flash: it's not relevant on paper or in the water.
On the other hand, changes of just a foot or two are physiologically significant in bubble dynamics; when the deco model assumes a steady stop, those changes can have profound consequences for the model's efficacy. So depth control may need to be more precise than depth measurement... but I can't control my depth precisely unless I have a way to measure it precisely - more precisely than is relevant to the question of "how deep am I?" alone.
And so it is with PO2, EAD, END, gas analysis, consumption rate, etc.
But...
Let me use a military metaphor to illustrate my next point.
The ballistic dispersion of a dumb bomb is about four mils. That is, if you drop a whole bunch of bombs from precisely the same point and under precisely the same conditions, their trajectories won't be precisely the same, but will vary (disperse) as much as four feet per thousand feet of travel. So, if a dive bomber drops a bomb at a slant range of 5000 feet from the target, no matter how precisely the bomb is aimed, the bomb can only be counted on to come within about 20 feet of the target, which makes the aim point "imprecise" or only "relevant" within that 40' diameter circle. However... if my objective is to actually hit the target, and avoid having to come back and get shot at tomorrow, my chances of success are best with very precise aim - far more precise than the ballistic dispersion I know is a physical property of the weapon's trajectory.
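The arithmetic in that metaphor is simple enough to sketch. Here's a minimal, hypothetical helper (the function name and the one-foot-per-mil-per-thousand-feet convention follow the post's own usage, not any ballistics reference):

```python
def dispersion_radius_ft(slant_range_ft, dispersion_mils=4):
    """Miss distance (in feet) attributable to ballistic dispersion alone.
    One mil is treated as one foot of spread per thousand feet of travel,
    as described in the text."""
    return dispersion_mils * slant_range_ft / 1000.0

# At a 5000 ft slant range, 4 mils of dispersion gives a 20 ft radius,
# i.e. the 40' diameter circle mentioned above.
print(dispersion_radius_ft(5000))  # 20.0
```

The point survives the arithmetic: the 20-foot radius is fixed by the weapon, so precise aim doesn't shrink the circle, it just centers it on the target.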
ergo...
making calculations a bit more precise than is relevant may not matter much in a given instance, but it does help define the bullseye.
In other words, calculating (and measuring) my FO2 to the nearest hundredth of a percent in a +/- 1% world may not be relevant, but it's harmless so long as I bear in mind that my calculation is just the center point on a probability curve that stretches a whole percentage point to either side.
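To see what that +/- 1% band actually costs you in the water, here's a quick sketch using the standard MOD (maximum operating depth) formula for seawater at 33 ft per atmosphere; the 32% mix and 1.4 PO2 limit are just illustrative numbers, not anything from the post:

```python
def mod_ft(fo2, po2_max=1.4):
    """Maximum operating depth in feet of seawater for a given oxygen
    fraction, using the standard MOD formula: depth at which the
    partial pressure of O2 reaches po2_max (33 fsw per atmosphere)."""
    return (po2_max / fo2 - 1) * 33

# A nominal 32% mix, analyzed with +/- 1% uncertainty:
for fo2 in (0.31, 0.32, 0.33):
    print(f"FO2 {fo2:.2f}: MOD {mod_ft(fo2):.0f} ft")
```

The MOD for that "32%" fill spans roughly 107 to 116 ft across the analyzer's +/- 1% band, which is exactly the "range of reality" the two-decimal answer tends to obscure.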
Now, is it reasonable to ask for a two decimal answer for FO2(%.xx) in a class? Not in my class, at least, though if someone does it I'll usually say something along the lines of "that's fine, just remember that it's far finer than is really meaningful..."
I like to use terms like "about 21%" to emphasize the range of reality.
Rick