The difference between accuracy and precision is that accuracy means how close a measured value is to the actual value, while precision means how close repeated measured values are to each other.
We use many devices to measure physical quantities, such as length, time, and temperature. Each measuring device has a least count, and its precision depends on that least count; therefore, every device has some limit of precision.
Accuracy Vs Precision (Comparison table)
| Accuracy | Precision |
| --- | --- |
| Accuracy means how close the results are to the actual value. | Precision means how close the results are to each other. |
| Accuracy is the measure of closeness of a quantity to reality. | Precision measures how well measurements can be reproduced. |
| Accuracy takes the accepted value into account. | Precision does not take the accepted value into account. |
| In accuracy, bad results lie far from the actual value. | In precision, bad results are scattered. |
| Accuracy does not speak about quality. | Precision speaks about quality. |
| Accuracy has a single factor. | Precision has multiple factors. |
| Accuracy is concerned with systematic error. | Precision is concerned with random error. |
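The last row of the table can be illustrated with a small simulation. This is a hedged sketch, not a standard recipe: the instrument names, the hypothetical true value of 20.5 cm, and the error sizes are all assumptions chosen to make the contrast visible. Instrument A has a systematic error (a constant shift), so it is precise but inaccurate; instrument B has only random error, so it is accurate on average but imprecise.

```python
import random
import statistics

random.seed(0)
TRUE_VALUE = 20.5  # cm, hypothetical "actual" length of the object

# Instrument A suffers a systematic error: every reading is shifted the
# same way, so the results are precise (close together) but not accurate.
readings_a = [TRUE_VALUE + 0.8 + random.gauss(0, 0.02) for _ in range(10)]

# Instrument B suffers random error: readings scatter around the true
# value, so the results are accurate on average but not precise.
readings_b = [TRUE_VALUE + random.gauss(0, 0.5) for _ in range(10)]

for name, data in (("A (systematic)", readings_a), ("B (random)", readings_b)):
    bias = abs(statistics.mean(data) - TRUE_VALUE)  # accuracy: closeness to truth
    spread = statistics.stdev(data)                 # precision: mutual closeness
    print(f"{name}: mean error = {bias:.2f} cm, spread = {spread:.2f} cm")
```

The mean error measures accuracy (it stays large for A no matter how many readings we take), while the standard deviation measures precision (it stays large for B).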
Now let's learn about accuracy and precision in detail with examples, so stay with us for a few minutes.
What is accuracy?
The accuracy of a measurement is the difference between your measurement and the accepted correct answer. The bigger the difference, the less accurate your measurement. An accurate measurement is one which has a small fractional or percentage error. The accuracy of a measurement depends on the fractional or percentage uncertainty in that measurement.
For example, when the length of an object is recorded as 20.5 cm using a meter rod whose smallest division is a millimeter, the result is the difference of two readings, at the initial and final positions. The uncertainty in a single reading, as discussed before, is taken as ± 0.05 cm, which is doubled for the two readings and called the absolute uncertainty, equal to ± 0.1 cm. The absolute uncertainty, in effect, is equal to the least count of the measuring instrument. This is called the precision.
Precision or absolute uncertainty (least count) = ± 0.1 cm
Fractional uncertainty = 0.1 cm / 20.5 cm ≈ 0.005
Percentage uncertainty = (0.1 cm / 20.5 cm) × 100 % ≈ 0.5 %
Another measurement, taken by vernier calipers with a least count of 0.01 cm, is recorded as 0.45 cm. It has
Precision or absolute uncertainty = ± 0.01 cm
Fractional uncertainty = 0.01 cm / 0.45 cm ≈ 0.02
Percentage uncertainty = (0.01 cm / 0.45 cm) × 100 % ≈ 2 %
So the reading 20.5 cm taken by the meter rule, although less precise, is more accurate, having the smaller percentage uncertainty or error.
Whereas the reading 0.45 cm taken by the vernier calipers is more precise but less accurate: it is the relative (percentage) uncertainty that is important. The smaller a physical quantity, the more precise the instrument used to measure it should be. Here the measurement 0.45 cm demands a more precise instrument, such as a micrometer screw gauge with a least count of 0.001 cm.
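The worked example above can be reproduced in a short Python sketch. The helper name `uncertainties` is ours, not a standard function; it simply packages the three quantities computed above for any reading and least count.

```python
def uncertainties(reading_cm, least_count_cm):
    """Return (absolute, fractional, percentage) uncertainty for a reading."""
    absolute = least_count_cm          # precision = least count of the instrument
    fractional = absolute / reading_cm # absolute uncertainty / measured value
    return absolute, fractional, fractional * 100

# Meter rule: least count 0.1 cm, reading 20.5 cm -> about 0.5 %
print(uncertainties(20.5, 0.1))
# Vernier calipers: least count 0.01 cm, reading 0.45 cm -> about 2 %
print(uncertainties(0.45, 0.01))
```

Running it confirms the comparison in the text: the meter-rule reading has the larger absolute uncertainty but the smaller percentage uncertainty.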
What is precision?
A precise instrument is the one that has less absolute uncertainty. The precision of a measurement is determined by the instrument or device being used.
The precision of a measurement also depends upon the size of the unit you use to make it: the smaller the unit, the more precise the measurement. In this sense, the precision of a measurement describes the units you used to measure something.
For example, you might describe your height as "about 6 feet". That would not be very precise. If, however, you said that you were "74 inches tall", that would be more precise.
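The height example can be checked numerically. This is a sketch under an assumed true height of 74 inches: rounding to the nearest foot uses a unit twelve times larger than rounding to the nearest inch, so its worst-case error is correspondingly larger.

```python
true_height_in = 74  # inches (hypothetical true height)

# Measuring to the nearest foot uses a larger unit than measuring to
# the nearest inch, so it is the less precise of the two statements.
nearest_foot = round(true_height_in / 12) * 12  # "about 6 feet" -> 72 inches
nearest_inch = round(true_height_in)            # "74 inches tall"

print(abs(nearest_foot - true_height_in))  # error of the coarse measurement: 2
print(abs(nearest_inch - true_height_in))  # error of the fine measurement: 0
```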