Accuracy

In error theory, accuracy is the degree of correspondence between the value inferred from a series of measured values (a data sample) and the true or reference value, i.e. the difference between the average sample value and the true or reference value. It indicates how close the value found is to the real one. Accuracy is a qualitative concept that depends on both random and systematic errors. See also Accuracy vs Precision.

Accuracy is the degree of agreement of the measured physical quantity with its true magnitude. It can also be defined as the maximum amount (error) by which the result differs from the true value, or as the nearness of the measured value to its true value, often expressed as a percentage. It is also a static characteristic of an instrument.

The concept of the accuracy of a measurement is a qualitative one. An appropriate approach to stating this closeness of agreement is to identify the measurement errors and to quantify them by the value of their associated uncertainties, where an uncertainty is the estimated range of values of an error. Accuracy depends on the inherent limitations of the instrument and on shortcomings in the measurement process. Often an estimate of the error is based on a reference value used during the instrument’s calibration as a surrogate for the true value. A relative error based on this reference value is estimated by:

\[\textrm{Accuracy (A)}=\dfrac{\lvert\,\textrm{measured value}-\textrm{reference value}\,\rvert}{\textrm{reference value}}\times 100\,\%\]
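As a minimal sketch, this relative error can be computed directly. The numbers below are purely illustrative and do not refer to any particular instrument:

```python
def relative_error_percent(measured: float, reference: float) -> float:
    """Relative error (%) of a measured value against a calibration reference,
    used here as a surrogate for the unknown true value."""
    return abs(measured - reference) / abs(reference) * 100.0

# Illustrative values: a reading of 101.3 against a 100.0 reference standard
print(relative_error_percent(101.3, 100.0))  # ~1.3 % relative error
```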

Thus, if the accuracy of a temperature indicator with a full-scale range of 0 to 500 °C is specified as ±0.5%, the measured value will always be within ±2.5 °C of the true value, as established against a standard instrument during calibration. But if the indicator reads 250 °C, the error is still ±2.5 °C, i.e. ±1% of the reading. It is therefore always better to choose a measurement range in which the input is near the full-scale value. The true value itself is difficult to obtain, so standard calibrated instruments are used in the laboratory to measure the true value of the variable.
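The full-scale example above can be reproduced with a short sketch; the 0 to 500 °C range, the ±0.5% specification, and the 250 °C reading are the figures from the paragraph:

```python
FULL_SCALE_RANGE_C = 500.0   # span of the indicator: 0 to 500 degC
ACCURACY_PCT_FS = 0.5        # accuracy specified as +/- percent of full scale

# Error band implied by the full-scale specification
error_band_c = FULL_SCALE_RANGE_C * ACCURACY_PCT_FS / 100.0   # +/- 2.5 degC

# At a reading of 250 degC the same +/- 2.5 degC band is a larger
# fraction of the reading
reading_c = 250.0
error_pct_of_reading = error_band_c / reading_c * 100.0       # 1.0 %

print(f"error band: +/-{error_band_c} degC")
print(f"at {reading_c} degC: +/-{error_pct_of_reading} % of reading")
```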

Accuracy and costs

As the demand for accuracy increases, costs increase exponentially. If the tolerance of a component is to be measured, the required measurement accuracy is typically 10% of the tolerance value (a rule of thumb illustrated in the sketch below). Demanding higher accuracy than is actually required is not economical, as it increases the cost of the measuring equipment and hence the inspection cost. Besides, it can make the measuring equipment less reliable, because higher accuracy increases sensitivity. In practice, therefore, when designing measuring equipment, the trade-off between the desired or required accuracy and cost depends on the quality and reliability requirements of the component or product and on the inspection cost.
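As a hedged illustration of the 10% rule of thumb mentioned above, the sketch below derives a required instrument accuracy from a component tolerance; the tolerance value is hypothetical:

```python
def required_instrument_accuracy(tolerance: float, ratio: float = 0.10) -> float:
    """Accuracy an instrument should have to inspect a given tolerance,
    using the ~10 % guideline from the text (a rule of thumb, not a standard)."""
    return tolerance * ratio

# Hypothetical component toleranced at +/- 0.05 mm
print(required_instrument_accuracy(0.05))  # ~0.005 mm, i.e. +/- 5 micrometres
```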

As the industry moves toward greater adoption of machine learning, accuracy is becoming a primary design consideration. Training systems use a level of accuracy that is not achievable when performing inference at the edge, and teams knowingly introduce inaccuracy to reduce costs. A better understanding of the implications of accuracy is required, particularly in safety-critical applications such as autonomous driving.
