The **error band** is defined as the range of maximum deviation of the transducer output from a reference curve, generally expressed in percent of full scale. The deviation can be caused by non-linearity, non-repeatability, hysteresis, etc., and it is determined over several consecutive calibration cycles so as to include repeatability. The error band is therefore a measure of worst-case error, and it is a better specification than linearity alone for judging whether a gauge suits an application.
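The worst-case nature of the error band can be sketched as follows. This is a minimal illustration with hypothetical function names and made-up calibration data: the band is taken as the largest absolute deviation of any reading, across all calibration cycles, from the reference curve, expressed in percent of full scale.

```python
# Hypothetical sketch: error band as worst-case deviation over several
# consecutive calibration cycles (so repeatability is included).

def error_band_pct_fs(cycles, reference, full_scale):
    """Largest |reading - reference| over all cycles, in percent of full scale."""
    worst = 0.0
    for cycle in cycles:  # each cycle: readings taken at the same input points
        for reading, ref in zip(cycle, reference):
            worst = max(worst, abs(reading - ref))
    return 100.0 * worst / full_scale

# Reference curve and three consecutive calibration cycles over a 0-10 V span.
reference = [0.0, 2.5, 5.0, 7.5, 10.0]
cycles = [
    [0.01, 2.52, 5.01, 7.48, 9.99],
    [-0.02, 2.49, 5.03, 7.51, 10.02],
    [0.00, 2.51, 4.98, 7.49, 10.01],
]
print(error_band_pct_fs(cycles, reference, full_scale=10.0))  # ~0.3 (% FS)
```

The single figure folds non-linearity, hysteresis, and run-to-run scatter into one number, which is what makes it a worst-case specification.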

Note also that the transducer must operate only over a range of variation of the input quantity that is contained within its measurement range; it follows that, by varying the value of the static error considered acceptable, different fields of use are obtained.
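The dependence of the field of use on the acceptable static error can be illustrated with a toy example. The helper name and the error-versus-input data below are assumptions, not from the source; the point is only that a looser tolerance admits a wider portion of the measurement range.

```python
# Hypothetical illustration: the usable portion of the measurement range
# grows as the acceptable static error is relaxed.

def usable_points(inputs, errors_pct_fs, max_error_pct_fs):
    """Input points at which |static error| stays within the acceptable limit."""
    return [x for x, e in zip(inputs, errors_pct_fs) if abs(e) <= max_error_pct_fs]

inputs = [0, 2, 4, 6, 8, 10]                   # input quantity
errors = [0.05, 0.10, 0.15, 0.25, 0.40, 0.60]  # static error at each point, % FS

print(usable_points(inputs, errors, 0.2))  # tight tolerance -> narrow field of use
print(usable_points(inputs, errors, 0.5))  # looser tolerance -> wider field of use
```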

The error band specification describes a bipolar band (e.g., ±0.2%) around the ideal line, i.e., the line on which every dimensional change would produce a perfect sensor output voltage. All measurements must fall within the error band for the instrument to be within specification. The magnitude of the band equals the worst-case error throughout the gauge's measurement range; using the worst-case error guarantees that every measurement made by the gauge performs within the error band specification.
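An in-spec check against such a bipolar band could look like the sketch below. It assumes a linear ideal line and a band given in percent of full scale; the function name and example numbers are hypothetical.

```python
# Minimal sketch of a conformance check against a bipolar error band,
# assuming a linear ideal line y = slope * x + offset.

def within_error_band(inputs, outputs, slope, offset, band_pct_fs, full_scale):
    """True if every measured output lies within +/- band of the ideal line."""
    band = band_pct_fs / 100.0 * full_scale
    for x, y in zip(inputs, outputs):
        ideal = slope * x + offset
        if abs(y - ideal) > band:
            return False
    return True

# Ideal line y = x over a 10 V full scale; a 0.2% FS band is +/- 0.02 V.
print(within_error_band([0, 5, 10], [0.01, 5.015, 9.99], 1.0, 0.0, 0.2, 10.0))  # True
print(within_error_band([0, 5, 10], [0.01, 5.05, 9.99], 1.0, 0.0, 0.2, 10.0))   # False
```

A single symmetric band keeps the pass/fail criterion simple: any point outside it, whatever the cause (non-linearity, hysteresis, non-repeatability), takes the instrument out of specification.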