Fidelity is defined as the degree to which a measurement instrument indicates changes in the measurand without dynamic error. In metrology, an instrument is said to be all the more "faithful" the more closely its indications agree with one another over the course of several measurements of a constant physical quantity.
The fidelity error is evaluated by performing a certain number of measurements of the same, assumed-constant, measurand: the error is then represented by the semi-difference between the maximum and minimum values of the corresponding readings.
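As a minimal sketch of the evaluation just described, the following Python snippet computes the fidelity error as the semi-difference between the largest and smallest of a set of repeated readings; the readings themselves are hypothetical values for a nominally constant 10 V source.

```python
def fidelity_error(readings):
    """Semi-difference between the maximum and minimum readings
    taken on a constant measurand."""
    return (max(readings) - min(readings)) / 2

# Ten hypothetical readings (in volts) of a nominally constant source:
readings = [10.02, 9.98, 10.01, 9.99, 10.03, 10.00, 9.97, 10.02, 10.01, 9.99]

# Semi-difference: (10.03 - 9.97) / 2 = 0.03 V, up to floating-point rounding.
print(fidelity_error(readings))
```

The smaller this value, the more faithful the instrument is with respect to the conditions under which the readings were taken.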
The fidelity error is mainly due to external influence quantities: temperature, magnetic field, pressure, angular or linear acceleration, etc. These quantities act simultaneously and with different intensity at each instant, so that the instrument provides different indications of the same quantity over time; an instrument is therefore all the more faithful the more it has been constructed to be insensitive to influence quantities.