In metrology, the term measurement is closely associated with scientific, industrial, commercial, and everyday human activities. It is defined as the assignment of a number to a characteristic of an object or event, so that it can be compared with other objects or events. Our knowledge of the reality that surrounds us is based on the measurement of physical quantities; indeed, we can say that to know means to measure.

Measurement applications

Measurement applications can be classified into three major categories:

  1. Monitoring of processes and operations: refers to situations where the measuring device is being used to keep track of some physical quantity (without any control functions).
  2. Control of processes and operations: is one of the most important classes of measurement application. This usually refers to an automatic feedback control system.
  3. Experimental engineering analysis: the part of engineering design, development, and research that relies on laboratory testing of one kind or another to answer questions.

Every application of measurement, including those not yet “invented,” can be put into one of the three groups just listed or some combination of them.

The primary objective of measurement in industrial inspection is to determine the quality of the manufactured component. Different quality requirements, such as permissible tolerance limits, form, surface finish, size, and flatness, have to be considered to check the conformity of the component to the quality specifications. To realize this, quantitative information about a physical object or process has to be acquired by comparison with a reference. The three basic elements of measurement are the following:

  1. Measurand, a physical quantity to be measured (such as length, weight, and angle);
  2. Comparator, to compare the measurand (physical quantity) with a known standard (reference) for evaluation;
  3. Reference, the physical quantity or property to which quantitative comparisons are to be made, which is internationally accepted.

Together, these three elements describe direct measurement using a calibrated fixed reference: to determine the length of a component, for example, the measurement is carried out by comparing it with a steel scale (a known standard).

Influence quantities in measurements

In metrology, when the environmental conditions under which a transducer is actually used deviate significantly from the environmental conditions of its calibration, the effects of the influence quantities must be taken into account. In such cases, specific tests must be conducted on a population of transducers or, at least, on a single transducer.

Attention must be paid to environmental conditions not only during sensor operation but also during earlier phases such as storage and transport; if not checked and verified, these conditions can significantly and, above all, unpredictably alter the metrological performance of the transducer. Some of the main influence quantities in mechanical and thermal measurements are summarized below.

Effects due to temperature

For each transducer, the manufacturer indicates the working temperature range within which it can be used without damage. Within this range, the trends of both the zero drift and the sensitivity drift are generally provided by the manufacturer. For example, for measurements carried out with resistance strain gauges, both the apparent strain as a function of temperature (zero drift) and the variation of the gauge calibration factor with temperature (sensitivity drift) are given.

A further way to express the effect of temperature is to specify a range of variation of the resulting error, expressed for example as a percentage of full scale. It is also necessary to know the maximum and minimum temperatures to which the transducer can be exposed without permanent damage, that is, without its metrological characteristics changing. Changes in ambient temperature affect not only the static metrological characteristics but also the dynamic ones. The values supplied by the manufacturer must refer to a specific temperature variation range. Temperature can also have significant effects when it changes in steps.
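The two drift corrections above can be sketched numerically. This is a minimal illustration, assuming hypothetical drift coefficients; real values come from the manufacturer's data sheet for the specific transducer.

```python
# Sketch: correcting a transducer reading for temperature drift.
# ZERO_DRIFT and SENS_DRIFT are invented values, for illustration only.

T_REF = 23.0          # calibration temperature, deg C
ZERO_DRIFT = 0.02     # output units per deg C (zero drift)
SENS_DRIFT = -0.001   # fractional sensitivity change per deg C

def corrected_reading(raw_output: float, temperature: float) -> float:
    """Remove the zero drift, then undo the sensitivity drift."""
    dT = temperature - T_REF
    without_zero_drift = raw_output - ZERO_DRIFT * dT
    sensitivity_factor = 1.0 + SENS_DRIFT * dT
    return without_zero_drift / sensitivity_factor

# At the calibration temperature the reading is returned unchanged:
print(corrected_reading(10.0, 23.0))   # -> 10.0
```

Note that the zero drift is subtracted before the sensitivity drift is divided out, mirroring how the two effects act on the signal: an offset added to the output and a multiplicative change of the calibration factor.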

Effects due to acceleration

Errors caused by acceleration can occur either directly on the sensitive element or on the connection and support elements, and can be large enough to induce deformations that render the measurements meaningless. In general, a transducer shows greater acceleration sensitivity along certain axes; it is therefore necessary to indicate the triad of selected reference axes and to express the error due to acceleration along each of them.

The acceleration error is defined as the maximum difference between the output of the sensor in the absence and in the presence of a specified constant acceleration applied along a specific axis. Finally, some sensors are sensitive to the acceleration of gravity, so the orientation of the transducer with respect to the gravitational field constitutes an essential constraint.
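The definition above reduces to a simple maximum over paired readings. The figures below are invented, for illustration only:

```python
# Sketch: acceleration error as the maximum output difference between
# the "no acceleration" and "constant acceleration along one axis"
# conditions, taken over the same set of input values. Data invented.

def acceleration_error(outputs_no_accel, outputs_with_accel):
    """Maximum absolute difference between paired sensor outputs."""
    return max(abs(a - b) for a, b in zip(outputs_no_accel, outputs_with_accel))

# Same input quantity applied in both conditions:
no_accel   = [0.00, 2.51, 5.02, 7.49, 10.01]
with_accel = [0.04, 2.55, 5.08, 7.51, 10.02]
print(round(acceleration_error(no_accel, with_accel), 3))   # -> 0.06
```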

Effects due to vibrations

Varying the frequency of vibrations applied along a specific reference axis can produce significant effects (for example, due to resonance phenomena) in the output signal provided by the transducer.

To express the effect due to vibrations, it is necessary to define the maximum variation in the output, for each value of the physical input quantity, when a vibration of specified amplitude, over a given frequency range, is applied along an axis of the transducer.

Effects due to environmental pressure

The transducer must sometimes operate at a pressure significantly different from the pressure at which the calibration was carried out, which is generally ambient pressure. Pressures appreciably different from those of the calibration tests may alter the internal geometry of the transducer enough to change the metrological characteristics provided by the manufacturer.

A deviation from the calibration conditions is more insidious than outright damage to the transducer, which, on the other hand, is easily detected by the experimenter. The error due to pressure is defined as the maximum variation of the transducer output, for each value of the input quantity within the measurement range, when the operating pressure is varied over specified intervals.

Effects due to commissioning of the transducer

If the commissioning of a transducer is not carried out with care, damage can occur (deformation of the structure, for example) that changes the operating conditions of the transducer. No manufacturer data are available for this cause of error, and the user must ensure the proper and correct installation of the device.

Methods of measurements

When precision measurements are made to determine the values of a physical variable, different methods of measurement are employed. A measurement method is defined as the logical sequence of operations employed in measuring the physical quantity under observation.

The better the measurement method, and the better the instruments and their technology, the closer the measure comes to describing the true state of the measured physical quantity. In principle, therefore, a measure represents physical reality with a certain approximation, or with a certain error: an error that can be made very small but never null.

The choice of the method of measurement depends on the required accuracy and the amount of permissible error. Irrespective of the method used, the primary objective is to minimize the uncertainty associated with the measurement. The common methods employed for making measurements are as follows:

Direct method

In this method, the quantity to be measured is directly compared with a primary or secondary standard. Scales, vernier calipers, micrometers, bevel protractors, etc., are used in the direct method, which is widely employed in production. Even in the direct method, a slight difference exists between the actual and the measured values of the quantity, owing to the limitations of the human being performing the measurement.

The main advantage of direct measurements is that gross errors are harder to make, since the instrument needed for the comparison is generally simple and therefore not subject to hidden faults.

Indirect method

In this method, the value of a quantity is obtained by measuring other quantities that are functionally related to the required value. The related quantities are measured directly, and the required value is then determined using a mathematical relationship.

Most measurements are obtained indirectly, almost always for cost reasons. For example, the density of a given substance could be measured directly with a device called a densimeter, but it is usually more convenient to measure the mass and volume of the substance directly and then take their ratio.

Indirect measurements, on the other hand, are more subject to approximations, since error propagation occurs through the formula that represents the physical law. Particular attention must therefore be paid to the approximations made when performing the underlying direct measurements.
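The density example above can be made concrete, including the first-order propagation of the direct-measurement uncertainties through the quotient. The numbers are invented, for illustration only:

```python
# Sketch: an indirect density measurement with first-order error
# propagation through rho = m / V. All figures are invented.
import math

m, dm = 250.0, 0.5    # mass in g and its uncertainty
V, dV = 100.0, 0.8    # volume in cm^3 and its uncertainty

rho = m / V           # indirect measurement: density = mass / volume

# For a quotient, the relative uncertainties add in quadrature:
drho = rho * math.sqrt((dm / m) ** 2 + (dV / V) ** 2)

print(f"rho = {rho:.3f} +/- {drho:.3f} g/cm^3")
```

Note how the relative uncertainty of the volume (0.8%) dominates that of the mass (0.2%): in an indirect measurement, the least accurate direct measurement usually sets the overall accuracy.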

Fundamental or absolute method

In this case, the measurement is based on the measurements of base quantities used to define the quantity. The quantity under consideration is directly measured and is then linked with the definition of that quantity.

Comparative method

In this method, as the name suggests, the quantity to be measured is compared with the known value of the same quantity or any other quantity practically related to it. The quantity is compared with the master gauge and only the deviations from the master gauge are recorded after comparison. The most common examples are comparators, dial indicators, etc.

Transposition method

This method involves making the measurement by direct comparison, wherein the quantity to be measured V is first balanced by a known value X of the same quantity; next, V and X exchange places and V is balanced again by another known value Y. The value of the quantity to be measured is then the geometric mean of the two known values:

V = √(X · Y)
An example of this method is the determination of mass by balancing methods and known weights.
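The mass example can be sketched numerically for an unequal-arm balance, where transposition cancels the arm-length ratio. The arm lengths and the true mass below are invented, for illustration only:

```python
# Sketch: the transposition (double-weighing) method on an
# unequal-arm balance. Arm lengths and true mass are invented.
import math

LEFT_ARM, RIGHT_ARM = 1.00, 1.02   # hypothetical arm lengths
TRUE_MASS = 50.0                   # the unknown quantity V

# First weighing: V on the left pan, balanced by X on the right pan.
#   V * LEFT_ARM = X * RIGHT_ARM
X = TRUE_MASS * LEFT_ARM / RIGHT_ARM

# Second weighing: V transposed to the right pan, balanced by Y on the left.
#   Y * LEFT_ARM = V * RIGHT_ARM
Y = TRUE_MASS * RIGHT_ARM / LEFT_ARM

# The geometric mean cancels the unknown arm-length ratio:
V = math.sqrt(X * Y)
print(V)   # recovers 50.0 (up to floating-point rounding)
```

Neither weighing alone gives the true mass (X reads low, Y reads high), yet their geometric mean is exact: this is why transposition is used when the comparator itself cannot be trusted to be symmetric.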

Coincidence method

This is a “differential” method of measurement wherein a very minute difference between the quantity to be measured and the reference is determined by careful observation of the coincidence of certain lines and signals. Measurements with a vernier caliper or micrometer are examples of this method.

Deflection method

This method involves the indication of the value of the quantity to be measured directly by the deflection of a pointer on a calibrated scale. Pressure measurement is an example of this method.

Complementary method

The value of the quantity to be measured is combined with a known value of the same quantity. The combination is so adjusted that the sum of these two values is equal to the predetermined comparison value. An example of this method is the determination of the volume of a solid by liquid displacement.
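The liquid-displacement example reduces to a complement against the predetermined comparison value. The figures below are invented, for illustration only:

```python
# Sketch: the complementary method for a solid's volume by liquid
# displacement. TARGET_LEVEL is the predetermined comparison value
# (the full mark of the vessel); the figures are invented.

TARGET_LEVEL = 500.0   # cm^3

def solid_volume(liquid_added: float) -> float:
    """The solid plus the liquid added must together reach TARGET_LEVEL,
    so the unknown volume is the complement of the known one."""
    return TARGET_LEVEL - liquid_added

print(solid_volume(462.5))   # -> 37.5
```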

Null measurement method

In this method, the difference between the value of the quantity to be measured and the known value of the same quantity with which comparison is to be made is brought to zero.

Substitution method

It is a direct comparison method. This method involves the replacement of the value of the quantity to be measured with a known value of the same quantity, so selected that the effects produced in the indicating device by these two values are the same. The Borda method of determining mass is an example of this method.

Contact method

In this method, the surface to be measured is touched by the sensor or the measuring tip of the instrument. Care must be taken to apply a constant contact pressure, in order to avoid errors due to excessive contact pressure. Examples of this method include measurements using a micrometer, vernier caliper, and dial indicator.

Contactless method

As the name indicates, there is no direct contact with the surface to be measured. Examples of this method include the use of optical instruments, tool maker’s microscope, and profile projector.

Composite method

The actual contour of a component to be checked is compared with its maximum and minimum tolerance limits. Cumulative errors of the interconnected elements of the component, which are controlled through a combined tolerance, can be checked by this method. This method is very reliable for ensuring interchangeability and is usually implemented through the use of composite GO gauges. The use of a GO screw plug gauge to check the thread of a nut is an example of this method.
