[Image: Remote monitoring software integrated by Tektronix into Keithley's DAQ6510 data acquisition unit]

How do you determine the calibration interval of an instrument?

  • The calibration interval must be defined by the owner or user of the measuring instrument, according to its conditions of use and the risks associated with an erroneous measurement.
  • The calibration interval (or periodicity) can be set by the manufacturer of the instrument (every year or every X uses, for example), by regulation (particularly in legal metrology), or by the company itself according to its own requirements and needs.

Many factors must be taken into account when defining the calibration interval of a given instrument: conditions of use, the costs and consequences of incorrect measurements, customer requirements, standards, regulations, the manufacturer's recommendations, and the user's experience with this type of instrument in its own processes and activities, among others.

The user can rely on long-term data about the stability of an instrument in its usual context of use to help determine the appropriate calibration interval.
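
As a purely illustrative sketch of that approach, the Python snippet below estimates a drift rate from hypothetical "as found" calibration errors and derives an interval that keeps the expected drift within a chosen fraction of the tolerance. Every value in it (dates, errors, tolerance, safety factor) is invented for the example, and the simple worst-rate model is an assumption, not a prescribed method.

    from datetime import date

    # Hypothetical "as found" calibration history for one instrument.
    # Assumption: the instrument is adjusted back to (near) zero error at
    # each calibration, so each as-found error is the drift accumulated
    # since the previous calibration.
    history = [
        (date(2021, 1, 15), 0.000),  # initial calibration, adjusted
        (date(2022, 1, 20), 0.011),  # as-found error about a year later
        (date(2023, 1, 18), 0.009),
        (date(2024, 1, 22), 0.014),
    ]

    TOLERANCE = 0.050    # maximum permissible error (hypothetical)
    SAFETY_FACTOR = 0.8  # consume only 80% of the error budget (a policy choice)

    # Estimate a drift rate (error per day) over each interval.
    rates = []
    for (d_prev, _), (d_curr, e_curr) in zip(history, history[1:]):
        days = (d_curr - d_prev).days
        rates.append(abs(e_curr) / days)

    worst_rate = max(rates)  # conservative: assume the fastest observed drift
    interval_days = SAFETY_FACTOR * TOLERANCE / worst_rate

    print(f"Worst observed drift: {worst_rate * 365:.4f} units/year")
    print(f"Suggested calibration interval: about {interval_days:.0f} days")

With these invented numbers the suggestion comes out near three years; a real laboratory would still cap it at whatever the applicable regulation or the manufacturer allows.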

The factors listed above carry different weight from one measuring device to another and from one application to another. Determining the calibration interval of a measuring instrument therefore requires a study specific to each instrument and to each company, based on its particular context of use.

A calibration interval that is too short means frequent calibrations and therefore higher costs. A calibration interval that is too long can affect the processes and activities of the company using the instrument, leading to financial losses from the corrective measures that must be taken once erroneous measurements are discovered.

A balance must therefore be struck between the risk incurred and the cost of calibrating the instruments.
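
To make that balance concrete, here is a deliberately simplified cost model, again a sketch in which every figure is hypothetical: the time for the instrument to drift out of tolerance is assumed exponential, and once out of tolerance it is assumed to keep operating, undetected, until the next calibration catches it.

    import math

    # All figures are hypothetical and for illustration only.
    CALIBRATION_COST = 400.0   # cost of one calibration
    LOSS_PER_OOT_DAY = 50.0    # daily loss while measuring out of tolerance
    LAMBDA = 1.0 / 500.0       # assumed daily rate of drifting out of tolerance

    def annual_cost(interval_days: float) -> float:
        """Expected yearly cost: calibrations plus out-of-tolerance operation."""
        per_year = 365.0 / interval_days
        # Expected days spent out of tolerance within one interval:
        # E[(I - T)+] for an exponential time-to-failure T.
        oot_days = interval_days - (1.0 - math.exp(-LAMBDA * interval_days)) / LAMBDA
        return per_year * (CALIBRATION_COST + LOSS_PER_OOT_DAY * oot_days)

    # Scan candidate intervals from one month to two years and keep the cheapest.
    candidates = range(30, 731, 30)
    best = min(candidates, key=annual_cost)
    print(f"Lowest expected cost at ~{best} days ({annual_cost(best):.0f}/year)")

With these numbers the minimum falls around 90 days; raising the calibration cost or lowering the assumed loss pushes the optimum toward longer intervals, which is exactly the trade-off described above.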