Effective PdM requires reliable calibration

An element of preventive maintenance is regularly testing and calibrating process instrumentation. This article discusses how to specify an appropriate hand-held calibrator.

By Luis Szabo


Keith Mobley said it succinctly in his November 2002 column: "Effective preventive maintenance requires reliable, accurate instrumentation and gauges." An element of preventive maintenance is regularly testing and calibrating process instrumentation.

Test and calibration devices generally fall into three grades: industrial, instrument and laboratory (Table 1). Laboratory-grade equipment is used under controlled, stable environmental conditions. Industrial-grade devices are applied broadly for testing or monitoring, usually with whole-number resolution only. Instrument-grade equipment is used for tolerance validation, testing and calibration. This grade has at least four times the accuracy of the instrument to be calibrated or tested, typically with full-scale temperature compensation.

Accuracy varies

The accuracy specification refers to the degree of uncertainty associated with a measurement. Each manufacturer expresses accuracy in a format favorable to its equipment, which makes comparison a challenge.

Accuracy specifications are stated as full-scale, range, percent of scale, or percent of reading or indicated value, and may include temperature compensation. For digital instrumentation, counts or digits are often part of the specification. To interpret these varied descriptions of accuracy, calculate the magnitude of the measurement uncertainty.

For example, a pressure transmitter has a range of 0 to 30 psi and an overall accuracy specification of 1% of full scale. The measurement uncertainty is 0.3 psi (1% of 30 psi) throughout the entire range, so at a reading of 15 psi, the true value can be anywhere between 14.7 psi and 15.3 psi.
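This arithmetic is simple enough to capture in a few lines. The sketch below (Python; the function and variable names are illustrative, not from any calibration library) computes the uncertainty band for the transmitter above:

```python
# Minimal sketch: absolute uncertainty from a percent-of-full-scale spec.

def full_scale_uncertainty(full_scale_psi: float, accuracy_pct: float) -> float:
    """Absolute uncertainty for an accuracy spec stated as percent of full scale."""
    return full_scale_psi * accuracy_pct / 100.0

u = full_scale_uncertainty(full_scale_psi=30.0, accuracy_pct=1.0)  # 0.3 psi
reading = 15.0
print(f"True value lies between {reading - u:.1f} and {reading + u:.1f} psi")
# -> True value lies between 14.7 and 15.3 psi
```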


Table 1. Test and calibration equipment can be graded according to its accuracy and use.


Determine requirements

In practice, many companies have adopted a minimum 4:1 ratio: The measurement standard must be four times as accurate as the unit under test, for the given range.
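As a minimal sketch, the 4:1 rule reduces to a single comparison of uncertainties (the names here are illustrative only):

```python
# Minimal sketch of the 4:1 rule: the standard must be at least four
# times as accurate as the unit under test (UUT) over the given range.

def meets_accuracy_ratio(uut_uncertainty: float,
                         std_uncertainty: float,
                         minimum_ratio: float = 4.0) -> bool:
    """True if the standard's uncertainty is small enough relative to the UUT's."""
    return uut_uncertainty / std_uncertainty >= minimum_ratio
```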

The range of the measurement standard must be as close as possible to that of the unit under test. Compare the accuracy at points throughout the measurement range.

For example, compare the transmitter above (0 to 30 psi, 1% full-scale accuracy, 4:1 required accuracy ratio) to an industrial-grade test gauge with a 0 to 100 psi range and 0.25% full-scale accuracy. (Comparing only full-scale accuracy, one might guess the test gauge itself could be used as a calibration standard.) Just for fun, let's also compare a dedicated, instrument-grade hand-held pressure calibrator, also having a 0 to 100 psi range, but with 0.025% full-scale accuracy.

Remember that the measurement uncertainty must be applied to the full scale (0 to 100 psi) in both cases. Table 2 illustrates the results. The test gauge, with a 1.2:1 accuracy ratio, is inappropriate as a calibration standard, though it is appropriate as a test or indicator gauge. The dedicated hand-held pressure calibrator, with an uncertainty of 0.025 psi and a 12:1 accuracy ratio, is an appropriate calibration standard.
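The same comparison can be sketched in a few lines using the numbers above (again, names are illustrative):

```python
# Minimal sketch: accuracy ratios for the two candidate standards.
# Uncertainty is percent of full scale, applied over each device's own range.

def uncertainty(full_scale_psi: float, accuracy_pct: float) -> float:
    return full_scale_psi * accuracy_pct / 100.0

uut = uncertainty(30.0, 1.0)            # transmitter: 0.3 psi
test_gauge = uncertainty(100.0, 0.25)   # 0.25 psi
calibrator = uncertainty(100.0, 0.025)  # 0.025 psi

print(f"Test gauge: {uut / test_gauge:.1f}:1")  # 1.2:1 -> not a valid standard
print(f"Calibrator: {uut / calibrator:.0f}:1")  # 12:1  -> valid standard
```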

Now select a more accurate transmitter with, say, 0.1% full-scale accuracy over the same range. Will that hand-held pressure calibrator still be appropriate? Table 3 shows that the dedicated hand-held calibrator is not an appropriate calibration standard if a 4:1 accuracy ratio is required. In fact, even if the calibrator accuracy were 0.01%, it would still provide only a 3:1 accuracy ratio. A hand-held calibrator with the same scale as the unit under test, however, is an appropriate standard.
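Running the same sketch with the 0.1% transmitter shows why the standard's range matters (values from the example above; names illustrative):

```python
# Minimal sketch: the same check with a 0.1% full-scale transmitter.

def uncertainty(full_scale_psi: float, accuracy_pct: float) -> float:
    return full_scale_psi * accuracy_pct / 100.0

uut = uncertainty(30.0, 0.1)  # 0.03 psi

candidates = [
    ("0-100 psi, 0.025% FS", uncertainty(100.0, 0.025)),  # 1.2:1, fails 4:1
    ("0-100 psi, 0.010% FS", uncertainty(100.0, 0.010)),  # 3.0:1, still fails
    ("0-30 psi, 0.025% FS",  uncertainty(30.0, 0.025)),   # 4.0:1, meets minimum
]
for name, u in candidates:
    print(f"{name}: {uut / u:.1f}:1")
```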

The calibration procedure should reflect the appropriate standard clearly so these calculations need to be performed only once.


Table 2. With an accuracy ratio of only 1.2:1, the test gauge is not an appropriate calibration standard for this transmitter. But at 12:1, the calibrator exceeds the 4:1 minimum and can be used.

Want to document?

Smart field devices constitute a significant portion of the installed base and an even higher percentage of new instrument sales. In terms of installed base and annual sales, HART devices make up the majority of smart devices. These microprocessor-based units have been around for more than a decade, yet the documenting process calibrator is a relatively new product. It is a portable, intelligent field calibrator that reduces calibration time. Calibrating a HART device typically requires a digital voltmeter to monitor the 4-20 mA output, a HART host communicator and a calibrator. With a documenting process calibrator, one unit performs the entire procedure.
