

Instrument Calibration Procedure

Internal and External Dial, Vernier and Digital Calipers and Outside Micrometers

GPC CAL001    Date: 10/07

REV. 0

Quality Assurance Manager

Approved Date:


SECTION 1

INTRODUCTION AND DESCRIPTION

1.1 This procedure describes the calibration of Internal and External Dial, Vernier and Digital Calipers and Outside Micrometers. The instrument being calibrated is referred to herein as the TI (Test Instrument).

1.2 This procedure includes test and essential performance parameters only. Any malfunction noticed during calibration, whether specifically tested for or not, should be corrected.

Table 1 Calibration Description

TI: Dial, Vernier and Digital Calipers

  Characteristic: Zero indication test
  Performance Specification: Test point 0 in.; Tolerance +/- .0005 in.
  Test Method: Determined by sliding the jaws together and reading the TI indication.

  Characteristic: Outside accuracy
  Performance Specification: Range 0 to 12"; Tolerance: see Table II
  Test Method: Comparison to gage blocks placed between the TI jaws.

  Characteristic: Inside accuracy
  Performance Specification: Range 0 to 12"; Tolerance: see Table II
  Test Method: Comparison to gage block dimensions, placing the blocks with attached caliper jaws outside the TI jaws.

  Characteristic: Depth gage accuracy
  Performance Specification: Range 0 to 12"
  Test Method: Comparison to gage block dimensions, placing the TI beam on the gage block with the depth rod against the reference surface and reading the TI outside dimension scale.

TI: Outside Micrometers

  Characteristic: Length and linearity
  Performance Specification: Range 0 to 12"
  Test Method: Measured by comparing TI indications to gage block dimensions set up to test the basic length and micrometer head linearity.


SECTION 2

EQUIPMENT REQUIREMENTS

Table 2 Equipment Requirements

Item                              Minimum Use Specifications

Calibration Equipment

2.1 Gage Block Set                Range .050-4"; Tolerance .00005"; ESSM #SE0060
2.2 Low power magnifier           To aid in reading the TI vernier or barrel scale
2.3 Micrometer wrench             Adjustment of micrometer
2.4 Small jewelers screwdriver    Adjustment of caliper
2.5 Small clamp                   Hold gage blocks for inside measurement comparison
2.6 Light machine oil or spray    Lubrication of slides or thimble
2.7 Clean flat surface            Surface plate or steel block


SECTION 3

PRELIMINARY OPERATIONS

3.1 TI INSPECTION

3.2 Ensure that the work area is clean, well illuminated, free from excessive drafts and humidity, and that the rate of temperature change does not exceed 4 °F (approximately 2.2 °C) per hour.

3.3 Ensure that the gage block set (figure 2.1) is clean and that the TI and the gage blocks have been allowed to stabilize at the ambient temperature for a minimum of 2 hours.

Figure 2.1

CALIPERS


3.4 Ensure that the caliper's jaws slide smoothly and freely along the full length of the TI beam; if not, take corrective action.

3.5 Inspect to ensure that the jaws are free of nicks and burrs and that the TI is clean and free from damage that would impair its operation.

NOTE: TI test points are determined by selecting 4 test points within the first inch of range and 4 to 8 additional test points extending over the remainder of the TI range at approximately equal spacing.

3.6 To minimize the number of gage blocks needed to test calipers with higher ranges, select test points at 25, 50, 75, and 100% of the TI range beyond the first inch and round to the nearest 1/2 inch (see the sketch below). Use major vernier graduations or normal electronic digital values as test points as necessary.
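A minimal sketch of this test-point selection in Python; the function name and the tie-handling of the rounding rule are assumptions, not part of the procedure:

    def caliper_test_points(ti_range_in):
        """Suggest caliper test points: four points within the first inch
        (the values used in step 4.2.1), then 25/50/75/100% of the TI range
        rounded to the nearest 1/2 inch."""
        first_inch = [0.125, 0.300, 0.650, 1.000]
        upper = [round(ti_range_in * f * 2) / 2 for f in (0.25, 0.50, 0.75, 1.00)]
        # Keep only points beyond the first inch and drop duplicates.
        return first_inch + sorted({p for p in upper if p > 1.0})

    print(caliper_test_points(12.0))
    # [0.125, 0.3, 0.65, 1.0, 3.0, 6.0, 9.0, 12.0]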

OUTSIDE MICROMETERS

3.7 Slowly rotate the TI micrometer thimble and ensure that it operates smoothly over its entire range.

3.8 The TI length measurement test should be performed at approximately 6 points across the range of the TI. For example, a 0 to 1" micrometer may have calculated test points at .000, .200, .400, .600, .800, and 1.000 inch.
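The spacing rule above amounts to dividing the range evenly; a short sketch (the helper name is assumed):

    def micrometer_test_points(lo_in, hi_in, n=6):
        """Evenly spaced length test points across a micrometer's range."""
        step = (hi_in - lo_in) / (n - 1)
        return [round(lo_in + i * step, 3) for i in range(n)]

    print(micrometer_test_points(0.0, 1.0))
    # [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]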


SECTION 4

CALIBRATION PROCESS: CALIPERS

NOTES: Unless otherwise specified, verify the results of each test, and take corrective action whenever a test requirement is not met before proceeding.

Cotton gloves should be worn when handling gage blocks to prevent the transfer of body heat and to protect the gage surfaces.

4.1 ZERO TEST

4.1.1 Slide the TI jaws together, ensuring that no light is visible between the jaws' measuring surfaces. If the TI has inside measurement capability, verify that the dial indicator or vernier indicates zero; adjust the dial bezel if necessary.

4.1.2 Tighten the TI sliding jaw set screw, if applicable.

4.1.3 If the TI has a digital readout, depress the zero set and verify that the digital indication reads 0.000.

4.1.4 If the TI is a vernier type, verify that the TI zero marks are aligned, as applicable.

4.2 OUTSIDE ACCURACY TEST

4.2.1 Determine the gage blocks required to obtain test points at a minimum of 4 points throughout the first inch of the TI, as follows: .125, .300, .650, and 1.000 inch.

4.2.2 Open the TI to beyond the first test point. Insert the gage blocks and close the jaws until they are firmly in contact with the gage block. Repeat each reading 3-5 times and verify that the indications do not vary by more than +/- 1/4 of the least dial graduation.

4.2.3 Verify that the first inch test points are within +/- .001 inch for 0-6" range calipers and +/- .002 inch for 0-12" calipers.

4.2.4 Repeat steps 4.2.2 and 4.2.3 for the remaining test points of 25%, 50%, 75% and full range (100%).
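A minimal sketch of the acceptance check in steps 4.2.3 and 4.3.3; the function names are assumptions, while the tolerances are the ones stated above:

    def caliper_tolerance_in(ti_range_in):
        """Test-point tolerance: +/- .001 in. for 0-6 in. calipers,
        +/- .002 in. for 0-12 in. calipers."""
        return 0.001 if ti_range_in <= 6.0 else 0.002

    def in_tolerance(indication_in, nominal_in, ti_range_in):
        return abs(indication_in - nominal_in) <= caliper_tolerance_in(ti_range_in)

    print(in_tolerance(0.3005, 0.300, 6.0))   # True: error .0005 within .001
    print(in_tolerance(1.0025, 1.000, 6.0))   # False: error .0025 exceeds .001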

4.3 INSIDE ACCURACY TEST

4.3.1 Determine the gage blocks required to obtain test points at a minimum of 4 points throughout the first inch of the TI, as follows: .125, .300, .650, and 1.000 inch.


4.3.2 Using a three-gage-block setup, place the test gage block between two other blocks and secure with a thumbscrew-type clamp (Figure 1). Open the TI to approximately the first test point. Insert the TI and open the jaws until they are firmly in contact with the inside of the gage blocks. Repeat each reading 3-5 times and verify that the indications do not vary by more than +/- 1/4 of the least dial graduation.

4.3.3 Verify that the first inch test points are within +/- .001 inch for 0-6" range calipers and +/- .002 inch for 0-12" calipers.

4.3.4 Repeat steps 4.3.2 and 4.3.3 for the remaining test points of 25%, 50%, 75%, and full range (100%).

Figure 1: Inside Measurement Set-up

4.4 DEPTH ACCURACY

4.4.1 Position the end of the TI beam against a 1.0 inch gage block surface with the end of the rod against the surface plate or steel gage block.

4.4.2 Ensure that the TI gage measuring surfaces are squarely placed against the surface of the gage block and the surface plate.

4.4.3 Make any necessary final adjustments and note the scale indication.

4.4.4 Verify that the values noted in the preceding step are within +/- .001 inch for 0-6" range calipers and +/- .002 inch for 0-12" calipers.

4.4.5 Perform steps 4.4.1 through 4.4.4 for each additional inch of depth gage range, changing gage blocks, as necessary.


SECTION 5

CALIBRATION PROCESS

OUTSIDE MICROMETERS

5.1 Determine the gage blocks required to obtain test points at the lowest TI indication and at 20%, 40%, 60%, 80%, and full range.

5.1.1 Adjust the spindle several tenths of an inch from zero, based on the TI basic length. Starting with the gage block equal to the lowest TI range, slide the gage block(s) between the anvil and spindle.

5.1.2 Adjust the TI as applicable to contact the gage block(s). Repeat each reading 2-3 times and verify each indication against Table 3, Micrometer Calibration Tolerance.

5.1.3 Repeat steps 5.1.1 and 5.1.2 throughout the complete range of the TI.

5.2 If the TI is the interchangeable anvil type, attach each anvil in turn and verify as outlined in paragraphs 5.1 through 5.1.3.

Table 3 Micrometer Calibration Tolerance

Micrometer Size Range      Calibration Tolerance    Calibration Tolerance
                           w/ vernier               without vernier

0-1"                       +/- .0001                +/- .001
1"-2" through 9"-10"       +/- .0002                +/- .001
10"-11" and greater        +/- .0003                +/- .002
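A sketch of Table 3 as a lookup; the function name and the size argument convention (upper end of the micrometer's size range, in inches) are assumptions:

    def micrometer_tolerance_in(size_upper_in, has_vernier):
        """Calibration tolerance from Table 3, in inches."""
        if size_upper_in <= 1.0:
            return 0.0001 if has_vernier else 0.001
        if size_upper_in <= 10.0:
            return 0.0002 if has_vernier else 0.001
        return 0.0003 if has_vernier else 0.002

    print(micrometer_tolerance_in(1.0, True))    # 0.0001 (0-1" with vernier)
    print(micrometer_tolerance_in(12.0, False))  # 0.002  (10"-11" and greater)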


SECTION 6

LABELING AND RECORDS

6.1 Complete a Measuring and Test Calibration Record (QA-4) for each item calibrated. Utilize additional QA-4 forms as continuation sheets as needed.

6.2 Affix a calibration label to the TI showing the date calibrated, the date of the next calibration, and who performed the calibration.

6.2.1 A special calibration label shall be applied to any item that has not been calibrated over its full range of capability.

6.3 All completed calibration records shall be filed and maintained.

MICROMETER CALIBRATION

Aim:
- To study various types of micrometers.
- To calibrate the given micrometers, using slip gauges as the standard.
- To study the use of a combination set.

Apparatus:
- Set of micrometers
- Set of slip gauges
- Combination set

Theory:

A micrometer is a device used widely in mechanical engineering and machining for precision measurement, along with other metrological instruments such as dial calipers and vernier calipers. A micrometer screw gauge is used for measuring accurately the diameter of a thin wire or the thickness of a sheet of metal. It consists of a U-shaped frame fitted with a screwed spindle which is attached to a thimble, as shown in Fig. 1.

Fig. 1 Screw gauge

Micrometers use the principle of a screw to amplify small distances that are too small to measure directly into large rotations of the screw that are big enough to read from a scale. The accuracy of a micrometer derives from the accuracy of the threadform that is at its heart. The basic operating principle of a micrometer is as follows: the amount of rotation of an accurately made screw can be directly and precisely correlated to a certain amount of axial movement (and vice versa) through the constant known as the screw's pitch (for a single-start screw thread). A screw's pitch is the distance it moves forward or backward axially with one complete turn. If the screw has a known pitch such as 0.5 mm, then for one revolution of the screw the spindle moves axially by 0.5 mm. This movement of the spindle is shown on an engraved linear millimeter scale on the sleeve. On the thimble there is a circular scale which is divided into 50 or 100 equal parts.

When the anvil and spindle end are brought into contact, the edge of the circular scale should be at the zero of the sleeve (linear scale) and the zero of the circular scale should be opposite the datum line of the sleeve. If the zero does not coincide with the datum line, there will be a positive or negative zero error, as shown in Fig. 2.

Fig. 2 Zero error in case of screw gauge

The least count of the micrometer screw can be calculated using the formula given below:

    Least count = Pitch / Number of divisions on the circular scale
                = 0.5 mm / 50
                = 0.01 mm

As an example, to determine the diameter of a wire, the wire is placed between the anvil and spindle end, and the thimble is rotated until the wire is firmly held between the anvil and the spindle. The ratchet is provided to avoid excessive pressure on the wire; it prevents the spindle from further movement. The diameter of the wire can then be determined from the reading, as shown in Fig. 3:

    Reading = Linear scale reading + (coinciding circular scale division x Least count)
            = 2.5 mm + (46 x 0.01 mm)
            = 2.96 mm (for Fig. 3)

Fig. 3 Linear and circular scales of screw gauge

Accuracy of a measured reading is the degree of veracity, while precision is the degree of reproducibility. An analogy that may be used to explain the difference between accuracy and precision is the target comparison. In this analogy, repeated measurements are compared to arrows shot at a target, as shown in Fig. 4. Accuracy describes the closeness of the arrows to the bullseye at the target center; arrows that strike closer to the bullseye are considered more accurate. The closer a system's measurements are to the accepted value, the more accurate the system is considered to be.

Fig. 4 Difference between accuracy and precision (panels: low accuracy/high precision, high accuracy/low precision, high accuracy/high precision)

A gauge block or slip gauge is a precision-ground and lapped length measuring standard. It is used as a reference for the setting of measuring equipment used in machine shops, such as micrometers, sine bars, and dial indicators (when used in a calibration or inspection role). These gauges consist of a set of steel blocks, each of which has one pair of opposite faces lapped flat and parallel to an accuracy of a few millionths of an inch. They are used to check the accuracy of workshop and similar gauges, which in use are subjected to wearing action; slip gauges should never be used as ordinary measuring gauges, but as reference or master standards. They are generally employed in connection with comparator instruments when workshop gauges have to be checked.

The slip gauges are supplied in sets, the number in each set varying according to the purpose in view. The most widely employed set consists of 81 gauges of differing thickness, made up as follows:

- Nine pieces with a range of 0.1001 to 0.1009 in. in steps of 0.0001 in.
- Forty-nine pieces with a range of 0.101 to 0.149 in. in steps of 0.001 in.
- Nineteen pieces with a range of 0.05 to 0.95 in. in steps of 0.05 in.
- Four pieces of parallel width, 1 in., 2 in., 3 in., and 4 in. respectively.

Metric sets of 103 pieces are made up as follows (a sketch of combining pieces to reach a target length follows this section):

- Forty-nine pieces with a range of 1.01 to 1.49 mm in steps of 0.01 mm.
- Forty-nine pieces with a range of 0.50 to 24.50 mm in steps of 0.50 mm.
- Four pieces of 25, 50, 75, and 100 mm respectively.
- One extra piece of 1.005 mm.

Smaller sets of 76, 56, 48, and 31 pieces are also supplied in the metric sizes, and sets of 49, 41, 35, and 28 in English sizes.

Before using these gauges they should be wiped with a piece of soft linen cloth. If the presence of any grease is suspected, or in the case of new gauges having a protective coating, the surface should be wiped over with a piece of soft linen moistened with benzole or petrol. The removal of any grease, including that from the fingers, is important, since otherwise dirt may be picked up more easily. In a dustless atmosphere, however, a trace of grease on the surface assists in obtaining a satisfactory wringing action. Fingering of slip gauges should be avoided as much as possible, since it tends to promote tarnishing and thermal expansion effects. A temperature change of only a degree or so causes a length change of about 1/100,000 in. per inch thickness of gauge. It is important for fine precision measurements to use slip gauges in a room thermostatically controlled at 20 °C and to allow the work and the gauges sufficient time to attain this temperature if taken into the heat-regulated room. Gauges after use should be wiped off carefully at once and returned to their storage case, closing the lid of the latter as soon as possible.

Wringing is the process of sliding two blocks together so that their faces lightly bond. When combined with a very light film of oil, this action excludes any air from the gap between the two blocks. The alignment of the ultra-smooth surfaces in this manner permits molecular attraction to occur between the blocks, forming a very strong bond with no discernible alteration to the stack's overall dimensions.

The recommended procedure for wringing a pair of slip gauges together is as follows, and is shown in Fig. 5: first clean the surfaces as described previously, then place one gauge centrally across the other gauge at right angles to form a symmetrical cross. Finally, rotate the upper gauge over the lower one to its final coincident position. This method results in appreciably less rubbing action than if the upper gauge were slid lengthwise over the lower one.

Fig. 5 Wringing of slip gauges
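As a worked version of the least-count and reading rules above, a short Python sketch (variable names are illustrative):

    # Screw-gauge reading: least count = pitch / divisions,
    # reading = linear scale + coinciding circular division x least count.
    pitch_mm = 0.5
    divisions = 50
    least_count = pitch_mm / divisions        # 0.01 mm

    linear_scale_mm = 2.5                     # from the Fig. 3 example
    coinciding_division = 46
    reading_mm = linear_scale_mm + coinciding_division * least_count
    print(reading_mm)                         # 2.96

And a sketch of selecting a slip gauge combination from the 103-piece metric set described above. The greedy digit-by-digit strategy is a common workshop rule of thumb, not something this text prescribes; the function is hypothetical, assumes a target of at least a few millimetres, and ignores the fact that each piece exists only once in the set:

    def slip_gauge_combination(target_mm):
        """Pick pieces for a target length: the 1.005 mm piece for a
        thousandths digit, one 1.01-1.49 mm piece for the hundredths,
        then 25/50/75/100 mm and 0.5 mm-step pieces for the remainder."""
        remaining = round(target_mm, 3)
        picks = []
        if round(remaining * 1000) % 10 != 0:        # needs the 0.005 digit
            picks.append(1.005)
            remaining = round(remaining - 1.005, 3)
        hundredths = round(remaining * 100) % 50
        if hundredths != 0:                          # needs a 1.xx piece
            piece = round(1.0 + hundredths / 100, 2)
            picks.append(piece)
            remaining = round(remaining - piece, 2)
        for piece in (100.0, 75.0, 50.0, 25.0):      # large pieces first
            if remaining >= piece:
                picks.append(piece)
                remaining = round(remaining - piece, 2)
        while remaining > 0:                         # 0.5-24.5 mm pieces
            piece = min(remaining, 24.5)
            picks.append(piece)
            remaining = round(remaining - piece, 2)
        return picks

    print(slip_gauge_combination(41.125))   # [1.005, 1.12, 25.0, 14.0]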

Procedure (calibration of micrometers):

1. Check the range of measurement of the micrometer.
2. Note down the zero error of the micrometer, if any.
3. Select a number of slip gauge combinations.
4. Measure each slip gauge combination with the micrometer and note down the micrometer reading (M) and the slip gauge combination length (G) in tabular form.
5. Plot a calibration chart for the micrometer, taking M on the X-axis and (M - G) on the Y-axis.
6. Repeat steps 1-5 for the other micrometers.

Precautions:

- While making a slip gauge combination, do the wringing correctly, so that no foreign particles are entrapped.
- While taking a micrometer reading, take care to clamp the spindle in position before taking it away from the block; otherwise, due to friction, the spindle will rotate and give a wrong reading.
- Always turn the spindle in the clockwise direction to avoid backlash error.
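A minimal sketch of step 5's chart, assuming matplotlib is available; the M and G values below are illustrative, not measured data:

    import matplotlib.pyplot as plt

    # Micrometer readings M and slip gauge lengths G, in mm (illustrative).
    M = [2.50, 5.10, 7.70, 10.30, 12.90]
    G = [2.50, 5.11, 7.69, 10.30, 12.91]
    errors = [m - g for m, g in zip(M, G)]   # (M - G), the calibration error

    plt.plot(M, errors, marker="o")
    plt.xlabel("Micrometer reading M (mm)")
    plt.ylabel("Error M - G (mm)")
    plt.title("Micrometer calibration chart")
    plt.grid(True)
    plt.show()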

CALIBRATION OF VERNIER CALLIPER GAUGES (OP 6)

1 INTRODUCTION

This procedure describes the steps taken when calibrating vernier calliper gauges of any dimension using the DIN 862 standard.

1.1 General aspects

Vernier calliper gauges are used to measure distances, either inside or outside measures or depths. Gauge blocks are used to calibrate the vernier calliper gauge.

2 MEASURING EQUIPMENT

- Gauge blocks - ZEISS, 103 blocks
- Coordinate measuring machine - ZEISS UMC 850
- Gauge ring 50 mm - MITUTOYO

3 RECEIPT

The vernier calliper gauge is received from the customer and is visually checked for any obvious defects. Defects checked for are corrosion; missing or worn parts of the scale (on classical scales); missing parts such as screws; and significant scratches or other defects which would impede use. The customer name and the type and serial number of the vernier calliper are also noted and tagged onto the vernier calliper using a yellow identification sticker. The procedure for attaching this sticker is described in paragraph 12.10.2 of the Quality manual. The number of gauges is checked and compared with the accompanying documentation.

4 CLEANING

The measuring surfaces of the gauge blocks and the gauge measuring surfaces are cleaned using petroleum ether. Any grit or other particles are also cleaned from the scale. The surfaces are wiped afterwards using the tougher side of a chamois leather (or a special synthetic cloth). Minor damage on guides or measuring surfaces is repaired using a special fine grindstone.

5 THERMAL STABILISATION

The temperature of the gauge is stabilised at 20 °C ± 1 °C for 5 hours.

6 CALIBRATION

6.1 Visual inspection

- Measuring surfaces are checked for any scratches which may impede calibration.
- The scale is checked:
  - A classical scale is checked for any marking points or numbers on the scale which may be missing or worn.
  - A dial scale is checked for the straightness of the pointer and its distance from the scale. The pointer must be of the same width as the lines on the scale. Lines of the scale must be oriented towards the centre of the dial scale. The resolution must be written on the scale (e.g. 0.01 mm).
  - A digital scale is checked for numerical legibility and for any scratches on the display. If the numbers are of poor resolution (checked at display value 88.88), the battery is changed and the resolution is checked again. The numbers must be clearly visible in each measuring position.

6.2 Functional check

The gauge is tested for full and smooth running over its specified distance. The play (air) in the guides is checked. The indicated value should not change when the gauge is fixed using the fixing screw.

6.2.1 Parallelism of measuring surfaces

The parallelism is checked by observing the air slot when the measuring surfaces are in contact. It should not change when the gauge is fixed using the fixing screw. The zero position of the depth measuring bar is also checked. Observed deviations are recorded in the calibration report.

6.3 Measurement of deviations

Deviations of gauges with measuring range 0-150 mm are checked using gauge blocks of the dimensions stated in Table 1. The deviation of the inside measure is checked at the value 50 mm using the gauge ring (for all measuring ranges). Deviations of gauges greater than 150 mm are checked using the coordinate measuring machine according to the following procedure (for one measuring position):

CALIBRATION OF MICROMETERS (OP 5)

1 INTRODUCTION

This procedure describes the steps taken when calibrating micrometers of any dimension using the DIN 863 standard.

1.1 General aspects

Micrometers are used to measure distances, either inside or outside measures, where attachments to the micrometer or special types of micrometer are used to measure inside dimensions. Ceramic gauge blocks and a coordinate measuring machine are used to calibrate the micrometer.

2 MEASURING EQUIPMENT

- Ceramic gauge blocks 2.5 - 25 mm - MITUTOYO, 10 blocks
- Steel gauge blocks 0.5 - 100 mm - KOBA, 122 blocks
- Steel gauge blocks 125 - 500 mm - KOBA, 8 blocks
- Steel gauge blocks 500 - 1000 mm - FRANK, 5 blocks
- Optical flat - TESA
- Gauge rings 4 - 275 mm - MITUTOYO

3 RECEIPT

The micrometer is received from the customer and is visually checked for any obvious defects. Defects checked for are corrosion; missing or worn parts of the scale (on classical scales); missing parts such as screws; and significant scratches or other defects which would impede use. The customer name and the type and serial number of the micrometer are also noted and tagged onto the micrometer using a yellow identification sticker. The procedure for attaching this sticker is described in paragraph 12.10.2 of the Quality manual. The number of gauges is checked and compared with the accompanying documentation.

4 CLEANING

The measuring surfaces of the gauge blocks and the micrometer anvils are cleaned using petroleum ether. Any grit or other particles are also cleaned from the scale. The surfaces are wiped afterwards using the tougher side of a chamois leather (or a special synthetic cloth).

5 THERMAL STABILISATION

The temperature of the micrometer is stabilised at 20 °C ± 1 °C for 5 hours.

6 CALIBRATION

6.1 Visual inspection

- Measuring surfaces are checked for any scratches which may impede calibration.
- The scale is checked:
  - A classical scale is checked for any marking lines or numbers on the scale which may be missing or worn. The marking lines should be of the same width and have sharp edges.
  - A digital scale is checked for numerical legibility and for any scratches on the display. If the numbers are of poor resolution (checked at display value 88.88), the battery is changed and the resolution is checked again. The numbers must be clearly visible in each measuring position.

6.2 Functional check

The micrometer is tested for full and smooth running over its specified distance. The lock nut is tested to see that it holds the micrometer spindle firmly at various positions.

6.2.1 Parallelism of measuring surfaces

The optical flat is placed between the two anvils, with the micrometer tightened in the normal way using the ratchet. The flat is given a small twist to check the grip of the anvils and to remove any dust that may have settled on the surfaces. The number of interference lines or circles is counted on both measuring faces. Interference lines or circles that appear anywhere in the region less than 0.4 mm from the edge of the flat are ignored in the count. The numbers are added together, and if the total is less than or equal to the values listed in the table below, the parallelism of the measuring surfaces is in tolerance:

Measuring Range (mm)    Allowable interference lines or circles for parallelism (DIN 863)
0 to 25                 6 (≈ 2 µm)
25 to 50                6 (≈ 2 µm)
50 to 75                10 (≈ 3 µm)
75 to 100               10 (≈ 3 µm)

If the parallelism is out of tolerance, a remark is written in the calibration report.

6.2.2 Flatness of measuring surfaces

The flatness of the micrometer anvils is inspected by means of the optical flat, placed between the two anvils following the procedure described in section 6.2.1. If any interference lines are observed, the two outer lines are ignored and the other lines are counted. Per DIN 863, 2 interference lines or circles correspond to a flatness of approximately 0.6 µm.

6.3 Measurement of deviations

6.3.1 Micrometers for measuring outside dimensions

6.3.1.1 Starting point

The specified starting distance is checked:

- for the zero value (for micrometers of 0-25 mm), by tightening the micrometer (using the ratchet) until it tightens no further and then reading off the value, or;
- for values other than zero, in the same way but with the use of a gauge block of appropriate dimension (e.g. for a 50-75 mm micrometer the starting point is checked using a 50 mm gauge block).

The deviation is stated in the calibration report.

6.3.1.2 Deviations in measuring positions

For micrometers of measuring range 0-25 mm the following gauge block dimensions are used: 2.5, 5.1, 7.7, 10.3, 12.9, 15.0, 17.6, 20.2, 22.8, 25 mm. For micrometers of other measuring ranges the same gauge blocks are used, but each is attached to a gauge block of a dimension that is equal to the lower measuring range limit.

The gauge block (or a combination of gauge blocks) is placed between the two anvils, with the micrometer tightened in the normal way using the ratchet so that the block is held firmly. The block is given a small twist to check the grip of the anvils and to remove any dust that may have settled on the surfaces. The block is also placed in a position such that the anvils touch the centre of the gauge measuring faces. If more than one gauge block is used, an additional value of 0.2 µm is added to the sum of the gauge block lengths for each joint (experimentally evaluated).

6.3.2 Micrometers for measuring inside dimensions

6.3.2.1 Starting point

The deviation in the starting point is checked using a gauge block of the dimension that is equal to the lower measuring range limit. The gauge block is put into a special gauge block holder for checking inside dimensions (Fig. 1).

Figure 1: Gauge block holder for checking inside dimensions

The deviation is stated in the calibration report. If more than one gauge block is used, an additional value of 0.2 µm is added to the sum of the gauge block lengths for each joint (experimentally evaluated).

6.3.2.2 Deviations in measuring positions

The same gauge blocks are used as in 6.3.1.2, but they are put in the holder (Fig. 1). The micrometer is gently rotated in order to find the minimum distance.

6.3.3 Calibration of standard bars for micrometers

Standard bars for micrometers are calibrated in the same way and with the same device as long gauge blocks. The procedure and uncertainty are described in SOP 16.

6.3.4 Three-point micrometers for measuring inside diameters

6.3.4.1 Starting point

The deviation in the starting point is checked using a gauge ring of the dimension that is equal to the lower measuring range limit. The deviation is stated in the calibration report.

6.3.4.2 Deviations in measuring positions

Two gauge rings are used for checking deviations: one diameter approximately at the middle of the measuring range and the other approximately at the upper limit of the measuring range. The measurement is repeated in three different positions (the gauge ring is rotated twice by 120°) and the mean value is stated in the calibration report.
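A small sketch of the wrung-stack correction used in 6.3.1.2 and 6.3.2.1 above; the helper name is an assumption, while the 0.2 µm-per-joint value is the experimentally evaluated figure from the procedure:

    def stack_length_mm(block_lengths_mm):
        """Nominal length of a wrung gauge block stack: the sum of the
        block lengths plus 0.2 um (0.0002 mm) for each wringing joint."""
        joints = max(len(block_lengths_mm) - 1, 0)
        return sum(block_lengths_mm) + joints * 0.0002

    print(stack_length_mm([50.0, 5.1]))        # 55.1002 (one joint)
    print(stack_length_mm([50.0, 2.5, 5.1]))   # 57.6004 (two joints)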