ENGINEERING METROLOGY TOOLBOX  
 Frequently Asked Questions (FAQ) 

The Frequently Asked Questions (FAQ) page is a web-based collection of dimensional metrology questions and answers, grouped by topic.

Frequently Asked Questions (FAQ) Contents
- Calibrations
- Traceability
- Standards and Procedures
- Miscellaneous

Calibrations
- Where do I find information about NIST Calibration Services?
- Where do I send dimensional artifacts and items?
- How often must I recalibrate an instrument or artifact?
- What is the official NIST number on my report?
- Does NIST “certify” instruments or artifacts? Does NIST report an artifact out of tolerance?
- Does NIST calibrate length-measuring instruments, such as micrometers, calipers, and laser interferometers?
- Does NIST calibrate miscellaneous dimensional quantities, such as plating thickness, focal length of lenses, radius of curvature, and surface plates?
- Why are NIST calibration fees so high?
- How should I calibrate my [insert name of low-accuracy hand-held gage here]?
- If I have my gage blocks calibrated every two years, can I average the data and reduce the uncertainty? What would be the new uncertainty?
- The size of my gage block seems to change every time it is calibrated. Why is it changing?

Question: Where do I find information about NIST Calibration Services?

Answer: Calibration services, costs, and technical staff contacts for standard dimensional artifacts are updated and published yearly in the NIST Special Publication 250 Appendix “Fee Schedule” by the NIST Calibration Program Office:

Calibration Services
National Institute of Standards and Technology
100 Bureau Drive, Stop 2300
Gaithersburg, MD 20899-2300
Telephone: (301) 975-2092
Fax: (301) 869-3548
E-mail: calibrations@nist.gov

For calibration of non-standard dimensional items, contact the NIST technical staff listed for Special Tests by telephone or e-mail. It will often be necessary to provide drawings, a detailed description of what you want measured, and the desired uncertainties.


Question: Where do I send dimensional artifacts and items?

Answer: Engineering Metrology Group mailing and shipping address:

National Institute of Standards and Technology
100 Bureau Drive, Stop 8211
Gaithersburg, MD 20899-8211

Notice: All dimensional artifacts and items should be properly packed to prevent damage during shipment. Steel artifacts should be protected with a rust-inhibiting substance or material. All shipping containers will be reused to return items.


Question: How often must I recalibrate an instrument or artifact?

Answer: You need to calibrate often enough that the instrument or artifact is known well enough that the quality of your product is not adversely affected. You must know what uncertainty you require of the instrument or artifact, and you need to know how the calibration is expected to change with time. Most systems for setting calibration intervals assume that the calibration drifts slowly over time, or at least that the change is describable by some reasonable statistical distribution. A good reference for these types of calculations is NCSL International (NCSLI) Recommended Practice RP-1.

Remember that if an instrument comes in for calibration and is found to be seriously out of calibration, you must suspect the judgments made on the basis of data from that instrument. This may involve recalling parts or other instruments for rechecks. Thus, it is economically bad practice to set recalibration intervals that are too long. On the other hand, if every instrument is in calibration when checked, the intervals may be too short, at least in the sense that you are paying for unneeded calibrations. The rules in RP-1 are an attempt to optimize these tradeoffs.


Question: What is the official NIST number on my report?

Answer: The official NIST number, or the number that NIST uses to keep track of reports, is a long number containing the number of the Division performing the service, a number generated by the NIST Calibration Program Office, and the year the service is performed. This number should be on every page of the report. For example, a typical number for our Division (821) is 821/123456-97.

Many reports will also have other reference numbers. Our Group, for example, uses M numbers, in the form of an M followed by four digits, to keep track of the calibrations we perform. Why do we use M numbers? Well, looking back, we used to use L numbers (an L followed by four digits), and after L9999 we went to M0001. Before those L numbers, there was another series of L numbers stretching back to the 1920s. Why were L numbers repeated? We really do not know, but perhaps long ago there were fewer letters in the alphabet.


Question: Does NIST “certify” instruments or artifacts? Does NIST report an artifact out of tolerance?

Answer: The only artifacts that NIST certifies for compliance with standards are American Petroleum Institute (API) gages. For other artifacts, we simply report the result of a measurement, and our customers are responsible for deciding conformance with specifications or standards.


Question: Does NIST calibrate length-measuring instruments, such as micrometers, calipers, and laser interferometers?

Answer: The only length-measuring instruments that we calibrate at NIST are laser displacement interferometers, and this is a non-routine calibration service. As a Special Test, this service can be arranged with Jack A. Stone.


Question: Does NIST calibrate miscellaneous dimensional quantities, such as plating thickness, focal length of lenses, radius of curvature, and surface plates?

Answer: No. There are many things that we do not calibrate at NIST. The Engineering Metrology Group has fewer than a dozen people, most of whom make measurements only part time. We cannot possibly measure every gage the country needs, and limitations on space and equipment force us to choose the calibrations we think will do the most good for the most industries; that choice changes slowly over time in response to customer feedback. We send a response card with every calibration and attend the professional meetings our customers attend to get a better feel for what we should be calibrating. So, if you do not see what you need in NIST Special Publication 250, contact us immediately. If we have the equipment, we can sometimes do the calibration as a Special Test. If enough people contact us about performing certain calibrations, we can go up our chain of command and argue for the resources to add the calibration to our offerings.

Look at the answer to the next question for more advice.

If NIST cannot calibrate an artifact, where else can I go for help?

NIST does not maintain any master list of calibration labs. Two sources of information on dimensional metrology laboratories are the National Voluntary Laboratory Accreditation Program and the American Association for Laboratory Accreditation.

Both of these accreditation bodies have a database of accredited laboratories that can be searched. The laboratory scope of accreditation is available, so you can see if they do the calibration you need.


Question: Why are NIST calibration fees so high?

Answer: Our fees are high compared to commercial calibration labs, but the NIST uncertainty is smaller. The cost versus uncertainty curve gets fairly steep for very low uncertainties. For example, our price for calibrating gage blocks by mechanical comparison may be 10 times the cost from an industry lab.

U.S. Federal Government rules require that we fully recover all costs involved in the calibration service. For example, each size of gage block has two master blocks used as references in the calibration. These master blocks are calibrated by interferometry. We maintain about 900 master blocks, measuring them on a 3-year cycle, and the cost is about one staff-year. This cost must be recovered from the customers. The measurement assurance program involves a fair amount of effort for statistical process control (SPC) of the calibrations, as well as less direct expenses such as configuration control of the computer programs, process improvement testing, and so forth. Other operating expenses are electricity, air conditioning, and the overhead expenses common to any business.

Finally, one third of the charge goes to the NIST Calibration Program Office, in part to finance the work of that office and in part to fund research into calibration improvements. Because most of this money returns to the Division that generates it, dimensional metrology customers are, in other words, funding research into dimensional calibrations. The U.S. Congress mandates this research surcharge on calibrations.


Question: How should I calibrate my [insert name of low-accuracy hand-held gage here]?

Answer: We have no idea. OK, that is a bit strong; we may have a few ideas. But remember that NIST expertise is focused on very high accuracy measurements, so even if we have a procedure, it might be inappropriate (read: too expensive) for the uncertainty level you need. We have always thought it would be a good gesture for manufacturers to provide a calibration procedure as part of the instruction manual for any instrument they sell. Since the cost of having an engineer write a calibration procedure probably exceeds the cost of many measuring instruments, you might keep this in mind when choosing vendors for your equipment.

One good approach is to ask people. The Internet is very useful this way; if you send a question to list servers or forums, you will get very detailed and expert advice on all sorts of things. You will also get answers of marginal sanity. You just have to be able to sort out what you need. Also, there is a government and industry effort, the Government-Industry Data Exchange Program (GIDEP), that collects procedures and makes them available to the public.


Question: If I have my gage blocks calibrated every two years, can I average the data and reduce the uncertainty? What would be the new uncertainty?

Answer: While averaging data usually gives better results, the exact change in uncertainty may be difficult to calculate. The NIST uncertainty budgets contain some terms that are partially correlated over time and others that are fully correlated. If you are uncomfortable dealing with such complications, our advice is to average the results to get a better value, but continue to use the stated uncertainty.
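As a rough illustration of why the calculation is not simply “divide by the square root of the number of calibrations,” the sketch below (Python) splits the uncertainty into a random part that averages down and a correlated part that does not. The numerical split is hypothetical, not an actual NIST budget.

    import math

    def averaged_uncertainty(u_random, u_correlated, n):
        # Standard uncertainty of the mean of n calibrations, assuming
        # u_random is independent from calibration to calibration while
        # u_correlated (e.g., from the master standards) repeats each time
        # and therefore does not average down.
        return math.sqrt((u_random / math.sqrt(n)) ** 2 + u_correlated ** 2)

    # Hypothetical split: 10 nm random and 7.5 nm correlated standard components.
    print(averaged_uncertainty(10.0, 7.5, n=1))   # 12.5 nm for a single calibration
    print(averaged_uncertainty(10.0, 7.5, n=3))   # about 9.5 nm, not 12.5/sqrt(3)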


Question: The size of my gage block seems to change every time it is calibrated. Why is it changing?

Answer: There are two major sources of change in the calibration of an artifact. First, there are always small changes because there is some random error in all calibrations, even those from NIST. For example, the current NIST uncertainty for a 2 mm gage block is about 25 nm (1 µin). This means that if we calibrate the same gage block many times, we would expect most (about 95 %) of the calibrations to be within a ±25 nm interval around our answer. Thus, from year to year, there will be some gage blocks that vary from their previous history.

Looking at this another way, since we are serious when we say the uncertainty is 25 nm at the 95 % confidence level, for every 100 calibrations of 2 mm gage blocks, we are fairly sure that 5 measurements are more than 25 nm from the “truth.” In fact, if all of the blocks are somehow found to be inside the uncertainty limit, our uncertainty is wrong, and would need to be reanalyzed.
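A quick way to see why “everything inside the limit” is suspicious: if the 95 % statement is honest, the chance that 100 independent calibrations all land inside the expanded-uncertainty interval is small. A one-line check in Python:

    # Probability that all 100 calibrations fall inside a correctly
    # stated 95 % expanded-uncertainty interval.
    print(0.95 ** 100)   # about 0.006, i.e., roughly 0.6 %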

The second possibility is that the gage block is actually changing size. Steel is made up of a number of different phases, that is, the atoms are arranged in different patterns in the metal. Unfortunately, some of these phases are not in equilibrium; the atoms in some patterns would actually be happier (in a lower energy state) in another pattern. Along the edges where these phases meet, atoms can hop from one phase to another. Since the phases have different densities, as atoms change phases the overall size changes, and the artifact slowly changes size. Some gage blocks grow while others shrink, by as much as several µm/m/year. Thus, a long gage will grow or shrink more, in absolute terms, than a short one.

The gage block standard has a specification that limits the shrinkage or growth to 1 µm/m/year. In fact, for most steel, chrome carbide, and tungsten carbide gage blocks, the shrinkage or growth is well within this limit. Some of the newer materials do not have enough history to tell how stable they are.
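Because the limit is stated per meter of length, the allowed drift scales with gage length. A minimal sketch of that arithmetic (Python), using the 1 µm/m/year limit from the gage block standard:

    def drift_um_per_year(length_mm, rate_um_per_m_per_year=1.0):
        # Absolute size change per year for a gage of the given length.
        return rate_um_per_m_per_year * (length_mm / 1000.0)

    print(drift_um_per_year(100))   # 0.1 um/year for a 100 mm block
    print(drift_um_per_year(500))   # 0.5 um/year for a 500 mm block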

Some materials to be cautious of are glasses and invar. Nearly all glass will shrink over the first few years after being manufactured. Glass is a very, very viscous liquid. After it is cooled, the surface tension slowly draws the atoms together. In the first year of new glass, changes of 10 µm/m are common. The changes slow down, and after 20 or 30 years things are pretty good. Fused silica and fused quartz do not show this behavior, even though they are glasses, because they are much more viscous; they shrink so slowly that it does not matter, although you might have them checked after a century or so.

Invar is an iron-nickel alloy that exhibits very low thermal expansion. Most invar is fairly unstable dimensionally, although some is not. With invar it pays to have it calibrated a couple of times, maybe every year, until you see whether it is stable. Super-invar seems to be much more dimensionally stable, but it costs more.


Traceability
- How do I establish traceability to NIST? What is traceability and how can I get it for my master gages?
- Does traceability to foreign national labs, such as the United Kingdom's National Physical Laboratory (NPL), Germany's Physikalisch-Technische Bundesanstalt (PTB), and Canada's National Research Council (NRC), imply traceability to NIST?
- Is laser vacuum wavelength an intrinsic standard, automatically traceable to the SI unit of length?

Question: How do I establish traceability to NIST? What is traceability and how can I get it for my master gages?

Answer: It is important to note that, for length measurements, NIST does not legally define traceability or determine what constitutes “traceability to NIST.” Traceability is really determined by your customer: an auditor might determine it, or it may be set out in a regulation from another government agency, such as the Department of Energy (DOE), the Food and Drug Administration (FDA), or the Department of Defense (DOD). Nevertheless, we can make some comments about current trends in defining traceability.

The basic idea behind traceability is that we should all use the same unit, the meter in our case. How to implement this idea has a long history, and not a very pretty one at that. For many years, it meant that you needed a NIST number on your report to show traceability. Your calibration source got the number from NIST or from another lab; that lab could in turn have gotten it from NIST or from yet another lab, and so forth. We have heard from customers who, having traced back some of their calibrations, found chains going back through seven or more labs to NIST. It is doubtful that the accuracy of the NIST calibration was preserved through such a chain.

What is new in traceability is uncertainty. For a measurement to be traceable, it must have an associated uncertainty and be part of an unbroken chain of comparisons back to the relevant source (e.g., a national lab, an intrinsic standard, and so forth), and each of these comparisons must have a stated measurement uncertainty. In order to know the uncertainty, a lab must know the uncertainty from the previous lab, not just the answer. If the number of labs in the chain to NIST is large, the uncertainty will grow accordingly. In most countries, the laboratory accreditation organization makes sure that each lab provides a reasonable estimate of uncertainty and that the chain of measurements is continuous, taking most of the guesswork out of the process. If you use a non-accredited lab, you are responsible for making sure that your supplier knows the uncertainty, the traceability path, and so forth.
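To see how uncertainty grows along a chain, a common approach is to combine the standard uncertainty added at each comparison in quadrature (root-sum-square). The stage values in the sketch below are purely hypothetical; the point is only that a long chain cannot preserve the uncertainty of its first link.

    import math

    def chain_uncertainty(stage_uncertainties_um):
        # Root-sum-square of the standard uncertainties contributed by
        # each comparison in an unbroken calibration chain.
        return math.sqrt(sum(u ** 2 for u in stage_uncertainties_um))

    # Hypothetical chain: NMI -> accredited lab -> in-house lab -> shop gage
    print(chain_uncertainty([0.025, 0.05, 0.10, 0.25]))   # about 0.28 um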

NIST developed an organizational policy on traceability and a set of related supplementary materials. For information and reference, visit the NIST Traceability web site.

Sometimes it is necessary to show that a measuring instrument is traceable to NIST. A plausible approach is usually to use the instrument to measure one or more artifacts whose dimensions are traceable to NIST. Often this is the very best approach, particularly if you can measure several NIST-traceable artifacts that are identical or similar to what the machine is intended to measure in normal operation. Once again, however, NIST cannot require an auditor to accept this as a suitable method of establishing traceability; we have no legal authority to do so.


Question: Does traceability to foreign national labs, such as the United Kingdom's National Physical Laboratory (NPL), Germany's Physikalisch-Technische Bundesanstalt (PTB), and Canada's National Research Council (NRC), imply traceability to NIST?

Answer: The U.S. has many written agreements with foreign countries recognizing that, at the national laboratory level, we have the same units. For information and reference about international agreements between NIST and foreign agencies and organizations, visit the NIST Office of International and Academic Affairs (OIAA) web site. In addition to many bilateral agreements, NIST signed a multilateral Mutual Recognition Arrangement (MRA) through which most of the world's National Metrology Institutes (NMIs) recognize each other's measurements. For information and reference about the MRA, visit the web site of the International Bureau of Weights and Measures (Bureau International des Poids et Mesures, BIPM).

Unfortunately, this does not necessarily translate to traceability. NIST cannot do much about this problem because the rules for traceability are enforced by other agencies. You have to consult with your specific auditor for the interpretation of traceability.

In most countries, traceability is defined and interpreted by a government accreditation agency, and these agencies then work toward reciprocity with other countries' accreditation agencies. There are a number of laboratory accreditation bodies (ABs) in the U.S. that are signatories to the international agreement (the ILAC MRA). The signatories to the ILAC Mutual Recognition Arrangement are listed on the ILAC web site. Note that “calibration” and “testing” are considered different fields of accreditation, and you should look into the definitions of these terms when using the table of signatories. The MRA basically assures you that a certificate from a lab accredited by one AB is acceptable to all of the other ABs that are signatories.


Question: Is laser vacuum wavelength an intrinsic standard, automatically traceable to the SI unit of length?

Answer: For arbitrary lasers, there is no clear-cut and widely accepted answer to this question. However, persuasive arguments in favor of this viewpoint have been made for the case of the red helium-neon (He-Ne) laser most widely used in industry, subject to several caveats discussed at the end of this answer. The Consultative Committee for Length of the BIPM (International Bureau of Weights and Measures) subscribes to this viewpoint (see J. A. Stone et al., Metrologia 46 (2009) 11-18).

As discussed below, the vacuum wavelength of a He-Ne laser operating on the standard red transition (3s₂→2p₄) cannot deviate by more than two or three parts in 10⁶ from its nominal value. The vacuum wavelength is set by the atomic structure of the neon atom, and in that sense it is an intrinsic standard. If you do not need accuracy better than a few parts in 10⁶, it is our opinion that there would be nothing gained by actually measuring the vacuum wavelength. If you need greater accuracy, the vacuum wavelength may be determined by comparison with an iodine-stabilized laser. If you would like us to measure the vacuum wavelength of your lasers, you can make arrangements with Jack A. Stone.

If the background discussion below is more technical than you need, you can skip directly to the Summary near the end of this answer. Beware that most of the discussion here is about the vacuum wavelength, whereas many applications require the wavelength in air.

A He-Ne laser operates at a frequency near the center of the Doppler-broadened neon gain curve. The exact frequency or vacuum wavelength of the laser depends on two factors: (1) the wavelength difference between the actual operating point and the center of the gain curve and (2) the exact location of the center of the curve.

The first factor above will probably never exceed ±2 parts in 10⁶ for lasers of realistic design. The Doppler width of the gain curve at half-height is about ±1.5×10⁻⁶ (that is, 1.5 parts in 10⁶), and we would not expect the laser to operate far outside this range. The value ±2×10⁻⁶ should be a conservative estimate of the possible fractional range of variation except (possibly) for tubes that have been enriched in ²²Ne in order to broaden the gain curve. Although some He-Ne lasers have been reported to operate at somewhat more than 1 part in 10⁶ from the center of the gain curve, we are not aware of any reports of lasers operating outside the range of ±2 parts in 10⁶.

The wavelength corresponding to the center of the gain curve depends slightly on gas pressure and more significantly on the isotope of neon used in the tube. For ²⁰Ne, the center of the gain curve lies within 1 part in 10⁷ of 632.99140 nm, with the exact value depending on gas pressure. For ²²Ne, the center of the gain curve shifts toward smaller wavelengths by about 2 parts in 10⁶.

Summary: If the laser tube is filled with ²⁰Ne, the vacuum wavelength is 632.99140 nm, with a conservative estimate of the relative expanded uncertainty (coverage factor k=2) being 2×10⁻⁶. Existing evidence indicates that tubes filled with natural neon (90 % ²⁰Ne and 10 % ²²Ne) will also fall within 2×10⁻⁶ of this wavelength. The situation is a bit less clear for tubes containing more ²²Ne, but in any event, even if the ²²Ne content is entirely unknown, the wavelength must lie within ±3×10⁻⁶ of 632.9908 nm, a value approximately halfway between the ²⁰Ne and ²²Ne peaks.

The ±3×10⁻⁶ fractional range of possible values can be taken as an estimate of the expanded (k=2) relative uncertainty for the He-Ne vacuum wavelength when nothing is known about the isotope mix of the tube, assuming that the laser is operating on the standard red transition (3s₂→2p₄). That is, the relative standard uncertainty is estimated as 1.5×10⁻⁶. This should be a very safe, conservative estimate of the uncertainty; one might be able to argue that a smaller uncertainty is justifiable, but very few people would argue that a larger uncertainty is appropriate.
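In absolute terms, these relative bounds are tiny. A short Python check, using the nominal wavelengths quoted above:

    nominal_20Ne_nm = 632.99140   # gain-curve center for a pure 20Ne fill
    nominal_mix_nm = 632.9908     # value quoted when the isotope mix is unknown

    print(nominal_20Ne_nm * 2e-6)   # about 0.0013 nm (k=2) for a 20Ne tube
    print(nominal_mix_nm * 3e-6)    # about 0.0019 nm (k=2) when the mix is unknown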

However, it is extremely important to note the following caveats:

1.  It must be verified that the laser is indeed a helium-neon laser (not a red diode laser) and that it operates on the standard red transition (3s₂→2p₄). Almost all red He-Ne lasers sold in the U.S. operate on this line, but operation near 640 nm or 612 nm is also possible. If there is any doubt in your mind, it would be good to verify with the manufacturer that your particular laser model is indeed a 633 nm laser and not a 640 nm or 612 nm laser.

It must be noted that, on occasion, high-power He-Ne lasers (> 4 mW output power) with long tubes (> 250 mm length) may have an output that is contaminated by a small amount of radiation at other wavelengths, most commonly 640 nm. For lasers manufactured in the last 20 years, we have heard no reports of lasers that operate predominantly on the wrong transition. Even in those rare cases where the output is contaminated, the spurious radiation is usually less than 5 % of the total output power, and the effect on measurement results is extremely small for typical applications. Nevertheless, some caution is needed when using lasers with relatively large output power; see the Metrologia article (referenced above) for details.

2.  Most often in metrology applications, it is the wavelength in air, rather than the vacuum wavelength, that is relevant to the measurement, and this must be obtained by dividing the vacuum wavelength by the index of refraction of air. In typical situations, the index of refraction of air is determined by calculation from measured atmospheric parameters (most importantly, atmospheric temperature and pressure). The uncertainty in the index of refraction of air, and hence the uncertainty in the air-wavelength, is then often dominated by the uncertainty with which atmospheric conditions are known. The uncertainty of the sensors must be known before it is possible to assign an uncertainty to the air wavelength.
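As a minimal sketch of that last step, the conversion itself is just a division; the refractive-index value below is illustrative for roughly 20 °C and 101.325 kPa, and in practice it would be computed from the Edlén or Ciddor equations using your measured air temperature, pressure, and humidity.

    lambda_vac_nm = 632.99140   # He-Ne vacuum wavelength
    n_air = 1.0002718           # illustrative index of refraction of air near 20 degC, 101.325 kPa

    lambda_air_nm = lambda_vac_nm / n_air
    print(lambda_air_nm)        # about 632.8194 nm

    # Sensitivity: a 1 degC error in the air temperature changes n by roughly
    # 1 part in 10**6, shifting the computed air wavelength by about 0.0006 nm.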


Standards and Procedures
- What is the NIST standard for this?
- How can I obtain dimensional metrology standards? How can I obtain procedures for use and calibration of dimensional measuring equipment?

Question: What is the NIST standard for this?

Answer: There is none. The “Standards” in National Institute of Standards and Technology refers to metrology standards, such as the meter and the volt. Long ago, when NIST was named the National Bureau of Standards (NBS), it did write standards and Letter Circulars, mostly because there was no organized national effort for standards writing. Many of these old government standards and Letter Circulars became American Society of Mechanical Engineers (ASME) standards for threads, American Society for Testing and Materials (ASTM) standards for sieves, or other voluntary standards. Our technical staff does have a large presence on standards committees, but NIST does not have any direct authority over the standards system. For more information on the NIST role in U.S. domestic and international standards, visit the NIST Office of Standards Services web site.


Question: How can I obtain dimensional metrology standards? How can I obtain procedures for use and calibration of dimensional measuring equipment?

Answer: There are many information sources for standards on the Web. Check out these web sites:

The National Standards Systems Network (NSSN) is a web-based system designed to provide users with a wide range of standards information from major standards developers, including developers accredited by the American National Standards Institute (ANSI), other U.S. private-sector standards organizations, government agencies such as the U.S. Department of Defense, and international standards organizations.

The Department of Defense Single Stock Point for Military Specifications and Standards (DODSSP) was created to centralize the control, distribution, and access to the extensive collection of Military Specifications, Standards, and related standardization documents either prepared by or adopted by the U.S. Department of Defense. To search for U.S. Department of Defense and other U.S. government standards, such as the GGG series, use the ASSIST Quick Search.

Some standards may deal with calibration of measuring equipment or include information on best practices for use of measuring equipment.


Miscellaneous
- What brand of [dimensional measurement instrument, artifact] should I buy?
- At what temperature and humidity should I keep my dimensional metrology lab? What if I want it warmer than 20 °C?
- What is the coefficient of thermal expansion (CTE) of [insert substance here]?
- How do I get the uncertainty of my calibrations?
- Are there any training courses in dimensional metrology?
- What is the definition of the meter and the inch?
- How do I order a NIST publication?
- I checked my thread wire with my [name brand here] micrometer and your number is small by 1 micrometer. How could you make such a big mistake [you fool, you fool]?
- What is the best material for my dimensional artifacts, such as gage blocks, balls, and cylinders?
- What should I use to clean my dimensional artifacts, such as gage blocks?
- Can I visit NIST?

Question: What brand of [dimensional measurement instrument, artifact] should I buy?

Answer: Sorry, we really cannot help with this question. The “best” instrument depends on so many factors, such as cost, specifications, ease of use, the accuracy needed for your use, environment, and operator training, that nobody could really tell you what you should use. Only you can balance your needs against all of the relevant factors.


Question: At what temperature and humidity should I keep my dimensional metrology lab? What if I want it warmer than 20 °C?

Answer: The national and international reference temperature for dimensional measurements is 20 °C. The actual statements are in ISO 1 and ASME/ANSI Y14.5. The basic idea is that since all materials change size as the temperature changes, the dimensions on a drawing must refer to some temperature or the drawing is meaningless. Thus, unless there is specific wording in a standard or on a drawing that says the dimensions refer to a different temperature, the assumption is that the temperature is 20 °C. Why this temperature was chosen is not actually known, but it seems to have been a generally accepted idea by the 1930s and was eventually codified as ISO 1.

Now, what if you do not want to be cold? Well, the only difference between measuring at 20 °C and 23 °C is that you need to make a correction for the thermal expansion of the material. For steel the coefficient of thermal expansion is 11.5 µm/m/°C. For example, for a

100 mm (about 4 in) steel gage block at 23 °C

the expansion is

11.5 µm/m/°C × 0.1 m × 3 °C ≈ 3.5 µm (about 135 µin).

Thus, if you do not make the correction, your answer will be off by this amount. If an error of this size is negligible compared to the uncertainty you need, you are fine.

Unfortunately, even if you make the correction, there is the uncertainty in the coefficient of thermal expansion to worry about. For steel gage blocks, the gage block standard says that the coefficient of thermal expansion must be 11.5 µm/m/°C ± 1.0 µm/m/°C. Thus, even if you make the correction, there is still this leftover uncertainty from choosing the coefficient. At 23 °C the uncertainty of 1.0 µm/m/°C gives an uncertainty of 3.0 µm/m; on a 100 mm block this is 0.3 µm (12 µin). This uncertainty goes in your uncertainty budget, and if it is negligible you are fine. If not, you may have to measure closer to 20 °C.

For other materials, especially parts, the coefficient of thermal expansion may not be so well known. Note that the coefficients given in many reference books are averaged over a very large temperature range. The coefficients are temperature dependent, so the uncertainty could be quite large. Also, you may not know exactly what material you have. Stainless steel, for example, is a very broad class of metals that vary from below 10 µm/m/°C to over 18 µm/m/°C.

The more you know about the coefficient of thermal expansion the easier it is to measure at nonstandard temperatures without accumulating much uncertainty.
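The arithmetic above is easy to script. A minimal sketch (Python) reproducing the 100 mm steel example, along with the leftover uncertainty from the ±1.0 µm/m/°C coefficient tolerance:

    def expansion_um(length_mm, cte_um_per_m_degC, delta_t_degC):
        # Size change relative to the 20 degC reference temperature.
        return cte_um_per_m_degC * (length_mm / 1000.0) * delta_t_degC

    # 100 mm steel gage block measured 3 degC above the 20 degC reference:
    print(expansion_um(100, 11.5, 3.0))   # 3.45 um correction
    print(expansion_um(100, 1.0, 3.0))    # 0.3 um uncertainty from the CTE tolerance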

Relative humidity should be kept below 45 % to avoid excessive rusting of gages. Temperature and humidity in dimensional metrology laboratories are discussed in the standard ANSI B89.6.2.


Question: What is the coefficient of thermal expansion (CTE) of [insert substance here]?

Answer: For gages, the manufacturer should give you this information. If they don’t, you have at least a small problem. Looking in a reference book for the coefficient of thermal expansion (CTE) involves a number of assumptions.

a.  When I look up “steel” in the reference book, is it the same “steel” as my gage? The more information you have, the better. A worst-case example is “stainless steel,” where 300-series and 400-series stainless steels have CTEs of around 18 µm/m/°C and 10 µm/m/°C, respectively. If you do not know which you have, the uncertainty in the CTE is rather large.

b.  Reference book values are averages over a given temperature range, and a competent reference will at least give you the range. Since the CTE varies with temperature, an average CTE over the range 0 °C to 200 °C leaves a fair amount of uncertainty in the value you need at 20 °C.

The upshot is that you must do your best. If the manufacturer does not give you a value and an uncertainty, you must scramble about and make sure your estimated uncertainty reflects your ignorance of the true value.
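If all you know is a plausible range for the CTE, one common convention (from the GUM) is to treat the range as a rectangular distribution, so the standard uncertainty is the half-width divided by the square root of three. A sketch for the stainless steel example above:

    import math

    def cte_standard_uncertainty(cte_low, cte_high):
        # Rectangular (uniform) distribution over [cte_low, cte_high]:
        # standard uncertainty = half-width / sqrt(3).
        return (cte_high - cte_low) / 2.0 / math.sqrt(3)

    # Unidentified stainless steel, CTE somewhere between 10 and 18 um/m/degC:
    print(cte_standard_uncertainty(10.0, 18.0))   # about 2.3 um/m/degC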


Question: How do I get the uncertainty of my calibrations?

Answer: “Do not panic.” (Consult The Hitchhiker's Guide to the Galaxy by Douglas Adams.)

To calculate our uncertainties, NIST uses the comprehensive International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement (GUM), which is available for purchase. A shorter version of this publication is the 1994 edition of Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results by Barry N. Taylor and Chris E. Kuyatt (NIST Technical Note 1297, National Institute of Standards and Technology, Gaithersburg, MD), which is available online in PDF and HTML versions; a printed copy may also be ordered at no charge. Both of these documents look very complicated, but at the core they really are not. You probably have all of the knowledge you need to make your uncertainty statement, but it is not organized in the form demanded by these documents.

There are simpler introductions to the concepts available on the web sites of NIST and of the National Physical Laboratory (NPL) in England. It would not hurt to look over the other NPL “good practice” guides; they are, in fact, quite good. Further, there are links to other references on the web sites of the National Voluntary Laboratory Accreditation Program (NVLAP), the American Association for Laboratory Accreditation (A2LA), and European Accreditation (EA).

The basic task in getting the uncertainty for a measurement is to list all of the sources of uncertainty; this list is called an uncertainty budget. There are two basic types of sources: those that are sampled repeatedly, i.e., measured by the repeatability of the measurement and represented by a standard deviation, and those that are not. Those that are not require an informed estimate (a guess). The trick is that if we are all to have uncertainties on the same scale, we must all make these estimates in about the same way.

For our own use, we prepared a standardized template for an uncertainty budget and wrote a paper explaining the components. The paper has 10 examples from our own calibrations, ranging from the very sophisticated (e.g., gage blocks with measurement assurance and SPC) to the very unsophisticated (e.g., an optical projector, much like everybody else uses). Drop us a line and we will send you a copy; the paper can also be found in the publications portion of the Engineering Metrology Toolbox on the NIST web site, where it is available in a PDF version. Once you have read the paper, we will at least be using the same vocabulary.
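As a minimal illustration of the structure of a budget, not of any particular NIST budget, the sketch below combines one Type A (statistical) component with several Type B (estimated) components by root-sum-square and expands the result with a coverage factor of k = 2; every number in it is hypothetical.

    import math

    def expanded_uncertainty_um(type_a_um, type_b_um, k=2.0):
        # Combined standard uncertainty (root-sum-square of all components),
        # multiplied by the coverage factor k.
        u_c = math.sqrt(sum(u ** 2 for u in type_a_um + type_b_um))
        return k * u_c

    type_a = [0.010]                  # repeatability (standard deviation of the mean)
    type_b = [0.012, 0.008, 0.005]    # master gage, thermal effects, instrument geometry
    print(expanded_uncertainty_um(type_a, type_b))   # about 0.036 um at k=2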


Question: Are there any training courses in dimensional metrology?

Answer: NCSL International (NCSLI) lists many metrology courses that are taught in the U.S.


Question: What is the definition of the meter and the inch?

Answer: The meter (m) is the SI unit of length. It is currently defined as the length of the path traveled by light in vacuum during a time interval of 1/299 792 458 of a second. For the history of the meter, visit these NIST web sites:

- Historical Context of the SI: Unit of Length (Meter)
- Length: Evolution from Measurement Standard to a Fundamental Constant

This number was chosen to match the old definition of the meter as closely as possible; the old definition was the length of a certain number of wavelengths of krypton-86 radiation (light) from an electrical discharge lamp in vacuum.

In practice, the new definition of the meter cannot be used directly in a convenient and accurate manner for laboratory-scale measurements, because it is too difficult to do the time-of-flight measurement. We still count wavelengths, just as we did for krypton. In the new scheme of things, however, you can use one of many light sources, with the uncertainty in realizing the meter depending on how well characterized the light source is. The International Bureau of Weights and Measures (Bureau International des Poids et Mesures, BIPM) has worked with various national laboratories to provide accurate characterizations of the frequency and wavelength of a number of light sources, including certain types of stabilized helium-neon lasers, a variety of other stabilized lasers, and older sources such as spectral lamps of cadmium, mercury, and krypton. The wavelengths and uncertainties of these sources are given in "Practical Realization of the Definition of the Metre (1997)" by T. J. Quinn, Metrologia 36, 211-244 (1999). Anyone with an appropriate laser listed in this reference can realize the meter, although they should participate in periodic interlaboratory testing to ensure that their measurement system is in control. These specialized lasers are very seldom used directly for dimensional measurement because they are more expensive and less convenient to use than other types of stabilized lasers; rather, they are normally used only to calibrate the secondary-level lasers found in interferometers. You could maintain your own specialized laser for this purpose, or you can send your secondary lasers to NIST for calibration.

The inch has always been tied to the meter in the U.S. After the Civil War, the U.S. fixed the relationship at 39.37 inches to the meter. The British had a yard bar, the Canadians used 25.4 mm per inch, and the other English-speaking nations had chosen one of these or some variation. They were all close, but not exactly the same, and the yard bar was in fact shrinking. In 1959, the directors of the national measurement institutes of the English-speaking countries that still used inches met to standardize the inch. Exactly 25.4 mm per inch was chosen, and since then this has been the inch used by NIST (formerly NBS). Except . . . yes, there is always an exception. Surveyors still use the old inch, because the mountains of existing measurements defining the positions of everything in the U.S. are in those inches, and the job of converting them was considered too large a task. So surveyors still use the surveyor's inch, 39.37 inches per meter.
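The numerical difference between the two inches is easy to check; the sketch below is just arithmetic on the two exact definitions.

    international_inch_mm = 25.4            # exactly, by the 1959 agreement
    survey_inch_mm = 1000.0 / 39.37         # from 39.37 inches per meter, about 25.40005 mm

    relative_difference = (survey_inch_mm - international_inch_mm) / international_inch_mm
    print(relative_difference)              # about 2 parts in 10**6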


Question: How do I order a NIST publication?

Answer: NIST publications are scientific and technical documents that have been authored or co-authored by NIST employees or published by NIST. To order NIST publications, visit the NIST Public Affairs web site. Most NIST publications are ordered through the National Technical Information Service. The Journal of Research of NIST is ordered through the Government Printing Office.


Question: I checked my thread wire with my [name brand here] micrometer and your number is small by 1 micrometer. How could you make such a big mistake [you fool, you fool]?

Answer: And what makes you think you can calibrate something as complicated as a thread wire with a mere micrometer? Well, you cannot, at least not without corrections. Thread wires have an unusual definition of diameter. Since they are ordinarily used in contact with a thread, a point contact, the definition of diameter includes some of this deformation. The idea is that if the “diameter” already has the deformation correction in it, the user does not have to make a correction, saving time and brain effort. Thus, the defined “diameter” of a thread wire is the distance across the wire while it is in contact with a cylinder and a flat at a certain force. The cylinder diameter and force vary with the pitch of the wire; they are chosen to be near the defined conditions for three-wire pitch diameter measurements. The deformation varies from 0.5 µm to over 1.5 µm (20 µin to 60 µin), depending on the pitch.

If you sent in the same wire and called it a cylinder, we would have given you the undeformed diameter, which should match your result on your instrument.

All kidding aside, the meanings of technical words, such as diameter, flatness, and so forth, are given in various national and international standards, and they often do not mean what they do in everyday conversation. Besides the special definition of diameter for thread wires, there are other cases, like gage blocks, where there are subtle differences. For gage blocks under 2 mm (0.1 in), the flatness is defined only when the block is wrung down to a thick reference surface. Above 2 mm (0.1 in), the flatness is defined in the free state (unwrung). If a standard exists for the instrument or gage you use, it is a very good practice to get a copy and read it.


Question: What is the best material for my dimensional artifacts, such as gage blocks, balls, and cylinders?

Answer: Every gage material has some niche, or it disappears from the market. The choice of material depends on what the gage will be used for and how it will be used. The main properties to consider are cost, thermal expansion, thermal diffusivity, stability, wear, and wringing properties.

Cost: This is a major driver for many uses, particularly shop use. Most calibration labs have few gages, and a very high accuracy calibration may cost more than the gage itself. The least expensive gage material is usually steel.

Thermal Expansion: Steel has long been a favorite gage material because, in many industries, most parts are made of steel. If the gage and the part have the same thermal expansion, then the temperature at which they are compared is not so important. If the thermal expansions are very different, the metrologist must calculate correction factors or put correction factors on the drawing. If you make parts that are not steel, steel gages are no advantage, of course.

Thermal Diffusivity: If a block is warmer or colder than the lab, from handling, for example, the thermal diffusivity controls how fast the block stabilizes at the environmental temperature. Thus, low thermal diffusivity gages demand longer thermal soak times. The new zirconia ceramics are slower than the metal ceramics and steel, perhaps twice as slow.

Stability: If gages grow or shrink, and many materials do, you have to keep track of the changes and make corrections, or have a short recalibration cycle. Most gages made from gage block steel and properly heat-treated are stable. Our customer data on chrome carbide and tungsten carbide also show excellent stability. There are newer materials, but it will take some time to get enough history to say how stable they are.

Wear: The ceramic gages (e.g., chrome carbide, tungsten carbide, zirconia) have better wear properties than steel. How important this is depends on the use. We have steel master blocks that were used for 30 years before we retired them because they were too scratched to wring well, but we are very, very careful.

Steel will rust if not kept properly oiled. The ceramics are much more corrosion resistant.

Wringing Properties: Some materials seem to wring more consistently than others, which is important if you make gage block stacks as master gages. You should get a few and try them and see how consistent you are with them.


Question: What should I use to clean my dimensional artifacts, such as gage blocks?

Answer: We use mineral spirits for gage blocks that are really greasy, and ethanol (200 proof) as the final cleaner. This seems to work pretty well. We do not use ultrasonic cleaners because both of these solvents are flammable and we would need special hoods and other equipment for safety reasons. We do not use methanol because it is a poison. We have not made formal tests of isopropyl alcohol, but our inherited wisdom from ages past says that it is not as good as ethanol. It may depend on what kind of grease your customers use and what solvent you use for the first cleaning.

We keep the mineral spirits in small safety cans (about 250 mm in diameter) that have a top and a tray inside that you can push down into the liquid to wet the blocks. A small stiff brush is used to clean the block while it is wet, and we then dry the block with a towel. We keep the alcohol in squeeze bottles, squirt some on lint-free wipes (made for clean rooms), and use the alcohol-wet wipe to clean the surface of traces of oil left by the mineral spirits. We wipe the blocks dry with another wipe and put them on an aluminum tray to thermalize. This is a fairly slow process, but we inspect each block for burrs and irregularities anyway. It might not be adequate for a very high volume measurement lab.

Ultrasonic cleaners were used in many labs, but when alcohols or other flammable solvents are used you need at least special hoods and probably some sort of permission from various safety offices. All the best solvents are now pretty much banned. Basically, all of the good cleaners are carcinogenic, environmentally destructive, illegal, or immoral.

There are some water based cleaning systems, but they are usually high volume systems that can dry the metal quickly before the water causes damage. If you use a lot of gages you might look into these.


Question: Can I visit NIST?

Answer: Of course you can visit NIST. There are general tours, but if you want to see what we really do with your gages, you should call a staff member in the area you are interested in and make an appointment for a visit. In the Dimensional Metrology Group, we give tours of our facility and are always happy to talk to dimensional metrologists about technical matters. Because of security regulations, you cannot show up without notice.


Web Site Home: Engineering Metrology Toolbox
Technical Inquiries: Daniel S. Sawyer, Group Leader
Dimensional Metrology
Sensor Science Division
Physical Measurement Laboratory
NIST, 100 Bureau Drive, Stop 8211, Gaithersburg, MD 20899-8211

The National Institute of Standards and Technology (NIST) is an agency of the U.S. Department of Commerce.

NIST Program Questions: Public Inquiries Unit
(301) 975-NIST (6478) TTY (301) 975-8295
NIST, 100 Bureau Drive, Stop 3460, Gaithersburg, MD 20899-3460
