Micrometer

A micrometer is a tool used to measure distances with an accuracy of one thousandth of an inch. Several types exist, varying in the type of measurement made (e.g. internal, external), in the minimum and maximum measurable distance, and in the system of measurement (e.g. metric, SAE).



A micrometer measuring 101 thousandths of an inch, with insets showing the point of measurement (left) and the readout (right). Each revolution of the spindle advances it 25 thousandths of an inch, moving the leading edge of the barrel along the measurement scale. To read the micrometer, first note the last hash mark visible on the measurement scale, then add the number of thousandths of an inch indicated on the barrel's graduations. In the pictured case, the last hash mark on the scale is at the 1, indicating 100 thousandths of an inch. The first graduation on the barrel lies along the measurement line, yielding a measurement of 100 + 1 = 101 thousandths of an inch. Josiah Carberry Collection.

A micrometer measures the distance between the rotating spindle and a fixed anvil. The thread along which the spindle rotates is precisely machined to advance the spindle an extremely small distance with each revolution. A barrel rotates with the spindle to indicate the distance between the spindle and the anvil.
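The two-step reading described above amounts to a simple sum: the value of the last visible hash mark on the measurement scale plus the graduation on the barrel that lies along the measurement line. A minimal sketch (the function name and parameter names are hypothetical, for illustration only):

```python
def micrometer_reading(scale_thousandths, barrel_graduation):
    """Combine the two readings into a total, in thousandths of an inch.

    scale_thousandths: value of the last visible hash mark on the
        measurement scale, in thousandths of an inch (e.g. the "1"
        mark indicates 100).
    barrel_graduation: the barrel graduation aligned with the
        measurement line, 0-24, since one revolution advances the
        spindle 25 thousandths of an inch.
    """
    return scale_thousandths + barrel_graduation

# The pictured case: scale at 100, barrel at 1.
print(micrometer_reading(100, 1))  # 101
```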

Measurement types

  • Internal
  • External
  • Depth