Terms in Measurement


Sensitivity of an instrument is defined as the ratio of the magnitude of the output signal to the magnitude of the input signal.

It denotes the smallest change in the measured variable to which the instrument responds.

Sensitivity has no unique unit; its unit depends on the instrument or measuring system.
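The ratio definition above can be sketched in a few lines of Python. The thermometer figure used here is purely illustrative, not taken from the text:

```python
def sensitivity(delta_output, delta_input):
    """Static sensitivity: change in output divided by change in input."""
    return delta_output / delta_input

# Hypothetical mercury thermometer: the column rises 2.5 mm for each
# 1 degree C change in temperature, so its sensitivity is 2.5 mm/degree C.
print(sensitivity(2.5, 1.0))  # → 2.5
```

Note how the unit of the result (mm/°C here) is set by the instrument, which is why sensitivity has no unique unit.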


Readability is a term frequently used with analog measurements. Readability depends on both the instrument and the observer.

Readability is defined as the closeness with which the scale of an analog instrument can be read.

It is the susceptibility of a measuring instrument to having its indications converted to a meaningful number, and it implies the ease with which observations can be made accurately.

For better readability, the resolution of the instrument scale should be as fine as possible.


Accuracy may be defined as the ability of an instrument to respond to the true value of a measured variable under the reference conditions. It refers to how closely the measured value agrees with the true value.


Precision is defined as the degree of exactness for which an instrument is designed or intended to perform. It refers to the repeatability or consistency of measurement when measurements are carried out under identical conditions at short intervals of time.

It can also be defined as the ability of an instrument to reproduce a group of readings of the same measured quantity under the same conditions.
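The distinction between accuracy and precision can be illustrated numerically: the bias of repeated readings against a reference reflects accuracy, while their spread reflects precision. The gauge-block value and readings below are invented for illustration:

```python
from statistics import mean, stdev

true_value = 10.00  # hypothetical reference, e.g. a 10.00 mm gauge block
readings = [10.11, 10.12, 10.10, 10.11, 10.12]  # illustrative repeated readings

bias = mean(readings) - true_value  # accuracy: closeness to the true value
spread = stdev(readings)            # precision: repeatability of the readings
print(f"bias = {bias:+.3f}, spread = {spread:.4f}")
```

For this data the spread is small but the bias is large: the instrument is precise yet inaccurate, which is exactly why the two terms must not be used interchangeably.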


Correction is defined as a value which is added algebraically to the uncorrected result of the measurement to compensate for an assumed systematic error.
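"Added algebraically" means the correction carries its own sign, so compensating a reading that is systematically high uses a negative correction. A minimal sketch, with an invented voltmeter offset:

```python
def corrected(reading, correction):
    """The correction is added algebraically to the uncorrected reading."""
    return reading + correction

# Hypothetical: a voltmeter is known to read 0.05 V high, so the
# correction is -0.05 V.
print(corrected(12.30, -0.05))
```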


Calibration is the process of determining and adjusting an instrument's accuracy to make sure it is within the manufacturer's specifications.

It is the process of determining the values of the quantity being measured corresponding to a pre-established arbitrary scale. It is the measurement of the measuring instrument itself. The quantity to be measured is the 'input' to the measuring instrument.

The 'input' affects some 'parameter', which is the 'output' and is read out. The amount of 'output' is governed by that of 'input'. Before we can read any instrument, a 'scale' must be framed for the 'output' by successive application of some already standardized input signals. This process is known as 'calibration'.
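The process of framing a scale from standardized inputs can be sketched as fitting a response curve. This example assumes an ideally linear instrument and invented standard values; real calibration procedures and data would of course differ:

```python
# Apply known standard inputs, record the instrument's indications,
# fit a linear scale (output = a*input + b), then invert the scale
# to convert future indications back to measured quantities.

standard_inputs = [0.0, 10.0, 20.0, 30.0]   # known applied quantities
observed_outputs = [1.0, 21.0, 41.0, 61.0]  # instrument indications (illustrative)

n = len(standard_inputs)
mx = sum(standard_inputs) / n
my = sum(observed_outputs) / n
# Least-squares slope and intercept of the fitted scale.
a = sum((x - mx) * (y - my) for x, y in zip(standard_inputs, observed_outputs)) \
    / sum((x - mx) ** 2 for x in standard_inputs)
b = my - a * mx

def measure(output):
    """Convert an indication to the measured quantity via the fitted scale."""
    return (output - b) / a

print(measure(31.0))  # → 15.0 for this ideal linear data
```

Here the standardized inputs play the role of the 'already standardized signals' in the definition: they fix the scale once, and every later reading is interpreted through it.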


A part which can be substituted for a component manufactured to the same shape and dimensions is known as an interchangeable part. The operation of substituting the part for similarly manufactured components of the same shape and dimensions is known as interchangeability.

Constant of a measuring instrument: The factor by which the indication of the instrument shall be multiplied to obtain the result of measurement.

Nominal value of a physical measure: The value of the quantity reproduced by the physical measure, as indicated on that measure.

Conventional true value of a physical measure: The value of the quantity reproduced by the physical measure, determined by a measurement carried out with the help of measuring instruments whose total error is practically negligible.

Standard: It is the physical embodiment of a unit. For every kind of quantity to be measured, there should be a unit to express the result of the measurement and a standard to enable the measurement.
