What are the standards used for calibration? Mention at least three primary standards in the temperature range 10 K to 1000 K. Define sensitivity. How does it change in a linearly varying measurement system versus an exponentially varying one?
Instrumentation (Mechanical Engineering Course) Question
Solution
Calibration in measurement technology and metrology is the comparison of measurement values delivered by a device under test with those of a calibration standard of known accuracy. Such a standard could be another measurement device of known accuracy, a device generating the quantity to be measured such as a voltage, or a physical artefact, such as a metre ruler.
The outcome of the comparison can be that no significant error is noted on the device under test, that a significant error is noted but no adjustment is made, or that an adjustment is made to correct the error to an acceptable level. Strictly, the term calibration refers only to the act of comparison and does not include any subsequent adjustment. The calibration standard is normally traceable to a national standard held by a National Metrology Institute. In the temperature range 10 K to 1000 K, primary standards are provided by the defining fixed points of the International Temperature Scale of 1990 (ITS-90); three examples are the triple point of equilibrium hydrogen (13.8033 K), the triple point of water (273.16 K), and the freezing point of zinc (692.677 K).
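The core idea of calibration, comparing readings from a device under test against a standard of known accuracy and judging the errors against a tolerance, can be illustrated with a minimal sketch. The readings and the tolerance below are assumed example values, not data from any real calibration.

```python
# Illustrative sketch only: hypothetical readings and an assumed tolerance.
# Calibration = comparing a device under test (DUT) against a reference standard.

# Values realised by the standard, and the DUT's readings at the same points (assumed data)
reference = [100.0, 200.0, 300.0, 400.0, 500.0]   # e.g. temperature in K from the standard
dut       = [100.4, 200.9, 301.5, 402.1, 502.8]   # readings from the device under test

tolerance = 1.0  # assumed acceptable error for this example

for ref, meas in zip(reference, dut):
    error = meas - ref
    status = "OK" if abs(error) <= tolerance else "error noted (adjust or correct)"
    print(f"standard = {ref:6.1f}, DUT = {meas:6.1f}, error = {error:+.2f}  -> {status}")
```

The comparison itself is the calibration; whether the out-of-tolerance points then lead to an adjustment is a separate decision.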
Sensitivity: the ratio of the change in output to the change in input of a measurement system; for input x and output y, the static sensitivity is S = dy/dx.
For a linearly varying system, y = m x + c, the sensitivity is constant at every operating point, because dy/dx equals the slope m.
For an exponentially varying system the sensitivity changes with the input. For a saturating response such as y = A(1 - e^(-x/tau)), the sensitivity S = (A/tau) e^(-x/tau) decreases steadily and approaches zero once the output saturates and no longer changes, as the numerical sketch below illustrates.
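The following sketch evaluates the sensitivity numerically for a linear and a saturating exponential response; the coefficients (m, c, A, tau) are assumed values chosen only for illustration.

```python
# Minimal sketch (assumed coefficients) comparing sensitivity S = dy/dx for a
# linear response y = m*x + c and a saturating exponential y = A*(1 - exp(-x/tau)).
import math

m, c = 2.0, 1.0      # assumed slope and offset of the linear instrument
A, tau = 10.0, 3.0   # assumed span and exponential constant of the saturating instrument

def sensitivity(f, x, h=1e-6):
    """Numerical sensitivity: change in output per unit change in input (central difference)."""
    return (f(x + h) - f(x - h)) / (2 * h)

linear = lambda x: m * x + c
exponential = lambda x: A * (1 - math.exp(-x / tau))

for x in [0.0, 2.0, 5.0, 10.0, 20.0]:
    print(f"x = {x:5.1f}:  S_linear = {sensitivity(linear, x):.3f}   "
          f"S_exp = {sensitivity(exponential, x):.3f}")
```

Running this shows the linear sensitivity staying fixed at the slope m = 2 for every input, while the exponential sensitivity starts at A/tau and decays toward zero as the output saturates.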
