MTS load cell resolution, accuracy, precision, and sensitivity.

QUESTIONS


What is the resolution of my load cell?

What is the lowest force my load cell can accurately detect?

What is the difference between sensitivity and resolution?


 

ANSWER

 

RESOLUTION

 

MTS load cell resolution is a measured value.

 

*ASTM requirements are generally based on ½ the noise or one digit, whichever is greater.

*Resolution shall not exceed ½ the tolerance specification.

*Resolution is required in ACS calibrations.

*Resolution is assessed at the lowest non-zero calibration point.

*Data cannot be recorded below 200 × resolution (ASTM E4).

 

As an example, if a 50 kN load cell is calibrated at 2% of full scale (1 kN) and 0.008 kN of peak-to-peak noise is measured (Figures 1 & 2), the resolution would be ½ × 0.008 kN = 0.004 kN.

 

The minimum acceptable data point based on the measured resolution would be 0.004 kN × 200 = 0.800 kN.

 

As a general rule, the worst-case acceptable data point should not exceed 2% of full scale; in this case, 0.02 × 50 kN = 1.000 kN. A value higher than 2% of full scale would suggest a problem.
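For anyone who wants to script this check, here is a minimal sketch of the arithmetic above. The variable names and the warning logic are illustrative, not part of any MTS tool; the values come from the 50 kN example.

```python
# Minimal sketch of the resolution checks described above.

capacity_kN = 50.0   # load cell full scale
noise_kN = 0.008     # measured peak-to-peak noise (Figures 1 & 2)

resolution_kN = noise_kN / 2              # 1/2 the measured noise
min_data_point_kN = 200 * resolution_kN   # ASTM E4: no data below 200 x resolution
limit_kN = 0.02 * capacity_kN             # rule of thumb: 2% of full scale

print(f"Resolution:         {resolution_kN:.3f} kN")      # 0.004 kN
print(f"Minimum data point: {min_data_point_kN:.3f} kN")  # 0.800 kN
print(f"2% of full scale:   {limit_kN:.3f} kN")           # 1.000 kN

if min_data_point_kN > limit_kN:
    print("Warning: minimum data point exceeds 2% of full scale; check for excess noise.")
```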

 

 

Figure 1 – Screenshot of the noise at a static load of 2% of full scale


 

Figure 2 – Meter used to measure the peak-to-peak noise

 

 

PRECISION

 

Precision is a nebulous term. It is related to accuracy and involves all the components that go into the accuracy of the load cell, along with its calibration. It is more useful to compare the calibrated output against the nominal load standard output to determine the error between the two. Other factors contribute as well (filtering, noise, dynamics, temperature change, linearity, hysteresis, etc.), but there is no single number for precision.

 

 

 

CALIBRATION ACCURACY AND CLASSIFICATION

 

When load cells are calibrated, they are measured against a standard.  The accuracy is how far the measurement is from the standard load cell’s reading. 

 

When MTS does calibrations, the error can be no more than 1% of the reading. This means that for a 250 N load cell, the acceptable error could be as much as ±2.5 N (0.56 lbf) at full scale, but at 2% of full scale (5 N) it would be ±0.05 N (0.01 lbf).
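As a rough illustration (the function name is ours; this is not MTS calibration software), the tolerance at any reading is simply 1% of that reading:

```python
# Sketch of the 1%-of-reading tolerance, using the 250 N example above.

def allowed_error_N(reading_N: float, tolerance: float = 0.01) -> float:
    """Allowed +/- error at a given reading: 1% of the reading."""
    return tolerance * reading_N

full_scale_N = 250.0
print(allowed_error_N(full_scale_N))         # 2.5  -> +/- 2.5 N at full scale
print(allowed_error_N(0.02 * full_scale_N))  # 0.05 -> +/- 0.05 N at 2% of full scale
```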

 

 

This is why resolution is measured during calibration: to ensure measurements fall within the 1% accuracy tolerance even at the lowest calibration point. If the measured resolution were 0.009 N, for example, the error at 2% of full scale would still be no more than 1% of the reading (0.05 N), since the noise is several decimal places too small to have an effect.

 

When we do calibrations, we typically do them to 1% accuracy. If noise prevents calibrating to this accuracy, because the resolution is worse than 1% of the reading at the lowest calibration point, the load cell must either be reclassified or the lowest calibration point must be raised. Our procedure is to raise the lowest calibration point, which reduces the usable range of the load cell, as sketched below.
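Here is an illustrative helper for that procedure, under the assumption that the resolution may not exceed 1% of the reading at the lowest point. The helper name and example values are ours, not from an MTS procedure.

```python
# Raise the lowest calibration point until the measured resolution
# is no more than 1% of the reading (assumption-based sketch).

def lowest_usable_point_N(resolution_N: float, proposed_point_N: float) -> float:
    """Smallest calibration point at which resolution <= 1% of the reading."""
    required_N = resolution_N / 0.01  # reading at which resolution equals 1%
    return max(proposed_point_N, required_N)

# Example: a 0.08 N resolution forces the lowest point up from 5 N to 8 N,
# shrinking the usable range of the load cell.
print(lowest_usable_point_N(resolution_N=0.08, proposed_point_N=5.0))  # 8.0
```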


SENSITIVITY

 

MTS standard load cells have a sensitivity of 2 mV/V.

 

Our software/hardware use a ±10 V min/max range.

 

To achieve full-scale voltage output, the ideal gain is 500; therefore, at full scale…

 

0.002 V/V × 500 (gain) × 10 V (excitation) = 10 V

 

Note:  Some smaller MTS load cells have a sensitivity of 1 mV/V.  
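Putting the gain arithmetic in code (a sketch only; the 10 V excitation is implied by the equation above, and the function name is ours):

```python
# Gain needed so that full-scale bridge output reaches the +/-10 V range.

def required_gain(sensitivity_V_per_V: float,
                  excitation_V: float = 10.0,
                  full_scale_out_V: float = 10.0) -> float:
    """Gain = desired full-scale output / (sensitivity * excitation)."""
    return full_scale_out_V / (sensitivity_V_per_V * excitation_V)

print(required_gain(0.002))  # 500.0  for a standard 2 mV/V load cell
print(required_gain(0.001))  # 1000.0 for a smaller 1 mV/V load cell
```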

 
