Non-fluctuating resolution is an interval within which the actual value is equally likely to fall. For instance, if the reading is 225 and the resolution is 5, all we know is that the actual value lies somewhere between 222.5 and 227.5, and every value within that range is equally likely. In statistics, this is a rectangular (uniform) distribution.
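As a minimal sketch of this interval (Python; the reading and resolution values are taken from the example above):

    # Interval implied by a non-fluctuating reading (rectangular distribution).
    reading = 225.0
    resolution = 5.0
    lower = reading - resolution / 2.0   # 222.5
    upper = reading + resolution / 2.0   # 227.5
    # The actual value is equally likely to lie anywhere in [lower, upper].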
When we have a fluctuating digital display, the distribution of the data may be Gaussian, which can be characterized by a standard deviation. To convert the range of a rectangular distribution to the equivalent standard deviation of a Gaussian distribution, divide the range of the rectangular distribution by two times the square root of 3. (This is how resolution is treated in an uncertainty analysis.)
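A minimal sketch of that conversion, using the resolution from the example above:

    import math

    resolution = 5.0
    # Standard uncertainty of a rectangular distribution whose full range
    # equals the resolution: range / (2 * sqrt(3)).
    u_resolution = resolution / (2.0 * math.sqrt(3.0))  # ≈ 1.443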
Method (2) reverses this conversion: the standard deviation of a Gaussian distribution is multiplied by two times the square root of 3 to obtain the range of an equivalent rectangular distribution. This method assesses the resolution of a fluctuating display more accurately when the collected data approaches a Gaussian distribution. Method (1) is maintained for systems where recording all the data is not possible.
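A sketch of method (2), assuming a hypothetical list of collected readings:

    import math
    import statistics

    # Hypothetical fluctuating readings from a digital display.
    readings = [224.8, 225.1, 225.0, 224.9, 225.2, 225.0]
    sigma = statistics.stdev(readings)  # sample standard deviation
    # Range of the equivalent rectangular distribution.
    equivalent_resolution = 2.0 * math.sqrt(3.0) * sigma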
A 10-second period and 1000 readings were chosen based on test runs on various machines. Within 10 s, the computed resolution was usually within 5% of the maximum value found when collecting data for a full minute.
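A sketch of a collection loop under these limits, assuming a hypothetical read_display() function that returns the current display value:

    import time

    def collect_readings(read_display, max_seconds=10.0, max_count=1000):
        # Collect until either the time limit or the reading count is reached.
        readings = []
        start = time.monotonic()
        while len(readings) < max_count and time.monotonic() - start < max_seconds:
            readings.append(read_display())
        return readings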
Date Initiated: 11-19-2022
Technical Contact: Earl Ruth