In my previous post on Digital Noise I demonstrated that a low temperature resolution produces a noisy rate-of-rise (RoR) curve. Here I hint at a simple way to get a less noisy RoR signal on systems with limited temperature resolution: just increase your sampling interval.
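To see why this works, consider the worst case: with readings quantized to 0.1°C, each per-interval temperature difference can be off by up to one quantization step, and that fixed error gets divided by the sampling interval when the RoR is formed. A back-of-the-envelope sketch in plain Python (not Artisan code; the function name is made up):

```python
# Worst-case RoR error caused purely by temperature quantization:
# the difference of two readings rounded to a step of `resolution_c`
# can be off by up to one full step. Dividing by the sampling interval
# and scaling to degrees per minute bounds the noise on the RoR.
def ror_quantization_noise(resolution_c: float, interval_s: float) -> float:
    """Upper bound of the quantization-induced RoR error in °C/min."""
    return resolution_c / interval_s * 60.0

for interval_s in (1.0, 2.0, 4.0):
    bound = ror_quantization_noise(0.1, interval_s)
    print(f"{interval_s:.0f}s interval: RoR error up to ±{bound:.1f} °C/min")
```

Going from a 2-second to a 4-second interval thus halves the worst-case quantization noise on the RoR from ±3 to ±1.5°C/min.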
2sec Interval
We start again with the profile of the previous post, sampled at a 2-second interval with a temperature resolution of only one decimal place. All smoothing algorithms have again been deactivated.
4sec Interval
Logging that same profile at a 4-second sampling interval instead already turns out a lot smoother. The same profile with the 2-second interval shown above is loaded in the background here.
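This is the expected behavior: the true temperature gain per interval doubles while the quantization error stays at ±0.1°C, so each computed slope is relatively twice as accurate. A toy simulation on a synthetic ramp (illustrative only, not the data set behind these screenshots):

```python
import numpy as np

# Synthetic "true" temperature: a steady ramp of 4.2°C/min (0.07°C/s),
# quantized to one decimal place to mimic the limited meter resolution.
t = np.arange(0.0, 120.0, 2.0)        # 2s sampling grid
quantized = np.round(150.0 + 0.07 * t, 1)

def ror(temps, interval_s):
    """Plain finite-difference rate of rise in °C/min."""
    return np.diff(temps) / interval_s * 60.0

ror_2s = ror(quantized, 2.0)          # every reading (2s interval)
ror_4s = ror(quantized[::2], 4.0)     # every other reading (4s interval)

print(f"2s: RoR jumps between {ror_2s.min():.1f} and {ror_2s.max():.1f}")  # 3.0 and 6.0
print(f"4s: RoR jumps between {ror_4s.min():.1f} and {ror_4s.max():.1f}")  # 3.0 and 4.5
```

Both series wobble around the true 4.2°C/min, but the band of the 4-second series is half as wide.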
4sec Interval + Oversampling
Turning on oversampling helps to smooth out the signal even more by averaging two readings per interval. In real recordings the effect of oversampling is larger than what can be achieved with this simulation, which is based on just one fixed data set.
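The mechanism is easy to sketch: two readings are taken per sampling tick and averaged. Because real sensor noise dithers the signal across the 0.1°C quantization steps, the average of several quantized readings can fall between steps. A toy illustration (the reader function and its numbers are made up):

```python
import random

def read_temp():
    """Stand-in for one device read: a true value of 180.05°C plus a bit
    of sensor noise, then quantized to one decimal like the meter does."""
    return round(180.05 + random.gauss(0, 0.05), 1)

def oversampled_read(read, n=2):
    """Average n quick readings per sampling tick. Uncorrelated noise
    shrinks by roughly 1/sqrt(n), and the averaged value can resolve
    levels between quantization steps (e.g. 180.05 vs. 180.0 or 180.1)."""
    return sum(read() for _ in range(n)) / n

print("single reads:   ", [read_temp() for _ in range(8)])
print("averaged pairs: ", [oversampled_read(read_temp) for _ in range(8)])
```

This also explains why the effect is understated in the simulation: replaying one fixed data set provides no fresh noise to average over.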
4sec Interval + Oversampling + Curve Smoothing
Now raising the curve smoothing from 0 to 1 improves the situation further. Note that there is still no RoR smoothing applied. The background profile has been removed here for better visibility.
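Artisan's actual filter is more elaborate, but the principle of curve smoothing can be sketched as a small moving average applied to the temperature readings themselves, before any RoR is derived from them (a minimal stand-in, not the real implementation):

```python
import numpy as np

def smooth_curve(temps, window=3):
    """Centered moving average over the temperature readings. Whatever
    RoR is computed afterwards inherits this smoothing, which is why
    the delta curve calms down even with RoR smoothing still off."""
    kernel = np.ones(window) / window
    return np.convolve(temps, kernel, mode="same")
```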
4sec Interval + Oversampling + Curve and Delta Smoothing
Turning on a little RoR smoothing (Smooth Delta = 4) results in a very usable RoR curve. Note that the Delta Span is still set to 1s, so it is effectively deactivated.
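The two delta settings act at different stages, which a rough sketch makes explicit (assumptions: temps is a NumPy array of readings; this is not Artisan's actual implementation). Delta Span widens the distance between the two readings used for each difference, while Smooth Delta averages the already-computed RoR values:

```python
import numpy as np

def delta_curve(temps, interval_s, span_s=1.0):
    """RoR in °C/min over a span of one or more sampling intervals.
    With span_s <= interval_s this is the plain per-interval difference,
    i.e. Delta Span is effectively deactivated."""
    k = max(1, int(round(span_s / interval_s)))
    return (temps[k:] - temps[:-k]) / (k * interval_s) * 60.0

def smooth_delta(ror, window=4):
    """Moving average over the RoR values themselves, applied after the
    differences are taken; a stand-in for the Smooth Delta setting."""
    kernel = np.ones(window) / window
    return np.convolve(ror, kernel, mode="same")

temps = np.round(150.0 + 0.07 * np.arange(0.0, 120.0, 4.0), 1)  # 4s readings
raw = delta_curve(temps, 4.0)           # Delta Span of 1s: deactivated
print(smooth_delta(raw, window=4)[:5])  # smoothed RoR in °C/min
```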
4sec Interval + Oversampling + Curve and More Delta Smoothing
Finally, we set Smooth Delta to 8 (Smooth Spikes still set to 1, Delta Span still deactivated). The result: a perfectly smooth RoR curve despite the limited temperature resolution.
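In terms of the sketch above this is nothing more than a wider averaging window, e.g. smooth_delta(raw, window=8) instead of window=4, so each RoR value becomes the mean of eight neighbors instead of four.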
2sec Interval + Oversampling + Curve and More Delta Smoothing
For comparison, here is the same curve at the one-decimal temperature resolution shown above, rendered with the same smoothing settings; only the sampling interval is back at the original 2 seconds.