
Tipping bucket gauge inaccuracy

Posted: Mon 13 Oct 2014 11:11 am
by Adrian Hudson
Many factors come into play in the accuracy of tipping bucket rain gauges: dirt in the bucket, evaporation, poor basic calibration, poor mounting of the gauge (allowing vibration in wind, etc.). The greatest source of inaccuracy in an otherwise well mounted, calibrated and maintained gauge is rain rate. It is well known that tipping bucket rain gauges indicate different rain volumes depending on the rain rate, partly because larger drops at high rates cause ripples in the bucket and partly because extra water "piles up" before the bucket tips.

My Davis VP2 is VERY sensitive to rain rate. In fact, even though it's probably the best amateur gauge on the market, it is really little more than a rough indicator of rainfall. I have it calibrated to within 1-2% of a standard 5 inch copper gauge at rainfall rates of around 5-10mm/hour, but at low rates (1mm/hr or less) it reads up to 20% low, and at higher rates it reads 10-15% high. This is the opposite of the conventional wisdom, which says that tipping bucket gauges underestimate the rain at high rates and vice versa. However, that is not the point of this post.

If one were able to characterize a particular tipping bucket gauge against a known standard, a graph could be produced and used to correct the readings from the gauge, as long as the rain rate were known. Obviously this would be nearly impossible to do by hand, as the observer would have to make a different correction from minute to minute depending on the observed rainfall rate...

...but a computer could do it.
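Just to make the idea concrete, here is a rough sketch (in Python, not actual Cumulus code) of what a graph-based correction might look like. The curve and the function names are invented, loosely mirroring the behaviour of my VP2 described above; a real curve would come from a long comparison against a reference gauge.

# Invented characterisation curve: rain rate (mm/h) -> fraction of the
# true rainfall the gauge actually records.
CURVE = [
    (1.0, 0.80),   # reads ~20% low at very low rates
    (5.0, 1.00),   # spot-on where it was calibrated
    (10.0, 1.10),  # reads ~10% high
    (15.0, 1.15),  # reads ~15% high
]

def catch_fraction(rate_mm_h):
    """Linearly interpolate the catch fraction at the observed rain rate."""
    if rate_mm_h <= CURVE[0][0]:
        return CURVE[0][1]
    if rate_mm_h >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (r0, f0), (r1, f1) in zip(CURVE, CURVE[1:]):
        if r0 <= rate_mm_h <= r1:
            return f0 + (f1 - f0) * (rate_mm_h - r0) / (r1 - r0)

def corrected_tip(tip_mm, rate_mm_h):
    """Amount to record for one tip, scaled by the inverse catch fraction."""
    return tip_mm / catch_fraction(rate_mm_h)

print(corrected_tip(0.2, 1.0))    # ~0.25 mm (the gauge was reading low)
print(corrected_tip(0.2, 12.0))   # ~0.18 mm (the gauge was reading high)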

If the error were assumed to be linear (it isn't, but it isn't too far off except at unusually high rates), then a factor (a multiplier) could be applied to the standard tip to account for the rain actually collected.

(Note: this is not the same as the current rain multiplier in Cumulus, which is used to calibrate a gauge without physically twiddling the screws and which applies a flat increase or decrease to the amount recorded.)

So, for example, assume a gauge that is calibrated to be as accurate as possible at very low rates and which records a progressively larger undercatch at higher rates, say something like this:

Rate      Catch
5mm/h     100%
10mm/h    95%
15mm/h    90%

(figures just plucked out of the air)

A factor could be calculated from the observed rate, roughly 1 + 0.01 * (rate - 5) for the figures above, and used to adjust the rain recorded, so that for example at a rate of 15mm/h a single tip (with a gauge set to 0.2mm/tip) would be assumed to actually be about 0.22mm and recorded as such. (I hope my maths is right, but if not it is the principle that matters!!)

The 0.01 in the previous sentence (and the 5mm/h reference rate) could be included as additional calibration factors in the program.
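To match the worked example, the linear version is barely any code at all. The constants below are the invented calibration values from the table, and the names are mine, just for illustration:

# Sketch of the rate-dependent tip correction described above, using the
# invented calibration of 1% extra undercatch per mm/h above 5 mm/h.
# SLOPE and REF_RATE would be user-settable calibration values, much like
# the existing Cumulus rain multiplier, but applied per tip.

TIP_MM = 0.2        # nominal bucket size
REF_RATE = 5.0      # mm/h at which the gauge is assumed spot-on
SLOPE = 0.01        # fractional undercatch per mm/h above REF_RATE (made up)

def rate_factor(rate_mm_h):
    """Multiplier applied to each tip, given the current rain rate."""
    return 1.0 + SLOPE * max(0.0, rate_mm_h - REF_RATE)

def record_tip(rate_mm_h):
    """Rain amount to log for one tip at the observed rain rate."""
    return TIP_MM * rate_factor(rate_mm_h)

print(record_tip(5.0))    # 0.2 mm  (no correction at the reference rate)
print(record_tip(15.0))   # ~0.22 mm (10% correction at 15 mm/h)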

What do people think?

The only snag I can think of is that the rainfall rate is calculated from tips per unit time, which itself already contains the rate-dependent error, so it's a bit incestuous, but this effect is small and shouldn't affect things much.
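A rough check with the same invented figures suggests the residual error from using the indicated rate rather than the true rate is only around 1%, compared with the 10% being corrected:

SLOPE, REF_RATE = 0.01, 5.0   # invented calibration from the example above

def factor(rate_mm_h):
    """Tip multiplier computed from whatever rate estimate is available."""
    return 1.0 + SLOPE * max(0.0, rate_mm_h - REF_RATE)

true_rate = 15.0                    # actual rain rate, mm/h
indicated_rate = true_rate * 0.90   # rate derived from tips: 10% low per the table

print(factor(true_rate))        # 1.100 -> the correction we ideally want
print(factor(indicated_rate))   # 1.085 -> the correction actually applied
# Using the indicated rate leaves a residual error of roughly 1.4%, versus the
# 10% being corrected; recomputing the factor once from the corrected rate
# (a single fixed-point step) would shrink it further.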