I understood (and put in the Wiki) the following:

RainDayThreshold=-1
The threshold value which the daily rainfall has to exceed for the day to be considered a 'rain day'. Default -1 = 0.2mm or 0.01in. Value is entered in your current rain units.

But it appears that the threshold actually used is >= the value in the ini file (or the default, in my case), rather than strictly greater than?
EDIT: Or I wonder whether my 0.2mm of rain is actually > 0.2 internally, due to rounding/floating-point error? A quick check of that hypothesis is below.
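(A minimal sanity check of that idea, assuming the totals are stored as IEEE-754 doubles; this is just an illustration, not anything from the Cumulus source:)

```python
from decimal import Decimal

# The double nearest to 0.2 is actually slightly ABOVE 0.2, so a running
# rainfall total that lands on a stored "0.2" could in principle compare
# differently than the displayed value suggests.
print(Decimal(0.2))
# 0.200000000000000011102230246251565404236316680908203125
```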
EDIT2: Um, no, there's no rounding/calibration error that I can see. Cumulus recorded my rainfall on the 28th as 0.199791878461838mm, yet still counted that as a rain day.
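One hypothetical explanation that would fit both observations (a sketch only, not Cumulus's actual code, and the function name and rounding precision here are my assumptions): if the daily total is rounded to display precision before the comparison, and the comparison is >=, then 0.199791878461838mm rounds to 0.2mm and passes the default threshold even though the raw value never exceeds it.

```python
RAIN_DAY_THRESHOLD_MM = 0.2  # documented default (-1 in the ini file)

def is_rain_day(daily_total_mm: float) -> bool:
    """Hypothetical round-then-compare logic (an assumption, not the
    Cumulus source): round the total to display precision (1 decimal
    place in mm) and compare with >= rather than >."""
    return round(daily_total_mm, 1) >= RAIN_DAY_THRESHOLD_MM

print(is_rain_day(0.199791878461838))  # True: rounds to 0.2, and 0.2 >= 0.2
print(0.199791878461838 > 0.2)         # False: the raw value is below threshold
```

Only the Cumulus source would confirm whether it actually works this way, but some round-then-compare behaviour like this would explain a sub-threshold raw value still being counted as a rain day.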