Example scenario:
Pressure is dropping and is currently shown as 1030 mb. It doesn't change on the Cumulus screen until 1029 mb is reached, e.g. it is shown as 1030.0 mb, then leaps to 1029.0 mb. Yet in the box below it is reported as dropping by 0.3 mb an hour.
However, uploads to weather sites (like WOW) include the digit after the decimal point, so the pressure gets reported as, say, 1029.7 mb, then 1029.4 mb and so on.
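To illustrate what I mean (this is just a Python sketch of the rounding effect, not anything from Cumulus itself), rounding each reading to a whole millibar before display makes a steady 0.3 mb/hr fall invisible until a whole-number boundary is crossed, while the one-decimal values uploaded to WOW show every step:

```python
# Hypothetical hourly readings, falling steadily at 0.3 mb/hr
readings = [1030.0, 1029.7, 1029.4, 1029.1, 1028.8]

# What the Cumulus screen appears to do: round to whole mb, show one decimal
displayed = [f"{round(p):.1f}" for p in readings]

# What gets uploaded to WOW etc.: one decimal place kept
uploaded = [f"{p:.1f}" for p in readings]

print(displayed)  # ['1030.0', '1030.0', '1029.0', '1029.0', '1029.0']
print(uploaded)   # ['1030.0', '1029.7', '1029.4', '1029.1', '1028.8']
```

The displayed list sits flat at 1030.0 for two hours, then jumps straight to 1029.0, exactly the "leap" I described above.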
So my query is simple:
If Cumulus knows the more precise pressure when reading from the console, why does it not show it on the Cumulus screen itself?
Using whole numbers also makes Cumulus's own pressure graph look very blocky; in fact you can have a flat line for hours if the pressure doesn't change much. By contrast, the same data looks fine when graphed on WOW etc., because the small variations in pressure are present in the uploaded data that Cumulus sent to them.
So the more precise readings seem to exist, but just don't appear to be used on (or by) Cumulus itself.