Welcome to the Cumulus Support forum.

Latest Cumulus MX V3 release 3.28.6 (build 3283) - 21 March 2024

Cumulus MX V4 beta test release 4.0.0 (build 4017) - 17 March 2024

Legacy Cumulus 1 release v1.9.4 (build 1099) - 28 November 2014 (a patch is available for 1.9.4 build 1099 that extends the date range of drop-down menus to 2030)

Download the Software (Cumulus MX / Cumulus 1 and other related items) from the Wiki

Wanting to average hourly, weekly and monthly temps.

Discussion and questions about Cumulus weather station software version 1. This section is the main place to get help with the Cumulus 1 software developed by Steve Loft, which ceased development in November 2014.
lardconcepts
Posts: 35
Joined: Sat 26 Nov 2011 10:11 pm
Weather Station: Maplin WH1801
Operating System: Win 8.1 Pro
Location: Mid-wales

Wanting to average hourly, weekly and monthly temps.

Post by lardconcepts »

I want to make more use of my lovely data from Cumulus, and I also want to make some mashups with things like weekly electricity meter readings.

Here's what I know:

I can make some lovely graphs very easily with Google Spreadsheets, but I'm limited to 400,000 cells across all sheets. Plus, when pulling data to use in the Google Visualisation API, things start to slow down when there are a lot of rows.

I can make some faster, more interactive graphs with Google Fusion Tables, but I'm limited to 100 MB per table and a total of 250 MB of Fusion Table data for my whole account.

Additionally, despite multiple requests, Fusion Tables refuses to support the popular dd/mm/yyyy format that Cumulus outputs; it wants mm/dd/yyyy hh:mm A format (ref), which means both combining and manipulating columns A and B in Google Spreadsheets before passing the data to Fusion Tables.
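(That reshuffle could also be scripted outside the spreadsheet entirely. A minimal Python sketch — the function name is my own, and I'm assuming the date and time arrive as two separate fields, like columns A and B in the log:)

```python
from datetime import datetime

def to_fusion_timestamp(date_str, time_str):
    """Combine separate Cumulus date (dd/mm/yyyy) and time (HH:MM) fields
    into the mm/dd/yyyy hh:mm AM/PM form Fusion Tables wants."""
    dt = datetime.strptime(date_str + " " + time_str, "%d/%m/%Y %H:%M")
    return dt.strftime("%m/%d/%Y %I:%M %p")

print(to_fusion_timestamp("21/03/2013", "14:05"))  # 03/21/2013 02:05 PM
```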

That bit can be done, but we're still left with overall data sets too large for either application, and for my needs a resolution of 1 to 5 minutes is way too fine; hourly is perfect.

I'd like to break it down into hourly averages (daily averages are taken care of by the dayfile), and then weekly averages, which can be made from the dayfile.
Just to throw another thing into the mix, the number of readings per hour isn't known until we actually parse the file.
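(The unknown-readings-per-hour problem goes away if you just sum and count into per-hour buckets while parsing. A hedged Python sketch — the column positions and comma separator are assumptions, so check them against your own monthly log layout:)

```python
import csv
from collections import defaultdict
from datetime import datetime

def hourly_averages(path, temp_col=2):
    """Average one monthly Cumulus log down to hourly resolution.
    Assumes field 0 is the date (dd/mm/yy), field 1 the time (HH:MM),
    and temp_col the temperature -- adjust to match your log layout.
    The number of readings per hour doesn't matter: sum and count."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.reader(f):
            stamp = datetime.strptime(row[0] + " " + row[1], "%d/%m/%y %H:%M")
            hour = stamp.replace(minute=0)  # bucket key: top of the hour
            sums[hour] += float(row[temp_col])
            counts[hour] += 1
    return {hour: sums[hour] / counts[hour] for hour in sums}
```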

My latest month's csv data is uploaded from Cumulus to a bit of webspace every day, and currently Spreadsheets automatically loads the data in.

Both spreadsheets and fusion tables have an API and an SQL-like query language:

Google Spreadsheets API - Query language
Google Fusion Tables API

Google also has [url=https://developers.google.com/appengine/]App Engine[/url], a cloud-based Python service, as well as Google Apps Script, "a JavaScript cloud scripting language that provides easy ways to automate tasks across Google products".

Here's my idea so far:

Each monthly file falls within the limits of a Google Spreadsheet, and follows the format Dec12log.txt, Jan13log.txt etc.
But combined, they are way too big even for a Fusion Table, and the data can be averaged to hours to make it more manageable.

So, I find an appropriate API, automatically make multiple spreadsheets and load each month into each spreadsheet.
Then I use another spreadsheet's QUERY and AVG functions to pull the hourly averaged data into separate sheets, and finally pull all of the averaged data from all those sheets into a Fusion Table.
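(For the dayfile-to-weekly step, the same bucketing idea works, grouping by ISO week. A sketch under assumptions: that the dayfile is comma-separated with the date as dd/mm/yy in the first field, and that you pass in whichever column index holds the daily average temperature in your Cumulus version:)

```python
import csv
from collections import defaultdict
from datetime import datetime

def weekly_averages(dayfile_path, avg_temp_col):
    """Group dayfile rows by ISO (year, week) and average one column.
    avg_temp_col is whichever dayfile field holds the daily average
    temperature -- check it against your Cumulus version's dayfile layout."""
    buckets = defaultdict(list)
    with open(dayfile_path, newline="") as f:
        for row in csv.reader(f):
            day = datetime.strptime(row[0], "%d/%m/%y")
            year, week, _ = day.isocalendar()
            buckets[(year, week)].append(float(row[avg_temp_col]))
    return {wk: sum(vals) / len(vals) for wk, vals in sorted(buckets.items())}
```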

Simples, huh? Not for me it isn't - I can't even get past the first hurdle of working out how to get the SQL AVG (average) function to do what I want!
I've revisited it from time to time over the last couple of months and have started to come to the conclusions that:

A: There has to be a simpler, better way.
B: I'm a bit thick!

Whatever way it ends up being, it has to be reasonably open and able to be used by other Cumulus users in the same situation.

Any ideas?!?