
Dayfile.txt duplicate records

Posted: Wed 11 Jul 2012 9:05 am
by LAdrain
On my WH1081 today I noticed that at 7:50 I had 12.9mm of rain, even though it had stopped raining by then.
Cumulus was showing a red error, and when I clicked on it I got the message "Error reading dayfile.txt".
When I look at the file I see two records with the same date and time from about 7:00 to 8:00 today.

Can I delete today's records so that Cumulus will catch up when I plug the station into the PC, or do I need to remove every other record?
Should I be doing something else to correct the large rain spike?

Re: Dayfile.txt duplicate records

Posted: Wed 11 Jul 2012 9:14 am
by steve
The best way to 'rewind' and catch up from the logger data is to use a set of files from the Cumulus backup folders. Choose a set from a suitable time, and copy them to the data folder. If you (re)start Cumulus every day, then you will have a set from yesterday which should be suitable. (The latest version of 1.9.3 also takes backups at midnight, which is useful if you don't restart Cumulus very often.)
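The restore steps above amount to copying the chosen backup set over the live data files. As a rough sketch (the folder paths are hypothetical; point them at your own Cumulus install):

```python
import shutil
from pathlib import Path

def restore_backup(backup_set: Path, data_folder: Path) -> int:
    """Copy every file from a backup set over the data folder,
    overwriting the damaged copies. Returns the number of files copied."""
    count = 0
    for f in backup_set.iterdir():
        if f.is_file():
            shutil.copy2(f, data_folder / f.name)
            count += 1
    return count

# Example (hypothetical paths):
# restore_backup(Path(r"C:\Cumulus\backup\20120710"), Path(r"C:\Cumulus\data"))
```

Stop Cumulus before copying, and start it again afterwards so it re-reads the restored files and catches up from the logger.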

If you zip up the diags folder and attach it, I may be able to see what went wrong in the first place.

Edit: I don't understand what you mean by this:
When I look at the file I see two records on the same date and time from about 7:00 to 8:00 today.
Entries in the dayfile.txt file just have a date, and there's normally one per day; do you mean the July log file?

Re: Dayfile.txt duplicate records

Posted: Wed 11 Jul 2012 9:55 am
by LAdrain
Yes, I have just checked and it is Jul12log.txt that has the duplicate times; dayfile.txt has two records for today. I must have been looking at both.

It looks to have gone wrong at about 8:00.
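A quick way to confirm which timestamps are duplicated is to scan the monthly log for repeated date/time stamps. This sketch assumes the usual comma-separated layout with the date and time in the first two fields:

```python
from collections import Counter
from pathlib import Path

def find_duplicate_times(logfile: Path) -> list:
    """Return the date/time stamps that appear more than once.

    Assumes a comma-separated monthly log with the date in field 1
    and the time in field 2.
    """
    stamps = Counter()
    for line in logfile.read_text().splitlines():
        fields = line.split(",")
        if len(fields) >= 2:
            stamps[f"{fields[0]} {fields[1]}"] += 1
    return [s for s, n in stamps.items() if n > 1]
```

Running it against Jul12log.txt should list the 7:00-8:00 entries that were written twice.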

Re: Dayfile.txt duplicate records

Posted: Wed 11 Jul 2012 10:08 am
by steve
Two instances of Cumulus were started at about the same time:

11/07/2012 07:46:31.519 : Cumulus 1.9.2 Build 1032 startup

11/07/2012 07:46:31.642 : Cumulus 1.9.2 Build 1032 startup

So both were trying to write to the data files at the same time, hence the duplicate entries, and both were trying to read data from the station at the same time, hence the garbage data readings.

There's a setting on the station settings screen which causes Cumulus to attempt to detect the situation where two instances have been accidentally started; you could try using this to prevent this happening again in future.
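How Cumulus implements that check internally isn't shown here, but the general technique, refusing to start when another running instance already holds an exclusive lock, can be sketched like this (the lock-file name is hypothetical):

```python
import os
from pathlib import Path

def acquire_single_instance_lock(lock: Path) -> bool:
    """Try to create the lock file exclusively.

    Returns False if the file already exists, i.e. another instance
    (or a stale lock from a crash) is holding it. The file should be
    deleted again on clean shutdown.
    """
    try:
        fd = os.open(lock, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False
    os.write(fd, str(os.getpid()).encode())  # record the owner's PID
    os.close(fd)
    return True
```

The second process to call this sees the existing lock file and can exit instead of opening the data files for writing.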

If you copy the files from the backup which was created for the earlier of those two startups at 07:46, and then start Cumulus, all should be well.

Re: Dayfile.txt duplicate records

Posted: Wed 11 Jul 2012 10:11 am
by LAdrain
Many thanks for your help with this.