Welcome to the Cumulus Support forum.
Latest Cumulus MX V4 release 4.4.2 (build 4085) - 12 March 2025
Latest Cumulus MX V3 release 3.28.6 (build 3283) - 21 March 2024
Legacy Cumulus 1 release 1.9.4 (build 1099) - 28 November 2014
(a patch is available for 1.9.4 build 1099 that extends the date range of drop-down menus to 2030)
Download the Software (Cumulus MX / Cumulus 1 and other related items) from the Wiki
If you are posting a new Topic about an error or if you need help PLEASE read this first viewtopic.php?p=164080#p164080
I/O Error 103 when writing to log
-
LillypillyWeather
- Posts: 14
- Joined: Tue 31 Jan 2012 8:45 am
- Weather Station: Digitech xc-0348
- Operating System: Windows Server 2008 R2
- Location: Kallangur, Queensland, Australia
- Contact:
I/O Error 103 when writing to log
I am getting a constant I/O error when Cumulus is attempting to write to a log file.
23/05/2012 5:00:00 AM Error closing data file: C:\Cumulus\data\May12log.txt. I/O error 103
Please help. I have renamed the old file "may12logerror.txt" and allowed Cumulus to create a new one, but I am still getting the same error.
Cheers
Greg
- steve
- Cumulus Author
- Posts: 26672
- Joined: Mon 02 Jun 2008 6:49 pm
- Weather Station: None
- Operating System: None
- Location: Vienne, France
- Contact:
Re: I/O Error 103 when writing to log
It's not a problem with the file itself. Something else on your system has the file open and is preventing Cumulus writing to it. The usual suspects are anti-virus programs etc.
Steve
-
Lynwood
- Posts: 6
- Joined: Sun 19 Dec 2010 8:34 pm
- Weather Station: Maplin N96FY
- Operating System: Windows 7 Pro
- Location: Vale of Glamorgan
Re: I/O Error 103 when writing to log
Steve
I've started getting the following (from yesterday). I have BT Net Protect Plus / McAfee and have been running this protection since I've had BT Broadband, but have only now started getting this error:
14/12/2012 18:34:00 : 14/12/2012 18:34:00 Error writing data to data file: C:\Cumulus\data\Dec12log.txt. I/O error 32
14/12/2012 18:34:00 : 14/12/2012 18:34:00 Error writing EOL to data file: C:\Cumulus\data\Dec12log.txt. I/O error 103
14/12/2012 18:34:00 : 14/12/2012 18:34:00 Error closing data file: C:\Cumulus\data\Dec12log.txt. I/O error 103
- steve
- Cumulus Author
- Posts: 26672
- Joined: Mon 02 Jun 2008 6:49 pm
- Weather Station: None
- Operating System: None
- Location: Vienne, France
- Contact:
Re: I/O Error 103 when writing to log
Something on your system has (or had) the file open, preventing Cumulus writing to it. Try excluding the Cumulus folder from your anti-virus software. You may also need to restart Cumulus, if it's still reporting the error.
Steve
-
Lynwood
- Posts: 6
- Joined: Sun 19 Dec 2010 8:34 pm
- Weather Station: Maplin N96FY
- Operating System: Windows 7 Pro
- Location: Vale of Glamorgan
Re: I/O Error 103 when writing to log
Hi Steve
Solved - new network drive software was backing up the file on every write, so I've excluded the data folder and that has fixed the problem.
-
MikeLempriere
- Posts: 14
- Joined: Wed 27 Jun 2012 3:41 pm
- Weather Station: Davis Vantage Vue
- Operating System: Win XP SP3
- Location: Bainbridge Island, WA
Re: I/O Error 103 when writing to log
I've been seeing this same error occasionally for about a year now. It usually happens coincident with the system backup, so I'm betting it's a sharing issue. This is a minor ongoing annoyance, as my server emails me whenever an error appears in the Cumulus FTP data on the server, and it happens several times a week.
The solution above in this thread says to exclude the Cumulus directory - this won't work for me as I do want my weather data to be backed up.
What I'd like to know is, is this a real error, or just a warning? The error says "closing" the file - sounds to me like Cumulus has already successfully written the data. From other web reading, this error indicates a sharing issue. Is Cumulus unable to write to the file after this error, or is it succeeding, then giving an FYI error?
If it is simply an FYI error:
a] Who cares if somebody else is reading the file at the same time we're writing it? That's THEIR problem - they need to handle a possibly incomplete read; Cumulus cannot do anything differently. If my backup gets an incomplete file, no matter, there's another backup tomorrow - this happens on hundreds of files every night, that's life. I could choose to fix my backup to retry its read, but again, Cumulus has done its best; it's the backup that would have to change. In this case, I'd suggest Cumulus simply not report the error at all.
b] If indeed it is just a close error, then Cumulus should retry the file close until it succeeds. (See discussion in e] below).
c] Without knowing the actual code, I can only suggest: perhaps don't do open() and close() pairs at all, but instead reuse the same file handle ad infinitum, doing flush() calls to ensure the data is forced to disk in a timely fashion. This would remove the problem entirely, make the code more efficient, and use fewer system resources. (However, I do see the utility in having it closed when not being actively written - never mind, this isn't such a good idea...)
If on the other hand, the error means that we have been prevented from writing, well, we obviously do need to succeed:
d] If we simply skip this write, what are the consequences? Will we be retrying again with the next station read in 1 second? If so, don't even bother reporting the error the first time - who cares about one data point being delayed for 1 second? (As long as it does get there soon.) Sure, if we continue to get the error repeatedly, that's a problem the operator should look into, as data is being lost; Cumulus must tell them about it if it continues to fail.
e] However, if this is only written on each log pass (probably the case), perhaps every 5 minutes, that gives us plenty of time to retry. I'd suggest simply looping on this error, retrying once per second for a few tries, and only then, if we've consistently failed, reporting the error. It couldn't hurt to make this a user setting, something like "AggressiveReportingOfErrors=T/F" or "LogWriteRetries=0/9".
The fundamental issue here is NOT whether the software encountered an error. The issue is: does the operator need to be made aware? If the software can handle it on its own, it should do so and be done with it. If the problem requires operator intervention, only then should a message be generated.
Believe me - I'm a computer professional, and my greatest gripe about software is when it fails to give you enough information to handle the problem - more is almost always better. However, my second biggest gripe is when software takes up your time with information that is of no use.
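The retry scheme sketched in b] and e] might look something like this (a minimal sketch in Python purely for illustration - Cumulus itself is not written in Python, and the retry count and delay here just echo the hypothetical "LogWriteRetries=0/9" setting from the post; they are not real Cumulus options):

```python
import time

def append_with_retry(path, line, retries=9, delay=1.0):
    """Try to append one log line, retrying quietly on a sharing or
    permission error instead of reporting it immediately.  Only if
    every attempt fails is the error surfaced to the operator."""
    last_error = None
    for attempt in range(1 + retries):
        try:
            with open(path, "a") as f:
                f.write(line + "\n")
            return True                  # wrote and closed cleanly
        except OSError as e:             # e.g. file held open by backup/AV
            last_error = e
            time.sleep(delay)            # wait a moment, then try again
    # All retries exhausted - now the operator does need to know.
    print(f"Error writing data file {path}: {last_error}")
    return False
```

On the happy path this behaves exactly like a plain open/write/close, so the retry logic only costs anything when the file is actually locked.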
- steve
- Cumulus Author
- Posts: 26672
- Joined: Mon 02 Jun 2008 6:49 pm
- Weather Station: None
- Operating System: None
- Location: Vienne, France
- Contact:
Re: I/O Error 103 when writing to log
It's a real error - data hasn't been written to the log file, and that data is therefore lost.
Thank you for your suggestions for improving Cumulus. There is a section for enhancement requests; there are links near the top of the forum page. If you would like to create a request, I can consider it along with the hundreds of other requests I have received. There may be a request for this already.
Most of my time for the last year or so has been taken up by a new version of Cumulus, which is now in beta test. Once this version is reasonably fit for use and roughly equivalent in functionality to the 'old' version of Cumulus, I can start looking at the enhancement requests with a view to implementing some of them. This is likely to be quite some time off. The beta testing has taken up literally all of my spare time over the last six weeks, including two weeks holiday, and when it is finally over, I will be taking things easy for a while!
Steve
-
MikeLempriere
- Posts: 14
- Joined: Wed 27 Jun 2012 3:41 pm
- Weather Station: Davis Vantage Vue
- Operating System: Win XP SP3
- Location: Bainbridge Island, WA
Re: I/O Error 103 when writing to log
One more question please?
I have several other programs running on the same box that do ongoing logging and do not seem to encounter this error, yet they're read every night, at the same time, by the same backup that seems to be causing this issue. Is Cumulus doing something unusual, perhaps demanding a file lock?
There's no antivirus software on this machine, it's very old and just doesn't have the horsepower to handle it. I removed AVG Free as part of diagnosing this problem. If I get things working cleanly, I will try to reinstate it, but at the moment, we know that's not a factor.
As far as locking goes, it's OK if the backup reads a partial file - in fact, the program I'm using (GnuTar) will give a warning that "the file has changed while reading" in case I wish to worry about it (though I choose to ignore it). Not getting the last entry is fine: if the system ever needed to be rebuilt from the backup, that last datapoint is unimportant, as it would be followed by many more missing datapoints, perhaps many hours' worth, because the system crashed and nobody was there to log the data. And if it doesn't crash today, we'll get the data in tomorrow's backup.
My guess is that Cumulus is trying to lock and write, but GnuTar is opening the file for a "long" time, so Cumulus cannot get access in a reasonable amount of time. If this is right, I'd suggest that Cumulus merely not bother locking.
Thanks for your consideration... (I'll be happy to move this to the new features thread when we're done discussing...)
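Mike's point - that the writer and the backup can share the file if neither demands an exclusive lock - can be illustrated with a short sketch (Python for illustration only; Python's standard file open typically permits shared access on both Windows and Unix, unlike the runtime routines Steve describes below). It also combines suggestion c] above: keep one handle open and flush each entry rather than doing open/write/close cycles:

```python
import os
import tempfile

# Writer keeps one handle open for the whole session (suggestion c])
# and flushes after each entry instead of open/write/close cycles.
log_path = os.path.join(tempfile.mkdtemp(), "Dec12log.txt")
log = open(log_path, "a")

def write_entry(line):
    log.write(line + "\n")
    log.flush()              # push the entry out without closing the file

write_entry("18:34:00,12.3,987.6,45")

# Meanwhile the "backup" opens the same file for reading.  Neither side
# requests an exclusive lock, so neither the write nor the read fails.
with open(log_path) as backup:
    snapshot = backup.read()  # possibly a partial file; that's acceptable

log.close()
```

The worst case for the reader is a snapshot missing the last entry - which, as argued above, the next backup run picks up anyway.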
- steve
- Cumulus Author
- Posts: 26672
- Joined: Mon 02 Jun 2008 6:49 pm
- Weather Station: None
- Operating System: None
- Location: Vienne, France
- Contact:
Re: I/O Error 103 when writing to log
Yes, unfortunately the system routines I use insist on exclusive access. I believe this is not the case for the ones in Cumulus MX (the new version of Cumulus), but in any case I have much more control over that in the new code and can make sure when I have a chance to look into it. So this issue should not arise.
Steve
-
MikeLempriere
- Posts: 14
- Joined: Wed 27 Jun 2012 3:41 pm
- Weather Station: Davis Vantage Vue
- Operating System: Win XP SP3
- Location: Bainbridge Island, WA
Re: I/O Error 103 when writing to log
Ok, I'll just let it drop then, thanks for your consideration...
Keep up the great work Steve!
-
MikeLempriere
- Posts: 14
- Joined: Wed 27 Jun 2012 3:41 pm
- Weather Station: Davis Vantage Vue
- Operating System: Win XP SP3
- Location: Bainbridge Island, WA
Re: I/O Error 103 when writing to log
BTW: Got rid of this problem by replacing the caveman-playing-with-fire computer with a computer that's only really, really old. I guess there's simply enough horsepower now for Cumulus to complete its open-write-close without getting swapped out and conflicting with the backup. This neither disproves nor verifies my assumption that it's caused by the backup, though it leans toward the latter.