
My experience: I moved to a rPi

Posted: Wed 06 May 2015 7:39 pm
by mcrossley
Well, I've taken the plunge and moved full-time to CumulusMX running on a Raspberry Pi Model B. The rPi wasn't powerful enough to run CumulusMX, Apache, PHP, and MySQL for my self-hosted web site, so I moved to a web hosting solution (care of Steve's server, thanks Steve).

The rPi is running CumulusMX and uploading real-time data every 5 seconds. The rPi is also running NGINX and is hosting my old astro web site, which just consists of static HTML pages.
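For anyone curious, a static site like that needs only a tiny NGINX server block - something along these lines (the host name and paths are examples, not my actual setup):

Code: Select all

# minimal NGINX server block for a static HTML site
server {
    listen 80;
    server_name astro.example.com;   # hypothetical host name
    root /var/www/astro;             # directory holding the static pages
    index index.html;
}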

Early days, but it's looking good so far, and my electric bill will thank me as I have switched off the old Windows box. :D

Edit: I forgot to add - A BIG thanks to Steve for all the work he is putting into CumulusMX. :clap:

Re: My experience: I moved to a rPi

Posted: Wed 06 May 2015 8:57 pm
by slashmusic
Hiho
I would suggest: buy a Raspberry Pi 2, add a USB hard disk, and reinstall everything you removed from your old Pi.
A while ago I bought a BananaPi (also running Debian Wheezy), and it has nearly the same hardware specs (speed) as a Raspberry Pi 2.
The BananaPi can handle SATA hard disks, and I am running:
Apache web server with SSL support, MySQL server, PHP, Squid with an ad blocker, dnsmasq as a DNS server, MRTG (pulling SNMP from the BananaPi and other servers), Logitech Media Server, pyLoad, SABnzbd, OSCam, and ownCloud for a few people, and I am absolutely fine with the speed. I am also sure that CumulusMX can run on this device...
So switch to a slightly faster device (Pi 2 or BananaPi), add a real hard disk (USB or SATA), modify the OS to boot directly from the hard disk (a sketch of that follows below), and you will have a lot of fun.
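On a Raspberry Pi, for example, the SD card still has to provide the /boot partition, but you can point the kernel at the hard disk. Roughly like this, assuming the disk's root partition is /dev/sda2 (an example - check your own device names):

Code: Select all

# 1) in /boot/cmdline.txt on the SD card, change the root= parameter:
#      root=/dev/mmcblk0p2  ->  root=/dev/sda2
# 2) in /etc/fstab on the hard disk, mount the new root partition:
#      /dev/sda2  /  ext4  defaults,noatime  0  1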

I moved every piece of software running on my old Windows Server 2003 box step by step to my Pis. The only reason to keep that Windows system was Cumulus, and I am still running a mail server on Windows, because I have not yet found a mail suite that is as good as my old one and runs on ARM.
But I will find one sometime in the near future.

Re: My experience: I moved to a rPi

Posted: Wed 06 May 2015 9:05 pm
by mcrossley
I am already running an HDD, but actually it is quite nice to have the web site running on a remote server. I was thinking about an SSD, not for performance but for lower power requirements. But £50 buys a lot of electricity!

Re: My experience: I moved to a rPi

Posted: Wed 06 May 2015 9:10 pm
by slashmusic
mcrossley wrote:But £50 buys a lot of electricity!
Yes, but what about the fun?... Btw: the Raspberry can only handle USB, so you need a SATA-to-USB adapter.
Maybe you should switch to a BananaPi... should I test whether Cumulus also runs on the BananaPi?
The BananaPi costs exactly the same as the Raspberry Pi 2.

Re: My experience: I moved to a rPi

Posted: Wed 06 May 2015 9:16 pm
by mcrossley
Yes, I have an eye on a BananaPi or a Cubie, both of which have SATA interfaces. The USB on the RPi is a bit of a bottleneck.
The Cubie has better processor options too, IIRC.

Re: My experience: I moved to a rPi

Posted: Wed 06 May 2015 9:27 pm
by slashmusic
mcrossley wrote:Yes, I have an eye on a BananaPi or a Cubie, both of which have SATA interfaces. The USB on the RPi is a bit of a bottleneck.
The Cubie has better processor options too, IIRC.
Yes, the BananaPi is cheaper than the Cubie... I have never been in a Cubie forum so far, so I don't know what the support is like, but so far everything that runs on a Raspberry Pi also runs on a BananaPi, and that is why I decided to use the BananaPi...

A few minutes ago I started to check whether Cumulus runs on my BananaPi, but I forgot that I need to install Mono first :?:. Since Mono is such a big thing to install, and I am not sure whether I could uninstall it cleanly and completely, I decided to stop. But I am really sure it would run. I also tested a 1-Wire DS1820 with a serial-to-USB adapter some months ago, so I am also sure that the BananaPi would detect the hardware (weather station).

Re: My experience: I moved to a rPi

Posted: Wed 06 May 2015 9:30 pm
by slashmusic
By the way, do you think it is possible to display the CPU temperature of the Raspberry in Cumulus... maybe as an extra sensor? :-)
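Reading it on the Pi itself is easy, since the kernel exposes the SoC temperature under sysfs - a minimal (untested) sketch; whether Cumulus could display such an extra value is the real question:

Code: Select all

#!/bin/bash
# the kernel reports the SoC temperature in millidegrees Celsius
t=$(cat /sys/class/thermal/thermal_zone0/temp)
echo "CPU temperature: $((t / 1000)).$(((t % 1000) / 100)) C"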

Re: My experience: I moved to a rPi

Posted: Sun 10 May 2015 11:00 am
by mcrossley
Here are a couple of Bash scripts I am using to back up stuff. I'm no expert on Bash scripts, so there may be better ways of doing this; they keep a rotating set of backups...

This one is for the Pi. It backs up the whole CumulusMX folder (plus any additional folders you list) and FTPs it to a USB stick in my router...

Code: Select all

#!/bin/bash
logfile="EndOfDayTasks.log"
ftpserver="<<YOUR_BACKUP_FTP_SERVER>>"
ftpusername=<<YOUR_FTP_USER_NAME>>
ftppassword=<<YOUR_FTP_USER_PASSWORD>>
mxdir=<<PATH_TO_CUMULUS_MX>>   #eg. /CumulusMX
bkdir=<<PATH_TO_BACKUP_FOLDER>> #eg. /var/backups
ftpdir=<<PATH_TO_BACKUP_FOLDER_ON_FTP_SERVER>> #eg. backups


echo start at $(date) >$logfile
echo Switch CWD to $mxdir >>$logfile
cd $mxdir >>$logfile 2>&1
pwd >>$logfile 2>&1
echo -e "\n" >>$logfile

echo Creating backup file $bkdir/CumulusMX_T.tgz... >>$logfile
tar cpzf $bkdir/CumulusMX_T.tgz <<LIST_ANY_ADDITIONAL_FOLDERS_HERE>> $mxdir >>$logfile 2>&1
echo done. >>$logfile
echo -e "\n" >>$logfile

if [ -f $bkdir/CumulusMX_T.tgz ]; then
	echo Local backup file cleanup... >>$logfile
	# rotate the last 10 local backups (-f stops the first runs erroring before all 10 exist)
	rm -f $bkdir/CumulusMX_9.tgz  >>$logfile
	mv $bkdir/CumulusMX_8.tgz $bkdir/CumulusMX_9.tgz  >>$logfile
	mv $bkdir/CumulusMX_7.tgz $bkdir/CumulusMX_8.tgz  >>$logfile
	mv $bkdir/CumulusMX_6.tgz $bkdir/CumulusMX_7.tgz  >>$logfile
	mv $bkdir/CumulusMX_5.tgz $bkdir/CumulusMX_6.tgz  >>$logfile
	mv $bkdir/CumulusMX_4.tgz $bkdir/CumulusMX_5.tgz  >>$logfile
	mv $bkdir/CumulusMX_3.tgz $bkdir/CumulusMX_4.tgz  >>$logfile
	mv $bkdir/CumulusMX_2.tgz $bkdir/CumulusMX_3.tgz  >>$logfile
	mv $bkdir/CumulusMX_1.tgz $bkdir/CumulusMX_2.tgz  >>$logfile
	mv $bkdir/CumulusMX_0.tgz $bkdir/CumulusMX_1.tgz  >>$logfile
	mv $bkdir/CumulusMX_T.tgz $bkdir/CumulusMX_0.tgz  >>$logfile
	echo done. >>$logfile
	echo -e "\n" >>$logfile

	echo Copying backup file to router... >>$logfile
	curl -v -u $ftpusername:$ftppassword -T $bkdir/CumulusMX_0.tgz ftp://$ftpserver/$ftpdir/CumulusMX_T.tgz >>$logfile 2>&1
	echo done. >>$logfile
	echo -e "\n" >>$logfile

	echo Remote backup file cleanup... >>$logfile
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "DELE /$ftpdir/CumulusMX_9.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_8.tgz" -Q "RNTO /$ftpdir/CumulusMX_9.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_7.tgz" -Q "RNTO /$ftpdir/CumulusMX_8.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_6.tgz" -Q "RNTO /$ftpdir/CumulusMX_7.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_5.tgz" -Q "RNTO /$ftpdir/CumulusMX_6.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_4.tgz" -Q "RNTO /$ftpdir/CumulusMX_5.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_3.tgz" -Q "RNTO /$ftpdir/CumulusMX_4.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_2.tgz" -Q "RNTO /$ftpdir/CumulusMX_3.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_1.tgz" -Q "RNTO /$ftpdir/CumulusMX_2.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_0.tgz" -Q "RNTO /$ftpdir/CumulusMX_1.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/CumulusMX_T.tgz" -Q "RNTO /$ftpdir/CumulusMX_0.tgz" >>$logfile 2>&1
	echo done. >>$logfile
	echo -e "\n" >>$logfile
fi

echo end at $(date +"%Y-%m-%d %H:%M:%S") >>$logfile
I get Cumulus to run this script as an external command after the day rollover - my script actually does other stuff too, like copy the day and month files to the web server and insert the new data into the db.

I have a similar script on the web server that backs up the web site and MySQL and FTPs them to my router too; this is scheduled to run daily using cron...

Code: Select all

#!/bin/bash
logfile="backupAll.log"
ftpserver="<<YOUR_BACKUP_FTP_SERVER>>"
ftpusername=<<YOUR_FTP_USER_NAME>>
ftppassword=<<YOUR_FTP_USER_PASSWORD>>
ftpdir=<<PATH_TO_BACKUP_FOLDER_ON_FTP_SERVER>> #eg. backups
bckdir=<<PATH_TO_BACKUP_FOLDER>> #eg. /backups

echo Export MySQL database to $bckdir/backup_sql_T.gz... >>$logfile
mysqldump -h localhost -u <<YOUR_MYSQL_USER>> -p<<YOUR_MYSQL_PASSWORD>> -q <<YOUR_MYSQL_SCHEMA_NAME>> 2>>$logfile | gzip > $bckdir/backup_sql_T.gz
echo Export complete at $(date +"%Y-%m-%d %H:%M:%S") >>$logfile
echo -e "\n" >>$logfile

if [ -f $bckdir/backup_sql_T.gz ]; then
	echo Local SQL backup file cleanup... >>$logfile
	rm -f $bckdir/backup_sql_9.gz  >>$logfile
	mv $bckdir/backup_sql_8.gz $bckdir/backup_sql_9.gz  >>$logfile
	mv $bckdir/backup_sql_7.gz $bckdir/backup_sql_8.gz  >>$logfile
	mv $bckdir/backup_sql_6.gz $bckdir/backup_sql_7.gz  >>$logfile
	mv $bckdir/backup_sql_5.gz $bckdir/backup_sql_6.gz  >>$logfile
	mv $bckdir/backup_sql_4.gz $bckdir/backup_sql_5.gz  >>$logfile
	mv $bckdir/backup_sql_3.gz $bckdir/backup_sql_4.gz  >>$logfile
	mv $bckdir/backup_sql_2.gz $bckdir/backup_sql_3.gz  >>$logfile
	mv $bckdir/backup_sql_1.gz $bckdir/backup_sql_2.gz  >>$logfile
	mv $bckdir/backup_sql_0.gz $bckdir/backup_sql_1.gz  >>$logfile
	mv $bckdir/backup_sql_T.gz $bckdir/backup_sql_0.gz  >>$logfile
	echo done. >>$logfile
	echo -e "\n" >>$logfile

	echo Copying SQL backup file to router... >>$logfile
	curl -v -u $ftpusername:$ftppassword -T $bckdir/backup_sql_0.gz ftp://$ftpserver/$ftpdir/backup_sql_T.gz >>$logfile 2>&1
	echo done. >>$logfile
	echo -e "\n" >>$logfile

	echo Remote backup file cleanup... >>$logfile
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "DELE /$ftpdir/backup_sql_9.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_8.gz" -Q "RNTO /$ftpdir/backup_sql_9.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_7.gz" -Q "RNTO /$ftpdir/backup_sql_8.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_6.gz" -Q "RNTO /$ftpdir/backup_sql_7.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_5.gz" -Q "RNTO /$ftpdir/backup_sql_6.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_4.gz" -Q "RNTO /$ftpdir/backup_sql_5.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_3.gz" -Q "RNTO /$ftpdir/backup_sql_4.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_2.gz" -Q "RNTO /$ftpdir/backup_sql_3.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_1.gz" -Q "RNTO /$ftpdir/backup_sql_2.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_0.gz" -Q "RNTO /$ftpdir/backup_sql_1.gz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_sql_T.gz" -Q "RNTO /$ftpdir/backup_sql_0.gz" >>$logfile 2>&1
	echo done. >>$logfile
	echo -e "\n" >>$logfile
fi


echo Backup website to $bckdir/backup_web_T.tgz... >>$logfile
tar cpzf $bckdir/backup_web_T.tgz www/ private/ >>$logfile 2>&1
echo Backup complete at $(date +"%Y-%m-%d %H:%M:%S") >>$logfile
echo >>$logfile

if [ -f $bckdir/backup_web_T.tgz ]; then
	echo Local Web backup file cleanup... >>$logfile
	rm -f $bckdir/backup_web_6.tgz  >>$logfile
	mv $bckdir/backup_web_5.tgz $bckdir/backup_web_6.tgz  >>$logfile
	mv $bckdir/backup_web_4.tgz $bckdir/backup_web_5.tgz  >>$logfile
	mv $bckdir/backup_web_3.tgz $bckdir/backup_web_4.tgz  >>$logfile
	mv $bckdir/backup_web_2.tgz $bckdir/backup_web_3.tgz  >>$logfile
	mv $bckdir/backup_web_1.tgz $bckdir/backup_web_2.tgz  >>$logfile
	mv $bckdir/backup_web_0.tgz $bckdir/backup_web_1.tgz  >>$logfile
	mv $bckdir/backup_web_T.tgz $bckdir/backup_web_0.tgz  >>$logfile
	echo done. >>$logfile
	echo -e "\n" >>$logfile

	echo Copying Web backup file to router... >>$logfile
	curl -v -u $ftpusername:$ftppassword -T $bckdir/backup_web_0.tgz ftp://$ftpserver/$ftpdir/backup_web_T.tgz >>$logfile 2>&1
	echo done. >>$logfile
	echo -e "\n" >>$logfile

	echo Remote backup file cleanup... >>$logfile
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "DELE /$ftpdir/backup_web_9.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_8.tgz" -Q "RNTO /$ftpdir/backup_web_9.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_7.tgz" -Q "RNTO /$ftpdir/backup_web_8.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_6.tgz" -Q "RNTO /$ftpdir/backup_web_7.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_5.tgz" -Q "RNTO /$ftpdir/backup_web_6.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_4.tgz" -Q "RNTO /$ftpdir/backup_web_5.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_3.tgz" -Q "RNTO /$ftpdir/backup_web_4.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_2.tgz" -Q "RNTO /$ftpdir/backup_web_3.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_1.tgz" -Q "RNTO /$ftpdir/backup_web_2.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_0.tgz" -Q "RNTO /$ftpdir/backup_web_1.tgz" >>$logfile 2>&1
	curl -v -u $ftpusername:$ftppassword ftp://$ftpserver -Q "RNFR /$ftpdir/backup_web_T.tgz" -Q "RNTO /$ftpdir/backup_web_0.tgz" >>$logfile 2>&1
	echo done. >>$logfile
	echo -e "\n" >>$logfile
fi

echo end at $(date +"%Y-%m-%d %H:%M:%S") >>$logfile

Somebody may find these of some use. Remember to check regularly that your backup is both running and valid - a couple of quick checks are sketched below!
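For example, something along these lines will at least confirm the newest archives are readable (paths as per the examples in the scripts above - adjust to suit):

Code: Select all

#!/bin/bash
# quick sanity check: can the tar be listed, and is the gzip intact?
tar -tzf /var/backups/CumulusMX_0.tgz > /dev/null && echo "CumulusMX archive OK"
gunzip -t /backups/backup_sql_0.gz && echo "SQL dump OK"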

PS: I just realised I had hard-coded folder paths in the scripts, so I have parametrised them here - there may be some errors!

Re: My experience: I moved to a rPi

Posted: Mon 11 May 2015 4:30 am
by wxstormer
I'm running MX on a Pi 2 now, but I'm not using the SD card except to boot the device. I have a Synology DiskStation that exports an NFS mount which is used for the root partition on the Pi 2. It's significantly faster than the SD card on the Pi 2, and I get built-in backup with CrashPlan... I'm also hosting the websites on the device with Apache and PHP (still working on them).
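If anyone wants to try this, the usual approach is to leave the kernel on the SD card and point it at the NFS export in /boot/cmdline.txt. A sketch - the NAS address and export path here are made up, substitute your own:

Code: Select all

dwc_otg.lpm_enable=0 console=ttyAMA0,115200 root=/dev/nfs nfsroot=192.168.1.10:/volume1/rpi-root,vers=3 rw ip=dhcp rootwait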

Re: My experience: I moved to a rPi

Posted: Mon 11 May 2015 8:38 am
by slashmusic
Hi, I want to add another scenario for how you can configure a Raspberry Pi to work with CumulusMX.
I deliberately decided on an SD-card-only solution, because if the network has an outage, no data can be written and the Pi will crash (if the root partition is located on a file server).
So my Pi runs completely from the SD card, but I mounted an NFS share from my QNAP NAS.
I created a cron task to sync the complete CumulusMX folder from the SD card to the QNAP.
I think doing this once per hour is often enough: if I lose all the data on my Pi, the oldest missing weather entry is at most one hour old. I also decided to use rsync, so only modified data is copied to the QNAP each hour.

The cron entry looks like this:

00 */1 * * * sudo rsync -rtagu /CumulusMX/ /mnt/Backup/Administrator/CumulusMX_Sync/ --exclude=MXdiags

Everything will be backed up, except the MXdiags folder.

Of course, if the SD card gets damaged, you need to install the OS completely from scratch, and depending on what else you have modified/installed/configured on your Pi, it can take several hours until everything is as before.
For this case, I make a complete backup of the SD card with dd.
dd creates an IMG file, which can easily be written to a new SD card again using the same image writer program you used when you first set up the card.
The cron entry to create the backup once a week is:

30 03 * * 1 sudo /home/pi/dd_backup.sh

The script dd_backup.sh looks like this:

Code: Select all

#!/bin/sh
# Full dd backup of the SD card to an image file on the NAS
sudo dd if=/dev/mmcblk0 of=/mnt/Backup/Administrator/RaspberryPi2_Cumulus/$(date +%F)_Cumulus.img bs=1MB

# delete backups older than 32 days
sudo find /mnt/Backup/Administrator/RaspberryPi2_Cumulus/ -name '*.img' -mtime +32 -exec rm {} \;
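Restoring is just dd in the other direction (or the image writer program again). A sketch - the image name and device are placeholders, and double-check the device name, because dd will happily overwrite anything:

Code: Select all

# write a saved image back to a (new) SD card
# YYYY-MM-DD_Cumulus.img and /dev/sdX are placeholders!
sudo dd if=/mnt/Backup/Administrator/RaspberryPi2_Cumulus/YYYY-MM-DD_Cumulus.img of=/dev/sdX bs=1MB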