mysqldump on CumulusMX Database.

ConligWX
Posts: 1571
Joined: Mon 19 May 2014 10:45 pm
Weather Station: Davis vPro2+ w/DFARS + AirLink
Operating System: Ubuntu 22.04 LTS
Location: Bangor, NI

mysqldump on CumulusMX Database.

Post by ConligWX »

Can I confirm that the best option to back up the MySQL Cumulus database is to use the following command while the system is live and writing to the InnoDB database?

Code: Select all

mysqldump -uuser -ppass --single-transaction --routines --triggers cumulus_db | gzip > /backups/cumulusmx-$(date +%d-%m-%Y).sql.gz
--single-transaction produces a checkpoint that allows the dump to capture all data prior to the checkpoint while receiving incoming changes. Those incoming changes do not become part of the dump. That ensures the same point-in-time for all tables.

--routines dumps all stored procedures and stored functions

--triggers dumps all triggers for each table that has them
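
As a quick sanity check on the resulting dump (the filename below is just an example of what the command above would produce on a given day), something like this confirms the gzip file is intact and actually contains SQL:

Code: Select all

# verify the gzip file itself is not corrupted
gzip -t /backups/cumulusmx-21-03-2024.sql.gz

# peek at the start of the dump to confirm it contains the expected SQL statements
zcat /backups/cumulusmx-21-03-2024.sql.gz | head -n 20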
Regards Simon

https://www.conligwx.org - @conligwx
Davis Vantage Pro2 Plus with Daytime FARS • WeatherLink Live • Davis AirLink • PurpleAir •

jlmr731
Posts: 225
Joined: Sat 27 Aug 2016 12:11 am
Weather Station: Davis vantage pro 2
Operating System: Debian
Location: Wickliffe, Ohio

Re: mysqldump on CumulusMX Database.

Post by jlmr731 »

Toxic17 wrote:Can I confirm that the best option to back up the MySQL Cumulus database is to use the following command while the system is live and writing to the InnoDB database?

Code: Select all

mysqldump -uuser -ppass --single-transaction --routines --triggers cumulus_db | gzip > /backups/cumulusmx-$(date +%d-%m-%Y).sql.gz
--single-transaction produces a checkpoint that allows the dump to capture all data prior to the checkpoint while receiving incoming changes. Those incoming changes do not become part of the dump. That ensures the same point-in-time for all tables.

--routines dumps all stored procedures and stored functions

--triggers dumps all triggers for each table that has them
I am not sure you need --single-transaction, because there are no foreign keys between the tables (unless you have altered them). Also, --routines and --triggers would include any stored procedures and triggers, and I don't believe you would have any of those either.

I just run mysqldump with no options and have not had a problem restoring any data. The gzip part looks fine; just remember to remove older backups from time to time.
I take it you're setting this up as a cron job?
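
For reference, a crontab entry for that could look something like this (the script path and schedule are only assumptions, not anything from the posts above):

Code: Select all

# run the backup script every night at 02:15 (path is an example)
15 2 * * * /home/pi/scripts/cumulus-backup.sh >> /home/pi/scripts/cumulus-backup.log 2>&1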
ConligWX
Posts: 1571
Joined: Mon 19 May 2014 10:45 pm
Weather Station: Davis vPro2+ w/DFARS + AirLink
Operating System: Ubuntu 22.04 LTS
Location: Bangor, NI

Re: mysqldump on CumulusMX Database.

Post by ConligWX »

Yes, that's correct. The job is run in a script called from a cron job. I also have rsync copy the dumps and the CumulusMX folder over to a NAS as a backup. The NAS data is then backed up to another NAS.
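
A minimal sketch of that rsync step, assuming SSH access to a host called nas1 and made-up paths (adjust to your own setup):

Code: Select all

#!/bin/bash
# copy the SQL dumps and the CumulusMX folder to the NAS (host and paths are examples)
rsync -a /backups/ nas1:/volume1/backups/cumulusmx-sql/
rsync -a /home/pi/CumulusMX/ nas1:/volume1/backups/CumulusMX/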
Regards Simon

https://www.conligwx.org - @conligwx
Davis Vantage Pro2 Plus with Daytime FARS • WeatherLink Live • Davis AirLink • PurpleAir •

dazza1223
Posts: 860
Joined: Sun 25 Jan 2015 8:41 pm
Weather Station: Davis Vantage Pro 2 plus
Operating System: Raspberry pi 4 (4gb)
Location: Worthing

Re: mysqldump on CumulusMX Database.

Post by dazza1223 »

Code: Select all

#!/bin/bash
# Database credentials
user="*****"
password="*****"
host="127.0.0.1"
db_name="cumulus"

# Other options
backup_path="/home/pi/nas/backups/sql_backup"
date=$(date +"%d-%b-%Y")

# Set default file permissions (dump file is created with owner read/write only)
umask 177

# Dump the database into a dated SQL file
mysqldump --user="$user" --password="$password" --host="$host" "$db_name" > "$backup_path/$db_name-$date.sql"

# Delete backups older than 30 days
find "$backup_path" -type f -mtime +30 -exec rm {} \;



That's what I use, and it's been running fine, mate.
Have fun and keep learning

dazza :D

https://www.davisworthing.co.uk

jlmr731
Posts: 225
Joined: Sat 27 Aug 2016 12:11 am
Weather Station: Davis vantage pro 2
Operating System: Debian
Location: Wickliffe, Ohio

Re: mysqldump on CumulusMX Database.

Post by jlmr731 »

Code: Select all

#!/bin/sh
# back up all MySQL databases: list the databases and dump each one to its own gzipped file
DIR=/thebackupdirectory/
DATESTAMP=$(date +%Y%m%d-%H:%M:%S)
DB_USER=
DB_PASS=

# remove backups older than 5 days
find ${DIR} -type f -mtime +5 -exec rm -f {} \;

# get the list of databases and strip the "Database" header line from the output
DB_LIST=$(mysql -u $DB_USER -p"$DB_PASS" -e 'show databases;')
DB_LIST=${DB_LIST##Database}

# dump each database to its own dated file
for DB in $DB_LIST
do
  FILENAME=${DIR}${DB}-${DATESTAMP}.sql.gz
  mysqldump -u $DB_USER -p"$DB_PASS" --opt --flush-logs $DB | gzip > $FILENAME
done

# run a consistency check on all databases and keep a log of the result
mysqlcheck -u $DB_USER -p"$DB_PASS" --all-databases > /root/mysql_backups/check_errors-${DATESTAMP}.log

This is what I use. The nice thing about it is that it backs up each database independently, which is handy if you only need to restore one database instead of all of them, so it is similar to dazza's script (see the restore sketch below).
You can take the time out of the datestamp if you're only running it once a day; I run it a few times a day, so it's nice to see what time each backup ran. It also produces a log file of the backups and deletes old files that are more than 5 days old.
(Found this somewhere on that internet thingy, so thanks to whoever wrote it.)
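
For completeness, restoring a single database from one of those per-database dumps would look roughly like this (the user name and filename are placeholders):

Code: Select all

# recreate the database if it does not exist, then load the dump into it
mysql -u youruser -p -e 'CREATE DATABASE IF NOT EXISTS cumulus;'
gunzip -c /thebackupdirectory/cumulus-20240321-02:00:01.sql.gz | mysql -u youruser -p cumulus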

Also, since I have a few systems, I'm running MySQL replication, and this script runs on a slave so it won't affect the master database. It's a good idea if you have an old laptop, desktop, or Raspberry Pi to set up as a slave.
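
If you do back up from a slave like that, it's worth confirming the slave is caught up before the dump runs. A rough check (user name is a placeholder) would be:

Code: Select all

# Seconds_Behind_Master should be 0 (or very small) before taking the dump
mysql -u youruser -p -e 'SHOW SLAVE STATUS\G' | grep Seconds_Behind_Master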