#!/bin/bash
# Location to place backups.
backup_dir="/var/backups/databases/"
nightly_dir="/var/backups/databases/latest/"
# String appended to the names of the backup files.
backup_date=$(date +%d-%m-%Y)
# Number of days you want to keep copies of your databases.
number_of_days=15
databases=$(psql -l -t | cut -d'|' -f1 | sed -e 's/ //g' -e '/^$/d')
for i in $databases; do
  if [ "$i" != "postgres" ] && [ "$i" != "template0" ] && [ "$i" != "template1" ] && [ "$i" != "template_postgis" ]; then
    echo "Dumping $i to ${backup_dir}${i}_${backup_date}.sql"
    pg_dump "$i" > "${backup_dir}${i}_${backup_date}.sql"
    bzip2 "${backup_dir}${i}_${backup_date}.sql"
    ln -fs "${backup_dir}${i}_${backup_date}.sql.bz2" "${nightly_dir}${i}-nightly.sql.bz2"
  fi
done
find "$backup_dir" -type f -prune -mtime +"$number_of_days" -exec rm -f {} \;
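If this is meant to run unattended every night, a cron entry is the usual trigger. A sketch, assuming the script is saved at /usr/local/bin/pg_backup.sh (a hypothetical path) and the crontab belongs to the postgres user:

```shell
# Hypothetical install path and schedule: nightly at 02:30, output logged.
# Added via `crontab -e` as the postgres user.
30 2 * * * /usr/local/bin/pg_backup.sh >> /var/log/pg_backup.log 2>&1
```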
Beautiful, thank you!
Brilliant, thank you.
I have over 20 databases in our PostgreSQL installation, and more are added from time to time, so the psql -l is great, as it will run through the whole list.
If I run pg_dump manually, it asks for the postgres user's password before creating the dump file, and I imagine the same will be true when running this script. Is there a line of code we can add to feed the password in for each iteration of the loop?
Is it also worth putting a "wait" inside the loop, say so each DB has one minute to complete? I don't want the system trying to back up all DBs at once (or does it wait for the previous iteration's dump file to be created before proceeding anyway?).
Thanks for your help. Great post and great script!
Kev.
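On the sequencing question above: a shell for-loop runs each iteration to completion before starting the next, so the dumps already happen one at a time and no extra wait is needed. A minimal demonstration (placeholder database names, with sleep standing in for pg_dump):

```shell
# Each iteration finishes before the next begins; output is strictly ordered.
for db in alpha beta gamma; do
  echo "dumping $db"
  sleep 1            # stand-in for pg_dump; the loop blocks here
  echo "finished $db"
done
```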
Maybe this can help you: https://www.postgresql.org/docs/13/libpq-pgpass.html
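For reference, the pgpass file from that link holds one line per server, and libpq reads it automatically, so pg_dump stops prompting. A sketch with placeholder host, user, and password:

```shell
# Placeholder credentials: adjust to your installation.
# Format (from the docs): hostname:port:database:username:password
echo "localhost:5432:*:postgres:SOMESTRONGPASSWORD" >> "$HOME/.pgpass"
chmod 0600 "$HOME/.pgpass"   # libpq ignores the file unless it is 0600
```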
Whenever I unzip the file, I always get this error: bzip2: Input file Adom_General-nightly.sql.bz2 is not a normal file. Does anyone have any ideas?
Actually, you can use something like PGPASSWORD=SOMESTRONGPASSWORD pg_dump *PARAMS...*
Not best practice, but it works.
@benzhou2014 you are trying to unpack the symbolic link, and a symbolic link is not a normal file; you need to use the real path of the file.
To unpack the nightly file, use this command:
bunzip2 "$(realpath your_filename-nightly.sql.bz2)"
That's it.
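The same fix can be reproduced self-contained in a temp directory, assuming bzip2 and realpath are installed; bzip2 refuses to touch anything but a regular file, while realpath resolves the link first:

```shell
# Reproduce the nightly-symlink setup in a temp dir, then unpack via realpath.
tmp=$(mktemp -d)
echo "data" > "$tmp/real.sql"
bzip2 "$tmp/real.sql"                             # produces real.sql.bz2
ln -s "$tmp/real.sql.bz2" "$tmp/nightly.sql.bz2"  # the "nightly" symlink
bunzip2 "$(realpath "$tmp/nightly.sql.bz2")"      # resolves the link, unpacks fine
```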
Turns out you need to force ln to overwrite old symlinks using the -f option; without it, latest wasn't getting updated. Everything else works fine, though.