To make the script compatible with cron jobs and have it log each backup event, including any errors, you can modify it to write to a log file. This ensures that every run of the backup script records its outcome (success or failure) in one place you can review later.
PostgreSQL Backup Script with Logging
#!/bin/bash
# Database credentials
DATABASE="DB NAME"
USERNAME="DB USERNAME"
PASSWORD="DB PASSWORD"
HOST="localhost"
PORT="5432"
# Directory to back up
SOURCE_DIR=/var/www/html/pwdwebsite/public_html/
# Target directory
TARGET_DIR=/home/websiteuser/backup/bak/
# Log file path
LOG_FILE=/home/websiteuser/backup/backup.log
# Output files
NOW=$(date +"%Y_%m_%d_%H_%M_%S")
DB_OUTPUT="$TARGET_DIR/db.$NOW.sql.gz"
FILES_OUTPUT="$TARGET_DIR/files.$NOW.zip"
# Export PostgreSQL password so it is not prompted
export PGPASSWORD="$PASSWORD"
# Function to log messages
log_message() {
echo "$(date +"%Y-%m-%d %H:%M:%S") : $1" >> "$LOG_FILE"
}
# Start backup log
log_message "Backup started."
# Back up files
if zip -r "$FILES_OUTPUT" "$SOURCE_DIR" >/dev/null 2>&1; then
log_message "Files backup successful: $FILES_OUTPUT"
else
log_message "Files backup failed."
fi
# Back up PostgreSQL database (the PIPESTATUS check catches pg_dump failures that gzip would otherwise mask)
if pg_dump -h "$HOST" -p "$PORT" -U "$USERNAME" -d "$DATABASE" 2>/dev/null | gzip > "$DB_OUTPUT" && [ "${PIPESTATUS[0]}" -eq 0 ]; then
log_message "Database backup successful: $DB_OUTPUT"
else
log_message "Database backup failed."
fi
# Unset the password variable for security
unset PGPASSWORD
# Remove files older than 5 days (find -delete avoids the xargs pipeline, whose exit status would only reflect xargs)
if find "$TARGET_DIR" -type f -mtime +5 -delete 2>/dev/null; then
log_message "Old backups removed."
else
log_message "Failed to remove old backups."
fi
# End backup log
log_message "Backup completed."
Explanation:
- Log File Path:
A log file (LOG_FILE=/home/websiteuser/backup/backup.log) is created. This file will store a message for each backup operation.
- Logging Function:
log_message() {
echo "$(date +"%Y-%m-%d %H:%M:%S") : $1" >> "$LOG_FILE"
}
A function called log_message handles the logging: it records the current date and time along with the message passed to it and appends the line to the log file.
Backing up Files:
if zip -r "$FILES_OUTPUT" "$SOURCE_DIR" >/dev/null 2>&1; then
log_message "Files backup successful: $FILES_OUTPUT"
else
log_message "Files backup failed."
fi
- The exit status of the zip command determines whether a success or failure message is logged.
Backing up the PostgreSQL Database:
if pg_dump -h "$HOST" -p "$PORT" -U "$USERNAME" -d "$DATABASE" 2>/dev/null | gzip > "$DB_OUTPUT" && [ "${PIPESTATUS[0]}" -eq 0 ]; then
log_message "Database backup successful: $DB_OUTPUT"
else
log_message "Database backup failed."
fi
- Logs success or failure for the database backup. Checking PIPESTATUS[0] ensures a pg_dump failure is reported even when gzip itself exits successfully, since a pipeline's overall exit status only reflects its last command.
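If you'd rather not keep the password in the script at all, libpq can read it from a password file instead of the PGPASSWORD variable. Below is a sketch using the PGPASSFILE environment variable with a temp file purely for demonstration; in practice you would point it at a permanent path such as ~/.pgpass and fill in your real credentials:

```shell
#!/bin/bash
# Sketch: a libpq password file, one "host:port:database:user:password" line per connection.
# PGPASSFILE overrides the default location (~/.pgpass); a temp file is used here for demo only.
export PGPASSFILE=$(mktemp)
printf '%s\n' "localhost:5432:DB NAME:DB USERNAME:DB PASSWORD" > "$PGPASSFILE"
chmod 600 "$PGPASSFILE"   # libpq ignores the file unless it is readable only by its owner

# pg_dump would then need no PGPASSWORD export at all:
# pg_dump -h localhost -p 5432 -U "DB USERNAME" -d "DB NAME" | gzip > db.sql.gz
```

This keeps the secret out of both the script and the process environment, which `ps` can expose on some systems.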
Removing Old Files:
if find "$TARGET_DIR" -type f -mtime +5 -delete 2>/dev/null; then
log_message "Old backups removed."
else
log_message "Failed to remove old backups."
fi
- Logs success or failure when removing backups older than 5 days. Using find -delete (rather than piping filenames to xargs rm) means the if condition actually reflects whether find succeeded, and also handles filenames with spaces safely.
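The retention rule itself can be verified in isolation by back-dating files in a scratch directory (GNU touch and find assumed):

```shell
#!/bin/bash
# Demo: which files a "-mtime +5" retention rule removes
DEMO_DIR=$(mktemp -d)
touch -d '10 days ago' "$DEMO_DIR/old.zip"   # older than 5 days: should be removed
touch "$DEMO_DIR/new.zip"                    # fresh: should survive

find "$DEMO_DIR" -type f -mtime +5 -delete
ls "$DEMO_DIR"   # only new.zip remains
```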
Ending the Backup:
log_message "Backup completed."
- Logs the completion of the entire backup process.
Cron Job Compatibility:
- In a cron job, output isn’t visible directly, so logging to a file ensures you can check later if the job ran correctly.
- Redirecting command output (>/dev/null 2>&1) keeps noisy command output out of cron's email and leaves the log file as the single record of what happened, which is important when running via cron.
Cron Job Setup
You can schedule this script to run regularly using a cron job. For example, to run the script every day at 2 AM:
1. Open your crontab:
crontab -e
2. Add the following line to schedule the script to run at 2 AM every day:
0 2 * * * /path/to/your/backup_script.sh
Make sure to replace /path/to/your/backup_script.sh with the actual path to the script.
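If the script ever prints anything outside the logging function (an unexpected shell error, for example), cron will try to email that output to the user. A common variant is to append such stray output to its own file; the cron.log path below is just an illustration:

```shell
# Run daily at 2 AM; any stray stdout/stderr is appended to a separate cron log
0 2 * * * /path/to/your/backup_script.sh >> /home/websiteuser/backup/cron.log 2>&1
```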
This will run the script daily, log the results in the specified log file, and ensure the backups are handled correctly.