Friday, July 26, 2024

Automating Source Code and Database Backup Uploads to AWS S3 Bucket

 

1. Create an S3 user with limited access
2. Create shell scripts for the database and source code backups
3. Configure the AWS CLI with the S3 user's credentials
4. Set up a cron job for scheduled execution
5. Use an S3 lifecycle policy to delete objects older than 10 days
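
For step 1, a minimal IAM policy for the backup user might look like the sketch below (the bucket name matches the scripts later in this post; the statement ID and the exact set of actions are assumptions — tighten them to your needs). Attach it to a dedicated IAM user, then complete step 3 by running `aws configure` on the server with that user's access key and secret key.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBackupUploads",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::source-code-db-bkp",
        "arn:aws:s3:::source-code-db-bkp/*"
      ]
    }
  ]
}
```

Granting only `s3:PutObject` (plus `s3:ListBucket`) means a compromised server can add backups but cannot read or delete the ones already stored.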


===============================2) Create a Script for the Source Code

#!/bin/bash
# About_Author : Subhash Sautiyal

# Variables
SOURCE_DIR=/var/www/html
BUCKET_NAME=source-code-db-bkp
BUCKET_FOLDER=Every-Week-Source-Code-Backup
ARCHIVE_NAME="source_code_$(date +Date_%d_%m_%y_Time_%H_%M_%S).tar.gz"

# Create archive of the source directory
tar -czvf "$ARCHIVE_NAME" "$SOURCE_DIR"

if [ $? -eq 0 ]; then
    echo "Archive $ARCHIVE_NAME created successfully."
else
    echo "Error creating archive."
    exit 1
fi

# Upload to S3 (the trailing slash makes S3 treat the folder as a prefix;
# without it the archive would be stored under the key "Every-Week-Source-Code-Backup" itself)
aws s3 cp "$ARCHIVE_NAME" "s3://$BUCKET_NAME/$BUCKET_FOLDER/"

if [ $? -eq 0 ]; then
    echo "File uploaded to S3 successfully."
else
    echo "Error uploading file to S3."
    exit 1
fi

# Cleanup
rm "$ARCHIVE_NAME"
echo "Cleanup done. Local archive removed."

exit 0
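
The archive step can be sanity-checked locally before wiring up S3. A minimal dry-run sketch, using a throwaway directory in place of /var/www/html (all paths here are illustrative):

```shell
# Dry run of the archive step against a temporary directory (no S3 involved)
SOURCE_DIR="$(mktemp -d)"                     # stand-in for /var/www/html
echo "hello" > "$SOURCE_DIR/index.html"

ARCHIVE_NAME="source_code_$(date +Date_%d_%m_%y_Time_%H_%M_%S).tar.gz"
tar -czf "$ARCHIVE_NAME" -C "$SOURCE_DIR" .   # -C avoids embedding absolute paths

# Verify the archive actually contains the file before any upload step
if tar -tzf "$ARCHIVE_NAME" | grep -q 'index.html'; then
    RESULT="archive OK"
else
    RESULT="archive FAILED"
fi
echo "$RESULT"

rm -rf "$SOURCE_DIR" "$ARCHIVE_NAME"
```

Listing the archive with `tar -tzf` before uploading is a cheap way to catch an empty or truncated backup early.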


===============================2) Create a Script for the MySQL Database

#!/bin/bash
# About_Author : Subhash Sautiyal

# Variables
# Note: avoid hard-coding the password in production; prefer a ~/.my.cnf option
# file or an environment variable, since -p<password> is visible in the process list.
DB_HOST=127.0.0.1
DB_USER=root
DB_PASS=123456
DB_NAME=testDB
BUCKET_NAME=source-code-db-bkp/Every-day-Mysql-DB-Backup
DUMP_NAME="${DB_NAME}_dump_$(date +Date_%d_%m_%y_Time_%H_%M_%S).sql"

# Create MySQL dump
mysqldump -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$DUMP_NAME"

if [ $? -eq 0 ]; then
    echo "Database dump $DUMP_NAME created successfully."
else
    echo "Error creating database dump."
    exit 1
fi

# Upload to S3
aws s3 cp "$DUMP_NAME" "s3://$BUCKET_NAME/"

if [ $? -eq 0 ]; then
    echo "File uploaded to S3 successfully."
else
    echo "Error uploading file to S3."
    exit 1
fi

# Cleanup
rm "$DUMP_NAME"
echo "Cleanup done. Local dump removed."

exit 0
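
SQL dumps compress very well, so gzipping the dump before the upload noticeably cuts S3 storage and transfer costs. A minimal sketch of the idea, using a small stand-in file rather than real mysqldump output:

```shell
# Stand-in for the mysqldump output produced by the script above
DUMP_NAME="testDB_dump_example.sql"
printf 'CREATE TABLE t (id INT);\n' > "$DUMP_NAME"

gzip -9 "$DUMP_NAME"            # replaces the .sql file with ${DUMP_NAME}.gz

# The upload line in the script would then become:
#   aws s3 cp "${DUMP_NAME}.gz" "s3://$BUCKET_NAME/"
ls -l "${DUMP_NAME}.gz"
```

To restore, `gunzip` the file and pipe the resulting .sql into the `mysql` client.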

===============================4) Set Up a Cron Job for Scheduled Execution

# Create the log directory used by the cron entries below
mkdir -p /var/log/cron

# Edit the root crontab with `crontab -e` and add the entries below.
# Daily database backups at midnight:
0 0 * * * /usr/bin/sh /etc/Source-Code-And-DB-Backup/Every-Day-Mysql-Demo-Blog-DB-Backup.sh >> /var/log/cron/code_and_db_backup_cron.log 2>&1
0 0 * * * /usr/bin/sh /etc/Source-Code-And-DB-Backup/Every-Day-Mysql-Demo-Main-DB-Backup.sh >> /var/log/cron/code_and_db_backup_cron.log 2>&1
# Source-code backup at midnight on Wednesday and Friday (days of week 3 and 5):
0 0 * * 3,5 /usr/bin/sh /etc/Source-Code-And-DB-Backup/Every-Week-Source-Code-Backup.sh >> /var/log/cron/code_and_db_backup_cron.log 2>&1
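
Before scheduling, it is worth syntax-checking each script with `bash -n`, since a cron failure at midnight is easy to miss. A small sketch using a throwaway script (the script path and contents are illustrative):

```shell
# Create a throwaway script standing in for one of the backup scripts above
DEMO_SCRIPT="$(mktemp)"
cat > "$DEMO_SCRIPT" <<'EOF'
#!/bin/bash
echo "backup ok"
EOF

# -n parses the script without executing it; exit status 0 means the syntax is valid
if bash -n "$DEMO_SCRIPT"; then
    CHECK="syntax OK"
else
    CHECK="syntax error"
fi
echo "$CHECK"
rm -f "$DEMO_SCRIPT"
```

Checking the cron log file after the first scheduled run is the other half of this: a script that parses cleanly can still fail at runtime (missing credentials, full disk), and those errors land in the log via `2>&1`.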


===============================5) Using S3 Lifecycle Policies


Select your bucket >> go to the Management tab >> click "Create lifecycle rule."
Give the rule a name, e.g., "DeleteOldObjects."
Choose the scope of the rule (the entire bucket or a specific prefix).
Click "Next."

Configure expiration:

Under "Lifecycle rule actions," select "Expire current versions of objects."
Enter "10" for "Number of days after object creation."
Click "Next."

Review your settings and click "Create rule."
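
The same rule can also be applied from the command line instead of the console. A sketch of the lifecycle document (the rule ID and the empty prefix are assumptions; an empty prefix applies the rule to the whole bucket):

```json
{
  "Rules": [
    {
      "ID": "DeleteOldObjects",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Expiration": { "Days": 10 }
    }
  ]
}
```

Save this as lifecycle.json and apply it with `aws s3api put-bucket-lifecycle-configuration --bucket source-code-db-bkp --lifecycle-configuration file://lifecycle.json`.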
