How To Back Up MongoDB Automatically

Gupta Aditya
6 min read · Sep 11, 2023

🚀 Introduction:

Hello readers! Today, we’re embarking on a journey into the world of database management. Our destination? Automated MongoDB Backups.

🔍 What We’ll Cover

We’ll delve into the intricacies of automating MongoDB backups and archiving them date-wise in an AWS S3 bucket. This guide promises not just knowledge, but also the peace of mind that your data is safe and within reach whenever you need it. And while the S3 bucket is our chosen storage for this guide, it’s just one of many options. If you’ve got a different setup, feel free to adapt as needed!

🐧 Linux Server — A Must-Have

For this demonstration, I’m using the ever-reliable Ubuntu 22.04. Don’t have a Linux server yet? No worries!

🌟 Special Offer

Linode has you covered. For the first few months, they’re offering a $100 credit, making your Linux adventure virtually free. Interested? Dive in and grab the deal.

🛠 Setting Up Automated MongoDB Backups

Before we dive into automating the backups, it’s essential to manually test our script to ensure everything works seamlessly. Once verified, we’ll use crontab to schedule our backups daily at 12:00 am IST.

📝 Step 1: Create the Backup Script

Start by creating a new file for our script (named backup.sh here, but any name works):

nano backup.sh

🔗 Step 2: Scripting the Backup Process

Paste the following code into the file you just created:


#!/bin/bash

# MongoDB server settings
HOST='mongodb+srv://username:pass@instance/dbname' # Replace with your MongoDB URL

# Backup directory
BACKUP_DIR='./backupdir' # Ensure you create this directory and provide its full path
BACKUP_NAME="mongodb_backup_$(date +%Y%m%d%H%M%S).gz"

# S3 bucket details (Optional: Only if you're saving to S3)
S3_BUCKET='s3://your-bucket-name' # Replace with your S3 bucket

# Create backup
mongodump --uri "$HOST" --archive="$BACKUP_DIR/$BACKUP_NAME" --gzip

# Copy the backup to S3 (Optional: Only if you're saving to S3)
aws s3 cp "$BACKUP_DIR/$BACKUP_NAME" "$S3_BUCKET/$BACKUP_NAME"

# Optional: Remove local backup after uploading to S3 (use this only if you're not keeping backups on local storage)
rm "$BACKUP_DIR/$BACKUP_NAME"

In this script:

  • We first set the MongoDB connection URL.
  • mongodump writes a compressed archive into our designated backup directory.
  • Optionally, the AWS CLI copies that archive to an S3 bucket.
  • Finally, we remove the local copy to save space (only if you’re not keeping backups on local storage).
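
To match the date-wise S3 layout mentioned in the introduction, the optional upload can use a year/month/day prefix. A minimal sketch (the bucket name is a placeholder, and the actual upload is left commented out so you can adapt it to your bucket):

```shell
#!/bin/bash
# Build a date-wise S3 destination key for today's backup.
BACKUP_NAME="mongodb_backup_$(date +%Y%m%d%H%M%S).gz"
S3_KEY="s3://your-bucket-name/$(date +%Y/%m/%d)/$BACKUP_NAME"
echo "$S3_KEY"
# Then upload with:
# aws s3 cp "./backupdir/$BACKUP_NAME" "$S3_KEY"
```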

🔒 Step 3: Make the Script Executable

Change the file mode to make it executable:

chmod +x backup.sh

You can now run your script using:

./backup.sh

But wait! If you run the script now, you might encounter an error. Why? Because we haven’t installed the necessary MongoDB tools yet. Let’s get those set up in the next steps!
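
Before installing, you can check whether the tools are already present. A small pre-flight sketch:

```shell
#!/bin/bash
# Check whether mongodump (part of the MongoDB Database Tools) is on the PATH.
if command -v mongodump >/dev/null 2>&1; then
  echo "mongodump found: $(mongodump --version | head -n 1)"
else
  echo "mongodump not found - install the MongoDB Database Tools first"
fi
```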

🔧 Installing MongoDB Database Tools

To ensure our backup script runs flawlessly, we need to install the MongoDB Database Tools. Here’s how you can do it:

🌐 Step 1: Download the MongoDB Database Tools

  1. Head over to MongoDB Database Tools Download.
  2. Scroll to the MongoDB Command Line Database Tools section.
  3. Choose your desired version, OS (in our case, it’s Ubuntu 22.04), and set the package type to tgz.
  4. Once you’ve made your selections, click on “Copy Link” to copy the download link to your clipboard.

💾 Step 2: Download and Install the Tools

  1. Use the wget command to download the package:
wget  # Replace with your copied link

2. After downloading, extract the contents of the package:

tar -zxvf mongodb-database-tools-*-100.8.0.tgz

3. Move the extracted tools to /usr/local/bin/ so they're accessible system-wide:

sudo cp -r mongodb-database-tools-ubuntu2204-x86_64-100.8.0/bin/* /usr/local/bin/

🎉 Done! If you’ve decided to skip the S3 bucket part in our previous steps, you can now run the backup script. It should execute without any hitches and save the backup with the current date in your specified backup directory.
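
Once a backup exists, it’s worth a quick sanity check that the newest archive is a readable gzip file before relying on it. A sketch, assuming the same ./backupdir and file-name pattern as the script above:

```shell
#!/bin/bash
# Find the newest backup and verify it is a valid gzip archive.
BACKUP_DIR=./backupdir
latest=$(ls -t "$BACKUP_DIR"/mongodb_backup_*.gz 2>/dev/null | head -n 1)
if [ -n "$latest" ] && gzip -t "$latest" 2>/dev/null; then
  echo "OK: $latest"
else
  echo "No valid backup found in $BACKUP_DIR"
fi
```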

🌩 Storing MongoDB Backups in AWS S3

If you’ve chosen AWS S3 as your backup storage solution, there are a few additional steps to ensure seamless integration. Here’s how to set it up:

🔑 Step 1: Obtain IAM Access Key and Secret Key

Before you begin, make sure you have an IAM user with the necessary S3 permissions. Obtain the Access Key and Secret Key for this user. These keys will allow your server to interact with your S3 bucket.

🐍 Step 2: Install AWS CLI

The AWS Command Line Interface (CLI) is a unified tool that allows you to manage your AWS services. To install it:

  1. First, ensure you have python3-pip installed:
sudo apt install python3-pip

2. Then, install the AWS CLI using pip:

pip3 install awscli

⚙ Step 3: Configure AWS CLI

Now, you’ll set up the AWS CLI with your credentials:

aws configure

This command will prompt you to enter your Access Key, Secret Key, default region, and default output format. Fill in the details accordingly.
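
Behind the scenes, aws configure stores the keys in ~/.aws/credentials (and the region and output format in ~/.aws/config). For reference, the credentials file looks like this; the values shown are placeholders:

```
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
```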

🚀 Step 4: Test Your Setup

With everything in place, you can now rerun your backup script with AWS S3 integration enabled. After execution, check your S3 bucket to ensure the backup was successfully uploaded.

🕰 Setting Up a Cron Job for Automated Backups

After verifying that your backup script works flawlessly and the file has been saved successfully, it’s time to automate the process. By setting up a cron job, you can ensure that your script runs at regular intervals without manual intervention.

📝 Step 1: Open the Crontab

To set up or edit a cron job, use the following command:

crontab -e

⏰ Step 2: Schedule the Script

Add the following line to schedule your backup script:

30 18 * * * /path/to/ >> /root/logfile.log 2>&1

In the above command:

  • The script is scheduled to run daily at 6:30 PM UTC, which corresponds to 12:00 AM IST.
  • /path/to/ should be replaced with the actual path to your backup script.
  • The >> /root/logfile.log 2>&1 part ensures that any output (including errors) from the script is saved to a log file located at /root/logfile.log.

🔧 Step 3: Adjusting the Time (Optional)

If you wish to change the time the script runs, adjust the 30 18 * * * part of the command. The first number represents minutes and the second represents hours in UTC. Adjust these values according to your needs.
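
For example, two alternative schedules using the same placeholder script path and log file:

```
# Daily at 2:00 AM UTC (7:30 AM IST)
0 2 * * * /path/to/ >> /root/logfile.log 2>&1

# Every 6 hours, at 30 minutes past the hour
30 */6 * * * /path/to/ >> /root/logfile.log 2>&1
```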

🌟 Conclusion

Databases are the lifeblood of many applications, and ensuring their safety is paramount. Through this guide, we’ve journeyed from the basics of scripting MongoDB backups to automating the process and finally, integrating with AWS S3 for secure storage. With these steps in hand, you’re not only safeguarding your data but also ensuring that it’s easily retrievable and stored efficiently.

Remember, while technology provides us with tools and automation, the real power lies in understanding and customizing these tools to fit our unique needs. Whether you’re using local storage or the cloud, the key is regular backups and periodic checks to ensure everything runs smoothly.

Thank you for joining us on this journey. We hope this guide serves as a valuable resource in your database management endeavors. Stay curious, keep learning, and always prioritize the safety of your data!

Follow me for more posts like this, and if you have any feedback, please let me know so I can keep it in mind while writing future blogs. If you’d like to read more or know more about me, here is my LinkedIn. Please don’t hesitate to leave some 👏👏👏👏👏 (an open secret: you can clap up to 50 times for a post, and it won’t cost you anything), and feel free to share this article. It really means a lot to me.