Using Rclone to sync files to Google Drive on Ubuntu

As of now, Google has yet to release an official backup/sync client for Linux.

The following write-up describes an approach for using an open-source package, rclone, to automate backups from Linux to Google Drive.

Download/install rclone:

To download and install rclone, run:

$ cd $HOME  
$ curl https://rclone.org/install.sh | sudo bash  
...  
rclone v1.53.1 has successfully installed.

This installs the rclone binary at /usr/bin/rclone.
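
To confirm that the binary is on your PATH and working, you can ask rclone to report its version (the exact version string will vary by release):

$ which rclone
/usr/bin/rclone
$ rclone version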

Before being able to configure access for rclone to our target Google Drive account, we need to enable the Google Drive API and create appropriate credentials.

Enable Google Drive API:

Log in with your Google account at https://console.cloud.google.com to begin the process of enabling the API.

  • Navigate to “APIs & Services” → “Library”
  • Search for and enable “Google Drive API”
  • Choose “Create Credentials” and continue to the next section to configure the OAuth consent screen

Create Google Drive API credentials for rclone:

From the left-hand pane of the cloud console, choose “OAuth consent screen” and perform the configuration using the following values.

  • User Type: External -> Create
  • Application name: rclone
  • Support email: enter your google account email
  • Add scopes: “Google Drive API” → “…/auth/drive.file” (this scope restricts rclone to files and folders it creates itself, rather than granting access to the whole Drive)

From the left-hand pane, choose “Credentials”, then:

  • Click on the “+ Create credentials” menu and select “OAuth client ID”
  • Application type: Desktop app
  • Name: rclone
  • Click “Create”
  • A client ID and client secret will be generated. Make a note of these, as they are required during the rclone configuration process

Configuring rclone

Begin the rclone configuration by running:

$ rclone config

Throughout the configuration process, choose/enter the following options:

  • n to add a new remote
  • name: gdrive
  • from the list of storage providers, choose the number/option corresponding to Google Drive \ “drive”
  • Google application client id: as noted from Google config
  • client secret: as noted from Google config
  • scope: 3 / access only to the folder created by rclone \ “drive.file”
  • root_folder_id: leave blank
  • service_account_file: leave blank
  • edit advanced config: N
  • remote config use auto config: Y (pick N if rclone is installed on a remote/headless server)
  • If you pick N (headless server), you need to install rclone (preferably the same version) on a local PC with a browser, then copy the command rclone generates into an rclone console on that local PC.
    • On Windows, download rclone for Windows if you haven’t already, open a terminal in the extracted folder, and follow the instructions (the command is usually run as ./rclone.exe …)
  • Open the generated URL link with your browser and complete the authentication process
  • After successfully authenticating and granting permissions to rclone, go back to the rclone shell terminal session to continue configuration
  • team drive: N
  • Y to confirm configuration is correct
  • Choose q to complete the configuration
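
Once the wizard exits, you can verify that the new remote was saved; rclone can list the configured remotes (which should include the gdrive: remote created above) and report where its configuration file lives:

$ rclone listremotes
gdrive:
$ rclone config file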

To confirm that connectivity to Google Drive works correctly, we can list our remote root directory via:

$ rclone ls gdrive:/  
       myfile.docx  
       ...  
       ...  

If this reports that the folder is empty or not found, it is because we are using the Drive API scope “…/auth/drive.file”, which only exposes files and folders created by rclone itself.
To create a folder that rclone can write to, type:

$ rclone mkdir gdrive:[FOLDERNAME]

and then check that connectivity to Google Drive is working by listing the new folder:

$ rclone ls gdrive:[FOLDERNAME]
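
As an optional sanity check (a minimal sketch, using a throwaway test file and the same [FOLDERNAME] placeholder as above), copy a small local file up and list it:

$ echo "rclone test" > /tmp/rclone_test.txt
$ rclone copy /tmp/rclone_test.txt gdrive:[FOLDERNAME]
$ rclone ls gdrive:[FOLDERNAME]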

Choosing an appropriate rclone command: copy or sync?

Depending on your backup requirements, you can choose from several different command options. Among these are copy and sync.

Copy

https://rclone.org/commands/rclone_copy/#synopsis

Copies the source to the destination. Doesn’t delete files from the destination. Doesn’t transfer unchanged files.

rclone copy source:sourcepath dest:destpath

The following command:

$ rclone copy --transfers 20 --retries 5 "/home/dev/g" "gdrive:/mybackup"

can be used to back up the contents of the local subdirectory /home/dev/g to the Google Drive location gdrive:/mybackup, where gdrive is the remote name we supplied during the rclone config process, and /mybackup is the target location within our Google Drive root directory.

The flags chosen in the above command translate to the following:

--transfers : number of file transfers to perform in parallel (20)
--retries : number of times to retry failed operations (5)

To see a full list of available flags, visit: https://rclone.org/flags/
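
After a copy has run, one way to confirm that the destination matches the source is rclone’s check command, which compares the files on both sides and reports any differences (shown here against the same example paths):

$ rclone check "/home/dev/g" "gdrive:/mybackup"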

Sync

https://rclone.org/commands/rclone_sync/#synopsis

Destination is updated to match source, including deleting files if necessary. Doesn’t transfer unchanged files.

rclone sync <flags> SOURCE [remote_name]:DESTINATION

For the automation example that follows, we'll use the rclone copy option. Tread with caution if you decide to use the sync option. A misunderstanding of how the sync command works can potentially lead to loss of data.
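
If you do decide to use sync, a cautious first step is to preview the run with rclone’s --dry-run flag (combined with -v for verbose output), which reports what would be copied or deleted without making any changes:

$ rclone sync --dry-run -v "/home/dev/g" "gdrive:/mybackup"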

Automating the rclone command with flock and cron

The rclone copy command in the previous section can be implemented as a cron job, scheduled to run at desired intervals. Before we go into the scheduling, we need to deal with the situation where a backup may still be executing while the next scheduled instance is due to commence. A simple approach to managing this scenario would be via the concept of Advisory file locking with flock.

According to the man page for flock, we can use a one-liner flock command to implement our rclone backup.

$ flock -n /tmp/google_drv_sync.lock /usr/bin/rclone copy --transfers 20 --retries 5 "/home/dev/g" "gdrive:/mybackup"

The above flock command will create the file google_drv_sync.lock (if it does not already exist) and acquire an exclusive (write) lock on the file for the duration of the rclone backup. The -n option tells flock that in the case where the lock cannot be acquired (i.e. previous backup is still running), exit immediately with return code 1 (i.e. do not wait for lock to be released).
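
To see this behaviour in action before relying on it, you can hold the lock with a dummy long-running command and attempt a second, non-blocking acquisition; per the man page, the second attempt should exit with status 1:

$ flock /tmp/google_drv_sync.lock sleep 60 &
$ # within the next 60 seconds, while the lock is still held:
$ flock -n /tmp/google_drv_sync.lock echo "got lock"
$ echo $?
1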

Enabling cron daemon

Ensure cron is enabled and running on the host.

sudo service cron start

You will also need to ensure the daemon is enabled during the reboot/startup process.

If you are using a WSL2 Linux distro, exiting the WSL2 shell does not leave the daemon running in the background, and its state is not saved; a WSL2 distro is considered offline once no shell is running in it.

To automatically re-enable the daemon once you log back into the WSL2/distro shell, you can add the following to your $HOME/.bashrc

$HOME/.bashrc

...
...

sudo service cron status > /dev/null || sudo service cron start
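
Note that this line calls sudo, so it will prompt for a password each time a new shell starts unless passwordless sudo is permitted for those two service commands. A minimal sketch of a sudoers entry, added via sudo visudo (the username dev is hypothetical; adjust the path to service for your distro):

dev ALL=(root) NOPASSWD: /usr/sbin/service cron status, /usr/sbin/service cron start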

Implementing crontab entry

Assuming our backup will run at 10 minute intervals, we can add the following crontab entry via crontab -e.

*/10 * * * * flock -n /tmp/google_drv_sync.lock /usr/bin/rclone copy --transfers 20 --retries 5 "/home/dev/g" "gdrive:/mybackup"
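
Since cron discards command output unless it is mailed somewhere, it can be useful to have rclone write its own log; rclone provides --log-file and --log-level flags for this (the log path below is only an example):

*/10 * * * * flock -n /tmp/google_drv_sync.lock /usr/bin/rclone copy --transfers 20 --retries 5 --log-file /home/dev/rclone_backup.log --log-level INFO "/home/dev/g" "gdrive:/mybackup"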

For a bare sync without flock (for example, when the job runs from root’s crontab), pass the --config=/path/to/file.conf parameter so that rclone picks up the configuration file already created above:

* * * * * rclone sync --config=/path/to/file.conf /local/path/ gdrive:/remote/path/

Monitoring locks using lslocks

To monitor the status of locks acquired at any point in time, the lslocks command can be used. Once cron has triggered the rclone copy command, the output of lslocks should show that a lock has been placed on /tmp/google_drv_sync.lock. Output from cat /proc/locks should show the lock type as Advisory.
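
For example, while a scheduled backup is running you could filter for the lock file and inspect the kernel’s lock table (commands only; the output depends on what is currently held):

$ lslocks | grep google_drv_sync
$ cat /proc/locks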

Notes

rclone is open source with source code published at https://github.com/rclone/rclone. Compiling from source is described at https://github.com/rclone/rclone/blob/master/MANUAL.md#install-from-source.

In addition to Google Drive, rclone supports a wide range of additional storage providers, including Amazon and Microsoft OneDrive.

Article adapted from https://medium.com/swlh/using-rclone-on-linux-to-automate-backups-to-google-drive-d599b49c42e8
