Following on from my first post on Personal Digital Sovereignty, I continued migrating the files and photos I found in my regular Google Drive. Interestingly, I came across the pre-2017 repository, from the time when Google decided to store photos as objects, labels and metadata rather than in a folder structure of years and months with an occasional album. After successfully migrating the imported, metadata-fixed chunks of the zipped Google Photos exports, I boldly deleted that structure, together with some forgotten GoPro raw footage on Google Drive. Rclone reports:
```text
% rclone about google:
Total:   4.003 TiB
Used:    15.634 GiB
Free:    3.720 TiB
Trashed: 0 B
Other:   273.808 GiB
```

Apparently, without photos and videos, my digital life fits in 15 GiB of storage. And that is the digital paper trail of about two decades, of which the last ten years are fully digital. Further investigation of the files in the drive was revealing in two ways. First, a lot of photos and videos were still scattered randomly through the folder structure. I used rclone ncdu to find out where the big files were hiding. Open source tooling is great, didn’t I say that? Secondly, there was a mental shift: from keeping exact and complete data to a more lenient attitude. I noticed that I started to throw away older data, even photos that were too random, didn’t spark a memory, or whose usefulness I could no longer recall. The context here is the Swiss vault, Proton Drive, and asking myself whether each item is worth keeping in there.
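For readers who prefer something scriptable over the interactive rclone ncdu view, the same "where are the big files hiding" question can be answered by parsing `rclone lsjson -R` output. This is a minimal sketch, not rclone's own tooling; the remote name `google:` matches my setup and should be adjusted to yours.

```python
import json
import subprocess


def rank_entries(entries, top=10):
    """Sort rclone lsjson entries by Size, largest first, returning (path, size) pairs."""
    ordered = sorted(entries, key=lambda e: e.get("Size", 0), reverse=True)
    return [(e["Path"], e["Size"]) for e in ordered[:top]]


def largest_files(remote, top=10):
    """List the largest files on a remote via `rclone lsjson -R --files-only`.

    Assumes rclone is installed and the remote is already configured.
    """
    out = subprocess.run(
        ["rclone", "lsjson", "-R", "--files-only", remote],
        capture_output=True, text=True, check=True,
    ).stdout
    return rank_entries(json.loads(out), top)
```

Calling `largest_files("google:")` prints nothing by itself; iterate over the result and format sizes (for example `size / 2**30` for GiB) to get a quick top-ten report.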
Ironically, I used Google Gemini to guide me through the technical bits, help me with strategic decisions and troubleshooting error codes, and even generate a blog post that I’ll copy and paste below. Gemini is generative AI, and it is great at looking things up and assisting with decisions. I have to be honest: it saved me a lot of time. But I also realised that without a proper UNIX/Linux admin background and a grasp of the bigger concepts of operating systems, networks and data, this path would be much more difficult.
Personal Digital Sovereignty touches many areas and takes time. I don’t need to win every race for the championship, but the one for personal data, freedom and dignity, I feel I did. I haven’t pulled the trigger on deleting all my photos and other data in Google Drive, but from this week on, all mobile photos are sent to Proton Drive in Switzerland.
The next race worth a blog post has already started: personal data stored by others, family, colleagues and friends (email and phone), but also all the service logins associated with an email address or phone number. More to come.
Keep qool.
Quinten
Following my previous post on the “why” behind this transition, I wanted to pull back the curtain on the “how.” For those of us in technical enablement and sales facilitation, we know that successful migrations aren’t just about moving bits; they are about maintaining data integrity and ensuring total sovereignty over your professional artifacts.
The Open Source Engine: Why Rclone?
To move nearly 30 GiB of data and over 10,000 objects from the edge of the Pacific to the Swiss Alps, I relied on Rclone. It is the “Swiss Army Knife” of cloud storage—an open-source command-line tool that embodies the principles of net neutrality and interoperability. It allowed my Mac to act as a secure, high-performance bridge, pulling data from Google, encrypting it locally in RAM, and pushing it to the zero-knowledge vault of Proton Drive.
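For context, the bridge relies on two remotes defined in rclone's configuration, created interactively with `rclone config`. A rough sketch of what such a config looks like is below; the remote names, backend fields and values are illustrative, not a copy of my actual file:

```ini
# ~/.config/rclone/rclone.conf (abridged; generated by `rclone config`)
[google]
type = drive
# OAuth token is added by the interactive config flow

[proton]
type = protondrive
username = you@proton.me
# password is stored obscured by rclone config; 2FA is handled interactively
```

With both remotes in place, `google:` and `proton:` become addressable endpoints for any rclone command, and data streamed between them is encrypted client-side before it reaches Proton's zero-knowledge storage.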
The Tactical Playbook: Essential Commands
Navigating this path required more than a simple copy command. To handle high-latency routes and API limitations, I refined my flags to ensure stability and efficiency:
- Audit and Visibility: Before moving a single byte, I ran `rclone size google:` and `rclone about google:` to establish a baseline of my “Used” vs. “Other” storage.
- The Bridge Command: To move the data without hitting API rate limits or triggering gateway timeouts, I used:

```bash
caffeinate rclone copy google: proton:MyDrive \
  --ignore-existing \
  --progress \
  --transfers 1 \
  --buffer-size 512M \
  --low-level-retries 20
```

- Command Logic:
  - `--transfers 1`: Crucial for large media files (like my professional workshop recordings) to avoid over-saturating the encrypted connection.
  - `--buffer-size 512M`: Pre-loads file chunks into local RAM to smooth out network stutters during the trans-Pacific hop.
  - `--ignore-existing`: Allows for an iterative migration, picking up only what hasn’t already landed in the vault.
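Because `--ignore-existing` makes every pass pick up only what is missing, the bridge command lends itself to a simple retry loop. The sketch below wraps it in Python; the remote names are mine, `caffeinate` is macOS-specific, and the pass/pause counts are arbitrary assumptions rather than rclone recommendations.

```python
import subprocess
import time


def build_copy_cmd(src="google:", dst="proton:MyDrive"):
    """Assemble the rclone bridge command with the flags discussed above."""
    return [
        "caffeinate",               # keep the Mac awake for the long transfer
        "rclone", "copy", src, dst,
        "--ignore-existing",        # skip anything already landed in the vault
        "--progress",
        "--transfers", "1",         # one large file at a time
        "--buffer-size", "512M",    # absorb network stutters in RAM
        "--low-level-retries", "20",
    ]


def run_until_clean(max_passes=5, pause=60):
    """Re-run the copy until rclone exits cleanly or we give up.

    Each pass only transfers the remainder, so restarting is cheap.
    """
    for _ in range(max_passes):
        if subprocess.run(build_copy_cmd()).returncode == 0:
            return True
        time.sleep(pause)  # back off before the next pass
    return False
```

On Linux the `caffeinate` prefix would be dropped or swapped for an equivalent such as `systemd-inhibit`.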
The “Ghost” in the Storage: 273 GB vs. 15.6 GB
A major technical hurdle in the Google ecosystem is distinguishing between Google Drive (standard file hierarchies) and Google Photos (a database-driven media blob). My final audit revealed the disparity:
```text
Total: 4.003 TiB
Used:  15.634 GiB
Other: 273.808 GiB
```
The 15.634 GiB represents my curated professional life: years of sales training content, facilitator guides, and my journey through Auckland West Toastmasters. The 273.808 GiB of “Other” data is the “Ghost” of my Google Photos library. While the files were deleted from my phone, Google holds onto this data in a separate silo. Reclaiming my sovereignty required a two-pronged attack: mirroring the Drive files via Rclone and then authorizing a manual purge of the Photos archive to zero out that “Other” footprint.
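The size of the "Ghost" can be quantified directly from `rclone about <remote> --json`, which reports the same figures as byte counts under lowercase keys. A small sketch of that arithmetic (the key names follow rclone's JSON output; verify against your own remote):

```python
import json


def storage_shares(about_json):
    """Compute how much of the occupied space is 'other' (the Photos ghost).

    `about_json` is the output of `rclone about <remote> --json`,
    with byte counts under keys like "used" and "other".
    """
    about = json.loads(about_json)
    used = about.get("used", 0)
    other = about.get("other", 0)
    occupied = used + other
    return {
        "used_gib": used / 2**30,
        "other_gib": other / 2**30,
        "other_share": other / occupied if occupied else 0.0,
    }
```

Feeding in my numbers (15.634 GiB used, 273.808 GiB other) shows that roughly 95% of the space I occupied at Google was the invisible Photos silo, not my file hierarchy.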
Technical Resilience: Solving the 422 Loop
No migration is without friction. I frequently encountered 422 Unprocessable Entity errors (Code 2501). This occurs when the server’s metadata index is out of sync with its physical storage—Proton would claim a file existed, but then fail to find it during the upload handshake.
The Solution: I applied a “Rename and Push” strategy. By slightly renaming stubborn files (e.g., adding a _final suffix), I forced a new metadata entry, bypassing the corrupted index and landing the data successfully.
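The rename step is easy to do by hand for a handful of files, but a one-liner helps when several are stubborn. A minimal local-rename sketch (the `_final` suffix mirrors the strategy above; after renaming, re-running the copy with `--ignore-existing` pushes only the renamed files):

```python
from pathlib import Path


def rename_stubborn(path, suffix="_final"):
    """Rename a file that keeps failing, e.g. talk.mp4 -> talk_final.mp4.

    A fresh name forces the server to create a new metadata entry on the
    next `rclone copy`, sidestepping the out-of-sync index.
    """
    path = Path(path)
    target = path.with_name(path.stem + suffix + path.suffix)
    return path.rename(target)  # returns the new Path
```

Alternatively, `rclone moveto` can rename on the remote side, but renaming locally before the push is what worked here.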
The Sovereignty Mindset
As a professional facilitator, I believe in sharing knowledge freely so the next traveler can find a clearer path. We often accept “walled gardens” for convenience, but as teams downsize and the tech landscape shifts, owning your own “Project Management” and “Leadership” archives is a non-negotiable professional baseline.
By using open-source tools like Rclone and Handbrake, we reduce our dependency on proprietary black boxes. My digital history—from my icebreaker speeches to complex enablement planning—is now in a vault where I alone hold the keys.
The path is now travelled; I hope these markers help you secure your own digital borders.
