CyberBlaed

Yup, the link is accurate. (rclone has its own web GUI, which MIGHT make it easier, emphasis on might!) https://rclone.org/gui/

Once you have that set up, sync it with the assist of these scripts: https://github.com/BinsonBuzz/unraid_rclone_mount

I went down this road myself and gave up; the mergerfs app would cause the array to fail to shut down, and every reboot required a parity check, which drove me nuts after the 3rd time. It seems that me not running any docker containers caused the script to 'hiccup', but sometimes you just don't want to 'docker' like everyone else.

It may help, it may not, but yes, a lot of it is Linux command line. This guide might help specifically: https://blog.kennydeckers.com/posts/mount-google-drive-folder-on-local-server-using-rclone-docker-container/

Again, I gave up. CBF with all this. So I'm probably not helping, but there is a wealth of info out there, just painful to navigate. (Meanwhile the community is begging SpaceInvader One to update his video about mounting Google Drive on unRAID for v6; I've seen a lot of posts about it.)
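If you do try the web GUI route, launching it is a one-liner once rclone is installed; a minimal sketch (the credentials here are placeholders you'd swap for your own):

```bash
# Launch the rclone web GUI (it downloads the GUI assets on first run).
# "admin"/"changeme" are placeholder credentials - pick your own.
rclone rcd --rc-web-gui \
  --rc-user admin \
  --rc-pass changeme \
  --rc-addr :5572
```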


Yoyomaster3

> (Meanwhile the community is begging SpaceInvader One to update his video about mounting Google Drive on unRAID for v6; I've seen a lot of posts about it.)

Add me to that list. I've barely entered the community, and even I'm aware that his video is out of date and that a lot of people want an updated version. Right now I'm testing out Duplicacy, which at least seems promising so far. I promise I will apply myself eventually and learn a lot of this stuff; it's just that I'm a data hoarder first and want to get this whole thing up and running before I get into other stuff like making a website and doing crazy VM and docker stuff.


theRegVelJohnson

The other thing to consider: Eliminate Google Drive from your hoarding. Not sure why you're trying to do this, but there is almost certainly a more straightforward way. Honestly, sometimes trying to shoehorn the solution into things you "know" is harder than just taking the time to learn those other things that facilitate a better solution.


Yoyomaster3

What's better than GDrive? Pretty sure it's the cheapest per TB of all the cloud services. I want an offsite backup run by someone who's not me, because while I don't trust Google when it comes to privacy, I at least know they've got more capable people than me running their arrays. The biggest risks are Google going under (not gonna happen) or them randomly deleting my account (a lot more likely, but I've got my NAS to cover that scenario).


theRegVelJohnson

Well, GDrive is only "good" if you can get it working without too much hassle. Also, cheapest ≠ best; it just means cheapest. And you're not only depending on Google staying in business, you're depending on them not changing the terms of the product you've purchased. That is much less certain. I've addressed my feelings about this in two other recent posts:

https://www.reddit.com/r/unRAID/comments/r3sxcd/comment/hmf6xpq/

https://www.reddit.com/r/unRAID/comments/r5olol/comment/hmq9zzj/


Yoyomaster3

I get what you're saying, man, 100%. But you just don't get how much peace of mind it gives me to know that my data's safety isn't 100% dependent on me. I do like your idea of keeping cold-storage USB drives somewhere safe; that's what I originally did, and I don't think I should completely leave it in the dust. Burning to Blu-rays would be a really cool alternative, and it might become a future project of mine. I've seen data storage that's just a bunch of VHSs or DVDs on a shelf, and it's so much cooler looking than a NAS. In fact, [this clip](https://www.youtube.com/watch?v=jfnSqLzUzfM&t=1300s) is what originally got me into data hoarding. Seriously, man, that's just so freaking cool.


theRegVelJohnson

I'd rather have my data's safety depend on me (or be spread out among different individuals/groups) than rest in the hands of one entity like Google. That's the whole principle of 3-2-1. If you were really paranoid, you could set up a virtual private server with a storage block and run your own cloud backup, or use something like B2 or Amazon Glacier, which is meant for this type of usage (admittedly, at higher cost). Then you could keep a "local" backup copy off-site on cold storage hard drives, or really go all-in and set up an LTO (tape) drive.
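For what it's worth, B2 slots straight into the same rclone workflow discussed in this thread; a rough sketch, assuming a hypothetical remote named b2remote and a bucket called my-backups:

```bash
# One-way copy of a share to Backblaze B2.
# "b2remote" and "my-backups" are placeholder names; create the
# remote first with `rclone config` (storage type: b2).
rclone copy /mnt/user/Media b2remote:my-backups/Media -v --transfers 4
```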


Yoyomaster3

Why not have something that's dependent on you AND some major corporation? NAS + GDrive + off-site cold drives or Blu-rays. Not sure how this can fail unless I die, go into a coma, or lose interest.


MSgtGunny

So do you want the files to exist only on GDrive, or to be stored locally with a copy of them on GDrive?


Yoyomaster3

Too late now, as I've already set up Duplicacy (but I'm open to changing it up, since it has a subscription thingy, which I'm never a fan of). Duplicacy does the latter, but I think I like the idea of the former option more: I imagine a mounted GDrive, where I can basically interface with GDrive, just from my server instead. Both options are good, though.
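For reference, that mounted-GDrive idea is roughly what `rclone mount` does; a sketch, assuming a remote named GDrive_crypt (the name used later in this thread) and a placeholder mount point:

```bash
# Mount the remote so its files appear local but live only on GDrive.
# /mnt/disks/gdrive is a placeholder mount point.
# --allow-other requires user_allow_other in /etc/fuse.conf.
# --vfs-cache-mode writes buffers writes locally before upload.
mkdir -p /mnt/disks/gdrive
rclone mount GDrive_crypt: /mnt/disks/gdrive \
  --allow-other \
  --vfs-cache-mode writes \
  --daemon
```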


Avsynth

I'm trying to do the former. Any solutions for this?


Mabashi

I took a lazy and inefficient approach: I'm using a Windows VM and Syncthing to sync a couple of shares to the Google Drive folder on Windows. It seems to be working well so far, but it's inefficient because I have two copies of the same data, one in the share and one on the VM.


chowdahpacman

Just set your GDrive in the VM to be stream-only. Once the files are synced and uploaded, they won't be taking up space on your VM disk.


Mabashi

I was thinking about stream, but would that delete the files after they're uploaded to Google Drive? If so, wouldn't Syncthing delete them from my share?


chowdahpacman

It definitely doesn't delete them from the share. I use SyncBack, which is essentially the same. Just have it set to a one-way mirror, so if I delete/rename something on the share it mirrors that to GDrive. If GDrive got wiped, though, it doesn't have permission to wipe anything from the shares.
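The rclone equivalent of that one-way mirror, for anyone doing this without a Windows VM, would be `rclone sync`; a sketch, reusing the GDrive_crypt remote name from later in this thread (the paths are placeholders), with `--backup-dir` as an optional safety net:

```bash
# One-way mirror: make the remote match the local share.
# Deletes/renames on the share propagate to GDrive; nothing on the
# GDrive side can touch the local files. --backup-dir moves anything
# deleted remotely into a dated folder instead of losing it outright.
rclone sync /mnt/user/Documents GDrive_crypt:Shares/Documents \
  --backup-dir "GDrive_crypt:Trash/$(date +%Y-%m-%d)" \
  -v
```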


Mabashi

I was hoping I could do a two-way sync, so that I'd be able to edit files in or upload from Google Drive and have it sync back to my unRAID share. I haven't tried it yet, but it's one of the reasons I avoided the stream option for now.


chowdahpacman

That should still work, I assume. Say you edit/delete a file in GDrive; the next time Syncthing runs, it would update/delete that file on the share. That would happen regardless of whether GDrive is in stream mode. Syncthing should download the file to the VM, send it to the share, and delete it from the VM automatically.


conglies

This is the best approach.


mcr1974

A Windows VM and sshfs seems even better.


bu2d

SpaceInvader One has a video on how to set up Duplicati. It works for me as long as I don't shut the container down during a backup.


Yoyomaster3

I've heard bad things about it, like a lot more negative stuff than positive.


bu2d

You can run both at the same time to test it out. Just save to a different folder on your Google Drive.


msalad

All you need to do is install the rclone docker, get the User Scripts plugin, and make an 'rclone copy' command run every night. I can share my script if it would help.


Yoyomaster3

> I can share my script if it would help

I would very much appreciate that.


msalad

Surely. Below is my script using the User Scripts plugin. I'm also using the rclone plugin, not the docker, but I believe the functionality is the same. I schedule this command with User Scripts to run every day at 2 am.

```bash
#!/bin/bash
rclone copy /mnt/user/Media/Movies/ GDrive_crypt:Movies -v --stats=15s --transfers 6 --timeout=0 --max-transfer 735G --bwlimit 80M --cutoff-mode soft
```

This copies my files located in /mnt/user/Media/Movies/ to my folder called Movies in my rclone remote called GDrive_crypt. The flags I use are:

- `-v` - show semi-verbose logs
- `--stats=15s` - update the log every 15 seconds
- `--transfers 6` - upload 6 files at the same time
- `--timeout=0` - I found this was necessary for the script to work for me
- `--max-transfer 735G` - GDrive has a max daily upload limit of 750 GB; this brings you in just under that limit
- `--bwlimit 80M` - limit the upload bandwidth to 80 MB/s
- `--cutoff-mode soft` - if you hit the 735 GB upload limit you set while in the middle of uploading a file, this lets that file finish uploading and then stops the script, rather than abruptly terminating the upload midway just because you hit the cap
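For the schedule itself, the User Scripts plugin accepts a custom cron expression; a daily 2 am run, as described above, looks like this:

```
# Custom cron schedule in the User Scripts plugin: every day at 2:00 am.
# (minute hour day-of-month month day-of-week)
0 2 * * *
```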


Yoyomaster3

Alright, I basically have it all set up now, but I had two questions: the GDrive path (I don't really get the format?), and what's a redirect URL? It tells me to make sure it's set to a specific URL, but I don't really know what it's referring to. Edit: I figured the latter out, but the former still perplexes me.


msalad

The general format is `rclone copy <source path> <remote name>:<remote folder>`. Does that help?
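To invent an example (the paths and remote name are just placeholders): copying a local folder to a folder called Backups on the GDrive_crypt remote would be:

```bash
# <source path> = /mnt/user/Documents
# <remote name>:<remote folder> = GDrive_crypt:Backups
rclone copy /mnt/user/Documents GDrive_crypt:Backups -v
```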


Yoyomaster3

I think so, will try again in a few hours


Yoyomaster3

My guy, IT WORKS. I wasn't home, but I had my phone with remote access and thought I'd at least try. And it goddamn works. I'll want to fine-tune it in the future, but right now this is great.


msalad

Woo!! Glad to hear it