Google Drive mounted in Windows/Linux

Concentric

Senior Member
Joined
Feb 16, 2017
Messages
783
Going to give it a try. Moved to Linux a while back and one thing that's missing is a decent sync for Google. Been using google-drive-ocamlfuse but it seems to be intermittent.
If you are using Linux, rclone is an absolute breeze!
Easy to set up, use in scripts, etc.
All my uploads are done automatically after being downloaded by Sonarr and Radarr.
I can share the script if anyone is interested.
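Since a few people asked, here is a minimal sketch of what such a post-download upload script might look like. The remote name `gdrive:` and all paths are assumptions; adjust them for your own setup.

```shell
#!/usr/bin/env bash
# Move finished downloads from the local library to a Google Drive
# remote. "gdrive:" is an assumed rclone remote name; paths are examples.
set -euo pipefail

SRC="/data/media"      # where Sonarr/Radarr place completed files
DEST="gdrive:media"    # rclone remote:path on Google Drive

# "rclone move" uploads, verifies, then deletes the local copy.
# --min-age skips files that are still being written.
rclone move "$SRC" "$DEST" \
  --min-age 15m \
  --transfers 4 \
  --log-file "$HOME/rclone-upload.log" \
  --log-level INFO
```

Sonarr and Radarr can trigger a script like this from their custom-script connection settings, or it can simply run from cron every few minutes.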
 

HeftyCrab

Expert Member
Joined
Mar 26, 2009
Messages
2,262
Going to give it a try. Moved to Linux a while back and one thing that's missing is a decent sync for Google. Been using google-drive-ocamlfuse but it seems to be intermittent.
Please report back on how it went. I just tried using the GNOME Accounts option but it's extremely slow on my 10 Mbps line. Don't know why. On Windows using Google Sync it's super fast.
 

Fulcrum29

Honorary Master
Joined
Jun 25, 2010
Messages
29,930
I remember LTT doing this to hoard data on the entry-level unlimited-tier business plan, but they ran into severe throttling issues. Wendell at Level1Techs then explained that, through testing, they determined 150 TB seems to be the point where Google starts throttling you.

You still have unlimited backup storage, but your download and upload speeds are impacted until you return below the ‘acceptable’ threshold. Google isn’t clear on this in its terms, so I guess they can change the threshold as they please to maintain optimal service levels.

This is ignoring the API queries.

150 TB should be more than enough… really. Unless you want to push 4K RAW videos onto the service.
 

cavedog

Honorary Master
Joined
Oct 19, 2007
Messages
14,872
I remember LTT doing this to hoard data on the entry-level unlimited-tier business plan, but they ran into severe throttling issues. Wendell at Level1Techs then explained that, through testing, they determined 150 TB seems to be the point where Google starts throttling you.

You still have unlimited backup storage, but your download and upload speeds are impacted until you return below the ‘acceptable’ threshold. Google isn’t clear on this in its terms, so I guess they can change the threshold as they please to maintain optimal service levels.

This is ignoring the API queries.

150 TB should be more than enough… really. Unless you want to push 4K RAW videos onto the service.
Even with the unconfirmed throttle, which Google obviously denies (but let's be honest, it can't truly be unlimited because their hard drives need to be somewhere), 150 TB for ~R180 a month, depending on the exchange rate, is still pretty damn good. Let's say you pay for 24 months up front. That means for 2 years you never have to buy another HDD again, and you know your data will be there even if hard drives fail. R4,320 for 2 years of storage, even with the unconfirmed 150 TB throttle, is pretty damn good. Wow
 

cavedog

Honorary Master
Joined
Oct 19, 2007
Messages
14,872
Because look at the OP's post ... :D And we all know what our Plex and other libraries normally host.
Thank you for your concerns. Duly noted. I think the agencies you speak of would rather focus their attention on those Plex and Emby share services that use Google G Suite to store something like 800 TB of data and charge people a monthly fee for it. These agencies won't know what is stored in your drive unless you start sharing it, and obviously that is an issue even with P2P and Usenet.
 

sand_man

Honorary Master
Joined
Jun 4, 2005
Messages
28,995
Wow, nice. 100 Mbps can actually push quite a bit of data, hey. Saw you hit the 750 GB daily limit. :p
Yeah, the upload seemed to stall yesterday at around 5-ish. By then I had pushed around the 750 GB mark, so I assumed that was Google calling time out, but it maxed out again quite soon after, so maybe it was just a localised glitch?

Storage is sitting at 1.5 TB used and still pushing content! Busy changing the file path settings in Sonarr so future material will automatically land on Google Drive.
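For anyone following along: pointing Sonarr at cloud storage usually means mounting the remote as a local path first. A sketch, assuming a remote named `gdrive:` (the mount point and flags are examples, not a recommendation):

```shell
# Mount Google Drive so Sonarr can use it as a normal root folder.
# --allow-other lets other users (e.g. the sonarr service user) see the
#   mount (requires user_allow_other in /etc/fuse.conf).
# --vfs-cache-mode writes buffers writes locally so apps can seek/rename.
mkdir -p /mnt/gdrive
rclone mount gdrive:media /mnt/gdrive \
  --allow-other \
  --vfs-cache-mode writes \
  --daemon
```

Once mounted, /mnt/gdrive can be added in Sonarr as a root folder like any local disk.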
 

cavedog

Honorary Master
Joined
Oct 19, 2007
Messages
14,872
Yeah, the upload seemed to stall yesterday at around 5-ish. By then I had pushed around the 750 GB mark, so I assumed that was Google calling time out, but it maxed out again quite soon after, so maybe it was just a localised glitch?

Storage is sitting at 1.5 TB used and still pushing content! Busy changing the file path settings in Sonarr so future material will automatically land on Google Drive.
Nice. I use rclone on a Feralhosting seedbox too, and each box has a shared 20 Gbps port, which is so nice I've filled my drive up with about 20 TB or so.
 

sand_man

Honorary Master
Joined
Jun 4, 2005
Messages
28,995
In hindsight I probably should have tested a few random files first, to ascertain how efficiently the setup is likely to perform before committing so heavily.
 

sand_man

Honorary Master
Joined
Jun 4, 2005
Messages
28,995
Nice. I use rclone on a Feralhosting seedbox too, and each box has a shared 20 Gbps port, which is so nice I've filled my drive up with about 20 TB or so.
Yeah, I'm still a bit concerned that data consumption will be double what it would normally be. This solution is decent and a win-win for the consumer, but the ISP is going to take a hiding!! I suppose the consolation is that it's locally peered. Don't ISPs pay next to nothing for local data consumption?
 

Speedster

Executive Member
Joined
May 2, 2006
Messages
8,064
Would a similar setup to this work using OneDrive? Does Microsoft also peer locally?
 

cavedog

Honorary Master
Joined
Oct 19, 2007
Messages
14,872
Would a similar setup to this work using OneDrive? Does Microsoft also peer locally?
Yes, you can follow exactly the same instructions, except you choose OneDrive when setting up the rclone remote; as for the API settings, I'm not sure those are still relevant. Yes, OneDrive also peers locally with all ISPs at Teraco JHB.
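For the curious, the OneDrive variant is roughly the following; the remote name and mount point here are assumptions:

```shell
# One-time setup: create a remote of type "onedrive" via the
# interactive wizard (it opens a browser for Microsoft sign-in).
rclone config

# Then mount it like any other rclone remote.
mkdir -p ~/onedrive
rclone mount onedrive: ~/onedrive --vfs-cache-mode writes
```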
 

sand_man

Honorary Master
Joined
Jun 4, 2005
Messages
28,995
Nice!!! So what happened when you were only getting 20 MB/s before :unsure:
One was copy/paste (20 MB/s), the other was using the Rclone Browser download feature... The upload, though, stalled at around 4pm and has been dead in the water ever since. Maybe I've hit that 750 GB cap? It's been around 1.5 TB over the last 45 hours... :cautious:
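That stall is consistent with Google Drive's widely reported ~750 GB per-user daily upload quota. If you're scripting uploads, rclone has a Drive-specific flag that aborts cleanly when the quota error comes back instead of retrying all night (the remote and path here are assumptions):

```shell
# Stop the transfer as soon as Google returns its daily upload-quota
# error, rather than hammering the API until the window resets.
rclone copy /data/media gdrive:media \
  --drive-stop-on-upload-limit
```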
 

cavedog

Honorary Master
Joined
Oct 19, 2007
Messages
14,872
One was copy/paste (20 MB/s), the other was using the Rclone Browser download feature... The upload, though, stalled at around 4pm and has been dead in the water ever since. Maybe I've hit that 750 GB cap? It's been around 1.5 TB over the last 45 hours... :cautious:
Okay, so copy and paste in Windows has some sort of limitation. Interesting. I didn't notice it as I only have a 200 Mbps line.
 

sand_man

Honorary Master
Joined
Jun 4, 2005
Messages
28,995
Okay, so copy and paste in Windows has some sort of limitation. Interesting. I didn't notice it as I only have a 200 Mbps line.
Maybe, yeah... It was all over the place, tbh... 20 MB/s is probably a bit optimistic. It fluctuated wildly, but on average it was probably around 20 MB/s...

I don't know what rclone is downloading. The upload I can understand, but what the hell was it downloading between 2am and 5am??

Eish, this setup is going to be incredibly data-intensive! @cavedog I'm starting to have doubts... :unsure: