I was able to download a public shared file using this command:
$ wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O FILENAME
FILEID must be replaced by the actual file ID.
FILENAME is the path/filename where the download will be stored.
Note that you cannot use a folder ID in place of a file ID. To find a file ID, I used View Source on a folder view, where I found HTML such as <div id="entry-0B0jxxycBojSwVW.... The string starting with 0B was the file ID.
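The download step above can be wrapped in a small helper. This is a minimal sketch; the function names (gdrive_url, gdrive_get) are my own, not part of any tool, and the URL format is the one shown above.

```shell
#!/bin/sh
# Sketch: download a publicly shared Google Drive file by its file ID.

gdrive_url() {
    # Build the export-download URL for a given file ID ($1).
    printf 'https://docs.google.com/uc?export=download&id=%s\n' "$1"
}

gdrive_get() {
    # $1 = file ID, $2 = destination path/filename
    wget --no-check-certificate "$(gdrive_url "$1")" -O "$2"
}
```

Usage would then be `gdrive_get FILEID FILENAME`, with the same placeholders as before.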
A newer, open-source, multi-platform client written in Go, called drive, is quite nice and full-featured, and is also in active development.
The pull command downloads data from Google Drive that does not exist locally, and deletes local data that is not present on Google Drive. Run it without any arguments to pull all of the files from the current path:
$ drive pull
Pulling by matches is also supported:
$ cd ~/myDrive/content/2015
$ drive pull --matches vines docx
See the project's documentation for further examples; these are just the tip of the iceberg.
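The two-step pull above (cd into the mirrored folder, then pull by matches) can be sketched as a small function. This is an illustrative wrapper only: the DRIVE_CMD variable and pull_matches name are my own, and the directory and patterns are the examples from above.

```shell
#!/bin/sh
# Sketch: pull matching files from a drive-mirrored folder.
DRIVE_CMD="${DRIVE_CMD:-drive}"   # override (e.g. with echo) for dry testing

pull_matches() {
    # $1 = local mirror directory, remaining args = patterns to match
    dir=$1; shift
    cd "$dir" || return 1
    "$DRIVE_CMD" pull --matches "$@"
}
```

For example, `pull_matches ~/myDrive/content/2015 vines docx` reproduces the commands above.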
There is a great program in the Ubuntu 17.04 repositories called rclone:
$ sudo apt update && sudo apt install rclone
$ rclone config
Then follow the steps to set up your Google Drive remote.
Now you can rclone copy and sync anything you want. The project supports many cloud storage providers, for example:
$ rclone sync /home/<usrname>/Desktop/yourfolder gdrivename:yourfolder
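A safer way to run that sync is to preview it first. This sketch assumes a remote named gdrivename configured as above; the RCLONE variable and sync_folder name are my own additions for illustration.

```shell
#!/bin/sh
# Sketch: sync a local folder to an rclone remote, previewing first.
RCLONE="${RCLONE:-rclone}"        # override (e.g. with echo) for dry testing

sync_folder() {
    # $1 = local folder, $2 = remote:path destination
    "$RCLONE" sync --dry-run "$1" "$2"   # show what would change
    "$RCLONE" sync "$1" "$2"             # then actually sync
}
```

For example, `sync_folder ~/Desktop/yourfolder gdrivename:yourfolder` mirrors the command above.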
(By the way, I uploaded about 600 GB and transferred 1 TB with rclone in a few hours: Google Drive to Google Drive, Google Drive to Ubuntu, and Ubuntu to Google Drive. It has worked terrifically for weeks in a row!)