Comments (24)

zeshanb avatar zeshanb commented on May 12, 2024

Nice catch. The way you described the sync command is the way it should
behave, but what I've noticed is that rclone does actually start deleting the
source data by default if I don't use the correct options for the rclone command.

On Tue, Mar 17, 2015 at 10:27 AM, gustavorochakv wrote:

Sync command documentation is incorrect
The documentation states that the sync command "Sync the source to the
destination" and that it "Deletes any files that exist in source that don't
exist in destination".
It is actually the other way around: the source is the master, and the
destination is changed to match it (deleting files, if necessary).
Tested on versions 1.10 (Linux) and 1.12 (both Linux and Windows), and the
behaviour is the same on all of them.

No information about versioning
The documentation does not mention that both the copy and the sync
commands create a new version of files that have been updated on the
source (at least on Google Drive), instead of deleting the old one and
adding the updated version like other similar tools do.
This was one of the main reasons I chose this tool, and I only
found it out through testing. It might be a good idea to raise
awareness of this feature.


from rclone.

ncw avatar ncw commented on May 12, 2024

@gustavorochakv Hmm, yes that is a little confusing!

You are correct about what it actually does. How about re-wording it like this?

    Sync the source to the destination, changing the destination only.  Doesn't transfer
    unchanged files, testing first by modification time then by
    size.  Destination is updated to match source, including deleting files if necessary.
    Since this can cause data loss, test first with the --dry-run flag.
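
In concrete terms, the safe workflow that wording implies can be sketched as follows (the remote name remote: and the paths here are placeholders, not from this thread; adjust them to your own config):

```shell
# Preview the sync first: nothing is transferred or deleted,
# rclone only reports what it would do.
rclone --dry-run sync /home/user/Documents remote:Documents

# If the preview looks right, run the real sync. The destination
# (remote:Documents) is changed to match the source, including
# deleting destination files that no longer exist in the source.
rclone sync /home/user/Documents remote:Documents
```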

As for versioning - I wasn't actually aware that rclone created a new version, but I've just verified that it does. What it does (on Google Drive) is: if the destination file doesn't exist it creates it, but if it does exist then it updates it. I guess the update is what creates the versions. The API docs for update don't state that it makes a new version, but it makes sense that it does.

I can document that in the google drive docs, and that will tell everyone that this behaviour is deliberate.

I take it you think this is a good thing?

ncw avatar ncw commented on May 12, 2024

@zeshanb If you can reproduce rclone deleting the source data (that would be a bad bug) can you add an issue with instructions on how to reproduce?

Thanks

Nick

gustavorochakv avatar gustavorochakv commented on May 12, 2024

@ncw Yes, the reword covers it nicely.

About versioning, it is documented under the "newRevision" parameter:

If true or not set, a new blob is created as head revision, and previous revisions are preserved (causing increased use of the user's data storage quota).

I must note, however, that the information in that API doc is incorrect: adding the version/revision does not increase the usage of the user's data storage quota. That would only happen if the "pinned" parameter were passed as "true" (it defaults to "false"), or if the user manually entered the version management interface and ticked the "Keep forever" checkbox, as per this link.

This could be an interesting addition to rclone: having the option to set revisions as pinned by default. This would be particularly useful for Google Apps for Business accounts with unlimited storage (Google Drive for Work, or whatever they're calling it now). I might look into it in the future, though I probably won't have time to do so until late May (especially since I don't know a thing about the Go language).

In case you're not familiar with the version control system on Google Drive: non-pinned revisions are deleted after 30 days or 100 revisions (whichever comes first), and these revisions do not count towards a user's storage quota.
Pinned revisions are kept indefinitely, but they do count towards a user's storage quota. A file can have a maximum of 200 pinned revisions.

I was using a few alternative solutions before trying rclone (mainly a few forks of Grive), but whenever a file changed, they would delete the file on Google Drive and upload the new one.
Because rclone adds the change as a revision instead, if I find out that I made a mistake before the last sync, I can download specific older versions and roll back. This is a fairly useful feature.

zeshanb avatar zeshanb commented on May 12, 2024

Will see about starting a VM to check it out.

Zeshan

rtg20 avatar rtg20 commented on May 12, 2024

"Nice catch. The way you described the sync command is the way it should
behave but what i've noticed is rclone does actually start deleting the
source data by default if I don't use correct options for rClone command."

zeshanb, please can you specify the 'correct options' so I can try them? Have been searching for a good way to sync my stuff with Google Drive but don't want to accidentally delete anything. Thank you!

zeshanb avatar zeshanb commented on May 12, 2024

Sure rtg20,

rclone config

Use the command-line wizard to configure source and destination; this will
take the confusion out. I like to call the local file system "local" and
the remote "remote", or I use "source" and "destination".

I hope the main developer will chime in if I have understood anything
below incorrectly:

If you are copying from remote to local, to get an output of what it's doing:

rclone --dry-run --transfers=2 -q --checkers=1 copy remote:drive local:./

With the --dry-run option, running the copy subcommand won't delete anything,
but will just show what files would be copied from remote to local. The copy
subcommand doesn't delete files from the destination. The sync subcommand,
on the other hand, deletes any files in the destination that don't exist in
the source.

--transfers=2
Just as with FTP, doing simultaneous transfers from remote locations
can have issues; if you have decent broadband you should be able to sustain
two simultaneous transfers for an extended time. You are at the mercy of the
network admin :D

-q
This way you only get a printout of the most important info, i.e. what file
was transferred, and not all the other gobbledygook diagnostics.

--checkers=1
This way the tool has to do less checking, putting less stress on your
system and the overall process.

There are more options here for both subcommands, copy and sync:

http://rclone.org/docs/

Regards,
Zeshan

zeshanb avatar zeshanb commented on May 12, 2024

--dry-run will only output on the terminal what would take place, and won't
copy or move anything. Run the command again without --dry-run if you are
happy with what it's doing.

Regards,
Zeshan

rtg20 avatar rtg20 commented on May 12, 2024

Thanks for the help, I got it to work. However, the dry-run option only tells me how many files it wants to transfer; it doesn't give me paths and filenames. I tried specifying verbosity, but then it listed all the files that were the same, and I have so many that I couldn't make out the ones that would transfer.

What do you think? thanks!

zeshanb avatar zeshanb commented on May 12, 2024

Remove the -q option when running the command with --dry-run. This should give
you all the info.

Zeshan

rtg20 avatar rtg20 commented on May 12, 2024

Hmm, I can't seem to get it to work. Here's what I copied and pasted from the terminal window:

richard@SERVER:~/rclone-v1.12-linux-amd64$ ./rclone --dry-run --transfers=2 --checkers=1 copy gdrive:/Documents /home/richard/Documents
2015/03/21 14:04:41 Local file system at /home/richard/Documents: Building file list
2015/03/21 14:04:42 Local file system at /home/richard/Documents: Waiting for checks to finish
2015/03/21 14:05:41
Transferred: 0 Bytes ( 0.00 kByte/s)
Errors: 0
Checks: 2697
Transferred: 4

Elapsed time: 1m0.818401526s

4 files to transfer...but which?

thanks!

zeshanb avatar zeshanb commented on May 12, 2024

Hello there,

Please run rclone config again and set up another location for your local
system; name it local and point it to /home/rxxxxxx/Documents

Simplify things a bit by pointing to the rclone binary in your .bash_profile:

PATH=$PATH:$HOME/bin:/golang/go/bin:/rclone
export PATH

You might have a bunch of paths in .bash_profile; at the end, put a colon and
the location of the rclone binary. Then exit the terminal, come back in, and
run the dry run again:

rclone --dry-run --transfers=2 --checkers=1 --log-file="~/mydrivetransfer.txt" copy gdrive:/Documents local:

Regards,
Zeshan

zeshanb avatar zeshanb commented on May 12, 2024

Your .bash_profile might look similar to this:

PATH=$PATH:$HOME/bin:/golang/go/bin:/rclone

export PATH

rtg20 avatar rtg20 commented on May 12, 2024

Thanks for the help. I was able to fix the PATH, but when I run config there's no option to change local. Here's what I have; it only allows me to configure remotes.

thanks!

richard@SERVER:~$ rclone config
Current remotes:

Name Type
==== ====
gdrive drive

e) Edit existing remote
n) New remote
d) Delete remote
q) Quit config
e/n/d/q> q

zeshanb avatar zeshanb commented on May 12, 2024

No problem. Select option n), then select local and name it "local" or as you
like. Then you will have to use the full path for your local file system:

local:/home/username/Documents

Your dry run will be like this:

rclone --dry-run --transfers=2 --checkers=1 --log-file="~/MyDriveTransferLog.txt" copy gdrive:/Documents local:/home/username/Documents

Have fun diversifying your storage. :)

Regards,
Zeshan

rtg20 avatar rtg20 commented on May 12, 2024

...ok that all worked, but I still don't get a list of files.

richard@SERVER:~$ rclone --dry-run --transfers=2 --checkers=1 --log-file="./MyDriveTransferLog.txt" copy gdrive:/Documents local:/home/richard/Documents

Transferred: 0 Bytes ( 0.00 kByte/s)
Errors: 0
Checks: 31641
Transferred: 8
Elapsed time: 12m51.297505984s

richard@SERVER:~$

then the log file has

2015/03/21 19:19:54 Can't redirect stderr to file
2015/03/21 19:19:55 Local file system at /home/richard/Documents: Building file list
2015/03/21 19:19:57 Local file system at /home/richard/Documents: Waiting for checks to finish
2015/03/21 19:20:55
Transferred: 0 Bytes ( 0.00 kByte/s)
Errors: 0
Checks: 2079
Transferred: 4
Elapsed time: 1m1.443805125s

It adds more info after every minute or so of elapsed time, but the list of files that have changed is nowhere to be found. :-(

Not sure what I'm doing wrong. Have I discovered a bug...?

thanks

Richard

zeshanb avatar zeshanb commented on May 12, 2024

Testing on a system here. Will update in a bit.

Regards,
Zeshan

zeshanb avatar zeshanb commented on May 12, 2024

It's not a bug. I've just tested on a VM here. You can run this command:

rclone --dry-run --transfers=2 --checkers=2 --log-file="/home/richard/Documents/logOfDriveTransfer.txt" -v copy gdrive:/Documents local:/home/richard/Documents

Due to the -v and --log-file options, this should output a file at
/home/richard/Documents/logOfDriveTransfer.txt

(you need to use a full path for the --log-file option)

If you are happy with the list of files in logOfDriveTransfer.txt, then run
the same command without --dry-run and the files should start downloading to
your system.

Regards,
Zeshan

rtg20 avatar rtg20 commented on May 12, 2024

thanks so much for the help! :-)

rtg20 avatar rtg20 commented on May 12, 2024

...interestingly, if a directory doesn't exist, it generates an error.

(The subdir 2015 exists on my local PC but not on Google Drive.)

2015/03/22 23:07:54 Local file system at /home/richard/Pictures/2015: Building file list
2015/03/22 23:07:54 Local file system at /home/richard/Pictures/2015: Waiting for checks to finish
2015/03/22 23:07:54 Google drive root 'Pictures/2015': Couldn't find root: Couldn't find directory: "Pictures/2015"
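
One way to see what rclone can actually find on the remote before copying is to list the remote's directories with rclone lsd (a sketch; gdrive is the remote name configured earlier in this thread):

```shell
# List the directories directly under Pictures on the remote.
# If 2015 does not appear in the output, a copy from
# gdrive:/Pictures/2015 will fail with "Couldn't find directory".
rclone lsd gdrive:/Pictures
```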

zeshanb avatar zeshanb commented on May 12, 2024

That might be resolved by putting a slash after gdrive in your rclone command,
like this: gdrive:/ (assuming you have configured Drive as a remote and named
it gdrive).

regards,
Zeshan

rtg20 avatar rtg20 commented on May 12, 2024

Actually, I think it's my fault: local should come before gdrive in the command, because local is the source of the files... oops! :-(

zeshanb avatar zeshanb commented on May 12, 2024

No worries, that's why there is a dry-run option.

ncw avatar ncw commented on May 12, 2024

This is all fixed in v1.13 now!
