
s3sync's Introduction

S3Sync

Intro

I needed to back up some stuff once while I was in the woods. Unfortunately, I couldn't find anything that was both easy and elegant enough to sync my stuff with Amazon S3.

S3Sync uses the official AWS SDK for Ruby, so it should be stable. The most sensitive parts of the code are tested, and that coverage only tends to improve; I'm crazy about testing code! :)

Code maturity

This project started as a fork of the original s3sync command, which had its last release in 2008. After a while it became a complete rewrite. That can be a good thing in many ways; however, it also means losing the maturity the old code had.

To overcome this problem, I invested time writing tests for one of the hairiest parts of the code: the sync command.

That being said, I believe there must still be a couple of silly bugs around, and I highly appreciate reports and patches (especially if they come with tests).

Installation

$ gem install s3sync

Usage

S3Sync's help command is pretty powerful, so you can get all the help you need from it. It's always ready to answer your questions:

$ s3sync help [SUBCOMMAND]

If you want to learn more about a specific command, just pass the optional [SUBCOMMAND] argument:

$ s3sync help sync

Managing buckets

The following commands are used to manage buckets themselves (a short example follows the list):

  • s3sync listbuckets: Show all available buckets
  • s3sync createbucket <name>: Create a new bucket
  • s3sync deletebucket <name> [-f]: Delete a bucket
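
For example, a typical bucket lifecycle looks like this (my-backups is just a placeholder bucket name):

$ s3sync createbucket my-backups
$ s3sync listbuckets
$ s3sync deletebucket my-backups -f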

Managing content

  • delete <bucket>:<key>: Delete a key from a bucket
  • list <bucket>[:prefix] [-m] [-d]: List items filed under a given bucket
  • put <bucket>[:<key>] <file>: Upload a file to a bucket under a certain prefix
  • get <bucket>:<key> <file>: Retrieve an object and save to the specified file
  • url <bucket>:<key>: Generate a public URL or an authenticated endpoint for the object (see the example below)
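
For example, a sketch of the usual round trip for a single file (all bucket, key, and file names here are placeholders):

$ s3sync put disc.company.com:reports/2013/08/summary.pdf summary.pdf
$ s3sync list disc.company.com:reports/2013/08
$ s3sync get disc.company.com:reports/2013/08/summary.pdf summary.pdf
$ s3sync url disc.company.com:reports/2013/08/summary.pdf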

The sync command

If you want to sync up an S3 folder with a local folder (both directions are accepted), you can use the sync command, e.g.:

$ s3sync sync Work/reports disc.company.com:reports/2013/08

The above line will sync the local folder Work/reports with the remote node disc.company.com:reports/2013/08 (that is, the bucket disc.company.com under the key prefix reports/2013/08).

The most important options of this command are:

  • --exclude=EXPR: Skip copying files that match this pattern (accepts Ruby regular expressions)
  • --keep: Keep files in the destination even if they don't exist in the source
  • --dry-run: Don't copy or delete anything; just print out what was planned (see the example below)
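
Putting those options together, a dry run that skips temporary files might look like this (the pattern, folder, and bucket are placeholders):

$ s3sync sync --dry-run --exclude='\.tmp$' Work/reports disc.company.com:reports/2013/08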

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create new Pull Request

Feedback

Reporting bugs and giving any other feedback is highly appreciated. Feel free to create a new issue if you find anything wrong or if you have any ideas to improve the project!


s3sync's People

Contributors

bitdeli-chef, clarete, enable-labs, grosser, jomunoz, ms4720, siruguri, spiffxp, takezoux2, tklovett, vyorkin


s3sync's Issues

sync doesn't update files when filesize is same, but content mismatches

I've noticed that if the old file and the new file are the same size, they don't get synced. Most of these cases are text files which have same-length strings updated, resulting in the same file size.

If I go ahead and add/remove 1 character from the text file, sync will update the file.
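
A minimal Ruby illustration of the pitfall (this is not s3sync's actual comparison code): a size-only check misses same-length edits, while a checksum check catches them.

    require 'digest'

    old_body = "hello world"   # original file content
    new_body = "hello w0rld"   # same length, different content

    # Size-only comparison reports "unchanged", so the file would be skipped.
    puts old_body.bytesize == new_body.bytesize                              # => true

    # An MD5 comparison detects the change.
    puts Digest::MD5.hexdigest(old_body) == Digest::MD5.hexdigest(new_body)  # => false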

s3sync sync shouldn't append source directory path into target path

I figured out why s3sync was always uploading everything every time. It turns out that it was appending the source directory path into the target path.

e.g. if I run:

s3sync sync source aws.example.com:target

  • source/README.txt => aws.example.com:target/source/README.txt

Using the full path doesn't work either:

s3sync sync /Users/guest/source aws.example.com:target

  • /Users/guest/source/README.txt => aws.example.com:target/Users/guest/source/README.txt

Using the current working directory doesn't work either:

s3sync sync . aws.example.com:target

  • ./README.txt => aws.example.com:target/./README.txt

I thought the above would work, but apparently target/README.txt is a different path than target/./README.txt

Any guidance would be appreciated.
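
For what it's worth, the "./" segment can be collapsed before the S3 key is built; a small sketch using Ruby's standard library (this is an illustration, not what s3sync currently does):

    require 'pathname'

    # "target/./README.txt" and "target/README.txt" are distinct S3 keys,
    # but cleanpath collapses the "./" segment before the key is assembled.
    puts Pathname.new("target/./README.txt").cleanpath.to_s   # => "target/README.txt"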

Does sync re-upload everything every time?

I've noticed that the s3sync sync command takes quite a while to run, with no indication of what it's doing. I know there's already an issue open to add a progress bar: #13

My question is: does sync re-upload/re-download everything every time it's run? I've noticed that whenever I pass --dry-run as a parameter, it lists everything, including files that have already been synced.

But I've also noticed that even when every file has already been synced, running the s3sync sync command still takes the same amount of time (~30-60 seconds).

config can't convert Symbol to String

/opt/chef/embedded/lib/ruby/gems/1.9.1/gems/s3sync-2.0.1/lib/s3sync/config.rb:87:in `update': can't convert Symbol into String (TypeError)
    from /opt/chef/embedded/lib/ruby/gems/1.9.1/gems/s3sync-2.0.1/lib/s3sync/config.rb:87:in `read'
    from /opt/chef/embedded/lib/ruby/gems/1.9.1/gems/s3sync-2.0.1/bin/s3sync:37:in `<top (required)>'
    from /opt/chef/embedded/bin/s3sync:23:in `load'
    from /opt/chef/embedded/bin/s3sync:23:in `<main>'

Changing line 77 to read
self[v.to_s] = ENV[v.to_s] unless ENV[v.to_s].nil?

fixes this. Using Ruby 1.9.1 with embedded chef.
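
For context, Ruby's ENV only accepts String keys, which is exactly why indexing it with a Symbol blows up; a minimal illustration:

    # Indexing ENV with a Symbol raises the TypeError seen in the trace above.
    begin
      ENV[:AWS_ACCESS_KEY_ID]
    rescue TypeError => e
      puts e.message   # e.g. "can't convert Symbol into String"
    end

    # Converting the key to a String first works as expected.
    puts ENV["AWS_ACCESS_KEY_ID"].inspect   # the value, or nil if unset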

newbie startup issue: listbuckets doesn't work for me

Any ideas?
Here is a truncated stack trace:

s3sync listbuckets
/Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/nokogiri-1.5.9/lib/nokogiri/nokogiri.bundle: [BUG] Segmentation fault
ruby 1.9.3p327 (2012-11-10 revision 37606) [x86_64-darwin12.2.1]

-- Control frame information -----------------------------------------------
c:0030 p:-17552646494304 s:0120 b:0120 l:000119 d:000119 TOP
c:0029 p:---- s:0118 b:0118 l:000117 d:000117 CFUNC :require
c:0028 p:0135 s:0114 b:0114 l:000113 d:000113 METHOD /Users/dcj/.rvm/rubies/ruby-1.9.3-p327/lib/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:53
c:0027 p:0250 s:0104 b:0104 l:000103 d:000103 TOP /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/nokogiri-1.5.9/lib/nokogiri.rb:28
c:0026 p:---- s:0102 b:0102 l:000101 d:000101 FINISH
c:0025 p:---- s:0100 b:0100 l:000099 d:000099 CFUNC :require
c:0024 p:0135 s:0096 b:0096 l:000095 d:000095 METHOD /Users/dcj/.rvm/rubies/ruby-1.9.3-p327/lib/ruby/site_ruby/1.9.1/rubygems/core_ext/kernel_require.rb:53
c:0023 p:0083 s:0086 b:0086 l:000085 d:000085 TOP /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/aws-sdk-1.19.0/lib/aws/s3/client.rb:20
c:0022 p:---- s:0084 b:0084 l:000083 d:000083 FINISH
c:0021 p:0042 s:0082 b:0082 l:002078 d:000081 LAMBDA /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/aws-sdk-1.19.0/lib/aws/core/configuration.rb:473
c:0020 p:---- s:0077 b:0077 l:000076 d:000076 FINISH
c:0019 p:---- s:0075 b:0075 l:000074 d:000074 CFUNC :call
c:0018 p:0113 s:0070 b:0068 l:000260 d:000067 LAMBDA /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/aws-sdk-1.19.0/lib/aws/core/configuration.rb:384
c:0017 p:---- s:0064 b:0064 l:000063 d:000063 FINISH
c:0016 p:0123 s:0062 b:0062 l:000061 d:000061 METHOD /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/aws-sdk-1.19.0/lib/aws/core/service_interface.rb:73
c:0015 p:---- s:0058 b:0058 l:000057 d:000057 FINISH
c:0014 p:---- s:0056 b:0056 l:000055 d:000055 CFUNC :new
c:0013 p:0037 s:0052 b:0052 l:0002e8 d:000051 LAMBDA /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/s3sync-2.0.0/lib/s3sync/cli.rb:441
c:0012 p:---- s:0044 b:0044 l:000043 d:000043 FINISH
c:0011 p:0304 s:0042 b:0042 l:000041 d:000041 METHOD /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/cmdparse-2.0.5/lib/cmdparse.rb:464
c:0010 p:0455 s:0034 b:0034 l:0002e8 d:0002e8 METHOD /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/s3sync-2.0.0/lib/s3sync/cli.rb:469
c:0009 p:0149 s:0029 b:0029 l:000028 d:000028 TOP /Users/dcj/.rvm/gems/ruby-1.9.3-p327/gems/s3sync-2.0.0/bin/s3sync:66
c:0008 p:---- s:0025 b:0025 l:000024 d:000024 FINISH
c:0007 p:---- s:0023 b:0023 l:000022 d:000022 CFUNC :load
c:0006 p:0167 s:0019 b:0019 l:0010f8 d:001308 EVAL /Users/dcj/.rvm/gems/ruby-1.9.3-p327/bin/s3sync:23
c:0005 p:---- s:0015 b:0015 l:000014 d:000014 FINISH
c:0004 p:---- s:0013 b:0013 l:000012 d:000012 CFUNC :eval
c:0003 p:0121 s:0007 b:0007 l:0010f8 d:002340 EVAL /Users/dcj/.rvm/gems/ruby-1.9.3-p327/bin/ruby_executable_hooks:14
c:0002 p:---- s:0004 b:0004 l:000003 d:000003 FINISH
c:0001 p:0000 s:0002 b:0002 l:0010f8 d:0010f8 TOP

-- Ruby level backtrace information ----------------------------------------
/Users/dcj/.rvm/gems/ruby-1.9.3-p327/bin/ruby_executable_hooks:14:in `<main>'
/Users/dcj/.rvm/gems/ruby-1.9.3-p327/bin/ruby_executable_hooks:14:in `eval'
/Users/dcj/.rvm/gems/ruby-1.9.3-p327/bin/s3sync:23:in `<main>'
/Users/dcj/.rvm/gems/ruby-1.9.3-p327/bin/s3sync:23:in `load'

Seeing "EOF error: end of file reached"

Suddenly seeing this error a lot while copying backups to an AWS bucket using s3sync.

Indeed, 3 AMI boxes have hit the full 100 retries on successive nights.

It's been stable for years, so we're running a (very) old version of the code.

I'll update our copy, I suspect that the old code isn't handling more frequent errors gracefully.

High memory & cpu consumption

I'm trying to upload about 370GB of data to S3 and it is eating my resources like a beast, just saying

PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
26388 ops       20   0 7025268 6.620g   4304 R 100.1 21.2  34:57.35 ruby

sync command is extremely slow

s3ranger sync --dry-run bucket:folder folder

Takes more than 10 minutes for a 'folder' with approximately 9,000 keys.
While it's running there is no output.
It seems it fetches everything first and only then prints the results. It might be nicer to fetch one item at a time and print it as it goes, so the user knows it's doing something.
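
A generic Ruby sketch of the suggestion (none of this is s3sync's code): when results are collected into an array before printing, the user sees nothing until the very end; printing inside the iteration gives feedback as soon as each item arrives.

    # Stand-in for fetching remote keys one page at a time.
    fetch_keys = Enumerator.new do |yielder|
      3.times do |page|
        sleep 1                                   # simulate a slow API round trip
        10.times { |i| yielder << "key-#{page}-#{i}" }
      end
    end

    # Eager: silent for ~3 seconds, then everything is printed at once.
    fetch_keys.to_a.each { |key| puts key }

    # Streaming: keys are printed as each page arrives.
    fetch_keys.each { |key| puts key }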

document usage from ruby / rake

Our Use case: deploy a Jekyll generated site to S3

For that it would be cleaner to call the Sync command directly as Ruby code in the Rakefile instead of running the shell command s3sync from the Rakefile. Probably via the SyncCommand class.

Is that possible? Would be great to document it in the README if supported.

PS: the README could benefit from a clarification of what disc.company.com is. "Remote node" seems to be language derived from the s3sync internals; it may be better to use "bucket" and "object path" in AWS lingo.
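
In the meantime, a minimal Rakefile sketch that just shells out to the documented CLI (the _site directory and bucket name are placeholders; driving the SyncCommand class in-process would need to be checked against lib/s3sync/cli.rb first):

    # Rakefile
    desc "Deploy the Jekyll-generated site to S3"
    task :deploy do
      # `sh` raises if the command exits non-zero, so a failed sync stops the build.
      sh "s3sync sync _site disc.company.com:site"
    end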

SSL

There is no mention of SSL in the docs. How secure is s3sync? Is it uploading over SSL?

Replacing mirrored files?

When I run a sync command, s3sync sync local_folder s3_bucket, it seems to replace existing files on S3 even if they are exactly the same. The modified date of the S3 file also changes.

I would imagine that incurs unnecessary S3 fees. Can you confirm what happens when syncing a file that already exists on S3?

cannot load such file -- aws/s3

I just installed the gem, but running s3sync help sync yields:

/home/emile/.rvm/rubies/ruby-2.2.1/lib/ruby/site_ruby/2.2.0/rubygems/core_ext/kernel_require.rb:121:in `require': cannot load such file -- aws/s3 (LoadError)
    from /home/emile/.rvm/rubies/ruby-2.2.1/lib/ruby/site_ruby/2.2.0/rubygems/core_ext/kernel_require.rb:121:in `require'
    from /home/emile/.rvm/gems/ruby-2.2.1/gems/s3sync-2.0.2/lib/s3sync/cli.rb:28:in `<top (required)>'
    from /home/emile/.rvm/rubies/ruby-2.2.1/lib/ruby/site_ruby/2.2.0/rubygems/core_ext/kernel_require.rb:69:in `require'
    from /home/emile/.rvm/rubies/ruby-2.2.1/lib/ruby/site_ruby/2.2.0/rubygems/core_ext/kernel_require.rb:69:in `require'
    from /home/emile/.rvm/gems/ruby-2.2.1/gems/s3sync-2.0.2/bin/s3sync:31:in `<top (required)>'
    from /home/emile/.rvm/gems/ruby-2.2.1/bin/s3sync:23:in `load'
    from /home/emile/.rvm/gems/ruby-2.2.1/bin/s3sync:23:in `<main>'
    from /home/emile/.rvm/gems/ruby-2.2.1/bin/ruby_executable_hooks:15:in `eval'
    from /home/emile/.rvm/gems/ruby-2.2.1/bin/ruby_executable_hooks:15:in `<main>'

s3ranger sync --md5

@clarete
How about this feature:

s3ranger sync --md5 local remote
-> generate .s3ranger in local with all the checksums
-> upload

s3ranger sync --md5 remote local
-> download .s3ranger and compare checksums
-> download what changed
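
A hedged sketch of the first half of that proposal: walk a local tree, compute MD5 digests, and write them to a .s3ranger manifest (the file name comes from the proposal; the JSON format and the helper below are assumptions, nothing like this exists in the tool yet):

    require 'digest'
    require 'find'
    require 'json'

    # Hypothetical helper: write an MD5 manifest for every file under `root`.
    def write_manifest(root)
      sums = {}
      Find.find(root) do |path|
        next unless File.file?(path)
        # Keys are paths relative to the synced root.
        sums[path.sub("#{root}/", "")] = Digest::MD5.file(path).hexdigest
      end
      File.write(File.join(root, ".s3ranger"), JSON.pretty_generate(sums))
    end

    write_manifest("local")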

stacktrace in sync

Not sure if this is s3sync, or the aws-sdk:

/var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/core/data.rb:117:in `method_missing': undefined method `has_key?' for #<String:0x00000003ecc388> (NoMethodError)
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/core/response.rb:184:in `method_missing'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/s3/prefix_and_delimiter_collection.rb:31:in `each_member_in_page'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/s3/object_collection.rb:288:in `each_member_in_page'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/s3/paginated_collection.rb:31:in `_each_item'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/core/collection/with_limit_and_next_token.rb:54:in `_each_batch'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/core/collection.rb:80:in `each_batch'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/core/collection.rb:47:in `each'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.38.0/lib/aws/s3/object_collection.rb:282:in `each'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.2/lib/s3sync/sync.rb:305:in `to_a'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.2/lib/s3sync/sync.rb:305:in `read_tree_remote'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.2/lib/s3sync/sync.rb:317:in `read_trees'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.2/lib/s3sync/sync.rb:187:in `run'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.2/lib/s3sync/cli.rb:423:in `run'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.2/lib/s3sync/cli.rb:80:in `execute'
    from /var/lib/gems/1.9.1/gems/cmdparse-2.0.6/lib/cmdparse.rb:464:in `parse'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.2/lib/s3sync/cli.rb:462:in `run'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.2/bin/s3sync:66:in `<top (required)>'
    from /usr/local/bin/s3sync:23:in `load'
    from /usr/local/bin/s3sync:23:in `<main>'

This was done from a script that's attempting to sync a big chunk of stuff. It made progress, and then crashed with this. Restarting gave the same trace immediately.

Download an entire bucket at once

It seems as if there's no way to synchronize an entire bucket with a directory.

sync -v readmodel.develop.rolestar:/ . returns immediately without syncing anything, and
sync -v readmodel.develop.rolestar: . never returns.

HTTP error: content-length does not match

I'm trying to back up one of my old buckets (used with Jungle Disk) and this is what I get:

$ s3sync sync <bucket>: <folder>
/Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/http/net_http_handler.rb:83:in `block (2 levels) in handle': content-length does not match (AWS::Core::Http::NetHttpHandler::TruncatedBodyError)
    from /Users/olivier/.rvm/rubies/ruby-1.9.3-p448/lib/ruby/1.9.1/net/http.rb:1323:in `block (2 levels) in transport_request'
    from /Users/olivier/.rvm/rubies/ruby-1.9.3-p448/lib/ruby/1.9.1/net/http.rb:2672:in `reading_body'
    from /Users/olivier/.rvm/rubies/ruby-1.9.3-p448/lib/ruby/1.9.1/net/http.rb:1322:in `block in transport_request'
    from /Users/olivier/.rvm/rubies/ruby-1.9.3-p448/lib/ruby/1.9.1/net/http.rb:1317:in `catch'
    from /Users/olivier/.rvm/rubies/ruby-1.9.3-p448/lib/ruby/1.9.1/net/http.rb:1317:in `transport_request'
    from /Users/olivier/.rvm/rubies/ruby-1.9.3-p448/lib/ruby/1.9.1/net/http.rb:1294:in `request'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/http/connection_pool.rb:330:in `request'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/http/net_http_handler.rb:61:in `block in handle'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/http/connection_pool.rb:129:in `session_for'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/http/net_http_handler.rb:55:in `handle'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/client.rb:244:in `block in make_sync_request'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/client.rb:273:in `retry_server_errors'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/client.rb:240:in `make_sync_request'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/client.rb:502:in `block (2 levels) in client_request'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/client.rb:382:in `log_client_request'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/client.rb:468:in `block in client_request'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/client.rb:364:in `return_or_raise'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/core/client.rb:467:in `client_request'
    from (eval):3:in `get_object'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/s3/s3_object.rb:1324:in `get_object'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/aws-sdk-1.19.0/lib/aws/s3/s3_object.rb:1076:in `read'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/sync.rb:349:in `block (2 levels) in download_files'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/sync.rb:348:in `open'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/sync.rb:348:in `block in download_files'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/sync.rb:334:in `each'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/sync.rb:334:in `download_files'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/sync.rb:190:in `run'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/cli.rb:397:in `run'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/cli.rb:456:in `block (2 levels) in run'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/cmdparse-2.0.5/lib/cmdparse.rb:464:in `parse'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/lib/s3sync/cli.rb:469:in `run'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/gems/s3sync-2.0.0/bin/s3sync:66:in `<top (required)>'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/bin/s3sync:23:in `load'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/bin/s3sync:23:in `<main>'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/bin/ruby_executable_hooks:15:in `eval'
    from /Users/olivier/.rvm/gems/ruby-1.9.3-p448/bin/ruby_executable_hooks:15:in `<main>'

Let me know if you need more information.

New gem release?

The s3sync install via RubyGems just doesn't work:

$ gem install s3sync
Fetching: cmdparse-3.0.1.gem (100%)
Successfully installed cmdparse-3.0.1
Fetching: s3sync-2.0.2.gem (100%)
Successfully installed s3sync-2.0.2
2 gems installed
$ s3sync help
/Users/johnbackus/.rvm/gems/ruby-2.2.3/gems/cmdparse-3.0.1/lib/cmdparse.rb:796:in `initialize': wrong number of arguments (1 for 0) (ArgumentError)

It looks like #35 fixed this, though, and installing from GitHub doesn't produce the same errors.

Network is unreachable error

Hello,

I'm trying to use this tool, and after creating a file named s3sync.yml with my AWS credentials, when I try to list my buckets I get this error:

user@ubuntu:~$ s3sync listbuckets
/usr/lib/ruby/1.9.1/net/http.rb:762:in `initialize': Network is unreachable - connect(2) (Errno::ENETUNREACH)
    from /usr/lib/ruby/1.9.1/net/http.rb:762:in `open'
    from /usr/lib/ruby/1.9.1/net/http.rb:762:in `block in connect'
    from /usr/lib/ruby/1.9.1/timeout.rb:68:in `timeout'
    from /usr/lib/ruby/1.9.1/timeout.rb:99:in `timeout'
    from /usr/lib/ruby/1.9.1/net/http.rb:762:in `connect'
    from /usr/lib/ruby/1.9.1/net/http.rb:755:in `do_start'
    from /usr/lib/ruby/1.9.1/net/http.rb:750:in `start'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/credential_providers.rb:333:in `get_credentials'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/credential_providers.rb:41:in `credentials'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/credential_providers.rb:319:in `credentials'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/credential_providers.rb:121:in `block in credentials'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/credential_providers.rb:119:in `each'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/credential_providers.rb:119:in `credentials'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/credential_providers.rb:53:in `access_key_id'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:548:in `build_request'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:490:in `block (3 levels) in client_request'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/response.rb:171:in `call'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/response.rb:171:in `build_request'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/response.rb:111:in `initialize'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:203:in `new'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:203:in `new_response'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:489:in `block (2 levels) in client_request'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:390:in `log_client_request'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:476:in `block in client_request'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:372:in `return_or_raise'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/core/client.rb:475:in `client_request'
    from (eval):3:in `list_buckets'
    from /var/lib/gems/1.9.1/gems/aws-sdk-1.35.0/lib/aws/s3/bucket_collection.rb:140:in `each'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.1/lib/s3sync/cli.rb:110:in `run'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.1/lib/s3sync/cli.rb:80:in `execute'
    from /var/lib/gems/1.9.1/gems/cmdparse-2.0.5/lib/cmdparse.rb:464:in `parse'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.1/lib/s3sync/cli.rb:462:in `run'
    from /var/lib/gems/1.9.1/gems/s3sync-2.0.1/bin/s3sync:66:in `<top (required)>'
    from /usr/local/bin/s3sync:19:in `load'
    from /usr/local/bin/s3sync:19:in `<main>'

There is no initial setup example in your documentation; am I missing something?

I have the aws-sdk gem working to upload other single-file backups on the same server.
