Comments (17)
Those are the full file lists as downloaded by `aws s3 sync` vs. the s3 C demo (sorted). As can be seen, there are many missing files, without any specific pattern.
from aws-c-s3.
@dbickson Thank you for creating the issue with details. We will take a look and will let you know as soon as we have any updates.
We have opened the bucket permissions so you can try on your own.
I still got an AccessDenied error from s3://vl-sample-dataset-kitti/Kitti/
I tried to create a bucket with the same directory layout, but with fake files containing a single character each.
I cannot reproduce the error you saw either. I successfully downloaded all 22480 files with ./s3 cp -v ERROR -r us-west-2 s3://test-bucket-asd/raw/ ./test &>cp-out
I saw something suspicious in the output:
22.20user 17.14system 0:59.49elapsed 66%CPU (0avgtext+0avgdata 182988maxresident)k
16096inputs+12727272outputs (64major+161888minor)pagefaults 0swaps
And after that, you did another download of the same files? I am not sure why.
Hi @TingDaoK, can we do a short Zoom session? I can share my screen; it takes less than a minute to reproduce the issue on my side.
![Screen Shot 2023-07-07 at 21 57 51](https://private-user-images.githubusercontent.com/4495523/251833619-cf1937b3-c2bd-470a-9a36-5e4886cf33e2.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MDg4MDg2OTAsIm5iZiI6MTcwODgwODM5MCwicGF0aCI6Ii80NDk1NTIzLzI1MTgzMzYxOS1jZjE5MzdiMy1jMmJkLTQ3MGEtOWEzNi01ZTQ4ODZjZjMzZTIucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI0MDIyNCUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNDAyMjRUMjA1OTUwWiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9NmY0M2Q3NTg0MjYzNGYxMDhkZGJkYzNkM2VkNzg4MjZkYjllOTg5ZGRjYjAyY2I4ZDQ1YWFkMmE1YTEzZTM3NyZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QmYWN0b3JfaWQ9MCZrZXlfaWQ9MCZyZXBvX2lkPTAifQ.xp5oYJ4bM6vOcI8zV80jNkyQBh6yyEeZn4TRM2j3_AI)
According to our management console, the bucket is public.
Can the issue originate from the fact that I am using a t2.xlarge instance?
Not sure. But from your nohup.out we can actually see that all 22480 files were downloaded. So I guess something went wrong writing them to disk?
Hi @TingDaoK, I have granted additional bucket permissions; can you try accessing it again?
I agree: in the nohup output there are exactly 22,480 unique downloaded-filename printouts, which is correct. But not all of them are saved to disk.
grep download nohup.out | sort -u > 1
cut -f 2 -d ' ' 1 > 2
sort -u 2 | wc
22480 22480 1498642
ubuntu@ip-172-31-30-217:/mnt/data/wakar$ find ~/Kitti -type f | wc
11287 11287 571716
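The comparison above can also be done file-by-file. A minimal sketch of the idea, using comm to list names the log claims were downloaded but which never landed on disk (nohup-demo.out, demo/, and the "download <name>" log format are stand-ins for the real nohup.out, ~/Kitti tree, and log layout):

```shell
# Build a tiny fake log and a tree that is missing one file.
printf 'download a.png\ndownload b.png\ndownload c.png\n' > nohup-demo.out
mkdir -p demo && touch demo/a.png demo/c.png   # b.png deliberately missing

# Names the log says were downloaded, and names actually on disk.
grep '^download' nohup-demo.out | cut -d ' ' -f 2 | sort -u > expected.txt
find demo -type f | xargs -n1 basename | sort -u > actual.txt

# Lines unique to expected.txt = logged as downloaded but absent from disk.
comm -23 expected.txt actual.txt
```

With the real data, the same comm invocation over the two 22,480- and 11,287-line lists would print the ~11k missing names.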
Hi @TingDaoK, I am a little closer to a breakthrough. I have added the following trace:
if (!transfer_ctx->output_sink) {
    printf("Failed to download to %s\n", (char *)file_path);
    return AWS_OP_ERR;
}
I see a lot of failures here, which may explain the missing files. Please advise.
Can you run with -v ERROR and attach the log for us?
Hi @TingDaoK, I have granted additional bucket permissions; can you try accessing it again?
Cool. Now, I have access, I'll try to reproduce it with your bucket.
Some more hints:
aws-error:45(AWS_ERROR_MAX_FDS_EXCEEDED)
[ERROR] [2023-07-07T19:56:51Z] [00007f0146ffd700] [common-io] - static: Failed to open file. path:'/home/ubuntu/Kitti/raw/testing/image_2/005993.png' mode:'wb' errno:24 aws-error:45(AWS_ERROR_MAX_FDS_EXCEEDED)
[ERROR] [2023-07-07T19:56:51Z] [00007f0146ffd700] [common-io] - static: Failed to open file. path:'/home/ubuntu/Kitti/raw/testing/image_2/005994.png' mode:'wb' errno:24 aws-error:45(AWS_ERROR_MAX_FDS_EXCEEDED)
[ERROR] [2023-07-07T19:56:51Z] [00007f0146ffd700] [common-io] - static: Failed to open file. path:'/home/ubuntu/Kitti/raw/testing/image_2/005995.png' mode:'wb' errno:24 aws-error:45(AWS_ERROR_MAX_FDS_EXCEEDED)
[ERROR] [2023-07-07T19:56:51Z] [00007f0146ffd700] [common-io] - static: Failed to open file. path:'/home/ubuntu/Kitti/raw/testing/image_2/005996.png' mode:'wb' errno:24 aws-error:45(AWS_ERROR_MAX_FDS_EXCEEDED)
[ERROR] [2023-07-07T19:56:51Z] [00007f0146ffd700] [common-io] - static: Failed to open file. path:'/home/ubuntu/Kitti/raw/testing/image_2/005997.png' mode:'wb' errno:24 aws-error:45(AWS_ERROR_MAX_FDS_EXCEEDED)
[ERROR] [2023-07-07T19:56:51Z] [00007f0146ffd700] [common-io] - static: Failed to open file. path:'/home/ubuntu/Kitti/raw/testing/image_2/005998.png' mode:'wb' errno:24 aws-error:45(AWS_ERROR_MAX_FDS_EXCEEDED)
[ERROR] [2023-07-07T19:56:51Z] [00007f0146ffd700] [common-io] - static: Failed to open file. path:'/home/ubuntu/Kitti/raw/testing/image_2/005999.png' mode:'wb' errno:24 aws-error:45(AWS_ERROR_MAX_FDS_EXCEEDED)
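errno 24 is EMFILE ("Too many open files"): the process ran out of file descriptors, so fopen for each new .png failed. A minimal reproduction sketch, lowering the fd limit in a subshell and opening files until the limit is hit (python3 is used here purely as a convenient way to open many files in a loop):

```shell
# Lower RLIMIT_NOFILE in a subshell, then open more files than allowed;
# the open() that exceeds the limit fails with errno 24 (EMFILE).
out=$(
  ulimit -n 32
  python3 - <<'EOF'
import tempfile
fds = []
try:
    for _ in range(256):          # far more opens than the limit permits
        fds.append(tempfile.TemporaryFile())
except OSError as e:
    print("errno", e.errno, "-", e.strerror)
EOF
)
echo "$out"
```

This is the same failure mode as the log above: each in-flight object download holds an open output file, and with ~22k objects the default per-process limit (often 1024) is easily exceeded.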
Hi @TingDaoK, what is the correct way to increase the open-file limit? I tried with ulimit, but it did not work here.
Aha, so it's basically too many open files: you hit the system's file-descriptor limit.
The easiest way is probably https://stackoverflow.com/questions/11342167/how-to-increase-ulimit-on-amazon-ec2-instance
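A short sketch of raising the limit. One common gotcha, and a likely reason ulimit "did not work": it only affects the current shell and its children, so it must be run in the same shell that then launches the s3 demo. The persistent-change locations are the usual Linux ones, along the lines of the Stack Overflow link:

```shell
# Current soft limit on open file descriptors (often 1024 by default).
ulimit -n

# Raise the soft limit up to the hard limit, for THIS shell and its
# children only. Running this in a different terminal has no effect
# on an already-started process.
ulimit -n "$(ulimit -Hn)"
ulimit -n

# For a persistent change: /etc/security/limits.conf, e.g.
#   ubuntu soft nofile 65535
# or, for systemd-managed services, LimitNOFILE= in the unit file.
```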
The issue is solved, and all files are now received.
My only suggestion is to make -v ERROR the default, since I was expecting some feedback about the problem, yet the run appeared to finish fine.
Thanks again for your great support!
I have an ongoing PR #330 to improve the error handling for the samples. It should actually error out in your case with that change.
We have updated the error handling in #332. It should properly error out now.