Upload slow on S3 Bucket for large number of files
I am uploading a large number of images to an AWS S3 bucket via the S3 API from my local machine. 800 images upload in about 15 minutes, but 8,000 images take close to 24 hours. Each image is around 10 MB. The time taken to upload grows much faster than linearly with the number of files.
Transfer acceleration didn't help, as I am close to the datacenter location. Multipart upload is recommended mostly for larger files (>100 MB). What am I missing here?
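One common cause of this pattern is uploading files strictly one at a time: S3 throughput scales with concurrent requests, so running several uploads in parallel (or using `aws s3 cp --recursive` / `aws s3 sync`, which issue concurrent requests by default) usually removes the cliff. A minimal concurrency-pool sketch, where `uploadOne` is a hypothetical placeholder for the actual PutObject call, not an AWS SDK API:

```typescript
// Sketch: run at most `limit` uploads at a time instead of one after another.
// `uploadOne` stands in for whatever performs the actual S3 PutObject.
async function uploadAll<T>(
  items: T[],
  limit: number,
  uploadOne: (item: T) => Promise<void>
): Promise<void> {
  let next = 0;
  // Start `limit` workers; each worker pulls the next unclaimed item
  // until the list is drained. The claim (next++) is synchronous, so
  // no item is processed twice.
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    async () => {
      while (next < items.length) {
        const i = next++;
        await uploadOne(items[i]);
      }
    }
  );
  await Promise.all(workers);
}
```

With ~10 MB objects, a pool of 10-20 parallel uploads is a reasonable starting point; the same idea applies whichever SDK or CLI actually performs the transfer.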
See also questions close to this topic
-
Upload file from html when block public access is true
I am using django-s3direct for file upload: https://github.com/bradleyg/django-s3direct
I am using an IAM role because I upload the file from the server on an ECS container.
Now I set the blockPublicAccess of S3 to false. When uploading images from HTML, this error occurs:
https://s3.ap-northeast-1.amazonaws.com/static-resource-v/images/c64d6e593de44aa5b10dcf1766582547/_origin.jpg?uploads 403 (Forbidden) initiate error: static-resource-v/line-assets/images/c64d6e593de44aa5b10dcf1766582547/_origin.jpg AWS Code: AccessDenied, Message: Access Denied, status: 403
OK, that is understandable: the browser tries to access the bucket to initiate the upload.
However, is there any way to upload a file from the browser when blockPublicAccess is true?
-
Linux on Lightsail instance is asking for a password and it's not working
I'm trying to restart mariaDB on Ubuntu but it's not letting me. I enter:
systemctl restart mariadb
and get:
==== AUTHENTICATING FOR org.freedesktop.systemd1.manage-units ====
Authentication is required to restart 'mariadb.service'.
Authenticating as: Ubuntu (ubuntu)
Password:
polkit-agent-helper-1: pam_authenticate failed: Authentication failure
==== AUTHENTICATION FAILED ====
I have the same password for all functions so I do not understand why it is not working. What can I do?
-
AWS Pinpoint sendMessages() Addresses param field error
I'm having trouble replicating the format of the params object's Addresses field in a way where I can easily add entries to it.
If I use this as the params, with destinationNumber[0] and destinationNumber[1] in the format of 1 followed by a 10-digit number, i.e. 13334535667, then it sends the message to both numbers no problem:
const params = {
  ApplicationId: applicationId,
  MessageRequest: {
    Addresses: {
      [destinationNumber[0]]: { ChannelType: 'SMS' },
      [destinationNumber[1]]: { ChannelType: 'SMS' }
    },
    MessageConfiguration: {
      SMSMessage: {
        Body: message,
        Keyword: registeredKeyword,
        MessageType: messageType,
        OriginationNumber: originationNumber
      }
    }
  }
};
I'm trying to replicate this format for Addresses, but I'm getting:
Unexpected key '13334535667' found in params.MessageRequest.Addresses['0']
The format my console output shows for Addresses is:
[ { '12345678910': { ChannelType: 'SMS' } }, { '12345678911': { ChannelType: 'SMS' } } ]
I'm using a map to call this:
function createPhoneMessagingObject(phoneNumber: string) {
  return {
    [phoneNumber]: { ChannelType: 'SMS' }
  };
}
I tried wrapping the key in an array like in the phone object, but per the output the brackets go away, so maybe there's an easier/more correct way of doing this. I appreciate any help!
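The error message and the console output above suggest that Addresses ended up as an array of one-key objects, whereas Pinpoint's SendMessages expects Addresses to be a single object keyed by phone number. A sketch (function and type names reuse the question's naming) that merges the mapped entries into one object:

```typescript
// Shape assumed from the question: each entry maps a phone number to
// an SMS channel configuration.
type AddressEntry = Record<string, { ChannelType: string }>;

function createPhoneMessagingObject(phoneNumber: string): AddressEntry {
  return { [phoneNumber]: { ChannelType: 'SMS' } };
}

// .map() alone yields an ARRAY of one-key objects; merging them with
// Object.assign produces the single keyed object Pinpoint expects.
function buildAddresses(phoneNumbers: string[]): AddressEntry {
  return Object.assign({}, ...phoneNumbers.map(createPhoneMessagingObject));
}
```

buildAddresses(['12345678910', '12345678911']) then produces one object with both numbers as keys, which can be dropped straight into params.MessageRequest.Addresses.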
-
Assign ELB Account to S3 Bucket Policy
I used the load balancer edit-attributes screen in the AWS console to create a bucket to use for access logging. I'm now using that bucket's policy to write CDK code in TypeScript that stands up new S3 buckets for access logging in higher-level environments where I cannot use the console. This is the policy I need to somehow form in TypeScript CDK code:
"Statement": [
  {
    "Effect": "Allow",
    "Principal": {
      "AWS": "arn:--ELB-arnstuff--:root"
    },
    "Action": "s3:PutObject",
    "Resource": "arn:--S3-Bucket-arnstuff--/AWSLogs/123456789/*"
  }
]
I've managed to get the cdk code figured out to this point:
bucket.addToResourcePolicy(
  new cdk.aws_iam.PolicyStatement({
    effect: awsIam.Effect.ALLOW,
    principals: [], // **This is the part I haven't figured out**
    actions: ['s3:PutObject'],
    resources: [`${bucket.bucketArn}/*`]  // backticks, so the ARN interpolates
  })
);
At this point I don't care if it's hard coded in the CDK, I just need something to help keep the ball rolling forward. Any help is appreciated, thanks
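For the principals slot, one sketch assuming aws-cdk-lib v2 (iam.ArnPrincipal wraps a raw ARN string into a principal; the ARN below is the question's redacted placeholder, not a real value):

```typescript
import { aws_iam as iam, aws_s3 as s3 } from 'aws-cdk-lib';

// Config sketch only: wrap the ELB account root ARN from the console-made
// bucket policy in an ArnPrincipal. Hard-coding it is fine to keep moving;
// it can later come from a lookup or context value.
declare const bucket: s3.Bucket;

bucket.addToResourcePolicy(
  new iam.PolicyStatement({
    effect: iam.Effect.ALLOW,
    principals: [new iam.ArnPrincipal('arn:--ELB-arnstuff--:root')],
    actions: ['s3:PutObject'],
    resources: [`${bucket.bucketArn}/AWSLogs/123456789/*`]
  })
);
```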
-
How can I get file metadata like the date created attribute from an S3 file?
I have a lot of LogicPro files (.logicx) stored in an S3 bucket, and I want to extract the creation date from all of these files. This should not be the creation date of the object on s3, but the date for when it was created on my MacBook (the "Created" attribute in Finder).
I've tried to retrieve the metadata from the object using the HEAD action:
aws s3api head-object --bucket <my-bucket> --key <my-object-key>
The output did not contain any information about the creation date of the actual file.
{
  "AcceptRanges": "bytes",
  "LastModified": "2021-10-28T13:22:33+00:00",
  "ContentLength": 713509,
  "ETag": "\"078c18ff0ab5322ada843a18bdd3914e\"",
  "VersionId": "9tseZuMRenKol1afntNM8mkRbeXo9n2W",
  "ContentType": "image/jpeg",
  "ServerSideEncryption": "AES256",
  "Metadata": {},
  "StorageClass": "STANDARD_IA"
}
Is it possible to extract the file creation metadata attribute from an S3 object, without having to download the whole object?
-
How can I import .cer in SAP Commerce Cloud
I am using SAP Commerce Cloud, in the public cloud, and I am trying to import a .cer file so I can make REST calls to an API gateway.
I have read about importing a certificate into a keystore in Java using command-line tools, but I don't know how to do that in SAP Commerce Cloud.
-
How to incorporate the planetlab workload for load balancing for CloudAnalyst?
I have set up a dynamic load balancing policy. I would like to test it with the PlanetLab workload traces in the CloudAnalyst tool. How can I do this?
-
How would one create different environments in Commercetools?
I am trying to set up Commercetools with a CI/CD pipeline. It is my first time working with microservices and cloud architectures.
With a monolithic code base you can have development, QA and production environments; how would one go about this with Commercetools? Would you set up two or three projects? Should the projects share the same microservices, or would you set up multiples of those as well? If not, then I suppose you would do end-to-end testing in production, and that can't be right?
I am not interested in how to set up microservices; I am interested in how to set up the project that performs changes to the Commercetools API.
Thanks for any help.