AWS S3 Browser showing invalid access key
Make sure that you have set up proper ACLs for your images. Enable Read permissions for everyone on all images, or you can set up a bucket policy instead.
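For the bucket-policy route, here is a minimal sketch in Python with boto3 (the bucket name my-image-bucket is a placeholder): it grants anonymous read access to every object so the images load in a browser.

    import json
    import boto3

    # Allow anyone to GET any object in the bucket (public read).
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadForImages",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-image-bucket/*",
        }],
    }

    boto3.client("s3").put_bucket_policy(
        Bucket="my-image-bucket",
        Policy=json.dumps(policy),
    )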
See also questions close to this topic
What is meant by "Task timed out after 3.00 seconds"? - AWS Lambda with Node.js, AWS RDS, AWS CloudWatch
I am executing a SELECT query in AWS Lambda (Node.js), and I get this "Task timed out after 3.00 seconds" message in AWS CloudWatch.
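Three seconds is Lambda's default timeout, so a slow RDS query (or, in Node.js, an open database connection keeping the event loop alive) will trip it. One common first step is simply raising the function timeout; as a sketch using boto3 (the function name here is hypothetical):

    import boto3

    # Raise the Lambda timeout from the 3-second default to 30 seconds
    # so a slow RDS query has time to complete.
    lam = boto3.client("lambda")
    lam.update_function_configuration(
        FunctionName="my-query-function",  # placeholder name
        Timeout=30,
    )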
Can we use AWS or App Center to test ARCore-based apps? If yes, how?
I want to write a test script for an Android/iOS app that uses Augmented Reality, so that I can run the tests on AWS or App Center.
Please share any helpful links if you have them.
How to get the estimated cost of an EC2 instance of type m4.large with EBS storage of 500 GB?
How can I get the estimated cost of an EC2 instance of type m4.large with EBS storage of about 500 GB through the Java SDK? Is there any specific SDK provided by AWS for this? I have looked at many AWS APIs but didn't find one. The link I did find was very hard to understand and to extract instance type and cost from: https://pricing.us-east-1.amazonaws.com/offers/v1.0/aws/AmazonS3/current/us-east-1/index.json
Is there any Java API or SDK available to fetch the estimated cost of an instance?
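AWS does expose a Price List (Pricing) API that filters that same offer JSON by attributes such as instance type, and the AWS SDK for Java exposes the same GetProducts operation. Here is a sketch in Python with boto3 (the filter values, e.g. Linux / shared tenancy / N. Virginia, are assumptions to narrow the result to one product):

    import json
    import boto3

    # The Price List API is only served from a few regions; us-east-1 works.
    pricing = boto3.client("pricing", region_name="us-east-1")

    resp = pricing.get_products(
        ServiceCode="AmazonEC2",
        Filters=[
            {"Type": "TERM_MATCH", "Field": "instanceType", "Value": "m4.large"},
            {"Type": "TERM_MATCH", "Field": "location", "Value": "US East (N. Virginia)"},
            {"Type": "TERM_MATCH", "Field": "operatingSystem", "Value": "Linux"},
            {"Type": "TERM_MATCH", "Field": "tenancy", "Value": "Shared"},
            {"Type": "TERM_MATCH", "Field": "preInstalledSw", "Value": "NA"},
        ],
        MaxResults=1,
    )

    # Each PriceList entry is a JSON string describing one product and its terms.
    product = json.loads(resp["PriceList"][0])
    print(json.dumps(product["terms"]["OnDemand"], indent=2))

The EBS side can be estimated the same way (a second get_products query against the storage product family), multiplying the per-GB-month rate by 500.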
Writing data to S3 through a Spark DataFrame at scale - S3 connectivity issue caused by S3 503 Slow Down error
We are trying to read and write data to S3 in Spark using AWS EMR clusters, and while scaling the execution we ran into some issues. When we process a single quarter of data we do not notice the problem, but when we scale up to run multiple quarters of data in parallel, the Spark jobs randomly start failing for one or more quarters while writing the data to S3. Digging deeper, we realized that Spark throws the error while writing to S3, and that it is caused by an S3 503 Slow Down error.
The Slow Down error occurs only when we exceed the S3 TPS limit for a given path, and S3's suggestion is to add random hash values to the S3 path while writing. We tried this using partitionBy, but we found that only some hash values (e.g. xy, 2-digit hash values) perform better. Has anyone come across a similar issue, and if so, how did you overcome it?
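Not the poster's exact job, but here is a sketch of the random-hash-prefix idea in PySpark (the DataFrame, the record_id column, and the bucket path are all hypothetical stand-ins). A 2-character hex prefix spreads writes across 256 distinct key prefixes:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.range(1000).withColumnRenamed("id", "record_id")  # stand-in data

    # Derive a 2-character hash prefix from a key column so writes fan out
    # across many S3 key prefixes instead of hammering a single one.
    df_prefixed = df.withColumn(
        "s3_prefix",
        F.substring(F.md5(F.col("record_id").cast("string")), 1, 2),
    )

    # partitionBy puts the prefix into the key path: .../s3_prefix=ab/part-...
    df_prefixed.write.partitionBy("s3_prefix").parquet("s3://my-bucket/output/")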
Can one source predefined AWS bucket policies like Lego blocks?
Suppose that we have a requirement that all bucket policies should reject storage requests that don't include encryption information. A clean way would be to define this once as a template of sorts, then import that template into specific bucket policies when needed.
I can't seem to find anything that can do this, either in the AWS access policy language or in Terraform. I would like to do this in Terraform if possible, but any advice would be appreciated.
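Not a full Terraform answer (in Terraform itself, recent AWS provider versions let the aws_iam_policy_document data source compose documents, which gets close to this), but the "Lego block" idea can be sketched outside Terraform: define the deny-unencrypted-puts statement once and splice it into each bucket's policy. A Python/boto3 sketch, with a placeholder bucket name:

    import json
    import boto3

    def deny_unencrypted_puts(bucket_name):
        # The reusable "block": reject any PutObject request that omits the
        # x-amz-server-side-encryption header.
        return {
            "Sid": "DenyUnencryptedPuts",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket_name}/*",
            "Condition": {"Null": {"s3:x-amz-server-side-encryption": "true"}},
        }

    def apply_bucket_policy(bucket_name, extra_statements=()):
        # Every bucket policy starts from the shared block, then adds
        # bucket-specific statements.
        policy = {
            "Version": "2012-10-17",
            "Statement": [deny_unencrypted_puts(bucket_name), *extra_statements],
        }
        boto3.client("s3").put_bucket_policy(
            Bucket=bucket_name, Policy=json.dumps(policy)
        )

    apply_bucket_policy("my-example-bucket")  # placeholder bucket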
Python - Error when deleting a sub-folder
I am in the process of extracting data from AWS to my local machine, and I am able to extract the data. Next, I am trying to delete one of the sub-folders from the downloaded content. I use the code below and get an error:
TypeError: listdir: path should be string, bytes, os.PathLike or None, not module
Could anyone guide me as to where I am going wrong?
import datetime
import os
import shutil

import boto

current_time = datetime.datetime.now().strftime("%Y-%m-%d")
LOCAL_PATH = '/Users/user/Desktop/files/'

# connect to the bucket (credentials and bucket_name are defined elsewhere)
conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
bucket = conn.get_bucket(bucket_name)

### download files
# go through the list of files
bucket_list = bucket.list(prefix='files/store1')
#bucket_list = bucket.list()
for l in bucket_list:
    keyString = str(l.key)
    d = LOCAL_PATH + keyString
    try:
        l.get_contents_to_filename(d)
    except OSError:
        # check if dir exists
        if not os.path.exists(d):
            os.makedirs(d)

# delete sub-folders from the downloaded content. The original code passed
# the os.path *module* to os.listdir (via "from os import path"), which is
# what raises the TypeError; listdir needs a directory string like LOCAL_PATH.
for the_file in os.listdir(LOCAL_PATH):
    file_path = os.path.join(LOCAL_PATH, the_file)
    try:
        if os.path.isdir(file_path):
            shutil.rmtree(file_path)
    except Exception as e:
        print(e)