AWS CloudFront Error - The bucket you are attempting to access must be addressed using the specified endpoint
I get this XML error when accessing my static hosting S3 bucket via the CloudFront URL.
The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.
I had a bucket called "mywebsite" located in Sydney, deleted it and created a bucket with the same name in N. Virginia. I've tried disabling and recreating the CloudFront distribution, but I think it's still pointing to the old bucket. The original bucket has already disappeared from S3.
What should I do?
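One thing worth checking is whether the distribution's origin still references the old Sydney website endpoint. Below is a minimal boto3 sketch for inspecting and, if needed, repointing the origin; the distribution ID and the us-east-1 website endpoint are placeholders for your own values.

import boto3

cloudfront = boto3.client('cloudfront')

DISTRIBUTION_ID = 'E1234EXAMPLE'  # placeholder: your distribution ID
NEW_ORIGIN = 'mywebsite.s3-website-us-east-1.amazonaws.com'  # new region's website endpoint

# Fetch the current config together with its ETag (required for updates).
resp = cloudfront.get_distribution_config(Id=DISTRIBUTION_ID)
config, etag = resp['DistributionConfig'], resp['ETag']

# See which endpoint each origin actually points at.
for origin in config['Origins']['Items']:
    print(origin['Id'], '->', origin['DomainName'])

# If an origin still shows the Sydney endpoint, repoint it and push the update.
config['Origins']['Items'][0]['DomainName'] = NEW_ORIGIN
cloudfront.update_distribution(Id=DISTRIBUTION_ID, DistributionConfig=config, IfMatch=etag)

Even with the origin pointing at the right endpoint, S3 can keep returning this redirect error for a while after a bucket name moves to a new region, so it may also be worth waiting a bit and invalidating the CloudFront cache before assuming the configuration is wrong.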
See also questions close to this topic
-
Converting large CSV to JSON in AWS Lambda using python
I have a Lambda function which takes an incoming CSV file and converts it to JSON, but it only works on small files. When I tested with a large file (1 GB) it failed with an out-of-disk-space error. I've been searching for ways around this and have read about the boto streaming option, but I'm struggling with how to code for it, since I'm very new to Python. Can anyone give me some pointers?
Are there any better ways to accomplish my task of converting a large CSV to JSON in Lambda using Python? Thanks!!
import csv
import json

import boto3

s3 = boto3.client('s3')

# bucket, key, filename_csv, filename_json and keyname_s3 are set earlier (not shown)
s3_object = s3.get_object(Bucket=bucket, Key=key)
data = s3_object['Body'].read()
contents = data.decode('utf-8')

print(type(filename_csv))
print(type(key))

# write the whole CSV to a local file, then re-read it and convert row by row
with open(filename_csv, 'a') as csv_data:
    csv_data.write(contents)

with open(filename_csv) as csv_data:
    csv_reader = csv.DictReader(csv_data)  # was c1.DictReader, which is undefined
    with open(filename_json, 'a') as output_file:
        for csv_row in csv_reader:
            json.dump(csv_row, output_file)
            output_file.write('\n')

with open(filename_json, 'r') as json_file_contents:
    response = s3.put_object(Bucket=bucket, Key=keyname_s3, Body=json_file_contents.read())
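Not a definitive answer, but one way to avoid both the /tmp disk limit and holding the whole file in memory is to stream rows straight from the GetObject body and push converted chunks back to S3 with a multipart upload. A rough sketch under those assumptions; the event shape and the output key are illustrative:

import codecs
import csv
import json

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Illustrative event shape; adapt to however your function receives the object location.
    bucket = event['bucket']
    key = event['key']
    out_key = key.rsplit('.', 1)[0] + '.json'  # illustrative output key

    body = s3.get_object(Bucket=bucket, Key=key)['Body']
    # Decode the streaming body line by line instead of reading it all into memory.
    reader = csv.DictReader(codecs.getreader('utf-8')(body))

    # A multipart upload lets us ship converted chunks as we go,
    # so neither memory nor /tmp ever has to hold the whole file.
    mpu = s3.create_multipart_upload(Bucket=bucket, Key=out_key)
    parts, buf, size, part_no = [], [], 0, 1

    def flush(data):
        nonlocal part_no
        part = s3.upload_part(Bucket=bucket, Key=out_key, PartNumber=part_no,
                              UploadId=mpu['UploadId'], Body=data)
        parts.append({'ETag': part['ETag'], 'PartNumber': part_no})
        part_no += 1

    for row in reader:
        line = json.dumps(row) + '\n'
        buf.append(line)
        size += len(line)
        if size >= 8 * 1024 * 1024:  # parts must be at least 5 MB, except the last one
            flush(''.join(buf).encode('utf-8'))
            buf, size = [], 0
    if buf:
        flush(''.join(buf).encode('utf-8'))

    s3.complete_multipart_upload(Bucket=bucket, Key=out_key, UploadId=mpu['UploadId'],
                                 MultipartUpload={'Parts': parts})
    return {'converted': out_key}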
-
How to host a Node.js / Express server with MySQL on AWS Elastic Beanstalk
I am trying to host my Node.js / Express server on AWS Elastic Beanstalk. My Express server is connected to a MySQL database. So far, I have hosted the MySQL database on an RDS instance on AWS. I have also created an Elastic Beanstalk environment and "deployed" the Node.js server using a CodePipeline (which is connected to GitHub). I have found this process very interesting and feel like I am very close. However, I am running into trouble with the MySQL connection. Here are the relevant lines of code from my Node server:
const express = require("express");
const mysql = require("mysql");
const cors = require("cors")
const formData = require("express-form-data");

const { PORT = 5000 } = process.env

const app = express();
app.use(cors());
app.use(formData.parse({maxFieldSize: '10mb'}));

var mysqlConnection = mysql.createConnection({
    host: "my-rds-dbconnection-endpoint",
    user: "my-username",
    password: "****",
    database: "nameOfSchema"
})

mysqlConnection.connect((err) => {
    if(err) {
        console.log('THIS IS NOT CONNECTING #20')
        throw err;
    }
    console.log("Connected to AWS!")
})

console.log('PORT', PORT)

app.get('/', (request, response) => {
    response.send('Hello World!')
    console.log('hello from /')
});

app.listen(PORT, function() {
    console.log(`App listening on port ${PORT}`);
});
When I run the server using Node in my command prompt for testing purposes, I make it all the way to the "Connected to AWS!" message, so it seems to make the connection to the database. However, when I hit the Elastic Beanstalk environment URL, I get the expected 'Hello World' output, but the downloaded logs show that the 'THIS IS NOT CONNECTING' statement prints. So when the Node server runs in the Elastic Beanstalk environment, it is not connecting to my database, and I am unsure why. Does anyone have any ideas about what I could be missing? I sure would appreciate your experience/help!
-
How to put an item in DynamoDB with a condition expression on an attribute that is not the Primary Key
After generating a random uuid value, I assign it to item.uuid:

let uuid = uuidv4();
let title = "Title 100"
let item = { uuid: uuid, title: title };

With the DynamoDB table where the Primary Key is the item.uuid attribute, I add the item to the table with:

let TABLE_NAME = "My Table Name";
let options = {};
let promise = dbClient.put_item({ Item: item, TableName: TABLE_NAME, ...options });

The new item is inserted into the DynamoDB table with the Primary Key uuid generated earlier.

Next, I want to add another item to the same table, with the same "title 101" title:

let uuid = uuidv4();
let title = "title 101"
let item = { uuid: uuid, title: title };

(Both items will have the same title, but a different uuid used as the Primary Key in the table.)

Since I want to make sure this item is added only if no other item has the same "title 101" title, I use a ConditionExpression:

let options = {
    ConditionExpression: "#title <> :title",
    ExpressionAttributeNames: { "#title": "title" },
    ExpressionAttributeValues: { ":title": "title 101" }
}

let promise = dbClient.put_item({ Item: item, TableName: TABLE_NAME, ...options });

promise
    .then(res => {
        console.log('--------- SUCCESS --------');
        console.log(res);
        console.log('-----------------');
    });

But regardless of the ConditionExpression used here, DynamoDB goes ahead and adds a new item with the same title value.

Is there a way to use a condition expression on an attribute (field) that is not the Primary Key?
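For what it's worth, a ConditionExpression on a put is evaluated only against the item that already exists under the key being written; since each put here uses a fresh uuid, there is never an existing item to compare against and the condition passes trivially. A common workaround is to also write a "uniqueness marker" item keyed by the title inside a transaction. A rough boto3 (Python) sketch of that idea, with the table name, key layout and marker prefix all being illustrative:

import uuid

import boto3
from botocore.exceptions import ClientError

dynamodb = boto3.client('dynamodb')
TABLE_NAME = 'MyTable'  # illustrative

def put_with_unique_title(title):
    item_uuid = str(uuid.uuid4())
    try:
        dynamodb.transact_write_items(TransactItems=[
            {   # the real item, keyed by uuid as before
                'Put': {
                    'TableName': TABLE_NAME,
                    'Item': {'uuid': {'S': item_uuid}, 'title': {'S': title}},
                }
            },
            {   # a marker item whose key *is* the title; it can only be created once
                'Put': {
                    'TableName': TABLE_NAME,
                    'Item': {'uuid': {'S': f'TITLE#{title}'}},
                    'ConditionExpression': 'attribute_not_exists(#pk)',
                    'ExpressionAttributeNames': {'#pk': 'uuid'},
                }
            },
        ])
        return item_uuid
    except ClientError as err:
        if err.response['Error']['Code'] == 'TransactionCanceledException':
            return None  # another item already claimed this title
        raise

The condition there is on the marker item's own primary key, which is the only thing a conditional put can inspect; a single PutItem cannot check whether some other item elsewhere in the table already has the same title.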
-
NodeJS Multer-S3 can upload to S3 without using credentials?
I'm a little bit lost as to what's going on; I've been trying to solve this for a few days now. I'm trying to allow only my IAM user to upload an image with public read access. However, I can comment out the IAM user credentials from the AWS SDK and it will still upload to my S3 bucket with no problem. This is not how I intended it to work. I have a feeling it's my policies, but I'm not really sure where to start.
Here is the AWS-SDK credentials being commented out in my code
Here is the code for uploading an image to S3
Here is another piece of code used for uploading an image
For some reason this is enough to upload to my S3 bucket. Just to clarify, I want to make sure the file is being uploaded only if it has the proper credentials. Currently, the file is being uploaded even when S3 credentials are commented out.
The following are my AWS S3 policies/permissions.
If you can point me in the right direction, that'll be fantastic. I'm pretty new to using AWS S3 and am a little lost.
Thanks a bunch.
-
Load XGBoost from an s3 bucket
I have an XGBoost model sitting in an AWS S3 bucket which I want to load. Currently, I'm attempting to use s3fs to load the data, but I keep getting type errors:

from s3fs.core import S3FileSystem
import xgboost as xgb

fs = S3FileSystem()
bst = xgb.XGBClassifier()

with fs.open(f'{bucket}/{modelPath}', mode='rb') as f:
    bst.load_model(f.read())

I would include the output, but I'm pretty sure this approach is completely wrong. How do I get this working?
(also, I think I would rather use boto)
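Without the exact traceback it's hard to be sure, but load_model generally expects a file path (or, in newer XGBoost releases, a bytearray), not the raw bytes that f.read() returns. Since boto is preferred anyway, here is a minimal sketch that downloads the object to a temporary file and loads it from there; the bucket and key names are placeholders:

import tempfile

import boto3
import xgboost as xgb

s3 = boto3.client('s3')
bucket = 'my-bucket'            # placeholder
model_key = 'models/model.bin'  # placeholder

bst = xgb.XGBClassifier()
with tempfile.NamedTemporaryFile(suffix='.bin') as tmp:
    # Stream the object straight into the temp file, then load from its path.
    s3.download_fileobj(bucket, model_key, tmp)
    tmp.flush()
    bst.load_model(tmp.name)

If you would rather stay with s3fs, passing bytearray(f.read()) to load_model may also work on recent XGBoost versions, but the temp-file route avoids depending on that.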
-
AWS S3 PreSignedURL Lambda Function Test Error
I am attempting to test this AWS S3 PreSignedURL Lambda Function
import boto3
from botocore.exceptions import ClientError
from types import SimpleNamespace  # needed for the SimpleNamespace return values below

def lambda_handler(bucket='bg-us-east-1-bucket', key='filename.jpg', expiry=3600):
    """
    Method to get post url from s3
    :param bucket: name of s3 bucket
    :param key: key of the file
    :return:
    """
    try:
        presigned_post = boto3.client('s3').generate_presigned_post(
            bucket, key, ExpiresIn=expiry
        )
        return SimpleNamespace(success=True, data={
            'upload_url': presigned_post['url'],
            'fields': presigned_post['fields']
        })
    except ClientError:
        return SimpleNamespace(
            success=False,
            error='Unable to upload image'
        )
I test with this Test Event:
{ "bucket": "bg-us-east-1-bucket", "key": "value3" }
and I get this Execution error response:
"errorMessage": "Parameter validation failed:\nInvalid type for parameter Bucket, value: {'bucket': 'bg-us-east-1-bucket', 'key': 'value3'}, type: <class 'dict'>, valid types: <class 'str'>", "errorType": "ParamValidationError",
Any suggestions what is causing this AWS Lambda error response?
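The error message itself hints at the cause: the Lambda runtime always invokes the handler as handler(event, context), so the whole test event dict lands in the bucket parameter. One way to reshape the handler around that calling convention, keeping the original defaults as fallbacks (and returning a plain dict, since Lambda has to JSON-serialise whatever the handler returns):

import boto3
from botocore.exceptions import ClientError

def lambda_handler(event, context):
    # Lambda passes (event, context); pull the parameters out of the event instead.
    bucket = event.get('bucket', 'bg-us-east-1-bucket')
    key = event.get('key', 'filename.jpg')
    expiry = event.get('expiry', 3600)
    try:
        presigned_post = boto3.client('s3').generate_presigned_post(
            bucket, key, ExpiresIn=expiry
        )
        return {
            'success': True,
            'upload_url': presigned_post['url'],
            'fields': presigned_post['fields'],
        }
    except ClientError:
        return {'success': False, 'error': 'Unable to generate pre-signed POST'}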
-
Serving React App from S3 Bucket returns "You need to enable javascript to run this app"
Strange problem I faced today with deploying my React app (built by Amplify) to an S3 bucket. Everything worked fine for the past year and a half; nothing changed except for a new computer and new npm/react/amplify installations, etc.
After copying my build to S3, the site fails; the browser downloads a file that contains the following:
"You need to enable JavaScript to run this app."
Here is what I tried:
Reviewed all settings, all content types, etc. (nothing changed except the files)
Added a test text.html, which worked flawlessly
Ran "serve -s build" locally to confirm the build was working
The only thing that worked was to open 'index.html' in my build directory and remove the following line:
You need to enable JavaScript to run this app.
That seems to have fixed the problem!
Short of modifying index.html every time after I run my build (which breaks my pipeline), any ideas what happened here? Has anyone faced such a problem lately?
It seems to be localized to the S3 bucket deployment.
Extra info: I use Amplify to handle my GraphQL, backends, etc. I use classic CloudFront in front of an S3 bucket, with the AWS CLI copying files to the S3 bucket in a shell script that, again, has been working fine for the past 18 months!
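Purely as a diagnostic, and only a guess: a browser downloading the page instead of rendering it often points at the Content-Type (or a stray Content-Encoding) on index.html or the JS bundles, which the AWS CLI usually sets from the file extension but can get wrong. A small boto3 check, where the bucket name and bundle path are placeholders to adjust to your build:

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
BUCKET = 'my-site-bucket'  # placeholder: your hosting bucket

# Inspect the metadata S3 will serve for the app shell and one bundle file.
for key in ('index.html', 'static/js/main.js'):  # adjust the bundle path to your build output
    try:
        head = s3.head_object(Bucket=BUCKET, Key=key)
        print(key, head.get('ContentType'), head.get('ContentEncoding'))
    except ClientError as err:
        print(key, 'not found or inaccessible:', err)

If index.html reports something other than text/html, a copy_object with MetadataDirective='REPLACE' and the correct ContentType (or fixing the flags on the aws s3 cp/sync step in the pipeline) is the usual remedy.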
-
Is there a way to list CloudFront distributions by tag without downloading them all using ListDistributions?
This question is similar, but the only answer there is to get the list of all distributions and filter them locally, which is not what I am looking for.
This page hints that it might be possible to filter by tag ("You can search and filter your resources based on the tags you add"), but I just cannot figure out how!
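One route that avoids paging through ListDistributions, as far as I know, is the Resource Groups Tagging API, which can filter CloudFront distributions by tag server-side and return just their ARNs. A small boto3 sketch, with the tag key and value as placeholders:

import boto3

# CloudFront is a global service, so the Tagging API has to be called in us-east-1.
tagging = boto3.client('resourcegroupstaggingapi', region_name='us-east-1')

paginator = tagging.get_paginator('get_resources')
pages = paginator.paginate(
    ResourceTypeFilters=['cloudfront:distribution'],
    TagFilters=[{'Key': 'Environment', 'Values': ['production']}],  # placeholder tag
)

for page in pages:
    for resource in page['ResourceTagMappingList']:
        print(resource['ResourceARN'])  # e.g. arn:aws:cloudfront::<account>:distribution/<id>

This only returns ARNs and tags, so a GetDistribution call per ARN is still needed if you want the full configuration of each match.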
-
AWS Amplify and Haproxy
I'm trying to set up a custom domain on AWS Amplify using HAProxy. Here is what I have done:
- Set up a custom domain on AWS Amplify
- Added an A record for the subdomain pointing to my HAProxy server
Because the custom domain resolves to a CNAME record pointing at CloudFront, I'm trying to follow this: Using Cloudfront as a HAProxy backend server with https
But I can't make it work yet.
Has anyone done this before, or does anyone have an idea how to handle this?
Thank you.