The Importance of Precise Error Messages

Automation is more than a buzzword: it’s essential for scalability and productivity. LoudSwarm by Six Feet Up, a highly engaging virtual event platform built on Django, Django REST Framework and React, has hosted dozens of global tech conferences, including DjangoCon Europe, the Plone Conference and the Python Web Conference. As demand for LoudSwarm grew, event organizers needed to save time on the backend by giving presenters the ability to upload their presentation videos independently. Enter S3.

S3 has two features — Presigned URLs and Presigned Posts — which allow developers to create a location for users to upload data. Specifically, developers can give users temporary credentials so that users can upload files directly to S3 without having those files pass through the server. This process also preserves the developer’s ability to apply filters and limit where content can and can’t be uploaded.

Prior to automating the process and giving presenters the ability to upload videos independently, LoudSwarm organizers manually uploaded video files to S3. The upload would then trigger a workflow that transcoded the videos. Giving users the permissions needed to upload their own videos is where the need for Presigned Posts came in.

What started as a “simple” automation uncovered an S3 issue that almost had the Six Feet Up team stumped.

Implementation

In Python and Django, the Boto3 package makes using Presigned Posts really easy. After installing Boto3 in your preferred environment:

    import boto3
    
    s3client = boto3.client(
        "s3",
        region_name=REGION_NAME,
        aws_access_key_id=ACCESS_KEY,
        aws_secret_access_key=SECRET_KEY,
        aws_session_token=SESSION_TOKEN,
    )
    
    def get_post_credentials(bucket_name, object_key):
        # Build the URL and form fields for a direct browser upload to S3.
        response = s3client.generate_presigned_post(
            bucket_name,
            object_key,
            ExpiresIn=3600,
        )
        return response

1. `bucket_name` refers to your S3 bucket, where you want the users' uploads to be stored.
2. `object_key` refers to the key (the path within the bucket) under which the user’s uploaded file will be saved.

Constraints can also be specified; for example, you can restrict the content type a user is allowed to upload, as sketched below. Then, add an API endpoint that performs any permission checks and returns the data from `get_post_credentials`.
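
For example, restricting uploads to video content might look roughly like this (the content type and size limit are illustrative values, not taken from the LoudSwarm code):

    response = s3client.generate_presigned_post(
        bucket_name,
        object_key,
        # Pre-fill the Content-Type field and require it in the signed policy
        # (illustrative values; adjust to your own requirements).
        Fields={"Content-Type": "video/mp4"},
        Conditions=[
            {"Content-Type": "video/mp4"},
            # Cap uploads at roughly 5 GB (illustrative limit).
            ["content-length-range", 1, 5 * 1024 * 1024 * 1024],
        ],
        ExpiresIn=3600,
    )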

To use the credentials, the Boto3 documentation examples list the following form fields as necessary:

    <input type="hidden" name="key" value="VALUE" />
    <input type="hidden" name="AWSAccessKeyId" value="VALUE" />
    <input type="hidden" name="policy" value="VALUE" />
    <input type="hidden" name="signature" value="VALUE" />

To test uploading, we created a quick Python script based on Boto3's examples (https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-presigned-urls.html) that uploaded to our own S3 buckets. This allowed us to test uploads with CORS out of the equation should any problems occur.
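
The script was essentially the requests-based example from that page, along the lines of the following sketch (the bucket name and file paths are placeholders):

    import requests

    # Fetch presigned POST credentials using the helper shown earlier.
    credentials = get_post_credentials("my-test-bucket", "uploads/test-video.mp4")

    # Post the returned fields plus the file itself straight to S3.
    # Note that requests sends the `data` fields first and the file last.
    with open("test-video.mp4", "rb") as f:
        upload = requests.post(
            credentials["url"],
            data=credentials["fields"],
            files={"file": ("test-video.mp4", f)},
        )

    print(upload.status_code)  # S3 returns 204 on a successful POST by default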

Once the credential retrieval process was confirmed to be working, we updated our React component to post the form data with the documented fields, using the following JavaScript:

    const form = new FormData();
    form.append('key', key);
    form.append('AWSAccessKeyId', AWSAccessKeyId);
    form.append('policy', policy);
    form.append('signature', signature);
    form.append('file', file);
    const headers = { 'Content-Type': 'multipart/form-data' };
    axios
      .post(endpoint, form, { headers })
      .then(() => {
          // success actions
      });

After local development, everything looked great: credential retrieval and file uploads were working as expected.

Post-Deployment Problems

However, on deployment to our sandbox environment, we found things weren't so great. When testing the site's upload process, we could see that we were retrieving credentials from our REST API, but we were getting the following error when trying to use them in the frontend:

    <Error>
        <Code>InvalidAccessKeyId</Code>
        <Message>The AWS Access Key Id you provided does not exist in our records.</Message>
        <AWSAccessKeyId>ASIA5HCYN5KVIOHVLA5J</AWSAccessKeyId>
        <RequestId>AFCXGJGVWQ5FHR3Q</RequestId>
        <HostId>xiI2LSa9TKB/j9peQt+GvG3t/oPmMXCc2N9IIWyLFI97ps0cl5oqfYOU1eKjXKTTwFJ0Ho6QyhU=</HostId>
    </Error>

(AWS Access Keys beginning with 'ASIA' are temporary keys)

When we checked the form fields in the browser’s developer tools, everything appeared to be submitting as expected. However, looking at the credentials being returned from S3 on our backend, we could see that S3 was returning an additional field, "x-amz-security-token," which hadn’t come up in any of the documentation up to this point.
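
With temporary credentials, the data returned by `generate_presigned_post` looked roughly like the following (values shortened and invented for illustration; the exact field names also depend on the signature version the client is configured to use):

    {
        "url": "https://my-bucket.s3.amazonaws.com/",
        "fields": {
            "key": "uploads/test-video.mp4",
            "AWSAccessKeyId": "ASIA...",
            # Only present when the client holds temporary credentials.
            "x-amz-security-token": "FwoGZXIvYXdzE...",
            "policy": "eyJleHBpcmF0aW9uIjo...",
            "signature": "bWq2s1WEIj+Ydj0vQ697zp+IXMU=",
        },
    }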

Wouldn’t it be great if AWS had precise error messages? 🙂

Our first step in troubleshooting this issue was to take the previously failing credentials that were generated on the sandbox backend and test them in our upload script. Those temporary credentials worked only with the additional field, so we added the field to the bottom of our FormData as follows:

    const form = new FormData();
    form.append('key', key);
    form.append('AWSAccessKeyId', AWSAccessKeyId);
    form.append('policy', policy);
    form.append('signature', signature);
    form.append('file', file);
    if (amzSecurityToken) {
      form.append('x-amz-security-token', amzSecurityToken);
    }
    const headers = { 'Content-Type': 'multipart/form-data' };
    axios
      .post(endpoint, form, { headers })
      .then(() => {
          // success actions
      });

With the new form deployed, we tested it on our sandbox environment. Again we got the same error:

    <Message>The AWS Access Key Id you provided does not exist in our records.</Message>

Next, we checked to make sure all of the form fields were being submitted, including the new one. Again, everything appeared to be working correctly. At this stage, we wondered whether the error was caused by an environment issue, possibly one related to credential generation, so we asked our DevOps team to look into it. We were hopeful that fresh eyes, with the AWS setup in mind, would provide a new perspective.

However, after intense discussions and scouring online threads about issues created by the additional "x-amz-security-token" field, the DevOps team was just as stumped.

We knew there had to be a solution. Continuing down the troubleshooting path, we decided to run the uploads through Alfred as a proxy. That way, we could see exactly how the requests differed as they went over the wire and determine why the upload script was working when the web form was not.

Looking at the data, once again everything appeared correct, but one thing stood out. There was a single difference: the order of the fields in the form data. In the React app, the "file" field came before the "x-amz-security-token" field; in the upload script, the "file" field was the last field. That couldn't be it, could it? The error was clearly about the AWS Access Key ID, not about a field that isn't even documented! Besides, in any other multipart file upload, field order has never been an issue, because the data gets processed after everything is uploaded.

So, of course, we tested it… We reordered the fields so that the file was the last field, as shown below, and IT WORKED!

    const form = new FormData();
    form.append('key', key);
    form.append('AWSAccessKeyId', AWSAccessKeyId);
    form.append('policy', policy);
    form.append('signature', signature);
    if (amzSecurityToken) {
      form.append('x-amz-security-token', amzSecurityToken);
    }
    form.append('file', file);
    const headers = { 'Content-Type': 'multipart/form-data' };
    axios
      .post(endpoint, form, { headers })
      .then(() => {
          // success actions
      });

Resolution

Through trial and error, we discovered that S3 processes the upload as a stream rather than reading in all of the data and then parsing it out. If it receives the "file" form field before the "x-amz-security-token" form field, it appears to ignore the fields that follow and errors out with an imprecise error message. (AWS's documentation on browser-based POST uploads does note that the file must be the last field in the form, but the error gives no hint of that.)

I hope this post helps others who might come across a similar problem. Happy coding and troubleshooting!
