Uploading files using AJAX directly to S3

Chris White, Laravel Developer:

For most situations, using S3 is a no-brainer, yet most developers transfer their users’ uploads to S3 only after receiving them on the server side. This doesn’t have to be the case: your user’s web browser can send the file directly to an S3 bucket. You don’t even have to open the bucket up to the public, as a signed upload URL with an expiry grants temporary access to upload a single object.

This is exactly what my colleagues and I at Small Town Heroes did for The Passion: #ikloopmee. When a user has created their own testimonial (which is rendered onto a <canvas> element), the browser uploads the resulting blob directly to S3 using a signed URL.
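Getting a Blob out of a <canvas> is a small callback-based API. A promisified wrapper makes it pleasant to use (`canvasToBlob` is a name of my own, not from the project):

```javascript
// Promisified wrapper around canvas.toBlob() — hypothetical helper.
function canvasToBlob(canvas, type = 'image/png', quality) {
  return new Promise((resolve, reject) => {
    canvas.toBlob((blob) => {
      // toBlob hands us null when the canvas can't be serialized
      if (blob) {
        resolve(blob);
      } else {
        reject(new Error('canvas.toBlob() produced no blob'));
      }
    }, type, quality);
  });
}

// In the browser (assumed selector):
// const blob = await canvasToBlob(document.querySelector('canvas'));
// …then PUT `blob` to the signed URL.
```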

Here’s what our upload code looks like (given an already-signed URL):

// Upload the blob to S3
$.ajax({
    method: 'PUT',
    url: s3Data.signedUrl,
    contentType: s3Data.contentType,
    headers: {
        'Cache-Control': 'max-age=86400000',
    },
    processData: false, // don't let jQuery serialize the blob
    data: blob,
}).done((data) => {
    // Hurray!
}).fail((err) => {
    // Oops!
});

To get this to work, we did have to extend jQuery with a binary transport (a slightly adjusted version of Henry Algus’s original):

/**
 * jquery.binarytransport.js
 * @description jQuery AJAX transport for making binary data type requests.
 * @version 1.1
 * @author Henry Algus <henryalgus@gmail.com>
 * @author Bramus <bramus@bram.us>
 */
// use this transport for "binary" data type
$.ajaxTransport("+binary", function(options, originalOptions, jqXHR){
    // check for conditions and support for blob / arraybuffer response type
    if (window.FormData && ((options.dataType && (options.dataType == 'binary')) || (options.data && ((window.ArrayBuffer && options.data instanceof ArrayBuffer) || (window.Blob && options.data instanceof Blob))))) {
        return {
            // create new XMLHttpRequest
            send: function(headers, callback){
                // setup all variables
                var xhr = new XMLHttpRequest(),
                    url = options.url,
                    type = options.type,
                    async = options.async !== false, // default to true
                    // blob or arraybuffer. Default is blob
                    dataType = options.responseType || "blob",
                    data = options.data || null,
                    username = options.username || null,
                    password = options.password || null;

                xhr.addEventListener('load', function(){
                    var data = {};
                    data[options.dataType] = xhr.response;
                    // make callback and send data
                    callback(xhr.status, xhr.statusText, data, xhr.getAllResponseHeaders());
                });

                xhr.open(type, url, async, username, password);

                // setup custom headers
                for (var i in headers) {
                    xhr.setRequestHeader(i, headers[i]);
                }

                xhr.responseType = dataType;
                xhr.send(data);
            },
            abort: function(){
                return false;
            }
        };
    }
});

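Worth noting: on today’s browsers the binary transport shim can be skipped entirely, since fetch() accepts a Blob body natively. A rough equivalent of the jQuery call above (`uploadToS3` is a name of my own; the `s3Data` shape matches the earlier snippet, and `fetchImpl` is injectable purely so the function can be exercised without a network):

```javascript
// fetch()-based equivalent of the jQuery PUT — a sketch, not the article's code.
async function uploadToS3(s3Data, blob, fetchImpl = globalThis.fetch) {
  const response = await fetchImpl(s3Data.signedUrl, {
    method: 'PUT',
    headers: {
      'Content-Type': s3Data.contentType,
      'Cache-Control': 'max-age=86400000',
    },
    body: blob, // Blob bodies need no special transport with fetch()
  });
  if (!response.ok) {
    throw new Error(`Upload failed with HTTP ${response.status}`);
  }
  return response;
}
```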
Avoiding the burden of file uploads →

Published by Bramus!

Bramus is a frontend web developer from Belgium, working as a Chrome Developer Relations Engineer at Google. From the moment he discovered view-source at the age of 14 (way back in 1997), he fell in love with the web and has been tinkering with it ever since.

Unless noted otherwise, the contents of this post are licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the MIT License.
