Uploading files using AJAX directly to S3

Chris White, Laravel Developer:

For most situations, using S3 is a no-brainer, but the majority of developers transfer their users’ uploads to S3 only after receiving them on the server side. This doesn’t have to be the case: your user’s web browser can send the file directly to an S3 bucket. You don’t even have to open the bucket up to the public — a signed upload URL with an expiry grants temporary access to upload a single object.

This is exactly what my colleagues at Small Town Heroes and I did for The Passion: #ikloopmee. Once the user has created their own testimonial (which is rendered onto a <canvas> element), the resulting blob is uploaded directly from the browser to S3 using a signed URL.

Here’s what our upload code looks like (given an already signed URL):

// Upload the blob to S3
$.ajax({
    method: 'PUT',
    contentType: s3Data.contentType,
    headers: {
        'Cache-Control': 'max-age=86400000',
    },
    processData: false,
    url: s3Data.signedUrl,
    data: blob,
}).done((data) => {
    // Hurray!
}).fail((err) => {
    // Oops!
});
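For what it’s worth, the same request can be expressed without jQuery these days: fetch accepts a Blob body natively, so no custom transport is needed. A rough equivalent, assuming the same s3Data shape as above (the helper name is mine):

```javascript
// Build the arguments for a direct fetch() PUT to a presigned S3 URL.
// s3Data is assumed to have the same shape as above: { signedUrl, contentType }.
function buildS3Put(s3Data, blob) {
    return {
        url: s3Data.signedUrl,
        options: {
            method: 'PUT',
            headers: {
                'Content-Type': s3Data.contentType,
                'Cache-Control': 'max-age=86400000',
            },
            body: blob,
        },
    };
}

// In the browser:
// const { url, options } = buildS3Put(s3Data, blob);
// fetch(url, options).then(/* Hurray! */).catch(/* Oops! */);
```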

In order to get this to work, we did have to extend jQuery with a binary transport (which is a slightly adjusted version of this original):

/**
 * jquery.binarytransport.js
 * @description. jQuery ajax transport for making binary data type requests.
 * @version 1.1
 * @author Henry Algus <henryalgus@gmail.com>
 * @author Bramus <bramus@bram.us>
 */

// use this transport for "binary" data type
$.ajaxTransport("+binary", function(options, originalOptions, jqXHR){
    // check for conditions and support for blob / arraybuffer response type
    if (window.FormData && ((options.dataType && (options.dataType == 'binary')) || (options.data && ((window.ArrayBuffer && options.data instanceof ArrayBuffer) || (window.Blob && options.data instanceof Blob))))) {
        return {
            // create new XMLHttpRequest
            send: function(headers, callback){
                // setup all variables
                var xhr = new XMLHttpRequest(),
                    url = options.url,
                    type = options.type,
                    async = options.async || true,
                    // blob or arraybuffer. Default is blob
                    dataType = options.responseType || "blob",
                    data = options.data || null,
                    username = options.username || null,
                    password = options.password || null;

                xhr.addEventListener('load', function(){
                    var data = {};
                    data[options.dataType] = xhr.response;
                    // make callback and send data
                    callback(xhr.status, xhr.statusText, data, xhr.getAllResponseHeaders());
                });

                xhr.open(type, url, async, username, password);

                // setup custom headers
                for (var i in headers) {
                    xhr.setRequestHeader(i, headers[i]);
                }

                xhr.responseType = dataType;
                xhr.send(data);
            },
            abort: function(){
                return false;
            }
        };
    }
});
Avoiding the burden of file uploads →


Man DDOSes his S3 bucket by adding its images in a Google Spreadsheet

I log in to my AWS account to see what is going on, and I see this: $1,177.76 in usage charges! A thousand, one hundred, seventy-seven dollars. Out of which $1,065 in outgoing bandwidth transfer costs. The scary part: 8.8 terabytes of outgoing traffic! Tera. Not giga. Terabytes.

To make things worse, I realized that the cost was going up hour after hour. Fifty to a hundred dollars more in billing charges with each. passing. hour. I started sweating.

The Google attack: How I attacked myself using Google Spreadsheets and I ramped up a $1000 bandwidth bill →

(via @codepo8)