Fun with browsers: how to get an image into the current page

Christian Heilmann created a demo page where a user can add an image to the page in various ways.

I gave myself the task of building an interface that makes it as easy as possible for a user to add an image into the document. I wanted to support:

  • Image upload
  • Drag and Drop
  • Copy and Paste

Each scenario requires a tad of JS. Here’s a pen with the final result:

Fun with browsers: how to get an image into the current page →
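The pen above has the full details, but the drop and paste scenarios both boil down to pulling image files out of a `DataTransfer`/`ClipboardData` items list. A minimal sketch — the `getImageFiles` helper and the `showImage` callback are my own names, not from the pen:

```javascript
// Extract all image Files from a DataTransfer (drop) or ClipboardData (paste) items list
function getImageFiles(items) {
    return Array.from(items)
        .filter((item) => item.kind === 'file' && item.type.startsWith('image/'))
        .map((item) => item.getAsFile());
}

// Wiring it up — drop:
// document.addEventListener('drop', (e) => {
//     e.preventDefault();
//     getImageFiles(e.dataTransfer.items).forEach(showImage);
// });
//
// …and paste:
// document.addEventListener('paste', (e) => {
//     getImageFiles(e.clipboardData.items).forEach(showImage);
// });
```

The same helper serves both events, since drop exposes its items on `e.dataTransfer` and paste on `e.clipboardData`.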

react-dropzone

Simple HTML5 drag-drop zone to drop files on.

import React from 'react';
import Dropzone from 'react-dropzone';

export default class DropzoneDemo extends React.Component {

  onDrop(acceptedFiles, rejectedFiles) {
    console.log("Accepted files: ", acceptedFiles);
    console.log("Rejected files: ", rejectedFiles);
  }

  render() {
    return (
        <Dropzone accept="image/*" multiple={true} onDrop={this.onDrop}>
          <div>This dropzone accepts only images. Try dropping some here, or click to select files to upload.</div>
        </Dropzone>
    );
  }

}

In the onDrop handler you can then process the files for uploading.
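What that processing looks like is up to you; a common first step is guarding against oversized files before kicking off uploads. A small sketch — the helper name and the 5 MB limit are my own, not part of react-dropzone:

```javascript
// Split a list of File-like objects into uploadable ones and rejects,
// based on a maximum size in bytes
function partitionBySize(files, maxBytes) {
    const ok = [];
    const tooBig = [];
    for (const file of files) {
        (file.size <= maxBytes ? ok : tooBig).push(file);
    }
    return { ok, tooBig };
}

// In onDrop:
// const { ok, tooBig } = partitionBySize(acceptedFiles, 5 * 1024 * 1024);
// ok.forEach(upload);
// tooBig.forEach((f) => console.warn(`${f.name} is too large`));
```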

react-dropzone
react-dropzone Source (GitHub) →

Uploading files using AJAX directly to S3

Chris White, Laravel Developer:

For most situations using S3 is a no-brainer, but the majority of developers transfer their users’ uploads to S3 after they have received them on the server side. This doesn’t have to be the case: your user’s web browser can send the file directly to an S3 bucket. You don’t even have to open the bucket up to the public. Signed upload URLs with an expiry will allow temporary access to upload a single object.

This is exactly what my colleagues at Small Town Heroes and I did for The Passion: #ikloopmee. Once users have created their own testimonial (which is rendered onto a <canvas> element), the resulting blob is uploaded directly from the browser to S3 using a signed URL.

Here’s what our upload code looks like (given an already-signed URL):

// Upload the blob to S3
$.ajax({
    method: 'PUT',
    contentType: s3Data.contentType,
    headers: {
        'Cache-Control': 'max-age=86400000',
        // additional 'x-amz-*' headers go here
    },
    processData: false,
    url: s3Data.signedUrl,
    data: blob,
}).done((data) => {
    // Hurray!
}).fail((err) => {
    // Oops!
});
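The `s3Data` payload above is whatever your server-side signing endpoint returns. As an illustration only — the field names and the helper are hypothetical, not from our actual code — it could carry the URL, the content type, and an expiry the client can check before firing off the PUT:

```javascript
// Hypothetical shape of a signing response:
// const s3Data = {
//     signedUrl: 'https://my-bucket.s3.amazonaws.com/key?X-Amz-Signature=…',
//     contentType: 'image/png',
//     expiresAt: 1500000000000, // ms timestamp after which the URL is invalid
// };

// Check that we have a signed URL and that it hasn't expired yet
function isSignedUrlUsable(s3Data, now) {
    return Boolean(s3Data && s3Data.signedUrl) && now < s3Data.expiresAt;
}
```

Checking client-side spares you a confusing 403 from S3 when a user leaves the page open past the expiry window.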

In order to get this to work, we did have to extend jQuery with a binary transport (a slightly adjusted version of this original):

/**
 *
 * jquery.binarytransport.js
 *
 * @description. jQuery ajax transport for making binary data type requests.
 * @version 1.1
 * @author Henry Algus <[email protected]>
 * @author Bramus <[email protected]>
 *
 */

// use this transport for "binary" data type
$.ajaxTransport("+binary", function(options, originalOptions, jqXHR){
    // check for conditions and support for blob / arraybuffer response type
    if (window.FormData && ((options.dataType && (options.dataType == 'binary')) || (options.data && ((window.ArrayBuffer && options.data instanceof ArrayBuffer) || (window.Blob && options.data instanceof Blob)))))
    {
        return {
            // create new XMLHttpRequest
            send: function(headers, callback){
                // setup all variables
                var xhr = new XMLHttpRequest(),
                url = options.url,
                type = options.type,
                async = options.async !== false, // async unless explicitly disabled
                // blob or arraybuffer. Default is blob
                dataType = options.responseType || "blob",
                data = options.data || null,
                username = options.username || null,
                password = options.password || null;

                xhr.addEventListener('load', function(){
                    var data = {};
                    data[options.dataType] = xhr.response;
                    // make callback and send data
                    callback(xhr.status, xhr.statusText, data, xhr.getAllResponseHeaders());
                });

                xhr.open(type, url, async, username, password);

                // setup custom headers
                for (var i in headers) {
                    xhr.setRequestHeader(i, headers[i]);
                }

                xhr.responseType = dataType;
                xhr.send(data);
            },
            abort: function(){
                return false;
            }
        };
    }
});

Avoiding the burden of file uploads →

Did this help you out? Like what you see?
Consider donating.

I don’t run ads on my blog, nor do I do this for profit. A donation, however, would always put a smile on my face. Thanks!

☕️ Buy me a Coffee ($3)

Html5 File Upload with Progress

HTML5 finally solves an age-old problem: being able to upload files while also showing the upload progress. However, it is fairly complicated and not for the faint of heart, because you are essentially taking over the entire server-side processing (when you tap into the byte stream), and that includes implementing the multipart/form-data protocol on the server side, along with a bunch of other things.
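The client-side half of the progress story is the `progress` event on `XMLHttpRequest.upload`. A small sketch — the percentage helper and the `/upload` endpoint are my own, not from the linked article:

```javascript
// Turn a progress event's loaded/total counters into a whole percentage,
// clamped to 0–100 (total can be 0 when the length isn't computable)
function progressPercent(loaded, total) {
    if (!total) return 0;
    return Math.min(100, Math.round((loaded / total) * 100));
}

// Wiring it to an upload:
// const xhr = new XMLHttpRequest();
// xhr.upload.addEventListener('progress', (e) => {
//     progressBar.value = progressPercent(e.loaded, e.total);
// });
// xhr.open('POST', '/upload');
// xhr.send(formData);
```

Note that the events fire on `xhr.upload`, not on `xhr` itself — listening on the wrong object is a classic gotcha here.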

Html5 File Upload with Progress →

Enabling Image Upload in Staff Panel of Kayako Support Suite

When it comes to features and functionality, Kayako Support Suite is a fine product indeed. One thing that is lacking, though, is the ability to upload new images to the server so you can insert them into a KB article, for example. It took me a little while, but I found out how to enable that functionality straight in the WYSIWYG editor that ships with Kayako (the extremely outdated HTMLArea). I’m quite sure I’ll be making some other folks out there very happy with this guide, as this is a much-requested feature 🙂

Continue reading “Enabling Image Upload in Staff Panel of Kayako Support Suite”