Using science to make truly tappable user interfaces

Since the average human finger pad is 10 to 14mm – and the average fingertip is 8 to 10mm – we can pretty easily define a range for what constitutes a “truly tappable UI”:

A truly tappable UI is built with elements that are at minimum around 10mm, with the optimum touch element size around 13mm.

Apple’s HIG (which have been around for quite some time now) still recommends 44 points (~7mm) as the minimum, yet Apple itself seems to prefer bigger hit areas since iOS 10.

With the release of iOS 10 the iTunes controls grew from ~7mm to 12.8mm (which is close to the optimum value of 13mm), as pictured above. I guess the bigger phones have something to do with that.
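If you want to translate those physical sizes into CSS pixels, note that CSS defines one inch as 96px and one inch as 25.4mm. A quick sketch (keeping in mind that the actual physical size varies per device and pixel density):

```javascript
// CSS reference pixel: 96px per inch, 25.4mm per inch
function mmToCssPx(mm) {
  return mm * 96 / 25.4;
}

// The ~7mm Apple HIG minimum vs. the ~13mm optimum from above
console.log(Math.round(mmToCssPx(7)));   // ≈ 26px
console.log(Math.round(mmToCssPx(13)));  // ≈ 49px
```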

Using science to make truly tappable user interfaces →

Fixing HTML Video on Mobile


Samir Zahran on how and why they built Whitewater, an open source video encoder and player for their site:

Common HTML5 Video features such as preloading and autoplay are completely missing in some browsers. The scripting APIs are limited compared to what’s available on desktop. Worst of all, Safari on the iPhone (the most popular mobile browser to visit our site) does not allow inline video playback at all (not until iOS 10 is released)

It renders on <canvas> and requires you to first encode your video to the Whitewater format, which uses diffmaps.
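Whitewater’s own player internals aside, the core trick – mapping a playback time to a frame and painting it onto a `<canvas>` – can be sketched roughly like this (all names here are hypothetical, not Whitewater’s actual API):

```javascript
// Hypothetical sketch: map a playback time to a frame index at a fixed
// framerate, clamped so we never run past the last available frame.
function frameIndexAt(time, fps, frameCount) {
  return Math.min(Math.floor(time * fps), frameCount - 1);
}

// Paint the frame for the given time onto a canvas 2D context.
// `frames` is assumed to be an array of decoded ImageBitmap/Image objects.
function paintFrame(ctx, frames, time, fps) {
  const frame = frames[frameIndexAt(time, fps, frames.length)];
  ctx.drawImage(frame, 0, 0);
}
```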

Fixing HTML Video on Mobile →
Whitewater Mobile Video →

Note: In case you don’t want to encode your videos you can – alternatively – use the hack that powers “players” such as iphone-inline-video: manually change the currentTime of the video element.

Here’s a little snippet of a (non-public) project that I’m working on that might be of help:

play(reset = false) {

  // Reset to start if need be
  if (reset) {
    this.video.currentTime = 0;
  }

  // Store time
  this.lastTime =;

  // "Play" the video, using requestAnimationFrame
  // (_rAFRender is assumed to be bound to `this` in the constructor)
  this.rAF = requestAnimationFrame(this._rAFRender);
}

pause() {
  this.rAF && cancelAnimationFrame(this.rAF);
}

_rAFRender() {

  // Calculate time difference between this and the previous call to _rAFRender
  const time =;
  const elapsed = (time - this.lastTime) / 1000;

  // More time has passed than the framerate allows for
  if (elapsed >= (1 / 25)) {

    // Seek the video to currentTime + elapsed, thus rendering a new frame even though it's not playing
    this.video.currentTime = this.video.currentTime + elapsed;

    // Store time as lastTime
    this.lastTime = time;
  }

  // We are not at the end of the video: keep on rendering
  if (this.video.currentTime < this.video.duration) {
    this.rAF = requestAnimationFrame(this._rAFRender);
  }
}

If you create an <audio> element in parallel, you can even sync up the audio from the video with it – a feature Whitewater lacks.
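A minimal sketch of such syncing, assuming a video element driven by the currentTime hack above and a parallel audio element (the helper names are hypothetical):

```javascript
// Hypothetical helper: decide whether the audio has drifted too far
// from the video and needs to be snapped back in sync.
function needsResync(videoTime, audioTime, threshold = 0.3) {
  return Math.abs(videoTime - audioTime) > threshold;
}

// Sketch: on each render tick, nudge a parallel <audio> element back
// to the video's currentTime whenever the drift exceeds the threshold.
function syncAudio(video, audio) {
  if (needsResync(video.currentTime, audio.currentTime)) {
    audio.currentTime = video.currentTime;
  }
}
```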

Note to self: I urgently need to release this component …

The mobile device lab at Facebook


Insightful post by the folks at Facebook on how they transitioned from testing their apps on a single device to a mobile device lab (holding 1000+ devices) at their Prineville data center.

Having tried out several things, they eventually built their own custom racks which not only hold the devices, but also function as an electromagnetic isolation (EMI) chamber.

Each rack holds eight Mac Minis (or four OCP Leopard servers for Android testing) that drive the phones to install, test, and uninstall the application we’re testing. Each Mac Mini is connected to four iPhones, and each OCP Leopard server is connected to eight Android devices, for a total of 32 phones per rack. The phones connect to Wi-Fi via a wireless access point in each rack. These phones lie on a slightly tilted pegboard so mounted cameras can record their screens. Engineers can access these cameras remotely to learn more about how each phone reacts to a code change.

Right now they have about 60 of these racks, and they are planning on doubling the capacity of each rack from 32 to 64 devices.

UPDATE: Here’s a better picture of such a rack. Note the Mac Minis at the bottom:


The mobile device lab at the Prineville data center →
TechCrunch: Facebook lifts the veil on its mobile device testing lab →

Pepperoni – A delicious blueprint for mobile development


Pepperoni is a blueprint for building cross-platform mobile experiences rapidly with ready-to-use integrated building blocks for common mobile app features, powered by React Native.

The Pepperoni blueprint is crafted on a solid foundation using modern architecture and industry best practices.

Pepperoni →

Facebook: Mobile @Scale London recap


Less than three years ago, engineers from Twitter, LinkedIn, Dropbox, Pinterest, and Facebook — including two from the then brand-new Facebook London office — met at Mobile @Scale in Menlo Park to talk about the challenges of building mobile software at large scale. Last Wednesday, the first Mobile @Scale London showed how far mobile development at scale has come in only a few short years.

Videos of the event are posted on the page. Recommended stuff if titles such as “Scaling iOS @ Google”, “6 lessons learned scaling mobile at SoundCloud”, “Backend-driven native UIs”, “3,000 images per second”, “React Native: Bringing the best of web development to native”, etc. ring a bell.

Mobile @Scale London recap →

Mobile Development with a #devops mindset

Kick-ass presentation which Patrick Debois – the one and only – gave as a guest lecture to my Web & Mobile Development students. In it, he reflects on a recent high-profile mobile app that Small Town Heroes – the company he works at – launched:

This presentation shows how you can improve your mobile development cycle when you understand the devops feedback loop.

Introducing Ionic Lab


We just added a new feature to the Ionic CLI tool that we’re calling Ionic Lab. Ionic Lab makes it much easier to test your apps on multiple screen sizes and platform types. […] Just like serve, it opens your app in a browser, but now it shows you what your app will look like on a phone, with both iOS and Android side by side.

Confirms exactly why I am in love with Ionic. Great addition!

Introducing Ionic Lab →

The end of apps as we know it


The idea of having a screen full of icons, representing independent apps, that need to be opened to experience them, is making less and less sense. The idea that these apps sit in the background, pushing content into a central experience, is making more and more sense. That central experience may be something that looks like a notification centre today, or something similar to Google Now, or something entirely new.

The primary design pattern here is cards. Critically, it’s not cards as a simple interaction design pattern for an app’s content, but as containers for content that can come from any app.

The idea of cards, taken to the next level … “the notification is the interface”.

The end of apps as we know it →