The folks over at Formidable have been experimenting with Houdini and WebGL/Three.js to create futuristic UIs
Futuristic sci-fi UIs in movies often support a story where humans, computers, and interfaces are far more advanced than today, often mixed in with things like superpowers, warp drives, and holograms. What is it about these UIs that makes them feel so futuristic and appealing? Can we build some of them with the web technologies we have today?
The GitHub homepage features a very nice rotating 3D globe, augmented with realtime data shooting around. Here’s how they built it:
At the most fundamental level, the globe runs in a WebGL context powered by three.js. We feed it data of recent pull requests that have been created and merged around the world through a JSON file. The scene is made up of five layers: a halo, a globe, the Earth’s regions, blue spikes for open pull requests, and pink arcs for merged pull requests. We don’t use any textures: we point four lights at a sphere, use about 12,000 five-sided circles to render the Earth’s regions, and draw a halo with a simple custom shader on the backside of a sphere.
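GitHub's actual implementation isn't shown, but placing data points like the spikes and arc endpoints on a sphere boils down to converting latitude/longitude into Cartesian coordinates. A minimal plain-JavaScript helper (names and the Y-up convention are assumptions, not from GitHub's code):

```javascript
// Convert latitude/longitude (in degrees) to a point on a sphere
// of the given radius, using the common Y-up convention.
function latLngToPoint(lat, lng, radius) {
  const phi = (90 - lat) * (Math.PI / 180);    // polar angle, measured from the north pole
  const theta = (lng + 180) * (Math.PI / 180); // azimuthal angle
  return {
    x: -radius * Math.sin(phi) * Math.cos(theta),
    y: radius * Math.cos(phi),
    z: radius * Math.sin(phi) * Math.sin(theta),
  };
}
```

Feed each pull request's coordinates through a helper like this and you have the anchor points for the blue spikes and the endpoints of the pink arcs.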
This is a React renderer for Threejs on the web and react-native. Building a dynamic scene graph becomes so much easier when you can break it up into declarative, re-usable components that react to state changes.
This is less of an abstraction and more of a pure reconciler (like react-dom in relation to HTML). It does not target a specific Threejs version nor does it need updates when Threejs alters, adds or removes features. It won’t change any specifics or rules. There are zero limitations.
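To give an idea of what those declarative, reusable components look like, here's a minimal react-three-fiber sketch (the spinning-cube scene is illustrative, not taken from the project's docs):

```jsx
import { Canvas, useFrame } from '@react-three/fiber'
import { useRef } from 'react'

function SpinningBox() {
  const mesh = useRef()
  // useFrame runs on every rendered frame
  useFrame((_, delta) => { mesh.current.rotation.y += delta })
  return (
    <mesh ref={mesh}>
      <boxGeometry args={[1, 1, 1]} />
      <meshStandardMaterial color="hotpink" />
    </mesh>
  )
}

export default function App() {
  return (
    <Canvas>
      <ambientLight />
      <SpinningBox />
    </Canvas>
  )
}
```

The lowercase elements (`mesh`, `boxGeometry`, …) map straight onto their Three.js counterparts, which is what makes the reconciler version-agnostic.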
About a year ago, Facebook announced a feature named “3D Photos”, a way to show photos taken with Apple’s “Portrait Mode” (or any other device that does the same) interactively:
Whether it’s a shot of your pet, your friends, or a beautiful spot from your latest vacation, you just take a photo in Portrait mode using your compatible dual-lens smartphone, then share as a 3D photo on Facebook where you can scroll, pan and tilt to see the photo in realistic 3D—like you’re looking through a window.
As unearthed by this research, Facebook builds a 3D model out of the image and depth data, and then renders the generated .glb file on screen using Three.js.
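Rendering such a generated .glb yourself is straightforward with Three.js's GLTFLoader addon. A minimal browser sketch (the file name and camera settings are assumptions):

```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

scene.add(new THREE.AmbientLight(0xffffff, 1));

// Load the generated 3D-photo model and add it to the scene
new GLTFLoader().load('photo.glb', (gltf) => {
  scene.add(gltf.scene);
});

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```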
For example, here’s the wireframe of the kangaroo pictured at the top of this post:
3D wireframe of the kangaroo (Yuri akella Artiukh)
A photo taken in Apple’s Portrait Mode is in essence no more than the flat photo combined with a depth map. A depth map is a grayscale photo where white defines points that are close by and pure black defines points that are farthest away. Using the depth map, you can then blur the content that is furthest away.
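That blur step is a simple mapping from gray value to blur strength. A sketch in plain JavaScript (the function name and linear mapping are illustrative assumptions):

```javascript
// A depth map encodes distance as gray: 255 (white) = nearest,
// 0 (black) = farthest. Map a pixel's gray value to a blur radius,
// so the farthest-away content gets blurred the most.
function blurRadius(gray, maxBlur) {
  const depth = gray / 255;     // 1 = near, 0 = far
  return (1 - depth) * maxBlur; // near → no blur, far → maxBlur
}
```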
Circling back to Facebook: if you upload a file named photo.jpg along with a file named photo_depth.jpg, Facebook will treat the latter as the depth map for photo.jpg and create a post with a 3D photo from them.
Uploading a photo and its depth map to become one 3D photo
If you don’t have a depth map of a photo, you can always create one yourself manually using Photoshop or any other image editing tool.
Certain advertisers have used this technique a few times by now, as illustrated on Omnivirt:
Tools like the online 3D Photo Creator have a depth prediction algorithm built in. The result is most likely not as good as your own DIY depth map, yet it gives you a head start.
🤖 Psst, as a bonus: check the console of said tool to see the link to the resulting .glb file float by 😉
To go the other way around – from 3D photo to photo and depth map – you can use a tool such as the Facebook 3D Photo Depth Analyzer to extract both the photo and the depth map from a 3D photo post.
Just enter the Facebook post ID and hit analyze 🙂
Another approach to show a 3D photo is to use WebGL. With this technique you don’t need to generate a .glb, but can directly use a photo and its accompanying depth map:
Photo Sphere Viewer uses the Three.js library, so nothing is required for your visitors except for a browser compatible with canvas or, better, WebGL.
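The core of the WebGL approach is to displace the vertices of a plane along the view axis by the depth value sampled at each vertex (Three.js materials offer a `displacementMap` property for exactly this). Here is the underlying idea hand-rolled in plain JavaScript (the grid layout and names are assumptions for illustration):

```javascript
// Displace a grid of vertices along z using a grayscale depth map.
// `depth` is a flat array of gray values (0-255), one per vertex,
// laid out row by row for a (width x height) grid.
function displaceVertices(depth, width, height, scale) {
  const positions = [];
  for (let row = 0; row < height; row++) {
    for (let col = 0; col < width; col++) {
      const gray = depth[row * width + col];
      positions.push({
        x: col / (width - 1) - 0.5,  // center the plane around the origin
        y: 0.5 - row / (height - 1),
        z: (gray / 255) * scale,     // white (near) pops toward the viewer
      });
    }
  }
  return positions;
}
```

Tilting the camera slightly as the visitor pans then produces the parallax that makes the photo feel 3D.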
Radio broadcasts leave Earth at the speed of light and travel outwards into space. Follow them through the Milky Way as you scroll backwards through time and listen to what the stars hear.
Built using Three.js. Taking a look at the source code reveals all tracks used.
And oh, don’t forget the “Inverse Square Law of Propagation”:
Although Lightyear.fm has radiowaves reaching over 100 lightyears into space, due to the Inverse Square Law of Propagation, any terrestrial radio broadcast would become nothing but background noise just a few light years away from Earth.
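The law is simple enough to state in code: an isotropic broadcast of power P spreads over a sphere of surface area 4πd², so doubling the distance quarters the received intensity. A quick sketch (free-space spreading only, ignoring absorption):

```javascript
// Received intensity of an isotropic source of power P (watts)
// at distance d, per the inverse square law: I = P / (4 * PI * d^2)
function intensityAt(power, distance) {
  return power / (4 * Math.PI * distance * distance);
}
```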
The aforementioned ViziCities, the 3D city and data visualisation platform, is now open source. Early build available at GitHub. Demo also available (click to drag, SHIFT+click to rotate). Neat, really neat!