The GitHub homepage features a very nice rotating 3D globe, augmented with real-time data arcing around it. Here’s how they built it:
At the most fundamental level, the globe runs in a WebGL context powered by three.js. We feed it data of recent pull requests that have been created and merged around the world through a JSON file. The scene is made up of five layers: a halo, a globe, the Earth’s regions, blue spikes for open pull requests, and pink arcs for merged pull requests. We don’t use any textures: we point four lights at a sphere, use about 12,000 five-sided circles to render the Earth’s regions, and draw a halo with a simple custom shader on the backside of a sphere.
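In Three.js terms, the lighting trick boils down to something like the sketch below. This is not GitHub’s actual code: the colors, intensities, and positions are guesses, it only shows the “no textures, just lights pointed at a sphere” idea.

```js
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(40, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// The globe itself: an untextured sphere whose shading comes entirely from lights.
const globe = new THREE.Mesh(
  new THREE.SphereGeometry(1, 64, 64),
  new THREE.MeshStandardMaterial({ color: 0x1b2b4b })
);
scene.add(globe);

// A handful of lights aimed at the sphere stand in for a texture.
// (Colors and positions are made up for this sketch.)
const lights = [
  new THREE.DirectionalLight(0x4060ff, 1.0),
  new THREE.DirectionalLight(0x80a0ff, 0.6),
  new THREE.DirectionalLight(0xffffff, 0.4),
  new THREE.AmbientLight(0x102040, 0.8),
];
lights[0].position.set(-2, 1, 3);
lights[1].position.set(2, -1, 3);
lights[2].position.set(0, 3, 2);
lights.forEach((light) => scene.add(light));

(function animate() {
  globe.rotation.y += 0.002;
  renderer.render(scene, camera);
  requestAnimationFrame(animate);
})();
```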
glcheck is a WebGL-focused testing framework. It runs unit and render tests through Puppeteer, allowing it to automate testing and generate coverage reports for both WebGL 1 and WebGL 2 applications.
About a year ago, Facebook announced a feature named “3D Photos”, a way to show photos taken with Apple’s “Portrait Mode” (or any other device that does the same) interactively:
Whether it’s a shot of your pet, your friends, or a beautiful spot from your latest vacation, you just take a photo in Portrait mode using your compatible dual-lens smartphone, then share as a 3D photo on Facebook where you can scroll, pan and tilt to see the photo in realistic 3D—like you’re looking through a window.
As unearthed by this research, Facebook builds a 3D model out of the image + depth data, and then renders the generated .glb file on screen using Three.js.
For example, here’s the wireframe of the kangaroo pictured at the top of this post:
3D wireframe of the kangaroo (Yuri akella Artiukh)
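Rendering such a generated .glb with Three.js comes down to a GLTFLoader call. A minimal sketch, with a hypothetical file name standing in for the generated asset:

```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 2;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

scene.add(new THREE.AmbientLight(0xffffff, 1));

// 'kangaroo.glb' is a placeholder for whatever file Facebook generates.
new GLTFLoader().load('kangaroo.glb', (gltf) => {
  scene.add(gltf.scene);
  renderer.setAnimationLoop(() => {
    gltf.scene.rotation.y += 0.005; // subtle motion to show off the depth
    renderer.render(scene, camera);
  });
});
```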
~
A photo taken in Apple’s Portrait Mode is in essence no more than the flat photo combined with a depth map. A depth map is a grayscale photo where white marks the points closest by and pure black the points farthest away. Using the depth map, you can then blur the content that is furthest away.
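That “blur the far bits” step can be expressed as a tiny fragment shader. A rough sketch only, not Apple’s or Facebook’s actual implementation; the uniform names and blur strength are made up:

```js
// GLSL fragment shader (kept in a JS string) that blurs pixels more the
// darker (farther away) their depth-map value is.
const depthOfFieldFrag = /* glsl */ `
  precision highp float;
  varying vec2 vUv;
  uniform sampler2D photo;     // the flat photo
  uniform sampler2D depthMap;  // white = near, black = far

  void main() {
    // 0.0 = near (keep sharp), 1.0 = far (blur the most)
    float far = 1.0 - texture2D(depthMap, vUv).r;
    float radius = far * 0.01;

    // Cheap 9-tap box blur whose radius grows with distance.
    vec4 color = vec4(0.0);
    for (int x = -1; x <= 1; x++) {
      for (int y = -1; y <= 1; y++) {
        color += texture2D(photo, vUv + vec2(float(x), float(y)) * radius);
      }
    }
    gl_FragColor = color / 9.0;
  }
`;
```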
Swinging back to Facebook: if you upload a file named photo.jpg along with a file photo_depth.jpg, Facebook will treat the latter as the depth map for photo.jpg, and create a post with a 3D photo from them.
Uploading a photo and its depth map to become one 3D photo
~
If you don’t have a depth map of a photo, you can always create one yourself manually using Photoshop or any other image editing tool.
Certain advertisers have used this technique a few times by now, as illustrated on Omnivirt:
Tools like the online 3D Photo Creator have a depth prediction algorithm built in. The result is most likely not as good as your own DIY depth map, yet it gives you a head start.
🤖 Psst, as a bonus you can check the console in said tool to see the link to the resulting .glb float by 😉
~
To go the other way around – from 3D photo to photo and depth map – you can use a tool such as the Facebook 3D Photo Depth Analyzer to extract both the photo and the depth map from a 3D photo post.
Just enter the Facebook post ID and hit analyze 🙂
~
Another approach to show a 3D photo is to use WebGL. With this technique you don’t need to generate a .glb, but can directly use a photo and its accompanying depth map:
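A minimal sketch of that idea in Three.js: displace a densely subdivided plane using the depth map, texture it with the photo, and tilt it with the mouse. The file names and displacement values here are placeholders, not taken from any particular demo:

```js
import * as THREE from 'three';

const loader = new THREE.TextureLoader();
const photo = loader.load('photo.jpg');        // the flat photo
const depth = loader.load('photo_depth.jpg');  // its depth map

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 10);
camera.position.z = 2;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// Lots of segments so the displacement map has vertices to push around.
const plane = new THREE.Mesh(
  new THREE.PlaneGeometry(1.5, 1, 256, 256),
  new THREE.MeshStandardMaterial({
    map: photo,
    displacementMap: depth,  // white (near) pushes vertices toward the camera
    displacementScale: 0.15,
  })
);
scene.add(plane);
scene.add(new THREE.AmbientLight(0xffffff, 1));

// Tilt the plane slightly with the mouse for the "looking through a window" effect.
addEventListener('mousemove', (e) => {
  plane.rotation.y = (e.clientX / innerWidth - 0.5) * 0.4;
  plane.rotation.x = (e.clientY / innerHeight - 0.5) * 0.4;
});

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```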
William Candillon recreated some parts of Instagram’s editing process (namely the filter, rotate, and crop steps) in React Native.
The filters are implemented with GL-React:
Gl-react offers a really efficient paradigm for integrating React and OpenGL together. The library enables you to easily build scenes using React composability. For instance, consider an image with two filters: Brannan and Valencia. It can be expressed as below. The GLImage implementation is based on gl-react-image.
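To give an idea of what such a composition looks like, here’s a small sketch built on gl-react’s documented Node/Surface API. The Brannan and Valencia components below are simplistic stand-ins (a single tint shader), not the article’s actual filter shaders:

```jsx
import React from "react";
import { Shaders, Node, GLSL } from "gl-react";
import { Surface } from "gl-react-dom"; // use gl-react-native on mobile

// One trivial tint shader doubling as both "filters" for this sketch.
const shaders = Shaders.create({
  tint: {
    frag: GLSL`
      precision highp float;
      varying vec2 uv;
      uniform sampler2D t;
      uniform vec3 tint;
      void main() {
        gl_FragColor = vec4(texture2D(t, uv).rgb * tint, 1.0);
      }
    `,
  },
});

const Brannan = ({ children }) => (
  <Node shader={shaders.tint} uniforms={{ t: children, tint: [1.05, 1.0, 0.9] }} />
);
const Valencia = ({ children }) => (
  <Node shader={shaders.tint} uniforms={{ t: children, tint: [1.0, 0.95, 1.05] }} />
);

// Filters compose like any other React components; the innermost child is the image.
const FilteredPhoto = ({ uri }) => (
  <Surface width={300} height={300}>
    <Brannan>
      <Valencia>{uri}</Valencia>
    </Brannan>
  </Surface>
);
```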
AR.js is Efficient Augmented Reality for the Web using ARToolKit. It is fast! It runs efficiently even on phones: 60 fps on my 2-year-old phone! It is a pure JavaScript solution, fully open source, and it works on any phone with WebGL and WebRTC.
During the first “Uber Visualization Night” on June 20, 2017, Nicolas Belmonte from Uber’s Data Visualization team gave a nice introduction to the aforementioned deck.gl:
Radio broadcasts leave Earth at the speed of light and travel outwards into space. Follow them through the Milky Way as you scroll backwards through time and listen to what the stars hear.
Built using Three.js. Taking a look at the source code reveals all tracks used.
And oh, don’t forget the “Inverse Square Law of Propagation”:
Although Lightyear.fm has radio waves reaching over 100 light-years into space, due to the Inverse Square Law of Propagation any terrestrial radio broadcast would become nothing but background noise just a few light-years away from Earth.
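A quick back-of-the-envelope check of that claim, assuming an arbitrary 50 kW station radiating isotropically:

```js
// Received power density falls off as P / (4 * pi * d^2).
const LIGHT_YEAR_M = 9.461e15;    // metres in one light-year
const transmitterWatts = 50_000;  // example value for a big FM station

const fluxAt = (metres) => transmitterWatts / (4 * Math.PI * metres ** 2);

console.log(fluxAt(100_000));     // ~4e-7 W/m² at 100 km
console.log(fluxAt(LIGHT_YEAR_M)); // ~4.4e-29 W/m² one light-year out: vanishingly faint
```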