For the past year and a half, it’s been our privilege to work on one of our largest and most ambitious undertakings ever: collaborating closely with a team of Facebook engineers, designers, and data experts to roll out a global, multi-scale base map for all of Facebook’s billions of users. In late 2020, this map went live, and we’re extremely proud of the results.
With the new Facebook coming soon to all users, the developers saw an opportunity to build a11y in from the start:
To make the new site more accessible, we were able to introduce guardrails right from the beginning, integrate focus management into the core infrastructure, support features that weren’t available when we built the original site in 2004, and build in monitoring and analysis to help prevent regressions as we continue to add new features.
One of the things I like, and something that's been discussed often (even long before this GitHub Issue), is the introduction of a generic Heading component. Leveraging React Context, it renders contextual headings.
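The core of the idea can be sketched without any React at all: derive the heading tag from the current nesting depth instead of hard-coding it. In the real component that depth would live in a React Context; here it's an explicit parameter, purely for illustration (this is not Facebook's code).

```typescript
// Toy model of a "contextual heading": the tag is picked from the
// nesting depth, clamped to the valid h1–h6 range, so a component
// never needs to know its absolute heading level.
function headingTag(nestingLevel: number): string {
  const clamped = Math.min(Math.max(nestingLevel, 1), 6);
  return `h${clamped}`;
}
```

A `<Section>` component would increment the level in Context on every nesting, and `<Heading>` would simply render `headingTag(level)`.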
Recoil is the state management library for React that they use at Facebook.
Recoil lets you create a data-flow graph that flows from atoms (shared state) through selectors (pure functions) and down into your React components. Atoms are units of state that components can subscribe to. Selectors transform this state either synchronously or asynchronously.
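To make the atom/selector data flow concrete, here's a tiny toy model of it in plain TypeScript. This is not Recoil's real API (which works with React hooks like `useRecoilValue`), just a sketch of the concept: an atom holds shared state and notifies subscribers, and a selector is a pure function deriving from it.

```typescript
type Listener = () => void;

// An "atom": a unit of shared state that can be subscribed to.
function createAtom<T>(initial: T) {
  let value = initial;
  const listeners = new Set<Listener>();
  return {
    get: () => value,
    set: (next: T) => {
      value = next;
      listeners.forEach((l) => l()); // re-run every subscriber
    },
    subscribe: (l: Listener) => listeners.add(l),
  };
}

// A "selector": derived state as a pure function over atom values.
const fontSize = createAtom(14);
const fontSizeLabel = () => `${fontSize.get()}px`;
```

In real Recoil a component subscribing to `fontSize` would re-render automatically whenever it changes, and `fontSizeLabel` would recompute for free.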
Best to watch the talk from React Europe 2020 embedded above; it clearly explains why Recoil came to exist, and when and how to use it.
Throughout the process, we anchored our work around two technical mantras:
As little as possible, as early as possible. We should deliver only the resources we need, and we should strive to have them arrive right before we need them.
Engineering experience in service of user experience. The end goal of our development is all about the people using our website. As we think about the UX challenges on our site, we can adapt the experience to guide engineers to do the right thing by default.
🤔 A bit weird they named the CSS section “Rethinking CSS to unlock new capabilities” though, as they’re basically using CSS as it is meant to be used: em for font sizes so that users can zoom, CSS Custom Properties for theming / dark mode, etc. #embracetheplatform
I’m a curious person, always interested in opening up the browser DevTools to see how things are made on a website. This is the first time I’ve blogged about something like this: I found some interesting uses of different CSS features (at least they were new to me) and wanted to share them with you.
Always interesting to take a peek at how a Big Company is doing things…
Ashley discusses some of the technologies and strategies powering FB5, the new facebook.com. Topics covered include Facebook’s approach to CSS-in-JS, data-driven dependencies, phased code downloading, and more.
Just out: a new layout for Facebook. You might recognize the style they’ve been pushing out in Messenger and the Facebook App. Also comes with Dark Mode.
The new layout will be rolled out to everyone over the next few months, but you can already opt in manually by clicking the arrow on the right in the navbar and choosing the “Switch to new Facebook” option.
Those new Profile pages look really nice. Also love how photos are shown in a full-screen overlay (with a blurred background).
Photo of my son, Noah, on Facebook.
Feels very app-like, and looks touch-friendly (look at the size of the close button, for example).
One issue with it all though: it doesn’t really play nicely with bigger screens imo. Take a look at the new Home:
Photo of the new Facebook Home, on a wide screen.
Looks much better on a narrow screen:
Photo of the new Facebook Home, in a narrow window.
And yes, it’s a full React app. You can check it using the React DevTools 😉
About a year ago, Facebook announced a feature named “3D Photos”, a way to show photos taken with Apple’s “Portrait Mode” (or any other device that does the same) interactively:
Whether it’s a shot of your pet, your friends, or a beautiful spot from your latest vacation, you just take a photo in Portrait mode using your compatible dual-lens smartphone, then share as a 3D photo on Facebook where you can scroll, pan and tilt to see the photo in realistic 3D—like you’re looking through a window.
As unearthed by this research, Facebook builds a 3D model out of the image + depth data, and then renders the generated .glb file on screen using Three.js.
For example, here’s the wireframe of the kangaroo pictured at the top of this post:
3D wireframe of the kangaroo (Yuri akella Artiukh)
A photo taken in Apple’s Portrait Mode is in essence no more than the flat photo combined with a depth map. A depth map is a grayscale photo where white marks points that are close by and pure black marks points that are farthest away. Using the depth map, you can then blur the content that is farthest away.
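That mapping from depth to blur can be sketched in a few lines. A hypothetical helper, assuming 8-bit depth values following the convention above (255 = white = nearby, 0 = black = farthest); the maximum radius is made up:

```typescript
// Far pixels get the largest blur radius, near pixels none.
function blurRadius(depth: number, maxRadius = 10): number {
  const farness = 1 - depth / 255; // 0 for nearby, 1 for farthest away
  return Math.round(farness * maxRadius);
}
```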
Swinging back to Facebook: if you upload a file named photo.jpg along with a file photo_depth.jpg, Facebook will treat the latter as the depth map for photo.jpg, and create a post with a 3D photo from them.
Uploading a photo and its depth map to become one 3D photo
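The naming convention is simple enough to capture in a one-liner. A hypothetical helper (my own, not anything Facebook ships) that derives the expected depth-map filename from a photo's filename:

```typescript
// photo.jpg pairs with photo_depth.jpg: insert "_depth" right
// before the file extension.
function depthMapNameFor(photoName: string): string {
  return photoName.replace(/\.([a-z0-9]+)$/i, "_depth.$1");
}
```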
If you don’t have a depth map of a photo, you can always create one yourself manually using Photoshop or any other image editing tool.
Certain advertisers have used this technique a few times by now, as illustrated on Omnivirt:
Tools like the online 3D Photo Creator have a depth prediction algorithm built in. The result is most likely not as good as your own DIY depth map, yet it gives you a head start.
🤖 Psst, as a bonus you can check the console to see the link to the resulting .glb float by in said tool 😉
To go the other way around – from 3D photo to photo and depth map – you can use a tool such as the Facebook 3D Photo Depth Analyzer to extract both the photo and the depth map from a 3D photo post.
Just enter the Facebook post ID and hit analyze 🙂
Another approach to show a 3D photo is to use WebGL. With this technique you don’t need to generate a .glb, but can directly use a photo and its accompanying depth map:
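The heart of that WebGL trick is a per-pixel displacement driven by the depth map. Sketched here on the CPU with hypothetical names and a made-up strength factor (a real implementation would sample the depth texture in a fragment shader): mid-depth points stay put, while near and far points shift in opposite directions as the pointer moves, producing the parallax illusion.

```typescript
// Shift a texture coordinate based on the depth at that point and
// the current pointer position.
function parallaxOffset(
  coord: number,    // texture coordinate, 0..1
  depth: number,    // depth sample, 0 (far) .. 1 (near)
  pointer: number,  // normalized pointer position, -1..1
  strength = 0.05
): number {
  return coord + (depth - 0.5) * pointer * strength;
}
```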
Data Selfie is a browser extension that tracks you while you are on Facebook to show you your own data traces and reveal what machine learning algorithms could predict about your personality based on that data.
The tool explores our relationship to the online data we leave behind as a result of media consumption and social networks – the information you share consciously and unconsciously.
Yesterday I tweeted about this one, but it’s too good not to mention here too. It’s the video clip for “Bear Claws” by “The Academic”.
For this video, the band used the lag of a Facebook Live video to their advantage, effectively creating a loop sampler. Also note the clever use of the lighting, which changes color per “section”, adding a nice visual effect too.