Webbed Briefs – Brief videos about web technologies and how to make the most of them.

Heydon is back with a new project named “Webbed Briefs”:

WEBBED BRIEFS are brief videos about the web, its technologies, and how to make the most of them. They’re packed with information, fun times™, and actual goats. Yes, it’s a vlog, but it isn’t on YouTube. Unthinkable!

The first video is entitled “What Is ARIA Even For?”, and indicates where this is headed: tons of information, fast paced, lots of mindfarts, and quite a lot of cursing …

Webbed Briefs →

Speed up build times with this little Git trick

When building applications on build pipelines like GitHub Actions, Google Cloud Build, CircleCI, etc., every second counts. Here’s a small trick I use to speed up build times: when cloning the repo from its Git source, I instruct Git to do a shallow clone of only the branch it is building.

💡 If you’re running a prebuilt “git clone” step on your build pipeline, it most likely already uses this trick. Be sure to double-check though.
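
For example, on GitHub Actions the actions/checkout action takes a fetch-depth input, which since v2 already defaults to a shallow, single-commit clone. A minimal sketch, assuming actions/checkout@v2:

steps:
  - uses: actions/checkout@v2
    with:
      fetch-depth: 1 # explicitly: only the latest commit of the branch being built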

~

Shallow Cloning

When doing a git clone you, by default, get the entire history of the project along with it. Take this clone of Laravel for example:

$ git clone [email protected]:laravel/laravel.git
Cloning into 'laravel'...
remote: Enumerating objects: 19, done.
remote: Counting objects: 100% (19/19), done.
remote: Compressing objects: 100% (15/15), done.
remote: Total 32004 (delta 5), reused 11 (delta 3), pack-reused 31985
Receiving objects: 100% (32004/32004), 9.94 MiB | 6.98 MiB/s, done.
Resolving deltas: 100% (18934/18934), done.

That’s a whopping 32004 objects totalling ±10MiB that have been downloaded, even though the default Laravel branch only contains 66 files spread across 36 directories.

The objects contained in this ±10MiB make up the entire history of every file and folder in the project. To build the project we don’t really need all that, as we’re only interested in the latest version of each file and folder. By passing the --depth argument to our git clone command, we can fetch just that. This is what we call Shallow Cloning.

$ git clone --depth 1 [email protected]:laravel/laravel.git
Cloning into 'laravel'...
remote: Enumerating objects: 108, done.
remote: Counting objects: 100% (108/108), done.
remote: Compressing objects: 100% (88/88), done.
remote: Total 108 (delta 6), reused 49 (delta 1), pack-reused 0
Receiving objects: 100% (108/108), 41.80 KiB | 535.00 KiB/s, done.
Resolving deltas: 100% (6/6), done.

That’s a much speedier clone with only 108 objects, totalling a mere ±40KiB!

🤔 You could argue that 10MiB worth of objects is still OK to clone, but think of scenarios where you have a big “monorepo” with plenty of branches … then you’ll be talking about hundreds of wasted MiBs, if not GiBs.
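
💡 Side note: should you later need the full history after all, a shallow clone can be converted back into a full one:

$ git fetch --unshallow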

~

Single Branch Cloning

When building a project you’re building only one specific branch. Information about the other branches is irrelevant at that time. To directly clone one specific branch we can use the --branch option to target said branch. With that alone we won’t get there though, as we still need to discard information about the other branches. This is where the --single-branch option comes into play:

$ git clone --branch 3.0 --single-branch [email protected]:laravel/laravel.git
Cloning into 'laravel'...
remote: Enumerating objects: 20392, done.
remote: Total 20392 (delta 0), reused 0 (delta 0), pack-reused 20392
Receiving objects: 100% (20392/20392), 5.79 MiB | 853.00 KiB/s, done.
Resolving deltas: 100% (12731/12731), done.

Here we’ve cloned only the 3.0 branch of Laravel, resulting in nearly 12000 fewer objects being downloaded.

By checking the output of git branch -a we can also verify that no other branch info has been fetched:

$ cd laravel
$ git branch -a
* 3.0
  remotes/origin/3.0

~

Shallow Cloning + Single Branch Cloning

By combining both we can download only the latest files of a specific branch. Since --single-branch is implied when using --depth, we can drop it, and our command will look like this:

$ git clone --depth 1 --branch <branchname> <repo>

Here’s an example downloading the Laravel 3.0 branch:

$ git clone --depth 1 --branch 3.0 [email protected]:laravel/laravel.git
Cloning into 'laravel'...
remote: Enumerating objects: 545, done.
remote: Counting objects: 100% (545/545), done.
remote: Compressing objects: 100% (465/465), done.
remote: Total 545 (delta 78), reused 293 (delta 45), pack-reused 0
Receiving objects: 100% (545/545), 1.34 MiB | 832.00 KiB/s, done.
Resolving deltas: 100% (78/78), done.

~

With this in place you’ll see your build times drop by minutes, especially when working on a monorepo with many branches.


🐳 Building Docker Images on Google Cloud Build? Check out this trick to enable caching.

Operator Lookup — Search JavaScript Operators

Handy tool by Josh W. Comeau to look up JavaScript operators.

JavaScript Operator Lookup →

🤩 If you’ve been following bram.us for a while you might already know that my favourite operators are the optional chaining operator (?.) and the nullish coalescing operator (??). Definitely check them out, as they will change the way you write your JavaScript code.
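
A quick sketch of both, using a made-up user object:

const user = { profile: { name: 'Bramus' } };

// Optional chaining: short-circuits to undefined instead of throwing
const city = user.profile?.address?.city; // undefined

// Nullish coalescing: falls back only on null/undefined, not on '' or 0
const nickname = user.profile?.nickname ?? 'Anonymous'; // 'Anonymous'

console.log(city, nickname); // undefined 'Anonymous'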

Container Queries are coming to Chromium!

Just announced on the Chromium mailing list is an “Intent to Prototype” for Container Queries, which is quite exciting news 🎉

🤔 Container Queries?

Container Queries allow authors to style elements according to the size of a container. This is similar to a @media query, except that it evaluates against a container instead of the viewport.

The experimental implementation will follow Miriam Suzanne’s proposal, which looks like this:

main,
aside {
  contain: size; /* (1) Create an implicit "container root" or "containment context" */
}

.media-object {
  display: grid;
  gap: 1em;
}

@container (max-width: 45em) { /* (2) When the nearest `contain: size` ancestor has a max-width of 45em … */
  .media-object { /* … apply these rules onto .media-object */
    grid-template: 'img content' auto / auto 1fr;
  }
}

Using contain: size (1) will create an implicit “container root” or “containment context” on that element. Elements contained inside it can then have container queries applied onto them, by use of a new at-rule @container (<container-media-query>) (2). The target selector and CSS rules to apply in that case are nested within the @container at-rule, just like we already do with other at-rules.

In the example above, extra rules will be applied to .media-object whenever its nearest ancestor with size containment set, such as <main> or <aside>, is at most 45em wide.
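
For clarity, here’s some hypothetical markup the CSS above could apply to; each .media-object responds to the width of its own <main> or <aside> container rather than to the viewport:

<main>
  <div class="media-object">
    <img src="…" alt="…">
    <div class="content">…</div>
  </div>
</main>
<aside>
  <div class="media-object">…</div>
</aside>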

~

A previous version of this proposal by L. David Baron required a context selector to be set, but that has been dropped here. The @container rule from Miriam’s version will work in any containment context (read: the nearest ancestor element that has contain: size set). The syntax might still change, but that’s irrelevant to the prototype that is to be implemented:

This is not at all finalized, but the underlying problems we need to solve in Blink are (mostly) the same regardless of how the feature is accessed, so we’ll for now use this proposal as the temporary syntax.

~

Intent to Prototype: Container Queries →
Chrome Tracking Bug →


HTML Forms: How (and Why) to Prevent Double Form Submissions

When double-clicking a submit button, your form will be sent twice. Using JavaScript you can prevent this from happening, but wouldn’t it be nice if this behavior could be tweaked by use of an attribute on the <form>? If you think so, feel free to give this issue a thumbs up.

Today Sebastian wondered:

I quickly chimed in saying that I do tend to lock up forms when submitting them. Let me explain why …

~

I started locking up submitted forms as users of the apps I’m building reported that sometimes the actions they had taken, such as adding an entry, were performed twice. It took me some time to figure out what exactly was going on, but eventually I found out this was because they were double-clicking the submit button of the forms. As they double-clicked the button, the form got sent over twice. By locking up forms after their first submission, all subsequent submissions, such as those triggered by the second click of a double click, are ignored.

~

To prevent these extra form submissions from being made I don’t hijack the form with Ajax, nor do I do anything else complicated. I leave the inner workings of forms in the browser untouched: when pressing the submit button the browser will still collect all form data, build a new HTTP request, and execute that request.

What I simply do is extend the form’s capabilities by adding a flag, by means of a CSS class, onto the form to indicate whether it’s being submitted or not. I can then use this flag’s presence to deny any subsequent submissions, and also hook some CSS styles onto it. Progressive Enhancement, yay! 🎉

The code looks as follows:

// Prevent Double Submits
document.querySelectorAll('form').forEach((form) => {
  form.addEventListener('submit', (e) => {
    // Prevent if already submitting
    if (form.classList.contains('is-submitting')) {
      e.preventDefault();
      return;
    }

    // Add class to hook our visual indicator on
    form.classList.add('is-submitting');
  });
});

💡 Although the problem initially was a double-click problem, we don’t listen for clicks on the submit button but for the form’s submit event. This way our code not only works when clicking any of the submit buttons, but also when pressing Enter to submit.

With that .is-submitting class in place, we can then attach some extra CSS onto the form to give the user visual feedback. Here are a few examples:

See the Pen “Prevent Form Double Submits” by Bramus (@bramus) on CodePen.

See the Pen “Prevent Form Double Submits (Alternative version)” by Bramus (@bramus) on CodePen.
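
If you can’t open the pens, here’s a minimal sketch of the kind of CSS you could hook onto that class (my own illustrative styles, not necessarily what the pens use):

form.is-submitting {
  opacity: 0.5;     /* dim the whole form while it's being submitted */
  cursor: progress; /* hint that work is in progress */
}

form.is-submitting [type="submit"] {
  pointer-events: none; /* also ignore further clicks on the submit button(s) */
}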

Note that this solution might not cover all possible scenarios, as it doesn’t take failing networks and other things that might go wrong into account. However, as I’m relying on the basic mechanisms of the web, I think I can also rely on the browser to show its typical “Confirm Form Resubmission” interstitial should a timeout occur. Additionally, if need be, one could always unlock the form after a fixed amount of time. That way the user will be able to re-submit it.
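
Such an unlock could look like this, as a variation on the snippet above (the 10-second timeout is an arbitrary value of my own choosing):

document.querySelectorAll('form').forEach((form) => {
  form.addEventListener('submit', (e) => {
    // Block any extra submission while one is already in flight
    if (form.classList.contains('is-submitting')) {
      e.preventDefault();
      return;
    }

    form.classList.add('is-submitting');

    // Arbitrary: unlock again after 10 seconds so the user can re-submit if needed
    setTimeout(() => form.classList.remove('is-submitting'), 10000);
  });
});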

~

Dealing with double form submissions isn’t a new issue at all. You’ll find quite a few results when running a few queries through Google, something I only found out after stumbling upon the issue myself.

Back in 2015 (!) Mattias Geniar also pondered this, after being pulled into the subject from a sysadmin’s point of view. Now, when even sysadmins are talking about an HTML/UX issue, you know there’s something going on. This made me wonder why browsers behaved like that and how we could solve it:

As a result I decided to open an issue at the WHATWG HTML Standard repo, suggesting a way to fix this at the spec level:

An attribute on <form> to tweak this behavior – instead of having to rely on JavaScript – would come in handy and form a nice addition to the spec.

I see two options to go forward:

  1. Browsers/the standard keep the current behavior and allow multiple submits. Developers must opt in to prevent multiple submissions using a preventmultiplesubmits attribute.
  2. Browsers/the standard adjust the current behavior to only submit forms once. Developers must opt in to allow multiple submissions using an allowmultiplesubmits attribute.

The initial response to the issue was rather low, and it looks like this isn’t that big of a priority.

Back then I was more in favor of the second solution, but now I’m quite undecided, as changing the default behavior, which has been around for ages, is quite tricky.

~

Another way this issue could be fixed is at the browser level: if browsers were to treat double clicks on form submit buttons as single clicks, then the double-click problem (but not the double-submit problem) could also be taken care of.

To then attach styles to forms being submitted, a CSS pseudo-class :submitting would come in handy. Come to think of it, this pseudo-class would be quite a nice addition to CSS in itself, no matter whether this double-click issue gets solved or not.
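
Purely hypothetical, since no browser implements such a pseudo-class, but it could look something like this:

/* Hypothetical: a :submitting pseudo-class does not exist (yet) */
form:submitting {
  opacity: 0.5;
}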

~

Swinging back to the addition to the HTML spec I suggested: if you do think it’s something the HTML spec could benefit from, feel free to add a thumbs up to the issue to indicate that you want this, or add an extra comment if you have more input on the subject.


Convert Guzzle requests to curl commands with namshi/cuzzle

The other day the namshi/cuzzle PHP package came in really handy.

This library lets you dump a Guzzle request to a cURL command for debug and log purposes

This way I could test some things on the CLI, and easily share these tests with all my colleagues, including those without PHP installed.

use Namshi\Cuzzle\Formatter\CurlFormatter;
use GuzzleHttp\Psr7\Request; // PSR-7 request class used by Guzzle 6+

$request = new Request('GET', 'example.local');
$options = [];

echo (new CurlFormatter())->format($request, $options);
// ~> curl example.local -X GET -A 'GuzzleHttp/6.4.1 curl/7.71.1 PHP/7.4.9'

It also comes with a Monolog formatter to easily log the resulting curl commands in your log files. Do keep in mind that you might be leaking sensitive information (passwords/tokens) that way …

Installation via Composer:

composer require namshi/cuzzle

Cuzzle, cURL command from Guzzle requests →

Viscosity – OpenVPN Client for Mac and Windows

Speaking of VPN connections: to easily manage and connect to (Open)VPN connections I use Viscosity.

A first class OpenVPN client for Mac and Windows that lets you secure your network with ease & style.

You can either manually configure your connections, or simply drop your .ovpn config files onto it. The app sits neatly out of the way in the Menu Bar (Mac) or Task Bar’s Notification Area (Windows).

Viscosity – OpenVPN Client for Mac and Windows →

Show the routing tables on Mac / Linux

In a project we at vBridge are working on, we rely on a Virtual Private Network (VPN) to link our connected devices, certain servers, and our web app together. I had an issue where a specific server in the 10.55/24 range was not reachable.

While debugging the issue, going deeper into the rabbit hole called the 5 Whys, I eventually needed to verify whether the proper routes from the VPN connection had been set up or not. To do so, I used netstat:

$ netstat -nr
Routing tables

Internet:
Destination        Gateway            Flags        Netif Expire
default            192.168.83.1       UGSc           en0       
default            link#19            UCSI        utun10    
10.77/16           10.77.0.5          UGSc        utun10       
10.77.0.5          10.77.0.5          UH          utun10       
127                127.0.0.1          UCS            lo0       
127.0.0.1          127.0.0.1          UH             lo0       
169.254            link#5             UCS            en0      !
192.168.83         link#5             UCS            en0      !
192.168.83.1/32    link#5             UCS            en0      !
…
255.255.255.255    ff:ff:ff:ff:ff:ff  UHLWbI         en0      !
255.255.255.255/32 link#19            UCSI        utun10      

…

As you can see, the 10.55/24 route indeed wasn’t registered, which explains why the host wasn’t reachable.
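
To check which route a specific destination would take, you can also query the routing table directly. A quick sketch, using a hypothetical 10.55.0.1 address:

$ route -n get 10.55.0.1   # macOS
$ ip route get 10.55.0.1   # Linux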
