Video from Kai Sassnowski’s talk at Laracon EU 2019:
The goal of this talk is to explain how dependency containers work by building our own. We start out by building the simplest DI container possible to demonstrate the underlying concept. Most people will be surprised about how little code this actually takes (3-4 effective lines of code). From there, we gradually add more sophisticated features like autowiring to create a container that more closely resembles what we are familiar with from Laravel. After listening to this talk, you will know how DI containers work at their core. You will also know how autowiring works and why it doesn’t work under certain circumstances.
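To give a taste of the core idea: a DI container is, at its simplest, little more than a map from names to factory functions. The sketch below is not the talk's actual PHP code, but an equivalent minimal container in JavaScript (names like `logger` and `mailer` are made up for illustration):

```javascript
// A minimal DI container: bindings map names to factory functions,
// and resolving a name just invokes its factory.
class Container {
  constructor() { this.bindings = new Map(); }
  bind(name, factory) { this.bindings.set(name, factory); }
  make(name) { return this.bindings.get(name)(this); }
}

// Usage: the factory receives the container itself, so a service
// can resolve its own dependencies.
const app = new Container();
app.bind('logger', () => ({ log: (msg) => console.log(msg) }));
app.bind('mailer', (c) => ({ send: (to) => c.make('logger').log(`mail to ${to}`) }));

app.make('mailer').send('kai@example.com'); // logs "mail to kai@example.com"
```

That really is the whole trick; autowiring (as covered later in the talk) is then a matter of generating these factories automatically from constructor type hints via reflection.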
If you have a (legacy) PHP project running on a legacy server (running PHP 5.4.27 for example), but are locally developing with a more modern PHP version (PHP 7.4 for example), you might end up installing dependencies that are not compatible with the PHP version on the server.
To bypass this, you can tell Composer, via its config property, which PHP version you're using (on production).
When you now run composer require package, it will only install dependencies that are compatible with that PHP version (instead of the PHP version that's executing Composer).
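In composer.json that config looks like this (the version number below matches the example server above):

```json
{
    "config": {
        "platform": {
            "php": "5.4.27"
        }
    }
}
```

You can also set it from the command line with `composer config platform.php 5.4.27`.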
💁‍♂️ You might also want to run that legacy application locally in a Docker container with said PHP version. The Docker West PHP Docker Images will come in handy, although versions prior to PHP 5.6 are not supported.
🔐 Even better, of course, is to phase out the production server running that outdated/unsupported/insecure PHP version …
Did this help you out? Like what you see? Thank me with a coffee.
I don't do this for profit but a small one-time donation would surely put a smile on my face. Thanks!
For an RN app I’m co-developing we have several repos that work together. One of the repos acts as a library for the other repos to use. During development, in order to test a few things out, we sometimes need to have the local dev version of the library repo work with one of the other repos (e.g. the local dev version of the library-repo is a dependency of another repo).
For regular JS apps we’d use yarn link to get this working. For React Native however, that approach doesn’t work: the Metro Bundler can’t cope with symlinked dependencies (See facebook/metro issue #1).
The solution we found was to use wml – which uses watchman under the hood – for this:
Wml is an alternative to symlinks that actually copies changed files from source to destination folders.
Wml is a CLI tool that works pretty much like ln -s. You first set up your links by using the wml add command and then run the wml service (wml start) to start listening. That’s it!
Usage is as follows:
# add the link to wml using `wml add <src> <dest>`
wml add ~/my-package ~/main-project/node_modules/my-package
# start watching all links added
wml start
Do note that wml is not perfect. Quite regularly we noticed that things just stopped working: wrong (cached) includes were made, the bundler would complain about two packages providing React Native, etc. In that case the solution was to quit and reset just about everything:
# remove all wml links
wml rm all
# reset watchman
watchman watch-del-all
# clean up node_modules and reinstall dependencies
rm -rf ./node_modules
yarn install
# now re-add your wml links
Also note that wml alters your .watchmanconfig file in the source folder so that it ignores the local node_modules folder. Don’t forget to reset it once you’ve stopped using wml.
So yeah … it’s a bit of an unstable solution (but it might help you forward).
Reading up on the original Metro issue I noticed that this possible workaround was mentioned in it … haven’t tested it though.
Talk by Hannes Van De Vreken, as given at the recent phpCE conference in Poland:
Did you know your IoC container can do a whole lot more than just constructor injection? Besides that it is actually packed with features. Inflectors, resolving callbacks, aliasing, method invocation to name a few. In this talk you will learn how to leverage the power of a great container and service providers to write better, loosely coupled code. Well designed code put together by your IoC container will make your applications SOLID, modular, lean and decoupled from the framework!
Hannes also just published the slides (from his rendition at Confoo Montreal 2018):
Whilst checking out the aforementioned IMcD23/TabView and a few other iOS/macOS libraries I could not help but notice the lack of CocoaPods and the presence of Carthage. Apparently the community is now leaning more towards the latter.
Carthage is intended to be the simplest way to add frameworks to your Cocoa application.
Carthage builds your dependencies and provides you with binary frameworks, but you retain full control over your project structure and setup. Carthage does not automatically modify your project files or your build settings.
Dependencies are defined in a Cartfile:
# Require version 2.3.1 or later
github "ReactiveCocoa/ReactiveCocoa" >= 2.3.1
# Require version 1.x
github "Mantle/Mantle" ~> 1.0 # (1.0 or later, but less than 2.0)
# Require exactly version 0.4.1
github "jspahrsummers/libextobjc" == 0.4.1
# Use the latest version
github "jspahrsummers/xcconfigs"
# Use the branch
github "jspahrsummers/xcconfigs" "branch"
# Use a project from GitHub Enterprise
github "https://enterprise.local/ghe/desktop/git-error-translations"
# Use a project from any arbitrary server, on the "development" branch
git "https://enterprise.local/desktop/git-error-translations2.git" "development"
# Use a local project
git "file:///directory/to/project" "branch"
# A binary only framework
binary "https://my.domain.com/release/MyFramework.json" ~> 2.3
After running carthage update, the lockfile Cartfile.resolved (which you should commit into your version control system!) will be created. Dev dependencies can be stored in a Cartfile.private file.
So why is Carthage getting more popular? From their README, these two here caught my attention:
CocoaPods (by default) automatically creates and updates an Xcode workspace for your application and all dependencies. Carthage builds framework binaries using xcodebuild, but leaves the responsibility of integrating them up to the user. CocoaPods’ approach is easier to use, while Carthage’s is flexible and unintrusive.
Carthage has been created as a decentralized dependency manager. There is no central list of projects, which reduces maintenance work and avoids any central point of failure.
In comparison to npm, the Yarn website puts these three main benefits forward:
You don’t need to learn an entirely new (*) syntax (unlike the switch from grunt to gulp to webpack).
You don’t need to find your packages on a different repository/website (unlike the switch from bower to npm).
You don’t need to change your directory structure (Yarn still uses package.json and puts everything in the node_modules subfolder).
For us, the end user, not much changes. Everything that was, still is. From a UX perspective that can count.
Working with Yarn: “It’s a Unix System, I know this!”
(*) I know. The syntax is a tad different, but not that much: Once yarn is installed, just run yarn (or yarn install) instead of npm install. To add a package to your package.json run yarn add packagename (instead of npm i packagename --save). Slightly different indeed, but nothing big.
Of course I’m also excited about the initial benefits listed. Take reliability for example. Coming from a PHP background – where we have Composer for dependency management – I applaud the fact that Yarn – like Composer – automatically generates lock files.
Yarn Review so far:
👉 It's Super Fast. Like woah.
👉 Creates package.json if one doesn't exist
👉 Saves Exact versions in a .lock file
Thanks to the yarn.lock file you can no longer end up with different versions of dependencies on different machines should a new version of a dependency be released in between two runs of yarn install. The lock file, as its name indicates, locks the dependencies down to exact versions (cf. npm shrinkwrap, but by default and on steroids). That way you have reproducible installs, on all machines.
The magic clue behind it? Whenever you run yarn install, the yarn.lock file has precedence over the package.json.
If the yarn.lock file exists, the (exact) versions defined in it will be used.
If no yarn.lock exists, the (loosely defined) versions defined in package.json will be used, and a yarn.lock is generated.
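A yarn.lock entry records both the loose range from package.json and the exact version that got resolved. A hypothetical entry (package name, versions, and URL are made up for illustration):

```
left-pad@^1.1.3:
  version "1.3.0"
  resolved "https://registry.yarnpkg.com/left-pad/-/left-pad-1.3.0.tgz"
```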
That way you, as a developer, have true control over which exact versions will be installed on every machine. As long as you commit your yarn.lock file into Git (or whatever VCS floats your boat) of course.
Let me rephrase and repeat that for you, as it’s really important: you must commit the yarn.lock file into Git.
To update the versions locked in the yarn.lock file run yarn upgrade. After having verified that your project Works On My Machine™, it’s safe to commit the updated lock file into Git.
Oh, and about the aforementioned speed benefit (thanks to parallelization and the use of a global cache on disk), let this tweet convince you:
NPM install: 2m23s, Yarn first install: 40s, yarn second install: 18s 🚀
Disc is a tool for analyzing the module tree of browserify project bundles. It’s especially handy for catching large and/or duplicate modules which might be either bloating up your bundle or slowing down the build process.
Build your bundle with the --full-paths flag and then pass that to discify:
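Per the disc README, that boils down to something like this (assuming your entry point is named index.js):

```
npm install -g disc
browserify --full-paths index.js > bundle.js
discify bundle.js > disc.html
open disc.html
```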
As an ICT lecturer I have to correct the work our students make. To that end I collect all solutions and put them in a subfolder-organised structure on disk: per student I create a subfolder and put their solution into that folder (*).