Something is a moral hazard if a party, A, makes a decision they otherwise wouldn’t, knowing that any risk is shouldered by a different party, B.
This scenario requires A to know more than B, which we call information asymmetry.
With this definition in mind, I think some parts of software development are a moral hazard.
Silos and Handovers
If developers don’t run the software they build and instead pass it to an operations team, or let testers do all the validation, this is a moral hazard.
Developers are isolated from the consequences of writing shoddy software because they won’t be the ones woken up in the middle of the night to fix it. Operators shoulder the risk of code written by the developers.
Put developers on-call for the software they write, and this information asymmetry vanishes.
The quality of code will almost certainly increase because no-one wants to write shoddy code if they are the ones being woken up in the middle of the night.
Churn
Software development has one of the highest turnover rates of any industry.
When demand is high for digital skill-sets, it’s easier for developers to hop between companies and roles.
If a developer chooses a new technology, or a particular architecture, knowing that the ramifications of that decision won’t be felt for several years and they likely won’t be around to see it, this is a moral hazard.
Some of the pain from these choices is felt immediately. Some won’t manifest for a while, because the issues might only crop up at scale.
Benji Weber, a former colleague from my time at Unruly, wrote about this in his departure blogpost after 9 years at the company.
His point about how fast the software industry moves, and the “hype-cycle” of technology, applies here. Developers who work on a project after you shoulder the risk of your choices.
We worked together on a project using a “hot new tech”. The act of finally decommissioning that same project was one of the high-points of my time at Unruly.
Use technologies that are proven, open-source, and (some would proudly say) boring, and design based on open standards to remove this hazard.
(I am guilty of this myself - I foisted a technology choice onto a team and then moved teams. In hindsight, this was unfair of me.)
We should shoulder the risk of the decisions we make, and draw on the empowerment that comes from owning our mistakes when things go wrong.
Forcing that risk on other people is fundamentally unkind. Flex your empathy muscle when dealing with information asymmetry and try to correct that imbalance if you can.
November is National Blog Posting Month, or NaBloPoMo. I’ll be endeavouring to write one blog post per day in the month of November 2019 - some short and sweet, others long and boring.