Web dev at the end of the world, from Hveragerði, Iceland

Writing when tech has broken the web’s social contract


I continue to work through my worries about where we are. I’m still thinking about where I stand in an industry that no longer seems to care about what it makes. It’s become obvious that the software industry doesn’t care about software.

It’s only getting worse

Is anyone else noticing that software just… doesn’t work right anymore? Every single booking system I’ve tried to use in the last week has had a serious revenue-affecting bug (airline, cab reservation, temporary rental, doctor appointment). Two of them were time zone bugs, one was a form field validation bug on a required field, the last was “you can’t get an assigned seat unless you call us”.

Valerie Aurora

Of course, I’m biased since I both wrote a long essay on this a couple of years ago and an entire book last year.

My theory, which I outlined in the book, is that software quality and business success, especially for individual managers, have been completely decoupled. Insisting on software development methods that are known to reduce defects and improve user experience is more likely to harm your career than help it. High-level managers and executives get rewarded for project size, not long-term success. The software industry has never been particularly good at what it does, but it’s been getting worse.

Just look at all the lay-offs.

Any executive that has led their company to a situation where they have to lay off 10–20% of their workforce is by definition incompetent.

They’ve led their company into a horrifying financial and organisational disaster with detrimental consequences that will echo on for an entire decade. Their management put the company in dire straits. In a rational market, they would have been forced to resign, if not outright fired.

Instead, they became a role model. Just as with all the other incredibly poor software development decisions these companies have made, the rest of the industry decided to follow and copy them.

And, their executives get rewarded as well.

Basically, that our software is deteriorating should not be a surprise to anyone. It’s an irrational market, governed by people who don’t know what they’re doing.

The AI frustration

Nowhere is that more evident than with AI. A few weeks ago Google’s CEO, Sundar Pichai, went on “60 Minutes” and made a number of glaringly false statements that betray either a complete lack of understanding of the technology he was promoting or complete dishonesty.

I prefer to think he was just being ignorant, but that just emphasises my point that these people should all have been fired.

I’m getting a bit off track here, but the software industry is going down a path that is just so very disappointing.

Watching executives making promises based on a fundamental misunderstanding of the tech they’re hyping, while every single one of their actions seems tailor-made to increase defects, bugs, and degrade user experience is…

Frustrating is what it is. Frustrating.

The one consistent aspect of this deterioration is the industry’s inability to think about the larger context, specifically about the various social contracts that govern tech’s position in society.

With AI, tech has broken the web’s social contract

The problem is that quite a few people in tech don’t believe in any social contract. Their conceptualisation of society is that it’s an equilibrium of dominant wills motivated by mimetic desire. That the rich are on top; that the rest of us aspire to be like them; and that any and all criticism towards them is born from jealousy. The world can only be improved by those with power over others. Any form of pro-social reasoning, consensus-building, or genuine negotiations seems to be alien to them.

These people are reactionary libertarian assholes, and they are tech’s ruling class. They might see themselves as benevolent shepherds of humanity’s future, esp. the creepy longtermist types, but by and large, they are power-hungry libertarian assholes.

This is why they leave scorched earth behind.

It’s not a secret that much of what the tech industry has done has what economists would euphemistically call “negative externalities”.

Microsoft’s inability to manage software defects meant that, for close to two decades, society had to bear the cost of dealing with the fundamentally broken security of most versions of Windows. Their incompetence created a $4 billion USD market for antivirus software at the “peak” of Windows’s insecurity in 2004, and nobody knows how many billions in actual costs to society from virus infections and hacks.

(In a former life I worked for an antivirus software vendor. If anything, we’re vastly underestimating the destructive cost Microsoft inflicted on our economies with their security incompetence during these years.)

It just keeps going on.

Android is a mess of malware and unpatched phones.

AirBnB seems specifically designed to increase property prices beyond what locals can afford.

The ‘gig’ economy is designed to completely undermine general pay and job security.

Their advertising-oriented universal surveillance is the authoritarian government’s wet dream. There are too many examples to fully list out.

But AI is the latest.

The old social contract

Now, the deal with the web has generally been rather simple:

  • We and media companies put stuff up on the web for free. Some of us do it for business reasons. Some of it is personal. Much of it is just culture. People being people.
  • Tech companies use this stuff, but mostly in an economically complementary way. Search engines use it to help you find what you’re looking for. Social media sites show you ads. Some of it is processed in order to improve the services they provide, such as spellcheck or autocomplete.

This has been breaking down for a while. Translators got hit hard by translation software, and that was only possible by processing texts made by human translators.

Google has steadily been manoeuvring their search engine results to more and more replace the pages in the results. Some of this has resulted in lawsuits or outright legislation. Many of Google’s legal issues in the EU stem from this shift.

Tech’s universal surveillance has also pushed the boundaries of what many found acceptable. Even my parents use ad blockers now.

But, language and diffusion models go further. The deal tech is offering us there is also simple:

  • We and media companies put stuff up on the web for free. Some of us do it for business reasons. Some of it is personal.
  • Tech companies use this stuff to create systems that can make shoddy, degraded versions of our work, deepfake us, and make convincing fake online personas for astroturfing, destroying our work, businesses, and social interactions.

This is a bad deal.

This is not a remotely fair deal for those of us on the “putting stuff up on the web for free” side of the equation. It doesn’t matter whether it’s illegal or not—though legislation is probably the only way to get the tech industry to stop it—because the social contract is broken.

Our incentive recedes in lockstep with the increasing dominance of generative AI content. As it recedes, fewer and fewer people and organisations will contribute to the digital commons. More and more stuff will be locked behind a paywall.

This has already affected my relationship with the web.

In terms of percentages, I posted fewer extracts from The Intelligence Illusion online than I did from Out of the Software Crisis. I’ve also all but stopped posting photographs like I used to. I may be blogging more, but I’m also worrying more about whether I should.

I’m at the point in time when, under normal circumstances, the strategic thing to do would be to push on with writing more online. It would normally be the most effective way to advance my career.

But now there’s a question mark against all public writing.

Do I gamble that the flood of language model texts will put a premium on thoughtful writing? That I’m not just improving the models by putting more writing out in the world?

Do I figure out ways of putting more of my writing behind some sort of pay- or login wall, even though that would be counterproductive for my career? Wouldn’t that also just disconnect me from my friends and the online community in general?

Do I just ignore the fact that I’m helping train the generative pap that tech hopes will replace us all?

I don’t know.

I know I’ll keep writing about it. Just writing this post helped clarify my thoughts and put order to my emotions.

Will I continue to post my writing online?

I hope so. I hope online writing survives this.

But I’m no longer certain it will.

The best way to support this blog or my newsletter is to buy one of my books, The Intelligence Illusion: a practical guide to the business risks of Generative AI or Out of the Software Crisis.

You can also find me on Mastodon and Bluesky.