It took me too long to realise that opinions aren’t just opinions.
Some are strongly held beliefs. Others are more loosely held but grounded in a rational argument. Still others are driven entirely by emotion; positions people take just because they think arguing is fun; even opinions held only to offend, dropped as soon as people become inured to the shock.
Some people are open to debate, in that they will listen to an argument. Others are open to debate only in that they enjoy argument as a sport.
All too often—especially in tech—those opinions aren’t genuine but are instead bullshit. Debating them is pointless as these people neither believe nor disbelieve what they’re saying. They don’t care about the facts or the argument.
As Harry Frankfurt argued in his book On Bullshit:
For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.
Online tech discourse is dominated by bullshit. Some of it comes from grifters. Some of it from people who are deliberately manipulative.
But I’d argue that they are in a minority. Only a minority of those who spout bullshit in tech do it because they benefit from it in some way. Most bullshit because bullshitting is the norm in tech and people are very good at picking up whatever practices are normative in the surrounding culture.
Tech is run by bullshitters. Both VCs and tech executives are predominantly people with, as some of my friends in Bristol used to put it, the gift of the gab. Bullshitting is second nature to them, and the rest of the industry picks it up as a cultural norm. Some of them are grifters, sure, but those are far outnumbered by the horde of salaried programmers and managers online who keep regurgitating the latest bullshit coming from the upper echelons of the tech industries.
That’s the tell. Actual bullshitters shift and adapt their bullshit to maximise their personal short term gain. The regurgitators repeat—sometimes verbatim—the bullshit coming from the people in the industry they respect. They may even consider the second-hand bullshit to be a factual and rational opinion.
You can’t debate this.
Not only is it impossible to talk bullshitters out of their bullshit (your engagement is exactly what they want; it is the purpose of the bullshit), but it's equally futile to try to talk people out of second-hand bullshit. All you will get is equally second-hand reasoning and half-baked rationalisations.
What this means is that when it comes to topics in tech, whether it's “AI” or a web development framework, debating strangers online is completely and utterly pointless. Figuring out which person is genuinely open to discussing an idea and which is repeating opinions they think they should have is extremely hard.
The sensible thing to do under these circumstances is to avoid debating people online unless you know them well enough to be reasonably certain their opinions are based on curiosity and reason.
Internet randoms don’t have a right to your time.
- "Heading off confusion: When do headings fail WCAG? - TPGi"
- "AddyOsmani.com - Write about what you learn. It pushes you to understand topics better."
- "LoFi software and inverting our relationship to The Cloud – chadkohalyk.com"
- "Fantasy Meets Reality – cabel.com"
- "As HashiCorp adopts the BSL, an era of open-source software might be ending". Relevant to what I wrote the other day about free/open source being in a transition.
- "Adactio: Journal—Automation"
- "Progressively Enhanced HTML Accordion — Adrian Roselli"
- "Amazon removes books ‘generated by AI’ for sale under author’s name | Books | The Guardian". The dynamic tech cos are creating is one where the only thing that works to address misbehaviour is a raging mob with pitchforks. Which is just going to result in a lot more raging mobs.
- "No regrets: Gender-affirming chest surgery in adults has long-term satisfaction | Ars Technica". This isn’t the only survey to show long-term satisfaction and extremely low regret rates (there’s a ton of them), but this one is pleasantly conclusive.
- "Negative Space - Ed Zitron’s Where’s Your Ed At"
- "Forgive The Writers, For We Are So Tired – Chuck Wendig: Terribleminds"
- "What if Generative AI turned out to be a Dud?". Like I’ve been saying for months, the functionality for LLMs generally isn’t there, and what positive functionality is there is overestimated by investors, executives, and legislators.
- "These Women Warned Of AI’s Dangers And Risks Long Before ChatGPT – Rolling Stone"
- "Not so private room: Zoom’s AI privacy fiasco exposes how vulnerable we are to Big Tech’s whims"
- "AI Is Starting to Look Like the Dot Com Bubble". The similarities have been obvious for months now. The main difference is that the AI bubble comes at the beginning of a near-exponential rise in environmental and political volatility, so whether it improves or not honestly won’t matter that much.
- "Unions strike to save the world from bad TV and movies | New Scientist"
- "AI Causes Real Harm. Let’s Focus on That over the End-of-Humanity Hype - Scientific American"
- "Google says AI systems should be able to mine publishers' work unless companies opt out | Artificial intelligence (AI) | The Guardian"