There are quite a few things that prevent people from learning and acquiring new skills. This isn’t necessarily a bad thing: what many of these ‘blockers’ have in common is that they tend to help us navigate unfamiliar environments and new problems. Some of these ingrained heuristics are biological (i.e. infants begin to exhibit them early). Some are cultural.
We have a built-in physics model that helps us interact with the world around us. It’s accurate enough for day-to-day living but wrong enough to interfere with our understanding of how physics actually works. Many of our instincts and preconceptions prevent us from internalising a more accurate model.
Another habit of ours, at least in the west, is to perceive dichotomies—binary opposites—in our surroundings. It’s a tendency that can be useful: it helps us quickly pick one action among alternatives. “Do you want to go out for a drink or stay in tonight?”
Then there’s our tendency to adapt everything into a familiar story, preferably one that reinforces our preconceptions and ideologies. Again, this is useful because stories are the oldest and most effective tool humanity has for transferring information and skills, probably as old as language itself. That’s why you should always socialise with your colleagues during coffee breaks: the stories the old-timers tell contain valuable information about how the work is done.
These are all valuable tactics. As long as you aren’t trying to understand a complex system, that is.
The following is from a blog post written by a teacher trying to get across his frustration with just how computer illiterate most people seem to be.
Kids Can’t Use Computers… And This Is Why It Should Worry You:
> The truth is, kids can’t use general purpose computers, and neither can most of the adults I know. There’s a narrow range of individuals whom, at school, I consider technically savvy. These are roughly the thirty to fifty year-olds that have owned a computer for much of their adult lives. There are, of course, exceptions amongst the staff and students. There are always one or two kids in every cohort that have already picked up programming or web development or can strip a computer down to the bare bones, replace a motherboard, and reinstall an operating system. There are usually a couple of tech-savvy teachers outside the age range I’ve stated, often from the Maths and Science departments who are only ever defeated by their school laptops because they don’t have administrator privileges, but these individuals are rare.
I know from personal experience (teaching, and sitting next to the customer-service guy at my last place of employment as he headdesked in frustration all day) that he’s completely right. Adults have worn their computer illiteracy as a badge of pride for years now, so it shouldn’t surprise anyone that their children share their digital inadequacies. Moreover, neither group is even willing to try to solve a problem when they encounter one.
As recent privacy and malware issues have demonstrated, there is a price to be paid for computer illiteracy. And nobody should ever be proud of being ignorant. If not knowing something is a nonissue, then it’s a nonissue, but don’t brag about it.
That blogger also demonstrates his linguistic ignorance when he explains that he likes to be an arsehole whenever somebody uses the term ‘internet’ to mean ‘my access to the internet’ instead of the internet itself. As in ‘the internet isn’t working’.
I mean, just how stupid do you have to be to not realise that almost everybody who says this knows very well that the entire internet hasn’t stopped working? It’s analogous to saying ‘the TV channels aren’t working’ when your cable TV set-top box is on the fritz. It doesn’t mean you think those channels aren’t broadcasting. It means that you don’t have access to any of them.
Misunderstanding language like this isn’t just stupid; it demonstrates a wilful ignorance of spoken English, wilful because he’s clearly heard the phrase often enough to understand what people are actually trying to say.
The following is from a blog post written by Jeff Eaton, a programmer with customer support and teaching experience:
> The problem is not that a URL and a Search Term are two different things. The problem is that that particular distinction is one of thousands that are hidden under the surface of simple computer and internet tasks. What’s the difference between a “program” and a “web site?” What’s the difference between a local and a remote file? What’s a remote file? What’s caching? How do you tell the difference between a browser window that looks like a dialog box, and a modal window that contains a browser pane? Because guess what? All of those things matter at some point — and somewhere out there is a development team working hard to blur the distinction for their application, just for the hell of it.
His point is that computers are very complex things, more complex than those of us familiar with them think they are. A person can be intelligent, highly specialised, well educated, and still not be interested in learning how to properly use a computer. Why should they? Computers are more complex than they have to be and the payoff for understanding that complexity is, for most people, very limited.
Computers can be too complex and people can be too lazy to apply themselves in computing. You can criticise people for taking pride in ignorance and also criticise computers for being needlessly complex. Despite what many commenters seem to think, pointing out the latter does not invalidate the former. And, conversely, pointing out the former doesn’t invalidate the latter.
Don’t fall into the trap of assuming that you have to accept the binary as presented.
There is a risk in presenting facts as a neat story: storytelling is often too effective a tool for presenting ideas and ‘facts’, trumping data, statistics, and research. Turning the usual ‘kids know more about computers than teachers’ story on its head makes for a neat narrative, especially since it happens to be terrifyingly true, but it also obscures important problems with how modern computing works.
Namely, that computers are much too complex and difficult to use.