I've noticed a fascinating trend: the number of parents who, inspired by their child's endless games of Guitar Hero or Halo, have decided that their child has unlimited potential in the IT field.
I think that is a symptom of society's general belief that IT people are a type of overpaid handyman. (Society has a similar disdain for actual handymen, who are likely also more skilled than we think.)
In general it seems that people assign a low value to knowledge they don't have and tend to have little knowledge of complicated subjects. Hence many complicated subjects are believed to be easy to understand if one only had the time, which in turn means that an IT professional, to many, is simply somebody who had the time to read a manual while others had more important things to do.
Learning how computers work is immensely difficult, and most people realise it when they try; they then stay away from the subject, using the excuse that they don't have the time to learn what they regard as an easy field.
From that grows society's general understanding that it is socially acceptable to know absolutely nothing about computers. I know people who refer to a monitor as the "computer" and the computer itself as the "hard disk". If I displayed a similar ignorance of, say, cars, and referred to the steering wheel as the "car" and to the actual car as the "radio", people would find my ignorance hard to believe.
There was a Dilbert cartoon many years ago where the secretary (or "senior associate") told Dilbert, "My oldest is flunking all his classes; I hope he can get a job involving computers", and Dilbert replied, "Carrying them?", which the secretary did not find amusing. That attitude is actually quite common.
Computer skills are not regarded as real skills, and it is assumed that anybody can learn them (which is for the most part true) and that doing so takes little time and effort (which is not).
A 55-year-old acquaintance of mine recently told me that he was thinking of doing something "in computers" and learning how to program; he asked me where he could learn and said he wouldn't mind if it took a year.
He was then surprised that my flatmate and I were watching a video podcast about the Cocoa platform (i.e. about programming). "I thought you can program already?"
I told him that there are several platforms, that after several years of studying one has learnt maybe one or two of them, and that it never ends.
I work with Visual Basic .NET and C# in the office and am constantly buying (and to an extent reading) books about C#, the .NET Framework, and the Windows Installer (and if you think that has anything to do with installing Windows and clicking "Yes" to format the disk, you are probably part of "society" and not of the IT-skilled). In my free time I now study Objective-C and Cocoa because I am very bad at memory management and pointers. (All my C programs either leak memory or were supposed to terminate after giving a result.)
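As a minimal illustration of what I mean by "leaking memory", here is a made-up C sketch (not code from any real project of mine): the function hands back a malloc'd buffer, and the caller never frees it.

    #include <ctype.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Returns an upper-cased copy of s. The caller owns the buffer,
       but nothing below ever calls free() on it: the classic leak. */
    static char *shout(const char *s)
    {
        size_t len = strlen(s);
        char *copy = malloc(len + 1);
        if (copy == NULL)
            return NULL;
        for (size_t i = 0; i < len; i++)
            copy[i] = (char)toupper((unsigned char)s[i]);
        copy[len] = '\0';
        return copy;
    }

    int main(void)
    {
        for (int i = 0; i < 1000; i++) {
            char *line = shout("hello");
            if (line != NULL)
                printf("%s\n", line);
            /* free(line) is missing, so each iteration loses another buffer */
        }
        return 0;
    }

In a program that terminates right after giving its result, the operating system reclaims the memory anyway, which is the only reason most of my C programs get away with it.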
Technologies that I try to stay on top of:
- .NET framework (except WPF)
- Cocoa framework
- Shell scripting
- Windows PowerShell (if only I had the time, get it?)
- Active Directory as it is relevant for my IIS and SQL servers
- Microsoft SQL Server
- IIS 6, IIS 7 and Web servers in general
- Classic ASP and ASP.NET
- Build automation
- Source control
- Windows XP/2003/Vista/2008
- SOX
Technologies I try to comprehend in my free time and for fun:
- Mono
- Cocoa# (Cocoa bindings for C# and Mono)
- Python (if I ever get to it again)
- C
- REALbasic (for my home projects until I figure out Cocoa#)
- Interix (Windows Services for UNIX)
- DSLinux (Linux for Nintendo DS)
Those are typical subjects for an IT professional to study ALL THE TIME. I doubt that most people who hope for an easy career in IT would understand all the words on that list, even after reading the definitions.
In short: I agree.