I’ve been reviewing an old PowerPoint slide deck I have from Andy Hunt’s “Refactoring Your Wetware” talk. One of the slides covers the Dreyfus Model of Skill Acquisition, something that Andy and Dave Thomas also discuss in their “Herding Racehorses, Racing Sheep” talk (which you can see when the No Fluff, Just Stuff conference comes to your town, by the way).
The whole concept is just fascinating to me. The model is made up of several “layers” or levels of proficiency in a given skill:
- Novice
- Advanced Beginner
- Competent
- Proficient
- Expert
Each level represents a different pattern of behavior, a different way of thinking. A learner will perceive the world differently at each level. For example, at the “Novice” level, the learner is preoccupied not necessarily with learning something, but with simply accomplishing a task. Compare that to an “Expert” learner, who relies on intuition, not reason, to accomplish his goals.
Forcing an expert to fit into a rules-based structure designed for a beginner ultimately makes the expert less productive and even downright miserable. On the flip side, placing a beginner in the intuitive environment that an expert thrives in can render the beginner incapable of doing anything. Yet both of these actions occur continually in the corporate world. Why is that?
I don’t have any good answers yet, but Dreyfus surfaces some fascinating ideas. For a more thorough explanation of the model, read this entry from Dave’s blog or Google for “dreyfus model.”
Why does this happen? I’m afraid it’s because the cynical view is pretty accurate (at least in the corporation where I served my time): managers generally know nothing about technology, even IT managers, and are sometimes completely out of touch with reality.
The problem is not so much that they can’t program, but that they know nothing about what a programmer actually does. They think we sit and type code based on documentation that someone gives us. In reality, the most important thing we do is think. Yet the corporate environment seems to be built around the proposition that someone else should do the thinking, even though the people who supposedly do that thinking are themselves pretty clueless, and they hand us lousy, misleading, incomplete, and ill-conceived documents to work from.
Yep, I agree. The problem usually seems to come down to people, not methodologies or technologies. You can give a team lousy requirements and force them to program in Visual Basic and they’ll still turn out something halfway decent if the people involved are skilled and passionate about what they do. On the other hand, you can put an apathetic, inexperienced group of coders in an agile environment and let them program in Ruby and they’ll turn out a load of crud. Having quality people to begin with is the key, IMHO.