Book Report: Waltzing with Bears

This book's subtitle is "Managing Risk on Software Projects" and it's written by the Peopleware guys. OK, nobody's reading this blog post anymore; the non-computer folks have clicked away to find something more interesting; the computer-project folks have all clicked away to Amazon or their local library or bookstore or something like that to order the book. So I'm just typing these notes to myself here.

It talks about the Denver Airport automated baggage handling system software. This was a famously late and awful system: it wasn't ready in time, and delayed the airport's opening. I remember reading about it on comp.risks back in the day. I remember thinking, Those scheming contractors, sayin' they'd get it done quickly when they knew that wasn't gonna happen. Sayin' it just so they'd get the contract. Except now that I've read this book, I find out that's not how it went. Nobody who knew anything about computer programming claimed that the system was going to be ready in time. They were screaming at the airport planners that the system wouldn't be ready in time. The airport planners came up with Plan B: There is no Plan B.

...the invisible hand of the marketplace made a significant gesture right at the outset. When the DIA board of governors first put the [Automated Bag Handling System] out to bid, nobody was willing to submit a bid for the scheduled delivery date... Eventually, the airport engaged BAE Automated Systems to take on the project on a best-efforts basis.

It's OK to take on risk; it's OK when things go wrong... as long as you've planned for things to go wrong. Have a Plan B. Two main things make this tricky: you and everybody else. You don't want to think too hard about your project's risks. It's stressful. Nobody else on the project wants to think about your project's risks, either. Office politics comes into it: if you ask "What are the symptoms that the project isn't working out and we should shut it down?" then some folks on the project hear "I want to shut your project down." I work at a company that embraces risk. I've worked on some failed projects. Some people on those projects get mad at me when I call those projects failures. It's all very well that the company forgives failures, but not everybody's comfortable with it. It takes nuance to point out that a project failure might not be a personal failure, that someone might demonstrate great skill on a project that ultimately falls over; if you call a project a failure, some folks on that project will take it personally.

OK, so how can you plan your way around bad things without giving yourself ulcers and alienating everyone around you? This book has some advice. The advice isn't all-powerful. If your organization doesn't like risk... well, maybe you shouldn't plan any risky projects. If office politics abound, again, you should probably just avoid risk, avoid standing out. But if your organization values sensible risk-taking, there are still the ulcers and hurt feelings to worry about. What are the book's recommendations?

Brainstorm up a list of possible risks. Some people will hate doing this: why do you hate the project so much that you're encouraging everybody to point out everything that can go wrong? If your organization doesn't normally do stuff like this, the people on the project may feel singled out: why is our project the risky one?

These risks are terrible things to think about. Some folks are more comfortable talking about them if you prefix each one with the phrase "nightmare scenario".

What are the ballpark chances of the risk coming up? What are the consequences? If you "multiply" the chances by the consequences, are the risks acceptable?
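
If you like, you can scribble that arithmetic out in a few lines of code. Here's a minimal Python sketch; the risk names and all the numbers are made up for illustration, and "exposure" is just a label for the chances-times-consequences product (roughly what risk-management folks call it):

    # Ballpark risk "exposure": chance the risk fires, times what it
    # costs you if it does. All numbers invented for illustration.
    risks = [
        # (name, probability, cost in person-weeks if it happens)
        ("lead engineer quits",     0.10, 30),
        ("new framework is a sham", 0.25, 12),
        ("vendor API ships late",   0.40,  6),
    ]

    for name, probability, cost in risks:
        print(f"{name}: exposure = {probability * cost:.1f} person-weeks")

    total = sum(p * c for _, p, c in risks)
    print(f"total exposure: {total:.1f} person-weeks")
    # If that total swamps your schedule's slack, the risks probably
    # aren't acceptable as-is.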

You can try to make risks less likely: transport your project lead in a big bus, very safe in collisions; assign one engineer to get familiar with that unfamiliar technology. You can try to mitigate risks, reduce their consequences: have two project leads who talk with each other about every decision; have a more familiar technology handy in case the new hotness turns out to be a sham. But these measures have costs of their own, of course. They won't all be worthwhile.
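
You can put made-up numbers on that trade-off, too. A measure pays for itself only if it costs less than the exposure it removes. Another sketch, with invented figures:

    # A prevention or mitigation is worth doing only if it costs less
    # than the exposure it removes. Figures invented, in person-weeks.
    def exposure(probability, cost):
        return probability * cost

    def worth_it(before, after, measure_cost):
        """before and after are (probability, cost) pairs."""
        return measure_cost < exposure(*before) - exposure(*after)

    # Prevention: an engineer studies the unfamiliar technology, making
    # "the new hotness is a sham" less likely to blindside us.
    print(worth_it(before=(0.25, 12), after=(0.10, 12), measure_cost=1))  # True

    # Mitigation: two project leads on every decision; losing one lead
    # is just as likely but hurts less. Pricey meetings, though.
    print(worth_it(before=(0.10, 30), after=(0.10, 8), measure_cost=5))   # False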

Thinking about risks and planning can go well together. If you ask a committee for a list of features that some new project needs, they'll go wild larding on the requests. Maybe you're planning a few incremental releases: version 0.1, 0.2, etc. It's difficult to get folks to agree on priorities, but risks can help. To force folks to prioritize which vital features need to be in the early versions, you can point out the risk that the project will never make it to version 1.0 and they'll have to hobble along using version 0.7 or whatever. That's a good way to make sure that people ask for the really important features to make it into version 0.7.

...And there's a bunch more in this book, but it's short. It's good that they focused on this one thing, because it's a lush, rich thing full of engineering and math and psychology and cooperation and... Good book. Check it out.

Tags: book programming choice
