As it goes, the answer to "how do you eat an elephant" reads "one bite at a time." And a lot of agile folks subscribe to that notion in general—you can accomplish any task if you break it up into smaller tasks, and take it one small bite at a time.
Problem is, that doesn't really work. Once more, it comes down to context (you knew I'd drag that in somehow, right?). There's a great example of this in the new book The Agile Samurai. One of its examples on estimation starts off asking how long it will take you to eat a cookie. Based on that, how long will it take you to eat 7 cookies? 14? How about 200?
Context obviously matters: the initial state of the system is not the same as the state of the system after eating 20 cookies (believe me, I've been there). It's non-linear: input to the system changes the system itself, and its capacity, and eventually its desire to ever see another cookie again.
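Purely to make the shape of that problem visible, here's a toy sketch of my own (not anything from the book): a naive linear extrapolation next to a model where each cookie takes a bit longer than the one before. The 15% per-cookie slowdown is an arbitrary, made-up number; the point is only that extrapolating from one data point falls apart fast.

```python
# Toy model (made-up numbers): naive linear extrapolation vs. a system whose
# capacity degrades as input accumulates -- each cookie takes longer than the last.

def linear_estimate(minutes_per_cookie: float, n: int) -> float:
    """Naive extrapolation: n cookies take n times as long as one."""
    return minutes_per_cookie * n

def nonlinear_estimate(minutes_per_cookie: float, n: int, slowdown: float = 1.15) -> float:
    """Each successive cookie takes 15% longer than the previous one
    (an arbitrary factor, just to show the shape of the curve)."""
    total, cost = 0.0, minutes_per_cookie
    for _ in range(n):
        total += cost
        cost *= slowdown
    return total

for n in (1, 7, 14, 200):
    print(f"{n:>3} cookies: linear {linear_estimate(2, n):.0f} min, "
          f"context-aware {nonlinear_estimate(2, n):.0f} min")
# The two estimates diverge wildly long before you get anywhere near 200 cookies.
```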
So no, "one bite at a time" is the wrong answer. It's not sustainable. Instead, the first defense against "How do you eat an elephant?" is to ask the clarifying question, "WHY ARE YOU EATING AN ELEPHANT???"
Too often, even on agile projects, we take whatever absurd requirement is handed to us, fire up the story cards and the burndown charts, and get cracking on it. It would be a nice first step to dig into the absurdity first, and see why we think we need to make elephant kabobs. Maybe the real requirement was misheard, and they actually said "add a font and some knobs."
But it is incumbent upon you to ask. And if the person you ask replies with "because they told us to" (or something similarly Nuremberg-esque), go up one and keep asking.
And if it turns out that you do, in fact, need to eat an elephant, then you're going to need a lot of friends. Or a lot of Tupperware.
"What are you planning on doing today?" That's one of the key three questions you must answer during a daily Scrum meeting. And it's the topic of the first Pomodoro of the day, if you use the Pomodoro technique. No big deal, it's just asking you to think about a goal or set of goals for the day, right?
But a "goal" sounds like an elusive, hard to reach target. I suppose we've been conditioned by too much football, where the drive to the goal line is a hard-fought contest of brute strength and occasional cunning.
In some ways, a football goal is an easier target than those we face: everyone knows right where it is, and it's not moving. In software and in life, goals are rarely that static or cooperative. So we might need a little help making sure we're actually headed in the right direction.
A study done at Dominican University showed that three factors help you accomplish significantly more toward achieving your goals:
1) Just writing your goals down can be a big help.
2) Sharing those goals publicly, even just with a close friend, helps even more.
3) And for the trifecta, email your friend periodic status reports on how you're doing.
Without that kind of support, it's easy to wander aimlessly through the jungle of email, IMs, tweets, failed unit tests, and obfuscated requirements until the week has suddenly evaporated. And that's just on the project; your personal goals are just as important, in some cases more so.
What happens without goals? Heinlein nails it:
"In the absence of clearly-defined goals, we become strangely loyal to performing daily trivia until ultimately we become enslaved by it." — Robert Heinlein
So here's today's challenge. Consider what you're doing. In what way is that advancing you toward your goal, versus daily trivia?
A new study shows 10X improvements in cognitive tests (including ability to focus) from doing brief meditation. That's not too surprising by itself, as the benefits of meditation have been shown in neuroimaging and behavioral studies for years. What is surprising, though, is that it doesn't take much to see results.
As it turns out, you don't have to train like a monk for years to get these enhanced cognitive skills. Researchers found profound improvements after just four days of 20 minutes of training per day, doing simple "mindfulness" meditation.
The February 15th print edition of SD Times lists columnist Larry O'Brien's picks for the top ten most influential software development books of the past 10 years. Of the ten, two happen to be fine books that I co-authored.
In addition, in his blog comments, Larry mentions the two most influential books of the decade, both released in 1999: Kent Beck's Extreme Programming Explained and, of course, The Pragmatic Programmer.
So of the 12 books mentioned as most influential of the past decade, I'm privileged to have been co-author on 3 of them. Thanks Larry, I'll try and get my head through the doorway now. The swelling should go down momentarily.
And a serious thank you to my co-authors, Dave Thomas and Venkat Subramaniam.
I pre-ordered an iPad this weekend. Apparently so did a lot of other people :-) Do we know what we're getting into? Probably not. You can always tell the pioneers by the arrows in their backs, right? Version 2.0 will probably include a hologram generator, matter replicator, or something else that makes the current version little more than a really nice picture frame.
But we forge ahead anyway, not really knowing quite what to expect. Will it just be an overgrown iPhone/Touch? Will it be something that's a little bit more? Or something that's a whole lot more?
I cast my vote with the "whole lot more"—eventually.
There's already a major shift in the landscape as cellphones and their progeny become the primary internet interface for many people in many contexts. It's a powerful pull away from the traditional desktop monitor/keyboard interface.
But cell phones, even with their little thumb keyboards, can only do so much. You need a certain amount of screen real estate for any serious reading, viewing or working. So the iPad is just a bigger iPhone, right?
I think the iPad has the potential to redefine how we interact with computing devices in general. Think about it: no mouse, no stylus, no fixed keyboard. Want a Dvorak keyboard? Or a customized layout for a sophisticated application such as Final Cut or Logic Pro? It's just code. But for that matter, you really only need a keyboard for text entry.
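To make the "it's just code" point concrete, here's a trivial sketch of my own (nothing from Apple's SDK): a keyboard layout reduced to a plain character mapping that software could swap out at will.

```python
# Toy illustration: if the keyboard is drawn by software, a "layout" is just a
# mapping. Here QWERTY key positions are reinterpreted under the Dvorak layout.

QWERTY_TO_DVORAK = str.maketrans(
    "qwertyuiopasdfghjkl;zxcvbnm,./",
    "',.pyfgcrlaoeuidhtns;qjkxbmwvz",
)

def remap(keystrokes: str, layout=QWERTY_TO_DVORAK) -> str:
    """Reinterpret a stream of physical key positions under another layout."""
    return keystrokes.translate(layout)

print(remap("hello"))  # pressing the QWERTY h-e-l-l-o positions yields "d.nnr" in Dvorak
```

Swapping layouts, or inventing a one-off layout for a single application, is just a matter of shipping a different mapping.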
We're looking at the beginning of the true direct-manipulation interface. No more wiggling a spatially disconnected mouse or scribbling on an eternally blank tablet with no feedback. I think the effect of such an immediate, in-your-face interface will be pervasive and long lasting, in ways that we're only just beginning to imagine.
There's an old story about requirements gathering that says you can't gauge the needed capacity of a new bridge by counting the number of people who swim across every day. In winter.
But once the bridge is there, a whole new ecosystem is created. New opportunities, new possibilities emerge from the new context. And I think that's exactly what we'll see with the iPad.
Over the course of time this style of multi-touch tablet device will probably replace the standard keyboard/mouse/screen arrangement for most consumers most of the time.
There are a few ways this can happen:
1) It becomes an immersive front-end for direct-manipulation web apps.
2) It becomes the UI device for your desktop (now moved into the back of the closet), replacing the keyboard, screen and mouse.
3) Processing power on the device itself will grow to replace the laptop.
As soon as my iPad arrives, the first thing I'll do is check out our titles in ePub format on the nice big, bright screen. That right there will relegate my Kindle to the drawer.
Then as an experiment, I want to try using a VNC client (Remote Desktop) to control a full-power desktop computer over WiFi. With enough bandwidth, that could be a lot of fun. Imagine the full power of your eight-core desktop away from your desk.
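For what it's worth, the desktop end of that experiment is easy to poke at from a laptop today. Here's a minimal sketch, assuming a Mac with Screen Sharing (its built-in VNC server) enabled and a made-up hostname of my-desktop.local; on the iPad itself you'd use a VNC client app, but the idea is the same: point a viewer at the desktop over WiFi.

```python
# Minimal sketch, macOS-only: hand a vnc:// URL to the built-in Screen Sharing
# client. "my-desktop.local" is a hypothetical hostname on the local network.
import subprocess

DESKTOP_URL = "vnc://my-desktop.local"

subprocess.run(["open", DESKTOP_URL], check=True)
```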