Project Schedules, Estimation & Uncertainty in Agile
How Uncertainty Works
by Kevin Thompson, Ph.D., PMP, CSP
All estimates are subject to uncertainty, and project schedules are no exception. A schedule consists of a set of tasks, which are executed at times dictated by dependencies and resources. The simplest schedule consists of a set of tasks executed one after the other, and we will look at this case more closely below.
The schedule for a project contains uncertainty because the estimated effort or duration of each task has some uncertainty associated with it. We would really like to know exactly how long a task will take. If we can’t know that, we would at least like to know how much uncertainty is associated with the task. Unfortunately, we will never know the first, and usually will never know the second, either. (The exception to the latter rule is for tasks that are repeated, identically, enough times for us to collect meaningful statistics about them.)
In this article, we will look at uncertainty: Why it exists, how it behaves, how it accumulates, how to reduce it, and how to cope with it.
Suppose we are estimating the effort required to remodel a house. We will start by breaking down the project “Remodel House” into a few smaller steps:
- Remodel Kitchen
- Remodel Bedroom
- Remodel Living Room
We want to estimate the effort required for each of these steps, such as “Remodel Bedroom.” Unfortunately, our estimate will not be exact. Estimates differ from reality because of uncertainties, which arise in many ways:
We may not have accounted for all the requirements. Do we need to replace the baseboards? If we didn’t think about the baseboards, our concept of scope is missing an important element, and our estimate will not include the work to replace them.
Perhaps we did include baseboard replacement in scope, but we assumed the effort involved was limited to nailing the new ones in position. Unfortunately, we didn’t account for the work required to measure and cut the baseboards to the right size, so even though the scope was right, the effort estimate will be too low.
Even if we did remember all of the work that needs to be done for the baseboards, and estimate accordingly, our estimates may be off because some of the boards split when nailed. We will either have to replace those baseboards, or drill them to avoid splitting when we nail them. Either way, the work will increase beyond our estimate.
What happens if the paint we need is out of stock, or someone delivers the wrong baseboards? These external events are unpredictable, and disrupt our schedule.
Most people have an intuitive feeling that tasks are more likely to require more effort than planned, rather than less. This feeling is correct, for two reasons.
First, we are likely to omit scope or tasks that contribute to the work, and so underestimate the effort. This is obvious, and we reviewed some examples above.
Second, there is more room, in a mathematical sense, for work to grow beyond expectation than to shrink below expectation. This may be less obvious, but think about the work required for Remodel Bedroom. Assume we estimate this work at three days. The actual time may be more than three days over this estimate, but cannot be more than three days under the estimate!
In mathematical terms, we cannot estimate a task as requiring “X plus or minus Y days,” because the estimate becomes meaningless if Y is greater than X.
What we can do is replace the concept of an increment of uncertainty with that of an uncertainty factor, F. This means that we think that X is the most likely duration, but the range of values is between X divided by F, and X times F.
For example, suppose we estimate “Paint Bedroom” at 3 days, with an uncertainty factor of 2.
- The most likely case is 3 days.
- The best case is 1.5 days, which is 1.5 days under the estimate.
- The worst case is 6 days, which is 3 days over the estimate.
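This multiplicative range is easy to compute. Here is a minimal sketch of the uncertainty-factor model (the function name and structure are illustrative, not part of the article):

```python
def estimate_range(most_likely, factor):
    """Return (best, most likely, worst) durations for an estimate X
    with a multiplicative uncertainty factor F: [X/F, X, X*F]."""
    return most_likely / factor, most_likely, most_likely * factor

# "Paint Bedroom": 3 days with an uncertainty factor of 2.
best, likely, worst = estimate_range(3.0, 2.0)
print(best, likely, worst)  # 1.5 3.0 6.0
```

Note that the range is asymmetric: the worst case is 3 days over the estimate, while the best case is only 1.5 days under it.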
The real behavior of uncertainty is more complex than this simple model. (More sophisticated models rely on lognormal probability distributions and Monte Carlo methods.) However, our concept of an uncertainty factor is a convenient way to describe the most significant behavior of uncertainty, and we will continue with it.
If errors were uniformly distributed above and below the expected values, they would tend to cancel out, on average, for a large set of tasks. As we’ll see, this is not the case.
Suppose we have a project containing 10 tasks, to be done one at a time, and each one is expected to take 10 working days. The naïve estimate for the project schedule is 100 working days for all tasks. The real schedule will deviate from the expectation, due to uncertainty.
Now we will assume that each task has an associated uncertainty factor of two. In the worst case, every task would take twice as long as expected, and the project would take 200 working days to complete.
The worst case isn’t very likely, so let’s consider a more typical case, where half of the tasks are under the estimate by the uncertainty factor, while half are over.
The half that are under their estimates take 5 days each (under by 5 days), while the half that are over take 20 days each (over by 10 days). The total time for all tasks is then 125 days.
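The arithmetic behind this typical case can be checked directly (the even 50/50 split between under- and over-estimated tasks is the simplifying assumption used above):

```python
tasks = 10
estimate = 10   # working days per task
factor = 2      # uncertainty factor

naive_total = tasks * estimate                      # 10 * 10 = 100 days
under_total = (tasks // 2) * (estimate / factor)    # 5 tasks * 5 days = 25
over_total = (tasks // 2) * (estimate * factor)     # 5 tasks * 20 days = 100
typical_total = under_total + over_total            # 125 days
overrun = 100 * (typical_total - naive_total) / naive_total
print(typical_total, overrun)  # 125.0 25.0
```

The overrun appears even though as many tasks finished early as finished late, because the late tasks deviate by twice as much as the early ones.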
Our typical case isn’t as bad as the worst case, but it exceeds the naïve schedule by 25% (25 days). This behavior is common, and it occurs because tasks can be over their estimates by more than they can be under. More realistic calculations, which rely on lognormal probability distributions instead of our simple uncertainty factor, are even more pessimistic. The long tail of the lognormal distribution has no upper bound, and tells us that some projects will never complete.
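As a rough illustration of the Monte Carlo approach mentioned above, the following sketch draws each task’s actual duration from a lognormal distribution whose median equals the estimate. The spread parameter `sigma` is an assumption of this example, not a value from the article:

```python
import random

random.seed(42)  # deterministic for illustration

def simulate_totals(tasks=10, estimate=10, sigma=0.5, trials=10_000):
    """Monte Carlo sketch: each task's actual duration is lognormally
    distributed, with median equal to its estimate."""
    totals = sorted(
        sum(estimate * random.lognormvariate(0.0, sigma) for _ in range(tasks))
        for _ in range(trials)
    )
    mean = sum(totals) / trials
    p90 = totals[int(0.9 * trials)]
    return mean, p90

mean, p90 = simulate_totals()
print(round(mean), round(p90))  # both exceed the naive 100-day total
```

Even though each task is as likely to finish early as late, the long upper tail of the lognormal distribution pushes the average project duration above the naive 100-day schedule.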
While we cannot eliminate uncertainty when estimating work, we can take steps to reduce it. One way is to reduce the size of the thing being estimated: the fewer elements that must be considered when producing an estimate, the more reliable the estimate is likely to be.
For example, the process of remodeling a bedroom contains a number of steps. If we estimate “Remodel Bedroom” directly, we may not think of all of the issues involved, and our estimate may be uncertain by a factor of three. Painting the bedroom, however, is a much smaller task, and has fewer elements that we may leave out of our estimate, so our estimate might be uncertain by a factor of two.
The obvious strategy for reducing uncertainty is to break large specifications or work items into smaller pieces. Thus we might break down “Remodel Bedroom” into four smaller tasks:
- Remove old carpet
- Paint room
- Cut new carpet to fit
- Install new carpet
Now we can produce an estimate for each of the smaller tasks. When we add these together, we will have an estimate for the bedroom remodeling, which is likely to be better than what we would have produced if we had not broken the work into smaller pieces.
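To see why finer granularity helps, we can compare worst cases under the uncertainty-factor model. The specific numbers below are illustrative assumptions, not figures from the article:

```python
# The same work estimated once coarsely (higher uncertainty factor),
# then as four finer-grained tasks with a lower uncertainty factor.
direct = {"estimate": 10, "factor": 3}            # "Remodel Bedroom" in one go
subtasks = {
    "Remove old carpet":  {"estimate": 2, "factor": 2},
    "Paint room":         {"estimate": 3, "factor": 2},
    "Cut new carpet":     {"estimate": 2, "factor": 2},
    "Install new carpet": {"estimate": 3, "factor": 2},
}

def worst(task):
    return task["estimate"] * task["factor"]

coarse_worst = worst(direct)                           # 10 * 3 = 30 days
fine_worst = sum(worst(t) for t in subtasks.values())  # 4+6+4+6 = 20 days
print(coarse_worst, fine_worst)  # 30 20
```

Both approaches yield the same most-likely total (10 days), but the finer-grained breakdown produces a much narrower worst case, because each small task carries a smaller uncertainty factor.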
The strategy of decreasing the “granularity” (size or level of detail) of items to be estimated improves accuracy, but has limitations. It can produce more items than we have the time to analyze, and thus delay the project completion. Also, there is no point in reducing the size below a level where the relative uncertainty does not improve. The important thing is to pick a granularity that (1) enables a tolerable level of uncertainty, and (2) produces a set of things to estimate that is small enough to be practical.
Once we have reduced uncertainty to a practical minimum, the only other thing we can do is to take the remaining uncertainty into account in our process. We’ll look at strategies suited for projects that are subject to different kinds of constraints below. (For simplicity, we’ll assume that resources are fixed, since the ability to add resources does not vary in a meaningful way between the different scenarios. Similarly, we also assume that scope is well-enough controlled to prevent scope creep.)
The first thing to understand about projects with both fixed scope and a fixed schedule is that success is not always possible. If scope is truly fixed, and the schedule is subject to uncertainty, then, as we have seen, extreme cases will break any schedule.
These projects are planned with uncertainty in mind, by adding enough buffer time into the schedule to handle a reasonable level of uncertainty (for example, allocate 30% of the schedule for this purpose). This approach works well in situations where uncertainty is small, such as for repeating processes (e.g., laying carpet, or painting houses).
Many agile project-management frameworks handle uncertainty by committing to the only thing that can truly be controlled, which is the schedule, and adjusting the scope as required to meet the schedule. This strategy is particularly useful in high-uncertainty environments, where estimation is known to be inaccurate, and where scope is not well-understood and may change frequently.
The Scrum framework handles this situation effectively. It requires careful planning, but in a way that handles high uncertainty gracefully. Scrum projects work in short cycles to deliver modest increments of scope quickly, and to allow for frequent changes in scope and priority. Within each cycle, scope is adjusted as necessary in order to guarantee that the schedule is met.
Uncertainty is greatest when the scope is not known prior to the start of execution. This is the case for reactive organizations, such as Customer Support groups, which receive urgent requests that must be handled quickly, but which cannot be scheduled or planned in any meaningful way.
Another agile process, Kanban, is useful for this scenario. Kanban processes re-prioritize requests daily, and throttle (control) the flow of work by limiting simultaneous work-in-progress to a specified number (e.g., up to three concurrent requests can be handled by the staff).
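The work-in-progress limit at the heart of this approach can be sketched as a simple pull system. The class and request names below are illustrative, not part of any Kanban tool:

```python
from collections import deque

class KanbanBoard:
    """Minimal sketch of a WIP-limited workflow: requests wait in a
    backlog and enter "in progress" only when a slot is free."""

    def __init__(self, wip_limit=3):
        self.wip_limit = wip_limit
        self.backlog = deque()
        self.in_progress = set()

    def add_request(self, request):
        self.backlog.append(request)
        self._pull()

    def finish(self, request):
        self.in_progress.discard(request)
        self._pull()  # a freed slot immediately pulls the next request

    def _pull(self):
        # Pull work from the backlog only while under the WIP limit.
        while self.backlog and len(self.in_progress) < self.wip_limit:
            self.in_progress.add(self.backlog.popleft())

board = KanbanBoard(wip_limit=3)
for r in ["req-1", "req-2", "req-3", "req-4"]:
    board.add_request(r)
print(len(board.in_progress), list(board.backlog))  # 3 ['req-4']
```

The fourth request waits in the backlog until one of the three active requests finishes, which is what keeps a reactive team from being overwhelmed by simultaneous work.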
Uncertainty cannot be eliminated by any estimation method. It arises partly from imperfect knowledge of what to do and how long it should take, and partly from unpredictable events. Because a task can run over its estimate by more than it can run under, deviations are biased upward, on average, relative to expected values. These upward deviations accumulate even when as many tasks come in under their estimates as over, lengthening the schedule beyond the sum of expected task durations.
Reducing the size of the items to be estimated helps to reduce uncertainty, but only to a point. When uncertainty has been reduced as much as is practical, the next step is to design the process to cope with the uncertainty that remains. We have reviewed three strategies for handling it:
- For fixed-schedule and fixed-scope projects, add buffer time in the schedule. This works for low-uncertainty projects, especially those that repeat the same type of work many times.
- For fixed-schedule projects, use an agile process such as Scrum, and adjust scope in a planned way to meet the schedule. This is an effective way to conduct a project when estimates are poor and scope is poorly defined and changes frequently, while still allowing for planning and a useful degree of predictability.
- For unscheduled projects with unknown scope, uncertainty is very high, and planning is not possible. In this case, a strategy such as Kanban, which focuses on constraining work-in-progress, is effective.