Agile is maturing and moving beyond the last half of the IT lifecycle. For example, we have seen excellent discussions of the “hybrid” approach: using Agile where it’s most appropriate (and where the prerequisites are in place), and using other insightful PM methods where they fit better. That approach in IT, plus the increasing use of Agile concepts in areas such as new product development, shows promise.
I do still have concerns about a few Agile zealots who insist upon contrasting Agile to Waterfall. Competent project managers disposed of Waterfall in the early 1980s. We also disposed, for the most part, of years-long, hold-your-breath-and-wait-forever IT projects. What did we replace them with? Three-to-six-month bursts (we called them iterations) that delivered prioritized, useful business functions. We also identified most of the prerequisites for success:
- A good, high-level project plan;
- A clear business case;
- Understanding of the information and data structures;
- Customer-driven high-level business requirements;
- Risk assessment, and mitigation responsibilities;
- The right talent assigned for the right amount of time, both on the IT side and from the customers; and
- Facilitated sessions (rapid initial planning and joint application design) for fast project planning and requirements elicitation in 1-2 weeks.
In recent articles and presentations, I’ve seen new insights about Agile by practitioners who speak about how they would decide which parts of the information technology project were best suited for an agile approach and which should use classic methods. Of course, they also fully understood the advantages and requirements of each approach… but still, there was something missing.
And now I can explain the title of this article. I solved this same problem — for a different advancement — 30 years ago. Prototyping and Agile share a strong set of parallels.
A Voice from the Past
In the early 1980s, many IT groups were moving from third-generation languages such as COBOL to higher-level languages, which improved coding throughput by a factor of two to three. Excited by the prospect, developers wanted to “get to code” much more quickly. They considered the then-classic structured systems analysis methods a waste of time and began “prototyping,” adapting an approach engineers had used for years.
This meant they needed to sit down with their customer, show what they had produced, and quickly make improvements. They did this both for displays and reports. And all the prerequisites referenced above were still essential — especially if the system involved new data.
But the most enthusiastic “new way” proponents were adamant that everything should be prototyped, because all other approaches were the old way. And their new way needed no requirements, no documentation, and seldom even needed testing. All that overhead “stuff” was a holdover from the past — so they claimed.
After a few sessions of guiding these prototyping bright lights to higher ground, I came up with a solution. I too had been an early adopter of both high-level languages and prototyping; as a manager, I had transformed entire organizations to their use, and I understood both their prerequisites and their benefits. So I built a table that identified a range of attributes of the system, subsystem or business process being developed. Here’s a copy of that table (updated for readability) from around 1984.
Two Types of IT Applications
The instructions directed developers to use the table to evaluate their system or application to determine which type it was: process-oriented or information-based. For each factor, the developer rated the application by circling a score from 1 to 6, depending on how well it met the process-oriented or information-based test.
One outcome: they often found that they didn’t know all the answers — yet they were still eager to develop the solution. So they performed professional analysis to resolve the open items, then followed our steps to analyze and evaluate the results:
First, add the circled scores, divide by 8, and truncate any remainder. If the result is clear, choose the most appropriate approach:
- Systems scoring 1 or 2 are process-oriented; you should use classic structured systems analysis, aided by prototyping of all outputs during requirements definition.
- Systems scoring 5 or 6 are primarily information-based and are good candidates for delivery using iterative Prototyping.
A system that scores 3 or 4 is a mixture. Decompose it into its sub-systems and re-evaluate those against the factors; repeat for any sub-system scoring 3 or 4, until you get to detailed processes.
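The scoring and decomposition steps above can be sketched in a few lines of code. This is a minimal sketch under stated assumptions: the function name is mine, and the original chart’s eight factors are not reproduced here, so the example scores are illustrative only.

```python
def classify(scores):
    """Classify a system from its eight circled factor scores.

    Each factor is circled 1 (strongly process-oriented) through
    6 (strongly information-based), per the chart's instructions.
    """
    if len(scores) != 8:
        raise ValueError("expected one score per factor (8 total)")
    # Add the circled scores, divide by 8, and truncate any remainder.
    result = sum(scores) // 8
    if result <= 2:
        return "process-oriented: classic structured analysis, with output prototyping"
    if result >= 5:
        return "information-based: deliver via iterative prototyping"
    return "mixed (3-4): decompose into sub-systems and re-score each"

# A 'Zorro' pattern of alternating extremes averages to the middle,
# signalling decomposition rather than a single approach.
print(classify([1, 6, 1, 6, 1, 6, 1, 6]))
```

Note that the truncated average can only land between 1 and 6, so every system falls into one of the three branches; the interesting cases are the mixed ones, which trigger another round of scoring at the sub-system level.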
We found many results we called Zorro systems: the circled scores traced one or more Zs down the chart, requiring the subsystem decomposition mentioned above. Because this approach gave developers a rational process, it caught on very quickly. Even project managers and managers liked the approach. And we used that interest to get those key prerequisites into place — especially those involving the right customers — still a challenge today.
We integrated this chart into our commercial methodology and also added it to the many home-grown and commercial methodologies for which we did methods-improvement work. The last time I looked at the chart was in 1987 — until one of those “hybrid Agile” presentations tweaked my memory.
By the way, we also found that teams that knew enough to score their system (or decompose and score it) confidently were able to produce much better early estimates — even before they had business requirements.
Applicable to Agile?
I don’t profess to be an Agile expert. I did follow Scrum from the early 1990s, when a business partner asked for help relating Scrum to project management; he was working on integrating his facilitated requirements analysis with Scrum. I was an early advocate of Extreme Programming, and I like the Dynamic Systems Development Method (DSDM) as a true life-cycle-wide Agile approach.
I have seen dozens of new ideas in the practice of project management. I think Agile methods (depending on the flavor) can offer significant benefits when used wisely — along with huge risk when mismanaged. For example, I’d be very careful with Agile around regulatory projects that have high consequences. But Agile has built on the smart PM practices of the 1980s and added useful concepts, principles, tools and expectations.
But I have a question: how would you change the Prototyping chart to adapt the factors to Agile’s key decision points? I think some of the factors might be the same. The Data factor, for example, is important; what matters is primarily whether the project is intended to build the database or to use it. If it’s building the database, huge amounts of regression testing will be required.
So, dear reader, I’m interested in your opinion on this question: What would you change to help make the decision where Agile methods are best used? Given the savvy insights of those who are practicing hybrid agile, I’d bet you have some good experience to share…
As a parting comment, and to illustrate my then-biases for prototyping, I ended my sessions with another page that identified four areas where projects could benefit from prototyping:
Note that these were most often shown as transparencies on an overhead projector. (In this era of PowerPoint, some of you may be unfamiliar with that particular presentation approach.)