In many of the dominant programming languages of the 1960s, like Fortran and BASIC, the computer moved sequentially through a list of numbered instruction statements, and the programmer could jump from one point in the sequence to another using a statement called GOTO. Like a film editor making a jump cut, GOTO simply dropped one story line that a program was developing and picked up another. It handed off control from one point in a program to another unconditionally, taking nothing else into account—neither the values of different variables nor the state of the program and its data. The result was confused tangles of software that came to be known as "spaghetti code."
Advocates of a methodology known as "structured programming" proposed a set of practices to banish the plague of procedural pasta. The central idea was to compose a program as a collection of subunits, each of which had only a single point of entry and a single point of exit.
The specific injunctions of structured programming would quickly be codified in the design and syntax of new generations of programming languages. Its broader imperative—that "each program layer is to be understood all by itself"—has driven one innovation after another in subsequent decades as programmers have devised new ways to isolate, "modularize," and "encapsulate" the moving parts of software from one another.
Frederick Brooks led the programming team at IBM that delivered the operating system OS/360, the project that inspired his seminal book The Mythical Man-Month. He led the team through the concept and design phase, but he departed before the ideas were implemented, and after he left, the project got into "serious trouble."
Serious trouble: Slippage. Delay. Revised schedules that slipped again. And more delay.
A young Watts Humphrey took over the reins of software management at IBM in 1966. He was stunned by its state, he wrote:
"My first need was to find out where the work stood. . . . In each laboratory I visited, I asked management for their plans and schedules. No one had anything other than informal notes or memos. When I asked for the managers' views on the best way to manage software, they all said they would make a plan before they started development. When I asked why they did not do this, their answer was that they did not have time. This was clearly nonsense! The right way to do the job was undoubtedly the fastest and cheapest way. It was obvious that these managers were not managing but reacting. They were under enormous pressure and had so much to do they could only do those things that were clearly required for shipping code. Everything but the immediate crisis was deferred. I concluded that the situation was really my fault, not theirs. As long as I permitted them to announce and ship products without plans, they would continue to do so."
He went to his boss. "I told him that since all the delivery schedules were not worth anything anyway, I intended to cancel them. Next, I would instruct all the software managers to produce plans for every project. From then on, we would not announce, deliver, or fund programming projects until I first had a documented and signed-off development plan on my desk. . . . It took the laboratories about sixty days to produce their first plans. This group, who had never before made a delivery schedule, did not miss a date for the next two and a half years."
Humphrey's success at enforcing schedule discipline at IBM stood on two principles: Plans were mandatory. And plans had to be realistic. They had to be "bottom-up," derived from the experience and knowledge of the programmers who would commit to meeting them, rather than "top-down," imposed by executive fiat or marketing wish.
After retiring from IBM, Humphrey joined forces with the Software Engineering Institute (SEI) at Carnegie Mellon University. At SEI, Humphrey and his colleagues created the Capability Maturity Model (CMM) as a kind of yardstick for judging the quality of software development organizations. The CMM provides a five-step ladder for programming teams to climb.
An organization at Level 1 is essentially ad hoc, with no defined process at all. At Level 2, it does some planning, tracking, and configuration management, and makes gestures toward quality assurance. A Level 3 organization begins to define its processes: how it works, how it gets things done, in repeatable, trainable terms. At Level 4 it uses measurements, with a framework for statistically tracking and managing what it does. Level 5 organizations have a continuously improving process.
For decades the organization of the typical project followed the "waterfall model." The waterfall approach—the label first surfaced in 1970—divided a project into an orderly sequence of discrete phases, like requirements definition, design, implementation, integration, testing, and deployment. One phase would finish before the next began. This all seemed logical on paper, but in practice it almost invariably led to delay, confusion, and disaster. Everything took forever, and nothing worked right. Programmers would either sit idle, waiting for the requirements, or give up and start design work and coding before they got the requirements. "Big design up front" led to big delays, and "big-bang integration"—writing major chunks of code separately and then putting them all together near the end of the project—caused system collapse. By the time the finished product arrived, so much time had passed that the problems the program aimed to solve no longer mattered, and new problems clamored for solutions.
The waterfall model gradually acquired the bad reputation it deserved. In the mid-eighties, Barry Boehm defined an alternative known as the "spiral model," which broke development down into "iterations" of six months to two years—mini-waterfalls dedicated to producing working code faster and allowing feedback from use of the resulting partially completed product to guide the next iteration. The spiral became standard in the realm of large-scale government-contracted software development, where a typical project's "acquisition cycle" might span as long as a decade. For the accelerating world of commercial software, however, it was still too slow.
In the nineties, the software industry's methodology devotees adopted the banner of Rapid Application Development (RAD), which promised to speed up the delivery of finished software through quick prototyping, more aggressive iteration cycles, and reliance on new tools that let the computer itself handle some of programming's more mundane tasks. RAD helped software companies work more nimbly. But soon after it began to take hold, along came the Web, which ratcheted up the industry's hunger for speed yet again, to the manic pace of Internet time.
In 2001, a group of seventeen leaders in the field gathered at a Utah ski resort to try to find common ground among their related but diverse approaches.
The meeting found a more virile name for the movement—Agile Software Development—and produced a manifesto that reads in its entirety:
We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
• Individuals and interactions over processes and tools
• Working software over comprehensive documentation
• Customer collaboration over contract negotiation
• Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.
Agile development was more of an umbrella for shared values than a roadmap of specific processes.
A variety of related but distinct agile methodologies have flourished since the manifesto's publication. One, Scrum, divides projects into thirty-day "sprints" and emphasizes daily meetings to keep work on track. The most popular species of agile methodology by far, though, is Extreme Programming.
Extreme Programming's label mostly refers to the way it adopts a set of widely accepted methods and then pushes them to their limits.
Is testing important? Then have developers write their tests before they write their code. Is it good for the development team to talk with the customer? Then keep the customer on hand to answer the developers' questions. Instead of periodic code reviews, have developers work in pairs all the time so that every line of code has more than one set of eyes on it from the moment it is written. Above all, accept that the customer's requirements, and thus the software's goals, are going to keep changing, and organize your project to "embrace change."
XP introduced a new and sometimes strange vocabulary to the software development world. It mandated breaking projects down into "stories." Each story represents a feature request that the customer lays out for the developers in a narrative that explains what the program should do. The programmers go off and code the "story." If it does what the customer wants, they're done and ready for the next story. Under XP's radical incrementalism, there is no "big design up front." Coding starts almost immediately. This appeals to many programmers, whose itch to start cranking out code is often thwarted by lengthy design processes. The license XP grants to forget about writing detailed specifications and documentation of code is also popular.
But programmers often find some other XP tenets difficult to follow, especially the rallying cry of You Aren't Gonna Need It, or YAGNI. YAGNI takes the just-in-time manufacturing method, in which factories don't collect the parts to build a product until a customer places an order, and applies it to the world of knowledge goods. The principle advises, "Always implement things when you actually need them, never when you just foresee that you need them." In other words: Your meeting-scheduling program may someday need to support users in different time zones, but if your company has only one office right now, that can probably wait. This is practical but painful counsel for the habitual axe sharpeners of the programming tribe.
Conventional wisdom holds that agile development and XP are best for small teams of experienced coders. Though XP has found enthusiastic converts in pockets all over the software industry, it's a demanding discipline if you try to obey all of its mandates.
Critics have argued that XP often serves as an excuse for lazy coders to ignore the discipline demanded by specifications, documentation, planning, and all the other onerous burdens of "heavyweight" process.
Summarized from Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software by Scott Rosenberg.