Rethinking the Software Life Cycle

NEWS AT SEI

Authors

Rick Kazman

Mark H. Klein

Robert Nord

This library item is related to the following area(s) of work:

Software Architecture

This article was originally published in News at SEI on: September 1, 2003

Several architecture-centric analysis and design methods have been created in the past 10 years, beginning with the Software Architecture Analysis Method, or SAAM. The SAAM inspired the creation of a number of other methods. The first of these methods created at the Software Engineering Institute was the Architecture Tradeoff Analysis Method, or ATAM, which has in turn inspired the Quality Attribute Workshop, or QAW, the Cost-Benefit Analysis Method, or CBAM, Active Reviews for Intermediate Designs, or ARID, and the Attribute-Driven Design, or ADD, method.

These methods share not only a common heritage but also a common set of characteristics beyond being architecture-centric. For example, they all use scenarios to direct and focus their activities, and they are all driven by operationalized quality-attribute models. The SAAM focused on modifiability. The ATAM focuses on tradeoffs among quality attributes. The QAW elicits and documents quality-attribute requirements, particularly in the absence of explicit architecture documentation. The CBAM uses architecture-based information to determine the return on investment of the various architectural strategies being considered. An ARID review examines the usability of an in-progress design. The ADD method shapes design decisions around quality-attribute considerations. All of the methods also emphasize documenting the rationale behind the decisions made; in this way the rationale serves as a knowledge base that supports both existing and future decisions.
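
To make the scenario-driven nature of these methods concrete, the sketch below shows the six-part quality attribute scenario (stimulus source, stimulus, environment, artifact, response, and response measure) that the methods elicit and analyze, rendered as a simple record type in Python. It is an illustration only; the field names and the sample values are ours and are not part of any method's prescribed notation.

    from dataclasses import dataclass

    @dataclass
    class QualityAttributeScenario:
        """A six-part quality attribute scenario of the kind the QAW and ATAM elicit."""
        source: str            # who or what generates the stimulus
        stimulus: str          # the condition arriving at the system
        environment: str       # the state the system is in when the stimulus arrives
        artifact: str          # the part of the system being stimulated
        response: str          # the activity undertaken in reaction to the stimulus
        response_measure: str  # how the response is judged or measured

    # A hypothetical modifiability scenario:
    example = QualityAttributeScenario(
        source="a developer",
        stimulus="wishes to change the message format used between two components",
        environment="at design time",
        artifact="the communication layer",
        response="the change is made and unit-tested",
        response_measure="within three person-days",
    )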

In each of these methods there are activities that logically belong to different parts of the traditional software development life cycle (SDLC). These activities are, however, embedded in the methods, because the methods have been designed to be performed independently, typically by a consultant, a quality-assurance group, or a researcher from outside of the developing organization. Note that being independent is not necessarily a bad thing: many organizations realize the value of having an outside auditor investigate their internal practices, even when the auditor’s activities duplicate many of the activities that already take place in the organization. The outsider brings a fresh perspective and a well-honed set of analytical skills, and is (presumably) untainted by existing “group-think” or by office politics.

The point here is that these methods have not normally been integrated with each other or into an organization’s SDLC. As a result of being performed independently, each method repeats existing effort and must include activities that, in concept, belong to the requirements-elicitation, design, or maintenance phases. For example, to analyze an existing software architecture using the ATAM, one needs to understand the business needs that the system is intended to meet, the requirements, the existing design decisions, and the anticipated changes to the system that will occur in the maintenance phase. This duplication of effort is appropriate for a method that is conducted by an outsider, but is inappropriate if the methods are seen as integral to an organization’s normal development process.

A typical SDLC, as it is practiced in relatively mature software-development organizations, has (at least) the following activities:

  • identification of business needs and constraints
  • elicitation and collection of requirements
  • architecture design
  • detailed design1
  • implementation
  • testing
  • deployment
  • maintenance

Of course, this list is not exhaustive, and many of these activities can be broken down into sub-activities (for example, most of these activities include a documentation sub-activity and an analysis sub-activity). Also, this list is not to be taken as implying a particular development process--spiral, waterfall, agile, or any other. The idea here is simply that these are distinct activities, each with its own inputs, outputs, specialists, analysis techniques, and notations, that need to be undertaken in the development of any substantial software-intensive project.

However, as these architecture-centric methods become more widely adopted and more tightly integrated into an SDLC, organizations will inevitably want to tailor them: methods that were created primarily as single-use insertions by an external organization are not necessarily appropriate as part of a stable, ongoing development process. Consequently, organizations that wish to include elements of quality-attribute-based requirements elicitation, explicit architecture design, and architecture analysis in their life cycles will be best served if they can do so “organically”--seamlessly merging appropriate portions of the methods into their SDLCs.

What this means is that the steps and artifacts of the five architecture-based methods listed above--QAW, ADD, ATAM, CBAM, and ARID--will need to be tailored, blended, and, in some cases, removed entirely when the activities of these methods are integrated into an organization’s existing life cycle.

Merging Methods and Models

In a soon-to-be-published technical note,2 we survey the activities of these methods to understand what they have in common and to propose a means of tailoring the activities so that they fit more easily into existing SDLC models.

We can think of the typical life-cycle activities in terms of the five methods mentioned above. In particular, we want to understand where the activities of the five methods have their major application and impact. Table 1 lists, for each method, whether the artifacts of each life-cycle stage are inputs to the method, outputs from it, or both.

Table 1: Methods and Life-Cycle Stages

Life-Cycle Stage                 QAW            ADD      ATAM           CBAM           ARID
Business Needs and Constraints   Input          Input    Input          Input          -
Requirements                     Input; Output  Input    Input; Output  Input; Output  -
Architecture Design              -              Output   Input; Output  Input; Output  Input
Detailed Design                  -              -        -              -              Input; Output
Implementation                   -              -        -              -              -
Testing                          -              -        -              -              -
Deployment                       -              -        -              -              -
Maintenance                      -              -        -              Input; Output  -

(A dash indicates that the stage's artifacts are neither inputs to nor outputs of the method.)

Not surprisingly, the methods focus on the life-cycle stages and artifacts that appear earlier in a project’s lifetime. This is because these are architecture-based techniques, and an architecture is the blueprint for a system. Once a project is in implementation, testing, deployment, or maintenance, the architecture has been largely decided on, either explicitly or implicitly. There is one exception to this principle: the CBAM may apply to maintenance activities. This is because substantial changes to the system can be made in maintenance that affect the architecture. The “Input; Output” annotation under CBAM in this stage indicates this possibility.

Given the information in Table 1, we can now begin to think about placing these methods into a software development organization’s own life cycle. It is unlikely that any of the methods would be included without alteration because, as stated above, they were meant to be independent activities performed by outsiders.

Summary

Architecture-based methods can influence a wide variety of activities throughout the SDLC. These methods have traditionally taken place as independent activities. The relationships between life-cycle stages and the activities embedded in existing architecture-based methods are summarized in Table 2.

Table 2: Life-Cycle Stages and Architecture-Based Activities

For each life-cycle stage, the embedded architecture-based activities are as follows:

Business Needs and Constraints

  • Create a documented set of business goals--issues/environment, opportunities, rationale, and constraints--using a business presentation template

Requirements

  • Elicit and document six-part quality attribute scenarios using general scenarios, utility trees, and scenario brainstorming

Architecture Design

  • Design the architecture using the ADD method steps
  • Document the architecture using multiple views
  • Analyze the architecture using some combination of the ATAM, ARID, and CBAM

Detailed Design

  • Validate the usability of high-risk parts of the detailed design using an ARID review

Implementation, Testing, and Deployment

  • (no architecture-based activities)

Maintenance

  • Update documented set of business goals using a business presentation template
  • Collect use case, growth, and exploratory scenarios using general scenarios, utility trees, and scenario brainstorming
  • Design the new architecture strategies using the ADD method steps
  • Augment the collected scenarios with a range of response values and their associated utility values (creating a utility-response curve); determine the costs, expected benefits, and return on investment (ROI) of all architectural strategies using the CBAM steps (a small numerical sketch of this calculation follows the table)
  • Make decisions among architecture strategies based on ROI, using the CBAM results



It might be argued that each of these steps adds overhead compared with the traditional, non-architecture-aware SDLC. That is true. But the additional effort is more than repaid by having an architecture that is designed, documented, analyzed, and evolved in a disciplined way. The alternative to adding these steps to the SDLC is a chaotic, ad hoc approach to architecture.

For further reading on this topic, including an overview of the five methods and an analysis of how each one contributes to the SDLC, see A Life-Cycle View of Architecture Analysis and Design Methods (CMU/SEI-2003-TN-026), which will be published sometime this fall.

References

Bachmann, F.; Bass, L.; Chastek, G.; Donohoe, P.; & Peruzzi, F. The Architecture-Based Design Method (CMU/SEI-2000-TR-001). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, 2000.

Barbacci, M.; Ellison, R.; Lattanze, A.; Stafford, J.; Weinstock, C.; & Wood, W. Quality Attribute Workshops, 3rd ed. (CMU/SEI-2003-TR-016). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, 2003.

Clements, P. Active Reviews for Intermediate Designs (CMU/SEI-2000-TN-009). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, 2000.

Kazman, R.; Asundi, J.; & Klein, M. “Quantifying the Costs and Benefits of Architectural Decisions,” 297-306. Proceedings of the 23rd International Conference on Software Engineering (ICSE 23). Toronto, Canada, May 12-19, 2001. Los Alamitos, CA: IEEE Computer Society, 2001.

Kazman, R.; Barbacci, M.; Klein, M.; Carriere, S. J.; & Woods, S. G. “Experience with Performing Architecture Tradeoff Analysis,” 54-63. Proceedings of the 21st International Conference on Software Engineering (ICSE 21). Los Angeles, CA, May 16-22, 1999. Los Alamitos, CA: IEEE Computer Society, 1999.

1 We are using the term “detailed design” here because it is a widely accepted term. It should in no way be taken to imply that there are no details in architecture design. The architect will definitely have to go into details in some areas, to specify, for example, the properties of components and their interactions, while detailed design typically is concerned with algorithms, data structures, and realization.

2 Kazman, R.; Nord, R.; & Klein, M. A Life-Cycle View of Architecture Analysis and Design Methods (CMU/SEI-2003-TN-026).

About the Authors

Rick Kazman is a senior member of the technical staff at the SEI, where he is a technical lead in the Architecture Tradeoff Analysis Initiative. He is also an adjunct professor at the Universities of Waterloo and Toronto. His primary research interests within software engineering are software architecture, design tools, and software visualization. He is the author of more than 50 papers and co-author of several books, including a book recently published by Addison-Wesley titled Software Architecture in Practice. Kazman received a BA and MMath from the University of Waterloo, an MA from York University, and a PhD from Carnegie Mellon University.

Robert L. Nord is a senior member of the technical staff in the Product Line Systems Program at the Software Engineering Institute (SEI) where he works to develop and communicate effective methods and practices for software architecture. Prior to joining the SEI, he was a member of the software architecture program at Siemens, where he balanced research in software architecture with work in designing and evaluating large-scale systems. He earned a Ph.D. in Computer Science from Carnegie Mellon University. Dr. Nord lectures on architecture-centric approaches. He is co-author of Applied Software Architecture and Documenting Software Architectures: Views and Beyond.

Mark Klein is a senior member of the technical staff at the Software Engineering Institute. He has over 20 years of experience in research on various facets of software engineering, dependable real-time systems, and numerical methods. Klein's most recent work focuses on the analysis of software architectures, architecture tradeoff analysis, attribute-driven architectural design, and scheduling theory. Klein's work in real-time systems involved the development of rate monotonic analysis (RMA), the extension of the theoretical basis for RMA, and its application to realistic systems. Klein’s earliest work involved research in high-order finite element methods for solving fluid flow equations arising in oil reservoir simulation. He is the co-author of two books: A Practitioner’s Handbook for Real-Time Analysis: Guide to Rate Monotonic Analysis for Real-Time Systems and Evaluating Software Architecture: Methods and Case Studies.

The views expressed in this article are the authors’ only and do not represent directly or imply any official position or view of the Software Engineering Institute or Carnegie Mellon University. This article is intended to stimulate further discussion about this topic.

