PACC Starter Kit Provides Tools for Developing Systems that Exhibit Predictable Behavior

NEWS AT SEI

This library item is related to the following area(s) of work:

Predictability by Construction

This article was originally published in News at SEI on: May 1, 2008

Many software systems have stringent quality attribute requirements. For example, industrial robots must perform tasks with strict deadlines, medical devices must comply with safety requirements, and most software must minimize security vulnerabilities. Although analysis theories and techniques to satisfy these requirements have existed for many years, they are not widely used because of the resources and expertise required to create, maintain, and evaluate analysis models. Even when such theories are used, doubts often remain about the accuracy of the results because of uncertainty of the correspondence between the models that are analyzed and the code that will be executed.

The PACC Starter Kit (PSK) is an integrated set of software development tools that demonstrates how existing technologies can be integrated to provide objective confidence in predictions of system behavior. “Our primary intent is to provide working examples and building blocks that help organizations get started with integrating such technologies in their own development environments,” says James Ivers, a researcher at the SEI and one of the developers of the PSK.

“The PSK is an Eclipse-based development environment for Windows that combines a model-driven development approach with reasoning frameworks, which apply analyses to predict runtime behavior from specifications of component behavior and accompany each prediction with some measure of confidence,” says Ivers.

Often, quality attribute requirements in areas such as performance, security, or safety are some of the most difficult system behaviors to get right. Effective techniques to provide early confidence in the ability to satisfy such requirements usually include some form of architectural or design analysis that exploits knowledge of the relevant quality attributes.

Unfortunately, there often remains a gap between architectural concepts and the implementations that will actually be deployed. And that gap leads to uncertainty as to whether the qualities designed into an architecture will ultimately be realized in the executing software.

The PSK shows how a collection of today’s technologies can be integrated to mitigate this risk. Key features of the concepts integrated in the PSK include

  • reducing the gap between architecture and code by using a component technology that mirrors architectural idioms and enforces key constraints on software developers, ensuring that the associated quality attribute analyses remain applicable
  • packaging technologies for quality attribute analysis into reasoning frameworks that can be used to automatically predict system behavior without requiring users to become theory experts
  • providing measurable confidence that architectural analysis results will be consistent with executing software behavior, through either statistical confidence labels or automated proof-generation techniques

The PSK includes online tutorials that guide a user through its use on a number of examples. According to Ivers, the most interesting example is the audio-mixing application, which is built from components that include a signal generator, WAV decoder, splitter, adder, inverter, and graphical display. Ivers explains that “the audio example has some of the same quality attribute requirements as larger, more complex systems, but is a small application that is easy to understand. Problems tend to be readily observable, either audibly or by using the included graphical display.”

Figure 1: Graphical display of signals processed in a multiple-channel use of the audio-mixing application

The tutorial demonstrates how the behavior of individual components is specified and how the architecture of the mixer is depicted in the PSK. It shows how code for the included component technology is generated from component specifications and how that code is deployed and executed. It also shows how the included reasoning frameworks are applied to provide quality attribute analyses; two of the analyses that are demonstrated for this example are

  • performance analysis: To avoid artifacts like skips in the audio output, all processing for a given frame of input data must be completed within a specific timeframe. The tutorial demonstrates how to supply the information needed to perform this analysis (e.g., assignment of thread priorities and how to measure execution time) and how to perform the analysis using one of several evaluation procedures.


Figure 2: The performance model that is generated for the audio-mixing application

This view is the result of information extracted from the specification of the audio application; this form is more suitable for analysis by the various performance evaluation procedures found in the PSK.
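The worst-case variant of this kind of latency analysis can be illustrated with the classic response-time iteration for fixed-priority preemptive scheduling. The sketch below is a simplified stand-in for the PSK's evaluation procedures, not its actual implementation, and the task set is invented for illustration.

```python
import math

def worst_case_response_times(tasks):
    """Classic response-time analysis for fixed-priority preemptive
    scheduling.  `tasks` is a list of (execution_time, period) pairs,
    ordered from highest to lowest priority; deadlines equal periods.
    Returns each task's worst-case response time, or None for a task
    that cannot meet its deadline."""
    results = []
    for i, (c_i, t_i) in enumerate(tasks):
        r = c_i
        while True:
            # Interference from every higher-priority task j.
            interference = sum(
                math.ceil(r / t_j) * c_j for c_j, t_j in tasks[:i]
            )
            r_next = c_i + interference
            if r_next == r:          # fixed point reached
                results.append(r)
                break
            if r_next > t_i:         # deadline (== period) missed
                results.append(None)
                break
            r = r_next
    return results

# Hypothetical task set: (execution time, period) in milliseconds.
audio_tasks = [(1, 4), (2, 6), (3, 12)]
print(worst_case_response_times(audio_tasks))  # [1, 3, 10]
```

A response time at or below a task's period means its deadline is met; in the audio example, that corresponds to finishing each frame's processing before the next frame arrives.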

  • behavior analysis: For the user to hear the right audio output, the application must follow important behavioral rules. For example, components must follow rules about their interactions with each other, such as acknowledging the reception of information in protocols that require acknowledgement, even under error conditions. The tutorial demonstrates how users can define these rules and check for any violations.


Figure 3: Sequence diagram demonstrating a specific execution in which the specification fails to follow user-supplied rules about inter-component communication


This is one example of the type of evidence supplied to give users a reason to believe analysis results.
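A full reasoning framework checks such interaction rules over all possible executions, producing counterexample traces like the one in Figure 3 when a rule fails. As a simplified illustration, the sketch below checks one hypothetical rule on a single recorded trace: a component that has sent a message must receive an acknowledgement before it sends again.

```python
def check_ack_rule(trace):
    """Check a simple interaction rule over an event trace: a component
    that has sent a message must be acknowledged before it sends again.
    `trace` is a list of (event, component) pairs, where event is
    "send" or "ack".  Returns the indices of violating events (an
    empty list means the rule holds on this trace)."""
    pending = set()     # components awaiting an acknowledgement
    violations = []
    for i, (event, comp) in enumerate(trace):
        if event == "send":
            if comp in pending:        # sent again without an ack
                violations.append(i)
            pending.add(comp)
        elif event == "ack":
            pending.discard(comp)
    return violations

# Hypothetical trace from the mixer: the adder sends twice in a row.
trace = [("send", "decoder"), ("ack", "decoder"),
         ("send", "adder"), ("send", "adder"), ("ack", "adder")]
print(check_ack_rule(trace))  # [3]
```

Checking a single trace can only demonstrate failure; demonstrating success over every execution requires the exhaustive model checking and proof generation that the PSK's behavior reasoning framework provides.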

Ivers explains how these examples are used: “You can use them to see how problems can be detected early, modify the architecture or the details of individual components to correct problems, re-analyze the results to confirm fixes, and confirm that the executing system exhibits the qualities indicated by the analyses.”

The PSK is intended as a vehicle for demonstrating how these concepts can be integrated in practice. The PSK, now available for download, integrates a collection of technologies that include
  • a design language based on UML statecharts for describing component behavior and descriptions of how components are assembled
  • a code generator for creating software implementations ready to execute in the provided execution environment (the Pin component technology and a real-time extensions layer for Windows) that enables you to go from software designs to executing examples
  • several automated quality attribute analyses
    • performance analysis for worst-case, average-case, and sporadic-server latency
    • behavior analysis for user-specified claims of software behavior (e.g., satisfaction of application-specific invariants, conformance to interaction protocols, or absence of deadlock)
    • security analysis for detecting buffer overflows in C code
    • memory analysis for anticipated resource use
  • accompanying means of demonstrating objective confidence. For example, statistical measures of confidence in performance analysis are based on data supplied by measurement tools included in the kit. In the behavior analysis, execution traces are used to demonstrate failure, and proof generation is used to demonstrate success.

To learn more, visit the PSK pages or contact us using the link in the For More Information box at the bottom of this page.


For more information

Contact Us

info@sei.cmu.edu

412-268-5800
