
Creating a Framework for Reliability Validation

Press Release

November 24, 2009—The SEI and the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) recently began a one-year engagement that aims to develop a comprehensive approach to overcoming deficiencies in the testing now used to validate software and system reliability. This comprehensive approach will include the development of a roadmap for the research and application of technologies such as model-based engineering, assurance cases, analytical tools, and industry standards, including the Architecture Analysis and Design Language (AADL), in the assessment and validation of complex system reliability. Participants say they hope the framework they build can become an industry practice for analytical architectural validation.

AMRDEC is the Army organization that provides research, development, and engineering technology and services for aviation and missile platforms across the program life cycle.

“Avionics systems are moving to more digitally oriented systems—digital cockpits,” says Willie J. Fitzpatrick, AMRDEC Aviation Division chief. “Assuring that those digital cockpits are safe has become a bigger problem. And the cost of assuring us they’re safe, reliable, and dependable has been going up. Our customers, who are the project managers for the different aviation platforms, have limited budgets for delivering a system to a soldier in the field. They would rather invest in getting the soldier as much functionality as they can, but they can’t trade off system safety.”

“System safety, reliability, and performance are increasingly affected by system-level problems due to the time-sensitive nature of software interaction on a set of networked computers despite best design and test practices,” says the SEI’s Peter Feiler. “We bring to the table an architecture-centric model-based approach that allows us to discover these problems earlier in the life cycle and make the validation process, which is currently very heavily test based, more cost effective and efficient. Similarly, we will consider assurance cases, which have been used successfully in the UK to reason about what is sufficient evidence to certify safety-critical public transportation systems.”

In the context of this project, architecture models with precise semantics drive quantitative and predictive analysis. Bruce Lewis, an SEI resident affiliate from AMRDEC, explains, “We’re actually doing quantitative analysis of the system architecture—software and hardware—through virtual integration to determine whether the system can perform correctly and meet the requirements for safety and other non-functional qualities before it is built. This approach gives us deeper insight into the complexities of the hardware or software system and, through the framework and roadmap, also integrates deeper reasoning about the evidence required for assurance and the correctness of the requirements.”
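To make this concrete, the fragment below is a minimal, hypothetical AADL sketch, not taken from the project, of the kind of architecture model such analysis works from: a periodic cockpit display thread with declared timing properties and a flow specification whose latency bound an analysis tool can check before the system is built. The package and component names are illustrative only.

-- Hypothetical sketch: a cockpit display thread whose timing and flow
-- latency can be analyzed from the model before implementation.
package CockpitDisplay_Example
public
  thread display_manager
    features
      sensor_in : in data port;
      frame_out : out data port;
    flows
      f1: flow path sensor_in -> frame_out;
    properties
      Dispatch_Protocol => Periodic;
      Period => 20 ms;
      Compute_Execution_Time => 2 ms .. 5 ms;
      Latency => 5 ms .. 15 ms applies to f1;
  end display_manager;
end CockpitDisplay_Example;

From declarations like these, an analysis tool such as the SEI’s open-source OSATE environment can compute end-to-end latency and processor utilization and compare them with the stated bounds, which is the kind of pre-build quantitative check Lewis describes.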

“This comprehensive approach and the roadmap will give us things that testing cannot do,” says Fitzpatrick. “I’m hoping for a change in best practices. The approach in the past has always been ‘Let’s build it then test the heck out of it.’ We will still rely on some testing. But even if we were given all the money required for testing, we wouldn’t have enough time.”

Lewis agrees, “Testing says, given these test parameters, does the system work? An analytical approach gives you a much broader ability to say this will or won’t work. Then, of course, you have to validate your assumptions and validate that the implementation actually does what you modeled.”

“We are not looking for a specialized solution that works only for a particular program or just the Army,” says Feiler. “We expect the framework will accommodate a range of technologies from the SEI and the research community. A roadmap will outline a strategy for its evolution and its incorporation into an industry-wide software-reliant system practice.”