October 21, 2014—Look around and you will find empirical research and analysis in some surprising places. In Major League Baseball, big data drives sabermetric management practices. In the corporate world, companies have begun to leverage employee data to fuel the development of talent analytics as a human resources tool. Advocates of "Moneyball Government" are developing measurement-based approaches for assessing whether taxpayer dollars are being allocated to the initiatives that produce the best results and impact. On the personal scale, an array of emerging data sensors has spurred a new kind of self-analysis for practitioners of the Quantified Self movement. These are just a few of the ways in which empirical analysis has entered the mainstream.
Our growing ability to sense, collect, store, and analyze data has spread to nearly all fields of endeavor, creating new insights and enabling decisions based on facts rather than hunches. Millions of everyday people are now engaged in empirical analysis, even if they don't know it. But at the Department of Defense (DoD)—an organization that thrives on data and relies on software-based systems—there is a dearth of information in one very important area: software development and acquisition.
This is why earlier this year the SEI created the Empirical Research Office (ERO). Under the leadership of Associate Director David Zubrow and Assistant Director Forrest Shull, both of the SEI's Software Solutions Division, the ERO will spearhead research to support policy and program-management decision making related to software systems. This addresses a timely need: software plays an ever-increasing role in providing critical capabilities to the warfighter, and the DoD's investment in creating and sustaining software is growing accordingly. "We're the only FFRDC focused on software engineering," said Shull, "and that makes us uniquely suited to lead this work."
Shull sees the new unit applying a data-driven, empirically based approach to assessing the state of the practice, evaluating methods and technologies, and formulating guidance. "We want to provide decision makers with credible and practical advice when it comes to creating policy related to the DoD's substantial investments in software systems," said Shull, "and we also want to provide measures for assessing program health and emerging technologies."
The overall objective of the ERO, notes Shull, is to provide the DoD the data it needs to improve its software acquisition, development, and sustainment practices. To achieve this objective, the ERO will work to become the DoD's "go to" source for empirically grounded information on software engineering and acquisition.
Shull has outlined three research paths for the ERO. "First, we will conduct a data-driven analysis of current DoD software acquisition programs to ensure future research focuses on the right problems." This work will identify the top software-related challenges faced by current projects and examine which software technologies cause the biggest sustainment headaches.
"Second," noted Shull, "we will quantify the costs and benefits of various emerging solutions." Research topics in this path will be driven by the interests of DoD programs. For example, the ERO could address the degree to which virtual prototypes based on software simulations prove accurate and effective. Research might also identify what measures of technical debt can be automatically extracted from existing code bases, determine whether and how the return on investment of design-by-construction approaches can be calculated, or identify measures useful for project management and reporting in an agile context.
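To make the technical-debt idea concrete, here is a minimal sketch of the kind of automated extraction such research might evaluate. The two indicators shown—counts of TODO/FIXME markers and average function length—are illustrative proxies chosen for this example, not measures the ERO has endorsed.

```python
# Illustrative sketch: two crude, automatically extractable
# technical-debt proxies for a Python source file. These metrics are
# hypothetical examples, not the ERO's actual measures.

import re


def debt_indicators(source: str) -> dict:
    """Return simple technical-debt proxies for a Python source string."""
    lines = source.splitlines()
    # Marker comments are a common, if rough, sign of deferred work.
    todo_count = sum(
        1 for line in lines if re.search(r"#\s*(TODO|FIXME)", line)
    )
    # Long functions correlate with maintenance cost in many studies;
    # approximate function length as the span between consecutive defs.
    def_lines = [i for i, line in enumerate(lines) if re.match(r"\s*def\s", line)]
    if def_lines:
        spans = [b - a for a, b in zip(def_lines, def_lines[1:])]
        spans.append(len(lines) - def_lines[-1])
        avg_func_len = sum(spans) / len(spans)
    else:
        avg_func_len = 0.0
    return {"todo_count": todo_count, "avg_function_length": avg_func_len}


sample = '''
def f():
    # TODO: handle errors
    return 1

def g():
    return 2
'''
print(debt_indicators(sample))  # → {'todo_count': 1, 'avg_function_length': 3.0}
```

A real study would of course validate any such metric against observed sustainment outcomes before recommending it for project reporting.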
"We also want to create the infrastructure and tools needed to support low-cost analysis," added Shull. This could include sharing subsets of the data to facilitate collaborative efforts with other FFRDCs and the larger research community. Given the size of the undertaking, Shull sees fostering community engagement as critical. "We want to involve researchers and subject matter experts whose methods and expertise can elevate the quality of the research and improve the overall capability of the effort," he said.
Shull joined the SEI after 15 years at the Fraunhofer Center for Experimental Software Engineering in Maryland, where he was director of the Measurement and Knowledge Management Division. His work has focused on applying empirical methods to provide actionable, concrete decision support for customers in the development and acquisition of systems and software.
Shull has served as a lead researcher on projects for NASA's Office of Safety and Mission Assurance, the NASA Safety Center, the U.S. Department of Defense, the Defense Advanced Research Projects Agency (DARPA), the National Science Foundation, and companies such as Motorola and Fujitsu Labs of America. He has been an adjunct professor at the University of Maryland College Park, where he developed and taught software engineering courses aimed at skill development for practicing software professionals. He serves through December 2014 as editor-in-chief of IEEE Software, the leading publication for translating software research and theory into practice, and his elected term on the Board of Governors of the IEEE Computer Society begins in 2015.
For more information about the SEI's Empirical Research Office, please contact email@example.com.
Categories: Measurement and Analysis