Saturday, November 7, 2009

1990 to 1999: Process


The next major "solution" to the software quality problem came in the 1990s under the phrase software process improvement. At the center of this movement was the much heralded, often derided, Capability Maturity Model or CMM; see "The Capability Maturity Model" sidebar for a short explanation. For brevity's sake, we'll oversimplify the software process improvement dogma: Software development is a management problem to which you can apply proper procedures for managing data, processes, and practices to good end. Controlling the way software is produced ensures better software.


In other words, because developers had failed to manage their projects appropriately (as evidenced historically by software's poor track record for quality), managers must install organizational controls to do the managing for them. The problems with this belief are manifold, not least because even the best processes in the world can be misapplied (Jeffrey Voas, "Can Clean Pipes Produce Dirty Water?" IEEE Software, vol. 14, no. 4, July 1997, pp. 93-95).


Although we're being facetious, our point is serious: Good software development processes are usually necessary, but the software process improvement movement sold its processes to developers in a way that established an adversarial relationship between management and technical personnel. To make matters worse, many managers who knew nothing about software suddenly found their skills in high demand at software companies keen on process improvement.


Software development, however, is fundamentally a technical task: Good developers can develop good software despite poor or no management. The converse is improbable: Poor technicians are unlikely to develop good software under even the best management. (For an alternative analysis but a similar conclusion, mostly concerning management's role in Y2K mitigation, see Robert Glass, "Y2K and Other Software Noncrises," IEEE Software, vol. 17, no. 2, Mar./Apr. 2000, pp. 103-104.) Thus, the CMM has propagated slowly. In many large software companies, developers are still unaware of its very existence.


The CMM is not the only software process improvement idea that came out of the 1990s. In the decade's later years, software development organizations began to apply a related theory to their processes—Six Sigma, a method originally devised for reducing manufacturing and design defects in hardware systems.


Six Sigma is a disciplined, data-driven approach and methodology for eliminating defects (driving towards six sigmas between lower and upper specification limits) in any process—from manufacturing to transactional and from product to service. To achieve Six Sigma, a process must not produce more than 3.4 defects per million opportunities. A Six Sigma defect is defined as anything outside of customer specifications. A Six Sigma opportunity is then the total quantity of chances for a defect (http://www.isixsigma.com/sixsigma/six_sigma.asp).
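
By way of a rough, hypothetical illustration of what that threshold means, the sketch below computes defects per million opportunities (DPMO). The module and opportunity counts are invented for the example; only the 3.4-per-million figure comes from the definition above.

    # Hypothetical sketch: defects per million opportunities (DPMO), the
    # metric behind the Six Sigma threshold quoted above. All counts below
    # are invented for illustration.

    def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
        """Defects per million opportunities."""
        return defects / (units * opportunities_per_unit) * 1_000_000

    # Suppose a release of 500 modules, each (somehow) judged to offer 200
    # opportunities for a defect, ships with 7 defects outside customer
    # specifications.
    rate = dpmo(defects=7, units=500, opportunities_per_unit=200)
    print(f"{rate:.1f} DPMO")  # 70.0 -- far above the 3.4 DPMO Six Sigma allows

The arithmetic is trivial; the hard part, as the next paragraph argues, is deciding what counts as an "opportunity" in software at all.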


The problem with Six Sigma, however, is that it is not clear what one million opportunities to introduce defects into a software product means. Furthermore, how could that ever be properly measured?


To further widen the chasm dividing management and technical staff over how to develop software, the 1990s was also a decade of remarkable progress in computing infrastructure. New operating platforms eclipsed older operating systems in sophistication. Knowledge that once was useful became obsolete. New programming languages popped up and became overnight successes. Programming had to be learned and relearned. New APIs (application programming interfaces) for communication, security, distributed computing, and, of course, the Web turned developers' lives upside down. Because developers were constantly addressing the crisis of staying current, they had little time to attend to the pressures of following particular software process standards.


In defense of the software process movement, we must recognize it as a new phenomenon. Like many new phenomena, it is not completely understood and is widely misapplied. To our minds, one lesson of the 1990s is that the current state of the practice in software process does not easily support new technologies. What worked for mainframe or desktop applications does not necessarily work on products that are built quickly and deployed hourly in today's Internet-time workplace.


However, like its partially successful predecessors, the emphasis on software process produced some beneficial side effects. The fact that many more developers are aware of simple things like configuration management, defect tracking, and peer review is clearly positive. The 1990s began as a process revolution and ended with the realization that process is not something that you can force on people or that will catch on in a few years. Furthermore, process for the sake of process is not enough. Process improvement comes from better technical practices, plain and simple.


Finally, the 1990s marked the first real attempt to turn software development into engineering through the concepts of component-based software engineering (CBSE) and commercial off-the-shelf (COTS) components. The idea is to create small, high-quality parts and join them together. The problem, of course, is that high-quality parts joined together do not necessarily result in a high-quality composite system. The composite system might suffer from a flawed method of composition, or assumptions about the components' behavior or environment might be flawed. Furthermore, commercial software components, which companies usually license as executables, can yield nasty side effects unknown to the licensee. Such side effects might only manifest themselves when joined to other components and are virtually impossible to detect by testing the component in isolation. Therefore, although the divide-and-conquer paradigm works well for hardware and physical systems, it can actually be a disaster for logical systems. Only time will tell how CBSE will affect software quality's future.
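
To make the composition problem concrete, here is a minimal, hypothetical sketch (all names are invented): two components that each satisfy their own specification yet silently corrupt data when joined, a defect that no isolated test of either part would reveal.

    from datetime import date, datetime

    def export_record(d: date) -> str:
        # Component A: serializes a date as day/month/year (its documented contract).
        return d.strftime("%d/%m/%Y")

    def import_record(s: str) -> date:
        # Component B: parses a date as month/day/year (its documented contract).
        return datetime.strptime(s, "%m/%d/%Y").date()

    # Each component is correct against its own specification and passes its
    # own tests. The composite silently swaps day and month whenever both
    # values are 12 or less: 7 November 2009 comes back as 11 July 2009.
    original = date(2009, 11, 7)
    round_tripped = import_record(export_record(original))
    assert round_tripped == date(2009, 7, 11)

The flaw lives entirely in the composition, exactly where component-level testing cannot see it.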

