Software Quality

This site has been created to log references to quality in software development.
If you have any suggested additions, please contact me.


Software QA Articles - Articles on important software QA topics.

From ACM's TechNews, December 16, 2005

"Quality Is Now Development Job One"
eWeek (12/12/05) Vol. 22, No. 49, P. D1; Coffee, Peter

Embedding quality assurance (QA) deeply into the software development life cycle makes sound business sense: it reduces the cost of correcting defects and helps satisfy regulations such as Sarbanes-Oxley. A roundtable discussion convened by eWeek Labs focused on the breadth and nature of QA, which is expanding as organizations depend on purchased software and network-based services as elements of line-of-business applications. Identify Software's Lori Wizdo argued that it is wiser for customers to define quality in terms of user experience than in terms of what developers produce. Sam Guckenheimer of Microsoft said development managers must understand that most developers are not security specialists, which points to the need to incorporate automated analysis of potential security issues pervasively into tools and processes. He and others stressed that the quality process must be a closed feedback loop. Segue Software's Ian MacLeod expressed his wish for extensive industry adoption of tool integration, noting that "Quality is all the elements of the ecosystem--requirements, development and test management, defect management, monitoring, and diagnostics across the deployment line and into operations." Station Casinos CIO Marshall Andrew reported that QA currently takes about half as long as development, and his company is working to reduce that time without a tradeoff in quality through the use of QA-accelerating tools. Panelists concurred that as quality-improvement tools make developers more productive, enterprise managers should apply the increased efficiency toward better understanding and fulfilling user requirements.

From ACM's TechNews, October 3, 2005

"Development Study: Haste Makes Waste"
Computerworld (09/23/05); Hayes, Linda

A recent study has found that increased funding has little impact on the overall quality of a project. Quantitative Software Management's (QSM's) SLIM estimation product provides empirical analysis of a variety of IT, real-time embedded systems, and engineering projects; accurate project estimation has been a long-standing but elusive goal of the software industry. In repeating a study it had conducted in 1996, QSM confirmed that the delivery difference between a large team and a small team was only 12 days, partly because large teams created more than six times as many errors as small teams. Large teams failed to deliver significant improvements in cost, time, or quality, widely recognized as the three major metrics of software development. The study confirmed the theory that adding more resources to a project yields diminishing returns. The efficiency of small teams is fostered by an atmosphere of intimate collaboration, and new information takes less time to reach every member of a smaller team. The diminished quality of large-team work stems from the absence of a holistic view of the project when it is broken into compartmentalized segments distributed among the members of a large team; this approach invites errors and complications that arise from cobbling together bodies of code written by authors who are not in communication with one another.

From ACM's TechNews, September 14, 2005

"Why Software Fails"
IEEE Spectrum (09/05); Charette, Robert N.

The causes of software failures are well established, yet an end to such failures is not in sight, according to IEEE member and author Robert Charette. Preventing such failures is not a major priority among most organizations, despite the damage failures can do to their prospects: IT fiascos can destroy a company, hinder economic growth and quality of life, and even undermine security, as evidenced by the FBI's costly Virtual Case File episode. Charette estimates that 15 percent to 20 percent of software projects with budgets of $10 million or higher are doomed to fail, leading to his projection that such failures have cost the U.S. economy perhaps $75 billion over the last five years. The most frequent reason for an IT project's failure is a disproportionate amount of rework stemming from errors that were not caught before final system testing or rollout, and thus are harder to track down; worse, the process of repairing those glitches often introduces new errors. Charette describes bad decisions by project managers as "probably the single greatest cause of software failures today," and notes that such decisions are compounded by vague or missing knowledge; these decisions can take the form of hiring too few programmers, choosing an inappropriate contract, or not reviewing the project's progress regularly. Business factors underlying IT project failures include competition, the need to cut costs, political expediency, and a shortage of support among upper management. Relying on immature or untested technology in the hope of boosting the company's competitive standing is a guaranteed formula for failure, and larger projects introduce more complexity, which in turn makes errors more likely. A company stands a better chance of catching and correcting errors early through an open, honest, collaborative, and communicative IT project environment, Charette writes.

From ACM's TechNews, August 22, 2005

"Software Development Survey Says People, Not Tools, Matter Most"
Application Development Trends (08/01/05)

Management and technology approaches are more significant factors in determining the quality of software development projects than tools, according to a new study from Quantitative Software Management. In its report on "the Best and Worst in Class" projects, QSM found that the best development teams were able to control requirements change, had project leadership and highly skilled people on the job with application domain experience, and made good use of tooling. "An inability to adroitly manage change can be the enemy of productivity and quality," says Doug Putnam, managing partner of QSM. "Effective leadership creates a culture where change is well managed by highly skilled teams with good domain knowledge." Tools were considered the third most important factor, but the report concludes that they still do not make up for training and management. The best-in-class software development projects came in at roughly 7.5 months and $300,000 in total life-cycle time and cost, compared with more than two years and $2.2 million for the worst-in-class initiatives.

From ACM's TechNews, July 27, 2005

"Buggy Software: Up From a Low-Quality Quagmire"
Computerworld (07/25/05) P. 23; Hildreth, Sue

In an effort to stave off the tremendous losses in revenue, production, data, and customer satisfaction that software flaws can entail, CIOs are studying how bugs are introduced into the application development process and why they seem so resistant to prevention. Experts on bad software blame the problem on poor application life-cycle management (ALM), and note that initiatives to improve software quality must encompass every stage of the software's existence--from planning through development, testing, and maintenance. Gartner analyst Theresa Lanowitz estimates that about 90 percent of all IT organizations are in the dark when it comes to effective ALM, and she concludes that the majority "waste quite a bit of their budget because they have bad business practices, fail to deliver on requirements, and fail to manage projects to meet schedule, cost, and quality goals." Clear communication among developers, testers, and business users must be established at the outset of the application's life cycle, and Tescom Software Systems' Arthur Povlot says most quality assurance problems can be traced to poor requirements. Sorin Fiscu of the Berkshire Life Insurance Company of America says developers should subject their code to specific QA tests before passing it on to the QA staff, while configuration management and change management policies and tools can help enforce a standard code creation and testing process. Once developers hand off the code, it must be rigorously tested for functionality, integration, performance, security, and any program changes or updates. Povlot recommends creating test cases for all of the application's most crucial requirements; a sketch of that practice follows below. To maintain the software's quality after deployment, data collected during production must be fed back into the requirements planning of the next iteration.
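
As an illustration of the developer-side QA tests Fiscu describes, the sketch below checks a requirement with a unit test before the code is handed to the QA staff. It is hypothetical: the shipping_cost function and its pricing rules are invented here, and Python's standard unittest module stands in for whatever tooling a team actually uses.

    import unittest

    def shipping_cost(weight_kg: float) -> float:
        """Invented requirement: $5.00 flat up to 1 kg, plus
        $2.50 for each kilogram beyond the first."""
        if weight_kg <= 0:
            raise ValueError("weight must be positive")
        return 5.00 if weight_kg <= 1 else 5.00 + 2.50 * (weight_kg - 1)

    class ShippingCostRequirements(unittest.TestCase):
        # One test per crucial requirement, per Povlot's recommendation.
        def test_flat_rate_up_to_one_kilogram(self):
            self.assertEqual(shipping_cost(1.0), 5.00)

        def test_surcharge_above_one_kilogram(self):
            self.assertAlmostEqual(shipping_cost(3.0), 10.00)

        def test_rejects_non_positive_weight(self):
            with self.assertRaises(ValueError):
                shipping_cost(0)

    if __name__ == "__main__":
        unittest.main()

Tests like these double as an executable record of the requirements, which supports the article's point that most QA problems trace back to poor requirements.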

From ACM's TechNews, July 15, 2005

"Software Under Scrutiny"
CIO Australia (07/07/05); Bushell, Sue

Software inspection processes are the best way to lower development costs, shorten delivery schedules, and make operational software products more trustworthy in the field. Software inspection expert Edward Weller III says experience and discussions with industry peers indicate that management is more open to software solutions when developers present them in a feasible, credible way that satisfies decision makers' information needs. One of the most renowned software inspection methodologies is the Fagan Inspection Process, which is designed to reduce the number of defects users must contend with, eliminate defects prior to testing, and instill measurability and manageability within software development projects. The data captured through software inspection can help managers ascertain the return on investment of any software process improvement action by quantifying incurred costs and achieved savings, while technical practitioners can use the data to determine defect detection rates and the types of bugs detected in order to improve software practice and products. IBM's Andrew Bowey says CIOs can be very resistant to inspections, so it should be clearly communicated to them that the organization will enjoy more savings when bugs are spotted and patched earlier in the development process. John Salerno of Dedicated Systems reports that "companies that have been burnt are more likely to use [inspections] because what we find is that inspections are valuable for this reason: that it finds bugs where they are injected, which is usually by the engineers themselves in the development team." IBM's Joji Vergara says the benefits to be gained through the inspection process depend on how clearly each participant is told why his participation was solicited.

From ACM's TechNews, July 13, 2005

"Assertive Debugging: Correcting Software as If We Meant It"
Embedded Systems Programming (06/05) Vol. 18, No. 6, P. 28; Halpern, Mark

Programmer/software designer Mark Halpern describes the Assertive Debugging System (ADS) as a scheme that would abridge the current debugging process and enable the systematic, documentable debugging of software objects, which he believes will soon become a legal requirement. ADS deals with bugs in program implementation that stem from substantive or logic errors; these are among the most dangerous kinds of bugs because they are easy to introduce, often unnoticeable, and not immediately apparent in a program's behavior. Halpern's approach is designed to make bugs manifest at the earliest possible time so corrective action can be taken before their existence is covered up by continued program execution; this is done by monitoring the behavior of many variables at run time, in search of violations of assertions the programmer made when defining those variables. The assertions are expressed in a notation that is a natural offshoot of the programming language, and they can be clustered so that the programmer can trigger or mute sets of related assertions with a single command. As a subject program is compiled, the activated assertions produce code within the object program that checks the applicable variables for any breaches of the behavioral constraints the programmer defined. Once the code detects a violation, the program's execution is stopped and the programmer-specified exception action is taken. Most programmers argue that ADS is unaffordable, but Halpern counters that the approach yields value in every execution, a claim that cannot be made for current debugging practices. In addition, the cost of ADS in machine cycles is more than offset by what it saves in project schedule slippage, software-engineer time, and time-to-market.
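
The sketch below illustrates the idea in Python. It is an approximation of the approach Halpern describes, not his ADS (which generates its checking code at compile time): a descriptor re-checks a programmer-supplied assertion on every assignment to a monitored variable and halts execution at the first violation, before continued execution can mask the bug.

    class AssertionViolation(Exception):
        """Raised the moment a monitored variable breaks its constraint."""

    class Monitored:
        """Descriptor that checks an assertion on every write."""
        def __init__(self, predicate, description):
            self.predicate = predicate
            self.description = description

        def __set_name__(self, owner, name):
            self.name = name

        def __get__(self, obj, objtype=None):
            return obj.__dict__[self.name]

        def __set__(self, obj, value):
            if not self.predicate(value):
                # Stop at the point of injection, per Halpern's goal.
                raise AssertionViolation(
                    f"{self.name} = {value!r} violates: {self.description}")
            obj.__dict__[self.name] = value

    class Account:
        # The assertion is stated where the variable is defined.
        balance = Monitored(lambda v: v >= 0, "balance must be non-negative")

        def __init__(self, balance):
            self.balance = balance

    acct = Account(100)
    acct.balance -= 40    # fine: the constraint still holds
    acct.balance -= 100   # halts here, not thousands of instructions later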

From ACM's TechNews, June 24, 2005

"Large Users Hope for Broader Adoption of Usability Standard"
Computerworld (06/20/05) P. 1; Thibodeau, Patrick

A three-year-old usability standard should gain greater acceptance in the business community when it is approved by the International Organization for Standardization (ISO). The standard, called the Common Industry Format for Usability Test Reports (CIF), is expected to gain momentum once it is adopted internationally, and the European demand for communication across borders will help solidify its position. CIF reports usability test results in a common format that gives prospective software buyers an idea of "the real costs of ownership," said Jack Means of State Farm. CIF, developed jointly by Microsoft, Intel, IBM, and others, owes the impetus for its creation largely to Boeing, which was experiencing costly usability issues. Thomas Tullis of Fidelity Investments, another supporter of CIF, says the standard helps companies make sound purchasing decisions because "the usability of the software that you buy on an enterprise-wide basis potentially has a really significant impact on the productivity of your employees." CIF will enjoy greater acceptance once customers start to ask vendors for usability reports before they purchase software, though that could substantially alter developers' methods of building applications. The standard was accepted by the ISO's technology standards committee last month, and now awaits full ISO approval.

From ACM's TechNews, June 20, 2005

"Improving Software Quality"
SD Times (06/01/05) No. 127, P. 34; Vereen, Lindsey

The software industry is reevaluating its approach to fixing software errors in products, questioning whether developers make the best testers, whether quality can be tested into a system, and at what point in the development life cycle testers should be involved. Agile processes such as Extreme Programming (XP) move testing forward and give developers more responsibility, but there is some concern about whether the strategy can work in environments with more than 10 developers. Agile methods have encouraged the move toward automation and away from simple scripting; because automated tests run the same way every time, the skill of the individual tester is less of an issue than it is with manual testing. The ability to perform a large number of tests through automation frees companies to pursue more exploratory, context-based testing, such as time-boxed sessions scheduled to complete a task in two days. Security testing is a challenge because security is not an application functionality issue and the expertise comes from the operations side of the company, but experts favor a static strategy of examining source code for possible vulnerabilities. Performance testing, which tends to start at a beta release and sits outside the overall development process, still has not reached the maturity of functional testing. Companies are also starting to use open-source tools, which are competitive and cheaper, for testing.

From ACM's TechNews, June 17, 2005

"Testing, Testing, One to Three, Testing"
ITWorld.com (06/10/05); McGrath, Sean

Software development in the past was typically driven by consumer demand and business requirements, but, fueled by best practices and codification, it is now heading toward a stage where testing for bugs, or the lack thereof, is the top priority. The testing process has been helped by dynamic programming languages such as Python, Jython, Smalltalk, and Scheme, which allow programmers to input instructions as software is running and to change software after it has been deployed. XML allows programmers to capture the structure of the data to be processed in a machine-readable contract language called a schema; software can then be instructed at run time to check whether the data meets expectations. This process of continuous testing allows for quicker software development and improved products. Though static programming languages such as Java and C++, in which instructions are compiled into an unchangeable machine-readable form, still dominate the software development landscape, a shift is underway, precipitated by improved testing capabilities.
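
A minimal sketch of that run-time contract checking, assuming the third-party lxml library for XML Schema validation (any validating parser would do): the schema is the machine-readable contract, and the program checks incoming data against it before processing.

    from lxml import etree

    SCHEMA = etree.XMLSchema(etree.fromstring(b"""
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="order">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="sku" type="xs:string"/>
            <xs:element name="quantity" type="xs:positiveInteger"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>
    """))

    def process(order_xml: bytes) -> None:
        doc = etree.fromstring(order_xml)
        if not SCHEMA.validate(doc):   # does the data meet expectations?
            raise ValueError(str(SCHEMA.error_log.last_error))
        print("processing sku", doc.findtext("sku"))

    process(b"<order><sku>A-1</sku><quantity>3</quantity></order>")   # accepted
    process(b"<order><sku>A-1</sku><quantity>-2</quantity></order>")  # rejected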

From ACM's TechNews, June 13, 2005

"Automatic Source Code Review Is Development Tools' Next Frontier"
eWeek (06/06/05) Vol. 22, No. 23, P. D6; Coffee, Peter

Automatic source code review, in which a programmer's work is compared against an expanding archive of coding standards, is increasingly necessary for those responsible for equipping development teams. Managers should include convenient customization and extension of rule bases among their selection criteria, and target clear and consistent support for coding standards as an overall goal, writes Peter Coffee. Developers and large development teams are codifying ever richer and more expansive volumes of knowledge and practice to facilitate fast and precise verification by automated tools. The National Institute of Standards and Technology's Software Assurance Metrics and Tool Evaluation (SAMATE) project is one initiative software tool purchasers can refer to. Last month the SAMATE engineers issued a project plan that considers the potential application of automated tools at various points in a software project's life cycle, with a particular emphasis on the "assessment, auditing, and acceptance" stage. Source code scanning products listed on the SAMATE site--some open source and some proprietary--address varying degrees of standards compliance. Coffee says development managers should be mindful of Perl programming expert Teodor Zlatanov's advice in "The Road to Better Programming" that "a programmer shouldn't be required to follow precise code guidelines to the letter; nor should he improvise those guidelines to get the job done."
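
A minimal sketch of the idea, not one of the SAMATE-listed tools: Python's standard ast module walks a program's syntax tree and flags one codified rule, the bare "except:" clause that silently swallows errors. Real products apply hundreds of such rules and, as Coffee advises, let teams customize and extend the rule base.

    import ast

    RULE = "avoid bare 'except:'; catch specific exception types"

    def review(source: str, filename: str = "<input>") -> list:
        """Return one finding per rule violation in the given source."""
        findings = []
        for node in ast.walk(ast.parse(source, filename)):
            # A bare "except:" has no exception type attached.
            if isinstance(node, ast.ExceptHandler) and node.type is None:
                findings.append(f"{filename}:{node.lineno}: {RULE}")
        return findings

    sample = "try:\n    risky()\nexcept:\n    pass\n"
    for finding in review(sample, "sample.py"):
        print(finding)   # sample.py:3: avoid bare 'except:'; ...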

From ACM's TechNews, June 10, 2005

"Developers Should Carry the Banner of Software Standards"
eWeek (06/06/05) Vol. 22, No. 23, P. D1; Coffee, Peter

Computers and the software applications they run are no longer novelties but critical components of people's lives and work, writes Peter Coffee; as such, commercial software developers should assume responsibility for their products in the same way electricians are held responsible for using certain grades of wire and consumer electronics makers guarantee that devices are safe for use at a particular voltage. Such standards are taken for granted in industry and commerce, and they ensure the efficient exchange of complex products and services without requiring customers to certify and inspect every aspect of their purchase. Currently, however, the software industry hides behind end-user license agreements (EULAs) that relieve vendors of explicit or implied warranty claims, even though those disclaimers are often rendered invalid by the federal Magnuson-Moss Warranty Act or various state laws. For their part, customers need to make their expectations clear. Software standards that allow programmers to create their own useful licenses are needed, while large corporate and public-sector customers can help foster the development of acceptable practices by understanding current standards and incorporating them into purchase orders. An example is Section 508 of the Rehabilitation Act, which requires that technology in the federal government be accessible to disabled users.

From ACM's TechNews, June 6, 2005

"FBI Pushed Ahead With Troubled Software"
Washington Post (06/06/05) P. A1; Eggen, Dan

A confidential report to the House Appropriations Committee indicates that the FBI was aware that its $170 million Virtual Case File (VCF) system was highly flawed, but willfully pressed on with a $17 million pilot program last December, even though by then it was obvious that the software would have to be discarded. This was just one of many instances over the course of the VCF's development in which the bureau knowingly passed up the opportunity to terminate the program before incurring significant financial losses, according to the study. The case file management system was part of a massive effort to upgrade the FBI's communications network, and the report indicates that some officials noticed problems with the VCF in early 2003, stemming from "contracting and program management oversight"; in December 2003, functional and technical problems were cited when Science Applications International (SAIC) delivered its first batch of software to the bureau. The FBI had spotted 400 problems by March of last year but did not disclose them to the contractor, while an official in the FBI's Cyber Division recommended an independent audit to address "serious concerns" about the project's status. Some officials say the FBI proceeded with the testing phase of the VCF despite recommendations to jettison the software because the bureau felt it had to show something for its efforts. Complaints from SAIC officials about frequent FBI management turnover and design changes were verified by the House investigation. FBI officials recently announced that many of the deficiencies outlined in the report will be addressed through a sweeping reorganization of the bureau and through the Sentinel program, which is based on off-the-shelf software.

From ACM's TechNews, May 9, 2005

"Silver Bullets for Little Monsters: Making Software More Trustworthy"
IT Professional (04/05) Vol. 7, No. 2, P. 9; Larson, David; Miller, Keith

Professors David Larson and Keith Miller of the University of Illinois at Springfield argue that while a single solution for all software development problems may be a pipe dream, existing "silver bullets" can solve a few problems in the near term as long as developers select them carefully. This allows developers to address specific software defects and then confirm that such defects have been eliminated. The authors detail three defects for which solutions already exist: memory leaks, buffer overflows, and files that remain open when a program terminates. Larson and Miller recommend the eradication of memory leaks before the rollout of commercial software, and list a broad range of solutions, including generic tools such as mtrace, YAMD, and Valgrind; the C Model Checker, which can scan the actual code without a programmer-developed model; and a pointer analysis framework proposed by Bernhard Scholz, Johann Blieberger, and Thomas Fahringer that detects leaks via static analysis of program behavior. The more memory leaks are eliminated, the more trustworthy and reliable the software becomes. Larson and Miller cite several papers detailing techniques for identifying conditions that give rise to buffer overflows, including the application of "randomly deformed data streams," analysis of errors injected into the software, and checking of basic-block signatures to confirm that executing instructions are not malicious. The unclosed-file problem can be addressed with available static analysis tools that are comparatively cheap in terms of programming or computer time. One such tool is Microsoft's Slam, which takes a C program as input and verifies its characteristics using a rule authored in the Specification Language for Interface Checking.
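
As a small illustration of the unclosed-file defect (an invented example, not taken from the article's tools): a static analyzer would flag the first function below because the handle leaks whenever an exception fires before close() runs, while the context-manager version closes the file on every path, normal or exceptional, which is the property such tools verify.

    def count_lines_leaky(path: str) -> int:
        f = open(path)
        n = len(f.readlines())   # if this raises, f is never closed
        f.close()
        return n

    def count_lines_safe(path: str) -> int:
        # "with" guarantees the file is closed, normally or on error.
        with open(path) as f:
            return len(f.readlines())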

From ACM's TechNews, May 5, 2005

"Summit Calls for 'National Software Strategy'"
PRNewswire (05/05/05)

A national software strategy is needed to improve software trustworthiness, empower the U.S. software workforce, reinvigorate software research and development, and encourage software industry innovation, according to a report from the 2nd National Software Summit (NSS2). Software is of extreme importance to the nation because of its role in supporting critical infrastructure, including communications, transportation, finance, and the electrical grid. But compared with what can be built using known best practices, today's software products are unacceptably vulnerable to error and malicious disruption, said Center for National Software Studies (CNSS) President Alan Salisbury, who is also the Army official in charge of U.S. Army software. "For far too long we have simply accepted poor quality software as a fact of life. The real facts are that we know how to build much better quality software today, and we need to invest in even better software engineering for the future," he said. Several critical gaps compromise software development today: the lack of adequate development tools and technology needed to build error-free software, the failure to apply best practices in software development, and a software workforce facing significant threat from overseas competition. The NSS2 report recommended that government, industry, and academic representatives create a National Software Strategy that would ensure the competitiveness of the U.S. software industry while enabling the routine development of trustworthy software products and systems. Implementation of the strategy would then be governed by a similarly representative National Software Strategy Steering Group that would meet approximately every three years.

Page Owner: Professor Sauter (Vicki.Sauter@umsl.edu)
© Vicki L. Sauter. All Rights Reserved.