Economics of Software Bugs: Facts You Need To Know
- July 24, 2014
Understanding the effective cost of software bugs is essential for any software development vendor or customer. Without getting too technical with numbers, there are basic facts that everyone should be aware of, especially top management and the CXOs.
Software producers need to analyze the work hours consumed over the development lifecycle to truly grasp how much time, money & headache it costs them to fix bugs late in the cycle that could have been fixed cheaply at the beginning. Lack of such foresight means their clients pay the price in terms of time invested, cost of bug-fixing, and brand dilution.
This comes back to hurt software developers yet again, when their clients move to a more organized development house.
Software bugs are costing the U.S. economy an estimated $59.5 billion each year, with more than half of the cost borne by end users and the remainder by developers and vendors, according to a new federal study conducted at NIST.
Birth of a Software Bug
The software development lifecycle moves through multiple stages to produce a working application. There are two costs that are very important to know, and both of them grow exponentially as you proceed through the lifecycle:
a) Cost of bug identification
b) Cost of bug fixing
It’s only logical to test and fix all the bugs at each stage, yet many software development teams overlook this basic rule. The later you find a bug in the development cycle, the more costly it is to fix. The primary reasons for this behavior are as follows.
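The exponential growth claimed above can be illustrated with a rough back-of-the-envelope sketch. The base cost and the per-stage multipliers below are assumed purely for illustration; they echo commonly cited industry rules of thumb, not figures from this article:

```python
# Illustrative sketch: how the cost of fixing ONE bug grows by lifecycle stage.
# BASE_FIX_COST and the multipliers are ASSUMED values, not data from the article.
BASE_FIX_COST = 100  # assumed dollars to fix a bug caught during requirements

STAGE_MULTIPLIER = {
    "requirements": 1,
    "design": 5,
    "coding": 10,
    "testing": 20,
    "production": 100,
}

for stage, multiplier in STAGE_MULTIPLIER.items():
    print(f"{stage:>12}: ${BASE_FIX_COST * multiplier:,}")
```

Whatever the exact multipliers are for a given project, the shape of the curve is the point: the same defect that costs a trivial amount to fix during requirements can cost orders of magnitude more once it reaches production.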
Lack of Thorough QA Processes
Often there is no formal testing process in place, and many times no test plan either. Testing is deferred until the end of development, which absolutely should not be the case.
Testing should be planned during analysis & design, and the plan should cover every stage of development. The cost of bug fixing can be significantly reduced if bugs are fixed in the same development stage in which they are found.
Inadequate Software Testing Infrastructure
Development teams are focused on their development environment and seldom establish a separate testing environment. Whether it's due to not realizing that a separate test environment is needed, or due to lack of resources because replicating hardware is expensive, the end result is the same.
A lack of understanding of the target audience is also a culprit. If your app is never tested in the environments where most of your users will actually run it, you can never be sure it will perform smoothly.
Users access applications through many different combinations of browsers, operating systems, and devices, which makes it difficult to establish that many testing environments.
An independent test environment is critical to successful testing and to nailing down issues before you are hit with surprises in production.
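To see why covering every medium is hard, consider the combinatorics. The browser, OS, and device lists below are hypothetical placeholders; a real project would substitute its own support matrix:

```python
from itertools import product

# Hypothetical support matrix -- these lists are examples, not from the article.
browsers = ["Chrome", "Firefox", "Safari", "IE"]
operating_systems = ["Windows", "macOS", "Linux", "Android", "iOS"]
devices = ["desktop", "tablet", "phone"]

# Every (browser, OS, device) combination is a distinct environment
# in which the application could be used.
environments = list(product(browsers, operating_systems, devices))
print(len(environments))  # 4 * 5 * 3 = 60 combinations
```

Even this modest matrix yields 60 environments; in practice teams prune it to the combinations their users actually have, which is exactly why understanding the target audience matters.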
Inexperienced Software Testing Engineers
A quick round of user testing may be the only check many development teams perform before releasing an application. This may work out of luck, but it is not recommended.
Nothing beats experienced testing input. Experienced testers have a holistic understanding of multiple aspects such as functionality, performance, and security.
“Improvements in testing could reduce the cost by about a third, or $22.5 billion, but it won’t eliminate all software errors, the study said. Of the total $59.5 billion cost, users incurred 64% of the cost and developers 36%.”
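The figures quoted from the study can be sanity-checked with simple arithmetic:

```python
# Figures quoted from the NIST study cited above.
total_cost_billion = 59.5
user_share, developer_share = 0.64, 0.36

user_cost = total_cost_billion * user_share            # cost borne by end users
developer_cost = total_cost_billion * developer_share  # cost borne by developers/vendors

# The study's estimated savings from better testing, as a fraction of the total.
savings_billion = 22.5
savings_fraction = savings_billion / total_cost_billion

print(round(user_cost, 1), round(developer_cost, 1), round(savings_fraction, 2))
```

The split works out to roughly $38.1 billion borne by users and $21.4 billion by developers and vendors, with the projected testing improvements recovering a bit over a third of the total.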
Testing is not a one-time activity. It's a continuous process to ensure the quality of your software. The costs can be well managed if there is a clear strategy for executing the testing cycles, with well-defined outcomes and expectations laid out.
What are your thoughts on the subject? How can the importance of software testing be highlighted so that the cost of software bugs can be reduced?