Poor Software Quality Could Prove To Be Costly For US Businesses

Fixing software that was built poorly costs far more than building it correctly the first time. A study by the Consortium for Information & Software Quality (CISQ), in partnership with Synopsys, titled 'The Cost of Poor Software Quality (CPSQ) in the US', estimates that the total cost of poor software quality in the US in FY 2020 was $2.08 trillion. The report puts the cost of software weaknesses in systems, networks, and applications at another $1.31 trillion, a figure that has grown 14% since 2018.

Contributors to poor software quality

The major contributor to the CPSQ, at 75%, is the failure to fix known vulnerabilities in operational software. The second, accounting for $520 billion, is the failure to upgrade legacy system platforms. The third is unsuccessful development projects, which account for $260 billion, a 46% increase since 2018. The report adds that the leading cause behind most project failures is a lack of attention to quality. These data points show that maintaining quality in fast-moving DevOps environments is a delicate task, and that software quality is rarely given the same priority as other organizational objectives. Poor-quality software reduces revenue, increases user error rates, and degrades operational performance. Worse, the cost of poor software quality is set to spiral in the near future, for several reasons.

Managing poor software quality

The Internet of Things (IoT) has put software, much of it of poor quality, into almost everything around us. The report notes that products and services increasingly run on software and are delivered profitably as online services, as Amazon has shown. Emerging technologies already account for 17% of overall global revenue and are expected to drive growth in the coming years. Software, in other words, is crucial not just to individual products and businesses but to the economy as a whole, and sapping $2 trillion from it is a significant drag on growth. Poor-quality software also makes online products, networks, and systems easier to attack, a considerable expense in itself. The report cites a May 2017 estimate that 111 billion lines of new code were being written every year, so the attack surface is expanding rapidly. The US National Institute of Standards and Technology estimates that typical software contains about 25 errors for every 1,000 lines of code, multiplying the vulnerabilities that stem from fundamental coding mistakes. As technology evolves, the attack surface keeps growing: value chains can malfunction in new ways, complexity expands, and the triggers of massive failures become harder to see. The cost of cybercrime is therefore increasing exponentially.
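Taken together, those two figures imply a staggering volume of new defects. As a back-of-envelope illustration, using only the numbers quoted above:

```python
# Back-of-envelope arithmetic using the figures quoted above (illustrative only).
new_code_lines_per_year = 111_000_000_000  # ~111 billion lines of new code per year
errors_per_kloc = 25                       # NIST estimate: ~25 errors per 1,000 lines

implied_new_defects = new_code_lines_per_year / 1_000 * errors_per_kloc
print(f"Implied new defects per year: {implied_new_defects:,.0f}")
# Implied new defects per year: 2,775,000,000
```

Even if only a tiny fraction of those defects are exploitable, the attack surface still grows by millions of weaknesses every year.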

Balancing speed with the security imperative

The report says the best way to reduce the cost of poor-quality software is to prevent it from happening in the first place, or to fix problems and deficiencies as close to the source as possible. The way to do this is well established: build security into the software while it is being developed. How 130 organizations across nine verticals are doing so is documented in Synopsys's Building Security In Maturity Model (BSIMM) report. Security measures must be implemented at every point of the software development life cycle (SDLC). Testing has to happen throughout the SDLC, not just at the beginning or the end, and the key to success is catching defects in the early stages. Security testing rests on five solid measures:

1. Threat modelling and architecture risk analysis help eliminate design flaws.
2. Static, dynamic, and interactive application security testing can find bugs and other defects in the code while the software is being written.
3. Software composition analysis helps developers find and fix vulnerabilities in open-source components.
4. Fuzz testing shows how the software responds to malformed or unexpected inputs (a minimal sketch follows this list).
5. 'Red teaming' or penetration testing that mimics real attackers can expose weaknesses in software products before they are released.
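To make the fourth measure concrete, here is a minimal fuzz-testing sketch in Python. The `parse_config` function is a hypothetical stand-in for whatever code is under test, and the loop is a naive random fuzzer; real-world fuzzing would typically use a coverage-guided tool, but the principle of probing software with unexpected inputs is the same:

```python
import random
import string

def parse_config(text: str) -> dict:
    """Toy parser standing in for real code under test (hypothetical example)."""
    result = {}
    for line in text.splitlines():
        key, value = line.split("=", 1)  # crashes on any line without an "="
        result[key.strip()] = value.strip()
    return result

def fuzz(target, runs: int = 10_000) -> None:
    """Feed random printable strings to `target` and report the first crash."""
    for i in range(runs):
        sample = "".join(random.choices(string.printable, k=random.randint(0, 80)))
        try:
            target(sample)
        except Exception as exc:
            print(f"run {i}: {type(exc).__name__} on input {sample!r}")
            return
    print(f"no crashes in {runs} runs")

fuzz(parse_config)  # quickly surfaces the unhandled ValueError in the parser
```

A handful of random inputs is usually enough to expose the unhandled ValueError here; the same idea, scaled up, is how fuzzers find crashes in parsers, protocols, and file-format handlers before attackers do.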

Prevention is the best investment

Automation and intelligent orchestration tools help developers run the right test at the right time, in the right stage of the SDLC, saving both time and cost. They also allow developers to fix bugs and other defects in real time without being swamped by false positives or irrelevant defect notices. Finally, the report says the biggest bang for CPSQ investment money comes from preventing as many defects as possible, as early as possible, while they are still cheap to fix.
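As a rough sketch of what such orchestration can look like in practice, the script below maps pipeline stages to security checks, using two widely used open-source Python tools (bandit for static analysis, pip-audit for dependency scanning). The stage names and the stage-to-check mapping are assumptions for illustration, not the report's prescription:

```python
"""Minimal sketch: run the right security check at the right pipeline stage.

Assumes two real, commonly used tools are installed:
  pip install bandit pip-audit
The stage names and mapping below are hypothetical, for illustration only.
"""
import subprocess
import sys

CHECKS = {
    # Fast static analysis on every commit, so defects are fixed near the source.
    "commit": [["bandit", "-r", "src/"]],
    # Heavier dependency auditing on merge (or nightly), when more time is available.
    "merge": [["bandit", "-r", "src/"], ["pip-audit"]],
}

def run_stage(stage: str) -> int:
    for cmd in CHECKS.get(stage, []):
        print(f"[{stage}] running: {' '.join(cmd)}")
        if subprocess.run(cmd).returncode != 0:
            return 1  # fail the pipeline early, while the fix is still cheap
    return 0

if __name__ == "__main__":
    sys.exit(run_stage(sys.argv[1] if len(sys.argv) > 1 else "commit"))
```

Wiring a script like this into CI means fast checks gate every commit, while slower, deeper scans run where they cost the least developer time.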

IoT security systems are increasingly becoming prime targets for cyberattacks, and the consequences of a successful attack on OT or IoT systems can be devastating: loss of life, property damage, and financial losses. Cybalt's IoT security solutions can go a long way toward protecting your systems, bringing together industry experts, OT- and IoT-specific solutions, and mature processes to monitor and secure client IoT environments against cyberattacks.
