The more frequently you release apps, the more security vulnerabilities you are likely to introduce in the code, a new study confirms.

The frequency with which you release and update software has more of an impact on application security than factors like code size and whether you are developing your apps in-house or offshore, according to new research.

CAST Research Labs recently analyzed a total of 1,388 applications developed in either Java EE or .NET. The company ran some 67 million rule-checks against a combined 278 million lines of code and unearthed 1.3 million weaknesses.

The exercise showed once again, as many have been saying for years, that while agile practices can accelerate application delivery and make it easier for developers to adapt to changing requirements, they can also heighten security risks.

Specifically, CAST Research found that Java EE applications released more than six times per year tended to have a significantly higher density of known security weaknesses (Common Weakness Enumeration, or CWE) than code released less than six times per year.

CAST's analysis showed that CWE density in Java EE applications remained fairly consistent regardless of the development methodology itself. In other words, Java EE applications developed using an agile/iterative model had roughly the same vulnerability densities as applications developed using a hybrid waterfall-agile method or a pure waterfall approach. What really made a difference to security was the frequency of updates and releases.

Interestingly, the results were statistically different for .NET applications: those developed using a traditional waterfall approach had a much higher CWE density than applications developed with agile, hybrid, or even no formal methods at all.

"In Java we found that financial services and telecom had the highest densities, and that applications released to production more than six times per year were particularly vulnerable," says Bill Curtis, SVP and Chief Scientist at CAST Research Labs.

Meanwhile, other factors such as application size and where the development work is done had less of an impact on vulnerability density.

Generally, the larger the codebase, the more opportunities developers have to introduce coding errors such as SQL injection and cross-site scripting flaws, so larger applications tend to have more security vulnerabilities in absolute terms than smaller ones. But vulnerability density, or the number of weaknesses per thousand lines of code, remained the same regardless of application size, CAST's analysis showed. The same was true of the source of the code.
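
To make that metric concrete, here is a minimal sketch of how a per-KLOC density figure is derived, using the aggregate numbers CAST reported. The class and method names are illustrative only and are not part of CAST's tooling or methodology.

```java
// Illustrative sketch only: shows how a CWE-density-per-KLOC figure is computed,
// using the aggregate numbers reported in the study.
public class CweDensityExample {

    // CWE density = weaknesses found per thousand lines of code (KLOC)
    static double densityPerKloc(long weaknesses, long linesOfCode) {
        return weaknesses / (linesOfCode / 1000.0);
    }

    public static void main(String[] args) {
        // Aggregate figures from the study: ~1.3 million weaknesses
        // found across ~278 million lines of code
        long weaknesses = 1_300_000L;
        long linesOfCode = 278_000_000L;

        // Prints roughly 4.7 weaknesses per KLOC for the sample as a whole,
        // in line with the five-or-fewer figure cited for most Java EE apps
        System.out.printf("CWE density: %.1f per KLOC%n",
                densityPerKloc(weaknesses, linesOfCode));
    }
}
```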

"Interestingly, we did not find that whether an application was developed onshore or offshore, or whether it was developed in-house versus outsourced made a difference in CWE density."

CAST's study also showed that .NET applications had, on average, a higher CWE density than Java EE applications. Most of the Java EE apps CAST examined, across industries, averaged five weaknesses or fewer per thousand lines of code.

In contrast, CWE density scores were much higher in .NET applications, especially in certain industries such as energy, insurance, and IT consulting. Many of the .NET applications CAST analyzed had vulnerability densities in the range of 20 to 30 per thousand lines of code.

"We did not expect to see differences between Java and .NET in the pattern of factors related to CWE density, but they emerged," Curtis says.

Application security has become a hot topic. The adoption of agile and continuous release cycles has put pressure on organizations to integrate security testing and processes earlier and throughout the software development lifecycle. The trend is driving new DevSecOps approaches focused on uniting development, security, and operations teams around a common goal. Studies such as CAST's highlight the need for those efforts.

"IT organizations must accept responsibility for providing training in secure architectural and coding practices to those deficient in these skills," Curtis says. 

In addition, organizations need to ensure they are using sound static, dynamic, and penetration testing techniques throughout the development cycle and that all vulnerabilities are patched as soon as possible. Dependencies and interactions with other applications or third-party software should also be investigated for potential security weaknesses.

"Executive management owns the responsibility for ensuring cybersecure capabilities and enforcing cybersecure practices," he says.


About the Author(s)

Jai Vijayan, Contributing Writer

Jai Vijayan is a seasoned technology reporter with over 20 years of experience in IT trade journalism. He was most recently a Senior Editor at Computerworld, where he covered information security and data privacy issues for the publication. Over the course of his 20-year career at Computerworld, Jai also covered a variety of other technology topics, including big data, Hadoop, Internet of Things, e-voting, and data analytics. Prior to Computerworld, Jai covered technology issues for The Economic Times in Bangalore, India. Jai has a Master's degree in Statistics and lives in Naperville, Ill.
