We have decades of secure code development training behind us, the refinement of secure coding practices, and new application security testing and development tools coming to market annually. Yet one of the biggest and oldest barriers to developing secure software remains. What is it?
It's the complexity of these tools, how they are managed, and how they integrate that creates unnecessary drudgery for development teams. We force developers to slog through so much that it keeps them from doing their actual job, which is to develop great software that people want to use.
This is what usually happens: As developers busily do their thing and build new applications and features, someone from security comes into the room and explains that a software assessment tool will be inserted into their pipeline. The tool throws off all types of false positives and irrelevant results, which, from the perspective of developers, only gets in the way and slows them down. If this process were ever scalable, it's certainly not scalable in modern continuous delivery environments.
To succeed, security teams must take radically different — and more effective — approaches to help developers build more secure applications. This was what we did at Capital One, with considerable success.
Streamlining Application Security Processes
We had more than 3,500 different application teams, within seven separate lines of business, developing roughly 3,500 applications within their own continuous delivery pipelines. Each of these teams had significant flexibility regarding the tools that they chose to use and what languages they used for development. While productive, it was a form of managed chaos, with seven fully independent lines of business each essentially doing their own thing.
Our application security team, on the other hand, essentially consisted of a small pool of consultants. We were spending an inordinate amount of time just trying to get the Web application security tools up and running with each of the 3,500 teams. In addition to getting the software security assessment tools in place, the consultants would reach out to each application team to provide consulting and training. With so many development teams and different tools and languages in place, it just wasn't scalable.
Another significant challenge for the application security team was the lack of a stick they could use to enforce good security development hygiene. Our security consultants would reach out to each development team and attempt to engage them for training, consult on effective development security practices, and try to convince them about the need to change practices. There was no way to force the development teams to actually engage and make these efforts work.
Fortunately, every developer and team at Capital One truly wanted to develop secure code. Unfortunately, the processes we had in place were too slow to be reasonably effective and timely. It would take three months from the initial contact with a development team to actually install and train the team on how to use the security assessment software. Not good enough.
The Solution? We Had to Transform the System
So, we did what developers do: we built an app for that. Then we provided teams an option that removed the burden of having to run their own application assessment tools. It was software security assessment-as-a-service for these internal teams, and all developers needed to do was sign up.
To secure their code, development teams would log into the system, send a single command through the API, and have themselves and their app registered for assessment. The system would then automatically pull the compiled code artifacts needed for the assessment, identify requirements and policy, and then orchestrate and manage the third-party scanning tools on the back end. The results from the assessment were then fed to the developer's dashboard or retrieved through an API. The good news for developers: aside from registering the app, they didn't have to do anything, and they got high-quality assessment results for minimal effort.
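The register-then-orchestrate flow described above can be sketched roughly as follows. This is a minimal illustration, not the actual Capital One system: all of the function names, the registry structure, and the stubbed scanner are assumptions made for the example.

```python
# Hypothetical sketch of an assessment-as-a-service flow.
# register_app is the only step a developer performs; everything
# after it runs on the service's back end.

def register_app(registry, team, app_name, artifact_uri):
    """The single registration call developers make via the API."""
    registry[app_name] = {"team": team, "artifact": artifact_uri}
    return app_name

def pull_artifacts(artifact_uri):
    """Stub: the service fetches compiled artifacts itself."""
    return {"uri": artifact_uri, "bytes": b"<compiled code>"}

def run_static_scan(artifact):
    """Stub for a third-party scanner orchestrated on the back end."""
    return [{"id": "SQLI-1", "severity": "high"}]

def run_assessment(registry, app_name):
    """Orchestrate: pull artifacts, run tools, return results."""
    entry = registry[app_name]
    artifact = pull_artifacts(entry["artifact"])
    findings = run_static_scan(artifact)
    return {"app": app_name, "findings": findings}

registry = {}
register_app(registry, "payments", "billing-api",
             "s3://artifacts/billing-api.jar")
report = run_assessment(registry, "billing-api")
```

The key design point is that the developer's surface area is one call; the artifact pull and tool orchestration are entirely the service's concern.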
The application security team also used their expertise to filter assessment results so developers weren't burdened with irrelevant outputs and false positives. Developers knew that the results they received were legitimate security concerns that needed to be resolved.
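That triage step can be thought of as a filter the security team curates between raw scanner output and the developer dashboard. The suppression list and severity categories below are illustrative assumptions, not the team's actual rules.

```python
# Hypothetical triage filter: suppress known false positives and
# low-value noise before developers ever see the findings.

KNOWN_FALSE_POSITIVES = {"XSS-999"}            # curated by the security team
ACTIONABLE_SEVERITIES = {"critical", "high", "medium"}

def triage(raw_findings):
    """Keep only findings developers should actually act on."""
    return [
        f for f in raw_findings
        if f["id"] not in KNOWN_FALSE_POSITIVES
        and f["severity"] in ACTIONABLE_SEVERITIES
    ]

raw = [
    {"id": "SQLI-1", "severity": "high"},
    {"id": "XSS-999", "severity": "high"},     # known false positive
    {"id": "INFO-2", "severity": "info"},      # noise, not actionable
]
actionable = triage(raw)   # only SQLI-1 survives triage
```

Because the filter lives in the service rather than in each team's pipeline, the security team can tune it once and every registered application benefits.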
Of course, we didn't roll this out all at once. Initially, we worked with one of the two largest teams at Capital One. Our goal in that initial pilot was to learn how to make a service that developers would really want to use. We took a novel approach and actually asked the development teams what they would want — including everything from the user interface to presentation of the results to the level of automation. Not so surprisingly, developers appreciated the fact that the system could be fully automated.
What was amazing was that developers loved the system. In fact, they loved it so much that they started telling other development teams.
We began the pilot at the beginning of 2017, and the initial two teams that used the system began telling other teams. As a result, we witnessed registrations accelerate through word of mouth. By the middle of 2017, the system went from two applications registered to 780 applications registered. All the while, we kept improving the service, and developers continued to self-register.
We added additional software security assessment tools. Each new tool performed certain types of assessments better than the others, so they complemented one another. What was most exciting was that, even as we added assessment and code composition tools, we could provide enhanced application security assessments and code review without developers having to change their day-to-day routines.
Results: A Single Pipeline for Code Analysis Tools
The results speak for themselves: the application teams became our customers. And when all was said and done, by the end of 2017, we had 2,600 application development teams enrolled in the system. In contrast, in the year before the system was implemented, the company processed about 12,000 software security assessments. In the year we introduced this system, which we named "security code orchestration," the company ramped up to run that same number of assessments per day and totaled nearly 400,000 software security assessments for the year.
For application security teams, this is a clear win. First, because all the software security assessment tool sets were abstracted away from the engineering teams, we could add or replace software security tools on the back end of the system with ease. This meant we could instantly improve the quality of assessments by improving the quality of our tools across the entire enterprise. The development teams wouldn't even know a change was made. As we evolved the system, it became the single pipeline for code analysis tools, including non-security-related tasks such as code quality and license compliance. As such, we ended up going from security code orchestration to code quality orchestration.
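The "swap tools without developers noticing" property comes from putting every scanner behind a common interface. A minimal sketch of that abstraction, with entirely hypothetical tool names, might look like this:

```python
# Sketch of abstracting scanners behind a shared interface so tools
# can be added or replaced in one place. All tool names are invented.

class Scanner:
    """Common interface every back-end tool must implement."""
    name = "base"
    def scan(self, artifact):
        raise NotImplementedError

class StaticAnalyzer(Scanner):
    name = "static-analyzer"
    def scan(self, artifact):
        return [f"{self.name}: checked {artifact}"]

class LicenseChecker(Scanner):
    name = "license-checker"
    def scan(self, artifact):
        return [f"{self.name}: licenses OK for {artifact}"]

# Swapping or adding a tool means editing only this list; nothing
# changes for the registered development teams.
PIPELINE = [StaticAnalyzer(), LicenseChecker()]

def assess(artifact):
    results = []
    for tool in PIPELINE:
        results.extend(tool.scan(artifact))
    return results

report = assess("billing-api.jar")
```

This is also why non-security tasks like license compliance could slot in later: they're just another `Scanner` in the pipeline.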
Most importantly, the entire organization was able to move from ad hoc, low-level software assessment coverage to more than 80% coverage. And, as it turned out, the software security teams didn't need any kind of stick — developers wanted a secure code pipeline because it was so easy to use and provided them value.