When runtime application self-protection is held to a higher standard, it can secure thousands of applications and prevent burnout in security teams.

Doug Ennis, CEO, Waratek

April 4, 2023

5 Min Read

In recent years, the application security world has seen the rise of runtime application self-protection (RASP) technology. As described by Gartner, RASP is a security technology that is integrated into an application or its runtime environment and is capable of controlling application execution and preventing attacks in real time. Unfortunately, many Web application firewall (WAF) companies saw an opportunity to leverage the term. They introduced "RASP-like" agents that operate at the network layer, which doesn't fully live up to the definition of RASP technology.

In contrast, genuine RASP technology operates at the application layer, where it has full context of the user, application logic, and domain information. This context allows a RASP to make informed decisions about the application's security and to prevent exploits before they can cause harm. As a result, a true RASP should have zero false positives and reduced latency, providing an immediate boost in performance. True RASP requires a set of immutable rules that use context to recognize when a new vulnerability is introduced and act accordingly. This immutability is possible when the rules are baked into the code base at the application layer and do not require any changes once deployed.

Three Areas Where RASP Went Wrong

1. The Barking Dog Problem: Most Alerts Are False Positives

The issue with WAFs is that they work at the network layer, a lagging indicator of application execution. The resulting lack of context leads to high false-positive rates, long wait times, and poor performance, because WAFs can only guess at the nature of a vulnerability based on what they have previously been exposed to.

Imagine a guard dog in the yard barking whenever it hears a noise beyond the fence. These noises may signal the approach of an intruder, but they could also be cars passing by. The guard dog cannot tell the difference, so the severity of any given noise is lost, making it impossible for the people inside the house to know which alerts are genuine and which are false positives. This is essentially what the standard RASP offering delivers.

2. The 999 Bad Guys Problem: Only Capable of Testing a Sample

Believe it or not, some vendors tell you to run their security solution in production environments only if you limit protection to a sample of traffic. The tool pulls a sample, perhaps one in every 1,000 requests, and tests only that sample while the next 999 pass through unchecked. If you're a good actor, your signature will check out, but whether or not the following 999 actors have bad intentions, they will get through. This inconsistency exists because WAF-based RASPs can't handle the performance cost of testing every request.
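To make that trade-off concrete, here is a minimal sketch in Java of what sample-mode inspection amounts to. It is purely illustrative, not any vendor's actual code, and the class and method names are hypothetical: only one request in every 1,000 is analyzed, and the rest are forwarded without inspection.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sample-mode inspector: checks one request in every SAMPLE_RATE.
final class SampledInspector {
    private static final int SAMPLE_RATE = 1_000; // assumed 1-in-1,000 sampling
    private final AtomicLong counter = new AtomicLong();

    private boolean shouldInspect() {
        // Every 1,000th request is analyzed; the other 999 are waved through.
        return counter.incrementAndGet() % SAMPLE_RATE == 0;
    }

    void handle(String request) {
        if (shouldInspect()) {
            analyze(request);   // expensive out-of-process analysis
        }
        forward(request);       // the remaining requests are never checked
    }

    private void analyze(String request) { /* send to the analyzer */ }

    private void forward(String request) { /* pass to the application */ }
}
```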

3. The "It Takes Too Long" Problem: Latency Affects Performance

Any time you have a WAF-based RASP, you experience increased latency, because it sits outside the application and cannot work within its code base. Widely available RASPs of this kind have to send entire text payloads to their Web analyzer and then wait for the verdict to come back, which can take a long time. And if your customers are waiting on payments to go through, they might give up and seek out your competitors instead.

Improving this process is similar to code optimization. When building a list, developers often add new items to the beginning of the list instead of the end. This optimization keeps the VM from rebuilding the entire list each time an item is added, so latency doesn't grow as the list does (see the sketch below). Compiler engineers tackled the same class of problem when they introduced just-in-time (JIT) compilation in the early 2000s, which automatically optimizes code based on the nuances of the given language.
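As a minimal illustration of that list-building point, here is a Java sketch assuming an immutable, linked "cons" list rather than any particular runtime's implementation: prepending shares the existing nodes in constant time, while appending forces the whole list to be rebuilt on every insert.

```java
// Illustrative immutable singly linked list: prepend is O(1), append is O(n).
final class ConsList<T> {
    final T head;
    final ConsList<T> tail;

    ConsList(T head, ConsList<T> tail) {
        this.head = head;
        this.tail = tail;
    }

    // O(1): the new node simply points at the existing list; nothing is copied.
    static <E> ConsList<E> prepend(E item, ConsList<E> list) {
        return new ConsList<>(item, list);
    }

    // O(n): every existing node must be copied to put the new item at the end,
    // so the cost of each insert grows with the size of the list.
    static <E> ConsList<E> append(ConsList<E> list, E item) {
        if (list == null) {
            return new ConsList<>(item, null);
        }
        return new ConsList<>(list.head, append(list.tail, item));
    }
}
```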

Why Has the Definition of RASP Been So Watered Down?

Developing RASP technology requires a combination of security engineering and software engineering skills. To be effective, the RASP developer must deeply understand the application's architecture and the nuances of the programming language being used. This requires domain expertise that is rare among security professionals.

True RASP Optimizes Code for Performance as Well as Security

Because most RASP platforms behave like WAFs, there's a massive overhead involved, which requires running them in sample mode. In contrast, a genuine RASP performs the actual protection in the runtime.

These operations happen in memory and in the same process space as your applications, so they are highly performant. Because protection is performed in the runtime and each check takes just a few milliseconds, there is no need to rate-limit traffic or fall back to protecting only a sample.

Regardless of any changes made to the application, high-performance security remains constant. This approach aligns with the philosophy of infrastructure-as-code, in which you define the desired state of your infrastructure and, no matter what happens in the environment, that state remains the same.

RASP, by definition, parallels many principles of infrastructure-as-code. This parallel is possible due to the deep contextual awareness of the application and the language in which it's built. Like infrastructure-as-code, a genuine approach to RASP can and should make use of immutability to ensure that rules are applied regardless of changes to the codebase.

Immutability is achieved by checking the output of a function the first time it's called and swapping out any unhealthy functionality for protected functionality, ensuring that the application is always healthy as it runs.
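Here is a minimal Java sketch of that first-call check-and-swap pattern. It is an assumption about what such a mechanism could look like, not Waratek's implementation; the class name, the health check, and the hardened replacement are all hypothetical.

```java
import java.util.function.Function;

// Hypothetical wrapper around a risky operation: the first time an unhealthy
// result is observed, the implementation is swapped for a protected variant,
// and every subsequent call runs the safe version.
final class SelfHealingOperation {
    private volatile Function<String, String> impl;

    SelfHealingOperation(Function<String, String> initial) {
        this.impl = initial;
    }

    String invoke(String input) {
        String output = impl.apply(input);
        if (!isHealthy(output)) {
            impl = protectedVariant();   // swap in the protected functionality
            output = impl.apply(input);  // re-run the call through the safe path
        }
        return output;
    }

    // Hypothetical health check: flag outputs that look like path traversal.
    private boolean isHealthy(String output) {
        return output == null || !output.contains("..");
    }

    // Hardened replacement used once the original is deemed unhealthy.
    private Function<String, String> protectedVariant() {
        return in -> in == null ? null : in.replace("..", "");
    }
}
```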

This approach makes security deployment-agnostic and requires no changes to the application code, no tuning, and no waiting for deployment windows.

Because protection is performed in the runtime, a patch takes effect immediately on every running instance of the application, eliminating the constant stream of false positives and removing the risk of future exploits.

RASP Can and Should Be Held to a Higher Standard

In short, RASP should be held to a higher standard. When it is, it becomes possible to secure thousands of applications, lower the total cost of ownership of your WAFs, and help prevent burnout in your security teams.

About the Author

Doug Ennis

CEO, Waratek

Doug Ennis, CEO of Waratek, has more than 20 years of experience spanning internal IT, consulting, and sales. He has developed a deep understanding and appreciation of cybersecurity and the networking landscape. Doug has held sales leadership roles at companies focused on data privacy, network security, application performance management, and mobile device security and management. He received his Master of Science in Information Technology and Security from Capella University and his Bachelor's degree in Computer Science from John Carroll University.
