4/19/2017 10:30 AM
Scott Petry
Commentary

The Architecture of the Web Is Unsafe for Today's World

The Internet is based on protocols that assume content is secure. A new, more realistic model is needed.

Twenty-eight years ago, British computer scientist Tim Berners-Lee proposed a system to link text documents across a computer network. It changed the way the world communicates and does business. From its humble beginnings, the Internet has become a complex, dynamic, and heterogeneous environment.

Today, the Internet revolution's main instrument, the Web browser, exposes users to unbounded malicious content and has become unmanageable.

How did browsers become such a liability? Because they're based on an ancient set of communication rules, protocols that assume connections are secure and content is safe. The openness and utility of the protocols led to enormous innovation. But today, with all its sophistication, the Web is still based on protocols that weren't designed for security or enterprise-class management.

Simple Beginnings Led to Enormous Risk
Berners-Lee says he didn't imagine how far his invention would go. In the early '90s, he designed something that today seems obvious: protocols and programs that enable a client application to fetch and display content residing on a remote server. He envisioned a system that would give researchers easier access to scientific documents on servers scattered around the world.

The first protocol would allow two devices to establish a network connection. Today, his Hypertext Transfer Protocol (HTTP) has become the de facto standard for how Internet-connected applications and devices communicate with each other.
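The request/response exchange that HTTP defines can be sketched in a few lines of Python. This is illustrative only: no network I/O happens here, and the hostname and the canned server response are made-up stand-ins for what a real client and server would exchange.

```python
# A minimal sketch of an HTTP/1.1 exchange. "example.com" and the
# canned response are hypothetical; nothing touches the network.
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Connection: close\r\n"
    "\r\n"
)

canned_response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html><body><h1>Hello</h1></body></html>"
)

def parse_response(raw):
    """Split a raw HTTP response into (status line, headers, body)."""
    head, _, body = raw.partition("\r\n\r\n")
    status_line, *header_lines = head.split("\r\n")
    headers = dict(line.split(": ", 1) for line in header_lines)
    return status_line, headers, body

status, headers, body = parse_response(canned_response)
print(status)                    # HTTP/1.1 200 OK
print(headers["Content-Type"])   # text/html
```

The key point the sketch makes is how trusting the exchange is: the client asks, the server answers, and nothing in the protocol itself vouches for what the body contains.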

Berners-Lee's second protocol has become the lingua franca of the Web: the Hypertext Markup Language. HTML defined a basic set of instructions on how to format, link, and pool documents with simple text-based commands embedded in documents. These "tags" would enable seamless cross-referencing, and thus improved collaboration amongst researchers.
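A toy example of what those tags enable: a handful of HTML links, and how little code a client needs to discover the cross-references they encode. The document and URLs below are invented for illustration, using only Python's standard-library HTML parser.

```python
# Illustrative only: a few HTML tags of the kind Berners-Lee defined,
# and a trivial client that extracts the cross-references they encode.
from html.parser import HTMLParser

doc = """
<html>
  <body>
    <h1>Research Papers</h1>
    <p>See the <a href="http://example.org/paper1.html">first paper</a>
       and the <a href="http://example.org/paper2.html">second</a>.</p>
  </body>
</html>
"""

class LinkCollector(HTMLParser):
    """Collects the href attribute of every <a> tag."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

collector = LinkCollector()
collector.feed(doc)
print(collector.links)
```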

Protocols need apps to make them useful, so Berners-Lee built the associated components to create an end-to-end solution. He defined the URL and built the first Web server and the first browser, dubbed WorldWideWeb.

Built for Simplicity and Extensibility, Not Security
With all the pieces in place, a user on one computer could open a browser, connect to a Web server, and request content. The content would be passed back to the browser, which would unpack the data and format it for local interaction.

In Berners-Lee's own words, his system was designed for "a friendly, academic environment." But within a few years, the World Wide Web was no longer his app; the name had become synonymous with the digital revolution. And with a bit of steering by international standards committees, users, technology vendors, content providers, and significantly less "friendly" participants were all in.

By the end of 1995, nearly 50 million people were connected to the Internet. Explosive growth would continue for more than two decades.

Source: Scott Petry

When the Web Got Out of Hand
With the popularity of the Web on the rise, demand for more functionality was growing. Recognizing that standards needed to be established to build on the common infrastructure without breaking interoperability, organizations such as the Internet Engineering Task Force and the World Wide Web Consortium (W3C) quickly expanded the syntax of the underlying protocols. In 1996, the W3C published an outline to support the execution of scripts in the browser.

This was the point of no return for the browser. Instead of rendering formatted text, it would now download programmatic code from third-party servers and execute it locally.
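A rough Python analogy for what that change meant: the client no longer just renders data it receives, it executes whatever program the server sends. Here the "downloaded" code is a harmless local string, but nothing in the mechanism itself constrains what such code could do.

```python
# An analogy for the post-1996 browser: the client executes whatever
# arrives. The "remote" code below is a local string standing in for
# a script downloaded from a third-party server.
remote_code = "result = 2 + 2"   # imagine this arrived over HTTP

namespace = {}
exec(remote_code, namespace)     # the client runs untrusted code locally
print(namespace["result"])       # 4 -- but the code could do anything
```

The benign arithmetic is the whole point: the executing client has no way to know in advance whether the payload computes a sum or exfiltrates data.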

Source: Scott Petry

Building on these enhancements, commercial vendors would expand the capabilities of the browser. Plug-ins were developed to render video, applications were rewritten to support Java or ActiveX, and with the emergence of JavaScript, code execution went native.

Source: Scott Petry

By the end of 1997, the browser had evolved well beyond Berners-Lee's concept. The browser had become a dashboard to a vast array of interconnected services and the user interface for business applications. But it was still built on his vision of a "friendly environment," where the browser would connect to a remote host and execute whatever payload the host delivered.

"Unfriendlies" See Opportunities, Too
The propagation of viruses over computer networks had historical precedent. The exploitation of systems has evolved alongside interconnectivity since the dawn of computers.

The Morris Worm is credited with being one of the first malicious programs distributed over the Internet. Written by Robert Morris, a graduate student at Cornell University, it was launched in November 1988. Morris claims to have written the worm to measure the size of the Internet. Regardless, it was so successful that major sections of the network had to be disconnected in order to contain it.

The situation has only gotten worse. The sophistication and volume of Internet-based exploits continue to grow. Motivated hackers go after personal information such as banking credentials, healthcare data, tax refunds, and more. And state-sponsored and corporate espionage attacks target individuals and information sources in ways meant to disrupt businesses or entire economies.

All of these things are exploiting the common, wonderful, but woefully unprepared protocols upon which the Internet is built.

Can the Genie Go Back in the Bottle?
With nearly 4 billion users online connecting to more than 1 billion websites, there is no chance that the fundamental protocols will be retrofitted with security controls. But that doesn't mean organizations can't protect themselves.

A new model for accessing the Web is taking shape. It allows users to interact with Web-based content without exposing their environment to exploit and lets IT manage the browser centrally for security, governance, and compliance.

It's called browser isolation: run a browser on a remote, virtual host, where all the Web content is contained in a secure, disposable environment. Interaction with the virtual browser happens over an encrypted remote display protocol. This neuters Web threats: because only pixels reach the endpoint, users can't bring ransomware, drive-by downloads, or other malicious content into their network or onto their devices.
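The architecture can be sketched conceptually. This is not any vendor's actual implementation: the point is only the shape of the boundary, where untrusted content is fetched and executed in a disposable remote environment, and nothing but an opaque frame of pixels crosses back to the endpoint.

```python
# A conceptual sketch of browser isolation, not a real product's code.
# Fetching, rendering, and script execution all happen "remotely";
# the endpoint only ever receives displayable pixels.
def remote_render(untrusted_html):
    """Runs in the disposable remote environment, away from the user."""
    # In a real system the content would be fetched and executed here.
    # We fake the result as a small RGB framebuffer of black pixels.
    width, height = 4, 2
    pixels = bytes(width * height * 3)   # 3 bytes (R, G, B) per pixel
    return pixels                        # no HTML or scripts ever leave

# Even hostile input produces nothing the endpoint can execute:
frame = remote_render("<script>steal_cookies()</script>")
print(type(frame), len(frame))
```

The design choice worth noting is that safety comes from the data type of what crosses the boundary: a framebuffer can be displayed but not executed, regardless of what produced it.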

Industry analysts are increasingly recommending this approach, and a recent report suggests strong growth in this category. Enterprises are adopting browser isolation platforms to outsource their attack surface. You should, too.

Scott Petry is Co-Founder and CEO of Authentic8. Prior to Authentic8, Scott founded Postini and served in a variety of C-level roles until its acquisition by Google in 2007. He served as Director of Product Management at Google until 2009. Prior to Postini, Scott was General ...