Commentary | Scott Petry | 4/19/2017

The Architecture of the Web Is Unsafe for Today's World

The Internet is based on protocols that assume content is secure. A new, more realistic model is needed.

Twenty-eight years ago, British computer scientist Tim Berners-Lee proposed a system to link text documents across a computer network. It changed the way the world communicates and does business. From its humble beginnings, the Internet has become a complex, dynamic, and heterogeneous environment.

Today, the Internet revolution's main instrument, the Web browser, exposes users to unbounded malicious content and has become unmanageable.

How did browsers become such a liability? Because they're based on an ancient set of communication rules, protocols that assume connections are secure and content is safe. The openness and utility of the protocols led to enormous innovation. But today, with all its sophistication, the Web is still based on protocols that weren't designed for security or enterprise-class management.

Simple Beginnings Led to Enormous Risk
Berners-Lee says he didn't imagine how far his invention would go. In the early '90s, he designed something that today seems obvious: protocols and programs that enable a client application to fetch and display content residing on a remote server. He envisioned a system that would give researchers easier access to scientific documents on servers scattered around the world.

The first protocol allowed two devices to establish a network connection and exchange content over it. Today, his Hypertext Transfer Protocol (HTTP) has become the de facto standard for how Internet-connected applications and devices communicate with each other.
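To appreciate how spare that standard was, consider a minimal sketch in Python of the kind of exchange HTTP codified: open a TCP connection, send a short plain-text request, and read back whatever the server returns. (The HTTP/1.0 request line and the example.com host are illustrative choices, not part of the original design.)

```python
# Minimal sketch of an HTTP exchange: connect, send a plain-text
# request, then read the raw reply until the server closes the socket.
import socket

def fetch(host: str, path: str = "/") -> bytes:
    """Send a bare-bones HTTP/1.0 GET and return the raw response."""
    with socket.create_connection((host, 80)) as sock:
        request = f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"
        sock.sendall(request.encode("ascii"))
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks)

if __name__ == "__main__":
    # Prints the status line and the first response headers.
    print(fetch("example.com")[:200].decode("ascii", "replace"))
```

Everything is human-readable text; nothing in the protocol itself authenticates the server or vouches for what it sends back.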

Berners-Lee's second standard has become the lingua franca of the Web: the Hypertext Markup Language. HTML defined a basic set of instructions for formatting and linking documents, using simple text-based commands embedded in the documents themselves. These "tags" enabled seamless cross-referencing, and thus improved collaboration among researchers.

Protocols need apps to make them useful, so Berners-Lee built the associated components to create an end-to-end solution. He defined the URL and built the first Web server and the first browser, dubbed WorldWideWeb.

Built for Simplicity and Extensibility, Not Security
With all the pieces in place, a user on one computer could open a browser, connect to a Web server, and request content. The content would be passed back to the browser, which would unpack the data and format it for local interaction.
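Python's standard library makes it easy to see how small that end-to-end loop really is. The sketch below (with invented page content) serves a single HTML document containing one hyperlink; point any browser at http://localhost:8000/ and it will fetch and render the page, exactly the fetch-and-display cycle described above.

```python
# A toy web server in the spirit of the original end-to-end model:
# every GET request receives the same small hypertext document.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<html>
<head><title>A linked document</title></head>
<body>
<p>See the <a href="http://info.cern.ch/">first website</a>.</p>
</body>
</html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)  # the browser formats the tags locally

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```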

In Berners-Lee's own words, his system was designed for "a friendly, academic environment." But within a few years, the World Wide Web was no longer his app; the name had become synonymous with the digital revolution. And with a bit of steering by international standards committees, everyone piled in: users, technology vendors, content providers, and significantly less "friendly" participants.

By the end of 1995, nearly 50 million people were connected to the Internet. Explosive growth would continue for more than two decades.

When the Web Got Out of Hand
With the popularity of the Web on the rise, demand for more functionality was growing. Recognizing that standards needed to be established to build on the common infrastructure without breaking interoperability, organizations such as the Internet Engineering Task Force and the World Wide Web Consortium (W3C) quickly expanded the syntax of the underlying protocols. In 1996, the W3C published an outline to support the execution of scripts in the browser.

This was the point of no return for the browser. Instead of rendering formatted text, it would now download programmatic code from third-party servers and execute it locally.
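Stripped of the browser machinery, the new trust model reduces to something like the sketch below: fetch code from a remote server and execute it with local privileges, trusting the server completely. It is deliberately simplified, and it is exactly the pattern no one should run against an untrusted host; that is the point.

```python
# The post-1996 browser model, reduced to its essence: download
# programmatic code from a third-party server and execute it locally.
# WARNING: illustrative only; this grants the server full control.
import urllib.request

def run_remote_script(url: str) -> None:
    """Fetch a script and execute it with the local user's privileges."""
    code = urllib.request.urlopen(url).read().decode("utf-8")
    exec(code)  # the local machine does whatever the server says
```

Browsers have since layered on sandboxes and same-origin rules, but the underlying contract, run whatever the host delivers, has never changed.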

Building on these enhancements, commercial vendors expanded the capabilities of the browser. Plug-ins were developed to render video, applications were rewritten to use Java or ActiveX, and with the emergence of JavaScript, code execution went native.

By the end of 1997, the browser had evolved well beyond Berners-Lee's concept, becoming a dashboard for a vast array of interconnected services and the user interface for business applications. But it was still built on his vision of a "friendly environment," in which the browser connects to a remote host and executes whatever payload the host delivers.

"Unfriendlies" See Opportunities, Too
The propagation of viruses over computer networks had historical precedent; the exploitation of systems has evolved alongside interconnectivity since the dawn of computing.

The Morris Worm is credited as one of the first malicious programs distributed over the Internet. Written by Robert Morris, a graduate student at Cornell University, it was launched in November 1988. Morris claimed he wrote the worm to measure the size of the Internet. Regardless, it spread so effectively that major sections of the network had to be disconnected to contain it.

The situation has only gotten worse. The sophistication and volume of Internet-based exploits continue to grow. Motivated hackers go after personal information such as banking credentials, healthcare data, tax refunds, and more. And state-sponsored and corporate espionage attacks target individuals and information sources with the aim of disrupting businesses or economies.

All of these attacks exploit the common, wonderful, but woefully unprepared protocols on which the Internet is built.

Can the Genie Go Back in the Bottle?
With nearly 4 billion users online connecting to more than 1 billion websites, there is no chance that the fundamental protocols will be retrofitted with security controls. But that doesn't mean organizations can't protect themselves.

A new model for accessing the Web is taking shape. It allows users to interact with Web-based content without exposing their environment to exploitation, and it lets IT manage the browser centrally for security, governance, and compliance.

It's called browser isolation: run the browser on a remote, virtual host, where all Web content is contained in a secure, disposable environment. The user interacts with the virtual browser via an encrypted remote display protocol. This neuters Web threats: because nothing but pixels ever reaches the endpoint, users can't bring ransomware, drive-by downloads, or other malicious content into their network or onto their devices.
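Conceptually, the architecture looks like the sketch below, where a hypothetical render_to_pixels() stands in for a full browser engine running in a throwaway container on the isolation host; only an inert image of the page ever crosses to the user's device.

```python
# Conceptual sketch of browser isolation: the page is fetched and
# rendered remotely, and only rasterized pixels reach the endpoint.
from dataclasses import dataclass

@dataclass
class Frame:
    width: int
    height: int
    pixels: bytes  # raw RGB; no HTML, no scripts, no active content

def render_to_pixels(url: str) -> Frame:
    """Hypothetical stand-in for a real rendering engine. In a real
    product, a full browser visits `url` inside a disposable container;
    any malicious payload executes (and is destroyed) there."""
    return Frame(width=1280, height=800, pixels=b"\x00" * (1280 * 800 * 3))

def isolated_browse(url: str) -> Frame:
    # Only the Frame (inert pixels) is sent back to the user,
    # typically over an encrypted remote-display channel.
    return render_to_pixels(url)
```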

Industry analysts are increasingly recommending this approach, and a recent report suggests strong growth in this category. Enterprises are adopting browser isolation platforms to outsource their attack surface. You should, too.

Scott Petry is Co-Founder and CEO of Authentic8. Prior to Authentic8, Scott founded Postini and served in a variety of C-level roles until its acquisition by Google in 2007. He served as Director of Product Management at Google until 2009. Prior to Postini, Scott was General ...