Endpoint
Commentary
4/19/2017 10:30 AM
Scott Petry

The Architecture of the Web Is Unsafe for Today's World

The Internet is based on protocols that assume content is secure. A new, more realistic model is needed.

Twenty-eight years ago, British computer scientist Tim Berners-Lee proposed a system to link text documents across a computer network. It changed the way the world communicates and does business. From its humble beginnings, the Internet has become a complex, dynamic, and heterogeneous environment.

Today, the Internet revolution's main instrument, the Web browser, exposes users to unbounded malicious content and has become unmanageable.

How did browsers become such a liability? Because they're based on an ancient set of communication rules, protocols that assume connections are secure and content is safe. The openness and utility of the protocols led to enormous innovation. But today, with all its sophistication, the Web is still based on protocols that weren't designed for security or enterprise-class management.

Simple Beginnings Led to Enormous Risk
Berners-Lee says he didn't imagine how far his invention would go. In the early '90s, he designed something that today seems obvious: protocols and programs that enable a client application to fetch and display content residing on a remote server. He envisioned a system that would give researchers easier access to scientific documents on servers scattered around the world.

The first protocol would allow two devices to establish a connection and exchange content. Today, his Hypertext Transfer Protocol (HTTP) has become the de facto standard for how Internet-connected applications and devices communicate with each other.
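
To see how few guarantees that original protocol makes, here is a minimal sketch of an HTTP/1.0 exchange over a raw socket. It is written in Python purely for illustration, with example.com on port 80 as a placeholder endpoint: the request and response travel as plain text, and nothing in the exchange authenticates the content that comes back.

    import socket

    # Classic HTTP rides on a plain TCP connection: no encryption, and nothing
    # in the protocol vouches for the content the server sends back.
    HOST, PORT = "example.com", 80  # placeholder host; port 80 = cleartext HTTP

    request = "GET / HTTP/1.0\r\nHost: " + HOST + "\r\n\r\n"

    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(request.encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)

    response = b"".join(chunks).decode("utf-8", errors="replace")
    # The status line and headers arrive in the clear; the body that follows
    # is whatever markup or code the server chose to deliver.
    print(response.split("\r\n\r\n", 1)[0])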

Berners-Lee's second invention has become the lingua franca of the Web: the Hypertext Markup Language. HTML defined a basic set of instructions for formatting and linking documents with simple text-based commands embedded in them. These "tags" would enable seamless cross-referencing and thus improve collaboration among researchers.

Protocols need apps to make them useful, so Berners-Lee built the associated components to create an end-to-end solution. He defined the URL and built the first Web server and the first browser, dubbed WorldWideWeb.

Built for Simplicity and Extensibility, Not Security
With all the pieces in place, a user on one computer could open a browser, connect to a Web server, and request content. The content would be passed back to the browser, which would unpack the data and format it for local interaction.
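
A rough sketch of that fetch-and-render loop, using only Python's standard library and http://example.com/ as a stand-in URL, illustrates the idea (not how any real browser is built):

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class TextExtractor(HTMLParser):
        """Collects the text between tags: a crude stand-in for rendering."""
        def __init__(self):
            super().__init__()
            self.parts = []

        def handle_data(self, data):
            if data.strip():
                self.parts.append(data.strip())

    # 1. Connect to a Web server and request content (placeholder URL).
    with urlopen("http://example.com/") as response:
        html = response.read().decode("utf-8", errors="replace")

    # 2. "Unpack" the returned document and format it for local display.
    extractor = TextExtractor()
    extractor.feed(html)
    print("\n".join(extractor.parts))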

In Berners-Lee's own words, his system was designed for "a friendly, academic environment." But within a few years, the World Wide Web was no longer his app; the name had become synonymous with the digital revolution. And with a bit of steering by international standards committees, users, technology vendors, content providers, and significantly less "friendly" participants were all in.

By the end of 1995, nearly 50 million people were connected to the Internet. Explosive growth would continue for more than two decades.

(Chart source: Scott Petry)

When the Web Got Out of Hand
With the popularity of the Web on the rise, demand for more functionality was growing. Recognizing that standards needed to be established to build on the common infrastructure without breaking interoperability, organizations such as the Internet Engineering Task Force and the World Wide Web Consortium (W3C) quickly expanded the syntax of the underlying protocols. In 1996, the W3C published an outline to support the execution of scripts in the browser.

This was the point of no return for the browser. Instead of rendering formatted text, it would now download programmatic code from third-party servers and execute it locally.
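
To make the shift concrete, the toy server below (a sketch built on Python's standard http.server module, bound to an arbitrary local port) returns a page that is no longer just formatted text: it carries an inline script that any visiting browser will execute automatically.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # The page now carries code, not just markup; the visiting browser runs it.
    PAGE = b"""<html><body>
    <p>Hello from the server.</p>
    <script>
      // Executes in the visitor's browser the moment the page loads.
      document.body.appendChild(document.createTextNode('Script ran locally.'));
    </script>
    </body></html>"""

    class ScriptedPage(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(PAGE)))
            self.end_headers()
            self.wfile.write(PAGE)

    if __name__ == "__main__":
        # Arbitrary local address chosen for illustration.
        HTTPServer(("127.0.0.1", 8000), ScriptedPage).serve_forever()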

(Chart source: Scott Petry)

Building on these enhancements, commercial vendors would expand the capabilities of the browser. Plug-ins were developed to render video, applications were rewritten to support Java or ActiveX, and with the emergence of JavaScript, code execution went native.

(Chart source: Scott Petry)

By the end of 1997, the browser had evolved well beyond Berners-Lee's concept. The browser had become a dashboard to a vast array of interconnected services and the user interface for business applications. But it was still built on his vision of a "friendly environment," where the browser would connect to a remote host and execute whatever payload the host delivered.

"Unfriendlies" See Opportunities, Too
The propagation of viruses over computer networks had historical precedent. The exploitation of systems has evolved alongside interconnectivity since the dawn of computers.

The Morris Worm is credited with being one of the first malicious programs distributed over the Internet. Written by Robert Morris, a graduate student at Cornell University, it was launched in November 1988. Morris claims to have written the worm to measure the size of the Internet. Regardless, it was so successful that major sections of the network had to be disconnected in order to contain it.

The situation has only gotten worse. The sophistication and volume of Internet-based exploits continue to grow. Motivated hackers go after personal information such as banking credentials, healthcare data, tax refunds, and more. And state-sponsored or corporate espionage attacks target individuals and information sources in attempts to disrupt businesses or economies.

All of these attacks exploit the common, wonderful, but woefully unprepared protocols on which the Internet is built.

Can the Genie Go Back in the Bottle?
With nearly 4 billion users online connecting to more than 1 billion websites, there is no chance that the fundamental protocols will be retrofitted with security controls. But that doesn't mean organizations can't protect themselves.

A new model for accessing the Web is taking shape. It allows users to interact with Web-based content without exposing their environment to exploitation, and it lets IT manage the browser centrally for security, governance, and compliance.

It's called browser isolation: run the browser on a remote, virtual host, where all Web content is contained in a secure, disposable environment. Users interact with the virtual browser over an encrypted remote display protocol, which neuters Web threats. Because only pixels reach the endpoint, users can't bring ransomware, drive-by downloads, or other malicious content into their network or onto their devices.
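
A minimal sketch of that data flow, using hypothetical placeholder functions rather than any vendor's actual API, shows the key property: only encoded pixels cross from the isolated browser to the user's device, and only input events go the other way.

    from dataclasses import dataclass

    @dataclass
    class Frame:
        """An encoded bitmap of the rendered page: pixels only, no HTML, no scripts."""
        pixels: bytes

    def render_in_disposable_container(url: str) -> Frame:
        # Hypothetical isolation host: fetch the URL, run all of its HTML,
        # JavaScript, and plug-in content inside a throwaway environment,
        # and hand back nothing but a rendered bitmap (stubbed here).
        return Frame(pixels=b"\x00" * 64)

    def display(frame: Frame) -> None:
        # Hypothetical endpoint client: draw the received pixels and send
        # keystrokes and clicks back over the encrypted display channel.
        print(f"drew {len(frame.pixels)} bytes of pixels; no active content crossed")

    def browse(url: str) -> None:
        # The only artifact that moves from the isolation host to the device
        # is a Frame; the disposable environment is discarded after the session.
        display(render_in_disposable_container(url))

    browse("https://example.com/")  # placeholder URL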

Industry analysts are increasingly recommending this approach, and a recent report suggests strong growth in this category. Enterprises are adopting browser isolation platforms to outsource their attack surface area. You should, too.

Scott Petry is Co-Founder and CEO of Authentic8. Prior to Authentic8, Scott founded Postini and served in a variety of C-level roles until its acquisition by Google in 2007. He served as Director of Product Management at Google until 2009. Prior to Postini, Scott was General ...