Twenty-eight years ago, British computer scientist Tim Berners-Lee proposed a system to link text documents across a computer network. It changed the way the world communicates and does business. From its humble beginnings, the Web has grown into a complex, dynamic, and heterogeneous environment.
Today, the Internet revolution's main instrument, the Web browser, exposes users to unbounded malicious content and has become unmanageable.
How did browsers become such a liability? Because they're based on an ancient set of communication rules, protocols that assume connections are secure and content is safe. The openness and utility of the protocols led to enormous innovation. But today, with all its sophistication, the Web is still based on protocols that weren't designed for security or enterprise-class management.
Simple Beginnings Led to Enormous Risk
Berners-Lee says he didn't imagine how far his invention would go. In the early '90s, he designed something that today seems obvious: protocols and programs that enable a client application to fetch and display content residing on a remote server. He envisioned a system that would give researchers easier access to scientific documents on servers scattered around the world.
The first protocol defined how a client and a server exchange requests and responses once a network connection is established. Today, his Hypertext Transfer Protocol (HTTP) has become the de facto standard for how Internet-connected applications and devices communicate with each other.
Berners-Lee's second protocol has become the lingua franca of the Web: the Hypertext Markup Language. HTML defined a basic set of instructions for formatting and linking documents, using simple text-based commands embedded in the documents themselves. These "tags" would enable seamless cross-referencing, and thus improved collaboration among researchers.
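The cross-referencing those tags enable can be sketched in a few lines of Python using the standard library's `html.parser`. The document below is a hypothetical early-Web-style page, not a real one; the sketch simply walks its embedded tags and collects the link targets that stitched documents together.

```python
from html.parser import HTMLParser

# A hypothetical early-Web-style document: plain text with embedded tags.
DOC = """
<html><body>
<h1>Hypertext Findings</h1>
<p>See the <a href="http://info.cern.ch/paper.html">companion paper</a>
and the <a href="http://example.org/data.html">raw data</a>.</p>
</body></html>
"""

class LinkExtractor(HTMLParser):
    """Collects the href target of every <a> tag -- the cross-references."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed(DOC)
print(parser.links)
# → ['http://info.cern.ch/paper.html', 'http://example.org/data.html']
```

Every early browser did some version of this walk: render the text, and turn each tagged reference into something a reader could follow with a click.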
Protocols need apps to make them useful, so Berners-Lee built the associated components to create an end-to-end solution. He defined the URL and built the first Web server and the first browser, dubbed WorldWideWeb.
Built for Simplicity and Extensibility, Not Security
With all the pieces in place, a user on one computer could open a browser, connect to a Web server, and request content. The content would be passed back to the browser, which would unpack the data and format it for local interaction.
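That fetch-and-display cycle can be sketched with Python's standard library. The server below is a stand-in (it serves one fixed page on a local port), and the "browser" is just a raw socket sending a hand-written HTTP request, but the shape of the exchange is the one described above: connect, request, receive, unpack.

```python
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in Web server: returns a fixed page for any GET request.
class PageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<h1>Hello, Web</h1>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), PageHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address

# The "browser" side: open a connection, send an HTTP request, read the reply.
with socket.create_connection((host, port)) as sock:
    request = f"GET /index.html HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode())
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

server.shutdown()
headers, _, content = response.partition(b"\r\n\r\n")
print(headers.split(b"\r\n")[0])  # → b'HTTP/1.0 200 OK'
print(content)                   # → b'<h1>Hello, Web</h1>'
```

The response splits cleanly into headers and content; a real browser takes the content portion and formats it for local interaction, exactly as described above.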
In Berners-Lee's own words, his system was designed for "a friendly, academic environment." But within a few years, the World Wide Web was no longer his app; the name had become synonymous with the digital revolution. And with a bit of steering by international standards committees, users, technology vendors, content providers, and significantly less "friendly" participants all piled in.
By the end of 1995, tens of millions of people were connected to the Internet. Explosive growth would continue for more than two decades.
When the Web Got Out of Hand
With the popularity of the Web on the rise, demand for more functionality was growing. Recognizing that standards needed to be established to build on the common infrastructure without breaking interoperability, organizations such as the Internet Engineering Task Force and the World Wide Web Consortium (W3C) quickly expanded the syntax of the underlying protocols. In 1996, the W3C outlined support for executing scripts inside the browser.
This was the point of no return for the browser. Instead of rendering formatted text, it would now download programmatic code from third-party servers and execute it locally.
By the end of 1997, the browser had evolved well beyond Berners-Lee's concept: it had become a dashboard for a vast array of interconnected services and the user interface for business applications. But it was still built on his vision of a "friendly environment," in which the browser connects to a remote host and executes whatever payload the host delivers.
"Unfriendlies" See Opportunities, Too
The propagation of viruses over computer networks had historical precedent. The exploitation of systems has evolved alongside interconnectivity since the dawn of computers.
The Morris Worm is credited with being one of the first malicious programs distributed over the Internet. Written by Robert Morris, a graduate student at Cornell University, it was launched in November 1988. Morris claims to have written the worm to measure the size of the Internet. Regardless, it was so successful that major sections of the network had to be disconnected in order to contain it.
The situation has only gotten worse. The sophistication and volume of Internet-based exploits continue to grow. Motivated hackers go after personal information such as banking credentials, healthcare data, tax refunds, and more. And state-sponsored and corporate espionage attacks target individuals and information sources in attempts to disrupt businesses and economies.
All of these attacks exploit the common, wonderful, but woefully unprepared protocols on which the Internet is built.
Can the Genie Go Back in the Bottle?
With nearly 4 billion users online connecting to more than 1 billion websites, there is no chance that the fundamental protocols will be retrofitted with security controls. But that doesn't mean organizations can't protect themselves.
A new model for accessing the Web is taking shape. It allows users to interact with Web-based content without exposing their environment to exploits, and lets IT manage the browser centrally for security, governance, and compliance.
It's called browser isolation: run the browser on a remote, virtual host, where all Web content is contained in a secure, disposable environment. The user interacts with the virtual browser over an encrypted remote display protocol. This neuters Web threats: because only pixels ever reach the endpoint, malicious content such as ransomware and drive-by downloads can't enter the user's network or device.
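A toy model makes the isolation boundary concrete. Everything below is hypothetical (function names, the fake page, the fixed frame size); real products use hardened virtual browsers and encrypted display protocols. The point is the contract: untrusted content executes on the remote side, and the endpoint only ever receives rendered pixels.

```python
# Conceptual sketch of the browser-isolation boundary (hypothetical names,
# not any real product's API).

def render_to_pixels(page: str) -> bytes:
    # Stand-in for real rasterization: a fixed-size 640x480 RGB frame.
    # In a real deployment, scripts in `page` would execute here, inside
    # the disposable remote environment.
    return bytes(3 * 640 * 480)

def remote_virtual_browser(url: str) -> bytes:
    """Runs on the remote, disposable host: fetches and executes the page,
    then rasterizes the result. Only the pixel buffer leaves this side."""
    page = "<script>stealCredentials()</script><h1>Invoice</h1>"  # untrusted
    return render_to_pixels(page)

def endpoint_display(frame: bytes) -> None:
    # The endpoint can only draw the frame: no HTML, no script, no file
    # download -- nothing executable ever arrives.
    pass

frame = remote_virtual_browser("https://example.com/invoice")
endpoint_display(frame)
print(len(frame))  # → 921600 (just pixels: a 640x480 RGB frame)
```

Whatever the page tried to do, the only artifact that crosses the wire is an inert frame buffer, which is why the model is sometimes described as "touching pixels, not content."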
Industry analysts are increasingly recommending this approach, and a recent report suggests strong growth in the category. Enterprises are adopting browser isolation platforms to outsource their attack surface. You should, too.