The newest version of TLS won't break everything in your security infrastructure, but you do need to be prepared for the changes it brings.

Transport Layer Security (TLS) is a foundational piece of modern Internet security. As the replacement for the earlier (and now deprecated) SSL, TLS encrypts the majority of sessions taking place over the web. And now there's a new version, with new considerations for organizations that want to give their users and customers a more secure web experience.

In August 2018, TLS 1.3 was formally defined in IETF RFC 8446. With that formal definition, the new version became available for implementation and a candidate requirement under a number of different regulations.

TLS 1.3 was not suddenly sprung on an unsuspecting world. The new standard went through 28 drafts before reaching a production state, and some products and services began incorporating TLS 1.3 compatibility more than a year before the final version was published. Even so, articles have been written, and speeches given, about all the ways that TLS 1.3 will break current security tooling. So what is it about TLS 1.3 that leads to so much anxiety?

How TLS 1.3 is different

One of the important benefits touted for TLS 1.3 is improved performance, much of which comes from a simplified "handshake" process between client and server when establishing a session. There are several technical reasons this is possible, but a key one is that an entire negotiation, the one over which encryption algorithm to use, is eliminated.

The server provides a key for an approved algorithm, the client accepts the key, and the session begins. One strength of this scheme is that a number of older, weaker encryption algorithms are no longer allowed, so several attack mechanisms become impossible.
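As a concrete illustration of that restricted algorithm set, here is a minimal sketch using Python's standard-library `ssl` module (Python 3.7 or later, not any particular vendor's product). A client that raises its protocol floor to TLS 1.3 simply cannot negotiate the legacy suites, because RFC 8446 defines only a handful of modern AEAD cipher suites and omits the older ones entirely:

```python
import ssl

def make_tls13_context() -> ssl.SSLContext:
    """Build a client context that refuses anything older than TLS 1.3."""
    ctx = ssl.create_default_context()
    # With the floor raised to 1.3, legacy suites (RC4, 3DES, CBC modes)
    # cannot be negotiated: they are not part of the protocol at all.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    return ctx

ctx = make_tls13_context()
print(ctx.minimum_version)  # TLSVersion.TLSv1_3
```

A context configured this way will fail the handshake against any server that cannot speak TLS 1.3, which is exactly the behavior an organization mandating the new protocol would want.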

When the server supplies an encryption key, it is valid for that particular session, and only that session. This leads to something called perfect forward secrecy (PFS), which means it's impossible for a threat actor to capture a mass of traffic, later obtain the server's private key, and then decrypt the captured traffic after the fact. This is, by itself, a major step forward in data security.
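The mechanics behind forward secrecy can be seen in a toy sketch. The numbers below are illustrative only; real TLS 1.3 uses standardized (EC)DHE groups such as X25519 and the HKDF-based key schedule from RFC 8446, not a textbook-sized group. But the core idea is the same: both sides invent fresh secrets for every session, derive a shared key from them, and then discard the secrets, so no long-lived key ever protects recorded traffic.

```python
import secrets

# Toy Diffie-Hellman parameters for illustration only; real deployments
# use standardized groups (e.g., X25519 or the RFC 7919 FFDHE groups).
P = 4_294_967_291   # the largest 32-bit prime
G = 2

def ephemeral_exchange() -> int:
    """One session: each side picks a fresh secret, they exchange public
    values, and both derive the same per-session key."""
    a = secrets.randbelow(P - 2) + 2      # client's ephemeral secret
    b = secrets.randbelow(P - 2) + 2      # server's ephemeral secret
    A, B = pow(G, a, P), pow(G, b, P)     # public values sent in the handshake
    client_key, server_key = pow(B, a, P), pow(A, b, P)
    assert client_key == server_key       # both sides agree on the key
    return client_key                     # secrets a and b are now discarded

# Two sessions yield two unrelated keys, so traffic recorded from one
# session cannot be decrypted by compromising any key used later.
k1, k2 = ephemeral_exchange(), ephemeral_exchange()
print(k1 != k2)  # True with overwhelming probability
```

Because the exponents `a` and `b` exist only for the lifetime of one session, there is nothing durable for an attacker to steal that would unlock previously captured ciphertext.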

Why TLS 1.3 is important

While many organizations, especially those in finance and banking, have been proponents of TLS 1.3, its adoption has not been met with universal joy. The reason: despite the wishes of some security professionals, there's no "back door" giving defenders access to the unencrypted traffic.

Why would security professionals, of all people, want a back door into encryption? The answer is visibility. Many enterprise security tools, especially those that do anything described as "deep packet inspection," are essentially engaging in an authorized man-in-the-middle attack, intercepting encrypted traffic, decrypting and analyzing the contents, then re-encrypting the stream before sending it to its destination.

This sort of man-in-the-middle approach is relatively simple when the encryption key is based on a server identity rather than on a session, but it becomes vastly more complex under the scheme used by TLS 1.3. To put it bluntly, TLS 1.3 breaks many of the products used by organizations that deploy TLS 1.2 for their encryption today. Those organizations have concerns about both malware detection and regulatory compliance, since they may no longer have a way to inspect the contents of communications going in and out of the network.

Network and application infrastructure companies have begun rolling out products that address the inspection issues in TLS 1.3. This is critical, because server software and browsers that support or require TLS 1.3 are already being released. The real question is how quickly organizations will adopt the new protocol, a question made more pointed by the fact that, by some measures, more than half of all commercial websites still have pages using TLS 1.0 for security.



About the Author(s)

Curtis Franklin, Principal Analyst, Omdia

Curtis Franklin Jr. is Principal Analyst at Omdia, focusing on enterprise security management. Previously, he was senior editor of Dark Reading, editor of Light Reading's Security Now, and executive editor, technology, at InformationWeek, where he was also executive producer of InformationWeek's online radio and podcast episodes.

Curtis has been writing about technologies and products in computing and networking since the early 1980s. He has been on staff and contributed to technology-industry publications including BYTE, ComputerWorld, CEO, Enterprise Efficiency, ChannelWeb, Network Computing, InfoWorld, PCWorld, Dark Reading, and ITWorld.com on subjects ranging from mobile enterprise computing to enterprise security and wireless networking.

Curtis is the author of thousands of articles and the co-author of five books, and has been a frequent speaker at computer and networking industry conferences across North America and Europe. His most recent books, Cloud Computing: Technologies and Strategies of the Ubiquitous Data Center and Securing the Cloud: Security Strategies for the Ubiquitous Data Center, both written with co-author Brian Chee, are published by Taylor and Francis.

When he's not writing, Curtis is a painter, photographer, cook, and multi-instrumentalist musician. He is active in running, amateur radio (KG4GWA), the MakerFX maker space in Orlando, FL, and is a certified Florida Master Naturalist.

