
How Chipmakers Are Implementing Confidential Computing

On-chip solutions aim to prevent breaches by separating the computing element and keeping data in the secure vault at all times.

Agam Shah, Contributing Writer

October 31, 2022

Illustration of a password-protected block in a computer environment
Source: Siarhei Yurchanka via Alamy Stock Photo

Top chipmakers Nvidia, Intel, ARM, and AMD are providing the hardware hooks for an emerging security concept called confidential computing, which provides layers of trust through hardware and software so customers can be confident that their data is secure.

Chipmakers are adding protective vaults and encryption layers to secure data when it is stored, in transit, or being processed. The goal is to prevent hackers from launching hardware attacks to steal data.

The chip offerings are trickling down to cloud providers, with Microsoft (Azure) and Google (Cloud) offering security-focused virtual machines in which data in secure vaults can be unlocked only by authorized parties. Attestation verifies the source and integrity of the program entering the secure vault to access the data. Once a program is authorized, processing happens inside the vault, and the code never leaves it.
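As a rough illustration of that flow, the attestation check boils down to comparing a measurement of the requesting code against an expected value before the key protecting the data is released. The sketch below uses made-up names and a local comparison purely for illustration; it is not any cloud provider's actual API.

    # attestation_sketch.py -- toy model of the vault/attestation idea described
    # above; all names are hypothetical, not a real confidential computing API.
    import hashlib
    import hmac
    import secrets
    from typing import Optional

    def measure(code: bytes) -> str:
        # The "measurement" is a hash of the code asking to enter the vault.
        return hashlib.sha256(code).hexdigest()

    class SecureVault:
        # Toy stand-in for a hardware-backed enclave guarding a data key.
        def __init__(self, expected_measurement: str):
            self._expected = expected_measurement
            self._data_key = secrets.token_bytes(32)  # released only after attestation

        def attest_and_unlock(self, code: bytes) -> Optional[bytes]:
            # Constant-time comparison of the presented measurement with the expected one.
            if hmac.compare_digest(measure(code), self._expected):
                return self._data_key  # authorized: processing proceeds inside the vault
            return None                # unauthorized: entry is denied

    # Usage: the vault is provisioned with the measurement of the approved program.
    approved_code = b"def process(data): ..."
    vault = SecureVault(expected_measurement=measure(approved_code))
    assert vault.attest_and_unlock(approved_code) is not None
    assert vault.attest_and_unlock(b"tampered payload") is None

In real deployments the measurement is produced and signed by the hardware itself, so a remote verifier can trust it even when the host machine is not trusted.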

Confidential computing is not a part of everyday computing yet, but it may become necessary to protect sensitive applications and data from sophisticated attacks, says Jim McGregor, principal analyst at Tirias Research.

The chipmakers are focusing on hardware protections because "software is easy to hack," McGregor says.

Nvidia's Morpheus Uses AI to Analyze Behavior

There are multiple dimensions to confidential computing. On-chip confidential computing aims to prevent attacks like those enabled by the Meltdown and Spectre vulnerabilities, disclosed in 2018, by separating the computing element and keeping data in the secure vault at all times.

"Everybody wants to continue to reduce the attack surface of data," says Justin Boitano, vice president and general manager of Nvidia's enterprise and edge computing operations. "Up to this point, it is obviously encrypted in transit and at rest. Confidential computing solves the encrypted in-use at the infrastructure level."

Nvidia is taking a divergent approach to confidential computing with Morpheus, which uses artificial intelligence (AI) to keep computer systems secure. For example, Morpheus identifies suspicious user behavior by using AI techniques to inspect network packets for sensitive data.

"Security analysts can go and fix the security policies before it becomes a problem," Boitano says. "From there, we also realize the big challenges — you have to kind of assume that people are already in your network, so you have also got to look at the behavior of users and machines on the network."

Nvidia is also using Morpheus to establish security priorities for analysts tracking system threats. The AI system breaks down login information to identify abnormal user behavior on a network and machines that may have been compromised by phishing, social engineering, or other techniques. That analysis helps the company's security team prioritize their actions.
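As a toy illustration of that kind of log analysis, the hypothetical snippet below flags logins that fall far outside a user's usual hours; it is a simplified stand-in for the idea, not Nvidia's Morpheus pipeline.

    # login_anomaly_sketch.py -- hypothetical example of flagging abnormal login
    # behavior from log data; not Nvidia Morpheus code.
    from statistics import mean, pstdev

    # Each record is (user, hour_of_day) extracted from login logs.
    logins = [
        ("alice", 9), ("alice", 10), ("alice", 9), ("alice", 11),
        ("alice", 10), ("alice", 9), ("alice", 3),   # the 3 a.m. login is unusual
        ("bob", 14), ("bob", 15), ("bob", 14),
    ]

    def flag_anomalies(records, threshold=2.0):
        # Flag logins whose hour deviates strongly from that user's own baseline.
        by_user = {}
        for user, hour in records:
            by_user.setdefault(user, []).append(hour)
        flags = []
        for user, hours in by_user.items():
            mu, sigma = mean(hours), pstdev(hours) or 1.0  # avoid division by zero
            flags += [(user, h) for h in hours if abs(h - mu) / sigma > threshold]
        return flags

    print(flag_anomalies(logins))  # [('alice', 3)]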

"You're trying to look at everything and then use AI to determine what you need to keep and act on versus what might just be noise that you can drop," Boitano says.

Intel Rolls Out Project Amber

Confidential computing will also help enterprises build a new class of applications where third-party data sets can mingle with proprietary data sets in a secure area to create better learning models, says Anil Rao, vice president and general manager for systems architecture and engineering at Intel's office of the chief technology officer.

Companies are eager to bring diverse data sets into proprietary data to make internal AI systems more accurate, Rao says. Confidential computing makes sure only authorized data is fed into AI and learning models, and that the data cannot be stolen along the way.

"If you have data coming in from credit card companies, you have data coming in from insurance companies, and you have data coming in from other locations, what you can do is say, 'I'm going to process all of these pieces of data inside of a [secure] enclave,'" Rao says.

Intel already had a secure enclave technology called SGX (Software Guard Extensions), but it recently added Project Amber, a cloud-based service that uses hardware and software techniques to attest and certify the trustworthiness of data.

On its upcoming 4th Generation Xeon Scalable processors, Intel pairs Project Amber with instructions called Trust Domain Extensions (TDX) to unlock secure enclaves. An Amber engine on the chip generates a numerical code for the secure enclave. If the code provided by the data or program seeking access matches, it is allowed to enter the secure enclave; if not, entry is denied.
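A rough sketch of the kind of check such an attestation service performs is below. The names, data structures, and the use of a shared secret are simplifications for illustration, not Intel's actual Project Amber or TDX interfaces; real hardware signs measurements with asymmetric keys rooted in the silicon.

    # verifier_sketch.py -- hypothetical attestation verifier: a signed measurement
    # is compared against known-good values before a workload is admitted.
    import hashlib
    import hmac

    CHIP_SIGNING_KEY = b"stand-in-for-the-chip-attestation-key"
    KNOWN_GOOD = {hashlib.sha256(b"approved-enclave-build-1.2.3").hexdigest()}

    def chip_generate_quote(workload: bytes):
        # The chip measures the workload and signs the measurement (the "quote").
        measurement = hashlib.sha256(workload).hexdigest()
        signature = hmac.new(CHIP_SIGNING_KEY, measurement.encode(), hashlib.sha256).digest()
        return measurement, signature

    def verifier_check(measurement: str, signature: bytes) -> bool:
        # Admit the workload only if the quote is genuine and the measurement
        # matches a known-good build.
        expected = hmac.new(CHIP_SIGNING_KEY, measurement.encode(), hashlib.sha256).digest()
        return hmac.compare_digest(signature, expected) and measurement in KNOWN_GOOD

    # Usage: an approved build is admitted; a tampered one is denied.
    print(verifier_check(*chip_generate_quote(b"approved-enclave-build-1.2.3")))  # True
    print(verifier_check(*chip_generate_quote(b"tampered-build")))                # False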

ARM Teams Up With AWS

At the recent online ARM DevSummit, ARM — whose chip designs are used by AWS in its Graviton cloud chips — announced it is focusing confidential computing on dynamic "realms" that isolate programs and data in separate computational environments.

ARM's latest confidential computing architecture will deepen secure "wells" and make it harder for hackers to pull out data. The company is releasing confidential computing software stacks and guides for implementation in processors coming out over the next two years.

"We're already investing to ensure that you have the tools and software to see the ecosystem for early development," said Gary Campbell, executive vice president for central engineering at ARM, during a keynote at the event.

AMD and Microsoft Go Open Source

During a presentation at the AI Hardware Summit in August, Mark Russinovich, Microsoft Azure's chief technology officer, gave an example of how Royal Bank of Canada was using AMD's SEV-SNP (Secure Encrypted Virtualization-Secure Nested Paging) confidential computing technology in Azure. The bank's AI model mixed proprietary data sets with information from merchants, consumers, and banks in real time, which helped it provide more targeted offers to its customers.

Confidential computing features such as attestation ensured only the authorized data was mingling with its proprietary data set and not compromising it, Russinovich said.

Nvidia, Microsoft, Google, and AMD are collaborating on Caliptra, an open source specification for chipmakers to build confidential computing security blocks into chips and systems.

About the Author

Agam Shah

Contributing Writer

Agam Shah has covered enterprise IT for more than a decade. Outside of machine learning, hardware, and chips, he's also interested in martial arts and Russia.

