Intel Hardens Confidential Computing With Project Amber Updates
The chip giant has developed new features and services to make it tougher for malicious hackers and insiders to access sensitive data from applications in the cloud.
INTEL INNOVATION 2022 -- San Jose, Calif. — At its Innovation developer summit this week, Intel announced new hardware and software features for Project Amber, a confidential computing service that interlaces hardware and software to attest to and certify the trustworthiness of data.
The enhancements include features to protect data from the moment it leaves a system, whether it is in transit, in use, or at rest in storage.
"This is a fundamental technology that Intel has been developing for years. The place where it's going to be the most important is in AI/ML models ... to make sure when you're running a model on the edge, it's not being pilfered, it's not being stolen, it's not being manipulated," said Intel CTO Greg Lavender during his Wednesday keynote.
Data is traveling farther when outside the data center, with multiple stopovers, until it reaches cloud services or completes a round trip to enterprise infrastructure. Information from sources like sensors is added as data moves along a telecom network, with stopovers and artificial intelligence (AI) chips ensuring only relevant data moves ahead.
Project Amber uses hardware and software techniques to verify that packets of data and their origin devices are trustworthy. That layer of trust between devices and waypoints when data is in transit is a form of assurance that a company's infrastructure and execution environment are secure, says Anil Rao, Intel vice president for systems architecture and engineering in the office of the CTO.
"Gone are the days where the central hubs are simply the data movers," Rao says. "They're not simple data movers. They're intelligent data movers."
The confidential computing offering is important for an enterprise mixing its own datasets with information from third parties to strengthen AI learning models. Project Amber provides a way to ensure that data is coming from trusted sources, Rao says.
Secure Enclaves
Project Amber adds a stronger lockdown mechanism to protect data while it is being processed. The Trust Domain Extensions (TDX) instructions, which are on the company's upcoming 4th Generation Xeon Scalable processor, can secure an entire virtual machine as a trusted enclave.
The data is locked down so even hypervisors — which manage and monitor virtual machines — can't peek into the confidential computing environment.
"Your application will still do a virtual machine entry and exit call, but during those calls the data is still encrypted," Rao says.
Today's computing environment in the cloud is built around virtual machines, and applications don't run directly off processors, says Steve Leibson, principal analyst at Tirias Research.
"When we ran on processors, we didn't need attestation because nobody was going to alter a Xeon. But a virtual machine — that's just software. You can alter it," Leibson says. "Attestation is trying to provide the same sort of rigidity to software machines as silicon does for hardware processors."
TDX is bigger in scope than Software Guard Extensions (SGX), which provides a secure area in memory in which to push, run, and operate code. SGX, a common feature on Intel chips, is also a part of Project Amber.
Intel's Rao compares the scope of TDX and SGX to hotel rooms: if TDX is a trusted boundary in the form of a secure hotel room, SGX is a secure locker inside that room.
Project Amber allows data to enter secure enclaves after matching numerical codes issued by Amber engines. If the codes match, data can enter the secure enclave; if not, entry is denied because data could have been altered, modified, or hacked in transit.
"It's almost like you're giving somebody your VIN number and saying, 'Is this the authentic VIN number for my car or has someone done something hanky-panky with that thing?'" Rao says.
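Conceptually, the code matching Rao describes resembles comparing a cryptographic digest of a workload's measurement against the value a verification service expects. The sketch below is a simplified illustration of that idea, not Project Amber's actual protocol or API; the function and variable names are hypothetical.

```python
import hashlib
import hmac

def attest(measurement: bytes, expected_digest: str) -> bool:
    """Check whether a freshly computed digest of a workload
    measurement matches the digest the verifier expects."""
    actual = hashlib.sha256(measurement).hexdigest()
    # Constant-time comparison avoids leaking how many leading
    # characters of the digest happened to match.
    return hmac.compare_digest(actual, expected_digest)

# A workload whose measurement matches the expected digest is admitted;
# anything altered in transit produces a different digest and is denied.
good = b"trusted-workload-image"
expected = hashlib.sha256(good).hexdigest()
print(attest(good, expected))
print(attest(b"tampered-image", expected))
```

In a real attestation flow the measurement would be produced and signed by the hardware itself, so a tampered workload cannot simply recompute a matching digest.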
Intel will also provide customers the ability to define their own policies to create a trusted execution environment.
"You may want to process everything in an East Coast data center versus a West Coast one," Rao says. "What Amber says is: Here is exactly what it is — your code did not pass the policy."
Protection in the Clouds
Amber will support multiple cloud service providers, but Intel didn't provide specific details.
"We want to make it multicloud so you don't need to have a different attestation mechanism as an enterprise when you go to different clouds," Rao says.
There are hundreds of millions of processors from Intel in data centers around the world, and bad actors have an established ability to break into servers and steal secrets, Tirias' Leibson says.
"It's a cat-and-mouse game, and Intel is constantly trying to develop new ways to prevent the bad guys from breaking into the servers and stealing secrets," Leibson says. "And it goes all the way from script kiddies, to teenagers who are just hacking around, to state-sponsored sites."
At some point, one has to think about protecting data in use, in motion, and in storage. Project Amber was thus inevitable, especially with computing moving farther away from homegrown infrastructure to the cloud, Leibson says.
Project Amber is still in the pilot phase as Intel tailors the technology to the computing models adopted by vertical industries. The chipmaker is working with research company Leidos to use Project Amber in the healthcare sector, which has many types of devices and sensors spread over large geographies and requires attestation to ensure systems receive only trustworthy data.