Protecting data in use is the short version of the goal of confidential computing. The initiative, however, is more complicated than that. On Monday, November 14, representatives from Google Cloud, AMD, and Intel met to discuss the state of confidential computing, how it’s evolving, and the hurdles that still need to be overcome. What does confidential computing mean for cloud and edge deployments? For hardware manufacturers and software developers?
The State of Confidential Computing
“Confidential computing is really the method by which a cloud provider or host environment can tie their hands,” said Brent Hollingsworth, director of the EPYC software ecosystem at AMD. “They can stop themselves from seeing data at a fundamental level in ways they weren’t able to before.”
Formally, confidential computing is an initiative to ensure that cloud computing technology can secure data at the hardware level while it is in use. It relies on trusted execution environments: trusted enclaves within a central processing unit.
SEE: Recruitment Kit: Cloud Engineer (TechRepublic Premium)
For chipmakers and software producers, half the battle may be explaining the story behind this new capability to customers, said Anil Rao, vice president and general manager of systems architecture and engineering at Intel. Several panelists noted that confidential computing is, at present, difficult to market. The goal is for it to be essential, but for now it’s considered a plus.
Changing that requires asking technical questions that will also determine whether customers buy. Among the forward-looking questions posed by Vint Cerf, Chief Internet Evangelist for Google Cloud, are: “What happens if a CC server goes down? How do you recover? How do you transfer partial results, etc.? What about scaling? How do I get CC to work in a multi-core environment? Does it work with GPUs and TPUs? Are certifications available, and from whom and on what basis?”
Hollingsworth noted that the most exciting developments today come from large organizations that have the resources to rebuild their infrastructure around the idea of putting security first. As an example, he cited Project Zero, Google’s white-hat hacking team.
Confidential Computing at the Edge
Confidential computing is an advantage for edge applications because edge sites may not have the same physical protections as a data center. A cell tower with a server at its base, for example, is an edge scenario that requires special security. Unmanned or unmonitored facilities could also benefit.
“When you’re pushing your IP to the edge and want to make sure your IP is handled carefully, that’s a fantastic example,” Rao said. “We actually see some of our customers deploying confidential computing for scenarios of this nature, whether it’s things like Google Anthos or from their central location to their branch office.
“If it’s existing infrastructure in their branch, these are all fundamental ways the edge is an important part of confidential computing.”
Cerf pointed out that 6G and the mobile edge are also relevant here. While 6G design is still fluid, in general, the application level has a say in how the communication system works. It’s another example of built-in security, a philosophy that shares a great deal of common ground with confidential computing. Customers may wish to partition the application that controls the communication component.
What’s next for confidential computing?
What should we expect from confidential computing in the next five years? Cerf predicts that it will continue to be standardized, with confidential computing appearing in a variety of computing environments. However, that depends on the capabilities and choices of the chipset manufacturers.
SEE: Don’t Curb Your Excitement: Trends and Challenges in Edge Computing (TechRepublic)
Likewise, Rao envisions a world where confidential computing is the norm and the term “private cloud” becomes obsolete. It must become a baseline assumption that data in use is not visible to any outside observer, the panelists agreed.
What holds back confidential computing?
However, a variety of technical challenges remain before this happens. Not everything in the cloud is capable of confidential computing yet. Chipsets still need to be developed that provide both confidential computing and the specialization needed for domain-specific computing at the same time.
Nelly Porter, group product manager for Google Cloud, pointed out that issues such as live migration are still a problem for confidential computing. Attestation is also a concern, Rao said. Customers don’t want to be early adopters in general, he pointed out, and confidential computing is still at the early stage where only a few organizations are willing to make the first move.
The development of VM workloads needs to improve so that security is built in from the start, instead of organizations trying to bring an older system with a large attack surface up to that level of security, Hollingsworth said. Rao also pointed to Intel’s Project Amber, a third-party attestation service.
However, some large organizations are trying to be trendsetters. In October 2022, the Open Compute Project released Caliptra, an open chip-hardware specification made in collaboration with Microsoft, Google, and AMD. Its goal is to solve some of the problems that arise when confidential computing is not integrated from the start. A dedicated block of silicon establishes a root of trust through which data can be locked at the chip level, making it harder for attackers trying to breach the hardware.
Another area of concern and possibility is isolation. Cerf suggested that continuous attestation in fluctuating software environments might be possible due to the isolation provided by confidential computing; although this is, at the present stage, only speculation.
Attestation involves guaranteeing that a specific program is running on specific hardware or within a trusted execution environment. Rao agreed, noting that the purpose of confidential computing is not “to absolve bad app behavior” and that it could change how app developers think about integrating security.
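Conceptually, attestation boils down to comparing a signed measurement of the running code against an expected value. The following is a minimal, illustrative sketch only; the key, function names, and HMAC scheme are assumptions made for the example, whereas real TEEs (such as AMD SEV-SNP or Intel TDX) use hardware-rooted keys that never leave the chip and report far richer evidence:

```python
import hashlib
import hmac

# Hypothetical hardware-rooted key. In a real TEE this secret is fused
# into the silicon and is never exposed to software.
HARDWARE_KEY = b"example-root-of-trust-key"

def measure(workload: bytes) -> bytes:
    """Hash of the code loaded into the enclave (its 'measurement')."""
    return hashlib.sha256(workload).digest()

def attest(workload: bytes) -> tuple[bytes, bytes]:
    """The TEE reports its measurement plus a signature over it."""
    m = measure(workload)
    return m, hmac.new(HARDWARE_KEY, m, hashlib.sha256).digest()

def verify(expected_workload: bytes, measurement: bytes, signature: bytes) -> bool:
    """A relying party checks the signature, then checks the measurement
    matches the workload it expected to be running."""
    good_sig = hmac.compare_digest(
        hmac.new(HARDWARE_KEY, measurement, hashlib.sha256).digest(), signature
    )
    return good_sig and hmac.compare_digest(measure(expected_workload), measurement)

m, sig = attest(b"my-confidential-app-v1")
print(verify(b"my-confidential-app-v1", m, sig))  # True
print(verify(b"tampered-app", m, sig))            # False
```

Services like Project Amber aim to play the relying-party role above as a neutral third party, so that customers don't have to trust the cloud provider's own verdict about the cloud provider's hardware.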
Cerf pointed out that Google Cloud is also working on trusted I/O specifications, which, along with domain-specific computing, can help confidential computing become the norm. Porter is also looking forward to confidential computing making use of GPUs as accelerators, as more and more customers run not just on CPUs but with training and models that require accelerators.
Confidential computing is not yet widely adopted, but progress is being made toward integrating it into a variety of security strategies.
Want to learn more about confidential computing? Check out our guide or learn more about Project Amber and Ubuntu’s Confidential Computing Update.