Overview

Currently, it is common practice to encrypt data when it is stored or transmitted, but encryption of data in use, specifically in memory, is often neglected. Conventional computing infrastructure lacks robust mechanisms to safeguard data and code while they are actively being used. This is a problem for organizations handling sensitive information such as Personally Identifiable Information (PII), financial data, or health records, which must address threats to the confidentiality and integrity of both the application and the data residing in system memory. Confidential computing protects data in use by performing the computation in a hardware-based, attested Trusted Execution Environment (TEE). By establishing these secure, isolated environments, organizations can prevent unauthorized access to, or alteration of, applications and data while they are in use, significantly strengthening their overall security posture when working with sensitive and regulated data.

Introduction

Computing encompasses three distinct states for data: during transit, at rest, and in use. When data is actively moving through a network, it is considered "in transit." Data that is stored and not actively accessed is referred to as "at rest." Lastly, data being processed or utilized is categorized as "in use." In our modern era, where the storage, consumption, and sharing of sensitive data have become commonplace, safeguarding such data in all its states has become increasingly crucial. This pertains to a wide range of sensitive information, including credit card data, medical records, firewall configurations, and even geolocation data. Cryptography is now commonly deployed to provide both data confidentiality (stopping unauthorized viewing) and data integrity (preventing or detecting unauthorized changes). While techniques to protect data in transit and at rest are now commonly deployed, the third state - protecting data in use - is the new frontier.
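To make the first two states concrete, the following sketch (illustrative only; it assumes the third-party Python "cryptography" package is installed) encrypts a record so that it is protected at rest and in transit. Note that the record must still be decrypted into ordinary memory before it can be processed, which is precisely the data-in-use gap that confidential computing addresses.

from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()              # in practice, kept in a KMS or HSM
cipher = Fernet(key)

record = b'{"card_number": "4111-1111-1111-1111"}'

# Data at rest / in transit: stored or sent only in encrypted form.
ciphertext = cipher.encrypt(record)

# Data in use: must be decrypted into ordinary memory to be processed,
# where a conventional platform offers no hardware-backed protection.
plaintext = cipher.decrypt(ciphertext)
print(plaintext.decode())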

Security risks for unprotected data “in use”

As the protections applied to data in transit and at rest increasingly thwart threat vectors against network and storage devices, attackers have shifted to targeting data in use. The industry has witnessed several high-profile memory-scraping incidents, such as the Target breach, and CPU side-channel attacks, which have dramatically increased attention to this third state, as well as several high-profile malware-injection attacks, such as the Triton attack and the Ukraine power grid attack.

Advanced malware protection is an intelligence-powered, integrated, enterprise-class malware analysis and protection solution. It gives security teams the deep visibility and control required to quickly detect attacks and to contain and control malware before it causes damage. According to analysis conducted by Data Bridge Market Research, the advanced malware protection market is expected to reach USD 8,901.17 million by 2028, growing at a compound annual growth rate of 14.30% over the forecast period of 2021 to 2028. The Data Bridge Market Research report on advanced malware protection provides analysis and insights into the various factors expected to be prevalent throughout the forecast period and their impact on the market's growth.

https://www.databridgemarketresearch.com/reports/global-advanced-malware-protection-market

As the amount of data stored and processed on mobile, edge, and IoT devices continues to grow, ensuring the security of data and applications during execution becomes more critical. These devices often operate in remote and challenging environments, making it difficult to maintain their security. Additionally, considering the personal nature of the information stored on mobile devices, manufacturers and mobile operating system providers must demonstrate that personal data is protected and remains inaccessible to device vendors and third parties during sharing and processing. These protections must adhere to regulatory requirements. Even in situations where you have control over your infrastructure, safeguarding your most sensitive data while it is being used is an essential component of a robust defense-in-depth strategy.

Confidential Computing leverages hardware-based Trusted Execution Environments (TEEs) to safeguard data during its active use. By adopting Confidential Computing, we can effectively mitigate many of the threats discussed earlier. A Trusted Execution Environment (TEE) is an environment that provides a high level of assurance of data integrity, data confidentiality, and code integrity. Utilizing hardware-backed techniques, a TEE provides enhanced security guarantees for executing code and protecting data within the environment.

In the context of confidential computing, unauthorized entities encompass other applications on the host, the host operating system, hypervisor, system administrators, service providers, and the infrastructure owner, along with anyone who has physical access to the hardware. Data confidentiality ensures that these unauthorized entities are unable to access data while it is being utilized within the Trusted Execution Environment (TEE). Data integrity protects against unauthorized alterations to data during processing by entities outside the TEE. Code integrity guarantees that unauthorized entities cannot replace or modify the code within the TEE. Collectively, these attributes not only assure data confidentiality but also ensure the correctness of computations, instilling trust in the computation results. This level of assurance is often absent in approaches that do not utilize a hardware-based TEE.
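To illustrate how attestation underpins that trust, the sketch below shows the kind of check a relying party performs before releasing secrets to a TEE: verify that the attestation report was signed by hardware it trusts and that the measured code matches an expected value. This is a simplified, hypothetical flow using only Python's standard library; real schemes (e.g., Intel SGX/TDX, AMD SEV-SNP) use hardware-rooted certificate chains and asymmetric signatures rather than a shared HMAC key.

import hashlib
import hmac
import os

# Measurement (hash) of the enclave code the relying party expects to be running.
EXPECTED_MEASUREMENT = hashlib.sha256(b"trusted-enclave-code-v1.2").hexdigest()

def sign_report(measurement: str, nonce: bytes, hw_key: bytes) -> dict:
    # Stands in for the hardware producing a signed attestation report.
    sig = hmac.new(hw_key, measurement.encode() + nonce, hashlib.sha256).hexdigest()
    return {"measurement": measurement, "nonce": nonce, "signature": sig}

def verify_report(report: dict, hw_key: bytes) -> bool:
    expected = hmac.new(hw_key, report["measurement"].encode() + report["nonce"],
                        hashlib.sha256).hexdigest()
    signature_ok = hmac.compare_digest(expected, report["signature"])
    code_ok = report["measurement"] == EXPECTED_MEASUREMENT
    return signature_ok and code_ok   # only then release secrets to the TEE

hw_key = os.urandom(32)
report = sign_report(EXPECTED_MEASUREMENT, nonce=os.urandom(16), hw_key=hw_key)
print(verify_report(report, hw_key))  # True: expected code, valid signature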

The following table compares a typical TEE implementation with typical implementations of two other emerging classes of solutions that protect data in use: Homomorphic Encryption (HE) and Trusted Platform Modules (TPMs).

Table 1 - Comparison of security properties of Confidential Computing vs. HE and TPMs

Property               | HW TEE               | Homomorphic Encryption        | TPM
Data Integrity         | Y                    | Y (Subject to Code Integrity) | Keys Only
Data Confidentiality   | Y                    | Y                             | Keys Only
Code Integrity         | Y                    | No                            | Y
Code Confidentiality   | Y (May Require Work) | No                            | Y
Authenticated Launch   | Varies               | No                            | No
Programmability        | Y                    | Partial (“circuits”)          | No
Attestability          | Y                    | No                            | Y
Recoverability         | Y                    | No                            | Y

Trusted Execution Environments (TEEs)

According to the CCC (following common industry practices), a Trusted Execution Environment (TEE) is characterized by three essential properties: data confidentiality, data integrity, and code integrity.

Fig - Trusted Execution Environment (TEE) Characteristics


Unauthorized entities encompass various actors such as other applications on the host, the host operating system and hypervisor, system administrators, service providers, the infrastructure owner, or anyone else who physically accesses the hardware. These properties collectively assure both data confidentiality and the accuracy of computations performed within the TEE, thereby instilling trust in the computation results.

Additionally, depending on the specific TEE implementation, it may offer further features such as code confidentiality, authenticated launch, programmability, attestability, and recoverability (see Table 1).

Hardware-based TEEs leverage hardware-backed techniques to provide heightened security assurances for code execution and data protection within the TEE. This level of assurance is often absent in approaches that do not rely on a hardware-based TEE.

Benefits of Confidential Computing

Confidential computing offers numerous advantages to organizations concerned with data privacy and security.

Fig - Benefits of Confidential Computing


Implementing Confidential Computing

Implementing confidential computing requires careful planning and consideration.

The following table shows how scalability in various metrics compares between classical computing, computing using a typical hardware-based TEE, and Homomorphic Encryption. As with the security comparison, the actual answers may vary by vendor, model, or algorithm.

Table 2 - Comparison of scalability properties of native computing, Confidential Computing, and HE

Properties                                | Native | HW TEE      | Homomorphic Encryption
Data Size Limits                          | High   | Medium      | Low
Computation Speed                         | High   | High-Medium | Low
Scale Out Across Machines                 | Yes    | More Work   | Yes
Ability to Combine Data Across Sets (MPC) | Yes    | Yes         | Very Limited

Challenges in Implementation

While confidential computing brings significant benefits, organizations must address several challenges when implementing it.

Key strategies

Intel Announces New Confidential Computing Initiatives. Intel announced a number of new confidential computing initiatives on January 25, 2023. These initiatives include:

Google Announces Confidential Cloud Platform. Google announced the general availability of its Confidential Cloud Platform on February 1, 2023. The Confidential Cloud Platform is a suite of services that help organizations protect sensitive data in the cloud. These services include:

Microsoft Announces Confidential Computing for Azure. Microsoft announced that it is bringing confidential computing to Azure on February 3, 2023. Confidential computing for Azure is a set of services that help organizations protect sensitive data in the cloud. These services include:

These are a few examples of recently announced strategic initiatives related to confidential computing. They are designed to help organizations adopt confidential computing technologies and protect sensitive data in the cloud.

Real-World Use Cases

Confidential computing finds practical applications across various industries, enabling organizations to protect sensitive data and ensure privacy.

Fig - Real World Use Cases


Keys, Secrets, Credentials and Tokens Storage and Processing:

Cryptographic keys, secrets, credentials and tokens are the “keys to the kingdom” for organizations responsible for protecting sensitive data. Traditionally, on-premises hardware security modules (HSMs) were used to comply with security standards and ensure the security of these assets. However, the proprietary nature of traditional HSMs limited their scalability and compatibility with cloud and edge computing environments, resulting in increased costs and deployment challenges. Confidential computing addresses these limitations by utilizing standardized compute infrastructure available on-premises, in public/hybrid clouds, and even at the network edge for IoT use cases. Independent software vendors (ISVs) and large organizations have already embraced confidential computing to securely store and process cryptographic and secret information. Key management applications leverage hardware-based trusted execution environments (TEEs) to store and process these assets, ensuring data confidentiality, integrity, and code integrity. The security achieved through confidential computing is comparable to traditional HSMs, providing a more scalable and cost-effective solution for storing and processing sensitive information.
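A minimal sketch of this key-management pattern (illustrative, not any particular vendor's product) is shown below: the signing key is generated and held inside a single trusted component that exposes only a sign/verify interface, so callers receive signatures but never the key material. In a confidential-computing deployment, that component would run inside an attested hardware-based TEE.

import hashlib
import hmac
import os

class EnclaveSigner:
    """Stands in for a key-management service running inside a hardware-based TEE."""

    def __init__(self):
        self._key = os.urandom(32)    # generated and held inside the trusted boundary

    def sign(self, message: bytes) -> str:
        # Callers receive only the signature, never the key material.
        return hmac.new(self._key, message, hashlib.sha256).hexdigest()

    def verify(self, message: bytes, signature: str) -> bool:
        return hmac.compare_digest(self.sign(message), signature)

signer = EnclaveSigner()
tag = signer.sign(b"transfer:account=42;amount=100")
print(signer.verify(b"transfer:account=42;amount=100", tag))  # True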

Public Cloud Use Cases:

In traditional public cloud environments, trust is placed in multiple layers within the cloud provider's infrastructure. Confidential Computing introduces additional protection guarantees by reducing the number of layers that need to be trusted by end-users. With hardware-based Trusted Execution Environments (TEEs) safeguarding applications and data in use, unauthorized actors, even with physical or privileged access, face significant challenges in accessing protected application code and data. Confidential Computing aims to remove the cloud provider from the Trusted Computing Base, enabling workloads that were previously limited by security concerns or compliance requirements to be securely migrated to the public cloud.

Multi-Party Computing

As new compute paradigms emerge to enable data and processing power sharing across multiple parties, ensuring the confidentiality and integrity of sensitive or regulated data becomes crucial. Confidential Computing provides a solution for organizations to securely share and analyze data without compromising its privacy, even across untrusted platforms. Private multi-party analytics can be applied in various domains such as financial services, healthcare, and government to combine and analyze private data without exposing underlying data or machine learning models. With Confidential Computing, data remains protected against tampering and compromise, even from insider threats, ensuring secure collaboration and unlocking the potential of global data sharing while mitigating security, privacy, and regulatory risks.
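A toy sketch of private multi-party analytics, assuming the Python "cryptography" package and a TEE-held key provisioned after attestation: each party submits data encrypted to that key, plaintext exists only inside the aggregation function standing in for the TEE, and only the aggregate statistic is released.

from statistics import mean

from cryptography.fernet import Fernet  # pip install cryptography

tee_key = Fernet.generate_key()   # provisioned to the TEE only after attestation
tee = Fernet(tee_key)

# Each party submits encrypted records; no party sees the others' plaintext.
party_a = [tee.encrypt(str(v).encode()) for v in (120, 140, 95)]
party_b = [tee.encrypt(str(v).encode()) for v in (210, 180)]

def aggregate_inside_tee(ciphertexts):
    # Plaintext values exist only inside this function (the stand-in for the TEE).
    values = [float(tee.decrypt(c).decode()) for c in ciphertexts]
    return mean(values)           # only the aggregate statistic is released

print(aggregate_inside_tee(party_a + party_b))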

Blockchain

Blockchains provide an immutable ledger for recording and validating transactions without the need for a centralized authority. While they offer transparency and data consistency, storing sensitive data on the immutable blockchain poses privacy concerns. Confidential Computing can enhance blockchain implementations by leveraging hardware-based Trusted Execution Environments (TEEs). TEEs enable users to execute smart contracts securely, ensuring data privacy, scalability, and verification optimization. TEE-based attestation services provide reliability proof for transactions, eliminating the need for each participant to independently validate historical data. Additionally, confidential computing addresses computational and communication inefficiencies associated with consensus protocols in blockchain systems.
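One common way to reconcile immutability with privacy, sketched below under simplified assumptions, is to record only a hash commitment of the confidential payload on-chain while the payload itself is processed inside a TEE; anyone can later check that off-chain data matches the recorded commitment without the data ever being stored on the ledger.

import hashlib
import json

def commit(payload: bytes) -> str:
    # On-chain commitment to the confidential payload, not the payload itself.
    return hashlib.sha256(payload).hexdigest()

chain = [{"index": 0, "prev": "0" * 64, "commitment": commit(b"genesis")}]

def append_block(payload: bytes) -> None:
    prev_hash = hashlib.sha256(json.dumps(chain[-1], sort_keys=True).encode()).hexdigest()
    chain.append({"index": len(chain), "prev": prev_hash, "commitment": commit(payload)})

# The confidential transaction is processed inside a TEE; only its hash is recorded.
confidential_tx = b'{"buyer": "A", "seller": "B", "price": 1000}'
append_block(confidential_tx)

# Anyone can verify the off-chain payload against the immutable ledger entry.
print(chain[-1]["commitment"] == commit(confidential_tx))  # True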

Mobile and Personal Computing Devices

Confidential computing on client devices offers use cases that provide assurances of data privacy and integrity. Application developers and device manufacturers can ensure that personal data is not observable during sharing or processing, removing liability from manufacturers. Trusted Execution Environments (TEEs) enable formal verification of functional correctness, allowing developers to prove that user data has not left the device. For instance, continuous authentication implementations can operate within a TEE to identify users without exposing sensitive biometrics or behavioural data. Similarly, decentralized on-device model training can improve models and share improvements without leaking training data, providing user-controlled policy and constraints through mutual attestation in a hardware-based TEE.
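The decentralized on-device training pattern can be sketched as follows (a deliberately simplified, framework-free example): each device computes a model update from its local data and shares only that update with an aggregator, which in a confidential-computing deployment would itself run inside an attested TEE, so raw training data never leaves the device.

def local_update(w, local_data, lr=0.1):
    # Toy gradient step for a one-parameter model y = w * x (squared-error loss).
    grad = sum(2 * (w * x - y) * x for x, y in local_data) / len(local_data)
    return w - lr * grad          # only this updated parameter is shared, never the data

def aggregate(updates):
    # Runs inside an attested TEE in a confidential-computing deployment.
    return sum(updates) / len(updates)

devices = [[(1.0, 2.1), (2.0, 3.9)], [(1.5, 3.2)], [(3.0, 6.1)]]
global_w = 0.0
for _ in range(20):
    global_w = aggregate([local_update(global_w, data) for data in devices])
print(round(global_w, 2))         # converges toward ~2 without pooling raw data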

Edge and IoT Use Cases:

Confidential computing finds valuable use cases in edge and IoT environments where data privacy and security are paramount. For instance, in scenarios like local search and filtering within home routers for DDoS detection, a confidential computing environment can protect sensitive user behavior inferred from TCP/IP packet metadata. Other examples include edge confidential machine learning processing, such as video metadata generation for reducing latency, CCTV camera surveillance with templates of persons of interest, and on-device training models. Confidential computing technology also helps mitigate attacks that exploit physical access to devices in environments where untrusted parties may have physical accessibility.
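As a rough illustration of the home-router example (a hypothetical, simplified heuristic, not a production detector), the sketch below counts per-source packet rates inside the protected environment and exports only a coarse alert, so the sensitive per-device metadata never crosses the TEE boundary.

from collections import Counter

THRESHOLD = 1000   # packets per source per window (illustrative value)

def analyze_window(packet_sources):
    # Sensitive per-device behaviour stays inside the protected environment.
    counts = Counter(packet_sources)
    suspects = [src for src, n in counts.items() if n > THRESHOLD]
    # Only this coarse summary crosses the TEE boundary.
    return {"ddos_suspected": bool(suspects), "suspect_count": len(suspects)}

window = ["10.0.0.7"] * 1500 + ["10.0.0.3"] * 40
print(analyze_window(window))   # {'ddos_suspected': True, 'suspect_count': 1}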

A blockchain is a collection of data records, a technological database, linked together using cryptography. Globally expanding cross-border trade operations are predicted to boost demand for the technology. Data Bridge Market Research analyses that the blockchain market, valued at USD 10.02 billion in 2022, will reach USD 766.10 billion by 2030, growing at a CAGR of 71.96% during the forecast period of 2023 to 2030. The legal acceptance of cryptocurrencies motivates companies and investors to increase their blockchain technology investments. Additionally, blockchain technology is anticipated to become more effective and efficient in companies' operations in the near term. DeFi is a new blockchain-based financial technology that lessens banks' control over financial services and money. Throughout the projection period, market growth is expected to be driven by increasing strategic initiatives in the decentralized finance space.

Future Trends and Directions

The field of confidential computing is rapidly evolving, and several future trends and directions can be identified.

Conclusion

Confidential computing offers a ground-breaking approach to protecting sensitive data during processing in untrusted environments. By combining principles such as data isolation, secure enclaves, attestation, encryption, and minimizing trust assumptions, organizations can ensure the confidentiality and integrity of their data. Despite challenges related to performance, key management, legacy systems, and application portability, the benefits of implementing confidential computing are substantial. Real-world use cases demonstrate its value in healthcare, finance, edge computing, and cloud computing. By following best practices and considering future trends, organizations can embrace confidential computing to safeguard their sensitive data and preserve privacy in an increasingly interconnected world.


DBMR has served more than 40% of Fortune 500 firms internationally and has a network of more than 5,000 clients. Our team would be happy to help you with your queries. Visit https://www.databridgemarketresearch.com/contact
