
What do CDOs think? A new survey by TripleBlind offers insights

TripleBlind recently surveyed 150 chief data officers (CDOs) and other senior executives in charge of data management at healthcare and financial services organizations with annual revenues of at least $50 million and at least 250 employees—and the results were compelling. Expanding on last week’s blog, we wanted to dive deeper into the key findings of the survey. These include:

  1. 64% of respondents are concerned that employees at organizations with which they are collaborating will use data in a way not authorized in signed legal agreements, 
  2. 60% are concerned people at organizations with which they collaborate will use data that violates HIPAA and/or other data privacy regulations,
  3. 60% are concerned that the privacy-enhancing technology (PET) solution deployed by data collaboration partners will modify the data to make the results of analyses inaccurate.

TripleBlind’s Breakdown

  1. 64% of respondents are concerned that employees at organizations with which they are collaborating will use data in a way not authorized in signed legal agreements.

We agree there is cause for concern when collaborating with other organizations on sensitive data. There is no guarantee that a data user or organization won’t use the data in unauthorized ways. The TripleBlind solution allows only approved operations to be executed: we require the consent of all parties and facilitate auditable digital rights governing how data may be used by a counterparty.

On top of that, TripleBlind has developed a patented advancement of Secure Multi-party Computation (SMPC) that is faster and more practically usable than other SMPC implementations. Model inference using our SMPC offers the strongest level of protection, both for the data and for the model. No recoverable version of the data or the model is ever exchanged between the parties. Instead, a one-way transformation is applied to partial shares of the model and the data, which allows computations to be performed in an irreversible SMPC-transformed space. No encryption key exists that can be compromised, and SMPC is mathematically proven to be quantum safe, meaning that a bad actor with unlimited computational resources would be unable to compromise the system.
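To make the secret-sharing idea above concrete, here is a minimal, textbook sketch of additive secret sharing, a classic SMPC building block. It is purely illustrative: the modulus, share counts, and function names are our own assumptions, and this is not TripleBlind’s patented implementation.

```python
import secrets

# Textbook additive secret sharing (an SMPC building block).
# Illustrative only -- not TripleBlind's patented SMPC scheme.
P = 2**61 - 1  # a public prime modulus (chosen here for illustration)

def share(value, n_parties=2):
    """Split an integer into n random shares that sum to value mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine shares; any subset smaller than all of them reveals nothing."""
    return sum(shares) % P

# Two parties hold private inputs; each individual share is uniformly
# random, so no single share reveals anything about the input.
alice_shares = share(42)
bob_shares = share(100)

# Each party locally adds the shares it holds; combining the partial
# sums yields the joint result without exchanging raw inputs.
partial = [(a + b) % P for a, b in zip(alice_shares, bob_shares)]
assert reconstruct(partial) == 142
```

Because addition distributes over the shares, the parties compute the correct sum while only ever exchanging values that look like random noise on their own.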

  2. 60% are concerned people at organizations with which they collaborate will use data that violates HIPAA and/or other data privacy regulations.

With GDPR, CCPA, HIPAA, PDPA, and many more data privacy and data residency standards, it’s difficult to know exactly what is compliant, especially when collaborating internationally. And the penalties for non-compliance can be steep: violations of HIPAA can cost an organization as much as $25,000 per violation category per year.

The TripleBlind solution solves for regulatory compliance. With our privacy-enhancing technology, regulated data can be used without violating GDPR or HIPAA. With TripleBlind, data and algorithms are de-identified, one-way encrypted, never decrypted, and only accessible via the API. 
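As a rough illustration of what a one-way, never-decrypted transformation looks like in general, here is a salted-hash pseudonymization sketch. This is a generic privacy-enhancing building block, not TripleBlind’s actual API or encryption scheme; the function and variable names are hypothetical.

```python
import hashlib
import secrets

# Illustrative one-way pseudonymization via salted hashing.
# A generic PET building block -- not TripleBlind's product or API.
SALT = secrets.token_bytes(16)  # held by the data owner, never shared

def pseudonymize(identifier: str) -> str:
    """Map an identifier to a stable, non-reversible token."""
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()

token = pseudonymize("patient-12345")

# The same identifier always maps to the same token, so records can
# still be linked consistently within the transformed dataset...
assert token == pseudonymize("patient-12345")
# ...but the hash cannot be reversed to recover the raw identifier,
# and without the salt the token cannot even be re-derived.
```

The key property shared with the approach described above: the transformation is never undone, so there is no decryption step (and no decryption key) for an attacker to compromise.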

By eliminating decryption, TripleBlind enables:

  • Use of all data without the risk of re-identification
  • No exposure of raw sensitive data, while still using real data for AI modeling and training
  • Generalization of algorithms on varied datasets to reduce bias
  • Normalization and feature engineering of data without breaking privacy
  • No hardware dependencies or movement of data outside a firewall

TripleBlind eliminates costly compliance measures such as manual data anonymization, while simultaneously enabling third-party access so companies can use data without decryption, replication, or visibility into raw data.

  3. 60% are concerned that the privacy-enhancing technology (PET) solution deployed by data collaboration partners will modify the data to make the results of analyses inaccurate.

Some PETs, such as synthetic data, require personally identifiable information (PII) to be removed. Stripping records of all PII often results in the loss of valuable context, which reduces utility in several ways: it makes it harder to match two datasets using common identifiers, reduces the accuracy of computational tasks, and eliminates potentially correlative features. In other words, the data is modified.
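The loss-of-linkage problem described above can be shown with a toy example (the datasets and field names here are hypothetical, invented for illustration): once a common identifier is stripped, two datasets can no longer be matched.

```python
# Toy illustration of how stripping identifiers breaks record linkage.
# All data and field names are hypothetical.
hospital = [
    {"patient_id": "P001", "age": 54, "diagnosis": "hypertension"},
    {"patient_id": "P002", "age": 31, "diagnosis": "asthma"},
]
insurer = [
    {"patient_id": "P001", "claims": 3},
    {"patient_id": "P002", "claims": 1},
]

def join(left, right, key):
    """Inner-join two lists of records on a shared key field."""
    index = {r[key]: r for r in right if key in r}
    return [{**l, **index[l[key]]} for l in left if l.get(key) in index]

# With the identifier intact, the datasets link cleanly: 2 matched rows.
assert len(join(hospital, insurer, "patient_id")) == 2

# After de-identification, the common key is gone and nothing matches.
anon_hospital = [
    {k: v for k, v in row.items() if k != "patient_id"} for row in hospital
]
assert join(anon_hospital, insurer, "patient_id") == []
```

The anonymized records still carry age and diagnosis, but without the shared key there is no way to enrich them with the insurer’s claims data, which is exactly the utility loss the survey respondents worry about.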

However, not all PETs are the same. Because our solution operates on the original data, there are no concerns about losing key data points or unique outliers. We provide access to a larger number of more diverse datasets, enabling superior analysis and training of machine learning models.

Reach out to us for more details about the survey, or give us a call to chat about your data privacy concerns.