
Q&A with TripleBlind’s VP of Partnerships & Marketing, Chris Barnett

We recently sat down with Chris Barnett, TripleBlind’s VP of Partnerships & Marketing, to discuss his thoughts on joining TripleBlind, barriers to data collaboration, and why privacy is important.

Chris has more than twenty years of experience as an executive team member at technology startups. He was EVP of Global Sales & Marketing at EyeVerify, where he formed partnerships with AirWatch, Good Technology, Samsung, Wells Fargo, Alipay, and more than 50 other financial institutions prior to Alipay’s $100M+ acquisition of EyeVerify in September 2016.

Prior to that, he was EVP of Markets at Vlingo, where he grew quarterly revenues from $200K to $6M, leading to Vlingo’s $100M+ acquisition by Nuance. Before that he was SVP of Sales at Handmark, Director of Digital Products at Rand McNally, and VP/GM at Monster.com.


Can you tell us how you got started in this space? Is there a particular story that inspired you to pursue a career in privacy-enhancing computation?

During my time at EyeVerify, I worked with a wide range of companies around the world. It became clear early on that there were significant problems whenever companies wanted to collaborate with their data.

Data collaboration among organizations continues to be a core business strategy for many enterprises. However, today’s privacy-preserving technologies continue to fall short.

Current solutions are hard to enforce, slow the enterprise’s compute performance, depend on specialized hardware, reduce accuracy, or are otherwise suboptimal.

I wanted to find a way to protect the value of intellectual property without limiting its enormous potential in health, financial, and scientific applications.


What do we need to protect the integrity and privacy of data?

To protect the privacy and integrity of data, we need state-of-the-art crypto-systems. The systems commonly used today, such as RSA, offer only conditional security: they rest on the assumed hardness of a mathematical problem, in RSA’s case the difficulty of factoring a very large number into its prime factors. That hardness is widely believed but has never been proven.

Crypto-systems that use information-theoretic security, also known as unconditional security, are secure even against adversaries with unlimited time and computing resources. That is a far higher standard than any approach whose security rests on computational assumptions.
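
To make the distinction concrete, here is a minimal sketch of additive secret sharing, one well-known construction with information-theoretic security (a toy illustration, not TripleBlind’s implementation). Any subset of fewer than all the shares is uniformly random, so it reveals nothing about the secret no matter how much computing power an attacker has.

```python
import secrets

PRIME = 2**61 - 1  # Mersenne prime used as the field modulus for this toy example


def share(secret: int, n_shares: int = 3) -> list[int]:
    """Split `secret` into additive shares modulo PRIME.

    Any n_shares - 1 of the shares are uniformly random values,
    so they carry zero information about the secret.
    """
    parts = [secrets.randbelow(PRIME) for _ in range(n_shares - 1)]
    parts.append((secret - sum(parts)) % PRIME)
    return parts


def reconstruct(shares: list[int]) -> int:
    """Only the sum of *all* shares recovers the secret."""
    return sum(shares) % PRIME


if __name__ == "__main__":
    salary = 123_456
    parts = share(salary)
    print(parts)               # three random-looking numbers
    print(reconstruct(parts))  # 123456
```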


What are the biggest barriers and flaws when it comes to privacy?

The legal/contractual approach to privacy falls short for several reasons.

It still leaves open the ability to misuse the data, intentionally or otherwise, both internally and externally. Breaking compliance requires just one incompetent or malicious actor anywhere in the organization, like a major credit bureau using “admin/admin” as the credentials for its primary database, or a major credit card issuer keeping all of its credit pull information in an unsecured S3 bucket.

The custodians or owners of the data cannot consent to every operation performed on that data. While they might have that option on paper, there is no way to enforce it in practice. Enforcement relies on the right organizational processes and structures being in place, and those are fallible if they exist at all. And if the privacy policy stands in the way of a particular operation, the data custodian can unilaterally change that policy out from under the actual data owner.


Are there any foreseeable, critical threats on the horizon that you think companies need to start preparing for?

There are now more than 100 data privacy and data residency regulations worldwide, including in several U.S. states. Complying with all of them while still enabling safe and secure data collaboration is a daunting task, one that could impede both international and intranational commerce.


How do you allow institutions to work together even when they are in different regulatory regimes? 

I believe there are a few misconceptions when it comes to international regulatory policies. A lot of people think that GDPR, HIPAA, PIPEDA, and the dozens of other privacy laws around the world are radically different, and the common misconception is that these regulations prevent organizations from doing anything with the data. The reality is more nuanced: some regulations say you cannot tie data back to an individual person, others say you cannot move data out of the country where it originated, and so on.

Traditionally, PHI, or personal health information, has either been too difficult to de-identify or has lost most of its value once de-identified. That is where this misconception comes from. There are privacy-enhancing solutions that work within these rules so that you can actually use the data without ever being able to tie it back to an individual person.
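
As a deliberately simplified sketch of that idea (not a description of any specific product), imagine a data holder that answers queries only with de-identified aggregates and suppresses results for very small cohorts; the analyst gets an answer, never an identifiable record. Real privacy-enhancing computation goes much further, with techniques such as secret sharing and encrypted computation, but the contract with the data owner is similar.

```python
from statistics import mean

# Hypothetical patient records held by one hospital (toy data).
records = [
    {"name": "Alice", "ssn": "xxx-xx-0001", "age": 54, "a1c": 7.9},
    {"name": "Bob",   "ssn": "xxx-xx-0002", "age": 61, "a1c": 6.4},
    {"name": "Cara",  "ssn": "xxx-xx-0003", "age": 47, "a1c": 8.2},
]

MIN_COHORT = 3  # refuse to answer queries over very small groups


def aggregate_a1c(rows):
    """Return only a population-level statistic, never row-level data.

    Direct identifiers (name, SSN) are never read, and answers for
    cohorts smaller than MIN_COHORT are suppressed to reduce the risk
    of re-identification.
    """
    if len(rows) < MIN_COHORT:
        return None
    return {"n": len(rows), "mean_a1c": round(mean(r["a1c"] for r in rows), 2)}


print(aggregate_a1c(records))  # {'n': 3, 'mean_a1c': 7.5}
```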


Why hasn’t privacy enhancing computation achieved widespread adoption by enterprises?

Gartner recently called privacy-enhancing computation (PEC) a technology trend that executive leaders should consider in 2022 to ensure growth, greater digitalization, and operational efficiency across their organizations.

While this is a fairly new technology sector, there are solutions built on proven technologies that companies are already turning to as data privacy laws around the world take priority and become stricter. The demand for this kind of solution will only increase, but there is still hesitancy because of its newness. With respected analyst firms like Gartner calling attention to the space, I believe awareness will spread and adoption of PEC will widen.


What needs to be done now to better prepare for the future? 

We always need to think about the future when evaluating current solutions. What works now may present problems in a couple of years, and constantly having to pivot takes time away from the business. Technologies already exist that can protect data for years to come.