Frances is CEO & Founder of Anonybit. She believes that the strategy of building higher and higher walls around biometric systems is flawed because it still leaves data for fraudsters to find. Through Anonybit, Frances takes a different approach, focusing on the decentralization of data as the key to maintaining personal privacy, data protection and digital security.
Frances sat down for a chat with Cindy White, CMO at Mitek, to talk more about the importance of biometrics and Anonybit.
Cindy: You’ve talked before about it being time to re-think how our identities and personal assets are managed and stored. Can you explain what you see as the problems with current approaches to storing biometrics?
Frances: When COVID happened, and we saw digital onboarding go through the roof, I was involved at the time with another fraud prevention company. That was my first foray into financial services; before that, it was all government.
What I found shocking was that all of the data that was being collected and captured for onboarding was being discarded because of data protection and privacy concerns. And when you peel back why we see so much fraud happening, in my opinion, it all comes back to that: we're losing the anchor of trust that we can establish at onboarding, which we could be using throughout the user journey. And I just found that completely shocking.
When I started trying to understand why, it came back to privacy; people were afraid of storing the very thing that they could use to protect themselves in the user journey.
A lot of the processes are just checking the box for KYC and AML, and then the data gets flushed out. So if you want to do a fraud investigation downstream, if you want to create a blocklist of people who shouldn't be there, if you want to prevent synthetic identity fraud… you're relying on weak signals - things like devices, locations and other signals that fraudsters know how to circumvent.
What happens today? You onboard, and then the data gets flushed. The user creates a username and password for their account. Even if they use their face ID on their device, it doesn't matter, now there's a password that anybody can use to get in from any other device. Or, the attacker calls a call centre, says what their dog ate for breakfast, and is able to re-establish access to the victim's account.
The most powerful thing that we have at our disposal, we're not using because we're afraid of storing it. It's constantly a trade-off between privacy and security - and as a result, we're losing the battle.
Cindy: What are you doing through Anonybit to change the way biometric systems are used?
Frances: Anonybit was founded to address this core thing - to unify the user journey and prevent fraud while storing the information that is collected in the digital onboarding process.
Anonybit is a decentralised infrastructure that protects personal data and digital assets. We eliminate the need to store biometrics. We eliminate the trade-offs that are typically made between privacy and security because we don't rely on a biometric that’s stored on any local device either.
Cindy: How does it work?
Frances: What we do is we shard biometric data like selfies. We'll take the face, for example, and we will break it up into what we call anonymized bits - that's why the name of the company is Anonybit. We distribute these bits into a multi-party cloud environment where they're stored. And they're never retrieved, even for matching.
When we want to do a match, we break down the new sample in the same way, and then we compare each new bit against its stored counterpart in a separate environment. The pieces get matched separately in parallel, and only the computations come back to verify an identity. As long as a majority of the pieces come back as a match, we can say that you are who you claim to be. And this is how we do it without any centralised honeypots.
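Anonybit's actual scheme is proprietary, but the shard-and-vote idea Frances describes can be sketched in a few lines. This is purely illustrative, assuming a biometric reduced to a numeric embedding vector, disjoint segments as the "anonymized bits", cosine similarity per segment, and a majority vote; all function names and thresholds here are invented for the example.

```python
import numpy as np

def shard(embedding, n_parties):
    """Split an embedding into disjoint segments ("anonymized bits"),
    one segment per storage party. Illustrative only."""
    return np.array_split(embedding, n_parties)

def match(probe, stored_shards, threshold=0.9):
    """Shard the probe the same way, compare each segment to its stored
    counterpart in parallel, and accept only if a majority of segments
    match. Only the per-segment decisions leave the parties."""
    probe_shards = np.array_split(probe, len(stored_shards))
    votes = 0
    for p, s in zip(probe_shards, stored_shards):
        # cosine similarity between corresponding segments
        sim = np.dot(p, s) / (np.linalg.norm(p) * np.linalg.norm(s))
        votes += sim >= threshold
    return votes > len(stored_shards) / 2

rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)          # stand-in for a face embedding
shards = shard(enrolled, n_parties=5)    # distributed across 5 parties

assert match(enrolled, shards)           # genuine sample: majority match
assert not match(-enrolled, shards)      # impostor sample: rejected
```

The point of the sketch is that no single party ever holds the full template, and only match/no-match computations are combined, which is the "no centralised honeypot" property Frances refers to.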
Cindy: Can you explain what you mean when you talk about ‘the circle of identity’?
Frances: The circle of identity is a concept I coined that links the entire user journey, from digital onboarding to authentication to account recovery. Today, that circle of identity is broken, because we are not storing the information that's collected at onboarding.
What Anonybit enables us to do is take the data collected at onboarding, store it in the decentralised cloud, and run queries against it - to prevent duplicates, identify people on watch lists, block lists, and synthetic identities, things like that.
We can bind and match any attributes you want to the biometric profile. This is very different to connecting it to a user ID. It’s connected to your biometric, which is the link to your physical identity. We can bind a device, we can bind a passkey, we can bind a token for a credential, we can bind a credit card number… whatever you want. Any attribute can be bound to your biometric record.

Now you have the basis for two-factor authentication and strong authentication downstream - whether for login, for step-up, for verifying a transaction, or for what we call risky transactions, like adding a new payee or sending money over a certain amount. You can even use this at the call centre, in the branch, or on a chat channel: you would have a consistent, persistent biometric experience using the exact same profile that was established at onboarding.
Today, this is not happening and this is the exact gap that attackers exploit.
It boils down to: how do you take that data and stretch it across the user lifecycle? For the first time, we're able to unify the entire identity lifecycle - from onboarding to authentication to account recovery - to deliver a comprehensive solution to the market.
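The binding idea Frances describes - attributes anchored to the biometric profile rather than to a username - can be sketched as a simple record keyed by a reference to the sharded template. Everything here (class name, fields, example values) is hypothetical, invented for illustration; it is not Anonybit's API.

```python
from dataclasses import dataclass, field

@dataclass
class BiometricProfile:
    """Hypothetical record: attributes are bound to a reference to the
    sharded biometric template, not to a username/password identity."""
    template_ref: str                           # pointer to the distributed shards, never raw biometric data
    bound: dict[str, str] = field(default_factory=dict)

    def bind(self, kind: str, value: str) -> None:
        # e.g. kind = "device", "passkey", "credential_token", "card_number"
        self.bound[kind] = value

# After a successful decentralised match, the verified template_ref is what
# unlocks the bound attributes for login, step-up, or risky transactions.
profile = BiometricProfile(template_ref="shard-set-001")
profile.bind("device", "device-fingerprint-9f2c")
profile.bind("passkey", "passkey-credential-id-77a1")
```

The design point is that the biometric match, not a user ID, is the root of trust: the same profile established at onboarding backs every later channel (app, call centre, branch, chat).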
Cindy: What would you say are the key considerations that any organization looking to deploy a biometric solution cannot ignore?
Frances: Biometric systems are complex technologies that need to be thought through in terms of their implementation. I would say there are five things that enterprises should consider.
One is liveness and deepfake prevention. We know that attackers are getting more sophisticated in their use of generative AI, trying to spoof systems. So, any responsible deployment of biometrics should have some kind of liveness or deepfake prevention.
Second is matching accuracy: not all algorithms are created equal. There are plenty of studies issued by NIST [National Institute of Standards and Technology] which talk about the different algorithms and their performance.
Third is bias, which is always a concern - both gender and racial bias have to be considered. And again, not all algorithms are created equal.
The fourth is scalability. Biometrics serve different functions: one-to-one matching, which asks “Am I who I claim to be?”, and one-to-many lookups, which ask “Have I seen you in a gallery before?”. Different technologies scale in different ways. So scalability and how systems get deployed is really important, especially in financial services.
And lastly is privacy. This one really gets to the root of whether an institution will ultimately agree to deploy biometrics in the first place. This has always been the Achilles’ heel of identity management: the trade-offs between privacy and security, and how biometrics are stored and managed.
Up until now, the only choice has been to store biometrics in a centralised repository, and then be scared to death of a data breach. Or you rely on a biometric that is stored on a device, which leaves you not knowing who's behind the device, because anybody can enrol on that device - and that device is not hooked into the original enrolment.
For the first time, we're able to eliminate that trade-off by using other techniques, other privacy-enhancing technologies, that allow you to have your cake and eat it too.
We recorded three videos capturing Frances’s insights on the battle between privacy and security, what everyone should know about biometrics, and Anonybit’s decentralized approach. To watch the videos, visit Frances’s Innovator page.