
Presented by Capital One Software
Tokenization is emerging as a cornerstone of modern data security, helping enterprises separate the value of their data from its risk. In this VB In Conversation, Ravi Raghu, President of Capital One Software, discusses how tokenization can reduce the value of data to bad actors in a breach while preserving the underlying data format and usability, drawing on Capital One’s own experience deploying it at scale.
Tokenization, Raghu says, is a highly advanced technology. It converts sensitive data into a nonsensical digital stand-in, called a token, which maps back to the original value stored in a secure vault. The token preserves both the format and the utility of the sensitive data, so it can still be used in applications, including AI models. And because tokenization removes the need to manage encryption keys or dedicate compute to constant encryption and decryption, it offers companies a highly scalable way to protect their most sensitive data, he added.
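To make the concept concrete, here is a minimal, hypothetical Python sketch of vault-based tokenization. It is not Capital One’s implementation; the class and field names are illustrative, and a production system would add access controls, collision handling, auditing, and a hardened vault store.

```python
import secrets

class TokenVault:
    """Toy vault-based tokenizer: random, format-preserving tokens are mapped
    back to the originals in a protected store (illustrative only)."""

    def __init__(self):
        # token -> original value; in practice this lives in a secured, audited vault
        self._vault = {}

    def tokenize(self, value: str) -> str:
        # Replace each digit with a random digit and keep separators, so the
        # token preserves the original field's format (e.g., a 9-digit SSN).
        token = "".join(
            secrets.choice("0123456789") if ch.isdigit() else ch for ch in value
        )
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with access to the vault can recover the original value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("123-45-6789")   # e.g. "904-71-2635": worthless to an attacker
print(token, "->", vault.detokenize(token))
```

The key property the sketch shows is that the token itself carries no recoverable information; without the vault, an intercepted token is just noise in the right shape.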
"The killer part, from a security perspective, when you think about it in other ways, if a bad actor gets hold of the data, they get hold of the token," He explained. "The actual data isn’t sitting with the token, unlike other methods like encryption, where the actual data is sitting there, just waiting for someone to grab someone’s key or use brute force to get the actual data. From every angle this is the ideal way one should go about protecting sensitive data."
The tokenization differentiator
Most organizations are just scratching the surface of data security, adding protection only when data is read, to prevent an end user from accessing it. At a minimum, organizations should focus on securing data on writes, as it is being stored. But best-in-class organizations go even further, protecting data at birth, the moment it is created.
At one end of the security spectrum is a simple lock-and-key approach that restricts access but keeps the underlying data intact. More advanced methods, such as masking or modifying the data, permanently change its meaning, which can compromise its usefulness. File-level encryption provides broader protection for large volumes of stored data, but field-level encryption (for example, of a Social Security number) is a bigger challenge: it takes significant compute to encrypt a single field and then decrypt it at the point of use. And encryption still has a fatal flaw: the original data is still there, and an attacker only needs the key to gain access.
Tokenization avoids these pitfalls by replacing the original data with a surrogate that has no intrinsic value. If the token is intercepted, whether by the wrong person or the wrong machine, the underlying data is never exposed.
Business value of tokenization
"Basically you are protecting the data, and it is priceless," Raghu said. "Another thing that is invaluable – can you use it for modeling purposes afterwards? On the one hand, it is a protection thing, and on the other hand it is a business that enables."
Because tokenization preserves the structure and order of the original data, that data can still be used for modeling and analytics, turning security into a business enabler. Take private health data governed by HIPAA as an example: with tokenization, the data can be used to build pricing models or support gene therapy research while remaining compliant.
"If your data is already secure, you can then promote the use of data across the enterprise and everyone needs to create more value from the data," Raghu said. "Conversely, if you don’t have it, there’s a lot of concessions today for businesses to get more people to have access to it, or more AI agents to have access to their data. Ironically, they are limiting the blast radius of innovation. The impact of tokenization is massive, and there are many metrics you can use to measure it – operational impact, revenue impact, and of course peace of mind from a security perspective."
Breaking down barriers to adoption
Until now, the main challenge with traditional tokenization has been performance; AI demands unprecedented scale and speed. That is the core challenge DataBolt, Capital One Software’s vaultless tokenization solution, was built to address: it can generate 4 million tokens per second.
"Capital One has been undergoing tokenization for over a decade. We started doing this because we are serving our 100 million banking customers. We want to protect this sensitive data," Raghu said. "We’ve eaten our dog’s food with our internal tokenization capacity, over 100 billion times a month. We’ve taken that know-how and that capability, scale and speed and innovated so that the world can take advantage of it, so that it’s a commercial offering."
Vaultless tokenization is an advanced form of tokenization that does not require a central database (vault) to store token mappings. Instead, it uses mathematical algorithms, cryptographic techniques, and deterministic mapping to generate tokens dynamically. This approach is faster, more scalable, and eliminates the security risk associated with managing a vault.
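For intuition about how tokens can be generated deterministically without a vault, here is a toy Python sketch. It is not DataBolt’s algorithm: it uses an HMAC-derived keystream to shift digits reversibly, whereas production vaultless systems typically rely on standardized format-preserving encryption (such as NIST FF1) with managed keys; the key shown is a placeholder.

```python
import hmac
import hashlib

SECRET_KEY = b"example-key-material"  # placeholder; real systems pull keys from a KMS

def _keystream_digit(key: bytes, position: int) -> int:
    # Derive a key- and position-dependent digit (0-9) from an HMAC digest.
    digest = hmac.new(key, str(position).encode(), hashlib.sha256).digest()
    return digest[0] % 10

def tokenize(value: str, key: bytes = SECRET_KEY) -> str:
    # Shift each digit by the keystream; non-digits (dashes, spaces) pass through,
    # so the token keeps the original format. Same input + key -> same token.
    return "".join(
        str((int(ch) + _keystream_digit(key, i)) % 10) if ch.isdigit() else ch
        for i, ch in enumerate(value)
    )

def detokenize(token: str, key: bytes = SECRET_KEY) -> str:
    # Reverse the shift with the same key; no lookup table (vault) is required.
    return "".join(
        str((int(ch) - _keystream_digit(key, i)) % 10) if ch.isdigit() else ch
        for i, ch in enumerate(token)
    )

assert detokenize(tokenize("123-45-6789")) == "123-45-6789"
print(tokenize("123-45-6789"))
```

The point of the sketch is the absence of a lookup table: the token can be regenerated or reversed anywhere the key is available, which is what lets the approach scale without a central vault to secure and query.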
"We realized that for the scale and speed requirements we had, we needed to develop this capability ourselves," Raghu said. "We’ve been iterating constantly to make sure it can scale to hundreds of billions of tasks a month. All of our innovation has been around the capability and capacity to do this at a battle-tested scale within our enterprise, for the purpose of serving our customers."
While traditional tokenization methods can add complexity and slow operations, DataBolt integrates seamlessly with encrypted data warehouses, allowing businesses to maintain strong security without degrading performance or operations. Tokenization takes place in the client’s environment, removing the need to call out to an external network for tokenization operations, which can also slow performance.
"We believe that fundamentally, tokenization should be easy to adopt," Raghu said. "You must be able to store your data very quickly and operate at the speed and scale and cost requirements of organizations. I think this has been a major barrier to mass adoption of tokenization so far. In a world of AI, this is going to be a huge enabler."
Don’t miss the full conversation with Ravi Raghu, President, Capital One Software.
Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact sales@venturebeat.com.