A New Scalable Approach to Data Tokenization
6 Pages Posted: 20 Jun 2010
Date Written: June 19, 2010
Abstract
This paper presents a new approach to data tokenization that eliminates the challenges associated with standard centralized tokenization. In high-volume operations in particular, the conventional way of generating tokens is prone to issues that degrade data availability and performance. From a security standpoint, it is critical to address collisions, which occur when a tokenization solution assigns the same token to two distinct pieces of data. This next-generation tokenization solution addresses all of these issues: system performance, availability, and scalability are enhanced; numeric and alpha tokens are generated to protect a wide range of high-risk data; key management is greatly simplified; and collisions are eliminated. This new approach has the potential to expand where tokenization can be used.
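To make the collision problem concrete, the following is a minimal sketch of a conventional centralized token vault of the kind the abstract critiques, not the paper's proposed scheme. The class name `TokenVault` and its methods are hypothetical; the sketch shows why every token-generation call must consult shared state to reject duplicate tokens, which is the source of the availability and performance bottleneck in high-volume operations.

```python
import secrets

class TokenVault:
    """Illustrative centralized token vault (NOT the paper's method).

    Maps sensitive values to random numeric tokens and demonstrates
    the collision check that centralized tokenization requires.
    """

    def __init__(self, token_digits=16):
        self.token_digits = token_digits
        self._by_value = {}   # sensitive value -> token
        self._by_token = {}   # token -> sensitive value

    def tokenize(self, value: str) -> str:
        # Consistent tokenization: the same value maps to the same token.
        if value in self._by_value:
            return self._by_value[value]
        while True:
            # Numeric token; a real system might also emit alpha tokens.
            token = "".join(secrets.choice("0123456789")
                            for _ in range(self.token_digits))
            # Collision check against the shared vault: this lookup is
            # the centralized bottleneck at high volume.
            if token not in self._by_token:
                break
        self._by_value[value] = token
        self._by_token[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Reverse lookup; only the vault can recover the original value.
        return self._by_token[token]
```

Because both dictionaries must be shared and consistent across all requesters, scaling this design out requires replicating or synchronizing the vault, which is precisely the operational challenge the proposed approach aims to remove.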
Keywords: Data Tokenization, Performance, Database Security, Encryption, Privacy, VISA CISP, GLBA, HIPAA, PCI DSS
JEL Classification: O31