Considerations for Using Tokenization to Mask Your Sensitive Data
By Linda Musthaler

When companies think of protecting sensitive data, either in their own data center or in the cloud, they are most likely to think of encryption as the means to obfuscate the real data. Tokenization is another means to protect data, and this process has unique properties that may help companies fulfill requirements that encryption doesn’t address. Tokenization can be a good alternative to encryption in some cases, or it can be a complementary solution that works with encryption to provide a very high level of data protection.

Tokenization is the process of replacing real data (such as a credit card number or a social security number) with random substitute data called a token. Unlike encryption, there is no algorithm that methodically generates tokens; instead, tokens are random characters that have no meaning and that cannot be converted back to the real data values by any mathematical means. In most cases, the process uses an index table called a vault to keep track of the relationship between a real value and its corresponding token. Once a token is generated, it can be used in many types of applications as a substitute for real data. If the real value is needed, an authorized application can reach into the vault and retrieve the data value by presenting the token.
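The store-and-retrieve cycle described above can be sketched in a few lines of code. This is a minimal illustration only; the class and method names are hypothetical, not taken from any particular product, and a production vault would add encryption at rest, access controls, and audit logging.

```python
import secrets

class TokenVault:
    """Minimal sketch of vault-based tokenization: real values live
    only in the vault, while callers hold random tokens that cannot
    be reversed by any mathematical means."""

    def __init__(self):
        self._token_to_value = {}   # the "vault": token -> real value
        self._value_to_token = {}   # reuse the same token for repeat values

    def tokenize(self, real_value: str) -> str:
        # Return the existing token if this value was seen before.
        if real_value in self._value_to_token:
            return self._value_to_token[real_value]
        # Otherwise generate a random token with no mathematical
        # relationship to the real value.
        token = secrets.token_hex(16)
        self._token_to_value[token] = real_value
        self._value_to_token[real_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # An authorized application presents the token to get the value.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert token != "4111-1111-1111-1111"
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

Note that the vault is just an index table: there is no key that could decrypt the token, so an attacker who steals tokens alone learns nothing about the real values.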


Uses for Tokenization

To date, the most common use for tokenization has been in the electronic payments industry, to protect credit and debit card primary account numbers (PANs) after payments have been authorized. PCI DSS requires that this data be safeguarded, and tokenization is an ideal method of desensitizing transaction data so that merchants can still analyze customers’ purchasing histories.

Now, tokenization is an up-and-coming technology for enterprise applications, especially those that store data in the cloud. In many cases, a key driver for tokenizing rather than encrypting data is to meet data residency requirements. Some jurisdictions (for example, Germany and Switzerland) require that data pertaining to their residents remain within the physical borders of the region. Encryption often doesn’t meet this requirement, but tokenization does if the token vault is physically located in the required region. Real data can be tokenized and stored locally while the tokens go into the cloud applications.

Enterprises are increasingly looking at tokenization as an option for protecting personally identifiable information (PII), protected health information (PHI), and sensitive customer account information.

Tokenization Solutions

As a technology, tokenization is relatively new—only about five years old. Nevertheless, there are numerous solutions on the market today, and the technology is evolving rapidly to address issues such as preserving data formats and application functions. More about that in a moment.

Tokenization solutions typically come in the form of hardware or software appliances or gateways in the cloud. A solution can be hosted in-house in a company’s data center, or by a third-party vendor or cloud provider. Most (but not all) solutions use a secure vault to store the data. (At least one vaultless tokenization solution is now on the market.)

If a company chooses to host the tokenization solution in-house, it must answer a few key questions:

» How will we secure the vault, since it holds all the sensitive data?

» If we encrypt the real data in the vault (which is typical), what is our key management strategy?

» Which people and/or applications will have access to the vault to store and retrieve tokens, and how will we track their activities?

There are similar considerations if the company allows a third party to host the tokenization solution:

» How does our vendor secure the vault?

» Does the vendor have access to the real data in the clear?

» Where is the real data physically stored, including all backups and replication instances?

It may be necessary to have thorough answers to these questions to satisfy internal and external policies and compliance requirements.

Use of Tokens in Applications

There are numerous considerations about how tokens can and should be used in enterprise applications.

Some tokenization solutions come pre-integrated with popular applications, especially SAP and other CRM applications; with other solutions, integration work may be required. Some solutions are application-agnostic and will work with any enterprise application.

It’s best to limit the number of fields in an application that are tokenized. Generally, the more fields that contain token data, the slower the application performance will be if tokens need to be reconverted to real data for any reason. It’s possible to tokenize one or two fields and encrypt others within the same application.

Many tokenization solutions can generate format-preserving tokens to aid in various aspects of an application. For example, when tokenizing credit card PANs, the token value should have the same number of digits as a real card number. This ensures that tokens can easily replace real card numbers in ancillary post-authorization applications, such as business analytics or loyalty marketing, without requiring extensive modification of those applications.
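Format preservation can be illustrated with a short sketch. The function below is a hypothetical helper, not any vendor’s API: it swaps each digit of a PAN for a random digit while leaving the length and separator positions intact, so downstream applications need no schema changes. Commercial solutions differ in details such as preserving the last four digits or making the token pass a Luhn check.

```python
import secrets

def format_preserving_token(pan: str) -> str:
    """Replace each digit of a card PAN with a random digit,
    keeping the original length and any separators.
    Illustrative sketch only, not a vendor implementation."""
    return "".join(
        secrets.choice("0123456789") if ch.isdigit() else ch
        for ch in pan
    )

token = format_preserving_token("4111-1111-1111-1111")
# Same length and separator positions as the real PAN, so it fits
# any field or report that expects a card-number-shaped value.
assert len(token) == len("4111-1111-1111-1111")
assert [i for i, c in enumerate(token) if c == "-"] == [4, 9, 14]
```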

Many enterprise applications need to perform critical functions such as sort and search on fields that a company might like to tokenize—say, for example, patients’ Social Security numbers. By definition, tokens are random values, which renders sort and search functions useless. Some tokenization vendors have found ways around this dilemma, so it’s important to ask how a solution preserves application features and functionality, and whether the workaround weakens the obfuscation technique in any way. Unlike encryption, there are no globally accepted standards for tokenization processes, so it’s very important to ask vendors to thoroughly explain how their solutions work.
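The sorting problem can be seen directly: because a token carries no information about the value it replaces, ordering records by token produces an order unrelated to the real data. The sketch below (with illustrative names) demonstrates the mismatch.

```python
import secrets

# Assign random tokens to three Social Security-style values.
values = ["111-11-1111", "222-22-2222", "333-33-3333"]
tokens = {v: secrets.token_hex(8) for v in values}

# Sorting by real value is meaningful; sorting by token yields an
# arbitrary permutation, since tokens are random strings.
by_value = sorted(values)
by_token = sorted(values, key=lambda v: tokens[v])

# Both lists contain the same records, but the token ordering
# tells an application nothing about the underlying values.
assert sorted(by_token) == by_value
```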

Tokenization or Encryption, or Both?

Though they perform similar functions, tokenization and encryption aren’t competing technologies. In many cases, they are complementary in helping a company achieve many data obfuscation objectives. For example, tokenization may help with data residency requirements, while encryption helps meet data privacy requirements.

Companies should discuss all requirements with prospective solution vendors and choose the obfuscation technique(s) that best match their business needs.

About the Author
Linda Musthaler is a principal analyst with Essential Solutions Corp. She writes about the practical value of information technology, and how it can make individual workers and entire organizations more productive.
