Even so, security experts say there are practices you can adopt that go beyond Visa's recommendations (PDF). While any one of these tokenization suggestions is open to debate, experts recommend the following tips to achieve the strongest possible security posture for data protection:
1. Randomly Generate Tokens
According to many security experts, the only way to guarantee that tokens cannot be reversed is to generate them randomly.
"If the output is not generated by a mathematical function applied to the input, it cannot be reversed to regenerate the original PAN data," Adrian Lane, analyst for Securosis, recently on the topic. "The only way to discover PAN data from a real token is a (reverse) lookup in the token server database. Random tokens are simple to generate, and the size and data type constraints are trivial. This should be the default, as most firms should neither need or want PAN data retrievable from the token."
2. Avoid Homegrown Solutions
While tokenization may seem simple on its face, Ulf Mattsson, chief technology officer for Protegrity, warns that "there are more ways to go wrong with tokenization than traditional encryption."
"It's a little bit of rocket science because first you need to generate the tokens, manage the tokens in a good way, protect your token server in a good way and then on top of that you need a normal encryption system with key management that should be compliant that's protecting your token server," Mattsson says.
Mattsson has heard a number of horror stories about homegrown deployments of tokenization that were easily cracked due to the reversibility of the tokens and lack of security around the system in general. "There are homegrown systems out there that are called tokenization and they do not meet the security level of tokenization; in many cases they don't even meet basic security levels for encryption," he says.
3. Protect the Token Server
The Visa standards do open with a note about the importance of network segregation and keeping tokenization systems PCI compliant, but the importance of securing the token server bears repeating. If organizations fail to secure this server, the entire token system is at risk and the tokenization investment is rendered moot.
"In the corner somewhere you have to have a token server which can reverse the (tokenization process)," Mattson. "That server will need to be encrypted with traditional key management and strong encryption. If it's PCI data that it holds, the server needs to be PCI-compliant."
4. Create An Encryption Ecosystem
Over the last year or so, experts have debated whether an organization should choose between end-to-end encryption or tokenization. Many within the card processing world, however, believe that organizations shouldn't be choosing between the two. Each type of technology serves a different purpose: The strength of tokenization is its irreversibility and its ability to play nice with the database infrastructure. Meanwhile, end-to-end encryption helps fill in the gaps as the cardholder data and PANs travel across the rest of the IT infrastructure.
"We believe that tokenization is a prudent strategy when used in conjunction with end-to-end encryption," says Steven Elefant, CIO for Heartland Heartland Payment Systems, which expects to provide tokenization services to its customers later this year as a complement to the encryption services it announced to its customers last fall.