Tokenization might be the PCI Holy Grail, but the search for it could be just as circuitous

Dark Reading Staff, Dark Reading

July 22, 2010

As merchants and credit-card processors continue to struggle with securing cardholder data for the sake of PCI compliance and overall brand protection, many are increasingly turning to tokenization technology as a way to reduce the scope of their risk. But vendors in the burgeoning tokenization market are still skirmishing over technology definitions and standards.

Meanwhile, Visa last week released a best practices guide (PDF) to relieve confusion about tokenization and help merchants, processors, acquirers, and others in the payment ecosystem understand how to comply with PCI via tokenization.

Tokenization replaces live cardholder primary account numbers (PANs) in databases with stand-in values that are meaningless to data thieves but can be cross-referenced to the real data when necessary. Compared to full encryption products, tokenization is often much easier to deploy and is less likely to disrupt applications that tap into databases for customer information.
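
In practice, the table-based flavor of tokenization amounts to a lookup table -- a "vault" -- that maps random stand-ins to real card numbers. The following is a minimal sketch of that idea in Python; the class and method names are illustrative only and do not reflect any particular vendor's product:

```python
import secrets

class TokenVault:
    """Illustrative in-memory vault mapping random tokens to live PANs.

    A real vault lives in a hardened, access-controlled database that
    stays inside PCI scope; everything else handles only tokens.
    """

    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # Generate a random 16-digit stand-in with no mathematical
        # relationship to the original card number.
        token = "".join(secrets.choice("0123456789") for _ in range(16))
        while token in self._token_to_pan:  # avoid the rare collision
            token = "".join(secrets.choice("0123456789") for _ in range(16))
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the few systems that genuinely need the live PAN call this.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# Downstream applications store and pass around `token`; the real PAN
# never leaves the vault's cross-reference table.
```

Because the token is random rather than derived from the card number, nothing in the token itself can be reversed without access to that cross-reference table.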

"There may be lots and lots of applications driven by the underlying database that don't need access to the plain text PANs at all," says Philip Rogaway, a cryptography professor for the Computer Science department at UC Davis. "All of those applications can remain unchanged [with tokens], whereas traditional [encryption] solutions would have touched everything that would have driven off the database."

Beyond the allure of easier deployment and smoother interaction with applications, tokenization's biggest draw is that it can dramatically reduce the need for costly PCI audits. "The big benefit is it reduces your scope of PCI compliance -- even down to the point where you might not be obligated to meet the requirements of the PCI Data Security Standard because you have outsourced or eliminated all of your cardholder data, so therefore you're not obligated to comply with PCI," says John Kindervag, an analyst at Forrester Research.

Kindervag calls the complete elimination of cardholder data from merchant databases the "Holy Grail" of PCI -- and something that can be accomplished if merchants transfer risk to card processors, which are increasingly teaming up with tokenization vendors or developing homegrown technology to offer encryption and tokenization services. In this case, the card processor issues the token and manages the back-end cross-reference table so the merchants don't have to worry about securing that back-end infrastructure.
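
In that outsourced model, the merchant-side integration can be quite thin. The sketch below is purely hypothetical -- the endpoint URL, field names, and function are invented for illustration and do not correspond to any real processor's API -- but it shows the shape of the exchange: the PAN goes to the processor once, and only the returned token is ever stored by the merchant.

```python
import requests  # third-party HTTP client

TOKENIZE_URL = "https://processor.example.com/v1/tokenize"  # hypothetical endpoint

def submit_card(pan: str, amount_cents: int, api_key: str) -> str:
    """Send the live PAN to the processor over TLS and keep only the token."""
    resp = requests.post(
        TOKENIZE_URL,
        json={"pan": pan, "amount": amount_cents},       # illustrative fields
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    token = resp.json()["token"]
    # The merchant's own database stores `token`, never the PAN; the
    # processor owns and secures the cross-reference table behind it.
    return token
```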

These services have taken off in the past year. Last September, processor First Data Corp. announced it would be leveraging RSA's tokenization technology to offer customers outsourced tokenization. And earlier this year, Fifth Third Processing Solutions told customers it would team with Voltage Security to provide them with tokenization solutions. Heartland Payment Systems, meanwhile, says it expects to provide tokenization services through Voltage, which already powers the end-to-end encryption services that Heartland announced to customers last fall.

If a larger merchant chooses to go it alone and purchases its own tokenization solution to generate its own tokens and store its own cross-reference tables, it will still need to secure at least some of its infrastructure to PCI DSS specifications, even though the scope of those compliance efforts is drastically reduced.

Visa's best practices guide is aimed at helping organizations understand how tokenization needs to be implemented and what the card brand expects of merchants who tokenize their PANs. One of the big requirements on the list is that the databases holding the cross-reference tables that link tokens back to the actual PANs be segmented from the rest of the network and secured according to PCI security standards.

Eduardo Perez, head of Global Payment System Security for Visa, says the best practices guide was developed because considerable confusion still exists among merchants about tokenization, which as yet has no industrywide standards.

"Where properly implemented, tokenization may help simplify a merchant's payment card environment," Perez said in a statement. "However, we know from working with the industry and from forensics investigations, that there are some common implementation pitfalls that have contributed to data compromises. For example, entities have failed to monitor for malfunctions, anomalies and suspicious activity, allowing an intruder to manipulate the tokenization system undetected. As more merchants look at tokenization solutions, these best practices will provide guidance on how to implement those solutions effectively and highlight areas for particular vigilance."

Forrester's Kindervag agrees that the market is still nascent and quite confusing to merchants; he says Forrester is deluged with calls from clients asking for clarification about the tokenization market.

"We're still to the point where we haven't defined what a token is. Every different vendor is going to have a different way of creating a token, and every different vendor is going to have a different view of how it should be generated," he explains. "Some people say it has to be random, some people say it can contain parts of the card number, and other people say that reduces the key space so it could be reverse-engineered. Some people say you can create a token using formatted encryption technology. Some people say you can't."

Kindervag says vendors are struggling to build enough critical mass that nobody -- namely the PCI Standards Council or any of the card brands -- can come back, declare their approach the wrong way to do things, and put them out of business. For example, when it comes to defining tokenization, some vendors are fervent that true tokenization must be based on a dynamically grown lookup table that maps tokens back to PAN values.

"There is a lot of confusion in the industry about what tokenization actually is -- the most common miscommunication is around formatted encryption. Encryption is encryption, and it's based on a secret key and mathematical algorithm," says Ulf Mattson, CTO of Protegrity. "Tokenization is not based on a secret key or algorithm that mathematically can get you back to the data. That's its core strength."

But other vendors, such as Voltage, offer a tokenization option that is based on format-preserving encryption (FPE), which can turn a 16-digit PAN into another 16-digit number under the control of a key. Rogaway, who came up with some of the early cryptographic research upon which Voltage based its FPE tokenization, says both ways are equally strong from a security standpoint, but FPE offers greater architectural flexibility for organizations.
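
As a rough illustration of what FPE does -- a simplified, Feistel-style sketch only, not a standardized mode such as NIST FF1 and not suitable for production use -- the same key that turns a 16-digit PAN into another 16-digit value can also turn it back:

```python
import hashlib
import hmac

MODULUS = 10 ** 8  # each Feistel half covers 8 of the 16 digits
ROUNDS = 10

def _round_value(key: bytes, rnd: int, half: int) -> int:
    """Pseudorandom round function: HMAC-SHA256 truncated into the digit space."""
    msg = bytes([rnd]) + half.to_bytes(8, "big")
    digest = hmac.new(key, msg, hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % MODULUS

def fpe_encrypt(key: bytes, pan: str) -> str:
    """Map one 16-digit string to another 16-digit string under `key`."""
    left, right = int(pan[:8]), int(pan[8:])
    for rnd in range(ROUNDS):
        left, right = right, (left + _round_value(key, rnd, right)) % MODULUS
    return f"{left:08d}{right:08d}"

def fpe_decrypt(key: bytes, token: str) -> str:
    """Run the rounds in reverse to recover the original digits."""
    left, right = int(token[:8]), int(token[8:])
    for rnd in reversed(range(ROUNDS)):
        left, right = (right - _round_value(key, rnd, left)) % MODULUS, left
    return f"{left:08d}{right:08d}"

key = b"demo key -- not a managed production key"
token = fpe_encrypt(key, "4111111111111111")
assert fpe_decrypt(key, token) == "4111111111111111"
```

The output fits wherever a card number fits, which is the architectural flexibility Rogaway describes -- but unlike a vault token, anyone holding the key can mathematically recover the PAN, which is precisely the distinction driving the definitional debate.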

For its part, Voltage also offers customers a choice between FPE tokenization and more traditional table-based tokenization, but the debate over whether FPE even qualifies as tokenization continues to rage, even after the release of Visa's best practices.

"I wish the document had done a better job establishing the nomenclature," UC Davis' Rogaway says.

Another particularly impassioned battle over tokenization standards is the argument over whether hardware-based or software-based tokenization solutions are best. As Rogaway puts it, vendors with hardware-based tools, such as Voltage, claim that software-based tools, such as those offered by RSA, are not secure enough.

"You've got a face-off there," he says. "We're getting into a Thunderdome kind of match; they're all trying to do the 'two men enter, one men leave' kind of thing."

Heartland, for one, says it chose Voltage specifically because it didn't believe software-based tools -- which do not encrypt PANs between the time a card is swiped and when the data is sent to the processor to be tokenized and returned to the merchant -- were secure enough.

"There are those who believe that you can do tokenization strictly in software at the point of sale, but we don't believe that that is adequate security unless you securely get the card number as soon as the digits leave the magstripe and do that in hardware and software," says Steven Elefant, CIO for Heartland, who believes tokenization should be layered on top of other encryption solutions.

Forrester's Kindervag says he leans more toward hardware-based solutions, but adds that even software-based tools are better than nothing. "Both solutions are an order of magnitude better than the way credit cards are taken today," he says.
