Guidelines for tokenization - a technique that hides payment card primary account numbers (PANs) by replacing them with randomly generated numbers - were recently released by the PCI Security Standards Council (PCI SSC). The guidelines immediately drew words of praise, caution and criticism from the data security industry.
Tokens are a way for merchants to reduce the scope of Payment Card Industry (PCI) Data Security Standard (DSS) requirements; reducing that scope can save money that might otherwise be spent demonstrating compliance with the PCI DSS. Tokens replace the customer's PAN with a randomly generated 16-digit number that typically retains only the last four digits of the consumer's card.
The token substitution applies to files, applications, systems and databases throughout the merchant's computer network, effectively hiding the PANs stored there. One of the easiest ways to fall out of PCI DSS compliance is to store customer credit card information; tokenization removes that data from the merchant's systems, decreasing the likelihood of a data breach.
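As a rough illustration of the substitution described above, here is a minimal sketch in Python (the tokenize helper is hypothetical, not any vendor's actual API). It generates a random token that preserves only the card's last four digits; real tokenization products also take care that a token cannot be mistaken for a valid PAN, for instance by ensuring it fails the Luhn check.

```python
import secrets

def tokenize(pan: str) -> str:
    """Replace a 16-digit PAN with a random token, keeping only the
    last four digits (illustrative only; production schemes also
    guarantee the token cannot pass for a real card number)."""
    random_digits = "".join(str(secrets.randbelow(10)) for _ in range(12))
    return random_digits + pan[-4:]

# The merchant stores only the token; its first 12 digits reveal nothing.
print(tokenize("4111111111111111"))  # e.g. '8302157746921111'
```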
"Merchants are ultimately responsible for the proper implementation of any tokenization solution they use, including its deployment and operation and validation of its tokenization environment as part of their annual PCI DSS compliance assessment," the PCI SSC stated when releasing the guidelines.
"As with many evolving technologies, there is currently a lack of industry standards for implementing secure tokenization solutions in a payment environment."
PCI SSC General Manager Bob Russo said the guidelines are just a starting point for merchants considering tokenization. "The council will continue to evaluate tokenization and other technologies to determine the need for further guidance and/or requirements," Russo said. "While this guidance will provide merchants with additional understanding on how tokenization may help their PCI efforts, it is important to note that tokenization should not be viewed as an alternative to the [PCI DSS]."
Reaction to the tokenization guidelines quickly followed their release. "Those with a strong grasp of the DSS, security best practices and data tokenization concepts will probably find that this document does more to confirm current assumptions than to provide new insights or information," Jeremy Simon of Halock Security Labs said. "It seems these documents are aimed more at those without a strong understanding of the DSS or how tokenization works to help them avoid implementing a solution that does not achieve the desired objectives for PCI scope reduction and/or risk reduction."
Simon called the guidelines a "must read for any organization currently planning or considering tokenization."
In his blog, Trustwave Security Consultant Joel Dubin wrote, "Is tokenization effective? For the time being, it probably is. Of course, eventually some clever hacker will probably find a way to beat the system. But right now it offers both PCI compliance and some level of network security - the best of both worlds for merchants using credit cards."
Walter Conway, a Qualified Security Assessor and PCI DSS consultant with 403 Labs LLC, wrote that the guidelines contain surprises. "In particular, the council declares that all tokens are not equal, and that, contrary to what anyone might tell you, some tokens (so-called 'high-value tokens') will still be in scope for PCI compliance," he stated.
The PCI SSC stated in the tokenization guidelines that it believes high-value tokens could potentially be hacked and converted to cash or used in fraudulent transactions. For this reason, the PCI SSC determined that high-value tokens may be included in PCI DSS scope - even though they cannot be used directly to obtain PANs or other cardholder information.
"Merchants should understand that tokenization is not a silver bullet that makes PCI go away," Conway stated. "Unfortunately, silver bullets have been outlawed for PCI. Instead, tokenization is a strategy that can reduce a merchant's PCI scope, often dramatically."
He added that the amount of scope reduction depends on a number of factors, including how the tokens are constructed, how effectively the merchant segments the tokenization engine from other parts of the network, the merchant's ability to secure the "token vault" that matches tokens with PANs, how the encryption is managed, and how tokens are converted back into the original PANs.
"My recommendation is that before you can assess how tokenization will reduce your PCI scope, you not only need to understand how your tokens are generated and stored, but also how you will use them," Conway said.
Vendors were quick to praise the guidelines. "We're pleased with the initial result of many months of effort by the PCI SSC and members of the Tokenization Working Group, within the Scoping Special Interest Group, led by our own VP of Product Management, Gary Palgon," said Dan Konisky, Liaison Technologies Inc.'s Director of Product Management.
"This document is an important stepping stone with additional guidance to follow, which we hope will include validation criteria for PCI auditors."
Electronic Payment Exchange Inc., a provider of tokenization and end-to-end encryption services, welcomed the guidelines. "We are delighted to see that the [PCI SSC] is now officially recognizing the value and security of tokenization solutions," EPX Chief Security Officer Matt Ornce said. "While we have always known that EPX tokenization solutions reduced the level of merchant risk, now we can say with certainty that they also reduce the level of PCI DSS assessment scope for merchants."
Shift4 Corp., the inventor of the technology that generates and accepts tokens, expressed a different view. Shift4 released its tokenization technology in 2005; it did not patent the technology or copyright the name, believing it would be a driver for the industry.
According to a blog on the Shift4 website, the PCI SSC's guidelines "missed the mark." The blog further stated, "At no point in the 23-page response does the PCI SSC publish anything that could even remotely be construed as a standard."
Shift4 was represented on the Tokenization Working Group but wrote that each group member "brought his/her own agenda, and many overtly promoted patented and/or copyrighted 'tokenization' technologies to the group - knowing that if they could get their idea included, they stood to profit. Seeking compromise, the SSC forced these dissimilar pieces into the standard and created not just a camel, but a crippled one at that.
"What was released today was not an industry standard, and it was not a guideline. It was an eloquently worded, poorly veiled passing of the buck from the PCI SSC to individual acquirers and QSAs. And it is the QSAs and the PCI Council who stand to profit from these 'guidelines,' as more merchants will be required to validate obviously secure solutions with QSAs in order to comply with this new document."
The PCI SSC did not respond to a request for comment on Shift4's criticisms.