SecurityWeek: Tokenization: Benefits and Challenges for Securing Transaction Data
How Tokenization Can be Used for Securing Payment Card Transactions and Data
Over the summer, representatives of the merchant community called upon all stakeholders in the payments industry to work together on establishing open and efficient standards to protect consumers and businesses in the United States against security threats.
The Food Marketing Institute, the Merchant Advisory Group, the National Grocers Association, the National Restaurant Association, the National Retail Federation, the National Association of Convenience Stores, and the Retail Industry Leaders Association believe that payment card and other sensitive personal information can be protected across commerce channels by adopting a universal tokenization standard, a step they see as necessary to mitigate identity theft and payment card fraud.
“Regardless of whether a consumer is paying at a brick and mortar checkout, at the pump, on the Internet, or even via a mobile phone, there is a need to ensure the payment data is protected. One way this can be done is through a technology called tokenization,” the organizations wrote in a joint statement issued July 28.
The U.S. Federal Reserve’s Mobile Payments Industry Workgroup (MPIW) has also discussed the advantages and the challenges of payment tokenization. A report released to the public in late September shows that the MPIW is concerned about the challenges posed by the development of common standards for tokenization, and the lack of consistent terminology. A new tokenization sub-group has been tasked with investigating the challenges.
American Express has also embraced tokenization. The company just launched a suite of solutions designed to protect online and mobile payments by replacing sensitive card information with tokens.
Tokenization is the process by which sensitive information is replaced with a randomly generated unique token or symbol. These tokens ensure that data is not transmitted or stored in an insecure format. However, for tokenization to be effective in the payments industry, a universal standard must be created so that merchants can support the technology across multiple providers without negatively impacting the customer experience.
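As a minimal illustration, the Python sketch below shows how a simple token vault might work. The class and its interface are hypothetical, and a production vault would be an isolated, hardened service rather than an in-memory dictionary.

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens to card numbers.

    A real vault would be an isolated, hardened service; this
    in-memory dictionary exists only to show the concept.
    """

    def __init__(self):
        self._vault = {}  # token -> original primary account number (PAN)

    def tokenize(self, pan: str) -> str:
        # The token is random, so it has no mathematical relationship
        # to the PAN and cannot be reversed without the vault.
        token = secrets.token_hex(16)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with access to the vault can recover the PAN.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # random hex string, safe to store
print(vault.detokenize(token))  # 4111111111111111
```

Because the token is generated randomly, possession of the token alone reveals nothing about the card number; all of the value lives in the vault mapping.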
Furthermore, the groups noted that tokenization can be the answer to securing not just payments, but other aspects of commerce as well, including the transmission and storage of electronic health records and age verification identity checks.
“This call stems from the fact that merchants are required to connect to services provided by the payment industry, and they cannot by themselves change the way these systems operate,” said Irene Abezgauz, VP of product management at Quotium. “The magnetic stripe technology is an old technology, over four decades old. A lot about the way we perform payments has changed, but this technology is still with us, and it’s holding us back. The merchants need to comply and work with the interfaces they are given.”
Experts contacted by SecurityWeek agree that tokenization can be an effective solution for securing credit and debit card data.
“Tokenization is a very useful solution that can protect cardholder data at many points in the transaction lifecycle, especially post-authorization and for recurring transactions once a card has been presented,” noted Rob Sadowski, director of Technology Solutions for RSA.
“Tokenization is a great solution that has been utilized for many years to mitigate the risks associated with payment processing for retailers of all types. One downfall to tokenization thus far has been the lack of a standard that everyone followed, so this is a great step forward for consumer financial security,” said Mark Stanislav, security evangelist for Duo Security.
“Tokenization is the best currently available solution to significantly increase the security around payment card data without having to change anything on the cardholder end,” Gregory Nowak, principal research analyst with the Information Security Forum, told SecurityWeek.
While tokenization can be an effective solution, many have pointed out that it’s not enough on its own to protect payment card data.
“[There] are no silver bullet solutions, and protecting payment card data requires a comprehensive, layered approach. End-to-end encryption during the acceptance and authorization process, as well as enhanced card and cardholder authentication technology, also play an important role in card data protection alongside tokenization,” Sadowski explained.
“It is best to view the security of payment card data as a series of layers that address different kinds of security threats. Payment cards can be attacked on several fronts and no solution comprehensively addresses all of them,” said David Tushie, standards and technical representative at the International Card Manufacturers Association (ICMA).
The merchant groups that support the use of tokenization noted that criminals can steal payment card data where the card is swiped or a card number is entered, where card information is stored, and where card information is transmitted.
However, of these three vulnerability points, tokenization fully addresses only the second point, storage, and partially addresses the third point, transmission, explained Raymond Côté, president of Auric Systems International, a company that provides payment processing solutions.
“The press release is also unclear as to what is meant by ‘supported by all networks, brands and payment types.’ Does this imply a tokenized card could be submitted to multiple payment processors? If so, that would turn the token itself into a valuable commodity — an unfavorable outcome. The value of tokenization is that the token itself is valueless,” Côté told SecurityWeek. “One concern in regards to an industry standard for tokenization is that it could result in a monoculture with a single interface into tokenization services. Security exploits propagate quickly within monoculture environments.”
Advantages of Tokenization
The obvious advantage of tokenization is that it preserves the value of cardholder data for merchants and service providers, while making it useless to criminals if it is compromised or stolen, Sadowski said.
“Tokenization dramatically lowers the likelihood of a retailer compromise impacting consumers. Because their credit card details are represented by a token instead, a breach of one retailer won’t require a replacement card to be issued,” Stanislav explained.
Rush Taggart, chief security officer at payment processing firm CardConnect, believes that tokenization offers even better protection than high-quality encryption, because the latter can be broken if attackers obtain the encryption keys or have access to enough computing power.
On the other hand, tokenization doesn’t rely on encryption keys so organizations don’t have to worry about managing such sensitive data. Tokenization offers a higher level of security as long as the tokenization system is logically isolated and segmented from data processing systems and applications that process or store the sensitive data replaced by tokens.
“Another advantage of tokenization is that it can be applied to all types of sensitive data, not just credit and debit card numbers. It can protect Social Security numbers, drivers’ license numbers, electronic health records, prescriptions, and even addresses – all personal information that should be properly protected,” Taggart told SecurityWeek.
Another advantage, pointed out by Nowak, is that tokenization can reduce the scope of systems for which PCI DSS compliance needs to be demonstrated. “Correctly implemented, it can both improve security and lessen the compliance burden,” he explained.
Taggart noted that many merchants in the U.S., both large and small, have already implemented tokenization to radically reduce their PCI scope.
“Maintaining PCI DSS compliance across thousands of machines or more is extremely costly. Maintaining PCI DSS compliance for a small number of systems running the tokenization service is quite manageable. Utilizing a Visa-approved tokenization service provider can reduce PCI DSS compliance to just a few questions,” Taggart explained.
Disadvantages of Tokenization
As with all technologies, tokenization has some disadvantages. One of them, as pointed out by Sadowski, is that the most secure implementations require that the original card number be presented for tokenization. This means that the tokenization solution must have a way to protect the original cardholder data before it is tokenized. Furthermore, the centralized token vault in which the original payment card data is stored becomes an attractive target for criminals.
Another issue, according to Tushie, is related to infrastructure.
“Tokens must be created for each merchant and card account. When a transaction flows through the Merchant and Acquirer processing systems, these account tokens must be de-tokenized so that the Issuer can approve the transaction for a known card account. Transaction settlement messages must be tokenized for Merchant and Acquirer systems,” said Tushie. “This requires new infrastructure of trusted third parties in the transaction flow to tokenize and de-tokenize card accounts in these transactions. While this is not seen as a huge technological hurdle to overcome, it likely will add cost to the transaction fees for these third party services.”
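As a rough illustration of the flow Tushie describes, the Python sketch below walks a tokenized authorization request through the three parties. All of the names, message formats, and the in-memory vault are hypothetical, not taken from any real payment specification.

```python
# Hypothetical transaction flow: the merchant and acquirer handle only
# the token; a trusted third party de-tokenizes it so the issuer can
# approve the transaction against a known card account.

VAULT = {"tok_7f3a": "4111111111111111"}  # token -> PAN (illustration only)

def merchant_to_acquirer(token: str, amount_cents: int) -> dict:
    # Merchant and acquirer systems handle only the token,
    # keeping the real PAN out of their environments.
    return {"token": token, "amount_cents": amount_cents}

def trusted_third_party(msg: dict) -> dict:
    # The de-tokenization step: swap the token for the PAN before
    # the authorization request reaches the issuer.
    return {"pan": VAULT[msg["token"]], "amount_cents": msg["amount_cents"]}

def issuer_authorize(msg: dict) -> bool:
    # The issuer approves the transaction for a known card account.
    return msg["pan"].isdigit() and msg["amount_cents"] > 0

approved = issuer_authorize(trusted_third_party(
    merchant_to_acquirer("tok_7f3a", 2599)))
print("approved:", approved)  # True
```

The extra hop through the de-tokenization service is what adds the infrastructure, and potentially the per-transaction cost, that Tushie mentions.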
Côté believes one of the disadvantages is that using a payment-processor-specific tokenization scheme locks organizations into that particular payment processor. That’s why a tokenization service should support multiple payment processors or be payment-processor agnostic. Other issues highlighted by the expert are the potentially negative impact on transaction speed, and the fact that data analysis can’t be performed on tokenized data, since “good” tokens do not allow the original data to be reconstituted from the token itself.
“The only disadvantage I currently see is the recent watering down of what qualifies as a token. When we first developed our tokenization solution, about eight years ago, only random number strings with no value could be considered tokens,” said Taggart. “Now, there are things known as ‘cryptographically reversible’ tokens, which just seem to be created with a high level of encryption – there is the risk the code could be cracked. The question remains how a business can receive an honest answer as to what type of tokenization a payment provider uses.”
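To make that distinction concrete, here is a minimal Python sketch contrasting the two token types Taggart describes. Fernet, from the third-party cryptography package, stands in for a generic reversible scheme; it is not any provider’s actual tokenization method.

```python
import secrets
from cryptography.fernet import Fernet  # third-party: pip install cryptography

pan = b"4111111111111111"

# Random token: no key exists that turns it back into the PAN;
# reversal requires a lookup in a separately secured vault.
random_token = secrets.token_hex(16)

# "Cryptographically reversible" token: essentially the PAN under
# encryption (Fernet stands in for whatever scheme a provider uses).
key = Fernet.generate_key()
reversible_token = Fernet(key).encrypt(pan)

# Anyone who obtains (or cracks) the key recovers the PAN directly,
# which is the risk Taggart describes.
assert Fernet(key).decrypt(reversible_token) == pan
```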
“One of the concerns is replaying – we need to be sure that our token cannot be replayed or reused, because if it can, attackers will just use this token instead of the actual data and achieve the same criminal goals,” said Abezgauz.
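One way to address that replay concern is single-use tokens that are invalidated the moment they are redeemed. The sketch below assumes the same kind of in-memory vault as earlier and is purely illustrative.

```python
import secrets

_vault: dict = {}  # token -> PAN; illustration only

def issue_single_use_token(pan: str) -> str:
    token = secrets.token_hex(16)
    _vault[token] = pan
    return token

def redeem(token: str) -> str:
    # pop() removes the mapping on first use, so a replayed token
    # no longer resolves to any card data.
    pan = _vault.pop(token, None)
    if pan is None:
        raise ValueError("token already used or unknown")
    return pan

token = issue_single_use_token("4111111111111111")
redeem(token)      # first use succeeds
try:
    redeem(token)  # replay attempt fails
except ValueError as err:
    print(err)     # token already used or unknown
```

The trade-off, as Abezgauz notes later in the article, is the management burden: every transaction now requires minting, tracking, and retiring a fresh token.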
Addressing the Challenges
One of the challenges that must be addressed in implementing tokenization is that existing applications and infrastructure might have to be retrofitted or built from the ground up to accommodate the change, said Stanislav.
Tushie noted that secure financial payment cards, by their nature, must be interoperable in international interchange. For example, a card issued in the United States must work at point-of-sale terminals in Singapore or Paris, and vice versa. This means that international standards addressing the card technology as well as the processing systems in which they work must be developed and agreed upon. However, the expert pointed out that technology frameworks for tokenization have already been proposed by international standards organizations and EMVCo, which manages, maintains and enhances the EMV integrated circuit card specifications for chip-based payment cards and acceptance devices.
“With strong industry backing, clear guidelines, and well devised reference implementations, tokenization will have the best chance to succeed in a timely manner. With consideration for keeping backwards compatibility with the primary account number (PAN), organizations would be likely to adopt tokenization more quickly if there were fewer technical hurdles in their way,” said Stanislav.
On the other hand, Côté highlights that in many cases the barrier to implementation is not technical, but institutional.
“Corporate reluctance to mandate proper security and privacy handling of their data (whether under PCI, HIPAA, or regional security standards) is the largest challenge,” he noted.
According to Abezgauz, a balance must be struck between ensuring tokens cannot be replayed and maintaining usability and scalability.
“Do we provide a new token to the user every time? How do we manage these tokens? We need to make sure we can easily connect the real payment information to the token. For smaller organizations this can be doable, but it’s not trivial when looking at massive scale,” she noted.
“One solution is providing means to do a challenge/response mechanism that will ensure that the user’s information never leaves their possession. One example is the end-to-end encryption solution suggested by Visa, which released recommendations for doing end-to-end encryption to protect user data,” Abezgauz added. “When we talk about user payments though, how do we extend this from POS to online purchases, where users want to be able to perform purchases using their payment card from different devices – mobile devices, laptops and more? Currently, users are not supplied with card readers, though that could be one option for a solution.”
Conclusions
Some experts estimate that it would take several years to adopt tokenization on a wide scale. Abezgauz argues that wide-scale tokenization is not trivial. “I do not see it happening in the near future. I am also not sure that tokenization will eventually prevail over end-to-end encryption solutions,” she said.
Taggart is more optimistic and points out that the recent data breaches have led to a surge in interest in these technologies.
“There are so many variables here that it is hard to say. If there was a dedicated effort to implement tokenization into all payment channels and tokenize all legacy cardholder information in software systems, I would say a lot could be done within the next six months to one year,” he said.
“Tokenization is the pinnacle of data protection technologies. The physical and logical separation of payment and privacy information significantly reduces a company’s exposure to information theft. Well-tested, reliable, and flexible tokenization services are available today; there is no technical reason to delay implementation,” said Côté.
Rick Ricker, VP of enterprise payment solutions at 3Delta Systems, a provider of payment and data tokenization services that specializes in card-not-present tokenization, says his company is pleased that the merchant community is recognizing tokenization as a valid technique for protecting customer data.
“Tokenization has gotten a fair bit of attention lately with efforts by the Clearing House and EMVCo (the card brands) touting solutions for transactional tokens. The proposal to have ANSI X9 create the standard to make it open is an interesting twist; of course, the merchants have an interest in an open system that will have low cost,” Ricker said. “Overlooked in these recent announcements about new standard proposals is that currently there are several hundred thousand merchants that have already adopted tokenization offered by 3Delta and other gateways, as well as the processor community. These solutions have tokenized and protected millions of credit cards. Any merchant desiring a tokenization solution can choose and implement one today without waiting for a new standard.”
Nowak says tokenization is not a magic bullet, and organizations should only implement it after they’ve properly assessed the risks and benefits.
“Its primary benefit is to reduce the number of merchant systems on which cardholder data resides, thus reducing the PCI DSS compliance burden, and hopefully reducing risk of data loss as well. Any organization considering implementing tokenization should do so only if their risk assessments have determined that, of all the major security initiatives they could undertake, this was the one that would have the most significant positive impact,” he explained.
Stanislav believes that the efforts to standardize tokenization are a great step forward and one that, much like EMV adoption in the United States, is well past due.
“The fight now will be to get a single standard created to reduce fragmentation of usage, and also to retrofit, as needed, frontend and backend systems to utilize the adopted standard when it’s ready,” said Stanislav.