Saturday, February 28, 2009

'Tokenization' Touted to Increase Credit Card Data Security

Remember the business bestseller, "Who Moved My Cheese?"

Even the most sophisticated hackers may be asking that very question the next time they attempt a Heartland-size credit card heist if a new data security technology called tokenization catches on with the payment industry.

The concept behind tokenization is remarkably simple: Data thieves can't steal what isn't there.

Tokenization intercepts your card information at the point-of-sale terminal or online payment interface and replaces your cardholder data with randomly generated proxy numbers, or tokens. The transaction then continues, under an assumed name as it were, through the normal authorization process.
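The substitution described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the idea only; the `TokenVault` class and its methods are not from Shift4's 4GO or any vendor's actual API, and a real provider would protect and distribute the token-to-card mapping far more rigorously:

```python
import secrets

class TokenVault:
    """Hypothetical sketch of a tokenization provider's vault:
    the only place a token can be mapped back to a card number."""

    def __init__(self):
        # token -> card number, held only by the provider
        self._vault = {}

    def tokenize(self, pan: str) -> str:
        # The token is random, with no mathematical relationship to the
        # card number, so it cannot be decrypted or reverse-engineered.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the provider can resolve a token back to the card number.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")

# The merchant stores and forwards only the token; stealing the
# merchant's database yields nothing usable.
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

The key design point is that the token is generated randomly rather than derived from the card number, which is why a stolen token is worthless on its own.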

The biggest difference: Your card data is never stored intact anywhere, making it nearly impossible for hackers to reassemble it through decryption or reverse engineering.

Hack into your merchant's database or that of the payment processor and all you'll receive for your trouble are worthless tokens.

The only place your card data actually resides is at the data facility of the third-party provider that administers the tokenization program. But hack into their database and all you'll find is the digital equivalent of jigsaw puzzle pieces scattered across multiple locations.

"People ask, 'Why can't what happened to Heartland happen to you?'" says Randy Carr, vice president of marketing for Shift4, developer of the 4GO tokenization technology. "You would have to steal numerous people in numerous buildings to actually steal a credit card number from us." While no system built by man can be considered 100 percent hack-proof, tokenization may be the next best thing.

"I think the concept of tokenization is good," says Troy Leach, technical director of the Payment Card Industry (PCI) Security Standards Council. "That is why the council is exploring the concept this year. We're asking, 'Does tokenization simplify the process of PCI compliance for merchants, or does it provide additional complexity?'"


The March 2009 issue of (IN)SECURE Magazine (PDF) also contains a good write-up on tokenization.
