TOKENIZATION FOR CLOUD DATA PROTECTION
Perspecsys Cloud Tokenization

Tokenization Defined
Tokenization is a process in which a sensitive data field is replaced with a surrogate value called a token. Detokenization is the reverse process of replacing a token with its associated clear text value. Depending on the particular implementation of a tokenization solution, tokens can be used to achieve compliance with requirements that stipulate how companies must treat and secure sensitive data, including data residency and sovereignty requirements and guidelines such as PCI DSS, HITECH & HIPAA, CJIS, and Gramm–Leach–Bliley. Whether sensitive data resides within on-premise systems or in the cloud, transmitting, storing, and processing tokens instead of the original data is an acknowledged, industry-standard method for securing sensitive information.
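To make the mechanics concrete, the following minimal sketch (in Python, and not the Perspecsys implementation) shows tokenization and detokenization against a simple token vault; the vault and function names are illustrative assumptions.

    # Minimal sketch (not the Perspecsys implementation): replace a sensitive
    # value with a random surrogate and reverse the swap only through the vault.
    import secrets

    token_vault = {}                           # token -> clear text, kept on premise

    def tokenize(clear_text):
        token = "TKN-" + secrets.token_hex(8)  # surrogate with no link to the data
        token_vault[token] = clear_text
        return token

    def detokenize(token):
        return token_vault[token]              # impossible without access to the vault

    t = tokenize("4111 1111 1111 1111")
    print(t, "->", detokenize(t))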
Perspecsys’ AppProtex Cloud Data Control Gateway enables enterprises to define data protection policies governing how sensitive data is secured and protected when stored in cloud applications. When defining these policies, authorized administrators can select, on a field-by-field basis, whether to allow a field to remain in clear text, to encrypt the field data, or to replace it with a token. When tokens are used as the obfuscation method, the sensitive data itself never leaves the organization’s control.
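As an illustration of field-by-field policy choices, the hypothetical mapping below routes each field to clear text, encryption, or tokenization. The field names, action labels, and helper functions are assumptions for the sketch, not the gateway's actual configuration syntax.

    # Illustrative only: a field-by-field protection policy as a plain mapping.
    # Field names and action labels are hypothetical, not gateway configuration.
    import secrets

    field_policy = {
        "account_name":  "cleartext",   # low sensitivity, leave readable
        "email_address": "encrypt",     # reversible under an enterprise-held key
        "national_id":   "tokenize",    # surrogate only; real value stays on premise
    }

    def protect(field, value, encrypt_fn, tokenize_fn):
        # Route each outbound field through the action the administrator selected.
        action = field_policy.get(field, "tokenize")   # default to the safest option
        if action == "cleartext":
            return value
        if action == "encrypt":
            return encrypt_fn(value)
        return tokenize_fn(value)

    print(protect("national_id", "123-45-6789",
                  encrypt_fn=str,  # stand-in; a real cipher would go here
                  tokenize_fn=lambda v: "TKN-" + secrets.token_hex(8)))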
How Does Tokenization Differ From Encryption?
Encryption is an obfuscation approach that uses a cipher algorithm to mathematically transform data. The resulting encrypted value can be transformed back to the original value via the use of a key. While encryption can be used to obfuscate a value, a link back to its true form still exists. Tokenization is unique in that it completely removes the original data from the systems in which the tokens reside.
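The difference can be seen in a short sketch: an encrypted value is recovered with the key and algorithm, while a token can only be resolved through a lookup table. The example below assumes Python with the third-party cryptography package and is purely illustrative.

    # Encryption vs. tokenization: the ciphertext is reversible with the key;
    # the token is only resolvable through the vault's lookup table.
    from cryptography.fernet import Fernet   # pip install cryptography
    import secrets

    key = Fernet.generate_key()
    ciphertext = Fernet(key).encrypt(b"jane.doe@example.com")
    print(Fernet(key).decrypt(ciphertext))   # key + algorithm recover the original

    vault = {}
    token = "TKN-" + secrets.token_hex(8)    # no mathematical link to the original
    vault[token] = "jane.doe@example.com"
    print(vault[token])                      # only the vault resolves the token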
Making Public Clouds Private
The AppProtex Gateway gives enterprises the ability to monitor and discover how cloud applications are being used within the organization and take steps to protect and secure sensitive data, ensuring it never leaves an enterprise’s control.
Within Perspecsys, tokens are randomly generated strings of characters with no mathematical or logical association to the clear text data they replace (unlike some "masking" systems that tokenize only part of a string). The Perspecsys tokenization system generates and assigns a new token for each unique piece of data it receives, within a defined token space and based on a sequence. For long text data type fields, Perspecsys generates a single token for the entire string rather than a token for each unique word in the string. Because tokens are randomly generated and arbitrarily assigned, knowing the clear text value of a single token gives an adversary no insight into the value of any other token. End users can still perform operations such as searching and sorting on data that has been tokenized in the cloud due to the innovative capabilities of the gateway.
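A rough sketch of that behavior is shown below, under the assumption of a 12-character upper-case alphanumeric token space: each unique value, including an entire long-text field, is assigned exactly one randomly drawn token.

    # Assumed token space: 12 upper-case alphanumerics. Each unique clear-text
    # value, including an entire long-text field, maps to exactly one token.
    import secrets
    import string

    TOKEN_SPACE = string.ascii_uppercase + string.digits
    assignments = {}                      # clear text -> token, one per unique value

    def token_for(clear_text, length=12):
        if clear_text not in assignments:
            assignments[clear_text] = "".join(
                secrets.choice(TOKEN_SPACE) for _ in range(length))
        return assignments[clear_text]

    note = "Customer called about invoice 4417; follow-up promised for Friday."
    print(token_for(note))                # one token for the whole string
    print(token_for(note))                # same value -> same token
    print(token_for("another note"))      # unrelated token; reveals nothing else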
Tokenization Advantages
• Tokens cannot be returned to their corresponding clear text values without access to the original “lookup” table that matches them to their original values. These tables are typically kept in a database in a secure location inside a company’s firewall.
• Tokens can be made to maintain the same structure and data type as their original values (illustrated in the sketch below). While format-preserving encryption can also retain structure and data type, it is still reversible to the original value given the key and algorithm.
• Unlike encrypted values, which reflect the relative length of their clear text value, tokens can be generated so that they bear no relationship to the length of the original value.
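The sketch below illustrates the two shaping options using assumed formats (a digits-only, same-structure token for a card number, and a fixed-length surrogate); it is not the product's actual token format.

    # Two shaping options (formats assumed for illustration): a token that keeps
    # the structure and data type of a card number, and a fixed-length token
    # that is unrelated to the original value's length.
    import secrets
    import string

    def format_preserving_token(card_number):
        # Same length and grouping, digits only, so downstream fields still validate.
        return "".join(secrets.choice(string.digits) if ch.isdigit() else ch
                       for ch in card_number)

    def length_independent_token(_value, length=20):
        # Fixed-size surrogate that reveals nothing about the original's length.
        return secrets.token_hex(length // 2)

    print(format_preserving_token("4111 1111 1111 1111"))
    print(length_independent_token("4111 1111 1111 1111"))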
Tokenization and Data Residency
Because tokens cannot be mathematically reversed back to their original values, tokenization is frequently the de facto approach to addressing data residency. Depending on the countries in which they operate, companies often face strict regulatory guidelines governing their treatment of sensitive customer and employee information. These data residency laws mandate that certain types of information must remain within a defined geographic jurisdiction. In cloud environments, where data centers can be located in various parts of the world, tokenization can be used to keep sensitive data local (resident) while tokens are stored and processed in the cloud.
Perspecsys 3rd Party Assessment
Perspecsys’ tokenization solution has been assessed by Coalfire™, a third-party PCI DSS QSA and FedRAMP 3PAO, for compliance with the PCI DSS tokenization standards. Key findings from the report include:
• Perspecsys tokens were observed to have no relation to the data for which the token was generated.
• Tokenization conforms to the PCI SSC Tokenization Guidelines.
• All tokenization components were located on secure internal networks that are isolated from any untrusted and out-of-scope networks.
• Only trusted communications were permitted in and out of the tokenization system environment.
• The tokenization solution enforced strong cryptography and security protocols to safeguard cardholder data when stored and during transmission over open, public networks.
• The tokenization solution implemented strong access controls and authentication measures in accordance with PCI DSS Requirements 7 and 8.
• The tokenization system components are designed to strict configuration standards and are protected from vulnerabilities.
• The tokenization solution supports a mechanism for secure deletion of data as required by a data-retention policy.
• The tokenization solution implements logging, monitoring, and alerting as appropriate to identify suspicious activity and initiate response procedures.
The complete report is available at www.Perspecsys.com
Contact us today to learn more or request a demo:
Email: sales@perspecsys.com
Phone: +1 703-712-4752 (USA) | +1 905-857-0411 (Canada) | +44 207-868-2037 (Europe)

© 2014 Perspecsys Inc. This material may not be reproduced, displayed, modified or distributed without the permission of Perspecsys Inc. Perspecsys, the Perspecsys logo and the Perspecsys AppProtex Cloud Data Control Gateway are trademarks or registered trademarks of Perspecsys Inc. in the United States, other countries or both. Other company, product, and service names and images may be trademarks or service marks of others. References in this publication to Perspecsys products or services do not imply that Perspecsys intends to make them available in all countries in which Perspecsys operates. Other names may be trademarks of their respective owners.