Tokenization For Cloud Data Protection

Tokenization Defined

Tokenization is a process in which a sensitive data field is replaced with a surrogate value called a token. Detokenization is the reverse process of replacing a token with its associated clear text value. Depending on the particular implementation, tokens can be used to achieve compliance with requirements that stipulate how sensitive data must be treated and secured, including data residency and sovereignty requirements and guidelines such as PCI DSS, HITECH and HIPAA, CJIS, and Gramm–Leach–Bliley. Whether sensitive data resides in on-premise systems or in the cloud, transmitting, storing, and processing tokens instead of the original data is an acknowledged, industry-standard method for securing sensitive information.
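As a rough illustration of the two operations, the sketch below keeps a simple in-memory token vault: the clear text value is stored only in the vault, and a random surrogate is handed out in its place. The vault structure and function names are illustrative assumptions, not part of any Perspecsys interface.

```python
import secrets

# A minimal sketch of tokenization and detokenization using an in-memory
# "token vault". The names below (token_vault, tokenize, detokenize) are
# illustrative only, not Perspecsys API calls.

token_vault = {}     # token -> clear text, kept under the data owner's control
reverse_index = {}   # clear text -> token, so repeated values reuse one token

def tokenize(clear_text: str) -> str:
    """Replace a sensitive value with a random surrogate token."""
    if clear_text in reverse_index:
        return reverse_index[clear_text]
    token = "TOK-" + secrets.token_hex(8)   # random; no mathematical link to the input
    token_vault[token] = clear_text
    reverse_index[clear_text] = token
    return token

def detokenize(token: str) -> str:
    """Reverse process: look the token up in the vault."""
    return token_vault[token]

ssn_token = tokenize("078-05-1120")
print(ssn_token)               # e.g. TOK-3f9a61c2b417d8e0
print(detokenize(ssn_token))   # 078-05-1120
```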

Perspecsys Cloud Tokenization

Perspecsys’ AppProtex Cloud Data Control Gateway enables enterprises to define data protection policies that govern how sensitive data is secured when it is stored in cloud applications. When defining these policies, authorized administrators can select, on a field-by-field basis, whether a field remains in clear text, is encrypted, or is replaced with a token. When tokens are used as the obfuscation method, sensitive data never leaves the organization’s control.
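A hedged sketch of what such a field-by-field policy could look like in practice follows. The policy format, field names, and helper functions are assumptions for illustration and are not the AppProtex configuration syntax; the encryption option uses the `cryptography` package purely as an example cipher.

```python
import secrets
from cryptography.fernet import Fernet   # assumption: the `cryptography` package is available

# Hypothetical field-by-field protection policy. Policy values, field names,
# and helpers are illustrative, not the AppProtex configuration syntax.

FIELD_POLICY = {
    "account_name": "cleartext",   # low sensitivity: leave readable in the cloud
    "email":        "encrypt",     # obfuscate, reversible with an enterprise-held key
    "ssn":          "tokenize",    # replace outright; clear text never leaves the enterprise
}

enterprise_key = Fernet.generate_key()   # stays on-premise
token_vault = {}                         # token -> clear text, stays on-premise

def _encrypt(value: str) -> str:
    return Fernet(enterprise_key).encrypt(value.encode()).decode()

def _tokenize(value: str) -> str:
    token = "TOK-" + secrets.token_hex(8)
    token_vault[token] = value
    return token

def apply_policy(record: dict) -> dict:
    """Return the record as it would be sent to the cloud application."""
    handlers = {"cleartext": lambda v: v, "encrypt": _encrypt, "tokenize": _tokenize}
    return {field: handlers[FIELD_POLICY.get(field, "tokenize")](value)
            for field, value in record.items()}

print(apply_policy({"account_name": "Acme Corp",
                    "email": "jane@acme.example",
                    "ssn": "078-05-1120"}))
```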

How Does Tokenization Differ From Encryption?

Encryption is an obfuscation approach that uses a cipher algorithm to mathematically transform data. The resulting encrypted value can be transformed back to the original value via the use of a key. While encryption can be used to obfuscate a value, a link back to its true form still exists. Tokenization is unique in that it completely removes the original data from the systems in which the tokens reside.
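The difference can be seen in a few lines: an encrypted value is recoverable by anyone holding the key, while a token can only be resolved through a lookup table that stays with the data owner. The example below assumes the `cryptography` package purely for illustration.

```python
import secrets
from cryptography.fernet import Fernet   # assumption: the `cryptography` package is available

# Encryption: the ciphertext is a mathematical transform of the input, so the
# key always leads back to the clear text.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"078-05-1120")
print(Fernet(key).decrypt(ciphertext))   # b'078-05-1120'

# Tokenization: the token is pure random, not derived from the value. The only
# way back is the lookup table, which stays with the data owner.
token = "TOK-" + secrets.token_hex(8)
vault = {token: "078-05-1120"}
print(vault[token])                      # resolvable only by whoever holds the vault
```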

Making Public Clouds Private

The AppProtex Gateway gives enterprises the ability to monitor and discover how cloud applications are being used within the organization and take steps to protect and secure sensitive data, ensuring it never leaves an enterprise’s control.

Within Perspecsys, tokens are randomly generated strings of characters with no mathematical or logical association to the clear text data they replace (unlike some "masking" approaches that leave part of the original string intact). The Perspecsys tokenization system generates and assigns a new token for each unique piece of data it receives, within a defined token space and based on a sequence. For long text data type fields, Perspecsys generates a single token for the entire string, not a token for each unique word in the string. Because tokens are randomly generated and arbitrarily assigned, knowing the clear text value of one token gives an adversary no insight into the value of any other token. End users can still perform operations such as searching and sorting on data that has been tokenized in the cloud, thanks to the innovative capabilities of the gateway.
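The sketch below illustrates, under simplified assumptions, how a gateway-style component could keep search working over tokenized data: a long text field receives a single token, and a clear-text query is resolved to matching tokens on-premise before the cloud is queried by token. It is a rough model of the idea, not the gateway's actual implementation.

```python
import secrets

# Sketch of search over tokenized data, assuming a gateway-style component
# that holds the token vault on-premise. The cloud application stores only
# tokens; the clear-text comparison happens locally. Function and variable
# names are illustrative, not the gateway's actual implementation.

vault = {}   # token -> clear text

def tokenize_long_text(text: str) -> str:
    """One token for the entire string, not one per word."""
    token = "TOK-" + secrets.token_hex(12)
    vault[token] = text
    return token

def resolve_query(query: str) -> list:
    """Translate a clear-text query into the tokens whose clear text matches;
    those tokens can then be used to filter records held in the cloud."""
    return [tok for tok, clear in vault.items() if query.lower() in clear.lower()]

t1 = tokenize_long_text("Customer reported a billing issue on invoice 4417")
t2 = tokenize_long_text("Shipping address updated by support")
print(resolve_query("invoice") == [t1])   # True: matched on-premise, queried by token
```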

