Tokenization
With tokenization, a meaningful piece of data, e.g. an account
number, is converted to a random string called a token, which
has no exploitable value if it is exposed. Tokens serve as a
reference to the original data but cannot be used to guess
those values. That's because, unlike encryption, tokenization
doesn't use a mathematical process to turn the sensitive
information into the token. There is no key or algorithm that
can be used to derive the original data from a token. Instead,
tokenization uses a database, called a token vault, that stores
the relationship between the sensitive value and the token. The
real data in the vault is then secured, often using encryption.
The token value can be used in various applications as a
substitute for the real data. When the real data needs to be
retrieved - for example when processing a recurring credit card
payment - the token is submitted to the vault and the index is
used to retrieve the real value for the authorization process.
For the end user, this process is performed seamlessly by the
browser or application, almost instantaneously; the user is
typically unaware that the data is stored remotely in a
different format.
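The vault-based lookup described above can be sketched in a few lines of Python. This is a hypothetical, minimal illustration (the class name `TokenVault` and its methods are invented for this example); a real vault would encrypt the stored values and enforce strict access controls.

```python
import secrets

class TokenVault:
    """Toy token vault: maps random tokens to the original sensitive values.

    Illustrative sketch only -- a production vault would encrypt the
    stored values and restrict who may call detokenize().
    """

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # The token is purely random: it has no mathematical
        # relationship to the value it replaces.
        token = secrets.token_hex(8)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Recovering the real value requires a lookup in the vault;
        # no key or algorithm could derive it from the token alone.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"                  # token reveals nothing
assert vault.detokenize(token) == "4111111111111111"  # vault lookup restores it
```

The dictionary stands in for the vault's index: submitting the token is the only way back to the real value, which is why compromising tokens outside the vault yields nothing.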
The advantage of tokens is that there is no mathematical
relationship to the real data they represent. If tokens are
stolen, they are meaningless; no key can convert them back to
the actual data values.
The format of a token can also be designed to make it more
useful. For example, the last four digits of a payment card
number can be preserved in the token so that the tokenized number
(or part of it) can be printed on the customer's receipt, letting
the customer see a reference to their actual credit card number.
The printed characters could be all asterisks plus the last four
digits. In this case, for security reasons, the merchant only
retains a tokenized card number, not the real card number.
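A format-preserving token of this kind can be sketched as follows. The function names and the masking format here are assumptions chosen for illustration; real payment systems use certified tokenization services rather than ad hoc code like this.

```python
import secrets
import string

def tokenize_card_number(pan: str) -> str:
    """Replace all but the last four digits with random digits.

    Hypothetical sketch: the token keeps the card number's length
    and its last four digits, so it still "looks like" a card number.
    """
    random_digits = "".join(secrets.choice(string.digits)
                            for _ in range(len(pan) - 4))
    return random_digits + pan[-4:]

def receipt_display(token: str) -> str:
    # Mask everything except the last four digits for the printed receipt.
    return "*" * (len(token) - 4) + token[-4:]

token = tokenize_card_number("4111111111111111")
assert token[-4:] == "1111"            # last four digits survive
print(receipt_display(token))          # prints ************1111
```

Because only the last four digits carry over, the receipt gives the customer a recognizable reference while the merchant never stores the full card number.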