Tokenization of Data

Data tokenization is a cybersecurity method that replaces sensitive cleartext data with an algorithmically generated, non-sensitive token that obscures the content of the original data.
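
As a rough illustration (not any particular product's API), the sketch below shows the core idea under a common assumption: a secure vault holds the mapping between each sensitive value and a randomly generated token, so the token itself carries no information about the original data. The class and method names (TokenVault, tokenize, detokenize) are hypothetical.

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: maps sensitive values to random tokens.
    A real system would keep this mapping in hardened, access-controlled storage."""

    def __init__(self):
        self._token_to_value = {}  # token -> original sensitive value
        self._value_to_token = {}  # reverse lookup so a repeated value reuses its token

    def tokenize(self, value: str) -> str:
        # Reuse the existing token if this value was tokenized before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Generate a random token; it reveals nothing about the original value.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the sensitive value.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")  # e.g., a card number
print(token)                    # opaque token, safe to pass to downstream systems
print(vault.detokenize(token))  # original recoverable only through the vault
```

The key property shown here is that, unlike encryption, the token is not derived from the original value at all; recovering the data requires access to the vault's mapping, not a key.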