Data tokenization is a substitution technique in which private or sensitive data elements are replaced with randomly generated alphanumeric strings, called tokens. Because tokens are generated independently of the data they replace, they carry no exploitable value on their own, and the original value or dataset cannot be reverse-engineered from a token.
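
The sketch below illustrates the idea in Python under simplifying assumptions: the "token vault" is just an in-memory dictionary, and the `tokenize`/`detokenize` helper names are hypothetical, not part of any particular product. It shows the key property described above: the token is pure random output, so recovering the original requires access to the vault, not any computation on the token itself.

```python
import secrets
import string

# Hypothetical in-memory "token vault": maps token -> original value.
# Real deployments keep this mapping in a hardened, access-controlled store.
_vault: dict[str, str] = {}

_ALPHABET = string.ascii_letters + string.digits


def tokenize(value: str, length: int = 16) -> str:
    """Replace a sensitive value with a randomly generated alphanumeric token."""
    token = "".join(secrets.choice(_ALPHABET) for _ in range(length))
    # The token is derived from a random source, not from the original value,
    # so it cannot be reverse-engineered; recovery is only possible via the vault.
    _vault[token] = value
    return token


def detokenize(token: str) -> str:
    """Recover the original value by looking it up in the vault (authorized use only)."""
    return _vault[token]


if __name__ == "__main__":
    card_number = "4111 1111 1111 1111"
    token = tokenize(card_number)
    print("Stored token:", token)                 # e.g. 'aZ3kP9Qx1LmN0rT2'
    print("Recovered via vault:", detokenize(token))
```

Downstream systems can store and pass around the token freely; only the component holding the vault can map it back to the sensitive value.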

