tokenization
Tokenization is the process of replacing a piece of sensitive data, such as a credit card number, with a surrogate value known as a token. The original sensitive data generally still must be stored securely in one centralized location for subsequent reference, with strong protections around it. The security of a tokenization approach therefore depends both on the protection of those stored values and on the algorithm and process used to create the surrogate and map it back to the original.
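As a minimal sketch of the idea in Python: a random surrogate stands in for the sensitive value, and a centralized mapping is the only way to recover the original. The names `TokenVault`, `tokenize`, and `detokenize` are hypothetical, and the in-memory dictionary stands in for what would, in practice, be an encrypted, access-controlled datastore.

```python
import secrets

class TokenVault:
    """Illustrative token vault: sensitive values are swapped for random
    surrogate tokens, and the token-to-value mapping is kept in one
    centralized place. (In-memory dict for illustration only; a real
    vault would be an encrypted, access-controlled datastore.)"""

    def __init__(self):
        self._token_to_value = {}  # centralized mapping: token -> original value

    def tokenize(self, sensitive_value: str) -> str:
        # Generate a random surrogate with no mathematical relationship
        # to the original value, so the token alone reveals nothing.
        token = secrets.token_hex(16)
        self._token_to_value[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original value.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # well-known test card number
print(token)                    # surrogate, safe to store downstream
print(vault.detokenize(token))  # original, recoverable only via the vault
```

Note that the token is generated randomly rather than derived from the card number, so compromising downstream systems that hold only tokens does not expose the sensitive data; everything hinges on protecting the vault itself.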
- Part of Speech: noun
- Industry/Domain: Technology
- Category: Information technology
- Company: Gartner