Tokenization on TD NonStop Systems

Michelle West
Cards and Merchant Solutions
TD Bank Financial Group
04.20.2016



Agenda

1. Who is TD Bank Group?

2. Why TD Bank Needs to Secure Data-at-Rest

3. Intro to Tokenization

4. Alternatives Considered

5. Implementation Plan

6. Results

7. Summary


1. Who is TD Bank Group?

The Toronto-Dominion Bank is a Canadian multinational banking and financial services corporation headquartered in Toronto, commonly known as TD and operating as TD Bank Group.

TD Bank Group is the largest bank in Canada by market capitalization and a top-10 bank in North America.

In Canada, the bank operates as TD Canada Trust and serves more than 11 million customers at over 1,150 branches. In the United States, the company operates as TD Bank. The U.S. subsidiary serves more than 6.5 million customers with a network of more than 1,300 branches in the eastern United States.

Both the POS and ABM systems use ACI's Base24 and HPE Services.

2. Why TD Bank Needs to Secure Data-at-Rest

– Protection of customer data: data breach threats are everywhere!

– PCI requirements for both POS and ATM (QSA audit requirement):

– Requirement 3.2.1: Do not store the full contents of any track from the magnetic stripe (located on the back of a card, contained in a chip, or elsewhere). This data is alternatively called full track, track, track 1, track 2, and magnetic-stripe data.

– Requirement 3.2.2: Do not store the card verification code or value (three-digit or four-digit number printed on the front or back of a payment card) used to verify card-not-present transactions.

– Requirement 3.4: Render the PAN unreadable anywhere it is stored.

– Identified risk: storing sensitive cardholder data without encryption or tokenization may facilitate opportunity for its disclosure to individuals who are not authorized to access this data and who may use the information for fraudulent activity.
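Auditing against Requirement 3.4 usually starts with finding where PANs are actually stored. A generic first-pass filter (a sketch, not anything from this deck) is to scan files for digit runs that pass the Luhn check, which genuine card numbers satisfy:

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum, satisfied by genuine card numbers."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_candidate_pans(text: str) -> list[str]:
    """Flag 16-digit runs that pass the Luhn check as candidate PANs."""
    return [run for run in re.findall(r"\d{16}", text) if luhn_valid(run)]
```

A scan like this, pointed at log and extract files, is one way a QSA-style review surfaces cleartext PANs that need protecting.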


3. Tokenization – the concept

– Sensitive data (e.g., PANs) are replaced with multi-use tokens in the database
– Tokens maintain the format of the original data
– PANs can be reconstructed from tokens (not just a one-way hash)
– Not just for PANs!

Tokenization Engine example: PAN 4026157151401408 can be tokenized as

– a 16-byte alphanumeric token: AB3ce7xn12VT5982
– a numeric token: 4738218214678978
– a “664” token, preserving the first six and last four digits: 402615xn12VT1408
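As a concrete illustration of the "664" scheme, the sketch below keeps the first six and last four digits and replaces only the middle six with letters A-H (mirroring the token alphabet TD describes later). `Toy664Tokenizer` and its in-memory vault are hypothetical, just one simple way to get reversible, multi-use tokens; SecurData itself is stateless and far more robust.

```python
import secrets

class Toy664Tokenizer:
    """Vault-backed "664" tokenizer sketch: keep the first six and last
    four digits, replace the middle six with letters A-H, and remember
    the mapping so the original PAN can be recovered. Toy code only."""

    def __init__(self) -> None:
        self._pan_to_tok: dict[str, str] = {}
        self._tok_to_pan: dict[str, str] = {}

    def tokenize(self, pan: str) -> str:
        if len(pan) != 16 or not pan.isdigit():
            raise ValueError("expected a 16-digit PAN")
        if pan in self._pan_to_tok:        # multi-use: same PAN, same token
            return self._pan_to_tok[pan]
        middle = "".join(secrets.choice("ABCDEFGH") for _ in range(6))
        token = pan[:6] + middle + pan[-4:]
        self._pan_to_tok[pan] = token
        self._tok_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._tok_to_pan[token]
```

Because the first six and last four digits survive, BIN routing and last-four displays keep working on the token, which is why the format-preserving scheme needs no database changes.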


Transaction Log before Tokenization


$B2402.RYN1PTLF.PO110114 RECORD 11 KEY 12290 (%30002) LEN 1066

0: ....S...01VISAVISA4026157151401408 000RYN1AIB10015001588888830

35: 88888830 001001RYN1AIB188888830 1026410088888830

70: 588888830 11111100210001399....S...................1101

105: 1410264100110114000000110114000000005605TEST TERMINAL ASSET ML JOE

140: DOE NEW YORK IE IE0000 ..63049300000000000000007011

175: 11110000000000005999B24 B24 100000V 050............

210: ....1306M4026157151401408=1306?

245: P1A^APACS^02 9001000 6910000000000

280: 02000001501109789786100000097861000000........1220

315: 00 00000000000

350: 0000 00

385: & ....! 04.. 0 Y ! C0..111 2

420: 7 1 ! C1..S1A^APACS^AST^02! C4..20351000061 ! B4..011500..

455: 15060 ! P0.& 88888830 ! B8."

490: POS ! B9.< ISO000000

525:

Transaction Log after Tokenization


$B2402.RYN1PTLF.PO110114 RECORD 11 KEY 12290 (%30002) LEN 1066

0: ....S...01VISAVISA402615xn12VT1408 000RYN1AIB10015001588888830

35: 88888830 001001RYN1AIB188888830 1026410088888830

70: 588888830 11111100210001399....S...................1101

105: 1410264100110114000000110114000000005605TEST TERMINAL ASSET ML JOE

140: DOE NEW YORK IE IE0000 ..63049300000000000000007011

175: 11110000000000005999B24 B24 100000V 050............

210: ....1306M402615xn12VT1408=1306?

245: P1A^APACS^02 9001000 6910000000000

280: 02000001501109789786100000097861000000........1220

315: 00 00000000000

350: 0000 00

385: & ....! 04.. 0 Y ! C0..111 2

420: 7 1 ! C1..S1A^APACS^AST^02! C4..20351000061 ! B4..011500..

455: 15060 ! P0.& 88888830 ! B8."

490: POS ! B9.< ISO000000

525:

4. Alternatives considered

Alternatives considered: Volume Level Encryption vs. comForte SecurData/Tokenization

Alternatives considered: VLE (Volume Level Encryption)

Using VLE with the storage CLIM is an effective way to protect the disk from physical theft.

[Diagram: data moves "in the clear" within the running system but sits encrypted on disk; a thief with the physical disk sees only the encrypted DB. "Rats! I can't exploit encrypted data."]

Is VLE (Volume Level Encryption) enough?

[Diagram: on a running system with the disk mounted, the data is "in the clear" to any authorized process. A single TACL command copies the encrypted volume to an unprotected one:

>FUP DUP $VLEDISK.SECURE, $UNSECURE.UNSAFE

VLE doesn't protect from TACL attacks: the copy lands "in the clear" in the destination DB. "That was easy!"]

Introducing SecurData: Transparent tokenization for HP NonStop

SecurData transparently tokenizes sensitive data in the DB (Enscribe or SQL). No changes to application code are required.

[Architecture diagram: the application's Enscribe file-system calls pass through a transparent I/O intercept into the SecurData Manager; the SecurData Tokenization Engine (with its API, audit log, and stateless tokenization table) swaps PANs for tokens before they reach the DB, and back again on reads.]
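The "stateless tokenization table" in the diagram means tokens can be computed and reversed from a key alone, with no vault lookup. A toy illustration of that property (a keyed per-digit substitution; emphatically not a secure format-preserving encryption mode, and not comForte's actual algorithm):

```python
import hashlib
import hmac
import random

DIGITS = "0123456789"

def _perm_for(key: bytes, position: int) -> str:
    """Derive a deterministic digit permutation for one PAN position
    from the key alone (a keyed shuffle; illustrative only)."""
    seed = hmac.new(key, str(position).encode(), hashlib.sha256).digest()
    perm = list(DIGITS)
    random.Random(seed).shuffle(perm)
    return "".join(perm)

def tokenize(key: bytes, pan: str) -> str:
    """Stateless: the token is computed from the key, no vault needed."""
    return "".join(_perm_for(key, i)[int(d)] for i, d in enumerate(pan))

def detokenize(key: bytes, token: str) -> str:
    """Reversible with the same key, again without any lookup table."""
    return "".join(str(_perm_for(key, i).index(c)) for i, c in enumerate(token))
```

Statelessness is what lets every node tokenize and de-tokenize consistently without replicating a mapping database; production schemes use vetted format-preserving encryption rather than per-digit substitution.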

Alternatives considered: Volume Level Encryption (VLE)

– Pro:
– Effective way to protect sensitive data from physical theft
– Easy to implement

– Con:
– NO protection against attacks while the system is running and the disk is mounted

Tokenization

– Pro:
– Fully protects data-at-rest (satisfies PCI 3.4)
– Format preserving (no database changes required)
– Application transparent (no code changes)
– Can be implemented in phases (no "Big Bang")

– Con:
– Requires migration strategy

Progress with SecurData

Part 5: Implementation Plan


ABM Tokenization Pilot Application

– HPE, ACI and comForte scheduled a 2-day workshop to familiarize the technical resources with the SecurData software.

– By the end of the first day the software was loaded, and a tokenization/de-tokenization of the ACOF file in a test environment was successful.


SecurData Pilot – Status Quo

[Diagram, status quo: on the HP NonStop, BASE24 classic reads, processes, and updates entries in the B24 Production Enscribe DB (incl. PANs). An ACI TACL macro invokes FUP, which writes an intermediate Enscribe file (incl. PANs) to the filesystem. FTPSERV then sends that file over FTPS, via a TCP/IP process on local loopback port 21 and NonStop SSL on port 1234, to the IBM Mainframe. PANs (PAN1, PAN2, ...) sit in the clear in the intermediate file.]

SecurData Pilot – Environment with SecurData in place

[Diagram, with SecurData: the same flow, but SecurData intercepts the I/O on both sides. Entries are read with de-tokenization on the fly, and the FUP step writes to the filesystem with tokenization on the fly, so the intermediate Enscribe file contains tokens rather than PANs before FTPSERV sends it over FTPS to the IBM Mainframe.]

Lessons Learned From ABM Tokenization Pilot Test Environment

– Test environment needs to closely match the production environment.
– The ACOF FTP test environment was very different from the production environment; this led to difficulties in testing the FTP communication configuration.

Performance
– The tokenization/de-tokenization process did not cause any performance degradation.

Lessons Learned (continued)

Strong comForte support
– comForte technical support was available and very knowledgeable
– comForte test environments were leveraged to work out FTP issues that arose

HPE team very familiar with the SecurData product
– Leverage proven HPE process
– Leverage HPE/comForte relationship

ABM Implementation

ABM Team decided on a Phased Approach

Plans for this phased approach began in April of 2014.

Phases:
1. ILF files
2. TLF files
3. CAF

ABM Tokenization completed successfully in production in March of 2015.

POS Implementation

Due to TD org structure, the ABM team and the POS team fall under different divisions.

Because both ABM and POS run Base24, there are often synergies within our projects.

There are multiple projects that both ABM and POS work with HPE to implement:
– it often makes sense for HPE to work with POS on one and with ABM on another and then, once these are completed, swap;
– lessons learned from one benefit the other.

ABM implemented Tokenization while POS worked with HPE on another project.

Once ABM was implemented, HPE and comForte started working with the POS team to implement Tokenization.

POS Tokenization phases: Avoiding the “Big Bang”

Phase | Activity | Status
TD POS Tokenization Phase 1 | Implementation of the Defines, installing SecurData, tokenizing 1 small file | Completed
TD POS Tokenization Phase 2 | OMF Audit, Visa ILF, Banknet ILF, Vantiv ILF, ICTS SAF, PIP SAF, NRT SAF, IMNI SAF, 1 IMNI ILF (Acq, Iss), CTF Extracts, IMNAdmn | Completed
TD POS Tokenization Phase 3 | PTLF, VMS Extracts, remainder of IMNI ILFs, SMS Extract, SMS ILF | Completed
TD POS Tokenization Phase 4 | Encrypted Extracts (FTP will decrypt before it is sent to the HOST: SMS, CTF ILF Extract, VMS PTLF Extract, PTD) | In Progress
TD POS Tokenization Phase 5 | CAF | Scheduled for May 2016

The Phased Approach:

Phase 0 Pre-implementation activity:
– Added the DEFINE referring to the SecurData Manager
– Recycled Pathway and the Base24 POS nodes

Phase 1
– HPE resources installed the SecurData vault with TD Key Custodians
– Chose small ILF files to tokenize in order to ensure tokenization was functioning as expected
– Updated utility tools with the SecurData library to make sure we could still detokenize the PANs in these files and view them

The Phased Approach:

Phase 2 Used the SDF to set dates/times for files to be tokenized
– The files we were planning on tokenizing were larger than in the last phase, so we wanted to be cautious. We set it up in the SDF to tokenize some on the first day and slowly, throughout the week, tokenize more of the same type.
– By the end of the week, BNET and Visa ILFs were completed, as well as 1 of each (acquiring and issuing) IMN ILFs
– Also started to tokenize some SAF files and the OMF audit file
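The per-file start times described above amount to a simple gate at write time. A hypothetical sketch of the idea; `tokenization_active`, the schedule mapping, the file-type names, and the dates are all stand-ins for the real SDF configuration:

```python
from datetime import datetime

# Hypothetical stand-in for the SDF's per-file tokenization start times,
# staggered across a week so files cut over a few at a time.
SCHEDULE = {
    "VISA_ILF": datetime(2015, 6, 1),
    "BNET_ILF": datetime(2015, 6, 3),
}

def tokenization_active(file_type: str, now: datetime) -> bool:
    """A file type is tokenized only once its configured start time has
    passed; unscheduled file types are left untouched."""
    start = SCHEDULE.get(file_type)
    return start is not None and now >= start
```

Gating by date means one implementation can cover many files while each begins tokenizing on its own schedule, which is what kept the rollout cautious.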


The Phased Approach:

Phase 3 PTLF, remaining IMN ILFs, internal files
– Still broken up by dates in the SDF file to determine when the tokenization of these files would begin
– Verification took place at each of the times set in the SDF to ensure tokenization was working and detokenization could be done

Phase 4 Encryption/decryption of the extract files
– All files (except the CAF) were now tokenized
– Online processing was functioning as expected
– During the extract process, the extracted files had to be stored somewhere during creation, prior to being transferred to the host for batch processing

The Phased Approach:

Phase 5 – Implementation planned for May 2016
– CAF tokenization still to be completed
– Once complete, TD ABM and POS will be fully tokenized in production

Considerations:

Security
– Determine who should be able to see the PAN in the clear, set up utilities to allow for this, and ensure other IDs can only see the tokenized PAN

PAN Recognition
– Determine how PANs need to be identified in your environment: by placement in the file or by the PAN format

Timing
– Determine when you would like the tokenization to begin; this can be set in the SDF, so one implementation can include multiple files with tokenization beginning at different times

Tokenization Scheme
– Determine how the PANs should be tokenized (e.g., alpha, numeric, partially tokenized: the 664 rule)

Considerations (continued):

Delimiters
– Because TD chose to tokenize the entire PAN with alpha characters between A and H, we had to look at our delimiters. In some cases, a "D" was being used as a delimiter. When the PAN was being de-tokenized, SecurData was unable to determine where the PAN ended and included the delimiter as a character to be de-tokenized.
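The delimiter clash is easy to reproduce: with tokens drawn from A-H, any token containing a "D" collides with a "D" field delimiter, so a naive split can no longer find the field boundaries. The record layout below is hypothetical, purely for illustration:

```python
# Hypothetical record: a 16-char token field, a "D" delimiter, an amount.
token = "ABCDEFGHABCDEFGH"        # token containing the letter "D"
record = token + "D" + "00123"

fields = record.split("D")
# The naive split cuts the token apart: 4 pieces instead of 2 fields.
assert fields == ["ABC", "EFGHABC", "EFGH", "00123"]
```

This is why the token alphabet and the delimiter set have to be reviewed together before tokenizing whole fields.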

File Conversion?
– Most files (ILF, PTLF) don't need conversion programs run against them to convert them to tokenized files all at once. A date and time when tokenization starts can be added to the SDF, and when a new file is created, tokenization can begin.
– For other files, it needs to be determined whether conversion programs should be run and the processes recycled.
– Is the PAN a key? If so, some planning of how to tokenize this file is needed.
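When the PAN is the record key, the planning point above comes down to determinism: the token must be stable (same PAN, same token) or keyed reads break after conversion. A sketch with a hypothetical `det_token` (a truncated hash here purely for illustration; real tokens would be reversible):

```python
import hashlib

def det_token(pan: str) -> str:
    """Deterministic stand-in for a tokenizer: same PAN -> same token.
    (A real engine would produce reversible tokens, not a hash.)"""
    return hashlib.sha256(pan.encode()).hexdigest()[:16]

# Hypothetical keyed "file": records keyed by PAN before conversion.
records = {"4026157151401408": {"status": "active"}}

# A one-time conversion re-keys every record by its token ...
converted = {det_token(pan): rec for pan, rec in records.items()}

# ... and keyed reads now tokenize the PAN first, then look up the token.
assert converted[det_token("4026157151401408")] == {"status": "active"}
```

Every reader of the keyed file has to switch to token lookups at the same cutover, which is why PAN-keyed files need more planning than log-style files.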


Summary

Securing data-at-rest is a solvable problem (PCI)
– No code changes

Not a "Big Bang" solution
– 5 phases, some with multiple implementations
– One file tokenized... then another... validations throughout

Back-out plans are a fundamental part of implementation

No issues with any of the implementations

No performance degradation; in some cases, performance improvement

Phase 5: Tokenization of the CAF still to do for POS, but already done for ABM

Questions?
