Solution Blueprint - Customer 360


Posted on 16-Apr-2017






  • We operate as John Hancock in the United States, and Manulife in other parts of the world.

    Customer 360 Solution Blueprint


    January 30th, 2015

  • 1

    Table of Contents

    # Topic Page

    1 Revision History 02

    2 Document Distribution List 03

    3 References 04

    4 CCF Solution Architecture 05

    5 Key Assumptions 06

    6 Technical Risk and Mitigation 07

    7 Technical/Non-Functional Requirements

    8 Data Flow 15

    9 Conceptual Architecture 24

    10 Extract, Transform and Load (ETL) 28

    11 Data Quality 59

    12 Master Data Management (MDM) 72

    13 Data Architecture 109

    14 Data Governance 116

    15 Summary & Next Steps 129

    16 Appendix 131

  • 2

    Customer Centricity Foundation (CCF) As-Is Solution Architecture

  • 3

    Key Assumptions

    No. Assumption

    1. The data elements identified for Customer 360 are the critical data elements that will be measured for data quality and data

    2. Data owners and data stewards have been identified for the critical data elements of the customer domain within the Canadian Division.

    3. The out-of-the-box Data Quality Information Analyzer reporting tool will be used for measuring customer critical data.

    4. The Data Governance workflows will be provided as swim-lane diagrams in Visio and will not be automated using AWD by the Data Governance work stream.

    5. The IBM standard data model for Party, Accounts and Product will be leveraged, with the possibility of extension by up to 10-15 custom entities. The estimated entity count for this phase is around 40-45.

    6. Account/policy data will be hosted in MDM. For this phase, account/policy data will not be mastered; it is maintained with a relationship to Customer.

    7. MDM will house product data to support customer-to-product and product-to-line-of-business relationships, but product is not considered master data.

    8. Virtual and physical MDM will be leveraged for distinct match, merge, and persistence capabilities in a hybrid model.

    9. The out-of-the-box MDM artifacts will be the basis for any customization or configuration (matching algorithms, UI components, and services).

    10. There will be multiple points of human interaction within the MDM tool suite based on a given task (linking, data quality, reference data management, golden record management, or product data configuration).

    11. Customer 360 will leverage existing extract files from admin systems, except for MPW, where a new extract process will be created.

  • 4

    Technical Risk and Mitigation

    No. | Risk Description | Risk Category | Mitigation Strategy

    1. Risk: Quality of data is not high enough to effectively match and merge client data before exposing it to the client via the Customer Portal and CCT.
       Category: High
       Mitigation: Execute data profiling, remediation and monitoring.

    2. Risk: Access to unmasked data is needed for effectively profiling data quality, identifying the fields to match on, and setting the respective match probabilities and thresholds.
       Mitigation: Identify the individuals who need access to unmasked data and restrict access for everyone else.

    3. Risk: Aggregating all customer data from across the Canadian Division in one place increases the risk of confidential customer data being lost, stolen or exposed in the event of a security breach.
       Mitigation: Implement strong security measures and protocols to protect the data, e.g. SFTP and HTTPS.

    4. Risk: Several technology components will be implemented for the first time at Manulife, raising a risk of integration challenges that can lead to schedule and cost impacts.
       Mitigation: Conduct a Proof of Technology for the Customer 360 architecture.
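The matching-related risk and mitigations above hinge on tunable match probabilities and thresholds. A minimal illustrative sketch of threshold-based matching follows; the field names, weights, and thresholds are assumptions for illustration only, not the MDM engine's actual configuration:

```python
# Illustrative sketch of threshold-based client matching (NOT the actual
# MDM algorithm): field names, weights, and thresholds are assumptions.

def field_score(a, b):
    """1.0 on exact (case-insensitive) match, 0.0 otherwise.
    Real engines use fuzzy comparators such as edit distance."""
    if a is None or b is None:
        return 0.0
    return 1.0 if a.strip().lower() == b.strip().lower() else 0.0

# Assumed per-field weights; in practice these come from data profiling.
WEIGHTS = {"last_name": 0.4, "birth_date": 0.35, "postal_code": 0.25}

def match_decision(rec_a, rec_b, auto=0.85, review=0.6):
    """Return 'match', 'clerical-review', or 'non-match' for a record pair."""
    score = sum(w * field_score(rec_a.get(f), rec_b.get(f))
                for f, w in WEIGHTS.items())
    if score >= auto:
        return "match"
    if score >= review:
        return "clerical-review"
    return "non-match"

a = {"last_name": "Smith", "birth_date": "1970-01-01", "postal_code": "M5H 3S8"}
b = {"last_name": "smith", "birth_date": "1970-01-01", "postal_code": "M4B 1B3"}
print(match_decision(a, b))   # score 0.75 -> clerical-review
```

Pairs that fall between the two thresholds land in the clerical-review queue, which is one of the human-interaction points noted in the assumptions.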


  • 5

    Technical/Non-Functional Requirements

    No. | Criteria | Requirement | Criteria Count

    1. Availability: criteria for determining the availability of the system for service when required by the end users. Count: 6

    2. Maintainability: criteria pertaining to the ease of maintenance of the system with respect to needed replacement of technology and rectifying defects. Count: 1

    3. Operability: criteria related to the day-to-day ease of operation of the system. Count: 2

    4. Performance: criteria related to the speed and response of the system. Count: 4

    5. Recoverability: criteria to determine how soon the system would recover to its original state after a failure. Count: 8

    6. Scalability: criteria to determine the ability of the system to increase throughput under increased load when additional resources are added. Count: 2

    7. Security: criteria to determine the security measures to protect the system from internal and external threats. Count: 4

  • 6

    Technical/Non-Functional Requirements

    Req No. | Work Stream | Requirement | Owner | Status

    1. MDM (Owner: Jamie, Status: Complete)
       Q. What time will the data be available?
       A. The data should be cleansed, matched and synchronized with Salesforce by 7 a.m.

    2. ETL (Owner: Steven, Status: Complete)
       Q. When do the source files arrive?
       A. All source files are expected to arrive by 5 a.m.

    3. Web Service (Owner: Jamie, Status: Complete)
       Q. How often will ePresentment pull data from the ePresentment stage?
       A. Large runs monthly and annually; otherwise, small runs occur nightly for notices, letters, confirms, etc.

    4. ETL (Owner: Jamie, Status: Complete)
       Q. Does the ePresentment stage expect full data every day?
       A. ePresentment would expect full data every day. Deltas would suffice, but any document delivery preference changes during the day should be reflected in the ePreferences staging area.

    5. ETL (Owner: Steven, Status: Open)
       Q. When something fails within ETL, how soon should someone be notified of the failure?
       A. Recommendation: ETL error notification should be sent at least once a day.

    6. IVR (Owner: Jamie, Status: Open)
       Q. Is it OK for the IVR vocal password to be down for the day?
       A. 12 hours, to align with overnight batch schedules.

  • 7

    Technical/Non-Functional Requirements

    Req No. | Work Stream | Requirement | Owner | Status

    1. ETL
       Q. How are deltas being identified/captured within the current processing?
       A. For Dataphile, the source and target files are compared to capture the deltas. For the other source files the process is still to be determined.




    Req No. | Work Stream | Requirement | Owner | Status

    1. ETL (Owner: Jamie, Status: Open)
       Q. What are the existing file validation processes?

    2. ETL (Owner: Jamie, Status: Complete)
       Q. What happens when the source files for a given system do not arrive on time?
       A. Given that Customer 360 should process files as they come in, there should be no holding of any files.

  • 8

    Technical/Non-Functional Requirements


    Req No. | Work Stream | Requirement | Owner | Status

    1. ETL (Owner: Jamie, Status: Open)
       Q. Are all files processed as they arrive, or is there a queue in the process?
       A. The files should be processed as they arrive, with a preference for a real-time processing option. However, if there are cost or delivery-date issues, files will be processed when all files are available, or at an arbitrary time.
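The process-as-they-arrive preference above can be sketched as a simple polling loop that hands each file off as soon as it lands, instead of waiting for the full set. The directory layout, file pattern, and handler are assumptions; a production design would more likely use the enterprise scheduler or a file-arrival trigger:

```python
# Sketch of per-file processing on arrival (an assumed polling design, not
# the delivered architecture): each new *.dat extract is handled once.

from pathlib import Path
import time

def poll_once(inbox, processed, handle):
    """One polling pass: hand off any new files and remember them."""
    for path in sorted(Path(inbox).glob("*.dat")):
        if path.name not in processed:
            handle(path)               # e.g. kick off the ETL job for this file
            processed.add(path.name)

def watch(inbox, handle, passes=3, interval=0.0):
    """Poll the inbox a fixed number of times; return the files handled."""
    processed = set()
    for _ in range(passes):
        poll_once(inbox, processed, handle)
        time.sleep(interval)
    return processed
```

The `processed` set ensures a file is handed off exactly once even though the directory is rescanned on every pass; the batched alternative mentioned in the answer would simply defer `poll_once` until all expected files are present.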

    2. ETL (Owner: Steven, Status: Open)
       Q. What volume are we expecting from each of the source systems (initial and incremental)?

    3. Tech Arch (Owner: Jamie, Status: Complete)
       Q. What is the daily expected volume from each of the sources / within the ePresentment stage?
       A. Total of new customers plus preference changes; expect fewer than 5,000 per day, ongoing (rough estimate).

    4. Tech Arch (Owner: Jamie, Status: Complete)
       Q. What are the archiving requirements for the ePresentment stage?
       A. Archiving is not necessary for the ePresentment stage. It is not the source of record; the data is primarily transient, staged for performance reasons.

  • 9

    Technical/Non-Functional Requirements

    Req No. | Work Stream | Requirement | Owner | Status

    1. Tech Arch (Owner: Jamie, Status: Complete)
       Q. What is the fail-over time?
       A. Fail-over to the alternate site should be immediate, utilizing a cross-data-center clustered WAS architecture with our global load balancer.

    2. Tech Arch (Owner: Jamie, Status: Complete)
       Q. What is the acceptable data loss?
       A. For data where Customer 360 is not the source of record, 24 hours of data loss is probably acceptable, as the data can be re-run from the admin systems. For data where Customer 360 is the source of record (preferences, for example), the acceptable data loss is very small, but would still probably be a few hours' worth, given that Salesforce would capture the data and presumably we can resend the messages.

    3. Tech Arch (Owner: Jamie, Status: Complete)
       Q. What happens when the virtual MDM is lost? How soon can it be recovered?
       A. The virtual repository would have to be back up within a 24-hour period.

    4. Tech Arch (Owner: Jamie, Status: Complete)
       Q. How often will the system be backed up?
       A. The system should be backed up nightly; tape backup would be the option.

    5. Tech Arch (Owner: Jamie, Status: Complete)
       Q. Who will be responsible for the database backup?
       A. The DBAs would be responsible for the database backup.

    6. Tech Arch (Owner: Jamie, Status: Open)
       Q. What data must be saved in case of a disaster?
       A. Recommendation:

    7. Tech Arch
       Q. How quickly after a major disaster must the system be up and running?
       A. The system should be back up and running within 24 hours after the disaster.