

    Informatica Developers Handbook


Informatica Architecture: CommonLE Integration Design

    Table of Contents

1 Background
2 Detailed ETL Procedures
3 Informatica Standards
4 Build and Unit Test Activities
Appendix A: Step-by-Step Application of Code Template to Core Processes
Appendix B: Accessing CommonLE Logs
Appendix C: Implementing Record-Level Exception Logging into Core Processes
Appendix D: Implementing Record-Level Audit Logging into Core Processes


    1 Background

    1.1 Purpose

This document has been created to provide a detailed understanding of the ETL patterns and the usage of Informatica as it relates to Project OneUP. It should be leveraged during the technical design and build phases of the development effort. This document is NOT static: as architecture patterns evolve and new best practices are introduced and implemented, the pages that follow will be updated to reflect these changes.

    1.2 Intended Audience

This documentation is geared towards Integration Solution Architects, Technical Designers, and Informatica conversion and interface developers. Integration Solution Architects will gain a deeper knowledge of the technology being used to extract and load data from one system to the next. With this knowledge, the ISAs will be prepared to ask better questions of the business process teams, improving both the quality of data transfer and the quality of the SID documentation. Technical Designers will use this documentation to understand when to utilize the various extract and load strategies, what types of data conversion database objects need to be created, and how conversions and interfaces differ both as business processes and as units of code. Developers will use it as a guideline for standards, conventions, and best practices, and as a first resource for answering questions relevant to development.


    2 Detailed ETL Procedures

    2.1 Informatica ETL Interface Strategies

Within each of the patterns, a typical code design approach is outlined. In addition to this brief outline, the section on Workflow Development will also delineate the constructs of the process flow and workflow details within Informatica.

2.1.1 Interface Patterns

Interfaces that are developed using Informatica as the middleware technology will typically be point-to-point batch routines that are scheduled for source and target. The AI interface pattern document outlines each pattern identified for Project OneUP.

    2.1.1.1 Detailed Logical Architecture

[Figure: Detailed logical architecture of the standard batch interface. The source application layer (application and data format) feeds the integration layer, which performs source extraction, transformation to the target format, and the target load before handing off to the target application layer (data format and application). EAI common services (XRef, batching/de-batching, sequencing, logging/audit, and exception handling) and ETL common components (transformation, error logging, and a batch data store) support the flow, along with begin/end audit logging and error handling. The legend distinguishes MW components from common components and normal from optional processing; numbered callouts 1 through 8 (with 2a/2b and 8a/8b variants) correspond to the steps listed below.]

1. Audit log is triggered to denote middleware will be receiving data.
2. Source data is extracted via the specific source extract strategy defined for the interface.
   a. Source data is pulled directly from the source.
   b. Data is staged within the middleware database to support multiple requirements for the source data.
3. Data is transformed via the ETL tool into the target-specific format(s).
4. Cross-reference lookups are performed during the source-to-target mapping.
5. Data is marked for insert/update/delete to the target application.
6. Data is loaded to the target application based upon the format specified.
7. Audit log is triggered to denote middleware has processed the data.
8. Error handling will be triggered based upon the status of preceding steps.
   a. All-or-nothing error handler
   b. Record-by-record error handler

This interface pattern does not require use of the middleware database. The middleware database (labeled Batch Data Store) in Step 2 is utilized to accomplish any one of the following requirements of the business process:

- Multiple passes through each received data set (for example, if source data is sent only once and multiple mappings require this information, it is best to store the data within a database so that one process receives the data and multiple processes load it)
- Audit trail for logging purposes
- SOX compliance requirements
- Error handling

    2.2 Informatica Error Logging and Exception Handling

2.2.1 Informatica Standard Task-Level Error Logging

When logging audit and exception data to the CommonLE, either task-level or row-level error logging can be utilized. Task-level logging is required by all interfaces to track the failure or success of all interface sessions within a workflow. The standard implementation is outlined in the Appendix for Audit Log and Error Messaging (CommonLE).

2.2.2 Informatica Row-Level Error Logging

Row-level error logging is specified by business requirements and is implemented either through one of the exception patterns described in the Informatica Error Handling Design document or by using Informatica's row-level error logging functionality (verbose logging).

As an alternative to the exception patterns, verbose logging within Informatica can be utilized. Keep in mind that verbose logging within an Informatica session can greatly reduce the performance of the session run. When configuring sessions, a developer has multiple options for error handling, error logging, and traceability levels. When an error occurs at the transformation level (per row/record), the PowerCenter Server logs error information that allows a support team to determine the cause and source of the error. Row error logging may be captured in a database or in flat file structures. For Project OneUP, a decision has been made to use the database format option for row error logging purposes. The relational database structure will allow the Application Integration team to standardize the format and content of the error logs and manage this portion of the application within one central location.

In addition to capturing error data based upon the row being processed within transformations, the PowerCenter Server may also be able to capture the source data associated with the row in a transformation. However, Informatica will be unable to create a link between the row-level error in a transformation and the source record within the source qualifier if the error occurs after an active source. An active source within Informatica is defined as an active transformation used to generate rows. The following transformations are classified as active:

- Aggregator
- Application Source Qualifier
- Custom, configured as an active transformation (it has been assumed that SAP custom transformations fall into this category as well)
- Joiner
- MQ Source Qualifier
- Normalizer (VSAM or pipeline)
- Rank
- Sorter
- Source Qualifier
- XML Source Qualifier
- Mapplet, if it contains any of the above active transformations

By default, the PowerCenter Server will log all transformation errors within the session log file and all rejected target records into the reject or bad file. When row error logging has been enabled, all such information is instead filtered to the error log database/flat file structures. If the architecture landscape determines that all errors should reside in both the error logging structures and the standard session log and reject/bad file, then the configuration should include enabling Verbose Data Tracing. All of this additional logging may negatively impact the performance of sessions and workflows being executed on the PowerCenter server, as data is processed row by row instead of in blocks of records.


    3 Informatica Standards

    3.1 Workflow Development

For each business object, it is possible that multiple workflows exist to perform the full spectrum of interface activities from legacy to SAP. A workflow is defined as a set of sessions and other tasks (commands calling shell scripts, decision and control points, e-mail notifications, etc.) organized in concurrent and/or parallel processing streams. Each workflow will execute a mapping or series of mappings that extract source data and load it into target systems. Working with the AI team, each Solution Integration Design will need to be modularized into workflows that perform the required pre-defined business functions. As a result, the interface programs built for a particular business object within the Solution Integration Design documentation could span multiple workflows and thus multiple technical design documents (as each technical design is at the workflow level).

    3.2 Code Naming Standards

The following tables reflect the naming standards that have been outlined in PepsiCo's ETL-Informatica-Design-Best-Practices document.

3.2.1 Code Comments

Within the Informatica code base, mappings, sessions, and workflows have a high-level description or comment field that is displayed when editing any of these units of code. Within the mapping section, be sure to add text that defines the author, date of comment, description of the mapping/session/workflow, and a version control section. Below is a sample of the mapping description that should be inserted into each mapping built for QTG1.

Author: Developer Name
Date: 01/01/2005
Description: This mapping performs the core functionality for the XYZ interface.
================
Revision History:
================
1.0 01/01/2005 - Initial development

In addition to this comment, each of the transformations within a mapping should also have a brief explanation defining its functionality within the mapping.

    3.2.2 Transformation Naming Standards

Each transformation type is listed below with its naming convention and a description or example.

Source Definition: [table_name] or [flat_file_name]. The source definition should carry the same name as the Flat File or Relational Table that it was imported from. If the source was created from a shortcut, that should be indicated in the name.

Target Definition: [table_name] or [flat_file_name]_ACTION. The target definition should carry the same name as the Relational Table it was imported from. If the target was created from a shortcut, that should be indicated in the name. Flat File targets should have _FF at the end of the name. The ACTION will correspond to the DML being performed on the target: INS, UPD, DEL.

Source Qualifier: sq_[source_name] or sqo_[source_name]. Use sq_ followed by the name of the source, or sqo_ if a SQL override is used.

Expression: exp_[RelevantDescriptor]. Example: exp_RelevantDescriptionOfTheProcessBeingDone.

Update Strategy: upd_[target_name]_ACTION. An update strategy should have a suffix appended to it corresponding with the particular action (INS, UPD, DEL).

Router: rtr_[RelevantDescriptor]. Example: rtr_RelevantDescriptionOfTheProcessBeingDone.

Filter: fltr_[RelevantDescriptor]. Example: fltr_RelevantDescriptionOfTheProcessBeingDone.

Aggregator: agg_[RelevantDescriptor]. Example: agg_RelevantDescriptionOfTheProcessBeingDone.

Lookup: lkp_[source_name], lkp_[RelevantDescriptor], or lkpo_[RelevantDescriptor]. If one table: lkp_LookupTableName. If multiple tables are joined to bring back a result: lkp_RelevantDescriptionOfTheProcessBeingDone. If a SQL override is used: lkpo_...

Sequence Generator: seq_[RelevantDescriptor]. Typically the description is based upon the target table and the primary key column that the sequence will be populating.

Stored Procedure: sp_StoredProcedureName. Used when executing stored procedures from the database.

External Procedure: ext_ProcedureName. Used for external procedures.

Advanced External Procedure: aep_ProcedureName. Used for advanced external procedures.

Joiner: jnr_SourceTable/FileName1_SourceTable/FileName2. Used to join disparate source types, for example Oracle to Flat File.

Normalizer: nrm_[RelevantDescriptor]. Used to create multiple records from the one record being processed. For example: nrm_Create_Error_Messages.

Rank: rnk_[RelevantDescriptor]. Example: rnk_RelevantDescriptionOfTheProcessBeingDone.

Mapplet: mplt_[RelevantDescriptor]. Example: mplt_RelevantDescriptionOfTheProcessBeingDone.

Sorter: srt_[RelevantDescriptor]. Example: srt_RelevantDescriptionOfTheProcessBeingDone.

Transaction Control: tc_[RelevantDescriptor]. Example: tc_RelevantDescriptionOfControl.

Union: un_[RelevantDescriptor]. Example: un_RelevantDescriptionOfUnion.

XML Parser: xmp_[RelevantDescriptor]. Example: xmp_RelevantDescriptionOfXMLParser.

XML Generator: xmg_[RelevantDescriptor]. Example: xmg_RelevantDescriptionOfGenerator.

Custom Transformation: ct_[RelevantDescriptor]. Example: ct_RelevantDescriptionOfCustomTransformation.

IDoc Interpreter: int_[RelevantDescriptor]. Example: int_idoc_RelevantDescriptionOfCustomTransformation.

* Wherever possible, transformations should specify the $PMRootDir//Temp and $PMRootDir//Cache directories. Such transformations include but are not limited to:

Sorter: $PMRootDir//Temp
Joiner: $PMRootDir//Cache
Aggregator: $PMRootDir//Cache
Lookup: $PMRootDir//Cache
Rank: $PMRootDir//Cache


    3.2.3 Informatica Code Object Naming Standards

Each code object is listed below with its naming convention and a description or example.

Mapping: prefix m_ followed by the RICEF type, the target, and (optionally) the source. The mapping is the main unit of code for Informatica. It is important to include the RICEF type; typically it will be CONV for conversions. The target is required, and the source is typically used when trying to differentiate among multiple mappings that affect the same target. Version numbers will not be used for this implementation.

Session: s_m_ followed by the mapping name without the version number attached. The session is the wrapper for the mapping containing all connection information necessary to extract and load data.

Workflow: prefix wf_ followed by a description of the interface (e.g., wf_INTFC_ISCP_INVENT_INFO_BW_I2). The workflow is a job stream that strings all necessary tasks together to create a data flow from source to target systems.

Worklet: wklt_description. Worklets are objects that represent a set of workflow tasks and allow you to reuse a set of workflow logic in several workflows.

Reusable Session: rs_description. A session that may be shared among several workflows and may execute while another instance of the same session is running.

Control Task: cntrl_description. You can use the Control task to stop, abort, or fail the top-level workflow or the parent workflow based on an input link condition.

Event Task: evnt_description. An Event-Raise task represents a user-defined event. When the Informatica Server executes the Event-Raise task, it triggers the event. Use the Event-Raise task with the Event-Wait task to define events. The Event-Wait task waits for an event to occur; once the event triggers, the Informatica Server continues executing the rest of the workflow.

Decision Task: dcsn_description. The Decision task allows you to enter a condition that determines the execution of the workflow, similar to a link condition.

Command Task: cmd_description. The Command task allows you to specify one or more shell commands to run during the workflow. For example, you can specify shell commands in the Command task to delete reject files, copy a file, or archive target files.

Email Task: eml_description. The Workflow Manager provides an Email task that allows you to send email during a workflow. You can create reusable Email tasks in the Task Developer for any type of email, or non-reusable Email tasks in the Workflow and Worklet Designer.

Assignment Task: asmt_description. The Assignment task allows you to assign a value to a user-defined workflow variable.

Timer Task: tm_description. The Timer task allows you to specify the period of time to wait before the Informatica Server executes the next task in the workflow. You can choose to start the next task in the workflow at an exact time and date, or wait a period of time after the start time of another task, workflow, or worklet before starting the next task.

    3.2.4 Port Variable Naming Standards

Variable: v_RelevantName. Used in expression transformations.

Output: o_RelevantName or out_RelevantName (only set this for new output ports created in an expression transformation). Used in expression transformations to define the outgoing port for use in subsequent transformations.

Input: i_RelevantName or in_RelevantName (only set this for input ports into a lookup). Used in lookup and expression transformations to denote ports that are used within the transformation and do not carry forward.

Lookup: lk_RelevantName (only set this in transformations for ports that originated in a lookup transformation). Used in expression transformations for unconnected lookups.

Return: r_RelevantName. Return values are found in lookup transformations and are typically the column from the source object being referenced in the lookup code.

    3.3 Connection Configuration Standards

Each session within the workflow is associated to a mapping. The mapping consists of source, target, and transformation objects. Within each of the source and target objects are connection parameters, which are configured at the session level in Workflow Manager. The connection strings are documented in the QTG2 Informatica Connections List.xls spreadsheet. This document can be found under the following StarTeam directory: 1UP - Informatica\QTG2\Supplement.

    3.4 General Best Practices

3.4.1 Log File Names

Validate that all file names for logs match the unit of code. When a workflow name is changed, for example from wf_INT_LOAD to wf_INTFC_LOAD, the log file will remain wf_INT_LOAD.log until the developer changes the log file name. This is true of sessions as well. Validate that all workflow and session log names match the name of the corresponding unit of code.

3.4.2 Session Development Standards

All session parameters need to be set at the session level in the Task Developer and not overridden in the workflow (in Workflow Manager).

3.4.3 Lookup Transformations

Lookups should be created to return a default value of -1 in case of a lookup failure.


    3.5 Informatica Middleware Environment Standards

3.5.1 Informatica Directory Structures

QTG1
  INF Dev - phgp0233: /etlapps/dev/71/qtg1/SrcFiles/ and /etlapps/dev/71/qtg1/TgtFiles/
  INF QA - phgp0232: /etlapps/fit/71/qtg1/SrcFiles/ and /etlapps/fit/71/qtg1/TgtFiles/

QTG2
  INF Dev - phgp0233: /etlapps/dev/81/qtg2/SrcFiles/ and /etlapps/dev/81/qtg2/TgtFiles/
  INF QA - phgp0232: /etlapps/fit/81/qtg2/SrcFiles/ and /etlapps/fit/81/qtg2/TgtFiles/

3.5.2 FMS Directory Structure on Informatica Server

INF Dev - phgp0233: /etlapps/dev/81/p1up_shared/fms/
INF QA - phgp0232: /etlapps/dev/81/p1up_shared/fms/

3.5.3 FMS Control File Names

(By default, Informatica does not use control files to send files via FMS.)

All FMS control files should use the following naming standard:

FMS____.xml

* For mainframe systems, substitute the _ for .

3.5.4 Informatica Flat File Naming Standard

All files brought into or sent from the middleware layer should adhere to the standard below. (Note: this assumes that FMS will be able to rename files from source and to target.)

___.yyyyMMddHHmmss.RDY (the timestamp is an optional field, to be used when multiple files may appear before being processed)

Example: the ItemSiteMaster file for the ISCP process area, business object Transportation Lanes, targeting I2RP would be as follows:

ISCP_I2RP_TRNLANES_ITEMSITEMASTER.yyyyMMddHHmmss.RDY
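For illustration only, a sending system could build a file name that follows this standard from a Unix shell. The component order is inferred from the example above, and the staging file name and target directory used here are hypothetical:

#!/bin/sh
# Sketch: stamp an extract with the yyyyMMddHHmmss timestamp and the .RDY suffix.
BASE_NAME=ISCP_I2RP_TRNLANES_ITEMSITEMASTER   # process area, target, business object, file
TIMESTAMP=`date +%Y%m%d%H%M%S`
SRC_DIR=/etlapps/dev/81/qtg2/SrcFiles          # source-file directory from section 3.5.1

# Copy the extract under a temporary name, then rename it to the .RDY name so the
# middleware only picks up files that are completely written.
cp extract_itemsitemaster.dat "$SRC_DIR/$BASE_NAME.$TIMESTAMP.tmp"
mv "$SRC_DIR/$BASE_NAME.$TIMESTAMP.tmp" "$SRC_DIR/$BASE_NAME.$TIMESTAMP.RDY"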

    3.5.5 Informatica Middleware Staging Table Naming Standards

All source and target staging tables will consist of a common set of columns, in addition to the data columns required for each specific interface:

- Transaction ID: unique sequence number for each record per interface run.
- Timestamp: date/time stamp when the record was inserted into the staging table.
- Status: flag to indicate whether the record has been processed, completed, failed, etc.
- Transaction Name: name of the interface.

The STATUS field can consist of the following values. Depending on the interface, not all STATUS codes will be used.

- N (New): the record has been successfully inserted into the staging DB.
- P (Processing): the middleware application is processing the record.
- C (Complete): the middleware application has successfully processed the record.
- F (Failed): the middleware application has failed to process the record. (Assumption: depending on interface business rules, failed records will remain in the staging table until successfully processed.)

Table design:

Name              Type      Null
TRANSACTION_ID    VARCHAR2  No
CREATE_DTM        DATE      No
STATUS            VARCHAR2  No
TRANSACTION_NAME  VARCHAR2  No

The table naming standard for a source system loading data into middleware staging is:

_SRC___

Example: the ItemSiteMaster table for the ISCP process area, business object Transportation Lanes, sourced from BW would be as follows:

ISCP_SRC_BW_TRNLANES_ITEMSITEMASTER

The same applies when the middleware application needs to load data into middleware staging before sending to the target system:

_TGT___

Example: the ItemSiteMaster table for the ISCP process area, business object Transportation Lanes, targeting I2RP would be as follows:

ISCP_TGT_I2RP_TRNLANES_ITEMSITEMASTER
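As a hedged illustration of how the STATUS flag might be checked from the middleware server, the snippet below counts failed records in one of the example staging tables; the sqlplus client, the user/password placeholder, and the use of SAPEAI as the connect string are assumptions rather than documented standards:

#!/bin/sh
# Sketch: count staged records that the middleware failed to process (STATUS = 'F').
STAGING_TABLE=ISCP_SRC_BW_TRNLANES_ITEMSITEMASTER   # example source staging table from above

sqlplus -s user/password@SAPEAI <<EOF
set heading off
set feedback off
select count(*) from $STAGING_TABLE where STATUS = 'F';
exit;
EOF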

3.6 Control-M Execution of Workflows

Most, if not all, of the interfaces built within Informatica will be executed using PepsiCo's global scheduling tool, Control-M. In most cases, Control-M will not only trigger Informatica workflows but also SAP and legacy-specific jobs. Each Control-M job will be linked to other jobs within the group pertaining to a particular interface. These dependencies are driven by the return codes of each of the individual jobs within the job group. To manage the execution of the workflows and the return codes to Control-M, each interface built within Informatica will be executed via a Unix shell script. Below is the basic structure of the shell script:


#!/bin/sh
###############################################################################
## Variables used for commencement of the Project OneUP IDoc Listener Workflow
###############################################################################

###########################################
## Creating Variables for Execution      ##
## USERNAME, PASSWORD, and INFORMAT_PORT ##
###########################################
. //schedapps/p1up/env_p1up_batch.sh
. //schedapps/p1up/env_p1up_batch_qtg2.sh   ## for QTG2 and PCNA1 interfaces

###############################################################################
##
## Used to start Project OneUP Informatica Workflow
##
###############################################################################
//schedapps/p1up/start_workflow.sh US_CORP_1UP_QTG1_INTFC wf_INTFC_QTG1_SHARED_IDOC_LISTENER -wait

The first two sourced lines of the script provide the proper initialization of the environment variables for the start_workflow.sh script. The user name, password, and Informatica port number are set within the env_p1up_batch.sh script.

The final line provides the core functionality of these scripts. There are two versions of this line, start_workflow.sh and stop_workflow.sh. In nearly all situations, start_workflow.sh is used with a wait condition. The only Informatica component that uses stop_workflow.sh is the IDoc Listener, which is started without a wait condition.

There are three parameters supplied to the start_workflow.sh and stop_workflow.sh scripts: the folder name (US_CORP_1UP_QTG1_INTFC above), the workflow name (wf_INTFC_QTG1_SHARED_IDOC_LISTENER above), and the wait condition (-wait). The wait condition should be used by most interfaces, as this allows the workflow to complete prior to sending a return code to Control-M. This is important because the return code is responsible for communicating success or failure to Control-M, and Control-M uses this return code to dictate execution of subsequent jobs in the group.

There will be a script implemented for each interface. The script name should conform to the following standard:

p1up_qtg2_

The parameter values for each script will be interface specific.

To manually start an Informatica workflow without Control-M, run the start_workflow.sh for that particular interface from the /schedapps/p1up directory.


    4 Build and Unit Test Activities

During the development cycle, each developer should focus build and unit test activities on the sessions that perform the extract and load procedures for the interface. All unit test scripts should be completed for these main components. Upon successful completion of these unit test activities, a developer should work with the development lead to incorporate the CommonLE components into an existing workflow. After walking through the following procedures with the development leads, any developer working on multiple interfaces will have the basic understanding of the constructs and organization of the standard interface wrapper to develop and test the wrapper for subsequent interfaces.

    4.1 PowerCenter Designer Tasks

Each developer will need to create shortcuts to the following three SHARED mappings from the SHARED_US_CORP_1UP folder:

m_P1UP_SHARED_AUDIT_LOG_BEGIN
m_P1UP_SHARED_AUDIT_LOG_END
m_P1UP_SHARED_ERROR_MESSAGING

DO NOT DIRECTLY COPY THESE MAPPINGS INTO YOUR DEVELOPMENT FOLDER. Shortcuts are required so that each developer is referencing the latest version of the code. If the mapping changes within the Shared folder, those changes will be propagated into the developer's folder as well. Changes may impact the developer's session and its ability to execute, but this type of error should not be difficult to resolve with either a validation of the session or a slight configuration change.

Screenshot 7.1.1.a: This demonstrates the creation of a SHORTCUT into a developer folder. Notice the shortcut icon on each mapping that was added.

    4.2 PowerCenter Workflow Manager Tasks

After the mapping shortcuts have been created in the developer's folder, the associated sessions can be copied as well. The following four sessions will be copied:

    s_m_P1UP_SHARED_AUDIT_LOG_BEGIN_SAMPLE

    s_m_P1UP_SHARED_AUDIT_LOG_END_SUCCESS_SAMPLE

    s_m_P1UP_SHARED_AUDIT_LOG_END_FAILURE_SAMPLE

    s_m_P1UP_SHARED_ERROR_MESSAGING_SAMPLE

    To copy these sessions, follow these instructions:


1.) Connect to and open the desired developer folder.
2.) Connect to but do NOT open SHARED_US_CORP_1UP.
3.) Highlight all four sessions related to audit and error logging within this folder. Use the Edit menu to select Copy.

Screenshot 7.1.2.a

4.) Navigate to the developer's folder that is currently open and Paste using the Edit menu.

Screenshot 7.1.2.b


5.) Step #4 will cause a new window called Copy Wizard to appear. The Copy Wizard is designed to help eliminate any conflicts Workflow Manager detects when copying sessions or workflows from one folder to the next. This wizard should determine that there is a conflict with regards to the session/mapping associations. For each mapping/session combination, you will need to go through and select the mapping shortcut you previously created. Screenshot 7.1.2.d demonstrates the resolution of the conflict.

Screenshot 7.1.2.c: Copy Wizard

Screenshot 7.1.2.d: Resolution


6.) Click Next>> and Finish to complete this wizard.

7.) You should now have created copies of those sessions. Rename each of the sessions you copied to align with the interface you are building. The following is the naming convention to follow for each reusable session:

s_m_INTFC_[interface acronym]_AUDIT_LOG_BEGIN
s_m_INTFC_[interface acronym]_AUDIT_LOG_END_SUCCESS
s_m_INTFC_[interface acronym]_AUDIT_LOG_END_FAILURE
s_m_INTFC_[interface acronym]_ERROR_MESSAGING

8.) Lastly, each of these sessions will require parameter file entries within the following text files on the Unix servers:

//etlapps/[phase]/71/qtg1/Scripts/US_CORP_1UP_QTG1_INTFC_begin_audit_parms.txt
//etlapps/[phase]/71/qtg1/Scripts/US_CORP_1UP_QTG1_INTFC_end_audit_parms.txt
//etlapps/[phase]/71/qtg1/Scripts/US_CORP_1UP_QTG1_INTFC_error_parms.txt

9.) Refer to Section 4.3 for sample entries into the parameter files.

    4.3 Mapping Parameters for Sessions

This table represents all of the parameters used for the CommonLE audit and error logging mappings and sessions. The table specifies which units of code utilize the various parameters on the list. It is the developer's responsibility to determine the values for their work units and to communicate that information to the development leads and the Informatica architect so that all documentation and code can be kept up to date.

Each parameter below is listed with its default value, the sessions that use it (Error Messaging, Audit Begin, Audit End), and a description.

$$INTERFACE_NAME (default DEFAULT_INTERFACE_NAME) - Error Messaging, Audit Begin, Audit End. This value will correspond with the value used to insert into the INFA_INTERFACE_LOG table.

$$APPLICATION_ID (default DEFAULT_APPLICATION_ID) - Error Messaging, Audit Begin, Audit End. This parameter identifies the Application from a CommonLE perspective.

$$SERVICE_NAME (default 0) - Error Messaging, Audit Begin, Audit End. This parameter will correspond with the numeric value of the Caliber ID for the interface object.

$$TRANSACTION_DOMAIN (default DEFAULT_BUSINESS_OBJECT) - Error Messaging, Audit Begin, Audit End. This parameter will correspond with the name of the interface object and is directly related to the SERVICE_NAME numeric value.

$$APPLICATION_DOMAIN (default DEFAULT_TARGET_SYSTEM) - Error Messaging, Audit Begin, Audit End. This parameter will correspond to the acronym for the target system or application.

$$SEVERITY_CODE (default 0) - Error Messaging. The severity code will be managed for the interface. Any error will be assigned the severity code for the entire interface.

$$WORKFLOW_NAME (default DEFAULT_WORKFLOW_NAME) - Error Messaging, Audit Begin, Audit End. This matches the workflow name for all sessions in the interface.

$$NEXT_SESSION (default DEFAULT_NEXT_SESSION) - Audit Begin. This parameter will be the name of the subsequent session in the workflow.

$$AUDIT_STATUS (default DEFAULT_AUDIT_STATUS) - Audit End. This parameter will be different for sessions that end the workflow successfully versus a session that ends the workflow with a failure. Usually two sessions, one for success and one for failure, exist after a decision task in the workflow which analyzes the status of the workflow based upon its tasks.

$$PREVIOUS_SESSION (default DEFAULT_PREVIOUS_SESSION) - Audit End. This parameter will be the name of the previously executed session in the workflow.

    Below are samples from each of the parameter files.

    Screenshot 7.1.3.a US_CORP_1UP_QTG1_INTFC_begin_audit_parms.txt


    Screenshot 7.1.3.b US_CORP_1UP_QTG1_INTFC_end_audit_parms.txt

    Screenshot 7.1.3.c US_CORP_1UP_QTG1_INTFC_error_parms.txt
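The screenshots themselves are not reproduced in this transcript. As a representative sketch, a begin-audit entry takes the form below; the values mirror the sample shown in Appendix A, step 7, and are placeholders to be replaced per interface:

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_AUDIT_LOG_BEGIN_SAMPLE]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=12345
$$TRANSACTION_DOMAIN=BUSINESS_OBJECT_NAME
$$APPLICATION_DOMAIN=TARGET_APPLICATION
$$NEXT_SESSION=s_m_INTFC_NEXT_SESSION
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE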


    4.4 Build Completion and Next Steps

4.4.1 String / Assembly Testing

For string and assembly testing, all code will need to be moved into the project-specific string/assembly test folder (QTG1_INTFC). There are currently shortcuts for the shared mappings in these folders; therefore, the development lead will only be responsible for migrating the sessions and workflows into the project folder. The development lead will need to re-point each session to use the mapping shortcuts already created within the project folder. In addition, the parameter files must be changed to reflect the new folder that all code resides in. These modifications should complete the migration into the project folders.

4.4.2 Migration to QA, UAT, and PROD

The parameter files and any scripts related to the interface workflows should be migrated from PHGP0233 to PHGP0232 and PHGP0234 accordingly. Unless environment-specific details are referenced in scripts, no additional modifications should be necessary.
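As a hedged sketch of that migration (the DEV and QA script paths follow sections 3.5.1 and 4.2; the account name and the use of scp rather than the team's actual transfer mechanism are assumptions):

#!/bin/sh
# Sketch: copy interface parameter files from the DEV server (phgp0233) to the QA
# server (phgp0232). Repeat with phgp0234 and the corresponding paths for UAT/PROD.
DEV_SCRIPTS=/etlapps/dev/71/qtg1/Scripts
QA_SCRIPTS=/etlapps/fit/71/qtg1/Scripts
QA_HOST=phgp0232
INFA_USER=infadmin    # hypothetical account on the target server

scp "$DEV_SCRIPTS"/US_CORP_1UP_QTG1_INTFC_*_parms.txt "$INFA_USER@$QA_HOST:$QA_SCRIPTS/"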


    Appendix A: Step-by-Step Application of Code Template to Core Processes

The following appendix provides developers with a common architecture and code template for building interfaces that publish messages for posting into the CommonLE. This documentation also provides the development leads with a checklist to walk through each interface and determine whether the code has been modified according to the necessary steps.

    1) Create a copy of the following mappings from the SHARED_US_CORP_1UP folder into your current folder:

    i) m_P1UP_SHARED_AUDIT_LOG_BEGIN

    ii) m_P1UP_SHARED_AUDIT_LOG_END

    iii) m_P1UP_SHARED_SUMMARY_ERROR_MESSAGING

    iv) m_P1UP_SHARED_INTFC_ERR_LOG_MESSAGING

    v) m_P1UP_SHARED_INTFC_AUDIT_LOG_MESSAGING

2) Create a session using the mapping m_P1UP_SHARED_AUDIT_LOG_BEGIN. To save time, create a copy of the session s_m_P1UP_SHARED_AUDIT_LOG_BEGIN_SAMPLE from folder SHARED_US_CORP_1UP.

    3) Rename the session to comply with the following standards for interfaces.

    i) s_m_INTFC_[interface_abbreviation]_AUDIT_LOG_BEGIN

4) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.

5) Click on the Properties tab of your session. Use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_begin_audit_parms.txt.

6) Click on the Mapping tab. For the target entitled Shortcut_to_INFA_INTERFACE_LOG, change the reject file name to your_session_name.bad.

7) Log into the Unix command line for the Informatica server. Modify the parameter file for begin audit logs located in the //etlapps/dev/71/qtg1/Scripts directory. The file name will be US_CORP_1UP_AI_INTFC_begin_audit_parms.txt. To add the applicable data, copy and paste the following 8 lines into the parameter file and replace the parameter values with the values that pertain to your session.

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_AUDIT_LOG_BEGIN_SAMPLE]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=12345   (Note: this is actually the Caliber ID)
$$TRANSACTION_DOMAIN=BUSINESS_OBJECT_NAME
$$APPLICATION_DOMAIN=TARGET_APPLICATION
$$NEXT_SESSION=s_m_INTFC_NEXT_SESSION
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE
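If preferred, the same entries can be appended from the shell rather than pasted in an editor. A minimal sketch, assuming the dev path named in step 7 and the placeholder values above (the renamed session in the header is hypothetical):

#!/bin/sh
# Append begin-audit entries for a renamed session to the shared parameter file.
PARM_FILE=//etlapps/dev/71/qtg1/Scripts/US_CORP_1UP_AI_INTFC_begin_audit_parms.txt

# The quoted EOF keeps the $$ parameter names literal instead of expanding to the shell PID.
cat >> "$PARM_FILE" <<'EOF'
[US_CORP_1UP_QTG1_INTFC.s_m_INTFC_XYZ_AUDIT_LOG_BEGIN]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=12345
$$TRANSACTION_DOMAIN=BUSINESS_OBJECT_NAME
$$APPLICATION_DOMAIN=TARGET_APPLICATION
$$NEXT_SESSION=s_m_INTFC_NEXT_SESSION
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE
EOF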

Please refer to Section 4.3 for mapping parameters and parameter files.

8) Create a session using the mapping m_P1UP_SHARED_AUDIT_LOG_END_FAILURE. To save time, you can copy session s_m_P1UP_SHARED_AUDIT_LOG_END_FAILURE_SAMPLE from folder SHARED_US_CORP_1UP.

9) Rename the session to comply with the following standards for interfaces.

i) s_m_INTFC_[interface_abbreviation]_AUDIT_LOG_END_FAILURE

10) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.

11) Click on the Properties tab of your session. Use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_end_audit_parms.txt.

12) Create a session using the mapping m_P1UP_SHARED_AUDIT_LOG_END_SUCCESS. To save time, you can copy session s_m_P1UP_SHARED_AUDIT_LOG_END_SUCCESS_SAMPLE from folder SHARED_US_CORP_1UP.

13) Rename the session to comply with the following standards for interfaces.


i) s_m_INTFC_[interface_abbreviation]_AUDIT_LOG_END_SUCCESS

14) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.

15) Click on the Properties tab of your session. Use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_end_audit_parms.txt.

16) Log into the Unix command line for the Informatica server. Modify the parameter file for end audit logs located in the //etlapps/dev/71/qtg1/Scripts directory. The file name will be US_CORP_1UP_AI_INTFC_end_audit_parms.txt. To add the applicable data, copy and paste the following 18 lines into the parameter file and replace the parameter values with the values that pertain to your session.

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_AUDIT_LOG_END_SUCCESS_SAMPLE]
$$INTERFACE_NAME=TECH_ARCH_TEAM
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=99999   (Note: this is actually the Caliber ID)
$$TRANSACTION_DOMAIN=TECH_ARCH_DOMAIN
$$APPLICATION_DOMAIN=TGT_TECH_ARCH
$$PREVIOUS_SESSION=s_m_P1UP_TECH_ARCH_SAMPLE
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE
$$AUDIT_STATUS=PROCESSED

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_AUDIT_LOG_END_FAILURE_SAMPLE]
$$INTERFACE_NAME=TECH_ARCH_TEAM
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=99999   (Note: this is actually the Caliber ID)
$$TRANSACTION_DOMAIN=TECH_ARCH_DOMAIN
$$APPLICATION_DOMAIN=TGT_TECH_ARCH
$$PREVIOUS_SESSION=s_m_P1UP_TECH_ARCH_SAMPLE
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE
$$AUDIT_STATUS=FAILED

Please refer to Section 4.3 for mapping parameters and parameter files.

17) Create a session using the mapping m_P1UP_SHARED_SUMMARY_ERROR_MESSAGING. To save time, create a copy of the session s_m_P1UP_SHARED_SUMMARY_ERROR_MESSAGING_SAMPLE from folder SHARED_US_CORP_1UP.

18) Rename the session to comply with the following standards for interfaces.

i) s_m_INTFC_[interface_abbreviation]_SUMMARY_ERROR_MESSAGING

19) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.

20) Click on the Properties tab of your session. Use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_error_parms.txt.

21) Log into the Unix command line for the Informatica server. Modify the parameter file for exception logs located in the //etlapps/dev/71/qtg1/Scripts directory. The file name will be US_CORP_1UP_AI_INTFC_error_parms.txt. To add the applicable data, copy and paste the following 8 lines into the parameter file and replace the parameter values with the values that pertain to your session.

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_ERROR_MESSAGING_SAMPLE]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=12345   (Note: this is actually the Caliber ID)
$$TRANSACTION_DOMAIN=BUSINESS_OBJECT_NAME
$$APPLICATION_DOMAIN=TARGET_APPLICATION
$$SEVERITY_CODE=3   (Note: this will be dependent upon the SID definition for the interface)
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE

Please refer to Section 4.3 for mapping parameters and parameter files.


22) Create a session using the mapping m_P1UP_SHARED_INTFC_ERR_LOG_MESSAGING. To save time, create a copy of the session s_m_P1UP_SHARED_INTFC_ERR_LOG_MESSAGING_SAMPLE from folder SHARED_US_CORP_1UP.

23) Rename the session to comply with the following standards for interfaces.

i) s_m_INTFC_[interface_abbreviation]_INTFC_ERR_LOG_MESSAGING

24) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.

25) Click on the Mapping tab. For the target entitled INFA_INTERFACE_ERR_LOG1, change the reject file name to your_session_name.bad.

26) Click on the Properties tab of your session. Use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_error_parms.txt. The same error parameter file will be leveraged throughout the record-level exception handling components. Copy the lines used for the summary exception messaging session and reference this new session. Keep these entries close together in case a change is required.

27) Create a session using the mapping m_P1UP_SHARED_INTFC_AUDIT_LOG_MESSAGING. To save time, create a copy of the session s_m_P1UP_SHARED_INTFC_AUDIT_LOG_MESSAGING_SAMPLE from folder SHARED_US_CORP_1UP.

28) Rename the session to comply with the following standards for interfaces.

i) s_m_INTFC_[interface_abbreviation]_INTFC_AUDIT_LOG_MESSAGING

29) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.

30) Click on the Mapping tab. For the target entitled INFA_INTERFACE_AUDIT_LOG, change the reject file name to your_session_name.bad.

31) Click on the Properties tab of your session. Use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_begin_audit_parms.txt. The same audit begin parameters will be leveraged throughout the record-level audit logging components for this session. Copy the lines used for the begin audit messaging session and reference this new session. Keep these entries close together in case a change is required.

32) Within the core processing sessions, add the following entries to the workflow parameter file located at $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_workflow_parms.txt:

[US_CORP_1UP_AI_INTFC.s_m_P1UP_SHARED_INTFC_AUDIT_LOG_MESSAGING_SAMPLE]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
Shortcut_to_mplt_Process_Audit_Logs.$$AUDIT_LOGGING_SWITCH=ON


    Appendix B: Accessing CommonLE Logs

    The following steps outline the process to login to the CommonLE front-end application to view Informatica log entries.

1) The logs and errors are viewed via a web browser. Use the following link for the development environment:

    http://wlsite4.corp.pep.pvt:7229/1UPPepsiCSD/gologin.do

    2) Enter the User Name and Password and click Submit

    3) The Welcome screen appears. Click Logs & Errors


    4) Click View Logs and choose Application. You may use the other fields to narrow the search. Click the Submit button.


5) To view the details of a specific log, click the Application link.


    6) The details of that specific log will be displayed at the bottom of the page.


    Appendix C: Implementing Record-Level Exception Logging into Core Processes

Within the SID documentation associated with a given interface, an exception or error pattern will be selected by the Integration Solution Architect. These patterns identify how the business is required to track the data through an interface. Each pattern tracks exceptions at differing levels, and completion alerts will also vary across the patterns. For those patterns that require record-level exception logging (Transmit with Workflow Success & Report Exceptions, Transmit with Workflow Failure & Report Exceptions, and Restrict with Workflow Failure & Report Exceptions), each developer will need to implement this mapplet into the core process of the interface workflow. For the design of the mapplet, please refer to the Informatica Error Handling Design documentation located in the same StarTeam directory.

    Identifying Possible Error Locations

One of the outputs from the SID process is the identification of the error pattern for the interface and all potential exceptions within the business logic of the code. Throughout the core mappings for an interface, there will be several instances where errors can be captured and logged. Most frequently these locations will be directly after lookup procedure transformations or just prior to a target instance. Errors prior to target instances will concern target-specific load requirements that must be proactively enforced. For example, if field 4 is NOT NULLABLE in the target application, an expression or router must avoid sending any record with no value in field 4 to the target and instead send this information as an alert to the CommonLE.

Add Evaluation Transformations

Within a mapping, routers will be the most frequent tool for evaluating record sets and choosing success or exception paths. Routers will contain the necessary groups to send appropriate data to the successful target instances and other groups to direct data to the mapplet for logging to the exception table in middleware.

Implementing the Mapplet

When an exception is encountered within the code, the mapplet will be utilized to insert that data into the exception table (INFA_INTERFACE_ERR_LOG) in a standard format.

Each mapplet input port is listed below with its description and the input value supplied from the calling mapping.

in_INTERFACE_NAME: The name of the interface currently being executed. This parameter should be consistent across all of the parameter files for a given interface. Input value: $$INTERFACE_NAME from the workflow parameter file.

in_MAPPING_NAME: This parameter should be local to the mapping itself and hold the full name of the mapping being executed. Input value: $$MAPPING_NAME from the workflow parameter file.

in_TRANSFORMATION_NAME: This value will be defined as a constant within a transformation in the mapping and will correspond to the name of the transformation where the exception occurred. Input value: constant defined within the mapplet-calling mapping.

in_TRANSFORMATION_TYPE: The transformation type for the location of the exception. Input value: constant defined within the mapplet-calling mapping.

in_TRANS_INPUT_DATA: An expression should be used to concatenate the input values for a given failed transformation. This is most useful/vital for lookup procedures. Input value: concatenated value defined within the mapplet-calling mapping.

in_TRANS_OUTPUT_DATA: All output values from a failed transformation should be concatenated and mapped to this port of the mapplet. Input value: concatenated value defined within the mapplet-calling mapping.

in_ERR_CODE: Provides a standard exception code for a given error in an interface. The error message is derived from this value. Input value: constant value matching the exact type and case from the table below.

in_ERR_BUSINESS_ID: Identifies each source record as a unique occurrence. It is very possible that more than one field is required for a unique business ID. Each SID document should articulate in detail the exact business ID for a given interface. Input value: concatenated value defined within the mapplet-calling mapping.

in_ERR_TIMESTAMP: The time of occurrence for an exception. Input value: SYSDATE defined within the mapplet-calling mapping.

Error Codes

This section contains a table of all of the acceptable error messages to be logged into the INFA_INTERFACE_ERR_LOG table. Emphasis must be placed upon using the proper messages when logging to this table.

LOOKUP_PROCEDURE_ERROR: Whenever a cross-reference lookup procedure returns a default value due to a mismatch of incoming values, this error should be logged into the exception log table within the middleware SAPEAI database schema. Example usage: when lkp_PAYMENT_TERMS returns -1, log this error along with the incoming data values for the transformation.

DATA_VALIDATION_ERROR: SID documentation may outline business data validation procedures. These validations must be checked within mappings and errors logged, processes suspended, etc. Example usage: when exp_CHECK_DEBIT_CREDIT_MATCH detects a difference between AMT_DEBIT and AMT_CREDIT, route this information to the exception mapplet.

COMPUTATION_ERROR: This error message value should be used when computation errors are detected within expressions, aggregators, etc. Example usage: when in_denominator = 0, route the record to the exception mapplet with a divide-by-zero error using this message value.

DATA_CONVERSION_ERROR: This value will be used when conversions or substitutions are used within expressions and no possible matches are found. Example usage: when in_oldValue is not in (1, 2, 3, 4, 5), mark this as an error.

RECORD_COUNT_ERROR: This error message is reserved for source/target record count analysis. Example usage: when the number of source records does not equal the number of target records, log this value.

TARGET_LOAD_ERROR: This error message is reserved for target load errors. Example usage: when target load conditions are not met, this error should be sent to the CommonLE to identify the record as not being loaded into the target. Where applicable, this error can be used in conjunction with another error code.


    Appendix D: Implementing Record-Level Audit Logging into Core Processes

Within the Solution Integration Design for a given interface, a Business ID or unique identifier for a record in the source system is documented. This Business ID subsequently becomes the unique identifier for each record transmitted through the interface code. This unique identifier will be logged to the Audit Logging portion of the CommonLE as a requirement of all QTG2 interfaces using Informatica. The components that perform audit logging may be turned off by production support personnel. This switching functionality is controlled at the interface/workflow level; therefore, audit logging for a high-volume interface can be turned off by production support when it is no longer needed.

    Creating the Business ID

Within the Solution Integration Design documentation, there is a section for the creation of a Business ID for the interface. This identifier will be either one field or the concatenation of multiple fields that combine to create the natural key for the incoming data record. This Business ID should be created within the first one or two transformations downstream of the source qualifier transformation. If a SQL override is used within the mapping, it may be advisable to create the Business ID within the SQL statement. For example, if the source table is DM_ACCOUNT and the Business ID is ACCT_ID, your SQL statement could read: select ACCT_ID as INTFC_BUSINESS_ID, ACCT_ID as ACCT_ID, STRT_DT as START_DATE from DM_ACCOUNT. Because there are multiple ways of processing data through a mapping, the developer may choose to have only one ACCT_ID value returned by the SQL statement and then connect it to multiple transformation paths. When using SQL overrides to meet other requirements, the architecture team recommends this strategy for implementation.

    Inserting the Audit Logging Mapplet & Target

The majority of the functionality for inserting into the interface audit log table within the middleware database resides within a reusable mapplet transformation in the shared Informatica folder. This mapplet, mplt_Process_Audit_Logs, contains a router transformation that controls the usage of audit logging within an interface. The router's main grouping evaluates the value of the AUDIT_LOGGING_SWITCH parameter within the core workflow parameter file on the Unix server. Each developer will need to provide the following inputs to the mapplet transformation:

INTERFACE_NAME
MAPPING_NAME
AUDIT_BUSINESS_ID
AUDIT_TIMESTAMP

The outputs of this transformation will link directly to the target table, INFA_INTERFACE_AUDIT_LOG. Using the AutoLink feature of Informatica, the output from the mapplet transformation will automatically link or port to the target table's columns. During session creation, assign SAPEAI as the connection for this target table.

Setting Up Mapping Parameters and Parameter File

For interface core processes, there is one parameter file used across the various interfaces for a given release. The parameter file, US_CORP_1UP_AI_INTFC_workflow_parms.txt, will contain the specific parameters used by core process sessions. The most frequently used parameter, $$INTERFACE_NAME, should be present in this parameter file as it is present in all other parameter files. As a developer, please make certain that the INTERFACE_NAME value matches across all of these separate files; the common architecture components require this synchronization.

Within each mapping, there are two main parameters that need to be defined: $$INTERFACE_NAME, set via the parameter file, and $$MAPPING_NAME, which can be defaulted within the mapping itself. There is no need to maintain this value within the parameter file. In addition, the mapplet contains a parameter that must be fed from the parameter file for core processes. This parameter, $$AUDIT_LOGGING_SWITCH, provides the functionality of controlling audit logging to the Common Services reporting components. When not configured to the value of ON, the interface will not log Business IDs to the INFA_INTERFACE_AUDIT_LOG table, and subsequently no messages will be delivered to the AUDIT queue for Common Services.

For assembly testing purposes, audit logging should always be enabled. The general rule for system test cycles should be that the AUDIT_LOGGING_SWITCH is set to ON unless performance becomes a major issue for successful completion of the testing phases. Performance of the common components should be thoroughly investigated during these test intervals. Application Integration architects will assist with any performance issues that emerge from these common mapplets, mappings, and sessions. Data volumes within the INFA_INTERFACE_AUDIT_LOG table may become the single largest contributor to performance issues for this reusable component.