TRANSCRIPT
Internal and Confidential
Methodology for LiveFeed Data Integration
(Beyond BI/SAP) using Informatica
LIVE FEED SCENARIO
In the live feed scenario, site/country data keeps changing on an ongoing basis even after go-live and needs to be incorporated into the target SAP or other application system.
Agenda
What's New Now?
New Live Feed Integration Platform Design (LFIP)
Screen Shots of the LFIP Platform
Nomenclature
What's New
• Benefit Highlights
  - We will present an SOA-based architectural model that meets your immediate needs and offers great flexibility for future extensions at no additional redesign cost
  - An architecture with a proven track record, built over years of continuous input, that serves as a strong reference
  - Ease of incorporating new services or removing services in a plug & play approach
  - High level of audit trail, extensive error logging, and end-to-end data lineage
  - Full data reconciliation
  - Real-time data feeds, and thus real-time reporting capabilities
  - Elimination of the trial-and-error approach by inexperienced resources:
    · This approach can be costly in terms of timelines and resources
    · It often turns into continuous, ongoing patchwork on the existing solution
    · It often becomes complicated and unmanageable after a series of patches
  - Delivery of a proven platform yields greater value and cost savings on the business front too
  - The platform gives auditors and data owners a high level of transparency: they can see their data and its audit trail, for regulatory compliance as well as data integrity
  - Savings of multiple millions over the next 1-2 years by shortening the learning curve, avoiding fixes to existing work, and eliminating time wasted by dependent users waiting for correct reports:
    · Shorter delivery of the right platform
    · Capability to manage and roll out more sites at the same time within fixed deadlines
    · Shorter phases thanks to a better platform and quick turnaround on checks and reports
NEW METHODOLOGY
A proven methodology and mature technological architecture for live feed integration and/or data migration projects run by large clients across multiple countries/sites.
Interfaces flow between Sites/Countries and Core – Focus on the outgoing and incoming interfaces

[Diagram: per-country legacy systems feed a Local Integration Layer (LIL), which exchanges transactional and master data with the Core Data Integration Layer (CDIL) built on Informatica. Core-scope applications: SAP ECC, SAP BW, Ariba, FileNet. Local components: regulatory reporting solution, local DWH, local reconciliation, tax, and a cheque printing tool. Numbered arrows mark the individual interface flows.]
Integration Scenario : An executed client case
[Diagram of the executed client case: banks connect over SWIFT NET via a bank communication tool (Visual Rappro); the Core Data Integration Layer Area (CDIL) links SAP ECC, SAP BW, Ariba, FileNet, and Group Magnitude with country legacy systems behind their Local Integration Layers, plus local DWH/taxes and a cheque printing system. A caption distinguishes external applications from applications in scope of the core solution; numbered arrows mark the interface flows.]

Integration Scenario : Another perspective

[Diagram: the same landscape viewed from the CDIL DWH.]
Agenda
What's New Now?
New Live Feed Integration Platform Design (LFIP)
Screen Shots of the LFIP Platform
Nomenclature
HLD : Component Layout Across Systems

[Diagram: inbound and outbound files from SYS-A through SYS-F pass through a Secure File Transfer Tool into the transport layer (file system with IN/OUT/LOG areas). The execution layer runs one executor chain per interface (GL-p1, M2, GL-p2): Executor Shell - Pre, Executor INF (including INF-IDOC error handling), Executor Shell - FTP Log, and Executor Shell - Post, driving the Informatica workflows WF-GLp1, WF-M2, and WF-GLp2 (each with sessions). Supporting modules: Logging (process audit, data audit log, error logging, message unit), File Handling, File Monitor, File Relay, and Archive (uses FileNet, backed by a file-system archive area). Connected applications: SAP ECC, SAP BW, Ariba, FileNet, and a log DB. LIL-IT / LIL-XX denote the per-country local integration layers.]
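The executor shells in the layout above (Pre, INF, FTP Log, Post) wrap each interface run as a fixed chain of steps. The following is a minimal sketch of that chaining with stubbed step bodies; the step names and the `run_interface` helper are illustrative assumptions, and the real platform would invoke the Informatica workflow (e.g. via pmcmd) where the stub below does nothing:

```python
# Minimal sketch of the executor-shell chain that wraps one interface run:
# Pre (checks / parameter file) -> INF (Informatica workflow) -> FTP Log -> Post.
# Step names and structure are illustrative assumptions, not the platform code.

def run_interface(name, steps):
    """Run each step in order; stop at the first failure and log every outcome."""
    log = []
    for step_name, step in steps:
        try:
            step()
            log.append((step_name, "Success"))
        except Exception as exc:
            log.append((step_name, f"Failed: {exc}"))
            break  # a failed step aborts the remaining chain
    return log

# Stub steps standing in for the real shell scripts / workflow call.
steps = [
    ("Shell - Pre", lambda: None),       # e.g. file checks, parameter file
    ("Executor INF", lambda: None),      # e.g. start workflow WF-GLp1
    ("Shell - FTP Log", lambda: None),   # ship session logs to the log area
    ("Shell - Post", lambda: None),      # archive, cleanup, notifications
]
print(run_interface("GL-p1", steps))
```

A failed step aborts the rest of the chain, which is consistent with the Job Status / Process Status pairs shown in the process reports of the platform.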
HLD : Data Files Flow Across Systems : INBOUND

[Diagram: numbered inbound flow (steps 01-08). Country areas (../LIL_IT, ../LIL_UK, each with ./in and ./out) hand files to the core area (../core with …/{country_1}/out/in and …/{country_2}/out/in) via the File Handling, File Relay, and File Archive modules, with the File Monitor watching arrivals. Each interface folder (e.g. ../003_ap, ../004_gl) holds /in, /out, /template, and /run/{[datetime]_[runno]}/in|out|logs. The execution layer (Interface GL-p1: Shell - Pre, Exec INF, Shell - Archive, Shell - Clean; INF Prepare for Interface FT-UDM) triggers the Informatica workflow (sessions and tasks) on the INFA server (../SessionLogs, ../SrcFiles, ../TgtFiles), which loads the targets SAP ECC, SAP BW, FileNet, and Ariba (SYS-F1..F4, each with ../logs). Processed files land in the archive file system (steps 07-1..07-5, 08-1, 08-2).]
HLD : Data Files Flow Across Systems : OUTBOUND

[Diagram: the outbound flow mirrors the inbound layout (same country areas, core area, interface run folders, execution layer, INFA server, and SAP ECC / SAP BW / FileNet / Ariba endpoints, steps 01-08), with the direction reversed and an additional Shell - Send step (07-5) that dispatches the generated files to the receiving systems, plus the archive file system.]
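Both flows stage files through a fixed directory layout, including per-interface run folders such as ../004_gl/.../run/{[datetime]_[runno]}/in|out|logs. The sketch below composes those run paths under the assumption that the run folder is named from the run timestamp plus a zero-padded run number; the exact naming rule is inferred from the diagrams, not stated in them:

```python
from datetime import datetime
from pathlib import PurePosixPath

# Build the per-run working folders used by an interface, following the
# layout sketched in the flow diagrams:
#   <root>/<interface>/run/<datetime>_<runno>/{in,out,logs}
# The exact naming is an assumption inferred from the slides.

def run_folders(root, interface, run_no, when):
    run_id = f"{when:%Y%m%d%H%M%S}_{run_no:04d}"
    base = PurePosixPath(root) / interface / "run" / run_id
    return {sub: base / sub for sub in ("in", "out", "logs")}

folders = run_folders("/data", "004_gl", 1, datetime(2010, 2, 27, 19, 0, 30))
print(folders["logs"])
```

Keeping every run in its own timestamped folder is what makes the later archive and drill-down reports possible: each run's inputs, outputs, and logs stay together.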
HLD : Technical View
Interaction Model : Site & Central Process
[Diagram: interaction model between site and central processes. Site responsibility: site systems, mappings, flat files, data correction, and off-line quality checks. Core responsibility: the SAP production system, data logs, web services, mappings, and the reconciliation process loading SAP ECC/BW. Data is loaded from site to core, SQL checks run on both sides, and review comments on checks are returned by email. A web portal acts as a one-point control center for business users and data owners: checks, reviews and corrections, on-demand configuration checks and feedback, Informatica reports (file based / web services), and online access to process status and error logs. Supporting systems: file server, Ariba, FileNet; the platform captures data status and generates reports.]
Overview of various data marts and building blocks

[Diagram: layered reporting architecture fed by an insurance multi-provider, Informatica, feeders, and manual entries. Layers, bottom up: operational data storage layer, extraction layer, data mart layer, and enterprise reporting layer. Insurance data marts per reporting basis: Local GAAP, Group IFRS, Solvency II, Local IFRS, Local Conso Prep, and Local Conso – Local GAAP.]
Reporting Model : An executed client case
[Diagram: reporting model from the executed client case. Informatica loads the CDIL and the SAP BI DWH; SAP ECC 6 (FI-G/L general ledger, CO-OM-CCA cost center accounting) feeds the DWH. Report types: statutory reports, regulatory reports, financial management reports, operational reports, ad hoc reports, insurance reports, and reconciliation reports (including GL/CDIL reconciliation). Front ends: BO WebI, Crystal, and SAP BEx Analyzer / Pioneer.]
Reporting Model – Component View

[Diagram: component view of the reporting model. Inputs come from feeders and manual entries. The outbound layer serves FI – G/L (General Ledger), banks, the tax department, the local reconciliation tool, statutory/regulatory submission tools, and the local DWH/treasury. BO WebI on the CDIL provides standard and ad hoc reports.]
Agenda
What's New Now?
New Live Feed Integration Platform Design (LFIP)
Screen Shots of the LFIP Platform
Nomenclature
CDIL - Real-Time Reports
• In general, the following reports are used in this project:
• Process Reports
  - Full Report
    · Day based
    · Interface based
  - Interface Report
    · Selection screens
  - Data Object Reports
    · Inbound objects status
    · Outbound objects status
    · Object status based report (selection on status, timespan): same as above with an extra filter on status
    · Objects in Error
• Data Reports
  - Task Summary
  - Interface Summary
CDIL - Job Plan Process Report

One row per run, grouped by interface (M2 Interface, AP Interface, GL Interface):

Job Plan IDs | Run DateTime      | Process Status | Job Status | Info                   | Other links
+13134       | 20090112 13:20:50 | Done           | Success    | <link to Job log file> | <link to Data Report>
+13133       | 20090112 10:20:50 | Busy           |            |                        |
+13132       | 20090112 01:20:50 | Done           | Failed     | <link to Job log file> | <link to Data Report>

- Click on + to drill down to next-level reporting
- Click the job log link to access the log files (over LDAP authentication)
- Click the Data Report link to see data summary reporting quickly
- Can be run for a timespan (e.g. 24 hr) or for one particular interface
CDIL - Job Step Process Report

Job Plan (13132):

Job Step IDs | Process Name        | Process Status | Job Status | Info
+11          | Check Files         | Done           | Success    | <link to Job Step log file>
+12          | INF Load WF GL IDOC | Done           | Failed     | <link to Job Step log file>
+13          | Notifications       | Done           | Success    | <link to Job Step log file>

- Click on + to drill down to next-level reporting
CDIL - Job Task Process Report

Job Plan -> Job Step (12):

Job Task IDs | Task Name             | Task Status | Job Status | Info
+121         | Create Parameter File | Done        | Success    | <link to Job task log file>
+122         | Wf_Call               | Done        | Failed     | <link to Job task log file>

- Click the link to access the log files (over LDAP authentication)
- In case of multiple log files, the link points to the folder level
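The three process reports above drill down a single hierarchy: job plan -> job step -> job task. The toy model below reproduces that hierarchy with the sample IDs from the screens; the dict layout and the `failed_tasks` helper are illustrative assumptions, not the platform's log-DB schema:

```python
# Toy audit hierarchy behind the drill-down reports:
# Job Plan (13132) -> Job Steps (+11, +12, +13) -> Job Tasks (+121, +122).
# The structure and field names are illustrative assumptions.

audit = {
    13132: {                                  # job plan
        "status": "Failed",
        "steps": {
            11: {"name": "Check Files", "status": "Success", "tasks": {}},
            12: {"name": "INF Load WF GL IDOC", "status": "Failed",
                 "tasks": {121: ("Create Parameter File", "Success"),
                           122: ("Wf_Call", "Failed")}},
            13: {"name": "Notifications", "status": "Success", "tasks": {}},
        },
    }
}

def failed_tasks(plan_id):
    """Drill down: return (step_id, task_id, task_name) for every failed task."""
    result = []
    for step_id, step in audit[plan_id]["steps"].items():
        for task_id, (task_name, status) in step["tasks"].items():
            if status == "Failed":
                result.append((step_id, task_id, task_name))
    return result

print(failed_tasks(13132))
```

This is exactly the navigation a user performs by clicking + on each level: the failed plan points to its failed step, and the failed step to the failed Wf_Call task.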
CDIL - Data Report : Task Summary

Job Plan -> Job Step (12):

Job Task IDs | Task Name       | Object Type | Object Name | Record Status    | Count | Info
+121         | WF:Session1Name | File        | File1       | Read             | 50    | <link to file>
+122         | WF:Session2Name | File        | File2       | Read             | 50    | <link to file>
+121         | WF:Session1Name | SAP IDoc    | XXXT        | Failed-Pre Check | 20    | <link to error_x file>
+121         | WF:Session1Name | SAP IDoc    | XXXT        | IDOC Failed      | 10    | <link to error report>
+121         | WF:Session1Name | SAP IDoc    | XXXT        | Applied          | 20    | <link to report>

- "Applied" rows link to a report with the key fields of the successful records
- Error rows link to a report with the key fields and the error message, plus a link to the error file
CDIL - Data Object Report : Inbound

Selection: time span (e.g. 24 hr)

Object Name              | Service/Interface Name | Arrival DateTime  | Consuming Service Name | Object Used Status | File Process | Object Status | Info
IT_GL_XX_Balance_XXX.txt | LIL_Italy              | 20090112 01:20:50 | GL_Interface           | Used               | Done         | GOOD          |
IT_GL_XX_Balance_XXX.txt | LIL_UK                 | 20090112 01:20:50 | GL_Interface           |                    | Done         | BAD           | <link to Error report>
NA_dadaada_              | UNKNOWN                | 20090112 01:20:50 |                        |                    | Done         | UNKNOWN       | <link to Error report>

- Click on + to drill down to next-level reporting
CDIL - Data Object Report : Outbound

Selection: time span (e.g. 24 hr)

Object Name              | Service/Interface Name | DateTime          | Consuming Service Name | Object Used Status | File Process | Object Status | Info
IT_GL_XX_Balance_XXX.txt | GL_Interface           | 20090112 01:20:50 | LIL_Italy              | Sent OK            | Done         | GOOD          |
IT_GL_XX_Balance_XXX.txt | GL_Interface           | 20090112 01:20:50 | LIL_UK                 | IGNORED            | Done         | BAD           | <link to object Error report>
NA_dadaada_              | GL_Interface           | 20090112 01:20:50 | UNKNOWN                | IGNORED            | Done         | UNKNOWN       | <link to Object Error report>

- Click on + to drill down to next-level reporting
CDIL - Data Object Report : Object Error

Selection: object name (or all), timespan

Object Name              | Service/Interface Name | DateTime          | Object Status | Error Description
IT_GL_XX_Balance_XXX.txt | LIL_UK                 | 20090112 01:20:50 | BAD           | The header record count doesn't match the total count
NA_dadaada_              | UNKNOWN                | 20090112 01:20:50 | UNKNOWN       | This object is not registered in metadata
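The GOOD/BAD/UNKNOWN statuses in the object reports follow from two checks visible in the error descriptions: whether the object name is registered in the metadata, and whether the declared header record count matches the actual count. The sketch below is a hedged illustration of that classification; the registry contents and the name-normalisation rule are assumptions, not the platform's metadata model:

```python
# Classify an inbound object the way the report above does:
# UNKNOWN = not registered in metadata, BAD = failed a check, GOOD = otherwise.
# The registry contents and the name-normalisation rule are assumptions.

REGISTERED_OBJECTS = {"IT_GL_XX_Balance"}  # assumed metadata registry

def classify(object_name, header_count, actual_count):
    # Assumed rule: the first four underscore-separated parts identify the object.
    base = "_".join(object_name.split("_")[:4])
    if base not in REGISTERED_OBJECTS:
        return "UNKNOWN", "This object is not registered in metadata"
    if header_count != actual_count:
        return "BAD", "The header record count doesn't match the total count"
    return "GOOD", ""

print(classify("IT_GL_XX_Balance_XXX.txt", 50, 50))
print(classify("IT_GL_XX_Balance_XXX.txt", 50, 48))
print(classify("NA_dadaada_", 0, 0))
```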
CDIL - Data Report : Data Detail

Job Plan -> Job Step -> Job Task (12)
** only available for a certain timestamp window (+- x days)

Data Key                              | Status           | Error Detail             | Detailed Info
ID=1313131, AC=313131, SUAC=3131313   | POSTED           |                          | link to file
ID=1313131, AC=313131, SUAC=313131313 | IDOC POST Error  | Invalid xxxxxx           | <link to error_x file>
                                      | Data Check Error | ACCT: invalid GL account | <link to error_x file>
CDIL - Data Report : Specific Data

Data Search -> Data Detail
ENTER DOC KEY [555557777] [SEARCH]
** the search key has to match the interface type

Doc Key   | Data Key                             | Data Level | Status           | Error Detail              | Detailed Info
555557777 | ID=1313131, AC=313131                | 0          | Data Check Error | LIFNR=invalid;PCT=invalid | <link to error_x file>
555557777 | ID=1313131, AC=313131, SUAC=31313131 | 1          | OK               |                           |
555557777 | ID=1313131, AC=313131, SUAC=31313132 | 1          | Data Check Error | ACCT=invalid              | <link to error_x file>
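The screen above searches one document key and returns the header row (data level 0) together with its line items (data level 1). A toy lookup over the sample rows, where the record shape is an illustrative assumption:

```python
# Toy doc-key search behind the "Specific Data" report: one header row
# (data level 0) plus its line items (data level 1), keyed by Doc Key.
# The record shape is an illustrative assumption.

RECORDS = {
    "555557777": [
        {"level": 0, "key": "ID=1313131, AC=313131",
         "status": "Data Check Error", "error": "LIFNR=invalid;PCT=invalid"},
        {"level": 1, "key": "ID=1313131, AC=313131, SUAC=31313131",
         "status": "OK", "error": ""},
        {"level": 1, "key": "ID=1313131, AC=313131, SUAC=31313132",
         "status": "Data Check Error", "error": "ACCT=invalid"},
    ]
}

def search_doc(doc_key):
    """Return all rows for a doc key; the key must exist for this interface type."""
    if doc_key not in RECORDS:
        raise KeyError(f"doc key {doc_key} not found for this interface type")
    return RECORDS[doc_key]

errors = [r for r in search_doc("555557777") if r["status"] != "OK"]
print(len(errors))
```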
CDIL - Load Status : Country Daily Report

High-level report. Selection: time span (24 hr / 1 run day), country.

Name of the file                        | Integration date | Reference key | Target system (SAP) document number (if applicable) | Status (OK/KO) | Additional Information
2010022719003000_IT_004_FEED02_0001.txt | 20100227         |               |                                                     | KO             | File header record missing
2010022719003000_IT_005_FEED01_0001.txt | 20100227         |               |                                                     | KO             | Unbalanced sum of debit/credit
2010022719003000_IT_004_FEED05_0001.txt | 20100228         | 10000102      |                                                     | KO             | Master Data Error at line item
2010022719003000_IT_004_FEED05_0001.txt | 20100228         | 10000101      | 20000101                                            | OK             |
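The file names in this report follow a recognisable pattern, e.g. 2010022719003000_IT_004_FEED05_0001.txt: run timestamp, country, interface number, feed, sequence. The small parser below works under that assumed convention; the field meanings are inferred from the examples, not from a written spec:

```python
import re

# Parse the file naming convention seen in the reports:
#   {run_timestamp}_{country}_{interface_no}_{feed}_{sequence}.txt
# The field meanings are inferred from the examples, not from a written spec.

NAME_RE = re.compile(
    r"^(?P<run>\d{16})_(?P<country>[A-Z]{2})_(?P<interface>\d{3})_"
    r"(?P<feed>FEED\d{2})_(?P<seq>\d{4})\.txt$"
)

def parse_feed_name(name):
    m = NAME_RE.match(name)
    if not m:
        raise ValueError(f"not a feed file name: {name}")
    return m.groupdict()

print(parse_feed_name("2010022719003000_IT_004_FEED05_0001.txt"))
```

A strict pattern like this lets the file monitor reject unregistered objects (the UNKNOWN / NA_dadaada_ rows of the object reports) before any interface picks them up.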
CDIL - Error Report : Per File

Per-file basis, e.g. 2010022719003000_IT_004_FEED05_0001.txt. The report shows the raw records with the error found on each line:

FH|xx|xx|xx|xx|xx
DH|FX|Vendorposting|10000101|LG1|0172|20100226|20100202|20100215
DL|10|T|97000|||||||||||||||||DE1032111|||||||
DH|FX|Vendorposting|10000102|LG1|0172|20100226|20100202|20100215
DL|10|T|97000|||||||||||||||||DE1032111|||||||                     <- HKONT=Invalid
DL|20|T|97010|||||||||||||||||DE1032111|||||||
DL|20|T|97010|||||||||||||||||DE1032111|||||||
DL|10|T|97000|||||||||||||||||DE1032111|||||||                     <- HKONT=Invalid
DH|FX|Vendorposting|10000103|LG2|0172|20100226|20100202|20100215   <- LDGRP=Invalid
CDIL - Error Report : Per File

Per-file basis; file-level errors reported for each file:

Name of the file                        | Error
2010022719003000_IT_004_FEED05_0001.txt | Header record missing
2010022719003000_IT_004_FEED05_0001.txt | Record counts don't match
2010022719003000_IT_004_FEED05_0001.txt | Master Data Error at line item
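Two of the file-level errors above (header record missing, record counts don't match) are mechanical checks over the FH/DH/DL layout. The minimal validator below sketches them, assuming the declared record count sits in the second FH field; the real field positions are not given in the slides:

```python
# Validate one feed file against the file-level checks shown above:
# a leading FH header record must exist, and the declared record count
# (assumed to be the second FH field) must match the actual DH/DL count.

def check_file(lines):
    errors = []
    if not lines or not lines[0].startswith("FH|"):
        errors.append("File header record missing")
        return errors  # without a header, the count check is meaningless
    header = lines[0].split("|")
    declared = int(header[1])  # assumed position of the record count
    body = [l for l in lines[1:] if l.startswith(("DH|", "DL|"))]
    if declared != len(body):
        errors.append("Record counts don't match")
    return errors

good = ["FH|3|x|x", "DH|FX|Vendorposting|10000101", "DL|10|T|97000", "DL|20|T|97010"]
print(check_file(good))        # a clean file yields no errors
print(check_file(good[1:]))    # dropping the FH record is detected
```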
Agenda
What's New Now?
New Live Feed Integration Platform Design (LFIP)
Screen Shots of the LFIP Platform
Nomenclature
Nomenclature
• CDIL - Core Data Integration Layer
• HLD - High Level Design
• LIL - Local Integration Layer
• Interface / Service - A series of steps/tasks that make up a component handling one functional specification
• UDM - Universal Data Mover, a file-movement tool that guarantees 100% delivery
• FileNet - An application used for online file storage
• Ariba - A B2B procurement application
• SAP ECC / BW - ERP (Enterprise Resource Planning) applications produced by SAP AG (a German company)
• Site Team - The team of technical and business data owners who focus on extracting site data, running site-level checks and reconciliations, and delivering the data onward
• Central / Core Team - The technical and functional designers who ensure implementation of all site-level, project-phase-level, and SAP-level checks and reconciliations