Ramesh BODS_IS


Upload: ramesh-ch

Post on 14-Apr-2017



Page 1: Ramesh BODS_IS

Ramesh
Mobile: +91-9987713828
SAP BOBJ EIM - IS/DS Consultant
Mail ID: [email protected]

SUMMARY

Overall 5+ years of IT experience, which includes SAP EIM - Data Services (BODS)/Data Integrator (BODI), SAP BusinessObjects Web-I, BODS administration, SAP Information Steward, and SAP HANA. Involved in end-to-end DWH implementation projects.

BODI/DS Experience:

POC on SAP Information Steward using Metapedia, MDM, Data Insight, Cleansing Package Builder, and Match and Consolidation.

POC on SAP HANA: creating all types of information views, SLT, BODS replication, and analysis on the HANA database using all kinds of reporting tools.

Created many validation rules to validate sample data using the SAP Information Steward scripting language.

Experience on Business Objects MDM to integrate legacy systems with the BO system.

Implemented a data assessment solution by configuring data profiling and regularly collecting the profile tasks in the form of dashboards.

Experienced in configuring Information Steward for data profiling.

Experienced in building data quality monitoring dashboards and financial impact dashboards.

Implemented Data Governance and the failed data database.

Designed cleansing packages in IS and used them in SAP BODS Data Quality.

Complete knowledge in implementing Information Steward Administration.

Strong scripting knowledge of SAP BODS/IS.

Experience in working with BODI/DS with different data sources (flat files, Oracle) and SAP ECC and BI/BW.

Strong skills in writing SQL Queries on Oracle, MS-SQL Server.

An energetic, self-motivated designer with hands-on experience in creating batch jobs using the ETL tools BO Data Integrator and SAP Data Services.

Highly optimized the batch jobs developed in SAP BODS/DI, tuning their performance through various BODS techniques.

Experience with SAP BOBJ BODS administration tasks such as user creation and adding repositories to job servers.

Used different performance tuning techniques such as parallel processing, multi-threading, partitioning, and bulk loading to improve extraction and loading performance.

Implemented multi-user access management and version management by installing the secured Central Repository.


Developed Crystal Xcelsius components (grids, pie charts, bar charts) and the animation actions for those components.

Expertise in Business Objects administrator settings for various users, assigning rights and permissions for various features, objects, and users in the Central Management Console.

Knowledge of complete SDLC process.

Worked extensively with the Data Services Management Console for administration of users, configuring the real-time clients, Access Servers, and real-time jobs, and monitoring the statistics reports.

Technical Skills:

ETL Tools         : SAP BODI/DS, DataStage
EIM Tool          : SAP Information Steward
Reporting Tools   : SAP BO, Xcelsius, Web-I, Tableau
Legacy Systems    : SAP ERP, SAP BW, Oracle Apps
Operating Systems : MS Windows
Databases         : Oracle, MS SQL, MS Access 2000, SAP HANA
Languages         : C, C++, SQL/PLSQL
Packages          : BOBJ Rapid Marts

Professional Experience:

Working with NDS INFOTECH from Oct 2011 till date.

Project#1:

Project Name : SAP BOBJ Rapid Marts, DWH IMPLEMENTATION.

Client : TEMPURPEDIC (Product), U.S.

Team size : 5

Role : BODS, BO Developer and Administrator.

Environment : SAP Data Services 4.2, SAP IS 4.2, Oracle, Windows Server 2008, BO.

Duration : Oct 2012 till date

Roles and responsibilities:

Involved in gathering requirements, understanding the current LDWH, and preparing the design data flow for adopting the EDW.

Extracted data from multiple sources (flat files/tables) from the Oracle Apps ERP system.

Interacted with business analysts and business users to understand the business requirements, and gathered requirements for enterprise data warehouse schemas.

Performed source system analysis; analyzed the source and target data elements, and created the mapping document and the unit and system test plans.

Implemented Rapid Mart Recovery mechanism for Delta Management.

Error handling using the try/catch mechanism and event-based triggering, written in the BODS scripting language.


Generated multiple event files to process reports per job, reducing the burden on the reporting server.

Dynamic Path selection by creating Variables and Parameters.

Performed out-of-the-box implementation of the Rapid Marts packages (AP, AR, Inventory, Sales, Purchasing).

Scheduled delta jobs to process the delta (changed) data daily.

Designed and debugged the ETL process to extract data from source systems, transform it as per business requirements, and load it into dimensional data models to support BI reports, dashboards, and scorecards.

Prepared data flow and technical specification documents and script files for tables.
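The daily delta-load and try/catch error-handling pattern above can be outlined as follows; plain Python stands in for the BODS job here, and the row layout, column names, and timestamps are invented for illustration:

```python
# Hypothetical sketch of a daily delta (changed-data) load with
# try/catch-style error handling, mirroring the BODS job pattern.
from datetime import datetime

def extract_delta(rows, last_run):
    """Keep only rows changed since the previous run (the delta data)."""
    return [r for r in rows if r["updated_at"] > last_run]

def run_delta_job(rows, last_run):
    try:
        delta = extract_delta(rows, last_run)
        # ... load `delta` into the target table here ...
        return {"status": "ok", "loaded": len(delta)}
    except KeyError as exc:  # analogous to a BODS try/catch block
        return {"status": "failed", "error": str(exc)}

rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 2)},
    {"id": 2, "updated_at": datetime(2023, 12, 30)},
]
print(run_delta_job(rows, datetime(2024, 1, 1)))
```

In the actual job the last-run timestamp would come from a control table and the catch branch would raise an alert or write to an error log; the sketch only shows the control flow.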

Project#2:

Client : RTI INTERNATIONAL (METALS), U.S

Team size : 3

Role : BODI, BO Developer and BODS Administrator

Environment : Data Services 3.2, Oracle, Windows Server 2003, BO.

Duration : Mar 2012 to Oct 2012.

Roles and responsibilities:

Involved in gathering requirements, understanding the current LDWH, and preparing the design data flow for adopting the EDW by implementing the SAP BOBJ CC Rapid Mart.

Prepared the system and executed all the pre-install activities before installing the standard CC Rapid Mart, i.e. repository creation and the creation and management of the Job Server.

Enhanced the standard Rapid Mart workflows to enable ETL of custom tables in SAP by developing new data flows, and pulled data using the LOOKUP() and SUBSTR functions.

Implemented Rapid Mart Recovery mechanism for Delta Management.

Scheduled delta jobs to process the delta (changed) data daily.

Maintained the directories with data files and ABAP files to enable the time dimension data load using custom logic.

Split the standard ETL jobs into individual jobs for master data, hierarchies, Cost Center detail and summary, Profit Center detail and summary, and SAP metadata.

Prepared Data Flow and Technical Documents.
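The LOOKUP()/SUBSTR enrichment used in the project above might look like this in outline; Python stands in for the BODS data flow, and the lookup table, codes, and column names are hypothetical:

```python
# Hypothetical sketch of the LOOKUP()/SUBSTR enrichment pattern:
# fetch a description from a lookup table and derive a prefix column.
COST_CENTER_LOOKUP = {"CC10": "Manufacturing", "CC20": "Sales"}

def enrich(row):
    code = row["cost_center"]
    return {
        **row,
        # LOOKUP(): fetch the description, with a default on no match
        "cc_desc": COST_CENTER_LOOKUP.get(code, "UNKNOWN"),
        # SUBSTR(): the first two characters identify the segment
        "segment": code[:2],
    }

print(enrich({"cost_center": "CC10", "amount": 500}))
```

In BODS both derivations would be mapping expressions in a Query transform; the dictionary here merely plays the role of the lookup table.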

Project#3:

Client : BASF CHEMICALS, U.S.A

Team size : 6

Role : BODI and BO Developer.


Duration : Oct 2011 to Mar 2012

Environment : BODI/DS, Oracle 10g, Windows XP, BO.

Roles and responsibilities:

Enhanced the standard SAP dataflow to integrate non-SAP source system data, viz. flat files and XML files.

Designed an ETL job which extracts Excel workbook sheets dynamically and supports multiple workbook extraction.

Designed an ETL job which integrates flat file and XML data and loads it into a single target using the XML_PIPELINE and MERGE transforms.

Designed an ETL job which distributes multiple tables' data into a nested structure in XML format for external systems using the TEMPLATE_XML object.

Used the BODS Data Integrator transforms PIVOT and REVERSE_PIVOT to apply row-level and column-level transformations according to the reporting requirement.

Developed a standard BODS ETL job to load the time dimension table using the Data Integrator transform DATE_GENERATION.

Implemented Slowly Changing Dimension Type 2 (SCD Type 2) for the required dimensions by using the TABLE_COMPARISON, HISTORY_PRESERVING, and KEY_GENERATION transforms.

Implemented ETL logic to identify the holidays of the year by adding a new indicator column in the time dimension.

Implemented the DWH dimension concept, i.e. a dimension with a NULL record, using the platform transform ROW_GENERATION.

Used the LOOKUP_EXT and LOOKUP_SEQ functions to derive columns in BODS ETL by looking up values in validity-type lookup tables.

Implemented ETL data flows with the DATA_TRANSFER transform to improve data loading performance by forcibly generating the INSERT INTO ... SELECT statement.
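The SCD Type 2 flow built above from the TABLE_COMPARISON, HISTORY_PRESERVING, and KEY_GENERATION transforms can be mirrored in a plain-Python sketch; the surrogate keys, column names, and dates below are invented for illustration:

```python
# Hypothetical SCD Type 2 sketch: compare incoming rows to the current
# dimension row, expire the old version, and insert a new versioned row.
HIGH_DATE = "9999-12-31"  # open-ended validity marker

def apply_scd2(dim, incoming, today):
    """dim: list of dicts with surrogate key sk, natural key id,
    tracked attribute city, and valid_from/valid_to dates."""
    next_key = max((r["sk"] for r in dim), default=0) + 1  # KEY_GENERATION
    for row in incoming:
        current = next((d for d in dim
                        if d["id"] == row["id"] and d["valid_to"] == HIGH_DATE),
                       None)
        if current and current["city"] == row["city"]:
            continue                     # TABLE_COMPARISON: no change
        if current:
            current["valid_to"] = today  # HISTORY_PRESERVING: expire old row
        dim.append({"sk": next_key, "id": row["id"], "city": row["city"],
                    "valid_from": today, "valid_to": HIGH_DATE})
        next_key += 1
    return dim

dim = [{"sk": 1, "id": "C1", "city": "Pune",
        "valid_from": "2020-01-01", "valid_to": HIGH_DATE}]
apply_scd2(dim, [{"id": "C1", "city": "Mumbai"}], "2024-06-01")
print(dim)
```

After the run the dimension holds two versions of C1: the Pune row closed on 2024-06-01 and a new Mumbai row open until the high date, which is exactly the history the three BODS transforms preserve.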

Education:

Master of Computer Applications (MCA) from Osmania University, 2010.