DQ Developer 9.x Specialist Skill Set Inventory
Informatica University, Copyright Informatica Corporation, November 2014
Skill Set Inventory Data Quality 9.x: Developer, Specialist Certification
About the Informatica Certified Professional (ICP) Program

Informatica certification is a two-stage structure that correlates professional certification with the what, why, and how of Informatica implementations:

Specialist: To achieve Specialist recognition, a candidate must pass a written exam, proctored in person or via webcam. The Specialist exam validates that the individual understands the product and can apply the skills and capabilities required to contribute to a project as a full team member.

Expert: To reach the Expert level, an Informatica Certified Specialist must pass the Informatica Velocity Best Practices and Implementation Methodologies Certification. This certification validates that you are able to lead a project implementation team following our best practices.

Our new exams have been developed to define product competency by job role, where demonstrated performance and outcomes validate Informatica implementation skills.
About the ICS Data Quality Developer Exam and the Skill Set Inventory

This exam measures your competency as a member of a project implementation team. This involves having an in-depth knowledge of each of the Data Quality processes, from Profiling to Standardization, Matching, and Consolidation. You need to be able to select and configure the appropriate Data Quality transformation for your requirements and build, debug, and execute Data Quality mappings, as well as integrate mappings into PowerCenter.
The skill set inventory is used to guide your preparation before taking the exam. It is an outline of the technical topics and subject areas that are covered in each exam. The skill set inventory includes test domain weighting, test objectives and topical content. The topics and concepts are included to clarify the test objectives.
Test takers will be tested on:
Be able to navigate through the Developer Tool
Use project collaboration techniques (shared Tags, Comments, Profiles, reference tables) to share information with Analysts
Use a variety of profiling methods to profile data
Use Analyst-built profiles to develop mappings
Build mappings and mapplets to cleanse and standardize data
Perform Address Validation
Identify duplicate records in a dataset
Automatically and manually consolidate duplicate records to create a master record
Execute mappings/mapplets in PowerCenter
Use Data Quality mappings in an Excel spreadsheet (connect to Web Services)
Training Prerequisites

The skills and knowledge areas measured by this exam focus on core product functionality within the realm of a standard project implementation. Training materials, supporting documentation, and practical experience may serve as sources for question development. The suggested training prerequisites for this certification level are the completion of the following Informatica course(s):
onDemand Training
Data Quality 9.5: Analyst onDemand w/ 40 lab hours
Informatica Developer Tool: Introduction - onDemand w/ 30 lab hours
Data Quality 9.x: Developer onDemand w/ 40 lab hours
Data Quality 9.x: Developer Level 2 onDemand w/ 40 lab hours
Exam Test Domains
The test domains, and the extent to which each is represented as an estimated percentage of the exam, are as follows:
Title                          % of Exam
Informatica Overview           10%
Analyst Collaboration          10%
Profiling                      15%
Standardization/Mapplets       10%
Address Validation             5%
Matching                       10%
Consolidation and the DQA      10%
Integration with PowerCenter   5%
Object Import and Export       5%
DQ for Excel                   5%
Parameters                     5%
Content                        10%
Question Format

You may select from one or more response offerings to answer a question. A question is scored as correct if your response accurately completes the statement or answers the question. Incorrect distractors are offered as plausible answers so that those without the required skills and experience may wrongly select them. A passing grade of 70% is needed to achieve the Data Quality Developer 9.x: Specialist certification. You are given 90 minutes to complete the exam. Test formats used in this exam are:
Multiple Choice: Select the one option that best answers the question or completes the statement
Multiple Response: Select all options that apply to best answer the question or complete the statement
True/False: After reading the statement or question, select the best answer
Exam Policy
If you do not pass on your first attempt, you must wait two weeks before retaking the exam
You may take the exam up to three times within one year of your first attempt
Test Topics

The exam contains 70 questions drawn from the sections listed below. To ensure that you are prepared for the test, review the subtopics associated with each section.
Informatica Overview
Be able to describe the Informatica 9 architecture and setup, including the repositories required for installation
Be able to provide information on the Data Quality process and dimensions of data quality
Analyst Collaboration
Be able to use and describe the functionality of the Analyst Tool, including Scorecarding, Reference Table Management, Tags, Filters, Profiles, and Comments
Be able to describe how Analysts and Developers can collaborate on projects
Describe the benefits of project collaboration to a team
Profiling
Be able to perform and interpret column, rule, comparative and mid-stream profiling
Standardization/Mapplets
Be aware of where data standardization fits in the DQ process
Apply, configure and troubleshoot data standardization transformations
Be able to differentiate between the various parsing techniques available
Recognize how reference tables are used in the standardization process
Address Validation
Explain the importance of Address Validation
Configure the Address Validation transformation, including explaining each available mode and what custom templates can be used for
Interpret the AV outputs generated
Matching
Know how to develop and build match plans to identify duplicate or related data
Be able to differentiate between the available matching algorithms and explain how matching scores are calculated
Know how to configure the Identity Matching option for the match transformation
Explain populations, the available identity match strategies, and how they are used in Identity Matching
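As a purely generic illustration of the idea behind a weighted match score — normalize per-field similarities to a 0..1 range and combine them by weight — the sketch below uses Levenshtein edit distance. This is not Informatica's algorithm; the record fields, weights, and distance choice are invented for the example.

```python
# Generic weighted field-level match scoring sketch.
# NOT Informatica's implementation; fields and weights are invented.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def field_similarity(a: str, b: str) -> float:
    """Normalize edit distance into a 0..1 similarity score."""
    if not a and not b:
        return 1.0
    return 1.0 - edit_distance(a, b) / max(len(a), len(b))

def record_score(rec1: dict, rec2: dict, weights: dict) -> float:
    """Weighted average of per-field similarities."""
    total = sum(weights.values())
    return sum(field_similarity(rec1[f], rec2[f]) * w
               for f, w in weights.items()) / total

r1 = {"name": "Jon Smith",  "city": "Boston"}
r2 = {"name": "John Smith", "city": "Boston"}
score = record_score(r1, r2, {"name": 0.7, "city": 0.3})
print(round(score, 2))
```

In IDQ itself, the per-field algorithm and weights are configured in the match transformation rather than hand-coded; the sketch only shows the general shape of the calculation.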
Consolidation and the DQA
Explain automatic and manual consolidation techniques and configure the transformations used for automatic consolidation
Use the Exception transformation to generate and populate the Bad Records and Duplicate Records tables
Troubleshoot any problems that may occur
Be able to generate a survivor record in the DQA
Remove duplicates using DQA
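Generating a survivor record boils down to applying a per-field survivorship rule across a group of duplicates. The sketch below illustrates that general idea only; the rules shown (most frequent non-empty value, longest value) and the sample data are invented, not the DQA's actual configuration model.

```python
# Generic automatic-consolidation sketch: build one master ("survivor")
# record from a group of duplicates. Rules and data are invented for
# illustration; real tools let you configure such rules per column.

from collections import Counter

def most_frequent(values):
    """Survivorship rule: keep the most common non-empty value."""
    values = [v for v in values if v]
    return Counter(values).most_common(1)[0][0] if values else ""

def longest(values):
    """Survivorship rule: keep the longest value."""
    return max(values, key=len, default="")

def consolidate(duplicates, rules):
    """Apply a per-field survivorship rule across all duplicate records."""
    return {field: rule([rec.get(field, "") for rec in duplicates])
            for field, rule in rules.items()}

dupes = [
    {"name": "J. Smith",   "phone": "555-0100"},
    {"name": "John Smith", "phone": ""},
    {"name": "J. Smith",   "phone": "555-0100"},
]

master = consolidate(dupes, {"name": longest, "phone": most_frequent})
print(master)
```

The design point is that each field of the master can survive from a different source record, which is why the DQA distinguishes automatic rules from manual review of the remaining conflicts.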
Integration with PowerCenter
Explain how to integrate IDQ mappings and mapplets into PowerCenter workflows, including how content is handled
Object Import and Export
Describe the difference between the Basic and Advanced Import options available
Explain how dependency conflicts are resolved
Describe how to Export a Project
DQ for Excel
Be able to describe the requirements for Excel integration
Explain the techniques for creating mappings for DQ for Excel
Explain the Web service capabilities required for DQ for Excel
Explain how to use the Informatica ribbon in Excel
Parameters
Explain what parameters are and why they are used, including which parameter functions are supported
Identify which Transformations can use parameters
Describe the process of exporting mappings with parameters built in
Content
Explain what Content is, what is contained in the Core Accelerator and why it is used
Sample Test Questions

Scorecarding is performed on which of the following?

A. Per record basis
B. Per column/attribute basis
C. Per table basis
D. Per database basis
Select all correct statements for IDQ grouping and matching:

A. IDQ field level matching does not utilize grouping
B. When field level matching is performed, the records within each group will be compared against each other
C. When field level matching is performed, matching will be performed across multiple groups in a single match transformation
D. When field level matching is performed, matching will not be performed across groups; therefore it is imperative that grouping is performed on complete and accurate field(s)
What types of profiling can be performed in the Developer Tool in Data Quality?

A. Column Profiling, Primary Key Inference, Dependency Inference
B. Column and Join Profiling only
C. Column, Join Analysis, Mid Stream, Comparative Profiling
D. Column, Join Analysis, Mid Stream, Comparative, Primary and Foreign Key and Overlap Profiling
Data Consolidation manages the following processes (select all that apply):

A. Identifies duplicates within a data set
B. Merging or linking duplicate or related records
C. Removing duplicates from a dataset
D. Replacing inaccurate data
Select the architecture description that best describes IDQ 9.1:

A. The architecture is completely server based; there is no client. Both Developer and Analyst users access IDQ through a web based interface
B. The architecture is both client and server based. The Data Integration, Analyst, and Model Repository services all run on the server. Developer is the client application
C. IDQ has both a client and server repository. Production ready data quality mappings are transferred from the client repository to the server repository
D. IDQ does not have a repository. All rule and mapping work is stored as XML files in the server directory structure