The Debugger is used to debug a mapping.
Design a mapping with the following Dataflow diagram.
Create a source definition with EMP metadata.
Create a target definition with the name EMP_TAX and the following columns:
Empno, Ename, Job, Sal, Deptno, Tax
Create a mapping with the name m_Taxcal.
Drag the source and target definitions onto the workspace.
Create a transformation of type Filter. Connect the required ports from the Source Qualifier to the Filter transformation.
Define the following condition in the Filter transformation:
SUBSTR(ENAME, 1, 1) = 'S'
Create a transformation of type Expression. Connect the ports from the Filter to the Expression transformation.
Add a new output port, TAX, in the Expression transformation with the following expression:
sal * 0.17
Connect the ports from the Expression transformation to the target.
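While debugging, it helps to cross-check the mapping logic with an equivalent SQL query against the source. A minimal sketch, assuming the standard EMP sample table (table and column names are illustrative):

    -- Rows the mapping should load into the EMP_TAX target
    -- (assumes an Oracle-style EMP source table)
    SELECT empno, ename, job, sal, deptno,
           sal * 0.17 AS tax
    FROM   emp
    WHERE  SUBSTR(ename, 1, 1) = 'S';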
Open the mapping to debug in the client component (Designer).
From the Mappings menu, select Debugger, then click Start Debugger [F9].
Click Next and select the radio button "Use an existing reusable session".
Click Next, click Next again, then click Finish.
From the Mappings menu, select Debugger, then click Next Instance [F10].
Press [F10] repeatedly to step through the row instances.
To stop the Debugger, from the Mappings menu select Debugger, click Stop Debugger, then click Yes.
Ex: (1)

Source                        Target
CNo   CName    AMT            CNo   CName   AMT
100   Srinu    4500           100   Sri     4500
101   Sreenu   6000           101   Sree    6000
Unit Testing
A unit test for the data warehouse is white-box testing. It should check the ETL procedures, jobs, mappings, and front-end developed reports.
Execute the following tests for each and every ETL application.
Test Case 1: Data Availability
Description
Ensure that data is available for extraction in the source data base.
Test Procedure
Connect to the source database with a valid username and password.
Run a SQL query on the database to verify whether the data is available or not.
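For example, a simple row count confirms that data exists. A minimal sketch; the EMP table name is an assumption:

    -- Illustrative availability check: a non-zero count means data is
    -- available for extraction (EMP is an assumed source table)
    SELECT COUNT(*) FROM emp;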
Test Case 2: Data Load(Insert)
Description:
Ensure that records are being inserted in the target.
Test Procedure:
(1) Make sure that the target table has no records.
(2) Run the mapping and check that records are inserted into the target table.
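A count before and after the run makes this concrete. A sketch; the EMP_TAX target name is an assumption:

    -- Illustrative load check (EMP_TAX is an assumed target table)
    SELECT COUNT(*) FROM emp_tax;   -- expect 0 before the run
    -- ... run the mapping ...
    SELECT COUNT(*) FROM emp_tax;   -- expect the number of rows loaded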
Test Case 3: Incremental Load
Description:
Ensure that the data from the source is properly populated into the target incrementally and that no data is lost.
Test Procedure:
Add a new record with new values in addition to the records already existing in the source.
--> Run the mapping
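For instance (a sketch; the table, columns, and sample values are all illustrative):

    -- Illustrative incremental check: add one new row to the assumed EMP source
    INSERT INTO emp (empno, ename, job, sal, deptno)
    VALUES (7999, 'SMITH2', 'CLERK', 3000, 10);
    COMMIT;
    -- Re-run the mapping; the target should gain exactly this row
    -- while all previously loaded rows remain intact.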
Test Case 4: Data Accuracy
Description:
The data from the source should be populated into the target accurately.
Test Procedure:
Add a new record with new values in addition to the already existing records in the source.
--> Run the mapping.
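One way to verify value-level accuracy is a set difference between source and target. A sketch in Oracle syntax, assuming a straight one-to-one load into an EMP_TAX target:

    -- Illustrative accuracy check: source rows whose values did not
    -- arrive unchanged in the target (empty result = accurate load)
    SELECT empno, ename, sal FROM emp
    MINUS
    SELECT empno, ename, sal FROM emp_tax;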
Test Case 5: Verify Data Loss
Description:
Check the number of records in the source and target.
Test Procedure:
Run the mapping and check the number of records inserted into the target and the number of rejected records.
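A simple reconciliation query covers the row counts. A sketch; the table names are assumptions:

    -- Illustrative reconciliation: compare source and target row counts
    SELECT (SELECT COUNT(*) FROM emp)     AS source_count,
           (SELECT COUNT(*) FROM emp_tax) AS target_count
    FROM dual;  -- FROM dual is Oracle-specific; omit it on other databases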
Test Case 6: Verify Column Mapping
Description:
Verify that the source columns are properly linked to the target columns.
Test Procedure:
A manual check can be performed to confirm that the source columns are properly linked to the target columns.
Test Case 7: Verify Naming Standards
Description:
Verify whether naming standards are followed and the necessary comments are given.
Test Procedure:
A manual check can be performed to verify the naming conventions.
Test Case 8: Verify the Transformations Used in the Mapping
Description:
Check the joins, filters, conditions, and lookups in the mapping.
Performance Testing
The first step in performance tuning is to identify the performance bottleneck, in the following order:
(1) Target
(2) Source
(3) Mapping (transformation)
(4) Session
(5) System (Hardware)
Identify the Target Bottleneck:
The target bottleneck can be identified by configuring the session to write to a flat file as the target. If performance improves significantly with the flat-file target, the original target is the bottleneck.
Identify the Source Bottleneck:
Test Procedure:
In a test mapping, remove all the transformations; if the performance is still similar, there is a source bottleneck.
Identify the Mapping Bottleneck:
Test Procedure:
Place a Filter transformation before each target definition and set the filter condition to FALSE, so that no data is loaded into the target.
If the time the new session takes to run is the same as the original session, there is a mapping bottleneck.
Data warehousing projects are categorized into the following types:
(1) Development Project
(2) Enhancement Project
(3) Migration Project
(4) Production Support Project
Requirement Analysis:
There are two outputs from the requirement analysis
(1) Business Requirement Specification (BRS):
The business analyst is responsible for gathering the business requirements from the end users [decision makers] and documents them as the BRS.
The business requirement analysis takes place at the client location.
The following are the participants in preparing the BRS:
(2) System requirement specification: (SRS)
The senior technical coordinator prepares the SRS, which contains the software and hardware requirements.
DESIGN (Designing and Planning the Solution):
There are two outputs from the design phase:
(1) HLD (High level design)
(2) LLD (Low level design)
(1) HLD:
The ETL architects and data warehousing architects design the solution as per the business requirements. The ETL project architecture is defined in the HLD.
(2) LLD:
Based on the HLD, the senior ETL developers prepare the LLD. The LLD contains more technical details of the system, including the dataflow diagram and details about the source and target systems.
The LLD contains
(1) Table loading order
(2) Dataflow diagram
(3) How to handle NULLs
(4) Incremental load
(5) Type 1 and Type 2 dimensions and surrogate keys
The LLD is also known as the ETL specification document (or) mapping specification document.
Development (Coding):
Based on the LLD, the ETL team designs the mappings.
Note: The metadata is known as the ETL code.
Code Review:
The code review is done by developers.
--> In the code review, the developer reviews the code and logic, but not the data.
Peer Review:
Your code will be reviewed by a third-party developer (your team member) to validate the code, but not the data.
Testing:
The following tests are carried out in a test environment:
Unit Testing:
It is carried out by the developer in the development phase.
--> The unit test is required to verify the ETL procedures and data validations.
Development and Integration Testing:
(1) Run all the mappings in sequential order
(2) Do the impact analysis
(3) Check the dependencies
System Integration Testing:
--> After the development phase, the code is moved to a QA (quality assurance) environment.
--> In this environment, the testing team is given read-only permissions.
--> They will run all the workflows.
--> And they will test the code according to their standards.
User Acceptance Testing:
This test is carried out in the presence of the client users with real sample data.
Production (Deployment):
Migrating the ETL code into the production environment from development.