
File Debugger in Informatica


The Debugger is used to debug a mapping.

Design a mapping with the following Dataflow diagram.

Create a source definition with EMP metadata.

Create a target definition with the name EMP_TAX, containing the following ports:

Empno, Ename, Job, Sal, Deptno, Tax

Create a mapping with the name m_Taxcal.

Drag the source and target definitions onto the workspace.

Create a transformation of type Filter. Connect the required ports from the Source Qualifier to the Filter transformation.

Define the following condition in the Filter transformation:

SUBSTR(ENAME, 1, 1) = 'S'

Create a transformation of type Expression. Connect the ports from the Filter to the Expression transformation.

Add a new port TAX in the Expression transformation with the following expression:

SAL * 0.17

Connect the ports from the Expression transformation to the target.
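The mapping built above can be sketched in plain Python to show what the Filter and Expression transformations do to each row. This is an illustration only; the port names mirror the mapping, but the sample data is made up.

```python
# Illustrative sketch of the m_Taxcal mapping logic in plain Python.

def filter_transformation(rows):
    """Pass only rows satisfying SUBSTR(ENAME, 1, 1) = 'S'."""
    return [r for r in rows if r["ENAME"][:1] == "S"]

def expression_transformation(rows):
    """Add the derived TAX port: SAL * 0.17."""
    return [{**r, "TAX": r["SAL"] * 0.17} for r in rows]

source = [
    {"EMPNO": 7369, "ENAME": "SMITH", "SAL": 800},
    {"EMPNO": 7499, "ENAME": "ALLEN", "SAL": 1600},
    {"EMPNO": 7788, "ENAME": "SCOTT", "SAL": 3000},
]

target = expression_transformation(filter_transformation(source))
for row in target:
    print(row["ENAME"], row["TAX"])
```

Only SMITH and SCOTT pass the filter, and each surviving row gains a TAX column computed from SAL.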

Procedure to use File Debugger

Open the mapping to debug in the Designer client.

From the Mappings menu, select Debugger, then click Start Debugger [F9].

Click Next, then select the radio button "Use an existing reusable session".

Click Next, click Next again, then click Finish.

From the Mappings menu, select Debugger, then click Next Instance [F10].

Press F10 to step through the mapping one row at a time.

To stop the Debugger, from the Mappings menu select Debugger, click Stop Debugger, then click Yes.

Ex: (1)

Source:

CNo    CName     AMT
100    Srinu     4500
101    Sreenu    6000

Target:

CNo    CName     AMT
100    Sri       4500
101    Sree      6000

Unit Testing

A unit test for the data warehouse is white-box testing. It should check the ETL procedures, jobs, mappings, and front-end developed reports.

Execute the following test cases for each ETL application.

Test Case 1: Data Availability

Description

Ensure that data is available for extraction in the source database.

Test Procedure

Connect to the source database with a valid username and password.

Run a SQL query on the database to verify whether data is available or not.
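A data-availability check like this can be scripted. The sketch below uses an in-memory SQLite database as a stand-in for the real source; the table name, columns, and data are assumptions for illustration.

```python
# Sketch of Test Case 1 (Data Availability) using SQLite as a stand-in source.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno INTEGER, ename TEXT, sal REAL)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                 [(7369, "SMITH", 800), (7499, "ALLEN", 1600)])

# The availability test: a simple COUNT(*) confirms data exists for extraction.
row_count = conn.execute("SELECT COUNT(*) FROM emp").fetchone()[0]
assert row_count > 0, "No data available for extraction in the source"
print("rows available:", row_count)
```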

Test Case 2: Data Load(Insert)

Description:

Ensure that records are being inserted in the target.

Test Procedure:

(1) Make sure that the target table has no records.

(2) Run the mapping and check that records are being inserted in the target table.
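The two steps of this test case can be sketched as follows, again with SQLite standing in for the target database; the table name and the simulated load are assumptions.

```python
# Sketch of Test Case 2 (Data Load): verify the target is empty before the run
# and that records exist after it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp_tax (empno INTEGER, tax REAL)")

def count(table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# Step 1: the target table must have no records before the load.
assert count("emp_tax") == 0

# Step 2: run the (simulated) mapping, then confirm records were inserted.
conn.executemany("INSERT INTO emp_tax VALUES (?, ?)",
                 [(7369, 136.0), (7788, 510.0)])
assert count("emp_tax") > 0
print("records inserted:", count("emp_tax"))
```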

Test Case 3: Incremental Load

Description:

Ensure that data from the source is properly populated into the target incrementally and that no data is lost.

Test Procedure:

Add a new record with new values in addition to the records already existing in the source.

–> Run the mapping
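The expected behaviour of an incremental load can be sketched in a few lines of Python: after adding one new source record and re-running, only the new record is loaded, and nothing already in the target is lost. The key column and data are illustrative assumptions.

```python
# Sketch of Test Case 3 (incremental load).

target = [{"EMPNO": 7369, "ENAME": "SMITH"}]          # already loaded earlier
source = [{"EMPNO": 7369, "ENAME": "SMITH"},
          {"EMPNO": 7902, "ENAME": "FORD"}]           # one new record added

def incremental_load(source_rows, target_rows):
    """Insert only source rows whose key is not yet in the target."""
    loaded_keys = {r["EMPNO"] for r in target_rows}
    new_rows = [r for r in source_rows if r["EMPNO"] not in loaded_keys]
    return target_rows + new_rows

target = incremental_load(source, target)
print([r["ENAME"] for r in target])
```

The old record is kept and exactly one new record is appended, so no data is lost.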

Test Case 4: Data Accuracy

Description:

The data from the source should be populated into the target accurately.

Test Procedure:

Add a new record with new values in addition to the already existing records in the source.

–> Run the mapping.

Test Case 5: Verify Data Loss

Description:

Check the number of records in the source and the target.

Test Procedure:

Run the mapping, then check the number of records inserted into the target and the number of rejected records.
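The reconciliation behind this test case is simple arithmetic: the source count should equal the inserted count plus the rejected count. The counts below are made-up illustrations.

```python
# Sketch of Test Case 5 (Verify Data Loss): source = inserted + rejected.
source_count = 1000      # rows read from the source
target_inserted = 997    # rows written to the target
rejected = 3             # rows rejected by the session

assert source_count == target_inserted + rejected, "Data loss detected"
print("no data loss:", source_count, "=", target_inserted, "+", rejected)
```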

Test Case 6: Verify Column Mapping

Description:

Verify that the source columns are properly linked to the target columns.

Test Procedure:

A manual check can be performed to confirm that the source columns are properly linked to the target columns.

Test Case 7: Verify Naming Standards

Description:

Verify whether naming standards are followed and necessary comments are given.

Test Procedure:

A manual check can be performed to verify the naming conventions.

Test Case 8: Verify the Transformations Used in the Mapping

Description:

Check the joins, filter conditions, and lookups in the mapping.

Performance Testing

The first step in performance tuning is to identify the performance bottleneck, in the following order:

(1) Target

(2) Source

(3) Mapping (transformation)

(4) Session

(5) System (Hardware)

Identify the Target Bottleneck:

The target bottleneck can be identified by configuring the session to write to a flat file as the target. If session performance improves significantly with the flat-file target, the target is the bottleneck.

[Test mapping diagram: session writes to a Test Emp flat-file target]

Identify the Source bottleneck:

Test Procedure:

In a test mapping, remove all the transformations; if the performance is still similar, there is a source bottleneck.

Test Mapping: [mapping diagram]

Identify The Mapping bottleneck:

Test Procedure:

Keep a Filter transformation before each target definition and set the filter condition to FALSE, so that no data is loaded into the target.

If the time it takes to run the new session is the same as the original session, there is a mapping bottleneck.

Test Mapping: [mapping diagram]

Data Warehousing Development Project Life Cycle:

Data warehousing projects are categorized into the following types:

(1) Development Project

(2) Enhancement Project

(3) Migration Project

(4) Production support Project

Requirement Analysis:

There are two outputs from the requirement analysis:

(1) Business Requirement Specification (BRS):

The business analyst is responsible for gathering the business requirements from the end users [decision makers] and documenting them as the BRS.

The business requirement analysis takes place at the client location.

The following are the participants in preparing the BRS:

  1. Business Analyst
  2. Decision Makers [Business Managers]

(2) System requirement specification: (SRS)

The senior technical coordinator prepares the SRS, which contains the software and hardware requirements.

DESIGN (Designing and Planning the Solution):

There are two outputs from the design phase:

(1) HLD (High level design)

(2) LLD (Low level design)

(1) HLD:

The ETL architects and data warehousing architects design the solution as per the business requirements. The overall ETL project architecture is described in the HLD.


(2) LLD:

Based on the HLD, the senior ETL developers prepare the LLD. The LLD contains more technical details of the system, including dataflow diagrams and details about the source and target systems.

The LLD contains

(1) Table loading order

(2) Dataflow diagram

(3) How to handle the Nulls

(4) Incremental Load

(5) Type 1 and Type 2 dimensions and surrogate keys

The LLD is also known as ETL specification document (or) mapping specification document.

Development (Coding):

Based on the LLD, the ETL team designs the mappings.

Note: The metadata is known as the ETL code.

Code Review:

The code review is done by developers.

–> In the code review, the developer reviews the code and logic, but not the data.

Ex:

  1. Check the naming standards of the transformations, mappings, sessions, workflows, and any session-level tasks.
  2. Check whether the source and target are shortcuts or not.
  3. Check whether the proper logic is in place.

Peer Review:

Your code is reviewed by a third-party developer (a team member) to validate the code, but not the data.

Testing:

The following tests are carried out in a test environment:

  1. Unit Testing
  2. Development Integration Testing
  3. System Integration Testing
  4. User Acceptance Testing

Unit Testing:

It is carried out by the developer in the development phase.

–> The unit test is required to verify the ETL procedures and data validation.

  • Check the number of rows in the source and target.
  • Check random rows in the source and target.
  • Verify that the proper data is populated into the target as per the ETL logic.
  • Perform checksum validation as well.
  • Perform data validation.
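The checksum check mentioned above can be sketched as follows: hash each row the same way on both sides and compare the combined results. This is one possible scheme (an order-independent XOR of per-row MD5 digests), not a prescribed Informatica method; the sample rows are illustrative.

```python
# Sketch of a source-vs-target checksum comparison for unit testing.
import hashlib

def table_checksum(rows):
    """Order-independent checksum: hash each row, XOR the digests together."""
    total = 0
    for row in rows:
        digest = hashlib.md5("|".join(map(str, row)).encode()).hexdigest()
        total ^= int(digest, 16)
    return total

source = [(100, "Sri", 4500), (101, "Sree", 6000)]
target = [(101, "Sree", 6000), (100, "Sri", 4500)]   # same rows, new order

assert table_checksum(source) == table_checksum(target)
print("checksums match")
```

Because the per-row digests are XORed, row order does not matter, but any changed value on either side breaks the match.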

Development and Integration Testing:

(1) Run all the mappings in the sequence order

(2) Do the impact analysis

(3) Check the dependencies

System Integration Testing:

–> After the development phase, we move our code to the QA (quality assurance) environment.

–> In this environment, the testers are given read-only permission.

–> They run all the workflows.

–> And they test our code according to their standards.

User Acceptance Testing:

The test is carried out in the presence of the client users with real sample data.

Production (Deployment):

Migrating the ETL code into the production environment from development.

