Mapping Parameter in Informatica


A mapping parameter represents a constant value that is defined before the mapping runs

A mapping parameter is defined in a parameter file, which is saved with the extension .prm or .txt

Syntax:

[FolderName.WF:WorkflowName.ST:SessionName]

$$ParameterName=constant value

Advantages

A mapping can be reused with different constant values passed through parameters.

Reduces the development overhead.

Mapping parameters are local to the mapping and can be accessed only within the mapping.
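For illustration, a complete parameter file might look like the following sketch; the folder, workflow, session, and parameter names here are assumptions for the example, not values from this procedure:

```
[SalesFolder.WF:wf_load_sales.ST:s_load_sales]
$$Dno=10
$$LoadDate=01/01/2020
```

Changing $$Dno in the file and re-running the session reuses the same mapping for a different department, which is the reuse advantage described above.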

Data Flow Diagram


Note:

Mapping parameters can also be used in the source qualifier transformation.

Procedure

Create source and target definitions

Create the mapping with the name M_mapping_parameter

Drop the source and target definitions

Create a transformation of type Filter

From the Mapping menu, select Parameters and Variables

From the toolbar, click Add a new variable

Name     Type        Data type   Precision   Scale
$$Dno    Parameter   decimal     2           0

 

 

Click ok

From the source qualifier, copy the required ports to the filter transformation

In the filter transformation, define the following condition:

DEPTNO = $$Dno

From filter connect the ports to the target

Creation of parameter file

Open Notepad and define the following:

[BSR.WF:W_MP.ST:S_MP]

$$Dno=10

Save the parameter file with the extension .prm

Create the session with the name S_MP

Double-click the session and select the Properties tab

Attribute              Value
Parameter Filename     C:/bsr/test.prm

 

 

Select the Mapping tab and set the reader and writer connections

Click apply and click ok

Create the workflow with the name W_MP

Mapping variable

A mapping variable represents a value that can be changed during the mapping run

After each successful completion of the session, the Integration Service stores the variable with its current value in the repository.

Use the following functions to set mapping variables:

  1. SETCOUNTVARIABLE()
  2. SETMAXVARIABLE()
  3. SETMINVARIABLE()
  4. SETVARIABLE()

Business purpose

   Use mapping variables to perform incremental reading


Procedure

Create a source definition with the name EMP

Create a target definition EMP_INCR with the ports:

EMPNO, ENAME, JOB, SAL, DEPTNO

Create a mapping with the name M_employee_incremental_read

Drop the source and target definition

Create a transformation of type Filter.

From the Mapping menu, select Parameters and Variables

From tool bar click on add a new variable

Name     Type       Precision   Scale
$$Dno    Variable

Initial value: 0. Click OK

From the source qualifier, copy the required ports to the filter transformation

Develop the following filter condition to perform the incremental reading:

DEPTNO = SETVARIABLE($$Dno, $$Dno + 10)

From the filter transformation, connect the ports to the target

To view the current variable value stored in the repository:

In the Workflow Manager, right-click the session and select View Persistent Values

Note: The variable value increments on each successful session run.
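As a sketch of how the filter condition above behaves (assuming $$Dno has initial value 0 and the departments are numbered 10, 20, 30):

```
-- Filter condition (Informatica expression language):
-- SETVARIABLE assigns the new value to $$Dno and returns it,
-- so each successful session run advances the variable.
-- Run 1: $$Dno becomes 0 + 10, the filter passes DEPTNO = 10
-- Run 2: $$Dno becomes 10 + 10, the filter passes DEPTNO = 20
DEPTNO = SETVARIABLE($$Dno, $$Dno + 10)
```

The persisted value is written back to the repository only when the session completes successfully, which is what makes the read incremental across runs.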

Generating sequence numbers with a mapping variable:

Source definition: EMP

Target definition: EMP_SEQUENCE with the ports:

EMPKEY, EMPNO, ENAME, SAL

Create the mapping with the name M_sequence_variable

Drop the source and target definition

Create a transformation of type Expression

From the source qualifier, copy the required ports to the expression transformation

From mapping menu select parameters and variables

From tool bar click on add a new variable

Name         Type       Data type   Aggregation
$$sequence   Variable   Integer     Count

Initial value: 0. Click OK

From expression transformation connect the ports to target
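The procedure above does not show the expression itself; the following is a minimal sketch, assuming an output port named EMPKEY as in the target definition:

```
-- Output port EMPKEY in the expression transformation:
-- SETCOUNTVARIABLE increments $$sequence by one for each row read
-- (aggregation type Count), yielding 1, 2, 3, ... as a surrogate key.
EMPKEY = SETCOUNTVARIABLE($$sequence)
```

Because the variable is persisted in the repository, the next session run continues numbering from where the previous run stopped.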

Note:

The Sequence Generator transformation, Normalizer transformation, mapping variables, and dynamic lookup cache can all generate sequential (surrogate) key values.


Session parameter

A session parameter defines the connection to a database system

A session parameter is defined in a parameter file, which is saved with the extension .prm or .txt

Parameter file syntax:

[FolderName.SessionName]

$DBConnection_src=relational connection name

Business purpose

A session can be reused to connect to multiple databases to perform extraction


The sources can be homogeneous or heterogeneous, but the table structure must be the same

Create a source and target definitions

EMPNO, ENAME, SAL, JOB, DEPTNO [EMPNO primary key disabled]

Create the mapping with the name M_session_parameter_DB

Drop the source and target definition

From source qualifier connect ports to the target

From repository menu click on save

Create a session with the name S_M_session_parameter_DB

Double-click the session and select the Mapping tab

From the left pane, select SQ_EMP

From Connections, click the open button to open the relational connection browser

Select Use Connection Variable

Enter the name $DBConnection_src, click OK

From the left pane, select the target and set the writer connection with load type Normal

Click apply , click ok

Creation of parameter file:

Open Notepad and define the following:

[BSR.S_M_session_parameter_DB]

$DBConnection_src=scott_reader

Save the file with the extension .prm or .txt

Double-click the session and select the Properties tab

Attribute              Value
Parameter Filename     C:\Batch4pm\SYS.PRM

 

 

 

Advanced topics list 2:

  1. Session recovery
  2. Push-down optimization
  3. Difference between normal and bulk loading
  4. PMCMD utility
  5. PMREP utility
  6. Mapping debugger
  7. ETL performance optimization techniques
  8. ETL unit testing
  9. Incremental aggregation (aggregate cache)
  10. Types of lookup cache
  11. Rank and sort cache
  12. Performance testing

Advanced mapping techniques:

  1. Dynamic lookup cache
  2. Unconnected stored procedure to return multiple output ports
  3. Unconnected stored procedure to drop and create an index
  4. Shortcuts [reusable objects]
  5. Metadata versioning
  6. Check in - check out
  7. Session partitioning

PMCMD utility

PMCMD is a command-line utility that provides communication with the Integration Service

PMCMD can perform some of the operations that can also be performed using the Workflow Manager client

Use PMCMD to start workflows on the Integration Service

PMCMD is a command-line client program (or interface) that executes the following commands:

Connect

Commands are case sensitive and are written in lower case. The connect command attaches PMCMD to the Integration Service so you can perform some of the operations that are also available in the Workflow Manager client

Syntax:

connect -sv service_name -d domain_name -u user_name -p password

Disconnect

It disconnects PMCMD from the Integration Service

Syntax

disconnect

Exit:

It disconnects PMCMD from the Integration Service and closes the PMCMD client program

startworkflow

It allows you to execute a workflow on the Integration Service

Syntax: startworkflow -f folder_name workflow_name

setfolder

It activates the folder from which we can run workflows.

Syntax: setfolder folder_name  (Ex: setfolder Batch4pm)

Note: After activating the folder, the workflow can be started with the following syntax.

startworkflow workflow_name

unsetfolder

It deactivates the current folder

Syntax: unsetfolder

Procedure: Access the command prompt, type PMCMD to invoke the client program.

Ex: C:\> pmcmd

pmcmd> connect -sv nipuna (Integration Service name) -d domain_name -u Administrator (user name) -p Administrator (password)

Start the workflow:

pmcmd> startworkflow -f Batch4pm W_PMCMD (workflow name)

Method 2:

pmcmd> setfolder Batch4pm

> startworkflow W_PMCMD

> unsetfolder

getserviceproperties

It provides the following details about the Integration Service:

(1) Domain Name

(2) Name of Integration Service

(3) Version of Integration Service

(4) Operating system on which Integration service is running.

(5) Integration Service code page, operating mode.

(6) Node Name

(7) Primary node details

PMREP Utility

PMREP is a command-line utility that provides access to the Repository Service to perform repository administration

Execute the following commands to perform administration:

Connect

It connects to the repository service with the following syntax.

pmrep> connect -r repository_name (the Repository Service) -d domain_name -n user_name -x password

Ex: C:\> pmrep

pmrep> connect -r nipuna_rep -d domain_name -n Administrator (user) -x Administrator (password)


createfolder:

It creates a new folder in the repository.

Syntax: createfolder -n folder_name  (Ex: createfolder -n AXAEDM)

deletefolder:

It deletes a folder from the repository.

Syntax: deletefolder -n folder_name

Ex:          deletefolder -n AXAEDM

backup

It allows you to take a backup of the entire repository into a ".rep" file.

Syntax: backup -o output_file_path

Ex: backup -o C:\DEVINE\DEV.rep

deleteobject: It deletes repository objects from the repository

Syntax: deleteobject -o <object_type> -f <folder_name> -n <object_name>

Ex: deleteobject -o session -f Batch4pm -n 530

objectexport

It exports repository objects such as mappings, sessions, and workflows into an XML file.

Syntax:

objectexport -n <object_name>

-o <object_type>

-f <folder_name>

-u <xml_output_file_name>

Ex: objectexport -n m10 -o mapping -f Batch4pm -u C:\Batch4pm\ABC.xml

[GUI-based export: select the mapping name in the repository, then select Export.]

Restore: It restores a backup file into an empty repository database

Exit: It disconnects pmrep from the Repository Service and closes the pmrep client program.

Session Recovery:

When you stop a session, or an error causes the session to fail, identify the reason for the failure and then start the session using one of the following methods.

(1) Restart the session if the Integration Service has not issued at least one commit.

(2) Perform session recovery if the Integration Service has issued at least one commit.

When you start the session in recovery mode, the Integration Service reads the row ID of the last committed record from the OPB_SRVR_RECOVERY table.

The Integration Service reads all the source data but resumes processing from the next row ID.

Procedure: Double click the session , select the properties tab

Attribute                  Value

Recovery Strategy          Resume from last checkpoint

Click apply and click Ok

Push-down optimization:

A session that is configured for push-down optimization increases session performance.

The Integration Service analyzes the mapping and its transformations and determines which transformation logic can be pushed to the source or target database server.

The Integration Service translates the transformation logic into SQL and sends it to the database server for processing.

Push-down optimization can be configured in three different ways.

(1) Source-side push-down optimization:

The Integration Service sends as much transformation logic as possible to the source database server.

(2) Target-side push-down optimization:

The Integration Service sends as much transformation logic as possible to the target database server.

(3) Full push-down optimization:

The Integration Service sends the transformation logic to both the source and target database servers.
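As an illustration only (not the exact SQL the Integration Service generates), a filter transformation with the condition DEPTNO = 10 pushed to a relational source might be translated into a query like:

```sql
-- Hypothetical pushed-down query: the filter logic executes in the
-- source database instead of the Integration Service.
SELECT EMPNO, ENAME, SAL, DEPTNO
FROM   EMP
WHERE  DEPTNO = 10
```

Because the filtering happens in the database, fewer rows cross the network to the Integration Service, which is where the performance gain comes from.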

Enhancements in PowerCenter 8 [compared with version 7]:

The following are the new enhancements made in PowerCenter 8 in comparison with PowerCenter 7.

(1) Service-oriented architecture (SOA) [domain explain]

(2) PUSH down optimization

(3) User defined functions

(4) Generating workflows from the Designer client. [implemented in version 8.6]

(5) SQL transformation

(6) Java transformation

(7) New date/time formats, e.g. MM/DD/YY HH24:MI:SS.US (microseconds)


About Author

Name: TekSlate