Job Sequencing in DataStage
Job sequencing controls the order in which jobs execute.
A sequence can run the same job multiple times within one master job.
It checks the status of Job 1: only if Job 1 finishes without errors is Job 2 run.
Special handling is needed for the situation where a job is aborted.
When everything is successful, a confirmation mail should be sent reporting the success.
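The ordering rule above can be sketched in Python. This is an illustrative sketch only, not the DataStage API: the job names and status strings are invented for the example.

```python
# Illustrative sketch (not the DataStage API): a sequence runs jobs in
# order and starts a job only when the previous one finished cleanly.

def run_sequence(jobs):
    """Run (name, callable) pairs in order; stop at the first abort.

    Each callable returns 'OK', 'WARNING', or 'ABORTED'.
    Returns the (name, status) pairs that actually executed.
    """
    results = []
    for name, job in jobs:
        status = job()
        results.append((name, status))
        if status == "ABORTED":
            break  # downstream jobs are never started
    return results

results = run_sequence([
    ("job1", lambda: "OK"),
    ("job2", lambda: "ABORTED"),
    ("job3", lambda: "OK"),   # never runs
])
# results == [("job1", "OK"), ("job2", "ABORTED")]
```

Note that job3 never starts: the sequence stops at the first aborted job, which is exactly the control a sequencer provides over a set of otherwise independent jobs.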
Wait for file
Waits until the specified file appears, and only then lets the source load proceed.
If a job aborts midway for any reason, checkpoints make sure that the sequence restarts from the point where it failed rather than from the beginning.
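The checkpoint-restart behaviour can be sketched as follows. This is an illustrative sketch, not DataStage internals: step names are made up, and a real tool would persist the checkpoint set to disk between runs.

```python
# Illustrative sketch (not DataStage internals): completed steps are
# recorded as checkpoints, so a restarted sequence skips them and
# resumes at the step that failed.

def run_with_checkpoints(steps, checkpoints):
    """Run (name, callable) steps, skipping names already checkpointed.

    A step that raises leaves its own checkpoint unset, so the next
    run resumes exactly there.
    """
    for name, step in steps:
        if name in checkpoints:
            continue  # already done in a previous run
        step()        # may raise -> sequence aborts here
        checkpoints.add(name)

log = []

def extract():
    log.append("extract")

def load_broken():
    raise RuntimeError("database down")

done = set()
try:
    run_with_checkpoints([("extract", extract), ("load", load_broken)], done)
except RuntimeError:
    pass
# done == {"extract"}: a rerun skips extract and retries only load.
```

On the rerun only the failed step executes again, which is the point of making the sequence restartable.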
A parameter defined in one job cannot be used directly in another job; to pass it across, we use parameter mapping.
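Parameter mapping can be sketched like this. Again an illustrative sketch, not the DataStage API: the sequence owns the parameters and maps them into each job's own parameter names, so jobs never read each other's parameters directly. All names and values here are invented.

```python
# Illustrative sketch: the sequence holds the parameters and "maps"
# them into each downstream job's parameter names.

def map_parameters(sequence_params, mapping):
    """Build one job's parameter dict from the sequence's parameters.

    `mapping`: job-parameter name -> sequence-parameter name.
    """
    return {job_p: sequence_params[seq_p] for job_p, seq_p in mapping.items()}

seq_params = {"DBUser": "etl", "RunDate": "2024-01-01"}
job1_params = map_parameters(seq_params, {"user": "DBUser"})
job2_params = map_parameters(seq_params, {"user": "DBUser", "as_of": "RunDate"})
# job2_params == {"user": "etl", "as_of": "2024-01-01"}
```

Both jobs receive the same underlying value even though each declares its own parameter name.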
Job sequencing stages are categorized into 4 groups:
- Run stages
- Flow control stages
- Error handling stages
- Other stages
New → select Sequence Job → click View → click the Palette option → the palette opens.
Designing a Master Job
To run a set of jobs in a specific sequence: click View → Repository → select the jobs → drag and drop the required jobs onto the canvas.
Drag the Job Activity from the palette → right-click → browse to the job.
It supports multiple outputs but only one input.
Other stages support multiple inputs and multiple outputs.
Here a trigger is set on the link: the next activity runs only if the job finished with no errors/warnings.
If a job aborts, we use the Terminator activity to stop the sequence.
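The trigger idea can be sketched as a simple dispatch. This is an illustrative sketch, not the DataStage trigger engine: trigger names and status strings are simplified for the example.

```python
# Illustrative sketch: each output link of an activity carries a trigger
# condition, and only links whose condition matches the finishing status
# fire.

TRIGGERS = {
    "OK": lambda s: s == "FINISHED_OK",      # finished, no errors/warnings
    "Failed": lambda s: s == "ABORTED",      # route to a Terminator
    "Unconditional": lambda s: True,         # always fires
}

def fired_links(links, status):
    """Return targets of the (trigger, target) links that fire for `status`."""
    return [target for trigger, target in links if TRIGGERS[trigger](status)]

links = [("OK", "job2"), ("Failed", "Terminator"), ("Unconditional", "Notify")]
# fired_links(links, "ABORTED") == ["Terminator", "Notify"]
```

With this arrangement the clean-finish path and the abort path leave the same activity on different links, which is how a sequence routes around a failure.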
The error handling stages are Exception Handler, Notification Activity, and Terminator.
If the server fails, the Exception Handler takes over.
The sequence property "Automatically handle activities that fail" must be enabled; only then does the Exception Handler run and allow the sequence to continue.
- Job finishes – a mail has to be delivered
- Job aborts – a mail has to be delivered
- Server down – a mail has to be delivered
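The three mail cases above can be sketched as one notification routine. This is an illustrative sketch, not the Notification Activity itself: the address and subject wording are invented for the example.

```python
# Illustrative sketch: compose the mail for each of the three events
# (finish, abort, server down). Address and wording are made up.

def notification(event):
    """Return the mail to send for a sequence event."""
    subjects = {
        "finished": "Sequence finished successfully",
        "aborted": "Sequence aborted - check the job log",
        "server_down": "Server down - sequence did not run",
    }
    return {"to": "etl-team@example.com", "subject": subjects[event]}

mail = notification("aborted")
# mail["subject"] == "Sequence aborted - check the job log"
```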
Wait for file Activity
Properties → browse to the required file → select "Wait for file to appear" → "Do not timeout".
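The wait-for-file behaviour can be sketched as a polling loop. This is an illustrative sketch, not the DataStage activity: passing `timeout=None` mirrors the "Do not timeout" option, and the poll interval is an arbitrary choice.

```python
# Illustrative sketch of the wait-for-file idea: poll until the file
# appears, with an optional timeout (timeout=None = "Do not timeout").

import os
import time

def wait_for_file(path, timeout=None, poll_interval=0.1):
    """Return True once `path` exists; False if `timeout` seconds pass.

    With timeout=None the call waits indefinitely, mirroring the
    "Do not timeout" option.
    """
    start = time.monotonic()
    while not os.path.exists(path):
        if timeout is not None and time.monotonic() - start >= timeout:
            return False
        time.sleep(poll_interval)
    return True
```

Downstream activities would run only after this returns True, which is how the sequence holds the source load until the file has arrived.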
Enable "Add checkpoints so sequence is restartable on failure".
Click on a Job Activity; it shows the option "Do not checkpoint run", which excludes that job from checkpointing.
Jobs → Insert parameter → Job Properties → Add parameters.
Oracle → Oracle parameters → parameter set.
Example parameter: Dno, prompt deptno, type Integer.