Load cube data from BI to HANA DB (Delta)
Create Datastores for SAP BI source
Create Datastores for HANA DB
Steps at BI Level:-
Select any cube & check whether data is available in it or not
Go to the InfoProvider (tab) in T-Code RSA1
Go to the open hub destination (tab)
Provide
Name: ___________
Description: ___________
Provider: __________ (technical name of the cube, DSOs, etc.)
In the next screen, in the destination (tab), provide
Destination type: Third-party tool
RFC destination: RFC connection of Data Services
Activate
Create Transformation b/w cube & open hub
Create DTP & activate
NOTE: DTP extraction mode must be in delta
Go to process chain (tab)
Right-click on the unassigned node, create a process chain & provide the process name and description; you will get a prompt for the start variant, click on create
Next screen
Start using meta chain or API
Save ---- Go back
Select our DTP ---- connect it to the start variant & activate
Steps at Data Services Level:-
Import the open hub destination into the BI source datastore
Map the fields of the open hub to the fields of the HANA database table
NOTE: The open hub brings updated records only; execute the job to load them
Project 02: Load data from a traditional database (non-SAP system) to the HANA database (initial & delta)
STEP 1:
Create a new job
Go to the source database
Create a table called CDC_TIME with a LAST_UPDATE column of datetime data type, and import the source table and the CDC_TIME table into Data Services (a sketch of the DDL follows below)
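A minimal sketch of that table's DDL, assuming a generic SQL source database (the exact syntax and the DATETIME type vary by vendor):

    CREATE TABLE CDC_TIME (LAST_UPDATE DATETIME);  -- single-row table storing the end time of the last extraction run

The table can start empty; the INITIAL_END script in Step 3 writes the first timestamp into it.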
STEP 2:
Create a project
Create a job
Create global variables: $GV_STARTTIME (start time) and
$GV_ENDTIME (end time), both with the datetime data type
STEP 3:
Take 2 scripts and name them INITIAL_START and INITIAL_END
Take a new workflow and place it between the two scripts
Double-click the INITIAL_START script, go to the smart editor and write the below syntax:
$GV_STARTTIME = '1996.01.01 12:00:00';
$GV_ENDTIME = sysdate();
Double-click the INITIAL_END script, go to the smart editor and write the below syntax:
SQL('cdc', 'DELETE FROM CDC_TIME');
SQL('cdc', 'INSERT INTO CDC_TIME VALUES ({$GV_ENDTIME})');
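The steps here cover the initial run; for the subsequent delta runs, a minimal sketch of the matching scripts, assuming the same 'cdc' datastore and CDC_TIME table (the names DELTA_START and DELTA_END are illustrative, not from the original), following the common Data Services source-based CDC pattern:

    # DELTA_START script: read the previous run's end time as this run's start time
    # to_date() converts the varchar returned by sql(); adjust the format to your database
    $GV_STARTTIME = to_date(sql('cdc', 'SELECT LAST_UPDATE FROM CDC_TIME'), 'yyyy.mm.dd hh24:mi:ss');
    $GV_ENDTIME = sysdate();

    # DELTA_END script: replace the stored timestamp with this run's end time
    SQL('cdc', 'DELETE FROM CDC_TIME');
    SQL('cdc', 'INSERT INTO CDC_TIME VALUES ({$GV_ENDTIME})');

With this pair in place of INITIAL_START/INITIAL_END, the same dataflow and where clause (Step 5) extract only the rows changed since the last run.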
STEP 4:
Double-click the workflow
Create data flow
Select the source table, i.e., the employee table
ODS_EMPLOYEE (CDC)
Take a Query transformation and map the source table to it
Take a template table as the target table and map the Query transformation to it
STEP 5:
Double click on Query transformation
Map the required fields from schema in to schema out
Select the LAST_UPDATE column
Go to the where clause and provide the logic as below:
(ODS_EMPLOYEE.LAST_UPDATE > $GV_STARTTIME) AND (ODS_EMPLOYEE.LAST_UPDATE < $GV_ENDTIME)
NOTE: On the initial run this filter spans from the hard-coded 1996 timestamp to the run time, so it passes the whole table; on delta runs $GV_STARTTIME holds the previous run's end time, so only rows changed since then pass
STEP 6:
Take the (HANA) table as the target table and connect it to the Query
Double-click the target HANA table, go to the options tab, enable the 'Delete data from table before loading' checkbox and enable the 'Drop and re-create table' checkbox
STEP 7:
Validate
Execute the job, and in the source system's CDC_TIME table you can see a timestamp loaded, which is nothing but the initial load time
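To verify, a quick query (illustrative) against the source database should return the single stored row:

    SELECT LAST_UPDATE FROM CDC_TIME;  -- expect one row: the end time written by the INITIAL_END script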