Project #1:-
- Load cube data from BI to HANA DB (delta)
- Create a datastore for the SAP BI source
- Create a datastore for the HANA DB
Steps at BI Level:-
- Select any cube & check whether it contains data or not
- Go to the InfoProvider tab in T-Code RSA1
- Go to the Open Hub Destination tab
- Provide:
- Name: ___________
- Description: ___________
- Provider: ___________ (technical name of the cube, DSO, etc.)
- In the next screen, in the Destination tab, provide:
- Destination type: Third-party tool
- RFC destination: the RFC connection of Data Services
- Activate
- Create a transformation between the cube & the open hub destination
- Create a DTP & activate
NOTE: The DTP extraction mode must be Delta
- Go to the Process Chains tab
- Right-click on the Unassigned Nodes folder and create a process chain; provide the process chain name and description; you will get a prompt for the start variant; click on Create
- In the next screen, select Start Using Meta Chain or API
- Save & go back
- Select our DTP, connect it to the start variant & activate
Steps at Data Services:-
- Import the open hub destination into the BI source datastore
- Import the database table from the HANA datastore
- Create a project
- Create a job
- Create a workflow
- Create a data flow
- Drag the open hub destination into the data flow
- Double-click on it
- Choose the option to read from the process chain
- Select our process chain
- Take a Query transformation
- Connect the open hub destination to the Query transformation
- Drag the target table from the HANA datastore
- Connect the Query transformation to the HANA table
- Map the fields of the open hub to the fields of the HANA database table
NOTE: The open hub brings updated records only (delta).
- Execute the job
Project #2:-
Load data from a traditional database (non-SAP system) to a HANA database (initial & delta)
Step 1:-
- Create a new job
- Go to the source database
- Create a table called CDC_TIME with a LAST_UPDATE column of datetime data type, and import the source table and the CDC_TIME table into Data Services
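As a one-time setup sketch, the CDC_TIME table could also be created from a Data Services script as below (the datastore name 'cdc' is taken from the scripts later in these notes; the exact datetime type depends on the source database, and the same DDL can equally be run directly in the database):

# One-time setup: create the single-row control table that stores the last load time
sql('cdc', 'CREATE TABLE CDC_TIME (LAST_UPDATE DATETIME)');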
Step 2:-
- Create a project
- Create a job
- Create two global variables, $GV_STARTTIME and $GV_ENDTIME, with datetime data types
Step 3:-
- Take two scripts and name them INITIAL_START and INITIAL_END
- Take a new workflow and put it in between the two scripts
- Double-click on the INITIAL_START script, go to the smart editor and write the below syntax:
$GV_STARTTIME = '1996.01.01 12.00.00';
$GV_ENDTIME = sysdate();
- Double-click on the INITIAL_END script, go to the smart editor and write the below syntax:
sql('cdc', 'DELETE FROM CDC_TIME');
sql('cdc', 'INSERT INTO CDC_TIME VALUES ({$GV_ENDTIME})');
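Taken together, the two scripts open the initial load window and record where it closed; a commented sketch, with the same datastore and table names as above:

# INITIAL_START: open the load window
$GV_STARTTIME = '1996.01.01 12.00.00';  # a date safely before any source record
$GV_ENDTIME = sysdate();                # current system date/time

# INITIAL_END: keep exactly one row holding the end time of this load
sql('cdc', 'DELETE FROM CDC_TIME');
sql('cdc', 'INSERT INTO CDC_TIME VALUES ({$GV_ENDTIME})');  # {...} substitutes the variable's value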
Step 4:-
- Double-click on the workflow
- Create a data flow
- Select the source table, i.e. the employee table ODS_EMPLOYEE (CDC)
- Take a Query transformation and map the source table to the Query transformation
- Take a template table as the target table and map it to the Query transformation
Step 5:-
- Double-click on the Query transformation
- Map the required fields from schema in to schema out
- Select the LAST_UPDATE column
- Go to the where clause and provide the logic as below:
(ODS_EMPLOYEE.LAST_UPDATE > $GV_STARTTIME)
AND
(ODS_EMPLOYEE.LAST_UPDATE < $GV_ENDTIME)
Step 6:-
- Take the HANA table as the target table and connect it to the Query transformation
- Double-click on the target HANA table, go to the Options tab, enable the 'Delete data from table before loading' checkbox and enable the 'Drop and re-create table' checkbox
Step 7:-
- Validate
- Execute the job; in the source system you can see a timestamp loaded into the CDC_TIME table, which is nothing but the initial load time
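As a quick sanity check, the stored timestamp can also be printed to the trace log from a script, assuming the same 'cdc' datastore:

# Print the timestamp written by INITIAL_END; it should match the initial load time
print(sql('cdc', 'SELECT LAST_UPDATE FROM CDC_TIME'));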
DELTA LOAD:-
- Create a new job and name it as delta load
- Take two scripts and name them INITIAL_START and INITIAL_END
- Take a new workflow and put it in between the two scripts
- Double-click on the INITIAL_START script, go to the smart editor and provide the below logic:
$GV_STARTTIME = to_date(sql('cdc', 'SELECT LAST_UPDATE FROM CDC_TIME'), 'YYYY.MM.DD HH24.MI.SS');
$GV_ENDTIME = sysdate();
- Double-click on the INITIAL_END script, go to the smart editor and provide the below logic (a consolidated, commented sketch of both delta scripts follows this step list):
sql('cdc', 'UPDATE CDC_TIME SET LAST_UPDATE = {$GV_ENDTIME}');
- Take the employee table as the source table
- Take a Query transformation and connect the two
- Double-click on the Query transformation, map the required fields from schema in to schema out (the same LAST_UPDATE where clause as in Step 5 applies here, so only changed records flow through)
- Take the HANA table as the target table
- Make sure that in the target HANA table's Options tab, the following checkboxes are disabled:
- Delete data from table before loading
- Drop and re-create table
- Validate it
- Execute the Job
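For reference, a consolidated, commented sketch of the delta window logic from the two scripts above (datastore 'cdc' and the single-row CDC_TIME table as before; the to_date format string is inferred from the sample timestamp '1996.01.01 12.00.00'):

# Delta start script: resume from where the previous load stopped
$GV_STARTTIME = to_date(sql('cdc', 'SELECT LAST_UPDATE FROM CDC_TIME'), 'YYYY.MM.DD HH24.MI.SS');
$GV_ENDTIME = sysdate();  # new upper bound for this run

# Delta end script: move the stored timestamp forward for the next run
sql('cdc', 'UPDATE CDC_TIME SET LAST_UPDATE = {$GV_ENDTIME}');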