1. Load cube data from BI to HANA DB :- (Delta)

  • Create a datastore for the SAP BI source
  • Create a datastore for the HANA DB


  1. Steps at BI Level:-

  • Select any cube & check whether data is present in it or not
  • Go to the InfoProvider (tab) in T-Code RSA1
  • Go to the Open Hub Destination (tab)
  • Provide

    • Name : ___________
    • Description : ___________
    • Provider : __________ (Technical name of the cube, DSOs, etc.)

  • In the next screen, in the Destination (tab), provide

  1. Destination type : Third-party tool
  2. RFC destination : RFC connection of (Data Services)
  3. Activate

  • Create a Transformation between the cube & the open hub
  • Create a DTP & activate


NOTE: The DTP extraction mode must be Delta

  • Go to the Process Chain (tab)
  • Right click on the unassigned node, create a process chain & provide the process chain name and description; you will get a prompt for the start variant, click on Create
  • In the next screen, select

Start Using Meta Chain or API

  • Save ---- Go back
  • Select our DTP ------- connect it to the start variant & activate


Steps at Data Services:-

  1. Import the open hub destination into the BI source datastore
  2. Import the database table from the HANA datastore
  3. Create a project
  4. Create a job
  5. Create a workflow
  6. Create a data flow
  7. Drag the open hub destination into the data flow
  8. Double click on it
  9. Choose to read from the process chain
  10. Select our process chain
  11. Take a Query transformation
  12. Connect the open hub to the Query transformation
  13. Drag the table from the HANA datastore
  14. Connect the Query transformation to the HANA table
  15. And map the fields of the open hub to the fields of the HANA database table.


NOTE: The open hub brings updated records only. Execute the job.


Project # 2:-


Load data from a traditional database (non-SAP system)

to the HANA database (initial & delta)



STEP – 1:-

  • Create a new job
  • Go to the source database
  • Create a table called CDC_TIME with a LAST_UPDATE column of datetime data type, and import the source table and the CDC_TIME table into Data Services


Step 2:-

  • Create a project
  • Create a job
  • Create global variables: $GV_STARTTIME & $GV_ENDTIME, with datetime data types

Step 3:-

  • Take 2 scripts, name them INITIAL-START and INITIAL-END
  • Take a new workflow and put it in between the two scripts
  • Double click on the INITIAL-START script, go to the smart editor and write the below syntax


$GV_STARTTIME = '1996.01.01 12:00:00';

$GV_ENDTIME = sysdate();

  • Double click on the INITIAL-END script, go to the smart editor and write the below syntax
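(The INITIAL-END script body is left blank in these notes. Going by the delta job's end script further down, which updates CDC_TIME, a plausible sketch is the statement below; whether it should be an INSERT, for a first run against an empty CDC_TIME table, or an UPDATE is an assumption:)

sql('cdc', 'INSERT INTO CDC_TIME VALUES ({$GV_ENDTIME})');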




Step 4:

  • Double click on the workflow
  • Create a data flow
  • Select the source table (i.e. the employee table)




  • Take a Query transformation, map the source table to the Query transformation
  • Take a template table as the target table, map the Query transformation to it

Step 5:

  • Double click on the Query transformation
  • Map the required fields from schema in to schema out
  • Select the LAST_UPDATE column
  • Go to the where clause and provide the logic as below
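(The where-clause logic itself is missing in these notes. A sketch of the usual CDC window filter, using the LAST_UPDATE column and the two global variables created in Step 2; the EMPLOYEE table qualifier is an assumption:)

EMPLOYEE.LAST_UPDATE >= $GV_STARTTIME and EMPLOYEE.LAST_UPDATE <= $GV_ENDTIME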




STEP 6:-

(HANA) Table

  • Take the HANA table as the target table, connect it to the Query transformation
  • Double click on the target HANA table, go to the Options tab, enable the "Delete data from table before loading" checkbox and enable the "Drop and re-create table" checkbox



Step 7 :

  • Validate
  • Execute the job, and you can see in the source system's CDC_TIME table that a timestamp is loaded, which is nothing but the INITIAL load time




  • Create a new job, name it as delta load.
  • Take two scripts, name them as INITIAL-START and INITIAL-END.
  • Take a new workflow, put it in between the two scripts
  • Double click on the INITIAL-START script, go to the smart editor and provide the below logic

$GV_STARTTIME = to_date(sql('cdc', 'SELECT LAST_UPDATE FROM CDC_TIME'), 'yyyy.mm.dd hh24:mi:ss');

$GV_ENDTIME = sysdate();

  • Double click on the INITIAL-END script, go to the smart editor and provide the below logic

sql('cdc', 'UPDATE CDC_TIME SET LAST_UPDATE = {$GV_ENDTIME}');





  • Take the employee table as the source table
  • Take a Query transformation and connect the two
  • Double click on the Query transformation, map the required fields from schema in to schema out
  • Take the HANA table as the target table
  • And make sure that in the target HANA table's Options tab, the following checkboxes are disabled

    • Delete data from table before loading
    • Drop and recreate table

  • Validate it
  • Execute the Job
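(These notes do not repeat a where clause for the delta data flow's Query transformation. For the delta job to pick up only the rows changed since the last run, the same window filter as in Step 5 presumably applies, e.g., with the EMPLOYEE table qualifier as an assumption:)

EMPLOYEE.LAST_UPDATE >= $GV_STARTTIME and EMPLOYEE.LAST_UPDATE <= $GV_ENDTIME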