Click the Destination button to define extraction-specific settings.


Table Name

Determines the name of the target table. The following options are available:

  • Same as name of SAP object: Uses the name of the SAP object.
  • Same as name of extraction: Uses the name of the extraction.
  • Custom: Define your own table name.

  • Append timestamp: Adds the current timestamp in the format [_YYYY_MM_DD_hh_mm_ss_fff] to the table name.
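As a sketch, the suffix format can be reproduced in Python (the function name is illustrative, not part of the product; note that `%f` yields microseconds, so only the first three digits are kept as milliseconds):

```python
from datetime import datetime

def timestamp_suffix(now: datetime) -> str:
    # Build the suffix _YYYY_MM_DD_hh_mm_ss_fff described above.
    return now.strftime("_%Y_%m_%d_%H_%M_%S_") + f"{now.microsecond // 1000:03d}"

# Example with a fixed point in time:
print(timestamp_suffix(datetime(2024, 1, 31, 14, 5, 9, 123456)))
# -> _2024_01_31_14_05_09_123
```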

Column Name Style

Defines how the column names are formed. The following options are available:


  • Code: The technical column name from SAP is used as the column name, e.g. MAKTX
  • PrefixedCode: The technical name of the table and the column name are joined with a tilde, e.g. MAKT~MAKTX
  • CodeAndText: The technical name and the description of the column from SAP are joined with an underscore and used as the column name, e.g. MAKTX_Material Description (Short Text)
  • TextAndCode: The description and the technical name of the column from SAP are joined with an underscore and used as the column name, e.g. Material Description (Short Text)_MAKTX
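The four styles can be summarized in a short Python sketch (the function is illustrative; the style names match the options above, using the MAKT example):

```python
def column_name(style: str, table: str, code: str, text: str) -> str:
    """Form a column name according to the documented style options."""
    if style == "Code":
        return code                  # e.g. MAKTX
    if style == "PrefixedCode":
        return f"{table}~{code}"     # e.g. MAKT~MAKTX
    if style == "CodeAndText":
        return f"{code}_{text}"      # e.g. MAKTX_Material Description (Short Text)
    if style == "TextAndCode":
        return f"{text}_{code}"      # e.g. Material Description (Short Text)_MAKTX
    raise ValueError(f"unknown style: {style}")

print(column_name("PrefixedCode", "MAKT", "MAKTX", "Material Description (Short Text)"))
# -> MAKT~MAKTX
```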

Date conversion

Convert date strings
Converts the character-type SAP date (YYYYMMDD, e.g. 19900101) to a formatted date (YYYY-MM-DD, e.g. 1990-01-01). In the data target, the SAP date then has a real date type instead of a string type.

Convert invalid dates to
If an SAP date cannot be converted to a valid date, the invalid value is converted to the date entered here. NULL is supported as a value.

When an invalid SAP date is converted, the two special cases 00000000 and 9999XXXX are checked first.

Convert 00000000 to
Converts the SAP date 00000000 to this value.

Convert 9999XXXX to
Converts the SAP date 9999XXXX to this value.
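The conversion logic described above can be sketched in Python (an illustration only, not the product's implementation; the parameter names are invented, and 9999XXXX is interpreted here as any date beginning with 9999):

```python
from datetime import date

def convert_sap_date(raw: str, zero_value=None, max_value=None, invalid_value=None):
    # The two special cases are checked first, as described above.
    if raw == "00000000":
        return zero_value
    if raw.startswith("9999"):
        return max_value
    try:
        return date(int(raw[:4]), int(raw[4:6]), int(raw[6:8]))
    except ValueError:
        return invalid_value  # fallback for dates that cannot be converted

print(convert_sap_date("19900101"))   # -> 1990-01-01
print(convert_sap_date("00000000"))   # -> None (NULL)
print(convert_sap_date("20231301", invalid_value=date(1, 1, 1)))  # -> 0001-01-01
```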

Preparation

Defines the action on the target database before the data is inserted into the target table.

  • Drop & Create: Remove table if available and create new table (default).
  • Truncate Or Create: Empty table if available, otherwise create.
  • Create If Not Exists: Create table if not available.
  • Prepare Merge: Prepares the merge process, e.g. by creating a temporary staging table. For more information, see the About Merging section.
  • None: No action.
  • Custom SQL: Here you can define your own script. See the Custom SQL section below.

If you only want to create the table in the first step and do not want to insert any data, you have two options:

  1. Copy the SQL statement and execute it directly on the target database.
  2. Select the None option for Row Processing and execute the extraction.

Once the table is created, you can change the table definition as needed, for example by creating corresponding key fields and indexes or adding additional fields.

Row Processing

Defines how the data is inserted into the target table.

  • Insert: Insert records (default).
  • Fill merge staging table: Insert records into the staging table.
  • None: No action.
  • Custom SQL: Here you can define your own script. See the Custom SQL section below.
  • Merge (deprecated): This option is obsolete. Please use the Fill merge staging table option and check the About Merging section.

Finalization

Defines the action on the target database after the data has been successfully inserted into the target table.

  • Finalize Merge: Closes the merge process and deletes the temporary staging table, for example. For more information, see About Merging.
  • None: No action (default).
  • Custom SQL: Here you can define your own script. See the Custom SQL section below.

About Merging

The prerequisite is that you have a table in which you want to merge the new data with the existing data. Ideally, you created this table in the first run with the corresponding preparation option and filled it with data with the Row Processing option Insert.

In the subsequent runs, you now want to merge the new data records with the existing ones. You therefore set the following options:

  • Preparation to Prepare Merge,
  • Row Processing to Fill merge staging table, and
  • Finalization to Finalize Merge.

The merge process is performed using a staging table and takes place in three steps. In the first step, a temporary table is created in which the data is inserted in the second step. In the third step, the temporary table is merged with the target table and then the temporary table is deleted.


The Merge command updates existing records or inserts non-existing ones. Which fields are updated can be found in the SQL statement.
The SQL command can be changed if necessary, e.g. to exclude certain columns from the update.
Fields that do not appear in the SQL statement are not affected by changes.

To execute the merge command quickly, it is important that an appropriate index exists. Without an index, executing the merge command can take a long time, depending on the amount of data.
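The three-step merge pattern can be sketched with SQLite (an illustration of the pattern only; the SQL actually generated depends on the target DBMS, and the table and staging-table names here are invented, based on the MAKT example). SQLite has no MERGE statement, so an UPSERT expresses the same update-or-insert idea:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE MAKT (MATNR TEXT PRIMARY KEY, MAKTX TEXT)")
con.execute("INSERT INTO MAKT VALUES ('100', 'Old text'), ('200', 'Kept')")

# Step 1 (Prepare Merge): create a temporary staging table.
con.execute("CREATE TEMP TABLE MAKT_staging (MATNR TEXT, MAKTX TEXT)")

# Step 2 (Fill merge staging table): insert the newly extracted rows.
con.execute("INSERT INTO MAKT_staging VALUES ('100', 'New text'), ('300', 'Inserted')")

# Step 3 (Finalize Merge): merge staging into the target table,
# then drop the temporary staging table.
con.execute("""
    INSERT INTO MAKT
    SELECT * FROM MAKT_staging WHERE true
    ON CONFLICT(MATNR) DO UPDATE SET MAKTX = excluded.MAKTX
""")
con.execute("DROP TABLE MAKT_staging")

print(sorted(con.execute("SELECT * FROM MAKT").fetchall()))
# -> [('100', 'New text'), ('200', 'Kept'), ('300', 'Inserted')]
```

The primary key on MATNR plays the role of the index mentioned above: the conflict target needs it, and it keeps the merge fast.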

Custom SQL

Here you can define your own SQL or script expressions. You can use existing SQL commands as templates.

Templates

You can write your own SQL expressions and thus adapt the loading of the data to your needs.
You can also, for example, execute stored procedures that exist in the database. To do this, use the SQL templates provided in the following phases:

  • Preparation (e.g. Drop & Create or Create if Not Exists)
  • Row Processing (e.g. Insert or Merge) and
  • Finalization.

Script Expressions

You can now also use script expressions for the Custom SQL command. You can find more information on the Script Expressions page (under Advanced Techniques).


Among other things, you can use the ExistsTable(tableName) command to verify the existence of a table. This function was introduced because some database systems only support this to a limited extent.

SQL script
#{
   iif
   (
      ExistsTable("MAKT"),
      "TRUNCATE TABLE \"MAKT\";",
      "
         CREATE TABLE \"MAKT\"(
            \"MATNR\" VARCHAR(18),
            \"SPRAS\" VARCHAR(2),
            \"MAKTX\" VARCHAR(40));
      "
   )
}#

Debugging

When the Debugging checkbox is selected, the default BULK insert is deactivated when writing to the database.

This enables detailed error analysis if certain data rows cannot be persisted on the database. Possible causes could be incorrect values with regard to the stored data type.

Deactivate debugging again after the successful error analysis; otherwise, the performance of the database write processes remains low.


Transaction style

One Transaction
Prepare, Row Processing and Finalization are all performed in a single transaction.
Advantage: clean rollback of all changes.
Disadvantage: possibly extensive locking during the entire extraction period.

Three Transactions
Prepare, Row Processing and Finalization are each executed in a separate transaction.
Advantage: clean rollback of the individual sections, possibly shorter locking phases than with One Transaction (e.g. with DDL in the Prepare, the entire DB is only locked during the Prepare and not for the entire extraction duration)
Disadvantage: No rollback of previous step possible (error in Row Processing only rolls back Row-Processing changes, but not Prepare)

RowProcessingOnly
Only Row Processing is executed in a transaction. Prepare and Finalization run without an explicit transaction (implicit commits).
Advantage: allows DDL in Prepare and Finalization for DBMSs that do not allow DDL in explicit transactions (e.g. Azure DWH).
Disadvantage: No rollback of Prepare/Finalization, not even as a partial step.

No Transaction
No explicit transactions.
Advantage: No transaction management required by the DBMS (locking, DB transaction log, etc.). This means no locking and possible performance advantages.
Disadvantage: No rollback.
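The Three Transactions style can be sketched with Python's sqlite3 module (an illustration only; the helper name is invented, and whether DDL participates in transactions differs per DBMS). Each phase gets its own BEGIN/COMMIT, so a failing phase rolls back only its own changes:

```python
import sqlite3

con = sqlite3.connect(":memory:", isolation_level=None)  # manual transaction control

def run_phase(statements):
    # One transaction per phase, as in the "Three Transactions" style.
    con.execute("BEGIN")
    try:
        for sql in statements:
            con.execute(sql)
        con.execute("COMMIT")
    except sqlite3.Error:
        con.execute("ROLLBACK")  # roll back only this phase
        raise

run_phase(["CREATE TABLE t (id INTEGER)"])                             # Prepare
run_phase(["INSERT INTO t VALUES (1)", "INSERT INTO t VALUES (2)"])    # Row Processing
try:
    # A failing Finalization: the first insert succeeds, the second fails,
    # so the whole phase is rolled back -- but earlier phases are kept.
    run_phase(["INSERT INTO t VALUES (3)", "INSERT INTO nope VALUES (1)"])
except sqlite3.Error:
    pass

print(con.execute("SELECT COUNT(*) FROM t").fetchone()[0])
# -> 2
```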