Category: Export / Database


This action exports the current dataset into a database table. The target table must already exist; if it doesn't, it can be created with the Database command action prior to exporting data into it.

Note that this action effectively appends rows to the target table. If you need to remove old data first, use the Database command action. If you need to update existing rows, use the Update database table action.
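Because the action only appends, a full refresh typically means deleting old rows first and then exporting. Below is a minimal sketch of that delete-then-append pattern, using Python's sqlite3 as a stand-in for the target database; the sales table and its columns are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.execute("INSERT INTO sales VALUES ('East', 100.0)")  # pre-existing data

# Step 1: remove old data (what a Database command action would run)
conn.execute("DELETE FROM sales")

# Step 2: append the new dataset (what this export action does)
rows = [("North", 250.0), ("South", 175.5)]
conn.executemany("INSERT INTO sales (region, amount) VALUES (?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # prints 2
```

In EasyMorph itself, the DELETE statement would run in a Database command action placed before this export action.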

Action settings

Setting      Description
Connector*   Select or create a configured connector.

* Setting can be specified using a parameter.

Export settings

Setting          Description
Table name*      Select the table to export to. Click the refresh button if the table list does not populate automatically or needs to be updated.
Column mapping   Select the method for mapping the dataset columns to the fields in the database table. Options: Automatic (export to database columns with the same names) or Explicit (manually map the dataset columns on the left to the database fields on the right; use the "Map possible" link to auto-select matching field names).

* Setting can be specified using a parameter.

Options settings

Setting      Description
When value doesn't match column type and can't be converted
             Choose how EasyMorph handles cases where a dataset value's type doesn't match the database field's type and can't be converted. Options: Export as NULL (the column's values will be empty) or Fail batch (the workflow will stop and display an error).
When exporting a batch fails
             Choose how EasyMorph handles situations when exporting a batch fails. Options: Halt execution, roll back already exported batches (no error capture is performed; if a batch fails, EasyMorph reduces the batch size logarithmically and retries exporting until it singles out the exact row that fails and produces an error, then everything rolls back to the point before row insertion) or Add new column to flag rows in batches that failed (a new column, Export errors, is appended and exporting begins; if a batch fails, all rows in the batch are marked as failed in that column and exporting continues; successfully exported batches are not rolled back).

Custom SQL settings

Setting      Description
Execute custom SQL before exporting
             To run an ad hoc SQL statement on the data prior to exporting, check this setting and enter the statement. Click the "Edit…" button to open the "Parameterized text editor" if necessary to build the statement.



Under the hood, exporting is performed with SQL INSERT statements in batches of 10, 100, 1,000, or 10,000 rows. Bigger batches can be faster and are recommended for narrow tables (i.e. tables with few fields). For wide tables (i.e. tables with hundreds of fields), or when columns contain long strings, use smaller batches; otherwise an export statement may exceed the SQL statement limit of the database driver and fail.

Some databases have a performance bottleneck for batches of SQL INSERT statements, which may lead to slow exports (especially on wide tables). Consider using the Bulk export action, or bulk load statements with the Database command action, when exporting more than 1 million rows at once.
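The batching trade-off can be sketched with multi-row INSERT statements, using Python's sqlite3 as a stand-in (the target table and batch size are hypothetical): bigger batches mean fewer statements, but each statement carries more parameters and can hit the driver's limits on wide tables.

```python
import sqlite3

def export_in_batches(conn, rows, batch_size=100):
    """Append rows using one multi-row INSERT statement per batch.
    Bigger batches mean fewer statements (faster for narrow tables);
    smaller batches keep each statement within the driver's
    SQL-length and parameter limits (safer for wide tables)."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        placeholders = ", ".join(["(?, ?)"] * len(batch))
        conn.execute(
            f"INSERT INTO target (id, name) VALUES {placeholders}",
            [value for row in batch for value in row],
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, name TEXT)")
export_in_batches(conn, [(i, f"row{i}") for i in range(1000)])
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # prints 1000
```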

Data type conversion

Exporting will convert cell values only in the following cases:

  • A Boolean cell exported to a numeric column becomes 1 (TRUE) or 0 (FALSE)
  • A numeric cell exported to a date or date/time column is converted to a date or date/time literal according to the SQL dialect of the target database
  • Any cell exported to a text column, or to a column of unknown type, is exported as a text literal
  • Error cells are always exported as NULL

In all other cases, when the cell type doesn't match the target column type, either a NULL value is exported or the export fails, depending on the action settings.
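The rules above can be sketched as a single dispatch function. This is a hypothetical helper, not EasyMorph's actual code; the 1899-12-30 serial-date base and ISO 8601 date literals are assumptions:

```python
from datetime import date, timedelta

class ErrorCell:
    """Stand-in for an EasyMorph error cell."""

def convert_cell(value, column_type):
    """Return the value to export, or None to export NULL."""
    if isinstance(value, ErrorCell):
        return None                       # error cells -> always NULL
    if column_type in ("text", "unknown"):
        return str(value)                 # any cell -> text literal
    if column_type == "numeric" and isinstance(value, bool):
        return 1 if value else 0          # Boolean -> 1 / 0
    if column_type == "numeric" and isinstance(value, (int, float)):
        return value                      # types already match
    if column_type in ("date", "datetime") and isinstance(value, (int, float)):
        # numeric cell -> date literal in the target SQL dialect
        # (assumed serial-date base and ISO 8601 formatting)
        return (date(1899, 12, 30) + timedelta(days=value)).isoformat()
    return None                           # no conversion: NULL (or fail batch)

print(convert_cell(True, "numeric"))   # prints 1
print(convert_cell(2, "date"))         # prints 1900-01-01
```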


transformations/exportdb.txt · Last modified: 2023/10/14 16:20 by craigt
