Exports the current dataset into a database table whose name is specified either explicitly or via a parameter. The table must already exist. If it doesn't, it can be created with the Database command action prior to exporting data into it.
Under the hood, exporting is performed with SQL INSERT statements in batches of 10, 100, 1'000, or 10'000 rows. Bigger batches can be faster and are recommended for narrow tables (i.e. tables with few fields). For wide tables (i.e. tables with hundreds of fields), or when columns contain long strings, use smaller batches; otherwise an export statement may fail due to the SQL statement size limit of the database driver.
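The batched-INSERT approach can be sketched as below. This is a minimal illustration, not EasyMorph's implementation: the table, column names, batch size, and SQLite backend are all assumptions for the example.

```python
import sqlite3

BATCH_SIZE = 1000  # smaller batches suit wide tables or long strings

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, name TEXT)")

# A stand-in dataset of 2'500 rows to be exported.
rows = [(i, f"row {i}") for i in range(2500)]

# Insert rows batch by batch; each executemany() call sends one batch.
for start in range(0, len(rows), BATCH_SIZE):
    batch = rows[start:start + BATCH_SIZE]
    conn.executemany("INSERT INTO target (id, name) VALUES (?, ?)", batch)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 2500
```

Note that the last batch may be shorter than `BATCH_SIZE`; the slice handles that automatically.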
Some databases have a performance bottleneck for batches of SQL INSERT statements, which may lead to slow exporting (especially on wide tables). Consider using the Bulk export action, or bulk load statements with the Database command action, when exporting more than 1 million rows at once.
Exporting will convert cell values only in the following cases:
In all other cases, when the cell type doesn't match the target column type, either a NULL value is exported or the export fails, depending on the action settings.
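The two type-mismatch policies can be illustrated with a small sketch. The function name, parameter, and policy values here are hypothetical; they only model the "export NULL" versus "fail" behavior described above.

```python
def convert_cell(value, target_type, on_mismatch="null"):
    """Return the value if it matches the target column type;
    otherwise export NULL or fail, depending on the policy."""
    if isinstance(value, target_type):
        return value
    if on_mismatch == "null":
        return None  # a NULL value is exported instead of the mismatched cell
    # the alternative policy: the export fails with an error
    raise TypeError(f"Cell {value!r} does not match column type {target_type.__name__}")

print(convert_cell(42, int))                       # 42
print(convert_cell("abc", int, on_mismatch="null"))  # None
```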
The action's "When exporting a batch fails" option specifies the behavior when a batch of rows fails to export.
|Error capture mode|Behavior|
|---|---|
|Halt execution, roll back already exported batches|No errors are captured. If a batch fails, EasyMorph reduces the batch size logarithmically and retries exporting until it singles out the exact row that fails, then produces an error. After that, everything is rolled back to the point before row insertion began.|
|Add new column to flag rows in batches that fail|A new column labelled "Export errors" is appended and exporting begins. If a batch fails, all rows in that batch are marked as failed in the column and exporting continues. Successfully exported batches are not rolled back.|
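The logarithmic batch-size reduction used to single out a failing row can be sketched as a recursive halving search. This is an illustrative assumption about the general technique, not EasyMorph's actual code; `insert_batch` is a stand-in for the database call, with one poisoned row that raises an error.

```python
def insert_batch(rows):
    """Stand-in for sending one batch of INSERTs; fails on a poisoned row."""
    if "bad" in rows:
        raise ValueError("batch failed")

def find_failing_row(rows):
    """Halve a failing batch recursively until a single failing row remains."""
    try:
        insert_batch(rows)
        return None  # the whole (sub)batch exported fine
    except ValueError:
        if len(rows) == 1:
            return rows[0]  # singled out the exact row that fails
        mid = len(rows) // 2
        # Retry each half; the failing row is in exactly one of them.
        return find_failing_row(rows[:mid]) or find_failing_row(rows[mid:])

print(find_failing_row(["r1", "r2", "bad", "r4", "r5"]))  # bad
```

This takes O(log n) retries per failing batch, which is why isolating the faulty row is cheap compared to re-inserting row by row.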