transformations:exportdb

Exports the current dataset into the database table whose name is specified either explicitly or using a [[:parameters|parameter]]. The table must already exist. If the table doesn't exist, it can be created using the [[sqlcommand|Database command]] action prior to exporting data into it.
  
Note that the operation effectively //appends// rows to the target table. If you need to remove old data, use the [[sqlcommand|Database command]] action. If you need to update rows, use the [[transformations:updatedb|Update database table]] action.
  
===Performance===
Under the hood, the exporting is performed using SQL INSERT statements in batches of 10, 100, 1,000, or 10,000 rows. Exporting in bigger batches can be faster and is recommended for narrow tables (i.e. tables with few fields). For wide tables (i.e. tables with hundreds of fields) or when columns contain long strings, use smaller batches; otherwise an export statement may fail due to the SQL statement limit of the database driver.

Some databases have a performance bottleneck for batches of SQL INSERT statements which may lead to slow performance (especially on wide tables). Consider using the [[transformations:bulkexportdb|Bulk export]] action or bulk load statements with the [[sqlcommand|Database command]] action when exporting more than 1 million rows at once.
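The batching strategy described above can be sketched with a generic database API. This is an illustrative example using Python's built-in ''sqlite3'' module, not EasyMorph's actual implementation; the function and table names are made up for the demonstration:

```python
import sqlite3

def export_in_batches(conn, table, rows, batch_size=1000):
    """Insert rows into `table` in fixed-size batches of INSERT statements."""
    placeholders = ", ".join("?" for _ in rows[0])
    sql = f"INSERT INTO {table} VALUES ({placeholders})"
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        conn.executemany(sql, batch)  # send this batch of INSERTs together
    conn.commit()  # commit once after all batches succeed

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
export_in_batches(conn, "t", [(i, f"row{i}") for i in range(2500)], batch_size=1000)
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 2500
```

A smaller ''batch_size'' keeps each statement short (useful for wide tables or long strings), while a larger one reduces round-trips on narrow tables.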
===Data type conversion===
Exporting will convert cell values only in the following cases:
  
  * Boolean cell to a numeric column exported as 1 (TRUE) or 0 (FALSE)
  * Numeric cell to a date or a date/time column - the cell value is converted to a date or a date-time literal according to the SQL dialect of the target database
  * Any cell to a text column or a column of unknown type is exported as a text literal
  * Error cells are always exported as NULL
  
In all the other cases, when the cell type doesn't match the target column type, either a NULL value is exported or the exporting fails, depending on the action settings.
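As a rough model, the conversion rules above could be sketched like this. This is a hypothetical illustration, not EasyMorph code; the type names and the numeric-to-date epoch (a spreadsheet-style day serial counted from 1899-12-30) are assumptions:

```python
import datetime

def convert_cell(value, column_type):
    """Return a SQL-ready value per the rules above; None means SQL NULL."""
    if isinstance(value, Exception):
        return None                              # error cells -> always NULL
    if column_type in ("text", "unknown"):
        return str(value)                        # any cell -> text literal
    if isinstance(value, bool) and column_type == "numeric":
        return 1 if value else 0                 # boolean -> 1 (TRUE) / 0 (FALSE)
    if isinstance(value, (int, float)) and not isinstance(value, bool):
        if column_type == "numeric":
            return value                         # types match: exported as-is
        if column_type in ("date", "datetime"):  # numeric -> date literal
            day = datetime.date(1899, 12, 30) + datetime.timedelta(days=int(value))
            return day.isoformat()
    return None  # type mismatch -> NULL is exported (or the export fails)
```

For example, ''convert_cell(True, "numeric")'' yields ''1'', while a text cell sent to a numeric column falls through to the mismatch case.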
  
===Error capture===

The action has an option, "When exporting a batch fails", that specifies the behavior in case a batch of rows fails to export.
  
^ Error capture mode ^ Behavior ^
| Halt execution, roll back already exported batches  | No error capture is performed. If a batch fails, EasyMorph reduces the batch size logarithmically and retries exporting until it singles out the exact row that fails and produces an error. After that, everything rolls back to the point before row insertion. |
| Add new column to flag rows in batches that fail  | A new column labelled "Export errors" is appended and exporting begins. If a batch fails, all rows in the batch are marked as failed in that column, and exporting continues. Successfully exported batches are not rolled back. |
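The "reduce the batch size and retry" narrowing described in the first mode can be sketched generically. This is an illustrative algorithm under assumed names (''try_insert'' stands in for a transactional batch insert that inserts nothing on failure), not EasyMorph's actual code:

```python
def find_failing_row(rows, try_insert, batch_size=1000):
    """Isolate the first failing row by retrying ever-smaller batches.

    `try_insert(batch)` returns True on success, False on failure, and is
    assumed to insert nothing when it fails. Returns the index of the
    failing row, or None if every batch succeeds.
    """
    start = 0
    while start < len(rows):
        if try_insert(rows[start:start + batch_size]):
            start += batch_size           # batch succeeded, move on
        elif batch_size == 1:
            return start                  # isolated the exact failing row
        else:
            batch_size = max(1, batch_size // 10)  # shrink logarithmically, retry

    return None

# Simulate a dataset where only row 1234 fails to insert:
print(find_failing_row(list(range(5000)), lambda batch: 1234 not in batch))  # 1234
```

Each failure costs only a logarithmic number of retries, so the offending row is found without falling back to row-by-row inserts for the whole dataset.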
  
===See also===
  
  * [[http://easymorph.com/learn/export-database.html|Tutorial: exporting data into a database table]]
  * [[transformations:bulkexportdb|Bulk export into database]]
  * [[transformations:sqlcommand|Database command]]
  * [[transformations:updatedb|Update database table]]
transformations/exportdb.txt · Last modified: 2023/10/14 16:20 by craigt
