Bulk Loading a CSV Into an Existing Table

This document explains how to bulk load a CSV file into an existing table when using Workbench (or a similar client tool).

Syntax

COPY [schema_name].table_name [ ( column_name [, ... ] ) ]
        FROM 'filename'
        [ [ WITH ] ( option [, ...] ) ]
        ;

    where option can be one of:
        DIRECTORY
        OFFSET N
        LIMIT N
        DELIMITER 'delimiter'
        RECORD DELIMITER 'record delimiter'
        ERROR_LOG 'error_log_filename'
        ERROR_VERBOSITY { 0 | 1 }
        STOP AFTER N ERRORS
        PARSERS { '[column_name=parser_format, ...]' }
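
Several options can be combined in a single statement. The sketch below is illustrative: it assumes that OFFSET N starts loading at record N of the file (so OFFSET 2 skips a single header row) and that LIMIT caps the number of records loaded; the table and path names are examples only.

    copy customers
        from '/home/user1/demo/customers.csv'
        with offset 2 delimiter ',' limit 1000;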

Example

This command copies data from a CSV file into the table called 'customers', writing any errors to the specified error log file.

Notice that no schema name was used; therefore, the public schema is used by default.

copy customers from '/home/user1/demo/customers.csv' with delimiter ',' error_log '/home/user1/demo/logs/customers.log' error_verbosity 1;
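
The syntax above also allows a schema prefix and an explicit column list, which is useful when the CSV contains only a subset of the table's columns. The schema and column names below are illustrative, not taken from the example above:

    copy sales.customers (id, name, email)
        from '/home/user1/demo/customers.csv'
        with delimiter ',' error_log '/home/user1/demo/logs/customers.log' error_verbosity 1;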

See also Best Practices when loading files to tables

If you have any questions, feel free to contact SQream support.

17-May-2017