Which is more efficient for bulk loading: INSERT statements or SQL*Loader? (Oracle 11g R2)

As part of a new process requirement, we will create a table that will hold about 3,000 to 4,000 records. We have a copy of these records in a plain-text (txt) file.

Loading these records into the table leaves me with two options:

  • Use a shell script to generate a SQL file containing INSERT statements for these records.

    • Using awk, shell variables, and loops, we can easily build the SQL and a small script to execute it.
  • Use SQL*Loader.

    • The only extra work is reformatting the list of entries and writing the control (ctl) file.
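As a sketch of the first option, here is a minimal awk-based generator. The file name, table name, and column layout (`staging_tbl` with `id`, `name`, `amount`) are hypothetical placeholders, assuming a comma-delimited input file:

```shell
#!/bin/sh
# Option 1 sketch: generate INSERT statements from a delimited text file
# with awk. File, table, and column names are illustrative only.

# Sample input (stand-in for the real txt file)
cat > records.txt <<'EOF'
1,WIDGET,10.50
2,GADGET,7.25
EOF

# Build one INSERT per line; passing the quote via -v q="'" avoids
# shell-quoting gymnastics inside the single-quoted awk program
awk -F',' -v q="'" '{
  printf "INSERT INTO staging_tbl (id, name, amount) VALUES (%s, %s%s%s, %s);\n", \
         $1, q, $2, q, $3
}' records.txt > load.sql

# Commit the whole batch as one transaction
echo "COMMIT;" >> load.sql
cat load.sql
```

The generated `load.sql` can then be run through `sqlplus`. Note this sketch does not escape single quotes inside the data itself, which a production version would need to handle.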

Which of the two options above would be more efficient in terms of database resources, given that this will run on the client's server?

I understand that the number of records is rather small, but we may have to repeat this operation with a larger number of records (about 60,000), so I would like to pick the better option from the very beginning.

+4
3 answers

SQL*Loader is the better choice. A DIRECT path load into a NOLOGGING table generates minimal redo (it writes formatted data blocks directly, above the high-water mark), so it is very fast. Be aware, though, that if a direct load fails partway through, indexes on the table can be left in an unusable state.

That said, SQL*Loader is fast either way. In one case our DBA had us stick with a CONVENTIONAL path load, which goes through normal INSERT processing and full redo logging, and it still handled 200+ records without trouble. For your volumes either path will cope; the direct path simply scales better.
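The direct-path setup described above can be sketched with a control file like the following. The table, user, and file names are hypothetical, and the `sqlldr` call is left commented out since it needs a reachable database:

```shell
#!/bin/sh
# Sketch of a direct-path SQL*Loader setup. Names are illustrative only.

cat > load.ctl <<'EOF'
OPTIONS (DIRECT=TRUE)
UNRECOVERABLE
LOAD DATA
INFILE 'records.txt'
APPEND
INTO TABLE staging_tbl
FIELDS TERMINATED BY ','
(id, name, amount)
EOF

# Invocation against a real database would look like:
# sqlldr userid=app_user/secret control=load.ctl log=load.log
cat load.ctl
```

`UNRECOVERABLE` suppresses redo for the loaded data (like NOLOGGING), so take a backup after the load if recoverability matters.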

+4

SQL*Loader will be faster than INSERT statements, especially at the 60,000-record scale.

+2

Of the two options you mentioned, SQL*Loader is certainly the way to go: it is much faster and more efficient.

However, I would choose a different approach: external tables. An external table gives you all the advantages of SQL*Loader while letting you query your external CSV file as if it were a regular database table.
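A minimal sketch of the external-table approach, assuming a directory object `load_dir` already points at the folder holding the file; all table and column names here are hypothetical:

```sql
-- External table over the same comma-delimited file (illustrative names)
CREATE TABLE staging_ext (
  id     NUMBER,
  name   VARCHAR2(100),
  amount NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('records.txt')
)
REJECT LIMIT UNLIMITED;

-- Load with one set-based statement; APPEND requests a direct-path insert
INSERT /*+ APPEND */ INTO staging_tbl SELECT * FROM staging_ext;
COMMIT;
```

Repeating the load later (for the ~60,000-record case) is then just a matter of dropping the new file into the directory and re-running the INSERT.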

+2