I use PHP and PostgreSQL. I read the CSV file with PHP and turn each line into the following format:
{{line1 column1, line1 column2, line1 column3}, {line2 column1, line2 column2, line2 column3}}
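A minimal sketch of how the PHP side might build that 2-D array literal (the function name, and the assumption that values need quoting and escaping, are mine, not part of the original):

```php
<?php
// Hypothetical helper: read a CSV file and build the PostgreSQL
// 2-D array literal {{...},{...}} as a single string, so it can be
// passed as one parameter to the import function.

function csvToPgArrayLiteral(string $csvPath): string
{
    $rows = [];
    $handle = fopen($csvPath, 'r');
    while (($cols = fgetcsv($handle)) !== false) {
        if ($cols === [null]) {
            continue; // fgetcsv reports a blank line as a single null field
        }
        // Quote every value and escape backslashes and double quotes,
        // so commas or braces inside a field cannot break the literal.
        $quoted = array_map(
            fn ($v) => '"' . str_replace(['\\', '"'], ['\\\\', '\\"'], (string) $v) . '"',
            $cols
        );
        $rows[] = '{' . implode(',', $quoted) . '}';
    }
    fclose($handle);
    return '{' . implode(',', $rows) . '}';
}
```

The resulting string can then be sent with `pg_query_params()`, cast on the server side to `varchar[][]`, which keeps the whole import in a single statement.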
Everything runs in a single transaction, because the whole payload is passed as one string parameter to a PostgreSQL function.
Inside the function I can validate every record (formatting, data volume, etc.), and importing 500,000 records takes about 3 minutes.
To read the data inside the PostgreSQL function:
DECLARE
    d varchar[];
BEGIN
    FOREACH d SLICE 1 IN ARRAY p_dados LOOP
        INSERT INTO schema.table (column1, column2, column3)
        VALUES (d[1], d[2]::INTEGER, d[3]);
    END LOOP;
END;
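For completeness, the full wrapper around that loop might look like this (the function name `import_rows` and the `RETURNS void` signature are assumptions; in PostgreSQL, `varchar[]` and `varchar[][]` are the same type, so either spelling works for the parameter):

```sql
-- Sketch of the complete function; the body is the loop shown above.
CREATE OR REPLACE FUNCTION schema.import_rows(p_dados varchar[])
RETURNS void AS $$
DECLARE
    d varchar[];
BEGIN
    FOREACH d SLICE 1 IN ARRAY p_dados LOOP
        INSERT INTO schema.table (column1, column2, column3)
        VALUES (d[1], d[2]::INTEGER, d[3]);
    END LOOP;
END;
$$ LANGUAGE plpgsql;
```

`SLICE 1` makes each iteration yield one row of the 2-D array as a 1-D array, which is why `d[1]`, `d[2]`, `d[3]` address the columns.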
abfurlan