Ignore duplicates when importing from CSV

I am using a PostgreSQL database. After creating my table, I need to populate it from a CSV file. However, the CSV file contains duplicate rows that violate the primary key constraint, so the database throws an error and the import fails. Any ideas on how to tell the database to ignore duplicates when importing from the CSV? Writing a script to remove them from the CSV file beforehand is not an option. Any workarounds are welcome. Thanks! :)

1 answer

In PostgreSQL, duplicate rows are not allowed if they violate a unique constraint. I believe your best option is to import the CSV file into a temporary table that has no constraints, remove the duplicate rows from it, and finally insert from that temporary table into the final table.
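
For example, a minimal sketch of that workflow, assuming a target table named items(id integer primary key, name text) and a CSV file at /tmp/items.csv (adjust the table, columns, and path to your schema):

```sql
-- Staging table copies the column layout of items but not its
-- primary key, so duplicate rows load without errors.
CREATE TEMP TABLE items_staging (LIKE items INCLUDING DEFAULTS);

-- Load the raw CSV, duplicates and all.
COPY items_staging FROM '/tmp/items.csv' WITH (FORMAT csv, HEADER true);

-- Keep only one row per primary key value when copying into the real table.
INSERT INTO items (id, name)
SELECT DISTINCT ON (id) id, name
FROM items_staging
ORDER BY id;

DROP TABLE items_staging;
```

Note that COPY ... FROM a server-side file needs the appropriate privileges; if you run this from psql as a regular user, use \copy instead, which reads the file on the client side.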

