Deleting records in batches can be done in a PL/SQL loop, but it is generally considered bad practice, since the whole delete should normally be treated as a single transaction; and you can't do it from the SQL*Loader control file anyway. Your database administrator should have sized the UNDO to cope with the work you need to do.
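For illustration only, a minimal sketch of that discouraged batch-delete loop (the batch size is an assumption; the intermediate commits are exactly what breaks the single-transaction model):

begin
  loop
    delete from import_abc where rownum <= 10000;  -- remove at most 10,000 rows per pass
    exit when sql%rowcount = 0;                    -- stop once nothing is left to delete
    commit;                                        -- intermediate commit keeps UNDO small, but splits the delete into many transactions
  end loop;
  commit;
end;
/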
If you are deleting the entire table you will almost certainly be better off truncating it anyway, either in the control file:
options(skip=1,load=250000,errors=0,ROWS=30000,BINDSIZE=10485760)
load data
infile 'G:1.csv' "str '^_^'"
truncate
into table IMPORT_ABC
...
Or as a separate truncate statement in SQL*Plus / SQL Developer / some other client before you start the load:
truncate table import_abc;
The downside is that your table will appear empty to other users while the new rows are being loaded, but if it's a dedicated import area (guessing from the name) that may not matter anyway.
If your UNDO really is that small then you may have to run multiple loads, in which case - probably somewhat obviously - you need to make sure you only have truncate in the control file for the first one (or use the separate truncate statement), and append in subsequent control files instead, as you noted in the comments.
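For example, the first and later control files might start like this (a sketch; the second file name is an assumption, and only the truncate/append keyword actually differs):

-- first load: clears the table before loading
load data infile 'G:1.csv' "str '^_^'" truncate into table IMPORT_ABC ...

-- subsequent loads: keep the rows already loaded and add to them
load data infile 'G:2.csv' "str '^_^'" append into table IMPORT_ABC ...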
You could also consider external tables if you're using this data as a basis to populate something else, since there is no UNDO generated when you replace the external data source. You would probably need to talk to your database administrator about setting that up and granting you the necessary permissions on the directory objects.
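As a rough illustration of that approach - the directory path, column list and target table below are all assumptions, not taken from your setup - an external table over the same file could look something like:

-- directory object pointing at the folder holding the CSV (path assumed);
-- usually created by the DBA, who then grants READ/WRITE on it to you
create or replace directory import_dir as 'G:\';

-- external table: queries read the file directly, so "reloading" is just
-- swapping the file, with no UNDO generated
create table import_abc_ext (
  col1 varchar2(100),
  col2 number
)
organization external (
  type oracle_loader
  default directory import_dir
  access parameters (
    records delimited by '^_^'
    skip 1
    fields terminated by ','
  )
  location ('1.csv')
);

-- then populate the real target from it (target name assumed)
insert into some_target select * from import_abc_ext;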