Netezza "[08S01] Communication error" when loading external data

Receiving:

ERROR [HY008] Operation canceled
ERROR [08S01] Communication error

when trying to load an external .txt file into the Netezza database. I did this successfully in the past (as recently as last week), but today I get this error. I can connect to the database, run TRUNCATE and CREATE TABLE statements for this load, SELECT, etc. But still, no luck. The file is about 200 thousand records, and here is my code:

    INSERT INTO PTG_ITO_ETL.FINANCE_TY15_RT_TPG
    SELECT * FROM EXTERNAL 'C:\\Users\\Documents\\Data Sources\\Finance_FY15_RT\\SBTPG\\TPG_INTUIT_RT_PRODIV_20150214.TXT'
    USING (
        MAXERRORS 1
        DATESTYLE 'MDY'
        DATEDELIM '/'
        BOOLSTYLE 'Y_N'
        Y2BASE 2000
        ENCODING 'internal'
        SKIPROWS 1
        REMOTESOURCE 'ODBC'
        ESCAPECHAR '\'
    )

I tried the solution from the only other post I could find on this issue:

ERROR [08S01] Communication error while entering data into an external table in netezza

"I found in Windows 7 and Windows Server 2008 R2 TCP Chimney Settings were the culprits.

http://blogs.dirteam.com/blogs/sanderberkouwer/archive/2008/05/15/backward-compatible-networking-with-server-core.aspx

The following commands fixed the bug for me:

    netsh interface tcp set global rss=disabled
    netsh interface tcp set global chimney=disabled
    netsh interface tcp set global autotuning=disabled"

But to no avail. I'm not sure what causes this problem. I'm on Windows 7 using Aginity, Netezza version 7.0.4. Thank you!

Thanks Craig

2 answers

Try the following and check for errors

    SELECT * FROM EXTERNAL 'C:\\Users\\Documents\\Data Sources\\Finance_FY15_RT\\SBTPG\\TPG_INTUIT_RT_PRODIV_20150214.TXT'
        (field1 varchar(20000))
    USING (
        MAXERRORS 1
        Delim 199
        DATESTYLE 'MDY'
        DATEDELIM '/'
        BOOLSTYLE 'Y_N'
        Y2BASE 2000
        ENCODING 'internal'
        SKIPROWS 1
        REMOTESOURCE 'ODBC'
        ESCAPECHAR '\'
    )

Appreciate the help from @ScottMcG. After downloading the latest Netezza driver (7.0.4.7) I was able to get this working. It still failed (with the same error) a couple of times, but worked after a lot of testing.

I read other similar posts, and this error seems common; the more posts I read, the more that impression held. In one, a person loading an external table with a million-plus records hit the error every time. If necessary, you may need to split the data into smaller external files and load them into a single table. Problems seem to start somewhere around 200k+ records.

This was also run from my laptop, so the threshold might be a lot higher if you were on a box inside the data center. I would advise updating the driver first, and then, if necessary, splitting the data into smaller files.
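If you do need to split the export, a minimal sketch in Python follows. It is not part of the answer above, just one way to do it: it cuts a delimited .txt file into chunks of a chosen size and repeats the header row in each chunk, so the same SKIPROWS 1 load works on every part. The function name, the 50,000-row default, and the .partN naming are my own assumptions, not anything Netezza-specific.

```python
def split_file(src_path, rows_per_chunk=50_000):
    """Split src_path into src_path.part0, .part1, ...

    Each part repeats the original header line, so a load that
    skips the first row (SKIPROWS 1) works unchanged on every part.
    Returns the list of part file paths.
    """
    out_paths = []
    with open(src_path, "r", encoding="utf-8") as src:
        header = src.readline()          # first line is the header
        chunk, rows, part = None, 0, 0
        for line in src:
            if chunk is None:            # start a new part file
                out_paths.append(f"{src_path}.part{part}")
                chunk = open(out_paths[-1], "w", encoding="utf-8")
                chunk.write(header)
            chunk.write(line)
            rows += 1
            if rows == rows_per_chunk:   # part is full, roll over
                chunk.close()
                chunk, rows, part = None, 0, part + 1
        if chunk is not None:            # close the final, partial part
            chunk.close()
    return out_paths
```

You would then run the same INSERT ... FROM EXTERNAL statement once per part file. Chunk size is a judgment call; given the ~200k threshold mentioned above, something like 50,000 rows per file leaves a comfortable margin.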

Thanks everyone!


Source: https://habr.com/ru/post/1213566/
