SSIS Converts Varchar2 to DT_STR

We have an SSIS package that loads data from an Oracle database into a SQL Server data warehouse. The data warehouse has several environments: development, test, and production. Dev and Test share a machine; Prod runs on its own.

When the SSIS package runs on the PROD machine, it reads the Varchar2 columns from the Oracle source as DT_WSTR and stores them in NVarchar columns in MSSQL, i.e. every step is Unicode-aware.

When the same package runs against the same source database on the DEV/Test box, it somehow sees the external columns as Varchar, maps them to DT_STR in the data flow, and refuses to store them in the NVarchar columns.

All OSs are Win2K8r2, MSSQL 2008 64-bit. The package runs in 32-bit mode, and the same behavior occurs whether it is started from BIDS or from a SQL Agent job.

Does anyone care to guess why? I have already seen the suggestion to disable external metadata validation ( https://stackoverflow.com/a/165608/ ), but that is not practical in our situation.

+4
oracle unicode ssis
Oct 21 '13 at 12:56
3 answers

An old question, I know, but it still seems relevant. And since I could not find a suitable answer in the last 3 months of searching, I figure now is a good time to publish my findings.

I had the same curious behavior and finally managed to solve it. My setup looked like this:

  • Oracle 10g R2 database on Windows 2003 Server (let's call it ORA)
  • Dev machine with Windows 8, Visual Studio 2012 + SSDT, Sql Express 2012, ODT 12.1.0.21 (let's call it DEV)
  • Sql 2012 server on Windows 2012 Server, Oracle Client 11.2 (let's call it TEST)

Both DEV and TEST connected to ORA. DEV reported the VARCHAR2 columns as DT_WSTR, while TEST insisted they were DT_STR.

Then I installed ODT 12.1.0.21 on TEST and the problem was resolved. Notably, during the installation I chose the "machine-wide" option. I'm not sure how much that mattered.

It seems the Oracle OleDb provider returns different data types depending on the version of the client components.

+1
May 05 '17 at 13:51

Check the NLS_LANG value in the registry:

reg query HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\ORACLE\KEY_<orahome> /f NLS_LANG 

If it matches the server character set, OraOLEDB uses the regular (non-Unicode) DBTYPE_STR data type; otherwise it works in Unicode mode and uses the DBTYPE_WSTR data type.

If the NLS_LANG value is missing, the default is US7ASCII, which will almost certainly not match your database, so you will get the Unicode data types.
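The selection rule above can be sketched as a small function. This is an illustrative model of the behavior described, not OraOLEDB's actual code; the character-set names used in the examples are assumptions for demonstration.

```python
def oraoledb_string_type(client_nls_lang, server_charset):
    """Model of the rule above: if the client character set matches the
    server's, OraOLEDB reports the non-Unicode DBTYPE_STR; otherwise it
    works in Unicode mode and reports DBTYPE_WSTR."""
    # NLS_LANG has the form LANGUAGE_TERRITORY.CHARACTERSET; a missing
    # value behaves like the US7ASCII default described above.
    if client_nls_lang is None:
        client_charset = "US7ASCII"
    else:
        client_charset = client_nls_lang.rsplit(".", 1)[-1]
    if client_charset.upper() == server_charset.upper():
        return "DBTYPE_STR"
    return "DBTYPE_WSTR"

# Matching charsets -> non-Unicode; missing NLS_LANG -> Unicode.
print(oraoledb_string_type("AMERICAN_AMERICA.WE8MSWIN1252", "WE8MSWIN1252"))  # DBTYPE_STR
print(oraoledb_string_type(None, "WE8MSWIN1252"))                             # DBTYPE_WSTR
```

This would explain the questioner's symptom: a client home whose NLS_LANG matches the server yields DT_STR, while a home with a missing or mismatched NLS_LANG yields DT_WSTR.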

To get the server character set, run:

 SELECT parameter, value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET'; 
+1
Sep 06 '17 at 1:10

Check the value of the ValidateExternalMetadata property; make sure it is actually set to false.

0
Jul 02 '15 at 18:10