believeNRows = FALSE seems to be the key. It is best to set it when opening the connection:
db <- odbcConnect(dsn = "testdsn", uid = "testuser", pwd = "testpasswd", believeNRows = FALSE)
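For completeness, here is the same call as a self-contained snippet (a minimal sketch; DUAL is Oracle's built-in one-row dummy table, used here only as a smoke test):

library(RODBC)                       # provides odbcConnect() and sqlQuery()
db <- odbcConnect(dsn = "testdsn", uid = "testuser",
                  pwd = "testpasswd", believeNRows = FALSE)
sqlQuery(db, "SELECT 1 FROM dual")   # should return one row on both 32- and 64-bit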
When testing with unixODBC's isql, SQLRowCount reports 4294967295 on 64-bit Linux (even when there is only one row), while it reports -1 on 32-bit Linux. Returning -1 ("row count unknown") is probably an optimization: the driver can start delivering results sooner, and the database is relieved of the burden of materializing the complete result set up front. For instance, a query may match many rows while only the first few hits will ever be retrieved.
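That partial-fetch situation is easy to reproduce from RODBC itself (a hedged sketch; big_table is a made-up name): sqlQuery's max argument fetches only the first few rows, which is exactly the case where the driver cannot know the total row count in advance.

# Fetch only the first 5 rows of a hypothetical large table;
# the driver never has to compute the full row count for this.
first_hits <- sqlQuery(db, "SELECT * FROM big_table", max = 5)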
4294967295 is 2^32 - 1, the maximum value of an unsigned 32-bit integer; the same bit pattern read as a signed integer is -1. That is why R complains about a negative length vector. So I assume this is a signed vs. unsigned integer problem (or a sizeof(long) mismatch between 32-bit and 64-bit).
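The arithmetic can be checked directly in R:

n <- 4294967295   # the value isql reports on 64-bit Linux
n == 2^32 - 1     # TRUE: the maximum unsigned 32-bit value
n - 2^32          # -1: the same bit pattern interpreted as signed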
Setting believeNRows = FALSE solved the problem for me, so I can use the same R code on both systems.
BTW: I am using R 2.10.1, RODBC 1.3.2, and unixODBC 2.3.0 with Oracle 10.2.0.4 on 64-bit Linux. Be sure to use
export CFLAGS="-DBUILD_REAL_64_BIT_MODE -DSIZEOF_LONG=8 -fshort-wchar"
when configuring unixODBC, because the Oracle ODBC driver expects REAL_64_BIT_MODE, not LEGACY_64_BIT_MODE.
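As a quick sanity check (not part of the build itself), R can report the platform sizes that the SIZEOF_LONG flag has to match:

.Machine$sizeof.long     # 8 on 64-bit Linux, 4 on 32-bit Linux
.Machine$sizeof.pointer  # 8 for a 64-bit build of R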
And remember the internationalization issues: R uses $LANG, and Oracle uses $NLS_LANG.
I'm having problems with UTF-8, so I use, for example:
LANG=en_US; NLS_LANG=American_America
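A hedged sketch of handling NLS_LANG from inside R instead: Sys.setenv works as long as it runs before the first connection loads the Oracle ODBC driver. LANG, by contrast, is read by R at startup, so it really belongs in the shell.

Sys.setenv(NLS_LANG = "American_America")  # must happen before the first connect
db <- odbcConnect(dsn = "testdsn", uid = "testuser",
                  pwd = "testpasswd", believeNRows = FALSE)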