I have a strange problem inserting non-ASCII characters into a SQL Server database using the Microsoft ODBC driver for Linux: sending and receiving appear to assume different character sets. For reference, the server collation is Latin1_General_CI_AS (I'm only trying to insert European accented characters).
Testing with tsql (which comes with FreeTDS) works fine. At startup it displays the following:
locale is "en_GB.utf8"
locale charset is "UTF-8"
using default charset "UTF-8"
I can both insert and select a non-ASCII value in the table.
However, my own utility, which uses the ODBC API, does not work. When I run a SELECT, the data comes back in UTF-8 as desired. But when I INSERT UTF-8 characters, they get corrupted:
SQL > update test set a = 'Béthune';
Running SQL: update test set a = 'Béthune'
Query executed OK: 1 affected rows
SQL > select * from test;
Running SQL: select * from test
+------------+
| a |
+------------+
| Béthune |
+------------+
If I instead paste in the data encoded as ISO-8859-1, it is stored correctly; however, the SELECT query still returns it as UTF-8!
I have already set the locale to en_GB.utf8 and the client charset to UTF-8 in the database connection details. Aargh!
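For reference, the relevant FreeTDS setting looks like this (the server entry name and host are placeholders for my real ones):

```ini
; freetds.conf -- "myserver" and the host are placeholders
[myserver]
        host = sqlserver.example.com
        port = 1433
        tds version = 7.2
        client charset = UTF-8
```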
FWIW I seem to have the same problem if I use the FreeTDS driver or the official Microsoft driver.
EDIT: I have traced this a little further. The SQL statement is passed to the driver via SQLPrepare. Somewhere in the ODBC layer an iconv conversion appears to be applied to it, using the wrong character set!

I suspect unixODBC is the layer doing the conversion, but I can't be sure.
EDIT 2: I have traced into unixODBC, and it determines the charset to convert to using nl_langinfo(CODESET), which is returning ISO-8859-1. However, according to the man page, this should report the same thing as locale charmap, which prints UTF-8. So something is inconsistent there, but I can't see what.