
freetds - Re: [freetds] Force use of UTF-8 instead of ISO8859-1

  • From: Frediano Ziglio <freddy77 AT gmail.com>
  • To: FreeTDS Development Group <freetds AT lists.ibiblio.org>
  • Subject: Re: [freetds] Force use of UTF-8 instead of ISO8859-1
  • Date: Sun, 07 Feb 2010 01:48:27 +0100

On Sat, 06/02/2010 at 14:17 -0500, James K. Lowden wrote:
> Frediano Ziglio wrote:
> > >> correctly our implementation returns
> > >> 10 and.... (rumble!) SQL_WCHAR... I forget this SMALL detail....
> > >
> > > What should it return if the encoding is UTF-8? SQL_WCHAR seems
> > > correct, unless it means UCS-2LE. I think SQL_WCHAR means "Unicode".
> > >
> >
> > Not only this, it also specifies an encoding. For instance, if you use
> > unixODBC it is UCS-2, or the system wchar_t using iODBC.
> ...
> > > SQL_DESC_OCTET_LENGTH is the length in bytes of the buffer needed to
> > > hold the data. The driver knows how the buffer is encoded. It should
> > > return -- as dbcollen() does -- the maximum size that could be
> > > required to hold any value that the column could hold. For nchar(10)
> > > in UTF-8, that's 40.
> > >
> >
> > True and false I think... octet length is the buffer size using the
> > default client type for a given server type.... 20 for an
> > nchar(10) if the client has sizeof(SQLWCHAR) == 2 because output is
> > supposed to be a "wide" character set (UCS-2/UCS-4... or sometimes
> > UTF-16... never UTF-8). The client character set affects how we translate
> > to a multi-byte character set but not how the library encodes the wide
> > character set.
>
> Wouldn't it be 40 for UCS-4?
>

Yes, I think so.
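
To make the arithmetic explicit, a rough sketch (a hypothetical helper, not
FreeTDS code; the 4-bytes-per-character UTF-8 figure is just the conservative
bound used in this thread):

/* Rough sketch, not FreeTDS code: worst-case octet length of an nchar(n)
 * column, depending only on the form the data is delivered in. */
#include <stdio.h>

enum client_form { FORM_UCS2, FORM_UCS4, FORM_UTF8 };

static long worst_case_octets(long n, enum client_form form)
{
    switch (form) {
    case FORM_UCS2: return n * 2;  /* sizeof(SQLWCHAR) == 2, e.g. unixODBC */
    case FORM_UCS4: return n * 4;  /* sizeof(SQLWCHAR) == 4 */
    case FORM_UTF8: return n * 4;  /* conservative multi-byte bound */
    }
    return 0;
}

int main(void)
{
    printf("nchar(10), UCS-2: %ld bytes\n", worst_case_octets(10, FORM_UCS2)); /* 20 */
    printf("nchar(10), UCS-4: %ld bytes\n", worst_case_octets(10, FORM_UCS4)); /* 40 */
    printf("nchar(10), UTF-8: %ld bytes\n", worst_case_octets(10, FORM_UTF8)); /* 40 */
    return 0;
}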

> > The client character set affects how we translate
> > to a multi-byte character set but not how the library encodes the wide
> > character set.
>
> No, "how we translate" === "how library encodes". Character data have
> only two forms: server encoding and client encoding. The library has no
> encoding. All data that the client exchanges with the library always uses
> one encoding: the client's.
>

ODBC has two character types: "normal", which is multi-byte, and wide.

> The DM does not know the encoding FreeTDS is using, but the application
> does. The application is asking SQL_DESC_OCTET_LENGTH for the size of the
> buffer it needs to allocate. FreeTDS has already converted UCS-2 to
> UTF-8. It should return 40, else the application is likely to allocate
> too small a buffer.
>

No, currently ODBC does not convert the buffer as soon as possible; this is
slightly different from the other libraries (dblib and ctlib), where
characters get converted in libTDS. This is for two reasons: to support
getting binary data as the server sends it, and to better support wide
encodings.
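
From the application side, the intended use of SQL_DESC_OCTET_LENGTH looks
roughly like this (a minimal sketch with a hypothetical helper; error checking
omitted, hstmt assumed to hold a result set):

#include <stdlib.h>
#include <sql.h>
#include <sqlext.h>

/* Sketch: the application sizes its buffer from SQL_DESC_OCTET_LENGTH,
 * so the driver must report the worst case for the encoding it will
 * actually deliver. */
static char *alloc_column_buffer(SQLHSTMT hstmt, SQLUSMALLINT col,
                                 SQLLEN *octet_len)
{
    SQLColAttribute(hstmt, col, SQL_DESC_OCTET_LENGTH,
                    NULL, 0, NULL, octet_len);

    /* +1 for the NUL terminator appended to character data */
    return malloc((size_t) *octet_len + 1);
}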

> Should SQLDescribeCol return SQL_WCHAR for UTF-8 data? I think yes. If
> it returned a new constant, e.g. SQL_U8CHAR, most applications wouldn't know
> how to deal with it. If it returns SQL_CHAR, the application might assume
> 1 byte/char. SQL_WCHAR is the best option. If the application assumes
> SQL_WCHAR means UCS-2, then the client should use UCS-2 encoding instead.
>

SQL_WCHAR means wide characters; this is defined by the DM. Currently the
only supported encoding for wide characters is Unicode. UTF-8 is a
multi-byte encoding of Unicode. The client can decide between wide or not.
With FreeTDS it can even decide the multi-byte encoding (like UTF-8).
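
To show what I mean by the client deciding, a small sketch (hypothetical
helpers, no error checking): the same column can be fetched in the wide form
or in the multi-byte client character set.

#include <sql.h>
#include <sqlext.h>

/* Sketch: the application, not the driver, picks the form of the data.
 * hstmt is assumed to hold a result set whose first column is nchar. */
static void fetch_wide(SQLHSTMT hstmt)
{
    SQLWCHAR buf[64];   /* wide form: UCS-2 under unixODBC, wchar_t under iODBC */
    SQLLEN   ind;

    SQLFetch(hstmt);
    SQLGetData(hstmt, 1, SQL_C_WCHAR, buf, sizeof(buf), &ind);
}

static void fetch_multibyte(SQLHSTMT hstmt)
{
    char   buf[256];    /* multi-byte form: the client charset, e.g. UTF-8 */
    SQLLEN ind;

    SQLFetch(hstmt);
    SQLGetData(hstmt, 1, SQL_C_CHAR, buf, sizeof(buf), &ind);
}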

>
> I think we basically agree here. You see the changes to bsqlodbc, which
> now works. I don't see another way for the application to determine the
> buffer size.
>

Good... I don't understand how it could work if the descriptor is just
allocated (SQLGetDescField should return an error). I would try with
nchar(1) and a non-ASCII character... better a Japanese one (which needs
at least 3 bytes in UTF-8).
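
Something along these lines is what I have in mind for the test (a rough
sketch, assuming the client character set is UTF-8; hypothetical function,
no error checking):

#include <stdio.h>
#include <sql.h>
#include <sqlext.h>

/* Sketch of the test: an nchar(1) value holding a Japanese character
 * (U+3042, 3 bytes in UTF-8). If SQL_DESC_OCTET_LENGTH is reported as if
 * each character were a single byte, the application's buffer is too small. */
static void check_octet_length(SQLHSTMT hstmt)
{
    SQLLEN octet_len = 0;
    char   buf[16];
    SQLLEN ind;

    SQLExecDirect(hstmt,
        (SQLCHAR *) "SELECT CAST(N'\xE3\x81\x82' AS NCHAR(1))", SQL_NTS);

    SQLColAttribute(hstmt, 1, SQL_DESC_OCTET_LENGTH, NULL, 0, NULL, &octet_len);
    SQLFetch(hstmt);
    SQLGetData(hstmt, 1, SQL_C_CHAR, buf, sizeof(buf), &ind);

    printf("reported octet length = %ld, bytes actually fetched = %ld\n",
           (long) octet_len, (long) ind);
}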

> Regards,
>

bye
Frediano
