UT_UCS4Char byte ordering question

From: Kenneth J. Davis (jeremyd@computer.org)
Date: Sat May 18 2002 - 05:42:30 EDT

    I was tracing through the Windows code (every menu selection causes an
    assert popup in debug mode) and found that
    int UT_UCS4_mbtowc::mbtowc(UT_UCS4Char & wc, char mb), as called from
    UT_UCS4Char * UT_UCS4_strcpy_char(UT_UCS4Char * dest, const char * src),
    converts the char * to big-endian-coded UT_UCS4Char values.

    UT_uint32 ap_sb_Field_PageInfo::getDesiredWidth(void) calls the above
    strcpy function and passes the result to
    pG->measureString(bufUCS, 0, len, charWidths), where pG is a
    GR_Win32Graphics. measureString, in turn, seems to assume that bufUCS
    holds 16-bit values in native byte order (at least, the masks and shifts
    there only look valid for the range 0x0 to 0xFFFF). The assertions appear
    to be caused by a value being initialized to an invalid value as a result
    of the big-endian encoding. While I have some changes that are not in
    cvs, I do not think any of them should affect this.

    My question is: should the above call to UT_UCS4_strcpy_char() be
    converting the char * to an array of UT_UCS4Char(s) in big-endian byte
    order [which, as far as I can tell, is how the peer libiconv call is
    written to perform the given conversion], or in native byte order?
    Or am I the only one with this problem?
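
    To illustrate the mismatch, here is a minimal standalone sketch (my own
    illustration, not AbiWord code; the character 'A' and the variable names
    are just examples). On a little-endian x86 box, a char stored into a
    32-bit slot in big-endian byte order reads back as a value well above
    0xFFFF, which is exactly the kind of value the 16-bit masks/shifts
    cannot handle:

        // standalone sketch, not AbiWord source
        #include <cstdio>
        #include <cstring>
        #include <cstdint>

        int main()
        {
            char mb = 'A';  // 0x41

            // bytes laid out in big-endian (UCS-4BE) order, which is how I
            // read the libiconv-style conversion as behaving
            unsigned char be[4] = { 0, 0, 0, static_cast<unsigned char>(mb) };

            // a consumer that reinterprets the buffer in native byte order,
            // the way the masks/shifts in measureString seem to expect
            std::uint32_t native;
            std::memcpy(&native, be, sizeof(native));

            // on a little-endian machine this prints 0x41000000 rather than
            // the expected 0x00000041, so it is > 0xFFFF and trips any
            // 16-bit assumption downstream
            std::printf("native read of UCS-4BE bytes: 0x%08X\n",
                        static_cast<unsigned>(native));
            std::printf("expected native value:        0x%08X\n",
                        static_cast<unsigned>(mb));
            return 0;
        }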

    Thanks,
    Jeremy Davis
    jeremyd@computer.org
