Re: UT_UCS4Char byte ordering question

From: phearbear (phearbear@home.se)
Date: Sat May 18 2002 - 08:45:28 EDT


    Kenneth J. Davis wrote:

    >I was tracing through the Windows code (as every menu selection
    >causes an assert popup in debug mode) and found that
    >int UT_UCS4_mbtowc::mbtowc(UT_UCS4Char & wc, char mb), as called from
    >UT_UCS4Char * UT_UCS4_strcpy_char(UT_UCS4Char * dest, const char * src),
    >converts the char * to a big-endian coded UT_UCS4Char.
    >UT_uint32 ap_sb_Field_PageInfo::getDesiredWidth(void) calls the above
    >strcpy function and passes the result to
    >pG->measureString(bufUCS,0,len,charWidths), where pG is a
    >GR_Win32Graphics. That code in turn seems to treat bufUCS as 16-bit
    >(there are some masks and shifts that appear valid only from 0x0 to
    >0xFFFF) and in native endian format. The assertions appear to be
    >caused by a value being initialized to an invalid value as a result
    >of the big endianness. While I have some changes that are not in cvs,
    >I do not think any of them should affect this. My question is: should
    >the above call to UT_UCS4_strcpy_char() be converting the char * to
    >an array of UT_UCS4Char(s) that are in big-endian byte order [which,
    >as far as I can tell, is how the peer libiconv call is written to
    >perform the given conversion] or in native byte order? Or am I the
    >only one with this problem?
    >
    >Thanks,
    >Jeremy Davis
    >jeremyd@computer.org
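
    For context, here is a minimal, self-contained sketch of the mismatch
    described above. It is not AbiWord code; it assumes GNU libiconv or
    glibc iconv, where the generic "UCS-4" encoding name means big-endian
    output regardless of the host's byte order, so reading the result back
    as a native 32-bit value gives the wrong code point on a little-endian
    Windows machine:

        // Sketch only, not AbiWord code. Assumes GNU libiconv/glibc iconv;
        // encoding names and their byte-order semantics vary by platform.
        #include <iconv.h>
        #include <cstdint>
        #include <cstdio>

        int main()
        {
            const char src[]  = "A";            // U+0041 in ISO-8859-1
            uint32_t   dst[2] = { 0, 0 };

            char*  in      = const_cast<char*>(src);
            size_t in_len  = 1;
            char*  out     = reinterpret_cast<char*>(dst);
            size_t out_len = sizeof(dst);

            // "UCS-4" output is written big-endian: bytes 00 00 00 41.
            iconv_t cd = iconv_open("UCS-4", "ISO-8859-1");
            if (cd == (iconv_t)-1) { perror("iconv_open"); return 1; }
            iconv(cd, &in, &in_len, &out, &out_len);
            iconv_close(cd);

            // On a little-endian host this prints 0x41000000, not 0x00000041,
            // which is exactly the kind of value that trips later assertions.
            printf("native read of UCS-4 output: 0x%08X\n", (unsigned) dst[0]);
            return 0;
        }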
    Oops, this was my fault: I forgot to change it after hard-coding the
    UT_iconv conversion to UCS-4 instead of using the UCS_INTERNAL define.
    It should work now.

    /Johan
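
    The fix Johan describes amounts to asking iconv for UCS-4 in the
    host's own byte order rather than the big-endian default, which is
    presumably what the UCS_INTERNAL define selects. A hedged sketch of
    that idea follows; it is not the actual patch, and the helper names
    and the run-time endianness probe are only illustrative:

        // Sketch only: pick a UCS-4 encoding name that matches the host
        // byte order, so iconv output can be stored directly in native
        // 32-bit UT_UCS4Char values. "UCS-4LE" and "UCS-4BE" are supported
        // by GNU libiconv and glibc.
        #include <iconv.h>
        #include <cstdint>

        static const char* native_ucs4_name()
        {
            // Detect host byte order at run time; a build-time check
            // would work just as well.
            const uint16_t probe = 0x0102;
            const bool big_endian =
                (*reinterpret_cast<const uint8_t*>(&probe) == 0x01);
            return big_endian ? "UCS-4BE" : "UCS-4LE";
        }

        // Open a converter whose output matches the native UCS-4 layout.
        static iconv_t open_to_native_ucs4(const char* from_charset)
        {
            return iconv_open(native_ucs4_name(), from_charset);
        }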


