Re: Internationalization
- From: Ryan McDougall <ryan mcdougall telusplanet net>
- To: nshmyrev yandex ru
- Cc: twanger bluetwanger de, desktop-devel-list gnome org
- Subject: Re: Internationalization
- Date: Fri, 09 Apr 2004 11:57:54 -0600
On Fri, 2004-04-09 at 09:18 +0400, nshmyrev wrote:
> >One thinkable approach would be to extend the token list to include all
> >translated versions of the tokens, but I don't like that much. Write
> >your own scanner :)
>
>
> By the way, there are known problems with flex - it still can't handle UTF-8 keywords without a patch, and the patch is not always included in distributions; it is sometimes hard to apply.
>
> See http://www.freedesktop.org/Software/BadSoftware
>
> So, write your own flex :)
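FWIW, the "extend the token list to include the translated versions"
approach quoted above is simple enough to sketch in Python with gettext.
This is only an illustration; the "mathlexer" domain and the keyword
names here are made up:

    import gettext

    # Fall back to an identity translation when no catalog is installed
    # for the (made-up) "mathlexer" domain.
    _ = gettext.translation("mathlexer", fallback=True).gettext

    # Canonical scanner tokens keyed by their English spelling.
    CANONICAL = {"sin": "SIN", "cos": "COS", "tan": "TAN"}

    # Accept both the English and the translated spelling of each keyword.
    KEYWORDS = {}
    for english, token in CANONICAL.items():
        KEYWORDS[english] = token
        KEYWORDS[_(english)] = token

    def keyword_token(word):
        """Map an English or translated keyword to its canonical token."""
        return KEYWORDS.get(word)

The scanner never needs to know which locale it is running in; it just
looks names up in the table.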
A little searching will turn up all sorts of lex-like tools out there;
perhaps some have decent intl support. I know there are quite a few
Python-based lexical analyzers, and I think Python has a decent intl
framework.
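For instance, a few lines with the re module already cope with
non-ASCII names, since re.UNICODE makes \w match Unicode word
characters. Again only a sketch, not any particular tool:

    import re

    # One regex for the whole token set; re.UNICODE makes \w match
    # non-Latin word characters, so translated function names tokenize
    # exactly like the English ones.
    TOKEN_RE = re.compile(
        r"\s*(?:(?P<number>\d+(?:\.\d+)?)|(?P<name>\w+)|(?P<op>[-+*/()^]))",
        re.UNICODE)

    def tokenize(text):
        text = text.rstrip()
        pos = 0
        while pos < len(text):
            match = TOKEN_RE.match(text, pos)
            if match is None:
                raise ValueError("unexpected character at position %d" % pos)
            pos = match.end()
            kind = match.lastgroup
            yield kind, match.group(kind)

    # list(tokenize("tan(0.5) + 2")) gives
    # [('name', 'tan'), ('op', '('), ('number', '0.5'), ('op', ')'),
    #  ('op', '+'), ('number', '2')]

A 'name' token can then be looked up in a translated keyword table like
the one above, so the grammar itself stays locale-independent.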
Also, with regard to mathematical words, AFAIK they are all pretty
standard, so tangent is tan no matter where you go. Does anyone have
counterexamples?
Cheers,
Ryan