When working with the Pango code, I happened to notice that right now we are typically embedding TTF font files verbatim into the PS files we generate. The only exception is when the TTF embedding code fails because, say, the font happens to have a table other than glyf bigger than 64k. (A rough sketch of that check is appended at the end of this mail.)

Concerns:

- We don't have subsetting for TTF embedding, though we do for TTF => Type1 conversion. A small file with Greek, Latin, Japanese, Arabic, Bengali, Hebrew and Hindi generated a 1.5M PS file with the current code and 1.0M when I forced all fonts to be converted to Type1. And that's with only the Arabic and Hindi fonts embedded as TrueType (the others either had too-big tables or were Type1 as source). The situation could easily be far worse.

- I'm getting lots of "(process:20138): GnomePrint-WARNING **: Too big table in font" spew.

- Do most PS printers these days really have TTF interpreters in them? How is this working?

I'm wondering whether we'd currently be better off doing the blanket conversion to Type1. Is there a reason for favoring direct TTF embedding?

Regards,
Owen

P.S. Clearly there are lots of other things that could be done better:

- Adding TTF subsetting if we want to continue embedding TTF fonts (though that is pretty hard; a sketch of one small piece is appended below)

- Adding Type1 subsetting when we are embedding Type1 source-format fonts (should be considerably easier than TTF subsetting, plus code exists in various places)

- Doing a better job at not writing out pages and pages of all-zeros encoding tables when subsetting a large font (a sketch of one approach is appended below)

But those all require at least a moderate amount of coding...
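To make some of this concrete, a few rough, untested sketches follow. First, the "too big table" failure mode: a Type 42 font carries the raw sfnt data in the /sfnts array of PostScript strings, and a single PostScript string can't exceed 64k. The glyf table can be split across strings at glyph boundaries, but other tables can't safely be split, so one oversized non-glyf table forces the fallback. A minimal sketch of the check (the helper names and the standalone main are made up for illustration, not taken from gnome-print):

    /* Scan a TrueType font's table directory and report any table
     * other than 'glyf' whose length exceeds the PostScript string
     * limit, mimicking the condition behind the "Too big table in
     * font" warning. */

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>
    #include <inttypes.h>

    #define PS_STRING_MAX 65535u

    /* Read big-endian 16- and 32-bit values. */
    static uint16_t read_u16 (const unsigned char *p) {
        return (uint16_t) ((p[0] << 8) | p[1]);
    }
    static uint32_t read_u32 (const unsigned char *p) {
        return ((uint32_t) p[0] << 24) | ((uint32_t) p[1] << 16) |
               ((uint32_t) p[2] << 8)  |  (uint32_t) p[3];
    }

    /* Return 1 if every non-glyf table fits in a PostScript string.
     * 'font' points to the start of the sfnt data. */
    static int
    tables_fit_in_ps_strings (const unsigned char *font, size_t font_len)
    {
        uint16_t n_tables, i;

        if (font_len < 12)
            return 0;
        n_tables = read_u16 (font + 4);
        if (font_len < 12 + 16 * (size_t) n_tables)
            return 0;

        /* Table records start at offset 12; each is 16 bytes:
         * tag, checksum, offset, length. */
        for (i = 0; i < n_tables; i++) {
            const unsigned char *rec = font + 12 + 16 * i;
            uint32_t length = read_u32 (rec + 12);

            if (memcmp (rec, "glyf", 4) != 0 && length > PS_STRING_MAX) {
                fprintf (stderr,
                         "Too big table in font: %.4s (%" PRIu32 " bytes)\n",
                         (const char *) rec, length);
                return 0;
            }
        }
        return 1;
    }

    int
    main (int argc, char **argv)
    {
        static unsigned char buf[65536];   /* plenty for the directory */
        FILE *f;
        size_t len;

        if (argc != 2) {
            fprintf (stderr, "usage: %s font.ttf\n", argv[0]);
            return 2;
        }
        f = fopen (argv[1], "rb");
        if (!f) { perror (argv[1]); return 2; }
        len = fread (buf, 1, sizeof buf, f);
        fclose (f);
        return tables_fit_in_ps_strings (buf, len) ? 0 : 1;
    }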
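Second, on why TTF subsetting is hard: beyond rewriting glyf, loca, cmap and friends consistently, even computing the set of needed glyphs has a wrinkle, since composite glyphs pull in other glyphs as components, so the used set has to be closed over component references. A fragment sketching just that step, following the glyf table layout (again illustrative, not proposed code):

    #include <stdint.h>

    /* Component-record flags from the glyf table specification. */
    #define ARG_1_AND_2_ARE_WORDS     0x0001
    #define WE_HAVE_A_SCALE           0x0008
    #define MORE_COMPONENTS           0x0020
    #define WE_HAVE_AN_X_AND_Y_SCALE  0x0040
    #define WE_HAVE_A_TWO_BY_TWO      0x0080

    static uint16_t read_u16 (const unsigned char *p) {
        return (uint16_t) ((p[0] << 8) | p[1]);
    }

    /* Mark every glyph referenced as a component of the glyph at
     * 'glyph' (a pointer into the glyf table, located via loca, not
     * shown) in the 'used' bitmap.  A full subsetter would iterate
     * this to a fixed point, since components can themselves be
     * composites. */
    static void
    mark_components (const unsigned char *glyph, unsigned char *used)
    {
        int16_t n_contours = (int16_t) read_u16 (glyph);
        const unsigned char *p;
        uint16_t flags;

        if (n_contours >= 0)    /* simple glyph: nothing to add */
            return;

        p = glyph + 10;         /* skip numberOfContours and bbox */
        do {
            uint16_t component;

            flags = read_u16 (p);
            component = read_u16 (p + 2);
            used[component >> 3] |= 1 << (component & 7);

            /* Advance past this record: flags + index + args,
             * then the optional transform. */
            p += (flags & ARG_1_AND_2_ARE_WORDS) ? 8 : 6;
            if (flags & WE_HAVE_A_SCALE)
                p += 2;
            else if (flags & WE_HAVE_AN_X_AND_Y_SCALE)
                p += 4;
            else if (flags & WE_HAVE_A_TWO_BY_TWO)
                p += 8;
        } while (flags & MORE_COMPONENTS);
    }

And that's the easy part; after closing the set you still have to renumber glyphs and rewrite every table that stores glyph indices, which is where the real work is.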
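Third, the all-zeros encoding tables: instead of printing every slot of a mostly-empty mapping as literal zeros, the generated PostScript can zero-fill an array once in the interpreter and then put only the live entries. A sketch of the idea; /GlyphMap and the output format here are invented for illustration, not what gnome-print emits today:

    #include <stdio.h>

    /* Emit a sparse code->glyph mapping: map[code] is the glyph
     * index, or 0 for an unused slot.  The zero-fill happens in a
     * single PostScript loop rather than as thousands of literal
     * zeros in the file.  (65535 is the PostScript implementation
     * limit on array length.) */
    static void
    emit_sparse_encoding (FILE *out, const int *map, int size)
    {
        int code;

        fprintf (out, "/GlyphMap %d array def\n", size);
        fprintf (out, "0 1 %d { GlyphMap exch 0 put } for\n", size - 1);

        /* Only the populated slots get individual entries. */
        for (code = 0; code < size; code++)
            if (map[code] != 0)
                fprintf (out, "GlyphMap %d %d put\n", code, map[code]);
    }

    int
    main (void)
    {
        static int map[65535];      /* zero-initialized */

        map['A'] = 36;              /* a toy subset: three live */
        map['B'] = 37;              /* entries out of 65535     */
        map[0x3042] = 512;          /* slots                    */

        emit_sparse_encoding (stdout, map, 65535);
        return 0;
    }

For a subset with a few dozen live entries this is a handful of lines of output instead of pages of zeros.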