Hacker News

Is it too much to expect that every device should be capable of rendering every unicode point (latest version at release) in at least one font?

Yes, I think it probably is. Unicode is vast, and the 100,000+ characters specified in the latest standard include numerous obscure, specialised, or downright gimmicky ones.

The effort required to create just one font that supports even a crude version of each and every character is probably measured in human lifetimes. Imposing that kind of burden as a barrier to entry for any new platform seems unrealistic.



For one, all those glyphs don't need to be drawn by a single person, or even coexist in the same font (they cannot anyway, since current font formats cap a single font at 65,535 glyphs). Of the 110,182 characters Unicode 6.2 defines, Windows 8 (the only thing I can test at the moment) includes glyphs for 102,082 of them out of the box. The missing ones fall mostly into either »rarely used CJK ideograph« or »historical script« (like Egyptian hieroglyphs).
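As a rough sanity check on counts like these, you can ask the Unicode Character Database bundled with your own Python build how many codepoints it assigns a name to (a sketch only; the exact total depends on the Unicode version your interpreter ships, which will be much newer than 6.2, so the number will not match 110,182):

```python
import sys
import unicodedata

# Count codepoints that the bundled Unicode Character Database assigns
# a character name to. Named characters exclude controls, surrogates,
# private-use and unassigned codepoints, so this approximates the
# number of "real" characters defined in this Unicode version.
named = sum(
    1
    for cp in range(sys.maxunicode + 1)
    if unicodedata.name(chr(cp), "") != ""
)

print(unicodedata.unidata_version, named)
```

Running this on a recent Python prints a Unicode version well past 6.2 and a six-figure character count, most of it CJK ideographs.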

Unicode serves many different needs, and not all characters need to be supported in a general-purpose OS. There are fonts that cover the missing pieces, and professionals working in fields that require them usually have those fonts installed.

There is also little benefit in providing a single font that encompasses all of Unicode. Designers pick fonts for aesthetic reasons, and every script has different styles (although Latin, Greek, and Cyrillic are similar enough that they are usually all included in the same font). Take the basic distinction between serif and sans-serif for non-decorative body text: it never existed for scripts like Han, Hebrew, Arabic, or the various Indic scripts. So if you were to create only one font, which style would you pick for each script?

Pan-Unicode fonts are mostly useful as fallback fonts, ensuring that you can at least see rarely used glyphs; for nearly all other practical purposes they cannot be used. Creating one is also an enormous effort beyond drawing the glyphs, because you have to include kerning tables, define positions where combining characters attach, and so on. These are the issues that usually make pan-Unicode fonts unusable: they may contain plenty of glyphs, but they cannot reliably render text in anything beyond simple scripts (and diacritic placement can be wrong even with just Latin).
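The fallback role described above can be sketched in a few lines: given an ordered list of fonts, each modelled here as nothing more than a name plus the set of codepoints it covers (the font names and coverage sets are made up for illustration), a renderer picks the first font that covers each character and only reaches the pan-Unicode font last:

```python
# Minimal sketch of per-character font fallback. Each font is modelled
# as (name, set of covered codepoints); real renderers also consider
# shaping, styles, and language tags, which this ignores.
fonts = [
    ("Body Font", {ord(c) for c in "abcdefghijklmnopqrstuvwxyz "}),
    ("CJK Font", {0x4E00, 0x4E8C, 0x4E09}),          # a few ideographs
    ("Pan-Unicode Fallback", set(range(0x110000))),  # covers everything
]

def pick_font(ch: str) -> str:
    """Return the name of the first font in the chain covering ch."""
    for name, coverage in fonts:
        if ord(ch) in coverage:
            return name
    return "missing glyph"

print([pick_font(c) for c in "ab\u4e00"])
```

The pan-Unicode entry at the end of the chain guarantees *something* renders, which is exactly why such fonts are valuable as a last resort and little else.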


Whether you try to supply a comprehensive set of characters in one font file or many isn’t really the issue, though. You’ve still got to get all those glyphs from somewhere, however they are grouped.

I’m just not sure I see a compelling argument that any new device entering the market must be able to render advanced mathematical notation, animals, and tarot cards. That’s a very high barrier to entry.

In due course, if there are freely available, good quality fonts that do the job, then by all means include them, but we’re a long way from that situation today. Even the most comprehensive efforts, things like Unifont, don’t cover all of Unicode. Also, without wishing to belittle anyone’s efforts, some of these projects are working on bitmap fonts, and it’s increasingly a vector world. Perhaps they are still useful as a rendering of last resort, but I suspect anyone working on a new platform or device has more pressing concerns.



