
Re: What's the relationship between vfs and tfms?


   But the precise syntax of the re-encoding file *is* system dependent, no
   matter what the aim of the file.

OK, now I get it: you are worried about the difference between, say,
8r.enc for DVIPS and 8r.vec for some other system.  Not sure why this
is a big issue.  Whoever supplies the TeX system supplies the vectors
for those base encodings.
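For concreteness, here is roughly what the DVIPS form looks like: a
re-encoding file such as 8r.enc is just a named PostScript array of 256
glyph names (the fragment below is abbreviated and the entries are from
memory, so treat them as illustrative):

```postscript
% DVIPS re-encoding syntax: a named PostScript array of 256 glyph names.
/TeXBase1Encoding [
  /.notdef /dotaccent /fi /fl /fraction /hungarumlaut /Lslash /lslash
  % ... 244 entries omitted ...
  /udieresis /yacute /thorn /ydieresis
] def
```

Another system's .vec file would carry the same 256-entry table in a
different concrete syntax; the content is identical, only the form
differs.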

   The point that the encoding vectors perform an identical function doesn't
   matter in the slightest - the problem is that given that different dvi
   drivers need different syntaxes (in general), you get system-dependent
   files.  That is: files that differ in form (not function: form) from system
   to system.

Yep, got it. (Although I dispute its importance, see below)

   >Well, some systems need it written out one way, some another.
   >But the basic idea is the same: a map from numbers to glyph names.

   But that's irrelevant: never mind the basic point, the files differ in form
   from system to system.  Unlike, say, .vf files, which are identical on all

Oops!  Beg to differ.  We have endless trouble with something as
simple as the VF file for MathTime MTMI, just because VF files are
*not* identical across TeX systems!  For a start, a VF file has the
name of the target font wired in.  Since that name is *not* constant,
we have to provide different VF files for Textures versus OzTeX versus
DVIPS, for example.  In addition, a VF file has the encoding of the
target fonts hardwired.  Since MTMI calls on Times-Italic and needs
the dotlessi, for example, it has to know what encoding is used for
the text fonts!  Ugh!  And unlike a vector file, which is plain ASCII,
I have to use special tools to modify a VF file (I won't even get into
the problems occasioned by the fact that you cannot control which
version of Times-Italic you will get when using VF).
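The wiring is plain to see if you run vftovp on such a VF: the target
font's name and the slot numbers of its encoding are both baked into the
property list.  The fragment below is illustrative only (the font name
and slot numbers are assumptions, and the real MTMI lists are much
longer):

```text
(MAPFONT D 0
   (FONTNAME Times-Italic)
   (FONTDSIZE R 10.0)
   )
(CHARACTER O 20
   (COMMENT dotlessi, fetched from font 0)
   (MAP
      (SELECTFONT D 0)
      (SETCHAR O 365)
      )
   )
```

Change the target font, or the encoding its slots follow, and both the
FONTNAME and every SETCHAR number may have to change: hence a different
VF per system.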

   >Yes, unfortunately on the Mac people haven't yet figured out how to
   >reencode fonts other than via the PostScript route.  So they can only
   >use what is in Mac standard Roman encoding on screen and for non-PS
   >printing.  But just because remapping is all that can be done, that is
   >not to say that it is either a good thing or in any way a substitute
   >for true reencoding.

   It *is* a good thing and it is a substitute for what you term re-encoding,
   because without it, we wouldn't be able to have *any* sort of re-encoding.

Right, what you are saying is that if you don't have real reencoding
you have a problem, and that problem can only be *partially* solved by
remapping.  (And don't call it reencoding; that thoroughly confuses
the issue.  Encoding: a map from character code to glyph name.
Mapping: a map from character code to character code, as in what VF
does.)
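To put the terminology in a few lines of pseudo-code (a sketch with
made-up values, not any real vector or any particular system's format):

```python
# Sketch of the distinction (all code points and names are illustrative).

# An *encoding*: character code -> glyph name.  This is what a
# re-encoding file (.enc, .vec, ...) expresses, whatever its syntax.
encoding = {0x0B: "ff", 0x10: "dotlessi", 0x19: "germandbls"}

# A *mapping*: character code -> character code, which is all that a VF
# (or any other code-to-code remapping scheme) can express.
mapping = {0x10: 0xF5}

assert encoding[0x10] == "dotlessi"   # answers "which glyph?"
assert mapping[0x10] == 0xF5          # only answers "which slot?"
```

A mapping never names a glyph; it can only point at a slot in some
already-encoded font, which is exactly why it is weaker.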

   >   To use an argument you used elsewhere: for `simple' use, this doesn't
   >   matter.  After all, I've never been known to write in Icelandic; and
   >   virtually all Mac users who *do* write in Icelandic *do* have access to
   >   these characters.

   >Hmm, how about ff, ffi, ffl ligatures in the Lucida fonts?  Can't use them
   >on the Mac.

   Yes you can: you've got to print using Postscript, though.  But if I were

You come from the magic land of Unix :-), obviously, where device
independence means PostScript --- and on-screen viewing is apparently
not a high priority.  I prefer to be able to see my document on screen
(with all characters) *and* print to non-PS devices (e.g. cheap HP
printers and fax boards) without having to resort to a PS interpreter.

   to spend money buying founts, I expect I'd've spent money so I could print

Fonts cost (a lot) less than a PS printer. 

   >Typographically this is not a good thing.  Best to use the ready-made
   >accented characters the designer created.

   Best, yes.  But it's not necessarily a typographically bad thing, is it?
   Surely the faking fontinst does is often nearly indistinguishable from the
   `real' thing?

I am constitutionally opposed to all fake typographic hacks :-) 
such as fake smallcaps, fake condensed, fake obliqued and fake accented
characters.  Maybe we can't tell the difference some of the time, but 
the people on comp.fonts often can.

   >   But the precise form of each encoding vector file *does* depend on the
   >   syntax required by the particular dvi driver you are using.  And not all
   >   TeX dvi drivers can handle re-encoding; some of them must use re-mapping,
   >   which is entirely system-dependent.

   >In which case they are sunk anyway as far as this discussion goes...

   Why do you say that?

Because with remapping only (no reencoding) you can't use any
characters that are not in that platform's default text font encoding
vector (like ff, ffi, ffl, Lslash, Yacute, minus, dotlessj, etc.).
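A toy sketch of why (the table below is made up, not any platform's
real encoding): a code-to-code remap can only land on slots the
platform's encoding already fills, so a glyph absent from that table is
simply unreachable.

```python
# Toy model: remapping can only select glyphs that the platform's
# text-font encoding already contains (slot numbers are invented).
platform_encoding = {0xF5: "dotlessi", 0xD9: "Yacute"}  # no ff, ffi, ffl, Lslash

def reachable(glyph, platform_encoding):
    """A remap picks a target slot; the glyph must already sit in one."""
    return glyph in platform_encoding.values()

assert reachable("dotlessi", platform_encoding)   # a remap can reach this
assert not reachable("ff", platform_encoding)     # no slot holds ff: sunk
```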

And don't tell me again that you can as long as you print to PS.
I know everything is trivial in PS. That is the whole challenge:
to do something that is device independent.