Hi there. =)
Post by Torsten Bronger
[...] However, though it sounds attractive, it's mainly intended to
be used for software package documentation. Hence the output is not
half the quality of LaTeX documents.
The output quality is not a real issue anymore (see my other posting),
however, I agree that some markup tags are missing.
Which other posting exactly? I can find only one in this thread, where
you're not talking about quality.
Post by Torsten Bronger
Note that Texinfo doesn't generate manpages. [...]
Neither does LaTeX. I suspect that it's a lot easier to create an
automatic conversion to manpages from Texinfo, though.
If you write LaTeX documents properly, then a simple (but long)
sed-expression would do it.
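To give an idea of what such a filter could look like, here is a minimal sketch, assuming the LaTeX source sticks to a handful of plain, unnested commands (the file names and the chosen command set are hypothetical; a real converter would need many more rules):

```shell
# Hypothetical sketch: map a few plain LaTeX commands to man-page macros.
# Assumes one command per construct and no nesting -- real documents
# would need a much longer rule list.
sed -e 's/\\section{\([^}]*\)}/.SH \1/' \
    -e 's/\\textbf{\([^}]*\)}/\\fB\1\\fR/g' \
    -e 's/\\emph{\([^}]*\)}/\\fI\1\\fR/g' \
    manual.tex > manual.1
```

The point is only that the mapping is mostly textual substitution, which is why sed can get surprisingly far on disciplined input.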
Post by Torsten Bronger
Also you have to worry about the output format produced, when
writing the source file. For example, you cannot use mathematical
formulae, when producing HTML (or any other non-TeX format) output.
This is correct. But does it matter to the original poster?
I suggest using LaTeX for educational papers and documentation. You
can still produce (ugly) HTML output if desired, but the native
DVI/PDF/PostScript output looks professional and is much easier to read.
This advice may be good; unfortunately, the information Robert
provided is not enough to really help with the decision.
I don't know, so I mention that independently.
Post by Torsten Bronger
They are skipped silently.
No, they are passed to the output verbatim, which makes it at least
understandable. For very simple formulas it doesn't even matter.
That's true, I'm sorry. But as soon as you have something like \sin(x),
you're getting into trouble. Including formulae is also more difficult
in Texinfo. I don't know whether he'd ever need that feature, but he's
talking about teaching, so I quietly assume so.
Post by Torsten Bronger
By the way: LaTeX is a lot simpler to use. As soon as you're going
to use international characters in Texinfo, you'll know what I mean.
At least the Latin-1 set works with Texinfo. I wrote in German and
was able to input the umlauts directly.
For whatever reason, I had to filter my umlauts through sed to produce
the appropriate Texinfo tags like @"a or @"o -- and yes, I've used
@documentencoding, and yes, my editor was set to produce latin-1 (which
worked correctly).
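Such a filter can be sketched roughly like this, assuming only the common German special characters need handling (the file names are hypothetical; the accent commands @"a, @"o, @"u and @ss{} are standard Texinfo):

```shell
# Minimal sketch: rewrite German umlauts and sharp s as Texinfo accent
# commands. Extend the substitution list for other characters as needed.
sed -e 's/ä/@"a/g' -e 's/ö/@"o/g' -e 's/ü/@"u/g' \
    -e 's/Ä/@"A/g' -e 's/Ö/@"O/g' -e 's/Ü/@"U/g' \
    -e 's/ß/@ss{}/g' \
    manual.texi.in > manual.texi
```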
Post by Torsten Bronger
The same thing holds for formulae, tabular data as well as tables of
contents and indices.
For formulae and (complex!) tables this is true.
It is also true for the TOC if you'd like to use accents. Indices in
Texinfo are okay, but LaTeX provides much more flexibility.
Post by Torsten Bronger
But if you're going to write package manuals only, then Texinfo
might already suit you. It's good for writing short usage manuals
and references, but then I'd write a separate in-depth manual with
LaTeX.
Well, it's not about length or depth of the manual you're writing but
about the complexity of certain parts of it. If you really rely on
complex tables and formulae, Texinfo is not (yet) suitable for the
job. On the other hand, the big advantages of Texinfo are that it
forces you to keep things simple, and that conversions to HTML,
DocBook, and LaTeX are guaranteed to work.
Depth is in some way related to complexity. Regarding HTML and DocBook
output, you're right (unfortunately). Keeping things simple is not
always the right thing to do; sometimes you just need complex tables.
What I really dislike about Texinfo is that if the output in one format
looks perfect, it doesn't mean that it's even readable in another. I
had to use @iftex, @ifhtml and @ifinfo quite often. LaTeX guarantees
that the output looks good _and_ is readable, giving more priority to
the latter.
Regards.