In the late 1970s, Donald Knuth was revising the second volume of his multivolume opus, The Art of Computer Programming. When he got the galleys, he looked at them and said (approximately) "bleccch!" He had just received his first samples of the new computer typesetting, and its quality was so far below that of the first edition of Volume 2 that he couldn't stand it. He thought for a while and said (approximately), "I'm a computer scientist; I ought to be able to do something about this." So he set out to learn what the traditional rules for typesetting math were, what constituted good typography, and (because the fonts of symbols that he needed really didn't exist) as much as he could about type design. He figured this would take about six months. (Ultimately, it took nearly ten years, but along the way he had lots of help from some people who should be well known to readers of this list: Hermann Zapf, Chuck Bigelow, Kris Holmes, Matthew Carter and Richard Southall are all acknowledged in the introduction to Volume E, "Computer Modern Typefaces", of the Addison-Wesley "Computers & Typesetting" book series.)
A year or so after he started, Knuth was invited by the American Mathematical Society (AMS) to present one of the principal invited lectures at their annual meeting. This honor is awarded to significant academic researchers who (mostly) were trained as mathematicians but who have done most of their work in not strictly mathematical areas (there are a number of physicists, astronomers, etc., in the annals of this lecture series, as well as computer scientists); the lecturer can speak on any topic s/he wishes, and Knuth decided to speak on computer science in the service of mathematics. The topic he presented was his new work on TeX (for typesetting) and Metafont (for developing fonts for use with TeX). He presented not only the roots of the typographical concepts, but also the mathematical notions (e.g., the use of Bézier splines to shape glyphs) on which these two programs are based. The programs sounded like they were just about ready to use, and quite a few mathematicians, including the chair of the Society's board of trustees, decided to take a closer look. As it turned out, TeX was still a lot closer to a research project than to an industrial-strength product, but there were certain attractive features:
To produce his own books, Knuth had to tackle all the paraphernalia of academic publishing: footnotes, floating insertions (figures and tables), etc., etc. As a mathematician/computer scientist, he developed an input language that makes sense to other scientists and that, for math expressions, is quite similar to how one mathematician would recite a string of notation to another over the telephone. The TeX program is an interpreter: it accepts mixed commands and data. The command language is very low level (skip so much space, change to font X, set this string of words in paragraph form, ...), but it is amenable to being enhanced by defining macro commands to build a very high-level user interface (this is the title, this is the author, use them to set a title page according to AMS specifications); a rough sketch of what such input looks like is given below. The handling of footnotes and similar structures is so well behaved that "style files" have been created for TeX to process critical editions and legal tomes. It is also (after some highly useful enhancements in about 1990) able to handle the composition of many different languages according to their own traditional rules, and is for this reason (as well as for the low cost) quite widely used in eastern Europe.
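As a rough illustration (not drawn from any particular publisher's style file), a fragment of plain TeX input might look like the following; \title here is a hypothetical macro of the kind a style file would define, and the math is typed much the way it would be read aloud over the phone:

    \def\title#1{\centerline{\bf #1}\bigskip}  % hypothetical high-level macro
    \title{On the Convergence of a Series}
    The series $\sum_{n=1}^\infty {1\over n^2}$ converges to ${\pi^2\over 6}$,
    and the input for it is spoken almost exactly as it is typed.
    \bye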
Some of the algorithms in TeX have not been bettered in any of the composition tools devised in the years since TeX appeared. The most obvious example is paragraph breaking: text is considered a full paragraph at a time, not line by line; this is the basic starting algorithm used in the HZ-program by Peter Karow (and named for Hermann Zapf, who developed the special fonts that program needs to improve on the basics).
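To give a concrete sense of what "a full paragraph at a time" means, here is a minimal sketch, assuming plain TeX: the parameters below steer the single optimization pass that chooses all of a paragraph's line breaks together, so adjusting one of them can shift breaks far from the line that looked troublesome.

    \tolerance=2000        % how bad a line may be before TeX starts complaining
    \emergencystretch=1em  % extra stretchability allowed in a last-resort third pass
    \looseness=1           % prefer a paragraph one line longer than the optimum, if feasible
    ... text of the paragraph ... \par   % breaks are decided only when the paragraph ends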
In summary, TeX is a special-purpose programming language that is the centerpiece of a typesetting system that produces publication-quality mathematics (and surrounding text), available to and usable by individuals.
TeX is the composition engine (strictly speaking, an interpreter, not a compiler). It is essentially a batch engine, although a limited amount of interactivity is possible when processing a file, to allow error recovery and diagnosis. Thus it *is* a page layout application.
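For a sense of what that limited interactivity looks like, TeX's interaction modes (real plain TeX primitives; only the most recently set one is in effect) are the relevant controls:

    \errorstopmode  % the default: stop at every error and ask the user what to do
    \scrollmode     % scroll past errors, but still stop if an input file is missing
    \nonstopmode    % never stop, but keep showing messages on the terminal
    \batchmode      % never stop, and send messages only to the log file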
PostScript is one of the most popular "final" output forms for TeX; in this respect, TeX is comparable to Quark, for example.
One of the major areas where TeX will hold its own over the next few years is as a "back end" to SGML and XML systems, where no human intervention is expected between data input (structured, not WYSIWYG) and removing the output from the printer or viewing it on a screen. Granted, this isn't "creative" in the sense most often discussed, but it's still important to the readability and usefulness of such documents that care be taken with the design and typography, and the flexibility and programmability of TeX make that possible.
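Purely as a hypothetical sketch (no particular converter is being described), an XML-to-TeX back end might translate <section><title>Results</title> and the paragraphs that follow it into calls on macros like these, so that every design decision lives in the TeX layer rather than in the data:

    \def\sectiontitle#1{\bigskip\noindent{\bf #1}\par\nobreak\smallskip}
    \def\xmlpara#1{#1\par}
    % what the (hypothetical) converter would emit:
    \sectiontitle{Results}
    \xmlpara{Text of the first paragraph of the section ...}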
As an aside, TeX can be the cause of religious wars. For those of us who need the capabilities of TeX for the production of books and journal articles in research mathematics, no other current composition tool, proprietary or otherwise, can handle the material, produce high-quality, publication-worthy output, and simultaneously be usable by the writer of the document. We'll be glad to provide test material to anyone who wants to prove us wrong. (It hasn't happened yet; the audience is much too small, and the problem too complex, for a Microsoft or Quark or Adobe to be interested.) On the other hand, if you want a tool for producing a newspaper, a novel, a slick advertisement, or a letter to Aunt Henrietta, then unless you're already using TeX for something else (say, your dissertation), it is not the tool for you.
If you become known for this sort of attention to accuracy, and for rewarding the first report of each error, we suspect it will attract readers to the material itself, and not just to the hunt for errors.