Friday, 24 February 2012

Building a local LaTeX tree

If, like me, your sysadmins have you suffering the burden of an archaic OS then, like me, you may also keep a beady eye out for ways of circumnavigating the virtual obstruction. If a by-product is that you suffer old LaTeX packages, there's actually an easy way of installing your own, up-to-date versions. Basically, you can build a local LaTeX package tree, which is searched before the default location. Below, I describe how to do this in Linux. This worked perfectly for me so I can't offer any help if it goes wrong or if you're using another OS. Fortunately this is widely covered on the web so Googling something like "local texmf tree" should net you something useful. There's a lot of information in the LaTeX Wikibook.

In your home folder, create the folder texmf/tex/latex/. Installing a given package then boils down to copying its style file and any associated files into a new subfolder of ~/texmf/tex/latex/. After each new package is installed, go back to your home folder and run texhash. Don't worry if it complains about not being able to access the system-wide base folders. (On my system, they involve something like /usr/share/texmf.)
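
In shell terms, the whole setup is only a handful of commands. A rough sketch (somepackage and its .sty file are placeholders for whatever you're actually installing):

# create the local tree, which is searched before the system-wide one
mkdir -p ~/texmf/tex/latex

# give each package its own subfolder and copy its files there
mkdir -p ~/texmf/tex/latex/somepackage
cp somepackage.sty ~/texmf/tex/latex/somepackage/

# refresh the filename database so (pdf)latex can find the new files
cd ~ && texhash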

Installing packages comes in three basic flavours. Some packages, like quotchap, are just a single style file.  If you have the relevant .sty file to hand, you can copy that into a new subfolder, run texhash and then start using it. More complicated packages, like microtype, require that you download the .dtx and .ins files and run

latex microtype.ins

to create the package files, which must then be copied into the relevant subfolder in the local texmf tree. Finally, packages that contain a large number of smaller units, like oberdiek, offer an archive that just needs to be extracted into texmf.
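
As a rough end-to-end sketch of that middle flavour, here's roughly what the microtype case amounts to (the exact set of generated files may differ, so check what the run actually produces before copying):

# unpack the package files from the .dtx/.ins pair
latex microtype.ins

# copy the generated style and config files into the local tree
mkdir -p ~/texmf/tex/latex/microtype
cp microtype.sty *.cfg ~/texmf/tex/latex/microtype/

# refresh the filename database
cd ~ && texhash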

I usually found that the README files gave me the necessary information to copy things to the right place. Failing that, you can usually infer the right location from the folder on the CTAN servers.

There are two caveats I'll mention. First, expect some dependencies to crop up when updating very old packages. For example, when I tried my local installation of microtype, I ended up having to update everything in the oberdiek bundle too. Second, the status of packages that are listed under tex/generic is unclear. I found I had to copy them to subfolders in tex/latex for them to be detected and work properly. (I believe they live under tex/generic because they need to be accessible to plain TeX too.)
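
For the tex/generic case, the workaround is nothing more than a copy into the latex branch of the local tree. A sketch, using ifpdf.sty from the oberdiek bundle as an (assumed) example:

# files shipped under tex/generic weren't picked up from there on my setup,
# so copy them into the latex branch of the local tree as well
mkdir -p ~/texmf/tex/latex/oberdiek
cp ifpdf.sty ~/texmf/tex/latex/oberdiek/

cd ~ && texhash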

Problems? Improvements? Extra tips? Found this useful? Let me know in the comments.

Monday, 20 February 2012

Installing Windows 7 OEM without disks (2)

I previously posted a half-useful rant that claimed to resolve the issue of installing Windows 7 from scratch with an OEM product key. My previous solution, which boiled down to activating via phone, stopped working and I've since found a better one. It sounds stupid, actually. Basically, it turns out my laptop has a sticker with a W7 product key on it and, with that key, it activated perfectly normally by connecting to Microsoft over the internet. So if you're trying to re-install W7 from the official ISO images, take a good look around your laptop for a sticker with a product key, particularly if your copy of Windows appears to be registered with the OEM key.

There are a few oddities about this sticker, though. First, the sticker's product key isn't the same as the OEM product key. That is, it isn't the same as the Dell product key that you can find online, or the product key that the Windows registry held when I first turned on the laptop or restored the factory settings. Second, no-one at Dell technical support thought to mention the sticker when I phoned them and told them that I was re-installing W7 from scratch and the OEM key wasn't working.

Third, the sticker is under the battery.

Looking for your product key? Don't expect it to be easy. Mine is under the battery. (It's the blurred Microsoft tag right of the centre.)
Yes, that's right, I had to remove the battery to find it. I only looked there because I happened to read the comments on a Lifehacker post with the links to the W7 ISOs.

So when I say take a good look around your laptop, I mean take a good look around your laptop. Let me know in the comments if you've also found a secret sticker...

Friday, 17 February 2012

Good reasons to use PDF(La)TeX

If you're a long-time LaTeX user who still compiles to PDF by converting DVI to PostScript to PDF, you've probably asked yourself why you bother. After all, who even uses PostScript these days? I suspect the answer is no-one. Or at least, no-one who can't also use PDF. Here are a few more reasons to start compiling straight to PDF with PDF(La)TeX. If your main objection to PDF is that you mostly produce EPS plots, there are clean ways to convert EPS figures near the end of the post.

It's simpler. That is, instead of stringing together enough commands to reach the moon, as in,

latex thesis ; dvips thesis.dvi -o ; ps2pdf thesis.ps

you cut it down to

pdflatex thesis

Done.

PDF files are smaller than PS files. Like the point above, it's only a small thing, but worth mentioning. Most of the difference comes from PDF being a binary format whereas PS can be read as plain text; PDF also has built-in compression.

PDF looks better. In particular, it generally handles fonts better, which I believe is related to the way the fonts are stored in the PDF file. It also allows you to use the microtype package, which gives you the awesomeness on the right rather than the ugly sister of the typographic world on the left.
The effect of microtype. The left sample is rendered without microtype; the right sample with. (texblog.net)
The microtype package makes the most of advanced typographic features like character protrusion, font expansion and fine-tuned kerning. So tack \usepackage{microtype} into your next preamble. Even if you write garbage, it'll be beautiful garbage.
I have never been as self-conscious about my handwriting as when I was inking in the caption for this comic. (xkcd.com)
You get hyperlinking and clickable contents in the document. Okay, you can also get this when compiling through other formats, but it works better when compiling straight to PDF. All you need is \usepackage{hyperref} in your preamble (see the sketch below). There are plenty of options to set, so explore the documentation.
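
As a minimal sketch of the sort of preamble I mean (the document class and the link-colour options are just illustrative choices, not requirements):

\documentclass{article}

% subtle typographic polish: character protrusion and font expansion
\usepackage{microtype}

% hyperlinked cross-references, citations and a clickable table of contents;
% colorlinks colours the link text instead of drawing boxes around it
\usepackage[colorlinks=true, linkcolor=blue, citecolor=blue]{hyperref}

\begin{document}
\tableofcontents
% ...
\end{document}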

Your figures are automatically compressed. Now, I need to be careful with this one because it can be a downer if EPS figures are compressed with the low default quality factor. A lot of people still make use of EPS figures (especially in astronomy) so this might be why you haven't shifted to PDF(La)TeX before.

The easiest way around this is to export your plots straight to PDF. I make my publication plots with Veusz, which has a PDF export option. Actually, Veusz is generally awesome and I highly recommend it for high-quality final plotting. I believe that MATLAB also exports to PDF, but I'm not sure about gnuplot, which is just another reason I only use gnuplot for day-to-day purposes.

If you must make EPS plots, you can dictate how the compression is done. Open the EPS file in a text editor and add one of the following snippets at the end of the preamble, i.e. the bit commented out with leading % symbols.

For lossless FlateEncode, add

systemdict /setdistillerparams known {
<< /AutoFilterColorImages false /ColorImageFilter /FlateEncode >> setdistillerparams
} if

Alternatively, you can stick with lossy DCTEncode but force very high quality (a lower /QFactor means higher quality), in which case you should add

systemdict /setdistillerparams known {
<< /ColorACSImageDict << /QFactor 0.15 /Blend 1 /ColorTransform 1 /HSamples [1 1 1 1] /VSamples [1 1 1 1] >>
    >> setdistillerparams
} if

When you convert the EPS using, say, ps2pdf, there should be little or no loss of quality. Thanks to one Gary Steele for these tips. I haven't tested the two approaches carefully, but experiment to see what works for you. Using FlateEncode appeared to be lossless but still occasionally achieved some huge compression ratios.
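
If you'd rather not edit the EPS files by hand, my understanding is that the same distiller parameters can be passed straight to ps2pdf on the command line (figure.eps is just a placeholder name); something like:

# force lossless FlateEncode for colour images during the conversion;
# -dEPSCrop keeps the EPS bounding box instead of padding to a full page
ps2pdf -dEPSCrop -dAutoFilterColorImages=false -dColorImageFilter=/FlateEncode figure.eps figure.pdf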

So that's why you should switch to PDF(La)TeX. When my thesis is done, you can bet there won't be a PS version!