Headless / commandline Bittorrent

I made noises yesterday that I might learn about BitTorrent. So I tried. (It’s an interesting protocol from the point of view of needing clients to enforce penalties against peers who refuse to upload.) Here’s the paradigm I wanted it to fit into: at, the command scheduler. 95% of my readership will know (intimately) how residential broadband works in Australia, but for those who don’t, it is typical not to have unlimited downloads. On a good plan just above entry level, you might have a limit of about 4GB to 10GB a month. (The entry-level plans tend to have a limit of about 512MB or 1GB for only about $10 to $15 less. This is like selling small amounts of an addictive substance. Someone will eventually tell you about Google Earth.) However, it is also very typical to have an off-peak period with a higher, perhaps even unstated, bandwidth cap. (Mine is 48GB a month, in the midnight to noon period only.)

Since I run a headless server anyway (for mail and music), I vastly prefer to schedule my downloads for when I’m sleeping and bandwidth is cheaper. When I’m going to download something available on the web, I run at midnight and then give it the command wget --limit-rate=something -q -c [url] (-c because I’ve usually tested about the first 100K of the download already). In the morning, assuming I’m not downloading an album from a very well known band, my file is there. I therefore wanted to do the same with BitTorrent.
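For the record, that web dance looks something like this at the at prompt, ending the session with Ctrl-D; the rate limit and URL here are placeholders, not anything I actually download:

$ at midnight
at> wget --limit-rate=30k -q -c http://example.com/some-large-file.iso
at> <EOT>

at then mails me wget’s output, if any, once the job has run.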

I’ll cut to the chase here: the solution is Transmission’s command-line tools, particularly transmission-daemon and transmission-remote. The daemon controls all the torrents, and, usefully for my limited download window, they can be stopped and started from the remote command and therefore from cron. The only catch is that these tools seem to have only very recently matured, as in, they don’t exist in Ubuntu 7.10/Gutsy. (It has a transmission-cli package, but the daemon isn’t in it.) The transmission-cli package from Hardy will not install on Gutsy either, but you can backport it without any hassles, assuming you know how to get hold of and build Ubuntu (Debian-style) source packages, which, granted, probably doesn’t apply to 95% of my readership (although it might apply to a majority once we count Planet Linux Australia).
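To give the flavour of the cron half, here’s a sketch of crontab entries matching my off-peak window (localhost:9091 is transmission-daemon’s default RPC address, and the exact option spellings vary between transmission-remote versions, so check your man page rather than trusting this outline):

# start all torrents at midnight, stop them again at noon
0 0 * * *   transmission-remote localhost:9091 --torrent all --start
0 12 * * *  transmission-remote localhost:9091 --torrent all --stop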

I thought this was worth sharing though, after I spent hours mucking around with BitTorrent (the Python client, not the rebranded µTorrent) and BitTornado, neither of which, even in the headless versions, has much support for such selfish notions as not wanting to leave it running until it’s good and done (you can send SIGTERM and they do resume cleanly on the next invocation, but that’s not exactly champion-of-the-world design), and neither of which puts up with this old-fashioned nonsense of wanting to run a process without a controlling tty. BitTorrent doesn’t even have an inkling that you might want to limit download speeds to anything less than maximum (perhaps Bram Cohen has never lived in a house with anyone else who wanted to use his net connection). I ended up looking at Transmission because the GUI version (also apparently very nice) is now Ubuntu’s default BitTorrent client. (Clutch is allegedly a nice web interface for Transmission too, but I haven’t looked at it at all.)

Edit: this post is not a bleg. The last paragraph describes a problem, yes, but this post is about how I’ve solved that problem by discovering Transmission. Please, there’s no need to email me about getting BitTorrent, BitTornado or rTorrent to work in screen or similar: Transmission doesn’t need screen and is very cron-friendly. This post is intended as a positive review of Transmission, not a request for help. End edit.

Headless / commandline Bittorrent by Mary Gardiner is licensed under a Creative Commons Attribution 4.0 International License.

Personal computing history

I’ve enjoyed reading the LWN.net ten-year timeline (1, 2, 3, 4, 5, 6 and perhaps more to come), enough that I wanted to do a short wrap-up of my own computer story. Apparently it’s going around, but I didn’t realise that when I started this.

Approx 1990 My great-uncle died and my great-aunt offered us his computer. I was extremely excited, and was assured it was portable, very expensive and top-notch. This didn’t turn out to be exactly true: it had been very expensive and top-notch when he bought it. I never worked out when that was, but it was some time in the past. The machine ran MS DOS 2-ish, had no hard drive, and had two 5.25″ floppy drives. One floppy held the entire operating system.

It was portable in the sense that it was one of the all-in-one designs that Compaq stuck with for so long. It was about the same size as a medium-sized hard suitcase and had a handle so that it could be upended and hauled around.

I learned some variety of BASIC (BASICA, I think) and spent many hours typing out programs that greeted me by name and let me add numbers together. I used it for school assignments for about six months before I discovered that it booted with the Insert key toggled off, so if I made a mistake it was just like a typewriter: I had to re-type everything from the mistake onwards.

I soon ran into what turns out to have been a semi-imaginary bugbear of at least the next ten years of my life: I had no ideas for what to program. I realised I could learn from the program that displayed a bee flying around and played ‘Flight of the Bumblebee’, but I couldn’t summon the energy to pick apart 2000! lines! of code!

1993–1998 My parents got a new computer at the end of 1993, another Compaq as it happened. 486, and I believe 4MB of RAM and a 100MB hard drive. This was rather underpowered for the time, I think, but not radically so. I became something of a power user of word processors and the like. My parents were convinced that I was on the verge of destroying their machine. The poverty of our flat file ‘database’ application bugged me no end. When I came across relational databases much later I knew exactly what they were for. Towards the end of high school (I jumped a couple of years in computing studies and took my final exams in Year 10, so I had some exposure to the wider computer culture, however distorted) I desperately wanted to learn C, but what I was going to do with it I didn’t know.

In 1996 I got a copy of Fractint; I have no idea where from. Most likely the World Wide Web, which I used for the first time that same year. (Someone in my computer class logged on for me, went to Yahoo, typed in ‘girls’ and started surfing for porn. 1996 was a big year in parental outrage at our school.) I even signed up for a Hotmail account. Anyway, not only did I spend hours and hours choosing just the right colours for my fractals, Fractint was my first exposure to the idea of collective, free software development, and I liked it. I read an article about Linux around the same time and liked the sound of that too, but I fundamentally had no idea what it was on about.

In 1999 I started undergrad and was immediately delighted with UNIX, which I thought of as super-powered MS-DOS with better doskey. That said, it was at least a year before I learned to drive tar from the man page, and I think at least two years before I learned that see also crontab(5) means typing the command man 5 crontab (now probably my most frequently used man page). I was unimpressed with the university’s webmail system for a long time (I don’t recall why, but undoubtedly it sucked) and read my mail by telnetting to port 110 on the relevant machine. Very recently, someone suggested to me that doing that is an urban myth equivalent to whistling 9600 baud. No, no it isn’t. But it sucks for attachments.

Andrew and I started going out in August of 1999 and soon after that he bought his own desktop and installed Debian on it. He was pleased with my taste when I chose the username ‘mary’. The relationship was young enough that I still deeply cared about his opinions on questions like what is a tasteful user name? (I still use ‘mary’, it’s short. And tasteful.) At the end of the year he moved out of college, and an rm -rf accident on my part and reluctance to download a year’s worth of email on his part mean that we no longer have copies of about a year’s worth of emails to each other.

I did learn C in 1999, although I somehow missed the square-bracket indexing syntax (p[5]) and was doing a lot of my array accesses with raw pointer arithmetic, like this: *(p + 5).

I had a job as a programmer in 2000. That didn’t work out so well, but I did get enough money for my first computer. Andrew downloaded SuSE for me because he wanted to see what it was like. Bad, that’s what, because it only had mutt 0.2 packages. I had to reinstall it and Windows several times each to get them on. (I was playing a lot of Baldur’s Gate at the time; I believe the Infinity engine still sucks under WINE in 2008.) I wrestled with Exim’s documentation for the best part of a day to get it to act as a smarthost, because ‘smarthost’ has nothing to do with the term ‘mail relay’. At the end of the year I registered puzzling.org. It was hosted on a FreeBSD box for a while, using qmail. (This is not why Andrew and I have a lot of user-suffix@puzzling.org addresses rather than user+suffix. That is because we were later hosted on Crossfire’s machine for a while, and another user on that machine had used qmail style suffixes and asked for them to be set in Postfix. No one has ever let me finish telling this story until now.)

At the beginning of 2001 Andrew and I skipped the second half of a holiday at the beach for the first linux.conf.au. (Not counting CALU, which I don’t consider the ‘zeroth’ linux.conf.au because programmers count offsets from zero, but no one in their right mind counts objects from zero. Miss Manners would agree, I know.) This was an experience that in memory has not been surpassed in terms of pure mystical wonder. Especially Tridge’s hacking the TiVo talk. And Martin Pool’s rsync proxy thing. I think there is something irretrievably lost when you get better understanding of technology. No conference has been the same again. (And I don’t know that since taking up diving I’d be prepared to leave a beach holiday for any conference.) Soon afterwards, although unrelatedly, Andrew and I were living in a big sharehouse with Nicholas, Catie and Mark and Nicholas, I think, prodded me into installing Debian. And then was alarmed when the first apt-get command I ran was to install nmap. At the time, this was how I checked my machines for unnecessary services.

I tutored computer science from 2001–2003, and managed to always be a better programmer than my students. I suspect this was when I finally learned to program respectably. puzzling.org left Crossfire’s hosting after Andrew quit Weblink, was in shared hosting for a while at csoft and then went to various virtual machines and ended up at Linode. I managed to do all this without a complete meltdown a la the Exim smarthost thing. That was saved for trying to get pppoe working based on knowledge equivalent to an oily rag. Put the modem in bridge mode and all will be well.

This is making me feel as if everything I know about computers I had learned by 2003, and actually that’s largely true. I haven’t picked up a new programming language since then other than for specific projects (Perl in 2004, mainly). I switched to Ubuntu from Debian because Andrew was Canonical employee number 10 or 15 or so. I learned how DNS works around about the same time (from the point of view of configuring BIND, I can’t, say, parse the wire format). Things I’ve acquired since then belong to different stories: relationships, diving, travel, language, computational linguistics, experimental methodology, bits and pieces of statistics, some vastly improved life management skills around budgeting, a certain amount of peace that’s come from somewhere I don’t know. Writing. I’m still programming and doing hobbyist sysadmin, but success no longer comes with wow, I can really do this, finally, at last. I expect my stuff to work these days and I can even usually tell you how long it will take. (I remember Andrew claiming to have this skill as early as 1999.) New toolkits are no longer headline news, but if there was one thing I’d like to add to my store of skills it would be the arrogance of the gods that many programmers have as their birthright. Any problem a mere mortal can pose, they can solve. It sounds dreadful, but flip it over. See? It’s like flying.

WordPress locked down with HTTP Basic Auth

I run several WordPress sites for other people (this isn’t one of them). A couple of them are private: no password for the site, can’t read the site. For years I’ve had an unwieldy situation in which the lockdown was implemented with HTTP Basic Auth configured in Apache, and the users separately log into the site in order to post.

I used HTTP Basic Auth for locking it down even after I discovered Authenticated WordPress (requires a login as a WordPress user before you can see anything), partly because Basic Auth is accessible to RSS readers. Many RSS readers (and assorted web-fetching tools) can speak HTTP Basic Auth. Few can log themselves into WordPress, although I wouldn’t be surprised to find an exception or three. Eventually, though, different search terms led me to the HTTP Authentication plugin, and it turns out the two play nicely together. If you install them both, the site requires HTTP authentication in order to access any part of it, and any person who has successfully authenticated is logged into WordPress too.
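For completeness, the Apache half of the lockdown is bog-standard Basic Auth; a minimal sketch, with made-up paths and realm name rather than my real configuration:

<Directory /var/www/privateblog>
    AuthType Basic
    AuthName "Private blog"
    AuthUserFile /etc/apache2/privateblog.htpasswd
    Require valid-user
</Directory>

(The htpasswd file, populated with htpasswd -c /etc/apache2/privateblog.htpasswd username, is then the ‘external authentication source’ mentioned in the niggles below.)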

A couple of niggles:

  1. The HTTP Authentication plugin requires that you have two matching lists of user names (well, actually one can be a proper subset of the other if you like, but users who aren’t in both can’t authenticate): the WordPress DB needs to have a registered user, and the external authentication source needs to have an entry for the same user. Actually, I tell a lie: there is an option to automatically create a WordPress account for a user who shows up as successfully authenticated with an unknown user name.
  2. The HTTP Authentication documentation is slightly wrong: you don’t need the nickname to match the external user, you need the username to match the external user (which is the sensible way anyway).

WordPress locked down with HTTP Basic Auth by Mary Gardiner is licensed under a Creative Commons Attribution 4.0 International License.

Attaching messages to outgoing mail in mutt

When I want to forward an email to someone as an attachment (usually because I want them to be able to reply to it without snipping gunky forward ‘headers’ out of the body, and because it preserves the Message-ID and such), I can’t always use forward-as-attachment: sometimes because I’m replying rather than starting a new mail, sometimes because I’m forwarding messages from multiple folders.

Up until today, I’ve resorted to all kinds of tricks to add other messages as attachments to mutt messages. One old favourite was copying them (shift+c) to a new maildir, attaching the individual files, and manually setting their MIME type to message/rfc822. This does actually work, but there is an easier way: the attach-message function, bound to shift+a by default.

On the screen where you normally add attachments, press shift+a. Navigate to the folder containing the messages you wish to attach (if they’re in different folders, just do this once per folder). Tag all the messages you want to attach (the default keybinding for tagging a message is ‘t’). Quit from the folder browser (‘q’). All tagged messages will be attached to the outgoing mail.

Attaching messages to outgoing mail in mutt by Mary Gardiner is licensed under a Creative Commons Attribution 4.0 International License.

Sending messages one-by-one with mutt

Here’s a feature of my mail client, mutt, that I wasn’t previously aware of: the ability to emulate the mail command if you invoke it with mutt -x.

This is my use-case: every so often, I want to email a bunch of people, typically for some kind of invitation thing. But I’d like to email them the same message one by one.

I don’t want to Cc them all, because I know a whole lot of people who are addicted to group reply/reply to all, and also because Gmail itself is addicted to this: it interprets mutt’s Mail-Followup-To header as Reply-To. So if I include an email list in the Ccs, mutt sets Mail-Followup-To to that email list plus all other recipients minus myself (it’s been told I’m subscribed to the list), and all the Gmail users then reply to the entire Cc list minus myself, which is exactly the reverse of what I want. And lastly, occasionally I don’t want to give them a complete list of exactly who is and isn’t privy to whatever is in this particular email.

The standard solution is then to Bcc them. But most of my social mailing lists don’t accept Bccs, some of my friends also don’t accept them, and I also have trouble remembering who I sent the mail to.

After that, the typical thing to do (on the UNIX-like commandline anyway) is something like this:

  1. Store the text of the message in messagebody.txt
  2. Store the recipient list in addresses.txt
  3. Run a script that goes pretty much like this:
    for address in `cat addresses.txt`
    do
    mail -s "Some subject line" "$address" < messagebody.txt
    done

But then the problem is that I don’t have the usual copy of the outgoing mail in my mutt outbox, because I sent it with mail, not with mutt. However, just now I checked the mutt man page, and saw this:

 -x     Emulate the mailx compose mode.

So, that means I can do this, or something equivalent to this, and get exactly the behaviour I want (mails sent one-at-a-time with the recipient’s email in the To: field and a copy left in my mutt outbox):

for address in `cat addresses.txt`
do
mutt -x -s "Some subject line" "$address" < messagebody.txt
done
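One refinement, for what it’s worth: if addresses.txt might contain blank lines, a while read loop is a little more robust than the backticks. This is a sketch only, with the same placeholder file names and subject:

while read -r address
do
    [ -n "$address" ] || continue    # skip blank lines
    mutt -x -s "Some subject line" "$address" < messagebody.txt
done < addresses.txt

mutt reads the message body from its own redirect, so the loop’s stdin (addresses.txt) is left alone.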

Sending messages one-by-one with mutt by Mary Gardiner is licensed under a Creative Commons Attribution 4.0 International License.