Possible structures for a technical talk

Edited version of a private email I sent about linux.conf.au 2010 preparation, but not conference-specific. The email discussion led to me enumerating the kinds of talks that I think you can give about technical subjects at a linux.conf.au-style (non-academic, developer-oriented) conference.

Recitation of facts; or the architecture diagram talk

In this kind of talk, the speaker walks the audience through, say, the architecture of a project. The talk contents are dictated by the architecture. Five minutes on the input module, five minutes on preprocessing, five minutes on transforming into a new set of matrix bases, five minutes on postprocessing, five minutes on sanity checking, five minutes on rendering.

This is the most obvious structure for a technical talk. It’s also the driest structure. It is useful mostly for audience members who want to work on or with the project code, which is for most talks a small fraction of the audience. It best suits very major projects with years of cruft in their architecture and a wide potential userbase.

Insane hack adventure; or “hacking the Tivo”, Andrew Tridgell, linux.conf.au 2001

In this kind of talk, the speaker usually talks about a project he or she personally undertook and structures it as a battle against the odds: “So of course Y is the most obvious solution, although not trivial. So I tried Y for a couple of months. And then it turns out that the X protocol absolutely makes Y impossible for these reasons [dramatic pause] and at this point any normal human being would have given up, but I retired to a monastery for six months and returned refreshed, and a lotus blossom I saw on my return inspired me to try Q…”

This is probably the most effective structure for a technical talk and it’s surprisingly widely applicable (you can frame a lot of things in terms of here’s the current problem, here’s what you think the obvious solution is, here’s what happened when I tried it). The biggest limitation is that it’s very hard to do about work you didn’t do yourself, so it’s not available to all speakers. And you can misjudge it: “so of course I tried Y,” says the speaker; “uh, Y is obviously dumb, why didn’t you go straight to Q?” asks the audience (at some conferences, they might well ask it out loud, repeatedly). It doesn’t work for speakers who don’t have expertise (at least narrowly on this one topic) beyond that of the audience.

Demonstration of surprising ease

In this talk, the speaker proposes a task which sounds too formidable to complete in the talk slot. (Pretty much any task, for some talk slots.) They then proceed to show how it can be done despite the odds using their tool. Edward Hervey edited a video in PiTiVi at linux.conf.au 2008. At OSDC 2008 Thomas Lee added an ‘unless’ keyword to the Python language.

This structure can be fun but it has a lot of pitfalls. Edward was continually battling to type while dealing with a handheld microphone; speakers should recruit an assistant for these talks. The Python keyword talk was kind of fun, but it was also hard to follow: “and now I am going over to this totally other tree and adding several lines to this other file at lines 1099 and 1346, as is obviously required”. He did not succeed in making me think I could easily edit the Python language. In all cases, it needs a decent amount of rehearsal: even more so than other talks.

Dropping of wisdoms

These talks are generally (at linux.conf.au anyway) talks about the social/community/artful aspects of coding. For example, Jonathan Corbet’s talk at linux.conf.au 2009 about how kernel hacking actually works in terms of the trees and maintainers and so on. (One of Linus Torvalds’s bigger 2009 contributions was being in the audience for that talk and adding occasional commentary on his various prejudices.) Andrew Bennetts has had some good luck with a series of talks that began at OSDC 2008, in which he essentially lists every tool he can think of to help you make your Python programs faster. I didn’t see it, but I would guess Paul Fenwick’s talk about awesome things you’ve never heard of in Perl was the linux.conf.au equivalent in 2009.

These are useful talks; the main pitfall is that they require considerable expertise beyond the bulk of the audience’s. Otherwise someone is just listing a bunch of stuff you do every day back to you. They also need to be about tasks or tools that a lot of people actually use or want to use. No one wants the hidden tricks to using libnousersexceptme.

Creative Commons License
Possible structures for a technical talk by Mary Gardiner is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Ada Lovelace Day wrap 2: Karen Spärck Jones elsewhere

Yes, this does mean that a third post in this series is coming, but I wanted to point to some other profiles of Karen Spärck Jones, aside from my brief one. At least at the present time, she’s among the most-profiled Ada Lovelace Day subjects. I was really pleased to learn more about this inspiring scientist.

Martin Belam has a long profile quoting extensively from Spärck Jones’s interviews and speeches, focussing both on her own career progression (she worked with Margaret Masterman at the Cambridge Language Research Unit) and on her reflections on women in computing. “You have no conception of how narrow the career options were [for women],” is one of Belam’s quotes. Another of her stories reminds me of more recent stories Pia Waugh has told me about parental resistance playing a role in girls not choosing computing careers (these days it’s apparently the perceived low earnings and limited career prospects of programmers from the point of view of ambitious parents, so at least something has changed):

We were trying to get at girls in schools [to take up computing] and we knew we had to get to the teachers first. We found that the spread of computing in the administrative and secretarial world has completely devalued it. When one of the teachers suggested to the parents of one girl that perhaps she should go into computing the parents said: ‘Oh we don’t want Samantha just to be a secretary’. That’s nothing to do with nerdiness, but the fact that it’s such a routine thing.

Bill Thompson was a student of Spärck Jones’s, and writes about her influence on him as a fellow philosopher turned computer scientist. He also wrote her obituary for The Times (and, in 2003, that of her husband, fellow computer scientist Roger Needham).

IT journalist Brian Runciman remembers Spärck Jones as the most interesting woman he’s ever interviewed in Computing’s too important to be left to men. (“I think it’s very important to get more women into computing. My slogan is: Computing is too important to be left to men” seems to be Spärck Jones’s best known quote.) In the interview with him, she talked about how her ideas permeate modern search engine implementations.

She scored smaller mentions from:

Ada Lovelace Day profile: Karen Spärck Jones

Let’s create new role models and make sure that whenever the question “Who are the leading women in tech?” is asked, that we all have a list of candidates on the tips of our tongues… To take part All you need to do is… pick your tech heroine and then publish your blog post any time on Tuesday 24th March 2009. It doesn’t matter how new or old your blog is, what gender you are, what language you blog in, or what you normally blog about – everyone is invited.

This is a profile of a woman in technology for Ada Lovelace Day.

Creative Commons License
Karen Spärck Jones by Markus Kuhn (modifications by Mary Gardiner) is licensed under a Creative Commons Attribution 2.5 Australia License.
Based on a work at commons.wikimedia.org.

I first heard about Karen Spärck Jones, who was a senior scientist in my field of computational linguistics, in 2007 as part of my paying job, which is as the editorial assistant for Computational Linguistics. Just before she died, Spärck Jones wrote Computational Linguistics: What About the Linguistics? which we published posthumously as the Last Words column for Vol. 33, No. 3. (Spärck Jones was aware both that she was dying and that her column was going to appear under the heading ‘Last Words’.) I was never able to correspond with her directly: she died before we even had the camera-ready copies done.

Spärck Jones’s academic career began in 1957, and was funded entirely by grant money until 1994: most academics will recognise this as the hard way, requiring researchers to fund their own positions with grant money awarded in cycles.

Spärck Jones was the originator of the Inverse Document Frequency measure in information retrieval (1972, ‘A statistical interpretation of term specificity and its application in retrieval’, Journal of Documentation, 28:11–21), which is nearly ubiquitously used as part of the measure of the importance of various words contained in documents when searching for information. (The word ‘the’, for example, is very unimportant, as it occurs in essentially all documents, thus having high document frequency and low inverse document frequency.) She had a long history in experimental investigations of human language (most computational linguists are now in this business). She was also at one time president of the Association for Computational Linguistics.
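Her IDF measure is simple enough to sketch in a few lines of Python. This uses the common log(N/df) formulation; the 1972 paper and later systems use several variants, so treat it as illustrative:

```python
import math

def inverse_document_frequency(term, documents):
    """IDF of a term over a collection of tokenised documents:
    log(N / df), where N is the number of documents and df is the
    number of documents containing the term."""
    n_docs = len(documents)
    doc_freq = sum(1 for doc in documents if term in doc)
    if doc_freq == 0:
        return 0.0  # conventionally zero for terms seen in no document
    return math.log(n_docs / doc_freq)

# Each "document" here is just the set of words it contains.
docs = [
    {"the", "cat", "sat"},
    {"the", "dog", "ran"},
    {"the", "matrix", "inverted"},
    {"matrix", "bases"},
]

# 'the' occurs in 3 of 4 documents, so its IDF is low (unimportant);
# 'matrix' occurs in 2 of 4, so its IDF is higher (more discriminative).
print(inverse_document_frequency("the", docs))
print(inverse_document_frequency("matrix", docs))
```

In a search engine, a word’s IDF is typically multiplied by its frequency within each document (the TF-IDF weighting) to score how well a document matches a query.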

Awards Spärck Jones won in her lifetime include Fellowships of the American and European Artificial Intelligence societies, Fellowship of the British Academy, the ACL Lifetime Achievement Award and the Lovelace Medal of the British Computer Society.

Elsewhere: Spärck Jones’s obituary in Computational Linguistics and Wikipedia.

Ada Lovelace Day profile: Allison Randal


This is a profile of a woman in technology for Ada Lovelace Day.

Creative Commons License
Allison Randal (Three photos) by Miles Sabin, Piers Cawley, Paul Fenwick, Mary Gardiner is licensed under a Creative Commons Attribution-Share Alike 2.5 Australia License.

Allison Randal is the chief architect of the Parrot virtual machine, which, I have just now discovered, had its 1.0.0 release a week ago today. I’ve known of Parrot for a long time, because of its posited relationship with the Python programming language (see the original April Fool’s joke), but I didn’t know much about the project beyond it being a VM until Randal’s linux.conf.au 2008 talk (see slides, Ogg Theora video, Ogg Speex audio).

I am not a Perl programmer and Randal is mostly known within the Perl (and OSCON, see below) communities, but Randal’s talk at linux.conf.au 2008 was the most memorable for me: she talked about bringing modern compilation ideas to the Free Software programming languages community, and then about the architecture of Parrot and the various intermediate languages it is possible to target.

The most striking thing about Randal’s work for me is that she combined high profile technical coding with deep community involvement (and technical writing). She is a past president and current board member of the Perl Foundation and chairs the talk selection for OSCON. In an ideal world I’d like to be able to straddle technical and technical community work in my own life, and Randal is one of the highest profile examples of this I know of.

Elsewhere: Randal’s homepage, Randal’s O’Reilly Radar blog, Randal’s use.perl blog and Wikipedia.

On girl stuff

In both of my recent talks involving women and Free Software the audience has latched onto something I didn’t expect. At OSDC it was the GNOME finding that they only got women applying for their summer of code projects once they created special ones for women. I think I expected people to have heard about that already, but they hadn’t. (GNOME had zero applications from women for Google Summer of Code, and some hundreds for the Women’s Summer Outreach variant.) There were probably a couple of things going on there aside from women responding to a specific invitation — in particular, computer science academics at some universities getting excited about being able to give their women students a specific invitation — but clearly invitations are part of what’s going on.

Giving these talks has already incurred a karmic debt to do some of that work, but since the work I do isn’t Free Software and wouldn’t be generally useful if I released it as such (I know a lot of people say this about their work, but I try to predict word usage based on the opinion of the document; this really is quite niche software), and since I had a reasonable idea for a variant on this kind of talk, I gave a second one anyway, at the LinuxChix miniconf. It was titled ‘Starting Your Free Software Adventure’ and happened to use women as examples. The idea was to show people what the first steps look like. I conducted (extremely short) interviews of several women involved in Free Software or Culture or their communities, including Kristen Carlson Accardi, Brenda Wallace and Terri Oda among others. (I intend to make the slides available, but since I quoted the subjects extensively and directly, it will require gathering permission and then a bit of work editing them.)

As I noted previously this talk was a failure all up, because the wrong audience turned up for it. But one thing stood out and kept coming up all week: Terri mentioning that she had resisted at times working on things perceived as ‘girl stuff’. In Free Software this includes but is not limited to documentation, usability research, community management and (somewhat unusually) sometimes management in general. The audience immediately hit on it, and it swirled around me all week.

This is a perennial problem for professional women: software development is by no means unique in having developed a hierarchy that goes from high status roles disproportionately occupied by and associated with men to somewhat lower status roles disproportionately occupied by and associated with women. (In the case of software, disproportionately occupied by women still means male dominated, by the way, at least in the English-speaking world.) It’s difficult to disentangle the extent to which women and/or their mentors and teachers self-select for the lower status roles (and I would hardly argue that the self-selection occurs in a vacuum either) versus the extent to which they are more or less barred from high status roles versus the extent to which the association is actually flipped and professions and jobs within them have become low status because women started doing them. Other well-known examples are the concentration of women in biological sciences as opposed to, say, physics, the different specialisation choices of male and female medical doctors and surgeons, and so on. Sometimes, as in the war between sciences, the status of a field is somewhere between a joke and real, to the extent that those can be differentiated, but often it isn’t: there’s a correlation between the male to female ratio of a medical specialty and its pay.

In all of these cases, a woman who is conscious of this problem tends to face a choice. Do the ‘girl stuff’, or not? (Of course, ideally one rejects the dichotomy, but no individual woman is responsible for constructing it, and if you are sincerely of the belief that one is not programmed to a frightening and unavoidable extent by one’s social context we’re working from very different premises and don’t have a lot to say to each other.) And some, although I don’t know what proportion, of women feel guilty about their choice, especially if they do choose to do girl stuff. Just go ahead and imagine your own scare quotes from now on, by the way.

It also gets messy in various other ways. There’s the extent to which a woman who doesn’t do girl stuff is invested in maintaining the status she has chosen and also the aforementioned loop where if women are doing something, it will come to be seen as not particularly hard or noteworthy.

Most concretely, I usually see this tension bubble away underneath outreach programmes promoting computing careers (you know what, I have my own status issues and I still resist calling it IT) to women. There’s the people who want to go for yeah we all know coding is populated by weirdos, and male weirdos at that, luckily you don’t have to be a geek and you don’t have to code, phew! I tend to hear about that one only once my outreach friends have gotten involved and staged a coup, admittedly. There’s the there’s so many opportunities in computing, and yes, coding is one of them and it’s fulfilling and it’s something you can do, but dammit, coders get all the cred and attention and dammit can we talk about something else? Women who admin/write/test/manage rock! And there’s you know, women coders don’t exactly rule the world yet, and furthermore isn’t all this oh-yes-you-could-code-I-guess-and-that’s-a-fine-thing but look! something for folks with people skills! talk basically a soft version of ew coding that’s for boys, also, last I checked, math is hard?

I observe again that there’s no right answer here in the real world right now. Women doing girl stuff have good reasons to feel dissatisfied that their hard-won skills are underpaid and under-respected, women doing boy stuff (scare quotes! please insert!) want other women to know that there’s fun to be had over here, thank you.

One crucial point in my thoughts about this I stumbled on only after the conversation Brianna Laugher recounts, over Indian on the Friday night (the location of all major conference breakthroughs worldwide). She said — paraphrased — that she didn’t feel that she should have a problem or be criticised for doing what she is good at, or what’s so desperately needed in her communities, and have to be just another coder in order to be fully respected. And I said that while this was certainly true, women also need to have the opportunity, to give themselves the opportunity, to be selfish: if they want to code, or do something else they are currently either bad at or not notably good at, or for that matter which they are good at but in which they’d have competitors, they should consider doing that, rather than automatically looking for and filling the space that is most obviously empty.

I had a brief, but related conversation with Jeff Waugh at the Professional Delegates Networking Session — an attempt to formally recreate the Indian diner breakthrough environment —  at which he commented that he continued to find the invitation culture (the same one I discussed in my OSDC talk) of women in Free Software mystifying and frustrating. (Not his exact words, if you have better adjectives Jeff let me know.) I took that one somewhere else: specifically to invitation cultures outside Anglo culture and then to honorific use in the Korean language, but when considering the question of women I think this is intertwined with the be selfish thing: women are reluctant to enter places where they aren’t obviously welcomed, and what better way to be welcomed than to do work that needs doing and not become just another person doing the coding free-for-all and delaying external validation for potentially quite a long time?

I have no answers. Just the perennial question of distinguishing what other people want, what other people claim they want, the genuine satisfaction of being of service to someone, and the genuine satisfaction of knowing you’ve done a good job of something hard. Take a look at where you’re standing on that one occasionally.

Writing helpful reviews

I outlined the style of good academic reviews to Jonathan in light of our impending OSDC review responsibilities, and it’s worth noting here too.

For information’s sake, my authority, such as it is, on reviewing comes from being the editorial assistant of Computational Linguistics, which is a journal with a hardworking editor and conscientious reviewers. Not all academic reviews are of the quality I discuss below. They should be.

Begin by stating the title of the paper you are reviewing. Then spend one to three paragraphs summarising its content, particularly what you perceive as its major findings and conclusions.

This has a couple of purposes. The first is that if the reviews have got mixed up in the system the author finds out as soon as possible and doesn’t have to slog through a review that (perhaps) is a partial match for their paper and (especially in academic circles) a privacy problem to boot. The second is so that they know in what light to read the rest of the review. If they see that you have understood its fundamentals they will be inclined to take the entire review seriously. If they see you have misunderstood it, they can do one of two things. One is to realise that their paper is confusing, and to make its focus clearer. The other is to discount your review. The decision here may be affected by the following section.

The main body of the review is a discussion of how to improve the paper. Both the tone and discussion will vary considerably depending on certain factors:

  1. is the paper already accepted?
  2. is this the only reviewing round or will you or another reviewer be checking the changes?

For OSDC, both factors hold. For almost all conferences, there is only (at most) one reviewing round for full papers. This makes reviews more limited in scope than journal reviews, where substantial changes are often recommended even (or perhaps especially) to articles the reviewer fundamentally likes. Journal reviewers can have a role which is not far from being anonymous co-authors. (If a colleague did as much re-reading and suggestions of additional work and additional reading as Computational Linguistics reviewers do, many people would consider adding them to the authors list.)

In the event that the article has been accepted, or that this is the single reviewing round, you should limit the scope of your suggestions to much more cosmetic things. Someone who has had an article accepted is just going to be annoyed that you want it to have a whole new body of work incorporated, and they will ignore you. (And if it’s rejected after a single reviewing round, they are probably ill-placed to revise much!) In the OSDC scenario, reviewers are going to be mostly limited to suggestions as to how to structure the argument and the paper better, and not really able to productively suggest changes to the argument or the work described in the paper.

As you write your review and this section in particular, keep in mind the key factor of providing useful critiques: how could this work be better on its own terms? That is, don’t provide a review that is, fundamentally, about how the paper would have been better if you’d written it… about your pet topic. This is a subtle, tempting and common mistake, and if you have never caught yourself in it, you are likely to be the worst affected. Remember: What is the paper trying to do? How can it do it better? Avoid the temptation to suggest that it would be a better paper if it was doing something different from its current aim. (There is a little more leeway for this in journal reviews, but even in that case, generally what happens if a reviewer thinks this is that they review the article on its current form and recommend a fate suited to its current aims, and additionally comment that they would be interested in seeing further work in the additional direction should the authors choose.)

As a recipient of reviews, I do have a couple of things to add. One is to respect page limits. If you are reviewing for a work with a page limit, especially a conference, and you do really want to see a longer discussion of foo, please suggest which bar could be shortened or cut. Otherwise it is close to impossible for an author to consider your suggestion. Also, if you are making suggestions for future work that you think the authors should consider but which you do not actually want to see in the article, make this clear in the text of your review. I would probably recommend a whole separate section for this if you’re going to do it.

A review may conclude with a list of typos, spelling mistakes, suggested rephrasings, etc. Mistakes that affect the reading of the paper (eg mislabeled figures and sections) go right at the start of this list. A sufficiently ill-proofread paper may go back with a suggestion that the authors find the mistakes themselves.

Creative Commons License
Writing helpful reviews by Mary Gardiner is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Clean up IMAP folders

Per Matt Palmer’s blog entry OfflineIMAP and Deleting Folders, users of any mail-sorting recipe that creates new mail folders a lot tend to find that over time they accumulate a lot of mail folders for, eg, email lists they are no longer subscribed to. And most IMAP clients will waste time checking those folders for new mail all the time.

Matt wrote:

Now, of course, someone’s going to point me to a small script that finds all of your local empty folders and deletes them locally then issues an IMAP “delete folder” command on the server. But I had fun working all this out, so it’s not a complete waste.

I haven’t quite done this; I’ve just written a script that detects and deletes empty remote folders. (For me, offlineimap does not have the behaviour of creating new remote folders, so I haven’t bothered cleaning up local folders.)

It’s good: it’s speeding up my mail syncs a whole lot, deleting these old folders I haven’t received mail in for about five years. I’ve got full details and the script available for download (as you’d expect, it’s short): Python script to delete empty IMAP folders.
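The linked script is the real thing; a minimal sketch of the same idea using Python’s standard imaplib might look like the following. The hostname and credentials are placeholders, and the LIST-response parsing is deliberately naive (real servers can return literals and unusual delimiters), so treat this as illustrative:

```python
import imaplib

def folder_name(list_line):
    """Extract the mailbox name from a raw IMAP LIST response line.

    A typical LIST line looks like: b'(\\HasNoChildren) "." "INBOX.lists.old"'
    This naive parser takes the last space-separated field and strips
    the quotes; a robust version needs a real IMAP response parser.
    """
    return list_line.decode().rsplit(" ", 1)[-1].strip('"')

def delete_empty_folders(host, user, password, dry_run=True):
    """Delete every remote folder that contains zero messages."""
    conn = imaplib.IMAP4_SSL(host)
    conn.login(user, password)
    _, listing = conn.list()
    for line in listing:
        name = folder_name(line)
        typ, data = conn.select(name, readonly=True)
        if typ == "OK" and data[0] == b"0":  # zero messages in this folder
            conn.close()  # deselect before attempting deletion
            print("empty:", name)
            if not dry_run:
                conn.delete(name)
    conn.logout()
```

The `dry_run` default means a first pass only prints what would be deleted, which is worth doing before letting any script issue IMAP DELETE commands.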

Creative Commons License
Clean up IMAP folders by Mary Gardiner is licensed under a Creative Commons Attribution 4.0 International License.

Headless / commandline Bittorrent

I made noises yesterday that I might learn about BitTorrent. So I tried. (It’s an interesting protocol from the point of view of needing clients to enforce penalties against refusing to upload.) Here’s the paradigm I wanted it to fit into: at, the command scheduler. 95% of my readership will know (intimately) how residential broadband works in Australia, but for those who don’t it is typical to not have unlimited downloads. On a good, just above entry level, plan you might have a limit of about 4GB to 10GB a month. (The entry level plans tend to have a limit of about 512MB or 1GB for only about $10 to $15 less. This is like selling small amounts of an addictive substance. Someone will eventually tell you about Google Earth.) However, it is also very typical to have an off-peak period with a higher, perhaps even unstated, bandwidth cap. (Mine is 48GB in a month, in the midnight to noon period only.)

Since I run a headless server anyway (for mail and music), I vastly prefer to schedule my downloads for when I’m sleeping and bandwidth is cheaper. When I’m going to download something available on the web, I run at midnight and then give the command wget --limit-rate=something -q -c [url] (-c because I’ve usually tested about the first 100K of the download already). In the morning, assuming I’m not downloading an album from a very well known band, my file is there. I therefore wanted to do the same with Bittorrent.
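Concretely, that workflow looks something like this (the URL and rate limit here are made up for illustration; ‘midnight’ is one of at(1)’s built-in time keywords):

```shell
# Queue the download to start at midnight, when off-peak begins.
# --limit-rate leaves headroom for anyone else on the connection,
# -q keeps at(1)'s job output mail quiet, -c resumes the partial
# download from the ~100K already fetched while testing the URL.
echo 'wget --limit-rate=200k -q -c "http://example.org/big.iso"' | at midnight

# Check the queued job:
atq
```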

I’ll cut to the chase here: the solution is Transmission’s command-line tools, particularly transmission-daemon and transmission-remote. The daemon controls all the torrents, and, usefully for my limited download window, they can be stopped and started from the remote command and therefore from cron. The only catch is that these tools seem to have only very recently matured, as in, they don’t exist in Ubuntu 7.10/Gutsy. (It has a transmission-cli package, but the daemon isn’t in it.) The transmission-cli package from Hardy will not install on Gutsy either, but you can backport it without any hassles, assuming you know how to get hold of and build Ubuntu (Debian-style) source packages, which, granted, probably doesn’t apply to 95% of my readership (although it might apply to a majority once we count Planet Linux Australia).
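The cron half of that setup might look something like this (the exact transmission-remote flags have varied between versions, so check transmission-remote(1); the midnight-to-noon window matches my off-peak period):

```shell
# crontab entries: start all torrents at midnight, stop them at noon
# m  h   dom mon dow  command
0    0   *   *   *    transmission-remote --torrent all --start
0    12  *   *   *    transmission-remote --torrent all --stop
```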

I thought this was worth sharing though, after I spent hours mucking around with BitTorrent (the Python client, not the rebranded µTorrent) and BitTornado, neither of which, even in the headless versions, has much support for such selfish notions as not wanting to run it until such time as it’s good and done (you can send SIGTERM and they do resume cleanly on the next invocation, not exactly champion of the world on design though), and neither of which puts up with this old-fashioned nonsense of wanting to run a process without a controlling tty. BitTorrent doesn’t even have an inkling that you might want to limit download speeds to anything less than maximum (perhaps Bram Cohen has never lived in a house with anyone else who wanted to use his net connection). I ended up looking at Transmission because the GUI version (also apparently very nice) is now Ubuntu’s default BitTorrent client. (Clutch is allegedly a nice web interface for Transmission too, but I haven’t looked at it at all.)

Edit: this post is not a bleg. The last paragraph describes a problem, yes, but this post is about how I’ve solved that problem by discovering Transmission. Please, there’s no need to email me on getting BitTorrent, BitTornado or rTorrent to work in screen or similar. I’ve discovered Transmission, which doesn’t need screen and which is very cron-friendly. This post is intended as a positive review of Transmission, not a request for help. End edit.

Creative Commons License
Headless / commandline Bittorrent by Mary Gardiner is licensed under a Creative Commons Attribution 4.0 International License.

Personal computing history

I’ve enjoyed reading the LWN.net ten-year timeline (1, 2, 3, 4, 5, 6 and perhaps more to come), enough that I wanted to do a short wrap-up of my own computer story. Apparently it’s going around, but I didn’t realise that when I started this.

Approx 1990 My great-uncle died and my great-aunt offered us his computer. I was extremely excited, and had been assured it was portable, very expensive and top-notch. This didn’t turn out to be exactly true: it had been very expensive and top-notch when he bought it. I didn’t work out when that was, but it was some time in the past. The machine ran MS-DOS 2-ish, had no hard drive, and had two 5.25″ drives. One floppy held the entire operating system.

It was portable in the sense that it was one of the all-in-one designs that Compaq stuck with for so long. It was about the same size as a medium-sized hard suitcase and had a handle so that it could be upended and hauled around.

I learned some variety of BASIC (BASICA, I think) and spent many hours typing out programs that greeted me by name and let me add numbers together. I used it for school assignments for about six months before I discovered that it booted with the Insert key toggled off, so if I made a mistake it was just like a typewriter: I had to re-type everything from the mistake onwards.

I soon ran into what turned out to be a semi-imaginary bugbear of at least the next ten years of my life: I had no ideas for what to program. I realised I could learn from the program that displayed a bee flying around and played ‘Flight of the Bumblebees’ but I couldn’t summon the energy to pick apart 2000! lines! of code!

1993–1998 My parents got a new computer at the end of 1993, another Compaq as it happened. 486, and I believe 4MB of RAM and a 100MB hard drive. This was rather underpowered for the time, I think, but not radically so. I became something of a power user of word processors and the like. My parents were convinced that I was on the verge of destroying their machine. The poverty of our flat file ‘database’ application bugged me no end. When I came across relational databases much later I knew exactly what they were for. Towards the end of high school (I jumped a couple of years in computing studies and took my final exams in Year 10, so I had some exposure to the wider computer culture, however distorted) I desperately wanted to learn C, but what I was going to do with it I didn’t know.

In 1996 I got a copy of Fractint, I have no idea where from. Most likely the World Wide Web, which I used for the first time that same year. (Someone in my computer class logged on for me, went to Yahoo, and typed in ‘girls’ and started surfing for porn. 1996 was a big year in parental outrage at our school.) I even signed up for a Hotmail account. Anyway, not only did I spend hours and hours choosing just the right colours for my fractals, Fractint was my first exposure to the idea of collective, free, software development and I liked it. I read an article about Linux around the same time and liked the sound of that too, but I fundamentally had no idea what it was on about.

In 1999 I started undergrad and was immediately delighted with UNIX, which I thought of as super-powered MS-DOS with better doskey. That said, it was at least a year before I learned to drive tar from the man page, and I think at least two years before I learned that ‘see also crontab(5)’ means typing the command man 5 crontab (now probably my most frequently used man page). I was unimpressed with the university’s webmail system for a long time (I don’t recall why, but undoubtedly it sucked) and read my mail by telnetting to port 110 on the relevant machine. Very recently, someone suggested to me that doing that is an urban myth equivalent to whistling 9600 baud. No, no it isn’t. But it sucks for attachments.

Andrew and I started going out in August of 1999 and soon after that he bought his own desktop and installed Debian on it. He was pleased with my taste when I chose the username ‘mary’. The relationship was young enough that I still deeply cared about his opinions on questions like what is a tasteful user name? (I still use ‘mary’, it’s short. And tasteful.) At the end of the year he moved out of college, and an rm -rf accident on my part and reluctance to download a year’s worth of email on his part mean that we no longer have copies of about a year’s worth of emails to each other.

I did learn C in 1999, although I somehow missed the square-bracket dereferencing syntax and was accessing arrays with explicit pointer arithmetic, like this: *(p + 5) instead of p[5].

I had a job as a programmer in 2000. That didn’t work out so well, but I did get enough money for my first computer. Andrew downloaded SuSE for me because he wanted to see what it was like. Bad, that’s what, because it only had mutt 0.2 packages. I had to reinstall it and Windows several times each to get them both on. (I was playing a lot of Baldur’s Gate at the time; I believe the Infinity engine still sucks under WINE in 2008.) I wrestled with Exim’s documentation for the best part of a day to get it to act as a smarthost, because nothing in it connects the term ‘smarthost’ with the term ‘mail relay’. At the end of the year I registered puzzling.org. It was hosted on a FreeBSD box for a while, using qmail. (This is not why Andrew and I have a lot of user-suffix@puzzling.org addresses rather than user+suffix. That is because we were later hosted on Crossfire’s machine for a while, and another user on that machine had used qmail-style suffixes and asked for them to be set in Postfix. No one has ever let me finish telling this story until now.)

At the beginning of 2001 Andrew and I skipped the second half of a holiday at the beach for the first linux.conf.au. (Not counting CALU, which I don’t consider the ‘zeroth’ linux.conf.au because programmers count offsets from zero, but no one in their right mind counts objects from zero. Miss Manners would agree, I know.) This was an experience that in memory has not been surpassed in terms of pure mystical wonder. Especially Tridge’s hacking the TiVo talk. And Martin Pool’s rsync proxy thing. I think there is something irretrievably lost when you gain a better understanding of technology. No conference has been the same again. (And I don’t know that since taking up diving I’d be prepared to leave a beach holiday for any conference.) Soon afterwards, although unrelatedly, Andrew and I were living in a big sharehouse with Nicholas, Catie and Mark, and Nicholas, I think, prodded me into installing Debian. He was then alarmed when the first apt-get command I ran was to install nmap. At the time, this was how I checked my machines for unnecessary services.

I tutored computer science from 2001 to 2003, and managed to always be a better programmer than my students. I suspect this was when I finally learned to program respectably. puzzling.org left Crossfire’s hosting after Andrew quit Weblink, was in shared hosting for a while at csoft, then went to various virtual machines and ended up at Linode. I managed to do all this without a complete meltdown à la the Exim smarthost thing. That was saved for trying to get pppoe working based on knowledge equivalent to an oily rag. Put the modem in bridge mode and all will be well.

This is making me feel as if everything I know about computers I had learned by 2003, and actually that’s largely true. I haven’t picked up a new programming language since then other than for specific projects (Perl in 2004, mainly). I switched to Ubuntu from Debian because Andrew was Canonical employee number 10 or 15 or so. I learned how DNS works around about the same time (from the point of view of configuring BIND, I can’t, say, parse the wire format). Things I’ve acquired since then belong to different stories: relationships, diving, travel, language, computational linguistics, experimental methodology, bits and pieces of statistics, some vastly improved life management skills around budgeting, a certain amount of peace that’s come from somewhere I don’t know. Writing. I’m still programming and doing hobbyist sysadmin, but success no longer comes with wow, I can really do this, finally, at last. I expect my stuff to work these days and I can even usually tell you how long it will take. (I remember Andrew claiming to have this skill as early as 1999.) New toolkits are no longer headline news, but if there was one thing I’d like to add to my store of skills it would be the arrogance of the gods that many programmers have as their birthright. Any problem a mere mortal can pose, they can solve. It sounds dreadful, but flip it over. See? It’s like flying.

WordPress locked down with HTTP Basic Auth

I run several WordPress sites for other people (this isn’t one of them). A couple of them are private: no password for the site, can’t read the site. For years I’ve had an unwieldy situation in which the lockdown was implemented with HTTP Basic Auth configured in Apache, and the users separately logged into the site in order to post.

I kept using HTTP Basic Auth for the lockdown even after I discovered Authenticated WordPress (which requires logging in as a WordPress user before you can see anything), partly because Basic Auth is accessible to RSS readers. Many RSS readers (and assorted web-fetching tools) can speak HTTP Basic Auth; few can log themselves into WordPress, although I wouldn’t be surprised to find an exception or three. Eventually, though, different search terms led me to the HTTP Authentication plugin, and it turns out the two play nicely together: if you install them both, the site requires HTTP authentication in order to access any part of it, and anyone who has successfully authenticated is logged into WordPress too.
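For reference, the Apache side of the lockdown is perfectly ordinary Basic Auth. A minimal sketch (the realm name and file paths here are illustrative, not my actual configuration):

```apache
# In the <Directory> block or .htaccess covering the WordPress install.
AuthType Basic
AuthName "Private site"
# Password file created with e.g.: htpasswd -c /etc/apache2/wp.htpasswd mary
AuthUserFile /etc/apache2/wp.htpasswd
Require valid-user
```

With that in place, the HTTP Authentication plugin picks up the username Apache has already verified and logs that user into WordPress.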

A couple of niggles:

  1. The HTTP Authentication plugin requires that you have two matching lists of user names (well, actually one can be a proper subset of the other if you like, but users who aren’t in both can’t authenticate): the WordPress database needs to have a registered user, and the external authentication source needs to have an entry for the same user. Actually, I tell a lie: there is an option to automatically create a WordPress account for a user who shows up as successfully authenticated with an unknown user name.
  2. The HTTP Authentication documentation is slightly wrong: you don’t need the nickname to match the external user, you need the username to match the external user (which is the sensible way anyway).

Creative Commons License
WordPress locked down with HTTP Basic Auth by Mary Gardiner is licensed under a Creative Commons Attribution 4.0 International License.