Finding out about locked LiveJournal (InsaneJournal etc) posts in your normal feed reader

This feature is rather old but I just found out about it. If you have a LiveJournal or SimilarJournal (InsaneJournal at least) and you have access to ‘friends-locked’ posts but you’d like to be able to pick them up in your normal feed reader, you can very likely use a URL like this:

http://YOURUSERNAME:PASSWORD@THEIRUSERNAME.livejournal.com/data/atom?auth=digest

For communities use the community URL followed by /data/atom?auth=digest. Similarly for LJ-like sites with their own URLs. You can also replace atom with rss if you are that way inclined.
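If you want to check that the URL and credentials actually work before pointing a reader at them, a quick command-line test is easy enough. A sketch, assuming you have curl around (the placeholders are the same as above):

# Fetch the locked feed using HTTP digest authentication and show the start of it.
# YOURUSERNAME, PASSWORD and THEIRUSERNAME are placeholders, as above.
curl --digest -u YOURUSERNAME:PASSWORD \
  "http://THEIRUSERNAME.livejournal.com/data/atom?auth=digest" | head -20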

This is described in FAQ 149 on LiveJournal.

If you are using a reader that is implemented in Python (e.g. Planet and Venus, rss2email), you will probably need to make sure you are using Python 2.5 underneath rather than any earlier version of Python, as Python’s urllib2 did not work with LiveJournal’s digest auth implementation until just after Python 2.4 was released.

If this doesn’t work in your reader, the feature you want its developers to add and test is called ‘HTTP digest authentication’. There may also be some alternative place where they want you to put ‘YOURUSERNAME’ and ‘PASSWORD’.

Ubuntu 8.04 (Hardy) Beta bugs wrap-up

This is in my usual tradition of going over the bugs before an Ubuntu release. It’s a good time for Ubuntu: if I’d done this even last week I would have been more annoyed.

The most annoying bug of all has been around for quite some time, and may not be an Ubuntu bug at all, but rather a BIOS power management or disk firmware bug: bug 59695. This would be what cost me my last hard drive, which failed at a reported 1.7 million load cycles. The new drive is already at 2257, which is a bit high as well, although I’ve tried various workarounds. The trouble with this one is that, firstly, the power saving settings of the drive appear to be totally opaque. Just try some numbers! Watch for high temperatures, although we don’t know what ‘high’ is for your particular drive, nor can we tell you how to work it out! My drive will probably last longer now, but the laptop is painful to touch after a while. Great.
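For the record, the sort of thing people try as a workaround is below. This is only a sketch: the APM value of 254 is a guess rather than a recommendation, the device name will depend on your machine, and watching the SMART attributes afterwards is how you find out whether it helped.

# Ask the drive for its least aggressive power management setting
# (lower values park the heads more often; 254 is close to 'off').
sudo hdparm -B 254 /dev/sda

# Keep an eye on the load cycle count and temperature attributes.
sudo smartctl -a /dev/sda | grep -iE 'load_cycle|temperature'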

Bug 194214 is quite seriously annoying too. For me (and I don’t use Compiz) it manifests as my Ctrl key being virtually ‘stuck’ down, so that if I press ‘q’ applications terminate, ‘d’ closes my shell, ‘Page Up’ switches tabs, ‘c’ sends SIGINT, and so on. An X restart is required to restore normality. The community is on this one though: the report is quite impressive and git bisect has been used.

Bug 193970 is a regression from a user’s point of view, if not a programmer’s (for the latter, the important distinction is that it’s not actually the same software). The problem is that Ubuntu is now using the Free driver for the Intel 3945 wireless chipset (iwl3945), rather than the other one (ipw3945). iwl3945 doesn’t really support the ‘wireless off’ switch, well, at all, unless you consider rebooting after using it an acceptable solution. This isn’t bugging me much at the moment because I’ve learned to leave wireless on all the time, but it is annoying when I take my laptop on trains and similar and the battery drains needlessly fast. It looks like this will probably survive to the release, so I will likely switch back to ipw3945 for the duration of Hardy’s lifetime.

Bug 204097 (which may be a duplicate of something else) probably cost me some data in the hard drive failure. Essentially they’ve decided to put a nice wrapper around fsck (the filesystem checks) that does not handle a check failure at all. It just reboots and tries it again. And again. And… you get the idea. Nor does it inform you that this is even due to a failure. You have to guess and boot into recovery mode yourself. Of course, this is a hard one to solve correctly, because eventually a typical desktop user is going to have to be told ‘now you have to do something very weird and difficult’. But the ‘just hope it doesn’t fail the next boot’ approach is weirder.

Bug 185190 is just mystifying. Essentially the GNOME world clock programmer has decided that it is really hard to work out programmatically what timezone a city is actually in (and it is, you try it) and so they’ll just guess based on the longitude. Fortunately this only fails for very minor, unheard-of cities like Beijing and St Petersburg. Oh, and a bunch of major North American cities, which I genuinely am surprised is considered acceptable.

The major fix I’ve noticed is that bug 153119 (the microphone was more or less useless on my laptop due to very soft volume) seems to be gone. This one surprised me since there was no response to the bug itself. update-manager is marginally better too, I think. NetworkManager is fine, but I’m not using the Hardy default packages… Suspend and hibernate both work, which is a shock this far out (a month) from a final release.

Building an online shopping site for which I will not kill you

I’ve done a fair bit of shopping online recently, and I’m just about ready to kill everyone. Here’s how to not be on the hit list:

  1. Let me compute shipping costs without having to give you my name, full address, email, phone number and chosen password. If you’re an Australian site, you can work it out from my cart and my postcode. If you’re international, my cart and my country. I know full well that you make a ludicrous amount of money from ‘shipping’ and I’m factoring it into my price comparisons. I’m getting to the stage of assuming the worst if you make me sign up before revealing shipping costs and I’m bypassing your site. No really, I am, I’m not buying from you any more, because 5 minutes signing up is 5 minutes too long.
  2. If you sell electronic equipment, be upfront about whether it’s grey market and, if so, where the warranty holds (i.e., in the event of failure, do I have to have it couriered to some other country to be fixed, or is it an Australian warranty?). If it’s in any way unclear, I am also assuming the worst.
  3. Ideally, debit my credit card at the time of shipping, rather than at the time of placing the order. (I’m usually not big on ‘waaa waaa, doing business is hard, that’s why it’s hard to buy from us, but you should anyway because capitalism means you have to buy things’, but I’m kinda sympathetic to avoiding credit card fraud, so you often get a pass here.)
  4. If you only have a web form for customer contacts, make damn sure it works. And by works, I mean the mission-critical kind of works too, because it sure is annoying if the last POST-PAYMENT phase of the order fails and I can’t contact you about it.
  5. When I do file a support request through your web page, how about automatically emailing me a copy of it, ideally with some kind of tracking number? That way I have some reasonable assurance that it at least made it as far as your server. If you put up a generic ‘Thank you for your request, we will eventually respond’ page, I don’t know if my actual request got through. Especially if a payment just failed…
  6. I know you get a lot of dumb support requests. But please, please, don’t put up that page, you know, the one that goes man, you guys sure are dumb. This website is infallible and yet all the time waaa waaa waaa customers can’t order because customers can’t read our info even though they somehow were literate enough to apply for a credit card. Don’t blame us if we’re cranky about your dumb complaints. You’re lucky we even have this non-functioning web support form up at all. Because one day your third party gateway will ACCEPT A CREDIT CARD and your clever clever system will fall over before inserting the associated order into your database and then your contact form will also fail and then you will look rather stupid for talking to the customer about how dumb they are, won’t you?

I suppose I should mention some good systems here, shouldn’t I? OK. If you buy contact lenses through Net Optical Australia you’ll get so much feedback from them about your order status that your mail server might keel over. Glasses Online is fairly good in that respect too (they’ll even phone you to double check your weirdo prescription, if you have one). The good people behind the Nine Inch Nails album were very fast with their help despite however many million calls for help they were dealing with.

I should mention the bad system here too, but my understanding of Australian libel laws makes it kind of dangerous. How about you guys just take the chargeback on the chin and we’ll call it even? (I will say though that unfortunately having clear-cut statements about shipping costs and warranties apparently does not totally correlate with a functional ordering or support system.)

King of Copyright

I saw The King of Kong today, which I recommend, especially if you haven’t heard anything about it aside from this, as I suspect it is unduly affected by spoilers. Afterwards, go read about the disputed facts on the Wikipedia page.

At the beginning of the movie there was an ad about respecting copyright. Not the ‘you wouldn’t STEAL a car’ one with the heavy-handed soundtrack, but one in which burning discs is portrayed as equivalent to burning the local film industry, in the form of a Happy Feet poster. In one way this is quite effective: do you support setting fire to neotenous penguins? No, you don’t. In other ways it’s less effective, in that the Australian public is notoriously indifferent to the local film industry. It’s not clear that they can blame downloading for the death of the industry any more than blacksmiths could. Or sellers of cockroach-flavoured ice cream.

In any case, this was shortly followed by a trailer for Be Kind Rewind in which what is essentially fanvidding gets a sympathetic portrayal and The Man is evil (he always is). I figure there’s got to be more scope for hypocrisy here. You could do a sweet little romcom about a guy being busted for downloading awesome music of a little known band, regardless of the hundreds of dollars he has spent attending their every gig. He falls in love with a woman who is supposed to be his smart, hardnosed IP lawyer and yet somehow on screen is always a little flighty, clumsy and generally in need of the help of a good man. The band plays at their wedding, and the bootlegs are so popular that they break BitTorrent. The couple live in some well lit loft at the centre of a major city without any evidence of having their lifestyle funded by a serious inheritance and the band become billionaires through T-Shirt sales. And you put the penguin arsonist ad before it, of course.

I survived the libc breakage of 2008

I didn’t get a lousy t-shirt, but I suppose I could make one. And put lice in it.

There have probably been at least a few people who haven’t heard of this, so: if people were unfortunate enough to be running Ubuntu Hardy (the development version that will become Ubuntu 8.04 in late April) yesterday and upgraded libc6 to 2.7-9ubuntu1, their system broke. It broke immediately if they (like me) were running the upgrade process under sudo, because sudo started refusing to launch dpkg and segfaulting instead.

There are a bunch of solutions to this, none of them particularly obvious to anyone who hasn’t had a broken libc6 in the last little while (the whole thing took me right back to when Debian unstable lived up to its name and used to do things like break lilo a lot — for that matter, it took me back to when I used lilo). Hrm, this is going to be one of those things, isn’t it, where ‘none of them particularly obvious’ is going to be read as ‘help, I haven’t fixed it, please send advice’? Don’t you worry, I have fixed it on my machine. If you’re actually affected, go try the solutions described at the top of the relevant (now fixed) bug, bug 201673 (note that I also had to reinstall libc6-i686, which not a lot of people seem to mention). If you don’t want to be affected, either don’t be using Ubuntu Hardy right now, or, if you are, don’t upgrade libc6 or anything libc-like to Ubuntu version 2.7-9ubuntu1 (2.7-9ubuntu2 is fixed though). It should be more or less over now, but maybe some mirrors won’t update for a little while still, I guess. The bad version was marked unreadable in the main package archive, which was probably not exactly useless, but not as helpful as +1 claimed either, because the mirrors, including the official mirrors, were still distributing the broken package merrily enough.
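If you would rather sit the bad version out than stop running Hardy, one way (a sketch, assuming you use plain apt and dpkg) is to put the libc packages on hold until a fixed version reaches your mirror:

# Mark the libc packages as held so routine upgrades skip them for now;
# reverse it later by repeating with 'install' in place of 'hold'.
echo libc6 hold | sudo dpkg --set-selections
echo libc6-i686 hold | sudo dpkg --set-selections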

Anyway, there were a couple of interesting things about this, from an observer’s point of view. The first is that every second person in the forums and half of +1 had their own personal solution to the problem, most of them correct but some of them harder than others (dpkg -x makes needing to pull the different .tar.gz files out of a .deb unnecessary, and chroot /mnt dpkg -i is even better as it means that your package database will be up to date with your downgrade). I suppose there’s some bias here: the people who followed already-posted solutions (like me) just didn’t contribute. It left a thread with the impression that there were about twenty potential fixes, all of which you’d have to try because the earlier ones hadn’t worked for some people, whereas in actual fact the earlier ones hadn’t been tried by some people.
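For the curious, the chroot variant looks roughly like this when run from a live CD. It is a sketch only: the device name and the exact .deb filenames are assumptions and will certainly differ on your machine.

# Mount the broken root filesystem from a live CD, then reinstall known-good
# libc packages inside a chroot, so the installed system's package database
# stays consistent with the downgrade.
sudo mount /dev/sda1 /mnt
sudo chroot /mnt dpkg -i \
  /var/cache/apt/archives/libc6_2.7-9ubuntu2_i386.deb \
  /var/cache/apt/archives/libc6-i686_2.7-9ubuntu2_i386.deb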

The other is the failure, in practice, of claims along the lines of ‘[as] is prominently mentioned in a number of places, you should not be using pre-release versions of Ubuntu unless you are comfortable dealing with occasional (even severe) breakage’. It’s only a partial failure: there are plenty of people who heed this and don’t whine (publicly) when development releases break. But there are certainly a lot of people who don’t, and while you can chide them, you also still have to handhold them through fixing their system afterwards. In fact, it might even be a slight problem: the only people who heed the warning and don’t install development releases are the kind of people who read warnings and heed them, people who by inclination aren’t noisy and whiny. So you get a pretty whiny testing base (on top of the people who are seriously involved testers).

I tend to test pre-release Ubuntu and file bugs because if I don’t, I can be stuck with broken hardware or regressed software for an entire six-month official release cycle (breaking hardware is more of a laptop thing). After my experiences with Hardy though, I think I’ll go back to my old policy of upgrading at the beta release, rather than before it. Beta should be March 27; then you can find out about all the remaining bugs that are still seriously annoying me.

BitTorrent

I made noises yesterday that I might learn about BitTorrent. So I tried. (It’s an interesting protocol from the point of view of needing clients to enforce penalties against refusing to upload.) Here’s the paradigm I wanted it to fit into: at, the command scheduler. 95% of my readership will know (intimately) how residential broadband works in Australia, but for those who don’t it is typical to not have unlimited downloads. On a good, just above entry level, plan you might have a limit of about 4GB to 10GB a month. (The entry level plans tend to have a limit of about 512MB or 1GB for only about $10 to $15 less. This is like selling small amounts of an addictive substance. Someone will eventually tell you about Google Earth.) However, it is also very typical to have an off-peak period with a higher, perhaps even unstated, bandwidth cap. (Mine is 48GB in a month, in the midnight to noon period only.)

Since I run a headless server anyway (for mail and music), I vastly prefer to schedule my downloads for when I’m sleeping and bandwidth is cheaper. When I’m going to download something available on the web, I run ‘at midnight’ and then give the command wget --limit-rate=something -q -c [url] (-c because I’ve usually tested about the first 100K of the download already). In the morning, assuming I’m not downloading an album from a very well known band, my file is there. I therefore wanted to do the same with BitTorrent.
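The web half of that, concretely (a sketch with a placeholder URL and rate):

# Queue a quiet, rate-limited, resumable download to start at midnight.
echo 'wget --limit-rate=100k -q -c "http://example.com/big-file.iso"' | at midnight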

I’ll cut to the chase here: the solution is Transmission’s command-line tools, particularly transmission-daemon and transmission-remote. The daemon controls all the torrents, and, usefully for my limited download window, they can be stopped and started from the remote command and therefore from cron. The only catch is that these tools seem to have only very recently matured, as in, they don’t exist in Ubuntu 7.10/Gutsy. (It has a transmission-cli package, but the daemon isn’t in it.) The transmission-cli package from Hardy will not install on Gutsy either, but you can backport it without any hassles, assuming you know how to get hold of and build Ubuntu (Debian-style) source packages, which, granted, probably doesn’t apply to 95% of my readership (although it might apply to a majority once we count Planet Linux Australia).
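The crontab entries involved look roughly like the ones below. A sketch only: the times match my midnight-to-noon window, the daemon is assumed to be already running with its own settings, and the transmission-remote flag names may differ between versions, so check transmission-remote --help before trusting mine.

# Start all torrents at the beginning of the off-peak window...
0 0 * * *   transmission-remote -t all -s
# ...and stop them again shortly before the window closes at noon.
55 11 * * * transmission-remote -t all -S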

I thought this was worth sharing though, after I spent hours mucking around with BitTorrent (the Python client, not the rebranded µTorrent) and BitTornado, neither of which, even in the headless versions, has much support for such selfish notions as not wanting to run it until such time as it’s good and done (you can send SIGTERM and they do resume cleanly on the next invocation, not exactly champion of the world on design though), and neither of which puts up with this old-fashioned nonsense of wanting to run a process without a controlling tty. BitTorrent doesn’t even have an inkling that you might want to limit download speeds to anything less than maximum (perhaps Bram Cohen has never lived in a house with anyone else who wanted to use his net connection). I ended up looking at Transmission because the GUI version (also apparently very nice) is now Ubuntu’s default BitTorrent client. (Clutch is allegedly a nice web interface for Transmission too, but I haven’t looked at it at all.)

Edit: this post is not a bleg. The last paragraph describes a problem, yes, but this post is about how I’ve solved that problem by discovering Transmission. Please, there’s no need to email me on getting BitTorrent, BitTornado or rTorrent to work in screen or similar. I’ve discovered Transmission, which doesn’t need screen and which is very cron-friendly. This post is intended as a positive review of Transmission, not a request for help. End edit.

The brave new world

I liked the samples of the Nine Inch Nails Ghosts I-IV albums well enough, and I like FLAC well enough, that I figured I’d just buy the whole thing (USD5) rather than go with the free tracks. It’s what we’ve all been waiting for, right, FLAC without needing the physical CD?

Cut to three days later. I can only download the thing between midnight and noon, thanks to the bandwidth needed (it’s a 600MB download). I have now downloaded it something like ten times, at least in part (about 4GB of downloads and counting now). For a while they didn’t have the bandwidth resources. Now they do, but the ZIP file is corrupt or something (and not just on picky Linux systems: I can’t unpack it using Windows Explorer or WinZip either). The download almost always cuts out at about 300MB anyway, and for some reason their webservers do not do 206 Partial Content, so down it all comes again every time I request it.
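(You can see whether a server will let a download resume at all by asking for a tiny byte range and looking at the status code it returns; a sketch, with a placeholder URL:)

# Request just the first byte. A server that supports resuming answers 206;
# one that doesn't answers 200 and sends the whole file again.
curl -s -o /dev/null -w '%{http_code}\n' -r 0-0 "http://example.com/ghosts.zip"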

It turns out the brave new world of CD-quality downloadable albums must be a while away yet. One thing that would have been helpful would be allowing me to download each track individually. Even leaving aside the corrupt ZIP file issue (their tech support has not replied yet), at least I’d have most of them by now, and not be slamming their servers with several requests for a 600MB file per night as wget keeps retrying.

(Edit: apparently the whole album is CC BY-NC-SA anyway, so it can be obtained via torrents already and one day I’m sure I’ll even learn how to use them. It’s taking my firewall settings up and down that sounds like a nuisance.)

Personal computing history

I’ve enjoyed reading the LWN.net ten-year timeline (1, 2, 3, 4, 5, 6 and perhaps more to come), enough that I wanted to do a short wrap-up of my own computer story. Apparently it’s going around, but I didn’t realise that when I started this.

Approx 1990: My great-uncle died and my great-aunt offered us his computer. I was extremely excited, and assured it was portable, very expensive and top-notch. This didn’t turn out to be exactly true: it had been very expensive and top-notch when he bought it. I didn’t work out when that was, but it was some time in the past. The machine ran MS-DOS 2-ish, had no hard drive, and had two 5.25″ drives. One floppy held the entire operating system.

It was portable in the sense that it was one of the all-in-one designs that Compaq stuck with for so long. It was about the same size as a medium-sized hard suitcase and had a handle so that it could be upended and hauled around.

I learned some variety of BASIC (BASICA, I think) and spent many hours typing out programs that greeted me by name and let me add numbers together. I used it for school assignments for about six months before I discovered that it booted with the Insert key toggled off, so if I made a mistake it was just like a typewriter: I had to re-type everything from the mistake onwards.

I soon ran into what turns out to have been a semi-imaginary bugbear of at least the next ten years of my life: I had no ideas for what to program. I realised I could learn from the program that displayed a bee flying around and played ‘Flight of the Bumblebee’ but I couldn’t summon the energy to pick apart 2000! lines! of code!

1993–1998: My parents got a new computer at the end of 1993, another Compaq as it happened. 486, and I believe 4MB of RAM and a 100MB hard drive. This was rather underpowered for the time, I think, but not radically so. I became something of a power user of word processors and the like. My parents were convinced that I was on the verge of destroying their machine. The poverty of our flat file ‘database’ application bugged me no end. When I came across relational databases much later I knew exactly what they were for. Towards the end of high school (I jumped a couple of years in computing studies and took my final exams in Year 10, so I had some exposure to the wider computer culture, however distorted) I desperately wanted to learn C, but what I was going to do with it I didn’t know.

In 1996 I got a copy of Fractint; I have no idea where from. Most likely the World Wide Web, which I used for the first time that same year. (Someone in my computer class logged on for me, went to Yahoo, typed in ‘girls’ and started surfing for porn. 1996 was a big year in parental outrage at our school.) I even signed up for a Hotmail account. Anyway, not only did I spend hours and hours choosing just the right colours for my fractals, but Fractint was also my first exposure to the idea of collective, free software development, and I liked it. I read an article about Linux around the same time and liked the sound of that too, but I fundamentally had no idea what it was on about.

In 1999 I started undergrad and was immediately delighted with UNIX, which I considered super-powered MS-DOS with better doskey. That said, it was at least a year before I learned to drive tar from the man page, and I think at least two years before I learned that ‘see also crontab(5)’ means typing the command man 5 crontab (now probably my most frequently used man page). I was unimpressed with the university’s webmail system for a long time (I don’t recall why, but undoubtedly it sucked) and read my mail by telnetting to port 110 on the relevant machine. Very recently, someone suggested to me that doing that is an urban myth equivalent to whistling 9600 baud. No, no it isn’t. But it sucks for attachments.

Andrew and I started going out in August of 1999 and soon after that he bought his own desktop and installed Debian on it. He was pleased with my taste when I chose the username ‘mary’. The relationship was young enough that I still deeply cared about his opinions on questions like ‘what is a tasteful user name?’ (I still use ‘mary’, it’s short. And tasteful.) At the end of the year he moved out of college, and an rm -rf accident on my part and reluctance to download a year’s worth of email on his part mean that we no longer have copies of about a year’s worth of emails to each other.

I did learn C in 1999, although I somehow missed the square brackets dereferencing syntax for pointer arithmetic and was doing a lot of accessing arrays like this: *(p + 5).

I had a job as a programmer in 2000. That didn’t work out so well, but I did get enough money for my first computer. Andrew downloaded SuSE for me because he wanted to see what it was like. Bad, that’s what, because it only had mutt 0.2 packages. I had to reinstall it and Windows several times each to get them on. (I was playing a lot of Baldur’s Gate at the time; I believe the Infinity engine still sucks under WINE in 2008.) I wrestled with Exim’s documentation for the best part of a day to get it to act as a smarthost, because ‘smarthost’ has nothing to do with the term ‘mail relay’. At the end of the year I registered puzzling.org. It was hosted on a FreeBSD box for a while, using qmail. (This is not why Andrew and I have a lot of user-suffix@puzzling.org addresses rather than user+suffix. That is because we were later hosted on Crossfire’s machine for a while, and another user on that machine had used qmail-style suffixes and asked for them to be set in Postfix. No one has ever let me finish telling this story until now.)

At the beginning of 2001 Andrew and I skipped the second half of a holiday at the beach for the first linux.conf.au. (Not counting CALU, which I don’t consider the ‘zeroth’ linux.conf.au because programmers count offsets from zero, but no one in their right mind counts objects from zero. Miss Manners would agree, I know.) This was an experience that in memory has not been surpassed in terms of pure mystical wonder. Especially Tridge’s hacking the TiVo talk. And Martin Pool’s rsync proxy thing. I think there is something irretrievably lost when you get a better understanding of technology. No conference has been the same again. (And I don’t know that since taking up diving I’d be prepared to leave a beach holiday for any conference.) Soon afterwards, although unrelatedly, Andrew and I were living in a big sharehouse with Nicholas, Catie and Mark, and Nicholas, I think, prodded me into installing Debian. And then was alarmed when the first apt-get command I ran was to install nmap. At the time, this was how I checked my machines for unnecessary services.

I tutored computer science from 2001 to 2003, and managed to always be a better programmer than my students. I suspect this was when I finally learned to program respectably. puzzling.org left Crossfire’s hosting after Andrew quit Weblink, was in shared hosting for a while at csoft and then went to various virtual machines and ended up at Linode. I managed to do all this without a complete meltdown à la the Exim smarthost thing. That was saved for trying to get pppoe working based on knowledge equivalent to an oily rag. Put the modem in bridge mode and all will be well.

This is making me feel as if everything I know about computers I had learned by 2003, and actually that’s largely true. I haven’t picked up a new programming language since then other than for specific projects (Perl in 2004, mainly). I switched to Ubuntu from Debian because Andrew was Canonical employee number 10 or 15 or so. I learned how DNS works around about the same time (from the point of view of configuring BIND, I can’t, say, parse the wire format). Things I’ve acquired since then belong to different stories: relationships, diving, travel, language, computational linguistics, experimental methodology, bits and pieces of statistics, some vastly improved life management skills around budgeting, a certain amount of peace that’s come from somewhere I don’t know. Writing. I’m still programming and doing hobbyist sysadmin, but success no longer comes with wow, I can really do this, finally, at last. I expect my stuff to work these days and I can even usually tell you how long it will take. (I remember Andrew claiming to have this skill as early as 1999.) New toolkits are no longer headline news, but if there was one thing I’d like to add to my store of skills it would be the arrogance of the gods that many programmers have as their birthright. Any problem a mere mortal can pose, they can solve. It sounds dreadful, but flip it over. See? It’s like flying.

WordPress locked down with HTTP Basic Auth

I run several WordPress sites for other people (this isn’t one of them). A couple of them are private: no password for the site, can’t read the site. For years I’ve had an unwieldy situation in which the lockdown was implemented with HTTP Basic Auth configured in Apache, and the users separately log into the site in order to post.

I used HTTP Basic Auth for locking it down even after I discovered Authenticated WordPress (which requires a login as a WordPress user before you can see anything), partly because Basic Auth is accessible to RSS readers. Many RSS readers (and assorted web fetching tools) can speak HTTP Basic Auth. Few can log themselves into WordPress, although I wouldn’t be surprised to find an exception or three. Eventually though, different search terms led me to the HTTP Authentication plugin, and it turns out the two play nicely together. If you install them both, the site requires HTTP authentication in order to access any part of it, and any person who has successfully authenticated is logged into WordPress too.
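For completeness, the Apache side of the lockdown is nothing exotic. Something like the following sketch, in which the paths, the ‘privateblog’ names and the user are placeholders rather than my actual setup:

# Create the password file that both Apache and the HTTP Authentication
# plugin will rely on (path and user name are placeholders).
sudo htpasswd -c /etc/apache2/privateblog.htpasswd someuser

# Require Basic Auth for the whole WordPress directory.
sudo tee /etc/apache2/conf.d/privateblog-auth.conf > /dev/null <<'EOF'
<Directory /var/www/privateblog>
    AuthType Basic
    AuthName "Private blog"
    AuthUserFile /etc/apache2/privateblog.htpasswd
    Require valid-user
</Directory>
EOF
sudo /etc/init.d/apache2 reload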

A couple of niggles:

  1. The HTTP Authentication plugin requires that you have two matching lists of user names (well, actually one can be a proper subset of the other if you like, but users who aren’t in both can’t authenticate): the WordPress DB needs to have a registered user, and the external authentication source needs to have an entry for the same user. Actually, I tell a lie: there is an option to automatically create a WordPress account for a user who shows up as successfully authenticated with an unknown user name.
  2. The HTTP Authentication documentation is slightly wrong: you don’t need the nickname to match the external user, you need the username to match the external user (which is the sensible way anyway).

Wireless A/V, the end

I figure there has to be someone out there wondering what I did with my wireless A/V problems (namely, wireless A/V transmitters get interference from 802.11 wireless networks and are thus useless, especially in blocks of flats).

Well, first up, for our purposes we decided to settle for wireless audio only. Neither of us has at all significant amounts of video sitting around on hard drives. We bought a Logitech Squeezebox. (The current Squeezebox is a re-badged Slim Devices Squeezebox version 3; Logitech acquired Slim Devices.) It connects to either wireless (WPA and WEP supported) or wired networks. It has several audio outputs; we’re just using analog RCA.

The Squeezebox needs to be fed music from a storage device. You can install SlimServer (soon to have a 7.0 release, at which time it will be known as SqueezeCenter) on a machine which has your music on it (Windows, Mac, Linux or anything else that can run Perl, including lots of Network Attached Storage devices). Alternatively, you can sign up for SqueezeNetwork, which streams Internet Radio to it. We don’t use SqueezeNetwork: Australian broadband does not have the kind of download limits where it’s a good idea to stream your music over the ’net all the time. Also, it sounds like SqueezeNetwork, like pretty much every other Internet Radio service, has a lot of content which they have agreed not to distribute to non-US (or sometimes non-EU) users.

I considered one of these a year or so back but was put off by the Australian recommended retail price of $500 (now $399.95, and US$299 in the US). But on my most recent search Shopbot turned up a lot of excitingly named places — such as Don’t Pay Retail — that stock it for well under RRP. So I did the annoying dance of trying to find somewhere that didn’t charge exorbitant shipping (everywhere that has it seems to practise that annoying lock-in trick of only telling the customer the shipping cost after they’ve signed up and entered their name, email, phone, and full billing and shipping addresses) and we’ve had one since early December.

Reviews along the lines of this one are more or less correct about its strengths, in particular the menu design.

Some points not commonly discussed in the reviews:

  • The menu design is really great. Actually this is mentioned in the reviews, but it’s great.
  • SlimServer is GPL. Some of the skins for the web interfaces and some plugins might be an exception to that (there’s lots of IANAL discussion on the forums regarding people trying to avoid GPLing their plugins). There’s a Logitech EULA on the download page that is almost, but not quite, entirely unlike the GPL, as in, they’re both software licences, but the EULA applies only to the firmware. I saw one of the developers somewhere saying that it’s practically impossible to close development too, since they use a lot of third-party modules.
  • It supports a lot of formats (Ogg Vorbis, various Windows stuff I know nothing about) in the server software. Music is sent over the network as either MP3 (if the original file was MP3) or FLAC (if the original file was anything else). Hence, you can only skip around within files that were originally MP3 or FLAC; other files can only be played at normal speed without skipping, or be paused.
  • The web interface to the server is ‘good enough’ (better than MPD clients I’ve used) but not amazing. In particular, it needs more AJAX for the current playlist. Drag-and-drop rearrangement of the playlist is much nicer than click move-song-one-position, load, click move-song-one-position, load, click move-song-one-position, load…
  • The network protocol is documented (and presumably there’s a GPL implementation in Perl) but it’s a custom protocol, not something that any software other than Slim Devices’ own speaks.
  • The Jive architecture, with which your applications talk to the new Duet remotes, isn’t Free Software, although some of the stuff written for it is. I am entirely unclear on how core Jive will be to future Squeezeboxen. Fairly core.
  • There is an active development and user community at their forums, most of which are mirrored to mailing lists.

I’m a bit worried by the recent appearance of the Squeezebox Duet, which is much more in the Sonos multi-room mold. I like my little box and its monochrome display. However, apparently the Squeezebox v3 remains in production, as does its very expensive brother the Transporter, and if the v3 is any indication, the Duet will be a great piece of work for people who want something more like the Sonos system. (People who live in a bigger place than us.) Further, the Duet remotes can control Squeezeboxen and Transporters, not just Duet Receivers.

Another device worth looking at if you’re in the market for a device to which you can stream music you have sitting on your hard drive — yeah, I’ve heard of sound cards, but I only have one good set of speakers and they live over near the TV, not near the computer — is the Roku Soundbridge. It speaks DAAP and UPnP, which makes it much more generic in terms of the software that can communicate with it. Ross Burton reviews it favourably. Since it’s hard to pick between them from reviews, we decided to buy the cheaper one, which in Australia in December seemed to be the Squeezebox.