Computers & Internet

Stupid T-Shirt

How awesome is the internet? A little while ago, I was watching David Fincher’s far-fetched but entertaining thriller, The Game. If you haven’t seen the film, there are spoilers ahead.

At the end of the movie, some pretty unlikely things happen, but it’s a lot of fun, and I think most audiences let it slide. One of the funny moments at the end is when a character gives Michael Douglas’ character a t-shirt which describes his experiences. After watching the movie, I thought it would make a pretty funny t-shirt… but I couldn’t remember exactly what the shirt said. Naturally, I turned to the internet. Not only was I able to figure out what it said (from multiple sites), I also found a site that actually sells the shirt.

The Game t-shirt: I was drugged and left for dead in Mexico - And all I got was this stupid T-shirt.

They’ve even got a screenshot from the movie. Alas, it’s a bit pricey for such a simple shirt. Still, the idea that such a shirt would be anything more than some custom thing a film nerd whipped up is pretty funny. I mean, how many people would even get the reference?

Choice, Productivity and Feature Bloat

Jakob Nielsen’s recent column on productivity and screen size referenced an interesting study comparing a feature-rich application with a simpler one:

The distinction between operations and tasks is important in application design because the goal is to optimize the user interface for task performance, rather than sub-optimize it for individual operations. For example, Judy Olson and Erik Nilsen wrote a classic paper comparing two user interfaces for large data tables. One interface offered many more features for table manipulation and each feature decreased task-performance time in specific circumstances. The other design lacked these optimized features and was thus slower to operate under the specific conditions addressed by the first design’s special features.

So, which of these two designs was faster to use? The one with the fewest features. For each operation, the planning time was 2.9 seconds in the stripped-down design and 4.6 seconds in the feature-rich design. With more choices, it takes more time to make a decision on which one to use. The extra 1.7 seconds required to consider the richer feature set consumed more time than users saved by executing faster operations.

In this case, more choices meant less productivity. So why aren’t all of our applications much smaller and less feature-intensive? Well, as I went over a few weeks ago, people tend to overvalue measurable things like features and undervalue less tangible aspects like usability and productivity. Here’s another reason we endure feature bloat:

A lot of software developers are seduced by the old “80/20” rule. It seems to make a lot of sense: 80% of the people use 20% of the features. So you convince yourself that you only need to implement 20% of the features, and you can still sell 80% as many copies.

Unfortunately, it’s never the same 20%. Everybody uses a different set of features. In the last 10 years I have probably heard of dozens of companies who, determined not to learn from each other, tried to release “lite” word processors that only implement 20% of the features. This story is as old as the PC.

That quote is from a relatively old article, and when I first read it, I still didn’t get why you couldn’t create a “lite” word processor that would be significantly smaller than Word, but still get the job done. Then I started using several of the more obscure features of Word, notably the “Track Changes” feature (which was a life saver at the time), which never would have made it into a “lite” version (yes, there are other options for collaborative editing these days, but you gotta use what you have at hand at the time). Add in the ever-increasing computer power and ever-decreasing cost of memory and storage, and feature bloat looks like less of a problem. However, as this post started out by noting, productivity often suffers as a result (and as Nielsen’s article shows, productivity is more difficult to measure than counting a list of features).

The one approach for dealing with “featuritis” that seems to be catching on these days is starting with your “lite” version, then allowing people to install plugins to fill in the missing functionality. This is one of the things that makes Firefox so popular, as it not only allows plugins, it actually encourages users to create their own. Alas, this has led to choice problems of its own. One of my required features for any browser that I would consider for personal use is mouse gestures. Firefox has at least 4 extensions available that implement mouse gestures in one way or another (though it’s not immediately obvious what the differences are, and there appear to be other extensions which utilize mouse gestures for other functions). By contrast, my other favorite browser, Opera, natively supports mouse gestures.

Of course, this is not a new approach to the feature bloat problem. Indeed, as far as I can see, this is one of the primary driving forces behind *nix-based applications. Their text editors don’t have a word count feature because there is already a utility for doing so (command line: wc [filename]). And so on. It’s part of *nix’s modular design, and it’s one of the things that makes it great, but it also presents problems of its own (which I belabored at length last week).
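That modular philosophy is easy to see from the shell. Here’s a quick sketch (draft.txt is just a stand-in filename): small utilities that each do one thing, composed with pipes:

    # count lines, words, and characters in a file
    wc draft.txt

    # the same tool composes with others via pipes, e.g. a rough
    # count of how many unique words appear in the file:
    tr -s ' ' '\n' < draft.txt | sort | uniq | wc -l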

In the end, it comes down to tradeoffs. Humans don’t solve problems, they exchange problems, and so on. Right now, the plugin strategy seems to make a reasonable tradeoff, but it certainly isn’t perfect.

Adventures in Linux, Paradox of Choice Edition

Last week, I wrote about the paradox of choice: having too many options often leads to something akin to buyer’s remorse (paralysis, regret, dissatisfaction, etc…), even if the choice ultimately made was a good one. I had attended a talk given by Barry Schwartz on the subject (which he’s written a book about) and I found his focus on the psychological impact of making decisions fascinating. In the course of my ramblings, I made an offhand comment about computers and software:

… the amount of choices in assembling your own computer can be stifling. This is why computer and software companies like Microsoft, Dell, and Apple (yes, even Apple) insist on mediating the user’s experience with their hardware & software by limiting access (i.e. by limiting choice). This turns out to be not so bad, because the number of things to consider really is staggering.

The foolproofing that these companies do can sometimes be frustrating, but for the most part, it works out well. Linux, on the other hand, is the poster child for freedom and choice, and that’s part of why it can be a little frustrating to use, even if it is technically a better, more stable operating system (I’m sure some OSX folks will get a bit riled with me here, but bear with me). You see this all the time with open source software, especially when switching from regular commercial software to open source.

One of the admirable things about Linux is that it is very well thought out, and nearly every design decision is made for a specific reason. The problem, of course, is that those reasons tend to have something to do with making programmers’ lives easier… and most regular users aren’t programmers. I dabble a bit here and there, but not enough to really benefit from these efficiencies. I learned most of what I know working with Windows and Mac OS, so when some enterprising open source developer decides that he doesn’t like the way a certain Windows application works, you end up seeing some radical new design or paradigm which needs to be learned in order to use it. In recent years a lot of work has gone into making Linux friendlier for the regular user, and usability (especially during the installation process) has certainly improved. Still, a lot of room for improvement remains, and I think part of that has to do with the number of choices people have to make.

Let’s start at the beginning and take an old Dell computer that we want to install Linux on (this is basically the computer I’m running right now). First question: which distribution of Linux do we want to use? Well, to be sure, we could start from scratch and just install the Linux kernel and build upwards from there (which would make the process I’m about to describe even more difficult). However, even Linux has its limits, so there are lots of distributions of Linux which package the OS, desktop environments, and a whole bunch of software together. This makes things a whole lot easier, but at the same time, there are a ton of distributions to choose from. The distributions differ in a lot of ways for various reasons, including technical (issues like hardware support), philosophical (some distros pooh-pooh commercial involvement) and organizational (things like support and updates). These are all good reasons, but when it’s time to make a decision, what distro do you go with? Fedora? Suse? Mandriva? Debian? Gentoo? Ubuntu? A quick look at Wikipedia reveals a comparison of Linux distros, but there are a whopping 67 distros listed and compared in several different categories. Part of the reason there are so many distros is that there are a lot of specialized distros built off of a base distro. For example, Ubuntu has several distributions, including Kubuntu (which defaults to the KDE desktop environment), Edubuntu (for use in schools), Xubuntu (which uses yet another desktop environment called Xfce), and, of course, Ubuntu: Christian Edition (Linux for Christians!).

So here’s our first choice. I’m going to pick Ubuntu, primarily because their tagline is “Linux for Human Beings” and hey, I’m human, so I figure this might work for me. Ok, and it has a pretty good reputation for being an easy-to-use distro focused more on users than things like “enterprises.”

Alright, the next step is to choose a desktop environment. Lucky for us, this choice is a little easier, but only because Ubuntu splits desktop environments into different distributions (unlike many others which give you the choice during installation). For those who don’t know what I’m talking about here, I should point out that a desktop environment is basically an operating system’s GUI – it uses the desktop metaphor and includes things like windows, icons, folders, and abilities like drag-and-drop. Microsoft Windows and Mac OSX are desktop environments, but they’re relatively locked down (to ensure consistency and ease of use (in theory, at least)). For complicated reasons I won’t go into, Linux has a modular system that allows for several different desktop environments. As with linux distributions, there are many desktop environments. However, there are really only two major players: KDE and Gnome. Which is better appears to be a perennial debate amongst linux geeks, but they’re both pretty capable (there are a couple of other semi-popular ones like Xfce and Enlightenment, and then there’s the old standby, twm (Tom’s Window Manager)). We’ll just go with the default Gnome installation.

Note that we haven’t even started the installation process and if we’re a regular user, we’ve already made two major choices, each of which will make you wonder things like: Would I have this problem if I installed Suse instead of Ubuntu? Is KDE better than Gnome?

But now we’re ready for installation. This, at least, isn’t all that bad, depending on the computer you’re starting with. Since we’re using an older Dell model, I’m assuming that the hardware is fairly standard stuff and that it will all be supported by my distro (if I were using a more bleeding-edge box, I’d probably want to check out some compatibility charts before installing). As it turns out, Ubuntu, with its focus on creating a distribution that human beings can understand, has a pretty painless installation. It was actually a little easier than Windows, and when I was finished, I didn’t have to remove the mess of icons and trial software offers (purchasing a Windows PC through someone like HP is apparently even worse). When you’re finished installing Ubuntu, you’re greeted with a desktop that looks like this (click the pic for a larger version):

Default Ubuntu Desktop (click for larger)

No desktop clutter, no icons, no crappy trial software. It’s beautiful! It’s a little different from what we’re used to, but not horribly so. Windows users will note that there are two bars, one on the top and one on the bottom, but everything is pretty self-explanatory and this desktop actually improves on several things that are really strange about Windows (i.e. to turn off your computer, first click on “Start!”). Personally, I think having two toolbars is a bit much, so I get rid of one of them and customize the other so that it has everything I need (I also put it at the bottom of the screen for several reasons I won’t go into here, as this entry is long enough as it is).

Alright, we’re almost home free, and the installation was a breeze. Plus, lots of free software has been installed, including Firefox, Open Office, and a bunch of other good stuff. We’re feeling pretty good here. I’ve got most of my needs covered by the default software, but let’s just say we want to install Amarok, so that we can update our iPod. Now we’re faced with another decision: How do we install this application? Since Ubuntu has so thoughtfully optimized their desktop for human use, one of the things we immediately notice in the “Applications” menu is an option which says “Add/Remove…” and when you click on it, a list of software comes up and it appears that all you need to do is select what you want and it will install it for you. Sweet! However, the list of software there doesn’t include every program, so sometimes you need to use the Synaptic package manager, which is also a GUI application installation program (though it appears to break each piece of software into smaller bits). Also, in looking around the web, you see that someone has explained that you should download and install software by typing this in the command line: apt-get install amarok. But wait! We really should be using the aptitude command instead of apt-get to install applications.
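For reference, here’s roughly what those command-line routes look like (a sketch, using amarok as the package name):

    # refresh the package lists from your configured repositories
    sudo apt-get update

    # install with apt-get...
    sudo apt-get install amarok

    # ...or with aptitude, which is better about tracking dependencies
    sudo aptitude install amarok

    # not sure of the exact package name? search for it:
    apt-cache search amarok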

If you’re keeping track, that’s four different ways to install a program, and I haven’t even gotten into repositories (main, restricted, universe, multiverse, oh my!), downloadable package files (these operate more or less the way a Windows user would download a .exe installation file, though not exactly), let alone downloading the source code and compiling (sounds fun, doesn’t it?). To be sure, they all work, and they’re all pretty easy to figure out, but there’s little consistency, especially when it comes to support (most of the time, you’ll get a command line in response to a question, which is completely at odds with the expectations of someone switching from Windows). Also, in the case of Amarok, I didn’t fare so well (for reasons belabored in that post).

Once installed, most software works pretty much the way you’d expect. As previously mentioned, open source developers sometimes get carried away with their efficiencies, which can sometimes be confusing to a newbie, but for the most part, it works just fine. There are some exceptions, like the absurd Blender, but then, Blender isn’t exactly a hugely popular application that everyone needs.

Believe it or not, I’m simplifying here. There are that many choices in Linux. Ubuntu tries its best to make things as simple as possible (with considerable success), but when using Linux, it’s inevitable that you’ll run into something that requires you to break down the metaphorical walls of the GUI and muck around in the complicated swarm of text files and command lines. Again, it’s not that difficult to figure this stuff out, but all these choices contribute to the same decision fatigue I discussed in my last post: anticipated regret (there are so many distros – I know I’m going to choose the wrong one), actual regret (should I have installed Suse?), dissatisfaction, escalation of expectations (I’ve spent so much time figuring out what distro to use that it’s going to perfectly suit my every need!), and leakage (i.e. a bad installation process will affect what you think of a program, even after installing it – your feelings before installing leak into the usage of the application).

None of this is to say that Linux is bad. It is free, in every sense of the word, and I believe that’s a good thing. But if they ever want to create a desktop that will rival Windows or OSX, someone needs to create a distro that clamps down on some of these choices. Or maybe not. It’s hard to advocate something like this when you’re talking about software that is so deeply predicated on openness and freedom. However, as I concluded in my last post:

Without choices, life is miserable. When options are added, welfare is increased. Choice is a good thing. But too much choice causes the curve to level out and eventually start moving in the other direction. It becomes a matter of tradeoffs. Regular readers of this blog know what’s coming: We don’t so much solve problems as we trade one set of problems for another, in the hopes that the new set of problems is more favorable than the old.

Choice is a double-edged sword, and by embracing that freedom, Linux has to deal with the bad as well as the good (just as Microsoft and Apple have to deal with the bad aspects of suppressing freedom and choice). Is it possible to create a Linux distro that is as easy to use as Windows or OSX while retaining the openness and freedom that makes it so wonderful? I don’t know, but it would certainly be interesting.

The Paradox of Choice

At the UI11 Conference I attended last week, one of the keynote presentations was made by Barry Schwartz, author of The Paradox of Choice: Why More Is Less. Though he believes choice to be a good thing, his presentation focused more on the negative aspects of offering too many choices. He walks through a number of examples that illustrate the problems with our “official syllogism” which is:

  • More freedom means more welfare
  • More choice means more freedom
  • Therefore, more choice means more welfare

In the United States, we have operated as if this syllogism is unambiguously true, and as a result, we’re deluged with choices. Just take a look at a relatively small supermarket: there are 285 cookies, 75 iced teas, 275 cereals, 40 toothpastes, 230 soups, and 175 salad dressings (not including 12 extra virgin olive oils and 18 vinegars which could be combined to make hundreds of vinaigrettes) to choose from (and this was supposedly a smaller supermarket). At your typical Circuit City, the sheer breadth of stereo components allows you to create any one of 6.5 million possible stereo systems. And this applies all throughout our lives, extending even to working, marriage, and whether or not to have children. In the past, these things weren’t much of a question. Today, everything is a choice. [thanks to Jesper Rønn-Jensen for his notes on Schwartz’s talk – it’s even got pictures!]

So how do we react to all these choices? Luke Wroblewski provides an excellent summary, which I will partly steal (because, hey, he’s stealing from Schwartz after all):

  • Paralysis: When faced with so many choices, people are often overwhelmed and put off the decision. I often find myself in such a situation: Oh, I don’t have time to evaluate all of these options, I’ll just do it tomorrow. But, of course, tomorrow is usually not so different than today, so you see a lot of procrastination.
  • Decision Quality: Of course, you can’t procrastinate forever, so when forced to make a decision, people will often use simple heuristics to evaluate the field of options. In retail, this often boils down to evaluation based mostly on Brand and Price. I also read a recent paper on feature fatigue (full article not available, but the abstract is there) that fits nicely here.

    In fields where there are many competing products, you see a lot of feature bloat. Loading a product with all sorts of bells and whistles will differentiate that product and often increase initial sales. However, all of these additional capabilities come at the expense of usability. What’s more, even when people know this, they still choose high-feature models. The only thing that really helps is when someone actually uses a product for a certain amount of time, at which point they realize that they either don’t use the extra features or that the tradeoffs in terms of usability make the additional capabilities considerably less attractive. Part of the problem is perhaps that usability is an intangible and somewhat subjective attribute of a product. Intellectually, everyone knows that it is important, but when it comes down to decision-time, most people base their decisions on something that is more easily measured, like number of features, brand, or price. This is also part of why focus groups are so bad at measuring usability. I’ve been to a number of focus groups that start with a series of exercises in front of a computer, then end with a roundtable discussion about their experiences. Usually, the discussion was completely at odds with what the people actually did when in front of the computer. Watch what they do, not what they say…

  • Decision Satisfaction: When presented with a lot of choices, people may actually do better for themselves, yet they often feel worse due to regret or anticipated regret. Because people resort to simplifying their decision making process, and because they know they’re simplifying, they might also wonder if one or more of the options they cut was actually better than what they chose. A little while ago, I bought a new cell phone. I actually did a fair amount of work evaluating the options, and I ended up going with a low-end no-frills phone… and instantly regretted it. Of course, the phone itself wasn’t that bad (and for all I know, it was better than the other phones I passed over), but I regret dismissing some of the other options, such as the camera (how many times over the past two years have I wanted to take a picture and thought Hey, if I had a camera on my phone I could have taken that picture!).
  • Escalation of expectations: When we have so many choices and we do so much work evaluating all the options, we begin to expect more. When things were worse (i.e. when there were fewer choices), it was much easier to exceed expectations. In the cell phone example above, part of the regret was no doubt fueled by the fact that I spent a lot of time figuring out which phone to get.
  • Maximizer Impact: There are some people who always want to have the best, and the problems inherent in too many choices hit these people the hardest.
  • Leakage: The conditions present when you’re making a decision exert influence long after the decision has actually been made, contributing to the dissatisfaction (i.e. regret, anticipated regret) and escalation of expectations outlined above.

As I was watching this presentation, I couldn’t help but think of various examples in my own life that illustrated some of the issues. There was the cell phone choice which turned out badly, but I also thought about things I had chosen that had come out well. For example, about a year ago, I bought an iPod, and I’ve been extremely happy with it (even though it’s not perfect), despite the fact that there were many options which I considered. Why didn’t the process of evaluating all the options evoke a feeling of regret? Because my initial impulse was to purchase the iPod, and I looked at the other options simply out of curiosity. I also had the opportunity to try out some of the players, and that experience helped enormously. And finally, the one feature that had given me pause was video: the Cowon iAudio X5 had it, and the iPod (when I started looking around) didn’t. As it turned out, about a week later the Video iPod was released, which made my decision very easy. I got that and haven’t looked back since. The funny thing is that since I’ve gotten that iPod, I haven’t used the video feature for anything useful. Not even once.

Another example is my old PC which has recently kicked the bucket. I actually assembled that PC from a bunch of parts, rather than going through a mainstream company like Dell, and the number of components available would probably make the Circuit City stereo example I gave earlier look tiny by comparison. Interestingly, this diversity of choices for PCs is often credited as part of the reason PCs overtook Macs:

Back in the early days of Macintoshes, Apple engineers would reportedly get into arguments with Steve Jobs about creating ports to allow people to add RAM to their Macs. The engineers thought it would be a good idea; Jobs said no, because he didn’t want anyone opening up a Mac. He’d rather they just throw out their Mac when they needed new RAM, and buy a new one.

Of course, we know who won this battle. The “Wintel” PC won: The computer that let anyone throw in a new component, new RAM, or a new peripheral when they wanted their computer to do something new. Okay, Mac fans, I know, I know: PCs also “won” unfairly because Bill Gates abused his monopoly with Windows. Fair enough.

But the fact is, as Hill notes, PCs never aimed at being perfect, pristine boxes like Macintoshes. They settled for being “good enough” — under the assumption that it was up to the users to tweak or adjust the PC if they needed it to do something else.

But as Schwartz would note, the amount of choices in assembling your own computer can be stifling. This is why computer and software companies like Microsoft, Dell, and Apple (yes, even Apple) insist on mediating the user’s experience with their hardware by limiting access (i.e. by limiting choice). This turns out to be not so bad, because the number of things to consider really is staggering. So why was I so happy with my computer? Because I really didn’t make many of the decisions – I simply went over to Ars Technica’s System Guide and used their recommendations. When it comes time to build my next computer, what do you think I’m going to do? Indeed, Ars is currently compiling recommendations for their October system guide, due out sometime this week. My new computer will most likely be based on their “Hot Rod” box. (Linux presents some interesting issues in this context as well, though I think I’ll save that for another post.)

So what are the lessons here? One of the big ones is to separate the analysis from the choice by getting recommendations from someone else (see the Ars Technica example above). In the market for a digital camera? Call a friend (preferably one who is into photography) and ask them what to get. Another thing that strikes me is that just knowing about this can help you overcome it to a degree. Try to keep your expectations in check, and you might open up some room for pleasant surprises (doing this is surprisingly effective with movies). If possible, try using the product first (borrow a friend’s, use a rental, etc…). Don’t try to maximize the results so much; settle for things that are good enough (this is what Schwartz calls satisficing).

Without choices, life is miserable. When options are added, welfare is increased. Choice is a good thing. But too much choice causes the curve to level out and eventually start moving in the other direction. It becomes a matter of tradeoffs. Regular readers of this blog know what’s coming: We don’t so much solve problems as we trade one set of problems for another, in the hopes that the new set of problems is more favorable than the old. So where is the sweet spot? That’s probably a topic for another post, but my initial thoughts are that it would depend heavily on what you’re doing and the context in which you’re doing it. Also, if you were to take a wider view of things, there’s something to be said for maximizing options and then narrowing the field (a la the free market). Still, the concept of choice as a double-edged sword should not be all that surprising… after all, freedom isn’t easy. Just ask Spider-Man.

Link Dump

I’ve been quite busy lately so once again it’s time to unleash the chain-smoking monkey research squad and share the results:

  • The Truth About Overselling!: Ever wonder how web hosting companies can offer obscene amounts of storage and bandwidth these days? It turns out that these web hosting companies are offering more than they actually have. Josh Jones of Dreamhost explains why this practice is popular and how they can get away with it (short answer – most people emphatically don’t use or need that much bandwidth).
  • Utterly fascinating pseudo-mystery on Metafilter. Someone got curious about a strange flash advertisement, and a whole slew of people started investigating, analyzing the flash file, plotting stuff on a map, etc… Reminded me a little of that whole Publius Enigma thing [via Chizumatic].
  • Weak security in our daily lives: “Right now, I am going to give you a sequence of minimal length that, when you enter it into a car’s numeric keypad, is guaranteed to unlock the doors of said car. It is exactly 3129 keypresses long, which should take you around 20 minutes to go through.” (The math works out if the keypad has five buttons and codes are five digits long: a de Bruijn-style sequence covers all 5^5 = 3125 possible codes with one keypress each, plus 4 extra presses to complete the final code.) [via Schneier]
  • America’s Most Fonted: The 7 Worst Fonts: Fonts aren’t usually a topic of discussion here, but I thought it was funny that the Kaedrin logo (see upper left hand side of this page) uses the #7 worst font. But it’s only the logo and that’s ok… right? RIGHT?
  • Architecture is another topic rarely discussed here, but I thought that the new trend of secret rooms was interesting. [via Kottke]

That’s all for now. Things appear to be slowing down, so that will hopefully mean more time for blogging (i.e. less link dumpy type posts).

Linux Humor & Blog Notes

I’ll be attending the User Interface 11 conference this week, and as such, won’t have much time to check in. Try not to wreck the place while I’m gone. Since I’m off to the airport in fairly short order (why did I schedule a flight to conflict with the Eagles/Cowboys matchup? Dammit!) here’s a quick comic with some linux humor:

sudo make me a sandwich

The author, Randall Munroe, is a NASA scientist who has a keen sense of humor (and is apparently deathly afraid of raptors) and publishes a new comic a few times a week. The comic above is one of his most popular, and even graces one of his T-shirts (I also like the “Science. It works, bitches.” shirt).

I’m sure I’ll be able to wrangle some internet access during the week, but chances are that it will be limited (I need to get me a laptop at some point). I’ll be back late Thursday night, so posting will probably resume next Sunday.

Adventures in Linux, iPod edition

Last weekend, my Windows machine died and I decided to give linux a shot. My basic thought was that if I could get a linux box to do everything I need, why bother getting another copy of windows? So I cast about looking for applications to fulfill my needs, and thus found myself on Mark Pilgrim‘s recently updated list of linux Essentials (Pilgrim has recently experienced a bit of net notoriety due to his decision to abandon Apple for Ubuntu).

So I need something to replace iTunes (which I use to play music and update my iPod). No problem:

amaroK. It’s just like iTunes except it automatically fetches lyrics from Argentina, automatically looks up bands on Wikipedia, automatically identifies songs with MusicBrainz, and its developers are actively working on features that don’t involve pushing DRM-infected crap down my throat. Add the amarok repository to get the latest version. apt-get install amarok

After taking that advice and installing Amarok, I think that paragraph would be better written as:

amaroK. It’s just like iTunes except it automatically orphans most of your library so that you can’t see or play most of your music on your iPod, it doesn’t handle video, it can’t write to the iPod’s podcast directory, and (my personal favorite) if you plug your Amarokized iPod into a windows machine, it crashes iTunes. Add the amarok repository to get the latest version, as the latest version doesn’t seem to have those problems.

Yes, that’s right, I plugged in my iPod and Amarok corrupted the iTunes database. I could still use my iPod, but I could only see 256 songs (out of around 1000). It didn’t delete the files – all 1000 songs were still on the iPod – it just screwed up the database that controls the iPod. The issue turns out to be that I installed an older version of Amarok, and since Mark recommended getting the latest version, I really can’t fault him for this debacle. You see, Ubuntu comes with a few user-friendly ways of installing programs. These are based on what’s called “repositories,” which are basically databases full of programs that you can browse. So I fired up one of these installation programs, found Amarok, and installed it… not realizing that the default Ubuntu repository had an older version of the program.
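For what it’s worth, adding a repository turned out to be a one-line edit, though nobody tells you that up front. A sketch of the process (the URL below is a made-up placeholder, not the actual Amarok repository, and “dapper” stands in for whatever Ubuntu release you’re running):

    # repositories are listed in a plain text file; open it as root
    sudo nano /etc/apt/sources.list

    # add a line like this one (placeholder URL):
    # deb http://repository.example.org/ubuntu dapper main

    # then refresh the package lists and pull in the newer version
    sudo apt-get update
    sudo apt-get install amarok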

Some thoughts:

  • Linux is dangerous (it’s the hole hawg of operating systems)! Sometimes doing simple things can have catastrophic results.
  • When someone says get the latest version, get the latest version.
  • I learned what repositories are and how to add one to my system.
  • When asking for help, you’ll probably get an answer quickly, but it will usually consist of several command lines which you probably won’t understand. This is particularly nerve-wracking when combined with the first bullet point. I get a little anxious whenever someone tells me to type one of these things in the command line, because I don’t know what’s going on and I don’t want my system to explode. Linux is dangerous. Sometimes doing simple things can have catastrophic results. When I said I don’t want my system to explode, I meant that metaphorically, but I’m positive that if I set my mind to it, I could make my computer literally explode by altering a simple text file somewhere. This reliance on the command line is also one of the reasons it’s hard to learn Linux – the commands usually work, but you don’t understand why unless you look them up (and even then, it can be a little difficult to understand. Documentation isn’t one of open source’s strong points). Plus, whenever I am forced to use these command lines, I’m usually very task oriented. I don’t have time to research the intricacies of every command line utility, I just want to complete my task (a couple of habits that take the edge off are sketched below).
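Those habits, for the curious (a sketch; nothing here changes your system):

    # read what a command actually does before running it
    man apt-get

    # apt-get can also simulate an action without touching anything
    # (-s for simulate), which takes some of the terror out of pasting
    # commands from strangers
    apt-get -s install amarok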

To their credit, I posted my problem to the Amarok forum (at 3 in the morning) and received several helpful responses by the time I woke up just a few hours later. I was able to install the latest version of Amarok, though that didn’t really help me repair my iPod (there was a feature which would do this in theory, but when I tried it, the application just started eating up lots of memory until it hit the system limit, and then it just shut down). I had to use a different utility, called gtkpod, to scan through my iPod and rescue all of the orphaned files (and it took a few hours to do so). For some reason, a lot of my music is being recognized as podcasts on my iPod, but otherwise the iPod is in much better shape. I can see all my music now, and plugging it into a Windows computer doesn’t crash iTunes anymore.
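For anyone in the same boat, gtkpod came from the same repositories as everything else. A minimal sketch:

    # gtkpod is a standalone GUI for managing an iPod's database
    sudo apt-get install gtkpod

    # launch it with the iPod plugged in, then use its orphaned-file
    # handling to re-add the tracks the database lost
    gtkpod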

Obviously, I had a bad experience here, but I’m still a little confused as to how Amarok is a valid iTunes replacement. Even with the latest version, it still has no support for videos (and the developers don’t plan to add it either, their excuse being that Amarok is just a music player) and its podcast support isn’t ideal (I can upload them to my iPod, but they get put in the general library, not the special podcast library. Strike that. It turns out that when the iPod isn’t corrupted, the podcasts work as they should, though I’m still not sure it’s the ideal interface). The interface for viewing and maintaining the iPod is a little sparse and lacks some of the maintenance features of iTunes. As far as I can tell, Amarok is a fine music player and probably rivals or surpasses iTunes in that respect (I assume this is why people seem to love it so much). But in terms of maintaining an iPod, it sucks (at least, so far – I’m willing to bet there’s lots of functionality I’m missing). Support for iPods in general seems to be a bit lacking in Linux, though there are some things you can do in Linux that you can’t do in Windows. It’s also something that could probably improve in time, but it’s definitely not there yet.

Despite the problems, I find myself strangely bemused at the experience. It was exactly what I feared, but in the end, I’m not that upset about it. There’s a part of me that likes digging into the details of a problem and troubleshooting like this… but then, there’s also a part of me that knows spending 5 hours trying to install something I could install in about 10 minutes on a Windows box is ludicrous. All’s well that ends well, I guess, but consider me unimpressed. It’s not enough for me to forsake linux, but it’s enough to make me want to create a dual boot machine rather than a pure linux box.

Update: In using Amarok a little more, I see that it supports podcasts better than I originally thought.

The Death of Sulaco

I have two computers running here at Kaedrin headquarters. My primary computer is a Windows box called Sulaco. My secondary computer is running Ubuntu Linux and is called Nostromo. Yesterday, Sulaco nearly died. I’ll spare you the details (which are covered in the forum), but it started with some display trouble. It could have been the drivers for my video card, or it could have been that the video card itself was malfunctioning. In any case, by this morning, Sulaco’s Windows registry was thoroughly corrupted. All attempts to salvage the installation failed. For some reason, my Windows XP CD failed to boot, and my trusty Win 98 floppy boot disk wouldn’t let me run the setup from the XP CD (nor could I even see my hard drive, which had some files on it I wanted to retrieve).

To further complicate matters, the CD burner on my linux box has always been flaky, so I couldn’t use that to create a new boot disk. However, I did remember that my Ubuntu installation disk could run as a Live CD. A few minutes of google searching yielded step-by-step instructions for booting a Windows box with an Ubuntu Live CD, mounting the Windows drive and sharing it via Windows File Sharing (i.e. Samba). A few minutes later and I was copying all appropriate data from Sulaco to Nostromo.
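The instructions boiled down to a handful of commands. Roughly (a sketch; /dev/hda1 is an assumption – your Windows partition may have a different device name – and the scp paths are placeholders):

    # mount the Windows (NTFS) partition read-only so nothing gets damaged
    sudo mkdir /mnt/windows
    sudo mount -t ntfs -o ro /dev/hda1 /mnt/windows

    # confirm the files are actually visible
    ls /mnt/windows

    # the guide goes on to share /mnt/windows via Samba; if the other
    # machine runs an ssh server, scp works just as well:
    scp -r /mnt/windows/files user@nostromo:/home/user/rescue/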

For all intents and purposes, Sulaco is dead. She has served me well, and it should be noted that she was constructed nearly 6 years ago with turn-of-the-century hardware. I’m actually amazed that she held up so well for so long, but her age was showing. Upgrades would have been necessary even without the display/registry problems. The question now is how to proceed.

I’ve been fiddling with Linux for, oh, 8 years or so. Until recently, I’ve never found it particularly useful. Even now, I’m wary of it. However, the ease with which I was able to install Ubuntu and get it up on my wireless network (this task had given me so much trouble in the past that I was overjoyed when I managed to get it working) made me reconsider a bit. Indeed, the fact that the way I recovered from a Windows crash was to use linux is also heartening. On the other hand, I also have to consider the fact that if someone hadn’t written detailed instructions for the exact task I was attempting, I probably never would have figured it out in a reasonable timeframe. This is the problem with linux. It’s hard to learn.

Yes, I know, it’s a great operating system. I’ve fiddled with it enough to realize that some of the things that might seem maddeningly and deliberately obscure are actually done for the best of reasons in a quite logical manner (unless, of course, you’re talking about the documentation, which is usually infuriating). I’m not so much worried that I can’t figure it out; it’s that I don’t really have the time to work through its idiosyncrasies. As I’ve said, recent experiences have been heartening, but I’m still wary. Open source software is a wonderful thing in theory, but I’d say that my experience with such applications has been mixed at best. For an example of what I’m worried about, see Shamus’ attempts to use Blender, an open source 3d modeling program.

My next step will be to build a new box in Sulaco’s place. As of right now, I’m leaning towards installing Ubuntu on that and using something like WINE (a Windows compatibility layer, though it often gets called an emulator) to run the proprietary Windows software I need (which probably isn’t much at this point). So right now, Nostromo is my guinea pig. If I can get this machine to do everything I need it to do in the next few days, I’ll be a little less wary. If I can’t, I’ll find another Windows CD and install that. To be perfectly honest, Windows has served me well. Until yesterday, I’ve never had a problem with my installation of XP, which was stable and responsive for several years (conventional wisdom seems to dictate that running XP requires a complete reinstallation every few months – I’ve never had that problem). That said, I don’t particularly feel like purchasing a new copy, especially when Vista is right around the corner…
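If the WINE experiment happens, the basic workflow is at least simple. A sketch (the .exe path is obviously a placeholder):

    # install WINE from the Ubuntu repositories
    sudo apt-get install wine

    # run a Windows installer or program through it
    wine ~/downloads/SomeWindowsApp.exe

    # WINE keeps a fake C: drive under ~/.wine
    ls ~/.wine/drive_c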

Magic Design

A few weeks ago, I wrote about magic and how subconscious problem solving can sometimes seem magical:

When confronted with a particularly daunting problem, I’ll work on it very intensely for a while. However, I find that it’s best to stop after a bit and let the problem percolate in the back of my mind while I do completely unrelated things. Sometimes, the answer will just come to me, often at the strangest times. Occasionally, this entire process will happen without my intending it, but sometimes I’m deliberately trying to harness this subconscious problem solving ability. And I don’t think I’m doing anything special here; I think everyone has these sort of Eureka! moments from time to time. …

Once I noticed this, I began seeing similar patterns throughout my life and even history.

And indeed, Jason Kottke recently posted about how design works, referencing a couple of other designers, including Michael Bierut of Design Observer, who describes his process like this:

When I do a design project, I begin by listening carefully to you as you talk about your problem and read whatever background material I can find that relates to the issues you face. If you’re lucky, I have also accidentally acquired some firsthand experience with your situation. Somewhere along the way an idea for the design pops into my head from out of the blue. I can’t really explain that part; it’s like magic. Sometimes it even happens before you have a chance to tell me that much about your problem!

[emphasis mine] It is like magic, but as Bierut notes, this sort of thing is becoming more important as we move from an industrial economy to an information economy. He references a book about managing artists:

At the outset, the writers acknowledge that the nature of work is changing in the 21st century, characterizing it as “a shift from an industrial economy to an information economy, from physical work to knowledge work.” In trying to understand how this new kind of work can be managed, they propose a model based not on industrial production, but on the collaborative arts, specifically theater.

… They are careful to identify the defining characteristics of this kind of work: allowing solutions to emerge in a process of iteration, rather than trying to get everything right the first time; accepting the lack of control in the process, and letting the improvisation engendered by uncertainty help drive the process; and creating a work environment that sets clear enough limits that people can play securely within them.

This is very interesting and dovetails nicely with several topics covered on this blog. Harnessing self-organizing forces to produce emergent results seems to be rising in importance significantly as we proceed towards an information based economy. As noted, collaboration is key. Older business models seem to focus on a more brute force way of solving problems, but as we proceed we need to find better and faster ways to collaborate. The internet, with its hyperlinked structure and massive data stores, has been struggling with a data analysis problem since its inception. Only recently have we really begun to figure out ways to harness the collective intelligence of the internet and its users, but even now, we’re only scraping the tip of the iceberg. Collaborative projects like Wikipedia or wisdom-of-crowds aggregators like Digg or Reddit represent an interesting step in the right direction. The challenge here is that we’re not facing the problems directly anymore. If you want to create a comprehensive encyclopedia, you can hire a bunch of people to research, write, and edit entries. Wikipedia tried something different. They didn’t explicitly create an encyclopedia; they created (or, at least, they deployed) a system that made it easy for a large number of people to collaborate on a large number of topics. The encyclopedia is an emergent result of that collaboration. They sidestepped the problem, and as a result, they have a much larger and more dynamic information resource.

None of those examples are perfect, of course, but the more I think about it, the more I think that their imperfection is what makes them work. As noted above, you’re probably much better off releasing a site that is imperfect and iterating, making changes and learning from your mistakes as you go. When dealing with these complex problems, you’re not going to design the perfect system all at once. I realize that I keep saying we need better information aggregation and analysis tools, and that we have these tools, but they leave something to be desired. The point of these systems, though, is that they get better with time. Many older information analysis systems break when you increase the workload quickly. They don’t scale well. These newer systems only really work well once they have high participation rates and large amounts of data.

It remains to be seen whether or not these systems can actually handle that much data (and participation), but like I said, they’re a good start and they’re getting better with time.

YALD

Time is short this week, so it’s time for Yet Another Link Dump (YALD!):

  • Who Writes Wikipedia? An interesting investigation of one of the controversial aspects of Wikipedia. Some contend that the authors are a small but dedicated bunch, others claim that authorship is large and diverse (meaning that the resulting encyclopedia is self-organizing and emergent). Aaron Swartz decided to look into it:

    When you put it all together, the story becomes clear: an outsider makes one edit to add a chunk of information, then insiders make several edits tweaking and reformatting it. In addition, insiders rack up thousands of edits doing things like changing the name of a category across the entire site — the kind of thing only insiders deeply care about. As a result, insiders account for the vast majority of the edits. But it’s the outsiders who provide nearly all of the content.

    And when you think about it, this makes perfect sense. Writing an encyclopedia is hard. To do anywhere near a decent job, you have to know a great deal of information about an incredibly wide variety of subjects. Writing so much text is difficult, but doing all the background research seems impossible.

    On the other hand, everyone has a bunch of obscure things that, for one reason or another, they’ve come to know well. So they share them, clicking the edit link and adding a paragraph or two to Wikipedia. At the same time, a small number of people have become particularly involved in Wikipedia itself, learning its policies and special syntax, and spending their time tweaking the contributions of everybody else.

    Depending on how you measure it, either perspective can look correct, but the important thing here is that both types of people (outsiders and insiders) are necessary to make the system work. Via James Grimmelman, who has also written an interesting post on Wikipedia Fallacies that’s worth reading.

  • Cyber Cinema, 1981-2001: An absurdly comprehensive series of articles chronicling cyberpunk cinema. This guy appears to know his stuff, and chooses both obvious and not-so-obvious films to review. For example, he refers to Batman as “a fine example of distilled Cyberpunk.” I probably wouldn’t have pegged Batman as cyberpunk, but he makes a pretty good case for it… Anyway, I haven’t read all of his choices (20 movies, 1 for each year), but it’s pretty interesting stuff. [via Metaphlog]
  • The 3-Day Novel Contest: Well, it’s too late to partake now, but this is an interesting contest where entrants all submit a novel written in 3 days. The contest is usually held over Labor Day weekend (allowing everyone to make the most of their long holiday weekend). The Survival Guide is worth reading even if you don’t intend to take part. Some excerpts: On the attitude required for such an endeavor:

    Perhaps the most important part of attitude when approaching a 3-Day Novel Contest is that of humility. It is not, as one might understandably and mistakenly expect, aggression or verve or toughness or (as it has been known) a sheer murderous intent to complete a 3-Day Novel (of this latter approach it is almost always the entrant who dies and not the contest). Let’s face it, what you are about to do, really, defies reality for most people. As when in foreign lands, a slightly submissive, respectful attitude generally fares better for the traveller than a self-defeating mode of overbearance. As one rather pompous contestant confessed after completing the contest: “I’ve been to Hell, and ended up writing about it.”

    On outlines and spontaneity:

    Those without a plan, more often than not, find themselves floundering upon the turbulent, unforgiving seas of forced spontaneous creativity. An outline can be quite detailed and, as veterans of the contest will also tell you, the chances of sticking to the outline once things get rolling are about 1,000 to 1. But getting started is often a major hurdle and an outline can be invaluable as an initiator.

    Two things that interest me about this: plans that fall apart, but must be made anyway (which I have written about before) and the idea that just getting started is important (which is something I’ll probably write about sometime, assuming I haven’t already done so and forgot).

    On eating:

    Keep it simple, and fast. Wieners (straight from the package—protein taken care of). Bananas and other fruit (vitamin C, potassium, etc.). Keep cooking to a minimum. Pizzas, Chinese—food to go. Forget balance, this is not a “spa”, there are no “healing days”. This is a competition; a crucible; a hill of sand. Climb! Climb!

    Lots of other fun stuff there. Also, who says you need to do it on Labor Day weekend? Why not take a day off and try it out? [via Web Petals, who has some other interesting quotes from the contest]

That’s all for now. Sorry for just throwing links at you all the time, but I’ve entered what’s known as Wedding Season. Several weddings over the next few weekends, only one of which is in this area. This week’s was in Rhode Island, so I had a wonderful 12-13 hours of driving to contend with (not to mention R.I.’s wonderful road system – apparently they don’t think signs are needed). Thank goodness for podcasts – specifically Filmspotting, Mastercritic, and the Preston and Steve Show (who are professional broadcasters, but put their entire show (2+ hours) up, commercial free, every day).

Shockingly, it seems that I only needed to use two channels on my Monster FM Transmitter and both of those channels are the ones I use around Philly. Despite this, I’ve not been too happy with my FM transmitter thingy. It gets the job done, I guess, but I find myself consistently annoyed at its performance (this trip being an exception). It seems that these things are very idiosyncratic and unpredictable, working in some cars better than others (thus some people swear by one brand, while others will badmouth that same brand). In large cities like New York and Philadelphia, the FM dial gets crowded and thus it’s difficult to find a suitable station, further complicating matters. I think my living in a major city area combined with an awkward placement of the cigarette lighter in my car (which I assume is a factor) makes it somewhat difficult to find a good station. What would be really useful would be a list of available stations and an attempt to figure out ways to troubleshoot your car’s idiosyncrasies. Perhaps a wiki would work best for this, though I doubt I’ll be motivated enough to spend the time installing a wiki system here for this purpose (does a similar site already exist? I did a quick search but came up empty-handed). (There are kits that allow you to tap into your car stereo, but they’re costly and I don’t feel like paying more for that than I did for the player… )