Why Linux Won't Rule The Desktop Any Time Soon (1 of 2)

Before the hate mail rolls in, let me say that I’m a huge advocate of open standards and the general use of F/OSS. I love Linux and use it almost exclusively for all my server-side applications. I’ve used Linux on desktop systems for many years, have developed with it, had jobs supporting it, built custom distributions with it, and hacked on more systems than I can remember. All this, but I still switched to OS X for desktop use several years ago, and the thought of switching back makes me nauseated. This is not meant to be a criticism of any single distribution or technology, but rather an explanation of why the culture around Linux will not yield a massive exodus of “typical” Windows users any time soon.

It’s 2007, and proclamations have been made many times over the last 15 years that “Linux Is Ready For The Desktop”. Well, it’s 2007, and no, it’s not ready for the desktop; at least not for the masses. The people who tend to run desktop Linux systems these days satisfy two of these three criteria:

  1. Have more time than money.
  2. Are looking to run server processes.
  3. Have a natural competency for all things technical and enjoy fixing things.

If you fit two of these criteria there’s a good chance I’d recommend Linux, and I’ll be the first to help with the installation. But since the common user is not looking to run a web server on their desktop, and surely not looking for a 12-hour non-stop let’s-fix-all-my-device-drivers session, I don’t often recommend it to non-geeks. I’ve given presentations to businesses on why they should migrate their old Windows servers to Linux, but I have never once recommended that my parents do the same with their old home machine.

Why? Two things: usability and support. Today we’ll tackle the first half.

Usability

Many interfaces are built purely for function, not usability.

As a culture of programmers, we’re somewhat expected to produce interfaces with obtuse options, confusing layouts and poorly worded text. Would you like to “Set the SCSI IMMED flag”, or perhaps some “ISO9660 options”? Sure, why not.

Lots of tiny parts make for a horrible user experience.

The Unix principle that the system be composed of many small pieces of functionality assembled into a larger system may work at the system level, but it doesn’t work for everyday desktop applications. Obscure or horribly named products only make users more confused.

Linux users love reconfiguration/options far more than ordinary users.

As a result, many distributions ship with redundant tools, and if not, someone on the other half of the planet creates another distribution with slightly different packages. While the freedom to do so is priceless, the practical value of many spin-off distributions is next to zilch. We, the community of F/OSS advocates, need to emphasize convention over configuration far more heavily, and fork only when absolutely necessary.

The entire experience is not consistent and fluid.

Without a single vendor driving development of the major desktop applications, integration between all the pieces just isn’t happening well. Forcing the user to install optional packages and perform technical manual configuration is a major turn-off.

Some “front-end” applications just scrape the output of the command-line version.

This is a horribly error-prone way to integrate with a lower-level piece of software, since command-line output isn’t a stable interface and there’s no static check to catch breakage when the format changes. Please stop doing this. It makes things break across upgrades, and just adds more weight to the “just wipe the disk and start from scratch” camp (which I happen to be part of).
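
To make that concrete, here’s a minimal Python sketch (purely hypothetical, not lifted from any real front-end) contrasting a function that scrapes `df -P` output with one that asks the kernel directly through statvfs. The scraped version quietly assumes the column layout never changes; the function names and the choice of df are mine, just for illustration.

    # Hypothetical illustration: two ways a GUI front-end might report free disk space.
    import os
    import subprocess

    def free_bytes_by_scraping(path="/"):
        # Fragile: parses human-oriented `df -P` output and assumes the
        # "Available" column is always the fourth field of the last line.
        out = subprocess.run(["df", "-P", path],
                             capture_output=True, text=True).stdout
        fields = out.strip().splitlines()[-1].split()
        return int(fields[3]) * 1024  # -P reports 1024-byte blocks

    def free_bytes_via_api(path="/"):
        # Robust: uses the statvfs() system call, a stable kernel interface
        # that doesn't change when df's output formatting does.
        st = os.statvfs(path)
        return st.f_bavail * st.f_frsize

    print(free_bytes_by_scraping(), free_bytes_via_api())

A real front-end obviously does more than this, but the point stands: the second version survives an upgrade of coreutils, while the first one may not.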

Let the flames begin, and don’t forget to stick around for part 2!