Slashdot just publicized 30 Days With Ubuntu, a review of the Ubuntu Linux distribution and its strengths and shortcomings. I found the review to be honest and accurate. It is purely technical and does not argue the ethics of free/libre software. I want to use it as an opportunity to reflect on where this movement is today, and why it matters. This is not an in-depth analysis, but it might be interesting to some readers (and writing such things always helps me to hone my beliefs, my arguments, and my rhetoric). In this article, I will refer to free/libre software as “Free Code”, which is a bit of an experiment and which I will explain at the end.
I’ve been a Kubuntu user for a year now (and a Debian user before that for 3 1/2 years), so I have plenty of opinions of my own, but I want to focus on the main criticisms identified by the reviewer:
- support for recent hardware is incomplete – some things don’t work or require a lot of tweaking
- no commercial games, except through (crappy) emulation
- no good video editing code, no good PhotoShop replacement
- 64-bit version sucks in many ways
These are remarkably few criticisms, which gives an indication of how far Linux and Ubuntu have come. I’ll try to address each of them briefly.
Hardware support
This is going to be a tricky issue unless and until we get mainstream adoption. That’s not entirely a chicken-and-egg problem, because it does not affect machines which are specifically selected or built to run Linux (which is, of course, how the vast majority of users get Windows on their computers in the first place). I think three things mainly have to happen to make progress:
- We need to build a really, really sexy Linux-based PC that everyone wants to have, get some large vendor to support it, and market it widely. See [[FriendlyPC]] for some ideas on this. The OLPC effort will be a good case study on what to do and what to avoid. (I’m particularly interested in whether Squeak will prove to be a viable development platform or not.) Not likely in 2 years. Maybe use the Wikipedia brand? If WP regains significant credibility, a “Wikipedia PC” could become quite the item.
- Users of particular hardware need to be able to find each other, and pool resources (money, activism & coding ability), to make hardware support happen. If a major distributor like Ubuntu integrated such a matching tool, it could make a significant difference in the speed with which new hardware is supported. This goes for general code needs as well, of course, but as the review shows, these are fairly well met already.
- There needs to be a single certification program that is supported by all major Linux vendors & companies. I don’t want to look for “Red Hat enabled” or “Ubuntu supported”; I want to know whether recent versions of any of these distributions will work with the hardware or not. And once the certification program has gained acceptance, you can do a bait and switch and withdraw certification from non-free vendors.
Of course, an increasing market dominance of a single Linux-based platform will also make things easier. My general philosophy on competition is that it should only exist where it makes eminent sense, i.e., it should grow from irreconcilable disagreements about philosophy, direction, management, or architecture, not from the simple desire of some people to make more money than others. This ties into my belief that we need to gradually change from a profit-driven society to a cooperative one. Hence, I support rallying around the most progressive Linux distribution, which appears to be Ubuntu/Debian at this point. (That said, I would prefer Canonical, the for-profit which supports Ubuntu, to be fully owned by the non-profit, which is the model chosen by the Mozilla Foundation.)
Some users believe strongly in making existing non-free drivers as easy to obtain and use as possible. I’m not completely opposed to the idea, but we need a more creative solution than this. My suggestion is what I call “/usr/bin/indulgence”, something like Automatix, but more deeply embedded into the OS, which would make installing any non-free code (from the lowest to the highest level) a trivial procedure, but which would also ask the user very kindly to support an equivalent Free Code implementation in any of the three ways mentioned earlier: money, activism, or code.
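To make the idea a bit more concrete, here is a minimal sketch of what such a tool might do. Everything about it is an illustrative assumption of mine: the name, the tiny catalogue of Free Code equivalents, and the use of apt-get as the underlying installer.

```python
#!/usr/bin/env python
"""Hypothetical /usr/bin/indulgence: install non-free code the easy way,
then kindly ask the user to support a Free Code equivalent."""

import subprocess
import sys

# Illustrative mapping of non-free packages to Free Code projects worth
# supporting; a real tool would pull this from a maintained catalogue.
FREE_EQUIVALENTS = {
    "nvidia-glx": ("the free nv/nouveau drivers", "https://nouveau.freedesktop.org/"),
    "rar": ("p7zip", "http://p7zip.sourceforge.net/"),
}

def install(package):
    # Do the easy thing first: install the non-free package.
    subprocess.check_call(["apt-get", "install", "-y", package])

    # Then ask, very kindly, for support in one of three forms:
    # money, activism, or code.
    if package in FREE_EQUIVALENTS:
        name, url = FREE_EQUIVALENTS[package]
        print("You just installed non-free code: %s" % package)
        print("Please consider supporting %s with money, activism, or code:" % name)
        print("  %s" % url)

if __name__ == "__main__":
    install(sys.argv[1])
```

The point of the sketch is the second half: the install itself stays trivial, while the request for money, activism, or code arrives at exactly the moment the user benefits from non-free code.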
I object to handling such matters carelessly and to the vociferous “You people are idiots for not making this as easy as possible” arguments that are sometimes made; these are short-sighted and uncreative (as is the ideological opposition to even discussing the issue).
Commercial games and emulation
The reviewer points out that Linux is not a state-of-the-art gaming platform. I’m quite happy with that. Free Code games give me an occasional distraction, but are not of sufficient depth to be seriously addictive. (Some Battle for Wesnoth or NetHack players might object to the previous statement.) I think this is something parents should take into account, and we should communicate it as a plus when talking to offices, governments, and home users.
If you know me, you may be surprised to learn that I’m in favor of some governmental regulation on games; not on grounds of violence or sex, but on grounds of addictiveness. That is a topic for another entry — but I personally see mainstream games on Linux as a non-issue. Your Mileage May Vary. If you want more distractions, there are thousands of games from the 1980s and 1990s that will work perfectly in emulators which pretend your machine is a Super Nintendo, a Commodore Amiga, or a DOS PC. Search for “Abandonware” and “ROMs” on BitTorrent et al.
I’m more interested in how the Free Code ethos, combined with continued innovation in funding and collaboration, could result in Free Games that are technically modern, but built with more in mind than getting as many players as possible to pay a monthly subscription fee. Games that teach things. Games that make people do things. Games that demand and encourage creativity. Games where the builders care about the lives of the players, and not just their money. I’ll be happy to promote and help with that. Getting the latest Blizzard game to run nicely? Not so much.
Missing code
You know that Linux is ready for governments and businesses when a 30-day review points out video and photo editing as the main weaknesses — and not because there are no Free Code replacements, but because they aren’t quite good enough yet. The reviewer only tried two applications, GIMP and Kino. I share his feelings towards the GIMP photo editor, which I regard as an “old school” Free Code project where the developers would rather tell the users why their program is, in fact, highly usable than conduct serious usability tests and make improvements. To be fair, the existing GIMP user base, which is used to the current implementation, may also resist significant changes.
That is not to say that the quite remarkable GIMP functionality could not be wrapped into a nicer user interface. GIMPShop is one such attempt, which I have not tried. I hope that it will become a well-maintained fork; I don’t have much hope for GIMP itself to improve in the UI department. I am personally partial to Krita which, while still young, seems to have generally made the right implementation decisions, and is truly user-focused (as is all of KDE — I love those guys). I am not a professional photo editor, so I don’t know how mature Krita is for serious work. It is good enough for everything I do.
As for video editing, it would have been nice to read about some alternatives to Kino, such as the popular Cinelerra. As an outsider, I expect it will take a couple of years for one of these solutions to become truly competitive. Fortunately, many end users have very basic needs (cutting, titling, some effects), which the existing solutions cover, as the reviewer acknowledges.
There are, of course, countless areas which the reviewer does not touch upon. Personally, I think two important missing components for home users are high-quality speech recognition (which many users will expect now that Vista is introducing it to the mainstream), and Free OCR. For OCR, I’m not even talking about quality; the existing packages are, frankly, useless (I have scanned, OCR’d and proofread books, so I know a little bit about the topic). While setting my father up with Kubuntu, I played with the Tesseract engine, which was recently freed. It produced the best results of all the ones I’ve tested, but still did poorly compared to proprietary solutions (perhaps due to a lack of preprocessing). Frankly, without some serious financial support I doubt we’ll make much headway anytime soon. If the Wikimedia Foundation had more resources to spare, I’d advocate funding this, as it would greatly benefit Wikisource.
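For what it’s worth, the kind of preprocessing I mean is quite basic. The sketch below, with made-up file names and an arbitrary threshold, assumes the tesseract command-line tool and the Python Imaging Library are installed; it simply binarizes a scan before handing it to the engine.

```python
# Rough sketch: binarize a scanned page before handing it to Tesseract.
# File names and the threshold are illustrative, not a tested recipe.
import subprocess
from PIL import Image

scan = Image.open("scan.png").convert("L")           # grayscale
clean = scan.point(lambda p: 255 if p > 160 else 0)  # crude fixed threshold
clean.save("scan-clean.tif")                         # Tesseract reads TIFF

# "tesseract <image> <output base>" writes the recognized text to scan.txt
subprocess.check_call(["tesseract", "scan-clean.tif", "scan"])
print(open("scan.txt").read())
```

Whether that alone closes much of the gap to proprietary solutions, I don’t know; it is simply the sort of step the engine currently leaves entirely to the user.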
64-bit support
The reviewer points out a few glitches in the apps and drivers of the 64-bit version of Ubuntu. I cannot find a single argument in the article for why this even matters to end users, except that “Vista has it” and that Ubuntu will have to provide good 64-bit support “to be taken seriously.” For the most part, this seems to be an issue of buzzword compliance, and I’m confident that the glitches will be worked out long before it starts to be relevant (i.e. when regular users start to operate on huge amounts of data in the course of their daily work).
Instead of chasing after an implementation detail of doubtful significance, I find the opposite direction of innovation much more fascinating: getting Linux to run well on low-end machines and embedded systems. If you want, you can run Linux on a 486 PC from 15 years ago using DeLi Linux, though you’ll have to do without some modern applications simply because of their resource requirements. Other applications, on the other hand, remain up to date on such systems, and there are even well-maintained text-mode web browsers for Linux, not to mention highly capable text-based authoring environments like emacs and vim.
Where next?
Given the state of Linux today, there is really no excuse for any government not to switch to Free Code. Certainly, there are still unmet needs, but if governments collaborate, these will be addressed quickly. Which, incidentally, is also a lovely way to bring people from different cultures together. Imagine Free Code hackers from Venezuela, Norway, and Pakistan working together on the same codebase. Such scenarios are, of course, already playing out, but would be much more common if any taxpayer-funded code needed to be freely licensed.
I believe this is inevitable, and that we’re going to continue to see significant progress within governments and academia. One could compare the Free Code progress to any civil rights struggle. Some will consider this an exaggeration, but one should not underestimate how much is connected to it, especially 20 to 30 years down the line: the involvement of developing nations in the information society, the control over the media platform through which everything will be created and played, the potential for reform to capitalism itself — and more. While Free Code may not save lives in ways which are as obvious as, say, an HIV drug or a law against anti-gay violence, it is the enabling support layer of a Free Culture which is very much connected to such issues: to education and awareness, to media decentralization, to intellectual property law, and so on and so forth.
Accordingly, we can expect some governments to take reactionary positions for decades (regardless of the quality of free solutions and the undeniable cost savings), while others are already well into the process of migrating. This may sound obvious, but it is a bit counterintuitive to many politically naive hackers: no matter how good a job we do, we will continue to meet resistance and opposition on all levels.
As for businesses migrating their desktops, I believe it’s going to be a slower process. There are many contributing factors here: the ignorance of decision-makers, the feeling that anything “free” is suspicious, the sustained marketing effort of proprietary vendors, the reluctance to commit to approaches which require cooperation with competitors, the short-term thinking that often dominates company policies, and last but not least, the common lack of any ethical perspective. Of course, there are also more practical grounds for opposition, but in general, I think the corporate sector (in spite of the adoption of Free Code on the server) is going to be the hardest to convert. I’m supportive of helping small businesses in particular make the switch, in spite of my reservations about the profit motive. (Realistically, we’re going to have to live with capitalism for some time …)
Let me explain the reason I used the idiosyncratic term “Free Code” in this article. “Free Software” is disappointingly ambiguous, and “Open Source” is morally sterile. The word “code” points to what matters most: the instructions, the recipes underlying an application, which can be remixed, shared and rearranged. And “Free Code”, like “Free Culture”, calls to mind freedom rather than any vague notion of “openness” of “sources”. Last but not least, it carries the ominous double meaning that Lessig pointed out: code is law. Code increasingly influences and shapes our society as it becomes more networked. Who would you rather have in charge of writing law: a monopoly-centered oligarchy, or anyone who has the will and the ability to do so? If you realize what code truly is, the conclusions become inescapable.
“Free Code” is perhaps unlikely to catch on, but sometimes I just want to try out a phrase to see how it feels. I’ve also always found “software” to be an undeniably silly term. It’s easy to ridicule people who would care about such a thing. It’s “soft.” It’s a commodity. To call a coder a “software developer” industrializes and trivializes their role. Coding should be a creative process which is deeply rooted in social and political awareness. People should be proud to say: “I am not a ‘software developer’. I am a Free Coder.”