Category: Free Culture

Wikipedia news

The discovery of Gliese 581 c is a watershed moment in the search for extrasolar planets and alien life. What folly to view religion as revelation, when it is science that is unwrapping the universe like a giant birthday present, making visible entire worlds one by one, in the unimaginably vast candy store of billions of observable galaxies. One of the most promising missions among the planet hunters is COROT, a space telescope operated by the French and European Space Agencies. And, of course, when I wanted to see what the state of that mission is, I intuitively looked it up on Wikipedia.

Purely by coincidence, COROT found its first planet yesterday. Not only was the discovery noted in the Wikipedia article about COROT; the planet itself already has an entry of its own. Thus, I did not learn about the discovery through the numerous RSS feeds and news websites I follow (including Wikinews), but through Wikipedia. We call Wikipedia an encyclopedia, but it is clearly much more than any encyclopedia history has ever seen.

I am hardly the first person to notice this, and indeed, the New York Times recently devoted an article to exploring Wikipedia’s coverage of the Virginia Tech massacre. How can one make more intelligent use of the news-like characteristics of Wikipedia and combine them in meaningful ways with our news-dedicated project, Wikinews?

I’ve personally subscribed to the history RSS feeds of a number of articles of interest (linked in the bottom left corner of an article’s “history” page). These feeds deliver diffs of the latest changes to the article, which can be useful in order to, say, notice that one of your favorite bands has released a new album. But of course you will get a lot of crud, including vandalism and boring maintenance edits. There are simple ways to make feeds smarter: only pushing changes into the feed once an article has stabilized, filtering out minor edits, and so on.
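As a rough illustration of such filtering, here is a minimal Python sketch that drops feed items whose edit summaries look like minor or maintenance edits. It assumes, purely for illustration, that such edits can be recognized by a summary prefix; real MediaWiki history feeds are richer than the simplified stand-in below and do not expose a clean minor-edit flag.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a MediaWiki article-history RSS feed.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Gliese 581 c - Revision history</title>
  <item><title>Gliese 581 c</title>
        <description>m Reverted vandalism</description></item>
  <item><title>Gliese 581 c</title>
        <description>Added discovery details from the ESO press release</description></item>
</channel></rss>"""

def substantive_items(feed_xml, noise_prefixes=("m ", "Revert")):
    """Return edit summaries that do not look like minor or
    maintenance edits (a crude prefix-based heuristic)."""
    root = ET.fromstring(feed_xml)
    kept = []
    for item in root.iter("item"):
        summary = item.findtext("description", default="")
        if not summary.startswith(noise_prefixes):
            kept.append(summary)
    return kept

print(substantive_items(SAMPLE_FEED))
```

A smarter version could also batch consecutive edits by the same author, or hold items back until an article has been stable for a few hours.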

Structured data will also allow for some interesting feed possibilities: if an album is an object associated with a band, then it is possible to be notified if there are specific changes to the existing objects or additions of new ones. This general principle can be applied wonderfully broadly, turning any wiki into a universal event notification mechanism. (Alert me when person X dies / a conference of type Y happens / an astronomical object with the characteristics A, B, and C is discovered.) Wikipedia (and its structured data repository) will be the single most useful one, but specialized wikis will of course thrive and benefit from the same technology.
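To make the idea concrete, here is a toy sketch of such a notification layer in Python. The event shape, rule builder, and subscriptions are all invented for illustration; nothing here corresponds to an actual MediaWiki or structured-data API.

```python
# A toy event-notification layer over structured wiki data: each change
# to an object is an event dict, and a subscription is a predicate.

def make_rule(object_type, **required):
    """Build a predicate matching events about objects of a given type
    whose properties include all the required key/value pairs."""
    def rule(event):
        obj = event["object"]
        return (obj.get("type") == object_type and
                all(obj.get(k) == v for k, v in required.items()))
    return rule

# Example subscriptions (hypothetical object types and properties).
subscriptions = [
    ("new album by The Magnetic Fields",
     make_rule("album", artist="The Magnetic Fields")),
    ("exoplanet discovered",
     make_rule("astronomical_object", category="exoplanet")),
]

def notify(event):
    """Return the labels of all subscriptions the event matches."""
    return [label for label, rule in subscriptions if rule(event)]

event = {"action": "created",
         "object": {"type": "astronomical_object",
                    "category": "exoplanet", "name": "Gliese 581 c"}}
print(notify(event))
```

The design choice worth noting is that rules run against structured objects rather than wikitext diffs, which is what makes queries like “alert me when an object with characteristics A, B, and C is added” tractable at all.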

In the department of less remote possibilities: back in February, I described an RSS extension I’d like to see. It would allow portals to be transformed into mini-news sites linking directly to relevant Wikipedia articles. In general, the more ways we have to publish RSS automatically or semi-automatically, the better; the community will innovate around the technology.

Our separate Wikinews project remains justifiable as a differently styled, more detailed and granular view of events of the day largely irrespective of their historical significance. But I believe we should try to make the two projects interact more neatly when it comes to major events. Cross-wiki content transclusion in combination with the ever-elusive single user login might spur some interesting collaborations, particularly about components that are useful to both projects (timelines, infoboxes, and so on). Perhaps even the idea of shared fair use content is not entirely blasphemous in this context.

The increasing use of Wikipedia as a news source in its own right will only strengthen its cultural significance in ways that we have yet to understand.

Citizendium is not Free Content

Nearly a month after its public launch, Citizendium (a new wiki-like encyclopedia that positions itself against Wikipedia) still has not figured out its licensing policy. While the project has no choice but to follow Wikipedia’s GNU FDL when it imports articles from there, its own pages are still under undefined terms. Contributors are asked to wait while (someone) figures out the licensing terms: “All new articles will be available under an open content license yet to be determined.” It does not say who will make that determination and on what legal basis they even have a right to do so. Unless CZ decides to ask every contributor for permission to relicense, authors would have a very good claim to question a licensing decision they do not agree with.

I hope that Citizendium will become free content eventually, instead of adopting odious restrictions like “no commercial use” which would make subsets of it incompatible with Wikipedia and other free knowledge resources, not to mention making an awful mess of the editing process. Meanwhile, the content created by CZ contributors is completely proprietary and not legally reusable by anyone beyond its publication on the CZ website. The few high quality articles they have developed, which could potentially be merged back into Wikipedia, are non-free. I would caution anyone who contributes there to at least explicitly license their content (for example, by putting a licensing template on their user page).

A bunch of buttons

The Definition of Free Cultural Works has been officially adopted by the Wikimedia Foundation. This gives us some major visibility, and we want to make use of that visibility by encouraging people to adopt a new set of free culture buttons, designed by amymade.com. Here are some examples:

I asked Amy to make three different colors for different types of works (such as music, scientific papers, or weblogs). I’m very happy with how they turned out — if you have a CC button for a free license on any of your works, you may want to consider replacing it with one of these buttons. I hope they will be adopted within Wikimedia as well.

30 Days of Freedom

Slashdot just publicized 30 Days With Ubuntu, a review of the Ubuntu Linux distribution and its strengths and shortcomings. I found the review to be honest and accurate. It is purely technical and does not argue the ethics of free/libre software. I want to use it as an opportunity to reflect on where this movement is today, and why it matters. It’s not an in-depth analysis, but it might be interesting to some readers (and writing such things always helps me hone my beliefs, my arguments, and my rhetoric). In this article, I will refer to free/libre software as “Free Code”, which is a bit of an experiment and which I will explain at the end.

I’ve been a Kubuntu user for a year now (and a Debian user before that for 3 1/2 years), so I have plenty of opinions of my own, but I want to focus on the main criticisms identified by the reviewer:

  1. support for recent hardware is incomplete – some things don’t work or require a lot of tweaking
  2. no commercial games, except through (crappy) emulation
  3. no good video editing code, no good PhotoShop replacement
  4. 64-bit version sucks in many ways

These are remarkably few criticisms, which gives an indication of how far Linux and Ubuntu have come. I’ll try to address each of them briefly.

Hardware support

This is going to be a tricky issue unless and until we get mainstream adoption. It’s not entirely a chicken-and-egg problem, because it does not concern machines which are specifically selected or built for the purpose of running Linux (which is, of course, how the vast majority of users get Windows on their computers in the first place). I think three things mainly have to happen to make progress:

  • We need to build a really, really sexy Linux-based PC that everyone wants to have, get some large vendor to support it, and market it widely. See [[FriendlyPC]] for some ideas on this. The OLPC effort will be a good case study on what to do and what to avoid. (I’m particularly interested in whether Squeak will prove to be a viable development platform or not.) Not likely in 2 years. Maybe use the Wikipedia brand? If WP regains significant credibility, a “Wikipedia PC” could become quite the item.
  • Users of particular hardware need to be able to find each other, and pool resources (money, activism & coding ability), to make hardware support happen. If a major distributor like Ubuntu integrated such a matching tool, it could make a significant difference in the speed with which new hardware is supported. This goes for general code needs as well, of course, but as the review shows, these are fairly well met already.
  • There needs to be a single certification program that is supported by all major Linux vendors & companies. I don’t want to look for “Red Hat enabled” or “Ubuntu supported”; I want to know whether recent versions of any of these distributions will work with the hardware or not. And once the certification program has gained acceptance, you can do a bait and switch and withdraw certification from non-free vendors. :-)

Of course, an increasing market dominance of a single Linux-based platform will also make things easier. My general philosophy on competition is that it should only exist where it makes eminent sense, i.e., it should grow on the basis of irreconcilable disagreements about philosophy, direction, management, or architecture, not on the simple desire of some people to make more money than others. This ties into my belief that we need to gradually change from a profit-driven society to a cooperative one. Hence, I support rallying around the most progressive Linux distribution, which appears to be Ubuntu/Debian at this point. (That said, I would prefer the for-profit which supports Ubuntu, Canonical, to be fully owned by the non-profit, which is the model chosen by the Mozilla Foundation.)

Some users believe strongly in making existing non-free drivers as easy to obtain and use as possible. I’m not completely opposed to the idea, but we need a more creative solution than this. My suggestion is what I call “/usr/bin/indulgence”, something like Automatix, but more deeply embedded into the OS, which would make installing any non-free code (from the lowest to the highest level) a trivial procedure, but which would also ask the user very kindly to support an equivalent Free Code implementation in any of the three ways mentioned earlier: money, activism, or code.

I object to handling such matters carelessly and to the vociferous “You people are idiots for not making this as easy as possible” arguments that are sometimes made; these are short-sighted and uncreative (as is the ideological opposition to even discussing the issue).

Commercial games and emulation

The reviewer points out that Linux is not a state of the art gaming platform. I’m quite happy with that. Free Code games give me an occasional distraction, but are not of sufficient depth to be seriously addictive. (Some Battle for Wesnoth or NetHack players might object to the previous statement.) I think this is something parents should take into account, and we should communicate it as a plus when talking to offices, governments, and home users.

If you know me, you may be surprised to learn that I’m in favor of some governmental regulation of games; not on grounds of violence or sex, but on grounds of addictiveness. That is a topic for another entry, but I personally see mainstream games on Linux as a non-issue. Your Mileage May Vary. If you want more distractions, there are thousands of games from the 1980s and 1990s that will work perfectly in emulators which pretend your machine is a Super Nintendo, a Commodore Amiga, or a DOS PC. Search for “Abandonware” and “ROMs” on BitTorrent et al.

I’m more interested in how the Free Code ethos, combined with continued innovation in funding and collaboration, could result in Free Games that are technically modern, but built with more than just the interest in getting as many players as possible to pay a monthly subscription fee. Games that teach things. Games that make people do things. Games that demand and encourage creativity. Games where the builders care about the lives of the players, and not just their money. I’ll be happy to promote and help with that. Getting the latest Blizzard game to run nicely? Not so much.

Missing code

You know that Linux is ready for governments and businesses when a 30 day review points out video and photo editing as the main weaknesses, and not because there are no Free Code replacements, but because they aren’t quite good enough yet. The reviewer only tried two applications, GIMP and Kino. I share his feelings towards the GIMP photo editor, which I regard as an “old school” Free Code project where the developers would rather tell the users why their program is, in fact, highly usable than conduct serious usability tests and make improvements. To be fair, the existing GIMP user base, which is used to the current implementation, may also resist significant changes.

That is not to say that the quite remarkable GIMP functionality could not be wrapped into a nicer user interface. GIMPShop is one such attempt, which I have not tried. I hope that it will become a well-maintained fork; I don’t have much hope for GIMP itself to improve in the UI department. I am personally partial to Krita which, while still young, seems to have generally made the right implementation decisions, and is truly user-focused (as is all of KDE — I love those guys). I am not a professional photo editor, so I don’t know how mature Krita is for serious work. It is good enough for everything I do.

As for video editing, it would have been nice to read about some alternatives to Kino, such as the popular Cinelerra. As an outsider, I expect it will take a couple of years for one of these solutions to become truly competitive. Fortunately, many end users have very basic needs (cutting, titling, some effects), which the existing solutions cover, as the reviewer acknowledges.

There are, of course, countless areas which the reviewer does not touch upon. Personally, I think two important missing components for home users are high quality speech recognition (which many users will expect now that Vista is introducing it to the mainstream), and Free OCR. Here I’m not even talking about quality; the existing packages are, frankly, useless (I have scanned, OCR’d and proofread books, so I know a little bit about the topic). While setting my father up with Kubuntu, I played with the Tesseract engine, which was recently freed. It produced the best results of all the ones I’ve tested, but still did poorly compared to proprietary solutions (perhaps due to a lack of preprocessing). Frankly, without some serious financial support I doubt we’ll make much headway anytime soon. If the Wikimedia Foundation had more resources to spare, I’d advocate funding this, as it would greatly benefit Wikisource.

64 bit support

The reviewer points out a few glitches in the apps and drivers of the 64 bit version of Ubuntu. I cannot see a single argument in the article why this even matters to end users, except that “Vista has it” and that Ubuntu will have to provide good 64 bit support “to be taken seriously.” For the most part, this seems to be an issue of buzzword compliance, and I’m confident that the glitches will be worked out long before it starts to be relevant (i.e. when regular users start to operate on huge amounts of data in the course of their daily work).

Instead of chasing after an implementation detail of doubtful significance, I find the opposite direction of innovation much more fascinating: getting Linux to run well on low-end machines and embedded systems. If you want, you can run Linux on a 486 PC from 15 years ago using DeLi Linux, though you’ll have to do without some modern applications simply because of their resource demands. Many others, however, are fully up to date, and there are even well-maintained text-mode web browsers for Linux, not to mention highly capable text-based authoring environments like emacs and vim.


Where next?

Given the state of Linux today, there is really no excuse for any government not to switch to Free Code. Certainly, there are still unmet needs, but if governments collaborate, these will be addressed quickly. Which, incidentally, is also a lovely way to bring people from different cultures together. Imagine Free Code hackers from Venezuela, Norway, and Pakistan working together on the same codebase. Such scenarios are, of course, already playing out, but would be much more common if any taxpayer-funded code needed to be freely licensed.

I believe this is inevitable, and that we’re going to continue to see significant progress within governments and academia. One could compare the Free Code progress to any civil rights struggle. Some will consider this an exaggeration, but one should not underestimate how much is connected to it, especially 20 to 30 years down the line: the involvement of developing nations in the information society, the control over the media platform through which everything will be created and played, the potential for reform to capitalism itself — and more. While Free Code may not save lives in ways which are as obvious as, say, an HIV drug or a law against anti-gay violence, it is the enabling support layer of a Free Culture which is very much connected to such issues: to education and awareness, to media decentralization, to intellectual property law, and so on and so forth.

Accordingly, we can expect some governments to take reactionary positions for decades (regardless of the quality of free solutions and undeniable cost savings), while others are already well into the process of migrating. This may sound obvious, but it is a bit counterintuitive to many politically naive hackers: no matter how good a job we do, we will continue to meet resistance and opposition on all levels.

As for businesses migrating their desktops, I believe it’s going to be a slower process. There are many contributing factors here: the ignorance of decision-makers, the feeling that anything “free” is suspicious, the sustained marketing effort of proprietary vendors, the reluctance to commit to approaches which require cooperation with competitors, the short term thinking that often dominates company policies, and last but not least, the common lack of any ethical perspective. Of course, there are also more practical grounds for opposition, but in general, I think the corporate sector (in spite of the adoption of Free Code on the server) is going to be the hardest to convert. I’m supportive of helping especially small businesses make the switch, in spite of my reservations about the profit motive. (Realistically, we’re going to have to live with capitalism for some time …)

Let me explain the reason I used the idiosyncratic term “Free Code” in this article. “Free Software” is disappointingly ambiguous, and “Open Source” is morally sterile. The word “code” points to what matters most: the instructions, the recipes underlying an application, which can be remixed, shared and rearranged. And “Free Code”, like “Free Culture”, calls to mind freedom over any vague notion of “openness” of “sources”. Last but not least, it carries the ominous double meaning that Lessig pointed out: code is law. Code influences and shapes our society increasingly, the more networked it becomes. Who would you rather have in charge of writing law: a monopoly-centered oligarchy, or anyone who has the will and the ability to do so? If you realize what code truly is, the conclusions become inescapable.

“Free Code” is perhaps unlikely to catch on, but sometimes I just want to try out a phrase to see how it feels. I’ve also always found that “software” is an undeniably silly term. It’s easy to ridicule people who would care about such a thing. It’s “soft.” It’s a commodity. To call a coder a software developer industrializes and trivializes their role. Coding should be a creative process which is deeply rooted in social and political awareness. People should be proud to say: “I am not a ‘software developer’. I am a Free Coder.”

LibriVox Dramatic Recordings

I’ve been fond of the LibriVox project for some time, where volunteers contribute spoken recordings of public domain texts (see the Wikinews interview I did last year). It’s a wonderful example of what becomes possible when a work is no longer trapped by copyright law. But I only today discovered the Dramatic Works section of their catalog. Here, multiple readers divide up the lines of dramatic works like Shakespeare’s “King Lear” among themselves, and the result is edited into a single recording. The entire process is coordinated through the LibriVox forums. I love it.

Granted, the results are of varying quality, and only a handful of works have been completed so far. But the technology that enables such collaborations to happen is also still in its infancy. The very existence of high quality open source audio editing software like Audacity has already driven a great many projects (including our own Spoken Wikipedia); imagine what kind of creativity an open source audio collaboration suite could unleash.

Improvements, of course, often come in very small steps. A nice example is the Shtooka software, an open source program specifically designed for the purpose of creating pronunciation files. It is not rocket science, but according to Gerard, who has recorded hundreds of such files, it makes the process so much simpler. I wouldn’t be surprised if the folks at LibriVox come up with their own “Shtooka” solution to distributing the workload of complex dramatic recordings.

Jamendo – (More or Less) Free Music

When I discover something great on the web that is new to me but has been around for a while, I always feel this weird combination of pleasure and shock. First: “Wow, this is great!” Then: “Why did I not find out about this sooner?! Damnit, I’m supposed to know this stuff!” (Occasionally followed by: “I should blog about this!” 😉)

Jamendo is an example. It’s a multilingual web 2.0 style music-sharing community with an emphasis on copyright licenses which are at least free as in beer. Some are also free as in speech. You can browse the available music by genre and license, and download or stream the tracks you want. BitTorrent is supported, though I haven’t seen any mega-size torrents yet (it would be neat to download an entire genre). It’s got everything else you could want: RSS feeds, discussion & review boards for each album, prominent donation links, cover images …

The reason I hadn’t heard about it is that its largest community is French-speaking. But Jamendo is multilingual, so it seems only a matter of time until the other languages catch up. The site seems to be a bit buggy at times, but reloading usually does the trick for me. Now I’ve got lots of new music to explore …

WikiYouth

Are you an advocate of youth rights on the Internet?

Have you ever used or edited a wiki, such as the world-famous Wikipedia?

Then we want you to join the Wiki Youth Movement.

http://www.wikiyouth.org/

Wikis like Wikipedia allow young people everywhere to share knowledge, ideas and experiences. But wherever young people use the Internet, they are faced with reactionary and condescending views. Fearful adults try to regulate the content they can see and the communities they can contribute to. The Wiki Youth Association seeks to give young readers, users and editors a voice.

Beyond demanding equal treatment as wiki contributors, our goal is to build a shared understanding of wiki ethics. Vandalism and immature behavior are condemned on most wikis, and sensible learning approaches for new wiki users are encouraged. We want to have fun, but not at the expense of others. We want to help you to understand the maze of wiki-rules, so that you too can have fun.

We would also like to give intelligent young people a shared social space where they can talk about their experiences not only in wiki communities, but also in their daily lives. Eventually, we hope that we can develop the WYA into a true social movement which organizes events and campaigns. But our initial goals are modest: we only want to become the single largest world-wide community of young wiki users.

The WYA is not a formal organization. There is no membership other than registration for our wiki and forums.

Join today!

—James Hare, Co-founder

—Erik Möller, Advisor