Wikipedia’s core problem is not expertise, it’s self-selection

Bringing Wikipedia articles up to a quality standard we can be proud of will require more than just “stable versions” (frozen revisions that community members claim to be of a given quality standard). Take the article on Mitt Romney, one of the many people hoping to become the next president of the United States. The article describes Romney’s record as governor of Massachusetts with the following words:

Romney was sworn in as the 70th governor of Massachusetts on January 2, 2003, along with Lieutenant Governor Kerry Healey. Within one year of taking office, Romney eliminated a 3 billion dollar budget deficit. During this time he did not raise taxes or debt. He also proceeded to end his term with a 1 billion dollar surplus as well as lower taxes and a lower unemployment rate.

All this information is properly referenced and sourced to … Romney for President, Inc. Of course, the article will eventually become more sane, but this is the state it’s been in for weeks, and this is what we currently serve readers looking for information about this particular candidate. And it’s quite likely that such a revision would at least have been approved as “non-vandalized” under a stable version system.

Yet, is the answer to give up on the idea of radically open editing? The source of the problem here seems to be not so much that “anyone can edit”, but that the people who do edit are self-selected. And for many topics, self-selection leads to bias. Whether it’s Mormons writing about Mormonism, Pokemon lovers writing about Pokemon characters, or teenage Mitt Romney supporters writing about Mitt Romney, the problem shows up on thousands of topics. Sometimes different self-selected factions counter each other’s bias, but that is obviously not something one can rely on, especially when one faction wins a particular war of attrition.

Putting stronger emphasis on professional expertise will not address this problem, and indeed, one will find examples of the same self-selection bias in more expert-driven communities like Citizendium (e.g. an article on chiropractic largely written by a chiropractor). All one can hope for from self-selected experts is that their bias is more intelligently disguised. Are volunteer communities doomed to self-selection bias? Well, dealing with the problem requires first recognizing it as such. And currently recognition of the problem on Wikipedia is very limited. Indeed, suggestions of self-selection bias are usually countered with replies such as “judge the article, not the authors”, often followed by reference to the “no personal attacks” policy. Outside clear commercial interests, Wikipedians are ill-prepared to deal with their own bias.

It also seems clear that a broad recusal & disclosure policy that would extend the current “conflict of interest” guidelines would go too far. Firstly, it would simply lead to much self-selection bias being hidden from view: The editor promoting Romney’s campaign on MySpace would simply remove the reference to that MySpace page from their userpage. Secondly, biased or not, self-selected editors will often be the best-informed about a particular subject. Rather than trying to remove them from the set of editors working on a particular article, it generally seems wiser to broaden the set to include more independent voices.

I believe we need to think of this as a socio-technical problem: How do we get a large number of relatively random, but highly trusted contributors to carefully look at a particular article and scan for bias? Clearly, NPOV dispute tags aren’t sufficient: POV fighters have an interest in removing them as soon as possible, and given their sheer number, they no longer serve as sufficient motivation for the average editor. Furthermore, the articles which people choose to “fix” are again highly self-selected.

As just one possible alternative, imagine that some trusted (elected?) group of users could flag articles for “bias review”. They would specify a number of reviewers, from 10 to 100, to be randomly selected from the pool of active editors. Those people would get a note: “The article XY has been flagged for bias review. You have been randomly selected as a reviewer. Do you accept?” If the user does not accept, the review notice would automatically be propagated to another random user. In combination with stable quality versions, this could help get many independent voices to look for obvious signs of bias. One might also consider encouraging the development of article forks by separate workgroups, and letting readers decide (by discussion or vote) which one is the least biased.
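The selection-and-propagation procedure described above can be sketched in a few lines of Python. Everything here is a hypothetical illustration, not an existing MediaWiki feature: the function name, the editor pool, and the `accepts` callback (which stands in for an editor’s yes/no reply to the review notice) are all assumptions.

```python
import random

def request_bias_review(article, editor_pool, num_reviewers, accepts):
    """Randomly draw reviewers for an article flagged for bias review.

    Each candidate would receive a note like: "The article {article} has
    been flagged for bias review. You have been randomly selected as a
    reviewer. Do you accept?" A decline simply propagates the notice to
    the next randomly chosen editor, until enough accept or the pool of
    active editors is exhausted.
    """
    candidates = list(editor_pool)
    random.shuffle(candidates)      # random order, so reviewers are not self-selected
    reviewers = []
    for editor in candidates:
        if len(reviewers) == num_reviewers:
            break                   # enough reviewers have accepted
        if accepts(editor):
            reviewers.append(editor)
        # on decline, the loop moves on to the next random candidate
    return reviewers
```

With a pool of active editors and, say, `num_reviewers=10`, this yields up to ten reviewers drawn independently of who would have self-selected onto the article, which is the whole point of the proposal.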

Do you have other ideas? Whatever the solution, I do believe that we need to start thinking seriously about the problem if we want Wikipedia to be useful in any area of “contested knowledge”. And we need to start experimenting, rather than waiting endlessly for a consensus that will never come. Right now, thousands of contested articles are dominated by factions fighting POV wars of attrition. That cannot be the final answer.

5 Comments

  1. BTW: What is the status of stable versions?

  2. Torsten: Pretty much done. The second optional UI needs finishing, and the whole thing needs code review though.

    Anyways, indeed, simplistic “non-vandalized” checks would catch only half the crap we don’t want. My proposed WP:Flagged Revisions policy would require some POV/cleanup checking even for the minimal review level. This will no doubt slow things down, in that it would be harder to review a page for the first time. But given watchlist tracking of reviewed edits and quick links to diffs against the last reviewed version, reviews after the first one will just be maintenance, focusing on what changed.

    Also, the random editor selection for POV review reminds me of the “Tasks” extension or something similar by Magnus.

  3. Election is not a good model. At least, pure election is not. Election is a political model, and as such it will carry the same problems you described: a lot of competing factions. It is not hard to imagine political fights for membership in such a body.

    So, here are some notes about what kind of body we need:

    – The body should be international, cross-language, and cross-project. Fighting POV is not limited to the English Wikipedia, and many smaller communities are not able to fight POV alone.

    – People dealing with articles should have good knowledge of the subject matter of the particular article. This means that the body dealing with POV at the low level would need to consist of maybe hundreds of people. The body would also need a way to deal with many languages. (I didn’t look at the statistics, but there are maybe 50 languages with a lot of content.)

    – A couple of levels of elections may help, too: (1) At the first level, all elected board members plus maybe 10 more people (who also need to be elected) would form some kind of first-level political body. (2) This group would choose a number of people (10-20-30?) who would organize the work in particular areas. (3) Organizers shouldn’t have any power over the articles, but they should find people who know how to deal with the subject matter.

    The point is that the person dealing with articles should be well protected, and there should be a number of preventive rules to that end. For example, one organizer may choose a person to be an “executor”. However, an organizer alone shouldn’t be able to remove an executor from the position; perhaps a majority, or 80%, of organizers would be required. Likewise, only the first-level body (or, again, 80% of organizers) should be able to recall an organizer.

    In short, the logic of the model is: (1) at the top level, people choose persons they trust; (2) those persons choose good organizers they trust; (3) and those organizers choose people with good knowledge in some field.

    Of course, all of the process should be public.

    All people should have some quasi-formal attributes:

    – All of them have to be Wikimedians.

    – All people at the low level should have good knowledge in the particular field, demonstrated on the Wikimedia projects.

    On the second level, there should be a number of super-projects: natural sciences, social sciences… language-regional projects… Maybe some parts of the job could be taken on by Wikimedia (sub)committees (at the first and/or second level).

  4. Do you think simply increasing the number of editors would eventually solve the issues you raised? At some point, enough editors would self-select onto a majority of articles and counter every bias with all the other points of view.

    I agree with your conclusion that it is “wiser to broaden the set to include more independent voices”. I think getting the huge number of people who browse Wikipedia but never contribute to it to start pressing the “edit this page” link would, in the long run, be the best way to solve article quality matters.
