Categories
color coded douglas kastle reputation wikipedia

Wikipedia trust coloring demo


I love Wikipedia; it is a comfortable hole that I fall into a lot and find very hard to get back out of. In many ways, like Six Degrees of Kevin Bacon, I seem to inexorably come back to pages dealing with Star Trek or Nazi Germany. Maybe that says more about the user than about Wikipedia.

Most of the time when reading pages I don’t feel there is any reason to doubt what I am reading (I have proof-read some Star Trek articles for quality and have yet to find an error). However, there are some pages (politicians, Scientology, etc.) that you just know you cannot take on faith alone. It is a problem, and I have covered the issue of Wikipedia reputation before. It is good to know that other people are working on solving it.

The tack that they have taken is similar to the idea that I proposed (even though it is probably self-evident), which is to give a user a rating based not only on the work contributed but also on what survives later edits. As a result, the words on a page are color coded based on the user who added them and that user’s reputation. I like their color coding; it is a lot more subtle than I thought could be done, and I think it works. If nothing else it is a very good way of highlighting what is contentious and bringing eyes to bear to at least help resolve it. I can’t wait to see this deployed.
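To make the idea concrete, here is a minimal sketch of word-level coloring by author reputation. Everything in it (the 0-to-1 reputation scale, the orange tint, the function names) is my own assumption for illustration, not how the actual demo is implemented.

```python
def tint_word(word: str, reputation: float) -> str:
    """Wrap a word in an HTML span whose background tint fades as reputation rises."""
    # reputation is assumed to lie in [0, 1]; 1.0 = well-established user, 0.0 = untrusted
    alpha = 1.0 - max(0.0, min(1.0, reputation))
    return f'<span style="background: rgba(255, 165, 0, {alpha:.2f})">{word}</span>'

def render_page(words_with_authors, reputations):
    """words_with_authors: list of (word, author) pairs; reputations: dict author -> score."""
    return " ".join(tint_word(word, reputations.get(author, 0.0))
                    for word, author in words_with_authors)

if __name__ == "__main__":
    page = [("Star", "alice"), ("Trek", "alice"), ("is", "bob"), ("rubbish", "troll42")]
    reps = {"alice": 0.95, "bob": 0.7, "troll42": 0.05}
    print(render_page(page, reps))  # text from troll42 comes out heavily tinted
```

Text from unknown or low-reputation users stands out strongly, while contributions from established users fade towards plain text, which is roughly the subtle effect the demo achieves.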

Categories
douglas kastle John Seigenthaler puppy reputation wiki wikipedia

Wikipedia Reputation

It seems like every month there is a Slashdot article on the validity and trustworthiness of Wikipedia:

Wikipedia and the Politics of Verification
Is Wikipedia Failing?
A Wikipedia Without Graffiti
Long-Term Wikipedia Vandalism Exposed

It took me a while to find it again, but way back in September 2006 Tom Cross posted a very interesting article in First Monday on a possible reputation system to limit the effects of vandalism on public wikis. It was very informative then, and I think that something like this is even more relevant now:

Puppy smoothies: Improving the reliability of open, collaborative wikis

The Slashdot post is here:

Could a Reputation System Improve Wikipedia?

The gist of the research is that on each wiki page the content is color coded to mark the freshness (or age) of all the text on the page. The older the text, the longer it has survived unedited, and the greater the likelihood that it has survived many eyes looking at it.
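As a rough illustration of that freshness idea, here is a small sketch that maps the age of a piece of text to a warning colour. The age buckets and colour names are invented for the example; the paper itself does not prescribe any particular thresholds.

```python
from datetime import datetime, timezone
from typing import Optional

# Age buckets are invented for illustration only.
BUCKETS = [
    (1,   "red"),     # added within the last day: hardly anyone has looked at it yet
    (30,  "orange"),  # survived up to a month of edits and eyeballs
    (365, "yellow"),  # survived up to a year
]

def freshness_colour(last_edited: datetime, now: Optional[datetime] = None) -> str:
    """Map the age of a piece of text to a colour; long-surviving text gets no highlight.

    last_edited is assumed to be a timezone-aware datetime.
    """
    now = now or datetime.now(timezone.utc)
    age_days = (now - last_edited).days
    for max_age, colour in BUCKETS:
        if age_days <= max_age:
            return colour
    return "none"  # text that has survived more than a year of scrutiny is left unmarked
```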

If I could add at all to this excellent piece of research, it would be that there may be a way to color code relative to the user who edited the text. Imagine a rating system that measures every edit a user makes against the number of corrections made to that user’s edits. This is somewhat similar to the eBay model of buyer and seller feedback. However, instead of trying to get positive feedback for an edit (“Fast payment, super customer. Thank you.”), it is simply the survivability of an edit that counts as positive feedback, and should someone change your edit, even for spelling, then your rating should go down in proportion to the size of the changes relative to the original edit. So if someone corrects 20% of the original edit, the positive effect of that edit is only 80% of the effect of an uncorrected edit.
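As a back-of-the-envelope sketch of that scheme, and nothing more: the weighting below (credit measured in characters, scaled by the fraction of the edit that survives) is my own guess at how the idea could be wired up.

```python
def edit_credit(original_size: int, corrected_chars: int) -> float:
    """Credit for one edit: its full size if untouched, scaled down by the fraction corrected."""
    if original_size <= 0:
        return 0.0
    corrected = min(corrected_chars, original_size)
    surviving_fraction = 1.0 - corrected / original_size
    return original_size * surviving_fraction

def user_rating(edits) -> float:
    """edits: list of (original_size, corrected_chars) pairs for one user's history."""
    return sum(edit_credit(size, corrected) for size, corrected in edits)

if __name__ == "__main__":
    # A 500-character edit with 100 characters later corrected (20%) is worth 80% of full credit.
    print(edit_credit(500, 100))                 # 400.0
    # A vandal whose edits are entirely reverted accumulates no credit at all.
    print(user_rating([(200, 200), (50, 50)]))   # 0.0
```

Note that with this weighting a fully reverted edit simply earns nothing; to produce the genuinely negative ratings mentioned in the next paragraph, you would also need to subtract a penalty for edits that are reverted outright.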

This mechanism, I believe, should flag trolls and vandals much more easily: if they do nothing but contribute content that gets corrected or reverted, they will end up with a negative rating, which, if color coded accordingly, will come screaming off the screen in red.

This might have had a little effect on the John Seigenthaler case:

Wikipedia Hoax Author Confesses

as I’m guessing the hoax author wasn’t very prolific in valid, unedited contributions elsewhere on Wikipedia. Had someone gotten to the page, what turned out to be questionable content could have been flagged with a color indicating content from an immature (in terms of contributing to Wikipedia) user who had yet to earn a high rating.

Anyway, it’s just a thought.
