The recent attention on Wikipedia is good, I think, even — or especially — when it is critical. Ed Felten details a quick quality check of several articles, and he finds it mostly good, except when it’s not. I’ve done something similar with articles in areas where I could be considered an expert, and although the quality varies — some articles are extensive and excellent, while some are too short (or non-existent) and merely acceptable — I haven’t yet run into the kind of errors he presents.
But here he is, an expert on several of these topics (including, one might expect, himself), checking them for accuracy. Wouldn’t it be helpful if he could “tag” the entry, saying he had read it over and found it generally free from errors?
In other words, what is lacking in Wikipedia is a positive feedback process. The negative feedback process is well-established and works most of the time, but we cannot assume (as one recent paper seems to) that the absence of corrections or activity is evidence of either accuracy or inaccuracy.
What we need is a page that allows for something along the lines of “I am Ed Felten and I approve this article.” Of course, then you need some mechanism for making sure Ed Felten really is Ed Felten.
The alternative is to stay with Wikipedia’s current anonymous focus on the content. Then, as people visited an article, they could indicate whether they thought it contained accurate and helpful information. It may be that the anonymous vote alone would be helpful.
I may be biased, since I get paid for my assumed expertise, but I like the idea of subject experts providing a filter for the encyclopedia. Such experts need not be formally certified in the area, as long as they are correctly identified; readers could then rely on external checks of an endorser’s expertise.
You could also rely on internal checks, though I’m not entirely clear on how this would work. Imagine that Dr. Felten approved several articles that were later found to be in error. His reputation as an “approver” would then be somewhat tarnished. When viewing pages he had approved, you might note that he had gotten a few wrong, and trust his opinion slightly less.
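To make the internal-check idea concrete, here is a minimal sketch of how an approver’s standing might be tracked: a trust score that falls as endorsed articles are later found in error. Everything here — the class, the smoothed scoring rule, the numbers — is an illustrative assumption of mine, not a description of any actual Wikipedia feature.

```python
# Hypothetical sketch: an endorser's trust score degrades as articles
# they approved are later found to contain errors. The scoring rule
# (Laplace-smoothed success fraction) is one arbitrary choice among many.
from dataclasses import dataclass


@dataclass
class Approver:
    name: str
    approved: int = 0            # total articles endorsed
    later_found_wrong: int = 0   # endorsements later invalidated

    def approve(self, count: int = 1) -> None:
        self.approved += count

    def invalidate(self, count: int = 1) -> None:
        """Record that previously approved articles were found in error."""
        self.later_found_wrong += count

    def trust(self) -> float:
        # Smoothed fraction of endorsements that held up, so a brand-new
        # approver starts at 0.5 rather than an undefined or perfect score.
        return (self.approved - self.later_found_wrong + 1) / (self.approved + 2)


# Hypothetical history: ten endorsements, two later corrected.
endorser = Approver("Dr. F")
endorser.approve(10)
endorser.invalidate(2)
print(round(endorser.trust(), 2))  # → 0.75
```

A real system would need far more care — weighting by how serious the errors were, and guarding against the majority-opinion problem discussed below — but even this toy version shows how “trust his opinion slightly less” could be made mechanical.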
This starts to feel a bit like Slashdot, and it may fall into the same failings: tarnishing the reputation of those who do not agree with the majority. But given the mission of Wikipedia, that might be avoided.