Daily Archives: January 15, 2006

Try Something At The Learning Buffet

Whether for your own edification or for use in staff training, I recommend you take a look at the New Technologies Learning Buffet, created by Tom Foster (Chandler-Gilbert Community College) and Alan Levine (Maricopa Community College Learning Center). It's a great example of how a wiki can be used for training and development. I came across it after reading a post in Levine's CogDogBlog. The buffet includes loads of resources for blogs, wikis, e-portfolios, photosharing, Google Maps, and something else called "using free stuff." If you've yet to try some of these technologies, or would like to encourage colleagues to do so, you may find this a good way to explore them independently or together. As I go through the Buffet I can't help but think that more resources of this type, particularly for introducing people to library resources, would be a good thing to create. It demonstrates that a wiki is a good platform for training and for connecting individuals with information that lets them explore on their own and learn constructively.

A “Befuddling and Often Capricious Crapshoot”

This is how Rick Montgomery characterizes the conduct of peer review in his front-page article in the Kansas City Star, Fraud Proves that Science Journals Can Be Fooled (1/14/06) (temporarily freely available online).

While Montgomery’s analysis of the peer review process is limited (e.g., it focuses solely on the conduct of peer review for science journals), it is a good example of how issues in information literacy and scholarly communication instruction sometimes cross over into the mainstream. The key points in his spotlight box reflect some basics of what academic librarians have been teaching for years: 1) note the size of the study (i.e., apply an understanding of research methodology); 2) consider who paid for the research (i.e., look for potential bias); and 3) beware of claims made at scientific conferences (i.e., understand the nature of the scholarly communication and publication cycle to better appreciate the status of a claim in terms of peer review). The description of the time that goes into reviewing a manuscript will also be enlightening to those not actively involved in the process.

Count on this article (and others reflecting the current scandal over the publication of fraudulent research results) to be a useful jumping-off point for many instruction sessions to come.

Professional Prize Proliferation

Ever think there are just way too many prizes and awards being handed out in the library profession? From the top of the heap (I see Library Journal just named its “Librarian of the Year”) to, well, just name it – there’s an award for just about everything in this profession (ILL, serials, book reviewing…), created by just about every association at every level. So many awards and prizes are being dealt out to academic and other librarians that the value of awards, in this age of prize proliferation, is being brought into question.

Prompting my thinking about prize proliferation is my recent discovery of the work of James English, a University of Pennsylvania English professor who speaks with some authority on the subject. His new book, The Economy of Prestige: Prizes, Awards, and the Circulation of Cultural Value, argues that we have become a culture saturated with prizes and awards. There are over 9,000 annual awards in the film industry alone. The library profession certainly has fewer, but it does seem that every time one picks up a professional journal there are more than just a few award announcements.

While English points out a number of flaws in a world of award excess, he insists prizes do have some value. In a profession such as ours, which receives little recognition from the world at large, the proliferation of prizes may help us acknowledge that we make an important contribution – one worthy of awards. Are there too many? Should we insist on eliminating many librarian awards so that the remaining few would be truly significant, leaving no doubt as to their value and meaning? I suspect we might all agree with what English has to say on the matter: “There aren’t too many prizes until I’ve won more of them.”

Your Cheatin’ Heart

The New York Times Week in Review is awash in metaphysical questions this morning – what is truth? how do we know? The controversy over A Million Little Pieces is one of those moments in the news cycle when society seems to collectively pause to assess whether or not it’s been had. Randy Kennedy examines the public’s willingness to be entertained by “truckloads of falsehood in memoirs” and concludes that readers want redemption, not truth – and they want it packaged in confessional, reality-television mode. (Presumably the publisher anticipated as much when it suggested that the novel the author submitted be marketed as a memoir.) Memoirist Mary Karr disagrees. “Distinguishing between fiction and non- isn’t nearly the taxing endeavor some would have us believe. Sexing a chicken is way harder.” She concludes that redemption is cheap if it comes too easily; writing a memoir is an attempt “to unearth life’s truths.”

The other fraud getting the microscopic treatment in today’s paper is the breakdown of peer review in the case of Hwang Woo Suk, whose fraudulent reports of stem cell research advances resulted in his paper being withdrawn by Science. Two articles in today’s Times examine the train wreck. Nicholas Wade points out that there are two kinds of science: textbook science (material that has been validated over time) and frontier science – “wild and untamed, and often either wrong or irrelevant to future research.” He urges fellow journalists “to recognize that journals like Science and Nature do not, and cannot, publish scientific truths. They publish roughly screened scientific claims, which may or may not turn out to be true.” In the magazine, David Dobbs asks, “The scientific-publishing system does little to prevent scientific fraud. Is there a better way?” He argues for “open-source” reviewing, in which a paper is published along with the comments of its assigned reviewers and anyone else who cares to join the fray.

Open, collaborative review may seem a scary departure. But scientists may find it salutary. It stands to maintain rigor, turn review processes into productive forums and make publication less a proprietary claim to knowledge than the spark of a fruitful exchange . . . Hwang’s fabrications, as it happens, were first uncovered in Web exchanges among scientists who found his data suspicious. Might that have happened faster if such examination were built into the scientific process?

In this cynical age of spin, it’s refreshing to see the public grappling with the nature of truth and by what creative and scientific processes we arrive at it. There’s plenty of material here for an interesting lesson in information literacy.