Current Cites

Current Cites, February 2005

Edited by Roy Tennant

Contributors: Charles W. Bailey, Jr., Terry Huwe, Shirl Kennedy, Leo Robert Klein, Jim Ronningen, Roy Tennant

""The Blogosphere" (special issue) "  Communications of the ACM  47(12)(December 2004) - The idea of a systematic analysis of the blogosphere sounds like an exercise in futility - OK, we've got that manifestation isolated, wait, there are new eruptions over here and here and here - but this special issue of Communications of the ACM has several articles which do pin down aspects of blogging by measurement, experiment and anecdotal evidence. Patterns in interpersonal relationships and activity emerge over time. What is expressed in blogs, and what bloggers get out of it, is revealed by survey. An author who began blogging way back in 1999 describes the phases of change in online communities wrought by the development of easy to use blogging software. How semantic metadata could add a knowledge management layer to blogs is explored through the creation of a prototype semantic blogging demonstrator. And old concerns about the effect of filtering one's information intake are reawakened in the light of new functions used in blogspace - could it be that RSS abuse could make you really simple? The issue is an essential addition to the literature about this revolutionary phenomenon. - JR

Garvin, Peggy. "Why Google Uncle Sam?"  (13 February 2005) - Google's Uncle Sam search has evolved into the most popular search tool for the .gov and .mil domains. Garvin, author of The United States Government Internet Manual, questions this popularity, pointing out a number of deficiencies. For one thing, it doesn't include all the information that the federal government makes available online, since some sites don't have .gov or .mil domains. Also, the Uncle Sam service does not offer an advanced search form; if you click on advanced search, you'll be sent to Google's generic version. Garvin also takes a look at the federal government's own search engine. Although it, too, has some limitations, it does offer some features that Uncle Sam does not. Bottom line -- "When searching the federal government niche, follow the same recommended practice as in general searching: use more than one search engine." Also listed are two additional tools for federal government research: Department of Defense Search and Vivisimo's FirstGov cluster search. - SK
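A note for readers unfamiliar with domain-restricted searching: the effect of a niche tool like Uncle Sam can be approximated on a general engine with the standard "site:" query operator. The helper function and query strings below are purely illustrative, not anything from Garvin's article.

```python
# Minimal sketch: restricting a general web search to the .gov
# top-level domain using the "site:" operator. The function name
# and example terms are hypothetical.
def gov_query(terms: str) -> str:
    # Appending "site:gov" limits results to .gov hosts.
    return f"{terms} site:gov"

print(gov_query("census statistics"))  # census statistics site:gov
```

As Garvin notes, though, domain restriction misses government content hosted outside .gov and .mil, which is one argument for using more than one search tool.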

Marcum, Deanna B. "The Future of Cataloging"  EBSCO Leadership Seminar, Boston, 16 January 2005 - This thought piece on the future of cataloging is long on musings and short on predictions. But that isn't to denigrate it, only to clarify its role given the possible connotations of the title. Rather than coming up with solutions or predictions, Marcum ponders the proper role of cataloging in a Google age. Marcum cites the Google project to digitize much or all of the contents of a selected set of major research libraries as evidence that the world of cataloging is changing dramatically, and she briefly identifies ways in which the Library of Congress is responding to this new environment. But, Marcum cautions, "the future of cataloging is not something that the Library of Congress, or even the small library group with which we will meet, can or expects to resolve alone." She then poses some specific questions that should be considered, including how we can massively change our current MARC/AACR2 system without creating chaos. - RT

Nicholson, Scott. "A Framework for Internet Archeology: Discovering Use Patterns in Digital Library and Web-Based Information Resources"  First Monday  10(2) (7 February 2005) - Nicholson is interested in the trail of "data-based artifacts" that users leave behind when they interact with digital libraries or other Web-based information spaces. In particular he explores one discovery process called bibliomining -- a combination of data warehousing, data mining, and bibliometrics. He employs the research framework of archaeology to analyze bibliomining as a potential aid for managers of digital libraries. Using the language of archaeology to analyze the nature of the Internet is a familiar approach -- a case of borrowing language from an established field to help assess the emerging virtual spaces we are building. This approach is used so often because it enables developers to visualize the network in understandable terms. Bibliomining draws on the basic tenets of archaeological practice, that is to say, "recovery, systematic description, and study," and Nicholson suggests that it may be a new tool for digital library managers. He says that we're still "describing" the digital library, even as we build it; bibliomining may help us move beyond description, toward a sustainable culture of continuous improvement. - TH
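To make the bibliomining idea concrete: at its simplest, it means systematically describing the usage artifacts a digital library accumulates. The sketch below is a hypothetical illustration under assumed log fields, not Nicholson's actual framework or data.

```python
# A toy illustration of the "recovery, systematic description, and
# study" steps applied to usage artifacts. The log format (anonymized
# user, resource id) is an assumption for the sake of the example.
from collections import Counter

# Recovery: each record is one access event left behind by a user.
access_log = [
    ("u1", "map-archive"), ("u2", "map-archive"),
    ("u1", "oral-histories"), ("u3", "map-archive"),
]

# Systematic description: tally how often each resource is used.
use_counts = Counter(resource for _, resource in access_log)
for resource, n in use_counts.most_common():
    print(f"{resource}: {n} accesses")
```

Study, the third step, would then ask why one collection draws three times the use of another -- the kind of question Nicholson argues managers should be able to answer.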

Sanderson, Robert, Jeffrey Young, and Ralph LeVan. "SRW/U With OAI: Expected and Unexpected Synergies"  D-Lib Magazine  11(2) (February 2005) - This very interesting (but technical) piece explores synergies between the Web Services replacement for Z39.50 -- Search and Retrieve via the Web (SRW) -- and the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). SRW is a search protocol, while OAI-PMH is for retrieving specified sets of records (or all records) from a content repository. The authors demonstrate that "SRW and OAI clearly complement each other. Although the two protocols have chosen different answers to certain questions, this does not prevent them from being stacked up like building blocks into very different and interesting configurations." Highly recommended for anyone familiar with SRW or OAI. - RT
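For readers new to OAI-PMH, the protocol's simplicity is part of why it stacks so neatly with other services: a harvest is just an HTTP GET with a small set of standard parameters. The sketch below builds such a request; the repository URL is a placeholder, while "verb" and "metadataPrefix" are real OAI-PMH parameters.

```python
# Minimal sketch of forming an OAI-PMH harvest request.
# base_url is a placeholder; ListRecords and oai_dc are standard
# OAI-PMH values (harvest all records as unqualified Dublin Core).
from urllib.parse import urlencode

base_url = "http://example.org/oai"  # hypothetical repository
params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
request_url = f"{base_url}?{urlencode(params)}"
print(request_url)

# A harvester would GET this URL, then keep issuing requests with the
# resumptionToken from each response until the record set is exhausted.
```

SRW requests are similarly simple HTTP exchanges (in the SRU flavor, plain GETs), which is precisely what lets the article's authors compose the two protocols into the "building block" configurations they describe.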

Spool, Jared M. "Seven Common Usability Testing Mistakes"  UIE Roadshow Articles  (2005) - This is the kind of article that you want to give to your administrator when he or she starts wondering what usability testing can and cannot do. It briefly indicates what you can measure and what you can't, who should be involved, and the kind of follow-up you should do. I've rarely read something by Jared Spool where I didn't learn something, and this brief treatment is no exception. - LRK

Stone, Brad. "The Road Now Taken"  Newsweek (via MSNBC)  (21 February 2005) - Regardless of which Internet mapping site you prefer, the geospatial data that makes it work was provided by one of two companies -- NAVTEQ, based in Chicago, or Tele Atlas, a Netherlands firm. This article describes how these companies go about gathering the data and making sure it stays current. Meanwhile, sales of "GPS-enabled devices," including cell phones, are projected to go through the roof by 2008. Thus, there looks to be no end in sight to the demand for geospatial data. While Internet users are particularly enamored of mapping websites, many business people are downright addicted to various high-tech navigation tools. The article notes that North America, Western Europe, and Japan are fairly well "mapped" right now; future expansion is projected in Eastern Europe and Asia. - SK

Suber, Peter. "Comments on the Weakening of the NIH Public-Access Policy"  SPARC Open Access Newsletter  (82) (2005) - Since the National Institutes of Health (NIH) sponsors megabucks worth of research, it would be a big deal if all of the articles resulting from that research would be made freely available. Last July, the U.S. House Appropriations Committee made recommendations that made this a possibility (see "NIH Public-Access Policy: Frequently Asked Questions" for details). Now, after events I won't describe here (see "Congress Approves the NIH Plan"), the NIH has issued its "Policy on Enhancing Public Access to Archived Publications Resulting from NIH-Funded Research," and the news for open access advocates is mixed at best. Deposit of articles in PubMed Central is voluntary (not mandatory), and it is "strongly encouraged as soon as possible (and within twelve months of the publisher's official date of final publication)." Suber dissects the NIH plan with his usual clarity and precision, and he provides interesting background information about it, including how it compares to an earlier draft. One key point that he makes is that the policy "invites publishers who dislike the policy to voice a preference contrary to the NIH's preference," which "creates an untenable, high-risk dilemma for authors." In spite of the NIH plan's perceived downsides, Suber notes in his postscript that: "Even the watered down version of the policy will be an advance over the status quo, though a smaller advance than we had been led to expect. . . . Since the body of NIH-funded research is very large and very high in quality, even delayed free access to a subset is better than toll access to the totality." - CB

Tonkin, Emma. "Making the Case for a Wiki"  Ariadne  (42) (2005) - Wiki: "the simplest online database that could possibly work." Anyone can create Wiki pages and edit them, so a Wiki is by nature a collaborative tool (and one designed to drive control freaks off the deep end). The Wikipedia is probably the most famous Wiki. Tonkin gives the reader a brief overview of Wikis, suggests various uses, provides comparative information about major Wiki software, discusses deployment issues, and speculates about the future of Wikis. - CB
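The "simplest online database" quip is nearly literal: strip away the web layer and a Wiki is a shared page store that anyone may read or overwrite. The toy sketch below illustrates that core idea; all names in it are hypothetical, and a real Wiki engine would of course add markup rendering, page linking, and revision history.

```python
# A toy model of the Wiki core: a shared page store with no locking
# and no approval step -- any user may create or change any page.
pages = {}  # page title -> current text

def edit(title, text):
    # Creating and editing are the same operation in a Wiki.
    pages[title] = text

def view(title):
    # A missing page is an invitation to create it.
    return pages.get(title, "(this page does not exist yet -- edit it!)")

edit("HomePage", "Welcome to our wiki.")
edit("HomePage", "Welcome to our wiki. Feel free to edit this page.")
print(view("HomePage"))
```

The absence of any permission check in edit() is exactly the design choice that makes Wikis powerful collaborative tools -- and, as the review puts it, drives control freaks off the deep end.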