Current Cites

Current Cites, March 2006

Edited by Roy Tennant

Contributors: Charles W. Bailey, Jr., Leo Robert Klein, Jim Ronningen, Roy Tennant

DLF-Aquifer Services Institutional Survey Report. Washington, DC: Digital Library Federation, 9 March 2006. - This 45-page report from the Digital Library Federation (DLF) Aquifer Services Working Group summarizes responses from DLF members to a survey designed "to discover user-services assessment efforts and to assess what services are desired by end users and institutions and how the Aquifer project might potentially meet these needs." Key findings of the survey include: 1) Use of digital collections and services is often assessed at the point of introduction or update, rather than systematically over time, 2) searching is the most common way that digital collections are used, 3) metadata standardization is the most commonly reported strategy for supporting digital collections, 4) budgetary, time, and personnel constraints challenge the ability of institutions to develop needed services, and 5) institutions and users desire cross-resource discovery tools and greater ability to personalize service options. A very useful feature of this report is the list of user studies undertaken by DLF institutions, with abstracts for each. - RT

Cohen, Daniel J. "From Babel to Knowledge: Data Mining Large Digital Collections" D-Lib Magazine 12(3) (March 2006). - This is a fascinating account of how you can construct a search engine optimized for specific tasks such as finding course syllabi using simple technologies, access to such resources as Google's application program interface (API), and intelligent post-processing. A few conclusions from the author's research include: 1) more emphasis needs to be placed on creating APIs for digital collections, 2) resources that are free to use in any way, even if they are imperfect, are more valuable than those that are gated or use-restricted, even if those resources are qualitatively better, and 3) quantity may make up for a lack of quality. For explanations and justifications of these points, see the article itself; anyone building search systems should. - RT
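The "simple technologies plus post-processing" approach Cohen describes can be sketched in a few lines. This is a minimal illustration, not Cohen's actual code: the search results are stubbed out (in practice they would come from a general-purpose search API), and the cue phrases and function names are hypothetical.

```python
# Sketch of a task-specific search tool: take generic search-engine
# results and post-process them to keep only likely course syllabi.
# The candidate results below are stubbed; a real tool would fetch
# them from a search API.

SYLLABUS_CUES = ("syllabus", "course requirements", "office hours",
                 "required readings", "grading")

def looks_like_syllabus(text, min_cues=2):
    """Post-processing filter: keep a page only if it contains at
    least min_cues distinct syllabus-like phrases."""
    text = text.lower()
    return sum(cue in text for cue in SYLLABUS_CUES) >= min_cues

def filter_results(results):
    """Keep only (url, snippet) pairs whose snippet passes the filter."""
    return [(url, snippet) for url, snippet in results
            if looks_like_syllabus(snippet)]

# Stubbed search-engine output for a query like: history syllabus
candidates = [
    ("http://example.edu/hist101",
     "HIST 101 Syllabus. Office hours: MW 2-4. Grading: two exams."),
    ("http://example.com/news",
     "History was made today when the team won the championship."),
]

print(filter_results(candidates))
```

The point of the sketch is Cohen's third conclusion in miniature: even a crude filter over a large, freely accessible result set can yield a useful special-purpose finding aid.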

Harnad, Stevan. "Maximizing Research Impact through Institutional and National Open-Access Self-Archiving Mandates" (2006). - A recent study by Tom Wilson ("Institutional Open Archives: Where Are We Now?") investigates item deposit rates at most UK institutional repositories (excluding ETDs where possible). After reviewing his findings, Wilson states: "By any measure it can hardly be claimed that the concept of open archiving has taken off in British universities and I don't think that any of its protagonists would claim otherwise. The movement is at an early stage, with something in the order of 12 per cent of UK universities involved and with a minuscule proportion of the total research output covered by the IOA [Institutional Open Archives]." Little wonder then that open access proponent Stevan Harnad has come to advocate mandatory self-archiving at the institutional and national levels as a solution to low institutional repository deposit rates. (Harnad suggests that there is "a spontaneous 15% baseline rate" for institutional repository deposits.) One might imagine that researchers would resist mandatory deposit; however, Harnad notes that a 2005 study by Alma Swan and Sheridan Brown found that only 5% of researchers would refuse to do so. He further notes that in the three institutions and one department (CERN, Queensland University of Technology, the University of Minho, and the University of Southampton's Department of Electronics and Computer Science) that have mandated deposit, the strategy appears to be working. Will publishers allow self-archiving? Harnad indicates that only 7% of publishers do not. Why do it? Harnad deftly recaps the open access research impact argument.
With possible national-level deposit mandates in the works, such as the American Center for CURES Act of 2005 and the Research Councils UK's Position Statement on Access to Research Outputs, mandatory deposit is a hot topic, and Harnad's heavily linked paper provides a good summary of the pro-mandate position. - CB

Levy, Stephen, and Brad Stone. "The New Wisdom of the Web" Newsweek (April 3, 2006). - The living web, web 2.0, online community - however you refer to the phenomenon of the web as a world forum, the simplicity and timeliness of publishing whatever you want is nothing short of revolutionary. This overview article is probably most valuable for those non-participants who'll read it in paper form later this week; bloggers and others are already critiquing it online, basically treating it as just another post, one opinion among many. In fact, while its content does provide a nice sampling of current web community trends and efforts to capitalize on them, the impact of the changes described in it is really driven home when you take the article as a lesson in itself about the current state of the infosphere. It used to be that a news weekly could start a debate when it introduced to the general population a topic previously known only to a few; today, numerous forums already exist in which aspects of web community are being discussed by countless individuals (not to mention many, many more simply using the web to share comments, images, audio and video, without the theorizing). Those of us engaged in this are already learning something from 'the wisdom of crowds' about the nature of what we're doing as we do it, and don't have much use for a snapshot of the ocean when we can wade in and swim whenever we want to. - JR

St. Laurent, Simon. "The Next Web?" (March 15, 2006). - "You could always go home, Dorothy," is the underlying theme of this review of popular web technologies that haven't yet lived up to their promise. Web veteran Simon St. Laurent briefly goes over the XML Web, the Semantic Web, XHTML and Web Services, explaining that each required substantial new infrastructure to implement and for that reason "never quite made it to mainstream web development". In contrast he points to the success of Ajax where the parts, namely JavaScript and HTML, have been around for ages. "After waiting for all of those promises of better tools to come," he concludes, "it seems that developers looked at the parts they had available, and chose the ones they could use today. It can be annoyingly hard work, but the results are impressive." - LRK

Stanger, Nigel, and Graham McGregor. Hitting the Ground Running: Building New Zealand's First Publicly Available Institutional Repository. Dunedin, NZ: University of Otago, March 2006. - This paper describes the rapid implementation of an institutional repository using open source software. Although the authors earn high marks for speed out of the gate and for a promising beginning, the paper is light on details such as how the initial success will be sustained. The reader is also left to wonder how they plan to take this pilot project, built for one of the university's schools, and deploy it university-wide, if indeed they intend to do so. But those concerns aside, this can be a useful article for demonstrating how little it takes technically to get a repository launched and for achieving early success in terms of paper discovery and downloading. - RT

Suber, Peter. "Three Gathering Storms That Could Cause Collateral Damage for Open Access" SPARC Open Access Newsletter (95) (2006). - The Internet is a-changin', and those changes may make old timers long for the days when it was an obscure, purely noncommercial enterprise. In this paper, noted open access advocate Peter Suber previews three potential changes that you should be aware of: (1) the WIPO "Treaty on the Protection of Broadcasting Organizations," (2) threats to Net neutrality, and (3) fees for bulk e-mailers to circumvent major e-mail services' spam filters. These potential changes may not sound alarming, but they are harbingers of deeper changes in the fundamental nature of the Internet that may have significant long-term implications. Let's take one of them as an example: AOL and Yahoo want to charge bulk e-mailers to avoid spam filters. The implications? Here's what Suber says: "The program is insidious and would lead almost everyone to pay the fees if they could--account holders at Yahoo and AOL and the bulk mailers who send to Yahoo and AOL addresses. It would also lead other email providers to adopt similar policies or fear that they were leaving money on the table. This would harm everyone who sends or receives non-spam mass mailings. This newsletter is an example but only one of many. The program would harm every form of OA content delivered by email, from emailed eprints and listserv postings to journal current-awareness messages like tables of contents and the results of stored searches. It would hurt non-profit groups and informal communities that network by email, including academic and political groups. Cash-strapped operations relying on email for distribution would either be forced to shut down or face higher costs that threaten their stability." - CB

Wakimoto, Jina Choi, David S. Walker, and Katherine S. Dabbour. "The Myths and Realities of SFX in Academic Libraries" The Journal of Academic Librarianship 32(2) (March 2006): 127-136. - The report of a three-fold study ("end-user survey, librarian focus group interviews, and sample SFX statistics and tests") to answer these questions regarding the use and effectiveness of an OpenURL resolver (SFX from Ex Libris) in an academic setting: "How successful is the system in actually meeting the research needs of librarians and library users? Do undergraduate students, who have increasingly high expectations of online resources, think that SFX lives up to their expectations? Do librarians feel comfortable relying on SFX for accurate and consistent linking? Do the perceptions of librarians and library end-users reflect the reality of SFX usage?" Their conclusions? "Ultimately, this study showed that end-user expectations were slightly higher than their actual experiences of obtaining full text. The majority of the librarians were positive, however, reporting that SFX worked most of the time. Both groups had complaints about SFX and saw areas for improvement, but they still rely heavily on it for their research." - RT
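For readers unfamiliar with how a resolver like SFX is driven, the mechanism is simply a URL carrying citation metadata as key-value pairs, which the resolver parses to decide which full-text or other services to offer. The following is a minimal sketch in the style of OpenURL 0.1; the resolver base URL is hypothetical, and a real installation's parameters may differ.

```python
from urllib.parse import urlencode

# Hypothetical resolver endpoint; a real institution would use the
# base URL of its own SFX (or other OpenURL resolver) instance.
RESOLVER_BASE = "http://resolver.example.edu/sfx"

def make_openurl(base, **metadata):
    """Build an OpenURL 0.1-style source link from citation metadata.
    The resolver reads these key-value pairs and offers appropriate
    services (full text, ILL, catalog lookup, etc.)."""
    return base + "?" + urlencode(metadata)

# Citation metadata for the article under review here.
link = make_openurl(
    RESOLVER_BASE,
    genre="article",
    atitle="The Myths and Realities of SFX in Academic Libraries",
    title="The Journal of Academic Librarianship",
    volume="32",
    issue="2",
    date="2006",
    spage="127",
)
print(link)
```

The value of the architecture, and the reason link accuracy matters so much in the study's findings, is that the same source link works against any institution's resolver: only the base URL changes, while the metadata payload stays the same.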