Current Cites

March 2011

Edited by Roy Tennant

Contributors: Charles W. Bailey, Jr., Warren Cheetham, Alison Cody, Peter Hirtle, Leo Robert Klein, Roy Tennant

Bishop, Bradley W., Lauren H. Mandel, and Charles R. McClure. "Geographic Information Systems (GIS) in Public Library Assessment" LIBRES 21(1)(March 2011). - This paper outlines how Geographic Information Systems (GIS) were used in three different projects to assess public library services. As the authors note, while the use of GIS systems to analyse and display data associated with various library services is increasing, the publication of the methodology of such work is comparatively scarce. This article aims to bridge that gap by discussing the methodology, cost, educational and political issues for the use of GIS in library and information science (LIS), using three projects as examples. Each example illustrates the strengths and limitations of GIS systems, and the paper concludes with suggestions of how researchers and practitioners might advance the use of GIS services in LIS. - WC

Collins, Maria, and Jill E. Grogg. "Building a Better ERMS" Library Journal (15 March 2011). - This very useful overview article of electronic resources management systems (ERMS) reports on the results of two surveys: one of librarians and one of commercial and open source system vendors. The authors found librarians wanted six things from their ERMS: 1) workflow management, 2) license management, 3) statistics management, 4) administrative information storage, 5) acquisitions functionality, and 6) interoperability. They discuss each of these desires and provide an overview of both commercial and open source systems in this space, in charts that are unfortunately difficult to read online. This piece is highly recommended for any librarian shopping in this space. - RT

Crawford, Walt. "Crawford at Large: Library 2.0 Five Years Later" Online 58(3)(March-April 2011): 58-60. - 'Library 2.0', we hardly knew ye! Walt Crawford attempts to put the 'movement' into perspective five years down the road. Use of the term itself has declined, and the fact that some of the zealotry has declined along with it is something that Walt doesn't seem to regret. On the other hand, he emphasizes how Library 2.0 (or whatever you want to call it) gave librarians the tools to communicate better with their public, and while there was precedent even for this with earlier technologies, it remains a worthwhile legacy. - LRK

Holley, Rose. "Resource Sharing in Australia: Find and Get in Trove -- Making 'Getting' Better" D-Lib Magazine 17(3/4)(March/April 2011). - The National Library of Australia has done a marvellous job of developing the 'find' part of Trove, the Australian online discovery service that harvests metadata from over 1,000 Australian libraries and cultural heritage organisations. As this paper outlines, their sights are now firmly fixed on refining the 'get' component of the search process, in order to meet the needs of online searchers so used to 'getting' an end-result with search services like Google. 'Get' options improved so far include better identification of digital items, enhanced webpage layout, the ability to link through to online bookshops to purchase items, the ability to purchase copies of articles, and an interesting project that will allow searchers to seamlessly click through to authenticated content in e-journals and online databases, if their library membership allows access. A number of upcoming plans for improving the service are also discussed, which point to an interesting future for this service. The paper concludes with some challenges for contributing libraries and organisations to do more, such as deep linking between Trove and local catalogues, using online order forms, and implementing digitisation-on-demand services. - WC

Howard, Rachel. CONTENTdm Cookbook: Recipes for Metadata Entry for UofL Digital Initiatives. Louisville: University of Louisville Libraries, 2011. - Making standards and best practices work in production systems is not always straightforward. To develop metadata guidelines for their CONTENTdm system, Rachel Howard and her colleagues at the University of Louisville started with the Collaborative Digitization Program's Dublin Core Metadata Best Practices, then modified them using the CONTENTdm Metadata Working Group's Best Practices for CONTENTdm and Other OAI-PMH Compliant Repositories. Next, they adjusted these guidelines "to accommodate the capabilities, limitations, and additional field properties presented by the software" and to make the record display easier for users to understand. The result: a very detailed set of guidelines that other libraries might find useful as a model for their own CONTENTdm metadata guideline development efforts. - CB

Luther, Judy, and Maureen C. Kelly. "The Next Generation of Discovery" Library Journal (15 March 2011). - This overview article seeks to explain what "next generation" library discovery systems offer and what to consider when shopping for such a system, using criteria organized under the broad categories of content, search, fit, and cost. The authors discuss each of these categories in the light of current vendor offerings and end with "the long view," which is basically a set of questions about what the future might hold. The article includes a brief description of each of the "big four" vendor products in the unified search tool marketplace (OCLC, Serials Solutions, EBSCO, and Ex Libris) from the vendors themselves. David Rapp contributed a very brief overview of "other options" that are based on a federated search model (searching databases at the time of request and merging results on the fly). Viewing the article requires a free registration. - RT

Meyer, Carol Anne. "Distinguishing Published Scholarly Content With CrossMark" Learned Publishing 24(2)(April 2011): 87-93. (subscription required) - One of the dirty secrets of the so-called "green self-archiving open access movement" is how much a draft manuscript of an article can vary from the published "version of record." Meyer identifies retracted and corrected articles as being particularly problematic, and notes that there is no assurance that an early version of a manuscript in an institutional repository will be updated or removed if the research is later found to be flawed. Focus groups of scholars, librarians, and publishers found it difficult to identify and work with multiple versions. CrossRef, the people who brought us DOIs, propose to address the issue with a publisher-supplied mark that will indicate the publisher has accepted ongoing responsibility for the maintenance of the content. I don't know if that will solve the problem; we will have to wait and see its forthcoming implementation. Meyer's article, however, is still valuable because it describes and documents an issue that all institutional repositories face. - PH

Rapp, David. "The Future of the ILS" Library Journal (1 April 2011). - Library Journal held a roundtable discussion (around a rectangular table, but let's not quibble) at ALA Midwinter with executives from many integrated library system (ILS) vendors and expert librarians. I doubt you will find any major revelations here, but it can provide a window into the thinking of the top executives of companies active in this space. Plus, it's interesting just to read a group of knowledgeable and bright people's wide-ranging conversation about the major issues that libraries and library systems vendors are facing. - RT

Schneider, Jodi. "Beyond the PDF" Ariadne (66)(January 2011). - This article reports on the three-day workshop "Beyond the PDF" held in January at the University of California, San Diego. The workshop brought together librarians and researchers from several scientific disciplines to discuss scholarly communication, and in particular how current and emerging technologies can be harnessed to improve the dissemination of information in the sciences. The workshop itself was a combination of short presentations and working sessions focused on six areas: annotation, data, provenance, new models, writing and reviewing, and impact. The article contains short reports on the presentations and ensuing discussion, and links to the draft reports put together by the working groups formed at the workshop. - AC

Tatomir, Jennifer, and Joan C. Durrance. "Overcoming the Information Gap: Measuring the Accessibility of Library Databases to Adaptive Technology Users" Library Hi Tech 28(4)(2010): 577-594. - A must-read article for anyone involved with the delivery of web-based content, which means probably all of us. The days when institutions could build to one device or browser are long gone. Part of our new openness is being accessible to screen readers and other tools that people with disabilities use to access content. This article provides a good summary of both federal and international standards for how to do this. (Many states have their own guidelines as well.) Based on these standards, the authors developed a checklist measuring ten 'key' features, which they helpfully spend some time discussing. They then proceed to evaluate 32 library databases using the query 'nutrition and cancer'. Unfortunately, in their words, 'no database included in the study was rated as largely accessible'. - LRK

Wilkin, John P. Bibliographic Indeterminacy and the Scale of Problems and Opportunities of 'Rights' in Digital Collection Building. Washington, D.C.: Council on Library and Information Resources, February 2011. - One of the most perplexing questions raised by the Google Books project is the copyright status of books found in the nation's libraries. How many titles are in the public domain, and how many are "orphans"? Previous efforts have tried to answer the question by looking at the WorldCat database or publishing data. Wilkin conducts an analysis of the more than 5 million titles already found in the HathiTrust database. He concludes that the number of orphans in this corpus is likely to be high: "more than 800,000 US orphans and nearly 2 million non-US orphans." This has major implications for efforts to build a Digital Public Library of America. His findings would also suggest that legislative changes that have been proposed in the past are unlikely to solve the orphan works problem when one is dealing with mass digitization projects. Wilkin's analysis is based on a set of increasingly speculative assumptions that it is easy to challenge. Nevertheless, his report forms a solid basis for further discussion and refinement. Anyone interested in bibliographical analysis will find much of interest in his study. - PH