Current Cites

Volume 15, no. 9, September 2004

Edited by Roy Tennant

ISSN: 1060-2356 - http://sunsite.berkeley.edu/CurrentCites/2004/cc04.15.9.html

Contributors: Charles W. Bailey, Jr., Terry Huwe, Shirl Kennedy, Jim Ronningen, Roy Tennant

Antelman, Kristin.  "Do Open-Access Articles Have a Greater Research Impact?"  College & Research Libraries   65(5) (September 2004):  372-382. (http://eprints.rclis.org/archive/00002309/). - For those who have been working to create open access repositories of research and scholarship, this article is a godsend. Antelman performed a formal study of whether open access articles are cited more frequently than those only available through subscription services. The short answer is "yes". For the long answer, as well as to review her methodology, see the (yes) open access article. - RT

Chapman, Stephen.  "Techniques for Creating Sustainable Digital Collections"  Library Technology Reports   40(5) (Sept./Oct. 2004) - Library Technology Reports appears to be on a roll, with this excellent issue following close on the heels of Susan Gibbons's report on institutional repositories (cited in a previous issue of Current Cites). Few people are as well suited for covering this topic as Chapman, who has long experience in creating digital collections at Harvard, and has spoken on this topic for years as a faculty member of the highly regarded School for Scanning: Building Good Digital Collections. The report begins with a section on institutional readiness for digitization, followed by sections on managing digitization, levels of service for image digitization, levels of service for text digitization, managing costs, and committing to change. So if you find yourself suddenly responsible for a digitization project, as many are, your first purchase should not be a scanner, but rather this issue of LTR. Out of all the money you will spend on your project (and spend it you will), the $63 cost of this report will be the single most effective use of your resources. - RT

Dean, Katie.  "Saving the Artistic Orphans"  Wired News   (20 September 2004) (http://www.wired.com/news/culture/0,1284,64494,00.html). - "Artistic orphans," as discussed in this article, are "older books, films and music" that are "no longer commercially viable," but are kept from the public domain because they are still under copyright. Changes in the copyright law that no longer require intellectual property owners to register or renew their copyrights with the U.S. Copyright Office have made locating these owners "a formidable challenge." Brewster Kahle, founder of the Internet Archive, and Rick Prelinger, a film collector, are interested in digitizing these materials and putting them online so the public can have free access. They filed suit in March to have the changes to copyright law that prevent such materials from entering the public domain declared unconstitutional. The legal wrangling is ongoing; the government filed a motion to dismiss the case, the plaintiffs filed an opposition, and the government will file its reply in October. In late October, the U.S. District Court for the Northern District of California will hear arguments. Lawrence Lessig, the Stanford Law School professor representing Kahle and Prelinger, explains that copyright was traditionally "opt-in" -- intellectual property owners had to actively register and then renew their works. Now, from the moment a work is "fixed in a tangible medium," copyright protection exists without any need for registration or renewal. The article notes "that on average, 85 percent of copyright owners never bothered to renew their copyright after the first 28 years anyway." You can submit examples of orphan works via a website set up by Kahle and Prelinger. - SK

Elliott, Susan A. Metasearch and Usability: Toward a Seamless Interface to Library Resources   Anchorage, AK: University of Alaska, August 2004. (http://www.lib.uaa.alaska.edu/tundra/msuse1.pdf). - This paper is the result of a sabbatical leave investigation on behalf of the Consortium Library of the University of Alaska Anchorage regarding metasearch software and usability. The author visited a number of libraries that have implemented, or are in the process of implementing, metasearch applications. The strength of this paper lies not in the specifics regarding software options, which are already out of date (although for those who simply can't resist, they are available in a separate file of appendices), but in the body of the report in which Elliott succinctly outlines the problem these tools are attempting to solve, how they are trying to do it, and current issues and problems. As she identifies, things are far from perfect but these tools may at least offer libraries a way to make things more manageable for the users we serve. - RT

Ellison, Jim.  "Assessing the accessibility of fifty United States government Web pages: Using Bobby to check on Uncle Sam"  First Monday   9(7) (5 July 2004) (http://www.firstmonday.org/issues/issue9_7/ellison/). - Ellison takes a hard look at the real obstacles that people with disabilities face when using government Web sites. He reviews 50 sites using the well-known evaluation program known as Bobby, which checks HTML to evaluate how successfully the code performs in providing accessibility. While he argues that there is great potential for improved accessibility, he claims that the U.S. government has not yet met its self-imposed goals. This would tend to weaken the government's standing to enforce accessibility standards on other organizations, he concludes. - TH
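The kind of automated test Bobby runs can be illustrated with a small sketch. This is a hypothetical example of one common check (flagging images that lack the alt text accessibility guidelines require), not Bobby's actual code:

```python
# Hypothetical sketch of one check a tool like Bobby automates: flag <img>
# elements that lack the alt text required by accessibility guidelines.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the position of every <img> tag with no alt attribute."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())  # (line, column) of the offender

def check_page(html):
    checker = AltTextChecker()
    checker.feed(html)
    return checker.violations

page = '<html><body><img src="seal.gif"><img src="flag.gif" alt="US flag"></body></html>'
print(check_page(page))  # one violation: the first <img> has no alt attribute
```

A real evaluator applies dozens of such rules (form labels, table headers, color contrast) and, as Ellison notes, some criteria still require human judgment.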

Greenstein, Daniel.  "Research Libraries' Costs of Doing Business (and Strategies for Avoiding Them)"  EDUCAUSE Review   39(5) (2004):  72-73. (http://www.educause.edu/pub/er/erm04/erm04510.asp). - Materials costs continue to spiral upward. Shaped by Google and similar systems, users' expectations rise as well, and they demand that libraries provide increasingly sophisticated, easy-to-use systems. Digital formats proliferate. What's a research library to do? Based on the collaborative experiences of the University of California System, Greenstein has some suggestions for research libraries in similar situations. Rely mainly on electronic journals, but preserve at least one archival print copy of each one. Closely coordinate collection development to eliminate duplicate materials costs, and develop new bibliographic systems to support this. Centralize system support functions, such as digital preservation and tool building (e.g., online portals). Using these strategies, UC believes it can save $30-$50 million a year. Sounds like big money. Will it solve the problem? The author says: "If the money is simply eaten away by unmitigated steep increases in the price of library materials, the answer is no. Changing the unsustainable economics of scholarly publishing remains a key to the future of research libraries — indeed, to the continued ability of colleges and universities to provide faculty and researchers with the access they need to the world's scholarly knowledge." - CB

Hepburn, Gary.  "Seeking an educational commons: The promise of open source development models"  First Monday   9(8) (2 August 2004) (http://www.firstmonday.org/issues/issue9_8/hepburn/). - Hepburn matches an assessment of the potential of open source computing with the development of classroom curricula, and finds a good match. Easily available resources, flexibility and minimal intrusion of corporate culture into the classroom are all desirable side benefits of open source architecture, he argues. A central aspect of a new open source "commons" that could take root is creativity: Hepburn foresees that educators and curriculum planners will experience a noteworthy uptick in creative thinking if they cleave to an open source standard. Much of this line of reasoning is based on the hitherto-unrealized potential of the Internet to reshape the classroom. A key challenge for educators, though, is the development of both institutional and professional-level commitments to mainstreaming technology management into teaching — a process that will challenge teachers and educators for some time to come. - TH

Puglia, Steve, Jeffrey Reed, and Erin Rhodes. Technical Guidelines for Digitizing Archival Materials for Electronic Access: Creation of Production Master Files - Raster Images   Washington, DC: National Archives and Records Administration, June 2004. (http://www.archives.gov/research_room/arc/arc_info/guidelines_for_digitizing_archival_materials.html). - What the staff at NARA don't know about digitizing isn't worth knowing. And thanks to documents like this one, you too can know what they do. From recommendations on metadata capture to essential tips on scanning for maximum fidelity and information capture, this is a gold mine of best practice that can help anyone digitizing content for web access. Beginning with a section on metadata, the paper includes sections on imaging workflow, digitization specifications, storage, and quality control. The technical overview alone offers a wealth of essential information for digitization novices as well as those who may have been doing this work for some time without a thorough grounding in the technical aspects. Highly recommended for anyone digitizing content. - RT

Rowlands, Ian, Dave Nicholas, and Paul Huntingdon.  "Journal Publishing: What Do Authors Want?"  Nature Web Focus: Access to the Literature: The Debate Continues   (13 September 2004) (http://www.nature.com/nature/focus/accessdebate/31.html). - In the final analysis, scholarly journal publishing should be designed to satisfy the needs of scholars. So what do they want anyway? The authors conducted a large-scale international survey to find the answer, ending up with 3,787 fully completed questionnaires from 97 countries. Not surprisingly, they found that authors continue to want traditional journal benefits: "They want the imprimatur of quality and integrity that a peer-reviewed, high-impact title can offer, together with reasonable levels of publisher service. Above all, they want to narrowcast their ideas to a close community of like-minded researchers. . ." The majority of authors (61%) indicate that they have access to needed articles, and 77% say that access is better than five years ago. Not many have heard of open access (82% say that they know little or nothing about it), and they are not willing to pay much to publish articles (only 16% would pay more than $500). Rowlands et al. estimate that the average that authors would be willing to pay may be about $400, which is below the fees typically charged by open access publishers. Clearly, publishing reform advocates still have much work to do in educating authors about the economics of scholarly publishing and academic library finances. - CB

Shenton, Andrew K., and Pat Dixon.  "Issues Arising From Youngsters' Information Seeking Behavior"  Library & Information Science Research   26(2) (Spring 2004):  177-200. - Faced with training adults to be more careful and critical information seekers and users, it's helpful to see which patterns are imprinted in our school years. This article explores the general information-seeking patterns of school-age children in a single British town. While a larger sample (only 188 individuals here) and greater geographic variation could certainly lead to more universally applicable conclusions, for most English-speaking information providers there will be a high recognition factor of those behaviors which are clear precursors to adult habits, e.g. "the use of untaught, expedient methods was apparent in many contexts, including the 'speculative' entry of URLs to access Web sites and the location of information in books by simply flicking through the pages." No wonder Google is so popular: it excels at providing reasonable results for just such speculative input. Also instructive is the prevalence of image or pattern retention which, once achieved, encourages forgetting details like titles and addresses. A bit discouraging for teachers of information literacy, but good to know what one is up against. - JR

Twist, Jo.  "Web Tool May Banish Broken Links"  BBC News   (24 September 2004) (http://news.bbc.co.uk/1/hi/technology/3666660.stm). - The Jargon File defines link rot as "The natural decay of web links as the sites they're connected to change or die." And while it is a fact of life on the Web today, it is also a tremendous source of frustration to information professionals, scholars, and plain ordinary Web users. Well, a team of UK intern students at IBM has come up with a tool that addresses the problem of broken Web links. Although other tools exist that can detect broken links, this tool — called Peridot — also ferrets out where the missing information has gone and "replaces outdated information with other relevant documents and links." It can also detect links to "inappropriate information." Basically, the technology keeps track of key elements of webpages so it is able to quickly spot any changes. In its current version, "it runs reliably over 100,000 pages." - SK
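The idea of tracking "key elements of webpages" to spot changes can be sketched in a few lines. Peridot's actual design is not public; this hypothetical example only illustrates the fingerprinting concept, where a stored hash of a page's title, headings, and link targets lets a later crawl detect drift cheaply:

```python
# Hypothetical sketch of change detection via page fingerprinting, in the
# spirit of (but not based on) the Peridot tool described in the article.
import hashlib
import re

def fingerprint(html):
    """Hash only the elements that matter for detecting a changed or moved page."""
    title = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
    headings = re.findall(r"<h[1-3][^>]*>(.*?)</h[1-3]>", html, re.S | re.I)
    links = re.findall(r'href="([^"]+)"', html, re.I)
    key = "|".join([title.group(1) if title else ""] + headings + sorted(links))
    return hashlib.sha256(key.encode()).hexdigest()

old = '<title>Report</title><h1>Findings</h1><a href="/data">data</a>'
new = '<title>Report</title><h1>Findings</h1><a href="/data-2005">data</a>'
print(fingerprint(old) == fingerprint(old))  # True: same page, same fingerprint
print(fingerprint(old) == fingerprint(new))  # False: a moved link changes the fingerprint
```

Comparing stored fingerprints against a fresh crawl is what would let such a tool "quickly spot any changes" across many pages; the harder part, finding where missing content has gone, needs similarity search on top of this.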


Current Cites - ISSN: 1060-2356
Copyright (c) 2004 by the Regents of the University of California. All rights reserved.

Copying is permitted for noncommercial use by computerized bulletin board/conference systems, individual scholars, and libraries. Libraries are authorized to add the journal to their collections at no cost. This message must appear on copied material. All commercial use requires permission from the editor. All product names are trademarks or registered trade marks of their respective holders. Mention of a product in this publication does not necessarily imply endorsement of the product. To subscribe to the Current Cites distribution list, send the message "sub cites [your name]" to listserv@library.berkeley.edu, replacing "[your name]" with your name. To unsubscribe, send the message "unsub cites" to the same address.

Document maintained at http://sunsite.berkeley.edu/CurrentCites/2004/cc04.15.9.html by Roy Tennant.
Last update September 30, 2004.