Current Cites

Current Cites, April 2007

Edited by Roy Tennant

Contributors: Charles W. Bailey, Jr., Keri Cascio, Frank Cervone, Susan Gibbons, Leo Robert Klein, Jim Ronningen, Brian Rosenblum, Karen G. Schneider, Roy Tennant

Editor's note: With this issue we welcome five new contributors to the Current Cites team: Keri Cascio, Frank Cervone, Susan Gibbons, Brian Rosenblum, and Karen Schneider. We are delighted to be joined by such a distinguished and talented group, and after reading this issue I think you will agree that they have a lot to offer. Welcome!

Arfeuille, Erik. "New Technologies in Libraries - The End"  New Technologies in Libraries (5 April 2007) - Anyone interested in digital libraries over the past 10 years is sure to recognize the name of Erik Arfeuille. His regular compendium of articles on library-related topics, New Technologies in Libraries, was a welcome source of current awareness. It certainly gave me pointers on what to read (and recommend). Alas, in a farewell message dated 5 April 2007, he announces that his "workload" no longer allows him to produce the lists. While this is a shame, his contribution over so many years is much appreciated. - LRK

Carlson, Scott. "Are Reference Desks Dying Out?"  The Chronicle of Higher Education  53(33)(20 April 2007): A37+. - Despite the overblown title, this article explores some interesting issues regarding modern library reference service. The article begins with the example of a UC Merced librarian answering text-message reference questions from students via cellphone while thousands of miles away at a conference. "Doing things the way I'm doing them now," Carlson quotes the librarian, Ms. Michelle Jacobs, as saying, "I have reached almost twice as many students as when I sat on a reference desk." That isn't the whole story, though, and Carlson goes on to give those advocating face-to-face reference services airtime as well. The article does not come down on either side with any force, but rather leaves the reader thinking about options. This reader thinks that the real answer is not one or the other, but both, implemented in ways that maximize the benefits of each while minimizing the staffing impact. - RT

Chau, Michael, Xiao Fang, and Olivia R. Liu Sheng. "What Are People Searching on Government Web Sites?"  Communications of the ACM  50(4)(April 2007): 87-92. - Quantification from search log analysis meets some big questions of political philosophy: we don't get final answers here but are introduced to an avenue of exploration, and that's a start. The authors analyzed a log of over a million search queries at the Utah state government website. Their first conclusion gets the "at last we have the numbers to support the obvious" prize: the top categories of what people search for are different at a government website than at an all-purpose search site such as AltaVista. (Of course, queries for sex on Utah's site might reveal evidence of an interesting fetish subculture for state government porn, but I'd rather not imagine what that could look like.) We hit the big questions when the focus turns to search terms of potential interest to terrorists, and the issues around open government come into play. Is someone searching for "water system" interested in poisoning it, or looking for good news about irrigation? "Small pox" - spreading it or avoiding it? The authors can't even get close to a solution to the problem of which information might be too sensitive to remain freely available, not that we'd expect them to pass judgment on issues more appropriate for the state Supreme Court. Their effort is commendable in that it makes a good case that ignorance certainly isn't bliss and data gathering and analysis may eventually inform some very difficult debates. - JR

Fichter, Darlene. "The Age of Darwinian Design (Intranet Librarian)"  Online  31(2)(March/April 2007): 52-54. - Insightful article by Darlene Fichter on the joys of "Rapid Iterative Design". This is a method, traditionally used in the development phase of designing a website, where you go through prototypes, testing them on users, refining them when problems arise, and then testing the results until you have a complete solution. Fichter extends this procedure to websites even after they've been launched, arguing that it makes no sense to wait for the next iteration of the site for improvements to be made. In this way, she points out, library websites can mirror the "permanent beta" of successful commercial sites. - LRK

Fitzgerald, Brian F., Jessica M. Coates, and Suzanne M. Lewis, eds. Open Content Licensing: Cultivating the Creative Commons. Sydney: Sydney University Press, 2007. - This freely available e-book presents papers from the 2005 Open Content Licensing: Cultivating the Creative Commons conference in Brisbane, Australia. It includes two papers by Lawrence Lessig: "Does Copyright Have Limits? Eldred v. Ashcroft and Its Aftermath" and "The Vision for the Creative Commons: What Are We and Where Are We Headed? Free Culture." While much of the book has an Australian slant, the underlying issues raised about open content licenses, such as Creative Commons licenses, in areas such as computer games, creative industries, and government resonate worldwide. - CB

Gorman, G.E. "Google Print and the Principle of Functionality"  Online Information Review  31(2)(2007): 113-11. - G.E. Gorman obviously hasn't gotten his copy of 'The Long Tail'. In this piece, he warns against the "spurious, economically unsound views" behind Google Print's intention to digitize "everything [they] can lay their hands on". He recommends using "professional judgment" as a selection method instead. All I can say is be careful what you wish for! There already was a selection method in place that produced the original collections. None represent the universe of all publications. Furthermore, past use on the shelf is no indicator of future use once in digital form. Digitization of low-use material surely promises more than simply "clutter[ing] the web," as Gorman argues. Also, thankfully, Google Print isn't the only game in town. Its academic partners are free to pursue their own digitization schemes using methods hopefully more to Gorman's liking. - LRK

Grogg, Jill E., and Beth Ashmore. "Google Book Search Libraries and Their Digital Copies"  Searcher  15(4)(April 2007) - Entire articles have been written about the Google Book Search Library Project--how Google's doing it, why libraries are joining in, and the issue of copyright--but not much has been said about what those libraries plan to do with their copies of the digitized materials once they've been scanned. Grogg and Ashmore survey the field and study how the project fits into existing and future digital libraries at various institutions. Plans include open access to all, inclusion in OPACs and digital repositories, and archiving and preservation. Many of the libraries are still developing the infrastructure and delivery system to handle the sheer volume of materials they are receiving. Of the twelve institutions reviewed for the article, seven are sticking with scanning materials in the public domain, and five are scanning all materials regardless of copyright (at least until any court decisions have been made). Grogg and Ashmore answer the question of motivation to join when they write, "Google can offer digitization on a grand scale at a price libraries can afford." It's a bargain that's hard to turn down, even with the threat of pending litigation. - KC

McGovern, Nancy. "A Digital Decade: Where Have We Been and Where Are We Going in Digital Preservation?"  RLG DigiNews  11(1)(15 April 2007) - Nancy McGovern provides a remarkably clear assessment of developments in the digital preservation community over the past ten years, and provides a look at what is needed as we move forward. One of the most important recognitions here is that a digital preservation program built upon a "three-legged stool" (organization, technology, resources) is sturdier and more sustainable than "a technology pogo stick." Organizationally, in the last decade we have seen the emergence of the concept of the trusted digital repository (TDR), the creation of numerous policy statements, and the acknowledgment of the need for evidence-based audit and certification. Still needed is the ability to move such policies and theories into action, and the development of better digital preservation skills. On the technology leg, developments include the OAIS Reference Model, the development of numerous repository and digital library applications, and the development of various other tools to perform digital preservation tasks such as identifying file formats, normalizing data, and generating metadata. In the coming years the community will need to enhance and integrate these tools and software to help create modular, automated, and scalable workflows. The resources leg--developing an understanding of and commitment to the costs of maintaining a digital preservation program over time--is perhaps the least developed of the three legs, and there is no general community model. (TDR and OAIS provide this function for the organization and technology legs.) Various resource models have been proposed, but we need more responses to these contributions from the community, and more transparency in reporting resource usage, in order to move from "just-in-time" funding to more programmatic, sustained support for digital preservation.
The article helpfully includes numerous links to many of the resources and documents discussed. - BR

National Science Foundation. Cyberinfrastructure Vision for 21st Century Discovery. Arlington, VA: National Science Foundation, 21 March 2007. - Often libraries are overlooked when issues related to cyberinfrastructure are discussed, but this is not the case in the latest in this series of reports on cyberinfrastructure development. In five chapters, this report looks at the major issues to be addressed in the next several years, including high performance computing; data analysis and visualization; virtual organizations and distributed communities; and learning and workforce development. Throughout the document, but particularly in the chapter on data analysis and visualization, the critical role of libraries in developing the cyberinfrastructure is made clear. Not surprisingly, many of the issues discussed in the report will be familiar to those in the information professions. Perhaps the biggest (unaddressed) question in the report is how we in the information professions will take up the challenge to lead in the further development of the cyberinfrastructure, lest it be left to others. - FC

Puglia, Steve, and Erin Rhodes. "Digital Imaging - How Far Have We Come and What Still Needs to be Done?"  RLG DigiNews  11(1)(15 April 2007) - Few are as qualified as Steve Puglia to pen this history of library- and archive-based digitization efforts. Having long labored in that particular orchard for the National Archives and Records Administration, as well as served on the faculty of the highly regarded School for Scanning, Puglia has lived much of what he recounts. But this is by no means simply a history of NARA's efforts; Puglia casts a wide net over all the major players and the documents and procedures they promulgated over the years. The table of "Imaging Specifications and Guidelines" that identifies many of these is an impressive testament to the body of work produced by those active in the field. This and the other article cited in this issue of Current Cites are a fitting end and tribute to this part of RLG DigiNews history. - RT

Read, Eleanor J. "Data Services in Academic Libraries: Assessing Needs and Promoting Services"  Reference & User Services Quarterly  46(3)(Spring 2007): 61-75. - Back when data services meant a place for running mag tapes on mainframes, it was a contained specialization without wider ramifications for information providers generally. However, the explosion of networked numerical data deliverable to desktops has created challenges for technologists and public service people. Read's article can help both groups see through the haze of this data cloud to identify sources, skill sets and support networks. It springs from a data services awareness survey conducted at the University of Texas, polling faculty and graduate students in disciplines using social sciences data. One paradox is that the wider availability of datasets has not been accompanied by a greater awareness of their availability; one conclusion is that today's data service providers have outreach and instruction as major job components. - JR

Spoerri, Anselm. "What is Popular on Wikipedia and Why?"  First Monday  12(4)(April 2007) - "Google giveth, Google taketh": this paper about Wikipedia's popularity is even more pointedly an impact analysis of Google's secret sauce. Spoerri's discussion of "which pages and topics are the most popular on Wikipedia and why" uses data generated from Wikicharts to swiftly move through a discussion about what's popular on Wikipedia (which, despite Wikipedia's reputation as an "encyclopedia," turns out to be entertainment and sexuality). Spoerri then steps beyond these observations to the larger question of "what precisely drives Wikipedia's traffic and growing popularity," which is apparently a back-scratching relationship with large search engines, particularly Google. Though we can't crack open Google's black box to find out how it works, Spoerri's analysis strongly suggests that Google, recognizing Wikipedia's popularity and high trust with users, gives precedence to Wikipedia's entries so that they are likely to show up among the highly desirable top three results. Spoerri points out that Wikipedia's favored placement only increases the ferocity of competition among other websites to make the top three, or at least top ten, search results. An unspoken question underlying this article is where library-based Web resources fit into the competition for Web turf--then again, maybe we don't want to know the answer. - KGS

Stacey, Paul. "Open Educational Resources in a Global Context"  First Monday  12(4)(April 2007) - This article provides a useful overview of the state of development of open educational resource (OER) initiatives and some of the questions regarding their use and effectiveness in improving global access to education. Based on an online discussion that took place in a UNESCO-sponsored forum in November/December 2005, the author provides examples of different models of OER initiatives (MIT's OpenCourseWare, Rice University's Connexions, and Carnegie Mellon's Open Learning Initiative), explores various business models, and suggests next steps that can help OER initiatives realize their full potential. Especially interesting are the discussions on global issues such as language, the digital divide, and international cultural considerations. The author also discusses some technical issues from a user's perspective, looks at the possibility of social and community-based authoring, and points to some convergences with other "open" initiatives, such as open-source software and open access to research and scholarship. - BR

Staley, Laura, Rachel Van Noord, Betha Gutsche, et al. Blended Learning Guide. Dublin, OH: OCLC, March 2007. - This 38-page guide is an excellent overview of the present mix of learning technologies being used by a number of organizations to provide e-learning courses. Their definition of blended learning is "a combination -- or blend -- of different online learning modes, or of online and in-person learning." Summary sheets on each of these modes (e.g., Discussion Boards, Instant Messaging/Chat, Podcasting, etc.) are followed by a set of case histories about how various libraries have used blended learning techniques. Highly recommended for any individual or organization to gain a better understanding of current learning technologies and how they can be used effectively in a blended mode. Full disclosure: I was on the WebJunction Advisory Board and soon will be employed by OCLC. - RT

Van Orsdel, Lee C., and Kathleen Born. "Serial Wars"  Library Journal  (15 April 2007) - Library Journal has published its annual review of serials prices. The bottom line: "In 2007, academic libraries saw overall journal price increases just under eight percent for the second year in a row. U.S. titles rose nine percent on average; non-U.S., 7.3 percent." STM journals continued to be quite expensive, with average 2007 prices for the top three disciplines being: $3,429 for Chemistry, $2,865 for Physics, and $2,071 for Engineering. The country with the highest average price per title ($3,362) was the Netherlands. There is considerable discussion of open access issues in this article, and Peter Suber has commented: "This is an excellent picture of where OA stands today. If you have colleagues who want to know what's been happening and only have time for one article, give them this URL." - CB

Wilber, Dana J. "MyLiteracies: Understanding the Net Generation through LiveJournals and Literacy Practices"  Innovate: Journal of Online Education  3(4)(April/May 2007) - This month's issue of Innovate: Journal of Online Education focuses on the Net Generation student and how educators and educational systems could or should respond to the challenges these students pose. While there are a number of good articles, Wilber's, a summary of an ethnographic case study she conducted in Fall 2005, deserves particular note. During the course of the semester, Wilber studied the literacy and technology practices of a college student, focusing specifically on her use of the social networking and blogging site LiveJournal. She discovered an emerging set of new literacy practices that challenge the once-clear delineation between author and reader. - SG