Current Cites

March 2008

Edited by Roy Tennant

http://lists.webjunction.org/currentcites/2008/cc08.19.3.html

Contributors: Charles W. Bailey, Jr., Keri Cascio, Leo Robert Klein, Roy Tennant


Corrado, Edward A., and Kathryn A. Frederick. "Free and Open Source Options for Creating Database-Driven Subject Guides" Code4Lib Journal (2)(24 March 2008)(http://journal.code4lib.org/articles/47). - A common strategy to help library users find the information they seek is to create web pages focused on library resources in broad topic areas. The most efficient way to create and maintain such pages is by using a database. This article provides a survey of free and open source software options for creating and maintaining database-driven subject pages. Applications highlighted include SubjectsPlus, LibData, Research Guide, and Library Course Builder. Social bookmarking sites, course management systems, blogs, and wikis are also mentioned as options. - RT
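The architectural idea these packages share is simple: resources live in a database, and subject pages are queries against it. Here is a minimal sketch in Python with SQLite; the schema and names are hypothetical illustrations, not drawn from any of the packages above.

    import sqlite3

    # Hypothetical schema: resources tagged with broad subject areas.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE resource (id INTEGER PRIMARY KEY, title TEXT, url TEXT);
        CREATE TABLE subject (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE resource_subject (resource_id INTEGER, subject_id INTEGER);
    """)
    conn.execute("INSERT INTO subject VALUES (1, 'Biology')")
    conn.execute("INSERT INTO resource VALUES (1, 'PubMed', 'http://pubmed.gov')")
    conn.execute("INSERT INTO resource_subject VALUES (1, 1)")

    # One query renders a subject page; editing a database record
    # updates every page that lists that resource.
    rows = conn.execute("""
        SELECT r.title, r.url FROM resource r
        JOIN resource_subject rs ON rs.resource_id = r.id
        JOIN subject s ON s.id = rs.subject_id
        WHERE s.name = ?""", ("Biology",)).fetchall()
    for title, url in rows:
        print('<li><a href="%s">%s</a></li>' % (url, title))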

DeRidder, Jody L. "Googlizing a Digital Library" Code4Lib Journal (2)(24 March 2008)(http://journal.code4lib.org/articles/43). - This article describes how one institution dramatically increased access to its digital library materials by exposing information about these items to web crawlers. Content hidden behind database walls -- called the "deep web" because of its opacity to web crawlers -- can be exposed to them in various ways. DeRidder discusses these options and describes her institution's particular strategy. A lengthy bibliography and list of helpful links will assist those who wish to do the same. - RT
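One standard technique in this area is publishing an XML Sitemap that enumerates the database-backed item pages so crawlers can find them without navigating search forms. A minimal sketch follows, with a hypothetical URL pattern and record IDs; whether this matches DeRidder's exact strategy is not restated here.

    from xml.sax.saxutils import escape

    def sitemap(record_ids, base="http://example.edu/digilib/item?id="):
        """Build a Sitemap listing one URL per database record."""
        urls = "\n".join(
            "  <url><loc>%s</loc></url>" % escape(base + str(rid))
            for rid in record_ids)
        return ('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                + urls + "\n</urlset>")

    # Write the result where robots.txt can point crawlers to it.
    print(sitemap([101, 102, 103]))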

Freeland, Chris, Martin Kalfatovic, and Jay Paige, et al. "Geocoding LCSH in the Biodiversity Heritage Library" Code4Lib Journal (2)(24 March 2008)(http://journal.code4lib.org/articles/52). - This article is an interesting description of using Library of Congress Subject Headings (LCSH), geographic coordinates, and the Google Maps Application Programming Interface (API) to create new methods of information discovery in the Biodiversity Heritage Library. Despite the "wow" factor of using a Google Maps interface to discover items in the library, the project surfaced several problems. One is the lack of data -- not every item that refers to a geographic location has been coded as such. Also, it is not always helpful to plot something that refers to an entire continent as a point somewhere in the middle of that continent. But this is interesting work, and it demonstrates potentially useful directions for using our existing data in new ways to enhance retrieval. - RT
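The core move is mechanical: pull the geographic subdivision out of a heading, look it up in a gazetteer, and hand the coordinates to the Maps API. A rough sketch in Python; the toy gazetteer stands in for a real lookup service, and the coordinates are illustrative only.

    # Toy gazetteer; note how "South America" collapses to a single
    # point, the continent-as-a-dot problem the article describes.
    GAZETTEER = {
        "Brazil": (-10.0, -55.0),
        "South America": (-15.0, -60.0),
        "Amazon River": (-2.0, -55.0),
    }

    def geocode_headings(headings):
        """Map LCSH strings like 'Botany--Brazil' to (lat, lng) points."""
        points = []
        for heading in headings:
            place = heading.split("--")[-1].strip()
            if place in GAZETTEER:
                points.append((heading, GAZETTEER[place]))
        return points

    # Pairs like these would then be plotted via the Google Maps API.
    print(geocode_headings(["Botany--Brazil", "Natural history--South America"]))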

Gantz, John F., Christopher Chute, and Alex Manfrediz, et al. The Diverse and Exploding Digital Universe: An Updated Forecast of Worldwide Information Growth through 2011. Framingham, MA: International Data Corp., 2008. (http://www.emc.com/collateral/analyst-reports/diverse-exploding-digital-universe.pdf). - In 2007, the digital universe held 281 billion gigabytes (281 exabytes), which is about 45 gigabytes of digital information for every person on the planet. By 2011, the digital universe is projected to reach 1.8 zettabytes (1,800 exabytes), consistent with the report's finding that it expands roughly ten-fold every five years. According to the report: "the number of digital 'atoms' in the digital universe is already bigger than the number of stars in the universe. And, because the digital universe is expanding by a factor of 10 every five years, in 15 years it will surpass Avogadro's number." (Avogadro's number is 602,200,000,000,000,000,000,000, or about 6.022 x 10^23.) - CB
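The Avogadro claim checks out with quick arithmetic, if one assumes a digital "atom" means a bit; the report does not pin the unit down. A back-of-the-envelope sketch in Python:

    # Check of the report's claim, assuming one "atom" = one bit.
    EXABYTE = 10**18                       # bytes
    AVOGADRO = 6.022e23

    bits_2007 = 281 * EXABYTE * 8          # 281 exabytes as bits, ~2.2e21
    bits_in_15_years = bits_2007 * 10**3   # ten-fold growth per 5 years

    print("%.2e bits" % bits_in_15_years)  # ~2.2e24, past 6.0e23
    print(bits_in_15_years > AVOGADRO)     # True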

Gillesse, Robert, Judith Rog, and Astrid Verheusen. Alternative File Formats for Storing Master Images of Digitisation Projects. The Hague, Netherlands: Koninklijke Bibliotheek, 2008. (http://www.kb.nl/hrd/dd/dd_links_en_publicaties/publicaties/Alternative%20File%20Formats%20for%20Storing%20Masters%202%201.pdf). - This in-depth study by the Research and Development Department of the Koninklijke Bibliotheek (National Library of the Netherlands) found that the best alternatives to uncompressed TIFF files for master digital images were JPEG 2000 lossless (53% storage savings) and PNG (40% storage savings). When the master digital image is also the distribution file, JPEG 2000 lossy and JPEG with greater compression were the best formats. - CB
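Those percentages compound quickly at collection scale. A hypothetical illustration in Python; the collection size and per-image figure are invented for the example, and only the savings rates come from the study.

    # Hypothetical collection: 100,000 masters at 50 MB each as TIFF.
    MB = 10**6
    tiff_total = 100_000 * 50 * MB          # 5.0 TB of uncompressed TIFF

    jp2_lossless = tiff_total * (1 - 0.53)  # 53% savings -> ~2.35 TB
    png          = tiff_total * (1 - 0.40)  # 40% savings -> ~3.0 TB

    for name, size in [("TIFF", tiff_total),
                       ("JPEG 2000 lossless", jp2_lossless),
                       ("PNG", png)]:
        print("%-20s %6.2f TB" % (name, size / 10**12))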

Mellinger, Margaret, and Kim Griggs. "The ICAP (Interactive Course Assignment Pages) Publishing System" Code4Lib Journal (2)(24 March 2008)(http://journal.code4lib.org/articles/63). - Many academic librarians create and maintain course web pages that identify library resources useful for a particular course. In this article, the authors describe a project to develop open source software that makes it easy for librarians to create and maintain these kinds of pages with no HTML coding. The resulting ICAP Publishing System is now available for anyone to download, install, and use. Since this article is in a technical journal, the authors' software decisions are explained and code examples are included. - RT

Smith, Joan A., and Michael L. Nelson. "Site Design Impact on Robots: An Examination of Search Engine Crawler Behavior at Deep and Wide Websites" D-Lib Magazine 14(3/4)(March/April 2008)(http://www.dlib.org/dlib/march08/smith/03smith.html). - Anyone with a web site knows that a large proportion of its traffic tends to come from search engines -- particularly Google. So knowing how well these search engines crawl your site can be important if you want more people to find you. This article studies how the design of a web site influences how it is crawled. For example, "wide" sites without many levels of pages may be easier for crawlers to penetrate than "deep" sites with many levels. To find out how the Google, MSN, and Yahoo crawlers responded to these two kinds of sites, the authors set up dummy sites and watched how they were crawled for a full year, providing animations that depict how the crawls progressed. After showing that the MSN and Yahoo crawlers tended not to crawl as thoroughly as Google (falling as low as 3% coverage in the worst cases, whereas Google never fell below 99%), they conclude that "Digital library sites that want to maximize their exposure to search engine users should look to improve the crawler-friendliness of their site... site design does matter to the crawler and webmasters should consider implementing a crawler-friendly site design that includes index pages and/or a sitemap." - RT
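Measuring coverage in a study like this comes down to matching crawler user-agents in the access logs against the known set of pages. A rough sketch; the user-agent substrings are the crawlers' real 2008-era names, but the log handling is an assumed simplification and site_paths is a set of URL paths you supply.

    import re
    from collections import defaultdict

    # Substrings identifying the three crawlers in the user-agent field.
    CRAWLERS = {"Googlebot": "google", "msnbot": "msn", "Yahoo! Slurp": "yahoo"}

    # Matches the request and user-agent fields of a combined-format log line.
    LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" .* "(?P<agent>[^"]*)"$')

    def crawler_coverage(log_path, site_paths):
        """Return the fraction of site_paths each crawler requested."""
        seen = defaultdict(set)
        with open(log_path) as log:
            for line in log:
                m = LINE.search(line)
                if not m:
                    continue
                for marker, name in CRAWLERS.items():
                    if marker in m.group("agent"):
                        seen[name].add(m.group("path"))
        return {name: len(paths & site_paths) / len(site_paths)
                for name, paths in seen.items()}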

Stormont, Sam. "Looking to Connect: Technical Challenges that Impede the Growth of Virtual Reference" Reference & User Services Quarterly 47(2)(Winter 2007): 114-119. (http://rusq.org/2008/01/06/looking-to-connect-technical-challenges-that-impede-the-growth-of-virtual-reference-2/). - Sam Stormont, co-author of Starting and Operating Live Virtual Reference Services, is the guest columnist for the Accidental Technologist column in the most recent issue of Reference & User Services Quarterly. Although virtual reference services have been around in one form or another for over twenty years, libraries are still finding that patron usage is lower than expected. If millions of teenagers use instant messaging every day, why aren't they knocking down our virtual door at the reference desk? Stormont believes that overly complicated virtual reference interfaces may be part of the problem. Many systems with co-browsing features are unreliable, since every user's workstation is set up differently. Convenience is key with our audience, and expanding our options to include standard instant messaging software might be the answer to our popularity problem. - KC

Wisniewski, Jeff. "The New Rules of Web Design" Online 32(2)(March-April 2008). - More on the "Simplicity is Dead" movement, this time from the Web Services Librarian at the University of Pittsburgh. The focus is on how web design has changed over the years, how screens have become bigger, and how the requirements of users have grown. A simple Google-like interface isn't enough because it does only one thing, while library websites must do many. Meeting the needs and expectations of current users requires more than utilitarian values like usability and accessibility. There must also be "visceral attributes" such as "desirability, usefulness, and value." While the author calls these "new rules of design," I think some of them have been around for a while. Probably the best lesson we can draw from the mounting volume of evidence in this field is not to go too far in one direction or the other -- neither too sour nor too sweet. - LRK