Sandra K. Roe, Editor
Daniel Lovins, Cataloging News Editor
Designating Materials: From "Germane Terms" to Element Types
Jean Weihs and Lynne C. Howarth
ABSTRACT: While directions for the use of "germane terms for the physical medium of the work" appeared in the 1964 Rules for Descriptive Cataloging in the Library of Congress: Phonorecords, most libraries choosing to integrate nonbook materials into their collections either colour-coded their catalogue cards or added two-digit media codes to call numbers. The first formalized list of "general material designations" (gmds), placed immediately following the title proper as an early warning device, was published in 1978 in the second edition of the Anglo-American Cataloguing Rules. Since their introduction, gmds have been controversial, as this investigation of the evolution of material designations explores.
KEYWORDS: Anglo-American Cataloguing Rules; AACR; Resource Description and Access; RDA; material designations; general material designation; gmd; specific material designation; smd; cataloguing code revision; non-book materials cataloguing; audiovisual materials cataloguing
Education for Cataloguing and Classification at the Department of Information Sciences in Osijek, Croatia
ABSTRACT: In 2005 Croatian higher education curricula underwent a significant reform in order to comply with requirements of the Bologna Process. This paper examines the ways that reform affected cataloguing instruction at the Department of Library and Information Sciences, Faculty of Philosophy, Josip Juraj Strossmayer University of Osijek, Croatia.
KEYWORDS: Cataloging education, cataloguing curriculum, library and information science education, Department of Library and Information Sciences, Faculty of Philosophy, Josip Juraj Strossmayer University of Osijek, Croatia
A Survey of Cataloger Perspectives on Practicum Experiences
Ione T. Damasco and Melanie J. McGurr
ABSTRACT: The issue of integrating both theory and practice into the graduate cataloging curriculum has been widely recognized as a long-standing obstacle for educators and practitioners alike. One way students can gain practical cataloging experience is through a practicum. In order to gauge cataloger attitudes about practica, an online survey was distributed to entry-level catalogers with less than ten years of experience and who were currently employed at an Association of Research Libraries (ARL) member institution. Although the experiences ranged widely, the majority of respondents felt the practicum was a valuable experience that should be formally required within the library science curriculum.
KEYWORDS: Cataloging curriculum, practica, catalogers, cataloging education, Association of Research Libraries, survey
The Use of Reading Levels as Alternative Classification in School Libraries
Cynthia R. Houston
ABSTRACT: A relatively new phenomenon in school libraries is the organization of books by the reading levels associated with reading incentive programs such as Accelerated Reader and Reading Counts. There is a body of research on the effects of reading incentive programs on variables related to reading ability and motivation but currently there is no research on the characteristics of the school libraries that use reading levels as an alternative classification scheme for shelf arrangement. This article reports the results of a survey of P-12 media specialists in Kentucky who identified their libraries as arranged by reading level. Analysis of results provides information on the demographic characteristics of these libraries and media specialists' rationale for organizing their library collections in this way.
KEYWORDS: School libraries, reading incentive programs, Accelerated Reader, alternative classification, reading level, shelf arrangement
The Changing Landscape of Contemporary Cataloging
Sue Ann Gardner
ABSTRACT: Intended to contribute to the current dialogue about how the emerging information environment is impacting cataloging issues, this survey paper covers a broad range of topics, such as how search engines compare with integrated library systems, and includes some thoughts on how cataloging processes may evolve to continue to remain relevant. The author suggests that there is a need for significant changes in integrated library system interfaces and infrastructures as well as some changes in cataloging practice. The value of descriptive vs. non-descriptive elements in the catalog record and some pros and cons of the MARC format are covered.
KEYWORDS: Future of cataloging, online catalogs
Like so many others, this issue of Cataloging & Classification Quarterly includes articles on a diverse group of topics. Following their earlier article, "Making the Link: AACR to RDA: Part 1: Setting the Stage," Weihs and Howarth have written a benchmark article for future writers on what we currently call the ‘general material designation.’ Their article traces the evolution of this designation back to the advent of integrated catalogs and concludes with RDA’s treatment of the same concept.
Two articles address aspects of cataloging education. Petr summarizes the library and information science programs throughout Croatia and, more specifically, describes the changes that occurred in the cataloging curriculum at the Department of Library and Information Sciences at Josip Juraj Strossmayer University in Osijek. Damasco and McGurr report the findings of their survey related to cataloging practica.
The next article takes up school libraries that organize their collections by reading levels corresponding to reading incentive programs. Houston surveyed media specialists throughout Kentucky. The results include the demographic characteristics of the libraries and the rationale for adopting this method of shelf arrangement rather than the Dewey Decimal Classification or another traditional option.
Gardner summarizes issues facing cataloging in this time of great change. The news column continues Gardner’s theme, with sections on dis-integration, the Library of Congress Working Group on the Future of Bibliographic Control, RDA, OCLC 2.0, and more.
Sandra K. Roe
Daniel Lovins, News Editor
Welcome to the news column. Its purpose is to disseminate information on any aspect of cataloging and classification that may be of interest to the cataloging community. This column is not just intended for news items, but serves to document discussions of interest as well as news concerning you, your research efforts, and your organization. If you have any pertinent materials, notes, minutes, or reports, please contact Daniel Lovins (email:; phone: 203-432-1707). News columns will typically be available prior to publication in print from the CCQ website at .
We would appreciate receiving items having to do with:
Research and Opinion
A. Learning to Live with Dis-Integration (a Meditation on Weinberger’s New Book)
The exchange of ideas within and across human minds hardens into something called "knowledge" only with the application of great force. This change in material state happens first as authors channel their creative energy into the writing of books; and then a second time as librarians transform miscellaneous piles of books into classified collections.
Until recently, our classification systems, like the books they organize, have been fairly rigid in design. Books came out in new editions and class notations changed, but in general there had been a great deal of stability to the way knowledge is represented in our libraries. Selectors can purchase multiple copies of a book, of course, and, with the invention of portable surrogates, i.e., catalog cards, a much higher degree of flexibility was achieved. But replication of surrogates costs a lot of money, and maintaining multiple arrangements of surrogates (as in having an Author/Title catalog sit beside a Subject catalog) costs even more. It has long been understood and accepted by librarians and readers alike, therefore, that there would be a limit to the number of ways one could extract knowledge from collections.
In his new book, Everything is Miscellaneous (2007), David Weinberger documents the extent to which—in the digital age—these assumptions are unraveling. Invoking the metaphor of a single leaf hanging from multiple branches, Weinberger makes the case that knowledge repositories can now support a virtually unlimited number of pathways into their collections. Digital objects, then, resemble ideas as they naturally occur ‘in the wild’—that is, as in spontaneous thought and conversation—as more of a web of relationships than as a hierarchy of facts.
Weinberger recalls Arthur Lovejoy’s study of the "Great Chain of Being," i.e., the medieval view that reality is a kind of upside-down tree, with God as the root node on top, celestial spheres, human beings, and animals forming links somewhat further down the chain, and inanimate, unconscious matter serving as terminal nodes at the bottom. Even after Galileo, Darwin, et al., poked fatal holes in this view of the world, the model continued to operate in a modified form if for no other reason than that people still needed a non-arbitrary way to organize things.
Melvil Dewey’s classification system exemplified the analog-era need to have single leaves or nodes (i.e., concepts) subsumed under single branches (i.e., the categories to which they belong). In Dewey’s classification, all leaves and branches could be traced back to ten basic categories. This approach was not only reasonable, but in fact extremely useful. By using decimal numbers, he created an infinitely extensible knowledge map; new topics could always be squeezed in between pre-existing ones simply by tapping unused numerals or supplying additional decimal places.
The system is encyclopedic in scope, but also necessarily biased toward a particular place, time, and social class. That is to say, it represented the world as it appeared to a well-educated, Protestant, liberal-arts graduate of 1870s New England (that is to say, Melvil Dewey at Amherst College). The system has built-in flexibility, and has undergone regular revision for more than a hundred years, but the broad Dewey classes cannot change too much without undermining their foundations.
Even in our new digital world, though, this relative inflexibility is not necessarily a bad thing. It is just limited to particular communities of practice. In Dewey’s case, he did a superb job helping the largely Christian, American readers of public, school, and college libraries discover and retrieve the materials they wanted and needed. But this does not mean other groups should need or want to use the same distinctions and hierarchies for their own collections.
Essential Qualities versus Family Resemblances
As Socrates explains in Plato’s Phaedrus, there are natural ‘joints’ in reality just as there are in an animal carcass. A philosopher, like a butcher, can use these joints to carve up the material with relative ease. Weinberger acknowledges that such joints really do exist, but, like Eleanor Rosch and Ludwig Wittgenstein before him, rejects the ideology of essentialism. That is to say, he rejects the idea that an object derives its conceptual definition and temporal persistence from a kind of infinitely contracting point at the center. Rather, like Wittgenstein, Weinberger sees "family resemblances" among objects and ideas, that is, clusters of animals, objects, concepts, etc., that form around shared qualities or traits, and which can change depending on whatever trait is emphasized at a given moment. They persist only so long as it is socially or environmentally useful for them to do so. For example, "Male vs. Female" is a real joint in nature, but then there are hermaphrodite babies to whom we give hormone therapy to nudge them one way or the other. Parents and physicians must decide which sex to declare and how to raise the child.
Moreover, the way in which concepts or identities are teased apart invariably shapes how we perceive them. Consider the fact that until recently, most clinical trials were done on men, on the assumption that results could later be extended to women. Similarly, the difference between Christianity and Islam forms a real joint in the world of ideas, but Dewey’s classification made the cut in a way that privileged Christianity and marginalized Islam. The joint is there for all to see, but severing the joint and separating the pieces involved subjective judgment.
Similarly, the Library of Congress Subject Headings (LCSH) used to designate the object of Christian worship simply as "God," while the objects of Muslim and Jewish worship were labeled "God (Islam)" and "God (Judaism)," respectively. The addition of the qualifier "(Christianity)" in 2006 to the first case removed some of the bias from LCSH, but the ideological skew runs at a much deeper level and would be impossible to eradicate completely. This has been evident to those involved in RDA negotiations, where the effort to produce a universal cataloging code easily runs aground on questions of nomenclature (e.g., whether to call a certain sacred scripture "Bible" or "Old Testament" or "Tanakh," or by yet some other name).
In any event, in a networked digital world, it may no longer be necessary to impose a monolithic classification scheme on everything, since, as Weinberger points out, a single conceptual node (or "leaf") can easily hang from multiple branches. For example, "Jewelry" can be subsumed under "Decorative arts," "Costumes," "Accessories," and "Merchandise" simultaneously. Moreover, it used to be the case that the gatekeepers of knowledge (e.g., librarians) got to control the means of discovery. In a digitally-networked socially-tagged world, however, this is no longer strictly the case.
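Weinberger's "single leaf on multiple branches" can be sketched as a simple data structure. The following is a minimal illustration in Python (the categories beyond those named in the text are invented for the example), not anything drawn from Weinberger's book:

```python
from collections import defaultdict

# Polyhierarchy: each "leaf" concept may hang from several "branches."
# "Jewelry" and its branches come from the example above; "Watches" and
# "Horology" are invented here to round out the illustration.
concept_branches = {
    "Jewelry": {"Decorative arts", "Costumes", "Accessories", "Merchandise"},
    "Watches": {"Accessories", "Merchandise", "Horology"},
}

# Invert the mapping so that browsing any branch retrieves its leaves.
branch_leaves = defaultdict(set)
for concept, branches in concept_branches.items():
    for branch in branches:
        branch_leaves[branch].add(concept)

# The same leaf is discoverable under every one of its branches.
print(sorted(branch_leaves["Merchandise"]))  # ['Jewelry', 'Watches']
print("Jewelry" in branch_leaves["Decorative arts"])  # True
```

The point of the inversion step is that no single arrangement is privileged: the same underlying assignments yield as many browsable hierarchies as there are branches.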
All Ontologies are Political
A formal classification system presupposes a person or corporate body that gets to design and maintain it. The system can then be used to impose fixed identities on a target population, sometimes with unhappy results. In addition to library classifications, Weinberger cites the Periodic Table of Elements, the Linnaean classification of biology, the psychiatric Diagnostic and Statistical Manual (DSM), and the racial categories employed by the U.S. Census Bureau. All are extremely useful to their respective disciplines, and yet all obscure the fundamentally miscellaneous nature of what they describe, i.e., elements and attributes that may be arranged in multiple hierarchies of meaning (if at all) and reconfigured depending on the task at hand.
For example, the U.S. Office of Management and Budget issued a directive in October 1997 to allow Census respondents to indicate more than one racial identity for the first time. Virtually overnight, the number of races identified in the United States went from 5 to 126. And if this weren’t confusing enough, the American Anthropological Association had recently proclaimed that race "has no scientific justification in human biology."
Then there’s the case of the DSM, used by psychiatrists and psychologists to classify patients and justify payments from insurance companies. Judging people as pathological according to an ostensibly scientific classification system—and then forcing them, say, to undergo corrective therapy—is a dangerous business. Weinberger cites a paper given by gay activist Ronald Gold at the American Psychiatric Association’s 1973 convention entitled "Stop. You’re Making me Sick!" The speech may have helped produce the desired effect, since DSM ceased to classify homosexuality as a mental disorder in 1973.
Another problem with centrally-planned classification is that, to some at least, it creates the impression that knowledge is something to be encountered passively through books or lectures, rather than creatively and intersubjectively through scientific observation and the exchange of ideas. Formal classification presents knowledge as a pre-coordinated grid of facts, with each topic neatly bundled and stowed away. Weinberger views Wikipedia, therefore, as a kind of antidote, since the open source encyclopedia portrays knowledge more as a negotiation among readers than as a set of facts and rules imposed from above (which is how some Wikipedians portray the Britannica). Moreover, he paradoxically presents the very unreliability of Wikipedia as a source of its strength. By allowing authors to post false information, and then allowing readers to flag it as false, the community enjoys the benefit of being reminded that all knowledge is tentative and probabilistic, and that each reader has to take responsibility for judging the knowledge claims of others.
Weinberger’s thesis is not so much an attack on the idea of real differences in the world, as it is on the ideology of essentialism (and perhaps also on ideology as a mental habit). By treating a provisional arrangement of terms as though it were a fixed and permanent mirror of nature, there is risk of collateral damage to whatever or whomever gets classified within it. Sometimes the consequences are mild, as when Pluto was demoted to the status of a "dwarf" planet; sometimes they are severe, as when one receives the stigma of a psychiatric diagnosis.
Still, I wonder if Weinberger underestimates the importance of centrally-planned classification schemes. On sites like Delicious, Technorati, and Flickr, for example, there is a growing problem of users having to make sense of a morass of ambiguous tags.1 Better thesaural control would improve the discovery experience and extend the shelf-life of tags, which otherwise tend to "rot," i.e., lose semantic power and fall into disuse.
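The kind of thesaural control at issue can be sketched as a small synonym ring that maps variant tags to one preferred term, so that a search under any variant retrieves the same items. This is a toy illustration; the tag names and items are invented, not drawn from any real folksonomy:

```python
# Synonym ring: variant user tags normalize to one preferred term.
synonym_ring = {
    "nyc": "New York City",
    "new_york": "New York City",
    "bigapple": "New York City",
}

def normalize(tag: str) -> str:
    """Return the preferred form of a tag, or the tag itself if unmapped."""
    return synonym_ring.get(tag.lower(), tag)

# Hypothetical tagged items.
tagged_photos = {"photo1": ["nyc"], "photo2": ["bigapple"], "photo3": ["paris"]}

def search(term: str) -> list:
    """Find items whose normalized tags match the normalized query."""
    target = normalize(term)
    return [item for item, tags in tagged_photos.items()
            if target in (normalize(t) for t in tags)]

print(search("new_york"))  # ['photo1', 'photo2']
```

Without the ring, "nyc" and "bigapple" would be invisible to each other; with it, the variants reinforce rather than fragment discovery, which is roughly what a controlled thesaurus adds to free tagging.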
Similarly, I wonder if Weinberger fully appreciates the social costs of anarchy in the marketplace of ideas. As Robert Fisk points out in The Independent (4/21/07), lives and reputations have been ruined by disinformation spread through Wikipedia and other open-source resources. For example, the highly-respected University of Minnesota professor Taner Akcam is no longer able to cross international borders because of disinformation planted in his Wikipedia biography. Fisk writes that Akcam has become "an inmate of the internet hate machine, the circle of hell in which any political filth or personal libel can be hurled at the innocent without any recourse to the law, to libel lawyers or to common decency." Also troubling is the effect unfiltered Web-based publishing—a kind of global vanity press—has on education and society. The title of Andrew Keen’s new book expresses it nicely: The Cult of the Amateur: How Today’s Internet Is Killing Our Culture. It would seem librarians still have an important role to play in guiding readers to vetted, authoritative information.
B. The Future of Bibliographic Control
While the Internet has become a kind of epistemological Wild West, a semblance of order can still be found within our catalogs. This should not be taken for granted, though. Ever since Karen Calhoun released her LC-commissioned report2 in March 2006, many within our community have feared that our de facto national library would endorse her recommendation to abandon precoordinated assignment of LCSH.3 One could almost hear a collective sigh of relief, then, as LC announced on June 13, 2007, that it would look for ways to reduce costs, simplify application, and expand use beyond the library catalog, but that it would not stop assigning precoordinated LCSH strings to bibliographic records.4
Meanwhile, the Library of Congress Working Group on the Future of Bibliographic Control is close to finishing its work. Partly in response to a crisis following LC’s decision to abandon series authority control, Deanna Marcum organized a group of experts from universities, libraries, and industry, in an effort to reach consensus on bibliographic best practices. The group’s charge has been to: (1.) Present findings on how bibliographic control and other descriptive practices can effectively support management of and access to library materials in the evolving information and technology environment; (2.) Recommend ways in which the library community can collectively move toward achieving this vision; and (3.) Advise the Library of Congress on its role and priorities.5
Three themed sessions have taken place: "Uses and Users," "Structures and Standards," and "Economics and Organization." The final reports and recommendations are due on November 1, 2007.
By all accounts, the discussions have been stimulating and rewarding, and occasionally controversial. At the second session, for example, David Bade presented his paper "Structures, Standards, and the People who make them Meaningful," which generated some heated exchanges on Autocat. Bade suggests that terms like "metadata harvesting," "data mining," and "information silos," as found in the Working Group background documents, betray an industrial production model of librarianship. Conspicuously absent, he thought, is the insight that without the interpretation, evaluation, and context traditionally provided by human catalogers, metadata are meaningless. "There is no information, no meaning ‘out there’ to be found, managed, mined, merged, manipulated or processed," he wrote, "unless it has been created by someone for some purpose in a form which that creator deemed adequate for the intended users and uses."
This echoes some of the concerns Philip Agre expressed in his 1995 article, "Institutional Circuitry: Thinking About the Forms and Uses of Information." Agre suggested that librarians, in their effort to be neutral and dispassionate, tend to view information as a commodity. A measure of detachment is necessary to professional cataloging, since the mission is to help readers obtain their heart’s desire, regardless of what the librarians might have chosen for themselves. Catalogers, for their part, provide an artificial language of descriptive terms, allowing different (perhaps even incommensurable) schools of thought, academic disciplines, linguistic groups, etc., to communicate with one another on equal footing. This has been especially important in universities, where increasingly interdisciplinary research compels scholars to converse and collaborate in less-familiar disciplines and languages.
Of course, despite our best efforts, there is always bias in our profession (see Weinberger section, above). In order to minimize the effect, though, we strive for objectivity and standardized descriptions. This, in turn, can make it seem like we are forcing the literatures of heterogeneous groups into the proverbial Procrustean bed. Librarians aim for neutrality in their work, and intellectual growth, discovery, and creativity for their readers, but the human condition being what it is, conflicts, bias, and misunderstandings always arise.
Bade’s criticisms notwithstanding, automation of certain processes seems necessary to improve efficiency. We now have an historic opportunity to reduce duplicative efforts across libraries, museums, and other agencies, and re-focus human intelligence on those parts of our collections which are unique or in which we have special expertise. We also have a chance to become much more interoperable with other information services on the Web.
As Grant Campbell has pointed out, for example, a library catalog built according to W3C "recommendations" could easily share metadata with sites like the Internet Movie Database (IMDB). This would eliminate the need for creating film descriptions from scratch (a very time-consuming endeavor), and take advantage of the self-organizing community of enthusiasts that is already out there. Similarly, by allowing faculty and other community experts to bookmark, annotate, and recommend items to their colleagues and students, we effectively harness the expertise that already exists on our local campuses. This also provides an incentive for readers to continue using a local catalog that they are now helping to build, rather than jumping ship to commercial search engines or online bookstores.
C. RDA and the Semantic Web
It is precisely this pressure to find new efficiencies, adopt best digital practices, and reduce parochialism among networked libraries that informs the forthcoming Resource Description and Access (RDA) cataloging code. RDA editors and reviewers have struggled mightily since 2004 to strike a balance between this pressure and the importance of backward-compatibility with AACR2. There has been a good deal of disagreement on how to proceed, as traditionalists fear abandoning print-based collections, administrators fear the costs of yet another retrospective conversion, and systems people fear falling short of World Wide Web Consortium (W3C) recommendations. Not surprisingly, attempts at compromise have managed to frustrate all parties and create a feeling of paralysis.
On April 30th, 2007, however, a meeting took place in London that included representatives from RDA, Dublin Core, W3C, Google, Microsoft, and others, and that participants have described as a major breakthrough. Among other developments, a landmark agreement was reached between RDA and Dublin Core Metadata Initiative (DCMI) communities to co-develop a formal application profile for RDA. The new profile will be built on five pillars: the DCMI Abstract Model, Functional Requirements for Bibliographic Records (FRBR), Functional Requirements for Authority Data (FRAD), Resource Description Framework (RDF), and Simple Knowledge Organization Scheme (SKOS).
It is unclear whether the text of RDA will change much as a result, other than to become more rigorous and consistent in the application of terms. But the new agreement should help developers tease out implicit RDA vocabularies into separate Web-addressable resources, make it easier for RDA concepts to be used outside of libraries, and enable programmers to design applications that incorporate RDA concepts. The element set, its structure, the guidelines (‘rules’), and the encoding format will be disentangled from one another. As Diane Hillmann has pointed out, the library community will benefit by adopting semantic web technologies, and the semantic web community will benefit from access to high-quality repurposable metadata. Moreover, because much of the new functionality will take place outside the cataloging code, relatively little restructuring (and therefore retrospective conversion) should be required for backward compatibility.
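What a "Web-addressable vocabulary term" amounts to can be sketched concretely. The record below is a hedged, hypothetical illustration in plain Python, using SKOS-style keys; the URI, labels, and definition are invented for the example and are not actual RDA vocabulary data:

```python
# A hypothetical vocabulary term teased out into a standalone resource:
# the concept gets a URI of its own, plus SKOS-style labels, instead of
# living implicitly inside the prose of the cataloging code.
term = {
    "uri": "http://example.org/rda/carrierType/volume",       # invented URI
    "skos:prefLabel": {"en": "volume"},                        # invented label
    "skos:broader": "http://example.org/rda/carrierType",      # invented URI
    "skos:definition": {"en": "A carrier consisting of leaves bound together."},
}

# Because the term is addressed by URI rather than buried in the text,
# any application (library or otherwise) can reference the same concept.
print(term["skos:prefLabel"]["en"])  # volume
```

This separation of the element set from the guidelines and the encoding is precisely what lets non-library applications reuse the concepts without adopting the cataloging rules wholesale.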
D. OCLC 2.0
Closely related to the creation of a new kind of cataloging code is the engineering of a new kind of catalog. This is sometimes referred to as the "Next Generation Catalog," "OPAC 2.0," or occasionally the "Dis-Integrated Library System." Whatever one decides to call it, the race is heating up to develop the best one possible. Major contenders include Endeca’s Information Access Platform (usually referred to as simply "Endeca"), Medialab’s Aquabrowser, Ex Libris’s Primo, and Innovative Interfaces’ Encore, all claiming to provide "faceted navigation," intelligent relevancy ranking, spelling corrections, social tagging, FRBRized results, peer-to-peer recommendations, and multiple fulfillment options (i.e., one-step paging from stacks, off-site shelving, on-demand digitization, ILL, Amazon.com, or through some other means).
My intention is not to provide free advertising, but it is worth pointing out the extent to which OCLC has been innovating in this area. Roy Tennant—who for years railed against commercial vendors peddling "lipstick on a pig" (i.e., basically ugly software with superficial enhancements)—has been hired as a senior program officer. At the same time, OCLC has been promoting a new "WorldCat Local" service, a kind of OPAC ‘front end’ that leverages the metadata and data-mining capabilities of WorldCat while customizing the interface with local branding and search parameters.
While the beta version of WorldCat Local looks promising, there are some significant implementation challenges. Due to intellectual property rights issues, certain records and certain data elements within records will not be permitted to display. Also, the WorldCat "Master Record" retrieved by patrons may not correspond exactly to the copies actually held at their local institution. In the future, the option to display "Institution Records" (IRs) rather than the Masters may help solve this problem, though it will require massive synchronization of databases.
OCLC has also been improving its WorldCat Identities service. While still under development, one can already see how bibliographic and authority data have been repurposed to illuminate aspects of an author’s life and work. Links to Identities have already begun displaying in the English-language Wikipedia, and the service is likely to play an important role in the Virtual International Authority File.6 IT author and publisher Tim O’Reilly has described it as an "amazing tool for social network exploration of the literary and artistic world."7
Master Records and Institution Records
Because of the growing number of contributing libraries and its absorption of the RLG Union Catalog, the WorldCat database has grown in size to 85 million bibliographic records representing 1 billion holdings. In the first half of 2007, 16 million RLG cluster records were converted into WorldCat IRs, reflecting variations in cataloging practice or details specific to the item as artifact (which is why it is important to have the IRs linked to WorldCat Local directly as mentioned above). They are especially helpful for identifying chains of custody, manuscript marginalia, autographs, physical condition, and other details concerning the physical artifact.
E. Meta-blogs on Metadata
In case you have not already seen it, Planet Cataloging,8 launched in May 2007 and based on Planet News Feeder software,9 is a metadata meta-blog that aggregates entries from selected cataloging blogs into a seamless running news feed. You see each posting through the eyes and stylesheets of Planet Cataloging in reverse chronological order, but clicking on any blog’s title takes you directly into its native interface. The editors Jennifer Lang and Kevin Clarke seem to choose their sources judiciously, but readers may also nominate additional blogs for inclusion. Here’s just a small sampling of what’s covered: 025.431: The Dewey blog,10 Catalogablog,11 Diane Hillmann (LITA Blog),12 Lorcan Dempsey's weblog,13 Metadata Blog (ALCTS NRMIG),14 The FRBR Blog,15 Thingology (i.e., LibraryThing's ideas blog),16 and eXtensible Catalog (XC).17 It’s quite impressive. Take a look.
A. Robert Wolven
A member of the aforementioned LC Working Group on the Future of Bibliographic Control, Robert Wolven has been awarded the prestigious Margaret Mann Citation for 2007. Director of Library Systems and Bibliographic Control at Columbia University, Wolven was cited for his "outstanding contributions to the practice of cataloging and metadata as a thinker grounded in practice, a leader inspired with generosity, and a doer motivated by an encompassing vision of what can be achieved." Wolven is donating his $2,000 in prize money to the Simmons Graduate School of Library Science.
B. Robert Bothmann
The 2007 winner of the Esther J. Piercy Award from the Association for Library Collections & Technical Services is Robert (Bobby) Bothmann, Electronic Access/Catalog Librarian at Minnesota State University, Mankato. The award recognizes contributions to library collections and technical services by a librarian with no more than 10 years of professional experience who has shown outstanding promise for continuing contribution and leadership.
C. Best of CCQ v.40 awarded to Michael A. Chopey
Michael A. Chopey, Catalog Librarian, University of Hawaii at Manoa Libraries, has received an award for the best article published in volume 40 of Cataloging & Classification Quarterly. His article, "Planning and Implementing a Metadata-Driven Digital Repository," presents an "introduction to the purpose of metadata and how it has developed, and an overview of the steps to be taken and the functional expertise required in planning for and implementing the creation, storage, and use of metadata for resource discovery in a local repository of information objects." Though clearly developed for catalogers, this article makes for great reading for all those other players in a digital library project who need to interact with catalogers, and it promotes understanding of cataloging issues among a non-cataloging audience. The article also has wide applicability to both practicing librarians and researchers because it offers concrete steps grounded in best practices for systematic planning and immediate implementation, as well as specific recommendations that may be tested in a research setting. The writing is lucid and engaging, and the sources used were authoritative and appropriate.
Even though everything digital seems to be in flux, this article is and will likely remain a good summary of the basic considerations to be kept in mind when initiating and running a metadata project. The analysis of steps to be taken in developing metadata-driven repositories is sufficiently penetrating and function-oriented—as opposed to being defined by specific job titles and technologies—that it should remain relevant for some time to come. Such a helpful overview, benefiting both catalogers and non-catalogers alike, had not existed before. The members of the panel have first-hand experience benefiting from this article and recommending it to other colleagues.
The article appears in CCQ 40(3/4): 255-287 (http://dx.doi.org/10.1300/J104v40n03_12). Members of the award panel were John Riemer (chair), Laurel Jizba, and Daniel Lovins.