Thinking Inside the Grid: Selecting a Discovery System through the RFP Process
Dace Freivalds (Pennsylvania State University Libraries, USA) and Binky Lush (Pennsylvania State University Libraries, USA)
DOI: 10.4018/978-1-4666-1821-3.ch007

Many libraries are in the process of purchasing and implementing Web-scale discovery systems. In order to ensure that the selected system meets the needs of the institution’s users, a thorough and careful evaluation of potential systems is critical. Using the Penn State University Libraries’ selection process as an example, this chapter describes the use of a formal Request for Proposal (RFP) process to evaluate Web-scale discovery systems impartially and objectively. While the RFP was mandated at Penn State, the methodology presented here can serve as a model for selecting a discovery system even when a library is not required to use an RFP. The chapter provides sample evaluation grids, scoring schemes, team guidelines, reference check questions, and other tools that can be used during the selection process to ensure a thorough and complete evaluation.
Chapter Preview

Literature Review

While library literature contains a growing body of work on Web-scale discovery systems, little of it specifically addresses the use of a structured, methodical process, much less a Request for Proposal (RFP), to evaluate and select a discovery system. Luther and Kelly (2011) identify content, search, fit, and cost as factors to consider when selecting a discovery system, while Vaughan (2011) organizes the questions to consider when contemplating such a purchase into seven sections: General and Background Questions, Local Library Resources, Publisher and Aggregator Agreements and Indexed Content, Open-Access Content, Relevancy Ranking, Authentication and Rights Management, and User Interface. Boock, Buck, Chadwell, Nichols, and Reese (2009) and Brubaker, Leach-Murray, and Parker (2011), in describing their processes for finding the right discovery layer, compare the features of various discovery systems. Rowe (2010) reviews three products (Serials Solutions® Summon™, EBSCO Discovery Services™, and OCLC WorldCat® Local) and provides comparative review scores for them, while the University of Michigan’s Article Discovery Group (2010) created “a list of concrete features and tasks that could serve as a basis for the comparison and evaluation of article discovery tools,” which then became the criteria used to evaluate individual tools (Bhatnagar et al., 2010, p. 5).

Library literature does, however, contain numerous articles on the use of an RFP, or a similarly structured evaluative process, for selecting other library information services, such as a federated search product (Caswell & Wynstra, 2007), a serials vendor (Westfall, 2011), and a library management system (Calvert & Read, 2006). Ryan (2004) describes the value of including input from end users, including faculty and students, both in developing an RFP and in evaluating the competing products. Valuable information on the RFP process can also be obtained from more general articles in the information science field that describe the standard steps involved (Peters, 2011; Clegg & Montgomery, 2006) and review the pros and cons of the process (Schachter, 2003; Schrage, 1996; Wisniewski, 2009).
