Database Trial Information

Information about database trials and evaluation processes

Guide Status

Note: This guide is intended to provide basic information about past database trials and the process by which the State Library evaluates databases. This guide is not actively maintained and does not include recent database trials. If you have questions about databases provided by the State Library for state employee use, please contact Janelle Youngblood, Electronic Resources Librarian.

Evaluation Process

The State Library maintains a regular process of reviewing our databases to ensure they meet state agencies' and employees' research and information needs. As part of this process, we evaluate both current and trial databases to determine the best fit, balancing content coverage and availability, demonstrated use, expressed interest, and cost efficiency.

Here's a more detailed breakdown of the types of data analysis that may be used for database evaluation:

Usage statistics: data that indicates how databases are used

  • Searches – the total number of searches performed in a database over a given time period
  • Actions – the total number of actions taken in a database over a given time period; actions include clicking on links, retrieving generated citations, downloading PDFs, etc.
  • Downloads – the total number of full-text downloads (usually PDFs) from a database over a given time period
  • Ratio of downloads to searches – an estimate of how effectively users are able to retrieve full-text items from their searches
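
For illustration, here is a minimal Python sketch of the downloads-to-searches ratio described above, using hypothetical figures rather than actual State Library statistics:

    def downloads_per_search(downloads: int, searches: int) -> float:
        """Estimate how effectively searches lead to full-text retrievals."""
        if searches == 0:
            return 0.0  # avoid division by zero for an unused database
        return downloads / searches

    # Example: 1,200 full-text downloads against 4,800 searches in one quarter
    ratio = downloads_per_search(1200, 4800)
    print(f"Downloads per search: {ratio:.2f}")  # -> 0.25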

Title list and holdings comparisons: reveal gaps in current content or identify additional resources that could fill existing gaps (a simple comparison is sketched after this list)

  • Title lists – a list of all titles available in a particular database, usually including title, publisher, identification number, access start and end dates, delay periods, and more; includes journals, ebooks, reports, videos, etc.
  • Holdings – a dataset describing the full-text material available in a database, focused on full-text access start and end dates, including any delay periods
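
As a rough illustration of how such a comparison might be automated, the Python sketch below compares two title lists reduced to sets of placeholder identification numbers; a real comparison would also account for access dates and delay periods:

    # Hypothetical identifiers; not real holdings data.
    current_db = {"0001-0001", "0002-0002", "0003-0003"}
    trial_db = {"0002-0002", "0003-0003", "0004-0004"}

    overlap = current_db & trial_db            # titles available in both
    gaps_filled = trial_db - current_db        # new titles the trial would add
    unique_to_current = current_db - trial_db  # titles only the current database covers

    print(f"Overlap: {len(overlap)}, gaps filled: {len(gaps_filled)}, "
          f"unique to current: {len(unique_to_current)}")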

Document delivery requests: a review of items requested through the library's request/renew service and filled via document delivery; includes both item titles (at the journal and book level) and associated estimated costs

Cost efficiency: most often determined by cost per use; also takes into account potential cost savings compared to other methods of delivering access to materials (interlibrary loan, document delivery, etc.)
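
As a simple illustration, cost per use divides what a database costs by how often it is used over the same period. The Python sketch below uses hypothetical figures:

    annual_cost = 15000.00  # hypothetical subscription cost (USD)
    annual_uses = 3750      # e.g. full-text downloads over the same year

    cost_per_use = annual_cost / annual_uses
    print(f"Cost per use: ${cost_per_use:.2f}")  # -> $4.00

    # Comparing against a hypothetical per-item document delivery fee shows
    # whether the subscription is the cheaper way to deliver access.
    doc_delivery_fee = 30.00
    print(f"Subscription is cheaper per item: {cost_per_use < doc_delivery_fee}")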

User survey responses: aggregate data, most often from Likert scale questions summarized with weighted averages
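
As an illustration of that calculation, the Python sketch below computes the weighted average for a single 5-point Likert question using hypothetical response counts:

    # rating -> number of respondents (hypothetical)
    responses = {5: 12, 4: 20, 3: 8, 2: 3, 1: 2}

    total = sum(responses.values())
    weighted_average = sum(rating * count for rating, count in responses.items()) / total

    print(f"Weighted average: {weighted_average:.2f} / 5")  # -> 3.82 / 5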

User survey comments: statements from open-text questions requesting feedback during trial evaluation surveys

User feedback and requests: anecdotal feedback and requests provided by users (often via email or chat)

Library staff survey comments: statements from open-text questions requesting feedback during trial evaluation surveys

Library staff feedback and requests: anecdotal feedback and requests provided by library staff (often during meetings or following up from a user request)