
Assessment of Public Services in Academic Libraries

Assessment in academic libraries usually conforms to the mission of the institution and the strategic plan (goals and objectives) of the library. Before starting an assessment project, librarians and library staff should have a clear understanding of both the institutional mission and their own strategic plan. Assessment of a library service or collection (also called evaluation) may have many different purposes; for example, an overall assessment of the library may be done to respond to campus program reviews with specific guidelines put in place by the university’s central administration. Internal evaluation of a service in the library may be done to determine its quality, efficiency, needed changes, resources, and/or personnel needs; in this case the assessment is done with guidelines created by the library itself. In addition, assessment may be done in connection with external entities for institutional accreditation, degree program accreditation, institutional internal program reviews, etc.

These are just a few of the possible instances when an assessment may be needed. Therefore, it is important to define the purpose of the assessment and the audience to whom it will be directed.

In practice, and from an assessment perspective, some routine data collected in public services are not considered assessment methods by themselves because the results are not entirely within the library's control, which makes it difficult to set a goal for the outcomes. The number of reference transactions and gate counts are examples of this type of data; however, raw data such as these can be coupled with user surveys. Thus, reference data supported by a user satisfaction survey can provide evidence of the quality of the service; similarly, pairing gate counts with a survey about how the library's space is used yields both quantitative and qualitative evidence. In many cases, this combination tells a story and is an effective way of presenting results.
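As a minimal sketch of how raw service counts can be paired with survey ratings, the following Python snippet summarizes one sample week of reference-transaction counts alongside 1-5 satisfaction ratings; all figures are invented for illustration.

```python
from statistics import mean

# Hypothetical daily reference-transaction counts for one sample week
daily_transactions = [42, 37, 51, 45, 30]

# Hypothetical 1-5 satisfaction ratings from a short user survey
ratings = [5, 4, 4, 5, 3, 4, 5, 4]

summary = {
    "total_transactions": sum(daily_transactions),          # quantitative volume
    "avg_per_day": mean(daily_transactions),                # typical daily load
    "avg_satisfaction": round(mean(ratings), 2),            # qualitative measure
    "pct_satisfied": round(100 * sum(r >= 4 for r in ratings) / len(ratings), 1),
}
print(summary)
```

Reported together, the volume figures and the satisfaction figures give the quantitative-plus-qualitative picture described above.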

In this project we propose simple assessment methods that require little technical expertise and can be carried out at low cost; they can also be adapted to more advanced devices, apps, or software packages when those are available. The major areas typically included in a public services unit of an academic library are reference, circulation, document delivery, reserve, and space usage. Twenty-three assessment methods are presented. An assessment project can be time-demanding: staff need to be trained and supervised, and each assessment method requires a schedule (when it will be done), the unit responsible, and the staff involved. A staggered schedule spanning three or four years is recommended, so that not every method is done every year. Finally, the most appropriate staff members are expected to summarize and analyze the results and to propose changes or improvements. In this way the assessment cycle is concluded.

In this project, the CARLI Public Services Committee presents a number of ideas, tools, and examples for the assessment of the units of a typical public services division of an academic library. This project includes the following sections:

  • Standards for Public Services developed by library associations.
  • Articles, book chapters, and web pages presenting practical and successful evaluation practices.
  • Recently published books on assessment, focusing on those that take a practical approach. The table of contents of each item is included to facilitate a quick understanding of its content coverage.
  • Assessment methods, including a number of assessments for public services units that can be adapted and modified to a specific library's environment.

This set of ideas, tools, and recommendations is geared toward academic libraries with limited time and resources that want to start an assessment project.

We would like to recognize the following entities from which permissions to use materials were obtained:
American Library Association; Columbia University Libraries; Council on Library and Information Resources, CLIR; Junior Tidal, New York City College of Technology Library; Massachusetts Institute of Technology, University Libraries; Nestor L. Osorio; Northern Illinois University Libraries; and University of Oklahoma University Libraries.


Assessment Title: Web page for Circulation - Design
Description: This assessment measures the ease of navigation and the practical value of its content. Tools for performing a focus group are presented. In addition, organization and aesthetic aspects of the website are reviewed.
Tools, Instrument: see Index of Assessment Tools, Instrument 9
Additional Information: There are numerous articles and books about how to effectively design and manage a focus group in a library setting.
Assessment Title: Reference Services at Circulation
Description: Collect data on reference transactions at the circulation desk.
Tools, Instrument: See Index of Assessment Tools, Instrument 10 – a modified form of Instrument 2
Assessment Title: Laptop Usage Survey
Description: A survey taken by users to determine their satisfaction with the service.
Tools, Instrument: See Index of Assessment Tools, Instrument 11
Additional Information: In addition to usage data, this survey will provide qualitative data.
Assessment Title: Circulation: Turnaround of Shelving
Description: This instrument is designed to collect data to measure the average turnaround time between when a book is returned (at the desk or in the drop box) and when it is back on the shelf.
Tools, Instrument: See Index of Assessment Tools, Instrument 12
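The turnaround calculation itself is simple arithmetic on timestamps. The Python sketch below, using invented log entries, computes the average number of hours between a book's return and its reshelving:

```python
from datetime import datetime
from statistics import mean

# Hypothetical log: (time returned, time reshelved) pairs for sampled items
log = [
    ("2024-03-04 09:15", "2024-03-04 14:30"),
    ("2024-03-04 11:00", "2024-03-05 10:45"),
    ("2024-03-05 16:20", "2024-03-06 09:05"),
]

fmt = "%Y-%m-%d %H:%M"
# Elapsed hours between return and reshelving for each sampled item
hours = [
    (datetime.strptime(back, fmt) - datetime.strptime(ret, fmt)).total_seconds() / 3600
    for ret, back in log
]
print(f"Average turnaround: {mean(hours):.1f} hours")
```

A spreadsheet can do the same calculation; the point is only that each sampled item needs two timestamps.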
Assessment Title: Assessment of Circulation Desk Services
Description: A survey about the quality of services provided by circulation services.
Tools, Instrument: See Index of Assessment Tools, Instrument 13
Additional Information: See: Long, D. (2012). “Check This Out”: Assessing Customer Service at the Circulation Desk. Journal of Access Services, 9(3), 154–168.
Assessment Title: Processing Time for Interlibrary Loan Requests
Description: The average time between patron request submission and processing of the request in ILLiad is monitored. Processing for article requests, loan requests, and the average of all requests is examined.
Tools, Instrument: Report obtained from ILLiad.
Assessment Title: Web Page for Document Delivery - Design
Description: This assessment measures the ease of navigation and the practical value of content on a document delivery website. It also evaluates organization and aesthetics. Tools for performing a focus group are presented, as well.
Tools, Instrument: See Index of Assessment Tools, Instrument 17
Additional Information: There are numerous articles and books about how to effectively design and manage a focus group in a library setting.
Assessment Title: Reference Services at Document Delivery
Description: Collect data on reference transactions at the document delivery desk.
Tools, Instrument: See Index of Assessment Tools, Instrument 18 – a modified form of Instrument 2
Assessment Title: Annual Count of Borrowing and Lending, and Traffic Report
Description:
Tools, Instrument: This data is provided by ILLiad and the State Library.
Additional Information: This data can be supported by a user survey. See Index of Assessment Tools, Instrument 17
Assessment Title: Public Services Assessment Survey
Description: A survey that can be given to users to determine their level of overall satisfaction with their experience using the services.
Tools, Instrument: See Index of Assessment Tools, Instrument 1
Assessment Title: Reference Desk Services
Description: Sample data at reference points, taken daily or during selected weeks each semester.
Tools, Instrument: See Index of Assessment Tools, Instrument 2
Additional Information: Libraries with subscriptions to, for example, LibQUAL+, can generate more complex data sets.
Assessment Title: Reference Interaction Survey
Description: Measures the level of user satisfaction with the service provided at the Reference Desk.
Tools, Instrument: See Index of Assessment Tools, Instrument 3
Additional Information: In addition to the data collected about the number of transactions, this survey provides information about the quality of the transaction as perceived by the users.
Assessment Title: Research Consultations (Not at the Reference desk)
Description: Reference staff members maintain a log of research consultations
Tools, Instrument: See Index of Assessment Tools, Instrument 4
Assessment Title: Outreach/Liaison Activities
Description: Reference staff members maintain a log of outreach activities and presentations (Non BI)
Tools, Instrument: See Index of Assessment Tools, Instrument 5
Assessment Title: Google Analytics, LibGuide Stats
Description: This is a report from Google Analytics. It can be used to demonstrate the usage of specific reference databases such as Reference Universe.
Tools, Instrument: Google Analytics
Additional Information: Chapter 5. Optimizing Google Analytics for LibGuides
Assessment Title: Subject LibGuides Usability Design
Description: This assessment measures the ease of navigation, along with organization and aesthetics.
Tools, Instrument: See Index of Assessment Tools, Instrument 6
Additional Information: The four articles in Instrument 6 provide specific details about how to customize a usability assessment for LibGuides.
Assessment Title: Web Page Content and Design (Main Library Page)
Description: This assessment measures the ease of navigation and the practical value of web page content, in addition to organization and aesthetics.
Tools, Instrument: See Index of Assessment Tools, Instrument 7
Assessment Title: Services for Students with Disabilities
Description: This is an example of an access and accommodation service. Usage data are collected and a satisfaction survey is administered.
Tools, Instrument: See Index of Assessment Tools, Instrument 8
Assessment Title: Web Page for Reserve - Design
Description: This assessment measures the ease of navigation and the practical value of its content, in addition to organization and aesthetics. Tools for performing a focus group are presented.
Tools, Instrument: See Index of Assessment Tools, Instrument 14
Additional Information: There are numerous articles and books about how to effectively design and manage a focus group in a library setting.
Assessment Title: Faculty Satisfaction Survey
Description: This assessment gauges satisfaction with Reserves web form/experience and quality of service.
Tools, Instrument: See Index of Assessment Tools, Instrument 15
Assessment Title: Data of Traditional and Digital Reserve Items
Description: A description of data for usability assessment from the Council on Library and Information Resources is presented.
Tools, Instrument: See Index of Assessment Tools, Instrument 16
Assessment Title: Survey of Space Utilization
Description: Users provide information about how the space in the library is used.
Tools, Instrument: See Index of Assessment Tools, Instrument 19
Assessment Title: Gate Count
Description: Daily gate counts (preferably hourly) are performed routinely or by taking periodic samples.
Tools, Instrument: See Index of Assessment Tools, Instrument 20
Additional Information: Gate counts alone are not considered an assessment method, but combined with other instruments such as a space usage survey, they can provide valuable information. There are several methods to capture data, including digital, mechanical, or manual collection. The instrument presented can be adapted to those scenarios.
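As a small illustration of how sampled gate-count data can be summarized before being paired with a space usage survey, the sketch below (with invented hourly figures) computes a daily total and identifies the busiest hour:

```python
# Hypothetical hourly gate counts for one sampled day (opening 8:00, closing 17:00)
hourly_counts = {8: 35, 9: 80, 10: 140, 11: 160, 12: 120,
                 13: 150, 14: 175, 15: 130, 16: 90, 17: 40}

daily_total = sum(hourly_counts.values())
# Hour whose count is largest; useful for staffing and space planning
peak_hour = max(hourly_counts, key=hourly_counts.get)
print(f"Daily total: {daily_total}; busiest hour starts at {peak_hour}:00 "
      f"({hourly_counts[peak_hour]} entries)")
```

The same tallying works whether the counts come from a digital counter's export or a manual tally sheet.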

“The Standards articulate expectations for library contributions to institutional effectiveness.” This statement, found in the introduction of “Standards for Libraries in Higher Education,” a document adopted by the ACRL Board of Directors in October 2011, clearly states the significant value of library standards. In this section we provide a comprehensive list of standards related to public services functions in academic libraries.

American Library Association

ALA Standards & Guidelines

Association of College & Research Libraries, (A.C.R.L.), American Library Association

ACRL Guidelines, Standards, and Frameworks

ACRL Proficiencies for Assessment Librarians and Coordinators, American Library Association, January 23, 2017. 

Guidelines for University Library Services to Undergraduate Students, American Library Association, September 1, 2006. 

Library User Survey Templates & How-Tos

Website: Library Research Service
LRS free survey templates for programs and events and ways to administer them.

Measuring Quality: Performance Measurement in Libraries, 2nd revised edition
Book: International Federation of Library Associations and Institutions
Edited by Roswitha Poll & Peter de Boekhorst, the 2nd revised edition was expanded to include information on public libraries as well as academic libraries.

Project Outcome: Measuring the True Impact of Public Libraries
Website: Public Library Association
PLA’s Project Outcome performance measurement tool along with current offerings, surveys, the ability to compare data, and much more.

Standards for Libraries in Higher Education, American Library Association, August 29, 2006.

International Federation of Library Associations and Institutions (IFLA)

Current IFLA Standards

Interlibrary Loan

Guidelines for Best Practice in Interlibrary Loan and Document Delivery

International Resource Sharing and Document Delivery: Principles and Guidelines for Procedure (2009 Revision)

IFLA Digital Reference Guidelines (2002)

Literacy and Reading

Using Research to Promote Literacy and Reading in Libraries: Guidelines for Librarians

Medical Library Association (MLA)

MLA Competencies for Lifelong Learning and Professional Success (2017)

Standards for Hospital Libraries (2007)

Public Library Association (PLA), American Library Association

Planning & Evaluation, American Library Association, June 16, 2016.

Reference and User Services Association (RUSA)

Measuring and Assessing Reference Services and Resources: A Guide

Professional Competencies for Reference and User Services Librarians
[These guidelines can be used to create a user satisfaction survey for reference services]

Reference and User Services Guideline Links by Topic, American Library Association, September 29, 2008.

Electronic Services

Guidelines for Implementing and Maintaining Virtual Reference Services (2017)
   
Guidelines for the Introduction of Electronic Information Resources to Users (2006)

Information Literacy

Information Literacy Guidelines and Competencies for Undergraduate History Students (2013)

Interlibrary Loan

Guidelines for Interlibrary Loan Operations Management (2012)
    
Interlibrary Loan Code for the United States (January 2016) (Includes link to generic ILL form)

Interlibrary Loan Code for the United States Explanatory Supplement (January 2016)

Guidelines for Resource-Sharing Response to Natural and Man-made Disasters (2017)

Reference/Information Services

New Definition of Reference (2008)

Guidelines for Behavioral Performance of Reference and Information Service Providers (2013)

Guidelines for Business Information Responses (2013)

Guidelines for Cooperative Reference Services (2006)
    
Health and Medical Reference Guidelines (2015)
    
Professional Competencies for Reference and User Services Librarians  (2017)

Special Libraries Association (SLA)

Competencies for Information Professionals

Articles

Applegate, R. (2009). Designing comprehensive assessment plans: The big picture leads to the little picture. In D. M. Mueller (Ed.), Pushing the edge: Explore, extend, engage: Proceedings of the Fourteenth National Conference of the Association of College and Research Libraries, March 12-15, 2009, Seattle, Washington (pp. 165-171). Chicago: Association of College and Research Libraries.
https://scholarworks.iupui.edu/handle/1805/1877

Ayre, S., Brettle, A., Gilroy, D., Knock, D., Mitchelmore, R., Pattison, Smith, S., Turner, J. (2018). Developing a generic tool to routinely measure the impact of health libraries. Health Information and Libraries Journal, 35(3), 227-245. doi:10.1111/hir.12223 [doi]
https://onlinelibrary.wiley.com/doi/10.1111/hir.12223

Background: Health libraries contribute to many activities of a health care organisation. Impact assessment needs to capture that range of contributions.
Objectives: To develop and pilot a generic impact questionnaire that: (1) could be used routinely across all English NHS libraries; (2) built on previous impact surveys; and (3) was reliable and robust.
Methods: This collaborative project involved: (1) literature search; (2) analysis of current best practice and baseline survey of use of current tools and requirements; (3) drafting and piloting the questionnaire; and (4) analysis of the results, revision and plans for roll out.
Findings: The framework selected was the International Standard Methods And Procedures For Assessing The Impact Of Libraries (ISO 16439). The baseline survey (n = 136 library managers) showed that existing tools were not used, and impact assessment was variable. The generic questionnaire developed used a Critical Incident Technique. Analysis of the findings (n = 214 health staff and students), plus comparisons with previous impact studies indicated that the questionnaire should capture the impact for all types of health libraries.
Conclusions: The collaborative project successfully piloted a generic impact questionnaire that, subject to further validation, should apply to many types of health library and information services.

Blevins, A. E., DeBerg, J., & Kiscaden, E. (2016). Assessment of service desk quality at an academic health sciences library. Medical Reference Services Quarterly, 35(3), 285-293. doi:10.1080/02763869.2016.1189782 [doi]
https://www.tandfonline.com/doi/abs/10.1080/02763869.2016.1189782

Due to an identified need for formal assessment, a small team of librarians designed and administered a survey to gauge the quality of customer service at their academic health sciences library. Though results did not drive major changes to services, several important improvements were implemented and a process was established to serve as a foundation for future use. This article details the assessment process used and lessons learned during the project. [Author Abstract]

Ebenezer, C. (2003). Usability evaluation of an NHS library website. Health Information & Libraries Journal, 20(3), 134.
https://www.ulib.niu.edu:2555/10.1046/j.1365-2532.2003.00450.x

Objectives: To carry out a usability evaluation of the recently launched South London and Maudsley NHS Trust library website.

Methods: A variety of standard methodologies were employed: content and design evaluation of selected comparable sites, focus groups, a questionnaire survey of library and Web development staff, heuristic evaluation, observation testing, card sorting/cluster analysis, and label intuitiveness/category membership testing. All test participants were staff of, or providers of services to, the trust. Demographic information was recorded for each participant.

Results: Test participants’ overall responses to the site were enthusiastic and favourable, indicating the scope and content of the site to be broadly appropriate to the user group. Testers made numerous suggestions for new content. Usability problems were discovered in two main areas: in the organization of the site, and in the terminology used to refer to information services and sources. Based on test results, proposals for a revised menu structure, improved accessibility, and changes to the terminology used within the site are presented.

Conclusion: Usability evaluation methods, appropriately scaled, can be advantageously applied to NHS library websites by an individual Web editor working alone. [Abstract from author]

Farney, T. (2016). Optimizing Google Analytics for LibGuides. Library Technology Reports, 52(7), 26–30.
https://journals.ala.org/index.php/ltr/issue/view/613

The article discusses several options for libraries to install the Google Analytics software on the LibGuides content management system and how it affects data assessment at the individual guide level. Topics include the benefits of installing Google Analytics, installation of Google Tag Manager software to LibGuides, and events to track for a LibGuides website.

Hamasu, C., & Kelly, B. (2013). Assessment and evaluation is not a gut feeling: Integrating assessment and evaluation into library operations. Journal of the Medical Library Association: JMLA, 101(2), 85-87. doi:10.3163/1536-5050.101.2.001 [doi]
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3634389/

Horowitz, L. R. (2009). Assessing library services: A practical guide for the nonexpert. Library Leadership & Management, 23(4), 193-203.
https://journals.tdl.org/llm/index.php/llm/article/download/1793/1068

The idea of assessing library services can intimidate even the most seasoned librarian. This article is a straightforward introduction to assessment for the nonexpert, written for reference and public service managers and coordinators who want to analyze data about existing services or incorporate assessment into plans for a new service. It addresses how to approach assessment, how to make it more useful, and how it can improve a service over time. Those who are new to assessment will find this article a helpful starting point for service evaluation; those who are experienced in assessment can share this article with nonexpert staff to explain the basics of assessment, dispel anxieties, and generate support for assessment projects.

Griffin, M., & Taylor, T. I. (2018). Employing Analytics to Guide a Data-Driven Review of LibGuides. Journal of Web Librarianship, 12(3), 147–159.
https://www.tandfonline.com/doi/full/10.1080/19322909.2018.1487191

This article presents a methodology for conducting an evidence-based review of LibGuides content based on native and non-native analytics data. This methodology uses built-in analytics data from Springshare's platform and data from Google Analytics to investigate LibGuides functionality, use, and design criteria. These criteria, in turn, enable a strategic consideration of how and why we as librarians create LibGuides. Are our guides intended to facilitate reference and research consultations, or do they primarily serve to enable independent research by students? More specifically, who benefits the most from the LibGuides we generate—librarians or researchers? We conclude with a consideration of how analytics data can be leveraged to generate librarian buy-in for reevaluating design criteria of library subject guides and consider implications for practice and further research in this area. [Abstract from Author]

Gustafson, M. M. (2018). They Searched What? Usage Data as a Measure of Library Services and Outreach. Serials Librarian, 74(1–4), 240–243.
https://www.tandfonline.com/doi/full/10.1080/0361526X.2018.1439246

Electronic resources librarians are able to collect a large amount of data across resources, but often the biggest challenge is distilling the data into something meaningful to contribute to the strategic mission of an organization. The Cunningham Memorial Library at Indiana State University acquired ProQuest’s Summon web-scale discovery service in 2015. This article discusses how the library’s e-resources librarian used the new discovery workflow to begin to harvest user search queries as a means to inform resource placement and design on the library’s website and suite of e-resources tools; inform and refine Springshare LibGuides tags for better discovery; craft “best bets” to assist users in getting to their just in time information needs be they a quick search on an unknown topic or a targeted search for a specific research finding as quickly and effectively as possible; and empower instruction librarians to fine tune their information literacy curriculum based on search skills of our users. [Abstract from author]

Key Performance Indicator Handbook for Libraries Serving Print Disabled People (2012). Compiled by Jon Hardisty, Royal National Institute of Blind People, National Library Service, England; Sebastian Mundt, Hochschule der Medien Stuttgart, Germany. IFLA Section Libraries Serving Persons with Print Disabilities; IFLA Section Statistics and Evaluation.
https://www.ifla.org/files/assets/libraries-for-print-disabilities/publications/performance_indicators_lpd_final_2012-10-31.pdf

Long, D. (2012). “Check This Out”: Assessing Customer Service at the Circulation Desk. Journal of Access Services, 9(3), 154–168.
https://doi.org/10.1080/15367967.2012.684586
https://www.tandfonline.com/doi/full/10.1080/15367967.2012.684586

The access services staff at Milner Library, Illinois State University, designed a customer service assessment to evaluate how effectively the department was carrying out its mission statement. Areas of assessment included the department’s waiting times, helpfulness, and courtesy. The assessment activity focused on circulation services, which is the library’s service point reaching the greatest number of patrons. The staff created a survey instrument collaboratively, distributed in paper and electronic format for a week. Results suggested that the department provides prompt and helpful service but has areas of improvement for courtesy. The department adapted customer service training for staff and student assistants.

Manzari, L., & Trinidad-Christensen, J. (2006). User-Centered Design of a Web Site for Library and Information Science Students: Heuristic Evaluation and Usability Testing. Information Technology & Libraries, 25(3), 163–169.
https://ejournals.bc.edu/index.php/ital/article/view/3348

This study describes the life cycle of a library Web site created with a user-centered design process to serve a graduate school of library and information science (LIS). Findings based on a heuristic evaluation and usability study were applied in an iterative redesign of the site to better serve the needs of this special academic library population. Recommendations for design of Web-based services for library patrons from LIS programs are discussed, as well as implications for Web sites for special libraries within larger academic library settings. [Abstract from author]

Oakleaf, M. (2010). The value of academic libraries: A comprehensive research review and report. Chicago, IL: American Library Association.
http://www.ala.org/ala/mgrps/divs/acrl/issues/value/val_report.pdf

Oelschlegel, S., Grabeel, K. L., Tester, E., Heidel, R. E., & Russomanno, J. (2018). Librarians promoting changes in the health care delivery system through systematic assessment. Medical Reference Services Quarterly, 37(2), 142-152. doi:10.1080/02763869.2018.1439216 [doi]
https://www.tandfonline.com/doi/full/10.1080/02763869.2018.1439216

Patient engagement in health care decisions largely depends on a patient's health literacy and the health literacy attributes of the health care organization. Librarians have an established role in connecting patients with health information in the context of their care. However, librarians can play a larger role in helping to make changes in their organization's health literacy attributes. This article discusses one medical library's process of leading systematic assessment of their organization's health literacy attributes. Included in this discussion is the institutional support, timeline, assessment tool, the results for five areas of health literacy, marketing and the event-planning process to disseminate results. The systematic assessment process described employs the Health Literacy Environment of Hospitals and Health Centers document, which provides assessment tools for Print Communication, Oral Communication, Navigation, Technology, and Policies and Protocols. [Author Abstract]

Prentice, K. A., & Argyropoulos, E. K. (2018). Library space: Assessment and planning through a space utilization study. Medical Reference Services Quarterly, 37(2), 132-141. doi:10.1080/02763869.2018.1439213 [doi]
https://www.tandfonline.com/doi/full/10.1080/02763869.2018.1439213

The objective of this article is to describe the recent space and furniture utilization study conducted through direct observation at the small, academic-centered Schusterman Library. Student workers from the library's reference desk monitored two semesters of use and went on to observe a third semester after electrical power upgrades were installed. Extensive use details were collected about where library patrons sat during which parts of the day, and certain areas of the library were ultimately identified as much more active than others. Overall, the information gathered proved useful to library planning and will be valuable to future space initiatives. This article further demonstrates feasible means for any library to implement a similar study with minimal resources. [Author Abstract]

Shurtz, S., Sewell, R., Halling, T. D., McKay, B., & Pepper, C. (2015). Assessment of an iPad loan program in an academic medical library: A case study. Medical Reference Services Quarterly, 34(3), 265-281. doi:10.1080/02763869.2015.1019743 [doi]
https://www.tandfonline.com/doi/full/10.1080/02763869.2015.1019743

An academic medical library expanded its iPad loan service to multiple campus libraries and conducted an assessment of the service. iPads loaded with medical and educational apps were loaned for two-week checkouts from five library campus locations. Device circulation statistics were tracked and users were invited to complete an online survey about their experience. Data were gathered and analyzed for 11 months. The assessment informed the library on how best to adapt the service, including what resources to add to the iPads, and the decision to move devices to campuses with more frequent usage. [Author Abstract]

Thorngate, S., & Hoden, A. (2017). Exploratory usability testing of user interface options in LibGuides 2. College & Research Libraries, 78(6).
https://crl.acrl.org/index.php/crl/article/view/16739/18254

Online research guides offer librarians a way to provide digital researchers with point-of-need support. If these guides are to support student learning well, it is critical that they provide an effective user experience. This article details the results of an exploratory comparison study that tested three key user interface options in LibGuides 2—number of columns, placement of the navigation menu, and visual integration with the library website—to understand their impact on guide usability. In addition to informing our own design choices, our findings can serve as the basis for further investigation into the connections between student learning and the usability of the LibGuides user interface.

Kim, Y.-M. (2011). Users’ perceptions of university library websites: A unifying view. Library & Information Science Research, 33(1), 63-72.
https://www.sciencedirect.com/science/article/abs/pii/S0740818810001118

University libraries have invested a large amount of resources into digitizing information for the Web, yet scholars and practitioners question the value of this investment due to a lack of use of university library website resources (ULWR). Addressing this concern, researchers have investigated the use of ULWR and offered insights into the problem. However, studies have employed a single perspective rather than a comprehensive approach; as a result, the findings shed light on only part of the use issue. Also, existing studies have consistently reported that users with different academic roles have distinct usage patterns of ULWR and information sources, but they have not considered a wide range of users or systematically investigated such differences. This study examines (a) the user perspective, derived from technology adoption literature; (b) the website design perspective, embedded in human-computer interaction literature; and (c) the library service quality perspective, based on information science literature. The second area is addressed by surveying a wide range of users, categorizing them based on their academic role differences, and then comparing their use of ULWR and information sources, thereby highlighting distinctive usage patterns. Research based on the responses of 315 participants shows that while users favorably rated factors derived from the perspectives of user and library service qualities for ULWR use, they perceived university library websites as somewhat difficult to use. Also, distinct user patterns are observed in this dataset. [All rights reserved Elsevier].

Book Chapters

Survey Questions, pages 165-167. In: Brown, M. Suzanne; Freund, LeiLani (2010). Services for users with disabilities. SPEC Kit 321. Washington, DC: Association of Research Libraries. In the section on User Needs Assessment, see the University of Waterloo Library Survey of Information and Accessibility of Students with Disabilities. This extensive user survey can be adapted.
https://publications.arl.org/Services-for-Users-with-Disabilities-SPEC-Kit-321/

Hernon, Peter; Lann, Jennifer (2006). Refinement of data collection instrument. In: Hernon, Peter; Calvert, Philip J. (2006). Improving the Quality of Library Services for Students with Disabilities. Westport, CT: Libraries Unlimited. In this chapter's appendix, Service Quality: Library Collections and Services for Those with Disabilities (pages 151-156), the reader will find an extensive user survey that can be adapted.
https://www.abc-clio.com/LibrariesUnlimited/product.aspx?pc=F2360P

Websites

Assessment in the MIT Libraries: MIT Library Surveys

DataBank: Conferences: Assessment & Metrics, Northwestern University Libraries

How to Conduct Library Website Usability Studies for Free, OEDb Open Education Database

Library Assessment, University of Nevada, Las Vegas, University Libraries
2016 User Satisfaction Survey: surveyed library users to determine satisfaction with services and resources.

Library Assessment Committee Strategic Plan, Rutgers University Libraries

Planning and Assessment, Rutgers University Libraries

LibQUAL+ Library Services for Individuals with Disabilities, Columbia University Libraries

Libraries Survey 2017

MINES: Measuring the Impact of Networked Electronic Services, Publications

Project Outcome, Public Library Association (PLA)

“A FREE toolkit designed to help public libraries understand and share the impact of essential library services and programs by providing simple surveys and an easy-to-use process for measuring and analyzing outcomes. Project Outcome also provides libraries with the resources and training support needed to apply their results and confidently advocate for their library’s future.” -PLA

Britto, Marwin; Kinsley, Kirsten, editors. Academic Libraries and the Academy: Strategies and Approaches to Demonstrate Your Value, Impact, and Return on Investment, 2 volumes, Chicago: Association of College and Research Libraries, a division of the American Library Association, 2018.

Volume One
 1. High-impact practices and archives
 2. Growing our field evidence: succession planning for sustainable information literacy
 3. Connecting student success and library services
 4. Our "special obligation": library assessment, learning analytics, and intellectual freedom
 5. Research and writing in the discipline: a model for faculty-librarian collaboration
 6. Thinking LEAN: the relevance of Gemba-Kaizen and visual assessment in collection management
 7. Delivering on the institution's mission: developing measures for a research library's strategic
 8. Begin again
 9. Three thousand library users can't be wrong: demonstrating library impact using one open-ended survey question
10. Rowan University Libraries: head-counting study
11. Measuring accessibility and reliability of a laptop-lending kiosk in an academic library
12. Triangulating an assessment plan
13. Leveraging research to guide fundamental changes in learning: a case study at Kreitzberg Library, Norwich University
14. Answering the question before it's asked: building a library impact dashboard
15. Closing the gap: the library in academic program review
16. An ounce of performance is worth pounds of promises: the impact of web-scale discovery on full-text consumption
17. Show them the (data-driven) goods: a transparent collection assessment tool for libraries
18. Q-methodology: a versatile, quick, and adaptable indirect assessment method
19. Assessing discovery: how first-year students use the Primo discovery tool
Volume Two
20. Problems and promises of using LMS learner analytics for assessment: case study of a first-year English program
21. Reframing information literacy assessment: reaching for college seniors
22. Library instruction, net promoter scores, and nudging beyond satisfaction
23. Gathering user behaviors: improving library space while enhancing the library's profile
24. Constructing the evaluation mosaic of a library module for new undergraduate students
25. Breaking the SEAL: enhancing engagement with academic libraries and the academy through educational design innovation in technology-enhanced learning
26. Using reflective writing to enhance the student research process
27. Assessing the effectiveness of collaboration workshops in an academic library: a mixed-methods approach
28. Transitioning from a teaching to a research-focused collection in a Middle Eastern university: a road map for future directions
29. Creating a strategic and flexible assessment framework for undergraduate student outreach
30. Value assessment strategies and return on investment of the twenty-first century libraries: covenant university in view
31. Cracking the code: course syllabi unpacked, decoded, and documented for evidence of library value
32. Building a case for the replacement of a legacy library management system
33. When numbers are not enough: using assessment toward organizational change
34. Assessment as engagement: understanding faculty perceptions of research at Trinity College
35. Targeting collection assessment data to the intended audience
36. Story time in the academic library: using assessment evidence to communicate library value
37. "We only see what we look at": sight as a metaphor for exploring student library use, study habits, and research behaviors
38. Longitudinal information literacy skills assessment
39. The maturing of a big library data project, or, how to future-proof your library data and student success project
40. A voice of their own? : letting library collections tell their story: the UNT Libraries collection map
41. A story for the ages: staff engage in reorganization by reading a decade's trend data
42. Using program evaluation as a proxy for assessment: diffusion from policy literature to improve academic program assessment.
 

Mack, Daniel C.; White, Gary W., editors. Assessing Liaison Librarians: Documenting Impact for Positive Change, Chicago, IL: Association of College and Research Libraries, 2014.

Introduction: Libraries and Assessment / Daniel C. Mack and Gary W. White, Ph.D.
 1. The Place of Liaisons and the Central Role of Assessment in American Higher Education / Gary W. White, Ph.D.
 2. Programmatic Assessment of Research Services: Informing the Evolution of an Engaged Liaison Librarian Model / Sarah Anne Murphy and Craig Gibson
 3. Assessment of Teaching, Learning, and Literacies / Ellysa Stern Cahoy
 4. Library Assessment for Online, Blended, and Other Learning Environments / Maria R. Barefoot, MLIS, AHIP
 5. Beyond the Bibliographer: Assessing Collection Development Activities in the New Digital Library / Daniel C. Mack
 6. Liaison Librarians and Scholarly Communication: A Framework and Strategies for Assessment / Dawn Childress and Daniel Hickey
 7. The Library as Platform: Assessing Outreach and Engagement in the Library of the Future / Marcy Bidney
 8. Professional Development of Liaison Librarians: Fostering Skills for the Twenty-First Century / Gary W. White, Ph.D.
Conclusion: Designing and Implementing a Liaison Assessment Program / Daniel C. Mack
About the authors
Index

Susan E. Montgomery. Assessing Library Space for Learning, Rowman & Littlefield, 2017.

Part I. Library space assessment background.
 1. Space assessment : an interdisciplinary look at past and present studies / Karen R. Diller
 2. Library space and learning theory / Susan E. Montgomery
 3. A place to think, feel, and act : psychological approaches to understanding library spaces / Paul B. Harris and Stephanie N. Schweighardt
 4. The evolving role of the architect in library design / Tom Sens and Sarah Parisi Dowlin
Part II. Applying library space assessment.
 5. Academic libraries and accreditation : a theory-based framework for assessing modern library spaces / James Zimmerman
 6. Designed to meet our institutional mission : a case study from Grand Valley State University Libraries / Kristin Meyer and Erin Silva Fisher
 7. Satisfying the electric youth: maximizing student success through space, resources, services, and outlets / Mary Beth Lock, Meghan Webb and John Champlin
 8. Creating a learning culture for student-athletes / Jason Dodd and Dale Lackeyram
 9. Research-creation : library space and resources for fine arts students / Shailoo Bedi, Christine Walde, Tad Suzuki, and Bill Blair
10. Are they different? : An investigation of space and learning in a STEM branch library / Ian McCullough and Jo Ann Calzonetti
11. First-generation undergraduate students and library spaces : a case study / Karen A. Neurohr and Lucy E. Bailey
12. Library space redesign : stimulus and response : University of California, Santa Cruz / Gregory Careaga
Part III. Library space and routine assessment.
13. Watch this space! : Viewing assessment as a continuous process / Camille Andrews, Tobi Hines, and Sara E. Wright
14. Future of academic library space assessment / Danuta A. Nitecki.

Association of Specialized and Cooperative Library Agencies (2017). Revised standards and guidelines of service for the Library of Congress network of libraries for the blind and physically handicapped, 2017. Chicago: Association of Specialized and Cooperative Library Agencies, American Library Association.
https://www.alastore.ala.org/content/revised-standards-and-guidelines-service-library-congress-network-libraries-blind-and-1

This document represents a new approach to the Standards and Guidelines used by the Network Libraries of the Library of Congress National Library Service for the Blind and Physically Handicapped (LC/NLS). Like previous editions, this document is intended as a resource for LC/NLS network libraries to maintain the best service levels for eligible individuals and institutions. This new concise and flexible edition of the Standards and Guidelines provides a straightforward, and detailed version for network service providers to use as benchmarks when providing services to eligible parties. The standards address core areas of LC/NLS network library services and activities, including provisions of services, resource development and management, public education and outreach, administration and organization, and planning and evaluation. The addition of standards addressing staffing and use of physical library space and the introduction of revised staffing model guidelines highlight the importance of these areas for network services providers and their patrons.

Peter Hernon. Higher Education Outcomes Assessment for the Twenty-First Century, Santa Barbara, CA: Libraries Unlimited, 2013.

 1. Outcomes assessment today : an overview / Peter Hernon
 2. Literature on assessment for learning / Peter Hernon and Candy Schwartz
 3. The U.S. government / Robert E. Dugan
 4. Higher education outcomes in the states and institutions / Robert E. Dugan
 5. Outcomes and the national party platforms in the 2012 presidential election / Robert E. Dugan
 6. Institutional effectiveness / Peter Hernon
 7. Information literacy as a student learning outcome : institutional accreditation / Laura Saunders
 8. Critical thinking and information literacy : accreditation and outcomes assessment / Laura Saunders
 9. Library engagement in outcomes assessment / Peter Hernon
10. Some elements of study procedures / Peter Hernon
11. Evidence gathering / Candy Schwartz
12. Moving forward / Peter Hernon and Robert E. Dugan
Appendix: Higher education organizations, associations, and centers with responsibilities, activities, and/or research related to outcomes.

Ben Showers. Library Analytics and Metrics: Using Data to Drive Decisions and Services, London: FACET Publishing, 2015.

Introduction: Getting the measure of analytics and metrics
1. Library data: big and small
2. Data-driven collections management
3. Using data to demonstrate library impact and value
4. Going beyond the numbers: using qualitative research to transform the library user experience
5. Web and social media metrics for the cultural heritage sector
6. Understanding and managing the risks of analytics
7. Conclusion: towards a data-driven future?

Joseph R. Matthews. Library Assessment in Higher Education, Santa Barbara, CA: Libraries Unlimited, 2015.

 1. Introduction
 2. Mission of the college or university
 3. Model of the student educational process
 4. Assessment of student learning outcomes
 5. Assessment of the library's contribution to student learning outcomes
 6. Institutional assessment of the teaching effectiveness
 7. Assessment of the library's impact on teaching effectiveness
 8. Institutional assessment of the research environment
 9. Assessment of the library's impact on the research environment
10. The library's impact in other areas
11. Planning assessment
12. Implementing assessment.

Peter Hernon; Robert E Dugan; Joseph R Matthews. Managing with data using ACRLMetrics and PLAmetrics, Chicago, ALA Editions, an imprint of the American Library Association, 2015.

 1. The context for libraries today and beyond
 2. Accountability
 3. Collections
 4. Services
 5. Staffing
 6. Benchmarking and benchmarking studies
 7. Best practices
 8. Moving towards outcomes assessment while embracing value
 9. Use
10. Presenting the findings
11. Managing with data (evidence).
Appendix: Answers to Chapter Exercises
About the authors
Index

Brown, Karen E.; Gilchrist, Debra L.; Goek, Sara; Hinchliffe, Lisa Janicke; Malenfant, Kara Josephine; Ollis, Chase; Payne, Allison, editors. Shaping the campus conversation on student learning and experience: activating the results of Assessment in Action, Chicago, Illinois: Association of College and Research Libraries, a division of the American Library Association, 2018.

 1. Evidence of academic library impact on student learning and success: advancing library leadership and advocacy with assessment in action / Karen Brown
 2. Creating sustainable assessment through collaboration: a national program reveals effective practices / Kara J. Malenfant and Karen Brown
 3. Academic library contributions to student success: documented practices from the field / Karen Brown and Kara J. Malenfant
 4. Documented library contributions to student learning and success: building evidence with team-based assessment in action campus projects / Karen Brown and Kara J. Malenfant
 5. Academic library impact on student learning and success: findings from assessment in action team projects / Karen Brown and Kara J. Malenfant
 6. Value of academic libraries statement / prepared by Adam Murray and Lorelei Tanji
 7. A stone soup approach to building large-scale library assessments / Mary O'Kelly
 8. Filling in the potholes: providing smooth pathways for successful library instruction for first year students / Adam Brennan and Lisa Haldeman
 9. Building campus partnerships and improving student success through a collaborative drop-in tutoring service / Stephanie Bush
10. Becoming part of the conversation through assessment of undergraduate library internships / Clinton K. Baugess and Kathryn S. Martin
11. Positively impacting the library experience of aboriginal and international students / Nancy Goebel
12. You spin me right round (like a record), or, Does the assessment loop ever truly "close"? / Iris Jahng
13. Don't wait for them to come to you: partnering with student support services / Katie Bishop
14. Assessing information literacy for transfer student success / Karen Stanley Grigg
15. Opening doors for libraries on campus and beyond / Ken Liss
16. Professional development for assessment: lessons from reflective practice / Lisa Janicke Hinchliffe
17. Assessing for alignment: how to win collaborators and influence stakeholders / Stephanie Mikitish, Vanessa Kitzie, and Lynn Silipigni Connaway
18. Conclusion: Reflecting on the past, looking to the future / Lisa Janicke Hinchliffe.
Appendix A: Program contributors and participating institutions
Appendix B: Connect, collaborate, and communicate: a report from the Value of Academic Libraries summits
Appendix C: Applying for the Assessment in Action Plan program
Appendix D: First interim narrative report to Institute of Museum and Library Services
Appendix E: Second interim narrative report to Institute of Museum and Library Services
Appendix F: Final narrative report to Institute of Museum and Library Services
Appendix G: Assessment in action syllabus 2015-2016
Appendix H: Assessment in action team report index by regional accrediting agencies
Appendix I: Assessment in action comprehensive bibliography
Appendix J: Assessment in action studies with exemplary design elements
Appendix K: Progress report on planning multi-institutional research on Library Value

Stephanie Wright; Lynda S White. SPEC Kit 303: Library Assessment, Washington, D.C.: Association of Research Libraries, 2007.

Library Assessment frontmatter
Survey Results
Representative Documents
Position Descriptions
Assessment Charges and Mission Statements
Organization Charts
Assessment Web Sites
Assessment Activity Reports
Assessment Plans
Selected Resources
Books
Journal Articles
Library Assessment Web Sites
Other Resources

Dobbs, Aaron W., editor. The Library Assessment Cookbook, Chicago: Association of College and Research Libraries, A division of the American Library Association, 2017.

Section 1. Data preparation for assessment
 Kitchen prep: getting ready for library assessment / Emily Guhde
 Creating a statistics report using ingredients already in the library / Judy Geczi
 Using student GPA to show the "nutritional value" of a library service / Mitchell Scott
 Boiling down qualitative data to build personas that inform spaces, services, and technologies / Monena Hall and Maurini Strub
 "Stocking the larder" : recruiting for focus groups and other small group assessments / Carol Mollman
 Making project management planning and assessment planning a piece of cake / Susan Payne
Section 2. Traditional and online collections assessments
 Professional development assessment paella / Catherine Sassen, Karen Harker, and Erin O'Toole
 Creating a framework for comprehensive collection assessment / Galadriel Chilton
 Exploratory collection assessment: the subject snapshot / Madeline Kelly
 Recipe for collection assessment: mixing together key ingredients to make a roux / Paula Barnett-Ellis and Charlcie Pettway Vann
 Potluck surprise: assessing donated materials / Jeannine Berroteran
 Using required readings from course syllabi to show library value and assess the collection / Eric Hartnett and Simona Tabacaru
 MAPing collection use: using massive analysis projects for collections analysis / Galadriel Chilton, Joelle Thomas, Alice Fairfield, Arta Dobbs, Elisabeth Umpleby, and Dawn Cadogan
 Using baskets and a rubric to assess online resources in the "messy middle" / Kathleen Reed, Jean Blackburn, and Dana McFarland
 Annual database (or general resource) evaluation / Ashley Zmau
 Harvested IR metadata gumbo / Courtenay McLeland and Alice Eng
 Fad food or family tradition? : assessing digital projects for long-term value / Allison Ringness
 Rendering repositories: taking out the fat and getting to the impact / Sian Brannon and Laura Waugh
 Flavoring your e-resources collection: the spice rack assessment / Amanda Binder and Elizabeth Siler
 Open access citation analysis / Emily Raymond and Heather Scalf
 Tower cakes for ranking subscription resources / Karen Harker, Todd Enoch, and Laurel Crawford
Section 3. Instruction programs assessments
 Measuring the ingredients of a good bibliography: a recipe for citation analysis / Karen Kohn, Larissa Gordon
 Open-faced formative assessment / Gina Calia-Lotz
 Cooking with information resources in the online kitchen / Beate Gersch and Joseph A. Salem, Jr
 Lazy Susan: continuous improvement cycle of learning and assessment / Joy Oehlers
 Information literacy assessment entrée : using a reflective essay to assess course learning outcomes / Stephanie Alexander, Tom Bickley, Gr Keer, Aline Soules, and Diana K. Wakimoto
 PROTEIN (Peer Review of Teaching: Evaluative Instruction Networks) supplements for librarians / Jason Vance
 The library rally / Michelle J. Gibeault
 Rubrics as a method for assessing & improving library instruction / Megan Hodge, Laura Gariepy, and Jenny Stout
 Slow-cooked rubric: designing and using a rubric to assess undergraduate final papers / Eleanor Johnson and Katie Bishop
 Cooking up rubrics to assess student learning / Marjorie Leta
 How much do good cooking methods affect the quality of the meal? / Christy Fic and Kirk Moll
 Library tour taste test / Nancy Noe
 Word soup: using word clouds to assess the one-minute paper / Maria Barefoot
 Creating connoisseurs: assessing students' ability to evaluate websites / Mandi Smith, Jason Smith, Cathy Blackman, and Wensheng Wang
 Assessment layer cake / Patricia J. Mileham and Kimberly J. Whalen
Section 4. Outreach and programming assessment
 Shopping for kitchen implements? : how to decide if a library program is useful, usable, and desirable / Joy Oehlers and Joyce Tokuda
 Assessing an event: mixin' it up with the long night against procrastination / Katherine Penner and Sarah Clark
 Measuring the success of library outreach to first-year student athletes / Beth Hendrix
 Planning the perfect party: data for dessert! / Katy Mathuews and Zachary Lewis
Section 5. Assessments assessment
 Rubrics and rutabagas: only one is useful for assessing staff during evaluations / Sian Brannon and Julie Leuzinger
 Cooking times may vary: assessing student workflows in the stacks / Joyce Douglas and Katy Mathuews
 Stop folding the dough: intervening to decide whether or not to assess again / Sian Brannon
 Recipe for success: add a personal SWOT analysis to any assessment project / Faithe Ruiz
Section 6. Strategic planning assessment
 Two birds with one stone: using a survey to get employee feedback and to educate in the strategic planning process / Regina Mays and Peter Fernandez
 Strategic stress test: aligning programs and services with a strategic plan / Strategic Initiatives Department
 Chicken soup for digital scholarship: assessing your next steps / A. Miller
Section 7. Service points and services assessment
 Service desk activity burrito / Heather Scalf and Ali Adil
 Who's at your table? : planning for success through community engagement / Stefanie Metko
 Survey system for measuring library outcomes / Rebecca Bayrer, Dawn Melberg, and Eve Melton
 Academic libraries: the breakfast of student champions / Katy Mathuews
 If we prepare it, will they come back for more? / Adriana Gonzalez and Jason B. Reed
 Methods mash / Jennifer Jones
 Service assessment with a side of secret shoppers / Jennifer Jones
 Healthy chat reference assessment with a side of zesty infographs / Natalie Haber
 Reference referral training stew: the perfect assessment mix / Lisa Vassady and Alyssa Archer
 Deconstructing reference statistics / Jen-chien Yu
 The secret ingredient: mystery shopping your service points / Candice Benjes-Small and Elizabeth Kocevar-Weidinger
 The missing piece: assessing implementation fidelity / Megan Hodge and Laura Gariepy
 Patron for a day (PFAD): a space assessment / Stephanie Hartman and Lisa R. Horowitz
 An assessment rubric inspired by The Four Seasons / Nathaniel King and Kelly Lutz
Section 8. Equipment, building, and space assessment
 Learning space ethnography study: observe students in the library to inform service and resource design / Victoria Raish
 Interview to the double: uncovering student motivations in the library / Kathleen Reed, Cameron Hoffman, and Meg Ecclestone
 Flipchart surveys / Laura Newton Miller
 Space invaders: measuring use and satisfaction through mixed methods / Susan Gardner Archambault
 Assessing student learning behaviors in informal learning spaces / Susan Beatty
 Culinary snapshots: assessing international student needs through photographs / Alyssa Berger, Ana Villar, and Danielle Rowland
 Don't just count 'em, sweep 'em! / Gricel Dominguez
 Library space cake / Jenny Horton
 Decorating the library cake: a space utilization study / Kellie Meehlhause
Section 9. Website and web services assessment
 Add webAIM and stir: assessing web accessibility for users with disabilities / Laura DeLancey
 Easy (no-bake) online card sorting / Samantha Rich
 But this is for the library: best practices for usability testing and library website design / Sojourna Cunningham, Regina Mays, and Holly Mercer
 Hear the people sing: communicating usability results to a large library audience / Sojourna Cunningham, Regina Mays, and Holly Mercer
 Farm to table: a recipe for website usability testing on a budget / Tiffany Davis, Jen Park, and Derek Sanderson
 Assessing your library website with usability testing / Brighid M. Gonzales
 Order takeout: virtual usability testing to meet your users where they are / Jennifer C. Hill and Anita Norton
 Lean usability testing for healthy website assessment / Alex Sundt
 Simple usability stir fry / Tim Broadwater and Jessica Tapia.

Dugan, R. E., Hernon, P., & Nitecki, D. A. Viewing Library Metrics from Different Perspectives: Inputs, Outputs, and Outcomes, Santa Barbara, CA: Libraries Unlimited, 2009.

 1. Introduction
 2. Related literature
 3. Assessment and evaluation
 4. The library perspective
 5. The customer perspective
 6. The institutional perspective
 7. The stakeholder perspective
 8. Benchmarking and best practices
 9. Metrics for marketing and public relations
10. Management information systems
11. Utilizing metrics: interpretation, synthesis, and presentation
12. The joys of metrics
Appendices
A. Inputs: library perspective (reported as numbers)
B. Inputs: library perspective (reported as ratios/percentages)
C. Outputs: library perspective (reported as numbers)
D. Outputs: library perspective (reported as ratios/percentages)
E. Selected examples of process metrics
F. Selected examples of trend metrics
G. Examples of qualitative metrics
H. Customer perspective
I. Institutional perspective
J. Stakeholder perspectives (examples)
K. Selected metrics used by libraries for benchmarking and best practices
L. Marketing and public relations
M. Selected metrics from library reports
N. Some metrics related to scholarly communication
O. COUNTER Code of Practice, release 3 (August 2008)

Instrument 1

Public Services User Satisfaction Survey

This survey measures user satisfaction with interactions at each of the Public Services desks or service points.
Permission for the use of this material was obtained from Northern Illinois University Libraries.

Instrument 2

Reference Desk Services Data Sample Form

This tool gathers weekly statistics on reference interactions.
Permission for the use of this material was obtained from Northern Illinois University Libraries.

Instrument 3

Reference Interaction Survey

This brief evaluation gauges satisfaction with the reference service.
Permission for the use of this material was obtained from Northern Illinois University Libraries.

Instrument 4

Research Consultations (Not at the Reference Desk) Form

This form gathers weekly tabulations of reference interactions that occur away from the traditional reference desk.
Permission for the use of this material was obtained from Nestor L. Osorio.

Instrument 5

Outreach/Liaison Activities Rubric

Qualitative and quantitative queries for library outreach and engagement for both individuals and programs.
Permission for the use of this material was obtained from the American Library Association.

Instrument 6

Subject LibGuides Usability Design

Four articles with practical applications of methodologies and techniques to evaluate LibGuides usability.

Instrument 7

Web Page Content and Design (Main Library Page)

"Usability Testing of a Responsively Designed Library Website" includes three instruments, a screening tool, task scenarios, and responsive design post survey.
Permission for the use of this material was obtained from Junior Tidal, New York City College of Technology Library.

Instrument 8

Services for Students with Disabilities

Books with adaptable surveys to assess services to users with disabilities.

Instrument 9

Web Page for Circulation - Design

Assessment of the usability of web page design including the evaluation of usefulness, value, and appropriateness of the content.
Permission for the use of this material was obtained from John Wiley & Sons Inc.

Instrument 10

Reference Transactions: Circulation Desk

Form to gather reference interactions at the circulation desk.
Permission for the use of this material was obtained from Nestor L. Osorio.

Instrument 11

Laptop circulation – Satisfaction Survey

Evaluation to gauge satisfaction with laptop computer circulation.
Permission for the use of this material was obtained from Nestor L. Osorio.

Instrument 12

Circulation: Turnaround of shelving

Form to gather information on shelving turnaround time.
Permission for the use of this material was obtained from Northern Illinois University Libraries.

Instrument 13

Assessment of Circulation Desk Service

Evaluation of service at the library circulation desk.
Permission to use this material was obtained from Columbia University Libraries.

Instrument 14

Web Page for Reserve Services - Design

Assessment of the usability of web page design including the evaluation of usefulness, value, and appropriateness of the content.
Permission for the use of this material was obtained from John Wiley & Sons Inc.

Instrument 15

Reserve Satisfaction Survey

Survey instrument that evaluates satisfaction with course reserves operations.
Permission for the use of this material was obtained from Northern Illinois University Libraries.

Instrument 16

Data of Traditional and Digital Reserve Items

Usage data for library course reserve operations.
Permission for the use of this material was obtained from the Council on Library and Information Resources, CLIR.

Instrument 17

Web Page for Document Delivery Services - Design

Assessment of the usability of web page design including the evaluation of usefulness, value, and appropriateness of the content.
Permission for the use of this material was obtained from John Wiley & Sons Inc.

Instrument 18

Reference Transactions: Document Delivery Services

Form to gather reference interactions for document delivery services.
Permission for the use of this material was obtained from Nestor L. Osorio.

Instrument 19

Survey of Space Utilization

Two examples of evaluation of library space utilization.
Permission for the use of this material was obtained from the Massachusetts Institute of Technology (MIT) Libraries.
Permission for the use of this material was obtained from the University of Oklahoma Libraries.

Instrument 20

Gate Count

Form for counting people in the library space.
Permission for the use of this material was obtained from Northern Illinois University Libraries.

Prepared by the 2018-2019 CARLI Public Services Committee: Rachel Bicicchi, Millikin University; Marissa Ellermann, Southern Illinois University Carbondale; Aaron Harwig, College of DuPage (Co-Chair); Chad Kahl, Illinois State University; Joanna Kolendo, Chicago State University (Co-Chair); Nestor L. Osorio, Northern Illinois University; Katherine Sleyko, Prairie State College; Nancy Weichert, University of Illinois at Springfield; Reina Williams, Rush University; Elizabeth Clarage, CARLI Staff Liaison; Denise Green, CARLI Staff Liaison.