Competency N: “evaluate programs and services using measurable criteria”
INTRODUCTION
“Multifaceted and rigorous assessment that seeks to measure meaningful impact in appropriate ways will also prove to be the best way to demonstrate value” (Stenström, 2015, p. 277).
As libraries compete with other information sources in the twenty-first century, it’s critical that they’re able to accurately assess their impact and demonstrate value to their stakeholders and communities. Regular and strategic assessment of programs and services is essential to the success and sustainability of any information organization. Stenström (2015) breaks down her definition of value into the following three categories:
- User satisfaction
- Economic impact
- Social impact
In order to better explore how meaningful assessment practices improve information organizations, I will look at the implementation of a culture of assessment through strategic planning and service integration as well as how to use professional guidelines and data as assessment metrics.
Culture of Assessment
In their seminal work, Lakos & Phipps (2004) advocate for creating a culture of assessment in one’s library or information organization. They provide the following definition: “A culture of assessment is an organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders” (p. 352). Effective strategic planning is one way to build assessment into a library’s organizational culture. In my first piece of evidence, I include a comparative analysis of the strategic plans of two public libraries. Both the Vancouver Public Library and the Oceanside Public Library conducted widespread community outreach to learn the information needs and wants of their patrons and potential patrons when designing the outcomes of their strategic plans. Additionally, the Vancouver Public Library incorporated key performance indicators for each outcome so that it could measure progress toward its strategic goals.
The move toward a culture of assessment was more deliberate in my second piece of evidence, an examination of academic library case studies that reflect the implementation of programs driven by assessment practices. In an attempt to improve the quality and efficiency of their services, four university libraries engaged in plans to examine and, potentially, reorganize how they executed assessment in their organizations (Tatarka, Chapa, Xin, & Rutner, 2010). All four institutions in this case report had designated Assessment Librarians or Analysts who created a team to prioritize how assessment was practiced and achieved. Similarly, in an initiative to improve information literacy competency at the University of California, Los Angeles, the initiative’s five librarians organized into five groups focused on informing, teaching, reaching, collaborating, and, notably, measuring to achieve project goals.
Looking at the various strategic planning practices detailed in the evidence above, it’s clear that the organizations adhered to Lakos and Phipps’s contention that successfully integrating a culture of assessment requires designing organizational systems that:
- ensure a focus on customers
- enable shared learning
- measure results
- use information from the external environment for internal decision-making
Measuring for Success
One useful approach to evaluating programs and services is framing service assessment through the standards established by professional guidelines. In my third piece of evidence, I include a discussion post from INFO 210: Reference and Information Services wherein I examined my experience with phone reference services in the context of the Reference and User Services Association’s (RUSA) guidelines for the reference interview. In detailing my transaction with the service providers, I align each of their actions and statements with specific guidelines in the professional document concerning visibility/approachability, interest, listening/inquiring, and follow-up to demonstrate how adhering to professional evaluative standards can produce effective reference services.
Data aggregation and analysis have also become increasingly popular methods of organizational evaluation. “Libraries must be able to measure their outcomes and systematically make technology, budget allocation, service, and policy decisions based on a range of data – needs assessment data, customer evaluation data, stakeholder expectation data, and internal process and organizational effectiveness data” (Lakos & Phipps, 2004, p. 346). User services are one avenue for data collection. In 2010, following an organizational restructuring at Illinois State University’s Milner Library, media services, stacks maintenance functions, and three paraprofessionals were brought into the Access Services (AS) department. While much of the department’s staff had previously worked behind the scenes, “after the reorganization, each staff member was expected to participate in daily 2-hr rotations at the circulation desk” (Long, 2012). Because the circulation desk was now staffed by workers with widely varying levels of experience, the department conducted a patron satisfaction assessment to determine how the circulation desk’s services could be improved. The assessment took the form of a voluntary survey, to which the AS staff received 386 responses. While the circulation desk was rated very positively overall, the results indicated that faculty and graduate students preferred interacting with library staff over student assistants and that courtesy among all circulation desk workers could be improved. In response, the AS department developed a customer service plan to be implemented in current practice and in future training sessions.
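To illustrate the kind of aggregation such a survey supports, here is a minimal Python sketch. The data are hypothetical, invented for illustration only (not the actual Milner Library responses); it simply shows how tallying ratings by patron group can surface the group-level differences the survey revealed.

```python
from collections import defaultdict

# Hypothetical survey records: (patron group, satisfaction rating 1-5).
# Illustrative data only -- not drawn from the Milner Library study.
responses = [
    ("undergraduate", 5), ("undergraduate", 4), ("undergraduate", 5),
    ("graduate", 3), ("graduate", 4),
    ("faculty", 3), ("faculty", 5),
]

# Group ratings by patron type so differences between groups (like the
# faculty/graduate preference noted in the survey results) become visible.
by_group = defaultdict(list)
for group, rating in responses:
    by_group[group].append(rating)

group_means = {g: sum(r) / len(r) for g, r in by_group.items()}
for group in sorted(group_means):
    print(f"{group}: n={len(by_group[group])}, mean={group_means[group]:.2f}")
```

Even a tally this simple turns raw responses into the kind of actionable, group-level evidence on which a customer service plan can be based.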
However, relying solely on data analysis can be hazardous to an organization. “The big data movement allows correlations to be made in ways never previously imagined” (Stenström, 2015, p. 274). Yet in his article on the fallacies of big data, Strauß (2015) cautions that correlation is not the same as causation and that the interpretation of data is more important than the data itself (p. 838). Strauß’s article is a fascinating look at the big data mythos and the fallacies that accompany this popular method of analysis. He acknowledges that technical developments including social networks, mobile devices, cloud computing, and apps have generated quantities of data that have never previously existed on such a scale (p. 837). As such, it’s exciting to consider the plethora of ways that data can be mined and analyzed through algorithmic power to assess problems and solutions. Strauß uses the example of computer-based language translation services (e.g., Google Translate or Babelfish) to depict the shift from an artificial intelligence paradigm to an algorithmic paradigm and to emphasize that a fundamental challenge of big data is correct interpretation of the information it provides (p. 839). He stresses that information by itself doesn’t tell us everything and, as such, has the potential to tell us anything, including things that are not factual or that distort reality. To interpret big data effectively, one must acknowledge that data are always a temporal construct, that a suffusion of similar data can suggest something is true that is not necessarily true, and that data may represent valid facts without being facts per se. Essentially, Strauß argues that context is everything when considering big data, but it’s easy, even tempting, to ignore context in the rush to make connections and draw conclusions.
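The correlation-versus-causation caution can be made concrete with a short sketch (illustrative only, not drawn from either article): given enough unrelated data series, some will correlate strongly with any target purely by chance, which is exactly the trap of mistaking a suffusion of similar data for something being true.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed directly."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)  # fixed seed for reproducibility
target = [random.random() for _ in range(10)]

# Screen 10,000 unrelated random series against the target. With this
# many candidates, some will correlate strongly purely by chance --
# a "relationship" with no causal basis whatsoever.
best = max(
    pearson([random.random() for _ in range(10)], target)
    for _ in range(10_000)
)
print(f"strongest chance correlation found: {best:.2f}")
```

The strongest correlation found this way is high despite every series being pure noise: the more data one screens, the more such spurious connections appear, which is why interpretation and context matter more than the correlations themselves.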
Even so, “this notion of using big data to make decisions and demonstrate value speaks to data literacy and the growing importance of understanding how data are used to answer complex questions” (Stenström, 2015, p. 274). Stenström’s and Strauß’s shared focus on big data analysis shows that it has become a viable method of assessment, even if there is tension in the discourse on its value.
EVIDENCE
Strategic Plan Comparative Analysis (INFO 204: Information Professions)
The strategic plan comparative analysis was a group project completed for INFO 204 wherein two classmates and I identified a domestic library and an international library and studied their published strategic plans, as well as additional information available through their official websites, to assess how each organization developed goals and outcomes and how it intended to measure success through key performance indicators.
As this was a group project, one member was responsible for the information about the Oceanside Public Library, I was responsible for the information about the Vancouver Public Library, and the third team member was responsible for synthesizing our contributions into a final paper. However, circumstances prevented that team member from performing her role in the group, so I organized and wrote the final paper, which I’m including as evidence. With the exception of the Mission statement section (pp. 5-6), all of the text in the paper is my own original writing.
Planning for the Future (INFO 204: Information Professions)
The assignment for this paper involved evaluating planning trends at more than one library to generate insight into how libraries approach the challenges and opportunities facing their organizations, so that I would be better prepared for my role as an information professional. Through my research, I discovered that an emphasis on the value of a culture of assessment has begun to shape how libraries approach the implementation of new programs and services, as evidenced by the case studies included in the paper.
Discussion Post – Phone Reference (INFO 210: Reference and Information Services)
During my reference services course, we engaged with a different method of reference service each week and shared our experiences through a discussion post on Canvas. As evidence for Competency N, I submit my discussion post detailing a phone reference transaction wherein I contacted the Los Angeles Public Library to inquire about information sources relating to the Freedom of Information Act. Throughout the post, I explicitly align the behaviors of the service provider with specific guidelines enumerated in RUSA’s reference interview guidelines to demonstrate how adherence to professional standards produces effective service.
CONCLUSION
As Lakos and Phipps wrote in 2004, “in the current external environment, libraries are challenged to be nimble, innovative, responsive, proactive, and most of all, able to demonstrate their value” (p. 346). This assertion is even more accurate in today’s information landscape. Assessment is therefore a critical piece of any library’s organizational environment. One way an information organization can develop and integrate a culture of assessment into its practice is through the following blueprint:
- organization’s mission and planning are externally focused
- organizational planning documents include how performance measures are assessed and are outcomes-oriented
- leadership commits to and supports assessment activities
- staff recognize value of assessment and engage it regularly
- relevant data and user feedback are routinely collected, analyzed, and used in decision making
- services, programs, policies are evaluated for quality, impact, efficiency
Additionally, these criteria can serve as evidence that an organization has successfully embraced assessment best practices to the benefit of its information community.
References:
Lakos, A., & Phipps, S. (2004). Creating a culture of assessment: A catalyst for organizational change. portal: Libraries and the Academy, 4(3), 345-361.
Long, D. D. (2012). “Check this out”: Assessing customer service at the circulation desk. Journal of Access Services, 9(3), 154-168.
Stenström, C. (2015). Demonstrating value. In S. Hirsh (Ed.), Information services today: An introduction (pp. 271-277). Lanham, MD: Rowman & Littlefield.
Strauß, S. (2015). Datafication and the seductive power of uncertainty – A critical exploration of big data enthusiasm. Information, 6(4), 836-847.
