HAPLR (Hennen's American Public Library Ratings) has ranked public libraries on several factors for many years, attempting to compare libraries apples to apples by counting books checked out, walk-in business, program attendance, computer usage, and budget allocation, among many other factors. Library Journal has also gotten into the game and provided its own ranking system. My interpretation of the intent is to reward libraries for doing a good job and to encourage other libraries to emulate their success.
Every year, libraries across the country must report to their state library. This data is then sent to the federal government and posted at the National Center for Education Statistics (http://nces.ed.gov/surveys/libraries/).
A problem with the numbers
A major problem with this statistical measure is the reporting itself. Every library in the United States must report on a number of indicators every year to its state government. Sometimes these numbers determine funding from the state; sometimes they don't. The survey isn't very clear about what it is asking, and the information collected locally doesn't always mesh with what is being asked. Some of the questions are dated and do not reflect how libraries operate today: the number of internet terminals and the number of computer users are only very recent additions. "Reference questions asked" hasn't changed either, even though many librarians now answer far more than traditional reference questions, particularly ones involving technology, yet the federal government does not want that tracked.
Antiquated and confusing questions can result in numbers that are far too high, far too low, or simply not reported. As the Library Journal controversy demonstrated, it wasn't logical that one library had 16 million computer users in a 12-month period. Quite a bit of the data may seem illogical, but since this information isn't tied to funding and libraries don't benefit from the scoring system, there is no incentive to take the report seriously. Why, then, is everyone ranking libraries based on this data?
Statistics or Success
With Hennen and now Library Journal ranking libraries, the effect is to push an agenda that defines excellence by statistics rather than by the factors that actually lead to success. Success based on statistics cannot be emulated. Furthermore, awarding stars to these libraries doesn't help their budgets or their success locally, nor does it reflect why they are successful. The information behind the scoring is also quite dated: the data available to Hennen and Library Journal is typically two or even three years old. So a library's score doesn't reflect what the library is doing currently, but what it was doing years ago.
It was one thing for a measure such as HAPLR to record the scores; it's another thing entirely to star the libraries. It makes libraries focus on the wrong things: statistical markers. It also does something else. I remember when this information was first distributed and one library colleague commented:
"Is it really stating that libraries aren't doing their jobs well? That they aren't successful?"
I enjoyed the discussion on PUB-LIB, with my favorite comment coming from James Casey:
"Susan's gut instinct to look behind the statistics is most assuredly useful for any of us pondering the HAPLR and LJ rating results. That is clearly evident when HAPLR gives bouquets of praise to libraries serving upscale, suburban communities such as Cuyahoga County (Suburban Cleveland) earning a rating of 893 and Baltimore County at 794 while those excellent public libraries serving large and often underprivileged urban populations draw abysmal scores like 293 for Detroit Public Library, 285 for Chicago Public Library and 385 for Baltimore's Enoch Pratt Public Library. One has to look behind the statistics to get at reality.
If one were to examine only statistical and output measures in rating the performance of Presidents of the United States (for example), Lincoln would have to be considered one of the most inept and unsuccessful of our Presidents instead of being one of the greatest. The thousands of deaths, enormous destruction of property, military blunders, idiot generals hired, civil liberties curtailed, etc. have to be considered within the context of the overwhelming difficulties and challenges he had to overcome just to save the Union and bring Slavery to an end. --- The work of urban libraries in struggling against ignorance, crime, shrinking property tax bases, crumbling schools, ward politics, etc. is just as heroic and certainly not fairly represented by such dismal scores as 285 and 293."
Some libraries will never be ranked at the top. In fact, over the last ten years the top ten libraries have simply been trading places: few have fallen off the list, and few new libraries have joined it. What are the real factors behind this? Statistics can never demonstrate that. There are many factors in this success, and many may have nothing to do with the library's performance. It might be the affluence of the community, the percentage of college-educated patrons, or simply the local culture of the area. A library's success can be tied to its community's success, and that is very difficult to measure.
Libraries make a difference in their communities
Librarians are rock stars
I would agree with this statement. When librarians or library administrators are prominently noticed and respected in the community, that is a true sign of library success. When the local Rotary or Kiwanis group calls several times wanting your library director to speak at their next function, when they are champing at the bit to provide funding for the library, when organizations and businesses want to be included and fund your projects because it makes them look good, and when the youth librarians get mobbed Beatles-style by little kids when they are out and about in the community: those are signs of success, and of librarians being rock stars.
According to the OCLC report From Awareness to Funding, those numbers are not what matters (you can read my lengthy analysis here). I think these statistics are unhelpful and distracting, and we certainly shouldn't be awarding stars based on this data. It doesn't really compare libraries, and it doesn't explain why one library is more successful than another. Library success depends on the perception and support of the community, not on a national ranking system. A high ranking may add a bit of heft to a performance review, but people in your community coming forward to support the library is far more powerful.
2 comments:
I think that most front-line librarians would agree with your analysis. Now if only library management would read this post, we might get somewhere.
Thanks,
rcn
SF Bay Area
I do know that library administrators have often commented on this rating system, and most are very unhappy with it. Library Journal's entry is very recent, so maybe this is good timing to get them to rethink the approach.