ASSOCIATES (2012, March, v. 18, no. 3)
Food for Thought
Visual Shelf Reading: One Library’s Process
Linda D. Lemery
Mary B. Blount Library,
Averett University, Danville, VA
Introduction. In a perfect world, all libraries would have books on the shelves in perfect call number order according to the Library of Congress (LOC) or other classification designation on their spines. However, it’s not a perfect world. While the value of shelving materials in order (and of shelf reading to maintain that order) is not in question (Kendrick 1991, 16; Purvogel 1988, 131, 139; Ristau 1988, 39; Sharp 1992, 177), having full-time library faculty and staff maintain that order themselves appears not to be possible (Banks 1990, 39-40; Bennett, Buxton, and Capriotti 1979, 4-6). Yet shelving order must somehow be maintained.
Many university libraries delegate visual shelf reading (VSR) to student assistants with oversight from regular library personnel. Attempts have been made at developing within-institution systems of shelf reading or standardization (Anderson 1998, 2; Bennett, Buxton, and Capriotti 1979, 4-5; Kendrick 1991, 16; Lowenberg 1989, 25; Sharp 1992, 180-181). The newer radio-frequency identification (RFID) tagging and detection technology might be cost-effective for larger collections, but not for smaller ones. Blount Library’s circulating print book collection is small, at about 100,000 volumes, so its order can be maintained only through VSR by a dedicated team of student assistants. This article shares Blount Library’s VSR process with readers.
Components of the System. The components of the VSR system include student assistants, stacks assignments, expectations, monitoring, quantitative measures, data entry, and qualitative checking.
Student Assistants. Blount Library has 23 dedicated and increasingly self-directed student assistants. They know how they will be evaluated (Lemery 2008, 457-461) and what is expected of them (Figure 1). They determine how they will meet or exceed those expectations. The evaluation rubric has already been published (Lemery 2008, 451-462).
Stacks Assignments. The Circulating Collection is divided into groups of stacks based on previously assessed shelving book density (Figure 2). Student assistants choose a group of stacks for VSR over the course of a semester or academic year. Groups of stacks containing the more frequently circulated collections (such as juvenile fiction or nonfiction) have multiple student assistants assigned to them. Notice in Figure 2 that while Sharon Pink has at least three stacks to maintain, George Orange and Sheila Blue share responsibility for the Juvenile Nonfiction Collection because its shelving density is greater. Similarly, Sally Lavender, Michael Green, Jason Red, and Nicole Black split responsibility for the Juvenile PZ Fiction Collection.
Expectations. Student assistants understand on hire that they are expected to perform in the job at levels of “Good” or better, which for VSR is quantitatively defined as spending an average of at least 30 minutes per week on VSR. Expectations begin after students successfully complete the three quizzes on the LC Easy 3.0 training module (with scores of 90% or higher) and have been instructed in how to do VSR and in the importance of aligning book spine edge with shelf edge (Curran 1988, 618).
With 23 student assistants, 30 minutes of expected VSR per week per student translates to at least 11.5 hours per week of student time devoted to maintaining the circulating and reference collections in LOC classification number order. Students trade off covering the Circulation Desk and doing other tasks such as VSR. Having discretion over how they arrange their workflow seems to act as a job satisfier.
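The staffing arithmetic above can be sketched in a few lines (the figures come from the article; the script itself is purely illustrative):

```python
# Illustrative arithmetic: 23 student assistants, each expected to
# average at least 30 minutes of VSR per week.
students = 23
minutes_per_student = 30

total_minutes = students * minutes_per_student  # 690 minutes
total_hours = total_minutes / 60                # 11.5 hours

print(f"{total_hours} hours of VSR coverage per week")
```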
Monitoring. Evaluations are conducted twice per semester: once 4-6 weeks into the semester (interim) and once after the semester is over (final). The interim evaluation serves as an opportunity to recognize and reinforce student strengths, and to realign student energies as necessary. Quantitative VSR achievement levels of “Good” or better are satisfactory and require no plan for improvement; lower achievement levels are targeted in a plan for improvement. Students embrace the system because they know what they must do to succeed.
Quantitative Measures. Wide acceptance of VSR assessment hinges on having objective measurements in place that are fair and accurate (Figure 3). Expectations are entered weekly (e.g., 30 minutes = “1”; 45 minutes = “1.5”; 15 minutes = “0.5”). Most expectations are entered as “1” since most students work in Circulation and thus have multiple Circulation duties (for example, a student who works half-time each in Technical Processing and in Circulation would have a VSR expectation of 0.5 per week entered in the VSR spreadsheet; see the George Orange example in Figure 3).
When student assistants perform VSR events, they enter the event data on the VSR participation log (Figure 4). Students decide how many VSR minutes they do at a time; some prefer doing 60 minutes if scheduling permits (2.0 actual entered in the VSR monitoring spreadsheet), while others prefer doing the bare minimum of 30 minutes (VSR monitoring spreadsheet 1.0) or less. A cumulative system like this requires that spreadsheet data be updated frequently.
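The minutes-to-units conversion and the running weekly total described above can be sketched as follows. This is a minimal illustration, assuming one spreadsheet unit equals 30 minutes of VSR as stated in the article; the function name and the sample log are hypothetical, not the library’s actual spreadsheet:

```python
# One spreadsheet unit = 30 minutes of VSR, so 45 minutes = 1.5 units,
# 60 minutes = 2.0 units, and so on.

def vsr_units(minutes):
    """Convert VSR minutes to spreadsheet units (30 minutes = 1.0)."""
    return minutes / 30

# A hypothetical week of logged VSR events for one student, in minutes.
events = [60, 30, 45]

# Cumulative actual units for the week, as the monitoring spreadsheet
# would accumulate them from the participation log.
actual = sum(vsr_units(m) for m in events)
print(actual)  # 4.5 units
```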
Data Entry. In developing this library monitoring system, the Circulation Manager had input from student assistants, as well as help from an outside student Excel expert, in developing and perfecting the embedded formulas that translate numeric assessments into qualitative ratings (Figure 5). Providing developmental input generated student assistant buy-in. Selecting one student per semester to enter the data on an access-restricted computer as a special project, and rewarding that student with VSR activity minutes on a minute-for-minute basis, removed the data entry responsibility from the Circulation Manager’s workload.
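A nested-IF formula of the kind described above can be approximated in Python as a cascade of conditions mapping a numeric result to a rating. Only the “Good” label appears in the article; the other labels, the thresholds, and the function name are illustrative assumptions, not the library’s actual formula:

```python
# Hypothetical stand-in for the spreadsheet's nested IF formula, which
# maps a numeric result (actual VSR units vs. expected units) to a
# qualitative rating. An Excel equivalent might look like:
#   =IF(actual/expected>=1.5,"Excellent",
#       IF(actual/expected>=1,"Good","Needs improvement"))

def vsr_rating(actual, expected):
    """Translate cumulative VSR units into a qualitative rating."""
    if expected == 0:
        return "No VSR expectation"
    ratio = actual / expected
    if ratio >= 1.5:
        return "Excellent"
    elif ratio >= 1.0:
        return "Good"
    else:
        return "Needs improvement"

print(vsr_rating(18.0, 12.0))  # Excellent
print(vsr_rating(12.0, 12.0))  # Good
print(vsr_rating(6.0, 12.0))   # Needs improvement
```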
Qualitative Checking. System integrity is compromised if the data entered on the VSR participation log sheets (Figure 4) is not honest or accurate. One element safeguarding data integrity is night supervisor input in the evaluation process. Their comments can indicate issues with student work ethic, which are then addressed through discussion and mentoring. Another safeguarding element is a qualitative checking system whereby a random sampling of each VSR event listed on the VSR log is quality-checked for accuracy. The qualitative checking procedure design is elegant but beyond the scope of this article.
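The random-sampling step above can be sketched with Python’s standard library. The log entries here are invented for illustration (the student names are taken from Figure 2 as described earlier); the article does not specify the sample size or log format:

```python
import random

# Hypothetical VSR participation log entries to be spot-checked.
log = [
    {"student": "Sharon Pink", "stacks": "A-B", "minutes": 30},
    {"student": "George Orange", "stacks": "Juvenile Nonfiction", "minutes": 45},
    {"student": "Jason Red", "stacks": "Juvenile PZ Fiction", "minutes": 60},
    {"student": "Nicole Black", "stacks": "Juvenile PZ Fiction", "minutes": 30},
]

# Draw a random sample of logged VSR events for quality-checking.
sample = random.sample(log, k=2)
for event in sample:
    print(event["student"], "-", event["stacks"])
```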
In Conclusion. The VSR procedure keeps the print books in order at Blount Library. From the viewpoint of student assistants, the monitoring is objective, timely, systematic, and fair; encourages self-directed work; and gives student assistants control over when they do their VSR work. From the viewpoint of the Circulation Manager, book collections are in better order, data ratings correspond directly to the evaluation system, student assistants are engaged and accountable, data entry is maintained by trusted student assistants, and the system runs smoothly with minimal intervention and is close to self-maintaining. While the log sheet and monitoring spreadsheet required much iteration on the design end, the spreadsheets are now very useful and much clearer. That Averett students now go to the shelves and rapidly find what they want is mute testimony to the effectiveness of this stacks maintenance system.
Contact Information and Acknowledgment: Reader correspondence is welcome and may be addressed to: Linda D. Lemery, Circulation Manager, Mary B. Blount Library, Averett University, 344 West Main Street, Danville, VA 24541. Email: firstname.lastname@example.org. The author thanks household Excel expert David Lemery (age 14 at the time of VSR project development) for developing the initial nested IF statement model adapted for use in the VSR Workload Spreadsheet, and Elaine Day, Library Director, Mary B. Blount Library, Averett University, Danville, VA, for giving her the freedom to develop Circulation processes and for reviewing this manuscript.
Anderson, D. R. 1998. Method without madness: Shelf-reading methods and project management. College & Undergraduate Libraries 5(1): 1-13.
Banks, J. 1990. Shelf-reading: A pilot study. Collection Management 13(1-2): 39-46.
Bennett, M. J., Buxton, D. T., & Capriotti, E. 1979. Shelf-reading in a large, open-stack library. The Journal of Academic Librarianship 5(1): 4-8.
Curran, C. 1988. Edging: Art or science? American Libraries 19(7): 618.
Kendrick, C. L. 1991. Performance measures of shelving accuracy. The Journal of Academic Librarianship 17(1): 16-18.
Lemery, L. D. 2008. Student assistant management: Using an evaluation rubric. College & Undergraduate Libraries 15(4): 451-462.
Lowenberg, S. 1989. A comprehensive shelf reading program. The Journal of Academic Librarianship 15(1): 24-27.
Purvogel, C. 1988. Shelf-reading in rare book libraries: Nonessential or a necessity? Collection Management 10(1-2): 131-140.
Ristau, H. 1988. Keep your shelves in order: Techniques for training pages. School Library Journal 34(9): 39-43.
Sharp, S. C. 1992. A library shelver’s performance evaluation as it relates to reshelving accuracy. Collection Management 17(1-2): 177-192.