
Monday, October 29, 2012

Library Assessment Conference Poster

Hello everyone. I am pleased to announce that our poster "Using What You Already Collect: Library Data and Student Success" won a "Judges' Choice" award and the "People's Choice" award at the 2012 Library Assessment Conference. If you are interested you can check out the poster here:


Using What You Already Collect: Library Data and Student Success

Please let us know if you have any questions about it! Thanks to the judges and attendees at the Library Assessment Conference!



Monday, October 1, 2012

What about faculty and staff?

One of the main goals of this project is to tie student usage of libraries to academic success measures such as GPA and retention. However, one question we get a lot is: what about faculty and staff? Does our data show any usage patterns for these user populations?


The data we gather does not discriminate among types of U of M Internet IDs; we collect all of them and don't remove any based on whether the user is a student. So, of the 59,722 unique Internet IDs we collected in the Spring 2012 semester, how many represented faculty and staff? Can we get at that data?


Thanks to the U of M Office of Institutional Research, yes we can! The following table shows usage percentages for user populations other than students for Spring 2012:









Staff type          Usage percentage
Faculty             73%
Grad Assistants     93%
Civil Service       23%
Professional        51%
Administrative      34%



When sharing this data, people are usually amazed that we have registered such a high percentage of faculty use, or they are amazed (some are even disgusted) that not 100% of the faculty use the library. Keep in mind that we do not have a perfect count of the people or Internet IDs that use the library; we can only analyze the Internet IDs we successfully gather. Some library services do not require an Internet ID (reference desk transactions, for example), and from a campus IP address you can reach many of our resources through a bookmark or a Google search. Having said that, we are reasonably confident that if you are a library user, at some point during the semester we are going to capture your Internet ID. If we had perfect usage tracking that captured every use by everyone at the University, these percentages would likely be higher, but I'm guessing not by much.


We can also determine which colleges and departments have the highest and lowest usage. For example, 90% of the School of Nursing faculty made use of the library during Spring 2012, while only 30% of the School of Dentistry faculty did. The School of Dentistry's 30% was by far the lowest faculty usage; no other school or department fell below 69%. Other high faculty usage was seen in the College of Liberal Arts (87%), the College of Science and Engineering (77%), and the College of Food, Agriculture, and Natural Sciences (76%).


That's all for now. We are having fun swimming in all this data!


Thursday, September 20, 2012

Spring 2012 Data

Hello everyone! It has been a long time, but we finally have new data and information to share about this project: the Spring 2012 numbers are in! But first, let's take a look back at the raw data from Fall 2011:





So, in the library access areas we measured, we captured 146,126 Internet IDs (61,195 unique) across 1,548,209 transactions. With this data, we found that 77% of undergrads and 85% of grad students made use of the library during the Fall 2011 semester.


Now let's take a look at Spring 2012:





In the Spring 2012 semester, we captured 1,646,836 transactions, almost 100,000 more than in the Fall 2011 semester. As might be expected, the increase came mainly from website, database, and e-journal usage.

You'll also note that we attempted to count transactions in more library access categories, such as students who receive our introduction-to-libraries instruction through their writing course, or students who had a consultation with our Archives and Special Collections department. These new library access points did not have a huge impact on our overall numbers, but they are nice to have for comparison and for determining who uses these services the most.

One thing this chart does not list is the number of unique Internet IDs captured in the Spring 2012 semester. That number is 59,722, so slightly fewer unique Internet IDs than in the Fall. Some of that difference can certainly be attributed to student graduation and attrition. In fact, from Fall to Spring we saw 43,015 returning users and 16,707 new users, and some of those new users were most certainly new undergrads and grad students just starting at the U of M in January.
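
For the curious, the returning/new split is just set arithmetic on the two semesters' unique Internet IDs. A minimal sketch in Python (the sets below hold placeholder values, not our real data):

```python
# Placeholder data: in practice each set holds the unique Internet IDs
# captured in one semester's extracts.
fall_ids = {"user1", "user2", "user3"}
spring_ids = {"user2", "user3", "user4"}

returning = fall_ids & spring_ids   # captured in both semesters
new_users = spring_ids - fall_ids   # first captured in Spring
# Returning plus new users account for every Spring ID (43,015 + 16,707 = 59,722).
assert len(returning) + len(new_users) == len(spring_ids)
```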

In the next few days we will write some entries about the demographics of this Spring 2012 data, namely who these students are and which colleges and majors are represented. But as a little teaser, what we have found so far is that 78% of undergrads used the library in Spring 2012, one percentage point higher than Fall 2011, and 87% of graduate students did, two percentage points higher than Fall 2011.

More to come soon!

Friday, June 29, 2012

Evidence of Impact

I was signing up for an ELI webinar when I noticed this on the EDUCAUSE site--

"Seeking Evidence of Impact (SEI) is a program lead by the ELI community to find current effective practices that would enable the collection of evidence to help faculty and administration make decisions about best practices to adopt, invest their time, effort and fiscal resources in....We hope to bring all types of higher education institutions and professional associations into a conversation on this theme. We envision an inclusive discussion that includes faculty members, instructional support professionals, librarians, students, and research experts in a collaborative exchange of insights and ideas."

http://www.educause.edu/eli/programs-and-resources

Have any libraries been part of this so far?

As I was reading this over, I liked the mention of "making decisions." Sometimes I feel our need to collect data makes us lose sight of the reasons and next steps. For my concrete brain...data collection and analysis should be driven by decision making. I also think this organization (EDUCAUSE/ELI) is doing some great work on behalf of libraries. I think many users lump the Libraries into the broad category of "technology," even if that isn't necessarily how we see ourselves.



Monday, June 18, 2012

Good for you?

I tripped across this reference to some of our work on the blog for the Walden University Library...I love the title Is Using The Library Good For You? Yes!

I have been thinking of ways to share our results with new students during our efforts at Orientation and this might hit the right note.

Read at: http://www.waldenlibrarynews.com/blog/2012/6/13/is-using-the-library-good-for-you-yes.html

Friday, June 8, 2012

Gathering Reference Statistics: A Balance of Privacy

As mentioned earlier, reference is one of our high-touch, possibly high-impact services. It is also an area of great privacy concern. How do you gather reference statistics? We started with the low-hanging fruit, which was also the least intrusive: chat reference.

Like many institutions, the University of Minnesota uses QuestionPoint for chat and e-mail reference. When a patron poses a question via chat, they are not required to include an e-mail address, and when they do, it is not necessarily their designated University of Minnesota address. Still, at least we have some data to use. Many other institutions use services such as Meebo, which rely on anonymous chat handles, making data gathering much more difficult. QuestionPoint transactions are accessible for only 90 days, which means we need to go in and retrieve the data at numerous points during the semester.
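
For those wondering how a chat transcript turns into an Internet ID: when a patron does supply a umn.edu address, the local part of that address is the Internet ID. Here is a rough sketch of that filtering step in Python (the CSV export layout and the 'patron_email' column name are assumptions, not QuestionPoint's actual schema):

```python
import csv

def internet_ids_from_export(path):
    """Pull U of M Internet IDs out of a chat/e-mail transaction export.

    Assumes a CSV with a 'patron_email' column; only @umn.edu addresses
    can be mapped to an Internet ID, so anonymous or outside addresses
    are simply skipped.
    """
    ids = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            email = (row.get("patron_email") or "").strip().lower()
            if email.endswith("@umn.edu"):
                ids.add(email.split("@", 1)[0])  # local part = Internet ID
    return ids
```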

How do we gather in-person reference transaction statistics while protecting privacy and maintaining a welcoming atmosphere? Please post suggestions in the comments below.

Friday, June 1, 2012

High touch services

We are working on gathering data for spring, and I am struck again by how challenging it is to gather meaningful statistics on some of our core services--reference/consultation and instruction.

These are high-touch and potentially some of the highest-impact services we provide, but our data collection techniques for these face-to-face interactions are almost nonexistent.

What we did in Fall....
For Reference we gathered email and chat transactions from QuestionPoint--the software we use to facilitate these services.
BUT...
-email addresses aren't required
-U of M email addresses aren't required
-no data on desk transactions in any location
-no data on consultations or small group sessions with librarians

For Instruction we gathered our workshop *registrations* and the *class lists* of courses that had a one-shot/guest lecture sort of session.
BUT...
-not matched up with actual attendance or any last-minute drop-ins
-missing entire classes due to confusion over section, professor, etc.
-missing non-course sessions (e.g. dept. orientation sessions, clinical groups, etc.)
-challenging to make multiple sessions obvious in reports

Bottom line: We are missing tens of thousands of interactions and uses of our libraries.

So...I have been asking myself....

  • Can we ask students to swipe with each session or consultation or question at the desk?
  • How do we adequately reflect these high-touch, nuanced interactions?
  • How do we demonstrate the range of activities in these categories (from a 5-minute interaction to multiple class sessions or an hour-long consultation)?


Do you have any ideas?

An addendum to NSSE

We hope to learn more about this effort and determine how other existing assessments can be overlaid (is that the right word?) with our own efforts...

At ALA in Anaheim in June....

Title: Feasible, Scalable, and Measurable: Information Literacy Assessment & the National Survey of Student Engagement (NSSE)
Date: Monday, June 25, 2012, Time: 8-10:00 a.m., Location: ACC-203A

Description: While academic libraries struggle with assessing efforts to improve students' information skills, few assessment opportunities offer longitudinal and comparative data regarding information literacy. NSSE, a leading postsecondary assessment survey, has collaborated with librarians to create the Information Literacy Module, an addendum to the survey. The module is designed to facilitate efforts to assess undergraduates' information literacy and compare their results to other institutions. NSSE researchers and librarians will provide an overview of NSSE and the module, suggest methods to incorporate NSSE data into assessment efforts, and invite participant feedback to improve the information literacy module.

Outcomes:
Participants will be introduced to the draft survey questions for the NSSE information literacy module
Participants will learn how to incorporate NSSE data into their assessment efforts
Participants will have an opportunity to ask questions and provide feedback regarding the module's content

Presenters:
Polly Boruff-Jones (Drury University)
Carrie Donovan (Indiana University Libraries)
Kevin Fosnacht (NSSE)

Friday, May 25, 2012

Pre-Demographic Results

When we began work on the Library Data and Student Success project, we knew that we'd need to combine the data we collected with demographic and performance data tracked by the Office of Institutional Research (OIR). But even before we matched students with their library use, we were having fun with the numbers.

To recap, we collected U of M Internet IDs from 13 different service and resource areas during the Fall 2011 semester. Each of these 13 datasets has its own set of caveats regarding exactly what we can capture. For example, if a librarian has an instruction session with a particular class, anyone registered for that class is counted as receiving Course-Integrated Librarian Instruction. We have no way of knowing which students were in the room that day, so we're most likely over-counting in that area. On the flip side, reference staff don't typically log the patron's Internet ID in an in-person reference transaction, and the ID is sometimes not included even in an online transaction, so reference interactions are most likely under-counted.

With those caveats in mind, we set about doing what analysis we could while we waited for the OIR analysts to perform their magic. We added each of the 13 datasets to a Microsoft Access database as a table. Each table consisted of a list of Internet IDs. Using just these tables, we were able to determine the answers to questions like these:

  • How many individuals interacted with the Libraries in any measurable way?

  • How many interactions of each type did each individual have?

  • What was the total number of interactions of each type?

  • How many different types of interactions did an individual have with the Libraries?

  • How many individuals who did one thing (were registered for a course with course-integrated library instruction, for example) also did another (checked out a book)?
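
We answered these questions with Access queries, but the same logic is easy to sketch with plain Python sets. The file names and access areas below are illustrative placeholders (only a few of the 13 areas are shown), each file assumed to hold one Internet ID per line:

```python
# Illustrative sketch of the overlap questions described above.
AREAS = ["circulation", "database", "ejournal", "ebook", "instruction"]

def load_ids(area):
    """Read one access area's list of Internet IDs into a set."""
    with open(f"{area}_fall2011.txt") as f:
        return {line.strip() for line in f if line.strip()}

usage = {area: load_ids(area) for area in AREAS}

# How many individuals interacted with the Libraries in any measurable way?
everyone = set().union(*usage.values())
print("unique users:", len(everyone))

# How many different types of interaction did each individual have?
types_per_user = {uid: sum(uid in ids for ids in usage.values())
                  for uid in everyone}
print("used only one area:", sum(1 for n in types_per_user.values() if n == 1))

# How many people who had course-integrated instruction also checked out a book?
print("instruction and circulation:", len(usage["instruction"] & usage["circulation"]))
```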



These questions were addressed and answered in aggregate, but having the Internet ID in each log allowed us to separate individuals from interactions. Here are some of the tidbits we were able to calculate:

  • 61,195 individuals used the Libraries in some measurable way in Fall 2011.

  • 10,455 people accessed an e-book, and 21,993 checked out or renewed at least one item.

  • 38,328 people used a database, and 30,105 accessed an e-journal.

  • 23,807 people used only one of the 13 service and resource areas. 2,774 used six or more.

  • 47,197 people used some type of digital resource (database, website, e-journal, or e-book) for a total of 1,110,727 digital interactions.



Tuesday, May 1, 2012

A word about privacy

Last Friday, April 27, our Library Data and Student Success team presented our findings at ARLD Day at the Minnesota Landscape Arboretum. Our session was preceded by a presentation from two representatives of the ACRL Value of Academic Libraries project, so the day flowed quite nicely. First, attendees were given a broad overview of how they can start measuring the value of their libraries, and then our presentation highlighted a tangible example of a library doing just that.



Our presentation went well, but questions and comments from attendees made it clear that privacy implications are a big hold-up for other libraries considering this kind of work. Obviously, to do this work a library must track usage in ways it maybe hasn't before, including retaining some user information about material check-outs and renewals. This should come as no surprise: most libraries have strict policies in place that prohibit retaining user data around material circulation and resource/service usage in general.



When we started this project at the University of Minnesota, we quickly realized that we needed to alter our privacy practices while still maintaining a baseline of privacy our users would be comfortable with. To be crystal clear: we realized we needed to retain user information, namely the U of M Internet ID attached to each resource or service use. By retaining the Internet ID we could then get at the demographic and success measures we were seeking for our users.



What we also realized, however, is that we didn't need to retain exactly what our users checked out or accessed. We only needed to tie user IDs to broad activities such as "checking out a book" or "using an ejournal." In other words, we are not retaining specifics about user activity. Maybe this table will help describe our efforts further:



We kept this:                      But not this:
Checked out X books                Actual book titles
Attended X workshops               Actual workshops
Reference interaction              Substance of interaction
Logged into library workstation    Date, location, duration
Used an ejournal                   Actual ejournal title
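
In code terms, the retained record is nothing more than an (Internet ID, activity category) pair; every item-level detail is dropped before anything is stored. A minimal sketch, with hypothetical field names:

```python
# Hypothetical raw event shape: {"internet_id": "...", "title": "...", "timestamp": ...}
# Only the ID and a coarse activity label survive the reduction.
def reduce_for_retention(raw_events, activity):
    """Keep (internet_id, activity) pairs and discard the specifics."""
    return {(event["internet_id"], activity) for event in raw_events}

# e.g. reduce_for_retention(circ_log, "checked_out_item") records who checked
# something out, but not which title, when, or where.
```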



Hopefully this makes our activities clearer. To be blunt, we had to stretch our privacy policies to make this project a reality. For the first time, we are retaining some user information in order to find out 1) who our users are, 2) what types of resources they use, and 3) how this use impacts their success in the classroom. There is no way this project could have happened if we didn't tie actual users to their library activities in some way. However, we are confident that user privacy is still being maintained. Data is only being reported in the aggregate, and the data we are retaining is being kept in a secure location with the Office of Institutional Research.



So far, so good. Any questions? Let us know!



Thursday, April 19, 2012

Library Technology Conference Presentation

On March 14, 2012 we gave our first presentation concerning the data we've gathered so far at the Library Technology Conference, held yearly at Macalester College in St. Paul, MN.



The presentation below contains audio and the slides we presented. It also includes the question-and-answer session for those of you who can't get enough! Enjoy!



Tuesday, April 17, 2012

Welcome to the wonderful world of library data and student success!

During the Fall Semester 2011, the University of Minnesota Libraries gathered U of M Internet IDs in 13 different service and resource areas. This site will describe what we found.



Welcome to our project site! We hope this site acts as a clearinghouse of sorts for all the data and correlations we have collected so far, and what we can collect and share in the future.



To whet your appetite, here is a graph showing "any library use" by students in each college, program, and school at the U of M during the Fall 2011 semester. Note that across the university, no college, program, or school dropped below 60% library usage for graduate, undergraduate, or professional students.



[Image: usebylevelandcollege.jpg -- library use by student level and college]


