Category Archives: user experience

Always Be Collecting Feedback

As part of our ongoing efforts to get to know our users better, the UCSF Library web team decided we wanted some of that sweet, sweet microfeedback action, and so we deployed a very short satisfaction survey at library.ucsf.edu back in July.

Anytime a patron clicked the How Are We Doing button at the bottom of every page, they were asked a simple question: How would you rate your experience today? Patrons could let us know if they were satisfied or unsatisfied with their web experience.

Regardless of their answer, they’d be prompted to go into detail and provide some demographic information, but the only required question was whether they were satisfied/unsatisfied. Our hope was that keeping the survey short and to the point, and constantly available on every page, would encourage participation.

The not-so-secret goal of this survey structure was for the web team to learn directly from our patrons where they’re having problems, so that we make improvements to the website based on their needs and not our own assumptions. Our thinking was that a user frustrated enough to leave a comment was a user we’d want to hear from.

Enough Background Already, What Did We Learn?

The stats below are from July 7 through August 23, 2015, the day before we introduced our new Help feature. We’re excluding overlapping dates from this analysis because the Help button began competing with the How Are We Doing button for user attention (more on this below), and we wanted to base our analysis on clean data.

Of the 201 responses received during that period, 65% had a satisfactory experience at our site. Hooray!

If we drill down to the 76% of respondents who shared their demographic information, the overwhelming majority of responses came from UCSF affiliates (94%), with Staff leading the way, closely followed by Faculty, then Students. It’s likely the data was skewed by the summer months, and it’ll be interesting to see whether the affiliation breakdown changes now that the fall semester is in full swing.

satisfaction-by-affiliation

Patron satisfaction is awesome, but remember our not-so-secret goal was to learn why our users are unsatisfied. While only 20% of all respondents left a comment, our hypothesis that frustrated users would be more likely to go into detail was correct: 87% of comments came from unsatisfied users. Hooray (really)!
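To make the percentages above concrete, here’s a small Python sketch of how one might tally this kind of microfeedback. The response records and field names are invented for illustration; our actual survey data lives in the feedback tool’s export, not in code.

```python
# Hypothetical tally of satisfaction-survey responses.
# Field names ("satisfied", "comment") are invented for this sketch.

def summarize(responses):
    """Compute satisfaction rate, comment rate, and the share of
    comments that came from unsatisfied respondents."""
    total = len(responses)
    satisfied = sum(1 for r in responses if r["satisfied"])
    commented = [r for r in responses if r.get("comment")]
    unhappy_comments = sum(1 for r in commented if not r["satisfied"])
    return {
        "satisfaction_rate": satisfied / total,
        "comment_rate": len(commented) / total,
        "share_of_comments_from_unsatisfied": (
            unhappy_comments / len(commented) if commented else 0.0
        ),
    }

# A tiny made-up sample, not real patron feedback.
sample = [
    {"satisfied": True},
    {"satisfied": False, "comment": "Couldn't find journal articles."},
    {"satisfied": False, "comment": "EZproxy keeps timing out."},
    {"satisfied": True, "comment": "Quick and easy, thanks!"},
]
stats = summarize(sample)
```

With real exports, the same rollup would produce the 65% satisfaction and 87%-of-comments-from-unsatisfied figures reported above.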

Unsatisfied Users are More Likely to Comment

What’s Making our Users Unhappy?

Most of the frustration came from patrons who felt that navigating the site and finding what they needed was just too hard. 2nd prize for user frustration goes to remote access issues, with patrons expressing difficulties getting to articles using EZproxy, UC-eLinks, and VPN.

Connection errors and Library service issues (comments like “you don’t have a book I want” and “my barcode number doesn’t work anymore”) tied for 3rd place, and I was personally amused and humbled to learn that 9% of the feedback was about how annoying patrons found the feedback survey popup window (removed after a few weeks).

Unsatisfied Users - Comments Breakdown
* If a patron gave feedback in more than one category, I picked the dominant issue.

So What’s Next?

We were fortunate to meet with some respondents, and we used their comments and other key takeaways from our satisfaction survey, the interviews we conducted with our customer-facing staff, and the LibQUAL 2013 survey to finalize the Library User Personas we’ll rely on as we make changes going forward.

We’ll keep our satisfaction survey going, but with the successful rollout of the Help feature, the time has come to modify how we ask for feedback. The How Are We Doing and Help buttons serve different purposes; unfortunately, the current design doesn’t make that distinction clear. Getting requests for help in the satisfaction survey gave us useful information before we rolled out the Help feature, but now it’s more appropriate for “help me, please” comments to be routed to the right person right away for faster customer service.

We’ll launch our more low-key request for feedback this week.

New Feedback Button
The new Feedback button will live at the bottom of every web page.

Redesign Your Website in 4,000 Easy Steps

Earlier this month, Rich Trott and I delivered a session at the University of California Computing Services Conference (UCCSC) in San Francisco about our experience using continuous iterative improvements and frequent feedback to keep our site fresh and meeting user needs. We talked about why this approach has been working better than the traditional complete redesign that might happen every few years (or not).

If you can’t wait to hear more, see the slides with notes.

Or if you’re more of the video type, you can check that out too.

Let us know about your experiences using this kind of approach to website upkeep and positive user experience. What works for your site or organization?

Photo by chexee

It’s All About Audience

Back in February, the new Web Projects Team made known our purpose and guiding principles. All of that still holds true, but we realized that “Support education and meet the research needs of our users regardless of location or device” might need some clarification. UCSF is an unusual academic institution, with more staff than students and no undergraduates, among other things. So who is the primary audience that the library supports?

Primary Audiences served by the Library

  1. Teaching faculty
    • usually also involved in clinical research or practice or basic science research
  2. Students in degree programs
    • professional students in medicine, pharmacy, nursing, and dentistry
    • graduate students in basic science
    • graduate students in social sciences, nursing, and history
  3. Researchers in basic science or clinical medicine
    • faculty
    • postdocs
    • PhD students
    • lab managers/research staff

Notice that there is a fair amount of overlap between audiences, with some people wearing multiple hats.

Of course there are others who use the Library too, for example, alumni, the public, visitors, Library staff, outside librarians, etc. They can all still benefit from parts of our site, but their needs will not drive decisions about how to structure our web pages and services. Ultimately, everything about the UCSF Library web should make it easier and more intuitive for the three audiences listed above to meet their research and education needs. All else is secondary, though not necessarily unimportant.

UCSF by the numbers

To define these audiences, we began by simply consulting the counts already provided by UCSF. However, those counts completely ignore Lab Managers and Research Assistants, who have many of the same library needs as postdocs. There are also other staff members who do a lot of legwork for faculty and therefore reflect the library needs of faculty even though they are not counted as such. And when you talk about “students,” you must realize that the library needs of a medical student are completely different from those of a social sciences PhD. This means the numbers are only a rough estimate for our purposes.

These less obvious realities were gleaned from talking to people. The Library already tends to focus a lot on the Service Desk and subject liaisons when thinking about user interactions. To balance that, we decided to interview a variety of other library employees who act as liaisons to various user segments with library needs. A big thank you goes out to these individuals who took the time to share their super-valuable insights about user work patterns, language, and challenges!

  • Megan Laurance on basic science researchers
  • Art Townsend on Mission Bay users
  • Ben Stever and Kirk Hudson on Tech Commons users
  • Polina Ilieva and Maggie Hughes on researchers of special collections and archives
  • Dylan Romero on those who use multimedia stations and equipment and the CLE

A few other sources of insight came from meetings of the Student Advisory Board to the Library, LibQual feedback, and the Resource Access Improvement group.

We also came to the conclusion that it is helpful to think about users in terms of what they DO rather than by title alone. It’s the nature of their work that really defines their needs regarding library support. Once again the numbers are a rough estimate, but the segmentation they reveal is still helpful.

Library Users

Next Steps

The Web Projects Team will continue to make iterative improvements to the Library web presence, some small and some larger, driven by our now established Purpose and Guiding Principles and through the lens of our primary audiences.

We will also regularly check feedback from end users via usage statistics and quick user tests, and that will, in turn, drive further improvements. In addition, we’ll continue to share updates on the evolution of the Library web and improvements to the user experience. If you have questions or comments on any of this, we’re all ears!

photo credit: Reuver via photopin cc

Redesigning the Legacy Tobacco Documents Library Site Part 1 — User Research

The Legacy Tobacco Documents Library site (LTDL) is undergoing a user-centered redesign.  A user-centered design process (a key feature of user experience, or UX) is pretty much what it sounds like: every decision about how the site will work starts from the point of view of the target user’s needs.

As a UX designer, my job begins with user research to identify the target users, and engaging with these users to identify their actual needs (versus what we might assume they want).

Prior to my arrival, the LTDL team had already identified three target users:

  • the novice: a newbie with little or no experience searching our site
  • the motivated user: someone who has not been trained in how to search our site but is determined to dig in and get what they need; unlike the novice, the motivated user won’t abandon their search efforts
  • the super user: someone who has gone through LTDL search training and knows how to construct complex search queries

Given this head start, I spent a few weeks conducting extensive user research with a handful of volunteers representing all three user types.  I used a combination of hands-off observation, casual interviews, and user testing of the existing site to discover:

    • what the user expects from the LTDL search experience
    • what they actually need to feel successful in their search efforts
    • what they like about the current site
    • what they’d like to change about the current site

Lessons learned will guide my design decisions for the rest of the process.  Below you’ll find excerpts from the User Research Overview presentation I delivered to my team:


In addition to engaging directly with users, I did a deep dive into the site analytics.  The data revealed the surprising statistic that most of the LTDL site traffic (75%) originated from external search engines like Google.  The data further revealed that once these users got to our site, they were plugging in broad search terms (like “tobacco” or “cancer”) that were guaranteed to return an overwhelming number of results.  This meant that most of our users were novices and motivated users, not the super users we were used to thinking about and catering to.

This information exposed the key problem to be solved with the LTDL redesign: how to build an easy-to-use search engine that teaches the user how to return quality results, without dumbing down the experience for our super users.