Ethical Discovery in Today’s Digital Age

Librarianship | Arthur Hayden | July 30, 2019


EBSCO User Experience Researcher Arthur Hayden outlines a recent presentation he gave at UXLibs 2019 on ethical discovery in the digital age.

There has never been a more exciting time to be a user experience (UX) professional in the library space. The digital age continues to offer nearly limitless new opportunities to capture the world’s knowledge and facilitate connections between scholars thousands of miles apart. As the industry adapts to the changing landscape, now is the time to innovate and build creative platforms that harness new technologies — giving researchers a robust set of tools that expand their abilities to learn, write, and share their work. The UX design process is perfectly suited to build these tools and address the changing needs of library patrons.

Discovery platforms arose to help sift through the overwhelming amount of scholarly content available online. They offered search-engine solutions with the scholarly principles and authority of libraries. But discovery has continued to evolve and user expectations along with it. Relevancy-ordered lists of books, articles, videos — this content is but the bare minimum of what is now expected from knowledge providers. New patrons enter the library space craving not only access to scholarly resources, but also a greater understanding of the context surrounding their specific research topic. A political science student writing a thesis on fossil fuels in the Caribbean expects to find plenty of relevant materials to cite in her paper, but also wants to see other related terms being queried by her peers, newly published works from the most prominent authors in the field, and further resources that help her grasp the historical, political, and social impact of her topic. All this, from a single search of “natural gas trinidad and tobago”.

This request is, from the knowledge provider's perspective, intriguing but fraught with challenges. Historically, this work would be done by subject matter experts: individuals who have dedicated years of their lives to understanding that context and can point inquisitive patrons in the right direction. In the absence of such experts, discovery tools turn to mathematical solutions, algorithms that combine metadata calculations with other heuristics to make educated guesses about which content is most relevant to the user. When done well, a discovery platform enriches the research process, opening pathways to new resources that patrons may not have located on their own. It's a tantalizing proposition: fast access to deep content with broad contextual substance.
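To make that concrete, here is a deliberately toy relevance ranker, a minimal sketch in Python that blends a simple term-match score with metadata signals such as recency and peer-review status. The record fields, weights, and scoring formula are invented for illustration and do not describe EBSCO's (or any real platform's) ranking algorithm; the point is only that every number in the blend encodes a human judgment.

```python
# Illustrative sketch only: a toy relevance ranker blending term matching
# with metadata signals. All fields, weights, and the formula itself are
# invented for illustration and do not reflect any real discovery platform.
from dataclasses import dataclass


@dataclass
class Record:
    title: str
    abstract: str
    year: int
    peer_reviewed: bool


def relevance(query: str, record: Record, current_year: int = 2019) -> float:
    terms = query.lower().split()
    text = f"{record.title} {record.abstract}".lower()

    # Text signal: fraction of query terms found in the record's text fields.
    term_hits = sum(1 for t in terms if t in text) / max(len(terms), 1)

    # Metadata signals: favor recent, peer-reviewed material.
    recency = 1.0 / (1.0 + max(current_year - record.year, 0))
    authority = 1.0 if record.peer_reviewed else 0.5

    # Hand-tuned blend: each weight is an editorial judgment, not a neutral fact.
    return 0.6 * term_hits + 0.25 * recency + 0.15 * authority


records = [
    Record("Natural gas markets in Trinidad and Tobago", "LNG exports and policy", 2018, True),
    Record("Caribbean energy policy overview", "Fossil fuels, oil, and natural gas", 2005, True),
    Record("Trinidad carnival traditions", "Cultural history of the islands", 2019, False),
]

query = "natural gas trinidad and tobago"
for r in sorted(records, key=lambda r: relevance(query, r), reverse=True):
    print(f"{relevance(query, r):.2f}  {r.title}")
```

Even in this tiny example, the weights quietly decide what a patron sees first: with this particular blend, the recent but off-topic carnival record edges ahead of the older energy-policy overview. Scale that up to millions of records and hundreds of signals, and the editorial weight of those choices becomes clear.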


But increased reliance on such a universal tool places enormous responsibility in the hands of the knowledge provider. Granting the provider the power, and the trust, to decide what is relevant may create a self-fulfilling environment. Patrons, in their quest for efficiency, could easily perceive suggested content as the best content, failing to consider where that content comes from and missing resources that present contrasting viewpoints. While it may be tempting to downplay personal responsibility behind the guise of automation, even artificial intelligence and machine learning introduce human bias.

In an article published in Science,¹ researchers from Princeton describe deriving machine intelligence from systems that scan text on the open web. They found that the machines could uncover patterns in the text and "learn" complex, human-like associations between words. For example, the system would identify "flowers" as more "pleasant" than "insects." In this way, the machine imbues words with certain properties without having any direct experience of the world. Such technology has obvious applications within the field of discovery, offering more ways to interpret user queries in a fraction of the time. But the researchers also observed the system learning stereotyped biases as easily as any other word association; for example, it paired traditionally female names with words pertaining to family and home life, while traditionally male names provoked career-themed words. The team urged caution when building systems in which machine learning impacts decision-making.
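To give a sense of how such associations are quantified, here is a minimal sketch in the spirit of the word-embedding association test the researchers describe: it measures whether target words sit closer, in vector space, to one attribute set than another. The three-dimensional vectors below are fabricated toy values standing in for embeddings actually learned from web-scale text; only the mechanics of the comparison, averaged cosine similarity, mirror the real test.

```python
# Illustrative sketch of a word-embedding association test. The tiny
# hand-made vectors are fabricated for demonstration; real studies use
# embeddings trained on large web corpora.
import numpy as np


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


def association(word: np.ndarray, attrs_a: list, attrs_b: list) -> float:
    # Positive: the word sits closer to attribute set A; negative: closer to B.
    return (np.mean([cosine(word, v) for v in attrs_a])
            - np.mean([cosine(word, v) for v in attrs_b]))


# Fabricated 3-d "embeddings": flower words lean toward pleasant words,
# insect words toward unpleasant ones, mimicking the pattern the paper reports.
vectors = {
    "rose":     np.array([0.9, 0.1, 0.2]),
    "tulip":    np.array([0.8, 0.2, 0.1]),
    "spider":   np.array([0.1, 0.9, 0.3]),
    "wasp":     np.array([0.2, 0.8, 0.2]),
    "pleasant": np.array([1.0, 0.0, 0.1]),
    "lovely":   np.array([0.9, 0.1, 0.0]),
    "nasty":    np.array([0.0, 1.0, 0.1]),
    "awful":    np.array([0.1, 0.9, 0.0]),
}

pleasant = [vectors["pleasant"], vectors["lovely"]]
unpleasant = [vectors["nasty"], vectors["awful"]]

for word in ["rose", "tulip", "spider", "wasp"]:
    print(f"{word:>6}: {association(vectors[word], pleasant, unpleasant):+.2f}")
```

A word scores positive here purely because of where the surrounding text placed it; swap the targets for personal names and the attributes for career and family terms, and the same arithmetic surfaces the gendered associations the team reported. The machine has no opinion of its own, it simply amplifies the patterns in what it reads.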

Expanding access to research tools that both enable discovery and promote information literacy skills, especially for marginalized groups and non-English speakers, is a critical part of addressing these problems. We must acknowledge existing failures of representation, in academia and beyond, knowing that these biases will seep into the equation if not explicitly accounted for. Knowledge providers must accept the reality that automation does not remove bias. Otherwise, we risk creating an echo chamber that is merely more digitized, and doing further disservice to people struggling against oppressive power structures.

An ethical discovery experience is one that marries the algorithmic and the human. By leveraging both subject matter experts and systematic calculations, such a discovery platform enriches the research process. By offering relevant content, broad related subjects, and contextual pathways to new resources, it not only meets users' immediate needs but also encourages them to grow as researchers without prescribing ideology.


1. Caliskan, Aylin, et al. “Semantics Derived Automatically from Language Corpora Contain Human-like Biases.” Science, vol. 356, no. 6334, Apr. 2017, pp. 183–186.

Arthur Hayden
EBSCO Senior UX Researcher

Arthur Hayden is a Senior UX Researcher and has been working at EBSCO Information Services since 2013. While he isn't a librarian, he is passionate about sharing knowledge with the world and enjoys working toward that goal. Outside of work, he enjoys most sports and traveling to new places.
