Against the Grain v30 #1 February 2018

Page 1

c/o Katina Strauch, Post Office Box 799, Sullivan’s Island, SC 29482

ANNUAL REPORT, PLA Issue, Volume 30, Number 1

ISSN: 1043-2094


FEBRUARY 2018

“Linking Publishers, Vendors and Librarians”

Library Platforms by Trey Shelton (Chair, Acquisitions & Collections Services, George A. Smathers Libraries, University of Florida) <tshelton@uflib.ufl.edu>

This issue of Against the Grain focuses on library platforms. When I first began soliciting contributions for this issue, the authors-to-be often asked what exactly the term library platforms encompassed. The term is intentionally broad, and includes any tool libraries use to offer services and/or content, whether homegrown or otherwise acquired. Most library platforms are generally back-end or behind-the-scenes tools designed to assist in the operations of the library. However, many, if not most, also feature some version of a public interface. These systems may focus on a particular function, such as an Institutional Repository (IR) or an Electronic Resource Management System (ERMS), or may aim to provide solutions for multiple areas of a library, such as an Integrated Library System (ILS). For the purposes of this issue, library platforms do not include systems designed and hosted by publishers or content aggregators for their own content and/or services, a theme Against the Grain will address in an upcoming issue.

The plethora of inadequate legacy systems and cobbled-together processes with which many libraries continue to operate makes migrating to a new or upgraded system enticing. At the same time, these older systems and processes also make such a migration more daunting and complex (unfortunately, libraries do not usually have the luxury of simply scrapping everything and starting anew). Once a library identifies a need for a new system, the wide range of library platforms available, the varied applications thereof, and the rapid pace of mergers and consolidations in this sector of the market can be overwhelming. Resources such as Breeding’s Library Systems Report 2017¹ provide an excellent overview of developments in the library platform realm, but, generally speaking, libraries often find it difficult to locate timely, unbiased, and practical information. This issue aims to provide useful information on a variety of library platforms from authors with varied backgrounds and experiences.

If Rumors Were Horses

Happy days! ATG is using a new platform that allows a better reading experience for the full print issue! You can flip pages, zoom, click links, and download a PDF. Check it out and let us know what you think! And feel free to share with others — the

Maddie welcomes a new friend to the farm: Buckley the potbellied pig. Maddie is the daughter of Leah Hinds (Executive Director, Charleston Conference).

Dec/Jan print issue is available open access for two weeks as a trial. Here is the link: http://www.against-the-grain.com/v29-6-full-issue/

Saw that the hard-working Joe Esposito of Processed Media has merged forces with the diligent Michael Clarke to form Clarke & Esposito. Clarke & Esposito will concentrate principally on strategic consulting services related to professional and academic publishing and information services. Joe Esposito, Processed Media’s principal, is Senior Partner in the new firm. Clarke & Company’s Michael Clarke is the new firm’s Managing Partner. Also joining the combined firm as Partner is Pam Harley, who has been working with Clarke & Company. “We have been working together for three years now, and it made sense to formalize the relationship,” Joe Esposito said. “Working together we continued on page 6

The introduction of Library Services Platforms² (LSPs), also frequently referred to as Next Generation Integrated Library Systems (Next-Gen ILSes), offers significant promise in terms of streamlined workflows and automated processes. The vision this product segment represents is grand, and LSP providers are working fiercely to keep up with competitors and meet librarian expectations. Peter McCracken, co-founder of Serials Solutions and currently E-Resources Librarian at Cornell University’s Olin Library, kindly agreed to share his perspective with ATG readers on the latest development in the continued on page 8

What To Look For In This Issue:

Best Practices in College Student Development....................................... 42
Getting Attention................................ 46
Substantial Similarity......................... 52
What Happens After Short-Term Loan Withdrawal................................ 63
Snow Plows or No Plows - What’s a City to Do?.......................................... 69
News and Announcements for the Charleston Library Conference......... 71
ALA Midwinter Meeting 2018........... 72

Interviews
Pat Sabosik......................................... 39

Profiles Encouraged
Up and Comer, People, and Company Profiles................................................ 74

Plus more............................... See inside

1043-2094(201802)30:1;1-U


Celebrating 30 Years of Scholarly Knowledge Creation & Dissemination

www.igi-global.com

InfoSci-Journals®

A Collection of 22,000+ Quality, Vetted Scholarly Articles from 175+ Journals

Annual Subscription Price for New Customers: $5,950
Receive a 25% Discount When You Sign Up for an IGI Global Online Librarian Account: $4,462

Features Include:
• No Set-Up or Maintenance Fees
• Guarantee of No More Than 5% Annual Increases
• COUNTER 4 Usage Reports
• Complimentary Archival Access
• Free MARC Records
• Unlimited Simultaneous Users
• No Embargo of Content
• Full-Text PDF and XML Format
• No DRM

Subject Coverage:
• Business & Management
• Computer Science & Information Technology
• Education
• Engineering
• Environmental, Agricultural & Physical Sciences
• Government & Law
• Library & Information Science
• Media & Communications
• Medical, Healthcare, & Life Sciences
• Security & Forensics
• Social Sciences & Humanities

For more information, visit www.igi-global.com/infosci-journals or contact eresources@igi-global.com.

For Librarian Customers

Sign Up for An IGI Global Online Librarian Account and Receive a 25% Discount on All Purchases

IGI Global offers Online Librarian Accounts for quick and easy print purchases, plus instant access to thousands of electronic titles through the award-winning InfoSci® platform.

Visit www.igi-global.com/login/create-account/ for more details. Sign up for newsletters at www.igi-global.com/newsletters

• A 25% discount on all purchases made directly through IGI Global!
• Multiple payment options.
• Receive an extra 10% discount on recently released titles within the first year!
• Product referrals and recommendations.
• First-rate customer service.
• Much more!

facebook.com/igiglobal

twitter.com/igiglobal

linkedin.com/company/IGIGlobal



Against the Grain (ISSN: 1043-2094) (USPS: 012-618), copyright 2017 by Against the Grain, LLC, is published six times a year in February, April, June, September, November, and December/January by Against the Grain, LLC. Business and Editorial Offices: PO Box 799, 1712 Thompson Ave., Sullivan’s Island, SC 29482. Accounting and Circulation Offices: same. Call 843-509-2848 to subscribe. Periodicals postage is paid at Charleston, SC. POSTMASTER: Send address changes to Against the Grain, LLC, PO Box 799, Sullivan’s Island, SC 29482.

Editor:

Katina Strauch (College of Charleston)

Associate Editors:

Cris Ferguson (Murray State) Tom Gilson (College of Charleston) John Riley (Consultant)

Research Editors:

Judy Luther (Informed Strategies)

Assistants to the Editor:

Ileana Strauch Toni Nix (Just Right Group, LLC)

Editor At Large:

Dennis Brunning (Arizona State University)

Contributing Editors:

Glenda Alvin (Tennessee State University) Rick Anderson (University of Utah) Sever Bordeianu (U. of New Mexico) Todd Carpenter (NISO) Bryan Carson (Western Kentucky University) Eleanor Cook (East Carolina University) Anne Doherty (Choice) Ruth Fischer (SCS / OCLC) Michelle Flinchbaugh (U. of MD Baltimore County) Joyce Dixon-Fyle (DePauw University) Laura Gasaway (Retired, UNC, Chapel Hill) Regina Gong (Lansing Community College) Chuck Hamaker (UNC, Charlotte) William M. Hannay (Schiff, Hardin & Waite) Mark Herring (Winthrop University) Bob Holley (Retired, Wayne State University) Donna Jacobs (MUSC) Lindsay Wertman (IGI Global) Ramune Kubilius (Northwestern University) Myer Kutz (Myer Kutz Associates, Inc.) Tom Leonhardt Rick Lugg (SCS / OCLC) Jack Montgomery (Western Kentucky University) Bob Nardini (ProQuest) Jim O’Donnell (Arizona State University) Ann Okerson (Center for Research Libraries) Rita Ricketts (Blackwell’s) Jared Seay (College of Charleston)

ATG Proofreader:

Rebecca Saunders (College of Charleston)

Graphics:

Bowles & Carver, Old English Cuts & Illustrations. Grafton, More Silhouettes. Ehmcke, Graphic Trade Symbols By German Designers. Grafton, Ready-to-Use Old-Fashioned Illustrations. The Chap Book Style.

Production & Ad Sales:

Toni Nix, Just Right Group, LLC., P.O. Box 412, Cottageville, SC 29435, phone: 843-835-8604 fax: 843-835-5892 <justwrite@lowcountry.com>

Advertising information:

Toni Nix, phone: 843-835-8604, fax: 843-835-5892 <justwrite@lowcountry.com>

Publisher:

A. Bruce Strauch

Send correspondence, press releases, etc., to: Katina Strauch, Editor, Against the Grain, LLC, Post Office Box 799, Sullivan’s Island, SC 29482. phone: 843-723-3536, cell: 843-509-2848. <kstrauch@comcast.net>

Against the Grain is indexed in Library Literature, LISA, Ingenta, and The Informed Librarian. Authors’ opinions are to be regarded as their own. All rights reserved. Printed in the United States of America. This issue was produced on an iMac using Microsoft Word, and Adobe CS6 Premium software under Mac OS X Mountain Lion. Against the Grain is copyright ©2018 by Katina Strauch

4

Against the Grain / February 2018

Against The Grain TABLE OF CONTENTS v.30 #1 February 2018 © Katina Strauch

ISSUES, NEWS, & GOINGS ON

Rumors.................................................. 1
From Your Editor................................. 6
Letters to the Editor............................. 6
Deadlines............................................... 6

FEATURES Library Platforms — Guest Editor, Trey Shelton

Library Platforms................................ 1

by Trey Shelton — The term is intentionally broad, and includes any tool libraries use to offer services and/or content, either homegrown or otherwise acquired. Most library platforms are generally back-end or behind-the-scenes tools designed to assist in the operations of the library.

Current and Future Library Catalogs: An Introduction to FOLIO..................12

by Peter McCracken — At Cornell University, the library spent nearly 70% of its 2016/17 materials budget on electronic resources. Many institutions spend a much higher percentage on electronic resources. But too many libraries — including Cornell — do not yet have tools that can effectively manage the result of this significant shift in spending.

Optimizing an Octopus: A Look at the Current State of Electronic Resources Management and New Developments in the CORAL ERM System............. 18

by Heather Wilson — The reality is that electronic resources management (ERM) is dynamic, unstable, and unpredictable, and that is if the librarian is lucky enough to work in a library that promotes innovative approaches to those processes.

Homegrown Search Results and Platforms............................................. 21

by Elizabeth Siler — Although we have some of the most powerful search engines in the world in OPACs, Discovery Services, and the all-powerful Google, these resources do not always direct users to the unique or specialized content provided by the library. To help promote these resources, libraries have come up with some unique ways to display and market this content by creating local search results and databases.

Evaluating IR Platforms: Usability Criteria for End Users and IR Managers............................................ 25

by Rachel Walton — How can we evaluate IR platforms?

Beyond Bibliographic Discovery: Bringing Concepts and Findings into the Mix................................................ 28

by Athena Hoeppner — Despite encompassing a significant percentage of library content and richly featured interfaces with sophisticated relevance ranking algorithms and an ever-growing suite of functions, Web Scale Discovery (WSD) does not meet every type of user need, as evidenced by the continued existence and use of subject indexes and other databases.

Vendor Platforms — Tools for Efficient Library Acquisitions........... 29

by Justin Clarke — In the past 25 years, the library supply vendor community has embraced emerging technologies to advance their online platforms used by customers to identify and obtain resources for their patrons.

Op Ed.................................................. 32

Open Data, the New Frontier for Open Research by Tim Britton — Why is it that open data is not yet the norm given that it is fifteen years since the National Institutes of Health (NIH) released its trail-blazing 2003 Statement on Sharing Research Data?

Back Talk............................................ 78

Balancing the Books by Jim O’Donnell — Our books have a problem. Whatever competition there is for attention from other and newer media, our books are competing with themselves. How do we keep the love alive?

REVIEWS

Book Reviews...................................... 40

Monograph Musings by Regina Gong — In this issue books reviewed include Inherent Strategies in Library Management and Renew Yourself: A Six-Step Plan for More Meaningful Work.

Collecting to the Core........................ 42

Best Practices in College Student Development by Dr. Beth A. McDonough and Dr. April Perry — This column highlights monographic works that are essential to the academic library within a particular discipline, inspired by the Resources for College Libraries.

Booklover............................................ 43

Japanese Art by Donna Jacobs — Have you read An Artist of the Floating World by Kazuo Ishiguro?

Wryly Noted....................................... 44

Books About Books by John Riley — A look inside Photographed Letters on Wings: How Microfilmed V-Mail Helped Win World War II by Tom Weiner with Bill Streeter.

Oregon Trails...................................... 45

Favorite Books of 2017 by Thomas W. Leonhardt — This is a list of Tom’s top 17 books!

<http://www.against-the-grain.com>


ATG SPECIAL REPORT

Messing with Success: The Charleston Library Conference FutureLab............ 34

by Mark Sandler — Through a variety of more or less standard techniques, Charleston Future Lab facilitators sought to surface a degree of consensus around a handful of themes from the hundred-plus submitted statements.


ATG INTERVIEWS & PROFILES

Profiles Encouraged........................... 74

In this issue we have one up and comer profile, many people profiles, and two company profiles.

LEGAL ISSUES
Edited by Bryan Carson, Bruce Strauch, and Jack Montgomery

Cases of Note — Copyright............... 52

Substantial Similarity by Bruce Strauch — This one is about designing greeting cards with pumpkins and fir trees. Substantial Similarity or not?

Questions and Answers...................... 52

Copyright Column by Laura N. Gasaway — Many relevant questions and answers. Fascinating question about a WikiLeaks PDF and copyright.

PUBLISHING

Bet You Missed It............................... 10

by Bruce Strauch — What do dissertations and biometric litigation have in common? Read about it here!

The Scholarly Publishing Scene........ 46

Getting Attention by Myer Kutz — What is or should be the role of the popular press in article transmission? What is the future direction of the journals business?

And They Were There........................ 47

Reports of Meetings — In this issue we have a report on the @Risk Forum by Tony Horava, an in-depth report on the Academic Publishing in Europe (APE) conference by Anthony Watkinson, and the first batch of reports from the 2017 Charleston Conference compiled by Ramune Kubilius and provided by a group of conference reporters.

BOOKSELLING AND VENDING

Biz of Acq............................................ 55

Planning for the Unplannable: Meeting Unexpected Collections Budget Demands by Christine K. Dulancy — What to do in midyear when the DDA budget runs out of funds?

Both Sides Now: Vendors and Librarians........................................... 56

Reinvention by Michael Gruenberg — Rod Stewart gets Mike’s vote as the poster child of reinvention.

Optimizing Library Services............. 58

How Information Scientists Can Help Fix the Broken Peer Review Process by Jeremy Horne — A common core of standards and recommended practices should be developed and applied appropriately to various disciplines that all peer reviewers in their respective areas would use.

Being Earnest with Collections......... 63

What Happens After Short-Term Loan Withdrawal by Carol Joyner Cramer — What happens when you drop content from your DDA pool?

Squirreling Away: Managing Information Resources & Libraries.....69

Snow Plows or No Plows - What’s a City to Do? by Corey Seeman — This new column intends to provide an eclectic exploration of business and management topics relative to the intersection of publishing, librarianship and the information industry.

TECHNOLOGY AND STANDARDS

Let’s Get Technical............................. 65

Improving Electronic Resources Management Communication with Chat Solutions by Michael Fernandez — Michael explains the challenges he and his staff faced with communicating among themselves and the steps that were taken to improve communication.

Library Analytics: Shaping the Future.................................................. 67

Applying Data Analysis: Demonstrate Value, Shape Services, and Broaden Information Literacy by John McDonald and Kathleen McEvoy — Are we making use of anonymous user data?

ETC.

Charleston Comings and Goings...... 71

News and Announcements for the Charleston Library Conference by Leah H. Hinds


ALA Midwinter Meeting 2018.......... 72

February 9-13, Denver, CO by Leah H. Hinds

Uncommon ...

Against the Grain is your key to the latest news about libraries, publishers, book jobbers, and subscription agents. ATG is a unique collection of reports on the issues, literature, and people that impact the world of books, journals, and electronic information.

Unconventional ...

ATG is published six times a year, in February, April, June, September, November, and December/January. A six-issue subscription is available for only $55 U.S. ($65 Canada, $95 foreign, payable in U.S. dollars), making it an uncommonly good buy for all that it covers. Make checks payable to Against the Grain, LLC and mail to: Against the Grain, c/o Katina Strauch, Post Office Box 799, Sullivan’s Island, SC 29482. *Wire transfers are available (email <kstrauch@comcast.net> for details); however, credit cards are the preferred alternative to checks ($25 fee applies).

Name _______________________________________________
Address _______________________________________________
City          State   Zip _______________________________________________
Company        Phone _______________________________________________
Email _______________________________________________

ATG Interviews
Pat Sabosik, General Manager, ACI Scholarly Blog Index......................... 39



From Your (apologizing) Editor:

Charleston has had to deal with some rough weather over the past several months. We on Sullivan’s Island had a lot of snow for us (5 inches!) and it didn’t melt quickly like it was supposed to. We have a column in this issue by Corey Seeman about how our colleagues in Michigan deal with the snow with snowplows, etc. But we in Charleston are inexperienced and without equipment. The Dec/Jan issue of ATG was for ALA Midwinter and we made it all right. But we must do our subscription renewals before working on the February print issue of ATG, and with ALA Midwinter occurring later than usual this year we now find ourselves a bit behind and running much too slowly with getting the February issue out. In fact we may have set a record, and for that we do apologize!

Still, this special issue focuses on library platforms and is excellent. It is guest edited by the amazing Trey Shelton (who runs the Charleston Premiers during the Charleston Conference) and includes articles by Peter McCracken (LSPs and FOLIO), Elizabeth Siler (homegrown platforms), Rachel Walton (IR platforms), Heather Wilson (Octopus and ERMS development), Athena Hoeppner (beyond bibliographic discovery), and Justin Clarke (vendor platforms). Our Op Ed is by managing editor for open research Tim Britton to kick off Love Data Week. The glorious book-loving Jim O’Donnell talks about keeping our books in the competition for attention. Our Special Report spotlights the first Charleston Future Lab 2017 initiative about the future of the industry, led by the dynamic Mark Sandler and the energetic Heather Staines. It is our intention to continue this in Charleston 2018. Anthony Watkinson (who never seems to rest!) has

Letters to the Editor

Send letters to <kstrauch@comcast.net>, phone or fax 843-723-3536, or snail mail: Against the Grain, Post Office Box 799, Sullivan’s Island, SC 29482. You can also send a letter to the editor from the ATG Homepage at http://www.against-the-grain.com.

Dear Editor: Many thanks for this message. I’ve appreciated having the subscription over the years I’ve attended the Charleston Conference. I will retire at the end of March and would like to transfer the subscription to a colleague. Is this possible?

Priscilla Treadwell (Director of Digital Sales, Princeton University Press) <Priscilla_Treadwell@press.princeton.edu>

Dear Priscilla: Of course! Thanks! Katina, Yr. Ed.

AGAINST THE GRAIN DEADLINES
VOLUME 30 — 2018-2019

Issue                  2018 Events            Ad Reservation   Camera-Ready
April 2018             MLA, SLA, Book Expo    02/15/18         03/08/18
June 2018              ALA Annual             04/05/18         04/26/18
September 2018         Reference Publishing   06/14/18         07/05/18
November 2018          Charleston Conference  08/16/18         09/06/18
Dec. 2018-Jan. 2019    ALA Midwinter          11/08/18         11/26/18

FOR MORE INFORMATION CONTACT Toni Nix <justwrite@lowcountry.com>; Phone: 843-835-8604; Fax: 843-835-5892; USPS Address: P.O. Box 412, Cottageville, SC 29435; FedEx/UPS ship to: 398 Crab Apple Lane, Ridgeville, SC 29472.

reported on the Academic Publishing in Europe (APE) conference (see And They Were There, which begins on p.47). Our interview is with the enterprising Pat Sabosik, who is also becoming a master gardener in her spare time! The awesome Lolly Gasaway answers many legal questions, Biz of Acq is about planning, Being Earnest is about withdrawal of short-term loans, Corey Seeman (who loves squirrels) starts a new business-focused column, about snow plows in Charleston this time (see above), library analytics is about demonstrating value, and there is much, much more!

I am supposed to get the February issue to the printer today, finally! So gotta run! Thanks for your patience! Love, Yr. Ed.

Rumors from page 1

are able to provide a deeper level of expertise and insight to our clients.” The client list for the predecessor organizations ranges across many of the most renowned and innovative organizations working in science, medicine, technology, and education, including both for-profit and not-for-profit organizations. Clarke & Esposito is a sister company to STM Advisers (http://www.stmadvisers.com), which focuses on mergers and acquisitions in scientific, technical, medical, and academic publishing. Recently STM Advisers represented the Mathematical Association of America in the sale of its book program to the American Mathematical Society. “We are perhaps a minority in the professional and academic environment in that, rather than gloom and doom, we see opportunities for smart management teams willing to make shrewd investments in the future,” said Michael Clarke. http://www.ce-strategy.co

Have you noticed that Tom Gilson, who adapts to and notices everything, has started a special publisher job bank section on the ATG NewsChannel? I noticed that Clarke & Esposito just published a job posting there. http://www.against-the-grain.com/?s=job+bank+publishers&submit=Search

We have been posting Rumors to the ATG NewsChannel over the past several weeks and we will continue to do this as Rumors emerge! Did you see that Skip Prichard’s book has reached the Publishers Weekly bestseller list? The Book of Mistakes: 9 Secrets to Creating a Successful Future debuted at number 19 on the non-fiction bestseller list. The book is described as an inspiring business fable and a motivational tool. Visit https://thebookofmistakes.com!

And how about the Rumor about my granddaughter Porter’s first birthday on February 17! Or Rolf Janke’s new position continued on page 8



Leading Digital Products BY TAYLOR & FRANCIS GROUP

Taylor & Francis eBooks

Greenleaf eCollections Taking Sustainability Seriously

The Macat Library

Our new, single-destination platform

GREAT WORKS FOR CRITICAL THINKING

Routledge Performance Archive

2018 Modern Library Awards Platinum Winner

ROUTLEDGE HISTORICAL RESOURCES

Reviewed by CHOICE: Essential

Reviewed by CHOICE: Highly Recommended

History of Feminism

History of Economic Thought

Routledge Encyclopedia of Modernism


To request a FREE TRIAL for your library, contact: e-reference@taylorandfrancis.com

Visit www.taylorandfrancis.com to see a full listing of digital and print products.


Take a closer look at....

The CHARLESTON REPORT
Business Insights into the Library Market

You Need The Charleston Report...

if you are a publisher, vendor, product developer, merchandiser, consultant or wholesaler who is interested in improving and/or expanding your position in the U.S. library market.

Subscribe today at our discounted rate of only $75.00 The Charleston Company 6180 East Warren Avenue, Denver, CO 80222 Phone: 303-282-9706 • Fax: 303-282-9743

Rumors from page 6

at Rowman & Littlefield? Or how about ISI resurfacing at Clarivate Analytics? We do our best to keep you informed! Thanks! http://www.against-the-grain.com/category/rumors/

Just published a brand new Rumor today about the wonderfully capable Kristen Antelman, who will become University Librarian at UC Santa Barbara beginning April 1, 2018.

I hope also that y’all have been paying attention to Charleston Conference’s FAST PITCH. We had our second session in Charleston on Wednesday, November 8, 2017. Most of us have ideas that can be expanded to help the profession and the industry. We shouldn’t hold back! How can faculty and students make better use of information? We want your ideas! How can discovery be more innovative? We are going to open fast pitch ideas up in a few months so we can brainstorm together!

It was great having the innovator Kent Anderson as one of the judges at Fast Pitch this past November! Kent started the wonderful Scholarly Kitchen for SSP ten years ago. They are conducting a survey to get opinions about the new SSP directions. The survey is continued on page 24

Library Platforms from page 1

LSP market, namely FOLIO, an open-source LSP currently in development.

While LSPs have some librarians envisioning (or, perhaps, living in) a world where their ERMS is integrated with traditional ILS functions, there are still many ERMS alternatives, some integrated with discovery/access functionality and others completely standalone. One such standalone system is CORAL, which also happens to be open-source. In this issue, Heather Wilson, the Acquisitions and Electronic Resources Librarian at Caltech and a member of the CORAL Steering Committee, provides readers with an extremely relatable overview of the complications of e-resource management, discusses how CORAL’s flexibility addresses some of those complications, and points to continued development of CORAL to ensure its relevancy and usefulness.

Most of the library platforms covered in this issue are not primarily intended to host and serve content, but that, of course, is the primary intended use of IRs. Rachel Walton, Digital Archivist and Records Management Coordinator at Rollins College, delves into IR usability criteria for both public-user interfaces and back-end administrative interfaces. Not surprisingly, at least for those familiar with IR systems, the two sets of criteria have very little overlap and bring to light the challenges faced by those who develop these types of systems. I challenge readers to apply the criteria Rachel discusses to their own IR to see how it stacks up.

Discovery has been a hot topic in libraries for many years now. Interestingly, none of the authors in this issue chose to focus their articles on what I would call “traditional” Web-Scale Discovery (WSD) products, such as Summon or EBSCO Discovery Service. However, two authors in this issue do write about discovery issues, from different perspectives. Elizabeth Siler, Collection Development Librarian at UNC Charlotte, discusses homegrown library platforms, including custom Bento Box search results and format-specific search interfaces for streaming video, e-textbooks, and Open Educational Resources (OERs). In contrast, Athena Hoeppner, Discovery Services Librarian at the University of Central Florida, writes about Yewno and Knowtro, both cleverly named third-party WSD services that take a different approach to discovery.

One type of library platform that receives too little attention is vendors’ ordering platforms. Though these tools have been around for some time, Application Programming Interfaces (APIs) have allowed book and serial vendors to start integrating their ordering systems into LSPs and further streamline complicated acquisitions workflows. Justin Clarke, Director of Sales and Marketing North America at HARRASSOWITZ, describes how these

ordering systems have developed over the years and the complications vendors face in developing and maintaining them, and discusses how APIs are changing the ways vendors interact with their customers.

Of course, there is not enough room in any single issue of ATG to cover the broad diversity that exists in the library platform market. I hope you find the platforms covered in this issue interesting. I know I do! I would like to thank all of the wonderful contributing authors who graciously offered their time and talent. I would also like to thank Katina Strauch and Tom Gilson for inviting me to guest edit, as well as my colleague David Van Kleeck, Interim Chair of Cataloging Services at the University of Florida, for allowing me to pick his brain regarding cataloging terminology.

Endnotes

1. Breeding, Marshall. “Library Systems Report 2017: Competing visions for technology, openness, and workflow.” American Libraries Magazine, May 2017. https://americanlibrariesmagazine.org/2017/05/01/library-systems-report-2017/

2. Breeding, Marshall. “Library Services Platforms: A Maturing Genre of Products.” Library Technology Reports 51, no. 4 (2015). http://dx.doi.org/10.5860/ltr.51n4




Bet You Missed It Press Clippings — In the News — Carefully Selected by Your Crack Staff of News Sleuths Column Editor: Bruce Strauch (The Citadel, Emeritus) <bruce.strauch@gmail.com> Editor’s Note: Hey, are y’all reading this? If you know of an article that should be called to Against the Grain’s attention ... send an email to <kstrauch@comcast.net>. We’re listening! — KS

DANCE YOUR DISSERTATION by Bruce Strauch (The Citadel, Emeritus)

STEM Ph.D. dissertation topics are real conversation stoppers. But weird, impenetrable research can translate into dance videos. If you go to YouTube, you can find hundreds of scientists in on the fad, with thousands of viewings. Music of Tchaikovsky or Notorious B.I.G. Dancers playing photons, spermatocytes, or polio virus. Musculoskeletal disorder: hip-hop maneuvers to rewritten lyrics of "Good Vibrations" while wearing an NBA jersey. Disrupting the rewarding properties of nicotine: "Purple Rain" with a Bhangra-style dance. Discrete representations of the braid groups: aerial dancers hanging upside down, creating braided loops to techno-club music. And where did this come from? It's sponsored by Science magazine and the American Association for the Advancement of Science, with a $500 prize. See — Melissa Korn, "Your Ph.D. Thesis Sounds Funky! Let's Dance to It," The Wall Street Journal, Nov. 14, 2017, p.A1.

PEN VENOMOUS AS A RATTLESNAKE’S FANGS by Bruce Strauch (The Citadel, Emeritus)

In myth, Anne Royall was the first female journalist to interview a president. She sat on John Quincy Adams' clothes while he was skinny-dipping, and he had to tread water and answer her questions. Her life was worthy of the myth. Born in 1769, she was raised on the frontier in great hardship and privation. Her widowed mother became a maid for a wealthy man who opened his library to Anne and later married her. When he died, his family sued over his estate, and she found herself penniless again. She reinvented herself as an "itinerant storyteller." She travelled the new nation writing a series of "Black Books," sardonic portraits of the rich and corrupt. She mocked the evangelicals of the Second Great Awakening, and exposed corrupt land schemes and bank scandals. Settling in Washington, she was next door to a firehouse where a Presbyterian church held services. This deeply offended her sense of the separation of church and state, and she railed and cursed at them from her window. She was arrested for being a common scold — communis rixatrix — a strictly female offense dating to the Middle Ages. The punishment was dunking, although the judge spared her that. See — Melanie Kirkpatrick reviews Jeff Biggers' The Trials of a Scold, The Wall Street Journal, Nov. 3, 2017, p.A13.

SARASOTA INDIE by Bruce Strauch (The Citadel, Emeritus)

WSJ travel section reviews the appeal of Sarasota, Florida. Along with the farm-to-table restaurants and artisanal cheeses, they get in a plug for independent bookseller Bookstore 1 on Main Street. See — Alexander Lobrano, “A Florida Beach Town With Snob Appeal,” The Wall Street Journal, Jan. 20-21, 2018, p. D7.

10 Against the Grain / February 2018

LET’S READ ABOUT ADULTERY by Bruce Strauch (The Citadel, Emeritus)

(1) Evelyn Waugh, A Handful of Dust (1934) (this is Waugh's great one, where the cuckolded husband ends up trapped by a lunatic on the Amazon, endlessly reading aloud from Dickens); (2) Ford Madox Ford, The Good Soldier (1915) (wealthy British and American couples at a German spa); (3) Henry James, The Golden Bowl (1904) (an impoverished Roman prince and his no-money American lover each marry money and then continue their affair); (4) Stendhal, The Red and the Black (1830) (a provincial aristocrat's wife and the teenage tutor of her children); (5) Guy de Maupassant, Like Death (1889) (a politician's wife has an affair with a portrait painter, who in turn falls in love with her daughter). See — Anka Muhlstein, "Five Best," The Wall Street Journal, Sept. 30-Oct. 1, 2017, p.C16. Muhlstein is the author of The Pen and the Brush: How Passion for Art Shaped 19th-Century French Novels.

BIOMETRIC LITIGATION by Bruce Strauch (The Citadel, Emeritus)

So far, 13 million people have downloaded a Google Arts & Culture app that lets its magic algorithm match their selfies to a historical work of art. But if you're in Illinois or Texas, no art match for you. Both states outlaw the collection of biometric data, including "face geometry," without the owner's permission. The app is only one of a mass of facial-recognition tech products, like iPhones that unlock to your face, doorbells that identify guests, and security cameras that spot recidivist shoplifters. And the good news for lawyers is that lawsuits are up in Illinois, because under its statute individuals can sue; other states only permit the attorneys general to do it. But if selfie art lovers give their permission in the download, why is Google blocking it? It's a mystery. But the dangers of biometric collection include such things as a police state tracking its citizens, as China does. See — Jack Nicas, "Google Faces Off With Privacy Laws," The Wall Street Journal, Jan. 18, 2018, p.B1.

INDIE BOOKSTORES DOWN IN DIXIE by Bruce Strauch (The Citadel, Emeritus)

Louisville celebrity chef Edward Lee has written Smoke and Pickles, about the crossroads of Korean and Southern food. And he loves sleuthing for antique cookbooks in independent bookstores. His favs: Square Books of Oxford, Miss.; Faulkner House Books, New Orleans, La.; Burke's Book Store, Memphis, Tenn.; 2nd Edition Booksellers, Durham, N.C.; Chop Suey Books, Richmond, Va.; Carmichael's Bookstore, Louisville, Ky.; Parnassus Books, Nashville, Tenn.; Turnrow Book Co., Greenwood, Miss.; A Cappella Books, Atlanta, Ga.; Politics and Prose, Washington, D.C.; Church Street Coffee and Books, Birmingham, Ala. See — Emily Storrow, "Cookbook Worm," Palate, Oct. 2017, p.22.



HARRASSOWITZ has been consistently serving academic and research libraries around the world since 1872. We offer proven acquisitions solutions developed for and by librarians. Our customers value us for the quality and accuracy of our services and for our industry-leading solutions.

Subscriptions
We deliver a full range of subscription-based information resources, in all languages, in all media, including electronic and print journals, e-journal and e-book packages, digital archives, back issues, and more.

Standing orders
We provide a fast, reliable service for monographic series, sets in progress, proceedings, annuals, yearbooks, and irregular series.

Approval plans
For books and music scores, our flexible approval plan services are designed to enrich library collections while saving our customers time and effort.

Databases
We are a leading supplier of scholarly and professional content through a broad range of international databases.

Monographs
Our comprehensive range of services delivers materials in all media, enabling libraries to provide breadth and depth in their collections.

E-Books
Our innovative e-book services are tailored to our customers' needs and include standing or firm orders, approval plans, and e-packages.

Music scores
Combining quality service and specialist knowledge, HARRASSOWITZ is the world's leading and most trusted supplier of music scores.

Further Information: HARRASSOWITZ Booksellers & Subscription Agents, 65174 Wiesbaden
E-Mail: service@harrassowitz.de Tel: +49 (0) 611 530-0 Web: www.harrassowitz.de


Current and Future Library Catalogs: An Introduction to FOLIO
by Peter McCracken (Electronic Resources Librarian, Cornell University) <phm64@cornell.edu>

Over the past ten years, many academic libraries have implemented new online catalogs and automation tools to better manage the significant shift in materials expenditures from print to electronic. At Cornell University, where I work, the University Library spent nearly 70% of its 2016/17 materials budget on electronic resources; many institutions spend an even higher percentage. But too many libraries — including Cornell — do not yet have tools that can effectively manage the result of this significant shift in spending. The online catalogs of previous generations are not well designed to manage this change, and in most cases libraries have found it necessary to use one or more separate tools to manage all of this spending. Too much time, across too many tools, is spent checking and correcting links, managing licensing details, trying to determine actual holdings rights, ensuring accurate access for the correct individuals, and much more.

In the past decade, several new library automation systems have arrived on the market to better manage this shift. The most common are Alma, from Ex Libris (now owned by ProQuest); Sierra, from Innovative Interfaces; and WorldShare Management Services, from OCLC. These new systems have seen significant adoption: as described below, among 205 leading college and university academic libraries in the United States and Canada, 122, or 60%, have shifted to a new system in the past ten years, and 97% of those installs (118 of 122) were of one of these three systems. This shift will continue in the next few years, and over the next year FOLIO — a new, open-source library management service — will become available to libraries as well. My employer, Cornell University Library, is fully committed to FOLIO and, along with several other libraries, has committed significant resources to its development. FOLIO will introduce the next evolution to the library

management service marketplace, and one very much worth libraries following and considering.

Since the introduction of the first online public access catalogs (OPACs) in the 1980s, through the integrated library systems (ILSs) of the 1990s and 2000s, and on to the library management services (LMSs) of today, libraries have sought effective tools for making their collections accessible to their patrons. It has not been easy, and each transition has tried to improve upon the problems of prior systems. The domination of electronic resources in the library marketplace has changed how librarians offer and manage these resources. The proliferation of electronic content means libraries generally cannot keep track of exactly what is available in the databases to which they have access. They must rely on outside companies to track this information, and tracking that information is now a critical part of the library management system. Managing licensing — and determining who should be able to access what, via which channels — has become a vital part of the librarian's skill set, and we need tools to help us manage that work. In addition, allowing appropriate access without leaking too much personally identifiable information becomes vital. All of these functions, and many others, require a completely new toolset.

In thinking about upcoming changes in the library automation universe, I wondered who is using which library automation systems, and how long they have had their current system — or, more specifically, how many libraries have not upgraded to a better system. Using data collected in late January 2018 from Marshall Breeding's valuable database about library automation tools, at LibraryTechnology.org, I reviewed the current systems in use among Association of Research Libraries members and Oberlin Group members. I sought a way to display the age and system

in use at those institutions; hopefully, the attached figures will be useful in doing so.

Figure 1 shows a summary of all current automation systems among members of the Association of Research Libraries (ARL), representing the largest research libraries in the United States and Canada. A clear split exists between the prior generation of automation systems, generally known as ILSs, and the current generation, or LMSs. Among 125 ARLs, one does not currently have an automation system. Of the remaining 124, 70 (or 56%) are using catalogs acquired in the last eight years. The division between the two generations of catalog systems is undeniable; no system appears on both sides of the 2009/2010 gap, when no new systems were installed. The vast majority of these new installations among ARLs were Alma or Sierra. Note, however, that 40% of the Sierra installs were in 2011 alone; in contrast, Alma has had a regular schedule of at least five, and an average of 7½, ARL installs per year since 2012. Alma — most likely because of its electronic resources management features — has appealed to many research libraries.

While close to 60% of all ARLs acquired a new automation system in the last eight years, over 40% did not. Of those, nearly half (23 of 54) are using Voyager, from ProQuest. All of these libraries are almost certainly looking for new catalogs that will allow them to do much more, more effectively, than they currently can. While electronic resources obviously existed when these automation systems were installed, the world of resource acquisitions has changed dramatically, and in many research libraries the electronic resources budget has grown from perhaps 20% of all expenditures to some 80%. The tools that libraries need to manage these electronic resources simply do not exist in the catalogs they use, and most libraries that have some sort of electronic resources management tool must use it outside their ILS. 
Some libraries — for example, nearly all university libraries in Germany — still do this work in basic spreadsheets and home-made databases. Other libraries use tools clearly not intended for this work, such as Trello, the online project management tool; while libraries are able to make parts of it work, these tools have many gaps that libraries need filled. Still others, such as Cornell University, are using Intota, an electronic resources management tool from the Serials Solutions division of ProQuest, but there are no plans to grow or support Intota, as ProQuest unsurprisingly hopes to migrate these libraries to Alma instead.

The situation is similar in smaller American academic college libraries. As shown in Figure 2, the Oberlin Group of libraries had an even more noticeable break between the previous automation systems of pre-2008 and the newer systems since 2011. The Oberlin Group is a gathering of 80 selective college and small university libraries, and it serves as a good indicator of the direction of college library automation plans. Between the start of 2006 and the end of 2010, only one Oberlin Group library installed a new automation system. (It is possible that a library acquired one in that period and then replaced it between then and now; such a system would not be included in this chart.) Obviously, some of this gap resulted from the Great Recession of 2008 and its aftermath, though this period includes two years before the downturn began in December 2007. Once an effective solution existed and their financial situations allowed them to acquire a new system, many college libraries did so. In almost every case, they selected either Innovative Interfaces' Sierra (in 22 cases), OCLC's WorldShare Management Services (15), or Ex Libris'/ProQuest's Alma (13).

But as with major research libraries, many small college libraries still need to implement a better, more effective solution. Those libraries that are still using past-generation catalogs need a collection of tools that will ensure they are providing access in the most efficient and most effective manner possible. Many are certainly considering Alma or Sierra (or, for the smaller institutions, often WorldShare Management Services), but some are also awaiting the completion of FOLIO, the open-source LMS being developed by the Open Library Foundation with extensive financial support from EBSCO, the Mellon Foundation, participating libraries, and many other funders. FOLIO represents the next evolution in the field of LMSs, and is being developed by a wide range of libraries and vendors from around the world. FOLIO is open-source, so when released it will be freely available to all.

Expanding on ideas and tools from the past decade, but built brand-new from the ground up, FOLIO provides a new way of managing library access to all resources. FOLIO is an expansion of the Kuali OLE project to build an open-source library management system, and financial support from EBSCO, along with other partners, has created a community of developers committed to creating a management system that any library can use. The end product will be free and open-source, in the sense that anyone can download the software to run the system. But as is often said about open-source projects, "It's 'free,' as in a free puppy; not 'free' like free beer." To make FOLIO work, libraries will still need to commit time and resources to install, implement, support, and maintain the system. EBSCO and others will offer those support services, for a fee, to the libraries that seek them.

From a business point of view, this highlights one of the biggest differences between FOLIO and existing LMSs. For stand-alone systems like Alma or Sierra, libraries can only acquire the product from the producer, and while there may be some price competition between different products, there is no competition for installations of, say, Alma, since ProQuest is the only provider. Similarly, long-term support for, and access to, the resource can only come from the original provider, so long-term contracts are critical for protecting a library from unexpected price increases. With an open-source product, by contrast, anyone can support the product. This could include a library vendor like EBSCO, SirsiDynix, or ByWater Solutions, which has been supporting the open-source Koha system for years. Other viable options include local IT firms, a single institution on its own, or perhaps an IT team from a consortium, which could support all of the consortium members' installations. Given the many options for installing and supporting an open-source solution, one can reasonably expect a significantly lower annual maintenance cost than one would see from a single-source provider.

FOLIO is being built by programmers, developers, product owners, product managers, subject matter experts, and many more, from a range of institutions in North America, Europe, and Asia. Many developers work for EBSCO or companies hired by EBSCO to build various parts of FOLIO. Still others work for participating libraries — Cornell, Duke University, and Texas A&M University, in particular, have developers dedicated to writing FOLIO code, while many others have subject matter experts who contribute time and expertise through Special Interest Groups that define how the product will be built. Funding from the Mellon Foundation and others also provides support for additional developers and product managers through the Open Library Foundation, which oversees and directs FOLIO development.

The FOLIO project aims to have an initial "minimum viable product" available for use by July 2018. By January 2019, many additional units will be complete, with much more functionality added, and a full "version 1" should be available for download and use by then. Of course, any major project like this takes an enormous amount of work to complete, and the first versions are intended for libraries that are willing to contribute beta-testing time and experience. With that contribution, though, comes the ability to directly affect the manner in which the project develops; those who participate from the early stages will have the opportunity to directly shape how the product grows and improves.

Two parts of FOLIO that I believe will be of particular interest are the Codex and the FOLIO Marketplace. The Codex is among the most difficult to understand, and I must admit that the description that follows comes from time spent discussing the Codex and its structure with the expert developers creating it; even so, my understanding may be limited or inaccurate, or the eventual structure and implementation of the Codex may change. As I view it, the Codex is a concept that represents the ability for a FOLIO user (generally a library employee, not a patron) to search a vast world of resources through a single interface. 
Separate from the Codex itself is the ability to search a subset of Codex data — generally, but not always, referred to by FOLIO developers as an institution's "Inventory" — that represents all of the resources an institution chooses to make available to its users or patrons. This Inventory will be compiled within FOLIO, and will be the content behind most of the activity relevant to a library's daily work, such as circulation, licensing, access, ILL activities, cataloging, inventory, holdings management, and more. At present, however, there are no plans to create a public interface to this FOLIO Inventory, so libraries will generally use a tool such as VuFind, Blacklight, or perhaps a commercial discovery layer to provide patron access to the institution's Inventory.



Discover

The most cited and largest peer-reviewed collection of optics and photonics content.

OSA offers multiple subscription options for accessing your content. Visit osapublishing.org/library for information on how to acquire this high-quality content for your institution.



Current and Future Library ... from page 14

Each institution defines what is in scope for its Inventory. This would almost certainly include descriptive records for the physical volumes that the library owns and stores on its shelves, along with electronic records for the resources that a library subscribes to, purchases, or leases for and on behalf of its patrons. At the institution's discretion, the Inventory can also include many other subsets, such as descriptions of — or perhaps the full text of — institutional repository data, electronic theses and dissertations, discovery layer contents, resources available for patron-driven acquisition, externally hosted eBook and audiobook collections, institution-specific online content from commercial publishers, and much more.

A library might provide different collections to different individuals through the Codex search tool. For example, acquisitions staff might have access to data stores far beyond what appears in the patrons' Inventory, so they can quickly and easily determine sources for the acquisition of print and electronic resources. An acquisitions staff member might use the entire FOLIO Codex to identify all the vendors who could provide an electronic version of a particular monograph, and may also discover that the monograph is already available through a path that the patron or public services staff had not located. But the Codex is limited to the resources whose providers are willing to have them included; if an eBook vendor chooses not to allow its data to be searched via the Codex, its results will not appear there, and acquisitions staff will need to search that resource separately if they want to consider its contents. Each collection could essentially be a separate knowledgebase that the Codex can search, if or when a library chooses to include it. Or the library might decide to track all of its electronic resources in a single knowledgebase. 
Because the FOLIO project is creating and defining specific APIs for how a knowledgebase is searched, it is up to each knowledgebase vendor to offer access to its resource through the FOLIO structure. Currently, EBSCO has built an eHoldings app that will allow librarians to manage electronic resources data in their own EBSCO knowledgebase tool. Other knowledgebases, of nearly any size, can create interfaces between their data and the FOLIO system using the appropriate APIs. Or a third party could build and distribute a tool that allows a library to access a vendor's data through a combination of that vendor's data feed and the third party's app or interface. Librarians will have many opportunities to access and manage the data that matters most to them.

Generally, the Codex search of all available resources does not search MARC records per se. Codex searching will be limited to a small number of (as yet not completely defined) fields, such as author or contributor, title, some relevant publication data, and perhaps some subject data, when available. It will, in essence, be nearly like searching BIBFRAME "Instances" — looking at the concept of a resource, but not necessarily at the "Manifestation" level common to the FRBR structure. But Codex searching will link out to more descriptive item-level data from other resources. So a search for a specific monographic title will locate a brief record that represents the title, with perma-links to knowledgebases that provide more information about that resource — representations of electronic versions of the title, print copies in the library's physical inventory, consortial copies available from other affiliated collections, or information about copies that can be purchased or leased — all depending on which collections, volumes, or knowledgebases the library chooses to include for its users.

When looking at bibliographic data, this information will, invariably, be MARC records — but only because MARC is so prevalent today. However, the Codex structure does not specifically demand a MARC format, and any other appropriately structured data could be included in an institution's Inventory. If, for example, a library chose to start using the Resource Description Framework (RDF), or some other encoding standard, to describe the resources it offers to patrons, all newly added content could be described with RDF while existing content remained in MARC. The Codex would search both collections, and results would appear to patrons as a single Inventory. Over time, MARC records could be transformed to the newer encoding standard and the data would shift from one collection to the other — though the shift would not be obvious to the end user. 
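The brief-record-plus-link-out pattern described above can be pictured with a small sketch. This is purely illustrative Python, not actual FOLIO code: the CodexRecord class, the "kb://" link scheme, and the sample data are all invented for the example.

```python
# Hypothetical sketch of Codex-style "brief records" that are searched on
# only a few fields and link out to fuller item-level data held in
# separate knowledgebases. Illustrative only; not FOLIO's real structures.
from dataclasses import dataclass, field

@dataclass
class CodexRecord:
    title: str
    contributor: str
    # Perma-links into knowledgebases holding richer descriptions:
    # a MARC record, an electronic holding, a consortial copy, etc.
    links: dict = field(default_factory=dict)

inventory = [
    CodexRecord("Moby-Dick", "Melville, Herman",
                {"print": "kb://local/marc/123", "ebook": "kb://vendorA/456"}),
    CodexRecord("Walden", "Thoreau, Henry David",
                {"print": "kb://local/marc/789"}),
]

def codex_search(records, query):
    """Match only the brief fields (title, contributor), not full MARC."""
    q = query.lower()
    return [r for r in records
            if q in r.title.lower() or q in r.contributor.lower()]
```

A search for "melville" would return only the brief Moby-Dick record; following its links would then surface the local MARC record or the vendor's e-book description, depending on which knowledgebases the library has chosen to include.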
Like the Codex, the Marketplace will be a new space that brings together disparate resources into a single environment. But instead of bibliographic data, the Marketplace will offer tools, workflows, services, data, and other functionality in an environment that gives many parties the opportunity to buy, sell, offer, trade, or implement these tools as they see fit. The Marketplace's best comparison is almost certainly to an app store, in that the resources that appear in the Marketplace are designed specifically for the FOLIO community, and contributors can decide if, or how much, they will charge for the products and services they offer there. User reviews can help guide others toward or away from specific tools that might or might not meet their needs.

Many libraries may prefer to stick with an automation tool from a single-source vendor; certainly, the creator of the product most likely employs the most knowledgeable individuals regarding the resource in question. Others, though, may prefer the flexibility that an open-source solution provides, both in terms

of finding companies to support the library's installation, and in finding additional tools to support the product in ways that the single-source vendor may choose not to do, or may not be able to do. Because FOLIO is open source, libraries will be able to use just the modules that interest them, and if they feel that a certain unit does not meet their needs, they can select a different module with a similar function, or even build their own. If the tools that currently exist don't meet their needs, libraries do not need to wait until a vendor gets around to building them — they can build the tools themselves, or hire someone to do so, to their own specifications. They could then offer their solution to others, charging money for it or not, as they see fit.

While the parts of FOLIO currently being built — by OLE, EBSCO, and many smaller companies — are open source, so that anyone can download and use them, this does not mean that all tools for FOLIO must be either open source or free. Within the FOLIO Marketplace, companies, individuals, libraries, and others can all build and sell, or give away, tools that they believe will benefit the library community. Similarly, within the standard FOLIO tools, people, libraries, and companies will be able to build and sell or distribute workflows that help libraries manage their resources. For example, a library that wants to track all arrivals of a print serial with a complex publication schedule could purchase a workflow designed specifically to track that particular serial. Or someone could build and share the steps necessary to correctly establish access to a specific, complicated electronic resource. Another vendor could create and offer a tool that helps libraries identify missing online resources, or perhaps one that checks the availability of online resources. 
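The swap-one-module-for-another idea can be sketched as a shared interface with interchangeable implementations. This is a hypothetical illustration in Python; FOLIO's real module system is defined by its own APIs and is far richer, and the class names and the serial-prediction rule here are invented.

```python
# Hypothetical sketch of pluggable modules behind a shared interface,
# showing how an open-source system lets a library swap one
# implementation for another, or write its own. Not FOLIO's actual API.
from abc import ABC, abstractmethod

class SerialCheckin(ABC):
    """Any module implementing this role can be dropped in."""
    @abstractmethod
    def expected_next_issue(self, last_issue: int) -> int: ...

class SimpleMonthly(SerialCheckin):
    """A stock module: twelve issues a year, numbered consecutively."""
    def expected_next_issue(self, last_issue: int) -> int:
        return last_issue + 1

class SkipsSummer(SerialCheckin):
    """A locally built replacement for a title that skips issues 7 and 8."""
    def expected_next_issue(self, last_issue: int) -> int:
        nxt = last_issue + 1
        while nxt in (7, 8):  # invented irregular-schedule rule
            nxt += 1
        return nxt

def receive(module: SerialCheckin, last_issue: int) -> int:
    # The surrounding system only depends on the interface, so a library
    # can substitute its own module without touching anything else.
    return module.expected_next_issue(last_issue)
```

The check-in code calls `receive` the same way regardless of which module is installed; replacing `SimpleMonthly` with `SkipsSummer` (or a purchased workflow) requires no other changes.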
Libraries need many, many tools to manage everything they offer to patrons, and the FOLIO Marketplace will make it easier for smaller vendors to get their useful tools to the libraries that need them. In every case, it will be up to the creator to decide the cost and availability of such workflows, tools, and apps.

The library automation marketplace continues to evolve. Libraries expect more efficient and more effective ways of managing the resources they offer to their patrons, and it should be no surprise that managing electronic resources is among the most important services needed today. Past-generation systems simply do not offer this functionality, and libraries find themselves working hard to force old or inappropriate tools to do this difficult work. The current generation of systems — particularly Alma, WorldShare Management Services, and Sierra — provides librarians with the tools they need. But the next-generation tool, FOLIO, will provide new and additional functionality to manage these resources in a more efficient manner. And, more importantly, FOLIO will introduce a broad paradigm shift in how libraries run these systems. As an open-source product in a community that values new ideas and new tools, FOLIO will, I believe, provide the opportunity for libraries and small vendors to develop and offer tools that will benefit the entire library community — and its patrons — in ways that we cannot begin to imagine. As we finally find and develop more and more useful ways of ensuring that our patrons will be able to truly take advantage of all the


Optimizing an Octopus: A Look at the Current State of Electronic Resources Management and New Developments in the CORAL ERM System
by Heather Wilson (Acquisitions & Electronic Resources Librarian, Caltech) <hwilson@caltech.edu>

No one mentions in job descriptions that being an electronic resources librarian means drawing a lot of octopuses. In addition to dabbling in supply chain management, scholarly communication, and acquisitions, e-resources professionals are often asked to explain why those areas are related to the work of managing electronic access. The result is often a series of sprawling, tangled diagrams with arms reaching everywhere, all in an attempt to represent the "electronic resources management workflow." The unfamiliar viewer may be shocked by the complexity of the processes; knowledgeable viewers are often only marginally less overwhelmed. The reality is that electronic resources management (ERM) is dynamic, unstable, and unpredictable — and that is if the librarian is lucky enough to work in a library that promotes innovative approaches to those processes.

Librarians and library software companies have dedicated considerable resources to developing systems that support and simplify the library's ERM workflows, especially as those systems have multiplied beyond the central ILS. In addition to holdings and licensing information, many libraries have added the link resolver, the proxy server, web-scale discovery systems, helpdesk ticketing systems, and other tools to their suite of services, all of which have a role in the e-resources life cycle. However, most attempts to simplify and centralize these individual services have been unable to encompass emerging tools, and ultimately generate the need for more extended workflows, more octopus diagrams, and more spreadsheets in a shared drive.1

It has been nearly a year since Emily Singley and Jane Natches, two researchers from Boston College, published research about the pervasiveness of those external ERM processes at institutions with Library Service Platforms (LSPs) in place, research which illuminated some of the central challenges in developing ERM systems.2 LSPs, also commonly called "next-gen ILSes," have emerged as one approach to managing both electronic resources and library systems, an approach which aims to reduce the number of processes by centralizing them in one larger, often shared, system. Libraries have seen "a growing trend toward a consolidation of services for electronic resources management, A-Z journal listings, full text link resolving, and discovery services under a single service provider," and these centralized platforms are a natural evolution of that trend.3 Using the TERMS framework (https://library.hud.ac.uk/blogs/terms/) as a guide for some of the more universal functions of ERM, Singley and Natches conducted a survey of nearly 300 library professionals to uncover how ERM was being handled in three major LSPs: Alma, Sierra, and OCLC WorldShare. The survey respondents reported that the systems have simplified many workflows, especially with regard to journal activation within the central systems. Nevertheless, all of the surveyed library staff expressed the need to perform many ERM tasks outside of the systems, particularly in managing renewals and performing ongoing assessment of subscriptions. Across all three systems, more than half of staff users were assessing their renewals outside of the LSP in spreadsheets, shared drives, and other external options.4 Usage modules and cost-per-use calculations continue to be a struggle to manage within central systems, and even many tools designed specifically for this purpose are not able to provide the full range of COUNTER data and

resources we acquire and offer to them, this is an exciting time to play a role in defining the next generation of these critical tools. While library automation developers always ask librarians for feedback in what they would like to see in the next iteration of their products, FOLIO gives librarians and library staff a unique opportunity to truly lead this development, through the many paths by which individuals can participate in developing the FOLIO project.

calculations. For example, EBSCO’s Usage Consolidation tools provide excellent ways to centralize JR1, BR1, BR2, DB1, and DB2 reports;5 however, many libraries are finding value primarily in the JR5 and JR1 GOA reports.6-7 This is not to single out that particular tool; finding systems to handle the calculations cited above is challenging, and demand for these reports and calculations is only just emerging. While improving these tools is a constant process, meeting increasingly specific needs at the current pace of collection management is a challenge for a centralized system, where other workflows will inevitably be affected. The related challenge of feasible data migration in a major system change continues to be a discussion point in the centralization of these processes.

Although librarians may appreciate the minimized processes that come with one-stop ERM in an LSP, that value is often contingent on how carefully the e-resources and metadata functions are being handled. Singley and Natches note that the “complexity and ever-changing nature of ERM has made it necessary for libraries to invest in multiple software systems as well as use manual workarounds to support ERM workflows.”8 Many of the systems were not originally designed for interoperability, particularly in the parsing of metadata between them. This lack of connection between systems has often resulted in siloed workflows that are difficult to centralize without extensive project management and staff time.

Libraries in this and similar situations have begun to explore a different solution than system centralization: perhaps what is actually needed are configurable systems and integrations that allow ERM to be implemented more gradually, developed along the library’s own timeline. This solution looks instead at creating connections between existing systems through APIs and integration tools, allowing a library to make system changes only when necessary (and reasonable to do so).
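Since much of this cost-per-use work ends up in spreadsheets anyway, the underlying arithmetic is simple to sketch. The following is a hedged illustration only; the titles and figures are invented, and in practice the usage numbers would come from COUNTER-style full-text request totals rather than being hard-coded.

```python
# Illustrative cost-per-use arithmetic: join subscription costs
# (often tracked in a spreadsheet) with annual full-text request
# totals drawn from COUNTER-style usage reports. All data invented.
annual_cost = {"Journal of Example Studies": 1450.00,
               "Hypothetical Review": 320.00}
jr1_total_requests = {"Journal of Example Studies": 2900,
                      "Hypothetical Review": 16}

def cost_per_use(costs, uses):
    """Return cost-per-use per title; None when there is no recorded use."""
    return {title: (round(costs[title] / uses[title], 2) if uses.get(title) else None)
            for title in costs}

for title, cpu in cost_per_use(annual_cost, jr1_total_requests).items():
    print(f"{title}: {cpu}")
```

Even this toy version shows why librarians want the calculation inside their systems: the join between cost data and usage data is exactly the step that today lives in external spreadsheets.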
The details of the open source FOLIO LSP had not yet been announced when Singley and Natches performed their survey in early 2016, but presentations of FOLIO at the recent 2017 EBSCO User Group reflected library demand for stronger integration of pre-existing platforms and clearer data migration plans.9 In fact, the agendas from the user groups of other major LSP providers, including Ex Libris and OCLC, included sessions about API development and fluid integration with outside systems.10-11 It’s clear that many librarians and library providers are coming to the conclusion, as Singley and Natches did, “that ERM remains a complex process that is, as yet, too daunting to encompass within any continued on page 20




Optimizing an Octopus ... from page 18

one software system.”12 Beyond the functional concerns about centralized systems, many libraries continue to find value in maintaining a suite of separate tools. Librarians providing case studies of system interoperability at three Canadian university libraries noted that by maintaining unique but connected tools, they were able to implement “flexible and diverse systems” while promoting “healthy competition within the marketplace.”13 In coming to the decision to maintain or migrate systems over time, librarians are aware that the original interoperability issues are still present among their systems; in these situations, technical services librarians and developers team up to enhance metadata parsing solutions and API technology. While many constraints prevent this option from being available to every library, for others this approach is more feasible than a large system migration or dependency on a single software provider.

In both emerging approaches, it must be noted that huge amounts of the work fall on technical services librarians, whose many diagrams must be consolidated and scrutinized as the workflows and metadata are incorporated. When preparing to integrate services, information about the library’s holdings from various incongruous systems is often seen in its rawest form: proxy configuration files expose periods of non-maintenance, and wrongly activated links are held up to the light. Following this period of scrutiny, technical services librarians are then often asked to construct a central dataset for either one major system or several interoperable templates, a task which seems fairly straightforward, yet is somehow anything but. ERM is tough for all involved during many migrations, but it can also be a rewarding process, one that often makes these library staff more appreciated assets at their institutions.
For those entrusted to construct and optimize a central dataset from disparate file structures, an open source ERM system such as CORAL (http://coral-erm.org/) can be very helpful. Open source ERM systems often provide librarians with the flexibility to implement as much of the system as needed, according to the technical limitations of their libraries. In the case of CORAL, the ERM is supported by an active developer group, an accessible user community, a history of vendor and library support, and a governance structure. While many libraries with CORAL implementations have developers who contribute to the project, CORAL is also often implemented by librarians with the help of the developer and user community. The system is modular, and while some libraries stay up to date with the latest release (which occurs approximately every six months), some libraries have been known to bring just one function of one CORAL module into their other systems. Recently, the governance committees have been actively recruiting new members and expanding opportunities for leadership. The CORAL Steering Committee has become more focused


on user engagement as the team looks toward refreshing the project road map. Following a recent survey of the user community, the Steering Committee will develop user-driven goals for this road map. In the coming months, the recently formed Web Committee will be transitioning the website and its data to an updated site, as well as diving into the project documentation.

Beyond governance, the developers on the project, with in-depth input from subject matter experts in ERM, have improved functionality of the system in ways that coincide with emerging demands from librarians, and the work is impressive. (Figure 1 shows where some of this new functionality can be found in the system.) Earlier in 2017, developers enhanced the import configurability of CORAL to allow for custom field import at numerous levels in its Resource module, the primary means for documenting titles and packages in the CORAL system. These improvements allow librarians to lay out and import their holdings in bulk without forcing the data into a new structure. Even if the import does not satisfy the most detail-oriented of e-resources librarians, having the initial dataset imported for editing reduces the workload considerably.

Also in 2017, developers turned their focus to integrations. First, Matthias Meusburger and Paul Poulain from the open source software group BibLibre started to develop an integration between CORAL and the open source ILS Koha, an enhancement that allows vendor, budget, and acquisition information to be synchronized between CORAL and the ILS and provides real-time acquisitions (RTA) options for users of those systems. The integration has since been developed to be more universally configurable and to work with any API-enhanced ILS.

Figure 1: Screenshot from a CORAL developer installation maintained at Caltech, showing the File Import and EBSCO Knowledge Base integrations recently in development.
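The import and integration work described above ultimately comes down to one core task: mapping title records from an external system’s shape into rows a local ERM system can ingest in bulk. The sketch below is a hedged illustration of that mapping step only; the JSON shape and every field name are invented for the example and do not reflect the actual RM-API payload or CORAL’s internal schema.

```python
# Hypothetical knowledge-base API response (invented shape, for
# illustration only) and a mapping step that flattens each title
# entry into a row suitable for bulk import into a local ERM system.
sample_api_response = {
    "titles": [
        {"name": "Journal of Example Studies",
         "url": "https://platform.example/joes",
         "subjects": ["Library Science"],
         "coverage": {"begin": "1997-01-01", "end": None}},
    ]
}

def to_erm_records(payload):
    """Flatten API title entries into flat rows for bulk import."""
    rows = []
    for title in payload.get("titles", []):
        rows.append({
            "title": title["name"],
            "resource_url": title["url"],
            "subjects": "; ".join(title.get("subjects", [])),
            "coverage_start": title["coverage"].get("begin"),
            # An open-ended coverage range is recorded as "present".
            "coverage_end": title["coverage"].get("end") or "present",
        })
    return rows

print(to_erm_records(sample_api_response))
```

A real integration would, of course, page through the vendor API with authentication and handle error cases; the point here is only that the translation layer, not the transport, is where most of the ERM-specific decisions live.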
SirsiDynix, which offers hosting and support services for CORAL, facilitated development of an integration between CORAL and EBSCO’s central Holdings Management knowledge base using EBSCO’s Resource Management API (RM-API). To that end, Product Manager Carla Clark coordinated with contractor Luke Aeschleman to broadly define the integration. Luke then designed and coded the feature, which allows CORAL users to import resource information in bulk from EBSCO’s central knowledge base, including title information, resource URLs, subject designations, and coverage dates. For campuses with these systems, these integrations have streamlined and automated many ERM processes while allowing libraries to keep some existing systems quietly in place. Looking forward, developers are exploring integrations with projects such as the open discovery tool,

VuFind, and the open knowledge base, GOKb. While ERM tools on the market demonstrate a commitment to meeting the integration needs of libraries through API and RTA integrations, they are often confined to rigid release cycles that are opaque before launch. CORAL’s open development community, institutional product support, and modular implementation approach allow for leaner release cycles that are more transparent and can meet an increasingly diverse array of ERM needs.

Of course, there are far more ways to look at easing ERM than this dichotomy of centralization or integration. In any event, immediate need will likely dictate a hybrid approach to ERM, and any consideration given to long-term strategy is a luxury for many libraries. But regardless of the level and direction of ERM implementation in a library, professionals and providers alike seem to have a growing awareness that their system decisions should be flexible in order to address quickly emerging ERM tools and ideas. For example, literature in the field shows that more and more libraries are getting out of major e-resource packages, or “Big Deals.”14 However, little has been written about how to manage the residual access. Electronic resources librarians are finding variations in how post-cancellation access is being provided, and budget managers are finding themselves in a new role, tracking license and payment information in new ways. The emerging need to track perpetual access terms more closely generates a need for a new workflow, a new octopus diagram. Maintenance in any system becomes a little more cumbersome for the electronic resources librarian involved if the systems in place cannot be configured to meet this new data need. That specific yet increasingly common situation is only one example of many emerging collection management approaches, and library directors are looking to replace, not complicate, old processes.
Even as we see impressive developments among all systems handling ERM, those working on the CORAL project have recently seen that librarians and developers working on the ground have a considerable advantage in anticipating these developments. The CORAL community is made up of libraries that demand the most from their collections and services, and their support has made the open source project competitive with proprietary solutions. While it is possible to eliminate and integrate processes at the system level, these projects benefit when informed by those who do not see octopuses in these processes, but a series of constantly evolving workflows that are at the heart of the institution. endnotes on page 24



Homegrown Search Results and Platforms by Elizabeth Siler (Collection Development Librarian, UNC Charlotte) <esiler3@uncc.edu>

With all of the proprietary systems available in the library marketplace to manage and provide access to a library’s resources, each has limitations when the library is trying to promote a particular resource or program on campus. Because of these limitations, and thanks to the ingenuity of library programming specialists, libraries create local databases and search platforms to serve particular needs, whether promoting a particular type of resource or promoting a program in the library.

Promoting Specialized or Underutilized Resources

Although all the library’s resources can be located in the OPAC or discovery system, they do not always stand out to students and faculty or get marketed well. Marketing electronic resources has continually been a challenge for libraries. Because of this, libraries have come up with ways to market resources by creating local search platforms.

Bento Box Search Results — One example of this is how libraries leverage the information in the OPAC or a discovery system to create a Bento Box-style search results page that separates resources and highlights the different types of resources the library has to offer, especially audiovisual material, which is often overlooked as a legitimate source for research. One of the first libraries to implement a Bento Box search results page was NC State University Libraries (Figure 1). The purpose of this interface was an “attempt to guide searchers to appropriate resources.” When NC State first reported on this feature, several other libraries were also using a similar format, including Villanova, the University of California, San Francisco, the University of Michigan, and the University of Virginia.1 Over time, several other universities have employed this type of in-house developed search method.
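The bento box approach itself is straightforward to sketch: query each backend independently, in parallel, and keep each result set in its own compartment rather than interfiling everything into one ranked list. The backend functions below are invented stand-ins for real search clients, not any library’s actual services.

```python
# A minimal "bento box" results sketch: run several backend searches
# in parallel and keep each source's results in its own compartment.
from concurrent.futures import ThreadPoolExecutor

def search_catalog(q):    # stand-in for a real catalog client
    return [f"Catalog record for {q}"]

def search_articles(q):   # stand-in for an article index client
    return [f"Article about {q}"]

def search_media(q):      # stand-in for a streaming media client
    return [f"Streaming video: {q}"]

SOURCES = {"Books & Media": search_catalog,
           "Articles": search_articles,
           "Streaming Video": search_media}

def bento_search(q, limit=3):
    """Query all sources concurrently; never merge their result lists."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn, q) for name, fn in SOURCES.items()}
        return {name: f.result()[:limit] for name, f in futures.items()}

for compartment, hits in bento_search("jazz").items():
    print(compartment, "->", hits)
```

Keeping the compartments separate is the whole design choice: no relevance score ever has to compare a catalog record against a journal article, which is precisely the comparison that makes single blended result lists confusing.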

Figure 1: NC State Bento Box Results

According to a blog post by Emily Singley on her Usable Libraries blog, one of the main reasons to use the Bento Box style of search results is that it “eliminates the ‘default search’ problem, where users tend to favor the most prominent search option and end up missing important resources.”2 This gets to the point of leveraging the search functionality you have in your OPAC and displaying it in a way that is more digestible by users and highlights resources students might otherwise have ignored. How this is achieved at each institution differs, but the overall concept is the same. For some libraries, this kind of programming is not possible because of a lack of staff or interest, but many larger libraries have utilized and perfected this method of presenting search results.

Locally Created Search Interfaces as Usage Boosters – Streaming Video Search — At UNC Charlotte, usage of the streaming media that the library either purchased or subscribed to was lower than expected. The library was seeking funding for additional streaming video and felt that the videos were not being discovered in our discovery system among all of the other content within our catalog of resources. At that time the library had purchased streaming video packages from Alexander Street Press and Ambrose, as well as subscribing to larger packages of films from Films on Demand and a small package from Kanopy Streaming. Overall, the library had access to over 20,000 streaming videos and had spent close to $100K on these resources. In order to increase overall usage, the library decided to create its own search interface for streaming video content to provide another place for discovery of this content. Since the database was created to serve as a marketing tool, the designers did not implement deep indexing or extensive faceting.

With the simplified search, titles can be searched by keyword (based on the title), browsed by subject (based on how the vendor has categorized the video), and limited by date. The records include a brief description, run time, year, and a thumbnail. The vendors provide all of the indexed information through spreadsheets the library acquires from their administration portals. Once users click on a title result, they are sent directly to the video on the vendor website, in a new window, so they do not lose their search results. Users can get to the UNC Charlotte streaming video page (Figure 2)3 in two ways: on the library homepage, there is a tab in the search box for streaming video, or they can go directly to the streaming video search page, which is advertised in various sections of the library’s webpages. If they go directly to the website, they will also see highlighted films that are automatically pulled from the title lists each week, as well as direct links to each of our collections.

Upkeep is relatively minimal for this database, aside from adding new films that we lease or buy. Our subscription databases only remove titles twice a year, so we remove titles infrequently. Unfortunately, usage of the library’s streaming video has not changed significantly since the new search interface was implemented, so more testing, and possibly focus groups, will be required to determine what is needed to boost usage. On the other hand, we have received a significant increase in requests from faculty for more streaming video, so their awareness of this type of content in general has increased. With a little more promotion, there is still a possibility for the streaming video database to be useful in showing the breadth of our collection or assisting faculty with steering students toward films assigned for classes. This experience has helped the library determine how we will spend our money on streaming video in the future. continued on page 22
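A lightweight search table of the kind described, with title-keyword search, subject browsing, and a date limit, can be sketched in a few lines of SQL. The field names and sample rows below are illustrative assumptions, not the actual UNC Charlotte schema; in practice the rows would be loaded from the vendor spreadsheets rather than hard-coded.

```python
# Sketch of a minimal streaming-video search table (invented schema)
# supporting the three access points described: keyword on title,
# browse by vendor-assigned subject, and a date limit.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE videos (
    title TEXT, subject TEXT, year INTEGER,
    description TEXT, runtime_min INTEGER, vendor_url TEXT)""")
conn.executemany(
    "INSERT INTO videos VALUES (?, ?, ?, ?, ?, ?)",
    [("A History of Jazz", "Music", 2012, "Survey of jazz.", 58,
      "https://vendor.example/1"),
     ("Intro to Geology", "Science", 2015, "Rocks and minerals.", 45,
      "https://vendor.example/2")])

def search(keyword=None, subject=None, year_from=None):
    """Combine whichever limits the user supplied, with parameters bound safely."""
    sql, params = "SELECT title, year, vendor_url FROM videos WHERE 1=1", []
    if keyword:
        sql += " AND title LIKE ?"; params.append(f"%{keyword}%")
    if subject:
        sql += " AND subject = ?"; params.append(subject)
    if year_from:
        sql += " AND year >= ?"; params.append(year_from)
    return conn.execute(sql, params).fetchall()

print(search(keyword="Jazz"))
```

Each result row carries the vendor URL so a click can hand the user straight to the vendor platform in a new window, exactly as the production interface does.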




Figure 2: UNC Charlotte Streaming Video Website

Locally Created Search Interfaces to Promote a Library Program

In addition to the important day-to-day work the library does for its students, faculty, and staff, it also provides unique services and programs to the campus to enhance teaching and learning. These programs often support a larger issue the campus is dealing with that needs additional promotion and outreach.

E-Textbook Database — At UNC Charlotte, like at many other schools, we are grappling with textbook affordability. One of the ways we have tried to help faculty and students address the high cost of textbooks is by encouraging faculty to use eBooks that are freely available to our students. These eBooks are not restricted by DRM and allow unlimited user access through the library as alternatives to traditional textbooks.

As a way to promote this program, the library created a simple SQL database, the Faculty eTextbook Database (Figure 3),4 with all the eBook titles the library owned, as well as titles the library could purchase, for faculty to use in their classes. Much like the streaming video database, the searchable information in this database is limited compared to the catalog. The user can search by author, ISBN, or keyword, which is taken from either the title or the subject assigned by the publishers. The information that drives the database comes directly from the publisher as an Excel file, either emailed directly to the library or acquired through the publisher’s administration portal, and is reformatted in the system. We also provide detailed information to the faculty member using ProQuest Syndetics, which supplies a description and/or the table of contents for the book. Titles the library owns have a green dot next to them to show they are available; a yellow dot indicates that the library would need to order the book for the faculty member. The library tries to keep the database as up to date as possible by adding new titles monthly, quarterly, or yearly, depending on what can be obtained from the vendor. In the event a faculty member does not see a book they are looking for in the database, they can contact a dedicated email address to inquire about its availability and receive alternative options. The program overall has been successful in terms of increasing eBook usage and helping faculty save students the cost of textbooks.

A full account of the program can be found in the monograph Affordable Course Materials: Electronic Textbooks and Open Educational Resources, published by the Association for Library Collections and Technical Services.5

Since creating the database in 2015, Atkins Library has encouraged other libraries working on textbook affordability to consider implementing a similar program. As part of this encouragement, the library is happy to share the code created by its programmers with other schools so they can set up their own databases. We were happy to hear that the library at the University of South Florida took us up on our offer and has implemented a similar database. Links to these databases can be found in the references for this article.

Open Educational Resources Databases — Another trend we have been seeing in relation to supporting textbook affordability is the creation of databases that pull together all of the Open Educational Resources available to faculty from across the internet. Recently George Mason University created its Mason OER Metafinder (Figure 4).6 According to the database website, this metafinder will “simultaneously search OER Repositories.” Some of the OER repositories included in the metasearch are Merlot, OER Commons, Open Textbook Library, and MIT OpenCourseWare.
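The two behaviors the e-textbook database is described as providing, matching a query against author, ISBN, or title/subject keyword, and showing a green or yellow availability dot, can be sketched briefly. The field names and titles below are invented for illustration and are not the actual database schema.

```python
# Minimal sketch (invented records and field names) of e-textbook
# lookup by author, exact ISBN, or title/subject keyword, plus the
# green/yellow availability indicator described in the text.
etextbooks = [
    {"title": "Introduction to Widgets", "author": "A. Author",
     "isbn": "9780000000001", "subject": "Engineering", "owned": True},
    {"title": "Applied Gadgetry", "author": "B. Writer",
     "isbn": "9780000000002", "subject": "Engineering", "owned": False},
]

def find(q, records=etextbooks):
    """Match the query against author, exact ISBN, or title/subject keyword."""
    q = q.lower()
    return [r for r in records
            if q in r["author"].lower() or q == r["isbn"]
            or q in r["title"].lower() or q in r["subject"].lower()]

def availability_dot(record):
    """Green: the library owns the title; yellow: it would need to be ordered."""
    return "green" if record["owned"] else "yellow"

for book in find("engineering"):
    print(book["title"], availability_dot(book))
```

The owned/orderable distinction is just a boolean here, but it is the piece that turns a simple title list into a service: faculty see at a glance whether a title is ready now or requires a purchase request.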
This kind of database helps promote the use of Open Educational Resources and also helps faculty members navigate content that has been curated, as opposed to just using Google.

Homegrown Search Results and Platforms from page 21

Conclusion

Figure 3: UNC Charlotte Faculty eTextbook Database

One of the academic library’s ultimate goals is to connect students and faculty with reliable research and educational content. Although OPACs, discovery services, and the all-powerful Google are some of the most powerful search engines in the world, these resources do not always direct users to the unique or specialized content provided by the library. To continued on page 24




Homegrown Search Results and Platforms from page 22

help promote these resources, libraries have come up with some unique ways to display and market this content by creating local search results and databases. This article lists just a few examples, but there are many more out there, including local database lists, research guides, and search interfaces. It will be exciting to see what libraries come up with next!

Figure 4: George Mason — Mason Metafinder

Optimizing an Octopus ... from page 20

Endnotes
1. Singley, Emily, and Jane Natches. “Finding the Gaps: A Survey of Electronic Resource Management in Alma, Sierra, and WMS.” Journal of Electronic Resources Librarianship 29, no. 2 (2017): 71.
2. Jantzi, Leanna, Jennifer Richard, and Sandra Wong. “Managing Discovery and Linking Services.” The Serials Librarian 70, no. 1-4 (2016): 184.
3. Ibid.
4. “Holdings Management Usage Consolidation: Step-by-Step Guide.” EBSCO Help. Accessed January 23, 2018. https://help.ebsco.com/interfaces/Usage_Consolidation/Getting_Started_with_Usage_Consolidation/Holdings_Management_Usage_Consolidation%3A_Step-by-Step_Guide.
5. Jabaily, Matthew J., James R. Rodgers, and Steven A. Knowlton. “Leveraging Use by Publication Age Data in Serials Collection Decisions.” In Where Do We Go from Here? Proceedings of the 2015 Charleston Conference, pp. 292-302. 2015.
6. Antelman, Kristin. “Leveraging the Growth of Open Access in Library Collection Decision Making.” Proceedings from ACRL 2017: At the Helm: Leading Transformation (2017).
7. Pesch, Oliver. “The Rise and Shine of the Knowledgebase in the Age of Folio.” Presentation at the EBSCO User Group, Salt Lake City, UT, October 25-26, 2017. Accessed January 23, 2018. https://www.ebscousergroup.org/files/uploads/EBSCO_User_Group_Program_2017.pdf.
8. Singley and Natches, 73.
9. “IGeLU 2017 Documents.” Ex Libris Knowledge Center. Accessed January 23, 2018. https://knowledge.exlibrisgroup.com/Cross_Product/Conferences_and_Seminars/IGeLU/IGeLU_2017.
10. “WMS Global 2017 Community Insights.” OCLC Community Center. Accessed January 23, 2018. https://www.oclc.org/community/worldshare/global2017/2017CommunityInsights.en.html.
11. Singley and Natches, 81.
12. Jantzi, Richard, and Wong, 196.
13. Anderson, R. “When the Wolf Finally Arrives: Big Deal Cancellations in North American Libraries.” May 1, 2017, accessed January 29, 2018. https://scholarlykitchen.sspnet.org/2017/05/01/wolf-finally-arrives-big-deal-cancelations-north-american-libraries/.
14. Singley and Natches, 81.

Endnotes
1. Lynema, Emily, Cory Lown, and David Woodbury. 2012. “Virtual Browse: Designing User-Oriented Services for Discovery of Related Resources.” Library Trends 61, no. 1: 218-233. Library, Information Science & Technology Abstracts with Full Text.
2. Emily Singley, “To Bento or not to Bento — Displaying search results,” Usable Libraries, January 4, 2016, http://emilysingley.net/usablelibraries/to-bento-or-not-to-bento-displaying-search-results/.
3. http://library.uncc.edu/streamingvideo
4. http://library.uncc.edu/et
5. Elizabeth Siler, “Thinking Outside the Pages,” in Affordable Course Materials: Electronic Textbooks and Open Educational Resources (Chicago, 2017): 25-43.
6. https://mason.deepwebaccess.com/mason__MasonLibrariesOpenEducationResources_5f4/desktop/en/search.html

Rumors from page 8

open through March 30. https://scholarlykitchen.sspnet.org/2018/03/02/scholarly-kitchen-10th-anniversary-survey/?informz=1

ATGMedia is doing some surveys of our own. You may have heard from John Lavender about ATGMedia and the Charleston Conference. John works with Maverick Publishing and has his own consulting business. He and Jackie Ricords of IGI Global organized a preconference (publishers are not the enemy) for Charleston 2017 on November 7th, reviewing practical examples of publishers and librarians working together on joint projects in areas such as evidence-based acquisitions models, encouraging patron usage, textbook issues and libraries, open access books, etc. Relatedly, the on-top-of-it-all Maggie Farrell (UNLV), with Rick Branham and Barbara Kawecki, is guest editing the April 2018 issue of ATG on transforming library vendor relations. Looking forward to it!

The ATG Quirky for March 2 is about Growing up in a Library. How Living In A Library Gave One Man “The Thirst Of Learning” is a great post from NPR’s STORYCORPS that tells the story of Ronald Clark and his family. Ronald’s dad, Raymond, was a New York Public Library continued on page 38




Evaluating IR Platforms: Usability Criteria for End Users and IR Managers by Rachel Walton (Digital Archivist and Records Management Coordinator, Rollins College, Olin Library) <rwalton@rollins.edu> http://www.rollins.edu/library/yourlibrarian/rachel.html

As managers of electronic resources, librarians must be concerned with “user-friendliness” in two senses: the public user interface (PUI), where end users of all types interact, and the administrative interface (sometimes referred to as the “back end”), where library personnel execute most of the critical database and configuration work. These separate but equally important aspects of any library platform demand different tool capabilities and have completely different sets of stakeholders.

Institutional Repositories (IRs) are a “set of services that a university offers to the members of its community for the management and dissemination of digital materials created by the institution and its community members.”1 While Institutional Repositories have faced some well-deserved criticism over the years2 and do not benefit from a very intuitive title,3 they remain a common service that many libraries at institutions of higher education choose to provide. As technology platforms, IRs are no exception to the two-sided usability concerns described above.

IR end users represent “information seekers,” and these users, like other online searchers, basically “want to find information quickly and with a minimum fuss.”4 In contrast, administrators or managers of IR systems represent a group of “data maintainers”5 and might be better described as IR superusers. These “maverick managers”6 create metadata and documentation, upload most content, make quality assurance decisions, communicate with contributors, control website configuration settings, and generally oversee the IR’s collections on behalf of the entire institution. Of course, in the context of IRs, there is a third group to consider: authors/submitters. This group is not looking for information as part of a research investigation, but rather is depositing content into the system as an institutional contributor.
Thanks to over a decade of devoted librarians and their research, we actually know quite a bit about authors/submitters as IR users and stakeholders.7 Comparatively, we know far less about what end users (“information seekers”) and IR managers (“data maintainers”) desire in an IR platform and interface. While usability has been a concern and point of study in the world of IRs since their nascency, there has yet to be a consensus about what kind of usability criteria might set the standard for the variety of IR platforms on the market today. In addition, some of the previously proposed usability heuristics could use an update.8 As a digital archivist, IR manager, and usability researcher, I would like to take this opportunity to return to these earlier conversations about IR usability and offer some further criteria of my own that I believe librarians today can use to critically evaluate their IR platforms. I make a special effort here to consider the desires of end users and IR managers alike.

What do they (“information seekers”) want?

Kim’s pioneering work on IR usability lays out a set of seven criteria for evaluating IR interfaces.9 While Kim conducted his study in the early days of IR platform development, many of his guidelines for user-friendly IR navigation are easily updated to fit the demands of current IR interfaces (see numbers 1-7 below). Jakob Nielsen’s famous ten usability heuristics, which still serve as an industry standard, were the basis for much of Kim’s work, and are, therefore, also well represented in these ten criteria.10

1. Users are given multiple ways to search
2. Users are provided sufficient visual cues and guidance for searches
3. The interface employs the users’ language, not specialized jargon
4. Users are afforded a high level of control and freedom when navigating
5. Search result listings include useful metadata descriptors for each resource (but avoid information overload)
6. The user can browse a diversity of content in meaningful ways
7. Links to open digital content are centrally located and clearly presented to users
8. The contents of the site are highly visible and discoverable on the open web
9. The interface offers a pleasing aesthetic and minimalist design
10. Appropriate Web 2.0 features enhance interface functionality

How do we know these criteria serve the needs of information seekers? They told us. In a 2011 study of IR users, researchers discovered user dissatisfaction or frustration with several key usability elements.11 Reported usability obstacles, and the tasks with which they are associated, help us understand what end users really value in an IR interface.12 For example, St. Jean’s study cited poor browsability as a major obstacle in IRs, hindering users attempting general search strategies as opposed to known-item searches.13 Others simply expressed disappointment with the overall layout or organization of the IR, and similar studies have concurred that IR interfaces tend to lack basic functionalities and pose a steep learning curve for new users.14 At an even more elementary level, many interviewees in the 2011 study were uncertain about “what exactly constitutes an IR” and lacked the vocabulary required to explore it fully.15 In addition to website functionality, end users in the same study cited the IR’s lack of overall visibility on the web as a serious hindrance and suggested this as the reason why few of their peers knew of or used the IR website.16 Other studies have likewise listed discoverability as a critical aspect of IR interaction.17 Search engine indexing and optimization (SEO) has instigated a critical turn for IRs in this area, placing IR content in the hands of any online searcher rather than limiting it to a single institution in a kind of “information island.”18 Furthermore, some end users indicated that IRs did not align with their standards of an attractive, modern website.19 Users were disappointed with an overall lack of Web 2.0 components, and researchers suggested this as a major opportunity for future development.20 Of course, if we fast forward to today, we see that our end users have even higher hopes for the kind of usability enhancements they expect from a website. Everything from tagging and “favoriting” to auto-complete search boxes and pop-up tooltips are basic elements of a 2018 web user’s repertoire. In sum, our users’ bar for online research tools is high, and, therefore, if we hope to satisfy their expectations, our usability criteria must be rigorous.

What do we (“data maintainers”) want?

The question remains: what do we as IR administrators want from our IR interface? What are the features and capabilities that promote our efficiency and ease of use? Even less has been articulated in the scholarship around this topic. In 2007, McKay outlined six major issues facing IR “data maintainers.”21 Informed by McKay’s recommendations, and based on my own everyday interactions with the “back end” of an IR, I have outlined the following criteria (or perhaps it is more of a wish list!) of desirable administrative interface capabilities in IR platforms.

1. Useful and clear terminology, consistent with profession-specific vocabularies
2. Streamlined submission workflows with the ability to tailor steps according to submission type
3. Appropriate and reasonable metadata requirements
4. Autoformatting and other authority control measures to ensure “clean” data input
5. Permission-based viewing, editing, and content suppression options




6. Robust usage statistics and research reporting capabilities
7. Built-in system communications and an audit log for each record
8. Responsiveness to submission editing and content removal decisions
9. Acceptance and support of a diversity of file types and media formats
10. Flexible and customizable website configuration

What research can lend credibility to this extensive list of demands? While there is admittedly a dearth of information about the library community’s expectations for IR administrative interfaces, Cunningham has made an effort to collect feedback from librarians involved in content acquisition, data input, quality assurance, and other IR processes.22 In her study, she noticed that librarian IR managers were bogged down in the details of lengthy data entry workflows that significantly prolonged the deposit process.23 These data wranglers were confused and in disagreement about which fields were mandatory, and were frustrated by poor data error reporting due to unclear system wording. Other “slowdowns” reported by librarians included a lack of default options and the absence of help features such as example entries.24 According to Cunningham, correctness of entries was a real concern for librarians, who complained that there were no authority controls to ensure consistency in capitalization, name and date formatting, or other standard data types across an entire IR. Librarians also felt hamstrung when trying to revise, revisit, or remove content in the IR; workflows only seemed to “flow” one way, and making changes to content was a chore. Other workflow complaints involved poor communication with authors, uncertainty about what stage a submission was in at any given time, and a lack of clarity about the roles and privileges of other IR users.
Librarians also recognized that usage stats and research analysis tools could potentially help in achieving faculty buy-in with IRs, yet those components of the IR interface in question were found to be very limiting.25 In my own experience with IR administrative interfaces, I value system flexibility and customization above all else. Scholarly output is moving beyond the realm of the traditional manuscript or journal article, and since graduate students are far more likely than established faculty to utilize IRs as a research tool or publishing platform, IRs should be particularly responsive to cutting-edge research trends like digital humanities products and multimedia scholarship.26 Today’s IRs must support a range of different non-text resources — datasets, audio files, streaming video, maps and coordinates, high resolution images, archival and museum artifacts, and even software.27 Therefore, IR managers must be able to tweak submission workflows, alter required fields, and provide specialized instructions to both submitters and end users according to the media type and format being deposited. Finally, good IR managers know that they are more than just “data maintainers”; they are also publicists and web managers, wielding the full power of their institutional brand. As such, they want to be able to control not just how collection contents are processed, but also how they are presented on the open web.

Using the Criteria

While there is not enough space herein to evaluate a particular IR platform based on the criteria I have proposed, I sincerely hope that other IR managers will pick up where I leave off with this discussion about IR usability. If we can come to a consensus about what the most valuable features of an IR interface might be, we can put pressure on our vendors and ask more of our tools, improving our web spaces and services in the long run. Librarians, how do your current IR platforms measure up? Do the criteria presented here check all the boxes of your use case? Are there critical requirements that I have overlooked? Furthering this discussion will require conversation, comparisons, and competitive analyses.


Endnotes

1. Clifford Lynch, “Institutional Repositories: Essential Infrastructure for Scholarship in the Digital Age,” Libraries and the Academy 3, no. 2 (2003): 327-336.
2. Dorothea Salo, “Innkeeper at the Roach Motel,” Library Trends 57, no. 2 (2008): 98-123.
3. Andrew Richard Albanese, “Thinking Beyond the Box,” Library Journal 134, no. 4 (2009): 27.
4. Dana McKay, “Institutional Repositories and Their ‘Other’ Users: Usability Beyond Authors,” Ariadne 52 (2007): http://www.ariadne.ac.uk/issue52/mckay/.
5. Ibid.
6. Salo, “Innkeeper at the Roach Motel.”
7. Alma Swan and Sheridan Brown, “Authors and Open Access Publishing,” Learned Publishing 17, no. 4 (2004): 219-224; Timothy Mark and Kathleen Shearer, “Institutional Repositories: A Review of Content Management Strategies,” in World Library and Information Congress: 72nd IFLA General Conference and Council, Seoul, Korea (2006); Jihyun Kim, “Faculty Self-Archiving: Motivations and Barriers,” Journal of the American Society for Information Science & Technology 61, no. 9 (2010): 1909-1922; Jihyun Kim, “Motivations of Faculty Self-Archiving in Institutional Repositories,” Journal of Academic Librarianship 37, no. 3 (2011): 246-254; Ellen Dubinsky, “A Current Snapshot of Institutional Repositories: Growth Rate, Disciplinary Content and Faculty Contributions,” Journal of Librarianship & Scholarly Communication 2, no. 3 (2014): 1-22; Zheng Ye Yang and Yu Li, “University Faculty Awareness and Attitudes towards Open Access Publishing and the Institutional Repository: A Case Study,” Journal of Librarianship and Scholarly Communication 3, no. 1 (2015): 1-29.
8. Jihyun Kim, “Finding Documents in a Digital Institutional Repository: DSpace and Eprints,” Proceedings from the American Society of Information Science and Technology 42 (2005): doi:10.1002/meet.1450420173.
9. Ibid. [Note: Kim’s original 7 criteria were: “(1) give users an adequate number of search options; (2) provide examples of search query; (3) employ users’ language; (4) allow users greater control and freedom; (5) display useful components in result sets; (6) list search results in a useful way; and (7) clearly present links to open digital documents. Implementing these guidelines would improve the user’s experiences when using digital institutional repositories.”]
10. Jakob Nielsen, “10 Usability Heuristics for User Interface Design,” Nielsen Norman Group, 1995, https://www.nngroup.com/articles/ten-usability-heuristics/.
11. Beth St. Jean, Soo Young Rieh, Elizabeth Yakel, and Karen Markey, “Unheard Voices: Institutional Repository End-Users,” College and Research Libraries 72, no. 1 (2011): 21-42.
12. More recent scholarship has shown how developing user profiles can help us understand typical needs, tasks, and problems encountered by different kinds of end users. See Maha Aljohani and James Bluestein, “Personas Help Understand Users’ Needs, Goals, and Desires in an Online Institutional Repository,” International Journal of Computer and Information Engineering 9, no. 2 (2015): 631-632.
13. Ibid., 30. Searching and browsing was also a major usability issue cited in Hyun Hee Kim and Yong Ho Kim, “Usability Study of Digital Institutional Repositories,” The Electronic Library 26, no. 6 (2008): 863-881.
14. St. Jean, “Unheard Voices,” 30; Philip Davis and Matthew Connolly, “Institutional Repositories: Evaluating the Reasons for Non-use of Cornell University’s Installation of DSpace,” D-Lib Magazine 13, no. 3/4 (2007): http://www.dlib.org/dlib/march07/davis/03davis.html.
15. St. Jean, “Unheard Voices,” 28-29.
16. Ibid., 30.
17. Davis and Connolly, “Institutional Repositories: Evaluating the Reasons for Non-use… .”
18. Ibid.; Jennifer Howard, “Digital Repositories Foment a Quiet Revolution in Scholarship,” The Chronicle of Higher Education 56, no. 38 (2010).
19. St. Jean, “Unheard Voices,” 35; Hyun Hee Kim and Yong Ho Kim, “Usability Study of Digital Institutional Repositories,” 868.
20. St. Jean, “Unheard Voices,” 40.
21. McKay, “Institutional Repositories and Their ‘Other’ Users… .” McKay cited the following challenges for IR administrators: (1) confounding terminology; (2) inefficient and confusing deposit or editing processes; (3) unreasonable metadata requirements; (4) insufficient formatting and authority controls; (5) the inability to suppress content or control privacy settings; and (6) less-than-helpful data gathering capabilities for the purposes of research reporting or usage statistics.
22. Sally Jo Cunningham, “An Ethnographic Study of Institutional Repository Librarians: Their Experiences of Usability,” Proceedings from the Open Repositories Conference, San Antonio, TX (2007).
23. Ibid. A 2006 University of Michigan usability study showed similar frustrations for IR administrators, who found data entry and data editing pages “very long and undifferentiated visually, so finding the place to make changes can be difficult.” See Jim Ottaviani, “University of Michigan DSpace (a.k.a. Deep Blue) Usability Studies: Summary” (2006): 6. http://deepblue.lib.umich.edu/bitstream/2027.42/40249/1/Deep_Blue(DSpace)_usability_summary.pdf.
24. Cunningham, “An Ethnographic Study… .”
25. Ibid.
26. Laura Waugh, Jesse Hamner, Janette Klein, and Sian Brannon, “Evaluating the University of North Texas’ Digital Collections and Institutional Repository: An Exploratory Assessment of Stakeholder Perceptions and Use,” The Journal of Academic Librarianship 41 (2015): 744-750.
27. David Nicholas, Ian Rowlands, Anthony Watkinson, David Bowen, Bill Russell, and Hamid Jamali, “Have Digital Repositories Come of Age? The Views of Library Directors,” Webology 10, no. 2 (2013): 6. http://www.webology.org/2013/v10n2/a111.pdf.




Beyond Bibliographic Discovery: Bringing Concepts and Findings into the Mix by Athena Hoeppner (Discovery Services Librarian, University of Central Florida) <athena@ucf.edu>

Introduction

Since their introduction in 2007,1 Web Scale Discovery (WSD) services, also known as Resource Discovery Services, have become the predominant tools for searching library collections and content in the broadest possible sense. Pre-harvesting metadata and content into one index lets discovery vendors normalize the indexing for efficient searching, apply consistent relevancy algorithms to create a ranked list of resource titles, design user interface features suited to the indexed metadata with fields and filters for refining results, and streamline linking to full text. Their current capabilities and content work well for bibliographic discovery but may not be the best solution for every information need. Despite encompassing a significant percentage of library content and offering richly featured interfaces with sophisticated relevance ranking algorithms and an ever-growing suite of functions, WSD services do not meet every type of user need, as evidenced by the continued existence and use of subject indexes and other databases. Many of these databases adhere to the same basic approach that WSD uses: enter a search term, match the terms in the index (which is mostly document metadata), and present a list of results (document citations) for the user to interact with. Two relatively new discovery tools depart from this approach and offer fresh takes on discovery.
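The index-then-match approach described above can be pictured as a toy inverted index. The documents, tokenization, and AND-matching below are an illustrative sketch only, not any vendor's actual implementation (real WSD indexes add stemming, field weighting, and relevancy ranking on top of this basic idea).

```python
from collections import defaultdict

# Toy corpus standing in for pre-harvested metadata (titles only).
docs = {
    1: "institutional repositories and open access",
    2: "usability of digital repositories",
    3: "web scale discovery services",
}

# Build the inverted index: term -> set of document ids.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query):
    """Return ids of documents matching every query term (AND search)."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        result &= index.get(term, set())
    return result

print(sorted(search("repositories")))          # [1, 2]
print(sorted(search("digital repositories")))  # [2]
```

A relevancy-ranked WSD result list would then order these matches by a scoring function; here only the matching step is shown.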

Yewno for Concept Exploration

The first thing most anyone familiar with WSD services will notice about Yewno is that a search produces a graphical representation of concepts and connections instead of the familiar document title list. The differences do not stop there. WSD services collect metadata and text to create an index; Yewno collects full text content and uses artificial intelligence (AI) to identify concepts and recognize relationships. While the discovery interface does have a search entry form, librarian-centric approaches such as Boolean searching, fielded searches, known-item searching, and even natural language searching do not work well for initial searches. Instead, Yewno works best when users enter one broad topic to start exploring. Yewno’s results display is a concept map. Circular nodes represent concepts, and lines show semantic relationships between nodes. To the side of the concept map, Yewno shows an overview of the selected concept, similar to the Wikipedia summary in Google results. The user can click a tab to see a list of concepts, or a tab to see the list of documents in which the AI found the concept. The initial nodes start with very broad concepts. Users can expand a broad concept into more detailed subtopics and their connections. They can also add additional concepts and control how many concepts and connections display in the map. I am sure there are many features that I have not had the chance to try out, and the interface and content are still evolving. Yewno’s method of processing content is AI driven, applying computational semantics, graph theory, and machine learning to the full text of ingested documents. The AI essentially reads the documents, similar to how humans read, infer meaning, and learn; the process results in a neural network model that creates inferences. Using full text instead of metadata avoids problems that stem from inconsistent metadata. In WSD, the variable accuracy and quality of metadata have a big impact on discoverability and relevancy ranking. In Yewno, metadata quality affects neither the AI’s ability to process a paper nor the surfacing of that paper in the results. While Yewno started with scholarly journals, the content included in Yewno Discover continues to grow and diversify. It contains over 125 million documents and is currently ingesting U.S. federal documents. Government document discovery has been very fragmented, with huge metadata differences and decentralized sources for the full content. In addition to adding the government documents to their discovery system, Yewno is hosting the content to provide stable, consolidated access. Yewno had processed approximately 17 million government documents as of late January 2018.
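The node-and-edge display described above can be modeled as a small labeled graph. The concepts, relationships, and the `expand` helper below are invented for illustration and bear no relation to Yewno's actual data model; they simply mimic a user expanding nodes on a concept map, one hop at a time.

```python
# Toy concept graph: each concept maps to the set of concepts it is
# semantically related to (edges are stored in both directions).
graph = {
    "machine learning": {"graph theory", "neural networks"},
    "graph theory": {"machine learning"},
    "neural networks": {"machine learning", "deep learning"},
    "deep learning": {"neural networks"},
}

def expand(concept, depth=1):
    """Return every concept reachable within `depth` hops of `concept`,
    mimicking a user expanding nodes on a concept map."""
    seen = {concept}
    frontier = {concept}
    for _ in range(depth):
        frontier = {n for c in frontier for n in graph.get(c, set())} - seen
        seen |= frontier
    return seen - {concept}

print(expand("machine learning"))           # direct neighbors (1 hop)
print(expand("machine learning", depth=2))  # 2 hops: adds "deep learning"
```

Controlling how many concepts display in the map corresponds to capping `depth` or the size of the returned set.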


Knowtro for Discovering Research Findings

Knowtro takes discovery in a very different direction than either WSD or Yewno. Where WSD indexes metadata and content for searching and display in a relevancy-ranked list, and Yewno analyzes the content and illustrates relationships between concepts, Knowtro is a discovery tool for research findings. Knowtro analyzes scholarly articles to uncover key findings from published articles. A search on Knowtro brings up a list of cards, each with a single findings statement. The findings are stated as relationships between a predictor variable and an outcome variable, with the size of the relationship between the two variables expressed as a percentage. The resulting statements on the card take the form: {predictor variable} increases/decreases/has no effect on {outcome variable} by {percentage}. For example: “Research Suggests: Job Level increases Career Satisfaction by 58%.” Knowtro targets learners, with the intent of making research more accessible by making it more broadly understood. Reading scientific articles is daunting, especially for novices who may not understand the terminology. Knowtro distills the research into easily parsed statements, making research findings exceptionally accessible. Knowtro’s vision of making research findings understandable for undergraduates and non-academics guides their content selection. To ensure quality, Knowtro selects papers from the top third of academic journals, based on impact factor. As a starting point, Knowtro looked at the most common undergraduate degrees. Within those disciplines, Knowtro chooses studies that will resonate with their target audience, with topics that are likely to be salient. The resulting source set is much smaller than WSD’s, but perhaps with particular relevance to undergraduates and a higher rate of very reliable findings presented in the sources. Knowtro includes over 70,000 findings. Knowtro’s interface offers a simple one-line search entry form. Like Yewno, Knowtro does not work with complex approaches to searching. One- or two-word concepts work best. Knowtro also provides a list of clickable concepts, which pull up results more reliably than searches. The search appears to run on the category and subcategories assigned to the findings. Neither article metadata nor the full text of articles is indexed for searching. Likewise, the results page offers no filtering or refinement features. The Detailed View of the results is information dense and requires a greater understanding of the Knowtro interface and icon set, and a good grasp of how research findings are expressed. The view shows a table with a row for each finding and columns for aspects of the research and finding. The table is sortable by clicking the column headers. The headers include: Unit; Report Outcome Variable; Time; Sample; Theory/Theme; r2; Author(s); and Year. I would love to be able to sort by more than one header and to see these elements available as filters in the Basic results display. Clicking a result in the table opens up even more details about the research and the finding, with fuller descriptions of the predictor and outcome variables, a description of the methodology used (in some cases), and the full citation to the original paper. In addition, there are buttons to see all the findings from the same paper, to show a formatted citation, and to link to the paper via DOI.
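The card format described above is easy to model as a small data structure. The `Finding` class and its `statement` method below are a hypothetical sketch of the {predictor} … {outcome} pattern, not Knowtro's actual schema; field names and wording are my own.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    predictor: str   # predictor variable
    outcome: str     # outcome variable
    effect_pct: int  # size of the relationship, as a percentage
    direction: str   # "increases", "decreases", or "has no effect on"

    def statement(self):
        """Render the finding as a plain-language card statement."""
        if self.direction == "has no effect on":
            return f"{self.predictor} has no effect on {self.outcome}"
        return (f"{self.predictor} {self.direction} "
                f"{self.outcome} by {self.effect_pct}%")

card = Finding("Job Level", "Career Satisfaction", 58, "increases")
print("Research Suggests:", card.statement())
# Research Suggests: Job Level increases Career Satisfaction by 58%
```

A real system would extract these fields from the article itself; the point here is only that each card reduces a paper's finding to one structured, searchable statement.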

Potential Interactions for WSD, Yewno, and Knowtro

The products have potential for interesting interactions, pulling content from one system into others via widgets and APIs. I imagine a use case where a student starts in the familiar discovery search form on a library home page. They would see the typical relevancy-ranked documents list, but with a callout that shows the concept mapping for the Subjects with the highest number of hits. The user could click



into the widget to open the full Yewno interface and continue exploring there. In both the WSD and Yewno document lists, documents with Knowtro findings could show the top Findings card with an icon to see more findings. Ideally, there would be a filter to show only results that have Knowtro Findings. I am sure there are many other ways to make these products interact and enhance the knowledge discovery experience for users.

Conclusion

Yewno and Knowtro are exciting developments in discovery. They shift the emphasis away from metadata and document discovery and rely on the full content of documents to enable concept exploration and bite-sized research findings. Their approaches to content handling are truly distinct, and their discovery interfaces are also innovative, if not without some predecessors and contemporaries. I admit my first impression of Yewno brought to mind various early approaches to using concept maps in discovery. While many concept map tools of the past were confusing and slow, Yewno may have created a version that works well. They are not alone in giving the concept map another go. IEEE’s InnovationQ has a very colorful concept map as a component of its interface, and I suspect there are many examples that I’m not aware of. Knowtro also reminded me of products from the past, especially FirstSearch’s FactSearch. Extracting snippets of content from articles and other sources is not a new idea, and several products still do just that, including RDS TableBase, ProQuest’s Statistical Insight, and other databases. However, Knowtro takes the extraction a step further.

Rather than simply reproducing a snippet unchanged, Knowtro transforms content that is difficult to understand and nearly impossible to search into something easily grasped and discoverable. WSD, Yewno, and Knowtro each have their strengths and advantages. I cannot see Yewno or Knowtro supplanting the dominance of WSD for document searching, nor do I expect them to pull library spending from WSD. Yewno Discover is a subscription service with FTE-based pricing. At the time of writing, Knowtro is 100% free, with the full version accessible on their website. All three of these tools help users address information needs and connect to content in valuable ways. I hope they continue to develop, find their markets, and improve discovery for our students and researchers.

Links to the Products

Yewno: https://about.yewno.com Knowtro: https://www.knowtro.com

Special thanks to Michelle Valiani, Channel Partner Manager, and Ruth Pickering, Co-Founder & Chief Business Development and Strategy Officer of Yewno, and to Sean R. McMahon, Founder and President of Knowtro, for the long, informative conversations. Without their patient explanations I would not have been able to write this article.

Endnotes

1. Breeding, Marshall. 2014. “Chapter 1: Discovery product functionality.” Library Technology Reports no. 1: 5. General Reference Center Gold, EBSCOhost (accessed January 26, 2018).

Vendor Platforms — Tools for Efficient Library Acquisitions by Justin Clarke (Director of Sales and Marketing North America, HARRASSOWITZ) <jclarke@harrassowitz.de>

Vendors supplying materials to libraries began developing online platforms or systems years ago in order to address the need to interface with libraries in an online environment. By allowing customers to work online, vendors can offer better presentation of material availability, improve acquisitions efficiencies, maintain and store historical purchasing activity, and more. These platforms may vary greatly in usability and performance, but in general they are meant to enhance the experience of working with vendors by allowing customers to research, tag, order, and track acquisitions all from the comfort of their online workspace.

As a global subscription agency and book dealer to academic and research libraries, HARRASSOWITZ maintains vendor systems to provide a friendly user experience to customers for managing acquisitions of all types of materials. Having been in continuous operation since 1872, we carefully evolved from traditional snail mail relationships with customers to embracing the world-wide web for its speed and efficiency. The days of tracking down damaged or missing paper orders, claims, and invoices are (mostly) behind us. Vendor online platforms are central to providing robust information and workflows to library


customers working in all areas of collection development, acquisitions, technical services, assessment, and beyond. HARRASSOWITZ, for example, launched its first online system for managing subscriptions and standing orders in 1994 at the request of customers who wanted an easy-to-use platform that provided access to the full HARRASSOWITZ serials database. Later, in 2000, HARRASSOWITZ released the first version of OttoEditions, the online database for monographs and music scores, which launched with web-based searching, library ordering, and claiming functions. These early systems met the needs of customers who no longer wished to peruse paper catalogs and generate orders dispatched via mail. In 2005, OttoSerials, the HARRASSOWITZ online database for serials and continuations, was further upgraded. OttoSerials moved to an entirely web-based environment, and additional functionality and profiles were added to maintain information about an institution and the descriptors used for providing quotes and verifying pricing. As e-journals quickly gained popularity throughout the academic research community, subscription vendor systems were

again enhanced to accommodate additional e-resource-related data such as format preferences, IP ranges, and proxy information. As technology rapidly advances, vendors respond to market needs by improving and refining their platforms to meet library workflow and data demands. In 2016, HARRASSOWITZ launched its Fokus system, which replaced the OttoSerials interface for maintaining subscriptions and standing orders, and development is currently taking place to migrate OttoEditions into the Fokus system for the management of approvals, firm orders, and music scores. Again, the latest technologies for web development were utilized, search engine speed and device optimization were improved, and graphical interfaces and best practices were employed to provide a modern and enhanced user experience. Library supply vendors frequently consult with users when developing or enhancing online platforms via focus groups, library advisory boards, in-house visits, and surveys. HARRASSOWITZ also solicited customer feedback throughout the entire process of conceptualization, mockups, prototypes, early adopters, migration, and post-migration launch. It’s important to offer feedback to your vendors




as to where existing online services could be further developed or new services introduced to their platforms. Subscription vendors in general maintain a vast wealth of information pertaining to publishers’ offerings and present this information in a standardized format for users to easily recognize and understand. Customers require quick searching and retrieval of data concerning subscriptions and their formats, rich bibliographic information, information about publisher mergers and splits, claiming cycles, coverage, backfile availability, platform and access information, post-cancellation access rights, licensing information, standard terms and conditions, and all kinds of helpful links that need to be entered, verified, and maintained. Much of the data provided by publishers and content providers, however, is not prepared and communicated in a standardized format such as ONIX-PC. Vendors must scrub data, analyze it for accuracy, and incorporate it into their online platforms so that users have one interface to research subscription information. Thus, the burden of gathering, scrutinizing, and organizing such a corpus of information is shifted from each individual library to the supply vendor. Data is also harmonized so that online management reports can be generated by the user for analysis purposes. The rise of the Big Deal as a purchasing mechanism for electronic journal content also requires subscription vendors to adjust their subscription databases to reflect large bundles of content acquired through the package model. Not only do title entries need to be linked and referenced, but online renewal workflows also need to be adjusted to reflect what can and cannot be done within the confines of a multi-year deal governed by a license negotiated directly with the publisher.
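The harmonization work described here — mapping heterogeneous publisher fields onto one in-house schema — can be sketched as a simple field-mapping pass. The source field names, target schema, and sample values below are all invented for illustration; they are not ONIX-PC or any vendor's real data format.

```python
# Hypothetical field maps from two publishers' feeds to a common schema.
FIELD_MAPS = {
    "publisher_a": {"jtitle": "title", "issn_print": "issn", "yr": "year"},
    "publisher_b": {"TitleText": "title", "ISSN": "issn", "PubYear": "year"},
}

def harmonize(record, source):
    """Map a raw publisher record onto the common schema,
    dropping fields the schema does not know about."""
    mapping = FIELD_MAPS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

raw_a = {"jtitle": "Journal A", "issn_print": "1234-5678", "yr": "2017"}
raw_b = {"TitleText": "Journal B", "ISSN": "8765-4321", "extra": "ignored"}

print(harmonize(raw_a, "publisher_a"))
print(harmonize(raw_b, "publisher_b"))
```

Once every feed lands in the same schema, the management reports mentioned above can be generated from one consistent table rather than from each publisher's idiosyncratic format.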
Book vendors likewise load data from a variety of sources, often including national bibliographies that vary historically in both the content of the data they provide to agencies and the format in which it is delivered. Again, agencies are tasked with harmonizing such data and mapping it to fields in their in-house systems for display to users in a common vendor interface. The rise of eBooks has also complicated the vendor platform environment, much as electronic journals did in the early years of their development and adoption. Book vendors have had to add platform information to traditional book records, along with acquisition details such as single- or multi-user access, all the while leading users to the title and format they are looking to order through an intermediary. In addition to presenting the availability of publisher materials, library supply vendors maintain a historical record of purchases made by libraries through their services. Tracking items through order, shipping, and delivery is expected in vendor platforms, as

30 Against the Grain / February 2018

well as accounting and invoicing data. Some libraries record their internal fund codes and purchase order numbers, library billing and shipping addresses, and location and selector data. As libraries move to next-generation Integrated Library Systems and migrate their data accordingly, library supply vendors act as an important resource for verifying historical acquisitions data and can provide management reports to assist libraries transitioning to new systems. Customization of vendor systems is also important to accommodate the varying sizes, types, and expectations of library customers. The ability to set appropriate user permissions and defaults is needed to control who submits transactions to vendors and how. The ability to modify field labels grants flexibility to users and their respective library workflows. Some supply vendors also let the library pre-populate fields with dropdown values, which ensures uniformity and avoids keying errors. Considering the wealth of information that exists in vendor systems and the huge number of transactions occurring annually, it is of utmost importance for library supply vendors to work with other entities in the market to share data electronically. By moving away from individually keying data for orders and invoices, considerable staff time is saved by both library and vendor, and data entry becomes less error-prone. While we do live in a world of email convenience, it is still faster and more reliable for systems to output and ingest data that are replicated across systems. The past two decades have seen considerable advancement in the electronic sharing of information between vendor platforms and integrated library systems.
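As a rough illustration of what such machine-to-machine exchange involves, the sketch below composes a simplified batch order message. The segment tags loosely follow the UN/EDIFACT ORDERS message (BGM for the message header, LIN for a line item, QTY for quantity, UNT to close the message with a segment count), but this is a stripped-down illustration, not a conformant interchange; the order reference and ISBNs are invented.

```python
# Sketch: composing a batch EDI-style order as one message rather than
# keying each title by hand. Simplified from the UN/EDIFACT ORDERS layout;
# reference numbers and ISBNs are invented for illustration.

def orders_message(order_ref, lines):
    """Build a simplified EDIFACT-style ORDERS message for a batch of titles."""
    segments = [f"BGM+220+{order_ref}"]          # 220 = order
    for i, (isbn, qty) in enumerate(lines, start=1):
        segments.append(f"LIN+{i}++{isbn}:EN")   # EN qualifies the item as an ISBN
        segments.append(f"QTY+21:{qty}")         # 21 = ordered quantity
    # UNT closes the message with a segment count for integrity checking.
    segments.append(f"UNT+{len(segments) + 1}+{order_ref}")
    return "'".join(segments) + "'"

batch = [("9780000000001", 1), ("9780000000002", 2)]
print(orders_message("PO-1042", batch))
```

The value of the format is that an entire shipment’s worth of orders, claims, or invoices travels as one verifiable message that the receiving system can parse without rekeying.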
In 1992, for example, HARRASSOWITZ actively participated in the design and development of UN/EDIFACT-based standards for book EDI transactions between libraries and booksellers as a full project member of EDILIBE (Electronic Data Interchange for Libraries and Booksellers in Europe). Additional advances came in 1995, when EDI standards for the complete cycle of monograph transactions were implemented by HARRASSOWITZ and other library supply vendors. The next two decades brought further integration of ordering and billing systems with library ILSs. At HARRASSOWITZ this meant testing and implementing EDI ordering for monographs and music scores, as well as EDI invoicing of all HARRASSOWITZ products: monographs, approval plan materials, music scores, subscriptions, standing orders, and databases. These processes are largely batch transactions in which a large number of titles are ordered, claimed, or invoiced all at once rather than submitted electronically one at a time. Many book vendors

also offer cataloging records that can be provided in batch for uploading into a library’s online system for display to patrons. Provisional order records are also frequently offered as a way for a library to get a record into its system before the item has arrived, further reducing the possibility of duplication. The past few years have seen the advancement of API (Application Programming Interface) use by next-generation library systems as a means of communicating with library supply vendors, along with a whole host of other useful applications. In 2015, for example, HARRASSOWITZ entered into a collaborative partnership with an ILS vendor to develop APIs addressing their mutual customers’ need for further improved acquisitions workflows. The collaboration has provided library staff with a new, streamlined acquisition process compatible with both the integrated library system and the HARRASSOWITZ acquisitions systems. In addition, the collaboration benefits libraries by reducing costs associated with acquisitions and avoiding unnecessary spending. HARRASSOWITZ customers can work directly in the OttoEditions environment with the same ease of workflow they are accustomed to when placing orders, while also taking advantage of the real-time ordering API. When the order button is pressed in OttoEditions, the API immediately searches for a record in the ILS and generates an order. Should the title being ordered not be found in the ILS, a record is generated on the fly to populate it. Order details are then returned to HARRASSOWITZ via an API that immediately displays the order number in the order record in OttoEditions. As a result, librarians are spared the task of replicating these transactions in the integrated library system interface.
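The real-time flow just described (look up the title in the ILS, create a brief record on the fly if it is absent, place the order, and return the order number to the vendor platform) can be sketched as follows. The `MockILS` class and its method names are invented stand-ins for a real ILS API, not the actual HARRASSOWITZ or ILS-vendor interfaces.

```python
# Sketch of a real-time ordering flow between a vendor platform and an ILS.
# MockILS is a hypothetical stand-in; its method names are invented.
import itertools

class MockILS:
    def __init__(self):
        self.records = {}                        # isbn -> brief bibliographic record
        self._order_numbers = itertools.count(1)

    def find_record(self, isbn):
        return self.records.get(isbn)

    def create_record(self, isbn, title):
        self.records[isbn] = {"isbn": isbn, "title": title}
        return self.records[isbn]

    def create_order(self, record):
        return f"ORD-{next(self._order_numbers):05d}"

def place_order(ils, isbn, title):
    """Vendor-side handler for the 'order' button."""
    record = ils.find_record(isbn)
    if record is None:                           # record generated on the fly
        record = ils.create_record(isbn, title)
    order_no = ils.create_order(record)
    return order_no                              # displayed back in the vendor's order record

ils = MockILS()
print(place_order(ils, "9780000000001", "Example Title"))  # → ORD-00001
```

The design point is that the round trip happens at order time, so neither the library nor the vendor re-keys the transaction in a second system afterward.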
Other library system vendors have developed and implemented the same or similar API functionality, and a NISO Working Group is tasked with developing an API toolset that can be used throughout the information community. In the past 25 years, the library supply vendor community has embraced emerging technologies to advance the online platforms customers use to identify and obtain resources for their patrons. From simple web catalogs to faster and more user-friendly acquisitions modules, vendor platforms play a key role for acquisitions and technical services departments, as well as for subject specialists, accountants, and assessment librarians. These platforms are used for researching availability, tracking current orders and content, and preserving historical purchasing data. Library supply vendor systems and platforms are constantly being revised, expanded, and retooled to take advantage of emerging technologies, and they will continue to meet the challenges and ever-evolving needs of customers.




Op Ed — Opinions and Editorials

Op Ed — Open Data, the New Frontier for Open Research by Tim Britton (Managing Director of Open Research at Springer Nature) <researchdata@springernature.com>

Open science should be about opening up all areas of research, including the data underlying it. The evidence clearly demonstrates that open data and good data management make research more productive and more likely to be cited, and unlock innovation for the good of society, including unexpected new discoveries and economic benefits. Yet while open access to research articles and books is now increasingly the expected norm, and while up to half of the world’s research data are shared (which is still not enough), far less data are made openly available (let alone FAIR: Findable, Accessible, Interoperable, and Reusable). But why is this? Why is open data not yet the norm, fifteen years after the National Institutes of Health (NIH) released its trail-blazing 2003 Statement on Sharing Research Data? The NIH statement reads: “We believe that data sharing is essential for expedited translation of research results into knowledge, products, and procedures to improve human health. The NIH endorses the sharing of final research data to serve these and other important scientific goals. The NIH expects and supports the timely release and sharing of final research data from NIH-supported studies for use by other researchers. Starting with the October 1, 2003 receipt date, investigators submitting an NIH application seeking $500,000 or more in direct costs in any single year are expected to include a plan for data sharing or state why data sharing is not possible.” And this statement predates NIH’s 2004 Public Access Policy which, together with the Wellcome Trust, led the way in funders encouraging, and then requiring, open access to articles, with the consequent growth of openly available article content.
Why is it that, since the World Wide Web was invented in 1989 to help researchers connect and collaborate, nearly thirty years on the majority of the world’s research data still lies dusty, unloved, and undiscoverable in the proverbial or literal desk drawer? I believe macro trends are at play here that help explain why open article accessibility has outstripped the accessibility of the underlying data: first and


foremost is that the first is easier to achieve. The tools emerging from our digital revolution are often designed to deliver pre-formed, consumable content to users. Whether it is YouTube footage, financial news, or academic articles, digital dissemination and consumption tools are widely available and easy to use for this purpose. Consumer culture expects consumable content to be freely available: from music and film to news and analysis, paywalls are few and far between. This is how the world has changed in the last 15 years, and it is no surprise that academic publishing is following suit with open access. Sharing data in a meaningful, usable way requires more complex tools and a degree of user expertise. There is a cultural point here too, in that the underlying building blocks of content (in our case, data) are not necessarily expected to be as freely available. But, as we are witnessing in other sectors, these structural and attitudinal barriers are starting to come down. Opening up data to make it useful to others is a challenge not unique to the research community. I came to scholarly communication and publishing from a long career in market research and in using data to draw new insights. Before joining Springer Nature, I was EMEA Chief Operating Officer of YouGov, the international Internet-based market research and data analytics firm, and later head of strategy and transformation for PwC’s global data research and insight centre, r2i. In both roles I saw a move away from a focus on data production and collection (or, more accurately, repetitive data creation) to a focus on reusing data already amassed, applying new experimental effort only where data had not previously been collected.
The benefits are clear: time and money are not needlessly spent repeating the work of others; researchers can focus on the analysis and interpretation of data rather than its production; and the unintended benefits of a particular dataset are much more likely to be uncovered. Indeed, since open government data has become the norm (from social statistics through weather

data to procurement records), there is clear evidence of both economic and societal benefit. A 2013 McKinsey study found that “seven sectors alone could generate more than $3 trillion a year in additional value as a result of open data, which is already giving rise to hundreds of entrepreneurial businesses and helping established companies to segment markets, define new products and services, and improve the efficiency and effectiveness of operations.” According to the Open Data Institute, “Those studies focused on the value of public sector open data alone found that it is worth between 0.4% and 1.5% of an economy’s GDP.” In the same way that the value of data reuse and sharing has been recognised in other sectors, there is great potential for it in academic research and publishing. The Human Genome Project is a success we can build on. From an investment of $14.5 billion by the U.S. government, the Wellcome Trust, and others, the whole human genome was sequenced and the data is open to anyone. The estimated contribution to the U.S. economy alone was $1 trillion in the first decade after this data became available, according to a report by the Battelle Memorial Institute. As a publisher, I am keen to recognise and acknowledge the importance of research articles and books. These are important summaries and conclusions of years of work for researchers. However, the real building blocks, and arguably the real value, of discovery lie in the data: the Excel tables sitting on a PhD student’s desktop, the thousands of microbiology images taken to produce an exquisite figure in a research paper, the coded responses to qualitative surveys. The world’s research data is a rich but untapped seam of new insights and unexpected connections. Today, we have the ability through algorithms to explore discarded datasets from experiments that produced negative or “null” results and potentially dig out a few precious gems from these forgotten mines.
Sharing this data ensures that the hours spent pipetting in a laboratory are never wasted, and that public funding is not spent needlessly repeating the same tests over and over.



Many funders are now driving data sharing through policy, including U.S. federal agencies, the Bill & Melinda Gates Foundation, the Howard Hughes Medical Institute, the UK Research Councils, the Wellcome Trust, and the European Commission. The NIH continues to drive change through its trans-NIH Big Data to Knowledge (BD2K) program. Yet survey after survey shows that researchers’ efforts to archive, publish, and share data continue to be hampered by time constraints and a lack of knowledge around data standards, metadata and curation expertise, repository options, and funder requirements. There are still few incentives for researchers to share their data, and many challenges, despite their motivation to do so. According to the recent State of Open Data 2017 (a Digital Science and Figshare report, produced in collaboration with Springer Nature and Wiley), the proportion of researchers sharing their data has grown, with an ever greater willingness to reuse open datasets. Eighty per cent of survey respondents are willing to share their data, and the same number are already using, or are open to using, others’ data. Yet only 60% of respondents make their data openly available “frequently” or “sometimes.” The most common ways of sharing data are still supplementary information in a journal article or peer-to-peer exchange, which often do not make data openly available or discoverable. We have to make it easier for data to be truly and usefully open. Over the past twelve months the Open Research team at Springer Nature has talked with librarians and funders, whilst continuing to explore the attitudes and behaviours of researchers. The concerns are clear: a lack of researcher expertise in metadata and data curation to describe data so that it can be found easily by others; the lack of established data standards; and more fundamental questions around identifying which data to archive and preserve, and where to deposit it.
Funders and institutions also want to support researchers in following best practice, ensure compliance with data sharing policies, and help bring about a change in culture in which data sharing and data management are the norm. Institutions and libraries have a key role to play in supporting researchers: helping them understand and comply with funder requirements, and establishing local research data management support, training, and solutions where needed. In many research institutions, libraries and research data management teams are now offering expert advice and support, often reskilling staff as experts in data management. Partnering with data initiatives, repositories, and other useful parties, including publishers, will help reduce potential duplication of effort and ensure sustainability.


Springer Nature is committed to supporting good research data management and data sharing. We want to help researchers who seek to adopt open approaches to their data wherever possible, and to partner with funders, institutions, and community initiatives such as the Research Data Alliance (RDA) and CODATA. Our starting point is to recommend and partner with community and general repositories, rather than keeping data in a proprietary publisher ecosystem. Our list of more than eighty recommended repositories is publicly available under a CC BY license and regularly updated. We have long advocated for data availability through editorial and journal policies, and in 2016 we launched our standardised journal data policies to further encourage good practice. As a publisher, you would expect us to offer publishing options to improve data sharing, and we do. Our flagship research data journal, Scientific Data, and our new data note article type in BMC Research Notes both offer authors publication credit, while making data easier to find, understand, and reuse. The insights we gained from researchers and institutions led us to consider how we could help them further, drawing on the experience and expertise we have built in data policy, standards, and curation. As well as a free helpdesk to advise researchers and editors on journal data policies and choice of repositories, we have been working on new solutions for researchers and institutions. In 2017, we introduced a new Research Data Support service to enable authors to share their data. Research Data Editors curate and enhance metadata to improve discoverability and encourage reuse in accordance with the FAIR data principles, and help authors draft data summaries and data availability statements to improve human readability and data linking and citation. This January we extended Research Data Support to make it available to all researchers. We also extended the service to institutions that want to support best-practice data depositing by their researchers, or to complement in-house research data management teams. Our Data Curation Editors, Research Data Editors, and in-house data policy experts are now available for training, including on policy and best practice in data management, and for hands-on data curation workshops. At the Research Data Access and Preservation (RDAP) Summit in Chicago this March, we will consider the “Intersection of Publishing and Data” and share some of our learning from the implementation of our Research Data Support service. This will include how data curation standards have been developed and how the data services integrate with the established publishing workflow. Also in March, we will reveal the results of a recent survey exploring how researchers are sharing data associated with published research. Open science offers huge opportunities through open data. Good data practice should speed the pace of discovery by increasing reproducibility, making research more productive, and delivering a return on investment to the global economy. As with open government data, we now need the infrastructure, funding, and skills embedded in the research community to make data sharing and open data the new normal in academic research. Many of the building blocks exist today, including positive researcher attitudes, community repositories, funder policies, data publishing options, and the growing research data management expertise in institutions, driven by libraries and librarians. Yet the open data challenge is too big, and the potential benefits too great, for a fragmented approach. We need to join up and collaborate to find shared solutions, and look to and learn from other industries. By working together, governments, funders, institutions, publishers, and libraries can unlock the huge potential of open data to improve our knowledge, the global economy, our health, and our environment.

Tim Britton, Managing Director of Open Research Group, Springer Nature Tim Britton joined Springer Nature in October 2016 as Managing Director of the Open Research Group. With decades of experience working in the marketing and research information industries, Tim is responsible for the open research portfolio across Springer Nature, including BioMed Central, SpringerOpen, the open access journals from Nature Research and open access monographs from Springer and Palgrave Macmillan. Prior to joining Springer Nature, Tim was Head of Strategy & Transformation at r2i (PwC’s global data research and insight centre of excellence). Before that, he spent eight years at international Internet-based market research firm YouGov in a number of leadership roles, most recently as Chief Executive UK and Head of Global Reports. To find out more about the new Springer Nature research data services available to institutions, please email <researchdata@springernature.com>.




ATG Special Report — Messing with Success: The Charleston Library Conference FutureLab by Mark Sandler (Principal Consultant, Novel Solutions) <mark@novelsolutions.net> www.novelsolutions.net and Heather Staines (Director of Partnerships, Hypothesis) <heather@hypothes.is> hypothes.is “Don’t mess with success” is a facile aphorism that, more often than not, should be ignored by organizational leaders. It flies in the face of another, more venerable adage: “don’t rest on your laurels.” In one way or another, organizations of every stripe — for-profit, not-for-profit, governmental, educational, whatever — need to continually weigh these contradictory instincts: to tinker or not to tinker. In general, most smart — and successful — organizations are continually looking ahead, gauging the prevailing winds, and tacking left or right accordingly. That said, we all know that predicting the future is an imperfect science, destined to be wrong by X degrees to the left or right. Still, looking ahead and taking a best guess is certain to be far better than flying blind into the coming storm. All praise to those leaders who TRY to look forward and reposition their organizations for smooth sailing. As successes go, the Charleston Library Conference (CLC) has to be one of the prime exemplars in the library and scholarly publishing space. The Conference’s founder and godmother, Katina Strauch, reminds us that “it all began in 1980 with 20 registrants, no exhibitors and no concurrent sessions.” Thirty-seven years later, conference attendance approaches 2,000, with 250 exhibitors and hundreds of professional presentations — plenary, concurrent, posters, Neapolitans, Lively Lunches, etc. For many, the annual CLC is the social, professional, and commercial highlight of the year.
At this year’s Vendor Showcase, an account manager said, “I do more business here in six hours than in four days at ALA or ACRL.” The conference has also spawned publications (Against the Grain, The Charleston Advisor, The Charleston Briefings), podcasts, newsletters, and other broadcast communications that contribute to the continuing professional development of librarians and publishers alike. Clearly, the CLC has become a central institution in support of libraries and publishing. Like the “little engine that could,” the Conference has chugged uphill over the past thirty-five years to become a central professional marketplace for the goods, services, and ideas that define the scholarly information space. Why, then, take the time to question the positioning of the Conference? It ain’t broke, so what are we trying to fix? Katina would be the first to admit that the Charleston Library Conference has enjoyed greater success than she could ever have dreamed back in 1980. That said, she recognizes that the profession has evolved enormously over the past thirty-five years, and this evolution has been accelerating in recent years as technology permeates every aspect of the information, publishing, and educational communities. As these changes accumulate exponentially, past success is not a reliable predictor of future staying power. What will a relevant professional conference look like in five years? Who will participate, and what topics will be covered? These are important questions for an institution like the CLC, and we congratulate Katina for setting in motion a FutureLab to address them.

The FutureLab Process

For months in advance of the 2017 Conference, members of the CLC team began a series of conversations about how to structure ongoing discussions to stay abreast of trends in the library and scholarly publishing space and to use this input to tune the CLC program and support attendee participation. As part of this planning process, the CLC team had several discussions with leaders at the International Association of STM Publishers (STM), who have facilitated a futuring group for a number of years. In conversations with Eefke Smit, editor of STM’s annual Tech Trends reports, and Heather Staines, who participates in the STM futuring group, the CLC planning team was able to get a good sense of how that organization elicits and manages the input of its membership.


As Heather describes it, the STM Futurelab Committee is an informal group that meets each December in London, on the day before the STM week meetings, to talk about trends in scholarly communications. Each participant comes prepared to discuss three things they feel will impact the publishing space in the next few years, as well as over the longer term. About thirty people typically attend, present their trend ideas, and then try to hash out which will be the most significant and how (and whether) trends connect to one another. The conversation is brought to life later in an infographic that is shared throughout the following year. It was apparent that Katina and the CLC planning team were on the same wavelength as STM’s leadership about the importance of tracking trends to remain relevant going forward. It therefore made sense for the CLC to adopt the outlines of the STM process and facilitate a “futuring” discussion for its attendees. The planning team sought out a range of voices from different library types and varying segments of the publisher and vendor communities. Thirty-five thought leaders from libraries and publishing were invited to participate in this year’s Charleston FutureLab (CFL). In advance of the meeting, participants were asked to submit “three important insights about the future of libraries and scholarly information.” As a result, the planners gathered over a hundred insights about where the professions (libraries, librarians, scholars, publishers, schools of information, vendors, tech companies, etc.) were heading. These insights, ideas, predictions, and pleas were shared with CFL participants in advance of the meeting and were organized in various ways to facilitate discussion.

Onsite

Through a variety of more or less standard techniques, the CFL facilitators sought to surface a degree of consensus around a handful of themes from the hundred-plus submitted statements. “Facilitation” was hardly needed, as our invitees were more than willing to pitch ideas in the absence of icebreakers or coaxing. Some of the ideas offered and captured by Don Hawkins (http://www.against-the-grain.com/2017/11/charleston-future-lab/) included:
• Curator of research input and output
• OA sustainability
• Collections as data sets
• Interoperability
• Discovery more important than collections
• Clarify the librarian identity
• Library research to inform praxis
• Align library and university missions
• Assert the value proposition of libraries
• Adapt to the future of reading
• Immediacy of services
• New publisher services
• Attend to local language needs and niche user communities
• Incorporate AI and machine processes
• Account for evolving research behaviors
• Assess new library roles, functions, and organizational structures
• Apply metrics and standards
• Use the power of research and libraries to solve real-world problems
• Update MLIS training
• Save time for the reader
• Empathize with users



By asking CFL participants to shift through discussion groupings and regroupings, this longer list (and others not reported here) was consolidated, and our contributors were eventually asked to highlight their choices for the most salient trends. The resulting list of the six most compelling themes/trends is expanded upon below (starting with the highest vote-getter):
1. Discovery
2. Interoperability
3. Reader/Researcher empathy
4. Return on investment
5. Artificial intelligence
6. Workflow

Key themes

Discovery — As noted above, “discoverability” emerged as the leading vote-getter among important trends confronting libraries and publishers. This surprised some contributors, who correctly noted that the phrase was mentioned by only a few of the participants in their initial identification of trends prior to the meeting in Charleston. One such submission (by a library director), however, was clear and to the point: “Discovery will become more important than collections as libraries move from collecting to facilitating access to resources that are owned, rented, purchased on demand, or freely available.” This participant, and others in agreement, based the centrality they accorded “discovery” on the proliferation of research content, the diversity of outlets for research, the shifting metrics for accrediting research, and the prevalence of open-access content available to users. While the “discovery” theme seemed to gain momentum over the course of the on-site CFL discussion, several of our experts noted the “squishy” nature of the word, i.e., that it seemed to mean different things to different discussants. One participant noted that “People define ‘discovery’ differently and in different contexts. It represents many different issues, and different perspectives on why people are interested.” Similarly, a second CFL participant said, “I don’t know what people are talking about when they say ‘discovery’ — it’s something we’ve always talked about with different terms. Is it one or another technology? The organizing infrastructure behind a technology? Is it a service, or, more generally, information literacy?” The point here is that, whereas “discovery” emerged as the leading trend for lab participants, it is difficult to convey exactly what that means for libraries and publishers, i.e., what actionable takeaways would help address our colleagues’ concerns about discovery.
One thread in the discussion did point to greater “tech literacy.” As one submission put it: “Libraries need to engage with technology more seriously, rather than mostly being passive users who learn each new ‘system’ or software chosen for them. Libraries are understaffed and underinvested in this aspect of work, saying technology isn’t really our job — we can outsource it. And we do. And are constantly surprised at the unpleasant results.” Others wrote: “Freely available content as threat: With nearly 50% of recently published journal articles freely available via Google Scholar, library systems need to be more effective at supporting discovery and use of this content.” And: “Fostering the next generation of information infrastructure, from discoverability of pre-prints, to cross-linking data to articles. What is the role of the library, and how should it partner with both non-profit and for-profit organizations?” It is undoubtedly true that there is ambiguity attached to such words and phrases as “discovery,” “discovery services,” and “discovery layer.” Librarians and publishers alike were reflecting a concern that researchers, students, and library users are struggling to make sense of the information environment and will increasingly turn to simplified
environments like Google if the so-called information specialists can't deliver the best content where and when it is needed, or, as one commenter put it, "ensuring that the most credible research can be discovered expeditiously."

Interoperability — While interoperable metadata standards have long been important to scholarly communications experts, there is now increasing user demand for interoperable tools and standard formats to access all manner of content. Libraries and publishers acknowledge a growing demand for tools and services that can seamlessly integrate with other tools, rather than locking a library or researcher into a proprietary flow. Several CFL participants noted the opportunity for libraries and publishers to provide higher levels of service to future information seekers. Participants acknowledged the underlying source of the tension around interoperability, noting that, "[W]ith a large number of publishers and aggregators — including libraries — there will be a large variety of user interfaces and tools. Even standards groups that look to make some aspects of the user experience uniform do not choose the colors, fonts, designs, and features offered by one or another online service." While this creates some problems for the user, it is also a source of potentially laudable qualities of "distinctiveness," "creativity," "improvement," and "progress." In other words, there are benefits attached to differentiation, but these need to be balanced against the problems encountered by users when they can't readily move information or communication over, under, or around the walls that separate user communities. One contributor interestingly noted that interoperability — as exemplified by OCLC's adoption of MARC, the ubiquity of PDF, and Google's ability to circumvent myriad providers and deliver content to users — is likely the outcome of monopolistic market control, be it benign or malicious.
Again, there was an understanding among participants that we are seeking a balance between seamlessness and diversity. Users across the internet need to easily interact with each other, and yet we want to encourage niche innovation. As one of our panel of experts proposed, "small technologies, widgets, apps, tools, should be easily added to larger, more encompassing technologies like operating systems and browsers. Interoperability comes about whenever one niche but agile technology plays a subservient role to a major technology." To this point of niche user needs, another participant cited the example of text-mining, encouraging greater collaboration to accelerate the development and application of new standards for formatting content. "The interoperability of distinct content objects will need to increase, as users will become used to applying analytical tools (especially artificial intelligence) to act on large corpora of content and extract information from that analysis. Current content, mainly in PDF form, is inappropriate for this purpose. Libraries and publishers will need to tweak their work processes and services to facilitate that growing demand." The upshot of this concern about interoperability — or the lack thereof — seemed to be that the competition among information providers — including libraries — to develop "THE BEST" user experience for a particular body of content undermines the overall user experience across systems and different bodies of content. Somehow, the industry needs to move to a broader perspective of user behavior and to work collectively to address users' needs and preferences. Cooperation, rather than competition, or, as some call it, "co-opetition," is needed to develop an overarching infrastructure that can integrate a full range of scholarly communication workflows, including multiple media types.
Empathy with Users — A number of the CFL contributors noted/predicted/encouraged an evolution of library work from a focus on content management to a broader concern for the information needs of the user communities libraries serve. As one participant put it, "I believe the skill that is most needed by successful libraries and librarians is empathy. The data and information environment will be continually changing — it is unlikely that anyone can fully grasp what is needed since our patrons are not sure themselves. However, if you have an empathetic outlook (which is the core of user design), then you can grow and adapt as the information
needs change. Without empathy, we are designing solutions that are looking for an audience. Let's figure out what our patrons need and move with them. Our future is service and we need to embrace it!" There were many takes on this theme of "empathy," and the need for libraries to develop more user-centric services: "Too many are making the short-term trade-off of hiring for specific technical skills rather than the broader, softer librarian skills needed to adapt appropriately to rapidly changing contexts." "The library needs to serve the whole student and the whole professor." "We think our users need certain things — e.g., content/services/support. This doesn't always (or even frequently) align with the things our users tell us they want. Should we keep trying to make our users care about the things we think they should care about (OA publishing, OERs, increased information literacy, etc.), or can we become better at listening to their expressed needs?" "We all need to get closer to the scholarly workflow and the place of libraries and publishers therein. Libraries are struggling to articulate a compelling vision of their role with respect to scholarly information and we believe it is crucial for scholarship and scholars that libraries do so." "Move into an even stronger role of serving as consultants to the faculty, students, and independent researchers at their respective institutions. Currently, librarians are viewed as the custodians of content. However, librarians need to further evolve their roles to not only being the custodians of content, but also the primary campus resource for directing faculty, students, and independent researchers to the most appropriate and VALID research results." One library director who participated in the CFL discussion put it succinctly: "The first course in library school should be in customer service.
The focus of librarian training should be on working with patrons and meeting their needs directly."

Return on Investment (ROI) — The discussion of "value" focused almost exclusively on libraries, although participants acknowledged that there was certainly room to address similar questions for publishing as the open-access movement advances. The following statement by a librarian is a pretty good framing statement for the tenor of the ROI discussion: "If we all woke up tomorrow and the library had disappeared, what impact would that have on the everyday lives of those who decide on our funding? The answer to this question will, to a very significant extent, determine our viability in the future." CFL contributors came down on all sides of this issue. Some asserted that the value of libraries will only rise in the future, as suggested by the following comments: "I do believe that libraries have never been more important. We have a unique opportunity to provide clarity for our users in helping them find the information that they need." "Libraries are revered and respected places throughout the world. They are known as places for study and contemplation. We need to preserve the library brand which is very powerful. Some of us have lost sight of this as we become coffee shops and computer labs or study halls." These kinds of faith-based statements about the future of libraries were called into question by a number of other participants who, in one way or another, suggested that urgent changes need to be undertaken to secure the future of libraries. "Increasingly invisible: The future of library-funded collections is in jeopardy because faculty and students are increasingly unaware that the content they are using is paid for by their library." "Spread too thin: As libraries invest in a wide array of new services intended to add value for their institutions, their conservative nature simultaneously threatens their ability to succeed
in new endeavors as well as the legacy roles they are reluctant to let go." "Traditional metrics like collection size, gate counts and number of FTEs need to be replaced by more sophisticated measures like cost-per-use, impact measures and engagement metrics." "Predictive analytics to deliver the most useful resources in a tailored way to individual students will also drive the value statement for libraries. This requires a sophisticated understanding of data and tools, and a corresponding need to understand the privacy implications and third party algorithms." "Marketing, writing business cases, and data analysis will be important for future librarians. Marketing services to students and faculty and understanding how to pitch ideas supported with data to administration or departments will help keep libraries relevant." One of our experts from the publishing side gave us this good summary statement of the ROI discussion: "Pundits say that 'the library has never been more important' than it is now. Why, specifically? What are they calling The Library? A place? A person? A suite of services? I don't agree we can take the value of libraries or publishers for granted. Neither are protected species under wildlife conservation."

Artificial Intelligence — AI, however we might define it, seems to be all the rage across numerous fields, including publishing and librarianship. While AI seems to hold the promise of freeing up humans from some current tasks, there are also concerns about a reduction or change in the jobs that remain. One participant asks, "Will AI-powered search engines become sophisticated enough that the need for information literacy will be replaced by the need to monitor and evaluate the impartiality of search algorithms?" Publishers are looking at artificial intelligence to help identify peer reviewers or provide recommendation engines for relevant content.
As indicated by the following, stakeholders from all sectors wonder how the addition of AI capabilities will affect the research process and analysis in general. "The use of Artificial Intelligence and machine learning to analyze large sets of data to make predictions and solve problems may change the way research is conducted in the future. Librarians and vendors need to be proactive in helping to shape how these technologies can best be used to enhance research of authoritative sources." In other words, the ability to act on a large corpus of text does not guarantee excellent or useful outcomes if the underlying content is suspect or compromised. The old GIGO principle (garbage in, garbage out) will still pertain, so libraries and publishers share a common interest in leading users to apply their new and powerful tools to bodies of quality content.

Workflow — Attendees frequently turned to the idea that workflow was a key area of future interest for libraries and publishers, as well as for the researchers they both serve. With more and more information at our fingertips, finding ways to locate, manage, and utilize information in a manner that streamlines workflow and addresses pain points is critical. This touches on related issues that surfaced at the Lab — e.g., Services, Return on Investment, and Interoperability. In its role as liaison between publishers and service providers, the library must consider how such services fit into the evolving research process and how it can assist in educating faculty and staff about new services that pertain to workflow. Lab attendees of all stripes — publishers, librarians, faculty, "tank-thinkers," consultants, etc. — seemed to agree on the need to engage with researchers in the pre-publication stages of the research workflow. In other words, higher orders of institutional value will flow to those who facilitate the creation of scholarly outputs, as opposed to those who enter the scene at the end of the work process.
One of our librarian contributors put it this way: "Libraries and publishers need to find their place in the scholarly workflow, developing tools and staffing that can add value for working scholars."

Journal Packages from The MIT Press

Package subscriptions give you the content you need from a trusted publisher at a discounted price. All packages feature:
• DRM-free content
• Unlimited simultaneous users
• Free access to backfile content for duration of subscription (many titles go back to 1999)
• Compatible with mobile devices
• Citation download, social sharing, TOC alerts
• COUNTER usage reporting

The Arts & Humanities Journal Collection
• African Arts
• ARTMargins
• Computer Music Journal
• Dædalus
• Design Issues
• Grey Room
• Leonardo
• Leonardo Music Journal
• The New England Quarterly
• October
• PAJ: A Journal of Performance and Art
• TDR/The Drama Review
• Thresholds (NEW)

The Economics Journal Collection
• American Journal of Health Economics
• Asian Development Review
• Asian Economic Papers
• Education Finance and Policy
• The Review of Economics and Statistics

The International Affairs, History, & Political Science Journal Collection
• Global Environmental Politics
• International Security
• Journal of Cold War Studies
• Journal of Interdisciplinary History
• Perspectives on Science: Historical, Philosophical, Social

The Science & Technology Journal Collection
• Artificial Life
• Evolutionary Computation
• Journal of Cognitive Neuroscience
• Linguistic Inquiry
• Nautilus*
• Neural Computation
• Presence: Teleoperators & Virtual Environments

*Print package option only.

Free 30-day trials: mitpressjournals.org
For more information, or to request a free 30-day trial, please contact: jdpcs-licenses@mit.edu


And a publisher noted that a walk through a modern research library leaves no doubt that collecting is being de-emphasized in preference to staffing up with a "proliferation of Ph.D.s working directly with faculty in research and instructional roles." A second publisher also noted that, "Libraries and librarians have a unique opportunity to engage with scholars to offer venues for promotion of their scholarly outputs. These types of researchers have unique information needs that librarians can address. They also have unique preservation issues for their outputs that librarians can solve through inclusion in local institutional repositories or national ones targeted for innovation such as e-channel (https://library.med.utah.edu/e-channel/). Library space can also be repurposed to house innovative teams and personnel to facilitate intellectual collaborations and incorporation of knowledge into design and product development."

Some Further Insights

As with all things, the most prevalent observations are not always the most correct, insightful, or interesting. There were some contributions by CFL attendees that didn't rank among the top vote-getters but were nonetheless important (in the judgment of the facilitators and authors). More than a few participants noted the meteoric increases in open access content and the likely impact this would have on libraries (not good?), commercial publishers (not good?) and Google (very good?). A couple of other contributors noted the need for libraries to better align their vision and activities with the priorities of their host institutions. While some hastened to add that they didn't dismiss their obligations to serve constituencies beyond their funding institutions, they urged a serious review of how library activities and resources are prioritized. Two participants noted the movement away from long-form reading and wondered how libraries and traditional scholarly publishers might adapt to such a cultural shift. And finally, just one participant asked us to think about how our work in libraries and publishing ameliorates real-world problems like violence, poverty, and hunger, and advances environmental stewardship. Increasingly, we see successful organizations marketing their products and services as connected to societal benefits. Young workers seem attracted to "meaningful" work, and if those in the scholarly communication space hope to attract the best talent, it would behoove them to consider how their work advances the social good beyond profit, growth, careerism, and institutional prestige. As one expects with this kind of exercise, most contributors offered positive comments at the end of the onsite session or in subsequent emails. One comment that sticks with us, however, is from a participant who expressed disappointment that the group failed to take the opportunity to discuss longer-horizon trends and issues. This participant averred — and maybe some readers of this summary will agree — that most of the concerns discussed were really present concerns — not even those of the near future. There certainly seems to be some truth in this helpful insight. And while the authors have tried here to summarize major takeaways from the CFL meeting, it might be worth noting — because some readers have surely been thinking it — some issues that were not discussed. For one, it's hard to imagine a discussion of coming changes in academe that doesn't acknowledge the changing demographics of the student body and younger faculty. Do we think that the changing face of the academy won't affect user needs, learning styles, or research agendas? Also scarcely considered was the role the tech sector will play in academe. What inroads might Apple, Google, Facebook and Amazon make in influencing or transforming higher education by 2025? As often occurs at such visioning sessions, the majority of CFL participants seemed so focused on the next tactical moves of their organizations that they had difficulty seeing beyond the near future to account for the much larger societal forces that will inevitably shape — perhaps disrupt — their institutional futures.

Rumors from page 24
custodian, who along with Ronald and the rest of the family, lived "on the top floor of the Washington Heights branch in upper Manhattan." They moved there in 1949, when Ronald was 15 years old. Following in the family tradition, Ronald also raised his daughter Jamilah in the same apartment until she was five. To this day, Ronald believes that living in a library gave him "the thirst of learning — and this just never left me." Katina here. "My son, Raymond Walser, also grew up in the College of Charleston Library! And he is a great bibliophile! There is not a book that you can name that Raymond doesn't know the author or the plot. He was in the Army

Next Steps:

The consensus for this experimental gathering was: interesting exercise, but not enough time was allotted to develop the many themes or adequately hash out the potential consequences. Duly noted. Another takeaway for organizers was a lack of early career attendees, who may well have had entirely different perspectives, and a U.S.-centric view focused largely on issues around English-language content. Point taken. Later on, during the main conference, organizers also caught a whiff of notions that a secret cabal had met without explanation or information to the wider attendee base or even the conference organizers. Not our intention. Experiments have to start somewhere. But we acknowledge the concern and helpful feedback as we digest this year's CFL input and plan for possible sessions in upcoming years. As next steps, we'd love to hear your thoughts on the topics identified above and, of course, other subjects. Hopefully, these added contributions by the larger CLC community can be incorporated into a future Conference presentation or ATG article. As consideration of a follow-up 2018 meeting kicks off, we welcome your input on how this event might be adapted for future Charleston Conferences to encourage broader, real-time participation. Readers are encouraged to share their ideas about how these future discussions can best be structured to optimize participation and dissemination.

and now works for the Army Corps of Engineers so he moves a lot but he is always moving his book collections! Libraries have always been great learning places!" http://www.against-the-grain.com/2018/03/atg-quirkies-growing-up-in-a-library/ Hope that you have had a chance to read the Up and Comer profiles that were in the print Dec/Jan ATG! We are printing the twentieth one here. It is of Katie Mika. Katie was changing jobs and her profile was lost in the transition! She is Data Services Librarian at the University of Colorado in Boulder. Katie likes skiing, trail running, playing golf, cooking and reading among other things! All the profiles have been loaded on the ATG News Channel. http://www.against-the-grain.com/atg_profiles_v29-6/ We are planning once again to have the Charleston Conference Gala Reception at the South Carolina Aquarium which is a favorite venue! And we have just learned that the SC Aquarium has been named as a finalist for the National Medal for Museum and Library Service. The National Medal is the nation's highest honor given to museums and libraries for service to their communities. For 24 years, the award has celebrated institutions that demonstrate extraordinary and innovative continued on page 43

ATG Interviews Pat Sabosik General Manager, ACI Scholarly Blog Index by Tom Gilson (Associate Editor, Against the Grain) <gilsont@cofc.edu> and Katina Strauch (Editor, Against the Grain) <kstrauch@comcast.net> ATG: Pat, your current work with the ACI Information Group is just the latest chapter in a distinguished career. You've also had experience working for Choice, ScienceDirect, Dow Jones & Reuters and Thomson Learning among others. What do you consider your top career highlights? Which of your many accomplishments make you most proud? PS: Digital content has been a focus and foundation of my career so there are a few highlights. Starting Elsevier's ScienceDirect business was a big accomplishment. As the first General Manager of ScienceDirect, I worked with a team of very talented people, and building the platform to deliver digital journal content to researchers' desktops worldwide was a big deal. The ScienceDirect platform was more than just e-journals; it enabled Elsevier to link to other content like 3-D chemical compounds and, through the digital object identifier (DOI), to other scientific information, thereby accelerating research. I joined America Online (AOL) during their Internet growth years and before the Time Warner merger. I worked with AOL partners like Kodak, Sony, and IBM on new Internet initiatives. In my last project for AOL, I led the digital imaging team to define ways consumers could have access to digital pictures on their computers. That was about 20 years ago; now we have our pictures in our pockets! ATG: Your most recent innovative project is the ACI Scholarly Blog Index. What has been the market's response? PS: The ACI scholarly blog project started with a proof of concept. I interviewed many librarians across the country to see if they would be interested in a curated collection of blogs written by academic scholars. The response was yes and the company put resources into building the product.
Sadly, after nearly four years there was just not enough library interest in actually subscribing to the ACI Scholarly Blog Index, so the scholarly blog end-user product will be discontinued at the end of June 2018. The ACI Information Group continues to aggregate and syndicate blogs for its active ACI Newstex News & Commentary Blog Index newsfeed business. Libraries now have the option to buy blog feeds from those produced by the ACI Information Group for the Newstex feed business. ATG: Can you tell us more about how libraries can exercise the option of buying blog feeds from those produced by the ACI Information Group for their Newstex feed business? And what exactly will libraries get when they buy these blog feeds?

PS: Libraries have the option to access the ACI Information Group's Newstex blogs either through one of ACI's resellers, such as Thomson Reuters or ProQuest, both of whom incorporate ACI's Newstex blogs into their services, or directly from ACI. You can use the website (www.aci.info) to learn more. ATG: With the ACI Scholarly Blog Index ceasing publication, will you still be working with the ACI Information Group? Or are you planning to move on to other opportunities? If so, what might they be? PS: I will continue working with ACI on an as-needed basis. I am currently networking within the information industry for a research engagement or market analysis assignment for new digital products and continue to be active within the Society for Scholarly Publishing. I will also be judging the Software and Information Industry Association's CODiE Award nominations this spring and taking a landscape design course. ATG: You were ahead of the curve in realizing that blogs had scholarly value. What did you see that others missed? PS: Blogs have been accepted as a new form of scholarly communication, and the internet gave academics the tools and medium to communicate directly with their colleagues and students. These scholars are also journal authors and frequently their blogs are a continuum or update to their published research. Journal articles discuss a single point of research and are laden with field-specific jargon. Blog posts are more accessible to the general reader and include broader coverage of a topic. We saw the scholarly blog as a good starting point for undergraduates writing papers and

doing research. The blogs can lead the student to a journal article with a better understanding of the content than starting with a journal search. Graduate students used the Index to follow leaders in their field. Also, the ACI Scholarly Blog Index was much less expensive for a campus-wide subscription than a bundle of journal subscriptions. This is now history. A decline in funding, a potential decline in student enrollment, the changing publishing landscape and advances in information technology have created a difficult environment for academic libraries and for how they treat authoritative social media. ATG: As you survey the world of scholarly communication today, are there other formats or resource types that are currently flying under the radar? PS: I think that scholarship and research produced in new media is not getting the coverage in libraries that it deserves. The requirement to archive social media in perpetuity may not be reasonable; however, a reasonable period of availability would be a good discussion for the industry to have. There are already demands for archiving software and data sets, and repositories like GitHub, Open AI, and the Open Science Framework. These initiatives will present collection, access, and distribution challenges for both academic libraries and institutional repositories. Non-traditional media seems to fall outside the academic library's brief and is just beginning to be addressed. ATG: If a young entrepreneur in our industry approached you and asked where they should focus their efforts, what would you tell them? Are there particular areas that you think are ripe for investment? PS: Quantitative skills will be a valuable asset for young entrepreneurs. Understanding data as the underpinning of the next generation of information products like data visualization and knowledge graphs, artificial intelligence, and the Internet of Things movement will be essential.
Digital currency is another area to watch, particularly if we see a shift in payments by students for research services. That will accelerate new forms of digital currency and payment schemes in the education market. ATG: Libraries are struggling to find their place in the scholarly communication infrastructure. Where do you see them fitting in? Going forward, what do you see as their primary contribution? PS: The library's role is to support faculty and students in their research endeavors.
Libraries find themselves at the end of the cycle of scholarly publishing — archiving the research that scholars publish, so that's a hard spot to be in. Two obvious areas are working with Institutional Repositories and the Office of Research. Libraries have experience in search, and working with these two groups to help design search tools or personal search robots that aid researchers in the discovery process would be a valuable collaborative effort. Aligning with the Office of Research to help negotiate licenses and access terms for scholars' content pre- or post-publication would fit in a library's brief. I can see libraries having a role in building institutional repositories for their faculty, but I think a good approach is to provide an index of their faculty's work and then link to as much as they can. In a "linked" society, that may be an acceptable approach. The functions of archiving, access, and discovery are still fundamental to an academic library's mission and these functions will continue in various forms as Open Access and new content and data formats emerge. And libraries need to be more flexible in addressing the perishable nature of social media. ATG: Are there any recent innovations in the library world that encourage you to think that libraries will make this contribution successfully? PS: Knowledge graphs and linked data are not new, but new applications are transforming search and discovery through visualization and that's exciting. ATG: You also mention libraries having a role in building institutional repositories. What would you say to those who doubt that

most academic libraries have the infrastructure, staff, and expertise to play a meaningful role? PS: A bit of irony, isn't it? Academic librarians wanted direct access to all of their institutional scholars' work but now report that it's hard and they don't have the infrastructure. IRs are a long-term investment that I believe have value and help support an institution's brand as well as archive scholars' research. If the library is not the place to manage this, then perhaps the Office of Research is a prospective home. Along these lines, the definition of a successful IR and what might be included may need to be narrowed, or redefined. ATG: If you were to look in your crystal ball what kind of future do you see for scholarly communications? What key challenges will we face? And how should we meet them? PS: Open Access to published scholarly research, the Open Science Framework, and Open AI will present opportunities as well as challenges to policy, operations, and funding that will shape the next generation of scholarly publishing. These trends are starting in the Sciences and I expect the "open" model will spread to the Social Sciences and Humanities. From a graduate perspective, more post-doctoral students are going into industry than staying in the academy and this will have implications for institutions, funders, and libraries. Dr. Paula Stephan of the Andrew Young School of Policy Studies, Georgia State University, addressed this topic in a keynote at the 2017 Society for Scholarly Publishing meeting. ATG: Recently, you attended the Charleston Conference's first annual Future Lab. What were your key takeaways? PS: Librarians and many publishers are still focusing on the tactics of publishing and collection instead of the trends that are shaping information strategy. Institutional change will be driven by (1) demographic shifts such as life-long learning programs and older learners; (2) operational or structural shifts like operating with reduced state and federal funding and reduced foreign student enrollment; or (3) policy-driven shifts such as open research, funding initiatives, curriculum changes, hiring practices, and the use of personal robots for discovery and research. Technology will be an underpinning for the next generation of students now entering college. But the strategy of how the technology is deployed, both at the institution and by the student, is where the future lies. An example I gave at Future Lab is: why do we still have segmented databases? That's very old-school thinking. Let's think of Google (where students go as a starting point for discovery) as a large data lake of content and let students or researchers refine their personal robots to find content. They control their destiny, not hampered by institutional policies and restrictive access. ATG: Pat, we know that you are an avid hiker who has tackled demanding mountain trails worldwide. What was your most memorable conquest? Are there other activities that help you recharge and get energized for the next challenge? PS: I love talking about hiking! One of the best was a five-day hike on the Inca Trail to Machu Picchu. The highest peak we climbed was over 14,000 feet high and it literally took your breath away. A trip up the Annapurna Sanctuary Trail in Nepal was another highlight. Backpacking in New Hampshire's White Mountains is a great way to unwind. In 2017 I became a Connecticut Master Gardener! Hiking and digging in the dirt keeps me grounded. Pun intended.

Book Reviews — Monographic Musings Column Editor: Regina Gong (Open Educational Resources (OER) Project Manager/Head of Technical Services and Systems, Lansing Community College Library) <gongr1@lcc.edu>

Column Editor’s Note: For this issue, we are focusing on two titles, reviewed by our regular book reviewers Corey Seeman and Ashley Bailey: one on strategies for managing libraries and another on renewing yourself. The book Inherent Strategies in Library Management is another excellent addition to the literature of library management. It provides readers with a great overview of library management thinking and places it in the context of business management literature. It is interesting how most books on library management emphasize the need for libraries to be agile, innovative, and embrace change. Libraries of all types and sizes do that every day, even with little or sometimes no funding support, so to me this is not new. How else are libraries still here despite grim predictions of our demise? Meanwhile, Renew Yourself: A Six-Step Plan for More Meaningful Work is somewhat of a self-help book that gives us practical advice and insights into how to be happier and more fulfilled in one’s work. I find this book interesting because it allows us to be more introspective and examine the ways in which we are fulfilling our purpose and goals in our work to make it more meaningful.

40 Against the Grain / February 2018

I hope you enjoy reading the reviews in this issue. As always, please let me know if you want to be a book reviewer. Just send me an email at <gongr1@lcc.edu>. I always have a free book waiting for you. Happy reading! — RG

Koizumi, Masanori. Inherent Strategies in Library Management. Cambridge, MA: Chandos Publishing, 2017. 9780081012970 (eBook). 9780081012772 (Paperback). $78.95 (print). Reviewed by Corey Seeman (Director, Kresge Library Services, Ross School of Business, University of Michigan, Ann Arbor) <cseeman@umich.edu>

Reading this book transported me back to library school. One class in particular came to mind while reviewing this book — the one on library management. Even though I do not remember much about the class in the 25+ years since I took it, I remember the books we read. The primary text was Robert D. Stueart and John T. Eastlick’s Library Management, a book read by many library school students. The other book was David Halberstam’s The Reckoning, a book that brilliantly tells the parallel histories of two giants in the automotive world, Ford and Nissan. The reason we were reading the textbook was fairly self-evident. But why were we reading The Reckoning? Well, we never really found out, as our faculty member changed early in the class and the replacement did not like the book. While we did not incorporate it as much as I would have liked, I enjoyed it so much — far more than anything else I read in library school. I believe the intention was to show how organizations operate during change. Additionally, it might have been to show how being a leading company does not mean you can ignore competitors, even if their market entry is not close to where you were operating.

The question might be this — if we are exploring how to effectively look at library management, should we explore libraries, or also explore businesses and other organizations? This is one of the central questions of Masanori Koizumi’s excellent new work, Inherent Strategies in Library Management. An assistant professor in the Faculty of Library, Information and Media Science at the University of Tsukuba in Japan, Koizumi approaches this well-researched work like a true academic. He poses the question of whether library management should be studied in the context of business management, or whether it requires its own set of literature. To answer it, Koizumi does a great deal of research on both library management literature and business management literature.

<http://www.against-the-grain.com>
In thinking about last year’s Charleston Conference theme (What’s Past is Prologue), this work fits perfectly, as he explores library culture, values, and history as they relate to management. This includes some of the early works in the field that might have been written from a gender bias and assumed that libraries do not need to be managed like other organizations or businesses. Many aspects of this work are very useful for library managers across the board, because a number of library managers moved into these roles from the ranks and do not have a great deal of management training to fall back on. Koizumi provides an excellent overview of some of the prominent thought leaders in the management space, giving the reader a list of promising works for learning more about running organizations and the prevailing schools of thought. He also provides an excellent examination of citations from library management texts to see whether the cited works come from library literature or standard business management works. Another relevant section of the book is his exploration of transitions at some very large libraries from the 1960s to the present, including Harvard University, Columbia University, the National Diet Library (Japan’s national library), and others. While the chart showing changing functions and departments over the years was very interesting, it may not be as applicable to smaller or mid-sized libraries that simply do not work at the scale of these huge libraries. With the changes that are going on across our profession, understanding the best way to manage our resources, our people, and our services is critical not only for our success, but also for our survival.
It is not sufficient for us merely to do as Admiral Farragut reportedly did at the Battle of Mobile Bay in 1864, when he was quoted as saying, “Damn the torpedoes, full speed ahead!” The future of the library is not chiseled in stone, and library managers and administrators need to seek out paths forward that might come from both the libraries before us and other organizations. The need for change and for innovation is great for libraries. We need to be bold in undertaking these challenges. Masanori Koizumi provides readers with a great overview of library management thinking and places it in the context of business management scholarship. We end up with a great work that gives us tools to handle many of the challenges that are waiting for libraries over the next five years. To that end, this would be a good book to have on your bookshelf...while you still have them!


Hakala-Ausperk, Catherine. Renew Yourself: A Six-Step Plan for More Meaningful Work. Chicago, IL: ALA Editions, 2017. 9780838914991. 152 pages. $50.00. Reviewed by Ashley Fast Bailey (Director, Collection Development and Workflow Solutions, Midwestern and Southeastern U.S., GOBI Library Solutions) <abailey@ybp.com>

Catherine Hakala-Ausperk, an active library planner, speaker, consultant, and trainer, as well as an adjunct faculty member at Kent State University’s School of Library and Information Science, provides practical questions, activities, and insights into how to be happier and more fulfilled in one’s work. Renew Yourself: A Six-Step Plan for More Meaningful Work offers a six-step process for working through and answering the question of “what’s next?” Whether the question is being asked about a current job or a job change is imminent, this book can help with the success of either path. Taking the journey through the 5W1H questions (who, what, when, where, why, and how), Hakala-Ausperk helps one build step by step toward the bigger picture.

Who? The first question the reader is asked to contemplate and answer is who are you? Defining one’s values and goals can bring clarity to the renewal process. Hakala-Ausperk uses exercises and questions to distill personal values, fundamental values, and goals, providing a trajectory for the remaining 5W1H questions. This is accomplished by matching these goals and values with one’s work fulfillment and job satisfaction. Defining these and bringing clarity to the question of who you are allows readers to reexamine their current position and consider the contribution they want to make in their current role, or even in a new venture.

The second and third sections of this book take up the What and the When. What do you want to do? Hakala-Ausperk writes about defining one’s mission and working to discover where passion and values intersect.
With more exercises, this section walks through finding purpose in one’s work. The third question is when is the right time? This is a question everyone has asked at one point or another in their career: when is the right time to do something new, make a position change within the organization, or leave for another role altogether? By providing examples of when people made life-changing or career-changing choices, Hakala-Ausperk offers valuable and insightful illustrations for this question. In addition, she outlines the life cycle of purpose, what it can look like, and how to use it to move forward from whatever point one is currently at.

Where should you be is the next question. Where should your renewal take you? Hakala-Ausperk examines the most common reasons that people stay in or leave their current position. From work culture to talents to passion, she covers eleven such reasons, going into detail on each and giving options for dealing with them, and provides some great insight into the factors that can go into job and life satisfaction.

Steps five and six are the Why and the How. Why would you move forward, and what would or could that look like? By giving voice to many of the uncertainties and questions that come to mind when one decides to make a move, she allows the reader to think through the reasons for making a major change. How addresses the question of “How do you do it?” Hakala-Ausperk reminds us that these major changes should be well thought out; they take time, patience, and preparation. Outlining ways to prepare for a major shift, she breaks what can be a daunting decision down into small bits and helps give the reader a map to the process. Renew Yourself also contains a renewal plan template and notes referencing many great works on this topic.
Renew Yourself: A Six-Step Plan for More Meaningful Work is a workbook-style book that provides a program for defining one’s mission and purpose, poses questions that help one see whether there are things in a current job that could bring more satisfaction and fulfillment or whether a shift is needed, and allows one to create a plan for growth.




Collecting to the Core — Best Practices in College Student Development by Dr. Beth A. McDonough (Research and Instruction Librarian, Associate Professor, Hunter Library, Western Carolina University; Education Subject Editor, Resources for College Libraries: Career Resources) <bmcdono@wcu.edu> and Dr. April Perry (Assistant Professor and Program Director, Higher Education Student Affairs, Western Carolina University) <alperry@wcu.edu> Column Editor: Anne Doherty (Resources for College Libraries Project Editor, CHOICE/ACRL) <adoherty@ala-choice.org>

Column Editor’s Note: The “Collecting to the Core” column highlights monographic works that are essential to the academic library within a particular discipline, inspired by the Resources for College Libraries bibliography (online at http://www.rclweb.net). In each essay, subject specialists introduce and explain the classic titles and topics that remain relevant to the undergraduate curriculum and library collection. Disciplinary trends may shift, but some classics never go out of style. — AD

Most undergraduate students arrive at college as adolescents and leave as young adults. They come from increasingly diverse socioeconomic, racial, ethnic, religious, linguistic, and gender backgrounds, which creates ever more complex campus environments. While the undergraduate experience can no longer be considered a singular phenomenon, if that was ever the case, a growing body of evidence supports the relationship between healthy student development and academic success. The expectation remains that students be transformed by the college experience, and college student development research and theory is key to creating the conditions for that transformation. As the core works discussed in this essay demonstrate, understanding college student development is at the heart of current best practices in higher education, making these monographs valuable additions to any undergraduate library collection.

Every librarian has experienced the patron who knows exactly which book he or she wants but can’t remember the title: “You know, the green book.” Student Services: A Handbook for the Profession is such a book for generations of student affairs professionals, who do indeed know it as “the green book.”1 Authored by scholars and practitioners, this “big picture” handbook is updated every few years and offers essential theoretical bases, organizational frameworks, historical context, and professional competencies for the student affairs profession. The sixth edition, published in 2017, contains several new chapters, including one focused on college student development. Ernest Pascarella and Patrick Terenzini’s groundbreaking book, How College Affects Students: Findings and Insights from Twenty Years of Research, synthesizes more than 2,600 studies in order to understand the impact of college on students.2 Together with the second volume, How College Affects Students: A Third Decade of Research, more than 30 years of research findings are considered.3 Most recently, How College Affects Students: 21st Century Evidence that Higher Education Works continues this work by incorporating research from 2002-2013 in a third volume.4 Each volume updates evidence and compares current findings to past research. A more specialized work, Student Development in College: Theory, Research, and Practice, deeply explores the range of college student development theory.5 The third edition of this classic text has been expanded and revised and includes new insights into emerging identity theory, including faith and gender identities. While past editions were organized chronologically, the new edition utilizes a conceptual organizational structure, which makes a dense body of theory more accessible to students, faculty, and practitioners.

The two leading student affairs professional organizations, American College Personnel Association-College Student Educators International (commonly known as ACPA) and Student Affairs Administrators in Higher Education (commonly known as NASPA), collaborated to publish two influential reports intended to redefine student affairs practice and its relationship to student learning. Learning Reconsidered: A Campus-wide Focus on the Student Experience challenged educators to view “learning and development as intertwined, inseparable elements of the student experience.”6 To that end, the report outlined seven transformative learning outcomes related to college student development, dimensions of those outcomes, sample developmental experiences, and theories for educators. Published shortly thereafter, Learning Reconsidered 2: A Practical Guide to Implementing a Campus-wide Focus on the Student Experience billed itself as a “blueprint for action” offering the “dialogue, tools, and materials necessary to put into practice the recommendations in Learning Reconsidered.”7 The fact that a half-dozen professional organizations in higher education joined ACPA and NASPA in sponsoring the second report is testimony to the unified vision embraced by the higher education student affairs community.

George Kuh, the researcher who founded the widely used National Survey of Student Engagement (NSSE) and developed High Impact Educational Practices (HIPs), has made many significant contributions to the literature on college student success. Two titles for undergraduate academic libraries to consider are the game-changing volume Student Success in College: Creating Conditions that Matter and the more recent Using Evidence of Student Learning to Improve Higher Education.8-9 The former emerged from a two-year study of twenty institutions’ student success initiatives originating from the NSSE and dubbed Project DEEP (Documenting Effective Educational Practice). The revised edition revisits the same schools five years later to update the evidence of impact on students. Using Evidence of Student Learning to Improve Higher Education is grounded in research conducted at the National Institute for Learning Outcomes Assessment (NILOA). Authored by NILOA’s principal scholars, its chapters advocate using student learning outcome assessment to improve student success, rather than mere compliance with accreditation standards. They examine broader relationships between outcomes assessment, academic achievement, and institutional effectiveness.

Frances Stage and Kathleen Manning’s popular handbook, Research in the College Context: Approaches and Methods, was revised and updated in 2016.10 Reflecting current trends, the authors added chapters about mixed-methods research and participatory action research; expanded information about equity and justice, critical research approaches, and visual research methods; and updated methods for disseminating research results, with special attention devoted to online media. The book offers practical, real-world examples of diverse approaches to researching college students and campus settings.
Charles Carney Strange and James Banning’s Designing for Learning: Creating Campus Environments for Student Success is a revised and expanded edition of their earlier influential work Educating by Design: Creating Campus Environments for Student Success.11 The authors examine how the design components of campus environments impact their learning purposes and present strategies and a practical framework that can be used to evaluate collegiate physical spaces with the intention of improving student success.



With higher education outcomes under increased scrutiny from the public and politicians alike, this core collection of evidence-based texts is of value to any college or university and should not be overlooked by institutions simply because they do not support a student affairs program. The research about what helps college students develop and succeed is of interest to administrators, faculty, librarians, and even curious students. Indeed, today’s higher education practitioners have unprecedented access to information about best practices and an open invitation to contribute to transformative student experiences.

Endnotes
1. Schuh, John H., Susan R. Jones, and Vasti Torres. Student Services: A Handbook for the Profession. 6th ed. San Francisco: Jossey-Bass, 2017.*
2. Pascarella, Ernest T., and Patrick T. Terenzini. How College Affects Students: Findings and Insights from Twenty Years of Research. San Francisco: Jossey-Bass, 1991.*
3. Pascarella, Ernest T., and Patrick T. Terenzini. How College Affects Students: A Third Decade of Research. San Francisco: Jossey-Bass, 2005.*
4. Mayhew, Matthew J., Alyssa N. Rockenbach, Nicholas A. Bowman, Tricia A. Seifert, and Gregory C. Wolniak with Ernest T. Pascarella and Patrick T. Terenzini. How College Affects Students: 21st Century Evidence that Higher Education Works. San Francisco: Jossey-Bass, 2016.
5. Patton, Lori D., Kristin A. Renn, Florence Guido-DiBrito, and Stephen John Quaye. Student Development in College: Theory, Research, and Practice. 3rd ed. San Francisco: Jossey-Bass, 2016.*
6. Keeling, Richard P., ed. Learning Reconsidered: A Campus-wide Focus on the Student Experience. Washington, D.C.: ACPA, NASPA, 2004.
7. Keeling, Richard P., ed. Learning Reconsidered 2: A Practical Guide to Implementing a Campus-wide Focus on the Student Experience. Washington, D.C.: ACPA, NASPA, 2006.
8. Kuh, George D., Jillian Kinzie, John H. Schuh, and Elizabeth J. Whitt.
Student Success in College: Creating Conditions that Matter. Rev. ed. San Francisco: Jossey-Bass, 2010.*
9. Kuh, George D., Stanley O. Ikenberry, Natasha A. Jankowski, Timothy Reese Cain, Peter T. Ewell, Pat Hutchings, and Jillian Kinzie. Using Evidence of Student Learning to Improve Higher Education. San Francisco: Jossey-Bass, 2015.*
10. Stage, Frances K., and Kathleen Manning. Research in the College Context: Approaches and Methods. 2nd ed. New York: Routledge, 2016.
11. Strange, Charles Carney, and James H. Banning. Designing for Learning: Creating Campus Environments for Student Success. 2nd ed. San Francisco: Jossey-Bass, 2015.*
*Editor’s note: An asterisk (*) denotes a title selected for Resources for College Libraries.


Booklover — Japanese Art Column Editor: Donna Jacobs (Retired, Medical University of South Carolina, Charleston, SC 29425) <donna.jacobs55@gmail.com>

Kazuo Ishiguro was awarded the 2017 Nobel Prize in Literature. Ishiguro, “who, in novels of great emotional force, has uncovered the abyss beneath our illusory sense of connection with the world,” was already somewhat familiar to the American reader. His 1989 novel The Remains of the Day was transformed into the 1993 movie of the same name starring Anthony Hopkins. Prior to choosing a novel to delve into, I read the press around this announcement. An opinion piece by Garrison Keillor provided a curious perspective in typical Keillor humor: “Once again the humorless Swedes have chosen a writer of migraines for the Nobel Prize in literature, an author of twilight meditations on time and memory and mortality and cold toast by loners looking at bad wallpaper.” ….“Of course all the book critics applauded the choice of Kazuo Ishiguro: Praising the dull and deadly is a time-tested way to demonstrate intellectual superiority. It’s like taking a ski vacation in North Dakota: It sets you apart from the crowd. And comedy is so utterly adolescent.” Well, this set up my reading adventure quite differently, and off to the library I went to check out An Artist of the Floating World.

Ishiguro was born in Nagasaki, Japan. His family moved to the United Kingdom when he was a young boy. He graduated in English and Philosophy from the University of Kent and continued with Creative Writing at the University of East Anglia. Once I learned of his biography, I found the title of the novel I chose intriguing and somewhat reflective of this author/artist. It is written in English about an artist who made some creative decisions during World War II that set the course for his professional career. The cadence of the writing settles you into the Japanese cultural experience. I floated pleasantly between the subtleties of Japanese culture, conversational innuendo, and postwar angst, and the English language by which all this was delivered. Masuji Ono, the story’s central figure, begins his career in the art of ukiyo-e (“pictures of the floating world”), defined as “an art closely connected with the pleasures of theatres, restaurants, teahouses, geisha and courtesans in the even then very large city.” Ono struggles reflectively with his career choice to pursue a more pragmatic path with his artistic talent. He expresses his difficult decision to Mori-san, his Sensei: “I have learnt many things over these past years. I have learnt much in contemplating the world of pleasure, and recognizing its fragile beauty. But I now feel it is time for me to progress to other things. Sensei, it is my belief that in such troubled times as these, artists must learn to value something more tangible than those pleasurable things that disappear with the morning light. It is not necessary that artists always occupy a decadent and enclosed world. My conscience, Sensei, tells me I cannot remain forever an artist of the floating world.”

Rumors from page 38

approaches to public service and are making a difference for individuals, families and communities. The SC Aquarium was nominated for the National Medal by Robert Macdonald, Director Emeritus of the Museum of the City of New York and Director Emeritus of the South Carolina Aquarium board of directors. Macdonald’s unique perspective on museums and the critical role they play in the economic health and cultural fabric of the community was instrumental in helping shape the Aquarium’s identity as a community gathering place, convener, and valuable resource in South Carolina. The Charleston Conference is proud to allow accompanying family members to attend the Gala Reception! A parting Rumor just received — Digital Science and Katalysis are pleased to announce the launch of a pilot project to test blockchain technologies to support the peer review process. They are joined in this pilot project by founding partner Springer Nature. The initiative is an important step towards a fairer and more transparent ecosystem for peer review and explores the utility of decentralized data stores in supporting trusted assertions that connect researchers to their activities. As research volumes increase each year, the problems of research reproducibility, recognition of reviewers, and the rising burden of the review process have led to a challenging landscape for scholarly communications. In its initial phase, this initiative aims to look at practical solutions that leverage the distributed registry and smart contract elements of blockchain technologies. Later phases aim to establish a consortium of organisations committed to working together to solve scholarly communications challenges that center around peer review. Digital Science will manage the project and looks forward to coordinating with further partners.




Wryly Noted — Books About Books Column Editor: John D. Riley (Against the Grain Contributor and Owner, Gabriel Books) <jdriley@comcast.net> https://www.facebook.com/Gabriel-Books-121098841238921/

Photographed Letters on Wings: How Microfilmed V-Mail Helped Win World War II by Tom Weiner with Bill Streeter. (ISBN 978-945473517, Levellers Press, Amherst, Massachusetts 2017, $18.95.)

V-Mail is an undeservedly obscure topic of military and philatelic history. It was a short-lived (1942-1945) but strategic technology that miniaturized paper mail on microfilm to free up space and cargo weight on planes supplying the troops during WW II. This is a very important book for anyone interested in postal history, the history of microfilm, espionage and wartime communications, and the enormous effort made by the postal service and the military to keep the battlefront and the home front in constant communication. The military viewed rapid communication between front-line troops and family members, loved ones, and other supporters as paramount in maintaining troop morale and bolstering support for the war effort back home.

The book is a labor of love on several fronts. First, there is William Streeter, who did the primary research and compiled the relevant documents. He was inspired by the V-Mail he used to communicate with his double cousin, Henry Streeter, during the war. Henry died at age 19 in battle. This book is William Streeter’s way of honoring his cousin and the other troops who served during the war. The other person in this partnership is Tom Weiner, who took up the task of writing the book and completing the research necessary to bring it to publication. He met William Streeter near the end of Streeter’s life and was inspired by the depth of his research and the passion he had for the subject. Mr. Weiner previously wrote Called to Serve: Stories of Men and Women Confronted By the Vietnam War Draft, a personal recounting of the effects of the war featuring interviews with men and women representing all of the choices faced, from serving to resisting, from leaving the country to conscientious objection and “beating the draft.” Also included are chapters on the history of the draft and of conscientious objection. He came to this book about V-Mail with his own passion and depth of research.

The book begins with a comprehensive recounting of the history of microfilm and micro-photography, which is coeval with photography itself. The earliest micro-photographs date to 1839, when a daguerreotype process was used to produce novelty texts. By 1851 James Glaisher suggested storing documents using microfilm, and Sir John Herschel envisioned creating microscopic editions of reference works for use by librarians and researchers. One of the major impediments to wider usage of micro-photography was the lack of an inexpensive, easy-to-use lens for viewing it. Rene Dagron improved on the one-piece microscope invented by the Earl of Stanhope, and with its
introduction, micro-photography became available to a wider public. By 1862 Dagron was producing “microscopic photo-jewelry.” As with so many other steps forward in the world of viewing technology, pornography became a popular use of the device. Microscopic images were embedded in rings and lockets for gifts and personal enjoyment. At about that same time the military was starting to take notice of the technology. The ability to miniaturize large amounts of information was extremely useful in the spy game and in military communication. The Siege of Paris in 1870 was an important step in the military use of microfilm. Because of the complete entrapment of the city, military intelligence could only be exchanged by means of hot air balloons, which were easily shot down, or by means of a very old technology, in use for thousands of years: pigeons! Pigeons had been used since the time of Alexander and Caesar for military communication, and even the Reuters news service was using pigeons up until 1860 to transmit stock prices between Brussels, Belgium and Aachen, Germany. With the advent of microfilm much more text could be sent by air. During the siege, pigeons were outfitted with small tubes made from goose quills which held the rolled-up microfilm. The tubes were then attached to the tail feathers of the pigeons. The canny Prussians countered the flocks of pigeons with trained hawks of their own, which hunted down their slow-flying prey. When (and if) the microfilm arrived, it was projected by special lanterns onto a white wall, where the messages were transcribed by numerous secretaries. From the time of the Siege of Paris until World War II there were numerous developments in microfilm technology, including its growing use in banking, legal document and blueprint storage, library storage (especially newspapers), and, of course, espionage.
After a thorough history of microfilm, the book moves on to its main subject, microfilmed mail (i.e., V-Mail) and its use in World War II. V-Mail was an outgrowth of “Airgraph,” which was used by the British starting in 1940. “Airgraph” was a partnership between Pan Am Airways and Kodak which utilized the same technology Kodak had developed with its Recordak microfilm company and which would subsequently be used by V-Mail. The British faced the brunt of the war at the outset and thus anticipated the need for lighter-weight cargo while supplying the far-flung troops of their Empire. To put the savings in perspective: “...1,500 letters could be put on a roll weighing a few ounces...and mail that would have necessitated the use of fifty planes could now be carried with one plane!”

When the U.S. entered the war, the way was clear to emulate the “Airgraph” system to supply the hundreds of thousands of troops stationed in multiple theaters of war. Besides the obvious advantages of miniaturization, V-Mail also allowed for the indefinite retention of the original letter on microfilm rolls. With many planes being shot down close to the front, this allowed the postal service to resend letters that were lost. Once the letters were delivered, the stored microfilm rolls were destroyed. V-Mail started out slowly, with only 35,000 letters sent from stateside in 1942, but by 1943 millions of letters were being sent. By the time it ended, over a billion letters had been sent and received. With V-Mail, as opposed to regular post, there was no cost for troops writing home. The stationery was free for families, and the cost of postage was only three cents. The ease of use, speed of delivery, and the effort by family members to stay in touch added to V-Mail’s popularity. The one drawback to V-Mail was that all messages had to be written on official forms that were already quite small, and no inserts were allowed, so no lipstick kisses, locks of hair, or perfume made it through the processing. There were three centers for aggregating mail for microfilming: New York, Chicago, and San Francisco. In each location hundreds of specialized workers prepared the rolls of film for shipment. On the battlefront, V-Mail stations were operational within days of securing a position. Some of the more exciting stories in the book involve the jury-rigging of photography labs in the wake of recent battles. Soldiers and technicians managed to eke out enough water, even in the desert, to finish their mission. As with all wartime mail, V-Mail was subject to censorship, both to protect sensitive information from reaching the enemy and to keep families at home reassured that all was well with their loved ones.
Officers’ mail was self-censored, but enlisted men’s mail either had offending passages blacked out or cut out. Some mail was even tested for secret ink to ferret out spies. As the war wound down, V-Mail also slowed down, and it was no longer in service by 1946. V-Mail proved its usefulness during the war, but dreams of expanding it to civilian use never materialized, as regular air service made the use of Recordak and microfilming simply too cumbersome for regular postal service. Rounding out the book are samples of V-Mail letters called “The Voices of V-Mail,” which give a living picture of the war and the close bonds that were maintained by the correspondence. There are also numerous facsimiles of the vivid advertising used to promote V-Mail. The book closes with an exhaustive bibliography of books, journal articles, and digital resources. Due to its multi-disciplinary approach, this book should find a place in any library.

<http://www.against-the-grain.com>


Oregon Trails — Favorite Books of 2017

Column Editor: Thomas W. Leonhardt (Retired, Eugene, OR 97404) <oskibear70@gmail.com>

With no apologies to contemporary lists of best books, I offer my “Top 17 in ’17” among the many books that I read last year. They are listed in the order in which I read them.

Hitler: Ascent, 1889-1939 by Volker Ullrich

Ullrich humanizes Hitler, not to the point where I sympathize with or admire him, but to where I get a better understanding of him. For example, I learned that Hitler was a voracious reader, at least during one stage of his life. He even developed an art of reading that seemed to be based on noting and retaining only that which confirmed his beliefs. Imagine the course of history if Adolf had read broadly and with an open mind.

Narziss und Goldmund by Hermann Hesse

When I took Joseph Mileck’s seminar in 1972, he assigned Hesse’s stories and novels (all of them except Das Glasperlenspiel) in chronological order following Hesse’s own personal development that is mirrored in his fiction and poetry. I have been gradually re-reading those novels first encountered so long ago. For those wanting an understanding of Hesse but not wanting to read his entire work, I recommend just two of his novels, Siddhartha and Narziss und Goldmund.

Casuals of the Sea by William McFee

Bennett Cerf considered three works by McFee to hold up against the best of Joseph Conrad, another writer of the sea: Casuals of the Sea; Captain Macedoine’s Daughter; and Command. I have counted Casuals of the Sea among my favorite books of all time since I first read it in 1962, the year I bought my first copy of it. I now own ten variant copies of the book and at least one copy of every other book that McFee wrote. Below is a quotation from a second-hand bookseller in Casuals of the Sea that launched my journey into McFee territory. “Be master of yourself. The world is not an oyster to be opened, but a quicksand to be passed. If you have wings you can fly over it, if not you may quite possibly be sucked in.”

Goodbye, Columbus by Philip Roth

In 1963, I was a private in the Army at Ft. Dix, New Jersey, and lucky enough to be among soldiers with college degrees who fed my hunger for literature by recommending title after title, including Goodbye, Columbus. I never read anything else by Roth, but after re-reading his National Book Award winner, I now want to try his other works.

A Single Pebble by John Hersey

How did I miss this book as I went from Hiroshima to A Bell for Adano to The War Lover? A Single Pebble is a simple story full of complex observations about how people’s lives are intertwined with their environment, especially the mighty Yangtze River as observed and reported by a young American engineer who travels a thousand miles aboard a Chinese junk. He is as captivated by the crew, on board and on shore, as he is by the Yangtze itself. Halfway through the book, I had become a passenger, too.

Ulysses by James Joyce

How many references to this book have I seen that suggest it is a book talked about but never read in its entirety? I no longer belong to that category. After many starts and stops, I finally read the entire 783 pages of the Modern Library (1961 corrected and reset) edition that I bought in Fresno, California in 1966. It was worth the wait and having to re-read more than half the novel. My enjoyment of Ulysses, for I did enjoy it, was enhanced by having read A Portrait of the Artist at least six times, beginning in 1960 while still in high school, and Dubliners three times. I didn’t read Homer’s The Odyssey until after reading Ulysses. I didn’t need either one in order to enjoy and understand the other. And as I read Ulysses I used several guides, which, with a few exceptions, I found more a distraction than a help.

Against the Grain / February 2018

For Whom the Bell Tolls by Ernest Hemingway

This self-parody was not apparent to me when I first read it in 1961, the very year when Hemingway tipped off how Robert Jordan ended his stand against the Nationalists, not with a shootout but with a bang. A second reading confirmed some of what I remembered, but it also gave me a chance to appreciate how Hemingway describes the brutality of war, and especially a civil war. In 1961, For Whom the Bell Tolls was full of romance, adventure, and heroism. From my perspective as an old man, I am skeptical of what Roberto thought of as love, but I also appreciate the personal dynamics in the novel, much as I did before but with a deeper understanding.

Go She Must by David Garnett

David “Bunny” Garnett was a popular member of the Bloomsbury group and the son of writers and editors, his mother being Constance Garnett, whose translations of Dostoevsky, Tolstoy, Chekhov, and others enabled English speakers to enjoy that rich Russian literature. David Garnett writes with sensitivity and style and tells a good story. See also his The Sailor’s Return and Beany-Eye, that is, if you can find copies.

The Forties by Edmund Wilson

Even after reading Memoirs of Hecate County, I was taken aback at Wilson’s frankness as he describes his intimacies with the woman who became his wife. It is one thing to be so open in a journal that is for one’s own eyes only but to prepare it for publication so candidly is beyond me. Yes, I enjoyed the observations of this exceedingly literate (and opinionated) and erudite man and have most of his books sitting on my shelves waiting to be read.

A Portrait of an Artist as an Old Man by Joseph Heller

This is a funny book, funny as odd and funny as humorous. Don’t quit reading halfway through, even if tempted, because you need to get to the punch line. What a funny, odd book. It is not for the novice reader.

The Moving Toyshop by Edmund Crispin

Edmund Crispin was a writer, but also a composer, a pianist, an organist, and a conductor. His sleuth, the Oxford don Gervase Fen, is perhaps my favorite crime solver of all time. The eponymous toyshop is not moving, but it moves, and therein lies the mystery. I think I have about four more Professor Fen books to go. Then what shall I read?

A Surfeit of Lampreys by Ngaio Marsh

Should you run out of Gervase Fen novels, I suggest that you move on to Ngaio Marsh’s Roderick Alleyn, a gentleman detective. There are 32 novels featuring him, so there is no danger of running out of reading anytime soon. Both Alleyn and Fen can quote poetry and Shakespeare with the best of them while also solving mysteries.

The Private Papers of a Bankrupt Bookseller by Anonymous (William Young Darling)

The title alone is almost enough to make you cry, but then come the bookseller’s musings about books, the businesses next door to him, poetry, and, indirectly, life itself. So poignant are some of the essays, many no more than a page in length, that my eyes became teary as I read them. I identified with the bookseller the minute I saw the book in Richard Booth’s Bookshop, Café, and Cinema (Hay-on-Wye, Wales), just after one of the tastiest breakfasts ever anywhere. I have even bought a book of John Masefield poetry on the recommendation of a bookseller who never existed except on paper. This is my book of the year.

continued on page 46




The Scholarly Publishing Scene — Getting Attention

Column Editor: Myer Kutz (President, Myer Kutz Associates, Inc.) <myerkutz@aol.com>

The idea that the contents and conclusions in scholarly journal articles make their way into newspapers and other popular media isn’t new. Nor is the notion that such transmissions are important not just to the general public but to the scientific community as well. An October 1991 New England Journal of Medicine (NEJM) article, “Importance of the Lay Press in the Transmission of Medical Knowledge to the Scientific Community,” by David P. Phillips, Elliot J. Kanter, Bridget Bednarczyk, and Patricia L. Tastad, posited that “efficient, undistorted communication of the results of medical research is important to physicians, the scientific community, and the public.” Because “information that first appears in the scientific literature is frequently retransmitted in the popular press,” the article’s authors asked, “does popular coverage of medical research in turn amplify the effects of that research on the scientific community?” The hypothesis they tested was that “researchers are more likely to cite papers that have been publicized in the popular press.”

The authors compared the number of Science Citation Index references to NEJM articles that were covered by The New York Times with the number of references to similar articles that were not covered by the Times. They concluded that NEJM articles covered by the Times received a disproportionate number of scientific citations in each of the ten years after the articles appeared. (The Times is an appropriate yardstick because of its power; what it chooses to publish makes its way to many other media outlets.) The effect was strongest in the first year after publication, when NEJM articles publicized by the Times received 72.8 percent more scientific citations than control articles. This effect was not present for articles published during the 1978 newspaper strike, when the Times prepared an “edition of record” that was not distributed; articles covered by the Times during this period were no more likely to be cited than those not covered.

NEJM remains fully aware of the importance of article transmissions to the popular press. With the print edition published weekly on Thursdays, and with each week’s content available online at NEJM.org at 5 PM EST each Wednesday, NEJM offers advance-access subscriptions to “qualified journalists” with daily or weekly deadlines. When advance-access subscribers agree to abide by NEJM’s embargo policy, i.e., not to publish stories before online publication on Wednesday at 5 PM EST, they have complimentary access to embargoed content via the NEJM Media Center on the Friday prior to publication. At this time, advance-access subscribers also receive author contact information, which facilitates author interviews in advance of publication.

Of course, because NEJM publishes articles about medical issues of wide interest, its parent, The Massachusetts Medical Society, would place great emphasis on feeding clinical and medical research information to journalists. Other scholarly publishers, in medical and other disciplines alike, maintain similar press operations, with their own sets of rules, not only, I think, to disseminate results in journal papers and information in books for public benefit, but also to attract papers from prominent and up-and-coming authors, as well as to sell more subscriptions and more copies: to do well by doing good.

In addition to medicine, many scientific disciplines receive considerable press coverage; biology, physics, and astronomy come readily to mind. During the recent PROSE Awards judging, two science journals really stood out for me. The reason was their purposeful linking of scholarly articles with public issues and their attempts to secure press coverage.

The first one I want to discuss is a subscription-based chemical sciences journal, Chem, published by Cell Press. The journal’s primary aims, according to the publisher, are to showcase “chemistry as a force for good” by demonstrating “how fundamental studies in chemistry and its sub-disciplines may help in finding potential solutions to the global challenges of tomorrow.” (I’m reminded of the phrase “Better Living Through Chemistry,” the more common variant of a DuPont advertising slogan, “Better Things for Better Living...Through Chemistry,” used from 1935 until 1982. To circumvent trademark infringement, “Better Living Through Chemistry” was used on non-DuPont products. You’ve probably heard the phrase used to promote prescription or recreational drugs, to praise chemicals and plastics, or for sarcasm. [Hat tip to Wikipedia])

On submission, Chem authors are asked to categorize their contributions, including research articles, reviews, commentaries, discussions, and opinions, into at least one of the following ten Sustainable Development Goals identified and developed by the United Nations, which Chem selected for their relevance to chemistry: good health and well-being; affordable and clean energy; clean water and sanitation; climate action; zero hunger; sustainable cities and communities; responsible consumption and production; industry, innovation, and infrastructure; life on land; life below water. Cell Press says that Chem is the first journal to establish these wide-ranging links. According to the supporting materials that Cell Press submitted to PROSE, they’re taking on a “greater challenge.” They assert that chemistry is never seen in “the same positive light” as other scientific disciplines. When it comes to chemistry press coverage, the focus is on topics such as waste, environmental hazards, and weaponry. Cell Press drives home the point with an additional assertion: even several recent Nobel prizes in chemistry have been awarded to life science researchers. Is this initiative working? According to Cell Press, since Chem’s launch in July 2016, around 30% of its research articles have been picked up in both specialized and general news outlets. I’ll presume that the coverage has been positive.

The other journal that caught my attention during the PROSE Awards is an Open Access interdisciplinary journal, GeoHealth, published by the American Geophysical Union (AGU) in collaboration with Wiley. Started in 2017 (so not yet eligible for the PROSE Awards, but wait ’til next year), GeoHealth, according to AGU’s website, “highlights issues at the intersection of the Earth and environmental sciences and health sciences. It focuses on the following topics: environmental and occupational health; outdoor and indoor air quality and pollution; food safety and security; water quality, water waste treatment and water availability; climate change in relation to human, agricultural, and environmental health and diseases; soil health and services; ecosystem health and services; environmentally-related epidemiology; geoethics; national and international laws and policy, as well as remediation around GeoHealth issues; global Public Health; effects of climate change on exposure to pathogenic viruses, parasites and bacteria; human health risks of exposure to potentially harmful agents in the aquatic environment and through the food chain; remote sensing, satellite based observation of infectious disease and modeling; hydroepidemiology.”

GeoHealth’s content includes original peer-reviewed research papers, reviews, and commentaries discussing recent research or relevant policy, most of them invited by the editors. The current editor in chief is Gabriel Filippelli, Professor of Earth Sciences and Director of the Center for Urban Health at Indiana University. He has an ambitious vision for the journal: he wants it to “be an interactive, nimble, and perhaps even controversial vehicle for covering challenging issues.” Additionally, he wants the journal to have an international focus and will be soliciting research from regions such as Africa and parts of Southeast Asia.

The journal has an enviable pedigree. The founding editor is environmental microbiologist Rita Colwell, an internationally recognized expert on cholera and other infectious diseases. During her long and distinguished career, she served as the 11th director of the National Science Foundation (from August 1998 to February 2004). In 2008, she founded CosmosID, a company that uses systematic microbial identification, backed by high-resolution bioinformatics, to facilitate personalized treatment in health care and the monitoring of environmental biothreat agents. In addition to being chair of CosmosID, she holds Distinguished University Professorships at the University of Maryland and at the Johns Hopkins University Bloomberg School of Public Health.

When you go on GeoHealth’s website on the Wiley Online Library, you see a list of research articles. Below each article title, which is written in language approaching academic-speak, is a list of three “key points” written in pure layman’s terms. This presentation, it seems to me, will facilitate public awareness of the studies and distribution of their contents through the popular press. Sure enough, GeoHealth studies have been featured in such publications as Business Insider and even the New York Post (“Anthropogenic carbon dioxide emissions may increase the risk of global iron deficiency”); the Washington Post (“Next generation ice core technology reveals true minimum natural levels of lead (Pb) in the atmosphere: insights from the Black Death”); and Scientific American (“Impacts of oak pollen on allergic asthma in the United States and potential influence of future climate change”).

I wonder whether Chem and GeoHealth are signals about the future direction of the journals business. Will we see more of these general-news-oriented journals instead of narrowly focused twigs and branches extending from the limbs and trunks of discipline- and sub-discipline-based trees appealing only to specialists? I look forward eagerly to the answer to this question.

Oregon Trails from page 45

Black Betty by Walter Mosley

If you don’t know who Easy Rawlins is, I invite you to find out by reading about him in Walter Mosley’s novels. He is like no private eye, or anyone at all, you have ever encountered. As he solves mysteries, Rawlins also provides a running social commentary that gives us all something to think about.

Death at the President’s Lodgings by Michael Innes

Here is another professor of English literature using his professional background as a backdrop for murder and mayhem. The president in this novel oversees an imaginary English university situated midway between Oxford and Cambridge. Oh, to have such lodgings, sans the murders.

Earthly Delights, Unearthly Adornments: American Writers as Image Makers and About Fiction by Wright Morris

Most of Wright Morris’ books are novels and short stories, but he was also a gifted photographer and a literary critic with an insider’s feeling for American literature. Morris also read across borders, as will be seen in his twenty recommended readings appended to About Fiction. In 1989, I was privileged to meet, converse with, and dine with Mr. Morris. I regret that I had not read these two books before meeting him, because then I might have been able to hold up my end of our conversation and correspondence. The winner of two National Book Awards and two Guggenheim fellowships for photography, this native Nebraskan is probably the best unknown American writer ever.

And They Were There — Reports of Meetings: @Risk Forum, 13th APE, and the 37th Annual Charleston Conference

Column Editor: Sever Bordeianu (Head, Print Resources Section, University Libraries, MSC05 3020, 1 University of New Mexico, Albuquerque, NM 87131-0001; Phone: 505-277-2645; Fax: 505-277-9813) <sbordeia@unm.edu>

@Risk North Open Forum — The State of Shared Print Preservation in Canada — November 10, 2017 — Ottawa, Canada

Reported by Tony Horava (Associate University Librarian, Collections, University of Ottawa, Canada) <thorava@uottawa.ca>

The @Risk North Open Forum (http://www.carl-abrc.ca/news/save-the-date-at-risk-north-2017/) was held at Library and Archives Canada, in Ottawa. It was conceived as a Canadian-focused successor to the @Risk Forum held in Chicago in spring 2016 under the auspices of the Center for Research Libraries. The purpose of this forum was to give attendees an opportunity to discuss the state of shared print preservation programs in Canada, in a setting intended to push these conversations forward into action. Participants came from across the country, representing academic libraries, public libraries, government libraries, regional consortia, and national-level organizations. The day began with a keynote from Constance Malpas, Research Scientist at OCLC. In her talk, “Approaching the Long-Term Preservation of Print Documentation,” she explained that this issue is still


relatively new; we need to think about it in terms of new tools, and we need to think at scale. Redistributing curatorial responsibility across multiple institutions, building out the long tail, and sharing investment in stewardship are important. She argued that Canada is in a good place to be thinking about shared stewardship. In terms of the distribution of holdings of print books, there are 46M volumes, of which 92% are concentrated in 12 mega-regions. We need to think about movement of flows of books at a system level. There are 5.8M books held outside of these mega-regions (40%). Of these books, 89% are held in five or fewer libraries, and 15% are held uniquely in Canada. Extra-regional print books, for which there is less commitment to preservation, are at greater risk. A supra-institutional understanding that transcends organizational and geographic boundaries is necessary.

She cited Rick Lugg in arguing that institutional-scale collection management is not sustainable: there is either too much duplication, or too little! Collaborative-scale agreements are needed. Common cause is needed even among Ivy schools. Scarcity is common in research collections; scarcity decreases as the scale of collaboration grows. Consortium-scale partnerships leverage trust networks, and direct-borrowing consortial networks reduce friction in collection management.

continued on page 48




And They Were There
from page 47

She noted that European countries have moved significantly to shared print stewardship and collaboration. There are centralized models in Norway and Finland, where right-scaling of stewardship is important. She described four elements of conscious coordination: system-wide awareness (aligning local action with collective effort); explicit commitments (move commitments above the institutional level); division of labour/specialization (focus on collecting more specialized material); and reciprocal access (curate locally, share globally). We need to think in terms of inter-consortial rather than intra-consortial scales of collaboration.

Bernard Reilly, President of the Center for Research Libraries, gave a talk entitled “@Risk and National Coordinated Efforts in Print Preservation in the United States.” He described the shared print agenda for CRL and coordinated U.S. efforts in print preservation. Major U.S. shared print programs include Scholars Trust, the Big Ten Academic Alliance, WEST (Western Regional Storage Trust), and EAST (Eastern Academic Scholars’ Trust). These are based on MOUs and retention commitments, e.g., 25 years. He noted that approximately 422K titles are not registered in PAPR, and less than 1% have multiple copies registered. There are 462K unique titles in the social sciences and humanities across major libraries. He described the new reality of academic research libraries, namely that there is less funding today than ever before for public universities. We need to substantially expand the scope and improve the quality of the shared collection, merging preservation and e-access as key priorities. We need to significantly increase the number of serial titles that are adequately preserved AND accessible, and create a North American consensus on the scope, norms, and standards for print stewardship. We need to identify a critical corpus of serials worthy of digitization and preservation.
Unfortunately, there is no leadership at the national level in the U.S. He also discussed the importance of articulating a clear and convincing narrative for scholars and funders about the value of preservation efforts.

Maureen Clapperton, Director General at the Bibliothèque et Archives nationales du Québec (BAnQ), described the organization’s mandate, collections, and the digital preservation work that has been carried out to date. Monica Fuijkschot, Director General of Library and Archives Canada, gave a talk entitled “State of the Ark: LAC Initiatives Supporting Print Preservation.” She described six key principles related to retention of print collections at risk: 1. LAC has communicated its willingness to hold last copies of Canadiana; 2. LAC holdings are described in the National Union Catalog; 3. LAC’s preservation copies and rare books are held in appropriate preservation environments; 4. print material will continue to be available onsite, and LAC will lend material if it is the only institution in Canada that holds it; 5. LAC is committed to holding its Canadiana collection in perpetuity; 6. LAC has historically sought to transfer deselected material to other institutions, and will continue to do so.

This was followed by a panel of representatives from different regional initiatives discussing current work in shared print management: the COPPUL (Council of Prairie and Pacific University Libraries) Shared Print Archive Network; TUG (Tri-University Group: Wilfrid Laurier University, University of Guelph, University of Waterloo); Scholars Portal/OCUL (twenty-one academic libraries in Ontario); and Keep@Downsview (five academic libraries within Ontario). There were breakout sessions during which attendees were asked to consider the issues, priorities, and opportunities for a national preservation strategy, and the role of Library and Archives Canada, regional consortia, and the Canadian Association of Research Libraries.
What followed was a lively discussion and a general consensus that developing such a strategy would be timely, strategic, and necessary. Participants discussed the types of collections that would be important to preserve. It was also clear that the participants envisage an important role for Library and Archives Canada, in close partnership with other key stakeholders in the Canadian landscape. The issues around preservation require sustainable approaches and intensive collaboration with many partners. The issues are large-scale and challenging, involving funding, coordination, and long-term commitment. There was a recognition of how important coordinated preservation at the national level is, to ensure that our scholarly and cultural record is preserved for future generations. It was also clear that coordinated preservation is essential for citizens to be able to ask questions, to know their heritage, and to develop a new understanding of identity and place. Special collections are unique, fugitive, and essential materials. There was a definite sense of how critical it is to harness our collective expertise, resources, and capacity. Risk and opportunity are closely linked; I hope that there will soon be developments to build upon the groundwork that was laid at this very timely forum.

Academic Publishing in Europe conference (APE) — Publishing 2020: Ramping Up Relevance in a Multi-faceted, Fragmenting System of Research Output and Innovation — January 16-17, 2018 — Berlin, Germany

Reported by Anthony Watkinson (CIBER Research) <anthony.watkinson@btinternet.com>

“Publishing 2020: Ramping Up Relevance” is the short title of the thirteenth Academic Publishing in Europe conference (APE). The full title gives a pretty good idea of the content. The site is https://www.ape-conference.eu/, which currently carries the programs and lots of photographs but is due to carry videos and presentations, probably by the time this report appears. It also links to an excellent earlier report from Chris Armbruster for the magazine Research Information: https://www.researchinformation.info/news/analysis-opinion/ape-2018-conference-report. The dates were 16-17 January in Berlin, with a pre-conference organized with the SSP the day before. Attendance is international, which here in Europe also includes significant numbers of visitors from the USA. But it is a select gathering, with numbers for the main event dictated by the size of the historic Leibniz Hall of the Berlin Brandenburg Academy of Sciences and Humanities (BBAW): official figures are 75 for the SSP pre-conference and 241 for the main event, with a waiting list of 24.

Like Charleston, there is a presiding genius, in the larger shape of Arnoud de Kemp, once a very senior director of Springer Verlag. His approach is distinctively Continental European, which shows those of us in the Anglo-American world a different way of thinking, especially relating to the “transition to open access” (see below). He has also always thought not just about what the big publishers want but about the larger ecosystem: the closing words of the report of the first conference read, “Despite all the energy and investments publishers are devoting to change their role, if they are not seen as adding enough value to the chain and not seen as proactive enough, authors, libraries and funding agencies will vote with their feet.” (www.ape2006.de/APE2006_finalreport.pdf). What follows is highly selective.

The first morning is always devoted to big names honored as keynotes. “Open Science” was the central theme; you could argue that it was the central theme of the whole conference. Open Scholarship would have been better, but you cannot have everything, and all these speakers were thinking in terms of science. Professor Sabine Kunst of the Berlin University Alliance had definite views about the policies of this organization. She saw open science as the scientific version of self-publishing; there is a lot of baggage in this suggestion, not necessarily understood by her. Her point was that open access using existing technology “makes it possible for researchers to manage the publication process independently of publishers and to design it to their own discretion.” Peer review could be transferred to universities. These were plans.

David Sweeney had a more cautious view. He has actual responsibilities as executive chair designate of Research England, supremo of a new government structure distributing research money across all fields in the largest UK country. He has to monitor an ongoing process. As a custodian of the Finch project, the UK government process for transitioning to Open Access, he is professionally interested in how it has gone. One answer (mainly positive) is provided by the official report to the UK universities at http://www.universitiesuk.ac.uk/policy-and-analysis/reports/Documents/2017/monitoring-transition-open-access-2017.pdf. Another (mainly negative) view is from
continued on page 49



And They Were There
from page 48

Dr. Danny Kingsley, who is the scholarly communication guru at Cambridge University Library: https://www.repository.cam.ac.uk/handle/1810/269913. Sweeney notes that payments for open access in hybrid journals are where most of his money has gone, but they have not led to the flipping of business models from subscription-based to fully open access. Can publishers remain partners?

Another big presentation was also from government, by veteran Eurocrat Jean-Claude Burgelman. Open Access empowers scientific communities and supports innovative business solutions. Part of the program is now a European Commission Open Research Publishing Platform, following the best practice established by the Gates Foundation and the Wellcome Trust; see https://ec.europa.eu/research/openscience/pdf/information_note_platform_public.pdf#view=fit&pagemode=none. He and his colleagues have ambitious plans for open data, but he recognizes that it will not be easy, as the presentation by David Nicholas (below) explains. Burgelman’s views were complemented by those of Professor Johannes Vogel, who heads up the Berlin Museum of Natural History but is also Chairman of the EU Open Science Policy Platform (https://www.openaire.eu/open-science-policy-platform). Citizen scientists are involved in decision making. For him, from a museum angle, the alternatives are “deep change or slow death.”

There had been some grounded presentations in the pre-conference from the UK Medical Research Council, the Association of Universities in the Netherlands, and the Swiss Rectors’ Conference, inter alia, under the heading “How is public policy and funding changing the flow of scholarly communication?”

In a later session the presentation by Professor Nicholas was something of a corrective: http://ciber-research.eu/download/20180116-APE.pdf. His team have been interviewing early career researchers (the academics of the future) across seven countries in a longitudinal study of their ideas and practices. ECRs believe in sharing, openness, and transparency, but they also need to publish in journals that have high impact factors. They cannot afford to make the data from their research open to all because they need to be the first to exploit it in publications. In the last session Dr. Rafael Ball of ETH Libraries was also very aware of the barriers. Niko Goncharoff of Digital Science asserted that publishers were not missing the boat and were on board as far as open science and data are concerned: but “changes in community behavior and culture” must come first.

What else about publishing positions? There was a publishing keynote by Dr. Michiel Kolman, the senior Elsevier executive who is president of the International Publishers Association. His basic message was that his members have a mission to maximize their role as stewards of truth and quality, that the promise of open access cannot be left to pirates (Sci-Hub was name-checked), and that (alas) stakeholders are divided, as we shall see.

There was also a full session on “Piracy,” with speakers from the three biggest companies. The presentations reflected the fact that publishers differ in how to deal with piracy, but in the nicest possible way. Duncan Campbell from Wiley gave some useful definitions: piracy is the commercial violation of legally sanctioned intellectual property, a symptom of unmet user needs and market demand, and the exploitation of the gap between price and value. Wouter Haak (Elsevier) concentrated on sharing. There are some access problems, but is researcher uptake just due to problems of access, which do exist, or is it more about convenience? We can work with scholarly collaboration networks to solve these problems, and he showed how; see for example the Coalition for Responsible Sharing (http://www.responsiblesharing.org/), which appears to be getting traction among major learned societies.
Wim van der Stelt (Springer Nature) chose as his title "Will showing teeth solve the problem?" and suggested different approaches to ResearchGate and to SciHub, which seem to be happening. In the subsequent discussion Rafael Ball, from a library viewpoint, urged cooperation. SciHub now ingests books. Charlie Rapple, a fourth speaker, suggested that discovery services are the big losers to ResearchGate. There were several other presentations from other publishers and a vendor. The APE lecture was given by Dr. Annette Thomas, now CEO at Clarivate Analytics, the Thomson spin-off. She has already shown her strategy by the resurrection of the Institute for Scientific Information (ISI). A very different talk from Annie Callanan, relatively new CEO of Taylor & Francis, was a graphic "modest proposal for relevancy." Quite different too was a substantial contribution, "Building an Academic-Led Publisher for the Digital Age," by Dr. Caroline Edwards of the Open Library of Humanities. She brought monographs into the discussion and impressed the librarians present; her theme was "Opening up Scholarly Dialogue." Later there was a whole session on the "Benefits of OA Books." Dr. Frances Pinter emphasized practicalities with special reference to Knowledge Unlatched; Eelco Ferwerda of OAPEN described the European landscape, highlighting his own "A Landscape Study on Open Access and Monographs" (https://scholarlyfutures.jiscinvolve.org/wp/2017/10/landscape-study-open-access-monographs/); and Ros Pyne, who heads up policy and development at Springer Nature Open Research (the biggest publisher of OA books), produced evidence for the OA effect: big increases in downloads should, and do, encourage authors to go OA (https://www.springernature.com/gp/open-research/journals-books/books/the-oa-effect).

Finally, there were two "technology" sessions of interest to both publishers and librarians. Dr. Eefke Smit (Director of Standards and Technology at STM) moderated a series of presentations under the heading "Blockchain: Hype or Game Changer." They varied in comprehensibility for the lay person. The opening speaker, consultant Dr. Joris van Rossum, offered two sites for further scrutiny, both of which repay study: https://www.digital-science.com/press-releases/digital-science-report-reveals-potential-behind-blockchain-technology-scholarly-communication-research/ and https://www.blockchainforscience.com/. The latter organization's founder, Dr. Soenke Bartling, also spoke. Van Rossum sees real potential but picks out how to gain trust as the current barrier. The entrepreneur Eveline Klumpers pointed to her start-up; see https://www.katalysis.io/about-us/. Blockchain technologies lower the cost of micropayments, but in this world anonymity is impossible. Finally, Lambert Heller of TIB Hannover provided a librarian perspective which seemed to offer a contrasting message: "Blockchains allow for exchange of value, following transparent rules, without having to trust any player." Not all the follow-ups from his slides seem to go anywhere. The jury seems to be still out. The second session was on artificial intelligence. It is clear from the presentations that AI is already being embedded in processes we are familiar with. Richard Wynne of Aries (a maker of online editorial systems) gave a good account of what his company is doing. Tahir Mansoori of Colwiz (now part of Taylor & Francis) showed examples of enhanced analytics. Dr. Thomas Lemberger of EMBO (the European Molecular Biology Organization) explained projects involving AI which are part of the emerging open science landscape; see http://www.embo.org/news/press-releases/2017/sourcedata-is-making-data-discoverable.

Issues in Book and Serial Acquisition, “What’s Past is Prologue,” Charleston Gaillard Center, Francis Marion Hotel, Embassy Suites Historic Downtown, and Courtyard Marriott Historic District — Charleston, SC, November 6-10, 2017 Charleston Conference Reports compiled by: Ramune K. Kubilius (Northwestern University, Galter Health Sciences Library) <r-kubilius@northwestern.edu> Column Editor’s Note: Thank you to all of the Charleston Conference attendees who agreed to write short reports that highlight sessions they attended at the 2017 Charleston Conference. All attempts were made to provide a broad coverage of sessions, and notes are included in the reports to reflect changes that were not printed in the conference’s final program (though some may be reflected in the online schedule, where links can also be found to presentations’ PowerPoint slides and handouts). Please visit the conference site http://www.charlestonlibraryconference.com/ to link to selected videos as well as interviews, and to blog reports, written by Charleston Conference blogger, Donald Hawkins. The 2017 Charleston Conference Proceedings will be published in 2018, in partnership with Purdue University Press. In this issue of ATG you will find the first installment of 2017 conference reports. We will continue to publish all of the reports received in upcoming print issues throughout the year. — RKK

MONDAY, NOVEMBER 6, 2017 PRECONFERENCES AND SEMINARS The Charlotte Initiative for Permanent Acquisitions of E-books by Academic Libraries – Research Project Outcomes and Next Steps — Presented by Michael Zeoli (YBP Library Services); Theresa Liedtka (University of Tennessee at Chattanooga); Rebecca Seger (Oxford University Press); John Sherer (University of North Carolina Press, University of North Carolina); October Ivins (Ivins eContent Solutions); Elizabeth Siler (UNC Charlotte); Alison Bradley (Davidson College); Kelly Denzer (Davidson College); Kate Davis (Scholars Portal, OCUL) Reported by Jack Montgomery (Western Kentucky University Libraries) <jack.montgomery@wku.edu>


To examine the ever-changing library market, the Mellon-funded Charlotte Initiative has been studying the eBook market with conventional academic usage as its model. The Initiative has three stated goals: first, to achieve irrevocable acquisition of, and access to, eBooks in the academic setting; second, to allow unlimited simultaneous users for eBooks; and third, to secure freedom from Digital Rights Management issues such as proprietary formats and restricted access to content. Presenters from the various organizational teams and librarians reported on the progress and current status of the initiative, including a major literature review. A wide variety of issues were discussed, including the changing role of the traditional university press and the suggestion by publishers that the future lies in the sale of large eBook collections rather than single-title sales. Licensing issues included a desire to standardize the language of contracts and the contradictory ideas of perpetual access alongside publishers' ability to terminate agreements at any time. In truth, "perpetual access" and "DRM" are terms for which there is no industry-wide consensus on definition and application, but that consensus is needed for eBooks to evolve.

WEDNESDAY, NOVEMBER 8, 2017 MORNING PLENARY SESSIONS 21st Century Academic Library: The promise, the plan, a response — Presented by Loretta Parham (Atlanta University Center (AUC) Robert W. Woodruff Library) Reported by Ramune K. Kubilius (Northwestern University, Galter Health Sciences Library) <r-kubilius@northwestern.edu>

Parham started out by reminding the audience of the not very optimistic outlook for librarianship reported in the USA Today story (and elsewhere), "8 jobs that won't exist in 2030." Her tour of the landscape included points made by ACRL and Educause in their top trends lists, the need to move on from yesterday's vocabulary, and traits of Gen Z. She emphasized the role of special collections in preserving the work of "heroes and sheroes." Advice she quoted from the 2016 David W. Lewis book, Reimagining the Academic Library, included: be proactive, market repeatedly, and "sell the change." This resonated in Parham's talk as she recounted the story of the formation of the Atlanta University Center's Robert W. Woodruff Library and showed a film clip. One wishes the speaker had expounded a bit on whether other academic institutions' library services could similarly benefit from the formation of a consortium with an incorporated library, modeled after AUC. For a detailed report on old and new vocabulary and more mentioned during this presenter's talk, read the blog report by Donald Hawkins: www.against-the-grain.com/2017/11/the-opening-session-21st-century-academic-library/.

Technology and Platforms: What's On the Horizon — Presented by Georgios Papadopoulos (Atypon) NOTE: The title presented at the conference varied slightly from the scheduled title — Scholarly Communication Technology: Present and Future. Reported by Ethan Cutler (Western Michigan University Homer Stryker M.D. School of Medicine) <ethan.cutler@med.wmich.edu>

Papadopoulos, CEO and founder of Atypon, began the plenary session with a background of his extensive career and the "dream of a better technology for scholarly communication" with which it began. Today, Papadopoulos said, the tech industry is finally in a position to develop the products needed to improve scholarly communication, but it needs the publishing and library communities to adopt and embrace new technology standards. He explained that the industry's technology standards have remained largely stagnant for the last 20 years, citing outdated authentication processes, static content, and imperfect discovery and archiving methods as the fundamental hurdles to moving the industry forward. He both justified the need for change and detailed how it could be facilitated: improving access through better authentication technologies, moving content standards from HTML to EPUB, and using robots for discovery and archiving. Nonetheless, Papadopoulos reminded the audience of the cyclical undertone of progress, predicting that scholarly communication technology will change every 20 years. Questions following the presentation reflected the dependent and supportive relationship among technology providers, publishers, and librarians; attendees also asked for recommendations on new infrastructure and capital investment. Papadopoulos responded that there is no need to invest in new capital, saying "the basics are there." In closing, he painted an optimistic picture of the future, one where new technologies will soon be available to meet the demands of publishers and librarians alike. Read also the report on this plenary by Charleston Conference blogger, Donald Hawkins: http://www.against-the-grain.com/2017/11/technology-and-platforms-whats-on-the-horizon/.

WEDNESDAY, NOVEMBER 8, 2017 NEAPOLITAN SESSIONS PrePrints, IR’s & the Version of Record — Presented by Judy Luther (Moderator, Informed Strategies); Ivy Anderson (California Digital Library); John Inglis (Cold Spring Harbor Laboratory Press); Monica Bradford (AAAS/Science) Reported by Rachel Besara (Missouri State University) <rachelbesara@missouristate.edu>


The session was structured as a discussion moderated by Luther. Brief introductory remarks gave the perspective from which each panelist approached the discussion, and each of the three panelists gave a short presentation. Inglis noted that scientific researchers do not know or care what the Version of Record is for a given article; publishers and librarians are the ones concerned with that distinction. Should the version of record then become the version with the record? Bradford touched on the fact that technology can now support a "living document" on a preprint server, but then the as yet unsolved question is how credit gets assigned across a document's lifespan. Anderson pointed out that the future lies with immediate publication and post-publication peer review, but what is the impact of such versioning on the value metrics libraries use with their institutions? The presentations were followed by a discussion of pre-arranged questions. Finally, remarks and questions were taken from the floor.

Publication Ethics, Today's Challenges: Navigating and Combating Questionable Practices — Presented by Ramune Kubilius (Moderator, Northwestern University, Galter Health Sciences Library); Jayne Marks (Wolters Kluwer); Barbara Epstein (University of Pittsburgh Health Sciences Library System); Jenny Lunn (American Geophysical Union); Duncan MacRae (Wolters Kluwer) Reported by Ramune Kubilius (Northwestern University, Galter Health Sciences Library) <r-kubilius@northwestern.edu> Marks introduced the timely session as one whose theme grew out of a 2017 Fiesole Retreat discussion. Lunn described the role of a society publisher: to provide guidelines, check incoming manuscripts, resolve issues, and mediate disputes, though the society community needs to be self-policing. Ethics red flags include requests to remove a co-author, a plagiarism (software) match of over 15%, and refusal to share data. What does experience suggest? There is some author ignorance, everyone is responsible, every situation is unique, and some cases are just tips of the iceberg. MacRae focused on new developments in academic fraud, including a competitive worldwide scene with third-party agencies providing fake peer reviews, selling authorship, and selling content on demand. Unfortunately, there are governmental incentives for publication in some countries (e.g., China, as described in a recent New York Times article). Journal responses? Close loopholes and enforce stricter policies. One effort is Think. Check. Submit. (thinkchecksubmit.org). Epstein described academics walking a tightrope. Authors have described quandaries such as picking the "right" journal, meeting funding mandates, avoiding predatory journals, choosing in which repository to deposit, and deciding what to do with preprints. Regarding data, they might argue: why share it, what should be shared, "my data is complicated," "is it my problem to help," "what if flaws in my data are exposed," etc.

There is a new scholarly communication paradigm: a clamorous marketplace, resentment toward publishers, and a blurry line between the predatory and the trustworthy. Admittedly, library access to resources can be convoluted and slow (no matter how hard we try). "Education only reaches the willing," she reminded, and "the scholarly communication river will continue flowing downhill around barriers in its way" (it won't stop, so we had better find ways to adjust). Kubilius stepped in to help monitor questions, which included mention of predatory behavior vs. low-quality journals and some discipline-specific nuances when it comes to data sharing. Read also the session report by Charleston Conference blogger Donald Hawkins: http://www.against-the-grain.com/2017/11/publication-ethics-todays-challenges/.

Wide Open, or just Ajar, Evaluating Real User Metrics in Open Access — Presented by Charles Watkinson (Moderator, Univ. of Michigan); Amy Brand (MIT Press); Byron Russell (Ingenta Connect, Ingenta); Hillary Corbett (Northeastern Univ. Libraries) Reported by Amy Lewontin (Snell Library Northeastern University) <a.lewontin@northeastern.edu> continued on page 61



LEGAL ISSUES Section Editors: Bruce Strauch (The Citadel) <strauchb@citadel.edu> Bryan M. Carson, J.D., M.I.L.S. (Western Kentucky University) <bryan.carson@wku.edu> Jack Montgomery (Western Kentucky University) <jack.montgomery@wku.edu>

Cases of Note — Copyright – Substantial Similarity Column Editor: Bruce Strauch (The Citadel, Emeritus) <bruce.strauch@gmail.com> TAYLOR CORPORATION V. FOUR SEASONS GREETINGS. UNITED STATES COURT OF APPEALS FOR THE EIGHTH CIRCUIT, 403 F.3d 958; 2005 U.S. App. LEXIS 5866. Four artists worked designing cards for Creative Card Company. They created the six card designs that caused this action. Creative Card was the copyright owner due to the employment relationship. You know. The old Work-for-Hire thingy. Creative Card's president ditched the company, formed Four Seasons and hired three of the four artists. And how original is anyone? They created six card designs awfully similar to six they had done at Creative. Then Creative Card went bankrupt. Taylor Corp., yet another greeting card company, bought Creative's assets including "all intellectual property of the Business, including … copyrights … artwork, designs and other intangible property." An attached schedule listed hundreds of greeting card designs, among which were the disputed six. And Taylor sued Four Seasons for copyright infringement, saying the six new designs were "substantially similar" to the former six. At the trial court, Taylor won on infringement and got an injunction prohibiting Four Seasons from any future use of the designs. Four Seasons appealed primarily on the basis that the trial court had failed to identify which elements of the cards were original. Presumably wanting to use the unoriginal bits.

The Appeal

Four Seasons argued that an appeal of substantial similarity required de novo review, citing the Second Circuit's Boisson v. Banian, Ltd., 273 F.3d 262, 272 (2d Cir. 2001), and Folio Impressions, Inc. v. Byer Cal., 937 F.2d 759, 766 (2d Cir. 1991). Their reasoning was that substantial similarity only requires a visual comparison of two works rather than re-judging the credibility of witnesses. However, Federal Rule of Civil Procedure 52(a) is clear that "in all actions tried upon the facts without a jury … the court shall find the facts," and then describes the standard of appellate review: "Findings of fact, whether based on oral or documentary evidence, shall not be set aside unless clearly erroneous."


Settling that little issue rather neatly. And what’s wrong with the Second Circuit?

Substantial Similarity

The Eighth Circuit (that's where this case is) uses a two-step analysis: extrinsic and intrinsic tests. Is there a similarity of ideas? This is looked at extrinsically, focusing on objective similarities in the details of the works. Is there a similarity of expression? This is analyzed intrinsically, asking for the response of the ordinary, reasonable person. Can anyone seriously follow this? Let's try harder. Extrinsic: the card designs share similar ideas. Intrinsic: the designs share similarities of expression. It actually makes a tad more sense when you see the titles of the cards: Colored Presents, Ribbon of Flags Around Globe, Three Worlds of Thanks, Globe Ornament, Pencil Sketch Farm, Thanksgiving Cart, and Wreath with Verse. And if you Google "Taylor Greeting Cards Thanksgiving Cart" you can get a sense of the two elements without having to rely on the desiccated abstraction of legalese. Four Seasons didn't dispute the extrinsic analysis. For example, similar holiday themes: Thanksgiving, Christmas wreath with verse. But it argued that the district court, on the intrinsic analysis, should have filtered out the unprotectable elements, also known as analytic dissection. And you can see their point. How are they to do holiday themes without using a pumpkin or a fir tree? But the Eighth Circuit rejected this labor. See Dr. Seuss Enters. v. Penguin Books USA, Inc., 109 F.3d 1394, 1399 (9th Cir. 1997). The ordinary, reasonable observer views the work as a whole and asks if there is substantial similarity. Pumpkins and fir trees are obvious fair game. Are they drawn, arranged, colored the same? What about design and use of lettering? The district court did an exhaustive side-by-side comparison noting the similarities.

Questions & Answers — Copyright Column Column Editor: Laura N. Gasaway (Associate Dean for Academic Affairs, University of North Carolina-Chapel Hill School of Law, Chapel Hill, NC 27599; Phone: 919-962-2295; Fax: 919-962-1193) <laura_gasaway@unc.edu> www.unc.edu/~unclng/gasaway.htm

QUESTION: An academic author asks about the CASE Act and its likelihood of passage in the near future.

ANSWER: The Copyright Alternative in Small-Claims Enforcement Act of 2017 (CASE Act), H.R. 3945, is a bill introduced to help professional creators and small business owners who rely on the copyright system for their businesses. The act seeks to address the fact that the copyright system provides rights for these creators and businesses but no effective means for them to enforce those rights outside of expensive federal litigation. The American Intellectual Property Law Association estimates that the cost of litigating through the appeals process is $350,000, and federal litigation is too complicated for creators and small businesses to take on without the assistance of counsel. A survey by the American Bar Association found that most attorneys would




not consider taking a case if the amount in controversy is less than $30,000. Thus, copyright infringement often goes unchallenged, and many creators and small businesses feel disenfranchised by the copyright system. This is especially acute for creators such as photographers, graphic artists, authors, songwriters, bloggers and YouTubers because the individual value of their work is often too low to warrant the expense of litigation. The CASE Act is an attempt to rectify this and give creators a small-claims process to address infringement through a hearing before a three-judge board within the U.S. Copyright Office. The process would be voluntary for both parties. It limits an alleged infringer's liability to $15,000 per work and $30,000 total, and insulates infringers from awards of attorney fees unless they act in bad faith. This is in contrast to litigation, where statutory damages range up to $30,000 per infringed work and up to $150,000 if the infringement is found to be willful (and courts often find willfulness). Moreover, the board can hear claims by both creators and users. The act dictates that the Librarian of Congress appoint board members who must have experience representing the interests of both users and creators, and they will be required to follow legal precedent in deciding cases. A similar act in the United Kingdom resulted in more settlements rather than more litigation. The bill is bipartisan and appears to be good for both the creators of copyrighted works and users, but there is no way to predict what Congress will do with it.

QUESTION: A school librarian asks the best way to provide student access to online tutorials that teach software skills.

ANSWER: It depends on the copyright status of the online tutorial. If the work is copyrighted, as are most commercially produced tutorials, one must read the copyright notice and seek permission to reproduce the tutorial unless the notice specifies otherwise. If the tutorial is published online with a Creative Commons license, then the terms of that license apply. If the author of the tutorial indicates that it may be freely used with no restriction, then the tutorial may be reproduced for students either in print or on a course management system. An alternative is to provide students with links to the tutorials rather than reproducing them.

QUESTION: A city's public library has a large collection of published sheet music. A librarian asks whether it is copyright infringement to provide a digital copy of copyrighted sheet music to an individual patron upon request.

ANSWER: The U.S. Copyright Act, section 108(d), permits libraries to make single copies of portions of works for a user upon request. There are exclusions from this section of the Act, however. Section 108(i) states that the exceptions provided in section 108 "do not apply to a musical work, a pictorial, graphic or sculptural work, or a motion picture or other audiovisual work other than an audiovisual work dealing with the news." A musical work may be embodied in sheet music, a musical recording, etc., so libraries do not have permission to reproduce sheet music even in response to a user request. Certainly, fair use applies, but fair use most often applies to a portion of a work, not to the full work, and sheet music for an entire song is an entire copyrighted work. Today, there are many cost-effective online sources for digital copies of sheet music to which a user can be referred.

QUESTION: A university librarian asks whether the recent WikiLeaks posting of Michael Wolff's new book on President Trump, Fire and Fury, constitutes copyright infringement.

ANSWER: The press has reported that WikiLeaks tweeted what appeared to be a full-text PDF copy of the work right after the book reached the bestseller list. Typically, one who posts an infringing copy of a work online is liable for direct copyright infringement. A harder question is posed when person "A" posts the work and person "B" distributes a link to the infringing content; liability for sharing the link is less likely. However, when person "C" downloads the infringing content, he or she has also infringed the copyright. In this instance, WikiLeaks says that "someone" leaked the content online and it simply tweeted the link to where the content could be found. Thus, the question is whether WikiLeaks is liable for inducing or contributing to infringement. After the tweet, Google removed the PDF file as soon as it became aware of it, so in reality this may be more of a hypothetical question than one of actual liability.

QUESTION: A public librarian reports reading something about the late-night talk show host Conan O'Brien and the infringement of copyrighted jokes. She asks whether jokes are copyrightable.

ANSWER: For years, social norms have dictated that comedians not steal the jokes of other comedians; anyone who takes another's joke is more or less shamed by his or her peers. The copyright question, though, is an interesting one, and a recent case may have further confused the matter. Jokes, like other literary works, qualify for copyright protection if they possess at least a minimal amount of originality and contain enough expression, more than a short phrase. A freelance comedy writer, Robert Kaseberg, claims to have posted four jokes on Twitter which Conan used on his show in an altered form. At issue is whether the jokes were or should have been registered for copyright and, if so, whether Conan infringed them. The federal district court for the Southern District of California ruled in May 2017 that the case would not be decided on summary judgment but would go forward to trial. In her ruling, the judge held that jokes qualify only for thin copyright protection. According to the court, most jokes begin with a factual sentence and are followed by a second-sentence punch line. The underlying idea of the joke, as well as the facts in the first sentence, are not copyrightable. Further, a joke does not have to be identical to a copyrighted one to infringe. Jokes have a limited number of variations in protectable expression, which gives them only thin protection, because they must be (1) humorous; (2) applied to the facts articulated in the joke's first sentence; and (3) have mass appeal. The trial was originally to be held in August 2017, but further disputes have arisen over whether some of the jokes should have been registered and whether the plaintiff's lawyer committed fraud before the Copyright Office in the documentation submitted for registration of one of the jokes. In November, Conan's lawyers filed a complaint with the original judge on these matters. When the case goes forward on its merits, it should provide some clarity on copyright infringement of jokes. Regardless of the outcome, whether it will have any effect on the social norms among comedians is not known.

Rumors from page 43

who wish to become involved; Katalysis will use its market-leading expertise in blockchain technologies to implement the test platform; Springer Nature will participate with a selection of its journals and give key input around publisher and peer review workflows; ORCID will provide insights and know-how around personal identifiers and authentication. Blockchain is a technology for decentralized, self-regulating data. We would like a panel at the 2018 Charleston Conference on blockchain technology. Are you interested? The call for papers has just been posted. Visit www.charlestonlibraryconference.com. Meanwhile — Happy Daylight Saving Time. Spring is coming! — Yr. Ed.


Biz of Acq — Planning for the Unplannable: Meeting Unexpected Collections Budget Demands by Christine K. Dulaney (Associate Librarian/Director of Technical Services, American University Library, Washington, DC 20016; Phone: 202-885-3840) <cdulaney@american.edu> Column Editor: Michelle Flinchbaugh (Acquisitions and Digital Scholarship Services Librarian, Albin O. Kuhn Library & Gallery, University of Maryland Baltimore County, 1000 Hilltop Circle, Baltimore, MD 21250; Phone: 410-455-6754; Fax: 410-455-1598) <flinchba@umbc.edu>

Unplanned budget demands can take many forms, and any one of these events can cause a financial crisis: cost overruns due to license agreement renegotiations, budget cutbacks from administration, fluctuations in inflation or currency exchange rates, unanticipated standing order deliveries, programmatic errors in predicting patron-driven purchases, or miscalculations of approval plan projections. Human error anywhere along the supply chain, including at publishers or vendors, can create financial problems. Our library faced unanticipated budget demands when, mid-fiscal year, our Demand Driven Acquisitions (DDA) program ran out of funds. We had agreed to participate in a pilot consortial DDA program with several other libraries. Within six months of turning on the DDA, however, the original allocation was expended and each library was required to contribute additional funds; three months later, the libraries were required to contribute yet again. Although these unexpected expenditures were not overwhelming, and were supported by our library administration, they could have been devastating to our collections budget. Because we had planned in advance for the unexpected, however, we could absorb them. So how did we manage these unanticipated expenditures? Successfully planning for unexpected budget expenditures requires several actions on the part of the budget manager. First, you must have a well-conceived and logical budget ledger with carefully thought-out and defined funds.1 Second, you must have a clear understanding of your library's expenditures and be able to anticipate a full year of expenses, keeping in mind that different kinds of expenditures behave in different ways depending on the resource, the vendor, and the format. Third, you need to understand how much flexibility you have in your budgeting authority as well as where you can build flexibility into your budget. Finally, be aware of where you can make quick cancellations in your collection.

Understand Your Expenditures

Defining your collection funds and setting up a logical ledger is the first step toward understanding how you are expending your allocated financial resources. The library literature provides many examples of how libraries have created and defined the funds within their ledgers. Funds should be granular enough to provide information by subject, format, acquisitions vehicle, funding source, etc. Your fund tree should be flexible enough to support useful analytic reports that explain your library's spending patterns.2

In most libraries, there are two basic forms of expenditures: one-time expenditures and ongoing expenditures. In organizing your budget ledger, keep these two categories separate.

Ongoing expenditures include serials, standing orders, journals, and database renewals: essentially any type of resource that is "repurchased" each year through a renewal in order to maintain access to current content. Purchasing continuations creates an expectation among your users that the content of that database or periodical will remain available and current. At the beginning of each fiscal year, allocate enough funds in your new budget to cover all your renewals plus an estimated rate of inflation. You can get inflation projections from a variety of sources, including vendors and the library literature; Library Journal publishes an annual periodical price projection every April.3 Each year, allocations need to cover these renewals plus an estimated percentage of inflation, which fluctuates from year to year but may run from 5%-8% or even higher depending on format and subject matter.

One-time expenditures, or firm orders, are typically payments for monographs or other resources whose content is static and purchased with a single payment. At the beginning of each fiscal year, contact your approval vendor for an approximation of the cost of any approval plan you may have. The proportion of your budget allocated to firm or one-time purchases is your discretionary money. These funds are expended as titles or packages are selected, and they give you flexibility because you have some control over whether you will purchase these titles.
In the event of a budget emergency, you can always reduce the allocation for your firm orders to cover a cost overrun in another area. Our library's collections budget also pays for collection maintenance and service items such as bibliographic record purchases, binding, systems fees, etc. Even for these annual costs, we track where we have flexibility for a last-minute cancellation. Can we stop preparing bindery shipments to transfer funds to cover a cost overrun in a different area? Can we order replacement copies only on request?
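The renewal-plus-inflation arithmetic described above can be sketched in a few lines of code. This is a minimal illustration, not a real budgeting system; the fund names, costs, and inflation rates below are invented for the example, and real projections should come from your vendors or published price surveys.

```python
# Illustrative sketch: projecting next year's ongoing-expenditure allocations
# as current cost plus an estimated inflation rate per fund.
# All fund names, costs, and rates here are hypothetical.

RENEWALS = {
    # fund: (current annual cost, estimated inflation rate)
    "science_journals":   (120_000, 0.07),
    "humanities_serials": (45_000,  0.05),
    "databases":          (200_000, 0.06),
}

def project_allocations(renewals):
    """Return next-year allocation per fund: cost plus estimated inflation."""
    return {fund: round(cost * (1 + rate), 2)
            for fund, (cost, rate) in renewals.items()}

projections = project_allocations(RENEWALS)
total = sum(projections.values())

for fund, amount in projections.items():
    print(f"{fund}: {amount:,.2f}")
print(f"Total ongoing allocation needed: {total:,.2f}")
```

Keeping the per-fund rates separate (rather than applying one blanket percentage) mirrors the point above that inflation varies by format and subject matter.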

Strategies for Creating Savings

How can you find unallocated funds to cover a cost overrun or solve a budget crisis? You should be aware of how much flexibility and authority you have in creating funds and establishing allocations. In our library, we have the authority to set up any necessary funds and to establish allocations for them. If this is the case for you as well, first set up a discretionary account that you can use to cover a budget emergency. If you end the fiscal year without needing these funds, you can use them to make end-of-year one-time purchases. In our library, we are required to expend our entire collections budget within a single fiscal year; if you can, however, it is even more advantageous to roll these "discretionary" funds over from one fiscal year to the next. At our library, we were able to use funds allocated to a discretionary account to enter into a consortial DDA agreement midway through the fiscal year, and then to cover the unexpected increase in DDA costs.

Second, know which vendors maintain deposit accounts or credit balances on your behalf. Setting up, replenishing, and using credits or deposit accounts strategically can help when you are faced with a budget emergency. If you can generously replenish deposit accounts in flush years, you will develop a cushion to rely on when you need to reallocate funds for an immediate need. If your vendor maintains a credit account for you, check with your subscription vendor at least annually to track how much you have accumulated in credits. With some vendors, you may need to tell them when to use a credit to pay an invoice. Depending on the rules set by your financial office, you may need to use credits in the same fiscal year in which they were accumulated, so be aware of the flexibility you may or may not have in using these funds. If you can carry credits over from one fiscal year to the next, you can maintain a balance in your credit account to use for payments. Although we have this flexibility in our library, we tend to use our credits as soon as we receive them in order to keep our reporting of title costs accurate.
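The credit-account bookkeeping just described can be modeled simply. This is a hedged sketch only: the class, the vendor name, and the carryover rule are all hypothetical, and your actual carryover policy depends on your financial office.

```python
# Illustrative sketch of tracking a vendor credit balance so it can be
# applied to invoices during a budget emergency. The vendor name and the
# carryover rule are invented for the example.

class CreditAccount:
    def __init__(self, vendor, carryover_allowed=False):
        self.vendor = vendor
        self.carryover_allowed = carryover_allowed
        self.balance = 0.0

    def add_credit(self, amount):
        """Record a credit the vendor is holding on the library's behalf."""
        self.balance += amount

    def apply_to_invoice(self, invoice_amount):
        """Use available credit first; return the cash still owed."""
        used = min(self.balance, invoice_amount)
        self.balance -= used
        return invoice_amount - used

    def close_fiscal_year(self):
        """Zero out the balance if credits cannot be carried over."""
        if not self.carryover_allowed:
            self.balance = 0.0

acct = CreditAccount("Subscription Vendor A", carryover_allowed=True)
acct.add_credit(1_500.00)
cash_due = acct.apply_to_invoice(2_000.00)  # credit covers part of the bill
```

The `close_fiscal_year` step makes the policy question explicit: an account that cannot carry credits forward is worth spending down before year end.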

Strategies for Making Quick Cuts

In planning for the unplanned budget emergency, you can take advantage of several strategies. First, conduct annual reviews of usage for all types of continuations. One of the most painful budget emergencies is a recession, or reduction, in the overall budget. Although it is common to get advance notice of a budget cut prior to the start of the fiscal year, a recession can also be announced mid-fiscal year. In this situation, you must be prepared to make effective cuts immediately. One of the best preparations for this type of budget crisis is to have completed a serials assessment or review.

A comprehensive serials review is labor-intensive and time-consuming, and if you have not done some kind of serials review in the past, you may not have the time to undertake a thorough one when a cut arrives. For this reason, I advocate conducting a serials assessment even if you are not facing a budget shortfall or a call to cut titles. A completed serials review provides data that identifies titles you can cancel in order to meet a budget cut. Serials reviews and journal cancellation projects are widespread and well-documented.4 How to conduct a serials review, and what to do with the analysis of serials usage, varies depending on the goals of the review and whether cancellations are dictated by a budget reduction. If a comprehensive review of all serials titles is too time-consuming, you can review a portion of your serials collection, such as databases only or journals only. Alternatively, you can review titles based on format: for example, titles you receive in multiple formats, to determine whether the redundancy is necessary. You could also choose to review only the titles from a particular vendor, or only the titles you receive as part of a package.

A review of databases rather than periodical titles may be more manageable for several reasons. First, libraries typically have fewer database titles than journal titles. Second, databases are frequently high-cost items; by reviewing all databases and then cutting only a few titles from the collection, you can quickly and efficiently reach a budget cut target. Cost per use is a standard data point for deciding whether a title is a good value for the library, and even if your library is not facing a budget reduction, a review of your databases can prove useful from a number of perspectives.

The most difficult year of a serials assessment project is the first. The first time you conduct an assessment, you will be setting up routines: running reports on your data, understanding usage reports and the COUNTER and SUSHI standards, setting up Excel spreadsheets, and establishing communication avenues and workflows. After the first year, you will have established templates and will be updating existing spreadsheets, so the process becomes easier and more routine. By establishing a culture of serials assessment, you will be armed with information for finding quick cancellations that can help resolve an unplanned budget crisis.
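The cost-per-use calculation behind a database review is straightforward and can be sketched as follows. The titles, costs, and COUNTER-style usage counts below are invented for the example; a real review would pull these figures from vendor invoices and COUNTER reports.

```python
# Illustrative sketch: ranking databases by cost per use to surface quick
# cancellation candidates. All figures are hypothetical.

subscriptions = [
    # (title, annual cost in dollars, annual usage from COUNTER reports)
    ("Database A", 25_000, 18_000),
    ("Database B", 40_000, 2_500),
    ("Database C", 8_000,  9_200),
]

def cost_per_use(cost, usage):
    """Annual cost divided by usage; unused titles rank as infinitely costly."""
    return cost / usage if usage else float("inf")

# Sort most-expensive-per-use first: cutting from the top of this list
# reaches a budget target with the fewest cancellations.
ranked = sorted(subscriptions,
                key=lambda s: cost_per_use(s[1], s[2]),
                reverse=True)

for title, cost, usage in ranked:
    print(f"{title}: ${cost_per_use(cost, usage):.2f} per use")
```

Cost per use is only one data point, as the column notes; curricular relevance, format overlap, and consortial obligations still need human judgment.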

Vendor Relations

You should always consider working with your book or serials vendor. In many cases, vendors can suggest cost-saving techniques that help bridge a temporary financial crisis. For example, our approval vendor has suspended book shipments and held standing order shipments for us, and we have worked with our bindery to temporarily hold bindery shipments. Vendors can provide extremely helpful assistance when you are facing an unplanned budget problem.

Conclusion

Although unplanned financial expenditures are challenging, with some pre-planning they need not be catastrophic or unmanageable. A proactive and knowledgeable budget manager can handle these unexpected situations without panic. By knowing the extent of your authority, understanding your expenditures and fund structure, building flexibility and a cushion into your budget, and conducting reviews of periodical and database usage, you can meet many different types of unplanned budget emergencies.

Endnotes

1. Cathy Moore-Jackson, Mary Walker and John Williams, "Essential Elements of a Fund Tree Design: Logical Transaction Records and Enabled Reporting," Charleston Conference, November 11, 2006, Charleston, SC.
2. Lynda Fuller Clendenning, J. Kay Martin and Gail McKenzie, "Secrets for Managing Materials Budget Allocations: A Brief Guide for Collection Managers," Library Collections, Acquisitions, and Technical Services 29, no. 1 (2016): 99-108.
3. Stephen Bosch and Kittie Henderson, "Steps Down the Evolutionary Road," Library Journal 139, no. 7 (April 15, 2014): 32-37.
4. Paula Sullenger, "The Promise of the Future: A Review of the Serials Literature, 2012-13," Library Resources & Technical Services 60, no. 1 (2016): 12-22.

Both Sides Now: Vendors and Librarians — Reinvention Column Editor: Michael Gruenberg (Managing Partner, Gruenberg Consulting, LLC) <michael.gruenberg@verizon.net> www.gruenbergconsulting.com

It will come as no surprise to those of you who know me that I have a profound love of music. That devotion began when I was 12 years old, when my parents bought me a record player for my birthday. When a neighborhood friend found out about the purchase, her son brought me the Top 10 records of the day as a gift. I played those 45s repeatedly, memorized all the pertinent information on their labels, and began a lifetime of collecting vinyl that continues to this day. Usually, when we have guests at our house, I am prompted to show my record collection and even play a few tunes on the stereo and/or the Seeburg jukebox.

Recently, one of my friends asked me about the records I had pertaining to Rod Stewart. A long-time favorite of mine, Rod has gone from a '60s garage band rocker to a solo singing career to a spandex-wearing disco singer to, now, a Great American Songbook crooner. He has consistently reinvented himself over his many decades as a performer, and each time it looks like his career is coming to an end, he totally reinvents himself and becomes a success all over again. He is the poster child for "reinvention."

In any profession, the road to accomplishment is filled with potholes and diversions, and often strewn with people who may not have your best interests at heart. The successful person reads the writing on the wall, deals with those diversions, and adapts accordingly. Pat Riley, one of the most successful coaches in the history of professional basketball, wrote about reinvention in his book, The Winner Within: A Life Plan for Team Players. Pat came from humble beginnings in Schenectady, New York, to win many championships as a player and coach, and deservedly was inducted

into the Basketball Hall of Fame in 2008 in recognition of his incredible achievements on and off the court. I have read his book, referred to it many times, and found it an invaluable resource during my years as a sales executive in the information industry, as I, too, consistently reinvented myself to adapt to whatever situation was in front of me. In the book, Riley talks about the inevitable changes we all have to cope with, both in our professional and personal lives. He calls those challenges "thunderbolts from the sky," a description of the changes that life throws at all of us. Even though many of these thunderbolts appear at the worst possible moments, we need to recognize how to cope with them and then overcome them as well. My advice to people frustrated with the curve balls life throws at them is, "Don't get mad; get ahead!"



Reinvention, in the ever-evolving career of a salesperson and/or information professional, is the process by which a person takes stock of where they are, where they think they are going in their current position, and where they would like to be in the future. Sometimes that means taking a few steps back for the promise of advancing many steps forward. Sometimes it means taking a lateral step, and sometimes it means rolling the dice and taking a leap of faith into a situation with incredible upside potential. None of these possibilities is wrong, and none of them may be right; only the person making the decision knows and experiences what is right. However, standing still is not an option!

We are living in an environment that is changing by the minute. If you or your organization is conducting business today in exactly the same manner as even six months ago, you are probably falling behind the curve and will most certainly be overtaken by your competitors.

Each year, I take part in a symposium at the Library School at Catholic University conducted by my dear friend, Prof. Bruce Rosenstein. My role, along with that of the other people from our industry who attend, is to speak to graduating students who are examining their employment options as they prepare to leave the University and explore the job market. We talk about the realities of the marketplace and give advice on how to proceed. While many of the students have their sights set on a library career, I always speak about the possibilities of a profession serving the library community, as opposed to working within a library structure. Whether it is a job training users on an aggregator's platform, working in a major information company's customer service department, or being a library database salesperson, there are many opportunities available.
I bring up these possibilities so that it becomes apparent that, as one progresses in a librarianship career, 1) reinventions will inevitably take place, and 2) there are many avenues to travel if you are willing to reassess yourself. In any business or any library, reinvention is not a luxury; it is a necessity.

In my career, my first twenty years of employment were with one company, where I rose through the ranks from sales rep to VP of Sales. Along the way, the company was bought and sold five times. This is where I learned the fine art of reinvention, because every time a new owner took control, I had to prove my worth all over again. My past achievements under the previous administration were of little interest to the new management team. My choice each time was either to prove myself yet again or to leave the company. When it became apparent that the latest purchaser represented a team of executives not to my liking, I left of my own accord, not wishing to prove my worth yet again. While it was a relief to leave, it was also a scary proposition to tender my resignation after 20 years at the same company. I knew that I would have to experience some degree of discomfort as I made my way in a brave new world. In retrospect, however, I learned more about myself, enjoyed more success professionally and monetarily, and achieved more in the subsequent 20 years than I had in the previous 20. Quite frankly, some ventures I chose failed, while others succeeded beyond my expectations.

What is so interesting about reinventing yourself is that you take all the lessons learned from your previous and current positions and apply them to a new opportunity. Never take for granted that everyone is conducting their operations the same way you are. Even the most basic procedures that you use each day (and take for granted) may not be part of the framework somewhere else.
Introducing the ideas you bring from past experience will go a long way toward ensuring success at the new place of employment.


Ultimately, you cannot fight the concept of "change." New technologies, new colleagues, and new procedures are daily facts of life. The pace of business today is not an evolution but a revolution, and to survive and thrive, the future will be owned by the people who have learned to read the writing on the wall and adapt themselves to whatever new environment is thrust upon them. The late, great Sam Cooke wrote and performed a song called "A Change Is Gonna Come." This 1964 R&B hit is among the pop songs most closely associated with the African American Civil Rights Movement. And although Mr. Cooke did not have libraries or sales in mind when he wrote it, the message of the title is quite clear: those who understand this concept will be successful.

Mike is currently the Managing Partner of Gruenberg Consulting, LLC, a firm he founded in January 2012 after a successful career as a senior sales executive in the information industry. His firm is devoted to providing clients with sales staff analysis, market research, executive coaching, trade show preparedness, product placement, and best-practices advice for improving negotiation skills for librarians and salespeople. His book, "Buying and Selling Information: A Guide for Information Professionals and Salespeople to Build Mutual Success," has become the definitive book on negotiation skills and is available on Amazon, Information Today (in print and eBook), Amazon Kindle, B&N Nook, Kobo, Apple iBooks, OverDrive, 3M Cloud Library, Gale (GVRL), MyiLibrary, ebrary, EBSCO, Blio, and Chegg. www.gruenbergconsulting.com



Optimizing Library Services — How Information Scientists Can Help Fix the Broken Peer Review Process by Jeremy Horne (International Institute of Informatics and Systemics) <jhorne.ial@gmail.com> Column Editors: Caroline J. Campbell (Promotions Coordinator, IGI Global) <ccampbell@igi-global.com> and Lindsay Wertman (Managing Director, IGI Global) <lwertman@igi-global.com> www.igi-global.com

The Search for Knowledge Quality Standards

Until the 1970s, a person would go to a building known as a "library," walk in, and make their way to a cabinet with many drawers. These drawers contained cards with information about the books and other printed material in that building, all of which were available to the public. In larger libraries, there would be a main desk providing general information on how to use the library, and a smaller reference desk staffed with individuals able to assist researchers in locating information, in particular "references," or generic sources like atlases, dictionaries, and encyclopedias. Very few people questioned the integrity of what was given to them or the advice given by the staff. For students, reliance on that information was critical, as it was foundational to the integrity of the knowledge they were acquiring to proceed with their studies. It was all about truth, knowledge, and learning.

When computers became more sophisticated, there was a corresponding development in the type and mode of information dispensed. Card catalogues became computer-based, and in colleges and universities the libraries became multimedia centers. Librarians realized that microfilm, reel-to-reel moving pictures, and records should also be included as knowledge sources. In the schools themselves, attention was paid to source integrity: it was common to include some critical thinking in high school and college English courses as part of the initial instruction on assembling a "theme" and then a term paper. Still, there was little concern about the quality of the source itself, whether librarian, academic journal, etc. Information repositories thus have a long tradition of being treated as reliable sources of knowledge.

Today, with the information and technology explosion, discoveries are being vetted across an increasing range of fields. Greater skill is required to winnow out what has relevance and quality; the average person's ability is being challenged as technology and discovery force more specialization. Everyone, therefore, has to be versed in systematic methodologies for evaluating and selecting quality sources, especially the librarian. Libraries are no longer mere repositories of information; librarians must become educated in knowledge quality, with all the attendant responsibilities, hence the appellation "information scientist."

The Problem, Its Background, and Some Reasons

Information scientists must know how to recognize knowledge quality and how to handle it. More specifically, what does an information scientist do about the dubious quality appearing in ostensible journals? Journals that have few, poor, or no peer review criteria, and that often charge authors to publish, are deemed "fake," "false," or "predatory," hence the word "ostensible." The "predatory" designation refers to taking advantage of those who need to publish or risk compromising their job status. Academics as individuals are upset with predatory journals, but publishers are as well (Khosrow-Pour, 2017; Sorokowski et al, 2017; Kolata, 2017; Shen and Björk, 2015; Bohannon, 2013; Cook, 2006; Csiszar, 2016; Kaplan, 2005; Preston, 2017; Retraction Watch, 2017; Callaos, 2011). Nature, in particular, has objected strongly, saying that these journals take a severe toll on research efforts in terms of time, money, resources, and even animal subjects (Moher et al, 2017).

Information scientists (the former "librarians") are directly affected by these predatory practices and have developed efforts to alert others to the problem. After all, it is they who often are in the middle of conveying information as knowledge in their collections of printed and other media. In fact, it is their responsibility to help the public ensure that it is of quality, both as object and process. The reference "librarian" in particular has, in the past, been relied upon as the one who can direct a person to quality sources, i.e., references. Her/his role is critical now more than ever.

Origins and Rise of Predatory Publishing

While it is not the major purpose here to discuss how and why predatory and fake publishing came about, these factors are still important to consider:

• As more information is produced than existing journals can distribute, more and more of it is left out, generating demand for additional publications.
• More PhDs are being graduated than there are vacancies to hire them, and the positions that do exist are destructively competitive. There is not enough support for education generally to create more faculty positions.
• Journal subscriptions are costly, and libraries cannot afford them. The response has been to have authors pay for articles, which is a questionable practice to begin with. The ostensible intent is to make the articles "open access" (available to the public without cost).
• It is hard to discern legitimate journals from fake ones just by looking at them; there may be well-researched articles dispersed among poor-quality ones.
• School quality has been declining, such that not only is half the U.S. population unable to read past the eighth-grade level, but only half of high school graduates meet minimal academic standards (NAAL, 2003, 2005). National Science Foundation (2015) data and the National Center for Education Statistics (NCES, 2009) paint an equally dismal picture, with 25 percent of U.S. adults thinking the Sun orbits the Earth.
• It may be argued that the "publish or perish" goad is an unsustainable quantity-over-quality ideology. Where there is no peer review, a veritable minefield of misleading or false research and data can contaminate legitimate research, especially when researchers rely upon what is published in fake journals.

Each of these can be debated, and more reasons can be added. Information scientists do not have direct control over some of these variables, such as the "publish or perish" model, but there are factors they can greatly influence, such as acting in an individual capacity to help improve school quality by counseling students in research methodologies and in knowledge quality.

Information Scientists’ Solutions

In 2010, in response to predatory publishing, Jeffrey Beall, who holds a Master of Science in Library Science (MSLS) degree and was a librarian at the Auraria Library, University of Colorado Denver (Beall-CV, 2017), started issuing his "list of predatory publishers." An extensive amount of controversy and history has surrounded this list and its motivations, but it had an underlying legitimacy. Numerous publications were included in "Beall's List of potential, possible, or probable predatory scholarly open-access publishers," but no specific reasons were given for why each was included. He did, however, publish a general set of guidelines (Beall - criteria, 2017), such as these publications not only accepting payment for articles but, more serious, having little or no peer review. It seems that it did not take much to land on Beall's "hit list"; even a typographical error would do. The Journal of Financial Education (JFE, 2017) appeared on the list. In 2016, Timothy Michael, Associate Professor of Finance and Business at the University of Houston at Clear Lake, Texas, and the executive director of JFE, wrote to "Dr. Beall" (erroneously, as Beall has no such degree) and requested that the JFE be removed from the list. Beall refused to do so, as the typo had supposedly not been corrected (JFE-complaint, 2017).

More and more scholars, while sympathizing with Beall about the problems, began questioning his authority, asking, "Who set you up as judge, jury, and executioner?" and "Who is peer reviewing Beall?" While the specific reasons are not known, it is reasonable to think that the threat of lawsuits (Flaherty, 2013; Silver, 2017) and heightened criticism from academics motivated Beall to stop publishing the list.

The Associate Dean for Scholarly Resources and Collections at the Marriott Library, University of Utah, Rick Anderson (Anderson-CV, 2017), who ironically holds the same degree and shares the same concerns about poor-quality journals as Beall, set forth guidelines for assessing whether a journal is fake or predatory, such as:

• Falsely claiming to provide peer review and meaningful editorial oversight of submissions
• Lying about affiliations with prestigious scholarly/scientific organizations
• Claiming affiliation with a non-existent organization
• Naming reputable scholars to editorial boards without their permission (and refusing to remove them)
• Falsely claiming to have a high Journal Impact Factor
• Hiding information about article processing charges (APCs) until after the author has completed submission
• Falsely claiming to be included in prestigious indices (Anderson – Scholarly, 2017)
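Warning-sign guidelines of this kind can be thought of as a simple checklist. The sketch below is purely illustrative: the flag names, the threshold, and the scoring logic are invented for the example, and no count of flags can substitute for the human judgment the column calls for.

```python
# Illustrative sketch: turning predatory-journal warning signs into a
# checklist score. Flags and threshold are hypothetical; a real assessment
# requires reading the journal itself.

WARNING_SIGNS = [
    "claims_peer_review_but_has_none",
    "false_affiliation_claims",
    "editorial_board_listed_without_consent",
    "false_impact_factor_claim",
    "apc_hidden_until_after_submission",
    "false_indexing_claims",
]

def assess_journal(observed_flags, threshold=2):
    """Count recognized warning signs; flag the journal for closer review."""
    hits = [f for f in observed_flags if f in WARNING_SIGNS]
    return {"hits": hits, "needs_review": len(hits) >= threshold}

result = assess_journal(["false_impact_factor_claim",
                         "apc_hidden_until_after_submission"])
# two recognized warning signs meet the threshold, so needs_review is True
```

A transparent checklist like this also answers one of the criticisms of Beall's list raised below: every verdict comes with the explicit reasons behind it.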

What Are the Problems with These Solutions?

Anderson links to a summary critique of Beall (Crawford, 2017), noting that "…there's no stated reason for things to be included." He refers to Walt Crawford, another retired librarian (Crawford – background, 2017), who states not only that it is "grotesquely unfair to blacklist a journal or publisher without giving any reason why," but also that the 1,604 entries on Beall's original list pare down to "53 journals and 177 publishers" (Crawford, 2017).

In Beall's absence, Cabell's list of presumably predatory journals has appeared, essentially reflecting Beall's criteria (Cabell-criteria, 2017). Beyond the critique of who set up Cabell's organization, and of why specifically it listed a given journal, there is the further issue that Cabell's sells the list (Cabell – Charges, 2017). Others, like the Directory of Open Access Journals (DOAJ), have their own rules (DOAJ Standards, 2017), but there is no detailed set of criteria for peer reviewing content, i.e., quality control. For example, how thoroughly were sources validated? Do they in fact exist? What about basic standards, such as dated information, financial interests of the author, or overreliance on one source or even a few? Still others, like Anderson, have attempted to set forth standards for rating knowledge quality, as Beall apparently tried to do (Anderson-Scholarly, 2017; IIIS, 2017). Many others have their own standards and methods of peer review (Peer review methods and standards, 2017). Unfortunately, because there was no non-profit organization with agreed criteria and metrics, academics flocked to Beall and to Cabell's. While there may be basic standards for quality, in the end "Different journals have different aims and individual titles can be seen as 'brands.' The editorial position of the journal influences the criteria used to make decisions on whether to publish a paper" (MacGill, 2017). In essence, a state of anarchy exists in the world of knowledge quality assessment.

What Can Information Scientists Do to Help Solve the Knowledge Quality Problems?

Fake and predatory journals most likely will not disappear as long as the profit motive and the factors giving rise to them (such as publish-or-perish pressures) continue. There are two basic areas in which the information scientist has a responsibility: discerning and promoting knowledge quality, and addressing the blacklists.

Aside from critical thinking (2017), information scientists would do well to be grounded in epistemology, the study of justified belief: how do we know that something is true? Familiarity with the course content taught in philosophy departments is a good start, and applying epistemology to questionable information will often raise the needed red flags. Bringing philosophy department faculty into the process will help, and teaching others how to spot poor quality is another useful action.

As for the blacklists themselves, an information scientist needs to know not only what criteria are being used to judge a publication or article, but exactly how the criteria were applied and why. Here, one should read the journal first, arrive at a conclusion, and then see what the blacklist says. If a journal is thought to be important or worthy, a colleague should participate, reading the article blindly and independently; both should compare notes afterward. In all cases, one should be conversant in the journal or article's subject matter, aside from composition, organization, grammar, style, typography, etc., not unlike grading a term paper or thesis. Then the questions arise: Who created the blacklist? Was it a dispassionate non-profit effort to identify those who have no regard for the truth? How were the judgment criteria created and applied? Was there any interaction between those managing the blacklist and the targeted publications? Was there any blind reviewing of the journal? Was there a well-defined procedure allowing a journal to come into line with the blacklist's standards?
While it is critical that individual information scientists have the ability to identify quality information, it is perhaps more important that they work with others to promote knowledge quality through the creation of a knowledge quality institute (KQI). This institute might have activities comparable to those of standards organizations like the Institute of Electrical and Electronics Engineers (IEEE), the American National Standards Institute (ANSI), or the International Organization for Standardization (ISO). A common core of standards and recommended practices should be developed and applied appropriately to various disciplines, which all peer reviewers in their respective areas would use. These would undergo periodic review at meetings, as they do at other standards organizations. Analogous to other scientific and standards organizations, the KQI would have workshops in specialty areas and a "trade fair" component, where representatives of publishers, universities/colleges, research organizations, and other information vetting outlets present their findings on knowledge quality. In addition, a journal of knowledge quality would tie together persons interested in this activity and would serve as a paradigm case of excellent peer review, the "gold standard." Through this collaboration, librarians can not only assist in creating the KQI, but they can also identify ethical publishers with which to partner and in whose collections to invest. Investing in reputable publishers will diminish the presence of predatory publishers, as many academic publishing houses are in direct competition with these "vampire presses." Additionally, librarians can identify the publishers that have a rigorous and transparent peer review process.
IGI Global's peer review process is one example: "As a part of the commitment to maintain the best ethical practices and the highest levels of transparency, the peer review process is the driving force behind all IGI Global publications" (IGI-peer review, 2017). Meaningful and successful peer review depends upon the competence of the reviewer, the elimination of bias as much as possible, "cross-checking" through multiple reviewers, an "audit trail" of the review process for accountability, and thoroughness. In addition, meaningful reviewing criteria should be known to all in the review process, as well as to the reader. IGI Global's peer review process is conducted transparently through the eEditorial Discovery (EED) online submission system.

<http://www.against-the-grain.com>



Optimizing Library Services from page 59
Their criteria for the journal process are below:
• There must be at least 3-5 Editorial Review Board members in a double-blind peer review process (note that two reviewers are the standard for many publishers).
• The Editor-in-Chief will blind the manuscript and assign it to 3-5 members of the Editorial Review Board. Once at least three Editorial Review Board members have completed their reviews, the Editor(s)-in-Chief will then send the manuscript and its reviews to an Associate Editor for review.
• Once the Associate Editor's review commentary is received, the Editor(s)-in-Chief will take all commentary into consideration and make their formal assessment of the manuscript.
• After the overall assessment by the Editor(s)-in-Chief is complete, the author (if the manuscript is not rejected outright) will have the opportunity to make either major or minor revisions to the work per the reviewers' commentary. Once the revisions are received from the author, the Editor(s)-in-Chief will either accept or reject the manuscript outright, send the revised manuscript to the Associate Editor for additional commentary, or request additional revisions from the author before a final decision is made.
• Note that the revision process may repeat several times to ensure that the author's work is of the highest quality.
Before the double-blind peer review, each reviewer must submit a curriculum vitae to help ensure that reviewers are not working outside their subject areas. In addition, "each review board member is evaluated every six months," a practice that is not as common in the publishing field as one might assume. By researching publishers in this way, librarians are able to see which publisher best fits their needs.
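The routing rules in the list above amount to a small decision procedure. The following is a hypothetical sketch of the workflow as the bullets describe it, not IGI Global's actual eEditorial Discovery software; the function name, argument shapes, and stage labels are invented for illustration.

```python
MIN_BOARD_REVIEWS = 3  # at least three of the 3-5 assigned board members

def next_step(reviews_completed, associate_editor_done, eic_decision=None):
    """Return the next stage of the double-blind workflow described above."""
    if reviews_completed < MIN_BOARD_REVIEWS:
        return "await Editorial Review Board reviews"
    if not associate_editor_done:
        return "send manuscript and reviews to Associate Editor"
    if eic_decision is None:
        return "Editor(s)-in-Chief formal assessment"
    if eic_decision == "revise":
        # the revision loop may repeat several times before a final decision
        return "return to author for major/minor revisions"
    return eic_decision  # "accept" or "reject"

print(next_step(2, False))
print(next_step(3, False))
print(next_step(3, True))
print(next_step(3, True, "revise"))
```

The "revise" branch can recur, matching the note above that the revision process "may repeat itself several times."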
Through collaborative efforts with librarians, IGI Global has created customized solutions through the award-winning InfoSci Database Platform, which contains over 4,500 peer-reviewed books and more than 175 quality peer-reviewed journals, allowing unlimited simultaneous users to download full text in XML and PDF with no digital rights management (DRM). They, and many other publishers and librarians, are active in "Peer Review Week," an international campaign to combat unethical publishing practices, and in writing informative posts and guides highlighting the threat of predatory publishing. It is no accident that information scientists ("librarians") and publishers, going beyond the outcry of academics, initiated efforts to track academic journals for their quality and started to establish mechanisms to address peer review issues. Now, a concerted effort by all those concerned about knowledge quality is needed. This article is such a call to action, following the outline of suggested activities above. Contact this author, who currently is coordinating an effort to establish the KQI.

References

(all links accessed 25 December 2017)
AAAS (2017). Science: Editorial Policies. Science. http://www.sciencemag.org/authors/science-editorial-policies#conflict-of-interest
Anderson – CV (2017). Rick Anderson. http://faculty.utah.edu/bytes/curriculumVitae.hml?id
Anderson – Scholarly (2017). Cabell's New Predatory Journal Blacklist: A review. The Scholarly Kitchen. 25 July 2017. https://scholarlykitchen.sspnet.org/2017/07/25/cabells-new-predatory-journal-blacklist-review/
APA – standards (2017). Standards of Accreditation for Health Service Psychology. American Psychological Association. https://www.apa.org/ed/accreditation/about/policies/standards-of-accreditation.pdf
Beall (2017). Beall List. http://beallslist.weebly.com/
Beall – criteria (2017). http://beallslist.weebly.com/uploads/3/0/9/5/30958339/criteria-2015.pdf ; https://en.wikipedia.org/wiki/Jeffrey_Beall


Beall – CV (2017). Jeffrey Beall – curriculum vitae. https://web.archive.org/web/20131102231551/http://people.auraria.edu/Jeffrey_Beall/sites/people.auraria.edu.Jeffrey_Beall/files/cv/JeffreyBeall-CV.pdf
Bohannon, J. (2013). Who's afraid of peer review? Science, 342(6154), pp. 60-65. DOI: 10.1126/science.342.6154.60. http://science.sciencemag.org/content/342/6154/60
Cabell (2017). Cabell's Blacklist violations. Cabell's Scholarly Analytics. https://cabells.com/blacklist-criteria
Cabell – Charges (2017). The Journal Blacklist. https://cabells.com/about-blacklist ; https://cabells.com/get-quote
Callaos, N. (2011). Peer Reviewing: Weaknesses and proposed solutions. Unpublished paper, available upon request from Jeremy Horne.
Cabell – criteria (2017). Blacklist criteria. https://cabells.com/blacklist-criteria
Cook, A. (2006). Is peer review broken? The Scientist, 1 February 2006. http://www.the-scientist.com/?articles.view/articleNo/23672/title/Is-Peer-Review-Broken-/
Crawford, W. (2017). Walt at Random: The library voice of the radical middle. https://walt.lishost.org/2016/01/trust-me-the-other-problem-with-87-of-bealls-lists/
Crawford – background (2017). Walt at Random – About. https://walt.lishost.org/about/
Critical thinking (2017). Two excellent sources for critical thinking material: www.criticalthinking.org and www.criticalthinking.com
Csiszar, A. (2016). Peer review: Troubled from the start. Nature, 532(7599), 19 April 2016. http://www.nature.com/news/peer-review-troubled-from-the-start-1.19763
DOAJ Standards (2017). Directory of Open Access Journals. https://doaj.org/application/new#review_process-container
Flaherty, C. (2013). Librarians and lawyers. Inside Higher Ed. 15 February 2013. https://www.insidehighered.com/news/2013/02/15/another-publisher-accuses-librarian-libel
IGI-peer review (2017). https://www.igi-global.com/publish/peer-review-process/ For more information on partnership opportunities please contact: <E-Resources@igi-global.com>.
IIIS (2017). Information to Contributors. Journal of Systemics, Cybernetics and Informatics. http://www.iiisci.org/Journal/SCI/PeerReviewMeth.asp?var= ; Integrating Reviewing Processes: http://www.iiisci.org/Journal/SCI/IntegRevProcesses.asp?var=
JFE (2017). Financial Education Association. http://jfedweb.org/feaboard.html
JFE-complaint (2017). Journal of Financial Education. http://jfedweb.org/Beall2016.htm
Kaplan, D. (2005). How to Fix Peer Review. The Scientist, 19(1), p. 10, Jun. 6.
Khosrow-Pour, M. (2017). "Maintaining the Integrity of Scientific Knowledge Content." https://www.youtube.com/watch?v=z93mj0alKEg&feature=youtu.be
Kolata, G. (2017). Many Academics Are Eager to Publish in Worthless Journals. New York Times, Science. 30 October 2017. https://www.nytimes.com/2017/10/30/science/predatory-journals-academics.html
MacGill, M. (2017). Peer review: What is it and why do we do it? Medical News Today, 10 August 2017. https://www.medicalnewstoday.com/articles/281528.php
Moher, D., Shamseer, L., and Cobey, K. (2017). Stop this waste of people, animals and money. Nature, 549: 23.
NAAL (2003). National Assessment of Adult Literacy. https://nces.ed.gov/naal/kf_demographics.asp



NAAL (2005). Key Concepts and Features of the 2003 National Assessment of Adult Literacy. NCES 2006-471. U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics (NCES). https://nces.ed.gov/NAAL/PDF/2006471_1.PDF
National Science Foundation (2014). Science and Engineering Indicators 2014, Chapter 7: Science and Technology: Public Attitudes and Understanding, p. 7-23.
NCES (2009). Basic Reading Skills and the Literacy of America's Least Literate Adults. National Center for Education Statistics. https://nces.ed.gov/pubs2009/2009481.pdf
Peer review methods and standards (2017). Peer Review Week resources page: How-tos & Tutorials, Best Practices & Guidelines, and Research related to peer review. Peer Review Week, 2017. American Association for the Advancement of Science (AAAS). http://www.pre-val.org/prw/
Peer review week (2017). https://peerreviewweek.files.wordpress.com/2016/06/prw-press-release-2017.pdf ; https://peerreviewweek.wordpress.com/
Preston, A. (2017). The future of peer review. Scientific American, 9 August 2017. https://blogs.scientificamerican.com/observations/the-future-of-peer-review/ ; https://publons.com/home/
Publons (2017). Publons. https://publons.com/home/
Retraction Watch (2017). "Weekend reads: A flawed paper makes it into Nature; is science in big trouble?; a reproducibility crisis history." Retraction Watch. http://retractionwatch.com/2016/12/10/weekend-reads-flawed-paper-makes-nature-science-big-trouble-reproducibility-crisis-history/
Shen, C., and Björk, B.-C. (2015). "Predatory" open access: a longitudinal study of article volumes and market characteristics. BMC Medicine, 13(1): 230. ISSN 1741-7015. doi:10.1186/s12916-015-0469-2. https://bmcmedicine.biomedcentral.com/articles/10.1186/s12916-015-0469-2
Silver, A. (2017). Controversial website that lists "predatory" publishers shuts down. Nature, 18 January 2017. https://www.nature.com/news/controversial-website-that-lists-predatory-publishers-shuts-down-1.21328

And They Were There from page 51
Russell, Head of Ingenta Connect, introduced the three speakers. Ingenta Connect has over 86,000 users; the company will be launching the new Ingenta Open in 2018 and hopes to evaluate the metrics from the new open access platform. He noted that the panelists would address the topic from their different positions: Brand on what publishers want from OA metrics, Corbett on what libraries want to learn from OA metrics, and Watkinson on what funders learn, or want to learn, from OA metrics. Brand opened her talk with the observation that MIT, as a publisher of both scholarly journals and books, represents both authors and a publisher. She introduced the question of how OA publishing affects academic careers. When a work is published in neuroscience, computer science, or linguistics, Brand said, there is a great deal of immediate activity within hours: tweets and blog posts discussing the publication begin appearing, and downloads follow from readers in many parts of the world who can now access an open publication. Brand also noted that open access is not necessarily producing a growth in impact, that is, a link between open access and more citations. From the standpoint of a publisher with some of its books being "open," however, MIT is not seeing damage to its sales. She described it as a balancing act between sales and open access. What Brand did emphasize was the importance of helping one's authors, especially by making good use of altmetrics and the tools around


Sorokowski, P., Kulczycki, E., Sorokowska, A., and Pisanski, K. (2017). Predatory journals recruit fake editor. Nature, 543: 481-483 (23 March 2017). doi:10.1038/543481a. http://www.nature.com/news/predatory-journals-recruit-fake-editor-1.21662#fake ; https://www.nature.com/polopoly_fs/1.21662!/menu/main/topColumns/topLeftColumn/pdf/543481a.pdf

Recommended Readings Published by IGI Global:

Al-Suqri, M. N., Al-Kindi, A. K., AlKindi, S. S., & Saleem, N. E. (2018). Promoting Interdisciplinarity in Knowledge Generation and Problem Solving (pp. 1-324). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-3878-3
Baran, M. L., & Jones, J. E. (2016). Mixed Methods Research for Improved Scientific Study (pp. 1-335). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-0007-0
Esposito, A. (2017). Research 2.0 and the Impact of Digital Technologies on Scholarly Inquiry (pp. 1-343). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-0830-4
Hsu, J. (2017). International Journal of Ethics in Digital Research and Scholarship (IJEDRS). doi:10.4018/IJEDRS
Jeyasekar, J. J., & Saravanan, P. (2018). Innovations in Measuring and Evaluating Scientific Information (pp. 1-315). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-3457-0
Munigal, A. (2017). Scholarly Communication and the Publish or Perish Pressures of Academia (pp. 1-375). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-1697-2
Sibinga, C. T. (2018). Ensuring Research Integrity and the Ethical Management of Data (pp. 1-303). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-2730-5
Storey, V. A., & Hesbol, K. A. (2016). Contemporary Approaches to Dissertation Development and Research Methods (pp. 1-360). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-0445-0
Wang, V. C. (2015). Handbook of Research on Scholarly Publishing and Research Methods (pp. 1-582). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-7409-7
Wang, V. C., & Reio Jr., T. G. (2018). Handbook of Research on Innovative Techniques, Trends, and Analysis for Optimized Research Methods (pp. 1-445). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-5164-5

it. Brand mentioned a document, "A guide to using Altmetric data in your Biosketch CV" (https://staticaltmetric.s3.amazonaws.com/uploads/2016/05/NIH-guide.pdf), as well as other tools an author can take advantage of, all with the idea that promoting an author also helps a publisher. One point Brand stressed was making sure that open access does not mean a lack of peer review, and that OA should not disadvantage a publication in the tenure process. Corbett of Northeastern University discussed what academic libraries would like to know from their OA usage: are our users, faculty and students, using OA content in their own research? Are faculty using OA material for their courses? She then talked realistically about why academic libraries might want to know about their OA usage, such as using it to fill in gaps alongside subscribed content, to replace subscriptions, or to determine whether they are truly helping their students with textbook affordability initiatives. The last speaker of the session was Watkinson of the University of Michigan, discussing what funders hope to learn from OA usage. He noted that there is a true diversity of funders and mentioned a few of the different types: government organizations, foundations, libraries, individuals, and institutions of all sorts. He also said that it was not easy to discern "actionable measures" from foundations' vision statements, e.g., that of the Gates Foundation, which calls for free, immediate, and unrestricted access to research. But Watkinson did highlight some important patterns he saw in funders' desires, such as the idea of "use and re-use through open licensing," and he




went on to mention the open access eBooks from JSTOR. The session concluded with Watkinson emphasizing the importance of storytelling with data, rather than just showing numbers.

WEDNESDAY, NOVEMBER 8, 2017 CONCURRENT SESSIONS
A Trouble Shared: Collaborative Approaches to Problems Affecting Measurement of E-Resource Usage Data — Presented by Ross MacIntyre (JISC); Jill Morris (PALCI)
Reported by Jeanne Cross (University of North Carolina Wilmington) <crossj@uncw.edu>
"Downloading spreadsheets is not a good use of time — stop!" was a theme of the presentation, given to a standing-room-only crowd. MacIntyre began by discussing the services JISC provides and gave some examples of its partnerships and projects. Morris followed by describing the CC-PLUS program specifically. Instead of simply looking at our current e-resource use compared to our use in the past, we should be using services that display our use in context with other institutions, with flexible outputs and graphical displays that aid the analysis and interpretation of the data. When data are provided in the aggregate, we can see how successful our deals are in comparison to others'. Among other benefits, this information would be useful in contract negotiation. One of the many projects JISC is undertaking is the challenge of assessing the value of eBooks. Complications surrounding eBook use reporting led to wider discussion, expanded partnerships, and the development of the CC-PLUS program, a multinational, multi-consortial project funded by the IMLS. PALCI is a partner in the project. Within the next year a proof-of-concept platform is expected to be live, with automated tools able to ingest data and provide a range of outputs. The tool is built for consortial use but will also be useful for individual libraries, with hosted services a possibility in the future. Look for a full project report in late spring.
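The kind of in-context comparison the speakers advocate can be illustrated in a few lines. The institutions and dollar figures below are entirely hypothetical (CC-PLUS itself works from COUNTER reports); this is only a sketch of the idea of benchmarking cost per use against consortial peers rather than against one's own past spreadsheets.

```python
from statistics import median

# Hypothetical consortium data: institution -> (annual deal cost, item requests)
usage = {
    "Inst A": (24_000, 60_000),
    "Inst B": (24_000, 12_000),
    "Inst C": (30_000, 90_000),
}

# Cost per use for each institution, then a consortium-wide benchmark
cpu = {inst: cost / uses for inst, (cost, uses) in usage.items()}
benchmark = median(cpu.values())

for inst, value in sorted(cpu.items()):
    flag = "above consortium median" if value > benchmark else "at/below median"
    print(f"{inst}: ${value:.2f} per use ({flag})")
```

A real implementation would ingest COUNTER usage reports automatically, which is exactly the automation the CC-PLUS proof of concept promises.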

Between Rare and Commonplace: Closing the Venn Diagram of Special and General Collections — Presented by Boaz Nadav-Manes (Brown University Library); Christopher Geissler (Brown University Library)
Reported by Annie Bélanger (Grand Valley State University) <annie.belanger@gvsu.edu>
The session centered on questions of access, use, and engagement: how to bridge the divide between the two types of collections? The speakers reframed their focus to include a range from medium-rare to rare; selection to curation; preservation to conservation; mediated to direct access; and business considerations. They sought to understand the roles of the curator, researcher, and audience. General collections are rarely curated but are expected to be instantly accessible; special collections are curated but with low intentionality of access and usage. A spectrum from purchase to access and usage/viewing for all rare and medium-rare items was needed. Partnering to empower staff and shape the work ensured collections moved forward as desired. Acknowledging that special collections put an inordinate amount of pressure on technical services, the speakers see them as a cluster of problems: access, exclusivity, users. For example, mediated access limits the audience to power users and those willing to be monitored. Moving forward, the collections budget is being redirected to services, focusing on extracting and enhancing data from existing collections to create collections as a service. An inventory project is being developed to get 100%


of the collection into the catalog with at least a minimum record. Historical cataloguing practices had concealed the collections that do contain diverse voices.

History Has Its Eyes On You: Lighthouses and Libraries Weather Storms of Change – or Is Being a Public Good Good Enough? — Presented by Corey Seeman (University of Michigan)
Reported by Brianna Hess (Simmons School of Library and Information Science) <hessb@simmons.edu>
In this presentation, Seeman explored the similar histories and challenges of lighthouses and libraries. Accompanied by images of Michigan lighthouses and armed with firsthand experience of seeing a library through massive changes, Seeman tackled issues of obsolescence, repurposing, and the precarious position of two "public goods": lighthouses and libraries. Seeman described the changing status of lighthouses, from their beginnings as socially significant, publicly funded beacons for guiding watercraft to their present forms: largely automated structures repurposed mainly by wealthy individuals wishing to take up residence. Likewise, he chronicled the recent history of academic libraries as they transformed from traditional and beloved "hearts of the institution" to organizations facing digitization, space issues, and budget cuts. How do academic libraries remain relevant and useful through the coming storm? Seeman invited professionals to embrace change, and he offered insights into how libraries can leverage their expertise in community outreach to demonstrate their value and justify expenditures. He spoke of balancing community needs with aspirations, and he envisioned a future library with closed stacks, extensive resource sharing, interactive space, and collection development built not on "just in case" but "just in time."

How Difficult Can It Be? Creating an Integrated Network Among Library Stakeholders to Promote Electronic Access — Presented by Denise Branch (Virginia Commonwealth University); Ben Johnson (ProQuest); Jamie Gieseck-Ashworth (EBSCO Information Services); Anne-Marie Viola (Sage Publishing)
Reported by Eric Parker (Northwestern University, Pritzker School of Law) <ecp278@law.northwestern.edu>
This concurrent session provided differing perspectives on the information flows among libraries, subscription agents, and content and discovery service providers needed to sustain a successful information ecosystem. Branch explained that libraries have numerous stakeholders, among whom data such as MARC records need to flow efficiently; disrupters to these flows include information silos, among others. An integrated vision for stakeholders consists of enhancing industry standards, synchronizing knowledge bases, and other measures. Viola spoke on the publisher's role in ERM. Sage's library partners sit "downstream" from it in the ecosystem, providing the discovery systems that make Sage's content useful. They face both internal challenges (like KBART data discrepancies) and external ones (like consortial resource licensing). Gieseck-Ashworth presented the subscription agent's perspective: EBSCO deals with millions of orders at any given time, receives information files in various forms to manage everything, and devotes considerable resources to maintaining relationships and moving data efficiently. Johnson discussed Ex Libris' place between libraries and content providers. They have separate teams working with libraries and content providers. A big challenge is data standards: they work with over 5,000 content providers, with only a handful of staff ensuring data quality. Their strategy is transparency in telling providers what they and libraries need.



Being Earnest with Collections — What Happens After Short-Term Loan Withdrawal by Carol Joyner Cramer (Head of Collection Management, Z. Smith Reynolds Library, Wake Forest University) <cramercj@wfu.edu>
Column Editor: Michael A. Arthur (Associate Professor, Head, Resource Acquisition & Discovery, The University of Alabama Libraries, Box 870266, Tuscaloosa, AL 35487; Phone: 205-348-1493; Fax: 205-348-6358) <maarthur@ua.edu>
Column Editor's Note: If your library is managing a demand-driven acquisitions program, then you have probably experienced the frustration caused when publishers decide that all or some titles will not be available for short-term loans. At The University of Alabama, we began a vibrant DDA plan early in 2016. Initially the plan was set to purchase on first trigger. We later changed to the STL model because we wanted to determine over the long term whether STL would be more cost efficient than outright purchase. Our decision resulted in the removal of several titles from our DDA pool on the EBSCO platform. Fortunately, EBSCO manages that process, so it was not labor intensive. However, it left us wondering what impact this change would have on coverage within certain disciplines and whether or not the bulk of lost titles would come from small and university presses. We continue to review our plan and evaluate it based on a number of factors. The size of our DDA program necessitates that we use due diligence to ensure that title selection is as diverse as possible. I was pleased when Carol agreed to provide ATG readers with the results of her study at Wake Forest University. This article will answer some questions and lead others to pursue similar studies. — MA

My library loves the Short-Term Loan (STL) model of Demand-Driven Acquisition (DDA). Publishers, however, may have mixed feelings. So a publisher pulls its frontlist (<1 year old) content out of the STL program. We react by dropping the affected content from our DDA pool. What happens next? The Z. Smith Reynolds Library at Wake Forest University supports a relatively small FTE enrollment, but we have a relatively generous monograph budget when viewed on a per-FTE basis. Under those circumstances, a DDA and STL purchase plan makes a lot of sense. We have a broad-ranging plan in terms of subject coverage and per-loan spending caps. The only content restriction is on books flagged as "Juvenile" or "Popular" by GOBI Library Solutions. Because our pool is so large compared to our FTE, only a tiny percentage of the books in the pool ever receive any use. As publishers increased their prices for STLs over the last few years, we watched this trend with dismay (no one wants to pay more), but we reluctantly realized that the prices we enjoyed back in 2011 were probably unsustainable for publishers. Sometime in our fiscal year 2015, a major trade publisher ("Publisher A") removed its


frontlist titles from the STL model with our provider Ebook Library (or "EBL" as it was then, now known as ProQuest Ebook Central). Therefore, to keep these books in the DDA pool, we would have to purchase each book at full price upon its first use. In what was perhaps an emotional reaction, the Reynolds Library decided to remove the frontlist books from our DDA pool entirely. We also removed the content of any subsequent publisher who enacted the same or similar restrictions. We knew price increases were inevitable. However, setting the price at 100% for the first use was just too high. Even as we were removing these titles, though, we recognized that we could be making a huge mistake. Sure, our DDA spending with Publisher A would go down, but what if our 28 subject liaison-selectors simply purchased more print from this publisher to compensate? After all, we knew that some (although not all) liaisons would notice in GOBI's system that we had a DDA copy available and would choose another book, thereby punishing publishers who made their frontlist available. On the other hand, by not having its titles before our users in our DDA pool, perhaps Publisher A would realize even less revenue than before. This was all speculation, however. What really happened? I set out to find out. As I mentioned, the change occurred in fiscal year 2015 (July 1, 2014-June 30, 2015; henceforth FY15). So I chose to compare the sales that Publisher A realized from the Reynolds Library in the year before the change, FY14, to the year after, FY16. Then I pulled the same data for another trade publisher, "Publisher B," who did not change its participation in STLs before June 30, 2016. Table 1 presents a few basic facts about our campus size and book budget. I have given two FTE numbers, as we have law and medical students who are served by independent campus libraries. However, these students may use our DDA pool, and we do not restrict law and medical content from our pool.
Across the study period, the Reynolds Library had budget increases that we could partially apply to monographs. As indicated in Table 2, we directed most of these increases toward the DDA slice of the budget, while keeping firm-order funds flat and slightly cutting approvals. The Collection Management unit does not dictate to our liaisons any policy regarding duplication of titles. Liaisons may purchase both print and electronic, or only one preferred format, depending on the needs of their specific department. Likewise, when Publisher A and other publishers dropped out of STLs, I announced the change but did not dictate any specific action that liaisons needed to take in response. Therefore, I could assess what happened naturally.

Data Collection

Since the DDA decision had no impact on our journal subscriptions, I confined my project to the monographs side of the budget. I also ignored STL/DDA spending on backlist books, since backlist books were not impacted by Publisher A's decision. Therefore, I examined these types of purchases:
• Approval shipments of print
• Firm-order print purchases
• STL spending on frontlist titles
• DDA-triggered purchases of frontlist titles
• Firm-order eBook purchases
To keep the focus on books, I excluded non-text formats such as films and music scores. I retrieved our total spend on monographs from our Voyager ILS, in which we use reporting funds to flag content types. To see specifically what we purchased from Publishers A and B, we requested imprint-level data from GOBI Library Solutions.1 GOBI identified which imprints corresponded to these two publishers. While we do use other book vendors besides GOBI, practically all of our frontlist purchases from these two publishers would be sourced from GOBI. Statistics on EBL eBook usage and costs came from the EBL statistics module.
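The five purchase types and the two exclusions amount to a simple classification rule. The sketch below is hypothetical: the record fields (`format`, `channel`, `backlist`) are invented labels for illustration, not actual Voyager reporting funds or GOBI data elements.

```python
def classify(record):
    """Map an order record onto the study's purchase types (None = excluded)."""
    if record["format"] not in ("print", "ebook"):
        return None  # non-text formats (films, music scores) excluded
    if record.get("backlist"):
        return None  # backlist STL/DDA spending ignored
    if record["format"] == "print":
        return ("approval print" if record["channel"] == "approval"
                else "firm-order print")
    if record["channel"] == "stl":
        return "frontlist STL"
    if record["channel"] == "dda-trigger":
        return "frontlist DDA purchase"
    return "firm-order eBook"

print(classify({"format": "print", "channel": "approval"}))  # approval print
print(classify({"format": "ebook", "channel": "stl"}))       # frontlist STL
print(classify({"format": "score", "channel": "firm"}))      # None
```

Summing spend per category, per publisher imprint, and per fiscal year would then reproduce the shape of Tables 3 and 4.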

Results

The results for Publisher A (which dropped out of STLs) are summarized in Table 3. As you can see, for Publisher A the $3,094 in STL spend from FY14 was almost exactly displaced by $3,151 in firm-order eBook purchases. So the Reynolds Library spent practically the same money but got access to many fewer books for our pains. Interestingly, there was practically no effect on spending on firm-order print books, which means that liaisons did not respond to this situation by choosing to buy more print from Publisher A. The results for Publisher B (which remained in the STL program) are summarized in Table 4. Publisher B had a sharp increase in DDA spending. A look at the title-by-title data revealed that Publisher B responded to the STL crisis by dramatically raising the per-STL price instead of dropping out of the program entirely, at least during the time under study. At the Reynolds Library, the median STL cost from Publisher B rose from $13.82 to $32.33 per STL instance. As a result, the total amount
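The displacement described here reduces to simple arithmetic on the figures reported in the text; restated as a sketch (the dollar amounts are the ones the article gives, the variable names are mine):

```python
# Publisher A: FY14 STL spend vs. FY16 firm-order eBook spend
fy14_stl_spend = 3094
fy16_firm_ebook_spend = 3151

displacement = fy16_firm_ebook_spend - fy14_stl_spend
print(f"Publisher A net change: ${displacement}")  # $57: spend held roughly steady

# Publisher B raised per-loan prices instead of leaving the program
median_stl_fy14 = 13.82
median_stl_fy16 = 32.33
increase_factor = median_stl_fy16 / median_stl_fy14
print(f"Publisher B median STL cost rose {increase_factor:.1f}x")  # about 2.3x
```

The near-zero net change for Publisher A is the article's point: the same dollars bought outright access to far fewer titles.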

<http://www.against-the-grain.com>

63


Being Earnest with Collections from page 63 spent with Publisher B increased despite accessing fewer titles. Another way to look at this data is to think about each publisher’s share of Reynolds Library’s monograph spending. As I stated above, our book spending had risen in FY16 as compared to FY14, and most of that increase came from the DDA category. Table 5 shows what happened to each publisher when viewed in the light of our entire book spend. So for Publisher A, dropping out of DDA meant a decline in their “market share” of our spending. Publisher B also dropped in “market share,” but not as sharply.
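The "market share" view is simply each publisher's spend divided by the library's total monograph spend in that year. A sketch with illustrative figures (the real numbers live in Table 5) shows how flat dollar spending against a growing total budget produces a falling share:

```python
def market_share(publisher_spend, total_spend):
    """A publisher's share of total monograph spend, as a percentage."""
    return round(100 * publisher_spend / total_spend, 2)

# Illustrative figures only; the article's Table 5 holds the actual data.
fy14 = {"Publisher A": 6200.0, "Publisher B": 4100.0, "total": 250000.0}
fy16 = {"Publisher A": 6300.0, "Publisher B": 4300.0, "total": 290000.0}

for pub in ("Publisher A", "Publisher B"):
    before = market_share(fy14[pub], fy14["total"])
    after = market_share(fy16[pub], fy16["total"])
    print(f"{pub}: {before}% -> {after}%")
```

With these made-up numbers, both publishers gain a little in dollars yet lose share, mirroring the pattern the article describes.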

Table 1: FTE and Monograph Spending FY14 vs. FY16

Table 2: Reynolds Library Monograph Spending by Type: FY14 vs. FY16

Discussion and Conclusion

So what did we learn? In a way, all parties lost. Both publishers lost "market share." (I did not investigate whether this was due to a single "Publisher C" gaining more sales, sales spreading out among more publishers, or some other combination of factors.) Publisher A's strategy of dropping out of STLs worked in the sense that we purchased individual eBooks that offset the FY14 spending. However, Publisher A missed the opportunity to gain more sales as Reynolds Library directed budget growth toward DDA. Publisher B also lost "market share," but its strategy of raising STL prices was more successful than dropping out of the STL model entirely. Reynolds Library spent the same amount of money with Publisher A, but lost access to a broad swath of titles. Had Reynolds Library not dropped Publisher A's offerings, we would not have been able to afford the program — especially as other publishers also dropped out of the STL program.

Since the study period ended, ProQuest has rebranded EBL as Ebook Central and more recently has released the "Access-to-Own" model. By setting a higher price for the first use, ProQuest is moving closer to the model adopted by Publisher B during the study term. The long list of publishers that have agreed to participate in the Access-to-Own model2 implies that publishers are willing to try this approach.

Wake Forest University may not be a typical customer. Is it more normal in this day and age for libraries to cut monograph budgets instead of increasing them? Publishers may be designing their pricing strategies based on a customer profile that is very different from ours. Also, a publisher's profit margin is almost certainly not identical across different distribution channels, e.g., DDA through ProQuest vs. sales on its own eBook platform. However, as I was not made privy to these differential profit margins, I could not take them into account.
Since Reynolds Library gleans a lot of value from the STL and DDA models, we will continue investing in them as long as possible. However, as purchasing options evolve, we will need to adapt our selection practices in response.

Endnotes
1. I am grateful to Steve Hyndman and his colleagues at GOBI Library Solutions for their assistance in gathering data for this analysis.
2. Available publicly at http://media2.proquest.com/documents/access-to-own-publisherlist.pdf.

64 Against the Grain / February 2018

Table 3: Spending for “Publisher A,” who Dropped STL

Table 4: Spending for “Publisher B,” who Retained STL

Table 5: Share of Reynolds Library Monograph Spending for Each Publisher




Let’s Get Technical — Improving Electronic Resources Management Communication with Chat Solutions by Michael Fernandez (Electronic Resources Librarian, American University Library) <fernm@american.edu> Column Editors: Stacey Marien (Acquisitions Librarian, American University Library) <smarien@american.edu> and Alayne Mundt (Resource Description Librarian, American University Library) <mundt@american.edu> Column Editor Note: In this month’s column, we feature the experience of trying out different forms of communication to be used within a small department. Michael Fernandez, Electronic Resources Librarian at American University Library, explains the challenges he and his staff faced with communicating amongst themselves and the steps that were taken to improve communication. — SM & AM

Introduction

There are numerous communication streams impacting electronic resources management (ERM). External communications, such as access issues and vendor correspondence, reach ERM staff through multiple channels, while ERM staff themselves utilize a variety of methods to communicate with individuals and groups both inside and outside of the library. While best practices for many of these modes of communication have been developed and refined, internal communication among ERM staff can remain complex and inefficient. This article will describe the communication challenges faced by the ERM Unit at American University (AU) Library, the methods undertaken to address them, the results, and the lessons learned for improving communication.

The Situation

The ERM Unit at AU Library is housed within the Technical Services Division and consists of one librarian and two specialists. The ERM Unit manages all stages of the e-resources lifecycle: trialing and acquiring new e-resources, licensing, setting up access, responding to troubleshooting requests, and generating usage reports for assessment. Within the ERM Unit, each specialist has a different area of concentration. One specialist focuses on access and serves as a primary contact with vendors for price quotes, trials, and making e-resources discoverable. The other specialist focuses on assessment and works on collecting and maintaining usage statistics in the electronic resource management system (ERMS), as well as generating usage reports. Both specialists have been cross-trained, allowing them to fill in for each other when the situation calls for it.

Troubleshooting access problems accounts for a sizable amount of the work in the ERM Unit. The specialists have been cross-trained in this area as well, which is vital; depending on the access problem, troubleshooting often requires collaboration, consultation with external parties, or rerouting of reported issues. Access problems can reach the ERM Unit in a variety of ways. The primary method is an online form that library users are prompted to fill out when they encounter an e-resource problem. This form generates a troubleshooting ticket within iSupport, the library's request management software, and when a new ticket is generated, a notification email is sent to the ERM Unit's shared email account.

This shared email is the preferred contact for the ERM Unit: it is the address associated with vendor correspondence for financial communications such as billing and renewals, and it is tied to administrative accounts on vendor platforms. Multiple staff in the ERM Unit handle vendor correspondence, so the shared account ensures that the correct staff see relevant vendor communications and protects against messages being lost amid staff absence or turnover. Access problems can also be communicated directly to the shared email; this is typically done by public services staff, who also report problems over the phone, via the reference chat service, and in person. For these direct communications, ERM Unit staff manually create a troubleshooting ticket so that the problem can be tracked.

With this variety of methods for reporting access problems, it could be difficult to determine who was responding to what. These communication issues tended to become exacerbated during peak times for troubleshooting tickets, typically at the beginning of the semester and during finals. The shared email served as a good triage point for access problems, albeit an imperfect one. For one, being associated with hundreds of vendor and publisher accounts, it receives no small amount of spam. Secondly, access issues from library colleagues may be directed to a specific member of the ERM Unit, often informally or in passing, and the same problem might be reported both directly by a staff member and through a patron's troubleshooting ticket. Faced with the problems of email clutter and potentially wasted staff bandwidth through duplication of effort, the ERM Unit looked for ways to improve its internal communication channels.

The Process

The ERM Unit began investigating the use of chat programs for internal communication in December of 2016. This move was precipitated by a few factors. First, the ERM Unit had recently filled a vacant specialist position after being short-handed for the better part of the previous year. Second, the other specialist had just begun working from home one day a week. Shortly after that, both specialists' schedules included teleworking once a week. Having a fully staffed ERM Unit doing regular remote work made chat communication a necessity.

Initially, the ERM Unit piloted Slack, a chat platform that can be used freely online via a browser or with a downloadable app. Trying Slack came at the suggestion of the newly hired specialist, who had used it at her previous position working for a library vendor. Slack had a number of attractive features that made it a logical candidate for piloting. One is that Slack archives chat history and makes it fully searchable, so if there was a question about who had responded to a particular e-resource access issue, or how it was answered, the digital trail of that question could easily be followed up on. Slack also gives users the ability to create dedicated chat rooms, or Channels. This feature was potentially useful given the number of special projects the ERM Unit focuses on: through Channels, we could keep general access troubleshooting conversations separate from conversations on projects such as our unit's accessibility inventory. Other appealing aspects of Slack were its collaborative features, such as file sharing and the ability to sync with apps like Google Calendar and Google Drive.

Slack was piloted by the ERM Unit for about three months, and during that time it proved useful for communication. Delegating tasks through chat was particularly helpful, as a large volume of the unit's work comes through the shared email address, and chat allowed us to determine who was covering what without the inbox clutter of additional emails or worrying about cc'ing everyone on a response. Chat was also useful for collaborating on ERM issues, for example, brainstorming on what level of technical language to use when responding to a patron's access issue, or messaging a URL for link checking. Even on days when both specialists were on site in the office, Slack was useful for troubleshooting an issue without worrying about disturbing other Technical Services staff within earshot.

If there was a drawback to Slack, it was that the small size of our unit meant we weren't really taking advantage of all its functionality. For example, separate Teams and groups can be set up within Slack; this wasn't necessary within our small unit. Organizationally, Slack would have been a better fit if the library as a whole had adopted it as a chat solution.

In March of 2017, around the same time we were gathering the pilot feedback for Slack, the Director of Technical Services asked us to investigate Skype for Business. This was part of a larger push to use Skype, which had recently been adopted at AU after an institution-wide migration to a Microsoft Outlook email server the previous summer. Additionally, Technical Services was in the midst of being relocated off-site, so we were looking for ways to retain communication lines with colleagues on campus and for options for holding more meetings remotely.

Overall, the transition to Skype in the ERM Unit was smooth, and it proved better suited to the size of our group. Skype retains similar functionality to Slack, such as file sharing. Skype's integration with Outlook also preserves chat histories, which are archived within a designated folder in the Outlook client, and scheduling is facilitated via the integration with the Outlook calendar. Another positive for Skype was that the chat client was pre-installed on the specialists' workstations, so they could log in that way or configure the client to auto-login when the desktop was booted. The specialists found this preferable to logging onto our Slack instance via web browser, although Slack does also provide downloads for a desktop client or mobile app.

Lessons Learned

As a chat solution, Skype quickly became integrated into the regular workflows of the ERM Unit. Communicating via Skype happens every day in the unit and now feels like second nature to the specialists. Certain types of communication, like link checking or collaborative troubleshooting, lend themselves to chat and can be handled more efficiently than they would be verbally or over email.

The initial three-month pilot of Slack was informative and useful for determining what type of chat solution worked best for the ERM Unit. While Slack offers excellent functionality and integration potential, it wasn't necessarily suited to our small unit of three. For larger organizations, where job duties may be more diffused, Slack can potentially be a more effective communication tool. In adopting a tool like Slack, it's advisable to take a top-level approach in order to achieve staff buy-in across the organization. The customizability and flexibility of Slack are assets, as they allow staff to use it however they feel comfortable, whether by logging into the web-based client or through an app on their mobile device. For the ERM Unit, we found that the scalable ease of Skype was better suited to our needs. Skype's chat client is as simple and natural as text messaging, and its automatic integration with Microsoft Outlook was another positive.

Our overall takeaway was to be willing to experiment with different chat solutions and find what works best in terms of organizational size and staff communication style. In the ERM Unit, chat is used daily, typically initiated with a "good morning" check-in. Another takeaway was to identify which types of communication were better suited to chat, and which ones lent themselves more to email or verbal communication. Chat hasn't replaced all other communication in the ERM Unit, but it has become a preferred tool for interactions that can be handled more immediately or less formally than via email. Despite its immediacy, it's also important to be patient when awaiting responses; someone may not hear or see a chat notification right away because they are engrossed in another task or have stepped away for a minute.

For the immediate future, the ERM Unit plans to continue using Skype as its preferred chat solution. At the time of this writing, other units at AU Library are experimenting with the Microsoft Teams application for larger cross-departmental projects, such as our upcoming migration to Alma. The robust functionality of Teams bears a number of similarities to Slack, so it will be interesting to observe how it gets utilized. As our experience has taught us, scalability to the organization's size is a key factor in the success of a communication solution.



Library Analytics: Shaping the Future — Applying Data Analysis: Demonstrate Value, Shape Services, and Broaden Information Literacy by Rachael Cohen (Discovery User Experience Librarian, Indiana University Bloomington) <rachcohe@indiana.edu> and Angie Thorpe Pusnik (Digital User Experience Librarian, Indiana University Kokomo) <atthorpe@iuk.edu> Column Editors: John McDonald (EBSCO Information Services) <johnmcdonald@ebsco.com> and Kathleen McEvoy (EBSCO Information Services) <k.mcevoy@ebsco.com>

Academic librarians are no strangers to statistics and assessment, and we do not just passively provide access to materials. For decades, librarians have evaluated their collections and programming against changing user needs and interests. In recent years, however, librarians have shifted from assessing their impact primarily through peer comparisons to turning inward and assessing their impact against distinctive institutional student success outcomes. With decreasing budgets and increasing calls for evidence of impact, we are having to prove both our value and our expertise at every turn. We are being asked to justify the purchase of expensive tools and journal packages, and we are called upon to demonstrate how our subject expertise directly influences student learning. Rather than waiting to be asked for proof of impact, librarians have assumed the driver's seat and begun collecting and analyzing data to assess the value of our collections and demonstrate our importance to the academy. There is no better time to jump headfirst into proactively illustrating the library's value.

Academic librarians grapple with several hard truths every day. For one, we acknowledge but strive to defy the fact that we often reach and actively consult with only a small percentage of our users. The truth remains that a percentage of our users never set foot in the university library. We also contend with the oft-held belief — from students and faculty alike — that, as "digital natives," students already know how to search. As we have found, though, search transaction logs often show the exact opposite. The time is ripe for librarians to find new ways to communicate and work with their users. Failing to do so further removes us from the research process, which reduces our overall value.

One powerful solution that may already be available to many librarians is anonymous user data. In particular, anonymous user behavior and search transaction logs from web-scale discovery services offer rich testimony to user skillsets and subject interests. The authors contend these data are essential toward developing connections with more library users and establishing the pivotal role of the library within the academy.

Case Studies

Both authors work at institutions that launched EBSCO's discovery service in fall 2011. By early 2014, largely motivated by the hefty financial investments we were making in discovery, we realized we needed to begin assessing its efficacy. We turned to data both because it was comprehensive and because it documented genuine, unfiltered user behavior. Even the collection of data from searches recommended or conducted by a librarian reflected actual "in the wild" discovery usage. Usage data is, in fact, the record of what actually happened. We compared user behavior metrics across our two campuses: the flagship Indiana University campus in Bloomington, which enrolled nearly 47,000 students in 2013-2014, and the significantly smaller regional Indiana University campus in Kokomo, which enrolled approximately 2,600 students in 2013-2014. We looked at longitudinal usage reports from our discovery vendor and Google Analytics over the three most recent academic years. Vendor data provided insightful user engagement metrics, such as full-text downloads and abstract views. Google Analytics yielded valuable user behavior data, including the devices and browsers used to access discovery, plus the distribution of basic versus advanced searches. Although we had hypothesized that user behavior would differ between our campuses, our results mostly complemented one another's, including the fact that — even in 2014, despite the ubiquitous nature of smartphones — students at both campuses still overwhelmingly used a desktop or laptop computer to access discovery.1

One year later, in 2015, we returned to our data. Although we had shared our 2014 findings with our colleagues, admittedly, the change we had sought had not yet transpired. In this context, and in response to observations that online library resource usage far exceeded in-person library visits and that university administrators were increasingly interested in overall assessment of student success, we asked ourselves some tough questions:

• Discovery services/tools were undoubtedly more prevalent in libraries, but how were students actually searching in them?
• What trends, successes, and challenges do user search queries reveal?
• Moreover, who is even using our discovery services/tools?
• How does user behavior compare across our two institutions: the largest and the smallest campuses in a multi-campus university system?

We also had to ask ourselves how we would find answers to these questions beyond our standard discovery usage data. Focus groups, interviews, and surveys were possibilities, but focus groups and interviews are inherently limited in terms of participation counts, and surveys run the risk of low participation rates. Furthermore, through other studies we have found that students are rather prone to telling you what they think you want to hear, rather than their unfiltered reality. Thus, we instead turned to transaction logs for our dataset. Transaction logs — or user search queries — hold the capability of revealing not only users' information needs but also broader trends and patterns in searching behaviors.

Our methodology involved examining search queries from our respective discovery service logs, which were harvested from Google Analytics. We collected a semester's worth of data, and then we each identified a random sample of 1,677 recorded search queries. To understand who was using discovery at each campus, we then reviewed and categorized each query in our respective datasets, using the Library of Congress (LC) Classification schedule to assign both a class and subclass to each query. Categorizing queries allowed us to pinpoint themes and/or assignments students were working on so that we could then collaborate with our teaching colleagues — both in and outside the library — to further help students succeed in their information-seeking activities. We initially set out to identify recurrent queries at each campus and then compare results in order to gauge any overarching patterns of user behavior across campuses.
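The sampling-and-categorization step described above can be sketched briefly. In the study the LC class assignment was done manually against the full LC Classification schedule; the tiny keyword lookup here is a toy stand-in for that human judgment, and the sample size of 1,677 matches the article:

```python
import random

def sample_queries(queries, n=1677, seed=42):
    """Draw a reproducible random sample of recorded search queries."""
    rng = random.Random(seed)
    return rng.sample(queries, min(n, len(queries)))

# Toy keyword-to-LC-class lookup, for illustration only.
LC_KEYWORDS = {
    "nursing": "R (Medicine)",
    "psychology": "B (Philosophy, Psychology, Religion)",
    "marketing": "H (Social Sciences)",
}

def categorize(query):
    """Assign a (coarse) LC class to a query via keyword matching."""
    q = query.lower()
    for keyword, lc_class in LC_KEYWORDS.items():
        if keyword in q:
            return lc_class
    return "Uncategorized"

print(categorize("community nursing interventions"))  # R (Medicine)
```

Tallying the categorized sample then surfaces the themes and assignments the authors describe.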

However, our end project innovated on the initial project plan: we did not simply analyze our search query logs, identify heavy versus light discovery service users, and determine which LC subjects were searched most and least frequently (most: social sciences and medicine; least: world history and agriculture). Rather, we discovered search patterns and themes, which opened up unexpected opportunities to engage more deeply with our instruction librarian and teaching faculty colleagues through the application of our search query data analysis. Search patterns included search missteps, such as typographical errors and questions (e.g., "why people travel"). Where our first dive into data analysis provided us with interesting but not necessarily actionable results, our second study on search queries provided practical evidence we can leverage to improve our services.2
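Flagging the kinds of missteps mentioned above can be partially automated. A minimal sketch that flags question-form queries and one hard-coded misspelling; real typo detection would need a dictionary or edit-distance check, and the "libary" case is purely illustrative:

```python
def flag_missteps(query):
    """Flag misstep patterns in a search query: natural-language
    questions, and (one toy example of) a typo."""
    flags = []
    q = query.strip().lower()
    # Question-form queries: end in "?" or start with a question word.
    if q.endswith("?") or q.split()[:1] in (["why"], ["how"], ["what"], ["who"]):
        flags.append("question")
    if "libary" in q:  # illustrative typo check only
        flags.append("typo")
    return flags

print(flag_missteps("why people travel"))  # ['question']
```

Run over a full semester's log, counts of each flag give a rough picture of how often users search in natural language rather than keywords.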

Applying Data-Informed Results

We are not the first libraries to use Google Analytics to evaluate our resources, and we acknowledge the appeal of continuing to peel back the layers of collected data. However, rather than just analyzing data, we propose applying data to effect change. Data such as transaction logs allow us to demonstrate the value of both our collections and our expertise to the academy. Librarians with technical and public service responsibilities alike can utilize anonymous usage data to partner with instructors to build students' information literacy skills, shape library services and resources, and increase overall engagement with users. These activities can, in turn, improve user assignments, perceptions of library services, and overall appreciation for the library.

Collection managers, including subject specialists and liaisons, can use search transactions to improve collection development. Frequently searched titles that are not part of the library's collection should be considered for acquisition. Titles that are owned may be prime opportunities for outreach regarding print or online library reserves, if allowed. Themes that emerge from subject categorization may also reveal changing disciplinary focal points. For example, if a subject recurs in search queries but is collected at only a minimal or basic level, collection managers may want to explore coverage expansions, in consultation with both faculty and library service providers.

For librarians who serve at public service desks, identifying search patterns and themes can serve as preparation for the reference questions they are likely to encounter during specific semesters. Even if the questions and/or topics are outside their areas of expertise, librarians can familiarize themselves with the optimal resources needed to answer them. This preparedness, in turn, instills a perception that librarians are truly experts in research and are an important resource to utilize.
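The collection-development use case, matching frequently searched titles against holdings to surface acquisition candidates, can be sketched as a frequency count plus a set difference. The titles below are made up, and real query-to-holdings matching would need normalization or fuzzy matching rather than exact string comparison:

```python
from collections import Counter

def acquisition_candidates(queries, held_titles, min_hits=2):
    """Return (query, count) pairs searched at least min_hits times
    that do not match any held title. Exact matching only; a real
    implementation would normalize and fuzzy-match titles."""
    counts = Counter(q.strip().lower() for q in queries)
    held = {t.lower() for t in held_titles}
    return [(q, n) for q, n in counts.most_common()
            if n >= min_hits and q not in held]

# Hypothetical query log and holdings list.
queries = ["Invisible Man", "invisible man", "Beloved", "Beloved", "Walden"]
held = ["Beloved"]
print(acquisition_candidates(queries, held))  # [('invisible man', 2)]
```

The same counts, filtered to titles the library does own, would drive the reserves-outreach idea mentioned above.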
The librarian becomes as much a resource to consult as the recommended material in the library's collection.

Librarians with systems responsibilities will likely also find it beneficial to review transaction logs. Log analysis facilitates the identification of similar queries and patterns, which helps librarians learn how users interact with resources. This understanding enables librarians to better advocate for, and then implement, appropriate new features that assist students in their searching. This knowledge will likely also benefit vendors as they continue to develop intuitive yet effective products for library users.

Last but definitely not least, transaction log analysis benefits librarians with instruction responsibilities. Transaction logs reveal common user errors and misunderstandings when searching library resources. Equipped with this knowledge, librarians can proactively advise students on how to craft their search queries — including using built-in tools such as auto-suggest keyword features — to avoid problems before they emerge. This is particularly helpful because studies have shown that students are more likely to refine or completely change their search query than to use a facet or click to the second page of results.3 This level of preparedness enables librarians to move beyond teaching basic searching skills and instead spend more time covering precise techniques that help students formulate their research questions and match their search strategies to appropriate search tools.

We have spoken in generalities about how librarians with different responsibilities may benefit from data or transaction log analysis. Since completing our data analysis, we have begun to translate our results into action. We launched supplementary training sessions for student employees and librarians, recognizing that these groups require different instruction methods and levels of detail. The trainings focused on what is and is not included in our discovery services, discovery facets — such as source types or publications — and content providers, since several search queries included specific facets (e.g., "professional journals about teaching" and "Journal of American Medical Association") or database names (e.g., "ArtStor" and "factiva"). Our student trainings incorporated independent, hands-on activities, while our librarian trainings involved demonstrations and additional technical explanations.

We are also developing intentional outreach plans to share our results with our teaching faculty. We have identified a limited number of faculty on each campus with whom we would like to work initially. For this pilot phase, we plan to work with faculty who are teaching capstone courses. We will present sample search queries from each faculty member's discipline and begin a conversation about the instructor's satisfaction with their students' coursework. We foresee these conversations then moving to how we can collaborate to build reciprocal value between our faculty and our librarians.

Conclusion

The ACRL Framework for Information Literacy for Higher Education calls on librarians to identify core ideas within their own knowledge domain that can extend learning for students, create a new cohesive curriculum for information literacy, and collaborate more extensively with faculty.4 The authors uphold data — including transaction logs — as a core idea that affects all of these areas. That is:

• Data empowers librarians to identify where student searches are breaking down so they can improve services and extend learning for students.
• Data analysis and the evaluation of student searches thus follow logically as integral components of a cohesive curriculum for information literacy.
• As a component of the information literacy curriculum, sharing data is another strategy for librarians to collaborate more extensively with faculty.

This latter point is significant: on their own, search queries are just data. They lack context, and they are anonymous. However, librarians can use this data as another starting point for more meaningful conversations with our colleagues; we can extract meaning from our data. This data allows us to connect actual user information needs with our services, our collections, our instructors, and our courses. With this data, we are better equipped than ever to collaborate with our colleagues, build students' abilities to formulate appropriate search queries, use information ethically, and meet student success outcomes.

Endnotes
1. Cohen, Rachael A., and Angie Thorpe. "Discovering user behavior: Applying usage statistics to shape frontline services." The Serials Librarian 69.1 (2015): 29-46. https://doi.org/10.1080/0361526X.2015.1040194
2. Cohen, Rachael A., and Angie Thorpe Pusnik. "Measuring query complexity in web-scale discovery: A comparison between two academic libraries." Reference & User Services Quarterly (in press).
3. Examples: Georgas, Helen. "Google vs. the library (part II): Student search patterns and behaviors when using Google and a federated search tool." portal: Libraries and the Academy 14.4 (2014): 503-532. https://doi.org/10.1353/pla.2014.0034; Brett, Kelsey, Ashley Lierman, and Cherie Turner. "Lessons learned: A Primo usability study." Information Technology and Libraries 35.1 (2016). https://doi.org/10.6017/ital.v35i1.8965
4. Association of College and Research Libraries. Framework for Information Literacy for Higher Education. (2015). http://www.ala.org/acrl/standards/ilframework

<http://www.against-the-grain.com>


Squirreling Away: Managing Information Resources & Libraries — Snow Plows or No Plows - What’s a City to Do? Column Editor: Corey Seeman (Director, Kresge Library Services, Stephen M. Ross School of Business, University of Michigan; Phone: 734-764-9969) <cseeman@umich.edu> Twitter @cseeman

As I started to write this, it was snowing in Ann Arbor, Michigan. The forecast was grim, with estimates of between six and nine inches in Southeast Michigan over the course of the day — with more snow expected on Saturday and Sunday. Schools across the region were closed and flights in and out of Detroit’s DTW Airport were cancelled as well, leaving many librarians on their way to Denver for ALA’s Midwinter Meeting anxious or rethinking their trips. If you could stay at home, we were being told by weather professionals, that might be a good idea.

But in Ann Arbor, we are chugging along. It is not that we are amazingly dedicated (though we are) or tremendously talented (though we are); it is because we have the equipment and the means to take care of the snow for our mostly residential campus. In fact, this morning, Michigan President Dr. Mark Schlissel tweeted “Up before dawn getting our campus ready, our outstanding Facilities & Operations staff Make Blue Go and help keep us safe, no matter the weather. Thank you, team!!!!”1 And it is definitely true that we are grateful for the Facilities & Operations staff who clean up every time it snows. We are also grateful for the road crews who are out every time it snows, even if it falls on a holiday or a weekend. As my former boss once told me, it is truly a gift that we live in a world where the snow that falls is plowed off the streets, allowing us to get out of our houses and where we need to go, even on the worst days.

In the winter, especially in Michigan, we are very focused on the weather. Your ability to travel, get to work, get to school, walk the dog, etc. all hinges on your ability to get out of your front door. It is also, as the old joke goes, a great time to make French toast, as the grocery stores always run out of milk, eggs, and bread in the day before a big storm.
Since you have a good deal of time to think about things when you are clearing the driveway and sidewalks in front of your house, it got me thinking about what the connection might be between the way we manage for snow events and the way we build collections in our libraries. Maybe it was the cold air running through my brain, but there were more connections than I knew! So I wanted to learn more about the snow removal industry in the United States. In doing some research (in our databases...PLEASE!), I learned that IBISWorld estimates that snow plowing services in the United States represent an $18.8 billion industry with over 151,000 individual businesses involved. This does not include municipalities and government.2 From that very same report, we learn this about expenses:

Purchases
Purchases are expected to account for 31.0% of revenue in 2017. Industry operators commonly purchase replacement plow and spreader components, including mounting systems and frames, hydraulics, joystick controls, enhanced headlamps and blade markers. Industry operators also purchase snow removal equipment, such as shovels. However, fuel expenditures and the procurement of rock salt are the Snowplowing Services industry’s largest purchase costs. Rock salt prices, according to Snow Magazine Online, can cost industry operators upwards of $110 per ton. These costs can be especially prohibitive as larger snowplowing companies commonly purchase more than 350 tons of salt per year. Rising rock salt prices have led to an increase in purchase costs from 2012 through 2014. However, in recent years, industry operators have implemented new materials and technologies to mitigate these costs. For example, liquid ice-melting products are more commonly used. These solutions act as a preventive anti-icer and can also be combined with traditional rock salt for more efficient ice melting.

Against the Grain / February 2018

Other expenses
Depreciation costs are expected to comprise 2.6% of industry revenue. Many industry operators work seasonally and are employed in other occupations, including landscaping, building contracting and various other skilled trades. Consequently, these individuals already own large pick-up trucks or other vehicles, which are this industry’s largest capital expense. Other capital expenses include snowplows and salt spreaders. Snowplows, for example, vary in shape, size and design. However, standard straight-blade plows can cost as little as $4,600.3

So from this report, we learn (not surprisingly) that the expenses are balanced between capital goods, such as plows, shovels and other equipment, and consumable goods, such as fuel and rock salt. I am not sure why they did not count the eggs, milk and bread in the consumables — I will have to write them and ask. So these expenses are simply the cost of doing business. What is also addressed, and more germane to this article, is the Business Locations exploration of this industry sector. IBISWorld estimates that Michigan has 8.4% of the snowplowing establishments in the United States. For those keeping score, New York comes in first, with 14.1% of these businesses dedicated to keeping the Empire State moving during winter. If you have ever been to Buffalo in the winter, this makes perfect sense. For that matter, if you have ever been to Buffalo in the summer, it also makes sense. In describing where these firms are not, the report makes what would be a fairly obvious statement:

All of the other regions have a disproportionately lower concentration of establishments because they typically experience milder winter weather and little to no snow. For example, the Southeast is expected to account for only 4.6% of establishments in 2017, although the region is home to over a quarter of the US population.
Likewise, the West accounts for only 6.8% of all establishments, although the region is home to 17.2% of the population.4

In looking at the other side of the market, from the service to the equipment, we see that there are also great constraints on snowplow manufacturing. A similar IBISWorld report states: “Snowplows, salt spreaders and other snow control equipment are simple mechanisms that are used for a single purpose, so keeping products fresh and innovative may challenge some industry operators.”5

So right now, if you’re still here, you might be wondering where I am going with this. If this were a crime drama, the judge would ask if the counselor has a question. And I do. What does this have to do with libraries? Everything. But before I continue, I want to bring up an event that took place in our beloved Charleston in early January this year. On January 3rd, an uncharacteristically bad winter storm dumped five inches of snow in Charleston and shut down the airport for four days. While five inches of snow might not have been a problem had temperatures bounced back into their normal range for a Charleston January, they did not climb above the freezing mark. This led to icy conditions and a layer of snow and ice that did not melt, leaving airport operations frozen out. This amount of snow would not have slowed down any airport in the “Frost Belt” because those airports are prepared for inclement weather. In Charleston, that is simply not the case. So we should ask ourselves an important question: why weren’t they prepared? Long after the airport resumed operations, the Post and Courier asked Paul Campbell, CEO of the Charleston County Aviation Authority, why they were not better prepared for the storm. Among the many things Campbell said, one resonated with me: “It doesn’t make any economic sense for us to buy any equipment [to clear snow]. It was a once-in-30 year event.”6 continued on page 70



Squirreling Away ... from page 69

According to the Post and Courier, the previous snow was a number of years earlier: “The last real snow in the area was eight years ago, and that was only a couple of quickly melting inches. For a lot of people along the South Carolina coast, this was as deep a snow as they had ever seen.”7 The snowfall’s novelty wore off when people realized that the snow was not melting quickly. With the airport shut down and roads very difficult to travel on, the novelty shifted to anger. Besides the countless passengers who were stranded in their travels to and from Charleston, Boeing was particularly concerned. The construction of its 787 Dreamliner commercial aircraft, which is partially assembled in the giant facility adjacent to the airport, was interrupted by the storm. The question being asked over and over again across the region was why the airport or the county didn’t have the snowplows to handle this weather occurrence and get the city moving again. The answer is likely found in the IBISWorld reports — snowplow services have very little market in South Carolina, and the cost would be high for the community to purchase this equipment just in case. If you made a huge investment in snow plows, which really serve no other purpose, could you justify having them sit at the end of an airport taxiway collecting rust year after year? It is not a matter of marketing to get people to use them — they are only needed when it snows. Had they the equipment and the staff who could operate it, they might not have had to close down at all. The airport CEO would likely have been quoted in the papers saying something along the lines of “they thought I was crazy buying these snow plows, but I knew that eventually we’d need ’em.” But it would have been the first time the equipment was used in eight years, and that might raise another question — was the money used to buy it well spent? So why did I take us on this story?
Well, to talk about print volumes and library collection development, of course. Think about the problems that the Charleston airport (and the community) suffered through because they did not have snow plows, despite needing them every 30 years or so (if we believe the airport CEO). I would think that it would be difficult to justify the cost of equipment (and staff time) for something that may be used once every ten or twenty years. And while Charleston travelers on January 3rd would have been thrilled if the airport had made the investment, what might they have given up to pay for equipment so rarely used? To need something once every eight years is a hard sell for an administrator. In looking at our library collections, are we making the same decisions? We often use ten-year windows in looking at circulation of newly purchased print items as a measure of success. We claim that book reviews are not as timely, and do not drive patrons to these volumes. We continually invest in discovery platforms to expose these resources to our patrons. We think about marketing as a way to get people to check out these items. Can we routinely purchase items that are not used for significant amounts of time and be good stewards of our campus dollars? There is no right or wrong answer here for sure. Librarians are using their professional expertise to help build the collections that support

And They Were There from page 62 Laying Down the Whack-a-Mole Mallet: One Inexperienced ERM Team’s story about adopting the Agile Philosophy to Manage Electronic Resources, The Epic Saga – Part One — Presented by Gerri Rinna (Western Michigan University) Reported by Susannah Benedetti (University of North Carolina Wilmington) <benedettis@uncw.edu>


the work being done at their campus. That all being said, do library administrators and collection development librarians need to think more critically about how the resources we purchase will be used? We should be driven by what is needed on our campus more than by what others are doing. And if we invest in something that might be used once in ten, twenty or thirty years, what have we given up to make that possible? If collection development librarians were not constrained by space, by staffing issues or by budget, the work would be easy. But that is not the case anywhere. The libraries with the larger budgets, bigger facilities and more staff also typically support a campus population with a greater appetite for library resources. In libraries, we tend not to look at other aspects of our communities to find parallels for how we should build our collections and our services. In the relatively rare need for plow equipment in Southern cities, we see an interesting exploration of the very issue that is at the core of collection development. Are we buying what our campus really needs, or are some works as useful as a snowplow in South Carolina? The cost associated with having every tool in our toolbox is simply not something that any library can afford. Here is hoping that your travels are weather-incident free and that you have all the resources your campus needs. A guy can dream, right?

Corey Seeman is the Director, Kresge Library Services at the Ross School of Business at the University of Michigan, Ann Arbor. He is also the new editor for this column, which intends to provide an eclectic exploration of business and management topics relative to the intersection of publishing, librarianship and the information industry. No business degree required! He may be reached at <cseeman@umich.edu> or via Twitter at @cseeman.

Endnotes
1. https://twitter.com/DrMarkSchlissel/status/961953636560658432?ref_src=twsrc%5Etfw
2. Madigan, John (2017). Snowplowing Services in the US. IBISWorld Industry Report OD5400. Retrieved from IBISWorld database, February 9, 2018.
3. Madigan, John (2017). Snowplowing Services in the US. IBISWorld Industry Report OD5400. Retrieved from IBISWorld database, February 9, 2018, pp. 23-24.
4. Madigan, John (2017). Snowplowing Services in the US. IBISWorld Industry Report OD5400. Retrieved from IBISWorld database, February 9, 2018, p. 21.
5. Peters, Iris (2017). Snowplow Manufacturing in the US. IBISWorld Industry Report OD5432. Retrieved from IBISWorld database, February 9, 2018, p. 9.
6. Wise, Warren. Boeing Wants Action Plan After Charleston Airport Snow Debacle. Post and Courier (Charleston, S.C.), January 19, 2018, page B1 (from Access World News, accessed on January 24, 2018).
7. Petersen, Bo. Kids of All Ages Fall for Rare Coastal South Carolina Snow. Post and Courier (Charleston, S.C.), January 4, 2018, page A1 (from Access World News, accessed on January 24, 2018).

Rinna described her experiences using an agile management tool at Western Michigan University to handle the increasing and expanding number of e-resources, with new subscription models, platform changes, browser updates, and apps that seem to change constantly, all on top of changing library and campus administration, initiatives, and strategic plans. In 2015 the library migrated from a locally hosted system to a cloud-hosted ILS with Knowledge Base, Link Resolver, Discovery Layer, Statistical module, etc. With workflows, processes, and even file management unsustainable in the new system, which placed more responsibility on the ERM team, she implemented the Kanban board, a project management process that originated with Japanese production and manufacturing industries continued on page 73



Charleston Comings and Goings: News and Announcements for the Charleston Library Conference by Leah Hinds (Executive Director, Charleston Conference) <leah@charlestonlibraryconference.com> 2018 Conference Theme and Dates

The Charleston Conference is pleased to announce the theme for 2018, drawn from Percy Bysshe Shelley’s poem “Ode to the West Wind” — O, wind, if winter comes, can spring be far behind? The conference, in its 38th year, will be held November 5-9 in the Performance Hall of the Charleston Gaillard Center as well as in numerous adjoining hotels in downtown Charleston. Here are some upcoming dates and deadlines:
March 1 – Call for Preconferences Opens
April 16 – Call for Papers Opens
April 28 – Call for Preconferences Deadline
June 4 – Vendor Showcase Registration Opens
June 11 – Conference Early Bird Registration and Hotel Room Blocks Open
July 13 – Call for Papers Deadline
August 17 – Juried Product Development Forum Deadline
August 24 – Charleston Premier Deadline
August 31 – Up and Comer Nomination Deadline
September 14 – Early Bird Registration Deadline
October 12 – Regular Registration Deadline
October 26 – Online Registration Closes
November 5-6 – Preconferences and Seminars
November 6 – Vendor Showcase
November 7-9 – Main Conference

Rumor has it…

Annette Thomas, CEO of Clarivate’s Scientific and Academic Research Group, will be keynoting at Charleston 2018. You might remember that Annette was our keynote speaker at the 2012 Charleston Conference, when she was CEO of Macmillan. Here is a link to the Penthouse Interview with Annette Thomas done in 2012 in Charleston: https://www.youtube.com/watch?v=oXe-iF-t4i0 Taken from Katina’s “If Rumors Were Horses” blog at http://www.against-the-grain.com/2018/02/rumor-2-15-18/.

Videos

Videos of all 2017 plenary sessions, Neapolitan sessions, and select concurrent sessions and Lively Lunch Discussions are available on the conference website at http://www.charlestonlibraryconference.com/video/. Also available are the 2017 ATG Views from the Penthouse Suite interviews (http://www.charlestonlibraryconference.com/video/atg-penthouse-interviews/), including Maggie Farrell (UNLV), Brewster Kahle (Internet Archive), Judy Luther (Informed Strategies), Georgios Papadopoulos (Atypon), Loretta Parham (AUC – Robert W. Woodruff Library), Dave Tykoson (California State University), and Charles Watkinson (University of Michigan Press at University of Michigan Library).

2017 Up & Comer Award Winners

ATG: The Podcast (www.atgthepodcast.com) is featuring a series of interviews with 2017 Up & Comer Award Winners. This award is intended for librarians, library staff, vendors, publishers, MLIS students, instructors, consultants, and researchers who are new to their field or are in the early years of the profession. Up and Comers are passionate about the future of libraries. They innovate, inspire, collaborate, and take risks. They are future library leaders and change makers, and we are excited to celebrate them with this award. A full list of award winners is available at http://www.charlestonlibraryconference.com/2017-upcomer-award-winners/. Keep an eye out for the nominations to open for 2018 later this spring!


Charleston Conference Webcasts

The Charleston Conference is now offering a series of free webcasts on topics of importance to libraries, publishers, and vendors in the information industry. Past topics include misinformation, frameworks for improving end-of-year spending, and library marketing. See our website for recordings of previous sessions: https://www.charlestonlibraryconference.com/video/webinars/.

Upcoming Webcasts

February 28, 2:00 pm Eastern — Library as Publisher: New Models of Scholarly Communication for a New Era — Presented by Sarah Kalikman Lippincott. Why are so many libraries going into the publishing business at a time when scholarly publishing is facing so many challenges? Publishing, after all, is a complex business, and the trend in the marketplace is toward economies of scale and the consolidation of smaller publishers into the fold of the largest. It does not seem a propitious moment for a library to become a small independent publisher. So why are libraries doing this? How is this similar to or different from the services commercial publishers provide? Does it involve offering the same services, or are new models, types of content, and needs resulting in new solutions that suit new players? This webinar will help attendees understand the context of library publishing. It also explores when a publishing program is a good fit for a library and provides guidance for defining, launching or growing a publishing initiative.

March 26, 2:00 pm Eastern — Reading in a Digital Age — Presented by David Durant, East Carolina University, and Tony Horava, University of Ottawa. How is reading changing in the digital environment? How will it continue to change? Are we headed for an all-digital future? Or does print still have a place in the reading environment? Does format matter? What do readers tell us they want? This webinar will offer librarians, publishers, vendors, and others an overview of these key issues, as well as advice on how their institutions should approach the print vs. digital controversy.

Charleston Briefings

Both of these upcoming webinars are based on books of the same title that are part of the Charleston Briefings, a series of short books (12,000 to 20,000 words) on the topic of innovation in the world of libraries and scholarly communication. The default format for the Briefings is an open access eBook using Michigan Publishing’s eBook hosting platform to provide online access. Print copies are also available through Amazon. http://charlestonbriefings.com/

The audience for the Briefings is the same audience that attends the Charleston Conference: librarians, publishers, entrepreneurs in information technology, vendors and consultants. The series will offer timely, readable, and focused treatments of topics of significance to practitioners in these fields. The purpose of the series is to offer the reader a useful overview that will allow them to engage more effectively with new trends and innovations in their industry. The Charleston Briefings are the first service being developed under the umbrella of ATG Media in addition to the Charleston Conference and Against the Grain. ATG Media is a wholly owned subsidiary of ATG, LLC. Stay tuned for more information on the conference website at www.charlestonlibraryconference.com, or join our email list at http://bit.ly/chs-email-list to receive periodic updates.



ALA Midwinter Meeting 2018: February 9-13, Denver, CO by Leah H. Hinds (Executive Director, Charleston Conference) <leah@charlestonlibraryconference.com>

“Oh, the weather was cold and frightful, but the meeting was so delightful…” Denver showed us its characteristic split-personality weather, ranging from cold and snowy on Friday and Saturday, to beautiful blue skies and milder temps on Sunday, then back to cold and snow. With attendance at just over 8,000 people, numbers were down marginally from 2017 and significantly from 2016. The vendors I spoke with, however, reported good traffic in the exhibit hall and a plethora of meetings to keep them busy. It was good to connect with our friends and supporters, and meet new faces as well.

I was especially interested in the Future of Libraries Symposium sessions, which were focused on exploring the many futures for academic, public, school, and special libraries. Sponsored by ALA’s Center for the Future of Libraries, the Symposium considered the “near-term trends already inspiring innovation in libraries and the longer-term trends that will help us adapt to the needs of our communities.” https://2018.alamidwinter.org/symposium-about

[Photo: Tom Gilson, Associate Editor for Against the Grain, and I stopped by for a meeting at the Prenax booth to talk with Nancy Percival and George Rego.]

The Saturday morning plenary session was “Libraries Transform: Education Innovation,” presented by Peter Piccolo, Executive Director of Innovation at the Imaginarium — Denver Public Schools Innovation Lab, and Nina Sharma, Managing Director at the University of Denver’s Project X-ITE (Innovation, Technology and Entrepreneurship). Peter spoke on his experience at the Imaginarium and gave advice for creative innovation. Some key takeaways:

[Photo: Peter Piccolo and Nina Sharma.]

• If you’re innovating, your heart should be at the least accelerated. Innovation is risky.
• Know the difference between good and bad failure, and how to learn from your mistakes.
• Be aware of structural barriers that create incentives or disincentives for leadership.
• Know what problem you’re solving for!
• Look to other industries and “steal like an artist.” Food trucks inspired the public school system to create a microschool.

Nina Sharma presented on various projects and events that were done through Project X-ITE at the University of Denver. “Project X-ITE is a cross-disciplinary community at the University of Denver that supports the next generation of entrepreneurial thinkers through collaborative events, experiential programming, and strategic global partnerships” (from http://www.projectxite.org/). Some key takeaways:
• Events included Denver Startup Week, StudioX, the Project X-ITE Forum, and the Colorado Solutions Summit, which brought together unique teams of thinkers, doers, and problem-solvers to work together in curated mentoring teams to support Colorado companies.


• (About Project X-ITE) “We get stuff done. We don’t stop. Forward motion. And it drives everyone else crazy.”
• We started our program without student advisors. It was the antithesis of human-centered design. Quickly corrected by including their users on an advisory board.

The Hunter Forum, sponsored by Elsevier, was focused on Library Impact and Value Across the University. Moderated by Jean Shipman of Elsevier, panelists included Michael Levine-Clark, University of Denver; Emily McElroy, University of Nebraska Medical Center; Melissa Bowles-Terry, University of Nevada Las Vegas; and Dr. Regina Kilkenney, University of Colorado-Denver.

[Photo: Panel at the Hunter Forum.]

The forum included topics such as:
• Leveraging the strategic goals of the university: how do libraries communicate value to administrators and become indispensable to their university? What metrics and qualitative information are used beyond traditional library statistics? (McElroy)
• Connecting the library to university values to create resonating messaging (Levine-Clark)
• Library educators are making a difference on institutional outcome metrics. Highlights from the GWLA multi-institutional study The Impact of Information Literacy Instruction on Student Success (Bowles-Terry) (http://www.arl.org/storage/documents/publications/The_Impact_of_Information_Literacy_Instruction_on_Student_Success_October_2017.pdf)
• A University Administrator’s Perspective and Panel Response (Kilkenney)

The Awesome Ideas Pitch was supported by the Libraries chapter of the Awesome Foundation, “an international organization that forwards the interest of awesome in the universe $1,000 at a time.” Eight finalists presented their ideas for projects to impact their community, demonstrate a new idea, or improve a tool or service. The eight finalists, in order of presentation, were:
• Making is for Everyone: Student Startup (Jessica Davilla Greene, Claremont Colleges Library)
• The Chattanooga Memory Project: a crowd-sourced digital history project (Corinne Hill and Chris Cummings, Chattanooga Public Library)
• Project Book Culture (Stacy Linemann, Waseca-Le Sueur Regional Library System)
• Free the Textbooks! (Carrie Moran, Cal State San Marcos)
• Getting Down with IoT (Sandy Avila, University of Central Florida)
• Health for the Healers (Maggie Ansell and Ariel Pomputius, University of Florida)
• Kids Storytelling Festival (Christian Zabriski, Yonkers Public Library)
• Punk: First Wave Documentary (Nick Parker, Sacramento Public Library)

continued on page 73



ALA Midwinter Meeting from page 72

Two winners were selected, each to win a $1,000 grant from the Awesome Foundation: one by audience vote and one selected by a panel of judges. The audience pick was “Free the Textbooks!” while the judges selected the “Kids Storytelling Festival.”

The Sunday morning plenary session was on Diversity and Equity, and featured inspiring talks from Elizabeth Martinez and Binnie Wilkin. They both spoke on their wealth of experience and knowledge in building diversity and equity in libraries and in the information industry as a whole, as well as forward-looking thinking on the same topic. Binnie Wilkin predicts, “Bold systems and new forms of networks will evolve as new generations who have grown up experiencing life in an age of connectivity become the decision-makers.” Video highlights of this and other Symposium sessions can be seen at https://youtu.be/CYhn3QLqPpw.

Another fascinating and cutting-edge session I attended was “Blockchain, Open Civic Data, and TV Whitespace: Three New Projects.” Moderated by Sandra Hirsch from San Jose State University, the session featured three IMLS-funded projects at various institutions. Sue Alman, San Jose State University School of Information, spoke about a project dedicated to understanding blockchain technology and its potential uses in libraries. More information can be found on their blog at https://ischoolblogs.sjsu.edu/blockchains/. Toby Greenwalt of the Carnegie Library of Pittsburgh spoke on the Open Civic Data project aimed at connecting libraries and community information networks. They’re hosting workshops and two conferences in year one, and will offer stipends to partnerships for field testing their toolkit in year two. Updates and more info at https://civic-switchboard.github.io/. Finally, Kristin Rebmann from San Jose State University presented her project on TV Whitespace, technology to broadcast wifi into the community through unused television frequencies. http://ischool.sjsu.edu/about/news/detail/ischool-associate-professor-awarded-grant-will-expand-libraries%E2%80%99-internet-access

Sunday afternoon featured a great panel discussion sponsored by the LITA/ALCTS Electronic Resources Management Interest Group titled “Vendor Relationships: Build, Negotiate, Transform.” The panel was moderated by Michael Rodriguez, University of Connecticut, and featured Jason Chabak, ReadCube; Lindsay Cronk, University of Rochester; Allen Jones, the New School; Christine Stamison, NorthEast Research Libraries Consortium (NERL); and Kimberly Steinle, Duke University Press. Each panelist gave fantastic input on networking, collaboration across industries, and establishing successful relationships.

The Monday morning plenary session was on Civic Innovation, and featured talks from Margaret Hunt, Colorado Creative Industries and Space to Create, and Jake Rishavy, Colorado Smart Cities Alliance. Hunt spoke on the “Space to Create” project, generating affordable live/work spaces for the creative industries workforce and artists in rural communities, along with the Creative District Community Loan Fund, Art in Public Places, grants to support the arts, and career advancement grants for creative entrepreneurs. Rishavy spoke on the Colorado Smart Cities Alliance, a statewide collaboration of public, private and academic sector leaders committed to accelerating the adoption of smart cities projects and initiatives in their respective communities.

Up next was a concurrent session titled “Sustainability Strategies for Libraries and Communities” that presented several great ways to get libraries involved with environmentally friendly and sustainable practices. Joe Mocnik from North Dakota State University presented steps his institution is taking to move from coal-burning heat to more renewable resources. Am Brunvand from Utah State University is teaching students information literacy from a civic engagement perspective, tying in sustainability with place-based knowledge and local advocacy groups. Rebekkah Smith Aldrich from the Mid-Hudson Library System spoke on the NYLA Sustainability Initiative (www.nyla.org/sustainability) and their implementation of a regional certification program, Sustainable Library Certification. The certification is currently available only for public libraries in New York, but they plan to expand to the school and academic sectors soon. Ben Rawlins from SUNY Geneseo presented their OER initiative, SUNY OER Services, which included an Excelsior Scholarship of $8 million to provide open educational resources to students at SUNY and CUNY to defray textbook costs.

The closing session on Monday afternoon featured Bill Nye, “The Science Guy,” and co-author Gregory Mone. Together they have authored a series of children’s books called Jake and the Geniuses. The pair met by chance at a coffee shop in California and got to know each other when Mone invited Nye to go surfing with him in Malibu the next day. The session was a fun-filled discussion that felt like a comfortable talk between friends. Mone posed questions to Nye, and he answered with characteristic wit and humor. Topics ranged from the realism used in the books (“No jet packs!”) to the importance of including female characters (“Half the humans are girls and women, so half the engineers and scientists should be girls and women.”). When asked, “What do libraries/librarians mean to you?” Nye responded that librarians help you learn to think, and the role of librarians is to help people figure out what is reasonable information and to teach critical thinking skills.

[Photo: Bill Nye and Gregory Mone at the Closing Session.]

ALA Annual 2018 will be held in New Orleans, LA, June 21-26.

And They Were There from page 70

and has since seen wider applications that include librarianship, as described in an article in the Journal of Electronic Resources Librarianship in 2016. Physical Kanban boards use sticky notes on a whiteboard to communicate status, progress, and issues visually. Online tools like Asana utilize the whiteboard metaphor in a software setting with customizable "lanes" such as To Do, Plan, Develop, Test, Deploy, Done. The ERM team has used the tool for projects such as discovery layer configuration for MARCIVE and government documents, streamlining the usage statistics workflow, ILS configuration, reports management, and ERM lifecycle reports. Agile management using the Kanban model fosters collaboration, self-organization, and cross-functionality through visual transparency, so that all team members can be aware of what everyone else is working on, the progress being made, and what the team is trying to accomplish.

continued on page 77
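For readers unfamiliar with the model, the lane mechanics are simple enough to sketch in a few lines of Python (a toy illustration only; the lane and project names below are taken from the description above, not from the team's actual Asana configuration):

```python
# Toy model of a Kanban board: lanes hold task cards, and the only
# state change is moving a card from one lane to another. Printing
# the whole board at once is the "visual transparency" of the model.

LANES = ["To Do", "Plan", "Develop", "Test", "Deploy", "Done"]

class KanbanBoard:
    def __init__(self, lanes=LANES):
        self.lanes = {lane: [] for lane in lanes}

    def add(self, card, lane="To Do"):
        """Stick a new note in a lane (default: To Do)."""
        self.lanes[lane].append(card)

    def move(self, card, dest):
        """Move a card to another lane, wherever it currently sits."""
        for cards in self.lanes.values():
            if card in cards:
                cards.remove(card)
                self.lanes[dest].append(card)
                return
        raise ValueError(f"card not on board: {card!r}")

board = KanbanBoard()
board.add("Discovery layer configuration (MARCIVE)")
board.add("Streamline usage statistics workflow")
board.move("Streamline usage statistics workflow", "Develop")
```

Because every card lives in exactly one lane of one shared structure, any team member looking at the board sees the same state of every project at a glance.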

<http://www.against-the-grain.com>



ATG PROFILES ENCOURAGED

The first profile in this issue recognizes one of ATG Media's 2017 Up and Comers whose profile did not appear in the last issue with the others. Who exactly is an "Up and Comer," you ask? They are librarians, library staff, vendors, publishers, MLIS students, instructors, consultants, and researchers who are new to their field or are in the early years of the profession. Up and Comers are passionate about the future of libraries – they innovate, inspire, collaborate, and take risks. They are future library leaders and change makers, and they all have one thing in common – they deserve to be celebrated. In addition to having their profiles appear in ATG, they will be featured in a series of scheduled podcast interviews that will be posted on the ATGthePodcast.com website as well. The award winners were also recognized at the Charleston Conference First Time Attendee Reception in November. ATG Media would like to thank Erin Gallagher (Director of Collection Services, Reed College Library, <gallaghere@reed.edu>) for all her work with organizing the nominations and gathering the profiles from the 2017 Up and Comers. Congratulations to all who were nominated.

Katie Mika Assistant Professor, Data Services Librarian Norlin Library University of Colorado Boulder Boulder, CO 80302 Phone: (303) 735-3111 <Katherine.Mika@Colorado.edu>

Born and lived: Born in Tampa, FL; lived in Philadelphia, PA; Houston, TX; Pittsburgh, PA; Fraser Valley, CO; Boston, MA; and now back to Colorado in Boulder. What attracts you to a career in libraries: I really enjoy working at the intersection of a lot of different disciplines in the academy and being involved in an incredibly diverse range of research. I've always loved digging for information to answer questions and satisfy curiosities, and I'm very grateful to be able to do it professionally. The opportunity for activism in librarianship is also very compelling for me. I believe in information access as a human right, and I'm happy to have found a vocation that supports open access to research and data and open source, collaborative projects. Professional career highlights: Receiving an Up & Comer award from ATG! Working with the Biodiversity Heritage Library as a National Digital Stewardship Resident was an incredible opportunity that I'm certain will remain a career highlight. I learned so much about collaborative, academic research, and I very much enjoyed being able to spend every day digging into the guts of a digital library to figure out how to make improvements and enhance access to collections. In my spare time: I enjoy skiing, trail running, playing golf, cooking, and of course, reading. Favorite books: A Tree Grows in Brooklyn, The Dispossessed, Infinite Jest. The change I hope to make in the profession: I hope to encourage more Open Access and Open Science policies across institutions and publishers. I hope to improve access services to digital content, data, and metadata on a large scale. Goals I hope to achieve five years from now: I would like to collaborate with other librarians and institutions in supporting computational research using library collections. I hope to become a valuable resource for users navigating research data issues. And I hope to develop a professional network of innovative librarians, archivists, and collection managers with whom to develop exciting and interesting projects that push the field forward. Where do I see the industry in five years: I see less of a reliance on proprietary software and formats and more enthusiastic support of open source initiatives. I see more international collaborations. And I hope I see a more diverse profession that is committed to inclusivity and representing its users.

Athena Hoeppner Discovery Services Librarian University of Central Florida 4000 Central Florida Blvd. Orlando, FL 32817 Phone: (407) 823-5019 <athena@ucf.edu> • https://library.ucf.edu

Born and lived: Athena was born in California, which served as a launching pad for a rather nomadic childhood with stints in California, Utah, Colorado, Nebraska, and Wyoming. She identifies as a Coloradan expat, as she spent her most memorable pre-adult years there, on the Front Range. Professional career and activities: Athena is the Discovery Services Librarian at the University of Central Florida, in Orlando, Florida. Her career in academic libraries began in 1994, when she was hired into her first position as a temporary librarian at Northern State University, Aberdeen, SD, shortly after graduating with an MLIS from Simmons College, Boston, MA. When the nine-month temporary contract was up, Athena accepted an offer to join the Reference Department at the University of Central Florida. Her affiliation with UCF Libraries spans over 20 years and a range of roles in public services, systems, and technical services. Most recently, her position title changed from Electronic Resources Librarian to Discovery Services Librarian. In her current role, she oversees eResources expenditures of over $4 million, leads implementation and maintenance of discovery and access technologies, and spends too much time on email and spreadsheets. Her first love remains public service, and she devotes four hours per week to Reference and Ask a Librarian, directly interacting with students and their information needs. Athena's work and research center on applying technology to connect users to content. Her projects explore usability, user experience, and technology for improving discovery of, and access to, information. In my spare time: My after-work hours often feel as busy as work, or more. Most every evening I'm dancing swing or blues (and sometimes freestyle goth), indoor climbing, breathing and stretching in yoga, lifting heavy with friends, or table top gaming. I love to cook, especially for groups of friends who've come over for games or a movie.
I also love sitting on the couch, eating dinner, snuggling my dog, and chatting with my spouse. Pet peeves: Littering. Philosophy: Life is what we make of it. I can do things that make life better for me and for others. Most memorable career achievement: If I am to brag, I’d say that being invited to give the keynote address at the Resource Discovery Conference in Bath, UK was a highlight. The invitation was a direct result of my article, The Ins and Outs of Web Scale Discovery, published in Computers in Libraries. That single paper has had the greatest transformative effect on my career. continued on page 75



ATG Profiles Encouraged from page 74

Goal I hope to achieve five years from now: I want to help transform the way people conceptualize, react to, and interact with their information needs. I want to do this by reimagining library technologies and how people find and interact with them. I'm not a technologist, so I expect my part will mostly be spouting off ideas to as many people as will listen. I'd also like to get through my emails.

How/where do I see the industry in five years: The industry will continue to seek ever better ways to deliver high-quality, relevant, high-value content to libraries. Artificial intelligence will become a normal part of the tools vendors use to process their content and indexes for better discovery and delivery. AI could be used to enhance metadata, uncovering concepts and meanings, identifying distinct data elements and findings within documents, and indexing them for discovery independently from the documents that contain them. Authentication and authorization will shift away from EZproxy for most major vendors, relying on federated identity instead. Federated identity systems, such as Shibboleth and OpenAthens, will continue to expand relationships with vendors. Federated identity providers will more routinely pass anonymized user attributes to databases, which will be used to customize the interface, recommendations, and user experience for individuals. In addition, libraries will be able to obtain granular usage data that incorporates user attributes. Combined with AI that learns the information habits of individuals, users may have truly customized experiences on vendor platforms. Users will increasingly use their mobile devices for academic purposes, including searching for and viewing library content. Vendors will develop interfaces and tools to improve the mobile experience of their products and bring the functionality in line with user expectations. A major part of this will be easy transitions from mobile to full computer for work, allowing users to choose the suite of apps and tools they wish to use to interact with the content. Vendors will respond to the increasing availability and discoverability of OA and the constant pressure of Sci-Hub by creating options to legally use and share content. Libraries will continue to collect, organize, disseminate, and preserve the knowledge of humankind. And they will thrive.

Pat Sabosik General Manager of ACI Scholarly Blog Index and President of Elm City Consulting, LLC. I was under contract with the ACI Information Group to develop the ACI Scholarly Blog Index. Elm City Consulting, LLC 1069 Ridge Road, Hamden, CT 06517 Phone: (203) 988-7861 <psabosik@elmcityconsulting.com>

Born and lived: New Jersey. Early life: New Jersey.

Professional career and activities: Currently serve on the board of directors for Stevens Institute of Technology's Web Campus. Family: Married, one son, two beautiful grandchildren.

In my spare time: I garden and hike, oh, and backpack. There is nothing as restful as sleeping in the woods in a tent! Favorite books: My all-time favorite is The Foundation Trilogy by Isaac Asimov, and anything written by Mark Kurlansky. I'm reading Kurlansky's Paper now.

Pet peeves: None, really.

Most memorable career achievement: Launching ScienceDirect.

How/where do I see the industry in five years: The publishing industry will settle into a service model for libraries and authors, and subscriptions as we know them will not be relevant. The 40+ author lists in published research articles will end. Research information will be open. Scholars will continue to compete on innovation in their fields, and innovations will have a practical/industry focus to them. Education will begin to shift to experiential learning, and students will learn by doing, acquiring competency-based skills. Libraries will stay relevant as service centers to institutions and align more closely with the Office of Research and Admissions to better serve students and faculty at their institutions.

(Note: Alignment of libraries with the Admissions Office is an idea I'm thinking about a lot. The new student cohorts being admitted to college have advanced computer skills because they are digital natives. Librarians who better understand the profile, skills, and expectations of incoming students will be better able to provide services to them. Still working on this idea.)

Peter McCracken Electronic Resources Librarian Cornell University 110 Olin Library, Cornell University Ithaca, NY 14853 Phone: (607) 255-1892 <phm64@cornell.edu>

Born and lived: Seattle, WA.

Professional career and activities: Reference Librarian at East Carolina University and the University of Washington; then co-founder of Serials Solutions (2000). In 2009 I left Serials Solutions (by then a part of ProQuest) to found ShipIndex.org, which is still operating on a shoestring. In June 2016 I joined Cornell University. Family: My wife is an art historian and professor at Ithaca College; we have one son. We have lived in Ithaca, NY, since 2007. Favorite books: English Passengers, by Matthew Kneale; Reamde and Seveneves, both by Neal Stephenson. Most memorable career achievement: My then-seven-year-old son sidling up to me to hold my hand (which he generally hated doing) while I was honored at a reception for receiving the Ulrich's Serials Librarianship Award, in 2011.

Goal I hope to achieve five years from now: Make ShipIndex.org self-sustaining; promote the importance of maritime history research and make it easier to do.

How/where do I see the industry in five years: Constantly changing, with new and different tools and services for helping people discover the resources they need to learn and to improve their lives. The continuing growth of a duopoly in library services companies is disconcerting, however (and I recognize my complicity in this, though it wasn't so obvious in 2004), and librarians should actively support a wide range of vendors, even if it sometimes means more complexity or cost. In the long run, a strong and diverse library marketplace will ensure a better range of services for patrons.

Elizabeth Siler Collection Development Librarian UNC Charlotte 9201 University City Blvd. Charlotte, NC 28223 Phone: (704) 687-1372 Fax: (704) 687-0890 • <esiler3@uncc.edu>

Born and lived: Born in Mt. Laurel, NJ. Raised in Highland Village, TX. continued on page 76




ATG Profiles Encouraged from page 75

Early life: I grew up just outside Dallas, TX, and attended undergrad at Elon University in NC. Prior to working in libraries I worked in television production and marketing. Professional career and activities: I graduated in 2010 from the University of Kentucky with an MSLS and have worked in Electronic Resources and Collection Development for the last 7.5 years at both FIU and UNCC. I have been active in ALA, serving on different committees within ALCTS, and have also served in local or regional organizations such as SEFLIN and ASERL. Family: My husband and I have been married for going on 10 years and have a 4-year-old daughter and a new son expected at the end of February 2018. In my spare time: I spend my spare time with my family and friends, taking advantage of the events planned throughout our city and the many breweries around Charlotte. Favorite books: Harry Potter series; Middlesex; All the Light We Cannot See. Most memorable career achievement: Successfully executing a full two-day conference as part of a Mellon-funded grant. Goal I hope to achieve five years from now: I hope to continue to progress in my career and move into a more advanced role in the library. How/where do I see the industry in five years: I see a continued move toward electronic resources, with more collaboration among librarians, publishers, and content providers to provide the best possible content in the most accessible way, with plans for long-term preservation. I also see the continued consolidation of library content companies providing fewer choices and funding hurdles for libraries in the future.

Rachel Walton Digital Archivist and Records Management Coordinator, Rollins College, Olin Library 1000 Holt Ave., Winter Park, FL 32789 Phone: (407) 691-1127 <rwalton@rollins.edu> http://www.rollins.edu/library/yourlibrarian/ rachel.html

Born and lived: Born in California, raised in Florida. After a short few years in Raleigh, North Carolina, I moved back to Florida and have no plans to leave. I love the sunshine! Professional career and activities: I received my master’s in Library Science at UNC Chapel Hill in 2015, with a specialization in digital curation. I am also a graduate of the University of Florida where I received both a bachelor’s in History and Art History and a master’s in Latin American History. My professional research interests include website usability, open access publishing, and data management. I’m an active member of the Society of Florida Archivists as well as Chief Editor of the organization’s scholarly journal. I also enjoy presenting at national conferences such as the Society of American Archivists’ annual meeting and IASSIST. Family: I’m happily married to a bearded, baseball-loving foodie. We just bought our first place in Casselberry, Florida, and we have one very spoiled fur baby. In my spare time: I like to spend time at local coffee shops sipping cappuccinos and reading alongside my adorable French Bulldog sidekick, Jackie. Favorite books: Michael Chabon’s Yiddish Policemen’s Union.

Philosophy: If you liked it, then you should have put metadata on it!


Most memorable career achievement: Receiving the Society of American Archivists’ Theodore Calvin Pease Award in March 2016 for “Looking for Answers: A Usability Study of Online Finding Aid Navigation.” How/where do I see the industry in five years: My goal is always to help make the people in my sphere of influence better digital stewards of records and information. So, I look forward to the day when all producers of information (authors, researchers, publishers, students, etc.) engage in preservation planning and choose sustainable preservation solutions for their digital assets. Also, from a library perspective, I hope to see more vendors in our industry offering transparent and meaningful usage statistics driven by standard metrics that can be easily compared across platforms. I also believe that the next five years will show an increasing interest from vendors and users alike in open access publishing solutions.

Heather Wilson Acquisitions & Electronic Resources Librarian Caltech Library 1200 E. California Blvd., MC 1-43 Pasadena, CA 91125 Phone: (626) 395-3419 <hwilson@caltech.edu>

Born and lived: Technically, I'm from Charleston! I was born in the hospital at the old Charleston Naval Base and lived on the base for my first three years. Professional career and activities: As of 2018, I began serving as the Community Outreach Coordinator for the CORAL Project, meaning that I coordinate a seasonal newsletter, act as a community communication point, and serve on the CORAL Steering and Web Committees. I have served on both committees since September 2016, when Caltech implemented CORAL locally. Family: I'm an only child of two parents who are so married that they drive the same car (as in, two 2000 Chrysler Sebring convertibles in the driveway). In my spare time: I collect records and operate a little pet-sitting business in Hollywood. Favorite books: Biographies, anything by Murakami, and the AACR2, of course. Pet peeves: I adopt pet peeves too easily to burden others with new ones. If you just mention your own pet peeve, it will become my pet peeve. I don't want to put that on the reader. (Just kidding: people who whistle on the metro drive me crazy.) Most memorable career achievement: This Against the Grain article is a pretty great milestone. I've read Lolly Gasaway's column since I was in graduate school. Goal I hope to achieve five years from now: I'm hoping to get a second degree, an MBA in Supply Chain Management, and focus on the information services life cycle. While I'm interested in how libraries can gain equity in that life cycle, I hope to enter Research Administration and focus on the role of research in that digital production chain. How/where do I see the industry in five years: The growth in reporting tools and authentication options on the market indicates that my work will move away from collection maintenance and more toward access management. That the current subscription model is unsustainable seems to be common knowledge at this point, no citation needed. Compounding the problem, the technology for getting around these subscriptions will likely grow more widespread and more advanced. With these price increases, my own library's collection will have halved in five years, and five years is also enough time for pirating options to become ubiquitous. Open access options and article-on-demand tools provide some alternatives, so improving access to those less expensive, more library-friendly options seems to be a worthwhile focus during this transitional time. continued on page 77



COMPANY PROFILES ENCOURAGED

ACI Information Group, LLC

10 Potter Hill Road Guilford, CT 06437 www.aci.info

Officers: Larry Schwartz, President

Key products and services: Newstex News & Commentary Blog Index, blog feed service. Core markets/clientele: Corporate and government markets, content resellers. Number of employees: ACI Information Group is a private company. History and brief description of your company/publishing program: For over a decade, ACI Information Group has been one of the world's leading aggregators of authoritative content. ACI's leading product, the Newstex News & Commentary Blog Index, a blog feed service, offers easy access to thousands of online publications written by industry insiders and thought leaders across a range of disciplines. Formerly Newstex, the company changed its name to ACI Information Group in 2015. Is there anything else that you think would be of interest to our readers? ACI's Newstex News & Commentary Blog Index is a blog feed service available to corporations and resellers, as well as to libraries as a licensed feed. ACI's Scholarly Blog Index, the library end-user product, has been discontinued.

Otto Harrassowitz GmbH & Co. KG

Kreuzberger Ring 7b-d 65205 Wiesbaden Germany Phone: +49 (0)611 530 0 Fax: +49 (0)611 530 560 www.harrassowitz.de

Officers: Friedemann Weigel – Managing Partner, Director of Sales. Ruth Becker-Scheicher – Managing Partner, Director of Accounting. Nadja Dorn-Lange – Managing Partner, Human Resources & Publisher Relations. Association memberships, etc.: ALA, EDItEUR, IFLA, MLA, Music Library Association, NASIG, NISO, UKSG. Key products and services: Comprehensive range of high-quality acquisitions and collection development support for the following types of resources: subscriptions, standing orders, approval plans, monographs, music scores, and databases. Core markets/clientele: Academic and research libraries. Number of employees: 185

History and brief description of your company/publishing program: Established in 1872, HARRASSOWITZ has been serving libraries around the world for almost 150 years. A trusted partner to libraries and publishers worldwide, we deliver industry-leading, tailored services empowering them to fulfill their vital role supporting the world's teaching, learning, and research. Our core values of trust, respect, teamwork, and reliability inspire us to anticipate our customers' needs and exceed their expectations. Is there anything else that you think would be of interest to our readers? Our customers consistently identify us as an industry leader in terms of the quality, accuracy, and speed of our customer service, and often refer to our service as the Gold Standard in information supply. In 2017, HARRASSOWITZ was named one of "1000 Companies to Inspire Europe" by the London Stock Exchange Group. To be included in the report, companies needed to show consistent revenue growth over a minimum of three years as well as the ability to outperform their peers.

Back Talk from page 78

for our users the print collections that they find in our buildings (eventually we have seven locations to think about in this way) into a source of inspiration and a sustaining resource and tool for success. Let us hear from you — <jod@asu.edu> and <lorrie.mcallister@asu.edu>.

And They Were There from page 73

Navigating Research: Do scholarly resources still meet users' needs? — Presented by Patricia Hudson (Moderator, Oxford University Press); David Tyckoson (California State University, Fresno); Simon Pawley (Oxford University Press) Reported by Alicia Willson-Metzger (Christopher Newport University) <awillson@cnu.edu>

Pawley summarized the findings of an Oxford University Press study published in the white paper Navigating Research: How academic users understand, discover, and utilize reference resources. The research methodology consisted of in-depth interviews with librarians, faculty, and students; UK and U.S. librarians interviewed were then surveyed to augment interview responses. Some chief findings: patrons do not seek basic factual information in reference resources, and instead turn to familiar resources such as Wikipedia. Patrons at all levels require guidance in finding relevant resources for interdisciplinary research. Discoverability is central to resources being used. Connecting users to relevant reference content is a continuing challenge.

Tyckoson provided a "real world" look at the implications of this study in the Cal State-Fresno Library. He examined reference collection usage by frequency and used this information to inform reference weeding decisions. Promoting reference use is key to patron engagement with the collection: include reference works in the library's discovery system and circulate reference sources. While most sources will not be used, the "best" resources will be. It is important to weed: develop a retention policy and note retention guidelines for library staff in the catalog. This informative session was as described in the conference program.

That's all the reports we have room for in this issue. Watch for more reports from the 2017 Charleston Conference in upcoming issues of Against the Grain. Presentation material (PowerPoint slides, handouts) and taped session links from many of the 2017 sessions are available online. Visit the conference website at www.charlestonlibraryconference.com. — KS




Back Talk — Balancing the Books Column Editor: Jim O’Donnell (University Librarian, Arizona State University) <jod@asu.edu>

Chaucer's Clerk of Oxford was a very learned man and a bibliomaniac besides — he had twenty books all his own! You can be sure those books were special to him, carefully guarded, repeatedly studied, and half-memorized. He reminds me a little of myself as a kid in our house on an army post in the desert. My parents were readers, but there still probably weren't more than a hundred volumes in our house at one time, even counting the stream of books checked out from the Post Library. (I still have, and use, my dad's copy of Edward Gibbon, and a few more of those parental books besides.) Times change. I own a lot more than a hundred books and I go to (and work in!) hugely larger libraries now. If the digital revolution hadn't intervened, I'd probably own even more and our libraries would hold even more. Now the law of supply and demand tells us that as supply increases, demand gets attenuated and perceived value falls. So I own books I don't know I own and am surprised to find on my shelves. (Occasionally, I buy what turns out to be a duplicate.) Does that mean I don't care about books? Well, quite the opposite, but numbers matter and big numbers matter a lot. The bigger collections get, the less we can pay attention to each single volume. Browsing gets harder when there are hundreds and thousands of volumes to browse. When I got to college, a collection of three million volumes seemed huge; now with 4.5 million at ASU we rightly feel middle-sized. So books have a problem — and it's not a problem arising from the digital revolution. Whatever competition there is for attention from other and newer media, our books are competing with themselves. How do we keep the love alive? I mean that question seriously, for us librarians as librarians. How do we manage our collections and our buildings and our web presences in a way that lets us help our readers cherish our books — which are really their books — to good effect?
I think that the post-WWII strategy of buying more and more and building more and more shelves and crowding the field of attention has outlived its usefulness. We need to make books, especially printed books, special again. At Arizona State University, with the support of the Andrew W. Mellon Foundation, we have a plan. We're beginning a huge renovation of our largest stack tower, the 1966-built Charles Trumbull Hayden Library. (Hayden was the founder of Tempe, AZ, and the father of Arizona's first congressman, the redoubtable Carl Hayden.) When we come back to the refreshed building in a couple of years, it will have space for about 300,000 volumes out of our collection of 4.5 million items. (That number is a bit of a guess, but should let us showcase approximately half of our humanities collection. Discovery tools and enhanced physical delivery will put the rest in the hands of our users.) How will we help those books to stay alive and vital and visible, not letting them just become visual background noise, wallpaper, or furniture for students who know no other source of wisdom than their "devices"? We first started playing with these ideas a couple of years ago, when we hosted an exhibition by the contemporary artist Michael Mandiberg, who had downloaded the complete contents of Wikipedia and formatted them for printing. We worked with him to print a hundred or so (of the many thousands!) of the volumes and displayed them against wallpaper that gave outlines of hundreds more. We wanted to make users think about print. So as we go into renovation, we're (almost) ready to take our adventures further. First of all, we're bringing our special collections exhibits and reading room down from the back of the top floor to the main floor, with an open reading room and a traditional closed space beyond. We want students coming into the building to do their calculus homework to see things, things that suggest that there's more to the world than they've assumed, with things they can get to inspect, work with, and cherish.
The open reading room will emphasize our region and our communities, a "greater Arizona" that stretches from San Diego to Taos, from Monument Valley to Guaymas, and has been home to many different kinds of communities over thousands of years. The building will, of course, be full of work and study and group study spaces and what we've been calling "wizard spaces" for technology-assisted places of investigation and creation. The top floor, though, will offer users the quiet and private space they regularly demand, and there they will find some almost traditional stacks. Most of the books in the building will be there (to be sure, there will be some other provocative pockets of presence for them), and the atmosphere will say "Library." But what books? Arranged how? There for what purposes? How often refreshed? Answering those questions is the work we're doing now to be ready for the reopening in 2020. We have some hunches: (1) Don't be a slave to call-number ordering. LC call numbers represent a single taxonomy of knowledge from long ago. Set books free from that arrangement and let them talk to each other and to users in unexpected ways. (2) Make the setting attractive — mix higher shelf ranges with lower, give people space, don't be afraid to have enough room for the books to show off. (And gosh, might we start keeping some dust jackets for them? Is our history of destroying publishers' best effort to help the volume speak for itself such a good thing?) (3) Refresh the "merchandise." This will cost some money and effort, but we expect to find ways to cycle books through the in-building displays, so that users will see something different every few weeks or months — and pause to look at what's there. (4) Above all, engage the users. We think we can do a much better job of building relevant and engaging and inclusive collections if we find the right way to make the physical collection continuously reflect input, both passive and active, from our users. Passive input is what we record from their in-building use and checkouts; active input will come from focus groups, surveys, ongoing committees, and the engagement of our collections staff. (And one more thought: can we take the "sorting shelves" and move them up front? Call attention to them, Amazon-like, as our "best sellers"? Most libraries I know have hidden those shelves in a back corner, for all that power users know that they're the most sincere evidence available of what people in our community are reading now.) My colleague and lead on the Mellon grant, AUL Lorrie McAllister, and I spoke about this project at the Charleston Conference last fall, and we're building a network of colleagues at other institutions interested in what we're doing and in the analogous things they're doing and thinking of doing. We welcome contact with anyone with ideas that can help us make continued on page 77

ADVERTISERS' INDEX

27 Accessible Archives • 79 American Chemical Society • 5 ATG • 19 Casalini Libri • 65 The Charleston Advisor • 8 The Charleston Report • 9 Cold Spring Harbor Lab Press • 49 Emery-Pratt • 3 GOBI Library Solutions • 11 Harrassowitz • 2 IGI Global • 53 INFORMS • 80 Midwest Library Service • 37 The MIT Press • 13 Modern Language Association • 15 The Optical Society • 31 Readex, A Division of NewsBank • 17 Society of Exploration Geophysicists • 23 SPIE Digital Library • 7 Taylor & Francis Group • 57 WT Cox Information Services

For Advertising Information Contact: Toni Nix, Ads Manager, <justwrite@lowcountry.com>, Phone: 843-835-8604, Fax: 843-835-5892.




