Library Analytics: Shaping the Future — Communicating Library Impact through Annual Reports
by Kristin Hall (Instructional Designer, Stony Brook University Libraries)
and Janet H. Clarke (Associate Dean, Research & User Engagement, Stony Brook University Libraries)
Column Editors: John McDonald (EBSCO Information Services) <johnmcdonald@ebsco.com>
and Kathleen McEvoy (EBSCO Information Services) <kmcevoy@ebsco.com>
Academic libraries, like other units of higher education institutions, need to demonstrate value to their institutions. This is accomplished through a variety of methods, from formal publications and presentations to informal one-on-one conversations and social media posts. As discussed elsewhere about library assessment websites,1 annual reports are another key method of communicating library value to stakeholders.
Annual reports are formal organizational documents. They may serve internal or external purposes, and they provide a forum for communicating the goals, accomplishments, and directions of a unit or organization. At Stony Brook University Libraries, the Research & User Engagement (RUE) division — which encompasses or oversees liaison activities (information literacy instruction and research support), access services, and campus outreach — publishes annual reports on its activities, accomplishments, and goals. For over a decade, the primary purpose of the report was to inform the Dean of Libraries and RUE members of the division's output. It served as a handy source of statistics when University Administration requested certain data points, but it was mostly an internal document. Over the years, we tried to make the report more user-friendly with graphics and charts that added some data visualization, intending to make the Libraries — and especially RUE's work, because of its outward-facing mission — more visible and relevant to external stakeholders such as University Administration. Still, the report remained jargony and inward-focused. ACRL's 2010 Value of Academic Libraries and 2017 Academic Library Impact reports, which clearly describe the imperative to communicate with external stakeholders in deliberate, intentional ways, helped us reconceptualize the entire report in concrete and effective terms. We also drew on theories such as multimedia learning and cognitive load theory to design the annual report.
ACRL has identified specific areas of institutional missions that academic libraries can and do impact, and should explore to further increase their value: student enrollment, retention, graduation rates; student success, achievement, learning, experience, engagement; faculty research productivity, grant proposals, grant funding, teaching; and institutional reputation and prestige. These research agenda areas should be used to shape or revise library missions, visions, and strategic directions in collections, services, and programming to ensure that academic libraries contribute maximum value to institutional outcomes (ACRL, 2010).
The Impact Report outlines six priority research and action areas that can help libraries more effectively communicate their contributions to institutional missions:
1. Communicating the library's impact to the institution requires libraries to present their contributions in terminology that institutional and higher-education stakeholders can readily understand, to raise awareness of the library's participation in missional areas among those outside the library, and to leverage the library's unique position of serving all students and majors.
2. Matching library assessment to the institution's mission requires libraries to work with campus partners and departments on common issues and goals, to work with teaching and learning support services as well as faculty and students to build a culture of assessment, and to align assessment activities with the institution's strategic directions.
3. Including library data in institutional data collection requires that library data be part of the institution's systematic data collection processes and analyses, to better connect the library with research, teaching, learning, and student success.
4. Impact on student success has become the most significant way for institutions to demonstrate their value to stakeholders, and libraries can quantify their impact in this area with data and assessment of library resources, programs, spaces, library instruction, and other data points.
5. Libraries must show the ways they contribute to critical thinking, student learning, and engagement, and must use spaces, collections, and programs to enhance learning and engagement.
6. Libraries must collaborate with other partners and units on campus and at other institutions to improve student learning and success.
The Impact Report stresses that the first priority area — communicating the library's contribution to the institution — is the most important, and that the other five areas support it in more specific ways (46). Indeed, a library that is adequately achieving the other five priority areas but isn't communicating its value effectively, through reports and other methods, may still fail to demonstrate its value to its stakeholders. Lewin and Passonneau (2012) noted that "[i]nstitutions will not place high value on libraries if stakeholders cannot discern the positive impact library activities have on scholarship and teaching activities" (p. 91). Moreover, at least half of the 10 "next steps" identified in the white paper Library Integration in Institutional Learning Analytics (Institute of Museum and Library Services, 2018, p. 7) involve communicating value or prioritizing user stories and impact narratives to further facilitate library integration with institutional data and analysis of student learning and success.
With these ACRL recommendations in mind, we applied multimedia learning theory to guide the visual redesign and presentation of our divisional annual report. We wanted a report that external stakeholders could view, process, and easily understand, one that made clear the impact the University Libraries had on the University community.
Multimedia Learning Theory
Multimedia learning theory, developed by Richard Mayer (2009), rests on several assumptions, including what Mayer calls active processing or the SOI framework (selecting, organizing, and integrating information), the limited capacity of working memory, and dual coding. The theory is guided by several principles built on the fundamental belief that individuals learn better from words and pictures together than from words alone, and it informs the creation of multimedia materials that foster learning while reducing extraneous processing.
The SOI framework (Mayer, 2009) describes active processing during learning: selecting relevant information, organizing it in a meaningful way, and integrating the new knowledge into existing schemas. When learning new information, our working memory can process only a limited amount at one time; research has found the average person can hold seven plus or minus two pieces of information in working memory at once (Miller, 1956). When there is more information than an individual's working memory can handle, they can experience cognitive overload (Sweller, 1988). This is especially likely when the viewer has limited background knowledge (in this case, of academic library work) and when information is unorganized, because the viewer must spend working memory figuring out the meaning or organizing the information in a way that makes sense to them. If this is too complicated, the viewer may overlook important information or give up reviewing the material completely. This cognitive load can be reduced by organizing information for the viewer: grouping relevant information together (chunking), drawing attention to and emphasizing important information, and removing any information that is not essential.
The third assumption of Mayer's theory is dual coding: presenting information through both words and visuals, whether spoken or written words and static or moving images. Dual coding helps the viewer process and retain information more effectively.2 When the presenter uses meaningful words and appropriate images together, they are organizing and chunking information for the reader, which helps the viewer process it better.
Organizing information in a meaningful way was essential. We organized each area within the division around the mission, vision, and goals outlined in the Libraries' strategic plan. Each area highlighted its accomplishments for the year under each goal and provided statistics and images to support those accomplishments, framed in the higher-education language of student and faculty success. Each section was laid out the same way, so the areas were easily recognizable as the viewer turned the pages. This design was intentional: the viewer should not have to spend valuable working memory figuring out where to find information or unpacking library jargon. (See Figure 1.)
Icons were created and used throughout the annual report. These icons allow the viewer to easily recognize each area, including academic engagement, access and user services, research and emerging technologies, campus engagement, and assessment. (See Figure 2.)
Icons were also used on the back cover of the report to maximize the impact of the report’s highlights and to reinforce the dual coding utilized throughout the report. (See Figure 3.)
Another aspect of organizing information for the viewer is emphasizing important information. We did this throughout the report by using larger fonts and bold colors and by strategically placing this information where viewers' eyes would be drawn. Rather than listing all of our statistics, we organized and emphasized the numbers that we knew were important for our external stakeholders to recognize. (See Figure 4.)
Figure 1. Consistent organization in all sections, linking accomplishments back to the Libraries' strategic plan.
Figure 2. Use of icons and dual coding to brand the unit titles.
Figure 3. Strategic use of icons and dual coding principles on the back cover of the report for additional emphasis.
Redesign Outcomes
Using the ACRL's Impact Report as a roadmap, then, the principal goal of the newly redesigned report was to communicate the library's contribution to the University's mission of student success, faculty research and productivity, and diversity. These contributions are clearly outlined first, strategically framing the rest of the document (Stony Brook University Libraries, 3). In addition, the new design accomplished the following:
• We highlighted data points and library assessment that matched or resonated with the University's mission, such as campus partnerships that accomplished mutual goals for student and faculty success.
• We included data points that directly contributed to the University's data collection, such as the library's direct role in high-impact educational practices, including the general education learning outcomes.
• We quantified the Libraries' impact by highlighting data points that could be easily understood by external stakeholders.
• We showed the Libraries' year-round efforts at campus engagement, highlighting new programming and increased attendance at events.
• We recognized new campus and external partnerships to demonstrate the Libraries' active engagement with the communities it serves.
Challenges
As with any project, there were procedural and technical challenges in creating the annual report. There were obstacles in obtaining or gaining access to data: different units had different ways of reporting, and information was reported inconsistently (for example, health sciences instruction statistics vs. main campus instruction statistics). This required us to edit and rewrite some areas to produce a cohesive report. In addition, connecting the report to our strategic process required reflection and time that we didn't always have to invest. Technically, conceptualizing and creating the report was time-intensive. We were fortunate to have a staff member with a design background to lay out the report in Adobe InDesign; however, this meant that all editing fell to this one person. We plan to explore design tools that are familiar to more staff.
Conclusion
We are pleased with the improvements to the report's content and visual presentation. Going forward, we would like to incorporate impact narratives that can further integrate library data with institutional data and analysis of student learning and success. We would also like to use this report as a template for other levels of reporting, vertically and horizontally across the library organization, so that we are all intentionally and consistently incorporating ACRL's recommendations for communicating library value into our reporting practices. Our full annual report is available at https://library.stonybrook.edu/wp-content/uploads/2019/06/RUEAnnualReportFinalSinglePages.pdf.
Figure 4. Example of emphasizing important information with font, colors, and placement on page.
References
Association of College & Research Libraries. (2017). Academic Library Impact: Improving Practice and Essential Areas to Research. Prepared by Lynn Silipigni Connaway, William Harvey, Vanessa Kitzie, and Stephanie Mikitish of OCLC Research. Chicago: Association of College & Research Libraries.
Association of College and Research Libraries. (2010). The Value of Academic Libraries: A Comprehensive Research Review and Report. Researched by Megan Oakleaf. Chicago: Association of College and Research Libraries.
Hall, K., and Clarke, J. H. (Forthcoming). Communicating Library Impact through the Assessment Website. In the 2018 Library Assessment Conference Proceedings.
Institute of Museum and Library Services. (2018). Library Integration in Institutional Learning Analytics. Prepared by Megan Oakleaf. Accessed November 2018.
Lewin, H. S., and Passonneau, S. M. (2012). An Analysis of Academic Research Libraries Assessment Data: A Look at Professional Models and Benchmarking Data. The Journal of Academic Librarianship, 38(2), 85-93.
Mayer, Richard. (2009). Multimedia Learning (2nd ed.). Cambridge: Cambridge University Press.
Miller, George A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63(2), 81-97.
Paivio, A. (2006). Mind and Its Evolution: A Dual Coding Theoretical Approach. New York: Psychology Press.
Stony Brook University Libraries. (2018). Research & User Engagement 2017-2018 Annual Report. Retrieved from https://library.stonybrook.edu/wp-content/uploads/2019/06/RUEAnnualReportFinalSinglePages.pdf.
Sweller, John. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257-285.
Endnotes
1. Hall and Clarke, "Communicating Library Impact through the Assessment Website," in the 2018 Library Assessment Conference Proceedings (forthcoming).
2. For a more in-depth review of dual coding theory, see Mind and Its Evolution: A Dual Coding Theoretical Approach by Allan Paivio.

Epistemology
Column Editor: T. Scott Plutchak (Librarian, Epistemologist, Birmingham, Alabama) <splutchak@gmail.com> http://tscott.typepad.com
I was so sure she'd been there having dinner with me. I remembered so clearly how she looked, what she'd ordered. But she was equally adamant she'd never been to that restaurant, certainly not with me, not that evening years back.
“I can prove it!” I’ve kept a daily log for over thirty years. There’d be evidence. But I was wrong. I’d been there alone. I’d written her a letter, described what she would have eaten if she’d been with me, imagined what she would’ve been wearing. Then I created a memory that was better than real. I guess love can do that to you.
Our memories are notoriously flawed, a fact that has fascinated me ever since I first came across the work of Elizabeth Loftus when I was a teenager. Oh, wait. Her first books didn’t come out until I was in my early twenties. I just checked.
So I’ve learned not to trust my memories. I started keeping a journal around age fourteen (I think — that first notebook’s in a box in the attic, so I can’t easily dig it out to see the date). I keep five year diaries and the daily logs. Multiple ways to cross check what I remember happening against what I can document.
Our cultural memory is grievously flawed as well. This morning there's news of a candidate for U.S. Senator from Alabama ranting against the "homosexual activities" and "wife swap shows" on television destroying our country's moral core. He says we need shows that promote "the biblical principles on which our nation was founded."1 Yesterday over lunch I read Jill Lepore emphatically pointing out in her book These Truths that the country "was not founded as a Christian nation." As she points out, "[t]he United States was founded during the most secular era in American history … It is no accident that the Constitution does not mention God."2 The candidate has cultural memory. Lepore has documentation. Lots of it. Which is stronger?
The documentation is unreliable. My diaries and logs are far from complete. How does any diarist know which of a given day's events will be the ones that matter the most when it comes to understanding a life in full? So it is with the stuff of history. "Most of what once existed is gone. Flesh decays, wood rots, walls fall, books burn. Nature takes one toll, malice another. History is the study of what remains. … [M]ost of what historians study survives because it was purposely kept — placed in a box and carried up to the attic, shelved in a library, stored in a museum, photographed or recorded, downloaded to a server — carefully preserved and even catalogued. All of it, together, the accidental and the intentional, … is called the historical record, and it is maddeningly uneven, asymmetrical, and unfair."3
Umberto Eco and Jean-Claude Carrière make similar points in their charming dialogue This Is Not the End of the Book. Eco emphasizes, "… we are not even-handed with the cultural objects that we choose to preserve. For example, if you want to buy an original of one of the great comic strips it is horribly expensive, because they are so rare…. But why are they so rare? Simply because the newspapers that used to publish them threw the plates in the bin the moment the strip had been printed."4 The successful printers of the incunabula age made their money from indulgences, playing cards, and pornography, almost none of which survives. Gutenberg's glorious bibles, which we've carefully preserved, bankrupted him.
Carrière adds, "… if you look carefully, a horrific proportion of our libraries is made up of books written by the utterly talentless, or by halfwits and crazy people. The great majority of the 300,000 scrolls kept in the Library of Alexandria are bound to have been complete rubbish."5 We romanticize what's been lost.
We fetishize the documentation of the past. As written records have shifted to digital formats over the past three decades, there’s been an endless drumbeat of dire warnings over what’s being lost. The kinds of records — letters, for instance — that have been the basis of so many fine literary biographies simply aren’t being written anymore. When I took up the position of Director of the Lister Hill Library in 1995, and wanted to trace the arc of certain key decisions made by my predecessors, there were binders of chronologically arranged memos for me to refer to. Within a year or two I abandoned the practice of writing and printing official memos. My communications became more informal, decisions traced and documented in emails. What memos there were existed only in digital form. I worried that my successors wouldn’t be able to trace my decision making in the same way. True enough — not in the same way. But my digital files still exist. And not just the official end products. Much more of the context has been saved than would’ve been the case with paper. Despite the use of the delete key, most of us do a poor job of organizing and pruning our digital files. So I’ve got twenty years of my time at Lister Hill on a flash drive, ready to be mined. But what do I do with it?
When UAB’s archivist, Tim Pennycuff, was assiduously trying to wrest the boxes of personal papers from the office of one of the previous university presidents, Dr. McCallum kept putting him off, saying that first he needed to do some weeding. “No, no!” we’d cry. “Let Tim do the sorting. He’s the professional.” We knew what McCallum wanted to expurgate. His was an era of deals and favors and bargains behind closed doors, scores kept and vendettas waged. What he’d eventually allow into the archives would be the “clean” record. The official account. Far from the whole truth.
That kind of cleansing is much more difficult now. (Using “deep fakes” to rewrite the historical record is an entirely different matter). There is a far more detailed record of the minutiae of the life of a typical Facebook user than could ever have been gathered for even a fairly prominent person of a hundred years ago. But who is charged with keeping it?
It's not that we lack the stuff; we lack the tools and systems and workflows to make sense of it all. Here's Nicholas Carr: "When we complain about information overload, what we're usually complaining about is ambient overload. This is an altogether different beast. Ambient overload doesn't involve needles in haystacks. It involves haystack-sized piles of needles. We experience ambient overload when we're surrounded by so much information that is of immediate interest to us that we feel overwhelmed by the neverending pressure of trying to keep up with it all."6
That digital stack of needles upends Lepore's comment about the historical record being mostly things "purposely kept." The archivist sorts and sifts a messy carton of hodge-podge corporate records, selects what they believe to be the most telling and relevant ones, and catalogs and preserves them. Archival best practices were developed in the world of print because it wasn't practical, in any sense, to keep everything. But with a mass of digital records, perhaps it isn't practical, in any sense, to do anything other than keep everything. Take this flash drive containing twenty years of my decision making. Sure, there's a lot of junk in there. But what would be the point of having a trained archivist sort through it, deleting everything that they decided isn't an essential part of the historical record? The sensible thing is to use the kinds of refreshing techniques that digital preservationists now have in hand to keep the whole corpus together and mine it for whatever insights and stories can be revealed. I used to tell skeptical administrators that the archivist's superpower wasn't knowing what to keep, it was knowing what to throw away. But the 21st-century archivist needs to be a data and text miner rather than a careful selector.
When designing a data collection tool, one tries to imagine all of the questions one might want to ask of the data. You want to avoid the situation of coming up with a great question and realizing you can't address it because you didn't gather the right data in the right format at the start. Librarians and archivists and curators have made educated guesses over the centuries about what to preserve and what to discard. Those decisions, even more than the accidents of war and fire, have constrained the stories that historians can tell us about how we came to be.

In the last decades of the 20th century there was a great deal of justified anxiety over the preservation of digital formats. We've learned a lot since then. We understand redundancy and error-checking and transporting from older formats to new. The challenges aren't technological as much as they are social and organizational. Process.

The failure of the Library of Congress's project to establish an archive of all of Twitter is a cautionary tale of opportunity lost.7 Although it was launched with great fanfare, LOC was never able to muster the resources that would've been required for it to live up to the hype. I'm sympathetic to the budgetary and technical challenges that the project presented, but saddened nonetheless at the decision to revert to print-world principles of selection. Now only those tweets "with historical significance" will make it to the archive. How can we know? We can't tell if something will endure through time if we don't keep it in the first place.

It's been the responsibility of the "memory institutions" — libraries, archives, museums — to maintain the historical record over centuries. That's shifting. Scholarly journal publishers are sharing responsibility for developing preservation programs and protocols that are beyond the capabilities of most libraries. Think Portico. LOCKSS. But these touch only a tiny fragment of what constitutes "the culture." The corporate behemoths that own the servers on which our digital culture resides haven't made long-term preservation a priority. Twitter and LOC took a stab at it. But it's going to take more robust partnerships, led by the experts in the memory institutions and funded by the corporations building the global infrastructure, to figure out who is going to be responsible for what and how it is all going to be paid for.

I hope they hurry. I need to know where to send that flash drive.

Endnotes

1. Koplowitz, Howard. "John Merrill: 'Homosexual activities, wife swap shows' ruined TV." Al.com. July 15, 2019. https://www.al.com/news/2019/07/john-merrill-homosexual-activities-wife-swap-shows-ruined-tv.html
2. Lepore, Jill. These Truths: A History of the United States. W.W. Norton & Co., 2018. pp. 199-200.
3. Lepore, Jill. These Truths: A History of the United States. W.W. Norton & Co., 2018. p. 4.
4. Eco, Umberto, and Jean-Claude Carrière, with Jean-Philippe de Tonnac. This Is Not the End of the Book. Northwestern University Press, 2012. p. 17.
5. Eco, Umberto, and Jean-Claude Carrière, with Jean-Philippe de Tonnac. This Is Not the End of the Book. Northwestern University Press, 2012. p. 200.
6. Carr, Nicholas. "Situational overload and ambient overload." Rough Type: Nicholas Carr's Blog. March 7, 2011. http://www.roughtype.com/?p=1464.
7. Daley, Jason. "The Library of Congress Will Stop Archiving Twitter." Smithsonian.com. December 27, 2017. https://www.smithsonianmag.com/smart-news/library-congress-will-stop-archiving-twitter-180967651/.
by Meghan Burke (Metadata/Electronic Resources Librarian, Marymount University) <mburke@marymount.edu>
Column Editors: Stacey Marien (Acquisitions Librarian, American University Library) <smarien@american.edu>
and Alayne Mundt (Resource Description Librarian, American University Library) <mundt@american.edu>
Column Editor's Note: In this issue's column, we profile how one library decided to pilot a collection evaluation project. Meghan Burke, Metadata/Electronic Resources Librarian, and Gwen Vredevoogd, Collection Development Librarian, at Marymount University describe a holistic approach to assessing their collections based on program reviews. — SM & AM
Introduction
While we are constantly performing basic collection maintenance, the librarians at Marymount University realized it had been well over a decade since the library collection had been assessed for content. Program collection profiles were woefully out of date, and the collection needed a more intensive evaluation.
Problem
Assessing library collections in a meaningful way at a small liberal arts university can be challenging. At Marymount University, our small size and curriculum-driven collection mean we need more granular information to truly analyze our collection. Luckily, there is no shortage of studies on assessing library collections, and a review of the literature led us to Madeline Kelly's method of holistically assessing library collections program by program,1 which seemed to provide what we wanted. In spring 2019 we began to plan how to pilot this method over the next academic year. In the spring and early summer, we identified the programs to be assessed, the assessment cycle, and the data to be gathered. Our fiscal year ends in the summer, so that was when we gathered the information the liaison librarians would need and developed a template to guide what they should be assessing and to capture observations and decisions made during the process.
Process
By conducting collection assessment at the programmatic level, we hope to determine strengths and weaknesses in the relevant content areas and formats in order to establish any changes needed in our support of the degree program. Specifically, does the program review indicate new content areas or shifts in focus that we need to consider? And what types of resources are most useful to the program? (For example, some disciplines may prefer electronic over print, or vice versa).
We decided to select several programs that had conducted program reviews in 2018-2019, gathering data that would inform how best to support each program moving forward, determining whether the areas supporting the program needed to be weeded or updated, and revising the program profiles that detail how the library's collection supports the program. We wanted a mix of graduate and undergraduate programs of different sizes. When several programs that had undergone review fell to the same librarian, we asked them to select just one program to assess. The program reviews conducted by departments gave us an idea of how each department plans to meet its goals over the next five years, so ideally we would assess each program every five years.
Once we determined how we wanted to evaluate the collection and which programs to include, we needed to determine the data points to use to assess the collection. For our pilot year, we chose a mix of qualitative and quantitative data. Liaison librarians will be provided with the program review itself, along with the most current data available on the use and perception of the library's print and electronic collections in the relevant subject areas. On the quantitative side, the data include:
• Electronic resource usage statistics for 2018 for databases, e-journals, eBooks, and streaming media, in COUNTER format when available
• Total cost and calculated cost-per-use of each electronic resource (a minimal calculation sketch follows these lists)
• A report of physical item circulation by Library of Congress classification
• Interlibrary loan statistics for both print and electronic resources, and data on the books requested through our consortium-loan service, to identify gaps in the collection
• Database overlap analysis, upon request
On the qualitative side, in addition to the program review, we will evaluate:
• Sample syllabi and course assignments from the chosen programs
• Collections-related data from the library's bi-annual faculty satisfaction survey, in which faculty members self-identify by department
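The cost-per-use figure in the quantitative list above is simple arithmetic: a resource's annual cost divided by its total recorded uses for the year. As a minimal sketch of how it could be computed in bulk (this is not a description of our actual workflow, and the file name and column names below are hypothetical), a short script might read a spreadsheet export that joins invoice costs with COUNTER totals:

```python
import csv

def cost_per_use(rows):
    """Compute cost-per-use for each electronic resource.

    Expects rows with 'resource', 'annual_cost', and 'total_uses'
    columns (e.g., COUNTER report totals joined with invoice data).
    Resources with zero recorded uses are flagged with None instead
    of triggering a division by zero.
    """
    results = {}
    for row in rows:
        uses = int(row["total_uses"])
        cost = float(row["annual_cost"])
        results[row["resource"]] = cost / uses if uses else None
    return results

# Hypothetical export with columns: resource,annual_cost,total_uses
with open("eresource_usage_2018.csv", newline="") as f:
    for name, cpu in sorted(cost_per_use(csv.DictReader(f)).items()):
        label = "no recorded use" if cpu is None else f"${cpu:.2f} per use"
        print(f"{name}: {label}")
```

Flagging zero-use resources instead of dividing by zero is deliberate: those titles are often the strongest candidates for cancellation or reallocation once the assessment results come back.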
We would also like more direct measures of student use of the collection. Possibilities for gathering this information include evaluating a selection of bibliographies from student work (available through our Office of Planning and Institutional Effectiveness), conducting an informal student survey about the collection, perhaps at the circulation desk, or having librarians track which resources are taught and most frequently used during library instruction sessions.
Outcomes and Anticipated Challenges
When we identify changes that need to be made to support the programs assessed, the final steps will be to weed the collection, update it with new, more relevant content, cancel or reallocate funds toward more useful electronic materials, and, if necessary, adjust our budget allocations for print and e-resources to best reflect the programs' needs and use of the collection. We have some reservations about the sustainability of assessing the collection this way. While assessing the collection program by program is much more thorough and breaks down the process so we are not trying to assess the entire collection in one year, it is a process that will need to be repeated each year as programs complete their reviews. Assessing the collection by program also places the responsibility for assessment on the liaison librarian, who will then consult with Collections to make decisions about the collection. While liaisons are responsible for collecting and weeding in their content areas, this assessment is more prescribed and in-depth than a summer weeding project or the regular selection of materials. Despite these concerns, we are excited to see the results of this year's pilot and hope it will allow us to curate a collection that serves our students and faculty as well as possible.
Assessing the Pilot
After the initial pilot is complete, we should have the information to produce updated program profiles and the ability to easily identify areas to weed or update and new resources or tools to purchase or subscribe to. The three librarians participating in the pilot will also provide feedback about the process itself and the usefulness of the data points we provided. Once we have completed the first round, we will make any necessary improvements to the process and determine whether this is the best method for collection assessment.
Tips (So Far) for Starting Your Own Holistic Collection Assessment
• Use the program review as a jumping-off point to make the assessment more meaningful and timely by tying it to a preexisting cycle.
• Identify the data points you already collect and select those that would be most useful. Explore ways to add useful data points you do not already collect, if possible.
• Start small and see if the approach works locally, so you can resolve any issues before it becomes programmatic.
• Make sure you have support from library faculty and staff, both to gather data and to help with the assessment pieces. Larger institutions may need a dedicated position to provide support.
• If you use Alma, become familiar with generating reports in Analytics, or make friends with the person in your library who does.
Endnote
1. Kelly, M. M. (2014). "Applying the Tiers of Assessment: A Holistic and Systematic Approach to Assessing Library Collections." The Journal of Academic Librarianship, 40(6), 585-591. https://doi.org/10.1016/j.acalib.2014.10.002.