The horrific tragedy that took the lives of nine parishioners of Emanuel African Methodist Episcopal Church in Charleston, South Carolina, and adversely affected five survivors on June 17, 2015, shook the world. Individuals, families, organizations, members of all denominations, and even the incarcerated have reached out to the victims’ families, survivors, and congregants to express their sympathy. Tangible evidence continues to arrive in the form of cards, letters, flowers, posters, paintings, books, artwork, quilts, and prayer shawls, expressing an outpouring of love, concern, and sympathy.
Members of the Charleston Archives, Libraries, and Museums (better known as CALM) volunteered their time and archival skills to organize, relocate, and inventory the numerous gifts. A Memorabilia Subcommittee was established to define a collection policy and processing procedures. CALM’s mission, in part, is to “preserve the history of the moment for the future, help tell the story to others and through the use of the collection, contribute to building a better, stronger, more united community.”
The first task was the maintenance of the temporary public-initiated memorial outside “Mother Emanuel.” CALM members worked on a daily rotation to remove gifts of stuffed animals, posters, candles, balloons, Sweetgrass roses, and fresh and artificial flowers. Days of torrential rain and flooding presented constant challenges in retrieving and preserving items, many of which sustained damage. The large shrine, which spanned the length of the church, was discontinued two months later due to upkeep and time constraints.
The second ongoing task was to sort and catalog the countless cards, letters, emails, textiles, and artwork sent in the mail. The City of Charleston provided two rooms at the Saint Julian Devine Community Center, an after-school children’s facility, for provisional storage of the donations. One room, holding cards and letters along with 400 shawls and quilts, comprised 1,000 feet of space. Artwork and large memorabilia were contained in the second room. Prior to inventorying, CALM members researched the collection policies and best practices of sites of mass tragedy, including Sandy Hook Elementary and the Boston Marathon bombing.
Our volunteerism commenced in the evenings after our daily archival positions. We divided duties, with one group processing the correspondence and another working with the shawls and quilts. Most of the prayer shawls received were made by church “Shawl Ministries.” We recorded each piece’s measurements, fiber content (wool blend or acrylic), design type (knit or crochet), and donor name, if indicated. The article was photographed and the information saved in a computer database. Cards and letters were individually noted in the database with sender and date. The final step was packaging items in archival, acid-free boxes and labeling them for long-term storage. When the donations outgrew the Center, the Roman Catholic Diocese of Charleston donated temporary space for the expanded holdings. Looking to the future, Mother Emanuel desires a permanent location with a professional archivist to maintain the collection. Naturally, adequate and ongoing funding will be needed for the Church to fulfill its goals.
Mother Emanuel commemorated the first anniversary in 2016 by displaying a small temporary exhibit of prayer quilts in a City of Charleston building located close to the Church. This year, the Church, with the assistance of Brockington and Associates, a cultural resources consulting firm, installed “The Light of Hope,” an expanded exhibition incorporating memorabilia and various portraits of the Emanuel Nine.
Lessons learned: It is crucial to give family members, parishioners, and members of the clergy the time and space to decide how and when they want to handle their donations. In times of grief, it is challenging to make decisions, much less rational ones. If anything, we learned the virtue of patience and sensitivity.
While CALM members organized and inventoried the initial massive influx of Mother Emanuel’s donations with the goal of preserving history, it is ultimately the Church’s decision what to do with the materials. The Avery Research Center, along with numerous invested repositories in and around Charleston, is capturing images and documents from the days, months, and years that followed this tragedy. One example is the online digital exhibit “A Tribute to the Mother Emanuel Church,” which highlights the outpouring of expressions of condolence locally and worldwide.
The Emanuel Nine: DePayne Middleton Doctor, Cynthia Graham Hurd, Susie Jackson, Ethel Lance, Reverend Clementa Pinckney, Tywanza Sanders, Daniel Simmons Sr., Sharonda Coleman Singleton, and Myra Thompson.
Georgette Mayo is currently the Processing Archivist for the Avery Research Center for African American History and Culture at the College of Charleston. She received her BA in African American Studies (Phi Beta Kappa) and master’s degrees in Library Science and Public History, with a concentration in Archival Management from the University of South Carolina.
With the enormous explosion of podcasts following the breakout success of Serial (2014-2016), many archives and special collections are turning to audio to give space for extended storytelling and to highlight the work of their archivists, curators, and faculty.
This is not that story.
Iterative Process Creates a Podcast: Historically Yours, the podcast from Special Collections at the University of Iowa, took more than two years to create, as we tested versions of the idea, adapted them, tested them again, and adapted them again. The central question guiding the process was: What do we do with manuscripts in social media spaces?
Special Collections at the University of Iowa has had a robust social media presence for more than five years. However, social media feeds inspire a type of interaction that privileges quick connection to visual material as it scrolls by, leading to a heavier reliance on photography and rare books. Both are visual and have interesting aspects that can be grasped and understood quickly in a feed with minimal description. That fits both the format of most social media feeds and the limited staff time available to produce content for them quickly.
Presenting on the University of Iowa Special Collections’ social media outreach at the Midwest Archives Conference in 2014, I was asked an important question in the Q&A: What about manuscripts? In a quickly scrolling feed, one manuscript can look like any other manuscript. I did not have an answer at the time: manuscripts are harder. The compelling and addictive aspects of historic research are contained in the question, the quest, and the connections: the context. Context is something a rapidly scrolling social media feed does not support well; it takes time to develop and time to deliver.
So I set out to solve that problem: What format would support just enough context and personality to really bring a historic document to life, without being so overwhelming to produce that it could not be repeated and sustained?
My first answer to that question was a pilot video project called “History Out Loud” featuring a miscellaneous manuscript letter collection in Special Collections that I had always wanted to feature in some way. The MsL collection has thousands of individual letters with no collection and no context. Thinking of our fast social media posts, I determined that the equivalent of an Instagram post with a manuscript letter would be a video of a person reading the letter out loud. We piloted this video project with a test run of five short videos.
In reviewing the footage, it became clear to me that the visuals were getting in the way. A person’s posture, the set around them, and even their facial expressions did not add to the experience of hearing the letter; rather, they took away from it, distracting from it. It was easier to pay attention to the letter as audio alone. The content was telling us that it wanted to be a podcast.
Then came Serial in October of 2014. A podcast entered into American popular culture to such an extent that it garnered a parody sketch on Saturday Night Live in December of that year.
Refinement: With the recent explosion of interest in podcasts, I started expanding my own listening beyond the few radio-based formats I had traditionally consumed. In particular, recent humanities podcasts have been inventing new storytelling formats. Armed with that knowledge, the concept grew and changed. Inspired by The New Yorker Fiction Podcast, instead of using one reader I added a guest to read the letter and took on the role of host. Dear Hank and John provided a format for adding theme music, a quote about letter writing, and a tagline. The project grew from “Let’s read a historic letter” to being inspired by the question “What can we learn from just one letter?” It changed from promoting a single historic document to an exploration of letter writing past and present, and of the spark of inspiration that makes historic research so compelling, with our staff and guests’ full personalities and passions included. With that shift in focus, the name “History Out Loud” no longer captured its essence, and we switched to “Historically Yours.”
With the format set, we were poised to record and discover all the problems and challenges recording in the library.
What you need to know about file storage: The mass familiarity of sites like YouTube for video makes it seem as if making a podcast is as simple as choosing a site, uploading files, and going. However, unlike YouTube, podcast distribution sites like iTunes do not store files; they only make them available via an RSS feed from a hosting site. The actual files need to be stored somewhere, and most file hosting options are not free, or have enticing free tiers that in the end only allow enough storage and bandwidth for a few episodes at a time. Archive.org can be used for free (and provides a tutorial); however, paying for a service gives you access to analytics. Historically Yours is hosted on Podbean.com, which also offers step-by-step explanations of the resolutions and formats your image and sound files should be in to properly connect with iTunes, which was very helpful. Other options include SoundCloud and Libsyn, both of which have an entry-level tier with analytics for $7/month. Podbean.com, however, had a tier for just $32/year while still including analytics, so I chose Podbean.
It is important to think of these sites like a storage locker. As Dana Gerber-Margie pointed out in her talk at the Midwest Archives Conference in 2017, the files can be deleted and lost the moment you fail to pay. The site does not and will not back up your files so archiving your work needs to happen at your institution.
None of the hosting sites are able to generate an invoice, so working with your institution to find a solution for paying might be a place where the process can stall.
Equipment & Editing: It took us six months of trying and adjusting every piece of equipment in every closet to arrive at the right combination of equipment and location to get good sound and move on with the project.
Quiet place to record (HVAC hum will also be audible)
Microphone (an omnidirectional microphone is needed if recording two people with one mic)
Computer for editing
Program for editing (Audacity or Audition)
In the end, our sound solution was to treat the podcast like an oral history. The Zoom H4N recorder we use for recording oral histories (~$150) doubles as a podcast recorder. We tested many USB microphones and even cell phone microphones, but the Zoom was the best at picking up two voices. We are able to set it between us, hit record, and go. Other setups required us to find an omnidirectional microphone so that a single mic could pick up two people facing each other. We are not audio engineers! It was tricky to get the sound right; the Zoom solved our audio issues.
For editing, we use Audacity, which is free to download. The buttons are minimally marked, so the great tutorials available online are genuinely needed; with their help, I picked it up and was editing within 15 minutes. At first each episode took an hour to edit, but editing gets faster every time.
Get a designer: There is an important and obvious step that I missed along the way: You will need a thumbnail and a header image for your podcast. The thumbnail is very important in inspiring people to listen to your podcast, so do not skimp on this step. I started a Twitter, Facebook, and blog for the podcast as well, so the thumbnail and header image had to be resized and reformatted for each site. This took a good deal of time and should be factored into the schedule. I did not have access to a designer, so our team worked with Canva, the online graphic design software, to create the thumbnail and headers.
Sharing the RSS feed:
Once you have your perfect first episode and design, and you have paid for a hosting site and put it all together, there is one more step before fully launching your podcast. Once our Podbean site was set up, I submitted the feed to iTunes, and it took two days for our RSS feed to be approved. If you have announced a specific launch date for your podcast, this waiting period could slow you down. You may need to upload your files to your hosting site, submit the feed to iTunes, wait for approval, and only then fully roll out the advertising for your podcast, once it has been approved by the various podcast sites.
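For reference, the RSS feed the hosting site generates on your behalf is just an XML file listing channel metadata and per-episode audio enclosures. A minimal sketch of its shape, with entirely hypothetical URLs and values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
  <channel>
    <title>Historically Yours</title>
    <link>https://example.podbean.com</link>
    <description>What can we learn from just one letter?</description>
    <!-- the thumbnail image iTunes displays -->
    <itunes:image href="https://example.podbean.com/thumbnail.jpg"/>
    <item>
      <title>Episode 1</title>
      <!-- the enclosure points at the hosted audio file -->
      <enclosure url="https://example.podbean.com/ep1.mp3"
                 length="12345678" type="audio/mpeg"/>
      <pubDate>Mon, 01 May 2017 12:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Directories like iTunes simply poll this file for new `<item>` entries, which is why the files themselves must live on a hosting site.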
Results/What We’ve Learned: So far Historically Yours has five published episodes and is averaging 100 downloads per episode. The highest number of downloads always comes on the first day. That first day download number has increased with each episode, so the podcast is growing a healthy base community.
In each episode of Historically Yours, we call on our community to help us with the research. From the very first episode, a listener was inspired to do a bit of historic research about the letter and emailed us the results of their searching, identifying more information about the theater fire described in the letter. For the next episode, I will read listener feedback letters on the air, and we expect these connections and response letters to increase once listeners hear them read.
The podcast gives us a chance to feature our staff and graduate students as real, idiosyncratic, passionate people who love research, and it seems to be inspiring responses from like-minded, passionate history nerds. It seems the perfect method to reduce fear of the institution or its professionals by connecting with and inspiring new users.
The steps to make a podcast are not all that difficult, but like any creative work, the end result is improved by testing, critiquing, and changing. If you have the space to invest in the concept in bursts, without a tight timeline, you can troubleshoot the process along the way, learn from those who have gone before, and create a really meaningful way to connect with your users.
Finding a quiet place to record was our biggest challenge. Have your recording location identified (and tested) before proceeding far. Between HVAC banging, construction, door slams, and interruptions, many locations may not be feasible.
Give your podcast a home on your blog as well. We post a transcription of the letter we are reading each episode to our blog along with an image of the letter.
Our followers asked for the RSS feed to be added to: iTunes, Pocket Casts, and Stitcher, as well as its home on Podbean.
It might be good to upload three episodes at once to start with, especially for a short podcast – having several episodes to binge at once can build a fan.
If you are using a single microphone and one person’s voice is deeper or quieter, put the microphone closer to them.
Get multiple memory cards and a card reader.
People trying out the podcast will listen to the first episode. Over time, episode one will have the most downloads and will be the most important. It’s your commercial. Do your best to get episode one right.
Historically Yours: Historically Yours is asking the question: What can you learn from just one letter?
Host: Colleen Theisen
Guests: Staff, graduate students, faculty, and friends
Theme music: Will Riordan
Editing: Colleen Theisen and Farah Boules
As we say on the podcast – DON’T FORGET TO WRITE!
Colleen Theisen is the Outreach & Engagement Librarian for Special Collections & University Archives at the University of Iowa Libraries, where she coordinates social media, including a Tumblr named “New and Notable” by Tumblr in 2013, the YouTube channel, “Staxpeditions,” and the podcast “Historically Yours.” She started her career as a high school art teacher and completed her Master of Science in Information in 2011 at the University of Michigan School of Information. In 2015 she was named a Library Journal Mover & Shaker. She’s on Twitter @libralthinking.
In January 2014 I started my position at Texas A&M University with Cushing Memorial Library and Archives, which holds the University Archives and Special Collections at A&M. Our only digital presence consisted of a Flickr account hosting items from the University Archives and some items from Special Collections that had been put into the institutional repository (OAK Trust). Eight months after I started, a new Associate Dean of Special Collections and Director of Cushing Library was hired. The new director and I began to voice our opinion that we needed to increase our presence on the web, but also to have a system to handle both digitized and born-digital materials. In time, the Dean of the Libraries organized a retreat for interested parties, and out of that a task force was formed to investigate digital asset management (DAM) tools and come up with a recommendation for implementation.
In the fall of 2014, the task force was established with the objective of investigating and making recommendations for a solution or solutions that would enable the Texas A&M University Libraries to store, display, and preserve digitized and born-digital university records and research. In the spring of 2015, the charge expanded to include attention to broader campus needs.
After defining an assessment process and expanding our scope to include campus, the task force first worked to conduct a campus needs assessment, to identify and develop use cases, and to distill core requirements. This became the basis of our testing rubrics. We ran multiple stages of assessment to identify and test systems, as well as to analyze the results of those tests. A recommendation was reached on the basis of this analysis and further inquiries.
Our analysis of twenty-six systems allowed us to confidently assert that no one digital asset management product would meet library and campus needs. Broadly, “digital asset management consists of management tasks and decisions surrounding the ingestion, annotation, cataloguing, storage, retrieval, and distribution” of image, multimedia, and text files. These tasks are performed by systems (DAMS) that differ in their approach to functions and range of associated capabilities. Given campus needs, and our experience as a leading developer with DSpace, which the Libraries uses as our IR, the task force was attuned to the particular importance of the data models embedded in these systems, which guide and constrain other functionality.
We were convinced that modular solutions to discrete needs for storing, displaying, and preserving digital assets are warranted, and that these solutions are likely to require customization. We recommended building a digital asset management ecosystem (DAME) rather than attempting to meet all needs with a single DAMS.
The choice of the word ecosystem, as opposed to “system” (as with a DAMS) is explained by the DAME’s emphasis on a distributed service architecture. This is an architecture in which the discrete roles of a DAMS are handled not by one application, but instead by a collection of applications, each one suited for the role it plays. The DAME’s structure will certainly vary from institution to institution, and in fact this flexibility is perhaps the DAME’s strongest quality. In general, a DAME’s ecosystem will be divided into the following layers:
In the DAME, the management layer is conceived of as a collection of web services that handle record creation, curation, and discovery. It does not itself handle the actual assets; instead, it records the assets’ locations and metadata and allows for the management and retrieval of this information. The management layer should consist of at least two elements: a custom web service and a repository with a fully featured application programming interface (API). The repository application can be one of the many popular DAMS solutions currently in use, the only requirement being that it exposes all desired functionality through an API.
It may seem that a repository with a fully featured API would be sufficient to satisfy the needs of the management layer, but there are several good reasons for including a custom web service. The first is that this web service acts as the interface for all communication with the management layer, which makes the DAME repository-agnostic. All other applications in the ecosystem are programmed against the consistent API of the custom service, and the job of interfacing with the repository’s API is left solely to the custom web service. If the decision is made to switch repositories, the only thing that needs to be updated is the custom web service; the rest of the ecosystem will not notice the change. The second reason for this separation is that it allows you to employ multiple repository solutions side by side, with the web service aggregating responses. Finally, in record retrieval, the authentication and authorization of the user can be handled by the custom web service, relieving the repository of any need to be compatible with the institution’s authentication and authorization strategy.
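The facade pattern described above can be sketched in a few lines. This is a minimal illustration, not the Texas A&M implementation: class and method names are hypothetical, the "adapter" holds an in-memory dict where a real one would issue HTTP calls to Fedora's REST API, and the real service would expose these methods as HTTP endpoints.

```python
from abc import ABC, abstractmethod

class Repository(ABC):
    """Adapter interface: write one small adapter per backend repository."""

    @abstractmethod
    def get_record(self, record_id):
        """Return the record's metadata dict, or None if not held here."""

class FedoraAdapter(Repository):
    """Hypothetical adapter; a real one would translate these calls into
    requests against Fedora's REST API."""

    def __init__(self, records):
        self._records = records  # in-memory stand-in for the repository

    def get_record(self, record_id):
        return self._records.get(record_id)

class ManagementService:
    """The custom web service: one consistent API in front of any number
    of repositories, with auth handled here rather than in the repo."""

    def __init__(self, repositories, is_authorized):
        self._repos = list(repositories)
        self._is_authorized = is_authorized

    def get_record(self, user, record_id):
        if not self._is_authorized(user, record_id):
            raise PermissionError(f"{user} may not read {record_id}")
        # Aggregate across repositories; the first one holding the
        # record wins. Swapping backends never changes this interface.
        for repo in self._repos:
            record = repo.get_record(record_id)
            if record is not None:
                return record
        return None
```

Because every other application talks only to `ManagementService`, replacing `FedoraAdapter` with an adapter for a different repository leaves the rest of the ecosystem untouched.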
This management layer thus communicates with the persistence layer, which is not necessarily one of the more complicated portions of the DAME’s architecture. It is simply the data source, or collection of data sources, needed to support the repository. Most repositories that would work well in the DAME are likely to offer varied options for persistence, making the persistence layer one of the more flexible aspects of the DAME. In general, this layer stores the assets’ URIs and metadata, and possibly application-specific information needed by the presentation layer.
The preservation layer, which had already been under development, would continue and be integrated into the new system. A processing layer would be connected to local redundant storage, which in turn would be connected to rarely accessed dark-archive storage.
Every system that we tested consisted of different tools and components, bundled together as a single system. Part of the argument for a DAME over a DAMS is the ability to determine the components in these bundles locally, and to swap them out to meet evolving needs.
With that in mind, the task force recommended the deployment of modular digital asset management components to meet the complex needs of the Texas A&M University Libraries and campus. These include:
The deployment of a system to manage and store digital assets and metadata. Our recommended open-source system is Fedora 4, to be coupled with Blacklight and Solr for search and retrieval. Solr indexes content managed by the repository, and Blacklight enables search and retrieval across the indexed content.
The development of custom user interfaces as appropriate (likely, public user interface and administrative interfaces).
The deployment of a triple store to enable linked data, along with Apache Camel and Fuseki as the basis for connecting Fedora to the triple store and to the Solr index.
The deployment of an exhibition system. Our recommended open-source exhibition layer would be Spotlight, which is an extension to Blacklight and will easily integrate into our DAME.
The deployment of a preservation system consisting of Artefactual’s Archivematica connected to local redundant storage. The redundant storage is in turn connected to the dark archive of the Digital Preservation Network (DPN) and to Amazon Glacier via DuraCloud.
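To give a concrete sense of how the search components in the list above fit together: Blacklight ultimately issues HTTP requests against Solr's `/select` handler over the content Solr has indexed from the repository. A minimal sketch of building such a request follows; the base URL and core name (`dame`) are hypothetical, while `q`, `rows`, `fl`, and `wt` are standard Solr query parameters.

```python
from urllib.parse import urlencode

def solr_select_url(base_url, query, rows=10, fields=None):
    """Build a Solr /select request URL of the kind Blacklight issues.

    base_url points at a Solr core, e.g. http://localhost:8983/solr/dame
    (hypothetical); the result can be fetched with any HTTP client.
    """
    params = {"q": query, "rows": rows, "wt": "json"}
    if fields:
        params["fl"] = ",".join(fields)  # restrict the returned fields
    return base_url.rstrip("/") + "/select?" + urlencode(params)

url = solr_select_url(
    "http://localhost:8983/solr/dame",
    'title_t:"commencement"',
    rows=5,
    fields=["id", "title_t"],
)
```

In the DAME, this is exactly the kind of call that stays hidden behind the discovery interface, so the index can be rebuilt or swapped without affecting end users.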
The development of the ecosystem has started. The Libraries’ IT team has begun bringing up Fedora 4, along with the other components recommended by the task force. As mentioned above, the preservation layer had already been in development, and the final kinks in that part of the system are being worked out. The hope is that the ecosystem will be fully functional within a year.
Overall, the work of the task force was beneficial. Input from a number of stakeholders brought forward desired functionality that any one group of users might not have considered. Special Collections had a very strong presence on the task force, as did our preservation unit, which had very similar ideas; the two groups regularly work together. The addition of subject/reference librarians and catalogers, and the expertise of the Digital Initiatives group (Library IT), brought yet other perspectives. Having some university representatives also gave us an idea of what units around Texas A&M require when dealing with digital materials. The task force sent out surveys to a number of units on campus, and we gathered a large amount of useful information. At a minimum, I now know of some units with large amounts of electronic files that we will have to prepare for in the near future as we bring up the DAME and continue to develop our digital archiving process at Texas A&M.

In the end, this diverse group, with expertise in a number of areas, allowed us to test a large number of software solutions. We were able to robustly test the functionality of these solutions and to collect data on their strengths and weaknesses. The solution of a DAME built on Fedora 4 and a number of other open-source components might not work for other institutions, as we are heavily reliant on the expertise of our IT staff to bring all of these pieces together, but the process of creating a task force from a diverse group (including those outside the library) was beneficial. We now have buy-in that had not existed before from multiple units in the library, and interest from outside the Libraries, specifically in materials related to the University Archives.
Greg Bailey is the University Archivist at Texas A&M University, a position he has held since January 2014. Prior to that, he served as the University Archivist and Records Manager at Stephen F. Austin State University. He is currently a member of the College and University Archives Section’s Steering Committee.
Over the past year, the University of Louisville has experienced an unprecedented level of administrative turnover in the presidency, the board of trustees, and the leadership of the University of Louisville Foundation (ULF), the charitable organization that helps support the activities of the University. The unfolding events added urgency to the development and clarification of some of the University Archives’ policies: issues we had been “working on” for years – including email preservation – could no longer be considered a theoretical problem.
While it is difficult to identify the true “beginning” of the turmoil, by March 2016, the Board of Trustees and Faculty Senate were discussing votes of no confidence in the president. There were also concerns about the president’s dual role as head of both the university and the ULF. We wanted to preserve a record of the events, of course, but we also wanted to capture the impact it had on attitudes and opinions within our larger community. We had a longstanding tradition of supplementing the official university record with clippings from newspapers, magazines, and (more recently) blogs, and we continued collecting these materials. However, the colleague who had been responsible for this activity for decades found himself newly concerned: what if the president’s office learned we had several file folders’ worth of newspaper clippings containing comments critical of him? Would the Archives, or the Libraries as a whole, suffer as a result? After a brief discussion at a staff meeting, we agreed we had to continue “clipping” in order to preserve a complete and accurate picture. I offered to take any responsibility for the activity, trusting in the twin shields of our duty to preserve the record as objectively as possible and my status as a tenured faculty member. As it turned out, we were the least of the president’s office’s concerns.
In June 2016, the Republican governor dissolved the existing board and replaced it with a new, smaller board. He also announced that the president would tender his resignation to the new board as soon as it was installed. The Democratic attorney general filed suit, arguing that the governor did not have the authority to dissolve the board; a judge reinstated the old board pending the outcome of the suit. We were unsure who our “real” board of trustees was, but nonetheless, the president negotiated his exit and departed.
While my colleague continued to clip print articles, we also knew there was a lot more going on online. Stories were breaking daily about the president, his exit, the board of trustees (both of them), the governor, and the attorney general. As we had for a special project in 2009, we used a short-term Archive-It account. Our Archivist for Manuscript Collections and I worked from Google alerts to create an individual “seed” for each story. While we still didn’t have the financial resources to use Archive-It on an ongoing basis, we made as much use of it as we could. When we exhausted the remaining space on our Archive-It account, we began preserving web-based stories as PDFs. Given our time and budgetary constraints, this seemed our best alternative: PDF/A is an acceptable preservation format; the vast majority of the stories did not contain relevant links, only links to advertisements; and PDFs are easy to provide access to.
With the president’s exit, I also realized we had come to a major fork in the road: I had to talk to the President’s Office about obtaining his electronic files, particularly his email. This was completely new territory for us. The Archives was at an interesting juncture in other ways, as well: our records manager had recently departed, and I was working with a couple of other colleagues to cover his responsibilities while we searched for a new Archivist for Records Management. In the interviews I did my best to explain our tentative entry into electronic records – as a founding member of the MetaArchive Cooperative, we had plenty of experience with digital preservation, but less with the ingest of digital files from university offices – and hoped we could recruit someone who was interested in developing this program with me. (We did!)
At the same time, I pursued access to the former president’s files. I contacted the interim president, who was now responsible for the records of the Office of the President. He was immediately supportive, but I still had to convince Information Technology (IT) that copies of the files could – in fact, should – be transferred to us. In my initial conversations with IT, I learned that the former president had not saved many files to his assigned network space; the assumption was that his assistants created his documents. But he had plenty of email.
And here we ran into a surprising roadblock. While the University Archivist is named in the university’s governance document (the “Redbook”) as the custodian of university records, IT was nervous enough to confer with the University Counsel’s office. And while I had anticipated concerns about the speed with which we might make the material available, the Counsel’s office was actually worried about attorney-client privilege. That is, they were concerned that by releasing privileged email to us, they would essentially be sharing it with a third party and thus nullifying the privilege. Like most college presidents, ours had been named in suits against the university, sometimes simply by virtue of being the head of the institution. We ultimately agreed that email between the former president and individuals at specific law firms (identified by the domain names in their email addresses) could be filtered out of the material we received. While this is somewhat less than optimal, we know the files will be maintained by IT pending several “litigation holds” (i.e., they cannot be destroyed until the litigation is resolved), giving us a chance to follow up with them again after the dust has settled.
Our new Archivist for Records Management worked with IT’s specialist to use the Library of Congress’s Bagger application to “bag” the .pst files (in 10 GB “chunks”) and transfer them to the Archives. We still have to face the issues of long-term preservation and access, but at least the files are in our possession.
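For readers unfamiliar with “bagging,” the core ideas are simple: a checksum manifest that lets the receiver verify every payload file, plus (in our case) splitting a large transfer into size-limited chunks. The sketch below is a minimal, standard-library illustration of those two steps only — it is not the Bagger application itself, and a real bag (per the BagIt specification, RFC 8493) also includes a bagit.txt declaration and a data/ payload directory.

```python
import hashlib
import os

def sha256_of(path):
    """Compute the SHA-256 checksum of a file, reading in blocks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def make_bag_manifest(payload_dir):
    """Build BagIt-style manifest lines: '<checksum>  data/<relative path>'.
    This sketch shows only the checksum step of creating a bag."""
    lines = []
    for root, _dirs, files in os.walk(payload_dir):
        for name in sorted(files):
            full = os.path.join(root, name)
            rel = os.path.relpath(full, payload_dir)
            lines.append(f"{sha256_of(full)}  data/{rel}")
    return lines

def chunk_by_size(paths, limit_bytes):
    """Group files into chunks whose total size stays under limit_bytes,
    mirroring the 10 GB 'chunks' used for the .pst transfer."""
    chunks, current, size = [], [], 0
    for p in paths:
        s = os.path.getsize(p)
        if current and size + s > limit_bytes:
            chunks.append(current)
            current, size = [], 0
        current.append(p)
        size += s
    if current:
        chunks.append(current)
    return chunks
```

The manifest is what makes the handoff auditable: after transfer, the Archives can recompute each checksum and confirm that nothing was corrupted in transit.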
In January 2017, we learned that our interim president was departing as well. In his case, it was to take the presidency at another institution, so the circumstances were happier. And since we had worked through the technical and organizational issues, the process of transferring his email went off without a hitch. While we certainly expected to cross these bridges under calmer circumstances, I am almost (almost) grateful that we were forced into action. We might do things differently next time, but we were able to develop and act on a reasonable plan in a short period of time. The approaches we worked out under these pressing circumstances are at least a starting point – something concrete we can modify and build on – rather than theoretical musings.
Carrie Daniels is University Archivist and Director of Archives and Special Collections at the University of Louisville. She holds an MSLIS from Simmons College and an EdM from the Harvard Graduate School of Education.
In a January 4, 2016, special report on Digital Humanities in Libraries, published in American Libraries Magazine, Stewart Varner and Patricia Hswe posited that “Stripping digital collections down to core components could render everything old new again in terms of what libraries might offer to the humanities research community.” Taking this approach to heart, the George J. Mitchell Department of Special Collections & Archives at Bowdoin College Library has been exploring new ways of leveraging metadata about our digital and physical collections to support a burgeoning interest in the digital humanities and computational studies across the College’s curriculum.
Historically at Bowdoin, perhaps like all libraries, we have regarded metadata first and foremost as a functional tool — the description necessary to assist researchers in their inquiries and librarians in their management of collections. Digital humanities, data curation, and new technologies, such as open source data visualization software, prompt us, though, to consider our metadata in a new light. Metadata, considered more abstractly, is one of our most valuable and important collections. Our metadata reflects decades of work by dedicated staff and volunteers, who applied their energy and expertise to analyze, synthesize, and interpret our physical and, more recently, our digital materials.
Over the past year, Special Collections & Archives staff have been integrating metadata in our teaching and outreach efforts. Along with promoting our rich collections of rare books, manuscript and archival holdings, we are actively looking for opportunities to suggest and support the integration of our metadata in faculty teaching and research.
These efforts have coincided with and been expedited by the largest digitization project yet undertaken by Bowdoin—the Howard Digitization Project. Funded by the National Historical Publications and Records Commission, this project allowed us to digitize the entire archive of General Oliver Otis Howard and his two brothers, Charles and Rowland, all of whom graduated from Bowdoin and served in the Civil War. The project produced some 180,000 digital images representing more than 80,000 physical items. Another key aspect of the project was the modernization of metadata about the collection.
Volunteers had thoroughly indexed the incoming letters to Oliver Otis Howard over the course of several decades. Project staff were able to scan, OCR, normalize, and then augment this information to create a comprehensive dataset about the Howard letters. Helpful for managing the digital files, this dataset also offers rich opportunities for the application of digital humanities methodologies and tools. The index includes more than 40,000 records with data points such as the letter’s author, recipient, date, and the place where it was written and received.
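An index like this lends itself to very simple computational queries. The sketch below, using only Python’s csv module, shows the kind of date-range filter such a dataset supports; the column names and sample rows are hypothetical stand-ins in the shape described (author, recipient, date, places written and received), not the actual Howard index schema.

```python
import csv
from io import StringIO

# Hypothetical sample rows in the shape described in the text;
# the real index's column names and contents may differ.
SAMPLE = """author,recipient,date,place_written,place_received
W. T. Sherman,O. O. Howard,1864-11-12,Atlanta GA,Savannah GA
O. O. Howard,C. H. Howard,1865-06-01,Washington DC,Portland ME
E. Whittlesey,O. O. Howard,1866-03-20,Raleigh NC,Washington DC
"""

def letters_in_range(rows, start, end):
    """Return index rows whose ISO-format date falls within [start, end].
    ISO dates (YYYY-MM-DD) compare correctly as strings."""
    return [r for r in rows if start <= r["date"] <= end]

rows = list(csv.DictReader(StringIO(SAMPLE)))
hits = letters_in_range(rows, "1865-01-01", "1866-12-31")
```

A filter of exactly this sort, joined to the place columns, is what drives a map view of the correspondence for a chosen time period.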
In their article, Varner and Hswe asserted that the “representation of digital collections in various data formats may lead to creative programs and partnerships for instruction, collection development and strategy….” And that has indeed been the case at Bowdoin.
The Howard correspondence index has quickly found a place within the College’s curriculum. In Fall 2016, Clare Bates Congdon, Visiting Associate Professor of Computer Science, had students in her Interactive Data Visualization course work with this rich data set. The resulting visualizations, while simple, are compelling. Researchers, for instance, can select a date range to then see a map of Oliver Otis Howard’s correspondence for that time period. By expanding the chronological range, a researcher can quickly see the arc of Oliver Otis Howard’s career as reflected by his correspondents. The visualization shows a clear southward and westward expansion of Howard’s social network as he transitioned from Civil War general to commissioner of the Freedmen’s Bureau and a major figure in Reconstruction to a participant in the Indian Wars.
While learning new technologies and the ingredients for a successful interactive data visualization, the students also were exposed to the biography of a compelling 19th century Bowdoin graduate. According to Congdon, working with Bowdoin-specific data made the exercise more meaningful for the students and encouraged them to engage with the assignment on multiple levels. The exercise also established a collaborative and mutually beneficial relationship that will continue between the Computer Science Department and Special Collections.
The Howard correspondence visualization project, along with other examples of faculty utilizing metadata from Special Collections, was the topic of a recent faculty development lunch and learn. Organized to mark the conclusion of the Howard digitization project, the well-attended event proved highly successful, and several faculty members have subsequently indicated an interest in incorporating metadata from Special Collections into their teaching and research. Current and future projects include visualizing the personal library of one of the College’s founders based on our bibliographic records of his collection, mapping the travels of a nineteenth-century female botanist based on our index of her plant samples, and experimenting with textual analysis to explore transcripts of nineteenth-century Bowdoin student letters to see what this might reveal about what a liberal arts education offered 150 years ago.
Kat Stefko is Director of Special Collections & Archives at Bowdoin College and current chair of the College & University Archives Section of SAA. She has held library and archives positions at Duke and Harvard universities, Bates College, and the Philadelphia Museum of Art.
This past year, Z. Smith Reynolds (ZSR) Library at Wake Forest University (WFU) wanted to engage students interested in writing outside of the classroom and offer them the chance to become published authors. The inspiration for Writers Camp @ ZSR came after a group of ZSR librarians heard Jane McGonigal present “Find the Future: The Game” during the American Library Association’s 2014 Annual Conference.
During the summer of 2015, the Writers Camp @ ZSR committee was formed, and the Writing Center was brought in to help plan, market, and lead the event, scheduled for Friday, January 29, 2016. Members included ZSR’s Instruction and Outreach, the Wake Forest Writing Center, Personal and Career Development, the WFU Press, and Special Collections & Archives. The group decided the starting point for each student’s writing odyssey would be an artifact from the University Archives.
The event commenced at 3 pm with a reception in the Special Collections Research Room. With some help from the Demon Deacon himself, the writers made a personal connection to the assigned artifacts that would inspire their works. Drawn from throughout Wake Forest history, the variety of artifacts from the University Archives was sure to cultivate written work in genres from poetry to short stories and more. Author and professor Jenny Puckett provided a brief lesson on “going down the rabbit hole,” the phrase she used to describe the oftentimes never-ending adventure of tracing the history and context of an artifact. At 7:00 pm, the group of talented Wake Forest University students who had applied and been selected to participate returned to the Library, ready to spend the entire night in the building (which, per its regular hours, was closed to the rest of campus). They were ready to challenge their creative abilities and participate in a writing event unlike any other.
At 7:30 pm the students returned to the library atrium to kick-off the evening. After a few encouraging words from the Director of the Writing Center, the writers scurried off into the depths of ZSR, from the darkest basement corner to the serene 6th floor catwalk overlooking the atrium. A midnight pizza delivery provided a nice break for several writers, and many student writers appreciated the endless pots of coffee and late night snacks that were set up in the atrium (known as ‘Writers Camp Command Central’ during the event). An incredible group of tutors from the Writing Center offered assistance and advice into the wee hours of the morning.
Twelve hours and eight pots of coffee later, thirty-three works had been submitted; after editing, they were published in ebook and print formats, with several special editions designated for Special Collections.
This unique event was a lot of work—obtaining grant support from the university, marketing, reviewing applications, selecting artifacts, staying up all night, editing the student work—but it was definitely worth it. Our students are now published authors who participated in a one-of-a-kind event. Special thanks are due to our Outreach Librarian Hu Womack for overseeing all the details!
Tanya Zanish-Belcher currently serves as Director, Special Collections and University Archivist for Wake Forest University in Winston-Salem, North Carolina. She received her B.A. (1983) in History from Ohio Wesleyan University and an M.A. (1990) in Historical and Archival Administration from Wright State University in Dayton, Ohio. She is Past President of the Midwest Archives Conference and a past member of the Council for the Society of American Archivists (2012-2015). She was named an SAA Fellow in 2011, in 2016 was elected SAA Vice President/President Elect, and will serve as SAA’s 73rd President in 2017-2018.