Race and Cultural Landscapes

The Cultural Landscape Foundation. Nov. 10, 2017.

It was an honor to be the first interview for The Cultural Landscape Foundation’s new series on race and landscape. There are many narratives written into the fabric of the mission gardens, and some of the most significant revolve around the representation and erasures of the Native American past, particularly the history of the mission period and early California statehood. Despite decades of activism and some hopeful initiatives for more inclusive and critically reflective interpretation, the mission gardens remain paradoxical — historical yet timeless, beautiful yet violent, secular heritage sites yet sacred. Join the conversation about what the California Mission landscapes mean. https://tclf.org/race-and-cultural-landscapes-conversation-elizabeth-kryder-reid

Powered by WPeMatico

Who owns the past at the California missions?

The question “who owns the past?” has been asked about antiquities contested by museums and source nations (Kate Fitz Gibbons, James Cuno), about Indigenous narratives and anthropologists (IPinCH), and about the place of intellectual property in our cultural commons (Lewis Hyde).

At the California missions the question of “who owns the past?” is a multi-layered one. The majority of the historic sites are owned by the Catholic Church in some manner (Diocesan properties, a Catholic University, etc.), while two are owned and managed by the California State Parks. In cases such as Mission San Juan Capistrano and Mission San Jose, the sites are managed in partnership with not-for-profits. These administrative structures are formative in the framing of the interpretation of the past at the missions. Another layer is the tangled narrative of church and state. What is Catholic history? What is California history? And how do those two relate? The most pointed questions about who owns the past at the missions surround the place of the Native American past in the narrative (Deana Dartt, Phoebe Kropp).

Along with these deeply ideological aspects of the question “who owns the past?” are the quite pragmatic issues of controlling access to images in archival collections. Historic photographs and other visual culture related to the missions are held in the collections of museums, archives, historical societies, and the missions themselves. Much has been done to make these materials accessible to general audiences. Most of the larger institutions have digitized their collections. The Online Archive of California is a rich and remarkable resource that provides the public access to the collections of more than 200 repositories through a simple search interface. But anyone wanting to do more than view images, such as including them in publications or digital scholarship, must navigate the labyrinth of permissions and fees that many institutions require. There is a move toward more open access to digital collections. The Huntington Library, for example, delegates seeking copyright permission to users.

Louis Choris, Vue de Presidio Sn. Francisco
Louis Choris, Vue de Presidio Sn. Francisco (San Francisco Presidio), 1822. Yale Collection of Western Americana, Beinecke Rare Book and Manuscript Library.

Yale’s Beinecke Library provides downloadable high-resolution copies for free, noting that the library is “committed to providing broad access to its collections for teaching, learning, and research in accordance with Yale University Policy. The Beinecke’s Website, catalog records, finding aids, and digital images enhance scholarship and promote use of both the digital and the original object.”

The vast majority of repositories, however, still charge fees. For some, these permission and reproduction fees are seen as vital revenue. Particularly troubling, however, is the practice of subcontracting reproduction to for-profit companies, as when the University of Southern California Digital Library contracted the reproduction of some digital collections to Corbis (recently acquired by Getty Images).

The question of “who owns the past?” is a vital one at multiple levels, but for those trying to expand the voices telling that story, the sale of images to generate income or profit is a barrier that limits the democratizing of knowledge and the broader engagement of the public in curating their own history.


Adding new base layers to Neatline

Wayne Graham, Technical Director for the Council on Library and Information Resources and formerly the Head of Research and Development at the University of Virginia’s Scholars’ Lab (where he was an architect of Neatline), provides a file of additional basemaps on GitHub. These maps offer alternatives to Google Maps, including several options for Neatline base layers from major sources like Esri.

To add these maps to your Neatline, do the following:

  1. Go to https://github.com/waynegraham/neatline_basemaps and “Clone or download” the files.
  2. Open your finder and navigate to the “neatline-basemaps-master” folder that you just downloaded.
  3. From the “neatline-basemaps-master” folder, upload the “providers.js” file into the following folder in your Omeka install: [your omeka install name]/plugins/Neatline.
  4. In your downloads, in the “neatline-basemaps-master” folder, find the “layers” folder.
  5. Upload all of the files from this “layers” folder into the following folder in your Omeka install: [your omeka install name]/plugins/Neatline/layers
  6. The new base layers should now appear as options in the Exhibit Settings for Neatline exhibits in your Omeka.
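If you have shell access to your server, the copy steps above can be sketched as shell commands. The paths here are stand-ins: `$SRC` mimics the unzipped “neatline-basemaps-master” download and `$OMEKA` mimics your Omeka install root, and the stand-in files are created locally so the sketch is self-contained. Substitute your real download and install paths.

```shell
# Stand-in paths; replace with your real download folder and Omeka install root.
SRC=neatline-basemaps-master
OMEKA=my-omeka

# Create stand-in source and destination trees (a real install already has these).
mkdir -p "$SRC/layers" "$OMEKA/plugins/Neatline"
touch "$SRC/providers.js" "$SRC/layers/example-layer.json"

# Step 3: copy providers.js into the Neatline plugin folder.
cp "$SRC/providers.js" "$OMEKA/plugins/Neatline/"

# Steps 4-5: copy everything from layers/ into plugins/Neatline/layers.
mkdir -p "$OMEKA/plugins/Neatline/layers"
cp "$SRC"/layers/* "$OMEKA/plugins/Neatline/layers/"
```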


Enabling Google Maps in Neatline

My students and I were having difficulty loading Google Maps in Neatline, so I came up with two possible solutions. The first is to upload alternative map layers. The second is to obtain a Google API key and plug it into the Neatline code. After some googling, I discovered several threads that indicated that (at least in the past) there had been a problem with Neatline’s code for OpenLayers, which doesn’t play well with Google Maps as is, but could if a Google API key were added.

If you want to add an API key, here is what you do:

  1. Get a Google Maps JavaScript API key. These are free as long as you have fewer than 25,000 map loads per day. You can register for an API key here: https://developers.google.com/maps/documentation/javascript/get-api-key
  2. Make sure you have backed up your site recently. If you have not, it might be a good idea to do it now.
  3. Go to your Omeka install and navigate to this file: [Your Omeka Name]/plugins/Neatline/helpers/Assets.php
  4. Open the file in the editor.
  5. Find this line: nl_appendScript('//maps.google.com/maps/api/js?v=3.20&sensor=false');
  6. Rewrite the line to include your Google API key so that it looks like this: nl_appendScript('//maps.google.com/maps/api/js?key=[YOUR KEY GOES HERE]');
  7. Save the file.
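Steps 5–7 can also be done from the command line as a sed substitution. The sketch below runs against a stand-in copy of Assets.php that it creates itself; in a real install you would point FILE at plugins/Neatline/helpers/Assets.php and replace the hypothetical YOUR_API_KEY placeholder with the key you got from Google.

```shell
# Stand-in file; in a real install use plugins/Neatline/helpers/Assets.php.
FILE=Assets.php
printf "nl_appendScript('//maps.google.com/maps/api/js?v=3.20&sensor=false');\n" > "$FILE"

KEY="YOUR_API_KEY"   # hypothetical placeholder for your real Google API key

# Swap the old v=3.20&sensor=false query string for the key parameter (step 6).
sed -i "s#js?v=3.20&sensor=false#js?key=$KEY#" "$FILE"
cat "$FILE"
```

As always, back up the file (or your whole site) before editing plugin code in place.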





Doing Digital History 2016 White Paper Summary

During the summer of 2016, Sharon M. Leon and Sheila A. Brennan led a second Doing Digital History institute for advanced topics in digital humanities (IATDH) funded by the National Endowment for the Humanities, Office of Digital Humanities together with an amazing team of graduate student mentors and visiting scholars. The final report and white paper is available.

Doing Digital History 2016 offered 24 mid-career American historians an opportunity to immerse themselves in two intensive weeks of training focusing on the theories and methods of digital history. The results of the institute were impressive, with participants increasing their technical skills, their digital literacy, and their comfort with evaluating digital work.

The team was able to rely on the lessons learned from 2014 during the planning and design phases of the curriculum and the evaluation structure for 2016. By the end of the two weeks, everyone left with new skills, new understandings of digital methodologies, and a new appreciation for the work required to build and sustain successful digital humanities projects.

A major goal of the Doing Digital History institutes is to make a targeted impact on history faculty, their students and departments, and the field at large. To measure the institute’s overall effectiveness in changing attitudes and practices, we asked four questions related to our goals at the beginning and the end of each institute:

  • If you were asked to review a digital project for a professional journal in your field of expertise, would you feel comfortable saying yes to the request?
  • If you were asked to review a colleague’s digital work for promotion, would you feel comfortable assessing its scholarly impact?
  • Do you feel comfortable presenting or discussing digital history work with your colleagues?
  • Do you feel comfortable supervising students who want to use digital tools in their history scholarship?

By comparing the 2014 and 2016 institute data, we can see some interesting differences in the cohorts. Prior to DoingDH, the 2016 cohort was much more comfortable reviewing digital work for promotion than the 2014 group, but less willing to review digital projects for journals. The 2014 cohort was slightly more at ease in supervising students incorporating digital tools into their work, and more than half of each group felt comfortable discussing digital work with colleagues.

Comparison chart of 2014 and 2016 results

The post-institute surveys show that the 2016 cohort left with more overall confidence across each goal and achieved slightly more positive change in growth than the 2014 group. Even still, both groups experienced an impressive amount of personal and professional growth in two weeks!

In May 2017, we surveyed the DoingDH 2016 cohort one last time to gather some data about how each of them incorporated what they learned into their teaching, research, and professional development during the 2016-17 academic year:


Teaching
  • 61% used online publishing in their teaching
  • 61% used geospatial methods in their teaching
  • 28% used text analysis techniques in their teaching
  • 33% introduced data management concepts in their teaching
  • 28% blogged about their teaching


Research
  • 61% launched a digital project related to their work
  • 72% revised their own data management and research methods practices
  • 28% blogged about their research

Professional Advancement and Service

  • 78% talked to their administration about supporting DH work
  • 39% participated in a DH unconference or workshop
  • 33% taught a workshop for their colleagues based on things they learned at DoingDH 2016
  • 67% collaborated with a colleague on a DH project
  • 17% reviewed a DH project for a journal or online publication

The results are impressive. Our other aspirations of moving the field will take longer than a year and will require some additional research in a few years to more adequately assess the long-term impact.

After finishing our second IATDH introduction to digital history, we can affirm some of our findings from our 2014 white paper. Based on the applicant pools from 2014 and 2016, we see that there are still relatively few training opportunities at the novice level for faculty, and yet, it has not prevented history departments from asking their faculty members—prepared or not—to teach digital history courses. Preparing faculty to teach these courses, just like in public history, means more than simply reading the literature. It’s a methodological shift and we continue to believe that it is irresponsible of departments, colleges, and universities to assign faculty to teach digital history courses without providing the time and resources for professional development.

Interestingly in 2016, we received more applications from junior faculty and new PhDs seeking digital training to prepare them for the job market, because their graduate programs offered no courses or opportunities to learn digital methods. Some of these applicants wrote desperately hoping that they could participate in DoingDH 2016 to help them obtain a tenure-track job.

The Doing Digital History institutes are an effort to provide scholars with a very preliminary introduction to the theories and methods of digital history. As such, they are only a beginning. Our evaluation shows that they have made a significant impact in the field, but we all have much work left to do to raise the digital literacy of the core of mid-career colleagues.

DoingDH 2016 Tweets & Curriculum Wrapped & Ready

DoingDH 2016 Participants and Team


An amazing cohort of historians came to RRCHNM’s Doing Digital History 2016 in July as self-identified digital novices, unsure of their abilities to keep up with the work during the two-week institute. They all left with their own web domains, experience working in the statistical programming language R, and many ideas for new teaching assignments, research projects, and digital publications. Most important, each participant became more confident engaging with and reviewing digital scholarship, advising students wishing to do digital projects, and in learning to tinker with and ask questions of digital methodologies.

Throughout the two weeks, readings and discussions were coupled with demonstrations and hands-on work. Each participant established their own web domain; installed open source software (WordPress, Omeka, R, Audacity); implemented best practices for managing their research; made visualizations; built simple maps; learned how to plan a digital project; edited sound files; planned digitally-inflected lessons for their classes; and considered the implications of the changing field of scholarly communications.

Participants and team members worked in public throughout the institute. All readings, assignments, tutorials, and participant blog posts are contained within this website. And, we created a PDF of the entire DoingDH-2016-Curriculum for sharing and distributing under the CC BY-NC license. Additionally, the Twitter backchannel conversations using the #doingdh16 hashtag are Storified for a different perspective of each day.



Doing DH benefited from an enthusiastic corps of RRCHNM graduate students, Alyssa Fahringer, Eric Gonzaba, Jannelle Legg, and Spencer Roberts, who developed tutorials and use cases for incorporating different digital tools into teaching and research. They also provided moral and technical support to the participants, and managed the Twitter backchannel.

Doing DH also featured guest instructors from Mason’s History and Art History Department, Mason Library, and neighboring institutions. Lincoln Mullen shared his extensive expertise doing computational research over three days and Michael O’Malley led a day on sound studies. Jeri Wieringa of the Mason Publishing Group shared trends in scholarly communications and digital publishing initiatives. Denise Meringolo, a participant in Doing DH 2014, returned to discuss how her public history work at University of Maryland, Baltimore County, became digital. And, Jeff McClurken visited from University of Mary Washington to lead a day on digitally-inflected pedagogy.

By the end of Doing DH 2016, participants were tired, but also invigorated with ideas and three action items to implement over the next six months.

Day Eight —

A day about sound. Although I use sound-based sources all the time as someone who works in oral history, I don’t always think about them as sound, but as sources that, like other sources, can be crafted or manipulated to convey different meanings. Thus, I mostly disagreed with O’Malley’s assertion that sound sources are somehow more easily manipulated than other, particularly text-based, sources. Still, the session did make me think a lot about how manipulating sound (removing background noise, reducing highs and lows in interviews, all of which I’ve done …) can indeed change the way people perceive oral histories when they listen to them. So, I will definitely be more mindful when I start manipulating the sound of interviews in the future.
