Monthly Archives: September 2022

Mapping Praxis: Taos Pueblo

I created an ArcGIS storymap from recent photos of my parents’ trip to New Mexico because I am interested in learning about Native American tribes and culture. The text in the storymap is copied from the UNESCO World Heritage Centre site, which is included as an embedded link.

https://storymaps.arcgis.com/stories/8f55254612444f7f99d14568be0d02b8

I found storymaps relatively easy to navigate when creating a product. I experimented with different functions like the 3D map, adding a button, embedding links, and adding videos. My final product is more the result of trial and error than of watching tutorials. I did, however, view other storymaps, and I am excited to continue creating and working on my skills to produce a storymap like the one below, Traveling to Taos.

Praxis: The Eviction Map

Re: Analysis of a Digital Humanities Project and Lightning Talk

The first thing I noticed when researching digital humanities projects to write and talk about is how esoteric and historical in nature most of them are, like Maria Popova’s list of projects, which included “Salem Witch Trials of 1692” and “London Lives from 1690 to 1800,” or Alan Liu’s list, which included “Geoparsing 19th-Century Travel Narratives.”

I was looking for something more contemporary and relatable and immediately useful. A Google search returned more of the same from institutions across the world, including a project called “Creating immersive, interactive environments for engaging with ancient Egyptian coffins.” This is not a topic I want to spend any of my time exploring. 

Although in my last blog post I embraced the “big tent” of digital humanities, in practice, writing about digital humanities projects, or anything related to the field, is challenging when there’s no clear definition. I went outside of the course recommendations to an area I’m familiar with: the law & justice sector. There were a few options I could explore, and I chose the National Eviction Map project from the Eviction Lab program. I figured it relates to one of the focuses of our curriculum, Data Visualization and Mapping, and thus would qualify as a digital humanities project.

The majority of poor renting families in America spend over half of their income on housing costs, and eviction is transforming their lives. Yet little is known about the prevalence, causes, and consequences of housing insecurity.

Enter the Eviction Lab – a team of researchers, students, and website architects who believe that a stable, affordable home is central to human flourishing and economic mobility. 

Drawing on tens of millions of records, the Eviction Lab at Princeton University published the first ever dataset of evictions in America, going back to 2000. Their goal is for the data to be used by policymakers, community organizers, journalists, educators, non-profit organizations, students, and citizens interested in understanding more about housing, eviction, and poverty in their own backyards.

Now, about the DH project that is the National Eviction Map: eviction cases are civil lawsuits filed by landlords to remove tenants from rental properties or collect past-due rent. Records of eviction cases are typically held in electronic case management systems or paper files in the local court where the case was filed. The mapping project used three strategies to collect eviction case data: bulk requests for electronic records from all state courts, requests for aggregated counts of eviction filings at the county level, and the purchase of proprietary individual-records data from LexisNexis Risk Solutions. Researchers can use the data to help document the prevalence, causes, and consequences of eviction and to evaluate laws and policies designed to promote residential security and reduce poverty.

Eviction Lab has a Twitter account, @evictionlab, with over 12K followers and the tagline: “The Eviction Lab is helping neighbors and policymakers understand the eviction crisis.” In July 2022 they put out a Twitter thread about the new edition of the map. News outlets regularly cite the lab and the map when reporting on eviction-related issues.

Presner & the Mangle of DH

Reading Presner was very engaging for me and solidified the debate over the function of DH: whether to ‘critique’ or to ‘build.’ I believe Presner built a strong foundation to support the view that one in DH must build, which he did by threading a connection between the Frankfurt School’s theory and Liu’s critique before continuing from there. The Frankfurt School philosophers articulated the difference between a ‘traditional theory,’ which is about ‘the pursuit of factual knowledge,’ and a ‘critical theory,’ which is a continuing development of Kantian philosophy holding that factual knowledge is not enough: one must not only be aware of knowledge but also engage with the social world. This was an elegant three-pointer, to use a basketball metaphor. And a way to put this into practice is to use a ‘negation,’ to shake things up or to punk it (a rock’n’roll metaphor), which leads to a ‘mangle.’ This, I believe, is what he means when he states, “DH is experimental, dirty, and completely suffused by social and material dialectics of resistance and accommodation.” Punk rock lives! Mangle is what one gets when things are shaken and no longer rarefied, when the structures are crumbling, when ideas are fraying, or a new chapter is being written.

I will make only one more observation, which concerns his elaboration of utopia. He made an excellent point about what is not said when one states a social criticism. What is not said, and only hiddenly implied, is that the author has a vision of something better which is not brought into the conversation. (This is thought of as a strength or virtue by most people.) As he states, “Nowadays utopian ideas have a bad rap because they appear hopelessly naive or programmatically prescriptive; however, without an idea of change for the better, there can be no constructive social critique.” I think this is something that has to be stated and not just implied by the person giving the critique. It must be articulated as part of the process, which is where Presner leads by emphatically stating that, “For the DH, there is a utopian idea at its core: participation without condition.”

Presner definitely set DH in a philosophical framework to expand the essence of the traditional humanities for the next century, articulating the humanities’ social, participatory, and ethical dimensions and leading to a better, more inclusive world.

dh project analysis: the quantified drama

THIS WEEK, as I was analyzing a specific DH project, the following aspects of our readings provided a lens of investigation. 

  1. The discussion in Ramsay’s and Rockwell’s text about what constitutes scholarship and whether there is a way to have scholarship without the “discursive elements”: the question of whether a prototype can be theory. Can a thing be scholarship without discourse that illustrates and articulates the maker’s thinking about the thing?
  2. And the above text’s intersection with Presner’s call for a more expanded definition of critical discourse. Presner notes the field’s obligation to consider and widen what critical theory encompasses. He especially advocates for critical theory to consider and expand into that “which might or could be,” the utopian.

So, above, we have one argument for questioning the reliance on traditional scholarly discourse as a necessary ingredient of scholarship, and one argument for leaning more into the potential of critical discourse and expanding its place in scholarship.

Adjacent to these seemingly conflicting considerations, I also felt that …

  3. …the distinction between qualitative and quantitative data underlined in D’Ignazio and Klein’s work on Data Feminism plays an important role in considering what is legible (for what reader?) as scholarship, especially if it is presented without much explanatory or interpretive discourse. Here is their footnote 42:

“People often say that there are two broad kinds of data: quantitative data, consisting of numbers (e.g., how many siblings you have), and qualitative data, consisting of words and categories (e.g., what color is your shirt?). As we will show in chapter 4, any time there is a binary, there is usually also a hierarchy, and in this case it is that quantitative data can be incorrectly perceived as “better” than qualitative data for being more objective, true, generalizable, larger scale, and so on. Feminist researchers have consistently demonstrated the need to collect qualitative data as well, as they can often (but, of course, not always) capture more nuance and detail than numbers.”

The lens these three observations created made me look at how specific DH projects integrate discursive context and how leaning on qualitative vs. quantitative data might influence the presence or absence of traditional scholarly context. I finally chose to take a closer look at the DH project “To See or Not to See” – an Interactive Tool for the Visualization and Analysis of Shakespeare Plays – that seemed to present itself directly to the user/reader without much context.

[Full disclosure: I also chose this because my academic journey began in theater history, I trained and worked as an actor and a playwright, and I continue to create work for the stage that investigates collisions of stage and screen.]

Here is the link to the project’s page, specifically the page for Hamlet: http://www.thomaswilhelm.eu/shakespeare/output/hamlet.html

Essentially, the project turns the plays’ text into quantitative data (counting words and lines) and then color-codes which character/speaker is associated with the text. It shows one entire play on one “page.” Even at first glance we can already see how much stage time (i.e., graph space) each character’s text occupies. We can get a sense of the characters’ movement through the story and easily track their (especially the protagonists’) arcs.
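
To make the quantification concrete, here is a minimal sketch of this kind of counting, assuming a simplified layout where speaker names appear alone on a line in capitals (my own illustration, not the project’s actual pipeline):

```python
import re
from collections import Counter

def words_per_speaker(play_text: str) -> Counter:
    """Count the words spoken by each character in a plain-text play.

    Assumes speaker names appear alone on a line in ALL CAPS
    (e.g. "HAMLET") with their dialogue on the following lines --
    a simplification of how the Folger texts are laid out.
    """
    counts = Counter()
    speaker = None
    for line in play_text.splitlines():
        stripped = line.strip()
        if not stripped:
            continue
        # Treat a short all-caps line as a speaker heading.
        if stripped.isupper() and len(stripped.split()) <= 3:
            speaker = stripped
        elif speaker:
            counts[speaker] += len(re.findall(r"[A-Za-z']+", stripped))
    return counts

sample = """
HAMLET
To be, or not to be, that is the question.

OPHELIA
Good my lord, how does your honor for this many a day?
"""
print(words_per_speaker(sample))
# Counter({'OPHELIA': 12, 'HAMLET': 10})
```

From counts like these, the visualization’s proportions (how much graph space each character occupies) follow directly.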

WHAT DATA IT WORKS WITH:

The project works with a stable and finite data set: the Folger Shakespeare Editions text (via the Folger Editions archives). The Folger text of Hamlet, e.g., is very likely complete and will not change. So questions of accommodating growth don’t have to be central for the creators.

The project focuses exclusively on the quantifiable aspects of Shakespeare’s plays, i.e., the countable aspects of language: word and line counts.

Which already brings up the question of what it doesn’t work with. (See the last section of this post.) 

WHAT IT LOOKS LIKE: 

Like many other data visualizations I encounter in my daily reading life, the overall structure of this project takes full advantage of our familiarity with interpreting a grid. The x-axis traces the play from beginning to end, leaning on what Kurt Vonnegut illustrated when he proposed that all stories could be turned into graphs (https://www.youtube.com/watch?v=oP3c1h8v2ZQ).

The y-axis lists the play’s characters, likely in order of total time present, or perhaps in order of relevance in relation to the protagonist. But there is also an implicit – or is it explicit? – arrangement by class (royals at the top). This organization is a replication of a familiar hierarchy, which contributes to an easy “reading” experience and yet perpetuates and fortifies problematic structures.

Giving the characters distinct color coding helps to trace the text data ascribed to them. The color schemes don’t telegraph a methodology and seem more intent on easily showing contrast. Each character is represented by name and a male or female gender symbol, which says perhaps more about what data the authors anticipate readers will want to parse than it says about the world of the play or theatre in Shakespeare’s time.

Additionally, and especially interesting although easy to skip, is a band at the top that traces the play’s non-dialogue text, i.e., its stage directions, entrances, and exits.

Beyond the main screen, there are two pop-up windows that let the user delve deeper into the quantifiable elements of the play. The pop-up windows show a) SPEECH: sections of dialogue that are tagged to indicate modes of text (like speech, song, quote, etc.) and b) METRICS: graphs that present the network and data specific to a character. For example, I can see all the characters Ophelia interacts with, how much she interacts with them, and how her dialogue quantity compares to the play’s overall word count and a selected act’s word count.

Together, the data show the underlying webbed structure of the play’s world and its inhabitants’ relationships.

WHO MADE IT:

Information about the project, adjacent scholarship, and its makers is hidden within the ‘about’ section of a small gray menu button. In other words, it’s not prominent. A link to a scholarly paper is part of the about page. The short paper is a traditionally scholarly text and the way to discover the names and affiliations of the authors: Thomas Wilhelm, Manuel Burghardt, and Christian Wolff, likely three German men affiliated with a German university. (Sidenote: German theater has a deconstructionist attitude toward Shakespeare’s plays. The plays are often cut and rearranged according to the director, so it strikes me as interesting that a project that so clearly appreciates the full text emerges there.)

WHAT IT’S LIKE TO READ/USE IT: 

The visual impact of the entirety of Hamlet on one screen equals the satisfaction one can feel when looking at a good map of a city one has lived in.

It works as a tool (a response to our desire yet inability to grasp more extensive chunks of time and space) that facilitates orientation. There is certainly new knowledge in understanding the space and time of Hamlet’s 2+ hour universe from a bird’s-eye view, on one single image. As each play creates a world of its own, these data collections might function as maps/guides through a Shakespearean imagined universe.

The structure and provided data are easy to navigate and understandable for those who have encountered the play before and have an initial familiarity with Shakespeare’s drama. The authors’ assumption is definitely that their users are familiar with Shakespeare and that a quantitative look at a text might yield insight beyond traditional, literary text analysis.

It’s not intended as an introduction to Shakespeare, I don’t think. So the assumption of a knowledgeable readership obviates the need for a user manual or how-to guidance, at least at first glance. In the related paper, the authors confirm this assumption about audience. They imagine the site to be of use for people working in or creating work about theater.

And it is. 

I made my partner, who’s a professional actor, look at the website, and she was instantly enthusiastic about the possibilities it might afford her when preparing for a role. For an actor, the aspect of a play that stays ambiguous the longest is a sense of the entire arc one’s character travels. Grasping the arc is necessary in crafting the performance, so having a tool that makes a particular arc so visible would likely help an actor by illustrating a character’s place in the play’s ecosystem quickly and succinctly. A theater director who is in the process of shaping the staging of a play could find this map invaluable as well. The authors also imagine that it could inspire new points of entry for literary analysis by showing, e.g., word-quantity disparities between genders.

So what makes this project effective is the authors’ understanding of potential users and of how little context these users need to grasp the project’s application and implication. It presupposes an informed or partially informed reader with a specific goal and can therefore leave more unarticulated.

And still, when I read the related paper, I was grateful for the ways in which it helped me move beyond my initial evaluation of the project’s possibilities. In that way, a theoretical discourse does become necessary. The paper inspired me to think more creatively about the project’s uses and suggested applications beyond my familiar realm. I almost wish that aspects of the paper, in less formal articulation, could preface the project or even interrupt it.

QUESTIONS THAT REMAIN: 

I understand that quantifying an experience (theater) and a literary work (play) is intriguing. Quantification seems to offer a path toward understanding and parsing how and why art works for humans. (And I also understand the reverse: how often it is important to find the story data tells by way of a unique narrator’s take on that data.)

However, theater as an experience still centers the qualitative and subjective. Where do these aspects go in the analysis of word count?

How does one mark up or count a catharsis? Is there a discomfort with the ambiguity of qualitative aspects? Does it force a reckoning with the variety that performance might bring to these texts? (What about the tone, subtext, and intention that imbue the words?) Why are these aspects so resolutely excluded?

What are the implications and limits of the structure, the reflexive absorption of the grid? What metrics are left out? Considerations of plays shown in repertory by a company, men playing women’s roles, identity markers other than gender? How would these considerations interfere with the binaries established by the grid?

The creators talk about possible expansions of the project. How do the current discoveries they have made help facilitate new discoveries? Or is it possible the current direction keeps us from entering a radically different but equally productive line of inquiry?

Finally, a bit of a meta question related to my interest in narrativity: does the project entrench the view of “story” as a hero’s arc, and how does this quantification relate to notions of narrativity?

Archiving Historical Violence Through Architectural Technologies and Situated Testimony in Forensic Architecture’s “Dispossession and The Memory of the Earth: Land Dispossession in Nueva Colonia”

Forensic Architecture is a research agency based out of London that combines architectural digital technologies with investigative techniques to piece together evidence and ultimately craft visual archives of state violence that otherwise would not exist. The term forensic architecture refers to the emerging academic field at Goldsmiths, University of London that “produces and presents architectural evidence within legal and political processes.” I first became aware of Forensic Architecture’s work at the Whitney Biennial in 2019, and they are ultimately the spark for my interest in digital humanities. While Forensic Architecture does not necessarily identify as a digital humanities collective, I believe that the work they do constitutes digital humanities with an explicit basis in critical theory. That is to say, Forensic Architecture employs digital tools to document historical oppression that lacks other “traditional” forms of documentation (such as writing, images, news, videos, etc.), in an effort to question what knowledge is by exposing the gaps in what we “know” about human rights violations, and to reposition power by focusing on and working with structurally oppressed communities to effectively document their history.

In the investigative project “Dispossession and The Memory of the Earth: Land Dispossession in Nueva Colonia,” commissioned by The Commission for the Clarification of Truth, Coexistence and Non-Repetition of Colombia, Forensic Architecture collaborates with Instituto Popular de Capacitación (a Colombian organization that carries out research, training, education, and more with communities, social and political movements, media, and the State at its different territorial scales) and Forjando Futuros (a Colombian non-profit that provides legal representation for victims of land dispossession due to armed conflict) to show the dispossession of campesino farmers’ land in the Urabá Antioqueño region of Nueva Colonia from the 1960s to the present day, through a web platform consisting of cluster visualizations and mapping technologies. According to the investigation site, methodologies include “3D modeling, data mining, fieldwork, photogrammetry, remote sensing, situated testimony, ground truth, [and] software development.”

In pursuing this project, the various teams of investigators ultimately expose the different actors involved in the land dispossession of Urabá for the purposes of “commercial monocrop banana cultivation,” and call into focus the governmental forces that were meant to protect the campesinos from the brutal violence enacted on both the people and the environment of Urabá. The web platform Despojo Urabá introduces historical context and identifies 12 different groups of actors. Clusters of circles representing land are color-coded to reflect each of the 12 groups and their respective land ownership. Starting in 1950 and depicted through time, these circles shift and move as land is dispossessed from the campesinos and ownership is transferred between different corporate entities. The platform also introduces an alternative view through a map, which shows in a more geographical context which parts of Urabá are being dispossessed, transferred, and bought over time, and by which actors. Beyond the web platform, Forensic Architecture created an accompanying video that delves into historical context, providing situated testimony of massacres inflicted against the campesinos of Urabá through architectural modeling, aerial analysis, and testimony. The investigation is a powerful project that transforms the ways users think about knowledge, history, and memory, revealing the ways we can practice, as Presner writes, “building a bridge… between the mapped and unmapped, the global and the local… reestablish[ing] contact with the non-philosophical” [Presner, 66].

Despojo Urabá allows users to understand the conflict, scale, and timeline of land dispossession in Urabá in a way that could easily get lost in translation in “traditional” written research projects. The investigative video component delivers extraordinary insights that are more accessible than papers typically are, rooted in testimony that can be envisioned through the use of digital technologies. The clusters and maps were tools specifically chosen, and they serve distinct purposes in crafting the archive of land dispossession. There are immense strengths in this project’s archiving and writing of a history that oftentimes goes unwritten. The interactive nature of the web platform, paired with the video, allows insights to be explored by users in meaningful ways.

While Forensic Architecture engaged with Colombians from a wide array of backgrounds in their investigation and practiced critical theory intentionally, my biggest critique is their lack of job pipelines to support non-European scholars who are interested in working on and developing their own projects. How does Forensic Architecture ensure that their on-the-ground collaborators are being compensated for their labor on projects that deal with violence and trauma?

Overall, I highly recommend checking out Forensic Architecture’s “Dispossession and The Memory of the Earth: Land Dispossession in Nueva Colonia”, as well as their other 84 investigations. I believe that their investigative work reinvigorates us to think critically and creatively about how digital (and architectural) technologies can be used in exposing and archiving human rights violations.

Sefaria in Gephi: More Visualization than Judaism, & That’s Okay – A Very DH Project.

A/N: This is less a critique and more praise, frankly, and you know what? I’m okay with that, this project is awesome.

Sefaria is a massive and widely used online repository for Jewish texts, from Torah to Talmud. It contains all of our texts in several languages and also includes user-generated commentaries and collections. It is often called a “living library” of Judaism. Gephi is a piece of software for data visualization. So when I came across Sefaria in Gephi, I said, “oh my gosh, I’d been wondering about this exact thing!”

Sefaria in Gephi is a project by the folks at Ludic Analytics, a small group of colleagues who all work on literature visualization as both a research and a pedagogical tool. It’s a little dated at this point – the posts themselves were finished in 2014, when Sefaria was only a year old – but there is still a massive amount of value in these graphs. The main author, Dr. Liz Shayne, says she started this project mostly out of curiosity: “how do the formal attributes of digital adaptations affect the positions we take towards texts? And how do they reorganize the way we perceive, think about and feel for/with/about texts?” This is, in my opinion, a very DH question to ask: how does the way we visualize data change the way we perceive the results and how we feel about them? It is truly DH in being at the intersection of math and literature as well; it hits a very necessary cross-section.

Quick sidebar about Dr. Shayne: since publishing this series on WordPress she has actually ended up working with Sefaria, and she is now a director at Yeshivat Maharat, a women’s university for Jewish studies (in goyim terms), which is actually part of my little niche community of Open Orthodoxy! Very proud to have her as one of us.

So about Sefaria: Sefaria creates literal links between texts that all reference the same thing. If you highlight Genesis 1:1, it’ll show you all the other texts that mention Gen. 1:1. It makes it very easy to see the (not so literal) links between the texts.

Over 87k connections were made across over 100k nodes, and Dr. Shayne notes it’s important to realize that these connections are less an indication of over 2,000 years of texts than an indication of the incredible crowdsourcing Sefaria has been able to accomplish.

Here is the first example she gives of what Gephi did with Sefaria, using the OpenOrd layout plugin, which is designed for visualizing large datasets.

The figure above represents the following:

“Blue – Biblical texts and commentaries on them (with the exception of Rashi). Each node is a verse or the commentary by one author on that verse.

Green – Rashi’s commentaries. Each node is a single comment on a section

Pink – The Gemara. Each node is a single section of a page.

(Note – these first 3 make up 87% of the nodes in this graph. Rashi actually has the highest number of nodes, but none of them have very many connections)

Red – Codes of Law. Each node is a single sub-section.

Purple – The Mishnah. Each node is a single Mishnah.

Orange – Other (Mysticism, Mussar, etc.)”

Don’t worry if you’re not Jewish or don’t know what these things mean; just know that they’re all Jewish texts. “Size also corresponds to degree,” says Shayne: “the more connections a single node has, the larger it is.” The largest blue node is just the first verse of Genesis. From this graph we can also see that most connections are made by the Gemara referencing the Torah and the Gemara referencing itself. Shayne notes, however, that this graph is just very hard to read and misses a lot of important information, like proximity: there’s nothing linear or otherwise sequential in this graph.
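
As a rough sketch of what is happening under the hood here (my own illustration, not Ludic Analytics’ actual workflow; the link records are invented), one might load Sefaria-style connections into a graph, compute each node’s degree, and export a file Gephi can lay out with OpenOrd:

```python
import networkx as nx

# Invented link records in Sefaria's style: each connection pairs
# two text references that cite or comment on one another.
links = [
    ("Genesis 1:1", "Rashi on Genesis 1:1"),
    ("Genesis 1:1", "Berakhot 2a"),
    ("Genesis 1:1", "Guide for the Perplexed 2:30"),
    ("Berakhot 2a", "Mishnah Berakhot 1:1"),
]

G = nx.Graph()
G.add_edges_from(links)

# Degree = number of connections; in the visualization, node size
# corresponds to degree, so the best-connected verse draws largest.
for node, degree in sorted(G.degree, key=lambda nd: -nd[1]):
    print(f"{node}: {degree}")

# Export for layout and styling in Gephi (e.g., with OpenOrd).
nx.write_gexf(G, "sefaria_sample.gexf")
```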

Dr. Shayne experiments with several different methods of visualizing this data, and she is quite good at critiquing those methods based on the way they change the way you think about a text. The second article she writes in the series actually talks entirely about the limits of the project and of the medium, and specifically about how to make limits work in your favor, which is something I think a lot of DH projects could use and are trying to learn. She also experiments with what data she’s trying to visualize, going from small concepts like connections between individual statements to much broader ones like connections between entire books. Over the course of her project, as Sefaria got bigger and added more links, what she was able to do with the data changed, like tracking allusions to other texts rather than tracking the texts themselves. There’s a point where she actually graphs Sefaria getting larger, which accidentally gave her insight into how Sefaria was built up.

Overall, I think this project has a lot to offer in terms of what it allows us to see about Jewish commentary through different lenses. However, this project is ultimately much less about Judaism and more about reliable and creative graphing. We learn that what is helpful visually may not always be what is helpful statistically. We also owe this project some credit for Sefaria’s popularity, as it was written about in Wired, which gave it some traction. Unrelatedly, Dr. Shayne is also quite witty: in potentially the funniest line I’ve ever read in an academic article, she writes, “statistically speaking, Genesis 1:1 is the Kevin Bacon of Sefaria. You are more likely to be within 6 degrees of it than anything else.”

Dr. Shayne is, according to her WordPress, still working on this project, though I can’t find much online as to in what capacity (perhaps that was her work with Sefaria directly). You can access the data from her project on GitHub (like any good DH project), but be warned that it is extremely hefty. She ends this exploration with two questions:

“1. How does this kind of work – making visualizations and thinking about networked Jewish text – enhance the traditional experience of studying Jewish texts in a Jewish environment?
2. How can an academic researcher make use of these visualizations and to what degree does she need to become an expert in network theory to do so?”

And to these I say:

  1. Making these sorts of connections is innate to Jewish study: it is why we study the commentaries in addition to the Tanach. These images don’t take the legwork out of making those connections, but rather serve as a memory aid.
  2. This question I can answer less fully, but I can say that you definitely don’t need to be an expert to use these visualizations. This is the Digital Humanities at work in serving the public knowledge base; these illustrations are incredibly accessible in their formatting.

“The Black Boxes of Forced Disappearance” in Colombia –Analysis of DH Tools + Potentials

For this week’s task I chose to analyze the DH tools used in the investigation “The Black Boxes of Enforced Disappearance.” This work is not a DH project (apologies). However, it does use many of the tools that DH scholars work and interact with, particularly those involving mapping, 3D modeling, video analysis, archive analysis, and data mining. The “Black Boxes” investigation is part of the final report by Colombia’s Truth Commission. This Commission was established in 2016 as part of the country’s peace agreement process. Its purpose is to “shed light on five decades of atrocities and human rights violations committed during the country’s armed conflict” (United Nations). A truth commission is an institution that promotes a transitional justice system and reinforces the importance of truth in resolving conflicts; most importantly, it sees truth as a form of reparation to victims of human rights violations. Truth, however, is not perceived as a matter-of-fact statement but as a complex and heterophonic narrative. The most famous (and perhaps the first?) truth commission in the world was established by Nelson Mandela in South Africa to “help deal with what happened under apartheid.” Given the tools used, the complexity of the investigation, and the profound impact of this work, I think it constitutes a very interesting example of the intersection between digital tools and humanistic inquiry in a broader, real-life scenario.

–The context–

“The Black Boxes of Enforced Disappearance” is an investigation that traces the events that occurred in Bogotá, Colombia between November 6 and 7, 1985, when the M-19 guerrilla took over the Justice Palace and the Colombian government launched operations to “retake” the building. This episode is one of the deadliest and most traumatic events of Colombia’s internal conflict. It was broadcast on national TV in real time but left in total impunity. The aftermath left 101 civilians dead and an unknown number of missing persons. The investigation focuses on what happened to those who were murdered and disappeared. One of its most complex findings is that the Colombian armed forces were the ones that carried out the torture, killing, and, in some cases, disappearance of the building’s hostages. As they put it: “Our analysis shows that what was presented as a chaotic hostage release scenario by the armed forces has served for decades to cover up a planned and organized counterinsurgency operation” (Colombian Truth Commission). The complete history of this event is very intricate and difficult; I’ve left some links at the end for those who want to know more about the content of the investigation.

–The Builders–

As we’ve seen in past readings and discussions pertaining to knowledge production in DH, interdisciplinarity is also central to this investigation. There were three major builders in this project. The video analysis, mapping, data mining, and architectural reconstruction of the places where the killings and disappearances occurred were carried out by the Forensic Architecture agency, which is based in London and mainly composed of architects, filmmakers, software developers, and (some) social scientists. The data collection, social work, and archival analysis were carried out by a specially appointed truth commissioner and his research team. Interesting fact: the main social researcher in charge, Oscar Pedraza, holds a PhD in Cultural Anthropology from the Graduate Center! And the third set of builders (this is my assumption) are most likely web designers, perhaps UX/UI designers, who rendered the content available and legible for the digital world and the Colombian population at large.

–The Tools & Outcomes–

Although I’m not familiar with how the tools operate or the technical specifics of the programs and platforms employed (it’s also very interesting that the digital builders don’t really name their tools; they name their techniques, but that’s different), there are many key takeaways from the way they presented the investigation to the public. I’ve separated the ones I think most important into the following categories:

  • Geographical analysis: 3D modeling and cartographies were used to project the videos recovered from the event and also to trace the routes taken by the hostages and the armed actors.
  • Video analysis: 50 hours of video were synchronized to create a narrative of how the hostages were evacuated from the building and where they were last seen.
  • Audio analysis: 76 testimonies were analyzed and compared with other data.
  • Data mining: large volumes of data found in a variety of documents (sentences, hearings, reports, resolutions, official letters, expert reports, statements, DNA identifications, etc.) helped place the evidence in space and time and showed how much of the evidence (videos, photos, reports) has been manipulated or erased throughout the years. A fascinating conclusion about data (maybe obvious to some of you): data moves in time; it is not stable; it is very fragile.
  • Documentary, guides & exhibition: to help the public walk through all this material, the investigation produced a documentary and several special guides (like this one) with visuals that help elucidate how the geographical analysis was done. And lastly, they hosted an exhibition at a museum with a mural titled “Negative Evidence.”

–The Potentials–

After reviewing the project, I think there are many interesting ways of using these techniques for humanistic inquiry. I will just focus on one: the video analysis. Given that these events, as I mentioned before, were broadcast on national television, there was a lot of footage to reconstruct from. Nonetheless, these sources, this video archive, were manipulated and erased throughout the years; the researchers point out the extreme difficulty of organizing and making sense of this data. The work they were able to do with the available video seems incredible, and in a sense it reminds me of the Colored Conventions project, not because of a video archive, of course, but because of the visual record used to reconstruct a narrative that seems to have been buried in history. This very fragile and manipulated data, which in both cases is very visual, is used to narrate a story that must come to light, or come to public knowledge in a new way, and in both cases this can have a very reparatory and healing outcome for all the actors involved.

–The shortcomings of my Analysis–

This was a very complex investigation involving many tools, researchers, infrastructures, resources, etc. With more time, it would be nice to explore the tools used more deeply. Also, given that this is presented as a finalized investigation, a finished “product” rather than a project in process, I don’t know much about the kinds of problems they ran into, besides the difficulty of the task itself (and the effects of the pandemic).

Links/Further reading:

About the events & recent | About the Tools | About the Builders & video | Similar Techniques in Journalism (NYT)

Expectations vs. Reality.

Records and record keeping are an intricate process. The validity and availability of records depend on many factors, such as bias and motivation, extending even to the person keeping those records (basically, there are many questions to be asked, be they obvious and/or nuanced). There is a problem with record keeping for marginalized communities, especially communities of color here in the Western world. Are the records reliable? Are they complete? Do they gloss over the context of the people? Who kept the records now being examined by modern researchers? In designing a course for the digital humanities, those questions have to be answered in order to present a more nuanced approach to pedagogy. Those are the questions I am asking when I read the article “Teaching the Digital Caribbean: The Ethics of a Public Pedagogical Experiment” by Kelly Baker Josephs.

Records and archives are by design mundane and at times can be very dry. The challenge comes in deciphering those records so they can be presented to a wider audience, and I would classify even students in a master’s program as a wider audience. But the above questions must be answered through those records before giving the full picture to the students. That is where the problem comes in: the records and archives themselves might be incomplete in a scholarly sense, because the record keepers and the writers of those records might have been very unreliable when recording events pertaining to the marginalized community.

A review of “What America Ate”

Cover of What America Ate Project. Source: https://whatamericaate.org/

Launched in the spring of 2017, What America Ate is an interactive website and online food recipe archive created through a collaboration among digital humanities professors and students at Michigan State University. The project was funded by the National Endowment for the Humanities with the mission of documenting the ever-changing diet and food habits of the American people during the Great Depression era of the 1930s and 1940s.

The project contains a vast amount of digitized content, including 200 rare community cookbooks, rare promotional materials, flyers, and recipe booklets. When I was browsing the project, I could not find exactly how long it took to complete such a mammoth project. However, it is fair to assume that it took years, given the sheer amount of content. Although the data in What America Ate came from multiple sources, the majority comes from America Eats, a Depression-era project. In essence, What America Ate can be considered an extended effort to understand different aspects of American life during the era. The primary objective of the project is to make the historical record of food culture available to the masses using digital tools.

The interface contains content in multiple formats to attract diverse users. There is a Browse section on the website with filters like Region, Format, and Year to narrow the browsing results to specific research interests. Basic and Advanced Search functionality is also available, facilitating everything from simple phrase and keyword searches to complex field-specific searches with multi-select dropdown menus. A detailed recipe section categorizes dishes by time, course, and ingredients to attract foodies who are just browsing the project to find out what Americans used to eat for breakfast or as soup. Each recipe also contains a number of associated metadata fields, such as description, date range, language, sources, citations, and many more! Furthermore, there is a map visualization that categorizes recipes by different regions of America. The project also invites volunteers to transcribe the scanned images.
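
To make the faceted Browse idea concrete, here is a minimal sketch of that kind of filtering (my own illustration; the field names and sample records are invented, not the project’s actual schema):

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    region: str
    fmt: str     # e.g., "cookbook", "flyer", "recipe booklet"
    year: int

# Invented sample records standing in for digitized archive entries.
catalog = [
    Item("Victory Garden Soups", "Far West", "recipe booklet", 1943),
    Item("Community Church Cookbook", "South", "cookbook", 1936),
    Item("Flour Promotion Flyer", "Midwest", "flyer", 1939),
]

def browse(items, region=None, fmt=None, years=None):
    """Return items matching every selected facet; None means 'any'."""
    return [
        it for it in items
        if (region is None or it.region == region)
        and (fmt is None or it.fmt == fmt)
        and (years is None or years[0] <= it.year <= years[1])
    ]

print([it.title for it in browse(catalog, region="South")])
# ['Community Church Cookbook']
print([it.title for it in browse(catalog, years=(1938, 1945))])
# ['Victory Garden Soups', 'Flour Promotion Flyer']
```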

Although the What America Ate project is populated with thousands of materials, it does not contain all of the content that was available in that period. Instead, the project uses sampling to select representatives that epitomize different food traditions from different regions of America. One criticism I have concerns the sampling bias visible in the number of selected recipes per region: most of the recipes are from the Far West and the South, more than from any other region.

In conclusion, What America Ate is a captivating project that offers a great overview of America’s culinary history. Like many other DH projects, what I liked most is its inclusive character, serving and appealing to a broad range of audiences.

Please go and check it out!!! https://whatamericaate.org/

Digital Humanities Quarterly—More Than an Index

In reviewing the dhq website, “an open-access, peer-reviewed, digital journal covering all aspects of digital media in the humanities,” it is apparent that a rich resource of current and diverse thinking is tethered to a traditionalist and outdated expression of academic online publishing. From the information hierarchy to the interface, this website does not take advantage of best practices in facilitating easy access to the rich collection it houses.

Although this type of website may feel familiar and comfortable to academics who began their careers in an earlier wave of online expression, newer scholars, digital natives, are accustomed to sites that take advantage of the UX/UI learnings of the past decades to address user needs and behaviors, and that apply design principles signifying relevance and active participation in the larger online world while promoting engagement.

Navigation

Main Navigation: This tool serves as the primary framework that users engage with to understand the contents of the site, and it also communicates what the creators have decided merits attention, based on what they understand their objectives and user needs to be. Typically, when landing on a site, users need to gain an understanding of where it is they have landed. What is the purpose of the site? Who is shaping it? The order of options on the main navigation of dhq suggests that before knowing about the publication itself or who is involved in it, users should be concerned with guidelines for submitting. To an outside observer, this can suggest that the publication lacks sufficient submissions and that soliciting new work is imperative, or, alternatively, it assumes that academics are most concerned with having work published, considering it is seen as a legitimizing activity. Either way, this prominent placement on the navigation feels unseemly. In contrast, non-profits typically don’t place the donate button before the who and why of their organizations. That is not to say submission should not have prominence; following the non-profit convention of creating an alternative treatment that draws attention to the action of submitting while maintaining the prominence of the publication’s existing content and underlying philosophies may serve the dhq site well.

Issue Navigation: Presumably, this navigation should provide entry points into the evolving conversation and insight into the overarching themes the editors have chosen to showcase. This would give users a layer of understanding to help navigate the many areas of inquiry in the field. However, on the dhq site the issues are simply arranged by date, with the themes revealed only after arbitrarily clicking into a dated link. If you are not a user arriving with a specific inquiry in mind, the search tool is rendered useless, and the lengthy scroll of dates becomes overwhelming and obtuse. It’s akin to offering someone an anthology where each chapter is simply a page number, requiring you to flip to that page to know what the chapter covers. Using only dates to indicate each issue is also a missed opportunity to cross-pollinate thinking: to invite users, many of whom may have come to the site with a specific understanding or area of knowledge, to explore new perspectives.

Orienting The User

  • Issues: Each issue has a theme that is only revealed when you click on its issue date. Once on the issue landing page, however, you are required to click again into the “front matter” to better understand the intention and process behind the issue. Some issues do not have a theme and simply list an “articles” section, negating the convention of the others. In both cases, the issue landing page would benefit from a quick (2-4 sentence) summary from the editor(s) to both humanize the issue theme and set the stage for deeper understanding. In a sense, the issue landing page is the quarterly’s cover, which even in the more traditional publishing world typically includes some kind of indication of what’s to come.
  • About DHQ: This page should help orient the user to the overarching perspective and structure of the publication. It does include key information, but it might benefit the creators to reconsider the order in which they present it. Technical and contextual information about the publication are intermingled in a way that doesn’t suggest a progression to build up the user’s understanding. For example, it might benefit the user to read through “DHQ on Digital Humanities” before getting information on public indexes and source code. It’s also confusing that the technical overview section does not encompass the sub-topic “getting DHQ Data,” which instead stands alone. The current hierarchy raises questions about the creators’ definition of the term “technical.” There also seems to be a missed opportunity to cross-promote other portions of the site. Having learned more about the structure and purpose of the publication, it would make sense for a user to be invited to find out more about submitting, or to encounter a carousel promoting recent articles that exemplify the purpose delineated on the page.
  • DHQ People: This page is meant to give insight into the folks who bring the publication to life. It’s worth noting that the publication’s statement on BLM and Structural Racism does not appear on the page where its actual personnel structure is listed, and there are no images of the people behind the publication to give a preliminary indication of the diversity of the staff. Providing images of staff and contributors would act as an encouraging signal to those entering the space if they were to see people who may resemble them. Stripping away any human representation (ironic in the humanities) creates a faceless, cold, index-driven experience and misses opportunities for greater community building and collaboration.

Aesthetics

Overall, the dhq website is aesthetically uninviting and feels like a missed opportunity to collaborate with design professionals who would bring their own expertise to help create a more compelling and engaging experience. The site as a whole could benefit from more modern and thought-out branding elements, including an updated logo, typography, palette, and imagery. A more sophisticated typographic treatment would help reinforce hierarchy, while an updated palette and imagery would better set the tone and indicate cultural awareness. Increased usage of images, in particular, would help infuse energy and a sense of vitality into the presentation of these deeply engaging conversations. Each issue with a theme, for example, could benefit from images that help bring its area of exploration into relief.

In Conclusion

It is true that simple mastery of basic coding skills creates enormous potential and opportunity for sharing information quickly and easily. The dhq site does an extraordinary job of gathering and presenting the writings of a wide range of DH scholars and thinkers, and it serves as an impressive resource. However, the site also exemplifies a common occurrence on many academic pages: these sites are often set up with an uneven emphasis on their underlying indexes and neglect the needs of the user and the front-end design conventions that facilitate efficient, frictionless access to information. Everything from the site architecture and mapping, including the hierarchy on each page, to the fonts and colors impacts the user’s experience. Done well, these elements can create a welcoming hub that users choose to return to, and even make into a “third space”; done badly, they can render a site a chore to navigate, accessed only in times of immediate research-based need. In considering the building of DH tools, creators must invest in the user’s experience. Perhaps the strict formats and conventions of earlier scholarly publishing have influenced the current online expressions, but in order to take full advantage of the internet’s potential a new approach is required. As a tool, the internet provides incredible opportunity to positively impact both academic and public engagement, leading to richer and more varied outcomes. Besides, scholars like easily navigable and aesthetically pleasing things too!

It’s possible that a lack of funding for a front-end design collaborator limited this site’s front-end outcomes, but hopefully it was not a case of these skills being deemed unnecessary. UX/UI and front-end design funding should be included in grant proposals and presented as imperative, because the resulting digital expressions will better reflect the energy and sense of possibility inherent to Digital Humanities.

Extra: 
Some sample resources that successfully use front-end design and structuring to impart varied and rich long-form content:

The Mark Up

The Kirkus Review

Atmos Magazine