Listing 1 - 5 of 5
digital humanities --- digital scholarship --- humanities computing --- digital cultural heritage --- cultural analytics --- public history
cultural analytics --- cultural sociology --- natural language processing --- machine learning --- computational methods --- cultural artifacts --- Culture --- Digital humanities --- Study and teaching --- Data processing
Just as a traveler crossing a continent won't sense the curvature of the earth, one lifetime of reading can't grasp the largest patterns organizing literary history. This is the guiding premise behind Distant Horizons, which uses the scope of data newly available to us through digital libraries to tackle previously elusive questions about literature. Ted Underwood shows how digital archives and statistical tools, rather than reducing words to numbers (as is often feared), can deepen our understanding of issues that have always been central to humanistic inquiry. Without denying the usefulness of time-honored approaches like close reading, narratology, or genre studies, Underwood argues that we also need to read the larger arcs of literary change that have remained hidden from us by their sheer scale. Using both close and distant reading to trace the differentiation of genres, transformation of gender roles, and surprising persistence of aesthetic judgment, Underwood shows how digital methods can bring into focus the larger landscape of literary history and add to the beauty and complexity we value in literature.
Criticism. --- Literature --- Digital humanities. --- Research --- Methodology. --- American literature. --- English literature. --- characterization. --- cultural analytics. --- diction. --- digital humanities. --- distant reading. --- gender. --- genre. --- machine learning.
Across the humanities and social sciences, scholars increasingly use quantitative methods to study textual data. Considered together, this research represents an extraordinary event in the long history of textuality. More or less all at once, the corpus has emerged as a major genre of cultural and scientific knowledge. In Literary Mathematics, Michael Gavin grapples with this development, describing how quantitative methods for the study of textual data offer powerful tools for historical inquiry and sometimes unexpected perspectives on theoretical issues of concern to literary studies. Student-friendly and accessible, the book advances this argument through case studies drawn from the Early English Books Online corpus. Gavin shows how a copublication network of printers and authors reveals an uncannily accurate picture of historical periodization; that a vector-space semantic model parses historical concepts in incredibly fine detail; and that a geospatial analysis of early modern discourse offers a surprising panoramic glimpse into the period's notion of world geography. Across these case studies, Gavin challenges readers to consider why corpus-based methods work so effectively and asks whether the successes of formal modeling ought to inspire humanists to reconsider fundamental theoretical assumptions about textuality and meaning. As Gavin reveals, by embracing the expressive power of mathematics, scholars can add new dimensions to digital humanities research and find new connections with the social sciences.
Digital humanities. --- Quantitative research. --- Early English Books Online Text Creation Partnership. --- Early English books online. --- corpus linguistics. --- cultural analytics. --- digital humanities. --- distributional semantics. --- geographical text analysis. --- history. --- literature. --- networks.
How computational methods can expand how we see, read, and listen to Holocaust testimony. The Holocaust is one of the most documented (and now digitized) events in human history. Institutions and archives hold hundreds of thousands of hours of audio and video testimony, composed of more than a billion words in dozens of languages, with millions of pieces of descriptive metadata. It would take several lifetimes to engage with these testimonies one at a time. Computational methods could be used to analyze an entire archive, but what are the ethical implications of "listening" to Holocaust testimonies by means of an algorithm? In this book, Todd Presner explores how the digital humanities can provide both new insights and humanizing perspectives for Holocaust memory and history. Presner suggests that it is possible to develop an "ethics of the algorithm" that mediates between the ethical demands of listening to individual testimonies and the interpretative possibilities of computational methods. He delves into thousands of testimonies and witness accounts, focusing on the analysis of trauma, language, voice, genre, and the archive itself. Tracing the affordances of digital tools that range from early, proto-computational approaches to more recent uses of automatic speech recognition and natural language processing, Presner introduces readers to what may be the ultimate expression of these methods: AI-driven testimonies that use machine learning to process responses to questions, offering a user experience that seems to replicate an actual conversation with a Holocaust survivor. With Ethics of the Algorithm, Presner presents a digital humanities argument for how big data models and computational methods can be used to preserve and perpetuate cultural memory.
Computer algorithms --- Digital humanities --- History --- Holocaust, Jewish (1939-1945) --- HISTORY / Holocaust. --- AI. --- Algorithm. --- Algorithmic. --- Auschwitz. --- Bomba. --- Child. --- Clusters. --- Corpus. --- Cultural. --- Data. --- Database. --- Death. --- Digital. --- Dimensions in Testimony (DiT). --- Distant. --- Dutch. --- Ethical. --- Ethics of the Algorithm: Digital Humanities and Holocaust Memory. --- Ethics. --- Fortunoff. --- Ghetto. --- History. --- Holocaust testimony. --- Holocaust. --- Human. --- Jewish. --- Jews. --- Judgment. --- Kimmelmann. --- Labor. --- Machine learning. --- Mala. --- Media. --- Memory. --- Narrative. --- Nazi. --- Network. --- Police. --- Population. --- Processes. --- Segments. --- Semantic triplets. --- Semantic. --- Silence. --- Survivors. --- Technologies. --- Technology. --- Testimony. --- Todd Presner. --- Trauma. --- Triplets. --- USC Shoah foundation. --- Victims. --- Violence. --- Visualization. --- War. --- algorithmic fabulation. --- archive. --- big data. --- cultural analytics. --- datafication. --- digital culture. --- digital media. --- digital technologies. --- ethical computation. --- humanistic data science. --- natural language processing. --- survivor. --- virtual. --- witness. --- Moral and ethical aspects. --- Data processing. --- Study and teaching.