Digital Humanities (DH), once referred to as "Humanities Computing", has become increasingly popular among humanities scholars, with a wide array of digital research tools developed in recent years. However, there is currently no unifying definition of DH. A broader term, Digital Scholarship (DS), is sometimes used interchangeably with DH.
Broadly speaking, DH is “Digital” + “Humanities”: the intersection of computing, research, and teaching in the fields of the humanities. In this wider sense, DH can be understood as an umbrella term for a range of activities at the junction of technology and humanities scholarship, such as data mining, born-digital preservation and visualisation, among others. (Gavin & Smith, 2012)
(Gavin, M., & Smith, K. M. (2012). An interview with Brett Bobley. In M. K. Gold (Ed.), Debates in the Digital Humanities (pp. 61-66). Minneapolis: University of Minnesota Press.)
As with the concept of DH itself, it is hard to delineate clearly what is methodologically new in DH: traditional research methods from the humanities and social sciences still apply, but they are increasingly shaped by the use of technology. In many ways, emerging digital tools help humanities scholars modify their research methods to explore patterns and uncover hidden messages in research data. The following summarises some methodologies commonly employed in DH projects.
- Data Collection / Curation
In DH, data collection is the process of creating, gathering and acquiring information in a systematic manner. Traditionally, humanities scholars adopt qualitative methods such as interviews, focus groups, oral history and ethnography to collect research data. In DH research, data can be created in many different ways that enhance the scope, quantity and quality of the data collected and open up new possibilities in humanities research. Humanities data can be text, geospatial data, results of analysis, and more!
Digitisation converts physical materials into digital form. Physical materials are not confined to books and papers; they also include photos, paintings, and video and audio materials. The CUHK Digital Repository provides access to all digital images created by the Library.
- Text Encoding
Text encoding is the process of converting documents into an electronically searchable format for digital humanities research. Digitised documents in the form of images may be of limited use unless they are enhanced, for example by transcribing them into computer-readable text for further research. The T-Pen project is an example of worldwide collaboration in transcribing handwritten texts.
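As a minimal sketch of what "encoding" a transcription can look like, the snippet below wraps plain transcribed lines in a TEI-flavoured XML structure using Python's standard library. The element names follow TEI conventions (`text`, `body`, `p`), but the structure here is illustrative, not a full TEI document:

```python
import xml.etree.ElementTree as ET

def encode_transcription(lines):
    """Wrap transcribed lines in a minimal TEI-style <text> structure."""
    text = ET.Element("text")
    body = ET.SubElement(text, "body")
    for line in lines:
        p = ET.SubElement(body, "p")
        p.text = line
    return ET.tostring(text, encoding="unicode")

xml = encode_transcription(["First transcribed line.", "Second transcribed line."])
print(xml)
```

Once text is marked up this way, it can be searched, indexed and analysed by software rather than read only as an image.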
- Data Extraction
Data extraction is the process of retrieving data from data sources. Online data, such as posts and responses from social media, can be extracted for data mining with the use of specialised software. Data extraction is an important step in the research process, particularly for emerging humanities topics such as social networks, social interaction and cyber society.
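A minimal sketch of the idea: given a JSON export of social-media posts (the field names and records below are hypothetical), a researcher can pull out just the fields needed for analysis. Real platforms expose data through their own APIs, but the extraction step looks broadly like this:

```python
import json

# Hypothetical excerpt of a social-media data export (structure is illustrative)
raw = '''
[
  {"user": "scholar_a", "text": "Visited the archive today.", "likes": 12},
  {"user": "scholar_b", "text": "New DH workshop announced!", "likes": 34}
]
'''

posts = json.loads(raw)
# Keep only the fields relevant to the research question
extracted = [{"user": p["user"], "text": p["text"]} for p in posts]
for row in extracted:
    print(row["user"], "-", row["text"])
```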
- Data Interpretation & Analysis
Data analysis is the process of understanding, evaluating and summarising the data collected. There is a great variety of data analysis methods in DH, depending on the researcher's discipline.
- Text Mining & Analysis
Many text mining and analysis tools are available to analyse texts across vast corpora of documents. One commonly used analytic method is topic modeling. At its core, this method counts the frequencies of specific terms and examines the network of their co-occurrences, so that researchers can further explore, evaluate and interpret the concepts and hidden patterns that arise from the input text. Text analysis methods are widely adopted in Language and Literature Studies, Translation Studies, Cultural and Religious Studies, History and Philosophy.
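The frequency-counting step that underlies these methods can be sketched in a few lines of Python. This is not a topic model itself, only the term-counting foundation; the two toy documents and the stopword list are made up for illustration:

```python
from collections import Counter
import re

documents = [
    "The pilgrim crossed the river at dawn",
    "At dawn the market opened by the river",
]

# Common function words are usually removed before counting
stopwords = {"the", "at", "by"}

def term_frequencies(docs):
    counts = Counter()
    for doc in docs:
        tokens = re.findall(r"[a-z]+", doc.lower())
        counts.update(t for t in tokens if t not in stopwords)
    return counts

freqs = term_frequencies(documents)
print(freqs.most_common(2))  # "river" and "dawn" appear in both documents
```

Tools built for scholars (such as Voyant Tools or topic-modeling libraries) automate this counting at corpus scale and add the statistical modeling on top.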
- Spatial and Temporal Analysis with GIS
With the use of digital maps and Geographic Information Systems (GIS), researchers can easily conduct data visualisation, network analysis and statistical analysis with a spatial dimension. Please refer to the LibGuides on GIS and Digital Scholarship Research for more details.
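One elementary spatial computation behind many GIS analyses is the great-circle distance between two coordinate pairs. The sketch below implements the standard haversine formula; the coordinates are approximate and only illustrative:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Approximate coordinates: CUHK campus and central Hong Kong
d = haversine_km(22.419, 114.207, 22.280, 114.158)
print(round(d, 1), "km")
```

GIS software performs this kind of calculation, along with projection, overlay and spatial statistics, across thousands of features at once.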
- Image Analysis
Image analysis is the process of extracting information from digital images using digital image processing techniques. In 2011, Lev Manovich presented computer software that enables distant reading of a vast image dataset by measuring factors such as greyscale in numerical form, and presents visualisations of the results instead of relying on ‘natural languages’ that are poorly suited to describing the visual.
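The idea of reducing an image to a numerical measurement can be shown with a toy example. Here an image is represented as a plain 2D grid of greyscale values (0 = black, 255 = white), and we compute its mean brightness; real image-analysis pipelines read actual image files and compute many such features per image:

```python
def mean_greyscale(pixels):
    """Mean brightness of a greyscale image given as a 2D grid of 0-255 values."""
    values = [v for row in pixels for v in row]
    return sum(values) / len(values)

# Toy 3x3 "image": one bright band across a dark background
image = [
    [10, 10, 10],
    [200, 200, 200],
    [10, 10, 10],
]
print(mean_greyscale(image))
```

Plotting such measurements for thousands of images at once is what allows the "distant reading" of a visual collection.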
- Data Visualisation
The latest digital humanities tools enable researchers to represent their data visually in more sophisticated ways, with pictorial and graphical images. Researchers can use visualisation tools to map out the linkages between their research objects in network graphs, instead of listing all the objects in tables and explaining them solely in plain words. Moreover, research outputs are no longer confined to research papers: researchers can use handy online tools and platforms to create project websites, online exhibitions, etc., showcasing their research outputs to the world. More examples can be found in our LibGuides on Data Visualization and Digital Scholarship Research.
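Before a network graph can be drawn, the linkages must be assembled into a graph structure. The sketch below builds an undirected graph from a hypothetical list of co-citation pairs between authors (the names and pairs are invented for illustration) and reports each node's degree, a simple measure of how connected it is:

```python
from collections import defaultdict

# Hypothetical co-citation pairs between authors in a corpus
edges = [("Woolf", "Joyce"), ("Woolf", "Eliot"), ("Joyce", "Eliot"), ("Woolf", "Forster")]

# Build an undirected adjacency structure
graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

# Degree (number of connections) hints at which node is most central
degrees = {node: len(neighbours) for node, neighbours in graph.items()}
print(degrees)
```

Visualisation tools such as Gephi take exactly this kind of node-and-edge data and render it as an interactive network graph.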