International Perspectives on Theories and Practices of Multimodal Analysis
Seeing the Unforeseen: Eye-Tracking Reading Paths in Multimodal Webpages
In an elucidating overview of research conducted on the concept of reading paths, Hiippala (2012: 318) deplores that “the development of a theory of reading paths that would integrate perceptual psychology […] has not been followed up in subsequent multimodal research […] [but] became swamped by semiotically oriented, interpretative multimodal frameworks”. This overwhelmingly interpretative tradition in multimodal research has been criticised several times (cf., e.g., Kaltenbacher 2004: 202; Bateman 2008: 11; Bateman 2014: 239) and constitutes one of the gaps in multimodal research that this book is trying to close. The gap identified here concerns a general flaw in multimodal research. On the one hand, we have a seemingly endless number of studies on multimodal meaning-making grounded in the standard theory of social semiotics originally developed by Kress/van Leeuwen (2006). The overwhelming majority of these studies is based on heuristic and interpretative approaches to the question of how intersemiotic meaning is created by writers and how it is decoded by readers of multimodal texts. Such approaches involve methods like introspection, participant observation, interviews, and verbal protocols. On the other hand, authentic reader behaviour often differs from what interpretative methods suggest and would make us believe. Because of this discrepancy, considerable efforts have been undertaken during the past ten years to put multimodal research onto more solid empirical ground. Noteworthy, in this respect, are the development of multimodal corpora and annotation schemes (cf., e.g., Baldry/Thibault 2005), the creation of interactive software to analyse multimodal texts (cf., e.g., O’Halloran...