




Richards-Rissetto, H., Primeau, K.E., Witt, D.E., Goodwin, G. (2023). Multisensory Experiences in Archaeological Landscapes—Sound, Vision, and Movement in GIS and Virtual Reality. In: Landeschi, G., Betts, E. (eds) Capturing the Senses. Quantitative Methods in the Humanities and Social Sciences. Springer, Cham.


Open Access: This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License.


Archaeologists are employing a variety of digital tools to develop new methodological frameworks that combine computational and experiential approaches, leading to new multisensory research. In this chapter, we explore vision, sound, and movement at the ancient Maya city of Copan from a multisensory and multiscalar perspective, bridging concepts and approaches from different archaeological paradigms. Our methods and interpretations employ theory-inspired variables from proxemics and semiotics to develop a methodological framework that combines computation with sensory perception. Using GIS, 3D, and acoustic tools, we create multisensory experiences in VR with spatial sound using an immersive headset (Oculus Rift) and touch controllers (for movement). The case study simulates the late eighth- and early ninth-century landscape of the ancient Maya city of Copan to investigate the role of landscape in facilitating movement, sending messages, influencing social interaction, and structuring cultural events. We perform two simulations to begin to study the impact of vegetation on viewsheds and soundsheds of a stela at ancient Copan. Our objectives are twofold: (1) to design and test steps towards developing a GIS computational approach for analysing the impact of vegetation within urban agrarian landscapes on viewsheds and soundsheds, and (2) to explore the cultural significance of Stela 12 and, more generally, the role of synesthetic experience in ancient Maya society using a multisensory approach that incorporates GIS and VR.
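The core idea behind a vegetation-aware viewshed can be illustrated with a minimal sketch: a line-of-sight test over a gridded elevation surface in which vegetation heights are added to the ground as sight-blocking obstacles. Everything below (the `viewshed` function, the `dem` and `veg` grids, the 1.7 m observer height) is a hypothetical illustration under simplified assumptions, not the chapter's actual GIS workflow, which relies on dedicated GIS tooling.

```python
import math

def viewshed(dem, veg, observer, observer_height=1.7):
    """Boolean viewshed on a regular grid.

    dem: 2D list of ground elevations (same units as cell size).
    veg: 2D list of vegetation heights; vegetation blocks sight lines
         to cells behind it, but a cell's own vegetation does not hide
         the ground at that cell.
    observer: (row, col) of the viewpoint.
    """
    rows, cols = len(dem), len(dem[0])
    orow, ocol = observer
    oz = dem[orow][ocol] + observer_height
    visible = [[False] * cols for _ in range(rows)]
    visible[orow][ocol] = True
    for r in range(rows):
        for c in range(cols):
            if (r, c) == (orow, ocol):
                continue
            dist = math.hypot(r - orow, c - ocol)
            # Slope from the observer's eye to the target cell's ground.
            target_slope = (dem[r][c] - oz) / dist
            # Sample intermediate cells along the sight line; if any
            # vegetated surface rises above the sight line, the target
            # is blocked.
            blocked = False
            for s in range(1, int(dist)):
                t = s / dist
                ir = int(round(orow + (r - orow) * t))
                ic = int(round(ocol + (c - ocol) * t))
                surface = dem[ir][ic] + veg[ir][ic]
                if (surface - oz) / s > target_slope:
                    blocked = True
                    break
            visible[r][c] = not blocked
    return visible
```

Comparing a run with a zeroed `veg` grid against one with reconstructed vegetation heights gives a first approximation of how planting changes what is visible from a monument such as a stela; a soundshed would replace the line-of-sight test with a distance- and obstruction-based attenuation model.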