Multi-Sensory Rendering: Combining Graphics and Acoustics

Jackson Pope and Alan Chalmers
Department of Computer Science, University of Bristol, UK
email: pope@cs.bris.ac.uk

ABSTRACT


Human perception of the geometry and spatial layout of an environment is a multi-sensory process. In addition to sight, the human brain is particularly adept at subconsciously processing echoes, using these reflected sounds to provide some indication of the dimensions of an environment. This auditory impression of size incorporates surfaces not only in front of the person, but also to the sides and rear, and thus currently hidden from his/her view. So while computer graphics can provide an image of what a person can currently see, the level of perceptual realism may be significantly improved by incorporating auditory effects as well. This paper describes a method for combining the computation of lighting and acoustics to provide enhanced rendering of virtual environments.