Abstract:
Today's media increasingly emphasize the fluent interplay between body and medium. Surfaces become tangible, and GUIs allow multi-touch gestures, deliver haptic feedback, or embed elements into our natural spatial environment. In some cases they even extend or integrate into the body by becoming wearable or implanted. Accordingly, typical media recipients will no longer, as in the past, consume content statically as when watching television or reading a book; instead, they will perceive media content while they move and in relation to their moving body: walking with their smartphone, looking around while using augmented reality glasses, or gesturing above their tablets.
This relation between body and perception, however, can create a broad range of incompatibilities, because the body is involved not only in controlling movement but also in processing many kinds of media content. To investigate the effects of such incompatibilities, three elements become necessary: a deep conceptual understanding of the cognitive connection between body and media perception, the development of experimental environments to investigate their mutual dependency across different media, and experiments that explicitly focus on the relation between body-related cognition and media perception.
Accordingly, within this dissertation, we develop the central theoretical and practical tools necessary for investigating the interactions between body-related cognition and media-related cognition across a broad range of media platforms by: (a) reviewing and structuring the current state of research and its challenges in the field of spatial content perception; (b) developing the experimental environment Inter|act3D, which allows platform- and media-independent investigation of this connection; and (c) investigating a central dependency between media perception and body representations: the effect of body posture on visual media perception.