Utilizing image guided surgery for user interaction in medical augmented reality

dc.contributor.author Fischer, Jan de_DE
dc.contributor.author Bartz, Dirk de_DE
dc.date.accessioned 2005-03-31 de_DE
dc.date.accessioned 2014-03-18T10:14:01Z
dc.date.available 2005-03-31 de_DE
dc.date.available 2014-03-18T10:14:01Z
dc.date.issued 2005 de_DE
dc.identifier.other 11719705X de_DE
dc.identifier.uri http://nbn-resolving.de/urn:nbn:de:bsz:21-opus-16617 de_DE
dc.identifier.uri http://hdl.handle.net/10900/48730
dc.description.abstract The graphical overlay of additional medical information over the patient during a surgical procedure has long been considered one of the most promising applications of augmented reality. While many experimental systems for augmented reality in medicine have reached an advanced state and can deliver high-quality augmented video streams, they usually depend heavily on specialized dedicated hardware. Such dedicated system components, originally designed for engineering applications or VR research, are often ill-suited for use in clinical practice. We describe a novel medical augmented reality application that is based almost exclusively on existing, commercially available, and certified medical equipment. In our system, a so-called image guided surgery device is used for tracking a webcam, which delivers the digital video stream of the physical scene that is augmented with the virtual information. In this paper, we show how the capability of the image guided surgery system for tracking surgical instruments can be harnessed for user interaction. Our method enables the user to define points and freely drawn shapes in 3-d and provides selectable menu items, which can be located in immediate proximity to the patient. This eliminates the need for conventional touchscreen- or mouse-based user interaction without requiring additional hardware such as dedicated tracking systems or specialized 3-d input devices. Thus the surgeon can interact with the system directly, without the help of additional personnel. We demonstrate our new input method with an application for creating operation plan sketches directly on the patient in an augmented view. en
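The interaction scheme summarized in the abstract — using the tracked tip of a surgical instrument to define 3-d points, draw freehand shapes, and select menu items placed near the patient — can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's implementation; the class and parameter names (`InstrumentInteraction`, `menu_radius`, the millimeter coordinates) are assumptions, and a real system would receive tip poses from the image guided surgery device's tracking API.

```python
# Hypothetical sketch of instrument-based 3-d interaction:
# accumulate freehand strokes from tracked tip positions, and
# select virtual menu items by spatial proximity to the tip.
import math


def distance(a, b):
    """Euclidean distance between two 3-d points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


class InstrumentInteraction:
    def __init__(self, menu_items, menu_radius=10.0):
        # menu_items: {label: (x, y, z)} positions placed near the
        # patient, in the tracker's coordinate system (e.g. mm).
        self.menu_items = menu_items
        self.menu_radius = menu_radius
        self.stroke = []  # freely drawn 3-d shape as a point list

    def update(self, tip_position, drawing):
        """Process one tracked tip sample.

        While drawing, the sample extends the current stroke;
        otherwise the tip is hit-tested against the menu items.
        Returns the selected menu label, or None.
        """
        if drawing:
            self.stroke.append(tip_position)
            return None
        for label, center in self.menu_items.items():
            if distance(tip_position, center) <= self.menu_radius:
                return label
        return None


# Usage: two stroke samples, then a tip pose near the "clear" item.
ui = InstrumentInteraction({"draw": (0.0, 0.0, 0.0),
                            "clear": (50.0, 0.0, 0.0)})
ui.update((1.0, 2.0, 3.0), drawing=True)
ui.update((1.0, 2.5, 3.0), drawing=True)
selected = ui.update((49.0, 1.0, 0.0), drawing=False)  # within 10 mm of "clear"
```

In a real system the `drawing` flag would come from a foot pedal or a dwell gesture, since a sterile instrument offers no buttons; proximity-based selection is what lets the menu live directly in the augmented view next to the patient.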
dc.language.iso en de_DE
dc.publisher Universität Tübingen de_DE
dc.rights ubt-podok de_DE
dc.rights.uri http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=de de_DE
dc.rights.uri http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=en en
dc.subject.classification Erweiterte Realität <Informatik> , Visualisierung , Mensch-Maschine-Kommunikation , Computergraphik de_DE
dc.subject.ddc 004 de_DE
dc.subject.other Medizinische Visualisierung , Intraoperative Navigation de_DE
dc.subject.other Image Guided Surgery en
dc.title Utilizing image guided surgery for user interaction in medical augmented reality en
dc.type Report (Bericht) de_DE
dc.date.updated 2012-10-11 de_DE
utue.publikation.fachbereich Sonstige - Informations- und Kognitionswissenschaften de_DE
utue.publikation.fakultaet 7 Mathematisch-Naturwissenschaftliche Fakultät de_DE
dcterms.DCMIType Text de_DE
utue.publikation.typ report de_DE
utue.opus.id 1661 de_DE
utue.opus.portal wsi de_DE
utue.opus.portalzaehlung 2005.40000 de_DE
utue.publikation.source WSI ; 2005 ; 4 de_DE
utue.publikation.reihenname WSI-Reports - Schriftenreihe des Wilhelm-Schickard-Instituts für Informatik de_DE
utue.publikation.zsausgabe 2005, 4
utue.publikation.erstkatid 2919855-0
