dc.contributor.advisor | Baes, Gregorio B. |
dc.contributor.author | Montejo, Jay Renzo L. |
dc.date.accessioned | 2015-07-25T17:40:28Z |
dc.date.available | 2015-07-25T17:40:28Z |
dc.date.issued | 2014-04 |
dc.identifier.uri | http://cas.upm.edu.ph:8080/xmlui/handle/123456789/43 |
dc.description.abstract | In the operating room, surgeons use computer-aided navigation systems to view medical images during surgery. Because sterility must be maintained at all times to prevent disease transmission and the spread of infection, the surgeon cannot use a mouse, keyboard, or any other controller when several images need to be viewed or manipulated. To provide sterile and natural human-computer interaction, this paper presents a gesture-controlled image navigation and manipulation application that uses hand gesture recognition and the computer vision capabilities of the Kinect sensor, allowing the surgeon to navigate and manipulate medical images. | en_US
dc.language.iso | en | en_US
dc.subject | human-computer interaction | en_US
dc.subject | hand gesture recognition | en_US
dc.subject | image navigation and manipulation | en_US
dc.subject | Kinect | en_US
dc.title | Image Navigation and Manipulation in the Operating Room using Hand Gesture Recognition | en_US
dc.type | Thesis | en_US