
Dataset Information


Gaze-contingent perceptually enabled interactions in the operating theatre.


ABSTRACT: Improved surgical outcome and patient safety in the operating theatre are constant challenges. We hypothesise that a framework that collects and utilises information, especially perceptually enabled information, from multiple sources could help to meet the above goals. This paper presents some core functionalities of a wider low-cost framework under development that allows perceptually enabled interaction within the surgical environment. The synergy of wearable eye-tracking and advanced computer vision methodologies, such as SLAM, is exploited. As a demonstration of one of the framework's possible functionalities, an articulated collaborative robotic arm and laser pointer are integrated, and the set-up is used to project the surgeon's fixation point in 3D space. The implementation is evaluated over 60 fixations on predefined targets, with distances between the subject and the targets of 92-212 cm and between the robot and the targets of 42-193 cm. The median overall system error is currently 3.98 cm. Its real-time potential is also highlighted. The work presented here represents an introduction and preliminary experimental validation of core functionalities of a larger framework under development. The proposed framework is geared towards a safer and more efficient surgical theatre.
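For illustration only (not code from the publication): a minimal sketch of the kind of computation such a gaze-contingent framework performs, back-projecting a 2D fixation point from the eye-tracker's scene camera into 3D world coordinates using a SLAM-estimated camera pose. All names, parameters, and values below are assumptions introduced for the example, not details taken from the paper.

import numpy as np

def gaze_to_world(gaze_px, depth_m, K, R_wc, t_wc):
    # gaze_px : (u, v) fixation point in the scene-camera image, in pixels
    # depth_m : estimated depth of the fixated surface along the gaze ray, in metres
    # K       : 3x3 camera intrinsic matrix
    # R_wc, t_wc : camera-to-world rotation (3x3) and translation (3,), e.g. from SLAM
    u, v = gaze_px
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray in the camera frame
    p_cam = depth_m * ray_cam / ray_cam[2]              # scale the ray so its z equals the depth
    return R_wc @ p_cam + t_wc                          # express the 3D point in the world frame

# Hypothetical usage: a fixation at the image centre, on a surface 1.5 m away
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
R_wc, t_wc = np.eye(3), np.zeros(3)
print(gaze_to_world((640, 360), 1.5, K, R_wc, t_wc))  # -> [0. 0. 1.5]

The resulting 3D fixation point could then, in principle, serve as a target for a robot-mounted laser pointer; that step is system-specific and is not sketched here.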

SUBMITTER: Kogkas AA 

PROVIDER: S-EPMC5509830 | biostudies-other | 2017 Jul

REPOSITORIES: biostudies-other


Publications

Gaze-contingent perceptually enabled interactions in the operating theatre.

Kogkas Alexandros A, Darzi Ara, Mylonas George P

International Journal of Computer Assisted Radiology and Surgery, 10 Apr 2017, issue 7



Similar Datasets

| S-EPMC6598771 | biostudies-literature
| S-EPMC5948240 | biostudies-literature
| S-EPMC4077667 | biostudies-literature
| S-EPMC9313363 | biostudies-literature
| S-EPMC3281887 | biostudies-literature
| S-EPMC5338519 | biostudies-literature
| S-EPMC7045871 | biostudies-literature
| S-EPMC5279905 | biostudies-other
| S-EPMC5070492 | biostudies-literature