Publications of Bruno Josue Marques

HAL publications by Marques, affiliated with the SHACRA and MIMESIS teams

2016

Journal articles

Title
Robust Augmented Reality registration method for Localization of Solid Organs’ Tumors Using CT-derived Virtual Biomechanical Model and Fluorescent Fiducials
Authors
Seong-Ho Kong, Nazim Haouchine, Renato Soares, Andrey S Klymchenko, Bohdan Andreiuk, Bruno Marques, Galyna Shabat, Thierry Piéchaud, Michele Diana, Stéphane Cotin, Jacques Marescaux
Reference
Surgical Endoscopy, Springer Verlag (Germany), 2016. DOI: 10.1007/s00464-016-5297-8
Abstract
Accurate localization of solid organ tumors is crucial to ensure both radicality and preservation of organ function. Augmented Reality (AR) is the fusion of computer-generated and real-time images. AR can be used in surgery as a navigation tool by creating a patient-specific virtual model through 3D software manipulation of DICOM imaging (e.g. CT scans). The virtual model can be superimposed onto the real-time images to obtain enhanced real-time localization. However, the 3D virtual model is rigid and does not take the deformation of inner structures into account. We present the concept of an automated navigation system that enables transparent visualization of internal anatomy and tumor margins while the organs undergo deformation during breathing or surgical manipulation.
Full text access and BibTeX
https://hal.archives-ouvertes.fr/hal-01314963/file/surg-endosc.pdf
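
As an illustration of how a pre-operative virtual model can be superimposed onto the live view, the short Python sketch below projects vertices of a CT-derived 3D model into the camera image using a standard pinhole model. It is not the registration method of the paper: the intrinsics K, the pose (R, t) and all variable names are assumptions made for this example, and in practice the pose would come from the fiducial-based registration described above.

# Illustrative sketch only (not the paper's method): projecting vertices of a
# CT-derived 3D model into the camera image so they can be superimposed on the
# live view. Camera intrinsics K and the model-to-camera pose (R, t) are
# assumed to be known; all names are hypothetical.
import numpy as np

def project_model(vertices, K, R, t):
    """Project N x 3 model vertices (model coordinates, meters) to pixel
    coordinates, given intrinsics K (3 x 3) and the model-to-camera pose R, t."""
    cam = vertices @ R.T + t          # transform into the camera frame
    uvw = cam @ K.T                   # apply the pinhole intrinsics
    return uvw[:, :2] / uvw[:, 2:3]   # perspective divide -> (u, v) pixels

# Toy usage: a 640 x 480 camera looking down the z-axis at three vertices
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 0.3])   # model 30 cm in front of the camera
verts = np.array([[0.0, 0.0, 0.0], [0.01, 0.0, 0.0], [0.0, 0.02, 0.0]])
print(project_model(verts, K, R, t))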

2015

Conference papers

Title
Framework for augmented reality in Minimally Invasive laparoscopic surgery
Authors
Bruno Marques, Rosalie Plantefeve, Frédérick Roy, Nazim Haouchine, Emmanuel Jeanvoine, Igor Peterlik, Stéphane Cotin
Reference
2015 17th International Conference on E-health Networking, Application & Services (HealthCom 2015), Oct 2015, Boston, United States. DOI: 10.1109/HealthCom.2015.7454467
Abstract
This article presents a framework for fusing pre-operative and intra-operative data for surgery guidance. The framework is employed in the context of Minimally Invasive Surgery (MIS) of the liver. From stereoscopic images, a three-dimensional point cloud is reconstructed in real time. This point cloud is then used to register a patient-specific biomechanical model, derived from Computed Tomography images, onto the laparoscopic view. In this way, internal structures such as vessels and tumors can be visualized to help the surgeon during the procedure. This is particularly relevant since abdominal organs undergo large deformations in the course of the surgery, making it difficult for surgeons to correlate the laparoscopic view with the pre-operative images. Our method has the potential to reduce the duration of the operation, as the biomechanical model makes it possible to estimate the in-depth position of tumors and vessels at any time during the surgery, which is essential to the surgical decision process. Results show that our method can be successfully applied during a laparoscopic procedure without interfering with the surgical workflow.
Full text access and BibTeX
https://hal.inria.fr/hal-01315574/file/article.pdf
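
The abstract above describes registering a CT-derived model onto a point cloud reconstructed from the stereoscopic view. As a much-simplified sketch of that registration step, the Python snippet below rigidly aligns two corresponding point sets with the Kabsch (SVD) algorithm; the actual framework registers a deformable, patient-specific biomechanical model, and all function and variable names here are hypothetical.

# Simplified stand-in for the registration step (rigid only, with known point
# correspondences); the paper's framework uses a deformable biomechanical model.
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid transform (R, t) mapping source points onto target
    points; both are N x 3 arrays of corresponding points (Kabsch algorithm)."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # avoid reflections: det(R) = +1
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Toy usage: recover a known rotation/translation applied to a synthetic cloud
rng = np.random.default_rng(0)
model_points = rng.random((100, 3))             # points on the pre-operative model
theta = 0.3
true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
observed = model_points @ true_R.T + np.array([0.10, -0.20, 0.05])
R, t = rigid_register(model_points, observed)
print(np.allclose(model_points @ R.T + t, observed))   # True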
Title
Improving depth perception during surgical augmented reality
Authors
Bruno Marques, Nazim Haouchine, Rosalie Plantefeve, Stephane Cotin
Reference
SIGGRAPH 2015 [Poster], Aug 2015, Los Angeles, United States. Article No. 24. DOI: 10.1145/2787626.2792654
Abstract
This study presents a method to compensate for the loss of depth perception when overlaying organ vessels and tumors for surgeons. The method relies on a combination of a contour rendering technique and adaptive alpha blending to convey the depth of vessels and tumors effectively. In addition, the technique is designed to run in real time to satisfy the requirements of clinical routine, and has been tested on real human surgery.
Full text access and BibTeX
https://hal.inria.fr/hal-01191101/file/template.pdf
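
To give a rough idea of the depth cue described above, the Python sketch below blends an internal-structure overlay onto a video frame with an alpha value that fades with depth, plus a simple gradient-based contour term. It is only an illustrative approximation under assumed parameters (the fade range, contour gain and all names are hypothetical), not the renderer used in the poster.

# Minimal sketch (not the poster's GPU renderer): depth-adaptive alpha blending
# of an internal-structure overlay onto a laparoscopic frame. Alpha fades with
# the structure's depth below the organ surface, and a gradient-based contour
# term keeps silhouettes visible. All array names and constants are hypothetical.
import numpy as np

def depth_adaptive_overlay(frame, overlay_rgb, overlay_depth, max_depth=0.05):
    """Blend overlay_rgb (H x W x 3) onto frame (H x W x 3, floats in [0, 1]).
    overlay_depth (H x W) is the depth of the structure below the organ surface
    in meters; np.inf marks pixels with no structure."""
    visible = np.isfinite(overlay_depth)
    # Alpha decreases linearly from 1 (at the surface) to 0 (at max_depth)
    alpha = np.clip(1.0 - overlay_depth / max_depth, 0.0, 1.0)
    alpha[~visible] = 0.0
    # Contour emphasis: boost alpha where the depth map has strong gradients
    d = np.where(visible, overlay_depth, 0.0)
    gy, gx = np.gradient(d)
    contour = np.clip(np.hypot(gx, gy) * 50.0, 0.0, 1.0)
    alpha = np.clip(alpha + contour * visible, 0.0, 1.0)
    return (1.0 - alpha[..., None]) * frame + alpha[..., None] * overlay_rgb

# Toy usage on synthetic data
h, w = 4, 6
frame = np.full((h, w, 3), 0.5)
overlay = np.zeros((h, w, 3)); overlay[..., 0] = 1.0       # red "vessel" layer
depth = np.full((h, w), np.inf); depth[1:3, 2:4] = 0.01    # structure 1 cm deep
print(depth_adaptive_overlay(frame, overlay, depth).shape)  # (4, 6, 3)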