A Modular Edge Device Network for Surgery Digitalization
Vincent Schorp¹, Frédéric Giraud¹, Gianluca Pargätzi², Michael Wäspe², Lorenzo von Ritter-Zahony², Marcel Wegmann², John Garcia Henao¹, Dominique Cachin², Sebastiano Caprara¹, Philipp Fürnstahl¹, Fabio Carrillo¹

¹ Balgrist University Hospital, University of Zurich, Zurich, Switzerland
² Institute of Embedded Systems, ZHAW School of Engineering, Winterthur, Switzerland
Contact: vincent.schorp@balgrist.ch

INTRODUCTION

Future surgical care will increasingly rely on collaboration between caregivers, patients, technology, and information systems, driven by data science. The operating room (OR) of the future will be fully interconnected, providing surgical teams with real-time information for improved decision-making and efficiency.
However, the integration of machine learning in interventional medicine remains slow due to the limited digitization and standardization of patient data [maier-hein_surgical_2017]. To tackle these challenges, automated multimodal data acquisition and processing promise to improve care, enabling surgical digital twins, foundation models, and clinical diagnostics that enhance surgeons' capabilities. Such data will also empower medical robots and enable remote participation, learning, and eventually teleoperation. However, achieving real-time, synchronized data capture from the devices used in the OR remains challenging due to diverse hardware interfaces and protocols, while extensive cabling in multi-device setups compromises OR sterility and ergonomics.

Recent projects at OR-X [orx_min], a translational hub for surgical research, highlight these challenges. A surgery digitalization solution by Hein et al. [Hein_2024_CVPR_min] required the interconnection of numerous cameras, resulting in a convoluted setup. A robotic ultrasound (US) scanning method proposed by Cavalcanti et al. [Cavalcanti_2024_min] emphasized the need to integrate robots, US scanners, cameras, and tracking devices for adaptive scanning and real-time reconstruction.

Figure 1: The Data Hub Network. Left: the Data Hub (DH) with all ports. Right: the experimental setup comprises a patient-side DH connected to a US scanner, a pose tracking device, and an RGB-D camera; a supervisor DH to configure and monitor the experiment; and a high-performance computer for data storage and processing. The network can be extended to a larger number of patient-side DHs. Real-time transfer of the synchronized data is achieved over a ROS 2 network.

We propose a computer network composed of edge devices placed in the OR, named Data Hubs (DHs), connected via optical fiber over a network switch, as shown in Fig. 1. DHs enable seamless integration of diverse medical devices, sensors, and robotic systems, ensuring synchronized, multimodal data acquisition and real-time processing for machine learning applications. A high-performance computer (NVIDIA DGX) provides state-of-the-art data processing capabilities. Strategically distributing DHs within the OR allows flexible configurations, while a user interface provides centralized control over all devices and data streams.
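To illustrate what synchronized, multimodal acquisition over a ROS 2 network can look like on the consumer side, the sketch below time-aligns the three sensor streams of the experimental setup with message_filters. It is a minimal sketch under our own assumptions: the topic names, message types, and the 10 ms matching window are hypothetical and not taken from the paper.

```python
# Minimal sketch: time-aligned consumption of three sensor streams over ROS 2.
# Topic names, message types, and the matching window are assumptions.
import rclpy
from rclpy.node import Node
from message_filters import Subscriber, ApproximateTimeSynchronizer
from sensor_msgs.msg import Image
from geometry_msgs.msg import PoseStamped


class MultimodalSync(Node):
    def __init__(self):
        super().__init__('multimodal_sync')
        us = Subscriber(self, Image, '/aixplorer/us_image')              # HDMI-grabbed US frames
        rgb = Subscriber(self, Image, '/realsense/color/image_raw')      # RGB-D camera color stream
        pose = Subscriber(self, PoseStamped, '/fusiontrack/probe_pose')  # tracked probe pose
        # Match messages whose timestamps fall within a 10 ms window.
        self.sync = ApproximateTimeSynchronizer([us, rgb, pose], queue_size=30, slop=0.01)
        self.sync.registerCallback(self.on_synced)

    def on_synced(self, us_msg, rgb_msg, pose_msg):
        # A downstream consumer (recorder, reconstruction node, ...) would be called here.
        self.get_logger().info(
            f'synced frame at {us_msg.header.stamp.sec}.{us_msg.header.stamp.nanosec:09d}')


def main():
    rclpy.init()
    rclpy.spin(MultimodalSync())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```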
To validate our solution, we tested the edge device network in an ongoing project on US-based 3D reconstruction of anatomies. This use case is a representative and challenging example from a clinical OR, involving medical imaging, pose tracking, and RGB-D imaging. We demonstrate that the system can simultaneously collect and store high-resolution, high-frame-rate data from all devices while providing an intuitive user interface for experiment configuration and monitoring. The contributions of this work include:

• A modular high-performance computer network enabling high-bandwidth data acquisition and real-time processing.
• An edge device capable of interfacing with multimodal sensors and robotic systems for medical applications.
• An intuitive software solution for managing a network of sensors and devices.
• Experimental validation in a realistic surgical environment, showcasing the system's utility in complex procedures.

MATERIALS AND METHODS

Data Hub Hardware

The DH is an edge computing device tailored for surgical environments, with multimodal bidirectional ports and embedded computing capabilities. It is based on an NVIDIA Jetson Orin NX featuring an NVIDIA Ampere GPU and is augmented with an Intel E810 network adapter to enable high-throughput optical fiber communication. The hardware includes two HDMI input ports, one HDMI output port, one Ethernet port, two optical fiber interfaces, two USB-C 3.2 ports, and two USB-A 3.2 ports, as shown in Fig. 1. While a network of DHs can operate independently, an NVIDIA DGX computer equipped with eight A100 GPUs was added to enable state-of-the-art data processing and large-scale storage.
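To put the port selection and the optical-fiber uplink in perspective, a back-of-the-envelope estimate of the raw sensor bandwidth of one DH is useful; it also motivates the GPU-accelerated compression described in the next subsection. The pixel-format assumptions below (24-bit RGB, 16-bit depth) are ours, and the nominal resolutions and frame rates follow Table 1.

```python
# Back-of-the-envelope raw bandwidth of the sensors attached to one Data Hub.
# Pixel formats are assumptions (24-bit RGB, 16-bit depth); actual wire formats may differ.

def stream_mbps(width, height, bytes_per_px, fps):
    """Uncompressed data rate of a video stream in Mbit/s."""
    return width * height * bytes_per_px * fps * 8 / 1e6

streams = {
    'US scanner (HDMI, 1080p RGB @ 60 FPS)':    stream_mbps(1920, 1080, 3, 60),  # ~3.0 Gbit/s
    'RealSense color (720p RGB @ 30 FPS)':      stream_mbps(1280, 720, 3, 30),   # ~0.66 Gbit/s
    'RealSense depth (720p 16-bit @ 30 FPS)':   stream_mbps(1280, 720, 2, 30),   # ~0.44 Gbit/s
    'FusionTrack poses (~200 Hz, negligible)':  0.1,   # a pose is a few dozen bytes
}

for name, mbps in streams.items():
    print(f'{name:42s} {mbps:8.1f} Mbit/s')
print(f'{"total (uncompressed)":42s} {sum(streams.values()):8.1f} Mbit/s')
```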
Data Hub Software

The software architecture of the DH is built on the Isaac ROS framework [isaac_ros], which standardizes communication with the various devices. Each supported device is encapsulated within a Docker image containing the necessary drivers, SDKs, and a ROS 2 [ros2] package. Currently, the DH supports multiple devices, including ZED Mini and Intel RealSense cameras (via USB-C/USB-A), the Atracsys FusionTrack (via Ethernet), the KUKA LBR Medical Robot (via Ethernet), and Aixplorer ultrasound systems (via HDMI input). To support high-throughput data transfer, GPU-enabled H.264 image compression and decompression modules have been integrated into the system.

To facilitate the deployment of complex experimental setups, one DH is designated as the central control unit. An adaptable user interface serves as a single entry point for convenient device configuration, real-time control, and monitoring. The interface is implemented with Streamlit [streamlit_min] and Docker.
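As a concrete illustration of the per-device packaging, the sketch below shows what a minimal ROS 2 wrapper for a frame-grabbed video source might look like; it would run inside that device's Docker container. The device path, topic name, OpenCV-based capture, and JPEG encoding are assumptions for the sake of a self-contained example; the actual system relies on Isaac ROS pipelines with GPU-accelerated H.264 compression, which is not reproduced here.

```python
# Minimal sketch of a per-device ROS 2 wrapper node, as it might run inside a
# device-specific Docker container on a Data Hub. Capture via OpenCV and the
# video device index are assumptions; the real system uses Isaac ROS pipelines
# with GPU-accelerated H.264 compression rather than CPU JPEG.
import cv2
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import CompressedImage


class HdmiGrabberNode(Node):
    def __init__(self):
        super().__init__('hdmi_grabber')
        self.pub = self.create_publisher(CompressedImage, 'us/image/compressed', 10)
        self.cap = cv2.VideoCapture(0)                          # HDMI grabber exposed as a V4L2 device
        self.timer = self.create_timer(1.0 / 60.0, self.grab)   # target 60 FPS

    def grab(self):
        ok, frame = self.cap.read()
        if not ok:
            return
        msg = CompressedImage()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.format = 'jpeg'                                      # placeholder codec for this sketch
        msg.data = cv2.imencode('.jpg', frame)[1].tobytes()
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(HdmiGrabberNode())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```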
Experimental Setup

The DH network is evaluated in a US scanning experiment, depicted in Fig. 1, on an ex-vivo human spine anatomy (BASEC No. 2023-01652, approved by the Ethics Committee of the Canton of Zurich). The data collection leverages a SuperSonic Aixplorer Ultimate handheld US scanner, an Atracsys FusionTrack 500 visual tracking device, and an Intel RealSense D405 RGB-D camera. The procedure aims to collect the raw data required to reconstruct the anatomy from the US images and a mesh of the visible surface from the RGB-D data. A network of two DHs was used in this procedure. The patient-side DH connects the three sensors mentioned above, which feature HDMI, Ethernet, and USB-C interfaces, respectively. The monitoring DH is used to configure, start, stop, and monitor the data recording procedure. During acquisition, the data is stored in real time on the NVIDIA DGX storage, where it is post-processed later.

RESULTS

The proposed network of edge devices allows recording the data from all three sensors at their maximum frame rate and resolution; the results are summarized in Table 1. The scan of the spine took a total of 78 seconds, during which 4622 US images, 2283 images each for RGB and depth, and 12434 3D poses were acquired, amounting to a total of 6.2 GB.

Table 1: Frame rates and resolutions achieved during the experiment.

| Device                   | Connector | Data Format and FPS   |
| Aixplorer US scanner     | HDMI      | RGB 1080p @ 60.2 FPS  |
| Atracsys FusionTrack 500 | Ethernet  | 3D pose @ 200.8 FPS   |
| Intel RealSense D405     | USB-C     | RGB 720p @ 29.6 FPS   |
| Intel RealSense D405     | USB-C     | Depth 720p @ 29.6 FPS |
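To relate the recorded volume to the scaling questions discussed below, the reported totals imply an average stored data rate of roughly 6.2 GB over 78 s, i.e. about 80 MB/s for one patient-side DH. The short script below makes that arithmetic explicit and extrapolates it, under our own assumption that each additional DH carries a similar sensor load.

```python
# Average stored data rate implied by the reported acquisition (6.2 GB in 78 s)
# and a naive extrapolation to more patient-side Data Hubs. The assumption that
# every additional DH carries a similar sensor load is ours, not the paper's.
DURATION_S = 78
TOTAL_GB = 6.2

rate_mb_s = TOTAL_GB * 1000 / DURATION_S   # ~79 MB/s for one DH
print(f'one DH: {rate_mb_s:.0f} MB/s, {rate_mb_s * 3600 / 1000:.0f} GB per hour')

for n_hubs in (2, 4, 8):
    print(f'{n_hubs} DHs: {n_hubs * rate_mb_s:.0f} MB/s, '
          f'{n_hubs * rate_mb_s * 3600 / 1000:.0f} GB per hour')
```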
id="Sx4.p1.1">The experiment shows that the proposed network is capable of acquiring the highest possible resolution and frame rate of the employed sensors. The DH allows an efficient setup of the experiment that would have been cluttered otherwise, as demonstrated in the study by <span class="ltx_ERROR undefined" id="Sx4.p1.1.1">\textcite</span>Cavalcanti_2024_min, which was reproduced here. The user interface allows the straightforward configuration and monitoring of the experiment. The data is stored on the DGX where it is made available for off-line processing and machine learning training.</p> </div> <div class="ltx_para ltx_noindent" id="Sx4.p2"> <p class="ltx_p" id="Sx4.p2.1">The growing demand for diverse and larger datasets highlights the necessity for scaling up our setup and expanding the number of DHs used in the OR. This comes with a requirement for higher data throughput and efficient monitoring of many data streams simultaneously. While our network meets the required specifications, this remains to be tested.</p> </div> <div class="ltx_para ltx_noindent" id="Sx4.p3"> <p class="ltx_p" id="Sx4.p3.1">We demonstrate the capabilities of our system using three sensing devices. To enhance compatibility with a broader range of experimental setups, we aim to increase the number of supported devices. This includes exploring use cases with medical robots, which demand real-time closed-loop control.</p> </div> <div class="ltx_para ltx_noindent" id="Sx4.p4"> <p class="ltx_p" id="Sx4.p4.1">This system will be further developed to form the backbone of our ongoing effort to digitalize surgeries. In the future, DHs will be deployed across the OR as distributed connected devices, eventually enabling real-time inference, remote participation, and teleoperation.</p> </div> <div class="ltx_para ltx_noindent" id="Sx4.p5"> <span class="ltx_ERROR undefined" id="Sx4.p5.1">\printbibliography</span> </div> <div class="ltx_pagination ltx_role_newpage"></div> </section> </article> </div> <footer class="ltx_page_footer"> <div class="ltx_page_logo">Generated on Tue Mar 18 08:47:37 2025 by <a class="ltx_LaTeXML_logo" href="http://dlmf.nist.gov/LaTeXML/"><span style="letter-spacing:-0.2em; margin-right:0.1em;">L<span class="ltx_font_smallcaps" style="position:relative; bottom:2.2pt;">a</span>T<span class="ltx_font_smallcaps" style="font-size:120%;position:relative; bottom:-0.2ex;">e</span></span><span style="font-size:90%; position:relative; bottom:-0.2ex;">XML</span><img alt="Mascot Sammy" src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAsAAAAOCAYAAAD5YeaVAAAAAXNSR0IArs4c6QAAAAZiS0dEAP8A/wD/oL2nkwAAAAlwSFlzAAALEwAACxMBAJqcGAAAAAd0SU1FB9wKExQZLWTEaOUAAAAddEVYdENvbW1lbnQAQ3JlYXRlZCB3aXRoIFRoZSBHSU1Q72QlbgAAAdpJREFUKM9tkL+L2nAARz9fPZNCKFapUn8kyI0e4iRHSR1Kb8ng0lJw6FYHFwv2LwhOpcWxTjeUunYqOmqd6hEoRDhtDWdA8ApRYsSUCDHNt5ul13vz4w0vWCgUnnEc975arX6ORqN3VqtVZbfbTQC4uEHANM3jSqXymFI6yWazP2KxWAXAL9zCUa1Wy2tXVxheKA9YNoR8Pt+aTqe4FVVVvz05O6MBhqUIBGk8Hn8HAOVy+T+XLJfLS4ZhTiRJgqIoVBRFIoric47jPnmeB1mW/9rr9ZpSSn3Lsmir1fJZlqWlUonKsvwWwD8ymc/nXwVBeLjf7xEKhdBut9Hr9WgmkyGEkJwsy5eHG5vN5g0AKIoCAEgkEkin0wQAfN9/cXPdheu6P33fBwB4ngcAcByHJpPJl+fn54mD3Gg0NrquXxeLRQAAwzAYj8cwTZPwPH9/sVg8PXweDAauqqr2cDjEer1GJBLBZDJBs9mE4zjwfZ85lAGg2+06hmGgXq+j3+/DsixYlgVN03a9Xu8jgCNCyIegIAgx13Vfd7vdu+FweG8YRkjXdWy329+dTgeSJD3ieZ7RNO0VAXAPwDEAO5VKndi2fWrb9jWl9Esul6PZbDY9Go1OZ7PZ9z/lyuD3OozU2wAAAABJRU5ErkJggg=="/></a> </div></footer> </div> </body> </html>