
Summary

This article presents the design and implementation of an automatic surgery module built on augmented reality (AR)-based 3D reconstruction. The system enables remote surgery by allowing surgeons to inspect the reconstructed features and replicate surgical hand movements as if they were operating in close proximity.

Abstract

Augmented reality (AR) is in high demand in medical applications. The aim of this article is to provide automated surgery using AR for transcatheter aortic valve replacement (TAVR). TAVR is an alternative to open-heart surgery in which the damaged valve is replaced with a new one using a catheter. In existing models, remote guidance is given, but the surgery itself is not automated through AR. In this article, we deploy a spatially aligned camera connected to a motor to automate image capture in the surgical environment. The camera tracks 2D high-resolution images of the patient's heart along with the catheter testbed. The captured images are uploaded through a mobile app to a remote surgeon who is a cardiology expert, and 3D features are reconstructed from the tracked 2D images. The reconstruction is viewed in a HoloLens emulator on a laptop, where the surgeon can remotely inspect the 3D reconstructed features with additional transformations, such as rotation and scaling, enabled through hand gestures. The surgeon's guidance is transmitted back to the surgical environment to automate the process in real time: the catheter testbed in the surgical field is controlled by the hand gesture guidance of the remote surgeon. The developed prototype demonstrates the effectiveness of remote surgical guidance through AR.

Introduction

AR can superimpose a 3D model on a real-world environment. Technological development in AR has caused a paradigm shift in many fields, namely education1, medicine2, manufacturing3, and entertainment4. AR technology, together with ultra-reliable low-latency communication, plays an indispensable role in the medical field. From the learning stage of human anatomy to surgical guidance, each stage of learning can be visualized with AR-powered software5,6 and hardware. AR provides a crucial and reliable aid to the medical practitioner in a surgical environment7,8.

Aortic valve stenosis is the most common heart valve disease among mankind9, commonly rooted in poor eating habits and irregular daily routines. Its hallmark is a narrowing of the heart valve, followed by a reduction in blood flow; the heart is then overburdened in maintaining circulation. The problem therefore needs to be addressed before the heart is damaged. Owing to recent technological developments, the required surgery can also be performed using the TAVR procedure, adopted depending on the condition of the patient's heart and other organs. In TAVR10,11, a catheter is inserted to replace the damaged valve in the heart. However, positioning the catheter12 to replace the valve is tedious for the practitioner. This motivated us to design an automated surgery model based on AR13,14, which helps the surgeon position the valve precisely during the replacement process. Moreover, the surgery can be performed through a motion mapping algorithm that maps the surgeon's movements, captured at a remote location, onto the robotic arm.

In the existing work15,16,17, the TAVR18 procedure is monitored through fluoroscopy. This makes it difficult to analyze the heart valve and tedious to find the replacement location, setting up a barrier to positioning the catheter in the human heart. In addition, remote motion needs to be mapped to the surgical field to make the process automated. To overcome this research gap, we propose an automated, robot-based valve replacement surgery using AR-assisted technology.

The protocol is a generic model that can be applied to any surgical environment. In the first stage of the work, 2D images are captured all around the surgical environment at the fullest spatial resolution over the largest possible degree of freedom; that is, enough images are captured for 3D reconstruction19, followed by motion mapping through hand gesture tracking20.

Protocol

1. Surgical environment

  1. Design a surgery environment as shown in Figure 1. Make sure the environment has an object-carrying platter, a robotic arm, and two side-hanging arms, one to hold a camera placeholder and the other to have a consistent white background along with the weighing module for balance.
  2. Develop two drivers, one for the snapshot of the live surgical environment, as mentioned in steps 2.1 to 2.10, and the other to control the revolving mechanism that supports 360° surveillance, as mentioned in steps 3.1 to 3.4.
  3. Before implementing the above two modules, enable Bluetooth on the mobile device and on the laptop, which serves as the surgeon's HoloLens emulator.
  4. Pair the devices for uninterrupted image transmission.

2. Setting up the driver to control the two hanging arms

  1. Make sure that the hanging arms are controlled by a stepper motor with the arrangement as shown in Figure 2 for a flawless revolution of 360°.
  2. Connect the motor to the microcontroller board using the TB 6600 driver. To run the motor, install the microcontroller IDE from the browser.
  3. Click the Download button to download the software. Then, in the microcontroller IDE, go to File > Open a New Sketch to write the code.
  4. Connect the microcontroller board so that it interfaces with the new sketch through a dedicated connection port, say COM 4. Check the COM port and verify that it shows the microcontroller board.
  5. Check the hardware switch settings of the stepper motor driver TB 6600. Ensure that the settings are such that the current flow is 2 A, which can be attained by setting SW4 ON and SW5 and SW6 OFF.
  6. Ensure the switch positions of SW1, SW2, and SW3 are set so that the micro-step is 1/8 steps to attain the revolution steps as per the requirement. Ensure the settings are SW1 OFF, SW2 ON, and SW3 OFF in TB6600.
  7. Connect the RTC 3231 module to the microcontroller for real global time synchronization. Make sure that the revolution step size is 12° and that the motor step is incremented only when the real-time unit, i.e., the seconds value read from the RTC module, is an odd number.
  8. Connect the 5 V pin of the microcontroller board to the RTC VCC and the microcontroller's GND to the RTC's GND.
  9. Connect the SCL pin of RTC to the A0 pin and the SDA to the A1 pin of the microcontroller. This module can ensure a step size of 12°, making 30 steps in one revolution. Ensure that the step increment happens every odd second. Let this software module drive the stepper motor21.
  10. Verify the setup is working correctly by running the code, which is available on the GitHub page: https://github.com/Johnchristopherclement/Automatic_Surgery_model_using_AR.
  11. Download Android Studio to develop the automatic camera app. Ensure the system requirements are met, then download the software from the website.
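The odd-second stepping rule of steps 2.6-2.9 can be sketched as follows. This is an illustrative Python model of the timing logic only; the actual driver is the microcontroller code from the linked GitHub repository, and the helper names here are assumptions.

```python
# Model of the stepping schedule in steps 2.6-2.9: 12 deg per step,
# 30 steps per revolution, advancing only on odd-numbered RTC seconds.

STEP_DEG = 12                      # revolution step size (step 2.7)
STEPS_PER_REV = 360 // STEP_DEG    # = 30 steps per full revolution (step 2.9)

def should_step(rtc_seconds: int) -> bool:
    """Trigger a motor step only when the RTC seconds value is odd."""
    return rtc_seconds % 2 == 1

def angle_after(seconds_elapsed: int) -> int:
    """Camera-arm angle (deg) after RTC seconds 0..seconds_elapsed-1."""
    steps = sum(1 for s in range(seconds_elapsed) if should_step(s))
    return (steps * STEP_DEG) % 360
```

With one step every other second, a full 360° sweep of the surgical field takes 60 s, which bounds how often a fresh image set is available for reconstruction.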

3. Developing a driver for mobile-based scene surveillance and image transmission as a client module

  1. Develop a camera application for the Android operating system that takes a snapshot every 2 s, specifically when the seconds value is an odd number.
  2. Connect the mobile phone with the system. In Android Studio, click New > New Project and choose Empty Views Activity. Click on Next to develop an Android code, which is available at https://github.com/Johnchristopherclement/Automatic_Surgery_model_using_AR.
  3. Ensure that the app captures the images automatically and sends them to a remote device consistently.
  4. Transmit snapshots from the mobile app immediately after taking the snapshot to the paired device, i.e., to the remote surgeon's system, through Bluetooth.
    NOTE: Make sure the modules mentioned in sections 2 and 3 run in time synchronization, one for every even number of seconds and the other for every odd number of seconds.
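The time-sharing described in the NOTE can be sketched as a simple parity dispatcher. Since what matters is only that the motor driver and the camera app use opposite parities of the RTC seconds value, the parity assignment is kept as a parameter here; the function name is an assumption for illustration.

```python
# Sketch of the even/odd time-sharing between the two drivers: the motor
# steps on one parity of the RTC seconds value and the camera captures and
# transmits on the other, so no snapshot is taken while the arm is moving.

def scheduled_action(rtc_seconds: int, motor_parity: int = 1) -> str:
    """Return which module owns the given RTC second."""
    if rtc_seconds % 2 == motor_parity:
        return "step_motor"
    return "capture_and_send"
```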

4. Developing a client module to receive and handle surveillance images

  1. Open the server module, which is a graphical user interface.
  2. Enter the UUID in the text field UUID, whose default value is 94f39d29-7d6d-437d-973b-fba39e49d4ee.
  3. Click on Create Socket to create and bind the socket. Click on Connect to establish a connection with the mobile app.
  4. Click on Capture to capture and save the scene surveillance images in the local folder
  5. Enter the local folder name in the field folder name if it needs to be other than the default one mentioned.

5. Operating the robotic arm

  1. Let the client module include a robotic arm as well. Design the arm to have rotational movement at its base, shoulder, elbow, wrist, and fingers.
  2. Make sure that MG996R servos are used to govern the rotational movement at the base, shoulder, and elbow. Ensure that SG90 servo motors are used to control the rotational movement at the wrist and fingers.
  3. Compile the code given in https://github.com/Johnchristopherclement/Automatic_Surgery_model_using_AR in the microcontroller IDE to drive the robotic arm based on the commands received from the remote surgeon.

6. 3D reconstruction for augmented reality

  1. Read two images at a time, in sequence, from the local folder to obtain the overlap between them (as the images are collected in close proximity, consecutive images will overlap).
  2. Design a tertiary filter as per the requirement of the Directional Intensified Feature Description using the Tertiary Filtering22 (DITF) algorithm to obtain the gradient and orientation.
  3. Extract the features using the DITF method22, as shown in Figure 3.
  4. Reconstruct 3D images from the collected features using SFM23.
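Step 6.1 can be sketched as iterating over the captured images two at a time, so each overlapping pair can be fed to the feature matcher before SfM. The feature description itself (DITF22 in the article) is left as a placeholder; the function name and file pattern are assumptions.

```python
# Yield consecutive (overlapping) image pairs from the surveillance folder
# in capture order, ready for pairwise feature matching and SfM.
from pathlib import Path
from typing import Iterator, Tuple

def consecutive_pairs(folder: str, pattern: str = "*.jpg") -> Iterator[Tuple[Path, Path]]:
    """Pair image i with image i+1; close-proximity captures overlap."""
    files = sorted(Path(folder).glob(pattern))
    for left, right in zip(files, files[1:]):
        yield (left, right)
```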

7. Hand gesture recognition at the surgeon's location

  1. Facilitate the surgeon's inspection of the 3D reconstructed features by enabling them to visualize the environment from all perspectives through hand gesture-based rotation and zoom in/out of the reconstructed features.
  2. Normalize and map the distance between the tip of the surgeon's thumb and the index finger of the right hand into a corresponding angle of rotation. Let the normalization be in such a way that the bare minimum distance corresponds to 0° and the maximum to 180°.
  3. Transmit the hand gesture control, through Bluetooth, to the remote surgery environment as well for the rotation of the object platter, which makes it revolve on its axis as the 3D reconstructed features revolve at the surgeon's end.
  4. Find the distance between the tips of the thumb and index finger of the surgeon's left hand to control the movement of the fingers of the robot arm.
  5. Measure the elevation angle from the spatial orientation of the line between the tips of the thumb and index finger of the surgeon's left hand with respect to an imaginary x-y ground plane. Map this into the elevation angle that the robot arm makes with the x-y plane.
  6. Find the azimuth angle that the hand of the surgeon makes with that of the virtual y-z plane. Identify these angles through hand gesture-based recognition.
  7. Map the distance to the robot's finger movement, and the elevation and azimuth angles to the corresponding arm rotations.
  8. Let the surgeon inspect the reconstructed features by zooming and rotating. Let the surgeon transmit commands to the robot arm to perform surgery from a remote location.
  9. Make sure that the surgery commands are transmitted as a control string that starts with an agreed header sequence, followed by the values controlling the platter rotation and the robotic arm. Let [θb, θs, θe, θw, θf] be the angle vector whose entries are the control signals for the base, shoulder, elbow, wrist, and finger of the robot arm.
    NOTE: The GitHub link provides the code to enable hand gesture control in the surgical field. https://github.com/Johnchristopherclement/Automatic_Surgery_model_using_AR.
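Steps 7.2 and 7.9 can be sketched as two small functions: normalizing the thumb-index fingertip distance into a 0-180° angle, and packing the platter angle plus the [θb, θs, θe, θw, θf] vector into the control string. The "CTRL" header and comma layout are assumptions for illustration; the article only specifies a header followed by the control values, and the exact format is in the linked GitHub code.

```python
# Gesture normalization (step 7.2) and control-string packing (step 7.9).

def distance_to_angle(d: float, d_min: float, d_max: float) -> float:
    """Map fingertip distance d in [d_min, d_max] onto 0..180 deg, clamped."""
    d = max(d_min, min(d_max, d))
    return (d - d_min) / (d_max - d_min) * 180.0

def control_string(platter_deg: float, joints: list) -> str:
    """Header + platter angle + [theta_b, theta_s, theta_e, theta_w, theta_f]."""
    assert len(joints) == 5, "expects base, shoulder, elbow, wrist, finger"
    values = [platter_deg, *joints]
    return "CTRL," + ",".join(f"{v:.1f}" for v in values)
```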

Results

The protocol was tested with a heart phantom model. Figure 2 shows the expected setup for live surveillance of the surgical field with the help of spatially distributed cameras. The distributed cameras, as shown in Figure 2, help increase the spatial resolution of the field for effective 3D reconstruction. However, physically placing cameras at these various spatial locations is complex. So, we have optimized the setup design a...

Discussion

In an existing work15, X-ray and CT scans are examined to locate the catheter in the heart. However, AR-assisted TAVR replacement establishes a new possibility in the TAVR18 surgical procedure through the implementation of an automated model using 3D reconstruction. As mentioned in the protocol section, this work has five design stages. The first stage, DITF22, mentioned in section 6 and proposed in our previous work22, is enhanced in...

Disclosures

The authors declare no conflicts of interest.

Acknowledgements

The authors acknowledge no funding for this research.

Materials

Name | Company | Catalog Number | Comments
--- | --- | --- | ---
Android IDE | software | https://developer.android.com/studio | Software can be downloaded from this link
Arduino board | Arduino Uno | Arduino Uno | Microcontroller for processing
Arduino software | software | https://www.arduino.cc/en/software | Software can be downloaded from this link
Human heart phantom model | Biology Lab Equipment Manufacturer and Exporter | B071YBLX2V (8B-ZB2Q-H3MS-1) | Lightweight model with 3 parts for deep analysis of the heart
Mobile holder | Humble universal monopod holder | B07S9KNGVS | To carry the mobile device in the surgical field
PyCharm IDE | software | https://www.jetbrains.com/pycharm/ | Software can be downloaded from this link
Robot arm | Printed-bots | B08R2JLKYM (P0-E2UT-JSOU) | Arm can be controlled through a control signal; it has 5 degrees of freedom
Servo motor | Kollmorgen Co-Engineers Motors | MG-996R | High-torque servo motor; servo pulses ranging from 500 to 2500 µs, with a frequency of 50 Hz to 333 Hz
Servo motor | Kollmorgen Co-Engineers Motors | SG-90R | 1.8 kg-cm to 2.5 kg-cm load can be applied to the SG-90R servo
Stepper motor | 28BYJ-48 | 28BYJ-48 | Stepper motor, 5 V DC, 100 Hz frequency, torque 1200 gf·cm
Stepper motor | Nema 23 | Nema | Stepper motor, 9-42 V DC, 100 Hz frequency

References

  1. Wang, L. J., Casto, B., Reyes-Molyneux, N., Chance, W. W., Wang, S. J. Smartphone-based augmented reality patient education in radiation oncology. Tech Innov Patient Supp Radiation Oncol. 29, 100229 (2024).
  2. Guerroudji, M. A., Amara, K., Lichouri, M., Zenati, N., Masmoudi, M. A 3D visualization-based augmented reality application for brain tumor segmentation. Comput Anim Virtual Worlds. 35 (1), e2223 (2024).
  3. Xia, L., et al. Augmented reality and indoor positioning based mobile production monitoring system to support workers with human-in-the-loop. Robotic Comput Integ Manufac. 86, 102664 (2024).
  4. Preece, C., Skandalis, A. Time to imagine an escape: investigating the consumer timework at play in augmented reality. Eur J Market. 58, 92-118 (2024).
  5. Suresh, D., Aydin, A., James, S., Ahmed, K., Dasgupta, P. The role of augmented reality in surgical training: a systematic review. Surg Innov. 30, 366-382 (2023).
  6. Moreta-Martinez, R., et al. Combining augmented reality and 3D printing to display patient models on a smartphone. J Vis Exp. (155), 60618 (2020).
  7. Ma, L., Huang, T., Wang, J., Liao, H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol. 68, 04TR02 (2023).
  8. Hofman, J., et al. First-in-human real-time ai-assisted instrument deocclusion during augmented reality robotic surgery. Healthc Technol Lett. 11 (2-3), 33-39 (2023).
  9. Thiene, G., Rizzo, S., Basso, C. Bicuspid aortic valve: The most frequent and not so benign congenital heart disease. Cardiovasc Pathol. 70, 107604 (2024).
  10. Mack, M. J., et al. Transcatheter aortic-valve replacement in low-risk patients at five years. New Engl J Med. 389, 1949-1960 (2023).
  11. Vitanova, K., et al. Aortic valve versus root surgery after failed transcatheter aortic valve replacement. J Thorac Cardiovas Surg. 166, 1418-1430 (2023).
  12. Bydlon, T. M., Torjesen, A., Fokkenrood, S., Di Tullio, A., Flexman, M. L. 3D visualisation of navigation catheters for endovascular procedures using a 3D hub and fiber optic realshape technology: phantom study results. EJVES Vascular Forum. 59, 24-30 (2023).
  13. Faris, H., Harfouche, C., Bandle, J., Wisbach, G. Surgical tele-mentoring using a robotic platform: initial experience in a military institution. Surg Endosc. 37, 9159-9166 (2023).
  14. Fitzgerald, L., et al. Mentoring approaches in a safe surgery program in tanzania: Lessons learned during covid-19 and recommendations for the future. Surg Open Sci. 14, 109-113 (2023).
  15. de Turenne, A., Eugène, F., Blanc, R., Szewczyk, J., Haigron, P. Catheter navigation support for mechanical thrombectomy guidance: 3d/2d multimodal catheter-based registration with no contrast dye fluoroscopy. Int J Comput Assist Radiol Surg. 19 (3), 459-468 (2023).
  16. Tang, H., et al. A multiple catheter tips tracking method in x-ray fluoroscopy images by a new lightweight segmentation network and bayesian filtering. Int J Med Robotics Comput Assist Surg. 19, e2569 (2023).
  17. Annabestani, M., Caprio, A., Wong, S. C., Mosadegh, B. A machine learning-based roll angle prediction for intracardiac echocardiography catheter during bi-plane fluoroscopy. Appl Sci. 13, 3483 (2023).
  18. Thourani, V. H., et al. Survival after surgical aortic valve replacement in low-risk patients: a contemporary trial benchmark. Ann Thorac Surg. 117, 106-112 (2024).
  19. Domínguez-Velasco, C. F., et al. Augmented reality simulation as training model of ventricular puncture: Evidence in the improvement of the quality of punctures. Int J Med Robotics Comput Assist Surg. 19, e2529 (2023).
  20. Wang, Q., Xie, Z. Arias: An ar-based interactive advertising system. Plos One. 18, e0285838 (2023).
  21. Ji, F., Chong, F., Wang, F., Bai, D. Augmented reality platform for the unmanned mining process in underground mines. Mining, Metal Explor. 39 (2), 385-395 (2022).
  22. Indhumathi, S., Clement, J. C. Directional intensified feature description using tertiary filtering for augmented reality tracking. Sci Rep. 13, 20311 (2023).
  23. Gao, L., Zhao, Y., Han, J., Liu, H. Research on multi-view 3D reconstruction technology based on SFM. Sensors. 22 (12), 4366 (2022).
  24. Suárez, I., Sfeir, G., Buenaposada, J. M., Baumela, L. BEBLID: Boosted efficient binary local image descriptor. Pattern Recog Lett. 133, 366-372 (2020).
  25. Borman, R. I., Harjoko, A. Improved ORB algorithm through feature point optimization and Gaussian pyramid. Int J Adv Comp Sci Appl. 15 (2), 268-275 (2024).

Keywords: Automatic Surgery, Transcatheter Aortic Valve Replacement, Augmented Reality, Remote Surgery, Robotics, 3D Reconstruction, Image Extraction, Surgical Precision, Real-time Synchronization, Telemedicine, Robotic Control Systems, Mobile Camera, Low-latency Communication, Catheter-based Procedure

Copyright © 2025 MyJoVE Corporation. All rights reserved.