Our research focuses on revolutionizing remote surgery through augmented reality and robotics, aiming to enhance precision and accessibility in medical procedures. We are exploring how real-time 3D reconstruction and feature extraction can provide surgeons with unprecedented control and visualization, even from a distance. We have implemented our own directional-intensified feature extraction (DITF) to ensure precise and accurate detection of surgical features, enhancing the surgeon's control over the robotic arm.
By using 3D reconstruction from a minimal set of 2D images, we have significantly reduced the hardware complexity and improved the efficiency of our system. Our system surveys the surgical environment with a single mobile camera, synchronized to take snapshots every two seconds, ensuring real-time updates for the remote surgeon. We have developed communication protocols with low latency and high reliability over Bluetooth and LAN, ensuring uninterrupted and accurate data transmission.
Our approach to precise real-time synchronization between video feeds and robotic arm responses can serve as a benchmark for future research in remote surgery and telemedicine, offering a model for integrating real-time data with robotic control systems.
To prepare the surgery environment, arrange an object-carrying platter, a robotic arm, and two side-hanging arms on a working platform. After locking the mobile phone in the mobile holder, enable Bluetooth on both the mobile device and the laptop, which serves as the surgeon's HoloLens emulator.
Pair the devices for uninterrupted image transmission. Ensure that the hanging arms are controlled by a stepper motor for a smooth 360-degree revolution. Connect the motor to the microcontroller board using the TB6600 driver.
Download the microcontroller's integrated development environment from the browser by clicking the Download button. Then, go to File and open a new sketch to write the code. Connect the microcontroller board and select its communication port so the new sketch can interface with the board.
Check the communication port and verify that it shows the microcontroller board. Check the hardware switch settings of the TB6600 stepper motor driver to ensure a current flow of two amperes. Set SW4 to on and SW5 and SW6 to off.
Set SW1 and SW3 to off and SW2 to on to achieve one-eighth microsteps, meeting the required revolution steps. Connect the DS3231 real-time clock (RTC) module to the microcontroller for global time synchronization. Ensure the revolution step size is 12 degrees, and that the motor steps forward only when the seconds read from the RTC module are odd.
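The DIP-switch settings above determine the microstep resolution. As a sanity check, the arithmetic can be sketched as follows; the 1.8-degree full-step angle is an assumed value for a standard 200-step stepper motor, since the protocol does not state the motor's step angle.

```python
# Microstepping arithmetic for the TB6600 switch settings above.
# ASSUMPTION: a standard 1.8-degree (200 steps/rev) stepper motor;
# the protocol does not specify the motor's full-step angle.

FULL_STEP_DEG = 1.8          # assumed full-step angle
MICROSTEP_DIVISOR = 8        # SW1 off, SW2 on, SW3 off -> one-eighth microsteps

microstep_deg = FULL_STEP_DEG / MICROSTEP_DIVISOR       # degrees per microstep
microsteps_per_rev = round(360 / microstep_deg)         # microsteps per revolution
microsteps_per_12deg = 12 / microstep_deg               # microsteps per 12-degree step

print(microstep_deg, microsteps_per_rev, round(microsteps_per_12deg, 1))
```

Under this assumption, each 12-degree platter step corresponds to roughly 53 microsteps, which the driving sketch would round to an integer count.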
Connect the five-volt pin of the microcontroller board to the RTC's VCC pin, and the microcontroller's ground to the RTC's ground. Connect the SCL pin of the RTC to the A0 pin and the SDA pin to the A1 pin of the microcontroller. This module ensures a 12-degree step size, completing 30 steps per revolution, with increments occurring every odd second.
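The odd-second stepping rule described above can be sketched as a small simulation: the stepper advances one 12-degree step only when the RTC's seconds value is odd, so a full revolution takes 30 steps over about one minute. Function names here are illustrative, not taken from the protocol's code.

```python
# Sketch of the platter-stepping rule: advance one 12-degree step only
# when the seconds read from the RTC are odd, giving 30 steps (about
# 60 seconds) per full revolution. Names are illustrative.

STEP_DEGREES = 12
STEPS_PER_REVOLUTION = 360 // STEP_DEGREES  # 30 steps

def should_step(rtc_seconds: int) -> bool:
    """Step the motor only on odd seconds read from the RTC module."""
    return rtc_seconds % 2 == 1

def simulate_revolution():
    """Count the seconds elapsed before a full 360-degree revolution."""
    angle = 0
    elapsed = 0
    while angle < 360:
        if should_step(elapsed % 60):
            angle += STEP_DEGREES
        elapsed += 1
    return elapsed, angle

elapsed, angle = simulate_revolution()
print(elapsed, angle)  # one revolution completes within one minute
```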
Run the code available on the GitHub page to verify the setup is working correctly. Download Android Studio to develop the automatic camera application. Connect the mobile phone to the system.
In Android Studio, click New, New Project, and choose Empty Views Activity. Click on Next to develop an Android code available at the given site. Confirm that the app captures the images automatically and sends them to a remote device consistently.
Transmit snapshots from the mobile app immediately to the remote surgeon's system through Bluetooth. To begin, open the graphical user interface server module. Enter the service UUID in the UUID text field.
Click on Create Socket to create and bind the socket. Then, click on Connect to establish a connection with the mobile app. Now, click on Capture to capture and save the scene surveillance images in the local folder.
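The server module's Create Socket, Connect, and Capture steps follow a standard create-bind-accept-receive pattern. The sketch below illustrates that flow with Python's standard `socket` module, using a localhost TCP socket as a stand-in for the Bluetooth RFCOMM socket in the actual system; the length-prefixed framing and all names are assumptions for illustration.

```python
import socket
import threading

# Stand-in for the GUI server's "Create Socket" -> "Connect" -> "Capture"
# flow. ASSUMPTION: a localhost TCP socket substitutes for the Bluetooth
# RFCOMM socket, and snapshots are framed with a 4-byte length prefix.

def create_and_bind(port: int = 0) -> socket.socket:
    """'Create Socket' step: create the listening socket and bind it."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    return srv

def receive_snapshot(conn: socket.socket) -> bytes:
    """'Capture' step: read one length-prefixed image payload."""
    header = b""
    while len(header) < 4:
        header += conn.recv(4 - len(header))
    size = int.from_bytes(header, "big")
    buf = b""
    while len(buf) < size:
        chunk = conn.recv(size - len(buf))
        if not chunk:
            break
        buf += chunk
    return buf

# Loopback demonstration: a client thread plays the role of the mobile app.
srv = create_and_bind()
port = srv.getsockname()[1]

def fake_mobile_app():
    cli = socket.create_connection(("127.0.0.1", port))
    payload = b"jpeg-bytes-placeholder"
    cli.sendall(len(payload).to_bytes(4, "big") + payload)
    cli.close()

t = threading.Thread(target=fake_mobile_app)
t.start()
conn, _ = srv.accept()          # 'Connect' step
image = receive_snapshot(conn)  # 'Capture' step
t.join()
conn.close()
srv.close()
print(len(image))
```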
Equip the client module with a robotic arm designed for rotational movement at its base, shoulder, elbow, wrist, and fingers. Ensure that MG996R servos govern the rotational movement at the base, shoulder, and elbow. Use SG90 servo motors to control the rotational movement at the wrist and fingers.
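The joint-to-servo assignment above can be captured in a small lookup table with angle clamping, as sketched below. The 0-180 degree limits are typical for these hobby servos but are assumptions, not values stated in the protocol.

```python
# Joint-to-servo mapping for the robotic arm described above.
# ASSUMPTION: a 0-180 degree travel range, typical for MG996R and
# SG90 hobby servos; the protocol does not state per-joint limits.

SERVO_FOR_JOINT = {
    "base": "MG996R",
    "shoulder": "MG996R",
    "elbow": "MG996R",
    "wrist": "SG90",
    "fingers": "SG90",
}

def clamp_angle(joint: str, angle: float) -> float:
    """Clamp a commanded joint angle into the assumed 0-180 degree range."""
    if joint not in SERVO_FOR_JOINT:
        raise ValueError(f"unknown joint: {joint}")
    return max(0.0, min(180.0, angle))
```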
Compile the code given in the microcontroller's integrated development environment to drive the robotic arm based on the commands received from the remote surgeon. Read images from the local folder sequentially, two at a time, to identify possible overlap between consecutive pairs. Extract the features using the DITF method.
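The pairwise reading step can be sketched as iterating over consecutive image pairs in capture order. The DITF extractor itself is the authors' method, so it appears below only as a placeholder hook; the folder layout and filenames are assumptions.

```python
# Sketch of the pairwise reading step: snapshots are taken from the
# local folder in capture (filename) order and examined two at a time.
# ASSUMPTION: snapshots are saved as .jpg files whose sorted names
# reflect capture order. DITF is represented only by a placeholder.

from pathlib import Path
from typing import Iterator, Tuple

def consecutive_pairs(folder: str) -> Iterator[Tuple[Path, Path]]:
    """Yield consecutive image pairs from the folder, sorted by filename."""
    files = sorted(Path(folder).glob("*.jpg"))
    for first, second in zip(files, files[1:]):
        yield first, second

def extract_features(image_path: Path):
    """Placeholder for the directional-intensified feature (DITF) extractor."""
    raise NotImplementedError("replace with the DITF implementation")
```

Each yielded pair would then be checked for overlap before its features are passed on to reconstruction.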
Reconstruct three-dimensional images from the collected features using structure from motion (SfM). Enable the surgeon to inspect the three-dimensional reconstructed image features using hand gesture controls for rotation and zoom in or out, allowing visualization from all perspectives. Normalize and map the distance between the tip of the surgeon's right thumb and index finger into a corresponding angle of rotation.
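The normalize-and-map step for the pinch gesture can be sketched as a linear mapping from fingertip distance to rotation angle. The reference pinch distance and the 0-360 degree output range below are illustrative assumptions; the protocol does not state the exact calibration values.

```python
import math

# Sketch of the gesture-mapping step: the distance between the right
# thumb tip and index fingertip is normalized and mapped linearly to a
# rotation angle. ASSUMPTIONS: landmarks arrive as (x, y) points in
# normalized image coordinates, a 0.25 reference pinch distance, and a
# 0-360 degree output range.

MAX_PINCH_DISTANCE = 0.25  # assumed full-pinch reference distance
MAX_ANGLE = 360.0          # assumed rotation range in degrees

def pinch_distance(thumb, index) -> float:
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(thumb[0] - index[0], thumb[1] - index[1])

def to_rotation_angle(distance: float) -> float:
    """Normalize the pinch distance and map it to a rotation angle."""
    normalized = max(0.0, min(1.0, distance / MAX_PINCH_DISTANCE))
    return normalized * MAX_ANGLE
```

The resulting angle is what gets transmitted to the remote site so the object platter can mirror the rotation of the reconstructed view.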
Transmit the hand gesture control through Bluetooth to the remote surgery environment. Ensure that the rotation of the object platter at the remote site mirrors the rotation of the three-dimensional reconstructed features at the surgeon's end. The accuracy of motion mapping decreased as the distance between the two fingers increased.
The DITF algorithm achieved the lowest latency compared to ORB and BEBLID.