Endoscopic retrograde cholangiopancreatography (ERCP) is a common procedure used to diagnose and treat biliary and pancreatic diseases. However, repeated exposure to X-ray radiation during these procedures poses health risks to surgeons. Teleoperation systems can reduce this exposure, but they face challenges such as the lack of force feedback and mismatches between the master device's mechanisms and the motions of the surgical tools, both of which can diminish surgical precision. This study developed a master device with force feedback specifically for teleoperated ERCP guidewire insertion, drawing inspiration from surgeons' natural hand movements. The device comprises a ring-shaped translation control handle and a rotation control handle, both designed for unrestricted motion, thereby intuitively replicating guidewire operation. A force feedback system was incorporated to enable collision detection and prevent potential injuries during procedures. Experimental results showed that the proposed system enhances control precision, reduces handling inertia, and provides effective force feedback. These advances enable safer and more accurate guidewire manipulation, addressing key limitations of existing teleoperation systems. Ultimately, the device not only minimizes radiation exposure for surgeons but also enables intuitive and precise teleoperated ERCP procedures.
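The abstract does not specify the force-feedback law used for collision detection; the sketch below illustrates one plausible scheme, scaled force reflection with a collision-alarm threshold. All names, gains, and thresholds are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: the paper's actual force-feedback law is not
# given in the abstract. Names, gains, and thresholds are assumptions.
from dataclasses import dataclass


@dataclass
class ForceFeedbackConfig:
    reflection_gain: float = 0.8        # scales slave-side force onto the master handle
    collision_threshold_n: float = 1.5  # insertion force (N) above which a collision is flagged
    max_handle_force_n: float = 3.0     # saturation limit to protect the operator


def render_feedback(measured_insertion_force_n: float,
                    cfg: ForceFeedbackConfig) -> tuple[float, bool]:
    """Map the force measured at the guidewire driver to a master-handle force.

    Returns the force command for the handle and a collision flag.
    """
    handle_force = cfg.reflection_gain * measured_insertion_force_n
    handle_force = min(handle_force, cfg.max_handle_force_n)
    collision = measured_insertion_force_n > cfg.collision_threshold_n
    return handle_force, collision


# Example: a 2.0 N resistance at the guidewire would be rendered as 1.6 N
# on the handle and flagged as a possible collision.
force, hit = render_feedback(2.0, ForceFeedbackConfig())
```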
As robotic technology evolves and operations expand into challenging environments, the need for effective teleoperation systems continues to grow. In such environments, robots and machines can improve task efficiency and safety by delivering detailed, accurate information to workers through virtual reality (VR). However, current teleoperation systems provide only a limited understanding of the work environment. Accordingly, this study proposes a technology that uses VR to give workers a high level of telepresence and enable intuitive control. To achieve this, we introduce a pre-generated computer-aided design (CAD) model for static objects beyond the viewing area of RGB-D cameras, together with a method that updates the point cloud of the target (dynamic) objects in real time. By combining this information, we build a 3D visual map and deliver it to the operator in real time through a head-mounted display (HMD), enabling the operator to clearly recognize the robot's current location and surroundings. In addition, we introduce hand motion recognition based on the HMD viewpoint and VR controllers, allowing the operator to control the robot intuitively. These techniques can improve the efficiency and safety of remote work.
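The abstract does not detail the data structures behind the 3D visual map; the sketch below shows one way a static, pre-generated CAD point set could be combined with per-frame RGB-D point clouds of dynamic objects before rendering to the HMD. All class and variable names are assumptions for illustration.

```python
# Illustrative sketch only: the authors' actual map representation and
# update scheme are not described in the abstract.
import numpy as np


class SceneMap:
    """Combine a static CAD-derived point set with per-frame RGB-D points
    of dynamic target objects into one cloud for HMD rendering."""

    def __init__(self, static_points: np.ndarray):
        # Static geometry (e.g., sampled from a pre-generated CAD model),
        # loaded once and assumed to be expressed in the world frame.
        self.static_points = static_points        # shape (N, 3)
        self.dynamic_points = np.empty((0, 3))    # replaced every frame

    def update_dynamic(self, rgbd_points: np.ndarray,
                       camera_pose: np.ndarray) -> None:
        """Transform the latest RGB-D points (camera frame) into the world
        frame using a 4x4 camera pose, replacing the previous dynamic points."""
        homo = np.hstack([rgbd_points, np.ones((len(rgbd_points), 1))])
        self.dynamic_points = (camera_pose @ homo.T).T[:, :3]

    def render_points(self) -> np.ndarray:
        """Point cloud sent to the HMD renderer each frame."""
        return np.vstack([self.static_points, self.dynamic_points])
```

In this sketch the static points are loaded once, while only the dynamic-object points are refreshed each frame, which keeps the per-frame update small enough for real-time delivery to the HMD.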