Researchers at Texas A&M University have developed a new technology for steadying robotic fingers during surgery. The study was published in the journal Scientific Reports.
Robotic Fingers as an Extension of the Surgeon
The researchers demonstrated that users can accurately perceive their distance to contact through electrical currents delivered to their fingertips. The currents produce small but perceptible buzzes, and with this feedback users can control robotic fingers more precisely and accurately when working on fragile surfaces.
According to the researchers, the method could be used by surgeons to reduce inadvertent injuries, some of which occur during robot-assisted operative procedures.
“One of the challenges with robotic fingers is ensuring that they can be controlled precisely enough to softly land on biological tissue,” said Hangye Park, an assistant professor in the Department of Electrical and Computer Engineering. “With our design, surgeons will be able to get an intuitive sense of how far their robotic fingers are from contact, information they can then use to touch fragile structures with just the right amount of force.”
Surgeons use robot-assisted, or telerobotic, surgical systems as physical extensions of themselves, controlling the robotic fingers with movements of their own. This allows complicated procedures to be performed remotely and lets surgeons take on more patients. Because the robotic fingers are small, they also require far smaller incisions than the large ones often needed to accommodate a surgeon’s hands inside the patient’s body.
A key part of moving robotic fingers precisely is live visual feedback streamed from cameras mounted on the telerobotic arms. Surgeons watch monitors to match their own finger movements to those of the telerobotic fingers, using the video to judge where the robotic fingers are and how close they are to each other.
According to Park, visual information alone is not enough to guide fine finger movements, which is extremely important when the fingers are operating very close to the brain and other delicate tissue.
“Surgeons can only know how far apart their actual fingers are from each other indirectly, that is, by looking at where their robotic fingers are relative to each other on a monitor,” Park said. “This roundabout view diminishes their sense of how far apart their actual fingers are from each other, which then affects how they control their robotic fingers.”
Glove Fitted with Stimulation Probes
To overcome this challenge, the researchers developed an alternative way to deliver distance information, one that is independent of visual feedback. They used gloves fitted with stimulation probes that pass electrical currents of different frequencies to the fingertips. Users could then be trained to associate the frequency of the current pulses with distance, with the frequency increasing as a test object gets closer.
The stimulation was tailored to each user’s sensitivity to electrical current frequencies: if a user was sensitive to a wider range of frequencies, the distance information was delivered with smaller increases in current.
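As a rough illustration of the idea described above, the sketch below maps a sensed distance to a stimulation pulse frequency within a user-calibrated frequency range. The function names, the linear mapping, and all numeric values are assumptions for illustration only; the paper’s actual mapping and hardware interface are not described here.

```python
# Illustrative sketch only: map sensed distance to an electrotactile pulse
# frequency inside a per-user calibrated range. The linear mapping and all
# values are assumptions, not the researchers' actual implementation.

def distance_to_pulse_frequency(distance_mm: float,
                                max_distance_mm: float = 50.0,
                                min_freq_hz: float = 10.0,
                                max_freq_hz: float = 100.0) -> float:
    """Return a pulse frequency that rises as the fingertip nears contact."""
    # Clamp the distance reading to the sensing range.
    d = max(0.0, min(distance_mm, max_distance_mm))
    # Closer distance -> higher frequency (linear interpolation, assumed).
    proximity = 1.0 - d / max_distance_mm
    return min_freq_hz + proximity * (max_freq_hz - min_freq_hz)


# The same distance maps to different frequencies depending on each user's
# calibrated range (values are illustrative).
user_a = distance_to_pulse_frequency(10.0, min_freq_hz=5.0, max_freq_hz=200.0)
user_b = distance_to_pulse_frequency(10.0, min_freq_hz=20.0, max_freq_hz=60.0)
print(f"user A: {user_a:.1f} Hz, user B: {user_b:.1f} Hz")
```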
According to the researchers, users receiving the electrical pulses were able to lower their contact force by about 70% and showed a heightened awareness of their proximity to underlying surfaces. They also concluded that delivering proximity information through mild electrical pulses was about three times more effective than visual information alone.
According to Park, the new developments could drastically increase maneuverability during surgery, while at the same time minimizing unintended tissue damage.
“Our goal was to come up with a solution that would improve the accuracy in proximity estimation without increasing the burden of active thinking needed for this task,” he said. “When our technique is ready for use in surgical settings, physicians will be able to intuitively know how far their robotic fingers are from underlying structures, which means that they can keep their active focus on optimizing the surgical outcome of their patients.”