WO2025038310A1 - Machine learning assisted mucosal ablation - Google Patents
- Publication number
- WO2025038310A1 (PCT/US2024/040823)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- catheter
- mucosa
- organ
- ablation
- video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B18/04—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating
- A61B18/12—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating by passing a current through the tissue to be heated, e.g. high-frequency current
- A61B18/14—Probes or electrodes therefor
- A61B18/1492—Probes or electrodes therefor having a flexible, catheter-like structure, e.g. for heart ablation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B2018/00053—Mechanical features of the instrument of device
- A61B2018/00214—Expandable means emitting energy, e.g. by elements carried thereon
- A61B2018/0022—Balloons
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B2018/00315—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for treatment of particular body parts
- A61B2018/00482—Digestive system
- A61B2018/00488—Esophagus
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B2018/00315—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for treatment of particular body parts
- A61B2018/00482—Digestive system
- A61B2018/00494—Stomach, intestines or bowel
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B2018/00315—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for treatment of particular body parts
- A61B2018/00541—Lung or bronchi
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B2018/00571—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for achieving a particular surgical effect
- A61B2018/00577—Ablation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B18/00—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
- A61B2018/00982—Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body combined with or comprising means for visual or photographic inspections inside the body, e.g. endoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2061—Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
Definitions
- the present disclosure relates generally to mucosal ablation systems. Specifically, the present disclosure relates to a machine learning assisted mucosal ablation system.
- Mucosal ablation systems are used to ablate or remove the mucosal lining in certain organs of the human body. Removing this lining and allowing its subsequent regeneration may assist in treating certain illnesses or disorders.
- mucosal ablation systems may be used to remove the mucosal lining inside the duodenum (e.g., the first section of the small intestine, immediately downstream of the stomach) to treat diabetes.
- mucosal ablation systems may be used to treat fatty liver disease and polycystic ovarian syndrome.
- a system for mucosal resurfacing includes a camera, a catheter, a memory, and a processor communicatively coupled to the memory.
- the camera produces a video of a mucosa of an organ and the catheter is positioned in the organ.
- the processor generates, using the video from the camera, a map of the organ and identifies, using a neural network and the video from the camera, a first portion of the mucosa to be ablated.
- the processor also determines, using the map and motion of the catheter, that the catheter is positioned by the first portion of the mucosa and in response to determining that the catheter is positioned by the first portion of the mucosa, ablates, using the catheter, the first portion of the mucosa.
- a method for mucosal resurfacing includes producing, using a camera, a video of a mucosa of an organ and positioning a catheter in the organ.
- the method also includes generating, using the video from the camera, a map of the organ and identifying, using a neural network and the video from the camera, a first portion of the mucosa to be ablated.
- the method further includes determining, using the map and motion of the catheter, that the catheter is positioned by the first portion of the mucosa and in response to determining that the catheter is positioned by the first portion of the mucosa, ablating, using the catheter, the first portion of the mucosa.
- Other embodiments include a non-transitory machine-readable medium storing instructions that, when executed by a processor, cause the processor to perform the method.
- Figure 1 illustrates an example mucosal ablation system.
- Figure 2 illustrates an example tube in the system of Figure 1.
- Figure 3 illustrates an example tube in the system of Figure 1.
- Figure 4 illustrates an example catheter in the system of Figure 1.
- Figure 5 illustrates an example ablation tool on the catheter of Figure 4.
- Figure 6 illustrates an example positioning of the tube and catheter of the system of Figure 1.
- Figure 7 illustrates an example computer system in the system of Figure 1.
- Figure 8 illustrates an example computer system in the system of Figure 1.
- Figure 9 illustrates an example computer system in the system of Figure 1.
- Figure 10 illustrates an example computer system in the system of Figure 1.
- Figure 11 illustrates an example computer system in the system of Figure 1.
- Figure 12 illustrates an example computer system in the system of Figure 1.
- Figure 13 is a flowchart of an example method performed in the system of Figure 1.
- Mucosal ablation may help treat certain illnesses and disorders.
- the mucosal lining inside an organ is ablated or removed, which allows the mucosal lining to subsequently regenerate. This removal and regeneration may refresh the mucosal lining in the organ, which may help treat certain illnesses and disorders.
- mucosal ablation of the duodenum has been demonstrated to help treat type II diabetes.
- Mucosal ablation procedures are traditionally performed visually by a surgeon. As a result, the procedures themselves may be inconsistently performed across different surgeons, which leads to inconsistent results. For example, some surgeons may remove too much or too little of the mucosal lining, which may further harm the patient without treating the illness or disorder.
- the present disclosure describes a system that uses machine learning to guide or assist mucosal ablation procedures (e.g., in the duodenum, lungs, esophagus, stomach, intestines, or colon).
- the system uses machine learning in two different ways.
- First, the system uses machine learning to map the inside of an organ and to detect the position of a surgical tool (e.g., a catheter or ablation tool) inside the organ.
- This machine learning technique may be referred to as simultaneous localization and mapping (SLAM).
- the system may map the stomach in three-dimensional space and then perform a three-dimensional reconstruction of the stomach to produce a three-dimensional map or model of the stomach.
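As a non-limiting sketch of the mapping-and-localization loop described above, the toy Python below maintains a list of mapped 3D landmark points and re-estimates the camera position from re-observed landmarks. The function names and the simple averaging step are illustrative assumptions only; a production visual-SLAM pipeline would involve feature matching, pose graphs, and bundle adjustment.

```python
# Toy SLAM-style sketch (hypothetical names, not the disclosed algorithm):
# a map of 3D landmark points plus a camera position estimated by
# averaging the displacement implied by each re-observed landmark.

def update_pose(pose, matches):
    """Estimate the new camera position from re-observed landmarks.

    pose: current (x, y, z) camera position in the map frame.
    matches: list of (map_point, observed_point) pairs, where
             observed_point is the landmark's position expressed
             relative to the *new* camera position.
    """
    # Each match implies new_pose = map_point - observed_point;
    # averaging over all matches reduces per-observation noise.
    n = len(matches)
    est = [0.0, 0.0, 0.0]
    for map_pt, obs_pt in matches:
        for i in range(3):
            est[i] += (map_pt[i] - obs_pt[i]) / n
    return tuple(est)

def add_landmarks(mapped, pose, observations):
    """Add newly observed, camera-relative landmarks to the map frame."""
    for obs in observations:
        mapped.append(tuple(pose[i] + obs[i] for i in range(3)))
    return mapped
```

In use, each new video frame would contribute fresh landmarks to the map while re-observed landmarks refine the pose, giving both the map and the tool location simultaneously.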
- Second, the system uses machine learning to detect treatment regions inside the organ.
- the system may use a neural network to analyze a video feed of the inside of the organ.
- the neural network may detect regions of the organ where mucosal ablation has been performed and regions of the organ where mucosal ablation has not been performed. The system may then guide the surgical tool to a region where mucosal ablation has not been performed so that the region may be ablated.
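The region-selection logic described above can be sketched as follows. Since the disclosure's trained neural network is not specified, a stand-in heuristic (mean patch brightness, motivated by the later observation that ablated tissue may appear as lightly colored rings) plays its role here; the threshold and all names are hypothetical.

```python
# Hypothetical stand-in for the disclosed neural network classifier:
# label a video patch "ablated" or "untreated" and pick the next
# region still needing treatment.

ABLATED_BRIGHTNESS = 0.7  # hypothetical threshold on normalized intensity

def classify_patch(patch):
    """Return 'ablated' or 'untreated' for one grayscale patch (0..1)."""
    mean = sum(sum(row) for row in patch) / (len(patch) * len(patch[0]))
    return "ablated" if mean >= ABLATED_BRIGHTNESS else "untreated"

def next_treatment_region(patches):
    """Pick the first region that still needs ablation, or None if done."""
    for region_id, patch in patches:
        if classify_patch(patch) == "untreated":
            return region_id
    return None
```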
- the neural network may analyze the video feed during ablation to determine dosage. For example, the neural network may detect visual indicators of ablation and determine whether more ablation should be performed on the treatment region. The system may provide visual or audio cues to indicate whether more ablation should be performed or whether ablation should be stopped.
- the system provides several technical advantages.
- the system may provide a map of the organ and indicate a location of the surgical tool within the map of the organ. This information may provide a more expansive view of the positioning of the surgical tool during the procedure relative to existing systems that provide only a view from behind the surgical tool.
- the system may automatically detect regions of the organ that have not been ablated or that are under-ablated and guide the surgical tool towards those regions. In this manner, the system may protect against under-ablation.
- the system may also detect regions of the organ that have been ablated and guide the surgical tool away from those regions, which protects against over-ablation.
- the system may improve the consistency of mucosal ablation procedures performed by different surgeons and improve the consistency of results. Additionally, the system may shorten the amount of time it takes to perform an ablation procedure, which may improve patient safety and recovery.
- Figure 1 illustrates an example mucosal ablation system 100.
- the system 100 includes a surgery cart 102, a control station 104, and a computer system 106.
- the system 100 may be used to ablate or remove the mucosal lining of an organ in the body. Removal of the mucosal lining and its subsequent regeneration may help treat certain illnesses or disorders, such as, for example, type II diabetes.
- the system 100 may be used to ablate the mucosal lining of any suitable organ.
- the system 100 may be used to ablate the mucosal lining of the duodenum, the lungs, the esophagus, the stomach, the intestines, or the colon.
- the surgery cart 102 may include the tools and devices that perform the mucosal ablation procedure.
- the surgery cart 102 includes an actuator box 108, a tube 110, a camera 112, a catheter 114, and a display 116.
- the tube 110, camera 112, and catheter 114 may form a surgical tool that is controlled by the actuator box 108.
- the surgery cart 102 may be positioned or moved next to a subject or patient.
- a guide wire may then be inserted or positioned within the subject’s or patient’s body and into an organ.
- the actuator box 108 may then insert the tube 110 (and the camera 112) along the guide wire and into the organ.
- the catheter 114 may also be inserted through the tube 110 and into the organ.
- the ablation procedure may then be performed using the camera 112 and the catheter 114.
- the actuator box 108 may retract the catheter 114 and the tube 110.
- the camera 112 may be positioned at a distal end of the tube 110 (e.g., an end opposite the actuator box 108).
- the camera 112 may be a monocular or stereo camera system.
- the camera 112 may provide one or more video feeds of the environment within the organ when the tube 110 is inserted into the organ.
- the video feeds may show the movement of the tube 110 or the catheter 114 along with the progress of the ablation procedure.
- the catheter 114 may be inserted through the tube 110 and into the organ.
- the catheter 114 may include a portion (e.g., an ablation tool) that may be inflated to contact the mucosal lining of the organ. Additionally, the catheter 114 may be used to perform the ablation procedure once the catheter 114 has been inflated to contact the mucosal lining.
- the catheter 114 may include cathodes and anodes that apply an electric current to the mucosal lining, which causes poration and ablation of the mucosal lining.
- the portion may include jets that cause a hot fluid to flow over or beneath the mucosal lining, which causes ablation of the mucosal lining.
- the display 116 may display the video feed from the camera 112 during the ablation procedure.
- An operator of the system 100 (e.g., a surgeon) may view the display 116 to inspect the progress of the ablation procedure.
- the display 116 may also present other vital information about the subject or patient.
- the display 116 also presents a map of the organ, which indicates a position or location of the tube 110 and the catheter 114 in the map. Additionally, the display 116 may present markings on the displayed video feed to indicate regions of the organ that have been ablated or regions of the organ that have not been ablated. Moreover, the display 116 may present messages or other visual markings to indicate dosage.
- the display 116 may present messages indicating that more ablation should be performed on a treatment region or messages indicating that ablation of a treatment region should be stopped. This information may assist or guide the ablation procedure, which may improve the consistency of the results of the ablation procedure.
- the operator of the system 100 may use the control station 104 to control the surgery cart 102.
- the control station 104 includes a display 118 and a control 120.
- the display 118 may provide information about the ablation procedure, similar to the display 116.
- the display 118 may present the video feed from the camera 112 to the operator of the system 100, the map of the organ, markings that indicate ablated or non-ablated regions of the organ, and/or messages or visual indicators indicating whether more ablation should be performed or whether ablation should be stopped.
- the operator of the system 100 may use the control 120 to control the movement of the surgery cart 102 or the operation of the actuator box 108.
- the operator of the system 100 may use the control 120 to control the movement of the surgery cart 102, the movement of the tube 110, or the movement of the catheter 114.
- the computer system 106 may use machine learning to assist the operator of the system 100 during the ablation procedure.
- the computer system 106 is separate from the surgery cart 102 and the control station 104.
- the computer system 106 is partially or fully embodied within the surgery cart 102 and/or the control station 104.
- the computer system 106 includes a processor 122 and a memory 124, which perform the actions or functions of the computer system 106 described herein.
- the computer system 106 uses machine learning to determine a location or position of the catheter 114 in the organ.
- the computer system 106 uses a SLAM algorithm that analyzes the video feed from the camera 112 to map the inside of the subject’s or patient’s organ.
- the SLAM algorithm also considers sensor measurements from sensors in or on the tube 110 to determine a position or location of the tube 110 and catheter 114 within the map.
- the tube 110 may include position sensors, kinematic sensors, or shape sensors that measure the position, movement, or shape of the tube 110, respectively.
- the SLAM algorithm may consider these sensor measurements to determine a position or location of the catheter 114 or the tube 110 within the generated map of the inside of the subject’s or patient’s organ.
- the computer system 106 may present, on the display 116 or the display 118, the map of the organ along with the position or location of the tube 110 or the catheter 114 inside the organ.
- the operator of the system 100 may view the map to see a more expansive view of the positioning of the tube 110 and the catheter 114 within the organ. As a result, the operator of the system 100 is provided more positioning and location information during the ablation procedure relative to existing systems that merely provide the operator the video feed from the camera 112.
- the computer system 106 uses a neural network to analyze or process the video feed from the camera 112 to determine and monitor the stages of the ablation procedure.
- the neural network may be trained (e.g., through analyzing video feeds from other ablation procedures) to identify or detect visual indicators of illness or disease and visual indicators of ablation.
- the neural network may analyze the video feed from the camera 112 during an ablation procedure to identify regions of the organ where the mucosal lining should be ablated.
- the neural network may identify from the video feed regions or portions of the organ where the mucosal lining presents signs of damage or disease.
- the neural network may identify these regions of the organ for ablation.
- the computer system 106 may use the SLAM algorithm to determine when the catheter 114 is positioned by the portion or region where the mucosal lining should be ablated. When the catheter 114 is properly positioned, the computer system 106 may inflate the catheter 114 and ablate the mucosal lining at the portion or region of the organ.
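The positioning check described above can be illustrated with a short sketch: compare the SLAM-derived catheter position against a mapped treatment region and trigger ablation only when they are aligned. The distance threshold and function names are hypothetical assumptions, not values from the disclosure.

```python
# Hypothetical positioning check: ablate only when the catheter's
# estimated position is within a tolerance of the treatment region.
import math

POSITION_TOLERANCE_MM = 5.0  # hypothetical alignment tolerance

def catheter_at_target(catheter_pos, target_pos, tol=POSITION_TOLERANCE_MM):
    """True when the catheter is close enough to the treatment region."""
    return math.dist(catheter_pos, target_pos) <= tol

def next_action(catheter_pos, target_pos):
    """Return the control step: ablate when aligned, otherwise keep moving."""
    if catheter_at_target(catheter_pos, target_pos):
        return "inflate_and_ablate"
    return "continue_navigation"
```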
- the neural network may also analyze the video feed from the camera 112 to identify portions or regions of the organ that have been ablated. For example, when an electric current is used to ablate the mucosal lining, the neural network may identify regions or portions of the organ that have been ablated by identifying products of electrolysis on the internal surface of the organ, which may appear as lightly colored rings.
- the neural network may instruct that the catheter 114 be moved to another region of the organ that needs treatment.
- the computer system 106 may present visual, audio, or tactile instructions for navigating the tube 110 or the catheter 114.
- the computer system 106 may present on the display 116 or the display 118 arrows that show the direction the tube 110 or the catheter 114 should be moved to position the tube 110 or the catheter 114 by or near a treatment region.
- the computer system 106 may produce audible messages that instruct the operator of the system 100 how to move or navigate the tube 110 or the catheter 114.
- the computer system 106 may produce tactile feedback (e.g., haptic vibrations or movements) at the control 120 to indicate to the operator of the system 100 how to move or navigate the tube 110 or the catheter 114.
- the neural network may be trained to predict or determine regions of the organ where ablation is undesired.
- the neural network may detect regions of the organ that have scarring or polyps. The neural network may identify these regions and determine that these regions should not be ablated.
- the computer system 106 may instruct the operator of the system 100 to move past these regions or to avoid ablating these regions. The computer system 106 may also avoid identifying these regions as treatment regions.
- the neural network may be trained to determine dosage using visual indicators of ablation (e.g., products of electrolysis). For example, the neural network may be trained to predict or determine an amount of ablation that has occurred based on the visual indicators of ablation that appear in the video feed.
- the computer system 106 may determine whether the determined amount of ablation falls within a target range (e.g., a target dosing range). If the amount of ablation falls within the target range, the computer system 106 may present a message (e.g., on the display 116 or the display 118) indicating that ablation should be stopped. The operator of the system 100 may then stop ablation of the treatment region.
- the computer system 106 may present no message or may present a message indicating that ablation should continue. The operator of the system 100 may then continue ablation of the treatment region. After the operator of the system 100 indicates that the ablation procedure is complete, the computer system 106 may use the neural network to continue analyzing the video feed to determine regions of the organ that have received an amount of ablation that falls beneath the target range. If the computer system 106 identifies such a region, the computer system 106 may present a message indicating that the region should be re-ablated. The operator of the system 100 may then ablate the region.
- the computer system 106 may also adjust the target range for different regions of the organ.
- the neural network may be trained to detect visual indicators that affect dosage (e.g., scarring, polyps, folds, turns, or other surface features). When the neural network detects these features in a region, the computer system 106 may adjust the target range for that region. When that region is being ablated, the computer system 106 may present instructions or messages that guide the ablation according to the adjusted target range.
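The dosing logic above (compare an estimated ablation amount against a target range, adjusted per region for surface features) can be sketched briefly. Every number, feature name, and adjustment factor below is a hypothetical assumption chosen only to show the control flow.

```python
# Hypothetical dosing check: map a model-estimated ablation amount to
# an operator message, with the target range adjusted for regions whose
# surface features (e.g., scarring, folds) affect dosage.

BASE_RANGE = (0.8, 1.2)  # hypothetical normalized target dosing range

def target_range(features):
    """Adjust the base range for features detected in a region."""
    low, high = BASE_RANGE
    if "scarring" in features:  # hypothetical: scarred tissue needs less
        low, high = low * 0.8, high * 0.8
    if "fold" in features:      # hypothetical: folded tissue needs more
        low, high = low * 1.1, high * 1.1
    return (low, high)

def dosing_message(estimated_dose, features=()):
    """Message the system might present on the display during ablation."""
    low, high = target_range(features)
    if estimated_dose < low:
        return "continue ablation"
    if estimated_dose > high:
        return "stop ablation: target exceeded"
    return "stop ablation: target reached"
```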
- the computer system 106 uses machine learning to assist the operator of the system 100 during the ablation procedure. Specifically, the computer system 106 may track the location or position of the catheter 114 in the organ and monitor or track the stages of the ablation procedure. The computer system 106 may guide the procedure and may present information about the procedure to the operator of the system 100. In certain embodiments, by using machine learning, the computer system 106 improves the consistency of ablation procedures, and the consistency of results for the ablation procedures.
- the processor 122 is any electronic circuitry, including, but not limited to, one or a combination of microprocessors, microcontrollers, application-specific integrated circuits (ASICs), application-specific instruction set processors (ASIPs), and/or state machines, that communicatively couples to the memory 124 and controls the operation of the computer system 106.
- the processor 122 may be 8-bit, 16-bit, 32-bit, 64-bit, or of any other suitable architecture.
- the processor 122 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from memory and executes them by directing the coordinated operations of the ALU, registers and other components.
- the processor 122 may include other hardware that operates software to control and process information.
- the processor 122 executes software stored on the memory 124 to perform any of the functions described herein.
- the processor 122 controls the operation and administration of the computer system 106 by processing information (e.g., information received from the surgery cart 102, control station 104, and memory 124).
- the processor 122 is not limited to a single processing device and may encompass multiple processing devices.
- the memory 124 may store, either permanently or temporarily, data, operational software, or other information for the processor 122.
- the memory 124 may include any one or a combination of volatile or non-volatile local or remote devices suitable for storing information.
- the memory 124 may include random access memory (RAM), read only memory (ROM), magnetic storage devices, optical storage devices, or any other suitable information storage device or a combination of these devices.
- the software represents any suitable set of instructions, logic, or code embodied in a computer-readable storage medium.
- the software may be embodied in the memory 124, a disk, a CD, or a flash drive.
- the software may include an application executable by the processor 122 to perform one or more of the functions described herein.
- Figure 2 illustrates an example tube 110 in the system 100 of Figure 1.
- the tube 110 may be a flexible tube that houses the camera 112. Additionally, the tube 110 may define one or more channels (which may also be referred to as lumens). In the example of Figure 2, the tube 110 defines the channels 202, 204, and 206.
- Various instruments may be inserted through the tube 110 and through the channels 202, 204, and 206.
- the catheter 114 or a guide wire may be inserted through the channels 202, 204, and 206.
- the tube 110 may house or include lights 208. The lights 208 may illuminate the area in front of the tube 110 so that the camera 112 may capture video footage of the region in front of the tube 110.
- Figure 3 illustrates an example tube 110 in the system 100 of Figure 1.
- the tube 110 may be a flexible tube that includes a distal end 302 (e.g., an end inserted into the organ) and a proximal end 304 (e.g., an end closest to the surgery cart 102).
- the distal end 302 may include the camera 112 and the light 208 that illuminates the region in front of the tube 110 in the organ.
- the proximal end 304 may be connected to the actuator box 108.
- the tube 110 may be formed using segments 310. Each segment 310 may be connected so as to allow the tube 110 to bend or fold along the segments 310.
- the tube 110 may include multiple sensors that monitor or measure various aspects of the tube 110.
- the tube 110 includes a position sensor 306, a kinematic sensor 307, a shape sensor 308, a volume sensor 312, and a pressure sensor 314.
- the tube 110 may also include one or more of a depth sensor, a stereo vision sensor, an active stereo vision sensor, or an infrared sensor. Each of these sensors may be positioned on or within the tube 110.
- the position sensor 306 may track or measure a position of the tube 110. For example, the position sensor 306 may determine coordinates representing a position or location of the tube 110.
- the kinematic sensor 307 may detect or measure movement or motion of the tube 110. For example, the kinematic sensor 307 may measure an acceleration or velocity of the tube 110.
- the kinematic sensor 307 and the position sensor 306 may include accelerometers that detect the movement or positioning of the tube 110.
- the shape sensor 308 may detect or measure a shape of the tube 110.
- the shape sensor 308 may include an optical fiber that is used to detect when the tube 110 bends or folds.
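To illustrate how per-segment bend measurements can describe the tube's shape, the toy sketch below integrates bend angles along unit-length segments in a plane. This is a simplifying assumption for illustration; real fiber shape sensing (e.g., with Bragg gratings) recovers a 3D shape from measured strain.

```python
# Toy planar shape reconstruction (illustrative only): walk segment by
# segment, turning by each measured bend angle, to recover the tube's
# centerline points.
import math

def reconstruct_shape(bend_angles_rad, segment_len=1.0):
    """Return the (x, y) position of each joint, starting at the origin
    and heading along +x, turning by each segment's bend angle."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for bend in bend_angles_rad:
        heading += bend
        x += segment_len * math.cos(heading)
        y += segment_len * math.sin(heading)
        points.append((x, y))
    return points
```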
- the volume sensor 312 and the pressure sensor 314 may detect various aspects of the catheter 114 inserted through the tube 110.
- the volume sensor 312 may detect a volume of the catheter 114 when the catheter 114 has been inflated.
- the pressure sensor 314 may detect a pressure experienced by the catheter 114 when the catheter 114 is inflated.
- the pressure sensor 314 may detect the pressure caused by the organ pressing down on the inflated catheter 114.
- Figure 4 illustrates an example catheter 114 in the system 100 of Figure 1.
- the catheter 114 may include an ablation tool 402 positioned near an end of the catheter 114.
- the catheter 114 may be inserted through the tube 110 and into an organ.
- the ablation tool 402 may emerge from the tube 110 and into the organ.
- the ablation tool 402 may be inflated so that the ablation tool 402 contacts the inner wall of the organ.
- the ablation tool 402 may then ablate or remove the mucosal lining on the inner wall of the organ.
- the ablation tool 402 may apply an electric current to the mucosal lining, or the ablation tool 402 may cause a hot fluid to flow over or under the mucosal lining.
- the electric current or the hot fluid may cause ablation of the mucosal lining.
- Figure 5 illustrates an example ablation tool 402 on the catheter 114 of Figure 4.
- the ablation tool 402 includes an inflatable surface 502, which resembles a balloon, a cathode 504, and an anode 506.
- the inflatable surface 502 may expand when air or another fluid is injected into the ablation tool 402.
- the inflatable surface 502 may expand to contact the inner wall of the organ.
- the cathode 504 and anode 506 may be positioned on the inflatable surface 502.
- the inflatable surface 502 may cause the cathode 504 and the anode 506 to also contact the inner wall of the organ.
- An electric current may then be applied between the cathode 504 and the anode 506 and across the inner wall of the organ.
- the electric current may cause ablation of the mucosal lining on the inner wall of the organ.
- the electric current may cause the cells of the mucosal lining to porate and die, which results in ablation of the mucosal lining.
- Figure 6 illustrates an example positioning of the tube 110 and the catheter 114 of the system 100 of Figure 1.
- the tube 110 is inserted into a duodenum 602 of a subject or patient, which is the first section of the small intestine, immediately downstream of the stomach.
- the catheter 114 may then be inserted through the tube 110.
- the catheter 114 may emerge from the front of the tube 110 into the duodenum 602.
- the ablation tool 402 on the catheter 114 may then be inflated to contact the inner wall of the duodenum 602.
- the ablation tool 402 may then ablate the mucosal lining on the inner wall of the duodenum 602.
- the ablation tool 402 may apply an electric current or hot fluid to the mucosal lining to ablate the mucosal lining.
- the catheter 114 and the tube 110 may be retracted and extracted from the duodenum 602.
- Figure 7 illustrates an example computer system 106 in the system 100 of Figure 1.
- Figure 7 shows the computer system 106 using a SLAM algorithm to determine a position or orientation of the catheter 114 within a map of an organ.
- the computer system 106 provides a more expansive view of the positioning of the catheter 114 within the organ relative to existing ablation systems that present only a video feed from the camera in the tube in the organ.
- the computer system 106 receives a video 702.
- the video 702 may be generated by the camera 112 in the tube 110. As discussed previously, the camera 112 is positioned at the front of the tube 110 and within the organ.
- the video 702 may be captured from the perspective of the front of the tube 110.
- the video 702 may show the inner wall of the organ. Additionally, the video 702 may show the catheter 114 after the catheter 114 has emerged from the tube 110 and into the organ.
- the computer system 106 may analyze the video 702 to identify landmarks 704 that appear in the video 702.
- the landmarks 704 may include the inner wall of the organ along with other body parts. For example, if the organ is the duodenum, the landmarks 704 may include the papillae that are attached to the duodenum.
- the computer system 106 may analyze the landmarks 704 to generate a map 706 of the organ. For example, the computer system 106 may use the shape and size of the inner wall of the organ to produce the boundaries of the organ in the map 706.
- the computer system 106 may use the other body parts that appear in the landmarks 704 to verify or confirm the shape and positioning of the walls of the organ in the map 706. For example, the computer system 106 may determine where certain portions of the inner wall are located in the duodenum when the computer system 106 detects a papilla in the video 702.
- the computer system 106 may also receive measurements from various sensors on the tube 110 or the catheter 114.
- the computer system 106 receives a position measurement 708, a shape measurement 710, and/or a kinematic measurement 712.
- the computer system 106 may receive the position measurements 708 from the position sensor 306 in or on the tube 110.
- the position measurement 708 may include coordinates that identify a position of the tube 110 in free space.
- the computer system 106 may receive the shape measurement 710 from the shape sensor 308 in or on the tube 110.
- the shape measurement 710 may indicate a deformation of the tube 110.
- the shape measurement 710 may indicate a fold or a bend in the tube 110.
- the computer system 106 may receive the kinematic measurement 712 from the kinematic sensor 307 in or on the tube 110.
- the kinematic measurement 712 may indicate a motion or a movement of the tube 110.
- the kinematic sensor 307 may produce the kinematic measurement 712 to indicate the direction and the distance moved by the tube 110.
- the computer system 106 may use the landmarks 704, the position measurement 708, the shape measurement 710, and/or the kinematic measurement 712 to generate a map 706 of the organ.
- the computer system 106 may locate a landmark 704 in several frames of the video 702.
- the computer system 106 may analyze each of the frames to identify or match the landmark 704 in each of the frames.
- the landmark 704 may move to a different position in the frames due to movement of the tube 110 in the organ.
- the position measurement 708, the shape measurement 710, and/or the kinematic measurement 712 may indicate the movement of the tube 110 that occurred between frames.
- the computer system 106 may determine how the frames correspond to one another in a three-dimensional space (e.g., the depth of one frame relative to another frame).
- the computer system 106 may then stitch the frames together according to the positioning of the landmarks 704 and according to the position measurement 708, the shape measurement 710, and/or the kinematic measurement 712 to generate the map 706, which may be a three-dimensional map.
- the computer system 106 may then localize the tube 110 or the camera 112 in the map 706 using the positioning of the landmarks 704 and/or the position measurement 708, the shape measurement 710, and/or the kinematic measurement 712. By localizing the tube 110 or the camera 112, the computer system 106 may determine a position and/or orientation 714 of the tube 110 or the camera 112 within the map 706. The computer system 106 may then present the map 706 and the position and/or orientation 714 on the display 116 or the display 118. The display 116 or 118 may then present the operator of the system 100 a more expansive view of the position of the tube 110 in the organ relative to existing systems that merely present the video feed from the front of the tube 110. With this more expansive view, the operator of the system 100 may gain a better understanding of the positioning of the catheter 114 within the organ.
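The frame-to-frame localization described above can be sketched in miniature. The following is an illustrative toy, not the disclosed SLAM implementation: it recovers only an in-plane camera translation by averaging the pixel shift of matched landmarks, then blends that estimate with a kinematic sensor reading. All names (`Landmark`, `estimate_motion`, `fuse_with_kinematics`) and the pixels-per-millimeter scale are assumptions for illustration; a production pipeline would solve for a full six-degree-of-freedom pose.

```python
from dataclasses import dataclass

@dataclass
class Landmark:
    frame_a_xy: tuple  # pixel position of the landmark in the earlier frame
    frame_b_xy: tuple  # pixel position of the same landmark in the later frame

def estimate_motion(landmarks, pixels_per_mm=10.0):
    """Average the 2D shift of matched landmarks and convert to millimeters."""
    if not landmarks:
        return (0.0, 0.0)
    dx = sum(lm.frame_b_xy[0] - lm.frame_a_xy[0] for lm in landmarks) / len(landmarks)
    dy = sum(lm.frame_b_xy[1] - lm.frame_a_xy[1] for lm in landmarks) / len(landmarks)
    return (dx / pixels_per_mm, dy / pixels_per_mm)

def fuse_with_kinematics(visual_motion, kinematic_motion, visual_weight=0.5):
    """Blend the vision estimate with the kinematic sensor's reported motion."""
    w = visual_weight
    return tuple(w * v + (1.0 - w) * k for v, k in zip(visual_motion, kinematic_motion))
```

Blending the visual and kinematic estimates, rather than trusting either alone, mirrors how the position, shape, and kinematic measurements cross-check the landmark tracking in the description above.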
- Figure 8 illustrates an example computer system 106 in the system 100 of Figure 1.
- Figure 8 shows the computer system 106 using a neural network 802 to identify treatment regions 804 in the organ.
- the computer system 106 uses machine learning to guide or assist the stages of the ablation procedure.
- the neural network 802 may have been trained by reviewing videos of ablation procedures. These videos may show regions of the organs that are diseased or damaged, and the videos may show the ablation procedure on these regions along with the results. By analyzing these videos, the neural network 802 may learn visual indicators of illness or disease on the inner wall of the organ. Additionally, the neural network 802 may learn visual indicators of successful ablation.
- the neural network 802 may be trained to identify products of electrolysis that present as light-colored rings on the inner wall of the organ as evidence of ablation. Moreover, the neural network 802 may be trained to identify visual indicators that ablation should be avoided (e.g., scarring or polyps). The neural network 802 may then be used during subsequent ablation procedures to guide or assist the procedures by identifying regions of an organ that should be ablated (e.g., due to disease or illness), regions of an organ that have been ablated, or regions of an organ that should be avoided during ablation.
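The "light-colored ring" cue described above could, in principle, be approximated without a trained network by comparing the brightness of a ring of pixels to the tissue just outside it. The toy heuristic below is purely illustrative; the disclosure relies on the trained neural network 802, and every name and threshold here is an assumption.

```python
def ring_brightness_score(gray, cx, cy, r):
    """Mean brightness of pixels at Chebyshev distance r from (cx, cy)."""
    ring = [gray[y][x]
            for y in range(cy - r, cy + r + 1)
            for x in range(cx - r, cx + r + 1)
            if max(abs(x - cx), abs(y - cy)) == r
            and 0 <= y < len(gray) and 0 <= x < len(gray[0])]
    return sum(ring) / len(ring) if ring else 0.0

def shows_ablation_ring(gray, cx, cy, r, margin=30):
    """True when the ring at radius r is notably brighter than radius r + 2."""
    return ring_brightness_score(gray, cx, cy, r) > ring_brightness_score(gray, cx, cy, r + 2) + margin
```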
- the computer system 106 receives the video 702 from the camera 112 in the tube 110.
- the video 702 may show the inner wall of the organ from the perspective of the front of the tube 110.
- the neural network 802 may analyze the video 702 to identify or detect visual indicators of illness or disease or visual indicators of ablation. In this manner, the neural network 802 may identify regions of the organ that have been treated by the ablation procedure, and regions of the organ that should be treated by the ablation procedure. For example, the neural network 802 may analyze the video 702 to identify evidence of ablation that presents as light-colored rings on the inner wall of the organ.
- the neural network 802 may also be trained to identify visual indicators of illness or disease. When the neural network 802 identifies, from the video 702, that a region has indicators of illness or disease and does not have the light-colored ring, the neural network 802 may identify the region as a treatment region 804.
- the computer system 106 may indicate the treatment region 804 on the display 116 or 118. For example, if the display 116 or the display 118 are presenting the video 702 from the camera 112, the computer system 106 may mark the treatment region 804 on the displayed video 702. The computer system 106 may present an indicator, such as a colored box or a colored dot, over the treatment region 804 in the video 702. As another example, the computer system 106 may use a semantic image segmentation process to assign labels to the pixels in the video 702 based on what those pixels are showing. If the computer system 106 determines that a pixel in the video 702 is showing a treatment region 804, then the computer system may assign a first color or shade to that pixel. Otherwise, the computer system 106 may assign a second color or shade to that pixel. The different colored or shaded pixels may guide the operator of the system 100 to ablate the treatment regions 804.
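The two-shade overlay described above can be sketched as a simple per-pixel mapping: pixels whose label marks a treatment region get one color, all other pixels another. Label values, colors, and the function name are illustrative assumptions; the disclosure's segmentation is performed by the trained network, not this stub.

```python
TREATMENT = 1  # label assumed to be produced by the segmentation step
OTHER = 0

def colorize(labels, treat_color=(255, 0, 0), other_color=(64, 64, 64)):
    """Map a 2D grid of per-pixel labels to an RGB overlay."""
    return [[treat_color if px == TREATMENT else other_color for px in row]
            for row in labels]
```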
- the computer system 106 may then analyze the position and/or orientation 714 of the tube 110 or the catheter 114 determined using the SLAM algorithm to determine whether the tube 110 or the catheter 114 are positioned by or near the treatment region 804. Specifically, the computer system 106 may determine from the position and/or orientation 714 whether the catheter 114 is properly positioned to ablate the treatment region 804. If the catheter 114 is not properly positioned, the computer system 106 may cause the actuator box 108 to move the tube 110 and/or the catheter 114 into position. The computer system 106 may also present visual, audio, or tactile instructions to the operator of the system 100 on how and where to move the tube 110 or the catheter 114.
- the computer system 106 may communicate an inflation signal 806 (e.g., to the surgery cart 102).
- the inflation signal 806 may cause the ablation tool 402 on the catheter 114 to inflate and expand to contact the inner wall of the organ.
- the inflation signal 806 may cause the inflatable surface 502 of the ablation tool 402 to expand, which brings the cathode 504 and anode 506 into contact with the inner wall of the organ.
- An electric signal may then be applied to the inner wall to ablate the treatment region 804.
- Figure 9 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 9 shows the computer system 106 determining when to ablate a treatment region 804.
- the computer system 106 may receive a volume measurement 902 and/or a pressure measurement 904.
- the computer system 106 may receive the volume measurement 902 from the volume sensor 312 on the tube 110 and the pressure measurement 904 from the pressure sensor 314 on the tube 110.
- the volume measurement 902 may indicate a volume of the ablation tool 402 during inflation.
- the pressure measurement 904 may indicate a pressure experienced by the ablation tool 402 when inflated.
- the pressure measurement 904 may indicate a pressure exerted by the inner wall of the organ on the ablation tool 402 when the ablation tool 402 is inflated.
- the computer system 106 may analyze the video 702, the volume measurement 902, and/or the pressure measurement 904 to determine a contact region 906 on the inner wall of the organ.
- the contact region 906 may be the region of the inner wall of the organ that is contacted by the ablation tool 402 when the ablation tool 402 is inflated.
- the computer system 106 may then determine whether the contact region 906 is sufficient for treating the treatment region 804.
- the video 702 may provide a visual indication of the size and extent of the contact region 906.
- the volume measurement 902 and/or the pressure measurement 904 may indicate whether the ablation tool 402 may be further inflated.
- the computer system 106 may use the neural network 802 to analyze the video 702 to determine if the video 702 shows that the contact region 906 is too small (e.g., indicating that the ablation tool 402 has not made sufficient contact with the treatment region 804). If the contact region 906 is too small, then the ablation procedure may underdose the ablation. As a result, the ablation may be ineffective.
- the computer system 106 may then consider the volume measurement 902 and/or the pressure measurement 904 to determine whether the ablation tool 402 can be further inflated to increase the contact region 906. For example, if the volume measurement 902 and/or the pressure measurement 904 fall below thresholds, then the computer system 106 may determine that the ablation tool 402 may be further inflated. Otherwise, the computer system 106 may determine that the ablation tool 402 cannot be further inflated. The computer system 106 may communicate the inflation signal 806 to further inflate the ablation tool 402 to increase the contact region 906.
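The threshold check described above reduces to a small decision rule: inflate further only when the contact region is too small and both the volume and pressure readings remain below their limits. The function name, units, and limit values below are illustrative assumptions, not values from the disclosure.

```python
def inflation_decision(contact_too_small, volume_ml, pressure_kpa,
                       max_volume_ml=40.0, max_pressure_kpa=12.0):
    """Decide whether to send another inflation signal.

    Returns "inflate" when the contact region is too small and both sensor
    readings are still below their (assumed) thresholds; otherwise "hold".
    """
    if contact_too_small and volume_ml < max_volume_ml and pressure_kpa < max_pressure_kpa:
        return "inflate"
    return "hold"
```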
- When the computer system 106 determines that the contact region 906 is the proper size and would provide a proper dosage during the ablation procedure, the computer system 106 communicates an ablation signal 908 (e.g., to the surgery cart 102).
- the ablation signal 908 may cause the ablation tool 402 to perform ablation.
- the ablation signal 908 may cause the ablation tool 402 to send an electric current through the contact region 906 to ablate the contact region 906.
- the ablation signal 908 may cause the ablation tool 402 to cause hot fluid to flow over or under the contact region 906 to ablate the contact region 906.
- the computer system 106 may adjust the ablation procedure during the ablation step.
- the neural network 802 may continue analyzing the video 702 during ablation to determine how the ablation is progressing. For example, the neural network 802 may analyze the video 702 to detect visual indicators of products of electrolysis or lightly colored rings on the contact region 906. If the neural network 802 does not detect sufficient products of electrolysis, the computer system 106 may increase a magnitude of the electric current applied to the contact region 906 (e.g., to increase dosage). If the neural network 802 detects a large amount of products of electrolysis, the computer system 106 may decrease the magnitude of the electric current applied to the contact region 906 (e.g., to reduce dosage).
- the neural network 802 may be trained to predict or determine an amount of ablation that has occurred based on the visual indicators of ablation that appear in the video 702.
- the computer system 106 may determine whether the determined amount of ablation falls within a target range (e.g., a target dosing range). If the amount of ablation falls within the target range, the computer system 106 may present a message (e.g., on the display 116 or the display 118) indicating that ablation should be stopped. The operator of the system 100 may then stop ablation of the treatment region. If the amount of ablation falls beneath the target range, the computer system 106 may present no message or may present a message indicating that ablation should continue. The operator of the system 100 may then continue ablation of the treatment region.
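The target-range logic above maps an estimated ablation amount to an operator message. Only the two cases named in the description are modeled here (within range stops, beneath range continues); treating an amount above the range as a stop condition is an added assumption, as are the function and message names.

```python
def dosing_message(amount, target_range):
    """Map the network's estimated ablation amount to an operator message."""
    low, high = target_range
    if low <= amount <= high:
        return "ablation complete - stop"
    if amount < low:
        return "continue ablation"
    return "ablation complete - stop"  # assumption: overshoot also stops
```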
- the computer system 106 may also adjust the target range for different regions of the organ.
- the neural network 802 may be trained to detect visual indicators that affect dosage (e.g., scarring, polyps, folds, turns, or other surface features). When the neural network 802 detects these features in a region, the computer system 106 may adjust the target range for that region. When that region is being ablated, the computer system 106 may present instructions or messages that guide the ablation according to the adjusted target range.
- Figure 10 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 10 shows the computer system 106 identifying subsequent treatment regions in the organ.
- the computer system 106 continues to receive the video 702 from the camera 112 in the tube 110 during the ablation procedure.
- the video 702 may indicate regions of the organ that have been treated and regions of the organ that have not been treated.
- the computer system 106 may use the neural network 802 to continue analyzing the video 702 to identify or detect evidence of ablation 1002. As discussed previously, if an electric current is used during the ablation procedure, then the evidence of ablation 1002 may include products of electrolysis that present as lightly colored rings on the inner wall of the organ.
- the neural network 802 may analyze the video 702 to identify these lightly colored rings as indicators of a region of the organ having been treated by ablation.
- If the neural network 802 does not detect sufficient evidence of ablation 1002, the computer system 106 may communicate the ablation signal 908 (e.g., to the surgery cart 102) so that ablation may be performed again on the contact region 906. If the neural network 802 detects sufficient evidence of ablation 1002, then the computer system 106 may use the neural network 802 to analyze the video 702 to identify another treatment region 1004. For example, the neural network 802 may analyze the video 702 to detect a region of the organ that does not present evidence of ablation 1002. The neural network 802 may even determine that this region presents visual indicators of illness or disease.
- the neural network 802 may identify this region as the next treatment region 1004.
- the computer system 106 may then cause or instruct the tube 110 and catheter 114 to move to the next treatment region 1004.
- the computer system 106 may track the movement of the tube 110 or the catheter 114 to detect when the tube 110 or the catheter 114 has been positioned by or near the treatment region 1004.
- the computer system 106 may inflate the catheter 114 and perform ablation at the treatment region 1004. In this manner, the computer system 106 uses the neural network 802 to monitor the stages of the ablation procedure.
- Figure 11 illustrates an example computer system 106 in the system 100 of Figure 1.
- Figure 11 shows the computer system 106 being used to analyze the results of an ablation procedure. For example, after an ablation procedure has been performed and a healing period has elapsed, the computer system 106 may analyze the effectiveness of the ablation procedure.
- the computer system 106 receives a video 1102.
- the video 1102 may be captured by the camera 112 in the tube 110.
- the tube 110 may have been inserted back into the organ after the healing period has elapsed.
- the video 1102 may show the inner wall of the organ.
- the computer system 106 may use the neural network 802 to analyze the video 1102 to identify regeneration 1104 on the inner wall of the organ.
- the neural network 802 may have been trained to identify evidence of regeneration 1104 on the inner walls of organs (e.g., by reviewing videos of organs that have been treated with ablation).
- the neural network 802 may analyze the video 1102 to identify or detect evidence of regeneration 1104.
- the computer system 106 may analyze the evidence of regeneration 1104 and make a treatment determination 1106.
- the treatment determination 1106 may indicate whether there is sufficient evidence of regeneration 1104.
- the computer system 106 may make a treatment determination 1106 that further ablation procedures are needed.
- the computer system 106 may make the treatment determination 1106 that further ablation procedures are not necessary. In this manner, the computer system 106 may be used to determine the effectiveness of an ablation procedure.
- Figure 12 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 12 shows the computer system 106 determining whether an ablation procedure may be needed.
- the computer system 106 receives a video 1202.
- the video 1202 may be generated by the camera 112 in the tube 110.
- the tube 110 may have been inserted into an organ during a routine scan or checkup.
- the video 1202 may show the inner wall of the organ.
- the computer system 106 may use the neural network 802 to analyze the video 1202 to detect landmarks 1204.
- the neural network 802 may be trained to identify visual indicators of various illnesses or diseases.
- the neural network 802 may determine whether the landmarks 1204 that appear in the video 1202 include these indicators.
- the computer system 106 may make a treatment recommendation 1206 based on the identified landmarks 1204. For example, the computer system 106 may recommend ablation if the landmarks 1204 indicate illness or disease that may be treated by ablation. If the landmarks 1204 do not indicate illness or disease that may be treated by ablation, the computer system 106 may not recommend ablation. In this manner, the computer system 106 uses machine learning to determine when ablation may be helpful, and to recommend ablation in those instances.
- Figure 13 is a flowchart of an example method 1300 performed in the system 100 of Figure 1.
- various components of the system 100 may perform the method 1300.
- the system 100 uses machine learning to assist mucosal ablation.
- the camera 112 in the tube 110 produces a video 702.
- the tube 110 and the camera 112 may have been inserted or positioned within an organ (e.g., the duodenum or the lungs).
- the video 702 may show the inner walls of the organ.
- the system 100 may position the catheter 114 in the organ.
- the catheter 114 may be inserted through a channel in the tube 110 into the organ.
- the catheter 114 may emerge from the tube 110 and into the organ.
- the computer system 106 generates the map 706 of the organ.
- the computer system 106 may use a SLAM algorithm to analyze the video 702 and to identify landmarks 704 in the video 702.
- the landmarks 704 may include the inner wall of the organ along with other body parts.
- the computer system 106 may analyze the landmarks 704 to determine the shape of the organ and the positioning of the walls of the organ.
- the computer system 106 may then generate the map 706 of the inside of the organ using these landmarks 704.
- the computer system 106 identifies a treatment region 804 in the organ.
- the computer system 106 may use the neural network 802 to analyze the video 702 to identify the treatment region 804.
- the neural network 802 may be trained to identify or detect visual indicators of disease or illness.
- the neural network 802 may identify these indicators of disease or illness in the video 702.
- the neural network 802 may identify the treatment region 804 that includes these indicators of disease or illness.
- the computer system 106 determines the position 714 of the catheter 114 in the organ.
- the computer system 106 may receive sensor measurements (e.g., the position measurement 708, the shape measurement 710, or the kinematic measurement 712) and determine the position 714 of the catheter 114 using the sensor measurements.
- the position 714 may be a position of the catheter 114 within the map 706 of the organ.
- the computer system 106 may determine whether the catheter 114 is positioned by or near the treatment region 804. If the catheter 114 is not positioned near or by the treatment region 804, the computer system 106 may instruct or cause the catheter 114 to be moved closer to the treatment region 804.
- the computer system 106 ablates the treatment region 804. Specifically, when the computer system 106 determines that the catheter 114 is positioned by or near the treatment region 804, the computer system 106 may begin ablation of the treatment region 804. The computer system 106 may communicate the inflation signal 806 that causes the ablation tool 402 on the catheter 114 to inflate so that the ablation tool 402 contacts the inner wall of the organ. The computer system 106 may then communicate the ablation signal 908, which causes the ablation tool 402 to ablate the treatment region 804. For example, the ablation signal 908 may cause the ablation tool 402 to apply an electric current to the treatment region 804.
- the ablation signal 908 may cause the ablation tool 402 to cause hot fluid to flow over or under the treatment region 804.
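The steps of method 1300 above (detect a treatment region, reposition the catheter, inflate, ablate) can be sketched as a small control loop. The function names, the callback-style interface, and the retry limit are all illustrative assumptions; the disclosed system performs these steps with the neural network 802, the SLAM-derived position 714, and the inflation and ablation signals.

```python
def run_ablation_procedure(detect_region, is_catheter_at, move_catheter,
                           inflate, ablate, max_moves=10):
    """Toy driver for the detect / position / inflate / ablate loop."""
    region = detect_region()                 # e.g., treatment region 804
    if region is None:
        return "no treatment needed"
    moves = 0
    while not is_catheter_at(region):        # positioning check via the map
        if moves >= max_moves:
            return "positioning failed"
        move_catheter(region)                # e.g., via the actuator box
        moves += 1
    inflate()                                # analogous to inflation signal 806
    ablate(region)                           # analogous to ablation signal 908
    return "ablated"
```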
- the computer system 106 uses machine learning to guide or assist the ablation procedure in the organ, which may result in more consistent ablation procedures and more consistent results for ablation procedures across different physicians.
- the system 100 uses machine learning to guide or assist mucosal ablation procedures (e.g., in the duodenum, lungs, esophagus, stomach, intestines, or colon).
- the system 100 uses machine learning in two different ways.
- the system 100 uses machine learning (e.g., a SLAM algorithm) to map the inside of an organ and to detect the position of a surgical tool (e.g., the catheter 114 or the ablation tool 402) inside the organ.
- the system 100 uses machine learning to detect treatment regions 804 inside the organ.
- the system 100 may use the neural network 802 to analyze a video 702 of the inside of the organ.
- the neural network 802 may detect regions of the organ where mucosal ablation has been performed and regions of the organ where mucosal ablation is needed.
- the system 100 may then guide the surgical tool to a region where mucosal ablation has not been performed so that the region may be ablated.
- spatially relative terms such as “beneath”, “below”, “lower”, “above”, “upper”, “proximal”, “distal”, and the like may be used to describe one element’s or feature’s relationship to another element or feature as illustrated in the figures.
- These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of the elements or their operation in addition to the position and orientation shown in the figures. For example, if the content of one of the figures is turned over, elements described as “below” or “beneath” other elements or features would then be “above” or “over” the other elements or features.
- the exemplary term “below” can encompass both positions and orientations of above and below.
- a device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- descriptions of movement along and around various axes include various spatial element positions and orientations.
- the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise.
- the terms “comprises”, “comprising”, “includes”, and the like specify the presence of stated features, steps, operations, elements, and/or components but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups.
- Components described as coupled may be electrically or mechanically directly coupled, or they may be indirectly coupled via one or more intermediate components.
- the term “position” refers to the location of an element or a portion of an element in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian x-, y-, and z-coordinates).
- the term “orientation” refers to the rotational placement of an element or a portion of an element (three degrees of rotational freedom - e.g., roll, pitch, and yaw).
- the term “shape” refers to a set of positions or orientations measured along an element.
- the term “proximal” refers to a direction toward the base of the computer-assisted device along its kinematic chain, and the term “distal” refers to a direction away from the base along the kinematic chain.
- aspects of this disclosure are described in reference to computer-assisted systems and devices, which may include systems and devices that are teleoperated, remote-controlled, autonomous, semiautonomous, robotic, and/or the like. Further, aspects of this disclosure are described in terms of an embodiment using a medical system, such as the DA VINCI SURGICAL SYSTEM or ION SYSTEM commercialized by Intuitive Surgical, Inc. of Sunnyvale, California. Knowledgeable persons will understand, however, that aspects disclosed herein may be embodied and implemented in various ways, including robotic and, if applicable, non-robotic embodiments. Techniques described with reference to surgical instruments and surgical methods may be used in other contexts.
- the instruments, systems, and methods described herein may be used for humans, animals, portions of human or animal anatomy, industrial systems, general robotic, or teleoperational systems.
- the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, sensing or manipulating non-tissue work pieces, cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, setting up or taking down systems, training medical or non-medical personnel, and/or the like.
- Additional example applications include use for procedures on tissue removed from human or animal anatomies (with or without return to a human or animal anatomy) and for procedures on human or animal cadavers. Further, these techniques can also be used for medical treatment or diagnosis procedures that include, or do not include, surgical aspects.
Abstract
The present disclosure describes a system and method for using machine learning to assist in mucosal ablation procedures. The system includes a camera, a catheter, a memory, and a processor communicatively coupled to the memory. The camera produces a video of a mucosa of an organ and the catheter is positioned in the organ. The processor generates, using the video from the camera, a map of the organ and identifies, using a neural network and the video from the camera, a first portion of the mucosa to be ablated. The processor also determines, using the map and motion of the catheter, that the catheter is positioned by the first portion of the mucosa and in response to determining that the catheter is positioned by the first portion of the mucosa, ablates, using the catheter, the first portion of the mucosa.
Description
MACHINE LEARNING ASSISTED MUCOSAL ABLATION
TECHNICAL FIELD
[0001] The present disclosure relates generally to mucosal ablation systems. Specifically, the present disclosure relates to a machine learning assisted mucosal ablation system.
BACKGROUND
[0002] Mucosal ablation systems are used to ablate or remove the mucosal lining in certain organs of the human body. Removing this lining and its subsequent regeneration may assist in treating certain illnesses or disorders. For example, mucosal ablation systems may be used to remove the mucosal lining inside the duodenum (e.g., the organ in the digestive tract between the stomach and the small intestines) to treat diabetes. As another example, mucosal ablation systems may be used to treat fatty liver disease and polycystic ovarian syndrome.
SUMMARY
[0003] The present disclosure describes a system and method for using machine learning to assist in mucosal ablation procedures. According to an embodiment, a system for mucosal resurfacing includes a camera, a catheter, a memory, and a processor communicatively coupled to the memory. The camera produces a video of a mucosa of an organ and the catheter is positioned in the organ. The processor generates, using the video from the camera, a map of the organ and identifies, using a neural network and the video from the camera, a first portion of the mucosa to be ablated. The processor also determines, using the map and motion of the catheter, that the catheter is positioned by the first portion of the mucosa and in response to determining that the catheter is positioned by the first portion of the mucosa, ablates, using the catheter, the first portion of the mucosa.
[0004] According to another embodiment, a method for mucosal resurfacing includes producing, using a camera, a video of a mucosa of an organ and positioning a catheter in the organ. The method also includes generating, using the video from the camera, a map of the organ and identifying, using a neural network and the video from the camera, a first portion of the mucosa to be ablated. The method further
includes determining, using the map and motion of the catheter, that the catheter is positioned by the first portion of the mucosa and in response to determining that the catheter is positioned by the first portion of the mucosa, ablating, using the catheter, the first portion of the mucosa. Other embodiments include a non-transitory machine-readable medium storing instructions that, when executed by a processor, cause the processor to perform the method.
[0005] The foregoing general description and the following detailed description are exemplary and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. In that regard, additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] Figure 1 illustrates an example mucosal ablation system.
[0007] Figure 2 illustrates an example tube in the system of Figure 1.
[0008] Figure 3 illustrates an example tube in the system of Figure 1.
[0009] Figure 4 illustrates an example catheter in the system of Figure 1.
[0010] Figure 5 illustrates an example ablation tool on the catheter of Figure 4.
[0011] Figure 6 illustrates an example positioning of the tube and catheter of the system of Figure 1.
[0012] Figure 7 illustrates an example computer system in the system of Figure 1.
[0013] Figure 8 illustrates an example computer system in the system of Figure 1.
[0014] Figure 9 illustrates an example computer system in the system of Figure 1.
[0015] Figure 10 illustrates an example computer system in the system of Figure 1.
[0016] Figure 11 illustrates an example computer system in the system of Figure 1.
[0017] Figure 12 illustrates an example computer system in the system of Figure 1.
[0018] Figure 13 is a flowchart of an example method performed in the system of Figure 1.
DETAILED DESCRIPTION
[0019] Mucosal ablation may help treat certain illnesses and disorders. During the mucosal ablation procedure, the mucosal lining inside an organ is ablated or removed, which allows the mucosal lining to subsequently regenerate. This removal and regeneration may refresh the mucosal lining in the organ, which may help treat certain illnesses and disorders. For example, mucosal ablation of the duodenum has been demonstrated to help treat type II diabetes. Mucosal ablation procedures, however, are traditionally performed visually by a surgeon. As a result, the procedures themselves may be inconsistently performed across different surgeons, which leads to inconsistent results. For example, some surgeons may remove too much or too little of the mucosal lining, which may further harm the patient without treating the illness or disorder.
[0020] The present disclosure describes a system that uses machine learning to guide or assist mucosal ablation procedures (e.g., in the duodenum, lungs, esophagus, stomach, intestines, or colon). Generally, the system uses machine learning in two different ways. First, the system uses machine learning to map the inside of an organ and to detect the position of a surgical tool (e.g., a catheter or ablation tool) inside the organ. This machine learning technique may be referred to as simultaneous localization and mapping (SLAM). For example, using this technique, the system may map the stomach in three-dimensional space and then perform a three-dimensional reconstruction of the stomach to produce a three-dimensional map or model of the stomach. Second, the system uses machine learning to detect treatment regions inside the organ. For example, the system may use a neural network to analyze a video feed of the inside of the organ. The neural network may detect regions of the organ where mucosal ablation has been performed and regions of the organ where mucosal ablation has not been performed. The system may then guide the surgical tool to a region where mucosal ablation has not been performed so that the region may be ablated. Additionally, the neural network may analyze the video
feed during ablation to determine dosage. For example, the neural network may detect visual indicators of ablation and determine whether more ablation should be performed on the treatment region. The system may provide visual or audio cues to indicate whether more ablation should be performed or whether ablation should be stopped.
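By way of illustration only, the two-stage use of machine learning outlined above may be sketched in highly simplified form. The function names, labels, and data layouts below are hypothetical placeholders, not part of the disclosed system; a real implementation would run a SLAM algorithm and a trained neural network in place of these stand-ins.

```python
# Illustrative sketch of the two machine learning stages described above.
# All function names and data structures are hypothetical placeholders.

def update_map_and_pose(video_frame, prior_map, sensor_readings):
    """Stage 1 (SLAM stand-in): extend the organ map and localize the tool."""
    organ_map = prior_map + [video_frame]      # accumulate observations
    tool_pose = sensor_readings["position"]    # trust the position sensor here
    return organ_map, tool_pose

def classify_region(video_frame):
    """Stage 2 (neural network stand-in): label the region currently in view."""
    if video_frame["ablation_ring"]:
        return "treated"
    if video_frame["disease_signs"]:
        return "treat"
    return "skip"

# Tiny simulated procedure: two frames from the camera feed.
frames = [
    {"ablation_ring": False, "disease_signs": True},
    {"ablation_ring": True, "disease_signs": True},
]
organ_map, pose, labels = [], None, []
for i, frame in enumerate(frames):
    organ_map, pose = update_map_and_pose(frame, organ_map, {"position": (i, 0, 0)})
    labels.append(classify_region(frame))

print(labels)   # ['treat', 'treated']
print(pose)     # (1, 0, 0)
```

The first frame shows untreated diseased mucosa and is flagged for treatment; the second already shows an ablation ring and is skipped.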
[0021] In certain embodiments, the system provides several technical advantages. For example, the system may provide a map of the organ and indicate a location of the surgical tool within the map of the organ. This information may provide a more expansive view of the positioning of the surgical tool during the procedure relative to existing systems that provide only a view from behind the surgical tool. As another example, the system may automatically detect regions of the organ that have not been ablated or that are under-ablated and guide the surgical tool towards those regions. In this manner, the system may protect against under-ablation. The system may also detect regions of the organ that have been ablated and guide the surgical tool away from those regions, which protects against over-ablation. Thus, the system may improve the consistency of mucosal ablation procedures performed by different surgeons and improve the consistency of results. Additionally, the system may shorten the amount of time it takes to perform an ablation procedure, which may improve patient safety and recovery.
[0022] Figure 1 illustrates an example mucosal ablation system 100. As seen in Figure 1, the system 100 includes a surgery cart 102, a control station 104, and a computer system 106. Generally, the system 100 may be used to ablate or remove the mucosal lining of an organ in the body. Removal of the mucosal lining and its subsequent regeneration may help treat certain illnesses or disorders, such as, for example, type II diabetes. The system 100 may be used to ablate the mucosal lining of any suitable organ. For example, the system 100 may be used to ablate the mucosal lining of the duodenum, the lungs, the esophagus, the stomach, the intestines, or the colon.
[0023] The surgery cart 102 may include the tools and devices that perform the mucosal ablation procedure. As seen in Figure 1, the surgery cart 102 includes an actuator box 108, a tube 110, a camera 112, a catheter 114, and a display 116. Generally, the tube 110, camera 112, and catheter 114 may form a surgical tool that
is controlled by the actuator box 108. The surgery cart 102 may be positioned or moved next to a subject or patient. A guide wire may then be inserted or positioned within the subject’s or patient’s body and into an organ. The actuator box 108 may then insert the tube 110 (and the camera 112) along the guide wire and into the organ. The catheter 114 may also be inserted through the tube 110 and into the organ. The ablation procedure may then be performed using the camera 112 and the catheter 114. After the ablation procedure is complete, the actuator box 108 may retract the catheter 114 and the tube 110.
[0024] The camera 112 may be positioned at a distal end of the tube 110 (e.g., an end opposite the actuator box 108). The camera 112 may be a monocular or stereo camera system. The camera 112 may provide one or more video feeds of the environment within the organ when the tube 110 is inserted into the organ. The video feeds may show the movement of the tube 110 or the catheter 114 along with the progress of the ablation procedure.
[0025] The catheter 114 may be inserted through the tube 110 and into the organ. The catheter 114 may include a portion (e.g., an ablation tool) that may be inflated to contact the mucosal lining of the organ. Additionally, the catheter 114 may be used to perform the ablation procedure once the catheter 114 has been inflated to contact the mucosal lining. For example, the catheter 114 may include cathodes and anodes that apply an electric current to the mucosal lining, which causes poration and ablation of the mucosal lining. As another example, the portion may include jets that cause a hot fluid to flow over or beneath the mucosal lining, which causes ablation of the mucosal lining.
[0026] The display 116 may display the video feed from the camera 112 during the ablation procedure. An operator of the system 100 (e.g., a surgeon) may view the display 116 to inspect the progress of the ablation procedure. The display 116 may also present other vital information about the subject or patient. In some embodiments, the display 116 also presents a map of the organ, which indicates a position or location of the tube 110 and the catheter 114 in the map. Additionally, the display 116 may present markings on the displayed video feed to indicate regions of the organ that have been ablated or regions of the organ that have not been ablated. Moreover, the display 116 may present messages or other visual markings to indicate
dosage. For example, the display 116 may present messages indicating that more ablation should be performed on a treatment region or messages indicating that ablation of a treatment region should be stopped. This information may assist or guide the ablation procedure, which may improve the consistency of the results of the ablation procedure.
[0027] The operator of the system 100 may use the control station 104 to control the surgery cart 102. As seen in Figure 1, the control station 104 includes a display 118 and a control 120. The display 118 may provide information about the ablation procedure, similar to the display 116. For example, the display 118 may present the video feed from the camera 112 to the operator of the system 100, the map of the organ, markings that indicate ablated or non-ablated regions of the organ, and/or messages or visual indicators indicating whether more ablation should be performed or whether ablation should be stopped. The operator of the system 100 may use the control 120 to control the movement of the surgery cart 102 or the operation of the actuator box 108. For example, the operator of the system 100 may use the control 120 to control the movement of the surgery cart 102, the movement of the tube 110, or the movement of the catheter 114.
[0028] The computer system 106 may use machine learning to assist the operator of the system 100 during the ablation procedure. In certain embodiments, the computer system 106 is separate from the surgery cart 102 and the control station 104. In some embodiments, the computer system 106 is partially or fully embodied within the surgery cart 102 and/or the control station 104. As seen in Figure 1, the computer system 106 includes a processor 122 and a memory 124, which perform the actions or functions of the computer system 106 described herein.
[0029] In a first example, the computer system 106 uses machine learning to determine a location or position of the catheter 114 in the organ. The computer system 106 uses a SLAM algorithm that analyzes the video feed from the camera 112 to map the inside of the subject’s or patient’s organ. The SLAM algorithm also considers sensor measurements from sensors in or on the tube 110 to determine a position or location of the tube 110 and catheter 114 within the map. For example, the tube 110 may include position sensors, kinematic sensors, or shape sensors that measure the position, movement, or shape of the tube 110, respectively. The SLAM algorithm may
consider these sensor measurements to determine a position or location of the catheter 114 or the tube 110 within the generated map of the inside of the subject’s or patient’s organ. The computer system 106 may present, on the display 116 or the display 118, the map of the organ along with the position or location of the tube 110 or the catheter 114 inside the organ. The operator of the system 100 may view the map to see a more expansive view of the positioning of the tube 110 and the catheter 114 within the organ. As a result, the operator of the system 100 is provided more positioning and location information during the ablation procedure relative to existing systems that merely provide the operator the video feed from the camera 112.
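The paragraph above describes the SLAM algorithm weighing a camera-derived estimate against position, shape, and kinematic sensor measurements. As a minimal sketch of that idea, a complementary blend of a visual position estimate and a position-sensor reading is shown below; the weighting scheme and parameter `alpha` are illustrative assumptions, and a production system might instead use a Kalman filter over all sensor modalities.

```python
def fuse_position(visual_estimate, sensor_estimate, alpha=0.7):
    """Blend a camera-derived (SLAM) position with a position-sensor reading.

    alpha weights the visual estimate; (1 - alpha) weights the sensor.
    Both inputs are (x, y, z) tuples in a shared coordinate frame.
    The fixed weighting is a hypothetical simplification.
    """
    return tuple(alpha * v + (1 - alpha) * s
                 for v, s in zip(visual_estimate, sensor_estimate))

# Visual odometry says (10, 2, 0) mm; the position sensor reports (12, 2, 1) mm.
fused = fuse_position((10.0, 2.0, 0.0), (12.0, 2.0, 1.0))
print(fused)
```

The fused estimate lands between the two sources, closer to the more heavily weighted visual estimate.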
[0030] In a second example, the computer system 106 uses a neural network to analyze or process the video feed from the camera 112 to determine and monitor the stages of the ablation procedure. The neural network may be trained (e.g., through analyzing video feeds from other ablation procedures) to identify or detect visual indicators of illness or disease and visual indicators of ablation. After training, the neural network may analyze the video feed from the camera 112 during an ablation procedure to identify regions of the organ where the mucosal lining should be ablated. For example, the neural network may identify from the video feed regions or portions of the organ where the mucosal lining presents signs of damage or disease. The neural network may identify these regions of the organ for ablation. The computer system 106 may use the SLAM algorithm to determine when the catheter 114 is positioned by the portion or region where the mucosal lining should be ablated. When the catheter 114 is properly positioned, the computer system 106 may inflate the catheter 114 and ablate the mucosal lining at the portion or region of the organ. The neural network may also analyze the video feed from the camera 112 to identify portions or regions of the organ that have been ablated. For example, when an electric current is used to ablate the mucosal lining, the neural network may identify regions or portions of the organ that had been ablated by identifying products of electrolysis on the internal surface of the organ, which may appear as lightly colored rings. When the neural network determines that a portion or a region of the organ has been ablated, the neural network may instruct that the catheter 114 be moved to another region of the organ that needs treatment.
[0031] In certain embodiments, the computer system 106 may present visual, audio, or tactile instructions for navigating the tube 110 or the catheter 114. For example, the computer system 106 may present on the display 116 or the display 118 arrows that show the direction the tube 110 or the catheter 114 should be moved to position the tube 110 or the catheter 114 by or near a treatment region. As another example, the computer system 106 may produce audible messages that instruct the operator of the system 100 how to move or navigate the tube 110 or the catheter 114. As another example, the computer system 106 may produce tactile feedback (e.g., haptic vibrations or movements) at the control 120 to indicate to the operator of the system 100 how to move or navigate the tube 110 or the catheter 114.
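A directional cue such as the on-screen arrows described above could be derived from the offset between the tool's localized position and the target treatment region. The sketch below buckets the in-plane offset into one of four arrows; the bucketing rule and coordinate convention are hypothetical simplifications.

```python
def navigation_cue(tool_position, target_position):
    """Return a coarse on-screen direction cue toward the target region.

    Positions are (x, y, z) tuples; only the x/y display-plane offset is
    used. The four-way bucketing is an illustrative assumption.
    """
    dx = target_position[0] - tool_position[0]
    dy = target_position[1] - tool_position[1]
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "up" if dy > 0 else "down"

print(navigation_cue((0, 0, 5), (3, 1, 5)))   # right
print(navigation_cue((0, 0, 5), (-1, 4, 5)))  # up
```

The same cue value could equally drive an audible prompt or haptic feedback at the control 120.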
[0032] In particular embodiments, the neural network may be trained to predict or determine regions of the organ where ablation is undesired. For example, the neural network may detect regions of the organ that have scarring or polyps. The neural network may identify these regions and determine that these regions should not be ablated. In response, the computer system 106 may instruct the operator of the system 100 to move past these regions or to avoid ablating these regions. The computer system 106 may also avoid identifying these regions as treatment regions.
[0033] In some embodiments, the neural network may be trained to determine dosage using visual indicators of ablation (e.g., products of electrolysis). For example, the neural network may be trained to predict or determine an amount of ablation that has occurred based on the visual indicators of ablation that appear in the video feed. The computer system 106 may determine whether the determined amount of ablation falls within a target range (e.g., a target dosing range). If the amount of ablation falls within the target range, the computer system 106 may present a message (e.g., on the display 116 or the display 118) indicating that ablation should be stopped. The operator of the system 100 may then stop ablation of the treatment region. If the amount of ablation falls beneath the target range, the computer system 106 may present no message or may present a message indicating that ablation should continue. The operator of the system 100 may then continue ablation of the treatment region. After the operator of the system 100 indicates that the ablation procedure is complete, the computer system 106 may use the neural network to continue analyzing the video feed to determine regions of the organ that have received an amount of
ablation that falls beneath the target range. If the computer system 106 identifies such a region, the computer system 106 may present a message indicating that the region should be re-ablated. The operator of the system 100 may then ablate the region.
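The dosing logic described above can be sketched as a simple threshold check. The numeric dose scale, range values, and message strings below are hypothetical; the disclosure only specifies that a dose within the target range prompts a stop message and a dose beneath it prompts continuation, so the handling of an over-target dose is an added assumption.

```python
def dosage_message(estimated_dose, target_range):
    """Map a network-estimated ablation amount to an operator message.

    target_range is a (low, high) pair on the same (hypothetical) scale
    as estimated_dose.
    """
    low, high = target_range
    if estimated_dose < low:
        return "continue ablation"
    if estimated_dose <= high:
        return "stop ablation"
    return "over target: stop ablation"  # not specified above; assumed

print(dosage_message(0.4, (0.6, 0.9)))  # continue ablation
print(dosage_message(0.7, (0.6, 0.9)))  # stop ablation
```

After the procedure, the same check applied per region would flag any region still below `low` for re-ablation.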
[0034] The computer system 106 may also adjust the target range for different regions of the organ. The neural network may be trained to detect visual indicators that affect dosage (e.g., scarring, polyps, folds, turns, or other surface features). When the neural network detects these features in a region, the computer system 106 may adjust the target range for that region. When that region is being ablated, the computer system 106 may present instructions or messages that guide the ablation according to the adjusted target range.
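The per-region adjustment described above might be sketched as a shift of the target range based on detected surface features. The specific offset values per feature are purely hypothetical; the disclosure states only that detected features such as scarring, polyps, folds, or turns cause the range to be adjusted.

```python
def adjusted_target_range(base_range, region_features):
    """Shift the dosing target range for a region with surface features.

    The offsets below are invented for illustration; scarring lowers the
    range while folds and turns raise it slightly.
    """
    offsets = {"scarring": -0.1, "fold": 0.05, "turn": 0.05}
    shift = sum(offsets.get(feature, 0.0) for feature in region_features)
    low, high = base_range
    return (low + shift, high + shift)

print(adjusted_target_range((0.6, 0.9), ["fold", "turn"]))
print(adjusted_target_range((0.6, 0.9), ["scarring"]))
```

Guidance messages during ablation of that region would then be generated against the adjusted range rather than the base range.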
[0035] In this manner, the computer system 106 uses machine learning to assist the operator of the system 100 during the ablation procedure. Specifically, the computer system 106 may track the location or position of the catheter 114 in the organ and monitor or track the stages of the ablation procedure. The computer system 106 may guide the procedure and may present information about the procedure to the operator of the system 100. In certain embodiments, by using machine learning, the computer system 106 improves the consistency of ablation procedures, and the consistency of results for the ablation procedures.
[0036] The processor 122 is any electronic circuitry, including, but not limited to, one or a combination of microprocessors, microcontrollers, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), and/or state machines, that communicatively couples to memory 124 and controls the operation of the computer system 106. The processor 122 may be 8-bit, 16-bit, 32-bit, 64-bit or of any other suitable architecture. The processor 122 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from memory and executes them by directing the coordinated operations of the ALU, registers and other components. The processor 122 may include other hardware that operates software to control and process information. The processor 122 executes software stored on the memory 124 to perform any of the functions described herein. The processor 122 controls the operation and administration of the computer system 106 by processing information
(e.g., information received from the surgery cart 102, control station 104, and memory 124). The processor 122 is not limited to a single processing device and may encompass multiple processing devices.
[0037] The memory 124 may store, either permanently or temporarily, data, operational software, or other information for the processor 122. The memory 124 may include any one or a combination of volatile or non-volatile local or remote devices suitable for storing information. For example, the memory 124 may include random access memory (RAM), read only memory (ROM), magnetic storage devices, optical storage devices, or any other suitable information storage device or a combination of these devices. The software represents any suitable set of instructions, logic, or code embodied in a computer-readable storage medium. For example, the software may be embodied in the memory 124, a disk, a CD, or a flash drive. In particular embodiments, the software may include an application executable by the processor 122 to perform one or more of the functions described herein.
[0038] Figure 2 illustrates an example tube 110 in the system 100 of Figure 1. As seen in Figure 2, the tube 110 may be a flexible tube that houses the camera 112. Additionally, the tube 110 may define one or more channels (which may also be referred to as lumens). In the example of Figure 2, the tube 110 defines the channels 202, 204, and 206. Various instruments may be inserted through the tube 110 and through the channels 202, 204, and 206. For example, the catheter 114 or a guide wire may be inserted through the channels 202, 204, and 206. Additionally, the tube 110 may house or include lights 208. The lights 208 may illuminate the area in front of the tube 110 so that the camera 112 may capture video footage of the region in front of the tube 110.
[0039] Figure 3 illustrates an example tube 110 in the system 100 of Figure 1. As seen in Figure 3, the tube 110 may be a flexible tube that includes a distal end 302 (e.g., an end inserted into the organ) and a proximal end 304 (e.g., an end closest to the surgery cart 102). The distal end 302 may include the camera 112 and the light 208 that illuminates the region in front of the tube 110 in the organ. The proximal end 304 may be connected to the actuator box 108. The tube 110 may be formed using segments 310. Each segment 310 may be connected so as to allow the tube 110 to bend or fold along the segments 310.
[0040] Additionally, the tube 110 may include multiple sensors that monitor or measure various aspects of the tube 110. As seen in Figure 3, the tube 110 includes a position sensor 306, a kinematic sensor 307, a shape sensor 308, a volume sensor 312, and a pressure sensor 314. The tube 110 may also include one or more of a depth sensor, a stereo vision sensor, an active stereo vision sensor, or an infrared sensor. Each of these sensors may be positioned on or within the tube 110. The position sensor 306 may track or measure a position of the tube 110. For example, the position sensor 306 may determine coordinates representing a position or location of the tube 110. The kinematic sensor 307 may detect or measure movement or motion of the tube 110. For example, the kinematic sensor 307 may measure an acceleration or velocity of the tube 110. The kinematic sensor 307 and the position sensor 306 may include accelerometers that detect the movement or positioning of the tube 110. The shape sensor 308 may detect or measure a shape of the tube 110. For example, the shape sensor 308 may include an optical fiber that is used to detect when the tube 110 bends or folds.
[0041] The volume sensor 312 and the pressure sensor 314 may detect various aspects of the catheter 114 inserted through the tube 110. For example, the volume sensor 312 may detect a volume of the catheter 114 when the catheter 114 has been inflated. The pressure sensor 314 may detect a pressure experienced by the catheter 114 when the catheter 114 is inflated. For example, the pressure sensor 314 may detect the pressure caused by the organ pressing down on the inflated catheter 114.
[0042] Figure 4 illustrates an example catheter 114 in the system 100 of Figure 1. As seen in Figure 4, the catheter 114 may include an ablation tool 402 positioned near an end of the catheter 114. The catheter 114 may be inserted through the tube 110 and into an organ. The ablation tool 402 may emerge from the tube 110 and into the organ. When the catheter 114 and the ablation tool 402 have been positioned by or near a treatment region, the ablation tool 402 may be inflated so that the ablation tool 402 contacts the inner wall of the organ. The ablation tool 402 may then ablate or remove the mucosal lining on the inner wall of the organ. For example, the ablation tool 402 may apply an electric current to the mucosal lining, or the ablation tool 402 may cause a hot fluid to flow over or under the mucosal lining. The electric current or the hot fluid may cause ablation of the mucosal lining.
[0043] Figure 5 illustrates an example ablation tool 402 on the catheter 114 of Figure 4. As seen in Figure 5, the ablation tool 402 includes an inflatable surface 502, which resembles a balloon, a cathode 504, and an anode 506. Generally, the inflatable surface 502 may expand when air or another fluid is injected into the ablation tool 402. The inflatable surface 502 may expand to contact the inner wall of the organ. The cathode 504 and anode 506 may be positioned on the inflatable surface 502. When the inflatable surface 502 expands to contact the inner wall of the organ, the inflatable surface 502 may cause the cathode 504 and the anode 506 to also contact the inner wall of the organ. An electric current may then be applied between the cathode 504 and the anode 506 and across the inner wall of the organ. The electric current may cause ablation of the mucosal lining on the inner wall of the organ. For example, the electric current may cause the cells of the mucosal lining to porate and die, which results in ablation of the mucosal lining.
[0044] Figure 6 illustrates an example positioning of the tube 110 and the catheter 114 of the system 100 of Figure 1. In the example of Figure 6, the tube 110 is inserted into a duodenum 602 of a subject or patient, which is the first section of the small intestine, located just past the stomach. After the tube 110 is inserted into the duodenum 602, the catheter 114 may then be inserted through the tube 110. The catheter 114 may emerge from the front of the tube 110 into the duodenum 602. The ablation tool 402 on the catheter 114 may then be inflated to contact the inner wall of the duodenum 602. The ablation tool 402 may then ablate the mucosal lining on the inner wall of the duodenum 602. For example, the ablation tool 402 may apply an electric current or hot fluid to the mucosal lining to ablate the mucosal lining. After the procedure is complete, the catheter 114 and the tube 110 may be retracted and extracted from the duodenum 602.
[0045] Figure 7 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 7 shows the computer system 106 using a SLAM algorithm to determine a position or orientation of the catheter 114 within a map of an organ. In particular embodiments, the computer system 106 provides a more expansive view of the positioning of the catheter 114 within the organ relative to existing ablation systems that present only a video feed from the camera in the tube in the organ.
[0046] The computer system 106 receives a video 702. The video 702 may be generated by the camera 112 in the tube 110. As discussed previously, the camera 112 is positioned at the front of the tube 110 and within the organ. The video 702 may be captured from the perspective of the front of the tube 110. The video 702 may show the inner wall of the organ. Additionally, the video 702 may show the catheter 114 after the catheter 114 has emerged from the tube 110 and into the organ. The computer system 106 may analyze the video 702 to identify landmarks 704 that appear in the video 702. The landmarks 704 may include the inner wall of the organ along with other body parts. For example, if the organ is the duodenum, the landmarks 704 may include the papillae that are attached to the duodenum. The computer system 106 may analyze the landmarks 704 to generate a map 706 of the organ. For example, the computer system 106 may use the shape and size of the inner wall of the organ to produce the boundaries of the organ in the map 706. Additionally, the computer system 106 may use the other body parts that appear in the landmarks 704 to verify or confirm the shape and positioning of the walls of the organ in the map 706. For example, the computer system 106 may determine where certain portions of the inner wall are located in the duodenum when the computer system 106 detects a papilla in the video 702.
[0047] The computer system 106 may also receive measurements from various sensors on the tube 110 or the catheter 114. In the example of Figure 7, the computer system 106 receives a position measurement 708, a shape measurement 710, and/or a kinematic measurement 712. The computer system 106 may receive the position measurement 708 from the position sensor 306 in or on the tube 110. The position measurement 708 may include coordinates that identify a position of the tube 110 in free space. The computer system 106 may receive the shape measurement 710 from the shape sensor 308 in or on the tube 110. The shape measurement 710 may indicate a deformation of the tube 110. For example, the shape measurement 710 may indicate a fold or a bend in the tube 110. The computer system 106 may receive the kinematic measurement 712 from the kinematic sensor 307 in or on the tube 110. The kinematic measurement 712 may indicate a motion or a movement of the tube 110. For example, as the tube 110 is moved through the body and into the organ, the kinematic sensor 307 may produce the kinematic measurement 712 to indicate the direction and the distance moved by the tube 110.
[0048] According to the SLAM algorithm, the computer system 106 may use the landmarks 704, the position measurement 708, the shape measurement 710, and/or the kinematic measurement 712 to generate a map 706 of the organ. For example, the computer system 106 may locate a landmark 704 in several frames of the video 702. The computer system 106 may analyze each of the frames to identify or match the landmark 704 in each of the frames. The landmark 704 may move to a different position in the frames due to movement of the tube 110 in the organ. The position measurement 708, the shape measurement 710, and/or the kinematic measurement 712 may indicate the movement of the tube 110 that occurred between frames. Using this information, the computer system 106 may determine how the frames correspond to one another in a three-dimensional space (e.g., the depth of one frame relative to another frame). The computer system 106 may then stitch the frames together according to the positioning of the landmarks 704 and according to the position measurement 708, the shape measurement 710, and/or the kinematic measurement 712 to generate the map 706, which may be a three-dimensional map.
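As a simplified illustration of the depth reasoning described above, the apparent shift of a single landmark between two frames, combined with the measured sideways motion of the tube between those frames, yields a depth estimate. The sketch below uses the textbook stereo parallax relation with hypothetical numbers; it is a conceptual stand-in for the multi-frame SLAM reconstruction, not the disclosed algorithm.

```python
def landmark_depth(pixel_shift, baseline, focal_length):
    """Estimate landmark depth from its apparent shift between two frames.

    Uses the stereo parallax relation depth = focal * baseline / shift,
    treating the tube's measured sideways motion between frames as a
    stereo baseline. Units: baseline and returned depth in mm, shift
    and focal length in pixels (all values here are hypothetical).
    """
    if pixel_shift == 0:
        return float("inf")  # no parallax: landmark effectively at infinity
    return focal_length * baseline / pixel_shift

# A landmark shifts 20 px when the camera moves 2 mm; focal length 500 px.
print(landmark_depth(20.0, 2.0, 500.0))  # 50.0 (mm)
```

Repeating such estimates across many landmarks and frame pairs is what allows the frames to be stitched into a consistent three-dimensional map.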
[0049] The computer system 106 may then localize the tube 110 or the camera 112 in the map 706 using the positioning of the landmarks 704 and/or the position measurement 708, the shape measurement 710, and/or the kinematic measurement 712. By localizing the tube 110 or the camera 112, the computer system 106 may determine a position and/or orientation 714 of the tube 110 or the camera 112 within the map 706. The computer system 106 may then present the map 706 and the position and/or orientation 714 on the display 116 or the display 118. The display 116 or 118 may then present the operator of the system 100 a more expansive view of the position of the tube 110 in the organ relative to existing systems that merely present the video feed from the front of the tube 110. With this more expansive view, the operator of the system 100 may gain a better understanding of the positioning of the catheter 114 within the organ.
[0050] Figure 8 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 8 shows the computer system 106 using a neural network 802 to identify treatment regions 804 in the organ. In this manner, the computer system 106 uses machine learning to guide or assist the stages of the ablation procedure.
[0051] The neural network 802 may have been trained by reviewing videos of ablation procedures. These videos may show regions of the organs that are diseased or damaged, and the videos may show the ablation procedure on these regions along with the results. By analyzing these videos, the neural network 802 may learn visual indicators of illness or disease on the inner wall of the organ. Additionally, the neural network 802 may learn visual indicators of successful ablation. For example, if ablation by electric current is used, the neural network 802 may be trained to identify products of electrolysis that present as light-colored rings on the inner wall of the organ as evidence of ablation. Moreover, the neural network 802 may be trained to identify visual indicators that ablation should be avoided (e.g., scarring or polyps). The neural network 802 may then be used during subsequent ablation procedures to guide or assist the procedures by identifying regions of an organ that should be ablated (e.g., due to disease or illness), regions of an organ that have been ablated, or regions of an organ that should be avoided during ablation.
[0052] The computer system 106 receives the video 702 from the camera 112 in the tube 110. The video 702 may show the inner wall of the organ from the perspective of the front of the tube 110. The neural network 802 may analyze the video 702 to identify or detect visual indicators of illness or disease or visual indicators of ablation. In this manner, the neural network 802 may identify regions of the organ that have been treated by the ablation procedure, and regions of the organ that should be treated by the ablation procedure. For example, the neural network 802 may analyze the video 702 to identify evidence of ablation that presents as light-colored rings on the inner wall of the organ. The neural network 802 may also be trained to identify visual indicators of illness or disease. When the neural network 802 identifies, from the video 702, that a region has indicators of illness or disease and does not have the light-colored ring, the neural network 802 may identify the region as a treatment region 804.
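The decision rule of paragraph [0052] — treat a region that shows disease indicators but no light-colored ring — can be expressed directly. The boolean inputs below stand in for outputs of the neural network 802 and are assumptions for illustration only.

```python
# Illustrative sketch only: per-region decision rule layered on top of
# (assumed) neural-network indicator outputs.
def classify_region(has_disease_indicator, has_ablation_ring):
    """Return 'treated' for a region showing the light-colored ring,
    'treat' for a diseased, unablated region, and 'skip' otherwise."""
    if has_ablation_ring:
        return "treated"
    return "treat" if has_disease_indicator else "skip"
```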
[0053] In some embodiments, the computer system 106 may indicate the treatment region 804 on the display 116 or 118. For example, if the display 116 or the display 118 are presenting the video 702 from the camera 112, the computer system 106 may mark the treatment region 804 on the displayed video 702. The computer system 106 may present an indicator, such as a colored box or a colored dot, over the treatment region 804 in the video 702. As another example, the computer system 106 may use
a semantic image segmentation process to assign labels to the pixels in the video 702 based on what those pixels are showing. If the computer system 106 determines that a pixel in the video 702 is showing a treatment region 804, then the computer system 106 may assign a first color or shade to that pixel. Otherwise, the computer system 106 may assign a second color or shade to that pixel. The different colored or shaded pixels may guide the operator of the system 100 to ablate the treatment regions 804.
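A minimal sketch of the two-color pixel labeling described above, assuming the segmentation output is a boolean mask and the colors are arbitrary display choices rather than values from the disclosure:

```python
# Illustrative sketch only: map a boolean segmentation mask to display
# colors, one color for treatment-region pixels and one for everything else.
def colorize_mask(mask, treat_color=(255, 0, 0), other_color=(64, 64, 64)):
    """mask: 2-D list of booleans, True where a pixel shows a treatment
    region 804. Returns a 2-D list of RGB tuples for a display overlay."""
    return [[treat_color if px else other_color for px in row]
            for row in mask]
```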
[0054] The computer system 106 may then analyze the position and/or orientation 714 of the tube 110 or the catheter 114 determined using the SLAM algorithm to determine whether the tube 110 or the catheter 114 is positioned by or near the treatment region 804. Specifically, the computer system 106 may determine from the position and/or orientation 714 whether the catheter 114 is properly positioned to ablate the treatment region 804. If the catheter 114 is not properly positioned, the computer system 106 may cause the actuator box 108 to move the tube 110 and/or the catheter 114 into position. The computer system 106 may also present visual, audio, or tactile instructions to the operator of the system 100 on how and where to move the tube 110 or the catheter 114.
[0055] When the computer system 106 determines that the catheter 114 is properly positioned to ablate the treatment region 804, the computer system 106 may communicate an inflation signal 806 (e.g., to the surgery cart 102). The inflation signal 806 may cause the ablation tool 402 on the catheter 114 to inflate and expand to contact the inner wall of the organ. For example, the inflation signal 806 may cause the inflatable surface 502 of the ablation tool 402 to expand, which brings the cathode 504 and anode 506 into contact with the inner wall of the organ. An electric signal may then be applied to the inner wall to ablate the treatment region 804.
[0056] Figure 9 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 9 shows the computer system 106 determining when to ablate a treatment region 804.
[0057] The computer system 106 may receive a volume measurement 902 and/or a pressure measurement 904. For example, the computer system 106 may receive the volume measurement 902 from the volume sensor 312 on the tube 110 and the pressure measurement 904 from the pressure sensor 314 on the tube 110. The
volume measurement 902 may indicate a volume of the ablation tool 402 during inflation. The pressure measurement 904 may indicate a pressure experienced by the ablation tool 402 when inflated. For example, the pressure measurement 904 may indicate a pressure exerted by the inner wall of the organ on the ablation tool 402 when the ablation tool 402 is inflated.
[0058] The computer system 106 may analyze the video 702, the volume measurement 902, and/or the pressure measurement 904 to determine a contact region 906 on the inner wall of the organ. The contact region 906 may be the region of the inner wall of the organ that is contacted by the ablation tool 402 when the ablation tool 402 is inflated.
[0059] The computer system 106 may then determine whether the contact region 906 is sufficient for treating the treatment region 804. The video 702 may provide a visual indication of the size and extent of the contact region 906. The volume measurement 902 and/or the pressure measurement 904 may indicate whether the ablation tool 402 may be further inflated. As an example, the computer system 106 may use the neural network 802 to analyze the video 702 to determine if the video 702 shows that the contact region 906 is too small (e.g., indicating that the ablation tool 402 has not made sufficient contact with the treatment region 804). If the contact region 906 is too small, then the ablation may be underdosed and, as a result, ineffective. The computer system 106 may then consider the volume measurement 902 and/or the pressure measurement 904 to determine whether the ablation tool 402 can be further inflated to increase the contact region 906. For example, if the volume measurement 902 and/or the pressure measurement 904 fall below thresholds, then the computer system 106 may determine that the ablation tool 402 may be further inflated. Otherwise, the computer system 106 may determine that the ablation tool 402 cannot be further inflated. The computer system 106 may then communicate the inflation signal 806 to further inflate the ablation tool 402 to increase the contact region 906.
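The inflation decision of paragraph [0059] reduces to a comparison against volume and pressure thresholds. The numeric limits below are placeholders, not disclosed operating values, and the function names are hypothetical:

```python
# Illustrative sketch only: decide whether to inflate further, ablate,
# or hold, based on contact size and (assumed) sensor thresholds.
def may_inflate_further(volume, pressure, max_volume, max_pressure):
    # Only inflate further while both readings stay below their limits.
    return volume < max_volume and pressure < max_pressure

def next_inflation_action(contact_too_small, volume, pressure,
                          max_volume=30.0, max_pressure=80.0):
    """Placeholder limits; units and values are illustrative."""
    if not contact_too_small:
        return "ablate"
    if may_inflate_further(volume, pressure, max_volume, max_pressure):
        return "inflate"
    return "hold"  # cannot safely enlarge the contact region
```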
[0060] When the computer system 106 determines that the contact region 906 is the proper size and would provide a proper dosage during the ablation procedure, the computer system 106 communicates an ablation signal 908 (e.g., to the surgery cart 102). The ablation signal 908 may cause the ablation tool 402 to perform ablation.
For example, the ablation signal 908 may cause the ablation tool 402 to send an electric current through the contact region 906 to ablate the contact region 906. As another example, the ablation signal 908 may cause the ablation tool 402 to send hot fluid to flow over or under the contact region 906 to ablate the contact region 906.
[0061] In some embodiments, the computer system 106 may adjust the ablation procedure during the ablation step. The neural network 802 may continue analyzing the video 702 during ablation to determine how the ablation is progressing. For example, the neural network 802 may analyze the video 702 to detect visual indicators of products of electrolysis or light-colored rings on the contact region 906. If the neural network 802 does not detect sufficient products of electrolysis, the computer system 106 may increase a magnitude of the electric current applied to the contact region 906 (e.g., to increase dosage). If the neural network 802 detects a large amount of products of electrolysis, the computer system 106 may decrease the magnitude of the electric current applied to the contact region 906 (e.g., to reduce dosage).
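The closed-loop current adjustment of paragraph [0061] can be sketched as a simple threshold controller. The score bands, step size, and current limits are illustrative assumptions; the disclosure specifies no numeric values:

```python
# Illustrative sketch only: adjust current magnitude from an (assumed)
# neural-network score of visible electrolysis products in [0, 1].
def adjust_current(current_ma, electrolysis_score,
                   low=0.3, high=0.7, step_ma=5.0,
                   min_ma=0.0, max_ma=50.0):
    """All numeric values are placeholders, not disclosed parameters."""
    if electrolysis_score < low:
        current_ma += step_ma   # too little evidence: increase dosage
    elif electrolysis_score > high:
        current_ma -= step_ma   # too much evidence: reduce dosage
    # Clamp to the tool's (assumed) safe operating range.
    return min(max(current_ma, min_ma), max_ma)
```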
[0062] In some embodiments, the neural network 802 may be trained to predict or determine an amount of ablation that has occurred based on the visual indicators of ablation that appear in the video 702. The computer system 106 may determine whether the determined amount of ablation falls within a target range (e.g., a target dosing range). If the amount of ablation falls within the target range, the computer system 106 may present a message (e.g., on the display 116 or the display 118) indicating that ablation should be stopped. The operator of the system 100 may then stop ablation of the treatment region. If the amount of ablation falls beneath the target range, the computer system 106 may present no message or may present a message indicating that ablation should continue. The operator of the system 100 may then continue ablation of the treatment region.
[0063] The computer system 106 may also adjust the target range for different regions of the organ. The neural network 802 may be trained to detect visual indicators that affect dosage (e.g., scarring, polyps, folds, turns, or other surface features). When the neural network 802 detects these features in a region, the computer system 106 may adjust the target range for that region. When that region is being ablated, the computer system 106 may present instructions or messages that guide the ablation according to the adjusted target range.
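Paragraphs [0062] and [0063] describe comparing an estimated ablation amount with a target dosing range that is adjusted for detected surface features. A hedged sketch follows; the target range, the feature names, and their offsets are placeholders invented for illustration:

```python
# Illustrative sketch only: message selection from an (assumed) estimated
# ablation amount and a per-region target range shifted by surface features.
def ablation_message(amount, target=(0.4, 0.6), features=()):
    """amount: normalized ablation estimate in [0, 1]. Offsets below are
    hypothetical adjustments for detected surface features."""
    offsets = {"scar": -0.1, "fold": 0.05}
    shift = sum(offsets.get(f, 0.0) for f in features)
    lo, hi = target[0] + shift, target[1] + shift
    if amount < lo:
        return "continue ablation"
    if amount <= hi:
        return "stop ablation"
    return "stop ablation: target exceeded"
```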
[0064] Figure 10 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 10 shows the computer system 106 identifying subsequent treatment regions in the organ.
[0065] The computer system 106 continues to receive the video 702 from the camera 112 in the tube 110 during the ablation procedure. The video 702 may indicate regions of the organ that have been treated and regions of the organ that have not been treated. The computer system 106 may use the neural network 802 to continue analyzing the video 702 to identify or detect evidence of ablation 1002. As discussed previously, if an electric current is used during the ablation procedure, then the evidence of ablation 1002 may include products of electrolysis that present as light-colored rings on the inner wall of the organ. The neural network 802 may analyze the video 702 to identify these light-colored rings as indicators of a region of the organ having been treated by ablation.
[0066] If the neural network 802 does not detect sufficient evidence of ablation 1002 in a region of the organ (e.g., a contact region 906 where ablation was performed), then the computer system 106 may communicate the ablation signal 908 (e.g., to the surgery cart 102) so that ablation may be performed again on the contact region 906. If the neural network 802 detects sufficient evidence of ablation 1002, then the computer system 106 may use the neural network 802 to analyze the video 702 to identify another treatment region 1004. For example, the neural network 802 may analyze the video 702 to detect a region of the organ that does not present evidence of ablation 1002. The neural network 802 may even determine that this region presents visual indicators of illness or disease. The neural network 802 may identify this region as the next treatment region 1004. The computer system 106 may then cause or instruct the tube 110 and catheter 114 to move to the next treatment region 1004. The computer system 106 may track the movement of the tube 110 or the catheter 114 to detect when the tube 110 or the catheter 114 has been positioned by or near the treatment region 1004. When the tube 110 or the catheter 114 is properly positioned by the treatment region 1004, the computer system 106 may inflate the catheter 114 and perform ablation at the treatment region 1004. In this manner, the computer system 106 uses the neural network 802 to monitor the stages of the ablation procedure.
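The monitoring loop of paragraph [0066] — repeat ablation when evidence is insufficient, otherwise advance to the next untreated region — can be sketched as a small state transition. The confidence score and threshold are assumptions for illustration:

```python
# Illustrative sketch only: choose the next procedural step from an
# (assumed) evidence-of-ablation score and a queue of treatment regions.
def next_step(evidence_score, remaining_regions, threshold=0.8):
    """threshold is a placeholder confidence cutoff, not a disclosed
    value. Returns (action, target_region_or_None)."""
    if evidence_score < threshold:
        return ("re-ablate", None)          # insufficient evidence 1002
    if remaining_regions:
        return ("move", remaining_regions[0])  # next treatment region 1004
    return ("done", None)
```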
[0067] Figure 11 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 11 shows the computer system 106 being used to analyze the results of an ablation procedure. For example, after an ablation procedure has been performed and a healing period has elapsed, the computer system 106 may analyze the effectiveness of the ablation procedure.
[0068] The computer system 106 receives a video 1102. The video 1102 may be captured by the camera 112 in the tube 110. The tube 110 may have been inserted back into the organ after the healing period has elapsed. The video 1102 may show the inner wall of the organ.
[0069] The computer system 106 may use the neural network 802 to analyze the video 1102 to identify regeneration 1104 on the inner wall of the organ. The neural network 802 may have been trained to identify evidence of regeneration 1104 on the inner walls of organs (e.g., by reviewing videos of organs that have been treated with ablation). The neural network 802 may analyze the video 1102 to identify or detect evidence of regeneration 1104. The computer system 106 may analyze the evidence of regeneration 1104 and make a treatment determination 1106. The treatment determination 1106 may indicate whether there is sufficient evidence of regeneration 1104. For example, if the mucosal lining on the inner wall of the organ does not present sufficient evidence of regeneration 1104 or if indicators of disease or illness continue to be present, then the computer system 106 may make a treatment determination 1106 that further ablation procedures are needed. On the other hand, if the mucosal lining of the organ indicates or presents sufficient evidence of regeneration 1104 and visual indicators of illness or disease are not detected, then the computer system 106 may make the treatment determination 1106 that further ablation procedures are not necessary. In this manner, the computer system 106 may be used to determine the effectiveness of an ablation procedure.
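The follow-up determination of paragraph [0069] is, at its core, a two-threshold decision on regeneration and disease evidence. The scores and thresholds below are hypothetical stand-ins for neural-network outputs:

```python
# Illustrative sketch only: treatment determination 1106 from (assumed)
# normalized regeneration and disease scores in [0, 1].
def treatment_determination(regen_score, disease_score,
                            regen_ok=0.7, disease_ok=0.2):
    """Thresholds are illustrative placeholders, not disclosed values."""
    if regen_score >= regen_ok and disease_score <= disease_ok:
        return "no further ablation needed"
    return "further ablation recommended"
```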
[0070] Figure 12 illustrates an example computer system 106 in the system 100 of Figure 1. Generally, Figure 12 shows the computer system 106 determining whether an ablation procedure may be needed.
[0071] The computer system 106 receives a video 1202. The video 1202 may be generated by the camera 112 in the tube 110. The tube 110 may have been inserted
into an organ during a routine scan or checkup. The video 1202 may show the inner wall of the organ.
[0072] The computer system 106 may use the neural network 802 to analyze the video 1202 to detect landmarks 1204. For example, the neural network 802 may be trained to identify visual indicators of various illnesses or diseases. The neural network 802 may determine whether the landmarks 1204 that appear in the video 1202 include these indicators.
[0073] The computer system 106 may make a treatment recommendation 1206 based on the identified landmarks 1204. For example, the computer system 106 may recommend ablation if the landmarks 1204 indicate illness or disease that may be treated by ablation. If the landmarks 1204 do not indicate illness or disease that may be treated by ablation, the computer system 106 may not recommend ablation. In this manner, the computer system 106 uses machine learning to determine when ablation may be helpful, and to recommend ablation in those instances.
[0074] Figure 13 is a flowchart of an example method 1300 performed in the system 100 of Figure 1. In particular embodiments, various components of the system 100 may perform the method 1300. By performing the method 1300, the system 100 uses machine learning to assist mucosal ablation.
[0075] In block 1302, the camera 112 in the tube 110 produces a video 702. The tube 110 and the camera 112 may have been inserted or positioned within an organ (e.g., the duodenum or the lungs). The video 702 may show the inner walls of the organ. In block 1304, the system 100 may position the catheter 114 in the organ. For example, the catheter 114 may be inserted through a channel in the tube 110 into the organ. The catheter 114 may emerge from the tube 110 and into the organ.
[0076] In block 1306, the computer system 106 generates the map 706 of the organ. For example, the computer system 106 may use a SLAM algorithm to analyze the video 702 and to identify landmarks 704 in the video 702. The landmarks 704 may include the inner wall of the organ along with other body parts. The computer system 106 may analyze the landmarks 704 to determine the shape of the organ and the positioning of the walls of the organ. The computer system 106 may then generate the map 706 of the inside of the organ using these landmarks 704.
[0077] In block 1308, the computer system 106 identifies a treatment region 804 in the organ. The computer system 106 may use the neural network 802 to analyze the video 702 to identify the treatment region 804. For example, the neural network 802 may be trained to identify or detect visual indicators of disease or illness. When the neural network 802 analyzes the video 702, the neural network 802 may identify these indicators of disease or illness in the video 702. The neural network 802 may identify the treatment region 804 that includes these indicators of disease or illness.
[0078] In block 1310, the computer system 106 determines the position 714 of the catheter 114 in the organ. For example, the computer system 106 may receive sensor measurements (e.g., the position measurement 708, the shape measurement 710, or the kinematic measurement 712) and determine the position 714 of the catheter 114 using the sensor measurements. The position 714 may be a position of the catheter 114 within the map 706 of the organ. Using the position 714, the computer system 106 may determine whether the catheter 114 is positioned by or near the treatment region 804. If the catheter 114 is not positioned near or by the treatment region 804, the computer system 106 may instruct or cause the catheter 114 to be moved closer to the treatment region 804.
[0079] In block 1312, the computer system 106 ablates the treatment region 804. Specifically, when the computer system 106 determines that the catheter 114 is positioned by or near the treatment region 804, the computer system 106 may begin ablation of the treatment region 804. The computer system 106 may communicate the inflation signal 806 that causes the ablation tool 402 on the catheter 114 to inflate so that the ablation tool 402 contacts the inner wall of the organ. The computer system 106 may then communicate the ablation signal 908, which causes the ablation tool 402 to ablate the treatment region 804. For example, the ablation signal 908 may cause the ablation tool 402 to apply an electric current to the treatment region 804. As another example, the ablation signal 908 may cause the ablation tool 402 to direct hot fluid to flow over or under the treatment region 804. In this manner, the computer system 106 uses machine learning to guide or assist the ablation procedure in the organ, which may result in more consistent ablation procedures and more consistent results for ablation procedures across different physicians.
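The method 1300 as a whole can be sketched as a control loop over blocks 1302 through 1312. Every callable parameter below is a hypothetical stand-in for a component described above (camera, SLAM mapping, neural network, actuator, ablation tool), not an interface from the disclosure:

```python
# Illustrative sketch only: method 1300 as a control loop over stubbed
# components. All callables and the tolerance are hypothetical.
def near(pose, region, tol=1.0):
    """Placeholder proximity test in map coordinates."""
    return sum((p - r) ** 2 for p, r in zip(pose, region)) ** 0.5 <= tol

def run_procedure(capture_frame, update_map, find_treatment_region,
                  locate_catheter, move_toward, inflate, ablate,
                  max_steps=100):
    region = None
    for _ in range(max_steps):
        frame = capture_frame()                          # block 1302
        world_map = update_map(frame)                    # block 1306
        if region is None:
            region = find_treatment_region(frame)        # block 1308
        if region is None:
            return "no treatment region found"
        pose = locate_catheter(world_map)                # block 1310
        if not near(pose, region):
            move_toward(region)                          # reposition catheter
            continue
        inflate()                                        # block 1312: contact wall
        ablate(region)                                   # block 1312: ablate
        return "ablated"
    return "timed out"
```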
[0080] In summary, the system 100 uses machine learning to guide or assist mucosal ablation procedures (e.g., in the duodenum, lungs, esophagus, stomach, intestines, or colon). Generally, the system 100 uses machine learning in two different ways. First, the system 100 uses machine learning (e.g., a SLAM algorithm) to map the inside of an organ and to detect the position of a surgical tool (e.g., the catheter 114 or the ablation tool 402) inside the organ. Second, the system 100 uses machine learning to detect treatment regions 804 inside the organ. For example, the system 100 may use the neural network 802 to analyze a video 702 of the inside of the organ. The neural network 802 may detect regions of the organ where mucosal ablation has been performed and regions of the organ where mucosal ablation is needed. The system 100 may then guide the surgical tool to a region where mucosal ablation has not been performed so that the region may be ablated.
[0081] This description and the accompanying drawings that illustrate aspects, embodiments, or modules should not be taken as limiting. Various mechanical, compositional, structural, electrical, and operational changes may be made without departing from the spirit and scope of this description and the claims. In some instances, well-known circuits, structures, or techniques have not been shown or described in detail in order not to obscure other features. Like numbers in two or more figures represent the same or similar elements.
[0082] In this description, specific details are set forth describing some embodiments consistent with the present disclosure. Numerous specific details are set forth in order to provide a thorough understanding of the embodiments. It will be apparent, however, to one skilled in the art that some embodiments may be practiced without some or all of these specific details. The specific embodiments disclosed herein are meant to be illustrative but not limiting. One skilled in the art may realize other elements that, although not specifically described here, are within the scope and the spirit of this disclosure. In addition, to avoid unnecessary repetition, one or more features shown and described in association with one embodiment may be incorporated into other embodiments unless specifically described otherwise or if the one or more features would make an embodiment non-functional.
[0084] Further, the terminology in this description is not intended to be limiting. For example, spatially relative terms, such as “beneath”, “below”, “lower”, “above”, “upper”,
“proximal”, “distal”, and the like may be used to describe one element’s or feature’s relationship to another element or feature as illustrated in the figures. These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of the elements or their operation in addition to the position and orientation shown in the figures. For example, if the content of one of the figures is turned over, elements described as “below” or “beneath” other elements or features would then be “above” or “over” the other elements or features. Thus, the exemplary term “below” can encompass both positions and orientations of above and below. A device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Likewise, descriptions of movement along and around various axes include various special element positions and orientations. In addition, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise. And, the terms “comprises”, “comprising”, “includes”, and the like specify the presence of stated features, steps, operations, elements, and/or components but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups. Components described as coupled may be electrically or mechanically directly coupled, or they may be indirectly coupled via one or more intermediate components.
[0084] Elements described in detail with reference to one embodiment, or module may, whenever practical, be included in other embodiments, or modules in which they are not specifically shown or described. For example, if an element is described in detail with reference to one embodiment and is not described with reference to a second embodiment, the element may nevertheless be claimed as included in the second embodiment. Thus, to avoid unnecessary repetition in the following description, one or more elements shown and described in association with one embodiment, or application may be incorporated into other embodiments, or aspects unless specifically described otherwise, unless the one or more elements would make an embodiment or embodiments non-functional, or unless two or more of the elements provide conflicting functions.
[0085] In some instances, well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0086] This disclosure describes various devices, elements, and portions of computer-assisted devices and elements in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an element or a portion of an element in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian x-, y-, and z-coordinates). As used herein, the term “orientation” refers to the rotational placement of an element or a portion of an element (three degrees of rotational freedom, e.g., roll, pitch, and yaw). As used herein, the term “shape” refers to a set of positions or orientations measured along an element. As used herein, and for a device with repositionable arms, the term “proximal” refers to a direction toward the base of the computer-assisted device along its kinematic chain and “distal” refers to a direction away from the base along the kinematic chain.
[0087] Aspects of this disclosure are described in reference to computer-assisted systems and devices, which may include systems and devices that are teleoperated, remote-controlled, autonomous, semiautonomous, robotic, and/or the like. Further, aspects of this disclosure are described in terms of an embodiment using a medical system, such as the DA VINCI SURGICAL SYSTEM or ION SYSTEM commercialized by Intuitive Surgical, Inc. of Sunnyvale, California. Knowledgeable persons will understand, however, that aspects disclosed herein may be embodied and implemented in various ways, including robotic and, if applicable, non-robotic embodiments. Techniques described with reference to surgical instruments and surgical methods may be used in other contexts. Thus, the instruments, systems, and methods described herein may be used for humans, animals, portions of human or animal anatomy, industrial systems, general robotic, or teleoperational systems. As further examples, the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, sensing or manipulating non-tissue work pieces, cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, setting up or taking down systems, training medical or non-medical personnel, and/or the like. Additional example applications include use for procedures on tissue removed from human or
animal anatomies (with or without return to a human or animal anatomy) and for procedures on human or animal cadavers. Further, these techniques can also be used for medical treatment or diagnosis procedures that include, or do not include, surgical aspects.
[0088] Although illustrative embodiments have been shown and described, a wide range of modification, change, and substitution is contemplated in the foregoing disclosure and, in some instances, some features of the embodiments may be employed without a corresponding use of other features. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. Thus, the scope of the disclosure should be limited only by the following claims, and it is appropriate that the claims be construed broadly and in a manner consistent with the scope of the embodiments disclosed herein.
Claims
1. A system for mucosal ablation, the system comprising: a camera arranged to produce a video of a mucosa of an organ; a catheter arranged to be positioned in the organ; a memory; and a processor communicatively coupled to the memory, the processor configured to: generate, using the video from the camera, a map of the organ; identify, using a neural network and the video from the camera, a first portion of the mucosa to be ablated; determine, using the map and motion of the catheter, that the catheter is positioned by the first portion of the mucosa; and in response to determining that the catheter is positioned by the first portion of the mucosa, ablate, using the catheter, the first portion of the mucosa.
2. The system of Claim 1, wherein the organ is a duodenum.
3. The system of Claim 1, wherein the organ is a lung.
4. The system of Claim 1, wherein the organ is an esophagus, a stomach, or an intestine.
5. The system of Claim 1, wherein ablating the first portion of the mucosa comprises: inflating a portion of the catheter; and applying an electric current through the portion of the catheter and to the first portion of the mucosa to electroablate the first portion of the mucosa.
6. The system of Claim 5, wherein the processor is further configured to cause the portion of the catheter to further inflate in response to determining, using the video from the camera, that a contact region of the catheter on the first portion of the mucosa would underdose the ablation of the first portion of the mucosa.
7. The system of Claim 6, further comprising at least one of a volume sensor or a pressure sensor, wherein determining that the contact region would underdose the ablation is further based on at least one of a volume measurement from the volume sensor or a pressure measurement from the pressure sensor.
8. The system of any of Claims 5 through 7, wherein the processor is further configured to determine, using the neural network, that ablation of the first portion of the mucosa is complete based on a visual indicator of a product of electrolysis in the video from the camera.
9. The system of any of Claims 5 through 8, wherein the catheter comprises a cathode and an anode, and wherein inflating the portion of the catheter causes the cathode and the anode to contact the first portion of the mucosa.
10. The system of Claim 9, wherein the electric current is applied between the cathode and the anode.
11. The system of any of Claims 5 through 10, wherein the processor is further configured to adjust a magnitude of the electric current based on the video from the camera.
12. The system of any of Claims 1 through 11, wherein the processor is further configured to: determine, using the neural network and the video from the camera, a second portion of the mucosa to ablate after ablating the first portion of the mucosa; and determine, using the map and motion of the catheter, that the catheter is positioned by the second portion of the mucosa.
13. The system of any of Claims 1 through 12, wherein the processor is further configured to mark, on a display presenting the video from the camera, the first portion of the mucosa.
14. The system of any of Claims 1 through 13, further comprising a kinematic sensor arranged to detect the motion of the catheter.
15. The system of any of Claims 1 through 14, wherein the processor is further configured to analyze, using the neural network, a regeneration of the first portion of the mucosa.
16. The system of Claim 15, wherein the processor is further configured to determine, based on the regeneration of the first portion of the mucosa, whether ablation of the first portion of the mucosa should be repeated.
17. The system of any of Claims 1 through 16, wherein the processor is further configured to identify, using the neural network, a mucosa of a second organ that should be ablated.
18. The system of any of Claims 1 through 17, wherein the processor is further configured to present, on a display, the map of the organ and a position of the catheter in the map.
19. A method for mucosal ablation, the method comprising: producing, using a camera, a video of a mucosa of an organ; positioning a catheter in the organ; generating, using the video from the camera, a map of the organ; identifying, using a neural network and the video from the camera, a first portion of the mucosa to be ablated; determining, using the map and motion of the catheter, that the catheter is positioned by the first portion of the mucosa; and in response to determining that the catheter is positioned by the first portion of the mucosa, ablating, using the catheter, the first portion of the mucosa.
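The positioning-and-ablation sequence recited in Claim 19 can be sketched as a simple control loop. This is an illustrative simulation only, not the application's implementation: the organ map is reduced to a 1-D index along the organ, the "neural network" is a stand-in function that scores video frames, and ablation is recorded rather than performed; all names and the frame-scoring rule are hypothetical.

```python
def identify_target(frames):
    """Stand-in for the neural network: pick the frame index with the
    highest (hypothetical) score as the first portion to ablate."""
    return max(range(len(frames)), key=lambda i: frames[i])

def ablation_loop(frames, start_pos):
    """Claim-19 sketch: map from video, NN target, position, ablate."""
    organ_map = list(range(len(frames)))      # map generated from the video
    target = identify_target(frames)          # first portion of the mucosa
    pos, ablated = start_pos, []
    while not ablated:
        if pos == target:                     # catheter positioned by target
            ablated.append(target)            # ablate using the catheter
        else:
            pos += 1 if pos < target else -1  # track motion of the catheter
    return ablated

print(ablation_loop([0.1, 0.9, 0.3, 0.2], start_pos=3))  # → [1]
```

The loop terminates once the tracked position coincides with the network's target, mirroring the claim's "in response to determining that the catheter is positioned by the first portion" condition.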
20. The method of Claim 19, wherein the organ is a duodenum.
21. The method of Claim 19, wherein the organ is a lung.
22. The method of Claim 19, wherein the organ is an esophagus, a stomach, or an intestine.
23. The method of Claim 19, wherein ablating the first portion of the mucosa comprises: inflating a portion of the catheter; and
applying an electric current through the portion of the catheter and to the first portion of the mucosa.
24. The method of Claim 23, further comprising causing the portion of the catheter to further inflate in response to determining, using the video from the camera, that a contact region of the catheter on the first portion of the mucosa would underdose the ablation of the first portion of the mucosa.
25. The method of Claim 24, wherein determining that the contact region would underdose the ablation is further based on at least one of a volume measurement from a volume sensor or a pressure measurement from a pressure sensor.
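The inflate-on-underdose decision of Claims 24-25 can be illustrated as a gated check: a video-derived contact-region estimate flags underdosing, while volume and pressure readings bound further inflation. The area comparison, sensor limits, and every parameter name here are assumptions for illustration, not values from the application.

```python
def should_inflate_more(contact_area_mm2, target_area_mm2,
                        volume_ml=None, max_volume_ml=30.0,
                        pressure_kpa=None, max_pressure_kpa=8.0):
    """Return True if the balloon portion should be further inflated.

    contact_area_mm2 is a (hypothetical) video-based estimate of the
    contact region; target_area_mm2 is the region needed for full dose.
    """
    # Sensor measurements gate further inflation (Claim 25).
    if volume_ml is not None and volume_ml >= max_volume_ml:
        return False
    if pressure_kpa is not None and pressure_kpa >= max_pressure_kpa:
        return False
    # Video-based underdose determination (Claim 24).
    return contact_area_mm2 < target_area_mm2

print(should_inflate_more(40.0, 55.0, volume_ml=12.0, pressure_kpa=5.0))  # → True
print(should_inflate_more(40.0, 55.0, pressure_kpa=9.0))                  # → False
```

The sensor checks run first so that a saturated balloon is never inflated further, even when the video still indicates an underdosed contact region.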
26. The method of any of Claims 23 through 25, further comprising determining, using the neural network, that ablation of the first portion of the mucosa is complete based on a visual indicator of a product of electrolysis in the video from the camera.
27. The method of any of Claims 23 through 26, wherein the catheter comprises a cathode and an anode, and wherein inflating the portion of the catheter causes the cathode and the anode to contact the first portion of the mucosa.
28. The method of Claim 27, wherein the electric current is applied between the cathode and the anode.
29. The method of any of Claims 23 through 28, further comprising adjusting a magnitude of the electric current based on the video from the camera.
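Claims 26 and 29 can be read together as one video feedback step: a score for visual products of electrolysis (for example, gas bubbles at the electrodes) both signals completion and throttles the electric current. The score scale, setpoint, gain, and current limits below are invented for the sketch and are not drawn from the application.

```python
def adjust_current(current_ma, bubble_score, setpoint=0.5,
                   gain=10.0, done_score=0.9, limits=(0.0, 50.0)):
    """Return (new_current_ma, ablation_complete).

    bubble_score is a hypothetical 0..1 video-derived measure of
    electrolysis products in the camera frames.
    """
    if bubble_score >= done_score:        # visual indicator: ablation complete
        return 0.0, True
    # Proportional adjustment of current magnitude based on the video.
    new_current = current_ma + gain * (setpoint - bubble_score)
    lo, hi = limits
    return min(max(new_current, lo), hi), False

print(adjust_current(20.0, 0.3))   # → (22.0, False)
print(adjust_current(20.0, 0.95))  # → (0.0, True)
```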
30. The method of any of Claims 19 through 29, further comprising: determining, using the neural network and the video from the camera, a second portion of the mucosa to ablate after ablating the first portion of the mucosa; and determining, using the map and motion of the catheter, that the catheter is positioned by the second portion of the mucosa.
31. The method of any of Claims 19 through 30, further comprising marking, on a display presenting the video from the camera, the first portion of the mucosa.
32. The method of any of Claims 19 through 31, further comprising detecting, using a kinematic sensor, the motion of the catheter.
33. The method of any of Claims 19 through 32, further comprising analyzing, using the neural network, a regeneration of the first portion of the mucosa.
34. The method of Claim 33, further comprising determining, based on the regeneration of the first portion of the mucosa, whether ablation of the first portion of the mucosa should be repeated.
35. The method of any of Claims 19 through 34, further comprising identifying, using the neural network, a mucosa of a second organ that should be ablated.
36. The method of any of Claims 19 through 35, further comprising presenting, on a display, the map of the organ and a position of the catheter in the map.
37. A non-transitory machine-readable medium storing instructions for mucosal ablation that, when executed by a processor, cause the processor to perform the method of any of Claims 19 through 36.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363532592P | 2023-08-14 | 2023-08-14 | |
| US63/532,592 | 2023-08-14 | 2023-08-14 | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025038310A1 (en) | 2025-02-20 |
Family
ID=92503843
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/040823 (Pending) | Machine learning assisted mucosal ablation | 2023-08-14 | 2024-08-02 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025038310A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20200297415A1 (en) * | 2019-03-22 | 2020-09-24 | Boston Scientific Scimed Inc. | Automated electrode recommendation for ablation systems |
| US20210145510A1 (en) * | 2019-11-18 | 2021-05-20 | Nido Surgical Inc. | Instrument port for epicardial ablation with inflatable balloon |
| US20220005198A1 (en) * | 2020-07-06 | 2022-01-06 | Biosense Webster (Israel) Ltd. | Automatic contiguity estimation of wide area circumferential ablation points |
| US20220044787A1 (en) * | 2020-08-06 | 2022-02-10 | Biosense Webster (Israel) Ltd. | Apparatus for treating cardiac arrhythmias utilizing a machine learning algorithm to optimize an ablation index calculation |
| US20230119097A1 (en) * | 2021-10-20 | 2023-04-20 | Olympus Corporation | Endoluminal transhepatic access procedure |
| US20230157783A1 (en) * | 2020-02-21 | 2023-05-25 | Intuitive Surgical Operations, Inc. | Systems and methods for delivering targeted therapy |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24759311; Country of ref document: EP; Kind code of ref document: A1 |