WO2025117336A1 - Steerable catheters and wire force differences - Google Patents
Steerable catheters and wire force differences
- Publication number
- WO2025117336A1 (PCT/US2024/056943)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- drive wires
- model
- bendable body
- catheter
- force
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J18/00—Arms
- B25J18/06—Arms flexible
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/71—Manipulators operated by drive cable mechanisms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/085—Force or torque sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/10—Programme-controlled manipulators characterised by positioning means for manipulator elements
- B25J9/104—Programme-controlled manipulators characterised by positioning means for manipulator elements with cables, chains or ribbons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2061—Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B2034/301—Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B2034/305—Details of wrist mechanisms at distal ends of robotic arms
- A61B2034/306—Wrists with multiple vertebrae
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40144—Force sensation feedback from slave
Definitions
- the present disclosure generally relates to imaging and/or medical devices and, more particularly, to an apparatus, method, and storage medium for making, and/or use with, a steerable medical device, system, or continuum robot, such as, but not limited to, endoscopes or catheters that have multiple bending sections. Such devices can estimate a distally exerted force by using wire force information, which is applicable to guiding interventional tools and instruments in medical procedures, including after or while implementing robotic control for all sections of a catheter or imaging device, apparatus, or system so that each section reaches or approaches the same, or approximately the same, state or states as a first section of the catheter or imaging device, apparatus, or system.
- the present disclosure generally relates to imaging and, more particularly, to bronchoscope(s), robotic bronchoscope(s), robot apparatus(es), method(s), and storage medium(s) that operate to image a target, object, or specimen (such as, but not limited to, a lung, a biological object or sample, tissue, etc.).
- One or more bronchoscopic, endoscopic, medical, camera, catheter, or imaging devices, systems, and methods, and/or storage mediums for use with same, are discussed herein.
- BACKGROUND Medical imaging equipment is used to diagnose and treat medical conditions. Endoscopy, bronchoscopy, catheterization, and other medical procedures facilitate the ability to look inside a body. During such a procedure, a flexible medical tool may be inserted into a patient’s body, and an instrument may be passed through the tool to examine or treat an area inside the body.
- a scope can be used with an imaging device that views and/or captures objects or areas. The imaging can be transmitted or transferred to a display for review or analysis by an operator, such as a physician, clinician, technician, medical practitioner or the like.
- the scope can be an endoscope, bronchoscope, or other type of scope.
- a bronchoscope is an endoscopic instrument to look or view inside, or image, the airways in a lung or lungs of a patient.
- the bronchoscope may be put in the nose or mouth and moved down the throat and windpipe, and into the airways, where views or imaging may be made of the bronchi, bronchioles, larynx, trachea, windpipe, or other areas.
- Catheters and other medical tools may be inserted through a tool channel in the bronchoscope to provide a pathway to a target area in the patient for diagnosis, planning, medical procedure(s), treatment, etc.
- Robotic bronchoscopes, robotic endoscopes, or other robotic imaging devices may be equipped with a tool channel or a camera and biopsy tools, and such devices (or users of such devices) may insert/retract the camera and biopsy tools to exchange such components.
- the robotic bronchoscopes, endoscopes, or other imaging devices may be used in association with a display system and a control system.
- An imaging device, such as a camera may be placed in the bronchoscope, the endoscope, or other imaging device/system to capture images inside the patient and to help control and move the bronchoscope, the endoscope, or the other type of imaging device, and a display or monitor may be used to view the captured images.
- An endoscopic camera that may be used for control may be positioned at a distal part of a catheter or probe (e.g., at a tip section).
- the display system may display, on the monitor, an image or images captured by the camera, and the display system may have a display coordinate used for displaying the captured image or images.
- the control system may control a moving direction of the tool channel or the camera. For example, the tool channel or the camera may be bent according to a control by the control system.
- the control system may have an operational controller (such as, but not limited to, a joystick, a gamepad, a controller, an input device, etc.), and physicians may rotate or otherwise move the camera, probe, catheter, etc. to control same.
- a continuum robot or snake may include a plurality of bending sections having a flexible structure, wherein the shape of the continuum robot is controlled by deforming the bending sections.
- U.S. Pat. Pub. No. 2020/0345305 discloses a catheter force control device with automated control of the catheter contact-force with a target tissue.
- the catheter contact force is measured by a force sensor located at the remote end of the catheter; a control signal then adjusts the position of a linear actuator to compensate for disturbances of the remote end’s contact with the target tissue.
- U.S. Pat. Pub. No. 2020/0015693 discloses an electrophysiologic catheter with a sensor for a force exerted on the catheter tip. The sensor includes a flexible circuit and can deform in a manner that improves the catheter’s functionality concerning force feedback and location feedback.
- U.S. Pat. Pub. No. 2013/0190726 discloses a guide wire and a catheter with a force sensor coupled between the actuator and the guide wire. The force sensor signal is sent to the catheter control system, which then controls the guide wire’s motion as a function of the force data.
- the related arts disclose technology to measure the force exerted on the catheter tip by using force sensors at the catheter tip.
- however, a force sensor at the catheter tip for any device or system would limit miniaturization of the catheter, since the tip has to accommodate an additional sensor.
- moreover, a force sensor is intrinsically difficult to miniaturize, since it must be large enough to deform measurably over the required measurement range.
- there is a need for devices, systems, methods, and/or storage mediums that address the above issues by providing rapid, accurate, cost-effective, and minimally invasive structure and manufacture/use techniques for structure of a catheter and/or a catheter tip with features that refine and advance bendable medical devices and that operate to estimate a distal exerted force or force difference.
- imaging devices, apparatuses, systems, methods, and/or storage mediums such as, but not limited to, using robotic and/or catheter features, by providing consistent manufacture/use techniques (and resulting structure from same) and by providing rapid, accurate, cost-effective, consistent, and minimally invasive structure and manufacture/use techniques for structure of a catheter and/or a catheter tip using one or more features that refine and advance bendable imaging and/or medical devices to estimate a distal exerted force or force difference by using wire force information.
- a robotic catheter and/or autonomous robot may be used in one or more embodiments of the present disclosure to address the above issues, and robotic catheter and/or autonomous robot apparatuses, systems, methods, storage mediums, and/or other related features may be used to increase maneuverability into an object, sample, or target (e.g., a patient, an organ of a patient, a tissue being imaged or evaluated, a lung, a lumen, a vessel, another part of a patient, etc.), while preserving visualization and catheter stability and while refining and advancing bendable medical devices that estimate a distally exerted force or force difference by using wire force information.
- apparatuses, systems, methods, and storage mediums for using a navigation and/or control method or methods (manual or automatic) in one or more apparatuses or systems (e.g., an imaging apparatus or system, a catheter, an autonomous catheter or robot, an endoscopic imaging device or system, etc.).
- One or more embodiments of an imaging and/or medical apparatus, device, or system of the present disclosure may include or have: a bendable body having a distal bending segment or section that is bendable by at least one drive wire; an actuator that operates to drive the at least one drive wire; a force sensor in or near the actuator that operates to measure a wire force on the at least one drive wire; a user interface; and one or more controllers or processors that operate to: control the actuator to manipulate the bendable body, and to send and/or receive information from the user interface.
- the one or more controllers or processors may operate to: compute a driving force on the at least one drive wire using a mathematical model for a manipulation of the bendable body in air (and/or in another medium or environment); compute an exerted force and/or a force difference between (a) the wire forces measured with the force sensor and (b) the driving forces computed with the mathematical model; and send a command to the actuator and/or to the user interface, wherein the command is a function of the exerted force and/or the force difference.
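The force-difference computation described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the function and variable names (e.g., `exerted_force_estimate`, `measured_wire_force`) are assumptions introduced here.

```python
# Hypothetical sketch of the force-difference step described above.
# The model predicts the driving force needed to bend the catheter in air;
# any excess measured at the proximal force sensor is attributed to a force
# exerted at the distal end (e.g., by tissue contact).

def exerted_force_estimate(measured_wire_force: float,
                           model_driving_force: float) -> float:
    """Force difference between the sensor reading and the model prediction."""
    return measured_wire_force - model_driving_force

# Example: the sensor reads 1.8 N while the model predicts 1.5 N is needed
# for the commanded bend in free space, so the remainder is attributed to
# distal contact.
diff = exerted_force_estimate(1.8, 1.5)
print(round(diff, 3))
```

The estimate can then be fed back to the actuator command or surfaced on the user interface, as the bullet above describes.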
- a robotic apparatus may include: a bendable body having a distal bending segment that is bendable by one or more drive wires; an actuator that operates to drive the one or more drive wires; a force sensor in or near, or in communication with, the actuator, the force sensor operating to measure, determine, or compute one or more respective wire forces exerted on the one or more drive wires; a user interface; and one or more controllers or processors that operate to: control the actuator to manipulate the bendable body; send and/or receive information from the user interface; compute one or more respective driving forces on the one or more drive wires using a mathematical model for the manipulation of the bendable body in air; compute a force difference between (a) the one or more respective wire forces measured with the force sensor and (b) the one or more respective driving forces computed with the mathematical model; and send a command to the actuator and/or to the user interface, wherein the command is a function of the force difference and/or of the exerted one or more wire forces.
- the one or more controllers or processors may further operate to send a protective action in a case where the force difference reaches a threshold value for the one or more drive wires.
- the distal bending segment may be bendable by a plurality of the one or more drive wires, and the one or more controllers or processors may further operate to send a command to the actuator to take a protective action in a case where the force difference reaches a threshold value for at least one of the plurality of the one or more drive wires.
- the protective action may be: a command to the actuator to make the distal bending segment more flexible or bend more or to make the distal bending segment change its path or to reduce a force on the one or more drive wires via actuation; a command to the actuator to stop the actuator; and/or an instruction to the user interface to issue a warning to the operator or to issue a warning on a display of the robotic apparatus or in communication with the robotic apparatus.
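The protective actions listed above could be dispatched on a threshold check such as the following sketch. The threshold value, the action names, and the two-tier escalation are illustrative assumptions, not values from the patent.

```python
# Hypothetical dispatch of the protective actions listed above.
# The threshold and the escalation tiers are made-up design values.

FORCE_DIFF_THRESHOLD = 0.5  # newtons; design-dependent, assumed here

def protective_action(force_diff: float) -> str:
    """Choose a response once the force difference reaches the threshold."""
    if abs(force_diff) < FORCE_DIFF_THRESHOLD:
        return "continue"       # normal operation
    if abs(force_diff) < 2 * FORCE_DIFF_THRESHOLD:
        return "relax_segment"  # command actuator to reduce wire force / bend
    return "stop_and_warn"      # stop the actuator and warn the operator

print(protective_action(0.2), protective_action(0.7), protective_action(1.4))
```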
- the actuator may include one or more encoders to detect a position information of the one or more drive wires.
- the bendable body may include a proximal bending segment, a middle bending segment, and a distal bending segment, where each of the proximal, middle, and distal bending segments operate to be bendable by at least two of the one or more drive wires, and one or more second actuators may operate to drive the proximal bending segment and one or more third actuators may operate to drive the middle bending segment.
- the mathematical model may operate to accept the position information of the one or more drive wires as input and output the one or more respective driving forces on the one or more drive wires.
- the one or more controllers or processors may further operate to, after sending the command to the actuator and/or to the user interface: receive the position information of the driving wires from the one or more encoders; and plug or input the position information of the one or more drive wires into the mathematical model used for the computation of the one or more respective driving forces.
- the mathematical model includes an algorithm generated by machine learning methods with a neural network, a support vector machine, or a random forest; (ii) in a case where the one or more controllers or processors train the mathematical model or train an Artificial Intelligence (AI) model using the mathematical model, the trained model is one or a combination of the following: a neural net model or neural network model, a deep convolutional neural network model, a recurrent neural network model with long short-term memory that can take temporal relationships across images or frames into account, a generative adversarial network (GAN) model, a consistent generative adversarial network (cGAN) model, a three cycle-consistent generative adversarial network (3cGAN) model, a model that can take temporal relationships across images or frames into account, a model that can take temporal relationships into account including tissue location(s) during pullback in a vessel and/or including tissue characterization data during pullback in a vessel, a model
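As a minimal stand-in for the learned mappings mentioned above, the sketch below fits a one-dimensional ordinary-least-squares line from wire position to driving force on made-up data; in practice a neural network, support vector machine, or random forest (as the patent lists) would replace this simple linear model. All data values here are fabricated for illustration only.

```python
# Stand-in for a machine-learned position-to-force mapping: fit a 1-D
# linear model (ordinary least squares) on synthetic training data.
positions = [0.0, 1.0, 2.0, 3.0, 4.0]  # encoder positions (mm), made up
forces = [0.1, 0.6, 1.1, 1.6, 2.1]     # driving forces (N), made up

n = len(positions)
mean_p = sum(positions) / n
mean_f = sum(forces) / n
slope = (sum((p - mean_p) * (f - mean_f) for p, f in zip(positions, forces))
         / sum((p - mean_p) ** 2 for p in positions))
intercept = mean_f - slope * mean_p

def predict_driving_force(position: float) -> float:
    """Predicted driving force for a given wire position."""
    return slope * position + intercept

print(round(predict_driving_force(2.5), 2))
```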
- the mathematical model may include a kinematic model of the distal bending segment of the bendable body.
- the mathematical model may further operate to map a wire position or a wire displacement, Pa, for a first drive wire of the one or more drive wires and a wire position or a wire displacement, Pb, for a second drive wire of the one or more drive wires to the respective one or more drive wire forces, Ta and Tb, respectively, using one or more of the equations discussed below, wherein s is a length of a bending section of the distal bending segment or of the distal bending segment, θ is a bending angle of the distal bending segment and/or the bending section of the distal bending segment, 1/k is a curvature radius and k is a curvature of the bending section and/or the distal bending segment, Kθ is a linear rotational spring constant, and d is an offset from a centroid of the bendable body or of the bending section.
- One or more embodiments may have one or more of the following occur or exist: (i) Kθ/(2d²) is a constant value determined by a design value, and/or the one or more controllers or processors further operate to estimate or determine the one or more forces exerted on the one or more drive wires throughout an operation of the bendable body, where the bendable body is used as part of a catheter; (ii) the one or more controllers or processors further operate to derive a mathematical model for multiple bending sections of the bendable body; (iii) in a case where the bendable body has a distal bending segment and a proximal bending segment, the distal and proximal bending segments include lengths s1 and s2, the linear rotational spring constants Kθ1, Kθ2, and the drive wires 160a1, 160b1 and the drive wires 160a2, 160b2, respectively; (iv) in a case where the bendable body has a distal bending segment and a proximal bending segment, the
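The patent's own equations are not reproduced in this excerpt, so the following is a generic constant-curvature, linear-rotational-spring sketch that is merely consistent with the symbols defined above (s, θ, k, Kθ, d, Pa, Pb); it is an assumption, not the patent's model.

```python
# Hedged sketch of a constant-curvature kinematic model: two antagonistic
# wires at offset d from the centroid, bending angle theta = k*s, wire
# displacements Pa = -d*theta and Pb = +d*theta, and a linear rotational
# spring K_theta resisting the bend. Torque balance (Ta - Tb)*d = K_theta*theta
# then yields a force difference proportional to K_theta/(2*d**2), the
# constant noted in the text. Numerical values below are illustrative.

def bending_angle(p_a: float, p_b: float, d: float) -> float:
    """Bending angle theta implied by antagonistic wire displacements."""
    return (p_b - p_a) / (2.0 * d)

def wire_force_difference(p_a: float, p_b: float,
                          k_theta: float, d: float) -> float:
    """Ta - Tb from the torque balance (Ta - Tb)*d = K_theta*theta."""
    return k_theta / (2.0 * d ** 2) * (p_b - p_a)

d = 0.002        # wire offset from centroid, m (illustrative)
k_theta = 0.01   # rotational spring constant, N*m/rad (illustrative)
p_a, p_b = -0.001, 0.001  # wire displacements, m (illustrative)

print(round(bending_angle(p_a, p_b, d), 3))                    # rad
print(round(wire_force_difference(p_a, p_b, k_theta, d), 3))   # N
```

Because Kθ/(2d²) collapses to a single design constant, the driving-force difference can be estimated from encoder positions alone, which is what lets the comparison against the sensor reading run throughout operation.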
- the one or more controllers or processors further operate to compare the respective force estimates to current force values and compute the force difference between the respective force estimates and the current force values; and/or (ii) the one or more controllers or processors further operate to: compare the respective force estimates to current force values, compute the force difference between the respective force estimates and the current force values, compare the force difference to a set or predetermined threshold value, and issue a command to: (a) stop or change a mode of the bendable body, of the actuator, of the one or more drive wires, and/or of a catheter including the bendable body in a case where the force difference equals or exceeds the set or predetermined threshold value, (b) change the bending or flexibility of the bendable body or of the catheter including the bendable body such that a magnitude of the current force values approaches or becomes zero, and/or (c) issue a warning to the user interface and change a state of the robotic apparatus to a different state corresponding with or responding to the warning.
- One or more embodiments may include or have one or more of the following: (i) the one or more drive wires operate to terminate at a distal end of the distal bending segment or at a distal end of one or more bending sections of the bendable body; (ii) the robotic apparatus further comprises a tool channel extending a length of the bendable body, the tool channel operating to include a medical tool and/or a camera that is detachably attached with or in the tool channel, and/or the bendable body further includes a tip or an atraumatic tip at a distal end of the bendable body; (iii) a sensor or a camera is disposed or attached to a distal end of the robotic apparatus; (iv) the one or more drive wires include multiple push and/or pull drive or driving wires; (v) the bendable body has multiple bending sections and one or more processors that operate to control or command the multiple bending sections of the bendable body using one or more of the following modes: a Follow the Leader (FTL) mode
- a method of controlling and/or using a robotic catheter/apparatus or imaging apparatus may include computing a force or force difference between a wire force measurement and a computed driving force, where the method may include using one or more technique(s) or feature(s) as discussed herein.
- a storage medium stores instructions or a program for causing one or more processors of an apparatus or system to perform a method of controlling and/or using a robotic catheter or imaging apparatus and/or to perform a method for computing a force or force difference between a wire force measurement and a computed driving force, where the method may include using one or more technique(s) or feature(s) as discussed herein.
- apparatuses and systems, and methods and storage mediums for performing navigation, movement, and/or control, and/or for controlling, manufacturing, or using a catheter and/or the one or more force and/or force difference technique(s) may operate to characterize biological objects, such as, but not limited to, blood, mucus, tissue, etc.
- One or more embodiments of the present disclosure may be used in clinical application(s), such as, but not limited to, intervascular imaging, intravascular imaging, bronchoscopy, atherosclerotic plaque assessment, cardiac stent evaluation, intracoronary imaging using blood clearing, balloon sinuplasty, sinus stenting, arthroscopy, ophthalmology, ear research, veterinary use and research, etc.
- one or more technique(s) discussed herein may be employed as or along with features to reduce the cost of at least one of manufacture and maintenance of the one or more apparatuses, devices, systems, and storage mediums by reducing or minimizing a number of optical and/or processing components and by virtue of the efficient techniques to cut down cost (e.g., physical labor, mental burden, fiscal cost, time and complexity, etc.) of use/manufacture of such apparatuses, devices, systems, and storage mediums.
- cut down cost e.g., physical labor, mental burden, fiscal cost, time and complexity, etc.
- explanatory embodiments may include several novel features, and a particular feature may not be essential to some embodiments of the devices, systems, and methods that are described herein.
- one or more additional devices, one or more systems, one or more methods, and one or more storage mediums using imaging, drive (push/pull) wire control/support, force exertion and/or force difference technique(s), and/or other technique(s) are discussed herein. Further features of the present disclosure will in part be understandable and will in part be apparent from the following description and with reference to the attached drawings.
- a different or similar reference number and/or character may be used for the same or similar feature(s) in one or more other embodiments, and/or the reference numeral(s) or character(s) may be altered or modified to indicate one or more secondary elements and/or references of a same or similar nature and/or kind (such as, but not limited to, using 168’ compared to 168).
- the one or more features shown in the figures are illustrative, and it is intended that changes and modifications may be made to the one or more embodiments without departing from the scope and spirit of the subject disclosure.
- To assist those of ordinary skill in the relevant art in making and using the subject matter hereof, reference is made to the appended drawings and figures, wherein:
- FIG. 1 illustrates at least one embodiment of an imaging, continuum robot/steerable catheter, or endoscopic apparatus or system in accordance with one or more aspects of the present disclosure
- FIG. 2 is a schematic diagram showing at least one embodiment of an imaging, steerable catheter, or continuum robot apparatus or system in accordance with one or more aspects of the present disclosure
- FIGS. 3A-3B illustrate at least one embodiment example of a steerable catheter, continuum robot, and/or medical device that may be used with one or more technique(s), including autonomous navigation features and/or technique(s) and/or force determination or force difference determination features and/or technique(s), in accordance with one or more aspects of the present disclosure
- FIG. 4 is a schematic diagram showing at least one embodiment of an imaging, continuum robot, steerable catheter, or endoscopic apparatus or system in accordance with one or more aspects of the present disclosure
- FIG. 5 is a schematic diagram showing at least one embodiment of a console or computer that may be used with one or more autonomous navigation and/or force or force difference determination technique(s) in accordance with one or more aspects of the present disclosure
- FIG. 6 is a flowchart of at least one embodiment of a method for planning an operation of at least one embodiment of a continuum robot or steerable catheter apparatus or system in accordance with one or more aspects of the present disclosure
- FIGS. 7A-7B are side views of a robotic catheter system in a straight and bent configuration, respectively, in accordance with one or more aspects of the present disclosure
- FIG. 8 is a map of drive wire position and force estimates in accordance with one or more aspects of the present disclosure
- FIG. 9 is a diagram of at least one embodiment of a robotic catheter system that may use or be used with one or more force determination and/or force difference determination techniques in accordance with one or more aspects of the present disclosure
- FIG. 10 is a diagram showing forces on a bending section of at least one embodiment of a robotic catheter system that may use or be used with one or more force determination and/or force difference determination techniques in accordance with one or more aspects of the present disclosure
- FIG. 11 is a diagram modeling forces with a linear rotational spring constant that may be used with one or more force determination and/or force difference determination techniques in accordance with one or more aspects of the present disclosure
- FIGS. 12A-12B are side views of one or more embodiments of a robotic catheter or continuum robot in accordance with one or more aspects of the present disclosure
- FIG. 13 illustrates a flowchart for at least one method embodiment for performing correction, adjustment, and/or smoothing for a catheter or probe of a continuum robot device or system that may be used with one or more force and/or force difference determination feature(s) and/or technique(s) in accordance with one or more aspects of the present disclosure
- FIG. 14 shows a schematic diagram of an embodiment of a computer or console that may be used with one or more embodiments of an apparatus or system, or one or more methods, discussed herein in accordance with one or more aspects of the present disclosure
- FIG. 15 shows a schematic diagram of at least an embodiment of a system using a computer or processor, a memory, a database, and input and output devices in accordance with one or more aspects of the present disclosure
- FIG. 16 shows a created architecture of or for a regression model(s) that may be used for autonomous navigation, movement detection, and/or control techniques and/or any other technique or feature discussed herein in accordance with one or more aspects of the present disclosure
- FIG. 17 shows a convolutional neural network architecture that may be used for autonomous navigation, movement detection, and/or control techniques and/or any other technique or feature discussed herein in accordance with one or more aspects of the present disclosure
- FIG. 18 shows a created architecture of or for a regression model(s) that may be used for autonomous navigation, movement detection, and/or control techniques and/or any other technique or feature discussed herein in accordance with one or more aspects of the present disclosure
- FIG. 19 is a schematic diagram of or for a segmentation model(s) that may be used for any feature(s) and/or technique(s) discussed herein in accordance with one or more aspects of the present disclosure.
- One or more embodiments of the present disclosure avoid the aforementioned issues by providing one or more simple, efficient, cost-effective, and innovative structures that may be used with catheter or probe control technique(s) (including, but not limited to, robotic control technique(s)) as discussed herein and/or force determination and/or force difference (e.g., drive wire force) determination feature(s) and/or technique(s) as discussed herein.
- the robotic control techniques may be used with a co-registration (e.g., computed tomography (CT) co-registration, cone-beam CT (CBCT) co-registration, etc.) to enhance a successful targeting rate for a predetermined sample, target, or object (e.g., a lung, a portion of a lung, a vessel, a nodule, an organ of a patient, a patient, tissue, etc.) by minimizing human error.
- CBCT may be used to locate a target, sample, or object (e.g., the lesion(s) or nodule(s) of a lung or airways, plaque or other tissue in one or more samples or in a patient(s), a set or predetermined target in tissue or in a patient, etc.) along with an imaging device (e.g., a steerable catheter, a continuum robot, etc.) and to co-register the target, sample, or object (e.g., the lesions or nodules, plaque or other tissue in one or more samples or in a patient(s), a set or predetermined target in tissue or in a patient, etc.) with the device shown in an image to achieve proper guidance.
- storage mediums for using a navigation and/or control method or methods (manual or automatic) and/or for using force determination and/or force difference (e.g., drive wire force) determination feature(s) and/or technique(s) in one or more apparatuses or systems (e.g., an imaging apparatus or system, an endoscopic imaging device or system, a bronchoscope, etc.).
- Details of the present disclosure may cover the one or more mechanisms or features of a continuum robot, such as, but not limited to, details of estimating a distal exerted force or a force difference by using wire force information, as well as the apparatuses/systems, methods/procedures/techniques, and/or other hardware/software (e.g., storage medium(s), processor(s), etc.) that may be used with the one or more continuum robots and the one or more support structures.
- a robotic catheter and/or autonomous robot may be used in one or more embodiments of the present disclosure to address the above issues, and robotic catheter and/or autonomous robot apparatuses, systems, methods, storage mediums, and/or other related features may be used to increase maneuverability into an object, sample, or target (e.g., a patient, an organ of a patient, a tissue being imaged or evaluated, a lung, a lumen, a vessel, another part of a patient, etc.), while preserving visualization and catheter stability and while refining and advancing bendable medical devices that estimate a distal exerted force or force difference by using wire force information.
- apparatuses, systems, methods, and storage mediums for using a navigation and/or control method or methods (manual or automatic) in one or more apparatuses or systems (e.g., an imaging apparatus or system, a catheter, an autonomous catheter or robot, an endoscopic imaging device or system, etc.).
- One or more embodiments of an imaging and/or medical apparatus, device, or system of the present disclosure may include or have: a bendable body having a distal bending segment or section that is bendable by at least one drive wire; an actuator that operates to drive the at least one drive wire; a force sensor in or near the actuator that operates to measure a wire force on the at least one drive wire; a user interface; and one or more controllers or processors that operate to: control the actuator to manipulate the bendable body, and to send information to and/or receive information from the user interface.
- the one or more controllers or processors may operate to: compute a driving force on the at least one drive wire using a mathematical model for a manipulation of the bendable body in air (and/or in another medium or environment); compute an exerted force and/or a force difference between (a) the wire forces measured with the force sensor and (b) the driving forces computed with the mathematical model; and send a command to the actuator and/or to the user interface, wherein the command is a function of the exerted force and/or the force difference.
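The exerted-force computation just described can be sketched as follows. This is a minimal illustration, not the disclosure's implementation: the per-wire difference between the measured wire forces and the driving forces a free-space (in-air) model predicts is taken as the contact residual, and a command is derived from it. The 0.5 N threshold and the command names are illustrative assumptions.

```python
def exerted_force_difference(measured, modeled):
    """Per-wire difference between forces measured by the force sensor
    and the driving forces a free-space (in-air) model predicts for the
    same wire commands; a large residual suggests distal contact."""
    return [m - p for m, p in zip(measured, modeled)]

def control_step(measured, modeled, threshold=0.5):
    """Map the force difference to a command for the actuator and/or the
    user interface. The 0.5 N threshold and the command names are
    hypothetical, not values taken from the disclosure."""
    diff = exerted_force_difference(measured, modeled)
    if max(abs(d) for d in diff) > threshold:
        return {"action": "warn_user", "force_difference": diff}
    return {"action": "continue", "force_difference": diff}
```

In this sketch a residual below the threshold lets the manipulation continue unchanged, while a larger residual triggers a user-facing command, consistent with the idea that the command is a function of the exerted force and/or the force difference.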
- One or more embodiments may have multiple push/pull drive wires as the one or more driving wires.
- the drive wires may terminate in a distal bending end of the shaft of the catheter or continuum robot.
- a storage medium stores instructions or a program for causing one or more processors of an apparatus or system to perform a method of controlling and/or using a robotic catheter or imaging apparatus and/or to perform a method for computing a force or force difference between a wire force measurement and a computed driving force, where the method may include using one or more technique(s) as discussed herein.
- a continuum robot for performing robotic control may include: one or more processors that operate to: instruct or command a first bending section or portion of a catheter or a probe of the continuum robot such that the first bending section or portion achieves, or is disposed at, a pose, position, or state at a position along a path, the catheter or probe of the continuum robot having a plurality of bending sections or portions and a base; instruct or command each of the other bending sections or portions of the plurality of bending sections or portions of the catheter or probe to match, substantially match, or approximately match the pose, position, or state of the first bending section or portion at the position along the path in a case where each section or portion reaches or approaches a same, similar, or approximately similar state or states at the position along the path; and instruct or command the plurality of bending sections or portions such that the first bending section or portion or a Tip or distal bending section or portion is located in a predetermined pose, position, or state at or
- a first bending section or portion or the Tip or distal bending section or portion may include a camera, an endoscopic camera, a sensor, or other imaging device or system to obtain one or more images of or in a target, sample, or object; and the one or more processors may further operate to command the camera, sensor, or other imaging device or system to obtain the one or more images of or in the target, sample, or object at the predetermined pose, position, or state, and the one or more processors operate to receive the one or more images and/or display the one or more images on a display.
- the method(s) may further include any of the features discussed herein that may be used in the one or more apparatuses of the present disclosure.
- the navigation and/or control may be employed so that an apparatus or system having multiple portions or sections (e.g., multiple bending portions or sections) operates to: (i) keep track of a path of a portion (e.g., a tip) or of each of the multiple portions or sections of an apparatus or system; (ii) have a state or states of each of the multiple portions or sections match a state or states of a first portion or section of the multiple portions or sections in a case where each portion or section reaches or approaches a same, similar, or approximately similar state (e.g., a position or other state(s) in a target, object, or specimen; a position or other state(s) in a patient; a target position or state(s) in an image or frame; a set or predetermined position or state(s) in an image or frame; a set or predetermined position or state(s) in an image or frame where the first portion or section reaches or approaches the set or predetermined position or state(s) at one point in time
- an orientation, pose, or state may include one or more degrees of freedom.
- two (2) degrees of freedom may be used, which may include an angle for a magnitude of bending and a plane for a direction of bending.
- matching state(s) may involve matching, duplicating, mimicking, or otherwise copying other characteristics, such as, but not limited to, vectors for each section or portion of the one or more sections or portions of a probe or catheter, for different portions or sections of the catheter or probe.
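As a concrete sketch of how a two-degree-of-freedom pose (an angle for the magnitude of bending and a plane for the direction of bending) can map to per-wire commands, the standard constant-curvature tendon model gives each wire a length change proportional to the bend angle and the cosine of the offset between the bending plane and that wire's angular position. The pitch radius (2 mm) and the three evenly spaced wires below are illustrative assumptions, not values from the disclosure.

```python
import math

def wire_displacements(theta, plane, r=0.002, n_wires=3):
    """Per-wire length changes for a 2-DOF bend under a standard
    constant-curvature tendon model: `theta` is the bend magnitude and
    `plane` the bend direction (both in radians). The pitch radius
    r = 2 mm and the evenly spaced wires are assumptions for this
    sketch. Negative values mean the wire is pulled (shortened)."""
    return [-r * theta * math.cos(plane - 2.0 * math.pi * i / n_wires)
            for i in range(n_wires)]
```

For three evenly spaced wires the displacements sum to approximately zero, which is why pulling one wire while paying out the others produces a pure bend of the section.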
- a transition or change from a base angle/plane to a target angle/plane may be set or predetermined using transition values (e.g., while not limited hereto, a base orientation or state may have a stage at 0 mm, an angle at 0 degrees, and a plane at 0 degrees, whereas a target orientation or state may have a stage at 20 mm, an angle at 90 degrees, and a plane at 180 degrees; the intermediate values for the stage, angle, and plane may be set depending on how many transition orientations or states are used).
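Using the example values above (stage 0 mm/angle 0°/plane 0° to stage 20 mm/angle 90°/plane 180°), one simple way to generate the intermediate orientations is linear interpolation over a chosen number of transition states. This is a sketch of one possible scheduling, not the only one the disclosure allows.

```python
def transition_states(base, target, steps):
    """Linearly interpolate (stage, angle, plane) tuples between a base
    and a target state, returning steps + 1 states including both
    endpoints. Linear spacing is an illustrative choice."""
    return [tuple(b + (t - b) * k / steps for b, t in zip(base, target))
            for k in range(steps + 1)]

# Endpoint values mirror the example in the text.
states = transition_states((0.0, 0.0, 0.0), (20.0, 90.0, 180.0), 4)
# The midpoint state is (10.0, 45.0, 90.0).
```

Increasing `steps` yields more intermediate orientations and therefore a smoother transition from the base angle/plane to the target angle/plane.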
- a continuum robot or steerable catheter may include one or more of the following: (i) a distal bending section or portion, wherein the distal bending section or portion is commanded or instructed automatically or based on an input of a user of the continuum robot or steerable catheter; (ii) a plurality of bending sections or portions including a distal or most distal bending portion or section and the rest of the plurality of the bending sections or portions; and/or (iii) the one or more processors further operate to instruct or command the forward motion, or the motion in the set or predetermined direction, of a motorized linear stage (or other structure used to map path or path-like information) and/or of the continuum robot or steerable catheter automatically and/or based on an input of a user of the continuum robot.
- a continuum robot or steerable catheter may further include: a base and an actuator that operates to bend the plurality of the bending sections or portions independently; and a motorized linear stage and/or a sensor that operates to move the continuum robot or steerable catheter forward and backward, and/or in the predetermined or set direction or directions, wherein the one or more processors operate to control the actuator and the motorized linear stage and/or the sensor.
- the plurality of bending sections or portions may each include driving wires that operate to bend a respective section or portion of the plurality of sections or portions, wherein the driving wires are connected to an actuator so that the actuator operates to bend one or more of the plurality of bending sections or portions using the driving wires.
- One or more embodiments may include a user interface of or disposed on a base, or disposed remotely from a base, the user interface operating to receive an input from a user of the continuum robot or steerable catheter to move one or more of the plurality of bending sections or portions and/or a motorized linear stage and/or a sensor, wherein the one or more processors further operate to receive the input from the user interface, and the one or more processors and/or the user interface operate to use a base coordinate system.
- One or more displays may be provided to display a path (e.g., a control path) of the continuum robot or steerable catheter.
- the continuum robot may further include an operational controller or joystick that operates to issue or input one or more commands or instructions as an input to one or more processors, the input including an instruction or command to move one or more of a plurality of bending sections or portions and/or a motorized linear stage and/or a sensor;
- the continuum robot may further include a display to display one or more images taken by the continuum robot; and/or
- the continuum robot may further include an operational controller or joystick that operates to issue or input one or more commands or instructions to one or more processors, the input including an instruction or command to move one or more of a plurality of bending sections or portions and/or a motorized linear stage and/or a sensor, and the operational controller or joystick operates to be controlled by a user of the continuum robot.
- the continuum robot or the steerable catheter may include a plurality of bending sections or portions and may include an endoscope camera, wherein one or more processors operate or further operate to receive one or more endoscopic images from the endoscope camera, and wherein the continuum robot further comprises a display that operates to display the one or more endoscopic images.
- Driving and/or control technique(s) may be employed to adjust, change, or control any state, pose, position, orientation, navigation, path, or other state type that may be used in one or more embodiments for a continuum robot or steerable catheter.
- Physicians or other users of the apparatus or system may experience reduced labor and/or mental burden when using the apparatus or system due to the navigation, control, and/or orientation (or pose, or position, etc.) feature(s) of the present disclosure.
- one or more features of the present disclosure may achieve a minimized or reduced interaction with anatomy (e.g., of a patient), object, or target (e.g., tissue, one or more lungs, one or more airways, etc.) during use, which may reduce the physical and/or mental burden on a patient or target.
- a labor of a user to control and/or navigate (e.g., rotate, translate, etc.) the imaging apparatus or system or a portion thereof (e.g., a catheter, a probe, a camera, one or more sections or portions of a catheter, probe, camera, etc.) is saved or reduced via use of the navigation, control, and/or force estimation/determination or force difference determination technique(s) of the present disclosure.
- an imaging device or system, or a portion of the imaging device or system may include multiple sections or portions, and the multiple sections or portions may be multiple bending sections or portions.
- the imaging device or system may include manual and/or automatic navigation and/or control features.
- a user of the imaging device or system may control each section or portion, and/or the imaging device or system (or steerable catheter, continuum robot, etc.) may operate to automatically control (e.g., robotically control) each section or portion, such as, but not limited to, via one or more navigation, movement, and/or control techniques of the present disclosure.
- Navigation, control, and/or orientation feature(s) may include, but are not limited to, implementing mapping of a pose (angle value(s), plane value(s), etc.) of a first portion or section (e.g., a tip portion or section, a distal portion or section, a predetermined or set portion or section, a user selected or defined portion or section, etc.) to a stage position/state (or a position/state of another structure being used to map path or path-like information), controlling angular position(s) of one or more of the multiple portions or sections, controlling rotational orientation or position(s) of one or more of the multiple portions or sections, controlling (manually or automatically (e.g., robotically)) one or more other portions or sections of the imaging device or system (e.g., continuum robot, steerable catheter, etc.) to match or substantially or approximately match (or be close to or similar to) the navigation/orientation/position/pose of the first portion or section in a case where the one or more other portions or
- an imaging device or system may enter a target along a path where a first section or portion of the imaging device or system (or portion of the device or system) is used to set the navigation, control, or state path and state(s)/position(s), and each subsequent section or portion of the imaging device or system (or portion of the device or system) is controlled to follow the first section or portion such that each subsequent section or portion matches (or is similar to, approximate to, substantially matching, etc.) the orientation, position, state, etc. of the first section or portion at each location along the path.
- each section or portion of the imaging device or system is controlled to match (or be similar to, be approximate to, be substantially matching, etc.) the prior orientation, position, state, etc. (for each section or portion) for each of the locations along the path.
- each section or portion of the device or system may follow a leader (or more than one leader) or may use one or more RFTL and/or FTL technique(s) discussed herein.
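The follow-the-leader (FTL) behavior described above can be sketched as a simple scheduling rule in which each trailing section replays the pose the leader had when it occupied the same position along the path. Lagging each section by a fixed number of insertion steps is an illustrative simplification; a real system would index recorded poses by stage position.

```python
def follow_the_leader(leader_poses, n_sections):
    """Illustrative follow-the-leader scheduling: the first (leader)
    section sets a pose at each insertion step, and each trailing
    section matches the pose the leader had when it occupied the same
    position along the path. Section i lags the leader by i steps."""
    commands = []
    for step in range(len(leader_poses)):
        # Before a trailing section reaches the leader's recorded path,
        # it holds the earliest (initial) pose.
        commands.append([leader_poses[max(step - i, 0)]
                         for i in range(n_sections)])
    return commands
```

For example, with a two-section device, at step 2 the distal section takes the leader's latest pose while the section behind it takes the pose the leader had one step earlier.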
- the navigation, control, orientation, and/or state feature(s) are not limited thereto, and one or more devices or systems of the present disclosure may include any other desired navigation, control, orientation, and/or state specifications or details as desired for a given application or use.
- the first portion or section may be a distal or tip portion or section of the imaging or continuum robot device or system.
- the first portion or section may be any predetermined or set portion or section of the imaging or continuum robot device or system, and the first portion or section may be predetermined or set manually by a user of the imaging or continuum robot device or system or may be set automatically by the imaging device or system (or by a combination of manual and automatic control).
- a “change of orientation” or a “change of state” may be defined in terms of direction and magnitude.
- each interpolated step may have a same direction, and each interpolated step may have a larger magnitude as each step approaches a final orientation.
- any motion along a single direction may be the accumulation of a small motion in that direction.
- the small motion may have a unique or predetermined set of wire position or state changes to achieve the orientation change.
- Large or larger motion(s) in that direction may use a plurality of the small motions to achieve the large or larger motion(s). Dividing a large change into a series of multiple changes of the small or predetermined/set change may be used as one way to perform interpolation.
- Interpolation may be used in one or more embodiments to produce a desired or target motion, and at least one way to produce the desired or target motion may be to interpolate the change of wire positions or states.
- a “change of state” may refer to a change of force (e.g., driving wire force(s), distal tip force(s), catheter tip force(s), etc.).
- an apparatus or system may include one or more processors that operate to: instruct or command a distal bending section or portion of a catheter or a probe of the continuum robot such that the distal bending section or portion achieves, or is disposed at, a bending pose or position, the catheter or probe of the continuum robot having a plurality of bending sections or portions and a base; store or obtain the bending pose or position of the distal bending section or portion and store or obtain a position or state of a motorized linear stage (or other structure used to map path or path-like information) that operates to move the catheter or probe of the continuum robot in a case where the one or more processors instruct or command forward motion, or a motion in a set or predetermined direction or directions, of the motorized linear stage (or other predetermined or set structure for mapping path or path-like information); generate a goal or target bending pose or position for each corresponding section or portion of the catheter or probe from, or based on, the previous
- FIG.1 illustrates a simplified representation of a medical environment, such as an operating room, where a robotic catheter system 1000 may be used.
- FIG. 2 illustrates a functional block diagram that may be used in at least one embodiment of the robotic catheter system 1000.
- FIGS.3A-3D represent at least one embodiment of the catheter 104 (see FIGS. 3A-3B) and bending for the catheter 104 (as shown in FIGS.
- FIGS. 4-5 illustrate logical block diagrams that may be used for one or more embodiments of the robotic catheter system 1000.
- the system 1000 may include a computer cart (see e.g., the controller 100, 102 in FIG. 1) operatively connected to a steerable catheter or continuum robot 104 via a robotic platform 108.
- the robotic platform 108 includes one or more than one robotic arm 132 and a rail 110 (see e.g., FIGS. 1-2) and/or linear translation stage 122 (see e.g., FIG. 2).
- one or more embodiments of a system 1000 for performing robotic control may include one or more of the following: a display controller 100, a display 101-1, a display 101-2, a controller 102, an actuator 103, a continuum device (also referred to herein as a “steerable catheter” or “an imaging device”) 104, an operating portion 105, a tracking sensor 106 (e.g., an electromagnetic (EM) tracking sensor) or camera, a catheter tip position/orientation/pose/state detector 107, and a rail 110 (which may be attached to or combined with a linear translation stage 122) (for example, as shown in at least FIGS.
- the system 1000 may include one or more processors, such as, but not limited to, a display controller 100, a controller 102, a console or computer 1200, a CPU 1201, any other processor or processors discussed herein, etc., that operate to execute a software program, to control the one or more control technique(s), support structure feature(s) and/or technique(s), or other feature(s) or technique(s) discussed herein, and to control display of a navigation screen on one or more displays 101-1, 101-2, etc.
- the one or more processors may generate a three dimensional (3D) model of a structure (for example, a branching structure like airway of lungs of a patient, an object to be imaged, tissue to be imaged, etc.) based on images, such as, but not limited to, CT images, MRI images, etc.
- the 3D model may be received by the one or more processors (e.g., the display controller 100, the controller 102, the console or computer 1200, the CPU 1201, any other processor or processors discussed herein, etc.) from another device.
- a two-dimensional (2D) model may be used instead of a 3D model in one or more embodiments.
- the 2D or 3D model may be generated before a navigation starts.
- the 2D or 3D model may be generated in real-time (in parallel with the navigation).
- examples of generating a model of a branching structure are explained.
- the models may not be limited to a model of a branching structure.
- a model of a route leading directly to a target may be used instead of the branching structure.
- a model of a broad space may be used, and the model may be a model of a place or a space where an observation or a work is performed by using the continuum robot 104 explained below.
- a user U may control the robotic catheter system 1000 via a user interface unit (operation unit) to perform an intraluminal procedure on a patient P positioned on an operating table B.
- the user interface may include at least one of a main or first display 101-1 (a first user interface unit), a second display 101-2 (a second user interface unit), and a handheld controller 105 (a third user interface unit).
- the main or first display 101-1 may include, for example, a large display screen attached to the system 1000 and/or the controllers 100, 102 of the system 1000 or mounted on a wall of the operating room and may be, for example, designed as part of the robotic catheter system 1000 or may be part of the operating room equipment.
- a secondary display 101-2 may be a compact (portable) display device configured to be removably attached to the robotic platform 108.
- the second or secondary display 101-2 may include, but is not limited to, a portable tablet computer, a mobile communication device (a cellphone), a tablet, a laptop, etc.
- the steerable catheter 104 may be actuated via an actuator unit 103.
- the actuator unit 103 may be removably attached to the robotic platform 108 or any component thereof (e.g., the robotic arm 132, the rail 110, and/or the linear translation stage 122).
- the handheld controller 105 may include a gamepad-like controller with a joystick having shift levers and/or push buttons, and the controller 105 may be a one-handed controller or a two-handed controller.
- the actuator unit 103 may be enclosed in a housing having a shape of a catheter handle.
- One or more access ports 126 may be provided in or around the catheter handle. The access port 126 may be used for inserting and/or withdrawing end effector tools and/or fluids when performing an interventional procedure of the patient P.
- the system 1000 includes at least a system controller 102, a display controller 100, and the main display 101-1.
- the main display 101-1 may include a conventional display device such as a liquid crystal display (LCD), an OLED display, a QLED display, any other display discussed herein, any other display known to those skilled in the art, etc.
- the main display 101-1 may provide or display a graphical user interface (GUI) configured to display one or more views. These views may include a live view image 134, an intraoperative image 135, a preoperative image 136, and other procedural information 138. Other views that may be displayed include a model view, a navigational information view, and/or a composite view.
- the live image view 134 may be an image from a camera at the tip of the catheter 104.
- the live image view 134 may also include, for example, information about the perception and navigation of the catheter 104.
- the preoperative image 136 may include pre-acquired 3D or 2D medical images of the patient P acquired by conventional imaging modalities such as, but not limited to, computed tomography (CT), magnetic resonance imaging (MRI), ultrasound imaging, or any other desired imaging modality.
- the intraoperative image 135 may include images used for an image-guided procedure; such images may be acquired by fluoroscopy or CT imaging modalities (or another desired imaging modality).
- the intraoperative image 135 may be augmented, combined, or correlated with information obtained from a sensor, camera image, or catheter data.
- the sensor may be located at the distal end of the catheter 104.
- the catheter tip tracking sensor 106 may be, for example, an electromagnetic (EM) sensor.
- a catheter tip position detector 107 may be included in the robotic catheter system 1000; the catheter tip position detector 107 may include an EM field generator operatively connected to the system controller 102.
- the catheter/continuum robot 104 may not include or use the EM tracking sensor 106.
- Suitable electromagnetic sensors for use with a steerable catheter may be used with any feature of the present disclosure, including the sensors discussed, for example, in U.S. Pat. No. 6,201,387 and in International Pat. Pub. WO 2020/194212 A1, which are incorporated by reference herein in their entireties.
- the element 106 may be a camera or other imaging device/feature.
- the display controller 100 may acquire position/orientation/navigation/pose/state (or other state) information of the continuum robot 104 from a controller 102.
- the display controller 100 may acquire the position/orientation/navigation/pose/state (or other state) information directly from a tip position/orientation/navigation/pose/state (or other state) detector 107.
- the continuum robot 104 may be a catheter device (e.g., a steerable catheter or probe device).
- the continuum robot 104 may be attachable/detachable to the actuator 103, and the continuum robot 104 may be disposable.
- FIG.2 illustrates the robotic catheter system 1000 including the system controller 102 operatively connected to the display controller 100, which is connected to the first display 101-1 and to the second display 101-2 and/or to the handheld controller or operating portion 105.
- the system controller 102 is also connected to the actuator 103 via the robotic platform 108 or any component thereof (e.g., the robotic arm 132, the rail 110, and/or the linear translation stage 122).
- the actuator unit 103 may include a plurality of motors 144 that operate to control a plurality of drive wires 160 (while not limited to any particular number of drive wires 160, FIG. 2 shows that six (6) drive wires 160 are being used in the subject embodiment example).
- the drive wires 160 travel through the steerable catheter or continuum robot 104.
- One or more access ports 126 may be located on the catheter 104 (and may include an insertion/extraction detector 109).
- the catheter 104 may include a proximal section 148 located between the actuator 103 and the proximal bending section 152, where the drive wires 160 operate to actuate the proximal bending section 152.
- Three of the six drive wires 160 continue through the distal bending section 156 where the drive wires 160 operate to actuate the distal bending section 156 and allow for a range of movement.
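The six-wire routing described above can be sketched as a simple wire-to-segment lookup. This is a hypothetical illustration only; the names and the three/three split follow the embodiment example of FIG. 2.

```python
# Illustrative sketch (not from the source): three drive wires anchor at the
# proximal bending section 152, and three continue on through to anchor at
# the distal bending section 156, as in the six-wire example of FIG. 2.
DRIVE_WIRES = {
    1: "proximal", 2: "proximal", 3: "proximal",   # anchored at segment 152
    4: "distal", 5: "distal", 6: "distal",         # anchored at segment 156
}

def wires_for_segment(segment):
    """Return the drive-wire indices that actuate a given bending segment."""
    return [w for w, seg in DRIVE_WIRES.items() if seg == segment]
```

Pushing or pulling only the wires returned for a segment bends that segment, while wires that continue to a more distal segment slide freely through the wire-guiding members.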
- FIG.2 is shown with two bendable sections 152, 156 (although one or more bendable sections may be used in one or more embodiments).
- FIGS.3A-3B show at least one embodiment of a continuum robot 104 that may be used in the system 1000 or any other system discussed herein.
- FIG. 3A shows at least one embodiment of a steerable catheter 104.
- the steerable catheter 104 may include a non-steerable proximal section 148, a steerable distal section 156, and a catheter tip 320.
- the proximal section 148 and the distal bendable section (including portions 152, 154, and 156 in FIG. 3A) together make up the body of the catheter 104; the proximal section 148 is configured with through-holes (or thru-holes), grooves, or conduits to pass the drive wires 160 from the distal section 152, 154, 156 to the actuator unit 103.
- the distal section 152, 154, 156 comprises a plurality of bending segments including at least a distal segment 156, a middle segment 154, and a proximal segment 152. Each bending segment is bent by actuation of at least some of the plurality of drive wires 160 (driving members).
- the posture of the catheter 104 may be supported by supporting wires (support members) also arranged along the wall of the catheter 104 (as discussed in U.S. Pat. Pub. US2021/0308423, which is incorporated by reference herein in its entirety).
- the proximal ends of drive wires 160 are connected to individual actuators or motors 144 of the actuator unit 103, while the distal ends of the drive wires 160 are selectively anchored to anchor members in the different bending segments of the distal bendable section(s) 152, 154, 156.
- Each bending segment is formed by a plurality of ring-shaped components (rings) with through-holes (or thru-holes), grooves, or conduits along the wall of the rings.
- the ring-shaped components are defined as wire-guiding members 162 or anchor members 164 depending on their respective function(s) within the catheter 104.
- the anchor members 164 are ring-shaped components onto which the distal ends of one or more drive wires 160 are attached in one or more embodiments.
- the wire-guiding members 162 are ring-shaped components through which some drive wires 160 slide (without being attached thereto).
- As shown in FIG. 3B, detail “A” obtained from the identified portion of FIG. 3A illustrates at least one embodiment of a ring-shaped component (a wire-guiding member 162 or an anchor member 164).
- Each ring-shaped component 162, 164 may include a central opening which may form a tool channel 168 and may include a plurality of conduits 166 (grooves, sub-channels, or through-holes (or thru-holes)) arranged lengthwise (and which may be equidistant from the central opening) along the annular wall of each ring-shaped component 162, 164.
- an inner cover such as is described in U.S. Pat. Pub. US2021/0369085 and US2022/0126060, which are incorporated by reference herein in their entireties, may be included to provide a smooth inner channel and to provide protection.
- the non-steerable proximal section 148 may be a flexible tubular shaft and may be made of extruded polymer material.
- the tubular shaft of the proximal section 148 also may have a central opening or tool channel 168 and plural conduits 166 along the wall of the shaft surrounding the tool channel 168.
- An outer sheath may cover the tubular shaft and the steerable section 152, 154, 156.
- at least one tool channel 168 formed inside the steerable catheter 104 provides passage for an imaging device and/or end effector tools from the insertion port 126 to the distal end of the steerable catheter 104.
- the actuator unit 103 may include, in one or more embodiments, one or more servo motors or piezoelectric actuators.
- the actuator unit 103 may operate to bend one or more of the bending segments of the catheter 104 by applying a pushing and/or pulling force to the drive wires 160.
- each of the three bendable segments of the steerable catheter 104 has a plurality of drive wires 160. If each bendable segment is actuated by three drive wires 160, the steerable catheter 104 has nine drive wires 160 arranged along the wall of the catheter 104. Each bendable segment of the catheter 104 is bent by the actuator unit 103 pushing or pulling at least one of these nine drive wires 160. Force is applied to each individual drive wire 160 in order to manipulate/steer the catheter 104 to a desired pose or state.
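As a rough illustration of how push/pull actuation translates into wire motion, a constant-curvature model is often assumed for tendon-driven segments (the patent does not commit to a specific kinematic model; the formula and names below are assumptions):

```python
import math

# Constant-curvature sketch (an assumption, not the patent's stated model):
# for a segment bent by angle theta (rad) in bending-plane direction phi (rad),
# a drive wire routed at radius r and angular position beta around the wall
# changes length by approximately dl = -r * theta * cos(phi - beta).
# Negative dl means the wire must be pulled; positive means pushed/released.
def wire_displacements(theta, phi, r, n_wires=3):
    betas = [2 * math.pi * i / n_wires for i in range(n_wires)]
    return [-r * theta * math.cos(phi - beta) for beta in betas]
```

With three wires spaced 120° apart, the displacements sum to approximately zero: pulling one wire is balanced by pushing or releasing the others.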
- the actuator unit 103 assembled with steerable catheter 104 may be mounted on the robotic platform 108 or any component thereof (e.g., the robotic arm 132, the rail 110, and/or the linear translation stage 122).
- the robotic platform 108, the rail 110, and/or the linear translation stage 122 may include a slider and a linear motor.
- the robotic platform 108 or any component thereof is motorized, and may be controlled by the system controller 102 to insert and remove the steerable catheter 104 to/from the target, sample, or object (e.g., the patient, the patient’s bodily lumen, one or more airways, a lung, a tissue, an organ, a target or object, a specimen, etc.).
- An imaging device 180 that may be inserted through the tool channel 168 includes an endoscope camera (videoscope) along with illumination optics (e.g., optical fibers or LEDs) (or any other camera or imaging device, tool, etc. discussed herein or known to those skilled in the art).
- the illumination optics provide light to irradiate the lumen and/or a lesion target which is a region of interest within the target, sample, or object (e.g., in a patient).
- End effector tools may refer to endoscopic surgical tools including clamps, graspers, scissors, staplers, ablation or biopsy needles, and other similar tools, which serve to manipulate body parts (organs or tumorous tissue) during imaging, examination, or surgery.
- the imaging device 180 may be what is commonly known as a chip-on-tip camera and may be color (e.g., take one or more color images) or black-and-white (e.g., take one or more black-and-white images). In one or more embodiments, a camera may support color and black-and-white images.
- the imaging device 180 or camera may use fluoroscopy (NIRF, NIRAF, any other fluoroscopy discussed herein or known to those skilled in the art, etc.) in addition to another imaging modality (e.g., OCT, IVUS, any other imaging modality discussed herein or known to those skilled in the art, etc.) or may be any other camera discussed herein.
- a tracking sensor (e.g., an EM tracking sensor 106) may be provided at or near the tip of the steerable catheter 104.
- the steerable catheter 104 and the tracking sensor 106 may be tracked by the tip position detector 107.
- the tip position detector 107 detects a position of the tracking sensor 106, and outputs the detected positional information to the system controller 102.
- the system controller 102 receives the positional information from the tip position detector 107, and continuously records and displays the position of the steerable catheter 104 with respect to the coordinate system of the target, sample, or object (e.g., a patient, a lung, an airway(s), a vessel, etc.).
- the system controller 102 operates to control the actuator unit 103 and the robotic platform 108 or any component thereof (e.g., the robotic arm 132, the rail 110, and/or the linear translation stage 122) in accordance with the manipulation commands input by the user U via one or more of the input and/or display devices (e.g., the handheld controller 105, a GUI at the main display 101-1, touchscreen buttons at the secondary display 101-2, etc.).
- FIG.3C and FIG.3D show exemplary catheter tip manipulations by actuating one or more bending segments of the steerable catheter 104. As illustrated in FIG. 3C, manipulating only the most distal segment 156 of the steerable section may change the position and orientation of the catheter tip 320.
- manipulating one or more bending segments (152 or 154) other than the most distal segment may affect only the position of catheter tip 320, but may not affect the orientation of the catheter tip 320.
- actuation of distal segment 156 changes the catheter tip from a position P1 having orientation O1, to a position P2 having orientation O2, to position P3 having orientation O3, to position P4 having orientation O4, etc.
- actuation of the proximal segment 152 and/or the middle segment 154 may change the position of the catheter tip 320 from a position P1 having orientation O1 to a position P2 and a position P3 having the same orientation O1.
- the one or more catheter tip manipulations shown in FIG. 3C and FIG. 3D may be performed during catheter navigation (e.g., while inserting the catheter 104 through tortuous anatomies, one or more targets, one or more lungs or other organs, one or more airways, samples, objects, a patient, etc.).
- the one or more catheter tip manipulations shown in FIG. 3C and FIG. 3D may apply particularly to the targeting mode applied after the catheter tip 320 has been navigated to a predetermined distance (a targeting distance) from the target, sample, or object.
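The tip behaviors of FIG. 3C and FIG. 3D can be sketched with planar constant-curvature forward kinematics (an illustrative assumption; the function names, segment count, and lengths are hypothetical):

```python
import math

# Planar constant-curvature forward kinematics (illustrative assumption).
# Each segment of length L bent by angle theta contributes a translation and
# a rotation of the tip frame; theta = 0 means a straight segment.
def segment_tip(theta, L):
    if abs(theta) < 1e-9:
        return (L, 0.0, 0.0)             # straight: advance along local x
    r = L / theta                        # radius of the circular arc
    return (r * math.sin(theta), r * (1 - math.cos(theta)), theta)

def chain_tip(thetas, L):
    """Tip (x, y, heading) for segments ordered proximal -> distal."""
    x = y = heading = 0.0
    for th in thetas:
        dx, dy, dth = segment_tip(th, L)
        # rotate the segment's local displacement into the world frame
        x += dx * math.cos(heading) - dy * math.sin(heading)
        y += dx * math.sin(heading) + dy * math.cos(heading)
        heading += dth
    return (x, y, heading)
```

Bending only the most distal segment changes the tip heading (as in FIG. 3C), whereas an equal-and-opposite bend of two more proximal segments displaces the tip while leaving its heading unchanged, mirroring FIG. 3D.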
- the actuator 103 may proceed or retreat along a rail 110 (e.g., to translate the actuator 103, the continuum robot/catheter 104, etc.), and the actuator 103 and continuum robot 104 may proceed or retreat in and out of the patient’s body or other target, object, or specimen (e.g., tissue).
- the catheter device 104 may include a plurality of driving backbones and may include a plurality of passive sliding backbones.
- the catheter device 104 may include at least nine (9) driving backbones and at least six (6) passive sliding backbones.
- the catheter device 104 may include an atraumatic tip at the end of the distal section of the catheter device 104 and/or may include or use force determination and/or force difference determination technique(s) and/or feature(s) as further discussed below.
- FIG.4 illustrates that a system 1000 may include the system controller 102 which may operate to execute software programs and control the display controller 100 to display a navigation screen (e.g., a live view image 134) on the main display 101-1 and/or the secondary display 101-2.
- the display controller 100 may include a graphics processing unit (GPU) or a video display controller (VDC) (or any other suitable hardware discussed herein or known to those skilled in the art).
- FIG. 5 illustrates components of the system controller 102 and/or the display controller 100.
- any of the one or more processors may be configured separately.
- the controller 102 may similarly include a CPU 120, a RAM 130, an I/O 140, a ROM 110, and an HDD 150 as shown diagrammatically in FIG. 5.
- any of the one or more processors such as, but not limited to, the controller 102 and the display controller 100, may be configured as one device (for example, the structural attributes of the controller 100 and the controller 102 may be combined into one controller or processor, such as, but not limited to, the one or more other processors discussed herein (e.g., computer, console, or processor 1200, 1200’, etc.)).
- the controller 102 and display controller 100 may include a central processing unit (CPU 120) which may comprise one or more processors (microprocessors, nanoprocessors, etc.), a random access memory (RAM 130) module, an input/output (I/O 140) interface, a read only memory (ROM 110), and data storage memory (e.g., a hard disk drive (HDD 150) or other data storage memory, such as, but not limited to, a solid state drive (SSD) or other data storage 150 as shown, for example, in FIG. 4).
- the one or more processors, and/or the display controller 100 and/or the controller 102 may include structure as shown in FIGS.14- 19 as further discussed below.
- the ROM 110 and/or HDD 150 may store the operating system (OS) software and may store software programs that operate to be used for executing the functions of the robotic catheter system 1000 as a whole.
- the RAM 130 may be used as a workspace memory.
- the CPU 120 may operate to execute the software programs developed in the RAM 130.
- the I/O 140 may operate to input, for example, positional information to the display controller 100, and to output information for displaying the navigation screen to the one or more displays (main display 101-1 and/or secondary display 101-2).
- system controller 102 and/or the display controller 100 may include one or more computer or processing components or units, such as, but not limited to, the components, processors, or units shown in at least FIGS. 1, 7A-12B, and/or 14-19 as discussed herein.
- the system controller 102 and the display controller 100 may include a central processing unit (CPU 1201) (which may be comprised of one or more processors (microprocessors, nanoprocessors, etc.)), a random access memory (RAM 1203) module, an input/output or communication (I/O 1205) interface, a read only memory (ROM 1202), and data storage memory (e.g., a hard disk drive 1204 or solid state drive (SSD) 1204) (see e.g., also data storage 150 of FIG. 4).
- the navigation screen is a graphical user interface (GUI) generated by a software program, but it may also be generated by firmware, or a combination of software and firmware.
- a Solid State Drive (SSD) 1204 or other data storage may be used instead of HDD 1204 as the data storage 150.
- the system controller 102 may control the steerable catheter 104 based on any known kinematic algorithms applicable to continuum or steerable catheter robots. For example, the segments or portions of the steerable catheter 104 may be controlled individually to direct the catheter tip with a combined actuation of all bendable segments or sections. By way of another example, a controller 102 may control the catheter 104 based on an algorithm known as follow the leader (FTL) algorithm or mode.
- the most distal segment 156 is actively controlled with forward kinematic values, while the middle segment 154 and the other middle or proximal segment 152 (following sections) of the steerable catheter 104 move at a first position in the same way as the distal section moved at the first position or a second position near the first position (e.g., the subsequent sections may follow a path traced out by the distal section).
- a reverse FTL (RFTL) process or algorithm may be used; this may be implemented using inverse kinematics.
- the RFTL mode may automatically control all sections of the steerable catheter 104 to retrace the pose (or state) from the same position along the path made during insertion (e.g., in a reverse or backwards order or manner).
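The FTL/RFTL behavior described above can be sketched as a shift register of bend commands indexed by insertion position (a conceptual illustration only; the class and its one-segment step granularity are assumptions, not the patent's implementation):

```python
# Follow-the-leader sketch (illustrative assumption). Bend commands applied
# to the most distal segment are recorded; as the catheter advances by one
# segment length, each following segment replays the bend that the segment
# ahead of it used at the same position along the path. Retraction retraces
# the recorded poses in reverse order (RFTL).
class FollowTheLeader:
    def __init__(self, n_segments):
        self.bends = [0.0] * n_segments  # index 0 = most distal segment

    def advance(self, new_distal_bend):
        """Insert by one step: followers inherit their leader's bend."""
        self.bends = [new_distal_bend] + self.bends[:-1]
        return list(self.bends)

    def retract(self, new_proximal_bend=0.0):
        """Reverse FTL: retrace the path in backwards order on withdrawal."""
        self.bends = self.bends[1:] + [new_proximal_bend]
        return list(self.bends)
```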
- the display controller 100 may acquire position information of the steerable catheter 104 from the system controller 102. Alternatively, the display controller 100 may acquire the position information directly from the tip position detector 107.
- the steerable catheter 104 may be a single-use or limited-use catheter device. In other words, the steerable catheter 104 may be attachable to, and detachable from, the actuator unit 103 to be disposable.
- the display controller 100 may generate and output a live-view image or other view(s) or a navigation screen to the main display 101-1 and/or the secondary display 101-2.
- This view may optionally be registered with a 3D model of a patient’s anatomy (a branching structure) and the position information of at least a portion of the catheter (e.g., position of the catheter tip 320) by executing pre-programmed software routines.
- one or more end effector tools may be inserted through the access port 126 at the proximal end of the catheter 104, and such tools may be guided through the tool channel 168 of the catheter body to perform an intraluminal procedure from the distal end of the catheter.
- the tool may be a medical tool such as an endoscope camera, forceps, a needle, or other biopsy or ablation tools.
- the tool may be described as an operation tool or working tool. The working tool is inserted or removed through the working tool access port 126.
- the tool may include an endoscope camera or an end effector tool, which may be guided through a steerable catheter under the same principles.
- in a procedure, there is usually a planning procedure, a registration procedure, a targeting procedure, and an operation procedure.
- the one or more processors such as, but not limited to, the display controller 100, may generate and output a navigation screen to the one or more displays 101-1, 101-2 based on the 2D/3D model and the position/orientation/navigation/pose/state/force (or other state) information by executing the software.
- the navigation screen may indicate a current position/orientation/navigation/pose/state/force (or other state) of the continuum robot 104 on the 2D/3D model.
- a user may recognize the current position/orientation/navigation/pose/state/force (or other state) of the continuum robot 104 in the branching structure.
- Any feature of the present disclosure may be used with any navigation/pose/state feature(s) or other feature(s) discussed in PCT App. No. PCT/US2024/031766, filed May 30, 2024, the disclosures of which are incorporated by reference herein in their entireties.
- a user may recognize the current position of the steerable catheter 104 in the branching structure.
- one or more end effector tools may be inserted through the access port 126 at the proximal end of the catheter 104, and such tools may be guided through the tool channel 168 of the catheter body to perform an intraluminal procedure from the distal end of the catheter 104.
- the ROM 1202 and/or HDD 1204 may operate to store the software in one or more embodiments.
- the RAM 1203 may be used as a work memory.
- the CPU 1201 may execute the software program developed in the RAM 1203.
- the I/O or communication interface 1205 may operate to input the positional (or other state) information (or other information, such as, but not limited to, distal force or force difference information) to the display controller 100 (and/or any other processor discussed herein) and to output information for displaying the navigation screen to the one or more displays 101-1, 101-2.
- the navigation screen may be generated by the software program. In one or more other embodiments, the navigation screen may be generated by a firmware.
- One or more devices or systems may include a tip position/orientation/navigation/pose/state (or other state) detector 107 that operates to detect a position/orientation/navigation/pose/state (or other state) of the EM tracking sensor 106 and to output the detected positional (and/or other state) information to the controller 100 or 102 (e.g., as shown in FIGS.1-2), or to any other processor(s) discussed herein.
- the controller 102 may operate to receive the positional (or other state) information of the tip of the continuum robot 104 from the tip position/orientation/navigation/pose/state (or any other state discussed herein) detector 107.
- the controller 100 and/or the controller 102 operates to control the actuator 103 in accordance with the manipulation by a user (e.g., manually), and/or automatically (e.g., by a method or methods run by one or more processors using software, by the one or more processors, using automatic manipulation in combination with one or more manual manipulations or adjustments, etc.) via one or more operation/operating portions or operational controllers 105 (e.g., such as, but not limited to a joystick as shown in FIGS.1-2; see also, diagram of FIG.4).
- the one or more displays 101-1, 101-2 and/or operation portion or operational controllers 105 may be used as a user interface 3000 (also referred to as a receiving device) (e.g., as shown diagrammatically in FIG.4).
- the system(s) 1000 may include, as an operation unit, the display 101-1 (e.g., such as, but not limited to, a large screen user interface with a touch panel, first user interface unit, etc.), the display 101-2 (e.g., such as, but not limited to, a compact user interface with a touch panel, a second user interface unit, etc.) and the operating portion 105 (e.g., such as, but not limited to, a joystick shaped user interface unit having shift lever/ button, a third user interface unit, a gamepad, or other input device, etc.).
- the controller 100 and/or the controller 102 may control the continuum robot 104 based on an algorithm known as follow the leader (FTL) algorithm and/or the RFTL algorithm.
- the FTL algorithm may be used in addition to the robotic control features of the present disclosure.
- the middle section and the proximal section (following sections) of the continuum robot 104 may move at a first position (or other state) in the same or similar way as the distal section moved at the first position (or other state) or a second position (or state) near the first position (or state) (e.g., during insertion of the continuum robot/catheter 104, by using the navigation, movement, and/or control feature(s) of the present disclosure, etc.).
- the middle section and the distal section of the continuum robot 104 may move at a first position or state in the same/similar/approximately similar way as the proximal section moved at the first position or state or a second position or state near the first position (e.g., during removal of the continuum robot/catheter 104).
- the continuum robot/catheter 104 may be removed by automatically and/or manually moving along the same or similar, or approximately same or similar, path that the continuum robot/catheter 104 used to enter a target (e.g., a body of a patient, an object, a specimen (e.g., tissue), etc.) using the FTL algorithm, including, but not limited to, using FTL with the one or more control, support structure (e.g., kink avoidance/reduction, buckling reduction/avoidance, etc.), or other technique(s) discussed herein.
- any feature of the present disclosure may be used with features, including, but not limited to, training feature(s), autonomous navigation feature(s), artificial intelligence feature(s), etc., as discussed in PCT App. No. PCT/US2024/037935, filed July 12, 2024, the disclosures of which are incorporated by reference herein in their entireties.
- the system 1000 may include a tool access port 126 for a camera, biopsy tools, or other types of medical tools (as shown in FIGS. 1-2).
- the tool may be a medical tool, such as an endoscope, a forceps, a needle, or other biopsy tools, etc.
- the tool may be described as an operation tool or working tool (e.g., an imaging device, a camera, etc.).
- the working tool may be inserted or removed through a working tool insertion slot 126 (as shown in FIGS. 1-2).
- Any of the features of the present disclosure may be used in combination with any of the features, including, but not limited to, the tool insertion slot, as discussed in U.S. Pat. App. No. 18/477,081, filed September 28, 2023, the disclosure of which is incorporated by reference herein in its entirety.
- One or more of the features discussed herein may be used for planning procedures, including using one or more models for robotic control and/or artificial intelligence applications.
- FIG.6 is a flowchart showing steps of at least one planning procedure of an operation of the continuum robot/catheter device 104.
- One or more of the processors discussed herein may execute the steps shown in FIG. 6, and these steps may be performed by executing a software program read from a storage medium, including, but not limited to, the ROM 1202 or HDD/SSD 1204, by CPU 1201 or by any other processor discussed herein.
- One or more methods of planning using the continuum robot/catheter device 104 may include one or more of the following steps: (i) in step s601, one or more images, such as CT or MRI images, may be acquired; (ii) in step s602, a three-dimensional model of a branching structure (for example, an airway model of lungs or a model of an object, specimen, or other portion of a body) may be generated based on the acquired one or more images; (iii) in step s603, a target on the branching structure may be determined (e.g., based on a user instruction, based on preset or stored information, etc.); (iv) in step s604, a route of the continuum robot/catheter device 104 to reach the target (e.g., on the branching structure) may be determined (e.g., based on a user instruction, based on preset or stored information, based on a combination of user instruction and stored or preset information, etc.); and/or
- a model (e.g., a 2D or 3D model), a target, and a route on the model may be determined and stored before the operation of the continuum robot 104 is started.
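The route-determination step (s604) can be illustrated as a breadth-first search over a branching structure; the toy airway names and adjacency map below are hypothetical, not from the source:

```python
from collections import deque

# Minimal sketch of route determination: find a path through a branching
# structure (here a toy airway tree as an adjacency map) from the entry
# branch to the target branch. Returns the branch sequence, or None.
def plan_route(tree, start, target):
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for child in tree.get(path[-1], []):
            queue.append(path + [child])
    return None

# Hypothetical branching structure for illustration only.
AIRWAYS = {"trachea": ["L-main", "R-main"], "R-main": ["RUL", "RML"]}
```

The resulting branch sequence could then be stored with the model and target before the operation is started, as described above.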
- embodiments of using a catheter device/continuum robot 104 are explained, such as, but not limited to, features for performing navigation, movement, and/or robotic control technique(s), performing or using force determination and/or force difference determination feature(s) and/or technique(s), or any other feature(s) and/or technique(s) discussed herein.
- Pose or state information may be stored in a lookup table or tables, and the pose or state information for one or more sections of the catheter or probe may be updated in the lookup table based on new information (e.g., environmental change(s) for the catheter or probe, movement of a target or sample, movement of a patient, user control, relaxation state changes, etc.).
- the new information or the updated information may be used to control the one or more sections of the catheter or probe more efficiently during navigation (forwards and/or backwards).
- the previously stored pose or state may not be ideal or may work less efficiently as compared with an updated pose or state modified or updated in view of the new information (e.g., the movement, in this example).
- one or more embodiments of the present disclosure may update or modify the pose or state information such that robotic control of the catheter or probe may work efficiently in view of the new information, movement, relaxation, and/or environmental change(s).
- the update or change may also affect a number of other points (e.g., all points in a lookup table or tables, all points forward beyond the initially changed point, one or more future points or points beyond the initially changed point as desired, etc.).
- the transform (or difference, change, update, etc.) between the previous pose or state and the new or updated pose or state may be propagated to all points going forward or may be propagated to one or more of the forward points (e.g., for a predetermined or set range, for a predetermined or set distance, etc.). Doing so in one or more embodiments may operate to shift all or part of the future path based on how the pose or state of the catheter or probe was adjusted, using that location as a pivot point.
- Such update(s) may be obtained from one or more internal sources (e.g., one or more processors, one or more sensors, combination(s) thereof, etc.) or may be obtained from one or more external sources (e.g., other processor(s), external sensor(s), combination(s) thereof, etc.).
- a difference between a real-time target, sample, or object (e.g., an airway) and the previous target, sample, or object (e.g., a previous airway) may be detected using machine vision (of the endoscope image) or using multiple medical images.
- Body, target, object, or sample divergence may also be estimated from other sensors, like one measuring breathing or the motion of the body (or another predetermined or set motion or change to track).
- an amount of transform, update, and/or change may be different for each point, and/or may be a function of, for example, a distance from a current point.
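The distance-dependent propagation described above can be sketched as a weighted shift of forward path points (an illustrative assumption; a linear falloff is chosen here, and the function and parameter names are hypothetical):

```python
# Sketch of propagating a pose update to forward path points. The correction
# applied at the current point is blended into future points with a weight
# that decays linearly with path-index distance from the current point,
# reaching zero at `reach` points ahead (using the index as a distance proxy).
def propagate_update(path, index, offset, reach):
    """path: list of (x, y); offset: (dx, dy) correction at path[index]."""
    updated = list(path)
    for i in range(index, len(path)):
        dist = i - index
        w = max(0.0, 1.0 - dist / reach)      # linear falloff weight
        x, y = path[i]
        updated[i] = (x + w * offset[0], y + w * offset[1])
    return updated
```

Setting `reach` larger than the remaining path length would shift essentially all forward points, while a small `reach` confines the correction near the pivot, matching the range/distance options described above.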
- One or more methods of controlling or using a continuum robot/catheter device may use one or more Hold the Line techniques, Close the Gap techniques, and/or Stay the Course techniques, such as, but not limited to, the techniques discussed in U.S. Pat. App. No. 63/585,128 filed on September 25, 2023, the disclosure of which is incorporated by reference herein in its entirety, and as discussed in International Patent Application No.
- a catheter or probe may be controlled to stay on the desired course.
- a pose, position, or state of a section or sections, or of a portion or portions, of the catheter or probe may be adjusted to minimize any deviation of a pose, position, or state of one or more next (e.g., subsequent, following, proximal, future, middle/proximal, etc.) sections out of the predetermined, targeted, desired trajectory and to maximize motion along the trajectory.
- the coordinates and the trajectory of subsequent/following/next/future sections may be known, set, or determined, and information for one or more prior sections may be known, set, or determined.
- by accounting for section lengths in one or more embodiments, one or more advantageous results may be achieved.
- any counter-active or undesired motion(s) may be avoided or eliminated.
- the system controller 102 (or any other controller, processor, computer, etc. discussed herein) may perform a robotic control mode and/or an autonomous navigation mode. During the robotic control mode and/or the autonomous navigation mode, the user does not need to control the bending and translational insertion position of the steerable catheter 104.
- the autonomous navigation mode may include or comprise: (1) a perception step, (2) a planning step, and (3) a control step.
- the system controller 102 may receive an endoscope view (or imaging data) and may analyze the endoscope view (or imaging data) to find addressable airways from the current position/orientation of the catheter 104. At the end of this analysis, the system controller 102 identifies/perceives these addressable airways as paths in an endoscope view (or imaging data).
- the planning step is a step to determine a target path, which is the destination for the steerable catheter 104.
- the present disclosure uniquely includes means to concurrently reflect user instructions in the decision of a target path among the identified or perceived paths.
- the control step is a step to control the steerable catheter 104 and the linear translation stage 122 (or any other portion of the robotic platform 108) to navigate the steerable catheter 104 to the target path, pose, state, etc. This step may also be performed as an automatic step.
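The three steps above can be sketched as a simple loop. This is a hypothetical, minimal illustration, not the controller's actual implementation: the names (`NavController`, `perceive`, `plan`, `control`) and the candidate-filtering "perception" stand-in are assumptions, and a real system would run image processing on the endoscope view.

```python
# Minimal, hypothetical sketch of the perception/planning/control loop.
# All names here are illustrative, not from the source document.
from dataclasses import dataclass, field

@dataclass
class NavController:
    commands: list = field(default_factory=list)

    def perceive(self, view):
        # (1) Perception: identify addressable airways as path candidates.
        return [p for p in view["candidates"] if p["reachable"]]

    def plan(self, paths, user_choice=None):
        # (2) Planning: a concurrent user selection wins; otherwise
        # fall back to the first identified path.
        return paths[user_choice] if user_choice is not None else paths[0]

    def control(self, target):
        # (3) Control: issue a steering command toward the target path.
        self.commands.append(("steer", target["center"]))

view = {"candidates": [{"center": (10, 4), "reachable": True},
                       {"center": (-3, 7), "reachable": False}]}
nav = NavController()
nav.control(nav.plan(nav.perceive(view)))
# nav.commands == [("steer", (10, 4))]
```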
- the system controller 102 operates to use information relating to the real-time endoscope view (e.g., the view 134), the target path, and internal design and status information of the robotic catheter system 1000.
- the real-time endoscope view 134 may be displayed in a main display 101-1 (as a user input/output device) in the system 1000. The user may see the airways in the real-time endoscope view 134 through the main display 101-1. This real-time endoscope view 134 may also be sent to the system controller 102.
- the system controller 102 may process the real-time endoscope view 134 and may identify path candidates by using image processing algorithms.
- the system controller 102 may select the paths with the designed computation processes, and then may display the paths with a circle, octagon, or other geometric shape with the real-time endoscope view 134, for example, as discussed in International App. No. PCT/US2024/037935, filed July 12, 2024, the disclosures of which are incorporated by reference herein in their entireties.
- the system controller 102 may provide a cursor so that the user may indicate the target path by moving the cursor with the joystick 105. When the cursor is disposed or is located within the area of the path, the system controller 102 operates to recognize the path with the cursor as the target path.
- the system controller 102 may pause the motion of the actuator unit 103 and the linear translation stage 122 while the user is moving the cursor so that the user may select the target path with a minimal change of the real-time endoscope view 134 and paths since the system 1000 would not move in such a scenario.
- Any feature of the present disclosure may be used with autonomous navigation, movement detection, and/or control technique(s), including, but not limited to, the features discussed in International App. No. PCT/US2024/037935, filed July 12, 2024, the disclosures of which are incorporated by reference herein in their entireties.
- the system controller 102 may control the steerable catheter 104 based on any known kinematic algorithms applicable to continuum or snake-like catheter robots.
- the system controller controls the steerable catheter 104 based on an algorithm known as the follow-the-leader (FTL) algorithm or on the RFTL algorithm.
- the most distal segment 156 is actively controlled with forward kinematic values, while the middle segment 154 and the other middle or proximal segment 152 (following sections) of the steerable catheter 104 move at a first position in the same way as the distal section moved at the first position or a second position near the first position.
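The FTL behavior described above can be sketched as a lookup over the distal tip's history. This is a hypothetical simplification: the function name, the 1-D "bend angle", and the nearest-depth lookup are assumptions, not the document's actual kinematics.

```python
# Hypothetical follow-the-leader (FTL) sketch: the distal segment is
# commanded directly; each following segment, on reaching a given
# insertion depth, replays the bend the distal tip had at (or nearest
# to) that depth.

def ftl_bend_for(tip_history, depth):
    """tip_history: list of (insertion_depth_mm, bend_angle_deg) pairs."""
    return min(tip_history, key=lambda h: abs(h[0] - depth))[1]

tip_history = [(0.0, 0.0), (10.0, 15.0), (20.0, 30.0)]
# A middle segment currently at depth 10.2 mm adopts the bend the tip
# used at the nearest recorded depth (10.0 mm -> 15 degrees).
bend = ftl_bend_for(tip_history, 10.2)  # -> 15.0
```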
- any other algorithm may be applied to control a continuum robot or catheter/probe.
- applying a same “change in position” or a “change in state” to two separate orientations/states may maintain a difference (e.g., a set difference, a predetermined difference, etc.) between the two separate orientations/states.
- where an orientation/state difference is defined as the difference between wire positions/states in one or more embodiments (other embodiments are not limited thereto), changing both sets of wire positions or states by the same amount would not affect the orientation or state difference between the two separate orientations or states.
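The invariance described above can be checked directly: if an orientation is represented by drive-wire positions, adding the same offset to both wire sets leaves their difference unchanged. The list representation below is an illustrative assumption.

```python
# Adding a common shift to two sets of wire positions preserves
# the element-wise difference between them.

def wire_difference(a, b):
    return [x - y for x, y in zip(a, b)]

a = [1.0, 2.0, 3.0]
b = [0.5, 2.5, 2.0]
shift = 0.75
a_shifted = [x + shift for x in a]
b_shifted = [x + shift for x in b]
diff_before = wire_difference(a, b)
diff_after = wire_difference(a_shifted, b_shifted)
# diff_before == diff_after == [0.5, -0.5, 1.0]
```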
- Orientations mapped to two subsequent stage positions/states may have a specific orientation difference between the orientations.
- the smoothing process may include an additional step of a “small motion”, which operates to cause the pose/state difference to change by an amount of that small motion.
- the small motion step operates to direct that orientation/state in a table towards a proper (e.g., set, desired, predetermined, selected, etc.) direction, while also maintaining a semblance or configuration of the prior path/state before the smoothing process was applied. Therefore, in one or more embodiments, it may be most efficient and effective to combine and compare wire positions or states to or with prior orientations or states while using a smoothing process to maintain the pre-existing orientation changes.
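The "small motion" step can be sketched as a bounded nudge per smoothing pass. This is a hypothetical scalar illustration (the function name and fixed step size are assumptions), not the document's actual smoothing process.

```python
# Hypothetical "small motion" smoothing sketch: each pass nudges a
# stored orientation value by a small fixed amount toward the desired
# direction, changing the pose difference by only that small motion
# while preserving the shape of the prior path.

def smooth_toward(value, target, step=0.1):
    if abs(target - value) <= step:
        return target
    return value + step if target > value else value - step

v = 0.0
for _ in range(5):
    v = smooth_toward(v, 0.35)
# v converges to the target within a few small-motion steps: v == 0.35
```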
- a catheter or probe may transition, move, or adjust using a shortest possible volume.
- a process or algorithm may perform the transitioning, moving, or adjusting process more efficiently than computing a transformation stackup of each section or portion of the catheter or probe.
- each interpolated step aims towards the final orientation in a desired direction such that any prior orientation which the interpolated step is combined with will also aim towards the desired direction to achieve the final orientation.
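The stepwise interpolation described above can be sketched with a scalar angle standing in for a full orientation; the function name and linear stepping are illustrative assumptions.

```python
# Every intermediate step moves a fixed fraction toward the final
# orientation, so each increment (and any prior orientation it is
# combined with) aims in the desired direction.

def interpolate(start, final, steps):
    return [start + (final - start) * k / steps for k in range(steps + 1)]

angles = interpolate(0.0, 90.0, 3)
# angles == [0.0, 30.0, 60.0, 90.0]; each 30-degree increment points
# toward the final 90-degree orientation.
```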
- an apparatus or system may include one or more processors that operate to: receive or obtain an image or images showing pose or position (or other state) information of a tip section of a catheter or probe having a plurality of sections including at least the tip section; track a history of the pose or position (or other state) information of the tip section of the catheter or probe during a period of time; and use the history of the pose or position (or other state) information of the tip section to determine how to align or transition, move, or adjust (e.g., robotically, manually, automatically, etc.) each section of the plurality of sections of the catheter or probe.
- additional image(s) may be received or obtained to show the catheter or probe after each section of the plurality of sections of the catheter or probe has been aligned or adjusted (e.g., robotically, manually, automatically, etc.) based on the history of the pose or position (or other state) information of the tip section.
- the apparatus or system may include a display to display the image or images showing the aligned or adjusted sections of the catheter or probe.
- the pose or position (or other state) information may include, but is not limited to, a target pose or position (or other state) or a final pose or position (or other state) that the tip section is set to reach, an interpolated pose or position (or other state) of the tip section (e.g., an interpolation of the tip section between two positions or poses (or other states) (e.g., from pose or position (or other state) A to pose or position (or other state) B) where the apparatus or system sends pose (or other state) change information in steps based on a desired, set, or predetermined speed; between poses or positions where each pose or position (or other state) that the catheter or probe takes or is disposed in is tracked during the transition; etc.), and a measured pose or position (or other state) (e.g., using tracked poses or positions (or other states), using encoder positions (or other states) of each wire motor, etc.), where the one or more processors may further operate to calculate or derive a current pose or position (or other state).
- each pose or position (or state) may be converted (e.g., via the one or more processors) between the following formats: Drive Wire Positions (or state(s)); and/or Coordinates (three-dimensional (3D) Position and Orientation (or other state(s))).
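The conversion between drive-wire positions and coordinates can be sketched under a constant-curvature assumption for a single antagonistic wire pair. This is a common simplification and an assumption here: the wire offset `r`, the function names, and the 1-D bend angle are illustrative, not necessarily the model used in the source.

```python
import math

# Hypothetical constant-curvature conversion between a bend angle
# (theta, radians) and antagonistic drive-wire displacements, with
# wire offset r from the segment axis.

def wires_from_bend(theta, r=1.5):
    # Antagonistic pair: one wire shortens, the other lengthens.
    return (-r * theta, r * theta)

def bend_from_wires(w_a, w_b, r=1.5):
    # Inverse mapping: recover the bend angle from the wire pair.
    return (w_b - w_a) / (2 * r)

w_a, w_b = wires_from_bend(math.pi / 6)
theta = bend_from_wires(w_a, w_b)  # recovers pi/6 (30 degrees)
```

A round trip through both functions returns the original angle, which is the consistency property any such pose/wire conversion pair should satisfy.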
- an apparatus or system may include a camera deployed at a tip of a catheter or probe that may be bent with the catheter or probe, and/or the camera may be detachably attached to, or removably inserted into, the steerable catheter or probe.
- an apparatus or system may include a display controller, or the one or more processors may display the image or images for display on a display.
- imaging modalities including, for example, CT (computed tomography), MRI (magnetic resonance imaging), NIRF (near infrared fluorescence), NIRAF (near infrared auto-fluorescence), OCT (optical coherence tomography), SEE (spectrally encoded endoscope), IVUS (intravascular ultrasound), PET (positron emission tomography), X-ray imaging, combinations or hybrids thereof, other imaging modalities discussed herein, any combination thereof, or any modality known to those skilled in the art.
- configurations are described as a robotic bronchoscope (or other scope, such as, but not limited to, an endoscope, other scopes discussed herein, or known to those skilled in the art) arrangement or a continuum robot arrangement that may be equipped with a tool channel for an imaging device and medical tools, where the imaging device and the medical tools may be exchanged by inserting and retracting the imaging device and/or the medical tools via the tool channel (see e.g., tool channel 126 in FIGS. 1-2, see e.g., imaging device or tool (e.g., camera) 160, and see e.g., medical tool 133 in FIG.1).
- the imaging device can be a camera or other imaging device
- the medical tool can be a biopsy tool or other medical device.
- Configurations may facilitate placement of medical tools, catheters, needles or the like, and may be free standing, cart mounted, patient mounted, movably mounted, combinations thereof, or the like.
- the present disclosure is not limited to any particular configuration.
- the robotic catheter or steerable catheter arrangement may be used in association with one or more displays and control devices and/or processors, such as those discussed herein (see e.g., one or more device or system configurations shown in one or more of FIGS. 1-19 of the present disclosure).
- the display device may display, on a monitor, an image captured by the imaging device, and the display device may have a display coordinate used for displaying the captured image.
- top, bottom, right, and left portions of the monitor(s) or display(s) may be defined by axes of the displaying coordinate system/grid, and a relative position of the captured image or images against the monitor may be defined on the displaying coordinate system/grid.
- the robotic catheter or scope may use one or more imaging devices (e.g., a catheter or probe 104, a camera, a sensor, any other imaging device discussed herein, etc.) and one or more display devices (e.g., a display 101-1, a display 101-2, a screen 1209, any other display discussed herein, etc.) to facilitate viewing, imaging, and/or characterizing tissue, a sample, or other object using one or a combination of the imaging modalities described herein.
- the tool channel (e.g., the tool channel 126) or the camera (e.g., the camera 180 of FIG. 3A).
- the tool channel or the camera may be bent according to a control by the system (such as, but not limited to, the features discussed herein and shown in at least FIGS.3A-3D).
- the system may have an operational controller (for example, a gamepad, a joystick 105 (see e.g., FIGS. 1-2), etc.) and a control coordinate.
- a direction to which the tool channel or the camera moves or is bent according to a particular command (up, down, turn right, or turn left; alternatively, a command set may include a first direction, a second direction opposite or substantially opposite from or to the first direction, a third direction that is about or is 90 degrees from or to the first direction, and a fourth direction that is opposite or substantially opposite from or to the third direction) is adjusted to match a direction (top, bottom, right or left) on a display (or on the display coordinate).
- the calibration is performed so that an upward direction of the displayed image on the display coordinate corresponds to an upward direction on the control coordinate (a direction to which the tool channel or the camera moves according to an “up” command).
- first, second, third, and fourth directions on the display correspond to the first, second, third, and fourth directions of the control coordinate (e.g., of the tool channel or camera).
- the tool channel or the camera is bent to an upward or first direction on the control coordinate.
- the direction to which the tool channel or the camera is bent corresponds to an upward or first direction of the captured image displayed on the display.
- a rotation function of a display of the captured image on the display coordinate may be performed.
- the orientation of the camera view should match a conventional orientation of the camera view that physicians or other medical personnel typically see in their normal catheter, imaging device, scope, etc. procedure: for example, for a bronchoscope, the right and left main bronchi may be displayed horizontally on a monitor or display (e.g., the display 101-1, the display 101-2, the display or screen 1209, etc.). Then, if the right and left main bronchi in a captured image are not displayed horizontally on the display, a user may rotate the captured image on the display coordinate so that the right and left main bronchi are displayed horizontally on a monitor or display.
- when the captured image is rotated on the display coordinate after a calibration is performed, a relationship between the top, bottom, right, and left (or first, second, third, and/or fourth directions) of the displayed image and the top, bottom, right, and left (or corresponding first, second, third, and/or fourth directions) of the monitor may be changed.
- the tool channel or the camera may move or may be bent in the same way regardless of the rotation of the displayed image when a particular command is received (for example, a command to let the tool channel or the camera (or a capturing direction of the camera) move upward, downward, right, or left or to move in the first direction, second direction, third direction, or fourth direction).
- a calibration or arrangement of the imaging device, scope, catheter or probe may use an orientation feature 406 discussed below.
- This may change a relationship between the top, bottom, right, and left (or first, second, third, and fourth directions) of the monitor and a direction to which the tool channel or the camera moves (up, down, right, or left; or a first, second, third, or fourth direction) on the monitor according to a particular command (for example, tilting a joystick to up, down, right, or left; tilting the joystick in a first direction, the second direction, the third direction, or the fourth direction; etc.).
- when the calibration is performed, by tilting the joystick upward (or to a first direction), the tool channel or the camera is bent in a direction corresponding to the top (or the first direction) of or on the monitor.
- the tool channel or the camera may not be bent to the direction corresponding to the direction of the top (or of the first direction) of the monitor but may be bent in a direction diagonally upward of the monitor. This may complicate user interaction.
- When the camera is inserted into a continuum robot or steerable catheter apparatus or system or any other system or apparatus discussed herein, an operator may map or calibrate the orientation of the camera view, the user interface device, and the robot end-effector. However, this may not be enough for bronchoscopists in one or more situations, because (1) the right and left main bronchi may be displayed in an arbitrary direction in this case, and (2) bronchoscopists rely on how the bronchi look to navigate a bronchoscope and typically confirm the location of the bronchoscope using or based on how the right and left main bronchi look.
- a direction to which a tool channel or a camera moves or is bent is corrected automatically in a case where a displayed image is rotated.
- the robot configurational embodiments described below make it possible to keep a correspondence between a direction on a monitor (top, bottom, right, or left of the monitor; a first, second, third, or fourth direction(s) of the monitor, etc.), a direction the tool channel or the camera moves on the monitor or display (e.g., the display 101-1, the display 101-2, the display or screen 1209, etc.) according to a particular directional command (up, down, turn right, or turn left; first direction, second direction, third direction, or fourth direction, etc.), and a user interface device, even in a case where the displayed image is rotated.
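One way to keep that correspondence is to rotate the joystick command by the image-rotation angle before sending it to the bending control. This is a hypothetical 2-D sketch: the function name, sign convention, and angle parameter are assumptions, not the document's actual correction.

```python
import math

# Hypothetical correction: a directional command (dx, dy) expressed in
# monitor/display coordinates is rotated by the display-rotation angle
# phi_deg back into the camera/control frame, so "up on the monitor"
# remains "up" on the displayed (rotated) image.

def correct_command(dx, dy, phi_deg):
    phi = math.radians(phi_deg)
    return (dx * math.cos(phi) - dy * math.sin(phi),
            dx * math.sin(phi) + dy * math.cos(phi))

# With the image rotated 90 degrees, an "up" command (0, 1) on the
# monitor maps to a bend along a different axis in the camera frame.
cx, cy = correct_command(0.0, 1.0, 90.0)
```

With a zero rotation angle the command passes through unchanged, which is the uncalibrated baseline behavior.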
- medical image processing may implement functionality through use of one or more processes, techniques, algorithms, or other steps discussed herein that operate to provide rapid, accurate, cost-effective, and minimally invasive structure and manufacture/use techniques for the structure of a catheter and/or a catheter tip.
- one or more embodiments may use the force determination and/or force difference determination technique(s) of the present disclosure.
- a medical apparatus or system provides advantageous features to continuum robots or steerable catheters by providing rapid, accurate, cost-effective, and minimally invasive structure and manufacture/use techniques for structure of a catheter and/or a catheter tip and providing work efficiency to physicians during a medical procedure and rapid, accurate, and minimally invasive techniques for patients.
- a medical apparatus or system 1000 may be provided in the form of a continuum robot or steerable catheter (or other imaging device) assembly or configuration that provides medical imaging with rapid, accurate, cost-effective, and minimally invasive structure and manufacture/use techniques for structure of a catheter and/or a catheter tip according to one or more embodiments.
- FIGS. 2-5, 7A-12B, and 14-19 show one or more hardware configurations of the system 1000 as discussed above for FIG. 1.
- the system 1000 (or any other system discussed herein) may include one or more medical tools 133 and one or more medical devices or catheters/probes 104 (see e.g., as shown in FIG. 1).
- the medical tool 133 may be referred to as a “biopsy tool”, a “medical procedure tool”, an “imaging tool”, etc. in one or more embodiments, and the medical device 104 may be referred to as a “catheter” (or probe, continuum robot, steerable catheter, etc.). That said, the medical tool 133 and the medical device 104 are not limited thereto, and a variety of other types of tools, devices, configurations, or arrangements also fall within the scope of the present disclosure, including, but not limited to, for example, a bronchoscope, catheter, robotic bronchoscope, robotic catheter, endoscope, colonoscope, ablation device, sheath, guidewire, needle, probe, forceps, another medical tool, a camera, an imaging device, etc.
- the controller or joystick 105 may have a housing with an elongated handle or handle section which may be manually grasped, and one or more input devices including, for example, a lever or a button or another input device that allows a user, such as a physician, nurse, technician, etc., to send a command to the medical apparatus or system 1000 (or any other system or apparatus discussed herein) to move the catheter 104.
- input devices including, for example, a lever or a button or another input device that allows a user, such as a physician, nurse, technician, etc., to send a command to the medical apparatus or system 1000 (or any other system or apparatus discussed herein) to move the catheter 104.
- the controller or joystick 105 may execute software, computer instructions, algorithms, etc., so the user may complete all operations with the hand-held controller 105 by holding it with one hand, and/or the controller or joystick 105 may operate to communicate with one or more processors or controllers (e.g., processor 1200, controller 102, display controller 100, any other processor, computer, or controller discussed herein or known to those skilled in the art, etc.) that operate to execute software, computer instructions, algorithms, methods, other features, etc., so the user may complete any and/or all operations.
- the medical device 104 may be configured as or operate as a bronchoscope, catheter, endoscope, or another type of medical device.
- the system 1000 may use an imaging device, where the imaging device may be a mechanical, digital, or electronic device configured to record, store, or transmit visual images, e.g., a camera, a camcorder, a motion picture camera, etc.
- the display controller 100, the controller 102, a processor (such as, but not limited to, the processor 1200, any other processor discussed herein, etc.), etc. may operate to execute software, computer instructions, algorithms, methods, etc., and control a display of a navigation screen on the display 101-1, other types of imagery or information on the mini- display or other display 101-2, a display on a screen 1209, etc.
- the display controller 100, the controller 102, a processor (such as, but not limited to, the processor 1200, any other processor discussed herein, etc.), etc. may generate a three-dimensional (3D) model of an internal branching structure, for example, lungs or other internal structures, of a patient based on medical images such as CT, MRI, another imaging modality, etc. Additionally or alternatively, the 3D model may be received by the display controller 100, the controller 102, a processor (such as, but not limited to, the processor 1200, any other processor discussed herein, etc.), etc. from another device.
- the display controller 100, the controller 102, a processor (such as, but not limited to, the processor 1200, any other processor discussed herein, etc.), etc. may generate and output a navigation screen to any of the displays 101-1, 101-2, 1209, etc. based on the 3D model and the catheter position information by executing the software and/or by performing one or more algorithms, methods, and/or other features of the present disclosure.
- any console thereof may include one or more or a combination of levers, keys, buttons, switches, a mouse, a keyboard, etc., to control the elements of the system 1000 (or any other system or apparatus discussed herein) and each may have configurational components, as shown in FIGS. 4-5 and 14 as aforementioned, and may include other elements or components as discussed herein or known to those skilled in the art.
- the components of the system 1000 may be interconnected with medical instruments or a variety of other devices, and may be controlled independently, externally, or remotely by the display controller 100, the controller 102, a processor (such as, but not limited to, the processor 1200, any other processor discussed herein, etc.), etc.
- a sensor such as, but not limited to, the tracking sensor 106, a tip position detector 107, any other sensor discussed herein, etc. may monitor, measure or detect various types of data of the system 1000 (or other apparatus or system discussed herein), and may transmit or send the sensor readings or data to a host through a network.
- the I/O interface or communication 1205 may interconnect various components with the medical apparatus/system 1000 to transfer data or information, or facilitate communication, to or from the apparatus/system 1000.
- a power source may be used to provide power to the medical apparatus or system 1000 (or any other apparatus/system discussed herein) to maintain a regulated power supply, and may operate in a power-on mode, a power-off mode, and/or other modes.
- the power source may include or comprise a battery contained or included in the medical apparatus or system 1000 (or other apparatus or system discussed herein) and/or may include an external power source such as line power or AC power from a power outlet that may interconnect with the medical apparatus or system 1000 (or other system or apparatus of the present disclosure) through an AC/DC adapter and a DC/DC converter, or an AC/DC converter (or using any other configuration discussed herein or known to those skilled in the art) in order to adapt the power voltage from a source into one or more voltages used by components in the medical apparatus or system 1000 (and/or any other system or apparatus discussed herein).
- any of the sensors or detectors discussed herein, including, but not limited to, the sensor 106, the detector 107, etc., may be a plurality of sensors and may acquire sensor information output from one or more sensors that detect force, motion, current position, and movement of components interconnected with the medical apparatus or system 1000 (or any other apparatus or system of the present disclosure).
- a multi-axis acceleration or accelerometer sensor and a multi-axis gyroscope sensor may be a combination of acceleration and gyroscope sensors, may include other sensors, and may be configured through the use of a piezoelectric transducer, a mechanical switch, a single-axis accelerometer, a multi-axis accelerometer, or other types of configurations.
- the medical apparatus or system 1000 may monitor, detect, measure, record, or store physical, operational, quantifiable data or other characteristic parameters of the medical apparatus or system 1000 (or any other system or apparatus discussed herein) including one or more or a combination of a force, impact, shock, drop, fall, movement, acceleration, deceleration, velocity, rotation, temperature, pressure, position, orientation, motion, or other types of data of the medical apparatus or system 1000 (and/or other apparatus or system discussed herein) in multiple axes, in a multi-dimensional manner, along an x axis, y axis, z axis, or any combination thereof, and may generate sensor readings, information, data, a digital signal, an electronic signal, or other types of information corresponding to the detected state.
- the medical apparatus or system 1000 may transmit or send the sensor reading data wirelessly or in a wired manner to a remote host or server.
- Any of the sensors or detectors discussed herein, including, but not limited to, the sensor 106, the detector 107, etc., may be used/interrogated and may generate a sensor reading signal or information that may be processed in real time, stored, post processed at a later time, or combinations thereof.
- the information or data that is generated by any of the sensors or detectors discussed herein, including, but not limited to, the sensor 106, the detector 107, etc. may be processed, demodulated, filtered, or conditioned to remove noise or other types of signals.
- any of the sensors or detectors discussed herein, including, but not limited to, the sensor 106, the detector 107, etc. may include one or more or a combination of a force sensor, an acceleration, deceleration, or accelerometer sensor, a gyroscope sensor, a power sensor, a battery sensor, a proximity sensor, a motion sensor, a position sensor, a rotation sensor, a magnetic sensor, a barometric sensor, an illumination sensor, a pressure sensor, an angular position sensor, a temperature sensor, an altimeter sensor, an infrared sensor, a sound sensor, an air monitoring sensor, a piezoelectric sensor, a strain gauge sensor, a vibration sensor, a depth sensor, and may include other types of sensors.
- the acceleration sensor may sense or measure the displacement of a mass of a component of the medical apparatus or system 1000 with respect to a position, or sense the speed of a motion of the component of the medical apparatus or system 1000 (or other apparatus or system).
- the gyroscope sensor may sense or measure angular velocity or an angle of motion and may measure movement of the medical apparatus or system 1000 in up to six total degrees of freedom in three-dimensional space, including three degrees of translational freedom along Cartesian x, y, and z coordinates and orientation changes between those axes through rotation about one or more of a yaw axis, a pitch axis, a roll axis, and a horizontal axis.
- the acceleration sensor may include, for example, a gravity sensor, a drop detection sensor, etc.
- the gyroscope sensor may include an angular velocity sensor, a hand-shake correction sensor, a geomagnetism sensor, etc.
- the position sensor may be a global positioning system (GPS) sensor that receives data output from a GPS.
- the longitude and latitude of a current position may be obtained from access points of a radio frequency identification device (RFID) and a WiFi device and from information output from wireless base stations, for example, so that these detections may be used as position sensors. These sensors may be arranged internally or externally of the medical apparatus or system 1000 (or any other system or apparatus of the present disclosure).
- the medical device 104 in one or more embodiments, may be configured as a catheter 104 as aforementioned and as shown in FIGS.1-5, and may move based on any of the aforementioned algorithms, including, but not limited to, the FTL algorithm, the RFTL algorithm, any other algorithm known to those skilled in the art, etc.
- the middle section and the proximal section (following sections) of the catheter 104 may move at a first position in the same way as the distal section moved at the first position or a second position near the first position.
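The follow-the-leader (FTL) behavior described above, in which following sections replay, at a given insertion position, the pose that the distal section had at that same position, can be sketched as a pose buffer keyed by insertion position. The class, method names, and the `section_offset` parameter below are illustrative assumptions, not taken from the disclosure:

```python
from collections import deque

class FollowTheLeaderBuffer:
    """Illustrative FTL buffer: a following section replays the bending
    pose that the distal section had when it was at the same (or a near)
    insertion position."""

    def __init__(self, section_offset: float):
        # Distance along the catheter between the distal section and the
        # following section (hypothetical parameter).
        self.section_offset = section_offset
        self._history = deque()  # (insertion_position, bending_angle)

    def record_distal(self, insertion_position: float, bending_angle: float):
        self._history.append((insertion_position, bending_angle))

    def command_for_follower(self, insertion_position: float) -> float:
        """Return the bending angle for the follower: the last distal pose
        recorded at or before (current position - section offset)."""
        target = insertion_position - self.section_offset
        best = 0.0
        for pos, angle in self._history:
            if pos <= target:
                best = angle
            else:
                break
        return best

buf = FollowTheLeaderBuffer(section_offset=10.0)
for pos, ang in [(0.0, 0.0), (5.0, 0.2), (10.0, 0.5), (15.0, 0.5)]:
    buf.record_distal(pos, ang)
# The follower at insertion position 15.0 replays the distal pose at 5.0.
print(buf.command_for_follower(15.0))  # 0.2
```

A real controller would also interpolate between recorded poses and cover multiple following sections; this sketch only shows the replay idea.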
- the display controller 100, the controller 102, a processor such as, but not limited to, the processor 1200, any other processor discussed herein, etc.
- imaging, e.g., computed tomography (CT), Magnetic Resonance Imaging (MRI), etc.
- apparatuses, systems, methods, and storage mediums for using a navigation and/or control method or methods (manual or automatic) in one or more apparatuses or systems (e.g., an imaging apparatus or system, a catheter, an autonomous catheter or robot, an endoscopic imaging device or system, etc.).
- Embodiment(s) of an imaging and/or medical apparatus, device, or system may include: a bendable body having a distal bending segment or section that is bendable by at least one drive wire; an actuator that operates to drive the at least one drive wire; a force sensor in or near the actuator that operates to measure a wire force on the at least one drive wire; a user interface; and one or more controllers or processors that operate to: control the actuator to manipulate the bendable body, and to send and/or receive information from the user interface.
- the controller(s) or processor(s) may operate to: compute a driving force on the drive wire(s) using a mathematical model for a manipulation of the bendable body in air (and/or in another medium/environment); compute an exerted force and/or a force difference between (a) the wire forces measured with the force sensor and (b) the driving forces computed with the mathematical model; and send a command to the actuator and/or the user interface, wherein the command is a function of the exerted force and/or the force difference.
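The computation described above, in which the controller compares in-air model predictions against measured wire forces and commands the actuator as a function of the difference, can be sketched as follows. The linear stand-in model, function names, and threshold policy are assumptions for illustration only:

```python
def model_forces(q, stiffness=1.0):
    """Stand-in mathematical model f: predicted in-air wire forces for
    wire positions q (a real model would come from the kinematics, a
    regression, or a look-up table as described later in the text)."""
    return [stiffness * qi for qi in q]

def control_step(q, measured, threshold):
    """One loop iteration: predict forces, compute the force difference,
    and pick a command as a function of that difference."""
    predicted = model_forces(q)
    diff = [m - p for m, p in zip(measured, predicted)]
    # Hypothetical policy: stop when any wire's difference exceeds the
    # pre-set threshold, otherwise keep following the trajectory.
    if max(abs(x) for x in diff) > threshold:
        return "STOP", diff
    return "FOLLOW_TRAJECTORY", diff

cmd, diff = control_step(q=[0.5, -0.5], measured=[0.6, -0.4], threshold=0.5)
print(cmd)  # FOLLOW_TRAJECTORY
cmd, _ = control_step(q=[0.5, -0.5], measured=[1.5, -0.4], threshold=0.5)
print(cmd)  # STOP
```

The command could instead be a user-interface warning or a mode change, as the later passages describe; the structure of the step stays the same.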
- the apparatus may further include a tool channel extending a length of the one or more bending sections.
- a tool channel or tool channel tube may run through a body and catheter shaft.
- a storage medium stores instructions or a program for causing one or more processors of an apparatus or system to perform a method of controlling and/or using a robotic catheter or imaging apparatus and/or to perform a method for computing a force or force difference between a wire force measurement and a computed driving force, where the method may include using one or more technique(s) as discussed herein.
- FIGS. 7A-7B are side views of a robotic catheter in a straight and bent configuration, respectively, in accordance with aspect(s) of the present disclosure.
- the catheter 104 is attached to actuator 103.
- the catheter 104 is steerable with driving wires 160a and 160b.
- the driving wires are anchored with anchor 164 at the distal end of the catheter 104 and connected to linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, etc., in or in communication with the actuator 103, to steer the catheter 104 against a mechanical ground set in the actuator 103.
- At least one force sensor 190 operates to measure force(s) on driving wires 160a, 160b, respectively.
- the actuator 103 operates to be or is electrically connected to the controller(s) 100, 102 and to move the linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, etc., based on commands from the controller(s) or processors(s) 100, 102.
- the controller(s) 100, 102 may include the joystick 105 (or another handheld controller, also indicated as 105) with which one or more operators may indicate a steering command or a path that the one or more operators would like to achieve; the controller(s) may receive the one or more operators’ indication with the joystick 105 and then send the commands to the actuator 103 to move the linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc.
- controller(s) 100, 102 may operate to receive force values on each driving wire 160a, 160b and position values on each linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc., in real time.
- the controller 100 and/or the controller 102 may include a mathematical model to compute the respective force value on each driving wire 160a, 160b for the position values of the linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc., with no surroundings around the catheter 104, i.e., in air.
- the mathematical model maps the driving wires’ positions q, which equal the positions of the linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc., to the force estimates t on each driving wire 160a, 160b when the catheter 104 is not surrounded by anything, e.g., in air (see FIG.8).
- the mathematical model includes computation f.
- the driving wire’s positions q may be a vector with 1 x n components for n driving wires.
- the force estimates t may be a vector with 1 x n components for force estimates for n driving wires.
- the driving wires’ positions q may be determined based on a linear encoder to measure a position of linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc.
- the linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc., may include motors with rotary encoders; in that case, the driving wires’ positions q may be computed by using the rotary encoder output.
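Deriving q from either encoder type reduces to a scale conversion. The function names and calibration constants below (counts-per-unit, screw lead) are illustrative assumptions:

```python
def wire_positions_from_linear_encoders(counts, mm_per_count):
    """q for n driving wires from linear encoder counts
    (mm_per_count is a hypothetical calibration scale)."""
    return [c * mm_per_count for c in counts]

def wire_positions_from_rotary_encoders(counts, counts_per_rev, mm_per_rev):
    """q when each linear motion mechanism is driven by a motor with a
    rotary encoder and a lead screw (mm_per_rev is a hypothetical lead)."""
    return [c / counts_per_rev * mm_per_rev for c in counts]

print(wire_positions_from_linear_encoders([100, -50], 0.01))          # [1.0, -0.5]
print(wire_positions_from_rotary_encoders([4096, -2048], 4096, 2.0))  # [2.0, -1.0]
```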
- computation f may be a mapping generated by machine learning approaches.
- the approaches may be one of a neural network, regression methods based on support vector machine or a random forest, or may use any other model approaches or Artificial Intelligence (AI) structure or techniques discussed herein, including, but not limited to, the structure or techniques shown in at least FIGS. 15-19 as discussed further below.
- computation f may be one of regression methods based on a least-squares method or interpolation methods based on a radial basis function or Gaussian process regression.
- computation f may be referencing a look-up-table prepared in advance.
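A look-up-table realization of computation f can be sketched as linear interpolation over a calibration table prepared in advance. The table values and function name below are hypothetical:

```python
from bisect import bisect_left

def lut_force(q, table_q, table_t):
    """Computation f as a look-up table prepared in advance, with linear
    interpolation between entries (any interpolation scheme would do)."""
    if q <= table_q[0]:
        return table_t[0]
    if q >= table_q[-1]:
        return table_t[-1]
    i = bisect_left(table_q, q)
    q0, q1 = table_q[i - 1], table_q[i]
    t0, t1 = table_t[i - 1], table_t[i]
    return t0 + (t1 - t0) * (q - q0) / (q1 - q0)

# Hypothetical calibration: wire position (mm) -> in-air wire force (N).
table_q = [0.0, 1.0, 2.0, 3.0]
table_t = [0.0, 0.4, 0.9, 1.5]
print(lut_force(1.5, table_q, table_t))  # ≈ 0.65
```

For n driving wires, one table (or one interpolant) per wire would be queried with the corresponding component of q.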
- computation f may be a kinematic model based on the catheter and driving wires design.
- the controller(s) 100, 102 may include and operate to execute the computation with the mathematical model in FIG. 9.
- the controller(s) 100, 102 (or any other processor discussed herein or known to those skilled in the art) operate to receive current position values on linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc., via sensors.
- the sensors may be encoders on motors operating linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc.
- the controller(s) 100, 102 (or any other processor discussed herein (e.g., processor 1200) or any other processor known to those skilled in the art) may operate to then plug the current position values into the mathematical model.
- the mathematical model outputs the force estimates or determinations.
- the controller(s) 100, 102 (or any other processor discussed herein (e.g., the processor 1200) or any other processor known to those skilled in the art) may operate to also receive current force values on driving wires 160a and 160b.
- the controller(s) 100, 102 may operate to plug the force estimates and the current force values into algorithm 1 (see e.g., FIG.9).
- Algorithm 1 operates to at least compare the respective force estimates and the current force values and compute the force difference between the force estimates and the current force values.
- One design example of Algorithm 1 for one or more embodiments may be a comparison of the force difference with the threshold value.
- one or more system embodiments may operate to define a maximum magnitude of the force difference as the pre-set threshold value.
- the system(s) may operate to issue the specialized commands to linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc.
- the specialized commands may be a stop command to stop all linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc., safely while the controller(s) 100, 102 (or any other processor discussed herein (e.g., the processor 1200) or any other processor known to those skilled in the art) is receiving all necessary sensor data.
- the specialized commands may be a command to change a motion mode to control linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc., based on a force feedback control.
- the system may make the catheter 104 flexible by moving linear motion mechanism(s), which may be one or more of a linear translation stage 122, a rail 110, a robotic arm 132, any other motion mechanism discussed herein or known to those skilled in the art, etc., to reduce the magnitude of force value to zero.
- Algorithm 1 may operate to be connected to software to manage a graphical user interface (GUI) and the system status.
- Algorithm 1 may operate to issue a warning via the GUI and change the system state to a corresponding state.
- the catheter 104 may have one bending segment bendable in a plane.
- a catheter base coordinate xyz may be defined with an origin at the proximal end of the catheter 104. The origin may be also on the centroid of the catheter 104.
- the catheter 104 may have an antagonistic pair of driving wires 160a, 160b with the offset of d from the centroid of the catheter 104.
- pa and pb may have the below geometrical relationship with the bending angle θ: pa = d·θ and pb = −d·θ.
- pa and pb have the same magnitude with opposite directions.
- the bending may be modeled with a linear rotational spring constant Kθ in FIG. 11.
- the bending moment equilibrium may be represented in the following relationship with the forces Ta and Tb on driving wires 160a and 160b: Kθ·θ = (Ta − Tb)·d.
- In this model, all driving wires 160a and 160b may be contributing to the bending evenly.
- one or more embodiments may derive the mathematical model to map pa and pb into Ta and Tb, as follows: Ta = (Kθ/(2d²))·pa and Tb = (Kθ/(2d²))·pb.
- The important observation of these equations is that Kθ/(2d²) is a constant value determined by the design values. Therefore, by using these equations for the mathematical model, one or more embodiments may estimate the force on the driving wires 160a and 160b throughout the catheter operation.
- Similarly, one or more embodiments may derive the mathematical model for multiple bending sections. As a design example, a two-segment catheter is illustrated in FIGS. 12A-12B.
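Under the standard antagonistic-pair kinematics (wire displacement p = ±d·θ), the constant Kθ/(2d²) named in the text maps each wire position directly to its wire force. The sketch below is a reconstruction under that assumption, with hypothetical design values, not the patent's verbatim equations:

```python
def wire_forces_single_segment(p_a, p_b, k_theta, d):
    """Map wire displacements to drive-wire forces for one planar bending
    segment: T_i = K_theta / (2 d^2) * p_i (the constant noted in the text)."""
    c = k_theta / (2 * d * d)
    return c * p_a, c * p_b

# Hypothetical design values: K_theta = 2.0 N*mm/rad, d = 1.0 mm.
# A bending angle of 0.5 rad gives p_a = d*theta = 0.5 and p_b = -0.5.
t_a, t_b = wire_forces_single_segment(0.5, -0.5, 2.0, 1.0)
print(t_a, t_b)  # 0.5 -0.5
```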
- the catheter 104 in FIG.12A includes a distal bending segment 156 and a proximal bending segment 152.
- the distal and proximal bending segments 156, 152 include the lengths s1 and s2, the linear rotational spring constants Kθ1, Kθ2, and driving wires 160a1, 160b1 and driving wires 160a2, 160b2, respectively.
- the offset magnitude for all the driving wires 160a1, 160b1, 160a2, 160b2 is d.
- embodiment(s) may define local coordinates for each segment in FIG. 12A. The bending angle may be defined according to these local coordinates.
- one or more embodiments may calculate the local wire displacements pa1L and pb1L for the distal bending segment, which may be the driving wire displacement attributable only to the distal bending segment, as follows: pa1L = pa1 − d·θ2 and pb1L = pb1 + d·θ2.
- By referring to the local coordinate of the distal segment attached on the proximal end of the distal segment, one or more embodiments may derive a mapping of pa1L and pb1L into Ta1 and Tb1 similarly to the single segment case, as follows: Ta1 = (Kθ1/(2d²))·pa1L and Tb1 = (Kθ1/(2d²))·pb1L.
- For the proximal bending segment 152, the bending moment equilibrium is in the below relationship with the forces Ta1, Tb1, Ta2, Tb2 on the driving wires 160a1, 160b1, 160a2, 160b2: Kθ2·θ2 = (Ta2 − Tb2)·d + (Ta1 − Tb1)·d.
- the driving wires 160a1 and 160b1 for the distal segment 156 may also input the bending moment to the proximal bending segment 152 here.
- the forces Ta2, Tb2 may be mapped into pa2 and pb2, as follows: Ta2 = (Kθ2/(2d²))·pa2 − (Ta1 − Tb1)/2 and Tb2 = (Kθ2/(2d²))·pb2 + (Ta1 − Tb1)/2.
- With one or more of the above equations, the mathematical model(s) may map pa1, pb1, pa2, pb2 to the forces Ta1, Tb1, Ta2, Tb2 on the driving wires 160a1, 160b1, 160a2, 160b2 even with multiple bending segments.
- one or more embodiments may map the wire forces into wire positions even with any bending segment numbers.
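A two-segment version of the model can be sketched as below. This is a reconstruction under the same assumptions as the single-segment sketch (p = ±d·θ per segment, distal wires routed through the proximal bend), with hypothetical design values; the real equations live in the patent figures:

```python
def wire_forces_two_segments(theta1, theta2, k1, k2, d):
    """Illustrative two-segment model: distal wires (a1/b1) run through
    the proximal bend, so their local displacement is the total minus the
    proximal contribution, and their forces also enter the proximal
    moment equilibrium."""
    # Total wire displacements as seen at the actuator.
    p_a1 = d * (theta1 + theta2)
    p_b1 = -d * (theta1 + theta2)
    p_a2 = d * theta2
    p_b2 = -d * theta2
    # Distal-segment forces from the local displacements.
    p_a1_local = p_a1 - d * theta2
    p_b1_local = p_b1 + d * theta2
    c1 = k1 / (2 * d * d)
    t_a1, t_b1 = c1 * p_a1_local, c1 * p_b1_local
    # Proximal equilibrium: K2*theta2 = (Ta2 - Tb2)*d + (Ta1 - Tb1)*d.
    c2 = k2 / (2 * d * d)
    t_a2 = c2 * p_a2 - (t_a1 - t_b1) / 2
    t_b2 = c2 * p_b2 + (t_a1 - t_b1) / 2
    return t_a1, t_b1, t_a2, t_b2

t_a1, t_b1, t_a2, t_b2 = wire_forces_two_segments(0.5, 0.25, 2.0, 2.0, 1.0)
# Numerically check the proximal moment balance (d = 1.0).
print(round((t_a2 - t_b2) * 1.0 + (t_a1 - t_b1) * 1.0, 6))  # 0.5 (= K2*theta2)
```

Stacking further segments repeats the same pattern: each segment's equilibrium gains one moment term per wire pair that passes through it.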
- the controller(s) or the processor(s) may operate to estimate the magnitude of the forces exerted in the catheter tip (e.g., the catheter tip 320) without the force sensor in the catheter tip (e.g., the catheter tip 320) since the difference stems from the external force from the anatomy (environment).
- the catheter 104 does not need the dedicated force sensor in the catheter tip 320. This allows miniaturizing an outer diameter of the catheter 104 and/or maximizing the catheter tool channel (e.g., the tool channel 168 or any other tool channel discussed herein). This also allows for fewer parts and a reduced cost for the catheter, which can be particularly important for single use catheters.
- One or more features of the present disclosure may be also useful to check the integrity of the dedicated force sensor in the catheter tip (e.g., the catheter tip 320) when the catheter 104 includes such a dedicated force sensor.
- the controller(s) 100, 102 (or any other processors (e.g., the processor 1200) discussed herein) operate to efficiently activate protective action to prevent catheter malfunction or potential patient injury.
- One or more features of the present disclosure provide and allow distinguishing the force exerted into the catheter tip from the driving force. Therefore, the controller(s) or processor(s) may operate to protect the catheter and/or the patient from potential malfunction/injury using the feature(s) of the present disclosure.
- Setting the threshold value for the absolute value of the force difference in one or more embodiments provides a simple way to realize the controller or processor judgement whether there is a large force exerted into or on the catheter tip.
- the controller(s) or processor(s) may operate to reduce the forces exerted in or on the catheter tip efficiently since the catheter can change its shape to fit the shape of the anatomy or can deform against the source of such an exerted force. Therefore, the controller(s) or processor(s) of the present disclosure may operate to prevent the catheter malfunction and/or the large force induced to the anatomy.
- the controller(s) or processor(s) of the present disclosure may operate to efficiently stop increasing the forces exerted in or on the catheter tip or at least reduce a risk to increase the force exerted in or on the catheter tip.
- the system(s) may operate to provide the operator(s) the time to initiate the next safety maneuver efficiently.
- the system(s) may operate to inform the operator(s) of a significant interaction between the catheter and the anatomy and prompt the operator(s) to take an operational approach different from the current approach, or may propose alternative(s) for the operator(s) to consider and use. Therefore, the system(s) of the present disclosure may operate to avoid a significant situation with catheter malfunction and/or potential exerted force to the anatomy and also provide the operator(s) an efficient operational approach to promote smooth catheter operation in the confined anatomy.
- the mathematical model, which may, in one or more embodiments, accept the position information of the driving wires as input and output the driving force on the driving wires, allows updating the driving forces by using a current system status in a case where the catheter is controlled with the driving wire position.
- the controller(s) or processor(s) of the present disclosure provide the ability to compute a force difference or force determination/estimation, and the controller(s) or processor(s) of the present disclosure operate to execute the computation of the force difference based on the current system status efficiently.
- Any reflowable polymer material, compatible with the catheter or continuum robot, may be used for one or more features of the present disclosure.
- the polymer material may be Pebax (or any other block copolymer variation of PEBA (polyether block amide)).
- One or more components of the catheter 104 may be made of a higher durometer material than one or more other components of the catheter 104 so that the higher durometer material may provide useful structure for the catheter during reflow processes.
- the method may further include disposing or incorporating a sensor (e.g., an electromagnetic (EM) sensor) or a camera within the geometry and/or structure of the distal tip and/or the atraumatic tip piece.
- the sensor or camera may be used for tracking, and the sensor or camera may be seated (or fully seated) within the distal tip or atraumatic tip piece and be protected by or within the distal tip piece.
- an apparatus for performing navigation control and/or for controlling, manufacturing, or using a catheter and/or catheter tip may include a flexible medical device or tool; and one or more processors that operate to: bend a distal portion of the flexible medical device or tool; and advance the flexible medical device or tool through a pathway, wherein the flexible medical device or tool may be advanced through the pathway in a substantially centered manner.
- the flexible medical device or tool may have multiple bending sections, and the one or more processors may further operate to control or command the multiple bending sections of the flexible medical device or tool using one or more of the following modes: a Follow the Leader (FTL) mode, a Reverse Follow the Leader (RFTL) mode, etc.
- the flexible medical device or tool may include a catheter or scope and the catheter or scope may be part of, include, or be attached to an imaging apparatus, such as, but not limited to, an endoscope, a catheter, a probe, a bronchoscope, or any other imaging device discussed herein or known to those skilled in the art.
- a method for controlling an apparatus including a flexible medical device or tool that operates to perform navigation control and/or for controlling, manufacturing, or using a catheter and/or catheter tip may include: bending a distal portion of the flexible medical device or tool; and advancing the flexible medical device or tool through a pathway, wherein the flexible medical device or tool may be advanced through the pathway in a substantially centered manner.
- the flexible medical device or tool may have multiple bending sections, and the method may further include controlling or commanding the multiple bending sections of the flexible medical device or tool using one or more of the following modes: a Follow the Leader (FTL) process or mode, a Reverse Follow the Leader (RFTL) process or mode, etc.
- a non-transitory computer-readable storage medium storing at least one program for causing a computer to execute a method for controlling an apparatus including a flexible medical device or tool that operates to perform navigation control and/or for controlling, manufacturing, or using a catheter and/or catheter tip, where the method may include: bending a distal portion of the flexible medical device or tool; and advancing the flexible medical device or tool through a pathway, wherein the flexible medical device or tool may be advanced through the pathway in a substantially centered manner.
- the method may include any other feature discussed herein.
- the scope may comprise, for example, an anoscope, an arthroscope, a bronchoscope, a colonoscope, a colposcope, a cystoscope, an esophagoscope, a gastroscope, a laparoscope, a laryngoscope, a neuroendoscope, a proctoscope, a sigmoidoscope, a thoracoscope, a ureteroscope, or another device.
- the scope preferably includes or comprises a bronchoscope.
- unit may generally refer to firmware, software, hardware, or other component, such as circuitry, a controller or processor, etc., or any combination thereof, that is used to effectuate a purpose.
- the modules may be hardware units (such as circuitry, firmware, a field programmable gate array, a digital signal processor, an application specific integrated circuit, any other hardware discussed herein or known to those skilled in the art, etc.) and/or software modules (such as a program, a computer readable program, instructions stored in a memory or storage medium, instructions downloaded from a remote memory or storage medium, other software discussed herein or known to those skilled in the art, etc.). Any units or modules for implementing one or more of the various steps discussed herein are not exhaustive or limited thereto. However, where there is a step of performing one or more processes, there may be a corresponding functional module or unit (implemented by hardware and/or software), or processor(s), controller(s), computer(s), etc.
- the medical apparatus or system 1000 of FIG. 1 may be configured as a continuum robot or steerable catheter with a multi-sectional catheter or probe configuration and follow the leader technology (or other control or movement, or tip, feature(s) and/or technique(s) discussed herein) to allow for precise catheter tip movement.
- a method comprising: advancing the medical tool or catheter through a pathway and/or controlling a medical tool or catheter as discussed herein.
- AI techniques may use a neural network, a random forest algorithm, a cognitive computing system, a rules-based engine, other AI network structure discussed herein or known to those skilled in the art, etc., and are trained based on a set of data to assess types of data and generate output.
- a training algorithm may be configured to implement a method comprising: advancing the medical tool or catheter through a pathway, wherein the medical tool or catheter is advanced through the pathway in a substantially centered manner.
- An AI-implemented algorithm may also be used to train a model to perform the manufacture and/or use of the continuum robot(s) and/or steerable catheter(s) having a tip piece and/or having structure (e.g., to avoid/reduce wire kink, to avoid/reduce drive wire friction, to detect or determine drive wire force(s), to determine or detect force difference(s), etc.) as discussed herein.
- Feature(s) discussed herein may be used for performing control, correction, adjustment, and/or smoothing (e.g., direct FTL smoothing, path smoothing, continuum robot smoothing, etc.).
- FIG.13 is a flowchart showing steps of at least one procedure for performing correction, adjustment, and/or smoothing of a continuum robot/catheter device (e.g., such as continuum robot/catheter device 104).
- One or more of the processors discussed herein may execute the steps shown in FIG.13, and these steps may be performed by executing a software program read from a storage medium, including, but not limited to, the ROM 1202 or HDD 150, by CPU 1201 or by any other processor discussed herein.
- One or more methods of performing correction, adjustment, and/or smoothing for a catheter or probe of a continuum robot device or system may include one or more of the following steps: (i) in step S1300, instructing a distal bending section or portion of a catheter or a probe of a continuum robot such that the distal bending section or portion achieves, or is disposed at, a bending pose or position; (ii) in step S1301, storing or obtaining the bending pose or position of the distal bending section or portion and storing or obtaining a position of a motorized linear stage that operates to move the catheter or probe of the continuum robot in a case where a forward motion, or a motion in a set or predetermined direction or directions, of the motorized linear stage is instructed or commanded; (iii) in step S1302, generating a goal or target bending pose or position (or other state) for each corresponding section or portion of the catheter or probe from, or based on, the
- a user may provide an operation input through an input element or device, and the continuum robot apparatus or system 104/1000 may receive information of the input element and one or more input/output devices, which may include, but are not limited to, a receiver, a transmitter, a speaker, a display, an imaging sensor, a user input device, which may include a keyboard, a keypad, a mouse, a position tracked stylus, a position tracked probe, a foot switch, a microphone, etc.
- a guide device, component, or unit may include one or more buttons, knobs, switches, etc., that a user may use to adjust parameter(s) of the continuum robot 104/1000, such as speed (e.g., rotational speed, translational speed, etc.), angle, or plane, etc.
- the continuum robot 104/1000 may be interconnected with medical instruments or a variety of other devices, and may be controlled independently, externally, or remotely via a communication interface, such as, but not limited to, the communication interface 1205.
- the communication interface 1205 may be configured as a circuit or other device for communicating with components included in the apparatus or system 1000, and with various external apparatuses connected to the apparatus via a network.
- the communication interface 1205 may store information to be output in a transfer packet and may output the transfer packet to an external apparatus via the network by communication technology such as Transmission Control Protocol/Internet Protocol (TCP/IP).
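Storing values in a transfer packet before sending them over TCP/IP can be sketched with a fixed binary layout. The field layout below (a 2-byte message id, a wire count, then one 32-bit float per wire force) is entirely hypothetical; any real system would define its own packet format:

```python
import struct

def pack_forces(msg_id, forces):
    """Pack a message id and per-wire forces into a network-order packet."""
    return struct.pack(f"!HH{len(forces)}f", msg_id, len(forces), *forces)

def unpack_forces(packet):
    """Recover the message id and force list from a packed transfer packet."""
    msg_id, n = struct.unpack_from("!HH", packet)
    forces = struct.unpack_from(f"!{n}f", packet, 4)
    return msg_id, list(forces)

pkt = pack_forces(7, [0.25, -0.5])
print(unpack_forces(pkt))  # (7, [0.25, -0.5])
```

The packet bytes would then be written to a TCP socket; the pack/unpack pair is what "store information to be output in a transfer packet" amounts to at the byte level.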
- the apparatus may include a plurality of communication circuits according to a desired communication form.
- the CPU 1201, the communication interface 1205, and other components of the computer 1200 may interface with other elements including, for example, one or more of an external storage, a display, a keyboard, a mouse, a sensor, a microphone, a speaker, a projector, a scanner, an illumination device, etc.
- One or more control, adjustment, correction, and/or smoothing features of the present disclosure may be used with one or more image correction or adjustment features in one or more embodiments.
- One or more adjustments, corrections, or smoothing functions for a catheter or probe device and/or a continuum robot may adjust a path of one or more sections or portions of the catheter or probe device and/or the continuum robot (e.g., the continuum robot 104, the continuum robot 1000, etc.), and one or more embodiments may make a corresponding adjustment or correction to an image view.
- the medical tool may be an endoscope, a bronchoscope, any other medical tool discussed herein, any other medical tool known to those skilled in the art, etc.
- a computer such as the console or computer 1200, may perform any of the steps, processes, and/or techniques discussed herein, any of the embodiments shown in FIGS.1-19, any other apparatus or system discussed herein, etc.
- There are many ways, digital as well as analog, to control a continuum robot, correct or adjust an image or a path (or one or more sections or portions) of a continuum robot (or other probe or catheter device or system), manufacture or use a catheter having a tip and/or support structure, perform any other measurement or process discussed herein, perform continuum robot method(s) or algorithm(s), and/or control at least one continuum robot device/apparatus, system, and/or storage medium.
- a computer such as the console or computer 1200, may be dedicated to control and/or use continuum robot devices, systems, methods, and/or storage mediums for use therewith described herein.
- the one or more detectors, sensors, cameras, or other components of the apparatus or system embodiments may transmit the digital or analog signals to a processor or a computer such as, but not limited to, an image processor or display controller 100, a controller 102, a CPU 1201, a processor or computer 1200 (see e.g., at least FIGS. 1-5, 7A-12B, and 14-19), a combination thereof, etc.
- the image processor may be a dedicated image processor or a general purpose processor that is configured to process images.
- the computer 1200 may be used in place of, or in addition to, the image processor or display controller 100 and/or the controller 102 (or any other processor or controller discussed herein, such as, but not limited to, the computer 1200, etc.).
- the image processor may include an ADC and receive analog signals from the one or more detectors or sensors of the system 1000 (or any other system discussed herein).
- the image processor may include one or more of a CPU, DSP, FPGA, ASIC, or some other processing circuitry.
- the image processor may include memory for storing images, data, and instructions.
- the image processor may generate one or more images based on the information provided by the one or more detectors, sensors, or cameras.
- a computer or processor discussed herein such as, but not limited to, a processor of the devices, apparatuses or systems of FIGS.1-5, 7A-12B, and 14-19, the computer 1200, the image processor, etc. may also include any component(s) further discussed herein below.
- Electrical analog signals obtained from the output of the system 1000 or the components thereof, and/or from the devices, apparatuses, or systems of FIGS. 1-5, 7A-12B, and 14-19, may be converted to digital signals to be analyzed with a computer, such as, but not limited to, the computers or controllers 100, 102 of FIG.1, the computer 1200, etc.
- the electric signals used for imaging may be sent to processor(s), such as, but not limited to, the processors or controllers 100, 102 of FIGS. 1-5, a computer 1200 (see e.g., FIG. 1, FIGS. 14-15, etc.), as discussed further below, via cable(s) or wire(s), such as, but not limited to, the cable(s) or wire(s) 113 (see FIG. 14).
- Various components of a computer system 1200 (see e.g., the console or computer 1200 as may be used as one embodiment example of the computer, processor, or controllers 100, 102 shown in FIG. 1) are provided in FIG. 14.
- a computer system 1200 may include a central processing unit (“CPU”) 1201, a ROM 1202, a RAM 1203, a communication interface 1205 (also referred to as an Input/Output or I/O interface), a hard disk (and/or other storage device, such as, but not limited to, an SSD) 1204, a screen (or monitor interface) 1209, a keyboard (or input interface; may also include a mouse or other input device in addition to the keyboard) 1210 and a BUS (or “Bus”) or other connection lines (e.g., connection line 1213) between one or more of the aforementioned components (e.g., as shown in FIG. 14).
- the computer system 1200 may comprise one or more of the aforementioned components.
- a computer system 1200 may include a CPU 1201, a RAM 1203, an input/output (I/O) interface (such as the communication interface 1205) and a bus (which may include one or more lines 1213 as a communication system between components of the computer system 1200; in one or more embodiments, the computer system 1200 and at least the CPU 1201 thereof may communicate with the aforementioned component(s) of a continuum robot device or system using same, such as, but not limited to, the system 1000, the devices/systems of FIGS.1-5, 7A-12B, and 14-19, otherwise discussed herein, etc., via one or more lines 1213), and one or more other computer systems 1200 may include one or more combinations of the other aforementioned components (e.g., the one or more lines 1213 of the computer 1200 may connect to other components via line 113).
- the CPU 1201 is configured to read and perform computer-executable instructions stored in a storage medium.
- the computer-executable instructions may include those for the performance of the methods and/or calculations described herein.
- the computer system 1200 may include one or more additional processors in addition to CPU 1201, and such processors, including the CPU 1201, may be used for controlling and/or manufacturing a device, system, or storage medium for use with same or for use with any continuum robot technique(s), any force determination (e.g., drive wire force(s)) and/or force difference determination technique(s) and/or feature(s), or any other technique(s)/feature(s) discussed herein.
- the system 1200 may further include one or more processors connected via a network connection (e.g., via network 1206).
- the CPU 1201 and any additional processor being used by the system 1200 may be located in the same telecom network or in different telecom networks (e.g., performing, manufacturing, controlling, calculation, and/or using technique(s) may be controlled remotely).
- the I/O or communication interface 1205 provides communication interfaces to input and output devices, which may include the aforementioned component(s) of the system(s) discussed herein (e.g., the controller 100, the controller 102, the displays 101-1, 101-2, the actuator 103, the continuum device or catheter 104, the operating portion or controller 105, the tracking sensor 106, the position detector 107, the rail 110, etc.), a microphone, a communication cable and a network (either wired or wireless), a keyboard 1210, a mouse, a touch screen or screen 1209, a light pen, etc.
- the communication interface of the computer 1200 may connect to other components discussed herein via line 113 (as shown in FIG. 14).
- the Monitor interface or screen 1209 provides communication interfaces thereto.
- Any methods and/or data of the present disclosure such as, but not limited to, the methods for using and/or controlling a continuum robot or catheter device, system, or storage medium for use with same and/or method(s) for imaging, performing tissue or sample characterization or analysis, performing diagnosis, planning and/or examination, for performing control or adjustment techniques (e.g., to a path of, to a pose or position of, or to one or more sections or portions of, a continuum robot, a catheter or a probe), for using or manufacturing catheter, catheter tip, and/or force determination (e.g., drive wire force(s)) and/or force difference determination feature(s) and/or technique(s), and/or for performing image correction or adjustment or other technique(s), as discussed herein, may be stored on a computer-readable storage medium.
- a computer-readable and/or writable storage medium used commonly, such as, but not limited to, one or more of a hard disk (e.g., the hard disk 1204, a magnetic disk, etc.), a flash memory, an optical disc (e.g., a compact disc (“CD”), a digital versatile disc (“DVD”), a Blu-ray™ disc, etc.), a magneto-optical disk, a random-access memory (“RAM”) (such as the RAM 1203), a DRAM, a read only memory (“ROM”), a storage of distributed computing systems, a memory card, or the like (e.g., other semiconductor memory, such as, but not limited to, a non-volatile memory card, a solid state drive (SSD) (see storage 1204, which may be an SSD instead of a hard disk in one or more embodiments; see also storage 150 in FIG. 4), SRAM, etc.), an optional combination thereof, a server/database, etc.
- the computer-readable storage medium may be a non-transitory computer-readable medium, and/or the computer-readable medium may comprise all computer-readable media, with the sole exception being a transitory, propagating signal in one or more embodiments.
- the computer-readable storage medium may include media that store information for predetermined, limited, or short period(s) of time and/or only in the presence of power, such as, but not limited to Random Access Memory (RAM), register memory, processor cache(s), etc.
- Embodiment(s) may also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a “non-transitory computer- readable storage medium”) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions discussed herein and/or controlling the one or more circuits to perform the functions of the present disclosure.
- the methods, devices, systems, and computer-readable storage mediums related to the processors may be achieved utilizing suitable hardware, such as that illustrated in the figures.
- Functionality of one or more aspects of the present disclosure may be achieved utilizing suitable hardware, such as that illustrated in FIG.14.
- Such hardware may be implemented utilizing any of the known technologies, such as standard digital circuitry, any of the known processors that are operable to execute software and/or firmware programs, one or more programmable digital devices or systems, such as programmable read only memories (PROMs), programmable array logic devices (PALs), etc.
- the CPU 1200, 1201 (as shown in FIG. 14, and/or which may be included in the computer, processor, controller and/or CPU 100, 102, 1201, etc. of FIGS.1-5, 7A-12B, and 14-19), etc. may also include and/or be made of one or more microprocessors, nanoprocessors, one or more graphics processing units (“GPUs”; also called a visual processing unit (“VPU”)), one or more Field Programmable Gate Arrays (“FPGAs”), or other types of processing components (e.g., application specific integrated circuit(s) (ASIC)).
- the various aspects of the present disclosure may be implemented by way of software and/or firmware program(s) that may be stored on suitable storage medium (e.g., computer-readable storage medium, hard drive, etc.) or media (such as floppy disk(s), memory chip(s), etc.) for transportability and/or distribution.
- the computer may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
- the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
- a computer or processor may include an image/display processor or communicate with an image/display processor.
- the computer 1200 includes a central processing unit (CPU) 1201, and may also include a graphical processing unit (GPU) 1215.
- the CPU 1201 or the GPU 1215 may be replaced by the field-programmable gate array (FPGA), the application-specific integrated circuit (ASIC) or other processing unit depending on the design of a computer, such as the computer 1200, controller or processor 100, controller or processor 102, any other computer, CPU, or processor discussed herein, etc.
- At least one computer program is stored in the HDD/SSD 1204, the data storage 150, or any other storage device or drive discussed herein, and the CPU 1201 loads the at least one program onto the RAM 1203, and executes the instructions in the at least one program to perform one or more processes described herein, as well as the basic input, output, calculation, memory writing, and memory reading processes.
- the computer, such as the computer 1200, the computer, processors, and/or controllers of FIGS. 1-5, 7A-12B, and 14-19, etc., may communicate with the component(s) of the apparatuses/systems of FIGS. 1-5, 7A-12B, and 14-19.
- the monitor or display 1209 displays the reconstructed image, and the monitor or display 1209 may display other information about the imaging condition or about an object to be imaged.
- the monitor 1209 also provides a graphical user interface for a user to operate a system, for example when performing CT, MRI, or other imaging modalities or other imaging technique(s), including, but not limited to, controlling continuum robot devices/systems, and/or manufacturing or using catheter tip feature(s) and/or technique(s).
- An operation signal is input from the operation unit (e.g., such as, but not limited to, a mouse device 1211, a keyboard 1210, a touch panel device, etc.) into the communication interface 1205 in the computer 1200, and corresponding to the operation signal the computer 1200 instructs the system (e.g., the system 1000, the systems/apparatuses of FIGS. 1-5, 7A-12B, and 14-19, any other system/apparatus discussed herein, etc.) to start or end the imaging, and/or to start or end continuum robot control(s) and/or performance of correction, adjustment, and/or catheter, catheter tip, and/or support structure feature(s) and/or technique(s).
- the camera or imaging device as aforementioned may have interfaces to communicate with the computer 1200 to send and receive the status information and the control signals.
- one or more processors or computers 1200, 1200’ may be part of a system in which the one or more processors or computers 1200, 1200’ (or any other processor discussed herein) communicate with other devices (e.g., a database 1603, a memory 1602 (which may be used with or replaced by any other type of memory discussed herein or known to those skilled in the art), an input device 1600, an output device 1601, etc.).
- one or more models may have been trained previously and stored in one or more locations, such as, but not limited to, the memory 1602, the database 1603, etc.
- one or more models and/or data discussed herein may be input or loaded via a device, such as the input device 1600.
- a user may employ an input device 1600 (which may be a separate computer or processor, a keyboard such as the keyboard 1210, a mouse such as the mouse 1211, a microphone, a screen or display 1209 (e.g., a touch screen or display), or any other input device known to those skilled in the art).
- an input device 1600 may not be used (e.g., where user interaction is eliminated by one or more artificial intelligence features discussed herein).
- the output device 1601 may receive one or more outputs discussed herein to perform coregistration, autonomous navigation, movement detection, control, and/or any other process discussed herein.
- the database 1603 and/or the memory 1602 may have outputted information (e.g., trained model(s), detected marker information, image data, test data, validation data, training data, coregistration result(s), segmentation model information, object detection/regression model information, combination model information, etc.) stored therein. That said, one or more embodiments may include several types of data stores, memory, storage media, etc.
- the term “subset” of a corresponding set does not necessarily represent a proper subset and may be equal to the corresponding set.
- any other model architecture, machine learning algorithm, or optimization approach may be employed.
- One or more embodiments may utilize hyper-parameter combination(s).
- One or more embodiments may employ data capture, selection, and annotation, as well as model evaluation (e.g., computation of loss and validation metrics), since data may be domain and application specific.
- the model architecture may be modified and optimized to address a variety of computer visions issues (discussed below).
- Embodiment(s) may automatically detect (predict a spatial location of) a catheter tip (e.g., in or near an airway, pathway, a lung or other organ, in a patient, etc.) in a time series of X-ray images to co-register the X-ray images with the corresponding OCT images (at least one example of a reference point of two different coordinate systems).
- Embodiment(s) may use deep (recurrent) convolutional neural network(s), which may improve catheter tip detection, tissue detection, tissue characterization, robotic control, force and/or force difference determination(s), and image co-registration significantly.
- Embodiment(s) may employ segmentation and/or object/keypoint detection architectures to solve one or more computer vision issues in other domain areas in one or more applications.
- Embodiment(s) employ several novel materials and methods to solve one or more computer vision or other issues (e.g., lesion detection in time series of X-ray images, for instance; tissue detection; tissue characterization; robotic control; force determination(s) (e.g., drive wire force(s)) and/or force difference determination(s); etc.).
- Embodiment(s) employ data capture and selection. In embodiment(s), the data is what makes such an application unique and distinguishes this application from other applications.
- images may include a radiodense marker, a sensor (e.g., an EM sensor), or some other identifier that is specifically used in one or more procedures (e.g., used in catheters/probes with a similar marker, sensor, or identifier to that of an OCT marker, used in catheters/probes with a similar or same marker, sensor, or identifier even compared to another imaging modality, etc.) to facilitate computational detection of a marker, sensor, catheter, and/or tissue detection, characterization, validation, etc. in one or more images (e.g., X-ray images or other types of images).
- Embodiment(s) may couple a software device or features (model) to hardware (e.g., a robotic catheter or probe, a steerable probe/catheter using one or more sensors (or other identifier or tracking components), etc.).
- Embodiment(s) may utilize animal data in addition to patient data. Training deep learning may use a large amount of data, which may be difficult to obtain from clinical studies. Inclusion of image data from pre-clinical studies in animals into a training set may improve model performance.
- Training and evaluation of a model may be highly data dependent (e.g., a way in which frames are selected (e.g., during steerable catheter control, frames obtained via a robotic catheter, etc.), split into training/validation/test sets, and grouped into batches as well as the order in which the frames, sets, and/or batches are presented to the model, any other data discussed herein, etc.).
- such parameters may be more important or significant than some of the model hyper-parameters (e.g., batch size, number of convolution layers, any other hyper-parameter discussed herein, etc.).
- Embodiment(s) may use a collection or collections of user annotations after introduction of a device/apparatus, system, and/or method(s) into a market, and may use post-market surveillance, retraining of a model or models with new data collected (e.g., in clinical use), and/or a continuously adaptive algorithm/method(s).
- Embodiment(s) may employ data annotation.
- embodiment(s) may label pixel(s) representing a marker, a sensor, a camera, or an identifier detection or a tissue and/or catheter detection, characterization, and/or validation as well as pixels representing a blood vessel(s) or portions of a pathway (e.g., a vessel, a lumen, an airway, a bronchial pathway, another type of pathway, etc.) at different phase(s) of a procedure/method (e.g., different levels of contrast due to intravascular contrast agent) of acquired frame(s).
- One or more embodiments may employ incorporation of prior knowledge.
- a marker, sensor, or other portion of a robotic catheter/probe location may be known inside a vessel, pathway, airway, or bronchial pathway (or other type of pathway) and/or inside a catheter or probe; a tissue and/or catheter or catheter tip location may be known inside a vessel, an airway, a lung, a bronchial pathway, or other type of target, object, or specimen; etc.
- simultaneous localization of the pathway, airway, bronchial pathway, lung, etc. and sensor(s)/marker(s)/identifier(s) may be used to improve sensor/marker detection and/or tissue and/or catheter detection, localization, characterization, and/or validation.
- a sensor or other portion of a catheter/probe may move inside a target, object, or specimen (e.g., a pathway, an airway, a bronchial pathway, a lung or another organ, in a patient, in tissue, etc.), and such prior knowledge may be incorporated into the machine learning algorithm or the loss function.
- One or more embodiments may employ loss (cost) and evaluation function(s)/metric(s). For example, temporal information may be used for model training and evaluation in one or more embodiments. One or more embodiments may evaluate a distance between prediction and ground truth per frame as well as consider a trajectory of predictions across multiple frames of a time series.
- Application of machine learning may be used in one or more embodiment(s), as discussed in PCT/US2020/051615, filed on September 18, 2020 and published as WO 2021/055837 A9 on March 25, 2021, and as discussed in U.S. Pat. App. No. 17/761,561, filed on March 17, 2022.
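As a non-limiting illustration only (not part of the disclosed embodiments), the per-frame distance plus trajectory evaluation described above may be sketched as follows in Python; the function name, the weighting scheme, and all values are assumptions for illustration:

```python
import math

def tracking_loss(predictions, ground_truth, smoothness_weight=0.1):
    """Hypothetical loss: mean per-frame Euclidean distance between
    predicted and ground-truth (x, y) coordinates, plus a penalty on
    frame-to-frame jumps in the predicted trajectory."""
    # Per-frame distance between prediction and ground truth.
    distance = sum(
        math.dist(p, g) for p, g in zip(predictions, ground_truth)
    ) / len(predictions)
    # Trajectory term: discourage implausibly large jumps between frames.
    jumps = [
        math.dist(predictions[i], predictions[i - 1])
        for i in range(1, len(predictions))
    ]
    smoothness = sum(jumps) / len(jumps) if jumps else 0.0
    return distance + smoothness_weight * smoothness
```

A model whose predictions both track the ground truth closely and move smoothly across frames of the time series minimizes both terms.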
- One or more models may be used in one or more embodiment(s) to detect and/or characterize a tissue or tissues and/or lesion(s), such as, but not limited to, the one or more models as discussed in PCT/US2020/051615, filed on September 18, 2020 and published as WO 2021/055837 A9 on March 25, 2021, and as discussed in U.S. Pat. App. No. 17/761,561, filed on March 17, 2022, the applications and publications of which are incorporated by reference herein in their entireties.
- one or more embodiments may use a segmentation model, a regression model, a combination thereof, etc.
- the input may be the entire image frame or frames
- the output may be the centroid coordinates of sensors/markers (target sensor and stationary sensor or marker, if necessary/desired) and/or coordinates of a portion of a catheter or probe to be used in determining the localization and lesion and/or tissue detection and/or characterization.
- In FIGS. 16-18, an example of an input image (on the left side of FIGS. 16-18) and a corresponding output image (on the right side of FIGS. 16-18) are illustrated for regression model(s).
- At least one architecture of a regression model is shown in FIG. 16.
- In at least the embodiment of FIG. 16, the regression model may use a combination of one or more convolution layers 900, one or more max-pooling layers 901, and one or more fully connected dense layers 902. The model is not limited to the kernel size, width/number of filters (output size), and stride sizes shown for each layer (e.g., in the left convolution layer of FIG. 16, the kernel size is “3x3”, the width/# of filters (output size) is “64”, and the stride size is “2”). In one or more embodiments, another hyper-parameter search with a fixed optimizer and with a different width may be performed, and at least one embodiment example of a model architecture for a convolutional neural network for this scenario is shown in FIG. 17.
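As a non-limiting illustration, the effect of the kernel and stride sizes on the spatial feature-map size may be sketched as follows; the layer stack, depths, and the 512 x 512 input size are assumptions for illustration and do not reproduce the exact architecture of FIG. 16:

```python
def conv_output_size(size, kernel, stride, padding=0):
    """Spatial output size of a convolution or pooling layer:
    floor((size + 2*padding - kernel) / stride) + 1."""
    return (size + 2 * padding - kernel) // stride + 1

# Hypothetical layer stack loosely following the description
# (3x3 kernels, stride 2, interleaved max-pooling).
layers = [
    ("conv 3x3/64, stride 2", 3, 2),
    ("maxpool 2x2, stride 2", 2, 2),
    ("conv 3x3/64, stride 2", 3, 2),
    ("maxpool 2x2, stride 2", 2, 2),
]

size = 512  # assumed input of 512 x 512 pixels
for name, kernel, stride in layers:
    size = conv_output_size(size, kernel, stride)
    print(f"{name}: {size} x {size}")
# The flattened final feature map would feed the fully connected
# dense layers that regress the output coordinates.
```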
- One or more embodiments may use one or more features for a regression model as discussed in “Deep Residual Learning for Image Recognition” by Kaiming He, et al., Microsoft Research, December 10, 2015 (https://arxiv.org/pdf/1512.03385.pdf), which is incorporated by reference herein in its entirety.
- FIG. 18 shows at least a further embodiment example of a created architecture of or for a regression model(s).
- Since the output from a segmentation model is a “probability” of each pixel that may be categorized as a tissue or lesion characterization or a tissue or lesion identification/determination, post-processing after prediction via the trained segmentation model may be developed to better define, determine, or locate the final coordinate of a tissue location, a catheter tip location, a force or force difference determination (e.g., drive wire force(s)) or catheter location, and/or a lesion location (or a sensor/marker location where the sensor/marker is a part of the catheter) and/or to determine the type and/or characteristics of the tissue(s) and/or lesion(s).
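As a non-limiting illustration, one possible post-processing step of the kind described above (reducing a per-pixel probability map to a single coordinate) may be sketched as follows; the function name, the threshold, and the weighted-centroid rule are assumptions for illustration:

```python
def probability_centroid(prob_map, threshold=0.5):
    """Hypothetical post-processing: reduce a per-pixel probability
    map (a list of rows) to a single (row, col) coordinate by taking
    the probability-weighted centroid of all pixels at or above a
    threshold."""
    total = weighted_row = weighted_col = 0.0
    for r, row in enumerate(prob_map):
        for c, p in enumerate(row):
            if p >= threshold:
                total += p
                weighted_row += p * r
                weighted_col += p * c
    if total == 0.0:
        return None  # nothing detected in this frame
    return (weighted_row / total, weighted_col / total)
```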
- One or more embodiments of a semantic segmentation model may be performed using the One-Hundred Layers Tiramisu method discussed in “The One Hundred Layers Tiramisu: Fully Convolutional DenseNets for Semantic Segmentation” by Simon Jégou, et al., Montreal Institute for Learning Algorithms, published October 31, 2017 (https://arxiv.org/pdf/1611.09326.pdf), which is incorporated by reference herein in its entirety.
- a segmentation model may be used in one or more embodiments, for example, as shown in FIG. 19. At least one embodiment may utilize an input 600 as shown to obtain an output 606 of at least one embodiment of a segmentation model method.
- a slicing size may be one or more of the following: 100 x 100, 224 x 224, or 512 x 512; in one or more of the experiments performed, a slicing size of 224 x 224 performed the best.
- a batch size (of images in a batch) may be one or more of the following: 2, 4, 8, 16, and, from the one or more experiments performed, a bigger batch size typically performs better (e.g., with greater accuracy).
- For example, 16 images/batch may be used. The optimization of all of these hyper-parameters depends on the size of the available data set as well as the available computer/computing resources; thus, once more data is available, different hyper-parameter values may be chosen. Additionally, in one or more embodiments, steps/epoch may be 100, and the number of epochs may be greater than (>) 1000. In one or more embodiments, a convolutional autoencoder (CAE) may be used.
- hyper-parameters may include, but are not limited to, one or more of the following: Depth (i.e., # of layers); Width (i.e., # of filters); Batch size (i.e., # of training images/step), which may be >4 in one or more embodiments; Learning rate (i.e., a hyper-parameter that controls how fast the weights of a neural network (the coefficients of a regression model) are adjusted with respect to the loss gradient); Dropout (i.e., % of neurons (filters) that are dropped at each layer); and/or Optimizer (for example, the Adam optimizer or the Stochastic gradient descent (SGD) optimizer).
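As a non-limiting illustration, a hyper-parameter search over tunable values like those listed above may be sketched as an enumeration of candidate configurations; the specific grid values are assumptions for illustration, and training/evaluation of each configuration is out of scope here:

```python
import itertools

# Hypothetical hyper-parameter grid mirroring the tunable values
# described above (depth, width, batch size, learning rate,
# dropout, optimizer).
grid = {
    "depth": [5, 7],
    "width": [32, 64],
    "batch_size": [8, 16],
    "learning_rate": [1e-3, 1e-4],
    "dropout": [0.0, 0.2],
    "optimizer": ["adam", "sgd"],
}

# Every combination of the grid values is one candidate configuration;
# each would be trained and scored on the validation metrics.
configs = [
    dict(zip(grid, values))
    for values in itertools.product(*grid.values())
]
print(len(configs))  # 64 candidate configurations (2^6)
```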
- other hyper-parameters may be fixed or constant values, such as, but not limited to, for example, one or more of the following: Input size (e.g., 1024 pixels x 1024 pixels, 512 pixels x 512 pixels, another preset or predetermined number or value set, etc.); Epochs (e.g., 100, 200, 300, 400, 500, another preset or predetermined number, etc.; for additional training, iterations may be set as 3000 or higher); and/or Number of models trained with different hyper-parameter configurations (e.g., 10, 20, another preset or predetermined number, etc.).
- One or more features discussed herein may be determined using a convolutional auto-encoder, Gaussian filters, Haralick features, and/or thickness or shape of the sample or object (e.g., the tissue or tissues, the lesion or lesions, the catheter or catheter tip, the force(s) or force difference(s) acting on a structure of a catheter (e.g., on a catheter tip, on one or more drive wires, etc.), a lung or other organ, an airway, a bronchial pathway, another type of pathway, a specimen, a patient, a target in the patient, etc.).
- One or more embodiments of the present disclosure may use machine learning to determine sensor, tissue, catheter or catheter tip, force(s) and/or force difference(s) (e.g., force(s) on a catheter tip, drive wire force(s), etc.), and/or lesion location; to determine, detect, or evaluate tissue and/or lesion type(s) and/or characteristic(s); and/or to perform any other feature discussed herein.
- Machine learning is a field of computer science that gives processors the ability to learn via artificial intelligence. Machine learning may involve one or more algorithms that allow processors or computers to learn from examples and to make predictions for new unseen data points.
- such one or more algorithms may be stored as software or one or more programs in at least one memory or storage medium, and the software or one or more programs allow a processor or computer to carry out operation(s) of the processes described in the present disclosure.
- machine learning may be used to train one or more models to efficiently control a catheter or catheter tip, to manufacture or use the catheter or catheter tip, to detect or determine one or more force(s) or force difference(s) (e.g., catheter tip force(s), drive wire force(s), any other force(s) discussed herein, etc.), to perform imaging, etc.
- the present disclosure and/or one or more components of devices, systems, and storage mediums, and/or methods, thereof also may be used in conjunction with continuum robot devices, systems, methods, and/or storage mediums and/or with endoscope devices, systems, methods, and/or storage mediums.
- Such continuum robot devices, systems, methods, and/or storage mediums are disclosed in at least: U.S. Provisional Pat. App. No. 63/150,859, filed on February 18, 2021, U.S. Pat. App. No. 17/673,606, filed February 16, 2022, and U.S. Pat. No.11,882,365, issued on January 23, 2024, the disclosures of which are incorporated by reference herein in their entireties.
- Such endoscope devices, systems, methods, and/or storage mediums are disclosed in at least: U.S. Pat. App. No.17/565,319, filed on December 29, 2021, the disclosure of which is incorporated by reference herein in its entirety; U.S. Pat. App. No.63/132,320, filed on December 30, 2020, the disclosure of which is incorporated by reference herein in its entirety; U.S. Pat. App. No. 17/564,534, filed on December 29, 2021, the disclosure of which is incorporated by reference herein in its entirety; and U.S. Pat. App. No. 63/131,485, filed December 29, 2020, the disclosure of which is incorporated by reference herein in its entirety.
- any of the features of the present disclosure may be used in combination with any of the features as discussed in U.S. Prov. Pat. App. No. 63/378,017, filed September 30, 2022, the disclosure of which is incorporated by reference herein in its entirety; U.S. Pat. App. No. 18/477,081, filed September 28, 2023, the disclosure of which is incorporated by reference herein in its entirety; and/or any of the features as discussed in U.S. Prov. Pat. App. No. 63/377,983, filed September 30, 2022, the disclosure of which is incorporated by reference herein in its entirety.
- any of the features of the present disclosure may be used in combination with any of the features as discussed in U.S. Pat. Pub. No. 2023/0131269, published on April 26, 2023, the disclosure of which is incorporated by reference herein in its entirety. Any of the features of the present disclosure may be used in combination with any of the features as discussed in U.S. Pat. App. No. 63/587,637, filed on October 3, 2023, the disclosure of which is incorporated by reference herein in its entirety; as discussed in U.S. Pat. Pub. No. 2023/0131269, published on April 26, 2023, the disclosure of which is incorporated by reference herein in its entirety; and as discussed in PCT Pat. App. Nos.
- the present disclosure and/or one or more components of devices, systems, and storage mediums, and/or methods, thereof also may be used in conjunction with autonomous robot devices, systems, methods, and/or storage mediums and/or with endoscope devices, systems, methods, and/or storage mediums.
- Such continuum robot devices, systems, methods, and/or storage mediums are disclosed in at least: U.S. Prov. Pat. App. No. 63/497,358, filed on April 20, 2023, which is incorporated by reference herein in its entirety.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Robotics (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Mechanical Engineering (AREA)
- Medical Informatics (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Human Computer Interaction (AREA)
- Endoscopes (AREA)
Abstract
One or more devices, systems, methods, and storage mediums for performing robotic control and/or using one or more force and/or force-difference determinations to estimate/determine an exerted force using wire force information are provided herein. Examples include, but are not limited to, use with a catheter or continuum robot having at least one bendable section that may be independently manipulated to advance through a passage without contacting one or more fragile elements within the passage, where one or more drive wires of the catheter have one or more forces and/or one or more force differences to be determined/estimated. Example applications include imaging, evaluating, and diagnosing biological objects, such as, but not limited to, bronchial or other medical applications. Techniques provided herein also improve processing, imaging, and catheter control efficiency while achieving images that are more accurate, and also achieve devices, systems, methods, and storage mediums that reduce mental and physical burden and improve ease of use.
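The abstract's core idea — attributing part of a measured drive-wire force difference to an external load on the bendable section — can be illustrated with a minimal sketch. The class below, its names, the lumped torque-balance model, and all parameter values are hypothetical assumptions for illustration only; they are not the model or method claimed in this application.

```python
# Hypothetical sketch: estimating an external transverse force on a
# steerable catheter's bendable section from the difference between the
# tensions of an antagonistic pair of drive wires.
# The linear torque-balance model and all names/values are illustrative
# assumptions, not the claimed method.

from dataclasses import dataclass


@dataclass
class BendableSection:
    wire_radius_m: float         # radial offset of each drive wire from the axis
    length_m: float              # length of the bendable section
    stiffness_n_per_rad: float   # assumed lumped bending stiffness

    def expected_tension_diff(self, bend_angle_rad: float) -> float:
        """Wire-tension difference needed to hold the commanded bend alone."""
        # Torque balance (illustrative): (T1 - T2) * r = k * theta
        return self.stiffness_n_per_rad * bend_angle_rad / self.wire_radius_m

    def estimate_external_force(self, measured_diff_n: float,
                                bend_angle_rad: float) -> float:
        """Attribute any excess tension difference to an external tip load."""
        excess = measured_diff_n - self.expected_tension_diff(bend_angle_rad)
        # Excess wire torque divided by section length ~ transverse tip force
        return excess * self.wire_radius_m / self.length_m


# Example: with no excess tension difference the estimated external force
# is zero; any surplus maps linearly to an estimated load.
section = BendableSection(wire_radius_m=0.002, length_m=0.05,
                          stiffness_n_per_rad=0.01)
free_bend = section.estimate_external_force(2.5, 0.5)    # ~0 N (free space)
loaded = section.estimate_external_force(3.5, 0.5)       # small positive load
```

A real system would replace the lumped stiffness with a calibrated kinetostatic model and account for friction along the wire path, but the sketch shows why comparing measured wire forces against a model's expected forces can reveal contact without a dedicated tip sensor.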
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363603400P | 2023-11-28 | 2023-11-28 | |
| US63/603,400 | 2023-11-28 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025117336A1 (fr) | 2025-06-05 |
Family
ID=95897804
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/056943 Pending WO2025117336A1 (fr) | Steerable catheters and wire force differences | 2023-11-28 | 2024-11-21 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025117336A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN120617757A (zh) * | 2025-08-13 | 2025-09-12 | 苏州元科医疗器械有限公司 | Interventional catheter |
Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020165534A1 (en) * | 2001-05-02 | 2002-11-07 | Hayzelden Robert C. | Steerable catheter with torque transfer system |
| US20080009750A1 (en) * | 2006-06-09 | 2008-01-10 | Endosense Sa | Catheter having tri-axial force sensor |
| US20080046122A1 (en) * | 2003-06-30 | 2008-02-21 | Intuitive Surgical, Inc. | Maximum torque driving of robotic surgical tools in robotic surgical systems |
| US20120143127A1 (en) * | 2005-05-27 | 2012-06-07 | Magnetecs, Inc. | Apparatus and method for shaped magnetic field control for catheter, guidance, control, and imaging |
| US20170119484A1 (en) * | 2013-03-14 | 2017-05-04 | Hansen Medical, Inc. | Catheter tension sensing |
| US20180055589A1 (en) * | 2016-08-26 | 2018-03-01 | Hansen Medical, Inc. | Steerable catheter with shaft load distributions |
| US20180243900A1 (en) * | 2017-02-28 | 2018-08-30 | Canon U.S.A. Inc. | Apparatus of continuum robot |
| US20190060612A1 (en) * | 2015-09-04 | 2019-02-28 | Petrus A. Besselink | Flexible and steerable device |
| US20200375682A1 (en) * | 2019-05-31 | 2020-12-03 | Canon U.S.A., Inc. | Actively controlled steerable medical device with passive bending mode |
| US20220218418A1 (en) * | 2021-01-11 | 2022-07-14 | Hepius Medical Inc. | Sensor-free force and position control of tendon-driven catheters through interaction modeling |
2024
- 2024-11-21 WO PCT/US2024/056943 patent/WO2025117336A1/fr active Pending
Non-Patent Citations (4)
| Title |
|---|
| GUDINO ET AL.: "Control of intravascular catheters using an array of active steering coils", MEDICAL PHYSICS, vol. 38, no. 7, 2011, pages 4215 - 4224, XP012145369, Retrieved from the Internet <URL:https://pmc.ncbi.nlm.nih.gov/articles/PMC6961950> [retrieved on 20250110], DOI: 10.1118/1.3600693 * |
| HEUNIS CHRISTOFF M.; WOTTE YANNIK P.; SIKORSKI JAKUB; FURTADO GUILHERME PHILLIPS; MISRA SARTHAK: "The ARMM System - Autonomous Steering of Magnetically-Actuated Catheters: Towards Endovascular Applications", IEEE ROBOTICS AND AUTOMATION LETTERS, IEEE, vol. 5, no. 2, 8 January 2020 (2020-01-08), pages 704 - 711, XP011767834, DOI: 10.1109/LRA.2020.2965077 * |
| HU ET AL.: "A novel methodology for comprehensive modeling of the kinetic behavior of steerable catheters", IEEE/ASME TRANSACTIONS ON MECHATRONICS, vol. 24, no. 4, 2019, pages 1785 - 1797, XP011739847, Retrieved from the Internet <URL:https://www.researchgate.net/profile/Lin-Cao-2/pubfication/334490038ANovelMethodologyforComprehensiveModelingoftheKineticBehaviorofSteerableCatheters/links/Sd3d1caa4585153e59276709/A-Novel-Methodology-for-Comprehensive-Modeling-of-the-Kinetic-Behavior-of-Steerable-Catheters.pdf> [retrieved on 20250110], DOI: 10.1109/TMECH.2019.2928786 * |
| HU ET AL.: "Steerable catheters for minimally invasive surgery: a review and future directions", COMPUTER ASSISTED SURGERY, vol. 23, no. 1, 2018, pages 21 - 41, XP055914868, Retrieved from the Internet <URL:https://www.tandfonline.com/doi/pdf/10.1080/24699322.2018.1526972> [retrieved on 20250110], DOI: 10.1080/24699322.2018.1526972 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12295672B2 (en) | Robotic systems for determining a roll of a medical device in luminal networks | |
| CN110831486B (zh) | Systems and methods for location-sensor-based branch prediction | |
| KR102683476B1 (ko) | Image-based branch detection and mapping for navigation | |
| US12475591B2 (en) | Vision-based 6DOF camera pose estimation in bronchoscopy | |
| KR20210016572A (ko) | Image-based airway analysis and mapping | |
| KR20200136931A (ko) | Methods and systems for mapping and navigation | |
| US11944422B2 (en) | Image reliability determination for instrument localization | |
| CN114340542A (zh) | Systems and methods for weight-based registration of position sensors | |
| US11950868B2 (en) | Systems and methods for self-alignment and adjustment of robotic endoscope | |
| WO2025019378A1 (fr) | Device motion detection and navigation planning and/or autonomous navigation for a continuum robot or an endoscopic device or system | |
| WO2025117336A1 (fr) | Steerable catheters and wire force differences | |
| US20250172443A1 (en) | Force sensing with analog-to-digital conversion | |
| US20250170363A1 (en) | Robotic catheter tip and methods and storage mediums for controlling and/or manufacturing a catheter having a tip | |
| WO2025059207A1 (fr) | Medical apparatus having a support structure and method of use thereof | |
| WO2024081745A2 (fr) | Localization and targeting of small pulmonary lesions | |
| US20240164856A1 (en) | Detection in a surgical system | |
| US20230225802A1 (en) | Phase segmentation of a percutaneous medical procedure | |
| WO2025072201A1 (fr) | Robotic control for a continuum robot | |
| WO2025117590A1 (fr) | Bendable medical apparatus having modular actuators | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24898538; Country of ref document: EP; Kind code of ref document: A1 |