
WO2025210565A1 - Surgical robotic system designed to control a steerable instrument using multiple imaging modalities - Google Patents

Surgical robotic system designed to control a steerable instrument using multiple imaging modalities

Info

Publication number
WO2025210565A1
WO2025210565A1 (Application No. PCT/IB2025/053537)
Authority
WO
WIPO (PCT)
Prior art keywords
instrument
coordinate system
imaging device
imaging
surgical robotic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2025/053537
Other languages
English (en)
Inventor
William J. Peine
Dany JUNIO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Covidien LP
Original Assignee
Covidien LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Covidien LP filed Critical Covidien LP
Publication of WO2025210565A1
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/25 User interfaces for surgical systems
    • A61B34/30 Surgical robots
    • A61B34/37 Leader-follower robots
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A61B2034/2055 Optical tracking systems
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound

Definitions

  • Surgical robotic systems are currently being used in a variety of medical procedures, including minimally invasive surgical procedures.
  • Some surgical robotic systems include a surgeon console controlling a surgical robotic arm and a surgical instrument having an end effector (e.g., forceps or grasping instrument) coupled to and actuated by the robotic arm.
  • In operation, the robotic arm is moved to a position over a patient and then guides the surgical instrument through a small incision, via a surgical port or a natural orifice of a patient, to position the end effector at a work site within the patient's body.
  • A variety of different types of instruments designed to perform specific functions, such as steerable catheters and endoscopes, are used with surgical robotic systems.
  • Image-guided medical and surgical procedures utilize patient images obtained prior to or during a medical procedure to guide a physician performing the procedure. Such procedures can be referred to as computer-assisted procedures.
  • Common imaging modalities include computed tomography (CT), magnetic resonance imaging (MRI), fluoroscopic imaging (such as with a C-arm device), positron emission tomography (PET), and ultrasound (US) imaging.
  • Typical image-guided navigation systems generally require a dynamic reference frame to track the position of the patient should patient movement occur during the procedure.
  • The dynamic reference frame (DRF) is generally affixed to the patient in a generally permanent or immovable fashion.
  • The dynamic reference frame may also be used as a fiducial marker and may, therefore, be attached to the patient during the acquisition of pre-operative images. This enables the image space to be aligned with patient space during the navigated procedure.
  • Various instruments that are desired to be tracked may be used during an operative procedure. Image data is generally acquired, either intra-operatively or pre-operatively, and the instrument is generally illustrated and superimposed on the captured image data to identify the position of the instrument relative to the patient space.
  • The instrument may include tracking sensors, such as electromagnetic coils, or optical detection points, such as LEDs or reflectors, that may be detected by a suitable tracking system.
  • The DRF can be used by the tracking system to maintain a registration or localization of the patient space to the image space.
  • The DRF can also be any appropriate tracking sensor that is fixed to a portion of the patient that allows the system to determine whether the patient has moved relative to the image space.
  • Flexible surgical instruments, such as catheters and scopes, are used with the aid of various visualization techniques that aid in navigation of the instruments and in performing the procedure.
  • Multiple visualization modalities may be used simultaneously, such as ultrasound and fluoroscopy, to guide the instrument.
  • In that case, the surgeon needs to manually perform mental registration and compensation between the images and the instrument.
  • That is, the surgeon needs to understand how each imaging modality relates to the other imaging modalities, how the instrument is being imaged in the different imaging modalities, and how movement inputs affect the movement of the instrument.
  • The present disclosure provides a surgical robotic system and a navigation system for navigating a surgical instrument using multiple imaging modalities.
  • Suitable imaging modalities include a fluoroscopy imaging system and an ultrasound imaging system.
  • The navigation system includes fiducial markers, electromagnetic trackers, or any other index markers/trackers that are used to localize the imaging components, the instrument, the patient, etc.
  • The markers may be tracked using white-light or infrared cameras, electromagnetic receivers, and the like.
  • The robotic system is configured to combine multiple imaging modalities, such as fluoroscopy and ultrasound.
  • Localization of the imaging components allows fluoroscopy and ultrasound images to be combined in the same coordinate system by locating the image acquisition components (e.g., ultrasound transducer, X-ray sensors, etc.) in space relative to the camera, receiver, or some other reference point.
  • Localization information is then processed using the properties of the imagers to orient the output of both devices on the display screen(s), in both rotation and scale/zoom, while also providing operators with instructions to align the imaging systems as desired.
  • Automatic alignment may also be used.
  • The system may also be used to align the robotic visual reference to the images.
  • The robotic system components (e.g., robotic arms) and the imaging components are aligned to lie in the same plane, or a live registration of robot motion inside the image is used to detect where the robot coordinate system lies within the current imaging output.
  • The robot control may be aligned such that each Cartesian control command occurs along the same axes as the image, e.g., a right input moves the instrument to the right, an up input moves the instrument up, etc.
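As an illustration of this kind of axis alignment, a minimal sketch in Python follows. It assumes the registration step yields a 3x3 rotation matrix taking handle-controller coordinates into image coordinates; the function and variable names are illustrative, not from the disclosure.

```python
import numpy as np

def map_input_direction(input_dir, R_image_from_handle):
    """Re-express a direction commanded at the handle controller in the
    imaging device's coordinate system, so that a "right" input moves
    the instrument to the right of the displayed image.

    input_dir: (3,) vector in the handle-controller frame,
               e.g. [1, 0, 0] for a "right" input.
    R_image_from_handle: (3, 3) rotation from the handle frame to the
               image frame, produced by the registration/alignment step.
    """
    v = np.asarray(input_dir, dtype=float)
    v = v / np.linalg.norm(v)   # normalize; caller must not pass a zero vector
    return R_image_from_handle @ v

# Example: once the frames are aligned (identity rotation),
# a "right" input maps directly to image-right.
R = np.eye(3)
print(map_input_direction([1.0, 0.0, 0.0], R))   # -> [1. 0. 0.]
```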
  • In an embodiment, a surgical robotic system includes a first robotic arm having a first instrument drive unit and an instrument coupled to and actuatable by the first instrument drive unit, and a surgeon console including a handle controller for receiving a user input to move the instrument in a first coordinate system.
  • The system also includes a second robotic arm having a second instrument drive unit, and an imaging device coupled to and actuatable by the second instrument drive unit.
  • The imaging device is configured to capture an image of the instrument using an imaging modality in a second coordinate system, different from the first coordinate system.
  • A tracking system is also provided for obtaining tracking data of the instrument and the imaging device.
  • The surgical robotic system further includes a processor configured to register the instrument in the second coordinate system based on the tracking data, align the first coordinate system with the second coordinate system, and receive inputs controlling the instrument in the second coordinate system.
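One plausible way to realize the registration step is by chaining rigid transforms, sketched below under the assumption that the tracking system reports 4x4 homogeneous poses of both the instrument and the imaging device in a common tracker frame. The names and example values are hypothetical.

```python
import numpy as np

def register_instrument_in_image(T_tracker_from_instrument,
                                 T_tracker_from_imager):
    """Express the tracked instrument pose in the imaging device's
    coordinate system by composing homogeneous transforms:
    T_imager_from_instrument =
        inv(T_tracker_from_imager) @ T_tracker_from_instrument
    """
    T_imager_from_tracker = np.linalg.inv(T_tracker_from_imager)
    return T_imager_from_tracker @ T_tracker_from_instrument

# Example: imager rotated 90 degrees about Z relative to the tracker frame.
T_ti = np.eye(4)
T_ti[:3, 3] = [0.10, 0.00, 0.05]                     # instrument pose (m)
T_tm = np.eye(4)
T_tm[:3, :3] = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]    # imager pose
print(register_instrument_in_image(T_ti, T_tm))
```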
  • The imaging device may be one of an ultrasound probe or a fluoroscope.
  • The instrument may be one of a catheter or a flexible scope.
  • The tracking system may be one of an optical tracking system or an electromagnetic tracking system.
  • The handle controller may include a plurality of directional inputs.
  • The processor may be configured to map the plurality of directional inputs from the first coordinate system to the second coordinate system.
  • The surgeon console may further include a display screen for displaying the image captured by the imaging device.
  • The processor is configured to display a first 3D axis symbol corresponding to the first coordinate system and a second 3D axis symbol corresponding to the second coordinate system.
  • The processor may be configured to align the second 3D axis symbol to the first 3D axis symbol when the first coordinate system is aligned with the second coordinate system.
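The "snap together" behavior of the two axis symbols could be driven by a test such as the following sketch, which treats the frames as aligned once the geodesic angle between their rotations falls under a tolerance. The tolerance value is an assumption for illustration.

```python
import numpy as np

def rotation_angle_between(R_a, R_b):
    """Geodesic angle (radians) between two 3x3 rotation matrices."""
    R_rel = R_a.T @ R_b
    # For a rotation matrix, trace(R) = 1 + 2*cos(theta).
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_theta)

def frames_aligned(R_a, R_b, tol_deg=2.0):
    """Decide whether the two 3D axis symbols should be drawn as aligned."""
    return np.degrees(rotation_angle_between(R_a, R_b)) <= tol_deg

R_first = np.eye(3)
R_second = np.array([[0.0, -1.0, 0.0],
                     [1.0,  0.0, 0.0],
                     [0.0,  0.0, 1.0]])    # frames 90 degrees apart
print(frames_aligned(R_first, R_second))   # False until realigned
```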
  • In another embodiment, a method for controlling an imaging system with a surgical robotic system includes: receiving a user input at a handle controller for moving an instrument in a first coordinate system, where the instrument is coupled to a first instrument drive unit of a first robotic arm; capturing an image of the instrument through an imaging device using an imaging modality in a second coordinate system, where the imaging device is coupled to a second instrument drive unit of a second robotic arm; and obtaining tracking data of the instrument and the imaging device using a tracking system.
  • The method also includes registering the instrument in the second coordinate system based on the tracking data.
  • The method further includes aligning the first coordinate system with the second coordinate system.
  • The method additionally includes receiving inputs controlling the instrument in the second coordinate system.
  • The handle controller may include a plurality of directional inputs.
  • The method may further include mapping the plurality of directional inputs from the first coordinate system to the second coordinate system.
  • The method may further include displaying the image captured by the imaging device on a display screen.
  • The method may further include displaying a first 3D axis symbol corresponding to the first coordinate system and a second 3D axis symbol corresponding to the second coordinate system.
  • The method may also include aligning the second 3D axis symbol to the first 3D axis symbol when the first coordinate system is aligned with the second coordinate system.
  • In a further embodiment, a method for controlling a medical instrument in a plurality of different imaging modalities includes capturing a first image of an instrument through a first imaging device using a first imaging modality in a first coordinate system, where the first imaging device is coupled to a first instrument drive unit of a first robotic arm.
  • The method also includes capturing a second image of the instrument through a second imaging device using a second imaging modality in a second coordinate system, where the second imaging device is coupled to a second instrument drive unit of a second robotic arm, the second imaging modality is different from the first imaging modality, and the second coordinate system is different from the first coordinate system.
  • The method also includes obtaining tracking data of the instrument, the first imaging device, and the second imaging device using a tracking system.
  • The method further includes registering the instrument in the first coordinate system and the second coordinate system based on the tracking data.
  • The method additionally includes aligning the first coordinate system with the second coordinate system.
  • Implementations of the above embodiment may include one or more of the following features.
  • The method may further include displaying the first image captured by the first imaging device side-by-side with the second image captured by the second imaging device.
  • The method may further include displaying a first 3D axis symbol corresponding to the first coordinate system and a second 3D axis symbol corresponding to the second coordinate system.
  • The method may also include aligning the second 3D axis symbol to the first 3D axis symbol when the first coordinate system of the first imaging device is aligned with the second coordinate system of the second imaging device. Aligning the first coordinate system with the second coordinate system may include moving at least one of the first robotic arm or the second robotic arm.
  • FIG. 3 is a perspective view of a movable cart having a setup arm with the surgical robotic arm of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of a computer architecture of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure.
  • FIG. 5 is a navigation system for use with the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure.
  • FIG. 6 is a perspective view of a handle controller according to one embodiment of the present disclosure.
  • FIG. 7 is a view of a workstation including a first display screen showing an image of a first imaging modality and a second display screen showing an image of a second imaging modality according to an embodiment of the present disclosure.
  • FIG. 8 is a view of a graphical user interface (GUI) showing an image of the first imaging modality and coordinate system icons representing registration according to an embodiment of the present disclosure.
  • FIG. 9 is a flow chart of a method for controlling a steerable instrument using a single imaging modality according to an embodiment of the present disclosure.
  • FIG. 10 is a flow chart of a method for controlling a steerable instrument using multiple imaging modalities according to an embodiment of the present disclosure.
  • The surgeon console 30 also includes a plurality of user interface devices, such as foot pedals 36 and a pair of handle controllers 38a and 38b, which are used by a user to remotely control the robotic arms 40.
  • The surgeon console further includes an armrest 33 used to support the clinician's arms while the clinician is operating the handle controllers 38a and 38b.
  • The control tower 20 can also include a display 23, which may be a touchscreen and may display one or more graphical user interfaces (GUIs).
  • The control tower 20 also acts as an interface between the surgeon console 30 and one or more of the robotic arms 40.
  • The control tower 20 is configured to control the robotic arms 40, such as to move the robotic arms 40 and the corresponding surgical instrument 50, based on a set of programmable instructions and/or input commands from the surgeon console 30.
  • The robotic arms 40 and the surgical instrument 50 execute a desired movement sequence in response to input from the foot pedals 36 and the handle controllers 38a and 38b.
  • Each of the control tower 20, the surgeon console 30, and the robotic arm 40 includes a respective computer 21, 31, 41.
  • The computers 21, 31, 41 are interconnected using any suitable communication network based on wired or wireless communication protocols.
  • Suitable protocols include, but are not limited to, transmission control protocol/internet protocol (TCP/IP), user datagram protocol/internet protocol (UDP/IP), and/or datagram congestion control protocol (DCCP).
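For illustration only, a bare-bones UDP exchange of joint commands between such computers might look like the sketch below. The address, port, and JSON message shape are invented for the example; the disclosure does not specify a wire format.

```python
import json
import socket

def send_joint_command(sock, addr, joint_angles):
    """Serialize desired joint angles and send them as one UDP datagram."""
    payload = json.dumps({"type": "joint_cmd", "angles": joint_angles})
    sock.sendto(payload.encode("utf-8"), addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_joint_command(sock, ("127.0.0.1", 5005), [0.10, -0.25, 0.40])
```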
  • The setup arm 61 includes a first link 62a, a second link 62b, and a third link 62c, which provide for lateral maneuverability of the robotic arms 40.
  • The links 62a, 62b, 62c are interconnected at joints 63a and 63b, each of which may include an actuator (not shown) for rotating the links 62a and 62b relative to each other and the link 62c.
  • The links 62a, 62b, 62c are movable in corresponding lateral planes, which are parallel to each other, thereby allowing for extension of the robotic arm 40 relative to the patient (e.g., surgical table).
  • The robotic arm 40 may be coupled to the surgical table (not shown).
  • The setup arm 61 includes controls 65 for adjusting movement of the links 62a, 62b, 62c as well as the lift 67.
  • The setup arm 61 may include any type and/or number of joints.
  • The actuator 48b of the joint 44b is coupled to the joint 44c via the belt 45a, and the joint 44c is in turn coupled to the joint 46b via the belt 45b.
  • Joint 44c may include a transfer case coupling the belts 45a and 45b, such that the actuator 48b is configured to rotate each of the links 42b, 42c and a holder 46 relative to each other. More specifically, the links 42b, 42c and the holder 46 are passively coupled to the actuator 48b, which enforces rotation about a pivot point "P" that lies at the intersection of the first axis defined by the link 42a and the second axis defined by the holder 46.
  • The joints 44a and 44b include respective actuators 48a and 48b configured to drive the joints 44a, 44b, 44c relative to each other through a series of belts 45a and 45b or other mechanical linkages, such as a drive rod, a cable, or a lever.
  • The actuator 48a is configured to rotate the robotic arm 40 about a longitudinal axis defined by the link 42a.
  • The holder 46 defines a second longitudinal axis and is configured to receive an instrument drive unit (IDU) 52 (FIG. 1).
  • The IDU 52 is configured to couple to an actuation mechanism of the surgical instrument 50 and the camera 51 and is configured to move (e.g., rotate) and actuate the instrument 50 and/or the camera 51.
  • The IDU 52 transfers actuation forces from its actuators to the surgical instrument 50 to actuate components of an end effector 49 of the surgical instrument 50.
  • The holder 46 includes a sliding mechanism 46a, which is configured to move the IDU 52 along the second longitudinal axis defined by the holder 46.
  • The holder 46 also includes a joint 46b, which rotates the holder 46 relative to the link 42c.
  • The instrument 50 may be inserted through an endoscopic access port 55 (FIG. 3) held by the holder 46.
  • The holder 46 also includes a port latch 46c for securing the access port 55 to the holder 46 (FIG. 2).
  • The IDU 52 is attached to the holder 46, followed by a sterile interface module (SIM) 43 being attached to a distal portion of the IDU 52.
  • The SIM 43 is configured to secure a sterile drape (not shown) to the IDU 52.
  • The instrument 50 is then attached to the SIM 43.
  • The instrument 50 is then inserted through the access port 55 by moving the IDU 52 along the holder 46.
  • The SIM 43 includes a plurality of drive shafts configured to transmit rotation of individual motors of the IDU 52 to the instrument 50, thereby actuating the instrument 50.
  • The SIM 43 provides a sterile barrier between the instrument 50 and the other components of the robotic arm 40, including the IDU 52.
  • The robotic arm 40 also includes a plurality of manual override buttons 53 (FIG. 1) disposed on the IDU 52 and the setup arm 61, which may be used in a manual mode. The user may press one or more of the buttons 53 to move the component associated with the one or more buttons 53.
  • Each of the computers 21, 31, 41 of the surgical robotic system 10 may include a plurality of controllers, which may be embodied in hardware and/or software.
  • The computer 21 of the control tower 20 includes a controller 21a and a safety observer 21b.
  • The controller 21a receives data from the computer 31 of the surgeon console 30 about the current position and/or orientation of the handle controllers 38a and 38b and the state of the foot pedals 36 and other buttons.
  • The controller 21a processes these input positions to determine desired drive commands for each joint of the robotic arm 40 and/or the IDU 52 and communicates them to the computer 41 of the robotic arm 40.
  • The controller 21a also receives the actual joint angles measured by the encoders of the actuators 48a and 48b and uses this information to determine force feedback commands that are transmitted back to the computer 31 of the surgeon console 30 to provide haptic feedback through the handle controllers 38a and 38b.
  • The safety observer 21b performs validity checks on the data going into and out of the controller 21a and notifies a system fault handler if errors in the data transmission are detected, to place the computer 21 and/or the surgical robotic system 10 into a safe state.
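A toy version of such a validity check follows; the packet shape and joint limits are hypothetical, and nothing here is the actual safety logic of the system.

```python
import math

JOINT_LIMITS = [(-3.14, 3.14), (-1.57, 1.57), (-2.0, 2.0)]  # illustrative

def is_valid(packet, limits=JOINT_LIMITS):
    """Reject packets with missing, non-finite, or out-of-range angles."""
    angles = packet.get("angles", [])
    if len(angles) != len(limits):
        return False
    return all(math.isfinite(a) and lo <= a <= hi
               for a, (lo, hi) in zip(angles, limits))

def observe(packet, enter_safe_state, limits=JOINT_LIMITS):
    """Pass valid packets through; otherwise trigger the fault handler."""
    if not is_valid(packet, limits):
        enter_safe_state()   # e.g. hold position and notify the operator
        return None
    return packet
```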
  • The computer 41 includes a plurality of controllers, namely, a main cart controller 41a, a setup arm controller 41b, a robotic arm controller 41c, and an instrument drive unit (IDU) controller 41d.
  • The main cart controller 41a receives and processes joint commands from the controller 21a of the computer 21 and communicates them to the setup arm controller 41b, the robotic arm controller 41c, and the IDU controller 41d.
  • The main cart controller 41a also manages instrument exchanges and the overall state of the movable cart 60, the robotic arm 40, and the IDU 52.
  • The main cart controller 41a also communicates actual joint angles back to the controller 21a.
  • Each of the joints 63a and 63b and the rotatable base 64 of the setup arm 61 is a passive joint (i.e., no actuators are present therein), allowing for manual adjustment by a user.
  • The joints 63a and 63b and the rotatable base 64 include brakes that are disengaged by the user to configure the setup arm 61.
  • The setup arm controller 41b monitors slippage of each of the joints 63a and 63b and the rotatable base 64 of the setup arm 61 when the brakes are engaged; these joints can be freely moved by the operator when the brakes are disengaged, but they do not impact controls of other joints.
  • The robotic arm controller 41c controls each joint 44a and 44b of the robotic arm 40 and calculates the desired motor torques required for gravity compensation, friction compensation, and closed-loop position control of the robotic arm 40.
  • The robotic arm controller 41c calculates a movement command based on the calculated torque.
  • The calculated motor commands are then communicated to one or more of the actuators 48a and 48b in the robotic arm 40.
  • The actual joint positions are then transmitted by the actuators 48a and 48b back to the robotic arm controller 41c.
  • The IDU controller 41d receives desired joint angles for the surgical instrument 50, such as wrist and jaw angles, and computes the desired currents for the motors in the IDU 52.
  • The IDU controller 41d calculates actual angles based on the motor positions and transmits the actual angles back to the main cart controller 41a.
  • The robotic arm 40 is controlled in response to a pose of the handle controller controlling it, e.g., the handle controller 38a, which is transformed into a desired pose of the robotic arm 40 through a hand-eye transform function executed by the controller 21a.
  • The hand-eye function, as well as the other functions described herein, is embodied in software executable by the controller 21a or any other suitable controller described herein.
  • The pose of one of the handle controllers 38a may be embodied as a coordinate position and a roll-pitch-yaw (RPY) orientation relative to a coordinate reference frame that is fixed to the surgeon console 30.
  • The desired pose of the instrument 50 is relative to a fixed frame on the robotic arm 40.
  • The pose of the handle controller 38a is then scaled by a scaling function executed by the controller 21a.
  • The coordinate position may be scaled down and the orientation may be scaled up by the scaling function.
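A minimal sketch of such an asymmetric scaling function follows; the scale factors are illustrative placeholders, not system parameters.

```python
import numpy as np

def scale_pose(position_delta, rpy_delta, pos_scale=0.25, rot_scale=1.5):
    """Scale a handle-controller pose increment before it becomes an arm
    command: translation is scaled down for fine motion, while the
    roll-pitch-yaw orientation increment is scaled up."""
    return (pos_scale * np.asarray(position_delta, dtype=float),
            rot_scale * np.asarray(rpy_delta, dtype=float))

pos, rpy = scale_pose([0.04, 0.00, -0.02], [0.10, 0.00, 0.05])
print(pos, rpy)   # -> [0.01 0. -0.005] [0.15 0. 0.075]
```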
  • The controller 21a may also execute a clutching function, which disengages the handle controller 38a from the robotic arm 40.
  • The controller 21a stops transmitting movement commands from the handle controller 38a to the robotic arm 40 if certain movement limits or other thresholds are exceeded, in essence acting like a virtual clutch mechanism, e.g., limiting mechanical input from effecting mechanical output.
  • The desired pose of the robotic arm 40 is based on the pose of the handle controller 38a and is then passed through an inverse kinematics function executed by the controller 21a.
  • The inverse kinematics function calculates angles for the joints 44a, 44b, 44c of the robotic arm 40 that achieve the scaled and adjusted pose input by the handle controller 38a.
  • The calculated angles are then passed to the robotic arm controller 41c, which includes a joint axis controller having a proportional-derivative (PD) controller, a friction estimator module, a gravity compensator module, and a two-sided saturation block, which is configured to limit the commanded torque of the motors of the joints 44a, 44b, 44c.
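The per-joint control law described here can be sketched as a PD term plus feedforward friction and gravity compensation, clipped by a two-sided saturation. The gains and limits below are invented for illustration and are not system values.

```python
import numpy as np

def joint_torque(q, q_des, qdot, kp=50.0, kd=2.0,
                 tau_gravity=0.0, tau_friction=0.0, tau_max=10.0):
    """One joint-axis update: proportional-derivative feedback on the
    position error, plus gravity and friction feedforward terms, then a
    two-sided saturation limiting the commanded motor torque."""
    tau = kp * (q_des - q) - kd * qdot + tau_gravity + tau_friction
    return float(np.clip(tau, -tau_max, tau_max))

# Example: raw PD output of 14.8 N*m is clipped to the 10.0 N*m limit.
print(joint_torque(q=0.00, q_des=0.30, qdot=0.10))   # -> 10.0
```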
  • The handle controller 38a may be substituted for and/or employed in conjunction with the handle controller 38b.
  • FIG. 5 shows an image-guided navigation system 100 for use with the surgical robotic system 10 in non-line-of-sight navigation of an instrument, such as a catheter, during cardiac therapy or any other surgical procedure.
  • the navigation system 100 may be used with any type of instrument or delivery system, including guide wires, needles, drug delivery systems, cell delivery systems, gene delivery systems and biopsy systems. Moreover, these instruments may be used for cardiac therapy or any other therapy in the body or be used to navigate or map any other regions of the body, such as moving body structures.
  • The navigation system 100 addresses multiple cardiac, neurological, organ, and other soft tissue therapies, including drug delivery, cell transplantation, gene delivery, electrophysiology ablations, transmyocardial revascularization (TMR), biopsy guidance, and virtual echography imaging.
  • The navigation system 100 may include an imaging device 112 that is used to acquire pre-operative or real-time images of a patient 114.
  • The imaging device 112 may be a fluoroscopic x-ray imaging device, a magnetic resonance imager (MRI), a computed tomography (CT) imager, a positron emission tomography (PET) imager, an isocentric fluoroscopy imager, a bi-plane fluoroscopy imager, an ultrasound imager, a multi-slice computed tomography (MSCT) imager, a positron emission tomography-computed tomography (PET/CT) imager, a high definition computed tomography (HDCT) imager, a dual source computed tomography imager, a high-frequency ultrasound (HIFU) imager, an optical coherence tomography (OCT) imager, an intra-vascular ultrasound (IVUS) imager, an intra-operative CT imager, an intra-operative MRI imager, a single photon emission computed tomography (SPECT) imager, or a combination thereof.
  • The imaging device 112 may include an arm 116 having an imaging (e.g., x-ray) source 118, a receiving section 120, an optional calibration and tracking target 122, and optional radiation sensors 124.
  • The calibration and tracking target 122 includes calibration markers.
  • An arm controller 128 captures the x-ray images received at the x-ray receiving section 120 and stores the images for later use.
  • The arm controller 128 may also control the rotation of the arm 116.
  • The arm 116 may move in the direction of arrow 130 or rotate about the long axis of the patient, allowing anterior or lateral views of the patient 114 to be imaged. Each of these movements involves rotation about a mechanical connector 132 of the arm 116.
  • Any 2D, 3D, or 4D imaging device, such as isocentric fluoroscopy, bi-plane fluoroscopy, ultrasound, computed tomography (CT), multi-slice computed tomography (MSCT), magnetic resonance imaging (MRI), high frequency ultrasound (HIFU), positron emission tomography (PET), optical coherence tomography (OCT), intra-vascular ultrasound (IVUS), intra-operative CT, or intra-operative MRI, may also be used to acquire 2D, 3D, or 4D pre-operative or real-time images or image data of the patient 114.
  • The images may also be obtained and displayed in two, three, or four dimensions.
  • Four-dimensional surface rendering of the heart or other regions of the body may also be achieved by incorporating heart data or other soft tissue data from an atlas map or from pre-operative image data captured by MRI, CT, or echocardiography modalities.
  • Image datasets from hybrid modalities, such as positron emission tomography (PET) or single photon emission computed tomography (SPECT) combined with CT, could also provide functional image data superimposed onto anatomical data to be used to confidently reach target sites within the heart or other areas of interest.
  • The fluoroscopic imaging device 112 can provide a virtual bi-plane image using a single-head arm fluoroscope (i.e., imaging device 112) by simply rotating the arm 116 about at least two planes, which could be orthogonal planes, to generate two-dimensional images that can be converted to three-dimensional volumetric images.
  • An icon representing the location of a catheter or other instrument introduced and advanced in the patient 114 may be superimposed in more than one view on the display screen(s) 136, allowing simulated bi-plane or even multi-plane views, including two- and three-dimensional views.
  • Magnetic resonance imaging (MRI) is generally a pre-operative imaging technique. This type of imaging provides very good tissue visualization in three-dimensional form and also provides anatomy and functional information from the imaging.
  • MRI imaging data is generally registered and compensated for motion correction using the dynamic reference frames discussed herein.
  • Computed tomography (CT) imaging is also generally a pre-operative technique that exposes the patient to a limited level of radiation.
  • CT imaging is a very fast imaging procedure.
  • A multi-slice CT system provides 3D images having good resolution and anatomy information. Again, CT imaging is generally registered and needs to account for motion correction via dynamic reference frames.
  • Fluoroscopy imaging is generally an intra-operative imaging procedure that exposes the patient to certain amounts of radiation to provide either two-dimensional or rotational three-dimensional images.
  • Fluoroscopic images generally provide good resolution and anatomy information. Fluoroscopic images can be either manually or automatically registered and also need to account for motion correction using dynamic reference frames.
  • Ultrasound imaging is generally an intra-operative procedure that uses a non-ionizing field to provide 2D, 3D, or 4D imaging, including anatomy and blood flow information. Ultrasound imaging provides automatic registration and does not need to account for any motion correction.
  • The navigation system 100 may include a second imaging device 166, which may be an ultrasound probe, such as a drop-in laparoscopic ultrasound probe and the like.
  • The ultrasound probe is configured to generate ultrasound data by emitting acoustic waves into the tissue and receiving the reflected acoustic waves, i.e., ultrasound data.
  • The received reflected acoustic waves are then processed by the workstation 134 to identify various properties of the tissues through which the acoustic waves traveled, such as the density of the tissue.
  • The ultrasound probe may be grasped by a robotic instrument or coupled directly to the IDU 52.
  • The second imaging device 166 may alternatively be a fluoroscopic imager, a magnetic resonance imager (MRI), a computed tomography (CT) imager, a positron emission tomography (PET) imager, an isocentric fluoroscopy imager, a bi-plane fluoroscopy imager, an ultrasound imager, a multi-slice computed tomography (MSCT) imager, a positron emission tomography-computed tomography (PET/CT) imager, a high definition computed tomography (HDCT) imager, a dual source computed tomography imager, a high-frequency ultrasound (HIFU) imager, an optical coherence tomography (OCT) imager, an intra-vascular ultrasound (IVUS) imager, an intra-operative CT imager, an intra-operative MRI imager, a single photon emission computed tomography (SPECT) imager, or a combination thereof.
  • The navigation system 100 may further include an electromagnetic tracking system 144 that includes a transmitter coil array 146, a coil array controller 148, a navigation probe interface 150, a dynamic reference frame 154, and the instrument 50, which may be an electromagnetically tracked flexible instrument, such as a catheter, a scope, or any other type of instrument.
  • The entire tracking system 144, or parts of it, may be incorporated into the imaging devices 112 and 166, including the workstation 134 and the radiation sensors 124. Incorporating the tracking system 144 provides an integrated imaging and tracking system. Any combination of these components may also be incorporated into the imaging device 112, which again can include a fluoroscopic arm imaging device or any other appropriate imaging device.
  • The transmitter coil array 146 is shown attached to the x-ray receiving section 120 of the arm 116. However, the transmitter coil array 146 may also be positioned at any other location. For example, the transmitter coil array 146 may be positioned at the x-ray source 118, within or atop the OR table 156 positioned below the patient 114, on side rails associated with the table 156, or on the patient 114 in proximity to the region being navigated, such as on the patient's chest.
  • The transmitter coil array 146 includes a plurality of coils, each operable to generate a distinct electromagnetic field in the navigation region of the patient 114, which is sometimes referred to as patient space.
  • The transmitter coil array 146 is controlled or driven by the coil array controller 148.
  • The coil array controller 148 drives each coil in the transmitter coil array 146 in a time division multiplex or a frequency division multiplex manner. In this regard, each coil may be driven separately at a distinct time, or all of the coils may be driven simultaneously, each at a different frequency.
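Both driving schemes reduce to simple scheduling, as the sketch below shows; the coil count, dwell time, and frequencies are arbitrary illustration values.

```python
def tdm_schedule(n_coils, dwell_s=0.001, cycles=1):
    """Time-division multiplexing: one coil per time slot, so each induced
    sensor signal can be attributed to exactly one transmitter coil."""
    t = 0.0
    for _ in range(cycles):
        for k in range(n_coils):
            yield (t, k)          # (slot start time in seconds, coil index)
            t += dwell_s

def fdm_frequencies(n_coils, f0=1000.0, spacing=100.0):
    """Frequency-division multiplexing: all coils driven simultaneously,
    each at a distinct frequency, separated downstream by filtering."""
    return [f0 + k * spacing for k in range(n_coils)]

print(list(tdm_schedule(3)))   # [(0.0, 0), (0.001, 1), (0.002, 2)]
print(fdm_frequencies(3))      # [1000.0, 1100.0, 1200.0]
```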
  • Electromagnetic fields are thereby generated within the patient 114 in the area where the medical procedure is being performed, which again is sometimes referred to as patient space.
  • The electromagnetic fields generated in the patient space induce currents in sensors 158 positioned in the instrument 50.
  • The navigation probe interface 150 provides all the necessary electrical isolation for the navigation system 100.
  • The navigation probe interface 150 also includes the amplifiers, filters, and buffers required to directly interface with the sensors 158 in the instrument 50.
  • Alternatively, the instrument 50 may employ a wireless communications channel as opposed to being coupled directly to the navigation probe interface 150.
  • The instrument 50 may be equipped with at least one, and generally multiple, localization sensors 158.
  • The instrument 50 is also generally a steerable catheter that includes a handle at a proximal end and the multiple localization sensors 158 fixed to the catheter body and spaced axially from one another along the distal segment of the instrument 50.
  • The instrument 50 as shown in FIG. 5 may include four localization sensors 158.
  • The localization sensors 158 are generally formed as electromagnetic receiver coils, such that the electromagnetic field generated by the transmitter coil array 146 induces current in the receiver coils or sensors 158.
  • The instrument 50 may also be equipped with one or more sensors operable to sense various physiological signals.
  • For example, the instrument 50 may be provided with electrodes for sensing myopotentials or action potentials.
  • The instrument 50 may also be provided with an open lumen to allow the delivery of a medical device or pharmaceutical/cell/gene agents.
  • For example, the instrument 50 may be used as a guide catheter for deploying a medical lead, such as a cardiac lead for use in cardiac pacing and/or defibrillation, or for tissue ablation.
  • The open lumen may alternatively be used to locally deliver pharmaceutical agents, cells, or genetic therapies.
  • The dynamic reference frame 154 of the electromagnetic tracking system 144 is also coupled to the navigation probe interface 150 to forward its information to the coil array controller 148.
  • The dynamic reference frame 154 is a small magnetic field detector designed to be fixed to the patient 114 adjacent to the region being navigated, so that any movement of the patient 114 is detected as relative motion between the transmitter coil array 146 and the dynamic reference frame 154. This relative motion is forwarded to the coil array controller 148, which updates the registration correlation and maintains accurate navigation.
  • The dynamic reference frame 154 can be configured as a pair of orthogonally oriented coils, each having the same center, or may be configured in any other non-coaxial coil configuration.
  • The dynamic reference frame 154 may be affixed externally to the patient 114, adjacent to the region of navigation, such as on the patient's chest, as shown in FIG. 5, or on the patient's back.
  • The dynamic reference frame 154 can be affixed to the patient's skin by way of a stick-on adhesive patch.
  • The dynamic reference frame 154 may also be removably attachable to fiducial markers 160 also positioned on the patient's body.
  • Alternatively, the dynamic reference frame 154 may be internally attached, for example, to the wall of the patient's heart or other soft tissue using a temporary lead that is attached directly to the heart. This provides increased accuracy since this lead will track the regional motion of the heart. Gating will also increase the navigational accuracy of the system 100. It should further be noted that multiple dynamic reference frames 154 may also be employed. For example, an external dynamic reference frame 154 may be attached to the chest of the patient 114, as well as to the back of the patient 114. Since certain regions of the body may move more than others due to motions of the heart or the respiratory system, each dynamic reference frame 154 may be appropriately weighted to increase accuracy even further. In this regard, the dynamic reference frame 154 attached to the back may be weighted higher than the dynamic reference frame 154 attached to the chest, since the back is relatively static in motion.
  • The navigation system 100 may also include an optical tracking system 170, which may include any suitable camera 172 or cameras, such as one or more infrared cameras (e.g., bifocal cameras), able to identify, for example, active and passive tracking markers 174 in a given measurement volume viewable from the perspective of the camera 172.
  • The camera 172 may scan the given measurement volume and detect the light that comes from the markers 174 in order to identify and determine the positions of the markers 174 in three dimensions.
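Determining a marker's 3D position from two camera views reduces to intersecting (nearly intersecting) rays. The disclosure does not spell out the reconstruction method; below is a standard midpoint-triangulation sketch, assuming each camera supplies a ray origin and unit direction for the detected marker.

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Locate a marker as the midpoint of the shortest segment between
    two marker rays (origin o, unit direction d), one ray per camera view."""
    o1, d1, o2, d2 = (np.asarray(v, dtype=float) for v in (o1, d1, o2, d2))
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d_, e_ = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # ~0 means the rays are parallel
    s = (b * e_ - c * d_) / denom
    t = (a * e_ - b * d_) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

# Two rays that intersect at (0, 0, 1):
print(triangulate_midpoint([0, 0, 0], [0, 0, 1],
                           [1, 0, 1], [-1, 0, 0]))   # -> [0. 0. 1.]
```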
  • Active markers 174 may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (LEDs)), and passive markers 174 may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation back in the direction of the incoming light), for example, emitted by illuminators on the camera 172 or another suitable device.
  • The markers 174 may be placed on the imaging devices 112 and 166 to allow for their registration and localization.
  • FIG. 6 shows the left handle controller 38b, which is a mirror image of the right handle controller 38a, and which may be used to control, i.e., move, activate, etc., a tool coupled to the IDU 52 of the robotic arms 40, such as the instrument 50, the laparoscopic camera 51, or the ultrasound probe 166.
  • Each of the handle controllers 38a and 38b includes a handle 701 and a paddle 708 that is pivotally coupled to the handle 701 at one end (e.g., proximal) of the paddle 708.
  • The paddle 708 is configured to actuate a function of the instrument 50.
  • The paddle 708 may include a finger sensor 704 configured to detect the presence or movement of a finger, such as a touch sensor, capacitive sensor, optical sensor, and the like.
  • The finger sensor 704 may be disposed on any portion of the handle controllers 38a and 38b.
  • Each of the handle controllers 38a and 38b may also include a trigger 705a and one or more buttons 705b for activating various functions of the instrument 50.
  • In addition, each of the handle controllers 38a and 38b may include a gimbal assembly 706 allowing for movement and rotation of the handle controllers 38a and 38b in a coordinate system of the handle controller 38a.
  • The coordinate system is represented by a 3D axis symbol 600 including the X-axis 600x, Y-axis 600y, and Z-axis 600z.
  • The gimbal assembly 706 includes a plurality of frames 710a, 710b, 710c interconnected by rotatable joints 711 between each of the frames 710a, 710b, 710c, the handle 701, and a support frame 712.
  • The joints 711 include encoders or other sensors suitable for measuring rotation, which is then used as input to control movement (e.g., pitch, roll, yaw, etc.) of the instrument 50.
  • Alternatively, the handle controller 38b may be any other directional input device, such as an analog joystick, a directional pad, a touchpad, a trackball, a mouse, and the like.
  • The display screens 136 of the workstation 134 are used to display images obtained from the first imaging device 112 and/or the second imaging device 166.
  • Each of the display screens, i.e., the first display screen 136a and the second display screen 136b, may display an image from one of the imaging modalities, i.e., from the first imaging device 112 and the second imaging device 166, respectively.
  • Alternatively, the images may be displayed on a single display in separate windows, regions, or picture-in-picture formats.
  • The first display screen 136a displays the image 180a from the first imaging modality, i.e., the first imaging device 112, which may be a fluoroscope as described above.
  • The display screen 136a also shows a coordinate system of the first imaging modality, which is depicted using a 3D axis symbol 182 including the X-axis 182x, Y-axis 182y, and Z-axis 182z.
  • The orientation of the axes 182x, 182y, 182z is obtained using the electromagnetic and/or optical tracking systems described above.
  • The 3D axis symbol 182 may be overlaid on the image 180a.
  • The second display screen 136b displays the image 180b from the second imaging modality, i.e., the second imaging device 166, which may be an ultrasound probe as described above.
  • The display screen 136b also shows a coordinate system of the second imaging modality, which is depicted using a 3D axis symbol 184 including the X-axis 184x, Y-axis 184y, and Z-axis 184z.
  • The orientation of the axes 184x, 184y, 184z is obtained using the electromagnetic and/or optical tracking systems described above.
  • The 3D axis symbol 184 may be overlaid on the image 180b.
  • Initially, the respective coordinate systems of the first and second imaging modalities are not aligned due to the different, i.e., misaligned, orientations of the corresponding imaging devices 112 and 166.
  • The coordinate system of the handle controllers 38a and 38b is also not aligned with either of the coordinate systems of the first and second imaging modalities. Accordingly, movement inputs at the handle controllers 38a and 38b in one direction may result in movement of the catheter 152 in the first and second images in completely different directions due to the misalignment.
  • FIG. 9 shows a method for using a single imaging modality of the navigation system 100 with the surgical robotic system 10.
  • The method may be embodied in software instructions stored in memory and executable by a processor of the workstation 134, the surgeon console 30, the control tower 20, etc.
  • Initially, the surgical robotic system 10 is initialized, which includes attaching the instrument 50 to the IDU 52 of the robotic arm 40.
  • The instrument 50 is controlled by user inputs through the handle controller 38a.
  • Pivoting the handle 701 about the Y-axis 600y controls the yaw motion (e.g., movement between right and left), and pivoting the handle 701 about the X-axis 600x controls the pitch motion (e.g., movement between up and down).
  • Rotation about the Z-axis 600z provides for roll control of the instrument.
  • Only some of the movements of the handle controller 38a may be mapped to the instrument 50. In other words, the instrument 50 may have fewer degrees of freedom than the handle controller 38a.
  • The instrument 50 is advanced to the surgical site, which may be done by moving the holder 46 or by other actuation means of the instrument 50.
  • The first imaging device 112 is also activated to image the surgical site and the instrument 50 as it is advanced.
  • The instrument 50 is advanced until the instrument 50 is within the imaging frame of the first imaging device 112. At this point, the instrument 50 is movable according to the original coordinate system of the handle controller 38a described above with respect to step 200.
  • The instrument 50, and by extension the handle controller 38a, is then registered to the coordinate system of the first imaging device 112.
  • Registration may be performed using the various tracking systems, e.g., the electromagnetic tracking system 144, the optical tracking system 170, etc., to identify the location and orientation of the first imaging device 112.
  • A GUI may be displayed as an overlay on the image 190.
  • The GUI may show the resulting movement of the instrument 50 prior to actual movement thereof.
  • The system 10 may display an indicator, which may be a virtual representation 50' of the instrument 50 showing where the instrument 50 will move to and/or an arrow 198 corresponding to the movement.
  • The GUI may display a confirmation prompt along with the movement indicator, asking the user to confirm that the indicated movement corresponds to the desired movement.
  • FIG. 10 shows a method for using multiple imaging modalities of the navigation system 100.
  • The method may also be embodied in software instructions stored in memory and executable by a processor of the workstation 134, the surgeon console 30, the control tower 20, etc.
  • Initially, the surgical robotic system 10 is initialized, which includes attaching the instrument 50 to the IDU 52 of the robotic arm 40.
  • The instrument 50 is controlled by user inputs through the handle controller 38a as described above with respect to the method of FIG. 9.
  • The instrument 50 is advanced to the surgical site, which may be done by moving the holder 46 or by other actuation means of the instrument 50.
  • The first imaging device 112 and the second imaging device 166 are also activated to image the surgical site and the instrument 50 as it is advanced, providing dual modality imaging, e.g., fluoroscopy and ultrasound.
  • The instrument 50 is advanced until the instrument 50 is within the imaging frames of the first imaging device 112 and the second imaging device 166. At this point, the instrument 50 is movable according to the original coordinate system of the handle controller 38a described above with respect to step 300.
  • The instrument 50 is then registered to the coordinate system of each of the first imaging device 112 and the second imaging device 166.
  • Registration may be performed using the various tracking systems, e.g., the electromagnetic tracking system 144, the optical tracking system 170, etc., to identify the location and orientation of the instrument 50 relative to the first imaging device 112 and the second imaging device 166.
  • The registration status may be displayed using the 3D axis symbols as shown in FIG. 8.
  • The first image 180a from the first imaging device 112 shows a coordinate system of the first imaging modality, which is depicted using the 3D axis symbol 182 as an overlay on the image 180a.
  • The second image 180b from the second imaging device 166 shows a coordinate system of the second imaging modality, which is depicted using the 3D axis symbol 184.
  • The orientation of the coordinate systems of the first and second modalities is obtained using the electromagnetic and/or optical tracking systems described above.
  • The coordinate systems of the first and second imaging modalities are then aligned. This may be accomplished by moving one or both of the imaging devices 112 and 166 until the coordinate systems are aligned. Movement of the imaging devices 112 and 166 is performed by the robotic arms 116 or 40, respectively. Alignment of the coordinate systems may be indicated by the 3D axis symbols 182 and 184 being aligned at step 308. After the two imaging modalities are aligned, the method of FIG. 9 may be performed to align the movement control system, i.e., the handle controller 38a, with both imaging modalities as described above.
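The "move until aligned" step can be framed as driving a rotation error to zero. Below is a sketch that extracts the axis-angle correction one device would need; it covers pure rotation only, and how that correction maps onto the arm's joint motion is assumed away here.

```python
import numpy as np

def axis_angle_correction(R_first, R_second):
    """Axis-angle rotation (axis * angle, radians) that, applied to the
    second imaging device's frame, brings it onto the first's.
    Valid away from 0 and 180 degrees of misalignment."""
    R_err = R_first @ R_second.T
    angle = np.arccos(np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        return np.zeros(3)                     # already aligned
    axis = np.array([R_err[2, 1] - R_err[1, 2],
                     R_err[0, 2] - R_err[2, 0],
                     R_err[1, 0] - R_err[0, 1]]) / (2.0 * np.sin(angle))
    return angle * axis

# Example: second device misaligned 90 degrees about Z; the correction
# is a 90-degree rotation about -Z.
R1 = np.eye(3)
R2 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
print(axis_angle_correction(R1, R2))   # -> [ 0.  0. -1.5708]
```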
  • The methods of FIGS. 9 and 10 are suitable for use during cardiac procedures, where navigation and guidance of flexible instruments are particularly challenging.
  • The navigation system 100 is also configured to compensate for the effects of respiration and the beating heart that can normally complicate navigation of instruments in and around the heart.
  • Various patient parameters may be monitored, such as blood pressure using a pressure monitor, blood oxygenation using a pulse oximeter, heart activity using an electrocardiogram monitor, etc.
  • The imaging devices 112 and 166 may be operated based on the measured patient parameters.
  • In particular, the imaging process may be synced to the physiological activity of the patient, such that the imaging device 112 and/or 166 captures images at the desired synced time.
  • The physiological activity may be the beating heart, which is identified by the electrocardiogram (ECG).
  • Diastole is the period of time between contractions of the atria or the ventricles during which blood enters the relaxed chambers from systemic circulation and the lungs. Diastole is often measured as the blood pressure at the instant of maximum cardiac relaxation, making this time period suitable for imaging.
  • Alternatively, image compensation may be accomplished by displaying only images obtained during a specific time period, e.g., diastole. Thus, imaging may be performed continuously, but only image frames captured during certain time periods are displayed.
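A sketch of that display-side gating follows, assuming frames arrive tagged with a normalized cardiac phase; the phase convention and the diastole window are illustrative, not values from the disclosure.

```python
def gated_frames(tagged_frames, window=(0.40, 0.70)):
    """Yield only the image frames whose cardiac phase falls inside the
    chosen window. Phase is in [0, 1) with 0 at the R-wave (illustrative
    convention); imaging runs continuously, but only these frames are
    passed on for display."""
    lo, hi = window
    for phase, image in tagged_frames:
        if lo <= phase <= hi:
            yield image

# Example: keep only the mid-cycle (diastolic) frames.
frames = [(0.10, "f1"), (0.50, "f2"), (0.65, "f3"), (0.90, "f4")]
print(list(gated_frames(frames)))   # -> ['f2', 'f3']
```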
  • Example 1 A surgical robotic system comprising: a first robotic arm including a first instrument drive unit and an instrument coupled to and actuatable by the first instrument drive unit; a surgeon console including a handle controller for receiving a user input to move the instrument in a first coordinate system; a second robotic arm including a second instrument drive unit and an imaging device coupled to and actuatable by the second instrument drive unit, the imaging device configured to capture an image of the instrument using an imaging modality in a second coordinate system, different from the first coordinate system; a tracking system for obtaining tracking data of the instrument and the imaging device; and a processor configured to: register the instrument in the second coordinate system based on the tracking data; align the first coordinate system with the second coordinate system; and receive inputs controlling the instrument in the second coordinate system.
  • Example 2 The surgical robotic system according to Example 1, wherein the imaging device is selected from the group consisting of an ultrasound probe and a fluoroscope, and the instrument is selected from the group consisting of a catheter and a flexible scope.
  • Example 3 The surgical robotic system according to Example 1, wherein the handle controller includes a plurality of directional inputs.
  • Example 4 The surgical robotic system according to Example 3, wherein the processor is configured to map the plurality of directional inputs from the first coordinate system to the second coordinate system.
  • Example 5 The surgical robotic system according to Example 1, wherein the surgeon console further includes a display screen for displaying the image captured by the imaging device.
  • Example 6 The surgical robotic system according to Example 5, wherein the processor is configured to display a first 3D axis symbol corresponding to the first coordinate system and a second 3D axis symbol corresponding to the second coordinate system.
  • Example 9 The surgical robotic system according to Example 8, wherein the movement indicator is at least one of a virtual representation or an arrow.
  • Example 12 The method according to Example 11, further comprising mapping the plurality of directional inputs from the first coordinate system to the second coordinate system.
  • Example 13 The method according to Example 10, further comprising displaying the image captured by the imaging device on a display screen.
  • Example 14 The method according to Example 13, further comprising displaying a first 3D axis symbol corresponding to the first coordinate system and a second 3D axis symbol corresponding to the second coordinate system.
  • Example 15 The method according to Example 14, further comprising aligning the second 3D axis symbol to the first 3D axis symbol when the first coordinate system is aligned with the second coordinate system.
  • Example 16 A method for controlling a medical instrument in a plurality of different imaging modalities, the method comprising: capturing a first image of an instrument through a first imaging device using a first imaging modality in a first coordinate system, wherein the first imaging device is coupled to a first instrument drive unit of a first robotic arm; capturing a second image of the instrument through a second imaging device using a second imaging modality in a second coordinate system, wherein the second imaging device is coupled to a second instrument drive unit of a second robotic arm, the second imaging modality is different from the first imaging modality, and the second coordinate system is different from the first coordinate system; obtaining tracking data of the instrument, the first imaging device, and the second imaging device using a tracking system; registering the instrument in the first coordinate system and the second coordinate system based on the tracking data; and aligning the first coordinate system with the second coordinate system. (A registration sketch also follows these Examples.)
  • Example 17 The method according to Example 16, further comprising displaying the first image captured by the first imaging device side-by-side with the second image captured by the second imaging device.
  • Example 18 The method according to Example 17, further comprising displaying a first 3D axis symbol corresponding to the first coordinate system and a second 3D axis symbol corresponding to the second coordinate system.
  • Example 19 The method according to Example 18, further comprising aligning the second 3D axis symbol to the first 3D axis symbol when the first coordinate system of the first imaging device is aligned with the second coordinate system of the second imaging device.
  • Example 20 The method according to Example 16, wherein aligning the first coordinate system with the second coordinate system includes moving at least one of the first robotic arm or the second robotic arm.
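
As referenced in Example 4, mapping directional inputs from the handle controller's coordinate system into the imaging device's coordinate system can be expressed as a change of basis. The sketch below assumes both frames are known as rotation matrices relative to a common world frame from the tracking system; all names and the example rotation are hypothetical.

```python
# Minimal sketch (hypothetical names): re-express a directional input given
# in the handle controller's frame (the first coordinate system) in the
# imaging device's frame (the second coordinate system).
import numpy as np

def map_direction(direction_first: np.ndarray,
                  r_first_to_world: np.ndarray,
                  r_second_to_world: np.ndarray) -> np.ndarray:
    """Map a direction vector from the first frame into the second frame."""
    world_dir = r_first_to_world @ direction_first  # first frame -> world
    return r_second_to_world.T @ world_dir          # world -> second frame

# Example: an "up" input on the handle controller, with the imaging frame
# rotated 90 degrees about the world z-axis (illustrative values only).
r_handle = np.eye(3)
r_imaging = np.array([[0.0, -1.0, 0.0],
                      [1.0,  0.0, 0.0],
                      [0.0,  0.0, 1.0]])
up_in_imaging = map_direction(np.array([0.0, 0.0, 1.0]), r_handle, r_imaging)
```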
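
Registering the instrument in an imaging device's coordinate system, as recited in Examples 1 and 16, amounts to composing rigid transforms obtained from the tracking data. This is a minimal sketch assuming the tracking system reports poses as 4×4 homogeneous transforms in its world frame; the names are hypothetical.

```python
# Minimal sketch (hypothetical names): express the tracked instrument pose
# in an imaging device's coordinate system using 4x4 homogeneous transforms.
import numpy as np

def invert_transform(t: np.ndarray) -> np.ndarray:
    """Invert a rigid transform using its rotation/translation structure."""
    r, p = t[:3, :3], t[:3, 3]
    t_inv = np.eye(4)
    t_inv[:3, :3] = r.T
    t_inv[:3, 3] = -r.T @ p
    return t_inv

def register_instrument(t_instrument_world: np.ndarray,
                        t_imager_world: np.ndarray) -> np.ndarray:
    """Instrument pose expressed in the imaging device's coordinate system."""
    return invert_transform(t_imager_world) @ t_instrument_world
```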

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Robotics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

A surgical robotic system includes a first robotic arm provided with a first instrument drive unit and an instrument coupled to and actuatable by the first instrument drive unit, and a surgeon console having a handle controller configured to receive a user input for moving the instrument in a first coordinate system. The system also includes a second robotic arm having a second instrument drive unit and an imaging device coupled to and actuatable by the second instrument drive unit. The imaging device is configured to capture an image of the instrument using an imaging modality in a second coordinate system, different from the first coordinate system. A tracking system is also provided for obtaining tracking data of the instrument and the imaging device. The surgical robotic system further includes a processor configured to register the instrument in the second coordinate system based on the tracking data, align the first coordinate system with the second coordinate system, and receive inputs controlling the instrument in the second coordinate system.
PCT/IB2025/053537 2024-04-04 2025-04-03 Surgical robotic system configured to control a steerable instrument using multiple imaging modalities Pending WO2025210565A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463574401P 2024-04-04 2024-04-04
US63/574,401 2024-04-04

Publications (1)

Publication Number Publication Date
WO2025210565A1 (fr) 2025-10-09

Family

ID=95399286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2025/053537 Pending WO2025210565A1 (fr) Surgical robotic system configured to control a steerable instrument using multiple imaging modalities

Country Status (1)

Country Link
WO (1) WO2025210565A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190314097A1 (en) * 2016-07-14 2019-10-17 Intuitive Surgical Operation, Inc. Secondary instrument control in a computer-assisted teleoperated system
EP3033132B1 (fr) * 2013-08-15 2021-01-06 Intuitive Surgical Operations, Inc. Interface utilisateur graphique pour le positionnement et l'insértion de cathéter
US20210393344A1 (en) * 2020-06-22 2021-12-23 Auris Health, Inc. Control scheme calibration for medical instruments

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 25719124

Country of ref document: EP

Kind code of ref document: A1