WO2025229542A1 - Target localization for percutaneous access - Google Patents
- Publication number
- WO2025229542A1 (PCT/IB2025/054474)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target
- instrument
- image
- anatomy
- location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B2034/301—Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B2090/306—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61B2090/3614—Image-producing devices, e.g. surgical cameras using optical fibre
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/35—Surgical robots for telesurgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Leader-follower robots
Definitions
- This disclosure relates generally to medical systems, and specifically to target localization techniques for percutaneous access.
- a ureteroscope includes an endoscope at its distal end configured to enable visualization of the urinary tract.
- the ureteroscope can be used to designate or set a target location for a needle to access the kidney percutaneously (also referred to as “target selection” or “target localization”).
- the physician drives the needle into the patient, to the target location, and proceeds to dilate the tract and perform a PCNL procedure.
- the physician may use another medical instrument (which may be in conjunction with the needle) to extract the stone from the kidney via the percutaneous access point.
- One innovative aspect of the subject matter of this disclosure can be implemented in a method of designating a target for percutaneous access.
- the method includes steps of receiving sensor data from one or more sensors disposed on an instrument within an anatomy, where the sensor data indicates a pose of the instrument in a first coordinate space; receiving an image depicting a portion of the anatomy in a field-of-view (FOV) of a camera disposed on or proximate to a distal end of the instrument; determining a location on the image associated with an anatomical feature; and determining a position of the target in the first coordinate space based at least in part on the location on the image associated with the anatomical feature.
- another innovative aspect of the subject matter of this disclosure can be implemented in a controller for a medical system including a processing system and a memory.
- the memory stores instructions that, when executed by the processing system, cause the controller to receive sensor data from one or more sensors disposed on an instrument within an anatomy, where the sensor data indicates a pose of the instrument in a first coordinate space; receive an image depicting a portion of the anatomy in an FOV of a camera disposed on or proximate to a distal end of the instrument; determine a location on the image associated with an anatomical feature; and determine a position of a target for percutaneous access in the first coordinate space based at least in part on the location on the image associated with the anatomical feature.
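For illustration, the sketch below restates the claimed data flow (receive sensor data, receive an image, determine a location on the image, determine a target position) as a small Python class. The names (`Pose`, `TargetLocalizer`), the delegate callables, and the 5 mm standoff are assumptions made for this example only; they are not the disclosed controller's actual API.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    position: np.ndarray   # 3D position of the instrument in the first (sensor) coordinate space
    heading: np.ndarray    # unit vector describing the instrument heading in the same space

class TargetLocalizer:
    """Hypothetical controller sketch mirroring the four claimed steps."""
    def __init__(self, image_location_picker, image_point_to_sensor_ray, standoff_mm=5.0):
        self.pick_location = image_location_picker        # e.g., a GUI reticle callback (assumed)
        self.to_sensor_ray = image_point_to_sensor_ray    # maps an image point to a ray in sensor space (assumed)
        self.standoff_mm = standoff_mm                    # assumed threshold distance along the ray

    def designate_target(self, sensor_pose: Pose, image: np.ndarray) -> np.ndarray:
        """Return a target position in the first (sensor) coordinate space."""
        # Steps 1-2: the sensor data (pose) and camera image arrive as arguments.
        u, v = self.pick_location(image)                       # Step 3: location on the image
        ray = self.to_sensor_ray(u, v)                         # direction of that location in sensor space
        return sensor_pose.position + self.standoff_mm * ray   # Step 4: target position
```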
- FIG. 1 shows an example medical system, according to some implementations.
- FIG. 2 shows another example medical system, according to some implementations.
- FIGS. 3-5 show an example percutaneous access procedure that can be performed using the medical system of FIG. 1.
- FIG. 6 shows a block diagram of an example target localization system, according to some implementations.
- FIGS. 7A-7D show various states of an example graphical user interface (GUI) for designating a percutaneous access target, according to some implementations.
- FIGS. 8A and 8B show an example operation for converting a two-dimensional (2D) target location in an image coordinate space to a three-dimensional (3D) target position in a sensor coordinate space.
- FIG. 9 shows a block diagram of an example controller for a medical system, according to some implementations.
- FIG. 10 shows an illustrative flowchart depicting an example operation for designating a target for percutaneous access, according to some implementations.
- Certain standard anatomical terms of location may be used herein to refer to the anatomy of animals, and namely humans, with respect to the example implementations.
- while certain spatially relative terms, such as “outer,” “inner,” “upper,” “lower,” “below,” “above,” “vertical,” “horizontal,” “top,” “bottom,” and similar terms, are used herein to describe a spatial relationship of one element, device, or anatomical structure to another, it is understood that these terms are used for ease of description to describe the positional relationship between elements and structures as illustrated in the drawings. It should be understood that spatially relative terms are intended to encompass different orientations of the elements or structures, in use or operation, in addition to the orientations depicted in the drawings.
- an element or structure described as “above” another element or structure may represent a position that is below or beside such other element or structure with respect to alternate orientations of the subject patient, element, or structure, and vice-versa.
- the term “patient” may generally refer to humans, anatomical models, simulators, cadavers, and other living or non-living objects.
- a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, or may be performed using hardware, using software, or using a combination of hardware and software.
- various illustrative components, blocks, modules, circuits, and steps have been described below generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
- the example systems or devices may include components other than those shown, including well-known components such as a processor, memory and the like.
- the techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium including instructions that, when executed, perform one or more of the methods described herein.
- the non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
- the non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random-access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like.
- the techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, or executed by a computer or other processor.
- the terms “processor” and “processing system” may refer to any general-purpose processor, special-purpose processor, conventional processor, controller, microcontroller, or state machine capable of executing scripts or instructions of one or more software programs stored in memory.
- a physician can insert a scope (such as a ureteroscope) into the urinary tract through the urethra.
- a ureteroscope includes an endoscope at its distal end configured to enable visualization of the urinary tract.
- the ureteroscope can be used to designate or set a target location for a needle to access the kidney percutaneously (also referred to as “target selection” or “target localization”).
- target location refers to any point or position in a world coordinate system, which may or may not coincide with a physical object.
- a target location can be designated within a calyx, midway between a papilla and the distal end of the ureteroscope. The physician drives the needle into the patient, to the target location, and proceeds to dilate the tract and perform a PCNL procedure.
- a target localization system may generate a graphical user interface (GUI) that can be used to designate a target for percutaneous access on a real-time image of an anatomy.
- the image may depict a field-of-view (FOV) of a camera disposed on an instrument (such as a scope) within the anatomy.
- the GUI may include an interactive feature (such as a reticle) for selecting a target location on the image responsive to user input.
- the target localization system also receives sensor data indicating a pose (including a position and heading) of the instrument and determines a position of the target relative to the pose of the instrument based on the target location on the image.
- the target localization system can convert the target location on the image to a point in a coordinate space associated with the camera (also referred to as a “camera space”) based on intrinsic parameters (such as a focal length and optical center) of the camera.
- the target localization system can determine a ray that originates from the center of the camera and intersects the point in the camera space and can further convert the ray from the camera space to a coordinate space associated with the sensor data (also referred to as a “sensor space”).
- the conversion from camera space to sensor space can be performed using a calibration matrix (such as for hand-eye calibration) that maps any point or vector in the camera space to a respective point or vector in the sensor space.
- the target localization system may set the position of the target as a vector having a fixed relationship to the pose of the instrument based on the ray in the sensor space.
- the vector may extend a threshold distance from a position of the instrument along a direction of the ray.
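A minimal numerical sketch of this conversion is shown below, assuming a standard pinhole camera model for the intrinsics and a 4x4 homogeneous matrix for the hand-eye calibration. The matrix values, the pixel coordinates, and the 5 mm threshold distance are placeholders for illustration, not values from the disclosure.

```python
import numpy as np

K = np.array([[400.0,   0.0, 320.0],   # fx,  0, cx  (assumed camera intrinsics, in pixels)
              [  0.0, 400.0, 240.0],   #  0, fy, cy
              [  0.0,   0.0,   1.0]])

T_cam_to_sensor = np.eye(4)            # assumed hand-eye calibration: camera space -> sensor space

def pixel_to_camera_ray(u, v):
    """Back-project an image location to a unit ray from the camera center (camera space)."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)

def camera_ray_to_sensor(ray_cam):
    """Rotate a camera-space direction into the sensor coordinate space."""
    return T_cam_to_sensor[:3, :3] @ ray_cam

def target_in_sensor_space(u, v, instrument_pos_sensor, standoff_mm=5.0):
    """Place the target a fixed distance from the instrument along the back-projected ray."""
    ray_sensor = camera_ray_to_sensor(pixel_to_camera_ray(u, v))
    return instrument_pos_sensor + standoff_mm * ray_sensor

# Example: a target selected at the image center, with the instrument at the sensor-space origin.
print(target_in_sensor_space(320, 240, np.zeros(3)))   # -> [0. 0. 5.]
```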
- the GUI of the present implementations can reduce and/or streamline the workflow steps that require careful movement and positioning of instruments inside an anatomy. For example, a physician can navigate a scope directly to a park position and designate a target for percutaneous access using the GUI without having to touch or tag the scope to any portion of the anatomy.
- the GUI also enables the physician to visualize the position of the target in the camera space (in contrast to existing target localization techniques which only provide the position of the target in the sensor space).
- respiration may cause movement of the instrument in the sensor space.
- the image displayed on the GUI may exhibit little or no change due to respiration, thereby allowing the physician to designate a target more precisely or accurately.
- respiration may cause movement of the anatomy in the camera space (such as while the instrument appears relatively stationary in the sensor space).
- the physician may use their clinical judgement to compensate and/or adjust for the movement of the anatomy.
- the physician may suspend the patient’s breathing for at least part of the percutaneous access procedure. For example, during a percutaneous access procedure, the patient is often under general anesthesia and allowed to breathe through a ventilator. In some aspects, the physician may use the ventilator to force a breath-hold by the patient.
- robotic tools may engage or control one or more medical instruments (such as an endoscope) to access a target site within a patient’s anatomy or perform a treatment at the target site.
- the robotic tools may be guided or controlled by a physician.
- the robotic tools may operate in an autonomous or semi-autonomous manner.
- although the systems and techniques are described herein in the context of robotic-assisted medical procedures, they may be applicable to other types of medical procedures that utilize camera and/or sensor data (such as procedures that do not rely on robotic tools or only utilize robotic tools in a very limited capacity).
- the systems and techniques described herein may be applicable to medical procedures that rely on manually operated medical instruments (such as an endoscope that is exclusively controlled and operated by a physician).
- the systems and techniques described herein also may be applicable beyond the context of medical procedures (such as in simulated environments or laboratory settings, such as with models or simulators, among other examples).
- FIG. 1 shows an example medical system 100, according to some implementations.
- the medical system 100 includes a robotic system 110 configured to engage with and/or control a medical instrument 120 to perform a procedure on a patient 130.
- the medical system 100 also includes a control system 140 configured to interface with the robotic system 110, provide information regarding the procedure, and/or perform a variety of other operations.
- the control system 140 can include one or more displays 142 to present certain information to assist the physician 160.
- the medical system 100 can include a table 150 configured to hold the patient 130.
- the system 100 further includes an electromagnetic (EM) field generator 180, which can be held by one or more robotic arms 112 of the robotic system 110 or can be a stand-alone device.
- the medical system 100 is shown to include an imaging device 190 which can be integrated into a C-arm or otherwise configured to provide imaging during a procedure, such as for a fluoroscopy-type procedure.
- the medical system 100 may not include the imaging device 190.
- the medical system 100 may be used to perform a percutaneous access procedure.
- the physician 160 can perform a procedure to remove the kidney stone through a percutaneous access point on the patient 130.
- the physician 160 can interact with the control system 140 to control the robotic system 110 to advance and navigate the medical instrument 120 (such as an endoscope) from the urethra, through the bladder, up the ureter, and into the kidney where the stone is located.
- the control system 140 can provide information via the display(s) 142 regarding the medical instrument 120 to assist the physician 160 in navigating the medical instrument 120, such as real-time images captured therewith.
- the medical instrument 120 can be used to designate or tag a target location for the medical instrument 170 to access the kidney percutaneously (such as a desired point to access the kidney).
- the physician 160 can designate a particular papilla as the target location for entering into the kidney with the medical instrument 170.
- other target locations can be designated or determined.
- the control system 140 may provide a graphical interface 144, which can include a visualization to indicate an alignment of an orientation of the medical instrument 170 relative to a target trajectory (such as a desired access path), a visualization to indicate a progress of inserting the medical instrument 170 towards the target location, and/or other information.
- a percutaneous procedure can be performed without the assistance of the medical instrument 120.
- the medical system 100 can be used to perform a variety of other procedures.
- the medical instrument 170 can alternatively be used by a component of the medical system 100.
- the medical instrument 170 can be held or manipulated by the robotic system 110 (such as the one or more robotic arms 112) and the techniques discussed herein can be implemented to control the robotic system 110 to insert the medical instrument 170 with the appropriate orientation to reach a target location.
- the medical instrument 120 is implemented as a scope (such as an endoscope) and the medical instrument 170 is implemented as a needle.
- the medical instrument 120 is referred to as “the scope” or “the lumen-based medical instrument,” and the medical instrument 170 is referred to as “the needle” or “the percutaneous medical instrument.”
- the medical instrument 120 and the medical instrument 170 can each be implemented as any suitable type of medical instrument including, for example, a scope, a needle, a catheter, a guidewire, a lithotripter, a basket retrieval device, forceps, a vacuum, a scalpel, an imaging probe, jaws, scissors, graspers, a needle holder, a micro dissector, a staple applier, a tacker, a suction or irrigation tool, or a clip applier, among other examples.
- a medical instrument may be a steerable device. In some other implementations, a medical instrument may be a non-steerable device.
- a surgical tool may refer to any device that is configured to puncture or be inserted through the human anatomy, such as a needle, a scalpel, or a guidewire, among other examples. However, a surgical tool can refer to other types of medical instruments.
- a medical instrument, such as the scope 120 and/or the needle 170, may include a sensor that is configured to generate sensor data, which can be sent to another device.
- the sensor data may indicate a pose (including a location and/or orientation) of the medical instrument and/or can be used to determine a pose of the medical instrument.
- a sensor can include an electromagnetic (EM) sensor with a coil of conductive material.
- the EM field generator 180 can provide an EM field that is detected by the EM sensor on the medical instrument.
- the magnetic field can induce small currents in coils of the EM sensor, which can be analyzed to determine a distance and/or angle or orientation between the EM sensor and the EM field generator.
- a medical instrument can include other types of sensors configured to generate sensor data, such as a camera, a range sensor, a radar device, a shape sensing fiber, an accelerometer, a gyroscope, a satellite-based positioning sensor (such as a global positioning system (GPS)), or a radio-frequency transceiver, among other examples.
- a sensor may be positioned on a distal end of a medical instrument.
- a sensor on a medical instrument may provide sensor data to the control system 140 and the control system 140 may perform one or more localization techniques to determine or track a position and/or an orientation of the medical instrument.
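As one hedged illustration of such a localization technique, the sketch below smooths raw EM position readings with an exponential moving average before reporting an instrument position. The filter and its coefficient are assumptions for this example; they are not presented as the control system's actual algorithm.

```python
import numpy as np

class EmPositionTracker:
    """Toy tracker: exponentially smoothed position estimate from raw EM readings."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha        # smoothing factor: larger values track new readings more quickly
        self.position = None      # last filtered 3D position, in the sensor coordinate space

    def update(self, raw_position):
        """Blend a new EM reading into the running position estimate and return it."""
        raw_position = np.asarray(raw_position, dtype=float)
        if self.position is None:
            self.position = raw_position
        else:
            self.position = (1 - self.alpha) * self.position + self.alpha * raw_position
        return self.position

# Example: noisy readings around (10, 20, 30) mm converge toward that point.
tracker = EmPositionTracker()
for reading in ([10.2, 19.8, 30.1], [9.9, 20.3, 29.8], [10.0, 20.0, 30.0]):
    print(tracker.update(reading))
```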
- the terms “scope” and “endoscope” are used herein according to their broad and ordinary meanings and can refer to any type of elongate medical instrument having image generating, viewing, and/or capturing functionality and configured to be introduced into any type of organ, cavity, lumen, chamber, and/or space of a body.
- references herein to scopes or endoscopes can refer to a ureteroscope (such as for accessing the urinary tract), a laparoscope, a nephroscope (such as for accessing the kidneys), a bronchoscope (such as for accessing an airway, such as the bronchus), a colonoscope (such as for accessing the colon), an arthroscope (such as for accessing a joint), a cystoscope (such as for accessing the bladder), or a borescope, among other examples.
- a scope can comprise a tubular and/or flexible medical instrument that is configured to be inserted into the anatomy of a patient to capture images of the anatomy.
- a scope may accommodate wires and/or optical fibers to transfer signals to or from an optical assembly and a distal end of the scope, which can include an imaging device, such as an optical camera.
- the camera or imaging device can be used to capture images of an internal anatomical space, such as a calyx or papilla of a kidney.
- a scope can further accommodate optical fibers to carry light from proximately-located light sources, such as light-emitting diodes, to the distal end of the scope.
- the distal end of the scope can include ports for light sources to illuminate an anatomical space when using the camera or imaging device.
- the scope may be controlled by a robotic system, such as the robotic system 110.
- the imaging device can comprise an optical fiber, fiber array, and/or lens.
- the optical components can move along with the tip of the scope such that movement of the tip of the scope results in changes to the images captured by the imaging device.
- a scope can be articulable, such as with respect to at least a distal portion of the scope, so that the scope can be steered within the human anatomy.
- a scope may be articulated with, for example, five or six degrees of freedom, including X, Y, Z coordinate movement, as well as pitch, yaw, and roll.
- a position sensor(s) of the scope can likewise have similar degrees of freedom with respect to the position information they produce or provide.
- a scope can include telescoping parts, such as an inner leader portion and an outer sheath portion, which can be manipulated to telescopically extend the scope.
- a scope may comprise a rigid or flexible tube configured to be passed within an outer sheath, catheter, introducer, or other lumen-type device, or can be used without such devices.
- a scope may include a working channel for deploying medical instruments (such as lithotripters, basketing devices, or forceps), irrigation, and/or aspiration to an operative region at a distal end of the scope.
- the robotic system 110 can be configured to at least partly facilitate execution of a medical procedure.
- the robotic system 110 can be arranged in a variety of ways depending on the particular procedure.
- the robotic system 110 can include the one or more robotic arms 112 configured to engage with and/or control the scope 120 to perform a procedure.
- each robotic arm 112 can include multiple arm segments coupled to joints, which can provide multiple degrees of movement.
- the robotic system 110 is positioned proximate to the patient’s legs and the robotic arms 112 are actuated to engage with and position the scope 120 for access into an access point, such as the urethra of the patient 130.
- the scope 120 can be inserted into the patient 130 robotically using the robotic arms 112, manually by the physician 160, or a combination thereof.
- the robotic arms 112 also can be connected to the EM field generator 180, which can be positioned near a treatment site, such as within proximity to the kidneys of the patient 130.
- the robotic system 110 can include a support structure 114 coupled to the one or more robotic arms 112.
- the support structure 114 can include control electronics or circuitry, one or more power sources, one or more pneumatics, one or more optical sources, one or more actuators (such as motors to move the one or more robotic arms 112), memory or data storage, and/or one or more communication interfaces.
- the support structure 114 includes an input/output (I/O) device(s) 116 configured to receive input, such as user input to control the robotic system 110, and/or provide output, such as a graphical user interface (GUI), information regarding the robotic system 110, or information regarding a procedure, among other examples.
- the I/O device(s) 116 can include a display, a touchscreen, a touchpad, a projector, a mouse, a keyboard, a microphone, a speaker, etc.
- the robotic system 110 is movable (such as the support structure 114 includes wheels) so that the robotic system 110 can be positioned in a location that is appropriate or desired for a procedure.
- the robotic system 110 is a stationary system. Further, in some implementations, the robotic system 110 is integrated into the table 150.
- the robotic system 110 can be coupled to any component of the medical system 100, such as the control system 140, the table 150, the EM field generator 180, the scope 120, and/or the needle 170.
- the robotic system is communicatively coupled to the control system 140.
- the robotic system 110 can be configured to receive a control signal from the control system 140 to perform an operation, such as to position a robotic arm 112 in a particular manner, or manipulate the scope 120, among other examples.
- the robotic system 110 can control a component of the robotic system 110 to perform the operation.
- the robotic system 110 is configured to receive an image from the scope 120 depicting internal anatomy of the patient 130 and/or send the image to the control system 140, which can then be displayed on the display(s) 142.
- the robotic system 110 is coupled to a component of the medical system 100, such as the control system 140, in such a manner as to allow for fluids, optics, power, or the like to be received therefrom.
- the control system 140 can be configured to provide various functionality to assist in performing a medical procedure.
- the control system 140 can be coupled to the robotic system 110 and operate in cooperation with the robotic system 110 to perform a medical procedure on the patient 130.
- the control system 140 can communicate with the robotic system 110 via a wireless or wired connection (such as to control the robotic system 110 and/or the scope 120, receive images captured by the scope 120), provide power to the robotic system 110 via one or more electrical connections, or provide optics to the robotic system 110 via one or more optical fibers or other components, among other examples.
- the control system 140 may communicate with the needle 170 and/or the scope 120 to receive sensor data from the needle 170 and/or the scope 120 (via the robotic system 110 and/or directly from the needle 170 and/or the scope 120).
- the control system 140 may communicate with the table 150 to position the table 150 in a particular orientation or otherwise control the table 150.
- the control system 140 may communicate with the EM field generator 180 to control generation of an EM field around the patient 130.
- the control system 140 includes various I/O devices configured to assist the physician 160 or others in performing a medical procedure.
- the control system 140 includes an I/O device(s) 146 that is employed by the physician 160 or other user to control the scope 120, such as to navigate the scope 120 within the patient 130.
- the physician 160 can provide input via the I/O device(s) 146 and, in response, the control system 140 can send control signals to the robotic system 110 to manipulate the scope 120.
- the I/O device(s) 146 is illustrated as a controller in the example of FIG. 1, the I/O device(s) 146 can be implemented as a variety of types of I/O devices, such as a touchscreen, a touch pad, a mouse, or a keyboard, among other examples.
- the display(s) 142 can provide a graphical interface 144 to assist the physician 160 in manipulating the needle 170.
- the display(s) 142 can also provide (such as via the graphical interface 144 and/or another interface) information regarding the scope 120.
- the control system 140 can receive real-time images that are captured by the scope 120 and display the real-time images via the display(s) 142.
- the control system 140 can receive signals (such as analog, digital, electrical, acoustic or sonic, pneumatic, tactile, or hydraulic signals) from a medical monitor and/or a sensor associated with the patient 130, and the display(s) 142 can present information regarding the health or environment of the patient 130.
- Such information can include information that is displayed via a medical monitor including, for example, a heart rate (such as ECG or HRV), blood pressure or rate, muscle bio-signals (such as EMG), body temperature, blood oxygen saturation (such as SpO2), CO2, brain waves (such as EEG), or environmental temperatures, among other examples.
- the control system 140 can include various components or subsystems.
- the control system 140 can include control electronics or circuitry, as well as one or more power sources, pneumatics, optical sources, actuators, memory or data storage devices, and/or communication interfaces.
- the control system 140 may include control circuitry comprising a computer-based control system that is configured to store executable instructions, that when executed, cause various operations to be implemented.
- the control system 140 may be movable (such as in FIG. 1). In some other implementations, the control system 140 may be a stationary system.
- although certain functionality and components are described as part of the control system 140, any such functionality and/or components can be integrated into and/or performed by other systems and/or devices, such as the robotic system 110, the table 150, and/or the EM generator 180 (or even the scope 120 and/or the needle 170).
- FIG. 2 shows another example medical system 200, according to some implementations.
- the medical system 200 may be one example of the medical system 100 of FIG. 1.
- the medical system 200 is shown to include the robotic system 110 and the control system 140 of FIG. 1.
- the robotic system 110 includes an elongated support structure 114 (also referred to as a “column”), a robotic system base 25, and a console 13 at the top of the column 114.
- the column 114 may include one or more arm supports 17 (also referred to as a “carriage”) for supporting the deployment of the one or more robotic arms 112.
- the arm support 17 may include individually-configurable arm mounts that rotate along a perpendicular axis to adjust the base of the robotic arms 112 for better positioning relative to the patient.
- the robotic arms 112 may be configured to engage with and/or control the scope 120 and/or the needle 170 to perform one or more aspects of a medical procedure.
- a scope-advancement instrument coupling (such as an instrument device manipulator) can be attached to the distal portion of one of the arms 112, to facilitate robotic control or advancement of the scope 120, while another one of the arms 112 may have associated therewith an instrument coupling that is configured to facilitate advancement of the needle 170.
- the arm support 17 also includes a column interface that allows the arm support 17 to vertically translate along the column 114.
- the column interface can be connected to the column 114 through slots that are positioned on opposite sides of the column 114 to guide the vertical translation of the arm support 17.
- the slot contains a vertical translation interface to position and hold the arm support 17 at various vertical heights relative to the robotic system base 25.
- Vertical translation of the arm support 17 allows the robotic system 110 to adjust the reach of the robotic arms 112 to meet a variety of table heights, patient sizes, and physician preferences.
- the individually-configurable arm mounts on the arm support 17 can allow the robotic arm base 21 of the robotic arms 112 to be angled in a variety of configurations.
- the robotic arms 112 may generally comprise robotic arm bases 21 and end effectors 22, separated by a series of linkages 23 that are connected by a series of joints 24, each joint 24 comprising one or more independent actuators 217.
- Each actuator 217 may comprise an independently-controllable motor.
- Each independently-controllable joint 24 can provide an independent degree of freedom of movement to the robotic arm.
- each of the arms 112 has seven joints, and thus provides seven degrees of freedom, including “redundant” degrees of freedom.
- Redundant degrees of freedom allow the robotic arms 112 to position their respective end effectors 22 at a specific position, orientation, and trajectory in space using different linkage positions and joint angles. This allows for the system to position and direct a medical instrument from a desired point in space while allowing the physician to move the arm joints into a clinically advantageous position away from the patient to create greater access, while avoiding arm collisions.
- the robotic system base 25 balances the weight of the column 114, arm support 17, and arms 112 over the floor. Accordingly, the robotic system base 25 may house certain relatively heavier components, such as electronics, motors, power supply, as well as components that selectively enable movement or immobilize the robotic system.
- the robotic system base 25 can include wheel-shaped casters 28 that allow for the robotic system to easily move around the operating room prior to a procedure. After reaching the appropriate position, the casters 28 may be immobilized using wheel locks to hold the robotic system 110 in place during the procedure.
- a console 13 is positioned at the upper end of column 114 and can provide one or more I/O components 116, such as a user interface for receiving user input and a display screen (or a dual-purpose device such as, for example, a touchscreen) to provide the physician or user with pre-operative and intra-operative data.
- Example preoperative data may include pre-operative plans, navigation and mapping data derived from pre-operative computed tomography (CT) scans, and/or notes from pre-operative patient interviews.
- Example intra-operative data may include optical information provided from the tool, sensor and coordinate information from sensors, as well as vital patient statistics, such as respiration, heart rate, and/or pulse.
- the console 13 may be positioned and tilted to allow a physician to view the console 13, robotic arms 112, and patient while operating the console 13 from behind the robotic system 110.
- the end effector 22 of each of the robotic arms 112 may comprise an instrument device manipulator (IDM) 29, which may be attached using a mechanism changer interface (MCI).
- IDM 29 can be removed and replaced with a different type of IDM, for example, a first type of IDM may manipulate a scope, while a second type of IDM may manipulate a needle.
- Another type of IDM may be configured to hold an electromagnetic field generator (such as the EM field generator 180).
- An MCI can include connectors to transfer pneumatic pressure, electrical power, electrical signals, and/or optical signals from the robotic arm 112 to the IDM 29.
- the IDMs 29 may be configured to manipulate medical instruments, such as the scope 120, using techniques including, for example, direct drives, harmonic drives, geared drives, belts and pulleys, magnetic drives, and the like.
- the IDMs 29 can be attached to respective ones of the robotic arms 112, wherein the robotic arms 112 are configured to insert or retract the respective coupled medical instruments into or out of the treatment site.
- the robotic system 110 further includes power 219 and communication 214 interfaces (such as connectors) to transfer pneumatic pressure, electrical power, electrical signals, and/or optical signals from the robotic arms 112 to the IDMs 29.
- a user can manually manipulate a robotic arm 112 of the robotic system 110 without using electronic user controls. For example, during setup in a surgical operating room, a user may move the robotic arms 112 and/or any other medical instruments to provide desired access to a patient.
- the robotic system 110 may rely on force feedback and inertia control from the user to determine appropriate configuration of the robotic arms 112 and associated instrumentation.
- the medical system 100 can include control circuitry configured to perform certain functionality described herein, including control circuitry 211 of the robotic system 110 and/or control circuitry 251 of the control system 140. That is, the control circuitry of the medical system 100 may be part of the robotic system 110, the control system 140, or some combination thereof. Therefore, any reference herein to control circuitry may refer to circuitry embodied in a robotic system, a control system, or any other component of a medical system, such as the medical system 100 shown in FIG. 1.
- control circuitry is used herein according to its broad and ordinary meaning, and may refer to any collection of processors, processing circuitry, processing modules or units, chips, dies (such as semiconductor dies including one or more active and/or passive devices and/or connectivity circuitry), microprocessors, micro-controllers, digital signal processors, microcomputers, central processing units, field programmable gate arrays, programmable logic devices, state machines (such as hardware state machines), logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.
- Control circuitry referenced herein may further include one or more circuit substrates (such as printed circuit boards), conductive traces and vias, and/or mounting pads, connectors, and/or components.
- Control circuitry referenced herein may further comprise one or more storage devices, which may be embodied in a single memory device, a plurality of memory devices, and/or embedded circuitry of a device.
- Such data storage may comprise read-only memory, random access memory, volatile memory, nonvolatile memory, static memory, dynamic memory, flash memory, cache memory, data storage registers, and/or any device that stores digital information.
- in implementations in which control circuitry comprises a hardware and/or software state machine, analog circuitry, digital circuitry, and/or logic circuitry, data storage device(s) or register(s) storing any associated operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
- the control circuitry 211 and/or 251 may comprise a computer-readable medium storing, and/or configured to store, hard-coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in one or more of the present figures and/or implementations described herein. Such computer-readable medium can be included in an article of manufacture in some instances.
- the control circuitry 211 and/or 251 may be locally maintained on the robotic system 110 or the control system 140 or may be remotely located at least in part (such as communicatively coupled indirectly via a local area network and/or a wide area network). Any of the control circuitry 211 and/or 251 may be configured to perform any aspect(s) of the various processes disclosed herein.
- the control circuitry 211 may be integrated with the base 25, column 114, and/or console 13 of the robotic system 110, and/or another system communicatively coupled to the robotic system 110.
- with respect to the control system 140, at least a portion of the control circuitry 251 may be integrated with a console base 51 and/or display 142 of the control system 140. It should be understood that any description herein of functional control circuitry or associated functionality may be embodied in the robotic system 110, the control system 140, or any combination thereof, and/or at least in part in one or more other local or remote systems or devices.
- the control system 140 can include various I/O components 258 configured to assist the physician or others in performing a medical procedure.
- the I/O components 258 can be configured to allow for user input to control or navigate the scope 120 and/or needle 170 within the patient.
- the physician can provide input to the control system 140 and/or robotic system 110 via one or more input controls 255, wherein in response to such input, control signals can be sent to the robotic system 110 to manipulate the scope 120 and/or needle 170.
- Example suitable input controls 255 may include any type of user input devices or device interfaces, such as buttons, keys, joysticks, handheld controllers (such as video-game type controllers), computer mice, trackpads, trackballs, control pads, foot pedals, sensors (such as motion sensors or cameras) that capture hand or finger gestures, or touchscreens, among other examples.
- the control system can include various components (sometimes referred to as “subsystems”).
- the control system 140 can include control electronics or circuitry 251, as well as one or more power supplies or supply interfaces 259, pneumatic devices, optical sources, actuators, data storage devices, and/or communication interfaces 254.
- the various components of the medical system 100 can be communicatively coupled to each other over a network, which can include a wireless and/or wired network.
- Example networks include one or more personal area networks (PANs), local area networks (LANs), wide area networks (WANs), Internet area networks (IANs), cellular networks, body area networks (BANs), the Internet, etc.
- the communication interfaces 214 and 254 of the robotic system 110 and the control system 140, respectively, can be configured to communicate with one or more devices, sensors, or systems, such as over a wireless and/or wired network connection.
- the various communication interfaces can implement a wireless technology such as Bluetooth, Wi-Fi, near field communication (NFC), or the like.
- the various components of the system 100 can be connected for data communication, fluid exchange, power exchange, and so on via one or more support cables, tubes, or the like.
- the medical system 100 can provide a variety of benefits, such as providing guidance to assist a physician in performing a procedure (such as instrument tracking or instrument alignment information), enabling a physician to perform a procedure from an ergonomic position without the need for awkward arm motions and/or positions, enabling a single physician to perform a procedure with one or more medical instruments, avoiding radiation exposure (such as associated with fluoroscopy techniques), enabling a procedure to be performed in a single-operative setting, or providing continuous suction to remove an object more efficiently (such as to remove a kidney stone), among other examples.
- the medical system 100 can provide guidance information to assist a physician in using various medical instruments to access a target anatomical feature while minimizing bleeding and/or damage to anatomy (such as critical organs or blood vessels).
- the medical system 100 can provide non-radiation based navigational and/or localization techniques and/or reduce the amount of equipment in the operating room. Moreover, the medical system 100 can provide functionality that is distributed between at least the control system 140 and the robotic system 110, which can be independently movable. Such distribution of functionality and/or mobility can enable the control system 140 and/or the robotic system 110 to be placed at locations that are optimal for a particular medical procedure, which can maximize working area around the patient and/or provide an optimized location for a physician to perform a procedure.
- the techniques and systems can be implemented in other procedures, such as in fully-robotic medical procedures or human-only procedures (such as free of robotic systems).
- the medical system 100 can be used to perform a procedure without a physician holding or manipulating a medical instrument (such as a fully-robotic procedure). That is, medical instruments that are used during a procedure, such as the scope 120 and the needle 170, can each be held or controlled by components of the medical system 100, such as the robotic arm(s) 112 of the robotic system 110.
- FIGS. 3-5 show an example percutaneous access procedure that can be performed using the medical system 100 of FIG. 1.
- the medical system 100 is arranged in an operating room to remove kidney stones from the patient 130 with the assistance of the scope 120 and the needle 170.
- the patient 130 may be positioned in a modified supine position with the patient 130 slightly tilted to the side to access the back or side of the patient 130, such as that illustrated in FIG. 1.
- the patient 130 can be positioned in other manners, such as a supine position or a prone position, among other examples.
- FIGS. 3-5 illustrate the patient 130 in a supine position with the legs spread apart.
- the imaging device 190 including the C-arm shown in FIG. 1 has been removed.
- although FIGS. 3-5 illustrate use of the medical system 100 to perform a percutaneous access procedure to remove a kidney stone from the patient 130, the medical system 100 can be used to remove a kidney stone in other manners and/or to perform other procedures.
- the patient 130 can be arranged in other positions as desired for a procedure.
- Various acts or workflow are described with reference to FIGS. 3-5, and throughout this disclosure, as being performed by the physician 160. It should be understood that these acts can be performed directly by the physician 160, a user under direction of the physician 160, another user (such as a technician), a combination thereof, and/or any other user.
- the renal anatomy, as illustrated at least in part in FIGS. 3-5, is described here for reference with respect to certain medical procedures relating to aspects of the present disclosure.
- the kidneys generally comprise two bean-shaped organs located on the left and right in the retroperitoneal space.
- the kidneys receive blood from the paired renal arteries, and blood exits into the paired renal veins.
- Each kidney is attached to a ureter, which is a tube that carries excreted urine from the kidney to the bladder.
- the bladder is attached to the urethra.
- a recessed area on the concave border of the kidney is the renal hilum, where the renal artery enters the kidney and the renal vein and ureter leave.
- the kidney is surrounded by tough fibrous tissue, the renal capsule, which is itself surrounded by perirenal fat, renal fascia, and pararenal fat.
- the anterior (front) surface of these tissues is the peritoneum, while the posterior (rear) surface is the transversalis fascia.
- the functional substance, or parenchyma, of the kidney is divided into two major structures: the outer renal cortex and the inner renal medulla. These structures take the shape of a plurality of cone-shaped renal lobes, each containing renal cortex surrounding a portion of medulla called a renal pyramid.
- the tip, or papilla, of each pyramid empties urine into a respective minor calyx; minor calyces empty into major calyces, and major calyces empty into the renal pelvis, which transitions to the ureter.
- the ureter and renal vein exit the kidney and the renal artery enters.
- Hilar fat and lymphatic tissue with lymph nodes surrounds these structures.
- the hilar fat is contiguous with a fat-filled cavity called the renal sinus.
- the renal sinus collectively contains the renal pelvis and calyces and separates these structures from the renal medullary tissue.
- FIGS. 3-5 show various features of the anatomy of the patient 130.
- the patient 130 includes kidneys 310 fluidly connected to a bladder 330 via ureters 320, and a urethra 340 fluidly connected to the bladder 330.
- the kidney 310(A) includes calyces (such as a calyx 312), renal papillae (such as a papilla 314), and renal pyramids (such as a pyramid 316).
- a kidney stone 318 is located in proximity to the papilla 314.
- the kidney stone 318 can be located at other locations within the kidney 310(A) or elsewhere.
- the physician 160 can position the robotic system 110 at the side or foot of the table 150 to initiate delivery of the scope 120 (not shown in FIG. 3) into the patient 130.
- the robotic system 110 can be positioned at the side of the table 150 within proximity to the feet of the patient 130 and aligned for direct linear access to the urethra 340 of the patient 130.
- the hip of the patient 130 may be used as a reference point to position the robotic system 110.
- one or more of the robotic arms 112, such as the robotic arms 112(B) and 112(C) can stretch outwards to reach in between the legs of the patient 130.
- the robotic arm 112(B) can be controlled to extend and provide linear access to the urethra 340.
- the physician 160 inserts a medical instrument 350 at least partially into the urethra 340 along this direct linear access path (also referred to as a “virtual rail”).
- the medical instrument 350 can include a lumen-type device configured to receive the scope 120, thereby assisting in inserting the scope 120 into the anatomy of the patient 130.
- By aligning the robotic arm 112(B) to the urethra 340 of the patient 130 and/or using the medical instrument 350, friction and/or forces on the sensitive anatomy in the area can be reduced.
- the scope 120 may be inserted directly into the urethra 340 without the use of the medical instrument 350.
- the physician 160 can also position the robotic arm 112(A) near a treatment site for the procedure.
- the robotic arm 112(A) can be positioned within proximity to the incision site and/or the kidneys 310 of the patient 130.
- the robotic arm 112(A) can be connected to the EM field generator 180 to assist in tracking a location of the scope 120 and/or the needle 170 during the procedure.
- although the robotic arm 112(A) is shown positioned relatively close to the patient 130, in some embodiments the robotic arm 112(A) is positioned elsewhere and/or the EM field generator 180 is integrated into the table 150 (which can allow the robotic arm 112(A) to be in a docked position).
- the robotic arm 112(C) remains in a docked position, as shown in FIG. 3.
- the robotic arm 112(C) can be used in some implementations to perform any of the functions discussed above of the robotic arms 112(A) and/or 112(B).
- the scope 120 can be inserted into the patient 130 robotically, manually, or a combination thereof, as shown in FIG. 4.
- the physician 160 can connect the scope 120 to the robotic arm 112(C) and/or position the scope 120 at least partially within the medical instrument 350 and/or the patient 130.
- the scope 120 can be connected to the robotic arm 112(C) at any time, such as before the procedure or during the procedure (such as after positioning the robotic system 110).
- the physician 160 can then interact with the control system 140, such as the I/O device(s) 146, to navigate the scope 120 within the patient 130.
- the physician 160 can provide input via the I/O device(s) 146 to control the robotic arm 112(C) to navigate the scope 120 through the urethra 340, the bladder 330, the ureter 320(A), and up to the kidney 310(A).
- the control system 140 may present an instrument-alignment interface 410 (such as the graphical interface 144 of FIG. 1) on the display(s) 142, showing a real-time image 412 captured by the scope 120, to assist the physician 160 in controlling the scope 120.
- the physician 160 can navigate the scope 120 to locate the kidney stone 318, as depicted in the image 412.
- the control system 140 may use localization techniques to determine a position and/or an orientation of the scope 120, which can be viewed by the physician 160 via the display(s) 142 to also assist in controlling the scope 120.
- other types of information can be presented on the display(s) 142 to assist the physician 160 in controlling the scope 120, such as x-ray images of the internal anatomy of the patient 130.
- the physician 160 can designate a target location for the needle 170 to enter the kidney 310(A) for eventual extraction of the kidney stone 318. For example, to minimize bleeding and/or avoid hitting a blood vessel or other undesirable anatomy of the kidney 310(A) and/or anatomy surrounding the kidney 310(A), the physician 160 can seek to align the needle 170 with an axis of a calyx. To do so, the physician 160 can designate a target location that is aligned with the center of the calyx and the center of a papilla (such as the papilla 314).
- the physician may designate the target by touching the scope 120 to the papilla 314 (also referred to as a “tag” position) and retracting the scope 120 to a “park” position some distance away from the papilla 314 (such as where the entire papilla 314 is within an FOV of a camera disposed on the scope 120).
- the control system 140 uses localization techniques to determine the “tag” and “park” positions of the scope 120 (such as based on sensor data from an EM sensor disposed on the scope 120) and sets the target location (also referred to as the “EM target”) midway between the “tag” and “park” positions.
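As a minimal sketch of this midpoint computation (the function name and data layout are illustrative assumptions, not the system's actual API), the EM target can be derived from the two tracked scope positions as follows:

```python
import numpy as np

def em_target_from_tag_and_park(tag_position, park_position):
    """Set the EM target midway between the "tag" and "park" scope positions.

    Both inputs are 3D points (x, y, z) reported by the EM sensor on the scope,
    expressed in the same coordinate space.
    """
    tag = np.asarray(tag_position, dtype=float)
    park = np.asarray(park_position, dtype=float)
    return (tag + park) / 2.0

# Example: scope touched the papilla at (10, 42, -3) and was parked at (10, 30, -3).
print(em_target_from_tag_and_park((10.0, 42.0, -3.0), (10.0, 30.0, -3.0)))  # [10. 36. -3.]
```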
- the physician 160 can proceed with the procedure by positioning the needle 170 for insertion into the target location.
- the physician 160 may use his or her best judgment to place the needle 170 on the patient 130 at an incision site, such as based on knowledge regarding the anatomy of the patient 130, experience from previously performing the procedure, an analysis of CT or x-ray images, or other pre-operative information of the patient 130, among other examples.
- the physician 160 can attempt to avoid critical anatomy of the patient 130, such as the lungs, pleura, colon, paraspinal muscles, ribs, and/or intercostal nerves.
- the control system 140 may use CT, x-ray, or ultrasound images to provide information to the physician 160 regarding a location to place the needle 170 on the patient 130.
- the control system 140 can determine a target trajectory 502 for inserting the needle 170 to assist the physician 160 in reaching the target location (such as the papilla 314).
- the target trajectory 502 can represent a desired path for accessing the target location.
- the target trajectory 502 can be determined based on a position of a medical instrument (such as the needle 170 or the scope 120), a target location within the human anatomy, a position and/or orientation of a patient, or the anatomy of the patient (such as the location of organs within the patient relative to the target location), among other examples.
- the target trajectory 502 includes a straight line that passes through the papilla 314 and the needle 170 (extending from a tip of the needle 170 through the papilla 314, such as a point on an axis of the papilla 314).
- the target trajectory 502 can take other forms, such as a curved line, and/or can be defined in other manners.
- the needle 170 may be a flexible bevel-tip needle that is configured to curve as the needle 170 is inserted in a straight manner. Such a needle can be used to steer around particular anatomy, such as the ribs or other anatomy.
- the control system 140 can provide information to guide a user, such as to compensate for deviation in the needle trajectory or to maintain the user on the target trajectory.
- the target trajectory 502 can have another position, angle, and/or form.
- a target trajectory can be implemented with a lower pole access point, such as through a papilla located below the kidney stone 318 shown in FIG. 5, with a non-coaxial angle through the papilla, which can be used to avoid the hip.
- the control system 140 can use the target trajectory 502 to provide an alignment-progress visualization 504 via the instrument-alignment interface 410.
- the alignment-progress visualization 504 can include an instrument alignment element 506 indicative of an orientation of the needle 170 relative to the target trajectory 502.
- the physician 160 can view the alignment-progress visualization 504 and orient the needle 170 to the target trajectory 502. When aligned, the physician 160 can insert the needle 170 into the patient 130 to reach the target location.
- the alignment-progress visualization 504 may include a progress visualization 508 (also referred to as a “progress bar”) indicating a proximity of the needle 170 to the target location.
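The following sketch illustrates, under assumed conventions, how an alignment error and a progress fraction of the kind conveyed by the instrument alignment element 506 and the progress visualization 508 could be computed; the function and its inputs are hypothetical and not the system's actual interface.

```python
import numpy as np

def alignment_and_progress(needle_tip, needle_heading,
                           trajectory_origin, trajectory_direction, target_point):
    """Return (angular error in degrees, insertion progress in [0, 1]).

    The angular error compares the needle heading with the target trajectory;
    the progress is the needle tip's advance along the trajectory toward the target.
    """
    n = np.array(needle_heading, dtype=float)
    t = np.array(trajectory_direction, dtype=float)
    n /= np.linalg.norm(n)
    t /= np.linalg.norm(t)
    angle_deg = float(np.degrees(np.arccos(np.clip(np.dot(n, t), -1.0, 1.0))))

    origin = np.asarray(trajectory_origin, dtype=float)
    total = float(np.dot(np.asarray(target_point, dtype=float) - origin, t))
    covered = float(np.dot(np.asarray(needle_tip, dtype=float) - origin, t))
    progress = float(np.clip(covered / total, 0.0, 1.0)) if total > 0 else 0.0
    return angle_deg, progress
```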
- the instrument-alignment interface 410 can assist the physician 160 in aligning and/or inserting the needle 170 to reach the target location.
- the physician 160 can insert another medical instrument (such as a power catheter, vacuum, or nephroscope) into the path created by the needle 170 and/or over the needle 170.
- the physician 160 can use the other medical instrument and/or the scope 120 to fragment and remove pieces of the kidney stone 318 from the kidney 310(A).
- a position of a medical instrument can be represented with a point or point set, and an orientation of the medical instrument can be represented as an angle or offset relative to an axis or plane.
- a position of a medical instrument can be represented with a coordinate(s) of a point or point set within a coordinate system (such as one or more X, Y, Z coordinates) and/or an orientation of the medical instrument can be represented with an angle relative to an axis or plane for the coordinate system (such as an angle with respect to the X-axis or plane, Y-axis or plane, and/or Z-axis or plane).
- a change in orientation of the medical instrument can correspond to a change in an angle of the medical instrument relative to the axis or plane.
- an orientation of a medical instrument is represented with yaw, pitch, and/or roll information.
- a trajectory may represent a pose.
- a trajectory of a medical instrument can refer to a pose of the medical instrument, including or indicating both a position and orientation of the medical instrument.
- a target trajectory can refer to a target pose, including or indicating both a position and orientation of a desired path.
- a trajectory may refer to either an orientation or a position.
- any of the robotic arms 112 can be used to perform the functions. Further, any additional robotic arms and/or systems can be used to perform the procedure. Moreover, the robotic system 110 can be used to perform other parts of the procedure.
- the robotic system 110 can be controlled to align and/or insert the needle into the patient 130.
- one of the robotic arms 112 can engage with and/or control the needle 170 to position the needle 170 at the appropriate location, align the needle 170 with the target trajectory, and/or insert the needle 170 to the target location.
- the control system 140 can use localization techniques to perform such processing.
- a percutaneous procedure can be performed entirely or partially with the medical system 100 (such as with or without the assistance of the physician 160).
- a percutaneous access procedure can be subdivided into 3 phases: a target selection phase (where a target location within an anatomy is selected or designated for percutaneous access), a site selection phase (where a needle is placed on the surface of the patient’s skin and aligned with the target location), and a needle insertion phase (where the needle is driven percutaneously to rendezvous with the target location).
- a physician may perform a series of complex steps that require careful movement and positioning of medical instruments inside a patient’s body (such as touching a scope to a portion of an anatomy and retracting the scope to a park position).
- a medical system may provide greater information or guidance regarding how to perform the workflow associated with the target selection phase.
- the control system 140 may generate warnings on the display 142 if the physician 160 fails to adhere to one or more requirements of the target selection workflow.
- Example warnings may indicate that the EM target is outside the FOV of the camera, that the FOV of the camera is not obscured by the papilla at the “tag” position (which suggests that the scope is not close enough to the papilla), that the papilla is not centered in the FOV of the camera at the “park” position, or that an excessive amount of EM motion is detected from the scope (which could be caused by respiration or various sources of EM interference).
- although such warnings may help the physician 160 adhere to existing workflow requirements, the physician 160 still must perform a series of complex workflow steps (such as navigating the scope to the “tag” position and retracting the scope to the “park” position).
- a medical system may simplify the workflow associated with the target selection phase, for example, by eliminating the requirement to touch the scope to a portion of the anatomy (such as a papilla).
- the physician 160 can drive the scope directly to the “park” position to designate a target for percutaneous access.
- the medical system may set the target at a fixed offset from the scope tip (such as in relation to the pose of the scope). Assuming other conditions are met (such as the papilla being centered in the FOV of the camera and the scope being parked a suitable distance away from the papilla), the offset distance can be predetermined so that the target resides within a calyx (and does not extend beyond a papilla).
- the target can be defined by a vector (also referred to as a “target vector”) extending from the tip of the scope.
- the target vector may have a predetermined direction in relation to the orientation (or heading) of the scope.
- the target vector may always point in the same direction as the heading of the scope.
- the scope may not be positioned and/or oriented in a consistent or repeatable manner in relation to the anatomy.
- a physician may be unable to park the scope at a desired position and/or orientation such that the papilla is centered within the FOV of the camera.
- any deviations or offsets in the spatial relationship between the scope pose and the papilla location result in corresponding offsets in the target location.
- a medical system may utilize multiple localization modalities for designating a target for percutaneous access.
- the medical system may provide a graphical user interface (GUI) that allows a user or physician to select a target location on a real-time image of the anatomy (depicting the FOV of the camera).
- FIG. 6 shows a block diagram of an example target localization system 600, according to some implementations.
- the target localization system 600 may be one example of any of the control circuitry 251 or 211 of FIG. 2.
- the target localization system 600 is configured to determine or designate a target position 609 within an anatomy based, at least in part, on image data 601 and sensor data 608 received via a medical instrument (such as a scope or other lumen-based medical instrument).
- the target position 609 represents a target for percutaneous access (such as the target location described with reference to FIGS. 3-5).
- the image data 601 may be received via a camera disposed on or proximate to the instrument and the sensor data 608 may be received via one or more sensors disposed on the instrument.
- the camera may be disposed on the distal end of a scope. In some other implementations, the camera may be disposed on the distal end of a working channel inserted through a scope (such as via a lumen of the scope). In some implementations, the one or more sensors may include an EM sensor.
- the target localization system 600 includes a user interface component 610, a ray calculation component 620, a coordinate space conversion (CSC) component 630, and a vector creation component 640.
- the user interface component 610 is configured to display or generate a GUI 650 that includes a real-time image of the anatomy based on the received image data 601. More specifically, the image may depict a portion of the anatomy in an FOV of the camera based on the current instrument pose.
- the GUI 650 may be one example of the graphical interface 144 of FIG. 1. With reference for example to FIG. 1, the GUI 650 may be presented on the display 142.
- the user interface component 610 is also configured to receive user input 602 via one or more input devices (such as the input controls 255 of FIG. 2).
- the user interface component 610 may include an alignment feature for aligning a pose of the instrument with a desired portion of the anatomy (such as a papilla).
- the alignment feature may be displayed as a graphical overlay on the image that provides guidance to the physician for aligning the desired portion of the anatomy within the FOV of the camera.
- when the desired portion of the anatomy is aligned with the alignment feature, the pose of the instrument is deemed to be associated with a suitable “park” position.
- the user interface component 610 may display an interactive feature on the image once the instrument reaches a “park” position. For example, the physician may provide user input 602 indicating that the instrument is parked (or has otherwise reached a “park” position) after aligning the desired portion of the anatomy with the alignment feature.
- the interactive feature can be used to select a target location 603 on the image of the anatomy based on user input 602.
- the target location 603 can coincide with any point (or pixel) on the image.
- the target location 603 may represent a point in a two-dimensional (2D) image space associated with a direction or angular offset of the target position 609.
- the target location 603 may be selected to align with the center of the desired portion of the anatomy (such as the center of a papilla).
- the interactive feature may be a reticle that can be manipulated or dragged to the target location 603 in response to a tap-and-drag (or click-and-drag) type user input 602.
- various other suitable features may be used to select or otherwise specify the target location 603 on the image of the anatomy.
- the ray calculation component 620 is configured to determine or calculate a ray 605 that originates from the center of the camera and passes through the target location 603 in a three-dimensional (3D) coordinate space associated with the camera (also referred to as the “camera space”).
- the camera space represents a coordinate system that can be used to describe any point or vector in the FOV of the camera. Aspects of the present disclosure recognize that any point in a 3D camera space can be projected onto a 2D image space (such as the image of the anatomy). Each projection represents a ray passing through the center of the camera and intersecting the image space at a particular point or location based on the intrinsic parameters of the camera.
- Example intrinsic parameters include an optical center (c_x, c_y) of the camera, a focal length of the camera (f_x, f_y), and a skew coefficient (s).
- the ray calculation component 620 may calculate the ray 605 intersecting the target location 603 based on intrinsic parameters 604 of the camera disposed on or proximate to the distal end of the instrument.
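A minimal sketch of this back-projection, assuming a standard pinhole camera model (the function name and parameter layout are illustrative assumptions, not the system's actual interface):

```python
import numpy as np

def pixel_to_camera_ray(x_i, y_i, f_x, f_y, c_x, c_y, s=0.0):
    """Back-project an image location (x_i, y_i) to a unit ray in the 3D camera space.

    The ray originates at the camera center (0, 0, 0); f_x, f_y are focal lengths,
    (c_x, c_y) is the optical center, and s is the skew coefficient.
    """
    # Invert the pinhole projection for a point on the plane Z_c = 1.
    y_n = (y_i - c_y) / f_y
    x_n = (x_i - c_x - s * y_n) / f_x
    direction = np.array([x_n, y_n, 1.0])
    return direction / np.linalg.norm(direction)
```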
- the CSC component 630 is configured to convert the ray 605 from the camera space to a corresponding ray 607 in a coordinate space associated with the sensor data 608 (also referred to as the “sensor space”).
- the sensor space represents a world coordinate system that can be used to describe the position and orientation of the instrument based on the sensor data 608.
- the CSC component 630 may perform the coordinate-space conversion based on a mapping 606 that is configured to map any point or vector in the camera space to a respective point or vector in the sensor space.
- the mapping 606 may be a hand-eye calibration matrix or transform associated with the robotic system 110 of FIG. 1. More specifically, the hand-eye calibration matrix may be calibrated to estimate the pose of the instrument (in the sensor space) with respect to the FOV of the camera based on a known calibration pattern.
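As an illustrative sketch, assuming the mapping 606 is available as a 4x4 homogeneous transform from camera space to sensor space (the matrix name and layout are assumptions made here for illustration):

```python
import numpy as np

def ray_camera_to_sensor(ray_dir_camera, T_sensor_from_camera):
    """Map a camera-space ray into the sensor (world) space.

    T_sensor_from_camera is a 4x4 homogeneous hand-eye transform; its rotation part
    maps directions, and its translation part places the camera origin in sensor space.
    """
    T = np.asarray(T_sensor_from_camera, dtype=float)
    R, t = T[:3, :3], T[:3, 3]
    origin_sensor = t                                           # camera center (0,0,0) maps to t
    direction_sensor = R @ np.asarray(ray_dir_camera, dtype=float)
    return origin_sensor, direction_sensor / np.linalg.norm(direction_sensor)
```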
- the vector creation component 640 is configured to determine a target vector that points to the target position 609 based on the sensor data 608 and the ray 607 in the sensor space.
- the target vector may originate from the instrument position indicated by the sensor data 608 and extend a threshold distance (d_offset) in a direction of the ray 607.
- the target vector may be “locked” to the pose of the instrument.
- the target vector may have a fixed relationship with respect to the instrument pose so that any change in the position or orientation of the instrument results in a corresponding change in the position or orientation of the target vector (such as the same magnitude and direction of change).
- the threshold distance d_offset may be a predetermined value.
- the threshold distance d_offset may be estimated, based on prerecorded data and/or parameters associated with a percutaneous access procedure (such as a scope retraction limit), to ensure that the target position 609 resides within a calyx (and does not extend beyond a papilla).
- the vector creation component 640 may estimate a depth of the desired portion of the anatomy using one or more image processing or computer vision techniques and may determine the threshold distance d_offset based on the estimated depth. For example, the vector creation component 640 may set the threshold distance d_offset to be equal to half the estimated depth or distance to the desired portion of the anatomy. Still further, in some implementations, the vector creation component 640 may set the target position 609 directly onto the desired portion of the anatomy (such as the center of a papilla), rather than a vector attached to the instrument, based on the estimated depth. In such implementations, the target position 609 may shift with respiratory movements and may need to be updated with respect to the instrument pose.
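A simplified sketch of this vector creation step, assuming the ray direction is already expressed in the sensor space (the fallback of half the estimated depth mirrors the example above and is not a prescribed value):

```python
import numpy as np

def make_target_vector(instrument_position, ray_direction_sensor,
                       d_offset=None, estimated_depth=None):
    """Place the target a threshold distance d_offset along the ray, relative to the instrument.

    If no fixed d_offset is given, fall back to half of an image-based depth estimate
    (illustrative only).
    """
    if d_offset is None:
        if estimated_depth is None:
            raise ValueError("either d_offset or estimated_depth must be provided")
        d_offset = estimated_depth / 2.0
    direction = np.array(ray_direction_sensor, dtype=float)
    direction /= np.linalg.norm(direction)
    # Because the offset is defined relative to the instrument position and heading,
    # the resulting target stays "locked" to the instrument pose.
    return np.asarray(instrument_position, dtype=float) + d_offset * direction
```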
- the target localization system 600 may further compensate for respiratory movements by adding a workflow step to measure the magnitude and direction of the patient’s respiration. For example, after receiving user input 602 indicating that the instrument has reached a “park” position, the target localization system 600 may track any changes or variations in the sensor data 608 over a threshold duration to estimate a baseline respiratory movement.
- the threshold duration may be configured to span at least one respiratory cycle (such as 12 seconds). In some other implementations, the threshold duration may be configured to span two or more respiratory cycles.
- the vector creation component 640 may adjust the target position 609 to compensate for the magnitude and direction of the baseline respiratory movement.
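One possible way to estimate such a respiratory baseline from EM position samples, sketched here with a principal-component analysis purely for illustration (the estimation method is an assumption, not specified by this disclosure):

```python
import numpy as np

def estimate_respiratory_baseline(em_positions):
    """Estimate the magnitude and dominant direction of respiratory motion.

    em_positions: array of shape (N, 3) sampled from the EM sensor while the
    instrument is parked, spanning at least one respiratory cycle.
    Returns (peak-to-peak magnitude, unit direction vector).
    """
    samples = np.asarray(em_positions, dtype=float)
    centered = samples - samples.mean(axis=0)
    # Dominant motion axis via the largest right singular vector (PCA).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    excursion = centered @ direction
    magnitude = float(excursion.max() - excursion.min())
    return magnitude, direction
```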
- the target localization system 600 relies on user input 602 to set the target location 603 on the image of the anatomy.
- the target localization system 600 may use one or more image processing or computer vision techniques to determine the target location 603 without any user input.
- Example suitable image processing techniques include segmentation, machine learning, and statistical analysis, among other examples.
- segmentation refers to various techniques for partitioning a digital image into groups of pixels (or “image segments”) based on related characteristics or identifying features.
- Example segmentation techniques include machine learning (ML) models, masking, thresholding, clustering, and edge detection, among other examples.
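Purely as an illustration of the simplest of these techniques (intensity thresholding), a candidate target location could be estimated as follows; a practical system would more likely rely on a trained segmentation model, and the threshold and normalization here are assumptions:

```python
import numpy as np

def estimate_target_location(gray_image, threshold=0.6):
    """Crude thresholding illustration: return a candidate (row, col) target location.

    Normalizes a grayscale endoscopic frame to [0, 1], keeps the above-threshold
    pixels, and returns their centroid. Returns None if nothing exceeds the threshold.
    """
    img = np.asarray(gray_image, dtype=float)
    img = (img - img.min()) / (np.ptp(img) + 1e-9)
    mask = img > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())
```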
- the target localization system 600 may segment the image data 601 and estimate the target location 603 based on various characteristics or properties of the image segments (such as those associated with a papilla).
- Still further, in some implementations, the target localization system 600 may further manipulate or control the instrument to maintain the desired portion of the anatomy within the FOV of the camera. For example, after detecting the desired portion of the anatomy (such as a papilla) using image segmentation techniques, the target localization system 600 may assume control of the instrument from the physician. More specifically, the target localization system 600 may track the desired portion of the anatomy based on real-time image data 601 and may adjust the position and/or orientation of the instrument so that the desired portion of the anatomy remains within the FOV of the camera at all times.
- the target localization system 600 can significantly improve the outcome of a percutaneous access operation.
- the GUI 650 can reduce and/or streamline the workflow steps that require careful movement and positioning of instruments inside an anatomy. For example, a physician can navigate a scope directly to a “park” position and select a target location 603 via the GUI 650 without having to touch or tag any portion of the anatomy.
- the GUI 650 also enables the physician to visualize the target position 609 in the camera space (in contrast to existing target localization techniques which only provide the position of the target in the sensor space). This allows the physician to visually confirm the precision or accuracy of the target position 609.
- the target localization system 600 may include one or more additional visualization modalities to further assist the physician in confirming the target position 609.
- Example suitable visualization modalities may include augmented reality (AR) or virtual reality (VR) glasses or headsets that can display the position and orientation of the instrument (such as a scope) in relation to the anatomy.
- AR or VR headset may display the position and orientation (or movement) of the scope in a virtual environment (such as an AR or VR environment) that includes the patient’s anatomy and/or a model or rendering thereof.
- the AR or VR headset also may display the position and orientation (or movement) of the needle in the virtual environment.
- FIGS. 7A-7D show various states 700-730, respectively, of an example GUI 701 for designating a percutaneous access target, according to some implementations.
- the GUI 701 may be one example of the GUI 650 of FIG. 6. More specifically, FIG. 7A shows an example state 700 of the GUI 701 while a physician is navigating a scope (such as an endoscope) to a “park” position within an anatomy, FIG. 7B shows an example state 710 of the GUI 701 once the physician registers or otherwise indicates that the scope has reached a “park” position, FIG. 7C shows an example state 720 of the GUI 701 as the physician selects a target location, and FIG. 7D shows an example state 730 of the GUI 701 once the physician has locked in a target location.
- the GUI 701 is shown to include an image 702 depicting an FOV of a camera disposed on or proximate to the distal end of the scope and an alignment feature 704 overlaying the image 702.
- the alignment feature 704 is depicted as a large white circle with a white crosshair in the center.
- the GUI 701 also includes a rendering 706 depicting a spatial relationship between the alignment feature 704 and the scope.
- the rendering 706 shows the alignment feature 704 being projected onto the anatomy from the tip of the scope.
- the GUI 701 may include instructions 708 to position or align the scope so that the papilla is inside the alignment feature 704.
- the physician may position the scope so that the edges of the papilla are substantially aligned with the white circle of the alignment feature 704. This may ensure that the scope is not too close or too far away from the papilla.
- the physician may press “SET” on the GUI 701 (not shown for simplicity) or a separate user interface (also referred to as a “pendant”) to lock in or register the current instrument pose as a “park” position.
- the GUI 701 is shown to include an interactive reticle 714 (in lieu of the alignment feature 704) displayed on the image 702.
- the reticle 714 is depicted as a small black circle with a black crosshair in the center.
- the GUI 701 is also updated to include a rendering 716 depicting a spatial relationship between the reticle 714 and the scope.
- the rendering 716 also includes arrows showing that the reticle 714 can be moved in various directions.
- the GUI 701 may include instructions 708 to move the reticle 714 so that the crosshair is positioned over a target location and to press “SET” on the pendant once the target location is selected.
- the physician is instructed to move the reticle 714 using the direction keys of a controller (such as the I/O device 146 of FIG. 1).
- the reticle 714 may be controlled using any suitable input device, including but not limited to touchscreens, touchpads, buttons, switches, mice, keyboards, keypads, joysticks, or scroll wheels.
- the physician moves the reticle 714 to a lower-right region of the image 702, so that the crosshair is aligned with the center of the papilla, and presses “SET” on the pendant to lock in the position of the reticle 714 as the target location.
- the GUI 701 is shown to include a crosshair 734 (in lieu of the reticle 714) displayed on the image 702.
- the GUI 701 is also updated to include a rendering 736 depicting a spatial relationship between the crosshair 734 and the scope.
- the GUI 701 may also include an indication 738 that the selected target location has been saved. The physician may tap or click on the “NEXT” icon in the upper right corner of the GUI 701 to proceed with the next phase of the percutaneous access procedure. In some implementations, pressing the “NEXT” icon may cause the target localization system 600 to determine the target position 609 based on the selected target location 603.
- any specific text, fonts, shapes, buttons, icons, or other graphical features shown in the GUI 701 are merely for purposes of illustration and may be configured differently (or user-configurable) in actual implementations.
- the “NEXT” icon may be replaced with different variations of the same or similar textual content (such as “continue,” “proceed,” or “site selection,” among other examples).
- Various aspects of the GUI 701 also may be customized to user preferences.
- Example suitable customization options may include, among other examples, changing the size or location of the image 702, changing the size or location of the renderings 706, 716, and 736, changing the colors of the alignment feature 704 and/or the reticle 714, adjusting various rendering parameters (such as color, opacity, or intensity of various features), or changing the colors of text or highlights in the GUI 701.
- FIGS. 8A and 8B show an example operation for converting a two-dimensional (2D) target location (x_i, y_i) in an image coordinate space 800 to a three-dimensional (3D) target position 818 in a sensor coordinate space 810.
- the example operation may be performed by the target localization system 600 of FIG. 6.
- the target location (x_i, y_i) may be one example of the target location 603 and the target position 818 may be one example of the target position 609.
- the target localization system 600 may normalize the target location (x_i, y_i) based on the optical center (c_x, c_y) and focal length (f_x, f_y) of the camera 802.
- the normalized image space 804 (defined by X_c and Y_c coordinates) represents a 2D projection of the 3D camera space 800 (defined by X_c, Y_c, and Z_c coordinates).
- the distance from the center of the camera 802 to the normalized image space 804 (also referred to as the “Z-offset”) depends on the focal length (f_x, f_y) of the camera 802.
- any point in the 3D camera space 800 can be projected onto the normalized image space 804 via a ray passing through the center of the camera 802.
- the ray calculation component 620 may determine a ray 806 passing through the normalized target location (x_n, y_n) based on the intrinsic parameters of the camera 802 (including its optical center (c_x, c_y), focal length (f_x, f_y), and skew coefficient s).
- the intrinsic parameters can be described as a transformation matrix (K) that transforms any point or vector in the camera space 800 to a respective point or vector in the normalized image space 804, where:
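The matrix itself does not survive in this text; the standard pinhole intrinsic matrix consistent with the parameters named above, and the projection it defines, would take the form:

```latex
K = \begin{bmatrix} f_x & s & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix},
\qquad
\begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix} \sim K \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}.
```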
- the target localization system 600 may further map the ray 806 from the camera space 800 to a respective ray 816 in the sensor space 810.
- the CSC component 630 may convert the ray 806 from the camera space 800 to the sensor space 810 using a hand-eye calibration matrix that transforms any point or vector in the camera space 800 to a respective point or vector in the sensor space 810.
- the ray 816 is mapped to a position 812 of the instrument in the sensor space (as indicated by a sensor disposed on the instrument).
- the vector creation component 640 may determine the target position 818 by tracing the ray 816 from the instrument position 812 to a threshold distance d_offset along the ray 816.
- the target position 818 can be described as a vector of length d_offset relative to the instrument position 812.
- the target position 818 also may have a fixed relationship to a heading 814 of the instrument (as further indicated by the sensor disposed on the instrument) so that any changes to the instrument heading 814 result in corresponding changes to the target position 818.
- FIG. 9 shows a block diagram of an example controller 900 for a medical system, according to some implementations.
- the controller 900 may be one example of the target localization system 600 of FIG. 6. More specifically, the controller 900 is configured to determine or designate a target within anatomy for percutaneous access.
- the controller 900 includes a communication interface 910, a processing system 920, and a memory 930.
- the communication interface 910 is configured to communicate with one or more components of the medical system. More specifically, the communication interface 910 includes a sensor interface (I/F) 912 for communicating with one or more sensors (such as an EM sensor) and an image source interface (I/F) 914 for communicating with one or more image sources (such as a camera).
- the sensor interface 912 may receive sensor data from one or more sensors disposed on an instrument within an anatomy, where the sensor data indicates a pose of the instrument in a first coordinate space.
- the image source interface 914 may receive an image depicting a portion of the anatomy in an FOV of a camera disposed on or proximate to a distal end of the instrument.
- the memory 930 may include a non-transitory computer-readable medium (including one or more nonvolatile memory elements, such as EPROM, EEPROM, Flash memory, or a hard drive, among other examples) that may store the following software (SW) modules: a target selection SW module 932 to determine a location on the image associated with an anatomical feature; and a position determination SW module 934 to determine a position of the target in the first coordinate space based at least in part on the location on the image associated with the anatomical feature.
- the processing system 920 may include any suitable one or more processors capable of executing scripts or instructions of one or more software programs stored in the controller 900 (such as in the memory 930). For example, the processing system 920 may execute the target selection SW module 932 to determine a location on the image associated with an anatomical feature. The processing system 920 also may execute the position determination SW module 934 to determine a position of the target in the first coordinate space based at least in part on the location on the image associated with the anatomical feature.
- FIG. 10 shows an illustrative flowchart depicting an example operation 1000 for designating a target for percutaneous access, according to some implementations. In some implementations, the example operation 1000 may be performed by a controller for a medical system such as the controller 900 of FIG. 9 or the target localization system 600 of FIG. 6.
- the controller receives sensor data from one or more sensors disposed on an instrument within an anatomy, where the sensor data indicates a pose of the instrument in a first coordinate space (1002).
- the one or more sensors may include an electromagnetic sensor.
- the controller receives an image depicting a portion of the anatomy in a field-of-view (FOV) of a camera disposed on or proximate to a distal end of the instrument (1004).
- the controller determines a location on the image associated with an anatomical feature (1006).
- the controller further determines a position of the target in the first coordinate space based at least in part on the location on the image associated with the anatomical feature (1008).
- the location of the anatomical feature may be determined based on one or more image processing operations.
- the controller may generate a graphical user interface (GUI) that includes the image and an interactive feature displayed thereon, and receive user input associated with the interactive feature, where the location of the anatomical feature is determined based on the user input.
- the interactive feature may include a reticle for selecting the location on the image associated with the anatomical feature.
- the determining of the position of the target may include converting the location on the image to a point in a second coordinate space based on intrinsic parameters of the camera, determining a ray that originates from a center of the camera and intersects the point in the second coordinate space, converting the ray from the second coordinate space to the first coordinate space, and setting the position of the target as a vector having a fixed relationship to the pose of the instrument based on the ray in the first coordinate space.
- the vector may extend a threshold distance along the ray in the first coordinate space.
- the threshold distance may be a fixed distance.
- the controller may estimate a depth of the portion of the anatomy in the FOV of the camera and determine the threshold distance based on the estimated depth.
- the controller may further detect changes in the sensor data received over a threshold duration while the instrument is in a parked state and adjust the position of the target based on the detected changes in the sensor data.
- the threshold duration may be associated with one or more respiratory cycles.
- the controller may further display the pose of the instrument in relation to the anatomy in an augmented reality (AR) or virtual reality (VR) environment.
- the controller may further adjust the pose of the instrument so that the anatomical feature remains in the FOV of the camera.
- a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members.
- “at least one of: a, b, or c” is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Robotics (AREA)
- Pathology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Endoscopes (AREA)
Abstract
The present disclosure relates to methods, devices, and systems for planning and performing medical procedures. More particularly, the present embodiments relate to designating a target for percutaneous access. According to some aspects, a target localization system can generate a graphical user interface (GUI) that can be used to designate a target for percutaneous access on a real-time image of an anatomy. For example, the image may depict the field of view (FOV) of a camera disposed on an instrument (such as an endoscope) within the anatomy. The GUI may include an interactive feature (such as a reticle) for selecting a target location on the image in response to user input. The target localization system also receives sensor data indicating a pose (including a position and heading) of the instrument and determines a position of the target relative to the instrument pose based on the location of the target on the image.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463641264P | 2024-05-01 | 2024-05-01 | |
| US63/641,264 | 2024-05-01 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025229542A1 (fr) | 2025-11-06 |
Family
ID=97561235
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2025/054474 Pending WO2025229542A1 (fr) | 2024-05-01 | 2025-04-29 | Localisation de cible pour accès percutané |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025229542A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020121149A1 (fr) * | 2018-12-14 | 2020-06-18 | Acclarent, Inc. | Surgical system combining sensor-based navigation and endoscopy |
| US20210196312A1 (en) * | 2019-12-31 | 2021-07-01 | Auris Health, Inc. | Alignment interfaces for percutaneous access |
| US20210378759A1 (en) * | 2020-06-03 | 2021-12-09 | Covidien Lp | Surgical tool navigation using sensor fusion |
| WO2023001611A1 (fr) * | 2021-07-21 | 2023-01-26 | Technische Universität München | Electromagnetic tracking for percutaneous dilatational tracheotomy |
| WO2023036848A1 (fr) * | 2021-09-07 | 2023-03-16 | Neuronav Ltd | Augmented reality surgical navigation system |
- 2025-04-29: WO PCT/IB2025/054474 patent/WO2025229542A1/fr active Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12465431B2 (en) | Alignment techniques for percutaneous access | |
| US12220150B2 (en) | Aligning medical instruments to access anatomy | |
| US12251175B2 (en) | Medical instrument driving | |
| EP4084721B1 (fr) | Identification and targeting of anatomical elements | |
| US12414686B2 (en) | Endoscopic anatomical feature tracking | |
| KR102683476B1 (ko) | Image-based branch detection and mapping for navigation | |
| US12251177B2 (en) | Control scheme calibration for medical instruments | |
| KR20250041081A (ko) | Systems and methods for medical instrument navigation and target selection | |
| US20250339175A1 (en) | Percutaneous access guidance | |
| WO2025229542A1 (fr) | Localisation de cible pour accès percutané | |
| US20250339644A1 (en) | Directionality indication for medical instrument driving | |
| US20250268665A1 (en) | Elongate instrument with proximal pose and shape sensing | |
| US20250302552A1 (en) | Interface for identifying objects in an anatomy | |
| US20240127399A1 (en) | Visualization adjustments for instrument roll | |
| US20250302536A1 (en) | Interface for determining instrument pose | |
| JP2025186478A (ja) | Control scheme calibration for medical instruments | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25798213; Country of ref document: EP; Kind code of ref document: A1 |