WO2024184739A1 - Sensor patch for tracking surgical instruments - Google Patents
Sensor patch for tracking surgical instruments
- Publication number
- WO2024184739A1 PCT/IB2024/051919
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- instrument
- handheld
- surgical
- sensor patch
- substrate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01Q—ANTENNAS, i.e. RADIO AERIALS
- H01Q1/00—Details of, or arrangements associated with, antennas
- H01Q1/12—Supports; Mounting means
- H01Q1/22—Supports; Mounting means by structural association with other equipment or articles
- H01Q1/24—Supports; Mounting means by structural association with other equipment or articles with receiving set
- H01Q1/248—Supports; Mounting means by structural association with other equipment or articles with receiving set provided with an AC/DC converting device, e.g. rectennas
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01Q—ANTENNAS, i.e. RADIO AERIALS
- H01Q21/00—Antenna arrays or systems
- H01Q21/06—Arrays of individually energised antenna units similarly polarised and spaced apart
-
- H—ELECTRICITY
- H02—GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
- H02J—CIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
- H02J50/00—Circuit arrangements or systems for wireless supply or distribution of electric power
- H02J50/20—Circuit arrangements or systems for wireless supply or distribution of electric power using microwaves or radio frequency waves
- H02J50/27—Circuit arrangements or systems for wireless supply or distribution of electric power using microwaves or radio frequency waves characterised by the type of receiving antennas, e.g. rectennas
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00221—Electrical control of surgical instruments with wireless transmission of data, e.g. by infrared radiation or radiowaves
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3991—Markers, e.g. radio-opaque or breast lesions markers having specific anchoring means to fixate the marker to the tissue, e.g. hooks
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; Determining position of diagnostic devices within or on the body of the patient
- A61B5/061—Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
Definitions
- Surgical robotic systems are currently being used in a variety of surgical procedures, including minimally invasive medical procedures.
- Some surgical robotic systems include a surgeon console controlling a surgical robotic arm and a surgical instrument having an end effector (e.g., forceps or grasping instrument) coupled to and actuated by the robotic arm.
- In operation, the robotic arm is moved to a position over a patient and then guides the surgical instrument into a small incision via a surgical port or a natural orifice of a patient to position the end effector at a work site within the patient’s body.
- the data may be used for digitization and analytics, such as, recording motions of robotically controlled instruments throughout a procedure and using statistical analysis to rate a surgeon’s performance.
- Data may also be used to estimate the pose of an endoscopic camera and surgical instruments to aid in soft-tissue navigation.
- this tracking is not possible with handheld devices that are used alongside robotically controlled instruments. This so-called “hybrid” approach, i.e., where handheld instruments are used alongside robotic instruments, is common in robotic surgery where a handheld instrument is more suitable for a specific task.
- a sensor patch for tracking a surgical instrument includes a flexible printed circuit board substrate, and an antenna array disposed on the substrate and configured to receive an electromagnetic energy transmission.
- the sensor patch also includes a rectifier disposed on the substrate and configured to convert the received electromagnetic energy transmission to direct current, and an inertial measurement unit disposed on the substrate and powered by the direct current.
- the inertial measurement unit is configured to measure at least one movement parameter of the surgical instrument.
- Implementations of the above embodiment may include one or more of the following features.
- the sensor patch may also include an adhesive layer disposed on an attachment surface of the substrate.
- the electromagnetic energy transmission may have a frequency of about 60 GHz.
- the inertial measurement unit may include at least one of an accelerometer, a gyroscope, or a magnetometer.
- the at least one movement parameter of the surgical instrument may be acceleration, angular rate, and/or a magnetic field property.
- a surgical robotic system includes a handheld surgical instrument having a shaft and a detachable sensor patch disposed on the shaft, where the sensor patch is configured to measure at least one movement parameter of the handheld surgical instrument.
- the system also includes a first robotic arm having a camera configured to capture a video of a reference point of the handheld surgical instrument, and a second robotic arm including a robotic instrument.
- the system also includes a controller configured to receive the at least one movement parameter from the sensor patch, receive the video from the camera of the handheld surgical instrument, and localize the handheld surgical instrument relative to the robotic instrument based on the at least one movement parameter and the video of the reference point of the handheld surgical instrument.
- the surgical robotic system may include a plurality of access ports, wherein one or more of the access ports includes a transmitter antenna and an energy source coupled to the transmitter antenna and configured to energize the transmitter antenna to emit an electromagnetic energy transmission.
- the sensor patch may further include a flexible printed circuit board substrate and an adhesive layer disposed on an attachment surface of the substrate.
- the sensor patch may also include an antenna array disposed on the substrate and configured to receive the electromagnetic energy transmission and a rectifier disposed on the substrate and configured to convert the received electromagnetic energy transmission to direct current.
- the sensor patch may additionally include an inertial measurement unit disposed on the substrate and powered by the direct current.
- the inertial measurement unit may be configured to measure at least one movement parameter of the handheld surgical instrument.
- the inertial measurement unit may include one of an accelerometer, a gyroscope, or a magnetometer.
- the electromagnetic energy transmission may have a frequency of about 60 GHz.
- the movement parameter may be one of acceleration, angular rate, or a magnetic field property.
- the controller may be further configured to calculate one or more estimated pixels of a reference point of the handheld surgical instrument based on the at least one movement parameter.
- the controller may be further configured to calculate one or more actual pixels of the reference point of the handheld surgical instrument based on the video.
- the controller may be further configured to localize the handheld surgical instrument based on kinematics data of the first robotic arm.
- a method for localizing a handheld instrument with a surgical robotic system includes wirelessly energizing a sensor patch disposed on a shaft of a handheld laparoscopic instrument through an antenna mounted on an access port.
- the method also includes measuring, at the sensor patch, at least one movement parameter of the handheld instrument.
- the method further includes capturing a video of a reference point of the handheld laparoscopic instrument through a camera held by a robotic arm, and localizing the handheld instrument relative to a robotic instrument held by a second robotic arm based on the at least one movement parameter and the video of the reference point of the handheld instrument.
- Implementations of the above embodiment may include one or more of the following features.
- the method may further include calculating one or more estimated pixels of a reference point of the handheld instrument based on the at least one movement parameter.
- the method may also include calculating at least one actual pixel of the reference point of the handheld instrument based on the video.
- the method may additionally include localizing the handheld instrument based on kinematic data of the robotic arm.
- FIG. 1 is a perspective view of a surgical robotic system including a control tower, a console, and one or more surgical robotic arms each disposed on a mobile cart according to an embodiment of the present disclosure.
- FIG. 2 is a perspective view of a surgical robotic arm of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure.
- FIG. 3 is a perspective view of a mobile cart having a setup arm with the surgical robotic arm of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure.
- FIG. 4 is a schematic diagram of a computer architecture of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure.
- FIG. 5 is a plan schematic view of the surgical robotic system of FIG. 1 positioned about a surgical table according to an embodiment of the present disclosure.
- FIG. 6 is a schematic view of a wireless power transfer system according to an embodiment of the present disclosure.
- FIG. 7 is a perspective view of an exemplary handheld surgical instrument according to an embodiment of the present disclosure.
- FIG. 8 is a schematic diagram of a detachable sensor patch according to an embodiment of the present disclosure.
- FIG. 9 is a perspective view of the detachable sensor patch according to an embodiment of the present disclosure.
- FIG. 10 is a flow chart of an optimization process according to an embodiment of the present disclosure.
- FIG. 11 is a flow chart of a method of localizing the handheld instrument according to an embodiment of the present disclosure.
- a surgical robotic system 10 includes a control tower 20, which is connected to all of the components of the surgical robotic system 10 including a surgeon console 30 and one or more mobile carts 60.
- Each of the mobile carts 60 includes a robotic arm 40 having a surgical instrument 50 removably coupled thereto.
- the robotic arms 40 also couple to the mobile carts 60.
- the robotic system 10 may include any number of mobile carts 60 and/or robotic arms 40.
- the surgical instrument 50 is configured for use during minimally invasive surgical procedures.
- the surgical instrument 50 may be configured for open surgical procedures.
- the surgical instrument 50 may be an electrosurgical forceps configured to seal tissue by compressing tissue between jaw members and applying electrosurgical current thereto.
- the surgical instrument 50 may be a surgical stapler including a pair of jaws configured to grasp and clamp tissue while deploying a plurality of tissue fasteners, e.g., staples, and cutting stapled tissue.
- the surgical instrument 50 may be a surgical clip applier including a pair of jaws configured to apply a surgical clip onto tissue.
- One of the robotic arms 40 may include a laparoscopic camera 51 configured to capture video of the surgical site.
- the laparoscopic camera 51 may be a stereoscopic endoscope configured to capture two side-by-side (i.e., left and right) images of the surgical site to produce a video stream of the surgical scene.
- the laparoscopic camera 51 is coupled to an image processing device 56, which may be disposed within the control tower 20.
- the image processing device 56 may be any computing device configured to receive the video feed from the laparoscopic camera 51 and output the processed video stream.
- the surgeon console 30 includes a first screen 32, which displays a video feed of the surgical site provided by camera 51 of the surgical instrument 50 disposed on the robotic arm 40, and a second screen 34, which displays a user interface for controlling the surgical robotic system 10.
- the first screen 32 and second screen 34 may be touchscreens allowing for displaying various graphical user inputs.
- the surgeon console 30 also includes a plurality of user interface devices, such as foot pedals 36 and a pair of hand controllers 38a and 38b which are used by a user to remotely control robotic arms 40.
- the surgeon console further includes an armrest 33 used to support the clinician’s arms while operating the hand controllers 38a and 38b.
- the control tower 20 includes a screen 23, which may be a touchscreen, and may display outputs on graphical user interfaces (GUIs).
- the control tower 20 also acts as an interface between the surgeon console 30 and one or more robotic arms 40.
- the control tower 20 is configured to control the robotic arms 40, such as to move the robotic arms 40 and the corresponding surgical instrument 50, based on a set of programmable instructions and/or input commands from the surgeon console 30, in such a way that robotic arms 40 and the surgical instrument 50 execute a desired movement sequence in response to input from the foot pedals 36 and the hand controllers 38a and 38b.
- the foot pedals 36 may be used to enable and lock the hand controllers 38a and 38b, to reposition the camera, and to activate/deactivate electrosurgical energy.
- the foot pedals 36 may be used to perform a clutching action on the hand controllers 38a and 38b. Clutching is initiated by pressing one of the foot pedals 36, which disconnects (i.e., prevents movement inputs) the hand controllers 38a and/or 38b from the robotic arm 40 and corresponding instrument 50 or camera 51 attached thereto. This allows the user to reposition the hand controllers 38a and 38b without moving the robotic arm(s) 40 and the instrument 50 and/or camera 51. This is useful when reaching control boundaries of the surgical space.
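- A minimal sketch of this virtual clutch behavior, assuming hypothetical names for the pedal state and the motion-command interface (neither comes from the disclosure):

```python
def forward_hand_controller_motion(pedal_pressed: bool,
                                   hand_controller_delta: tuple[float, float, float]):
    """Gate hand-controller motion commands with a virtual clutch.

    When the clutch pedal is pressed, the hand controller is disconnected from
    the robotic arm so it can be repositioned without moving the arm or instrument.
    """
    if pedal_pressed:
        return (0.0, 0.0, 0.0)       # clutch engaged: no motion is forwarded
    return hand_controller_delta     # clutch released: motion passes through


# Example: repositioning the hand controller near a workspace boundary.
print(forward_hand_controller_motion(True, (0.01, 0.0, 0.002)))   # (0.0, 0.0, 0.0)
print(forward_hand_controller_motion(False, (0.01, 0.0, 0.002)))  # passes through
```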
- Each of the control tower 20, the surgeon console 30, and the robotic arm 40 includes a respective computer 21, 31, 41.
- the computers 21, 31, 41 are interconnected to each other using any suitable communication network based on wired or wireless communication protocols.
- Suitable protocols include, but are not limited to, transmission control protocol/internet protocol (TCP/IP), user datagram protocol/internet protocol (UDP/IP), and/or datagram congestion control protocol (DCCP).
- Wireless communication may be achieved via one or more wireless configurations, e.g., radio frequency, optical, Wi-Fi, Bluetooth (an open wireless protocol for exchanging data over short distances, using short-wavelength radio waves, from fixed and mobile devices, creating personal area networks (PANs)), and ZigBee® (a specification for a suite of high-level communication protocols using small, low-power digital radios based on the IEEE 802.15.4-2003 standard for wireless personal area networks (WPANs)).
- the computers 21, 31, 41 may include any suitable processor (not shown) operably connected to a memory (not shown), which may include one or more of volatile, non-volatile, magnetic, optical, or electrical media, such as read-only memory (ROM), random access memory (RAM), electrically-erasable programmable ROM (EEPROM), non-volatile RAM (NVRAM), or flash memory.
- the processor may be any suitable processor (e.g., control circuit) adapted to perform the operations, calculations, and/or set of instructions described in the present disclosure including, but not limited to, a hardware processor, a field programmable gate array (FPGA), a digital signal processor (DSP), a central processing unit (CPU), a microprocessor, and combinations thereof.
- the setup arm 61 includes a first link 62a, a second link 62b, and a third link 62c, which provide for lateral maneuverability of the robotic arm 40.
- the links 62a, 62b, 62c are interconnected at joints 63a and 63b, each of which may include an actuator (not shown) for rotating the links 62a, 62b, 62c relative to each other.
- the links 62a, 62b, 62c are movable in their corresponding lateral planes that are parallel to each other, thereby allowing for extension of the robotic arm 40 relative to the patient (e.g., surgical table).
- the robotic arm 40 may be coupled to the surgical table (not shown).
- the setup arm 61 includes controls 65 for adjusting movement of the links 62a, 62b, 62c as well as the lift 67.
- the setup arm 61 may include any type and/or number of joints.
- the third link 62c may include a rotatable base 64 having two degrees of freedom.
- the rotatable base 64 includes a first actuator 64a and a second actuator 64b.
- the first actuator 64a is rotatable about a first stationary arm axis which is perpendicular to a plane defined by the third link 62c and the second actuator 64b is rotatable about a second stationary arm axis which is transverse to the first stationary arm axis.
- the first and second actuators 64a and 64b allow for full three-dimensional orientation of the robotic arm 40.
- the actuator 48b of the joint 44b is coupled to the joint 44c via the belt 45a, and the joint 44c is in turn coupled to the joint 46b via the belt 45b.
- Joint 44c may include a transfer case coupling the belts 45a and 45b, such that the actuator 48b is configured to rotate each of the links 42b, 42c and a holder 46 relative to each other. More specifically, links 42b, 42c, and the holder 46 are passively coupled to the actuator 48b which enforces rotation about a pivot point “P” which lies at an intersection of the first axis defined by the link 42a and the second axis defined by the holder 46. In other words, the pivot point “P” is a remote center of motion (RCM) for the robotic arm 40.
- the actuator 48b controls the angle θ between the first and second axes allowing for orientation of the surgical instrument 50. Due to the interlinking of the links 42a, 42b, 42c, and the holder 46 via the belts 45a and 45b, the angles between the links 42a, 42b, 42c, and the holder 46 are also adjusted in order to achieve the desired angle θ. In embodiments, some or all of the joints 44a, 44b, 44c may include an actuator to obviate the need for mechanical linkages.
- the joints 44a and 44b include an actuator 48a and 48b configured to drive the joints 44a, 44b, 44c relative to each other through a series of belts 45a and 45b or other mechanical linkages such as a drive rod, a cable, or a lever and the like.
- the actuator 48a is configured to rotate the robotic arm 40 about a longitudinal axis defined by the link 42a.
- the holder 46 defines a second longitudinal axis and is configured to receive an instrument drive unit (IDU) 52 (FIG. 1).
- the IDU 52 is configured to couple to an actuation mechanism of the surgical instrument 50 and the camera 51 and is configured to move (e.g., rotate) and actuate the instrument 50 and/or the camera 51.
- IDU 52 transfers actuation forces from its actuators to the surgical instrument 50 to actuate components of an end effector 49 of the surgical instrument 50.
- the holder 46 includes a sliding mechanism 46a, which is configured to move the IDU 52 along the second longitudinal axis defined by the holder 46.
- the holder 46 also includes a joint 46b, which rotates the holder 46 relative to the link 42c.
- the instrument 50 may be inserted through an endoscopic access port 55 (FIG. 3) held by the holder 46.
- the holder 46 also includes a port latch 46c for securing the access port 55 to the holder 46 (FIG. 2).
- the robotic arm 40 also includes a plurality of manual override buttons 53 (FIG. 1) disposed on the IDU 52 and the setup arm 61, which may be used in a manual mode. The user may press one or more of the buttons 53 to move the component associated with the button 53.
- each of the computers 21, 31, 41 of the surgical robotic system 10 may include a plurality of controllers, which may be embodied in hardware and/or software.
- the computer 21 of the control tower 20 includes a controller 21a and safety observer 21b.
- the controller 21a receives data from the computer 31 of the surgeon console 30 about the current position and/or orientation of the hand controllers 38a and 38b and the state of the foot pedals 36 and other buttons.
- the controller 21a processes these input positions to determine desired drive commands for each joint of the robotic arm 40 and/or the IDU 52 and communicates these to the computer 41 of the robotic arm 40.
- the controller 21a also receives the actual joint angles measured by encoders of the actuators 48a and 48b and uses this information to determine force feedback commands that are transmitted back to the computer 31 of the surgeon console 30 to provide haptic feedback through the hand controllers 38a and 38b.
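- As an illustration only (the gains and interfaces below are assumed, not taken from the disclosure), force feedback could be rendered from the difference between commanded and measured joint angles roughly as follows:

```python
import numpy as np

def force_feedback_commands(desired_angles: np.ndarray,
                            measured_angles: np.ndarray,
                            gains: np.ndarray) -> np.ndarray:
    """Map joint-angle tracking error to haptic feedback values.

    desired_angles:  joint commands sent to the robotic arm (rad)
    measured_angles: actual joint angles reported by the actuator encoders (rad)
    gains:           per-joint stiffness used to render the error as feedback
    """
    error = desired_angles - measured_angles
    return gains * error  # torque-like values sent back to the hand controllers


# Example with purely illustrative numbers.
desired = np.array([0.50, -0.20, 1.10])
measured = np.array([0.48, -0.25, 1.10])
print(force_feedback_commands(desired, measured, gains=np.array([2.0, 2.0, 1.5])))
```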
- the safety observer 21b performs validity checks on the data going into and out of the controller 21a and notifies a system fault handler if errors in the data transmission are detected to place the computer 21 and/or the surgical robotic system 10 into a safe state.
- the controller 21a is coupled to a storage 22a, which may be a non-transitory computer-readable medium configured to store any suitable computer data, such as software instructions executable by the controller 21a.
- the controller 21a also includes transitory memory 22b for loading instructions and other computer readable data during execution of the instructions.
- other controllers of the system 10 include similar configurations.
- the computer 41 includes a plurality of controllers, namely, a main cart controller 41a, a setup arm controller 41b, a robotic arm controller 41c, and an instrument drive unit (IDU) controller 41d.
- the main cart controller 41a receives and processes joint commands from the controller 21a of the computer 21 and communicates them to the setup arm controller 41b, the robotic arm controller 41c, and the IDU controller 41d.
- the main cart controller 41a also manages instrument exchanges and the overall state of the mobile cart 60, the robotic arm 40, and the IDU 52.
- the main cart controller 41a also communicates actual joint angles back to the controller 21a.
- Each of joints 63a and 63b and the rotatable base 64 of the setup arm 61 are passive joints (i.e., no actuators are present therein) allowing for manual adjustment thereof by a user.
- the joints 63a and 63b and the rotatable base 64 include brakes that are disengaged by the user to configure the setup arm 61.
- the setup arm controller 41b monitors slippage of each of the joints 63a and 63b and the rotatable base 64 of the setup arm 61 when the brakes are engaged; when the brakes are disengaged, these joints can be freely moved by the operator but do not impact the controls of the other joints.
- the robotic arm controller 41c controls each joint 44a and 44b of the robotic arm 40 and calculates desired motor torques required for gravity compensation, friction compensation, and closed loop position control of the robotic arm 40.
- the robotic arm controller 41c calculates a movement command based on the calculated torque.
- the calculated motor commands are then communicated to one or more of the actuators 48a and 48b in the robotic arm 40.
- the actual joint positions are then transmitted by the actuators 48a and 48b back to the robotic arm controller 41c.
- the IDU controller 41d receives desired joint angles for the surgical instrument 50, such as wrist and jaw angles, and computes desired currents for the motors in the IDU 52.
- the IDU controller 41d calculates actual angles based on the motor positions and transmits the actual angles back to the main cart controller 41a.
- the robotic arm 40 is controlled in response to a pose of the hand controller controlling the robotic arm 40, e.g., the hand controller 38a, which is transformed into a desired pose of the robotic arm 40 through a hand eye transform function executed by the controller 21a.
- the hand eye transform function, as well as other functions described herein, is embodied in software executable by the controller 21a or any other suitable controller described herein.
- the pose of one of the hand controllers 38a may be embodied as a coordinate position and roll-pitch-yaw (RPY) orientation relative to a coordinate reference frame, which is fixed to the surgeon console 30.
- the desired pose of the instrument 50 is relative to a fixed frame on the robotic arm 40.
- the pose of the hand controller 38a is then scaled by a scaling function executed by the controller 21a.
- the coordinate position may be scaled down and the orientation may be scaled up by the scaling function.
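- A minimal sketch of such a scaling step, with purely illustrative scale factors (the disclosure does not specify values):

```python
import numpy as np

def scale_hand_controller_pose(position_m: np.ndarray,
                               rpy_rad: np.ndarray,
                               position_scale: float = 0.25,
                               orientation_scale: float = 1.5):
    """Scale a hand-controller pose before mapping it to the robotic arm.

    The coordinate position is scaled down (fine motion at the instrument tip)
    while the roll-pitch-yaw orientation is scaled up.
    """
    return position_scale * position_m, orientation_scale * rpy_rad


pos, rpy = scale_hand_controller_pose(np.array([0.04, 0.00, -0.02]),
                                      np.array([0.10, 0.05, 0.00]))
print(pos, rpy)
```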
- the controller 21a may also execute a clutching function, which disengages the hand controller 38a from the robotic arm 40.
- the controller 21a stops transmitting movement commands from the hand controller 38a to the robotic arm 40 if certain movement limits or other thresholds are exceeded and in essence acts like a virtual clutch mechanism, e.g., limits mechanical input from effecting mechanical output.
- the desired pose of the robotic arm 40 is based on the pose of the hand controller 38a and is then passed to an inverse kinematics function executed by the controller 21a.
- the inverse kinematics function calculates angles for the joints 44a, 44b, 44c of the robotic arm 40 that achieve the scaled and adjusted pose input by the hand controller 38a.
- the calculated angles are then passed to the robotic arm controller 41c, which includes a joint axis controller having a proportional-derivative (PD) controller, the friction estimator module, the gravity compensator module, and a two-sided saturation block, which is configured to limit the commanded torque of the motors of the joints 44a, 44b, 44c.
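- The sketch below shows the general shape of such a joint-axis controller: a PD term plus gravity and friction compensation, followed by two-sided torque saturation. The gains, limits, and compensation values are placeholders, not values from the disclosure.

```python
import numpy as np

def joint_axis_controller(q_des, q_meas, qd_meas,
                          kp, kd, gravity_torque, friction_torque,
                          torque_limit):
    """One control cycle of a PD joint controller with compensation terms.

    q_des/q_meas:    desired and measured joint angles (rad)
    qd_meas:         measured joint velocities (rad/s)
    gravity_torque:  output of a gravity compensator model (N*m)
    friction_torque: output of a friction estimator (N*m)
    torque_limit:    two-sided saturation limit per joint (N*m)
    """
    tau = kp * (q_des - q_meas) - kd * qd_meas + gravity_torque + friction_torque
    return np.clip(tau, -torque_limit, torque_limit)  # two-sided saturation block


tau = joint_axis_controller(q_des=np.array([0.6, -0.3, 1.2]),
                            q_meas=np.array([0.55, -0.28, 1.25]),
                            qd_meas=np.array([0.02, -0.01, 0.00]),
                            kp=40.0, kd=2.0,
                            gravity_torque=np.array([1.1, 0.4, 0.0]),
                            friction_torque=np.array([0.2, 0.2, 0.1]),
                            torque_limit=10.0)
print(tau)
```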
- the surgical robotic system 10 is setup around a surgical table 90.
- the system 10 includes mobile carts 60a-d, which may be numbered “1” through “4.”
- each of the carts 60a-d are positioned around the surgical table 90.
- Position and orientation of the carts 60a-d depends on a plurality of factors, such as placement of a plurality of access ports 55a-d, which in turn, depends on the surgery being performed.
- the access ports 55a-d are inserted into the patient, and carts 60a-d are positioned to insert instruments 50 and the laparoscopic camera 51 into corresponding ports 55a-d.
- each of the robotic arms 40a-d is attached to one of the access ports 55a-d that is inserted into the patient by attaching the latch 46c (FIG. 2) to the access port 55 (FIG. 3).
- the IDU 52 is attached to the holder 46, followed by the SIM 43 being attached to a distal portion of the IDU 52.
- the instrument 50 is attached to the SIM 43.
- the instrument 50 is then inserted through the access port 55 by moving the IDU 52 along the holder 46.
- the SIM 43 includes a plurality of drive shafts configured to transmit rotation of individual motors of the IDU 52 to the instrument 50 thereby actuating the instrument 50.
- the SIM 43 provides a sterile barrier between the instrument 50 and the other components of robotic arm 40, including the IDU 52.
- the SIM 43 is also configured to secure a sterile drape (not shown) to the IDU 52.
- the system 10 also includes a power source 110 configured to transmit RF power in an ultra-wideband spectrum from about 30 GHz to about 300 GHz, i.e., in the millimeter wave (mmWave) spectrum, and in embodiments at about 60 GHz.
- the power source 110 is coupled to one or more antennas 130 via a feedline 120.
- the antenna 130 is disposed on one or more of the access ports 55a-d and is configured to emit mmWave energy within the patient cavity.
- the energy is received by a detachable sensor patch 140 disposed on a handheld surgical instrument 150 of FIG. 7.
- Handheld surgical instrument 150 may be any handheld, laparoscopic, manually actuated or powered surgical instrument, such as an electrosurgical instrument (e.g., vessel sealer, dissector, etc.), a surgical stapler, a grasper, a scissor, etc.
- Handheld surgical instrument 150 includes a handle 152, a longitudinal shaft 154 extending distally therefrom, and an end effector 156 disposed at a distal end portion of the shaft 154.
- the shaft 154 is sized to be insertable through one of the access ports 55a-d and the sensor patch 140 is disposed along any portion of the shaft 154 that is disposed within the patient cavity during the use of the surgical instrument 150. This allows the sensor patch 140 to receive the energy emitted by the antenna 130 disposed on one of the access ports 55a-d.
- the sensor patch 140 includes a flexible printed circuit board (PCB) substrate 142 having an attachment surface 144 with an adhesive layer 146 disposed thereon, which may include any suitable non-permanent adhesive composition to allow for removal of the sensor patch 140 from the shaft 154.
- the sensor patch 140 may have any suitable shape, such as polygonal (e.g., rectangular), curved (e.g., circular), or any combination thereof.
- the sensor patch 140 is attached to the shaft 154 prior to insertion of the shaft 154 through the access port 55a-d and is removed after the instrument 150 is no longer being used.
- the sensor patch 140 may have a width of about 10 mm, a length of about 30 mm, and a thickness of about 1 mm.
- the thickness of the sensor patch 140 may vary depending on the thickness of the integrated circuit (IC) components disposed on a sensor (i.e., upper) surface 145 of the PCB substrate 142.
- the sensor patch 140 includes a plurality of IC components, such as, a microcontroller (MCU) 160, an inertial measurement unit (IMU) 162, a memory 164, a voltage regulator 166, a rectifier 168, and a mmWave antenna array 170 configured to receive the energy emitted by the antenna 130 thereby resulting in wireless power transfer (WPT) from the antenna 130.
- the EM waves are received by the antenna array 170 and then passed down to the rectifier 168 to be converted from AC to DC.
- the voltage regulator 166 stabilizes power supplies to ensure IMU 162, MCU 160, and the memory 164 function properly.
- the MCU 160 may be any suitable processor (e.g., control circuit) adapted to perform the operations, calculations, and/or set of instructions described in the present disclosure including, but not limited to, a hardware processor, a field programmable gate array (FPGA), a digital signal processor (DSP), a central processing unit (CPU), a microprocessor, and combinations thereof.
- the memory 164 may be electrically erasable programmable ROM (EEPROM), non-volatile RAM (NVRAM), or flash memory.
- the IMU 162 may include one or more of the following types of sensors integrated in a single system-in-package IC component, such as gyroscopes, accelerometers, magnetometer, etc.
- the IMU 162 measures and reports raw or filtered angular rate and specific force/acceleration experienced by the sensor patch 140, as well as other parameters, depending on the sensor package. Coordinated by the MCU 160, this data may then be written into the onboard memory 164 for later retrieval.
- the sensor patch 140 may also include additional components, such as an onboard wireless communication module 165 (e.g., Bluetooth Low Energy) which may transmit IMU measurements in real-time, obviating the need for post-procedure retrieval.
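- A hedged sketch of the logging behavior described above (sampling the IMU 162, writing samples into the onboard memory 164, and optionally streaming them through the wireless module 165); the sample structure and driver callables are hypothetical stand-ins, not an actual MCU or IMU API:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ImuSample:
    t: float                      # timestamp (s)
    accel: tuple                  # body-frame acceleration (m/s^2)
    gyro: tuple                   # angular rate (rad/s)
    mag: Optional[tuple] = None   # magnetic field, if a magnetometer is present

def log_imu(read_sample, memory: List[ImuSample], ble_send=None, n_samples: int = 100):
    """Read n_samples from the IMU, store them, and optionally stream them.

    read_sample: callable returning the next ImuSample (hypothetical driver)
    memory:      list standing in for the onboard memory 164
    ble_send:    optional callable standing in for the wireless module 165
    """
    for _ in range(n_samples):
        sample = read_sample()
        memory.append(sample)      # write into onboard memory for later retrieval
        if ble_send is not None:
            ble_send(sample)       # real-time transmission, if available
```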
- An environmental (e.g., pressure, temperature, light, moisture, etc.) sensor may also be integrated into the sensor patch 140.
- the sensor patch 140 may also include one or more electrical contacts (not shown) configured to couple to counterpart contacts (not shown) disposed on the shaft 154 of the instrument 150 allowing for the sensor patch 140 to be connected to an external power source (e.g., a battery disposed in the handle 152 of the instrument 150).
- the antenna array 170 may include a plurality of antennas 171.
- the configuration and dimensions of antenna array 170 depend on the frequency of the EM waves emitted by the antenna 130.
- the 57 GHz to 64 GHz V-band is unlicensed spectrum designated by the FCC.
- the size of each antenna 171 of the antenna array 170 may be from about 2 mm to about 3 mm to provide sufficient radiation power efficiency.
- the antenna array 170 may include multiple elements (e.g., arranged in a 2x2 configuration).
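- As a check on the element size quoted above, simple free-space wavelength arithmetic at an assumed 60 GHz carrier gives roughly 5 mm, so a half-wave element is about 2.5 mm, i.e., within the 2 mm to 3 mm range (on a PCB substrate the effective size would be somewhat smaller):

```python
c = 299_792_458.0   # speed of light (m/s)
f = 60e9            # illustrative mmWave frequency (Hz)

wavelength_mm = c / f * 1e3
half_wave_mm = wavelength_mm / 2

print(f"wavelength ~ {wavelength_mm:.2f} mm")        # ~5.00 mm
print(f"half-wave element ~ {half_wave_mm:.2f} mm")  # ~2.50 mm
```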
- the sensor patch 140 may be adhered to a specific location on the shaft 154.
- the sensor patch 140 may be placed on the shaft 154 such that a geometric center of the sensor patch 140 is from about 10 cm to about 20 cm, i.e., distance d (FIG. 7) away from a distal end of the end effector 156.
- the sensor patch 140 may be placed such that it is not too close to the end effector 156 where other robotically controlled instruments 50 may scatter or block the EM waves.
- the sensor patch 140 may be also placed such that it is not too close to the proximal end portion of the shaft 154 either because the abdominal wall absorbs waves traveling across it.
- the sensor patch 140 may be also placed at distance that makes it visible in an endoscopic view to the camera 51.
- the IMU 162 outputs sensor data based on the type of sensors embedded in its package and may include body-frame accelerations, angular rates, magnetic field measurements, etc. In the absence of a registration process, this data cannot be directly used in conjunction with the kinematic data of the robotic arms 40 to infer the position of the handheld instrument 150 relative to robotically-controlled instruments 50. Registration may be accomplished by determining the initial pose of the IMU 162 at activation in a global coordinate system shared with the instruments 50 of the robotic system 10, using computer vision algorithms that process the video feed of the camera 51. The calculations and determination of the location of the IMU 162 may be performed by the controller 21a of the robotic system 10.
- the IMU 162 body-frame (i.e., local) pose may be represented by a 6-dimensional vector P_IMU = (x_IMU, y_IMU, z_IMU, α_IMU, β_IMU, γ_IMU). Since there is no relative motion between a distal (i.e., reference) point 151 of the shaft 154 and the IMU 162 at any point in time throughout the procedure, the position of the reference point 151, P_D = (x_D, y_D, z_D), in the global coordinate system after IMU 162 activation may be uniquely determined based on its initial position P_D0 = (x_D0, y_D0, z_D0). P_IMU is calculated from measurements from the IMU 162. P_D0 may be determined using the optimization process of FIG. 10.
- In FIG. 10, a deterministic representation of a point in the 3-D world frame is shown as a pixel frame, i.e., a video frame captured by the camera 51.
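- Because there is no relative motion between the IMU 162 and the reference point 151, the position of the reference point can be propagated from the IMU pose with a single rigid transform. The sketch below is illustrative only: it assumes a Z-Y-X roll-pitch-yaw convention and a known, fixed patch-to-reference-point offset (neither is specified in the disclosure).

```python
import numpy as np

def rpy_to_rotation(alpha, beta, gamma):
    """Rotation matrix from roll (alpha), pitch (beta), yaw (gamma), Z-Y-X convention."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    return Rz @ Ry @ Rx

def reference_point_position(p_imu, rpy_imu, offset_body):
    """P_D = p_IMU + R(rpy_IMU) @ offset, with a rigid IMU-to-reference-point offset."""
    return np.asarray(p_imu) + rpy_to_rotation(*rpy_imu) @ np.asarray(offset_body)

# Illustrative numbers only: an IMU pose in the global frame and a 15 cm offset along the shaft.
print(reference_point_position(p_imu=[0.10, 0.02, 0.05],
                               rpy_imu=[0.0, 0.1, 0.0],
                               offset_body=[0.15, 0.0, 0.0]))
```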
- the extrinsic parameters represent the location of the camera 51 in the 3-D scene.
- the intrinsic parameters represent the optical center and focal length of the camera 51.
- the world points are transformed to camera coordinates using the extrinsic parameters.
- the camera coordinates are then mapped into the image plane using the intrinsic parameters.
- the intrinsic parameters of the camera 51 are known or may be calculated (i.e., empirically derived) from the position of the camera 51 in the world frame using kinematics data (e.g., joint angles) of the robotic arm 40 holding the camera 51.
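- The world-to-pixel mapping described above can be written as a standard pinhole projection; the intrinsic and extrinsic values below are placeholders used only to show the shape of the computation:

```python
import numpy as np

def project_world_point(p_world, R_cam, t_cam, K):
    """Project a 3-D world point into pixel coordinates.

    R_cam, t_cam: extrinsic parameters (world -> camera), e.g., derived from the
                  kinematics of the robotic arm holding the camera
    K:            3x3 intrinsic matrix (optical center and focal length)
    """
    p_cam = R_cam @ np.asarray(p_world) + t_cam  # world -> camera coordinates
    uvw = K @ p_cam                              # camera -> image plane
    return uvw[:2] / uvw[2]                      # normalize to pixel coordinates


K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R_cam = np.eye(3)
t_cam = np.array([0.0, 0.0, 0.3])  # camera 0.3 m from the scene origin along its optical axis
print(project_world_point([0.02, -0.01, 0.0], R_cam, t_cam, K))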
- Kinematics data is then used to determine the rigid 3D-to-3D transformation for localizing the sensor patch 140 and the transformation is dependent on the position of the reference point 151 of the shaft 154.
- P_D0 may be assigned an initial value in world coordinates and is then used to calculate corresponding pixel coordinates [x'_D0, y'_D0] using the process of FIG. 10.
- the estimated pixel coordinate values are then compared to actual pixel coordinate values and a cost function is constructed based on the differences.
- Various optimization methods may then be used to iteratively update the initial pose estimate to drive the cost down until the difference between estimated and actual pixel coordinate values is within a desired tolerance. Subsequently, the iterative process determines the actual position of the reference point 151 in the 3D world frame.
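- A hedged sketch of that iterative refinement, using a generic least-squares solver over the reprojection error; the optimizer choice, the multi-frame formulation, and the project_fn helper are assumptions for illustration rather than the specific method prescribed by the disclosure.

```python
import numpy as np
from scipy.optimize import least_squares

def refine_initial_position(p_d0_guess, actual_pixels, project_fn, tol_px=1.0):
    """Iteratively refine the initial reference-point position P_D0.

    p_d0_guess:    initial 3-D estimate of P_D0 in world coordinates
    actual_pixels: (N, 2) array of reference-point pixels detected in N video frames
    project_fn:    maps a candidate P_D0 to an (N, 2) array of estimated pixels,
                   e.g., by propagating it with the IMU data and projecting each frame
    """
    actual = np.asarray(actual_pixels, dtype=float)

    def residual(p):
        # Cost is built from the difference between estimated and actual pixels.
        return (project_fn(p) - actual).ravel()

    result = least_squares(residual, np.asarray(p_d0_guess, dtype=float))
    converged = np.max(np.abs(residual(result.x))) <= tol_px
    return result.x, converged
```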
- a method for localizing the handheld instrument 150 in use along with the robotic system 10 is disclosed. Portions of the method may be implemented as software instructions executable by a processor, e.g., the controller 21a.
- the instrument 150 along with the sensor patch 140 is inserted into the patient through one of the access ports 55a-d.
- the antenna 130 is energized and the sensor patch 140 is activated as described above at step 202.
- the camera 51 captures a video of the instrument 150, i.e., the reference point 151, and provides the video to the controller 21a for image processing.
- the controller 21a also receives IMU data from the sensor patch 140.
- the controller 21a then performs the optimization process of FIG. 10 until the estimated pixel coordinate values sufficiently align with actual pixel coordinate values.
- the controller 21a calculates estimated pixel coordinate values of the reference point 151 and actual pixel coordinate values of the reference point 151. Estimated pixel coordinate values are calculated based on the IMU data and actual pixel coordinate values may be calculated using any suitable computer vision algorithm, such as a deep learning algorithm configured to detect pixels corresponding to the reference point 151.
- the estimated and actual pixel coordinate values are compared to calculate a difference therebetween.
- the difference is compared to a threshold indicative of a desired tolerance for matching the IMU data with the video feed of the camera 51.
- at step 214, the controller 21a outputs a message indicating that the handheld instrument 150 has been localized relative to the robotically controlled instrument(s) 50.
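- Tying the steps together, a simplified driver for the localization method of FIG. 11 might look like the sketch below; every callable here is a placeholder for the corresponding step (energizing the patch, reading the IMU, grabbing a frame, estimating and detecting pixels), not an API from the disclosure.

```python
def localize_handheld_instrument(energize_patch, read_imu, grab_frame,
                                 estimate_pixels, detect_pixels,
                                 update_estimate, tolerance_px=1.0,
                                 max_iterations=50):
    """Simplified flow of the localization method (steps 202-214)."""
    energize_patch()                       # wirelessly energize the sensor patch
    estimate = None
    for _ in range(max_iterations):
        imu_data = read_imu()              # movement parameters from the patch
        frame = grab_frame()               # video frame from the robotic camera
        est_px = estimate_pixels(imu_data, estimate)
        act_px = detect_pixels(frame)      # computer-vision detection of point 151
        diff = max(abs(e - a) for e, a in zip(est_px, act_px))
        if diff <= tolerance_px:
            return estimate, "handheld instrument localized"
        estimate = update_estimate(estimate, est_px, act_px)
    return estimate, "did not converge"
```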
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Robotics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Power Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Manipulator (AREA)
Abstract
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202480016920.4A CN120835775A (zh) | 2023-03-09 | 2024-02-28 | Sensor patch for tracking surgical instruments |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363451051P | 2023-03-09 | 2023-03-09 | |
| US63/451,051 | 2023-03-09 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024184739A1 (fr) | 2024-09-12 |
Family
ID=90195459
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/051919 Pending WO2024184739A1 (fr) | Sensor patch for tracking surgical instruments | 2023-03-09 | 2024-02-28 |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN120835775A (fr) |
| WO (1) | WO2024184739A1 (fr) |
-
2024
- 2024-02-28 WO PCT/IB2024/051919 patent/WO2024184739A1/fr active Pending
- 2024-02-28 CN CN202480016920.4A patent/CN120835775A/zh active Pending
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20200186179A1 (en) * | 2018-12-06 | 2020-06-11 | Apple Inc. | Electronic Devices Having Circuitry in Housing Attachment Structures |
| US20200405219A1 (en) * | 2019-06-28 | 2020-12-31 | Orthosensor Inc | Orthopedic system for pre-operative, intra-operative, and post-operative assessment |
| US20230011384A1 (en) * | 2021-07-07 | 2023-01-12 | Springloc Ltd. | Apparatus and method for passive markers localization within a body |
Also Published As
| Publication number | Publication date |
|---|---|
| CN120835775A (zh) | 2025-10-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP4275642A1 (fr) | Identification et suivi de position d'instrument en temps réel | |
| US11948226B2 (en) | Systems and methods for clinical workspace simulation | |
| US20240324856A1 (en) | Surgical trocar with integrated cameras | |
| US20250134609A1 (en) | Setting remote center of motion in surgical robotic system | |
| US20240407865A1 (en) | Determining information about a surgical port in a surgical robotic system | |
| US12472022B2 (en) | Surgical robotic system with daisy chaining | |
| US20240058031A1 (en) | System and method for port placement in a surgical robotic system | |
| US12479098B2 (en) | Surgical robotic system with access port storage | |
| WO2024184739A1 (fr) | Sensor patch for tracking surgical instruments | |
| US20240415590A1 (en) | Surgeon control of robot mobile cart and setup arm | |
| US20230255705A1 (en) | System and method for calibrating a surgical instrument | |
| WO2024201216A1 (fr) | Système robotique chirurgical et méthode pour empêcher une collision d'instrument | |
| EP4154837A1 (fr) | Configuration de système robotique chirurgical | |
| US20240374330A1 (en) | System of operating surgical robotic systems with access ports of varying length | |
| US20240341883A1 (en) | Bedside setup process for movable arm carts in surgical robotic system | |
| US20250235274A1 (en) | Component presence and identification in surgical robotic system | |
| US20240341878A1 (en) | Surgical robotic system with orientation setup device and method | |
| US20250017674A1 (en) | Surgical robotic system and method for optical measurement of end effector pitch, yaw, and jaw angle | |
| EP4649371A1 (fr) | Système robotique chirurgical et procédé de communication entre une console de chirurgien et un assistant de chevet | |
| WO2024157113A1 (fr) | Système robotique chirurgical et procédé de placement d'orifice d'accès assisté | |
| EP4648702A1 (fr) | Système robotique chirurgical et méthode de navigation d'instruments chirurgicaux | |
| WO2025041075A1 (fr) | Système robotique chirurgical et procédé de détection de ports d'accès | |
| EP4444210A1 (fr) | Leviers de commande à pied d'interface utilisateur graphique pour système robotique chirurgical |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24709519; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 202480016920.4; Country of ref document: CN |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024709519; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWP | Wipo information: published in national office | Ref document number: 202480016920.4; Country of ref document: CN |
| | ENP | Entry into the national phase | Ref document number: 2024709519; Country of ref document: EP; Effective date: 20251009 |