WO2025158442A1 - Systems and methods for determining and applying a deflection compensation to robotic systems - Google Patents
Info
- Publication number
- WO2025158442A1 (PCT/IL2025/050086)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- deflection
- sensor
- deflection compensation
- sensor data
- robotic arm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Leader-follower robots
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1638—Programme controls characterised by the control loop compensation for arm bending/inertia, pay load weight/inertia
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1641—Programme controls characterised by the control loop compensation for backlash, friction, compliance, elasticity in the joints
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B2034/302—Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/065—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/066—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring torque
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/08—Accessories or related features not otherwise provided for
- A61B2090/0818—Redundant systems, e.g. using two independent measuring systems and comparing the signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/32—Surgical robots operating autonomously
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45117—Medical, radio surgery manipulator
Definitions
- the present disclosure is generally directed to mechanical deflection compensation, and relates more particularly to determining and applying a deflection compensation to robotic systems.
- Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously.
- Providing controllable linked articulating members allows a surgical robot to reach areas of a patient anatomy during various medical procedures.
- Example aspects of the present disclosure include:
- a system for determining and applying a deflection compensation comprises a robotic arm; a tracked portion attached to the robotic arm, the tracked portion configured to support and orient an end effector; a sensor configured to sense a load and to yield sensor data; at least one processor; and at least one memory storing instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive sensor data from the sensor, the sensor data corresponding to a load exerted on the tracked portion; receive pose information corresponding to a pose of the tracked portion; determine a deflection compensation of the tracked portion based on the pose information and the load exerted on the tracked portion; and apply the deflection compensation to the robotic arm.
- the sensor data is first sensor data and the pose information is first pose information
- the memory stores additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive second sensor data from the sensor, the second sensor data corresponding to a load exerted on the end effector; receive second pose information corresponding to a pose of the end effector; and determine a deflection compensation of the end effector based on the second sensor data and the second pose information.
- the memory stores additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive third sensor data from the sensor, the third sensor data corresponding to a load exerted on the robotic arm; receive third pose information corresponding to a pose of the robotic arm; and determine a deflection compensation of the robotic arm based on the third sensor data and the third pose information.
- determining the deflection compensation of the tracked portion includes retrieving the deflection compensation from a lookup table, the lookup table having a plurality of deflection compensations for the tracked portion based on a plurality of parameters
- determining the deflection compensation of the end effector includes retrieving the deflection compensation of the end effector from the lookup table, the lookup table having a plurality of deflection compensations for the end effector based on a plurality of parameters
- determining the deflection compensation of the robotic arm includes retrieving the deflection compensation of the robotic arm from the lookup table, the lookup table having a plurality of deflection compensations for the robotic arm based on a plurality of parameters.
- the sensor is integrated with the tracked portion.
- the sensor is at least one of a force sensor, a torque sensor, or a combination of a force and torque sensor, and wherein the load is at least one of a force, a torque, or a combination of a force and a torque.
- determining the deflection compensation of the tracked portion includes retrieving the deflection compensation from a lookup table, the lookup table having a plurality of deflection compensations based on a plurality of parameters.
- any of the aspects herein, wherein the plurality of parameters include at least one of pose information, load information, gravity information, or any combination thereof.
- simulation comprises a finite element analysis simulation.
- the memory stores additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive gravity information corresponding to gravity force exerted on the tracked portion, wherein determining the deflection compensation is also based on the gravity information.
- a system for determining and applying a deflection compensation comprises a robotic arm; a tracked portion attached to the robotic arm, the tracked portion configured to support and orient an end effector; a sensor configured to sense a load and to yield sensor data; at least one processor; and at least one memory storing instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive first sensor data from the sensor, the first sensor data corresponding to a load exerted on the tracked portion; receive second sensor data from the sensor, the second sensor data corresponding to a load exerted on the end effector; receive first pose information corresponding to a pose of the tracked portion; receive second pose information corresponding to a pose of the end effector; determine a deflection compensation of the tracked portion based on the first pose information and the first sensor data; determine a deflection compensation of the end effector based on the second pose information and the second sensor data; and apply the deflection compensations to the robotic arm.
- determining the deflection compensation of the tracked portion includes retrieving the deflection compensation from a lookup table, the lookup table having a plurality of deflection compensations for the tracked portion based on a plurality of parameters
- determining the deflection compensation of the end effector includes retrieving the deflection compensation of the end effector from the lookup table, the lookup table having a plurality of deflection compensations for the end effector based on a plurality of parameters.
- any of the aspects herein, wherein the plurality of parameters include at least one of pose information, load information, gravity information, or any combination thereof.
- simulation comprises a finite element analysis simulation.
- the sensor is integrated with the tracked portion.
- the sensor is at least one of a force sensor, a torque sensor, or a combination of a force and torque sensor, and wherein the load is at least one of a force, a torque, or a combination of a force and a torque.
- a system for determining and applying a deflection compensation comprises a robotic arm; a tracked portion attached to the robotic arm, the tracked portion configured to support and orient an end effector; a sensor configured to sense a load and to yield sensor data; at least one processor; and at least one memory storing instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive first sensor data from the sensor, the first sensor data corresponding to a load exerted on the tracked portion; receive second sensor data from the sensor, the second sensor data corresponding to a load exerted on the end effector; receive third sensor data from the sensor, the third sensor data corresponding to a load exerted on the robotic arm; receive first pose information corresponding to a pose of the tracked portion; receive second pose information corresponding to a pose of the end effector; receive third pose information corresponding to a pose of the robotic arm; determine a deflection compensation of the tracked portion based on the first pose information and the first sensor data; determine a deflection compensation of the end effector based on the second pose information and the second sensor data; determine a deflection compensation of the robotic arm based on the third pose information and the third sensor data; and apply the deflection compensations to the robotic arm.
- a system for determining and applying a deflection compensation comprises a robot having a robotic arm, a tracked portion, a tool changer, and an end effector; one or more sensors configured to sense a load and to yield sensor data; an integrated circuit configured to: receive system sensor data from the one or more sensors, the system sensor data corresponding to a load exerted on the robot; receive system pose information corresponding to a pose of the robot; determine a deflection compensation of the robot based on the system sensor data and the system pose information; and apply the deflection compensation to the robot.
- determining the deflection compensation of the robot includes retrieving the deflection compensation from a lookup table, the lookup table having a plurality of deflection compensations for the robot based on a plurality of parameters.
- each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
- each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo
- the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2) as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo).
- Fig. 1A shows aspects of a system according to at least one embodiment of the present disclosure
- Fig. 1B shows additional aspects of the system according to at least one embodiment of the present disclosure
- Fig. 1C shows aspects of a tracked portion attached to a robotic arm and an end effector according to at least one embodiment of the present disclosure
- Fig. 2A is a flowchart according to at least one embodiment of the present disclosure
- Fig. 2B is a flowchart according to at least one embodiment of the present disclosure.
- Fig. 2C is a flowchart according to at least one embodiment of the present disclosure.
- Fig. 3 is a flowchart according to at least one embodiment of the present disclosure.
- the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions).
- Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
- processors such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry
- proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
- Robotic-assisted surgery may carry risks such as spinal cord injury, temporary pain, or permanent harm to the patient.
- accuracy is crucial in such robotic-assisted surgeries to minimize such risks.
- inaccuracies may result from deflections experienced by components of a robotic surgical system.
- robotic surgical systems may improve and validate accuracy through navigation (tracking the arm) and kinematics (software calculations based on joint sensors).
- deflection compensation may be determined from force and/or torque sensors, Finite Element Analysis (FEA) of components under a load, experimentation of components under load, combinations thereof, etc.
- a force and/or torque sensor's calibration matrix, together with a lookup table gathered from either FEA or physical experimentation, can be used to create deflection data that the surgical system can use to compensate for deflection of multiple components in the robotic surgical system.
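- As a minimal, non-authoritative sketch of the calibration-matrix portion of this idea, assuming a six-channel force/torque sensor and a vendor-supplied 6x6 matrix (the names and placeholder values below are illustrative, not taken from the disclosure):

```python
import numpy as np

# Assumed vendor-supplied 6x6 calibration matrix mapping six raw gauge
# channels to a calibrated wrench [Fx, Fy, Fz, Tx, Ty, Tz] (N and N*m).
# np.eye(6) is a placeholder; a real matrix is sensor-specific.
CALIBRATION_MATRIX = np.eye(6)

def raw_to_wrench(raw_channels):
    """Map six raw sensor channel readings to a calibrated force/torque wrench."""
    return CALIBRATION_MATRIX @ np.asarray(raw_channels, dtype=float)
```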
- Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) improving accuracy of a robotic surgical system, (2) compensating for deflection of one or more components of the robotic surgical system, and/or (3) improving the safety of the patient and the surgical team.
- a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown.
- the system 100 may be used to determine and apply a deflection compensation to the system 100 (which may be, for example, a robotic system) and/or carry out one or more other aspects of one or more of the methods disclosed herein.
- the system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 134.
- Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100.
- the system 100 may not include the imaging device 112, the robot 114, the navigation system 118, one or more components of the computing device 102, the database 130, and/or the cloud 134.
- the computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110.
- Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
- the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
- the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
- the memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
- the memory 106 may store information or data useful for completing, for example, any step of the methods 200A, 200B, 200C, and/or 300 described herein, or of any other methods.
- the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114.
- the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable sensor processing 120 and/or deflection determination 128.
- the sensor processing 120 enables the processor 104 to process sensor data received from, for example, a sensor 142.
- the sensor data may be processed to obtain, for example, a load exerted onto an object such as a robotic arm 116, a tracked portion 136, a tool changer 140, an end effector 146, or any other portion of the robot 114.
- the load may be a force and/or torque exerted onto the object.
- the sensor data may be processed to obtain, for example, a pose of the object such as, for example, the robotic arm 116, the tracked portion 136, the tool changer 140, the end effector 146, or any other portion of the robot 114.
- pose information may be obtained from, for example, the navigation system 118.
- the deflection determination 128 enables the processor 104 to receive and process the load and the pose information to determine a deflection compensation.
- the processor 104 may also receive gravity information as input in addition to the load and the pose information.
- the deflection determination 128 enables the processor 104 to retrieve the deflection compensation from a lookup table having a plurality of deflection compensations for the tracked portion based on a plurality of parameters such as the pose information, the load, and/or the gravity information.
- Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
- the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various methods and features described herein.
- various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models.
- the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
- the memory 106 may also store gravity information 122 (e.g., information regarding deflection due to gravity) and/or deflection information 124 (e.g., information regarding deflection due to factors other than gravity, such as an external force exerted on the robot 114 or any part of the robot 114, such as the robotic arm 116, the tracked portion 136, the tool changer 140, and/or the end effector 146).
- the gravity information 122 corresponds to an amount of deflection for the object such as the robotic arm 116, the tracked portion 136, the tool changer 140, the end effector 146, or any other portion of the robot 114 due to gravity exerted on the object.
- the deflection information 124 corresponds to an amount of deflection or deflection compensation for the object such as the robotic arm 116, the tracked portion 136, the tool changer 140, the end effector 146, or any other portion of the robot 114 based on a pose of the object and an external force exerted on the object.
- the deflection compensation may be determined from physical experimentation of an external force exerted onto, for example, the robotic arm 116, the tracked portion 136, the tool changer 140, the end effector 146, or any other portion of the robot 114.
- the deflection compensation may be determined from simulation of an external force exerted on, for example, the robotic arm 116, the tracked portion 136, the tool changer 140, the end effector 146, or any other portion of the robot 114.
- the gravity information 122 and/or the deflection information 124 may be stored in a table such as, for example, a lookup table. It will be appreciated that the gravity information 122 and/or the deflection information 124 may be stored in any form and/or format.
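- One plausible in-memory layout for the gravity information 122 and the deflection information 124, assuming both are kept as lookup tables keyed by quantized pose and load parameters (the schema, units, and values below are illustrative assumptions, not the disclosed format):

```python
import numpy as np

# Deflection information 124: quantized (pose, load) condition -> deflection,
# with each entry gathered from FEA or physical experimentation.
deflection_info = {
    # (joint angles in degrees, external load in N) -> deflection vector in mm
    ((0, 0, 0), (0, 0, -10)): np.array([0.00, 0.00, -0.02]),
    ((0, 45, 0), (0, 0, -10)): np.array([0.01, 0.00, -0.05]),
}

# Gravity information 122: pose alone -> deflection due to gravity, since the
# gravity load on a component follows from its pose.
gravity_info = {
    (0, 0, 0): np.array([0.00, 0.00, -0.01]),
    (0, 45, 0): np.array([0.00, 0.00, -0.03]),
}
```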
- the computing device 102 may also comprise a communication interface 108.
- the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100).
- the communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
- the communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
- the computing device 102 may also comprise one or more user interfaces 110.
- the user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
- the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
- the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
- the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
- the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
- the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
- image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
- the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
- the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
- a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
- the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
- the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
- the imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated from one another.
- the imaging device 112 may comprise more than one imaging device 112.
- a first imaging device may provide first image data and/or a first image
- a second imaging device may provide second image data and/or a second image.
- the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
- the imaging device 112 may be operable to generate a stream of image data.
- the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
- image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
- the robot 114 may be mechanically coupled with (e.g., affixed to, attached to, mounted to, etc.) a patient bed or table.
- the robot 114 may be disposed on a robot cart 144.
- the robot cart 144 may be or comprise a mobile platform that enables the robot 114 and/or components thereof to be positioned relative to the patient and/or the bed or table on which the patient is positioned.
- the robot cart 144 may comprise wheels that enable the robot cart 144 to roll or move relative to the patient.
- the robot cart 144 may be detachable from the wheels, or the wheels may be lockable such that, once the robot cart 144 is positioned in a desired location relative to the patient, the robot cart 144 will remain fixed in the desired location.
- the robot cart 144 may have a mechanism that enables the robot cart 144 to remain fixed relative to the patient. The mechanism may better ensure that the robot 114 and/or any other components on the robot cart 144 do not move relative to the patient due to the mobility of the robot cart 144 once the robot cart 144 has been positioned in the desired location.
- the robot 114 may be any surgical robot or surgical robotic system.
- the robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system.
- the robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time.
- the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
- the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
- the robot 114 may comprise one or more robotic arms 116.
- the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112. In embodiments where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver), one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
- the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
- the robotic arm(s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
- reference markers (e.g., navigation markers) may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
- the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
- the navigation markers 148A-148F may be or comprise one or more active markers, one or more passive markers, or a combination of active and passive markers.
- the navigation markers 148A-148F may comprise a first navigation marker 148A, a second navigation marker 148B, a third navigation marker 148C, a fourth navigation marker 148D, a fifth navigation marker 148E, and a sixth navigation marker 148F.
- the navigation markers 148A-148F may be, for example, LEDs, infrared LEDs, reflective markers, and/or the like.
- the navigation system 118 may be configured to obtain pose information describing a pose of the navigation markers 148A-148F, which may be used to determine a correlating pose of the tracked portion 136 and, based on the known connection of the tracked portion 136 to the robotic arm 116, the pose of the robotic arm 116 (e.g., using transformation 124 and registration 128).
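- A short sketch of that correlation, assuming poses are exchanged as 4x4 homogeneous transforms and that the transform from the marker frame to the arm flange is fixed and known from the mechanical design (the function and frame names are hypothetical):

```python
import numpy as np

def arm_pose_from_markers(T_cam_markers, T_markers_arm):
    """Chain rigid transforms (4x4 homogeneous matrices) to recover the
    robotic-arm flange pose in the navigation camera frame."""
    return T_cam_markers @ T_markers_arm

# Example usage with identity transforms standing in for real tracking data.
T_cam_arm = arm_pose_from_markers(np.eye(4), np.eye(4))
```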
- the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
- the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
- the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers (e.g., navigation markers 148A-148F), navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
- the one or more cameras may be optical cameras, infrared cameras, or other cameras.
- the navigation system 118 may comprise one or more electromagnetic sensors.
- the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of the tracked portion 136 or other navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing).
- the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
- the system 100 can operate without the use of the navigation system 118.
- the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
- the tracked portion 136 may be or comprise a device that enables the navigation system 118 to track the robotic arm 116.
- the tracked portion 136 may have a proximal end that can be coupled to the distal end of the robotic arm 116, and a distal end that can be coupled to a proximal end of a tool changer 140.
- the tracked portion 136 comprises navigation markers 148A-148F.
- the tool changer 140 may include a proximal end that is connectable to a distal end of the tracked portion 136, and a distal end that can be connected to the end effector 146 that can be used to carry out one or more surgical tasks.
- the tool changer 140 may comprise a surgical tool.
- the end effector 146 may, for example, support a surgical tool that may be configured to drill, burr, mill, cut, saw, ream, tap, etc. into anatomical tissues such as patient anatomy (e.g., soft tissues, bone, etc.).
- the system 100 may comprise multiple surgical tools, with each surgical tool performing a different surgical task (e.g., a surgical drill for drilling, a surgical mill for milling, a curette for removing anatomical tissue, an osteotome for cutting bone, etc.).
- the surgical tool may provide an adapter interface to which different working ends can be attached to perform multiple different types of surgical maneuvers (e.g., the surgical tool may be able to receive one or more different tool bits, such that the surgical tool can drill, mill, cut, saw, ream, tap, etc. depending on the tool bit coupled with the surgical tool).
- the surgical tool may be operated autonomously or semi-autonomously.
- the navigation system 118 may track the pose (e.g., position and orientation) of and/or navigate the surgical tool. For example, the navigation system 118 may identify the navigation markers 148A-148F on the tracked portion 136 and, based on the identification and the coupling of the tracked portion 136 with the tool changer 140 and the coupling of the tool changer 140 with the end effector 146, use transformation 124 and registration 128 to determine the pose of the surgical tool.
- the sensor 142 may be any kind of sensor 142 for measuring a value such as, for example, force and/or torque exerted on the tool changer 140, the tracked portion 136, and/or the robotic arm 116.
- the sensor 142 may include one or more or any combination of components that are electrical, mechanical, electro-mechanical, magnetic, electromagnetic, or the like.
- the sensor 142 may include, but is not limited to, one or more of a torque sensor, a force sensor, a linear encoder, a strain gauge, a rotary encoder, a capacitor, and/or an accelerometer.
- the sensor 142 may include a memory for storing sensor data.
- the sensor 142 may output signals (e.g., sensor data) to one or more sources (e.g., the computing device 102, the navigation system 118, and/or the robot 114).
- the sensor 142 may be positioned adjacent to or integrated with another component of the system 100 such as, but not limited to, the robotic arm 116, the tracked portion 136, the tool changer 140, and/or the end effector 146. In some embodiments, the sensor 142 is positioned as a standalone component.
- the sensor 142 may include a plurality of sensors and each sensor may be positioned at the same location or a different location as any other sensor.
- the sensor(s) 142 can be positioned at or on any component of the system 100 or environment (e.g., on any portion of the navigation system 118, the robot 114, the robotic arm 116, the tracked portion 136, the tool changer 140, the end effector 146, and/or any other component at the surgical site).
- the sensor 142 may send the data to the computing device 102 when the sensor 142 detects a load such as, for example, force and/or torque exerted on the end effector 146, the tool changer 140, the tracked portion 136, and/or the robotic arm 116. Further, in some embodiments, the sensor 142 may send data to the computing device 102 to display on the user interface 110 or otherwise notify the surgeon or operator of a change in the sensed characteristic (e.g., the detected load). In other embodiments, the sensor 142 may alert the surgeon or operator of the change by an alert such as, but not limited to, a sound or a light display.
- the sensor 142 may contribute to inaccuracy of the robot 114 and/or the robotic arm 116.
- the deflection due to the sensor 142 may be compensated for by using a calibration matrix of the sensor.
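- One way such a correction might look, treating the sensor's calibration data as a compliance model so that the measured wrench predicts the sensor's own displacement (the compliance values below are placeholders, and this scheme is an assumption rather than the disclosed implementation):

```python
import numpy as np

# Assumed 6x6 compliance matrix for the sensor body: displacement (mm) and
# rotation (deg) per unit of applied wrench, derived in practice from the
# sensor's calibration/stiffness data.
SENSOR_COMPLIANCE = np.diag([1e-4, 1e-4, 2e-4, 1e-3, 1e-3, 5e-4])

def sensor_deflection(wrench):
    """Estimate the sensor's own deflection under the measured wrench so it
    can be subtracted from the tracked pose."""
    return SENSOR_COMPLIANCE @ np.asarray(wrench, dtype=float)
```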
- the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system).
- the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient's anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; information about the tracked portion 136 (e.g., the pose of the navigation markers 148A-148F on the tracked portion 136, types of connectors and other units that can attach to the tracked portion 136, etc.); the gravity information 122, the deflection information 124, and/or any other useful information.
- the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
- the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
- the cloud 134 may be or represent the Internet or any other wide area network.
- the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
- the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
- the system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the methods 200A, 200B, 200C, 300 described herein.
- the system 100 or similar systems may also be used for other purposes.
- Fig. 2A depicts a method 200A that may be used, for example, for determining and applying a deflection compensation to a robot such as the robot 114 or a robotic arm such as the robotic arm 116 of a robotic system.
- the method 200A (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
- the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
- the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
- a processor other than any processor described herein may also be used to execute the method 200A.
- the at least one processor may perform the method 200A by executing elements stored in a memory such as the memory 106.
- the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 200A.
- One or more portions of a method 200A may be performed by the processor executing any of the contents of memory, such as sensor processing 120 and/or deflection determination 128.
- the method 200A comprises receiving first sensor data (step 204 A).
- the first sensor data may correspond to a load such as, for example, a force and/or torque exerted onto a tracked portion such as the tracked portion 136.
- the first sensor data may be received from, for example, one or more sensors such as the one or more sensors 142.
- the sensor data may be processed by a processor such as the processor 104 using a sensor processing such as the sensor processing 120.
- the sensor data may be processed to obtain, for example, a load exerted onto an object such as a robotic arm such as the robotic arm 116, a tracked portion such as the tracked portion 136, an end effector such as the end effector 146, a tool changer such as the tool changer 140, or any other portion of a robot such as the robot 114.
- the load may be a force and/or torque exerted onto the object.
- the sensor data may be processed to obtain, for example, a pose of the object such as, for example, the robotic arm, the tracked portion, the end effector, or any other portion of the robot.
- pose information may be obtained from, for example, a navigation system such as the navigation system 118.
- the sensor may be any kind of sensor for measuring a value such as, for example, force and/or torque exerted on the end effector, the tracked portion, and/or the robotic arm.
- the sensor may include one or more or any combination of components that are electrical, mechanical, electro-mechanical, magnetic, electromagnetic, or the like.
- the sensor may include, but is not limited to, one or more of a torque sensor, a force sensor, a linear encoder, a strain gauge, a rotary encoder, a capacitor, and/or an accelerometer.
- the sensor may include a memory for storing sensor data.
- the sensor may output signals (e.g., sensor data) to one or more sources (e.g., a computing device such as the computing device 102, the navigation system, and/or the robot).
- the sensor may be positioned adjacent to or integrated with another component of a system such as the system 100 such as, but not limited to, the robotic arm, the tracked portion, the robot, the end effector and/or the tool changer.
- the sensor is positioned as a standalone component.
- the sensor may include a plurality of sensors and each sensor may be positioned at the same location or a different location as any other sensor. It will be appreciated that in some embodiments the sensor(s) can be positioned at or on any component of the system or environment (e.g., on any portion of the navigation system, the robot, the robotic arm, the tracked portion, the end effector, the tool changer, and/or any other component at the surgical site).
- the method 200A also comprises receiving first pose information (step 208 A).
- the first pose information may correspond to a pose of the tracked portion.
- the first pose information may be obtained from, for example, the navigation system.
- the navigation system may include one or more cameras or other sensor(s) for tracking one or more reference markers (e.g., navigation markers), navigated trackers, or other objects within the operating room or other room in which some or all of the system is located.
- the one or more cameras may be optical cameras, infrared cameras, or other cameras.
- the navigation system may comprise one or more electromagnetic sensors.
- the navigation system may be used to track a pose of the robot, the robotic arm, and/or one or more surgical tools (or, more particularly, to track a pose of the tracked portion or other navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing).
- the first pose information may alternatively or additionally be obtained from one or more sensors that enable the processor (or a processor of the robot) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
- the one or more sensors may be integrated with or otherwise positioned on the robotic arm.
- the one or more sensors may be, for example, encoders of the robotic arm from which its location in space can be determined relative to the robotic arm’s base.
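- A compact sketch of recovering such a location from joint encoders via forward kinematics, reduced to a planar chain for illustration (the link lengths and kinematic model are assumptions, not the geometry of the robot 114):

```python
import numpy as np

def planar_forward_kinematics(joint_angles_rad, link_lengths_m):
    """Accumulate joint rotations and link offsets to locate the arm tip
    relative to the base of a planar serial chain."""
    x, y, theta = 0.0, 0.0, 0.0
    for q, length in zip(joint_angles_rad, link_lengths_m):
        theta += q
        x += length * np.cos(theta)
        y += length * np.sin(theta)
    return x, y, theta  # tip position (m) and orientation (rad) in the base frame

# Example: two links of 0.4 m and 0.3 m with both joints at 30 degrees.
tip = planar_forward_kinematics([np.pi / 6, np.pi / 6], [0.4, 0.3])
```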
- the method 200A also comprises receiving gravity information (step 212A).
- the gravity information may be the same as or similar to the gravity information 122 and may be stored in and/or received from, for example, memory such as the memory 106, a database such as the database 130, and/or a cloud such as the cloud 134.
- the gravity information corresponds to an amount of deflection for the object such as the robotic arm, the tracked portion, the end effector, the tool changer, or any other portion of the robot due to gravity exerted on the object.
- the method 200A also comprises determining a deflection compensation of the tracked portion (step 216A). Determining the deflection compensation may be based on a lookup table and/or a calibration matrix.
- the deflection compensation may be determined using a deflection determination such as the deflection determination 128.
- the deflection determination enables the processor to receive and process the load (received from, for example, the step 204A above) and the pose information (received from, for example, the step 208A above) to determine the deflection compensation.
- the deflection determination enables the processor to retrieve the deflection compensation from a lookup table of deflection information having a plurality of deflection compensations for the tracked portion based on a plurality of parameters such as the pose information, the load, and/or the gravity information.
- the processor may also receive the gravity information (received from, for example, the step 212A above) as input in addition to the load and the pose information.
- the deflection information corresponds to an amount of deflection or deflection compensation for the object such as the robotic arm, the tracked portion, the tool changer, the end effector, or any other portion of the robot based on a pose of the object and an external force exerted on the object.
- the deflection compensation may equate to, for example, a vector having a magnitude and direction of deflection that the object would experience based on the pose of the object and the external force exerted on the object (and the gravity exerted on the object).
- the deflection compensation may equate to the vector with a magnitude and a direction that is opposite of deflection that the object would experience.
- the deflection compensation may be determined from physical experimentation of an external force exerted onto, for example, the robotic arm, the tracked portion, the tool changer, the end effector, or any other portion of the robot. Alternatively or additionally the deflection compensation may be determined from simulation of an external force exerted on, for example, the robotic arm, the tracked portion, the tool changer, the end effector, or any other portion of the robot.
- the deflection information may be stored and retrieved from, for example, a table such as a lookup table. It will be appreciated that the deflection information may be stored in any form and/or format in the memory, the database, and/or the cloud.
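- A sketch of this determination under the lookup-table reading above: find the tabulated deflection whose (pose, load) parameters are nearest the measured ones, add the gravity term, and negate the result to obtain the compensation vector. The nearest-neighbor scheme and table layout are assumptions (an implementation might interpolate instead), and the table format follows the illustrative schema sketched earlier:

```python
import numpy as np

def determine_deflection_compensation(pose, load, deflection_info, gravity_info):
    """Return a compensation vector equal in magnitude and opposite in
    direction to the expected deflection for the nearest tabulated condition."""
    def distance(key):
        table_pose, table_load = key
        return (np.linalg.norm(np.subtract(pose, table_pose))
                + np.linalg.norm(np.subtract(load, table_load)))

    nearest = min(deflection_info, key=distance)  # nearest (pose, load) key
    deflection = deflection_info[nearest] + gravity_info.get(nearest[0], 0.0)
    return -deflection  # oppose the deflection the object would experience
```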
- the method 200A also comprises applying the deflection compensation (step 220A).
- the deflection compensation may be applied to, for example, the robotic arm and/or the robot. More specifically, the deflection compensation may be applied to a trajectory of the robotic arm so as to automatically adjust the trajectory of the robotic arm to account for the deflection compensation. In other words, the trajectory of the robotic arm may be adjusted such that the trajectory matches a planned or target trajectory. Additionally or alternatively, the deflection compensation may be applied to a 3D location of a tool center point (TCP) of the robot. More specifically, the TCP has x-, y-, and z-coordinates in space to which the deflection compensation may be applied. In embodiments where the deflection compensation is applied to both the trajectory and the 3D location, the deflection compensation may be applied in degrees to the trajectory and in distances (e.g., millimeters) to the 3D location.
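- A sketch of this application step, assuming the compensation is split into an angular part (degrees, applied to the trajectory) and a translational part (millimeters, applied to the TCP coordinates); the representations below are illustrative:

```python
import numpy as np

def apply_deflection_compensation(trajectory_deg, tcp_xyz_mm, compensation):
    """Apply the angular part of a compensation to the planned trajectory
    (degrees) and the translational part to the tool center point (mm)."""
    angular_part, translational_part = compensation  # assumed split
    adjusted_trajectory = np.asarray(trajectory_deg, float) + np.asarray(angular_part, float)
    adjusted_tcp = np.asarray(tcp_xyz_mm, float) + np.asarray(translational_part, float)
    return adjusted_trajectory, adjusted_tcp
```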
- the present disclosure encompasses embodiments of the method 200A that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
- Fig. 2B depicts a method 200B that may be used, for example, for determining and applying a deflection compensation to a robot such as the robot 114 or a robotic arm such as the robotic arm 116 of a robotic system.
- the method 200B (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
- the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
- the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
- a processor other than any processor described herein may also be used to execute the method 200B.
- the at least one processor may perform the method 200B by executing elements stored in a memory such as the memory 106.
- the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 200B.
- One or more portions of the method 200B may be performed by the processor executing any of the contents of memory, such as sensor processing 120 and/or deflection determination 128.
- the method 200B comprises receiving second sensor data (step 204B).
- the step 204B may be the same as or similar to the step 204A of the method 200A described above, except that the second sensor data may correspond to a load such as, for example, a force and/or torque exerted onto an end effector such as the end effector 146.
- the method 200B also comprises receiving second pose information (step 208B).
- the step 208B may be the same as or similar to the step 208A of the method 200A described above, except that the second pose information may correspond to a pose of the end effector.
- the method 200B also comprises receiving gravity information (step 212B).
- the step 212B may be the same as or similar to the step 212A of the method 200A described above.
- the method 200B also comprises determining a deflection compensation of the end effector (step 216B).
- the step 216B may be the same as or similar to the step 216A of the method 200A described above, except that the deflection compensation is determined for the end effector.
- the method 200B also comprises applying the deflection compensation (step 220B).
- the step 220B may be the same as or similar to the step 220A of the method 200A described above.
- the present disclosure encompasses embodiments of the method 200B that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
- Fig. 2C depicts a method 200C that may be used, for example, for determining and applying a deflection compensation to a robot such as the robot 114 or a robotic arm such as the robotic arm 116 of a robotic system.
- the method 200C (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
- the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
- the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
- a processor other than any processor described herein may also be used to execute the method 200C.
- the at least one processor may perform the method 200C by executing elements stored in a memory such as the memory 106.
- the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 200C.
- One or more portions of the method 200C may be performed by the processor executing any of the contents of memory, such as sensor processing 120 and/or deflection determination 128.
- the method 200C comprises receiving third sensor data (step 204C).
- the step 204C may be the same as or similar to the step 204A of the method 200A described above, except that the third sensor data may correspond to a load such as, for example, a force and/or torque exerted onto a robotic arm such as the robotic arm 116.
- the method 200C also comprises receiving third pose information (step 208C).
- the step 208C may be the same as or similar to the step 208A of the method 200A described above, except that the third pose information may correspond to a pose of the robotic arm.
- the method 200C also comprises receiving gravity information (step 212C).
- the step 212C may be the same as or similar to the step 212A of the method 200A described above.
- the method 200C also comprises determining a deflection compensation of the robotic arm (step 216C).
- the step 216C may be the same as or similar to the step 216A of the method 200A described above, except that the deflection compensation is determined for the robotic arm.
- the method 200C also comprises applying the deflection compensation (step 220C).
- the step 220C may be the same as or similar to the step 220A of the method 200A described above.
- the present disclosure encompasses embodiments of the method 200C that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
- any steps of the methods 200A, 200B, and/or 200C may be combined or repeated.
- the methods 200A and 200B may be carried out simultaneously or sequentially to obtain deflection compensations for the end effector and the tracked portion.
- the methods 200A and 200C may be carried out simultaneously or sequentially to obtain deflection compensations for the tracked portion and the robotic arm.
- deflection compensations may be obtained for any portion of a robot such as the robot 114 or any other component of a system such as the system 100.
- the sensor used to detect the load may contribute to inaccuracy of the robot and/or the robotic arm.
- the deflection due to the sensor may be compensated for by using a calibration matrix of the sensor.
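- As a sketch of how such a calibration matrix might be applied, assuming a six-channel force/torque sensor (the identity matrix below is only a placeholder for a real calibration matrix obtained from the sensor's calibration procedure):

```python
import numpy as np

def calibrated_load(raw_reading: np.ndarray,
                    calibration_matrix: np.ndarray) -> np.ndarray:
    """Map a raw 6-channel reading through the sensor's calibration
    matrix to a corrected wrench (Fx, Fy, Fz, Tx, Ty, Tz)."""
    return calibration_matrix @ raw_reading

# Placeholder 6x6 calibration matrix and a hypothetical raw reading
C = np.eye(6)
wrench = calibrated_load(np.array([1.2, 0.0, -3.4, 0.01, 0.0, 0.02]), C)
```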
- the deflection compensation beneficially accounts for deflections due to gravity and/or external forces exerted onto the end effector, the tool changer, the tracked portion, the robotic arm, and/or any portion of the robot. Such compensation of deflection may help improve or increase accuracy of the robot and the robotic arm.
- the deflection compensation may be used to adjust a trajectory of the robotic arm such that the robotic arm follows a trajectory at or near the target or planned trajectory.
- the deflection compensation may be applied to a 3D location of a tool center point (TCP) of the robot. More specifically, the TCP has x-, y-, and z-coordinates in space to which the deflection compensation may be applied. In embodiments where the deflection compensation is applied to both the trajectory and the 3D location, the deflection compensation may be applied in degrees to the trajectory and in distances (e.g., millimeters) to the 3D location.
- Such improved accuracy may increase the overall performance of the robot and/or the system, thereby improving patient safety and user satisfaction with the robot and/or the system.
- the deflection compensation can be determined and applied to the robotic system as a whole.
- Fig. 3 depicts a method 300 that may be used, for example, for determining and applying a deflection compensation to a robot such as the robot 114 of a system such as the system 100.
- the method 300 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
- the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
- the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
- a processor other than any processor described herein may also be used to execute the method 300.
- the at least one processor may perform the method 300 by executing elements stored in a memory such as the memory 106.
- the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 300.
- One or more portions of the method 300 may be performed by the processor executing any of the contents of memory, such as sensor processing 120 and/or deflection determination 128.
- the method 300 comprises receiving system sensor data (step 304).
- the step 304 may be the same as or similar to the step 204A of the method 200A described above, except that the system sensor data may be received from any sensor (which may be the same as or similar to the sensor 142) of a system such as the system 100 (which may be, for example, a robotic system for a robot such as the robot 114).
- the sensor data may include one or more sets of sensor data obtained from one or more sensors of the system.
- the method 300 also comprises receiving system pose information (step 308).
- the step 308 may be the same as or similar to the step 208A of the method 200A described above, except that the system pose information may correspond to a pose of the system.
- the method 300 also comprises receiving gravity information (step 312).
- the step 312 may be the same as or similar to the step 212A of the method 200A described above.
- the gravity information may be gravity information for the robotic system as a whole.
- the method 300 also comprises determining a deflection compensation of the robotic system (step 316).
- the step 316 may be the same as or similar to the step 216A of the method 200A described above, except that the deflection compensation is determined for the robotic system as a whole. More specifically, determining the deflection compensation may be based on a lookup table and/or a calibration matrix in which deflection compensations are stored. The deflection compensations may be based on, for example, experimentation, finite element analysis (FEA), and/or calculations for the robotic system. In other instances, the deflection compensation may be determined using a deflection determination such as the deflection determination 128.
- the deflection determination enables the processor to receive and process the sensor data (received from, for example, the step 304 above) and the pose information (received from, for example, the step 308 above) to determine the deflection compensation.
- the deflection determination enables the processor to retrieve the deflection compensation from a lookup table of deflection information having a plurality of deflection compensations for the robotic system based on a plurality of parameters such as the pose information, the sensor data, and/or the gravity information.
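- Where a queried load or pose falls between stored entries, the retrieval may interpolate rather than select the nearest entry; a one-dimensional Python sketch with hypothetical values follows (a real table would be indexed over pose, load, and gravity parameters together):

```python
import numpy as np

# Hypothetical 1-D slice of the table: load (N) -> compensation (mm)
# for one fixed pose of the robotic system.
LOADS_N = np.array([0.0, 10.0, 20.0, 40.0])
COMP_MM = np.array([0.0, 0.12, 0.25, 0.53])

def interpolated_compensation(load_n: float) -> float:
    """Linearly interpolate between stored compensations so loads
    between table entries still receive a smooth correction."""
    return float(np.interp(load_n, LOADS_N, COMP_MM))
```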
- the method 300 also comprises applying the deflection compensation (step 320).
- the step 320 may be the same as or similar to the step 220A of the method 200A described above, except that the deflection compensation is applied to the robotic system.
- the present disclosure encompasses embodiments of the method 300 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
- the present disclosure encompasses methods with fewer than all of the steps identified in Figs. 2A, 2B, 2C, and 3 (and the corresponding description of the methods 200A, 200B, 200C, and 300), as well as methods that include additional steps beyond those identified in Figs. 2A, 2B, 2C, and 3 (and the corresponding description of the methods 200A, 200B, 200C, and 300).
- the present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.
- Example 1 A system for determining and applying a deflection compensation, the system comprising: a robotic arm; a tracked portion attached to the robotic arm, the tracked portion configured to support and orient an end effector; a sensor configured to sense a load and to yield sensor data; at least one processor; and at least one memory storing instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive sensor data from the sensor, the sensor data corresponding to a load exerted on the tracked portion; receive pose information corresponding to a pose of the tracked portion; determine a deflection compensation of the tracked portion based on the pose information and the load exerted on the tracked portion; and apply the deflection compensation to the robotic arm.
- Example 2 The system of Example 1, wherein the sensor data is first sensor data and the pose information is first pose information, and wherein the memory stores additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive second sensor data from the sensor, the second sensor data corresponding to a load exerted on the end effector; receive second pose information corresponding to a pose of the end effector; and determine a deflection compensation of the end effector based on the second sensor data and the second pose information.
- Example 3 The system of Example 2, wherein the memory stores additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive third sensor data from the sensor, the third sensor data corresponding to a load exerted on the robotic arm; receive third pose information corresponding to a pose of the robotic arm; and determine a deflection compensation of the robotic arm based on the third sensor data and the third pose information.
- Example 4 The system of Example 3, wherein determining the deflection compensation of the tracked portion includes retrieving the deflection compensation from a lookup table, the lookup table having a plurality of deflection compensations for the tracked portion based on a plurality of parameters, wherein determining the deflection compensation of the end effector includes retrieving the deflection compensation of the end effector from the lookup table, the lookup table having a plurality of deflection compensations for the end effector based on a plurality of parameters, and wherein determining the deflection compensation of the robotic arm includes retrieving the deflection compensation of the robotic arm from the lookup table, the lookup table having a plurality of deflection compensations for the robotic arm based on a plurality of parameters.
- Example 5 The system of any of Examples 1-4, wherein the sensor is integrated with the tracked portion.
- Example 6 The system of any of Examples 1-5, wherein the sensor is at least one of a force sensor, a torque sensor, or a combination of a force and torque sensor, and wherein the load is at least one of a force, a torque, or a combination of a force and a torque.
- Example 7 The system of any of Examples 1-6, wherein determining the deflection compensation of the tracked portion includes retrieving the deflection compensation from a lookup table, the lookup table having a plurality of deflection compensations based on a plurality of parameters.
- Example 8 The system of Example 7, wherein the plurality of parameters include at least one of pose information, load information, gravity information, or any combination thereof.
- Example 9 The system of Example 7, wherein the plurality of deflection compensations is obtained from at least one of physical experimentation or simulation.
- Example 10 The system of Example 9, wherein the simulation comprises a finite element analysis simulation.
- Example 11 The system of any of Examples 1-10, wherein the memory stores additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive gravity information corresponding to gravity force exerted on the tracked portion, wherein determining the deflection compensation is also based on the gravity information.
- Example 12 The system of any of Examples 1-11, wherein applying the deflection compensation to the robotic arm includes automatically adjusting one or more trajectories of the robotic arm based on the deflection compensation.
- Example 13 A system for determining and applying a deflection compensation, the system comprising: a robotic arm; a tracked portion attached to the robotic arm, the tracked portion configured to support and orient an end effector; a sensor configured to sense a load and to yield sensor data; at least one processor; and at least one memory storing instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive first sensor data from the sensor, the first sensor data corresponding to a load exerted on the tracked portion; receive second sensor data from the sensor, the second sensor data corresponding to a load exerted on the end effector; receive first pose information corresponding to a pose of the tracked portion; receive second pose information corresponding to a pose of the end effector; determine a deflection compensation of the tracked portion based on the first pose information and the first sensor data and of the end effector based on the second pose information and the second sensor data; and apply the deflection compensation to the robotic arm.
- Example 14 The system of Example 13, wherein determining the deflection compensation of the tracked portion includes retrieving the deflection compensation from a lookup table, the lookup table having a plurality of deflection compensations for the tracked portion based on a plurality of parameters, wherein determining the deflection compensation of the end effector includes retrieving the deflection compensation of the end effector from the lookup table, the lookup table having a plurality of deflection compensations for the end effector based on a plurality of parameters.
- Example 15 The system of Example 14, wherein the plurality of parameters include at least one of pose information, load information, gravity information, or any combination thereof.
- Example 16 The system of Example 14, wherein the plurality of deflection compensations is obtained from at least one of physical experimentation or simulation.
- Example 17 The system of Example 16, wherein the simulation comprises a finite element analysis simulation.
- Example 18 The system of any of Examples 13-17, wherein the sensor is integrated with the tracked portion.
- Example 19 The system of any of Examples 13-18, wherein the sensor is at least one of a force sensor, a torque sensor, or a combination of a force and torque sensor, and wherein the load is at least one of a force, a torque, or a combination of a force and a torque.
- Example 20 A system for determining and applying a deflection compensation, the system comprising: a robotic arm; a tracked portion attached to the robotic arm, the tracked portion configured to support and orient an end effector; a sensor configured to sense a load and to yield sensor data; at least one processor; and at least one memory storing instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive first sensor data from the sensor, the first sensor data corresponding to a load exerted on the tracked portion; receive second sensor data from the sensor, the second sensor data corresponding to a load exerted on the end effector; receive third sensor data from the sensor, the third sensor data corresponding to a load exerted on the robotic arm; receive first pose information corresponding to a pose of the tracked portion; receive second pose information corresponding to a pose of the end effector; receive third pose information corresponding to a pose of the robotic arm; determine a deflection compensation of the tracked portion based on the first pose information and the first sensor data, of the end effector based on the second pose information and the second sensor data, and of the robotic arm based on the third pose information and the third sensor data; and apply the deflection compensation to the robotic arm.
- Example 21 A system for determining and applying a deflection compensation, the system comprising: a robot having a robotic arm, a tracked portion, a tool changer, and an end effector; one or more sensors configured to sense a load and to yield sensor data; an integrated circuit configured to: receive system sensor data from the one or more sensors, the system sensor data corresponding to a load exerted on the robot; receive system pose information corresponding to a pose of the robot; determine a deflection compensation of the robot based on the system sensor data and the system pose information; and apply the deflection compensation to the robot.
- Example 22 The system of Example 21, wherein determining the deflection compensation of the robot includes retrieving the deflection compensation from a lookup table, the lookup table having a plurality of deflection compensations for the robot based on a plurality of parameters.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Robotics (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Mechanical Engineering (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Manipulator (AREA)
Abstract
Systems and methods for determining and applying a deflection compensation are provided. Sensor data may be received from a sensor and may correspond to a load exerted on a robotic system or on one or more components of the robotic system. Pose information corresponding to a pose of the robotic system or of the one or more components may be received. A deflection compensation may be determined based on the pose information and the load, and the deflection compensation may be applied to the robotic system or to the one or more components of the robotic system.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463625620P | 2024-01-26 | 2024-01-26 | |
| US63/625,620 | 2024-01-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025158442A1 (fr) | 2025-07-31 |
Family
ID=94824019
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IL2025/050086 (WO2025158442A1, pending) | Systèmes et procédés de détermination et d'application d'une compensation de déviation à des systèmes robotiques | 2024-01-26 | 2025-01-26 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025158442A1 (fr) |
- 2025-01-26: WO PCT/IL2025/050086, WO2025158442A1 (fr), active, pending
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140039517A1 (en) * | 2012-08-03 | 2014-02-06 | Stryker Corporation | Navigation System for use with a Surgical Manipulator Operable in Manual or Semi-Autonomous Modes |
| US20230248454A1 (en) * | 2017-11-10 | 2023-08-10 | Intuitive Surgical Operations, Inc. | Systems and methods for controlling a robotic manipulator or associated tool |
| CN110152211A (zh) * | 2019-06-12 | 2019-08-23 | 兰州理工大学 | 一种患者承载医用机械臂误差补偿系统及方法 |
| US20230133689A1 (en) * | 2021-11-01 | 2023-05-04 | Mazor Robotics Ltd. | Arm movement safety layer |
Non-Patent Citations (1)
| Title |
|---|
| ROUVINEN A ET AL: "DEFLECTION COMPENSATION OF A FLEXIBLE HYDRAULIC MANUPULATOR UTILIZING NEURAL NETWORKS", MECHATRONICS, PERGAMON PRESS, OXFORD, GB, vol. 7, no. 4, 1 June 1997 (1997-06-01), pages 355 - 368, XP000657016, ISSN: 0957-4158, DOI: 10.1016/S0957-4158(97)00009-3 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12042171B2 (en) | Systems and methods for surgical port positioning | |
| US12201377B2 (en) | Arm movement safety layer | |
| WO2023214398A1 (fr) | Navigation de bras robotique à l'aide d'un support osseux virtuel | |
| EP4518785A1 (fr) | Robot chirurgical comprenant un montage flottant de patient | |
| US20250318886A1 (en) | Automatic robotic procedure for skin cutting, tissue pathway, and dilation creation | |
| WO2023148715A1 (fr) | Création de voie tissulaire à l'aide de capteurs ultrasonores | |
| WO2023062624A1 (fr) | Systèmes pour définir une géométrie d'objet à l'aide de bras robotiques | |
| US20240382265A1 (en) | Hybrid localization for minimally invasive surgery and cervical spinal referencing, and methods for using the same | |
| US20240225758A1 (en) | Multi-arm surgical robotic platform | |
| WO2025158442A1 (fr) | Systèmes et procédés de détermination et d'application d'une compensation de déviation à des systèmes robotiques | |
| WO2023148718A1 (fr) | Suivi segmentaire intégré de robot | |
| US20240383152A1 (en) | Multi-axis force transducer feedback from robotic end effector adapter | |
| US20230404692A1 (en) | Cost effective robotic system architecture | |
| WO2024103286A1 (fr) | Bras de type branchez-et-utilisez pour robotique rachidienne | |
| US20230355325A1 (en) | Replaceable arm guide and end effector for surgical systems | |
| US20240119696A1 (en) | Systems, devices, and methods for identifying and locating a region of interest | |
| WO2024229651A1 (fr) | Positionnement intelligent d'un chariot de bras de robot | |
| WO2024236477A1 (fr) | Rétroaction de transducteur de force à axes multiples provenant d'un adaptateur d'effecteur terminal robotique | |
| WO2023147702A1 (fr) | Boîtier pour un couvercle stérile et filtre | |
| US20230165653A1 (en) | Systems, methods, and devices for covering and tracking a surgical device | |
| WO2024236440A1 (fr) | Localisation hybride pour chirurgie minimalement invasive et référencement spinal cervical, et leurs procédés d'utilisation | |
| WO2024252400A1 (fr) | Effecteur terminal de support osseux | |
| WO2025120636A1 (fr) | Systèmes et procédés de détermination du mouvement d'un ou plusieurs éléments anatomiques | |
| WO2025079074A1 (fr) | Systèmes d'étalonnage et d'authentification d'effecteurs terminaux | |
| WO2023148712A1 (fr) | Systèmes et procédés de commande d'un bras robotique |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25708903; Country of ref document: EP; Kind code of ref document: A1 |