
WO2025115013A1 - Non-unique LED pattern geometry and identification for robotics navigation - Google Patents

Non-unique LED pattern geometry and identification for robotics navigation

Info

Publication number
WO2025115013A1
WO2025115013A1 (PCT/IL2024/051125)
Authority
WO
WIPO (PCT)
Prior art keywords
tracking
robotic
unit
camera
patterns
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2024/051125
Other languages
French (fr)
Inventor
Itamar ESHEL
Itay JERBY
Hay H. SHMULEVICH
Adi Sandelson
Stanislav LOKSHIN
Nicholas J. Rawluk
John C. HOUGE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd
Publication of WO2025115013A1
Legal status: Pending


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 - Surgical robots
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 - Surgical instruments, devices or methods
    • A61B 2017/00681 - Aspects not otherwise provided for
    • A61B 2017/00725 - Calibration or performance testing
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 - Tracking techniques
    • A61B 2034/2055 - Optical tracking systems
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 - Tracking techniques
    • A61B 2034/2065 - Tracking using image or pattern recognition
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 - Surgical robots
    • A61B 2034/305 - Details of wrist mechanisms at distal ends of robotic arms
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/39 - Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3937 - Visible markers
    • A61B 2090/3945 - Active visible markers, e.g. light emitting diodes

Definitions

  • This application relates generally to the field of robotics assisted surgery, and more particularly to robotics navigation.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure or may complete one or more surgical procedures autonomously. During such surgical procedures, surgical tools may be used on one or more anatomical elements. The tools may be oriented and operated by the surgical robot and/or the surgeon or other medical provider.
  • FIG. 1 is a block diagram illustrating a robotic system according to various examples.
  • FIG. 2 illustrates aspects of the operation of the robotic system of FIG. 1 according to various examples.
  • FIGS. 3A-3C illustrate examples of a robotic tracking unit of the system of FIG. 1.
  • FIG. 4 schematically illustrates the robotic tracking unit of the system of FIG. 1 according to various examples.
  • FIG. 5 illustrates a flowchart of a method performed by the robotic system of FIG. 1 according to various examples.
  • a surgical robot system requires determining both a tool trajectory (i.e., the trajectory along which a tool is used on the patient) and the tool center point (“TCP”), typically the distal-most point of the tool (e.g., the tip of a bone removal tool).
  • navigation cameras are used to validate tool trajectory and tool center point to maintain accuracy.
  • Navigation cameras are used to track portions of the robot and/or surgical tool as it moves. Navigation analyzes captured images to locate markers, which are used to determine the position of one or more aspects of the robot. For example, some robots have passive reflective spheres attached to known positions on the robot.
  • a robotic navigation system is able to track the robot by tracking the locations of the spheres as the robot moves.
  • Other robots use light emitting diodes (LEDs) as active markers.
  • a robotic navigation system is able to determine the position of the robot by identifying patterns formed by the LEDs.
  • designing trackers for navigation is a complicated task. Trackers require unique patterns in order to be identified by the navigation system. Designing the locations of LEDs for tracking involves many constraints, especially because robots present a small geometry in which to work. For example, each unique pattern represents another position that can be detected, and tracking accuracy improves as the number of detectable positions increases. However, the more unique patterns that are required, the more complicated the tracker design becomes.
  • a robotic surgical system may include an RTU at the end of the robotic arm that will remain fixed on the arm and facilitate the connection of end effectors (e.g., surgical tools) to the robotic arm during surgery.
  • the RTU includes active tracker LEDs (which can be controlled and turned on and off) to enable navigation of the RTU’s location in space using a navigation camera to improve accuracy.
  • Examples presented herein provide an RTU that includes multiple non-unique LED patterns (e.g., a recurring pattern of LEDs) along its perimeter.
  • the RTU can present ten or more non-unique patterns of LEDs on the external surfaces of the RTU. Because the RTU presents non-unique patterns, its physical shape may be simplified over a design that has to present multiple unique patterns.
  • a repeating LED pattern provides for an RTU with a simpler geometry (e.g., as compared to an RTU that must support multiple unique patterns).
  • a simpler geometry reduces the complexity and cost associated with manufacturing.
  • the need for unique faces introduces inefficiency in manufacturing because unique patterns mean different parts are manufactured for each pattern.
  • a simpler geometry for the RTU also improves maintenance and serviceability. Examples provided herein also increase the design’s useful lifetime. Fewer possible conflicts with other geometries mitigate the risk of conflicting with future geometries for future instruments.
  • the techniques described herein relate to a medical system including: a robot including a robotic arm; a robotic tracking unit coupled to a distal end of the robotic arm, the robotic tracking unit including a plurality of light emitting diodes (LEDs); a navigation system including a camera positioned to capture images of the robotic tracking unit; and an electronic processor coupled to the robot, the robotic tracking unit, and the navigation system, and configured to: select a tracking pattern for the robotic tracking unit from a plurality of non- unique tracking patterns; control the plurality of LEDs to illuminate based on the tracking pattern; receive, from the navigation system, a captured image of the robotic tracking unit; detect, in the captured image, the tracking pattern; and determine, based on the tracking pattern, an orientation of the robotic tracking unit; and control the robotic arm based on the orientation of the robotic tracking unit.
  • a medical system including: a robot including a robotic arm; a robotic tracking unit coupled to a distal end of the robotic arm, the robotic tracking unit including a plurality of light emitting diodes (LEDs
  • the techniques described herein relate to a method for operating a surgical robot, the method including: selecting, with an electronic processor, a tracking pattern for a robotic tracking unit coupled to a distal end of a robotic arm of the surgical robot from a plurality of non-unique tracking patterns; controlling a plurality of light emitting diodes (LEDs) of the robotic tracking unit to illuminate based on the tracking pattern; receiving, from a camera positioned to capture images of the robotic tracking unit, a captured image of the robotic tracking unit; detecting, in the captured image, the tracking pattern; and determining, based on the tracking pattern, an orientation of the robotic tracking unit; and controlling the robotic arm based on the orientation of the robotic tracking unit.
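  • As a non-authoritative illustration only, the following Python sketch outlines one way the select/illuminate/capture/detect/determine/control sequence recited above could be organized; all names (e.g., tracking_step, TrackingResult) and the callback interfaces are hypothetical and are not part of the disclosure.

```python
# Hypothetical sketch of the tracking loop described above; names are illustrative only.
from dataclasses import dataclass

import numpy as np


@dataclass
class TrackingResult:
    pattern_id: int          # which non-unique pattern instance was illuminated
    orientation: np.ndarray  # 3x3 rotation of the tracking unit in the camera frame


def tracking_step(select_pattern, illuminate, capture, detect, solve_orientation, move_arm):
    """One iteration of: select -> illuminate -> capture -> detect -> orient -> control."""
    pattern_id = select_pattern()                 # choose one of the non-unique patterns
    illuminate(pattern_id)                        # drive only that pattern's LEDs
    image = capture()                             # image from the navigation camera
    detection = detect(image, pattern_id)         # 2D LED centroids, or None if not visible
    if detection is None:
        return None                               # pattern not seen; caller may retry
    orientation = solve_orientation(detection)    # e.g., a PnP-style pose estimate
    move_arm(orientation)                         # adjust the robotic arm accordingly
    return TrackingResult(pattern_id, orientation)
```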
  • distal and proximal are used in the following description with respect to a position or direction relative to the surgical robot. “Distal” or “distally” refers to a position distant from the surgical robot, or a direction away from the surgical robot and toward the patient. “Proximal” or “proximally” refers to a position near the surgical robot, or a direction away from the patient and toward the surgical robot.
  • FIG. 1 is a block diagram of a robotic system 100.
  • the robotic system 100 may be used to carry out robotic assisted surgery, including one or more aspects of one or more of the methods disclosed herein.
  • the robotic system 100 includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, one or more sensors 126, a database 130, and/or a cloud (or other network) 134.
  • Systems according to other embodiments of the present disclosure may comprise more or fewer components than the robotic system 100.
  • the computing device 102 includes an electronic processor 104, a memory 106, a communication interface 108, and a user interface 110. In some aspects, the computing device 102 may include more or fewer components than illustrated in the example.
  • the computing device 102 includes an electronic processor 104 (for example, a microprocessor, application specific integrated circuit, etc.), a memory 106, a communication interface 108, and a user interface 110.
  • the electronic processor 104, the memory 106, the communication interface 108, and the user interface 110, as well as the other various modules (not illustrated) are coupled directly, by one or more control or data buses (e.g., the bus 140), or a combination thereof.
  • the memory 106 may be made up of one or more non-transitory computer-readable media and includes at least a program storage area and a data storage area.
  • the program storage area and the data storage area can include combinations of several types of memory, such as read-only memory (“ROM”), random access memory (“RAM”) (for example, dynamic RAM (“DRAM”), synchronous DRAM (“SDRAM”), etc.), electrically erasable programmable read-only memory (“EEPROM”), flash memory, or any other suitable tangible, non-transitory memory for storing computer-readable data and/or instructions.
  • the memory 106 may store information or data useful for completing, for example, any aspect of the method 500 described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the electronic processor 104, enable image processing 120, sensor processing 122, and/or robotic tracking 124.
  • Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
  • the image processing 120 enables the electronic processor 104 to process image data of an image (received from, for example, the imaging device 112, a camera of the navigation system 118, or any imaging device) for the purpose of, for example, identifying information about an anatomical element 204 (shown in FIG. 2) and/or objects in the image such as a surgical tool 128 and a robotic tracking unit 136.
  • the identifying information can be used to determine a three-dimensional location of the surgical tool 128, for example, relative to the anatomical element 204.
  • the image processing 120 is also configured to detect tracking patterns produced by illuminated LEDs of the robotic tracking unit 136, as described herein.
  • the sensor processing 122 enables the processor 104 to process sensor output data (received, for example, from the one or more sensors 126) for the purpose of, for example, determining the location of the robotic arm 116, the surgical tool 128, and/or the robotic tracking unit 136.
  • the sensor output may be received as signal(s) and may be processed by the electronic processor 104 using the sensor processing 122 to output data such as, for example, force data, acceleration data, pose data, time data, location data, etc.
  • the robotic tracking 124 enables the processor 104 to send and receive data and/or commands to and from the robotic tracking unit 136 to control the robotic tracking unit 136 and to determine its orientation, as described herein.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the electronic processor 104 to carry out the various method and features described herein.
  • the data, algorithms, and/or instructions may cause the electronic processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, the sensors 126, and/or the cloud 134.
  • the electronic processor 104 sends and receives information (for example, from the memory 106, the communication interface 108, and/or the user interface 110) and processes the information by executing one or more software instructions or modules, capable of being stored in the memory 106, or another non-transitory computer readable medium.
  • the software can include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions.
  • the electronic processor 104 is configured to retrieve from the memory 106 and execute, among other things, software for performing methods as described herein.
  • the communication interface 108 transmits and receives information from devices external to the computing device 102, for example, components of the robotic system 100.
  • the communication interface 108 receives input (for example, from the user interface 110), provides system output, or both.
  • the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the sensors 126, the database 130, the cloud 134, and/or any other system or component not part of the robotic system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the sensors 126, the database 130, the cloud 134, and/or any other system or component not part of the robotic system 100).
  • the communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, etc.) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
  • the communication interface 108 may be useful for enabling the computing device 102 to communicate with one or more other electronic processors or computing devices, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also include one or more user interfaces 110.
  • the user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the robotic system 100 (e.g., by the electronic processor 104 or another component of the robotic system 100) or received by the robotic system 100 from a source external to the robotic system 100.
  • the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the electronic processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • Although FIG. 1 illustrates only a single electronic processor 104, memory 106, communication interface 108, and user interface 110, alternative embodiments of the computing device 102 may include multiple electronic processors, memory modules, communication interfaces, and/or user interfaces.
  • the robotic system 100 may include other computing devices, each including similar components as, and configured similarly to, the computing device 102.
  • portions of the computing device 102 are implemented partially or entirely on a semiconductor chip (e.g., an application specific integrated circuit (ASIC), a field-programmable gate array (“FPGA”), and the like).
  • the various modules and controllers described herein may be implemented as individual controllers, as illustrated, or as components of a single controller. In some aspects, a combination of approaches may be used.
  • the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy and/or objects such as the surgical tool 128 to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
  • image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof, and/or objects such as the surgical tool 128.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other suitable imaging device.
  • the sensors 126 can be configured to provide sensor output.
  • the sensors 126 may include a position sensor, a proximity sensor, a magnetometer, or an accelerometer.
  • the sensors 126 may include a linear encoder, a rotary encoder, or an incremental encoder (e.g., positioned to sense movement or position of the robotic arm 116).
  • Sensor output or output data from the sensors 126 may be provided to an electronic processor of the robot 114, to the electronic processor 104 of the computing device 102, and/or to the navigation system 118.
  • Output data from the sensor(s) 126 may also be used to determine position information for the robot 114. It will be appreciated that the sensors 126 may be separate from or included in the robotic arm(s) 116.
  • the sensors 126 may enable the electronic processor 104 (or an electronic processor of the robot 114) to determine a precise pose in space of a robotic arm 116 (as well as any object or element held by or secured to the robotic arm). In other words, sensor output or output data from the sensors 126 may be used to calculate a position in space of the robotic arm 116 (and thus, the surgical tool 128) relative to one or more coordinate systems.
  • the robot 114 may be any surgical robot or surgical robotic system.
  • the robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system, or any derivative thereof.
  • the robot 114 may be configured to position the surgical tool 128 at one or more precise poses (e.g., position(s) and orientation(s)).
  • the surgical tool 128 may be any tool capable of cutting, drilling, milling, and/or parting an anatomical element.
  • the surgical tool 128 may be, in one example, a drill bit.
  • the robot 114 may be configured to rotate and/or advance the surgical tool 128 using, for example, one or more motors.
  • the robot 114 may additionally or alternatively be configured to manipulate any component (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may comprise one or more robotic arms 116.
  • the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms.
  • one or more of the robotic arms 116 may be used to hold and/or maneuver the surgical tool 128.
  • Each robotic arm 116 may be positionable independently of the other robotic arm.
  • the robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, a surgical tool 128 or another object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
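  • Because a pose combines a position and an orientation, implementations commonly carry both in a single 4x4 homogeneous transform; the helper below is a generic sketch of that representation, not a description of the disclosed system.

```python
import numpy as np


def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Pack a 3x3 rotation and a 3-vector position into a 4x4 homogeneous pose."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose


# Example: a pose 0.5 m along x with a 90-degree rotation about z.
rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
tool_pose = pose_matrix(rz, np.array([0.5, 0.0, 0.0]))
```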
  • Reference markers (e.g., navigation markers) may be placed on the robot 114, the robotic arm 116, and/or the surgical tool 128.
  • the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the robotic system 100 or any component thereof.
  • the navigation system 118 provides navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras (e.g., the camera 210 illustrated in FIG. 2) or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the robotic system 100 is located (including, as described herein, the robotic tracking unit 136).
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, the surgical tool 128, and/or one or more other tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to one or more of the foregoing).
  • the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the robotic system 100 can operate without the use of the navigation system 118.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the robotic system 100 or a component thereof, to the robot 114, or to any other element of the robotic system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system).
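  • One conventional way to correlate one coordinate system to another is to store rigid transforms between them and compose those transforms on demand; the sketch below assumes such 4x4 transforms are available (e.g., from registration) and is illustrative only, not the actual database schema.

```python
import numpy as np

# Hypothetical stored transforms (4x4 homogeneous matrices), e.g. from registration:
#   T_nav_robot   : robot coordinates -> navigation (camera) coordinates
#   T_nav_patient : patient coordinates -> navigation coordinates
T_nav_robot = np.eye(4)
T_nav_patient = np.eye(4)

# A point known in robot coordinates can then be expressed in patient coordinates:
T_patient_robot = np.linalg.inv(T_nav_patient) @ T_nav_robot
point_robot = np.array([0.0, 0.0, 0.1, 1.0])      # homogeneous point, 10 cm along z
point_patient = T_patient_robot @ point_robot
```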
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the robotic system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the robotic system 100; and/or any other useful information.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the robotic system 100 or external to the robotic system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the robotic system 100 or similar systems may be used, for example, to carry out one or more aspects of the method 500 described herein.
  • the robotic system 100 or similar systems may also be used for other purposes.
  • FIG. 2 illustrates a representative example system 200 of the robotic system 100.
  • the system 200 includes the computing device 102, the robot 114, and the navigation system 118.
  • the robot 114 includes the robotic arm 116, the robotic tracking unit 136 (coupled to the distal end of the robotic arm 116) and the surgical tool 128. As illustrated in FIG. 2, the robot 114 may be used to perform procedures on an anatomical element 204.
  • the navigation system 118 includes a camera 210, which has a field of view 206. As illustrated in FIG. 2, the field of view may encompass the anatomical element 204 (including an anatomy tracker 208) and the robotic tracking unit 136.
  • FIGS. 3A-3C illustrate examples of the robotic tracking unit 136.
  • the robotic tracking unit 136 is generally frustoconical in shape, having a longitudinal axis 314.
  • the robotic tracking unit 136 includes a plurality of light emitting diodes (LEDs) 312.
  • the LEDs are distributed in a repeating pattern about the circumference of the robotic tracking unit 136.
  • the plurality of LEDs 312 is distributed evenly between the distal end 318 and the proximal end 320 of the robotic tracking unit 136.
  • the robotic tracking unit 136 includes a plurality of planar facets 316 aligned in pairs (one at the distal end 318 and the other at the proximal end 320).
  • the plurality of LEDs 312 is positioned to form vertices of a plurality of non-unique (i.e., repeating) tracking patterns 325 (indicated schematically as Blue, Red and Green), which are distributed circumferentially about the robotic tracking unit 136.
  • the tracking patterns 325 may be selectively illuminated and detected (e.g., by the navigation system 118) to determine an orientation of the robotic tracking unit 136.
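  • Because every instance of the non-unique tracking pattern shares the same local LED geometry, one possible implementation stores the pattern once and replicates it about the longitudinal axis; the sketch below is illustrative, and the LED coordinates shown are hypothetical.

```python
import numpy as np


def pattern_instances(local_leds: np.ndarray, count: int) -> list[np.ndarray]:
    """Replicate one local LED layout (Nx3, meters) at `count` evenly spaced
    angular positions about the tracking unit's longitudinal (z) axis."""
    instances = []
    for k in range(count):
        a = 2.0 * np.pi * k / count
        rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
        instances.append(local_leds @ rz.T)
    return instances


# Example: a four-LED pattern repeated ten times around the unit (coordinates are made up).
local = np.array([[0.03, 0.0, 0.00],
                  [0.03, 0.0, 0.02],
                  [0.04, 0.0, 0.01],
                  [0.02, 0.0, 0.01]])
patterns = pattern_instances(local, count=10)
```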
  • FIG. 4 schematically illustrates an example of the robotic tracking unit 136.
  • the robotic tracking unit 136 includes an electronic controller 402.
  • the electronic controller includes at least an electronic processor, a memory, and an input/output interface.
  • the electronic controller 402 is coupled to the plurality of LEDs 312. By way of example, only four LEDs are illustrated in FIG. 4, which may be illuminated to produce the tracking pattern 404. This should not be considered limiting.
  • the robotic tracking unit 136 includes enough LEDs to produce multiple non-unique tracking patterns (e.g., ten or more) distributed around the robotic tracking unit 136.
  • As illustrated in FIG. 4, the robotic tracking unit 136 includes a photodiode 406, which is coupled to the electronic controller 402 and configured to sense light (e.g., infrared light) produced, for example, by the navigation system 118.
  • the photodiode is associated with the tracking pattern 404.
  • the robotic tracking unit 136 includes one photodiode for each of the plurality of tracking patterns producible by the plurality of LEDs 312.
  • the electronic controller 402 is coupled (via suitable wired or wireless connections) with the computing device 102.
  • FIG. 5 illustrates an example method 500 for operating the system of FIG. 1 to track the robot 114.
  • the method 500 is described in conjunction with the robotic system 100 as described herein, the method 500 could be used with other systems and devices.
  • the method 500 may be modified or performed differently than the example provided.
  • the method 500 is applicable to robotic systems, which are not used for surgery.
  • the method 500 is described as being performed by the computing device 102, and, in particular, the electronic processor 104. However, it should be understood that, in some examples, portions of the method 500 may be performed by other components of the robotic system 100, such as, for example, the navigation system 118 and the electronic controller 402.
  • a system calibration may be performed. For example, when the robotic system 100 is set up prior to a surgery, calibration is performed to adjust the system based on the location of the camera 210 and the robot 114.
  • the electronic processor 104 may perform a calibration routine as follows.
  • the electronic processor 104 selects a tracking pattern for the robotic tracking unit.
  • the tracking pattern is selected from the plurality of non-unique tracking patterns producible by the plurality of LEDs 312. Because the tracking patterns are non-unique (i.e., identical and repeating), they cannot all be produced at once. As such, the electronic processor 104 may use any of a number of techniques to select which tracking pattern will be illuminated.
  • For example, the electronic processor 104 may determine a location of the robotic arm. In some examples, the electronic processor 104 receives data from the sensors 126 and interprets the data to determine a location in space for the robotic arm 116. The electronic processor 104 also determines a location of the camera 210 relative to the robotic arm 116.
  • the electronic processor 104 may receive location information for the camera 210 from the navigation system 118.
  • Using the location for the robotic arm 116 and the location of the camera 210 relative to the robotic arm 116, the electronic processor 104 is able to determine which of the plurality of non-unique tracking patterns is facing the camera 210 and select that tracking pattern for illumination at step 520.
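  • One way to read this which-pattern-is-facing-the-camera determination is as selecting the pattern whose outward normal points most directly toward the camera; the geometric sketch below assumes the pattern centers, normals, and camera position are expressed in a common frame and is illustrative only.

```python
import numpy as np


def select_facing_pattern(pattern_centers, pattern_normals, camera_position):
    """Return the index of the pattern whose outward normal best aligns with
    the direction from that pattern toward the camera (all in one frame)."""
    best_idx, best_alignment = -1, -np.inf
    for i, (center, normal) in enumerate(zip(pattern_centers, pattern_normals)):
        to_camera = camera_position - center
        to_camera = to_camera / np.linalg.norm(to_camera)
        alignment = float(np.dot(normal / np.linalg.norm(normal), to_camera))
        if alignment > best_alignment:
            best_idx, best_alignment = i, alignment
    return best_idx
```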
  • the electronic processor 104 controls the plurality of LEDs to illuminate to sequentially produce each of the plurality of non-unique tracking patterns. In this way, each pattern is illuminated in turn while the electronic processor 104 receives a plurality of images from the camera 210 (e.g., via the navigation system 118).
  • the electronic processor 104, for example using image processing, detects in the plurality of images a plurality of candidate tracking patterns. Some of the candidate tracking patterns may show a complete pattern, while some may be partial. Because not all of the candidate tracking patterns are aligned with the focal plane of the camera 210, the candidate tracking patterns will exhibit different degrees of distortion. Accordingly, the electronic processor 104 calculates, for each of the candidate tracking patterns, a geometric accuracy.
  • the geometric accuracy is a value (e.g., a percentage) indicating the extent to which a candidate tracking pattern matches the shape of the non-unique tracking pattern (were it to be viewed parallel to the focal plane of the camera 210).
  • the electronic processor 104 also bases the geometric accuracy on the angle-to-camera for the candidate tracking pattern (i.e., the degree of parallelism between the candidate tracking pattern and the focal plane of the camera 210).
  • the electronic processor 104 selects the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by selecting from the plurality of candidate tracking patterns the candidate tracking pattern with a greater degree of geometric accuracy (as compared to the other candidate tracking patterns).
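  • Read plainly, this sequential approach illuminates each pattern in turn, scores each detection, and keeps the best-scoring candidate; in the sketch below the geometric_accuracy callback (e.g., a 0-to-1 match score against the nominal shape) is a hypothetical placeholder, as its implementation is not specified here.

```python
def select_by_geometric_accuracy(pattern_ids, illuminate, capture, detect, geometric_accuracy):
    """Illuminate each non-unique pattern in turn and return the id of the
    candidate whose detected shape best matches the nominal pattern."""
    best_id, best_score = None, float("-inf")
    for pattern_id in pattern_ids:
        illuminate(pattern_id)
        image = capture()
        candidate = detect(image)              # may be None or a partial detection
        if candidate is None:
            continue
        score = geometric_accuracy(candidate)  # e.g., 0..1 match to the nominal shape
        if score > best_score:
            best_id, best_score = pattern_id, score
    return best_id, best_score
```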
  • the electronic processor 104 utilizes infrared light (e.g., emitted by an infrared emitter on the camera 210) to select the tracking pattern.
  • some embodiments of the robotic tracking unit 136 include a plurality of photodiodes, each of which are associated with one of the plurality of non-unique tracking patterns.
  • the electronic processor 104 controls the infrared emitter to illuminate (e.g., by sending a command to the navigation system 118).
  • the electronic processor 104 receives from each of the plurality of photodiodes, a signal indicative of a level of infrared light received from the infrared emitter.
  • the electronic processor 104 selects the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns based on the signals received from the plurality of photodiodes. For example, the electronic processor 104 may select the tracking pattern corresponding to the photodiode receiving the highest level of infrared light. In other embodiments, other types of light may be produced and detected to accomplish the selection of the tracking pattern.
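  • Selecting by photodiode reading reduces to taking the pattern whose associated photodiode reports the strongest signal; a minimal sketch, assuming a simple mapping from pattern identifier to measured infrared level, follows.

```python
def select_by_photodiode(photodiode_levels: dict) -> int:
    """Given a mapping {pattern_id: infrared level}, return the pattern whose
    associated photodiode reports the strongest signal from the emitter."""
    return max(photodiode_levels, key=photodiode_levels.get)


# Example (made-up readings): pattern 3 faces the infrared emitter most directly.
levels = {0: 0.12, 1: 0.34, 2: 0.71, 3: 0.93, 4: 0.40}
assert select_by_photodiode(levels) == 3
```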
  • the electronic processor 104 performs a calibration to adjust the system based on the location of the camera 210 and the robot 114.
  • the electronic processor 104 may receive a location for the camera 210, illuminate a tracking pattern for the robotic tracking unit, and move the robotic arm and/or robotic tracking unit to detect the illuminated pattern using the camera 210.
  • the tracking pattern illuminated is one of the plurality of non-unique tracking patterns suitable for calibration.
  • the tracking pattern may be randomly selected.
  • the tracking pattern is one that allows for the robotic arm to be placed in a desired position for calibration.
  • the electronic processor 104 controls the plurality of LEDs 312 of the robotic tracking unit to illuminate based on the tracking pattern. For example, the electronic processor 104 illuminates those LEDs, which when illuminated, produce the selected tracking pattern. In some instances, e.g., where the calibration is performed based on the location of the camera 210, the electronic processor 104 controls the plurality of LEDs to illuminate based on the selected tracking pattern and controls the robotic arm to rotate the robotic tracking unit 136 while the LEDs are illuminated. In some aspects, the electronic processor 104 also controls the robotic arm to position the robotic tracking unit within a field of view of the camera 210 (e.g., based on knowledge of the location of the camera 210).
  • the electronic processor 104 receives, from the camera 210, which is positioned to capture images of the robotic tracking unit 136, a captured image of the robotic tracking unit 136. Also present in these images is the illuminated tracking pattern. In instances where the robotic tracking unit 136 is rotated, as the robotic tracking unit 136 rotates, the electronic processor 104 receives a plurality of images from the camera 210. Each of these images captures the robotic tracking unit 136 and may capture the illuminated tracking pattern.
  • At step 540, the electronic processor 104 detects, using image processing techniques, the tracking pattern in the captured image. For example, the electronic processor 104 may use an object classifier trained using images of the non-unique tracking pattern.
  • the electronic processor 104 processes each image to detect the tracking pattern.
  • At step 550, the electronic processor 104 determines, based on the tracking pattern, an orientation of the robotic tracking unit 136. For example, the electronic processor 104 may use the position, size, and geometric accuracy of the tracking pattern to determine the orientation in space of the robotic tracking unit 136.
  • the electronic processor 104 may receive and process multiple images while the robotic tracking unit 136 rotates. In such examples, for each of the plurality of images where the illuminated tracking pattern is detected, the electronic processor 104 calculates a geometric accuracy (as described herein) for the detected tracking pattern. Each geometric accuracy is compared to a threshold. For example, the threshold may be a percentage of accuracy that yields an acceptable accuracy for tracking the surgical tool 128. In some aspects, the electronic processor 104 controls the robotic arm to stop rotating the robotic tracking unit 136 when the geometric accuracy for one of the plurality of images exceeds the threshold. The electronic processor 104 then determines the orientation of the robotic tracking unit 136 based on the tracking pattern detected in the image where the geometric accuracy exceeds the threshold.
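  • This rotate-and-check behavior can be sketched as a loop that keeps turning the robotic tracking unit until one captured frame scores above the accuracy threshold; the callbacks for rotation, capture, detection, and scoring below are hypothetical placeholders.

```python
def rotate_until_accurate(rotate_step, capture, detect, geometric_accuracy,
                          threshold: float, max_steps: int = 360):
    """Rotate the tracking unit in small increments until a detected tracking
    pattern's geometric accuracy exceeds `threshold`; return that detection or None."""
    for _ in range(max_steps):
        image = capture()
        candidate = detect(image)
        if candidate is not None and geometric_accuracy(candidate) > threshold:
            return candidate            # stop rotating; use this frame's pattern
        rotate_step()                   # small rotation about the tracking unit's axis
    return None                         # threshold never reached
```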
  • the electronic processor 104 controls the robotic arm based on the orientation of the robotic tracking unit. For example, the electronic processor 104 may determine a tool center point for the surgical tool 128 based on the orientation of the robotic tracking unit and control the robotic arm to position the surgical tool based on the tool center point. For example, the electronic processor 104 may use knowledge of the shape and dimensions of the surgical tool 128 and transformation matrices to calculate the tool center point.
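  • Computing a tool center point from the robotic tracking unit's pose typically amounts to composing that pose with a fixed, known tool offset; the transform sketch below assumes the offset is known from the tool's shape and dimensions and is illustrative rather than the disclosed calculation.

```python
import numpy as np


def tool_center_point(rtu_pose_cam: np.ndarray, tcp_offset_rtu: np.ndarray) -> np.ndarray:
    """Map the tool center point, known in the tracking-unit frame, into the camera frame.

    rtu_pose_cam   : 4x4 pose of the tracking unit in the camera/navigation frame
    tcp_offset_rtu : 3-vector from the tracking-unit origin to the tool tip, in its frame
    """
    tcp_h = np.append(tcp_offset_rtu, 1.0)       # homogeneous coordinates
    return (rtu_pose_cam @ tcp_h)[:3]


# Example with a made-up offset: a tool tip 120 mm distal of the tracking-unit origin.
tcp_cam = tool_center_point(np.eye(4), np.array([0.0, 0.0, 0.120]))
```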
  • the electronic processor 104 determines, based on signals received from the one or more sensors 126, a nominal position for the surgical tool 128.
  • the sensors may track movement of the robotic arm 116 and indicate to the electronic processor 104 where the robotic arm 116 (and thus the surgical tool 128) is supposed to be positioned.
  • the electronic processor 104 may determine an error threshold based on the nominal position and the tool center point.
  • the error threshold represents the difference between the expected and actual location of the tool center point.
  • the electronic processor 104 uses the error threshold to control the robotic arm.
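  • Comparing the nominal (sensor-derived) tool position against the camera-derived tool center point can be sketched as a simple distance check that gates arm motion; the 1 mm tolerance in the example is an arbitrary illustrative value.

```python
import numpy as np


def tcp_error_ok(nominal_tcp: np.ndarray, measured_tcp: np.ndarray,
                 tolerance_m: float = 0.001) -> bool:
    """Return True if the measured tool center point lies within `tolerance_m`
    of the nominal position reported by the arm's sensors."""
    return float(np.linalg.norm(measured_tcp - nominal_tcp)) <= tolerance_m


# Example: a 0.4 mm discrepancy is within a 1 mm tolerance.
assert tcp_error_ok(np.array([0.0, 0.0, 0.120]), np.array([0.0, 0.0004, 0.120]))
```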
  • the calibration is performed each time the camera 210 is repositioned.
  • the electronic processor 104 may receive, from the navigation system, a second location (e.g., different from the last received location) for the camera 210 and, responsive to receiving the second location for the camera, perform the calibration routine again.
  • references herein to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure.
  • the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments necessarily mutually exclusive of other embodiments. The same applies to the term “implementation.”
  • the conjunction “if” may also or alternatively be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” which construal may depend on the corresponding specific context.
  • the phrase “if it is determined” or “if [a stated condition] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event].”
  • the terms “couple,” “coupling,” “coupled,” “connect,” “connecting,” or “connected” refer to any manner known in the art or later developed in which energy is allowed to be transferred between two or more elements, and the interposition of one or more additional elements is contemplated, although not required.
  • the terms “directly coupled,” “directly connected,” et cetera imply the absence of such additional elements.
  • The same distinction applies to the terms “attachment” and “directly attached,” as applied to a description of a physical structure.
  • a relatively thin layer of adhesive or other suitable binder can be used to implement such “direct attachment” of the two corresponding components in such physical structure.
  • processors may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared.
  • processor or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage. Other hardware, conventional and/or custom, may also be included.
  • any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
  • circuitry may refer to one or more or all of the following: (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry); (b) combinations of hardware circuits and software, such as (as applicable): (i) a combination of analog and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
  • This definition of circuitry applies to all uses of this term in this application, including in any claims.
  • circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware.
  • circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
  • Example 1 A medical system comprising: a robot including a robotic arm; a robotic tracking unit coupled to a distal end of the robotic arm, the robotic tracking unit including a plurality of light emitting diodes (LEDs); a navigation system including a camera positioned to capture images of the robotic tracking unit; and an electronic processor coupled to the robot, the robotic tracking unit, and the navigation system, and configured to: perform a calibration routine by: selecting a tracking pattern for the robotic tracking unit from a plurality of non-unique tracking patterns; controlling the plurality of LEDs to illuminate based on the tracking pattern; receiving, from the navigation system, a captured image of the robotic tracking unit; detecting, in the captured image, the tracking pattern; and determining, based on the tracking pattern, an orientation of the robotic tracking unit; and control the robotic arm based on the orientation of the robotic tracking unit.
  • Example 2 The medical system of example 1, further comprising: a surgical tool coupled to a distal end of the robotic tracking unit; wherein the electronic processor is further configured to: determine a tool center point for the surgical tool based on the orientation of the robotic tracking unit; and control the robotic arm to position the surgical tool based on the tool center point.
  • Example 3 The medical system of example 2, further comprising: one or more sensors for determining a nominal position for the surgical tool; wherein the electronic processor is further configured to: determine, based on signals received from the one or more sensors, the nominal position for the surgical tool; determine an error threshold based on the nominal position and the tool center point; and control the robotic arm based on the error threshold.
  • Example 4 The medical system of any one of examples 1-3, wherein the electronic processor is further configured to: determine a location of the robotic arm; determine a location of the camera relative to the robotic arm; and select the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by determining which of the plurality of non-unique tracking patterns is facing the camera based on the location of the robotic arm and the location of the camera relative to the robotic arm.
  • Example 5 The medical system of any one of examples 1-4, wherein the electronic processor is further configured to: control the plurality of LEDs to illuminate to sequentially produce each of the plurality of non-unique tracking patterns; receive a plurality of images from the camera; detect in the plurality of images a plurality of candidate tracking patterns; calculate, for each of the candidate tracking patterns, a geometric accuracy; and select the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by selecting from the plurality of candidate tracking patterns the candidate tracking pattern with a greater degree of geometric accuracy.
  • Example 6 The medical system of example 5, wherein the electronic processor is further configured to: calculate, for each of the candidate tracking patterns, an angle-to-camera; and the geometric accuracy is based on the angle-to-camera.
  • Example 7 The medical system of any one of examples 1-6, wherein: the navigation system further comprises an infrared emitter; the robotic tracking unit further comprises a plurality of photodiodes, each of the photodiodes associated with one of the plurality of non- unique tracking patterns; and the electronic processor is further configured to: control the infrared emitter to illuminate; receive from each of the plurality of photodiodes a signal indicative of a level of infrared light received from the infrared emitter; and select the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns based on the signals received from the plurality of photodiodes.
  • Example 8 The medical system of any one of examples 1-7, wherein the electronic processor is further configured to: receive, from the navigation system, a location for the camera; responsive to controlling the plurality of LEDs to illuminate based on the tracking pattern, control the robotic arm to rotate the robotic tracking unit based on the location for the camera; receive a plurality of images from the camera; for each of the plurality of images, calculate a geometric accuracy based on detecting the tracking pattern in the image; control the robotic arm to stop rotating the robotic tracking unit when the geometric accuracy for one of the plurality of images exceeds a threshold; and responsive to the geometric accuracy for one of the plurality of images exceeding the threshold, determine the orientation of the robotic tracking unit based on the tracking pattern detected in the image.
  • Example 9 The medical system of example 8, wherein the electronic processor is further configured to: receive, from the navigation system, a second location for the camera different from the location; and responsive to receiving the second location for the camera, repeat the calibration routine.
  • Example 10 A method for operating a surgical robot, the method comprising: performing a calibration routine by: selecting, with an electronic processor, a tracking pattern for a robotic tracking unit coupled to a distal end of a robotic arm of the surgical robot from a plurality of non-unique tracking patterns; controlling a plurality of light emitting diodes (LEDs) of the robotic tracking unit to illuminate based on the tracking pattern; receiving, from a camera positioned to capture images of the robotic tracking unit, a captured image of the robotic tracking unit; detecting, in the captured image, the tracking pattern; and determining, based on the tracking pattern, an orientation of the robotic tracking unit; and controlling the robotic arm based on the orientation of the robotic tracking unit.
  • Example 11 The method of example 10, further comprising: determining a tool center point for a surgical tool coupled to a distal end of the robotic tracking unit based on the orientation of the robotic tracking unit; and controlling the robotic arm to position the surgical tool based on the tool center point.
  • Example 12 The method of example 11, further comprising: receiving, from one or more sensors, signals indicating a nominal position for the surgical tool; determining an error threshold based on the nominal position and the tool center point; and controlling the robotic arm based on the error threshold.
  • Example 13 The method of any one of examples 10-12, further comprising: determining a location of the robotic arm; determining a location of the camera relative to the robotic arm; and selecting the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by determining which of the plurality of non-unique tracking patterns is facing the camera based on the location of the robotic arm and the location of the camera relative to the robotic arm.
  • Example 14 The method of any one of examples 10-13, further comprising: controlling the plurality of LEDs to illuminate to sequentially produce each of the plurality of non-unique tracking patterns; receiving a plurality of images from the camera; detecting in the plurality of images a plurality of candidate tracking patterns; calculating, for each of the candidate tracking patterns, a geometric accuracy; and selecting the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by selecting from the plurality of candidate tracking patterns the candidate tracking pattern with a greater degree of geometric accuracy.
  • Example 15 The method of example 14, wherein the geometric accuracy is based on an angle-to-camera for each of the candidate tracking patterns.
  • Example 16 The method of any one of examples 10-15, further comprising: controlling an infrared emitter to illuminate; receiving from each of a plurality of photodiodes a signal indicative of a level of infrared light received from the infrared emitter; and selecting the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns based on the signals received from the plurality of photodiodes; wherein each of the photodiodes is associated with one of the plurality of non-unique tracking patterns.
  • Example 17 The method of any one of examples 10-16, further comprising: receiving a location for the camera; responsive to controlling the plurality of LEDs to illuminate based on the tracking pattern, controlling the robotic arm to rotate the robotic tracking unit based on the location for the camera; receiving a plurality of images from the camera; for each of the plurality of images, calculating a geometric accuracy based on detecting the tracking pattern in the image; controlling the robotic arm to stop rotating the robotic tracking unit when the geometric accuracy for one of the plurality of images exceeds a threshold; and responsive to the geometric accuracy for one of the plurality of images exceeding the threshold, determining the orientation of the robotic tracking unit based on the tracking pattern detected in the image.
  • Example 18 The method of example 17, further comprising: receiving a second location for the camera different from the location; and responsive to receiving the second location for the camera, repeating the calibration routine.

Abstract

An example system may include a robot including a robotic arm and a robotic tracking unit (RTU) coupled to a distal end of the robotic arm, the RTU including a plurality of LEDs. The system may include a navigation system including a camera positioned to capture images of the RTU. The system may include an electronic processor coupled to the robot, the RTU, and the navigation system, and configured to perform a calibration routine by selecting a tracking pattern for the RTU from a plurality of non-unique tracking patterns; controlling the plurality of LEDs to illuminate based on the tracking pattern; receiving, from the navigation system, a captured image of the RTU; detecting, in the captured image, the tracking pattern; determining, based on the tracking pattern, an orientation of the RTU; and controlling the robotic arm based on the orientation of the RTU.

Description

NON-UNIQUE LED PATTERN GEOMETRY AND IDENTIFICATION FOR ROBOTICS NAVIGATION
FIELD
[0001] This application relates generally to the field of robotics assisted surgery, and more particularly to robotics navigation.
BACKGROUND
[0002] Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure or may complete one or more surgical procedures autonomously. During such surgical procedures, surgical tools may be used on one or more anatomical elements. The tools may be oriented and operated by the surgical robot and/or the surgeon or other medical provider.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments, examples, aspects, and features of concepts that include the claimed subject matter and explain various principles and advantages of those embodiments, examples, aspects, and features.
[0004] FIG. 1 is a block diagram illustrating a robotic system according to various examples.
[0005] FIG. 2 illustrates aspects of the operation of the robotic system of FIG. 1 according to various examples.
[0006] FIGS. 3A-3C illustrate examples of a robotic tracking unit of the system of FIG. 1.
[0007] FIG. 4 schematically illustrates the robotic tracking unit of the system of FIG. 1 according to various examples.
[0008] FIG. 5 illustrates a flowchart of a method performed by the robotic system of FIG. 1 according to various examples.
[0009] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of examples, aspects, and features illustrated.
[0010] In some instances, the apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the various embodiments, examples, aspects, and features so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0011] When performing robotics assisted surgery, it is important that the system operate accurately to improve patient outcomes. For example, operating a surgical robot system requires determining both a tool trajectory (i.e., the trajectory along which a tool is used on the patient) and the tool center point (“TCP”), typically the distal-most point of the tool (e.g., the tip of a bone removal tool). In some systems, navigation cameras are used to validate tool trajectory and tool center point to maintain accuracy. Navigation cameras are used to track portions of the robot and/or surgical tool as it moves. Navigation analyzes captured images to locate markers, which are used to determine the position of one or more aspects of the robot. For example, some robots have passive reflective spheres attached to known positions on the robot. A robotic navigation system is able to track the robot by tracking the locations of the spheres as the robot moves.
[0012] Other systems utilize multiple light emitting diodes (LEDs) for tracking. For example, a robotic navigation system is able to determine the position of the robot by identifying patterns formed by the LEDs. However, designing trackers for navigation is a complicated task. Trackers require unique patterns in order to be identified by the navigation system. Designing locations of LEDs for tracking has many constraints, especially because the robots present a small geometry in which to work. For example, each unique pattern represents another position that can be detected. The accuracy of the tracking improves as the number of detectable positions increases. However, as the number of unique patterns required increases, the trackers become more complicated to design.
[0013] Another design concern for trackers is line of sight issues. The geometry of the trackers needs to be such that there are no trackers blocked from the camera’s view. For example, the more axes of rotation that a robot has, the more possibilities exist for its movement to block portions of the trackers. It is also possible for trackers to be blocked from the camera’s view by the tracked components’ own geometry.
[0014] To address these problems, embodiments and aspects presented herein provide a robotic tracking unit (“RTU”) with non-unique LED pattern geometry. For example, a robotic surgical system may include an RTU at the end of the robotic arm that will remain fixed on the arm and facilitate the connection of end effectors (e.g., surgical tools) to the robotic arm during surgery. The RTU includes active tracker LEDs (which can be controlled and turned on and off) to enable navigation of the RTU’s location in space using a navigation camera to improve accuracy.
[0015] Examples presented herein provide an RTU that includes multiple non-unique LED patterns (e.g., a recurring pattern of LEDs) along its perimeter. In some aspects, the RTU can present ten or more non-unique patterns of LEDs on the external surfaces of the RTU. Because the RTU presents non-unique patterns, its physical shape may be simplified over a design that has to present multiple unique patterns.
[0016] Using the examples presented herein, accuracy is improved while system complexity is reduced. Utilizing non-unique patterns provides for ease of design, as the need to design multiple unique patterns is eliminated. In addition, using a non-unique pattern allows for a greater pattern density, which results in a higher resolution of patterns. Higher pattern resolution makes it more likely that a pattern whose plane is close to perpendicular to the camera's line of sight can be used. This, in turn, translates to improved navigation accuracy.
[0017] In addition, the use of a repeating LED pattern provides for an RTU with a simpler geometry (e.g., as compared to an RTU that must support multiple unique patterns). A simpler geometry reduces the complexity and cost associated with manufacturing. The need for unique faces introduces inefficiency in manufacturing because unique patterns mean different parts are manufactured for each pattern. A simpler geometry for the RTU also improves maintenance and serviceability. Examples provided herein also increase the design's useful lifetime. Fewer possible conflicts with other geometries mitigate the risk of conflicting with future geometries for future instruments.
[0018] In some aspects, the techniques described herein relate to a medical system including: a robot including a robotic arm; a robotic tracking unit coupled to a distal end of the robotic arm, the robotic tracking unit including a plurality of light emitting diodes (LEDs); a navigation system including a camera positioned to capture images of the robotic tracking unit; and an electronic processor coupled to the robot, the robotic tracking unit, and the navigation system, and configured to: select a tracking pattern for the robotic tracking unit from a plurality of non-unique tracking patterns; control the plurality of LEDs to illuminate based on the tracking pattern; receive, from the navigation system, a captured image of the robotic tracking unit; detect, in the captured image, the tracking pattern; determine, based on the tracking pattern, an orientation of the robotic tracking unit; and control the robotic arm based on the orientation of the robotic tracking unit.
[0019] In some aspects, the techniques described herein relate to a method for operating a surgical robot, the method including: selecting, with an electronic processor, a tracking pattern for a robotic tracking unit coupled to a distal end of a robotic arm of the surgical robot from a plurality of non-unique tracking patterns; controlling a plurality of light emitting diodes (LEDs) of the robotic tracking unit to illuminate based on the tracking pattern; receiving, from a camera positioned to capture images of the robotic tracking unit, a captured image of the robotic tracking unit; detecting, in the captured image, the tracking pattern; determining, based on the tracking pattern, an orientation of the robotic tracking unit; and controlling the robotic arm based on the orientation of the robotic tracking unit.
[0020] Specific embodiments of the present disclosure are now described with reference to the figures, wherein like reference numbers indicate identical or functionally similar elements. The terms “distal” and “proximal” are used in the following description with respect to a position or direction relative to the surgical robot. “Distal” or “distally” are a position distant from or in a direction away from the surgical robot toward the patient. “Proximal” and “proximally” are a position near or in a direction away from the patient toward the surgical robot.
[0021] Before any examples are explained in detail, it is to be understood that the examples presented herein are not limited in their application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The examples are capable of other embodiments and of being practiced or of being carried out in various ways. For ease of description, the example systems presented herein may be illustrated with a single exemplar of each of its component parts. Some examples may not describe or illustrate all components of the systems. Other example embodiments may include more or fewer of each of the illustrated components, may combine some components, or may include additional or alternative components.
[0022] FIG. 1 is a block diagram of a robotic system 100. The robotic system 100 may be used to carry out robotic assisted surgery, including one or more aspects of one or more of the methods disclosed herein. The robotic system 100 includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, one or more sensors 126, a database 130, and/or a cloud (or other network) 134. Systems according to other embodiments of the present disclosure may comprise more or fewer components than the robotic system 100. The computing device 102 includes an electronic processor 104, a memory 106, a communication interface 108, and a user interface 110. In some aspects, the computing device 102 may include more or fewer components than illustrated in the example.
[0023] The computing device 102 includes an electronic processor 104 (for example, a microprocessor, application specific integrated circuit, etc.), a memory 106, a communication interface 108, and a user interface 110. The electronic processor 104, the memory 106, the communication interface 108, and the user interface 110, as well as the other various modules (not illustrated) are coupled directly, by one or more control or data buses (e.g., the bus 140), or a combination thereof.
[0024] The memory 106 may be made up of one or more non-transitory computer-readable media and includes at least a program storage area and a data storage area. The program storage area and the data storage area can include combinations of several types of memory, such as read-only memory ("ROM"), random access memory ("RAM") (for example, dynamic RAM ("DRAM"), synchronous DRAM ("SDRAM"), etc.), electrically erasable programmable read-only memory ("EEPROM"), flash memory, or any other suitable tangible, non-transitory memory for storing computer-readable data and/or instructions. The memory 106 may store information or data useful for completing, for example, any aspect of the method 500 described herein, or of any other methods. The memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114. For instance, the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the electronic processor 104, enable image processing 120, sensor processing 122, and/or robotic tracking 124. Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
[0025] The image processing 120 enables the electronic processor 104 to process image data of an image (received from, for example, the imaging device 112, a camera of the navigation system 118, or any imaging device) for the purpose of, for example, identifying information about an anatomical element 204 (shown in FIG. 2) and/or objects in the image such as a surgical tool 128 and a robotic tracking unit 136. The identifying information can be used to determine a three-dimensional location of the surgical tool 128, for example, relative to the anatomical element 204. The image processing 120 is also configured to detect tracking patterns produced by illuminated LEDs of the robotic tracking unit 136, as described herein.
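By way of illustration only, the following Python sketch shows one way the image processing 120 might locate candidate LED centroids in a grayscale camera frame. It assumes OpenCV 4.x and NumPy are available; the function name detect_led_centroids and the brightness threshold are hypothetical and are not part of the disclosure.

```python
# Illustrative sketch only: locating bright blobs that may correspond to
# illuminated LEDs in a navigation-camera frame. Assumes OpenCV 4.x and NumPy;
# the threshold value and function name are hypothetical.
import cv2
import numpy as np

def detect_led_centroids(gray_frame: np.ndarray, threshold: int = 200):
    """Return (x, y) centroids of bright blobs in an 8-bit grayscale frame."""
    _, binary = cv2.threshold(gray_frame, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] > 0:  # ignore degenerate blobs
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```

The detected centroids could then be matched against the known LED geometry of the robotic tracking unit 136 to identify a tracking pattern.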
[0026] The sensor processing 122 enables the electronic processor 104 to process sensor output data (received from, for example, the one or more sensors 126) for the purpose of, for example, determining the location of the robotic arm 116, the surgical tool 128, and/or the robotic tracking unit 136. The sensor output may be received as signal(s) and may be processed by the electronic processor 104 using the sensor processing 122 to output data such as, for example, force data, acceleration data, pose data, time data, location data, etc.
[0027] The robotic tracking 124 enables the processor 104 to send and receive data and/or commands to and from the robotic tracking unit 136 to control the robotic tracking unit 136 and to determine its orientation, as described herein.
[0028] Alternatively, or additionally, the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the electronic processor 104 to carry out the various methods and features described herein. Thus, although various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models. The data, algorithms, and/or instructions may cause the electronic processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, the sensors 126, and/or the cloud 134.
[0029] The electronic processor 104 sends and receives information (for example, from the memory 106, the communication interface 108, and/or the user interface 110) and processes the information by executing one or more software instructions or modules, capable of being stored in the memory 106, or another non-transitory computer readable medium. The software can include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions. The electronic processor 104 is configured to retrieve from the memory 106 and execute, among other things, software for performing methods as described herein.
[0030] The communication interface 108 transmits and receives information from devices external to the computing device 102, for example, components of the robotic system 100. The communication interface 108 receives input (for example, from the user interface 110), provides system output, or a combination of both. The communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the sensors 126, the database 130, the cloud 134, and/or any other system or component not part of the robotic system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the sensors 126, the database 130, the cloud 134, and/or any other system or component not part of the robotic system 100).
[0031] The communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, etc.) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth). In some embodiments, the communication interface 108 may be useful for enabling the computing device 102 to communicate with one or more other electronic processors or computing devices, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
[0032] The computing device 102 may also include one or more user interfaces 110. The user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user. The user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the robotic system 100 (e.g., by the electronic processor 104 or another component of the robotic system 100) or received by the robotic system 100 from a source external to the robotic system 100. In some embodiments, the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the electronic processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
[0033] Although the user interface 110 is shown as part of the computing device 102, in some embodiments, the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102. In some embodiments, the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
[0034] It should be understood that although FIG. 1 illustrates only a single electronic processor 104, memory 106, communication interface 108, and user interface 110, alternative embodiments of the computing device 102 may include multiple electronic processors, memory modules, communication interfaces, and/or user interfaces. It should also be noted that the robotic system 100 may include other computing devices, each including similar components as, and configured similarly to, the computing device 102. In some embodiments, portions of the computing device 102 are implemented partially or entirely on a semiconductor chip (e.g., an application specific integrated circuit (ASIC), a field-programmable gate array (“FPGA”), and the like). Similarly, the various modules and controllers described herein may be implemented as individual controllers, as illustrated, or as components of a single controller. In some aspects, a combination of approaches may be used.
[0035] Continuing with other aspects of the robotic system 100, the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy and/or objects such as the surgical tool 128 to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.). "Image data" as used herein refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form. In various examples, the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof, and/or objects such as the surgical tool 128. The image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure. The imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data. The imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient and/or objects such as the surgical tool 128.
[0036] The sensors 126 can be configured to provide sensor output. The sensors 126 may include a position sensor, a proximity sensor, a magnetometer, or an accelerometer. In some embodiments, the sensors 126 may include a linear encoder, a rotary encoder, or an incremental encoder (e.g., positioned to sense movement or position of the robotic arm 116). Sensor output or output data from the sensors 126 may be provided to an electronic processor of the robot 114, to the electronic processor 104 of the computing device 102, and/or to the navigation system 118. Output data from the sensor(s) 126 may also be used to determine position information for the robot 114. It will be appreciated that the sensors 126 may be separate from or included in the robotic arm(s) 116. The sensors 126 may enable the electronic processor 104 (or an electronic processor of the robot 114) to determine a precise pose in space of a robotic arm 116 (as well as any object or element held by or secured to the robotic arm). In other words, sensor output or output data from the sensors 126 may be used to calculate a position in space of the robotic arm 116 (and thus, the surgical tool 128) relative to one or more coordinate systems.
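By way of illustration only, the following sketch shows how encoder-derived joint angles might be composed into a pose for the distal end of the robotic arm 116 using homogeneous transforms. The joint convention and link parameters are hypothetical placeholders, not the actual kinematics of the robot 114.

```python
# Illustrative sketch only: composing per-joint homogeneous transforms from
# encoder readings to estimate the pose of the arm's distal end in the robot
# base frame. Link lengths and the single-axis joint convention are
# hypothetical placeholders.
import numpy as np

def rot_z(theta: float) -> np.ndarray:
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def translate(x: float, y: float, z: float) -> np.ndarray:
    t = np.eye(4)
    t[:3, 3] = [x, y, z]
    return t

def distal_pose(joint_angles, link_lengths) -> np.ndarray:
    """Chain rotation-then-translation transforms for a simple serial arm."""
    pose = np.eye(4)
    for theta, length in zip(joint_angles, link_lengths):
        pose = pose @ rot_z(theta) @ translate(length, 0.0, 0.0)
    return pose  # 4x4 pose of the distal end in the robot base frame
```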
[0037] The robot 114 may be any surgical robot or surgical robotic system. The robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system, or any derivative thereof. The robot 114 may be configured to position the surgical tool 128 at one or more precise poses (e.g., position(s) and orientation(s)). The surgical tool 128 may be any tool capable of cutting, drilling, milling, and/or parting an anatomical element. The surgical tool 128 may be, in one example, a drill bit. In some embodiments, the robot 114 may be configured to rotate and/or advance the surgical tool 128 using, for example, one or more motors.
[0038] The robot 114 may additionally or alternatively be configured to manipulate any component (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task. In some embodiments, the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure. The robot 114 may comprise one or more robotic arms 116. In some embodiments, the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the surgical tool 128. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
[0039] The robot 114, together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, a surgical tool 128 or another object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
[0040] In some embodiments, reference markers (e.g., navigation markers) may be placed on the robot 114 (including, e.g., on the robotic arm 116), the surgical tool 128, or any other object in the surgical space. The reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the robotic system 100 or any component thereof.
[0041] The navigation system 118 provides navigation for a surgeon and/or a surgical robot during an operation. The navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof. The navigation system 118 may include one or more cameras (e.g., the camera 210 illustrated in FIG. 2) or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the robotic system 100 is located (including, as described herein, the robotic tracking unit 136). The one or more cameras may be optical cameras, infrared cameras, or other cameras. In some embodiments, the navigation system 118 may comprise one or more electromagnetic sensors. In various embodiments, the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, the surgical tool 128, and/or one or more other tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to one or more of the foregoing). The navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118. In some embodiments, the robotic system 100 can operate without the use of the navigation system 118. The navigation system 118 may be configured to provide guidance to a surgeon or other user of the robotic system 100 or a component thereof, to the robot 114, or to any other element of the robotic system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
[0042] The database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system). The database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the robotic system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the robotic system 100; and/or any other useful information. The database 130 may be configured to provide any such information to the computing device 102 or to any other device of the robotic system 100 or external to the robotic system 100, whether directly or via the cloud 134. In some embodiments, the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
[0043] The cloud 134 may be or represent the Internet or any other wide area network. The computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both. In some embodiments, the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
[0044] The robotic system 100 or similar systems may be used, for example, to carry out one or more aspects of the method 500 described herein. The robotic system 100 or similar systems may also be used for other purposes.
[0045] FIG. 2 illustrates a representative example system 200 of the robotic system 100. The system 200 includes the computing device 102, the robot 114, and the navigation system 118. The robot 114 includes the robotic arm 116, the robotic tracking unit 136 (coupled to the distal end of the robotic arm 116) and the surgical tool 128. As illustrated in FIG. 2, the robot 114 may be used to perform procedures on an anatomical element 204.
[0046] The navigation system 118 includes a camera 210, which has a field of view 206. As illustrated in FIG. 2, the field of view may encompass the anatomical element 204 (including an anatomy tracker 208) and the robotic tracking unit 136.
[0047] FIGS. 3A-3C illustrate examples of the robotic tracking unit 136. In the illustrated examples, the robotic tracking unit 136 is generally frustoconical in shape, having a longitudinal axis 314. The robotic tracking unit 136 includes a plurality of light emitting diodes (LEDs) 312. The LEDs are distributed in a repeating pattern about the circumference of the robotic tracking unit 136. In the example illustrated, the plurality of LEDs 312 is distributed evenly between the distal end 318 and the proximal end 320 of the robotic tracking unit 136. In the illustrated examples, the robotic tracking unit 136 includes a plurality of planar facets 316 aligned in pairs (one at the distal end 318 and the other at the proximal end 320). These examples illustrate one possible configuration and should not be considered limiting. Other shapes and configurations are possible without departing from the subject matter disclosed herein and recited in the claims.
[0048] As illustrated in FIGS. 3B and 3C, the plurality of LEDs 312 is positioned to form vertices of a plurality of non-unique (i.e., repeating) tracking patterns 325 (indicated schematically as Blue, Red and Green), which are distributed circumferentially about the robotic tracking unit 136. As described herein, the tracking patterns 325 may be selectively illuminated and detected (e.g., by the navigation system 118) to determine an orientation of the robotic tracking unit 136.
[0049] FIG. 4 schematically illustrates an example of the robotic tracking unit 136. In the example illustrated, the robotic tracking unit 136 includes an electronic controller 402. The electronic controller 402 includes at least an electronic processor, a memory, and an input/output interface. The electronic controller 402 is coupled to the plurality of LEDs 312. By way of example, only four LEDs are illustrated in FIG. 4, which may be illuminated to produce the tracking pattern 404. This should not be considered limiting. As described above with respect to FIGS. 3A-3C, the robotic tracking unit 136 includes enough LEDs to produce multiple non-unique tracking patterns (e.g., ten or more) distributed around the robotic tracking unit 136. As illustrated in FIG. 4, in some examples, the robotic tracking unit 136 includes a photodiode 406, which is coupled to the electronic controller 402 and configured to sense light (e.g., infrared light) produced, for example, by the navigation system 118. In some examples, the photodiode is associated with the tracking pattern 404. In some examples, the robotic tracking unit 136 includes one photodiode for each of the plurality of tracking patterns producible by the plurality of LEDs 312. The electronic controller 402 is coupled (via suitable wired or wireless connections) with the computing device 102.
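By way of illustration only, the following sketch shows one possible data structure the electronic controller 402 might use to associate each non-unique tracking pattern with the LEDs that form it, and to light only the selected pattern. The pattern count, LED count, and set_led interface are hypothetical.

```python
# Illustrative sketch only: mapping each non-unique tracking pattern to the
# subset of LEDs that forms it. The example assumes 40 LEDs arranged as ten
# identical four-LED patterns around the circumference of the robotic
# tracking unit; the numbers and the set_led interface are hypothetical.
from typing import Callable, Dict, List

PATTERN_TO_LEDS: Dict[int, List[int]] = {
    pattern_id: [pattern_id * 4 + i for i in range(4)] for pattern_id in range(10)
}

def illuminate_pattern(pattern_id: int, set_led: Callable[[int, bool], None]) -> None:
    """Turn on only the LEDs belonging to the selected pattern."""
    active = set(PATTERN_TO_LEDS[pattern_id])
    total_leds = sum(len(leds) for leds in PATTERN_TO_LEDS.values())
    for led_index in range(total_leds):
        set_led(led_index, led_index in active)
```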
[0050] FIG. 5 illustrates an example method 500 for operating the system of FIG. 1 to track the robot 114. Although the method 500 is described in conjunction with the robotic system 100 as described herein, the method 500 could be used with other systems and devices. In addition, the method 500 may be modified or performed differently than the example provided. In particular, the method 500 is also applicable to robotic systems that are not used for surgery.
[0051] As an example, the method 500 is described as being performed by the computing device 102, and, in particular, the electronic processor 104. However, it should be understood that, in some examples, portions of the method 500 may be performed by other components of the robotic system 100, such as, for example, the navigation system 118 and the electronic controller 402.
[0052] In some aspects, a system calibration may be performed. For example, when the robotic system 100 is set up prior to a surgery, calibration is performed to adjust the system based on the location of the camera 210 and the robot 114. For example, the electronic processor 104 may perform a calibration routine as follows.
[0053] At step 510, the electronic processor 104 selects a tracking pattern for the robotic tracking unit. The tracking pattern is selected from the plurality of non-unique tracking patterns producible by the plurality of LEDs 312. Because the tracking patterns are non-unique (i.e., identical and repeating), they cannot all be produced at once. As such, the electronic processor 104 may use any of a number of techniques to select which tracking pattern will be illuminated.
[0054] For example, the electronic processor 104 may determine a location of the robotic arm. In some examples, the electronic processor 104 receives data from the sensors 126 and interprets the data to determine a location in space for the robotic arm 116. The electronic processor 104 also determines a location of the camera 210 relative to the robotic arm 116. For example, the electronic processor 104 may receive location information for the camera 210 from the navigation system 118.
[0055] Using the location for the robotic arm 116 and the location of the camera 210 relative to the robotic arm 116, the electronic processor 104 is able to determine which of the plurality of non-unique tracking patterns is facing the camera 210 and select that tracking pattern for illumination at step 520.
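By way of illustration only, the following sketch shows one way to choose the pattern facing the camera 210 from known pattern normals and the relative camera location. The frame conventions and function names are hypothetical.

```python
# Illustrative sketch only: choose the non-unique pattern whose outward
# normal points most directly toward the camera, given the RTU pose and the
# camera position. Frame conventions and names are hypothetical.
import numpy as np

def select_facing_pattern(pattern_normals_rtu: np.ndarray,
                          rtu_pose_world: np.ndarray,
                          camera_position_world: np.ndarray) -> int:
    """pattern_normals_rtu: (N, 3) unit normals in the RTU frame.
    rtu_pose_world: 4x4 pose of the RTU in the world frame.
    Returns the index of the pattern most nearly facing the camera."""
    rotation = rtu_pose_world[:3, :3]
    rtu_origin = rtu_pose_world[:3, 3]
    to_camera = camera_position_world - rtu_origin
    to_camera = to_camera / np.linalg.norm(to_camera)
    normals_world = (rotation @ pattern_normals_rtu.T).T
    # The best pattern has the largest dot product with the RTU-to-camera ray.
    return int(np.argmax(normals_world @ to_camera))
```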
[0056] In another example, the electronic processor 104 controls the plurality of LEDs to illuminate to sequentially produce each of the plurality of non-unique tracking patterns. In this way, each pattern is illuminated in turn while the electronic processor 104 receives a plurality of images from the camera 210 (e.g., via the navigation system 118). The electronic processor 104, for example using image processing, detects in the plurality of images a plurality of candidate tracking patterns. Some of the candidate tracking patterns may show a complete pattern, while some may be partial. Because not all of the candidate tracking patterns are aligned with the focal plane of the camera 210, the candidate tracking patterns will exhibit different degrees of distortion. Accordingly, the electronic processor 104 calculates, for each of the candidate tracking patterns, a geometric accuracy. The geometric accuracy is a value (e.g., a percentage) indicating the extent to which a candidate tracking pattern matches the shape of the non-unique tracking pattern (were it to be viewed parallel to the focal plane of the camera 210). In some aspects, the electronic processor 104 also bases the geometric accuracy on the angle-to-camera for the candidate tracking pattern (i.e., the degree of parallelism between the candidate tracking pattern and the focal plane of the camera 210). In this example, the electronic processor 104 selects the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by selecting from the plurality of candidate tracking patterns the candidate tracking pattern with a greater degree of geometric accuracy (as compared to the other candidate tracking patterns).
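By way of illustration only, the following sketch shows the sequential-illumination selection described above: each pattern is lit in turn, scored, and the best-scoring pattern is kept. The illuminate, capture, detect, and geometric_accuracy callables are hypothetical stand-ins for the interfaces described in this disclosure; the same scoring could also incorporate the angle-to-camera.

```python
# Illustrative sketch only: cycle through the non-unique patterns, score each
# detected candidate, and keep the best. All callables are hypothetical
# stand-ins for the camera and controller interfaces described herein.
def select_pattern_by_accuracy(pattern_ids, illuminate, capture, detect, geometric_accuracy):
    best_id, best_score = None, float("-inf")
    for pattern_id in pattern_ids:
        illuminate(pattern_id)      # light only this pattern
        image = capture()           # grab a frame from the camera
        candidate = detect(image)   # may be None or a partial detection
        if candidate is None:
            continue
        score = geometric_accuracy(candidate)  # e.g., a percentage match
        if score > best_score:
            best_id, best_score = pattern_id, score
    return best_id, best_score
```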
[0057] In another example, the electronic processor 104 utilizes infrared light (e.g., emitted by an infrared emitter on the camera 210) to select the tracking pattern. As noted, some embodiments of the robotic tracking unit 136 include a plurality of photodiodes, each of which is associated with one of the plurality of non-unique tracking patterns. In such embodiments, to select the tracking pattern, the electronic processor 104 controls the infrared emitter to illuminate (e.g., by sending a command to the navigation system 118). The electronic processor 104 receives, from each of the plurality of photodiodes, a signal indicative of a level of infrared light received from the infrared emitter. The electronic processor 104 then selects the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns based on the signals received from the plurality of photodiodes. For example, the electronic processor 104 may select the tracking pattern corresponding to the photodiode receiving the highest level of infrared light. In other embodiments, other types of light may be produced and detected to accomplish the selection of the tracking pattern.
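By way of illustration only, the following sketch shows the photodiode-based selection described above: the pattern associated with the photodiode reporting the strongest infrared level is chosen. The read_photodiode callable is a hypothetical stand-in for the controller interface.

```python
# Illustrative sketch only: pick the pattern whose associated photodiode
# reports the highest infrared level. read_photodiode is a hypothetical
# callable returning a numeric light level for a given pattern's photodiode.
def select_pattern_by_photodiode(pattern_ids, read_photodiode):
    levels = {pattern_id: read_photodiode(pattern_id) for pattern_id in pattern_ids}
    return max(levels, key=levels.get)
```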
[0058] In another example, the electronic processor 104 performs a calibration to adjust the system based on the location of the camera 210 and the robot 114. For example, the electronic processor 104 may receive a location for the camera 210, illuminate a tracking pattern for the robotic tracking unit, and move the robotic arm and/or robotic tracking unit to detect the illuminated pattern using the camera 210. The tracking pattern illuminated is one of the plurality of non-unique tracking patterns suitable for calibration. In some aspects, the tracking pattern may be randomly selected. In some aspects, the tracking pattern is one that allows for the robotic arm to be placed in a desired position for calibration.
[0059] At step 520, regardless of how the tracking pattern is selected, the electronic processor 104 controls the plurality of LEDs 312 of the robotic tracking unit to illuminate based on the tracking pattern. For example, the electronic processor 104 illuminates those LEDs that, when illuminated, produce the selected tracking pattern. In some instances, e.g., where the calibration is performed based on the location of the camera 210, the electronic processor 104 controls the plurality of LEDs to illuminate based on the selected tracking pattern and controls the robotic arm to rotate the robotic tracking unit 136 while the LEDs are illuminated. In some aspects, the electronic processor 104 also controls the robotic arm to position the robotic tracking unit within a field of view of the camera 210 (e.g., based on knowledge of the location of the camera 210).
[0060] At step 530, the electronic processor 104 receives, from the camera 210, which is positioned to capture images of the robotic tracking unit 136, a captured image of the robotic tracking unit 136. Also present in the captured image is the illuminated tracking pattern. In instances where the robotic tracking unit 136 is rotated, as the robotic tracking unit 136 rotates, the electronic processor 104 receives a plurality of images from the camera 210. Each of these images captures the robotic tracking unit 136 and may capture the illuminated tracking pattern.
[0061] At step 540, the electronic processor 104 detects, using image processing techniques, the tracking pattern in the captured image. For example, the electronic processor 104 may use an object classifier trained using images of the non-unique tracking pattern. Where multiple images are received, the electronic processor 104 processes each image to detect the tracking pattern.
[0062] At step 550, the electronic processor 104 determines, based on the tracking pattern, an orientation of the robotic tracking unit 136. For example, the electronic processor 104 may use the position, size, and geometric accuracy of the tracking pattern to determine the orientation in space of the robotic tracking unit 136.
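By way of illustration only, the following sketch shows one way an orientation could be recovered from a detected pattern, by solving a perspective-n-point problem between the known 3D LED positions of the pattern and their detected 2D centroids. It assumes OpenCV and known camera intrinsics; the disclosure does not prescribe this particular solver.

```python
# Illustrative sketch only: recover the RTU orientation from a detected
# pattern via PnP between known LED geometry and detected image centroids.
# Assumes OpenCV; requires at least four correspondences in matching order.
import cv2
import numpy as np

def rtu_orientation_from_pattern(led_points_rtu: np.ndarray,
                                 detected_centroids: np.ndarray,
                                 camera_matrix: np.ndarray,
                                 dist_coeffs: np.ndarray):
    """led_points_rtu: (N, 3) LED coordinates in the RTU frame (N >= 4).
    detected_centroids: (N, 2) matching image points, in the same order."""
    ok, rvec, tvec = cv2.solvePnP(
        led_points_rtu.astype(np.float64),
        detected_centroids.astype(np.float64),
        camera_matrix, dist_coeffs)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)  # 3x3 rotation: RTU frame -> camera frame
    return rotation, tvec
```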
[0063] In some examples, as noted above, the electronic processor 104 may receive and process multiple images while the robotic tracking unit 136 rotates. In such examples, for each of the plurality of images where the illuminated tracking pattern is detected, the electronic processor 104 calculates a geometric accuracy (as described herein) for the detected tracking pattern. Each geometric accuracy is compared to a threshold. For example, the threshold may be a percentage of accuracy that yields an acceptable accuracy for tracking the surgical tool 128. In some aspects, the electronic processor 104 controls the robotic arm to stop rotating the robotic tracking unit 136 when the geometric accuracy for one of the plurality of images exceeds the threshold. The electronic processor 104 then determines the orientation of the robotic tracking unit 136 based on the tracking pattern detected in the image where the geometric accuracy exceeds the threshold.
[0064] At step 560, the electronic processor 104 controls the robotic arm based on the orientation of the robotic tracking unit. For example, the electronic processor 104 may determine a tool center point for the surgical tool 128 based on the orientation of the robotic tracking unit and control the robotic arm to position the surgical tool based on the tool center point. For example, the electronic processor 104 may use knowledge of the shape and dimensions of the surgical tool 128 and transformation matrices to calculate the tool center point.
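By way of illustration only, the following sketch shows how a tool center point might be derived from the robotic tracking unit pose using a fixed offset transform. The tip offset is a hypothetical placeholder for the known shape and dimensions of the surgical tool 128.

```python
# Illustrative sketch only: derive a tool center point from the RTU pose and
# a fixed RTU-to-tool-tip offset expressed in the RTU frame. The offset value
# is a hypothetical placeholder for the known tool dimensions.
import numpy as np

def tool_center_point(rtu_rotation: np.ndarray,
                      rtu_translation: np.ndarray,
                      tip_offset_rtu: np.ndarray) -> np.ndarray:
    """rtu_rotation: 3x3, rtu_translation: (3,), tip_offset_rtu: (3,).
    Returns the tool center point in the same frame as the RTU pose."""
    rtu_pose = np.eye(4)
    rtu_pose[:3, :3] = rtu_rotation
    rtu_pose[:3, 3] = rtu_translation
    tip_homogeneous = np.append(tip_offset_rtu, 1.0)
    return (rtu_pose @ tip_homogeneous)[:3]
```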
[0065] In some aspects, the electronic processor 104 determines, based on signals received from the one or more sensors 126, a nominal position for the surgical tool 128. For example, the sensors may track movement of the robotic arm 116 and indicate to the electronic processor 104 where the robotic arm 116 (and thus the surgical tool 128) is supposed to be positioned. Using this nominal position, the electronic processor 104 may determine an error threshold based on the nominal position and the tool center point. The error threshold represents the difference between the expected and actual location of the tool center point. In some aspects, the electronic processor 104 uses the error threshold to control the robotic arm.
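By way of illustration only, the following sketch compares the sensor-derived nominal tool position with the navigation-derived tool center point against an allowable error. The tolerance value is a hypothetical placeholder, not a clinically validated limit.

```python
# Illustrative sketch only: gate arm motion on the discrepancy between the
# nominal (encoder-derived) tool position and the navigation-derived tool
# center point. The default tolerance is a hypothetical placeholder.
import numpy as np

def tcp_within_tolerance(nominal_position: np.ndarray,
                         tool_center_point: np.ndarray,
                         allowable_error: float = 1.0) -> bool:
    """Return True when the positional discrepancy (same units for both
    inputs, e.g., millimeters) is within the allowable error."""
    return float(np.linalg.norm(nominal_position - tool_center_point)) <= allowable_error
```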
[0066] In some aspects, the calibration is performed each time the camera 210 is repositioned. For example, the electronic processor 104 may receive, from the navigation system, a second location (e.g., different from the last received location) for the camera 210 and, responsive to receiving the second location for the camera, perform the calibration routine again.
[0067] With regard to the processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain implementations and should in no way be construed to limit the claims.
[0068] Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. The scope should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the application is capable of modification and variation.
[0069] All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those knowledgeable in the technologies described herein unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as "a," "the," "said," et cetera, should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.
[0070] Unless explicitly stated otherwise, each numerical value and range should be interpreted as being approximate as if the word "about" or "approximately" preceded the value or range.
[0071] Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments necessarily mutually exclusive of other embodiments. The same applies to the term “implementation.”
[0072] Unless otherwise specified herein, the use of the ordinal adjectives “first,” “second,” “third,” etc., to refer to an object of a plurality of like objects merely indicates that different instances of such like objects are being referred to, and is not intended to imply that the like objects so referred-to have to be in a corresponding order or sequence, either temporally, spatially, in ranking, or in any other manner.
[0073] Unless otherwise specified herein, in addition to its plain meaning, the conjunction "if" may also or alternatively be construed to mean "when" or "upon" or "in response to determining" or "in response to detecting," which construal may depend on the corresponding specific context. For example, the phrase "if it is determined" or "if [a stated condition] is detected" may be construed to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]."
[0074] Also, for purposes of this description, the terms “couple,” “coupling,” “coupled,” “connect,” “connecting,” or “connected” refer to any manner known in the art or later developed in which energy is allowed to be transferred between two or more elements, and the interposition of one or more additional elements is contemplated, although not required. Conversely, the terms “directly coupled,” “directly connected,” et cetera, imply the absence of such additional elements. The same type of distinction applies to the use of terms “attached” and “directly attached,” as applied to a description of a physical structure. For example, a relatively thin layer of adhesive or other suitable binder can be used to implement such “direct attachment” of the two corresponding components in such physical structure.
[0075] The described embodiments are to be considered in all respects as only illustrative and not restrictive. In particular, the scope of the disclosure is indicated by the appended claims rather than by the description and figures herein. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
[0076] The functions of the various elements shown in the figures, including any functional blocks labeled as “processors” and/or “controllers,” may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage. Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
[0077] As used in this application, the term "circuitry" may refer to one or more or all of the following: (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry); (b) combinations of hardware circuits and software, such as (as applicable): (i) a combination of analog and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation. This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
[0078] It should be appreciated by those of ordinary skill in the art that any block diagrams herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0079] It should be understood that although certain figures presented herein illustrate hardware and software located within particular devices, these depictions are for illustrative purposes only. In some embodiments, the illustrated components may be combined or divided into separate software, firmware, and/or hardware. For example, instead of being located within and performed by a single electronic processor, logic and processing may be distributed among multiple electronic processors. Regardless of how they are combined or divided, hardware and software components may be located on the same computing device or may be distributed among different computing devices connected by one or more networks or other suitable communication links.
[0080] The following paragraphs provide examples and alternatives disclosed herein.
[0081] Example 1. A medical system comprising: a robot including a robotic arm; a robotic tracking unit coupled to a distal end of the robotic arm, the robotic tracking unit including a plurality of light emitting diodes (LEDs); a navigation system including a camera positioned to capture images of the robotic tracking unit; and an electronic processor coupled to the robot, the robotic tracking unit, and the navigation system, and configured to: perform a calibration routine by: selecting a tracking pattern for the robotic tracking unit from a plurality of non-unique tracking patterns; controlling the plurality of LEDs to illuminate based on the tracking pattern; receiving, from the navigation system, a captured image of the robotic tracking unit; detecting, in the captured image, the tracking pattern; and determining, based on the tracking pattern, an orientation of the robotic tracking unit; and control the robotic arm based on the orientation of the robotic tracking unit.
[0082] Example 2. The medical system of example 1, further comprising: a surgical tool coupled to a distal end of the robotic tracking unit; wherein the electronic processor is further configured to: determine a tool center point for the surgical tool based on the orientation of the robotic tracking unit; and control the robotic arm to position the surgical tool based on the tool center point.
[0083] Example 3. The medical system of example 2, further comprising: one or more sensors for determining a nominal position for the surgical tool; wherein the electronic processor is further configured to: determine, based on signals received from the one or more sensors, the nominal position for the surgical tool; determine an error threshold based on the nominal position and the tool center point; and control the robotic arm based on the error threshold.
[0084] Example 4. The medical system of any one of examples 1-3, wherein the electronic processor is further configured to: determine a location of the robotic arm; determine a location of the camera relative to the robotic arm; and select the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by determining which of the plurality of non-unique tracking patterns is facing the camera based on the location of the robotic arm and the location of the camera relative to the robotic arm.
[0085] Example 5. The medical system of any one of examples 1-4, wherein the electronic processor is further configured to: control the plurality of LEDs to illuminate to sequentially produce each of the plurality of non-unique tracking patterns; receive a plurality of images from the camera; detect in the plurality of images a plurality of candidate tracking patterns; calculate, for each of the candidate tracking patterns, a geometric accuracy; and select the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by selecting from the plurality of candidate tracking patterns the candidate tracking pattern with a greater degree of geometric accuracy.
[0086] Example 6. The medical system of example 5, wherein the electronic processor is further configured to calculate, for each of the candidate tracking patterns, an angle-to-camera, and wherein the geometric accuracy is based on the angle-to-camera.
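A minimal sketch, assuming numpy, of an angle-to-camera computation and one simple way it could be mapped to a geometric-accuracy score (the 70-degree cutoff is an arbitrary illustrative choice, not a disclosed parameter):

import numpy as np

def angle_to_camera_deg(pattern_normal, pattern_center, camera_position):
    """Angle between a pattern's outward normal and the direction from the
    pattern to the camera; smaller angles generally mean a better view."""
    normal = np.asarray(pattern_normal, float)
    normal /= np.linalg.norm(normal)
    view = np.asarray(camera_position, float) - np.asarray(pattern_center, float)
    view /= np.linalg.norm(view)
    cosine = np.clip(normal @ view, -1.0, 1.0)
    return np.degrees(np.arccos(cosine))

def accuracy_from_angle(angle_deg, cutoff_deg=70.0):
    """Map the viewing angle to a simple [0, 1] accuracy score."""
    return max(0.0, 1.0 - angle_deg / cutoff_deg)

angle = angle_to_camera_deg([0, 1, 0], [0, 0, 0], [0, 2000, 500])
print(angle, accuracy_from_angle(angle))   # -> ~14 degrees, ~0.8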
[0087] Example 7. The medical system of any one of examples 1-6, wherein: the navigation system further comprises an infrared emitter; the robotic tracking unit further comprises a plurality of photodiodes, each of the photodiodes associated with one of the plurality of non-unique tracking patterns; and the electronic processor is further configured to: control the infrared emitter to illuminate; receive from each of the plurality of photodiodes a signal indicative of a level of infrared light received from the infrared emitter; and select the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns based on the signals received from the plurality of photodiodes.
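The photodiode-based selection of Example 7 could be reduced to picking the pattern associated with the strongest infrared reading, as in this sketch (the diode labels, readings, and pattern names are hypothetical):

def pattern_from_photodiodes(ir_levels, photodiode_to_pattern):
    """Pick the pattern associated with the photodiode that reports the
    strongest infrared signal from the navigation system's emitter."""
    best_diode = max(ir_levels, key=ir_levels.get)
    return photodiode_to_pattern[best_diode]

# Illustrative readings: diode "B" sees the emitter most directly.
ir_levels = {"A": 0.12, "B": 0.87, "C": 0.31}
photodiode_to_pattern = {"A": "pattern_0", "B": "pattern_1", "C": "pattern_2"}
print(pattern_from_photodiodes(ir_levels, photodiode_to_pattern))  # -> pattern_1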
[0088] Example 8. The medical system of any one of examples 1-7, wherein the electronic processor is further configured to: receive, from the navigation system, a location for the camera; responsive to controlling the plurality of LEDs to illuminate based on the tracking pattern, control the robotic arm to rotate the robotic tracking unit based on the location for the camera; receive a plurality of images from the camera; for each of the plurality of images, calculate a geometric accuracy based on detecting the tracking pattern in the image; control the robotic arm to stop rotating the robotic tracking unit when the geometric accuracy for one of the plurality of images exceeds a threshold; and responsive to the geometric accuracy for one of the plurality of images exceeding the threshold, determine the orientation of the robotic tracking unit based on the tracking pattern detected in the image.
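A sketch of the rotate-and-check loop of Example 8; robot, tracking_unit, camera, detect_pattern, and geometric_accuracy are hypothetical interfaces, and the 5-degree step and 72-step limit are arbitrary illustrative defaults:

def rotate_until_visible(robot, tracking_unit, camera, pattern, detect_pattern,
                         geometric_accuracy, threshold, step_deg=5.0,
                         max_steps=72):
    """Rotate the tracking unit toward the camera in small increments until a
    captured image yields a geometric accuracy above the threshold."""
    tracking_unit.set_leds(pattern.led_indices, on=True)
    for _ in range(max_steps):
        image = camera.capture()
        candidate = detect_pattern(image, pattern)
        if candidate is not None and geometric_accuracy(candidate, pattern) > threshold:
            robot.stop_rotation()
            return candidate          # used to determine the unit orientation
        robot.rotate_tracking_unit(step_deg)
    robot.stop_rotation()
    return None                       # pattern never seen well enough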
[0089] Example 9. The medical system of example 8, wherein the electronic processor is further configured to: receive, from the navigation system, a second location for the camera different from the location; and responsive to receiving the second location for the camera, repeat the calibration routine.
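The re-calibration trigger of Example 9 amounts to detecting that the camera has been repositioned; a minimal sketch (numpy assumed, tolerance purely illustrative):

import numpy as np

def camera_moved(previous_location, current_location, tolerance_mm=1.0):
    """Decide whether the navigation camera has moved far enough to warrant
    repeating the calibration routine of Example 1."""
    delta = np.asarray(current_location, float) - np.asarray(previous_location, float)
    return np.linalg.norm(delta) > tolerance_mm

print(camera_moved([0, 0, 0], [0, 0, 0.2]))    # False: within tolerance
print(camera_moved([0, 0, 0], [150, 0, 0]))    # True: re-run calibration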
[0090] Example 10. A method for operating a surgical robot, the method comprising: performing a calibration routine by: selecting, with an electronic processor, a tracking pattern for a robotic tracking unit coupled to a distal end of a robotic arm of the surgical robot from a plurality of non-unique tracking patterns; controlling a plurality of light emitting diodes (LEDs) of the robotic tracking unit to illuminate based on the tracking pattern; receiving, from a camera positioned to capture images of the robotic tracking unit, a captured image of the robotic tracking unit; detecting, in the captured image, the tracking pattern; and determining, based on the tracking pattern, an orientation of the robotic tracking unit; and controlling the robotic arm based on the orientation of the robotic tracking unit.
[0091] Example 11. The method of example 10, further comprising: determining a tool center point for a surgical tool coupled to a distal end of the robotic tracking unit based on the orientation of the robotic tracking unit; and controlling the robotic arm to position the surgical tool based on the tool center point.
[0092] Example 12. The method of example 11, further comprising: receiving, from one or more sensors, signals indicating a nominal position for the surgical tool; determining an error threshold based on the nominal position and the tool center point; and controlling the robotic arm based on the error threshold.
[0093] Example 13. The method of any one of examples 10-12, further comprising: determining a location of the robotic arm; determining a location of the camera relative to the robotic arm; and selecting the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by determining which of the plurality of non-unique tracking patterns is facing the camera based on the location of the robotic arm and the location of the camera relative to the robotic arm.
[0094] Example 14. The method of any one of examples 10-13, further comprising: controlling the plurality of LEDs to illuminate to sequentially produce each of the plurality of non-unique tracking patterns; receiving a plurality of images from the camera; detecting in the plurality of images a plurality of candidate tracking patterns; calculating, for each of the candidate tracking patterns, a geometric accuracy; and selecting the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns by selecting from the plurality of candidate tracking patterns the candidate tracking pattern with a greater degree of geometric accuracy.
[0095] Example 15. The method of example 14, wherein the geometric accuracy is based on an angle-to-camera for each of the candidate tracking patterns.
[0096] Example 16. The method of any one of examples 10-15, further comprising: controlling an infrared emitter to illuminate; receiving from each of a plurality of photodiodes a signal indicative of a level of infrared light received from the infrared emitter; and selecting the tracking pattern for the robotic tracking unit from the plurality of non-unique tracking patterns based on the signals received from the plurality of photodiodes; wherein each of the photodiodes is associated with one of the plurality of non-unique tracking patterns.
[0097] Example 17. The method of any one of examples 10-16, further comprising: receiving a location for the camera; responsive to controlling the plurality of LEDs to illuminate based on the tracking pattern, controlling the robotic arm to rotate the robotic tracking unit based on the location for the camera; receiving a plurality of images from the camera; for each of the plurality of images, calculating a geometric accuracy based on detecting the tracking pattern in the image; controlling the robotic arm to stop rotating the robotic tracking unit when the geometric accuracy for one of the plurality of images exceeds a threshold; and responsive to the geometric accuracy for one of the plurality of images exceeding the threshold, determining the orientation of the robotic tracking unit based on the tracking pattern detected in the image.
[0098] Example 18. The method of example 17, further comprising: receiving a second location for the camera different from the location; and responsive to receiving the second location for the camera, repeating the calibration routine.

[0099] Various features and advantages of the embodiments presented herein are set forth in the following claims.

Claims

What is claimed is:
1. A medical system comprising: a robot (114) including a robotic arm (116); a robotic tracking (124) unit (136) coupled to a distal end (318) of the robotic arm (116), the robotic tracking (124) unit (136) including a plurality of light emitting diodes (LEDs); a navigation system (118) including a camera (210) (201) positioned to capture images of the robotic tracking (124) unit (136); and an electronic processor (104) coupled to the robot (114), the robotic tracking (124) unit
(136), and the navigation system (118), and configured to: perform a calibration routine by: selecting a tracking pattern (404) for the robotic tracking (124) unit (136) from a plurality of non-unique tracking patterns (325); controlling the plurality of LEDs (312) to illuminate based on the tracking pattern (404); receiving, from the navigation system (118), a captured image of the robotic tracking (124) unit (136); detecting, in the captured image, the tracking pattern (404); and determining, based on the tracking pattern (404), an orientation of the robotic tracking (124) unit (136); and control the robotic arm (116) based on the orientation of the robotic tracking (124) unit (136).
2. The medical system of claim 1, further comprising: a surgical tool (128) coupled to a distal end (318) of the robotic tracking (124) unit (136); wherein the electronic processor (104) is further configured to: determine a tool center point for the surgical tool (128) based on the orientation of the robotic tracking (124) unit (136); and control the robotic arm (116) to position the surgical tool (128) based on the tool center point.
3. The medical system of claim 2, further comprising: one or more sensors (126) for determining a nominal position for the surgical tool (128); wherein the electronic processor (104) is further configured to: determine, based on signals received from the one or more sensors (126), a nominal position for the surgical tool (128); determine an error threshold based on the nominal position and the tool center point; and control the robotic arm (116) based on the error threshold.
4. The medical system of any one of claims 1-3, wherein the electronic processor (104) is further configured to: determine a location of the robotic arm (116); determine a location of the camera (210) (201) relative to the robotic arm (116); and select the tracking pattern (404) for the robotic tracking (124) unit (136) from the plurality of non-unique tracking patterns (325) by determining which of the plurality of non-unique tracking patterns (325) is facing the camera (210) (201) based on the location of the robotic arm (116) and the location of the camera (210) (201) relative to the robotic arm (116).
5. The medical system of any one of claims 1-4, wherein the electronic processor (104) is further configured to: control the plurality of LEDs (312) to illuminate to sequentially produce each of the plurality of non-unique tracking patterns (325); receive a plurality of images from the camera (210) (201); detect in the plurality of images a plurality of candidate tracking patterns (325); calculate, for each of the candidate tracking patterns (325), a geometric accuracy; and select the tracking pattern (404) for the robotic tracking (124) unit (136) from the plurality of non-unique tracking patterns (325) by selecting from the plurality of candidate tracking patterns (325) the candidate tracking pattern (404) with a greater degree of geometric accuracy.
6. The medical system of claim 5, wherein the electronic processor (104) is further configured to calculate, for each of the candidate tracking patterns (325), an angle-to-camera (210) (201), and wherein the geometric accuracy is based on the angle-to-camera (210) (201).
7. The medical system of any one of claims 1-6, wherein: the navigation system (118) further comprises an infrared emitter; the robotic tracking (124) unit (136) further comprises a plurality of photodiodes, each of the photodiodes associated with one of the plurality of non-unique tracking patterns (325); and the electronic processor (104) is further configured to: control the infrared emitter to illuminate; receive from each of the plurality of photodiodes a signal indicative of a level of infrared light received from the infrared emitter; and select the tracking pattern (404) for the robotic tracking (124) unit (136) from the plurality of non-unique tracking patterns (325) based on the signals received from the plurality of photodiodes.
8. The medical system of any one of claims 1-7, wherein the electronic processor (104) is further configured to: receive, from the navigation system (118), a location for the camera (210) (201); responsive to controlling the plurality of LEDs (312) to illuminate based on the tracking pattern (404), control the robotic arm (116) to rotate the robotic tracking (124) unit (136) based on the location for the camera (210) (201); receive a plurality of images from the camera (210) (201); for each of the plurality of images, calculate a geometric accuracy based on detecting the tracking pattern (404) in the image; control the robotic arm (116) to stop rotating the robotic tracking (124) unit (136) when the geometric accuracy for one of the plurality of images exceeds a threshold; and responsive to the geometric accuracy for one of the plurality of images exceeding the threshold, determine the orientation of the robotic tracking (124) unit (136) based on the tracking pattern (404) detected in the image.
9. The medical system of claim 8, wherein the electronic processor (104) is further configured to: receive, from the navigation system (118), a second location for the camera (210) (201) different from the location; and responsive to receiving the second location for the camera (210) (201), repeat the calibration routine.
10. A method (500) for operating a surgical robot (114), the method (500) comprising: performing a calibration routine by: selecting, with an electronic processor (104), a tracking pattern (404) for a robotic tracking (124) unit (136) coupled to a distal end (318) of a robotic arm (116) of the surgical robot (114) from a plurality of non-unique tracking patterns (325); controlling a plurality of light emitting diodes (LEDs) of the robotic tracking (124) unit (136) to illuminate based on the tracking pattern (404); receiving, from a camera (210) (201) positioned to capture images of the robotic tracking (124) unit (136), a captured image of the robotic tracking (124) unit (136); detecting, in the captured image, the tracking pattern (404); and determining, based on the tracking pattern (404), an orientation of the robotic tracking (124) unit (136); and controlling the robotic arm (116) based on the orientation of the robotic tracking (124) unit (136).
11. The method (500) of claim 10, further comprising: determining a tool center point for a surgical tool (128) coupled to a distal end (318) of the robotic tracking (124) unit (136) based on the orientation of the robotic tracking (124) unit (136); and controlling the robotic arm (116) to position the surgical tool (128) based on the tool center point.
12. The method (500) of claim 11, further comprising: receiving, from one or more sensors (126), signals indicating a nominal position for the surgical tool (128); determining an error threshold based on the nominal position and the tool center point; and controlling the robotic arm (116) based on the error threshold.
13. The method (500) of any one of claims 10-12, further comprising: determining a location of the robotic arm (116); determining a location of the camera (210) (201) relative to the robotic arm (116); and selecting the tracking pattern (404) for the robotic tracking (124) unit (136) from the plurality of non-unique tracking patterns (325) by determining which of the plurality of non-unique tracking patterns (325) is facing the camera (210) (201) based on the location of the robotic arm (116) and the location of the camera (210) (201) relative to the robotic arm (116).
14. The method (500) of any one of claims 10-13, further comprising: controlling the plurality of LEDs (312) to illuminate to sequentially produce each of the plurality of non-unique tracking patterns (325); receiving a plurality of images from the camera (210) (201); detecting in the plurality of images a plurality of candidate tracking patterns (325); calculating, for each of the candidate tracking patterns (325), a geometric accuracy; and selecting the tracking pattern (404) for the robotic tracking (124) unit (136) from the plurality of non-unique tracking patterns (325) by selecting from the plurality of candidate tracking patterns (325) the candidate tracking pattern (404) with a greater degree of geometric accuracy.
15. The method (500) of any one of claims 10-14, further comprising: controlling an infrared emitter to illuminate; receiving from each of a plurality of photodiodes a signal indicative of a level of infrared light received from the infrared emitter; and selecting the tracking pattern (404) for the robotic tracking (124) unit (136) from the plurality of non-unique tracking patterns (325) based on the signals received from the plurality of photodiodes; wherein each of the photodiodes is associated with one of the plurality of non-unique tracking patterns (325).
PCT/IL2024/051125 2023-11-27 2024-11-26 Non-unique led pattern geometry and identification for robotics navigation Pending WO2025115013A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363602744P 2023-11-27 2023-11-27
US63/602,744 2023-11-27

Publications (1)

Publication Number Publication Date
WO2025115013A1 true WO2025115013A1 (en) 2025-06-05

Family

ID=94321636

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2024/051125 Pending WO2025115013A1 (en) 2023-11-27 2024-11-26 Non-unique led pattern geometry and identification for robotics navigation

Country Status (1)

Country Link
WO (1) WO2025115013A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021059253A2 (en) * 2019-09-26 2021-04-01 Stryker European Operations Limited Tracker for a surgical instrument
US20220183766A1 (en) * 2020-12-15 2022-06-16 Mazor Robotics Ltd. Systems and methods for defining a work volume
WO2022249168A1 (en) * 2021-05-27 2022-12-01 Mazor Robotics Ltd. Systems, methods, and devices for determining an object pose

Similar Documents

Publication Publication Date Title
US11806090B2 (en) System and method for image based registration and calibration
US11723726B2 (en) Systems and methods for tracking objects
US20220175464A1 (en) Tracker-Based Surgical Navigation
US20220241032A1 (en) Multi-arm robotic systems and methods for identifying a target
CN117425449A (en) Multi-arm robotic system and method for monitoring a target or performing a surgical procedure
WO2022249163A1 (en) System and method of gesture detection and device positioning
US12419692B2 (en) Robotic arm navigation using virtual bone mount
CN118613225A (en) Segment tracking combining optical tracking and inertial measurement
WO2021252263A1 (en) Robotic reference frames for navigation
EP4203832A1 (en) Registration of multiple robotic arms using single reference frame
WO2025115013A1 (en) Non-unique led pattern geometry and identification for robotics navigation
US12274513B2 (en) Devices, methods, and systems for robot-assisted surgery
EP4284287A1 (en) Multi-arm robotic systems for identifying a target
US20230404692A1 (en) Cost effective robotic system architecture
US20250235271A1 (en) Devices, methods, and systems for robot-assisted surgery
WO2025172998A1 (en) Adaptive bone removal system and method
WO2025150039A1 (en) Rigidity-based robotic arm position selection and compensation
WO2024261752A1 (en) Systems for real-time detection of object collision and/or object movement
CN121127200A (en) Systems and methods for identifying one or more tracking devices
WO2025173000A1 (en) Multi-arm robotic systems and methods for calibrating and verifying calibration of the same
WO2025163635A1 (en) Systems and methods for segmental tracking using single sphere trackers
CN117320655A (en) Devices, methods and systems for robot-assisted surgery
WO2025120636A1 (en) Systems and methods for determining movement of one or more anatomical elements

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24840909

Country of ref document: EP

Kind code of ref document: A1