
EP4618884A1 - Plug-and-play arm for spinal robotics - Google Patents

Plug-and-play arm for spinal robotics

Info

Publication number
EP4618884A1
EP4618884A1 (application EP22965472.8A)
Authority
EP
European Patent Office
Prior art keywords
robotic arm
extension member
properties
aspects
robotic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22965472.8A
Other languages
German (de)
English (en)
Inventor
Weijun Xu
Wei Tang
Yingying LIU
Wuqian LI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medtronic Navigation Inc
Original Assignee
Medtronic Navigation Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Medtronic Navigation Inc filed Critical Medtronic Navigation Inc
Publication of EP4618884A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/90 Identification means for patients or instruments, e.g. tags
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods
    • A61B 2017/00477 Coupling
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

Definitions

  • the present disclosure is generally directed to robotic assisted surgery, and relates more particularly to plug-and-play robotic arms for robotic assisted surgery.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously.
  • Providing controllable linked articulating members allows a surgical robot to reach areas of a patient anatomy during various medical procedures.
  • Example aspects of the present disclosure include:
  • a system including: a processor; and a memory storing instructions thereon that, when executed by the processor, cause the processor to: identify one or more properties associated with a robotic arm, wherein the robotic arm is plug-and-play compatible with a coupling interface of the system; determine one or more instruction operations associated with controlling the robotic arm based on the one or more properties; and control the robotic arm in association with a surgical procedure by transmitting one or more surgical commands based on the one or more instruction operations.
  • the instructions are further executable by the processor to: detect a removable coupling established between the robotic arm and the coupling interface, wherein the removable coupling includes at least one of: a mechanical connection established between the robotic arm and the coupling interface; and an electrical connection established between the robotic arm and the coupling interface, wherein identifying the one or more properties associated with the robotic arm is in response to detecting the removable coupling.
  • the instructions are further executable by the processor to: retrieve the one or more properties associated with the robotic arm from a memory stored on the robotic arm.
  • the one or more properties associated with the robotic arm include at least one of: status information associated with the robotic arm; and identification information associated with the robotic arm.
  • the one or more properties associated with the robotic arm include at least one of: a hardware version associated with the robotic arm; and a software version associated with the robotic arm.
  • the one or more properties associated with the robotic arm include at least one of: a type associated with the robotic arm; one or more functions associated with the robotic arm; and a range of motion associated with the robotic arm.
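The property sets enumerated above can be grouped into a single record when a workstation reads them from a plug-and-play arm. The sketch below is hypothetical: the field names and sample values are illustrative only and do not appear in the disclosure.

```python
from dataclasses import dataclass, field

# Hypothetical record of the properties a workstation might read from a
# plug-and-play robotic arm (status, identification, versions, type,
# functions, range of motion). Names and values are illustrative.
@dataclass
class ArmProperties:
    arm_id: str                       # identification information
    status: str                       # status information, e.g., "active"
    hardware_version: str
    software_version: str
    arm_type: str                     # type associated with the arm
    functions: list = field(default_factory=list)
    range_of_motion_mm: float = 0.0   # illustrative range-of-motion figure

props = ArmProperties(
    arm_id="ARM-001",
    status="active",
    hardware_version="2.1",
    software_version="5.0.3",
    arm_type="spine",
    functions=["screw_trajectory_guidance"],
    range_of_motion_mm=850.0,
)
print(props.arm_type, props.range_of_motion_mm)
```

A workstation could persist such a record after detecting the removable coupling and use it when determining instruction operations.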
  • any of the aspects herein further including: determine one or more control parameters associated with controlling the robotic arm based on the one or more properties associated with the robotic arm, wherein the one or more control parameters include at least one of target pose information of the robotic arm and second target pose information of an end effector of the robotic arm, wherein controlling the robotic arm includes transmitting the one or more surgical commands based on the one or more control parameters.
  • determining the one or more instruction operations, determining the one or more control parameters, or both is based on at least one of: one or more characteristics of a subject associated with the surgical procedure; and one or more characteristics of an anatomical element of the subject in association with the surgical procedure.
  • the instructions are further executable by the processor to: identify one or more second properties associated with an extension member, wherein: the extension member is plug-and-play compatible with the coupling interface and the robotic arm; and the extension member is removably coupled to the coupling interface and removably coupled to the robotic arm; determine one or more second control parameters associated with controlling the extension member and the robotic arm based on identifying the one or more second properties; and control the extension member and the robotic arm in association with the procedure by transmitting the one or more surgical commands based on the one or more second control parameters.
  • the one or more second properties include at least one of: a type associated with the extension member; one or more functions associated with the extension member; and a range of motion associated with the extension member.
  • determining the one or more instruction operations is based on a target objective associated with performing the procedure.
  • the instructions are further executable by the processor to: determine pose information of the robotic arm, pose information of an extension member removably coupled to the robotic arm, or both in response to at least one of: receiving data from the robotic arm; and receiving second data from an extension member, wherein the extension member is removably coupled to the coupling interface.
  • the coupling interface is associated with one or more support structures of the system.
  • a system including: a robotic arm; and robot management circuitry that manages control of the robotic arm by: identifying one or more properties associated with the robotic arm, wherein the robotic arm is plug-and-play compatible with a mounting component of the system; determining one or more instruction operations associated with controlling the robotic arm based on identifying the one or more properties; and controlling the robotic arm in association with a surgical procedure by transmitting one or more surgical commands based on the one or more instruction operations.
  • the robot management circuitry is to: detect a removable coupling established between the robotic arm and the coupling interface, wherein the removable coupling includes at least one of: a mechanical connection established between the robotic arm and the coupling interface; and an electrical connection established between the robotic arm and the coupling interface, wherein identifying the one or more properties associated with the robotic arm is in response to detecting the removable coupling.
  • the robot management circuitry further manages control of the robotic arm by: determining one or more control parameters associated with controlling the robotic arm based on identifying the one or more properties associated with the robotic arm, wherein controlling the robotic arm includes transmitting the one or more surgical commands based on the one or more control parameters.
  • the robot management circuitry further manages control of the robotic arm by: identifying one or more second properties associated with an extension member, wherein: the extension member is plug-and-play compatible with the coupling interface and the robotic arm; and the extension member is removably coupled to the coupling interface and removably coupled to the robotic arm; determining one or more second control parameters associated with controlling the extension member and the robotic arm based on identifying the one or more second properties; and controlling the extension member and the robotic arm in association with the procedure by transmitting the one or more surgical commands based on the one or more second control parameters.
  • any of the aspects herein further including an extension member removably coupled to the robotic arm and to the mounting component, wherein: a first end of the extension member is plug-and-play compatible with the mounting component based on one or more position-mounting interfaces, one or more power interfaces, and one or more data communication interfaces of the extension member; and a second end of the extension member is plug-and-play compatible with the robotic arm based on one or more second position-mounting interfaces, one or more second power interfaces, and one or more second data communication interfaces of the extension member.
  • a method including: electronically receiving status information associated with a robotic arm of a system, wherein the status information includes an indication that the robotic arm is in an active state, wherein the robotic arm is plug-and-play compatible with a coupling interface of the system; determining, in response to electronically receiving the status information and based on one or more properties associated with the robotic arm, one or more instruction operations associated with controlling the robotic arm; and controlling the robotic arm in association with a surgical procedure by electronically communicating one or more surgical commands corresponding to the one or more instruction operations to the robotic arm.
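The claimed method can be sketched as a minimal control loop: receive status, derive instruction operations from the arm's properties, then issue the corresponding commands. The helper names and the mapping from arm type to operations below are hypothetical, invented for illustration.

```python
# Minimal sketch of the claimed flow, under assumed helper names.
def determine_instruction_operations(properties):
    # Illustrative mapping from arm type to instruction operations;
    # the real determination would depend on the system's own logic.
    table = {
        "spine": ["plan_screw_trajectory", "hold_trajectory"],
        "trauma": ["position_guide", "hold_position"],
    }
    return table.get(properties["type"], ["safe_idle"])

def control_arm(status, properties, send_command):
    # Only control the arm when status indicates an active state.
    if status.get("state") != "active":
        return []
    operations = determine_instruction_operations(properties)
    for op in operations:
        send_command({"op": op})  # electronically communicate a command
    return operations

sent = []
ops = control_arm(
    status={"state": "active"},
    properties={"type": "spine"},
    send_command=sent.append,
)
print(ops)
```

The `send_command` callback stands in for whatever transport the coupling interface provides (e.g., the communication connector described later in the disclosure).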
  • Fig. 1A is a block diagram of a system according to at least one implementation of the present disclosure.
  • Figs. 1B and 1C illustrate example implementations of the system in accordance with aspects of the present disclosure.
  • Fig. 2 illustrates an example process flow in accordance with aspects of the present disclosure.
  • Fig. 3 illustrates an example process flow in accordance with aspects of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions).
  • processors such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or 10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry,
  • the terms proximal and distal are used in this disclosure with their conventional medical meanings: proximal meaning closer to the operator or user of the system and farther from the region of surgical interest in or on the patient, and distal meaning closer to the region of surgical interest in or on the patient and farther from the operator or user of the system.
  • a spinal robotic system may include a workstation and a robotic arm controllable by the workstation.
  • the workstation may control the robotic arm by issuing a set of commands. For example, based on the issued commands, the workstation may control the robotic arm to reach a planned position (e.g., coordinates, trajectory, orientation, etc.) and provide a surgeon with an accurate and stable trajectory.
  • Some robotic systems may support robotic arms that can be detached from the workstation and fixed to the side rail of an operating bed.
  • Some robotic systems may support a robotic arm fixed on a cart.
  • the working range of a robotic arm may be limited, which may thereby impact the ease of use of the robotic arm with respect to a surgical procedure.
  • the weight of some robotic arms may negatively impact a user’s ability to install a robotic arm to an operating bed. For example, due to the weight of some robotic arms, switching out one robotic arm for another robotic arm to adjust the working range may not be a feasible option for a user.
  • some robotic systems may only support a single type of robotic arm. Such support for only a single type of robotic arm may thereby prevent surgical expansion.
  • the robotic arm may be separate from a workstation.
  • the robotic arm may include a plug-and-play interface supportive of mounting the robotic arm to a support structure, for example, at a connection interface of the support structure.
  • the support structure may be, for example, an operating table, a ceiling mount structure, a wall mount structure, a support structure of a robot, or the like.
  • connection interface of the operating table may be located at any portion (e.g., a side portion, a front portion, etc.) of the operating table.
  • the connection interface of the operating table may be the same as the connection interface of the robotic arm.
  • the connection interface of the operating table and the connection interface of the robotic arm may each include respective configurations (e.g., arrangement, positioning) of a power supply connector, a communication connector, and position mounting holes supportive of removably coupling the robotic arm to the operating table.
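Under the assumption that plug-and-play compatibility means the two sides' connector and mounting-hole configurations match, a hedged sketch of a compatibility check might look as follows. The connector labels and hole coordinates are invented for illustration and do not come from the disclosure.

```python
from dataclasses import dataclass

# Hypothetical model of a connection interface: a power supply connector,
# a communication connector, and a position mounting-hole pattern.
@dataclass(frozen=True)
class ConnectionInterface:
    power_connector: str          # power supply connector type (assumed label)
    comm_connector: str           # communication connector type (assumed label)
    mounting_hole_pattern: tuple  # relative hole positions (assumed values)

def compatible(a: ConnectionInterface, b: ConnectionInterface) -> bool:
    # Two interfaces mate when all three configurations match.
    return (a.power_connector == b.power_connector
            and a.comm_connector == b.comm_connector
            and a.mounting_hole_pattern == b.mounting_hole_pattern)

table_if = ConnectionInterface("PWR-24V", "CAN", ((0, 0), (40, 0), (0, 40)))
arm_if = ConnectionInterface("PWR-24V", "CAN", ((0, 0), (40, 0), (0, 40)))
print(compatible(table_if, arm_if))  # prints True
```

The same check would apply between a support structure and either end of an extension joint, since the disclosure gives all of them compatible interfaces.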
  • the connection interfaces may support the transmission of data and power between the operating table (or a computing system associated with the operating table) and the robotic arm.
  • aspects of the robotic arm support relatively quick installation to a support structure (e.g., operating table, ceiling mount structure, wall mount structure, support structure of a robot, etc.).
  • the robotic arm may be designed to be lightweight compared to some other robotic arms.
  • support structures (e.g., an operating table, etc. ) described herein may include multiple connection interfaces compatible with the robotic arm, thereby supporting multiple locations at which the robotic arm may be removably coupled.
  • aspects of the present disclosure support implementations in which a user may relocate the robotic arm from one connection interface to another connection interface as needed with respect to performing a surgical procedure. Such example features extend the working range of the robotic arm.
  • the robotic arm may be powered on via the support structure, for example, by installing the robotic arm to the support structure.
  • the robotic arm may be in a powered-on state once the robotic arm is installed on the support structure (e.g., once the power supply connector of the robotic arm is electrically coupled to the power supply connector of the support structure).
  • the robotic arm may receive power from the support structure via the power supply connector of the robotic arm.
  • the workstation may read status data from the robotic arm and implement different algorithms associated with calculating parameters for a surgical procedure.
  • the status data may indicate that the robotic arm is powered on and removably coupled to the support structure.
  • the status data may indicate the connection interface of the support structure to which the robotic arm is connected.
  • the workstation may read identification information from the robotic arm (e.g., via the connection interface of the robotic arm) and implement algorithms for calculating a screw trajectory for the surgical procedure.
  • the workstation may identify a hardware version and/or software version of the robotic arm, based on which the workstation may implement the algorithms.
  • a single workstation may be compatible with multiple robotic arms for multiple types of surgical procedures (e.g., trauma surgery, spine surgery, joint surgery, etc.).
  • the single workstation may support multiple types of surgical procedures at the same time, using different types of robotic arms.
  • the workstation may implement different therapy algorithms based on the identification information read from the connection interface of the robotic arm.
  • the lightweight design of the robotic arm may support relatively quick installation and replacement by a user, thereby supporting implementations in which any combination of robotic arms and robotic arm types may be connected to a support structure (e.g., operating table, etc.) for performing a surgical procedure.
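The version-based algorithm selection described above can be sketched as a simple dispatch: the workstation reads the arm's type and hardware version through the connection interface and picks the matching therapy algorithm. The registry keys and algorithm names below are hypothetical, not from the disclosure.

```python
# Hypothetical registry mapping (arm type, hardware series) to an algorithm.
ALGORITHM_REGISTRY = {
    ("spine", "2.x"): "screw_trajectory_v2",
    ("trauma", "1.x"): "fracture_alignment_v1",
}

def major_series(version: str) -> str:
    # Collapse "2.1" -> "2.x" so a registry entry covers a hardware series.
    return version.split(".")[0] + ".x"

def select_algorithm(arm_type: str, hardware_version: str) -> str:
    # Fall back to a default planner when no specific entry exists.
    key = (arm_type, major_series(hardware_version))
    return ALGORITHM_REGISTRY.get(key, "default_planning")

print(select_algorithm("spine", "2.1"))  # prints screw_trajectory_v2
```

A real system would presumably key on richer identification information (software version, supported functions), but the lookup pattern is the same.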
  • a portable plug-and-play extension joint is described that is compatible with the robotic arm described herein.
  • the terms “extension joint,” “extension member,” “extension arm,” and “extension component” may be used interchangeably herein.
  • the extension joint may further extend the working range of a robotic arm.
  • the extension joint may have the same connection interface at respective ends (e.g., two ends) of the extension joint.
  • a first end of the extension joint may be installed to a support structure (e.g., operating table, ceiling mounting structure, etc.).
  • a second end, which is electrically and mechanically connected to the robotic arm, will be the “slave.”
  • Aspects of the present disclosure support extension joints of any combination of lengths, any quantity of joints, or the like.
  • the workstation may read stored data from the extension joint and determine, from the data, pose information of the second end.
  • the first end and the second end of the extension joint may have respective connection interfaces that are compatible with support structures and robotic arms described herein. Example aspects of the connection interfaces of the extension joint, the support structure, and the robotic arm are later described herein.
  • the extension joint may be powered on via the support structure (e.g., operating table, etc.), for example, by installing the extension joint to the support structure.
  • the extension joint may be in a powered-on state once the extension joint is installed on the support structure (e.g., once a power supply connector of the extension joint is electrically coupled to the power supply connector of the support structure).
  • the extension joint may receive power from the support structure via the power supply connector of the extension joint.
  • the workstation may read status data from the extension joint and implement different algorithms associated with calculating parameters for a surgical procedure.
  • the status data may indicate that the extension joint is powered on and coupled to the support structure.
  • the status data may indicate the connection interface of the support structure to which the extension joint is connected.
  • the status data may indicate that a robotic arm is coupled to another end of the extension joint.
  • the workstation may read identification information from the extension joint (e.g., via the connection interface of the extension joint) and implement algorithms for calculating a screw trajectory for the surgical procedure.
  • the workstation may identify a hardware version and/or software version of the extension joint, based on which the workstation may implement the algorithms.
  • the workstation may read identification information described herein with respect to the robotic arm via the extension joint.
  • aspects of the extension joint support extending the working range of a robotic arm.
  • the working range of a robotic arm (e.g., the fully extended length of the robotic arm, the trajectory of an end effector of the robotic arm based on the quantity of joints, etc.) may be insufficient for performing a surgical procedure.
  • aspects of the present disclosure support connecting the robotic arm to the support structure via the extension joint, which may provide an overall working range supportive of performing the surgical procedure.
  • implementations using the extension joint (or multiple extension joints) and the robotic arm may provide a fully extended length and/or additional possible trajectories of the end effector (e.g., due to an increased quantity of joints) supportive of performing the surgical procedure.
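The fully-extended-length aspect of the working range can be illustrated with simple arithmetic: the arm's own reach plus the length contributed by each extension joint must meet the target distance. The reach values below are illustrative only.

```python
# Sketch of working-range extension via extension joints (illustrative values).
def total_reach_mm(arm_reach_mm: float, extension_lengths_mm: list) -> float:
    # Fully extended length: the arm's reach plus each extension joint's length.
    return arm_reach_mm + sum(extension_lengths_mm)

def reach_sufficient(arm_reach_mm, extension_lengths_mm, target_distance_mm):
    return total_reach_mm(arm_reach_mm, extension_lengths_mm) >= target_distance_mm

# The arm alone cannot reach a target 1000 mm away...
print(reach_sufficient(850.0, [], 1000.0))        # prints False
# ...but with a single 200 mm extension joint it can.
print(reach_sufficient(850.0, [200.0], 1000.0))   # prints True
```

This captures only the length dimension; the additional end-effector trajectories enabled by extra joints would require a kinematic model rather than a sum.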
  • the extension joint may be mountable to any support structure described herein, for example, an operating table, a ceiling mount support structure, a wall mount support structure, a support structure of a robot device, and the like.
  • aspects of the plug-and-play features of the extension joint support advantageous implementations in which a single workstation may be compatible with multiple extension joints and robotic arms for multiple types of surgical procedures (e.g., trauma surgery, spine surgery, joint surgery, etc.).
  • the extension joint described herein may supplement the working range of the robotic arm in association with successfully completing the surgical procedure. Accordingly, for example, such implementations may provide cost benefits in that the manufacturing of additional robotic arms of the same type (but different lengths) may be avoided.
  • because the extension joint may enable a robotic arm to reach a target location in association with the surgical procedure, a user may avoid having to find a replacement robotic arm of the same type but different length, thereby providing increased user convenience.
  • connection interfaces of the extension joint may be compatible with the respective connection interfaces of the support structure and the robotic arms described herein, thereby providing plug-and-play interchangeability and flexibility.
  • Such interchangeability and flexibility may support reduced system setup time.
  • the lightweight design and connection interface compatibility of the robotic arms and extension joints may support relatively quick installation and replacement by a user, thereby supporting implementations in which any combination of extension joints and robotic arms may be connected to a support structure (e.g., operating table, etc.) for performing a surgical procedure.
  • the lightweight design and flexibility with respect to different mechanical and/or electrical mounting structures support advantageous implementations which may avoid exceeding the weight capacity of an operating table due to the weight of a patient and the weight of the robotic arms.
  • aspects of the present disclosure support the removal of a robotic arm from an operating table and the attachment of the same robotic arm to another support structure (e.g., a ceiling-mounted support structure, a wall-mounted support structure, a robot-mounted support structure, etc.) to meet the weight capacity of the operating table.
  • each of the robotic arms described herein may weigh from about 11 pounds to about 22 pounds, which is relatively lighter in comparison to other surgical robotic arms.
  • Fig. 1A illustrates an example of a system 100 that supports aspects of the present disclosure.
  • the system 100 includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud network 134 (or other network).
  • Systems according to other implementations of the present disclosure may include more or fewer components than the system 100.
  • the system 100 may omit and/or include additional instances of one or more components of the computing device 102, the imaging device(s) 112, the robot 114, navigation system 118, the database 130, and/or the cloud network 134.
  • the system 100 may omit any instance of the computing device 102, the imaging device(s) 112, the robot 114, navigation system 118, the database 130, and/or the cloud network 134.
  • the system 100 may support the implementation of one or more other aspects of one or more of the methods disclosed herein.
  • the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
  • the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging devices 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
  • the memory 106 may be or include RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
  • the memory 106 may store information or data associated with completing, for example, any step of the methods or process flows described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the imaging devices 112, the robot 114, and the navigation system 118.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, registration 128, and/or an arm management engine 138.
  • Such content, if provided as instructions, may, in some implementations, be organized into one or more applications, modules, packages, layers, or engines.
  • the communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
  • the communication interface 108 may support communication between the device 102 and one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also include one or more user interfaces 110.
  • the user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
  • the user interface 110 may support user modification (e.g., by a surgeon, medical personnel, a patient, etc.) of instructions to be executed by the processor 104 according to one or more implementations of the present disclosure, and/or to user modification or adjustment of a setting of other information displayed on the user interface 110 or corresponding thereto.
  • the robot 114 may be any surgical robot or surgical robotic system.
  • the robot 114 may be or include, for example, the Mazor X™ Stealth Edition robotic guidance system.
  • the robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time.
  • the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may include one or more robotic arms 116.
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may include one or more electromagnetic sensors.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the arm management engine 136 may support features described herein of identifying properties associated with a robotic arm 116, determining instruction operations and/or control parameters associated with controlling the robotic arm based on the identified properties, and controlling the robotic arm in association with a surgical procedure by transmitting surgical commands based on the instruction operations and/or control parameters.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system) .
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100) ; one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud network 134.
  • the database 130 may be or include part of a hospital image storage system, such as a picture archiving and communication system (PACS) , a health information system (HIS) , and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • Wired communications technologies may include, for example, Ethernet-based wired local area network (LAN) connections using physical transmission mediums (e.g., coaxial cable, copper cable/wire, fiber-optic cable, etc. ) .
  • Wireless communications technologies may include, for example, cellular or cellular data connections and protocols (e.g., digital cellular, personal communications service (PCS) , cellular digital packet data (CDPD) , general packet radio service (GPRS) , enhanced data rates for global system for mobile communications (GSM) evolution (EDGE) , code division multiple access (CDMA) , single-carrier radio transmission technology (1×RTT) , evolution-data optimized (EVDO) , high speed packet access (HSPA) , universal mobile telecommunications service (UMTS) , 3G, long term evolution (LTE) , 4G, and/or 5G, etc. ) , low energy, Wi-Fi, radio, satellite, infrared connections, and/or communication protocols.
  • the Internet is an example of such a communications network: an Internet Protocol (IP) network consisting of multiple computers, computing networks, and other communication devices located in multiple locations, whose components (e.g., computers, computing networks, communication devices) may be connected through one or more telephone systems and other means.
  • the computing device 102 may be connected to the cloud network 134 via the communication interface 108, using a wired connection, a wireless connection, or both. In some implementations, the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud network 134.
  • Fig. 1B illustrates an example implementation 101 of the system 100 that supports aspects of the present disclosure. Aspects of the system 100 previously described with reference to Fig. 1A and descriptions of like elements are omitted for brevity.
  • the system 100 may include a robot 114, robotic arms 116 (e.g., robotic arm 116-a through robotic arm 116-d, etc. ) , extension members 140 (e.g., extension member 140-b, extension member 140-c, etc. ) , a workstation 150, a support structure 154, and a support structure 155.
  • the workstation 150 may be integrated with the robot 114.
  • the workstation 150 may be integrated with computing device 102 and/or the navigation system 118 described with reference to Fig. 1A.
  • the support structure 154 may be capable of supporting or holding a subject 152 during a procedure (e.g., a surgical procedure, medical imaging, etc. ) .
  • the subject 152 may be a living subject (e.g., a human subject) . It is understood, however, that the system 100 supports surgical procedures relative to any subject 152 using any combination of robotic arms 116 and extension members 140 described herein.
  • the support structure 155 may be, for example, mounted to a surface 158 of an environment in which the system 100 is implemented.
  • the surface 158 may be a ceiling surface, a wall surface, a floor surface, or the like, and is not limited thereto.
  • the environment may be, for example, an operating room and is not limited thereto.
  • the system 100 may support various purposes or procedures by one or more users (e.g., a surgeon, a medical technician, etc. ) using any combination of robotic arms 116 and/or extension members 140.
  • Aspects of the present disclosure support coupling of one or more robotic arms 116 and/or extension members 140 to any of the robot 114, the workstation 150, the support structure 154, and the support structure 155.
  • the robot 114, the support structure 154, and the support structure 155 may include respective mounting components 156 (e.g., mounting component 156-a through mounting component 156-d) to which the robotic arms 116 and extension members 140 may be coupled.
  • Coupled may refer to an electrical coupling and/or a mechanical coupling between the components.
  • the coupling may include an electrical and/or mechanical coupling between the components.
  • Electrical coupling may enable electrical signals to be shared between electrically coupled components.
  • Mechanical coupling may enable one component to be physically supported and/or manipulated based on movements of another component mechanically coupled thereto.
  • the mounting components 156, ends 117 (e.g., end 117-a through end 117-d) of the robotic arms 116, ends 141 (e.g., end 141-b, end 141-c) of the extension members 140, and ends 142 (e.g., end 142-b, end 142-c) of the extension members 140 may include respective connection interfaces (also referred to herein as “coupling interfaces” or “interfaces” ) supportive of plug-and-play interchangeability and flexibility of the robotic arms 116 and extension members 140. Examples of the connection interfaces are later described with reference to Fig. 1C.
  • the system 100 supports removably coupling an end 117 of a robotic arm 116 to a mounting component 156 in association with performing a surgical procedure or other procedure.
  • robotic arm 116-a may be coupled to mounting component 156-a of the support structure 154 via end 117-a of the robotic arm 116-a.
  • robotic arm 116-d may be coupled to mounting component 156-d of the robot 114 (and workstation 150) via end 117-d of the robotic arm 116-d.
  • the system 100 supports the use of an extension member 140 to increase the working range of a robotic arm 116.
  • the system 100 supports removably coupling an end 141 of an extension member 140 to a mounting component 156 and further, removably coupling an end 142 of the extension member 140 to an end 117 of a robotic arm 116 in association with performing a surgical procedure or other procedure.
  • end 141-b of the extension member 140-b may be coupled to mounting component 156-b of the support structure 154
  • end 142-b of the extension member 140-b may be coupled to end 117-b of the robotic arm 116-b.
  • end 141-c of the extension member 140-c may be coupled to mounting component 156-c of the support structure 155, and end 142-c of the extension member 140-c may be coupled to end 117-c of the robotic arm 116-c.
  • the system 100 may support any combination of extension members 140 of different dimensions (e.g., different thickness, different lengths, etc. ) , different quantities of joints, and the like.
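The use of an extension member 140 to increase the working range of a robotic arm 116 can be sketched with a simplified reach model. This is a hypothetical illustration (the function name, millimeter units, and the spherical-reach simplification are assumptions; real arms also have joint limits):

```python
import math

def within_working_range(mount_xyz, target_xyz, arm_reach_mm, extension_length_mm=0.0):
    """True if the target lies within the combined reach of the robotic arm and an
    optional extension member mounted between the arm and the support structure.
    Simplified spherical-reach model measured from the mounting component."""
    distance = math.dist(mount_xyz, target_xyz)
    return distance <= arm_reach_mm + extension_length_mm
```

For example, a target 1000 mm from the mounting component would be out of range for an arm with 850 mm of reach alone, but in range once a 300 mm extension member is added between the arm and the mounting component.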
  • Fig. 1C illustrates an example implementation 102 of the system 100 that supports aspects of the present disclosure. Aspects of the system 100 previously described with reference to Figs. 1A and 1B and descriptions of like elements are omitted for brevity.
  • the connection interface of a robotic arm 116 may include one or more position-mounting holes 147 (e.g., four position-mounting holes 147) , a power supply connector 148, and a communication connector 149. It is to be understood that the aspects of the connection interfaces of the robotic arms 116 (e.g., robotic arm 116-a, robotic arm 116-b) illustrated and described with reference to Fig. 1C may apply to any robotic arm 116 described herein.
  • the connection interface of a mounting component 156 may include one or more position-mounting holes 157 (e.g., four position-mounting holes 157) , a power supply connector 158, and a communication connector 159.
  • the aspects of the connection interfaces illustrated and described with reference to Fig. 1C may apply to mounting component 156-a and mounting component 156-b of the support structure 154, mounting component 156-c of the support structure 155 of Fig. 1B, mounting component 156-d of the robot 114 of Fig. 1B, and the like.
  • connection interfaces of an extension member 140 supported by the present disclosure are described herein.
  • a connection interface at an end 141 of an extension member 140 may include one or more position-mounting holes 167 (e.g., four position-mounting holes 167) , a power supply connector 168, and a communication connector 169.
  • a connection interface at an end 142 of the extension member 140 may include one or more position-mounting holes 177 (e.g., four position-mounting holes 177) , a power supply connector 178, and a communication connector 179. It is to be understood that the aspects of the connection interfaces of the extension member 140-b illustrated and described with reference to Fig. 1C may apply to any extension member 140 described herein.
  • connection interfaces of the robotic arms 116, the extension members 140, and the mounting components 156 as illustrated herein are examples, and it is to be understood that the configurations and orientations may be different from the examples.
  • the position-mounting holes (e.g., position-mounting holes 147, position-mounting holes 157, position-mounting holes 167, position-mounting holes 177) may also be referred to as position-mounting interfaces.
  • the connection interface of the robotic arm 116-a may match or complement the connection interfaces of the mounting component 156-a and the mounting component 156-b.
  • the configuration (e.g., arrangement, positioning, etc. ) , types, and sizes of the position-mounting holes 147, the power supply connector 148, and the communication connector 149 of the robotic arm 116-a may match or complement the configuration (e.g., arrangement, positioning, etc. ) , types, and sizes of the position-mounting holes 157, the power supply connector 158, and the communication connector 159 of the mounting components 156.
  • the connection interface of the robotic arm 116-a may match or complement the connection interface at end 142-b of the extension member 140-b.
  • the configuration (e.g., arrangement, positioning, etc. ) , types, and sizes of the position-mounting holes 147, the power supply connector 148, and the communication connector 149 of the robotic arm 116-a may match or complement the configuration (e.g., arrangement, positioning, etc. ) , types, and sizes of the position-mounting holes 177, the power supply connector 178, and the communication connector 179 of the extension member 140-b.
  • the connection interface at end 141-b of the extension member 140-b may match or complement the connection interfaces of the mounting component 156-a and the mounting component 156-b.
  • the configuration (e.g., arrangement, positioning, etc. ) , types, and sizes of the position-mounting holes 167, the power supply connector 168, and the communication connector 169 of the extension member 140-b may match or complement the configuration (e.g., arrangement, positioning, etc. ) , types, and sizes of the position-mounting holes 157, the power supply connector 158, and the communication connector 159 of the mounting components 156.
  • the gender (e.g., female or male) of the power supply connector 158, the communication connector 159, the power supply connector 178, and the communication connector 179 may be configured to complement the gender (e.g., male or female) of the power supply connector 148, the communication connector 149, the power supply connector 168 and the communication connector 169.
  • aspects of the present disclosure support gender neutral (e.g., flat, non-recessed, non-protruding, etc. ) implementations of any of the power supply connector 148, the communication connector 149, the power supply connector 158, the communication connector 159, the power supply connector 168, the communication connector 169, the power supply connector 178, and the communication connector 179.
  • coupling between the connection interfaces described herein may be implemented with ferromagnetic connectors.
  • the robotic arm 116-a may be coupled to the mounting component 156-a using pins (not illustrated) inserted at locations where the position-mounting holes 147 and the position-mounting holes 157 overlap.
  • the extension member 140-b may be coupled to the mounting component 156-b using pins (not illustrated) inserted at locations where the position-mounting holes 167 and the position-mounting holes 157 overlap.
  • the robotic arm 116-b may be coupled to the extension member 140-b using pins (not illustrated) inserted at locations where the position-mounting holes 147 and the position-mounting holes 177 overlap.
  • the position-mounting holes 147 of the robotic arm 116-a may be replaced with pins, and the pins may be inserted into the position-mounting holes 157 of the mounting component 156-a in association with coupling the robotic arm 116-a to the mounting component 156-a.
  • the position-mounting holes 147 of the robotic arm 116-b may be replaced with pins, and the pins may be inserted into the position-mounting holes 177 of the extension member 140-b in association with coupling the robotic arm 116-b to the extension member 140-b.
  • the position-mounting holes 167 of the extension member 140-b may be replaced with pins, and the pins may be inserted into the position-mounting holes 157 of the mounting component 156-b in association with coupling the extension member 140-b to the mounting component 156-b.
  • aspects of the plug-and-play features of the robotic arms 116 and the extension members 140 support implementations in which the workstation 150 may be compatible with multiple robotic arms 116 and extension members 140 for multiple types of surgical procedures (e.g., trauma surgery, spine surgery, joint surgery, etc. ) .
  • the workstation 150 may identify and implement algorithms associated with performing a surgical procedure based on a robotic arm 116 and/or extension member 140. It is to be understood that features described herein as implemented by the system 100 may be implemented by a computing device 102, robot 114, navigation system 118, and/or workstation 150 described herein.
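The idea that the workstation 150 may identify and implement algorithms based on the connected robotic arm 116 and/or extension member 140 can be sketched as a lookup table. This is a hypothetical illustration only; the registry contents, key shape, and algorithm names are assumptions, not part of the disclosure:

```python
# Hypothetical registry mapping a (procedure type, arm type) pair to a
# control-algorithm identifier selected by the workstation.
ALGORITHM_REGISTRY = {
    ("spine", "arm-type-1"): "spine_screw_guidance_v2",
    ("trauma", "arm-type-1"): "trauma_reduction_v1",
    ("joint", "arm-type-2"): "joint_alignment_v3",
}

def select_algorithm(procedure_type, arm_type):
    """Return the algorithm registered for this procedure/arm combination,
    or None if the connected arm does not support the procedure."""
    return ALGORITHM_REGISTRY.get((procedure_type, arm_type))
```

Such a table makes plug-and-play interchangeability concrete: swapping in a different arm changes the key, and therefore the algorithm selected, without reconfiguring the workstation.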
  • Fig. 2 illustrates an example of a process flow 200 that supports the plug-and-play features of robotic arms 116 and extension members 140 in accordance with aspects of the present disclosure.
  • process flow 200 may implement aspects of the system 100 described with reference to Figs. 1A through 1C.
  • process flow 200 described herein with respect to controlling a robotic arm 116 and/or extension member 140 in association with a surgical procedure may be applied to any robotic arm 116 and extension member 140 supported by the system 100. It is to be understood that the example aspects are not limited to the robotic arms 116 (e.g., robotic arm 116-a, robotic arm 116-b) and extension members 140 (e.g., extension member 140-b) described with reference to the process flow 200.
  • the system 100 may detect that robotic arm 116-a has been connected (e.g., by a user) to the mounting component 156-a of a support structure such as the support structure 154.
  • the system 100 may detect that a connection (e.g., mechanical connection and/or electrical connection) has been established between a connection interface at end 117-a of the robotic arm 116-a and a connection interface of the mounting component 156-a.
  • the system 100 may receive status data associated with the robotic arm 116-a indicating that the robotic arm 116-a is powered on and connected to the support structure 154.
  • the system 100 may electronically receive the status data from electronic circuitry integrated with the support structure 154.
  • the system 100 may electronically receive the status data from the robotic arm 116-a.
  • the system 100 may detect that extension member 140-b and robotic arm 116-b have been connected (e.g., by a user) to the mounting component 156-b of the support structure 154. For example, the system 100 may detect that a connection (e.g., mechanical connection and/or electrical connection) has been established between a connection interface at end 141-b of the extension member 140-b and a connection interface of the mounting component 156-b. The system 100 may further detect that a connection has also been established between a connection interface at end 142-b of the extension member 140-b and a connection interface at end 117-b of the robotic arm 116-b.
  • the system 100 may receive status data that the extension member 140-b is powered on and connected to the support structure 154.
  • the status data may indicate that the robotic arm 116-b is powered on and connected to the support structure 154 via the extension member 140-b.
  • the system 100 may electronically receive the status data from electronic circuitry integrated with the support structure 154.
  • the system 100 may electronically receive the status data from the extension member 140-b and/or the robotic arm 116-b.
  • the identification information may be visually detectable by the system 100.
  • the identification information may be physically integrated with the robotic arm 116-a (e.g., printed on the robotic arm 116-a, printed on a label attached to the robotic arm 116-a, etc. ) .
  • the system 100 may visually (e.g., via an optical reading device, an optical sensor, a camera device, etc. included in the system 100) detect the identification information.
  • the system 100 may retrieve data from database 130 that indicates properties such as hardware version, software version, type, available functions, and range of motion of the robotic arm 116-a.
  • the system 100 may support receiving any of the identification information, hardware version, software version, type, available functions, and range of motion of the robotic arm 116-a via a user input at the user interface 110.
  • the system 100 may identify properties associated with the extension member 140-b and the robotic arm 116-b based on the established connection. For example, the system 100 may read data stored in the extension member 140-b (e.g., stored in a memory chip in the extension member 140-b) and/or data stored in the robotic arm 116-b (e.g., stored in a memory chip in the robotic arm 116-b) , and the data may include an indication of the properties.
  • the data stored in the extension member 140-b may include identification information (e.g., a serial number, a model number, etc. ) , hardware version, software version, type, available functions, and range of motion of the extension member 140-b as described herein.
  • the data stored in the robotic arm 116-b may include identification information (e.g., a serial number, a model number, etc. ) , hardware version, software version, type, available functions, and range of motion of the robotic arm 116-b as described herein.
  • the identification information of the extension member 140-b and/or the robotic arm 116-b may be visually detectable by the system 100.
  • the identification information of the extension member 140-b may be physically integrated with the extension member 140-b (e.g., printed on the extension member 140-b, printed on a label attached to the extension member 140-b, etc. ) .
  • the identification information of the robotic arm 116-b may be physically integrated with the robotic arm 116-b (e.g., printed on the robotic arm 116-b, printed on a label attached to the robotic arm 116-b, etc. ) .
  • the system 100 may visually detect the identification information of the extension member 140-b and/or robotic arm 116-b. Using the identification information, the system 100 may retrieve data from database 130 that indicates respective properties of the extension member 140-b and the robotic arm 116-b, such as hardware version, software version, type, available functions, and range of motion. Additionally, or alternatively, the system 100 may support receiving any of the identification information, hardware version, software version, type, available functions, and range of motion of extension member 140-b and/or the robotic arm 116-b via a user input at the user interface 110.
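The property-identification alternatives above (on-device memory, database lookup keyed by visually read identification information, and user input) can be sketched as an ordered fallback chain. This is a minimal sketch under stated assumptions: the function name, argument shapes, and precedence order are illustrative, not an API of the system 100.

```python
def identify_properties(device_memory, database, visual_id, user_input=None):
    """Resolve the properties of a connected robotic arm or extension member.
    Precedence mirrors the description above: data stored on the device itself,
    then a database lookup keyed by visually read identification information,
    then manual user input at the user interface."""
    if device_memory:                        # e.g., read from an on-board memory chip
        return device_memory
    if visual_id and visual_id in database:  # e.g., serial number read by a camera
        return database[visual_id]
    return user_input                        # e.g., entered at the user interface 110
```

A lookup might return fields such as hardware version, software version, type, available functions, and range of motion.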
  • the system 100 may determine instruction operations associated with controlling the robotic arm 116-a based on the identification information and/or properties (e.g., hardware version, software version, type, available functions, range of motion, etc. ) of the robotic arm 116-a. For example, at 215-a, the system 100 may identify capabilities of the robotic arm 116-a.
  • the system 100 may determine instruction operations associated with controlling the extension member 140-b and the robotic arm 116-b based on the respective identification information and/or properties (e.g., hardware version, software version, type, available functions, range of motion, etc. ) of the extension member 140-b and the robotic arm 116-b. For example, at 215-b, the system 100 may identify capabilities of the extension member 140-b and the robotic arm 116-b.
  • the system 100 may determine or configure control parameters for controlling the robotic arm 116-a, based on the properties of the robotic arm 116-a. For example, at 220-a, the system 100 may determine algorithms for controlling the robotic arm 116-a, based on the capabilities determined at 215-a. The system 100 may determine or set the control parameters in association with a target objective (e.g., delivering therapy, performing a surgical incision, etc. ) of the surgical procedure. In some aspects, the target objective may be set by the user via the workstation 150.
  • An example control parameter includes target pose information of the robotic arm 116-a.
  • Another example control parameter includes target pose information of an end effector of the robotic arm 116-a.
  • the system 100 may determine or configure control parameters for controlling the extension member 140-b and the robotic arm 116-b, based on the respective properties of the extension member 140-b and the robotic arm 116-b. For example, at 220-b, the system 100 may determine algorithms for controlling the extension member 140-b and the robotic arm 116-b, based on the capabilities determined at 215-b. The system 100 may determine or set the control parameters in association with a target objective (e.g., delivering therapy, performing a surgical incision, etc. ) of the surgical procedure.
  • An example control parameter includes target pose information of the extension member 140-b and the robotic arm 116-b.
  • Another example control parameter includes target pose information of an end effector of the robotic arm 116-b.
  • the system 100 may identify other data associated with a surgical procedure, and the system 100 may configure the control parameters (e.g., at 220-a and 220-b) based on the other data.
  • examples of the other data include characteristics (e.g., weight, height, pose information, etc. ) of the subject 152, characteristics (e.g., dimensions, pose information, type, etc. ) of a target anatomical element of the subject 152 in association with a surgical procedure, and the like.
  • Additional examples of the other data include target coordinates associated with the surgical procedure, type of surgical procedure, and the like, and are not limited thereto.
  • the system 100 may access the other data from a surgical plan described herein.
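Configuring control parameters from the identified properties and the surgical plan (as at 220-a and 220-b) can be sketched as follows. The field names, units, and clamping policy here are illustrative assumptions, not the disclosed implementation:

```python
def configure_control_parameters(arm_properties, surgical_plan):
    """Derive control parameters for a connected arm from its identified
    properties and data accessed from a surgical plan."""
    # Never command a speed beyond what the identified hardware supports.
    speed = min(surgical_plan.get("requested_speed_mm_s", 5.0),
                arm_properties["max_speed_mm_s"])
    return {
        "target_pose": surgical_plan["target_pose"],  # e.g., pose info from the plan
        "speed_mm_s": speed,
        # Enable only the planned functions that this arm actually provides.
        "functions_enabled": [f for f in surgical_plan.get("functions", [])
                              if f in arm_properties["available_functions"]],
    }
```

The point of the sketch is that the same plan yields different control parameters for different arms, which is what makes the arms interchangeable at the workstation.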
  • the system 100 may control the robotic arm 116-a in association with performing the surgical procedure.
  • the system 100 may control the robotic arm 116-a by providing surgical commands to the robotic arm 116-a that correspond to the instruction operations of 215-a and control parameters of 220-a.
  • the system 100 may control the extension member 140-b and robotic arm 116-b in association with performing the surgical procedure.
  • the system 100 may control the extension member 140-b and robotic arm 116-b by providing surgical commands to the extension member 140-b and robotic arm 116-b.
  • the provided surgical commands may correspond to the instruction operations of 215-b and control parameters of 220-b.
  • the system 100 may monitor pose information of the robotic arm 116-a in association with the surgical procedure.
  • the robotic arm 116-a may electronically provide data to the system 100 indicative of the pose information.
  • the system 100 may monitor pose information of the extension member 140-b and robotic arm 116-b in association with the surgical procedure.
  • the extension member 140-b and/or robotic arm 116-b (e.g., via the extension member 140-b) may electronically provide data to the system 100 indicative of the pose information.
  • the system 100 may support tracking techniques (e.g., using cameras or other sensor (s) described with reference to navigation system 118) based on which the system 100 may monitor the pose information of the robotic arm 116-a, the extension member 140-b, and the robotic arm 116-b.
  • the system 100 may provide alerts or recommendations (e.g., via the user interface 110, via the workstation 150, etc. ) to the user indicating a recommended robotic arm 116, a recommended extension member 140, and/or a recommended installation location of the robotic arm 116 or extension member 140 for performing a surgical procedure.
  • the system 100 may provide such alerts or recommendations preoperatively or intraoperatively.
  • the system 100 may identify a target location (e.g., surgical site, an anatomical element, etc. ) associated with the surgical procedure based on a corresponding surgical plan stored in the database 130.
  • the system 100 may output a recommendation to install one or more robotic arms 116 (e.g., mount one or more robotic arms 116 and/or extension members 140 to a support structure) in association with performing the surgical procedure.
  • the system 100 may determine whether the properties (e.g., available functions, range of motion, etc. ) of a robotic arm 116 support performing the surgical procedure. For example, for the robotic arm 116-a coupled to mounting component 156-a, the system 100 may determine whether the properties of the robotic arm 116-a support performing the surgical procedure.
  • the system 100 may output a recommendation to replace the robotic arm 116-a with a robotic arm 116 of a different type.
  • the system 100 may output a recommendation to add an extension member 140 between the robotic arm 116-a and the mounting component 156-a. Additionally, or alternatively, the system 100 may output a recommendation to mount the robotic arm 116-a at another location.
  • the system 100 may output a recommendation to mount the robotic arm 116-a (with or without an extension member 140) to the mounting component 156-c of support structure 155.
  • the system 100 may output a recommendation to mount the robotic arm 116-a (with or without an extension member 140) to mounting component 156-d of the robot 114.
  • the system 100 may determine from the surgical plan (or real-time measurement data) that the weight of the subject 152 in combination with the weight of the robotic arm 116-a and/or robotic arm 116-b may exceed the weight capacity of the support structure 154.
  • the system 100 may output a recommendation to replace the robotic arm 116-a and/or the robotic arm 116-b with a comparable robotic arm 116 that weighs less.
  • the system 100 may output a recommendation to relocate the robotic arm 116-a and/or the robotic arm 116-b to another support structure (e.g., support structure 155, a support structure of the robot 114, etc. ) .
  • the system 100 may support providing recommendations to place robotic arms 116 (and/or extension members 140) according to locations of different mounting components 156 of the support structure 154 to evenly distribute the combined weight of the robotic arms 116 and/or extension members 140.
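The recommendation logic described above (weight capacity of the support structure, reach of the arm, and the option of adding an extension member or relocating the arm) can be sketched as a rules check. The thresholds, field names, and recommendation strings are all illustrative assumptions:

```python
import math

def recommend_setup(mount_xyz, target_xyz, arm, structure_capacity_kg,
                    subject_weight_kg, extension_length_mm=300.0):
    """Return setup recommendations for one arm on one mounting component.
    'arm' is a dict with hypothetical 'weight_kg' and 'reach_mm' fields."""
    recommendations = []
    # Weight check: subject plus arm must not exceed the structure's capacity.
    if subject_weight_kg + arm["weight_kg"] > structure_capacity_kg:
        recommendations.append(
            "relocate arm to another support structure or use a lighter arm")
    # Reach check: try an extension member before recommending relocation.
    distance = math.dist(mount_xyz, target_xyz)
    if distance > arm["reach_mm"]:
        if distance <= arm["reach_mm"] + extension_length_mm:
            recommendations.append(
                "add an extension member between the arm and the mounting component")
        else:
            recommendations.append("mount the arm at another location")
    return recommendations
```

Such checks could run preoperatively against the surgical plan or intraoperatively against real-time measurement data, as described above.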
  • Fig. 3 illustrates an example of a process flow 300 in accordance with aspects of the present disclosure.
  • process flow 300 may implement aspects of the system 100 described with reference to Figs. 1A through 1C.
  • the operations may be performed in a different order than the order shown, or the operations may be performed in different orders or at different times. Certain operations may also be left out of the process flow 300, or other operations may be added to the process flow 300.
  • process flow 300 described herein with respect to controlling a robotic arm 116 and/or extension member 140 in association with a surgical procedure may be applied to any robotic arm 116 and extension member 140 supported by the system 100. It is to be understood that the example aspects are not limited to the robotic arms 116 (e.g., robotic arm 116-a, robotic arm 116-b) and extension members 140 (e.g., extension member 140-b) described with reference to the process flow 300.
  • the process flow 300 may include detecting a removable coupling established between a robotic arm and a coupling interface of a system.
  • the removable coupling includes at least one of: a mechanical connection established between the robotic arm and the coupling interface; and an electrical connection established between the robotic arm and the coupling interface.
  • the coupling interface is associated with one or more support structures of the system.
  • the process flow 300 may include identifying one or more properties associated with a robotic arm.
  • the robotic arm is plug-and-play compatible with a coupling interface of the system.
  • identifying the one or more properties associated with the robotic arm is in response to detecting the removable coupling.
  • the process flow 300 may include retrieving the one or more properties associated with the robotic arm from a memory stored on the robotic arm.
  • the one or more properties associated with the robotic arm include at least one of: status information associated with the robotic arm; and identification information associated with the robotic arm. In some aspects, the one or more properties associated with the robotic arm include at least one of: a hardware version associated with the robotic arm; and a software version associated with the robotic arm. In some aspects, the one or more properties associated with the robotic arm include at least one of: a type associated with the robotic arm; one or more functions associated with the robotic arm; and a range of motion associated with the robotic arm.
  • the process flow 300 may include determining one or more instruction operations associated with controlling the robotic arm based on the one or more properties.
  • determining the one or more instruction operations is based on a target objective associated with performing the procedure.
  • the process flow 300 may include determining one or more control parameters associated with controlling the robotic arm based on the one or more properties associated with the robotic arm.
  • the one or more control parameters include at least one of: target pose information of the robotic arm; and second target pose information of an end effector of the robotic arm. In some aspects, determining the one or more instruction operations, determining the one or more control parameters, or both is based on at least one of: one or more characteristics of a subject associated with the surgical procedure; and one or more characteristics of an anatomical element of the subject in association with the surgical procedure.
  • the process flow 300 may include controlling the robotic arm in association with a procedure by transmitting one or more commands based on the one or more instruction operations.
  • controlling the robotic arm includes transmitting the one or more surgical commands based on the one or more control parameters.
  • the process flow 300 may include determining pose information of the robotic arm, pose information of an extension member removably coupled to the robotic arm, or both in response to at least one of: receiving data from the robotic arm; and receiving second data from an extension member, wherein the extension member is removably coupled to the coupling interface.
  • the process flow 300 includes: identifying one or more second properties associated with an extension member, wherein: the extension member is plug-and-play compatible with the coupling interface and the robotic arm; and the extension member is removably coupled to the coupling interface and removably coupled to the robotic arm; determining one or more second control parameters associated with controlling the extension member and the robotic arm based on identifying the one or more second properties; and controlling the extension member and the robotic arm in association with the procedure by transmitting the one or more surgical commands based on the one or more second control parameters.
  • the one or more second properties include at least one of: a type associated with the extension member; one or more functions associated with the extension member; and a range of motion associated with the extension member.
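Taken together, the steps of process flow 300 — identify the coupled arm's properties, determine instruction operations and control parameters, then transmit commands — can be sketched in Python. Every function, key name, and mapping below is a hypothetical stand-in for the claimed operations, not an actual implementation.

```python
from typing import Callable, Dict, List

def plan_operations(props: Dict, objective: str) -> List[str]:
    """Illustrative: restrict instruction operations to functions the
    arm reports supporting, given a target objective (names assumed)."""
    wanted = {"drill": ["guide_trajectory"], "retract": ["hold_tool"]}
    supported = set(props.get("functions", []))
    return [f for f in wanted.get(objective, []) if f in supported]

def control_plug_and_play_arm(
    read_properties: Callable[[], Dict],    # e.g. reads memory on the arm
    send_command: Callable[[Dict], None],   # transmits a surgical command
    target_objective: str,
) -> List[Dict]:
    """Hypothetical sketch of process flow 300."""
    props = read_properties()                               # identify properties
    operations = plan_operations(props, target_objective)   # instruction ops
    commands = []
    for op in operations:
        # control parameters: e.g. target pose information of the arm
        cmd = {"op": op, "target_pose": props.get("home_pose")}
        send_command(cmd)                                   # control the arm
        commands.append(cmd)
    return commands
```

A controller stub can exercise the sketch by supplying a fake property reader and collecting the commands it would have transmitted.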
  • the process flows 200 and 300 described herein may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114), part of a navigation system (such as a navigation system 118), or part of a workstation 150.
  • a processor other than any processor described herein may also be used to execute the process flows described herein.
  • the at least one processor may perform operations of the process flows by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more operations of a function as shown in the process flows.
  • One or more portions of the process flows may be performed by the processor executing any of the contents of memory, such as arm management engine 136, image processing 120, a segmentation 122, a transformation 124, and/or a registration 128.
  • the present disclosure encompasses methods with fewer than all of the operations identified in Figs. 2 and 3 (and the corresponding descriptions of the process flows 200 and 300), as well as methods that include additional operations beyond those identified in Figs. 2 and 3.
  • the present disclosure also encompasses methods that include one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or include a registration or any other correlation.
  • Example aspects of the present disclosure include:
  • a system including: a processor; and a memory storing instructions thereon that, when executed by the processor, cause the processor to: identify one or more properties associated with a robotic arm, wherein the robotic arm is plug-and-play compatible with a coupling interface of the system; determine one or more instruction operations associated with controlling the robotic arm based on the one or more properties; and control the robotic arm in association with a surgical procedure by transmitting one or more surgical commands based on the one or more instruction operations.
  • the instructions are further executable by the processor to: detect a removable coupling established between the robotic arm and the coupling interface, wherein the removable coupling includes at least one of: a mechanical connection established between the robotic arm and the coupling interface; and an electrical connection established between the robotic arm and the coupling interface, wherein identifying the one or more properties associated with the robotic arm is in response to detecting the removable coupling.
  • the instructions are further executable by the processor to: retrieve the one or more properties associated with the robotic arm from a memory stored on the robotic arm.
  • the one or more properties associated with the robotic arm include at least one of: status information associated with the robotic arm; and identification information associated with the robotic arm.
  • the one or more properties associated with the robotic arm include at least one of: a hardware version associated with the robotic arm; and a software version associated with the robotic arm.
  • the one or more properties associated with the robotic arm include at least one of: a type associated with the robotic arm; one or more functions associated with the robotic arm; and a range of motion associated with the robotic arm.
  • any of the aspects herein further including: determine one or more control parameters associated with controlling the robotic arm based on the one or more properties associated with the robotic arm, wherein the one or more control parameters include at least one of target pose information of the robotic arm and second target pose information of an end effector of the robotic arm, wherein controlling the robotic arm includes transmitting the one or more surgical commands based on the one or more control parameters.
  • determining the one or more instruction operations, determining the one or more control parameters, or both is based on at least one of: one or more characteristics of a subject associated with the surgical procedure; and one or more characteristics of an anatomical element of the subject in association with the surgical procedure.
  • the instructions are further executable by the processor to: identify one or more second properties associated with an extension member, wherein: the extension member is plug-and-play compatible with the coupling interface and the robotic arm; and the extension member is removably coupled to the coupling interface and removably coupled to the robotic arm; determine one or more second control parameters associated with controlling the extension member and the robotic arm based on identifying the one or more second properties; and control the extension member and the robotic arm in association with the procedure by transmitting the one or more surgical commands based on the one or more second control parameters.
  • the one or more second properties include at least one of: a type associated with the extension member; one or more functions associated with the extension member; and a range of motion associated with the extension member.
  • determining the one or more instruction operations is based on a target objective associated with performing the procedure.
  • the instructions are further executable by the processor to: determine pose information of the robotic arm, pose information of an extension member removably coupled to the robotic arm, or both in response to at least one of: receiving data from the robotic arm; and receiving second data from an extension member, wherein the extension member is removably coupled to the coupling interface.
  • the coupling interface is associated with one or more support structures of the system.
  • a system including: a robotic arm; and robot management circuitry that manages control of the robotic arm by: identifying one or more properties associated with the robotic arm, wherein the robotic arm is plug-and-play compatible with a mounting component of the system; determining one or more instruction operations associated with controlling the robotic arm based on identifying the one or more properties; and controlling the robotic arm in association with a surgical procedure by transmitting one or more surgical commands based on the one or more instruction operations.
  • the robot management circuitry is to: detect a removable coupling established between the robotic arm and the coupling interface, wherein the removable coupling includes at least one of: a mechanical connection established between the robotic arm and the coupling interface; and an electrical connection established between the robotic arm and the coupling interface, wherein identifying the one or more properties associated with the robotic arm is in response to detecting the removable coupling.
  • the robot management circuitry further manages control of the robotic arm by: determining one or more control parameters associated with controlling the robotic arm based on identifying the one or more properties associated with the robotic arm, wherein controlling the robotic arm includes transmitting the one or more surgical commands based on the one or more control parameters.
  • the robot management circuitry further manages control of the robotic arm by: identifying one or more second properties associated with an extension member, wherein: the extension member is plug-and-play compatible with the coupling interface and the robotic arm; and the extension member is removably coupled to the coupling interface and removably coupled to the robotic arm; determining one or more second control parameters associated with controlling the extension member and the robotic arm based on identifying the one or more second properties; and controlling the extension member and the robotic arm in association with the procedure by transmitting the one or more surgical commands based on the one or more second control parameters.
  • any of the aspects herein further including an extension member removably coupled to the robotic arm and to the mounting component, wherein: a first end of the extension member is plug-and-play compatible with the mounting component based on one or more position-mounting interfaces, one or more power interfaces, and one or more data communication interfaces of the extension member; and a second end of the extension member is plug-and-play compatible with the robotic arm based on one or more second position-mounting interfaces, one or more second power interfaces, and one or more second data communication interfaces of the extension member.
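The dual-ended compatibility requirement above — each end of the extension member exposing position-mounting, power, and data communication interfaces — can be expressed as a simple check. The type and function names below are assumptions for illustration, not part of the claimed system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EndInterfaces:
    """Interfaces present on one end of an extension member (names assumed)."""
    position_mounting: bool
    power: bool
    data_communication: bool

def extension_is_plug_and_play(first_end: EndInterfaces,
                               second_end: EndInterfaces) -> bool:
    """Sketch: an extension member is plug-and-play compatible only if both
    ends expose position-mounting, power, and data communication interfaces,
    mirroring the claim's first-end/second-end requirement."""
    def complete(end: EndInterfaces) -> bool:
        return end.position_mounting and end.power and end.data_communication
    return complete(first_end) and complete(second_end)
```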
  • a method including: electronically receiving status information associated with a robotic arm of a system, wherein the status information includes an indication that the robotic arm is in an active state, wherein the robotic arm is plug-and-play compatible with a coupling interface of the system; and determining, in response to electronically receiving the status information and based on one or more properties associated with the robotic arm, one or more instruction operations associated with controlling the robotic arm; and controlling the robotic arm in association with a surgical procedure by electronically communicating one or more surgical commands corresponding to the one or more instruction operations to the robotic arm.
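The method above gates control on the arm reporting an active state before any instruction operations are determined. A minimal sketch, assuming a plain status dictionary and an illustrative capability-to-operation mapping:

```python
from typing import Dict, List

def operations_for_active_arm(status: Dict) -> List[str]:
    """Minimal sketch: instruction operations are only determined once the
    arm's status information indicates an active state. The "state" and
    "functions" keys and the operation names are assumptions."""
    if status.get("state") != "active":
        return []  # no commands while the arm is inactive or uncoupled
    # illustrative mapping from reported capabilities to operations
    return [f"enable_{fn}" for fn in status.get("functions", [])]
```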
  • phrases “at least one,” “one or more,” “or,” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation.
  • each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • the term “a” or “an” entity refers to one or more of that entity.
  • the terms “a” (or “an”), “one or more,” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising,” “including,” and “having” can be used interchangeably.
  • the term “automated” refers to any process or operation, which is typically continuous or semi-continuous, done without material human input when the process or operation is performed.
  • a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation.
  • Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material.”
  • aspects of the present disclosure may take the form of an implementation that is entirely hardware, an implementation that is entirely software (including firmware, resident software, micro-code, etc.) or an implementation combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Any combination of one or more computer-readable medium(s) may be utilized.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • a computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Manipulator (AREA)

Abstract

Methods and systems (100) including identifying one or more properties associated with a robotic arm (116). The plug-and-play robotic arm (116) is compatible with a coupling interface of the systems (100). The methods and systems (100) include determining instruction operations and control parameters associated with controlling the robotic arm (116) based on the one or more properties. The methods and systems (100) include controlling the robotic arm (116) in association with a surgical procedure based on the instruction operations and control parameters. The methods and systems (100) include removably coupling the robotic arm (116) to a support structure (154, 155), with or without an extension member (140).
EP22965472.8A 2022-11-16 2022-11-16 Bras de type branchez-et-utilisez pour robotique rachidienne Pending EP4618884A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/132221 WO2024103286A1 (fr) 2022-11-16 2022-11-16 Bras de type branchez-et-utilisez pour robotique rachidienne

Publications (1)

Publication Number Publication Date
EP4618884A1 true EP4618884A1 (fr) 2025-09-24

Family

ID=91083612

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22965472.8A Pending EP4618884A1 (fr) 2022-11-16 2022-11-16 Bras de type branchez-et-utilisez pour robotique rachidienne

Country Status (3)

Country Link
EP (1) EP4618884A1 (fr)
CN (1) CN120187373A (fr)
WO (1) WO2024103286A1 (fr)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7979157B2 (en) * 2004-07-23 2011-07-12 Mcmaster University Multi-purpose robotic operating system and method
US10500015B2 (en) * 2014-05-13 2019-12-10 Covidien Lp Surgical robotic arm support systems and methods of use
US10500739B2 (en) * 2015-11-13 2019-12-10 Ethicon Llc Robotic surgical system
GB2545637A (en) * 2015-12-10 2017-06-28 Cambridge Medical Robotics Ltd Robot mounting arrangement
US11918297B2 (en) * 2019-01-10 2024-03-05 Mazor Robotics Ltd. System and method for registration between coordinate systems and navigation

Also Published As

Publication number Publication date
CN120187373A (zh) 2025-06-20
WO2024103286A1 (fr) 2024-05-23

Similar Documents

Publication Publication Date Title
US12295797B2 (en) Systems, methods, and devices for providing an augmented display
EP4531740A1 (fr) Enregistrement de clamp de processus spinal et procédés d'utilisation de celui-ci
US20250152262A1 (en) Path planning based on work volume mapping
US20240293190A1 (en) System and method for preliminary registration
US20230346492A1 (en) Robotic surgical system with floating patient mount
US20230240754A1 (en) Tissue pathway creation using ultrasonic sensors
WO2023062624A1 (fr) Systèmes pour définir une géométrie d'objet à l'aide de bras robotiques
US20240382265A1 (en) Hybrid localization for minimally invasive surgery and cervical spinal referencing, and methods for using the same
WO2024103286A1 (fr) Bras de type branchez-et-utilisez pour robotique rachidienne
US12094128B2 (en) Robot integrated segmental tracking
WO2024229651A1 (fr) Positionnement intelligent d'un chariot de bras de robot
US20230404692A1 (en) Cost effective robotic system architecture
US20240156531A1 (en) Method for creating a surgical plan based on an ultrasound view
WO2023141800A1 (fr) Système de positionnement de rayons x mobile
US11847809B2 (en) Systems, devices, and methods for identifying and locating a region of interest
WO2025158442A1 (fr) Systèmes et procédés de détermination et d'application d'une compensation de déviation à des systèmes robotiques
WO2024236440A1 (fr) Localisation hybride pour chirurgie minimalement invasive et référencement spinal cervical, et leurs procédés d'utilisation
WO2025120636A1 (fr) Systèmes et procédés de détermination du mouvement d'un ou plusieurs éléments anatomiques
WO2024236568A1 (fr) Systèmes et procédés d'identification d'un ou plusieurs dispositifs de suivi
WO2025122777A1 (fr) Auto-étalonnage d'un système à capteurs multiples
CN118647331A (zh) 用于生成混合图像的系统和装置

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20250613

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR