WO2025184368A1 - Anatomy-based force feedback and instrument guidance - Google Patents
- Publication number
- WO2025184368A1 (PCT/US2025/017633)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- anatomical structure
- force
- medical procedure
- medical
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/065—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
- A61B2090/066—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring torque
Definitions
- Teleoperation of robotic systems can provide teleoperators with multiple advantages. For example, teleoperators can operate such robotic systems with greater control and precision than would otherwise be achievable using conventional techniques. Further, teleoperators can suspend instrument movements when experiencing fatigue and resume operation once the fatigue has subsided. But in the absence of continuous feedback as force is applied by conventional instruments to various structures, it can be difficult for teleoperators to develop an accurate perception of the state of one or more structures as forces are applied to them during teleoperation, particularly where such structures may be sensitive to repeated applications of force.
- Technical solutions disclosed herein are generally related to systems and methods for anatomy based force feedback and instrument guidance.
- the technical solutions can determine an amount of force applied by an instrument (e.g., a medical instrument) during teleoperation.
- described herein are specific techniques for updating user interfaces during teleoperation to establish a specific teleoperation experience.
- the user interface can be generated based at least in part on video data received from a sensing system during a medical procedure.
- the user interface can include one or more indications of a metric indicative of performance of the medical procedure. These indications can be in the form of numbers or letters, for example, or in the form of updates to the images represented by the video data that occur in accordance with the techniques described herein.
- the system can include one or more processors, coupled with memory.
- the one or more processors can receive a data stream of a medical procedure performed with a robotic medical system.
- the one or more processors can identify, using the data stream and with one or more models trained with machine learning, a type of an anatomical structure on which the medical procedure is performed.
- the one or more processors can determine, using the data stream, an amount of force applied to the anatomical structure.
- the one or more processors can determine, based at least on a comparison of the amount of force applied to the anatomical structure with a force threshold established for the type of the anatomical structure, a metric indicative of performance of the medical procedure.
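The threshold comparison described above can be sketched as follows; the function name, the example threshold values, and the linear 0-to-1 scoring scheme are illustrative assumptions, not details taken from the disclosure.

```python
# Hypothetical per-anatomy force thresholds in newtons (illustrative values,
# not taken from the disclosure).
FORCE_THRESHOLDS_N = {"liver": 2.0, "bowel": 1.5, "bone": 10.0}

def performance_metric(anatomy_type: str, applied_force_n: float) -> float:
    """Return a 0..1 metric indicative of procedure performance: 1.0 when no
    force is applied, falling to 0.0 at or above the force threshold
    established for the identified type of anatomical structure."""
    threshold = FORCE_THRESHOLDS_N[anatomy_type]
    return max(0.0, 1.0 - applied_force_n / threshold)
```

An indication of such a metric (a number, a letter, or a color) could then be surfaced on the user interface to control performance of the procedure.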
- the one or more processors can provide an indication of the metric to control performance of the medical procedure.
- the one or more processors can receive the data stream of the medical procedure, the data stream comprising data associated with force vectors representing directions and magnitudes of force interactions between instruments and anatomical structures involved in the medical procedure, the force vectors captured at a plurality of points in time. In aspects, the one or more processors can determine the amount of force applied to the anatomical structure based at least on the force vectors.
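One plausible reading of determining an amount of force from the captured force vectors is to take the Euclidean magnitude of each vector and, for instance, track the peak across the sampled points in time; the function names below are assumptions for illustration.

```python
import math

def force_magnitude(force_vector: tuple[float, float, float]) -> float:
    """Euclidean magnitude (newtons) of a 3-D force vector whose components
    encode the direction and magnitude of an instrument-tissue interaction."""
    fx, fy, fz = force_vector
    return math.sqrt(fx * fx + fy * fy + fz * fz)

def peak_force(samples: list[tuple[float, float, float]]) -> float:
    """Largest instantaneous force magnitude across force vectors captured
    at a plurality of points in time."""
    return max(force_magnitude(v) for v in samples)
```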
- the one or more processors can determine the amount of force applied to the anatomical structure, the amount of force representing an interaction between an instrument involved in the medical procedure and the anatomical structure at a point in time.
- the one or more processors can determine a cumulative amount of force applied to the anatomical structure over a period of time, the cumulative amount of force determined based at least on adding amounts of force applied to the anatomical structure by instruments involved in the medical procedure at one or more points in time over the period of time. In aspects, the one or more processors can determine the amount of force applied to the anatomical structure based at least on the cumulative amount of force applied to the anatomical structure.
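The cumulative amount of force could be computed in several ways; one hedged interpretation of "adding amounts of force ... over the period of time" is a time integral of the force magnitude (an impulse, in newton-seconds), sketched here with the trapezoidal rule.

```python
def cumulative_force(samples: list[tuple[float, float]]) -> float:
    """Accumulated load on an anatomical structure over a period of time.
    `samples` is a list of (timestamp_s, force_n) pairs; the trapezoidal
    rule integrates force magnitude over time, yielding newton-seconds."""
    total = 0.0
    for (t0, f0), (t1, f1) in zip(samples, samples[1:]):
        total += 0.5 * (f0 + f1) * (t1 - t0)
    return total
```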
- the one or more processors can determine the amount of force applied based at least on contact between at least one instrument and the anatomical structure. In aspects, the one or more processors can determine a three-dimensional direction associated with the contact between the at least one instrument and the anatomical structure.
- the one or more processors can determine the amount of force applied to the anatomical structure based at least on one or more sensor signals generated by one or more torque sensors.
- the one or more torque sensors are configured to measure torque at one or more joints of one or more arms of a surgical robot, the one or more arms supporting an instrument that exerts the amount of force on the anatomical structure.
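One standard way to relate joint torques to the force an instrument exerts at its tip is the manipulator Jacobian relation tau = J^T F; the sketch below solves it in a least-squares sense. This is a general robotics technique, not a method attributed to the disclosure, and the function name is an assumption.

```python
import numpy as np

def tip_force_from_joint_torques(jacobian: np.ndarray,
                                 torques: np.ndarray) -> np.ndarray:
    """Estimate the 3-D force at the instrument tip from torques measured
    at the joints of a manipulator arm, using tau = J^T @ F solved in the
    least-squares sense. `jacobian` is the 3 x n linear-velocity Jacobian
    of the arm; `torques` is the n-vector of joint torque sensor readings."""
    force, *_ = np.linalg.lstsq(jacobian.T, torques, rcond=None)
    return force
```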
- the one or more processors can update the amount of force applied to the anatomical structure based at least on an instrument type associated with the instrument that is exerting the amount of force on the anatomical structure.
- the one or more processors can determine, based at least on the update to the amount of force applied to the anatomical structure, the metric indicative of performance of the medical procedure.
- In some aspects, the one or more processors can determine the metric based at least on the comparison of the amount of force with the force threshold established for the type of the anatomical structure and one or more of: an instrument type of an instrument involved in the medical procedure, an orientation of the instrument involved in the medical procedure, a maneuver associated with a point in time at which the amount of force is applied to the anatomical structure, or an amount of time for patient recovery associated with the amount of force applied to the anatomical structure.
- the one or more processors can generate a user interface based at least on the data stream of the medical procedure and the metric indicative of the performance of the medical procedure.
- the one or more processors can cause a display device to display the user interface, the display of the user interface occurring during or after a point in time when the medical procedure is performed by the robotic medical system.
- the one or more processors can generate a visual representation of the anatomical structure involved in the medical procedure. In aspects, the one or more processors can update the visual representation of the anatomical structure based at least on the metric indicative of the performance of the medical procedure. In aspects, the one or more processors can cause the display device to display the user interface, the user interface comprising at least a portion of the visual representation of the anatomical structure.
- In some aspects, the one or more processors can update the visual representation of the anatomical structure based at least on a heatmap associated with the anatomical structure. In aspects, the heatmap comprises a visual representation of the metric indicative of the performance of the medical procedure overlaid onto the anatomical structure.
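A heatmap overlay of the kind described can be sketched as a mapping from the metric to a color, alpha-blended over the rendered anatomical structure; the green-to-red ramp and the function names are illustrative assumptions.

```python
def heat_color(metric: float) -> tuple[int, int, int]:
    """Map a 0..1 performance metric to RGB: green at 1.0 (well under the
    force threshold) ramping through yellow to red at 0.0 (threshold reached)."""
    m = min(1.0, max(0.0, metric))
    return (int(255 * (1.0 - m)), int(255 * m), 0)

def blend(base: tuple[int, int, int],
          overlay: tuple[int, int, int], alpha: float) -> tuple[int, int, int]:
    """Alpha-blend a heatmap color over a pixel of the anatomy rendering."""
    return tuple(int((1 - alpha) * b + alpha * o) for b, o in zip(base, overlay))
```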
- the one or more processors can determine the indication of the metric indicative of the performance of the medical procedure. In aspects, the one or more processors can update the user interface based at least on the one or more indicators to include the one or more indicators at a location on the user interface.
- the one or more processors can determine the indication of the metric to control performance of the medical procedure.
- the indication represents the amount of force applied to the anatomical structure at a point in time at which the force is applied to the anatomical structure.
- the one or more processors can determine the indication of the metric to control performance of the medical procedure.
- the indication represents the amount of force applied to the anatomical structure over a period in time during which the force is applied to the anatomical structure.
- the one or more processors can determine the indication of the metric to control performance of the medical procedure.
- the indication represents a three-dimensional direction in which an instrument contacts the anatomical structure.
- the one or more processors can determine the indication of the metric to control performance of the medical procedure based at least on the metric indicative of the performance of the medical procedure.
- the method can include one or more processors receiving a data stream of a medical procedure performed with a robotic medical system.
- the method can include one or more processors identifying, using the data stream and with one or more models trained with machine learning, a type of an anatomical structure on which the medical procedure is performed.
- the method can include one or more processors determining, using the data stream, an amount of force applied to the anatomical structure.
- the method can include one or more processors determining, based at least on a comparison of the amount of force applied to the anatomical structure with a force threshold established for the type of the anatomical structure, a metric indicative of performance of the medical procedure.
- the method can include one or more processors providing an indication of the metric to control performance of the medical procedure.
- the method can include the one or more processors receiving the data stream of the medical procedure.
- the data stream comprises data associated with force vectors representing directions and magnitudes of force interactions between instruments and anatomical structures involved in the medical procedure, the force vectors captured at a plurality of points in time.
- the method can include the one or more processors determining the amount of force applied to the anatomical structure based at least on the force vectors.
- the method can include the one or more processors determining the amount of force applied to the anatomical structure, the amount of force representing an interaction between an instrument involved in the medical procedure and the anatomical structure at a point in time.
- aspects of the technical solution are directed to a non-transitory computer-readable medium storing processor-executable instructions that, when executed by one or more processors, cause the one or more processors to receive a data stream of a medical procedure performed with a robotic medical system.
- the instructions can include instructions to identify, using the data stream and with one or more models trained with machine learning, a type of an anatomical structure on which the medical procedure is performed.
- the instructions can include instructions to determine, using the data stream, an amount of force applied to the anatomical structure.
- the instructions can include instructions to determine, based at least on a comparison of the amount of force applied to the anatomical structure with a force threshold established for the type of the anatomical structure, a metric indicative of performance of the medical procedure.
- the instructions can include instructions to provide an indication of the metric to control performance of the medical procedure.
- FIG. 1A depicts an example system to determine forces applied by an instrument during teleoperation.
- FIG. 1B illustrates a schematic block diagram of an example environment for determining an amount of force applied by an instrument during teleoperation, according to some embodiments.
- FIG. 2 illustrates a flowchart of an example method for determining an amount of force applied by an instrument during teleoperation, according to some embodiments.
- FIG. 3 illustrates an image of an example user interface, according to some embodiments.
- FIG. 4 illustrates a graph of example force limits, according to some embodiments.
- FIG. 5 illustrates a diagram of a medical environment, according to some embodiments.
- FIG. 6 illustrates a block diagram depicting an architecture for a computer system that can be employed to implement elements of the systems and methods described and illustrated herein.
- while the present disclosure is discussed in the context of a surgical procedure, in some embodiments it can be applicable to other medical sessions, environments, or activities, as well as non-medical activities where the determination of applied forces is desired.
- Systems, methods, apparatuses, and non-transitory computer-readable media are provided for anatomy based force feedback and instrument guidance.
- the technology can determine an amount of force applied by an instrument (e.g., a medical instrument) during teleoperation.
- methods described herein include receiving a data stream of a medical procedure performed with a robotic medical system; identifying, using the data stream and with one or more models trained with machine learning, a type of an anatomical structure on which the medical procedure is performed; determining, using the data stream, an amount of force applied to the anatomical structure; and determining, based at least on a comparison of the amount of force applied to the anatomical structure with a force threshold established for the type of the anatomical structure, a metric indicative of performance of the medical procedure. Arrangements also relate to providing an indication of the metric to control performance of the medical procedure with the robotic medical system.
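The four method steps above (receive the stream, identify the anatomy, determine the force, compare against a per-anatomy threshold) can be summarized as a processing loop; the injected callables stand in for the trained model and the force estimation pipeline, and all names here are assumptions for illustration.

```python
def process_stream(frames, identify_anatomy, estimate_force, thresholds):
    """Skeleton of the disclosed loop: classify the anatomical structure in
    each frame of the data stream, estimate the applied force, compare it
    with the force threshold established for that structure type, and yield
    an indication suitable for a user interface."""
    for frame in frames:
        anatomy = identify_anatomy(frame)   # e.g., ML model inference
        force_n = estimate_force(frame)     # e.g., from joint torque data
        yield {"anatomy": anatomy,
               "force_n": force_n,
               "over_limit": force_n > thresholds[anatomy]}
```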
- the present disclosure includes systems and methods that quantify an amount of force that has been applied to one or more anatomical structures involved in the surgery, determine metrics indicative of the performance that may represent an amount of force that can be applied, and provide an indication of those metrics to the surgeon (e.g., via a user interface).
- the described techniques can improve the perception of the way the robotic system is interacting with patients during medical procedures, reduce the chance that forces applied to anatomical structures unnecessarily cause adverse effects on patients' short- and long-term health outcomes, and generally improve overall patient outcomes. And by virtue of understanding the state of one or more anatomical structures in the context of the metrics described herein, a surgeon can better understand what future interactions are possible and what the downstream effects of such interactions may be. This, in turn, enables the surgeon to avoid extending the surgery time or causing unintended damage to the anatomical structures involved in the medical procedure.
- FIG. 1A depicts an example system 100 to determine forces applied by an instrument during teleoperation of robotic systems such as, for example, robotic medical systems used in robot-assisted surgeries.
- the example system 100 can include a combination of hardware and software for generating indications of an amount of force during operation of a robotic system.
- the example system 100 can include a network 101, a medical environment 102, and a data processing system 130 as described herein.
- the example system 100 can include a medical environment 102 (e.g., a medical environment that is the same as, or similar to, the example medical environment 500 of FIG. 5) including one or more data capture devices 110, medical instruments 112, visualization tools 114, displays 116 and robotic medical systems (RMSs) 120.
- RMS 120 can include or generate various types of data streams 158 that are described herein, and can operate using system configurations 122.
- One or more RMSs 120 can be communicatively coupled with one or more data processing systems 130.
- the RMS 120 can be deployed in any medical environment 102.
- the medical environment 102 can include any space or facility for performing medical procedures, such as a surgical facility, or an operating room.
- the medical environment 102 can include medical instruments 112 (e.g., surgical tools used for specialized tasks) that the RMS 120 can use for performing operational procedures, such as surgical patient procedures, whether invasive, non-invasive, or any in-patient or out-patient procedures.
- RMS 120 can be centralized or distributed across a plurality of computing devices or systems, such as computing devices 600 (e.g., used on servers, network devices or cloud computing products) to implement various functionalities of the RMS 120, including communicating or processing data streams 158 across various devices via the network 101.
- the medical environment 102 can include one or more data capture devices 110 (e.g., optical devices, such as cameras or sensors or other types of sensors or detectors) for capturing data streams 158.
- the data streams 158 can include any sensor data, such as images or videos of a surgery, kinematics data on any movement of medical instruments 112, or any events data, such as installation, configuration or selection events corresponding to medical instruments 112.
- the medical environment 102 can include one or more visualization tools 114 to gather the captured data streams 158 and process them for display to the user (e.g., a surgeon, a medical professional, or an engineer or technician configuring the RMS) via one or more (e.g., touchscreen) displays 116.
- a display 116 can present data stream 158 (e.g., images or video frames) of a medical procedure (e.g., surgery) being performed using the RMS 120 while handling, manipulating, holding or otherwise utilizing medical instruments 112 to perform surgical tasks at the surgical site.
- RMS 120 can include system configurations 122 based at least on which RMS 120 can operate, and the functionality of which can impact the data flow of the data streams 158.
- the system 100 can include one or more data capture devices 110 (e.g., video cameras, sensors or detectors) for collecting any data stream 158, that can be used for machine learning, including detection of objects from sensor data (e.g., video frames or force or feedback data), detection of particular events (e.g., user interface selection of, or a surgeon’s engaging of, a medical instrument 112) or detection of kinematics (e.g., movements of the medical instrument 112).
- the data capture devices 110 can include cameras or other image capture devices for capturing videos or images from a particular viewpoint within the medical environment 102.
- the data capture devices 110 can be positioned, mounted, or otherwise located to capture content from any viewpoint that facilitates the data processing system capturing various surgical tasks or actions.
- the data capture devices 110 can include any of a variety of detectors, sensors, cameras, video imaging devices, infrared imaging devices, visible light imaging devices, intensity imaging devices (e.g., black, color, grayscale imaging devices, etc.), hyperspectral imaging devices (e.g., a hyperspectral camera, etc.), depth imaging devices (e.g., stereoscopic imaging devices, time-of-flight imaging devices, etc.), medical imaging devices such as endoscopic imaging devices, ultrasound imaging devices, etc., non-visible light imaging devices, any combination or sub-combination of the above mentioned imaging devices, or any other type of imaging devices that can be suitable for the purposes described herein.
- the data capture devices 110 can include cameras that a surgeon can use to perform a surgery and observe manipulation components within a field of view suitable for the given task.
- the data capture devices can output any type of data streams 158, including data streams 158 of kinematics data (e.g., kinematics data streams), data streams 158 of events data (e.g., events data streams) and data streams 158 of sensor data (e.g., sensors data streams).
- data capture devices 110 can capture, detect, or acquire sensor data such as videos or images, including for example, still images, video images, vector images, bitmap images, other types of images (e.g., Raman hyperspectral images, etc.), or combinations thereof.
- the data capture devices 110 can capture the images at any suitable predetermined capture rate or frequency.
- Settings, such as zoom settings or resolution, of each of the data capture devices 110 can vary as desired to capture suitable images from any viewpoint.
- data capture devices 110 can have fixed viewpoints, locations, positions, or orientations.
- the data capture devices 110 can be portable, or otherwise configured to change orientation or telescope in various directions.
- the data capture devices 110 can be part of a multi-sensor architecture including multiple sensors, with each sensor being configured to detect, measure, or otherwise capture a particular parameter (e.g., sound, images, or pressure).
- the data capture devices 110 can generate sensor data from any type and form of a sensor, such as a positioning sensor, a biometric sensor, a velocity sensor, an acceleration sensor, a vibration sensor, a motion sensor, a pressure sensor, a light sensor, a distance sensor, a current sensor, a focus sensor, a temperature or pressure sensor or any other type and form of sensor used for providing data on the medical instruments 112, or the data capture devices (e.g., optical devices).
- the data capture device 110 can include a location sensor, a distance sensor or a positioning sensor providing coordinate locations of a medical instrument 112 (e.g., kinematics data).
- the data capture device 110 can include a sensor providing information or data on a location, position or spatial orientation of an object (e.g., medical instrument 112 or a lens of data capture device 110) with respect to a reference point for kinematics data.
- the reference point can include any fixed, defined location used as the starting point for measuring distances and positions in a specific direction, serving as the origin from which all other points or locations can be determined.
- the display 116 can show, illustrate or play the data stream 158, such as a video stream, in which the medical instruments 112 at or near surgical sites are shown.
- the display 116 can display a rectangular image of a surgical site along with at least a portion of the medical instruments 112 being used to perform surgical tasks.
- the display 116 can provide compiled or composite images generated by the visualization tool 114 from a plurality of data capture devices 110 to provide a visual feedback from one or more points of view.
- the visualization tool 114 can be configured or designed to receive any number of different data streams 158 from any number of data capture devices 110 and combine them into a single data stream displayed on a display 116.
- the visualization tool 114 can be configured to receive a plurality of data stream components and combine the plurality of data stream components into a single data stream 158.
- the visualization tool 114 can receive a visual sensor data from one or more of the medical instruments 112, sensors or cameras with respect to a surgical site or an area in which a surgery is performed.
- the visualization tool 114 can incorporate, combine or utilize multiple types of data (e.g., positioning data of a medical instrument 112 along sensor readings of pressure, temperature, vibration or any other data) to generate an output to present on a display 116.
- the visualization tool 114 can present locations of medical instruments 112 along with locations of any reference points or surgical sites, including locations of anatomical parts of the patient (e.g., organs, glands or bones).
- the medical instruments 112 can be any type and form of tool or medical instrument used for surgery, medical procedures or a tool in an operating room or environment.
- the medical instrument 112 can be imaged by, associated with, or include an image capture device.
- a medical instrument 112 can be a tool for making incisions, a tool for suturing a wound, an endoscope for visualizing organs or tissues, an imaging device, a needle and a thread for stitching a wound, a surgical scalpel, forceps, scissors, retractors, graspers, or any other tool or medical instrument to be used during a surgery.
- the medical instruments 112 can include hemostats, trocars, surgical drills, suction devices or any medical instruments for use during a surgery.
- the medical instrument 112 can include other or additional types of therapeutic or diagnostic medical imaging implements.
- the medical instrument 112 can be configured to be installed in, coupled with, or manipulated by an RMS 120, such as by manipulator arms or other components for holding, using and manipulating the medical instruments.
- the medical instruments 112 can be the same as, or similar to, the medical instruments discussed with respect to FIG. 5.
- the RMS 120 can be a computer-assisted system configured to perform a surgical or medical procedure or activity on a patient via or using or with the assistance of one or more robotic components or the medical instruments 112.
- the RMS 120 can include any number of manipulator arms for grasping, holding or manipulating various medical instruments 112 and performing computer-assisted medical tasks using the medical instruments 112 controlled by the manipulator arms.
- the data streams 158 can be generated by the RMS 120.
- sensor data associated with the data streams 158 can include images (e.g., video images) captured by a medical instrument 112, which can be sent to the visualization tool 114.
- the RMS 120 can include one or more input ports to receive direct or indirect connection of one or more auxiliary devices.
- the visualization tool 114 can be connected to the RMS 120 to receive the images from the medical instrument when the medical instrument is installed in the RMS 120 (e.g., on a manipulator arm for handling medical instruments 112).
- the data stream 158 can include data indicative of positioning and movement of the medical instruments 112 that can be captured or identified by data packets of a kinematics data.
- the visualization tool 114 can combine the data stream components from the data capture devices 110 and the medical instrument 112 into a single combined data stream 158 which can be indicated or presented on a display 116.
- the RMS 120 can provide the data streams 158 to the data processing system 130 periodically, continuously, or in real-time.
- Data packets can include a unit of data in a data stream 158.
- the data packets can include the actual information being sent and metadata, such as a source and a destination address, a port identifier or any other information for transmitting data.
- the data packets can include data (e.g., a payload) corresponding to an event (e.g., installation, uninstallation, engagement or setup of a medical instrument 112).
- the data packets can include data corresponding to sensor information (e.g., a video frame captured by a camera), or data on movement of a medical instrument 112.
- the data packets can be transmitted in the data streams 158 that can be separated or combined.
- Data packets can include one or more timestamps, which can indicate a particular time when particular events took place. Timestamps can include time indications expressed in any combination of nanoseconds, microseconds, milliseconds, seconds, hours, days, months or years. Timestamps can be included in the payload or metadata of data packets and can indicate the time when a data packet was generated, the time when the data packet was transmitted from the device that generated the data packet, the time when the data packet was received by another device (e.g., a system within the RMS 120, or another device on a network) or a time when the data packet is stored into a data repository 132.
- the data repository 132 can include one or more data files, data structures, arrays, values, or other information that facilitates operation of the data processing system 130.
- the data repository 132 can include one or more local or distributed databases and can include a database management system.
- the data repository 132 can include, maintain, or manage one or more data streams 158.
- the data streams 158 can include or be formed from one or more of a video stream, image stream, stream of sensor measurements, event stream, or kinematics stream.
- the data streams 158 can include data collected by one or more data capture devices 110, such as a set of 3D sensors from a variety of angles or vantage points with respect to the procedure activity (e.g., point or area of surgery).
- the data stream 158 can include any stream of data.
- the data streams 158 can include a video stream, including a series of video frames that can be organized into video fragments, such as video fragments of about 1, 2, 3, 4, 5, 10 or 15 seconds of a video. Each second of the video can include, for example, 30, 45, 60, 90 or 120 video frames.
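The relationship between fragment duration and frame rate described above is simple arithmetic; a minimal illustration using the example durations and frame rates from the text:

```python
def frames_in_fragment(fragment_seconds: float, fps: int) -> int:
    """Number of frames in a video fragment of the given duration."""
    return int(fragment_seconds * fps)

# Example fragment lengths and frame rates mentioned above.
assert frames_in_fragment(5, 30) == 150
assert frames_in_fragment(2, 60) == 120
assert frames_in_fragment(15, 120) == 1800
```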
- the data streams 158 can include an event stream which can include a stream of event data or information, such as packets, which identify or convey a state of the RMS 120 or an event that occurred in association with the RMS 120.
- data stream 158 can include any portion of system configuration 122, including information on operations on data streams 158, data on installation, uninstallation, calibration, set up, attachment, detachment or any other action performed by or on an RMS 120 with respect to the medical instruments 112.
- the data stream 158 can include data about an event, such as a state of the RMS 120 indicating whether the medical instrument 112 is calibrated, adjusted or includes a manipulator arm installed on the RMS 120.
- a data stream 158 representing event data (e.g., event data stream) can include data on whether an RMS 120 was fully functional (e.g., without errors) during the procedure. For example, when a medical instrument 112 is installed on a manipulator arm of the RMS 120, a signal or data packet(s) can be generated indicating that the medical instrument 112 has been installed on the manipulator arm of the RMS 120.
- the data stream 158 can include a stream of kinematics data which can refer to or include data associated with one or more of the manipulator arms or medical instruments 112 attached to the manipulator arms, such as arm locations or positioning.
- the data corresponding to the medical instruments 112 can be captured or detected by one or more displacement transducers, orientational sensors, positional sensors, or other types of sensors and devices to measure parameters or generate kinematics information.
- the kinematics data can include sensor data along with time stamps and an indication of the medical instrument 112 or type of medical instrument 112 associated with the data stream 158.
- the data repository 132 can store sensor data having video frames that can include one or more static images or frames extracted from a sequence of images of a video file.
- A video frame can represent a specific moment in time and can be identified by metadata including a timestamp.
- Video frames can display visual content of the video of a medical procedure being analyzed by the data processing system 130 (e.g., by the anatomy detector 146) to form a composite video along with performance metrics indicative of the performance of the surgeon performing the procedure.
- a video frame can depict a snapshot of the surgical task, illustrating a movement or usage of a medical instrument 112 such as a robotic arm manipulating a surgical tool within the patient's body.
- the data streams 158 corresponding to sensor data (e.g., videos), events, and kinematics can include related, corresponding or duplicate information that can be used for cross-data comparisons and verification that all three data sources are in agreement.
- the detection function can implement a check for consistency between diverse data types and data sources by mapping and comparing timestamps between different data types to verify that they consistently progress over time, such as in accordance with the expected flow and correlation of events, video stream details, and kinematics values.
- an installation of a medical instrument 112 can be recorded as a system event and provided in a data stream 158 of events data.
- the installed medical instrument 112 can show up in sensor data (e.g., in a video) and can be detected by the data processing system 130, which can include a computer vision model.
- Kinematics data can confirm movements of the medical instrument 112 according to the movements detected by the data processing system 130.
- the data processing system 130 can verify time synchronization across the three data sources (e.g., three data streams 158).
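One possible shape for the cross-stream consistency check described above — verifying that timestamps in the event, video, and kinematics streams progress monotonically and are roughly time-synchronized — is sketched below; the function name and tolerance value are illustrative assumptions, not the disclosed implementation:

```python
def timestamps_consistent(event_ts, video_ts, kinematics_ts, tolerance=0.5):
    """Hypothetical consistency check across the three data streams:
    each stream's timestamps must progress monotonically, and the
    streams' start times must agree to within `tolerance` seconds."""
    streams = (event_ts, video_ts, kinematics_ts)
    # Each stream must progress monotonically over time.
    for ts in streams:
        if any(b < a for a, b in zip(ts, ts[1:])):
            return False
    # Start times agreeing within the tolerance serves as a rough
    # time-synchronization check across the data sources.
    starts = [ts[0] for ts in streams]
    return max(starts) - min(starts) <= tolerance

events = [0.0, 1.2, 3.4]
video = [0.1, 0.2, 0.3, 0.4]
kinematics = [0.05, 0.1, 0.15]
assert timestamps_consistent(events, video, kinematics)
assert not timestamps_consistent([0.0, 2.0, 1.0], video, kinematics)  # time regression
```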
- the data processing system 130 can include any combination of hardware or software that perform one or more of the functions described herein.
- the data processing system 130 can include any combination of hardware and software for determining forces applied by a medical instrument during teleoperation.
- the data processing system 130 can include any computing device (e.g., a computing device that is the same as, or similar to, the computing device 600 of FIG. 6) and can include one or more servers, virtual machines, or can be part of or include a cloud computing environment.
- the data processing system 130 can be provided via a centralized computing device or via distributed computing components, such as multiple, logically grouped servers facilitating distributed computing techniques.
- the logical group of servers can be referred to as a data center, server farm or a machine farm.
- the servers, which can include virtual machines, can also be geographically dispersed.
- a data center or machine farm can be administered as a single entity, or the machine farm can include a plurality of machine farms.
- the servers within each machine farm can be heterogeneous - one or more of the servers or machines can operate according to one or more types of operating system platforms.
- the data processing system 130, or components thereof can include a physical or virtual computer system operatively coupled, or associated with, the medical environment 102.
- the data processing system 130, or components thereof, can be coupled to, or associated with, the medical environment 102 via a network 101, either directly or indirectly through an intermediate computing device or system.
- the network 101 can be any type or form of network.
- the geographical scope of the network can vary widely and can include a body area network (BAN), a personal area network (PAN), a local-area network (LAN) (e.g., Intranet), a metropolitan area network (MAN), a wide area network (WAN), or the Internet.
- the topology of the network 101 can assume any form such as point-to-point, bus, star, ring, mesh, tree, etc.
- the network 101 can utilize different techniques and layers or stacks of protocols, including, for example, the Ethernet protocol, the internet protocol suite (TCP/IP), the ATM (Asynchronous Transfer Mode) technique, the SONET (Synchronous Optical Networking) protocol, the SDH (Synchronous Digital Hierarchy) protocol, etc.
- the TCP/IP internet protocol suite can include application layer, transport layer, internet layer (including, e.g., IPv6), or the link layer.
- the network 101 can be a type of a broadcast network, a telecommunications network, a data communication network, a computer network, a Bluetooth network, or other types of wired and wireless networks.
- the data processing system 130 can be located at least partially at the location of the surgical facility associated with the medical environment 102 or remotely therefrom. Elements of the data processing system 130, or components thereof can be accessible via portable devices such as laptops, mobile devices, wearable smart devices, etc.
- the data processing system 130, or components thereof can include other or additional elements that can be considered desirable to have in performing the functions described herein.
- the data processing system 130, or components thereof, can include, or be associated with, one or more components or functionality of a computing device including, for example, one or more processors coupled with memory that can store instructions, data or commands for implementing the functionalities of the data processing system 130 discussed herein.
- the data processing system 130 can include one or more of a data collector 144, an anatomy detector 146, a force predictor 148, a metric predictor 150, a performance controller 152, or a data repository 132.
- the performance controller 152 can include a timer 154 or a user interface 156.
- the data processing system 130 can be communicatively coupled with one or more data processing systems 130.
- the data processing system 130 can be communicatively coupled with one or more other data processing systems 130 that operate in cooperation to perform one or more of the operations described herein.
- the data processing system 130 can be implemented by one or more components of the medical environment 102.
- the data processing system 130 can be implemented by one or more components of the RMS 120.
- the data processing system 130 can receive one or more data streams 158 that are described herein, and can monitor operation of the RMS 120 using the system configurations 122.
- One or more RMSs 120 can be communicatively coupled with the one or more data processing systems 130.
- the data repository 132 can be configured to receive, store, and provide the data streams 158 (e.g., one or more data packets associated with the data streams 158) before, during, or after a medical procedure.
- the data repository 132 stores data associated with one or more of a machine learning (ML) model 134, historical data 136 associated with one or more previously performed medical procedures involving the RMS 120, types 138 (e.g., one or more force types), thresholds 140 (e.g., thresholds representing force limits), or tables 142 (e.g., tables representing one or more sets of force limits).
- the data collector 144 can be implemented by the data processing system 130 or can be a device that is the same as, or similar to, the computing device 600 of FIG. 6.
- the data collector 144 can receive the data streams 158.
- the data collector 144 can receive the data streams via the network 101.
- the data collector 144 can receive the data streams 158 from the data processing system 130.
- the data collector 144 can receive the data streams 158 of a medical procedure performed with the RMS 120.
- the one or more packets associated with the data streams 158 can represent one or more images during a medical procedure.
- the one or more images can be captured or otherwise obtained by the visualization tool 114.
- the one or more images can represent one or more anatomical structures or one or more medical instruments as described herein.
- the data collector 144 can provide the data streams 158 (e.g., one or more packets of the data streams 158) to the anatomy detector 146, the force predictor 148, or the metric predictor 150.
- the anatomy detector 146 can be implemented by the data processing system 130 or can be a device that is the same as, or similar to, the computing device 600 of FIG. 6.
- the anatomy detector 146 can receive the data streams 158.
- the anatomy detector 146 can receive the data streams 158 from the data collector 144.
- the anatomy detector 146 can identify a type of an anatomical structure on which a medical procedure is performed.
- the anatomy detector 146 can identify the type of the anatomical structure on which a medical procedure is performed based at least on the data streams 158.
- the anatomy detector 146 can identify the type of the anatomical structure on which a medical procedure is performed based at least on the data streams 158 and an ML model.
- the anatomy detector 146 can provide the data streams 158 to the ML model to cause the ML model to provide an output, the output representing the type of the anatomical structure on which the medical procedure is performed.
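As a rough illustration of the inference step described above, a classification model's output can be mapped to an anatomical-structure type; the label set and toy model below are hypothetical stand-ins for a trained computer-vision model:

```python
# Hypothetical label set; a real anatomy detector would use labels
# learned by a trained machine-learning model.
ANATOMY_LABELS = ["liver", "gallbladder", "abdominal_wall"]

def classify_frame(model, frame):
    """Run the model on one video frame and return the predicted
    anatomical-structure type (the model output described above)."""
    scores = model(frame)  # one score per label
    best = max(range(len(scores)), key=scores.__getitem__)
    return ANATOMY_LABELS[best]

# Toy model for illustration: always favors the second label.
toy_model = lambda frame: [0.1, 0.8, 0.1]
assert classify_frame(toy_model, frame=None) == "gallbladder"
```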
- the anatomy detector 146 can provide data associated with the type of the anatomical structure to the force predictor 148 or the performance controller 152.
- the force predictor 148 can be implemented by the data processing system 130 or can be a device that is the same as, or similar to, the computing device 600 of FIG. 6.
- the force predictor 148 can receive the data streams 158.
- the force predictor 148 can receive the data streams 158 from the data collector 144.
- the force predictor 148 can receive data associated with the type of the anatomical structure from the anatomy detector 146, or the force predictor 148 can receive data associated with the type of the interaction from the metric predictor 150.
- the force predictor 148 can determine an amount of force applied to the anatomical structure.
- the force predictor 148 can determine the amount of force applied to the anatomical structure based at least on the type of the anatomical structure or the type of the interaction with the anatomical structure.
- the force predictor 148 can provide data associated with the amount of force applied to the anatomical structure to the performance controller 152.
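The force determination based on structure type and interaction type could, for example, be paired with a lookup against stored force limits (analogous to the thresholds 140 and tables 142 described above); the tissue names and limit values below are invented for illustration:

```python
# Hypothetical table of force limits (newtons) keyed by anatomical
# structure type and interaction type, analogous to the tables 142
# of force limits stored in the data repository.
FORCE_LIMITS = {
    ("liver", "grasp"): 2.0,
    ("liver", "retract"): 4.0,
    ("bowel", "grasp"): 1.0,
}

def exceeds_limit(structure: str, interaction: str, applied_force: float) -> bool:
    """Return True when the applied force exceeds the stored limit
    for this structure/interaction pair."""
    limit = FORCE_LIMITS.get((structure, interaction))
    if limit is None:
        return False  # no limit configured for this pair
    return applied_force > limit

assert exceeds_limit("liver", "grasp", 2.5)
assert not exceeds_limit("liver", "retract", 3.0)
```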
- the metric predictor 150 can be implemented by the data processing system 130 or can be a device that is the same as, or similar to, the computing device 600 of FIG. 6.
- the metric predictor 150 can receive the data streams 158.
- the metric predictor 150 can receive the data streams 158 from the data collector 144.
- the metric predictor 150 can identify a type of an interaction involving an anatomical structure.
- the metric predictor 150 can identify the type of the interaction involving the anatomical structure based at least on the data streams 158.
- the metric predictor 150 can provide data associated with the type of the interaction to the force predictor 148 or the performance controller 152.
- the anatomy detector 146 can identify the type of the anatomical structure on which a medical procedure is performed and provide data associated with the type of anatomical structure to the metric predictor 150.
- the performance controller 152 can be implemented by the data processing system 130 or can be a device that is the same as, or similar to, the computing device 600 of FIG. 6.
- the performance controller 152 can receive the data associated with the amount of force applied to the anatomical structure from the force predictor 148.
- the performance controller 152 can receive data associated with one or more metrics as described herein from the metric predictor 150.
- the performance controller 152 can determine an indication of one or more metrics as described herein to control performance of the medical procedure with the robotic medical system.
- the performance controller 152 can determine a user interface 156. In examples, the performance controller 152 determines the user interface 156 periodically (e.g., every 1 second, every 2 seconds, etc.). In some examples, the performance controller 152 determines the user interface 156 continuously. In some embodiments, the performance controller 152 provides an indication of a metric, where the indication represents or is represented by the user interface 156. In some embodiments, the performance controller 152 can provide data associated with the indication of the metric to cause a device to display the indication of the amount of force. For example, the performance controller 152 can provide the data associated with the indication of the metric to cause display 116 to display the indication of the metric. In this example, the data associated with the indication of the metric can be configured to cause the display 116 to display the indication.
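The periodic user-interface determination described above might be sketched as a timer-driven loop that repeatedly reads the current metric and pushes an indication to a display; the function and its parameters are illustrative assumptions, not the disclosed implementation:

```python
import time

def run_indication_loop(get_metric, display, period_s=1.0, iterations=3):
    """Hypothetical periodic loop: every `period_s` seconds, read the
    current metric (e.g., applied force) and push an indication to the
    display, mirroring the periodic user-interface updates above."""
    shown = []
    for _ in range(iterations):
        metric = get_metric()
        display(metric)
        shown.append(metric)
        time.sleep(period_s)
    return shown

readings = iter([1.2, 1.5, 0.9])
shown = run_indication_loop(lambda: next(readings), display=lambda m: None,
                            period_s=0.0, iterations=3)
assert shown == [1.2, 1.5, 0.9]
```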
- the data repository 132 can be implemented by the data processing system 130 or can be a device that is the same as, or similar to, the computing device 600 of FIG. 6.
- the data repository can receive data from any of the devices of FIG. 1A either directly or indirectly (e.g., via the data processing system 130).
- the data can include the ML model 134, the historical data 136, the types 138, the thresholds 140, or the tables 142.
- the data stored by the data repository 132 is associated with a previously-performed medical procedure.
- the data stored by the data repository 132 is associated with a current medical procedure.
- the data repository 132 can receive the data streams 158 or the system configurations 122 and store the data streams 158 or the system configurations 122. The data repository 132 can then provide the data streams 158 or the system configurations 122 (e.g., one or more data packets thereof) to the one or more of the components of the data processing system 130.
- FIG. 1B is a schematic block diagram illustrating an example environment 160 in which devices, systems, methods, or products described herein can be implemented, according to some embodiments.
- the environment 160 includes a metric prediction system 162, a sensing system 164, and a user input system 168.
- the metric prediction system 162 is the same as, or similar to, the data processing system 130 of FIG. 1A.
- the sensing system 164 is the same as, or similar to, one or more data capture devices 110 of FIG. 1A.
- the metric prediction system 162 can receive video data 166 from the sensing system 164. Additionally, the metric prediction system 162 can receive robotic system data 170, and medical instrument data 172. In examples, the metric prediction system 162 can receive the video data 166, the robotic system data 170, or the medical instrument data 172 as part of a data stream.
- the data stream can be received from a robotic medical system that is the same as, or similar to, the robotic medical system 120 of FIG. 1A.
- the data stream (e.g., one or more packets included in the data stream) can be the same as, or similar to, the data streams 158 of FIG. 1A.
- the metric prediction system 162 can also communicate (e.g., establish communication connections to exchange data) with the user input system 168.
- the metric prediction system 162, the sensing system 164, or the user input system 168 can include or be implemented by one or more suitable computing systems, such as the computing device 600 of FIG. 6.
- metric prediction system 162, sensing system 164, or user input system 168 can include one or more components that are the same as, or similar to, one or more of the components of the computing device 600.
- the metric prediction system 162, sensing system 164, or user input system 168 can be configured to communicate (e.g., to establish communication connections to exchange data).
- the system 100 can include one or more devices or systems that are the same as, or similar to, one or more devices or systems discussed with respect to example medical environment 500 of FIG. 5.
- the processes described herein can be implemented by the metric prediction system 162.
- Some or all of the processes implemented by the metric prediction system 162 can be implemented by one or more other devices (alone or in cooperation with the metric prediction system 162) such as, for example, the sensing system 164 or the user input system 168, which can be the same as, or similar to, the computing device 600 of FIG. 6. While the metric prediction system 162 is illustrated as a separate system from the user input system 168, in examples, the metric prediction system 162 can be included in (e.g., implemented by) the user input system 168. Accordingly, one or more of the functions described herein as being performed by the metric prediction system 162 can similarly be performed by the user input system 168.
- the metric prediction system 162 can be the same as, or similar to, the computing device 600 of FIG. 6.
- the user input system 168 can be the same as, or similar to, the user control system 510 of FIG. 5 or the computing device 600 of FIG. 6.
- the sensing system 164 can be the same as, or similar to, the computing device 600 of FIG. 6.
- the metric prediction system 162 can receive the robotic system data 170 from the sensing system 164, where the sensing system includes a device that is the same as, or similar to, one or more medical instruments supported by manipulator arms (e.g., manipulator arms that are the same as, or similar to, manipulator arms 535A-535D of FIG. 5).
- an imaging device (e.g., an endoscope, an ultrasound tool, etc.)
- a sensing instrument (e.g., a force-sensing surgical instrument)
- the imaging device or the sensing instrument are associated with (e.g., included in or implemented by) the sensing system 164.
- a medical procedure refers to a surgical procedure or operation performed in a medical environment (e.g., a medical or surgical theater, etc. that is the same as, or similar to, the medical environment 500 of FIG. 5) by or using one or more of a medical staff, a robotic system, or a medical instrument.
- examples of medical staff include surgeons, nurses, support staff, and so on (e.g., individuals that can be the same as, or similar to, surgeon 530A or additional medical personnel 530B-530D of FIG. 5).
- examples of robotic systems include the robotic medical system or the robotic surgical system described herein, such as one or more devices of the medical environment 500 (e.g., robotic medical system 524).
- Examples of medical instruments include the medical instruments supported by the manipulator arms 535A-535D.
- Medical procedures can have various modalities, including robotic (e.g., using at least one robotic system), non-robotic laparoscopic, non-robotic open, and so on.
- the robotic system data 170 and medical instrument data 172 collected during a medical procedure also refer to, or include, robotic system data 170 and medical instrument data 172 collected by one or more devices in a medical environment (e.g., a medical environment 500) in which the medical procedure is performed, and for one or more of the medical staff, robotic system, or medical instruments performing or used in performing the medical procedure.
- the metric prediction system 162 can receive and process data sources or data streams including one or more of video data 166, robotic system data 170, and medical instrument data 172 collected for a training procedure or a medical procedure. For example, the metric prediction system 162 can acquire data streams of the video data 166, robotic system data 170, and medical instrument data 172 in real-time. In some examples, the metric prediction system 162 can utilize all types of robotic system data 170, and medical instrument data 172 collected, obtained, determined, or calculated for a medical procedure when generating one or more user interfaces (UIs) as described herein.
- the metric prediction system 162 receives the video data 166, the robotic system data 170, or the medical instrument data 172 during operation of the robotic system.
- the metric prediction system 162 can receive the video data 166 from the sensing system 164 during operation of the robotic system.
- the video data 166 can be associated with one or more images captured individually or continuously by the imaging device included in the sensing system 164.
- the imaging device includes a visual image endoscope, laparoscopic ultrasound, camera, etc. Other suitable imaging devices are also contemplated.
- the sensing system 164 includes a repositionable assembly including one or more linkages supported by the robotic system.
- the sensing system 164 can include a repositionable assembly including one or more linkages that can be articulated by the robotic system based at least on the input provided by the surgeons via the user input system 168 described herein.
- the metric prediction system 162 can receive the robotic system data 170 from a robotic system (e.g., from one or more components of a robotic system).
- the robotic system data 170 includes a system event stream, the system event stream further including data associated with one or more system events (e.g., states of one or more devices such as whether one or more devices or medical instruments are connected to the robotic system, whether the one or more devices are operating as expected, error messages, or the like).
- the robotic system can include one or more devices or components of a robotic medical system (e.g., a robotic medical system that is the same as, or similar to, the robotic medical system 524 of FIG. 5).
- the metric prediction system 162 can receive the medical instrument data 172 where the medical instrument data 172 is associated with a state of the one or more of the tools supported by the robotic medical system.
- the robotic system can include a user input system 168 (e.g., a first surgeon console that is the same as, or similar to, the user control system 510 of FIG. 5).
- the one or more images captured by the imaging device included in the sensing system 164 can show at least a portion of at least one medical instrument (tools, surgical instruments, or the like) within a field of view of the imaging device.
- the sensing system 164 can include an imaging device that is supported along a distal portion of a tool (e.g., a tool that is supported by (e.g., installed on) a robotic medical system 524).
- the sensing system 164 can be operated by medical staff during a training session where the medical staff are familiarizing themselves with the robotic system or practicing certain maneuvers using the robotic system.
- the sensing system 164 can be operated by medical staff during a surgery where a surgeon is operating the user input system 168.
- the sensing system 164 can be operated such that the imaging device of the sensing system 164 is positioned to capture and generate images of at least a distal portion of at least one medical instrument in a field of view of the imaging device included in the sensing system 164, where the field of view is directed to at least a portion of tissue of a patient.
- the images can be captured as the one or more medical instruments are controlled by the robotic system based at least on input received by the user input system 168.
- the robotic system data 170 can be associated with the state of the control of one or more devices of the robotic system based at least on inputs received by the user input system 168. For example, as the user input system 168 communicates with the robotic system to control the at least one medical instrument, the robotic system can generate and provide the robotic system data 170 to the metric prediction system 162.
- the robotic system data 170 can represent whether the user input system 168 is controlling one or more of the medical instruments, whether the user input system 168 is generating control signals that are configured to cause the maneuvering of one or more medical instruments within the field of view of the sensing system, the torque being applied at one or more joints involved in supporting one or more of the medical instruments involved, or the like.
- the robotic system data 170 can be associated with force exerted by the robotic system on one or more anatomical structures.
- the robotic system data 170 can be generated by the robotic system based at least on movement of one or more linkages or one or more components of the medical instruments of the robotic system.
- one or more sensors corresponding to the one or more linkages can generate sensor signals representative of the force exerted by the linkages when repositioning the medical instrument.
- the sensor signals can be included in the robotic system data 170 which, in turn, is included in the data stream.
- One or more different sensors (e.g., encoders or the like) can generate sensor data that can later be used to derive the position of medical instruments supported by the linkages.
- the medical instrument data 172 can be associated with one or more states of one or more medical instruments of the robotic system.
- the medical instrument data 172 can be associated with one or more states of one or more medical instruments controlled during teleoperation of the robotic system by the user input system 168.
- the one or more states can represent whether or not the one or more medical instruments of the robotic system are performing one or more functions.
- functions can include tool activations, movement of medical instruments, or the like as described herein.
- the one or more states can represent whether or not one or more medical instruments are being controlled by the robotic system based at least on inputs received by the user input system 168.
- the metric prediction system 162 receives the data stream of the medical procedure.
- the metric prediction system 162 can receive the data stream comprising data associated with force vectors that represent directions and magnitudes of force interactions between medical instruments and anatomical structures involved in the medical procedure.
- the metric prediction system 162 can receive the data stream comprising data associated with force vectors, where the force vectors include grasp force vectors.
- the metric prediction system 162 can receive the data stream comprising data associated with grasp force vectors representing directions and magnitudes of force interactions between portions of instruments and anatomical structures grasped by the portions of the instruments during the medical procedure.
- the metric prediction system 162 can receive the data associated with force vectors at a plurality of points in time during the medical procedure.
- the plurality of points in time can be instantaneous points in time (e.g., the procedure can be occurring in real time).
- the metric prediction system 162 can receive the data associated with force vectors at a plurality of points in time prior to the instantaneous point in time. For example, as the metric prediction system 162 determines a cumulative force associated with an interaction between one or more medical instruments and one or more anatomical structures, the metric prediction system 162 can receive the data associated with force vectors at a plurality of points in time prior to the instantaneous time step.
- the data can be generated during the current medical procedure or a previous medical procedure (e.g., a previous medical procedure associated with the patient or the surgeon).
- the metric prediction system 162 determines the amount of force applied or to be applied to anatomical structures based at least in part on the force vectors captured at the plurality of points in time as the medical instruments interact with the anatomical structures involved in the medical procedure.
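A cumulative force over a plurality of points in time could, under a simple assumption of uniformly sampled force vectors, be approximated by integrating the vectors' magnitudes; this sketch is an illustrative assumption, not the disclosed method:

```python
import math

def cumulative_force(force_vectors, dt):
    """Integrate force-vector magnitudes sampled at interval `dt`
    (seconds) to approximate the cumulative force-time exposure of an
    anatomical structure during an interaction."""
    magnitudes = [math.sqrt(fx * fx + fy * fy + fz * fz)
                  for fx, fy, fz in force_vectors]
    return sum(m * dt for m in magnitudes)

# Three samples 0.1 s apart: magnitudes 3.0, 4.0, and 5.0 (a 3-4-5 triangle).
samples = [(3.0, 0.0, 0.0), (0.0, 4.0, 0.0), (3.0, 4.0, 0.0)]
total = cumulative_force(samples, dt=0.1)
assert abs(total - 1.2) < 1e-9  # (3 + 4 + 5) * 0.1
```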
- the metric prediction system 162 receives the data stream of the medical procedure, the data stream comprising data associated with a skill level of the one or more surgeons involved in the medical procedure.
- the metric prediction system 162 can receive the data associated with the skill level of the one or more surgeons involved in the medical procedure based at least in part on the one or more interactions between the medical instruments of the robotic system and anatomical structures involved in the medical procedure.
- the skill level can represent, for example, an amount of previous interactions involving similar medical instruments and anatomical structures during previous medical procedures conducted by the surgeons, scores representing patient outcomes specific to the interactions involving similar medical instruments and anatomical structures during previous medical procedures conducted by the surgeons, force vectors associated with previously-performed interactions involving the surgeon, or other historical information that can be used to determine a force limit.
- the metric prediction system 162 can receive the data stream of the medical procedure, where the data stream includes robotic system data 170 that is associated with kinematic information or system event information corresponding to the operation of the robotic system.
- the metric prediction system 162 receives the data stream of the medical procedure, where the data stream includes data associated with one or more aspects of the medical procedure (e.g., a type of medical procedure, a complexity level associated with the medical procedure, a segment of the medical procedure associated with a phase, task, or step, or the like).
- the metric prediction system 162 receives patient data associated with information about the patient such as the patient's age, demographics, whether the patient has a compromised immune system (or is sick at the time of the medical procedure), the tensile strength of one or more of the anatomical structures involved in the medical procedure, or any other such information.
- the metric prediction system 162 can identify a type of an anatomical structure involved in the medical procedure.
- the metric prediction system 162 can identify a type of an anatomical structure involved in the medical procedure based at least in part on the data stream (e.g., one or more aspects of the data represented by the data stream).
- the metric prediction system 162 can identify the type of an anatomical structure involved based at least on the type of medical procedure.
- the metric prediction system 162 can identify the type of the anatomical structure based on an image interpretation technique using one or more models trained with machine learning. For example, where a medical procedure involves addressing a hernia in an abdomen of a patient, the metric prediction system 162 can determine the type of the anatomical structure involved based at least in part on the one or more anatomical structures to which access can be gained by the robotic system during the medical procedure.
- the metric prediction system 162 establishes a reference frame.
- the metric prediction system 162 can establish a reference frame based at least on one or more interactions involved in the medical procedure.
- the reference frame can include, for example, portions of the data stream (e.g., images of the video data 166) associated with a given interaction.
- the metric prediction system 162 establishes a reference frame such that the reference frame corresponds to (e.g., focuses on) one or more interactions.
- the metric prediction system 162 identifies the type of the anatomical structure involved based at least in part on one or more models trained with machine learning. For example, the metric prediction system 162 can receive the video data 166 from the sensing system 164 during the medical procedure. In this example, the metric prediction system 162 can provide the video data 166 to the one or more models to cause the one or more models to generate an output. The output can represent one or more classifications, the one or more classifications corresponding to identifiers of the one or more anatomical structures represented by the video data 166. In some embodiments, the one or more classifications can be made on a frame or a pixel basis.
- the one or more classifications can be made based at least in part on one or more groups of pixels.
- the one or more classifications can be associated with one or more bounding boxes or segmentation masks generated by the one or more models that correspond to groups of pixels representing the one or more anatomical structures.
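Purely for illustration, grouping classified pixels into per-structure bounding boxes as described above might be sketched as follows; the label set, mask contents, and function name are hypothetical and not part of the disclosure:

```python
import numpy as np

# Hypothetical class ids; a real label set would come from the trained models.
LABELS = {0: "background", 1: "liver", 2: "intestine"}

def boxes_from_mask(mask: np.ndarray) -> dict:
    """Derive one bounding box per anatomical-structure class from a
    per-pixel segmentation mask (a group of pixels per classification)."""
    boxes = {}
    for cls, name in LABELS.items():
        if cls == 0:
            continue  # skip background pixels
        ys, xs = np.nonzero(mask == cls)
        if ys.size:
            boxes[name] = (xs.min(), ys.min(), xs.max(), ys.max())
    return boxes

mask = np.zeros((8, 8), dtype=int)
mask[2:5, 1:4] = 1   # pixels classified as "liver"
mask[6:8, 5:8] = 2   # pixels classified as "intestine"
print(boxes_from_mask(mask))  # {'liver': (1, 2, 3, 4), 'intestine': (5, 6, 7, 7)}
```

A segmentation model would emit the mask itself; this sketch only shows the pixel-group-to-bounding-box step.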
- the metric prediction system 162 identifies the type of the anatomical structure on which the medical procedure is performed and a type of an interaction with the anatomical structure using the one or more models trained with machine learning.
- the metric prediction system 162 can receive the video data 166 from the sensing system 164 during the medical procedure and the metric prediction system 162 can provide the video data 166 to the one or more models to cause the one or more models to generate an output.
- the output can represent one or more classifications corresponding to identifiers of the one or more anatomical structures or the one or more medical instruments as represented by the video data 166.
- the one or more classifications can correspond to identifiers of the one or more interactions represented by the video data 166 between the one or more anatomical structures or the one or more medical instruments.
- one or more images associated with the video data 166 can be provided to the one or more models to cause the one or more models to generate an output, the output representing the movement of the anatomical structure by one or more medical instruments.
- the output can be further represented as an indication of the type of the interaction, where the type includes one or more of a grab interaction (e.g., grabbing at least a portion of an anatomical structure using jaws of an end effector supported by a medical instrument), a retract interaction (e.g., holding back or separating tissue associated with the one or more anatomical structures), a cut interaction (e.g., cutting at least a portion of an anatomical structure), or a cauterize interaction (e.g., burning tissue associated with the one or more anatomical structures using, for example, electrocautery systems, chemical cauterization systems, or the like).
- the metric prediction system 162 can provide robotic system data 170 or medical instrument data 172 to the one or more models to cause the one or more models to identify the type of the anatomical structure on which the medical procedure is performed, and a type of an interaction with the anatomical structure.
- the one or more models can be trained on training data that includes the video data 166 from prior medical procedures, the robotic system data 170 from prior medical procedures, or the medical instrument data 172 from prior medical procedures.
- the metric prediction system 162 can provide the data from the data stream received during the surgical procedure to the one or more models to cause the one or more models to generate the outputs discussed above.
- the metric prediction system 162 can provide previously- generated video data 166 from prior medical procedures, the robotic system data 170 from prior medical procedures, or the medical instrument data 172 from prior medical procedures to the one or more models to cause the one or more models to generate the outputs described herein.
- the metric prediction system 162 can provide previously-generated video data 166 from prior medical procedures, the robotic system data 170 from prior medical procedures, or the medical instrument data 172 from prior medical procedures, where the previously-generated data corresponds to one or more earlier points in time and one or more tags.
- the one or more tags can be determined based at least on inputs received from individuals annotating the previously generated data.
- the annotations can correspond to, for example, the types of anatomical structures involved at a given point in time, the locations of the anatomical structures involved at the given point in time, the medical instruments involved in interactions at the given point in time, or the type of interaction involved at the given point in time.
- the metric prediction system 162 can compare the output of the one or more models to the tags corresponding to the input of the one or more models (e.g., compare the classifications generated by the one or more models to the tags associated with the inputs) and determine a difference between the output and the expected output.
- the metric prediction system 162 can update the one or more models by changing one or more of the weights associated with the one or more models and repeat the training process until the one or more models converge.
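The compare-and-update training loop described above can be sketched, purely for illustration, with a toy logistic classifier standing in for the one or more models; the features, annotator tags, learning rate, and convergence tolerance below are all assumptions:

```python
import numpy as np

# Toy stand-in: feature rows extracted from prior-procedure data, with
# tags supplied by annotators (1 = structure present, 0 = absent).
X = np.array([[ 2.0,  1.0,  0.3], [ 1.5, -0.2, -0.8], [ 0.9,  2.0,  0.1],
              [-1.8, -0.5,  0.6], [-1.1,  0.4, -0.4], [-2.2, -1.6,  0.2]])
tags = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])

w = np.zeros(3)                                 # model weights
for step in range(500):
    pred = 1.0 / (1.0 + np.exp(-(X @ w)))       # model output
    diff = pred - tags                          # output vs. annotated tag
    grad = X.T @ diff / len(tags)
    w -= 0.5 * grad                             # change the weights
    if np.abs(grad).max() < 1e-4:               # repeat until convergence
        break

accuracy = np.mean((pred > 0.5) == tags)
print(round(float(accuracy), 2))
```

The real models would be far larger, but the loop structure (generate output, compare to tags, adjust weights, repeat until convergence) is the same.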
- the metric prediction system 162 determines one or more force vectors based at least on the data stream of the medical procedure. For example, the metric prediction system 162 can determine the one or more force vectors where the one or more force vectors represent direction and magnitude associated with operation of a medical instrument during an interaction with (e.g., contact with) an anatomical structure. In some embodiments, where the medical instrument comes into contact with an anatomical structure, the metric prediction system 162 can determine the force applied to the anatomical structure based at least in part on the force and direction of the medical instrument at a point in time or at multiple points in time.
- the metric prediction system 162 determines the one or more force vectors based at least on a context in which one or more interactions occur. For example, the metric prediction system 162 can determine one or more anatomical structures and one or more interactions involving medical instruments and the one or more anatomical structures based at least on the data stream as described herein. The metric prediction system 162 can then determine one or more of the force vectors based at least on the context associated with the interactions between the medical instruments and the one or more anatomical structures. For example, where the interaction involves the medical instruments moving the anatomical structures, the metric prediction system 162 can determine that the movement is associated with a predetermined context and the metric prediction system 162 can determine the force vectors based at least on the context. In examples, the metric prediction system 162 can update the aspects of the force vector (e.g., increase or decrease the force or direction as measured, etc.) based at least on the context.
- the metric prediction system 162 determines the magnitude of the force interaction between the anatomical structure and the medical instrument involved in the medical procedure based at least on the data stream. For example, the metric prediction system 162 can determine the magnitude of the force interaction based at least in part on the robotic system data 170. In such an example, the metric prediction system 162 can determine the magnitude of the force based at least in part on the sensor signals (e.g., torque measurements as described herein) that are representative of the force exerted by the linkages when repositioning the medical instrument or when remaining in contact with the anatomical structures (e.g., when grabbing, moving, or holding the anatomical structures).
- different sensors can generate sensor data indicative of a position of the linkages relative to one another or the robotic system.
- the sensor data can be included in the data stream.
- the sensor data can later be used to derive the position of medical instruments supported by the linkages. For example, when the position or pose of the robotic system (e.g., the one or more components of the robotic system) is registered relative to a patient, the sensor data indicative of a position of the linkages relative to one another and the robotic system can be used to determine the relative position of the medical instruments and the anatomical structures of the patient.
- the metric prediction system 162 determines an amount of force applied to an anatomical structure. For example, the metric prediction system 162 can determine an amount of force applied to an anatomical structure based at least on the force vectors associated with a given interaction.
- the one or more sensor signals generated by one or more torque sensors can be used to determine the amount of force.
- the one or more sensor signals can be generated by one or more torque sensors associated with (e.g., installed or included in) one or more manipulator arms (e.g., one or more manipulator arms that are the same as, or similar to, the manipulator arms 535A-535D of FIG. 5) or linkages of the one or more manipulator arms.
- the torque sensors are configured to measure torque at one or more joints of the one or more manipulator arms or the one or more linkages.
- the metric prediction system 162 can receive the sensor signals (via the data stream) and analyze the sensor signals to determine an amount of force exerted on the anatomical structure based at least on the torque measured at each joint.
- the metric prediction system 162 can account for (e.g., subtract) the torque needed to move the one or more manipulator arms, linkages, or medical instrument associated with inertia or the weight of the components to determine the actual torque applied to the anatomical structure during the interaction. In this way, the metric prediction system 162 can determine the amount of force exerted (e.g., by a medical instrument supported by a surgical robot) on a given anatomical structure during a medical procedure.
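The subtraction described above can be sketched as follows, under the standard manipulator relation τ = Jᵀf (so f = pinv(Jᵀ)·τ_ext once the arm's own gravity/inertia torque is removed); the two-joint planar arm, its Jacobian, and the torque values are illustrative assumptions, not disclosed values:

```python
import numpy as np

def tip_force(jacobian, tau_measured, tau_dynamics):
    """Estimate the force at the instrument tip from joint torques.
    tau_dynamics is the torque needed just to hold/move the arm itself
    (gravity + inertia); subtracting it isolates the interaction torque.
    Uses the manipulator relation tau = J^T f, so f = pinv(J^T) @ tau_ext."""
    tau_ext = np.asarray(tau_measured) - np.asarray(tau_dynamics)
    return np.linalg.pinv(np.asarray(jacobian).T) @ tau_ext

# Illustrative 2-joint planar arm (1 m links) at joint angles 0 and 90 deg.
J = np.array([[-1.0, -1.0],
              [ 1.0,  0.0]])  # hypothetical Jacobian at this pose
f = tip_force(J, tau_measured=[1.5, 0.5], tau_dynamics=[0.5, 0.5])
print(np.round(f, 2))
```

Here the arm-motion torque of 0.5 N·m at each joint is subtracted before the torque-to-force mapping, mirroring the accounting step in the text.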
- the metric prediction system 162 can determine the amount of force applied to the anatomical structure based at least on one or more force vectors. For example, the metric prediction system 162 can determine the one or more force vectors representing direction and magnitude associated with operation of medical instruments during the medical procedure. In some embodiments, the amount of force applied is associated with (e.g., corresponds to) one or more interactions between medical instruments and anatomical structures involved in the surgical procedure. In some embodiments, the metric prediction system 162 determines the one or more force vectors based at least in part on the images generated during the medical procedure and the torque measured at the joints by the torque sensors.
- the metric prediction system 162 can determine the amount of force applied to the anatomical structure based at least on one or more forces not exerted by the medical instrument. For example, the metric prediction system 162 can determine the amount of force applied based at least on the metric prediction system 162 determining force vectors and the application of other forces (e.g., gravity) during an interaction. In one illustrative example, where the exterior of a jaw of a stapler is being used to hold a portion of the intestines of the patient or move an organ, the metric prediction system 162 can first determine the force applied to the portion of the intestines or to the organ based at least on the force vector associated with the interaction.
- the metric prediction system 162 can then update the force applied to the intestines and the organ based at least on the effect of gravity on the intestines and organ during the interaction. In this way, the metric prediction system 162 can determine an amount of force that is applied to an anatomical structure (e.g., the intestines) which may move in a fluid manner based at least on the tensile strength of the anatomical structure even in cases where the interaction between the medical instrument and the anatomical structures is relatively small as represented by the measured torque involved in the interaction.
- the metric prediction system 162 can determine the amount of force applied to the anatomical structure based at least on properties of the anatomical structure involved in an interaction. For example, the metric prediction system 162 can determine that an anatomical structure involved in an interaction is associated with a predetermined tensile strength. The metric prediction system 162 can then determine the amount of force applied based at least on the metric prediction system 162 determining a force vector for the interaction and an expected amount of resistance corresponding to the tensile strength associated with the anatomical structure. In this way, the metric prediction system 162 can more accurately determine an amount of force applied to different types of anatomical structures based at least on the expected resistance of the anatomical structures.
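A minimal sketch of updating a measured force for gravity and for the expected resistance of the structure, in the spirit of the two bullets above; the tissue masses, stiffness factors, and function names are invented placeholders, not clinical values:

```python
# Hypothetical tissue properties; real values would come from patient data
# or clinical references, not from this sketch.
TISSUE = {
    "intestine": {"mass_kg": 0.4, "stiffness": 0.6},
    "liver":     {"mass_kg": 1.5, "stiffness": 1.0},
}
G = 9.81  # gravitational acceleration, m/s^2

def effective_force(structure: str, instrument_force_n: float,
                    supporting: bool = True) -> float:
    """Update a measured instrument force for gravity and for the expected
    resistance (a stand-in for tensile strength) of the structure."""
    props = TISSUE[structure]
    f = instrument_force_n
    if supporting:                      # instrument bears part of the weight
        f += props["mass_kg"] * G
    return f / props["stiffness"]       # stiffer tissue offers more resistance

print(round(effective_force("intestine", 0.5), 2))  # 7.37
```

Even a small measured force on a held intestine yields a larger effective load once its weight and low resistance are accounted for, matching the intuition in the text.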
- the metric prediction system 162 can determine a context for an interaction. For example, the metric prediction system 162 can determine the context for the interaction based at least on a force vector associated with the interaction. In one illustrative example, where a context is associated with normal operation (e.g., the performance of expected movements) the metric prediction system 162 can receive the video data 166, the robotic system data 170, or the medical instrument data 172 and analyze the data to confirm that the context is associated with normal operation.
- the metric prediction system 162 may analyze the data and determine that the data indicates that a percentage of the force exerted throughout the interaction was less than a threshold amount (e.g., less than 2 Newtons) and that a different percentage of the force exerted was at or greater than the threshold amount (e.g., at or greater than 2 Newtons). The metric prediction system 162 can then determine that operation of the medical instrument at or above the threshold amount was anticipated (e.g., the force was applied when cutting, stapling, etc.) for the given interaction. The metric prediction system 162 can determine or update the anticipated amount based at least on the metric prediction system 162 analyzing similar medical procedures involving similar anatomical structures (e.g., historical procedures).
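The percentage-above-threshold analysis described above can be sketched as follows; the 2 N threshold matches the example in the text, while the sample values and function name are illustrative assumptions:

```python
def force_breakdown(samples_n, threshold_n=2.0):
    """Split an interaction's force samples into the share below the
    threshold and the share at or above it (e.g., a 2 N threshold)."""
    above = sum(1 for f in samples_n if f >= threshold_n)
    n = len(samples_n)
    return {"below_pct": 100 * (n - above) / n,
            "at_or_above_pct": 100 * above / n}

samples = [0.4, 0.8, 1.1, 2.5, 3.0]   # forces in newtons over the interaction
print(force_breakdown(samples))        # {'below_pct': 60.0, 'at_or_above_pct': 40.0}
```

The system would then check whether the at-or-above share matches what is anticipated for the interaction (e.g., the cutting or stapling moments).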
- the metric prediction system 162 determines a force signature or set of force signatures for one or more of an interaction (e.g., an interaction type) or a surgeon. For example, the metric prediction system 162 can determine a force signature based at least on one or more force vectors associated with a medical instrument involved in interactions with anatomical structures. The metric prediction system 162 can determine the force signature based at least on force vectors associated with similar interactions across a plurality of surgical procedures.
- the metric prediction system 162 determines one or more stacked force signatures (e.g., based at least on a combination of one or more distinct force signatures). For example, the metric prediction system 162 can determine the one or more stacked force signatures based at least on one or more force signatures involved in an interaction between a medical instrument and an anatomical feature. In some embodiments, the force signatures or stacked force signatures may be associated with a surgeon. In some embodiments, the one or more force signatures or the one or more stacked force signatures may be associated with a medical procedure (e.g., a type of medical procedure).
- the metric prediction system 162 trains a model to determine the one or more force signatures described herein.
- the metric prediction system 162 can provide data associated with force signatures determined by the metric prediction system 162 or data associated with a given surgical procedure (e.g., data included in a data stream) as input to the model to cause the model to generate an output.
- the metric prediction system 162 may then compare the output to an expected output (e.g., corresponding to the force signature, one or more forces applied in association with the force signature, etc.) to determine a difference between the output and the expected output.
- the metric prediction system 162 may then update one or more weights of the model and repeat this process iteratively until the model converges.
- the metric prediction system 162 compares one or more force signatures to determine one or more instances where interactions involved contact between medical instruments and anatomical structures that were not expected during the interactions.
- where a force signature represents an average amount of force applied by a medical instrument when moving an anatomical structure, and other amounts of force for other force signatures (e.g., force signatures associated with surgeons moving similar anatomical structures) deviate from that average, the metric prediction system 162 can determine that the other force signatures are outliers.
- the metric prediction system 162 determines metrics as described herein based at least on whether the other force signatures are outliers.
- the metric prediction system 162 determines force signatures or sets of force signatures for surgeons. For example, the metric prediction system 162 can determine a force signature for a surgeon based at least on one or more force vectors associated with a medical instrument involved in interactions with anatomical structures that were performed by the surgeon. The metric prediction system 162 can determine the force signature based at least on force vectors associated with similar interactions across a plurality of surgical procedures involving the surgeon. In some embodiments, the metric prediction system 162 can determine stacked force signatures (e.g., based at least on a combination of one or more distinct force signatures) for the surgeon.
- the metric prediction system 162 can determine one or more stacked force signatures based at least on one or more force signatures involved in an interaction between a medical instrument and an anatomical feature that was performed by the surgeon. In some embodiments, the metric prediction system 162 can generate a profile for one or more surgeons based at least on the one or more force signatures or stacked force signatures associated with the one or more surgeons.
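A sketch of deriving per-surgeon force signatures by averaging forces over similar interactions, and combining them into a stacked signature per surgeon; the surgeon identifiers, interaction labels, and force values are hypothetical:

```python
from collections import defaultdict
import statistics

def force_signatures(records):
    """Average force per (surgeon, interaction type) across procedures.
    records: iterable of (surgeon, interaction, force_n) tuples."""
    grouped = defaultdict(list)
    for surgeon, interaction, force in records:
        grouped[(surgeon, interaction)].append(force)
    return {key: statistics.mean(forces) for key, forces in grouped.items()}

records = [
    ("dr_a", "grab", 1.2), ("dr_a", "grab", 1.4),
    ("dr_a", "retract", 2.0), ("dr_b", "grab", 3.1),
]
sigs = force_signatures(records)
# A "stacked" signature combines a surgeon's distinct per-interaction signatures.
stacked_dr_a = {interaction: force
                for (surgeon, interaction), force in sigs.items()
                if surgeon == "dr_a"}
print(stacked_dr_a)
```

Comparing `stacked_dr_a` against signatures from other surgeons for the same interaction types is one way outliers could be flagged.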
- the metric prediction system 162 determines the amount of force applied based at least on the contact between medical instruments and the anatomical structures during the surgical procedure. For example, the metric prediction system 162 can determine the amount of force applied based at least on the contact between medical instruments and the anatomical structures during the surgical procedure, where the contact is associated with a three-dimensional direction in which the medical instruments contact the anatomical structures. In these examples, the metric prediction system 162 can determine the three-dimensional direction in which the medical instruments contact the anatomical structures based at least on the metric prediction system 162 determining the type of interaction involved.
- the amount of force applied represents an amount of force involved in an interaction between a medical instrument and an anatomical structure determined at a point in time.
- the amount of force applied represents an amount of force (e.g., a cumulative amount of force) involved in an interaction between a medical instrument and an anatomical structure determined over a period of time.
- the amount of force applied over time can be determined based at least on adding the amount of force involved in one or more interactions as discussed herein at one or more points in time associated with the period of time.
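The summation over points in time described above can be sketched as follows; the sample values and the uniform one-second spacing are assumptions made for illustration:

```python
def cumulative_force(samples, dt_s=1.0):
    """Accumulate per-sample force over a period: each entry is the force (N)
    determined at one point in time; scaled by the sample spacing, the sum
    approximates the cumulative force-time loading on the structure."""
    return sum(f * dt_s for f in samples)

per_second = [0.5, 0.8, 1.2, 0.9]    # force at four points in time
print(cumulative_force(per_second))  # 3.4
```

A point-in-time amount would instead be a single sample; the cumulative amount simply adds the per-point amounts across the period.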
- the metric prediction system 162 determines the amount of force applied based at least on one or more models trained with machine learning (referred to herein as “models”). For example, the metric prediction system 162 can receive the data stream including the video data 166, the robotic system data 170, or the medical instrument data 172, and the metric prediction system 162 can provide data associated with the data stream to the one or more models to cause the one or more models to generate one or more outputs. In some embodiments, the one or more outputs may represent the amount of force applied to the anatomical structure during an interaction with an instrument.
- the one or more outputs may represent the amount of force applied to the anatomical structure during an interaction with an instrument at a point in time or at a plurality of points in time.
- the data provided to the one or more models to cause the models to provide outputs representing an amount of force can include raw data (e.g., unprocessed data).
- one or more sensor signals included in the video data 166, the robotic system data 170, or the medical instrument data 172 may be generated from one or more sensors as described herein without being processed (e.g., updated) before being provided to the one or more models.
- one or more of the outputs of the one or more models may be scored.
- the metric prediction system 162 can store the data stream in association with the outputs (e.g., in the data repository 132).
- the metric prediction system 162 can then replay the data stream via a client device (e.g., a computing device that is associated with an individual and is the same as, or similar to, the computing device 600 of FIG. 6).
- the client device may receive input from the individual indicating whether portions or all of the surgical procedure involved appropriate or non-appropriate handling of tissue by the surgeon.
- the input may then be received by the data processing system 130 and stored in further association with the data stream.
- the metric prediction system 162 can then provide the data stream and the indications of whether portions or all of the surgical procedure were appropriate as input to a model to train the model to determine whether portions or entire future surgical procedures involve appropriate or not appropriate handling of tissue.
- the metrics described herein may be determined based at least on the determinations made during a surgical procedure regarding whether the handling of tissue is appropriate or not appropriate.
- the indication of the metric may be determined based at least on outputs indicating whether the handling of tissue during the surgical procedure is appropriate or not appropriate.
- the metric prediction system 162 updates the amount of force applied to the anatomical structure.
- the metric prediction system 162 can update the amount of force applied to the anatomical structure based at least on a medical instrument type of a medical instrument that interacted with (e.g., contacted, grabbed, cut) the anatomical structure.
- the metric prediction system 162 can update the amount of force applied by a stapler where the force applied is intended to cause deformation of the tissue (e.g., during the stapling process).
- the metric prediction system 162 may update the amount of force applied by the stapler where the force applied is not intended to cause deformation of the tissue (e.g., when using a non-cutting portion of the stapler to move the anatomical structure). In this way, the overall force applied to the anatomical structure may be updated (e.g., increased or decreased) where certain tissue interactions are expected. In some embodiments, the metric prediction system 162 can update the amount of force applied based at least on one or more predetermined amounts that correspond to a given interaction.
- the metric prediction system 162 determines a metric indicative of performance of the medical procedure (referred to as a “metric”). For example, the metric prediction system 162 can determine a metric based at least on a comparison of an amount of force applied during an interaction involving a medical instrument and an anatomical structure with a force threshold. In examples, the force threshold can be the same as, or similar to, the thresholds described herein, such as the thresholds 140. In some embodiments, the metric prediction system 162 determines the metric based at least on the metric prediction system 162 updating the amount of force, the medical instrument type, or anatomical structure involved in the medical procedure as described herein.
- the metric prediction system 162 determines the metric based at least on a comparison of the amount of force applied with a force threshold that is established for the anatomical structure. For example, the metric prediction system 162 can compare the amount of force applied to the force threshold, where the force threshold is associated with a degree to which an anatomical structure can be manipulated either at a point in time or over multiple points in time. In one illustrative example, a force threshold for an organ such as the kidney may be comparatively lower when compared to a force threshold for an organ such as the liver.
- the lower force threshold corresponding to the kidney may be established to account for the sensitivity of portions of the kidney (e.g., the adrenal glands which, when manipulated, can cause the release of hormones that affect other portions of the patient) as compared to portions of the liver which may not have similar sensitivities.
- the metric prediction system 162 can determine the metric based at least on whether the amount of force applied satisfies the force threshold that is established for the anatomical structure.
- the metric prediction system 162 can compare the amount of force applied to one or more force thresholds that are associated with corresponding near-term or long-term health outcomes.
- each force threshold can be associated with a near-term or long-term health outcome indicative of a period of time for the tissue to recover, a likelihood that the tissue will recover (e.g., that a tear will heal, etc.), etc.
- the metric prediction system 162 can determine the metric based at least on the comparison of the amount of force applied to the one or more force thresholds that are associated with the corresponding near-term or long-term health outcomes.
- the force threshold can be updated based at least on information associated with one or more surgical procedures.
- the metric prediction system 162 can receive input from users after a surgical procedure indicative of one or more of: the type of surgical procedure, the near-term or long-term health outcome of the surgical procedure, etc., and the metric prediction system 162 can correlate amounts of force applied to the tissue with the near-term or long-term health outcomes. In this example, the metric prediction system 162 can update the force thresholds based at least on the correlation of the amounts of force applied to the tissue with the near-term or long-term health outcomes.
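One possible sketch of nudging a per-structure force threshold using post-procedure outcome reports; the update rule, step size, and observation values below are assumptions rather than the disclosed correlation method:

```python
def update_threshold(threshold_n, observations, step=0.25):
    """Adjust a force threshold from (peak_force_n, good_outcome) pairs.
    Forces that exceeded the threshold yet yielded good outcomes relax it;
    forces under the threshold with poor outcomes tighten it."""
    for force, good in observations:
        if good and force > threshold_n:
            threshold_n += step * (force - threshold_n)   # relax upward
        elif not good and force < threshold_n:
            threshold_n -= step * (threshold_n - force)   # tighten downward
    return threshold_n

# Start at 2.0 N; one good outcome at 2.4 N, one poor outcome at 1.2 N.
t = update_threshold(2.0, [(2.4, True), (1.2, False)])
print(round(t, 3))  # 1.875
```

Any real system would correlate forces and outcomes across many procedures rather than per-observation, but the direction of each adjustment follows the text.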
- the metric prediction system 162 determines the metric based at least on a comparison of the amount of force applied with a force threshold that is established for interactions involving the anatomical structure and the type of the medical instrument. For example, as noted above, one or more medical instruments involved in the medical procedure can be associated with different force thresholds. In these examples, the metric prediction system 162 can compare the amount of force applied to a force threshold that is established for interactions between the anatomical structure and the type of the medical instrument.
- the metric prediction system 162 determines the metric based at least on a comparison of the amount of force applied with a force threshold that is established for interactions involving the anatomical structure and the orientation of the medical instrument. In examples, the metric prediction system 162 can compare the amount of force applied to a force threshold that is established for interactions involving the anatomical structure and the medical instrument while in a particular orientation relative to the anatomical structure. In one illustrative example, the metric prediction system 162 can determine the metric based at least on the force applied and whether the medical instrument is oriented in a first orientation (e.g., an operating orientation) or a second orientation (e.g., a non-operating orientation).
- the medical instrument may be configured to perform a first function that involves exerting force (e.g., clamping or grabbing onto the anatomical structure).
- the medical instrument may be configured to perform one or more second functions (e.g., pushing the anatomical structure to the side) that involve exerting a different force.
- the metric prediction system 162 can determine the amount of force applied and compare the amount of force applied to the force threshold that is established for the given interaction.
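The threshold selection described in the preceding passages can be sketched as a lookup keyed on the anatomical structure, the instrument type, and the instrument orientation. This is a minimal sketch; the structure names, instrument names, and newton values below are hypothetical placeholders, not values from the disclosure.

```python
# Hypothetical force thresholds (in newtons), keyed by
# (anatomical structure, instrument type, orientation).
FORCE_THRESHOLDS = {
    ("liver", "grasper", "operating"): 2.5,
    ("liver", "grasper", "non-operating"): 1.0,
    ("bowel", "grasper", "operating"): 1.5,
}

DEFAULT_THRESHOLD = 1.0  # conservative fallback when no entry exists


def exceeds_threshold(structure, instrument, orientation, applied_force):
    """Compare the measured force against the threshold established for
    this structure / instrument / orientation combination."""
    limit = FORCE_THRESHOLDS.get((structure, instrument, orientation),
                                 DEFAULT_THRESHOLD)
    return applied_force > limit, limit
```

For example, `exceeds_threshold("liver", "grasper", "operating", 3.0)` returns `(True, 2.5)`, signaling that the interaction exceeded the established threshold.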
- the metric prediction system 162 determines the metric based at least on a comparison of the amount of force applied with a force threshold that is established for interactions involving the anatomical structure and a maneuver performed using the medical instrument. In one illustrative example, the metric prediction system 162 can determine the metric based at least on the force exerted during a first maneuver (e.g., clamping or grabbing onto the anatomical structure).
- the metric prediction system 162 can determine the metric based at least on a comparison of the amount of force applied with a force threshold that is established for the anatomical structure that corresponds to an amount of time for patient recovery. For example, as force is exerted by a medical instrument on an anatomical structure during a medical procedure, the force may be compared to amounts of time corresponding to different stages of patient recovery.
- the metric prediction system 162 can determine an amount of time (e.g., minutes, hours, days) needed for patient recovery before the patient can leave the medical facility where the surgery was performed, return to light activity, return to normal activity, etc.
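The recovery-time estimate above can be sketched as a coarse banding of cumulative force exposure. The bands, newton-second units, and hour values here are illustrative assumptions only, not values from the disclosure; a real system would derive them from correlated procedure and outcome data.

```python
def estimated_recovery_hours(cumulative_force_ns):
    """Map cumulative force exposure (hypothetical newton-second units)
    to a coarse recovery-time estimate in hours."""
    # Illustrative banding only; real bands would be learned from
    # correlated procedure/outcome data as described in the text.
    if cumulative_force_ns < 50:
        return 4      # e.g., same-day discharge
    if cumulative_force_ns < 200:
        return 24     # e.g., overnight observation
    return 72         # e.g., extended recovery
```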
- the metric prediction system 162 can determine an indication of the metric. For example, the metric prediction system 162 can determine the indication of the metric indicative of the performance of the medical procedure where the indication represents the amount of force applied to the anatomical structure. In examples, the indication can represent the amount of force applied to the anatomical structure at a point in time (e.g., an instant force). In some examples, the indication can represent the amount of force applied to the anatomical structure over a period of time (e.g., a cumulative force).
- the metric prediction system 162 can determine the indication of the metric where the indication represents a three-dimensional direction in which a medical instrument contacts an anatomical structure.
- the indication can represent the direction in which the medical instrument contacts the anatomical structure at a point in time.
- the direction in which the medical instrument contacts the anatomical structure may be represented with a two-dimensional or three-dimensional indicator (e.g., line, arrow, and/or the like).
- the metric prediction system 162 can provide an indication of the metric via the robotic system.
- the metric prediction system 162 can provide the indication of the metric based at least in part on the metric prediction system 162 generating data associated with a user interface.
- the metric prediction system 162 can generate the user interface based at least in part on the indication of the metric.
- the metric prediction system 162 can further generate data associated with the user interface that is configured to cause a display device (e.g., a display device of the user input system 168) to provide an output representing the indication of the metric.
- the user interface can also include images representing the medical instruments and anatomical structures that are in a field of view of an imaging device of the sensing system 164.
- the metric prediction system 162 can provide the indication of the metric based at least in part on the metric prediction system 162 generating data associated with haptic feedback.
- the metric prediction system 162 can provide the indication of the metric based at least in part on the metric prediction system 162 generating data associated with haptic feedback that corresponds to the metric indicative of the performance of the medical procedure.
- the metric prediction system 162 can compare the amount of force with a force threshold established for the type of the anatomical structure to determine the metric indicative of the performance of the medical procedure.
- the metric prediction system 162 can determine the metric as the difference between the force applied and the force threshold decreases, and can generate data associated with haptic feedback based at least on the change in that difference.
- the data associated with the haptic feedback can cause controls (e.g., controls that are the same as, or similar to, the controls of the user control system 510 of FIG. 5) engaged by the surgeon during the medical procedure to vibrate at varying intensities. In this way, the surgeon can be provided with feedback representing the indication of the metric described herein.
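The varying-intensity vibration described above can be sketched as scaling haptic output with how close the applied force is to the threshold. The linear ramp below is an assumption for illustration, not the disclosed mapping.

```python
def haptic_intensity(applied_force, force_threshold, max_intensity=1.0):
    """Scale vibration intensity with proximity to the force threshold:
    zero at no force, maximum at or beyond the threshold."""
    if force_threshold <= 0:
        return max_intensity  # degenerate threshold: always warn
    ratio = min(1.0, max(0.0, applied_force / force_threshold))
    return ratio * max_intensity
```

The returned intensity can then drive the vibration of the controls engaged by the surgeon.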
- the metric prediction system 162 generates a user interface based at least in part on the data stream associated with the medical procedure and the indication of the metric. For example, the metric prediction system 162 can generate the user interface based at least in part on images representing medical instruments or anatomical structures that are in a field of view of an imaging device. In an example, the metric prediction system 162 can generate a user interface based at least in part on images representing medical instruments or anatomical structures that are in a field of view of an imaging device, where the imaging device is included in the sensing system 164.
- the images can be captured at one or more points in time that correspond to the one or more points in time at which the metric prediction system 162 determined the metric.
- the metric prediction system 162 generates the user interface based at least in part on the images and the indication of the metric. For example, the metric prediction system 162 can generate the user interface such that the indication of the metric is included in the user interface.
- the indication includes a color-coded or binary indicator.
- the indication can be associated with an area of a user interface that is colored a first color that represents the amount of force applied to the anatomical structure during an interaction.
- the indication can be associated with an area of the user interface that is colored one or more different colors (e.g., a second color).
- the indicator can be associated with an area of the user interface that is colored yet another color (e.g., a third color) when an amount of force applied to the anatomical structure is approaching a force limit.
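The color-coded indicator described above can be sketched as a three-state mapping from applied force to a display color. The specific colors and the 80% warning fraction are assumptions for illustration, not values from the disclosure.

```python
def indicator_color(applied_force, force_limit, warn_fraction=0.8):
    """Pick a color for the on-screen indicator: one color for normal
    interaction, another when approaching the limit, a third beyond it."""
    if applied_force >= force_limit:
        return "red"        # limit reached or exceeded
    if applied_force >= warn_fraction * force_limit:
        return "yellow"     # approaching the force limit
    return "green"          # within normal range
```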
- the indication of the metric includes a numerical representation of an amount of force applied during an interaction between a medical instrument and an anatomical structure. For example, as a surgeon causes a medical instrument to contact an anatomical structure, or when the medical instrument moves while contacting the anatomical structure, the indication can include the amount of force involved in contacting the anatomical structure.
- the indication of the metric includes a numerical representation of a scale or speed with which one or more medical instruments are moving. For example, as a surgeon causes a medical instrument to move toward an anatomical structure, or when the medical instrument moves while contacting the anatomical structure, the indication can include a speed (e.g., in cm/s, mm/s, or the like) of at least a portion of the medical instrument. In some embodiments, the metric prediction system 162 determines the speed based at least on the relative motion of at least a portion of the medical instrument in comparison with the anatomical structures or the patient.
- the metric prediction system 162 provides one or more images associated with the data stream that correspond to the anatomical structure to be displayed via a user interface.
- the metric prediction system 162 can generate data associated with the user interface that is configured to cause a display device to provide an output representing the user interface, where the user interface at least in part represents one or more images associated with the data stream that correspond to the anatomical structure.
- the metric prediction system 162 can update the user interface based at least in part on updates to the metrics described herein.
- the metric prediction system 162 can update the user interface by updating one or more pixels of the one or more images based at least in part on the metrics during performance of the procedure.
- the metric prediction system 162 determines at least one area associated with at least one overlay. For example, the metric prediction system 162 can determine at least one area associated with at least one overlay, where the at least one area corresponds to at least a portion of an anatomical structure represented by the one or more images.
- the overlay can be configured to cause the user interface to update a representation of the at least one area when displayed via the display.
- the metric prediction system 162 then updates one or more pixels associated with the user interface (e.g., one or more pixels of the images associated with the user interface) based at least in part on the at least one overlay.
- the metric prediction system 162 can construct the overlay based at least in part on the at least one area and the metrics described herein. In one illustrative example, the metric prediction system 162 then updates one or more pixels associated with the user interface by tinting the one or more pixels with one or more shades or one or more colors based at least on the amount of force involved in the interaction between the medical instrument and the anatomical structure. In another illustrative example, the metric prediction system 162 then updates a plurality of pixels associated with the user interface by tinting the plurality of pixels in accordance with a segmentation mask (discussed above) corresponding to an anatomical structure involved in the interaction and one or more metrics corresponding to the interaction as described herein.
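The pixel tinting described above can be sketched as alpha-blending a color over the pixels selected by a segmentation mask, with the blend strength driven by the force metric. The NumPy array representation and RGB layout are assumptions for illustration.

```python
import numpy as np


def tint_masked_pixels(image, mask, color, alpha):
    """Blend `color` over the pixels selected by the boolean segmentation
    mask, with opacity `alpha` (e.g., proportional to the force metric)."""
    out = image.astype(float).copy()
    color = np.asarray(color, dtype=float)
    out[mask] = (1.0 - alpha) * out[mask] + alpha * color
    return out.astype(np.uint8)
```

For example, tinting the masked region of a black image half-strength red raises the red channel of those pixels while leaving unmasked pixels untouched.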
- the metric prediction system 162 can update the segmentation mask (e.g., by updating the color of a portion or all of the segmentation mask) to indicate the increased probability of tissue damage when further interactions occur between the medical instrument and the anatomical structure.
- the metric prediction system 162 can generate user interfaces that, for example, color code specific anatomical structures, or the like based at least on the amount of force involved in the interaction between the medical instrument and the anatomical structure.
- the overlay can be associated with one or more colors or shades that represent the metrics described herein as they correspond to the anatomical structures involved in the medical procedure.
- the metric prediction system 162 constructs a heatmap.
- the metric prediction system 162 can construct a heatmap that includes a visual representation of the metric overlaid onto the anatomical structure involved in an interaction.
- the metric prediction system 162 can construct a heatmap based at least in part on the at least one area and the metrics described herein.
- the heatmap can include one or more regions of tinted shades or colors.
- the heatmap can include one or more regions of gradients of shades or colors.
- the heatmap can be a gradient of a color (e.g., red) that corresponds to metrics representing the instant or cumulative force applied to one or more portions of the anatomical structure during the medical procedure.
- the metric prediction system 162 then updates the one or more pixels (e.g., of the images associated with the video data 166) based at least in part on the at least one overlay or the heatmap.
- the metric prediction system 162 can continuously update the user interface to indicate to the surgeon the amount of force applied to specific areas of the anatomical structures during a surgical procedure, thereby improving the surgeon’s awareness of the context associated with a given anatomical structure.
- where a metric indicative of performance represents an accumulated force (e.g., force applied to a given anatomical structure over a period of time), the metric prediction system 162 can construct the heatmap based at least in part on the at least one area and the accumulated force associated with (e.g., corresponding to) one or more portions of the anatomical structure.
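The accumulated-force heatmap can be sketched as a running per-region total that is mapped to an overlay opacity. The region identifiers, the newton-second units, and the saturation value are illustrative assumptions, not values from the disclosure.

```python
def accumulate_force(cumulative, region_id, force_n, dt_s):
    """Add force x duration (newton-seconds) to a region's running total."""
    cumulative[region_id] = cumulative.get(region_id, 0.0) + force_n * dt_s
    return cumulative


def heatmap_alpha(accumulated, saturation=100.0):
    """Map an accumulated force value to a gradient opacity in [0, 1],
    saturating at a (hypothetical) exposure level."""
    return min(1.0, max(0.0, accumulated / saturation))
```

The opacity for each region can then drive a red-gradient overlay of the kind described for the heatmap.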
- the metric prediction system 162 generates a visual representation of an anatomical structure involved in a medical procedure, the visual representation being based at least on the shape and position of the anatomical structure relative to the body of the patient.
- the metric prediction system 162 can generate a visual representation of the anatomical structure involved in the medical procedure based at least on the data stream generated during the medical procedure.
- the metric prediction system 162 can update the visual representation of the anatomical structure based at least on the metrics as described herein.
- the metric prediction system 162 can update the visual representation of the anatomical structure based at least on the metrics by updating one or more intensity or color values corresponding to pixels involved in representing the anatomical structure.
- the metric prediction system 162 determines at least one first area where the amount of force applied satisfies a first threshold; and the metric prediction system 162 determines at least one second area where the amount of force applied satisfies a second threshold. For example, where multiple anatomical structures are in a field of view of the imaging device of the sensing system 164, at least one first area can correspond to a first anatomical structure and at least one second area can correspond to a second anatomical structure. The metric prediction system 162 can then update one or more pixels of the images generated by the imaging device based at least on the at least one first area and the at least one second area.
- pixels associated with the first area can be updated by tinting the pixels using a first color or first shade; and pixels associated with the second area can be updated by tinting the pixels using a second color or second shade based at least on the metrics associated with the anatomical structures.
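The two-area update above can be sketched as classifying each imaged area against its own structure-specific threshold and assigning a distinct shade per qualifying area. The shade names and threshold values are hypothetical.

```python
def tint_areas(area_forces, area_thresholds, shades=("shade_1", "shade_2")):
    """Assign a distinct shade to each area whose applied force meets or
    exceeds that area's own threshold; other areas are left untinted."""
    tints = {}
    next_shade = 0
    for area, force in area_forces.items():
        threshold = area_thresholds.get(area)
        if threshold is not None and force >= threshold:
            tints[area] = shades[next_shade % len(shades)]
            next_shade += 1
    return tints
```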
- the metric prediction system 162 can index and store data associated with the medical procedure and the interactions between the robotic system (e.g., medical instruments of the robotic system) and anatomical structures of the patient. For example, the metric prediction system 162 can index and store data associated with instant or cumulative amounts of force applied to the anatomical structure involved in the medical procedure. The metric prediction system 162 can then determine an expected amount of time for recovery for the patient based at least on the instant or cumulative amounts of force applied to the anatomical structure, or based at least on one or more other aspects of the interactions between medical instruments and anatomical structures during the medical procedure.
- the metric prediction system 162 generates a report.
- the metric prediction system 162 can generate a report including the metrics determined during the medical procedure.
- the metric prediction system 162 can generate a report including metrics determined for multiple medical procedures.
- the metric prediction system 162 can generate a report including metrics determined for multiple medical procedures involving a specific patient.
- the metric prediction system 162 can generate a report including metrics determined for multiple medical procedures involving a specific surgeon.
- the metric prediction system 162 can provide the report in real time (e.g., via the user interfaces described herein) or upon request after the surgical procedure is performed.
- the metric prediction system 162 generates a report indicating a correlation between one or more force signatures or one or more stacked force signatures, and a probability of outcome associated with a type of medical procedure. For example, the metric prediction system 162 can receive feedback corresponding to one or more medical procedures, the feedback indicating whether the medical procedure was successful or not successful (e.g., whether tissue was handled appropriately or not appropriately, whether a patient healed as expected or not as expected, etc.). The metric prediction system 162 can then correlate the feedback with one or more force signatures or stacked force signatures involved in the medical procedure.
- the metric prediction system 162 can determine a probability of a future outcome (e.g., whether the tissue will be handled appropriately or not appropriately, whether patients will heal as expected or not as expected, etc.) based at least on the correlation between the feedback and the one or more force signatures or stacked force signatures involved in similar medical procedures.
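The correlation between force signatures and outcomes can be sketched as a frequency estimate over labeled past procedures. A real system would likely use a learned model over force signatures; the signature labels below are hypothetical.

```python
def outcome_probability(history, signature):
    """Estimate the probability of a successful outcome for a given force
    signature from labeled past procedures.

    `history` is a list of (signature, success_bool) pairs collected as
    feedback on prior medical procedures."""
    matches = [ok for sig, ok in history if sig == signature]
    if not matches:
        return None  # no comparable procedures recorded
    return sum(matches) / len(matches)
```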
- the metric prediction system 162 can receive data associated with patient or surgeon feedback. For example, patients or surgeons can provide feedback indicating how long their recovery process was, whether the patient experienced discomfort, the degree to which the patient experienced discomfort, whether there was observable tissue damage, the amount of time needed for the damaged tissue to recover, or the like. The metric prediction system 162 can then correlate the patient feedback with the metrics determined in association with the medical procedure (e.g., amounts of force applied to the anatomical structures) and the interactions between the robotic system and anatomical structures of the patient and update one or more of the force limits as described herein for the patient or for other patients.
- the robotic system data 170 includes or is indicative of robotic system events corresponding to a state or an activity of an attribute or an aspect of a robotic system.
- the robotic system data 170 of a robotic system can be generated by the robotic system (e.g., in the form of a robotic system log) in its normal course of operations.
- the robotic system data is determined based at least on input received by the user input system 168 of the robotic system from a user, or on sensor data of a sensor on the robotic system.
- the robotic system can include one or more sensors (e.g., cameras, infrared sensors, ultrasonic sensors, etc.), actuators, interfaces, or consoles that can output information used to detect such a system event.
- FIG. 2 is a flowchart diagram illustrating an example method 200 for determining an amount of force applied by an instrument during teleoperation, according to some embodiments.
- the method 200 can be performed by one or more systems, devices, or components depicted in FIG. 1A, FIG. 1B, FIG. 3, FIG. 5, and FIG. 6 including, for example, the metric prediction system 162 of FIG. 1B.
- a data stream is received of a medical procedure performed with a robotic medical system.
- for example, a metric prediction system (e.g., the metric prediction system 162) can receive the data stream of a medical procedure performed with a robotic medical system.
- a type of an anatomical structure on which the medical procedure is performed is identified.
- for example, the metric prediction system (e.g., the metric prediction system 162) can identify, using the data stream and with one or more models trained with machine learning, a type of an anatomical structure on which the medical procedure is performed.
- an amount of force applied to the anatomical structure is determined.
- for example, the metric prediction system (e.g., the metric prediction system 162) can determine an amount of force applied to the anatomical structure.
- a metric is determined, the metric indicative of performance of the medical procedure.
- for example, the metric prediction system (e.g., the metric prediction system 162) can determine the metric indicative of performance of the medical procedure.
- an indication of the metric is provided.
- for example, the metric prediction system (e.g., the metric prediction system 162) can provide the indication of the metric indicative of the performance of the medical procedure.
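The five blocks of method 200 can be sketched as a pipeline of the stages named above. The callables here are hypothetical stand-ins for the trained identification model, the force estimator, the metric computation, and the output path; they are not part of the disclosure.

```python
def run_method_200(data_stream, identify_structure, estimate_force,
                   compute_metric, provide_indication):
    """Sketch of FIG. 2: receive the data stream, identify the anatomical
    structure with a trained model, determine the applied force, determine
    the performance metric, and provide an indication of that metric."""
    structure = identify_structure(data_stream)    # trained-model inference
    force = estimate_force(data_stream, structure) # force applied to tissue
    metric = compute_metric(structure, force)      # performance metric
    return provide_indication(metric)              # UI / haptic output
```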
- FIG. 3 is an image of an example user interface 300, according to some embodiments.
- the user interface 300 shows four anatomical structures 302, 304, 306, 308 as well as other anatomical structures.
- the four anatomical structures 302, 304, 306, 308 are each shown as having overlays associated with different colors (e.g., a first color, a second color, a third color, and a fourth color, respectively).
- the user interface 300 also includes a label 310 that corresponds to a point in time during the performance of a medical procedure.
- FIG. 4 is a graph of example force limits, according to some embodiments.
- interactions between medical instruments and anatomical structures can be associated with interaction types, labeled along the X-axis as “dissect”, “drive needle”, “manipulate” (e.g., move), “retract”, and “tie suture”.
- Each interaction type can be further associated with one or more sub-limits that correspond to particular aspects of each interaction type.
- the force limits can be between 0 newtons and 14 newtons.
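The per-interaction limits of FIG. 4 can be sketched as a lookup table. The numeric values below are placeholders chosen within the stated 0–14 newton range, not values taken from the graph.

```python
# Hypothetical per-interaction force limits in newtons, within the
# 0-14 N range described for FIG. 4; the numbers are placeholders.
INTERACTION_LIMITS_N = {
    "dissect": 4.0,
    "drive needle": 6.0,
    "manipulate": 3.0,
    "retract": 8.0,
    "tie suture": 5.0,
}


def limit_for(interaction_type):
    """Return the force limit for an interaction type, or None if the
    interaction type has no established limit."""
    return INTERACTION_LIMITS_N.get(interaction_type)
```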
- FIG. 5 is a diagram of a medical environment, according to some embodiments.
- the medical environment 500 can refer to or include a surgical environment or surgical system.
- the medical environment 500 can include a robotic medical system 524, a user control system 510, and an auxiliary system 515 communicatively coupled one to another.
- a visualization tool 520 can be connected to the auxiliary system 515, which in turn can be connected to the robotic medical system 524.
- the visualization tool 520 can be considered connected to the robotic medical system.
- the visualization tool 520 can be directly connected to the robotic medical system 524.
- a metric prediction system 162 can be connected to the user control system 510 which in turn can be connected to the robotic medical system 524.
- the metric prediction system 162 can be connected directly to the robotic medical system 524.
- the metric prediction system 162 can likewise be considered connected to the robotic medical system 524.
- the medical environment 500 can be used to perform a computer-assisted medical procedure with a patient 525.
- the surgical team can include a surgeon 530A and additional medical personnel 530B-530D, such as a medical assistant, a nurse, an anesthesiologist, and other suitable team members who can assist with the surgical procedure or medical session.
- the medical session can include the surgical procedure being performed on the patient 525, as well as any pre-operative (e.g., which can include setup of the medical environment 500, including preparation of the patient 525 for the procedure), and postoperative (e.g., which can include clean up or post care of the patient), or other processes during the medical session.
- the medical environment 500 can be implemented in a non-surgical procedure, or other types of medical procedures or diagnostics that can benefit from the accuracy and convenience of the surgical system.
- the robotic medical system 524 can include a plurality of manipulator arms 535A-535D to which a plurality of medical instruments (e.g., the instruments described herein) can be coupled or installed, or by which the medical instruments can be supported.
- the plurality of manipulator arms 535A-535D can include one or more linkages.
- Each medical instrument can be any suitable surgical tool (e.g., a tool having tissue-interaction functions), imaging device (e.g., an endoscope, an ultrasound tool, etc.), sensing instrument (e.g., a force-sensing surgical instrument), diagnostic instrument, or other suitable instrument that can be used for a computer-assisted surgical procedure on the patient 525 (e.g., by being at least partially inserted into the patient and manipulated to perform a computer-assisted surgical procedure on the patient).
- while the robotic medical system 524 is shown as including four manipulator arms (e.g., the manipulator arms 535A-535D), in other embodiments, the robotic medical system can include greater than or fewer than four manipulator arms. Further, not all manipulator arms need have a medical instrument installed thereto at all times of the medical session. Moreover, in some embodiments, a medical instrument installed on a manipulator arm can be replaced with another medical instrument as suitable.
- One or more of the manipulator arms 535A-535D or the medical instruments attached to manipulator arms can include one or more displacement transducers, orientational sensors, positional sensors, or other types of sensors and devices to measure parameters or generate kinematics information.
- One or more components of the medical environment 500 can be configured to use the measured parameters or the kinematics information to track (e.g., determine poses of) or control the medical instruments, as well as anything connected to the medical instruments or the manipulator arms 535A-535D.
- the user control system 510 can be used by the surgeon 530A to control (e.g., move) one or more of the manipulator arms 535A-535D or the medical instruments connected to the manipulator arms.
- the user control system 510 can include a display that can provide the surgeon 530A with imagery (e.g., high-definition 3D imagery) of a surgical site associated with the patient 525 as captured by a medical instrument installed to one of the manipulator arms 535A-535D.
- the user control system 510 can include a stereo viewer having two or more displays where stereoscopic images of a surgical site associated with the patient 525 and generated by a stereoscopic imaging system can be viewed by the surgeon 530A. In some embodiments, the user control system 510 can also receive images from the auxiliary system 515 and the visualization tool 520.
- the surgeon 530A can use the imagery displayed by the user control system 510 to perform one or more procedures with one or more medical instruments attached to the manipulator arms 535A-535D.
- the user control system 510 can include a set of controls. These controls can be manipulated by the surgeon 530A to control movement of the manipulator arms 535A-535D or the medical instruments installed thereto.
- the controls can be configured to detect a wide variety of hand, wrist, and finger movements by the surgeon 530A to allow the surgeon to intuitively perform a procedure on the patient 525 using one or more medical instruments installed to the manipulator arms 535A-535D.
- the auxiliary system 515 can include one or more computer systems (e.g., computing devices that are the same as, or similar to, the computing device 600 of FIG. 6) configured to perform processing operations within the medical environment 500.
- the one or more computer systems can control or coordinate operations performed by various other components (e.g., the robotic medical system 524, the user control system 510) of the medical environment 500.
- a computer system included in the user control system 510 can transmit instructions to the robotic medical system 524 by way of the one or more computing devices of the auxiliary system 515.
- the auxiliary system 515 can receive and process image data representative of imagery captured by one or more imaging devices (e.g., medical instruments) attached to the robotic medical system 524, as well as other data stream sources received from the visualization tool.
- one or more image capture devices can be located within the medical environment 500. These image capture devices can capture images from various viewpoints within the medical environment 500. These images (e.g., video streams) can be transmitted to the visualization tool 520, which can then passthrough those images to the auxiliary system 515 as a single combined data stream. The auxiliary system 515 can then transmit the single video stream (including any data stream received from the medical instrument(s) of the robotic medical system 524) to present on a display of the user control system 510.
- the auxiliary system 515 can be configured to present visual content (e.g., the single combined data stream) to other team members (e.g., the medical personnel 530B-530D) who may not have access to the user control system 510.
- the auxiliary system 515 can include a display 540 configured to display one or more user interfaces, such as images of the surgical site, information associated with the patient 525 or the surgical procedure, or any other visual content (e.g., the single combined data stream).
- display 540 can be a touchscreen display or include other features to allow the medical personnel 530B-530D to interact with the auxiliary system 515.
- the robotic medical system 524, the user control system 510, and the auxiliary system 515 can be communicatively coupled one to another in any suitable manner.
- the robotic medical system 524, the user control system 510, and the auxiliary system 515 can be communicatively coupled by way of control lines 545, which can represent any wired or wireless communication link as may serve a particular implementation.
- the robotic medical system 524, the user control system 510, and the auxiliary system 515 can each include one or more wired or wireless communication interfaces, such as one or more local area network interfaces, Wi-Fi network interfaces, cellular interfaces, etc.
- the medical environment 500 can include other or additional components or elements that may be needed or desirable for the medical session for which the surgical system is being used.
- FIG. 6 is a block diagram depicting an architecture for a computing device 600 that can be employed to implement elements of the systems and methods described and illustrated herein, including aspects of the systems depicted in FIGS. 1A-1B, 3, or 5, and the method depicted in FIG. 2.
- the metric prediction system 162, the sensing system 164, the user input system 168, and the devices described with respect to the medical environment 500 can include one or more components or functionalities of the computing device 600.
- the computing device 600 can be any computing device used herein and can include or be used to implement a data processing system or its components.
- the computing device 600 includes at least one bus 605 or other communication component or interface for communicating information between various elements of the computing device 600.
- the computing device 600 further includes at least one processor 610 or processing circuit coupled to the bus 605 for processing information.
- the computing device 600 also includes at least one main memory 615, such as a random-access memory (RAM) or other dynamic storage device, coupled to the bus 605 for storing information and instructions to be executed by the processor 610.
- the main memory 615 can be used for storing information during execution of instructions by the processor 610.
- the computing device 600 can further include at least one read only memory (ROM) 620 or other static storage device coupled to the bus 605 for storing static information and instructions for the processor 610.
- a storage device 625 such as a solid-state device, magnetic disk or optical disk, can be coupled to the bus 605 to persistently store information and instructions.
- the computing device 600 can be coupled via the bus 605 to a display 630, such as a liquid crystal display, or active-matrix display, for displaying information.
- An input device 635 such as a keyboard or voice interface can be coupled to the bus 605 for communicating information and commands to the processor 610.
- the input device 635 can include a touch screen display (e.g., the display 630).
- the input device 635 can include sensors to detect gestures.
- the input device 635 can also include a cursor control, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 610 and for controlling cursor movement on the display 630.
- the processes, systems and methods described herein can be implemented by the computing device 600 in response to the processor 610 executing an arrangement of instructions contained in the main memory 615. Such instructions can be read into the main memory 615 from another computer-readable medium, such as the storage device 625. Execution of the arrangement of instructions contained in the main memory 615 causes the computing device 600 to perform the illustrative processes described herein. One or more processors in a multi-processing arrangement can also be employed to execute the instructions contained in the main memory 615. Hard-wired circuitry can be used in place of or in combination with software instructions together with the systems and methods described herein. Systems and methods described herein are not limited to any specific combination of hardware circuitry and software.
- the processor 610 can execute one or more instructions associated with the system 100.
- the processor 610 can include an electronic processor, an integrated circuit, or the like including one or more of digital logic, analog logic, digital sensors, analog sensors, communication buses, volatile memory, nonvolatile memory, and the like.
- the processor 610 can include, but is not limited to, at least one microcontroller unit (MCU), microprocessor unit (MPU), central processing unit (CPU), graphics processing unit (GPU), physics processing unit (PPU), embedded controller (EC), or the like.
- the processor 610 can include, or be associated with, a main memory 615 operable to store, or storing, one or more non-transitory computer-readable instructions for operating components of the system 100 and operating components operably coupled to the processor 610.
- the one or more instructions can include at least one of firmware, software, hardware, operating systems, or embedded operating systems, for example.
- the processor 610, or the system 100 generally, can include at least one communication bus controller to effect communication between the processor 610 and the other elements of the system.
- the main memory 615 can include one or more hardware memory devices to store binary data, digital data, or the like.
- the main memory 615 can include one or more electrical components, electronic components, programmable electronic components, reprogrammable electronic components, integrated circuits, semiconductor devices, flip flops, arithmetic units, or the like.
- the main memory 615 can include at least one of a non-volatile memory device, a solid-state memory device, a flash memory device, a NAND memory device, a volatile memory device, etc.
- the main memory 615 can include one or more addressable memory regions disposed on one or more physical memory arrays.
- any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable,” to each other to achieve the desired functionality.
- examples of operably couplable components include, but are not limited to, physically mateable or physically interacting components, wirelessly interactable or wirelessly interacting components, or logically interacting or logically interactable components.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Robotics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Manipulator (AREA)
Abstract
Systems and methods for anatomy-based force feedback and instrument guidance are provided. A system receives a data stream of a medical procedure performed with a robotic medical system. The system identifies a type of an anatomical structure on which the medical procedure is performed. The system determines an amount of force applied to the anatomical structure. The system determines, based at least on a comparison of the amount of force applied to the anatomical structure with a force threshold established for the type of the anatomical structure, a metric indicative of performance of the medical procedure. The system provides an indication of the metric to control performance of the medical procedure.
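The comparison pipeline the abstract describes can be sketched in a few lines. The threshold values, the names `FORCE_THRESHOLDS_N` and `assess_force`, and the linear scoring rule are all illustrative assumptions; the patent does not specify thresholds or a scoring formula.

```python
# Per-tissue force thresholds in newtons (illustrative values only,
# not drawn from the patent).
FORCE_THRESHOLDS_N = {
    "artery": 1.5,
    "liver": 3.0,
    "bone": 10.0,
}

def assess_force(structure_type: str, applied_force_n: float) -> dict:
    """Compare the applied force against the threshold for the identified
    anatomical structure type and return a performance metric plus a
    guidance indication."""
    threshold = FORCE_THRESHOLDS_N[structure_type]
    ratio = applied_force_n / threshold
    # Score stays at 1.0 up to the threshold, then degrades linearly.
    metric = max(0.0, 1.0 - max(0.0, ratio - 1.0))
    indication = "reduce force" if ratio > 1.0 else "within limits"
    return {"metric": round(metric, 3), "indication": indication}
```

For example, 1.5 N on liver tissue is half the assumed threshold and scores 1.0, while 3.0 N on an artery is double its threshold and triggers a "reduce force" indication.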
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463559783P | 2024-02-29 | 2024-02-29 | |
| US63/559,783 | 2024-02-29 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025184368A1 (fr) | 2025-09-04 |
Family
ID=95064403
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/017633 Pending WO2025184368A1 (fr) | Anatomy-based force feedback and instrument guidance | 2024-02-29 | 2025-02-27 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025184368A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020120188A1 (en) * | 2000-12-21 | 2002-08-29 | Brock David L. | Medical mapping system |
| US20200272660A1 (en) * | 2019-02-21 | 2020-08-27 | Theator inc. | Indexing characterized intraoperative surgical events |
| WO2023146761A1 (en) * | 2022-01-27 | 2023-08-03 | Smith & Nephew, Inc. | System and method for providing adjustable force control for powered surgical instruments |
2025
- 2025-02-27 WO PCT/US2025/017633 patent/WO2025184368A1/fr active Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7367140B2 (ja) | Teleoperated surgical system with scan-based positioning | |
| US20220336078A1 (en) | System and method for tracking a portion of the user as a proxy for non-monitored instrument | |
| JP2023126480A (ja) | Surgical system with training or assist functions | |
| JP2024521721A (ja) | Surgical simulation object rectification system | |
| EP2942029A1 (en) | Surgical robot and methods for controlling the same | |
| US20140288413A1 (en) | Surgical robot system and method of controlling the same | |
| US20130178868A1 (en) | Surgical robot and method for controlling the same | |
| CN114376733A (zh) | Configuring a surgical system using a surgical procedure atlas | |
| KR20140020071A (ko) | Surgical robot system and method of controlling the same | |
| CN105188590A (zh) | Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms | |
| US12000753B2 (en) | User-installable part installation detection techniques | |
| CN114828727A (zh) | Computer-assisted surgery system, surgical control device, and surgical control method | |
| WO2017098504A1 (en) | Automated detection of malfunction in surgical tools | |
| CN120302940A (zh) | Method and system for estimating position data in an image | |
| EP4143844A1 (en) | System and method for tracking a portion of the user as a proxy for non-monitored instrument | |
| WO2025184368A1 (en) | Anatomy-based force feedback and instrument guidance | |
| WO2025027463A1 (en) | System and method for processing combined data streams of surgical robots | |
| WO2024201216A1 (en) | Surgical robotic system and method for preventing instrument collision | |
| CN119816252A (zh) | Surgical robot system and method for intraoperative fusion of different imaging modalities | |
| WO2025184378A1 (en) | Updating a user interface based on force applied by an instrument during teleoperation | |
| US20240070875A1 (en) | Systems and methods for tracking objects crossing body wall for operations associated with a computer-assisted system | |
| WO2025245005A1 (en) | Endoscopic surgical navigation with a 3D surgical workspace | |
| WO2025136980A1 (en) | Surgical performance management via triangulation adjustment | |
| WO2025194117A1 (en) | Detection of interaction between robotic medical instruments and anatomical structures | |
| WO2025179100A1 (en) | System and method for teleoperation control based on hand presence and input device drift | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 25713095 Country of ref document: EP Kind code of ref document: A1 |