
WO2025184382A1 - Configuration of force data of robotic systems for compatibility with network transmissions - Google Patents

Configuration of force data of robotic systems for compatibility with network transmissions

Info

Publication number
WO2025184382A1
Authority
WO
WIPO (PCT)
Prior art keywords
force
points
threshold
processors
instruments
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/017655
Other languages
French (fr)
Inventor
Hong Seo Lim
Kenneth WHALER
Xinyu Fu
Anthony M. JARC
Xi Liu
Aron MIZRAHI
Jeffrey OUSLEY
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc filed Critical Intuitive Surgical Operations Inc
Publication of WO2025184382A1 publication Critical patent/WO2025184382A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/37 Leader-follower robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/77 Manipulators with motion or force scaling
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes

Definitions

  • the present implementations relate generally to medical devices, including but not limited to configuration of force data of robotic systems for compatibility with network transmissions.
  • Systems, methods, apparatuses, and non-transitory computer-readable media are provided at least for selecting relevant videos or video clips of medical procedures based on force applied by instruments of a robotic device during a medical procedure.
  • implementations according to this disclosure can recommend videos of portions of medical procedures in which force applied meets or exceeds a force threshold.
  • implementations according to this disclosure can down-sample force data to enable transmission to user devices for presentation, while retaining force data indicative of high-force events during the medical procedure.
  • implementations according to this disclosure can validate surgical events (e.g., tool contact or collision) using force data.
  • implementations according to this disclosure can provide visual annotations or overlays of video or video clips based on force data (e.g., present one or more heat map overlays each having a color that indicates a magnitude or direction of force applied to or by a given instrument).
  • At least one aspect is directed to a system.
  • the system can include one or more processors, coupled with memory.
  • the system can receive a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure.
  • the system can remove one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic system.
  • the system can identify, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system.
  • the system can cause a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
  • At least one aspect is directed to a method.
  • the method can include receiving a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure.
  • the method can include removing one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic system.
  • the method can include identifying, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system.
  • the method can include causing a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
  • At least one aspect is directed to a non-transitory computer readable medium that can include one or more instructions stored thereon and executable by a processor.
  • the processor can receive a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure.
  • the processor can remove one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic system.
  • the processor can identify, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system.
  • the processor can cause a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
  • FIG. 1 depicts an example system according to this disclosure.
  • FIG. 2 depicts an example computer architecture according to this disclosure.
  • FIG. 3 depicts an example force data model according to this disclosure.
  • FIG. 4 depicts an example modified force data structure according to this disclosure.
  • FIG. 5A depicts an example user interface presentation according to this disclosure.
  • FIG. 5B depicts an example modified user interface presentation according to this disclosure.
  • FIG. 6 depicts an example method of configuration of force data of robotic systems for compatibility with network transmissions according to this disclosure.
  • FIG. 7A depicts an example of force data models according to this disclosure.
  • FIG. 7B depicts an example of modified force data models according to this disclosure.
  • FIG. 7C depicts an example of filtered force data models according to this disclosure.
  • aspects of technical solutions of this disclosure are directed to the configuration of force data of robotic systems for compatibility with network transmission.
  • the technical solutions described herein can use one or more thresholds to down-sample force data and identify force events.
  • the technical solutions of this disclosure can include a data processing system that can apply minimum force thresholds and maximum force thresholds to respectively down-sample force data and identify high-force events.
  • the data processing system can identify a minimum force threshold below which force data is not transmitted or stored as discrete values (e.g., force values below a given quantitative value can be ignored).
  • a minimum force threshold can be an experimentally-derived value of 0.6 Newtons (N).
  • the data processing system of this technical solution can identify a maximum force threshold above which a high-force state is associated.
  • maximum force threshold can be a quantitative value over 0.6 N, and can be either a static value or a dynamic value based on tissue type associated with the high-force application.
  • Force data can be obtained from telemetry of a robotic system and can be correlated with timestamps associated with the medical procedure.
  • this technical solution can achieve a technical improvement of reducing force data from millions of data points to 800 data points or less, with little to no loss of visibility into high-force events, allowing force data to be consumable by client computers (e.g., surgeon’s desktop or tablet) via wireless networks.
  • the technical solutions of this disclosure can enable or facilitate transmission of force data for display, presentation, or rendering by computing systems or display devices, while retaining force data indicative of relative or high-force events during the medical procedure. Further, by down-sampling force data, the technical solutions can facilitate deployment of force-based video for review across a wide range of client devices or types of computing devices due to the reduction in bandwidth due to down-sampling.
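  • As a rough illustration of the down-sampling and high-force identification described above, the following Python sketch removes points at or below an assumed minimum force threshold (0.6 N, per the example above) and flags points that meet an assumed maximum force threshold; the data layout, the 2.5 N value, and the function names are illustrative assumptions rather than the patented implementation.
```python
from dataclasses import dataclass
from typing import List

# Illustrative thresholds; 0.6 N follows the example minimum force threshold above,
# and 2.5 N is an assumed maximum (high-force) threshold.
MIN_FORCE_N = 0.6
MAX_FORCE_N = 2.5

@dataclass
class ForcePoint:
    timestamp_s: float   # relative time from the start of the procedure
    instrument: str      # instrument identifier (e.g., "forceps")
    magnitude_n: float   # unsigned force magnitude in Newtons

def downsample(points: List[ForcePoint],
               min_force: float = MIN_FORCE_N) -> List[ForcePoint]:
    """Drop points at or below the minimum force threshold (noise floor)."""
    return [p for p in points if p.magnitude_n > min_force]

def high_force_points(points: List[ForcePoint],
                      max_force: float = MAX_FORCE_N) -> List[ForcePoint]:
    """Keep only points that meet or exceed the high-force threshold."""
    return [p for p in points if p.magnitude_n >= max_force]

if __name__ == "__main__":
    raw = [ForcePoint(t * 0.02, "forceps", m)  # 50 Hz samples
           for t, m in enumerate([0.1, 0.4, 0.7, 1.1, 2.6, 0.3, 2.8, 0.2])]
    kept = downsample(raw)
    events = high_force_points(kept)
    print(f"{len(raw)} raw points -> {len(kept)} kept -> {len(events)} high-force")
```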
  • the data processing system can include, access, or otherwise utilize a machine learning model that can identify videos or video clips for one or more medical procedures based on force data.
  • the machine learning model can generate a density plot of forces associated with one or more instruments of a robotic system during one or more medical procedures, and can generate “fingerprints” for one or more videos or video clips.
  • video clips can be portions of videos that span a medical procedure, and can be associated with discrete portions of the workflow (e.g., task or phase) and discrete force data or ranges of force data from the robotic system.
  • the machine learning model can allow a surgeon reviewing medical procedures for force to rapidly review or cycle between many high-force video clips relevant to a particular robotic system, medical procedure, or medical procedure workflow.
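  • One hedged way to realize the "fingerprint" idea above, assuming force samples are already grouped per video clip or workflow segment, is to bin the magnitudes into a normalized histogram; the bin edges and names below are assumptions for illustration only.
```python
from typing import Dict, List, Sequence

def force_fingerprint(magnitudes: Sequence[float],
                      bin_edges: Sequence[float] = (0.6, 1.0, 1.5, 2.0, 2.5)) -> List[float]:
    """Normalized histogram ("fingerprint") of force magnitudes for one clip."""
    counts = [0] * (len(bin_edges) + 1)
    for m in magnitudes:
        counts[sum(m >= edge for edge in bin_edges)] += 1  # index of the containing bin
    total = sum(counts) or 1
    return [c / total for c in counts]

def fingerprints_by_clip(samples: Dict[str, List[float]]) -> Dict[str, List[float]]:
    """One fingerprint per video clip, keyed by an assumed clip identifier."""
    return {clip_id: force_fingerprint(mags) for clip_id, mags in samples.items()}

# Example: two clips from different (assumed) workflow phases
clips = {"dissection_clip": [0.7, 1.2, 2.6, 2.8], "suturing_clip": [0.8, 0.9, 1.1]}
print(fingerprints_by_clip(clips))
```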
  • FIG. 1 depicts an example system according to this disclosure.
  • a system 100 can include at least a network 101, a data processing system 102, a client system 103, and a robotic system 104 (which can include or be referred to as a robotic medical system 104).
  • the network 101 can include any type or form of network.
  • the geographical scope of the network 101 can vary widely and the network 101 can include a body area network (BAN), a personal area network (PAN), a local-area network (LAN), e.g., Intranet, a metropolitan area network (MAN), a wide area network (WAN), or the Internet.
  • the topology of the network 101 can be of any form and can include, e.g., any of the following: point-to-point, bus, star, ring, mesh, or tree.
  • the network 101 can include an overlay network which is virtual and sits on top of one or more layers of other networks 101.
  • the network 101 can be of any such network topology as known to those ordinarily skilled in the art capable of supporting the operations described herein.
  • the network 101 can utilize different techniques and layers or stacks of protocols, including, e.g., the Ethernet protocol, the Internet protocol suite (TCP/IP), the ATM (Asynchronous Transfer Mode) technique, the SONET (Synchronous Optical Networking) protocol, or the SDH (Synchronous Digital Hierarchy) protocol.
  • the TCP/IP Internet protocol suite can include application layer, transport layer, Internet layer (including, e.g., IPv6), or the link layer.
  • the network 101 can include a type of a broadcast network, a telecommunications network, a data communication network, or a computer network.
  • the data processing system 102 can include a physical computer system operatively coupled or that can be coupled with one or more components of the system 100, either directly or indirectly through an intermediate computing device or system.
  • the data processing system 102 can include a virtual computing system, an operating system, and a communication bus to effect communication and processing.
  • the data processing system 102 can include a system processor 110, an interface controller 112, a video data processor 120, a force data processor 130, a force event processor 140, and a system memory 150.
  • the system processor 110 can execute one or more instructions associated with the system 100.
  • the system processor 110 can include an electronic processor, an integrated circuit, or the like including one or more of digital logic, analog logic, digital sensors, analog sensors, communication buses, volatile memory, nonvolatile memory, and the like.
  • the system processor 110 can include, but is not limited to, at least one microcontroller unit (MCU), microprocessor unit (MPU), central processing unit (CPU), graphics processing unit (GPU), physics processing unit (PPU), embedded controller (EC), or the like.
  • the system processor 110 can include a memory operable to store or storing one or more instructions for operating components of the system processor 110 and operating components operably coupled to the system processor 110.
  • the one or more instructions can include one or more of firmware, software, hardware, operating systems, embedded operating systems.
  • the system processor 110 or the data processing system 102 generally can include one or more communication bus controllers to effect communication between the system processor 110 and the other elements of the system 100.
  • the interface controller 112 can link the data processing system 102 with one or more of the network 101 and the client system 103, by one or more communication interfaces.
  • a communication interface can include, for example, an application programming interface (“API”) compatible with a particular component of the data processing system 102, or the client system 103.
  • the communication interface can provide a particular communication protocol compatible with a particular component of the data processing system 102 and a particular component of the client system 103.
  • the interface controller 112 can be compatible with particular content objects and can be compatible with particular content delivery systems corresponding to particular content objects, structures of data, types of data, or any combination thereof.
  • the interface controller 112 can be compatible with transmission of text data or binary data structured according to one or more metrics or data of the system memory 150.
  • the video data processor 120 can identify one or more features in depictions in video data as discussed herein.
  • the depictions can include portions of a patient site, one or more medical instruments, or any combination thereof, but are not limited thereto.
  • the video data processor 120 can identify one or more edges, regions, or a structure within an image and associated with the depictions.
  • an edge can correspond to a line in an image that separates two depicted objects (e.g., a delineation between an instrument and a patient site).
  • a region can correspond to an area in an image that at least partially corresponds to a depicted object (e.g., an instrument tip).
  • a structure can correspond to an area in an image that at least partially corresponds to a portion of a depicted object or a predetermined type of an object (e.g., a scalpel edge).
  • the video data processor 120 can include or correspond to a first machine learning model configured to identify the one or more features in the one or more images.
  • the system can generate, with the first machine learning model and based on the first feature, a second feature indicating an economy of motion for each of the plurality of videos.
  • the force data processor 130 can process one or more metrics indicative of forces associated with one or more components of the robotic system 104 with respect to one or more given medical procedures or medical procedures of a given type.
  • the metrics can correspond to one or more quantitative values of force applied by an instrument or an actuator associated with the instrument at a given time or over a given time period for a given medical procedure.
  • the metrics can be indicative of a forceps instrument closing at a magnitude of 1.1 N at a given time during a medical procedure.
  • the metrics can correspond to one or more quantitative values of force applied to an instrument or an actuator associated with the instrument at a given time or over a given time period for a given medical procedure.
  • the metrics can be indicative of force at a magnitude of 1.1 N against a forceps instrument hindered from moving freely at the patient site through contact with tissue, at a given time during a medical procedure.
  • the force data processor 130 can generate one or more force data structures corresponding to force associated with one or more instruments, and can transmit the force data structures to the force event processor 140.
  • the force event processor 140 can determine one or more states of the robotic device that correspond to one or more events of the medical procedure.
  • an event can correspond to a set of one or more metrics indicative of a state of one or more of the robotic system 104, any component thereof, a patient, a patient site, or a medical environment.
  • the state can be associated with or indicative of a given medical procedure.
  • a set of force data for a forceps instrument of the robotic device 104 having a quantitative value of 2.5 N, during a specific gall bladder removal procedure, can be indicative of a high-force event of a specific medical procedure.
  • the force event processor 140 can be configured to process force data from one or more medical procedures to identify high-force events in one or more given medical procedures.
  • the force event processor 140 can modify one or more videos or video segments corresponding to the video data 152, according to one or more states corresponding to one or more components of the robotic system 104. For example, the force event processor 140 can identify a state corresponding to one or more instruments based on one or more metrics determined or identified by the force data processor 210. The force event processor 140 can identify a video segment corresponding to one or more magnitudes of force at one or more given times or time period, based on one or more metrics determined or identified by the force data processor 210. For example, the force event processor 140 can identify a video segment for the gall bladder medical procedure corresponding to a time or time period correlated with or matching the time or time period indicated by the force data for the high-force event.
  • the force event processor 140 can provide a technical improvement to retrieve video segments that depict force events for given medical procedures, beyond the capability of manual processes.
  • the force event processor 140 can instruct the video data processor to modify at least a portion of at least one image corresponding to a video or video segment, to indicate a state corresponding to one or more magnitudes of force determined or identified by the force data processor 210.
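  • A minimal sketch of correlating high-force event times with video segments, as described above, assuming each segment is described by a start and end time in seconds from the start of the procedure; the segment structure and names are illustrative.
```python
from dataclasses import dataclass
from typing import List

@dataclass
class VideoSegment:
    label: str        # e.g., a task or phase of the workflow
    start_s: float    # segment start time, seconds from procedure start
    end_s: float      # segment end time

def segments_for_events(segments: List[VideoSegment],
                        event_times_s: List[float]) -> List[VideoSegment]:
    """Return segments whose time span contains at least one high-force event time."""
    hits = []
    for seg in segments:
        if any(seg.start_s <= t <= seg.end_s for t in event_times_s):
            hits.append(seg)
    return hits

# Toy example: two workflow segments, two high-force events inside the first one.
segments = [VideoSegment("dissection", 0, 900), VideoSegment("clipping", 900, 1800)]
print(segments_for_events(segments, event_times_s=[620.5, 640.0]))
```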
  • the system memory 150 can store data associated with the system 100.
  • the system memory 150 can include one or more hardware memory devices to store binary data, digital data, or the like.
  • the system memory 150 can include one or more electrical components, electronic components, programmable electronic components, reprogrammable electronic components, integrated circuits, semiconductor devices, flip flops, arithmetic units, or the like.
  • the system memory 150 can include at least one of a non-volatile memory device, a solid-state memory device, a flash memory device, or a NAND memory device.
  • the system memory 150 can include one or more addressable memory regions disposed on one or more physical memory arrays.
  • a physical memory array can include a NAND gate array disposed on, for example, at least one of a particular semiconductor device, integrated circuit device, and printed circuit board device.
  • the system memory 150 can include a video data 152, robot metrics 154, performance metrics 156, and patient metrics 158.
  • the system memory 150 can correspond to a non-transitory computer readable medium.
  • the non-transitory computer readable medium can include one or more instructions executable by the system processor 110.
  • the processor can select, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold.
  • the non-transitory computer readable medium can further include one or more instructions executable by the processor.
  • the processor can select, according to a type of at least one of the instruments, a quantitative value of the second threshold.
  • the non-transitory computer readable medium can further include one or more instructions executable by the processor.
  • the processor can select, according to a type of at least one of the instruments or according to a type of segment of the medical procedure, a quantitative value of the second threshold.
  • the video data 152 can depict one or more medical procedures from one or more viewpoints associated with corresponding medical procedures.
  • the video data 152 can correspond to still images or frames of video images that depict at least a portion of a medical procedure, medical environment, or patient site from a given viewpoint.
  • the video data processor 120 can identify one or more depictions in an image or across a plurality of images. Each time can, for example, be associated with a given task or phase of a workflow as occurring during that task or phase.
  • the robot metrics 154 can be indicative of one or more states of one or more components of the robotic system 104.
  • Components of the robotic system 104 can include, but are not limited to, actuators of the robotic system 104 as discussed herein.
  • the robot metrics 154 can include one or more data points indicative of one or more of an activation state (e.g., activated or deactivated), a position, or orientation of a component of the robotic system 104.
  • the robot metrics 154 can be linked with or correlated with one or more medical procedures, one or more phases of a given medical procedure, or one or more tasks of a given phase of a given medical procedure.
  • a robot metric among the robot metrics 154 can correspond to corresponding positions of one or more actuators of a given arm of the robotic system 104 at a given time or over a given time period. Each time can, for example, be associated with a given task or phase of a workflow as occurring during that task or phase.
  • the performance metrics 156 can be indicative of one or more actions during one or more medical procedures.
  • the performance metrics 156 can correspond to OPIs as discussed herein.
  • the patient metrics 158 can be indicative of one or more characteristics of a patient during one or more medical procedures.
  • the patient metrics 158 can indicate various conditions (e.g., diabetes, low blood pressure, blood clotting) or various traits (e.g., age, weight) corresponding to the patient in each medical procedure.
  • the force data processor 130 or the force event processor 140 can obtain the patient metrics 158, and can filter or modify any segments of video in response to the obtained patient metrics 158.
  • the force data processor 130 or the force event processor 140 can, for example, select video segments restricted to patients with patient metrics 158 indicative of diabetes.
  • the client system 103 can include a computing system associated with a database system.
  • the client system 103 can correspond to a cloud system, a server, a distributed remote system, or any combination thereof.
  • the client system 103 can include an operating system to execute a virtual environment.
  • the operating system can include hardware control instructions and program execution instructions.
  • the operating system can include a high-level operating system, a server operating system, an embedded operating system, or a boot loader.
  • the client system 103 can include a user interface 160.
  • the user interface 160 can include one or more devices to receive input from a user or to provide output to a user.
  • the user interface 160 can correspond to a display device to provide visual output to a user and one or more or user input devices to receive input from a user.
  • the input devices can include a keyboard, mouse or touch-sensitive panel of the display device, but are not limited thereto.
  • the display device can display at least one or more presentations as discussed herein, and can include an electronic display.
  • An electronic display can include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, or the like.
  • the display device can receive, for example, capacitive or resistive touch input.
  • the display device can be housed at least partially within the client system 103.
  • the robotic system 104 can include one or more robotic devices configured to perform one or more actions of a medical procedure (e.g., a surgical procedure).
  • a robotic device can include or be coupled with, but is not limited to, a surgical device that can be manipulated by a robotic device.
  • a surgical device can include, but is not limited to, a scalpel or a cauterizing tool.
  • the robotic system 104 can include various motors, actuators, or electronic devices whose position or configuration can be modified according to input at one or more robotic interfaces.
  • a robotic interface can include a manipulator with one or more levers, buttons, or grasping controls that can be manipulated by pressure or gestures from one or more hands, arms, fingers, or feet.
  • the robotic system 104 can include a surgeon console in which the surgeon can be positioned (e.g., standing or seated) to operate the robotic system 104.
  • the robotic system 104 is not limited to a surgeon console co-located or on-site with the robotic system 104.
  • the robotic system 104 can include an instrument(s) 170.
  • the instrument(s) 170 can include components of the robotic system 104 that can be moved in response to input by a surgeon at the surgeon console of the robotic device 104.
  • the components can correspond to or include one or more actuators that can each move or otherwise change state to operate one or more of the instruments 170 of the robotic device 104.
  • each of the instruments can include one or more sensors or be associated with one or more sensors to provide haptic feedback from the robotic system 104.
  • the haptic feedback can include one or more data points according to the robot metrics 154, or that can be indicative of the robot metrics 154.
  • FIG. 2 depicts an example computer architecture according to this disclosure.
  • a computer architecture 200 can include at least a force data processor 210, and a force event processor 220.
  • the force data processor 210 can correspond at least partially in one or more of structure and operation to the force data processor 130.
  • the force data processor 210 can include a threshold processor 212, a point processor 214, and a signature processor 216.
  • the force event processor 220 can correspond at least partially in one or more of structure and operation to the force event processor 140.
  • the force event processor 220 can include an instrument metrics processor 222, a video segmentation processor 224, and a video annotation processor 226.
  • the threshold processor 212 can provide one or more thresholds corresponding to one or more force events.
  • the threshold processor 212 can provide one or more thresholds indicative of quantitative values of force data.
  • the threshold processor 212 can provide a minimum force threshold indicative of a minimum magnitude of force to be recorded or processed for the medical procedure.
  • the threshold processor 212 can provide a maximum force threshold indicative of a minimum magnitude of force associated with a high-force event for the medical procedure.
  • the threshold processor 212 can provide one or more minimum force thresholds each corresponding to a given instrument of the robotic system 104, a given medical procedure, or a given instrument of the robotic system 104 for a given medical procedure.
  • the threshold processor 212 can provide one or more maximum force thresholds each corresponding to a given instrument of the robotic system 104, a given medical procedure, or a given instrument of the robotic system 104 for a given medical procedure.
  • the threshold processor 212 can determine one or more thresholds according to one or more criteria associated with a medical procedure, type of medical procedure, robotic system, type of robotic system, instrument for a robotic system, or type of instrument of a robotic system, but is not limited thereto.
  • the threshold processor 212 can determine a minimum force threshold corresponding to a given instrument during a given medical procedure, based on a physical property of a type of tissue at a patient site for the medical procedure.
  • the system 100 can select, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold.
  • the portion of a patient site can correspond to patient tissue.
  • the physical property can correspond to a tensile strength of the patient tissue.
  • the threshold processor 212 can generate dynamic thresholds for high-force events based on instrument type.
  • the threshold processor 212 can generate dynamic thresholds for high-force events based on segment of a workflow of a medical procedure or type of medical procedure (e.g., task or phase).
  • the threshold processor 212 can generate thresholds based on skill level, complexity of medical procedure, or anatomical structure.
  • the threshold processor 212 can obtain a skill level of a surgeon associated with a request for video.
  • the threshold processor 212 can obtain the skill level based on one or more of the performance metrics 156 associated with the surgeon, and can provide a threshold that is configured to the surgeon providing the request.
  • a novice surgeon having one or more performance metrics 156 can request videos according to a medical procedure.
  • the system can select, according to a type of at least one of the instruments, a quantitative value of the second threshold.
  • the second threshold can correspond to a maximum force threshold.
  • the threshold processor 212 can determine, based on the OPIs or the correlation of the OPIs with a novice surgeon profile, to provide a maximum force threshold that is lower than a maximum force threshold for an expert surgeon.
  • the system 100 can select, according to a type of segment of the medical procedure, a quantitative value of the second threshold.
  • the threshold processor 212 can determine, based on one or more of the performance metrics 156 or one or more of the patient metrics 158, to provide a first maximum force threshold that is lower than a second maximum force threshold, where the first maximum force threshold is associated with a preoperative phase of a workflow or a task performed during the preoperative phase of the workflow, and the second maximum force threshold is associated with a preoperative phase of a workflow or a task performed during the preoperative phase of the workflow.
  • the threshold processor 212 can provide a technical improvement to provide thresholds corresponding to multiple aspects of a medical procedure.
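  • As an illustration of such dynamic thresholds, the sketch below composes an assumed maximum force threshold from instrument type, workflow phase, and surgeon skill level; every numeric value and key name is a placeholder, not a value taken from this disclosure.
```python
# Placeholder baseline thresholds (Newtons) per instrument type.
BASE_MAX_FORCE_N = {"forceps": 2.5, "needle_driver": 3.0}

# Placeholder multipliers per workflow phase and per skill level.
PHASE_FACTOR = {"preoperative": 0.8, "dissection": 1.0, "closure": 0.9}
SKILL_FACTOR = {"novice": 0.8, "expert": 1.0}

def max_force_threshold(instrument_type: str,
                        phase: str,
                        skill: str,
                        default_n: float = 2.5) -> float:
    """Compose a dynamic high-force threshold from illustrative lookup tables."""
    base = BASE_MAX_FORCE_N.get(instrument_type, default_n)
    return base * PHASE_FACTOR.get(phase, 1.0) * SKILL_FACTOR.get(skill, 1.0)

# A novice reviewing a dissection phase gets a lower threshold than an expert.
print(max_force_threshold("forceps", "dissection", "novice"))  # 2.0
print(max_force_threshold("forceps", "dissection", "expert"))  # 2.5
```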
  • the point processor 214 can obtain one or more points associated with force data as discussed herein.
  • the point processor 214 can generate one or more points including data indicative of one or more of a magnitude of force, an instrument at which the force is detected, a type of the force, a time associated with the detection of the force, or any combination thereof.
  • the type of the force can indicate whether a force is applied by an instrument, or being applied to an instrument.
  • the point processor 214 can obtain a point set including a plurality of points collectively indicative of force corresponding to a given instrument for a given medical procedure, and can modify the point set to add, remove, or modify one or more points in accordance with one or more thresholds provided by the threshold processor 212 that are applicable to the point set.
  • the point processor 214 can remove one or more points of a point set having a magnitude below a minimum force threshold for the medical procedure associated with the points of the point set.
  • the system can receive a second data set corresponding to positions of the one or more instruments of the robotic medical system during the medical procedure.
  • the system can identify, with the machine learning model and from the data set with the one or more first points removed, one or more third points that do not satisfy the second threshold.
  • the system can determine, based on the one or more third points and the positions, contact between the one or more instruments.
  • the second data set can correspond to the point set or be derived by the point processor 214 from one or more of the robot metrics 154.
  • the signature processor 216 can provide one or more signatures.
  • a signature can include a data structure that is indicative of one or more of a task, a phase, an event, or any combination thereof as discussed herein.
  • the signature processor 216 can identify one or more of a task, a phase, an event corresponding to a given medical procedure or a given type of medical procedure, according to a value of a given signature.
  • the signature processor 216 can generate a signature indicative of force at a given task, phase or event of a given medical procedure.
  • the signature processor 216 can generate a force signature based on at least one of force data as discussed herein, one or more OPIs indicative of force, one or more OPIs associated with force data, or any combination thereof.
  • the signature processor 216 can generate a force signature from force data segmented according to segments of a workflow (e.g., task or phase). For example, the signature processor 216 can generate a force signature including only segments of a workflow that at least partially include points that meet (e.g., exceed) a minimum force threshold. For example, the signature processor 216 can generate a force signature excluding segments of a workflow that at least partially include points that meet (e.g., exceed) a maximum force threshold.
  • the signature processor 216 can provide one or more signatures to the threshold processor 212.
  • the threshold processor 212 can identify one or more dynamic thresholds by receiving one or more of the signatures provided by the signature processor 216 as input.
  • the threshold processor 212 can identify one or more dynamic thresholds via a machine learning model integrated with the threshold processor 212.
  • the system can generate, based on one or more of the second points that satisfy the second threshold, a first signature indicative of the force at the one or more instruments of the robotic medical system during the medical procedure.
  • the signature processor 216 can generate the first signature.
  • the system can determine that one or more second signatures each satisfy a third threshold indicative of a similarity with the first signature.
  • the system can select, according to the one or more second signatures, the one or more videos.
  • the signature processor 216 can determine that the one or more second signatures each satisfy the third threshold.
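  • A minimal sketch of selecting videos by signature similarity, assuming signatures are fixed-length numeric vectors (for example, the fingerprint histograms sketched earlier) and that the third threshold is a cosine-similarity cutoff; the representation and the 0.9 cutoff are assumptions.
```python
import math
from typing import Dict, List

def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two equal-length signature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def select_similar_videos(first_signature: List[float],
                          library: Dict[str, List[float]],
                          similarity_threshold: float = 0.9) -> List[str]:
    """Return video ids whose stored (second) signature satisfies the similarity threshold."""
    return [video_id for video_id, sig in library.items()
            if cosine_similarity(first_signature, sig) >= similarity_threshold]

# Toy library of stored signatures keyed by assumed video identifiers.
library = {"video_a": [0.1, 0.3, 0.6], "video_b": [0.7, 0.2, 0.1]}
print(select_similar_videos([0.1, 0.25, 0.65], library))  # ['video_a']
```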
  • the instrument metrics processor 222 can determine one or more states associated with one or more instruments 170 of the robotic system 104.
  • the instrument metrics processor 222 can identify a state corresponding to a position of one or more of the instruments 170.
  • the position of the instrument 170 can correspond to an absolute position having one or more absolute coordinates in a coordinate space for the patient site or the medical environment.
  • the position of the instrument 170 can correspond to a relative position having one or more relative coordinates in a coordinate space with respect to the given instrument 170 and one or more of another instrument 170 of the robotic system 104 and a tissue structure of the patient site.
  • the instrument metrics processor 222 can detect a relative position between two or more instruments 170 (e.g., distance).
  • the instrument metrics processor 222 can detect a relative position between an instrument 170 and a tissue structure (e.g., distance).
  • the instrument metrics processor 222 can remove one or more points indicative of force, based on one or more relative or absolute positions as discussed herein. For example, the instrument metrics processor 222 can filter forces in response to a determination that two instruments are located at the same absolute position or have a relative position indicating a distance of zero or less. The same absolute position or the relative position indicating a distance of zero or less can be indicative of a collision or contact between instruments. Collision or contact between instruments can be associated with high-force data. For example, the instrument metrics processor 222 can detect collisions from kinematics data (e.g., overlap of instruments based on coordinates, yaw, pitch, roll) corresponding to the robot metrics 154.
  • the instrument metrics processor 222 can detect collisions based on one or more force signatures or similarity among one or more force signatures.
  • the instrument metrics processor 222 can remove points associated with contact or collision, in accordance with a determination or a condition to treat collision and contact between instruments 170 as noise.
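  • The following sketch shows one way contact or collision between instruments could be inferred from kinematics and used to filter force points, assuming instrument tip positions are available as 3D coordinates at matching timestamps; the 1 mm contact distance and data layout are assumptions.
```python
import math
from typing import Dict, List, Tuple

Position = Tuple[float, float, float]  # x, y, z in a shared coordinate space (mm)

def contact_timestamps(positions_a: Dict[float, Position],
                       positions_b: Dict[float, Position],
                       contact_distance_mm: float = 1.0) -> List[float]:
    """Timestamps where two instruments are close enough to be treated as in contact."""
    shared = sorted(set(positions_a) & set(positions_b))
    return [t for t in shared
            if math.dist(positions_a[t], positions_b[t]) <= contact_distance_mm]

def filter_contact_points(force_points: List[Tuple[float, float]],
                          contact_times: List[float]) -> List[Tuple[float, float]]:
    """Drop (timestamp, magnitude) force points that coincide with instrument contact."""
    contact = set(contact_times)
    return [p for p in force_points if p[0] not in contact]

# Toy kinematics: the instruments approach each other at t = 0.1 s.
a = {0.0: (10.0, 0.0, 0.0), 0.1: (1.0, 0.0, 0.0)}
b = {0.0: (20.0, 0.0, 0.0), 0.1: (1.5, 0.0, 0.0)}
forces = [(0.0, 0.8), (0.1, 3.2)]
print(filter_contact_points(forces, contact_timestamps(a, b)))  # [(0.0, 0.8)]
```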
  • this technical solution can provide a technical improvement at least to a graphical user interface (GUI), with respect at least to presentation of video augmented with indications of various metrics as discussed herein.
  • this technical solution can provide a technical improvement at least to improve a user experience by improvements to the user interface as discussed herein.
  • the video segmentation processor 224 can identify portions of the video data 152 according to one or more of the robot metrics 154, the performance metrics 156, and the patient metrics 158. For example, the video segmentation processor 224 can divide one or more videos for corresponding medical procedures according to one or more phases or tasks for that medical procedure.
  • the video annotation processor 226 can modify one or more images or portions of images according to force data associated with that image. For example, the video annotation processor 226 can instruct the video data processor to apply an annotation for a given instrument 170 at one or more given times. In response, the video annotation processor 226 can identify the given instrument 170 in one or more images, and can apply the annotation to the image or images including the given instrument 170.
  • an annotation can include a presentation of text, values or graphics descriptive of the given instrument or the force data corresponding to the given instrument 170 at the given time.
  • the annotation can include a modification of at least a portion of the image corresponding to the given instrument 170 at the given time (e.g., an overlay highlighting the image with a given color, at a given transparency level).
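  • A minimal sketch of this overlay style of annotation, assuming an RGB frame as a NumPy array and a boolean mask marking the pixels that depict the instrument; the red color and 50% transparency follow the example above, while the frame and mask are placeholders.
```python
import numpy as np

def highlight_instrument(frame: np.ndarray,
                         mask: np.ndarray,
                         color=(255, 0, 0),
                         alpha: float = 0.5) -> np.ndarray:
    """Blend a semi-transparent color over the masked (instrument) pixels of an RGB frame."""
    out = frame.astype(np.float32).copy()
    overlay = np.array(color, dtype=np.float32)
    out[mask] = (1.0 - alpha) * out[mask] + alpha * overlay
    return out.astype(np.uint8)

# Toy 4x4 frame with the instrument occupying the left half.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[:, :2] = True
print(highlight_instrument(frame, mask)[0, 0])  # blended pixel, e.g. [191 64 64]
```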
  • the video segmentation processor 224 and the video annotation processor 226 can each provide a technical improvement to modify video data to present force data as discussed herein with respect to given medical procedures or types of medical procedures, beyond the capability of manual processes.
  • FIG. 3 depicts an example force data model according to this disclosure.
  • a force data model 300 can include at least a lower force threshold 302 and force metrics 310.
  • the force data model 300 is illustrated by way of example in FIG. 3 as a chart including the force metrics 310 arranged as a line graph.
  • the example chart of FIG. 3 depicts quantitative values of the force metrics 310 including a time component and a force component.
  • a time component according to the force data model 300 can include a timestamp indicative of a relative time from the start of a given medical procedure, or an absolute timestamp according to a UNIX timestamp.
  • a force component according to the force data model 300 can include a quantitative value indicating a signed or unsigned magnitude.
  • the quantitative value of the force component can correspond to Newtons (N), for example.
  • the lower force threshold 302 can be indicative of a minimum force threshold.
  • the lower force threshold 302 can be indicative of a magnitude of force of 0.6 N.
  • the threshold processor 212 can determine the lower force threshold 302 as discussed herein.
  • the point processor can identify all points of the force metrics 310 having corresponding force components at or below a magnitude (either signed or unsigned) of the lower force threshold 302.
  • the force metrics 310 can correspond to an instance of the robot metrics 154.
  • the force metrics 310 can correspond to a set of points as discussed herein for a given medical procedure, plurality of medical procedures corresponding to a type of medical procedure, a plurality of medical procedures corresponding to a given surgeon or group of surgeons, or any combination thereof, but is not limited thereto.
  • the force metrics 310 can include all or substantially all points detected at one or more of the instruments 170 during one or more medical procedures.
  • the force metrics 310 can include all points captured from one or more instruments at a sampling frequency between 30 Hz and 50 Hz.
  • the force metrics 310 can include approximately two million points for a medical procedure including a surgery of three hours.
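  • As a rough check under the sampling assumption above: at 50 Hz, a three-hour procedure spans 3 × 3,600 s × 50 Hz = 540,000 samples per force channel, so a robotic system streaming several instruments or force components can plausibly accumulate on the order of two million points.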
  • the force metrics 310 can include a first peak 320, and a second peak 322.
  • the first peak 320 can include a first set of force metrics indicative of a first set of one or more relative peaks of the force metrics 310.
  • the first peak 320 can correspond to points of the force metrics 310 captured during a first task or phase of the medical procedure.
  • the second peak 322 can include a second set of force metrics indicative of a second set of one or more relative peaks of the force metrics 310.
  • the second peak 322 can correspond to points of the force metrics 310 captured during a second task or phase of the medical procedure.
  • the modified force metrics 410 can include a subset of the force metrics 310.
  • the point processor can remove all points of the force metrics 310 having corresponding force components at or below a magnitude (either signed or unsigned) of the lower force threshold 302, to generate or obtain the modified force metrics 410.
  • the modified force metrics 410 can include only points at or above the lower force threshold 302.
  • the modified force metrics 410 can include fewer than 8,000 points for a medical procedure including a surgery of three hours.
  • the point processor 214 can provide a technical improvement to significantly reduce data density of force data associated with a medical procedure without reducing granularity of force data, via a technical solution including one or more thresholds determined according to one or more characteristics of one or more medical procedures.
  • the modified force metrics 410 can include a first peak 320, and a second peak 322.
  • the first peak 320 can include a first set of force metrics indicative of a first set of one or more relative peaks of the force metrics 310.
  • the first peak 320 can correspond to points of the force metrics 310 captured during a first task or phase of the medical procedure.
  • the second peak 322 can include a second set of force metrics indicative of a second set of one or more relative peaks of the force metrics 310.
  • the second peak 322 can correspond to points of the force metrics 310 captured during a second task or phase of the medical procedure.
  • the modified force metrics 410 can include a first maximum peak satisfying the upper force threshold 412, a second maximum peak not satisfying the upper force threshold 414, a first set of modified force metrics at the first peak 420, and a second set of modified force metrics at the second peak 422.
  • the first maximum peak 412 satisfying the upper force threshold can include a third set of force metrics with force components having quantitative values that meet or exceed a quantitative value of the maximum force threshold 402.
  • the second maximum peak not satisfying the upper force threshold 414 can include a fourth set of force metrics with force components having quantitative values at or below the quantitative value of the maximum force threshold 402, and that meet or exceed the quantitative value of the minimum force threshold 302.
  • the first set of modified force metrics at the first peak 420 can include a subset of points at the first peak 320.
  • the first set of modified force metrics at the first peak 420 can correspond to a first denoised set of force metrics that exclude points having force components removed according to the minimum force threshold 302.
  • the second set of modified force metrics at the second peak 422 can include a subset of points at the second peak 322.
  • the second set of modified force metrics at the second peak 422 can correspond to a second denoised set of force metrics that exclude points having force components removed according to the minimum force threshold 302.
  • FIG. 5A depicts an example user interface presentation according to this disclosure.
  • a user interface presentation 500A can include at least a video data presentation 502, and a force data presentation 504.
  • the video data presentation 502 can include a depiction at the user interface 160 according to the video data 152.
  • the video data presentation 502 can correspond to an instance of one or more videos or video segments identified by the video data processor 120.
  • the video data presentation 502 can include an instrument presentation 510A.
  • the instrument presentation 510A can include a depiction of an instrument of the robotic system 104.
  • the force data presentation 504 can include one or more elements of the GUI that are indicative of force data associated with the video data presentation 502.
  • the force data presentation 504 can include control affordances and a video timeline presentation.
  • the control affordances can include GUI elements that can detect user input.
  • the control affordances can include a first affordance to highlight clinical segments of the video data presentation 502, a second affordance to highlight consoles and surgeons of the video data presentation 502, a third affordance to highlight instruments of the video data presentation 502, and a fourth affordance to highlight force of the video data presentation 502.
  • the control affordances can include a sliding bar used to manipulate output of the GUI.
  • the force data presentation 504 can include one or more elements of the GUI indicative of a sequence of timestamps or one or more time periods corresponding to a video including the video data presentation 502.
  • the force data presentation 504 can include a first force data model of a first arm 520, a second force data model of a second arm 530, a third force data model of a third arm 540, and a fourth force data model of a fourth arm 550.
  • the active instrument state indications 524 can include a first chart of force data according to the structure of the modified force data structure 400, but is not limited to the values indicated thereby.
  • a robotic system as discussed herein can include one or more instruments providing force feedback, and can include one or more instruments not providing force feedback (non-force instruments).
  • for non-force instruments, force data may not be accessible; force data can be accessed only for force feedback instruments.
  • an inactive state as discussed herein can correspond to a non-force feedback instrument during any time of a medical procedure.
  • an inactive state as discussed herein can correspond to a force feedback instrument during a time of a medical procedure in which the instrument provides force feedback at a level below a minimum force threshold as discussed herein.
  • the second force data model of a second arm 530 can include a second set of data corresponding to the structure of the modified force data structure 400, and can be transmitted to the client system 103 via the interface controller 112 of the data processing system 102.
  • the second force data model of a second arm 530 can include an active instrument state indications 532, and an inactive state indication 534.
  • the active instrument state indications 532 can correspond to one or more times during the medical procedure in which the second arm 530 is associated with force data at or above the minimum force threshold 302.
  • the active instrument state indications 532 can include a second chart of force data according to the structure of the modified force data structure 400, but is not limited to the values indicated thereby.
  • the inactive state indication 534 can correspond to one or more times during the medical procedure in which the second arm 530 is associated with force data below the minimum force threshold 302.
  • the third force data model of a third arm 540 can include a third set of data corresponding to the structure of the modified force data structure 400, and can be transmitted to the client system 103 via the interface controller 112 of the data processing system 102.
  • the third force data model of a third arm 540 can include an inactive instrument state indication 542.
  • the inactive instrument state indication 542 can correspond to one or more times during the medical procedure in which the third arm 540 is associated with force data below the minimum force threshold 302.
  • the third arm 540 remains in an inactive state during the medical procedure.
  • the second active instrument state indication 554 can correspond to one or more times during the medical procedure in which the fourth arm 550 is associated with force data at or above the minimum force threshold 302.
  • the second active instrument state indication 552 can include a second portion of a fifth chart of force data according to the structure of the modified force data structure 400, but is not limited to the values indicated thereby.
  • FIG. 5B depicts an example modified user interface presentation according to this disclosure.
  • a modified user interface presentation 500B can include at least a modified instrument presentation 510B, and a high force indication presentation 560.
  • the modified instrument presentation 510B can correspond to a modification of a portion of one or more images associated with an instrument associated with a high-force point or high-force event.
  • the first arm 520 can be depicted in the modified instrument presentation 510B to include a highlight as discussed herein.
  • the modified instrument presentation 510B can include a highlight for the portion of the video depicting the first arm 520 with a red color overlay.
  • the overlay can have a transparency of 50%.
  • the high force indication presentation 560 can present an indication of a force feedback setting.
  • the force feedback setting can dictate a magnitude of haptic force feedback delivered to a surgeon by one or more of the instruments (e.g., arms).
  • a force feedback setting can correspond to an OFF setting corresponding to no haptic feedback delivered to a surgeon.
  • a force feedback setting can correspond to a LOW setting corresponding to low haptic feedback delivered to a surgeon (e.g., minimum detectable feedback by human hands).
  • a force feedback setting can correspond to a MEDIUM setting corresponding to medium haptic feedback delivered to a surgeon (e.g., detectable feedback by human hands below a level of resistance to movement of instruments).
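  • A small sketch of how such a force feedback setting could be represented in configuration, using an enum with assumed scaling factors; the OFF, LOW, and MEDIUM names come from the examples above, but the numeric factors are placeholders.
```python
from enum import Enum

class ForceFeedbackSetting(Enum):
    """Assumed scaling applied to haptic feedback delivered to the surgeon."""
    OFF = 0.0      # no haptic feedback delivered
    LOW = 0.25     # minimally detectable feedback (placeholder factor)
    MEDIUM = 0.5   # detectable feedback below resistance level (placeholder factor)

def scaled_feedback(raw_force_n: float, setting: ForceFeedbackSetting) -> float:
    """Scale a measured force into the haptic feedback magnitude to render."""
    return raw_force_n * setting.value

print(scaled_feedback(2.0, ForceFeedbackSetting.LOW))  # 0.5
```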
  • the force magnitude presentation 564 can present a quantitative or qualitative value that is indicative of the high-force event.
  • the force magnitude presentation 564 can present a numerical value that indicates the magnitude of the high-force event (e.g., 6.2 N).
  • the force type presentation 562 can include a visualization for force data including one or more of heat map highlights of instruments in video according to force data or force OPIs.
  • the video data processor 120 can receive one or more timestamps from the force event processor 140 to segment video and, via machine learning, deliver video indicative of high-force moments in various medical procedures, to adjust a timeline to include only high-force events (e.g., a force-based highlights reel), or to generate video recommendations based on high-force events.
  • the user interface 160 can display, at the instrument presentation 510A or 510B, stacked force signatures for various segments of medical procedure.
  • a stack can include historical force signatures for a particular surgeon, type of medical procedure, site, or time interval.
  • FIG. 6 depicts an example method of configuration of force data of robotic systems for compatibility with network transmissions according to this disclosure.
  • the system 100 or any component thereof can perform method 600.
  • the method 600 can receive a data set for force during a medical procedure.
  • the data set can correspond to force data as discussed herein.
  • the method 600 can receive the data set for force at one or more instruments 170 of a robotic medical system.
  • the method can include selecting, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold, the portion of a patient site corresponding to patient tissue, and the physical property corresponding to a tensile strength of the patient tissue.
  • the first threshold can correspond to the minimum force threshold 302.
  • the method 600 can remove one or more first points from the data set that satisfy a first threshold.
  • the point processor 214 can remove one or more points below the minimum force threshold 302 as discussed herein.
  • the first threshold can be indicative of a minimum magnitude of force.
  • the second threshold can be indicative of a maximum magnitude of force (a minimal sketch of this threshold-based filtering appears after this list).
  • the method 600 can remove the first points by a threshold indicative of a first magnitude of force at the one or more instruments of the robotic system.
  • the method 600 can identify one or more second points that satisfy a second threshold.
  • the second threshold can correspond to the maximum force threshold 402.
  • the method can include selecting, according to a type of at least one of the instruments or according to a type of segment of the medical procedure, a quantitative value of the second threshold.
  • the threshold processor 212 can dynamically set the second threshold according to a type of an instrument 170 from which a given set of force data is captured.
  • the method 600 can identify the second points with a threshold indicative of a second magnitude of force at the one or more instruments.
  • the second magnitude of force can be the magnitude of force associated with the maximum force threshold 402.
  • the maximum force threshold 402 can correspond to an instrument collision, for example, if collision force data is to be filtered out.
  • the method 600 can identify the one or more second points with a machine learning model and from the data set with the one or more first points removed. For example, the method can include generating, based on one or more of the second points that satisfy the second threshold, a first signature indicative of the force at the one or more instruments of the robotic medical system during the medical procedure. For example, the method can include determining that one or more second signatures each satisfy a third threshold indicative of a similarity with the first signature. The method can include selecting, according to the one or more second signatures, the one or more videos.
  • the method can include receiving a second data set corresponding to positions of the one or more instruments of the robotic medical system during the medical procedure.
  • the method can include identifying, with the machine learning model and from the data set with the one or more first points removed, one or more third points that do not satisfy the second threshold.
  • the method can include determining, based on the one or more third points and the positions, contact between the one or more instruments.
  • the method 600 can cause a user interface to present one or more videos.
  • the user interface 160 can present one or more video or segments of video as discussed herein.
  • the method 600 can present one or more videos for one or more times during the medical procedure.
  • the method 600 can present one or more videos for the one or more second points.
  • FIG. 7A depicts an example of force data models according to this disclosure.
  • the first presentation 700A depicted in FIG. 7A can be generated or output by one or more component or system depicted herein, including, for example, data processing system 102 depicted in FIG. 1, or the computing architecture 200 depicted in FIG. 2.
  • a first presentation 700A of force data models can include at least a first force data model 702A for a first instrument 170 of the robotic system 104, and a second force data model 704A for a second instrument 170 of the robotic system 104.
  • each of the points of the first force data model 702A and the second force data model 704A can include a force type metric.
  • the force type metric can be indicative of a state of force with respect to a given one or more of the instruments 170.
  • the force type metric can be a categorical metric as discussed herein by way of example, but is not limited thereto.
  • the first force data model 702A and the second force data model 704A can include a number of the first points at a first point density indicative of raw data captured by the instrument.
  • the first point density can correspond to the first data model including points at or below the minimum force threshold.
  • the user interface 160 can present the first force data model 702A and the second force data model 704A as bar charts, to indicate presence of one or more points captured at one or more given times during a given medical procedure by each instrument.
  • FIG. 7A can depict various force feedback configurations for one or more instruments 170 of the robotic system 104.
  • the robotic system 104 can receive input corresponding to a setting, by a surgeon, of a level of force feedback (e.g., resistance to movement of an instrument 170) that is provided to the instruments 170.
  • visualizations according to the examples of FIGS. 7A-7C can reflect one or more settings of one or more force feedback configurations, to identify perceivable responses at one or more instruments 170 (e.g., manipulators of the robotic system 104).
  • this technical solution can provide a technical improvement at least to guide surgeons to using the appropriate amount of force in a medical procedure (e.g., the surgeon will use less force when the surgeon feels more force at a higher setting), beyond the capability of manual processes to achieve.
  • the first force data model 702A can include a first waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the first instrument 170.
  • the first force data model 702A can include at least a first subset of points 710A, a second subset of points 712A, a third subset of points 714A, and a fourth subset of points 716A.
  • the first subset of points 710A can correspond to a portion of the waveform indicative of a first state of the first instrument 170.
  • the first subset of points 710A can indicate a first force type associated with the first instrument 170 at various times.
  • the first force type state can be, for example, a first force configuration state indicative of a first force feedback magnitude on or by the first instrument 170.
  • the second subset of points 712A can correspond to a portion of the waveform indicative of a second state of the first instrument 170.
  • the second subset of points 712A can indicate a second force type associated with the first instrument 170 at various times.
  • the second force type state can be, for example, a second force configuration state indicative of a second force feedback magnitude on or by the first instrument 170.
  • the third subset of points 714A can correspond to a portion of the waveform indicative of a third state of the first instrument 170.
  • the third subset of points 714A can indicate a third force type associated with the first instrument 170 at various times.
  • the third force type state can be, for example, a third force configuration state indicative of a third force feedback magnitude on or by the first instrument 170.
  • the fourth subset of points 716A can correspond to a portion of the waveform indicative of a fourth state of the first instrument 170.
  • the fourth subset of points 716A can indicate a fourth force type associated with the first instrument 170 at various times.
  • the fourth force type state can be, for example, a fourth force configuration state indicative of a fourth force feedback magnitude on or by the first instrument 170.
  • the second force data model 704A can include a second waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the second instrument 170.
  • the second force data model 704A can include a number of the first points at the first point density.
  • the second force data model 704A can include at least a first subset of points 720A, a second subset of points 722A, a third subset of points 724A, and a fourth subset of points 726A.
  • the first subset of points 720A can correspond to a portion of the waveform indicative of a first state of the second instrument 170.
  • the first subset of points 720A can indicate the first force type associated with the second instrument 170 at various times.
  • the second subset of points 722A can correspond to a portion of the waveform indicative of a second state of the second instrument 170.
  • the second subset of points 722A can indicate the second force type associated with the second instrument 170 at various times.
  • the third subset of points 724A can correspond to a portion of the waveform indicative of a third state of the second instrument 170.
  • the third subset of points 724A can indicate the third force type associated with the second instrument 170 at various times.
  • the fourth subset of points 726A can correspond to a portion of the waveform indicative of a fourth state of the second instrument 170.
  • the fourth subset of points 726A can indicate the fourth force type associated with the second instrument 170 at various times.
  • FIG. 7B depicts an example of modified force data models according to this disclosure.
  • a second presentation 700B of modified force data models can include at least a modified first force data model 702B for a first instrument 170 of the robotic system 104, and a second modified force data model 704B for a second instrument 170 of the robotic system 104.
  • each of the points of the first modified force data model 702B and the second modified force data model 704B can include the force type metric as discussed herein.
  • the first modified force data model 702B and the modified second force data model 704B can include a number of the first points at a second point density that excludes points having force metrics indicative of magnitudes of force at or below the minimum force threshold.
  • the second point density can correspond to the first data model excluding points at or below the minimum force threshold.
  • the user interface 160 can present the first modified force data model 702B and the second modified force data model 704B as line charts, to accommodate removal of points without loss of clarity of a user interface presentation.
  • the first modified force data model 702B can include a first waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the first instrument 170.
  • the first modified force data model 702B can include at least a first subset of points 710B, a second subset of points 712B, a third subset of points 714B, and a fourth subset of points 716B.
  • the first subset of points 710B can correspond to a portion of the waveform indicative of a first state of the first instrument 170.
  • the first subset of points 710B can indicate the first force type associated with the first instrument 170 at various times.
  • the second subset of points 712B can correspond to a portion of the waveform indicative of a second state of the first instrument 170.
  • the second subset of points 712B can indicate the second force type associated with the first instrument 170 at various times.
  • the third subset of points 714B can correspond to a portion of the waveform indicative of a third state of the first instrument 170.
  • the third subset of points 714B can indicate the third force type associated with the first instrument 170 at various times.
  • the fourth subset of points 716B can correspond to a portion of the waveform indicative of a fourth state of the first instrument 170.
  • the fourth subset of points 716B can indicate the fourth force type associated with the first instrument 170 at various times.
  • the second modified force data model 704B can include a second waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the second instrument 170.
  • the second modified force data model 704B can include a number of the first points at the second point density.
  • the second modified force data model 704B can include at least a first subset of points 720B, a second subset of points 722B, a third subset of points 724B, and a fourth subset of points 726B.
  • the first subset of points 720B can correspond to a portion of the waveform indicative of a first state of the second instrument 170.
  • the first subset of points 720B can indicate the first force type associated with the second instrument 170 at various times.
  • the second subset of points 722B can correspond to a portion of the waveform indicative of a second state of the second instrument 170.
  • the second subset of points 722B can indicate the second force type associated with the second instrument 170 at various times.
  • the third subset of points 724B can correspond to a portion of the waveform indicative of a third state of the second instrument 170.
  • the third subset of points 724B can indicate the third force type associated with the second instrument 170 at various times.
  • the fourth subset of points 726B can correspond to a portion of the waveform indicative of a fourth state of the second instrument 170.
  • the fourth subset of points 726B can indicate the fourth force type associated with the second instrument 170 at various times.
  • FIG. 7C depicts an example of filtered force data models according to this disclosure.
  • the first filtered force data model 702C and the second filtered force data model 704C can include a number of the first points at a third point density that excludes points having force metrics indicative of magnitudes of force at or below the minimum force threshold.
  • the third point density can correspond to the first data model excluding points at or below the minimum force threshold, and excluding from presentation time periods that include only excluded points.
  • the user interface 160 can present the first filtered force data model 702C and the second filtered force data model 704C as bar charts only for portions of the waveform for each instrument including a point at the third point density, to accommodate removal of points without loss of clarity of a user interface presentation.
  • the first filtered force data model 702C can include a first waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the first instrument 170.
  • the first filtered force data model 702C can include at least a first subset of points 710C, a second subset of points 712C, a third subset of points 714C, and a fourth subset of points 716C.
  • the first subset of points 710C can correspond to a portion of the waveform indicative of a first state of the first instrument 170.
  • the first subset of points 710C can indicate the first force type associated with the first instrument 170 at various times.
  • the second subset of points 712C can correspond to a portion of the waveform indicative of a second state of the first instrument 170.
  • the second subset of points 712C can indicate the second force type associated with the first instrument 170 at various times.
  • the third subset of points 714C can correspond to a portion of the waveform indicative of a third state of the first instrument 170.
  • the third subset of points 714C can indicate the third force type associated with the first instrument 170 at various times.
  • the fourth subset of points 716C can correspond to a portion of the waveform indicative of a fourth state of the first instrument 170.
  • the fourth subset of points 716C can indicate the fourth force type associated with the first instrument 170 at various times.
  • the second filtered force data model 704C can include a second waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the second instrument 170.
  • the second filtered force data model 704C can include a number of the first points at the third point density.
  • the second filtered force data model 704C can include at least a first subset of points 720C, a second subset of points 722C, a third subset of points 724C, and a fourth subset of points 726C.
  • the first subset of points 720C can correspond to a portion of the waveform indicative of a first state of the second instrument 170.
  • the first subset of points 720C can indicate the first force type associated with the second instrument 170 at various times.
  • the second subset of points 722C can correspond to a portion of the waveform indicative of a second state of the second instrument 170.
  • the second subset of points 722C can indicate the second force type associated with the second instrument 170 at various times.
  • the third subset of points 724C can correspond to a portion of the waveform indicative of a third state of the second instrument 170.
  • the third subset of points 724C can indicate the third force type associated with the second instrument 170 at various times.
  • the fourth subset of points 726C can correspond to a portion of the waveform indicative of a fourth state of the second instrument 170.
  • the fourth subset of points 726C can indicate the fourth force type associated with the second instrument 170 at various times.
  • the data processing system can generate (e.g., via a machine learning model) a density plot of forces associated with one or more instruments of a robotic system during one or more medical procedures, and can generate “fingerprints” for one or more videos or video clips.
  • the video clips associated with portions of videos of medical procedures can be associated with discrete portions of the workflow (e.g., task or phase) and discrete force data or ranges of force data from the robotic system.
  • the data processing system can generate, render, display, or otherwise allow a surgeon reviewing medical procedures for force to rapidly review or cycle between many high-force video clips relevant to a particular robotic system, medical procedure, or medical procedure workflow.
  • the data processing system can generate a histogram of force, which can be based on 700C.
  • references to "or” may be construed as inclusive so that any terms described using “or” may indicate any of a single, more than one, and all of the described terms. References to at least one of a conjunctive list of terms may be construed as an inclusive OR to indicate any of a single, more than one, and all of the described terms. For example, a reference to “at least one of "A” and “B” can include only “A”, only “B", as well as both "A” and “B”. Such references used in conjunction with “comprising” or other open terminology can include additional items. References to “is” or “are” may be construed as nonlimiting to the implementation or action referenced in connection with that term. The terms “is” or “are” or any tense or derivative thereof, are interchangeable and synonymous with “can be” as used herein, unless stated otherwise herein.
  • Directional indicators depicted herein are example directions to facilitate understanding of the examples discussed herein, and are not limited to the directional indicators depicted herein. Any directional indicator depicted herein can be modified to the reverse direction, or can be modified to include both the depicted direction and a direction reverse to the depicted direction, unless stated otherwise herein. While operations are depicted in the drawings in a particular order, such operations are not required to be performed in the particular order shown or in sequential order, and all illustrated operations are not required to be performed. Actions described herein can be performed in a different order. Where technical features in the drawings, detailed description or any claim are followed by reference signs, the reference signs have been included to increase the intelligibility of the drawings, detailed description, and claims. Accordingly, neither the reference signs nor their absence has any limiting effect on the scope of any claim elements.
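Referring back to the method 600 of FIG. 6 discussed in the list above (see the forward reference at the threshold bullets), the following is a minimal, hypothetical Python sketch of the threshold-based filtering. The function and field names, the fixed comparison logic, and the example values are assumptions for illustration only; as described above, implementations can instead use a machine learning model and dynamically selected thresholds.

```python
def filter_and_flag(points, min_force_n=0.6, max_force_n=2.5):
    """Drop points below the minimum force threshold and flag points at or
    above the maximum force threshold as candidate high-force events.

    Each point is a dict such as {"t_s": 12.0, "force_n": 1.1, "arm": "arm-1"}.
    The 0.6 N and 2.5 N defaults mirror example values discussed in this
    disclosure but are not prescribed values.
    """
    retained = [p for p in points if p["force_n"] >= min_force_n]
    high_force = [p for p in retained if p["force_n"] >= max_force_n]
    return retained, high_force


# Example: timestamps of flagged points can drive which videos are presented.
points = [
    {"t_s": 1.0, "force_n": 0.2, "arm": "arm-1"},
    {"t_s": 2.0, "force_n": 1.1, "arm": "arm-1"},
    {"t_s": 3.0, "force_n": 2.8, "arm": "arm-2"},
]
retained, high_force = filter_and_flag(points)
print([p["t_s"] for p in high_force])  # [3.0]
```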

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Robotics (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Urology & Nephrology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Manipulator (AREA)

Abstract

Configuration of force data of robotic systems for compatibility with network transmissions is provided. A system receives a data set corresponding to force at an instrument of a robotic medical system during a medical procedure. The system removes first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the instrument of the robotic system. The system identifies, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the instrument of the robotic medical system. The system causes a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.

Description

CONFIGURATION OF FORCE DATA OF ROBOTIC SYSTEMS FOR COMPATIBILITY WITH NETWORK TRANSMISSIONS
CROSS-REFERENCES TO RELATED APPLICATIONS
[0001] This application claims the benefit of priority under 35 U.S.C. § 119 to U.S. Provisional Patent Application 63/559,124, filed February 28, 2024, which is hereby incorporated by reference in its entirety for all purposes.
TECHNICAL FIELD
[0002] The present implementations relate generally to medical devices, including but not limited to configuration of force data of robotic systems for compatibility with network transmissions.
INTRODUCTION
[0003] Improvement of surgical techniques is important for increasing effectiveness and efficiency of medical procedures. Medical procedures are increasingly associated with increasing quantities of information from increasing sources. In addition, surgeons increasingly demand information with high accuracy regarding preferred or optimal technique for performing various surgical procedures, or granular tasks within procedures. However, conventional systems cannot effectively or efficiently identify, search for, select and present information regarding techniques at a level of specificity demanded by surgeons, and cannot achieve delivery of information regarding medical procedures by manual means.
SUMMARY
[0004] Systems, methods, apparatuses, and non-transitory computer-readable media are provided at least for selecting relevant videos or video clips of medical procedures based on force applied by instruments of a robotic device during a medical procedure. For example, implementations according to this disclosure can recommend videos of portions of medical procedures in which force applied meets or exceeds a force threshold. For example, implementations according to this disclosure can down-sample force data to enable transmission to user devices for presentation, while retaining force data indicative of high-force events during the medical procedure. For example, implementations according to this disclosure can validate surgical events (e.g., tool contact or collision) using force data. For example, implementations according to this disclosure can provide visual annotations or overlays of video or video clips based on force data (e.g., present one or more heat map overlays each having a color that indicates a magnitude or direction of force applied to or by a given instrument). Thus, technical solutions for configuration of force data of robotic systems for compatibility with network transmissions are provided.
[0005] At least one aspect is directed to a system. The system can include one or more processors, coupled with memory. The system can receive a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure. The system can remove one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic system. The system can identify, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system. The system can cause a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
[0006] At least one aspect is directed to a method. The method can include receiving a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure. The method can include removing one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic system. The method can include identifying, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system. The method can include causing a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
[0007] At least one aspect is directed to a non-transitory computer readable medium can include one or more instructions stored thereon and executable by a processor. The processor can receive a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure. The processor can remove one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic system. The processor can identify, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system. The processor can cause a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
BRIEF DESCRIPTION OF THE FIGURES
[0008] These and other aspects and features of the present implementations are depicted by way of example in the figures discussed herein. Present implementations can be directed to, but are not limited to, examples depicted in the figures discussed herein. Thus, this disclosure is not limited to any figure or portion thereof depicted or referenced herein, or any aspect described herein with respect to any figures depicted or referenced herein.
[0009] FIG. 1 depicts an example system according to this disclosure.
[0010] FIG. 2 depicts an example computer architecture according to this disclosure.
[0011] FIG. 3 depicts an example force data model according to this disclosure.
[0012] FIG. 4 depicts an example modified force data structure according to this disclosure.
[0013] FIG. 5A depicts an example user interface presentation according to this disclosure.
[0014] FIG. 5B depicts an example modified user interface presentation according to this disclosure.
[0015] FIG. 6 depicts an example method of configuration of force data of robotic systems for compatibility with network transmissions according to this disclosure.
[0016] FIG. 7A depicts an example of force data models according to this disclosure.
[0017] FIG. 7B depicts an example of modified force data models according to this disclosure.
[0018] FIG. 7C depicts an example of filtered force data models according to this disclosure.
DETAILED DESCRIPTION
[0019] Aspects of the technical solutions are described herein with reference to the figures, which are illustrative examples of this technical solution. The figures and examples below are not meant to limit the scope of this technical solution to the present implementations or to a single implementation, and other implementations in accordance with present implementations are possible, for example, by way of interchange of some or all of the described or illustrated elements. Where certain elements of the present implementations can be partially or fully implemented using known components, only those portions of such known components that are necessary for an understanding of the present implementations are described, and detailed descriptions of other portions of such known components are omitted to not obscure the present implementations. Terms in the specification and claims are to be ascribed no uncommon or special meaning unless explicitly set forth herein. Further, this technical solution and the present implementations encompass present and future known equivalents to the known components referred to herein by way of description, illustration, or example.
[0020] Aspects of technical solutions of this disclosure are directed to the configuration of force data of robotic systems for compatibility with network transmission. The technical solutions described herein can use one or more thresholds to down-sample force data and identify force events. For example, the technical solutions of this disclosure can include a data processing system that can apply minimum force thresholds and maximum force thresholds to respectively down-sample force data and identify high-force events. The data processing system can identify a minimum force threshold below which force data is not transmitted or stored as discrete values (e.g., force values below a given quantitative value can be ignored). For example, a minimum force threshold can be an experimentally-derived value of 0.6 Newtons (N). The data processing system of this technical solution can identify a maximum force threshold above which a high-force state is associated. For example, the maximum force threshold can be a quantitative value over 0.6 N, and can be either a static value or a dynamic value based on tissue type associated with the high-force application. Force data can be obtained from telemetry of a robotic system and can be correlated with timestamps associated with the medical procedure. For example, this technical solution can achieve a technical improvement of reducing force data from millions of data points to 800 data points or less, with little to no loss of visibility into high-force events, allowing force data to be consumable by client computers (e.g., surgeon’s desktop or tablet) via wireless networks. Thus, by down-sampling force data, the technical solutions of this disclosure can enable or facilitate transmission of force data for display, presentation, or rendering by computing systems or display devices, while retaining force data indicative of relative or high-force events during the medical procedure. Further, by down-sampling force data, the technical solutions can facilitate deployment of force-based video for review across a wide range of client devices or types of computing devices due to the reduction in bandwidth due to down-sampling.
[0021] The data processing system can include, access, or otherwise utilize a machine learning model that can identify videos or video clips for one or more medical procedures based on force data. For example, the machine learning model can generate a density plot of forces associated with one or more instruments of a robotic system during one or more medical procedures, and can generate “fingerprints” for one or more videos or video clips. For example, video clips can be portions of videos that span a medical procedure, and can be associated with discrete portions of the workflow (e.g., task or phase) and discrete force data or ranges of force data from the robotic system. Thus, the machine learning model can allow a surgeon reviewing medical procedures for force to rapidly review or cycle between many high-force video clips relevant to a particular robotic system, medical procedure, or medical procedure workflow.
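As a rough illustration of the down-sampling described above, the following Python sketch reduces a dense force trace toward a fixed point budget while always retaining points at or above a high-force threshold. The function name, the stride-based decimation, and the specific numbers are assumptions for illustration; they are not the implementation described in this disclosure.

```python
def downsample_force(points, budget=800, high_force_n=2.5):
    """Thin a dense force trace for network transmission.

    `points` is a list of (time_s, force_n) tuples. Points at or above the
    high-force threshold are always kept, so high-force events remain visible;
    the remaining points are decimated toward the overall budget.
    """
    high = [p for p in points if p[1] >= high_force_n]
    low = [p for p in points if p[1] < high_force_n]
    remaining = max(budget - len(high), 0)
    if remaining and low:
        stride = max(len(low) // remaining, 1)
        high.extend(low[::stride][:remaining])
    # chronological order; the budget is only exceeded if high-force points alone exceed it
    return sorted(high)
```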
[0022] FIG. 1 depicts an example system according to this disclosure. As illustrated by way of example in FIG. 1, a system 100 can include at least a network 101, a data processing system 102, a client system 103, and a robotic system 104 (which can include or be referred to as a robotic medical system 104).
[0023] The network 101 can include any type or form of network. The geographical scope of the network 101 can vary widely and the network 101 can include a body area network (BAN), a personal area network (PAN), a local-area network (LAN), e.g., Intranet, a metropolitan area network (MAN), a wide area network (WAN), or the Internet. The topology of the network 101 can be of any form and can include, e.g., any of the following: point-to-point, bus, star, ring, mesh, or tree. The network 101 can include an overlay network which is virtual and sits on top of one or more layers of other networks 101. The network 101 can be of any such network topology as known to those ordinarily skilled in the art capable of supporting the operations described herein. The network 101 can utilize different techniques and layers or stacks of protocols, including, e.g., the Ethernet protocol, the Internet protocol suite (TCP/IP), the ATM (Asynchronous Transfer Mode) technique, the SONET (Synchronous Optical Networking) protocol, or the SDH (Synchronous Digital Hierarchy) protocol. The TCP/IP Internet protocol suite can include application layer, transport layer, Internet layer (including, e.g., IPv6), or the link layer. The network 101 can include a type of a broadcast network, a telecommunications network, a data communication network, or a computer network.
[0024] The data processing system 102 can include a physical computer system operatively coupled or that can be coupled with one or more components of the system 100, either directly or indirectly through an intermediate computing device or system. The data processing system 102 can include a virtual computing system, an operating system, and a communication bus to effect communication and processing. The data processing system 102 can include a system processor 110, an interface controller 112, a video data processor 120, a force data processor 130, a force event processor 140, and a system memory 150.
[0025] The system processor 110 can execute one or more instructions associated with the system 100. The system processor 110 can include an electronic processor, an integrated circuit, or the like including one or more of digital logic, analog logic, digital sensors, analog sensors, communication buses, volatile memory, nonvolatile memory, and the like. The system processor 110 can include, but is not limited to, at least one microcontroller unit (MCU), microprocessor unit (MPU), central processing unit (CPU), graphics processing unit (GPU), physics processing unit (PPU), embedded controller (EC), or the like. The system processor 110 can include a memory operable to store or storing one or more instructions for operating components of the system processor 110 and operating components operably coupled to the system processor 110. For example, the one or more instructions can include one or more of firmware, software, hardware, operating systems, embedded operating systems. The system processor 110 or the data processing system 102 generally can include one or more communication bus controller to effect communication between the system processor 110 and the other elements of the system 100.
[0026] The interface controller 112 can link the data processing system 102 with one or more of the network 101 and the client system 103, by one or more communication interfaces. A communication interface can include, for example, an application programming interface (“API”) compatible with a particular component of the data processing system 102, or the client system 103. The communication interface can provide a particular communication protocol compatible with a particular component of the data processing system 102 and a particular component of the client system 103. The interface controller 112 can be compatible with particular content objects and can be compatible with particular content delivery systems corresponding to particular content objects, structures of data, types of data, or any combination thereof. For example, the interface controller 112 can be compatible with transmission of text data or binary data structured according to one or more metrics or data of the system memory 150.
[0027] The video data processor 120 can identify one or more features in depictions in video data as discussed herein. For example, the depictions can include portions of a patient site, one or more medical instruments, or any combination thereof, but are not limited thereto. The video data processor 120 can identify one or more edges, regions, or a structure within an image and associated with the depictions. For example, an edge can correspond to a line in an image that separates two depicted objects (e.g., a delineation between an instrument and a patient site). For example, a region can correspond to an area in an image that at least partially corresponds to a depicted object (e.g., an instrument tip). For example, a structure can correspond to an area in an image that at least partially corresponds to a portion of a depicted object or a predetermined type of an object (e.g., a scalpel edge). For example, the video data processor 120 can include or correspond to a first machine learning model configured to identify the one or more features in the one or more images. For example, the system can generate, with the first machine learning model and based on the first feature, a second feature indicating an economy of motion for each of the plurality of videos.
[0028] The force data processor 130 can process one or more metrics indicative of forces associated with one or more components of the robotic system 104 with respect to one or more given medical procedures or medical procedures of a given type. The metrics can correspond to one or more quantitative values of force applied by an instrument or an actuator associated with the instrument at a given time or over a given time period for a given medical procedure. For example, the metrics can be indicative of a forceps instrument closing at a magnitude of 1.1 N at a given time during a medical procedure. The metrics can correspond to one or more quantitative values of force applied to an instrument or an actuator associated with the instrument at a given time or over a given time period for a given medical procedure. For example, the metrics can be indicative of force at a magnitude of 1.1 N against a forceps instrument hindered from moving freely at the patient site through contact with tissue, at a given time during a medical procedure. The force data processor 130 can generate one or more force data structures corresponding to force associated with one or more instruments, and can transmit the force data structures to the force event processor 140.
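A minimal Python sketch of the kind of force data structure the force data processor 130 might emit is shown below. The field names, the enum, and the example values are hypothetical and only mirror the illustrative 1.1 N forceps reading above; they are not a defined format of this disclosure.

```python
from dataclasses import dataclass
from enum import Enum


class ForceType(Enum):
    APPLIED_BY_INSTRUMENT = "applied_by"  # e.g., forceps closing on tissue
    APPLIED_TO_INSTRUMENT = "applied_to"  # e.g., tissue resisting instrument motion


@dataclass
class ForceRecord:
    procedure_id: str    # identifier of the medical procedure
    instrument_id: str   # identifier of the instrument (arm) producing the reading
    timestamp_s: float   # seconds from the start of the procedure
    magnitude_n: float   # force magnitude in Newtons
    force_type: ForceType


# Hypothetical record mirroring the 1.1 N forceps example discussed above.
record = ForceRecord("proc-001", "forceps-1", 812.4, 1.1, ForceType.APPLIED_BY_INSTRUMENT)
```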
[0029] The force event processor 140 can determine one or more states of the robotic device that correspond to one or more events of the medical procedure. For example, an event can correspond to a set of one or more metrics indicative of a state of one or more of the robotic system 104, any component thereof, a patient, a patient site, or a medical environment. The state can be associated with or indicative of a given medical procedure. For example, a set of force data for a forceps instrument of the robotic device 104, having a quantitative value of 2.5 N, during a specific gall bladder removal procedure, can be indicative of a high-force event of a specific medical procedure. The force event processor 140 can be configured to process force data from one or more medical procedures to identify high-force events in one or more given medical procedures.
[0030] The force event processor 140 can modify one or more videos or video segments corresponding to the video data 152, according to one or more states corresponding to one or more components of the robotic system 104. For example, the force event processor 140 can identify a state corresponding to one or more instruments based on one or more metrics determined or identified by the force data processor 210. The force event processor 140 can identify a video segment corresponding to one or more magnitudes of force at one or more given times or time period, based on one or more metrics determined or identified by the force data processor 210. For example, the force event processor 140 can identify a video segment for the gall bladder medical procedure corresponding to a time or time period correlated with or matching the time or time period indicated by the force data for the high-force event. Thus, the force event processor 140 can provide a technical improvement to retrieve video segments that depict force events for given medical procedures, beyond the capability of manual processes. For example, the force event processor 140 can instruct the video data processor to modify at least a portion of at least one image corresponding to a video or video segment, to indicate a state corresponding to one or more magnitudes of force determined or identified by the force data processor 210.
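The correlation of high-force timestamps with video segments described above could look roughly like the following Python sketch. The segment representation, the padding window, and the function name are assumptions for illustration, not the behavior of the force event processor 140 itself.

```python
def select_segments(segments, high_force_times_s, pad_s=2.0):
    """Pick video segments that overlap any high-force timestamp.

    `segments` is a list of (start_s, end_s) tuples describing clips of the
    procedure video; a small pad widens the match window around each event.
    """
    selected = []
    for start, end in segments:
        if any(start - pad_s <= t <= end + pad_s for t in high_force_times_s):
            selected.append((start, end))
    return selected


# Example: a high-force event at t = 125 s selects the clip covering 120-180 s.
print(select_segments([(0, 60), (60, 120), (120, 180)], [125.0]))  # [(120, 180)]
```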
[0031] The system memory 150 can store data associated with the system 100. The system memory 150 can include one or more hardware memory devices to store binary data, digital data, or the like. The system memory 150 can include one or more electrical components, electronic components, programmable electronic components, reprogrammable electronic components, integrated circuits, semiconductor devices, flip flops, arithmetic units, or the like. The system memory 150 can include at least one of a non-volatile memory device, a solid-state memory device, a flash memory device, or a NAND memory device. The system memory 150 can include one or more addressable memory regions disposed on one or more physical memory arrays. A physical memory array can include a NAND gate array disposed on, for example, at least one of a particular semiconductor device, integrated circuit device, and printed circuit board device. The system memory 150 can include video data 152, robot metrics 154, performance metrics 156, and patient metrics 158.
[0032] The system memory 150 can correspond to a non-transitory computer readable medium. For example, the non-transitory computer readable medium can include one or more instructions executable by the system processor 110. The processor can select, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold. For example, the non-transitory computer readable medium can further include one or more instructions executable by the processor to select, according to a type of at least one of the instruments, a quantitative value of the second threshold. For example, the non-transitory computer readable medium can further include one or more instructions executable by the processor to select, according to a type of at least one of the instruments or according to a type of segment of the medical procedure, a quantitative value of the second threshold.
[0033] The video data 152 can depict one or more medical procedures from one or more viewpoints associated with corresponding medical procedures. For example, the video data 152 can correspond to still images or frames of video images that depict at least a portion of a medical procedure, medical environment, or patient site from a given viewpoint. For example, the video data processor 120 can identify one or more depictions in an image or across a plurality of images. Each time can, for example, be associated with a given task or phase of a workflow as occurring during that task or phase.
[0034] The robot metrics 154 can be indicative of one or more states of one or more components of the robotic system 104. Components of the robotic system 104 can include, but are not limited to, actuators of the robotic system 104 as discussed herein. For example, the robot metrics 154 can include one or more data points indicative of one or more of an activation state (e.g., activated or deactivated), a position, or orientation of a component of the robotic system 104. For example, the robot metrics 154 can be linked with or correlated with one or more medical procedures, one or more phases of a given medical procedure, or one or more tasks of a given phase of a given medical procedure. For example, a robot metric among the robot metrics 154 can correspond to corresponding positions of one or more actuators of a given arm of the robotic system 104 at a given time or over a given time period. Each time can, for example, be associated with a given task or phase of a workflow as occurring during that task or phase.
[0035] The performance metrics 156 can be indicative of one or more actions during one or more medical procedures. For example, the performance metrics 156 can correspond to OPIs as discussed herein. The patient metrics 158 can be indicative of one or more characteristics of a patient during one or more medical procedures. For example, the patient metrics 158 can indicate various conditions (e.g., diabetes, low blood pressure, blood clotting) or various traits (e.g., age, weight) corresponding to the patient in each medical procedure. The force data processor 130 or the force event processor 140 can obtain the patient metrics 158, and can filter or modify any segments of video in response to the obtained patient metrics 158. For example, the force data processor 130 or the force event processor 140 can include video segments restricted to patients with patient metrics 158 indicative of diabetes.
[0036] The client system 103 can include a computing system associated with a database system. For example, the client system 103 can correspond to a cloud system, a server, a distributed remote system, or any combination thereof. For example, the client system 103 can include an operating system to execute a virtual environment. The operating system can include hardware control instructions and program execution instructions. The operating system can include a high-level operating system, a server operating system, an embedded operating system, or a boot loader. The client system 103 can include a user interface 160.
[0037] The user interface 160 can include one or more devices to receive input from a user or to provide output to a user. For example, the user interface 160 can correspond to a display device to provide visual output to a user and one or more user input devices to receive input from a user. For example, the input devices can include a keyboard, mouse or touch-sensitive panel of the display device, but are not limited thereto. The display device can display one or more presentations as discussed herein, and can include an electronic display. An electronic display can include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, or the like. The display device can receive, for example, capacitive or resistive touch input. The display device can be housed at least partially within the client system 103.
[0038] The robotic system 104 can include one or more robotic devices configured to perform one or more actions of a medical procedure (e.g., a surgical procedure). For example, a robotic device can include or be coupled with, but is not limited to, a surgical device that can be manipulated by a robotic device. For example, a surgical device can include, but is not limited to, a scalpel or a cauterizing tool. The robotic system 104 can include various motors, actuators, or electronic devices whose position or configuration can be modified according to input at one or more robotic interfaces. For example, a robotic interface can include a manipulator with one or more levers, buttons, or grasping controls that can be manipulated by pressure or gestures from one or more hands, arms, fingers, or feet. The robotic system 104 can include a surgeon console in which the surgeon can be positioned (e.g., standing or seated) to operate the robotic system 104. However, the robotic system 104 is not limited to a surgeon console co-located or on-site with the robotic system 104. The robotic system 104 can include an instrument(s) 170. The instrument(s) 170 can include components of the robotic system 104 that can be moved in response to input by a surgeon at the surgeon console of the robotic device 104. The components can correspond to or include one or more actuators that can each move or otherwise change state to operate one or more of the instruments 170 of the robotic device 104. For example, each of the instruments can include one or more sensors or be associated with one or more sensors to provide haptic feedback from the robotic system 104. For example, the haptic feedback can include one or more data points according to the robot metrics 154, or that can be indicative of the robot metrics 154.
[0039] FIG. 2 depicts an example computer architecture according to this disclosure. As illustrated by way of example in FIG. 2, a computer architecture 200 can include at least a force data processor 210, and a force event processor 220. The force data processor 210 can correspond at least partially in one or more of structure and operation to the force data processor 130. The force data processor 210 can include a threshold processor 212, a point processor 214, and a signature processor 216. The force event processor 220 can correspond at least partially in one or more of structure and operation to the force event processor 140. The force event processor 220 can include an instrument metrics processor 222, a video segmentation processor 224, and a video annotation processor 226.
[0040] The threshold processor 212 can provide one or more thresholds corresponding to one or more force events. The threshold processor 212 can provide one or more thresholds indicative of quantitative values of force data. For example, the threshold processor 212 can provide a minimum force threshold indicative of a minimum magnitude of force to be recorded or processed for the medical procedure. For example, the threshold processor 212 can provide a maximum force threshold indicative of a minimum magnitude of force associated with a high-force event for the medical procedure. For example, the threshold processor 212 can provide one or more minimum force thresholds each corresponding to a given instrument of the robotic system 104, a given medical procedure, or a given instrument of the robotic system 104 for a given medical procedure. For example, the threshold processor 212 can provide one or more maximum force thresholds each corresponding to a given instrument of the robotic system 104, a given medical procedure, or a given instrument of the robotic system 104 for a given medical procedure.
[0041] The threshold processor 212 can determine one or more thresholds according to one or more criteria associated with a medical procedure, type of medical procedure, robotic system, type of robotic system, instrument for a robotic system, or type of instrument of a robotic system, but is not limited thereto. For example, the threshold processor 212 can determine a minimum force threshold corresponding to a given instrument during a given medical procedure, based on a physical property of a type of tissue at a patient site for the medical procedure. For example, the system 100 can select, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold. For example, the portion of a patient site can correspond to patient tissue, and the physical property can correspond to a tensile strength of the patient tissue.
[0042] The threshold processor 212 can generate dynamic thresholds for high-force events based on instrument type. The threshold processor 212 can generate dynamic thresholds for high-force events based on segment of a workflow of a medical procedure or type of medical procedure (e.g., task or phase). The threshold processor 212 can generate thresholds based on skill level, complexity of medical procedure, or anatomical structure. For example, the threshold processor 212 can obtain a skill level of a surgeon associated with a request for video. The threshold processor 212 can obtain the skill level based on one or more of the performance metrics 156 associated with the surgeon, and can provide a threshold that is configured to the surgeon providing the request. For example, a novice surgeon having one or more performance metrics 156 (e.g., OPIs) that are correlated with a novice surgeon, can request videos according to a medical procedure. For example, the system can select, according to a type of at least one of the instruments, a quantitative value of the second threshold. Here, the second threshold can correspond to a maximum force threshold. The threshold processor 212 can determine, based on the OPIs or the correlation of the OPIs with a novice surgeon profile, to provide a maximum force threshold that is lower than a maximum force threshold for an expert surgeon.
[0043] For example, the system 100 can select, according to a type of segment of the medical procedure, a quantitative value of the second threshold. The threshold processor 212 can determine, based on one or more of the performance metrics 156 or one or more of the patient metrics 158, to provide a first maximum force threshold that is lower than a second maximum force threshold, where the first maximum force threshold is associated with a preoperative phase of a workflow or a task performed during the preoperative phase of the workflow, and the second maximum force threshold is associated with a different phase of the workflow or a task performed during that different phase of the workflow. Thus, the threshold processor 212 can provide a technical improvement to provide thresholds corresponding to multiple aspects of a medical procedure.
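A rough Python sketch of how a dynamic maximum force threshold could combine instrument type, workflow segment, and surgeon skill level, as described in the preceding paragraphs, is shown below. The lookup tables, scale factors, and numeric values are purely hypothetical; they are not thresholds defined by this disclosure.

```python
# Hypothetical base thresholds (Newtons) and scale factors, for illustration only.
BASE_MAX_FORCE_N = {"forceps": 2.5, "needle_driver": 3.0, "scissors": 2.0}
PHASE_SCALE = {"preoperative": 0.8, "dissection": 1.0, "suturing": 1.2}
SKILL_SCALE = {"novice": 0.8, "intermediate": 1.0, "expert": 1.2}


def max_force_threshold(instrument_type, phase, skill_level):
    """Combine instrument type, workflow phase, and surgeon skill level into a
    dynamic maximum force threshold for flagging high-force events."""
    base = BASE_MAX_FORCE_N.get(instrument_type, 2.5)
    return base * PHASE_SCALE.get(phase, 1.0) * SKILL_SCALE.get(skill_level, 1.0)


# The novice threshold comes out lower (about 1.6 N) than the expert threshold (about 2.4 N).
print(max_force_threshold("forceps", "preoperative", "novice"))
print(max_force_threshold("forceps", "preoperative", "expert"))
```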
[0044] The point processor 214 can obtain one or more points associated with force data as discussed herein. The point processor 214 can generate one or more points including data indicative of one or more of a magnitude of force, an instrument at which the force is detected, a type of the force, a time associated with the detection of the force, or any combination thereof. For example, the type of the force can indicate whether a force is applied by an instrument, or being applied to an instrument. The point processor 214 can obtain a point set including a plurality of points collectively indicative of force corresponding to a given instrument for a given medical procedure, and can modify the point set to add, remove, or modify one or more points in accordance with one or more thresholds provided by the threshold processor 212 that are applicable to the point set. For example, the point processor 214 can remove one or more points of a point set having a magnitude below a minimum force threshold for the medical procedure associated with the points of the point set. For example, the system can receive a second data set corresponding to positions of the one or more instruments of the robotic medical system during the medical procedure. The system can identify, with the machine learning model and from the data set with the one or more first points removed, one or more third points that do not satisfy the second threshold. The system can determine, based on the one or more third points and the positions, contact between the one or more instruments. Here, the second data set can correspond to the point set or be derived by the point processor 214 from one or more of the robot metrics 154.
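A minimal, hypothetical sketch of a point record and the minimum-threshold removal described above is shown below (editorial illustration only; the field names are assumptions consistent with the magnitude, instrument, force type, and time components described in this paragraph).

```python
from dataclasses import dataclass

@dataclass
class ForcePoint:
    timestamp_s: float      # time the force was detected
    instrument_id: str      # instrument at which the force was detected
    magnitude_n: float      # signed or unsigned magnitude in Newtons
    force_type: str         # e.g. "applied_by_instrument" or "applied_to_instrument"

def remove_below_minimum(points: list[ForcePoint],
                         min_threshold_n: float) -> list[ForcePoint]:
    """Drop points whose unsigned magnitude falls below the minimum force threshold."""
    return [p for p in points if abs(p.magnitude_n) >= min_threshold_n]
```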
[0045] The signature processor 216 can provide one or more signatures. For example, a signature can include a data structure that is indicative of one or more of a task, a phase, an event, or any combination thereof as discussed herein. The signature processor 216 can identify one or more of a task, a phase, an event corresponding to a given medical procedure or a given type of medical procedure, according to a value of a given signature. For example, the signature processor 216 can generate a signature indicative of force at a given task, phase or event of a given medical procedure. The signature processor 216 can generate a force signature based on at least one of force data as discussed herein, one or more OPIs indicative of force, one or more OPIs associated with force data, or any combination thereof. The signature processor 216 can generate a force signature from force data segmented according to segments of a workflow (e.g., task or phase). For example, the signature processor 216 can generate a force signature including only segments of a workflow that at least partially include points that meet (e.g., exceed) a minimum force threshold. For example, the signature processor 216 can generate a force signature excluding segments of a workflow that at least partially include points that meet (e.g., exceed) a maximum force threshold.
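As a non-limiting, hypothetical sketch of per-segment signature generation consistent with the paragraph above: the summary statistics chosen (peak, mean, count) are assumptions; the inclusion and exclusion rules follow the minimum and maximum thresholds described.

```python
def force_signature(points_by_segment: dict[str, list[float]],
                    min_threshold_n: float,
                    max_threshold_n: float) -> dict[str, dict[str, float]]:
    """Build a per-segment force signature from magnitudes grouped by task/phase.

    Keeps only segments containing at least one point meeting the minimum
    threshold, and skips segments containing any point meeting the maximum
    threshold. The statistics summarized here are an assumed choice.
    """
    signature: dict[str, dict[str, float]] = {}
    for segment, magnitudes in points_by_segment.items():
        if not any(m >= min_threshold_n for m in magnitudes):
            continue
        if any(m >= max_threshold_n for m in magnitudes):
            continue
        signature[segment] = {
            "peak_n": max(magnitudes),
            "mean_n": sum(magnitudes) / len(magnitudes),
            "count": float(len(magnitudes)),
        }
    return signature
```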
[0046] The signature processor 216 can provide one or more signatures to the threshold processor 212. The threshold processor 212 can identify one or more dynamic thresholds by receiving one or more of the signatures provided by the signature processor 216 as input. The threshold processor 212 can identify one or more dynamic thresholds via a machine learning model integrated with the threshold processor 212. For example, the system can generate, based on one or more of the second points that satisfy the second threshold, a first signature indicative of the force at the one or more instruments of the robotic medical system during the medical procedure. For example, the signature processor 216 can generate the first signature. For example, the system can determine that one or more second signatures each satisfy a third threshold indicative of a similarity with the first signature. The system can select, according to the one or more second signatures, the one or more videos. For example, the signature processor 216 can determine that the one or more second signatures each satisfy the third threshold.
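A hypothetical sketch of the third-threshold similarity check described above follows; it assumes the signatures have been flattened into feature vectors keyed by name, and the use of cosine similarity is an assumption, since the disclosure requires only that some similarity measure satisfy a third threshold.

```python
def similar_signatures(first: dict[str, float],
                       candidates: dict[str, dict[str, float]],
                       similarity_threshold: float = 0.9) -> list[str]:
    """Return identifiers of stored signatures similar to a first signature."""
    def cosine(a: dict[str, float], b: dict[str, float]) -> float:
        keys = set(a) & set(b)
        if not keys:
            return 0.0
        dot = sum(a[k] * b[k] for k in keys)
        na = sum(a[k] ** 2 for k in keys) ** 0.5
        nb = sum(b[k] ** 2 for k in keys) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    # Identifiers whose signatures satisfy the (assumed) third threshold of similarity.
    return [vid for vid, sig in candidates.items()
            if cosine(first, sig) >= similarity_threshold]
```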
[0047] The instrument metrics processor 222 can determine one or more states associated with one or more instruments 170 of the robotic system 104. The instrument metrics processor 222 can identify a state corresponding to a position of one or more of the instruments 170. The position of the instrument 170 can correspond to an absolute position having one or more absolute coordinates in a coordinate space for the patient site or the medical environment. The position of the instrument 170 can correspond to a relative position having one or more relative coordinates in a coordinate space with respect to the given instrument 170 and one or more of another instrument 170 of the robotic system 104 and a tissue structure of the patient site. For example, the instrument metrics processor 222 can detect a relative position between two or more instruments 170 (e.g., distance). For example, the instrument metrics processor 222 can detect a relative position between an instrument 170 and a tissue structure of the patient site (e.g., distance).
[0048] The instrument metrics processor 222 can remove one or more points indicative of force, based on one or more relative or absolute positions as discussed herein. For example, the instrument metrics processor 222 can filter forces in response to a determination that two instruments are located at the same absolute position or have a relative position indicating a distance of zero or less. The same absolute position or the relative position indicating a distance of zero or less can be indicative of a collision or contact between instruments. Collision or contact between instruments can be associated with high-force data. For example, the instrument metrics processor 222 can detect collisions from kinematics data (e.g., overlap of instruments based on coordinates, yaw, pitch, roll) corresponding to the robot metrics 154. For example, the instrument metrics processor 222 can detect collisions based on one or more force signatures or similarity among one or more force signatures. The instrument metrics processor 222 can remove points associated with contact or collision, in accordance with a determination or a condition to treat collision and contact between instruments 170 as noise. Thus, this technical solution can provide a technical improvement at least to a graphical user interface (GUI), with respect at least to presentation of video augmented with indications of various metrics as discussed herein, and at least to improve a user experience by improvements to the user interface as discussed herein.
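As a hypothetical, non-limiting sketch of the position-based contact check described above: the use of tip positions alone (rather than full kinematics with yaw, pitch, and roll) and the zero contact distance are simplifying assumptions.

```python
import math

def detect_collisions(positions: dict[str, tuple[float, float, float]],
                      contact_distance_m: float = 0.0) -> list[tuple[str, str]]:
    """Flag instrument pairs whose positions coincide (distance at or below a
    contact distance), which can be treated as contact or collision and filtered."""
    ids = sorted(positions)
    pairs: list[tuple[str, str]] = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if math.dist(positions[a], positions[b]) <= contact_distance_m:
                pairs.append((a, b))
    return pairs
```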
[0049] The video segmentation processor 224 can identify portions of the video data 152 according to one or more of the robot metrics 154, the performance metrics 156, and the patient metrics 158. For example, the video segmentation processor 224 can divide one or more videos for corresponding medical procedures according to one or more phases or tasks for that medical procedure. The video annotation processor 226 can modify one or more images or portions of images according to force data associated with that image. For example, the video annotation processor 226 can instruct the video data processor 120 to apply an annotation for a given instrument 170 at one or more given times. In response, the video annotation processor 226 can identify the given instrument 170 in one or more images, and can apply the annotation to the image or images including the given instrument 170. For example, an annotation can include a presentation of text, values or graphics descriptive of the given instrument or the force data corresponding to the given instrument 170 at the given time. For example, the annotation can include a modification of at least a portion of the image corresponding to the given instrument 170 at the given time (e.g., an overlay highlighting the image with a given color, at a given transparency level). Thus, the video segmentation processor 224 and the video annotation processor 226 can each provide a technical improvement to modify video data to present force data as discussed herein with respect to given medical procedures or types of medical procedures, beyond the capability of manual processes.
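A hypothetical sketch of the colored, semi-transparent overlay annotation described above is shown below using OpenCV (an editorial choice, not named by this disclosure); it assumes the instrument's bounding box in the frame has already been located by some other component.

```python
import cv2
import numpy as np

def annotate_instrument(frame: np.ndarray,
                        bbox: tuple[int, int, int, int],
                        force_n: float,
                        color: tuple[int, int, int] = (0, 0, 255),
                        alpha: float = 0.5) -> np.ndarray:
    """Overlay a semi-transparent highlight and a force label on one video frame.

    `bbox` is (x, y, width, height) for the instrument region; how it is
    detected is outside this sketch.
    """
    x, y, w, h = bbox
    overlay = frame.copy()
    cv2.rectangle(overlay, (x, y), (x + w, y + h), color, thickness=-1)  # filled highlight
    blended = cv2.addWeighted(overlay, alpha, frame, 1.0 - alpha, 0.0)    # 50% transparency
    cv2.putText(blended, f"{force_n:.1f} N", (x, max(0, y - 10)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, color, 2)
    return blended
```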
[0050] FIG. 3 depicts an example force data model according to this disclosure. As illustrated by way of example in FIG. 3, a force data model 300 can include at least a lower force threshold 302 and force metrics 310. The force data model 300 is illustrated by way of example in FIG. 3 as a chart including the force metrics 310 arranged as a line graph. The example chart of FIG. 3 depicts quantitative values of the force metrics 310 including a time component and a force component. For example, a time component according to the force data model 300 can include a timestamp indicative of a relative time from the start of a given medical procedure, or an absolute timestamp such as a UNIX timestamp. For example, a force component according to the force data model 300 can include a quantitative value indicating a signed or unsigned magnitude. The quantitative value of the force component can correspond to Newtons (N), for example.
[0051] The lower force threshold 302 can be indicative of a minimum force threshold. For example, the lower force threshold 302 can be indicative of a magnitude of force of 0.6 N. For example, the threshold processor 212 can determine the lower force threshold 302 as discussed herein. For example, the point processor 214 can identify all points of the force metrics 310 having corresponding force components at or below a magnitude (either signed or unsigned) of the lower force threshold 302.
[0052] The force metrics 310 can correspond to an instance of the robot metrics 154. For example, the force metrics 310 can correspond to a set of points as discussed herein for a given medical procedure, a plurality of medical procedures corresponding to a type of medical procedure, a plurality of medical procedures corresponding to a given surgeon or group of surgeons, or any combination thereof, but is not limited thereto. For example, the force metrics 310 can include all or substantially all points detected at one or more of the instruments 170 during one or more medical procedures. For example, the force metrics 310 can include all points captured from one or more instruments at a sampling frequency between 30 Hz and 50 Hz. For example, the force metrics 310 can include approximately two million points for a medical procedure including a surgery of three hours. The force metrics 310 can include a first peak 320, and a second peak 322. The first peak 320 can include a first set of force metrics indicative of a first set of one or more relative peaks of the force metrics 310. For example, the first peak 320 can correspond to points of the force metrics 310 captured during a first task or phase of the medical procedure. The second peak 322 can include a second set of force metrics indicative of a second set of one or more relative peaks of the force metrics 310. For example, the second peak 322 can correspond to points of the force metrics 310 captured during a second task or phase of the medical procedure.
[0053] FIG. 4 depicts an example modified force data structure according to this disclosure. As illustrated by way of example in FIG. 4, a modified force data structure 400 can include at least an upper force threshold 402 and modified force metrics 410. The example chart of FIG. 4 depicts quantitative values of the modified force metrics 410 including the time component and the force component as discussed herein. The upper force threshold 402 can be indicative of a maximum force threshold. For example, the upper force threshold 402 can be indicative of a magnitude of force of 6.0 N or greater. For example, the threshold processor 212 can determine the upper force threshold 402 as discussed herein.
[0054] The modified force metrics 410 can include a subset of the force metrics 310. For example, the point processor 214 can remove all points of the force metrics 310 having corresponding force components at or below a magnitude (either signed or unsigned) of the lower force threshold 302, to generate or obtain the modified force metrics 410. For example, the modified force metrics 410 can include only points above the lower force threshold 302. For example, the modified force metrics 410 can include fewer than 8,000 points for a medical procedure including a surgery of three hours. Thus, the point processor 214 can provide a technical improvement to significantly reduce data density of force data associated with a medical procedure without reducing granularity of force data, via a technical solution including one or more thresholds determined according to one or more characteristics of one or more medical procedures. The modified force metrics 410 can include a first peak 320, and a second peak 322. The first peak 320 can include a first set of force metrics indicative of a first set of one or more relative peaks of the force metrics 310. For example, the first peak 320 can correspond to points of the force metrics 310 captured during a first task or phase of the medical procedure. The second peak 322 can include a second set of force metrics indicative of a second set of one or more relative peaks of the force metrics 310. For example, the second peak 322 can correspond to points of the force metrics 310 captured during a second task or phase of the medical procedure.
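A rough back-of-the-envelope check of the data volumes cited above (on the order of two million raw points reduced to fewer than 8,000) is sketched below; the per-instrument sampling rate of 40 Hz and the count of four instruments are assumptions, not figures taken from this disclosure.

```python
# Assumed: 4 instruments sampled at 40 Hz each over a three-hour procedure.
sampling_hz = 40
instruments = 4
duration_s = 3 * 60 * 60

raw_points = sampling_hz * instruments * duration_s   # 1,728,000 points (order of two million)
retained_fraction = 8_000 / raw_points                 # roughly 0.5% of points survive filtering
print(raw_points, f"{retained_fraction:.2%}")
```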
[0055] The modified force metrics 410 can include a first maximum peak satisfying the upper force threshold 412, a second maximum peak not satisfying the upper force threshold 414, a first set of modified force metrics at the first peak 420, and a second set of modified force metrics at the second peak 422. The first maximum peak 412 satisfying the upper force threshold can include a third set of force metrics with force components having quantitative values that meet or exceed a quantitative value of the maximum force threshold 402. The second maximum peak not satisfying the upper force threshold 414 can include a fourth set of force metrics with force components having quantitative values at or below the quantitative value of the maximum force threshold 402, and that meet or exceed the quantitative value of the minimum force threshold 302. The first set of modified force metrics at the first peak 420 can include a subset of points at the first peak 320. For example, the first set of modified force metrics at the first peak 420 can correspond to a first denoised set of force metrics that exclude points having force components removed according to the minimum force threshold 302. The second set of modified force metrics at the second peak 422 can include a subset of points at the second peak 322. For example, the second set of modified force metrics at the second peak 422 can correspond to a second denoised set of force metrics that exclude points having force components removed according to the minimum force threshold 302.
[0056] FIG. 5A depicts an example user interface presentation according to this disclosure. As illustrated by way of example in FIG. 5A, a user interface presentation 500A can include at least a video data presentation 502, and a force data presentation 504. The video data presentation 502 can include a depiction at the user interface 160 according to the video data 152. For example, the video data presentation 502 can correspond to an instance of one or more videos or video segments identified by the video data processor 120. The video data presentation 502 can include an instrument presentation 510A. The instrument presentation 510A can include a depiction of an instrument of the robotic system 104. [0057] The force data presentation 504 can include one or more elements of the GUI that are indicative of force data associated with the video data presentation 502. The force data presentation 504 can include control affordances and a video timeline presentation. The control affordances can include GUI elements that can detect user input. For example, the control affordances can include a first affordance to highlight clinical segments of the video data presentation 502, a second affordance to highlight consoles and surgeons of the video data presentation 502, a third affordance to highlight instruments of the video data presentation 502, and a fourth affordance to highlight force of the video data presentation 502. The control affordances can include a sliding bar used to manipulate output of the GUI. The force data presentation 504 can include one or more elements of the GUI indicative of a sequence of timestamps or one or more time periods corresponding to a video including the video data presentation 502. The force data presentation 504 can include a first force data model of a first arm 520, a second force data model of a second arm 530, a third force data model of a third arm 540, and a fourth force data model of a fourth arm 550.
[0058] The first force data model of a first arm 520 can include a first set of data corresponding to the structure of the modified force data structure 400, and can be transmitted to the client system 103 via the interface controller 112 of the data processing system 102. The first force data model of a first arm 520 can include an inactive instrument state indication 522, and active instrument state indications 524. The inactive instrument state indication 522 can correspond to one or more times during the medical procedure in which the first arm 520 is associated with force data below the minimum force threshold 302. The active instrument state indications 524 can correspond to one or more times during the medical procedure in which the first arm 520 is associated with force data at or above the minimum force threshold 302. The active instrument state indications 524 can include a first chart of force data according to the structure of the modified force data structure 400, but are not limited to the values indicated thereby. For example, a robotic system as discussed herein can include one or more instruments providing force feedback, and can include one or more instruments not providing force feedback (non-force instruments). Here, force data would be accessible only for the force feedback instruments. For example, an inactive state as discussed herein can correspond to a non-force feedback instrument during any time of a medical procedure. For example, an inactive state as discussed herein can correspond to a force feedback instrument during a time of a medical procedure providing force feedback at a level below a minimum force threshold as discussed herein. [0059] The second force data model of a second arm 530 can include a second set of data corresponding to the structure of the modified force data structure 400, and can be transmitted to the client system 103 via the interface controller 112 of the data processing system 102. The second force data model of a second arm 530 can include active instrument state indications 532, and an inactive state indication 534. The active instrument state indications 532 can correspond to one or more times during the medical procedure in which the second arm 530 is associated with force data at or above the minimum force threshold 302. The active instrument state indications 532 can include a second chart of force data according to the structure of the modified force data structure 400, but are not limited to the values indicated thereby. The inactive state indication 534 can correspond to one or more times during the medical procedure in which the second arm 530 is associated with force data below the minimum force threshold 302.
[0060] The third force data model of a third arm 540 can include a third set of data corresponding to the structure of the modified force data structure 400, and can be transmitted to the client system 103 via the interface controller 112 of the data processing system 102. The third force data model of a third arm 540 can include an inactive instrument state indication 542. The inactive instrument state indication 542 can correspond to one or more times during the medical procedure in which the third arm 540 is associated with force data below the minimum force threshold 302. Here, the third arm 540 remains in an inactive state during the medical procedure.
[0061] The fourth force data model of a fourth arm 550 can include a fourth set of data corresponding to the structure of the modified force data structure 400, and can be transmitted to the client system 103 via the interface controller 112 of the data processing system 102. The fourth force data model of a fourth arm 550 can include a first active instrument state indication 552, and a second active instrument state indication 554. The first active instrument state indication 552 can correspond to one or more times during the medical procedure in which the fourth arm 550 is associated with force data at or above the minimum force threshold 302. The first active instrument state indication 552 can include a first portion of a fifth chart of force data according to the structure of the modified force data structure 400, but is not limited to the values indicated thereby. The second active instrument state indication 554 can correspond to one or more times during the medical procedure in which the fourth arm 550 is associated with force data at or above the minimum force threshold 302. The second active instrument state indication 554 can include a second portion of a fifth chart of force data according to the structure of the modified force data structure 400, but is not limited to the values indicated thereby.
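A hypothetical sketch of how a per-arm timeline could be split into the active and inactive state indications described above is shown below; grouping contiguous above-threshold samples into intervals, and the gap used to split intervals, are presentation assumptions.

```python
def activity_intervals(samples: list[tuple[float, float]],
                       min_threshold_n: float,
                       gap_s: float = 1.0) -> list[tuple[float, float]]:
    """Group (timestamp_s, magnitude_n) samples at or above the minimum force
    threshold into contiguous active intervals; times outside the returned
    intervals would be presented as inactive for that arm."""
    active_times = sorted(t for t, m in samples if m >= min_threshold_n)
    intervals: list[tuple[float, float]] = []
    for t in active_times:
        if intervals and t - intervals[-1][1] <= gap_s:
            intervals[-1] = (intervals[-1][0], t)   # extend the current interval
        else:
            intervals.append((t, t))                # start a new interval
    return intervals
```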
[0062] FIG. 5B depicts an example modified user interface presentation according to this disclosure. As illustrated by way of example in FIG. 5B, a modified user interface presentation 500B can include at least a modified instrument presentation 510B, and a high force indication presentation 560. The modified instrument presentation 510B can correspond to a modification of a portion of one or more images associated with an instrument associated with a high-force point or high-force event. For example, the first arm 520 can be depicted in the modified instrument presentation 510B to include a highlight as discussed herein. For example, the modified instrument presentation 510B can include a highlight for the portion of the video depicting the first arm 520 with a red color overlay. The overlay can have a transparency of 50%.
[0063] The high force indication presentation 560 can present one or more indications directed to the high-force event. For example, the high force indication presentation 560 can correspond to a popup window that can be presented in response to a mouseover or hover event of a cursor at or near a portion of the force data presentation 504. The high force indication presentation 560 can include a force type presentation 562, and a force magnitude presentation 564. The force type presentation 562 can include an image, text, glyph, or the like that is indicative of the high-force event. For example, the force type presentation 562 can be an image or glyph that depicts a symbol associated with a high-force event, or type of high-force event. For example, the force type presentation 562 can present different symbols for tissue contact, instrument collision (if not filtered), instrument type, dynamic threshold level, or any combination thereof, but is not limited thereto.
The high force indication presentation 560 can present an indication of a force feedback setting. For example, the force feedback setting can dictate a magnitude of haptic force feedback delivered to a surgeon by one or more of the instruments (e.g., arms). For example, a force feedback setting can correspond to an OFF setting corresponding to no haptic feedback delivered to a surgeon. For example, a force feedback setting can correspond to a LOW setting corresponding to low haptic feedback delivered to a surgeon (e.g., minimum detectable feedback by human hands). For example, a force feedback setting can correspond to a MEDIUM setting corresponding to medium haptic feedback delivered to a surgeon (e.g., detectable feedback by human hands below a level of resistance to movement of instruments). For example, a force feedback setting can correspond to a HIGH setting corresponding to high haptic feedback delivered to a surgeon (e.g., detectable feedback by human hands at or above a level of resistance to movement of instruments). The high force indication presentation 560 can present an indication of force feedback detected at or by an instrument. For example, the force feedback can correspond to an OFF state indicative of no force detected at an instrument. For example, the force feedback can correspond to a LOW state indicative of a force detected at an instrument below the minimum force threshold. For example, the force feedback can correspond to a MEDIUM state indicative of a force detected at an instrument at or above the minimum force threshold and below the maximum force threshold. For example, the force feedback can correspond to a HIGH state indicative of a force detected at an instrument at or above the maximum force threshold.
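A minimal sketch of the mapping from a detected force magnitude to the qualitative OFF/LOW/MEDIUM/HIGH states described above follows; treating a missing or zero reading as OFF is an assumption.

```python
def force_state(magnitude_n: float | None,
                min_threshold_n: float,
                max_threshold_n: float) -> str:
    """Map a detected force magnitude onto the qualitative states described above."""
    if magnitude_n is None or magnitude_n == 0.0:
        return "OFF"       # no force detected at the instrument
    if magnitude_n < min_threshold_n:
        return "LOW"       # below the minimum force threshold
    if magnitude_n < max_threshold_n:
        return "MEDIUM"    # at or above the minimum, below the maximum
    return "HIGH"          # at or above the maximum force threshold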
[0064] The force magnitude presentation 564 can present a quantitative or qualitative value that is indicative of the high-force event. For example, the force magnitude presentation 564 can present a numerical value that indicates the magnitude of the high-force event (e.g., 6.2 N). For example, the force type presentation 562 can include a visualization for force data including one or more of heat map highlights of instruments in video according to force data or force OPIs. For example, the video data processor 120 can receive one or more timestamps from the force event processor 140 to segment video and deliver, via machine learning, video indicative of high-force moments in various medical procedures, to adjust a timeline to include only high-force events (e.g., a force-based highlights reel), or to generate video recommendations based on high-force events. In response to receiving one or more signatures from the signature processor 216, the user interface 160 can display, at the instrument presentation 510A or 510B, stacked force signatures for various segments of a medical procedure. For example, a stack can include historical force signatures for a particular surgeon, type of medical procedure, site, or time interval.
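A hypothetical sketch of assembling a force-based highlights timeline from high-force timestamps is shown below; the padding and merge-gap values are assumed presentation parameters, not values from this disclosure.

```python
def highlight_clips(high_force_times_s: list[float],
                    pad_s: float = 5.0,
                    merge_gap_s: float = 10.0) -> list[tuple[float, float]]:
    """Turn timestamps of high-force points into padded, merged clip windows
    suitable for a force-based highlights timeline."""
    clips: list[tuple[float, float]] = []
    for t in sorted(high_force_times_s):
        start, end = max(0.0, t - pad_s), t + pad_s
        if clips and start <= clips[-1][1] + merge_gap_s:
            clips[-1] = (clips[-1][0], max(clips[-1][1], end))  # merge with previous clip
        else:
            clips.append((start, end))
    return clips
```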
[0065] FIG. 6 depicts an example method of configuration of force data of robotic systems for compatibility with network transmissions according to this disclosure. At least the system 100 or any component thereof can perform method 600. At 610, the method 600 can receive a data set for force during a medical procedure. For example, the data set can correspond to force data as discussed herein. At 612, the method 600 can receive the data set for force at one or more instruments 170 of a robotic medical system. For example, the method can include selecting, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold, the portion of a patient site corresponding to patient tissue, and the physical property corresponding to a tensile strength of the patient tissue. For example, the first threshold can correspond to the minimum force threshold 302.
[0066] At 620, the method 600 can remove one or more first points from the data set that satisfy a first threshold. For example, the point processor 214 can remove one or more points below the minimum force threshold 302 as discussed herein. For example, the first threshold can be indicative of a minimum magnitude of force, and the second threshold indicative of a maximum magnitude of force. At 622, the method 600 can remove the first points by a threshold indicative of a first magnitude of force at the one or more instruments of the robotic system.
[0067] At 630, the method 600 can identify one or more second points that satisfy a second threshold. For example, the second threshold can correspond to the maximum force threshold 402. For example, the method can include selecting, according to a type of at least one of the instruments or according to a type of segment of the medical procedure, a quantitative value of the second threshold. For example, the threshold processor 212 can dynamically set the second threshold according to a type of an instrument 170 from which a given set of force data is captured. At 632, the method 600 can identify the second points with a threshold indicative of a second magnitude of force at the one or more instruments. For example, the second magnitude of force can be the magnitude of force associated with the maximum force threshold 402. The maximum force threshold 402 can correspond to an instrument collision, for example, if collision force data is to be filtered out. At 634, the method 600 can identify with a machine learning model and from the data set with the one or more first points removed. For example, the method can include generating, based on one or more of the second points that satisfy the second threshold, a first signature indicative of the force at the one or more instruments of the robotic medical system during the medical procedure. For example, the method can include determining that one or more second signatures each satisfy a third threshold indicative of a similarity with the first signature. The method can include selecting, according to the one or more second signatures, the one or more videos.
[0068] For example, the method can include receiving a second data set corresponding to positions of the one or more instruments of the robotic medical system during the medical procedure. The method can include identifying, with the machine learning model and from the data set with the one or more first points removed, one or more third points that do not satisfy the second threshold. The method can include determining, based on the one or more third points and the positions, contact between the one or more instruments. At 640, the method 600 can cause a user interface to present one or more videos. For example, the user interface 160 can present one or more video or segments of video as discussed herein. At 642, the method 600 can present one or more videos for one or more times during the medical procedure. At 644, the method 600 can present one or more videos for the one or more second points.
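As a hypothetical, end-to-end sketch of the ordering of method 600 (receive, remove first points, identify second points, hand timestamps to a user interface): it reuses the ForcePoint record sketched earlier, and the plain threshold comparison below stands in for the machine learning model recited in the method, which is an assumption made only to keep the example self-contained.

```python
def process_force_data(points, min_threshold_n: float, max_threshold_n: float) -> list[float]:
    """Illustrative ordering of method 600 for a list of ForcePoint-like records."""
    # 620: remove first points that satisfy the first (minimum) threshold.
    retained = [p for p in points if abs(p.magnitude_n) >= min_threshold_n]
    # 630: identify second points that satisfy the second (maximum) threshold
    #      (a simple comparison standing in for the machine learning model).
    high_force = [p for p in retained if abs(p.magnitude_n) >= max_threshold_n]
    # 640: return the times a user interface would use to select and present videos.
    return sorted(p.timestamp_s for p in high_force)
```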
[0069] FIG. 7A depicts an example of force data models according to this disclosure. The first presentation 700A depicted in FIG. 7A can be generated or output by one or more components or systems depicted herein, including, for example, the data processing system 102 depicted in FIG. 1, or the computing architecture 200 depicted in FIG. 2.
[0070] As illustrated by way of example in FIG. 7A, a first presentation 700A of force data models can include at least a first force data model 702A for a first instrument 170 of the robotic system 104, and a second force data model 704A for a second instrument 170 of the robotic system 104. For example, each of the points of the first force data model 702A and the second force data model 704A can include a force type metric. The force type metric can be indicative of a state of force with respect to a given one or more of the instruments 170. For example, the force type metric can be a categorical metric as discussed herein by way of example, but is not limited thereto. The first force data model 702A and the second force data model 704A can include a number of the first points at a first point density indicative of raw data captured by the instrument. For example, the first point density can correspond to the first data model including points at or below the minimum force threshold. For example, the user interface 160 can present the first force data model 702A and the second force data model 704A as bar charts, to indicate presence of one or more points captured at one or more given times during a given medical procedure by each instrument. For example, FIG. 7A can depict various force feedback configurations for one or more instruments 170 of the robotic system 104. For example, the robotic system 104 can receive input corresponding to a setting by a surgeon of a level of force feedback (e.g., resistance to movement of an instrument 170) that is provided to the instruments 170. For example, visualizations according to the examples of FIGS. 7A-7C can reflect one or more settings of one or more force feedback configurations, to identify perceivable responses at one or more instruments 170 (e.g., manipulators of the robotic system 104). Thus, this technical solution can provide a technical improvement at least to guide surgeons to use the appropriate amount of force in a medical procedure (e.g., the surgeon will use less force when the surgeon feels more force at a higher setting), beyond the capability of manual processes to achieve.
[0071] The first force data model 702A can include a first waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the first instrument 170. The first force data model 702A can include at least a first subset of points 710A, a second subset of points 712A, a third subset of points 714A, and a fourth subset of points 716A.
[0072] The first subset of points 710A can correspond to a portion of the waveform indicative of a first state of the first instrument 170. For example, the first subset of points 710A can indicate a first force type associated with the first instrument 170 at various times. Here, the first force type state can be, for example, a first force configuration state indicative of a first force feedback magnitude on or by the first instrument 170. The second subset of points 712A can correspond to a portion of the waveform indicative of a second state of the first instrument 170. For example, the second subset of points 712A can indicate a second force type associated with the first instrument 170 at various times. Here, the second force type state can be, for example, a second force configuration state indicative of a second force feedback magnitude on or by the first instrument 170. The third subset of points 714A can correspond to a portion of the waveform indicative of a third state of the first instrument 170. For example, the third subset of points 714A can indicate a third force type associated with the first instrument 170 at various times. Here, the third force type state can be, for example, a third force configuration state indicative of a third force feedback magnitude on or by the first instrument 170. The fourth subset of points 716A can correspond to a portion of the waveform indicative of a fourth state of the first instrument 170. For example, the fourth subset of points 716A can indicate a fourth force type associated with the first instrument 170 at various times. Here, the fourth force type state can be, for example, a fourth force configuration state indicative of a fourth force feedback magnitude on or by the first instrument 170.
[0073] The second force data model 704A can include a second waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the second instrument 170. The second force data model 704A can include a number of the first points at the point density. The second force data model 704A can include at least a first subset of points 720A, a second subset of points 722A, a third subset of points 724A, and a fourth subset of points 726A.
[0074] The first subset of points 720A can correspond to a portion of the waveform indicative of a first state of the second instrument 170. For example, the first subset of points 720A can indicate the first force type associated with the second instrument 170 at various times. The second subset of points 722A can correspond to a portion of the waveform indicative of a second state of the second instrument 170. For example, the second subset of points 722A can indicate the second force type associated with the second instrument 170 at various times. The third subset of points 724A can correspond to a portion of the waveform indicative of a third state of the second instrument 170. For example, the third subset of points 724A can indicate the third force type associated with the second instrument 170 at various times. The fourth subset of points 726A can correspond to a portion of the waveform indicative of a fourth state of the second instrument 170. For example, the fourth subset of points 726A can indicate the fourth force type associated with the second instrument 170 at various times.
[0075] FIG. 7B depicts an example of modified force data models according to this disclosure. As illustrated by way of example in FIG. 7B, a second presentation 700B of modified force data models can include at least a first modified force data model 702B for a first instrument 170 of the robotic system 104, and a second modified force data model 704B for a second instrument 170 of the robotic system 104. For example, each of the points of the first modified force data model 702B and the second modified force data model 704B can include the force type metric as discussed herein. The first modified force data model 702B and the second modified force data model 704B can include a number of the first points at a second point density that excludes points having force metrics indicative of magnitudes of force at or below the minimum force threshold. For example, the second point density can correspond to the first data model excluding points at or below the minimum force threshold. For example, the user interface 160 can present the first modified force data model 702B and the second modified force data model 704B as line charts, to accommodate removal of points without loss of clarity of a user interface presentation.
[0076] The first modified force data model 702B can include a first waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the first instrument 170. The first modified force data model 702B can include at least a first subset of points 710B, a second subset of points 712B, a third subset of points 714B, and a fourth subset of points 716B.
[0077] The first subset of points 710B can correspond to a portion of the waveform indicative of a first state of the first instrument 170. For example, the first subset of points 710B can indicate the first force type associated with the first instrument 170 at various times. The second subset of points 712B can correspond to a portion of the waveform indicative of a second state of the first instrument 170. For example, the second subset of points 712B can indicate the second force type associated with the first instrument 170 at various times. The third subset of points 714B can correspond to a portion of the waveform indicative of a third state of the first instrument 170. For example, the third subset of points 714B can indicate the third force type associated with the first instrument 170 at various times. The fourth subset of points 716B can correspond to a portion of the waveform indicative of a fourth state of the first instrument 170. For example, the fourth subset of points 716B can indicate the fourth force type associated with the first instrument 170 at various times.
[0078] The second modified force data model 704B can include a second waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the second instrument 170. The second modified force data model 704B can include a number of the first points at the point density. The second modified force data model 704B can include at least a first subset of points 720B, a second subset of points 722B, a third subset of points 724B, and a fourth subset of points 726B.
[0079] The first subset of points 720B can correspond to a portion of the waveform indicative of a first state of the second instrument 170. For example, the first subset of points 720B can indicate the first force type associated with the second instrument 170 at various times. The second subset of points 722B can correspond to a portion of the waveform indicative of a second state of the second instrument 170. For example, the second subset of points 722B can indicate the second force type associated with the second instrument 170 at various times. The third subset of points 724B can correspond to a portion of the waveform indicative of a third state of the second instrument 170. For example, the third subset of points 724B can indicate the third force type associated with the second instrument 170 at various times. The fourth subset of points 726B can correspond to a portion of the waveform indicative of a fourth state of the second instrument 170. For example, the fourth subset of points 726B can indicate the fourth force type associated with the second instrument 170 at various times.
[0080] FIG. 7C depicts an example of filtered force data models according to this disclosure. As illustrated by way of example in FIG. 7C, a third presentation 700C of filtered force data models can include at least a first filtered force data model 702C for a first instrument 170 of the robotic system 104, and a second filtered force data model 704C for a second instrument 170 of the robotic system 104. For example, each of the points of the first filtered force data model 702C and the second filtered force data model 704C can include the force type metric as discussed herein. The first filtered force data model 702C and the second filtered force data model 704C can include a number of the first points at a third point density that excludes points having force metrics indicative of magnitudes of force at or below the minimum force threshold. For example, the third point density can correspond to the first data model excluding points at or below the minimum force threshold, and excluding time periods including only excluded points from presentation. For example, the user interface 160 can present the first filtered force data model 702C and the second filtered force data model 704C as bar charts only for portions of the waveform for each instrument including a point at the third point density, to accommodate removal of points without loss of clarity of a user interface presentation.
[0081] The first filtered force data model 702C can include a first waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the first instrument 170. The first filtered force data model 702C can include at least a first subset of points 710C, a second subset of points 712C, a third subset of points 714C, and a fourth subset of points 716C.
[0082] The first subset of points 710C can correspond to a portion of the waveform indicative of a first state of the first instrument 170. For example, the first subset of points 710C can indicate the first force type associated with the first instrument 170 at various times. The second subset of points 712C can correspond to a portion of the waveform indicative of a second state of the first instrument 170. For example, the second subset of points 712C can indicate the second force type associated with the first instrument 170 at various times. The third subset of points 714C can correspond to a portion of the waveform indicative of a third state of the first instrument 170. For example, the third subset of points 714C can indicate the third force type associated with the first instrument 170 at various times. The fourth subset of points 716C can correspond to a portion of the waveform indicative of a fourth state of the first instrument 170. For example, the fourth subset of points 716C can indicate the fourth force type associated with the first instrument 170 at various times.
[0083] The second filtered force data model 704C can include a second waveform indicative of one or more first points at one or more times during a given medical procedure, where each of the first points is indicative of force at or on the second instrument 170. The second filtered force data model 704C can include a number of the first points at the point density. The second filtered force data model 704C can include at least a first subset of points 720C, a second subset of points 722C, a third subset of points 724C, and a fourth subset of points 726C.
[0084] The first subset of points 720C can correspond to a portion of the waveform indicative of a first state of the second instrument 170. For example, the first subset of points 720C can indicate the first force type associated with the second instrument 170 at various times. The second subset of points 722C can correspond to a portion of the waveform indicative of a second state of the second instrument 170. For example, the second subset of points 722C can indicate the second force type associated with the second instrument 170 at various times. The third subset of points 724C can correspond to a portion of the waveform indicative of a third state of the second instrument 170. For example, the third subset of points 724C can indicate the third force type associated with the second instrument 170 at various times. The fourth subset of points 726C can correspond to a portion of the waveform indicative of a fourth state of the second instrument 170. For example, the fourth subset of points 726C can indicate the fourth force type associated with the second instrument 170 at various times.
[0085] Thus, the data processing system can generate (e.g., via a machine learning model) a density plot of forces associated with one or more instruments of a robotic system during one or more medical procedures, and can generate "fingerprints" for one or more videos or video clips. The video clips associated with portions of videos of medical procedures can be associated with discrete portions of the workflow (e.g., task or phase) and discrete force data or ranges of force data from the robotic system. The data processing system can generate, render, or display presentations that allow a surgeon reviewing medical procedures for force to rapidly review or cycle between many high-force video clips relevant to a particular robotic system, medical procedure, or medical procedure workflow. For example, the data processing system can generate a histogram of force, which can be based on the third presentation 700C. [0086] Having now described some illustrative implementations, the foregoing is illustrative and not limiting, having been presented by way of example. In particular, although many of the examples presented herein involve specific combinations of method acts or system elements, those acts and those elements may be combined in other ways to accomplish the same objectives. Acts, elements and features discussed in connection with one implementation are not intended to be excluded from a similar role in other implementations.
[0087] The phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including," "comprising," "having," "containing," "involving," "characterized by," "characterized in that," and variations thereof herein, is meant to encompass the items listed thereafter, equivalents thereof, and additional items, as well as alternate implementations consisting of the items listed thereafter exclusively. In one implementation, the systems and methods described herein consist of one, each combination of more than one, or all of the described elements, acts, or components.
[0088] References to "or" may be construed as inclusive so that any terms described using "or" may indicate any of a single, more than one, and all of the described terms. References to at least one of a conjunctive list of terms may be construed as an inclusive OR to indicate any of a single, more than one, and all of the described terms. For example, a reference to "at least one of "A" and "B" can include only "A", only "B", as well as both "A" and "B". Such references used in conjunction with "comprising" or other open terminology can include additional items. References to "is" or "are" may be construed as nonlimiting to the implementation or action referenced in connection with that term. The terms "is" or "are" or any tense or derivative thereof, are interchangeable and synonymous with "can be" as used herein, unless stated otherwise herein.
[0090] Directional indicators depicted herein are example directions to facilitate understanding of the examples discussed herein, and are not limited to the directional indicators depicted herein. Any directional indicator depicted herein can be modified to the reverse direction, or can be modified to include both the depicted direction and a direction reverse to the depicted direction, unless stated otherwise herein. While operations are depicted in the drawings in a particular order, such operations are not required to be performed in the particular order shown or in sequential order, and all illustrated operations are not required to be performed. Actions described herein can be performed in a different order. Where technical features in the drawings, detailed description or any claim are followed by reference signs, the reference signs have been included to increase the intelligibility of the drawings, detailed description, and claims. Accordingly, neither the reference signs nor their absence has any limiting effect on the scope of any claim elements.
[0090] Scope of the systems and methods described herein is thus indicated by the appended claims, rather than the foregoing description. The scope of the claims includes equivalents to the meaning and scope of the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A system, comprising: one or more processors, coupled with memory, to: receive a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure; remove one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic medical system; identify, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system; and cause a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
2. The system of claim 1, comprising the one or more processors to: select, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold.
3. The system of claim 2, the portion of the patient site corresponding to patient tissue, and the physical property corresponding to a tensile strength of the patient tissue.
4. The system of any of claims 1-3, comprising the one or more processors to: select, according to a type of at least one of the instruments, a quantitative value of the second threshold.
5. The system of any of claims 1-4, comprising the one or more processors to: select, according to a type of segment of the medical procedure, a quantitative value of the second threshold.
6. The system of any of claims 1-5, the first threshold indicative of a minimum magnitude of force.
7. The system of any of claims 1-6, the second threshold indicative of a maximum magnitude of force.
8. The system of any of claims 1-7, comprising the one or more processors to: receive a second data set corresponding to positions of the one or more instruments of the robotic medical system during the medical procedure.
9. The system of claim 8, comprising the one or more processors to: identify, with the machine learning model and from the data set with the one or more first points removed, one or more third points that do not satisfy the second threshold.
10. The system of claim 9, comprising the one or more processors to: determine, based on the one or more third points and the positions, contact between the one or more instruments.
11. The system of any of claims 1-10, comprising the one or more processors to: generate a first signature indicative of the force at the one or more instruments of the robotic medical system during the medical procedure.
12. The system of claim 11, comprising the one or more processors to: generate, based on one or more of the second points that satisfy the second threshold, the first signature.
13. The system of any of claims 11 and 12, comprising the one or more processors to: determine that one or more second signatures each satisfy a third threshold indicative of a similarity with the first signature; and select, according to the one or more second signatures, the one or more videos.
14. A method, comprising: receiving, by one or more processors coupled with memory, a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure; removing, by the one or more processors, one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic system; identifying, by the one or more processors, with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system; and causing, by the one or more processors, a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
15. The method of claim 14, comprising: selecting, by the one or more processors, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold, the portion of a patient site corresponding to patient tissue, and the physical property corresponding to a tensile strength of the patient tissue.
16. The method of claim 14, comprising: selecting, by the one or more processors, according to a type of at least one of the instruments or according to a type of segment of the medical procedure, a quantitative value of the second threshold.
17. The method of claim 14, the first threshold indicative of a minimum magnitude of force, and the second threshold indicative of a maximum magnitude of force.
18. The method of claim 14, comprising: receiving, by the one or more processors, a second data set corresponding to positions of the one or more instruments of the robotic medical system during the medical procedure; identifying, by the one or more processors, with the machine learning model and from the data set with the one or more first points removed, one or more third points that do not satisfy the second threshold; and determining, by the one or more processors, based on the one or more third points and the positions, contact between the one or more instruments.
19. The method of claim 14, comprising: generating, by the one or more processors, based on one or more of the second points that satisfy the second threshold, a first signature indicative of the force at the one or more instruments of the robotic medical system during the medical procedure.
20. The method of claim 19, comprising: determining, by the one or more processors, that one or more second signatures each satisfy a third threshold indicative of a similarity with the first signature; and selecting, by the one or more processors, according to the one or more second signatures, the one or more videos.
21. A non-transitory computer readable medium including one or more instructions stored thereon and executable by one or more processors to: receive, by the one or more processors, a data set corresponding to force at one or more instruments of a robotic medical system during a medical procedure; remove, by the one or more processors, one or more first points from the data set that satisfy a first threshold indicative of a first magnitude of force at the one or more instruments of the robotic medical system; identify, by the one or more processors and with a machine learning model and from the data set with the one or more first points removed, one or more second points that satisfy a second threshold indicative of a second magnitude of force at the one or more instruments of the robotic medical system; and cause, by the one or more processors, a user interface to present one or more videos for one or more times during the medical procedure that correspond to the one or more second points from the data set.
22. The non-transitory computer readable medium of claim 21, the non-transitory computer readable medium further including one or more instructions executable by the one or more processors to: select, according to a physical property of at least a portion of a patient site associated with the medical procedure, a quantitative value of the first threshold.
23. The non-transitory computer readable medium of claim 21, the non-transitory computer readable medium further including one or more instructions executable by the one or more processors to: select, according to a type of at least one of the instruments, a quantitative value of the second threshold.
24. The non-transitory computer readable medium of claim 21, the non-transitory computer readable medium further including one or more instructions executable by the one or more processors to: select, according to a type of at least one of the instruments or according to a type of segment of the medical procedure, a quantitative value of the second threshold.
PCT/US2025/017655, filed 2025-02-27 (priority date 2024-02-28): Configuration of force data of robotic systems for compatibility with network transmissions. Legal status: Pending. Published as WO2025184382A1 (en).

Applications Claiming Priority (2)

Application Number: US202463559124P; Priority Date: 2024-02-28; Filing Date: 2024-02-28
Application Number: US63/559,124; Priority Date: 2024-02-28

Publications (1)

Publication Number: WO2025184382A1 (en); Publication Date: 2025-09-04

Family ID: 95065473

Family Applications (1)

Application Number: PCT/US2025/017655; Priority Date: 2024-02-28; Filing Date: 2025-02-27; Title: Configuration of force data of robotic systems for compatibility with network transmissions; Status: Pending; Publication: WO2025184382A1 (en)

Country Status (1)

Country: WO; Publication: WO2025184382A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication Number: US20210322121A1 *; Priority Date: 2018-09-12; Publication Date: 2021-10-21; Assignee: Verb Surgical Inc.; Title: Machine-learning-based visual-haptic system for robotic surgical platforms

Similar Documents

Publication Publication Date Title
US20220175470A1 (en) Reconfigurable display in computer-assisted tele-operated surgery
KR102512881B1 (en) Master to slave orientation mapping when misaligned
US8199106B2 (en) Systems and methods of camera-based fingertip tracking
US9342145B2 (en) Cursor control
US9791938B2 (en) System and methods of camera-based fingertip tracking
US10771350B2 (en) Method and apparatus for changeable configuration of objects using a mixed reality approach with augmented reality
EP2612591A1 (en) Medical information display device, method and program
Speidel et al. Tracking of instruments in minimally invasive surgery for surgical skill analysis
JP2019526328A (en) System and method for preventing surgical errors
JP6362061B2 (en) Diagnosis support system, operation method thereof, and program
Ameur et al. Hand-gesture-based touchless exploration of medical images with leap motion controller
JP2018534667A (en) Data display device
Hettig et al. Comparison of gesture and conventional interaction techniques for interventional neuroradiology
US20140258917A1 (en) Method to operate a device in a sterile environment
CN113808181A (en) Medical image processing method, electronic device and storage medium
WO2025184382A1 (en) Configuration of force data of robotic systems for compatibility with network transmissions
JP6440394B2 (en) Simulation image display device
Shim et al. Interactive features based augmented reality authoring tool
Heinrich et al. Interacting with medical volume data in projective augmented reality
WO2025171123A1 (en) Medical procedure video segment identification based on surgical data
KR101953730B1 (en) Medical non-contact interface system and method of controlling the same
CN104997510B (en) The method, apparatus and program of image monitoring are carried out to intervention using magnetic resonance device
US20250295457A1 (en) Configuration of robotic systems for ergonomic states of operators in medical procedures
US20250364123A1 (en) Monitoring of a medical environment by fusion of egocentric and exocentric sensor data
JP6530841B2 (en) DIAGNOSTIC SUPPORT SYSTEM, OPERATION METHOD THEREOF, DIAGNOSTIC SUPPORT DEVICE, AND PROGRAM

Legal Events

Code 121 (EP): The EPO has been informed by WIPO that EP was designated in this application.
Ref document number: 25713429
Country of ref document: EP
Kind code of ref document: A1