
WO2025106911A1 - Systems and methods for accelerated skill acquisition and assessment for surgical training - Google Patents


Info

Publication number
WO2025106911A1
WO2025106911A1 (PCT/US2024/056260)
Authority
WO
WIPO (PCT)
Prior art keywords
assembly, sensor, tool, linkage assembly, coupled
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
PCT/US2024/056260
Other languages
French (fr)
Inventor
Nabil Simaan
Madison A. Veliky
Ahmet Yildiz
Garrison L. Johnston
Jumanh Atoum
Jie Ying Wu
Soheil Kolouri
Michael I. Miga
Rondi M. Kauffmann
Kamran Idrees
Current Assignee: Vanderbilt University (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Vanderbilt University
Application filed by Vanderbilt University

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 34/76: Manipulators having means for providing feel, e.g. force or tactile feedback
    • A61B 17/00: Surgical instruments, devices or methods
    • A61B 2017/00017: Electrical control of surgical instruments
    • A61B 2017/00022: Sensing or detecting at the treatment site
    • A61B 2017/00039: Electric or electromagnetic phenomena other than conductivity, e.g. capacity, inductivity, Hall effect
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/06: Measuring instruments not otherwise provided for
    • A61B 2090/064: Measuring instruments for measuring force, pressure or mechanical tension

Definitions

  • the present disclosure provides a training system including one or more haptic devices, a skill assessment algorithm, a virtual reality model of anatomy, and user eye gaze and instrument motion tracking algorithms.
  • the combination of these devices and methods allows this system to train surgeons on minimally invasive procedures where they may use a simulated organ in a virtual reality setting or a physical organ model either made of silicone or utilizing an explanted animal organ.
  • the present disclosure provides a system for training a user in soft-tissue surgeries.
  • the system comprises a robotic arm assembly and a computing device.
  • the robotic arm assembly includes a frame, an upper manipulation linkage assembly coupled to the frame, a lower manipulation linkage assembly coupled to the frame, one or more actuators coupled to the upper manipulation linkage assembly or the lower manipulation linkage assembly, a tool stem assembly coupled to the upper manipulation linkage assembly and the lower manipulation linkage assembly, the tool stem assembly configured to support a surgical tool, and a sensor assembly coupled to the tool stem assembly.
  • the computing device is configured to receive magnetic flux density data from the sensor assembly when a force is applied to the tool stem assembly, determine a position and an orientation of the tool stem assembly based on the magnetic flux density data, estimate forces of interaction with the upper manipulation linkage assembly and the lower manipulation linkage assembly based on electrical current of the one or more actuators, calculate an estimation of force at a distal end of the surgical tool based on the position and the orientation of the tool stem assembly and the forces of interaction, and process the estimation of force at the distal end of the surgical tool to provide real-time haptic feedback to the user via the surgical tool.
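The processing chain in this claim (flux density to stem pose, actuator current to joint effort, then a tip-force estimate driving the haptics) can be sketched as follows. The linear calibration map, motor torque constants, and Jacobian statics relation below are illustrative assumptions; the patent does not publish its actual models.

```python
import numpy as np

def pose_from_flux(flux, calib):
    """Small-deflection estimate of the tool-stem pose (dx, dy, dz, rx, ry, rz)
    from stacked Hall-effect flux readings, via a linear calibration map."""
    return calib @ flux

def joint_torques_from_current(currents, torque_constants):
    """Joint-level effort estimate from actuator currents (tau = k_t * i)."""
    return torque_constants * currents

def tip_force_from_torques(joint_torques, jacobian):
    """Map joint torques to an equivalent tool-tip force via statics,
    f = J^{-T} tau (square, non-singular Jacobian assumed)."""
    return np.linalg.solve(jacobian.T, joint_torques)
```

In a real device the calibration map and Jacobian would come from sensor identification and the arm kinematics; the resulting tip-force estimate would then feed the real-time haptic rendering loop.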
  • the sensor assembly includes a plurality of magnets and a plurality of Hall-effect sensors configured to detect the magnetic flux density data.
  • the tool stem assembly includes a stem body, a first sensor housing supported on the stem body, and a second sensor housing supported on the stem body, and wherein the sensor assembly includes a first sensor supported by the first sensor housing and a second sensor supported by the second sensor housing, and wherein the first sensor includes a first elastomeric matrix positioned between the first sensor housing and the stem body, and wherein the second sensor includes a second elastomeric matrix positioned between the second sensor housing and the stem body.
  • the first sensor includes a plurality of magnets and a plurality of Hall-effect sensors within the first sensor housing, the plurality of magnets and the plurality of Hall-effect sensors configured to detect the magnetic flux density data.
  • the second sensor includes a plurality of magnets and a plurality of Hall-effect sensors within the second sensor housing, the plurality of magnets and the plurality of Hall-effect sensors configured to detect the magnetic flux density data.
  • the sensor assembly includes a six-axis sensor including an array of magnets and corresponding Hall-effect sensors to detect the magnetic flux density data when a force is applied to the surgical tool.
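The sensing principle behind the magnet/Hall-effect array can be illustrated with the point-dipole field model: displacing a magnet relative to a fixed Hall-effect sensor changes the flux density the sensor reads, and the array of such readings encodes the applied load. The geometry and dipole moment below are arbitrary illustrative values, not device parameters.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_flux(r, m):
    """Flux density B (tesla) at offset r (metres) from a point magnetic
    dipole of moment m (A*m^2): B = mu0/(4*pi*|r|^3) * (3(m.rhat)rhat - m)."""
    rn = np.linalg.norm(r)
    rhat = r / rn
    return MU0 / (4 * np.pi * rn**3) * (3 * np.dot(m, rhat) * rhat - m)
```

A force on the tool deflects the elastomeric interface, shifts each magnet-to-sensor offset r, and thereby changes every channel of the measured flux; inverting that mapping yields the wrench estimate.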
  • the system further comprises a pitch actuation assembly coupled to the upper manipulation linkage assembly, the pitch actuation assembly configured to move the upper manipulation linkage assembly about a horizontal axis.
  • the system further comprises a yaw actuation assembly coupled to the lower manipulation linkage assembly, the yaw actuation assembly configured to move the lower manipulation linkage assembly about a vertical axis.
  • the system further comprises an interchange device coupled to the tool stem assembly, the interchange device configured to lock the surgical tool in position relative to the tool stem assembly.
  • the interchange device includes a locked state and a released state to interchange the surgical tool.
  • the system further comprises a virtual reality system configured to provide a simulated physical interaction with a soft-tissue organ, and wherein the user receives visual and auditory guidance from the virtual reality system while receiving the real-time haptic feedback.
  • the virtual reality system is configured to incorporate eye gaze data of the user when providing the real-time haptic feedback.
  • the computing device is further configured to input the estimation of force at the distal end of the surgical tool, the eye gaze data, and tool motion data to a skill assessment machine learning model to provide an output of a score related to a surgical training task performed by the user.
  • the computing device is further configured to input video data from an endoscope to the skill assessment machine learning model.
  • the tool motion data include curvature and torsion of a striction curve and a dual angle.
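The patent does not spell out its feature extraction, but curvature and torsion of a sampled space curve (the striction curve of the tool motion being one such curve) can be computed discretely from the Frenet formulas. This sketch assumes uniform sampling of a smooth trajectory; both quantities are parameterization-invariant, so index spacing suffices.

```python
import numpy as np

def curvature_torsion(p):
    """Discrete curvature and torsion along a sampled space curve p (N x 3),
    using finite-difference derivatives with uniform parameter spacing.
    kappa = |r' x r''| / |r'|^3, tau = (r' x r'').r''' / |r' x r''|^2."""
    d1 = np.gradient(p, axis=0)
    d2 = np.gradient(d1, axis=0)
    d3 = np.gradient(d2, axis=0)
    cross = np.cross(d1, d2)
    cn = np.linalg.norm(cross, axis=1)
    kappa = cn / np.linalg.norm(d1, axis=1) ** 3
    tau = np.einsum('ij,ij->i', cross, d3) / np.maximum(cn ** 2, 1e-30)
    return kappa, tau
```

A convenient sanity check: for a unit helix (cos t, sin t, t) the analytic values are kappa = tau = 1/2.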
  • the system further comprises a second robotic arm assembly including a second frame, a second upper manipulation linkage assembly coupled to the second frame, a second lower manipulation linkage assembly coupled to the second frame, one or more actuators coupled to the second upper manipulation linkage assembly or the second lower manipulation linkage assembly, a second tool stem assembly coupled to the second upper manipulation linkage assembly and the second lower manipulation linkage assembly, the second tool stem assembly configured to support a second surgical tool, and a second sensor assembly coupled to the second tool stem assembly.
  • the present disclosure provides a haptic device with integrated intrinsic joint-level sensing and embedded sensors allowing estimation of user effort and tool-tip interaction force.
  • the haptic device is configured to provide a user the experience of a remote center of motion with a specified stiffness.
  • the present disclosure provides a low-cost, six-axis sensor for estimating force and moment interactions with surgical instruments using an array of Hall- effect sensors and miniature magnets embedded within an elastomeric tool interface.
  • the present disclosure provides a system whereby information regarding tool motion, and estimated user effort and forces at the tool tip are used for real-time skill scoring for the purpose of informing control laws that assist trainees in the process of gaining technical skills during minimally invasive surgical training.
  • the system includes two robotic arm assemblies along with either a physical training model (e.g., an explanted animal organ or a silicone-based phantom) or a virtual-reality simulated phantom.
  • a programmable position and stiffness of the remote center of motion is achieved via a combination of kinematic coordinated control and a combination of active stiffness control and/or via impedance/admittance control.
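A minimal illustration of a programmable-stiffness remote center of motion is a virtual spring that penalizes only the tool shaft's lateral deviation from the RCM point, with the stiffness k as the programmable parameter. This is a sketch of the general idea, not the patent's combined kinematic coordinated / active stiffness / impedance-admittance controller.

```python
import numpy as np

def rcm_restoring_force(p_shaft, p_rcm, axis, k):
    """Virtual-spring force (N) pulling the tool shaft back onto a remote
    center of motion. p_shaft is a point on the shaft and axis its unit
    direction; sliding along the shaft stays free, and only the lateral
    error is penalized with programmable stiffness k (N/m)."""
    e = np.asarray(p_rcm, dtype=float) - np.asarray(p_shaft, dtype=float)
    e_lateral = e - np.dot(e, axis) * axis  # remove the along-shaft component
    return k * e_lateral
```

Rendering this force through the haptic actuators makes the shaft feel pivoted at the RCM point with an adjustable stiffness.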
  • a virtual reality system is used for simulated physical interaction with an organ and visual and auditory guidance cues are provided to a trainee along with force feedback via the haptic device.
  • the present disclosure provides a skill-assessment algorithm that uses information regarding the temporal evolution of the locus of instantaneous screws of motion and instantaneous wrenches (forces/moments) as an input for the skill assessment algorithm.
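The "instantaneous screws of motion" named here are the standard screw-theory decomposition of a spatial twist. A sketch of that decomposition (axis point, axis direction, pitch) is below, using the convention that the twist is (omega, v) with v the velocity of the body point at the origin; the tolerance is an illustrative choice.

```python
import numpy as np

def instantaneous_screw(omega, v):
    """Decompose a twist (omega, v) into its instantaneous screw: a point
    on the screw axis, the unit axis direction, and the pitch. A vanishing
    angular velocity is a pure translation (infinite pitch, no axis point)."""
    w2 = float(np.dot(omega, omega))
    if w2 < 1e-12:
        return None, v / np.linalg.norm(v), np.inf
    point = np.cross(omega, v) / w2   # closest point on the axis to the origin
    pitch = np.dot(omega, v) / w2     # translation per radian of rotation
    return point, omega / np.sqrt(w2), pitch
```

Tracking how this axis, point, and pitch evolve over a task yields the temporal locus of screws that the skill-assessment algorithm consumes.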
  • the system utilizes user gaze information for skill assessment.
  • the system utilizes endoscopic video information as an additional input to the skill-assessment algorithm for skill assessment of the user.
  • the system utilizes a simulated model of a target training anatomy to enable simulated interaction with a target anatomy while also offering haptic force feedback.
  • FIG.1A illustrates a haptic training system according to an embodiment of the present disclosure.
  • FIG.2A illustrates a perspective view of a prototype of the robotic arm assembly illustrated in FIGS.1A-1F.
  • FIG.2B illustrates a top perspective view of the prototype robotic arm assembly illustrated in FIGS.1A-1F.
  • FIG.3 is a perspective view of the various assemblies of the robotic arm assembly illustrated in FIGS.1A-1F.
  • FIG.4 illustrates an exploded view of an upper manipulation linkage (five-bar) assembly of the robotic arm assembly.
  • FIG.5 illustrates a rear perspective view of a lower (vertical) manipulation linkage assembly of the robotic arm assembly.
  • FIG.6 illustrates a perspective view of a pitch actuation assembly of the robotic arm assembly.
  • FIG.7A illustrates a perspective view of a tool stem assembly of the robotic arm assembly with a surgical tool.
  • FIG.7B is an enlarged view of a portion of a tool stem assembly of the robotic arm assembly with a surgical tool.
  • FIG.8A illustrates a perspective view of a component of the tool stem assembly of FIG.7A.
  • FIG.8B illustrates an exploded view of the component of the tool stem assembly shown in FIG.8A.
  • FIG.9 illustrates an exploded view of the component of the tool stem assembly shown in FIGS.8A-8B.
  • FIG.10 illustrates a perspective view and a cross-section view of a tool stem assembly of the robotic arm assembly.
  • FIG.11 illustrates an exploded view of a tool stem assembly with a surgical tool.
  • FIG.12 illustrates an enlarged exploded view of a tool stem assembly.
  • FIG.13 illustrates a series of views of a sensor assembly of the robotic arm assembly.
  • FIG.14 illustrates a diagram of forces acting on a surgical tool in the robotic arm assembly.
  • FIG.15 illustrates a diagram and process for determining tool motion characteristics.
  • FIG.16 illustrates a diagram and process for determining tool motion characteristics.
  • FIG.17 illustrates a scene from a virtual training environment using the haptic training system of FIG.1A.
  • FIG.18 illustrates a graph neural network of a machine learning algorithm for skill assessment.
  • FIG.19 illustrates a time-series model of a machine learning algorithm for skill assessment.
  • FIG.20 illustrates integration of a proposed force sensor with a laparoscopic tool and haptic device.
  • FIG.21 illustrates (a) a picture of the assembled force/torque sensor and (b) a picture of the sensor with the top cover and top silicone layer removed.
  • Haptic devices have been shown to be valuable in supplementing surgical training, especially when providing haptic feedback based on user performance metrics such as the wrench applied by the user on the tool.
  • current 6-axis force/torque (F/T) sensors are prohibitively expensive, especially when needed in low-resource areas.
  • the present disclosure provides a system for surgical training for users to obtain skills associated with soft-tissue surgeries.
  • the training system described herein can estimate the forces applied by a user on either a virtual environment or a physical phantom model to provide a “real-life” simulated learning and training experience.
  • the system of the present disclosure also can accept a variety of surgical instruments for the user to practice with.
  • the system serves as a tool tracking device thereby recording motions and estimating forces of interaction with the environment.
  • the system using information from the haptic device(s) provides online skill assessment between repetitions of surgical subtasks by the same user. Using this information, a haptic training intervention can be configured to assist the user in completing a surgical task and the level of assistance can be tuned based on preference.
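The patent leaves the assistance-tuning rule open (the level of assistance "can be tuned based on preference"). One simple illustrative policy is a guidance gain that fades linearly as the trainee's skill score approaches a target; the linear fade and parameter names are assumptions for illustration only.

```python
def assistance_gain(score, score_target=1.0, alpha_max=1.0):
    """Map a skill score to a haptic guidance gain in [0, alpha_max]:
    full assistance for a novice (score 0), none at or beyond the target.
    The linear fade and the default parameters are illustrative choices."""
    fade = 1.0 - min(score, score_target) / score_target
    return alpha_max * fade
```

The resulting gain would scale a guidance force or virtual fixture added on top of the rendered environment forces between task repetitions.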
  • a machine-learning model of tissue deformation is provided based on finite element modeling of target training anatomy. Additionally, the machine-learning model informs skill assessment and the haptic device of the expected interaction force with the virtual anatomy model.
  • the present disclosure also provides a low-cost, six-axis F/T sensor used in the training system.
  • the present disclosure also describes a training system and device that uses Hall-effect sensors to measure the change in the position of magnets embedded in a silicone layer that results from an applied wrench to the device.
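One plausible route from magnet-displacement readings to a six-axis wrench estimate, under the small-deflection (linear) regime the silicone layer suggests, is a least-squares calibration against a reference force/torque sensor. The channel count used in the test (24, e.g. eight 3-axis Hall-effect sensors) is an assumption, not a figure from the patent.

```python
import numpy as np

def fit_calibration(B, W):
    """Fit a linear map C (6 x n_channels) so that wrench ~= C @ b, given N
    paired samples: B (N x n_channels) stacked Hall-effect readings and
    W (N x 6) reference wrenches from a ground-truth force/torque sensor."""
    C_T, *_ = np.linalg.lstsq(B, W, rcond=None)
    return C_T.T

def estimate_wrench(C, b):
    """Apply the fitted linear map to one stacked reading vector."""
    return C @ b
```

In practice the calibration data would come from loading the sensor with known wrenches; nonlinear or temperature effects would need a richer model than this linear fit.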
  • Preliminary experimental validation demonstrates that these sensors can achieve an accuracy of 0.45 N and 0.014 N·m, with a theoretical XY range of ±50 N, a Z range of ±20 N, and a torque range of ±0.2 N·m.
  • the experimental data discussed below indicates that the six-axis F/T sensor disclosed herein can accurately measure user force and provide useful feedback during surgical training on a haptic device.
  • Benefits of implementing a training system with a low-cost F/T sensor as described herein include improving the feasibility of exploiting force data for feedback in surgical assessment and increasing access to such haptic device trainers.
  • the cost of fabrication of these F/T sensors may be reduced by 3-D printing the flexures into the sensor, but the cost and complexity of signal amplification and conditioning remain at odds with the simplicity of the F/T sensor for training in soft-tissue surgeries as described herein.
  • Additional benefits include features that allow for device interchangeability – the system allows use of domain-relevant tools based on the surgical simulation that a user is practicing.
  • FIGS.1A-1F and 2A-2B illustrate a haptic training system 10 for developing skills and training for soft-tissue surgeries.
  • the system 10 includes one or more robotic arm assemblies 12 and a computing device 14.
  • the robotic arm assembly 12 includes a plurality of subsystems or assemblies that work together to provide a training experience to a user learning how to perform soft-tissue surgeries.
  • the computing device 14 interfaces and communicates with the robotic arm assembly 12 to receive data from the plurality of subsystems or assemblies of the robotic arm assembly 12.
  • the training system 10 can include a second robotic arm assembly 12 for two-handed training options. Only one robotic arm assembly 12 is discussed herein; however, it is understood that the second robotic arm assembly would be the same in structure and function as described herein.
  • FIGS.2A-2B illustrate a prototype of the robotic arm assembly 12.
  • the computing device 14 is illustrated as having a number of components, but any one or more of these components may be omitted or duplicated, as suitable for the application and setting.
  • some or all of the components included in the computing device 14 may be attached to one or more motherboards and enclosed in a housing.
  • some of those components may be fabricated onto a single system-on-a-chip (SoC) (e.g., the SoC may include one or more electronic processing devices 16 and one or more storage devices 18).
  • the computing device 14 may not include one or more of the components illustrated in FIG.1A, but may include interface circuitry for coupling to the one or more components using any suitable interface (e.g., a Universal Serial Bus (USB) interface, a High-Definition Multimedia Interface (HDMI) interface, a Controller Area Network (CAN) interface, a Serial Peripheral Interface (SPI) interface, an Ethernet interface, a wireless interface, or any other appropriate interface).
  • the computing device 14 may not include a display device 20, but may include display device interface circuitry (e.g., a connector and driver circuitry) to which an external display device 20 may be coupled.
  • the computing device 14 includes a processing device 16 (e.g., one or more processing devices).
  • the terms “electronic processor device” and “processing device” interchangeably refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory.
  • the processing device 16 may include one or more digital signal processors (DSPs), application-specific integrated circuits (ASICs), central processing units (CPUs), graphics processing units (GPUs), server processors, field programmable gate arrays (FPGA), or any other suitable processing devices.
  • the computing device 14 also includes a storage device 18 (e.g., one or more storage devices).
  • the storage device 18 may include one or more memory devices, such as random-access memory (RAM) devices (e.g., static RAM (SRAM) devices, magnetic RAM (MRAM) devices, dynamic RAM (DRAM) devices, resistive RAM (RRAM) devices, or conductive-bridging RAM (CBRAM) devices), hard drive-based memory devices, solid-state memory devices, networked drives, cloud drives, or any combination of memory devices.
  • the storage device 18 may include memory that shares a die with the processing device 16.
  • the memory may be used as cache memory and include embedded dynamic random-access memory (eDRAM) or spin transfer torque magnetic random-access memory (STT-MRAM), for example.
  • the storage device 18 may include non-transitory computer readable media having instructions thereon that, when executed by one or more processing devices (e.g., the processing device 16), cause the computing device 14 to perform any appropriate ones of the methods disclosed herein below or portions of such methods.
  • the computing device 14 further includes an interface device 22 (e.g., one or more interface devices 22).
  • the interface device 22 may include one or more communication chips, connectors, and/or other hardware and software to govern communications between the computing device 14 and other computing devices.
  • the interface device 22 may include circuitry for managing wireless communications for the transfer of data to and from the computing device 14.
  • wireless and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data via modulated electromagnetic radiation through a nonsolid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not.
  • Circuitry included in the interface device 22 for managing wireless communications may implement any of a number of wireless standards or protocols, including but not limited to Institute of Electrical and Electronics Engineers (IEEE) standards including Wi-Fi (IEEE 802.11 family), IEEE 802.16 standards, the Long-Term Evolution (LTE) project along with any amendments, updates, and/or revisions (e.g., the advanced LTE project, the ultramobile broadband (UMB) project (also referred to as “3GPP2”), etc.).
  • circuitry included in the interface device 22 for managing wireless communications may operate in accordance with a Global System for Mobile Communication (GSM), General Packet Radio Service (GPRS), Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Evolved HSPA (E-HSPA), or LTE network.
  • circuitry included in the interface device 22 for managing wireless communications may operate in accordance with Enhanced Data for GSM Evolution (EDGE), GSM EDGE Radio Access Network (GERAN), Universal Terrestrial Radio Access Network (UTRAN), or Evolved UTRAN (E-UTRAN).
  • circuitry included in the interface device 22 for managing wireless communications may operate in accordance with Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Digital Enhanced Cordless Telecommunications (DECT), Evolution-Data Optimized (EV-DO), and derivatives thereof, as well as any other wireless protocols that are designated as 3G, 4G, 5G, and beyond.
  • the interface device 22 may include one or more antennas (e.g., one or more antenna arrays) configured to receive and/or transmit wireless signals.
  • the interface device 22 may include circuitry for managing wired communications, such as electrical, optical, or any other suitable communication protocols.
  • the interface device 22 may include circuitry to support communications in accordance with Ethernet technologies.
  • the interface device 22 may support both wireless and wired communication, and/or may support multiple wired communication protocols and/or multiple wireless communication protocols.
  • a first set of circuitry of the interface device 22 may be dedicated to shorter-range wireless communications such as Wi-Fi or Bluetooth
  • a second set of circuitry of the interface device 22 may be dedicated to longer-range wireless communications such as global positioning system (GPS), EDGE, GPRS, CDMA, WiMAX, LTE, EV-DO, or others.
  • the computing device 14 also includes battery/power circuitry 24.
  • the battery/power circuitry 24 may include one or more energy storage devices (e.g., batteries or capacitors) and/or circuitry for coupling components of the computing device 14 to an energy source separate from the computing device 14 (e.g., to AC line power).
  • the computing device 14 also includes a display device 20 (e.g., one or multiple individual display devices).
  • the display device 20 may include any visual indicators, such as a heads-up display, a computer monitor, a projector, a touchscreen display, a liquid crystal display (LCD), a light-emitting diode display, or a flat panel display.
  • the computing device 14 also includes additional input/output (I/O) devices 26.
  • the I/O devices 26 may include one or more data/signal transfer interfaces, audio I/O devices (e.g., microphones or microphone arrays, speakers, headsets, earbuds, alarms, etc.), audio codecs, video codecs, printers, sensors (e.g., thermocouples or other temperature sensors, humidity sensors, pressure sensors, vibration sensors, etc.), image capture devices (e.g., one or more cameras), human interface devices (e.g., keyboards, cursor control devices, such as a mouse, a stylus, a trackball, or a touchpad), etc.
  • various components of the interface devices 22 and/or I/O devices 26 can be configured to send and receive suitable control messages, suitable control/telemetry signals, and streams of data.
  • the interface devices 22 and/or I/O devices 26 include one or more analog-to-digital converters (ADCs) for transforming received analog signals into a digital form suitable for operations performed by the processing device 16 and/or the storage device 18.
  • the interface devices 22 and/or I/O devices 26 include one or more digital-to-analog converters (DACs) for transforming digital signals provided by the processing device 16 and/or the storage device 18 into an analog form suitable for being communicated to the corresponding components of the system 10.
  • the computing device 14 also includes a plurality of motor controllers 28 configured to control operation of a respective motor in the system 10.
  • the motor controllers 28 receive and process instructions from the processing device 16 to control the operation of the motors.
  • the robotic arm assembly 12 includes a frame assembly 100, an upper manipulation linkage (five-bar) assembly 200, a lower (vertical) manipulation linkage assembly 300, a pitch actuation assembly 400 for the upper manipulation linkage five-bar assembly 200, a yaw actuation assembly 500 for the lower manipulation linkage assembly 300, a tool stem assembly 600, and a sensor assembly 700.
  • the frame assembly 100 includes a first upright 104, a second upright 108, and a first cross bar 112 coupled between the first upright 104 and the second upright 108.
  • the first cross bar 112 is oriented perpendicular relative to the first upright 104 and the second upright 108 and is coupled to an upper end of the first upright 104 and an upper end of the second upright 108.
  • the frame assembly 100 also includes a first strut 116 extending from the first upright 104, a second strut 120 extending from the second upright 108, a second cross bar 124 coupled between the first strut 116 and the second strut 120, and a third cross bar 126 coupled between the first strut 116 and the second strut 120.
  • the second cross bar 124 is oriented perpendicular relative to the first strut 116 and the second strut 120.
  • the second cross bar 124 is also oriented parallel relative to the first cross bar 112.
  • the frame assembly 100 also includes a first leg 128 extending from the first strut 116 and a second leg 132 extending from the second strut 120.
  • the first leg 128 and the second leg 132 are oriented perpendicular relative to the first strut 116, the second strut 120, and the second cross bar 124. In some constructions, the first leg 128 and the second leg 132 extend from the second cross bar 124.
  • the frame assembly 100 further includes a first bracket 136 extending from the first upright 104 and a second bracket 140 extending from the second upright 108.
  • the first bracket 136 and the second bracket 140 include apertures which are horizontally aligned and positioned across from one another as illustrated in FIG.3.
  • the frame assembly 100 further includes a third bracket 144 extending from the first cross bar 112 and a fourth bracket 148 extending from the third cross bar 126.
  • the third bracket 144 and the fourth bracket 148 include apertures which are vertically aligned and positioned across from one another as illustrated in FIG.3.
  • the upper manipulation linkage (five-bar) assembly 200 (hereinafter referred to as “the upper five-bar assembly 200”) includes a housing 204 including a first peg 208 and a second peg 212.
  • the first peg 208 and the second peg 212 are configured to be received within the corresponding apertures in the first bracket 136 and the second bracket 140 of the frame assembly 100 to thereby couple the upper five-bar assembly 200 to the frame assembly 100.
  • the upper five-bar assembly 200 also includes a first revolute actuator 212 coupled to the housing 204 and a second revolute actuator 216 coupled to the housing 204.
  • the upper five-bar assembly 200 also includes a first input link 220 coupled to the first revolute actuator 212 and a second input link 224 coupled to the second revolute actuator 216.
  • the first revolute actuator 212 controls movement of the first input link 220
  • the second revolute actuator 216 controls movement of the second input link 224.
  • the upper five-bar assembly 200 further includes a coupler link 228 configured to provide kinematic coupling between the first input link 220 and the second input link 224.
  • the coupler link 228 terminates with a hook link 232.
  • the hook link 232 is coupled to the tool stem assembly 600 via a passive revolute joint 236.
  • the lower (vertical) manipulation linkage assembly 300 (hereinafter referred to as “the lower five-bar assembly 300”) includes a housing 304 including a first peg 308 and a second peg 312. The first peg 308 and the second peg 312 are configured to be received within the corresponding apertures in the third bracket 144 and the fourth bracket 148 of the frame assembly 100 to thereby couple the lower five-bar assembly 300 to the frame assembly 100.
  • the lower five-bar assembly 300 also includes a third actuator 312 coupled to the housing 304 and a fourth actuator 316 coupled to the housing 304.
  • the lower five-bar assembly 300 also includes a first input link 320 coupled to the third actuator 312 and a second input link 324 coupled to the fourth actuator 316.
  • the third actuator 312 controls movement of the first input link 320
  • the fourth actuator 316 controls movement of the second input link 324.
  • the lower five-bar assembly 300 further includes a coupler link 328 configured to provide kinematic coupling between the first input link 320 and the second input link 324.
  • the coupler link 328 terminates with a hook link 332.
  • the hook link 332 is coupled to the tool stem assembly 600 via a passive revolute joint 336.
  • the pitch actuation assembly 400 for the upper five- bar assembly 200 includes a fifth actuator 404 coupled to the housing 204.
  • the fifth actuator 404 is configured to rotate the housing 204 about its horizontal axis.
  • the pitch actuation assembly 400 includes a plurality of links to effectuate movement of the housing 204.
  • the yaw actuation assembly 500 for the lower five- bar assembly 300 includes a sixth actuator 504 coupled to the housing 304.
  • the sixth actuator 504 is configured to move the housing 304 about its vertical axis.
  • the yaw actuation assembly 500 includes a four-bar transmission to effectuate movement of the housing 304.
  • the tool stem assembly 600 includes an interchange device 612 to quickly exchange surgical tools.
  • the interchange device 612 is normally closed and locks the surgical tool via spring-loaded lock pins that engage with matching slots in a tool adaptor 616 (shown in FIG.7B). By pressing radially on the levers shown in FIG.9, the interchange device 612 can be released.
  • FIG.9 (at A) shows the interchange device 612 in a released state.
  • FIG.9 (at B) shows the interchange device 612 in a locked state.
  • FIG.9 (at C) illustrates a cross-section of the interchange device 612 in the locked state.
  • the tool stem assembly 600 is shown along with a cross- sectional view.
  • the tool stem assembly 600 includes a stem body 620, a first sensor housing 624 coupled to the hook link 232, and a second sensor housing 628 coupled to the hook link 332.
  • the stem body 620 interfaces with the first sensor housing 624 and the second sensor housing 628 via an elastomeric medium (shown as void 632) and sensor assemblies 700 (discussed below).
  • the second sensor housing 628 is free to rotate relative to the stem body 620.
  • the interchange device 612 is passively rotatable relative to the stem body 620 and this tool rotation is measured by a Hall-effect sensor and a magnet 640 (discussed below in more detail).
  • FIGS.11 and 12 show an exploded view of the tool stem assembly 600.
  • FIGS.10-13 illustrate the sensor assemblies 700.
  • the sensor assemblies 700 include a first sensor assembly 700A supported in the first sensor housing 624 and a second sensor assembly 700B (not separately shown in the figures, but the structure and function is the same as the first sensor assembly 700A described herein) supported in the second sensor housing 628.
  • the sensor assemblies 700A and 700B each include an elastomeric matrix 704 positioned between the stem body 620 and the sensor housings 624, 628.
  • the elastomeric matrix 704 has a plurality of magnets 708 embedded into each side below the outer surface.
  • the elastomeric matrix 704 includes eight miniature magnets 708 as illustrated (one not visible in FIG.13). In other embodiments, the elastomeric matrix 704 may include more or fewer than eight magnets 708.
  • the sensor assemblies 700A and 700B each include a plurality of Hall-effect sensors 712 positioned to correspond with a respective one of the plurality of magnets 708. In some embodiments, the sensor assemblies 700A and 700B each include eight miniature Hall-effect sensors 712. In other embodiments, the sensor assemblies 700A and 700B may include more or fewer than eight Hall-effect sensors 712.
  • the Hall-effect sensors 712 are configured to measure the magnetic flux density which depends on the position of the magnets 708 relative to the Hall-effect sensors 712. As an external force is applied to the stem body 620, small deflections in the elastomeric matrix 704 cause the magnets 708 to move thereby changing the magnetic flux density.
  • the change in magnetic flux density readings from the plurality of Hall-effect sensors 712 provides some sensory redundancy and allows for calibrating a mapping from the magnetic flux density readings to the full six components of force and moment acting on the Hall-effect sensors 712.
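The redundancy idea above can be sketched as a least-squares recovery of a six-component wrench from the redundant flux readings. The sensitivity matrix S, the stiffness matrix K, and all values below are hypothetical placeholders, not the device's actual calibration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensitivity matrix: maps the 6-DOF deflection twist of the stem
# to 24 flux-density readings (8 Hall-effect sensors x 3 axes each).
# In the real device this map is identified by calibration; here it is random.
S = rng.normal(size=(24, 6))

# Hypothetical diagonal stiffness matrix mapping deflection twist to a wrench
# [Fx, Fy, Fz, Mx, My, Mz]; values are illustrative only.
K = np.diag([5.0, 5.0, 8.0, 0.2, 0.2, 0.3])

def estimate_wrench(delta_b):
    """Recover the applied wrench from the change in flux-density readings.

    With 24 readings and only 6 unknowns, the least-squares solution uses the
    sensory redundancy to average out per-sensor noise.
    """
    twist, *_ = np.linalg.lstsq(S, delta_b, rcond=None)
    return K @ twist

true_twist = np.array([0.1, -0.05, 0.02, 0.01, 0.0, -0.02])
delta_b = S @ true_twist + rng.normal(scale=1e-3, size=24)
print(np.round(estimate_wrench(delta_b), 3))
```

The overdetermined system (24 readings for 6 unknowns) is what makes the extra magnets worthwhile: noise on any single Hall-effect channel is averaged down by the least-squares fit.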
  • the sensor assembly 700 is also described in more detail below in the EXAMPLE section.
  • the system arrangement described above uses back-drivable actuators with high transmission efficiency; it is therefore possible to use motor current sensing to determine the interaction forces between the upper five-bar assembly 200, the lower five-bar assembly 300, and the stem body 620.
  • since the system 10 includes six actuators for a tool that is passively capable of rolling about its longitudinal axis, it is possible to use actuation force redistribution (actuation redundancy) to control the compressive internal force along the segment between the hook link 232 (on the upper five-bar assembly 200) and the hook link 332 (on the lower five-bar assembly 300).
  • a kinematically-consistent joint level motion control is implemented to minimize this internal load, which may be assumed to be zero.
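A minimal sketch of this force redistribution, using a simplified point model (the vectors and the `internal` parameter are illustrative; the actual device resolves the redundancy through joint-level control of the full linkage):

```python
import numpy as np

# Net force balance at the stem: f1 + f2 = f_d, where f1 and f2 act at the two
# hook links. The component of (f1 - f2)/2 along the segment axis u is the
# internal (squeezing) load, which the controller drives to zero.
def distribute(f_d, u, internal=0.0):
    u = u / np.linalg.norm(u)
    f1 = f_d / 2 + internal * u
    f2 = f_d / 2 - internal * u
    return f1, f2

f_d = np.array([1.0, 0.0, -2.0])   # desired net force on the stem (illustrative)
u = np.array([0.0, 0.0, 1.0])      # unit vector along the hook-to-hook segment

f1, f2 = distribute(f_d, u, internal=0.0)   # minimum-internal-load choice
assert np.allclose(f1 + f2, f_d)
internal_load = 0.5 * (f1 - f2) @ u
print(internal_load)  # 0.0
```

Any value of `internal` satisfies the net-force requirement; choosing zero is the redistribution that minimizes the compressive load between the hook links.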
  • the haptic simulation has the goal of specifying a force of interaction at a virtual abdominal incision point; this force is fd. It is also assumed that the interaction forces acting on the stem body 620 from the upper five-bar assembly 200 and the lower five-bar assembly 300 are f1 and f2, and that the force applied by the user’s hand is fh (which is unknown). In addition, a second unknown is the force at the end-effector fe.
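Under a quasi-static assumption, these forces balance on the stem body, so an estimate of the end-effector force follows once the other three are known. The numeric values below are made-up illustrative inputs, not measurements from the device:

```python
import numpy as np

# Quasi-static force balance on the stem body: the hand force fh, the two
# linkage interaction forces f1 and f2, and the tool-tip force fe sum to zero
# (gravity neglected in this sketch). Given estimates of the first three,
# fe follows directly.
f1 = np.array([0.5, -0.2, 1.0])   # from upper five-bar (e.g., motor currents)
f2 = np.array([-0.3, 0.1, 0.8])   # from lower five-bar
fh = np.array([0.0, 0.5, -2.5])   # from the stem sensor assemblies

fe = -(f1 + f2 + fh)
print(fe)  # approximately [-0.2, -0.4, 0.7]
```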
  • the information obtained through this process includes the dual angle (the distance along the common normal of two consecutive instantaneous screws and the angle about the common normal of the two consecutive screws), the curvature κ(s), and the torsion τ(s).
  • the same process shown in FIGS.15 and 16 may also be reproduced for the instantaneous wrenches that users apply on the environment or tool handle.
  • the screws represent the moment and force 6-dimensional vectors and similar measures of the striction curve of these wrenches may be followed and used as an additional input to the machine learning model for skill assessment.
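As an illustration of curvature and torsion measures extracted from motion data, the following computes discrete Frenet curvature and torsion for a sampled path. This is a simplified stand-in for the screw-axis striction-curve measures described above; the function name and sampling are hypothetical:

```python
import numpy as np

def curvature_torsion(p, dt=1.0):
    """Discrete curvature kappa(s) and torsion tau(s) of a sampled path p (N x 3).

    Uses central differences via np.gradient; interior samples are accurate,
    endpoints use one-sided differences.
    """
    d1 = np.gradient(p, dt, axis=0)    # velocity
    d2 = np.gradient(d1, dt, axis=0)   # acceleration
    d3 = np.gradient(d2, dt, axis=0)   # jerk
    cross = np.cross(d1, d2)
    cn = np.linalg.norm(cross, axis=1)
    speed = np.linalg.norm(d1, axis=1)
    kappa = cn / np.maximum(speed, 1e-12) ** 3
    tau = np.einsum('ij,ij->i', cross, d3) / np.maximum(cn, 1e-12) ** 2
    return kappa, tau

# Sanity check: a circular arc of radius 2 has curvature 1/2 and zero torsion.
t = np.linspace(0, np.pi, 200)
circle = np.stack([2 * np.cos(t), 2 * np.sin(t), np.zeros_like(t)], axis=1)
kappa, tau = curvature_torsion(circle, dt=t[1] - t[0])
print(round(float(kappa[100]), 3))  # approximately 0.5
```

In practice, per-sample curvature and torsion series like these (or their screw-theoretic counterparts) become feature streams for the skill-assessment model.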
  • gaze tracking of the user’s eyes can be incorporated in some embodiments.
  • the virtual reality system provides a training environment with realistic visual and haptic feedback for surgical training without the need for physical training phantoms or tissue.
  • the eye gaze information is collected from a dedicated sensor in special glasses, such as, for example, Pupil Labs Neon XR.
  • FIG.17 shows a scene from the virtual training environment with a virtual organ model.
  • the virtual reality environment integrates models of the physical robotic arm assembly 12 and virtual anatomy to allow users to practice manipulating organs with surgical instruments and perform other training tasks. Users can experience realistic haptic feedback through the robotic arm assembly 12. Virtual cameras in the scene provide visual or video feedback as they would receive using an endoscope.
  • a custom communication protocol allows the virtual environment to interface with machine-learning-driven backend tissue simulation to ensure real-time feedback.
  • the training environment also supports eye gaze tracking with respect to the virtual or physical scene. This allows eye gaze analysis to assess user skill level. Additionally, an augmented reality interface can broadcast the gaze of an expert to guide trainees on what to pay attention to for more efficient skill acquisition.
  • These kinematic performance measures are then used as an input to the machine learning algorithm of FIGS.18 and 19.
  • Machine learning algorithms for skill assessment have generally focused only on kinematics, vision, and system events.
  • the haptic training system 10 and integrated environment allow analysis of other measures including force and economy of motion measures.
  • a graph neural network is employed, as shown in FIG.18, to perform multi-modal fusion between information streams coming from vision feedback, kinematics measures, force, economy of motion measures, and gaze.
  • Each information stream is processed initially through a time-series model, as shown in FIG.19.
  • the network architecture and feature preprocessing are based on Yonghao Long et al., “Relational graph learning on visual and kinematics embeddings for accurate gesture recognition in robotic surgery,” 2021 IEEE International Conference on Robotics and Automation (ICRA), pp.13346-13353, IEEE, 2021, which is incorporated herein by reference.
  • the neural network uses a message-passing scheme to refine the encoding by sharing information between the modalities.
  • the refined encoding goes through further processing depending on the downstream task, such as skill assessment, gesture segmentation, and gesture classification.
  • the gesture segmentation and recognition together with skill assessment can be used to provide targeted feedback to users depending on the skill level they are exhibiting while performing each gesture.
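The message-passing refinement over modality nodes can be sketched in miniature as follows. This is a toy fully connected graph with random weights, not the trained network of FIG.18; all names and dimensions are illustrative, and real per-modality encoders would be time-series models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-modal fusion: each modality is first encoded to a fixed-size
# vector, then a fully connected graph exchanges messages so each node's
# encoding is refined by the others.
modalities = ['vision', 'kinematics', 'force', 'economy', 'gaze']
d = 16
h = {m: rng.normal(size=d) for m in modalities}   # initial per-modality encodings
W_msg = rng.normal(size=(d, d)) * 0.1             # shared message weight
W_upd = rng.normal(size=(d, d)) * 0.1             # shared update weight

def message_passing_round(h):
    new_h = {}
    for m in modalities:
        # Aggregate messages from all other modality nodes.
        msg = sum(W_msg @ h[n] for n in modalities if n != m)
        new_h[m] = np.tanh(W_upd @ h[m] + msg / (len(modalities) - 1))
    return new_h

for _ in range(2):                                # two refinement rounds
    h = message_passing_round(h)

fused = np.concatenate([h[m] for m in modalities])  # features for downstream tasks
print(fused.shape)  # (80,)
```

The refined, concatenated encoding would then feed task-specific heads for skill assessment, gesture segmentation, or gesture classification.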
  • EXAMPLE – SOFT, SIX-AXIS F/T SENSOR FOR TRAINING IN SURGICAL APPLICATIONS [00123]
  • SENSOR DESIGN [00124] The mechanism of sensing force is based on Hooke’s Law, which gives a direct relationship between force applied on a spring and its deformation. Silicone has highly elastic properties and typically follows a non-linear deformation model.
  • while the sensor can also be used as a general 6-axis F/T sensor, it was designed with the target application of measuring forces and torques applied at the tip of surgical tools in a haptic training device.
  • the sensor was designed such that it can be easily mounted to laparoscopic instruments by passing the shaft of the tools through the center of the force sensor as shown in FIG.20.
  • the sensor itself has three main components: a center piece with attached Hall-effect sensors, a layer of silicone, and an outer shell to house the magnets (FIG.21).
  • Both the center piece and the outer shell are 3D printed out of Phrozen Rock-Black Stiff resin (Phrozen, Sonic Mega 8K S).
  • the center piece serves to rigidly attach the magnetometers to the tool shaft whereas the magnets are fixed to the haptic device.
  • the layer of silicone is molded with Eco-Flex 00-30 (Smooth-On) which has a shore hardness of 00-30.
  • the layer thickness is 6mm between the center piece and the outer shell. This allows the sensors to move with respect to the magnets proportional to the amount of force applied by the user on the tool shaft. [00127] To capture the forces/torques in six degrees of freedom, the sensor has eight magnetometers and eight corresponding magnets arrayed radially around the center of the tool.
  • the Hall-effect sensors were oriented at an angle of 25° from the vertical axis of the tool shaft.
  • the magnets were offset by 6mm from the surface of the sensors.
  • the position of each magnet with respect to its magnetometer was combined to give the deformation twist of the center piece as a whole.
  • Eight sensors were chosen to provide redundancy that was leveraged to minimize error in the computation of the deformation twist. This is explained in more detail below. As a basis for that computation, it was necessary to define a coordinate system for each magnetometer and magnet with respect to the base frame of the force sensor in the nominal case (no force applied) (FIG.22).
  • the base frame was placed at the center of the device defined with the z-axis along the tool shaft in the direction of the tool tip.
  • the Hall-effect sensor frames were assigned by the chip itself and have the z-axis pointing outward from the face of the chip, the y-axis pointing along the length of the tool shaft, and the x-axis along the horizontal surface of the inner ring.
  • the frames of the magnets had the exact same orientation as their corresponding magnetometers, just translated 6mm in the positive Z direction. [00128] Measurements from the Hall-effect sensors were sent via I2C communication protocol to a Teensy 4.0 microcontroller.
  • there were four versions of the MLX90393 magnetometer chip, each of which can accommodate four user-specified addresses, so in total 16 MLX90393 magnetometers can fit on a single I2C bus (FIG.23). Since the current sensor design used eight Hall-effect sensors, two of the force sensing devices were connected on a single I2C bus, which will facilitate future integration into a haptic device. Each magnetometer was soldered to a custom printed circuit board (PCB) which interfaces between each chip and assigns the I2C address for each magnetometer. [00129] The MLX90393 magnetometer offers several parameters that can be adjusted by the user to determine the quantities, resolution, and frequency of magnetic field measurements.
  • the Hall-effect sensor is capable of detecting changes in magnetic field in the X, Y, and Z direction as well as the internal temperature of the chip. For this application, measurements in all three dimensions are considered and temperature is ignored. [00130] Information from the magnet supplier indicated that the magnetic field strength at the surface of the magnet is 6053 gauss (605.3 mT). Therefore, the sensor must be capable of capturing the range 0-605300 µT.
  • the magnetometer can report 16 bits per measurement, which means that the ideal resolution should be no less than 9.236 µT/LSB in the Z-direction. The best option for resolution offered by the MLX90393 is 9.680 µT/LSB along the Z-axis.
  • the corresponding resolutions for the field strength in the X and Y directions are both 6.009 µT/LSB.
  • the maximum frequency of data acquisition was limited by the time it took to perform a measurement. This was determined by the amount of time needed to prepare the magnetometer for a measurement and how many dimensions must be measured. For this application, the preparation time was 839 µs and the measurement time for a single axis was 835 µs. Therefore, the minimum period for measuring from three axes was 3.34 ms. A 10 ms period was chosen to give a sampling frequency of 100 Hz, which is far below the threshold for human tactile perception of latency.
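The resolution and timing figures above follow from simple arithmetic, sketched here:

```python
# Arithmetic behind the resolution and sampling choices quoted above.

field_max_uT = 605300          # 605.3 mT at the magnet surface, in microtesla
bits = 16                      # bits per magnetometer measurement
required_res = field_max_uT / 2**bits
print(round(required_res, 3))  # 9.236 uT/LSB minimum required resolution

prep_us = 839                  # time to prepare the magnetometer
axis_us = 835                  # conversion time per measured axis
min_period_us = prep_us + 3 * axis_us
print(min_period_us / 1000)    # 3.344 ms minimum period for a three-axis read
```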
  • ^0p_mi = ^0p_c + ^0R_c ^cp_si + ^0R_c ^cR_si ^si p_mi (9), where ^0p_c and ^0R_c are the position and rotation components of ^0T_c, respectively; ^0p_mi ∈ IR3 is the position of the ith magnet in the world frame, a constant that is known from the sensor geometry; ^cp_si ∈ IR3 is the position of the ith Hall-effect sensor frame {Si} relative to {C}, which is also a constant value; and ^si p_mi ∈ IR3 is the position of the ith magnet as sensed by the ith Hall-effect sensor.
  • EXPERIMENTS [00151] To verify the sensor design and force sensing model, a number of experiments were performed, which are described below. First, the relationship between the individual displacement of the magnets and the magnetic field strength was determined experimentally. Then the sensor was calibrated with a robust combination of forces and torques to populate the stiffness matrix K of eq. (22). Finally, the force sensing model was verified by applying a set of known forces/torques to the sensor and comparing the sensed force/torque to the ground truth. [00152] As noted above, we assume a linear relationship between the measured magnetic flux density and the position of the magnet with respect to the sensor.
  • the MLX90393 magnetometer is able to detect magnetic flux density along three axes (X, Y, and Z).
  • the change in magnetic flux for each dimension is sufficiently sensitive to a positional change in that dimension (i.e., a change in the x-direction corresponds to a change in the x-component of flux density).
  • the following experiment was performed to determine the relationship between a change in position and the corresponding change in magnetic flux.
  • the setup consisted of a motorized linear XYZ-stage robot, a 3D printed mount for the Hall-effect sensor, a magnet embedded in a silicone housing, and a Teensy 4.0 to read the data serially.
  • the data was recorded for post-processing using the serial port logging feature of the open source software PuTTY.
  • the magnet is mounted to the end effector of the Cartesian robot which is zeroed by aligning the magnet on the center of the Hall-effect chip as shown in FIG.24.
  • the process includes moving the robot end effector along each axis and collecting magnetic field readings at a total of thirty discrete points. To determine the mapping in the Z-axis, the robot moved the magnet away in 0.2 mm increments within the range [1, 3] mm of distance between the magnet and the chip. Similarly, in the X and Y dimensions, the data is collected in the range [-1, 1] mm with 0.2 mm increments in their respective axes.
  • the outer shell of the force sensor is fixed to the end effector of the Meca500 (Mecademic) robot.
  • a rod which has an integrated attachment for calibration weights is rigidly connected to the inner ring of the force sensor.
  • Masses of 50g and 200g were used to calibrate the sensor.
  • the robot was commanded to put the sensor in 193 different poses to sample a robust combination of forces and torques across all six axes (FIG. 26).
  • the mean of one hundred measurements from each Hall-effect sensor was taken at each pose to reduce the effect of noise in the measurements.
  • the deflection twist of the sensor frame is computed for each pose. These twists make up the Δ matrix, which is substituted into eq. (22).
  • the calibration yielded force stiffness values between 2.88×10⁻³ N/µT (minimum) and 6.07×10⁻³ N/µT (maximum), and torque stiffness values between 1.48×10⁻³ Nm/µT (minimum) and 2.26×10⁻³ Nm/µT (maximum).
  • taking the Euclidean norm of the reported RMS errors for force and torque can summarize the overall error as 0.45 N and 0.014 Nm, respectively.
  • the estimated range of our sensor is ±50 N in the X and Y directions, ±20 N in the Z direction, and ±0.2 Nm for all torques.
  • the estimated range of the force sensor is computed by multiplying the stiffness for a single dimension (elements on the diagonal of K) by the maximum possible deformation in that dimension. For the X and Y directions, the maximum possible deformation is 6mm and in the Z direction it is 3mm.
  • Non-transitory Computer Readable Storage Medium: the platforms, systems, media, and methods disclosed herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked computer.
  • a computer readable storage medium is a tangible component of a computer.
  • a computer readable storage medium is optionally removable from a computer.
  • a computer readable storage medium includes, by way of non-limiting examples, compact disc read-only memories (CD-ROMs), digital versatile discs (DVDs), flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, cloud computing systems and services, and the like.
  • the platforms, systems, media, and methods disclosed herein include at least one computer program, or use of the same.
  • a computer program includes a sequence of instructions, executable in an electronic processor (e.g., the electronic processor 1012) of the computer, written to perform a specified task.
  • Computer readable instructions may be implemented as program modules, such as functions, objects, application programming interface (API), data structures, and the like, that perform particular tasks or implement particular abstract data types.
  • a computer program comprises one sequence of instructions.
  • a computer program comprises a plurality of sequences of instructions.
  • a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations.
  • a computer program includes one or more software modules.
  • a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or add-ons, or combinations thereof.
  • machine learning algorithms are employed to build a model to classify particles based on a dataset(s).
  • machine learning algorithms may include a support vector machine (SVM), a naïve Bayes classification, a random forest, a neural network, deep learning, or another supervised or unsupervised learning algorithm for classification and regression.
  • the machine learning algorithms may be trained using one or more training datasets. For example, previously received location or user data may be employed to train various algorithms. Moreover, as described above, these algorithms can be continuously trained/retrained using real-time user data as it is received.
  • the machine learning algorithm may employ regression modeling, where relationships between predictor variables and dependent variables are determined and weighted.
  • Data stores include repositories for persistently storing and managing collections of data.
  • Types of data store repositories include, for example, databases and simpler store types.
  • Simpler store types include files, emails, and so forth.
  • a database is a series of bytes that is managed by a database management system (DBMS).
  • suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object-oriented databases, object databases, entity-relationship model databases, associative databases, and extensible markup language (XML) databases. Further non-limiting examples include structured query language (SQL), PostgreSQL, MySQL, Oracle, DB2, and Sybase. In some embodiments, a database is cloud computing based. [00173] Standalone Application [00174] In some embodiments, a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process (e.g., not a plug-in).
  • a compiler is a computer program (or set of programs) that transforms source code written in a programming language into binary object code such as assembly language or machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, Java™, Lisp, Python™, Visual Basic, and VB.NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program.
  • a computer program includes one or more executable compiled applications.
  • the systems and methods disclosed herein include software, server, or database modules. Software modules are created using machines, software, and languages.
  • a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof. In further various embodiments, a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof.
  • the one or more software modules comprise, by way of non- limiting examples, a web application, a mobile application, and a standalone application.
  • software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine.
  • the modules, processors, or systems described above may be implemented as a programmed general purpose computer, an electronic device programmed with microcode, a hard-wired analog logic circuit, software stored on a computer-readable medium or signal, an optical computing device, a networked system of electronic and/or optical devices, a special purpose computing device, an integrated circuit device, a semiconductor chip, and/or a software module or object stored on a computer-readable medium or signal, for example.
  • Particular implementations of the subject matter have been described. Other implementations, alterations, and permutations of the described implementations are within the scope of the following claims as will be apparent to those skilled in the art.


Abstract

A haptic training system for developing skills and training for soft-tissue surgeries. The system includes one or more robotic arm assemblies and a computing device. The robotic arm assembly includes a frame, an upper manipulation linkage assembly coupled to the frame, a lower manipulation linkage assembly coupled to the frame, one or more actuators coupled to the upper manipulation linkage assembly or the lower manipulation linkage assembly, a tool stem assembly coupled to the upper manipulation linkage assembly and the lower manipulation linkage assembly, the tool stem assembly configured to support a surgical tool, and a sensor assembly coupled to the tool stem assembly. The computing device is configured to receive magnetic flux density data from the sensor assembly when a force is applied to the tool stem assembly, determine a position and an orientation of the tool stem assembly based on the magnetic flux density data, estimate forces of interaction with the upper manipulation linkage assembly and the lower manipulation linkage assembly based on electrical current of the one or more actuators, calculate an estimation of force at a distal end of the surgical tool based on the position and the orientation of the tool stem assembly and the forces of interaction, and process the estimation of force at the distal end of the surgical tool to provide real-time haptic feedback to the user via the surgical tool.

Description

Attorney Docket No.093386-0040-WO01 SYSTEMS AND METHODS FOR ACCELERATED SKILL ACQUISITION AND ASSESSMENT FOR SURGICAL TRAINING CROSS-REFERENCE TO RELATED APPLICATIONS [0001] This application is a non-provisional of and claims the benefit of U.S. Provisional Patent Application No.63/599,460, filed on November 15, 2024, the contents of which are incorporated herein by reference. BACKGROUND [0002] Resource-limited countries suffer from a severe shortage of healthcare workers. This shortage is particularly profound when considering the lack of access to qualified surgeons. Some countries in sub-Saharan Africa have less than one qualified surgeon per 100,000 persons in population. Task-shifting, in which medical care is provided by nurses and non-physician medical practitioners, has become commonplace. [0003] The leading causes of perioperative deaths in Africa are attributed to wound site infection, sepsis, hemorrhage, and anesthesia complications. It is therefore essential to develop strategies to reduce these risks. One potential strategy is to shift towards laparoscopic minimally invasive surgery when possible, provided training in laparoscopic techniques can be made widely available. [0004] Given the severe shortage of surgeons and providers with laparoscopic skills, there is a need to facilitate laparoscopic skill acquisition for physicians in Africa. The development of low-cost, high-fidelity laparoscopic simulators can encourage dissemination in resource-limited settings, increase the rate of skill acquisition, and accelerate the pace of training fully-qualified surgeons in Africa. [0005] Accordingly, low-cost systems and methods for accelerated training of medical students and surgical trainees in soft tissue surgeries are desirable. 
SUMMARY [0006] The present disclosure provides a training system including one or more haptic devices, a skill assessment algorithm, a virtual reality model of anatomy, and user eye gaze and instrument motion tracking algorithms. The combination of these devices and methods allows this system to train surgeons on minimally invasive procedures where they may use a simulated organ in a virtual reality setting or a physical organ model either made of silicone or utilizing an explanted animal organ. [0007] The present disclosure provides a system for training a user in soft-tissue surgeries. The system comprises a robotic arm assembly and a computing device. The robotic arm assembly includes a frame, an upper manipulation linkage assembly coupled to the frame, a lower manipulation linkage assembly coupled to the frame, one or more actuators coupled to the upper manipulation linkage assembly or the lower manipulation linkage assembly, a tool stem assembly coupled to the upper manipulation linkage assembly and the lower manipulation linkage assembly, the tool stem assembly configured to support a surgical tool, and a sensor assembly coupled to the tool stem assembly. The computing device is configured to receive magnetic flux density data from the sensor assembly when a force is applied to the tool stem assembly, determine a position and an orientation of the tool stem assembly based on the magnetic flux density data, estimate forces of interaction with the upper manipulation linkage assembly and the lower manipulation linkage assembly based on electrical current of the one or more actuators, calculate an estimation of force at a distal end of the surgical tool based on the position and the orientation of the tool stem assembly and the forces of interaction, and process the estimation of force at the distal end of the surgical tool to provide real-time haptic feedback to the user via the surgical tool. 
[0008] In some aspects, the sensor assembly includes a plurality of magnets and a plurality of Hall-effect sensors configured to detect the magnetic flux density data. [0009] In some aspects, the tool stem assembly includes a stem body, a first sensor housing supported on the stem body, and a second sensor housing supported on the stem body, and wherein the sensor assembly includes a first sensor supported by the first sensor housing and a second sensor supported by the second sensor housing, and wherein the first sensor includes a first elastomeric matrix positioned between the first sensor housing and the stem body, and wherein the second sensor includes a second elastomeric matrix positioned between the second sensor housing and the stem body. [0010] In some aspects, the first sensor includes a plurality of magnets and a plurality of Hall-effect sensors within the first sensor housing, the plurality of magnets and the plurality of Hall-effect sensors configured to detect the magnetic flux density data. [0011] In some aspects, the second sensor includes a plurality of magnets and a plurality of Hall-effect sensors within the second sensor housing, the plurality of magnets and the plurality of Hall-effect sensors configured to detect the magnetic flux density data. [0012] In some aspects, the sensor assembly includes a six-axis sensor including an array of magnets and corresponding Hall-effect sensors to detect the magnetic flux density data when a force is applied to the surgical tool. [0013] In some aspects, the system further comprises a pitch actuation assembly coupled to the upper manipulation linkage assembly, the pitch actuating assembly configured to move the upper manipulation linkage assembly about a horizontal axis. 
[0014] In some aspects, the system further comprises a yaw actuation assembly coupled to the lower manipulation linkage assembly, the yaw actuation assembly configured to move the lower manipulation linkage assembly about a vertical axis.

[0015] In some aspects, the system further comprises an interchange device coupled to the tool stem assembly, the interchange device configured to lock the surgical tool in position relative to the tool stem assembly.

[0016] In some aspects, the interchange device includes a locked state and a released state to interchange the surgical tool.

[0017] In some aspects, the system further comprises a virtual reality system configured to provide a simulated physical interaction with a soft-tissue organ, and wherein the user receives visual and auditory guidance from the virtual reality system while receiving the real-time haptic feedback.

[0018] In some aspects, the virtual reality system is configured to incorporate eye gaze data of the user when providing the real-time haptic feedback.

[0019] In some aspects, the computing device is further configured to input the estimation of force at the distal end of the surgical tool, the eye gaze data, and tool motion data to a skill assessment machine learning model to provide an output of a score related to a surgical training task performed by the user.

[0020] In some aspects, the computing device is further configured to input video data from an endoscope to the skill assessment machine learning model.

[0021] In some aspects, the tool motion data include curvature and torsion of a striction curve and a dual angle.
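Paragraph [0021] lists curvature and torsion among the tool motion data. For a smooth space curve these invariants follow from the first three derivatives; the sketch below estimates them by finite differences and checks the result against a circular helix, for which closed-form values are known. The step size and test curve are illustrative choices, not part of this disclosure.

```python
import numpy as np

def curve_invariants(r, t, h=1e-3):
    # curvature and torsion of a space curve r: R -> R^3 via central differences
    d1 = (r(t + h) - r(t - h)) / (2 * h)
    d2 = (r(t + h) - 2 * r(t) + r(t - h)) / h**2
    d3 = (r(t + 2*h) - 2*r(t + h) + 2*r(t - h) - r(t - 2*h)) / (2 * h**3)
    c = np.cross(d1, d2)
    kappa = np.linalg.norm(c) / np.linalg.norm(d1) ** 3       # |r' x r''| / |r'|^3
    tau = np.dot(c, d3) / np.dot(c, c)                        # (r' x r'') . r''' / |r' x r''|^2
    return kappa, tau

# circular helix: closed-form curvature a/(a^2+b^2) and torsion b/(a^2+b^2)
a, b = 1.0, 0.5
helix = lambda t: np.array([a * np.cos(t), a * np.sin(t), b * t])
kappa, tau = curve_invariants(helix, 0.3)
```

For the sampled tool trajectories produced by the tracking system, the same formulas would be applied to filtered discrete derivatives rather than an analytic curve.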
[0022] In some aspects, the system further comprises a second robotic arm assembly including a second frame, a second upper manipulation linkage assembly coupled to the second frame, a second lower manipulation linkage assembly coupled to the second frame, one or more actuators coupled to the second upper manipulation linkage assembly or the second lower manipulation linkage assembly, a second tool stem assembly coupled to the second upper manipulation linkage assembly and the second lower manipulation linkage assembly, the second tool stem assembly configured to support a second surgical tool, and a second sensor assembly coupled to the second tool stem assembly.

[0023] In some embodiments, the present disclosure provides a haptic device with integrated intrinsic joint-level sensing and embedded sensors allowing estimation of user effort and tool-tip interaction force. In some aspects, the haptic device is configured to provide a user the experience of a remote center of motion with a specified stiffness.

[0024] In additional embodiments, the present disclosure provides a low-cost, six-axis sensor for estimating force and moment interactions with surgical instruments using an array of Hall-effect sensors and miniature magnets embedded within an elastomeric tool interface.

[0025] In further embodiments, the present disclosure provides a system whereby information regarding tool motion, estimated user effort, and forces at the tool tip is used for real-time skill scoring for the purpose of informing control laws that assist trainees in the process of gaining technical skills during minimally invasive surgical training. In some aspects, the system includes two robotic arm assemblies along with either a physical training model (e.g., an explanted animal organ or a silicone-based phantom) or a virtual-reality simulated phantom.
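The skill scoring in paragraph [0025] draws on tool motion features, and paragraph [0028] below identifies the locus of instantaneous screws of motion as one such input. As an illustration only, the standard screw-theory decomposition below recovers an instantaneous screw from a measured spatial twist (angular velocity ω, linear velocity v); this is textbook kinematics, not code from this disclosure.

```python
import numpy as np

def instantaneous_screw(omega, v):
    # decompose a spatial twist (omega, v of a frame at the origin) into its
    # instantaneous screw: unit axis direction, the point on the axis closest
    # to the origin, and the screw pitch (assumes omega != 0)
    w2 = np.dot(omega, omega)
    axis = omega / np.sqrt(w2)
    point = np.cross(omega, v) / w2
    pitch = np.dot(omega, v) / w2
    return axis, point, pitch

# rotation about the z axis at 2 rad/s while translating along z at 1 m/s
axis, point, pitch = instantaneous_screw(np.array([0.0, 0.0, 2.0]),
                                         np.array([0.0, 0.0, 1.0]))
```

Tracking (axis, point, pitch) over time yields the temporal locus of screws that a skill-assessment model could consume as a feature stream.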
[0026] In some aspects, a programmable position and stiffness of the remote center of motion are achieved via a combination of kinematic coordinated control and active stiffness control and/or via impedance/admittance control.

[0027] In some aspects, a virtual reality system is used for simulated physical interaction with an organ, and visual and auditory guidance cues are provided to a trainee along with force feedback via the haptic device.

[0028] In some embodiments, the present disclosure provides a skill-assessment algorithm that uses information regarding the temporal evolution of the locus of instantaneous screws of motion and instantaneous wrenches (forces/moments) as inputs.

[0029] In some aspects, the system utilizes user gaze information for skill assessment.

[0030] In some aspects, the system utilizes endoscopic video information as an additional input to the skill-assessment algorithm for skill assessment of the user.

[0031] In some aspects, the system utilizes a simulated model of a target training anatomy to enable simulated interaction with a target anatomy while also offering haptic force feedback.

[0032] Other aspects of the disclosure will become apparent by consideration of the detailed description and accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0033] FIG.1A illustrates a haptic training system according to an embodiment of the present disclosure.

[0034] FIG.1B illustrates a rear perspective view of a robotic arm assembly of the haptic training system illustrated in FIG.1A.

[0035] FIG.1C illustrates a top perspective view of a robotic arm assembly of the haptic training system illustrated in FIG.1A.

[0036] FIG.1D illustrates a top view of a robotic arm assembly of the haptic training system illustrated in FIG.1A.

[0037] FIG.1E illustrates a front view of a robotic arm assembly of the haptic training system illustrated in FIG.1A.
[0038] FIG.1F illustrates a side view of a robotic arm assembly of the haptic training system illustrated in FIG.1A.

[0039] FIG.2A illustrates a perspective view of a prototype of the robotic arm assembly illustrated in FIGS.1A-1F.

[0040] FIG.2B illustrates a top perspective view of a prototype robotic arm assembly illustrated in FIGS.1A-1F.

[0041] FIG.3 is a perspective view of the various assemblies of the robotic arm assembly illustrated in FIGS.1A-1F.

[0042] FIG.4 illustrates an exploded view of an upper manipulation linkage (five-bar) assembly of the robotic arm assembly.

[0043] FIG.5 illustrates a rear perspective view of a lower (vertical) manipulation linkage assembly of the robotic arm assembly.

[0044] FIG.6 illustrates a perspective view of a pitch actuation assembly of the robotic arm assembly.

[0045] FIG.7A illustrates a perspective view of a tool stem assembly of the robotic arm assembly with a surgical tool.

[0046] FIG.7B is an enlarged view of a portion of a tool stem assembly of the robotic arm assembly with a surgical tool.

[0047] FIG.8A illustrates a perspective view of a component of the tool stem assembly of FIG.7A.

[0048] FIG.8B illustrates an exploded view of the component of the tool stem assembly shown in FIG.8A.

[0049] FIG.9 illustrates an exploded view of the component of the tool stem assembly shown in FIGS.8A-8B.

[0050] FIG.10 illustrates a perspective view and a cross-sectional view of a tool stem assembly of the robotic arm assembly.

[0051] FIG.11 illustrates an exploded view of a tool stem assembly with a surgical tool.

[0052] FIG.12 illustrates an enlarged exploded view of a tool stem assembly.

[0053] FIG.13 illustrates a series of views of a sensor assembly of the robotic arm assembly.

[0054] FIG.14 illustrates a diagram of forces acting on a surgical tool in the robotic arm assembly.

[0055] FIG.15 illustrates a diagram and process for determining tool motion characteristics.
[0056] FIG.16 illustrates a diagram and process for determining tool motion characteristics.

[0057] FIG.17 illustrates a scene from a virtual training environment using the haptic training system of FIG.1A.

[0058] FIG.18 illustrates a graph neural network of a machine learning algorithm for skill assessment.

[0059] FIG.19 illustrates a time-series model of a machine learning algorithm for skill assessment.

[0060] FIG.20 illustrates integration of a proposed force sensor with a laparoscopic tool and haptic device.

[0061] FIG.21 illustrates (a) a picture of the assembled force/torque sensor, (b) a picture of the sensor with the top cover and top silicone layer removed, and (c) an exploded view of the sensor: ① top outer shell, ② silicone layer, ③ MLX90393 magnetometer, ④ K&J Magnetics D101-N52 magnet, ⑤ center piece, ⑥ bottom outer shell.

[0062] FIG.22 illustrates frame assignments for the nominal center frame {0}, the deformed center frame {C}, the sensor frames {Si}, and the magnet frames {Mi}.

[0063] FIG.23 illustrates an I2C wiring diagram. There exist 16 unique I2C addresses for the MLX90393 magnetometer, which allows up to two F/T sensors on the same I2C bus.

[0064] FIG.24 illustrates a setup for sensitivity experiments.

[0065] FIG.25 graphically illustrates the distance of the magnet from the sensor frame as a function of magnetic field strength for the X, Y, and Z axes. Note that although the range of distances and magnetic flux vary between X/Y and Z, the plots are superimposed.

[0066] FIG.26 illustrates a calibration experimental setup: (a) closeup of the force sensor attached to a calibration weight and the robot flange (① 100 g mass, ② 3D-printed attachment, ③ force sensor, ④ Meca500 robot); (b) experimental setup for z-axis torque experiments; (c) eight example poses out of the 193 used to calibrate the force sensor.

[0067] FIG.27 illustrates validation results of the force sensor comparing the applied and measured force/torque in each dimension.
DETAILED DESCRIPTION

[0068] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. In case of conflict, the present document, including definitions, will control. Example methods and systems are described below, although methods and systems similar or equivalent to those described herein can be used in practice or testing of the present disclosure. All publications, patent applications, patents and other references mentioned herein are incorporated by reference in their entirety. The systems, methods, and examples disclosed herein are illustrative only and not intended to be limiting.

[0069] The terms “comprise(s),” “include(s),” “having,” “has,” “can,” “contain(s),” and variants thereof, as used herein, are intended to be open-ended transitional phrases, terms, or words that do not preclude the possibility of additional acts or structures. The singular forms “a,” “an” and “the” include plural references unless the context clearly dictates otherwise.

[0070] As used herein, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A, X employs B, or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances. Moreover, articles “a” and “an” as used in the subject specification and annexed drawings should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
[0071] In addition, unless otherwise indicated, numbers expressing quantities, constituents, distances, or other measurements used in the specification and claims are to be understood as being modified by the term “about.” The terms “about,” “approximately,” “substantially,” or their equivalents, represent an amount or condition close to the specific stated amount or condition that still performs a desired function or achieves a desired result. For example, the terms “approximately,” “about,” and “substantially” may refer to an amount or condition that deviates by less than 10%, or by less than 5%, or by less than 1%, or by less than 0.1%, or by less than 0.01% from a specifically stated amount or condition.

[0072] The present disclosure is described with reference to the drawings, where like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, a number of specific details are set forth in order to provide an improved understanding of the present disclosure. It may be evident, however, that the systems and methods of the present disclosure may be practiced without one or more of these specific details. In other instances, well-known structures and devices are shown in block diagram form to facilitate describing the systems and methods of the present disclosure. There is no specific requirement that a system, method, or technique include all of the details characterized herein to obtain some benefit according to the present disclosure. Thus, the specific examples characterized herein are meant to be example applications of the techniques described, and alternatives are possible.

[0073] Haptic devices have been shown to be valuable in supplementing surgical training, especially when providing haptic feedback based on user performance metrics such as the wrench applied by the user on the tool.
However, current 6-axis force/torque (F/T) sensors are prohibitively expensive, especially when needed in low-resource areas.

[0074] The present disclosure provides a system for surgical training for users to obtain skills associated with soft-tissue surgeries. The training system described herein can estimate the forces applied by a user on either a virtual environment or a physical phantom model to provide a “real-life” simulated learning and training experience. The system of the present disclosure also can accept a variety of surgical instruments for the user to practice with. The system serves as a tool tracking device, thereby recording motions and estimating forces of interaction with the environment. Using information from the haptic device(s), the system provides online skill assessment between repetitions of surgical subtasks by the same user. Using this information, a haptic training intervention can be configured to assist the user in completing a surgical task, and the level of assistance can be tuned based on preference. A machine-learning model of tissue deformation is provided based on finite element modeling of target training anatomy. Additionally, the machine-learning model informs skill assessment and the haptic device of the expected interaction force with the virtual anatomy model.

[0075] The present disclosure also provides a low-cost, six-axis F/T sensor used in the training system. The present disclosure also describes a training system and device that uses Hall-effect sensors to measure the change in the position of magnets embedded in a silicone layer that results from a wrench applied to the device. Preliminary experimental validation demonstrates that these sensors can achieve an accuracy of 0.45 N and 0.014 Nm, and a theoretical XY range of ±50 N, Z range of ±20 N, and torque range of ±0.2 Nm.
The experimental data discussed below indicates that the six-axis F/T sensor disclosed herein can accurately measure user force and provide useful feedback during surgical training on a haptic device.

[0076] Benefits of implementing a training system with a low-cost F/T sensor as described herein include improving the feasibility of exploiting force data for feedback in surgical assessment and increasing access to such haptic device trainers. The cost of fabrication of these F/T sensors may be reduced by 3-D printing the flexures into the sensor, but the cost and complexity of signal amplification and conditioning remain at odds with the simplicity of the F/T sensor for training in soft-tissue surgeries as described herein.

[0077] Additional benefits include features that allow for device interchangeability – the system allows use of domain-relevant tools based on the surgical simulation that a user is practicing. The system also provides integrated sensing due to integrated sensors within the tool stem that allow direct sensing of the interaction forces between the tool stem and the five-bar linkages. This added information allows improved estimation of the interaction force at the surgical tool tip. Lastly, the system provides a programmable remote center of motion location and stiffness. In addition to being able to program a pivot motion of the surgical tool about an incision point (remote center of motion (RCM) point), the system uses the sensors in the stem and in the joints to enable stiffness control of the RCM point to mimic the function of an abdominal tissue incision point through which the surgical instrument is passed.

[0078] FIGS.1A-1F and 2A-2B illustrate a haptic training system 10 for developing skills and training for soft-tissue surgeries. The system 10 includes one or more robotic arm assemblies 12 and a computing device 14.
The robotic arm assembly 12 includes a plurality of subsystems or assemblies that work together to provide a training experience to a user learning how to perform soft-tissue surgeries. The computing device 14 interfaces and communicates with the robotic arm assembly 12 to receive data from the plurality of subsystems or assemblies of the robotic arm assembly 12. In some embodiments, the training system 10 can include a second robotic arm assembly 12 for two-handed training options. Only one robotic arm assembly 12 is discussed herein; however, it is understood that the second robotic arm assembly would be the same in structure and function as described herein. FIGS.2A-2B illustrate a prototype of the robotic arm assembly 12.

[0079] With reference to FIG.1A, the computing device 14 is illustrated as having a number of components, but any one or more of these components may be omitted or duplicated, as suitable for the application and setting. In some embodiments, some or all of the components included in the computing device 14 may be attached to one or more motherboards and enclosed in a housing. In some embodiments, some of those components may be fabricated onto a single system-on-a-chip (SoC) (e.g., the SoC may include one or more electronic processing devices 16 and one or more storage devices 18). Additionally, in various embodiments, the computing device 14 may not include one or more of the components illustrated in FIG.1A, but may include interface circuitry for coupling to the one or more components using any suitable interface (e.g., a Universal Serial Bus (USB) interface, a High-Definition Multimedia Interface (HDMI) interface, a Controller Area Network (CAN) interface, a Serial Peripheral Interface (SPI) interface, an Ethernet interface, a wireless interface, or any other appropriate interface).
For example, the computing device 14 may not include a display device 20, but may include display device interface circuitry (e.g., a connector and driver circuitry) to which an external display device 20 may be coupled.

[0080] The computing device 14 includes a processing device 16 (e.g., one or more processing devices). As used herein, the terms “electronic processor device” and “processing device” interchangeably refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory. In various embodiments, the processing device 16 may include one or more digital signal processors (DSPs), application-specific integrated circuits (ASICs), central processing units (CPUs), graphics processing units (GPUs), server processors, field programmable gate arrays (FPGAs), or any other suitable processing devices.

[0081] The computing device 14 also includes a storage device 18 (e.g., one or more storage devices). In various embodiments, the storage device 18 may include one or more memory devices, such as random-access memory (RAM) devices (e.g., static RAM (SRAM) devices, magnetic RAM (MRAM) devices, dynamic RAM (DRAM) devices, resistive RAM (RRAM) devices, or conductive-bridging RAM (CBRAM) devices), hard drive-based memory devices, solid-state memory devices, networked drives, cloud drives, or any combination of memory devices. In some embodiments, the storage device 18 may include memory that shares a die with the processing device 16. In such an embodiment, the memory may be used as cache memory and include embedded dynamic random-access memory (eDRAM) or spin transfer torque magnetic random-access memory (STT-MRAM), for example.
In some embodiments, the storage device 18 may include non-transitory computer readable media having instructions thereon that, when executed by one or more processing devices (e.g., the processing device 16), cause the computing device 14 to perform any appropriate ones of the methods disclosed herein below or portions of such methods.

[0082] The computing device 14 further includes an interface device 22 (e.g., one or more interface devices 22). In various embodiments, the interface device 22 may include one or more communication chips, connectors, and/or other hardware and software to govern communications between the computing device 14 and other computing devices. For example, the interface device 22 may include circuitry for managing wireless communications for the transfer of data to and from the computing device 14. The term “wireless” and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data via modulated electromagnetic radiation through a nonsolid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. Circuitry included in the interface device 22 for managing wireless communications may implement any of a number of wireless standards or protocols, including but not limited to Institute of Electrical and Electronics Engineers (IEEE) standards including Wi-Fi (IEEE 802.11 family), IEEE 802.16 standards, Long-Term Evolution (LTE) project along with any amendments, updates, and/or revisions (e.g., advanced LTE project, ultramobile broadband (UMB) project (also referred to as “3GPP2”), etc.).
In some embodiments, circuitry included in the interface device 22 for managing wireless communications may operate in accordance with a Global System for Mobile Communication (GSM), General Packet Radio Service (GPRS), Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Evolved HSPA (E-HSPA), or LTE network. In some embodiments, circuitry included in the interface device 22 for managing wireless communications may operate in accordance with Enhanced Data for GSM Evolution (EDGE), GSM EDGE Radio Access Network (GERAN), Universal Terrestrial Radio Access Network (UTRAN), or Evolved UTRAN (E-UTRAN). In some embodiments, circuitry included in the interface device 22 for managing wireless communications may operate in accordance with Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Digital Enhanced Cordless Telecommunications (DECT), Evolution-Data Optimized (EV-DO), and derivatives thereof, as well as any other wireless protocols that are designated as 3G, 4G, 5G, and beyond. In some embodiments, the interface device 22 may include one or more antennas (e.g., one or more antenna arrays) configured to receive and/or transmit wireless signals.

[0083] In some embodiments, the interface device 22 may include circuitry for managing wired communications, such as electrical, optical, or any other suitable communication protocols. For example, the interface device 22 may include circuitry to support communications in accordance with Ethernet technologies. In some embodiments, the interface device 22 may support both wireless and wired communication, and/or may support multiple wired communication protocols and/or multiple wireless communication protocols.
For example, a first set of circuitry of the interface device 22 may be dedicated to shorter-range wireless communications such as Wi-Fi or Bluetooth, and a second set of circuitry of the interface device 22 may be dedicated to longer-range wireless communications such as global positioning system (GPS), EDGE, GPRS, CDMA, WiMAX, LTE, EV-DO, or others. In some other embodiments, a first set of circuitry of the interface device 22 may be dedicated to wireless communications, and a second set of circuitry of the interface device 22 may be dedicated to wired communications.

[0084] The computing device 14 also includes battery/power circuitry 24. In various embodiments, the battery/power circuitry 24 may include one or more energy storage devices (e.g., batteries or capacitors) and/or circuitry for coupling components of the computing device 14 to an energy source separate from the computing device 14 (e.g., to AC line power).

[0085] The computing device 14 also includes a display device 20 (e.g., one or multiple individual display devices). In various embodiments, the display device 20 may include any visual indicators, such as a heads-up display, a computer monitor, a projector, a touchscreen display, a liquid crystal display (LCD), a light-emitting diode display, or a flat panel display.

[0086] The computing device 14 also includes additional input/output (I/O) devices 26. In various embodiments, the I/O devices 26 may include one or more data/signal transfer interfaces, audio I/O devices (e.g., microphones or microphone arrays, speakers, headsets, earbuds, alarms, etc.), audio codecs, video codecs, printers, sensors (e.g., thermocouples or other temperature sensors, humidity sensors, pressure sensors, vibration sensors, etc.), image capture devices (e.g., one or more cameras), human interface devices (e.g., keyboards, cursor control devices, such as a mouse, a stylus, a trackball, or a touchpad), etc.
[0087] Depending on the specific embodiment of the system 10, various components of the interface devices 22 and/or I/O devices 26 can be configured to send and receive suitable control messages, suitable control/telemetry signals, and streams of data. In some examples, the interface devices 22 and/or I/O devices 26 include one or more analog-to-digital converters (ADCs) for transforming received analog signals into a digital form suitable for operations performed by the processing device 16 and/or the storage device 18. In some additional examples, the interface devices 22 and/or I/O devices 26 include one or more digital-to-analog converters (DACs) for transforming digital signals provided by the processing device 16 and/or the storage device 18 into an analog form suitable for being communicated to the corresponding components of the system 10.

[0088] The computing device 14 also includes a plurality of motor controllers 28, each configured to control operation of a respective motor in the system 10. The motor controllers 28 receive and process instructions from the processing device 16 to control the operation of the motors.

[0089] With reference to FIG.3, the robotic arm assembly 12 includes a frame assembly 100, an upper manipulation linkage (five-bar) assembly 200, a lower (vertical) manipulation linkage assembly 300, a pitch actuation assembly 400 for the upper manipulation linkage (five-bar) assembly 200, a yaw actuation assembly 500 for the lower manipulation linkage assembly 300, a tool stem assembly 600, and a sensor assembly 700.

[0090] In some embodiments, the frame assembly 100 includes a first upright 104, a second upright 108, and a first cross bar 112 coupled between the first upright 104 and the second upright 108. The first cross bar 112 is oriented perpendicular relative to the first upright 104 and the second upright 108 and is coupled to an upper end of the first upright 104 and an upper end of the second upright 108.
The frame assembly 100 also includes a first strut 116 extending from the first upright 104, a second strut 120 extending from the second upright 108, a second cross bar 124 coupled between the first strut 116 and the second strut 120, and a third cross bar 126 coupled between the first strut 116 and the second strut 120. The second cross bar 124 is oriented perpendicular relative to the first strut 116 and the second strut 120. The second cross bar 124 is also oriented parallel relative to the first cross bar 112. The frame assembly 100 also includes a first leg 128 extending from the first strut 116 and a second leg 132 extending from the second strut 120. The first leg 128 and the second leg 132 are oriented perpendicular relative to the first strut 116, the second strut 120, and the second cross bar 124. In some constructions, the first leg 128 and the second leg 132 extend from the second cross bar 124.

[0091] The frame assembly 100 further includes a first bracket 136 extending from the first upright 104 and a second bracket 140 extending from the second upright 108. The first bracket 136 and the second bracket 140 include apertures which are horizontally aligned and positioned across from one another as illustrated in FIG.3. The frame assembly 100 further includes a third bracket 144 extending from the first cross bar 112 and a fourth bracket 148 extending from the third cross bar 126. The third bracket 144 and the fourth bracket 148 include apertures which are vertically aligned and positioned across from one another as illustrated in FIG.3.

[0092] As illustrated in FIGS.3 and 4, the upper manipulation linkage (five-bar) assembly 200 (hereinafter referred to as “the upper five-bar assembly 200”) includes a housing 204 including a first peg 208 and a second peg 212.
The first peg 208 and the second peg 212 are configured to be received within the corresponding apertures in the first bracket 136 and the second bracket 140 of the frame assembly 100 to thereby couple the upper five-bar assembly 200 to the frame assembly 100.

[0093] With continued reference to FIG.4, the upper five-bar assembly 200 also includes a first revolute actuator 212 coupled to the housing 204 and a second revolute actuator 216 coupled to the housing 204. The upper five-bar assembly 200 also includes a first input link 220 coupled to the first revolute actuator 212 and a second input link 224 coupled to the second revolute actuator 216. The first revolute actuator 212 controls movement of the first input link 220, and the second revolute actuator 216 controls movement of the second input link 224.

[0094] The upper five-bar assembly 200 further includes a coupler link 228 configured to provide kinematic coupling between the first input link 220 and the second input link 224. The coupler link 228 terminates with a hook link 232. The hook link 232 is coupled to the tool stem assembly 600 via a passive revolute joint 236.

[0095] As illustrated in FIGS.3 and 5, the lower (vertical) manipulation linkage assembly 300 (hereinafter referred to as “the lower five-bar assembly 300”) includes a housing 304 including a first peg 308 and a second peg 312. The first peg 308 and the second peg 312 are configured to be received within the corresponding apertures in the third bracket 144 and the fourth bracket 148 of the frame assembly 100 to thereby couple the lower five-bar assembly 300 to the frame assembly 100.

[0096] With continued reference to FIGS.3 and 5, the lower five-bar assembly 300 also includes a third actuator 312 coupled to the housing 304 and a fourth actuator 316 coupled to the housing 304.
The lower five-bar assembly 300 also includes a first input link 320 coupled to the third actuator 312 and a second input link 324 coupled to the fourth actuator 316. The third actuator 312 controls movement of the first input link 320, and the fourth actuator 316 controls movement of the second input link 324.

[0097] The lower five-bar assembly 300 further includes a coupler link 328 configured to provide kinematic coupling between the first input link 320 and the second input link 324. The coupler link 328 terminates with a hook link 332. The hook link 332 is coupled to the tool stem assembly 600 via a passive revolute joint 336.

[0098] With reference to FIGS.3 and 6, the pitch actuation assembly 400 for the upper five-bar assembly 200 includes a fifth actuator 404 coupled to the housing 204. The fifth actuator 404 is configured to rotate the housing 204 about its horizontal axis. The pitch actuation assembly 400 includes a plurality of links to effectuate movement of the housing 204.

[0099] With reference to FIGS.3 and 5, the yaw actuation assembly 500 for the lower five-bar assembly 300 includes a sixth actuator 504 coupled to the housing 304. The sixth actuator 504 is configured to move the housing 304 about its vertical axis. The yaw actuation assembly 500 includes a four-bar transmission to effectuate movement of the housing 304.

[00100] The tool stem assembly 600 illustrated in FIGS.7A and 7B is configured to support a surgical tool 602 (e.g., laparoscopic graspers, cautery hooks, staplers, needle drivers, tissue graspers, and the like). The tool stem assembly 600 is supported on a first passive universal joint 604 and a second passive universal joint 608. With this arrangement, only five of the six actuators are needed to control an axis of the tool. The tool can passively rotate about its longitudinal axis, but its roll motion is sensed.

[00101] The tool stem assembly 600 is shown in more detail in FIGS.7A-13.
With reference to FIGS.8A, 8B, and 9, the tool stem assembly 600 includes an interchange device 612 to quickly exchange surgical tools. The interchange device 612 is normally closed and locks the surgical tool via spring-loaded lock pins that engage with matching slots in a tool adaptor 616 (shown in FIG.7B). By pressing radially on the levers shown in FIG.9, the interchange device 612 can be released. FIG.9 (at A) shows the interchange device 612 in a released state. FIG.9 (at B) shows the interchange device 612 in a locked state. FIG.9 (at C) illustrates a cross-section of the interchange device 612 in the locked state. [00102] With reference to FIG.10, the tool stem assembly 600 is shown along with a cross-sectional view. The tool stem assembly 600 includes a stem body 620, a first sensor housing 624 coupled to the hook link 232, and a second sensor housing 628 coupled to the hook link 332. The stem body 620 interfaces with the first sensor housing 624 and the second sensor housing 628 via an elastomeric medium (shown as void 632) and sensor assemblies 700 (discussed below). The second sensor housing 628 is free to rotate relative to the stem body 620. [00103] With reference to FIGS.8A, 8B, 9, and 10, the interchange device 612 is passively rotatable relative to the stem body 620 and this tool rotation is measured by a Hall-effect sensor and a magnet 640 (discussed below in more detail). FIGS.11 and 12 show an exploded view of the tool stem assembly 600. [00104] FIGS.10-13 illustrate the sensor assemblies 700. The sensor assemblies 700 include a first sensor assembly 700A supported in the first sensor housing 624 and a second sensor assembly 700B (not separately shown in the figures, but the structure and function is the same as the first sensor assembly 700A described herein) supported in the second sensor housing 628. The sensor assemblies 700A and 700B each include an elastomeric matrix 704 positioned between the stem body 620 and the sensor housings 624, 628.
The elastomeric matrix 704 has a plurality of magnets 708 embedded into each side below the outer surface. In some embodiments, the elastomeric matrix 704 includes eight miniature magnets 708 as illustrated (one not visible in FIG.13). In other embodiments, the elastomeric matrix 704 may include more or fewer than eight magnets 708. The sensor assemblies 700A and 700B each include a plurality of Hall-effect sensors 712 positioned to correspond with a respective one of the plurality of magnets 708. In some embodiments, the sensor assemblies 700A and 700B each include eight miniature Hall-effect sensors 712. In other embodiments, the sensor assemblies 700A and 700B may include more or fewer than eight Hall-effect sensors 712. The Hall-effect sensors 712 are configured to measure the magnetic flux density, which depends on the position of the magnets 708 relative to the Hall-effect sensors 712. As an external force is applied to the stem body 620, small deflections in the elastomeric matrix 704 cause the magnets 708 to move, thereby changing the magnetic flux density. The change in magnetic flux density readings from the plurality of Hall-effect sensors 712 provides some sensory redundancy and allows for calibrating a mapping from magnetic flux density readings to the full six components of force and moment acting on the Hall-effect sensors 712. [00105] The sensor assembly 700 is also described in more detail below in the EXAMPLE section. [00106] The system arrangement described above uses back-drivable actuators with high transmission efficiency; therefore, it is possible to use motor current sensing to determine the interaction forces between the upper five-bar assembly 200, the lower five-bar assembly 300, and the stem body 620.
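By way of a non-limiting sketch of such current-based force sensing, the measured motor currents may be converted to torques through the torque-current constant and mapped through the linkage statics. The Jacobian, torque constant, and gravity-torque terms below are assumed calibration inputs, and the function name is illustrative:

```python
import numpy as np

def interaction_force(J, currents, k_t, tau_gravity):
    """Estimate the interaction force f exerted on one linkage assembly
    from its motor currents (sketch; J, k_t, and the gravity torques are
    assumed to come from a prior calibration).

    Statics: J^T f = tau, so f = pinv(J^T) @ tau.
    """
    # Motor torques from currents, with gravity loading removed
    tau = k_t * np.asarray(currents, float) - np.asarray(tau_gravity, float)
    return np.linalg.pinv(np.asarray(J, float).T) @ tau
```

With an identity Jacobian and zero gravity torques, the estimated force is simply the torque vector scaled by the torque constant.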
However, since the system 10 includes six actuators for a tool passively capable of rolling about its longitudinal axis, it is possible to use actuation force redistribution (actuation redundancy) to control the compressive internal force along the segment between the hook link 232 (on the upper five-bar assembly 200) and the hook link 332 (on the lower five-bar assembly 300). A kinematically-consistent joint level motion control is implemented to minimize this internal load, which may be assumed to be zero. [00107] With reference to FIG.14, it is assumed that the haptic simulation has the goal of specifying a force of interaction at a virtual abdominal incision point and this force is fd. It is also assumed that the interaction forces acting on the stem body 620 from the upper five-bar assembly 200 and the lower five-bar assembly 300 are f1 and f2 and the force applied by the user’s hand is fh (which is unknown). In addition, a second unknown is the force at the end-effector fe. [00108] Given motor currents from each of the upper five-bar assembly 200 and the lower five-bar assembly 300, it is possible to solve for f1 and f2 using the motor torque-current constant to estimate the motor torques τi and the statics model of the upper five-bar assembly 200 and the lower five-bar assembly 300. This will have the form of:

Jiᵀ fi = τi, i = 1, 2 (1)

[00109] In addition, the sensor model also provides an estimate of these forces such that:

KABi = fi, i = 1, 2 (2)

where K is a known sensor stiffness matrix, A is a known sensor geometry matrix, and Bi is the augmented vector of all magnetic flux density vectors from all eight Hall-effect sensors in each sensor assembly 700A and 700B. [00110] It is possible to cast the problem of solving eq. (1) and eq. (2) for fi while minimizing the effect of measurement noise on the motor torques τi and the magnetic flux intensities Bi.
This in general can be cast as an optimization problem that solves:

[Jiᵀ; I] fi = [τi; KABi] (3)

where the stacked matrix on the left-hand side is denoted E and I is the identity matrix. [00111] And the problem of solving for fi given measurements τi and Bi can be solved as a weighted-least squares problem with a solution:

fi = (EᵀWE)⁻¹EᵀW[τi; KABi] (4)

where W is a block-diagonal weight matrix whose blocks are inversely proportional to the covariance of the measurement noise of τi and Bi. [00112] Given the estimate of f1 and f2, the force equilibrium equation is formed as:

f1 + f2 + fd + fh + fe = 0 (5)

[00113] Assuming the user does not apply a significant moment on the tool, the moment equilibrium about the hand interaction point is:

[r1^]f1 + [r2^]f2 + [rd^]fd + [re^]fe = 0 (6)

where [rj^] denotes the skew-symmetric cross-product matrix of the moment arm rj from the hand interaction point to the point of application of the corresponding force. Eq. (6) may be solved for the tool-tip force fe. [00114] Given the solution for fe it is possible to solve for the hand interaction force fh from eq. (5). [00115] In the above model, it is assumed that the robotic arm assembly 12 has been calibrated and that τi excludes the motor torques needed to overcome the gravity acting on the links of the device. This may be achieved through modeling the gravitational effects and measuring the weights of the robot links and the locations of their centers of mass. [00116] SKILL ASSESSMENT [00117] A unique way of characterizing instrument motion is illustrated in FIGS.15 and 16. The approach presented uses tool motion data captured from the robotic arm assembly 12 and reconstructs the instantaneous screws of motion for every instrument pose transition between consecutive samples. Using this data, the striction curve (which is the curve connecting the closest points on consecutive screw axes) is computed. The information obtained through this process includes the dual angle (the distance along the common normal of two consecutive instantaneous screws and the angle about the common normal of the two consecutive screws), the curvature κ(s), and the torsion τ(s). [00118] The same process shown in FIGS.15 and 16 may also be reproduced for the instantaneous wrenches that users apply on the environment or tool handle.
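By way of a non-limiting sketch of the screw reconstruction of FIGS.15 and 16, an instantaneous screw axis may be recovered from the relative transform between two consecutive tool poses, and the dual angle between consecutive axes may then be computed. The function names, tolerances, and the rotation-dominant simplification are assumptions of this example:

```python
import numpy as np

def screw_axis(T_rel):
    """Recover the screw axis (unit direction u, point q on the axis)
    from a relative SE(3) transform between two consecutive tool poses.
    Rotation-dominant motions only; a pure translation has no finite
    axis and raises an error."""
    R, t = T_rel[:3, :3], T_rel[:3, 3]
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-9:
        raise ValueError("motion is (nearly) a pure translation")
    u = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    # Remove the translation component along the axis, then solve
    # (I - R) q = t_perp for a point q on the screw axis.
    t_perp = t - (u @ t) * u
    q = np.linalg.pinv(np.eye(3) - R) @ t_perp
    return u, q

def dual_angle(u1, q1, u2, q2):
    """Dual angle between two consecutive screw axes: the distance along,
    and the angle about, their common normal."""
    u1, u2 = np.asarray(u1, float), np.asarray(u2, float)
    q1, q2 = np.asarray(q1, float), np.asarray(q2, float)
    angle = np.arccos(np.clip(abs(u1 @ u2), 0.0, 1.0))
    n = np.cross(u1, u2)
    if np.linalg.norm(n) < 1e-9:  # parallel axes
        dist = np.linalg.norm(np.cross(u1, q2 - q1))
    else:
        dist = abs((q2 - q1) @ n) / np.linalg.norm(n)
    return dist, angle
```

The closest points on consecutive axes recovered this way trace the striction curve, from which the curvature and torsion measures may then be computed.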
Within this context the screws represent the 6-dimensional force and moment vectors, and similar measures of the striction curve of these wrenches may be followed and used as an additional input to the machine learning model for skill assessment. [00119] In addition to kinematics data gathered from the robotic arm assembly 12, gaze tracking of the user’s eyes can be incorporated in some embodiments. The virtual reality system provides a training environment with realistic visual and haptic feedback for surgical training without the need for physical training phantoms or tissue. The eye gaze information is collected from a dedicated sensor in special glasses, such as, for example, Pupil Labs Neon XR. FIG.17 shows a scene from the virtual training environment with a virtual organ model. The virtual reality environment integrates models of the physical robotic arm assembly 12 and virtual anatomy to allow users to practice manipulating organs with surgical instruments and perform other training tasks. Users can experience realistic haptic feedback through the robotic arm assembly 12. Virtual cameras in the scene provide the visual or video feedback users would receive using an endoscope. A custom communication protocol allows the virtual environment to interface with machine-learning-driven backend tissue simulation to ensure real-time feedback. [00120] The training environment also supports eye gaze tracking with respect to the virtual or physical scene. This allows eye gaze analysis to assess user skill level. Additionally, an augmented reality interface can broadcast the gaze of an expert to guide trainees on what to pay attention to for more efficient skill acquisition. [00121] These kinematic performance measures are then used as an input to the machine learning algorithm of FIGS.18 and 19. [00122] Machine learning algorithms for skill assessment have generally focused only on kinematics, vision, and system events.
The haptic training system 10 and integrated environment allow analysis of other measures including force and economy of motion measures. A graph neural network is employed, as shown in FIG.18, to perform multi-modal fusion between information streams coming from vision feedback, kinematics measures, force, economy of motion measures, and gaze. Each information stream is processed initially through a time-series model, as shown in FIG.19. The network architecture and feature preprocessing are based on Yonghao Long et al., “Relational graph learning on visual and kinematics embeddings for accurate gesture recognition in robotic surgery,” 2021 IEEE International Conference on Robotics and Automation (ICRA), pp.13346-13353, IEEE, 2021, which is incorporated herein by reference. Once each modality is encoded, the neural network uses a message-passing scheme to refine the encoding by sharing information between the modalities. Then, the refined encoding goes through further processing depending on the downstream task, such as skill assessment, gesture segmentation, and gesture classification. As all this can be done in real-time, the gesture segmentation and recognition together with skill assessment can be used to provide targeted feedback to users depending on the skill level they are exhibiting while performing each gesture. EXAMPLE – SOFT, SIX-AXIS F/T SENSOR FOR TRAINING IN SURGICAL APPLICATIONS [00123] SENSOR DESIGN [00124] The mechanism of sensing force is based on Hooke’s Law, which gives a direct relationship between force applied on a spring and its deformation. Silicone has highly elastic properties and typically follows a non-linear deformation model. However, under small deformations, the relationship between force and deformation can be approximated as linear.
In order to measure deformation, small 1/16" diameter, 1/32" height magnets (K&J Magnetics, D101-N52) are placed on the surface of a layer of silicone and Hall-effect sensors/magnetometers (Melexis, MLX90393) are placed on the opposite surface to measure magnetic field strength. As the silicone is compressed, the relative position between the magnets and the magnetometers changes which is captured by the change in magnetic flux density. A linear mapping was assumed between position and magnetic field strength, as given by:

p = Mb + o (7)

where p is the position of the magnet with respect to the Hall-effect sensor, M is a diagonal matrix containing the change in position for a change in flux for each axis, b is the flux density, and o is the sensor offset. [00125] Any one-dimensional force applied to an isolated magnet, silicone, Hall-effect sensor system can be calculated using Hooke’s Law:

Fi = k(mb + o) (8)

where Fi is the force applied to the sensor, k is the spring constant of the silicone, m is the constant that relates the distance of the magnet to magnetic field strength, b is magnetic field strength, and o is some offset. Extrapolating to six dimensions from this one-dimensional case is explained in more detail below. [00126] While the sensor can also be used as a general 6-axis F/T sensor, it was designed with the target application of measuring forces and torques applied at the tip of surgical tools in a haptic training device. The sensor was designed such that it can be easily mounted to laparoscopic instruments by passing the shaft of the tools through the center of the force sensor as shown in FIG.20. The sensor itself has three main components: a center piece with attached Hall-effect sensors, a layer of silicone, and an outer shell to house the magnets (FIG.21). Both the center piece and the outer shell are 3D printed out of Phrozen Rock-Black Stiff resin (Phrozen, Sonic Mega 8K S).
The center piece serves to rigidly attach the magnetometers to the tool shaft whereas the magnets are fixed to the haptic device. The layer of silicone is molded with Eco-Flex 00-30 (Smooth-On), which has a Shore hardness of 00-30. The layer thickness is 6mm between the center piece and the outer shell. This allows the sensors to move with respect to the magnets in proportion to the amount of force applied by the user on the tool shaft. [00127] To capture the forces/torques in six degrees of freedom, the sensor has eight magnetometers and eight corresponding magnets arrayed radially around the center of the tool. The Hall-effect sensors were oriented at an angle of 25° from the vertical axis of the tool shaft. The magnets were offset by 6mm from the surface of the sensors. To encapsulate the positional deformation that results from a force applied to the sensor, the position of each magnet with respect to its magnetometer was combined to give the deformation twist of the center piece as a whole. Eight sensors were chosen to provide redundancy that was leveraged to minimize error in the computation of the deformation twist. This is explained in more detail below. As a basis for that computation, it was necessary to define a coordinate system for each magnetometer and magnet with respect to the base frame of the force sensor in the nominal case (no force applied) (FIG.22). The base frame was placed at the center of the device defined with the z-axis along the tool shaft in the direction of the tool tip. The Hall-effect sensor frames were assigned by the chip itself and have the z-axis pointing outward from the face of the chip, the y-axis pointing along the length of the tool shaft, and the x-axis along the horizontal surface of the inner ring. The frames of the magnets had the exact same orientation as their corresponding magnetometers, just translated 6mm in the positive Z direction.
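By way of a non-limiting sketch, the nominal poses of the eight radially arrayed sensor frames and their offset magnets may be constructed as follows. Only the 25° tilt, the 6mm magnet offset, and the eight-fold radial arrangement follow the description above; the ring radius and the exact axis conventions are simplifying assumptions of this example:

```python
import numpy as np

def sensor_frames(n=8, tilt_deg=25.0, ring_radius=0.015, magnet_offset=0.006):
    """Nominal sensor-frame rotations and sensor/magnet positions in the
    sensor base frame (base z-axis along the tool shaft).  Returns a
    list of (R, p_sensor, p_magnet) tuples."""
    frames = []
    tilt = np.radians(tilt_deg)
    ct, st = np.cos(tilt), np.sin(tilt)
    # Tilt of the chip face away from the tool (base z) axis
    Ry = np.array([[ct, 0.0, st], [0.0, 1.0, 0.0], [-st, 0.0, ct]])
    for i in range(n):
        phi = 2.0 * np.pi * i / n  # radial placement angle around the shaft
        c, s = np.cos(phi), np.sin(phi)
        Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        R = Rz @ Ry
        p_sensor = Rz @ np.array([ring_radius, 0.0, 0.0])
        # Magnet frame shares the sensor orientation, offset along its z-axis
        p_magnet = p_sensor + R @ np.array([0.0, 0.0, magnet_offset])
        frames.append((R, p_sensor, p_magnet))
    return frames
```

Each returned rotation is orthonormal, and every magnet sits exactly the nominal offset distance from its magnetometer.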
[00128] Measurements from the Hall-effect sensors were sent via I2C communication protocol to a Teensy 4.0 microcontroller. There were four versions of the MLX90393 magnetometer chip which can each accommodate four user-specified addresses, so in total, 16 MLX90393 magnetometers can fit on a single I2C bus (FIG.23). Since the current sensor design used eight Hall-effect sensors, two of the force sensing devices were connected on a single I2C bus, which will facilitate future integration into a haptic device. Each magnetometer was soldered to a custom printed circuit board (PCB) which interfaces between each chip and assigns the I2C address for each magnetometer. [00129] The MLX90393 magnetometer offers several parameters that can be adjusted by the user to determine the quantities, resolution, and frequency of magnetic field measurements. The Hall-effect sensor is capable of detecting changes in magnetic field in the X, Y, and Z direction as well as the internal temperature of the chip. For this application, measurements in all three dimensions are considered and temperature is ignored. [00130] Information from the magnet supplier indicated that the magnetic field strength at the surface of the magnet is 6053 gauss (605.3 mT). Therefore, the sensor must be capable of capturing the range 0-605300μT. The magnetometer can report 16 bits per measurement which means that the ideal resolution should be no less than 9.236 µT/LSB in the Z-direction. The best option for resolution offered by the MLX90393 is 9.680 µT/LSB along the Z-axis. The corresponding resolution for the field strength in the X and Y directions are both 6.009 µT/LSB. [00131] The maximum frequency of data acquisition was limited by the time it took to perform a measurement. This was determined by the amount of time needed to prepare the magnetometer for a measurement and how many dimensions must be measured.
For this application, the preparation time was 839μs and the measurement time for a single axis was 835μs. Therefore, the minimum period for measuring from three axes was 3.34ms. A 10ms period was chosen to give a sampling frequency of 100Hz, which is far below the threshold for human tactile perception of latency. [00132] FORCE ESTIMATION [00133] When a wrench is applied to the center piece of the sensor (FIG.21 at ⑤), the center deflects inside of the silicone layer (FIG.21 at ②). A method is described for estimating the wrench applied to the center piece of the sensor and the pose of the center piece using the measurements from the MLX90393 sensors (FIG.21 at ③). [00134] The ith MLX90393 sensor measures the magnetic flux density bi = [bix, biy, biz]T in Teslas in its own local frame {Si}. As described below, the position of the ith magnet Sipmi as a function of bi in {Si} was experimentally calibrated. Next, the measurements of Sipmi were used to estimate the pose of the center piece using Arun’s method for point-cloud registration. [00135] As shown in FIG.22, a frame {C} is fixed to the sensor center piece. When no wrench is applied to the sensor, this frame will be aligned with the original center frame (i.e., 0Tc = I4). When deflected, {C} exhibits a positional and/or orientational offset from the world frame (i.e., 0Tc ≠ I4). To solve for 0Tc, the following loop closure equation is used:

0pmi = 0pc + 0Rc Cpsi + 0Rc CRsi Sipmi (9)

In this equation, 0pc and 0Rc are the position and rotation components of 0Tc, respectively. 0pmi ∈ IR3 is the position of the ith magnet in world frame. This value is a constant that is known from the sensor geometry. Additionally, Cpsi ∈ IR3 is the position of the ith Hall-effect sensor frame {Si} relative to {C}, which is also a constant value, and Sipmi ∈ IR3 is the position of the ith magnet as sensed by the ith Hall-effect sensor. This equation can be simplified to:

0pmi = 0pc + 0Rc Cpmi (10)

where Cpmi = Cpsi + CRsi Sipmi. [00136] Using this equation and information from all eight Hall-effect sensors, we can estimate 0pc and 0Rc using a least-squares approach:

min over 0Rc, 0pc of Σi=1..8 ‖0pmi − (0Rc Cpmi + 0pc)‖² (11)

[00137] To solve this equation, we rely on the singular value decomposition (SVD) based approach first proposed by Arun et al. In this method, we first find the centroid of the position measurements:

0p̄ = (1/8) Σi=1..8 0pmi , Cp̄ = (1/8) Σi=1..8 Cpmi (12)

Using these centroids, we form the cross-covariance matrix:

H = Σi=1..8 (Cpmi − Cp̄)(0pmi − 0p̄)T (13)

Next, we take the singular value decomposition of H:

H = UΣVT (14)

The rotation matrix can now be estimated using:

0Rc = VUT (15)

Once the rotation matrix is estimated, the translation can be estimated using:

0pc = 0p̄ − 0Rc Cp̄ (16)

[00138] Using eq. (15) and eq. (16), we obtain the pose of the center piece with respect to the sensor world frame in the form of an SE(3) transformation matrix 0Tc. [00139] For the force sensing model, we want to express the deflection of the center piece in terms of the twist Δξ ∈ IR6. We can extract the deflection twist from 0Tc using:

Δξ = log(0Tc)∨ ∈ IR6 (17)
[00140] Where log(∙): → se(3) is the matrix logarithm that takes an SE(3) transformation matrix and returns a matrix se(3) twist and (∙): se(3) → IR6 takes a matrix se(3) twist and returns a twist in vector form. [00141] For simplicity, we use the estimation method described above to fit a matrix A ∈ IR6×24 that can be used to estimate the deflection twist Δξ from a stacked vector of magnetic flux densities from all eight sensors b^ = ^bT T T ଶସ ^, … , b଼൧ ∈ IR : ∆^^ = Ab^ (18) Attorney Docket No.093386-0040-WO01 [00142] To find this matrix, we experimentally measure n different magnetic flux density vectors using the method described below and put them in the columns of a matrix ^^ = ^b^^, … , b^^൧ ∈ IRଶସ×^. Using the method described above, we then find the n deflection twists that correspond to the columns of B and put them in the columns of the matrix Ξ = [∆^^^, … ,∆^^^] ∈ IR^×^. Using Ξ and B, eq. (12) can be rewritten using all n measurements of b^: Ξ = AB (19) Using this equation, we can find A using: A = ΞBା (20) where B+ is the Moore-Penrose pseudoinverse of B. [00143] Using Hooke’s law, the deflection twist Δξ can be used to find the wrench w∈IR^ applied to the sensor center piece using the stiffness matrix K ∈ IR^×^: w = K∆ξ (21) By substituting eq. (18) in for Δξ, we can find w using the magnetic flux density measurements b ^ : w = KAb^ (22) In the experiments described below, we also collected n measurements of the applied wrench. Similar to above, we create a matrix whose columns are the wrench measurements W = [w , ^] ^×^ ^ … , w ∈ IR . W = KAB (23) We can now find the stiffness matrix using: K = W(AB)+ (24) [00144] SENSITIVITY AND ERROR ANALYSIS Attorney Docket No.093386-0040-WO01 [00145] We aim to establish theoretical bounds on the uncertainty in the estimated wrench given the uncertainty in the magnetic flux density. 
In this initial feasibility study, we consider only the uncertainty due to magnetic flux density measurement and ignore the contribution of sensor geometry and material properties of silicone to the uncertainty in the wrench estimation. [00146] Given eq. (22), the deviation in the wrench estimation δw that corresponds to an unmodeled measurement error in magnetic flux density δb ^ is given simply by: δw=KAδb ^ (25) [00147] In many cases, we do not know the exact value of δb ^ . We are therefore more interested in establishing bounds on the norm of δw. Therefore, we write the following inequality: ‖δw‖ ≤ ‖KA‖ฮδb^ฮ (26) where ∥KA∥ is the spectral or 2-norm of KA. Given that the 2-norm of a matrix equals its maximum singular value, this equation can be rewritten as: ‖δw‖ ≤ ^^max(KA)ฮδb^ฮ (27) [00148] Therefore, the
Figure imgf000031_0001
b ^ is bounded by the maximum singular value σmax(KA). [00149] For tool tip forces, the amplification of measurement error in b ^ ^ and b ^ ଶ are bounded by the maximum singular value of [Ad1K1A1 Ad2K2A2 ] where Ad1 and Ad2 are the adjoint transformations corresponding with transforming wrenches from the tool tip to the frames of sensors 1 and 2. [00150] EXPERIMENTS [00151] To verify the sensor design and force sensing model, a number of experiments were performed which are described below. First, the relationship between individual displacement of the magnets based on magnetic field strength was determined experimentally. Then the sensor was calibrated with a robust combination of forces and torques to populate the stiffness matrix, K Attorney Docket No.093386-0040-WO01 eq. (22). Finally, the force sensing model was verified by applying a set of known forces/torques to the sensor and comparing the sensed force/torque to the ground truth. [00152] As noted above, we assume a linear relationship between the measured magnetic flux density and position of the magnet with respect to the sensor. The MLX90393 magnetometer is able to detect magnetic flux density along three axes (X, Y, and Z). We first verified that the change in magnetic flux for each dimension is sufficiently sensitive to a positional change in that dimension (i.e., a change in the x-direction corresponds to a change in the x-component of flux density). The following experiment was performed to determine the relationship between a change in position and the corresponding change in magnetic flux. [00153] The setup was comprised of a motorized linear XYZ-stage robot, a 3D printed mount for the Hall-effect sensor, a magnet embedded in silicone housing and a Teensy4.0 to read the data serially. The data was recorded for post-processing using the serial port logging feature of the open source software puTTy. 
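By way of a non-limiting numerical sketch, the SVD-based pose estimation of eqs. (12)-(16), the least-squares calibration of eqs. (20) and (24), and the sensitivity bound of eq. (27) may be implemented as follows. The function names and array layouts are illustrative assumptions:

```python
import numpy as np

def estimate_center_pose(p_world, p_center):
    """SVD point registration (eqs. (12)-(16)): find R, p such that
    p_world[i] ~= R @ p_center[i] + p for the eight magnet positions."""
    P0, Pc = np.asarray(p_world, float), np.asarray(p_center, float)
    c0, cc = P0.mean(axis=0), Pc.mean(axis=0)    # centroids, eq. (12)
    H = (Pc - cc).T @ (P0 - c0)                  # cross-covariance, eq. (13)
    U, _, Vt = np.linalg.svd(H)                  # eq. (14)
    R = Vt.T @ U.T                               # eq. (15)
    if np.linalg.det(R) < 0:                     # reflection guard for Arun's method
        Vt[-1] *= -1.0
        R = Vt.T @ U.T
    return R, c0 - R @ cc                        # translation, eq. (16)

def calibrate(B, Xi, W):
    """Fit A = Xi B+ (eq. (20)) and K = W (A B)+ (eq. (24)).
    B: 24 x n flux readings, Xi: 6 x n twists, W: 6 x n wrenches."""
    A = Xi @ np.linalg.pinv(B)
    K = W @ np.linalg.pinv(A @ B)
    return A, K

def wrench_error_bound(K, A, flux_error_norm):
    """Worst-case wrench error: sigma_max(KA) * ||db|| (eq. (27))."""
    return np.linalg.svd(K @ A, compute_uv=False)[0] * flux_error_norm
```

On synthetic data generated from a known pose and known linear maps, the registration recovers the pose and the calibrated maps reproduce the training wrenches, which is a useful sanity check before applying the pipeline to measured data.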
The magnet is mounted to the end effector of the Cartesian robot which is zeroed by aligning the magnet on the center of the Hall-effect chip as shown in FIG.24. [00154] The process includes moving the robot end effector along each axis and collecting magnetic field readings at a total of thirty discrete points. To determine the mapping in the Z-axis, the robot moved the magnet away in 0.2mm increments within the range [1, 3]mm of distance between the magnet and the chip. Similarly, in the X and Y dimensions, the data is collected in the range [−1, 1]mm with 0.2mm increments in their respective axes. [00155] The resulting mapping between position and magnetic flux density along each of the three axes is shown in FIG.25. The results demonstrate that the relationship between magnetic field strength and position is approximately linear in each dimension for the anticipated range. Therefore, eq. (7) can be written with M populated by the slopes and o by the y-intercepts from the linear regression model of each curve:

p = Mb + o (28)

where M is the diagonal matrix of fitted slopes (0.4423 and 0.3678 for the first two axes) and o is the vector of fitted y-intercepts (−22 and −18 for the first two axes).
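By way of a non-limiting sketch, the linear regression that populates M and o in eq. (28) may be performed by ordinary least squares on the sweep data. The function name and the bias-column formulation are illustrative:

```python
import numpy as np

def fit_flux_to_position(b_samples, p_samples):
    """Least-squares fit of the linear model p = M b + o (eq. (7)) from
    stage-sweep data.  b_samples and p_samples are n x 3 arrays of flux
    readings and commanded magnet positions; returns M (3x3) and o (3,)."""
    b = np.asarray(b_samples, float)
    p = np.asarray(p_samples, float)
    X = np.hstack([b, np.ones((len(b), 1))])      # append a bias column
    coef, *_ = np.linalg.lstsq(X, p, rcond=None)  # solves X @ coef ~= p
    return coef[:3].T, coef[3]                    # M, o
```

When the underlying relationship is exactly linear, the fit recovers the slope matrix and offset vector to numerical precision; with measured data it returns the best least-squares approximation.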
[00156] The goal of the calibration procedure is to construct the stiffness matrix, K, in order to relate the magnetic flux readings to force, eq. (22). The outer shell of the force sensor is fixed to the end effector of the Meca500 (Mecademic) robot. A rod which has an integrated attachment for calibration weights is rigidly connected to the inner ring of the force sensor. Masses of 50g and 200g were used to calibrate the sensor. The robot was commanded to put the sensor in 193 different poses to sample a robust combination of forces and torques across all six axes (FIG.26). The mean of a hundred measurements from each Hall-effect sensor were taken at each pose to reduce the effect of noise in the measurements. [00157] From the Hall-effect sensor readings and the method described above, the deflection twist of the sensor frame is computed for each pose. These twists make up the Ξ matrix which is substituted into eq. (20) with the Hall-effect readings to generate A, which estimates the deflection twist directly from the set of magnetic flux measurements. [00158] To determine the wrenches applied to the force sensor at each pose, they must first be defined in world frame. The position of the center of mass of the calibration weight is given by the direct kinematics of the robot and a static transform between the end effector frame {ee} and the center of mass frame {ω}, which is measured from CAD, eq. (29):

[0pω; 1] = 0Tee [eepω; 1] (29)

[00159] The orientation of the {ω} frame is aligned with the world frame so that the force applied at {ω} is trivially wω = [0, 0, −mg, 0, 0, 0]T. The resulting force felt by the force sensor is calculated by multiplying the ω force by an adjoint transpose:

wC = AdT wω (30)

where the adjoint matrix is given by eq. (31) for the rotation and position between the sensor frame and the calibration weight frame, ωRC and ωpC, respectively.
Ad = [ωRC, 0; ωp̂C ωRC, ωRC] (31)

where ωp̂C denotes the skew-symmetric cross-product matrix of ωpC. [00160] Now that the deflection twist and the wrench on the sensor are known for each pose, the stiffness matrix K can be computed from eq. (24), yielding the calibrated 6×6 stiffness matrix of eq. (32) (entries scaled by 10³). [00161] To characterize the error amplification of the calibrated sensor, the singular values of KA are investigated. For forces, σmax = 6.07×10−3 N/μT, σmin = 2.88×10−3 N/μT and the isotropy index is σmin/σmax = 0.47. As for the torques, σmax = 2.26×10−3 Nm/μT, σmin = 1.48×10−3 Nm/μT and the isotropy index is σmin/σmax = 0.65. [00162] The purpose of the validation experiment is to verify the accuracy of the force sensor by comparing a known applied force to the measured force. The setup was identical to that used for calibration, except a different calibration weight was used, specifically, 100g was chosen. As in the calibration procedure, the sensor is held in different poses and the magnetic flux density from each Hall-effect sensor is recorded. Using the method described above, the deflection twist was generated from the data, then the measured force by the sensor was computed using eq. (22). The ground truth force/torque values applied to the center were calculated according to the method previously described. The measured force was compared to the ground truth as shown in FIG.27. The root mean square error (RMSE) over all sampled poses is reported in Table I.

TABLE I: Root mean square error of force and torque measurements for each axis (quantities Fx, Fy, Fz, Mx, My, Mz).

[00163] An overview of the sensor characteristics is provided. Taking the Euclidean norm of the reported RMS errors for force and torque can summarize the overall error as 0.45N and 0.014Nm, respectively. Though not validated experimentally, the estimated range of our sensor is ±50N in the X, Y directions, ±20N in the Z direction, and ±0.2Nm for all torques. The estimated range of the force sensor is computed by multiplying the stiffness for a single dimension (elements on the diagonal of K) by the maximum possible deformation in that dimension. For the X and Y directions, the maximum possible deformation is 6mm and in the Z direction it is 3mm. [00164] Non-transitory Computer Readable Storage Medium [00165] In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked computer. In further embodiments, a computer readable storage medium is a tangible component of a computer. In some embodiments, a computer readable storage medium is optionally removable from a computer. In some embodiments, a computer readable storage medium includes, by way of non-limiting examples, compact disc read-only memories (CD-ROMs), digital versatile discs (DVDs), flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, cloud computing systems and services, and the like. In some cases, the program and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media. [00166] Computer Program [00167] In some embodiments, the platforms, systems, media, and methods disclosed herein include at least one computer program, or use of the same.
A computer program includes a sequence of instructions, executable in an electronic processor (e.g., the electronic processor 1012) of the computer, written to perform a specified task. Computer readable instructions may be implemented as program modules, such as functions, objects, application programming interface (API), data structures, and the like, that perform particular tasks or implement particular abstract data types. In light of the disclosure provided herein, those of skill in the art will recognize that a computer program may be written in various versions of various languages. [00168] The functionality of the computer readable instructions may be combined or distributed as desired in various environments. In some embodiments, a computer program comprises one sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations. In various embodiments, a computer program includes one or more software modules. In various embodiments, a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or add-ons, or combinations thereof. [00169] Machine Learning [00170] In some embodiments, machine learning algorithms are employed to build a model to classify particles based on a dataset(s). Examples of machine learning algorithms may include a support vector machine (SVM), a naïve Bayes classification, a random forest, a neural network, deep learning, or other supervised learning algorithm or unsupervised learning algorithm for classification and regression. The machine learning algorithms may be trained using one or more training datasets.
For example, previously received location or user data may be employed to train various algorithms. Moreover, as described above, these algorithms can be continuously trained/retrained using real-time user data as it is received. In some embodiments, the machine learning algorithm employs regression modeling, where relationships between predictor variables and dependent variables are determined and weighted. [00171] Data Stores [00172] In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more data stores. Data stores include repositories for persistently storing and managing collections of data. Types of data store repositories include, for example, databases and simpler store types. Simpler store types include files, emails, and so forth. In some embodiments, a database is a series of bytes that is managed by a database management system (DBMS). In various embodiments, suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object-oriented databases, object databases, entity-relationship model databases, associative databases, and extensible markup language (XML) databases. Further non-limiting examples include structured query language (SQL) databases such as PostgreSQL, MySQL, Oracle, DB2, and Sybase. In some embodiments, a database is cloud computing based. [00173] Standalone Application [00174] In some embodiments, a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process (e.g., not a plug-in). Standalone applications are often compiled. 
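As a minimal, self-contained illustration of the regression modeling described in the Machine Learning section above — relationships between predictor variables and a dependent variable are determined and weighted — the sketch below fits weights to hypothetical skill-score data by ordinary least squares. The feature names and weight values are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical predictor variables per trial: force RMS, gaze dispersion, path length.
X = rng.normal(size=(200, 3))
true_w = np.array([-2.0, -1.0, -0.5])  # illustrative weights; larger feature values lower the score
y = X @ true_w + 80.0 + rng.normal(scale=0.1, size=200)  # noisy skill scores

# Determine and weight the relationships via ordinary least squares.
A = np.column_stack([X, np.ones(len(X))])  # append an intercept column
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w)  # close to [-2.0, -1.0, -0.5, 80.0]
```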
A compiler is a computer program that transforms source code written in a programming language into a lower-level form, such as assembly language or binary machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, Java™, Lisp, Python™, Visual Basic, and VB .NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program. In some embodiments, a computer program includes one or more executable compiled applications. [00175] Software Modules [00176] In some embodiments, the systems and methods disclosed herein include software, server, or database modules. Software modules are created using machines, software, and languages. In various embodiments, a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof. In further various embodiments, a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof. In various embodiments, the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application. In some embodiments, software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on cloud computing platforms. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location. 
[00177] Furthermore, the modules, processes, systems, and sections may be implemented as a single processor or as a distributed processor. Further, it should be appreciated that the steps mentioned above may be performed on a single or distributed processor (single and/or multi-core, or cloud computing system). Also, the processes, system components, modules, and sub-modules described in the various figures of and for the embodiments above may be distributed across multiple computers or systems or may be co-located in a single processor or system. Example structural embodiment alternatives suitable for implementing the modules, sections, systems, means, or processes described herein are provided below. [00178] The modules, processors, or systems described above may be implemented as a programmed general purpose computer, an electronic device programmed with microcode, a hard-wired analog logic circuit, software stored on a computer-readable medium or signal, an optical computing device, a networked system of electronic and/or optical devices, a special purpose computing device, an integrated circuit device, a semiconductor chip, and/or a software module or object stored on a computer-readable medium or signal, for example. [00179] Particular implementations of the subject matter have been described. Other implementations, alterations, and permutations of the described implementations are within the scope of the following claims as will be apparent to those skilled in the art. While operations are depicted in the drawings or claims in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed (some operations may be considered optional), to achieve desirable results. 
[00180] As described above in the detailed description, reference is made to the accompanying drawings that form a part hereof, wherein like numerals designate like parts throughout, and in which is shown, by way of illustration, implementations that may be practiced. It is to be understood that other implementations may be utilized, and structural or logical changes may be made, without departing from the scope of the present disclosure. Therefore, the detailed description as described above is not to be taken in a limiting sense. [00181] All statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (e.g., any elements developed that perform the same function, regardless of structure). [00182] Since many modifications, variations, and changes in detail can be made to the described preferred embodiments of the invention, it is intended that all matters in the foregoing description and shown in the accompanying drawings be interpreted as illustrative and not in a limiting sense. Thus, the scope of the invention should be determined by the appended claims and their legal equivalents. [00183] Various operations may be described as multiple discrete actions or operations in turn, in a manner that is most helpful in understanding the subject matter disclosed herein. However, the order of description should not be construed to imply that these operations are necessarily order dependent. In particular, these operations may not be performed in the order of presentation. Operations described may be performed in a different order from the described implementation. 
Various additional operations may be performed, and/or described operations may be omitted in additional implementations.

Claims

CLAIMS

What is claimed is:

1. A system for training a user in soft-tissue surgeries, the system comprising:
a robotic arm assembly including
a frame,
an upper manipulation linkage assembly coupled to the frame,
a lower manipulation linkage assembly coupled to the frame,
one or more actuators coupled to the upper manipulation linkage assembly or the lower manipulation linkage assembly,
a tool stem assembly coupled to the upper manipulation linkage assembly and the lower manipulation linkage assembly, the tool stem assembly configured to support a surgical tool, and
a sensor assembly coupled to the tool stem assembly;
a computing device configured to
receive magnetic flux density data from the sensor assembly when a force is applied to the tool stem assembly,
determine a position and an orientation of the tool stem assembly based on the magnetic flux density data,
estimate forces of interaction with the upper manipulation linkage assembly and the lower manipulation linkage assembly based on electrical current of the one or more actuators,
calculate an estimation of force at a distal end of the surgical tool based on the position and the orientation of the tool stem assembly and the forces of interaction, and
process the estimation of force at the distal end of the surgical tool to provide real-time haptic feedback to the user via the surgical tool.

2. The system of claim 1, wherein the sensor assembly includes a plurality of magnets and a plurality of Hall-effect sensors configured to detect the magnetic flux density data.

3. 
The system of claim 1, wherein the tool stem assembly includes a stem body, a first sensor housing supported on the stem body, and a second sensor housing supported on the stem body, and wherein the sensor assembly includes a first sensor supported by the first sensor housing and a second sensor supported by the second sensor housing, and wherein the first sensor includes a first elastomeric matrix positioned between the first sensor housing and the stem body, and wherein the second sensor includes a second elastomeric matrix positioned between the second sensor housing and the stem body.

4. The system of claim 3, wherein the first sensor includes a plurality of magnets and a plurality of Hall-effect sensors within the first sensor housing, the plurality of magnets and the plurality of Hall-effect sensors configured to detect the magnetic flux density data.

5. The system of claim 4, wherein the second sensor includes a plurality of magnets and a plurality of Hall-effect sensors within the second sensor housing, the plurality of magnets and the plurality of Hall-effect sensors configured to detect the magnetic flux density data.

6. The system of claim 1, wherein the sensor assembly includes a six-axis sensor including an array of magnets and corresponding Hall-effect sensors to detect the magnetic flux density data when a force is applied to the surgical tool.

7. The system of claim 1, further comprising a pitch actuation assembly coupled to the upper manipulation linkage assembly, the pitch actuation assembly configured to move the upper manipulation linkage assembly about a horizontal axis.

8. The system of claim 1, further comprising a yaw actuation assembly coupled to the lower manipulation linkage assembly, the yaw actuation assembly configured to move the lower manipulation linkage assembly about a vertical axis.

9. 
The system of claim 1, further comprising an interchange device coupled to the tool stem assembly, the interchange device configured to lock the surgical tool in position relative to the tool stem assembly.

10. The system of claim 9, wherein the interchange device includes a locked state and a released state to interchange the surgical tool.

11. The system of claim 1, further comprising a virtual reality system configured to provide a simulated physical interaction with a soft-tissue organ, and wherein the user receives visual and auditory guidance from the virtual reality system while receiving the real-time haptic feedback.

12. The system of claim 11, wherein the virtual reality system is configured to incorporate eye gaze data of the user when providing the real-time haptic feedback.

13. The system of claim 12, wherein the computing device is further configured to input the estimation of force at the distal end of the surgical tool, the eye gaze data, and tool motion data to a skill assessment machine learning model to provide an output of a score related to a surgical training task performed by the user.

14. The system of claim 13, wherein the computing device is further configured to input video data from an endoscope to the skill assessment machine learning model.

15. The system of claim 13, wherein the tool motion data include curvature and torsion of a striction curve and a dual angle.

16. 
The system of claim 1, further comprising a second robotic arm assembly including
a second frame,
a second upper manipulation linkage assembly coupled to the second frame,
a second lower manipulation linkage assembly coupled to the second frame,
one or more actuators coupled to the second upper manipulation linkage assembly or the second lower manipulation linkage assembly,
a second tool stem assembly coupled to the second upper manipulation linkage assembly and the second lower manipulation linkage assembly, the second tool stem assembly configured to support a second surgical tool, and
a second sensor assembly coupled to the second tool stem assembly.
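The force-estimation chain recited in claim 1 — actuator currents converted to joint torques, then torques mapped to an estimated force at the tool's distal end through the manipulator kinematics — can be sketched as below. The torque constant and the Jacobian are hypothetical placeholders, not the disclosed linkage kinematics, and the Jacobian-transpose relation tau = J^T f is a standard static-equilibrium simplification rather than the patented method.

```python
import numpy as np

k_t = 0.05                             # N*m/A, hypothetical motor torque constant
currents = np.array([1.2, -0.4, 0.8])  # measured actuator currents (A)
tau = k_t * currents                   # estimated joint torques (N*m)

# Hypothetical 3x3 tip Jacobian (m) relating joint rates to tip velocity.
J = np.array([[0.10, 0.00, 0.02],
              [0.00, 0.12, 0.01],
              [0.03, 0.01, 0.08]])

# Static equilibrium tau = J^T f gives the estimated force at the distal end.
f_tip = np.linalg.solve(J.T, tau)
print(f_tip)
```

In practice the Jacobian would be evaluated from the measured position and orientation of the tool stem assembly, which is the role the magnetic flux density data plays in claim 1.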
PCT/US2024/056260 2023-11-15 2024-11-15 Systems and methods for accelerated skill acquisition and assessment for surgical training Pending WO2025106911A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363599460P 2023-11-15 2023-11-15
US63/599,460 2023-11-15

Publications (1)

Publication Number Publication Date
WO2025106911A1 true WO2025106911A1 (en) 2025-05-22

Family

ID=95743537


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170239821A1 (en) * 2014-08-22 2017-08-24 President And Fellows Of Harvard College Sensors for Soft Robots and Soft Actuators
US20190355278A1 (en) * 2018-05-18 2019-11-21 Marion Surgical Inc. Virtual reality surgical system including a surgical tool assembly with haptic feedback
US20200289230A1 (en) * 2019-03-15 2020-09-17 Ethicon Llc Robotic surgical controls with force feedback
WO2022119754A1 (en) * 2020-12-03 2022-06-09 Intuitive Surgical Operations, Inc. Systems and methods for assessing surgical ability
US20230286171A1 (en) * 2020-11-17 2023-09-14 Wuhan United Imaging Healthcare Surgical Technology Co., Ltd. Quick change interface for joint of robotic arm and robotic arm



Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24892385

Country of ref document: EP

Kind code of ref document: A1