
WO2024123647A1 - Intelligent surgical instrument with a combination method for stapling line detection and labelling - Google Patents


Info

Publication number
WO2024123647A1
Authority
WO
WIPO (PCT)
Prior art keywords
staple line
staple
firing
line segment
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2023/082239
Other languages
French (fr)
Inventor
Shan Wan
Bin Zhao
Ning Li
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Genesis Medtech USA Inc
Original Assignee
Genesis Medtech USA Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Genesis Medtech USA Inc filed Critical Genesis Medtech USA Inc
Priority to EP23901377.4A priority Critical patent/EP4626296A1/en
Priority to CN202380083425.0A priority patent/CN120344184A/en
Publication of WO2024123647A1 publication Critical patent/WO2024123647A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G06T 7/0012: Image analysis; inspection of images; biomedical image inspection
    • A61B 1/000096: Endoscopes with electronic signal processing of image signals during use, using artificial intelligence
    • A61B 1/313: Endoscopes for introducing through surgical openings, e.g. laparoscopes
    • G06N 20/10: Machine learning using kernel methods, e.g. support vector machines [SVM]
    • G06N 3/02: Neural networks
    • G06N 3/044: Recurrent networks, e.g. Hopfield networks
    • G06N 3/045: Combinations of networks
    • G06N 3/08: Learning methods
    • G06N 3/084: Backpropagation, e.g. using gradient descent
    • G06N 3/09: Supervised learning
    • G06V 10/764: Image or video recognition using classification, e.g. of video objects
    • G06V 10/82: Image or video recognition using neural networks
    • G06V 20/41: Higher-level, semantic clustering, classification or understanding of video scenes
    • A61B 17/072: Surgical staplers applying a row of staples in a single action
    • A61B 2090/065: Measuring instruments for measuring contact or contact pressure
    • G06V 2201/034: Recognition of patterns in medical or anatomical images of medical instruments

Definitions

  • the computer-readable medium can include volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other types of computer-readable medium or computer-readable storage devices.
  • the computer-readable medium can have computer instructions stored thereon, as disclosed.
  • the computer-readable medium may be a disc or a flash drive having the computer instructions stored thereon.
  • the console 600 of Figure 6 can include additional components not shown in Figure 6, and some of the components shown in Figure 6 may be optional in certain embodiments.
  • FIG. 7 is a flow chart illustrating the exemplary steps performed by the console 300 of Figure 3.
  • the console receives firing force data from a stapler and real-time video captured by a laparoscope during a medical procedure, such as surgery (Step 701).
  • the console can locate a staple line from an image from the real-time video (Step 702). This can be done using a neural network.
  • the located staple line can be compared with previously detected staple lines to recognize a newly generated staple line (Step 703). If it is the first firing of the stapler, the entire staple line can be regarded as a new staple line.
  • the console can then track each staple line segment (each segment corresponds to an individual firing of the stapler) using a neural network (Step 704).
  • the console can determine the location of each staple line in every frame of the real-time video where the staple line is visible.
  • the console associates the most recent staple firing curve generated from the firing force data received from the stapler with the new staple line segment once the new staple line segment is determined (Step 705).
  • each firing curve of the stapler is associated with a staple line segment identified from the real-time video.
  • the console may use a convolutional neural network (“CNN”) built to take the matched firing curve and staple line segment image as inputs and classify the input into one of several categories such as a) normal staple firing; b) abnormal staple firing with lack of tissue clamped; c) abnormal staple firing with possible missing staples; and d) abnormal staple firing with tissue abnormality.
  • the category can be output to the user (Step 707).
  • the console may also record and classify the visual properties of the clamping tissue, the anvil location, and the cartridge size.
  • the visual properties include organ/tissue classification, shape, abnormality, relative thickness, color, relative color variation, and surface vein structure. This data will be used to train a machine learning model that can predict pressure changes based on the tissue and recommend an appropriate cartridge size for that specific tissue and anvil location.
  • the console may also collect pressure sensor data and compare it to preset threshold values during the clamping process.
  • the clamping process includes two phases: closing the jaw and a resting period. Average tissue thickness is highly correlated with the maximum pressure applied to the tissue during clamping. If the maximum pressure is too high, the tissue is relatively thick for that cartridge; if the maximum pressure is too low, the tissue is relatively thin for that cartridge. In either case, the system will recommend a cartridge change.
  • the console may also collect and analyze the shear force on the tissue to recommend the stapler repositioning based on the variation in the pressure on the cartridge.
  • One reason for not achieving a proper B-shaped staple form is the human factor. Doctors may apply unnecessary strain or tension on the tissue when applying the staple. If the pressure distribution is not uniform, the intelligent system may recommend stapler repositioning, including but not limited to rotation or translation, to minimize the pressure distribution variation and achieve a better staple form.
  • a surgical stapler may be commercialized independently or in combination with an artificial intelligence (“AI”) processing unit and imaging system based on embodiments of this disclosure.
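The clamping-pressure check and the repositioning recommendation described above could be sketched as follows; the threshold values, units, and uniformity metric are illustrative assumptions, not parameters from this disclosure:

```python
def check_clamping_pressure(max_pressure, low=20.0, high=60.0):
    """Compare the maximum clamping pressure against preset thresholds.
    The threshold values (and implied kPa units) are illustrative
    assumptions for the sketch."""
    if max_pressure > high:
        return "recommend cartridge change (tissue relatively thick)"
    if max_pressure < low:
        return "recommend cartridge change (tissue relatively thin)"
    return "pressure within range"

def needs_repositioning(readings, max_cv=0.2):
    """Flag an uneven pressure distribution along the cartridge using
    the coefficient of variation (spread / mean); the metric and the
    cut-off are assumptions for illustration."""
    mean = sum(readings) / len(readings)
    var = sum((r - mean) ** 2 for r in readings) / len(readings)
    return (var ** 0.5) / mean > max_cv

print(check_clamping_pressure(75.0))   # recommend cartridge change (tissue relatively thick)
print(needs_repositioning([10.0, 45.0, 50.0, 12.0]))  # True: uneven distribution
```

In a real system the readings would come from the sensor array in the anvil or cartridge, and the thresholds would be preset per cartridge size.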

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Surgery (AREA)
  • Multimedia (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Optics & Photonics (AREA)
  • Databases & Information Systems (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Signal Processing (AREA)
  • Quality & Reliability (AREA)
  • Surgical Instruments (AREA)
  • Endoscopes (AREA)

Abstract

A surgical instrument or system that combines local sensing, computer vision, and artificial intelligence algorithms to detect a staple line and its possible abnormalities, and integrates the result into laparoscopic or open surgical video through an AI data processing unit.

Description

INTELLIGENT SURGICAL INSTRUMENT WITH A COMBINATION
METHOD FOR STAPLING LINE DETECTION AND LABELLING
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This patent application claims the benefit of U.S. Provisional Patent Application No. 63/385,991, filed on December 4, 2022, the entire content of which is incorporated herein by reference.
BACKGROUND OF THE INVENTION
[0002] Surgical staplers have been instrumental tools for open and laparoscopic surgical procedures. Particularly since the commercialization of laparoscopic video in the 1980s, the endoscopic stapler/cutter has emerged as the minimally invasive solution for a wide spectrum of complex general, colorectal, GYN, bariatric, and thoracic procedures. Until recent years, mechanical surgical staplers were open-loop devices whose feedback relied completely on surgeons’ judgment and experience.
[0003] Staple line leak in surgery is a challenge, and it is more prevalent in sleeve gastrectomy. A properly formed B-shaped staple is correlated with a low leakage rate. One factor that may improve the B-shaped staple form is applying optimum pressure on the tissue at the time of stapling. If the pressure is too high, meaning the clamped tissue is too thick, the staple cannot close the loop to form the B shape. If the pressure is too low, meaning the clamped tissue is too thin, the grip would be low and cause tissue staking. The pressure on the tissue also depends on how long pressure is applied, since the tissue dehydrates under pressure. As a result, quantifying the tissue pressure during clamping and firing may inform the decision-making process and improve the staple form. Another factor that may improve staple form is proper usage of the stapler at the time of surgery. Proper use includes, but is not limited to, selecting the correct cartridge size and applying the optimal local shear forces to the tissue.
SUMMARY OF THE INVENTION
[0004] Recent advances in surgical fields include adding power, basic and advanced sensing functions, and wireless data communication capabilities to staplers. These additional features can be designed and implemented in the stapler, and connecting the stapler to a computation unit extends it from a discrete device into a connected intelligent system. As a result, the system improves ease of use and tool selection and, in certain circumstances, stapling consistency, by understanding the force to close and the force to fire and by dynamically providing force feedback.
[0005] Embodiments of this disclosure describe an embedded sensor or an array of sensors at the end effector of the stapler, which can be embedded in the anvil or the cartridge. The sensors quantify the tissue pressure during clamping and firing; the data is used in further computation to provide users with operational guidance that improves the form of the staple line. The sensor data is also used to calculate the local shear forces on the tissue caused by the applied force and to guide users in selecting the correct cartridge size.
[0006] Embodiments of this disclosure describe a surgical system including a surgical stapler with force detection and control functions, and its interface with a data receiving and processing unit. The data receiving and processing unit can process the feedback data from the surgical system together with real-time laparoscopic or open surgical video. As a result, an enhanced real-time video with a labeled risk zone can be displayed on the screen to provide information for operators.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Figure 1 illustrates a firing force curve during a stapling process in a medical procedure.
[0008] Figure 2a illustrates an exemplary image from a video showing a staple line.
[0009] Figure 2b illustrates another exemplary image from an enhanced real time video showing a staple line.
[0010] Figure 3a illustrates an exemplary console connecting a stapler, laparoscope, and monitor, according to an embodiment of the disclosure.
[0011] Figure 3b illustrates the exemplary components of the console of Figure 3a, according to an embodiment of the disclosure.
[0012] Figure 4 is an exemplary image from the real-time video that shows a detected staple line, according to an embodiment of the disclosure.
[0013] Figures 5a - 5d illustrate images showing the recognition of staple line segments from images from real-time video of surgery, according to an embodiment of the disclosure.
[0014] Figure 6 illustrates the exemplary hardware components of the console of Figure 3b, according to an embodiment of the disclosure.
[0015] Figure 7 is a flowchart illustrating the exemplary steps in the method of detecting staple lines during a medical procedure, according to an embodiment of the disclosure.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
[0016] Embodiments of the present disclosure will be described below in conjunction with the accompanying drawings, but it should be appreciated by those skilled in the art that the embodiments described below are exemplary rather than exhaustive. They are only used to illustrate the present disclosure and should not be regarded as limiting its scope. All other embodiments obtained by those of ordinary skill in the art without creative effort, based on the embodiments disclosed herein, shall fall within the scope of the present disclosure.
[0017] An example is shown in Figure 1, which illustrates the firing force curve 100 during a stapling process of a medical procedure. During staple firing there is a distinct drop in firing force 102 in this illustration. Although useful, this information (i.e., the distinct drop in firing force) is very hard to interpret alone for clinical judgment. When combined with real-time video of the medical procedure, however, the imaging of the staple line and its interaction with specific tissue can be used as input data. It is then possible to determine the reason for the drop in firing force, which could be related to a lack of tissue clamped, missing staples, or a tissue abnormality. In these cases, the video and data processing unit can calculate the abnormality parameters and fuse the real-time video with various labels, such as different colors or patterns.
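As a rough sketch of how such a drop could be flagged automatically, the following compares consecutive samples of a firing force curve against a threshold; the function name, threshold value, and synthetic curve are illustrative assumptions, not parameters from this disclosure:

```python
def detect_force_drops(force, min_drop=5.0):
    """Return the indices where the firing force falls sharply between
    consecutive samples.  `force` is a sequence of readings sampled
    along the firing stroke; `min_drop` is an illustrative threshold
    in the same units as the input."""
    drops = []
    for i in range(1, len(force)):
        if force[i - 1] - force[i] >= min_drop:
            drops.append(i)
    return drops

# A synthetic firing curve with one distinct drop at index 5.
curve = [10.0, 12.0, 15.0, 18.0, 20.0, 9.0, 11.0, 12.0, 13.0]
print(detect_force_drops(curve))  # [5]
```

A production system would smooth the signal and normalize the threshold per cartridge; flagged indices would then be passed, together with the video, to the downstream classification step.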
[0018] Figure 2b illustrates an example of part of staple line 202 being labeled in an enhanced real-time video, according to an embodiment of the disclosure. This alerts the surgeons to perform necessary surgical maneuvers and remedies if the surgeon deems the drop in firing force of the staple (and the effect it caused) high risk. In contrast, Figure 2a is an image showing staple line 200 from the real-time video of the operation without any labeling.
[0019] As an example application, embodiments of the disclosed system can be used in bariatric or colorectal surgery, where the system can label staple lines with extreme tissue thickness, or firing over unwanted tissue, where a higher leakage risk is present. A surgeon can further evaluate based on the alert (e.g., a labeled staple line section) to decide if further suturing or enhancement is needed.
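The label fusion described above amounts to overwriting staple-line pixels in a video frame with a highlight color. A minimal sketch, with the frame represented as nested Python lists and the coordinates and color chosen purely for illustration:

```python
def label_segment(frame, pixels, color=(255, 0, 0)):
    """Overwrite the given (row, col) pixels of an RGB frame with a
    warning color, mimicking the fused label of Figure 2b.  The pixel
    coordinates and color are illustrative assumptions."""
    for r, c in pixels:
        frame[r][c] = color
    return frame

# 3x3 black frame; mark a short diagonal "staple line segment" in red.
frame = [[(0, 0, 0) for _ in range(3)] for _ in range(3)]
label_segment(frame, [(0, 0), (1, 1), (2, 2)])
```

In practice the pixel list would come from the staple-line detector, and the overlay would be alpha-blended onto each video frame rather than hard-overwritten.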
[0020] Figure 3a illustrates an embodiment of the disclosed system. The system can include a console 300 connected to a stapler 302, a laparoscope 304, and a monitor 306. The connections between the console 300 and the other devices 302, 304, and 306 can be wired or wireless, such as, but not limited to, a Wi-Fi, local area network (LAN), or Bluetooth connection. The stapler 302 can transmit its firing force data to the console 300. The laparoscope 304 can transmit the real-time video it captures to the console 300. The console can process the firing force data and the real-time video and output alerts on the monitor 306.
[0021] In one embodiment as illustrated in Figure 3b, the console 300 can include a number of modules: an object detection module 308, a new staple line recognition module 310, a tracking module 312, a firing curve and staple line segment association module 314, and a classification module 316.
[0022] In this embodiment, the object detection module 308 can use a neural network (e.g., a Region-based Convolutional Neural Network (“RCNN”) or You Only Look Once (“YOLO”)) to locate a staple line in an image. Figure 4 provides an exemplary image from the real-time video showing the detected staple line 402.
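A detector such as an RCNN or YOLO typically returns candidate boxes with class labels and confidence scores, and a post-processing step selects the best staple-line candidate. The tuple format and threshold below are assumptions for illustration, not the API of any specific detector:

```python
def pick_staple_line(detections, score_thresh=0.5):
    """Given detector output as (label, score, box) tuples -- a
    hypothetical format, not a specific YOLO/RCNN API -- return the
    highest-scoring 'staple_line' box above the confidence threshold,
    or None if no candidate qualifies."""
    best = None
    for label, score, box in detections:
        if label == "staple_line" and score >= score_thresh:
            if best is None or score > best[1]:
                best = (label, score, box)
    return best[2] if best else None

# Synthetic detections: one tool box and two staple-line candidates.
dets = [("tool", 0.9, (0, 0, 40, 40)),
        ("staple_line", 0.4, (5, 5, 60, 20)),
        ("staple_line", 0.8, (10, 8, 70, 22))]
print(pick_staple_line(dets))  # (10, 8, 70, 22)
```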
[0023] Multiple staple firings can happen in one surgery. The new staple line recognition module 310 can compare the detected staple line with previously detected staple lines to recognize a newly generated staple line. If it is the first firing of the stapler, the new staple line recognition module 310 can regard the entire staple line as a new staple line.
[0024] Figure 5a is an image showing the first staple firing during surgery. Figure 5b illustrates the new staple line 502 of the first staple firing of Figure 5a as recognized by the new staple line recognition module 310. Figure 5c is an image showing a second staple firing during the surgery. The new staple line recognition module 310 can compare the image of Figure 5c with the image of Figure 5a to recognize a newly generated second staple line 504 (shown in Figure 5d).
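The comparison against previously detected staple lines could, for example, be implemented as an intersection-over-union (IoU) test between bounding boxes; the IoU threshold below is an illustrative assumption, not a value from this disclosure:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def is_new_staple_line(box, previous, iou_thresh=0.5):
    """A detection is 'new' when it overlaps no previously seen staple
    line beyond the (illustrative) IoU threshold.  On the first firing
    `previous` is empty, so any detected line counts as new."""
    return all(iou(box, p) < iou_thresh for p in previous)

print(is_new_staple_line((0, 0, 10, 10), []))                # True: first firing
print(is_new_staple_line((0, 0, 10, 10), [(1, 1, 11, 11)]))  # False: overlaps
```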
[0025] Referring back to Figure 3b, the tracking module 312 can utilize a neural network to track each staple line segment (each segment corresponds to an individual firing of the stapler). That is, the tracking module 312 can give the location of each staple line in every frame where the staple line is visible.
[0026] The firing curve and staple line segment association module 314 can associate the most recent staple firing curve with a new staple line segment once the new staple line segment is detected by the tracking module 312. Such association matches each firing curve (e.g., the firing curve of Figure 1) with a staple line segment (e.g., the staple line segments of Figure 5b and Figure 5d).
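The association logic could be sketched as a queue pairing each completed firing curve with the next newly detected segment. The matching-by-recency policy is inferred from the description above, and the class and method names are hypothetical:

```python
from collections import deque

class FiringAssociator:
    """Pairs each firing-force curve with the staple line segment that
    appears after it.  Matching the oldest unmatched curve to the next
    new segment is an assumption made for this sketch."""

    def __init__(self):
        self.pending_curves = deque()
        self.matches = []

    def on_firing_curve(self, curve):
        """Called when the stapler reports a completed firing curve."""
        self.pending_curves.append(curve)

    def on_new_segment(self, segment_id):
        """Called when the tracker detects a new staple line segment."""
        if self.pending_curves:
            self.matches.append((self.pending_curves.popleft(), segment_id))

assoc = FiringAssociator()
assoc.on_firing_curve("curve_1")
assoc.on_new_segment("segment_1")
print(assoc.matches)  # [('curve_1', 'segment_1')]
```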
[0027] The classification module 316 can use a convolutional neural network (“CNN”) built to take the matched firing curve and staple line segment image as inputs and classify the input into one of several categories such as a) normal staple firing; b) abnormal staple firing with lack of tissue clamped; c) abnormal staple firing with possible missing staples; and d) abnormal staple firing with tissue abnormality.
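The four outcome categories can be fixed as constants, as below. Where the patent uses a trained CNN over the paired firing curve and segment image, this sketch substitutes a trivial rule on the curve's peak force purely so the category mapping is concrete; the thresholds and the `staples_detected` flag are invented for illustration and are not the disclosed classifier.

```python
# The four classification outcomes, with a rule-based stand-in for the CNN.

NORMAL = "normal staple firing"
LACK_OF_TISSUE = "abnormal staple firing with lack of tissue clamped"
MISSING_STAPLES = "abnormal staple firing with possible missing staples"
TISSUE_ABNORMALITY = "abnormal staple firing with tissue abnormality"

def classify_firing(firing_curve, staples_detected=True):
    """Map a (normalized) firing curve to one of the four categories."""
    peak = max(firing_curve)
    if not staples_detected:
        return MISSING_STAPLES
    if peak < 0.2:   # almost no resistance: little tissue clamped
        return LACK_OF_TISSUE
    if peak > 0.9:   # unusually high resistance: possible tissue abnormality
        return TISSUE_ABNORMALITY
    return NORMAL
```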
[0028] The object detection module 308, new staple line recognition module 310, tracking module 312, firing curve and staple line segment association module 314, and classification module 316 of Figure 3b can be implemented in software, firmware, hardware, or any combination of the three.
[0029] Figure 6 illustrates the exemplary hardware components of the console of Figure 3b, according to an embodiment of the disclosure. The console 600 can include, among other things, an I/O interface 612, a processing unit 614, a storage unit 616, a memory module 618, and a user interface 620, all connected via connection 622.
[0030] I/O interface 612 can be configured for communicating with a laparoscope and a stapler that are connected to the console 600. The I/O interface 612 can also communicate with a monitor to display alerts on the monitor. In some embodiments, separate I/O interfaces can be used for communicating with various devices, such as the laparoscope, the stapler, and the monitor. Communication can be via any suitable wired or wireless channel. Processing unit 614 may be configured to receive data from the laparoscope and the stapler and process the data to detect and label staple lines. Processing unit 614 may also be configured to generate and transmit signals, via I/O interface 612, to display alerts on the monitor.
[0031] Storage unit 616 and/or memory module 618 may be configured to store one or more computer programs that may be executed by processing unit 614 to perform functions of the console 600. For example, the various exemplary modules of Figure 3b can reside in storage unit 616 and/or memory module 618. Storage unit 616 and memory module 618 can be non-transitory computer-readable media storing instructions which, when executed, cause one or more processors 614 to perform the methods discussed in the various embodiments of the disclosure.
[0032] The computer-readable medium can include volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other types of computer-readable media or computer-readable storage devices. The computer-readable medium can have computer instructions stored thereon, as disclosed. In some embodiments, the computer-readable medium may be a disc or a flash drive having the computer instructions stored thereon.
[0033] It should be understood that console 600 of Figure 6 can include additional components that are not shown in Figure 6 and that some of the components shown in Figure 6 may be optional in certain embodiments.
[0034] Figure 7 is a flow chart illustrating the exemplary steps performed by the console 300 of Figure 3b. First, the console receives firing force data from a stapler and real-time video captured by a laparoscope during a medical procedure, such as surgery (Step 701). Next, the console can locate a staple line in an image from the real-time video (Step 702). This can be done using a neural network. The located staple line can be compared with previously detected staple lines to recognize a newly generated staple line (Step 703). If it is the first firing of the stapler, the entire staple line can be regarded as a new staple line.
[0035] The console can then track each staple line segment (each segment corresponds to an individual firing of the stapler) using a neural network (Step 704). That is, the console can determine the location of each staple line in every frame of the real-time video in which the staple line is visible. The console associates the most recent staple firing curve generated from the firing force data received from the stapler with the new staple line segment once the new staple line segment is determined (Step 705). Essentially, each firing curve of the stapler is associated with a staple line segment identified from the real-time video.
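The control flow of Steps 701–707 can be sketched as a single per-frame routine with each stage passed in as a plain function, so only the ordering is asserted. All signatures and the stage stubs are illustrative assumptions; each real stage would be a trained model or module as described above.

```python
# Minimal orchestration sketch of the Figure 7 flow for one video frame.

def process_frame(frame, firing_curve, state, detect, is_new, classify):
    """detection -> new-line check -> tracking/association -> classification."""
    box = detect(frame)                        # Step 702: locate staple line
    if box is None:
        return None
    if not is_new(box, state["previous"]):     # Step 703: new-line check
        return None
    state["previous"].append(box)              # Step 704: track the segment
    pair = (firing_curve, box)                 # Step 705: associate with curve
    return classify(pair)                      # Steps 706-707: classify/output
```

A second sighting of the same line falls out of the pipeline at Step 703, so each firing is classified once.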
[0036] The console may use a convolutional neural network (“CNN”) built to take the matched firing curve and staple line segment image as inputs and classify the input into one of several categories, such as a) normal staple firing; b) abnormal staple firing with lack of tissue clamped; c) abnormal staple firing with possible missing staples; and d) abnormal staple firing with tissue abnormality (Step 706). The category can be output to the user (Step 707).
[0037] The console may also record and classify the visual properties of the clamping tissue, the anvil location, and the cartridge size. The visual properties include organ/tissue classification, shape, abnormality, relative thickness, color, relative color variation, and surface vein structure. This data will be used to train a machine learning model that can predict the pressure changes based on the tissue and recommend appropriate cartridge size for that specific tissue and anvil location.
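One way to capture the visual properties listed above for later model training is a flat per-clamping record, as in the hypothetical sketch below. The field names and example values are illustrative assumptions about how such observations might be logged, not a disclosed schema.

```python
# Hypothetical training-set record of the clamping observations described
# above (tissue properties, anvil location, cartridge size).

from dataclasses import dataclass, asdict

@dataclass
class ClampingObservation:
    organ: str                     # organ/tissue classification
    shape: str
    abnormality: bool
    relative_thickness: float
    color: str
    relative_color_variation: float
    surface_vein_structure: str
    anvil_location: str
    cartridge_size_mm: int

# Example record with invented values, as it might be serialized for training.
obs = ClampingObservation(
    organ="stomach", shape="curved", abnormality=False,
    relative_thickness=0.6, color="pink", relative_color_variation=0.1,
    surface_vein_structure="dense", anvil_location="antrum",
    cartridge_size_mm=60,
)
```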
[0038] The console may also collect pressure sensor data and compare it to preset threshold values during the clamping process. The clamping process includes two phases: closing the jaw and a resting period. Average tissue thickness is highly correlated with the maximum pressure applied on the tissue during clamping. If the maximum pressure is too high, it implies that the tissue is relatively thick for that cartridge; if the maximum pressure is too low, it implies that the tissue is relatively thin for that cartridge. In either case, the system will recommend a cartridge change.
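The max-pressure check above reduces to comparing the peak of the clamping-pressure trace against per-cartridge thresholds. The threshold values and return labels in this sketch are illustrative assumptions; real thresholds would be preset per cartridge as the text describes.

```python
# Hypothetical cartridge-fit check from clamping pressure samples.

def recommend_cartridge(pressure_samples, low_threshold, high_threshold):
    """Compare peak clamping pressure with preset per-cartridge thresholds.

    Returns 'keep', or a recommendation to change to a cartridge suited for
    thicker or thinner tissue.
    """
    peak = max(pressure_samples)
    if peak > high_threshold:   # tissue relatively thick for this cartridge
        return "change: cartridge for thicker tissue"
    if peak < low_threshold:    # tissue relatively thin for this cartridge
        return "change: cartridge for thinner tissue"
    return "keep"
```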
[0039] The console may also collect and analyze the shear force on the tissue to recommend stapler repositioning based on the variation in the pressure on the cartridge. One reason for not achieving a proper B-shaped staple form is the human factor: doctors may apply unnecessary strain or tension on the tissue when applying the staples. If the pressure distribution is not uniform, the intelligent system may recommend stapler repositioning, including but not limited to rotation or translation, to minimize the pressure distribution variation and achieve a better staple form.
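A simple non-uniformity measure for the pressure distribution along the cartridge is the coefficient of variation (standard deviation over mean) of the readings. The patent does not specify the metric or a threshold; both are illustrative assumptions in this sketch.

```python
# Hypothetical pressure-uniformity check along the cartridge.

def needs_repositioning(pressures, max_cv=0.2):
    """Flag repositioning when the pressure distribution varies too much.

    Uses the coefficient of variation of the per-sensor pressure readings;
    `max_cv` is an invented threshold for illustration.
    """
    mean = sum(pressures) / len(pressures)
    if mean == 0:
        return False
    var = sum((p - mean) ** 2 for p in pressures) / len(pressures)
    cv = (var ** 0.5) / mean
    return cv > max_cv
```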
[0040] A surgical stapler may be commercialized independently or in combination with an artificial intelligence (“Al”) processing unit and imaging system based on embodiments of this disclosure.
[0041] Although embodiments of this disclosure have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of embodiments of this disclosure as defined by the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A system for stapling line detection during a medical procedure, the system comprising: an object detection module configured to detect a staple line from an image; a new staple line recognition module configured to compare the detected staple line with a previously detected staple line to recognize a newly generated staple line; a tracking module configured to track each staple line segment, each staple line segment corresponding to an individual firing of a stapler; a firing curve and staple line segment association module configured to associate a most recent staple firing curve with a new staple line segment after the new staple line segment is detected by the tracking module; and a classification module configured to take the associated staple firing curve and new staple line segment as input and classify the input into one of a plurality of categories.
2. The system of claim 1, wherein the object detection module is configured to use a neural network to locate the staple line from the image.
3. The system of claim 2, wherein the neural network comprises either a Region-based Convolutional Neural Network (“RCNN”) or a You Only Look Once (“YOLO”) neural network.
4. The system of claim 1, wherein the new staple line recognition module is configured to regard the detected staple line as a new staple line if the detected staple line is from a first firing of a stapler.
5. The system of claim 1, wherein the tracking module uses a neural network to track each staple line segment.
6. The system of claim 1, wherein the classification module uses a convolutional neural network (“CNN”) built to take the associated staple firing curve and new staple line segment as input and classify the input into one of a plurality of categories.
7. The system of claim 1, wherein the plurality of categories comprises (a) a normal staple firing, (b) an abnormal staple firing with lack of tissue clamped, (c) an abnormal staple firing with possible missing staples, and (d) an abnormal staple firing with tissue abnormality.
8. The system of claim 1, wherein the system is connected to a stapler and configured to receive firing force data of the stapler.
9. The system of claim 1, wherein the system is connected to a laparoscope and configured to receive a real-time video captured by the laparoscope during the medical procedure.
10. The system of claim 1, wherein the system is connected to a monitor and configured to output an alert on the monitor, and the alert is generated based on the category into which the input is classified.
11. A computer-implemented method of identifying a staple line, comprising: detecting a staple line from an image; comparing the detected staple line with a previously detected staple line to recognize a newly generated staple line; tracking each staple line segment, each staple line segment corresponding to an individual firing of a stapler; associating a most recent staple firing curve with a new staple line segment after the new staple line segment is detected; and taking the associated staple firing curve and new staple line segment as input and classifying the input into one of a plurality of categories.
12. The computer-implemented method of claim 11, further comprising using a neural network to locate the staple line from the image.
13. The computer-implemented method of claim 12, wherein the neural network comprises either a Region-based Convolutional Neural Network (“RCNN”) or a You Only Look Once (“YOLO”) neural network.
14. The computer-implemented method of claim 11, wherein comparing the detected staple line with a previously detected staple line to recognize a newly generated staple line comprises regarding the detected staple line as a new staple line if the detected staple line is from a first firing of a stapler.
15. The computer-implemented method of claim 11, wherein tracking each staple line segment comprises using a neural network to track each staple line segment.
16. The computer-implemented method of claim 11, wherein classifying the input into one of a plurality of categories comprises using a convolutional neural network (“CNN”) built to take the associated staple firing curve and new staple line segment as input and classify the input into one of a plurality of categories.
17. The computer-implemented method of claim 11, wherein the plurality of categories comprises (a) a normal staple firing, (b) an abnormal staple firing with lack of tissue clamped, (c) an abnormal staple firing with possible missing staples, and (d) an abnormal staple firing with tissue abnormality.
18. The computer-implemented method of claim 11, further comprising connecting to a stapler and receiving firing force data of the stapler.
19. The computer-implemented method of claim 11, further comprising connecting to a laparoscope and receiving a real-time video captured by the laparoscope.
20. The computer-implemented method of claim 11, further comprising connecting to a monitor and outputting an alert on the monitor, the alert generated based on the category into which the input is classified.
PCT/US2023/082239 2022-12-04 2023-12-04 Intelligent surgical instrument with a combination method for stapling line detection and labelling Ceased WO2024123647A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP23901377.4A EP4626296A1 (en) 2022-12-04 2023-12-04 Intelligent surgical instrument with a combination method for stapling line detection and labelling
CN202380083425.0A CN120344184A (en) 2022-12-04 2023-12-04 Intelligent surgical instrument and combined method for detecting and marking staple lines

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263385991P 2022-12-04 2022-12-04
US63/385,991 2022-12-04

Publications (1)

Publication Number Publication Date
WO2024123647A1 true WO2024123647A1 (en) 2024-06-13

Family

ID=91380120


Country Status (3)

Country Link
EP (1) EP4626296A1 (en)
CN (1) CN120344184A (en)
WO (1) WO2024123647A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025027493A1 (en) * 2023-08-01 2025-02-06 Covidien Lp System and method for machine learning method to verify consistent staple line delivery

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008082844A2 (en) * 2006-12-29 2008-07-10 Satiety, Inc. Devices and methods for placement of partitions within a hollow body organ
WO2019123082A1 (en) * 2017-12-21 2019-06-27 Ethicon Llc Staple instrument comprising a firing path display
US20200015905A1 (en) * 2018-07-16 2020-01-16 Ethicon Llc Visualization of surgical devices
US20210204873A1 (en) * 2020-01-06 2021-07-08 Covidien Lp Systems and methods for anastomosis leakage detection and prediction
WO2021136999A1 (en) * 2019-12-30 2021-07-08 Ethicon Llc Surgical system for overlaying surgical instrument data onto a virtual three dimensional construct of an organ





Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 23901377; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2025532573; Country of ref document: JP; Kind code of ref document: A)
WWE Wipo information: entry into national phase (Ref document number: 202380083425.0; Country of ref document: CN; Ref document number: 2025532573; Country of ref document: JP)
WWE Wipo information: entry into national phase (Ref document number: 2023901377; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2023901377; Country of ref document: EP; Effective date: 20250630)
WWP Wipo information: published in national office (Ref document number: 202380083425.0; Country of ref document: CN)
WWP Wipo information: published in national office (Ref document number: 2023901377; Country of ref document: EP)