WO2024123647A1 - Intelligent surgical instrument with a combination method for stapling line detection and labelling
- Publication number
- WO2024123647A1 (PCT/US2023/082239)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- staple line
- staple
- firing
- line segment
- neural network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/313—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/41—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B17/068—Surgical staplers, e.g. containing multiple staples or clamps
- A61B17/072—Surgical staplers, e.g. containing multiple staples or clamps for applying a row of staples in a single action, e.g. the staples being applied simultaneously
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/065—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
- G06V2201/034—Recognition of patterns in medical or anatomical images of medical instruments
Definitions
- Surgical staplers have been instrumental tools for open and laparoscopic surgical procedures. Particularly since the commercialization of laparoscopic video in the 1980s, the endoscopic stapler/cutter has emerged as the minimally invasive solution for a wide spectrum of complex general, colorectal, gynecologic, bariatric, and thoracic procedures. Until recent years, mechanical surgical staplers were open-loop devices whose feedback relied entirely on the surgeon's judgment and experience.
- Staple line leak is a challenge in surgery, and it is more prevalent in sleeve gastrectomy.
- A properly formed B-shaped staple is correlated with a low leakage rate.
- One factor that may improve the B-shaped staple form is applying optimum pressure on the tissue at the time of stapling. If the pressure is too high, meaning the clamped tissue is too thick, the staple cannot close the loop to form the B shape. If the pressure is too low, meaning the clamped tissue is too thin, the grip would be weak and cause tissue staking. The pressure on the tissue also depends on how long the pressure is applied, since the tissue dehydrates under pressure. As a result, quantifying the tissue pressure during clamping and firing may inform the decision-making process and improve the staple form.
- Another factor that may improve staple form is the proper usage of the stapler at the time of surgery. The proper use includes, but is not limited to, selecting the correct cartridge size and applying the optimal local shear forces to the tissue.
- Embodiments of this disclosure describe an embedded sensor, or an array of sensors, at the end effector of the stapler, which can be embedded in the anvil or the cartridge.
- The sensors quantify the tissue pressure during clamping and firing; the data are used in further computation to provide users with operation guidance that improves the form of the staple line.
- The sensor data are also used to calculate the local shear forces on the tissue caused by the applied force and to provide user guidance in selecting the correct cartridge size.
- Figure 1 illustrates a firing force curve during a stapling process in a medical procedure.
- Figure 2a illustrates an exemplary image from a video showing a staple line.
- Figure 2b illustrates another exemplary image from an enhanced real time video showing a staple line.
- Figure 3a illustrates an exemplary console connecting a stapler, laparoscope, and monitor, according to an embodiment of the disclosure.
- Figure 4 is an exemplary image from the real-time video that shows a detected staple line, according to an embodiment of the disclosure.
- Figures 5a - 5d illustrate images showing the recognition of staple line segments from images from real-time video of surgery, according to an embodiment of the disclosure.
- Figure 6 illustrates the exemplary hardware components of the console of Figure 3b, according to an embodiment of the disclosure.
- The video and data processing unit can calculate the abnormality parameters and fuse the real-time video with various labels, such as different colors or patterns.
- Figure 2b illustrates an example of part of staple line 202 being labeled in an enhanced real-time video, according to an embodiment of the disclosure. This alerts the surgeon to perform the necessary surgical maneuvers and remedies if the surgeon deems the drop in firing force of the staple (and the effect it caused) to be high risk.
- Figure 2a is an image showing staple line 200 from the real-time video of the operation without any labeling.
- Embodiments of the disclosed system can be used in bariatric or colorectal surgery, where the system can label staple lines with extreme tissue thickness, or firings over unwanted tissue where a higher leakage risk is present. A surgeon can further evaluate based on the alert (e.g., a labeled staple line section) to decide if further suturing or enhancement is needed.
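- As a rough illustration of the label fusion described above (not part of the disclosure; the coordinates, colors, and risk categories are illustrative assumptions), the following sketch overlays a colored polyline and text tag on a video frame along a detected staple line using OpenCV.

```python
# Illustrative sketch: overlaying a risk label on a laparoscope video frame
# along a detected staple line using OpenCV and NumPy.
import cv2
import numpy as np

def label_staple_line(frame, line_points, risk="high"):
    """Draw a colored polyline and a text tag over the staple line region.

    frame       : BGR image (H x W x 3) from the laparoscope video.
    line_points : list of (x, y) pixel coordinates along the staple line.
    risk        : hypothetical risk category controlling the overlay color.
    """
    color = (0, 0, 255) if risk == "high" else (0, 255, 0)  # red vs. green (BGR)
    pts = np.array(line_points, dtype=np.int32).reshape(-1, 1, 2)
    overlay = frame.copy()
    cv2.polylines(overlay, [pts], isClosed=False, color=color, thickness=6)
    # Blend the overlay with the original frame so the underlying tissue stays visible.
    labeled = cv2.addWeighted(overlay, 0.4, frame, 0.6, 0)
    x, y = line_points[0]
    cv2.putText(labeled, f"staple line: {risk} risk", (x, max(y - 10, 15)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, color, 2)
    return labeled
```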
- FIG. 3 illustrates an embodiment of the disclosed system.
- The system can include a console 300 connected to a stapler 302, a laparoscope 304, and a monitor 306.
- The connections between the console 300 and the other devices 302, 304, and 306 can be wired or wireless, such as, but not limited to, a Wi-Fi connection, a local area network (LAN) connection, or a Bluetooth connection.
- The stapler 302 can transmit its firing force data to the console 300.
- The laparoscope 304 can transmit the real-time video it captures to the console 300.
- The console can process the firing force data and the real-time video and output alerts on the monitor 306.
- The console 300 can include a number of modules: an object detection module 308, a new staple line recognition module 310, a tracking module 312, a firing curve and staple line segment association module 314, and a classification module 316.
- The object detection module 308 can use a neural network (e.g., a Region-based Convolutional Neural Network ("RCNN") or You Only Look Once ("YOLO")) to locate a staple line in an image.
- Figure 4 provides an exemplary image from the real-time video that shows the detected staple line as line 402.
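- A minimal sketch of how an object detection module of this kind could be realized, assuming a torchvision Faster R-CNN backbone fine-tuned for a single "staple line" class (the library choice, class count, and score threshold are assumptions, not the claimed implementation):

```python
# Illustrative sketch: a torchvision Faster R-CNN detector adapted to predict
# "staple line" boxes in a laparoscope video frame.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

def build_staple_line_detector(num_classes=2):  # background + staple line
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    # Replace the classification head so the detector predicts only our classes.
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

def detect_staple_lines(model, frame_tensor, score_threshold=0.5):
    """frame_tensor: float tensor of shape (3, H, W) scaled to [0, 1]."""
    model.eval()
    with torch.no_grad():
        output = model([frame_tensor])[0]  # dict with 'boxes', 'labels', 'scores'
    keep = output["scores"] >= score_threshold
    return output["boxes"][keep], output["scores"][keep]
```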
- The new staple line recognition module 310 can compare the detected staple line with previously detected staple lines to recognize a newly generated staple line. If it is the first firing of the stapler, the new staple line recognition module 310 can regard the entire staple line as a new staple line.
- Figure 5a is an image showing the first staple firing during surgery.
- Figure 5b illustrates the new staple line 502 of the first staple firing of Figure 5a as recognized by the new staple line recognition module 310.
- Figure 5c is an image showing a second staple firing during the surgery.
- The new staple line recognition module 310 can compare the image of Figure 5c with the image of Figure 5a to recognize a newly generated second staple line 504 (shown in Figure 5d).
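- One plausible way to implement the comparison with previously detected staple lines is an intersection-over-union (IoU) check against the boxes recorded from earlier firings; the box representation and threshold below are illustrative assumptions:

```python
# Illustrative sketch: a detected box is treated as "new" if it does not
# sufficiently overlap any box recorded from previous firings.
def iou(box_a, box_b):
    """Boxes are (x1, y1, x2, y2) in pixel coordinates."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def find_new_staple_lines(current_boxes, previous_boxes, iou_threshold=0.3):
    if not previous_boxes:            # first firing: everything is new
        return list(current_boxes)
    return [box for box in current_boxes
            if max(iou(box, prev) for prev in previous_boxes) < iou_threshold]
```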
- The tracking module 312 can utilize a neural network to track each staple line segment (each segment corresponds to an individual firing of the stapler). That is, the tracking module 312 can give the location of each staple line in every frame in which the staple line is visible.
- The firing curve and staple line segment association module 314 can associate the most recent staple firing curve with a new staple line segment once the new staple line segment is detected by the tracking module 312. Such association matches each firing curve (e.g., the firing curve of Figure 1) with a staple line segment (e.g., the staple line segments of Figure 5b and Figure 5d).
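- The association could be as simple as time-ordered bookkeeping: each firing curve reported by the stapler is queued, and the most recent unassigned curve is matched to a segment when the tracker confirms it. The sketch below is an assumed illustration of that pairing, not the disclosed method:

```python
# Illustrative sketch: pairing the most recent firing force curve with a newly
# confirmed staple line segment.
from collections import deque

class FiringCurveAssociator:
    def __init__(self):
        self._pending_curves = deque()   # (timestamp, firing_force_samples)
        self.pairs = []                  # (segment_id, firing_force_samples)

    def on_firing_curve(self, timestamp, samples):
        # Called when the stapler reports a completed firing force curve.
        self._pending_curves.append((timestamp, samples))

    def on_new_segment(self, segment_id):
        # Called when the tracker confirms a new staple line segment.
        if self._pending_curves:
            _, samples = self._pending_curves.pop()   # most recent curve
            self.pairs.append((segment_id, samples))
            return segment_id, samples
        return None
```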
- The classification module 316 can use a convolutional neural network ("CNN") built to take the matched firing curve and staple line segment image as inputs and classify the input into one of several categories, such as a) normal staple firing; b) abnormal staple firing with lack of tissue clamped; c) abnormal staple firing with possible missing staples; and d) abnormal staple firing with tissue abnormality.
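- A minimal PyTorch sketch of a two-input classifier of the kind described above, with one branch encoding the staple line segment image, a second branch encoding the one-dimensional firing force curve, and a fused head over the four firing categories (all layer sizes and input shapes are arbitrary assumptions):

```python
# Illustrative sketch: two-branch CNN over (segment image, firing force curve).
import torch
import torch.nn as nn

class FiringClassifier(nn.Module):
    def __init__(self, num_classes=4):
        super().__init__()
        self.image_branch = nn.Sequential(            # input: 3 x 64 x 64 crop
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten())
        self.curve_branch = nn.Sequential(             # input: 1 x curve_length
            nn.Conv1d(1, 16, 5, padding=2), nn.ReLU(), nn.AdaptiveAvgPool1d(1),
            nn.Flatten())
        self.head = nn.Linear(32 + 16, num_classes)

    def forward(self, segment_image, firing_curve):
        features = torch.cat(
            [self.image_branch(segment_image), self.curve_branch(firing_curve)],
            dim=1)
        return self.head(features)   # logits over the four firing categories

# Example shapes: segment_image (N, 3, 64, 64), firing_curve (N, 1, 256).
```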
- The object detection module 308, new staple line recognition module 310, tracking module 312, firing curve and staple line segment association module 314, and classification module 316 of Figure 3b can be implemented in software, firmware, hardware, or any combination of the three.
- FIG. 6 illustrates the exemplary hardware components of the console of Figure 3b, according to an embodiment of the disclosure.
- The console 600 can include, among other things, an I/O interface 612, a processing unit 614, a storage unit 616, a memory module 618, and a user interface 620, all connected via connection 622.
- The I/O interface 612 can be configured for communicating with a laparoscope and a stapler that are connected to the console 600.
- The I/O interface 612 can also communicate with a monitor to display alerts on the monitor.
- Separate I/O interfaces can be used for communicating with various devices, such as the laparoscope, the stapler, and the monitor. Communication can be via any suitable wired or wireless channel.
- Processing unit 614 may be configured to receive data from the laparoscope and the stapler and process the data to detect and label staple lines.
- Processing unit 614 may also be configured to generate and transmit signals, via I/O interface 612, to display alerts on the monitor.
- The computer-readable medium can include volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other types of computer-readable media or computer-readable storage devices.
- The computer-readable medium can have computer instructions stored thereon, as disclosed.
- The computer-readable medium may be a disc or a flash drive having the computer instructions stored thereon.
- The console 600 of Figure 6 can include additional components that are not shown in Figure 6, and some of the components shown in Figure 6 may be optional in certain embodiments.
- FIG. 7 is a flow chart illustrating the exemplary steps performed by the console 300 of Figure 3.
- The console receives firing force data from a stapler and real-time video captured by a laparoscope during a medical procedure, such as surgery (Step 701).
- The console can locate a staple line in an image from the real-time video (Step 702). This can be done using a neural network.
- The located staple line can be compared with previously detected staple lines to recognize a newly generated staple line (Step 703). If it is the first firing of the stapler, the entire staple line can be regarded as a new staple line.
- The console can then track each staple line segment (each segment corresponds to an individual firing of the stapler) using a neural network (Step 704).
- The console can determine the location of each staple line in every frame of the real-time video in which the staple line is visible.
- The console associates the most recent staple firing curve, generated from the firing force data received from the stapler, with the new staple line segment once the new staple line segment is determined (Step 705).
- Each firing curve of the stapler is thus associated with a staple line segment identified from the real-time video.
- The console may use a convolutional neural network ("CNN") built to take the matched firing curve and staple line segment image as inputs and classify the input into one of several categories, such as a) normal staple firing; b) abnormal staple firing with lack of tissue clamped; c) abnormal staple firing with possible missing staples; and d) abnormal staple firing with tissue abnormality (Step 706).
- The category can be output to the user (Step 707).
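- Tying Steps 701-707 together, a per-frame loop could look roughly like the following sketch, which reuses the illustrative helpers shown earlier (detector, new-line recognition, curve association, classifier); the helper names, the crop_segment function, and the state bookkeeping are assumptions, not the disclosed implementation:

```python
# Illustrative sketch: per-frame processing loop for Steps 701-707.
def process_frame(frame_tensor, detector, associator, classifier, state):
    boxes, _ = detect_staple_lines(detector, frame_tensor)            # Step 702
    boxes = [tuple(b) for b in boxes.tolist()]
    new_boxes = find_new_staple_lines(boxes, state["known_boxes"])    # Step 703
    state["known_boxes"].extend(new_boxes)                            # Step 704 (track)
    alerts = []
    for box in new_boxes:
        pair = associator.on_new_segment(len(state["known_boxes"]))   # Step 705
        if pair is None:
            continue
        _, curve = pair                       # assumed preprocessed tensor (1, 1, 256)
        segment_image = crop_segment(frame_tensor, box)   # hypothetical helper (1, 3, 64, 64)
        logits = classifier(segment_image, curve)                     # Step 706
        alerts.append(int(logits.argmax(dim=1)))                      # Step 707 category
    return alerts
```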
- The console may also record and classify the visual properties of the clamped tissue, the anvil location, and the cartridge size.
- The visual properties include organ/tissue classification, shape, abnormality, relative thickness, color, relative color variation, and surface vein structure. These data will be used to train a machine learning model that can predict the pressure changes based on the tissue and recommend an appropriate cartridge size for that specific tissue and anvil location.
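- As an illustration only, a model of this kind could be trained with an off-the-shelf classifier once the visual properties are encoded as numeric features; the feature encoding, example values, and cartridge labels below are assumptions:

```python
# Illustrative sketch: recommending a cartridge size from encoded tissue features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical feature vector per clamping event:
# [tissue_class_id, relative_thickness, color_index, color_variation, vein_density]
X_train = np.array([[0, 1.8, 0.4, 0.10, 0.3],
                    [1, 2.6, 0.5, 0.25, 0.6],
                    [0, 1.2, 0.3, 0.05, 0.2]])
y_train = np.array(["blue", "green", "white"])   # example cartridge size labels

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

def recommend_cartridge(features):
    """Return the predicted cartridge label for one encoded clamping event."""
    return model.predict(np.asarray(features).reshape(1, -1))[0]
```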
- The console may also collect pressure sensor data and compare it to preset threshold values during the clamping process.
- The clamping process includes two phases: closing the jaw and a resting period. Average tissue thickness is highly correlated with the maximum pressure applied to the tissue during clamping. If the maximum pressure is too high, the tissue is relatively thick for that cartridge; if the maximum pressure is too low, the tissue is relatively thin for that cartridge. In either case, the system will recommend a cartridge change.
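- A sketch of the threshold comparison described above, with placeholder bounds rather than validated values:

```python
# Illustrative sketch: compare the maximum clamping pressure against preset
# per-cartridge thresholds and suggest a cartridge change when out of range.
def check_clamping_pressure(pressure_samples, low_threshold, high_threshold):
    max_pressure = max(pressure_samples)
    if max_pressure > high_threshold:
        return "Tissue appears thick for this cartridge; consider a taller-staple cartridge."
    if max_pressure < low_threshold:
        return "Tissue appears thin for this cartridge; consider a shorter-staple cartridge."
    return "Clamping pressure within expected range."
```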
- The console may also collect and analyze the shear force on the tissue and recommend repositioning the stapler based on the variation in the pressure on the cartridge.
- One reason for not achieving a proper B-shaped staple form is the human factor: surgeons may apply unnecessary strain or tension on the tissue when applying the staple. If the pressure distribution is not uniform, the intelligent system may recommend repositioning the stapler, including but not limited to rotation or translation, to minimize the variation in the pressure distribution and achieve a better staple form.
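- One assumed way to flag a non-uniform pressure distribution across an embedded sensor array is to compare the coefficient of variation of the per-sensor readings against a tolerance before firing; the tolerance and the proximal/distal heuristic below are illustrative assumptions:

```python
# Illustrative sketch: flag a non-uniform pressure distribution across the
# sensor array and suggest repositioning before firing.
import numpy as np

def recommend_repositioning(sensor_pressures, cv_tolerance=0.25):
    readings = np.asarray(sensor_pressures, dtype=float)
    cv = readings.std() / (readings.mean() + 1e-9)   # coefficient of variation
    if cv > cv_tolerance:
        # The half of the array with higher mean pressure hints at where strain concentrates.
        half = len(readings) // 2
        hot_side = "distal" if readings[half:].mean() > readings[:half].mean() else "proximal"
        return (f"Pressure non-uniform (CV={cv:.2f}); consider rotating or translating "
                f"the stapler toward the {hot_side} end.")
    return "Pressure distribution acceptable."
```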
- A surgical stapler may be commercialized independently or in combination with an artificial intelligence ("AI") processing unit and imaging system based on embodiments of this disclosure.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Evolutionary Computation (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Data Mining & Analysis (AREA)
- Mathematical Physics (AREA)
- General Engineering & Computer Science (AREA)
- Computational Linguistics (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Surgery (AREA)
- Multimedia (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Optics & Photonics (AREA)
- Databases & Information Systems (AREA)
- Heart & Thoracic Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Signal Processing (AREA)
- Quality & Reliability (AREA)
- Surgical Instruments (AREA)
- Endoscopes (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP23901377.4A EP4626296A1 (en) | 2022-12-04 | 2023-12-04 | Intelligent surgical instrument with a combination method for stapling line detection and labelling |
| CN202380083425.0A CN120344184A (en) | 2022-12-04 | 2023-12-04 | Intelligent surgical instrument and combined method for detecting and marking staple lines |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202263385991P | 2022-12-04 | 2022-12-04 | |
| US63/385,991 | 2022-12-04 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024123647A1 true WO2024123647A1 (en) | 2024-06-13 |
Family
ID=91380120
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2023/082239 (WO2024123647A1, Ceased) | Intelligent surgical instrument with a combination method for stapling line detection and labelling | 2022-12-04 | 2023-12-04 |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP4626296A1 (en) |
| CN (1) | CN120344184A (en) |
| WO (1) | WO2024123647A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025027493A1 (en) * | 2023-08-01 | 2025-02-06 | Covidien Lp | System and method for machine learning method to verify consistent staple line delivery |
- 2023
- 2023-12-04 WO PCT/US2023/082239 patent/WO2024123647A1/en not_active Ceased
- 2023-12-04 EP EP23901377.4A patent/EP4626296A1/en active Pending
- 2023-12-04 CN CN202380083425.0A patent/CN120344184A/en active Pending
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2008082844A2 (en) * | 2006-12-29 | 2008-07-10 | Satiety, Inc. | Devices and methods for placement of partitions within a hollow body organ |
| WO2019123082A1 (en) * | 2017-12-21 | 2019-06-27 | Ethicon Llc | Staple instrument comprising a firing path display |
| US20200015905A1 (en) * | 2018-07-16 | 2020-01-16 | Ethicon Llc | Visualization of surgical devices |
| WO2021136999A1 (en) * | 2019-12-30 | 2021-07-08 | Ethicon Llc | Surgical system for overlaying surgical instrument data onto a virtual three dimensional construct of an organ |
| US20210204873A1 (en) * | 2020-01-06 | 2021-07-08 | Covidien Lp | Systems and methods for anastomosis leakage detection and prediction |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4626296A1 (en) | 2025-10-08 |
| CN120344184A (en) | 2025-07-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11819188B2 (en) | Machine-learning-based visual-haptic system for robotic surgical platforms | |
| US12458455B2 (en) | Surgical RFID assemblies for instrument operational setting control | |
| US11553971B2 (en) | Surgical RFID assemblies for display and communication | |
| US11361176B2 (en) | Surgical RFID assemblies for compatibility detection | |
| US11510743B2 (en) | Communication control for a surgeon controlled secondary display and primary display | |
| CN116635946A (en) | Cooperative surgical display | |
| US20220108789A1 (en) | Cloud analytics packages | |
| US20220104807A1 (en) | Shared situational awareness of the device actuator activity to prioritize certain aspects of displayed information | |
| US20220104912A1 (en) | Situational awareness of instruments location and individualization of users to control displays | |
| WO2022070066A1 (en) | Monitoring of user visual gaze to control which display system displays the primary information | |
| JP2023544356A (en) | Reconfiguring display sharing | |
| EP4041098A1 (en) | Surgical instrument with adaptive function controls | |
| EP4626296A1 (en) | Intelligent surgical instrument with a combination method for stapling line detection and labelling | |
| US20230240512A1 (en) | Endoscope system, manipulation assistance method, and manipulation assistance program | |
| US12277784B2 (en) | Intelligent energy device based on real-time visual analysis of laparoscopic video | |
| US12433589B2 (en) | Learned triggers for adaptive control of surgical stapling systems | |
| US20230065764A1 (en) | Method for operating a surgical system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23901377; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2025532573; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 202380083425.0; Country of ref document: CN; Ref document number: 2025532573; Country of ref document: JP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2023901377; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2023901377; Country of ref document: EP; Effective date: 20250630 |
| | WWP | Wipo information: published in national office | Ref document number: 202380083425.0; Country of ref document: CN |
| | WWP | Wipo information: published in national office | Ref document number: 2023901377; Country of ref document: EP |