
WO2021044523A1 - Surgical support device - Google Patents

Surgical support device

Info

Publication number
WO2021044523A1
WO2021044523A1 (PCT/JP2019/034637; JP2019034637W)
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
unit
support device
image
identification mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/034637
Other languages
English (en)
Japanese (ja)
Inventor
直 小林
尚紀 北村
勇太 熊頭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Incubit Inc
Original Assignee
Incubit Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Incubit Inc
Priority to PCT/JP2019/034637
Publication of WO2021044523A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

Definitions

  • the present invention relates to a surgical support device.
  • Remote surgery using AR technology is being performed (see, for example, Patent Document 1).
  • In Patent Document 1, an AR marker is placed in the oral cavity to align a three-dimensional CT image.
  • However, since Patent Document 1 assumes a rigid surgical object such as a tooth, if the technique is used for a deformable surgical object such as an internal organ, the superimposed image will be displaced.
  • the present invention has been made in view of such a background, and an object of the present invention is to provide a technique for superimposing and displaying necessary information on the surgical field.
  • The main invention of the present invention for solving the above-mentioned problems is a surgical support device comprising: a position/posture acquisition unit that acquires the positions and postures of a plurality of identification marks attached to a surgical object in the body; a position calculation unit that calculates the position at which a three-dimensional image is arranged in the virtual reality space according to the positions of the identification marks; and a drawing processing unit that superimposes and displays the three-dimensional image on the surgical object according to the calculated position.
  • The position calculation unit deforms the three-dimensional image in response to a change in either the position or the posture of the identification marks.
  • The present invention has the following configurations.
  • [Item 1] A surgical support device comprising: a position/posture acquisition unit that acquires the positions and postures of a plurality of identification marks attached to a surgical object in the body; a position calculation unit that calculates the position at which a three-dimensional image is arranged in the virtual reality space according to the positions of the identification marks; and a drawing processing unit that superimposes and displays the three-dimensional image on the surgical object according to the calculated position, wherein the position calculation unit deforms the three-dimensional image in response to a change in either the position or the posture of the identification marks.
  • [Item 2] The surgical support device according to item 1, wherein the position calculation unit deforms the three-dimensional model according to a change in either the position or the posture of the identification marks, and the drawing processing unit superimposes and displays the three-dimensional image on the surgical object based on the deformed three-dimensional model.
  • [Item 3] The surgical support device according to item 1 or 2, further comprising an instrument control unit that performs a predetermined process on the surgical object based on the three-dimensional image.
  • [Item 4] The surgical support device according to any one of items 1 to 3, wherein the drawing processing unit superimposes and displays predetermined information on the surgical object.
  • the surgery support system of the present embodiment supports surgery performed by remote control, such as laparoscopic surgery.
  • the surgery support system of the present embodiment superimposes and displays a three-dimensional image (hereinafter referred to as a superimposed image) related to surgery on a surgical target area (organ, etc.) on an image of the surgical field viewed by an operator performing remote surgery.
  • A typical superimposed image is an image related to a simulation of the surgery performed in advance, but any image that is meaningful when superimposed on the surgical target area such as an organ, for example an image of the structure inside the organ, can be adopted.
  • <Overview> FIGS. 1 and 2 are schematic views showing the elements constituting the surgical support system (hereinafter simply referred to as the "system") 1 according to the present embodiment.
  • The structure shown in the figures is an example, and elements other than these may be included.
  • the System 1 includes a surgical device 2 and a surgical support device 3.
  • the surgical device 2 is a device for performing an operation by an operator 5 on a surgical subject 4.
  • the operator according to the present embodiment is a surgeon or the like for surgery, and the surgical subject 4 is a patient or the like.
  • The surgical device 2 includes a work unit 20 that actually performs the operation on the surgical subject 4 (that is, acts directly on the body of the surgical subject 4), an operation unit 22 that receives operations on the work unit 20 from the operator 5, and a control unit 24 that controls the work unit 20 and the operation unit 22.
  • The surgical device 2 according to the present embodiment is a device for minimally invasive surgical methods, such as so-called laparoscopic surgery, ophthalmic surgery, brain surgery, spinal surgery, and artificial joint surgery, in which a small hole for passing a surgical instrument is made in the surgical target area (which is not limited to a closed space) and the surgical instrument is inserted into the area through the hole.
  • Examples of applicable endoscopes include, but are not limited to, laryngoscopes, bronchoscopes, gastrointestinal endoscopes, duodenoscopes, small intestine endoscopes, colonoscopes, thoracoscopes, laparoscopes, cystoscopes, biliary endoscopes, arthroscopes, spinal endoscopes, vascular endoscopes, and epidural endoscopes.
  • the work unit 20 includes an instrument unit 200 used for surgery and a sensor unit 202 that grasps at least the state of the surgical field of the surgical subject 4 or the state of the instrument unit 200.
  • the sensor unit 202 includes an endoscopic camera unit 202'.
  • a plurality of endoscopic camera units 202' are preferably arranged.
  • The instrument unit 200 has surgical instruments attached to the tips of operable movable arms, and these instruments are driven by the operations received by the operation unit 22. The surgical instruments include tweezers, scissors, forceps, needle holders, and scalpels (electric scalpels, etc.), as well as instruments that perform equivalent functions; other instruments may also be used depending on the application.
  • The sensor unit 202 can employ various sensors for detecting the state of the instrument unit 200, such as a pressure sensor, a gyro sensor, an acceleration sensor, a temperature sensor, and a camera.
  • As the camera, an optical camera, an infrared camera, an X-ray camera, or the like can be adopted as appropriate depending on the information to be acquired.
  • the operation unit 22 includes a controller unit 222 that receives an operation of the operator 5, a display unit 224 that displays information to the operator 5, and a speaker unit 226 that provides information by voice or the like to the operator 5.
  • the controller unit 222 can be combined with, for example, a joystick-shaped input device, a foot pedal, or the like.
  • The display unit 224 according to the present embodiment has a so-called VR (Virtual Reality) HMD (Head Mounted Display) structure, making it possible to view the surgical target area stereoscopically by utilizing the parallax between the user's eyes.
  • However, the display unit may be anything that can visually provide the operator 5 with the information necessary for performing the operation, such as information on the work unit 20 and the surgical subject 4; for example, a conventional monitor or an AR (Augmented Reality) display may be used.
  • The control unit 24 controls information communication between the work unit 20 and the operation unit 22; for example, the operation received by the operation unit 22 is transmitted to the work unit 20, and the image information acquired by the camera unit 202' is displayed on the display unit 224.
  • FIG. 3 is a block diagram illustrating an outline of a hardware configuration example of the surgery support device 3.
  • the surgery support device 3 includes a processor 301, a memory 302, a storage 303, a transmission / reception unit 304, and an input / output unit 305 as main configurations, and these are electrically connected to each other via a bus 306.
  • the processor 301 is an arithmetic unit that controls the operation of the surgery support device 3, controls the transmission and reception of data between each element, and performs processing necessary for executing an application program.
  • the processor 301 is, for example, a CPU (Central Processing Unit), and executes each process by executing an application program or the like stored in the storage 303 and expanded in the memory 302.
  • The memory 302 includes a main storage device composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory), and an auxiliary storage device composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive). The memory 302 is used as a work area of the processor 301, and also stores the BIOS (Basic Input/Output System) executed when the surgery support device 3 is started, various setting information, and the like.
  • the storage 303 stores application programs, data used for various processes, and the like.
  • the transmission / reception unit 304 connects the surgery support device 3 to the communication network.
  • the transmission / reception unit 304 may be provided with a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • the surgery support device 3 is connected to the surgery device 2 via the transmission / reception unit 304.
  • the bus 306 transmits, for example, an address signal, a data signal, and various control signals between the connected processor 301, memory 302, storage 303, transmission / reception unit 304, and input / output unit 305.
  • FIG. 4 is a block diagram illustrating an outline of a software configuration example of the surgery support device 3.
  • The surgery support device 3 includes functional units, namely a marker detection unit 311, a position calculation unit 312, an arrangement processing unit 313, and a drawing processing unit 314, and storage units, namely a superimposed image storage unit 331 and an explanatory information storage unit 332.
  • Each of the above functional units is realized by the processor 301 of the surgery support device 3 reading a program stored in the storage 303 into the memory 302 and executing it, and each of the above storage units is implemented as part of the storage area provided by the memory 302 and the storage 303 of the surgery support device 3.
  • FIG. 5 is a diagram showing an example of the display unit 224 on which the superimposed image 52 is displayed.
  • In the present embodiment, it is assumed that identification marks 51 are placed at each of three or more predetermined locations in the surgical target area 41 (an organ in the example of FIG. 5), and that the superimposed image 52 is aligned based on these identification marks 51.
  • the identification mark 51 is assumed to be an AR marker. As the identification mark 51, any mark can be adopted as long as it is a mark indicating the position of a predetermined portion of the surgical target area 41 such as an organ.
  • For example, the organ may be colored in some way, and the colored portion may be used as the identification mark 51.
  • The identification mark 51 can also be made of a material that dissolves in the body of the surgical subject 4, such as the materials used for adhesion barriers and absorbable sutures.
  • the marker detection unit 311 detects the identification mark 51 from the image of the surgical field.
  • the marker detection unit 311 detects the identification mark 51 from the image of the surgical field acquired by the camera unit 202'.
  • The marker detection unit 311 can detect the identification mark 51 by, for example, detecting features of the identification mark 51 in the image.
  • a known method can be adopted for the detection process of the identification mark 51 by the marker detection unit 311, and detailed description thereof will be omitted here.
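Although the patent leaves the detection method open ("a known method can be adopted"), the following is a minimal sketch of one such known method, assuming the identification marks 51 are ArUco-style AR markers and using OpenCV's ArUco module (opencv-contrib-python 4.7 or later). The function name and marker dictionary are illustrative assumptions, not part of the patent.

```python
import cv2
import numpy as np

# ArUco dictionary and detector; requires opencv-contrib-python >= 4.7.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def detect_identification_marks(frame: np.ndarray):
    """Return (corners, ids) of all AR markers found in one endoscope frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    return corners, ids
```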
  • the position calculation unit 312 calculates the position of the identification mark 51.
  • The position calculation unit 312 calculates the three-dimensional position of each identification mark 51 in the virtual reality space. Since each identification mark 51 is assumed to be arranged at a predetermined position in the surgical target area 41 such as an organ, the position calculation unit 312 can determine the position and orientation of the superimposed image 52 in three-dimensional space by specifying the positions of three or more identification marks 51.
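One concrete way to realize this step, sketched below under the assumption that at least three non-collinear marker positions are known both in the preoperative model and in the observed scene, is the Kabsch algorithm for the least-squares rigid transform; the patent itself does not prescribe a specific algorithm.

```python
import numpy as np

def rigid_transform(model_pts: np.ndarray, observed_pts: np.ndarray):
    """Least-squares rotation R and translation t with observed ~= R @ model + t.
    Both arrays are (N, 3) with N >= 3 non-collinear points."""
    cm, co = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - cm).T @ (observed_pts - co)   # 3x3 cross-covariance
    U, _S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm
    return R, t
```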
  • the superimposed image storage unit 331 stores the superimposed image.
  • FIG. 6 is a diagram showing a configuration example of information including a superimposed image (hereinafter, referred to as superimposed image information) stored in the superimposed image storage unit 331.
  • the superimposed image information includes model data for creating a three-dimensional image in association with identification information (operation ID) that identifies an operation related to surgery.
  • the model data may be, for example, still image data such as a DWG file, or moving image data such as a VRML or OSG file.
  • The arrangement processing unit 313 arranges the superimposed image in the virtual reality space. Based on the model data, the arrangement processing unit 313 can create the superimposed image so that the position of the corresponding site of the surgical target area 41 is aligned with the positions of the identification marks 51 calculated by the position calculation unit 312.
  • The arrangement processing unit 313 can, for example, receive the designation of the operation to be performed from the operator 5 according to the progress of the surgery, read the superimposed image information corresponding to the designated operation from the superimposed image storage unit 331, and acquire the model data.
  • the explanatory information storage unit 332 stores information including explanations related to surgery (hereinafter referred to as explanatory information).
  • FIG. 7 is a diagram showing a configuration example of explanatory information stored in the explanatory information storage unit 332.
  • the explanatory information includes an explanation in association with the operation ID.
  • The explanation can be a description of the superimposed image based on the model data included in the superimposed image information; for example, it can be text data expressing in a sentence the incision trajectory of the organ shown in the superimposed image.
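As a concrete sketch of the structures in FIGS. 6 and 7, the two storage units can be modeled as simple mappings keyed by operation ID; the IDs, file names, and explanation text below are illustrative assumptions only.

```python
# Superimposed image storage unit 331: operation ID -> model data.
superimposed_image_store = {
    "op-001": {"model_data": "incision_guide.wrl"},  # e.g. a VRML model file
}

# Explanatory information storage unit 332: operation ID -> explanation.
explanatory_info_store = {
    "op-001": {"explanation": "Incise along the displayed auxiliary line."},
}

def load_overlay(operation_id: str):
    """Fetch the model data and its explanation 53 for one operation."""
    model_data = superimposed_image_store[operation_id]["model_data"]
    caption = explanatory_info_store.get(operation_id, {}).get("explanation", "")
    return model_data, caption
```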
  • the drawing processing unit 314 draws the superimposed image.
  • the drawing processing unit 314 superimposes and draws the superimposed image created by the arrangement processing unit 313 on the image of the surgical field displayed on the display unit 224 so as to be displayed at the position calculated by the arrangement processing unit 313.
  • In the example of FIG. 5, an auxiliary line to be incised by the electric scalpel 200a is shown.
  • the drawing processing unit 314 can read the explanation corresponding to the operation ID of the superimposed image information from the explanatory information storage unit 332 and display it on the display unit 224.
  • the explanation 53 of the explanatory information is displayed on the display unit 224.
  • The drawing processing unit 314 may display the explanation 53 of the explanatory information at a fixed location on the display unit 224 (for example, as a caption at the top or bottom of the screen), or, as in the example of FIG. 5, in the vicinity of the superimposed image 52.
  • It is preferable that the drawing processing unit 314 display the explanation 53 in the vicinity of the instrument unit 200 being operated, without overlapping the instrument unit 200 or the superimposed image 52 (although it may overlap a transparent background portion of the superimposed image 52).
  • FIG. 8 is a diagram illustrating the operation of the surgery support device 3.
  • the operator 5 operates the instrument unit 200 to place the identification mark 51 (AR marker) at a predetermined position in the surgical target area 41 such as an organ (S400).
  • The marker detection unit 311 acquires an image of the surgical field taken by the camera unit 202' (S401), and detects the identification mark 51 (AR marker) from the acquired image (S402).
  • The position calculation unit 312 calculates the positions of the identification marks 51 in the augmented reality space (S403), and the arrangement processing unit 313 determines the position and orientation of the superimposed image according to the positions of the identification marks 51 and creates the superimposed image based on the model data included in the superimposed image information (S404).
  • The superimposed image information to be displayed may be designated to the arrangement processing unit 313 by the operator 5 or other surgical personnel, or may be registered in advance.
  • the drawing processing unit 314 superimposes and displays the superimposed image on the display unit 224 according to the position and orientation calculated by the arrangement processing unit 313 (S405).
  • The drawing processing unit 314 can also display the explanation 53 included in the explanatory information on the display unit 224.
  • As described above, the superimposed image can be superimposed and displayed on the surgical target area 41 after being accurately aligned using the identification marks 51 such as AR markers. This makes it possible to assist the operator 5 in operating the instrument unit 200 and to improve the accuracy and efficiency of the surgery.
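Tying steps S401 to S405 together, the following is a minimal per-frame sketch of the flow in FIG. 8, reusing the illustrative helpers sketched above (detect_identification_marks, rigid_transform, load_overlay). The camera object, the 3D position recovery in estimate_marker_positions, and the renderer render_overlay are assumptions left abstract, since they depend on the endoscopic camera setup and display stack.

```python
import numpy as np

def estimate_marker_positions(corners, ids) -> np.ndarray:
    """Stand-in for S403: recover the 3D position of each identification mark,
    e.g. by stereo triangulation between two endoscopic camera units 202'."""
    raise NotImplementedError  # depends on camera calibration and setup

def render_overlay(frame, model_data, R, t, caption: str):
    """Stand-in for the drawing processing unit 314 (S405)."""
    raise NotImplementedError  # depends on the rendering stack

def support_loop(camera, model_marker_points: np.ndarray, operation_id: str):
    """Per-frame loop mirroring S401-S405 of FIG. 8."""
    model_data, caption = load_overlay(operation_id)
    while True:
        frame = camera.read()                                  # S401
        corners, ids = detect_identification_marks(frame)      # S402
        if ids is None or len(ids) < 3:
            continue                                           # need 3+ marks
        observed = estimate_marker_positions(corners, ids)     # S403
        R, t = rigid_transform(model_marker_points, observed)  # S404
        render_overlay(frame, model_data, R, t, caption)       # S405
```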
  • In the above description, the superimposed image is displayed according to the positions of the identification marks 51; however, when an AR marker is used as the identification mark 51, the superimposed image can be deformed in consideration of its normal direction before being superimposed and displayed.
  • FIG. 9 is a diagram illustrating a modified example in which the normal direction of the identification mark 51, which is an AR marker, is taken into consideration.
  • The surgical target area 41 such as an organ has a three-dimensional shape, and when the surgical target area 41 deforms, the normal direction 54 of each identification mark 51 also changes.
  • Therefore, the normal direction 54 is acquired for some parts of the surgical target area 41, and the position calculation unit 312 can deform the three-dimensional model based on the difference from the normal direction of the surface in contact with that part in the three-dimensional model.
  • For this deformation, a general method used for object deformation processing in 3D modeling software can be adopted.
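One such general method, sketched below as an assumption rather than the patent's prescribed technique, rotates the model vertices near each identification mark by the rotation that carries the model's stored surface normal onto the observed normal direction 54, with a simple distance-based falloff.

```python
import numpy as np

def rotation_between(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Rotation matrix carrying unit vector a onto unit vector b (Rodrigues)."""
    v, c = np.cross(a, b), float(np.dot(a, b))
    if np.allclose(v, 0.0):
        if c > 0:
            return np.eye(3)                 # already aligned
        # Antiparallel: rotate 180 degrees about any axis perpendicular to a.
        axis = np.eye(3)[np.argmin(np.abs(a))]
        p = np.cross(a, axis)
        p /= np.linalg.norm(p)
        return 2.0 * np.outer(p, p) - np.eye(3)
    K = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K * (1.0 - c) / (v @ v)

def deform_vertices(vertices, marker_pos, model_normal, observed_normal,
                    radius=20.0):
    """Bend (N, 3) model vertices near one marker so the stored surface normal
    matches the observed normal 54; radius is in virtual-space units."""
    R = rotation_between(model_normal, observed_normal)
    d = np.linalg.norm(vertices - marker_pos, axis=1)
    w = np.clip(1.0 - d / radius, 0.0, 1.0)[:, None]   # falloff weight
    rotated = (vertices - marker_pos) @ R.T + marker_pos
    return (1.0 - w) * vertices + w * rotated          # blend toward rotated
```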
  • FIG. 10 is a diagram illustrating the operation of the surgical support device 3 when the normal direction of the identification mark 51 is taken into consideration.
  • In place of step S403 of the process shown in FIG. 8, after the marker detection unit 311 detects the AR markers 51 from the image of the surgical field taken by the camera unit 202' in step S402, the position calculation unit 312 calculates the position of each AR marker 51 in the augmented reality space together with its normal direction (vector) (S501).
  • The arrangement processing unit 313 then determines the position and orientation of the superimposed image according to the positions of the identification marks 51 based on the model data included in the superimposed image information (S404) and creates a superimposed image deformed according to the normal directions of the AR markers 51 (S502), and the drawing processing unit 314 superimposes and displays the deformed superimposed image according to the position calculated by the arrangement processing unit 313 (S405).
  • In this way, the superimposed image 52 deformed according to the normal directions 54 of the AR markers 51 is superimposed and displayed on the surgical target area 41.
  • In FIG. 9, the auxiliary line 52 to be incised by the electric scalpel 200a shown in the example of FIG. 5 is deformed according to the normal directions 54a to 54c of the AR markers and is superimposed and drawn as the auxiliary line 52a. Since organs and the like may deform at any time, adding the deformation processing described above makes it possible to superimpose and display the superimposed image based on the model data more accurately in accordance with the surgical target area 41.
  • In the present embodiment, the surgical support device 3 and the surgical device 2 are separate devices, but the control unit 24 or the operation unit 22 of the surgical device 2 may be provided with the functions of the surgical support device 3.
  • the surgical device 2 may have the function of the surgical support device 3 as a unit separate from the work unit 20, the operation unit 22, and the control unit 24.
  • the surgery support device 3 is assumed to be one computer, but the present invention is not limited to this, and it can be realized by a plurality of computers.
  • a storage unit may be provided in a database server and accessed via a network, or the functions of the surgery support device 3 may be distributed and deployed on a plurality of computers.
  • one virtual computer may be mounted by a plurality of computers, and the function of the surgery support device 3 may be provided to the virtual computer.
  • In the above description, the positions and postures of the identification marks 51 are used for deformation of the superimposed image; however, the present invention is not limited to this, and they may also be used for grasping the shape of the surgical target area 41.
  • That is, the position calculation unit 312 can calculate the position and posture of each identification mark 51, i.e., of one part of the surgical target area 41, and can grasp the shape of the surgical target area 41 based on these.
  • In the above description, the positions and orientations of the detected identification marks 51 are used for alignment and deformation of the superimposed image, but they may also be used for automatic control of the instrument unit 200.
  • For example, a trajectory for moving the instrument unit 200, such as a trajectory for incising with an electric scalpel, is set as model data, and the trajectory is aligned and/or deformed based on the positions and/or postures of the identification marks 51.
  • The control unit 24 can then control the instrument unit 200 so that it moves along the aligned and/or deformed trajectory.
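As a sketch of this modification under the same assumptions as the earlier examples, a stored incision trajectory (kept as model data as a sequence of waypoints) can be mapped into the current surgical field with the pose estimated from the identification marks; rigid_transform is the Kabsch sketch given earlier, and move_instrument_to is a hypothetical low-level command of the control unit 24.

```python
import numpy as np

def align_trajectory(trajectory: np.ndarray,
                     model_marker_pts: np.ndarray,
                     observed_marker_pts: np.ndarray) -> np.ndarray:
    """Map a stored (N, 3) instrument trajectory into the current surgical
    field using the pose estimated from the identification marks 51."""
    R, t = rigid_transform(model_marker_pts, observed_marker_pts)
    return trajectory @ R.T + t

# The control unit 24 could then step the instrument unit 200 along the
# aligned waypoints (move_instrument_to is hypothetical):
#   for waypoint in align_trajectory(incision_path, model_pts, observed_pts):
#       move_instrument_to(waypoint)
```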
  • In the above-described modified example, the normal direction 54 of the AR marker 51 is taken into consideration; however, not only the normal direction 54 but also the posture about three axes, including the two axes related to the plane of the AR marker 51, may be taken into consideration.
  • In this case, the superimposed image can be displayed even more accurately according to the current shape of the surgical target area 41.
  • In the present embodiment, the surgical support device 3 only superimposes and displays the superimposed image on the surgical target area 41.
  • However, when the superimposed image includes a trajectory for moving a surgical instrument or the like, an instrument control unit that automatically controls the instrument unit 200 according to that superimposed image may be provided. This makes it possible to automatically control the surgical operation.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The objective of the invention is to provide a technique that enables the superimposed display of information necessary for a surgical field. More specifically, the invention relates to a surgical support device. This surgical support device comprises: a position/orientation acquisition unit that acquires the positions and orientations of a plurality of identification marks placed on a surgical object inside the body; a position calculation unit that calculates, according to the positions of the identification marks, a position at which a 3D image is placed in a virtual reality space; and a drawing processing unit that superimposes and displays, according to the calculated positions, the 3D image on the surgical object. The position calculation unit deforms the 3D image in response to a change in the orientation or position of the identification marks.
PCT/JP2019/034637 2019-09-03 2019-09-03 Surgical support device Ceased WO2021044523A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/034637 WO2021044523A1 (fr) 2019-09-03 2019-09-03 Surgical support device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/034637 WO2021044523A1 (fr) 2019-09-03 2019-09-03 Surgical support device

Publications (1)

Publication Number Publication Date
WO2021044523A1 (fr) 2021-03-11

Family

ID=74852399

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/034637 Ceased WO2021044523A1 (fr) 2019-09-03 2019-09-03 Surgical support device

Country Status (1)

Country Link
WO (1) WO2021044523A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180140362A1 (en) * 2015-04-07 2018-05-24 King Abdullah University Of Science And Technology Method, apparatus, and system for utilizing augmented reality to improve surgery
WO2017179350A1 (fr) * 2016-04-11 2017-10-19 富士フイルム株式会社 Image display control device, method, and program
JP2019161558A (ja) * 2018-03-15 2019-09-19 国立大学法人 東京大学 Organ tracking device, organ tracking method, and organ tracking program

Similar Documents

Publication Publication Date Title
AU2019352792B2 (en) Indicator system
US20220331052A1 (en) Cooperation among multiple display systems to provide a healthcare user customized information
US20250090241A1 (en) Systems and methods for tracking a position of a robotically-manipulated surgical instrument
US12357392B2 (en) Navigational aid
EP3737322B1 Guidance for the placement of surgical ports
KR20160102464A Simulator system for medical procedure training
CN114945937A Guided anatomical manipulation for endoscopic procedures
US20250268666A1 (en) Systems and methods for providing surgical assistance based on operational context
JP7239117B2 Surgical support device
US11532130B2 (en) Virtual augmentation of anatomical models
CN118215936A Interactive augmented reality system for laparoscopic and video-assisted surgery
WO2021044523A1 Surgical support device
WO2022219501A1 System comprising an array of cameras deployable out of a channel of a tissue-penetrating surgical device
CN114159157A Method, apparatus, device, and storage medium for assisting instrument movement
JP2002045372A Surgical navigation device
US20250235287A1 (en) Computer-assisted distance measurement in a surgical space
WO2019035206A1 Endoscope system and image generation method
GB2611972A (en) Feature identification
GB2608016A (en) Feature identification

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19944504

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19944504

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP