
WO2025112078A1 - Method and system for navigating an intraspinal puncture based on mixed reality technology - Google Patents

Method and system for navigating an intraspinal puncture based on mixed reality technology

Info

Publication number
WO2025112078A1
WO2025112078A1 (PCT/CN2023/136151)
Authority
WO
WIPO (PCT)
Prior art keywords
point
ray
needle insertion
candidate
target point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CN2023/136151
Other languages
English (en)
Chinese (zh)
Inventor
顾卫东
秦春晖
李济宇
高蕾
严兆阳
吴加珺
徐艺涤
李铭
邱健健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Reacool Medical Technology Co Ltd
Huadong Hospital
Original Assignee
Suzhou Reacool Medical Technology Co Ltd
Huadong Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Reacool Medical Technology Co Ltd and Huadong Hospital
Publication of WO2025112078A1
Legal status: Pending

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 - Surgical instruments, devices or methods
    • A61B 17/34 - Trocars; Puncturing needles
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30 - Surgical robots

Definitions

  • the present invention relates to the field of digital medical technology, and in particular to navigation technology in spinal canal puncture.
  • The first aspect of the present invention provides a navigation method for intraspinal puncture based on mixed reality technology, which comprises:
  • Step S1: receive a click operation on the skin surface of a three-dimensional image to obtain a collision point, wherein the three-dimensional image is three-dimensionally reconstructed based on mixed reality technology;
  • Step S2: draw a ray from the collision point to the target point in the spinal canal inside the three-dimensional image, and detect whether the ray collides with bones or blood vessels before reaching the target point; when it is detected that the ray does not collide with bones or blood vessels before reaching the target point, the collision point is determined to be the optimal needle insertion point;
  • Step S3: when an optimal needle insertion point exists, generate a visual navigation path from the optimal needle insertion point to the target point.
  • In step S2, when it is detected that the ray collides with bones or blood vessels before reaching the target point, the method further includes:
  • Step S4: determine that the collision point is invalid;
  • Step S5: find the direction vector from the collision point to the target point and use it as a reference vector;
  • Step S6: with the target point as the center, find 100 to 500 vectors near the reference vector whose angles with the reference vector lie between 0 and 1 degree, and use them as candidate vectors;
  • Step S7: calculate the position points on the skin surface of the three-dimensional image corresponding to the candidate vectors and use them as candidate needle insertion points;
  • Step S8: draw a ray from each candidate needle insertion point to the target point in the spinal canal inside the three-dimensional image, and detect whether the ray collides with bones or blood vessels before reaching the target point;
  • Step S9: when the optimal needle insertion point has still not been found after the 100 to 500 ray detections are completed, enlarge the angle between the candidate vectors and the reference vector by a further 1 degree and repeat steps S5 to S8, and so on; the maximum angle between the candidate vectors and the reference vector does not exceed 5 degrees;
  • Step S10: when it is detected that a ray does not collide with bones or blood vessels before reaching the target point, the candidate needle insertion point corresponding to that ray is determined to be the optimal needle insertion point.
  • In step S8, when it is detected that the ray does not collide with bones or blood vessels before reaching the target point, the candidate needle insertion point corresponding to the ray is determined to be the optimal needle insertion point.
  • In step S9, when the angle between the candidate vectors and the reference vector reaches 5 degrees and the optimal needle insertion point still cannot be found, it is determined that there is no optimal needle insertion point.
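The cone search in steps S4 to S10 can be sketched in plain Python. This is an illustrative reconstruction under stated assumptions, not the patented implementation: the helper names (`tilt`, `find_insertion_point`) are invented here, and `ray_blocked` and `skin_point_along` stand in for the engine's physics ray cast and skin-surface projection, which the text attributes to the Unity engine. The counts follow the text: 100 to 500 candidate rays per degree, widening the cone one degree at a time up to 5 degrees.

```python
import math
import random

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def tilt(ref, angle_deg, phi):
    """Unit vector at angle_deg from ref, at azimuth phi around it."""
    ref = normalize(ref)
    helper = (1.0, 0.0, 0.0) if abs(ref[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = normalize(cross(ref, helper))          # first axis perpendicular to ref
    w = cross(ref, u)                          # second perpendicular axis
    a = math.radians(angle_deg)
    return normalize(tuple(
        ref[i] * math.cos(a)
        + (u[i] * math.cos(phi) + w[i] * math.sin(phi)) * math.sin(a)
        for i in range(3)))

def find_insertion_point(collision_point, target, ray_blocked,
                         skin_point_along, rays_per_degree=100, max_angle_deg=5):
    """Steps S4-S10: widen the search cone one degree at a time."""
    # Step S5: reference vector from the target towards the invalid collision point
    ref = normalize(tuple(c - t for c, t in zip(collision_point, target)))
    for ring in range(1, max_angle_deg + 1):        # step S9: grow cone to 5 deg
        for _ in range(rays_per_degree):            # steps S6-S8
            angle = random.uniform(ring - 1, ring)  # candidate angle in this ring
            candidate = tilt(ref, angle, random.uniform(0.0, 2.0 * math.pi))
            entry = skin_point_along(target, candidate)   # step S7: skin point
            if entry is not None and not ray_blocked(entry, target):
                return entry                        # step S10: optimal point
    return None                                     # step S9 exhausted: no point
```

With a `ray_blocked` that never reports a collision, the first candidate is accepted; with one that always reports a collision, the search exhausts all five rings and returns `None`, matching the "no optimal needle insertion point" branch.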
  • a second aspect of the present application provides a navigation system for intraspinal puncture based on mixed reality technology, which comprises:
  • a collision point acquisition module, which is used to receive a click operation on the skin surface of a three-dimensional image to obtain a collision point, the three-dimensional image being three-dimensionally reconstructed based on mixed reality technology;
  • a ray detection module, which is used to draw a ray from the collision point to the target point in the spinal canal inside the three-dimensional image and detect whether the ray collides with bones or blood vessels before reaching the target point; when it is detected that the ray does not collide with bones or blood vessels before reaching the target point, the collision point is determined to be the optimal needle insertion point;
  • a navigation path generation module, which is used to generate a visual navigation path from the optimal needle insertion point to the target point when an optimal needle insertion point exists.
  • the ray detection module is further used to:
  • the candidate vector and the reference vector are compared.
  • the angle between the candidate vectors and the reference vector is then enlarged by a further 1 degree, and steps (2) to (5) are repeated, and so on; the maximum angle between the candidate vectors and the reference vector does not exceed 5 degrees;
  • the candidate needle insertion point corresponding to the ray is determined to be the optimal needle insertion point.
  • the ray detection module is also used to determine that there is no optimal needle insertion point when the angle between the candidate vectors and the reference vector reaches 5 degrees and the optimal needle insertion point still cannot be found.
  • the third aspect of the present application provides an electronic device, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the above-mentioned navigation method for intraspinal puncture based on mixed reality technology.
  • The technical solution of the present application uses mixed reality technology to virtually reconstruct the real lesion into a 1:1 restored and adapted stereoscopic image that can be viewed directly, and through a precise algorithm it can identify the position of the hand within one second and calculate the puncture navigation path within seconds, thereby providing an extremely fast, real-time, accurate and minimally invasive navigation solution for puncture surgery.
  • The technical solution of the present application integrates the technical advantages of 1:1 three-dimensional lesion reconstruction, 1-second virtual-real adaptation (using the Unity engine, with underlying optimization, batch processing, multi-threading support and lightweight components, about 30 algorithm passes can be executed in 1 second) and millimeter-level tracking navigation (100 to 500 rays are emitted within each degree; taking 100 rays as an example, the arc length corresponding to 1 degree is about 0.017 mm). It helps the operator make preoperative plans, enables doctors to see through, understand and accurately identify surgical targets, supports personalized intraoperative real-time decision-making, improves surgical quality and efficiency, and reduces radiation exposure for doctors. It is especially suitable for navigating puncture surgery in departments such as pain medicine, anesthesiology, interventional radiology, neurosurgery, and orthopedics.
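The millimeter figure in the parenthetical above follows from arc-length arithmetic (s = r·θ). The check below rests on one reading and one assumption not stated in the text: that the 0.017 mm refers to the spacing between adjacent rays (1/100 of a degree when 100 rays span one degree), evaluated at an assumed needle depth of about 100 mm from target point to skin.

```python
import math

# Assumed radius from the target point to the skin, in millimetres.
# This depth is illustrative; the text does not state one.
depth_mm = 100.0

rays_per_degree = 100                    # "100-500 rays ... within each degree"
step_deg = 1.0 / rays_per_degree         # angular spacing between adjacent rays
step_rad = math.radians(step_deg)

arc_mm = depth_mm * step_rad             # arc length s = r * theta
print(f"{arc_mm:.3f} mm between adjacent rays")  # ~0.017 mm
```

At that assumed depth the spacing works out to about 0.017 mm, consistent with the millimeter-level claim; at shallower depths the spacing is proportionally finer.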
  • FIG. 1 is a flow chart of a navigation method for intraspinal puncture based on mixed reality technology in one embodiment of the present application;
  • FIG. 2 is a schematic diagram showing that the ray starting from the collision point does not collide with bones or blood vessels before reaching the target point; in this case, the collision point is the optimal needle insertion point;
  • FIG. 3 is a schematic diagram showing that the ray starting from the collision point collides with a bone or a blood vessel before reaching the target point; in this case, the collision point is invalid;
  • FIG. 4 is a schematic diagram showing the principle of searching for candidate needle insertion points when the collision point is detected to be invalid.
  • Reference numerals: 10 - skin surface of the three-dimensional image; 20 - bone; a - collision point; b - target point; c - calculation area.
  • This embodiment provides a navigation method for intraspinal puncture based on mixed reality technology, as follows:
  • Step 1 Obtain a CT image of the patient.
  • Step 2 Perform professional three-dimensional reconstruction and add targets within its specified area.
  • the method of obtaining three-dimensional images based on CT images is as follows:
  • Targets are added using 3D balls in the analysis toolbar of MIMICS: in the three-dimensional view, balls are placed at the required parts and can then be moved to adjust them to the required positions.
  • STL file export: the extracted model and the placed targets are saved by exporting STL files. During extraction, the solid model and the added balls need to be extracted separately, and each part should be named as it is extracted.
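For illustration, the sketch below shows the binary STL layout such an export produces: an 80-byte header, a uint32 facet count, then 50 bytes per facet (a normal, three vertices as little-endian float32 triples, and a 2-byte attribute word). This is a generic pure-Python writer/reader, not MIMICS code, and the part name "bone" is a made-up example.

```python
import io
import struct

def write_binary_stl(triangles, name=b"part"):
    """Serialize triangles [(normal, v1, v2, v3), ...] as binary STL bytes."""
    buf = io.BytesIO()
    buf.write(name.ljust(80, b"\0"))               # 80-byte header
    buf.write(struct.pack("<I", len(triangles)))   # uint32 facet count
    for normal, v1, v2, v3 in triangles:
        for vec in (normal, v1, v2, v3):
            buf.write(struct.pack("<3f", *vec))    # 12 float32 values per facet
        buf.write(struct.pack("<H", 0))            # attribute byte count
    return buf.getvalue()

def read_binary_stl(data):
    """Parse binary STL bytes back into (header, triangles)."""
    header = data[:80].rstrip(b"\0")
    (count,) = struct.unpack_from("<I", data, 80)
    triangles, offset = [], 84
    for _ in range(count):
        floats = struct.unpack_from("<12f", data, offset)
        triangles.append((floats[0:3], floats[3:6], floats[6:9], floats[9:12]))
        offset += 50                               # 48 bytes of floats + 2 attr
    return header, triangles

# One facet of a hypothetical "bone" part, round-tripped through the format
tri = ((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
blob = write_binary_stl([tri], name=b"bone")
header, tris = read_binary_stl(blob)
print(header, len(tris))  # b'bone' 1
```

Marking each part's name in the header (or, in practice, exporting one file per part) is what lets the navigation software later distinguish skin, bone, vessel, and target geometry.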
  • The above method can virtually reconstruct the real lesion into a 1:1 restored and adapted stereoscopic image that can be viewed directly.
  • Step 3 Upload the STL file to the mixed reality glasses.
  • The mixed reality glasses of this embodiment use the Unity engine, which has underlying optimization, batch processing technology, multi-threading support, and lightweight components.
  • The algorithm can be executed about 30 times in 1 second, thereby achieving the 1-second virtual-real adaptation effect.
  • Step 4 Open the intraspinal puncture navigation software installed in the mixed reality smart glasses.
  • The mixed reality smart glasses include a memory, a processor, and a computer program stored in the memory and executable on the processor (i.e., the intraspinal puncture navigation software corresponding to the intraspinal puncture navigation method based on mixed reality technology of the present application); the method is implemented when the computer program is executed by the processor.
  • The intraspinal puncture navigation method based on mixed reality technology of the present application (i.e., the following steps 4.1 to 4.3) is shown in FIG. 1:
  • Step 4.1 Receive a click operation on the surface of the 3D image skin 10 to obtain a collision point a.
  • The camera on the mixed reality glasses captures the position of the hand in space within one second; the position point where the finger clicks on the skin of the stereoscopic image is taken as the collision point a.
  • Step 4.2 Draw a ray from the collision point a to the target point b in the spinal canal inside the three-dimensional image, and detect whether the ray collides with bones or blood vessels before reaching the target point b (a detection ray is cast from point a to point b to determine whether it intersects the layers of other organs).
  • FIG. 2 is a three-dimensional image reconstructed by mixed reality technology that can be viewed directly, where 10 is the skin surface of the three-dimensional image, 20 is the bone, a is the collision point, and b is the target point (target point b is pre-determined by the doctor within the surgical area of the spinal canal).
  • When the ray collides with bones or blood vessels before reaching target point b, the collision point a is determined to be invalid. As shown in FIG. 3, the ray emitted from a collides with the bone before reaching b; therefore, the current collision point a is determined to be invalid.
  • A ray is then drawn from each candidate needle insertion point to the target point b located in the spinal canal inside the three-dimensional image to detect whether the ray collides with bones or blood vessels before reaching target point b; when it is detected that the ray does not collide with bones or blood vessels before reaching target point b, the candidate needle insertion point corresponding to the ray is determined to be the optimal needle insertion point;
  • the angle between the candidate vector and the reference vector is further enlarged by 1 degree, and then the same method is repeated, and so on.
  • the maximum angle between the candidate vector and the reference vector does not exceed 5 degrees.
  • When the angle between the candidate vectors and the reference vector reaches 5 degrees and the optimal insertion point is still not found, it is judged that there is no optimal insertion point.
  • the candidate needle insertion point corresponding to the ray is determined to be the optimal needle insertion point.
  • Step 4.3 When there is an optimal needle entry point, a visual navigation path from the optimal needle entry point to the target point b is generated.
  • The visual navigation path is generated by an algorithm, which controls the scaling of the displayed guide's length to render the path effect.
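The collision detection in steps 4.2 and S2/S8 reduces to ray-versus-triangle tests against the reconstructed bone and vessel meshes. A standard formulation is the Möller-Trumbore algorithm, sketched below as an illustration; a real engine (the text mentions Unity) would run such a test against whole meshes through an acceleration structure rather than one facet at a time, and the function name here is invented.

```python
def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle test.

    Returns the distance t along `direction` at which the ray hits the
    triangle (v0, v1, v2), or None if it misses.
    """
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)       # triangle edge vectors
    h = cross(direction, e2)
    det = dot(e1, h)
    if abs(det) < eps:                      # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = inv * dot(s, h)                     # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = inv * dot(direction, q)             # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = inv * dot(e2, q)                    # distance along the ray
    return t if t > eps else None

# A ray aimed straight down at a triangle in the z = 0 plane hits it at t = 1
hit = ray_hits_triangle((0.25, 0.25, 1.0), (0.0, 0.0, -1.0),
                        (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(hit)  # 1.0
```

A needle path from an insertion point to target b is judged clear when no triangle of any bone or vessel mesh returns a hit distance shorter than the distance to b.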

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention relates to a method and system for navigating an intraspinal puncture based on mixed reality technology. The method comprises: step S1: receiving a click operation on a skin surface (10) in a three-dimensional image to obtain a collision point (a), the three-dimensional image being formed by three-dimensional reconstruction based on mixed reality technology; step S2: generating a ray from the collision point (a) to a target point (b) in the spinal canal in the three-dimensional image, detecting whether the ray collides with a bone (20) or a blood vessel before reaching the target point (b), and, when it is detected that the ray does not collide with the bone (20) or a blood vessel before reaching the target point (b), determining that the collision point is an optimal needle insertion point; and step S3: when the optimal needle insertion point is present, generating a visual navigation path from the optimal needle insertion point to the target point. The method and system provide a fast, real-time, accurate and minimally invasive navigation plan for intraspinal puncture.
PCT/CN2023/136151 2023-11-27 2023-12-04 Method and system for navigating an intraspinal puncture based on mixed reality technology Pending WO2025112078A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202311591907.5A CN117717415A (zh) 2023-11-27 2023-11-27 一种基于混合现实技术的椎管内穿刺的导航方法及系统
CN202311591907.5 2023-11-27

Publications (1)

Publication Number Publication Date
WO2025112078A1 (fr) 2025-06-05

Family

ID=90204297

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/136151 Pending WO2025112078A1 (fr) 2023-11-27 2023-12-04 Procédé et système de navigation pour une ponction intrarachidienne basés sur une technologie de réalité mixte

Country Status (2)

Country Link
CN (1) CN117717415A (fr)
WO (1) WO2025112078A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970988A (zh) * 2014-04-14 2014-08-06 中国人民解放军总医院 消融针穿刺路径规划方法及系统
KR20150073510A (ko) * 2013-12-23 2015-07-01 재단법인 아산사회복지재단 의료용 바늘의 삽입 경로의 생성 방법
US20210196399A1 (en) * 2019-12-31 2021-07-01 Auris Health, Inc. Alignment techniques for percutaneous access
CN116135159A (zh) * 2021-11-17 2023-05-19 中移(苏州)软件技术有限公司 三维路径规划方法、装置、设备和存储介质
CN116869622A (zh) * 2023-01-20 2023-10-13 深圳市箴石医疗设备有限公司 一种穿刺手术路径规划方法、装置及存储介质
CN117084791A (zh) * 2023-10-19 2023-11-21 苏州恒瑞宏远医疗科技有限公司 一种穿刺方位解算方法以及穿刺作业执行系统


Also Published As

Publication number Publication date
CN117717415A (zh) 2024-03-19


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23959984

Country of ref document: EP

Kind code of ref document: A1