
WO2020116991A1 - Ultrasound imaging device having an acupuncture guide function - Google Patents

Ultrasound imaging device having an acupuncture guide function

Info

Publication number
WO2020116991A1
WO2020116991A1 (PCT application PCT/KR2019/017194, KR 2019 017194 W)
Authority
WO
WIPO (PCT)
Prior art keywords
needle
image
ultrasound
information
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2019/017194
Other languages
English (en)
Korean (ko)
Inventor
이상훈
전영주
전민호
김대혁
김소영
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Korea Institute of Oriental Medicine KIOM
Original Assignee
Korea Institute of Oriental Medicine KIOM
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Korea Institute of Oriental Medicine KIOM filed Critical Korea Institute of Oriental Medicine KIOM
Publication of WO2020116991A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Clinical applications
    • A61B 8/0833 Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/465 Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • A61B 8/466 Displaying means of special interest adapted to display 3D data
    • A61B 8/467 Ultrasonic, sonic or infrasonic diagnostic devices characterised by special input means
    • A61B 8/469 Ultrasonic, sonic or infrasonic diagnostic devices characterised by special input means for selection of a region of interest
    • A61B 8/48 Diagnostic techniques
    • A61B 8/488 Diagnostic techniques involving Doppler signals
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5207 Devices using data or image processing involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 8/54 Control of the diagnostic device

Definitions

  • U.S. Patent No. 9,292,654 discloses an ultrasound imaging device that provides tutorial information. Instruction information including text, images, and videos is provided for each operation step of the ultrasound imaging apparatus. In addition, reference parameters of a setting status corresponding to each operation step are prepared, and they are set according to the operation step.
  • US Patent Publication No. 2014/0039304A9 discloses a technique that detects blood flow by driving an ultrasound imaging device in a Doppler mode, generates location information of the blood flow, and warns when the needle approaches the blood vessel.
  • An ultrasound imaging apparatus suitable for acupuncture needling in the field of Korean medicine is proposed.
  • Guidance information suitable for users who do not have much experience with ultrasound imaging equipment is provided for each acupuncture point where acupuncture is performed.
  • when the needle approaches a dangerous area, the ultrasound imaging device recognizes this and provides a warning.
  • a method is proposed in which an ultrasound imaging device can detect a needle's entry into a dangerous area more quickly and reliably.
  • risk zone information is prepared and stored in advance for each acupuncture point.
  • the position of the needle is tracked by the positioning system.
  • the risk area information corresponding to the selected acupuncture point is extracted from the stored risk area information, and the ultrasound image is recognized with the help of this information to determine the risk area in the actual image.
  • the determined danger zone information is stored, and a warning is output when the needle position approaches the danger zone.
  • a warning may be output when the position of the tip of the needle is close to the dangerous area even if it is outside the range of the ultrasound image displayed on the screen.
  • with the proposed invention, it is possible to reduce the risk of injury during acupuncture. Furthermore, because danger-zone entry can be detected quickly and accurately, the method is advantageous for real-time processing.
  • Figure 1 shows the configuration of the acupuncture needling system according to an embodiment.
  • FIG. 2 is a block diagram showing the configuration of a probe according to an embodiment.
  • Figure 3 is a block diagram showing the configuration of an ultrasound imaging apparatus for acupuncture needling according to an embodiment.
  • FIG. 4 is a block diagram showing the configuration of the needle position detection unit according to another embodiment.
  • FIG. 5 is a block diagram showing the configuration of a surgical risk warning unit according to another embodiment.
  • FIG. 6 is a flowchart illustrating a configuration of an ultrasound image display control method according to an embodiment.
  • FIG. 7 is a flow chart showing the configuration of one embodiment of the sensor-based position calculation step 661.
  • each block should be understood to represent various embodiments in which one, two, or more optional blocks are combined with the essential blocks.
  • Figure 1 shows the configuration of the acupuncture needling system according to an embodiment.
  • the acupuncture needling system includes an ultrasound imaging apparatus 50, a probe 10, and a needle 30.
  • the operator selects an acupuncture point, and in response to the user's manipulation through the operation portion, the main body of the ultrasound imaging apparatus 50 displays on the screen a document explaining the needling method for the corresponding acupuncture point, a 2D or 3D image showing the anatomical structure of the point, and a graphic animation or video showing the needling procedure for the point.
  • a positioning system for positioning the needle 30 is mounted. On the screen of the ultrasound imaging apparatus 50, the position of the needle 30 detected by the positioning system is displayed on the image scanned by the probe 10.
  • a probe according to an embodiment includes an ultrasonic element array 250 and an ultrasonic element driver 225 driving the ultrasonic element array 250. Additionally, the probe according to an embodiment may further include an acceleration sensor 230 and an acceleration sensor driver 223. The acceleration sensor 230 may be, for example, a gyro sensor. The absolute position of the probe can be calculated using an acceleration sensor. Additionally, the probe according to an embodiment may further include magnetic position sensors 210-1, 210-2, ..., 210-n, and a magnetic sensor driver 221 for driving them. In one embodiment, the needle positioning system is a magnetic force based positioning system.
  • the needle is magnetized, and a plurality of magnetic position sensors provided in the probe measure the intensity of the magnetic field to calculate the position of the magnetized needle.
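As a rough illustration of how several field-strength readings could be turned into a needle position, the sketch below assumes a simple point-source falloff (|B| ≈ k/d³) and recovers the tip by linearized multilateration. The falloff model, the constant k, and all names are assumptions for illustration, not the method of the commercial positioning system mentioned later.

```python
import numpy as np

def estimate_needle_position(sensor_positions, field_strengths, k=1.0):
    """Estimate a magnetized needle's position from the field magnitudes
    seen by several magnetic sensors in the probe (illustrative sketch).

    Assumes |B| ~ k / d**3, so each measurement yields a distance
    estimate d_i; the position is recovered by linearized multilateration.
    """
    p = np.asarray(sensor_positions, dtype=float)   # (n, 3) sensor coords
    b = np.asarray(field_strengths, dtype=float)    # (n,) field magnitudes
    d = (k / b) ** (1.0 / 3.0)                      # distance to each sensor

    # Subtract the first sphere equation from the rest to obtain a
    # linear system A x = c for the unknown position x.
    A = 2.0 * (p[1:] - p[0])
    c = (d[0] ** 2 - d[1:] ** 2
         + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2))
    x, *_ = np.linalg.lstsq(A, c, rcond=None)
    return x
```

At least four non-coplanar sensors are needed for an unambiguous 3D solution, which is consistent with the text's "plurality of magnetic position sensors".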
  • the needle positioning system is an RF-based positioning system.
  • the RF transmitter is fixed to the needle, and a plurality of receivers provided in the probe measure the intensity of the RF field to calculate the position of the needle.
  • the proposed invention is not limited to a specific positioning system, and may be one of known positioning systems not based on ultrasound images.
  • FIG. 3 is a block diagram showing the configuration of an ultrasound imaging apparatus for acupuncture needling according to an embodiment.
  • the ultrasound imaging apparatus for acupuncture needling provides images of the biological tissue at the treatment site during the needling procedure.
  • the ultrasound imaging apparatus for acupuncture needling according to an embodiment includes an ultrasound signal processing unit 370, a user interface unit 390, a needle position detection unit 330, and a procedure risk warning unit 310.
  • the ultrasonic signal processing unit 370, the user interface unit 390, the needle position detection unit 330, and the surgical risk warning unit 310 may be implemented, in whole or in part, with program instructions executed by one or more computing elements, such as a microprocessor or a digital signal processor.
  • the computing elements process data by reading and executing program instructions stored in the storage unit 320.
  • the storage 320 may be, for example, one or a combination of mass storage such as hard disk, SSD, and network storage, and memory such as nonvolatile memory and volatile RAM.
  • the storage unit 320 stores operation guide information including operation guide content for each treatment part and risk region information for each treatment part.
  • the treatment site is defined according to the acupuncture points. When the operator selects acupuncture points, a corresponding treatment site is selected. Acupuncture points for acupuncture are defined in the field of Korean medicine. Accurate treatment of acupuncture points requires a lot of experience and anatomical knowledge.
  • the manipulation guide content is prepared as a database having a tree structure for each major and sub-category of the acupuncture points.
  • the operation guide content includes document information describing the needling method for the corresponding acupuncture point, 2D or 3D image information showing the anatomical structure of the point, and graphic animations or videos showing the needling procedure for the point.
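The two-level (major category / sub-category) guide database described above might be organized as a nested mapping; the point codes and content fields below are hypothetical placeholders, not identifiers from the patent:

```python
# Hypothetical tree-structured guide database: major category -> point -> content.
guide_db = {
    "LI": {                              # major category (illustrative code)
        "LI4 (Hegu)": {
            "document": "How to needle LI4: patient posture, probe placement ...",
            "anatomy_image": "li4_3d.png",
            "video": "li4_procedure.mp4",
        },
    },
}

def get_guide_content(major, point):
    """Look up the guide content for an acupuncture point in the
    two-level tree the text describes."""
    return guide_db[major][point]
```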
  • the document information may include a brief description of the acupuncture needle acupuncture procedure, information about the patient posture and the location where the probe is to be located, and a specific method of accurately locating the treatment site using ultrasound.
  • the procedure guide information may be stored in the storage unit 320.
  • the procedure guide information includes risk region information for each treatment site.
  • the danger region information may be, for example, image patterns or feature information that can distinguish specific biological tissues, for example, blood vessels, from an ultrasound image.
  • an ultrasound image has limited resolution, and thus it is not easy to recognize a specific biological tissue from the image. Since the acupuncture needling procedure is limited to a very narrow location, however, the treatment site can be specified.
  • in recognizing a dangerous area in an ultrasound image, information such as the pattern or features of the corresponding biological tissue as commonly seen in ultrasound images, for example, characteristics of the luminance distribution or the shape of a boundary line, can increase the accuracy or the likelihood of success.
  • the danger area information may be 3D area information of biological tissue.
  • the standard 3D region information of the biological tissue indicates the shape of the corresponding biological tissue, and the overall 3D region information may be a map for finding the biological tissue in an ultrasound image.
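One minimal way to use such stored pattern information as a "map" is to compare the luminance distribution of each image block against a stored vessel histogram. The block size, the histogram feature, and the similarity threshold here are illustrative assumptions, not the patent's specified algorithm:

```python
import numpy as np

def find_danger_blocks(image, vessel_hist, block=8, threshold=0.9):
    """Flag image blocks whose luminance distribution matches a stored
    vessel pattern (a rough sketch of the pattern/feature matching the
    text describes; all parameters are illustrative)."""
    h, w = image.shape
    hits = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = image[y:y + block, x:x + block]
            hist, _ = np.histogram(patch, bins=len(vessel_hist),
                                   range=(0, 256), density=True)
            # cosine similarity between the block's luminance histogram
            # and the stored vessel luminance histogram
            denom = np.linalg.norm(hist) * np.linalg.norm(vessel_hist)
            if denom > 0 and hist @ vessel_hist / denom >= threshold:
                hits.append((y, x))
    return hits
```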
  • the ultrasound signal processing unit 370 generates an image by processing an ultrasound signal obtained by driving an array of ultrasound elements.
  • the imaging mode of the ultrasound imaging apparatus includes a B mode (brightness mode), an M mode (motion mode), a D mode (Doppler mode) using a Doppler effect, and a C mode (color Doppler mode).
  • the ultrasonic signal processing unit 370 drives the ultrasonic element array 250 in one of these modes to generate ultrasonic image information from the output signal.
  • the user interface unit 390 receives a user's operation command through a control device such as a mouse, touch pad, or keyboard, and outputs ultrasound images or system-generated information as visual information through a screen or as audio information through a speaker.
  • the user interface unit 390 includes an instruction information providing unit 391.
  • the instruction information providing unit 391 receives the acupuncture point selection from the operator and extracts and provides the corresponding operation guide content from the storage unit 320.
  • the user selects one of the major categories through, for example, a touch input, and then selects one of the sub-categories to select the acupuncture points.
  • the user interface unit 390 extracts the operation guide content of the selected acupuncture points from the database.
  • document information explaining the needling method for the selected acupuncture point, 2D or 3D image information showing its anatomical structure, and graphic animations or videos showing the needling procedure are provided according to the user's selection.
  • the user interface unit 390 may further include a display setting unit 393 for each site.
  • the procedure guide information may further include setting parameters for each treatment site.
  • the display setting unit for each part 393 extracts the setting parameter for each treatment part from the storage unit 320 and sets the display control parameter of the ultrasound image accordingly.
  • the needle position detector 330 calculates the position of the needle from at least two sensor inputs.
  • U.S. Patent No. 9,597,008, issued to EZONO AG on March 21, 2017, discloses a technique for measuring the position of a needle by measuring the strength of the magnetic field of a magnetized needle using a plurality of magnetometric detectors. Needle positioning systems from EZONO AG are commercially available, and the applicants are using this technology in developing the proposed invention in cooperation with EZONO AG.
  • the magnetic positioning system of EZONO AG uses a plurality of magnetic detection elements embedded in the probe, and can calculate the position of the needle and output it superimposed on the ultrasound image.
  • the proposed invention is not limited to this, and various known positioning techniques, such as the above-described RF method, an anchor method, and a positioning system using radio wave fingerprint, can be applied.
  • the probe 10 includes an acceleration sensor 230, for example a gyro sensor.
  • the probe position detector 350 measures the position and posture of the probe, that is, the tilted direction, from the output of the acceleration sensor 230.
  • the needle position detection unit 330 first calculates the position of the needle 30 in the reference coordinate system of the probe 10 from the output of the magnetic position detection elements 210. Thereafter, the needle position detection unit 330 converts the coordinate system using the position and posture information of the probe detected by the probe position detection unit 350 to calculate the position of the needle 30 in the absolute coordinate system.
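The two-stage calculation described above (needle position in the probe's reference frame, then conversion to the absolute frame using the probe pose) is the standard rigid transform x_world = R·x_probe + t; a minimal sketch, with all names illustrative:

```python
import numpy as np

def probe_to_world(needle_in_probe, probe_position, probe_rotation):
    """Convert a needle position expressed in the probe's reference
    frame into the absolute (world) frame, given the probe pose
    measured by the acceleration/gyro sensor."""
    R = np.asarray(probe_rotation, dtype=float)   # 3x3 rotation, probe -> world
    t = np.asarray(probe_position, dtype=float)   # probe origin in world coords
    return R @ np.asarray(needle_in_probe, dtype=float) + t
```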
  • the operation danger warning unit 310 determines a three-dimensional danger region from the danger region information corresponding to the selected treatment region and the ultrasound image generated by the ultrasonic signal processing unit, and outputs a warning signal when the position of the tip of the needle, as calculated by the needle position detection unit 330, is close to the danger zone.
  • the operation risk warning unit 310 includes a danger area recognition unit 313 and a risk determination unit 311.
  • the danger zone recognition unit 313 recognizes a 3D danger zone from an ultrasound image using the danger zone information.
  • the danger zone information is image pattern or feature information of a specific biological tissue.
  • the risk region information may be, for example, three-dimensional models of the blood vessels present at a specific acupuncture site, together with the image patterns that each model produces when seen in an ultrasound image.
  • alternatively, a pattern value used to identify a blood vessel region, determined by comparing a function value with a reference value, may be the risk zone information.
  • the danger zone recognition unit 313 first stores a plurality of images obtained while the probe 10 scans around the selected acupuncture point, and acquires three-dimensional information around the point from these images. This 3D information is then modeled as a 3D volume image.
  • the 3D danger area is recognized from the input ultrasound image collection using the image patterns in the danger area information.
  • based on feature points recognized in the initially acquired 3D image, the 3D image is zoomed and shifted so that it has the same scale and position as the stored 3D danger-area modeling information. Subsequently, the transformed 3D ultrasound image is partitioned into blocks.
  • 3D dangerous region modeling information may be obtained from the acquired ultrasound image collection and matched with the ultrasound image.
  • the risk determination unit 311 calculates how close the needle-tip position calculated by the needle position detection unit 330 is to the 3D danger-area model in the patient's biological tissue recognized by the danger area recognition unit 313, and generates and outputs a warning message if the tip comes within a certain distance, or if the tip is moving toward the danger zone faster than a certain speed.
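The two warning conditions just described (proximity, and approach direction combined with speed) can be sketched as a small decision function. For simplicity the danger region is reduced to a single center point, and the distance and speed thresholds are arbitrary illustrative values:

```python
import numpy as np

def should_warn(tip, tip_prev, danger_center, dt,
                dist_threshold=5.0, speed_threshold=10.0):
    """Warn when the needle tip is within dist_threshold of the danger
    region, or is approaching it faster than speed_threshold
    (units, e.g. mm and mm/s, are illustrative)."""
    tip = np.asarray(tip, dtype=float)
    prev = np.asarray(tip_prev, dtype=float)
    center = np.asarray(danger_center, dtype=float)

    dist = np.linalg.norm(center - tip)
    if dist <= dist_threshold:          # condition 1: proximity
        return True

    v = (tip - prev) / dt               # tip velocity estimate
    to_zone = (center - tip) / max(dist, 1e-9)  # unit vector toward zone
    approach_speed = float(v @ to_zone)         # positive = approaching
    return approach_speed >= speed_threshold    # condition 2: fast approach
```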
  • the warning message can be output on the screen and/or audio.
  • the position of the needle is calculated by a positioning system separate from the ultrasound image.
  • the 3D modeling information of the dangerous area is calculated from the ultrasound volume image and stored. Accordingly, since the operation risk warning unit 310 compares the position of the tip of the needle with the stored 3D danger-area model, a warning can be output when the tip approaches the danger zone even if it is outside the range of the ultrasound image that the ultrasonic signal processing unit 370 generates and outputs to the screen.
  • the needle position detection unit 330 may include a sensor-based location calculation unit 331, an image-based location calculation unit 333, and a location information output unit 335.
  • FIG. 4 is a block diagram showing the configuration of the needle position detection unit according to another embodiment.
  • the sensor-based position calculator 331 calculates the position of the needle from at least two sensor inputs. This is as described above.
  • the image-based position calculator 333 estimates the position of the needle by tracking the movement of the tissue in the ultrasound image. It is advantageous to use other methods in parallel, because danger-zone entry may take place at a moment when the magnetic positioning system is inoperative.
  • the needle is very thin, so it is not visible on the ultrasound image.
  • the image-based position calculating unit 333 compares successively generated ultrasound images to identify changing and unchanging portions, and takes a portion changing in a specific direction as the needle position. For example, by calculating the amount of change between image frames, a motion vector can be obtained, from which the motion of biological tissue can be identified. When the needle enters living tissue, movement occurs because the surrounding tissue is pushed aside. This change amount can reflect motion relatively well even when image quality is poor.
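A crude stand-in for this inter-frame change analysis is to score each image block by its mean absolute frame difference and take the most-changed block as the moving (needle) region; the block size and scoring are illustrative choices, not the patent's specified motion-vector method:

```python
import numpy as np

def locate_moving_region(frame_prev, frame_cur, block=8):
    """Return the center (row, col) of the image block with the largest
    inter-frame change: tissue pushed by an advancing needle changes
    most between frames (illustrative sketch)."""
    diff = np.abs(frame_cur.astype(float) - frame_prev.astype(float))
    h, w = diff.shape
    best, best_score = None, -1.0
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            score = diff[y:y + block, x:x + block].mean()
            if score > best_score:
                best, best_score = (y + block // 2, x + block // 2), score
    return best
```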
  • the position of the tip of the needle can be estimated by analyzing the direction of movement of biological tissue.
  • the ultrasonic signal processing unit 370 can drive the array of ultrasonic elements in a Doppler mode, detect the speed of biological tissue pushed aside as the needle enters, identify that tissue, and estimate the position of the needle from it.
  • the Doppler mode can sensitively identify moving tissue. While operating in the Doppler mode, the movement of biological tissues can be obtained through the difference between frames of the ultrasound image, and the position of the needle can be estimated by finding the portion of the movement or the central axis of the movement from the distribution of the movement.
  • the location information output unit 335 calculates the position of the needle by synthesizing the outputs of the sensor-based location calculation unit 331 and the image-based location calculation unit 333.
  • the location information output unit 335 preferentially selects the output of the sensor-based location calculating unit 331, but when it determines that this output is absent or unreliable, it selects the output of the image-based location calculating unit 333, which runs in parallel.
  • the result of calculating the needle position can be judged unreliable, for example, when the continuity of the movement of the tip of the needle is largely broken.
  • in another embodiment, the location information output unit 335 preferentially selects the output of the sensor-based location calculation unit 331, but when it determines that this output is absent or unreliable, it generates a position by combining it with the output of the image-based location calculation unit 333, which runs in parallel.
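The selection/combination policy of the two embodiments above can be sketched as a small fusion function; the 50/50 blend used when the sensor output is unreliable is an arbitrary illustrative choice:

```python
def fuse_positions(sensor_pos, image_pos, sensor_reliable):
    """Prefer the sensor-based position; fall back to, or blend with,
    the image-based estimate when the sensor output is missing or
    judged unreliable (illustrative policy)."""
    if sensor_pos is not None and sensor_reliable:
        return sensor_pos
    if sensor_pos is None:
        return image_pos
    # unreliable sensor reading: blend the two estimates
    return tuple(0.5 * (s + i) for s, i in zip(sensor_pos, image_pos))
```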
  • the image display control unit 360 displays a combination of the ultrasound image generated by the ultrasound signal processing unit 370, the instruction information provided by the user interface unit 390, and the needle position detected by the needle position detection unit 330. For example, the image display control unit 360 can overlay on the ultrasound image a needle graphic reflecting the detected needle position. As another example, it can divide the screen to display the instruction information alongside the ultrasound image.
  • the danger zone recognition unit 313 of the operation risk warning unit 310 may include a reflection coefficient image analysis unit 315 and a Doppler image analysis unit 317.
  • the reflection coefficient image analysis unit 315 recognizes a dangerous area by driving the ultrasonic signal processing unit 370 in a luminance mode, or B mode (brightness mode), which generates an image by processing the reflection coefficient of the ultrasonic signal.
  • the specific operation of the reflection coefficient image analysis unit 315 is similar to the operation of the dangerous area recognition unit 313 described above.
  • the Doppler image analysis unit 317 recognizes the hazardous area by driving the ultrasound signal processing unit 370 in a Doppler mode, that is, D mode (Doppler mode) or C mode (color Doppler mode), to generate an image.
  • modeling of the danger zone in the subject's biological tissue can be generated by comparing the shape of the region exhibiting flow against the stored danger zone information.
  • the Doppler image analysis unit 317 may intermittently drive the Doppler mode in the background while the ultrasound signal processing unit 370 is imaging in a luminance mode.
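Intermittently driving the Doppler mode in the background while imaging in luminance mode amounts to interleaving the two modes on a schedule; the period below (every 10th frame) is an arbitrary illustrative choice:

```python
def make_mode_schedule(n_frames, doppler_every=10):
    """Build a per-frame imaging-mode schedule: mostly brightness ('B')
    mode, with an intermittent Doppler ('D') frame in the background
    (the period is illustrative)."""
    return ['D' if i % doppler_every == doppler_every - 1 else 'B'
            for i in range(n_frames)]
```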
  • the ultrasound image display control method includes an acupuncture point selection step 610, a risk region information extraction step 620, a procedure site scanning step 640, a risk region modeling step 650, a needle position detection step 660, and a procedure risk warning step 680.
  • the ultrasound imaging apparatus receives the acupuncture point selection from the operator through the manipulation unit.
  • the processing configuration of the ultrasound imaging apparatus is similar to that of the user interface unit 390 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the ultrasound imaging apparatus reads risk zone information corresponding to the selected acupuncture points from the database.
  • the processing configuration of the ultrasound imaging apparatus in the danger region information extraction step 620 is similar to the processing of the danger region recognition unit 313 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • in the procedure site scanning step 640, the ultrasound imaging apparatus generates an ultrasound volume image scanned around the selected acupuncture point.
  • the processing configuration of the ultrasound imaging apparatus in the procedure region scanning step 640 is similar to the scan processing for modeling the ultrasound signal processing unit 370 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • in the danger zone modeling step 650, the ultrasound imaging apparatus generates 3D danger zone information from the ultrasound volume image produced in the procedure site scanning step 640, creating and storing it with reference to the danger area information for the corresponding acupuncture point extracted in the risk zone information extraction step 620.
  • the processing configuration of the ultrasound imaging apparatus in the danger zone modeling step 650 is similar to the processing of the danger zone recognition unit 313 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the ultrasound imaging apparatus calculates the position of the needle from at least two sensor inputs.
  • the processing configuration of the ultrasound imaging apparatus is similar to the processing of the needle position detection unit 330 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the ultrasound imaging apparatus outputs a warning signal when the position of the needle tip according to the position of the needle calculated in the needle position detection step 660 approaches the stored danger area.
  • the operation risk warning step 680 may output a warning when the position of the tip of the needle is close to the danger area even if it is outside the range of the displayed ultrasound image.
  • the danger zone information may include image pattern or feature information of biological tissue.
  • the danger area information may include 3D area information of biological tissue.
  • the processing configuration of the ultrasound imaging apparatus in the procedure risk warning step 680 is similar to the processing of the risk determination unit 311 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the ultrasound image display control method may further include a display setting step 630 for each region.
  • the ultrasound imaging apparatus sets the display control parameter of the ultrasound image according to the setting parameter for each treatment part included in the procedure guide information.
  • the processing configuration of the ultrasound imaging device in the display setting step 630 for each part is similar to the processing of the display setting part 393 for each part described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the danger region modeling step 650 may recognize the danger region in an image generated by intermittently driving the ultrasound signal processor in a Doppler mode.
  • in the dangerous area modeling step 650, the Doppler mode may be driven intermittently in the background while imaging in the luminance mode.
  • the processing configuration of the ultrasound imaging apparatus is similar to the processing of the Doppler image analysis unit 317 described with reference to FIG. 5, so a detailed description thereof will be omitted.
  • the needle position detection step 660 may include a sensor-based position calculation step 661, an image-based position calculation step 663, and a location information output step 665.
  • FIG. 7 is a flow chart showing the configuration of one embodiment of the sensor-based position calculation step 661.
  • the ultrasound imaging apparatus calculates the position of the needle from at least two sensor inputs.
  • the ultrasound imaging apparatus estimates the position of the needle by tracking the movement of the tissue in the ultrasound image.
  • the ultrasound imaging apparatus sequentially compares the generated ultrasound images to identify changing and unchanging parts, and may calculate a part that changes in a specific direction as the needle position.
  • the ultrasound imaging apparatus calculates the position of the needle by combining the outputs of the sensor-based position calculation step 661 and the image-based position calculation step 663.
  • the processing configuration of the ultrasound imaging apparatus is similar to the processing of the needle position detection unit 330 described with reference to FIG. 4, so a detailed description thereof will be omitted.
  • the proposed invention can be applied to an ultrasound imaging device for acupuncture.
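As an illustration of the proximity check behind the procedure risk warning step 680, the sketch below models a danger region as an axis-aligned 3D box and warns when the needle tip comes within a threshold distance of it, whether or not the tip lies inside the displayed image. All names, the box model of a danger region, and the 5 mm threshold are assumptions for illustration, not details taken from this application.

```python
# Hypothetical sketch of the step-680 proximity check; the box model of a
# danger region and the warning threshold are illustrative assumptions.
import math

def distance_to_region(tip, region_min, region_max):
    """Euclidean distance from a 3D needle-tip position to an
    axis-aligned danger-region box; 0 if the tip is inside the box."""
    dx = max(region_min[0] - tip[0], 0.0, tip[0] - region_max[0])
    dy = max(region_min[1] - tip[1], 0.0, tip[1] - region_max[1])
    dz = max(region_min[2] - tip[2], 0.0, tip[2] - region_max[2])
    return math.sqrt(dx * dx + dy * dy + dz * dz)

def check_risk(tip, danger_regions, threshold_mm=5.0):
    """Return (name, distance) for every danger region the tip approaches,
    regardless of whether the tip lies inside the displayed image."""
    warnings = []
    for name, (rmin, rmax) in danger_regions.items():
        d = distance_to_region(tip, rmin, rmax)
        if d <= threshold_mm:
            warnings.append((name, d))
    return warnings

regions = {"artery": ((10.0, 10.0, 5.0), (14.0, 12.0, 9.0))}
print(check_risk((15.0, 11.0, 7.0), regions))  # → [('artery', 1.0)]
```

Because the check runs against the stored 3D danger-region model rather than the on-screen pixels, it naturally covers a tip that has advanced beyond the displayed field of view.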
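The per-treatment-part display setting step 630 can be pictured as a lookup that overlays part-specific parameters from the procedure guide information onto device defaults; the parameter names and values below are hypothetical, not taken from this application.

```python
# Hypothetical display-control parameters for step 630; the actual
# parameters carried in the procedure guide information are not specified
# here, so these names and values are illustrative only.
DEFAULTS = {"depth_cm": 4.0, "gain_db": 50, "focus_cm": 2.0}

PART_SETTINGS = {
    "shoulder": {"depth_cm": 5.0, "gain_db": 55, "focus_cm": 2.5},
    "wrist": {"depth_cm": 2.5, "gain_db": 48, "focus_cm": 1.0},
}

def display_settings_for(part):
    """Merge part-specific overrides over the device defaults; unknown
    parts simply fall back to the defaults."""
    settings = dict(DEFAULTS)
    settings.update(PART_SETTINGS.get(part, {}))
    return settings

print(display_settings_for("wrist")["depth_cm"])  # → 2.5
```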
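The intermittent background Doppler drive of the danger region modeling step 650 amounts to interleaving an occasional Doppler acquisition into the brightness-mode frame stream; the every-Nth-frame policy and the mode names below are assumptions for illustration.

```python
# Sketch of interleaving Doppler frames into a B-mode stream (step 650);
# the fixed every-Nth-frame interval is an illustrative assumption.
def schedule_modes(n_frames, doppler_every=10):
    """Return the acquisition mode per frame index: mostly brightness (B)
    mode, with a Doppler frame interleaved so the danger-region model
    (e.g. vessel locations) can be refreshed in the background."""
    modes = []
    for i in range(n_frames):
        if i > 0 and i % doppler_every == 0:
            modes.append("doppler")  # refresh the danger-region model
        else:
            modes.append("b_mode")   # normal imaging continues
    return modes

modes = schedule_modes(25)
print(modes.count("doppler"))  # → 2 (frames 10 and 20)
```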
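The image-based position calculation step 663 and the combination of sensor and image outputs in the location information output step 665 could be sketched as below with NumPy. The frame-differencing threshold, the assumption that the needle advances toward greater image depth, and the fixed fusion weight are all illustrative choices, not specifics of this application.

```python
# Illustrative needle-tip estimation from frame differencing (step 663)
# and sensor/image fusion (step 665); thresholds and weights are assumed.
import numpy as np

def estimate_tip_from_images(prev_frame, curr_frame, diff_threshold=30):
    """Compare successive frames, keep pixels that changed, and take the
    deepest changed pixel (largest row index) as the candidate tip,
    assuming the needle advances toward greater depth in the image."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.argwhere(diff > diff_threshold)  # (row, col) pairs
    if changed.size == 0:
        return None
    tip = changed[np.argmax(changed[:, 0])]
    return (int(tip[0]), int(tip[1]))

def fuse_positions(sensor_pos, image_pos, sensor_weight=0.5):
    """Blend the sensor-based (step 661) and image-based (step 663)
    estimates; fall back to whichever single source is available."""
    if sensor_pos is None:
        return image_pos
    if image_pos is None:
        return sensor_pos
    w = sensor_weight
    return tuple(w * s + (1 - w) * i for s, i in zip(sensor_pos, image_pos))

prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[0:31, 20] = 255                         # bright needle track in column 20
print(estimate_tip_from_images(prev, curr))  # → (30, 20)
```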

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

Disclosed is an image display control technology for an ultrasound imaging device. Danger-zone information is prepared and stored in advance for each acupuncture point. The position of an acupuncture needle is tracked by a positioning system. When an acupuncture point is selected, a danger zone is determined in the actual ultrasound image by extracting the danger-zone information corresponding to the selected acupuncture point from the stored danger-zone information and recognizing the ultrasound image using the extracted information. Information on the determined danger zone is stored, and a warning is output when the needle reaches a position close to the danger zone.
PCT/KR2019/017194 2018-12-06 2019-12-06 Ultrasound imaging device having an acupuncture guide function Ceased WO2020116991A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180155772A KR102188176B1 (ko) 2018-12-06 2018-12-06 Ultrasound imaging device with acupuncture treatment guide function
KR10-2018-0155772 2018-12-06

Publications (1)

Publication Number Publication Date
WO2020116991A1 true WO2020116991A1 (fr) 2020-06-11

Family

ID=70973679

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/017194 Ceased WO2020116991A1 (fr) 2018-12-06 2019-12-06 Ultrasound imaging device having an acupuncture guide function

Country Status (2)

Country Link
KR (1) KR102188176B1 (fr)
WO (1) WO2020116991A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102757064B1 (ko) * 2022-02-04 2025-01-21 Classys Inc. Ultrasonic tip and skin treatment guide system using the same


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4575033B2 (ja) * 2004-06-02 2010-11-04 Toshiba Corporation Ultrasonic diagnostic apparatus and method of operating the same
JP5416900B2 (ja) * 2007-11-22 2014-02-12 Toshiba Corporation Ultrasonic diagnostic apparatus and puncture support control program
KR20170123305A (ko) * 2017-10-30 2017-11-07 Lee Il-kwon Human body treatment apparatus having a navigator and method of displaying a treatment position thereby
KR102085588B1 (ko) * 2018-02-09 2020-03-06 Korea University Industry-Academic Cooperation Foundation Treatment tool position tracking system
KR102122767B1 (ko) * 2018-03-05 2020-06-15 Korea University Industry-Academic Cooperation Foundation Apparatus and method for supporting ultrasound examination
KR102133695B1 (ko) * 2018-07-11 2020-07-14 Dongguk University Gyeongju Campus Industry-Academic Cooperation Foundation Needle guide system and method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050049505A1 (en) * 2003-08-27 2005-03-03 Chung-Yuo Wu Intravenous injection device
US20150359991A1 (en) * 2013-03-05 2015-12-17 Ezono Ag System for image guided procedure
KR20150036869A (ko) * 2013-09-30 2015-04-08 Lee Seong-pa Automatic injection device and method
KR101502964B1 (ko) * 2014-03-27 2015-03-17 Wonkwang University Industry-Academic Cooperation Foundation Acupoint-detecting ultrasound probe device for needle guidance
KR20170062708A (ko) * 2015-11-30 2017-06-08 Korea Institute of Oriental Medicine Acupuncture device

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11471151B2 (en) * 2018-07-16 2022-10-18 Cilag Gmbh International Safety logic for surgical suturing systems
US11559298B2 (en) 2018-07-16 2023-01-24 Cilag Gmbh International Surgical visualization of multiple targets
US11564678B2 (en) 2018-07-16 2023-01-31 Cilag Gmbh International Force sensor through structured light deflection
US12181579B2 (en) 2018-07-16 2024-12-31 Cilag Gmbh International Controlling an emitter assembly pulse sequence
US12092738B2 (en) 2018-07-16 2024-09-17 Cilag Gmbh International Surgical visualization system for generating and updating a three-dimensional digital representation from structured light imaging data
US11754712B2 (en) 2018-07-16 2023-09-12 Cilag Gmbh International Combination emitter and camera assembly
US12025703B2 (en) 2018-07-16 2024-07-02 Cilag Gmbh International Robotic systems with separate photoacoustic receivers
US12257013B2 (en) 2019-03-15 2025-03-25 Cilag Gmbh International Robotic surgical systems with mechanisms for scaling camera magnification according to proximity of surgical tool to tissue
US11864729B2 (en) 2019-12-30 2024-01-09 Cilag Gmbh International Method of using imaging devices in surgery
US11925309B2 (en) 2019-12-30 2024-03-12 Cilag Gmbh International Method of using imaging devices in surgery
US11813120B2 (en) 2019-12-30 2023-11-14 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
US11832996B2 (en) 2019-12-30 2023-12-05 Cilag Gmbh International Analyzing surgical trends by a surgical system
US11850104B2 (en) 2019-12-30 2023-12-26 Cilag Gmbh International Surgical imaging system
US11864956B2 (en) 2019-12-30 2024-01-09 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
US11759284B2 (en) 2019-12-30 2023-09-19 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
US11882993B2 (en) 2019-12-30 2024-01-30 Cilag Gmbh International Method of using imaging devices in surgery
US11896442B2 (en) 2019-12-30 2024-02-13 Cilag Gmbh International Surgical systems for proposing and corroborating organ portion removals
US11908146B2 (en) 2019-12-30 2024-02-20 Cilag Gmbh International System and method for determining, adjusting, and managing resection margin about a subject tissue
US11925310B2 (en) 2019-12-30 2024-03-12 Cilag Gmbh International Method of using imaging devices in surgery
US11776144B2 (en) 2019-12-30 2023-10-03 Cilag Gmbh International System and method for determining, adjusting, and managing resection margin about a subject tissue
US11937770B2 (en) 2019-12-30 2024-03-26 Cilag Gmbh International Method of using imaging devices in surgery
US12002571B2 (en) 2019-12-30 2024-06-04 Cilag Gmbh International Dynamic surgical visualization systems
US11759283B2 (en) 2019-12-30 2023-09-19 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
US12053223B2 (en) 2019-12-30 2024-08-06 Cilag Gmbh International Adaptive surgical system control according to surgical smoke particulate characteristics
US11744667B2 (en) 2019-12-30 2023-09-05 Cilag Gmbh International Adaptive visualization by a surgical system
US12096910B2 (en) 2019-12-30 2024-09-24 Cilag Gmbh International Surgical hub for use with a surgical system in a surgical procedure
US11648060B2 (en) 2019-12-30 2023-05-16 Cilag Gmbh International Surgical system for overlaying surgical instrument data onto a virtual three dimensional construct of an organ
US12207881B2 (en) 2019-12-30 2025-01-28 Cilag Gmbh International Surgical systems correlating visualization data and powered surgical instrument data
US11589731B2 (en) 2019-12-30 2023-02-28 Cilag Gmbh International Visualization systems using structured light
US12453592B2 (en) 2019-12-30 2025-10-28 Cilag Gmbh International Adaptive surgical system control according to surgical smoke cloud characteristics

Also Published As

Publication number Publication date
KR102188176B1 (ko) 2020-12-07
KR20200068880A (ko) 2020-06-16

Similar Documents

Publication Publication Date Title
WO2020116991A1 (fr) Dispositif d'imagerie ultrasonore ayant une fonction guide d'acupuncture
EP1323380B1 (fr) Appareil d'imagerie ultrasonique d'une aiguille pour biopsie
US9773305B2 (en) Lesion diagnosis apparatus and method
CN109475343B (zh) 剪切波弹性成像测量显示方法及系统
CN112215843A (zh) 超声智能成像导航方法、装置、超声设备及存储介质
KR102294194B1 (ko) 관심영역의 시각화 장치 및 방법
CN110072465B (zh) 用于肺部超声的目标探头放置
CN107111875A (zh) 用于多模态自动配准的反馈
WO2020116992A1 (fr) Dispositif d'image échographique ayant une fonction de guidage de thérapie par aiguille utilisant un marqueur
KR20160032586A (ko) 관심영역 크기 전이 모델 기반의 컴퓨터 보조 진단 장치 및 방법
US9974618B2 (en) Method for determining an imaging specification and image-assisted navigation as well as device for image-assisted navigation
US20230172428A1 (en) Endoscope image processing device
WO2015119338A1 (fr) Procédé de guidage de la position d'analyse d'une sonde à ultrasons tridimensionnelle, et système de diagnostic par ultrasons utilisant le procédé de guidage
EP3712847A1 (fr) Détection de pointe de cathéter dans une vidéo fluoroscopique à l'aide d'apprentissage profond
KR20160037023A (ko) 컴퓨터 보조 진단 지원 장치 및 방법
CN112533540A (zh) 超声成像的方法、超声成像设备以及穿刺导航系统
KR20160046670A (ko) 영상 진단 보조 장치 및 방법
JP2022037101A (ja) 超音波システム及び方法
KR20200006886A (ko) 니들 가이드 시스템 및 그 방법
CN113040822A (zh) 子宫内膜蠕动的测量方法、用于测量子宫内膜蠕动的设备
CN112545551B (zh) 用于医学成像设备的方法和系统
CN118252529A (zh) 超声扫描方法、装置和系统、电子设备和存储介质
JP7299100B2 (ja) 超音波診断装置及び超音波画像処理方法
CN111292248B (zh) 超声融合成像方法及超声融合导航系统
KR102615722B1 (ko) 초음파 스캐너 및 초음파 스캐너에서의 조준 가이드 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19894365

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19894365

Country of ref document: EP

Kind code of ref document: A1