
WO2009136461A1 - Ultrasonic diagnostic apparatus - Google Patents

Ultrasonic diagnostic apparatus

Info

Publication number
WO2009136461A1
WO2009136461A1 (PCT/JP2009/000668)
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasonic
model
storage
image
anatomical model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2009/000668
Other languages
English (en)
Japanese (ja)
Inventor
瀬戸久美子
東隆
佐々木元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP2010510996A priority Critical patent/JP5027922B2/ja
Publication of WO2009136461A1 publication Critical patent/WO2009136461A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves

Definitions

  • The present invention relates to an ultrasonic diagnostic apparatus, and in particular to technology suited to beginner training in ultrasonic examination, explanation to patients, imaging support during follow-up observation, and the like.
  • An ultrasonic diagnostic apparatus transmits and receives ultrasonic waves to and from a subject via a probe, and reconstructs and displays a tomographic image of the imaging region from the reflected echo signal output by the probe.
  • Such apparatuses are widely used in the medical field because they allow non-invasive, real-time diagnosis of the imaging site.
  • However, ultrasonic tomographic images are difficult for examiners and examinees to interpret because of their limited field of view and the artifacts unique to ultrasound, and because the examination is subjective, results vary greatly with the skill of the operator.
  • To address this, Patent Document 1 proposes an apparatus that displays an ultrasonic tomographic image together with a reference image showing the inside of the living body.
  • In Patent Document 1, an anatomical chart database associating coordinate information in the living body with anatomical charts of its interior is prepared in advance; coordinate information is detected by a position sensor on the probe, and when the probe moves to a position corresponding to stored coordinate information, the corresponding anatomical chart is displayed together with the ultrasonic tomographic image.
  • However, the coordinate information in the anatomical chart database defines organ positions with respect to a standard model. Since organ positions differ between individuals, and organs move under operations such as pressing with the probe, the chart may not match the actual organ positions.
  • Moreover, because the displayed anatomical chart is two-dimensional, it is difficult to grasp the three-dimensional positional relationship between the imaging plane of the ultrasonic tomogram and the organs.
  • The object of the present invention is to support beginner training, follow-up observation, patient explanation, and the like in ultrasonic examination by correcting the organ positions of an anatomical model to match the patient and using the corrected anatomical model to navigate the probe scan with high accuracy.
  • To achieve this, the ultrasonic diagnostic apparatus of the present invention comprises: an ultrasonic probe that transmits and receives ultrasonic waves to and from tissue inside the living body and outputs a reception signal; ultrasonic tomographic image forming means for forming an ultrasonic tomographic image of the tissue from the reception signal; display means, provided with input/output means for entering the target site and the like; position detection means for detecting the position and angle of the ultrasonic probe; model storage means for storing an anatomical model including the organs of the living body; and navigation control means that, from the detected position and angle, specifies the imaging plane of the probe with respect to the anatomical model whose magnification has been corrected to the size of the subject, and displays on the display means the anatomical model including the target site, the imaging plane, and the ultrasonic tomographic image captured by the ultrasonic tomographic image forming means.
  • The control unit further comprises image extraction means for extracting a plurality of images within a predetermined range around the position and angle of the imaging plane, and model position correction means that collates the extracted images with the ultrasonic tomogram, obtains the image closest to it, and corrects the position of the anatomical model by the difference between the position of the imaging plane and the position of the extracted image.
  • As a result, a doctor or technician can find the target cross section efficiently, which improves examination efficiency.
  • FIG. 1 is a block diagram showing the overall system configuration of the present invention.
  • The probe 10 is a transducer that is applied to the body surface of the subject and transmits and receives ultrasonic waves.
  • In the ultrasonic tomographic image forming means 11, the reflected echo signal output from the probe 10 undergoes processing such as amplification, analog-to-digital conversion, and phased addition; an ultrasonic tomographic image is reconstructed and output via the control unit 12 to the display means 13.
  • The position detection means 14 detects the position and angle of the probe 10 in three-dimensional space.
  • In this embodiment, a magnetic sensor is used as the position detection means 14: a magnetic generator is fixed to the patient's bedside, and a magnetic sensor that detects its field is attached to the probe 10.
  • When the magnetic sensor detects the magnetic field, the coordinates in three-dimensional space with the magnetic generator as origin, and the rotation angles about each of the X, Y, and Z axes, are obtained.
  • The anatomical model storage means 15 stores a computer-based anatomical model representing the internal organs of the human body.
  • The storage plane information storage means 16 stores information on specific imaging planes within the anatomical model.
  • The control unit 12 includes a navigation processing unit 120; it reads the anatomical model from the anatomical model storage means 15, converts the position and angle of the probe 10 detected by the position detection means 14 into the coordinate system of the anatomical model, and outputs to the display means 13 the imaging plane on the anatomical model, the cut plane of the anatomical model corresponding to that imaging plane, and the ultrasonic tomographic image.
  • The control unit 12 further includes an image extraction unit 121 and a model position correction unit 122, which correct the organ positions of the anatomical model to match those of the subject.
  • The display means 13 includes input/output means 130, through which the user enters the information needed to specify and correct the anatomical model, such as the target site.
  • FIG. 2 shows an example of the table configuration in the anatomical model storage means 15.
  • The anatomical model storage means 15 stores a set-specific part definition table 20, a set master table 21, a part master table 22, and part-specific model data 23.
  • The set-specific part definition table 20 defines the parts to be referenced for each purpose of ultrasonic examination; one or more part Nos. of the part master table 22 are associated with each set No. of the set master table 21.
  • The part master table 22 holds information identifying the part-specific model data 23, for example a file name.
  • The part-specific model data 23 is image data representing a three-dimensional shape.
  • In this embodiment, the part-specific model data 23 is constructed as a surface model and provided as a group of per-part files describing the surfaces.
  • Each file defines, as information about the three-dimensional object, the model color, transparency, the vertex coordinates of the triangles constituting the boundary surface, and the like; the vertex coordinates are values in a common model coordinate system.
  • The file format is, for example, VRML (Virtual Reality Modeling Language, extension .wrl).
  • Although a surface model is described here, the part-specific model data 23 may take other forms, such as voxel data, as long as each part can be identified.
  • FIG. 3 shows a screen example of the display means 13.
  • The display means 13 includes a model display area 30, a cut plane display area 31, an ultrasonic tomogram display area 32, and various input/output means 1301 to 1305.
  • In the model display area 30, the anatomical model 300 read from the anatomical model storage means 15 and the imaging plane object 301 corresponding to the position and angle of the probe 10 are displayed; the plane obtained by cutting the anatomical model 300 along the imaging plane object 301 is displayed in the cut plane display area 31.
  • The input/output means include a set name selection unit 1301 for entering the set name used to read the anatomical model 300, and a part name selection unit 1302 that displays the part names corresponding to the set selected in the set name selection unit 1301 and designates the target part during the subsequent correction processing.
  • The initial alignment button 1303 and the navigation start button 1304 are pressed by the user during navigation processing, and the correction button 1305 during correction processing.
  • In step 400, the set names in the set master table 21 are output to the set name selection unit 1301.
  • In step 401, the anatomical model corresponding to the set name designated in step 400 is output to the model display area 30. For example, when the user selects the set name "upper abdomen", the set-specific part definition table 20 is searched using the set No. as key, the part master table 22 is searched using the corresponding part Nos., and the part names "liver, gallbladder, pancreas, spleen, right kidney, left kidney" are obtained and output to the part name selection unit 1302.
  • The file names of the corresponding part-specific model data 23 are then acquired, the files "liver.wrl, gallbladder.wrl, pancreas.wrl, spleen.wrl, right kidney.wrl, left kidney.wrl" are read, and the anatomical model 300 is output to the model display area 30 with the origin of the model coordinate system aligned to the origin of the three-dimensional coordinate system of the model display area 30.
  • Next, an imaging plane object 301 is created and output to the model display area 30.
  • The imaging plane object 301 of this embodiment is a quadrilateral matching the image size of the ultrasonic tomographic image, but it may instead be fan-shaped to match the shape of the ultrasonic tomogram.
  • Next, the display magnification of the model is corrected according to the patient.
  • Biometric information such as the patient's height and abdominal circumference is acquired as magnification correction parameters via the input/output means or the like, and the display magnification of the model body surface is corrected accordingly.
  • Alternatively, feature points may be extracted from medical images of the subject by image processing and used to correct the model.
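As a minimal sketch of the biometric magnification correction described above (all function and parameter names are assumptions, not from the patent), a per-axis display scale could be taken as the ratio of the patient's measurements to those of the standard model. The pairing of axes to measurements below is illustrative; the patent only states that biometric parameters correct the display magnification.

```python
def magnification_factors(patient_height, model_height,
                          patient_girth, model_girth):
    """Per-axis display magnification for the anatomical model.

    The long (head-to-foot) axis is scaled by the height ratio, the
    transverse axes by the abdominal-circumference ratio. This is an
    illustrative assumption, not the patent's exact rule.
    """
    long_axis = patient_height / model_height
    transverse = patient_girth / model_girth
    return {"x": transverse, "y": transverse, "z": long_axis}

# Example: a patient 10% taller and 5% larger in girth than the model
scale = magnification_factors(1.76, 1.60, 0.84, 0.80)
```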
  • Next, initial registration between the magnetic generator coordinate system and the model coordinate system is performed.
  • In step 404, the position and angle output by the sensor of the position detection means 14 on the probe 10 are acquired in the magnetic generator coordinate system, together with the position and angle of the imaging plane object 301 in the model coordinate system at the time of initial setting.
  • Specifically, the probe 10 is placed at a specific position on the living body (for example, a transverse scan on the navel), the imaging plane object 301 in the model display area 30 is moved with a mouse or the like to the same specific position, and the initial alignment button 1303 is then pressed.
  • In step 405, a coordinate transformation matrix between the magnetic generator coordinate system and the model coordinate system is created from the position and angle of the probe 10 and of the imaging plane object 301 acquired in step 404.
  • A coordinate transformation matrix A that performs rotation and translation is expressed by equation (1), where r11 to r33 are the rotation components and Ax, Ay, Az are the translation components:

            | r11  r12  r13  0 |
        A = | r21  r22  r23  0 |     ... (1)
            | r31  r32  r33  0 |
            | Ax   Ay   Az   1 |
  • A known technique is used to calculate each component (for example, the method of Japanese Patent Laid-Open No. ...).
  • In this embodiment, the subject and the magnetic generator are arranged in parallel, it is assumed that there is no rotation between the magnetic generator coordinate system and the model coordinate system, and only the translation component is calculated from the pair of magnetic-generator-coordinate and model-coordinate values acquired at initial setting.
  • Alternatively, pairs of magnetic-generator-coordinate and model-coordinate values may be taken at three or more points, and both the rotation and translation components calculated by a method such as that of M. Müller et al., "Meshless Deformations Based on Shape Matching", Proc. of SIGGRAPH '05, pp. 471-478 (2005).
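The simplified translation-only calibration above can be sketched as follows (names are illustrative; the patent gives no implementation). With the rotation assumed to be the identity, Ax, Ay, Az in equation (1) reduce to the average offset between paired measurements of the same physical points in the two coordinate systems:

```python
def estimate_translation(generator_pts, model_pts):
    """Estimate (Ax, Ay, Az) assuming no rotation between the magnetic
    generator coordinate system and the model coordinate system.

    Each pair holds the same physical point expressed in both systems;
    averaging the per-pair offsets smooths sensor noise.
    """
    n = len(generator_pts)
    return tuple(
        sum(m[i] - g[i] for g, m in zip(generator_pts, model_pts)) / n
        for i in range(3)
    )

def to_model_coords(point, t):
    """Apply the translation-only transform, i.e. equation (1) with the
    rotation block set to the identity."""
    return tuple(point[i] + t[i] for i in range(3))

# Two point pairs with a true offset of (5, -2, 10)
gen = [(0, 0, 0), (1, 1, 1)]
mod = [(5, -2, 10), (6, -1, 11)]
t = estimate_translation(gen, mod)
```

Recovering the rotation as well, as in the cited Müller et al. method, would require three or more non-collinear point pairs.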
  • In step 406, when the user presses the navigation start button 1304, the position and angle output by the sensor of the position detection means 14 on the probe 10 are acquired.
  • In step 407, the position and angle in the model coordinate system are calculated from the position and angle acquired in step 406, using the coordinate transformation matrix created in step 405.
  • In step 408, the imaging plane object 301 is translated and rotated according to the position and angle in the model coordinate system obtained in step 407.
  • In step 409, a cut plane is created through the anatomical model 300 whose magnification was corrected earlier, using the plane identified by the position and angle in the model coordinate system from step 407.
  • The cut plane is created by a method commonly used in computer graphics, such as one using a stencil buffer or texture mapping.
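The patent leaves the cut-plane construction to standard computer-graphics techniques (stencil buffer, texture mapping). As a CPU-side sketch of the underlying geometry (all names assumed), the triangles of a surface model that the cutting plane actually intersects can be found by classifying their vertices by signed distance to the plane; only those triangles contribute edges to the cut contour:

```python
def signed_distance(v, plane_point, normal):
    """Signed distance (up to |normal|) of vertex v from the plane."""
    return sum((v[i] - plane_point[i]) * normal[i] for i in range(3))

def triangles_cut_by_plane(triangles, plane_point, normal):
    """Return the triangles whose vertices straddle the plane."""
    cut = []
    for tri in triangles:
        d = [signed_distance(v, plane_point, normal) for v in tri]
        if min(d) <= 0.0 <= max(d):
            cut.append(tri)
    return cut

# Two triangles tested against the plane z = 0 (normal along +Z)
tris = [
    ((0, 0, -1), (1, 0, 1), (0, 1, 1)),   # straddles the plane
    ((0, 0, 2), (1, 0, 3), (0, 1, 2)),    # entirely above it
]
hits = triangles_cut_by_plane(tris, (0, 0, 0), (0, 0, 1))
```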
  • In step 410, an ultrasonic tomographic image is acquired from the ultrasonic tomographic image forming means 11.
  • In step 411, the model display area 30, the cut plane display area 31, and the ultrasonic tomographic image display area 32 are redrawn.
  • In the model display area 30, the magnification-corrected anatomical model 300 and the imaging plane object 301 moved in step 408 are output.
  • The cut plane created in step 409 is output to the cut plane display area 31, and the ultrasonic tomographic image acquired in step 410 to the ultrasonic tomographic image display area 32.
  • In step 412, it is determined whether the user has pressed the navigation end button; if navigation is still in progress, the process returns to step 406, otherwise it ends.
  • FIG. 5 shows an example of how the screen changes during correction: FIG. 5(a) shows the display means 13 before correction, and FIG. 5(b) after correction.
  • Before correction, by the processing of the navigation control unit 120 described above, the imaging plane object 301 is displayed on the anatomical model 300 in the model display area 30, the cut plane of the anatomical model 300 taken at the same position as the imaging plane object 301 is displayed in the cut plane display area 31, and the ultrasonic tomographic image is displayed in the ultrasonic tomographic image display area 32.
  • To correct, the target part name is selected with the part name selection unit 1302 and the correction button 1305 is pressed.
  • The cross-sectional position 50 on the anatomical model 300 that best approximates the ultrasonic tomographic image is then calculated by the image extraction unit 121, and the model position correction unit 122 moves the anatomical model 300 by just the difference between the position of the original imaging plane object 301 and the cross-sectional position 50. As a result, as shown in the corrected display means 13 of FIG. 5(b), the same cut plane as the ultrasonic tomographic image is output to the cut plane display area 31.
  • FIG. 6 shows a processing flow of the image extraction means 121.
  • In step 600, the target part entered via the part name selection unit 1302 is acquired.
  • In step 601, the position and angle output by the sensor of the position detection means 14 on the probe 10 are acquired, as in step 406.
  • In step 602, coordinate conversion is performed as in step 407, and the position and angle of the imaging plane object 301 are calculated.
  • In step 603, planes with the same angle as the imaging plane object 301 are generated by translating it at regular intervals in the X, Y, and Z axis directions.
  • In step 604, it is determined whether each translated plane lies within the target part acquired in step 600.
  • In step 605, for each plane that does, a cut plane is created from its position and angle with respect to the anatomical model 300, as in step 409.
  • In step 606, each cut plane Si created in step 605, its position Pi, and the current imaging plane position P0 are stored in memory.
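Step 603 can be sketched as follows (a position-only version; the names and the exact sampling pattern are assumptions, since the patent only states "regular intervals in the X, Y, and Z directions"). Each generated position would then be tested against the target part (step 604) before its cut plane is created and stored (steps 605-606):

```python
def candidate_positions(p0, step, n):
    """Positions of planes parallel to the imaging plane object,
    translated at regular intervals of `step` along the X, Y, and Z
    axes, up to n steps each way from the current position p0."""
    out = []
    rng = range(-n, n + 1)
    for dx in rng:
        for dy in rng:
            for dz in rng:
                out.append((p0[0] + dx * step,
                            p0[1] + dy * step,
                            p0[2] + dz * step))
    return out

positions = candidate_positions((10.0, 0.0, -5.0), 2.0, 1)
```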
  • An example in which the target part is a single part has been shown; when a plurality of parts are selected, step 604 determines whether each plane is included in any of them.
  • Likewise, an example in which step 603 searches only planes with the same angle as the imaging plane object 301 has been shown; the search may instead vary the angle within a certain range around the angle of the imaging plane object 301.
  • FIG. 7 shows a processing flow of the model position correcting means 122.
  • In step 700, an ultrasonic tomographic image and its imaging conditions, such as irradiation depth and viewing angle, are acquired from the ultrasonic tomographic image forming means 11.
  • In step 701, each cut plane Si, its cut position Pi, and the current imaging plane position P0 are acquired from the image extraction means 121.
  • In step 702, each cut plane Si is processed, using the imaging conditions acquired in step 700, into a form that is easy to compare with the ultrasonic tomographic image, for example by masking.
  • In step 703, each cut plane processed in step 702 is collated with the ultrasonic tomographic image acquired in step 700, and the cut position of the best-matching cut plane is calculated.
  • A standard measure of image similarity is used for this collation, for example squared intensity differences (SID), the correlation coefficient (CC), mutual information (MI), or normalized mutual information (NMI).
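Two of the listed measures, SID and CC, can be sketched on flattened (masked) pixel lists as follows; MI and NMI would additionally require joint intensity histograms. Function names are illustrative, not from the patent:

```python
import math

def sid(a, b):
    """Squared intensity differences: 0 for identical images, larger
    for more dissimilar ones (lower is better)."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def cc(a, b):
    """Pearson correlation coefficient: close to 1 when one image is
    approximately a linear brightness/contrast remap of the other."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

img = [0, 10, 20, 30]
brighter = [5, 15, 25, 35]   # same structure, shifted brightness
```

The cut plane that minimizes SID (or maximizes CC, MI, or NMI) against the ultrasonic tomogram supplies the cut position used in step 703.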
  • In step 704, the position of the anatomical model 300 is translated by the difference between the cut plane position Pk acquired in step 703 and the imaging plane position P0 acquired in step 701.
  • In step 705, the model display area 30 and the cut plane display area 31 are redrawn.
  • In this embodiment, the image extraction means 121 and the model position correction means 122 create cut planes of the anatomical model 300 and calculate their similarity to the ultrasonic tomographic image.
  • Alternatively, a three-dimensional image may be reconstructed from previously captured ultrasonic tomographic images, cut planes created from that three-dimensional ultrasound image, and these collated with the current ultrasonic tomographic image.
  • With the above, the user can grasp the three-dimensional positional relationship between the organs and the imaging plane with higher accuracy, and can more easily capture the target cross section when scanning with the probe.
  • The system configuration of this embodiment adds the storage plane information storage means 16 to the configuration of the first embodiment.
  • FIG. 8 shows an example of the storage plane information storage means 16.
  • The storage plane information storage means 16 stores a part-specific storage plane table 80, a storage plane definition table 81, and reference image data 82.
  • The part-specific storage plane table 80 stores, for each part, the storage planes to be referenced and their display order; specifically, one or more storage plane Nos. of the storage plane definition table 81 are associated with each part No. of the part master table 22 described above.
  • The storage plane definition table 81 stores the storage plane name, display information such as position, angle, irradiation depth, and viewing angle, and the reference image information to be displayed with the storage plane; each storage plane is managed by uniquely identifying information, for example its storage plane No.
  • In this embodiment, the position is held as a three-dimensional coordinate value and the angle as rotation angles about the three axes, but other parameters may be held as long as the storage plane can later be reproduced by the display means 13.
  • For example, a plane may be represented by the coordinates of any three points not on the same straight line, by the coordinates of its four vertices, or by a transformation matrix representing the rotation and translation from a reference plane; in every case the values are defined in the same model coordinate system.
  • The reference image data 82 holds the image the user wants to see when a storage plane is recalled, for example the ultrasonic tomographic image that should be rendered there; the reference image information in the storage plane definition table 81 holds, for example, its file name.
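For the three-point parameterization mentioned above, a plane is recoverable because three non-collinear points fix a normal vector via the cross product of two edge vectors. A sketch (names assumed):

```python
def plane_from_points(p1, p2, p3):
    """Return (point, normal) for the plane through three
    non-collinear points; the normal is the cross product of the two
    edge vectors from p1."""
    u = tuple(p2[i] - p1[i] for i in range(3))
    v = tuple(p3[i] - p1[i] for i in range(3))
    normal = (u[1] * v[2] - u[2] * v[1],
              u[2] * v[0] - u[0] * v[2],
              u[0] * v[1] - u[1] * v[0])
    if normal == (0, 0, 0):
        raise ValueError("points are collinear")
    return p1, normal

# Three points in the z = 2 plane give a normal along the Z axis
origin, n = plane_from_points((0, 0, 2), (1, 0, 2), (0, 1, 2))
```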
  • The control unit 12 searches the storage plane definition table 81 and outputs the storage plane names to the storage plane selection area 90.
  • When a storage plane is selected, the control unit 12 searches the storage plane definition table 81 and acquires the reference image file name "Image_001.jpg".
  • The corresponding file of the reference image data 82 is read and output to the reference image display area 91.
  • A plurality of reference image entries may also be provided in the storage plane definition table 81 and displayed in the reference image display area 91.
  • Similarly, the control unit 12 creates a storage plane object 92 for display from the display information acquired from the storage plane definition table 81 (X coordinate -32, Y coordinate -45, Z coordinate -100, X axis rotation 8°, Y axis rotation 44°, Z axis rotation -44°, irradiation depth 190 mm, viewing angle A), and outputs it onto the anatomical model 300 already shown in the model display area 30.
  • By the processing of the navigation processing unit 120 described above, the imaging plane object 301 in the model display area 30, the cut plane in the cut plane display area 31, and the tomographic image in the ultrasonic tomogram display area 32 change in conjunction with the probe 10.
  • The user can therefore navigate to the storage plane by scanning the probe 10 so that the imaging plane object 301 coincides with the storage plane object 92.
  • FIG. 10 shows an example of the screen when the user moves the imaging plane object 301 from the position of FIG. 9 to the position of the storage plane object 92 before position correction. Since the probe has moved since FIG. 9, the imaging plane object 301 in the model display area 30, the cut plane in the cut plane display area 31, and the tomographic image in the ultrasonic tomogram display area 32 have changed. Ideally, the cut plane in the cut plane display area 31 and the reference image in the reference image display area 91 should now be identical.
  • In practice, however, the patient's organ positions may differ from those of the anatomical model, so the two need not match, as shown in FIG. 10. When the user compares the tomographic image in the ultrasonic tomographic image display area 32 with the reference image in the reference image display area 91 and judges that the cross sections differ, the user selects from the part name selection unit 1302 the organ currently depicted in the ultrasonic tomographic image (or, if unknown, several surrounding organs) and presses the correction button 1305.
  • Alternatively, the control unit 12 may determine the need for correction semi-automatically or automatically by collating the tomographic image of the ultrasonic tomographic image display area 32 with that of the reference image display area 91, without relying on the user's visual comparison.
  • FIG. 11 shows an example of the screen immediately after the system has corrected the positions of the anatomical model 300 and the storage plane object 92.
  • The control unit 12 calculates the correction movement amount with the image extraction unit 121 and the model position correction unit 122 described above, and moves the anatomical model 300 by that amount; the storage plane object 92 moves along with it.
  • Because it is the model that is moved, the tomographic image in the ultrasonic tomographic image display area 32 does not change, and the cut plane in the cut plane display area 31 becomes the same as the image in the ultrasonic tomographic image display area 32.
  • FIG. 12 shows an example of the screen when the user aligns the imaging plane object 301 with the position of the storage plane object 92 after position correction.
  • When the user scans the probe 10 so that the imaging plane object 301 coincides with the moved storage plane object 92, the cut plane in the cut plane display area 31, the tomographic image in the ultrasonic tomogram display area 32, and the reference image in the reference image display area 91 all depict the same plane.
  • Whereas conventionally the target cross section had to be found by moving the probe in trial and error, it can now be reached far more directly, improving examination efficiency.
  • This is useful for beginner training at medical examinations and the like, and, if the storage planes are saved for each patient, also for follow-up observation.
  • For example, the control unit 12 searches the part-specific storage plane table 80 using the part No. of the part name "liver" as key, acquires the storage plane names "under right costal arch 1, under right costal arch 2, right intercostal, upper abdominal sagittal, upper abdominal transverse" together with their display order, and outputs the storage plane names to the storage plane selection area 90 in that order.
  • When the user selects the storage plane name "under right costal arch 1", the control unit 12 searches the storage plane definition table 81 and the reference image data 82 using its storage plane No. as key, and outputs the storage plane object 92 onto the anatomical model already shown in the model display area 30, based on the display information.
  • The control unit 12 also acquires the corresponding reference image file name "us_img001.jpg" from the storage plane definition table 81 and outputs the corresponding reference image data 82 to the reference image display area 91.
  • In this way, the planes to be scanned for each organ are displayed together with their order, and the user selects the desired storage plane; this helps prevent scanning omissions and can be used for training.
  • Alternatively, a storage plane name may be selected directly from the start.
  • In the basic scans used in medical examinations and the like, a plurality of organs are often depicted and diagnosed in a single scan, so being able to call up the target storage plane directly is efficient.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

According to the invention, navigation guiding the scanning of a probe is performed with high accuracy by using an anatomical model including an organ whose position can be corrected to match the patient. The invention relates to an ultrasound apparatus comprising a control unit (12), position detection means (14) for detecting the position and angle of a probe (10), and anatomical model storage means (15) for holding an anatomical model including an organ of a living body. The control unit (12) comprises navigation means (120) for displaying an imaging plane and a cross section of the anatomical model, with magnifications corrected to the patient, according to the detected position and angle and to an echogram; image extraction means (121) for extracting images of predetermined sections at the same angle as the imaging plane; and model position correction means (122) for collating the extracted images with the echogram, acquiring the image closest to the echogram, and correcting the position of the anatomical model based on the difference between the position of the imaging plane and the position of the extracted image.
PCT/JP2009/000668 2008-05-07 2009-02-18 Ultrasonic diagnostic apparatus Ceased WO2009136461A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010510996A JP5027922B2 (ja) 2008-05-07 2009-02-18 Ultrasonic diagnostic apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-120870 2008-05-07
JP2008120870 2008-05-07

Publications (1)

Publication Number Publication Date
WO2009136461A1 true WO2009136461A1 (fr) 2009-11-12

Family

ID=41264518

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/000668 Ceased WO2009136461A1 (fr) 2008-05-07 2009-02-18 Appareil d’échographie

Country Status (2)

Country Link
JP (1) JP5027922B2 (fr)
WO (1) WO2009136461A1 (fr)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3110927B2 (ja) 1993-11-08 2000-11-20 Tokyo Gas Co Ltd Backfill grouting method in pipe jacking construction
KR20150021781A (ko) * 2013-08-21 2015-03-03 Korea Digital Hospital Export Cooperative Inspection apparatus for three-dimensional ultrasound images based on patient information, and operating method thereof
KR101595718B1 (ko) 2014-02-04 2016-02-19 Korea Digital Hospital Export Cooperative Scan position guide method for a three-dimensional ultrasound probe, and ultrasound diagnostic device including the method
US11123041B2 (en) 2014-08-28 2021-09-21 Samsung Electronics Co., Ltd. Ultrasound diagnosis apparatus for self-diagnosis and remote-diagnosis, and method of operating the ultrasound diagnosis apparatus
KR101705120B1 (ko) * 2014-08-28 2017-02-09 Samsung Electronics Co Ltd Ultrasound diagnosis apparatus for self-diagnosis and remote diagnosis, and method of operating the ultrasound diagnosis apparatus
US12004821B2 (en) 2022-02-03 2024-06-11 Medtronic Navigation, Inc. Systems, methods, and devices for generating a hybrid image
US12295797B2 (en) 2022-02-03 2025-05-13 Medtronic Navigation, Inc. Systems, methods, and devices for providing an augmented display
US12249099B2 (en) 2022-02-03 2025-03-11 Medtronic Navigation, Inc. Systems, methods, and devices for reconstructing a three-dimensional representation

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002263101A (ja) * 2001-03-06 2002-09-17 Aloka Co Ltd Ultrasonic diagnostic apparatus
JP2007125179A (ja) * 2005-11-02 2007-05-24 Olympus Medical Systems Corp Ultrasonic diagnostic apparatus


Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011104137A (ja) * 2009-11-18 2011-06-02 Aloka Co Ltd Ultrasonic diagnostic system
JP2013512748A (ja) * 2009-12-08 2013-04-18 Koninklijke Philips Electronics N.V. Ablation treatment planning and device
JP2012050551A (ja) * 2010-08-31 2012-03-15 Toshiba Corp Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image processing program
CN102727248A (zh) * 2011-04-15 2012-10-17 Siemens AG Ultrasound system, and image processing method and apparatus in an ultrasound system
US9524551B2 (en) 2012-09-03 2016-12-20 Toshiba Medical Systems Corporation Ultrasound diagnosis apparatus and image processing method
WO2014034948A1 (fr) * 2012-09-03 2014-03-06 Toshiba Corp Ultrasonic diagnostic apparatus and image processing method
JP2014061291A (ja) * 2012-09-03 2014-04-10 Toshiba Corp Ultrasonic diagnostic apparatus and image processing method
CN103781424A (zh) * 2012-09-03 2014-05-07 Toshiba Corp Ultrasonic diagnostic device and image processing method
WO2014038635A1 (fr) * 2012-09-06 2014-03-13 Toshiba Corp Ultrasonic diagnostic device and medical image projection device
JP2016137212A (ja) * 2015-01-29 2016-08-04 GE Medical Systems Global Technology Co LLC Ultrasonic diagnostic apparatus
JP2019517879A (ja) * 2016-06-15 2019-06-27 中慧医学成像有限公司 Three-dimensional imaging method and system
US11653897B2 (en) 2016-07-07 2023-05-23 Canon Medical Systems Corporation Ultrasonic diagnostic apparatus, scan support method, and medical image processing apparatus
JP2018079070A (ja) * 2016-11-16 2018-05-24 Canon Medical Systems Corp Ultrasonic diagnostic apparatus and scan support program
US11813112B2 (en) 2018-02-09 2023-11-14 Samsung Medison Co., Ltd. Ultrasound diagnosis apparatus and method of displaying ultrasound image
JP2024046927A (ja) * 2022-09-26 2024-04-05 Optim Corp Program, method, information processing apparatus, and system
JP7748169B2 (ja) 2022-09-26 2025-10-02 Optim Corp Program, method, information processing apparatus, and system

Also Published As

Publication number Publication date
JPWO2009136461A1 (ja) 2011-09-01
JP5027922B2 (ja) 2012-09-19

Similar Documents

Publication Publication Date Title
JP5027922B2 (ja) Ultrasonic diagnostic apparatus
US9471981B2 (en) Reference image display method for ultrasonography and ultrasonic diagnosis apparatus
RU2663649C2 (ru) Segmentation of large objects from multiple three-dimensional views
EP3003161B1 (fr) Method for 3D acquisition of ultrasound images
JP5574742B2 (ja) Ultrasonic diagnostic apparatus
JP2010264232A (ja) Diagnosis support apparatus, diagnosis support program, and diagnosis support method
CN100548223C (zh) Ultrasonic diagnostic equipment
KR101504162B1 (ko) Information processing apparatus for medical images, medical image capturing system, and information processing method for medical images
BR112020014733A2 (pt) Computer-implemented method for obtaining anatomical measurements in an ultrasound image, computer program means, image analysis device, and ultrasound imaging method
JP5601684B2 (ja) Medical imaging apparatus
CN111671461B (zh) Ultrasonic diagnostic apparatus and display method
CN105046644A (zh) Ultrasound and CT image registration method and system based on linear correlation
JP6258026B2 (ja) Ultrasonic diagnostic apparatus
CN112545551B (zh) Method and system for a medical imaging device
JP2014195729A (ja) Ultrasonic diagnostic apparatus
JP3552300B2 (ja) Surgical instrument position display device
TWM551477U (zh) Ultrasonic diagnostic device suitable for artificial intelligence image analysis
JP4843728B2 (ja) Ultrasonic diagnostic apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09742586

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2010510996

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09742586

Country of ref document: EP

Kind code of ref document: A1