
CN116172605A - An image registration method and an ultrasonic imaging system

Info

Publication number: CN116172605A
Authority: CN (China)
Prior art keywords: breathing, time, real, pose, caused
Legal status: Pending
Application number: CN202111436052.XA
Other languages: Chinese (zh)
Inventors: 郭楚, 江涛, 刘林, 于开欣, 丛龙飞
Current Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority to CN202111436052.XA
Publication of CN116172605A

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44 - Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444 - Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48 - Diagnostic techniques
    • A61B 8/483 - Diagnostic techniques involving the acquisition of a 3D volume of data
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10072 - Tomographic images
    • G06T 2207/10081 - Computed x-ray tomography [CT]
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10072 - Tomographic images
    • G06T 2207/10088 - Magnetic resonance imaging [MRI]
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10132 - Ultrasound image

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An image registration method and an ultrasonic imaging system register real-time first image data and second image data of an examined object according to real-time pose information of an ultrasonic probe, real-time pose information of the examined object, and pose change information of the examined object caused by breathing, so that a good registration effect can be achieved.

Description

An image registration method and an ultrasonic imaging system

Technical Field

The present invention relates to an image registration method and an ultrasonic imaging system.

Background Art

In clinical practice, one or more imaging systems may be used to image an examined object, so that medical staff can obtain medical images of one or more modalities, such as CT (Computed Tomography) images, MR (Magnetic Resonance) images, ultrasound images, and so on. The principle of ultrasound fusion imaging navigation is to use a spatial positioning device (typically, for example, a magnetic positioning sensor mounted on the probe) to establish a correspondence between the real-time ultrasound image and other data acquired in advance (such as three-dimensional ultrasound images, CT images or MR images), so that the two kinds of images are fully fused and jointly guide the diagnosis and treatment process, which greatly improves the clinician's diagnostic confidence and the surgical outcome.

During the diagnostic process, the positioning system, the patient, and so on may shift, which leads to incorrect registration of the medical images and thus to a poor image fusion result.

Summary of the Invention

In view of the above problems, the present invention proposes an image registration method and an ultrasonic imaging system, which are described in detail below.

According to a first aspect, an embodiment provides an image registration method, including:

controlling an ultrasonic probe to transmit ultrasonic waves to an examined object and to receive ultrasonic echoes returned from the examined object so as to obtain ultrasonic echo signals, and obtaining real-time ultrasound image data of the examined object according to the ultrasonic echo signals;

acquiring non-real-time detection image data of the examined object;

acquiring real-time pose information of the ultrasonic probe through a probe locator, and acquiring real-time pose information of the examined object and pose change information of the examined object caused by breathing through a body surface locator, wherein the probe locator is arranged on the ultrasonic probe and the body surface locator is configured to be arranged on the examined object;

obtaining an initial registration mapping relationship between the real-time ultrasound image data of the examined object and the detection image data according to the real-time pose information of the ultrasonic probe;

removing the pose change information caused by breathing from the real-time pose information of the examined object, and correcting the registration mapping relationship according to the real-time pose information of the examined object from which the pose change information caused by breathing has been removed;

registering the real-time ultrasound image data and the detection image data of the examined object according to the corrected registration mapping relationship.

In an embodiment, acquiring the pose change information of the examined object caused by breathing through the body surface locator includes:

acquiring the pose change information caused by breathing within at least one respiratory cycle of the examined object.

In an embodiment, acquiring the pose change information caused by breathing within at least one respiratory cycle of the examined object includes:

determining the respiratory phase corresponding to the initial registration mapping relationship and its corresponding pose information caused by breathing;

determining each respiratory phase within a respiratory cycle and its corresponding pose information caused by breathing;

calculating, according to the respiratory phase corresponding to the initial registration mapping relationship and its corresponding pose information caused by breathing, as well as each respiratory phase within the respiratory cycle and its corresponding pose information caused by breathing, a pose change matrix corresponding to each respiratory phase within the respiratory cycle caused by breathing, as the pose change information caused by breathing.

In an embodiment, removing the pose change information caused by breathing from the real-time pose information of the examined object includes:

determining the real-time respiratory phase of the examined object;

removing, according to the real-time respiratory phase of the examined object and the pose change information caused by breathing within at least one respiratory cycle of the examined object, the pose change information caused by breathing contained in the real-time pose information of the examined object from the real-time pose information of the examined object.

In an embodiment:

the removing, according to the real-time respiratory phase of the examined object and the pose change information caused by breathing within at least one respiratory cycle of the examined object, the pose change information caused by breathing contained in the real-time pose information of the examined object from the real-time pose information of the examined object includes:

obtaining, according to the real-time respiratory phase of the examined object and the pose change matrix corresponding to each respiratory phase within the respiratory cycle caused by breathing, the pose change matrix corresponding to the real-time respiratory phase of the examined object caused by breathing;

the real-time pose information of the examined object includes a real-time pose matrix of the examined object, and the real-time pose matrix of the examined object is multiplied by the inverse of the breathing-induced pose change matrix corresponding to the real-time respiratory phase of the examined object, so as to remove the pose change information caused by breathing from the real-time pose information of the examined object.
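
For illustration only, the following is a minimal sketch of this correction step, assuming 4x4 homogeneous pose matrices; the function and variable names are hypothetical and not taken from the patent itself:

```python
import numpy as np

def remove_breathing_component(pose_realtime: np.ndarray,
                               pose_change_breathing: np.ndarray) -> np.ndarray:
    """Multiply the real-time pose matrix by the inverse of the breathing-induced
    pose change matrix for the current respiratory phase, leaving only the
    body-position change component. Both inputs are 4x4 homogeneous matrices."""
    return pose_realtime @ np.linalg.inv(pose_change_breathing)
```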

In an embodiment, correcting the registration mapping relationship further includes: correcting, according to the pose change information of the examined object caused by breathing, the change in the registration mapping relationship caused by the change of the ultrasound image data of the examined object caused by breathing.

In an embodiment, the image registration method further includes: controlling fusion display or comparative display of the registered real-time ultrasound image data and the detection image data.

In an embodiment, the detection image data is three-dimensional ultrasound image data or image data of a modality different from the ultrasound image data.

According to a second aspect, an embodiment provides an image registration method, including:

acquiring real-time first image data of an examined object through an ultrasonic probe;

acquiring second image data of the examined object;

acquiring real-time pose information of the ultrasonic probe;

acquiring real-time pose information of the examined object;

acquiring pose change information of the examined object caused by breathing;

registering the real-time first image data and the second image data of the examined object according to the real-time pose information of the ultrasonic probe, the real-time pose information of the examined object, and the pose change information of the examined object caused by breathing.

In an embodiment, acquiring the pose change information of the examined object caused by breathing includes:

acquiring the pose change information caused by breathing within at least one respiratory cycle of the examined object.

In an embodiment, acquiring the pose change information caused by breathing within at least one respiratory cycle of the examined object includes:

determining the respiratory phase corresponding to the initial registration mapping relationship and its corresponding pose information caused by breathing;

determining each respiratory phase within a respiratory cycle and its corresponding pose information caused by breathing;

calculating, according to the respiratory phase corresponding to the initial registration mapping relationship and its corresponding pose information caused by breathing, as well as each respiratory phase within the respiratory cycle and its corresponding pose information caused by breathing, a pose change matrix corresponding to each respiratory phase within the respiratory cycle caused by breathing, as the pose change information caused by breathing.

In an embodiment, registering the real-time first image data and the second image data of the examined object includes:

obtaining an initial registration mapping relationship between the real-time first image data of the examined object and the second image data according to the real-time pose information of the ultrasonic probe;

correcting the registration mapping relationship according to the pose change information of the examined object caused by breathing and the real-time pose information of the examined object;

registering the real-time first image data and the second image data of the examined object according to the corrected registration mapping relationship.

In an embodiment, correcting the registration mapping relationship according to the pose change information of the examined object caused by breathing and the real-time pose information of the examined object includes:

correcting the real-time pose information of the examined object according to the pose change information of the examined object caused by breathing;

correcting the registration mapping relationship according to the corrected real-time pose information of the examined object.

In an embodiment, correcting the real-time pose information of the examined object according to the pose change information of the examined object caused by breathing includes:

identifying and removing the pose change information caused by breathing contained in the real-time pose information of the examined object.

In an embodiment, identifying and removing the pose change information caused by breathing contained in the real-time pose information of the examined object includes:

determining the real-time respiratory phase of the examined object;

removing, according to the real-time respiratory phase of the examined object and the pose change information caused by breathing within at least one respiratory cycle of the examined object, the pose change information caused by breathing contained in the real-time pose information of the examined object from the real-time pose information of the examined object.

In an embodiment, removing, according to the real-time respiratory phase of the examined object and the pose change information caused by breathing within at least one respiratory cycle of the examined object, the pose change information caused by breathing contained in the real-time pose information of the examined object from the real-time pose information of the examined object includes:

obtaining, according to the real-time respiratory phase of the examined object and the pose change matrix corresponding to each respiratory phase within the respiratory cycle caused by breathing, the pose change matrix corresponding to the real-time respiratory phase of the examined object caused by breathing;

the real-time pose information of the examined object includes a real-time pose matrix of the examined object, and the real-time pose matrix of the examined object is multiplied by the inverse of the breathing-induced pose change matrix corresponding to the real-time respiratory phase of the examined object, so as to remove the pose change information caused by breathing from the real-time pose information of the examined object.

In an embodiment, correcting the registration mapping relationship according to the pose change information of the examined object caused by breathing and the real-time pose information of the examined object further includes:

correcting, according to the pose change information of the examined object caused by breathing, the change in the registration mapping relationship caused by the change of the first image data of the examined object caused by breathing.

In an embodiment, the image registration method further includes: controlling fusion display or comparative display of the registered real-time first image data and the second image data, wherein the first image data includes two-dimensional ultrasound image data, and the second image data includes three-dimensional ultrasound image data or image data of a modality different from the ultrasound image data.

According to a third aspect, an embodiment provides an ultrasound imaging system, including:

an ultrasonic probe, configured to transmit ultrasonic waves to an examined object and to receive corresponding ultrasonic echoes so as to obtain ultrasonic echo signals;

a transmitting and receiving control circuit, configured to control the ultrasonic probe to transmit the ultrasonic waves and to receive the ultrasonic echo signals;

a probe locator, arranged on the ultrasonic probe and configured to acquire real-time pose information of the ultrasonic probe;

a body surface locator, configured to be arranged on the examined object so as to acquire real-time pose information of the examined object;

a processor, configured to execute the image registration method described in any embodiment herein.

According to a fourth aspect, an embodiment provides a computer-readable storage medium storing a program, and the program can be executed by a processor to implement the method described in any embodiment herein.

According to the image registration method, the ultrasound imaging system, and the computer-readable storage medium of the above embodiments, the real-time first image data and the second image data of the examined object are registered according to the real-time pose information of the ultrasonic probe, the real-time pose information of the examined object, and the pose change information of the examined object caused by breathing, so that a good registration effect can be achieved.

Brief Description of the Drawings

FIG. 1 is a schematic diagram of the registration principle of an embodiment;

FIG. 2 is a schematic diagram of the registration principle of an embodiment;

FIG. 3 is a schematic structural diagram of an ultrasound imaging system of an embodiment;

FIG. 4 is a flowchart of an image registration method of an embodiment;

FIG. 5 is a flowchart of an image registration method of an embodiment;

FIG. 6 is a further schematic diagram of an embodiment of registering the real-time first image data and the second image data of the examined object according to the real-time pose information of the ultrasonic probe, the real-time pose information of the examined object, and the pose change information of the examined object caused by breathing;

FIG. 7 is a further schematic diagram of an embodiment of identifying and removing, from the pose information of the examined object at the current moment, the pose change information caused by breathing contained therein;

FIG. 8 is a flowchart of an image registration method of an embodiment;

FIG. 9 is a flowchart of an image registration method of an embodiment;

FIG. 10 is a further schematic diagram of an embodiment of registering the real-time ultrasound image data and the detection image data of the examined object;

FIG. 11 is a further schematic diagram of an embodiment of performing a second type of correction on the registration mapping relationship according to the pose change information caused by breathing.

Detailed Description

The present invention is further described in detail below through specific embodiments in conjunction with the accompanying drawings. Similar elements in different embodiments use associated similar reference numerals. In the following embodiments, many details are described so that the present application can be better understood. However, those skilled in the art will readily recognize that some of these features may be omitted in different cases, or may be replaced by other elements, materials, or methods. In some cases, some operations related to the present application are not shown or described in the specification, in order to avoid the core of the present application being overwhelmed by excessive description; a detailed description of these related operations is not necessary for those skilled in the art, who can fully understand them from the description in the specification and from general technical knowledge in the field.

In addition, the features, operations, or characteristics described in the specification may be combined in any appropriate manner to form various embodiments. At the same time, the steps or actions in the method descriptions may be reordered or adjusted in a manner obvious to those skilled in the art. Therefore, the various orders in the specification and drawings are only for clearly describing a certain embodiment and do not imply a required order, unless it is otherwise stated that a certain order must be followed.

The serial numbers assigned to components herein, such as "first", "second", etc., are only used to distinguish the described objects and do not have any sequential or technical meaning. The terms "connection" and "coupling" in this application include both direct and indirect connection (coupling), unless otherwise specified.

An image fusion function, such as a multimodality image fusion function, can be implemented based on a magnetic navigation positioning system. Specifically, a magnetic positioning sensor (hereinafter referred to as the probe Sensor) is mounted on the ultrasound probe to track and locate the spatial position of the ultrasound probe in real time, and a plane-to-plane registration algorithm is used to complete the registration and fusion of the real-time two-dimensional ultrasound image with other three-dimensional volume data (for example, three-dimensional ultrasound image data, or image data of a modality different from the ultrasound image data, such as CT image data or MR image data). Referring to FIG. 1, the registration principle between an ultrasound image and a CT/MR image is taken as an example; it can be understood that the registration principle between a real-time two-dimensional ultrasound image and three-dimensional ultrasound image data is similar. The registration formula between the ultrasound image and the CT/MR image is:

T = P × R × A

This may be referred to as formula (1). The parameters in formula (1) have the following physical meanings: T is the pose matrix from the ultrasound image coordinate system to the CT/MR image coordinate system; P is the pose matrix from the coordinate system of the magnetic field generator in the magnetic navigation positioning system to the CT/MR image coordinate system; R is the pose matrix from the probe Sensor coordinate system to the magnetic field generator coordinate system, and the R matrix can be acquired in real time through the magnetic navigation positioning system; A is the pose matrix from the ultrasound image coordinate system to the probe Sensor coordinate system. The A matrix can be obtained from the mechanical dimension parameters of the probe and is known; when the ultrasound parameters (such as imaging depth, ROI box position, three-dimensional scanning range, etc.) remain unchanged, the A matrix is fixed. During the diagnostic process, the operator generally moves the ultrasound probe to perform real-time ultrasound imaging; in this process, the A matrix and the P matrix generally remain unchanged, while the R matrix changes as the ultrasound probe moves, but, as described above, the R matrix can be acquired through the magnetic navigation positioning system.

Therefore, a registration process is as follows:

Initial registration is first performed to calculate the P matrix. Specifically, the operator selects an ultrasound image, finds the corresponding section in the CT/MR image, and then superimposes the two, translating or rotating until the same anatomical structures in the ultrasound image and in the corresponding CT/MR section coincide, so as to obtain the T matrix at this moment; the P matrix can then be calculated by the formula P = T × A⁻¹ × R⁻¹. After the P matrix is obtained, since the A matrix is known and the R matrix can be acquired in real time through the magnetic navigation positioning system, the real-time registration and fusion display of the ultrasound image and the CT/MR image can be realized through formula (1).
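
For illustration, the following is a minimal sketch of this initial registration calculation, using numpy and assuming all poses are 4x4 homogeneous matrices; the function names are hypothetical and are not taken from any particular system:

```python
import numpy as np

def solve_P(T: np.ndarray, R: np.ndarray, A: np.ndarray) -> np.ndarray:
    """Initial registration: solve P from T = P * R * A, i.e. P = T * A^-1 * R^-1."""
    return T @ np.linalg.inv(A) @ np.linalg.inv(R)

def realtime_T(P: np.ndarray, R: np.ndarray, A: np.ndarray) -> np.ndarray:
    """Real-time mapping from the ultrasound image coordinate system to the
    CT/MR image coordinate system, formula (1): T = P * R * A."""
    return P @ R @ A
```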

During real-time image registration, if the patient's body position changes significantly after the initial registration is completed, or if someone on site (a doctor, a nurse, etc.) accidentally touches the magnetic field generator so that its position changes significantly, the P matrix calculated during the initial registration can no longer faithfully reflect the new pose relationship between the current magnetic field generator and the CT/MR image, and the registration must be performed again to obtain a new matrix P. For example, after the initial registration, the spatial position of the ultrasound probe remains unchanged while the patient translates toward the foot of the bed; relative to the patient, the ultrasound probe may thus move from the abdomen to an intercostal position, and the ultrasound image changes. However, in the formula T = P × R × A used for real-time registration, the P, R, and A matrices appear unchanged to the system, so the corresponding CT/MR section remains the one displayed before the patient translated, and the registration result becomes invalid. To restore a correct fusion effect, the doctor can only re-register, which takes up valuable operation time of the doctor and the patient and results in a poor user experience.

To address the fusion failure caused by a change in the patient's body position or in the position of the magnetic field generator after registration, a body position tracking function can be introduced to detect such changes in real time and correct for them. Referring to FIG. 2, the registration and fusion between ultrasound images and CT/MR images based on a magnetic navigation positioning system is again taken as an example. An additional sensor, which may be called the body surface Sensor, is introduced into the magnetic navigation positioning system and fixed on the skin surface over the patient's sternum or on the belly so that it moves together with the patient. Its role is to acquire a body position change signal in real time; this signal represents the change of the relative position between the patient and the magnetic field generator, including changes of the patient's own body position and changes of the position of the magnetic field generator. After the matrix P has been solved through the initial registration, the doctor can enable the body position tracking function, for example by clicking a button, and the system records, from the data of the magnetic navigation system, the pose matrix Rp0 of the body surface Sensor coordinate system relative to the magnetic field generator coordinate system at that moment. Using the matrices P and Rp0, the pose matrix M of the body surface Sensor coordinate system relative to the CT/MR image coordinate system can be calculated:

M = P × Rp0

Since the body surface Sensor is fixed to the patient, the spatial relationship between the body surface Sensor coordinate system and the CT/MR image coordinate system does not change with the patient's body position or with the position of the magnetic field generator, so the matrix M remains unchanged when the patient's body position or the position of the magnetic field generator changes. The new pose matrix T' of the ultrasound image coordinate system relative to the CT/MR image coordinate system is then calculated as:

T' = M × Rp⁻¹ × R × A = P × Rp0 × Rp⁻¹ × R × A

This may be referred to as formula (2). In formula (2), Rp0 is the pose matrix of the body surface Sensor coordinate system relative to the magnetic field generator coordinate system recorded by the system at the moment the doctor enables the body position tracking function, that is, at the initial registration moment; it does not change with the patient's body position or with the position of the magnetic field generator and is a fixed value. Rp is the real-time pose matrix of the body surface Sensor coordinate system relative to the magnetic field generator coordinate system; it changes with the patient's body position or with the position of the magnetic field generator, and can therefore feed back the body position change signal in real time. Formula (2) is the new registration procedure, under body position tracking, between the real-time two-dimensional ultrasound image and three-dimensional volume data such as CT/MR images, and it can effectively solve the fusion failure caused by a change in the patient's body position or in the position of the magnetic field generator after plane-to-plane registration.
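
A minimal sketch of this body position tracking correction, as reconstructed above, is given below (4x4 homogeneous matrices assumed; the function and variable names are hypothetical):

```python
import numpy as np

def realtime_T_with_body_tracking(P: np.ndarray, Rp0: np.ndarray, Rp: np.ndarray,
                                  R: np.ndarray, A: np.ndarray) -> np.ndarray:
    """Formula (2): T' = P * Rp0 * Rp^-1 * R * A.

    P   : magnetic field generator -> CT/MR image (from initial registration)
    Rp0 : body surface Sensor -> field generator, recorded when tracking is enabled
    Rp  : body surface Sensor -> field generator, real-time
    R   : probe Sensor -> field generator, real-time
    A   : ultrasound image -> probe Sensor (fixed, from probe geometry)
    """
    return P @ Rp0 @ np.linalg.inv(Rp) @ R @ A
```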

The prerequisite for the body position tracking function to take effect is that the body surface Sensor is fixed to the patient, so that no relative motion occurs between the body surface Sensor and the patient during body position tracking. The applicant has found that, in clinical practice, in order to keep the attachment firm and close to the surgical site such as the liver, the body surface Sensor is usually attached to the skin over the patient's sternum or on the belly. While the body position tracking function is active, the body surface Sensor moves together with the patient's respiratory motion, so the body position change signal detected by the body surface Sensor contains an unwanted respiratory motion signal, which introduces a new registration error into the real-time registration and fusion under body position tracking. For example, the doctor enables the body position tracking function after registration and fusion, and the ultrasound probe is held still while acquiring two-dimensional ultrasound images; neither the patient's body position nor the position of the magnetic field generator changes, so in theory the CT/MR section corresponding to the ultrasound image should remain unchanged, that is, the fusion effect should remain unchanged. However, because the body surface Sensor rises and falls with the patient's breathing, the Rp matrix in formula (2) changes its spatial position in real time with the patient's respiratory motion, so the new pose matrix T' also changes in real time. This is finally reflected as a real-time change of the CT/MR section corresponding to the ultrasound image: the fusion effect alternates between good and bad, causing unnecessary interference with the doctor's diagnosis and treatment.

Based on the above understanding, and aiming at the problem that the unwanted respiratory motion signal contained in the body position change signal detected by the body surface Sensor degrades the fusion effect, the applicant proposes an image registration method:

(1) Fix a sensor (Sensor) on the patient's body surface and another on the probe, and track the spatial positions of the Sensors.

(2) Acquire three-dimensional volume data and complete the initial registration with the real-time two-dimensional ultrasound. The three-dimensional volume data here may be three-dimensional ultrasound image data, or image data of a modality different from the ultrasound image data, such as CT image data or MR image data.

(3) Collect the positioning information of the body surface Sensor for a period of time to extract the respiratory cycle information, for example, the pose information of the body surface Sensor at all phases of the patient's entire respiratory cycle while the patient is static.

(4) According to the above respiratory cycle information, remove the influence of the respiratory signal from the real-time positioning signal of the body surface Sensor, so as to obtain body position change information with the respiratory signal removed.

(5) Based on the above body position change information, correct the registration mapping relationship between the ultrasound and the three-dimensional volume data in real time.

The principle of each step is further analyzed and explained below.

(1) Fix a sensor (Sensor) on the patient's body surface and another on the probe, and track the spatial positions of the Sensors.

A Sensor is fixed on the ultrasound probe and another on the patient's body surface; the sensor fixed on the ultrasound probe may be called the probe Sensor, and the sensor fixed on the patient's body surface may be called the body surface Sensor. The probe Sensor is used to track the spatial position of the ultrasound probe in real time, while the body surface Sensor is firmly attached to the skin over the patient's sternum or on the belly to detect body position change information in real time. Both Sensors must remain within the accurate measurement range of the positioning system throughout the entire image fusion process.

In a positioning system such as a magnetic navigation system, the magnetic field generator can track in real time the spatial position of each magnetic positioning Sensor in the magnetic field, that is, the pose relationship of each Sensor coordinate system relative to the magnetic field generator coordinate system, represented by a set of parameters (q0, q1, q2, q3, x, y, z), where (q0, q1, q2, q3) is a unit quaternion representing the attitude of the Sensor coordinate system relative to the magnetic field generator coordinate system, and (x, y, z) is the position of the Sensor coordinate system relative to the magnetic field generator coordinate system. Based on the parameters (q0, q1, q2, q3, x, y, z), the pose matrix R of the Sensor coordinate system relative to the magnetic field generator coordinate system can be calculated; it is a 4×4 homogeneous matrix, and the calculation formula (which may be referred to as formula (3)) is:

R = [ 1-2(q2²+q3²)   2(q1q2-q0q3)   2(q1q3+q0q2)   x
      2(q1q2+q0q3)   1-2(q1²+q3²)   2(q2q3-q0q1)   y
      2(q1q3-q0q2)   2(q2q3+q0q1)   1-2(q1²+q2²)   z
      0              0              0              1 ]

For any point in the Sensor coordinate system with coordinates (x0, y0, z0)ᵀ, converting it to homogeneous coordinates (x0, y0, z0, 1)ᵀ and left-multiplying by the R matrix gives the homogeneous coordinates of that point in the magnetic field generator coordinate system.
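
As an illustration of formula (3), the following sketch builds a 4×4 homogeneous pose matrix from a unit quaternion and a translation using the standard quaternion-to-rotation conversion (the function name is hypothetical):

```python
import numpy as np

def pose_matrix(q0, q1, q2, q3, x, y, z) -> np.ndarray:
    """Build the 4x4 homogeneous pose matrix R of a Sensor coordinate system
    relative to the magnetic field generator coordinate system from a unit
    quaternion (q0, q1, q2, q3) and a translation (x, y, z)."""
    return np.array([
        [1 - 2*(q2*q2 + q3*q3), 2*(q1*q2 - q0*q3),     2*(q1*q3 + q0*q2),     x],
        [2*(q1*q2 + q0*q3),     1 - 2*(q1*q1 + q3*q3), 2*(q2*q3 - q0*q1),     y],
        [2*(q1*q3 - q0*q2),     2*(q2*q3 + q0*q1),     1 - 2*(q1*q1 + q2*q2), z],
        [0.0,                   0.0,                   0.0,                   1.0],
    ])

# A point (x0, y0, z0) in the Sensor coordinate system is mapped into the
# magnetic field generator coordinate system by left-multiplying with R:
# p_generator = pose_matrix(q0, q1, q2, q3, x, y, z) @ np.array([x0, y0, z0, 1.0])
```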

(2) Acquire three-dimensional volume data and complete the initial registration with the real-time two-dimensional ultrasound.

The patient's preoperative three-dimensional volume data, such as three-dimensional ultrasound, CT, MR or PET images, is imported into the ultrasound system. The doctor browses the slice images and selects a slice containing a distinctive structure, such as a vascular bifurcation or a tumor, as the volume data slice image to be registered. The doctor then uses the ultrasound probe equipped with the magnetic positioning Sensor to acquire and browse real-time two-dimensional ultrasound images, finds a two-dimensional ultrasound image whose anatomical structure matches that of the volume data slice image to be registered, and freezes it as the two-dimensional ultrasound image to be registered.

Since the probe Sensor is fixed on the ultrasound probe, the pose relationship between the two-dimensional ultrasound image coordinate system and the probe Sensor coordinate system never changes, so the pose matrix A from the ultrasound image coordinate system to the probe Sensor coordinate system can be obtained from the mechanical dimension parameters of the probe; it is also a 4×4 homogeneous matrix. Any pixel (xU, yU, zU, 1)ᵀ in the ultrasound image coordinate system can be mapped to its coordinates in the probe Sensor coordinate system by left-multiplying by the A matrix. Based on the positioning information of the probe Sensor corresponding to the frozen two-dimensional ultrasound image to be registered, the R matrix of the probe Sensor at the freezing moment can be calculated using formula (3). From the matrices A and R, the coordinates (xM, yM, zM, 1)ᵀ in the magnetic field generator coordinate system of any pixel (xU, yU, zU, 1)ᵀ in the ultrasound image coordinate system to be registered can be obtained; the calculation formula (which may be referred to as formula (4)) is:

(xM, yM, zM, 1)ᵀ = R × A × (xU, yU, zU, 1)ᵀ

The volume data slice image to be registered and the two-dimensional ultrasound image are displayed superimposed with different transparencies. The doctor manually translates and rotates the volume data slice image to be registered or the two-dimensional ultrasound image to be registered to align the same anatomical structures, such as blood vessels or tumors, in the two images. At this point, the pose relationship of the two-dimensional ultrasound image to be registered relative to the three-dimensional volume data, that is, the T matrix mentioned above, is obtained. Based on formula (1), T = P × R × A, the pose matrix P from the magnetic field generator coordinate system to the three-dimensional volume data coordinate system is calculated by the formula (which may be referred to as formula (5)):

P = T × A⁻¹ × R⁻¹

After the P matrix has been calculated, it can be seen from the schematic diagram of FIG. 1 that, when the doctor moves the ultrasound probe in real time to acquire two-dimensional ultrasound images, the ultrasound system calculates in real time the volume data slice image corresponding to the current two-dimensional ultrasound image according to the formula T = P × R × A. The coordinates (xV, yV, zV, 1)ᵀ in the three-dimensional volume data coordinate system of any pixel (xU, yU, zU, 1)ᵀ in the current ultrasound image coordinate system are calculated by the formula (which may be referred to as formula (6)):

(xV, yV, zV, 1)ᵀ = P × R × A × (xU, yU, zU, 1)ᵀ

The current ultrasound image and the corresponding three-dimensional volume data slice image are displayed superimposed with different transparencies, or displayed side by side for comparison. At this point, the registration and fusion of the real-time two-dimensional ultrasound image and the three-dimensional volume data is completed, which can provide the doctor with, for example, multimodality image fusion navigation, facilitating minimally invasive operations such as percutaneous puncture ablation.
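
For illustration, the mapping of formula (6) from an ultrasound pixel to the three-dimensional volume data coordinate system can be sketched as follows (hypothetical names; 4×4 homogeneous matrices as above):

```python
import numpy as np

def ultrasound_pixel_to_volume(P: np.ndarray, R: np.ndarray, A: np.ndarray,
                               xU: float, yU: float, zU: float) -> np.ndarray:
    """Formula (6): map a pixel of the current ultrasound image into the
    three-dimensional volume data coordinate system via T = P * R * A."""
    p_ultrasound = np.array([xU, yU, zU, 1.0])
    return P @ R @ A @ p_ultrasound
```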

(3) Collect the positioning information of the body surface Sensor for a period of time to extract the respiratory cycle information.

Assume that the patient's respiratory phase at the moment the initial registration is completed is ψ0, and that the patient keeps breathing calmly and does not change body position. When the patient's respiratory phase reaches ψ0, the doctor informs the system, for example by pressing a key, to start collecting the positioning information of the body surface Sensor. The doctor can customize the collection duration, as long as it contains at least one complete respiratory cycle; preferably, it may contain multiple (for example, three or more) respiratory cycles of the patient. In one example, N complete respiratory cycles are collected, where N is greater than or equal to 3. Then, according to the positioning information of the body surface Sensor provided by the magnetic navigation system, N sets of parameters (q0^(n), q1^(n), q2^(n), q3^(n), x^(n), y^(n), z^(n)) are collected for each respiratory phase ψt, where 1 ≤ n ≤ N.

The physical meaning of each unit quaternion q = (q0, q1, q2, q3) is a rotation by a certain angle about a certain rotation axis, and it can be written as an ordered pair of a scalar and a vector:

q = [cos(θ/2), sin(θ/2)·û]

This may be referred to as formula (7). In formula (7), û denotes the rotation axis of the quaternion and is a unit vector; in addition, (q1, q2, q3) = sin(θ/2)·û, and θ denotes the rotation angle, whose calculation formula (which may be referred to as formula (8)) is:

θ = 2·arccos(q0)

The rotation axis û is calculated by the formula (which may be referred to as formula (9)):

û = (q1, q2, q3) / sin(θ/2)

Therefore, from the N sets of parameters (q0^(n), q1^(n), q2^(n), q3^(n), x^(n), y^(n), z^(n)) collected for each respiratory phase ψt, the N quaternions (q0^(n), q1^(n), q2^(n), q3^(n)) yield N rotation angles θ^(n) and N rotation axes û^(n). Since the patient breathes calmly, the differences among the rotation axes and rotation angles of the N sets of data can be considered very small, so an accurate rotation angle θt and rotation axis ût for the respiratory phase ψt can be obtained by averaging these parameters. The calculation formula for θt (which may be referred to as formula (10)) is:

θt = (1/N) Σ θ^(n), n = 1, ..., N

The calculation formula for ût (which may be referred to as formula (11)) is:

ût = Norm( (1/N) Σ û^(n) ), n = 1, ..., N

where Norm denotes the normalization function, which normalizes a vector to a unit vector of length 1. Based on the accurate rotation angle θt and rotation axis ût, the accurate unit quaternion qt = (q0t, q1t, q2t, q3t) for each respiratory phase ψt is obtained by the formula (which may be referred to as formula (12)):

qt = [cos(θt/2), sin(θt/2)·ût]

From the N sets of translation parameters (x^(n), y^(n), z^(n)) collected for each respiratory phase ψt, an accurate displacement vector dt = (xt, yt, zt) for the respiratory phase ψt is obtained by averaging the parameters; the calculation formula (which may be referred to as formula (13)) is:

dt = (xt, yt, zt) = (1/N) Σ (x^(n), y^(n), z^(n)), n = 1, ..., N

Based on the accurate unit quaternion qt = (q0t, q1t, q2t, q3t) and the accurate displacement vector dt = (xt, yt, zt) for each respiratory phase ψt, the accurate pose matrix Rt of the body surface Sensor coordinate system relative to the magnetic field generator coordinate system at respiratory phase ψt can be calculated using formula (3); this may be referred to as formula (14):

Rt = R(q0t, q1t, q2t, q3t, xt, yt, zt)

Assume that at the moment the initial registration is completed the patient's respiratory phase is ψ0 and the pose matrix of the body surface Sensor coordinate system relative to the magnetic field generator coordinate system is R0. Then the pose change ΔRt of the body surface Sensor at respiratory phase ψt relative to respiratory phase ψ0 is calculated by the formula (which may be referred to as formula (15)):

ΔRt = R0⁻¹ × Rt

Therefore, for each respiratory phase ψt within the entire respiratory cycle there is a unique body surface Sensor pose change ΔRt, which represents the change of the spatial position and attitude of the body surface Sensor at respiratory phase ψt caused by the patient's respiratory motion. The pose changes of the body surface Sensor at all phases within the entire respiratory cycle constitute the patient's respiratory cycle information.
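
A minimal sketch of building this per-phase breathing table, following formulas (8)-(15) as reconstructed above, might look as follows; it assumes the samples are already grouped by respiratory phase, reuses the hypothetical pose_matrix helper from the sketch after formula (3), and all other names are also hypothetical:

```python
import numpy as np

def phase_pose_change(samples, R0: np.ndarray) -> np.ndarray:
    """Compute the breathing-induced pose change ΔRt for one respiratory phase ψt.

    samples : list of N tuples (q0, q1, q2, q3, x, y, z) of the body surface
              Sensor, one per respiratory cycle, all taken at the same phase ψt.
    R0      : 4x4 pose matrix of the body surface Sensor at phase ψ0
              (the phase of the initial registration).
    """
    # Formulas (8)-(9): per-sample rotation angle and unit rotation axis.
    thetas, axes, translations = [], [], []
    for q0, q1, q2, q3, x, y, z in samples:
        theta = 2.0 * np.arccos(np.clip(q0, -1.0, 1.0))
        axis = np.array([q1, q2, q3]) / max(np.sin(theta / 2.0), 1e-12)
        thetas.append(theta)
        axes.append(axis)
        translations.append([x, y, z])

    # Formulas (10), (11), (13): mean angle, normalized mean axis, mean translation.
    theta_t = np.mean(thetas)
    u_t = np.mean(axes, axis=0)
    u_t = u_t / np.linalg.norm(u_t)
    d_t = np.mean(translations, axis=0)

    # Formula (12): averaged unit quaternion, then formula (14): pose matrix Rt
    # via the quaternion-to-matrix conversion of formula (3) (pose_matrix above).
    q_t = np.concatenate(([np.cos(theta_t / 2.0)], np.sin(theta_t / 2.0) * u_t))
    R_t = pose_matrix(*q_t, *d_t)

    # Formula (15): pose change relative to the initial-registration phase ψ0.
    return np.linalg.inv(R0) @ R_t
```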

(4)根据上述呼吸周期信息在体表Sensor的实时定位信号中剔除呼吸信号的影响,以获得剔除呼吸信号后的体位变化信息。(4) Eliminate the influence of the respiratory signal from the real-time positioning signal of the body surface Sensor according to the above respiratory cycle information, so as to obtain body position change information after removing the respiratory signal.

如上文所述,体表Sensor的实时定位信号可用矩阵Rp表示,表示体表Sensor坐标系相对于磁场发生器坐标系实时的位姿矩阵,它会随着患者体位或者磁场发生器位置的变化而变化。在体位追踪的过程中,所检测到的体表Sensor的实时定位信号中既包含体位变化信号(患者体位变化和磁场发生器位置变化),也包含多余的呼吸运动信号,假设此时患者的呼吸时相为ψt,只代表体位变化信息的体表Sensor坐标系相对于磁场发生器坐标系的位姿矩阵为Rmt,则体表Sensor的实时定位位姿信息Rp可认为是体位变化与呼吸运动的复合,即如下公式(不妨记为公式(16)):As mentioned above, the real-time positioning signal of the body surface sensor can be represented by the matrix Rp , which represents the real-time pose matrix of the body surface sensor coordinate system relative to the magnetic field generator coordinate system, which will change with the patient's body position or the position of the magnetic field generator And change. In the process of body position tracking, the detected real-time positioning signal of the body surface Sensor includes not only body position change signals (patient body position changes and magnetic field generator position changes), but also redundant respiratory motion signals. The time phase is ψ t , and the pose matrix of the body surface Sensor coordinate system relative to the magnetic field generator coordinate system is R mt . Then the real-time positioning pose information R p of the body surface Sensor can be considered as body position change and The compounding of respiratory movement is the following formula (may be recorded as formula (16)):

Rp = Rmt × ΔRt

Therefore, to obtain the body-position change information at this moment, it is only necessary to remove the respiratory cycle signal from the real-time positioning signal of the body-surface sensor. The pose matrix Rmt of the body-surface sensor coordinate system relative to the magnetic field generator coordinate system that represents the body-position change information at this moment is then calculated (denoted formula (17)) as:

Rmt = Rp × (ΔRt)⁻¹

Formula (17) can be understood as follows: at any respiratory phase ψt, the body-position change information obtained after removing the respiratory cycle signal from the real-time positioning signal of the body-surface sensor is the matrix Rmt. Since the pose changes ΔR for all phases of the respiratory cycle are known, the respiratory signal ΔR can be removed from the real-time positioning signal Rp of the body-surface sensor according to the patient's respiratory phase during body-position tracking, leaving only the body-position change information Rm.
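A sketch of the breathing removal of formulas (16) and (17), again in NumPy. The phase index is assumed to have been determined elsewhere (one possibility is sketched further below); everything else follows directly from Rm = Rp × (ΔRt)⁻¹.

    import numpy as np

    def remove_breathing(R_p, delta_R_table, phase_index):
        # R_p: real-time 4x4 pose matrix of the body-surface sensor
        #      (body-position change composed with breathing, formula (16)).
        # delta_R_table: per-phase pose changes ΔR_t for one respiratory cycle.
        # phase_index: index of the patient's current respiratory phase.
        delta_R_t = delta_R_table[phase_index]
        # Formula (17): strip the breathing component, keep only the
        # body-position change information R_m.
        return R_p @ np.linalg.inv(delta_R_t)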

(5) Based on the above body-position change information, correct the registration mapping relationship between the ultrasound and the three-dimensional volume data in real time.

In the body-position tracking registration procedure described above, the real-time pose matrix Rp of the body-surface sensor coordinate system relative to the magnetic field generator coordinate system in formula (2) is contaminated by respiratory cycle information, which introduces errors into the registration relationship (i.e. the registration mapping relationship) between the real-time two-dimensional ultrasound image and three-dimensional volume data such as CT/MR images during body-position tracking. To eliminate the influence of the patient's respiratory motion on body-position tracking, the Rp matrix in formula (2) is replaced with the matrix Rm that represents only the body-position change information, so that the respiratory signal is removed in real time during body-position tracking and a good registration relationship between the real-time two-dimensional ultrasound image and the three-dimensional volume data such as CT/MR images is maintained. The registration formula under body-position tracking after removing the influence of respiratory motion (denoted formula (18)) is therefore formula (2) with Rp replaced by Rm.


During body-position tracking, based on the patient's current respiratory phase, the body-position change signal with the respiratory signal removed is obtained according to formula (17), and registration of the real-time two-dimensional ultrasound image with three-dimensional volume data such as CT/MR images is then performed according to the registration procedure of formula (18). This ensures that, after registration and fusion, a good registration result is maintained even if the patient's body position changes or the magnetic field generator is moved.

It can be seen that, compared with the prior art, the present invention optimizes the pose information of the body-surface sensor during registration: the unwanted respiratory motion signal contained in the body-position change signal detected by the body-surface sensor is removed in real time during body-position tracking. This effectively eliminates the influence of the patient's respiratory motion on the body-position change information (patient body-position changes and magnetic field generator position changes), ensuring a good registration and fusion result between the real-time two-dimensional ultrasound image and three-dimensional volume data such as three-dimensional ultrasound, CT, MR or PET during body-position tracking.

The above description uses a registration method based on a magnetic navigation and positioning system as an example. It can be understood that the magnetic navigation system and the positioning sensor can be replaced by an optical positioning system and positioning markers, and the markers attached to the patient's body surface can also be replaced by marking points drawn on the patient's body surface. The specific scheme is as follows:

(1) Fix or draw markers on the patient's body surface and on the probe, and use an optical camera to locate the spatial positions of the probe and the patient;

(2) Acquire three-dimensional volume data and complete the initial registration with the real-time two-dimensional ultrasound;

(3) Analyze a video of the patient's body-surface marker motion captured by the optical camera over a period of time to extract respiratory cycle information, for example the pose information of the body-surface markers at all phases of an entire respiratory cycle while the patient remains static (one possible analysis is sketched after this list);

(4) According to the above respiratory cycle information, remove the influence of the breathing signal from the optical camera's real-time positioning signal of the body-surface markers, so as to obtain body-position change information with the breathing signal removed;

(5) Based on the above body-position change information, correct the registration mapping relationship between the ultrasound and the three-dimensional volume data in real time.
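This disclosure does not spell out how the marker video is analyzed in step (3). One plausible approach, sketched below purely under that assumption, is to track the marker's displacement along its dominant motion axis over time and segment the trace into respiratory cycles at its peaks, after which the per-phase poses can be averaged exactly as in the magnetic case.

    import numpy as np

    def segment_respiratory_cycles(displacement, min_gap):
        # displacement: 1D array of the marker's displacement along its dominant
        #      breathing axis, one sample per video frame.
        # min_gap: minimum number of frames between two end-inspiration peaks
        #      (somewhat less than one expected respiratory period).
        peaks = []
        for i in range(1, len(displacement) - 1):
            if displacement[i] >= displacement[i - 1] and displacement[i] > displacement[i + 1]:
                if not peaks or i - peaks[-1] >= min_gap:
                    peaks.append(i)
        # Each consecutive pair of peaks delimits one respiratory cycle; the
        # relative position of a frame inside a cycle gives its respiratory phase.
        return [(start, end) for start, end in zip(peaks[:-1], peaks[1:])]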

Referring to FIG. 3, some embodiments of the present invention disclose an ultrasound imaging system, which includes an ultrasound probe 10, a transmit/receive control circuit 20, an echo processing module 30, a processor 40 and a display module 50. Each component is described below.

The ultrasound probe 10 may be a matrix probe or a four-dimensional probe with a mechanical device; the present invention does not limit this, as long as the ultrasound probe used can acquire ultrasound echo signals (or data) of the target region of the person being examined. In some embodiments, the ultrasound probe acquires a set of four-dimensional image data (i.e. dynamic three-dimensional ultrasound images) or a volume of three-dimensional ultrasound image data, and the acquired ultrasound image data needs to contain relatively clear structural information such as blood vessels. In some specific embodiments, the ultrasound probe 10 includes a plurality of array elements for mutual conversion between electrical pulse signals and ultrasound waves, so as to transmit ultrasound waves into the biological tissue 60 being examined (for example, biological tissue in a human or animal body) and receive the ultrasound echoes reflected back by the tissue, thereby obtaining ultrasound echo signals. The array elements of the ultrasound probe 10 may be arranged in a row to form a linear array, arranged in a two-dimensional matrix to form a planar array, or arranged to form a convex array. An array element transmits ultrasound waves according to an excitation electrical signal, or converts received ultrasound waves into electrical signals; each array element can therefore be used to transmit ultrasound waves toward biological tissue in the region of interest and to receive ultrasound echoes returned by the tissue. During ultrasound examination, the transmit sequence and the receive sequence control which array elements are used to transmit ultrasound waves and which are used to receive them, or control the array elements to transmit ultrasound waves and receive ultrasound echoes in separate time slots. All array elements participating in ultrasound transmission may be excited by electrical signals simultaneously so as to transmit at the same time, or they may be excited by several electrical signals separated by certain time intervals so as to transmit ultrasound waves continuously at those intervals.

The transmit/receive control circuit 20 is used to control the ultrasound probe 10 to transmit ultrasound waves and receive ultrasound echo signals. Specifically, on one hand the transmit/receive control circuit 20 controls the ultrasound probe 10 to transmit ultrasound beams into the biological tissue 60, and on the other hand it controls the ultrasound probe 10 to receive the ultrasound echoes of those beams reflected by the tissue. In a specific embodiment, the transmit/receive control circuit 20 generates a transmit sequence and a receive sequence and outputs them to the ultrasound probe. The transmit sequence controls some or all of the array elements of the ultrasound probe 10 to transmit ultrasound waves toward the target of interest in the biological tissue 60; its parameters include the number of transmitting elements and the ultrasound transmission parameters (for example amplitude, frequency, number of transmissions, transmission interval, transmission angle, wave type and/or focus position). The receive sequence controls some or all of the array elements to receive the echoes of the ultrasound waves after passing through the tissue; its parameters include the number of receiving elements and the echo reception parameters (for example reception angle and depth). The ultrasound parameters in the transmit sequence and the echo parameters in the receive sequence differ depending on the intended use of the ultrasound echoes or on the image to be generated from them.
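Purely as an illustration of the parameters listed above, the following dataclasses (illustrative names; not the API of any particular system) group the transmit- and receive-sequence settings that such a control circuit could output to the probe.

    from dataclasses import dataclass
    from typing import Optional, Sequence

    @dataclass
    class TransmitSequence:
        element_indices: Sequence[int]          # which array elements transmit
        amplitude: float                        # excitation amplitude
        frequency_hz: float                     # transmit frequency
        num_transmissions: int                  # number of transmit events
        interval_s: float                       # interval between transmit events
        angle_deg: float                        # steering angle
        wave_type: str                          # e.g. "focused", "plane"
        focus_depth_m: Optional[float] = None   # focus position, if focused

    @dataclass
    class ReceiveSequence:
        element_indices: Sequence[int]          # which array elements receive
        angle_deg: float                        # reception steering angle
        depth_m: float                          # maximum reception depth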

The echo processing module 30 processes the ultrasound echo signals received by the ultrasound probe 10, for example filtering, amplifying and beamforming them, to obtain ultrasound echo data. In a specific embodiment, the echo processing module 30 may output the ultrasound echo data to the processor 40, or may first store the ultrasound echo data in a memory; when calculations based on the ultrasound echo data are needed, the processor 40 reads the ultrasound echo data from the memory. Those skilled in the art will understand that in some embodiments, when the ultrasound echo signals do not need to be filtered, amplified, beamformed or otherwise processed, the echo processing module 30 may also be omitted.
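As a concrete illustration of the beamforming step mentioned above, here is a minimal delay-and-sum sketch for a single receive line. The array geometry, sampling rate and sound speed are illustrative assumptions; practical systems add apodization, interpolation and dynamic focusing, none of which this disclosure specifies.

    import numpy as np

    def delay_and_sum_line(channel_data, element_x, fs, c, line_x=0.0):
        # channel_data: (num_elements, num_samples) RF samples per receive element.
        # element_x:    lateral positions of the elements (metres).
        # fs:           sampling rate (Hz);  c: speed of sound (m/s).
        num_elements, num_samples = channel_data.shape
        depths = np.arange(num_samples) * c / (2.0 * fs)   # imaging depths for this line
        line = np.zeros(num_samples)
        for e in range(num_elements):
            # Two-way path: straight down to the depth, then back to element e.
            rx_dist = np.sqrt(depths**2 + (element_x[e] - line_x) ** 2)
            delays = (depths + rx_dist) / c                # seconds
            idx = np.round(delays * fs).astype(int)
            valid = idx < num_samples
            line[valid] += channel_data[e, idx[valid]]
        return line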

The processor 40 is used to acquire the ultrasound echo data (or signals) and to obtain the required parameters or images using the relevant algorithms.

The display module 50 can be used to display information, for example the parameters and images calculated by the processor 40. Those skilled in the art will understand that in some embodiments the ultrasound imaging system itself may not integrate a display module, but may instead be connected to a computer device (for example a PC) and display information through the display module (for example a display screen) of that computer device.

The ultrasound imaging system in some embodiments of the present invention also introduces sensors for acquiring pose information. These sensors are position sensors based on electromagnetic induction, but in other embodiments they may be position sensors based on optical principles, other types of position sensors based on acoustic or other principles, or even image positioning markers. In some embodiments, the sensors for acquiring pose information may include a probe locator 11 and a body-surface locator 13, described in detail below.

In some embodiments, the above probe locator 11 is provided on the ultrasound probe 10; in other words, the probe locator 11 is arranged on the ultrasound probe 10. Specifically, the probe locator 11 may be built into the ultrasound probe 10 or fixed to the housing of the ultrasound probe 10, etc. The probe locator 11 is used to acquire the real-time pose information of the ultrasound probe. During the acquisition of ultrasound image data, the probe locator 11 generates real-time attitude information of the ultrasound probe 10, for example the pose matrix R, according to the sensed movement of the ultrasound probe 10, so that each frame of ultrasound image data corresponds to its own real-time attitude information.

In some embodiments, the body-surface locator 13 is arranged on the object being examined, for example attached to the body surface of the object by wearing, adhesion or the like. The body-surface locator 13 is used to acquire the real-time pose information of the object being examined, for example the pose matrix Rp.

One workflow may be as follows:

The ultrasound imaging system excites the ultrasound probe 10 through the transmit/receive control circuit 20 to transmit ultrasound waves to the part of the human body to be examined and obtain ultrasound echo signals; the processor 40 processes the obtained ultrasound echo signals to obtain ultrasound image data of the target tissue or organ. The processor 40 may also acquire other detection image data, which may be three-dimensional ultrasound image data of the same modality as the ultrasound image data, or image data of another modality different from the ultrasound image data, such as CT or MR image data; for example, the other-modality image data is imported through a modality import interface before the processor 40 performs registration. The acquisition of the other-modality image data can follow the existing related art and is not described here. In addition, the imported other-modality image data may be a single three-dimensional data set or multiple three-dimensional data sets acquired at different times. The probe locator 11 arranged on the ultrasound probe 10 continuously provides the pose information of the ultrasound probe as it moves, for example spatial orientation information with six degrees of freedom (vertical, lateral, longitudinal, pitch, roll and yaw); the body-surface locator arranged on the object being examined likewise acquires the real-time pose information of the object, for example spatial orientation information with six degrees of freedom (vertical, lateral, longitudinal, pitch, roll and yaw). The processor 40 uses the image data and the real-time attitude information to register and fuse the ultrasound image with the other-modality image and then sends the fusion result to the display component 50, which displays it; alternatively, the processor 40 uses the image data and the real-time attitude information to register the ultrasound image with the other-modality image and then sends the registration result to the display component 50, which displays the registered real-time ultrasound image data and the detection image data side by side for comparison. The registration process is described in more detail below.

In some embodiments, the processor 40 acquires, through the ultrasound probe, real-time first image data of the object being examined, for example ultrasound image data; the processor 40 acquires second image data of the object, for example three-dimensional ultrasound image data or image data of a different modality from the ultrasound image data, such as CT or MR image data; the processor 40 acquires the real-time pose information of the ultrasound probe 10, for example through the probe locator 11; the processor 40 acquires the real-time pose information of the object, for example through the body-surface locator 13; and the processor 40 also acquires the pose change information of the object caused by breathing, for example the breathing-induced pose change information within at least one respiratory cycle of the object. Specifically, as in the method mentioned in step (3) above, the respiratory cycle information can be extracted by collecting the pose information output by the body-surface locator 13 over a period of time. In some specific embodiments, the processor 40 acquires the breathing-induced pose change information within at least one respiratory cycle of the object as follows: the processor 40 determines the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information; the processor 40 determines each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information; and the processor 40 calculates, from the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information, together with each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information, the breathing-induced pose change matrix corresponding to each respiratory phase within the respiratory cycle, which serves as the above breathing-induced pose change information. One example of the breathing-induced pose change information calculated in this way is the pose change ΔRt mentioned in formula (15) above.

The processor 40 registers the real-time first image data and the second image data of the object being examined according to the real-time pose information of the ultrasound probe 10, the real-time pose information of the object, and the breathing-induced pose change information of the object. In some specific embodiments, this registration may be carried out as follows:

The processor 40 acquires the initial registration mapping relationship (i.e. the registration relationship) between the first image data and the second image data. For the acquisition of the initial registration mapping relationship, see the description of initial registration above: for example, the operator selects a first image, finds the corresponding slice in the second image, superimposes the two, and translates or rotates them so that the same anatomical structures in the first image and in the corresponding slice of the second image coincide, thereby obtaining the T matrix at this moment; the P matrix can then be calculated by the formula P = T × A⁻¹ × R⁻¹, and this P matrix is the initial registration mapping relationship between the first image data and the second image data. After acquiring the initial registration mapping relationship, the processor 40, during real-time dynamic registration, acquires the first image data to be registered; the processor 40 acquires the real-time pose information of the object being examined; the processor 40 corrects the above registration mapping relationship according to the breathing-induced pose change information of the object and the real-time pose information of the object; and the processor 40 then registers the real-time first image data and the second image data of the object according to the corrected registration mapping relationship.
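A small sketch of the initial-mapping computation P = T × A⁻¹ × R⁻¹ described above (NumPy). All matrices are treated simply as invertible 4×4 homogeneous transforms; the roles of T, A and R follow the description in the paragraph above, and nothing beyond that is assumed.

    import numpy as np

    def initial_registration_mapping(T, A, R):
        # T: transform obtained by manually aligning the selected ultrasound
        #    image with the corresponding slice of the second image data.
        # A: the matrix denoted A earlier in this document, associated with
        #    the ultrasound image/probe.
        # R: pose matrix of the probe locator at the moment of alignment.
        return T @ np.linalg.inv(A) @ np.linalg.inv(R)   # P = T x A^-1 x R^-1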

The processor 40 corrects the registration mapping relationship; some embodiments may proceed as follows:

The processor 40 corrects the real-time pose information of the object being examined according to the breathing-induced pose change information of the object, for example by identifying and removing the breathing-induced pose change information contained in the real-time pose information of the object. In some specific embodiments, the processor 40 identifies and removes that information as follows: the processor 40 determines the real-time respiratory phase of the object; according to the real-time respiratory phase of the object and the breathing-induced pose change information within at least one respiratory cycle of the object, the processor 40 removes the breathing-induced pose change information contained in the real-time pose information of the object from that real-time pose information. Specifically, according to the real-time respiratory phase of the object and the breathing-induced pose change matrices corresponding to each respiratory phase within the respiratory cycle, the processor 40 obtains the breathing-induced pose change matrix corresponding to the real-time respiratory phase of the object; the real-time pose information of the object includes the real-time pose matrix of the object, and the processor 40 multiplies the real-time pose matrix of the object by the inverse of the pose change matrix corresponding to the real-time respiratory phase of the object, so as to remove the breathing-induced pose change information contained in the real-time pose information of the object.
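This disclosure leaves open how the real-time respiratory phase is determined. One simple possibility, assumed here purely for illustration, is to exploit the periodicity of the recorded cycle: if the respiratory period is reasonably stable, the phase index can be taken from the elapsed time since a known cycle start.

    def phase_index_from_time(t_now, t_cycle_start, cycle_period, num_phases):
        # t_now, t_cycle_start: timestamps in seconds; cycle_period: duration of
        # one respiratory cycle in seconds; num_phases: number of phases psi_t
        # stored in the per-cycle table of pose changes.
        fraction = ((t_now - t_cycle_start) % cycle_period) / cycle_period
        return int(fraction * num_phases) % num_phases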

During real-time registration, especially when registering ultrasound images of abdominal organs with other images (for example three-dimensional volume data such as three-dimensional ultrasound, CT, MR or PET) for patients who breathe abdominally, the organ displacement, rotation and deformation caused by respiratory motion affect the ultrasound image data and make the registration result unsatisfactory. To eliminate or reduce this influence, in some embodiments the processor 40's correction of the registration mapping relationship further includes: the processor 40 corrects, according to the breathing-induced pose change information of the object being examined, the change in the registration mapping relationship caused by the breathing-induced change in the ultrasound image data of the object.

Therefore, some embodiments may proceed as follows:

The processor 40 controls the ultrasound probe 10 to transmit ultrasound waves to the object being examined and to receive the ultrasound echoes returned from the object so as to obtain ultrasound echo signals, and obtains real-time ultrasound image data of the object according to the ultrasound echo signals;

The processor 40 acquires non-real-time detection image data of the object being examined;

The processor 40 acquires the real-time pose information of the ultrasound probe 10 through the probe locator 11, and acquires, through the body-surface locator 13, the real-time pose information of the object being examined and the breathing-induced pose change information of the object. Acquiring the breathing-induced pose change information of the object through the body-surface locator includes: acquiring the breathing-induced pose change information within at least one respiratory cycle of the object. In some specific embodiments, the processor 40 acquires this information as follows: the processor 40 determines the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information; the processor 40 determines each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information; and the processor 40 calculates, from the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information, together with each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information, the breathing-induced pose change matrix corresponding to each respiratory phase within the respiratory cycle, which serves as the above breathing-induced pose change information. One example of the breathing-induced pose change information calculated in this way is the pose change ΔRt mentioned in formula (15) above;

The processor 40 acquires the initial registration mapping relationship between the real-time ultrasound image data of the object being examined and the detection image data according to the real-time pose information of the ultrasound probe 10. For the acquisition of the initial registration mapping relationship, see the description of initial registration above: for example, the operator selects an ultrasound image, finds the corresponding slice in the detection image, superimposes the two, and translates or rotates them so that the same anatomical structures in the ultrasound image and in the corresponding slice of the detection image coincide, thereby obtaining the T matrix at this moment; the P matrix can then be calculated by the formula P = T × A⁻¹ × R⁻¹, and this P matrix is the initial registration mapping relationship between the ultrasound image data and the detection image data;

The processor 40 removes, from the real-time pose information of the object being examined, the breathing-induced pose change information it contains; and the processor 40 corrects the above registration mapping relationship according to the pose information of the object after the breathing-induced pose change information has been removed. In some embodiments, the processor 40 corrects the real-time pose information of the object according to the breathing-induced pose change information of the object, for example by identifying and removing the breathing-induced pose change information contained in the real-time pose information of the object. In some specific embodiments, this identification and removal may proceed as follows: the processor 40 determines the real-time respiratory phase of the object; according to the real-time respiratory phase of the object and the breathing-induced pose change information within at least one respiratory cycle of the object, the processor 40 removes the breathing-induced pose change information contained in the real-time pose information of the object from that real-time pose information. Specifically, according to the real-time respiratory phase of the object and the breathing-induced pose change matrices corresponding to each respiratory phase within the respiratory cycle, the processor 40 obtains the breathing-induced pose change matrix corresponding to the real-time respiratory phase of the object; the real-time pose information of the object includes the real-time pose matrix of the object, and the processor 40 multiplies the real-time pose matrix of the object by the inverse of the pose change matrix corresponding to the real-time respiratory phase of the object, so as to remove the breathing-induced pose change information contained in the real-time pose information of the object. During real-time registration, especially when registering ultrasound images of abdominal organs with other images (for example three-dimensional volume data such as three-dimensional ultrasound, CT, MR or PET) for patients who breathe abdominally, the organ displacement, rotation and deformation caused by respiratory motion affect the ultrasound image data and make the registration result unsatisfactory. To eliminate or reduce this influence, in some embodiments the processor 40's correction of the registration mapping relationship further includes: the processor 40 corrects, according to the breathing-induced pose change information of the object being examined, the change in the registration mapping relationship caused by the breathing-induced change in the ultrasound image data of the object;

The processor 40 registers the real-time ultrasound image data of the object being examined with the detection image data according to the corrected registration mapping relationship.

From the above process it can be seen that when either the pose information of the ultrasound probe 10 or the pose information of the object being examined changes, the registration mapping relationship changes and therefore needs to be corrected and updated. The changes of the registration mapping relationship essentially fall into several categories. The first category is caused by changes in the pose information of the object being examined (for a magnetic navigation system, these include changes in the patient's body position and changes in the position of the magnetic field generator). The second category, on top of the first, is the disturbance that the object's breathing introduces into the pose information acquired by the body-surface locator 13, which also causes the registration mapping relationship to change. Further, the third category is the change in the ultrasound image data caused by the breathing of the object, which also causes the registration mapping relationship to change. One or more of these three categories can therefore be corrected for specifically, as described below.

In some embodiments, the ultrasound probe 10 transmits ultrasound waves to the object being examined and receives the ultrasound echoes returned from the object so as to obtain ultrasound echo signals, and the processor 40 obtains real-time ultrasound image data of the object according to the ultrasound echo signals; the processor 40 acquires non-real-time detection image data of the object, for example three-dimensional ultrasound image data or image data of a modality different from the ultrasound image data, such as CT or MR image data; the processor 40 acquires the real-time pose information of the ultrasound probe 10 through the probe locator 11 and the real-time pose information of the object through the body-surface locator 13; and the processor 40 registers the real-time ultrasound image data of the object with the detection image data. In some embodiments, the specific registration process may be as follows:

The processor 40 acquires the initial registration mapping relationship between the ultrasound image data and the detection image data according to the real-time pose information of the ultrasound probe. For the acquisition of the initial registration mapping relationship, see the description of initial registration above: for example, the operator selects an ultrasound image, finds the corresponding slice in the detection image, superimposes the two, and translates or rotates them so that the same anatomical structures in the ultrasound image and in the corresponding slice of the detection image coincide, thereby obtaining the T matrix at this moment; the P matrix can then be calculated by the formula P = T × A⁻¹ × R⁻¹, and this P matrix is the initial registration mapping relationship between the ultrasound image data and the detection image data. The processor 40 corrects the above registration mapping relationship, and then registers the real-time ultrasound image data of the object being examined with the detection image data according to the corrected registration mapping relationship.

In some embodiments, the processor 40's correction of the registration mapping relationship includes: the processor 40 performs a first-type correction on the registration mapping relationship according to the real-time pose information of the object being examined; and the processor 40 acquires the breathing-induced pose change information of the object through the body-surface locator 13 and performs a second-type correction on the registration mapping relationship according to that breathing-induced pose change information. In some embodiments, the first-type correction corrects the change of the registration mapping relationship caused by changes in the body position of the object, and the second-type correction removes the breathing-induced pose change information contained in the pose information of the object used in the first-type correction. In some embodiments, the processor 40's correction of the registration mapping relationship further includes: the processor 40 performs a third-type correction on the registration mapping relationship using the breathing-induced pose change information, the third-type correction being used to correct the change of the registration mapping relationship caused by the breathing-induced change in the ultrasound image data of the object.

The second-type correction is described in more detail below.

In some embodiments, the processor 40 acquires the breathing-induced pose change information of the object being examined through the body-surface locator 13, for example by acquiring the breathing-induced pose change information within at least one respiratory cycle of the object. In some specific embodiments, the processor 40 determines the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information; the processor 40 determines each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information; and the processor 40 calculates, from the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information, together with each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information, the breathing-induced pose change matrix corresponding to each respiratory phase within the respiratory cycle, which serves as the above breathing-induced pose change information. One example of the breathing-induced pose change information calculated in this way is the pose change ΔRt mentioned in formula (15) above.

In some embodiments, the processor 40 performing the second-type correction on the registration mapping relationship according to the breathing-induced pose change information includes: the processor 40 determines the real-time respiratory phase of the object being examined; according to the real-time respiratory phase of the object, the breathing-induced pose change information within at least one respiratory cycle of the object, and the respiratory phase and breathing-induced pose information corresponding to the initial registration mapping relationship, the processor 40 performs the second-type correction on the registration mapping relationship, thereby removing the breathing-induced pose change information contained in the pose information of the object used in the first-type correction. In some specific embodiments, according to the real-time respiratory phase of the object and the breathing-induced pose change matrices corresponding to each respiratory phase within the respiratory cycle, the processor 40 obtains the breathing-induced pose change matrix corresponding to the real-time respiratory phase of the object; the real-time pose information of the object includes the real-time pose matrix of the object, and the processor 40 multiplies the real-time pose matrix of the object by the inverse of the pose change matrix corresponding to the real-time respiratory phase of the object, thereby removing the breathing-induced pose change information contained in the pose information of the object used in the first-type correction.

In some embodiments, the processor 40 also controls the display module 50 to display the registered real-time ultrasound image data and the detection image data in a fused manner or side by side for comparison.
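For the fused display mentioned above, a minimal sketch is an alpha blend of the real-time ultrasound frame with the slice of the detection image data that the corrected mapping has already resampled onto the same pixel grid; the resampling itself, and the blending weight, are assumptions outside this disclosure.

    import numpy as np

    def fuse_for_display(us_frame, registered_slice, alpha=0.5):
        # us_frame, registered_slice: 2D arrays on the same pixel grid,
        # values normalised to [0, 1]; alpha weights the ultrasound frame.
        return np.clip(alpha * us_frame + (1.0 - alpha) * registered_slice, 0.0, 1.0)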

The above is a description of the ultrasound imaging system.

Some embodiments of the present invention also disclose an image registration method, described in detail below.

Referring to FIG. 4 or FIG. 5, the image registration method in some embodiments includes the following steps:

Step 100: acquire real-time first image data of the object being examined through an ultrasound probe. The first image data is, for example, ultrasound image data.

Step 110: acquire second image data of the object being examined. The second image data is, for example, three-dimensional ultrasound image data or image data of a modality different from the ultrasound image data, such as CT or MR image data.

Step 120: acquire the real-time pose information of the ultrasound probe.

For example, the real-time pose information of the ultrasound probe is acquired through a probe locator arranged on the ultrasound probe. In some embodiments, the pose information may be spatial orientation information with six degrees of freedom (vertical, lateral, longitudinal, pitch, roll and yaw).

Step 130: acquire the real-time pose information of the object being examined.

For example, the real-time pose information of the object being examined is acquired through a body-surface locator arranged on the object. In some embodiments, the pose information may be spatial orientation information with six degrees of freedom (vertical, lateral, longitudinal, pitch, roll and yaw).

Step 140: acquire the breathing-induced pose change information of the object being examined.

In some embodiments, acquiring the breathing-induced pose change information of the object in step 140 includes: acquiring the breathing-induced pose change information within at least one respiratory cycle of the object.

One specific way is, as in step (3) above, to collect the positioning information of the body-surface sensor over a period of time so as to extract the respiratory cycle information. For example, step 140 determines the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information; step 140 determines each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information; and step 140 calculates, from the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information, together with each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information, the breathing-induced pose change matrix corresponding to each respiratory phase within the respiratory cycle, which serves as the above breathing-induced pose change information.

Step 150: register the real-time first image data and the second image data of the object being examined according to the real-time pose information of the ultrasound probe, the real-time pose information of the object, and the breathing-induced pose change information of the object.

In some embodiments, referring to FIG. 6, step 150 includes the following steps:

Step 151: acquire the initial registration mapping relationship between the first image data and the second image data according to the real-time pose information of the ultrasound probe.

For the acquisition of the initial registration mapping relationship, see the description of initial registration above: for example, the operator selects a first image, finds the corresponding slice in the second image, superimposes the two, and translates or rotates them so that the same anatomical structures in the first image and in the second-modality image of the corresponding slice coincide, thereby obtaining the T matrix at this moment; the P matrix can then be calculated by the formula P = T × A⁻¹ × R⁻¹, and this P matrix is the initial registration mapping relationship between the first image data and the second image data.

Step 152: correct the registration mapping relationship according to the breathing-induced pose change information of the object being examined and the real-time pose information of the object. In some embodiments, step 152 corrects the real-time pose information of the object according to the breathing-induced pose change information of the object, for example by identifying and removing the breathing-induced pose change information contained in the real-time pose information of the object; step 152 then corrects the registration mapping relationship according to the corrected real-time pose information of the object.

In some embodiments, referring to FIG. 7, the identification and removal in step 152 of the breathing-induced pose change information contained in the real-time pose information of the object includes the following steps:

Step 152.1: determine the real-time respiratory phase of the object being examined;

Step 152.3: according to the real-time respiratory phase of the object and the breathing-induced pose change information within at least one respiratory cycle of the object, remove the breathing-induced pose change information contained in the real-time pose information of the object from that real-time pose information. For example, step 152.3 obtains, according to the real-time respiratory phase of the object and the breathing-induced pose change matrices corresponding to each respiratory phase within the respiratory cycle, the breathing-induced pose change matrix corresponding to the real-time respiratory phase of the object; the real-time pose information of the object includes the real-time pose matrix of the object, and step 152.3 multiplies the real-time pose matrix of the object by the inverse of the pose change matrix corresponding to the real-time respiratory phase of the object, so as to remove the breathing-induced pose change information contained in the real-time pose information of the object.

Step 153: register the real-time first image data and the second image data of the object being examined according to the corrected registration mapping relationship.

During real-time registration, especially when registering ultrasound images of abdominal organs with images of other modalities for patients who breathe abdominally, the organ displacement, rotation and deformation caused by respiratory motion affect the ultrasound image data and make the registration result unsatisfactory. To eliminate or reduce this influence, in some embodiments step 150 further includes: correcting, according to the breathing-induced pose change information of the object being examined, the change in the registration mapping relationship caused by the breathing-induced change in the ultrasound image data of the object.

Step 160: control the fused display or side-by-side comparative display of the registered real-time first image data and second image data.

Therefore, the image registration method of some embodiments may proceed as follows:

(1) Control the ultrasound probe to transmit ultrasound waves to the object being examined and to receive the ultrasound echoes returned from the object so as to obtain ultrasound echo signals, and obtain real-time ultrasound image data of the object according to the ultrasound echo signals;

(2) Acquire non-real-time detection image data of the object being examined;

(3) Acquire the real-time pose information of the ultrasound probe through the probe locator, and acquire the real-time pose information of the object being examined and the breathing-induced pose change information of the object through the body-surface locator. Acquiring the breathing-induced pose change information of the object through the body-surface locator includes: acquiring the breathing-induced pose change information within at least one respiratory cycle of the object. In some specific embodiments, this is implemented as follows: determine the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information; determine each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information; and calculate, from the respiratory phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information, together with each respiratory phase within the respiratory cycle and its corresponding breathing-induced pose information, the breathing-induced pose change matrix corresponding to each respiratory phase within the respiratory cycle, which serves as the above breathing-induced pose change information. One example of the breathing-induced pose change information calculated in this way is the pose change ΔRt mentioned in formula (15) above;

(4) Obtain, according to the real-time pose information of the ultrasonic probe, the initial registration mapping relationship between the real-time ultrasound image data of the inspected object and the detection image data. For how the initial registration mapping relationship is obtained, see the description of initial registration above: for example, the operator selects an ultrasound image, finds the corresponding slice in the detection image, overlays the two, and translates or rotates them until the same anatomical structures in the ultrasound image and the corresponding detection-image slice coincide, which yields the T matrix at that moment; the P matrix can then be computed by the formula P = T × A⁻¹ × R⁻¹, and this P matrix is the initial registration mapping relationship between the ultrasound image data and the detection image data;
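Because the formula P = T × A⁻¹ × R⁻¹ is stated explicitly, the initial mapping can be evaluated directly once T and the two locator readings at that moment are available; the sketch below assumes 4×4 homogeneous matrices and uses numpy, neither of which the disclosure prescribes:

    import numpy as np

    def initial_registration(T, A, R):
        """Initial registration mapping P = T @ inv(A) @ inv(R).

        T : 4x4 transform aligning the selected ultrasound frame with the
            matching slice of the detection volume (found interactively).
        A : 4x4 pose matrix of the probe locator at that moment.
        R : 4x4 pose matrix of the body surface locator at that moment.
        """
        return T @ np.linalg.inv(A) @ np.linalg.inv(R)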

(5) Remove the breathing-induced pose change information contained in the real-time pose information of the detected object, and correct the above registration mapping relationship according to the pose information that remains after the breathing-induced component has been removed. In some embodiments, the real-time pose information of the inspected object is corrected according to the breathing-induced pose change information, for example by identifying the breathing-induced pose change component contained in the real-time pose information and removing it. In some specific embodiments this identification and removal proceeds as follows: determine the real-time breathing phase of the detected object; then, according to that real-time breathing phase and the breathing-induced pose change information over at least one breathing cycle, remove the breathing-induced pose change component from the real-time pose information. Specifically, from the real-time breathing phase and the per-phase pose change matrices of the breathing cycle, obtain the pose change matrix corresponding to the current breathing phase; the real-time pose information of the detected object includes a real-time pose matrix, and multiplying this real-time pose matrix by the inverse of the pose change matrix for the current breathing phase removes the breathing-induced pose change component from the real-time pose information. During real-time registration, especially when registering ultrasound images of abdominal organs with other images (for example three-dimensional volume data such as 3D ultrasound, CT, MR or PET) for patients who breathe abdominally, the displacement, rotation and deformation of the organs caused by respiratory motion change the ultrasound image data, making the registration result unsatisfactory. To eliminate or reduce this effect, in some embodiments correcting the registration mapping relationship further includes: the processor 40 correcting, according to the breathing-induced pose change information of the inspected object, the change in the registration mapping relationship caused by the breathing-induced change of the ultrasound image data;
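A hedged sketch of the removal step: the current breathing phase is assumed to be already known, the per-phase change matrices come from the table sketched above, and the multiplication order follows the convention chosen in that earlier sketch (the disclosure itself does not fix the order):

    import numpy as np

    def remove_breathing(R_realtime, delta_by_phase, current_phase):
        """Strip the breathing-induced component from the real-time pose.

        R_realtime     : 4x4 real-time pose matrix of the body surface locator.
        delta_by_phase : dict of per-phase breathing pose-change matrices.
        current_phase  : breathing phase estimated for the current moment.
        """
        delta = delta_by_phase[current_phase]
        # Left-multiplying by the inverse of the phase-matched change matrix
        # keeps only the pose change due to genuine body-position movement.
        return np.linalg.inv(delta) @ R_realtime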

(6) Register the real-time ultrasound image data of the inspected object with the detection image data according to the corrected registration mapping relationship.

Analyzing the above flow shows that whenever either the pose information of the ultrasonic probe or the pose information of the inspected object changes, the registration mapping relationship changes and therefore needs to be corrected and updated. These changes essentially fall into several classes. The first class is caused by changes in the pose information of the detected object (for a magnetic navigation system, this covers a change in the patient's body position or a change in the position of the magnetic field generator). The second class, on top of the first, is the disturbance that the detected object's breathing introduces into the pose information acquired by the body surface locator 13, which also changes the registration mapping relationship. The third class is the change in the ultrasound image data itself caused by the detected object's breathing, which likewise changes the registration mapping relationship. One or more of these three classes can therefore be corrected in a targeted way, as described below.
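To make the first two classes concrete, the sketch below rearranges the quoted formula P = T × A⁻¹ × R⁻¹ into T = P × R × A and evaluates the current ultrasound-to-volume transform with a respiration-cleaned body-surface pose; the composition order and the roles assigned to A, R and T are assumptions read off that formula, and the third-class correction is left out here because its handling is discussed separately below:

    import numpy as np

    def ultrasound_to_volume(P, A_now, R_now, delta_now):
        """Current ultrasound-to-volume transform with respiration removed.

        P         : initial registration mapping matrix.
        A_now     : current 4x4 pose matrix of the probe locator.
        R_now     : current 4x4 pose matrix of the body surface locator.
        delta_now : breathing pose-change matrix for the current phase.
        """
        # Second-class correction: strip the respiratory component from R_now.
        R_clean = np.linalg.inv(delta_now) @ R_now
        # First-class (body position) tracking enters through R, probe motion
        # through A; from P = T @ inv(A) @ inv(R) it follows that T = P @ R @ A.
        return P @ R_clean @ A_now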

Referring to FIG. 8 or FIG. 9, in some embodiments the image registration method includes the following steps:

Step 200: obtain real-time ultrasound image data. For example, step 200 controls the ultrasonic probe to transmit ultrasonic waves to the inspected object and to receive the ultrasonic echoes returned from the inspected object so as to obtain ultrasonic echo signals, and obtains real-time ultrasound image data of the inspected object from those echo signals.

Step 210: obtain non-real-time detection image data of the inspected object, for example three-dimensional ultrasound image data or image data of a modality different from ultrasound, such as CT or MR image data.

Step 220: obtain the two kinds of pose information. Specifically, step 220 obtains the real-time pose information of the ultrasonic probe through the probe locator, and the real-time pose information of the inspected object through the body surface locator; the probe locator is mounted on the ultrasonic probe, and the body surface locator is intended to be placed on the inspected object.

Step 230: register the real-time ultrasound image data of the inspected object with the detection image data.

Referring to FIG. 10, in some embodiments step 230, registering the real-time ultrasound image data of the inspected object with the detection image data, includes the following steps:

Step 231: obtain the initial registration mapping relationship between the ultrasound image data and the detection image data according to the real-time pose information of the ultrasonic probe.

For how the initial registration mapping relationship is obtained, see the description of initial registration above: for example, the operator selects an ultrasound image, finds the corresponding slice in the detection image, overlays the two, and translates or rotates them until the same anatomical structures in the ultrasound image and the corresponding detection-image slice coincide, which yields the T matrix at that moment; the P matrix can then be computed by the formula P = T × A⁻¹ × R⁻¹, and this P matrix is the initial registration mapping relationship between the ultrasound image data and the detection image data.

Step 233: correct the above registration mapping relationship. In step 233 the correction includes performing one or more of a first-class correction, a second-class correction and a third-class correction.

In some embodiments, step 233 performs the first-class correction on the registration mapping relationship according to the real-time pose information of the inspected object. In some embodiments, this first-class correction corrects the change in the registration mapping relationship caused by a change in the inspected object's body position.

In some embodiments, step 233 obtains the breathing-induced pose change information of the inspected object through the body surface locator, for example the breathing-induced pose change information over at least one breathing cycle. In some specific embodiments, step 233 determines the breathing phase corresponding to the initial registration mapping relationship and its corresponding breathing-induced pose information; determines each breathing phase within the breathing cycle and its corresponding breathing-induced pose information; and, from the breathing phase corresponding to the initial registration mapping relationship with its breathing-induced pose information and from each breathing phase in the cycle with its breathing-induced pose information, computes a pose change matrix for each breathing phase within the cycle, which serves as the breathing-induced pose change information. One example of the pose change information computed this way is the pose change ΔRt mentioned in formula (15) above.

Step 233 performs the second-class correction on the registration mapping relationship according to the breathing-induced pose change information. In some embodiments, this second-class correction removes the breathing-induced pose change component that the inspected object's pose information contains when the first-class correction is performed.

The second-class correction is described in more detail below.

Referring to FIG. 11, in some embodiments step 233, performing the second-class correction on the registration mapping relationship according to the breathing-induced pose change information, includes the following steps:

Step 233.1: determine the real-time breathing phase of the detected object.
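The disclosure does not say how the real-time breathing phase is determined; one common possibility, sketched below purely as an assumption, is to project the body surface locator's translation onto the principal axis of its motion over a reference breathing cycle and quantize the normalized excursion (inhalation and exhalation are not distinguished in this simplified version):

    import numpy as np

    def estimate_phase(reference_positions, current_position, n_phases=20):
        """Map the current locator position to a discrete breathing phase.

        reference_positions : (N, 3) locator translations sampled over at
                              least one full breathing cycle.
        current_position    : (3,) current locator translation.
        n_phases            : number of discrete phases in the per-phase table.
        """
        ref = np.asarray(reference_positions, dtype=float)
        cur = np.asarray(current_position, dtype=float)
        mean = ref.mean(axis=0)
        # Principal axis of the respiratory excursion over the reference cycle.
        _, _, vt = np.linalg.svd(ref - mean)
        axis = vt[0]
        proj = (ref - mean) @ axis
        lo, hi = proj.min(), proj.max()
        # Normalized excursion of the current sample along that axis.
        s = float(np.clip(((cur - mean) @ axis - lo) / (hi - lo + 1e-9), 0.0, 1.0))
        # Quantize into one of n_phases discrete phases.
        return int(round(s * (n_phases - 1)))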

Step 233.3: according to the real-time breathing phase of the detected object, the breathing-induced pose change information over at least one breathing cycle, and the breathing phase corresponding to the initial registration mapping relationship with its breathing-induced pose information, perform the second-class correction on the registration mapping relationship, thereby removing the breathing-induced pose change component that the inspected object's pose information contains when the first-class correction is performed. In some specific embodiments, step 233.3 obtains, from the real-time breathing phase of the detected object and the per-phase pose change matrices of the breathing cycle, the pose change matrix corresponding to the current breathing phase; the real-time pose information of the detected object includes a real-time pose matrix, and step 233.3 multiplies this real-time pose matrix by the inverse of the pose change matrix for the current breathing phase, thereby removing the breathing-induced pose change component that the inspected object's pose information contains when the first-class correction is performed.

In some embodiments, step 233 performs the third-class correction on the registration mapping relationship using the breathing-induced pose change information; this third-class correction corrects the change in the registration mapping relationship caused by the breathing-induced change of the inspected object's ultrasound image data.
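The arithmetic of the third-class correction is not spelled out in this passage; one plausible shape for it, offered only as an assumption, is to fold a phase-dependent estimate of internal organ motion back into the ultrasound-to-volume transform, for example:

    import numpy as np

    def compensate_image_motion(T_now, organ_delta_now):
        """One plausible shape of the third-class correction (an assumption).

        T_now           : current ultrasound-to-volume transform.
        organ_delta_now : 4x4 estimate of internal organ motion for the
                          current breathing phase, which in general differs
                          from the body-surface change matrices.
        """
        # Undo the estimated internal organ displacement before sampling
        # the detection volume.
        return np.linalg.inv(organ_delta_now) @ T_now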

Step 235: register the real-time ultrasound image data of the inspected object with the detection image data according to the corrected registration mapping relationship.

Step 240: control fusion display or comparative display of the registered real-time ultrasound image data and detection image data.

In some embodiments of the invention, the body-position-change information (the pose information of the inspected object) and the respiratory-cycle information contained in the body surface locator's signal are decoupled. The body-position-change information is used for accurate body-position tracking, so that good registration is maintained during image registration even if the patient's body position changes or, for example, the magnetic field generator is moved. The respiratory-cycle information can be discarded, or used for respiratory compensation to correct in real time, during fusion, the displacement inside the body of organs such as the liver caused by the patient's respiratory motion.

This document has been described with reference to various exemplary embodiments. Those skilled in the art will recognize, however, that changes and modifications can be made to the exemplary embodiments without departing from the scope of this document. For example, the various operation steps, and the components used to perform them, may be implemented differently depending on the particular application, or in view of any number of cost functions associated with the operation of the system (for example, one or more steps may be deleted, modified or combined with other steps).

The above embodiments may be implemented wholly or partly in software, hardware, firmware or any combination thereof. In addition, as those skilled in the art will understand, the principles herein may be embodied in a computer program product on a computer-readable storage medium that carries computer-readable program code. Any tangible, non-transitory computer-readable storage medium may be used, including magnetic storage devices (hard disks, floppy disks and the like), optical storage devices (CD-ROM, DVD, Blu-ray discs and the like), flash memory and the like. These computer program instructions may be loaded onto a general-purpose computer, a special-purpose computer or other programmable data processing apparatus to form a machine, so that the instructions executed on the computer or other programmable data processing apparatus produce an apparatus that implements the specified functions. The computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to operate in a particular manner, so that the instructions stored in the computer-readable memory form an article of manufacture that includes means for implementing the specified functions. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus, so that a series of operation steps are performed on the computer or other programmable apparatus to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable apparatus provide steps for implementing the specified functions.

Although the principles herein have been shown in various embodiments, many modifications of structure, arrangement, proportion, elements, materials and components that are particularly suited to specific environments and operating requirements may be made without departing from the principles and scope of this disclosure. These and other changes or modifications are intended to be included within the scope of this document.

The foregoing detailed description has been given with reference to various embodiments. Those skilled in the art will recognize, however, that various modifications and changes may be made without departing from the scope of this disclosure. Accordingly, the disclosure is to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within its scope. Likewise, advantages, other benefits and solutions to problems have been described above with respect to various embodiments; however, no benefit, advantage or solution to a problem, and no element that produces it or makes it more pronounced, is to be construed as critical, required or essential. As used herein, the term "comprising" and any other variant thereof denotes a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed or not inherent to such process, method, system, article or apparatus. In addition, the term "coupled" and any other variant thereof, as used herein, refers to a physical, electrical, magnetic, optical, communicative, functional and/or any other connection.

Those skilled in the art will recognize that many changes may be made to the details of the above-described embodiments without departing from the underlying principles of the invention. The scope of the invention should therefore be determined only by the claims.

Claims (20)

1. A method of image registration, comprising:
controlling an ultrasonic probe to emit ultrasonic waves to an inspected object, receiving ultrasonic echoes returned from the inspected object to acquire ultrasonic echo signals, and acquiring real-time ultrasonic image data of the inspected object according to the ultrasonic echo signals;
acquiring non-real-time detection image data of the checked object;
acquiring real-time pose information of the ultrasonic probe through a probe positioner, and acquiring real-time pose information of the inspected object and pose change information of the inspected object caused by breathing through a body surface positioner; the probe positioner is arranged on the ultrasonic probe, and the body surface positioner is arranged on the inspected object;
acquiring the real-time ultrasonic image data of the checked object and the initial registration mapping relation of the detection image data according to the real-time pose information of the ultrasonic probe;
removing pose change information caused by respiration from the real-time pose information of the detected object; correcting the registration mapping relation according to pose information of the detected object after removing pose change information caused by respiration;
and registering the ultrasonic image data and the detection image data of the checked object in real time according to the corrected registration mapping relation.
2. The image registration method according to claim 1, wherein acquiring pose change information of the inspected object caused by respiration by the body surface locator includes:
and acquiring pose change information caused by breathing in at least one breathing cycle of the checked object.
3. The image registration method according to claim 2, wherein the acquiring of respiratory-induced pose change information of the subject over at least one respiratory cycle comprises:
determining a respiration phase corresponding to the initial registration mapping relation and corresponding pose information caused by respiration;
determining each breathing phase in the breathing cycle and corresponding pose information caused by breathing;
and calculating to obtain a pose change amount matrix corresponding to each breathing time phase in the breathing period caused by breathing according to the breathing time phase corresponding to the initial registration mapping relation and the pose information corresponding to the breathing time phase and the pose information corresponding to the breathing in the breathing period caused by breathing.
4. The image registration method according to claim 2 or 3, wherein the removing pose change information caused by breathing from the real-time pose information of the detected object includes:
determining a breathing phase of the detected subject in real time;
and removing pose change information caused by breathing contained in the real-time pose information of the detected object from the real-time pose information of the detected object according to the real-time breathing time phase of the detected object and the pose change information caused by breathing in at least one breathing period of the detected object.
5. The image registration method of claim 4, wherein,
the removing the pose change information caused by breathing included in the pose information of the detected object in real time from the pose information of the detected object in real time according to the respiratory phase of the detected object in real time and the pose change information caused by breathing in at least one respiratory period of the detected object, includes:
obtaining a pose change matrix corresponding to the real-time breathing time phase of the detected object caused by breathing according to the real-time breathing time phase of the detected object and the pose change matrix corresponding to each breathing time phase in the breathing period caused by breathing;
the real-time pose information of the detected object comprises a real-time pose matrix of the detected object, and the real-time pose matrix of the detected object is multiplied by an inverse matrix of a pose change amount matrix corresponding to the real-time breathing time phase of the detected object so as to remove the pose change information caused by breathing from the real-time pose information of the detected object.
6. The image registration method according to claim 1, wherein the correcting the registration mapping relationship further includes: and correcting the change of the registration mapping relation caused by the change of the ultrasonic image data of the checked object caused by the respiration according to the pose change information of the checked object caused by the respiration.
7. The image registration method according to any one of claims 1 to 6, characterized by further comprising: and controlling fusion display or contrast display of the registered real-time ultrasonic image data and the detected image data.
8. The image registration method according to any one of claims 1 to 6, wherein the detection image data is three-dimensional ultrasound image data or image data of a different modality from ultrasound image data.
9. A method of image registration, comprising:
acquiring real-time first image data of an inspected object through an ultrasonic probe;
acquiring second image data of the inspected object;
acquiring real-time pose information of the ultrasonic probe;
acquiring real-time pose information of the checked object;
acquiring pose change information of the checked object caused by respiration;
and registering the first image data and the second image data of the inspected object in real time according to the real-time pose information of the ultrasonic probe, the real-time pose information of the inspected object and the pose change information of the inspected object caused by breathing.
10. The image registration method according to claim 9, wherein the acquiring pose change information of the inspected object caused by respiration includes:
and acquiring pose change information caused by breathing in at least one breathing cycle of the checked object.
11. The image registration method according to claim 10, wherein the acquiring of respiratory-induced pose change information of the subject over at least one respiratory cycle comprises:
determining a respiration phase corresponding to the initial registration mapping relation and corresponding pose information caused by respiration;
determining each breathing phase in the breathing cycle and corresponding pose information caused by breathing;
and calculating to obtain a pose change amount matrix corresponding to each breathing time phase in the breathing period caused by breathing according to the breathing time phase corresponding to the initial registration mapping relation and the pose information corresponding to the breathing time phase and the pose information corresponding to the breathing in the breathing period caused by breathing.
12. The image registration method according to any one of claims 9 to 11, wherein registering the first image data and the second image data of the inspected object in real time includes:
acquiring an initial registration mapping relation between the real-time first image data and the second image data of the detected object according to the real-time pose information of the ultrasonic probe;
correcting the registration mapping relation according to the pose change information of the checked object caused by breathing and the real-time pose information of the checked object;
and registering the first image data and the second image data of the checked object in real time according to the corrected registration mapping relation.
13. The image registration method according to claim 12, wherein the correcting the registration map according to pose change information of the inspected object caused by breathing and the real-time pose information of the inspected object includes:
correcting the real-time pose information of the checked object according to the pose change information of the checked object caused by breathing;
and correcting the registration mapping relation according to the real-time pose information of the checked object after correction.
14. The image registration method according to claim 13, wherein correcting the pose information of the inspected object in real time according to the pose change information of the inspected object caused by respiration includes:
and identifying and removing the pose change information caused by breathing from the real-time pose information of the detected object.
15. The image registration method according to claim 14, wherein the identifying and removing pose change information caused by breathing contained in the detected object from the real-time pose information of the detected object includes:
determining a breathing phase of the detected subject in real time;
and removing pose change information caused by breathing contained in the real-time pose information of the detected object from the real-time pose information of the detected object according to the real-time breathing time phase of the detected object and the pose change information caused by breathing in at least one breathing period of the detected object.
16. The image registration method according to claim 15, wherein the removing pose change information caused by breathing included in the real-time pose information of the detected object from the real-time pose information of the detected object based on the real-time breathing phase of the detected object and the pose change information caused by breathing in at least one breathing cycle of the detected object includes:
obtaining a pose change matrix corresponding to the real-time breathing time phase of the detected object caused by breathing according to the real-time breathing time phase of the detected object and the pose change matrix corresponding to each breathing time phase in the breathing period caused by breathing;
the real-time pose information of the detected object comprises a real-time pose matrix of the detected object, and the real-time pose matrix of the detected object is multiplied by an inverse matrix of a pose change amount matrix corresponding to the real-time breathing time phase of the detected object so as to remove the pose change information caused by breathing from the real-time pose information of the detected object.
17. The image registration method according to any one of claims 12 to 16, wherein the correcting the registration map according to respiratory-induced pose change information of the subject under examination and real-time pose information of the subject under examination further includes:
and correcting the change of the registration mapping relation caused by the change of the first image data of the checked object caused by the respiration according to the pose change information of the checked object caused by the respiration.
18. The image registration method according to any one of claims 9 to 11, characterized by further comprising: and controlling fusion display or contrast display of the registered real-time first image data and the registered real-time second image data, wherein the first image data comprises two-dimensional ultrasonic image data, and the second image data comprises three-dimensional ultrasonic image data or image data of a different mode from the ultrasonic image data.
19. An ultrasound imaging system, comprising:
the ultrasonic probe is used for transmitting ultrasonic waves to the detected object and receiving corresponding ultrasonic wave echoes so as to acquire ultrasonic wave echo signals;
a transmission and reception control circuit for controlling the ultrasonic probe to perform transmission of ultrasonic waves and reception of ultrasonic echo signals;
the probe locator is arranged on the ultrasonic probe and used for acquiring real-time pose information of the ultrasonic probe;
the body surface locator is used for being arranged on the checked object to acquire real-time pose information of the checked object;
a processor for performing the image registration method as claimed in any one of claims 1 to 18.
20. A computer readable storage medium having stored thereon a program executable by a processor to implement the method of any one of claims 1 to 18.
CN202111436052.XA 2021-11-29 2021-11-29 An image registration method and an ultrasonic imaging system Pending CN116172605A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111436052.XA CN116172605A (en) 2021-11-29 2021-11-29 An image registration method and an ultrasonic imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111436052.XA CN116172605A (en) 2021-11-29 2021-11-29 An image registration method and an ultrasonic imaging system

Publications (1)

Publication Number Publication Date
CN116172605A true CN116172605A (en) 2023-05-30

Family

ID=86438924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111436052.XA Pending CN116172605A (en) 2021-11-29 2021-11-29 An image registration method and an ultrasonic imaging system

Country Status (1)

Country Link
CN (1) CN116172605A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12429550B2 (en) * 2022-06-30 2025-09-30 Samsung Electronics Co., Ltd. Electronic device for estimating relative position and pose and operating method of the same


Similar Documents

Publication Publication Date Title
JP5367215B2 (en) Synchronization of ultrasound imaging data with electrical mapping
JP4795099B2 (en) Superposition of electroanatomical map and pre-acquired image using ultrasound
JP5345275B2 (en) Superposition of ultrasonic data and pre-acquired image
JP5265091B2 (en) Display of 2D fan-shaped ultrasonic image
US8126239B2 (en) Registering 2D and 3D data using 3D ultrasound data
CN103829949B (en) Patient Motion Compensation in In Vivo Probe Tracking Systems
JP5622995B2 (en) Display of catheter tip using beam direction for ultrasound system
US20060253024A1 (en) Software product for three-dimensional cardiac imaging using ultrasound contour reconstruction
US20060241445A1 (en) Three-dimensional cardial imaging using ultrasound contour reconstruction
CN116172605A (en) An image registration method and an ultrasonic imaging system
AU2012258444A1 (en) Display of two-dimensional ultrasound fan

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination