WO2019179344A1 - Three-dimensional ultrasound imaging method based on multi-sensor information fusion, apparatus, and terminal machine-readable storage medium
- Publication number
- WO2019179344A1 (application PCT/CN2019/078034; priority CN2019078034W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- pose
- position information
- camera
- ultrasonic probe
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4254—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
Definitions
- the present application relates to the field of ultrasonic imaging technologies, and in particular, to a three-dimensional ultrasound imaging method, apparatus, terminal, and machine readable storage medium based on multi-sensor information fusion.
- three-dimensional ultrasound imaging spares doctors from having to mentally synthesize multiple two-dimensional images, based on experience, in order to understand a three-dimensional anatomical structure; its results are intuitive and of considerable clinical value, so it has attracted the attention of researchers and medical workers.
- common three-dimensional ultrasound imaging technology mainly falls into two categories: the first uses a three-dimensional electronic phased array or similar means to obtain three-dimensional data without moving the probe, and images immediately or in real time;
- the second determines the trajectory of the probe in three-dimensional space by using a spatial position sensor, thereby determining the spatial coordinates and image orientation of each acquired two-dimensional frame, and performing three-dimensional reconstruction of the scanned structure.
- the second method of three-dimensional reconstruction covers four ways of locating each scanned two-dimensional frame: first, an electromagnetic sensor.
- Second, an external positioning (visual) device: an external T-shaped dual-camera arm is attached, and a special visual marker, such as a silver reflective ball or a black-and-white circular template, is mounted on the ultrasonic probe. A six-degree-of-freedom estimate is obtained by monitoring the marker's position with the cameras, but the cameras must constantly "stare" at the markers on the probe.
- since the probe moves freely with the doctor, the markers are often occluded, making tracking impossible.
- Third, a mechanically moved probe: the probe is driven by a mechanical device to sweep the spatial area in a fan shape or in a rotating manner. Because the motion is machine-controlled, the position and orientation of each frame can be obtained, but the scan range is very limited and can only cover what the mechanical scanning device reaches. For special scenes such as intravascular ultrasound, such devices can scan a blood vessel well through a full revolution about a fixed axis, but for other free scans this solution does not meet the need for free scanning. Fourth, the handheld three-dimensional ultrasound system.
- one of the objectives of the embodiments of the present application is to provide a three-dimensional ultrasound imaging method and apparatus based on multi-sensor information fusion that improve the stability and quality of three-dimensional ultrasound imaging by mounting a sensor group and cameras on the ultrasonic probe.
- the embodiment of the present application provides a three-dimensional ultrasound imaging method based on multi-sensor information fusion, comprising: acquiring the first pose information of an ultrasound probe by using a sensor group, wherein the first pose information is obtained by the sensor group from the information it collects.
- the sensor group is disposed on the ultrasonic probe, and the sensor group includes at least an inertial guidance sensor;
- a three-dimensional image of each frame of the ultrasound image is reconstructed according to an interpolation method.
- the method further includes:
- both the first pose information and the second pose information are subjected to pose correction.
- the second pose information of the ultrasonic probe is acquired by using the camera, and the second pose information is obtained by the camera according to the collected panoramic scene, wherein the camera is disposed on the ultrasonic probe, and the method includes:
- the first information is collected by the first camera disposed on the ultrasonic probe, the position information of the first camera corresponding to the first information is calculated by using the pose estimation algorithm, and the first set of position information and the first set of angle information of the ultrasonic probe are obtained according to that position information, wherein the first set of position information includes position information of three degrees of freedom, and the first set of angle information includes angle information of three degrees of freedom;
- the second information is collected by the second camera disposed on the back side of the ultrasonic probe, the position information of the second camera corresponding to the second information is calculated by using the pose estimation algorithm, and the second set of position information and the second set of angle information of the ultrasonic probe are obtained according to that position information, wherein the second set of position information includes position information of three degrees of freedom, and the second set of angle information includes angle information of three degrees of freedom;
- the third information is collected by the third camera disposed on the ultrasonic probe handle, the position information of the third camera corresponding to the third information is calculated by using the pose estimation algorithm, and the third set of position information and the third set of angle information of the ultrasonic probe are obtained according to that position information, wherein the third set of position information includes position information of three degrees of freedom, and the third set of angle information includes angle information of three degrees of freedom;
- the fourth information is collected by the fourth camera disposed on the back side of the ultrasonic probe handle, the position information of the fourth camera corresponding to the fourth information is calculated by using the pose estimation algorithm, and the fourth set of position information and the fourth set of angle information of the ultrasonic probe are obtained according to that position information, wherein the fourth set of position information includes position information of three degrees of freedom, and the fourth set of angle information includes angle information of three degrees of freedom;
- the acquiring the second pose information of the ultrasound probe by using the camera includes:
- the pose estimation algorithm comprises a simultaneous localization and mapping (SLAM) algorithm or a visual inertial odometry (VIO) method.
- after the three-dimensional image of each frame of the ultrasound image is reconstructed according to the interpolation method, the method further includes:
- the first pose information and the three-dimensional image of the next frame of the ultrasound image are optimized.
- the inertial guidance sensor comprises a first inertial guidance chip and a second inertial guidance chip, wherein the first inertial guidance chip is disposed on the ultrasonic probe, and the second inertial guidance chip is disposed on the back side of the ultrasonic probe handle.
- the embodiment of the present application provides a three-dimensional ultrasound imaging apparatus based on multi-sensor information fusion, including:
- the sensing acquisition module is configured to acquire the first pose information of the ultrasonic probe by using the sensor group, wherein the first pose information is obtained by the sensor group according to the collected linear and angular acceleration information, the sensor group is disposed on the ultrasonic probe, and the sensor group includes at least an inertial guidance sensor;
- the camera acquisition module is configured to acquire the second pose information of the ultrasound probe by using the camera, and the second pose information is obtained by the camera according to the collected panoramic scene, wherein the camera is disposed on the ultrasound probe;
- a filtering module configured to obtain pose information of each frame of the ultrasound image by using an optimal estimation filtering method
- a reconstruction module configured to reconstruct a three-dimensional image of each frame of the ultrasound image according to an interpolation method.
- the method further includes:
- a similarity calculation module configured to calculate a similarity between the three-dimensional image at the current moment and the three-dimensional image at the previous moment
- a similarity comparison module configured to generate correction information when the similarity is greater than a preset similarity threshold;
- the similarity determining module is configured to determine whether the similarity between the first pose information and the second pose information is smaller than the pose estimate corresponding to the correction information;
- a negative execution module configured to reconstruct the three-dimensional image of each frame of the ultrasound image according to the interpolation method when the judgment result is negative;
- an affirmative execution module configured to perform pose correction on both the first pose information and the second pose information when the judgment result is affirmative.
- the camera acquiring module is specifically configured to:
- the first information is collected by the first camera disposed on the ultrasonic probe, the position information of the first camera corresponding to the first information is calculated by using a pose estimation algorithm, and the first set of position information and the first set of angle information of the ultrasonic probe are acquired according to that position information, wherein the first set of position information includes position information of three degrees of freedom, and the first set of angle information includes angle information of three degrees of freedom;
- the second information is collected by the second camera disposed on the back side of the ultrasonic probe, the position information of the second camera corresponding to the second information is calculated by using the pose estimation algorithm, and the second set of position information and the second set of angle information of the ultrasonic probe are acquired according to that position information, wherein the second set of position information includes position information of three degrees of freedom, and the second set of angle information includes angle information of three degrees of freedom;
- the third information is collected by the third camera disposed on the ultrasonic probe handle, the position information of the third camera corresponding to the third information is calculated by using a pose estimation algorithm, and the third set of position information and the third set of angle information of the ultrasonic probe are acquired according to that position information, wherein the third set of position information includes position information of three degrees of freedom, and the third set of angle information includes angle information of three degrees of freedom;
- the fourth information is collected by the fourth camera disposed on the back side of the ultrasonic probe handle, the position information of the fourth camera corresponding to the fourth information is calculated by using the pose estimation algorithm, and the fourth set of position information and the fourth set of angle information of the ultrasonic probe are acquired according to that position information, wherein the fourth set of position information includes position information of three degrees of freedom, and the fourth set of angle information includes angle information of three degrees of freedom;
- the first set of position information, the first set of angle information, the second set of position information, the second set of angle information, the third set of position information, the third set of angle information, the fourth set of position information, and the fourth set of angle information are stitched into the second pose information.
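One simple way to realize the "stitching" of the four per-camera estimates into a single second pose is to average whichever estimates are currently available; occluded cameras simply contribute nothing. This is a hypothetical fusion scheme for illustration, as the embodiments do not fix a particular rule:

```python
import numpy as np

def stitch_camera_poses(poses):
    """Fuse per-camera 6-DoF probe estimates into one second-pose estimate.

    `poses` is a list of (position, angles) pairs, each a 3-vector; cameras
    that are occluded contribute no entry. The available estimates are
    averaged component-wise (an illustrative choice of fusion rule).
    """
    positions = np.array([p for p, _ in poses])
    angles = np.array([a for _, a in poses])
    return positions.mean(axis=0), angles.mean(axis=0)
```

With two cameras reporting probe positions (0, 0, 0) and (2, 2, 2), the stitched position is (1, 1, 1); a weighted or cross-validated scheme could be substituted without changing the interface.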
- the camera acquisition module is configured to calculate the spatial position information of each camera by using a pose estimation algorithm according to the information collected by that camera, and to calculate the position information and angle information of the ultrasonic probe as the second pose information according to each camera's relative position on the ultrasonic probe and its spatial position information.
- the pose estimation algorithm comprises a simultaneous localization and mapping (SLAM) algorithm or a visual inertial odometry (VIO) method.
- the method further includes:
- a feedback module configured to feed the three-dimensional image into the positioner of the ultrasonic probe
- the optimization module is configured to optimize the first pose information and the three-dimensional image of the next frame of the ultrasound image.
- the embodiment of the present application further provides a terminal, including a memory and a processor, wherein the memory is configured to store a program that supports the processor in performing the multi-sensor information fusion based three-dimensional ultrasound imaging method provided by the foregoing aspect, and the processor is configured to execute the program stored in the memory.
- the embodiment of the present application further provides a computer readable storage medium, where the computer readable storage medium stores a computer program, and when the computer program is executed by the processor, performs the steps of the method of any of the above.
- in the multi-sensor information fusion based three-dimensional ultrasound imaging method, apparatus, terminal, and machine-readable storage medium provided by the embodiments of the present application, the method comprises: first, acquiring the first pose information of the ultrasound probe by using a sensor group. It should be noted here that:
- the sensor group is disposed on the ultrasonic probe.
- the sensor group includes at least an inertial guidance sensor, so that the environment in which the ultrasonic probe is located is considered comprehensively from different perspectives.
- the pose of each frame of the ultrasound image is obtained by optimally filtering the first pose information and the second pose information.
- FIG. 1 is a first flowchart of a three-dimensional ultrasound imaging method based on multi-sensor information fusion provided by an embodiment of the present application
- FIG. 2 is a second flowchart of a three-dimensional ultrasound imaging apparatus based on multi-sensor information fusion provided by an embodiment of the present application;
- FIG. 3 is a schematic structural diagram of a three-dimensional ultrasound imaging apparatus based on multi-sensor information fusion provided by an embodiment of the present application;
- FIG. 4 is a structural connection diagram of a three-dimensional ultrasound imaging apparatus based on multi-sensor information fusion provided by an embodiment of the present application.
- Reference numerals: 101-ultrasound probe; 102-ultrasound probe handle; 103-ultrasound probe cable; 104-ultrasound probe probe surface; 201-first camera; 202-second camera; 203-fourth camera; 204-third camera; 301-first inertial guidance chip; 302-second inertial guidance chip; 401-sensing acquisition module; 402-camera acquisition module; 403-filter module; 404-reconstruction module.
- the position and direction of the six degrees of freedom of the probe are determined by an electromagnetic sensor.
- This type of probe may have a limited range of single scans and is not suitable for one-time large-scale composite scanning.
- the second is an external positioning device that monitors the position of a marker to obtain a six-degree-of-freedom estimate; but the cameras must constantly "stare" at the markers on the probe, and since the probe moves freely with the doctor, the markers are often occluded, making tracking impossible.
- the third is to obtain the position and orientation of each frame by mechanically moving the probe; however, beyond special scenes such as intravascular ultrasound, the need for free scanning is still not met.
- the fourth, the handheld three-dimensional ultrasound system, requires the scanned object to be a flat surface.
- however, the surface of the human body is mostly curved and cannot meet this condition.
- these limitations result in poor performance of existing three-dimensional ultrasonic imaging technology in practical use.
- the embodiment of the present application provides a three-dimensional ultrasound imaging method and apparatus based on multi-sensor information fusion, which will be described below by way of embodiments.
- the three-dimensional ultrasound imaging method based on multi-sensor information fusion proposed in this embodiment specifically includes the following steps:
- Step S101 Acquire the first pose information of the ultrasonic probe by using the sensor group.
- the sensor group is disposed on the ultrasonic probe, and the sensor group includes at least an inertial guidance sensor, that is, different types of sensors are simultaneously disposed on the ultrasonic probe.
- the position and posture of the ultrasound probe are considered from different angles.
- the application of the above sensor group greatly reduces the system's requirements on the use environment; no external positioning module needs to be installed to locate the probe pose, which can significantly improve the portability of the system.
- the inertial guidance sensor includes a first inertial guidance chip and a second inertial guidance chip, wherein the first inertial guidance chip is disposed on the ultrasonic probe, and the second inertial guidance chip is disposed on the back side of the ultrasonic probe handle.
- the first inertial guiding chip 301 and the second inertial guiding chip 302 can provide position information and attitude information of the ultrasonic probe in real time.
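As a rough illustration of how an inertial guidance chip turns collected linear and angular acceleration information into real-time position and attitude, here is a naive dead-reckoning sketch. The function name is illustrative, and real chips additionally handle gravity removal and bias drift, which are omitted here:

```python
import numpy as np

def imu_dead_reckoning(accels, gyros, dt):
    """Integrate linear accelerations (m/s^2) and angular rates (rad/s),
    sampled every `dt` seconds, into a rough position and orientation.

    Hypothetical sketch: starts from rest at the origin and treats the
    three angles as independently integrated (valid for small rotations).
    """
    pos = np.zeros(3)
    vel = np.zeros(3)
    angles = np.zeros(3)  # roll, pitch, yaw as simple integrated angles
    for a, w in zip(accels, gyros):
        vel += np.asarray(a, float) * dt       # velocity from acceleration
        pos += vel * dt                        # position from velocity
        angles += np.asarray(w, float) * dt    # orientation from angular rate
    return pos, angles
```

In practice such an integration drifts quickly, which is exactly why the method fuses it with the camera-derived second pose information below.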
- Step S102 Acquire the second pose information of the ultrasonic probe by using the camera, and the second pose information is obtained by the camera according to the collected panoramic scene.
- the camera is disposed on the ultrasonic probe. During use, the camera usually selects a high-definition camera, and the panoramic scene in which the ultrasonic probe is located is collected by the camera to obtain the second pose information of the ultrasonic probe.
- the active information fusion of the positioning of the ultrasonic probe and the three-dimensional reconstruction further improves the accuracy and robustness of the system.
- this contrasts with the existing passive 3D reconstruction mode, in which the probe positioning and 3D reconstruction modules are two completely independent modules.
- the result of the three-dimensional reconstruction can be fed back into the probe locator, and the information of the sensor group and the result of the three-dimensional reconstruction are systematically optimized, thereby improving the robustness and precision of the system.
- Step S103 Obtaining pose information of each frame of the ultrasound image by using an optimal estimation filtering method.
- the method of optimal estimation filtering is used to obtain the pose information of the ultrasonic probe when acquiring the ultrasonic image of each frame according to the first pose information and the second pose information of each frame of the ultrasound image.
- Step S104 reconstructing a three-dimensional image of each frame of the ultrasound image according to the interpolation method. That is to say, in the method, the ultrasound images of different frames can be spliced into three-dimensional volume data by a fast and efficient interpolation algorithm.
- an interpolation algorithm is used to fuse the plurality of ultrasound images to obtain three-dimensional volume data.
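A minimal sketch of such an interpolation step scatters each tracked 2D frame's pixels into a voxel grid and averages overlapping contributions (nearest-neighbor binning). The function name, grid size, and voxel pitch are illustrative assumptions, not details fixed by the text:

```python
import numpy as np

def insert_frames(frames, poses, shape=(32, 32, 32), voxel=1.0):
    """Scatter tracked 2D ultrasound frames into a voxel grid.

    Each pose is a (R, t) pair mapping pixel (u, v) to the world point
    R @ [u, v, 0] + t; overlapping samples are averaged (bin filling,
    the simplest interpolation choice).
    """
    vol = np.zeros(shape)
    cnt = np.zeros(shape)
    for img, (R, t) in zip(frames, poses):
        h, w = img.shape
        for v in range(h):
            for u in range(w):
                p = R @ np.array([u, v, 0.0]) + t      # pixel -> world
                i, j, k = (p / voxel).round().astype(int)
                if 0 <= i < shape[0] and 0 <= j < shape[1] and 0 <= k < shape[2]:
                    vol[i, j, k] += img[v, u]
                    cnt[i, j, k] += 1
    return np.divide(vol, cnt, out=np.zeros(shape), where=cnt > 0)
```

Production systems replace the per-pixel loop with vectorized splatting and trilinear weights, but the data flow, posed frames in, averaged volume out, is the same.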
- the method further includes:
- the three-dimensional image of each frame of the ultrasonic image is reconstructed according to the interpolation method. That is, when the similarity between the first pose information and the second pose information is greater than or equal to the pose estimate corresponding to the correction information, the three-dimensional image of each frame of the ultrasound image is reconstructed according to the interpolation method.
- both the first pose information and the second pose information are subjected to pose correction. That is, when the similarity between the first pose information and the second pose information is less than the pose estimate corresponding to the correction information, both the first pose information and the second pose information are pose-corrected so that the obtained result is more precise.
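The branching described in the two cases above can be summarized as follows (all identifiers are illustrative; the thresholds come from the preset similarity threshold and the pose estimate tied to the correction information):

```python
def decide(similarity_3d, sim_threshold, pose_similarity, pose_estimate):
    """Correction-branch logic: when successive 3D images are too similar,
    correction information is generated; pose correction is then applied
    only if the first/second-pose similarity falls below the pose estimate
    corresponding to that correction information.
    """
    if similarity_3d > sim_threshold:          # correction info generated
        if pose_similarity < pose_estimate:
            return "correct_poses"             # correct both pose sources
    return "reconstruct"                       # proceed to interpolation
```

So reconstruction is the default path, and correction fires only when both conditions line up, which keeps the normal imaging loop cheap.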
- the second pose information of the ultrasonic probe is obtained by using the camera, and the second pose information is obtained by the camera according to the collected panoramic scene, wherein the camera is disposed on the ultrasonic probe, including:
- the first thing to note is that, in theory, only one camera is needed to obtain the six-degree-of-freedom pose information of the ultrasound probe. In this method, however, four cameras are used for two reasons. First, one or more cameras may be occluded during the examination; even so, the scene data provided by the other cameras can still supply visual information, and given how the doctor holds the probe during the examination, the possibility that all four cameras are blocked simultaneously is very small, so a stable video signal can always be obtained. Second, the information from multiple video signals can be fused and cross-validated, which improves positioning accuracy.
- the mounting positions of the four cameras are as shown in FIG. 3, and the cameras 201-204 provide video information in real time to monitor the surrounding scene of the ultrasonic probe.
- in real time, a pose estimation algorithm recovers the positions of the four cameras from the acquired video signals.
- the position and orientation of the probe can then be determined from these four positions.
- each frame of the two-dimensional ultrasound image acquired by the ultrasound probe is assigned a six-degree-of-freedom pose: position (three degrees of freedom, such as three-dimensional coordinates in space) and angle (three degrees of freedom, such as the rigid-body Euler angles of pitch, roll, and deflection). This may specifically include the following steps:
- the first set of position information and the first set of angle information of the ultrasonic probe are obtained, wherein the first set of position information includes position information of three degrees of freedom, and the first set of angle information includes angle information of three degrees of freedom.
- pose estimation algorithms such as Simultaneous Localization And Mapping (SLAM), Visual Inertial Odometry (VIO), and the like.
- the fourth information is collected by the fourth camera disposed on the back side of the ultrasonic probe handle, the position information of the fourth camera corresponding to the fourth information is calculated by using the pose estimation algorithm, and the fourth set of position information and the fourth set of angle information of the ultrasonic probe are obtained according to that position information, wherein the fourth set of position information includes position information of three degrees of freedom, and the fourth set of angle information includes angle information of three degrees of freedom.
- the first group position information, the first group angle information, the second group position information, the second group angle information, the third group position information, the third group angle information, the fourth group position information, and the fourth group angle The information is stitched into the second pose information.
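The three angular degrees of freedom used in the steps above (pitch, roll, deflection) can be recovered from a rotation matrix estimated by the pose estimation algorithm. A sketch using the common Z-Y-X Euler convention follows; the convention itself is an assumption, as the text does not fix one:

```python
import numpy as np

def euler_from_matrix(R):
    """Recover rigid-body Euler angles (pitch, roll, yaw/deflection)
    from a 3x3 rotation matrix, assuming the Z-Y-X convention
    R = Rz(yaw) @ Ry(pitch) @ Rx(roll). Gimbal-lock handling omitted.
    """
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return pitch, roll, yaw
```

For a pure rotation of 0.3 rad about the vertical axis this returns pitch 0, roll 0, yaw 0.3, i.e., the deflection angle alone.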
- the spatial position information of each camera may be calculated by using a pose estimation algorithm according to information collected by each camera.
- the second pose information of the ultrasonic probe can be calculated by combining the relative position information of each camera on the ultrasonic probe and the spatial position information of each camera.
- the second pose information includes the position information and the angle information of the ultrasonic probe in space.
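Combining each camera's estimated spatial pose with its known mounting position on the probe is a composition of rigid transforms. A minimal sketch with 4x4 homogeneous matrices follows; the calibrated extrinsic `T_cam_probe` is assumed known from the probe's mechanical design, and the names are illustrative:

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def probe_pose_from_camera(T_world_cam, T_cam_probe):
    """Compose the camera's estimated world pose with its fixed, calibrated
    mounting transform on the probe: T_world_probe = T_world_cam @ T_cam_probe.
    """
    return T_world_cam @ T_cam_probe
```

Each of the four cameras yields one such probe-pose candidate; these candidates are what the stitching step fuses into the second pose information.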
- the method further includes:
- Step S201 feeding back the three-dimensional image into the positioner of the ultrasonic probe. That is, a three-dimensional image of each reconstructed ultrasound image is stored in the locator of the ultrasound probe, and the purpose of this feedback is to compare with the next frame.
- Step S202 Optimizing the first pose information and the three-dimensional image of the next frame of the ultrasound image; that is, the pre-stored reconstructed three-dimensional image is used to correct the first pose information of the next frame of the ultrasound image, ensuring that the first pose information of the next frame does not have a large error.
- in summary, the three-dimensional ultrasound imaging method based on multi-sensor information fusion includes: first, acquiring the first pose information of the ultrasonic probe by using the sensor group, where the sensor group is disposed on the ultrasonic probe.
- the sensor group includes at least an inertial guidance sensor, that is, a plurality of sensors are used to monitor the first pose information of the ultrasonic probe; second, the second pose information of the ultrasonic probe is acquired by the cameras, the second pose information being obtained by the cameras from the collected panoramic scene.
- the cameras are disposed on the ultrasonic probe; then the pose information of each frame of the ultrasonic image is obtained by an optimal estimation filtering method, so that a three-dimensional image of each frame of the ultrasound image can be reconstructed according to the interpolation method.
- in this way, the influence of electromagnetic fields and ferromagnetic substances in the surrounding environment is avoided, positioning does not fail because the ultrasonic probe is blocked by the user or inspector, and the interference to the user is smaller.
- the embodiment provides a three-dimensional ultrasound imaging apparatus based on multi-sensor information fusion, including:
- the sensor acquisition module 401 is configured to acquire the first pose information of the ultrasound probe by using the sensor group, wherein the first pose information is obtained by the sensor group according to the collected linear and angular acceleration information, wherein the sensor group is disposed on the ultrasound probe,
- the sensor group includes at least an inertial guidance sensor
- the camera acquisition module 402 is configured to acquire the second pose information of the ultrasound probe by using the camera, and the second pose information is obtained by the camera according to the collected panoramic scene, wherein the camera is set at On the ultrasound probe
- the filtering module 403 is configured to obtain the pose information of each frame of the ultrasound image by using an optimal estimation filtering method
- the reconstruction module 404 is configured to reconstruct the three-dimensional image of each frame of the ultrasound image according to the interpolation method.
- the multi-sensor information fusion based three-dimensional ultrasound imaging apparatus further includes: a similarity calculation module configured to calculate the similarity between the three-dimensional image at the current moment and the three-dimensional image at the previous moment; a similarity comparison module configured to generate correction information when the similarity is greater than a preset similarity threshold; a similarity determination module configured to determine whether the similarity between the first pose information and the second pose information is smaller than the pose estimate corresponding to the correction information; a negative execution module configured to reconstruct the three-dimensional image of each frame of the ultrasound image according to the interpolation method when the judgment result is negative; and an affirmative execution module configured to perform pose correction on both the first pose information and the second pose information when the judgment result is affirmative.
- the camera acquisition module is specifically configured to:
- the first information is collected by the first camera disposed on the ultrasonic probe, the position information of the first camera corresponding to the first information is calculated by using a pose estimation algorithm, and a first set of position information and a first set of angle information of the ultrasonic probe are acquired according to that position information, wherein the first set of position information includes position information of three degrees of freedom and the first set of angle information includes angle information of three degrees of freedom;
- the second information is collected by the second camera disposed on the back side of the ultrasonic probe, the position information of the second camera corresponding to the second information is calculated by using the pose estimation algorithm, and a second set of position information and a second set of angle information of the ultrasonic probe are acquired according to that position information, wherein the second set of position information includes position information of three degrees of freedom and the second set of angle information includes angle information of three degrees of freedom;
- the third information is collected by the third camera disposed on the ultrasonic probe handle, the position information of the third camera corresponding to the third information is calculated by using the pose estimation algorithm, and a third set of position information and a third set of angle information of the ultrasonic probe are acquired according to that position information, wherein the third set of position information includes position information of three degrees of freedom and the third set of angle information includes angle information of three degrees of freedom;
- the fourth information is collected by the fourth camera disposed on the back side of the ultrasonic probe handle, the position information of the fourth camera corresponding to the fourth information is calculated by using the pose estimation algorithm, and a fourth set of position information and a fourth set of angle information of the ultrasonic probe are acquired according to that position information, wherein the fourth set of position information includes position information of three degrees of freedom and the fourth set of angle information includes angle information of three degrees of freedom;
- the first set of position information, the first set of angle information, the second set of position information, the second set of angle information, the third set of position information, the third set of angle information, the fourth set of position information, and the fourth set of angle information are stitched together into the second pose information.
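The stitching step described above can be sketched as a simple concatenation, assuming each camera contributes a 3-DOF position vector and a 3-DOF angle vector for the probe; the camera labels and numeric values are illustrative, not taken from the patent:

```python
import numpy as np

# Illustrative per-camera estimates of the probe pose: a 3-DOF position (m)
# and 3-DOF angles (rad). The camera labels are hypothetical.
estimates = {
    "probe_front":  (np.array([0.10, 0.02, 0.30]), np.array([0.00, 0.10, 1.57])),
    "probe_back":   (np.array([0.11, 0.02, 0.29]), np.array([0.01, 0.09, 1.56])),
    "handle_front": (np.array([0.10, 0.03, 0.30]), np.array([0.00, 0.11, 1.57])),
    "handle_back":  (np.array([0.10, 0.02, 0.31]), np.array([0.00, 0.10, 1.58])),
}

def stitch_second_pose(estimates: dict) -> np.ndarray:
    """Concatenate each camera's (position, angle) pair into one vector."""
    parts = []
    for position, angle in estimates.values():
        parts.append(position)
        parts.append(angle)
    return np.concatenate(parts)

second_pose = stitch_second_pose(estimates)  # 4 cameras x 6 DOF = 24 values
```

A downstream filter could equally average the four estimates instead of concatenating them; the patent text only says the eight sets are "stitched" into the second pose information.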
- the camera acquisition module is configured to calculate the spatial position information of each camera by using a pose estimation algorithm according to the information collected by that camera, and to calculate the position information and angle information of the ultrasonic probe as the second pose information according to the relative position information of each camera on the ultrasonic probe and the spatial position information of each camera.
- the pose estimation algorithm includes a simultaneous localization and mapping (SLAM) algorithm or a visual-inertial odometry (VIO) algorithm.
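The step from a camera's spatial pose (as a SLAM or VIO algorithm would report it) to the probe's pose can be sketched with homogeneous transforms; the mounting offset and camera pose below are assumed example values, not parameters from the patent:

```python
import numpy as np

def rot_z(theta: float) -> np.ndarray:
    """Rotation matrix about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Camera pose in the world frame (as SLAM/VIO might estimate it) and the
# fixed camera-to-probe mounting offset known from the probe design.
T_world_camera = make_transform(rot_z(np.pi / 2), np.array([0.5, 0.1, 0.3]))
T_camera_probe = make_transform(np.eye(3), np.array([0.0, 0.0, -0.05]))

# Chaining the transforms yields the probe pose in the world frame.
T_world_probe = T_world_camera @ T_camera_probe
probe_position = T_world_probe[:3, 3]  # -> [0.5, 0.1, 0.25]
```

Expressing each camera's output this way makes the "relative position information on the probe" a constant transform that can be calibrated once.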
- the multi-sensor information fusion based three-dimensional ultrasound imaging apparatus in this embodiment further includes: a feedback module configured to feed the three-dimensional image back into the locator of the ultrasound probe, and an optimization module configured to optimize the first pose information and the three-dimensional image for the next frame of the ultrasound image.
- the three-dimensional ultrasonic imaging apparatus based on multi-sensor information fusion provided by the embodiments of the present application has the same technical features as the three-dimensional ultrasonic imaging method based on multi-sensor information fusion provided by the above embodiments, so it can solve the same technical problems and achieve the same technical effects.
- the embodiments of the present application further provide a terminal including a memory and a processor, the memory being configured to store a program that supports the processor in executing the method of the foregoing embodiments, and the processor being configured to execute the program stored in the memory.
- the embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, performs the steps of any of the methods described above.
- each block of the flowchart or block diagram can represent a module, a program segment, or a portion of code that includes one or more executable instructions. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that illustrated in the drawings.
- each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
- each functional module or unit in each embodiment of the present application may be integrated together to form an independent part, each module may exist separately, or two or more modules may be integrated to form an independent part.
- the functions, if implemented in the form of software functional modules and sold or used as separate products, may be stored in a computer readable storage medium.
- the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application.
- the foregoing storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
- the multi-sensor information fusion based three-dimensional ultrasound imaging method, apparatus, and terminal machine-readable storage medium provided by the embodiments of the present application, wherein the three-dimensional ultrasound imaging method comprises: first, acquiring first pose information of the ultrasound probe by using a sensor group.
- it should be noted that the sensor group is disposed on the ultrasonic probe.
- the sensor group includes at least an inertial sensor, so that the environment in which the ultrasonic probe is located is considered comprehensively from different angles.
- the pose information of each frame of the ultrasound image is obtained by optimal estimation filtering of the first pose information and the second pose information.
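The optimal estimation filtering of the two pose sources can be sketched as a minimal inverse-variance fusion, i.e. the static special case of a Kalman update; the sensor variances and pose values below are assumed for the example and are not taken from the patent:

```python
import numpy as np

def fuse_poses(pose_a: np.ndarray, var_a: float,
               pose_b: np.ndarray, var_b: float):
    """Minimum-variance (inverse-variance weighted) fusion of two estimates.

    Each source is weighted by the inverse of its variance; the fused
    variance is smaller than either input variance.
    """
    w_a = var_b / (var_a + var_b)
    w_b = var_a / (var_a + var_b)
    fused = w_a * pose_a + w_b * pose_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused, fused_var

# First pose information (inertial sensors, which drift over time) and
# second pose information (cameras); variances are assumed example values.
pose_imu = np.array([1.00, 2.00, 3.00])
pose_cam = np.array([1.10, 1.95, 3.05])
fused_pose, fused_var = fuse_poses(pose_imu, 0.04, pose_cam, 0.01)
# fused_pose -> [1.08, 1.96, 3.04]; fused_var -> 0.008
```

A full implementation would run this as the update step of a Kalman filter over time, propagating the inertial measurements between camera observations; the one-shot fusion above only shows the weighting principle.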
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
A three-dimensional ultrasound imaging method based on multi-sensor information fusion, an apparatus, and a terminal machine-readable storage medium. The three-dimensional ultrasound imaging method based on multi-sensor information fusion comprises: first, obtaining first pose information of an ultrasound probe (101) by using a sensor group, the first pose information being obtained by the sensor group according to collected linear acceleration and rotation information, the sensor group being disposed on the ultrasound probe (101), and the sensor group including at least two types of devices, an acceleration sensor and a gyroscope; second, obtaining second pose information of the ultrasound probe (101) by using cameras (201, 202, 203, 204), the second pose information being obtained by the cameras (201, 202, 203, 204) according to collected panoramic scenes, the cameras (201, 202, 203, 204) being disposed on the ultrasound probe (101); then, obtaining pose information of each frame of an ultrasound image by means of an optimal estimation filtering method, and reconstructing a three-dimensional image of each frame of the ultrasound image according to an interpolation method; as a result, the combined use of the sensor group and the cameras (201, 202, 203, 204) on the ultrasound probe (101) is achieved, and the quality of three-dimensional ultrasound imaging technology is improved.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201810232253.X | 2018-03-20 | ||
| CN201810232253.XA CN108403146B (zh) | 2018-03-20 | 2018-03-20 | Three-dimensional ultrasound imaging method and apparatus based on multi-sensor information fusion |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019179344A1 true WO2019179344A1 (fr) | 2019-09-26 |
Family
ID=63132850
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2019/078034 Ceased WO2019179344A1 (fr) | 2019-03-13 | Three-dimensional ultrasound imaging method based on multi-sensor information fusion, apparatus, and terminal machine-readable storage medium |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN108403146B (fr) |
| WO (1) | WO2019179344A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12446854B2 (en) * | 2021-08-03 | 2025-10-21 | Fujifilm Sonosite, Inc. | Ultrasound probe guidance |
Families Citing this family (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108403146B (zh) * | 2018-03-20 | 2020-06-30 | 余夏夏 | 基于多传感器信息融合的三维超声成像方法及装置 |
| CN111292277B (zh) * | 2018-12-10 | 2021-02-09 | 深圳迈瑞生物医疗电子股份有限公司 | 超声融合成像方法及超声融合成像导航系统 |
| CN110248302B (zh) * | 2019-05-29 | 2021-06-08 | 苏州佳世达电通有限公司 | 超音波探头检验系统及超音波探头检验方法 |
| CN110179502A (zh) * | 2019-06-06 | 2019-08-30 | 深圳大学 | 手术器械及使用方法 |
| CN112237439B (zh) * | 2019-07-17 | 2024-10-11 | 深圳市理邦精密仪器股份有限公司 | 调整探头位置的方法、探头、超声设备和存储介质 |
| CN112155595B (zh) * | 2020-10-10 | 2023-07-07 | 达闼机器人股份有限公司 | 超声波诊断设备、超声探头、图像的生成方法及存储介质 |
| CN112155596B (zh) * | 2020-10-10 | 2023-04-07 | 达闼机器人股份有限公司 | 超声波诊断设备、超声波图像的生成方法及存储介质 |
| CN112530014B (zh) * | 2020-12-18 | 2023-07-25 | 北京理工大学重庆创新中心 | 一种多无人机室内场景三维重建方法及装置 |
| CN112704514B (zh) * | 2020-12-24 | 2021-11-02 | 重庆海扶医疗科技股份有限公司 | 病灶定位方法及病灶定位系统 |
| CN112617902A (zh) * | 2020-12-31 | 2021-04-09 | 上海联影医疗科技股份有限公司 | 一种三维成像系统及成像方法 |
| CN113160221B (zh) * | 2021-05-14 | 2022-06-28 | 深圳市奥昇医疗科技有限责任公司 | 图像处理方法、装置、计算机设备和存储介质 |
| CN113288209B (zh) * | 2021-06-04 | 2025-07-01 | 深圳开立生物医疗科技股份有限公司 | 一种超声成像设备、超声探头的位置信息获取方法及介质 |
| CN114886459A (zh) * | 2021-08-27 | 2022-08-12 | 中山大学孙逸仙纪念医院 | 采集超声操作手法数据的系统、方法、装置、设备及介质 |
| CN114533111A (zh) * | 2022-01-12 | 2022-05-27 | 电子科技大学 | 一种基于惯性导航系统的三维超声重建系统 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1689518A (zh) * | 2004-04-21 | 2005-11-02 | 西门子共同研究公司 | 使用基于图像的导航系统的增强现实仪器放置的方法 |
| CN106056664A (zh) * | 2016-05-23 | 2016-10-26 | 武汉盈力科技有限公司 | 一种基于惯性和深度视觉的实时三维场景重构系统及方法 |
| WO2016176452A1 (fr) * | 2015-04-28 | 2016-11-03 | Qualcomm Incorporated | Fusion dans un dispositif du suivi optique et inertiel de la position de sondes à ultrasons |
| WO2018002004A1 (fr) * | 2016-06-30 | 2018-01-04 | Koninklijke Philips N.V. | Système de suivi à dispositif inertiel et procédé de fonctionnement de celui-ci |
| CN107802346A (zh) * | 2017-10-11 | 2018-03-16 | 成都漫程科技有限公司 | 一种基于惯性制导的超声融合导航系统及方法 |
| CN108403146A (zh) * | 2018-03-20 | 2018-08-17 | 余夏夏 | 基于多传感器信息融合的三维超声成像方法及装置 |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102106741B (zh) * | 2009-12-25 | 2013-06-05 | 东软飞利浦医疗设备系统有限责任公司 | 一种二维超声图像的三维重建方法 |
| CN102499762B (zh) * | 2011-11-23 | 2014-06-04 | 东南大学 | 医用超声探头相对于检查部位的三维空间定位系统及方法 |
| CN104105455B (zh) * | 2011-12-03 | 2017-04-19 | 皇家飞利浦有限公司 | 内窥镜手术中的超声探头的机器人引导 |
| CN103197000A (zh) * | 2012-01-05 | 2013-07-10 | 西门子公司 | 用于超声探测的装置、监控设备和超声探测系统及方法 |
| CN104758066B (zh) * | 2015-05-06 | 2017-05-10 | 中国科学院深圳先进技术研究院 | 用于手术导航的设备及手术机器人 |
| CN107582098B (zh) * | 2017-08-08 | 2019-12-06 | 南京大学 | 一种二维超声图像集合重构的三维超声成像方法 |
-
2018
- 2018-03-20 CN CN201810232253.XA patent/CN108403146B/zh not_active Expired - Fee Related
-
2019
- 2019-03-13 WO PCT/CN2019/078034 patent/WO2019179344A1/fr not_active Ceased
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1689518A (zh) * | 2004-04-21 | 2005-11-02 | 西门子共同研究公司 | 使用基于图像的导航系统的增强现实仪器放置的方法 |
| WO2016176452A1 (fr) * | 2015-04-28 | 2016-11-03 | Qualcomm Incorporated | Fusion dans un dispositif du suivi optique et inertiel de la position de sondes à ultrasons |
| CN106056664A (zh) * | 2016-05-23 | 2016-10-26 | 武汉盈力科技有限公司 | 一种基于惯性和深度视觉的实时三维场景重构系统及方法 |
| WO2018002004A1 (fr) * | 2016-06-30 | 2018-01-04 | Koninklijke Philips N.V. | Système de suivi à dispositif inertiel et procédé de fonctionnement de celui-ci |
| CN107802346A (zh) * | 2017-10-11 | 2018-03-16 | 成都漫程科技有限公司 | 一种基于惯性制导的超声融合导航系统及方法 |
| CN108403146A (zh) * | 2018-03-20 | 2018-08-17 | 余夏夏 | 基于多传感器信息融合的三维超声成像方法及装置 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12446854B2 (en) * | 2021-08-03 | 2025-10-21 | Fujifilm Sonosite, Inc. | Ultrasound probe guidance |
Also Published As
| Publication number | Publication date |
|---|---|
| CN108403146B (zh) | 2020-06-30 |
| CN108403146A (zh) | 2018-08-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019179344A1 (fr) | Three-dimensional ultrasound imaging method based on multi-sensor information fusion, apparatus, and terminal machine-readable storage medium | |
| US9665936B2 (en) | Systems and methods for see-through views of patients | |
| CN109475386B (zh) | 内部设备跟踪系统以及操作其的方法 | |
| EP3081184B1 (fr) | Système et procédé de navigation à base d'images fusionnées avec marqueur de placement tardif | |
| US20200214682A1 (en) | Methods and apparatuses for tele-medicine | |
| US8582856B2 (en) | Image processing apparatus, image processing method, and program | |
| US9474505B2 (en) | Patient-probe-operator tracking method and apparatus for ultrasound imaging systems | |
| US20160317122A1 (en) | In-device fusion of optical and inertial positional tracking of ultrasound probes | |
| JP6833533B2 (ja) | 超音波診断装置および超音波診断支援プログラム | |
| US20180092628A1 (en) | Ultrasonic diagnostic apparatus | |
| JP2012247364A (ja) | ステレオカメラ装置、ステレオカメラシステム、プログラム | |
| CN103750859A (zh) | 基于位置信息的超声宽景成像方法 | |
| WO2012164919A1 (fr) | Appareil de génération d'image par ultrasons et procédé de génération d'image par ultrasons | |
| US20210068781A1 (en) | Ultrasonic imaging system | |
| JP2014212904A (ja) | 医用投影システム | |
| CN113081033A (zh) | 基于空间定位装置的三维超声成像方法、存储介质及设备 | |
| JP2005252482A (ja) | 画像生成装置及び3次元距離情報取得装置 | |
| JP7373045B2 (ja) | 超音波撮像方法、超音波撮像装置、超音波撮像システムおよび超音波撮像プログラム | |
| JP5485853B2 (ja) | 医用画像表示装置及び医用画像誘導方法 | |
| CN116585005A (zh) | 一种光捕与惯捕结合的结石定位系统 | |
| JP2006285789A (ja) | 画像処理方法、画像処理装置 | |
| JP4653461B2 (ja) | ディジタルx線断層撮影装置 | |
| JPWO2019158350A5 (fr) | ||
| Abbas et al. | MEMS Gyroscope and the Ego-Motion Estimation Information Fusion for the Low-Cost Freehand Ultrasound Scanner | |
| CN120477816A (zh) | 一种超声宽景成像方法、装置和存储介质 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19771978 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21.12.2020) |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19771978 Country of ref document: EP Kind code of ref document: A1 |