WO2013046568A1 - Photoacoustic imaging apparatus and photoacoustic imaging method - Google Patents
- Publication number: WO2013046568A1 (application PCT/JP2012/005786)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- photoacoustic
- image data
- photoacoustic image
- spatial information
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0093—Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
- A61B5/0095—Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
Definitions
- the present invention relates to a photoacoustic imaging apparatus and a photoacoustic imaging method for generating a photoacoustic image by detecting a photoacoustic wave generated in a subject by irradiating the subject with light.
- Ultrasonic imaging, in which an ultrasonic image is generated by irradiating a subject with ultrasonic waves and detecting the waves reflected inside the subject, is known as a technique for obtaining a morphological tomographic image.
- In recent years, development of apparatuses that display not only a morphological tomographic image but also a functional tomographic image has advanced.
- One of such devices is a device using a photoacoustic analysis method.
- In this photoacoustic analysis method, the subject is irradiated with light of a predetermined wavelength (for example, visible, near-infrared, or mid-infrared light), a specific substance in the subject absorbs the energy of this light, and the resulting elastic wave, a photoacoustic wave, is detected so that the concentration of the specific substance can be quantitatively measured.
- the specific substance in the subject is, for example, glucose or hemoglobin contained in blood.
- Such a technique for detecting a photoacoustic wave and generating a photoacoustic image based on the detection signal is called photoacoustic imaging (PAI) or photoacoustic tomography (PAT).
- Patent Document 1 discloses a photoacoustic imaging apparatus that can generate a three-dimensional photoacoustic image using an ultrasonic probe in which an optical system using a bundle fiber and detection elements for ultrasonic detection are combined.
- In such an apparatus, image data is displayed in the order of acquisition, starting from the first one-line tomographic image data at the scanning start position of the ultrasonic probe.
- Because the viewpoint of the displayed photoacoustic image is determined regardless of the positional relationship between the user and the subject or of the scanning direction of the ultrasonic probe, the problem may arise that it is troublesome for the user to grasp the content of the photoacoustic image. Since the appearance of the photoacoustic image differs depending on the direction from which it is viewed, it is preferable that the image be oriented in a fixed direction when it is first displayed, or that the viewing direction can be specified, so that the user can easily grasp the image contents.
- The present invention has been made in view of the above problems, and an object of the present invention is to provide a photoacoustic imaging apparatus and a photoacoustic imaging method that, when displaying a three-dimensional photoacoustic image, enable the user to easily grasp the contents of the photoacoustic image.
- To solve the above problems, a photoacoustic imaging apparatus according to the present invention comprises: an ultrasonic probe having a light irradiation unit for irradiating a test site with measurement light and an array transducer in which ultrasonic transducers for detecting the photoacoustic waves generated in the test site by the irradiation are arranged one-dimensionally; an information acquisition unit that acquires first spatial information defining the position and orientation of the ultrasonic probe in real space, and second spatial information defining at least one of the direction of the axis of the test site in real space, the scanning direction of the ultrasonic probe in real space, and the vertical direction of real space; an image data generation unit that generates photoacoustic image data based on the photoacoustic waves detected during scanning of the ultrasonic probe and the first spatial information acquired during that scanning; an associating unit that associates the second spatial information with the photoacoustic image data in a state in which the positional relationship in real space between the at least one direction and the imaged part is maintained;
- a viewpoint setting unit that, for the photoacoustic image data associated with the second spatial information, selects one direction from the at least one direction defined by the second spatial information and sets the direction to be the viewpoint based on the selection result; a data conversion unit that converts the photoacoustic image data so that a photoacoustic image based on it is displayed on the display unit as an image viewed from the direction set as the viewpoint; and a display unit that displays a photoacoustic image based on the converted photoacoustic image data.
- It is preferable that the viewpoint setting unit further selects either the forward direction or the reverse direction with respect to the selected direction, and sets the direction to be the viewpoint taking this selection into account.
- It is preferable that, for a plurality of generated photoacoustic image data, the viewpoint setting unit selects the same element as the direction to be the viewpoint for each of the plurality of photoacoustic image data.
- It is preferable that the information acquisition unit acquires the first spatial information for each imaging of one frame.
- The photoacoustic imaging apparatus can perform imaging for one frame when the position and orientation defined by the first spatial information each substantially match a preset position and orientation.
- the information acquisition unit preferably includes a magnetic sensor unit, and acquires the first spatial information and the second spatial information using the magnetic sensor unit.
- A photoacoustic imaging method according to the present invention uses a photoacoustic imaging apparatus equipped with an ultrasonic probe for photoacoustic imaging and comprises: acquiring first spatial information defining the position and orientation of the ultrasonic probe in real space, and second spatial information defining at least one of the direction of the axis of the test site in real space, the scanning direction of the ultrasonic probe in real space, and the vertical direction of real space; generating photoacoustic image data based on the photoacoustic waves detected during scanning of the ultrasonic probe and the first spatial information acquired during that scanning; associating the second spatial information with the photoacoustic image data in a state in which the positional relationship in real space between the at least one direction and the imaged part is maintained; and, for the photoacoustic image data associated with the second spatial information, selecting one direction from the at least one direction defined by the second spatial information and setting the direction to be the viewpoint based on the selection result.
- The photoacoustic image data is then converted so that a photoacoustic image based on it is displayed on the display unit as an image viewed from the direction set as the viewpoint, and a photoacoustic image based on the converted photoacoustic image data is displayed.
- In the photoacoustic imaging method, it is preferable to further select either the forward direction or the reverse direction for the selected direction and to take this selection into account when setting the viewpoint.
- In the photoacoustic imaging method, it is preferable to acquire the first spatial information for each imaging of one frame.
- The photoacoustic imaging method according to the present invention can perform imaging for one frame when the position and orientation defined by the first spatial information each substantially match a preset position and orientation.
- In the photoacoustic imaging method, it is preferable to acquire the first spatial information and the second spatial information using a magnetic sensor unit.
- According to the photoacoustic imaging apparatus and method of the present invention, the second spatial information is associated with the photoacoustic image data in a state in which the positional relationship in real space between at least one direction defined by the second spatial information and the imaged part is maintained, and, for the photoacoustic image data associated with the second spatial information, one direction is selected from the at least one direction and the direction to be the viewpoint is set based on the selection result.
- Since the photoacoustic image data is converted so that the photoacoustic image based on it is displayed on the display unit as an image viewed from the direction set as the viewpoint, the three-dimensional photoacoustic image is always displayed as an image viewed from a specific direction associated with real space. As a result, when a three-dimensional photoacoustic image is displayed, the user can easily grasp its contents.
- FIG. 1 is a schematic diagram illustrating a configuration of a photoacoustic imaging apparatus in the embodiment
- FIG. 2 is a schematic diagram illustrating a configuration of a signal processing unit in the embodiment
- FIG. 3 is a schematic diagram illustrating the configuration of an ultrasonic probe in the embodiment.
- The photoacoustic imaging apparatus 10 of the present embodiment includes an ultrasonic probe 20, a system control unit 11, a laser light source 12, a signal reception unit 13, an information acquisition unit 14, a display unit 15, and an operation unit 16 (user interface).
- The signal receiving unit 13 includes a receiving circuit 30, an AD conversion unit 31, a processing selection unit 32, a delay addition unit 33, a raw data memory 34, a phase matching addition unit 35, a detection / logarithmic conversion unit 36, a frame construction unit 37, and a volume data construction unit 38.
- The receiving circuit 30, AD conversion unit 31, process selection unit 32, delay addition unit 33, raw data memory 34, phase matching addition unit 35, detection / logarithm conversion unit 36, frame construction unit 37, and volume data construction unit 38 as a whole correspond to the image data generation unit 42 in the present invention.
- The photoacoustic imaging method of this embodiment, using the photoacoustic imaging device 10, proceeds as follows:
- the information acquisition unit 14 acquires first spatial information that defines the position and orientation of the ultrasound probe 20 in real space, and second spatial information that defines at least one of the direction of the axis of the test site M in real space, the scanning direction of the ultrasound probe 20 in real space, and the vertical direction of real space;
- the image data generation unit 42 generates three-dimensional photoacoustic image data based on the photoacoustic wave U detected by the scanning of the ultrasound probe 20 and the first spatial information acquired during the scanning.
- the associating unit 39 associates the second spatial information with the photoacoustic image data in a state where the positional relationship in the real space between the at least one direction and the imaging part related to the photoacoustic image data is maintained.
- for the photoacoustic image data associated with the second spatial information, the viewpoint setting unit 40 selects one direction from the at least one direction defined by the second spatial information, selects either the forward or reverse sense of that direction, and sets the direction to be the viewpoint based on these two selections;
- the data conversion unit 41 converts the photoacoustic image data so that a photoacoustic image based on it is displayed on the display unit as an image viewed from the direction set as the viewpoint;
- the display unit 15 displays a photoacoustic image based on the converted photoacoustic image data.
- the ultrasonic probe 20 includes a light irradiation unit 21 and an array transducer 22, and detects a photoacoustic wave from a region to be examined.
- a magnetic sensor that constitutes a part of the information acquisition unit 14 is built in the ultrasonic probe 20.
- the light irradiation unit 21 is an optical element that irradiates the laser beam L toward the test site from the vicinity of the array transducer 22.
- Specifically, the light irradiation unit 21 is a light guide plate 52 connected to the tip of an optical fiber 50 that guides the laser light L output from the laser light source 12 to the vicinity of the array transducer 22. When the laser light L emitted from the distal end of the optical fiber 50 irradiates the test site directly, the light irradiation unit 21 is the distal end portion of the optical fiber 50 itself.
- the light irradiation unit 21 is arranged along the periphery of the array transducer 22, for example.
- the array transducer 22 is a detection element that detects the photoacoustic wave U generated in the region to be examined. As shown in FIG. 3, the array transducer 22 is composed of a plurality of ultrasonic transducers 22a arranged in a one-dimensional manner.
- the ultrasonic transducer 22a is a piezoelectric element made of a polymer film such as piezoelectric ceramics or polyvinylidene fluoride (PVDF).
- the ultrasonic transducer 22a has a function of converting an acoustic signal into an electric signal when the photoacoustic wave U is detected. This electrical signal is output to the receiving circuit 30 described later.
- The ultrasonic probe 20 is appropriately selected from a sector scanning type, a linear scanning type, a convex scanning type, and the like, according to the region to be diagnosed.
- The laser light irradiation is performed simultaneously by, for example, the entire light irradiation unit 21 (all the light guide plates 52 in FIG. 3). In this case, the laser beam irradiation is performed for each line of imaging (photoacoustic image generation).
- The data based on the acoustic signal is then subjected to delay addition processing in the delay addition unit 33 described later.
- the laser beam irradiation can be performed for each partial region of the test site, for example.
- a plurality of light guide plates 52 are provided corresponding to each of the regions A, B, and C (FIG. 3).
- the light guide plate 52a corresponding to the region A irradiates the region A with laser light when the region A is selected.
- the light guide plate 52b corresponding to the region B irradiates the region B with laser light when the region B is selected.
- the light guide plate 52c corresponding to the region C irradiates the region C with laser light when the region C is selected.
- the array transducer 22 is composed of 192 ch ultrasonic transducers 22a.
- the width of the array transducer 22 in the arrangement direction is divided into, for example, three partial regions (regions A to C) related to the generation of the photoacoustic image, and the width of each partial region is an ultrasonic transducer for 64 channels. It is assumed that the width corresponds to the width of 22a.
- The photoacoustic imaging apparatus 10 performs imaging for one frame by repeating light irradiation and signal detection once for each partial region (three times in total), thereby acquiring data for all 192 channels. Data obtained by such a detection method is temporarily stored in a raw data memory 34 described later and then subjected to phase matching addition processing in the phase matching addition unit 35.
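The frame-assembly scheme above (three 64-channel partial regions covering the 192-channel array, one laser shot per region) can be sketched as follows; the function and variable names are illustrative, not from the patent:

```python
# Sketch of the partial-region acquisition described above: the 192-channel
# array is covered by three 64-channel partial regions (A, B, C), and one
# frame is acquired with one laser shot per region.

REGION_WIDTH = 64          # channels per partial region
NUM_REGIONS = 3            # regions A, B and C

def acquire_region(region_index, num_samples):
    """Stand-in for one laser shot plus detection on one partial region:
    returns dummy sampling data for that region's 64 channels."""
    base = region_index * REGION_WIDTH
    return {base + ch: [0.0] * num_samples for ch in range(REGION_WIDTH)}

def acquire_frame(num_samples=1024):
    """Fire each region once (three shots in total) and merge the results
    into sampling data for all 192 channels, i.e. one frame."""
    frame = {}
    for region in range(NUM_REGIONS):
        frame.update(acquire_region(region, num_samples))
    return frame

frame = acquire_frame()
```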
- the ultrasonic probe 20 may include an acoustic matching layer on the surface of the array transducer 22 in order to detect photoacoustic waves efficiently.
- The acoustic impedances of the piezoelectric element material and the living body differ greatly. If the piezoelectric element material and the living body are in direct contact, reflection at the interface increases and the photoacoustic wave cannot be detected efficiently. For this reason, arranging an acoustic matching layer with an intermediate acoustic impedance between the piezoelectric element material and the living body allows the photoacoustic wave to be detected efficiently.
- Examples of the material constituting the acoustic matching layer include epoxy resin and silicone rubber.
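The benefit of an intermediate-impedance layer can be checked numerically with the standard normal-incidence reflection formula R = |Z2 − Z1| / (Z2 + Z1); the impedance values below are typical textbook figures, not taken from the patent:

```python
import math

def reflection_coefficient(z1, z2):
    """Amplitude reflection coefficient at a planar interface, normal incidence."""
    return abs(z2 - z1) / (z2 + z1)

# Representative acoustic impedances in MRayl (typical textbook values,
# not taken from the patent): piezoceramic (PZT) vs. soft tissue.
z_pzt, z_tissue = 33.0, 1.5

# Direct contact: almost total reflection at the interface.
direct = reflection_coefficient(z_pzt, z_tissue)

# A matching layer near the geometric-mean impedance splits the mismatch
# into two much smaller steps.
z_layer = math.sqrt(z_pzt * z_tissue)
via_layer = max(reflection_coefficient(z_pzt, z_layer),
                reflection_coefficient(z_layer, z_tissue))
```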
- the system control unit 11 controls the laser light source 12, the signal reception unit 13, the information acquisition unit 14, the display unit 15, and the operation unit 16. For example, the system control unit 11 outputs a trigger signal for synchronizing them.
- the laser light source 12 outputs laser light L to be irradiated on the test site as measurement light.
- the laser light source 12 includes, for example, one or more light sources that generate laser light having a wavelength included in a blood absorption peak.
- a light emitting element such as a semiconductor laser (LD), a solid-state laser, or a gas laser that generates a specific wavelength component or monochromatic light including the component can be used.
- the laser light source 12 includes a flash lamp that is an excitation light source and a Q-switched laser that controls laser oscillation. When the system controller 11 outputs a flash lamp trigger signal, the laser light source 12 turns on the flash lamp and excites the Q-switched laser.
- the wavelength of the laser light is appropriately determined according to the light absorption characteristics of the substance in the subject to be imaged.
- the hemoglobin in a living body has an optical absorption coefficient that varies depending on its state (oxygenated hemoglobin, deoxygenated hemoglobin, methemoglobin, carbon dioxide hemoglobin, etc.).
- When the imaging target is hemoglobin in a living body (that is, when imaging a blood vessel inside the living body), it is preferable to use a wavelength of about 600 to 1000 nm, at which the light transmittance of the living body is good and the various forms of hemoglobin have absorption peaks.
- the laser light source 12 preferably outputs pulsed light having a pulse width of 1 to 100 nsec as laser light.
- The output of the laser beam is preferably 10 μJ/cm² to several tens of mJ/cm², from the viewpoints of the propagation loss of the laser beam and the photoacoustic wave, the efficiency of photoacoustic conversion, the detection sensitivity of the detector, and the like.
- The repetition rate of the pulsed light output is preferably 10 Hz or more from the viewpoint of image construction speed.
- the laser beam may be a pulse train in which a plurality of the above pulsed beams are arranged.
- The laser light output from the laser light source 12 is guided to the vicinity of the array transducer 22 of the ultrasonic probe 20 using light guide means such as an optical fiber, a light guide plate, a lens, or a mirror, and the region to be examined is irradiated from the vicinity of the array transducer 22.
- the signal receiving unit 13 generates a photoacoustic image from the detected acoustic signal.
- Generation of the photoacoustic image is realized by the receiving circuit 30, AD conversion unit 31, processing selection unit 32, delay addition unit 33, raw data memory 34, phase matching addition unit 35, detection / logarithmic conversion unit 36, frame construction unit 37, volume data construction unit 38, association unit 39, viewpoint setting unit 40, and data conversion unit 41.
- the receiving circuit 30 receives the electrical signal of the photoacoustic wave output from the ultrasonic probe 20.
- the received electrical signal is output to the AD conversion unit 31.
- the AD converter 31 is a sampling means for converting an electric signal into a digital signal.
- the AD converter 31 converts the electrical signal received by the receiving circuit 30 into a digital signal in synchronization with, for example, an AD clock signal with a clock frequency of 40 MHz output by the system controller 11.
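With a 40 MHz AD clock, each sample index maps directly to a source depth. Unlike pulse-echo ultrasound there is no factor of two, because the photoacoustic wave travels one way only (the light delivery is effectively instantaneous). A minimal sketch, assuming a typical soft-tissue sound speed of 1540 m/s:

```python
SPEED_OF_SOUND = 1540.0   # m/s, assumed typical soft-tissue value
SAMPLING_RATE = 40e6      # Hz, the AD clock frequency mentioned above

def sample_index_to_depth_m(n):
    """Depth of the photoacoustic source corresponding to sample n.
    The acoustic wave travels one way only, so no division by two."""
    return SPEED_OF_SOUND * n / SAMPLING_RATE
```

At this clock rate, one sample corresponds to 1540 / 40e6 = 38.5 micrometres of depth.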
- the process selection unit 32 selects a process for reconstructing the digital signal (sampling data) sampled by the AD conversion unit 31.
- The process selection unit 32 selects either delay addition processing or phase matching addition processing as the process for reconstructing the sampling data. If delay addition processing is selected, the process selection unit 32 transmits the sampling data to the delay addition unit 33; if phase matching addition processing is selected, it transmits the sampling data to the raw data memory 34.
- The delay addition unit 33 adds the sampling data obtained from the signals detected by the individual ultrasonic transducers 22a, each with a delay time corresponding to the position of its ultrasonic transducer 22a, and generates reconstructed signal data for one line (delay addition method).
- When the delay addition process is performed, the laser light irradiation is carried out a plurality of times with the position of the ultrasonic probe 20 fixed, and signal data is generated line by line while the channels of the ultrasonic transducers 22a used are shifted for each irradiation.
- The delay addition unit 33 may perform reconstruction by another known method instead of the delay addition method.
- The signal data for one line is associated with the spatial information (first spatial information), acquired by the information acquisition unit 14 described later, that defines the position and orientation of the ultrasonic probe 20 in real space at the time the acoustic signal on which that line is based was detected. This makes it possible to determine which line portion in real space the signal data for one line corresponds to.
- the delay addition unit 33 outputs the signal data reconstructed as described above to the detection / logarithm conversion unit 36.
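The delay addition (delay-and-sum) reconstruction can be sketched for a single image point as follows; the element pitch, sound speed, and all names are illustrative assumptions, not values from the patent:

```python
import math

SPEED_OF_SOUND = 1540.0   # m/s (assumed)
SAMPLING_RATE = 40e6      # Hz (the AD clock mentioned earlier)
PITCH = 0.3e-3            # element pitch in metres (illustrative)

def delay_and_sum_point(channel_data, focus_x, focus_z, num_elements):
    """Reconstruct one image point by summing, over all elements, the sample
    delayed by the one-way propagation time from (focus_x, focus_z) to each
    element. channel_data[ch][n] is sample n of element ch."""
    total = 0.0
    for ch in range(num_elements):
        elem_x = (ch - (num_elements - 1) / 2) * PITCH  # element x position
        dist = math.hypot(focus_x - elem_x, focus_z)    # one-way path length
        n = int(round(dist / SPEED_OF_SOUND * SAMPLING_RATE))
        if n < len(channel_data[ch]):
            total += channel_data[ch][n]
    return total
```

Summing over an image grid of (focus_x, focus_z) points yields the reconstructed line or frame; samples from a true source position add coherently, while other positions largely cancel.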
- the raw data memory 34 temporarily stores all the raw sampling data obtained from signals detected by the ultrasonic transducers 22a, for example.
- The phase matching addition unit 35 performs phase matching addition of the sampling data stored in the raw data memory 34, generating signal data for one line from the data of the individual channels of the ultrasonic transducers 22a.
- the phase matching adder 35 preferably includes, for example, a DSP (Digital Signal Processor) or an FPGA (Field Programmable Gate Array).
- the phase matching adder 35 outputs the signal data reconstructed as described above to the detector / logarithm converter 36.
- the detection / logarithm conversion unit 36 generates an envelope of signal data for one line output from the delay addition unit 33 or the phase matching addition unit 35, and then logarithmically converts the envelope to widen the dynamic range. Then, the detection / logarithm conversion unit 36 outputs the signal data for one line subjected to the signal processing as described above to the frame construction unit 37.
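The detection / logarithmic conversion step can be illustrated with a minimal stand-in: full-wave rectification plus a moving average approximates the envelope, and a clipped decibel mapping widens the dynamic range. The actual unit may use a different envelope detector (for example quadrature or Hilbert detection); this is only a sketch:

```python
import math

def detect_and_log_compress(line, window=8, dynamic_range_db=60.0):
    """Minimal stand-in for the detection / logarithmic conversion step:
    full-wave rectify, smooth with a moving average as a crude envelope,
    then map to decibels clipped to the given dynamic range."""
    rectified = [abs(s) for s in line]
    envelope = []
    for i in range(len(rectified)):                    # causal moving average
        seg = rectified[max(0, i - window + 1): i + 1]
        envelope.append(sum(seg) / len(seg))
    peak = max(envelope) or 1.0                        # avoid log of zero
    out = []
    for e in envelope:
        db = 20.0 * math.log10(max(e, 1e-12) / peak)   # 0 dB at the peak
        out.append(max(db, -dynamic_range_db))         # clip to dynamic range
    return out
```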
- the frame construction unit 37 combines a plurality of signal data for one line to generate image data for one frame (one section). For example, the frame construction unit 37 constructs image data for one frame by converting the position of the time axis of the data for one line into the position of the displacement axis representing the depth in the tomographic image. The constructed image data for one frame is output to the volume data construction unit 38.
- The volume data construction unit 38 constructs volume data for a three-dimensional photoacoustic image on the basis of the image data for one frame generated at each scanning position of the ultrasound probe 20, superimposing the frames and arranging the acquired data in spatial coordinates, or interpolating between acquired data. The image data with a three-dimensional visual effect created by the volume data construction unit 38 in this way is referred to as photoacoustic image data.
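The interpolation between acquired frames mentioned above can be sketched as simple linear interpolation along the scan axis; the data layout (a 2-D list per frame) and names are assumptions:

```python
def interpolate_frame(frame_a, frame_b, t):
    """Linearly interpolate a slice between two acquired one-frame images
    (each a 2-D list indexed [depth][lateral]); t runs from 0 to 1."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

def build_volume(frames, slices_between=1):
    """Stack the acquired frames along the scan axis in acquisition order,
    inserting interpolated slices between each adjacent pair, in the spirit
    of the volume data construction unit described above."""
    volume = []
    for i in range(len(frames) - 1):
        volume.append(frames[i])
        for k in range(1, slices_between + 1):
            volume.append(interpolate_frame(frames[i], frames[i + 1],
                                            k / (slices_between + 1)))
    volume.append(frames[-1])
    return volume
```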
- The associating unit 39 associates with the photoacoustic image data the second spatial information acquired by the information acquisition unit 14 described later (spatial information defining at least one of the direction of the axis of the test site in real space, the scanning direction of the ultrasonic probe in real space, and the vertical direction of real space).
- the association of the second spatial information is performed in a state in which the positional relationship in the real space between the at least one direction and the imaging part related to the photoacoustic image data is maintained.
- the “imaging site related to photoacoustic image data” means a portion of the test site actually represented in the photoacoustic image based on the photoacoustic image data.
- For the photoacoustic image data associated with the second spatial information, the viewpoint setting unit 40 selects one direction from among the at least one direction defined by the second spatial information and sets the direction to be the viewpoint based on the selection result.
- The viewpoint setting unit 40 may be configured to select, in response to an instruction from the user, either the forward or reverse sense of the selected direction. In this case, the direction of the viewpoint when displaying the photoacoustic image is decided based on both the selected direction and the forward/reverse selection.
- the three-dimensional photoacoustic image is always displayed as an image viewed from a specific direction associated with the real space.
- the direction is selected according to the direction set as an initial value or information input by the user using the operation unit 16 (FIG. 2).
- the user can change the viewpoint at the time of displaying a photoacoustic image as needed.
- The distance between the photoacoustic image data and the viewpoint is appropriately adjusted according to the scale of the photoacoustic image. That is, since this distance affects the display scale on the screen, adjusting it allows the photoacoustic image to be displayed at an appropriate size.
- the viewpoint setting unit 40 may select the same element as the direction to be the viewpoint for each of the plurality of photoacoustic image data.
- Here, "element" means an element constituting the second spatial information together with its forward or reverse sense. That is, the viewpoint setting unit 40 selects the one direction, and either the forward or reverse sense, so that the direction to be the viewpoint is the same for each of the plurality of photoacoustic image data.
- "Same" does not mean physically identical in real space, but means that the selected items, such as "the scanning direction of the ultrasonic probe in real space", are the same.
- The data conversion unit 41 converts the photoacoustic image data so that a photoacoustic image based on it is displayed on the display unit as an image viewed from the selected direction.
- The conversion can be carried out by a well-known method such as coordinate transformation.
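For axis-aligned viewpoints, the coordinate transformation can be as simple as a coordinate permutation with sign flips for the reverse sense. The axis and handedness conventions below are illustrative assumptions; a general implementation would apply a full rotation matrix:

```python
def view_transform(point, axis="x", reverse=False):
    """Map a real-space point (x, y, z) into viewing coordinates (right, up,
    depth) so that the selected axis becomes the line of sight. Restricted
    to axis-aligned viewpoints, echoing the data conversion unit's
    coordinate-transformation step."""
    x, y, z = point
    if axis == "x":          # look along +x: screen right = y, up = z, depth = x
        view = (y, z, x)
    elif axis == "y":        # look along +y: screen right = x, up = z, depth = y
        view = (x, z, y)
    else:                    # look along +z: screen right = x, up = y, depth = z
        view = (x, y, z)
    if reverse:              # look along the negative axis instead:
        view = (-view[0], view[1], -view[2])  # mirror horizontally, negate depth
    return view
```

Applying this mapping to every voxel coordinate of the photoacoustic image data yields the image as seen from the set viewpoint direction.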
- the converted photoacoustic image data is output to the display unit 15.
- the data conversion unit 41 may output three-dimensional volume data as photoacoustic image data in a lump, or may sequentially output image data for one frame.
- an image is displayed so that a predetermined viewpoint is obtained when one scan of the ultrasound probe 20 is completed.
- image data for one frame is sequentially displayed as the scanning of the ultrasonic probe 20 progresses.
- In some cases, the image data for one frame is displayed in the virtual space on the screen of the display unit 15 while being superimposed from the back toward the front.
- In other cases, the image data for one frame is displayed in the virtual space on the screen of the display unit 15 while being superimposed from the front toward the back.
- This shows how the image data 55 for one frame is sequentially arranged while being moved forward in spatial coordinates, with the image center shifted and the images behind rendered transparent.
- The information acquisition unit 14 acquires first spatial information that defines the position and orientation of the ultrasonic probe 20 in real space, and second spatial information that defines at least one of the direction of the axis of the test site M in real space, the scanning direction of the ultrasonic probe 20 in real space, and the vertical direction of real space.
- "The direction of the axis of the test site M in real space" means the direction of the central axis of the head, torso, or limb that includes the test site.
- For example, the central axis of the trunk D is an axis passing through the center along the length direction of the trunk,
- and the central axis of the upper limb E is an axis passing through the center along the length direction of the upper limb.
- "The scanning direction of the ultrasonic probe 20 in real space" means the direction in which the ultrasonic probe 20 is scanned, as shown in FIG. 3, and is usually a direction orthogonal to the arrangement direction of the ultrasonic transducers 22a and along the surface of the test site.
- the information acquisition unit 14 includes a magnetic sensor unit having a magnetic field generation unit 17 and a plurality of magnetic sensors 18, and acquires spatial information using the magnetic sensor unit.
- The magnetic sensor unit obtains, in the pulsed magnetic field formed by the magnetic field generation unit 17, the relative position coordinates (x, y, z) of each magnetic sensor 18 with respect to the magnetic field generation unit 17 and its posture information (angles about the three axes).
- The position of the ultrasonic probe 20 in real space can be obtained, for example, as the midpoint of the positions acquired by the two magnetic sensors 18.
- The orientation of the ultrasonic probe 20 in real space can be obtained by calculating a vector directed from one of the two magnetic sensors 18 (for example, the sensor on the back side of the ultrasonic probe 20) toward the other (for example, the sensor on the front side of the ultrasonic probe 20).
- The direction of the axis of the test site M in real space can be obtained by determining, using the magnetic sensor unit, two points that define the central axis of the head, torso, or limb that includes the test site.
- For example, if the ultrasonic probe 20 is placed at the shoulder position and that position is stored in the information acquisition unit 14, and the probe is then placed at the hand position and that position is also stored, two points defining the central axis of the upper limb can be obtained. Further, the scanning direction of the ultrasonic probe 20 in real space can be obtained from the temporal change of the position of the ultrasonic probe 20 in real space. The vertical direction of real space can be taken as, for example, the z-axis of the coordinate system defined by the magnetic field generation unit 17.
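The derivations described here (midpoint position from the two sensors, back-to-front orientation vector, and scanning direction from successive positions) reduce to small vector operations; a sketch with illustrative names:

```python
import math

def midpoint(p, q):
    """Probe position: midpoint of the two built-in magnetic sensors."""
    return tuple((a + b) / 2 for a, b in zip(p, q))

def unit_vector(p, q):
    """Unit vector from point p to point q; used for the probe orientation
    (back-side sensor towards front-side sensor)."""
    d = [b - a for a, b in zip(p, q)]
    norm = math.sqrt(sum(c * c for c in d))
    return tuple(c / norm for c in d)

def scan_direction(pos_previous, pos_current):
    """Scanning direction from the temporal change of the probe position."""
    return unit_vector(pos_previous, pos_current)
```

The same `unit_vector` helper applied to the stored shoulder and hand positions gives the direction of the central axis of the upper limb.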
- two magnetic sensors 18 are sufficient to define the arrangement direction of the ultrasonic transducers 22a, but at least four are required to also define the direction perpendicular to that direction and intersecting the surface of the test site.
- the first spatial information is acquired every time one frame is captured; specifically, by synchronizing the information acquisition timing with the output of the laser light source 12, information can be acquired for each imaging.
- the display unit 15 displays a photoacoustic image based on the photoacoustic image data converted by the data conversion unit 41.
- FIGS. 6A and 6B are schematic diagrams showing modeled three-dimensional photoacoustic images. FIG. 6Aa is a perspective view of the photoacoustic image, FIG. 6Ab is a plan view, FIG. 6Ac is a front view, and FIG. 6Ad is a right side view of the photoacoustic image.
- in these figures, three cubes are represented as imaged parts.
- the x-, y-, and z-axes of an orthogonal system are associated with the photoacoustic image data: the x-axis is the scanning direction of the ultrasonic probe in the real space, the y-axis is the direction of the axis of the site to be examined in the real space, and the z-axis is the vertical direction of the real space.
- here, three directions are associated with the photoacoustic image data, but only one direction may be associated.
- the three directions are set to form an orthogonal system, but these directions do not necessarily need to form an orthogonal system.
- when the viewpoint setting unit 40 selects the “scanning direction of the ultrasonic probe in the real space” from the three directions and selects “reverse direction” in the selection of the forward or reverse direction, the negative x-axis direction is set as the viewing direction; in that case, as shown in FIG. 6Be, a photoacoustic image in which the surface of the cube labeled “X” is projected onto the yz plane Pyz is displayed so that it is seen from the back side.
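Viewing the data along the negative x-axis amounts to an orthographic projection of each point onto the yz plane Pyz. A hedged sketch under assumed conventions (the sign flip of y for the "seen from the back side" case is one possible screen-mapping choice, not specified by the patent):

```python
def project_to_yz(points, view_along_negative_x=True):
    """Orthographically project 3D points onto the yz plane by dropping x.
    When viewing along the negative x direction (from the back side, with z
    kept as screen-up), y is mirrored so screen-right stays positive."""
    projected = []
    for x, y, z in points:
        screen_y = -y if view_along_negative_x else y
        projected.append((screen_y, z))
    return projected

# One corner of a modeled cube, seen from the back side.
corners = project_to_yz([(3.0, 1.0, 2.0)])
# corners == [(-1.0, 2.0)]
```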
- since the three-dimensional photoacoustic image is always displayed as an image viewed from a specific direction associated with the real space, the user can easily grasp the contents of the photoacoustic image.
- the above display method is the initial display method when the photoacoustic image is first shown on the display unit; after the photoacoustic image has been displayed once, the user can of course rotate or move the image as necessary. That is, if, after setting the positive x-axis direction as the viewing direction and viewing the three-dimensional image, the user wants to see the image from the positive y-axis direction, the user inputs that request via the operation unit 16, and the information is transmitted to the viewpoint setting unit 40. The viewpoint setting unit 40 then sets the positive y-axis direction as the viewing direction.
- the data conversion unit 41 re-acquires the already recorded photoacoustic image data and the second spatial information associated with it, and recalculates a photoacoustic image whose viewpoint is in the positive y-axis direction.
- for example, a setting that always displays an image of the cross section of the test site viewed from the distal side (the side farther from the heart) is conceivable.
- the operation unit 16 is for the user to input information necessary for imaging.
- the user uses the operation unit 16 to specify the viewpoint direction when the photoacoustic image is displayed, or to input information about patient information or imaging conditions.
- while the positional relationship in the real space between at least one direction defined by the second spatial information and the imaging site related to the photoacoustic image data is maintained, the second spatial information is associated with the photoacoustic image data; one direction is then selected from the at least one direction, so that the three-dimensional photoacoustic image is always displayed as an image viewed from a specific direction associated with the real space.
- the photoacoustic imaging apparatus and method according to the present invention capture an image for one frame when the position and orientation defined by the first spatial information each substantially match the preset position and orientation.
- in this way, photoacoustic image data can be collected at regular spatial intervals regardless of the scanning speed of the ultrasonic probe 20.
- the position and orientation set in advance are set by the user using the operation unit 16, for example.
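The capture condition described above can be sketched as a tolerance check between the first spatial information and the user-set preset. This is an illustrative sketch; the tolerances, units, and names are assumptions, not values from the patent.

```python
def should_capture(position, angles, preset_position, preset_angles,
                   pos_tol=1.0, ang_tol=2.0):
    """Return True when every position coordinate is within pos_tol (e.g. mm)
    of the preset position AND every posture angle is within ang_tol (e.g.
    degrees) of the preset orientation - i.e. both "substantially match"."""
    pos_ok = all(abs(p - q) <= pos_tol
                 for p, q in zip(position, preset_position))
    ang_ok = all(abs(a - b) <= ang_tol
                 for a, b in zip(angles, preset_angles))
    return pos_ok and ang_ok

# Capture fires only when both position and orientation are near the preset,
# so frames land at regular spatial intervals regardless of scanning speed.
trigger = should_capture((10.0, 0.0, 0.0), (0.0, 0.0, 0.0),
                         (10.5, 0.0, 0.0), (1.0, 0.0, 0.0))
```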
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Acoustics & Sound (AREA)
- Medical Informatics (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
The purpose of the invention is to allow a user to easily understand the contents of a photoacoustic image when a three-dimensional photoacoustic image is displayed. To this end, according to the invention, in photoacoustic imaging, second spatial information specifies at least one direction among: the axial orientation of the object region (y); the scanning direction (x); and the vertical direction (z). While maintaining the real-space positional relationship between the direction(s) specified by the second spatial information and the imaged site in the photoacoustic image data, the second spatial information is associated with the photoacoustic image data. For the photoacoustic image data that has been associated with the second spatial information, one direction is selected from the direction(s) mentioned above and, based on the result of the selection, the direction to be used as the viewpoint is set. The photoacoustic image data is converted such that a photoacoustic image based on the photoacoustic image data is displayed on a display unit (15) as an image viewed from the direction used as the viewpoint.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011-212134 | 2011-09-28 | ||
| JP2011212134A JP5722182B2 (ja) | 2011-09-28 | 2011-09-28 | 光音響撮像装置および光音響撮像方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013046568A1 true WO2013046568A1 (fr) | 2013-04-04 |
Family
ID=47994657
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2012/005786 Ceased WO2013046568A1 (fr) | 2011-09-28 | 2012-09-12 | Equipement d'imagerie photoacoustique et procédé d'imagerie photoacoustique |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP5722182B2 (fr) |
| WO (1) | WO2013046568A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111028337A (zh) * | 2019-12-04 | 2020-04-17 | 南京大学 | 一种改善有限视角问题的三维光声成像方法 |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019013899A1 (fr) | 2017-07-13 | 2019-01-17 | Exxonmobil Chemical Patents Inc. | Compositions de polyester et procédé pour la fabrication d'articles à partir de telles compositions |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1156845A (ja) * | 1997-06-04 | 1999-03-02 | Advanced Technol Lab Inc | 胸部診断用超音波画像処理装置及び該画像作成方法 |
| JP2003325514A (ja) * | 2002-05-16 | 2003-11-18 | Aloka Co Ltd | 超音波診断装置 |
| JP2010259536A (ja) * | 2009-04-30 | 2010-11-18 | Canon Inc | 画像処理装置及びその制御方法 |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001104295A (ja) * | 1999-10-06 | 2001-04-17 | Hitachi Medical Corp | 医用画像撮影システム |
| JP3688605B2 (ja) * | 2001-07-10 | 2005-08-31 | アロカ株式会社 | 超音波診断装置 |
- 2011-09-28: JP JP2011212134A patent/JP5722182B2/ja active Active
- 2012-09-12: WO PCT/JP2012/005786 patent/WO2013046568A1/fr not_active Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1156845A (ja) * | 1997-06-04 | 1999-03-02 | Advanced Technol Lab Inc | 胸部診断用超音波画像処理装置及び該画像作成方法 |
| JP2003325514A (ja) * | 2002-05-16 | 2003-11-18 | Aloka Co Ltd | 超音波診断装置 |
| JP2010259536A (ja) * | 2009-04-30 | 2010-11-18 | Canon Inc | 画像処理装置及びその制御方法 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111028337A (zh) * | 2019-12-04 | 2020-04-17 | 南京大学 | 一种改善有限视角问题的三维光声成像方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5722182B2 (ja) | 2015-05-20 |
| JP2013070847A (ja) | 2013-04-22 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6525565B2 (ja) | 被検体情報取得装置および被検体情報取得方法 | |
| JP5779169B2 (ja) | 音響画像生成装置およびそれを用いて画像を生成する際の進捗状況の表示方法 | |
| US9339254B2 (en) | Object information acquiring apparatus | |
| JP6486068B2 (ja) | 被検部位情報取得装置 | |
| US20130116536A1 (en) | Acoustic wave acquiring apparatus and acoustic wave acquiring method | |
| JP5647583B2 (ja) | 光音響分析装置および光音響分析方法 | |
| US9330462B2 (en) | Object information acquiring apparatus and control method of object information acquiring apparatus | |
| JP6632257B2 (ja) | 被検体情報取得装置 | |
| JP5917037B2 (ja) | 被検体情報取得装置および被検体情報取得方法 | |
| JP2013027481A (ja) | 光音響撮像システムおよび装置並びにそれらに使用されるプローブユニット | |
| JP2017070385A (ja) | 被検体情報取得装置およびその制御方法 | |
| JP5683383B2 (ja) | 光音響撮像装置およびその作動方法 | |
| JP6177530B2 (ja) | ドプラ計測装置およびドプラ計測方法 | |
| JP5936559B2 (ja) | 光音響画像生成装置および光音響画像生成方法 | |
| JP2014131596A (ja) | 被検体情報取得装置、被検体情報取得装置の制御方法、およびプログラム | |
| JP6742734B2 (ja) | 被検体情報取得装置および信号処理方法 | |
| JP5722182B2 (ja) | 光音響撮像装置および光音響撮像方法 | |
| WO2014050020A1 (fr) | Dispositif de génération d'image photo-acoustique, et procédé de génération d'image photo-acoustique | |
| JP6843632B2 (ja) | 音響波測定装置およびその制御方法 | |
| JP2019107421A (ja) | 光音響装置および被検体情報取得方法 | |
| CN118019497A (zh) | 图像生成方法、图像生成程序以及图像生成装置 | |
| JP2016152879A (ja) | 被検体情報取得装置 | |
| JP2017164222A (ja) | 処理装置および処理方法 | |
| JP2022034766A (ja) | 画像生成方法、画像生成プログラムおよび画像生成装置 | |
| Palaniappan et al. | A custom developed linear array photoacoustic tomography for noninvasive medical imaging |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12836222; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 12836222; Country of ref document: EP; Kind code of ref document: A1 |