
WO2015156039A1 - Organ imaging apparatus - Google Patents

Organ imaging apparatus

Info

Publication number
WO2015156039A1
WO2015156039A1 (application PCT/JP2015/054706)
Authority
WO
WIPO (PCT)
Prior art keywords
tongue
organ
image
image data
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2015/054706
Other languages
English (en)
Japanese (ja)
Inventor
松田 伸也
楠田 将之
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konica Minolta Inc
Original Assignee
Konica Minolta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Inc
Publication of WO2015156039A1
Current legal status: Ceased

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons

Definitions

  • The present invention relates to an organ image capturing apparatus for photographing an organ of a living body and detecting diagnostic items of the organ from the captured image.
  • Tongue examination is a diagnostic method for diagnosing a health condition or medical condition by observing the state of the tongue.
  • In tongue examination, physical condition and health level are diagnosed based on the color and shape of the tongue and of the moss (tongue coating), and the diagnostic items include the color of the tongue, the thickness of the tongue, and cracks in the tongue surface.
  • In a healthy state, the tongue is light red, but if there is a problem with the respiratory or circulatory system, the tongue turns bluish, and if there is fever or dehydration, it turns red. When water metabolism is poor, the tongue swells and becomes thick (the so-called swollen state); conversely, when blood flow or water is insufficient, the tongue becomes thin. Furthermore, when immunity decreases due to poor nutrition, poor blood flow, stress, and the like, the regenerative power of the tongue cells declines and the surface of the tongue cracks.
  • In Patent Document 1, the tongue is photographed with a camera, regions of interest such as the tongue apex, the middle of the tongue, and the tongue base are extracted, and the tongue quality (tongue color) and the tongue coating (moss color) of each region of interest are determined, making it easy to diagnose an individual's health status.
  • In Patent Document 2, an index for diagnosing the state of blood or blood vessels is obtained by photographing the tongue with a camera and separately detecting the color and gloss of the tongue.
  • Specifically, the ratio of the red (R) image data to the green (G) image data in each pixel of the captured image of the tongue is calculated and compared with a threshold value to judge the color of the tongue and the color of the moss. Furthermore, the numbers of vertically and horizontally connected pixels on the tongue base (the portion other than the papillae) are counted, and a portion where the number of connections in one direction (for example, the vertical direction) is equal to or greater than a predetermined value while the number of connections in the other direction (for example, the horizontal direction) is equal to or less than a predetermined value is judged to be a crack in the tongue (see the sketch below).
  • JP 2011-239926 A (refer to claim 1, paragraph [0028] etc.)
  • JP 2005-137756 A (refer to claim 3, paragraphs [0059] to [0063], [0079] to [0081], FIG. 4 to FIG. 6, FIG. 8, etc.)
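For concreteness, a minimal sketch of this prior-art test, in Python with illustrative thresholds (the cited documents are not reproduced here, so the helper names and all numeric values are assumptions):

```python
import numpy as np

def run_length(mask):
    """Per-pixel length of the vertical run of True values through each pixel."""
    h = mask.shape[0]
    down = np.zeros(mask.shape, int)
    for y in range(h):                        # scan down: run length so far
        prev = down[y - 1] if y else 0
        down[y] = np.where(mask[y], prev + 1, 0)
    up = np.zeros(mask.shape, int)
    for y in range(h - 1, -1, -1):            # scan up: run length below
        nxt = up[y + 1] if y < h - 1 else 0
        up[y] = np.where(mask[y], nxt + 1, 0)
    return np.where(mask, down + up - 1, 0)   # total vertical connection count

def detect_cracks_prior_art(rgb, rg_thresh=1.2, v_min=15, h_max=4):
    """R/G ratio vs. a threshold separates tongue color from moss color;
    a pixel whose vertical connection count is large while its horizontal
    connection count is small is judged to lie on a crack."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float) + 1e-6      # avoid division by zero
    tongue_base = (r / g) > rg_thresh         # reddish base vs. whitish moss
    v_run = run_length(tongue_base)           # vertical connection count
    h_run = run_length(tongue_base.T).T       # horizontal connection count
    return (v_run >= v_min) & (h_run <= h_max)
```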
  • In these conventional techniques, a captured image (visible image) of the tongue is acquired by irradiating the tongue with visible light.
  • A part of the tongue surface where moss exists appears white or yellow, and the color of the underlying tongue cannot be detected there; therefore, the color of the tongue is detected by selecting, from the visible image, image data of parts of the surface where no moss exists.
  • However, when the entire surface of the tongue is covered with moss, it is impossible to select image data of a portion where moss is not present, and the color of the tongue cannot be detected.
  • The thickness of the tongue could conceivably be detected by a method such as the light cutting method.
  • The light cutting (light-section) method detects the shape of a subject by irradiating it with linear light (e.g., visible light) and detecting the shape of the reflected light.
  • However, even with this method, if the moss distribution on the tongue surface is uneven, the reflected light is disturbed in the portions where moss is present, making it impossible to accurately detect the thickness of the tongue.
  • The present invention has been made to solve the above problems, and its object is to provide an organ imaging apparatus capable of accurately detecting diagnostic items of an organ of a living body regardless of the surface state of the organ.
  • An organ imaging apparatus according to the present invention includes: an illumination unit that illuminates an organ of a living body with illumination light including near-infrared light; an imaging unit that acquires an image of the organ by receiving the illumination light reflected from the surface of the organ; and an arithmetic unit that classifies the degree of a diagnostic item of the organ based on at least image data of a near-infrared image, obtained by receiving the near-infrared light included in the illumination light, among the images acquired by the imaging unit.
  • Because the arithmetic unit classifies the degree of the diagnostic item of the organ based on the image data of the near-infrared image, the diagnostic item can be detected accurately regardless of the state of the organ surface that would otherwise cause noise.
  • An explanatory diagram (FIG. 4) shows the positional relationship of the illumination unit with respect to the imaging target.
  • Graphs (FIGS. 13 and 14) show the distribution of RGB image data in one direction of the visible image of FIG. 8 and the distribution of image data in one direction of the near-infrared image of FIG. 8; an explanatory diagram (FIG. 15) shows the cross-sectional shape of the tongue and the distribution of image data in one direction of the near-infrared image of the tongue.
  • An explanatory diagram (FIG. 16) shows the distribution of the image data of FIG. 15 together with the region set when detecting the thickness of the tongue; graphs (FIGS. 17 and 18) show the frequency distribution of the image data of a partial region of the tongue.
  • In this specification, a numerical value range expressed as "A to B" includes the values of the lower limit A and the upper limit B.
  • FIG. 1 is a perspective view showing an external appearance of an organ image photographing apparatus 1 of the present embodiment
  • FIG. 2 is a block diagram showing a schematic configuration of the organ image photographing apparatus 1.
  • the organ image capturing apparatus 1 captures an organ of a living body and detects information (diagnostic items) necessary for health diagnosis.
  • Possible diagnostic items include, for example, the color of the tongue, the thickness of the tongue, and cracks in the tongue surface.
  • the organ image photographing apparatus 1 includes an illumination unit 2, an imaging unit 3, a display unit 4, an operation unit 5, a communication unit 6, and an audio output unit 7.
  • The illumination unit 2 is provided in the housing 21, and the components other than the illumination unit 2 (for example, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7) are provided in the housing 22.
  • The housings 21 and 22 are connected so that they can rotate relative to each other; however, such rotation is not essential, and one may be completely fixed to the other.
  • The illumination unit 2, the imaging unit 3, and the other components may instead be provided in a single housing.
  • The organ image capturing device 1 may also be implemented as a multifunctional portable information terminal.
  • The illumination unit 2 illuminates the organ of the living body to be imaged (here, the tongue) with illumination light including near-infrared light, and includes an illuminator that illuminates the imaging target from above.
  • As the light source of the illumination unit 2, a source that emits daylight-color light, such as a xenon lamp, is used.
  • FIG. 3 shows the light emission characteristics of the xenon lamp used as the light source of the illumination unit 2.
  • Xenon lamps are often used in camera flashes and have an almost flat emission distribution, close to that of daylight, from the visible wavelength range (400 nm to 700 nm) through the near-infrared wavelength range (700 nm to 1000 nm).
  • the illumination unit 2 includes a lighting circuit and a dimming circuit in addition to the above light source, and lighting / extinguishing and dimming are controlled by a command from the illumination control unit 11.
  • the imaging unit 3 acquires an image of the organ by receiving illumination light from the illumination unit 2 reflected from the surface of the organ.
  • FIG. 4 is an explanatory diagram showing a positional relationship between the illumination unit 2 and the imaging unit 3 with respect to the imaging target.
  • the imaging unit 3 is disposed so as to face the tongue that is the subject of imaging.
  • the imaging unit 3 includes an imaging lens 31, an infrared reflecting mirror 32, a visible sensor 33, and an infrared sensor 34.
  • Illumination light emitted from the illumination unit 2 and reflected by the tongue passes through the imaging lens 31 and is separated by the infrared reflecting mirror 32 into visible light and infrared light (including near-infrared light).
  • the visible light separated by the infrared reflecting mirror 32 is guided to the visible sensor 33 and forms an image there.
  • The visible sensor 33 has the color filter 33a, shown in FIG. 5, on the light-incident side of its light-receiving surface, whereby each pixel of the sensor receives red (R), green (G), or blue (B) light.
  • the visible sensor 33 obtains a visible image composed of each color of RGB.
  • the infrared light separated by the infrared reflecting mirror 32 is guided to the infrared sensor 34 and forms an image there.
  • the infrared sensor 34 acquires a near-infrared image by receiving near-infrared (IR) light.
  • The aperture (lens brightness), shutter speed, and focal length of the imaging lens 31 are set so that the entire range to be photographed is in focus. As an example: F-number 16, shutter speed 1/120 second, focal length 20 mm.
  • The area sensors (the visible sensor 33 and the infrared sensor 34) are composed of image sensors such as CCDs (Charge-Coupled Devices) or CMOS (Complementary Metal-Oxide-Semiconductor) sensors, and their sensitivity, resolution, and other parameters are set so that the color and shape of the imaging target can be detected with sufficient accuracy. As an example: sensitivity 60 dB, resolution 10 million pixels.
  • Imaging by the imaging unit 3 is controlled by the imaging control unit 12.
  • The imaging unit 3 includes a focusing mechanism (not shown), a diaphragm mechanism, a drive circuit, an A/D conversion circuit, and the like; focusing, aperture control, A/D conversion, and so on are controlled in response to commands from the imaging control unit 12.
  • As captured image data, 8-bit values from 0 to 255 are acquired for each of R, G, B, and IR.
  • the illumination unit 2 described above is arranged so as to illuminate the imaging target at an angle A of, for example, 0 ° to 45 ° with respect to the imaging optical axis X of the imaging unit 3 passing through the imaging target.
  • the imaging optical axis X refers to the optical axis of the imaging lens 31.
  • FIG. 6 is an explanatory diagram showing another configuration of the imaging unit 3.
  • the imaging unit 3 may include an imaging lens 31, a color filter 35, and an imaging element 36.
  • the image sensor 36 is composed of an image sensor such as a CCD or CMOS, and has a Si photodiode having detection sensitivity from visible light to near infrared light.
  • the color filter 35 is disposed on the light incident side of the image sensor 36, and transmits visible light and near infrared light to the image sensor 36.
  • FIG. 7 schematically shows the configuration of the color filter 35.
  • In the color filter 33a shown in FIG. 5, G filters that transmit green light are arranged at a ratio of 2 pixels per 4 pixels, and B filters that transmit blue light and R filters that transmit red light are arranged in the remaining pixels at a ratio of 1:1 (a Bayer arrangement). In the color filter 35 of FIG. 7, one of the two G filters per 4 pixels is replaced with an IR filter that transmits near-infrared light. By using such a color filter 35, the detection of visible light and the detection of near-infrared light can be shared by a single image sensor 36.
  • Thus, both visible light and near-infrared light can be detected by one sensor. Moreover, since the infrared reflecting mirror 32 and the infrared sensor 34 of FIG. 4 can be omitted in this configuration, the apparatus can be made smaller and cheaper.
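As a rough illustration of this shared-sensor idea, the sketch below assumes a 2x2 cell in which one of the two Bayer G sites becomes the IR site (the exact site layout is not specified in the text) and splits a raw frame into sparse per-channel planes:

```python
import numpy as np

def split_rgbir_mosaic(raw):
    """Split a raw RGB-IR mosaic frame into sparse per-channel planes.

    Assumed 2x2 cell layout (illustrative only):
        R   G
        IR  B
    Missing samples are left as NaN; a real pipeline would interpolate them.
    """
    planes = {c: np.full(raw.shape, np.nan) for c in ("R", "G", "B", "IR")}
    planes["R"][0::2, 0::2] = raw[0::2, 0::2]
    planes["G"][0::2, 1::2] = raw[0::2, 1::2]
    planes["IR"][1::2, 0::2] = raw[1::2, 0::2]
    planes["B"][1::2, 1::2] = raw[1::2, 1::2]
    return planes
```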
  • the display unit 4 includes a liquid crystal panel (not shown), a backlight, a lighting circuit, and a control circuit.
  • The display unit 4 displays the images acquired by the imaging unit 3 and the information calculated and output by the calculation unit 16, described later. The display of various types of information on the display unit 4 is controlled by the display control unit 13.
  • the operation unit 5 is an input unit for instructing imaging by the imaging unit 3, and includes an OK button (imaging execution button) 5a and a CANCEL button 5b.
  • the display unit 4 and the operation unit 5 are configured by a common touch panel display device 41, and the display area of the display unit 4 and the display area of the operation unit 5 in the touch panel display device 41 are separated.
  • the display of the operation unit 5 on the touch panel display device 41 is controlled by the operation control unit 14.
  • the operation unit 5 may be configured by an input unit other than the touch panel display device 41 (the operation unit 5 may be provided at a position outside the display area of the touch panel display device 41).
  • The communication unit 6 is an interface for transmitting the image data acquired by the imaging unit 3 and the information calculated and output by the calculation unit 16, described below, to the outside via a communication line (wired or wireless), and for receiving information from the outside. Transmission and reception of information by the communication unit 6 are controlled by the communication control unit 18.
  • the audio output unit 7 outputs various types of information as audio, and is composed of, for example, a speaker.
  • the information output by voice includes the result calculated by the calculation unit 16.
  • the sound output in the sound output unit 7 is controlled by the sound output control unit 19.
  • The organ imaging apparatus 1 further includes an illumination control unit 11, an imaging control unit 12, a display control unit 13, an operation control unit 14, an image processing unit 15, a calculation unit 16, a storage unit 17, a communication control unit 18, an audio output control unit 19, and an overall control unit 20 that controls these units.
  • The illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the communication control unit 18, and the audio output control unit 19 control the illumination unit 2, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7, respectively.
  • the overall control unit 20 is composed of, for example, a CPU (Central Processing Unit).
  • The illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the image processing unit 15, the calculation unit 16, the communication control unit 18, the audio output control unit 19, and the overall control unit 20 may be configured integrally (for example, as a single CPU).
  • the image processing unit 15 has a function of performing various types of image processing, such as extracting an outline of an organ from an image acquired by the imaging unit 3.
  • the outline of the organ can be extracted by extracting the luminance edge of the captured image (the portion where the brightness has changed abruptly in the image) using a filter.
  • The edge extraction filter weights the pixels in the vicinity of the target pixel when performing first-order differentiation (that is, when obtaining the difference in image data between adjacent pixels).
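The text does not name a specific kernel; assuming the common Sobel weighting, such a luminance-edge filter could be sketched as follows:

```python
import numpy as np
from scipy.ndimage import convolve

def luminance_edges(gray):
    """Edge magnitude via weighted first-order differences (Sobel kernels assumed)."""
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],                 # neighbors of the target pixel
                   [-1, 0, 1]], float)         # are weighted, as described above
    gx = convolve(gray.astype(float), kx)      # horizontal brightness difference
    gy = convolve(gray.astype(float), kx.T)    # vertical brightness difference
    return np.hypot(gx, gy)                    # edge strength per pixel

# Usage sketch: threshold the edge map, then trace the outer boundary.
# edges = luminance_edges(ir_image)
# contour_mask = edges > 3.0 * edges.mean()    # illustrative threshold
```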
  • the storage unit 17 stores image data acquired by the imaging unit 3, data acquired by the image processing unit 15, data calculated by the calculation unit 16, information received from the outside, and the like.
  • The calculation unit 16 classifies the degree of a diagnostic item of the organ based on at least the image data of the near-infrared image, obtained by receiving the near-infrared light included in the illumination light, among the images acquired by the imaging unit 3.
  • The classification may be performed by quantifying the degree of the diagnostic item, or by an output other than a numerical value.
  • a specific example for classifying the degree of diagnosis items will be described.
  • FIG. 8 shows a visible image and a near-infrared image of the tongue acquired by the imaging unit 3 when the imaging unit 3 images the tongue under illumination by the illumination unit 2.
  • the moss region on the tongue surface is indicated by hatching for convenience.
  • In the visible image, it can be confirmed that moss is present on almost the entire tongue surface and that its distribution is uneven.
  • In the near-infrared image, the difference between the color of the tongue and the color of the moss disappears, and the moss cannot be seen.
  • FIG. 9 shows the spectral distribution of tongue color and moss color.
  • In the B wavelength range (400 to 500 nm) and the G wavelength range (500 to 600 nm), there is a large difference in reflectance between the tongue color and the moss color, but in the wavelength range above 650 nm there is almost no difference in reflectance between the two. Accordingly, when the tongue is illuminated with near-infrared light of wavelengths above 650 nm and photographed to obtain a near-infrared image, the tongue and the moss give the same output (the same color) in the near-infrared image, and the difference between the tongue color and the moss color disappears.
  • As a result, the color of the tongue can be detected accurately.
  • Since the unevenness of the moss does not appear in the near-infrared image, the thickness of the tongue and the cracks in its surface, described later, can also be detected accurately by using light of any wavelength from 650 to 1000 nm.
  • The color of the tongue reflects the color of the blood.
  • the color of blood changes mainly depending on the degree of oxidation of hemoglobin.
  • FIG. 10 shows the absorption characteristics of oxygenated hemoglobin (HbO2) and reduced hemoglobin (Hb). The figure shows that in the wavelength region of 600 to 800 nm, reduced hemoglobin has a larger absorption coefficient than oxygenated hemoglobin, i.e., it absorbs light in this region more strongly. For this reason, blood with a high proportion of reduced hemoglobin absorbs more red light and looks relatively blue. Therefore, the color of the tongue, which reflects the color of blood, can be detected by determining whether the proportion of reduced hemoglobin is high or low.
  • FIG. 10 also shows that the difference between the absorption coefficients of reduced hemoglobin and oxygenated hemoglobin is largest at a wavelength of 660 nm, and that the two coefficients become equal at 805 nm.
  • Conventionally, the color of the tongue was detected in moss-free regions at the left or right edge of the tongue or at its lower part (the tongue apex); in the present embodiment, however, detection is not affected by the moss, so the color may be detected in any region of the tongue.
  • Here, the central region C of the near-infrared image of the tongue shown in FIG. 8 (a region of the tongue between the tongue tip and the tongue base) is set as the diagnosis target region, and the color of the tongue is detected from the image data of this region C as follows.
  • The calculation unit 16 obtains the average value Bm of the B image data in region C of the visible image and the average value IRm of the image data in region C of the near-infrared image acquired by the imaging unit 3, and uses the ratio IRm/Bm as an index for detecting the color of the tongue.
  • When the ratio IRm/Bm is large, the blue component is relatively small, so the color of the tongue is reddish (the degree of blood oxidation is high).
  • When the ratio IRm/Bm is small, the blue component is relatively large, so the color of the tongue is bluish (the degree of blood oxidation is low).
  • the calculation unit 16 classifies the tongue color into a plurality of ranks according to the value of the ratio IRm / Bm.
  • For example, with thresholds L1 < L2 < L3: if the value of IRm/Bm is less than L1, the rank is set to "1";
  • if the value of IRm/Bm is L1 or more and less than L2, the rank is set to "2";
  • and if the value of IRm/Bm is L2 or more and less than L3, the rank is set to "3".
  • the calculation unit 16 classifies the tongue color into any one of “1” to “3” according to the value of IRm / Bm.
  • Instead of the numerical values "1" to "3", this classification may express the degree of tongue color non-numerically, for example as "dark red", "normal red", or "light red". In this way, even if the entire tongue surface is covered with moss or the moss distribution is uneven, the color of the tongue (including the degree of redness) can be detected accurately.
  • When detecting the color of the tongue (the degree of oxidation of hemoglobin), it is desirable to use illumination light including wavelengths of 600 to 800 nm.
  • The index used when detecting the color of the tongue is not limited to the ratio IRm/Bm; for example, the ratio IRm/(IRm + Bm) may be used.
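A minimal sketch of this color classification, assuming 8-bit image planes and illustrative thresholds L1 and L2 (no numeric values are given in the text, and the rank-to-label mapping is an assumption):

```python
import numpy as np

def classify_tongue_color(ir_img, b_img, region_c, l1=1.5, l2=2.5):
    """Rank tongue color from the IRm/Bm index over diagnosis region C.

    ir_img: near-infrared image; b_img: B plane of the visible image;
    region_c: boolean mask for region C.
    """
    ir_m = ir_img[region_c].astype(float).mean()       # IRm
    b_m = b_img[region_c].astype(float).mean() + 1e-6  # Bm
    index = ir_m / b_m                 # large -> little blue -> reddish tongue
    if index < l1:
        return 1, "light red (bluish)"                 # low blood oxidation
    elif index < l2:
        return 2, "normal red"
    return 3, "dark red"                               # high blood oxidation
```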
  • FIGS. 11 and 12 show the visible image of the tongue, the cross-sectional shape of the tongue along the line A-A', and the distribution of the RGB image data of the pixels aligned in the A-A' direction in the visible image.
  • FIG. 11 shows a case where the tongue is thin and the moss is thin
  • FIG. 12 shows a case where the tongue is thick and the moss is thick.
  • The A-A' direction refers to the direction that passes through the tongue between the tongue tip and the tongue base in the visible image and is perpendicular to the line connecting the tongue tip and the tongue base; it corresponds to the horizontal direction.
  • When the tongue is thin, the surface of the tongue has a valley shape, with the central part recessed. For this reason, when the illumination light reflected from the tongue surface is received by the imaging unit and the distribution of image data in the A-A' direction is created, the distribution is recessed at the center for all of R, G, and B (see FIG. 11).
  • When the tongue is thick, the surface of the tongue bulges at the center into a mountain shape (convex upward). Therefore, when the illumination light reflected from the tongue surface is received by the imaging unit and the distribution of image data in the A-A' direction is created, the distribution is convex, with a high central portion, for all of R, G, and B (see FIG. 12).
  • However, when moss is present on the tongue surface, the RGB image data distribution contains noise caused by the light reflected from the moss.
  • FIGS. 13 and 14 show the distribution of image data of each pixel arranged in the A-A ′ direction of the visible image and the near-infrared image shown in FIG. 8, respectively.
  • In the near-infrared image, the A-A' direction likewise refers to the direction that passes through the tongue between the tongue apex and the tongue base and is perpendicular to the line connecting them; it corresponds to the horizontal direction.
  • the calculation unit 16 detects the thickness of the tongue as follows using the distribution of the image data of the near-infrared image.
  • FIG. 15 shows a sectional shape of the tongue and a distribution of image data of each pixel arranged in the A-A ′ direction of the near-infrared image of the tongue.
  • FIG. 16 shows the distribution of the image data in FIG. 15 together with the region S set when the thickness is detected.
  • This region S lies closer to the end of the tongue than to the central portion in the A-A' direction.
  • With W denoting the width of the tongue determined from its contour line, one possible dimensional relationship, shown in FIG. 16, is a region of width W/4 located at a distance of W/8 from the end of the tongue.
  • the contour line of the tongue is obtained by the image processing unit 15 described above.
  • The cross-sectional shape of the tongue varies in many ways.
  • When the tongue is thin, however, the distribution of image data in region S is flat, or changes from flat to concave, no matter how the tongue moves, so the quadratic coefficient of a polynomial approximating the distribution in region S is zero or positive.
  • When the tongue is thick, the quadratic coefficient of the polynomial approximating the distribution in region S is negative, and the thicker the tongue, the larger the magnitude of this negative coefficient. Therefore, the thickness of the tongue can be detected from the quadratic coefficient of the polynomial approximating the distribution of the image data of the near-infrared image in region S.
  • the calculation unit 16 classifies the thickness of the tongue into any one of “1” to “3” according to the second-order coefficient of the approximate polynomial in the image data distribution region S.
  • Instead of the numerical values "1" to "3", this classification may express the degree of tongue thickness non-numerically, for example as "thin", "normal", or "thick". In this way, the thickness of the tongue (including its degree) can be detected accurately regardless of the presence or unevenness of moss on the tongue surface.
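A minimal sketch of this thickness classification, using the region-S geometry of FIG. 16 and illustrative coefficient thresholds (assumptions, not values from the text):

```python
import numpy as np

def classify_tongue_thickness(ir_row, tongue_span, t_thick=-0.02):
    """Rank tongue thickness from the quadratic fit over region S.

    ir_row: 1-D profile of near-infrared image data along A-A';
    tongue_span: (left, right) pixel indices of the tongue from its contour.
    """
    left, right = tongue_span
    w = right - left                  # tongue width W from the contour line
    s0 = left + w // 8                # region S: width W/4, at a distance
    s1 = s0 + w // 4                  # W/8 from the end of the tongue
    x = np.arange(s0, s1)
    a2 = np.polyfit(x, ir_row[s0:s1], 2)[0]   # quadratic coefficient
    if a2 >= 0.0:
        return 1, "thin"              # flat or concave profile
    elif a2 > t_thick:
        return 2, "normal"
    return 3, "thick"                 # strongly convex (large negative a2)
```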
  • Detection of cracks in the tongue surface: FIGS. 17 and 18 show the frequency distributions of the image data in region C of the visible image and of the near-infrared image of FIG. 8, respectively. Since many cracks occur near the center of the tongue surface, in the present embodiment a region in the tongue near its vertical center, with a height equal to 1/4 of the vertical length of the tongue, is defined as region C and set as the crack detection region (diagnosis region). The frequency distribution for region C of the visible image in FIG. 17 is that of the B image data among R, G, and B.
  • When the tongue surface is cracked, more of the tongue base is exposed than when there is no crack, so the range of values taken by the image data of the pixels making up the base widens for all of R, G, and B. For this reason, the frequency distribution of the image data of the captured image becomes wider.
  • the crack on the tongue surface is caused by a decrease in immunity due to lack of nutrition, poor blood flow, stress, or the like.
  • Moss is keratinized papillary tissue of the lingual mucosa, with exfoliated cells, mucus, food debris, and bacteria attached to the tongue surface. Since the tongue base appears in the portions without moss, the range of values taken by the image data of the pixels constituting the base is also widened for all of R, G, and B. For this reason, the frequency distribution of the visible image in FIG. 17 includes both the image data of the base in the moss-free portions and the image data of the crack portions, so its standard deviation (or variance) is large. Incidentally, the standard deviation of the frequency distribution of FIG. 17 is 26.78 (the variance is the square of the standard deviation).
  • In the frequency distribution of the near-infrared image (FIG. 18), in contrast, the moss does not appear, so the standard deviation (or variance) is smaller than that of the visible image.
  • Indeed, the standard deviation of the frequency distribution in FIG. 18 is 13.18. Therefore, by creating the frequency distribution of the image data of the near-infrared image and examining its standard deviation (or variance), cracks in the tongue can be detected accurately regardless of the surface state of the tongue (the uneven distribution of moss).
  • For example, with thresholds N1 < N2 < N3: if the standard deviation is less than N1, the rank is set to "1"; if it is N1 or more and less than N2, the rank is set to "2";
  • and if it is N2 or more and less than N3, the rank is set to "3".
  • The calculation unit 16 classifies the cracking of the tongue into one of "1" to "3" according to the value of the standard deviation σ (or variance σ²) of the frequency distribution. Instead of the numerical values "1" to "3", this classification may express the degree of cracking non-numerically, for example as "small", "medium", or "large". This makes it possible to detect cracks in the tongue (including their degree) accurately regardless of the presence or unevenness of moss on the tongue surface.
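A minimal sketch of this crack classification, with illustrative thresholds N1 and N2 (the text gives no numeric values; the upper bound N3 on rank 3 is omitted here):

```python
import numpy as np

def classify_tongue_cracks(ir_img, region_c, n1=10.0, n2=20.0):
    """Rank tongue-surface cracking from the spread of the histogram of
    near-infrared image data in crack-detection region C."""
    sigma = ir_img[region_c].astype(float).std()  # std. dev. of the frequency
    if sigma < n1:                                # distribution (variance would
        return 1, "small"                         # work equally well)
    elif sigma < n2:
        return 2, "medium"
    return 3, "large"
```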
  • FIG. 19 is a flowchart showing an operation flow in the organ image photographing apparatus 1 of the present embodiment.
  • The illumination control unit 11 turns on the illumination unit 2 (S1) and sets imaging conditions such as illuminance (S2).
  • Next, the imaging control unit 12 controls the imaging unit 3 to photograph the tongue, which is the imaging target (S3), whereby the visible image and the near-infrared image of the tongue are obtained.
  • The image processing unit 15 then extracts the contour line of the tongue from the captured image (S4). The calculation unit 16 detects the upper, lower, left, and right ends of the tongue from the extracted contour and sets the regions used to detect the diagnostic items (tongue color, thickness, cracks) (S5). Specifically, the calculation unit 16 sets region C in the tongue (see FIG. 8) for detecting the color of the tongue, sets the row of pixels in the horizontal (A-A') direction for detecting the thickness of the tongue, and sets region C in the tongue for detecting cracks in the tongue surface.
  • Based on the captured images of the tongue (the visible image and the near-infrared image), the calculation unit 16 extracts the feature quantities for detecting the color, thickness, and cracks of the tongue, namely the ratio IRm/Bm, the quadratic coefficient of the polynomial approximating the distribution in region S, and the standard deviation σ of the frequency distribution (S6), and classifies them (for example, numerically) by the methods described above (S7).
  • The calculated values are output from the calculation unit 16, displayed on the display unit 4, and stored in the storage unit 17; as necessary, they are also output as audio from the audio output unit 7, sent to an output device (not shown), or transferred to the outside via the communication unit 6 (S8).
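Putting the steps together, a hypothetical driver for S6 to S7, reusing the three classification sketches above (the contour extraction of S4 to S5 is assumed to have produced the region mask and tongue span; none of these names come from the patent):

```python
def analyze_tongue(visible_rgb, ir_img, region_c, tongue_span, row_y):
    """Compose the classification sketches above for steps S6-S7.

    visible_rgb: HxWx3 visible image (RGB order); ir_img: HxW near-infrared
    image; region_c: boolean mask for region C; tongue_span: (left, right)
    tongue edges on profile row row_y, both taken from the contour of S4-S5.
    Returns (rank, label) pairs for each diagnostic item.
    """
    color = classify_tongue_color(ir_img, visible_rgb[..., 2], region_c)
    thickness = classify_tongue_thickness(ir_img[row_y], tongue_span)
    cracks = classify_tongue_cracks(ir_img, region_c)
    return {"color": color, "thickness": thickness, "cracks": cracks}
```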
  • As described above, in the present embodiment, the calculation unit 16 classifies the diagnostic items of the tongue (color, thickness, cracks) based on at least the image data of the near-infrared image among the images obtained by photographing the tongue with the imaging unit 3. In the near-infrared image, noise that is visible in the visible image (e.g., moss) can no longer be seen, so the diagnostic items of the tongue can be detected accurately regardless of the state of the tongue surface (whether moss is present or unevenly distributed), and the accuracy of the diagnosis based on the detected items improves.
  • the calculation unit 16 classifies the degree of tongue thickness as a diagnostic item based on the distribution of image data in one direction in the near-infrared image.
  • In the near-infrared image, the influence of the moss on the tongue surface, which acts as noise, is eliminated, so the distribution of image data in one direction is smoother than in the visible image. It is therefore possible to grasp the cross-sectional shape of the tongue surface accurately and easily from the distribution of image data in one direction of the near-infrared image, and to detect the thickness of the tongue accurately.
  • The horizontal surface shape of the tongue is grasped by using the distribution of the image data of the pixels that pass through the tongue and are aligned in the direction perpendicular to the line connecting the tongue tip and the tongue base.
  • the calculation unit 16 obtains a second order polynomial that approximates the above distribution in a predetermined region (region S) of the distribution of the image data, and classifies the degree of tongue thickness based on the second order coefficient. Therefore, it becomes easy to classify the thickness of the tongue into a plurality of ranks based on the magnitude and sign (positive / negative) of the second-order coefficient of the approximate polynomial.
  • the calculation unit 16 classifies the degree of cracking of the tongue as a diagnostic item based on the frequency distribution of the image data of a partial region of the tongue in the near-infrared image.
  • In the near-infrared image, the influence of the moss on the tongue surface, which acts as noise, is eliminated, so the spread of the frequency distribution of the near-infrared image is smaller than that of the visible image. The presence or absence of cracks in the tongue surface can therefore be detected accurately based on the frequency distribution of the near-infrared image.
  • the calculation unit 16 determines the degree of cracking of the tongue based on the standard deviation or variance of the frequency distribution. This makes it possible to accurately and reliably detect cracks on the tongue surface.
  • the calculation unit 16 classifies the degree of tongue color as a diagnostic item based on the near-infrared image data and the B image data in the visible image.
  • B image data is less likely to fluctuate due to the presence of moss than other G and R image data, so the near-infrared image data and B image data are used for tongue color detection.
  • the color of the tongue can be accurately detected regardless of the presence or absence of moss.
  • The calculation unit 16 classifies the degree of tongue color based on the ratio (IRm/Bm) of the average value IRm of the image data in a partial region of the near-infrared image to the average value Bm of the B image data in a partial region of the visible image. In this case, whether the tongue is reddish or bluish can be determined by a relative comparison of IRm and Bm, and the color of the tongue can be detected easily.
  • In the above description, the subject to be photographed is a human tongue, but the subject may also be an animal other than a human.
  • In that case too, the diagnostic items can be detected accurately by applying the method of this embodiment, making it possible to determine quickly and accurately the poor physical condition of an animal that cannot communicate its state.
  • The case where the organ of the living body is the tongue has been described as an example, but the organ to be diagnosed may be another organ, such as the eyelid.
  • the organ image capturing apparatus described above can be expressed as follows, and has the following effects.
  • The organ imaging apparatus described above includes: an illumination unit that illuminates an organ of a living body with illumination light including near-infrared light; an imaging unit that acquires an image of the organ by receiving the illumination light reflected from the surface of the organ; and an arithmetic unit that classifies the degree of a diagnostic item of the organ based on image data of a near-infrared image obtained by receiving at least the near-infrared light included in the illumination light, among the images acquired by the imaging unit.
  • The images acquired by the imaging unit include a near-infrared image of the organ, in which surface features that act as noise (for example, moss, if the organ is the tongue) do not appear.
  • Because the arithmetic unit classifies the degree of the diagnostic item of the organ based on the image data of the near-infrared image, the diagnostic item can be detected accurately regardless of the state of the organ surface that causes noise.
  • the organ may be a tongue.
  • By classifying the degree of the diagnostic items of the tongue (for example, the color and thickness of the tongue, and cracks in its surface) based on the image data of the near-infrared image, the arithmetic unit is unaffected by the moss on the tongue surface and can therefore detect the diagnostic items accurately. That is, even if the entire tongue surface is covered with moss or the moss distribution is uneven, the diagnostic items of the tongue can be detected accurately.
  • the calculation unit may classify the degree of the thickness of the tongue as the diagnostic item based on the distribution of image data in one direction in the near-infrared image.
  • the influence of the moss on the tongue surface that becomes noise is eliminated, so that the distribution of the image data in one direction in the near-infrared image is smoother than the distribution of the image data in one direction in the visible image. Therefore, it is possible to accurately and easily grasp the cross-sectional shape of the tongue surface based on the former distribution, and it is possible to accurately detect the thickness of the tongue.
  • The calculation unit may obtain a second-order polynomial that approximates the distribution in a predetermined region of the distribution of the image data, and classify the degree of the thickness of the tongue based on the second-order coefficient of the polynomial.
  • the thickness of the tongue can be detected based on the magnitude and sign (positive / negative) of the second-order coefficient of the polynomial, and the level of the thickness of the tongue can be easily classified.
  • The distribution of the image data may be the distribution of the image data of the pixels that pass through the tongue between the tongue tip and the tongue base in the near-infrared image and are aligned in the direction perpendicular to the line connecting the tongue tip and the tongue base.
  • In this case, the horizontal surface shape of the tongue is grasped by classifying the degree of tongue thickness based on the distribution of horizontal image data passing through the center of the tongue in the near-infrared image, so the thickness of the tongue can be detected accurately.
  • the calculation unit may classify the degree of cracking of the tongue as the diagnostic item based on a frequency distribution of image data of a partial region of the tongue in the near-infrared image.
  • In the near-infrared image, the influence of the moss on the tongue surface, which causes noise, is eliminated, so the spread of the frequency distribution of the image data of a partial region of the tongue is smaller than the spread of the frequency distribution of the image data of the same region in the visible image. Therefore, the presence or absence of cracks in the tongue surface can be detected accurately based on the former frequency distribution.
  • the calculation unit may classify the degree of tongue cracking based on the standard deviation or variance of the frequency distribution. Since the standard deviation or variance indicates the extent of the frequency distribution, it is possible to accurately and reliably detect cracks on the tongue surface.
  • The partial region may be a region in the tongue between the tongue tip and the tongue base in the near-infrared image. In this case, cracks, which are likely to occur near the center of the tongue surface, can be detected reliably based on the frequency distribution.
  • The illumination light may include visible light, and the calculation unit may classify the degree of tongue color as the diagnostic item based on the image data of the near-infrared image and the blue image data of the visible image obtained by receiving, via the surface of the organ, the visible light included in the illumination light.
  • The blue (B) image data of the visible image fluctuates less due to the presence of moss (is less susceptible to moss) than the green (G) and red (R) image data. Therefore, by classifying the degree of tongue color based on the image data of the near-infrared image and the B image data of the visible image, the color of the tongue can be detected accurately regardless of the presence of moss.
  • The calculation unit may classify the degree of tongue color based on the ratio of the average value of the image data in the target region of the near-infrared image to the average value of the blue image data in the target region of the visible image.
  • Since the redness or blueness of the tongue is detected by a relative comparison of the average value (IRm) of the image data in the target region of the near-infrared image and the average value (Bm) of the B image data in the target region of the visible image, the color of the tongue can be detected easily.
  • the present invention can be used in an apparatus for photographing a living organ and detecting a diagnosis item of the organ from the photographed image.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention relates to an organ imaging apparatus (1) provided with an illumination unit (2), an imaging unit (3), and a calculation unit (16). The illumination unit (2) illuminates an organ of a living body with illumination light including near-infrared light. The imaging unit (3) acquires images of the organ by receiving the illumination light reflected at the surface of the organ. The calculation unit (16) classifies the degrees of diagnostic items of the organ on the basis of the image data of at least the near-infrared images, obtained from the received near-infrared light contained in the illumination light, among the images acquired by the imaging unit (3).
PCT/JP2015/054706 2014-04-08 2015-02-20 Organ imaging apparatus Ceased WO2015156039A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-079375 2014-04-08
JP2014079375 2014-04-08

Publications (1)

Publication Number Publication Date
WO2015156039A1 (fr) 2015-10-15

Family

ID=54287620

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/054706 Ceased WO2015156039A1 (fr) 2014-04-08 2015-02-20 Organ imaging apparatus

Country Status (1)

Country Link
WO (1) WO2015156039A1 (fr)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006149679A (ja) * 2004-11-29 2006-06-15 Konica Minolta Holdings Inc Health level determination method, apparatus, and program
JP2009028058A (ja) * 2007-07-24 2009-02-12 Saieco:Kk Tongue diagnosis system, tongue diagnosis apparatus, tongue diagnosis method, and tongue diagnosis program
CN101214140A (zh) * 2007-12-29 2008-07-09 Harbin Institute of Technology Portable near-infrared sublingual vein image acquisition instrument
KR101028999B1 (ko) * 2010-07-16 2011-04-14 상지대학교산학협력단 Tongue diagnosis image capturing apparatus and method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110929740A (zh) * 2019-11-21 2020-03-27 中电健康云科技有限公司 Method for separating tongue body from tongue coating based on an LGBM model

Similar Documents

Publication Publication Date Title
US20170164888A1 (en) Organ imaging device
JP6204314B2 (ja) Electronic endoscope system
JPWO2015029537A1 (ja) Organ image capturing device
JP7542585B2 (ja) Image processing device, endoscope system, and method for operating image processing device
EP3851026B1 (fr) Endoscope device, endoscope processor, and method for operating endoscope device
CN112469323B (zh) Endoscope system
KR20150128916A (ko) Estimation of bilirubin levels
US20170311872A1 (en) Organ image capture device and method for capturing organ image
JPWO2020170791A1 (ja) Medical image processing apparatus and method
JP7229676B2 (ja) Biological information detection device and biological information detection method
WO2016067892A1 (fr) Health level generation device, health level generation system, and program
AU2025202257A1 (en) Intraoral diagnostic device and method of using same
US20160210746A1 (en) Organ imaging device
JP2016198140A (ja) Organ image capturing device
WO2022014258A1 (fr) Processor device and method for operating same
WO2015068494A1 (fr) Organ image capturing device
WO2015156039A1 (fr) Organ imaging apparatus
JP2005094185A (ja) Image processing system, image processing device, and imaging control method
JP2016151584A (ja) Organ image capturing device
JP6756054B2 (ja) Processor for electronic endoscope and electronic endoscope system
JP2016150024A (ja) Organ image capturing device
CN113556968A (zh) Endoscope system
JP2015226599A (ja) Biological chromaticity measuring device
WO2023275774A1 (fr) Non-invasive device for rapid estimation of anemia
US20160228054A1 (en) Organ imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15776341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15776341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP