
WO2025115505A1 - Tooth row image generation device, tooth row image generation method, and program - Google Patents


Info

Publication number
WO2025115505A1
WO2025115505A1 (application PCT/JP2024/038825)
Authority
WO
WIPO (PCT)
Prior art keywords
image
dentition
teeth
row
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/038825
Other languages
French (fr)
Japanese (ja)
Inventor
泰雄 大塚
岳史 浜崎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Publication of WO2025115505A1 publication Critical patent/WO2025115505A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 1/24: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • A61C: DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C 19/00: Dental auxiliary appliances
    • A61C 19/04: Measuring instruments specially adapted for dentistry

Definitions

  • the present disclosure relates to a dentition image generating device, a dentition image generating method, and a program.
  • Patent Document 1 discloses a device that has an excitation light emitting unit that emits excitation light to cause plaque, lesions, etc. to emit fluorescence, and an illumination light emitting unit that emits illumination light to illuminate the periphery of plaque, lesions, etc., and that the irradiation means is capable of irradiating the illumination light and excitation light simultaneously.
  • Patent Document 1 also discloses that in order to improve the visibility of plaque, lesions, etc., the amount of light emitted from the excitation light emitting unit is greater than the amount of light emitted from the illumination light emitting unit.
  • it is therefore desirable to provide a dentition image generating device that generates a dentition image that creates less of an unnatural feeling.
  • the present disclosure provides a dentition image generating device, a dentition image generating method, and a program that are capable of generating a dentition image with reduced discomfort.
  • a dentition image generating device includes an acquisition unit that acquires a captured image obtained by photographing the surface of the dentition and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light, a synthesis unit that generates a dentition image by synthesizing a plurality of dentition block images, which are partial dentition images based on the captured images, and a processing unit that performs white balance processing on the synthesized dentition image.
  • a dentition image generating method obtains an image obtained by photographing the surface of the dentition and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light, generates a dentition image by synthesizing a plurality of dentition block images, which are partial dentition images based on the photographed image, and performs white balance processing on the synthesized dentition image.
  • a program according to one aspect of the present disclosure is a program for causing a computer to execute the above-mentioned dentition image generating method.
  • a dentition image generating device or the like that is capable of generating a dentition image with reduced discomfort.
  • FIG. 1 is a perspective view of an intraoral camera in a dentition image generating system according to an embodiment.
  • FIG. 2 is a schematic diagram of a dentition image generating system according to an embodiment.
  • FIG. 3 is a block diagram illustrating a functional configuration of the mobile terminal according to the embodiment.
  • FIG. 4 is a sequence diagram showing the operation of the dentition image generating system according to the embodiment.
  • FIG. 5 is a diagram for explaining the processing of the dentition image generating system according to the embodiment.
  • Patent Document 1 discloses that an LED (Light Emitting Diode) with a central wavelength of 365 nm, 405 nm, or 470 nm is used as the light source of the excitation light.
  • the color tone of the plaque-free tooth area and the surrounding area in the obtained image is close to the color of the excitation light.
  • the color tone of the plaque-free tooth area and the surrounding area of the affected area is far from the actual color tone, and the user may feel uncomfortable with the obtained image.
  • the obtained image is used to observe the condition of normal tissue around the affected area or to explain it to the user, the user may feel uncomfortable because the color tone is far from the actual color tone.
  • the reproducibility of the color tone of each image obtained is low, and the user may feel uncomfortable with the obtained image.
  • the device of Patent Document 1 has room for improvement in terms of generating a dentition image with reduced discomfort.
  • the inventors of the present application therefore conducted extensive research into dentition image generating devices etc. capable of generating dentition images with reduced discomfort, and have devised the dentition image generating device etc. shown below.
  • the dentition image generating device includes an acquisition unit that acquires a captured image obtained by photographing the surface of the dentition and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light, a synthesis unit that generates a dentition image by synthesizing a plurality of dentition block images, which are partial dentition images based on the captured images, and a processing unit that performs white balance processing on the synthesized dentition image.
  • the dentition image generating device may be the dentition image generating device according to the first aspect, and the processing unit may perform the white balance processing based on color information of the central tooth of the dentition image after synthesis.
  • the white balance of the central tooth can be adjusted with an appropriate gain.
  • the color tone of the central tooth in the synthesized dentition image can be made closer to the actual color tone. Therefore, for example, if the area that the user focuses on is an area that includes the central tooth, the plaque area in that area can be effectively emphasized.
  • the dentition image generating device may be the dentition image generating device according to the first aspect, and the processing unit may perform the white balance processing based on color information of the entire dental region of the dentition image after synthesis.
  • a dentition image generating device may be a dentition image generating device according to any one of the first to third aspects, in which the processing unit adjusts the blue color level of the dental region of each of the plurality of dentition block images, and the synthesis unit synthesizes the plurality of dentition block images with the blue color level adjusted.
  • the dentition image generating device may be a dentition image generating device according to any one of the first to fourth aspects, and may include a detection unit that detects plaque in an image based on the captured image, and a generation unit that generates a composite dentition image in which the plaque area detected by the detection unit is highlighted.
  • plaque areas can be highlighted in the composite dentition image. For example, when such a composite dentition image is presented to a user, the user can easily identify the plaque areas.
  • the dentition image generating device may be the dentition image generating device according to the fifth aspect, and may include a display unit that displays the synthesized dentition image in which the plaque area is highlighted.
  • the dentition image generating device may be a dentition image generating device according to any one of the first to sixth aspects, and the plurality of dentition block images may include images of anterior teeth.
  • the dentition image generating device may be a dentition image generating device according to any one of the first to seventh aspects, and the surface of the dentition may include a side surface of the dentition.
  • the dentition image generating device may be a dentition image generating device according to any one of the first to eighth aspects, and the surface of the dentition may include an occlusal surface of the dentition.
  • the dentition image generating device may be a dentition image generating device according to any one of the first to ninth aspects, and may include a saturation enhancement processing unit that generates a converted image by converting the combined dentition image into an HSV image, identifies a specific pixel area in which one or more pixels of the converted image that satisfy at least one of a first predetermined range for saturation, a second predetermined range for hue, and a third predetermined range for brightness are located, and performs saturation enhancement processing on the specific pixel area in the combined dentition image to generate the combined dentition image with enhanced saturation.
  • the dentition image generating device may be a dentition image generating device according to any one of the first to tenth aspects, and may include a shading display processing unit that generates a converted image by converting the combined dentition image into an HSV image, identifies a specific pixel area in which one or more pixels of the converted image that satisfy at least one of a first predetermined range for saturation, a second predetermined range for hue, and a third predetermined range for brightness are located, detects an accumulation level distribution of fluorescent substances accumulated in the plaque from the brightness value in the specific pixel area of the converted image, and performs shading image processing according to the accumulation level distribution of the fluorescent substances detected for the specific pixel area in the combined dentition image to generate the combined dentition image including a shading display.
  • a dentition image generating method obtains a photographed image obtained by photographing the surface of the dentition and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light, generates a dentition image by synthesizing a plurality of dentition block images, which are partial dentition images based on the photographed image, and performs white balance processing on the synthesized dentition image.
  • a program according to one aspect of the present disclosure is a program for causing a computer to execute the above-mentioned dentition image generating method.
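The HSV-based saturation enhancement of the tenth aspect above could be sketched as follows, using Python's standard colorsys module. The H/S/V ranges and the boost factor are hypothetical placeholders for the "predetermined ranges" (the disclosure does not give numeric values), and the "at least one of" condition is implemented as a logical OR, as stated.

```python
import colorsys

# Hypothetical stand-ins for the first/second/third predetermined ranges.
HUE_RANGE = (0.90, 1.00)   # hue band assumed for plaque fluorescence
SAT_MIN = 0.25             # minimum saturation
VAL_MIN = 0.10             # minimum brightness

def enhance_saturation(pixels, boost=1.5):
    """Convert each RGB pixel to HSV; for pixels satisfying at least one
    of the predetermined H/S/V ranges, multiply saturation by `boost`,
    then convert back to RGB."""
    out = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if (HUE_RANGE[0] <= h <= HUE_RANGE[1]) or s >= SAT_MIN or v >= VAL_MIN:
            s = min(1.0, s * boost)
        r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
        out.append((round(r2 * 255), round(g2 * 255), round(b2 * 255)))
    return out
```

A gray pixel stays gray (its saturation is zero, so boosting is a no-op), while a saturated fluorescent-looking pixel comes out more vivid.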
  • each figure is a schematic diagram and is not necessarily an exact illustration. Therefore, for example, the scales of the figures do not necessarily match.
  • the same reference numerals are used for substantially the same configurations, and duplicate explanations are omitted or simplified.
  • ordinal numbers such as “first” and “second” do not refer to the number or order of components, unless otherwise specified, but are used for the purpose of avoiding confusion between and distinguishing between components of the same type.
  • (Embodiment)
  • Hereinafter, a dentition image generating system and a dentition image generating method according to the present embodiment will be described with reference to FIGS. 1 to 5.
  • FIG. 1 is a perspective view of an intraoral camera 10 in the dentition image generating system according to the present embodiment.
  • the intraoral camera 10 has a toothbrush-like housing that can be handled with one hand, and the housing has a head portion 10a that is placed in the user's oral cavity when photographing the dentition, a handle portion 10b that is held by the user, and a neck portion 10c that connects the head portion 10a and the handle portion 10b.
  • the photographing unit 21 photographs the surfaces of the dentition and dental plaque in the oral cavity irradiated with light including the wavelength range of blue light.
  • the surfaces of the dentition include at least one of the buccal (outer) side surface of the dentition, the lingual (inner) side surface of the dentition, and the occlusal surface of the dentition.
  • the imaging unit 21 is incorporated into the head portion 10a and the neck portion 10c.
  • the imaging unit 21 has an image sensor (not shown) and a lens (not shown) arranged on its optical axis LA.
  • the imaging element is an imaging device such as a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) element, and an image of the teeth is formed on it by the lens.
  • the imaging element outputs a signal (image data) corresponding to the formed image to the outside.
  • the image photographed by the imaging element is also referred to as an RGB image.
  • the RGB image is an image sequence obtained by irradiating the dentition with blue light, and may be, for example, an image sequence of the side of the dentition, or an image sequence of the occlusal surface of the dentition.
  • the image sequence includes, for example, one or more images (e.g., time-series images) photographed along the direction of the dentition.
  • the side of the dentition may be the tongue side or the cheek side.
  • the photographing unit 21 may further have an optical filter that blocks light of a color irradiated from the illumination unit (illumination device) and transmits fluorescence emitted by plaque in response to the light.
  • the photographing unit 21 may have an optical filter that is a blue light cut filter that cuts blue wavelength light components contained in the light incident on the image sensor.
  • the blue light cut filter cuts out a portion of the light including the blue wavelength range from the light before it enters the image sensor. Note that the photographing unit 21 does not need to have a blue light cut filter.
  • the intraoral camera 10 is also equipped with multiple first to fourth LEDs 23A to 23D as an illumination unit that irradiates light onto the teeth to be photographed during photography.
  • the first to fourth LEDs 23A to 23D irradiate plaque with light of a color that causes the plaque to fluoresce when irradiated thereon (e.g., light of a single color).
  • the first to fourth LEDs 23A to 23D are, for example, blue LEDs that irradiate blue light that includes a wavelength (an example of a predetermined wavelength) with a peak at 405 nm. Note that the first to fourth LEDs 23A to 23D are not limited to blue LEDs, and may be any light source that irradiates light that includes the wavelength range of blue light.
  • FIG. 2 is a schematic diagram of a dentition image generating system according to this embodiment.
  • the dentition image generating system according to this embodiment is generally configured such that the imaging unit 21 captures fluorescence emitted by plaque in response to light from the illumination unit 23, and then synthesizes (links and combines) multiple images based on the captured RGB images (e.g., a first dentition block image, described below) to generate a panoramic image in which the teeth are aligned horizontally.
  • the panoramic image is an image that represents the condition of at least a portion (e.g., the entirety) of the user's mouth.
  • the dentition image generation system includes an intraoral camera 10 and a mobile terminal 50.
  • the intraoral camera 10 comprises a hardware unit 20, a signal processing unit 30, and a communication unit 40.
  • the hardware unit 20 is a physical element of the intraoral camera 10, and includes an imaging unit 21, a sensor unit 22, an illumination unit 23, and an operation unit 24.
  • the photographing unit 21 generates image data by photographing the teeth in the user's oral cavity. It can also be said that the photographing unit 21 generates image data by photographing the surface of the dentition in the oral cavity and the plaque irradiated with light of a specific wavelength that excites fluorescent substances contained in the plaque.
  • the photographing unit 21 receives a control signal from the camera control unit 31, performs operations such as photographing in accordance with the received control signal, and outputs image data of a moving image or a still image obtained by photographing to the image processing unit 32.
  • the photographing unit 21 has the above-mentioned image sensor, optical filter, and lens.
  • the image data is generated, for example, based on light that has passed through the optical filter.
  • the image data is an image showing multiple teeth, but it is sufficient that the image shows at least one tooth.
  • the sensor unit 22 detects external light entering the photographing area of the RGB image. For example, the sensor unit 22 detects whether external light is entering the oral cavity.
  • the sensor unit 22 is arranged, for example, near the photographing unit 21.
  • the sensor unit 22 may be arranged, for example, in the head unit 10a of the intraoral camera 10, similar to the photographing unit 21. In other words, the sensor unit 22 is located in the user's oral cavity when the photographing unit 21 captures images.
  • the lighting unit 23 irradiates light onto the area of the multiple areas in the oral cavity that is to be photographed by the imaging unit 21.
  • QLF (quantitative visible light induced fluorescence)
  • the illumination unit 23 has the above-mentioned first to fourth LEDs 23A to 23D.
  • the first to fourth LEDs 23A to 23D, for example, irradiate the shooting area with light from different directions. This makes it possible to prevent shadows from appearing in the shooting area.
  • Each of the first to fourth LEDs 23A to 23D is configured so that at least the dimming can be controlled.
  • Each of the first to fourth LEDs 23A to 23D may be configured so that the dimming and color can be controlled.
  • the first to fourth LEDs 23A to 23D are arranged to surround the image capture unit 21.
  • the illumination unit 23 has its illumination intensity (light emission intensity) controlled according to the shooting area.
  • the illumination intensity of each of the first to fourth LEDs 23A to 23D may be controlled uniformly, or may be controlled to be different from one another.
  • the number of LEDs in the illumination unit 23 is not particularly limited, and may be one, or may be five or more.
  • the illumination unit 23 is not limited to having LEDs as a light source, and may have other light sources.
  • the operation unit 24 accepts operations from the user.
  • the operation unit 24 is configured, for example, with push buttons, but may also be configured to accept operations by voice, etc.
  • the hardware unit 20 may further include a battery (e.g., a secondary battery) that supplies power to each component of the intraoral camera 10, a coil for wireless charging by an external charger connected to a commercial power source, and an actuator required for at least one of composition adjustment and focus adjustment.
  • the signal processing unit 30 has various functional components realized by a CPU (Central Processing Unit) or MPU (Micro Processor Unit) that execute various processes described below, and a memory unit 35 such as a ROM (Read Only Memory) or RAM (Random Access Memory) that stores programs for causing each functional component to execute various processes.
  • the signal processing unit 30 has a camera control unit 31, an image processing unit 32, a control unit 33, a lighting control unit 34, and a memory unit 35.
  • the camera control unit 31 is mounted, for example, on the handle unit 10b of the intraoral camera 10, and controls the image capture unit 21.
  • the camera control unit 31 controls at least one of the aperture and the shutter speed of the image capture unit 21 in response to a control signal from the image processing unit 32, for example.
  • the image processing unit 32 is mounted, for example, on the handle unit 10b of the intraoral camera 10, acquires the RGB image (image data) captured by the imaging unit 21, performs image processing on the acquired RGB image, and outputs the RGB image after the image processing to the camera control unit 31 and the control unit 33.
  • the image processing unit 32 may also output the RGB image after the image processing to the memory unit 35, and store the RGB image after the image processing in the memory unit 35.
  • the image processing unit 32 is, for example, composed of a circuit, and performs image processing such as noise removal and edge enhancement on an RGB image. Note that noise removal and edge enhancement may be performed by the mobile terminal 50.
  • the RGB image output from the image processing unit 32 may be transmitted to the mobile terminal 50 via the communication unit 40, and an image based on the transmitted RGB image (for example, a third dentition image described below) may be displayed on the display unit 56 of the mobile terminal 50. This makes it possible to present an image based on the RGB image to the user.
  • the control unit 33 is a control device that controls the signal processing unit 30.
  • the control unit 33 controls each component of the signal processing unit 30 based on, for example, the detection results of external light, etc. by the sensor unit 22.
  • the lighting control unit 34 is mounted, for example, on the handle portion 10b of the intraoral camera 10, and controls the turning on and off of the first to fourth LEDs 23A to 23D.
  • the lighting control unit 34 is composed of, for example, a circuit. For example, when a user performs an operation on the display unit 56 of the mobile terminal 50 to start up the intraoral camera 10, a corresponding signal is sent from the mobile terminal 50 to the signal processing unit 30 via the communication unit 40.
  • the lighting control unit 34 of the signal processing unit 30 turns on the first to fourth LEDs 23A to 23D based on the received signal.
  • the memory unit 35 stores RGB images (image data) captured by the image capture unit 21.
  • the memory unit 35 is realized by, for example, semiconductor memory such as ROM and RAM, but is not limited to this.
  • the communication unit 40 is a wireless communication module for wirelessly communicating with the mobile terminal 50.
  • the communication unit 40 is mounted, for example, on the handle portion 10b of the intraoral camera 10, and performs wireless communication with the mobile terminal 50 based on a control signal from the signal processing unit 30.
  • the communication unit 40 executes wireless communication with the mobile terminal 50 that complies with existing communication standards such as WiFi (registered trademark) and Bluetooth (registered trademark). Via the communication unit 40, an RGB image is transmitted from the intraoral camera 10 to the mobile terminal 50, and an operation signal is transmitted from the mobile terminal 50 to the intraoral camera 10.
  • the mobile terminal 50 displays the plaque area in the second dentition image including two or more teeth based on an RGB image of the dentition surface and plaque that are fluorescently reacting when light including a wavelength range of blue light is irradiated onto the teeth.
  • the mobile terminal 50 also functions as a user interface for the dentition image generation system.
  • the mobile terminal 50 is an example of a dentition image generation device.
  • FIG. 3 is a block diagram showing the functional configuration of the mobile terminal 50 according to this embodiment.
  • the mobile terminal 50 includes an acquisition unit 51, a processing unit 52, a synthesis unit 53, a detection unit 54, a generation unit 55, and a display unit 56.
  • the mobile terminal 50 includes a processor and a memory.
  • the memory is a ROM, a RAM, or the like, and can store a program executed by the processor.
  • the acquisition unit 51, the processing unit 52, the synthesis unit 53, the detection unit 54, and the generation unit 55 are realized by a processor that executes a program stored in the memory.
  • the mobile terminal 50 may be realized, for example, by a smartphone or tablet terminal capable of wireless communication.
  • the acquisition unit 51 acquires RGB images from the intraoral camera 10. Specifically, the acquisition unit 51 acquires images (image sequence) of multiple teeth generated by the imaging unit 21.
  • the RGB images are images obtained by the intraoral camera 10 photographing teeth that are undergoing a fluorescent reaction by irradiating the teeth with light that includes the wavelength range of blue light.
  • the acquisition unit 51 is configured to include, for example, a wireless communication module that performs wireless communication.
  • the RGB image acquired by the acquisition unit 51 has a bluish tint overall. This is because when light including a wavelength range of blue light is irradiated onto teeth to detect plaque, the illumination unit 23 strengthens the light including a wavelength range of blue light in order to strengthen the excited fluorescence of plaque. As a result, the blue pixel value (B) becomes dominant compared to the red pixel value (R) and the green pixel value (G). In other words, a color cast occurs in the RGB image acquired by the acquisition unit 51. In this state, it is difficult to present the plaque adhesion state to the user in an easy-to-understand manner. Therefore, the mobile terminal 50 performs a process to present the plaque adhesion state to the user in an easy-to-understand manner by performing a predetermined image processing on the RGB image acquired by the acquisition unit 51 as shown below.
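The blue color cast described above (the B pixel value dominating R and G under strong blue illumination) can be illustrated with a minimal check on channel means. This is a sketch: the pixel-list representation and the 1.2 dominance margin are illustrative choices, not values from the disclosure.

```python
def channel_means(pixels):
    """Average R, G and B over a list of (r, g, b) pixel tuples."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def has_blue_cast(pixels, margin=1.2):
    """True when the blue mean exceeds both the red and the green mean
    by the factor `margin` (threshold chosen for illustration)."""
    r, g, b = channel_means(pixels)
    return b > margin * r and b > margin * g

# Toy frame under strong blue (405 nm) illumination: B dominates R and G.
frame = [(40, 60, 180), (50, 70, 200), (45, 65, 190)]
print(has_blue_cast(frame))              # → True
print(has_blue_cast([(120, 120, 120)]))  # neutral gray frame → False
```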
  • the processing unit 52 generates a plurality of first dentition block images, which are partial dentition images, from the RGB image (image sequence) generated by the imaging unit 21, and performs image processing on an image based on the generated plurality of first dentition block images.
  • the processing unit 52 performs image processing on each of the plurality of first dentition block images and each of the first dentition image or second dentition image obtained by combining the plurality of first dentition block images.
  • the first dentition block image is an image obtained by cutting out a specified range including the center of the angle of view from an image captured at a certain shooting position.
  • Each of the multiple first dentition block images is an image based on an image captured at a different shooting position (or shooting direction).
  • each of the multiple first dentition block images is an image including at least one tooth or space between teeth.
  • the first dentition block image may be, for example, an image of at least one tooth or space taken from the front.
  • the first dentition block images may be the same or different in size (image size).
  • at least a portion of the dental area shown in that first dentition block image overlaps with a portion of the dental area shown in at least one other first dentition block image.
  • the RGB image may be the first dentition block image itself, or the first dentition block image may be generated by dividing the RGB image so that some of the images overlap each other.
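The overlap requirement above can be sketched as a simple 1-D windowing scheme over image columns; the block width and overlap values here are illustrative, since the disclosure does not fix exact sizes.

```python
def block_ranges(width, block_w, overlap):
    """Column ranges (start, end) covering [0, width), where adjacent
    blocks share at least `overlap` columns so that the synthesis step
    can align them on the shared dental area."""
    step = block_w - overlap
    ranges = []
    start = 0
    while start + block_w < width:
        ranges.append((start, start + block_w))
        start += step
    # Final block is snapped to the right edge, which may enlarge overlap.
    ranges.append((max(0, width - block_w), width))
    return ranges

print(block_ranges(100, 30, 10))
# → [(0, 30), (20, 50), (40, 70), (60, 90), (70, 100)]
```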
  • the processing unit 52 performs an exposure correction process on the multiple first dentition block images as image processing before composition by the composition unit 53.
  • the exposure correction process adjusts the blue level of each tooth region by multiplying the R, G, and B components by equal gains.
  • the processing unit 52 adjusts the blue level so that the blue levels of the tooth regions of each of the multiple first dentition block images approach (for example, match).
  • the processing unit 52 adjusts the color of the teeth in the other first dentition block images so that their color is unified with that of the tooth (reference tooth) in the first dentition block image located at the center when the multiple first dentition block images are arranged in dentition order.
  • the reference tooth may be, for example, a tooth located in the center of the angle of view in the RGB image captured by the imaging unit 21.
  • although both ends of the angle of view may be unevenly irradiated with light from the illumination unit 23, adjusting the blue level makes it possible to reproduce the color of a tooth illuminated in the same way as the central tooth.
  • the blue level adjustment may be performed only on the tooth region, or on the entire first dentition block image (e.g., the region including the teeth and gums).
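As an illustrative sketch of the blue level adjustment described above (the function name and array conventions are assumptions, not taken from the disclosure), a single gain applied equally to the R, G, and B components can bring the mean blue value of a block's tooth region toward that of a reference block without changing hue:

```python
import numpy as np

def adjust_blue_level(block, tooth_mask, reference_blue):
    """Scale an RGB dentition block image by one common gain so that the
    mean blue value of its tooth region approaches a reference value.

    block: HxWx3 float array in [0, 1]; tooth_mask: HxW bool array.
    Multiplying R, G, and B by the same gain shifts the blue level
    toward the reference while preserving the hue of each pixel.
    """
    current_blue = block[..., 2][tooth_mask].mean()
    gain = reference_blue / current_blue
    return np.clip(block * gain, 0.0, 1.0)

# Two blocks lit unevenly; align the darker block to the central one.
block_center = np.full((4, 4, 3), 0.8)   # reference (central) block
block_side = np.full((4, 4, 3), 0.4)     # under-illuminated block at the edge
mask = np.ones((4, 4), dtype=bool)       # every pixel treated as tooth region
ref_blue = block_center[..., 2][mask].mean()
block_side_adj = adjust_blue_level(block_side, mask, ref_blue)
```

After this adjustment, the tooth regions of all blocks share approximately the same blue level, which is what later allows the WB processing to use a single common gain.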
  • the processing unit 52 also performs WB (White Balance) processing on the dentition image (first dentition image or second dentition image) as image processing after synthesis by the synthesis unit 53.
  • WB processing is processing to adjust the color balance in the image by multiplying each of the R, G, and B components by a different gain (white balance gain).
  • WB processing is processing to adjust the gain of at least two color components of the red, green, and blue components of the image to be processed so that, for example, the red pixel average value of multiple red pixel values of multiple pixels constituting the tooth region of the first dentition block image to be processed, the green pixel average value of multiple green pixel values of the multiple pixels, and the blue pixel average value of multiple blue pixel values of the multiple pixels become closer (for example, equal).
  • the processing unit 52 performs WB processing on the dentition image, assuming that the tooth region shown in the dentition image is a white region.
  • the processing unit 52 performs WB processing on the dentition image based on the color (color information) of the tooth region shown in the dentition image.
  • the processing unit 52 may perform WB processing based on the color (color information) of an arbitrary tooth (e.g., a tooth shown in the center of the dentition image) or on a statistical value (e.g., average value, median, etc.) of the color (color information) of the tooth region.
  • the processing unit 52 may perform WB processing by multiplying the entire tooth region by a gain calculated based on the arbitrary tooth or the statistical value.
  • the blue level of the composite dentition image can be made uniform, so that the tooth region of the dentition image can be white-balanced collectively.
  • here, "collectively" means that the gain used for the WB processing is common to the entire tooth region.
  • the processing unit 52 need only perform at least the blue level adjustment and the WB processing.
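A minimal sketch of the WB processing described above, assuming the tooth region is treated as a white (achromatic) reference. The function name and the choice of anchoring the gains on the green channel are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def white_balance_on_teeth(image, tooth_mask):
    """White-balance an RGB dentition image, treating the tooth region as white.

    Per-channel gains are chosen so that the R, G, and B averages of the
    tooth region all match the green average (the green gain is 1), and
    the same gains are applied to the whole image ("collectively").
    """
    means = image[tooth_mask].mean(axis=0)   # per-channel tooth-region averages
    gains = means[1] / means                 # anchor the gains on green
    return np.clip(image * gains, 0.0, 1.0), gains

# A blue-cast image: teeth look bluish under blue illumination.
img = np.zeros((4, 4, 3))
img[...] = (0.5, 0.6, 0.9)                   # R, G, B of every pixel
mask = np.ones((4, 4), dtype=bool)
balanced, gains = white_balance_on_teeth(img, mask)
```

Because every pixel is multiplied by the same three gains, the blue cast is removed from the teeth and gums alike, matching the "collective" processing described above.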
  • the tooth area shown in the dentition image may be the area of the natural tooth in the teeth shown in the dentition image.
  • the processing unit 52 may perform WB processing assuming that the natural tooth area is a white area (for example, based on the color (color information) of the natural tooth area shown in the dentition image).
  • the natural tooth area may be the area of the natural tooth in one tooth shown in the dentition image, the area of the natural tooth in a specific tooth, or the area of the natural tooth in multiple teeth.
  • a statistical value of the color (e.g., chromaticity) of the natural tooth area of each of the multiple teeth may be used in the WB processing.
  • the statistical value is, for example, the average value, but may also be a maximum value, minimum value, mode, median, etc.
  • it is known that when excitation light is applied to natural teeth, excitation fluorescence is emitted from the dentin; this fluorescence penetrates the enamel and appears green. It is also known that fillings in caries treatment scars (e.g., metal inlays) do not emit excitation fluorescence under blue LED light and are captured darkly (at low brightness) by the camera. As described above, plaque (plaque areas) is also known to emit reddish-pink excitation fluorescence when irradiated with blue light. For these reasons, the processing unit 52 can detect natural teeth, excluding caries treatment scars and plaque, from the dentition image.
  • the processing unit 52 may further identify the type of tooth (e.g., a specific tooth) included in the image from the first dentition block image, the first dentition image, or the dentition image (the second dentition image or the third dentition image). Identifying the type of tooth may mean identifying whether the tooth is an incisor, a canine, or a molar, or whether the tooth is a central incisor, a lateral incisor, a canine, a first premolar, a second premolar, a first molar, a second molar, or a third molar (wisdom tooth).
  • the processing unit 52 may also identify in which region of the oral cavity (upper jaw, lower jaw, left or right) the tooth is located.
  • the method by which the processing unit 52 identifies the type of tooth is not particularly limited, and may be, for example, a method using a machine learning model, a method using pattern matching, or any other known method.
  • the machine learning model is a learning model that is trained to output the type of tooth shown in an image containing teeth when the image is input.
  • an image showing each tooth of a standard shape may be used as a reference, or an image of the teeth in the user's mouth captured in advance may be used as a reference.
  • the processing unit 52 may also present the user with information identifying the teeth to be photographed before photographing and determine that the teeth appearing in an image acquired after the information is presented are the teeth to be photographed; alternatively, it may have the user input which teeth are included in the acquired image and determine that the input type of tooth is shown in the image.
  • the synthesis unit 53 synthesizes a plurality of first tooth row block images whose blue color levels have been adjusted by the processing unit 52 to generate a single tooth row image (panoramic image) in which a plurality of teeth are lined up.
  • the synthesis unit 53 uses a stitching process to synthesize a plurality of tooth row block images whose blue color levels have been adjusted, but the synthesis method is not limited to this.
  • the stitching process may be performed, for example, using the contours of the teeth.
  • the stitching process here is a process of combining multiple images (here, multiple first dentition block images) having overlapping areas to generate one or more dentition images.
  • the multiple first dentition block images are first arranged on a two-dimensional plane so that their overlapping portions line up. Then, the multiple first dentition block images are scaled, positioned, and oriented so that at least some of the feature points and surface points appear at the same location on the two-dimensional grid. In this manner, the multiple first dentition block images are registered to one another. That is, the multiple first dentition block images are aligned to represent a series of adjacent teeth, with the positions of the teeth matched between overlapping images.
  • feature points may be provided on the contour of the tooth region.
  • the location of a surface point can be calculated by measuring the distance between the focal points of the two viewpoints at which the two first dentition block images including the overlapping portion were captured, extracting the angle of the optical axis at each viewpoint, and triangulating a common position in the images.
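The surface-point calculation in the last bullet is ordinary two-view triangulation. The sketch below (illustrative only; not code from the disclosure) locates a point from the baseline between the two focal points and the ray angle measured at each viewpoint:

```python
import math

def triangulate_point(baseline, angle_a, angle_b):
    """Locate a surface point observed from two viewpoints on a baseline.

    baseline: distance between the two camera focal points.
    angle_a, angle_b: angles (radians) between the baseline and the ray
    to the point at each viewpoint. Returns (x, y) with viewpoint A at
    the origin and viewpoint B at (baseline, 0).
    """
    # Third angle of the triangle, then the law of sines for the range from A.
    angle_c = math.pi - angle_a - angle_b
    range_a = baseline * math.sin(angle_b) / math.sin(angle_c)
    return (range_a * math.cos(angle_a), range_a * math.sin(angle_a))

# Symmetric 45-degree rays from a baseline of length 2.0 meet at (1.0, 1.0).
x, y = triangulate_point(2.0, math.pi / 4, math.pi / 4)
```

In the stitching process, such surface points serve as common anchors at which the overlapping first dentition block images are registered.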
  • the synthesis unit 53 generates a second teeth row image P20 (see FIG. 5 described later) showing the left molar to the right molar by synthesis; however, this is not limiting, and it is sufficient to generate a second teeth row image P20 that includes only some teeth, such as two or more teeth.
  • the synthesis unit 53 may include information regarding the type or position within the oral cavity of those some teeth in the second teeth row image P20. This makes it possible to notify the user which area within the oral cavity the second teeth row image P20 corresponds to.
  • the type of tooth and its position within the oral cavity (e.g., upper jaw, lower jaw, left or right area) are identified by the processing unit 52.
  • the detection unit 54 detects plaque (plaque regions) on the image on which WB processing has been performed (an example of an image based on a captured image).
  • the detection unit 54 detects plaque based on color information of the tooth region in the dentition image generated by the synthesis unit 53.
  • the color information includes brightness V, hue H, and saturation S.
  • the detection unit 54 detects plaque, for example, based on brightness V.
  • the detection unit 54 detects, for example, a region where brightness V is equal to or greater than a threshold value as a plaque region.
  • the detection unit 54 may detect plaque (a plaque region) appearing in the dentition image by inputting the WB-processed dentition image into a machine learning model trained to output the plaque (plaque region) appearing in an input dentition image.
  • the detection unit 54 may detect plaque before the compositing unit 53 performs the compositing, or after the compositing unit 53 performs the compositing and before the processing unit 52 performs the WB processing. In other words, the detection unit 54 may detect plaque using a color-cast image (an example of an image based on a captured image).
  • the generating unit 55 is a processing unit for generating a dentition image (for example, a third dentition image described below) in which the plaque region detected by the detecting unit 54 is highlighted.
  • the generating unit 55 highlights the detected plaque region on the dentition image generated by the synthesizing unit 53.
  • the generating unit 55 for example, superimposes a highlight on the plaque region of the dentition image.
  • the generation unit 55 may superimpose a highlight on the plaque area of the third dentition image after the WB processing.
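A hedged sketch of the highlight superimposition performed by the generation unit (array shapes, the highlight color, and the opacity are assumptions for illustration):

```python
import numpy as np

def highlight_plaque(dentition_image, plaque_mask, color=(1.0, 0.0, 0.5), alpha=0.5):
    """Superimpose a semi-transparent highlight on the plaque region.

    dentition_image: HxWx3 float RGB in [0, 1]; plaque_mask: HxW bool.
    Pixels inside the mask are blended toward the highlight color;
    all other pixels are left untouched.
    """
    out = dentition_image.copy()
    out[plaque_mask] = (1 - alpha) * out[plaque_mask] + alpha * np.asarray(color)
    return out

img = np.full((2, 2, 3), 0.8)                       # uniform tooth-colored image
mask = np.array([[True, False], [False, False]])    # one plaque pixel
shown = highlight_plaque(img, mask)
```

The same routine could be reused for the shade display by varying `alpha` per pixel according to the detected accumulation level.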
  • the display unit 56 is a display device included in the mobile terminal 50, and displays the image generated by the generation unit 55.
  • the display unit 56 may be realized, for example, by a liquid crystal display panel.
  • the detection unit 54 and the generation unit 55 may function as a saturation emphasis processing unit that performs saturation emphasis processing.
  • the detection unit 54 may generate a converted image by converting the dentition image (e.g., the third dentition image) that has been synthesized by the synthesis unit 53 and has been subjected to WB processing into an HSV image, and may identify a specific pixel area in which one or more pixels that satisfy at least one of the following conditions are located: saturation S is within a first predetermined range (e.g., 30 to 80 in 8-bit representation), hue H is within a second predetermined range (e.g., 140 to 170 in 8-bit representation), and brightness V is within a third predetermined range (e.g., 100 to 180 in 8-bit representation).
  • the generation unit 55 may also generate a dentition image in which saturation S is emphasized by performing saturation emphasis processing on a specific pixel area in the dentition image (e.g., the third dentition image).
  • the HSV image is generated, for example, by converting the color space of the dentition image into an HSV space.
  • the display unit 56 may also display a row of teeth image that has been subjected to saturation enhancement processing.
  • the first, second, and third predetermined ranges can be determined by comparing actual plaque and tooth areas with the HSV image, and are not limited to the above numerical ranges.
  • the ranges of values for brightness V, hue H, and saturation S can be determined by applying a plaque-disclosing stain and comparing the degree of staining it produces.
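As an illustrative sketch (not from the disclosure), the specific pixel area can be identified by thresholding an 8-bit HSV image with the example ranges given above. Note that this sketch requires all three conditions jointly, a stricter variant of the "at least one of" criterion stated in the text:

```python
import numpy as np

def specific_pixel_region(hsv_image):
    """Identify candidate plaque pixels in an 8-bit HSV image.

    hsv_image: HxWx3 uint8 array in (H, S, V) channel order. A pixel is
    selected when its saturation S, hue H, and brightness V all fall
    inside the example ranges quoted above (S 30-80, H 140-170,
    V 100-180); the ranges are illustrative and would in practice be
    tuned against plaque stained with a disclosing agent.
    """
    h, s, v = hsv_image[..., 0], hsv_image[..., 1], hsv_image[..., 2]
    return ((s >= 30) & (s <= 80) &
            (h >= 140) & (h <= 170) &
            (v >= 100) & (v <= 180))

hsv = np.zeros((2, 2, 3), dtype=np.uint8)
hsv[0, 0] = (150, 50, 120)   # inside all three ranges: plaque candidate
hsv[0, 1] = (150, 50, 250)   # brightness out of range: excluded
region = specific_pixel_region(hsv)
```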
  • the detection unit 54 and the generation unit 55 may function as a shade display processing unit that performs shade image processing according to the concentration distribution (accumulation level distribution) of the fluorescent substance.
  • the detection unit 54 may detect the concentration distribution of the fluorescent substance accumulated in the plaque from the value of the brightness V in a specific pixel region of the converted image.
  • the generation unit 55 may generate a dentition image (e.g., a third dentition image) including a shade display by performing shade image processing according to the concentration distribution of the fluorescent substance detected for a specific pixel region in the dentition image.
  • when irradiated with blue light, porphyrins in the plaque are excited and generate red fluorescence. The intensity of this fluorescence does not reflect the current bacterial flora but is thought to indicate the accumulation of fluorescent substances (porphyrins): the more fluorescent substances accumulate, the stronger the red fluorescence becomes. The level of porphyrin accumulation also increases as plaque matures. Therefore, the fluorescence intensity of mature plaque is stronger than that of young plaque.
  • the detection unit 54 detects the accumulation level (concentration or density) of fluorescent substances by comparing the intensity of red fluorescence per unit area of the plaque region.
  • a plaque region may be extracted from one or more pixels in the HSV image that satisfy at least one of the following: saturation S within a first predetermined range, hue H within a second predetermined range, and brightness V within a third predetermined range.
  • since the fluorescent wavelength of porphyrin (a fluorescent substance in dental plaque) lies in the red range, the brightness V is determined mainly by the value of R, regardless of the hue H and saturation S.
  • the concentration of porphyrin accumulated in the plaque area can be evaluated by detecting the brightness V value of the HSV image of the plaque area.
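A minimal sketch of mapping brightness V in the plaque region to a shade level: higher V means stronger red fluorescence and hence more accumulated porphyrin. The normalization scheme below is an assumption for illustration, not taken from the disclosure:

```python
import numpy as np

def shade_from_brightness(v_channel, plaque_mask):
    """Map brightness V in the plaque region to a shade level in [0, 1].

    v_channel: HxW uint8 V channel of the HSV image; plaque_mask: HxW bool.
    The V values inside the plaque region are min-max normalized, so the
    most mature plaque (strongest red fluorescence) gets shade 1.0 and
    the youngest gets 0.0; pixels outside the region stay at 0.
    """
    shade = np.zeros(v_channel.shape, dtype=np.float64)
    vals = v_channel[plaque_mask].astype(np.float64)
    if vals.size:
        lo, hi = vals.min(), vals.max()
        scale = (vals - lo) / (hi - lo) if hi > lo else np.ones_like(vals)
        shade[plaque_mask] = scale
    return shade

v = np.array([[100, 180], [140, 0]], dtype=np.uint8)
mask = np.array([[True, True], [True, False]])
shade = shade_from_brightness(v, mask)
```

The resulting shade map can be superimposed on the third dentition image to visualize the concentration distribution of the fluorescent substance.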
  • Fig. 4 is a sequence diagram showing the operation (dentition image generating method) of the dentition image generating system according to this embodiment.
  • the process shown in Fig. 4 is executed by a mobile terminal 50.
  • the process shown in Fig. 4 is, for example, a process performed in real time, and is performed every time one frame or multiple frames of image data are obtained.
  • Fig. 5 is a diagram for explaining the process of the dentition image generating system according to this embodiment. Note that in Fig. 5, color-cast images are shown with diagonal hatching.
  • image data is generated when a user uses the intraoral camera 10 to capture an image of the teeth and gums in his or her oral cavity (S101).
  • this image data is obtained, for example, by irradiating the teeth with light that includes the wavelength range of blue light and capturing an image of the teeth fluorescing in response.
  • the communication unit 40 of the intraoral camera 10 transmits the captured image data to the mobile terminal 50, and the acquisition unit 51 of the mobile terminal 50 acquires the image data (S102).
  • the image data may be a video or one or more still images. When the image data is a video or multiple still images, it may be transmitted for each frame of the video or for each still image; when the image data is a video, it may also be transmitted every multiple frames.
  • image data may be transmitted in real time, or may be transmitted all at once after a series of photographs (e.g., photographs of all teeth in the oral cavity) have been taken.
  • the processing unit 52 of the mobile terminal 50 adjusts the blue level of the received image data (S103).
  • the processing unit 52 may multiply each of the R, G, and B components by different gains so that the blue levels of the tooth regions of each of the multiple first dentition block images, which are partial dentition images based on the image data, become closer to each other.
  • Each of the multiple first dentition block images P1 to P5 shown in FIG. 5(a) is an image on which no WB processing has been performed. Also, each of the multiple first dentition block images P1 to P5 is an image on which the blue color level has been adjusted.
  • the synthesis unit 53 of the mobile terminal 50 synthesizes the multiple first dentition block images P1 to P5 for which no WB processing has been performed and for which the blue color level has been adjusted (S104).
  • the synthesis unit 53 performs a stitching process on the first dentition block images P1, P2, and P3 to generate a first dentition image P11 including the first dentition block images P1 and P2 and at least a portion of the first dentition block image P3.
  • the synthesis unit 53 also performs a stitching process on the first dentition block images P3, P4, and P5 to generate a first dentition image P12 including at least a portion of the first dentition block image P3 and the first dentition block images P4 and P5.
  • the first dentition images P11 and P12 are images in which at least a portion of the dental region overlaps. In the example of FIG. 5, at least a portion of the dental region of the first dentition block image P3 overlaps in the first dentition images P11 and P12.
  • the synthesis unit 53 also performs stitching processing so that the first teeth row image includes the central part of the angle of view of each of the first teeth row block images to be synthesized.
  • for example, the synthesis unit 53 performs stitching processing so that the first teeth row image P11 includes the central part of the angle of view of each of the first teeth row block images P1 to P3.
  • the first teeth row image P11 thus generated is an image in which light hits each tooth from the front, making it less likely to create shadows and making it easier to see the spaces between the teeth.
  • the first teeth row block image P3 may be an image including the two lower front teeth, for example, an image including the space between the two front teeth.
  • the first teeth row image P11 may be, for example, a panoramic image showing the area from the left back tooth to the front teeth
  • the first teeth row image P12 may be, for example, a panoramic image showing the area from the right back tooth to the front teeth.
  • the image including the space between the front teeth may be, for example, an image taken after an announcement such as "Please take a picture of your front teeth" is made when taking a picture with the imaging unit 21, or it may be an image in which the user has input that it is an image of the front teeth.
  • the number of first dentition images generated by the synthesis unit 53 is not particularly limited, and may be three or more.
  • the synthesis unit 53 generates a second teeth row image P20 by performing a stitching process on the two first teeth row images P11 and P12.
  • the second teeth row image P20 is a panoramic image in which multiple first teeth row block images P1 to P5 are synthesized.
  • the second teeth row image P20 is a panoramic image that shows at least a portion of the teeth row in the user's oral cavity, and may be, for example, a panoramic image that shows the user's left molar to right molar.
  • the second teeth row image P20 is an example of a teeth row image after synthesis.
  • the synthesis unit 53 may generate the second dentition image P20 directly based on the multiple first dentition block images P1 to P5. In other words, the first dentition images P11 and P12 do not need to be generated.
  • the processing unit 52 of the mobile terminal 50 performs image processing on the second dentition image P20 synthesized by the synthesis unit 53 (S105).
  • the processing unit 52 performs at least WB processing on the second dentition image P20.
  • the processing unit 52 performs WB processing on the second dentition image P20 based on the color information of the tooth region of the second dentition image P20 to generate a third dentition image P30.
  • the processing unit 52 may perform WB processing based on the color information of the central tooth in the second dentition image P20 (e.g., the tooth shown in the first dentition block image P3) as a reference.
  • the processing unit 52 may perform WB processing based on, for example, the color information of only the central tooth in the second dentition image P20 (e.g., the tooth shown in the first dentition block image P3).
  • the third teeth row image P30 is composed of a plurality of second teeth row block images P31 to P35.
  • the second teeth row block images P31 to P35 correspond to images obtained by performing white balance processing on the first teeth row block images P1 to P5, respectively.
  • the detection unit 54 of the mobile device 50 detects plaque in the third dentition image P30 (S106).
  • the detection unit 54 detects the presence or absence of plaque, but may also detect, for example, the concentration distribution of fluorescent substances, i.e., the accumulation level of plaque.
  • the generating unit 55 of the mobile device 50 generates an image in which the plaque region detected by the detecting unit 54 is highlighted (S107).
  • the generating unit 55 generates an image in which a highlight display indicating the plaque region is superimposed on the third dentition image P30, but it may also generate an image in which a shading display according to the concentration distribution of the fluorescent substance is superimposed on the third dentition image P30.
  • the display unit 56 of the mobile device 50 displays the image generated by the generation unit 55 (S108).
  • a user can take an image of the user's own oral cavity with the intraoral camera 10 and check the condition of the oral cavity in a panoramic image displayed on the mobile terminal 50. Furthermore, the concentration distribution of the fluorescent material is shown in the displayed panoramic image, allowing the user to easily check the health condition of their own teeth.
  • the mobile device 50 also performs white balance processing, so that it can generate a third dentition image P30 in which the plaque area is superimposed (e.g., highlighted) regardless of the color of light emitted by the illumination unit 23 of the intraoral camera 10.
  • the mobile terminal 50 may also generate a three-dimensional model of multiple teeth in the oral cavity from multiple captured image data.
  • the mobile terminal 50 may also display an image based on the generated three-dimensional model.
  • the detection unit 54 may detect plaque on the first dentition images P11 and P12.
  • in the above, the mobile terminal 50 processes the images of teeth, but some or all of this processing may be performed by the intraoral camera 10.
  • an intraoral camera 10 is used whose main purpose is to photograph teeth, but the intraoral camera 10 may also be an oral care device equipped with a camera.
  • the intraoral camera 10 may also be an oral irrigator equipped with a camera.
  • a mobile terminal 50 is given as an example of a user's information terminal, but the information terminal may be a stationary information terminal.
  • the multiple first dentition block images may be images captured after an announcement of the tooth position or tooth type, etc., is made when capturing images with the imaging unit 21, or may be images in which the position of the tooth is input by a user, etc.
  • each processing unit included in the dentition image generating system is typically realized as an LSI (an integrated circuit). The processing units may be implemented as individual chips, or a single chip may integrate some or all of them.
  • the integrated circuit is not limited to LSI, but may be realized by a dedicated circuit or a general-purpose processor. It is also possible to use an FPGA (Field Programmable Gate Array) that can be programmed after the LSI is manufactured, or a reconfigurable processor that can reconfigure the connections and settings of the circuit cells inside the LSI.
  • each component may be configured with dedicated hardware, or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or processor reading and executing a software program recorded on a recording medium such as a hard disk or semiconductor memory.
  • the division of functional blocks in the block diagram is one example, and multiple functional blocks may be realized as one functional block, one functional block may be divided into multiple blocks, or some functions may be transferred to other functional blocks. Furthermore, the functions of multiple functional blocks having similar functions may be processed in parallel or in a time-shared manner by a single piece of hardware or software.
  • the mobile terminal 50 may be realized as a single device, or may be realized by multiple devices.
  • the components of the mobile terminal 50 may be distributed in any manner among the multiple devices.
  • at least some of the functions of the mobile terminal 50 may be realized by the intraoral camera 10 (e.g., the signal processing unit 30).
  • the communication method between the multiple devices is not particularly limited, and may be wireless communication or wired communication. Furthermore, wireless communication and wired communication may be combined between the devices.
  • the present disclosure may also be realized as a dentition image generation method executed by a dentition image generation system.
  • the present disclosure may also be realized as an intraoral camera, a mobile terminal, or a cloud server included in the dentition image generation system.
  • the order in which the steps in the sequence diagram are executed is merely an example given to specifically explain the present disclosure, and an order other than the above may also be used.
  • some of the steps may be executed simultaneously (in parallel) with other steps.
  • Another aspect of the present disclosure may be a computer program that causes a computer to execute each of the characteristic steps included in the dentition image generation method shown in FIG. 4.
  • the program may be a program to be executed by a computer.
  • one aspect of the present disclosure may be a non-transitory computer-readable recording medium on which such a program is recorded.
  • such a program may be recorded on a recording medium and distributed or circulated.
  • the distributed program may be installed in a device having another processor, and the program may be executed by that processor, thereby making it possible to cause that device to perform each of the above processes.
  • This disclosure can be applied to a dentition image generation system.


Abstract

A tooth row image generation device comprises: an acquisition unit (51) that acquires a captured image obtained by imaging the surface of a tooth row and dental plaque within an oral cavity being irradiated with light including a wavelength region of blue light; a combination unit (53) that generates a tooth row image obtained by combining a plurality of tooth row block images, which are partial tooth row images based on the captured image; and a processing unit (52) that performs a white balancing process on the combined tooth row image.

Description

DENTAL ROW IMAGE GENERATION DEVICE, DENTAL ROW IMAGE GENERATION METHOD, AND PROGRAM

The present disclosure relates to a dentition image generating device, a dentition image generating method, and a program.

Patent Document 1 discloses a device that has an excitation light emitting unit that emits excitation light to cause plaque, lesions, and the like to emit fluorescence, and an illumination light emitting unit that emits illumination light to illuminate the periphery of the plaque, lesions, and the like, the irradiation means being capable of irradiating the illumination light and the excitation light simultaneously. Patent Document 1 also discloses that, in order to improve the visibility of plaque, lesions, and the like, the amount of light emitted from the excitation light emitting unit is greater than the amount of light emitted from the illumination light emitting unit.

International Publication No. WO 2005/104926

Incidentally, in a dentition image generating device that generates dentition images, it is desirable that the generated dentition image look less unnatural.

The present disclosure therefore provides a dentition image generating device, a dentition image generating method, and a program capable of generating a dentition image with reduced unnaturalness.

A dentition image generating device according to one aspect of the present disclosure includes: an acquisition unit that acquires a captured image obtained by photographing the surface of a dentition and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light; a synthesis unit that generates a dentition image by synthesizing a plurality of dentition block images, which are partial dentition images based on the captured image; and a processing unit that performs white balance processing on the synthesized dentition image.

A dentition image generating method according to one aspect of the present disclosure acquires a captured image obtained by photographing the surface of a dentition and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light, generates a dentition image by synthesizing a plurality of dentition block images, which are partial dentition images based on the captured image, and performs white balance processing on the synthesized dentition image.

A program according to one aspect of the present disclosure is a program for causing a computer to execute the above dentition image generating method.

 本開示の一態様によれば、違和感が低減された歯列画像を生成することが可能な歯列画像生成装置等を実現することができる。 According to one aspect of the present disclosure, it is possible to realize a dentition image generating device or the like that is capable of generating a dentition image with reduced discomfort.

図1は、実施の形態に係る歯列画像生成システムにおける口腔内カメラの斜視図である。FIG. 1 is a perspective view of an intraoral camera in a dentition image generating system according to an embodiment.
図2は、実施の形態に係る歯列画像生成システムの概略的構成図である。FIG. 2 is a schematic diagram of a dentition image generating system according to an embodiment.
図3は、実施の形態に係る携帯端末の機能構成を示すブロック図である。FIG. 3 is a block diagram illustrating a functional configuration of the mobile terminal according to the embodiment.
図4は、実施の形態に係る歯列画像生成システムの動作を示すシーケンス図である。FIG. 4 is a sequence diagram showing the operation of the dentition image generating system according to the embodiment.
図5は、実施の形態に係る歯列画像生成システムの処理を説明するための図である。FIG. 5 is a diagram for explaining the processing of the dentition image generating system according to the embodiment.

 (本開示に至った経緯)
 本開示の説明に先立ち、本開示に至った経緯について説明する。
(Background to this disclosure)
Before describing the present disclosure, the background to the development of the present disclosure will be described.

 特許文献1には、励起光の光源として中心波長が365nm、405nm或いは470nmであるLED(Light Emitting Diode)が用いられることが開示されている。このような励起光により患部及び患部周辺を撮影する場合、得られる画像における歯垢がない歯牙領域及び患部周辺の色調が励起光の光色に近い色調となる。そのため、歯垢がない歯牙領域及び患部周辺の色調が実際の色調からかけ離れてしまい、ユーザは、得られる画像に対して違和感を感じることがある。例えば、得られる画像を用いて、患部周辺の正常組織の状況を観察したり、ユーザに対する説明を行う場合には、実際の色調からかけ離れているため、ユーザは違和感を感じることがある。 Patent Document 1 discloses the use of an LED (Light Emitting Diode) with a central wavelength of 365 nm, 405 nm, or 470 nm as the excitation light source. When the affected area and its surroundings are photographed under such excitation light, the color tone of the plaque-free tooth regions and of the area around the affected part in the obtained image approaches the color of the excitation light. As a result, that color tone deviates greatly from the actual color tone, and the user may find the obtained image unnatural. For example, when the obtained image is used to observe the condition of normal tissue around the affected part or to give an explanation to the user, the user may feel uncomfortable because the color tone is far from the actual one.

 また、歯垢、病変部などの視認性を高めるために励起光発光部と照明光発光部とを調整する場合、得られる画像ごとの色調の再現性が低く、ユーザは、得られる画像に対して違和感を感じることがある。 In addition, when adjusting the excitation light emitting unit and the illumination light emitting unit to improve the visibility of plaque, lesions, etc., the reproducibility of the color tone of each image obtained is low, and the user may feel uncomfortable with the obtained image.

 このように、特許文献1の装置では、違和感が低減された歯列画像を生成することに対して改善の余地がある。そこで、本願発明者らは、違和感が低減された歯列画像を生成することが可能な歯列画像生成装置等について鋭意検討を行い、以下に示す歯列画像生成装置等を創案した。 As such, the device of Patent Document 1 has room for improvement in terms of generating a dentition image with reduced discomfort. The inventors of the present application therefore conducted extensive research into dentition image generating devices etc. capable of generating dentition images with reduced discomfort, and have devised the dentition image generating device etc. shown below.

 本開示の第1態様に係る歯列画像生成装置は、青色光の波長域を含む光が照射されている口腔内の歯列の面及び歯垢を撮影することで得られた撮影画像を取得する取得部と、前記撮影画像に基づく部分的な歯列画像である複数の歯列ブロック画像を合成した歯列画像を生成する合成部と、合成後の前記歯列画像に対してホワイトバランス処理を行う処理部と、を備える。 The dentition image generating device according to the first aspect of the present disclosure includes an acquisition unit that acquires a captured image obtained by photographing the surface of the dentition and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light, a synthesis unit that generates a dentition image by synthesizing a plurality of dentition block images, which are partial dentition images based on the captured images, and a processing unit that performs white balance processing on the synthesized dentition image.

 これにより、複数の歯列ブロック画像が合成された歯列画像を適切なゲインでホワイトバランス調整することができる。つまり、合成後の歯列画像における色調を実際の色調に近づけることができる。よって、色調に対する違和感が低減された合成後の歯列画像を生成することができる。 This allows the white balance of a dentition image obtained by synthesizing multiple dentition block images to be adjusted with an appropriate gain. In other words, the color tones in the synthesized dentition image can be brought closer to the actual color tones. This makes it possible to generate a synthesized dentition image with reduced color-tone discomfort.
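 The disclosure does not specify the white-balance algorithm; as a minimal illustrative sketch, a gray-world style gain can be computed from a reference tooth region and applied to the synthesized image. The function below, including its name and the idea of passing a tooth-region mask, is an assumption for illustration, not the disclosed implementation.

```python
import numpy as np

def white_balance(image, tooth_mask):
    """Apply white-balance gains computed from a reference tooth region.

    image:      H x W x 3 float RGB array (values in 0..1)
    tooth_mask: boolean H x W mask of the reference region
                (e.g. the central tooth, or the whole tooth area)
    """
    # Mean colour of the reference region.
    mean_rgb = image[tooth_mask].mean(axis=0)
    # Per-channel gains that make the region's mean a neutral grey.
    gains = mean_rgb.mean() / mean_rgb
    balanced = np.clip(image * gains, 0.0, 1.0)
    return balanced, gains
```

 Under this sketch, passing a mask of only the central tooth would correspond to the second aspect, while a mask of the entire tooth region would give the average gain of the third aspect.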

 また、例えば、第2態様に係る歯列画像生成装置は、第1態様に係る歯列画像生成装置であって、前記処理部は、合成後の前記歯列画像の中央の歯牙の色情報に基づいて前記ホワイトバランス処理を行ってもよい。 Also, for example, the dentition image generating device according to the second aspect may be the dentition image generating device according to the first aspect, and the processing unit may perform the white balance processing based on color information of the central tooth of the dentition image after synthesis.

 これにより、中央の歯牙を適切なゲインでホワイトバランス調整することができる。つまり、合成後の歯列画像において特に中央の歯牙の色調を実際の色調に近づけることができる。よって、例えば、ユーザが注目する領域が中央の歯牙を含む領域である場合、当該注目領域における歯垢領域を効果的に強調可能である。 This allows the white balance of the central tooth to be adjusted with an appropriate gain. In other words, the color tone of the central tooth in the synthesized dentition image can be made closer to the actual color tone. Therefore, for example, if the area that the user focuses on is an area that includes the central tooth, the plaque area in that area can be effectively emphasized.

 また、例えば、第3態様に係る歯列画像生成装置は、第1態様に係る歯列画像生成装置であって、前記処理部は、合成後の前記歯列画像の歯牙領域全体の色情報に基づいて前記ホワイトバランス処理を行ってもよい。 Also, for example, the dentition image generating device according to the third aspect may be the dentition image generating device according to the first aspect, and the processing unit may perform the white balance processing based on color information of the entire dental region of the dentition image after synthesis.

 これにより、合成後の歯列画像全体の歯牙を適切なゲイン(平均的なゲイン)でホワイトバランス調整することができる。つまり、合成後の歯列画像に映る歯牙の色調を全体的に実際の色調に近づけることができる。 This allows the white balance of all the teeth in the combined dentition image to be adjusted with an appropriate gain (average gain). In other words, the overall color tone of the teeth in the combined dentition image can be made closer to the actual color tone.

 また、例えば、第4態様に係る歯列画像生成装置は、第1態様~第3態様のいずれかに係る歯列画像生成装置であって、前記処理部は、前記複数の歯列ブロック画像それぞれの歯牙領域の青色レベルを調整し、前記合成部は、前記青色レベルが調整された前記複数の歯列ブロック画像を合成してもよい。 Also, for example, a dentition image generating device according to a fourth aspect may be a dentition image generating device according to any one of the first to third aspects, in which the processing unit adjusts the blue color level of the dental region of each of the plurality of dentition block images, and the synthesis unit synthesizes the plurality of dentition block images with the blue color level adjusted.

 これにより、複数の歯列ブロック画像それぞれの青色レベルを揃えることができるので、WB処理後の歯列画像における歯牙領域の色(色情報)を揃えることができる。例えば、複数の歯列ブロック画像それぞれの歯牙領域の色がわずかに異なることによりユーザが違和感(白の色が揃っていないことによる違和感)を感じることを抑制することができる。 This allows the blue color levels of the multiple dentition block images to be aligned, so the color (color information) of the tooth regions in the dentition image after white balance (WB) processing can be made uniform. For example, it is possible to prevent the user from feeling discomfort (discomfort caused by the white color not being uniform) due to slight differences in the color of the tooth region in each of the multiple dentition block images.
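 How the per-block blue level is adjusted is not detailed. One plausible sketch, assuming a tooth-region mask is available for each block image, is to scale each block's blue channel so that all tooth regions share a common mean blue level; the function name and the choice of the overall average as the target are illustrative assumptions.

```python
import numpy as np

def equalize_blue_level(blocks, tooth_masks):
    """Scale the blue channel of each dentition block image so that
    every tooth region has the same mean blue level (here the average
    over all blocks is used as the common target).

    blocks:      list of H x W x 3 float RGB arrays (0..1)
    tooth_masks: list of boolean H x W tooth-region masks
    """
    blue_means = [b[m][:, 2].mean() for b, m in zip(blocks, tooth_masks)]
    target = float(np.mean(blue_means))
    adjusted = []
    for block, mean_b in zip(blocks, blue_means):
        out = block.copy()
        out[..., 2] = np.clip(out[..., 2] * (target / mean_b), 0.0, 1.0)
        adjusted.append(out)
    return adjusted
```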

 また、例えば、第5態様に係る歯列画像生成装置は、第1態様~第4態様のいずれかに係る歯列画像生成装置であって、前記撮影画像に基づく画像に対して歯垢の検出を行う検出部と、前記検出部により検出された歯垢領域が強調表示された合成後の前記歯列画像を生成するための生成部とを備えてもよい。 Also, for example, the dentition image generating device according to the fifth aspect may be a dentition image generating device according to any one of the first to fourth aspects, and may include a detection unit that detects plaque in an image based on the captured image, and a generation unit that generates a composite dentition image in which the plaque area detected by the detection unit is highlighted.

 これにより、合成後の歯列画像における歯垢領域を強調表示することができる。例えば、そのような合成後の歯列画像がユーザに提示される場合、ユーザは歯垢領域を容易に確認することができる。 This allows plaque areas to be highlighted in the composite dentition image. For example, when such a composite dentition image is presented to a user, the user can easily identify the plaque areas.

 また、例えば、第6態様に係る歯列画像生成装置は、第5態様に係る歯列画像生成装置であって、前記歯垢領域が強調表示された合成後の前記歯列画像を表示する表示部を備えてもよい。 Furthermore, for example, the dentition image generating device according to the sixth aspect may be the dentition image generating device according to the fifth aspect, and may include a display unit that displays the synthesized dentition image in which the plaque area is highlighted.

 これにより、歯垢領域が強調表示された合成後の歯列画像をユーザに提示することができる。例えば、歯磨き後の歯列の画像を撮影し、歯垢領域を特定することで、磨き残しの領域をユーザに提示することができる。 This allows the user to be presented with a synthesized image of the dentition in which the areas of plaque are highlighted. For example, by taking an image of the dentition after brushing and identifying the areas of plaque, it is possible to present the user with areas that have not been brushed.

 また、例えば、第7態様に係る歯列画像生成装置は、第1態様~第6態様のいずれかに係る歯列画像生成装置であって、前記複数の歯列ブロック画像は、前歯を撮影した画像を含んでもよい。 Also, for example, the dentition image generating device according to the seventh aspect may be a dentition image generating device according to any one of the first to sixth aspects, and the plurality of dentition block images may include images of anterior teeth.

 これにより、右側の歯(例えば、右側の奥歯)から前歯までを撮影した画像と、左側の歯(例えば、左側の奥歯)から前歯までを撮影した画像とを前歯を基準に合成することができる。よって、右側の歯から左側の歯までの合成後の歯列画像を生成することができる。 This allows an image photographed from the right-side teeth (e.g., the right back teeth) to the front teeth and an image photographed from the left-side teeth (e.g., the left back teeth) to the front teeth to be synthesized with the front teeth as a reference. A synthesized dentition image spanning from the right-side teeth to the left-side teeth can thus be generated.
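 The disclosure does not give the synthesis algorithm itself. As a toy sketch, same-height block images can be concatenated left to right with a linearly blended overlap; real stitching would first align the blocks on shared features such as the front teeth. Everything in this function (name, fixed overlap width, linear cross-fade) is an illustrative assumption.

```python
import numpy as np

def stitch_blocks(blocks, overlap):
    """Naively stitch same-height RGB blocks left to right, blending
    a fixed-width overlap with linear weights.

    blocks:  list of H x W x 3 float arrays, in left-to-right order
    overlap: overlap width in pixels between adjacent blocks
    """
    pano = blocks[0]
    for nxt in blocks[1:]:
        left, seam_l = pano[:, :-overlap], pano[:, -overlap:]
        seam_r, right = nxt[:, :overlap], nxt[:, overlap:]
        # Linear cross-fade across the overlap region.
        w = np.linspace(0.0, 1.0, overlap)[None, :, None]
        seam = seam_l * (1.0 - w) + seam_r * w
        pano = np.concatenate([left, seam, right], axis=1)
    return pano
```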

 また、例えば、第8態様に係る歯列画像生成装置は、第1態様~第7態様のいずれかに係る歯列画像生成装置であって、前記歯列の面は、前記歯列の側面を含んでもよい。 Furthermore, for example, the dentition image generating device according to the eighth aspect may be a dentition image generating device according to any one of the first to seventh aspects, and the surface of the dentition may include a side surface of the dentition.

 これにより、色調に対する違和感が低減された歯列画像として、歯列の側面を撮影した合成後の歯列画像を生成することができる。 This makes it possible to generate, as a dentition image with reduced color-tone discomfort, a synthesized dentition image in which the side surfaces of the dentition are photographed.

 また、例えば、第9態様に係る歯列画像生成装置は、第1態様~第8態様のいずれかに係る歯列画像生成装置であって、前記歯列の面は、前記歯列の噛み合わせ面を含んでもよい。 Furthermore, for example, the dentition image generating device according to the ninth aspect may be a dentition image generating device according to any one of the first to eighth aspects, and the surface of the dentition may include an occlusal surface of the dentition.

 これにより、色調に対する違和感が低減された歯列画像として、歯列の噛み合わせ面を撮影した合成後の歯列画像を生成することができる。 This makes it possible to generate a composite image of the occlusal surfaces of the dentition, resulting in a dentition image with reduced discomfort in terms of color tones.

 また、例えば、第10態様に係る歯列画像生成装置は、第1態様~第9態様のいずれかに係る歯列画像生成装置であって、合成後の前記歯列画像をHSV画像に変換することで変換画像を生成し、前記変換画像が有する複数の画素のうち彩度が第1の所定範囲内、色相が第2の所定範囲内、及び、明度が第3の所定範囲内の少なくとも1つを満たす1以上の画素が位置する特定画素領域を特定し、合成後の前記歯列画像における前記特定画素領域に対して彩度強調処理を行うことで彩度が強調された合成後の前記歯列画像を生成する彩度強調処理部を備えてもよい。 Also, for example, the dentition image generating device according to the tenth aspect may be a dentition image generating device according to any one of the first to ninth aspects, and may include a saturation enhancement processing unit that generates a converted image by converting the synthesized dentition image into an HSV image, identifies a specific pixel region in which are located one or more pixels of the converted image satisfying at least one of a saturation within a first predetermined range, a hue within a second predetermined range, and a brightness within a third predetermined range, and performs saturation enhancement processing on the specific pixel region in the synthesized dentition image to generate a synthesized dentition image with enhanced saturation.

 これにより、合成後の歯列画像において歯垢領域としての特定画素領域を特定し、特定画素領域に対して彩度強調処理を行うため、さらに、歯垢領域を区別しやすい合成後の歯列画像(彩度Sが強調された歯列画像)を生成することができる。 As a result, specific pixel regions in the synthesized dentition image that are plaque regions are identified, and saturation enhancement processing is performed on the specific pixel regions, making it possible to generate a synthesized dentition image (dentition image with enhanced saturation S) in which plaque regions are easier to distinguish.
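 A minimal sketch of this saturation enhancement using Python's standard colorsys module: convert each pixel to HSV, test it against hue/saturation thresholds, and boost the saturation of the pixels that match. The threshold values (a pinkish hue band for plaque fluorescence) and the gain below are assumptions for illustration, not values from the disclosure.

```python
import colorsys
import numpy as np

def enhance_plaque_saturation(image, hue_range=(0.90, 1.00),
                              sat_min=0.2, gain=1.5):
    """Boost the saturation of pixels whose HSV values fall in an
    assumed plaque-like (reddish-pink) range.

    image: H x W x 3 float RGB array (0..1)
    """
    out = image.copy()
    height, width, _ = image.shape
    for y in range(height):
        for x in range(width):
            h, s, v = colorsys.rgb_to_hsv(*image[y, x])
            # "Specific pixel region": hue and saturation thresholds.
            if hue_range[0] <= h <= hue_range[1] and s >= sat_min:
                s = min(1.0, s * gain)
                out[y, x] = colorsys.hsv_to_rgb(h, s, v)
    return out
```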

 また、例えば、第11態様に係る歯列画像生成装置は、第1態様~第10態様のいずれかに係る歯列画像生成装置であって、合成後の前記歯列画像をHSV画像に変換することで変換画像を生成し、前記変換画像が有する複数の画素のうち彩度が第1の所定範囲内、色相が第2の所定範囲内、及び、明度が第3の所定範囲内の少なくとも1つを満たす1以上の画素が位置する特定画素領域を特定し、前記変換画像の前記特定画素領域における前記明度の値から前記歯垢内に蓄積された蛍光物質の蓄積レベル分布を検出し、合成後の前記歯列画像における前記特定画素領域に対して検出した前記蛍光物質の蓄積レベル分布に応じた濃淡画像処理を行うことで濃淡表示を含む合成後の前記歯列画像を生成する濃淡表示処理部を備えてもよい。 Also, for example, the dentition image generating device according to the eleventh aspect may be a dentition image generating device according to any one of the first to tenth aspects, and may include a shading display processing unit that generates a converted image by converting the combined dentition image into an HSV image, identifies a specific pixel area in which one or more pixels of the converted image that satisfy at least one of a first predetermined range for saturation, a second predetermined range for hue, and a third predetermined range for brightness are located, detects an accumulation level distribution of fluorescent substances accumulated in the plaque from the brightness value in the specific pixel area of the converted image, and performs shading image processing according to the accumulation level distribution of the fluorescent substances detected for the specific pixel area in the combined dentition image to generate the combined dentition image including a shading display.

 これにより、例えば、ユーザに蛍光物質の濃度分布を通知できる合成後の歯列画像を生成することができる。 This makes it possible to generate a composite dentition image that can inform the user of the concentration distribution of the fluorescent material, for example.
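 The eleventh aspect maps the HSV brightness inside the detected plaque region to a fluorescence accumulation level distribution. One way to sketch that, assuming brighter fluorescence corresponds to more accumulated porphyrin and quantizing V into a few discrete levels for the shading display; the level count and the mapping direction are assumptions.

```python
import numpy as np

def accumulation_levels(v_channel, plaque_mask, levels=4):
    """Quantize the HSV value (brightness) channel inside a plaque
    mask into discrete accumulation levels 1..levels (0 = no plaque).

    v_channel:   H x W float array of HSV V values (0..1)
    plaque_mask: boolean H x W plaque-region mask
    """
    out = np.zeros(v_channel.shape, dtype=int)
    # Brighter fluorescence -> higher assumed accumulation level.
    q = np.clip((v_channel * levels).astype(int) + 1, 1, levels)
    out[plaque_mask] = q[plaque_mask]
    return out
```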

 また、本開示の一態様に係る歯列画像生成方法は、青色光の波長域を含む光が照射されている口腔内の歯列の面及び歯垢を撮影することで得られた撮影画像を取得し、前記撮影画像に基づく部分的な歯列画像である複数の歯列ブロック画像を合成した歯列画像を生成し、合成後の前記歯列画像に対してホワイトバランス処理を行う。また、本開示の一態様に係るプログラムは、上記の歯列画像生成方法をコンピュータに実行させるためのプログラムである。 A dentition image generating method according to one aspect of the present disclosure obtains a photographed image obtained by photographing the surface of the dentition and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light, generates a dentition image by synthesizing a plurality of dentition block images, which are partial dentition images based on the photographed image, and performs white balance processing on the synthesized dentition image. A program according to one aspect of the present disclosure is a program for causing a computer to execute the above-mentioned dentition image generating method.

 これにより、上記の歯列画像生成システムと同様の効果を奏する。 This achieves the same effect as the above-mentioned dentition image generation system.

 なお、これらの全般的又は具体的な態様は、システム、方法、集積回路、コンピュータプログラム又はコンピュータで読み取り可能なCD-ROM等の非一時的記録媒体で実現されてもよく、システム、方法、集積回路、コンピュータプログラム又は記録媒体の任意な組み合わせで実現されてもよい。プログラムは、記録媒体に予め記憶されていてもよいし、インターネット等を含む広域通信網を介して記録媒体に供給されてもよい。 These general or specific aspects may be realized by a system, a method, an integrated circuit, a computer program, or a non-transitory recording medium such as a computer-readable CD-ROM, or by any combination of a system, a method, an integrated circuit, a computer program, or a recording medium. The program may be pre-stored in the recording medium, or may be supplied to the recording medium via a wide area communication network including the Internet.

 また、各図は、模式図であり、必ずしも厳密に図示されたものではない。したがって、例えば、各図において縮尺などは必ずしも一致しない。また、各図において、実質的に同一の構成については同一の符号を付しており、重複する説明は省略又は簡略化する。 In addition, each figure is a schematic diagram and is not necessarily an exact illustration. Therefore, for example, the scales of the figures do not necessarily match. In addition, in each figure, the same reference numerals are used for substantially the same configurations, and duplicate explanations are omitted or simplified.

 また、本明細書において、同一などの要素間の関係性を示す用語、並びに、数値、及び、数値範囲は、厳格な意味のみを表す表現ではなく、実質的に同等な範囲、例えば数%程度(あるいは、10%程度)の差異をも含むことを意味する表現である。 In addition, in this specification, terms that indicate relationships between elements, such as "identical," as well as numerical values and numerical ranges, are not expressions that express only the strict meaning, but are expressions that include a substantially equivalent range, for example, a difference of about a few percent (or about 10%).

 また、本明細書において、「第1」、「第2」などの序数詞は、特に断りの無い限り、構成要素の数又は順序を意味するものではなく、同種の構成要素の混同を避け、区別する目的で用いられている。 In addition, in this specification, ordinal numbers such as "first" and "second" do not refer to the number or order of components, unless otherwise specified, but are used for the purpose of avoiding confusion between and distinguishing between components of the same type.

 (実施の形態)
 以下、本実施の形態に係る歯列画像生成システム及び歯列画像生成方法について、図1~図5を参照しながら説明する。
(Embodiment)
Hereinafter, a dentition image generating system and a dentition image generating method according to the present embodiment will be described with reference to FIGS. 1 to 5.

 [1.歯列画像生成システムの構成]
 まず、本実施の形態に係る口腔内カメラを備える歯列画像生成システムの構成について、図1~図3を参照しながら説明する。図1は、本実施の形態に係る歯列画像生成システムにおける口腔内カメラ10の斜視図である。
[1. Configuration of the dentition image generating system]
First, the configuration of a dentition image generating system including an intraoral camera according to the present embodiment will be described with reference to Figures 1 to 3. Figure 1 is a perspective view of an intraoral camera 10 in the dentition image generating system according to the present embodiment.

 図1に示すように、口腔内カメラ10は、片手で取り扱うことが可能な歯ブラシ状の筺体を備え、その筺体は、歯列撮影時にユーザの口腔内に配置されるヘッド部10aと、ユーザが把持するハンドル部10bと、ヘッド部10a及びハンドル部10bを接続するネック部10cとを備える。 As shown in FIG. 1, the intraoral camera 10 has a toothbrush-like housing that can be handled with one hand, and the housing has a head portion 10a that is placed in the user's oral cavity when photographing the dentition, a handle portion 10b that is held by the user, and a neck portion 10c that connects the head portion 10a and the handle portion 10b.

 撮影部21は、青色光の波長域を含む光が照射されている口腔内の歯列の面及び歯垢を撮影する。歯列の面は、歯列の頬側(外側)の側面、歯列の舌側(内側)の側面、及び、歯列の噛み合わせ面の少なくとも1つを含む。 The photographing unit 21 photographs the surfaces of the dentition and dental plaque in the oral cavity irradiated with light including the wavelength range of blue light. The surfaces of the dentition include at least one of the buccal (outer) side surface of the dentition, the lingual (inner) side surface of the dentition, and the occlusal surface of the dentition.

 撮影部21は、ヘッド部10aとネック部10cとに組み込まれている。撮影部21は、その光軸LA上に配置された撮像素子(図示しない)とレンズ(図示しない)とを有する。 The imaging unit 21 is incorporated into the head portion 10a and the neck portion 10c. The imaging unit 21 has an image sensor (not shown) and a lens (not shown) arranged on its optical axis LA.

 撮像素子は、例えばCMOS(Complementary Metal Oxide Semiconductor)センサ又はCCD(Charge Coupled Device)素子などの撮影デバイスであって、レンズによって歯牙の像が結像される。その結像した像に対応する信号(画像データ)を、撮像素子は外部に出力する。撮像素子によって撮影された撮影画像を、RGB画像とも記載する。また、RGB画像は、青色光を歯列に照射して得られる画像列であり、例えば、歯列の側面が撮影された画像列であってもよいし、歯列の噛み合わせ面が撮影された画像列であってもよい。画像列は、例えば、歯列を並び方向に沿って撮影した1以上の画像(例えば、時系列画像)を含む。また、歯列の側面は、舌側の側面であってもよいし、頬側の側面であってもよい。 The imaging element is a photographing device such as a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) element, and an image of the teeth is formed by a lens. The imaging element outputs a signal (image data) corresponding to the formed image to the outside. The image photographed by the imaging element is also referred to as an RGB image. The RGB image is an image sequence obtained by irradiating the dentition with blue light, and may be, for example, an image sequence of the side of the dentition, or an image sequence of the occlusal surface of the dentition. The image sequence includes, for example, one or more images (e.g., time-series images) photographed along the direction of the dentition. The side of the dentition may be the tongue side or the cheek side.

 撮影部21は、さらに照明部(照明デバイス)から照射される色の光を遮光し、かつ、当該光に対して歯垢が発する蛍光を透過する光学フィルタを有してもよい。本実施の形態では、撮影部21は、撮像素子に入射する光に含まれる青色波長の光成分をカットする青色光カットフィルタを光学フィルタとして有してもよい。青色光の波長域を含む光を歯牙に照射し、歯垢を検出する場合、歯垢の励起蛍光を強くするため青色光の波長域を含む光を強くすると青色画素値が赤色画素値及び緑色画素値に比べて支配的になるため、RGB画像の全体が青色を帯びる。この対処として、青色光カットフィルタが撮像素子に入射する前の光から青色光の波長域を含む光の一部をカットする。なお、撮影部21は、青色光カットフィルタを有していなくてもよい。 The photographing unit 21 may further have an optical filter that blocks light of a color irradiated from the illumination unit (illumination device) and transmits fluorescence emitted by plaque in response to the light. In this embodiment, the photographing unit 21 may have an optical filter that is a blue light cut filter that cuts blue wavelength light components contained in the light incident on the image sensor. When light including the blue wavelength range is irradiated onto teeth to detect plaque, if the light including the blue wavelength range is strengthened to strengthen the excitation fluorescence of plaque, the blue pixel values become dominant compared to the red pixel values and green pixel values, and the entire RGB image becomes blue. To address this, the blue light cut filter cuts out a portion of the light including the blue wavelength range from the light before it enters the image sensor. Note that the photographing unit 21 does not need to have a blue light cut filter.

 また、口腔内カメラ10は、撮影時に撮影対象の歯牙に対して光を照射する照明部として、複数の第1~第4のLED23A~23Dを搭載している。第1~第4のLED23A~23Dは、歯垢に照射されることで当該歯垢が蛍光を発する色の光(例えば、単一色の光)を照射する。第1~第4のLED23A~23Dは、例えば、405nmをピークとする波長(所定波長の一例)を含む青色光を照射する青色LEDである。なお、第1~第4のLED23A~23Dは、青色光の波長域を含む光を照射する光源であればよく、青色LEDに限るものではない。 The intraoral camera 10 is also equipped with multiple first to fourth LEDs 23A to 23D as an illumination unit that irradiates light onto the teeth to be photographed during photography. The first to fourth LEDs 23A to 23D irradiate plaque with light of a color that causes the plaque to fluoresce when irradiated thereon (e.g., light of a single color). The first to fourth LEDs 23A to 23D are, for example, blue LEDs that irradiate blue light that includes a wavelength (an example of a predetermined wavelength) with a peak at 405 nm. Note that the first to fourth LEDs 23A to 23D are not limited to blue LEDs, and may be any light source that irradiates light that includes the wavelength range of blue light.

 図2は、本実施の形態に係る歯列画像生成システムの概略的構成図である。本実施の形態に係る歯列画像生成システムは、概略的には、照明部23からの光に対して歯垢が発する蛍光を撮影部21が撮影し、撮影されたRGB画像に基づく複数の画像(例えば、後述する第1歯列ブロック画像)を合成(連結合成)することで、歯牙が横一列に整列されるタイプのパノラマ画像を生成するように構成される。パノラマ画像は、ユーザの口の少なくとも一部(例えば、全体)の状態を表す画像である。 FIG. 2 is a schematic diagram of a dentition image generating system according to this embodiment. The dentition image generating system according to this embodiment is generally configured such that the imaging unit 21 captures fluorescence emitted by plaque in response to light from the illumination unit 23, and then synthesizes (links and combines) multiple images based on the captured RGB images (e.g., a first dentition block image, described below) to generate a panoramic image in which the teeth are aligned horizontally. The panoramic image is an image that represents the condition of at least a portion (e.g., the entirety) of the user's mouth.

 図2に示すように、歯列画像生成システムは、口腔内カメラ10と、携帯端末50とを備える。 As shown in FIG. 2, the dentition image generation system includes an intraoral camera 10 and a mobile terminal 50.

 口腔内カメラ10は、ハード部20と、信号処理部30と、通信部40とを備える。 The intraoral camera 10 comprises a hardware unit 20, a signal processing unit 30, and a communication unit 40.

 ハード部20は、口腔内カメラ10における物理的な要素であり、撮影部21と、センサ部22と、照明部23と、操作部24とを有する。 The hardware unit 20 is a physical element of the intraoral camera 10, and includes an imaging unit 21, a sensor unit 22, an illumination unit 23, and an operation unit 24.

 撮影部21は、ユーザの口腔内の歯牙を撮影することで画像データを生成する。撮影部21は、歯垢に含まれる蛍光物質を励起する所定波長の照射光が照射されている口腔内の歯列の面及び歯垢を撮影することで画像データを生成するともいえる。撮影部21は、カメラ制御部31からの制御信号を受け付け、受け付けた制御信号に応じて、撮影などの動作を実行し、撮影で得た動画又は静止画の画像データを画像処理部32に出力する。撮影部21は、上記の撮像素子及び光学フィルタとレンズとを有する。画像データは、例えば、光学フィルタを通過した光に基づいて生成される。また、画像データは、複数本の歯牙が映る画像であるが、少なくとも1本の歯牙が映る画像であればよい。 The photographing unit 21 generates image data by photographing the teeth in the user's oral cavity. It can also be said that the photographing unit 21 generates image data by photographing the surface of the dentition in the oral cavity and the plaque irradiated with light of a specific wavelength that excites fluorescent substances contained in the plaque. The photographing unit 21 receives a control signal from the camera control unit 31, performs operations such as photographing in accordance with the received control signal, and outputs image data of a moving image or a still image obtained by photographing to the image processing unit 32. The photographing unit 21 has the above-mentioned image sensor, optical filter, and lens. The image data is generated, for example, based on light that has passed through the optical filter. Furthermore, the image data is an image showing multiple teeth, but it is sufficient that the image shows at least one tooth.

 センサ部22は、RGB画像の撮影領域に入射する外光を検出する。例えば、センサ部22は、口腔内に外光が入射しているか否かを検出する。センサ部22は、例えば、撮影部21の近傍に配置される。センサ部22は、例えば、撮影部21と同様、口腔内カメラ10のヘッド部10aに配置されてもよい。つまり、センサ部22は、撮影部21が撮影する際、ユーザの口腔内に位置する。 The sensor unit 22 detects external light entering the photographing area of the RGB image. For example, the sensor unit 22 detects whether external light is entering the oral cavity. The sensor unit 22 is arranged, for example, near the photographing unit 21. The sensor unit 22 may be arranged, for example, in the head unit 10a of the intraoral camera 10, similar to the photographing unit 21. In other words, the sensor unit 22 is located in the user's oral cavity when the photographing unit 21 captures images.

 照明部23は、口腔内の複数の領域のうち、撮影部21が撮影する領域に光を照射する。定量的可視光誘起蛍光法(QLF法)でも知られているように、青色光が照射されると歯垢内のバクテリアが排泄するポルフィリンという物質が赤みを帯びたピンク色に蛍光(励起蛍光)することが知られており、本実施の形態では、照明部23は、撮影部21が撮影する領域に青色光を照射する。 The lighting unit 23 irradiates light onto the area, among the multiple areas in the oral cavity, that is photographed by the imaging unit 21. As is known from the quantitative light-induced fluorescence (QLF) method, a substance called porphyrin, which is excreted by bacteria in dental plaque, emits reddish-pink fluorescence (excited fluorescence) when irradiated with blue light; in this embodiment, the lighting unit 23 therefore irradiates the area photographed by the imaging unit 21 with blue light.

 照明部23は、上記の複数の第1~第4のLED23A~23Dを有する。複数の第1~第4のLED23A~23Dは、例えば、撮影領域に対して互いに異なる方向から光を照射する。これにより、撮影領域に影が発生することを抑制することができる。 The illumination unit 23 has the above-mentioned first to fourth LEDs 23A to 23D. The first to fourth LEDs 23A to 23D, for example, irradiate the shooting area with light from different directions. This makes it possible to prevent shadows from appearing in the shooting area.

 複数の第1~第4のLED23A~23Dのそれぞれは、少なくとも調光が制御可能に構成される。複数の第1~第4のLED23A~23Dのそれぞれは、調光及び調色が制御可能に構成されてもよい。複数の第1~第4のLED23A~23Dは、撮影部21を囲むように配置される。 Each of the first to fourth LEDs 23A to 23D is configured so that at least the dimming can be controlled. Each of the first to fourth LEDs 23A to 23D may be configured so that the dimming and color can be controlled. The first to fourth LEDs 23A to 23D are arranged to surround the image capture unit 21.

 照明部23は、撮影領域に応じて照射強度(発光強度)が制御される。複数の第1~第4のLED23A~23Dのそれぞれは、一律に照射強度が制御されてもよいし、互いに異なるように照射強度が制御されてもよい。なお、照明部23が有するLEDの数は特に限定されず、1つであってもよいし、5つ以上であってもよい。また、照明部23は、光源としてLEDを有することに限定されず、他の光源を有していてもよい。 The illumination unit 23 has its illumination intensity (light emission intensity) controlled according to the shooting area. The illumination intensity of each of the first to fourth LEDs 23A to 23D may be controlled uniformly, or may be controlled to be different from one another. The number of LEDs in the illumination unit 23 is not particularly limited, and may be one, or may be five or more. Furthermore, the illumination unit 23 is not limited to having LEDs as a light source, and may have other light sources.

 操作部24は、ユーザからの操作を受け付ける。操作部24は、例えば、押しボタンなどにより構成されるが、音声などにより操作を受け付ける構成であってもよい。 The operation unit 24 accepts operations from the user. The operation unit 24 is configured, for example, with push buttons, but may also be configured to accept operations by voice, etc.

 また、ハード部20は、さらに、口腔内カメラ10の各構成要素に電力を供給する電池(例えば、二次電池)、商用電源に接続された外部の充電器によってワイヤレス充電されるためのコイル、構図調節及びピント調節の少なくとも一方に必要なアクチュエータなどを備えていてもよい。 The hardware unit 20 may further include a battery (e.g., a secondary battery) that supplies power to each component of the intraoral camera 10, a coil for wireless charging by an external charger connected to a commercial power source, and an actuator required for at least one of composition adjustment and focus adjustment.

 信号処理部30は、後述する様々な処理を実行するCPU(Central Processing Unit)又はMPU(Micro Processor Unit)などにより実現される各機能構成部と、各機能構成部に様々な処理を実行させるためのプログラムを記憶するROM(Read Only Memory)、RAM(Random Access Memory)などのメモリ部35とを有する。具体的には、信号処理部30は、カメラ制御部31と、画像処理部32と、制御部33と、照明制御部34と、メモリ部35とを有する。 The signal processing unit 30 has various functional components realized by a CPU (Central Processing Unit) or MPU (Micro Processor Unit) that execute various processes described below, and a memory unit 35 such as a ROM (Read Only Memory) or RAM (Random Access Memory) that stores programs for causing each functional component to execute various processes. Specifically, the signal processing unit 30 has a camera control unit 31, an image processing unit 32, a control unit 33, a lighting control unit 34, and a memory unit 35.

 カメラ制御部31は、例えば、口腔内カメラ10のハンドル部10bに搭載され、撮影部21を制御する。カメラ制御部31は、例えば、画像処理部32からの制御信号に応じて撮影部21の絞り及びシャッタスピードの少なくとも1つを制御する。 The camera control unit 31 is mounted, for example, on the handle unit 10b of the intraoral camera 10, and controls the image capture unit 21. The camera control unit 31 controls at least one of the aperture and the shutter speed of the image capture unit 21 in response to a control signal from the image processing unit 32, for example.

 画像処理部32は、例えば、口腔内カメラ10のハンドル部10bに搭載され、撮影部21が撮影したRGB画像(画像データ)を取得し、その取得したRGB画像に対して画像処理を実行し、その画像処理後のRGB画像をカメラ制御部31及び制御部33に出力する。また、画像処理部32は、画像処理後のRGB画像をメモリ部35に出力し、画像処理後のRGB画像をメモリ部35に記憶させてもよい。 The image processing unit 32 is mounted, for example, on the handle unit 10b of the intraoral camera 10, acquires the RGB image (image data) captured by the imaging unit 21, performs image processing on the acquired RGB image, and outputs the RGB image after the image processing to the camera control unit 31 and the control unit 33. The image processing unit 32 may also output the RGB image after the image processing to the memory unit 35, and store the RGB image after the image processing in the memory unit 35.

 画像処理部32は、例えば、回路で構成され、例えばRGB画像に対してノイズ除去、輪郭強調処理などの画像処理を実行する。なお、ノイズ除去及び輪郭強調処理などは、携帯端末50により実行されてもよい。 The image processing unit 32 is, for example, composed of a circuit, and performs image processing such as noise removal and edge enhancement on an RGB image. Note that noise removal and edge enhancement may be performed by the mobile terminal 50.

 なお、画像処理部32から出力されたRGB画像(画像処理後のRGB画像)は、通信部40を介して携帯端末50に送信され、送信されたRGB画像に基づく画像(例えば、後述する第3歯列画像)が携帯端末50の表示部56に表示されてもよい。これにより、ユーザにRGB画像に基づく画像を提示することができる。 The RGB image output from the image processing unit 32 (the RGB image after image processing) may be transmitted to the mobile terminal 50 via the communication unit 40, and an image based on the transmitted RGB image (for example, a third dentition image described below) may be displayed on the display unit 56 of the mobile terminal 50. This makes it possible to present an image based on the RGB image to the user.

 制御部33は、信号処理部30を制御する制御装置である。制御部33は、例えば、センサ部22による外光等の検出結果に基づいて、信号処理部30の各構成要素を制御する。 The control unit 33 is a control device that controls the signal processing unit 30. The control unit 33 controls each component of the signal processing unit 30 based on, for example, the detection results of external light, etc. by the sensor unit 22.

The lighting control unit 34 is mounted, for example, on the handle portion 10b of the intraoral camera 10 and controls the turning on and off of the first to fourth LEDs 23A to 23D. The lighting control unit 34 is composed of, for example, a circuit. For example, when the user performs an operation on the display unit 56 of the mobile terminal 50 to activate the intraoral camera 10, a corresponding signal is transmitted from the mobile terminal 50 to the signal processing unit 30 via the communication unit 40. The lighting control unit 34 of the signal processing unit 30 turns on the first to fourth LEDs 23A to 23D based on the received signal.

In addition to the above programs, the memory unit 35 stores RGB images (image data) captured by the imaging unit 21. The memory unit 35 is realized by, for example, a semiconductor memory such as a ROM or a RAM, but is not limited to this.

The communication unit 40 is a wireless communication module for wirelessly communicating with the mobile terminal 50. The communication unit 40 is mounted, for example, on the handle portion 10b of the intraoral camera 10 and performs wireless communication with the mobile terminal 50 based on a control signal from the signal processing unit 30. The communication unit 40 communicates with the mobile terminal 50 in compliance with an existing communication standard such as WiFi (registered trademark) or Bluetooth (registered trademark). Via the communication unit 40, RGB images are transmitted from the intraoral camera 10 to the mobile terminal 50, and operation signals are transmitted from the mobile terminal 50 to the intraoral camera 10.

The mobile terminal 50 displays, for example, the plaque region in a second dentition image containing two or more teeth, based on an RGB image that captures the dentition surfaces and plaque fluorescing when the teeth are irradiated with light including the blue wavelength range. The mobile terminal 50 also functions as the user interface of the dentition image generation system. The mobile terminal 50 is an example of a dentition image generation device.

FIG. 3 is a block diagram showing the functional configuration of the mobile terminal 50 according to this embodiment.

As shown in FIG. 3, the mobile terminal 50 includes an acquisition unit 51, a processing unit 52, a synthesis unit 53, a detection unit 54, a generation unit 55, and a display unit 56. The mobile terminal 50 includes a processor, a memory, and the like. The memory is a ROM, a RAM, or the like, and can store a program executed by the processor. The acquisition unit 51, the processing unit 52, the synthesis unit 53, the detection unit 54, and the generation unit 55 are realized by a processor that executes the program stored in the memory. The mobile terminal 50 may be realized by, for example, a smartphone or tablet terminal capable of wireless communication.

The acquisition unit 51 acquires RGB images from the intraoral camera 10. Specifically, the acquisition unit 51 acquires images (an image sequence) showing multiple teeth, generated by the imaging unit 21. The RGB images are obtained by the intraoral camera 10 photographing teeth that fluoresce when irradiated with light including the blue wavelength range. The acquisition unit 51 is configured to include, for example, a wireless communication module that performs wireless communication.

Here, the RGB image acquired by the acquisition unit 51 has a bluish tint overall. This is because, when plaque is detected by irradiating the teeth with light including the blue wavelength range, the illumination unit 23 intensifies that light to strengthen the excited fluorescence of the plaque. As a result, the blue pixel value (B) becomes dominant over the red pixel value (R) and the green pixel value (G); in other words, a color cast occurs in the RGB image acquired by the acquisition unit 51. In this state, it is difficult to present the plaque adhesion state to the user in an easy-to-understand manner. The mobile terminal 50 therefore performs predetermined image processing on the RGB image acquired by the acquisition unit 51, as described below, so that the plaque adhesion state can be presented to the user in an easy-to-understand manner.

The processing unit 52 generates multiple first dentition block images, which are partial dentition images, from the RGB images (image sequence) generated by the imaging unit 21, and performs image processing on images based on the generated first dentition block images. In this embodiment, the processing unit 52 performs image processing on each of the first dentition block images and on the first dentition image or second dentition image obtained by combining them.

A first dentition block image is an image obtained by cropping, from an image captured at a certain shooting position, a predetermined range including the center of the angle of view. Each of the multiple first dentition block images is based on an image captured at a different shooting position (or in a different shooting direction).

Each of the multiple first dentition block images includes at least one tooth or interdental space. A first dentition block image may be, for example, an image of at least one tooth or interdental space taken from the front. The first dentition block images may be equal or different in size (image size). Furthermore, in each first dentition block image, at least part of the tooth region shown in that image overlaps part of the tooth region shown in at least one other first dentition block image.

Note that an RGB image may itself be a first dentition block image, or the first dentition block images may be generated by dividing an RGB image so that the resulting images partially overlap one another.

As image processing prior to synthesis by the synthesis unit 53, the processing unit 52 performs exposure correction processing on the multiple first dentition block images. The exposure correction processing adjusts the blue level of each tooth region by multiplying the R, G, and B components by an equal gain. The processing unit 52 adjusts the blue levels so that the blue levels of the tooth regions of the first dentition block images approach (for example, match) one another. For example, the processing unit 52 adjusts the tooth color of each of the other first dentition block images so that it is unified with the color of the tooth (reference tooth) in the first dentition block image located at the center when the first dentition block images are arranged in dentition order. The reference tooth may be, for example, the tooth located at the center of the angle of view in the RGB image captured by the imaging unit 21. Since the light from the illumination unit 23 may, for example, fall unevenly at both ends of the angle of view, adjusting the blue level makes it possible to reproduce the color of those teeth as if they were illuminated in the same way as the central tooth. Note that the blue level adjustment may be performed only on the tooth region, or on the entire first dentition block image (for example, the region including teeth and gums).
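The blue-level matching described above can be sketched as follows. This is a minimal illustration under stated assumptions (per-pixel RGB tuples, a simple mean-blue statistic, and illustrative function and variable names), not the claimed implementation itself:

```python
def match_blue_level(block_pixels, ref_blue_mean):
    """Multiply R, G, and B by one equal gain so that the block's mean
    blue level matches that of the reference (center) block."""
    mean_blue = sum(p[2] for p in block_pixels) / len(block_pixels)
    gain = ref_blue_mean / mean_blue
    return [tuple(min(255, round(c * gain)) for c in p) for p in block_pixels]

# An edge block photographed under weaker illumination is pulled up to
# the blue level of the center (reference) block.
edge_block = [(60, 80, 100), (66, 84, 100)]
adjusted = match_blue_level(edge_block, ref_blue_mean=200.0)
```

Because the same gain multiplies all three channels, the blue levels line up across blocks without changing each block's hue, which is what allows a single white balance gain to be shared after stitching.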

As image processing after synthesis by the synthesis unit 53, the processing unit 52 performs WB (white balance) processing on the dentition image (the first dentition image or the second dentition image). WB processing adjusts the color balance of an image by multiplying the R, G, and B components by different gains (white balance gains). For example, WB processing adjusts the gains of at least two of the red, green, and blue components of the image to be processed so that the average of the red pixel values, the average of the green pixel values, and the average of the blue pixel values of the pixels constituting the tooth region of the first dentition block image to be processed approach one another (for example, become equal).

The processing unit 52 performs WB processing on the dentition image on the assumption that the tooth region shown in the image is a white region. That is, the processing unit 52 performs WB processing on the dentition image using the color (color information) of the tooth region shown in the image as a reference. The processing unit 52 may perform WB processing based on the color (color information) of an arbitrary tooth (for example, the tooth shown at the center of the dentition image), or based on a statistic (for example, the mean or median) of the color (color information) of the tooth region. The processing unit 52 may perform WB processing by, for example, multiplying the entire tooth region by a gain calculated from that arbitrary tooth or from the statistic. Furthermore, if the blue levels have been adjusted before synthesis by the synthesis unit 53, the blue levels of the synthesized dentition image are already uniform, so the tooth region of the dentition image can be white-balanced collectively. "Collectively" here means that a common gain is used for the white balance processing.
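A rough sketch of the gain computation, assuming the tooth region is treated as achromatic and the per-channel averages are pulled toward their common mean (the function names are illustrative, not from this description):

```python
def white_balance_gains(tooth_pixels):
    """Gains that pull the tooth region's R, G, and B averages toward a
    common target (their overall mean), treating the region as white."""
    n = len(tooth_pixels)
    means = [sum(p[i] for p in tooth_pixels) / n for i in range(3)]
    target = sum(means) / 3
    return [target / m for m in means]

def apply_gains(pixels, gains):
    """Apply one common (collective) set of per-channel gains."""
    return [tuple(min(255, round(c * g)) for c, g in zip(p, gains))
            for p in pixels]

# A blue-cast tooth region becomes achromatic after one shared set of
# gains is applied ("collective" white balance).
cast = [(80, 100, 220), (80, 100, 220)]
balanced = apply_gains(cast, white_balance_gains(cast))
```

After this step the tooth region renders as neutral white, which is the state in which the reddish-pink plaque fluorescence stands out most clearly.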

This allows the tooth region to be displayed as white (achromatic); that is, it becomes easier to reproduce the user's actual tooth color. Besides reducing the sense of incongruity in the image, this makes it easier to highlight plaque regions. Furthermore, when the white balance processing is performed collectively, the color of the tooth regions can be unified to the same color (white), which can, for example, prevent the user from feeling that the tooth color looks unnatural.

Note that, of the blue level adjustment and the WB processing, the processing unit 52 need only perform at least the WB processing.

Note that the tooth region shown in the dentition image may be the natural-tooth portion of the teeth shown in the image. That is, the processing unit 52 may perform WB processing on the assumption that the natural-tooth region is a white region (for example, using the color (color information) of the natural-tooth region shown in the dentition image as a reference). The natural-tooth region may be that of a single tooth shown in the dentition image, that of a predetermined tooth, or that of multiple teeth. When the natural-tooth regions of multiple teeth are used as the reference, a statistic of the color (for example, the chromaticity) of the natural-tooth region of each tooth may be used in the WB processing. The statistic is, for example, the mean, but may also be the maximum, minimum, mode, median, or the like.

It is known that when natural teeth are irradiated with excitation light, excited fluorescence is emitted from the dentin, passes through the enamel, and appears as green fluorescence. It is also known that fillings in caries treatment marks (for example, metal inlays) do not fluoresce under blue LED light and are captured darkly (at low luminance) by the camera. Furthermore, as described above, plaque (plaque regions) is known to fluoresce a reddish pink (excited fluorescence) when irradiated with blue light. For these reasons, the processing unit 52 can detect natural teeth, excluding caries treatment marks and plaque, from the dentition image.

The processing unit 52 may further identify, from the first dentition block images, the first dentition image, or the dentition image (the second or third dentition image), the type of a tooth (for example, a specific tooth) included in the image. Identifying the type of a tooth may mean identifying whether the tooth is an incisor, a canine, or a molar, or identifying whether it is a central incisor, a lateral incisor, a canine, a first premolar, a second premolar, a first molar, a second molar, or a third molar (wisdom tooth). The processing unit 52 may also identify in which region of the oral cavity (upper jaw, lower jaw, left, or right) the tooth is located. The method by which the processing unit 52 identifies the type of a tooth is not particularly limited; it may be, for example, a method using a machine learning model, a method using pattern matching, or any other known method. The machine learning model is a model trained to output, when an image containing teeth is input, the types of the teeth shown in the image. In the pattern matching method, images showing teeth of standard shapes may be used as references, or previously captured images of the teeth in the user's own oral cavity may be used as references. The processing unit 52 may also present the user with information identifying the tooth to be photographed before shooting and determine that the tooth shown in an image acquired afterward is that tooth, or it may have the user input which teeth an acquired image contains and determine that teeth of the input types are shown in the image.

The synthesis unit 53 synthesizes the multiple first dentition block images whose blue levels have been adjusted by the processing unit 52, thereby generating a single dentition image (panoramic image) in which multiple teeth are lined up. In this embodiment, the synthesis unit 53 uses stitching processing to synthesize the blue-level-adjusted dentition block images, but the synthesis method is not limited to this. The stitching processing may be performed using, for example, the contours of the teeth.

The stitching processing here combines multiple images having overlapping regions (here, the multiple first dentition block images) to generate one or more dentition images. In the stitching processing, the first dentition block images are first arranged on a two-dimensional plane so that their overlapping portions line up. The images are then scaled (magnification-adjusted), positioned, and oriented so that at least some of the feature points and surface points appear at the same locations on a two-dimensional grid. In this way, the first dentition block images are registered where they are aligned; that is, they are aligned so as to represent a series of adjacent teeth, with the positions of the teeth matched.

Note that any type of feature detection algorithm may be used to detect the feature points, including scale-invariant feature transform (SIFT) or speeded-up robust features (SURF). In this embodiment, feature points may be placed on the contours of the tooth regions. The location of a surface point can be calculated by measuring the distance between the focal points of the viewpoints from which the two first dentition block images containing the overlapping portion were captured, extracting the angle of the optical axis at each viewpoint, and performing triangulation from the common position in the images.
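As a toy illustration of the registration step only (not the feature-based method itself), two overlapping one-dimensional intensity strips can be aligned by choosing the offset that minimizes the mean squared difference over the overlap; all names here are illustrative:

```python
def best_overlap_offset(left, right, min_overlap=2):
    """Slide `right` along the tail of `left` and return the offset whose
    overlapping samples differ least (mean squared difference)."""
    best_off, best_cost = 0, float("inf")
    for off in range(len(left) - min_overlap + 1):
        n = min(len(left) - off, len(right))
        cost = sum((left[off + i] - right[i]) ** 2 for i in range(n)) / n
        if cost < best_cost:
            best_off, best_cost = off, cost
    return best_off

def stitch(left, right):
    """Concatenate the strips at the best-matching overlap."""
    return left[:best_overlap_offset(left, right)] + right

# The last two samples of `left` match the first two of `right`.
merged = stitch([1, 2, 3, 4, 5], [4, 5, 6, 7])
```

Real stitching works on two-dimensional images with scale and orientation changes, which is why feature detectors such as SIFT or SURF are used instead of raw intensity comparison, but the cost-minimizing alignment idea is the same.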

In the following, an example is described in which the synthesis unit 53 generates, by synthesis, a second dentition image P20 (see FIG. 5 described later) showing the teeth from the left back tooth to the right back tooth; however, this is not limiting, and it suffices to generate a second dentition image P20 including some of the teeth, such as two or more teeth. In that case, the synthesis unit 53 may include, in the second dentition image P20, information on the types of those teeth or their positions in the oral cavity. This makes it possible to notify the user which region of the oral cavity the second dentition image P20 corresponds to. Note that the tooth types or positions in the oral cavity (for example, regions such as upper jaw, lower jaw, left, and right) are identified by the processing unit 52.

The detection unit 54 detects plaque (plaque regions) in the image on which the WB processing has been performed (an example of an image based on captured images). In this embodiment, the detection unit 54 detects plaque based on the color information of the tooth region in the dentition image generated by the synthesis unit 53. The color information includes brightness V, hue H, and saturation S. The detection unit 54 detects plaque based on, for example, the brightness V; for instance, it detects a region whose brightness V is equal to or greater than a threshold as a plaque region.
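A minimal sketch of the brightness-based detection, using the standard library's colorsys for the HSV conversion. The 0.6 threshold and the function name are illustrative assumptions, not values given in this description:

```python
import colorsys

def plaque_mask(pixels, v_threshold=0.6):
    """Mark pixels whose HSV brightness V (0.0 to 1.0) is at or above
    the threshold as candidate plaque pixels."""
    mask = []
    for r, g, b in pixels:
        _, _, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        mask.append(v >= v_threshold)
    return mask

# A bright pinkish (fluorescing) pixel is flagged; a dark filling is not.
mask = plaque_mask([(230, 120, 140), (40, 40, 40)])
```

In practice a real threshold would be calibrated against disclosing-agent references, as the later discussion of the predetermined ranges suggests.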

Note that the detection unit 54 may detect plaque (plaque regions) in the dentition image using a machine learning model trained to take as input a dentition image on which WB processing has been performed and to output the plaque (plaque regions) shown in the image.

Note that the detection unit 54 may detect plaque before synthesis by the synthesis unit 53, or after synthesis by the synthesis unit 53 but before the WB processing by the processing unit 52. That is, the detection unit 54 may detect plaque using a color-cast image (an example of an image based on captured images).

The generation unit 55 is a processing unit for generating a dentition image (for example, a third dentition image described below) in which the plaque regions detected by the detection unit 54 are emphasized. The generation unit 55 highlights the detected plaque regions on the dentition image generated by the synthesis unit 53, for example by superimposing a highlight on the plaque regions of the dentition image.

Note that, when the detection unit 54 has detected plaque before the WB processing, the generation unit 55 may superimpose the highlight on the plaque regions of the third dentition image after the WB processing.

The display unit 56 is a display device included in the mobile terminal 50 and displays the image generated by the generation unit 55. The display unit 56 may be realized by, for example, a liquid crystal display panel.

Note that the detection unit 54 and the generation unit 55 may function as a saturation emphasis processing unit that performs saturation emphasis processing. For example, the detection unit 54 may generate a converted image by converting the dentition image synthesized by the synthesis unit 53 and subjected to WB processing (for example, the third dentition image) into an HSV image, and may identify a specific pixel region consisting of one or more pixels of the converted image that satisfy at least one of the following: saturation S within a first predetermined range (for example, 30 to 80 in 8-bit representation), hue H within a second predetermined range (for example, 140 to 170 in 8-bit representation), and brightness V within a third predetermined range (for example, 100 to 180 in 8-bit representation). The generation unit 55 may then generate a dentition image with emphasized saturation S by performing saturation emphasis processing on the specific pixel region of the dentition image (for example, the third dentition image). The HSV image is generated by, for example, converting the color space of the dentition image into HSV space. The display unit 56 may display the dentition image that has undergone the saturation emphasis processing.

The first, second, and third predetermined ranges need only be determined by comparing actual plaque regions and tooth regions with the HSV image, and are not limited to the numerical ranges above. For example, the value ranges of brightness V, hue H, and saturation S can be determined by administering a plaque-disclosing agent and comparing against the degree of staining it produces.
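A hedged sketch of the specific-pixel-region test, scaling colorsys's 0-to-1 HSV output to the 8-bit representation used above. The example ranges are the ones quoted in the text; as just noted, they would in practice be tuned against disclosing-agent references:

```python
import colorsys

# 8-bit example ranges from the description (illustrative, tunable)
S_RANGE, H_RANGE, V_RANGE = (30, 80), (140, 170), (100, 180)

def in_specific_region(r, g, b):
    """True when the pixel satisfies at least one of the 8-bit S, H, V
    ranges used to locate the specific pixel region."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    h8, s8, v8 = h * 255, s * 255, v * 255  # scale to 8-bit representation
    return (S_RANGE[0] <= s8 <= S_RANGE[1]
            or H_RANGE[0] <= h8 <= H_RANGE[1]
            or V_RANGE[0] <= v8 <= V_RANGE[1])
```

For example, a mid-brightness pixel such as (150, 60, 80) falls inside the V range, while pure white (255, 255, 255) satisfies none of the three ranges.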

Note that the detection unit 54 and the generation unit 55 may function as a shading display processing unit that performs shading image processing according to the concentration distribution (accumulation level distribution) of the fluorescent substance. For example, the detection unit 54 may detect the concentration distribution of the fluorescent substance accumulated in the plaque from the brightness V values in the specific pixel region of the converted image. The generation unit 55 may then generate a dentition image including a shading display (for example, a third dentition image) by performing shading image processing on the specific pixel region of the dentition image according to the detected concentration distribution of the fluorescent substance.

As the blue light passes through the layers of the tooth, porphyrins in the plaque are excited and emit red fluorescence. The fluorescence intensity is thought to indicate the accumulation of the fluorescent substance (porphyrin) rather than reflecting the current bacterial flora; that is, the more fluorescent substance has accumulated, the stronger the red fluorescence. Since the porphyrin accumulation level increases as plaque matures, the fluorescence intensity of mature plaque is stronger than that of young plaque.

The detection unit 54 detects the accumulation level (concentration or density) of the fluorescent substance by comparing the intensity of red fluorescence per unit area of the plaque regions.
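The per-unit-area comparison can be sketched as a mean red intensity over each plaque region. Treating that mean directly as the accumulation level is an illustrative simplification, not the claimed method:

```python
def accumulation_level(region_pixels):
    """Mean red-channel intensity per pixel of a plaque region; a higher
    value is read as a higher porphyrin accumulation level."""
    return sum(r for r, _, _ in region_pixels) / len(region_pixels)

# Mature plaque (stronger red fluorescence) scores above young plaque.
mature = accumulation_level([(220, 60, 70), (210, 55, 65)])
young = accumulation_level([(140, 60, 70), (150, 55, 65)])
```

Normalizing by pixel count makes regions of different sizes comparable, which is the point of the per-unit-area comparison.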

As described above, a plaque region may be extracted from one or more pixels of the HSV image that satisfy at least one of the following: saturation S within the first predetermined range, hue H within the second predetermined range, and brightness V within the third predetermined range.

In an image on which WB processing has been performed, the brightness V in a plaque region is determined by the R value, independently of the hue H and saturation S.

It is also known that the fluorescence wavelength of porphyrin, the fluorescent substance in plaque, is 600 nm to 740 nm, with a peak fluorescence wavelength of 630 nm. The concentration of porphyrin accumulated in a plaque region can therefore be evaluated by detecting the brightness V value of the HSV image in that region.

[2. Operation of the Dentition Image Generation System]
Next, the operation of the dentition image generation system configured as described above will be described with reference to FIGS. 4 and 5. FIG. 4 is a sequence diagram showing the operation (dentition image generation method) of the dentition image generation system according to this embodiment. The processing shown in FIG. 4 is executed by the mobile terminal 50; it is performed, for example, in real time, each time one or more frames of image data are obtained. FIG. 5 is a diagram for explaining the processing of the dentition image generation system according to this embodiment. Note that in FIG. 5, color-cast images are indicated by diagonal hatching.

As shown in FIG. 4, image data is generated when the user photographs the teeth and gums in his or her own oral cavity using the intraoral camera 10 (S101). This image data is obtained, for example, by photographing teeth that fluoresce when irradiated with light including the blue wavelength range.

Next, the communication unit 40 of the intraoral camera 10 transmits the captured image data to the mobile terminal 50, and the acquisition unit 51 of the mobile terminal 50 acquires the image data (S102). The image data may be a moving image or one or more still images. When the image data is a moving image or multiple still images, it may be transmitted frame by frame or still image by still image; when it is a moving image, it may also be transmitted every several frames.

In addition, the image data may be transmitted in real time, or may be transmitted all at once after a series of photographs (e.g., photographs of all teeth in the oral cavity) has been taken.

The processing unit 52 of the mobile terminal 50 adjusts the blue level of the received image data (S103). The processing unit 52 may multiply the R, G, and B components by different gains so that the blue levels of the tooth regions of the multiple first dentition block images, which are partial dentition images based on the image data, become closer to each other.
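The per-channel gain adjustment of step S103 can be sketched as follows. The selection rule for the gains (bringing each block's mean tooth-region blue level to a common target, adjusting only the B channel) is an illustrative assumption; the embodiment only requires that different gains may be applied to the R, G, and B components.

```python
import numpy as np

def adjust_blue_level(blocks, tooth_masks):
    """Scale each RGB block so its mean blue level over the tooth region
    approaches a common target (illustrative rule, not the patent's exact one)."""
    # Target: average tooth-region blue level across all blocks.
    target_b = np.mean([img[..., 2][m].mean() for img, m in zip(blocks, tooth_masks)])
    adjusted = []
    for img, m in zip(blocks, tooth_masks):
        gain_b = target_b / img[..., 2][m].mean()
        gains = np.array([1.0, 1.0, gain_b])  # a different gain per R, G, B component
        adjusted.append(np.clip(img * gains, 0.0, 255.0))
    return adjusted
```

After this step, the tooth regions of the blocks to be stitched have comparable blue levels, which reduces visible seams in the later synthesis.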

Each of the multiple first dentition block images P1 to P5 shown in Fig. 5(a) is an image on which no WB processing has been performed. Each of the multiple first dentition block images P1 to P5 is also an image whose blue level has been adjusted.

Referring again to Fig. 4, next, the synthesis unit 53 of the mobile terminal 50 synthesizes the multiple first dentition block images P1 to P5, on which no WB processing has been performed and whose blue levels have been adjusted (S104).

As shown in Fig. 5(a) and (b), the synthesis unit 53 performs a stitching process on the first dentition block images P1, P2, and P3 to generate a first dentition image P11 including the first dentition block images P1 and P2 and at least a portion of the first dentition block image P3. The synthesis unit 53 also performs a stitching process on the first dentition block images P3, P4, and P5 to generate a first dentition image P12 including at least a portion of the first dentition block image P3 and the first dentition block images P4 and P5. The first dentition images P11 and P12 are images in which at least a portion of the tooth region overlaps. In the example of Fig. 5, at least a portion of the tooth region of the first dentition block image P3 overlaps between the first dentition images P11 and P12.

The processing unit 52 also performs the stitching process so that the first dentition image includes the central part of the angle of view of each of the first dentition block images to be synthesized. Taking the first dentition image P11 as an example, the processing unit 52 performs the stitching process so that the first dentition image P11 includes the central part of the angle of view of each of the first dentition block images P1 to P3. The first dentition image P11 generated in this way shows each tooth as it appears when lit from the front, so shadows are less likely to form and the spaces between the teeth are easier to see.
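A minimal sketch of the stitching in step S104, under the simplifying assumptions that the two blocks are the same size and that their horizontal overlap width is already known (a real pipeline would estimate the alignment, e.g., by feature matching). The overlap is cross-faded so each output column is dominated by the block whose image center is nearer.

```python
import numpy as np

def stitch_pair(left, right, overlap):
    """Stitch two equal-size RGB blocks side by side, cross-fading the
    `overlap` shared columns (alignment is assumed to be known)."""
    h, w, c = left.shape
    out = np.zeros((h, 2 * w - overlap, c), dtype=np.float64)
    out[:, :w - overlap] = left[:, :w - overlap]
    out[:, w:] = right[:, overlap:]
    # Linear cross-fade over the shared columns.
    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]
    out[:, w - overlap:w] = alpha * left[:, w - overlap:] + (1.0 - alpha) * right[:, :overlap]
    return out
```

Applying this first to (P1, P2, P3) and to (P3, P4, P5), and then to the resulting pair, would mirror the two-stage synthesis into P11, P12, and finally P20.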

Here, the first dentition block image P3 may be an image including the two lower front teeth, for example, an image including the space between the two front teeth. In this case, the first dentition image P11 may be, for example, a panoramic image showing the area from the left back teeth to the front teeth, and the first dentition image P12 may be, for example, a panoramic image showing the area from the right back teeth to the front teeth.

Note that the image including the space between the front teeth may be, for example, an image captured after an announcement such as "Please photograph your front teeth" is made at the time of capturing with the imaging unit 21, or an image for which the user or the like has input that it is an image of the front teeth.

Note that the number of first dentition images generated by the synthesis unit 53 is not particularly limited, and may be three or more.

Then, the synthesis unit 53 generates a second dentition image P20 by performing a stitching process on the two first dentition images P11 and P12. The second dentition image P20 is a panoramic image in which the multiple first dentition block images P1 to P5 are synthesized. The second dentition image P20 is a panoramic image showing at least a portion of the dentition in the user's oral cavity, and may be, for example, a panoramic image showing the area from the user's left back teeth to the right back teeth. The second dentition image P20 is an example of a synthesized dentition image.

Note that the synthesis unit 53 may generate the second dentition image P20 directly from the multiple first dentition block images P1 to P5. In other words, the first dentition images P11 and P12 need not be generated.

Referring again to Fig. 4, next, the processing unit 52 of the mobile terminal 50 performs image processing on the second dentition image P20 synthesized by the synthesis unit 53 (S105). In step S105, the processing unit 52 performs at least WB processing on the second dentition image P20.

As shown in Fig. 5(c) and (d), the processing unit 52 performs WB processing on the second dentition image P20 based on the color information of the tooth region of the second dentition image P20, thereby generating a third dentition image P30. The processing unit 52 may perform the WB processing based on the color information of the central teeth in the second dentition image P20 (e.g., the teeth shown in the first dentition block image P3). The processing unit 52 may also, for example, perform the WB processing based on the color information of only the central teeth in the second dentition image P20 (e.g., the teeth shown in the first dentition block image P3).
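The white balance of step S105 can be sketched as a reference-region adjustment: scale the R, G, and B channels so that the mean color of a chosen tooth region (e.g., the central teeth) becomes neutral. Using neutral gray as the target is an illustrative assumption, not the embodiment's specified criterion.

```python
import numpy as np

def white_balance_from_region(image, region_mask):
    """White-balance a float RGB image so the mean color inside
    `region_mask` (e.g., the central-tooth region) becomes neutral gray."""
    ref = image[region_mask].mean(axis=0)  # mean (R, G, B) of the reference region
    gains = ref.mean() / ref               # per-channel gains toward neutral
    return np.clip(image * gains, 0.0, 255.0)
```

Passing a mask covering only the teeth shown in block P3 would correspond to the "central teeth only" variant described above.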

Note that the third dentition image P30 is composed of multiple second dentition block images P31 to P35. The second dentition block images P31 to P35 correspond to the first dentition block images P1 to P5, respectively, with white balance processing applied to each.

Referring again to Fig. 4, next, the detection unit 54 of the mobile terminal 50 detects plaque in the third dentition image P30 (S106). The detection unit 54 detects the presence or absence of plaque, but may also detect, for example, the concentration distribution of the fluorescent substance, that is, the accumulation level of plaque.
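A minimal sketch of the plaque detection in step S106. It relies on the commonly reported tendency of plaque to fluoresce reddish under blue-light excitation; the red-to-green ratio criterion and the threshold value are illustrative assumptions, not the embodiment's specified detector.

```python
import numpy as np

def detect_plaque(image, ratio_threshold=1.5):
    """Return a boolean mask of likely plaque pixels in a float RGB image,
    flagging pixels whose red-to-green ratio exceeds a threshold
    (illustrative criterion for reddish fluorescence)."""
    r = image[..., 0].astype(np.float64)
    g = image[..., 1].astype(np.float64) + 1e-6  # avoid division by zero
    return (r / g) > ratio_threshold
```

The ratio itself could also serve as a rough proxy for the fluorescent-substance concentration, i.e., the plaque accumulation level.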

Next, the generation unit 55 of the mobile terminal 50 generates an image in which the plaque region, where the plaque detected by the detection unit 54 is attached, is highlighted (S107). In this embodiment, the generation unit 55 generates an image in which a highlight display indicating the plaque region is superimposed on the third dentition image P30, but it may also generate an image in which a grayscale display according to the concentration distribution of the fluorescent substance is superimposed on the third dentition image P30.
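The highlight display of step S107 can be sketched as an alpha blend of a marker color over the detected region; the magenta color and 50% opacity are arbitrary choices for illustration.

```python
import numpy as np

def highlight_plaque(image, plaque_mask, color=(255.0, 0.0, 255.0), alpha=0.5):
    """Superimpose a highlight color on the plaque region of a float RGB image."""
    out = image.astype(np.float64).copy()
    out[plaque_mask] = (1.0 - alpha) * out[plaque_mask] + alpha * np.asarray(color)
    return out
```

A grayscale display according to the fluorescent-substance concentration could be obtained the same way by deriving a per-pixel blend weight from the detected accumulation level instead of a fixed `alpha`.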

Next, the display unit 56 of the mobile terminal 50 displays the image generated by the generation unit 55 (S108).

By using such a dentition image generating system, the user can capture images of his or her own oral cavity with the intraoral camera 10 and check the condition of the oral cavity in a panoramic image displayed on the mobile terminal 50. Furthermore, since the concentration distribution of the fluorescent substance is shown in the displayed panoramic image, the user can easily check the health condition of his or her own teeth.

In addition, since the mobile terminal 50 performs white balance processing, it can generate the third dentition image P30 on which the plaque region is superimposed (e.g., highlighted), regardless of the color of the light emitted by the illumination unit 23 of the intraoral camera 10.

The mobile terminal 50 may also generate, for example, a three-dimensional model of the multiple teeth in the oral cavity from multiple captured image data. The mobile terminal 50 may also display an image based on the generated three-dimensional model.

Note that the detection unit 54 may detect plaque in the first dentition images P11 and P12 instead of detecting plaque in the second dentition image P20.

Note that although an example has been described here in which the mobile terminal 50 processes the images of the teeth, part or all of this processing may be performed by the intraoral camera 10.

(Other Embodiments)
Although the dentition image generating system according to the embodiment of the present disclosure has been described above, the present disclosure is not limited to this embodiment.

For example, in the above embodiment, an example has been described in which the intraoral camera 10, whose main purpose is to photograph teeth, is used, but the intraoral camera 10 may be an oral care device equipped with a camera. For example, the intraoral camera 10 may be an oral irrigator or the like equipped with a camera.

In addition, in the above embodiment, the mobile terminal 50 has been given as an example of the user's information terminal, but the information terminal may be a stationary information terminal.

In addition, in the above embodiment, the multiple first dentition block images may be, for example, images captured after an announcement of the tooth position or tooth type is made at the time of capturing with the imaging unit 21, or images for which the user or the like has input the position of the teeth shown in each image.

Furthermore, each processing unit included in the dentition image generating system according to the above embodiment is typically realized as an LSI, which is an integrated circuit. These may be individually implemented as single chips, or some or all of them may be integrated into a single chip.

Furthermore, circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after the LSI is manufactured, or a reconfigurable processor whose circuit cell connections and settings inside the LSI can be reconfigured, may also be used.

Furthermore, in each of the above embodiments, each component may be configured with dedicated hardware, or may be realized by executing a software program suitable for the component. Each component may be realized by a program execution unit, such as a CPU or a processor, reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.

Furthermore, the division of functional blocks in the block diagram is one example; multiple functional blocks may be realized as one functional block, one functional block may be divided into multiple blocks, or some functions may be transferred to other functional blocks. Furthermore, the functions of multiple functional blocks having similar functions may be processed by a single piece of hardware or software in parallel or in a time-shared manner.

Furthermore, the mobile terminal 50 according to the above embodiment may be realized as a single device, or may be realized by multiple devices. When the mobile terminal 50 is realized by multiple devices, the components of the mobile terminal 50 may be distributed among the multiple devices in any manner. For example, at least some of the functions of the mobile terminal 50 may be realized by the intraoral camera 10 (e.g., the signal processing unit 30). When the mobile terminal 50 is realized by multiple devices, the communication method between the multiple devices is not particularly limited, and may be wireless communication or wired communication. Furthermore, wireless communication and wired communication may be combined between the devices.

The present disclosure may also be realized as a dentition image generating method executed by the dentition image generating system. The present disclosure may also be realized as an intraoral camera, a mobile terminal, or a cloud server included in the dentition image generating system.

In addition, the order in which the steps in the sequence diagram are executed is merely an example for specifically explaining the present disclosure, and an order other than the above may be used. In addition, some of the steps may be executed simultaneously (in parallel) with other steps.

Another aspect of the present disclosure may be a computer program that causes a computer to execute each of the characteristic steps included in the dentition image generating method shown in Fig. 4.

Furthermore, for example, the program may be a program to be executed by a computer. Furthermore, one aspect of the present disclosure may be a non-transitory computer-readable recording medium on which such a program is recorded. For example, such a program may be recorded on a recording medium and distributed or circulated. For example, by installing the distributed program in a device having another processor and causing that processor to execute the program, it is possible to cause that device to perform each of the above processes.

The dentition image generating system and the like according to one or more aspects have been described above based on the embodiment, but the present disclosure is not limited to this embodiment. As long as they do not depart from the spirit of the present disclosure, forms obtained by applying various modifications conceivable by a person skilled in the art to this embodiment, and forms constructed by combining components in different embodiments, may also be included within the scope of the one or more aspects.

The present disclosure is applicable to a dentition image generating system.

REFERENCE SIGNS LIST
10 Intraoral camera
10a Head section
10b Handle section
10c Neck section
20 Hardware unit
21 Imaging unit
22 Sensor unit
23 Illumination unit
23A First LED
23B Second LED
23C Third LED
23D Fourth LED
24 Operation unit
30 Signal processing unit
31 Camera control unit
32 Image processing unit
33 Control unit
34 Illumination control unit
35 Memory unit
40 Communication unit
50 Mobile terminal (dentition image generating device)
51 Acquisition unit
52 Processing unit
53 Synthesis unit
54 Detection unit (saturation emphasis processing unit, grayscale display processing unit)
55 Generation unit (saturation emphasis processing unit, grayscale display processing unit)
56 Display unit
P1, P2, P3, P4, P5 First dentition block images
P11, P12 First dentition images
P20 Second dentition image (dentition image)
P30 Third dentition image
P31, P32, P33, P34, P35 Second dentition block images

Claims (13)

1. A dentition image generating device comprising:
an acquisition unit that acquires a photographed image obtained by photographing a surface of a row of teeth and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light;
a synthesis unit that generates a row-of-teeth image by synthesizing a plurality of row-of-teeth block images, which are partial row-of-teeth images based on the photographed image; and
a processing unit that performs white balance processing on the synthesized row-of-teeth image.

2. The dentition image generating device according to claim 1, wherein the processing unit performs the white balance processing based on color information of a central tooth in the synthesized row-of-teeth image.

3. The dentition image generating device according to claim 1, wherein the processing unit performs the white balance processing based on color information of an entire tooth region of the synthesized row-of-teeth image.

4. The dentition image generating device according to any one of claims 1 to 3, wherein the processing unit adjusts a blue level of a tooth region of each of the plurality of row-of-teeth block images, and the synthesis unit synthesizes the plurality of row-of-teeth block images whose blue levels have been adjusted.

5. The dentition image generating device according to any one of claims 1 to 3, further comprising: a detection unit that detects dental plaque from an image based on the photographed image; and a generation unit for generating the synthesized row-of-teeth image in which a plaque region detected by the detection unit is highlighted.

6. The dentition image generating device according to claim 5, further comprising a display unit that displays the synthesized row-of-teeth image in which the plaque region is highlighted.

7. The dentition image generating device according to any one of claims 1 to 3, wherein the plurality of row-of-teeth block images include an image of front teeth.

8. The dentition image generating device according to any one of claims 1 to 3, wherein the surface of the row of teeth includes a side surface of the row of teeth.

9. The dentition image generating device according to any one of claims 1 to 3, wherein the surface of the row of teeth includes an occlusal surface of the row of teeth.

10. The dentition image generating device according to any one of claims 1 to 3, further comprising a saturation emphasis processing unit that generates a converted image by converting the synthesized row-of-teeth image into an HSV image, identifies a specific pixel region in which one or more pixels, among a plurality of pixels of the converted image, satisfying at least one of a saturation within a first predetermined range, a hue within a second predetermined range, and a brightness within a third predetermined range are located, and generates the synthesized row-of-teeth image with emphasized saturation by performing saturation emphasis processing on the specific pixel region of the synthesized row-of-teeth image.

11. The dentition image generating device according to any one of claims 1 to 3, further comprising a grayscale display processing unit that generates a converted image by converting the synthesized row-of-teeth image into an HSV image, identifies a specific pixel region in which one or more pixels, among a plurality of pixels of the converted image, satisfying at least one of a saturation within a first predetermined range, a hue within a second predetermined range, and a brightness within a third predetermined range are located, detects an accumulation level distribution of a fluorescent substance accumulated in the dental plaque from the brightness values in the specific pixel region of the converted image, and generates the synthesized row-of-teeth image including a grayscale display by performing grayscale image processing, according to the detected accumulation level distribution of the fluorescent substance, on the specific pixel region of the synthesized row-of-teeth image.

12. A dentition image generating method comprising:
acquiring a photographed image obtained by photographing a surface of a row of teeth and dental plaque in an oral cavity irradiated with light including a wavelength range of blue light;
generating a row-of-teeth image by synthesizing a plurality of row-of-teeth block images, which are partial row-of-teeth images based on the photographed image; and
performing white balance processing on the synthesized row-of-teeth image.

13. A program for causing a computer to execute the dentition image generating method according to claim 12.
PCT/JP2024/038825 2023-11-29 2024-10-31 Tooth row image generation device, tooth row image generation method, and program Pending WO2025115505A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023201600 2023-11-29
JP2023-201600 2023-11-29

Publications (1)

Publication Number Publication Date
WO2025115505A1 true WO2025115505A1 (en) 2025-06-05

Family

ID=95897688

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/038825 Pending WO2025115505A1 (en) 2023-11-29 2024-10-31 Tooth row image generation device, tooth row image generation method, and program

Country Status (1)

Country Link
WO (1) WO2025115505A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1165004A (en) * 1997-08-12 1999-03-05 Sony Corp Panoramic imaging system
JP2014133029A (en) * 2013-01-11 2014-07-24 Panasonic Corp Dental plaque detection system and dental plaque detection method
JP2020110301A (en) * 2019-01-10 2020-07-27 ミツミ電機株式会社 Oral cavity observation device
JP2021065263A (en) * 2019-10-17 2021-04-30 株式会社吉田製作所 Dental image acquisition device
JP2022130677A (en) * 2016-05-26 2022-09-06 デンタル スマートミラー, インコーポレイテッド Dental mirror having integrated camera and applications thereof
WO2022187654A1 (en) * 2021-03-05 2022-09-09 Greenmark Biomedical Inc. Dental imaging system and image analysis
JP2023504193A (en) * 2019-12-04 2023-02-01 ケアストリーム デンタル エルエルシー Intraoral 3D scanning with automatic chart creation function

Similar Documents

Publication Publication Date Title
JP5968944B2 (en) Endoscope system, processor device, light source device, operation method of endoscope system, operation method of processor device, operation method of light source device
ES2431846T5 (en) Methods and products to analyze gingival tissues
CN113727636A (en) Scanner device with replaceable scanning tip
JP2013034569A (en) Intraoral examination device and method for operating the same
US20110058717A1 (en) Methods and systems for analyzing hard tissues
CN102613953A (en) Electronic endoscope system
WO2018155630A1 (en) Photography evalulation/detection unit and optical device
EP4620433A1 (en) Image processing method, image processing device, and program
Mayer et al. Photometric CIELAB analysis of the gingiva: a novel approach to assess response to periodontal therapy
CN109152520A (en) Image signal processing device, image signal processing method, and image signal processing program
GODLEVSKY et al. Application of mobile photography with smartphone cameras for monitoring of early caries appearance in the course of orthodontic correction with dental brackets
JP2004089237A (en) Tooth observation device
WO2025115505A1 (en) Tooth row image generation device, tooth row image generation method, and program
WO2025115504A1 (en) Dentition image generation device, dentition image generation method, and program
CN113206921B (en) External light interference removing method
JP5410374B2 (en) Intraoral observation device
CN119947632A (en) Tartar detection device, tartar detection method and procedure
WO2021166749A1 (en) Learning device and medical image processing device
Haciali et al. Clinical assessment of dental color during dehydration and rehydration by various dental photography techniques
JP2003159210A (en) Method for displaying fluorescent diagnostic image and display unit thereof
WO2025173550A1 (en) Information processing device, information processing method, and program
WO2025197661A1 (en) Periodontal disease detection system, periodontal disease detection method, and program
WO2025173551A1 (en) Information processing device, information processing method, and program
JP2007190370A (en) Image composing device, and method and program thereof
JP2007190371A (en) Dental color measuring device, system, method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24897194

Country of ref document: EP

Kind code of ref document: A1