
WO2025173550A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2025173550A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
tooth
unit
information processing
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2025/003101
Other languages
French (fr)
Japanese (ja)
Inventor
泰雄 大塚
岳史 浜崎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Publication of WO2025173550A1 publication Critical patent/WO2025173550A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045: Control thereof
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/24: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C: DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C19/00: Dental auxiliary appliances
    • A61C19/06: Implements for therapeutic treatment

Definitions

  • This disclosure relates to an information processing device, an information processing method, and a program.
  • Patent Document 1 discloses a device that sets tooth regions in an image of teeth lined up side by side.
  • However, Patent Document 1 does not disclose technology for acquiring images suitable for checking the condition of the oral cavity.
  • The present disclosure therefore provides an information processing device, an information processing method, and a program that can acquire images more suitable for checking the condition of the oral cavity.
  • An information processing device includes an acquisition unit that acquires a first image including a specific tooth obtained by photographing the inside of the oral cavity, a detection unit that detects a first tooth region of the specific tooth from the first image, and a selection unit that selects the first image as a reference image of the specific tooth if the proportion of the first tooth region in the first image is equal to or greater than a predetermined value.
  • An information processing method acquires an image containing a specific tooth obtained by photographing the inside of the oral cavity, detects the tooth region of the specific tooth from the image, and, if the proportion of the tooth region in the image is equal to or greater than a predetermined value, selects the image as a reference image for the specific tooth.
  • a program according to one aspect of the present disclosure is a program for causing a computer to execute the above-described information processing method.
  • According to the present disclosure, it is possible to provide an information processing device or the like that can acquire images more suitable for checking the condition of the oral cavity.
  • FIG. 1 is a perspective view of an intraoral camera in an information processing system according to an embodiment.
  • FIG. 2 is a schematic configuration diagram of an information processing system according to an embodiment.
  • FIG. 3 is a block diagram illustrating a functional configuration of the mobile terminal according to the embodiment.
  • FIG. 4 is a flowchart showing a first operation of the information processing system according to the embodiment.
  • FIG. 5 is a diagram for explaining step S13 shown in FIG. 4.
  • FIG. 6 is a flowchart showing a second operation of the information processing system according to the embodiment.
  • FIG. 7A is a first diagram for explaining step S24 shown in FIG. 6.
  • FIG. 7B is a second diagram for explaining step S24 shown in FIG. 6.
  • FIG. 8 is a flowchart showing the operation of evaluating the oral care state according to the embodiment.
  • FIG. 9 is a diagram for explaining a shooting guide in the information processing system according to the first modification of the embodiment.
  • FIG. 10 is a flowchart showing the operation of the information processing system according to the first modification of the embodiment.
  • FIG. 11 shows a screen for selecting a tooth to be photographed according to the second modification of the embodiment.
  • the information processing device includes an acquisition unit that acquires a first image including a specific tooth obtained by photographing the inside of the oral cavity, a detection unit that detects a first tooth region of the specific tooth from the first image, and a selection unit that selects the first image as a reference image of the specific tooth if the proportion of the first tooth region in the first image is equal to or greater than a predetermined value.
  • an information processing device may be the information processing device according to the first aspect, wherein the acquisition unit further acquires a second image including the specific tooth obtained by photographing the oral cavity, the detection unit detects a second tooth region of the specific tooth from the second image, and the selection unit selects a second image having a second tooth region similar to the first tooth region of the first image as a comparison image to be compared with the reference image.
  • the information processing device may be the information processing device according to the second aspect, wherein the acquisition unit acquires one or more of the second images, and the selection unit selects, from among the one or more second images, a second image in which the degree of overlap between the second tooth region in the second image and the first tooth region in the reference image is equal to or greater than a predetermined degree, as the comparison image.
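The overlap test described in this aspect can be sketched as an intersection-over-union comparison of binary tooth-region masks. This is an illustrative assumption, not the patent's stated algorithm: the function names, the pixel-set representation, and the 0.8 threshold are all hypothetical.

```python
# Hypothetical sketch: select a comparison image whose tooth region
# overlaps the reference tooth region by at least a predetermined degree.
# Regions are modeled as sets of (row, col) pixel coordinates.

def overlap_degree(region_a: set, region_b: set) -> float:
    """Intersection-over-union of two pixel sets (0.0 when both are empty)."""
    union = region_a | region_b
    if not union:
        return 0.0
    return len(region_a & region_b) / len(union)

def select_comparison_image(reference_region: set, candidates: list,
                            threshold: float = 0.8):
    """Return the id of the first candidate (image_id, region) whose tooth
    region overlaps the reference region by at least `threshold`."""
    for image_id, region in candidates:
        if overlap_degree(reference_region, region) >= threshold:
            return image_id
    return None
```

Other overlap measures (e.g., the intersection divided by the reference area alone) would also satisfy the wording of this aspect; IoU is used here only because it is symmetric and bounded in [0, 1].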
  • the information processing device may be the information processing device according to the second or third aspect, and may further include a control unit that, when the imaging device that captures images of the oral cavity captures the second image, causes an imaging guide based on the reference image to be superimposed on the image acquired from the imaging device and displayed on the display device.
  • the information processing device may be the information processing device according to the fourth aspect, and the imaging guide may include a silhouette, outline, or semi-transparent display of the specific tooth included in the reference image.
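A semi-transparent imaging guide as described above amounts to alpha-blending the reference silhouette onto the live camera frame. The following is a minimal per-pixel sketch on nested RGB tuples; the guide color and alpha value are illustrative assumptions, and a real implementation would use an imaging library rather than pure Python.

```python
# Hedged sketch of superimposing a semi-transparent imaging guide:
# blend a guide color onto every frame pixel covered by the reference
# tooth silhouette. `frame` is a 2-D list of (r, g, b) tuples and
# `silhouette` a same-sized binary mask (1 = guide pixel).

def overlay_guide(frame, silhouette, guide_rgb=(0, 255, 0), alpha=0.4):
    """Return a copy of `frame` with `guide_rgb` alpha-blended onto
    every pixel where the silhouette mask is truthy."""
    out = []
    for y, row in enumerate(frame):
        new_row = []
        for x, (r, g, b) in enumerate(row):
            if silhouette[y][x]:
                r = round((1 - alpha) * r + alpha * guide_rgb[0])
                g = round((1 - alpha) * g + alpha * guide_rgb[1])
                b = round((1 - alpha) * b + alpha * guide_rgb[2])
            new_row.append((r, g, b))
        out.append(new_row)
    return out
```

An outline guide would blend only the mask's boundary pixels, and a silhouette-only guide could use a uniform gray instead of a color; both are variations on the same blending step.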
  • an information processing device may be an information processing device according to any one of the second to fifth aspects, wherein the detection unit further detects a first plaque region of the specific tooth based on the reference image and a second plaque region of the specific tooth based on the comparison image, and further includes a score calculation unit that calculates a first plaque score based on the first tooth region and the first plaque region in the reference image and calculates a second plaque score based on the second tooth region and the second plaque region in the comparison image, and a comparison unit that compares oral care conditions based on the first plaque score and the second plaque score.
  • With this configuration, the plaque score is calculated using an image with a large tooth area, which yields a more accurate plaque score. It is therefore possible to obtain images more suitable for comparing plaque scores.
  • an information processing device may be the information processing device according to the sixth aspect, wherein the score calculation unit calculates the first plaque score based on the ratio of the number of pixels in the first plaque region to the number of pixels in the natural tooth region in the reference image, and calculates the second plaque score based on the ratio of the number of pixels in the second plaque region to the number of pixels in the natural tooth region in the comparison image.
  • an information processing device is an information processing device according to any one of the first to seventh aspects, and the selection unit may select the first image as the reference image if the ratio of the number of pixels in the first tooth region to the total number of pixels in the first image is equal to or greater than the predetermined value.
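The reference-image test in this aspect reduces to counting tooth pixels in a binary mask and comparing the ratio against the predetermined value. A minimal sketch, assuming a 2-D 0/1 mask and an illustrative threshold of 0.5 (the embodiment below suggests values of roughly 40-60%):

```python
# Minimal sketch of the reference-image selection: accept an image when
# the ratio of tooth pixels to total pixels meets the predetermined value.

def tooth_area_ratio(mask) -> float:
    """Fraction of 1-pixels in a 2-D binary mask (1 = tooth, 0 = other)."""
    total = sum(len(row) for row in mask)
    tooth = sum(sum(row) for row in mask)
    return tooth / total if total else 0.0

def is_reference_image(mask, predetermined_value: float = 0.5) -> bool:
    """True when the image qualifies as a reference image of the tooth."""
    return tooth_area_ratio(mask) >= predetermined_value
```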
  • an information processing device may be an information processing device according to any one of the second to seventh aspects, in which the first image is an image of the inside of the oral cavity before the user performs oral care treatment, and the second image is an image of the inside of the oral cavity after the user has performed oral care treatment.
  • An information processing method acquires an image including a specific tooth obtained by photographing the inside of the oral cavity, detects the tooth region of the specific tooth from the image, and, if the proportion of the tooth region in the image is equal to or greater than a predetermined value, selects the image as a reference image for the specific tooth.
  • a program according to an eleventh aspect of the present disclosure is a program for causing a computer to execute the above-described information processing method.
  • these general or specific aspects may be realized as a system, method, integrated circuit, computer program, or non-transitory recording medium such as a computer-readable CD-ROM, or as any combination of a system, method, integrated circuit, computer program, or recording medium.
  • the program may be pre-stored on the recording medium, or may be supplied to the recording medium via a wide area communication network, including the Internet.
  • each figure is a schematic diagram and is not necessarily an exact illustration. Therefore, for example, the scales of the figures do not necessarily match. Furthermore, in each figure, substantially identical components are assigned the same reference numerals, and duplicate explanations are omitted or simplified.
  • ordinal numbers such as “first” and “second” do not refer to the number or order of components, unless otherwise specified, but are used to avoid confusion and distinguish between components of the same type.
  • Figure 1 is a perspective view of an intraoral camera 10 in the information processing system according to the present embodiment.
  • the intraoral camera 10 has a toothbrush-shaped housing that can be handled with one hand.
  • the housing includes a head portion 10a that is placed in the user's oral cavity when taking a photograph, a handle portion 10b that is held by the user, and a neck portion 10c that connects the head portion 10a and the handle portion 10b.
  • the imaging unit 21 captures images of the surfaces of the dentition and dental plaque in the oral cavity irradiated with light including the wavelength range of blue light.
  • the surfaces of the dentition include at least one of the cheek (outside) side surface of the dentition, the tongue (inside) side surface of the dentition, and the occlusal surface of the dentition.
  • the dentition may also include, for example, one or more teeth.
  • the imaging unit 21 is an example of an imaging device.
  • the photographing unit 21 is incorporated into the head unit 10a and neck unit 10c.
  • the photographing unit 21 has an image sensor (not shown) and a lens (not shown) arranged on its optical axis LA.
  • the imaging element is a photographing device such as a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) element, and an image of the teeth is formed by a lens.
  • the imaging element outputs a signal (image data) corresponding to the formed image to the outside.
  • the image captured by the imaging element is also referred to as an RGB image.
  • An RGB image is an image obtained by illuminating the dentition with blue light, and may be, for example, an image of the side of the dentition, or an image of the occlusal surface of the dentition.
  • the image includes, for example, one or more images (e.g., time-series images) of the dentition along its alignment.
  • the side of the dentition may be the lingual side or the buccal side.
  • the RGB image includes, for example, one or more teeth.
  • the imaging unit 21 may further include an optical filter that blocks light of a color emitted from the illumination unit (illumination device) and transmits fluorescence emitted by plaque in response to that light.
  • the imaging unit 21 may include, as an optical filter, a blue light cut-off filter that removes blue-wavelength light components from the light incident on the imaging element.
  • the intraoral camera 10 is also equipped with multiple first to fourth LEDs 23A-23D as an illumination unit that irradiates light onto the teeth to be photographed during photography.
  • the first to fourth LEDs 23A-23D irradiate dental plaque with light of a color that causes the plaque to fluoresce (for example, single-color light).
  • the first to fourth LEDs 23A-23D are, for example, blue LEDs that irradiate blue light including a wavelength whose peak is 405 nm (an example of a predetermined wavelength). Note that the first to fourth LEDs 23A-23D are not limited to blue LEDs, and may be any light source that irradiates light including the blue light wavelength range.
  • FIG. 2 is a schematic diagram of an information processing system according to this embodiment.
  • the information processing system according to this embodiment is generally configured so that the imaging unit 21 captures fluorescence emitted by plaque in response to light from the illumination unit 23, and from the one or more captured RGB images (one or more images), an image more suitable for checking the condition of the oral cavity is obtained. Furthermore, the information processing system according to this embodiment is configured to select two images that allow the condition of the oral cavity to be compared.
  • the information processing system includes an intraoral camera 10 and a mobile terminal 50.
  • the intraoral camera 10 comprises a hardware unit 20, a signal processing unit 30, and a communication unit 40.
  • the hardware unit 20 is a physical element of the intraoral camera 10 and includes a photographing unit 21, a sensor unit 22, an illumination unit 23, and an operation unit 24.
  • the photographing unit 21 generates image data by photographing the teeth in the user's oral cavity. It can also be said that the photographing unit 21 generates image data by photographing the surface of the dentition in the oral cavity and the plaque, which are irradiated with light of a predetermined wavelength that excites the fluorescent substance contained in the plaque.
  • the photographing unit 21 receives a control signal from the camera control unit 31, performs operations such as photographing in accordance with the received control signal, and outputs video or still image data obtained by photographing to the image processing unit 32.
  • the photographing unit 21 has the above-mentioned image sensor, optical filter, and lens.
  • the image data is generated based on light that has passed through the optical filter, for example. Furthermore, the image data is an image showing multiple teeth, but it is sufficient that the image shows at least one tooth.
  • the sensor unit 22 detects external light entering the imaging area of the RGB image. For example, the sensor unit 22 detects whether external light is entering the oral cavity.
  • the sensor unit 22 is arranged, for example, near the imaging unit 21.
  • the sensor unit 22 may be arranged, for example, in the head unit 10a of the intraoral camera 10, similar to the imaging unit 21. In other words, the sensor unit 22 is located inside the user's oral cavity when the imaging unit 21 takes an image.
  • external light may be detected by the image processing unit 32 (described below) using the image captured by the imaging unit 21.
  • the signal processing unit 30 (for example, the image processing unit 32) may have the functions of the sensor unit 22.
  • the lighting unit 23 irradiates light onto the area of the multiple areas in the oral cavity that will be photographed by the imaging unit 21.
  • the illumination unit 23 uses, for example, QLF (quantitative light-induced fluorescence) technology.
  • porphyrin, a substance produced by bacteria in dental plaque, is known to fluoresce a reddish-pink color (excited fluorescence) when irradiated with blue light.
  • the lighting unit 23 irradiates blue light onto the area that will be photographed by the imaging unit 21.
  • Each of the first to fourth LEDs 23A to 23D is configured so that at least the dimming can be controlled.
  • Each of the first to fourth LEDs 23A to 23D may also be configured so that the dimming and color adjustment can be controlled.
  • the first to fourth LEDs 23A to 23D are arranged to surround the imaging unit 21.
  • the illumination unit 23 controls the illumination intensity (light emission intensity) according to the shooting area.
  • the illumination intensity of each of the first to fourth LEDs 23A to 23D may be controlled uniformly, or may be controlled to differ from one another.
  • the number of LEDs in the illumination unit 23 is not particularly limited, and may be one, or five or more.
  • the illumination unit 23 is not limited to having an LED as a light source, and may also have other light sources.
  • the operation unit 24 accepts operations from the user.
  • the operation unit 24 is configured, for example, with push buttons, but may also be configured to accept operations via voice, etc.
  • the hardware unit 20 may also include a battery (e.g., a secondary battery) that supplies power to each component of the intraoral camera 10, a coil for wireless charging by an external charger connected to a commercial power source, and an actuator required for at least one of composition adjustment and focus adjustment.
  • a battery e.g., a secondary battery
  • the signal processing unit 30 has functional components implemented by a CPU (Central Processing Unit) or MPU (Micro Processor Unit) that execute various processes described below, and a memory unit 35 such as a ROM (Read Only Memory) or RAM (Random Access Memory) that stores programs for causing each functional component to execute various processes.
  • the signal processing unit 30 has a camera control unit 31, an image processing unit 32, a control unit 33, a lighting control unit 34, and the memory unit 35.
  • the camera control unit 31 is mounted, for example, on the handle portion 10b of the intraoral camera 10 and controls the imaging unit 21.
  • the camera control unit 31 controls at least one of the aperture and shutter speed of the imaging unit 21 in response to a control signal from the image processing unit 32, for example.
  • the image processing unit 32 is mounted, for example, on the handle unit 10b of the intraoral camera 10, acquires the RGB image (image data) captured by the imaging unit 21, performs image processing on the acquired RGB image, and outputs the processed RGB image to the camera control unit 31 and the control unit 33.
  • the image processing unit 32 may also output the processed RGB image to the memory unit 35, and store the processed RGB image in the memory unit 35.
  • the image processing unit 32 is composed of, for example, a circuit, and performs image processing such as noise removal and edge enhancement on RGB images. Note that noise removal and edge enhancement processing may also be performed by the mobile terminal 50.
  • the RGB image output from the image processing unit 32 may be transmitted to the mobile terminal 50 via the communication unit 40, and an image based on the transmitted RGB image may be displayed on the display unit 56 of the mobile terminal 50. This allows the user to be presented with an image based on the RGB image.
  • the control unit 33 is a control device that controls the signal processing unit 30.
  • the control unit 33 controls each component of the signal processing unit 30 based on, for example, the detection results of external light, etc. by the sensor unit 22.
  • the lighting control unit 34 is mounted, for example, on the handle portion 10b of the intraoral camera 10 and controls the turning on and off of the first to fourth LEDs 23A to 23D.
  • the lighting control unit 34 is composed of, for example, a circuit. For example, when a user performs an operation on the display unit 56 of the mobile terminal 50 to start the intraoral camera 10, a corresponding signal is sent from the mobile terminal 50 to the signal processing unit 30 via the communication unit 40.
  • the lighting control unit 34 of the signal processing unit 30 turns on the first to fourth LEDs 23A to 23D based on the received signal.
  • the memory unit 35 also stores RGB images (image data) captured by the image capture unit 21.
  • the memory unit 35 is realized by, for example, semiconductor memory such as ROM or RAM, but is not limited to this.
  • the communication unit 40 is a wireless communication module for wireless communication with the mobile terminal 50.
  • the communication unit 40 is mounted, for example, on the handle portion 10b of the intraoral camera 10, and communicates wirelessly with the mobile terminal 50 based on control signals from the signal processing unit 30.
  • the communication unit 40 performs wireless communication with the mobile terminal 50 in accordance with existing communication standards such as Wi-Fi (registered trademark) and Bluetooth (registered trademark). RGB images are sent from the intraoral camera 10 to the mobile terminal 50 via the communication unit 40, and operation signals are sent from the mobile terminal 50 to the intraoral camera 10.
  • the mobile terminal 50 performs processing to obtain images more suitable for checking the condition of the oral cavity, for example, by irradiating the teeth with light including the wavelength range of blue light, and using RGB images of the surfaces of the dentition and dental plaque that are reacting to fluorescence.
  • the mobile terminal 50 functions as a user interface for the information processing system.
  • the mobile terminal 50 is also an example of an information processing device.
  • FIG. 3 is a block diagram showing the functional configuration of the mobile terminal 50 according to this embodiment.
  • the mobile terminal 50 includes a communication unit 51, a detection unit 52, a selection unit 53, a score calculation unit 54, a control unit 55, and a display unit 56.
  • the mobile terminal 50 also includes a processor and memory.
  • the memory may be a ROM or RAM, and may store programs executed by the processor.
  • the communication unit 51, the detection unit 52, the selection unit 53, the score calculation unit 54, the control unit 55, and the display unit 56 are implemented by a processor that executes programs stored in the memory.
  • the mobile terminal 50 may be implemented, for example, by a smartphone or tablet device capable of wireless communication.
  • the communication unit 51 is a wireless communication module for wireless communication with the intraoral camera 10.
  • the communication unit 51 acquires RGB images from the intraoral camera 10. Specifically, the communication unit 51 acquires images including one or more teeth generated by the imaging unit 21.
  • the RGB images are images obtained by the intraoral camera 10 by irradiating the teeth with light including the wavelength range of blue light and capturing images of the teeth that are undergoing a fluorescent reaction.
  • the communication unit 51 may also include a wired communication module for wired communication with the intraoral camera 10.
  • the communication unit 51 is an example of an acquisition unit.
  • RGB images will also be referred to as images.
  • the detection unit 52 detects the tooth region and plaque region of a specific tooth from the image acquired by the communication unit 51. Because teeth and plaque emit different fluorescence, the detection unit 52 detects the tooth and plaque based on the color of the fluorescent (excited fluorescence) portion in the image. Specifically, the detection unit 52 detects the first tooth region, second tooth region, first plaque region, and second plaque region, which will be described later.
  • the first tooth region and second tooth region are natural tooth regions.
  • when irradiated with excitation light, dental plaque (the plaque region) fluoresces a reddish-pink color (excitation fluorescence). It is also known that when teeth (tooth regions) are irradiated with excitation light, fluorescence is emitted from the dentin, which passes through the enamel and appears green. Furthermore, fillings left behind after caries treatment (e.g., metal inlays) do not emit excitation fluorescence under blue LED light and are captured darkly (at low brightness) by the camera. In other words, tooth regions and plaque regions in the image are distinguishable, and within tooth regions, natural tooth regions and fillings are distinguishable.
  • Here, natural tooth regions refer to the areas of teeth excluding fillings. Note that any known technology may be used to detect the tooth regions and plaque regions.
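The color distinctions above (green fluorescence for natural teeth, reddish-pink for plaque, low brightness for metal fillings) suggest a simple per-pixel classification. The following is an illustrative sketch only, not the patent's actual detection algorithm; every threshold is an assumption made for the example.

```python
# Illustrative per-pixel classifier for an RGB image captured under
# blue-light excitation: natural teeth fluoresce green, plaque
# (porphyrin) fluoresces reddish-pink, and metal fillings stay dark.
# All thresholds below are assumptions, not values from the patent.

def classify_pixel(r: int, g: int, b: int) -> str:
    brightness = r + g + b
    if brightness < 90:        # little fluorescence -> filling or background
        return "dark"
    if r > g and r > b:        # red-dominant -> plaque fluorescence
        return "plaque"
    if g >= r and g >= b:      # green-dominant -> natural tooth
        return "tooth"
    return "other"
```

Running this classifier over every pixel yields the binary tooth and plaque masks used by the ratio and score calculations elsewhere in the description.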
  • the selection unit 53 selects an image that is more suitable for checking the condition of the oral cavity based on the image acquired by the communication unit 51. If the proportion of the tooth region in the image is equal to or greater than a first predetermined value, the selection unit 53 selects the first image as a reference image of a specific tooth. For example, the selection unit 53 may select an image as a reference image of a specific tooth if the proportion of the number of pixels in the tooth region to the total number of pixels in the image is equal to or greater than a first predetermined value.
  • the first predetermined value is set in advance and stored in a memory unit (not shown) of the mobile terminal 50.
  • the first predetermined value is an example of a predetermined value.
  • the selection unit 53 may also perform processing to select a comparison image to be compared with the reference image.
  • the score calculation unit 54 calculates a plaque score that indicates the degree of plaque based on the tooth region and plaque region in the image.
  • the plaque score is a numerical value that indicates the condition of the oral cavity.
  • the score calculation unit 54 calculates the plaque score based on the ratio of the plaque area (e.g., the number of pixels in the plaque) or the amount of fluorescence from the plaque to the tooth area (e.g., the number of pixels in the tooth region).
  • the score calculation unit 54 calculates the plaque score, for example, based on the following Equation 1.
  • Plaque score = plaque area / tooth area ... (Equation 1)
  • tooth area refers to the area of the natural tooth region in the image.
  • the method for calculating the plaque score is not limited to this, and it may be calculated using other methods that use the area of the tooth region.
  • the control unit 55 performs various processes on the plaque score calculated by the score calculation unit 54.
  • the control unit 55 may, for example, compare oral care conditions based on the plaque scores of two images.
  • the control unit 55 may, for example, compare the plaque scores of images taken before and after an oral care treatment such as brushing teeth.
  • the control unit 55 may, for example, evaluate the effectiveness of an oral care treatment based on the plaque score of the oral care treatment. In this way, the control unit 55 may function as a comparison unit that compares oral care conditions.
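Equation 1 and the before/after comparison performed by the control unit 55 can be sketched as follows; the function names are illustrative, and the areas are the pixel counts of the plaque region and the natural tooth region.

```python
# Sketch of Equation 1 and the before/after comparison of plaque scores.

def plaque_score(plaque_pixels: int, natural_tooth_pixels: int) -> float:
    """Equation 1: plaque score = plaque area / tooth area (pixel counts)."""
    if natural_tooth_pixels == 0:
        raise ValueError("natural tooth region is empty")
    return plaque_pixels / natural_tooth_pixels

def care_improved(before_score: float, after_score: float) -> bool:
    """True when the oral care treatment reduced the plaque score."""
    return after_score < before_score
```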
  • FIG. 4 is a flowchart showing a first operation (information processing method) of the information processing system according to this embodiment.
  • FIG. 4 describes the operation of selecting an appropriate image as an image before tooth brushing. Note that in this embodiment, the information processing system only needs to execute at least the operation shown in FIG. 4 (for example, the operation of selecting a before image) among the operations shown in FIGS. 4, 6, and 8.
  • the communication unit 51 acquires a before image from the intraoral camera 10, which is an image of the inside of the user's oral cavity before brushing (S11).
  • the before image is an example of a first image and, in this case, is an image that includes a specific tooth, i.e., the target tooth to be checked for spots missed during brushing.
  • the control unit 55 determines whether there are multiple before images (S12).
  • the control unit 55 determines whether there are multiple before images acquired by the communication unit 51 in step S11.
  • the selection unit 53 determines whether any of the multiple before images has a tooth area ratio equal to or greater than a first predetermined value (S13).
  • step S13 the detection unit 52 first detects the tooth area of a specific tooth in each of the multiple before images, and calculates the tooth area ratio by dividing the area (e.g., number of pixels) of the tooth area in the before image by the area (e.g., number of pixels) of the entire before image.
  • Figure 5 is a diagram for explaining step S13 shown in Figure 4.
  • Images A and B are schematic diagrams of before images of the same tooth (specific tooth) that have been binarized.
  • the before image here is an image of the occlusal surface of a first molar.
  • the natural tooth area is shown without hatching, and areas other than the natural tooth area are shown with diagonal hatching.
  • image B is an image in which the natural tooth area occupies a smaller proportion of the entire image because the photographing position is shifted to the right compared with image A.
  • the detection unit 52 binarizes the tooth area and other areas in the before image and calculates the proportion of natural tooth area in the image.
  • the proportion of natural tooth area is 62%
  • the proportion of natural tooth area is 27%.
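The binarization and ratio calculation described above can be sketched as follows. This is a minimal illustration in Python/NumPy, not the patent's implementation; in particular, the fixed intensity threshold used to separate the tooth area is an assumed placeholder for whatever detector the detection unit 52 actually uses.

```python
import numpy as np

def binarize_tooth_region(gray: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Binarize a grayscale intraoral image into tooth (True) vs. other (False).

    The fixed intensity threshold is a placeholder; the detection unit 52
    may use any tooth-region detector (e.g., a learned segmenter).
    """
    return gray >= threshold

def tooth_area_ratio(tooth_mask: np.ndarray) -> float:
    """Tooth area ratio: tooth pixels divided by all pixels in the image."""
    return np.count_nonzero(tooth_mask) / tooth_mask.size
```

For image B above, 27 tooth pixels in a 100-pixel image would give a ratio of 0.27, i.e., 27%.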
  • The plaque score for image B is more susceptible than that for image A to the influence of plaque remaining at the boundary between the natural tooth and the prosthesis. Because plaque tends to remain at this boundary, if image B is an image taken after oral care treatment, the plaque score may increase even though the treatment has been performed. In other words, if the proportion of natural tooth area is small, there is a risk that the plaque score will not be calculated accurately. Therefore, in the present disclosure, as shown in step S13, a process is performed to extract images in which the tooth area proportion is equal to or greater than a first predetermined value.
  • The first predetermined value is preferably a high value from the perspective of accurately calculating the plaque score, and may be 40% or more, more preferably 50% or more, and even more preferably 60% or more. Furthermore, the first predetermined value may be set to a different value for each type of tooth (e.g., "incisor," "canine," "molar"), or the same value may be set for all types.
  • The detection unit 52 calculates the tooth area ratio for each of the multiple before images.
  • The selection unit 53 then determines whether or not there is an image among the multiple before images in which the tooth area ratio is equal to or greater than the first predetermined value.
  • If the selection unit 53 determines that there is an image in which the tooth area ratio is equal to or greater than the first predetermined value (Yes in S13), it selects a before image in which the tooth area ratio satisfies a predetermined condition (S14).
  • The predetermined condition may be, for example, that the tooth area ratio is close to the average, median, or mode of the tooth area ratios of the one or more before images in which the tooth area ratio is equal to or greater than the first predetermined value, or that the tooth area ratio is the maximum value.
  • For example, the selection unit 53 may select the before image whose tooth area ratio is closest to the average of the tooth area ratios of the one or more before images in which the tooth area ratio is equal to or greater than the first predetermined value.
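The candidate filtering of step S13 and the "closest to the average" condition of step S14 can be sketched as follows; the 40% threshold is one of the example values mentioned above, and returning None to signal re-photographing (S17/S18) is an illustrative convention, not the patent's stated interface.

```python
def select_reference_index(ratios, first_predetermined_value=0.4):
    """Select the before image used as the reference image (steps S13-S15).

    Keeps only images whose tooth area ratio meets the first predetermined
    value, then returns the index of the candidate whose ratio is closest to
    the candidates' average (one of the 'predetermined conditions' above).
    Returns None when no image qualifies, i.e., re-photographing should be
    requested (S17/S18).
    """
    candidates = [(i, r) for i, r in enumerate(ratios)
                  if r >= first_predetermined_value]
    if not candidates:
        return None
    mean = sum(r for _, r in candidates) / len(candidates)
    return min(candidates, key=lambda ir: abs(ir[1] - mean))[0]
```

For example, with ratios [0.62, 0.27, 0.45, 0.55], the 0.27 image is discarded and the image closest to the candidates' average (0.54) is chosen.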
  • The selection unit 53 determines the before image whose tooth area ratio satisfies the predetermined condition, i.e., the selected before image, as the reference image (S15). In other words, the selection unit 53 selects, from the one or more before images, a before image that satisfies the predetermined condition as the reference image, and stores the reference image in the storage unit.
  • Although the selection unit 53 here determines one before image as the reference image, it may also determine, for example, two or more before images that satisfy the predetermined condition as reference images. In this way, the selection unit 53 functions as a first selection unit for selecting a reference image from one or more before images.
  • If there is only one before image (No in S12), the selection unit 53 determines whether the tooth area ratio of that before image is equal to or greater than the first predetermined value (S16), and if it determines that the tooth area ratio is equal to or greater than the first predetermined value (Yes in S16), proceeds to step S15. For example, if the ratio of the number of pixels in the tooth region to the total number of pixels in the before image is equal to or greater than the first predetermined value, the selection unit 53 may select the before image as the reference image.
  • Here, the pixel number ratio is synonymous with the area ratio.
  • If no before image has a tooth area ratio equal to or greater than the first predetermined value (No in S13 or No in S16), the control unit 55 issues a notification to re-photograph the oral cavity (S17, S18), because there is no before image that can be used as the reference image. This makes it possible to prevent plaque scores from being calculated for teeth that are mostly covered by metal inlays, or from images in which natural teeth are not visible, or are visible but small.
  • Figure 6 is a flowchart showing the second operation (information processing method) of the information processing system according to this embodiment.
  • The communication unit 51 acquires an After image, which is an image of the user's oral cavity after brushing, from the intraoral camera 10 (S21).
  • The After image is an example of a second image and, in this case, is an image that includes the specific tooth that is the target tooth for which brushing is to be checked.
  • The control unit 55 determines whether there are multiple After images (S22), that is, whether multiple After images were acquired by the communication unit 51 in step S21.
  • If there are multiple After images, the selection unit 53 determines whether any of them has a tooth area ratio equal to or greater than a first predetermined value (S23).
  • The first predetermined value used in step S23 is, for example, the same value as the first predetermined value used in step S13, but is not limited to this.
  • The first predetermined value used in step S23 may be changed, for example, depending on the tooth area ratio of the before image determined as the reference image in step S15.
  • The determination method in step S23 is the same as that in step S13, and so a description thereof is omitted.
  • If the selection unit 53 determines that there is an image with a tooth area ratio equal to or greater than the first predetermined value (Yes in S23), it further determines whether, among those images, there is an image whose degree of overlap with the reference image is equal to or greater than a second predetermined value (S24). From the one or more After images for which the determination in step S23 is Yes, the selection unit 53 extracts After images that have an angle of view similar to that of the reference image, i.e., that are similar to the reference image. For example, the selection unit 53 may select, as the comparison image, an After image that has a second tooth area similar to the first tooth area in the before image.
  • The selection unit 53 may also select, as the comparison image, an After image from the one or more second images that shows the same tooth as the tooth included in the reference image and was captured from a position closer (e.g., the closest) to the position at which the reference image was captured.
  • The degree of overlap is an example of the overlapping degree, and the second predetermined value is an example of the predetermined degree.
  • The determination of whether a tooth is the same as the tooth included in the reference image may be made by any known method, for example, based on the posture (position and orientation) of the intraoral camera 10 when the reference image and the comparison image were captured, or on the shape of the tooth (for example, pattern matching based on the shape of the tooth included in the reference image).
  • The selection unit 53 extracts at least three feature points on the contours of the natural teeth contained in the reference image and the After image, and performs image correction using the feature points so that the coordinates of corresponding feature points in the two images match, thereby aligning their relative positions. Image correction will be explained with reference to FIGS. 7A and 7B.
  • FIGS. 7A and 7B are diagrams illustrating the image correction performed in step S24 shown in FIG. 6.
  • FIG. 7A is a diagram illustrating the process of adjusting the relative position when the after image is misaligned in the X-axis and Y-axis directions with respect to the reference image
  • FIG. 7B is a diagram illustrating the process of adjusting the relative position when the after image is rotated with respect to the reference image. Note that (c) of FIG. 7A and (c) of FIG. 7B are enlarged images for convenience.
  • As shown in FIG. 7A, the selection unit 53 uses the feature points to translate at least one of the reference image and the After image in the X-axis and Y-axis directions so that the contour of the tooth region (natural tooth region) in the reference image and that in the After image overlap. Such translation is an example of image correction.
  • (a) of FIG. 7B shows the reference image, and (b) of FIG. 7B shows an After image for which the determination in step S23 was Yes and in which rotation has occurred.
  • Rotation can occur, for example, when the orientation of the intraoral camera 10 when the after image was captured is such that the intraoral camera 10 has rotated around the optical axis of the intraoral camera 10 relative to the orientation of the intraoral camera 10 when the reference image was captured.
  • As shown in FIG. 7B, the selection unit 53 uses the feature points to rotate at least one of the reference image and the After image so that the contour of the tooth region (natural tooth region) in the reference image and that in the After image overlap. Such rotation is an example of image correction.
  • The selection unit 53 then calculates the area of the overlapping portion of the tooth regions when the contours overlap, as shown in (c) of FIG. 7A or (c) of FIG. 7B.
  • In other words, the selection unit 53 aligns the reference image and the After image relative to each other based on feature points (e.g., feature points on the contour line) of the tooth region in the reference image and of the tooth region in the After image, and identifies the overlapping area between the aligned reference image and After image.
  • In (c) of FIGS. 7A and 7B, the overlapping area between the tooth region in the reference image and the tooth region in the After image is indicated by dot hatching.
  • Image correction is not limited to the above and may also include, for example, enlarging or reducing the image.
  • The degree of overlap may be the value obtained by dividing the area (or number of pixels) of the overlapping portion by the area (or number of pixels) of the tooth region in the reference image, or it may be the area of the overlapping portion itself.
  • The area of the overlapping portion may also be expressed as the number of pixels in the overlapping portion.
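One way to realize the alignment and overlap computation described for step S24 is sketched below: a least-squares 2-D rigid fit (rotation plus translation) from at least three corresponding contour feature points, followed by warping the After image's tooth mask into the reference frame and dividing the overlap area by the reference tooth area. The Kabsch-style fit and nearest-neighbour warping are assumptions; the text only requires that the contours be brought into overlap by translation, rotation, or scaling.

```python
import numpy as np

def estimate_rigid_2d(src_pts, dst_pts):
    """Least-squares rotation R and translation t with dst ≈ R @ src + t
    (2-D Kabsch/Procrustes), from >= 3 corresponding feature points, e.g.
    points on the natural-tooth contour."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dc - R @ sc
    return R, t

def overlap_degree(ref_mask, after_mask, R, t):
    """Warp the After-image tooth mask into the reference frame (nearest
    neighbour) and return |overlapping portion| / |reference tooth region|."""
    ys, xs = np.nonzero(after_mask)
    mapped = (R @ np.stack([xs, ys]).astype(float)).T + t  # (x, y) per pixel
    warped = np.zeros(ref_mask.shape, dtype=bool)
    cols = np.rint(mapped[:, 0]).astype(int)
    rows = np.rint(mapped[:, 1]).astype(int)
    ok = (rows >= 0) & (rows < ref_mask.shape[0]) \
       & (cols >= 0) & (cols < ref_mask.shape[1])
    warped[rows[ok], cols[ok]] = True
    ref_px = np.count_nonzero(ref_mask)
    return np.count_nonzero(warped & (ref_mask > 0)) / ref_px if ref_px else 0.0
```

With a purely translated After image (as in FIG. 7A), the fit recovers the translation and the overlap degree approaches 1.0.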
  • The tooth region in the images shown in FIGS. 7A(a) and 7B(a) is an example of a first tooth region, and the tooth region in the images shown in FIGS. 7A(b) and 7B(b) is an example of a second tooth region.
  • The selection unit 53 determines, as the comparison image, an After image whose degree of overlap is equal to or greater than the second predetermined value (S25). In other words, the selection unit 53 selects, from the one or more After images, an After image whose degree of overlap is equal to or greater than the second predetermined value as the comparison image.
  • Such reference and comparison images are similar images with much in common in the tooth regions shown, and thus make it easy to compare the state of the teeth. It can also be said that the selection unit 53 selects, as the comparison image, an After image that has a high similarity to the reference image of the specific tooth from, for example, the one or more After images.
  • The similarity is, for example, a value that corresponds to the degree of overlap of the tooth regions; the higher the degree of overlap, the higher the similarity.
  • The degree of overlap is an example of a value based on the area of the overlapping portion between the tooth region of the reference image and the tooth region of the After image. The value based on the area is not limited to the degree of overlap, and may be, for example, the area of the tooth region.
  • The selection unit 53 associates the comparison image with the reference image and stores them in the storage unit. Note that although the selection unit 53 here determines one After image as the comparison image, it may also determine, for example, two or more After images whose degree of overlap is equal to or greater than the second predetermined value as comparison images. In this way, the selection unit 53 functions as a second selection unit for selecting a comparison image from one or more After images.
  • If there is only one After image (No in S22), the selection unit 53 determines whether the tooth area ratio of that After image is equal to or greater than the first predetermined value (S26), and if it determines that the tooth area ratio is equal to or greater than the first predetermined value (Yes in S26), proceeds to step S24.
  • If no After image can be used as the comparison image (No in S24 or No in S26), the control unit 55 issues a notification to re-photograph the oral cavity (S27, S28).
  • The selection unit 53 may also determine whether the difference between the proportion of the tooth region in the After image and the proportion of the tooth region in the reference image is within a third predetermined value, and select an After image for which the difference is determined to be within the third predetermined value as the comparison image.
  • The third predetermined value is set in advance and stored in the storage unit.
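Steps S23 to S25, together with the optional third-predetermined-value check, can be combined into a selection routine like the following sketch; the threshold values and the tie-break of preferring the highest overlap are illustrative assumptions, not values stated in the text.

```python
def select_comparison_index(ref_ratio, after_ratios, overlaps,
                            first_value=0.4, second_value=0.7,
                            third_value=None):
    """Select the After image used as the comparison image (steps S23-S25).

    `overlaps[i]` is the degree of overlap between After image i and the
    reference image. Returns None when re-photographing should be
    requested (S27/S28).
    """
    best, best_overlap = None, -1.0
    for i, (ratio, overlap) in enumerate(zip(after_ratios, overlaps)):
        if ratio < first_value:        # step S23: tooth area ratio check
            continue
        if overlap < second_value:     # step S24: overlap with the reference
            continue
        if third_value is not None and abs(ratio - ref_ratio) > third_value:
            continue                   # optional third-predetermined-value check
        if overlap > best_overlap:     # keep the most similar angle of view
            best, best_overlap = i, overlap
    return best
```

The text only requires the overlap to meet the second predetermined value; keeping the highest-overlap candidate is one reasonable way to pick a single comparison image when several qualify.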
  • Figure 8 is a flowchart showing the operation (information processing method) of evaluating the oral care condition according to this embodiment.
  • The detection unit 52 detects plaque regions in the reference image and the comparison image (S41).
  • Specifically, the detection unit 52 detects, as plaque regions, areas in the reference image and the comparison image where porphyrin, a substance produced by bacteria in plaque, fluoresces reddish pink under excitation.
  • The plaque region of the specific tooth based on the reference image is an example of a first plaque region, and the plaque region of the specific tooth based on the comparison image is an example of a second plaque region.
  • The score calculation unit 54 calculates plaque scores for the reference image and the comparison image (S42).
  • Specifically, the score calculation unit 54 calculates a plaque score (first plaque score) indicating the condition of the tooth based on the first tooth region and the first plaque region for the reference image, and calculates a plaque score (second plaque score) based on the second tooth region and the second plaque region for the comparison image.
  • For example, the score calculation unit 54 calculates the first plaque score based on the ratio of the number of pixels in the first plaque region to the number of pixels in the natural tooth region in the reference image, and calculates the second plaque score based on the ratio of the number of pixels in the second plaque region to the number of pixels in the natural tooth region in the comparison image.
  • The first plaque score and the second plaque score are plaque scores for the same tooth.
  • The control unit 55 evaluates the user's oral care status based on the plaque scores of the reference image and the comparison image (S43). If the reference image and the comparison image are images taken before and after an oral care treatment (e.g., before and after brushing), the control unit 55 may evaluate the effectiveness of the oral care treatment based on the first plaque score and the second plaque score. For example, by calculating a plaque score for each tooth, the control unit 55 can identify which teeth have been left unbrushed the most.
  • The control unit 55 can also be said to compare the oral care state before and after the oral care treatment based on the first plaque score and the second plaque score. In this way, the control unit 55 may function as a comparison unit that compares the oral care state.
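The score calculation of step S42 and the evaluation of step S43 can be sketched as follows; expressing the plaque score as a 0-1 pixel ratio and measuring effectiveness as the fraction of plaque removed are assumptions consistent with, but not mandated by, the description above.

```python
import numpy as np

def plaque_score(tooth_mask: np.ndarray, plaque_mask: np.ndarray) -> float:
    """Plaque score (step S42): plaque pixels divided by natural-tooth
    pixels; returning a 0-1 ratio is an assumed convention."""
    tooth_px = np.count_nonzero(tooth_mask)
    if tooth_px == 0:
        return 0.0
    return np.count_nonzero(plaque_mask & tooth_mask) / tooth_px

def evaluate_oral_care(first_score: float, second_score: float) -> float:
    """One possible effectiveness measure for step S43: the fraction of the
    before-brushing plaque that was removed (1.0 = all plaque gone)."""
    if first_score == 0:
        return 1.0
    return (first_score - second_score) / first_score
```

For instance, if 25% of the natural tooth region is covered by plaque before brushing and 5% after, the sketch reports an effectiveness of 0.8, i.e., 80% of the plaque was removed.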
  • The control unit 55 causes the display unit 56 to display the evaluation result of the oral care state (S44).
  • For example, the control unit 55 may cause the display unit 56 to display the first plaque score and the second plaque score, or may cause it to display a numerical value or level indicating the effectiveness of the oral care treatment based on the first plaque score and the second plaque score.
  • FIG. 9 is a diagram illustrating the imaging guide in the information processing system according to this modified example.
  • This modified example differs from the information processing system according to the embodiment in that a shooting guide image is displayed on the display unit 56 when capturing an After image.
  • FIG. 9 shows an image in which the shooting guide image is superimposed on the captured image.
  • In FIG. 9, the captured image formed from the light received by the shooting unit 21 is shown by a solid line, and the shooting guide image superimposed on the captured image is shown by a dashed line.
  • The captured image here is a preview image that has not yet been recorded; it is displayed on the display unit 56, and recording is performed when a shooting operation is made. Note that the recorded image, i.e., the After image, does not include the shooting guide.
  • The display of the shooting guide image on the display unit 56 is executed, for example, by the control unit 55.
  • In other words, the control unit 55 superimposes the shooting guide on the image (captured image) acquired from the intraoral camera 10 and displayed on the display unit 56.
  • FIG. 10 is a flowchart showing the operation (information processing method) of the information processing system according to this modified example.
  • In this modified example, the information processing system executes step S31 in addition to the steps of the flowchart shown in FIG. 6.
  • The control unit 55 superimposes the shooting guide on the display of the display unit 56 before the intraoral camera 10 captures an After image (S31).
  • For example, the control unit 55 may display the shooting guide on the display unit 56 when it receives an operation from the user indicating that an After image is to be captured.
  • The display of the shooting guide can be set to on or off, and the control unit 55 may, for example, display the shooting guide on the display unit 56 only when the setting for displaying the shooting guide is on.
  • The control unit 55 may also display the shooting guide on the display unit 56 when the notification of step S27 or S28 is given and re-shooting is to be performed.
  • (Modification 2 of the embodiment) The information processing system according to this modification will be described below with reference to FIG. 11.
  • The following description will focus on differences from the embodiment, and descriptions of content that is the same as or similar to the embodiment will be omitted or simplified.
  • The configuration of the information processing system according to this modification may be the same as that of the information processing system according to the embodiment, and a description thereof will be omitted.
  • The following description also uses the reference numerals of the information processing system according to the embodiment.
  • Figure 11 shows a screen for selecting the tooth to be photographed in this modified example.
  • Figure 11 illustrates an example in which RGB images (e.g., reference image, after image) photographed by the user are stored in association with information indicating which tooth the image represents.
  • The display unit 56 of the mobile terminal 50 displays a screen that allows the user to select the tooth whose RGB image is to be captured.
  • The display unit 56 displays, for example, the rows of teeth of the upper and lower jaws.
  • The display of the rows of teeth may be an image or an illustration.
  • Figure 11 shows an example in which the first molar on the left side of the lower jaw has been selected.
  • After receiving the user's selection, the control unit 55 acquires an RGB image from the intraoral camera 10 and stores the acquired RGB image in association with information indicating the first molar on the left side of the mandible. For example, the control unit 55 stores the reference image in association with information indicating the first molar on the left side of the mandible.
  • Tooth selection may be performed before or after capturing the RGB image; it is sufficient that the RGB image and the tooth type or position are stored in association with each other.
  • Imaging direction information indicating whether the RGB image was taken from the cheek side, tongue side, or occlusal side of the tooth may also be stored in association with the RGB image.
  • The imaging direction information may be input by the user, for example.
  • The display unit 56 may display information indicating which tooth the reference image is an image of. For example, in a display of the rows of teeth in the oral cavity as shown in FIG. 11, the display unit 56 may indicate which tooth the reference image corresponds to, or may display text information.
  • The degree of overlap between the reference image and the After image captured in this manner may then be determined.
  • The control unit 55 may store the tooth type and the amount of plaque in association with each other in the storage unit.
  • When the control unit 55 reads the reference image and the After image from the storage unit and displays them on the display unit 56, the tooth type or tooth position may be superimposed on the image.
  • In the above embodiment, an intraoral camera 10 whose main purpose is to photograph teeth is used, but the intraoral camera 10 may also be an oral care device equipped with a camera.
  • For example, the intraoral camera 10 may be an oral irrigator equipped with a camera.
  • In the above embodiment, the mobile terminal 50 is used as an example of the information processing device, but the information processing device may also be a stationary information terminal or a server device.
  • In the above embodiment, the degree of overlap is given as a numerical example, but the degree of overlap may also be indicated as a scale such as "high," "medium," or "low."
  • In the embodiments, the information processing device (the mobile terminal 50) includes the display unit 56, but a display device capable of communicating with the information processing device may also be provided as a device separate from the information processing device.
  • Each processing unit included in the information processing system according to the above-described embodiments is typically realized as an LSI, which is an integrated circuit. These may be individually implemented as single chips, or some or all of them may be integrated into a single chip.
  • Integrated circuits are not limited to LSIs, and may be realized using dedicated circuits or general-purpose processors.
  • FPGAs (Field Programmable Gate Arrays), which can be programmed after manufacturing, or reconfigurable processors, which allow the connections and settings of circuit cells within the LSI to be reconfigured, may also be used.
  • Each component may be configured with dedicated hardware, or may be realized by executing a software program appropriate for that component.
  • Each component may also be realized by a program execution unit such as a CPU or processor reading and executing a software program recorded on a recording medium such as a hard disk or semiconductor memory.
  • The division of functional blocks in the block diagram is one example; multiple functional blocks may be realized as a single functional block, one functional block may be divided into multiple blocks, or some functions may be moved to other functional blocks. Furthermore, the functions of multiple functional blocks with similar functions may be processed in parallel or in a time-shared manner by a single piece of hardware or software.
  • The information processing device (e.g., the mobile terminal 50) according to the above-described embodiments may be realized as a single device or by multiple devices.
  • When realized by multiple devices, the components of the information processing device may be distributed among the multiple devices in any manner.
  • At least some of the functions of the information processing device may be realized by the intraoral camera 10 (e.g., the signal processing unit 30).
  • The communication method between the multiple devices is not particularly limited, and may be wireless communication or wired communication; wireless and wired communication may also be combined between the devices.
  • The present disclosure may also be realized as an information processing method executed by an information processing system.
  • The present disclosure may also be realized as an intraoral camera, a mobile terminal, or a cloud server included in the information processing system.
  • One aspect of the present disclosure may be a computer program that causes a computer to execute each of the characteristic steps included in the information processing method shown in any of FIGS. 4, 6, and 8.
  • The program may be a program to be executed by a computer.
  • One aspect of the present disclosure may be a computer-readable non-transitory recording medium on which such a program is recorded.
  • Such a program may be recorded on a recording medium and distributed or circulated. For example, by installing the distributed program in a device having another processor and having that processor execute the program, it becomes possible to cause that device to perform each of the above processes.
  • This disclosure can be applied to an information processing system for checking the condition of the oral cavity.
  • Reference Signs List: 10 Intraoral camera; 10a Head part; 10b Handle part; 20 Hardware part; 21 Photography part (photography device); 22 Sensor unit; 23 Illumination unit; 23A First LED; 23B Second LED; 23C Third LED; 23D Fourth LED; 24 Operation unit; 30 Signal processing unit; 31 Camera control unit; 32 Image processing unit; 33 Control unit; 34 Lighting control unit; 35 Memory unit; 40 Communication unit; 50 Portable terminal (information processing device); 51 Communication unit (acquisition unit); 52 Detection unit; 53 Selection unit (first selection unit, second selection unit); 54 Score calculation unit; 55 Control unit (comparison unit); 56 Display section (display device)

Abstract

An information processing device according to the present invention comprises: an acquisition unit (for example, a communication unit (51)) that acquires a first image including a specific tooth, obtained by imaging the inside of an oral cavity; a detection unit (52) that detects a first tooth region of the specific tooth from the first image; and a selection unit (53) that, when the ratio of the first tooth region in the first image is equal to or greater than a prescribed value, selects the first image as a reference image of the specific tooth.

Description

Information processing device, information processing method, and program

This disclosure relates to an information processing device, an information processing method, and a program.

Patent Document 1 discloses a device that sets tooth regions in an image of teeth lined up side by side.

Japanese Patent Application Laid-Open No. 2017-18152

Incidentally, it is desirable to perform processing such as setting plaque areas on images suitable for checking the condition of the oral cavity. However, Patent Document 1 does not disclose technology for acquiring images suitable for checking the condition of the oral cavity.

The present disclosure therefore provides an information processing device, information processing method, and program that can acquire images that are more suitable for checking the condition of the oral cavity.

An information processing device according to one aspect of the present disclosure includes an acquisition unit that acquires a first image including a specific tooth obtained by photographing the inside of the oral cavity, a detection unit that detects a first tooth region of the specific tooth from the first image, and a selection unit that selects the first image as a reference image of the specific tooth if the proportion of the first tooth region in the first image is equal to or greater than a predetermined value.

An information processing method according to one aspect of the present disclosure acquires an image containing a specific tooth obtained by photographing the inside of the oral cavity, detects the tooth region of the specific tooth from the image, and, if the proportion of the tooth region in the image is equal to or greater than a predetermined value, selects the image as a reference image for the specific tooth.

A program according to one aspect of the present disclosure is a program for causing a computer to execute the above-described information processing method.

According to one aspect of the present disclosure, it is possible to realize an information processing device or the like that can acquire images that are more suitable for checking the condition of the oral cavity.

FIG. 1 is a perspective view of an intraoral camera in an information processing system according to an embodiment.
FIG. 2 is a schematic configuration diagram of the information processing system according to the embodiment.
FIG. 3 is a block diagram illustrating a functional configuration of the mobile terminal according to the embodiment.
FIG. 4 is a flowchart showing a first operation of the information processing system according to the embodiment.
FIG. 5 is a diagram for explaining step S13 shown in FIG. 4.
FIG. 6 is a flowchart showing a second operation of the information processing system according to the embodiment.
FIG. 7A is a first diagram for explaining step S24 shown in FIG. 6.
FIG. 7B is a second diagram for explaining step S24 shown in FIG. 6.
FIG. 8 is a flowchart showing the operation of evaluating the oral care state according to the embodiment.
FIG. 9 is a diagram for explaining a shooting guide in the information processing system according to Modification 1 of the embodiment.
FIG. 10 is a flowchart showing the operation of the information processing system according to Modification 1 of the embodiment.
FIG. 11 is a diagram showing a screen for selecting a tooth to be photographed according to Modification 2 of the embodiment.

 本開示の第1態様に係る情報処理装置は、口腔内を撮影することで得られる特定の歯牙を含む第1画像を取得する取得部と、前記第1画像から前記特定の歯牙の第1歯牙領域を検出する検出部と、前記第1画像における前記第1歯牙領域の割合が所定値以上である場合、当該第1画像を前記特定の歯牙の基準画像として選択する選択部と、を備える。 The information processing device according to the first aspect of the present disclosure includes an acquisition unit that acquires a first image including a specific tooth obtained by photographing the inside of the oral cavity, a detection unit that detects a first tooth region of the specific tooth from the first image, and a selection unit that selects the first image as a reference image of the specific tooth if the proportion of the first tooth region in the first image is equal to or greater than a predetermined value.

 これにより、第1歯牙領域の割合が所定値未満の画像に比べて歯牙領域が多く、特定の歯牙の状態を確認しやすい基準画像を取得することができる。よって、口腔内の状態を確認するためのより適した画像を取得することができる。 This allows for the acquisition of a reference image with a larger tooth area than an image in which the proportion of the first tooth area is less than a predetermined value, making it easier to check the condition of a specific tooth. This makes it possible to acquire an image that is more suitable for checking the condition of the oral cavity.
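The selection rule of the first aspect can be sketched as follows. This is an illustrative sketch only: the boolean-mask representation of the detected tooth region and the 40% threshold are assumptions for the example, not values from the disclosure.

```python
import numpy as np

def tooth_area_ratio(tooth_mask: np.ndarray) -> float:
    """Fraction of image pixels covered by the detected tooth-region mask."""
    return float(np.count_nonzero(tooth_mask)) / tooth_mask.size

def is_reference_candidate(tooth_mask: np.ndarray, threshold: float = 0.4) -> bool:
    """True if the tooth region occupies at least `threshold` of the frame,
    i.e., the image qualifies as a reference image for the specific tooth."""
    return tooth_area_ratio(tooth_mask) >= threshold

# Example: a 10x10 frame whose central 6x6 block is tooth (36% of pixels).
mask = np.zeros((10, 10), dtype=bool)
mask[2:8, 2:8] = True
print(is_reference_candidate(mask))  # 0.36 < 0.40, so this frame is rejected
```

A frame that shows mostly gums or lips would fail this test, while a close, well-framed shot of the specific tooth would pass and be stored as the reference image.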

 また、例えば、第2態様に係る情報処理装置は、第1態様に係る情報処理装置であって、前記取得部は、さらに、前記口腔内を撮影することで得られる前記特定の歯牙を含む第2画像を取得し、前記検出部は、前記第2画像から前記特定の歯牙の第2歯牙領域を検出し、前記選択部は、前記第1画像の前記第1歯牙領域と類似する第2歯牙領域を有する第2画像を、前記基準画像の比較対象である比較画像として選択してもよい。 Furthermore, for example, an information processing device according to a second aspect may be the information processing device according to the first aspect, wherein the acquisition unit further acquires a second image including the specific tooth obtained by photographing the oral cavity, the detection unit detects a second tooth region of the specific tooth from the second image, and the selection unit selects a second image having a second tooth region similar to the first tooth region of the first image as a comparison image to be compared with the reference image.

 これにより、基準画像と歯牙領域が類似する比較画像を取得することができる。このような比較画像は、基準画像と歯牙領域の状態を比較しやすい画像である。よって、口腔内の状態を比較して確認するためのより適した画像を取得することができる。 This makes it possible to obtain a comparison image in which the tooth region is similar to that of the reference image. Such a comparison image makes it easy to compare the state of the tooth region with that of the reference image. Therefore, it is possible to obtain an image that is more suitable for comparing and checking the state of the oral cavity.

 また、例えば、第3態様に係る情報処理装置は、第2態様に係る情報処理装置であって、前記取得部は、1以上の前記第2画像を取得し、前記選択部は、1以上の前記第2画像のうち、当該第2画像における前記第2歯牙領域と、前記基準画像における前記第1歯牙領域との重なり度合いが所定度合い以上である第2画像を前記比較画像として選択してもよい。 Furthermore, for example, the information processing device according to the third aspect may be the information processing device according to the second aspect, wherein the acquisition unit acquires one or more of the second images, and the selection unit selects, from among the one or more second images, a second image in which the degree of overlap between the second tooth region in the second image and the first tooth region in the reference image is equal to or greater than a predetermined degree, as the comparison image.

 これにより、歯牙領域の重なり部分が多い基準画像及び比較画像が選択されるので、例えば、歯牙領域の面積を用いて口腔内の状態を示す歯垢スコアを算出する場合に、より正確な歯垢スコアを算出することができる。 This allows reference images and comparison images with large overlapping tooth regions to be selected, making it possible to calculate a more accurate plaque score, for example, when calculating a plaque score that indicates the condition of the oral cavity using the area of the tooth region.
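The "degree of overlap" test in the third aspect can be sketched with intersection-over-union of the two tooth-region masks. Both the IoU measure and the 0.7 threshold are illustrative assumptions; the disclosure only requires some overlap measure compared against a predetermined degree.

```python
import numpy as np

def overlap_degree(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Intersection-over-union of two boolean tooth-region masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return float(inter) / float(union) if union else 0.0

def select_comparison_images(candidate_masks, reference_mask, min_degree=0.7):
    """Indices of second images whose tooth region overlaps the
    reference tooth region by at least `min_degree`."""
    return [i for i, m in enumerate(candidate_masks)
            if overlap_degree(m, reference_mask) >= min_degree]
```

With masks registered to the same frame, a candidate whose tooth region largely coincides with the reference's passes, while a shot of the same tooth from a very different angle or distance is excluded.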

 また、例えば、第4態様に係る情報処理装置は、第2態様又は第3態様に係る情報処理装置であって、さらに、前記口腔内を撮影する撮影装置が前記第2画像を撮影する際、前記基準画像に基づく撮影ガイドを、前記撮影装置から取得され表示装置に表示された画像に重畳表示させる制御部を備えてもよい。 Furthermore, for example, the information processing device according to the fourth aspect may be the information processing device according to the second or third aspect, and may further include a control unit that, when the imaging device that captures images of the oral cavity captures the second image, causes an imaging guide based on the reference image to be superimposed on the image acquired from the imaging device and displayed on the display device.

 これにより、基準画像に類似した第2画像をユーザが撮影することを支援することができる。 This helps the user capture a second image that is similar to the reference image.

 また、例えば、第5態様に係る情報処理装置は、第4態様に係る情報処理装置であって、前記撮影ガイドは、前記基準画像に含まれる前記特定の歯牙のシルエット、輪郭、又は半透明の表示を含んでもよい。 Furthermore, for example, the information processing device according to the fifth aspect may be the information processing device according to the fourth aspect, and the imaging guide may include a silhouette, outline, or semi-transparent display of the specific tooth included in the reference image.

 これにより、特定の歯牙のシルエット、輪郭、又は半透明の表示を表示させることで、基準画像に類似した第2画像をユーザが撮影することを支援することができる。 This can assist the user in capturing a second image similar to the reference image by displaying a silhouette, outline, or semi-transparent representation of a particular tooth.
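Superimposing a semi-transparent guide as in the fifth aspect amounts to alpha-blending the reference tooth's mask onto the live camera frame. The guide color and the alpha value below are assumptions for illustration.

```python
import numpy as np

def overlay_guide(frame: np.ndarray, guide_mask: np.ndarray,
                  color=(0, 255, 0), alpha=0.4) -> np.ndarray:
    """Blend `color` into `frame` (HxWx3 uint8) wherever `guide_mask`
    (HxW bool) is True, producing a semi-transparent shooting guide."""
    out = frame.astype(np.float32).copy()
    c = np.array(color, dtype=np.float32)
    out[guide_mask] = (1.0 - alpha) * out[guide_mask] + alpha * c
    return out.astype(np.uint8)
```

A silhouette guide uses the filled tooth mask as above; an outline guide would pass only the mask's boundary pixels to the same blending routine.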

 また、例えば、第6態様に係る情報処理装置は、第2態様~第5態様のいずれか1つの態様に係る情報処理装置であって、前記検出部は、さらに、前記基準画像に基づく前記特定の歯牙の第1歯垢領域と、前記比較画像に基づく前記特定の歯牙の第2歯垢領域とを検出し、さらに、前記基準画像において、前記第1歯牙領域と前記第1歯垢領域とに基づいて、第1歯垢スコアを算出し、かつ、前記比較画像において、前記第2歯牙領域と前記第2歯垢領域とに基づいて、第2歯垢スコアを算出するスコア算出部と、前記第1歯垢スコアと前記第2歯垢スコアとに基づいて、口腔ケア状態を比較する比較部とを備えてもよい。 Furthermore, for example, an information processing device according to a sixth aspect may be an information processing device according to any one of the second to fifth aspects, wherein the detection unit further detects a first plaque region of the specific tooth based on the reference image and a second plaque region of the specific tooth based on the comparison image, and further includes a score calculation unit that calculates a first plaque score based on the first tooth region and the first plaque region in the reference image and calculates a second plaque score based on the second tooth region and the second plaque region in the comparison image, and a comparison unit that compares oral care conditions based on the first plaque score and the second plaque score.

 これにより、歯牙領域が多い画像を用いて歯垢スコアが算出されるので、当該歯垢スコアはより正確な値となり得る。よって、歯垢スコアの比較を行うためのより適した画像を取得することができる。 As a result, the plaque score is calculated using an image with a large tooth area, which can result in a more accurate plaque score. Therefore, it is possible to obtain an image that is more suitable for comparing plaque scores.

 また、例えば、第7態様に係る情報処理装置は、第6態様に係る情報処理装置であって、前記スコア算出部は、前記基準画像における天然歯牙領域の画素数に対する前記第1歯垢領域の画素数の比に基づいて、前記第1歯垢スコアを算出し、前記比較画像における天然歯牙領域の画素数に対する前記第2歯垢領域の画素数の比に基づいて、前記第2歯垢スコアを算出してもよい。 Furthermore, for example, an information processing device according to a seventh aspect may be the information processing device according to the sixth aspect, wherein the score calculation unit calculates the first plaque score based on the ratio of the number of pixels in the first plaque region to the number of pixels in the natural tooth region in the reference image, and calculates the second plaque score based on the ratio of the number of pixels in the second plaque region to the number of pixels in the natural tooth region in the comparison image.

 これにより、天然歯牙領域の画素数に対する歯垢領域の画素数の比により歯垢スコアを算出する場合において、より正確な歯垢スコアを算出することが可能である。 This makes it possible to calculate a more accurate plaque score when calculating the plaque score based on the ratio of the number of pixels in the plaque area to the number of pixels in the natural tooth area.
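The pixel-ratio formula of the seventh aspect can be sketched as follows. Scaling the ratio to a percentage is an assumption for readability; the disclosure only specifies a score based on the ratio of plaque-region pixels to natural-tooth-region pixels.

```python
# Sketch of the plaque-score calculation in the seventh aspect:
# plaque-region pixel count divided by natural-tooth-region pixel count.

def plaque_score(natural_tooth_pixels: int, plaque_pixels: int) -> float:
    """Plaque pixels as a percentage of natural-tooth pixels."""
    if natural_tooth_pixels == 0:
        return 0.0
    return 100.0 * plaque_pixels / natural_tooth_pixels

# First score (reference image) vs. second score (comparison image):
before = plaque_score(5000, 250)  # reference image, before oral care
after = plaque_score(5000, 100)   # comparison image, after oral care
print(before > after)             # True: the care reduced the plaque ratio
```

Because fillings are excluded from the natural-tooth pixel count, a metal inlay that reflects no fluorescence does not distort the score.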

 また、例えば、第8態様に係る情報処理装置は、第1態様~第7態様のいずれか1つの態様に係る情報処理装置であって、前記選択部は、前記第1画像の全画素数に対する前記第1歯牙領域の画素数の割合が前記所定値以上である場合、当該第1画像を前記基準画像として選択してもよい。 Furthermore, for example, an information processing device according to an eighth aspect is an information processing device according to any one of the first to seventh aspects, and the selection unit may select the first image as the reference image if the ratio of the number of pixels in the first tooth region to the total number of pixels in the first image is equal to or greater than the predetermined value.

 これにより、全画素数に対する第1歯牙領域の画素数の割合を用いて、口腔内の状態を確認するためのより適した画像を選択することができる。 This allows the ratio of the number of pixels in the first tooth region to the total number of pixels to be used to select an image that is more suitable for checking the condition of the oral cavity.

 また、例えば、第9態様に係る情報処理装置は、第2態様~第7態様のいずれか1つの態様に係る情報処理装置であって、前記第1画像は、ユーザが口腔ケア処理を行う前の口腔内を撮影した画像であり、前記第2画像は、前記ユーザが口腔ケア処理を行った後の口腔内を撮影した画像であってもよい。 Furthermore, for example, an information processing device according to a ninth aspect may be an information processing device according to any one of the second to seventh aspects, in which the first image is an image of the inside of the oral cavity before the user performs oral care treatment, and the second image is an image of the inside of the oral cavity after the user has performed oral care treatment.

 これにより、ユーザが口腔ケア処理を行う前後での比較をより適切に行い得る基準画像及び比較画像を取得することができる。 This allows users to obtain reference images and comparison images that allow them to more appropriately compare before and after oral care treatment.

 本開示の第10態様に係る情報処理方法は、口腔内を撮影することで得られる特定の歯牙を含む画像を取得し、前記画像から前記特定の歯牙の歯牙領域を検出し、前記画像における前記歯牙領域の割合が所定値以上である場合、当該画像を前記特定の歯牙の基準画像として選択する。 An information processing method according to a tenth aspect of the present disclosure acquires an image including a specific tooth obtained by photographing the inside of the oral cavity, detects the tooth region of the specific tooth from the image, and, if the proportion of the tooth region in the image is equal to or greater than a predetermined value, selects the image as a reference image for the specific tooth.

 これにより、上記の情報処理装置と同様の効果を奏する。 This achieves the same effects as the information processing device described above.

 本開示の第11態様に係るプログラムは、上記の情報処理方法をコンピュータに実行させるためのプログラムである。 A program according to an eleventh aspect of the present disclosure is a program for causing a computer to execute the above-described information processing method.

 これにより、上記の情報処理装置と同様の効果を奏する。 This achieves the same effects as the information processing device described above.

 なお、これらの全般的又は具体的な態様は、システム、方法、集積回路、コンピュータプログラム又はコンピュータで読み取り可能なCD-ROM等の非一時的記録媒体で実現されてもよく、システム、方法、集積回路、コンピュータプログラム又は記録媒体の任意な組み合わせで実現されてもよい。プログラムは、記録媒体に予め記憶されていてもよいし、インターネット等を含む広域通信網を介して記録媒体に供給されてもよい。 Note that these general or specific aspects may be realized as a system, method, integrated circuit, computer program, or non-transitory recording medium such as a computer-readable CD-ROM, or as any combination of a system, method, integrated circuit, computer program, or recording medium. The program may be pre-stored on the recording medium, or may be supplied to the recording medium via a wide area communication network, including the Internet.

 また、各図は、模式図であり、必ずしも厳密に図示されたものではない。したがって、例えば、各図において縮尺などは必ずしも一致しない。また、各図において、実質的に同一の構成については同一の符号を付しており、重複する説明は省略又は簡略化する。 Furthermore, each figure is a schematic diagram and is not necessarily an exact illustration. Therefore, for example, the scales of the figures do not necessarily match. Furthermore, in each figure, substantially identical components are assigned the same reference numerals, and duplicate explanations are omitted or simplified.

 また、本明細書において、平行などの要素間の関係性を示す用語、並びに、数値、及び、数値範囲は、厳格な意味のみを表す表現ではなく、実質的に同等な範囲、例えば数%程度(あるいは、10%程度)の差異をも含むことを意味する表現である。 Furthermore, in this specification, terms indicating relationships between elements, such as "parallel," as well as numerical values and numerical ranges, are not expressions that express only the strict meaning, but also expressions that include a substantially equivalent range, for example, a difference of about a few percent (or about 10%).

 また、本明細書において、「第1」、「第2」などの序数詞は、特に断りの無い限り、構成要素の数又は順序を意味するものではなく、同種の構成要素の混同を避け、区別する目的で用いられている。 Furthermore, in this specification, ordinal numbers such as "first" and "second" do not refer to the number or order of components, unless otherwise specified, but are used to avoid confusion and distinguish between components of the same type.

 (実施の形態)
 以下、本実施の形態に係る情報処理システムについて、図1~図8を参照しながら説明する。
(Embodiment)
An information processing system according to this embodiment will be described below with reference to FIGS. 1 to 8.

 [1.情報処理システムの構成]
 まず、本実施の形態に係る口腔内カメラを備える情報処理システムの構成について、図1~図3を参照しながら説明する。図1は、本実施の形態に係る情報処理システムにおける口腔内カメラ10の斜視図である。
[1. Configuration of information processing system]
First, the configuration of an information processing system including an intraoral camera according to the present embodiment will be described with reference to Figures 1 to 3. Figure 1 is a perspective view of an intraoral camera 10 in the information processing system according to the present embodiment.

 図1に示すように、口腔内カメラ10は、片手で取り扱うことが可能な歯ブラシ状の筺体を備え、その筺体は、撮影時にユーザの口腔内に配置されるヘッド部10aと、ユーザが把持するハンドル部10bと、ヘッド部10a及びハンドル部10bを接続するネック部10cとを備える。 As shown in Figure 1, the intraoral camera 10 has a toothbrush-shaped housing that can be handled with one hand. The housing includes a head portion 10a that is placed in the user's oral cavity when taking a photograph, a handle portion 10b that is held by the user, and a neck portion 10c that connects the head portion 10a and the handle portion 10b.

 撮影部21は、青色光の波長域を含む光が照射されている口腔内の歯列の面及び歯垢を撮影する。歯列の面は、歯列の頬側(外側)の側面、歯列の舌側(内側)の側面、及び、歯列の噛み合わせ面の少なくとも1つを含む。また、歯列とは、例えば、1以上の歯牙を含む。撮影部21は、撮影装置の一例である。 The imaging unit 21 captures images of the surfaces of the dentition and dental plaque in the oral cavity irradiated with light including the wavelength range of blue light. The surfaces of the dentition include at least one of the cheek (outside) side surface of the dentition, the tongue (inside) side surface of the dentition, and the occlusal surface of the dentition. The dentition includes, for example, one or more teeth. The imaging unit 21 is an example of an imaging device.

 撮影部21は、ヘッド部10aとネック部10cとに組み込まれている。撮影部21は、その光軸LA上に配置された撮像素子(図示しない)とレンズ(図示しない)とを有する。 The photographing unit 21 is incorporated into the head unit 10a and neck unit 10c. The photographing unit 21 has an image sensor (not shown) and a lens (not shown) arranged on its optical axis LA.

 撮像素子は、例えばCMOS(Complementary Metal Oxide Semiconductor)センサ又はCCD(Charge Coupled Device)素子などの撮影デバイスであって、レンズによって歯牙の像が結像される。その結像した像に対応する信号(画像データ)を、撮像素子は外部に出力する。撮像素子によって撮影された撮影画像を、RGB画像とも記載する。また、RGB画像は、青色光を歯列に照射して得られる画像であり、例えば、歯列の側面が撮影された画像であってもよいし、歯列の噛み合わせ面が撮影された画像であってもよい。画像は、例えば、歯列を並び方向に沿って撮影した1以上の画像(例えば、時系列画像)を含む。また、歯列の側面は、舌側の側面であってもよいし、頬側の側面であってもよい。RGB画像には、例えば、1以上の歯牙が含まれる。 The imaging element is a photographing device such as a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) element, and an image of the teeth is formed by a lens. The imaging element outputs a signal (image data) corresponding to the formed image to the outside. The image captured by the imaging element is also referred to as an RGB image. An RGB image is an image obtained by illuminating the dentition with blue light, and may be, for example, an image of the side of the dentition, or an image of the occlusal surface of the dentition. The image includes, for example, one or more images (e.g., time-series images) of the dentition along its alignment. The side of the dentition may be the lingual side or the buccal side. The RGB image includes, for example, one or more teeth.

 撮影部21は、さらに照明部(照明デバイス)から照射される色の光を遮光し、かつ、当該光に対して歯垢が発する蛍光を透過する光学フィルタを有してもよい。本実施の形態では、撮影部21は、撮像素子に入射する光に含まれる青色波長の光成分をカットする青色光カットフィルタを光学フィルタとして有してもよい。青色光の波長域を含む光を歯牙に照射し、歯垢を検出する場合、歯垢の励起蛍光を強くするため青色光の波長域を含む光を強くすると青色画素値が赤色画素値及び緑色画素値に比べて支配的になるため、RGB画像の全体が青色を帯びる。この対処として、青色光カットフィルタが撮像素子に入射する前の光から青色光の波長域を含む光の一部をカットする。なお、撮影部21は、青色光カットフィルタを有していなくてもよい。 The imaging unit 21 may further include an optical filter that blocks light of a color emitted from the illumination unit (illumination device) and transmits fluorescence emitted by plaque in response to that light. In this embodiment, the imaging unit 21 may include, as an optical filter, a blue light cut-off filter that cuts out blue wavelength light components contained in light incident on the imaging element. When light including the blue light wavelength range is irradiated onto teeth to detect plaque, if the light including the blue light wavelength range is strengthened to increase the excitation fluorescence of plaque, the blue pixel values will become dominant over the red and green pixel values, and the entire RGB image will appear blue. To address this, the blue light cut-off filter cuts out a portion of the light including the blue light wavelength range from the light before it enters the imaging element. Note that the imaging unit 21 does not necessarily have to include a blue light cut-off filter.

 また、口腔内カメラ10は、撮影時に撮影対象の歯牙に対して光を照射する照明部として、複数の第1~第4のLED23A~23Dを搭載している。第1~第4のLED23A~23Dは、歯垢に照射されることで当該歯垢が蛍光を発する色の光(例えば、単一色の光)を照射する。第1~第4のLED23A~23Dは、例えば、405nmをピークとする波長(所定波長の一例)を含む青色光を照射する青色LEDである。なお、第1~第4のLED23A~23Dは、青色光の波長域を含む光を照射する光源であればよく、青色LEDに限るものではない。 The intraoral camera 10 is also equipped with multiple first to fourth LEDs 23A-23D as an illumination unit that irradiates light onto the teeth to be photographed during photography. The first to fourth LEDs 23A-23D irradiate dental plaque with light of a color that causes the plaque to fluoresce (for example, single-color light). The first to fourth LEDs 23A-23D are, for example, blue LEDs that irradiate blue light including a wavelength whose peak is 405 nm (an example of a predetermined wavelength). Note that the first to fourth LEDs 23A-23D are not limited to blue LEDs, and may be any light source that irradiates light including the blue light wavelength range.

 図2は、本実施の形態に係る情報処理システムの概略的構成図である。本実施の形態に係る情報処理システムは、概略的には、照明部23からの光に対して歯垢が発する蛍光を撮影部21が撮影し、撮影された1以上のRGB画像(1以上の画像)から、口腔内の状態を確認するためのより適した画像を取得するように構成される。さらには、本実施の形態に係る情報処理システムは、口腔内の状態を比較可能な2枚の画像を選択するように構成される。 Figure 2 is a schematic diagram of an information processing system according to this embodiment. The information processing system according to this embodiment is generally configured so that the imaging unit 21 captures fluorescence emitted by plaque in response to light from the illumination unit 23, and from the one or more captured RGB images (one or more images), an image more suitable for checking the condition of the oral cavity is obtained. Furthermore, the information processing system according to this embodiment is configured to select two images that allow the condition of the oral cavity to be compared.

 図2に示すように、情報処理システムは、口腔内カメラ10と、携帯端末50とを備える。 As shown in FIG. 2, the information processing system includes an intraoral camera 10 and a mobile terminal 50.

 口腔内カメラ10は、ハード部20と、信号処理部30と、通信部40とを備える。 The intraoral camera 10 comprises a hardware unit 20, a signal processing unit 30, and a communication unit 40.

 ハード部20は、口腔内カメラ10における物理的な要素であり、撮影部21と、センサ部22と、照明部23と、操作部24とを有する。 The hardware unit 20 is a physical element of the intraoral camera 10 and includes a photographing unit 21, a sensor unit 22, an illumination unit 23, and an operation unit 24.

 撮影部21は、ユーザの口腔内の歯牙を撮影することで画像データを生成する。撮影部21は、歯垢に含まれる蛍光物質を励起する所定波長の照射光が照射されている口腔内の歯列の面及び歯垢を撮影することで画像データを生成するともいえる。撮影部21は、カメラ制御部31からの制御信号を受け付け、受け付けた制御信号に応じて、撮影などの動作を実行し、撮影で得た動画又は静止画の画像データを画像処理部32に出力する。撮影部21は、上記の撮像素子及び光学フィルタとレンズとを有する。画像データは、例えば光学フィルタを通過した光に基づいて生成される。また、画像データは、複数本の歯牙が映る画像であるが、少なくとも1本の歯牙が映る画像であればよい。 The photographing unit 21 generates image data by photographing the teeth in the user's oral cavity. It can also be said that the photographing unit 21 generates image data by photographing the surface of the dentition in the oral cavity and the plaque, which are irradiated with light of a predetermined wavelength that excites the fluorescent substance contained in the plaque. The photographing unit 21 receives a control signal from the camera control unit 31, performs operations such as photographing in accordance with the received control signal, and outputs video or still image data obtained by photographing to the image processing unit 32. The photographing unit 21 has the above-mentioned image sensor, optical filter, and lens. The image data is generated based on light that has passed through the optical filter, for example. Furthermore, the image data is an image showing multiple teeth, but it is sufficient that the image shows at least one tooth.

 センサ部22は、RGB画像の撮影領域に入射する外光を検出する。例えば、センサ部22は、口腔内に外光が入射しているか否かを検出する。センサ部22は、例えば、撮影部21の近傍に配置される。センサ部22は、例えば、撮影部21と同様、口腔内カメラ10のヘッド部10aに配置されてもよい。つまり、センサ部22は、撮影部21が撮影する際、ユーザの口腔内に位置する。なお、センサ部22を用いる代わりに、撮影部21が撮影した画像を用いて、後述する画像処理部32で外光の検出が行われてもよい。つまり、信号処理部30(例えば、画像処理部32)がセンサ部22の機能を有していてもよい。 The sensor unit 22 detects external light entering the imaging area of the RGB image. For example, the sensor unit 22 detects whether external light is entering the oral cavity. The sensor unit 22 is arranged, for example, near the imaging unit 21. The sensor unit 22 may be arranged, for example, in the head unit 10a of the intraoral camera 10, similar to the imaging unit 21. In other words, the sensor unit 22 is located inside the user's oral cavity when the imaging unit 21 takes an image. Note that instead of using the sensor unit 22, external light may be detected by the image processing unit 32 (described below) using the image captured by the imaging unit 21. In other words, the signal processing unit 30 (for example, the image processing unit 32) may have the functions of the sensor unit 22.

 照明部23は、口腔内の複数の領域のうち、撮影部21が撮影する領域に光を照射する。定量的可視光誘起蛍光法(QLF(Quantitative Light-induced Fluorescence)法)でも知られているように、青色光が照射されると歯垢内のバクテリアが産出する物質であるポルフィリンが赤みを帯びたピンク色に蛍光(励起蛍光)することが知られており、本実施の形態では、照明部23は、撮影部21が撮影する領域に青色光を照射する。 The lighting unit 23 irradiates light onto the region, among the multiple regions in the oral cavity, that the imaging unit 21 photographs. As is known from the quantitative light-induced fluorescence (QLF) method, porphyrin, a substance produced by bacteria in dental plaque, fluoresces a reddish pink (excited fluorescence) when irradiated with blue light. In this embodiment, therefore, the lighting unit 23 irradiates blue light onto the region photographed by the imaging unit 21.

 照明部23は、上記の複数の第1~第4のLED23A~23Dを有する。複数の第1~第4のLED23A~23Dは、例えば、撮影領域に対して互いに異なる方向から光を照射する。これにより、撮影領域に影が発生することを抑制することができる。 The lighting unit 23 has the above-mentioned first to fourth LEDs 23A to 23D. The first to fourth LEDs 23A to 23D, for example, emit light from different directions toward the shooting area. This makes it possible to prevent shadows from appearing in the shooting area.

 複数の第1~第4のLED23A~23Dのそれぞれは、少なくとも調光が制御可能に構成される。複数の第1~第4のLED23A~23Dのそれぞれは、調光及び調色が制御可能に構成されてもよい。複数の第1~第4のLED23A~23Dは、撮影部21を囲むように配置される。 Each of the first to fourth LEDs 23A to 23D is configured so that at least the dimming can be controlled. Each of the first to fourth LEDs 23A to 23D may also be configured so that the dimming and color adjustment can be controlled. The first to fourth LEDs 23A to 23D are arranged to surround the imaging unit 21.

 照明部23は、撮影領域に応じて照射強度(発光強度)が制御される。複数の第1~第4のLED23A~23Dのそれぞれは、一律に照射強度が制御されてもよいし、互いに異なるように照射強度が制御されてもよい。なお、照明部23が有するLEDの数は特に限定されず、1つであってもよいし、5つ以上であってもよい。また、照明部23は、光源としてLEDを有することに限定されず、他の光源を有していてもよい。 The illumination unit 23 controls the illumination intensity (light emission intensity) according to the shooting area. The illumination intensity of each of the first to fourth LEDs 23A to 23D may be controlled uniformly, or may be controlled to differ from one another. The number of LEDs in the illumination unit 23 is not particularly limited, and may be one, or five or more. Furthermore, the illumination unit 23 is not limited to having an LED as a light source, and may also have other light sources.

 操作部24は、ユーザからの操作を受け付ける。操作部24は、例えば、押しボタンなどにより構成されるが、音声などにより操作を受け付ける構成であってもよい。 The operation unit 24 accepts operations from the user. The operation unit 24 is configured, for example, with push buttons, but may also be configured to accept operations via voice, etc.

 また、ハード部20は、さらに、口腔内カメラ10の各構成要素に電力を供給する電池(例えば、二次電池)、商用電源に接続された外部の充電器によってワイヤレス充電されるためのコイル、構図調節及びピント調節の少なくとも一方に必要なアクチュエータなどを備えていてもよい。 The hardware unit 20 may also include a battery (e.g., a secondary battery) that supplies power to each component of the intraoral camera 10, a coil for wireless charging by an external charger connected to a commercial power source, and an actuator required for at least one of composition adjustment and focus adjustment.

 信号処理部30は、後述する様々な処理を実行するCPU(Central Processing Unit)又はMPU(Micro Processor Unit)などにより実現される各機能構成部と、各機能構成部に様々な処理を実行させるためのプログラムを記憶するROM(Read Only Memory)、RAM(Random Access Memory)などのメモリ部35を有する。具体的には、信号処理部30は、カメラ制御部31と、画像処理部32と、制御部33と、照明制御部34と、メモリ部35とを有する。 The signal processing unit 30 has functional components implemented by a CPU (Central Processing Unit) or MPU (Micro Processor Unit) that execute various processes described below, and a memory unit 35 such as a ROM (Read Only Memory) or RAM (Random Access Memory) that stores programs for causing each functional component to execute various processes. Specifically, the signal processing unit 30 has a camera control unit 31, an image processing unit 32, a control unit 33, a lighting control unit 34, and the memory unit 35.

 カメラ制御部31は、例えば、口腔内カメラ10のハンドル部10bに搭載され、撮影部21を制御する。カメラ制御部31は、例えば、画像処理部32からの制御信号に応じて撮影部21の絞り及びシャッタスピードの少なくとも1つを制御する。 The camera control unit 31 is mounted, for example, on the handle portion 10b of the intraoral camera 10 and controls the imaging unit 21. The camera control unit 31 controls at least one of the aperture and shutter speed of the imaging unit 21 in response to a control signal from the image processing unit 32, for example.

 画像処理部32は、例えば、口腔内カメラ10のハンドル部10bに搭載され、撮影部21が撮影したRGB画像(画像データ)を取得し、その取得したRGB画像に対して画像処理を実行し、その画像処理後のRGB画像をカメラ制御部31及び制御部33に出力する。また、画像処理部32は、画像処理後のRGB画像をメモリ部35に出力し、画像処理後のRGB画像をメモリ部35に記憶させてもよい。 The image processing unit 32 is mounted, for example, on the handle unit 10b of the intraoral camera 10, acquires the RGB image (image data) captured by the imaging unit 21, performs image processing on the acquired RGB image, and outputs the processed RGB image to the camera control unit 31 and the control unit 33. The image processing unit 32 may also output the processed RGB image to the memory unit 35, and store the processed RGB image in the memory unit 35.

 画像処理部32は、例えば、回路で構成され、例えばRGB画像に対してノイズ除去、輪郭強調処理などの画像処理を実行する。なお、ノイズ除去及び輪郭強調処理などは、携帯端末50により実行されてもよい。 The image processing unit 32 is composed of, for example, a circuit, and performs image processing such as noise removal and edge enhancement on RGB images. Note that noise removal and edge enhancement processing may also be performed by the mobile terminal 50.

 なお、画像処理部32から出力されたRGB画像(画像処理後のRGB画像)は、通信部40を介して携帯端末50に送信され、送信されたRGB画像に基づく画像が携帯端末50の表示部56に表示されてもよい。これにより、ユーザにRGB画像に基づく画像を提示することができる。 The RGB image output from the image processing unit 32 (the RGB image after image processing) may be transmitted to the mobile terminal 50 via the communication unit 40, and an image based on the transmitted RGB image may be displayed on the display unit 56 of the mobile terminal 50. This allows the user to be presented with an image based on the RGB image.

 制御部33は、信号処理部30を制御する制御装置である。制御部33は、例えば、センサ部22による外光等の検出結果に基づいて、信号処理部30の各構成要素を制御する。 The control unit 33 is a control device that controls the signal processing unit 30. The control unit 33 controls each component of the signal processing unit 30 based on, for example, the detection results of external light, etc. by the sensor unit 22.

 照明制御部34は、例えば、口腔内カメラ10のハンドル部10bに搭載され、第1~第4のLED23A~23Dの点灯及び消灯を制御する。照明制御部34は、例えば回路で構成される。例えば、ユーザが携帯端末50の表示部56に対して口腔内カメラ10を起動させる操作を実行すると、携帯端末50から対応する信号が通信部40を介して信号処理部30に送信される。信号処理部30の照明制御部34は、受信した信号に基づいて、第1~第4のLED23A~23Dを点灯させる。 The lighting control unit 34 is mounted, for example, on the handle portion 10b of the intraoral camera 10 and controls the turning on and off of the first to fourth LEDs 23A to 23D. The lighting control unit 34 is composed of, for example, a circuit. For example, when a user performs an operation on the display unit 56 of the mobile terminal 50 to start the intraoral camera 10, a corresponding signal is sent from the mobile terminal 50 to the signal processing unit 30 via the communication unit 40. The lighting control unit 34 of the signal processing unit 30 turns on the first to fourth LEDs 23A to 23D based on the received signal.

 メモリ部35は、上記のプログラム以外に、撮影部21によって撮影されたRGB画像(画像データ)などを記憶する。メモリ部35は、例えば、ROM、RAMなどの半導体メモリにより実現されるがこれに限定されない。 In addition to the above programs, the memory unit 35 also stores RGB images (image data) captured by the image capture unit 21. The memory unit 35 is realized by, for example, semiconductor memory such as ROM or RAM, but is not limited to this.

 通信部40は、携帯端末50と無線通信を行うための無線通信モジュールである。通信部40は、例えば、口腔内カメラ10のハンドル部10bに搭載され、信号処理部30からの制御信号に基づいて、携帯端末50と無線通信を行う。通信部40は、例えばWiFi(登録商標)、Bluetooth(登録商標)などの既存の通信規格に準拠した無線通信を携帯端末50との間で実行する。通信部40を介して、口腔内カメラ10から携帯端末50にRGB画像が送信され、且つ、携帯端末50から口腔内カメラ10に操作信号が送信される。 The communication unit 40 is a wireless communication module for wireless communication with the mobile terminal 50. The communication unit 40 is mounted, for example, on the handle portion 10b of the intraoral camera 10, and communicates wirelessly with the mobile terminal 50 based on control signals from the signal processing unit 30. The communication unit 40 performs wireless communication with the mobile terminal 50 in accordance with existing communication standards such as Wi-Fi (registered trademark) and Bluetooth (registered trademark). RGB images are sent from the intraoral camera 10 to the mobile terminal 50 via the communication unit 40, and operation signals are sent from the mobile terminal 50 to the intraoral camera 10.

 携帯端末50は、例えば、青色光の波長域を含む光を歯牙に照射することで蛍光反応している歯列の面及び歯垢を撮影したRGB画像を用いて、口腔内の状態を確認するためのより適した画像を取得するための処理を実行する。携帯端末50は、情報処理システムのユーザインタフェースとして機能する。また、携帯端末50は、情報処理装置の一例である。 The mobile terminal 50 performs processing to obtain images more suitable for checking the condition of the oral cavity, for example, by irradiating the teeth with light including the wavelength range of blue light, and using RGB images of the surfaces of the dentition and dental plaque that are reacting to fluorescence. The mobile terminal 50 functions as a user interface for the information processing system. The mobile terminal 50 is also an example of an information processing device.

 図3は、本実施の形態に係る携帯端末50の機能構成を示すブロック図である。 Figure 3 is a block diagram showing the functional configuration of the mobile terminal 50 according to this embodiment.

 図3に示すように、携帯端末50は、通信部51と、検出部52と、選択部53と、スコア算出部54と、制御部55と、表示部56とを備える。携帯端末50は、プロセッサ及びメモリなどを備える。メモリは、ROM及びRAMなどであり、プロセッサにより実行されるプログラムを記憶することができる。通信部51と、検出部52と、選択部53と、スコア算出部54と、制御部55と、表示部56とは、メモリに格納されたプログラムを実行するプロセッサなどによって実現される。携帯端末50は、例えば、無線通信可能なスマートフォン又はタブレット端末等により実現されてもよい。 As shown in FIG. 3, the mobile terminal 50 includes a communication unit 51, a detection unit 52, a selection unit 53, a score calculation unit 54, a control unit 55, and a display unit 56. The mobile terminal 50 also includes a processor and memory. The memory may be a ROM or RAM, and may store programs executed by the processor. The communication unit 51, the detection unit 52, the selection unit 53, the score calculation unit 54, the control unit 55, and the display unit 56 are implemented by a processor that executes programs stored in the memory. The mobile terminal 50 may be implemented, for example, by a smartphone or tablet device capable of wireless communication.

 通信部51は、口腔内カメラ10と無線通信を行うための無線通信モジュールである。通信部51は、口腔内カメラ10からRGB画像を取得する。具体的には、通信部51は、撮影部21で生成された1以上の歯牙を含む画像を取得する。RGB画像は、口腔内カメラ10が青色光の波長域を含む光を歯牙に照射することで蛍光反応している歯牙を撮影することで得られた画像である。なお、通信部51は、口腔内カメラ10と有線通信を行う有線通信モジュールを含んでいてもよい。通信部51は、取得部の一例である。以降において、RGB画像を画像とも記載する。 The communication unit 51 is a wireless communication module for wireless communication with the intraoral camera 10. The communication unit 51 acquires RGB images from the intraoral camera 10. Specifically, the communication unit 51 acquires images including one or more teeth generated by the imaging unit 21. The RGB images are images obtained by the intraoral camera 10 by irradiating the teeth with light including the wavelength range of blue light and capturing images of the teeth that are undergoing a fluorescent reaction. The communication unit 51 may also include a wired communication module for wired communication with the intraoral camera 10. The communication unit 51 is an example of an acquisition unit. Hereinafter, RGB images will also be referred to as images.

 検出部52は、通信部51が取得した画像から特定の歯牙の歯牙領域及び歯垢領域を検出する。歯牙及び歯垢が互いに異なる蛍光を発するので、検出部52は、画像における蛍光(励起蛍光)している部分の色により、歯牙及び歯垢を検出する。具体的には、検出部52は、後述する第1歯牙領域、第2歯牙領域、第1歯垢領域及び第2歯垢領域を検出する。第1歯牙領域及び第2歯牙領域は、天然歯牙領域である。 The detection unit 52 detects the tooth region and plaque region of a specific tooth from the image acquired by the communication unit 51. Because teeth and plaque emit different fluorescence, the detection unit 52 detects the tooth and plaque based on the color of the fluorescent (excited fluorescence) portion in the image. Specifically, the detection unit 52 detects the first tooth region, second tooth region, first plaque region, and second plaque region, which will be described later. The first tooth region and second tooth region are natural tooth regions.

 上記でも説明したが、歯垢(歯垢領域)は、青色光が照射されると赤みを帯びたピンク色に蛍光(励起蛍光)する。また、歯牙(歯牙領域)は、励起光を照射すると象牙質から励起蛍光が発せられ、エナメル質を透過して緑色に蛍光することが知られている。また、齲歯治療痕の詰め物(例えば、メタルインレー)は、青色LED光の下では励起蛍光せず、カメラでは暗く(低輝度で)撮像される。つまり、画像における歯牙領域と歯垢領域とは、識別可能であり、また、歯牙領域のうち、天然歯牙領域と詰め物とは識別可能である。以降において、歯牙領域(天然歯牙領域)は、歯牙における詰め物を除く領域である。なお、歯牙領域及び歯垢領域の検出方法としては、公知のいかなる技術が用いられてもよい。 As explained above, when irradiated with blue light, dental plaque (dental plaque region) fluoresces a reddish-pink color (excitation fluorescence). It is also known that when teeth (dental regions) are irradiated with excitation light, excitation fluorescence is emitted from the dentin, which passes through the enamel and fluoresces green. Furthermore, fillings left behind after caries treatment (e.g., metal inlays) do not emit excitation fluorescence under blue LED light, and are captured darkly (at low brightness) on the camera. In other words, dental regions and plaque regions in the image are distinguishable, and within dental regions, natural dental regions and fillings are distinguishable. Hereinafter, dental regions (natural dental regions) refer to the areas of teeth excluding fillings. Note that any known technology may be used to detect dental regions and plaque regions.

 選択部53は、通信部51が取得した画像に基づいて、口腔内の状態を確認するためのより適した画像を選択する。選択部53は、画像における歯牙領域の割合が第1所定値以上である場合、当該第1画像を特定の歯牙の基準画像として選択する。選択部53は、例えば、画像の全画素数に対する歯牙領域の画素数割合が第1所定値以上である場合、当該画像を特定の歯牙の基準画像として選択してもよい。第1所定値は、予め設定されており、携帯端末50の記憶部(図示しない)に記憶されている。第1所定値は、所定値の一例である。 Based on the images acquired by the communication unit 51, the selection unit 53 selects an image better suited for checking the condition of the oral cavity. If the proportion of the tooth region in an image (first image) is equal to or greater than a first predetermined value, the selection unit 53 selects that first image as the reference image of the specific tooth. For example, the selection unit 53 may select an image as the reference image of the specific tooth if the ratio of the number of pixels in the tooth region to the total number of pixels in the image is equal to or greater than the first predetermined value. The first predetermined value is set in advance and stored in a storage unit (not shown) of the mobile terminal 50. The first predetermined value is an example of a predetermined value.

 また、選択部53は、さらに、基準画像と比較する対象の比較画像を選択するための処理を実行してもよい。 The selection unit 53 may also perform processing to select a comparison image to be compared with the reference image.

 スコア算出部54は、画像における歯牙領域及び歯垢領域に基づいて、歯垢の程度を示す歯垢スコアを算出する。歯垢スコアは、口腔内の状態を示す数値である。本実施の形態では、スコア算出部54は、歯垢面積(例えば、歯垢の画素数)又は歯垢の蛍光量と、歯牙面積(例えば、歯牙領域の画素数)との比に基づいて歯垢スコアを算出する。スコア算出部54は、例えば、以下の式1に基づいて歯垢スコアを算出する。 The score calculation unit 54 calculates a plaque score that indicates the degree of plaque based on the tooth region and plaque region in the image. The plaque score is a numerical value that indicates the condition of the oral cavity. In this embodiment, the score calculation unit 54 calculates the plaque score based on the ratio of the plaque area (e.g., the number of pixels in the plaque) or the amount of fluorescence from the plaque to the tooth area (e.g., the number of pixels in the tooth region). The score calculation unit 54 calculates the plaque score, for example, based on the following formula 1.

 歯垢スコア=歯垢面積/歯牙面積   ・・・式1 Plaque score = plaque area / tooth area...Equation 1

 なお、ここでの歯牙面積とは、画像における天然歯牙領域の面積である。 Note that the tooth area here refers to the area of the natural tooth region in the image.

 なお、歯垢スコアの算出方法はこれに限定されず、歯牙領域の面積を用いる他の方法で算出されてもよい。 Note that the method for calculating the plaque score is not limited to this, and it may be calculated using other methods that use the area of the tooth region.
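
 As a minimal illustrative sketch only (not part of the embodiment; the function and variable names are assumptions for illustration), Equation 1 can be expressed as follows:

```python
def plaque_score(plaque_pixels: int, tooth_pixels: int) -> float:
    """Plaque score per Equation 1: plaque area / tooth area.

    Areas are given as pixel counts; tooth_pixels is the natural
    tooth region excluding fillings (hypothetical representation).
    """
    if tooth_pixels == 0:
        raise ValueError("tooth region is empty; score is undefined")
    return plaque_pixels / tooth_pixels

# Example: 1,200 plaque pixels over a 20,000-pixel natural tooth region
score = plaque_score(1200, 20000)  # 0.06
```

 As the text notes, the score may instead use the amount of plaque fluorescence in place of the plaque pixel count; only the ratio structure is fixed by Equation 1.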

 制御部55は、スコア算出部54が算出した歯垢スコアに対する各種処理を実行する。制御部55は、例えば、2枚の画像の歯垢スコアに基づいて、口腔ケア状態を比較してもよい。制御部55は、例えば、歯磨きなどの口腔ケア処理の前後の画像に基づいて、それぞれの歯垢スコアを比較してもよい。制御部55は、例えば、口腔ケア処理の歯垢スコアに基づいて、口腔ケア処理の効果を評価してもよい。このように、制御部55は、口腔ケア状態を比較する比較部として機能してもよい。 The control unit 55 performs various processes on the plaque score calculated by the score calculation unit 54. The control unit 55 may, for example, compare oral care conditions based on the plaque scores of two images. The control unit 55 may, for example, compare the plaque scores of images taken before and after an oral care treatment such as brushing teeth. The control unit 55 may, for example, evaluate the effectiveness of an oral care treatment based on the plaque score of the oral care treatment. In this way, the control unit 55 may function as a comparison unit that compares oral care conditions.

 表示部56は、携帯端末50が備える表示デバイスであり、スコア算出部54により算出された歯垢スコア、制御部55による口腔ケア状態の比較結果などを表示する。表示部56は、例えば、液晶ディスプレイパネルなどにより実現されるが、これに限定されない。表示部56は、表示装置の一例である。 The display unit 56 is a display device included in the mobile terminal 50, and displays the plaque score calculated by the score calculation unit 54, the oral care status comparison results by the control unit 55, and the like. The display unit 56 is realized, for example, by a liquid crystal display panel, but is not limited to this. The display unit 56 is an example of a display device.

 [2.情報処理システムの動作]
 続いて、上記のように構成される情報処理システムの動作について、図4~図8を参照しながら説明する。以下では、歯磨きの前後の画像を比較することで磨き残しをチェックする例について説明する。図4は、本実施の形態に係る情報処理システムの第1動作(情報処理方法)を示すフローチャートである。図4では、歯磨き前の画像として適切な画像を選択する動作について説明する。なお、本実施の形態では、情報処理システムは、図4、図6及び図8に示す動作のうち、少なくとも図4に示す動作(例えば、Before画像を選択する動作)を実行すればよい。
2. Operation of Information Processing System
Next, the operation of the information processing system configured as described above will be described with reference to FIGS. 4 to 8. An example of checking for missed brushing areas by comparing images before and after tooth brushing will be described below. FIG. 4 is a flowchart showing a first operation (information processing method) of the information processing system according to this embodiment. FIG. 4 describes the operation of selecting an appropriate image as an image before tooth brushing. Note that in this embodiment, the information processing system only needs to execute at least the operation shown in FIG. 4 (for example, the operation of selecting a before image) among the operations shown in FIGS. 4, 6, and 8.

 図4に示すように、通信部51は、口腔内カメラ10からユーザの歯磨き前の口腔内を撮影したBefore画像を取得する(S11)。Before画像は、第1画像の一例であり、ここでは磨き残しをチェックする対象の歯牙である特定の歯牙を含む画像である。 As shown in FIG. 4, the communication unit 51 acquires, from the intraoral camera 10, a Before image capturing the inside of the user's oral cavity before tooth brushing (S11). The Before image is an example of the first image and, here, is an image including the specific tooth, which is the tooth to be checked for missed brushing areas.

 次に、制御部55は、Before画像が複数あるか否かを判定する(S12)。制御部55は、ステップS11において、通信部51が取得したBefore画像が複数あるか否かを判定する。 Next, the control unit 55 determines whether there are multiple before images (S12). The control unit 55 determines whether there are multiple before images acquired by the communication unit 51 in step S11.

 次に、選択部53は、制御部55によりBefore画像が複数あると判定された場合(S12でYes)、複数のBefore画像のうち歯牙面積割合が第1所定値以上の画像があるか否かを判定する(S13)。 Next, if the control unit 55 determines that there are multiple before images (Yes in S12), the selection unit 53 determines whether any of the multiple before images has a tooth area ratio equal to or greater than a first predetermined value (S13).

 ステップS13において、まず検出部52は、複数のBefore画像それぞれにおいて、特定の歯牙の歯牙領域を検出し、当該Before画像における歯牙領域の面積(例えば、画素数)を当該Before画像全体の面積(例えば、画素数)で除算することで、歯牙面積割合を算出する。 In step S13, the detection unit 52 first detects the tooth area of a specific tooth in each of the multiple before images, and calculates the tooth area ratio by dividing the area (e.g., number of pixels) of the tooth area in the before image by the area (e.g., number of pixels) of the entire before image.

 図5は、図4に示すステップS13を説明するための図である。画像A及びBは、同一の歯牙(特定の歯牙)を撮影したBefore画像を二値化処理した模式図である。ここでの、Before画像は、第一大臼歯の咬合面を撮影した画像である。模式図では、天然歯牙領域をハッチングなしで示しており、天然歯牙領域以外を斜線ハッチングで示している。また、画像Bは、画像Aに対し、撮影位置が右側にシフトしたために、画面全体に占める天然歯牙領域の面積が小さくなっている画像である。 Figure 5 is a diagram for explaining step S13 shown in Figure 4. Images A and B are schematic diagrams of before images of the same tooth (specific tooth) that have been binarized. The before image here is an image of the occlusal surface of a first molar. In the schematic diagram, the natural tooth area is shown without hatching, and areas other than the natural tooth area are shown with diagonal hatching. Furthermore, image B is an image in which the area occupied by the natural tooth area is smaller on the entire screen because the photographing position has been shifted to the right compared to image A.

 図5の画像A及びBに示すように、検出部52は、例えば、Before画像における歯牙領域とそれ以外の領域とを二値化処理し、画像に占める天然歯牙領域の割合を算出する。画像Aでは、天然歯牙領域の割合(歯牙面積割合)が62%であり、画像Bでは、天然歯牙領域の割合が27%である。 As shown in images A and B in Figure 5, the detection unit 52, for example, binarizes the tooth area and other areas in the before image and calculates the proportion of natural tooth area in the image. In image A, the proportion of natural tooth area (tooth area proportion) is 62%, and in image B, the proportion of natural tooth area is 27%.
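
 The binarization-based ratio described above can be sketched as follows (a hypothetical illustration; the 0/1 mask representation and the helper name are assumptions, and the embodiment may use any known detection method to produce the mask):

```python
def tooth_area_ratio(tooth_mask):
    """Ratio of natural-tooth pixels to all pixels in a binarized image.

    tooth_mask is a 2D list of 0/1 values, where 1 marks a pixel of
    the natural tooth region and 0 marks everything else.
    """
    total = sum(len(row) for row in tooth_mask)
    tooth = sum(sum(row) for row in tooth_mask)
    return tooth / total

# Toy 10x10 binarized image with 62 tooth pixels -> ratio 0.62 (cf. image A)
mask = ([[1] * 10 for _ in range(6)]
        + [[1, 1] + [0] * 8]
        + [[0] * 10 for _ in range(3)])
ratio = tooth_area_ratio(mask)  # 0.62
```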

 例えば、式1から歯垢スコアを算出する場合、画像Aに比べ、画像Bは天然歯牙と補綴物との境界部に残っている歯垢の影響を歯垢スコアが受けやすい。境界部は歯垢が残りやすい傾向があるので、画像Bが口腔ケア処理後の画像である場合、口腔ケア処理後であるにも関わらず、歯垢スコアが増えてしまう場合がある。つまり、天然歯牙領域の割合が少ないと歯垢スコアが正確に算出されないおそれがある。そこで、本開示では、ステップS13に示すように、歯牙面積割合が第1所定値以上である画像を抽出する処理を実行する。 For example, when calculating the plaque score using Equation 1, the plaque score for Image B is more susceptible to the influence of plaque remaining at the boundary between the natural tooth and the prosthesis than for Image A. Because plaque tends to remain at the boundary, if Image B is an image taken after oral care treatment, the plaque score may increase despite the fact that the oral care treatment has been performed. In other words, if the proportion of natural tooth area is small, there is a risk that the plaque score will not be calculated accurately. Therefore, in the present disclosure, as shown in step S13, a process is performed to extract images in which the tooth area proportion is equal to or greater than a first predetermined value.

 第1所定値は、歯垢スコアを正確に算出する観点から高い値が好ましく、40%以上であってもよいし、より好ましくは50%以上であってもよいし、さらに好ましくは60%以上であってもよい。また、第1所定値は、歯牙の種類(例えば、「切歯」「犬歯」「臼歯」)ごとに異なる値が設定されていてもよいし、同じ値が設定されていてもよい。 The first predetermined value is preferably a high value from the perspective of accurately calculating the plaque score, and may be 40% or more, more preferably 50% or more, and even more preferably 60% or more. Furthermore, the first predetermined value may be set to a different value for each type of tooth (e.g., "incisor," "canine," "molar"), or the same value may be set.

 なお、検出部52は、複数のBefore画像それぞれにおいて、歯牙面積割合を算出する。そして、選択部53は、複数のBefore画像の中に、歯牙面積割合が第1所定値以上の画像があるか否かを判定する。 The detection unit 52 calculates the tooth area ratio for each of the multiple before images. The selection unit 53 then determines whether or not there is an image among the multiple before images in which the tooth area ratio is equal to or greater than a first predetermined value.

 図4を再び参照して、次に、選択部53は、歯牙面積割合が第1所定値以上の画像があると判定した場合(S13でYes)、さらに歯牙面積割合が所定条件を満たすBefore画像を選択する(S14)。所定条件は、歯牙面積割合が第1所定値以上の1以上のBefore画像における歯牙面積割合の平均値、中央値又は最頻値に近いことであってもよいし、歯牙面積割合が最大値であることであってもよい。例えば、選択部53は、歯牙面積割合が第1所定値以上である1以上のBefore画像の中から、当該1以上のBefore画像における歯牙面積割合の平均値に最も近いBefore画像を選択してもよい。 Referring again to FIG. 4, if the selection unit 53 determines that there is an image whose tooth area ratio is equal to or greater than the first predetermined value (Yes in S13), it then selects a Before image whose tooth area ratio satisfies a predetermined condition (S14). The predetermined condition may be that the tooth area ratio is close to the average, median, or mode of the tooth area ratios of the one or more Before images whose tooth area ratio is equal to or greater than the first predetermined value, or that the tooth area ratio is the maximum. For example, the selection unit 53 may select, from among the one or more Before images whose tooth area ratio is equal to or greater than the first predetermined value, the Before image whose tooth area ratio is closest to the average of those ratios.
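
 The selection under the example condition "closest to the average" can be sketched as follows (a hypothetical helper for illustration; the threshold and the condition itself are configurable as described above, and a return value of None corresponds to the re-photographing case):

```python
def select_reference(ratios, threshold):
    """Pick the index of the image whose tooth-area ratio is closest
    to the mean of the ratios at or above the threshold.

    ratios: tooth-area ratio of each candidate Before image (0..1).
    Returns None when no ratio reaches the threshold.
    """
    candidates = [(i, r) for i, r in enumerate(ratios) if r >= threshold]
    if not candidates:
        return None  # no usable image; re-photographing is requested
    mean = sum(r for _, r in candidates) / len(candidates)
    return min(candidates, key=lambda ir: abs(ir[1] - mean))[0]

# Four candidate images with a 40% threshold; 0.27 is excluded,
# and 0.55 is closest to the mean of {0.62, 0.55, 0.48}.
idx = select_reference([0.62, 0.27, 0.55, 0.48], 0.4)  # 2
```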

 次に、選択部53は、歯牙面積割合が所定条件を満たすBefore画像、つまり選択されたBefore画像を、基準画像に決定する(S15)。つまり、選択部53は、1以上のBefore画像の中から、所定条件を満たすBefore画像を基準画像として選択する。選択部53は、基準画像を記憶部に記憶させる。 Next, the selection unit 53 determines the before image whose tooth area ratio satisfies a predetermined condition, i.e., the selected before image, as the reference image (S15). In other words, the selection unit 53 selects, from one or more before images, a before image that satisfies the predetermined condition as the reference image. The selection unit 53 stores the reference image in the storage unit.

 なお、選択部53は、1つのBefore画像を基準画像に決定するが、例えば、所定条件を満たす2以上のBefore画像を基準画像に決定してもよい。このように、選択部53は、1以上のBefore画像から基準画像を選択するための第1選択部として機能する。 Note that although the selection unit 53 determines one before image as the reference image, it may also determine, for example, two or more before images that satisfy predetermined conditions as the reference image. In this way, the selection unit 53 functions as a first selection unit for selecting a reference image from one or more before images.

 また、選択部53は、制御部55によりBefore画像が複数ないと判定された場合(S12でNo)、当該Before画像の歯牙面積割合が第1所定値以上であるか否かを判定し(S16)、歯牙面積割合が第1所定値以上であると判定した場合(S16でYes)、ステップS15に進む。選択部53は、例えば、Before画像の全画素数に対する歯牙領域の画素数の割合が第1所定値以上である場合、当該Before画像を基準画像として選択してもよい。画素数割合は、面積割合と同義である。 Furthermore, if the control unit 55 determines that there are not multiple before images (No in S12), the selection unit 53 determines whether the tooth area ratio of the before image is equal to or greater than a first predetermined value (S16), and if it determines that the tooth area ratio is equal to or greater than the first predetermined value (Yes in S16), proceeds to step S15. For example, if the ratio of the number of pixels in the tooth region to the total number of pixels in the before image is equal to or greater than a first predetermined value, the selection unit 53 may select the before image as the reference image. The pixel number ratio is synonymous with the area ratio.

 また、制御部55は、選択部53により歯牙面積割合が第1所定値以上の画像がないと判定された場合(S13でNo)、又は、歯牙面積割合が第1所定値以上ではないと判定された場合(S16でNo)、基準画像として用いることができるBefore画像がないので、口腔内の再撮影を通知する(S17、S18)。これにより、メタルインレーで大部分が覆われた歯牙、又は、画像内に天然歯牙が映っていない若しくは映っていても小さい画像の歯垢スコアが算出されることを抑制することができる。 Furthermore, if the selection unit 53 determines that there are no images in which the tooth area ratio is equal to or greater than the first predetermined value (No in S13), or if the selection unit 53 determines that the tooth area ratio is not equal to or greater than the first predetermined value (No in S16), the control unit 55 issues a notification to re-photograph the oral cavity (S17, S18) because there are no before images that can be used as reference images. This makes it possible to prevent plaque scores from being calculated for teeth that are mostly covered by metal inlays, or for images in which natural teeth are not visible or are visible but are small.

 続いて、歯磨き後の画像として適切な画像を選択する動作について、図6~図7Bを参照しながら説明する。図6は、本実施の形態に係る情報処理システムの第2動作(情報処理方法)を示すフローチャートである。 Next, the operation of selecting an appropriate image as an image after brushing teeth will be described with reference to Figures 6 to 7B. Figure 6 is a flowchart showing the second operation (information processing method) of the information processing system according to this embodiment.

 図6に示すように、通信部51は、口腔内カメラ10からユーザの歯磨き後の口腔内を撮影したAfter画像を取得する(S21)。After画像は、第2画像の一例であり、ここでは磨き残しをチェックする対象の歯牙である特定の歯牙を含む画像である。 As shown in FIG. 6, the communication unit 51 acquires, from the intraoral camera 10, an After image capturing the inside of the user's oral cavity after tooth brushing (S21). The After image is an example of the second image and, here, is an image including the specific tooth, which is the tooth to be checked for missed brushing areas.

 次に、制御部55は、After画像が複数あるか否かを判定する(S22)。制御部55は、ステップS21において、通信部51が取得したAfter画像が複数あるか否かを判定する。 Next, the control unit 55 determines whether there are multiple After images (S22). The control unit 55 determines whether there are multiple After images acquired by the communication unit 51 in step S21.

 次に、選択部53は、制御部55によりAfter画像が複数あると判定された場合(S22でYes)、複数のAfter画像のうち歯牙面積割合が第1所定値以上の画像があるか否かを判定する(S23)。ステップS23で用いる第1所定値は、例えば、ステップS13で用いる第1所定値と同じ値であるが、これに限定されない。ステップS23で用いる第1所定値は、例えば、ステップS15で基準画像に決定されたBefore画像の歯牙面積割合に応じて変更されてもよい。 Next, if the control unit 55 determines that there are multiple After images (Yes in S22), the selection unit 53 determines whether any of the multiple After images has a tooth area ratio equal to or greater than a first predetermined value (S23). The first predetermined value used in step S23 is, for example, the same value as the first predetermined value used in step S13, but is not limited to this. The first predetermined value used in step S23 may be changed, for example, depending on the tooth area ratio of the Before image determined as the reference image in step S15.

 なお、ステップS23の判定方法は、ステップS13の判定方法と同様であり、説明を省略する。 Note that the determination method in step S23 is the same as the determination method in step S13, and so a description thereof will be omitted.

 次に、選択部53は、歯牙面積割合が第1所定値以上の画像があると判定した場合(S23でYes)、さらに、歯牙面積割合が第1所定値以上のAfter画像において、基準画像との重なり度が第2所定値以上の画像があるか否かを判定する(S24)。選択部53は、ステップS23でYesと判定された1以上のAfter画像の中から、基準画像と画角が近い、つまり基準画像に類似したAfter画像を抽出する。例えば、選択部53は、Before画像の第1歯牙領域と類似する第2歯牙領域を有するAfter画像を比較画像として選択してもよい。また、例えば、選択部53は、1以上の第2画像のうち、基準画像に含まれる歯牙と同一の歯牙を、基準画像を撮影した位置により近い(例えば、最も近い)位置から撮影したAfter画像を、比較画像として選択してもよい。なお、重なり度は、重なり度合いの一例であり、第2所定値は、所定度合いの一例である。 Next, if the selection unit 53 determines that there is an image whose tooth area ratio is equal to or greater than the first predetermined value (Yes in S23), it further determines whether, among the After images whose tooth area ratio is equal to or greater than the first predetermined value, there is an image whose degree of overlap with the reference image is equal to or greater than a second predetermined value (S24). From the one or more After images determined as Yes in step S23, the selection unit 53 extracts an After image whose angle of view is close to that of the reference image, that is, an After image similar to the reference image. For example, the selection unit 53 may select, as the comparison image, an After image having a second tooth region similar to the first tooth region of the Before image. Also, for example, the selection unit 53 may select, as the comparison image, an After image, from among the one or more second images, in which the same tooth as the tooth included in the reference image is photographed from a position closer to (e.g., closest to) the position from which the reference image was photographed. Note that the overlap value is an example of a degree of overlap, and the second predetermined value is an example of a predetermined degree.

 なお、基準画像に含まれる歯牙と同一の歯牙であるか否かの判定は、公知のいかなる方法で行われてもよく、例えば、基準画像及び比較画像を撮影したときの口腔内カメラ10の姿勢(位置及び向き)、歯牙の形状(例えば、基準画像に含まれる歯牙の形状を基準としたパターンマッチング)などにより判定可能である。 In addition, the determination of whether the tooth is the same as the tooth included in the reference image may be made by any known method, and can be made, for example, based on the posture (position and orientation) of the intraoral camera 10 when the reference image and comparison image were captured, the shape of the tooth (for example, pattern matching based on the shape of the tooth included in the reference image), etc.

 選択部53は、基準画像とAfter画像とに含まれる天然歯牙の輪郭上に少なくとも3点の特徴点を抽出し、基準画像とAfter画像との特徴点を用いて、各特徴点の座標を一致するように相対位置を合わせる画像補正を行う。画像補正について、図7A及び図7Bを参照しながら説明する。 The selection unit 53 extracts at least three feature points on the contours of the natural teeth contained in the reference image and the after image, and performs image correction using the feature points in the reference image and the after image to align the relative positions of each feature point so that their coordinates match. Image correction will be explained with reference to Figures 7A and 7B.

 図7A及び図7Bは、図6に示すステップS24に行われる画像補正を説明するための図である。図7Aは、基準画像に対してAfter画像にX軸方向及びY軸方向のずれが発生している場合の相対位置を合わせる処理を説明するための図であり、図7Bは、基準画像に対してAfter画像にローテーションが発生している場合の相対位置を合わせる処理を説明するための図である。なお、図7Aの(c)及び図7Bの(c)は、便宜上、画像を拡大して図示している。 FIGS. 7A and 7B are diagrams illustrating the image correction performed in step S24 shown in FIG. 6. FIG. 7A is a diagram illustrating the process of adjusting the relative position when the after image is misaligned in the X-axis and Y-axis directions with respect to the reference image, and FIG. 7B is a diagram illustrating the process of adjusting the relative position when the after image is rotated with respect to the reference image. Note that (c) of FIG. 7A and (c) of FIG. 7B are enlarged images for convenience.

 図7Aの(a)は、基準画像を示し、図7Aの(b)は、ステップS23でYesと判定されたAfter画像であって、X軸方向及びY軸方向のずれが発生しているAfter画像を示す。X軸方向及びY軸方向のずれは、例えば、基準画像を撮影した位置とAfter画像を撮影した位置とが、X軸方向及びY軸方向にずれていた場合に発生し得る。 (a) in Figure 7A shows the reference image, and (b) in Figure 7A shows the after image for which step S23 was determined to be Yes, which shows deviations in the X-axis and Y-axis directions. Deviations in the X-axis and Y-axis directions can occur, for example, when the position at which the reference image was captured and the position at which the after image was captured are misaligned in the X-axis and Y-axis directions.

 この場合、図7Aの(c)に示すように、選択部53は、特徴点を利用して基準画像の歯牙領域(天然歯牙領域)とAfter画像の歯牙領域(天然歯牙領域)との輪郭を重ねるように、基準画像及びAfter画像の少なくとも一方をX軸方向及びY軸方向に平行に移動させる。平行に移動させることは、画像補正の一例である。 In this case, as shown in (c) of Figure 7A, the selection unit 53 uses feature points to shift at least one of the reference image and the after image parallel to the X-axis and Y-axis directions so that the contours of the tooth region (natural tooth region) in the reference image and the tooth region (natural tooth region) in the after image overlap. Shifting parallel is an example of image correction.

 図7Bの(a)は、基準画像を示し、図7Bの(b)は、ステップS23でYesと判定されたAfter画像であって、ローテーションが発生しているAfter画像を示す。ローテーションは、例えば、After画像を撮影したときの口腔内カメラ10の姿勢が基準画像を撮影したときの口腔内カメラ10の姿勢に対して、口腔内カメラ10の光軸を回転軸として回転していた場合に発生し得る。 (a) in Figure 7B shows the reference image, and (b) in Figure 7B shows the after image for which step S23 was determined to be Yes, in which rotation has occurred. Rotation can occur, for example, when the orientation of the intraoral camera 10 when the after image was captured is such that the intraoral camera 10 has rotated around the optical axis of the intraoral camera 10 relative to the orientation of the intraoral camera 10 when the reference image was captured.

 この場合、図7Bの(c)に示すように、選択部53は、特徴点を利用して基準画像の歯牙領域(天然歯牙領域)とAfter画像の歯牙領域(天然歯牙領域)との輪郭を重ねるように、基準画像及びAfter画像の少なくとも一方を回転させる。回転させることは、画像補正の一例である。 In this case, as shown in (c) of Figure 7B, the selection unit 53 uses the feature points to rotate at least one of the reference image and the after image so that the contours of the tooth region (natural tooth region) in the reference image and the tooth region (natural tooth region) in the after image overlap. Rotation is an example of image correction.

 そして、選択部53は、図7Aの(c)又は図7Bの(c)に示すように、歯牙領域の輪郭が重なっている状態において、歯牙領域の重なり部分の面積を算出する。 Then, the selection unit 53 calculates the area of the overlapping portion of the tooth regions when the contours of the tooth regions overlap, as shown in (c) of Figure 7A or (c) of Figure 7B.

 図7A及び図7Bに示すように、選択部53は、基準画像の歯牙領域の輪郭における特徴点(例えば、輪郭線上の特徴点)と、After画像の歯牙領域の輪郭における特徴点とに基づいて、基準画像と当該After画像との相対的な位置合わせを行い、位置合わせを行った基準画像と当該After画像とにおける重なり部分を特定する。図7Aの(c)及び図7Bの(c)では、基準画像の歯牙領域とAfter画像の歯牙領域との重なり部分を、ドットハッチングで示している。 As shown in Figures 7A and 7B, the selection unit 53 aligns the reference image and the after image relative to each other based on feature points (e.g., feature points on the contour line) in the outline of the tooth region in the reference image and feature points in the outline of the tooth region in the after image, and identifies overlapping areas between the aligned reference image and the after image. In Figures 7A (c) and 7B (c), the overlapping areas between the tooth region in the reference image and the tooth region in the after image are indicated by dot hatching.

 なお、画像補正は上記に限定されず、例えば、画像を拡縮することなどを含んでいてもよい。 Note that image correction is not limited to the above and may also include, for example, enlarging or reducing the image.

 重なり度は、重なり部分の面積(又は画素数)を基準画像の歯牙領域の面積(又は画素数)で除算した値であってもよいし、重なり部分の面積そのものであってもよい。重なり部分の面積とは、重なり部分の画素数であってもよい。 The degree of overlap may be the value obtained by dividing the area (or number of pixels) of the overlapping portion by the area (or number of pixels) of the tooth region in the reference image, or it may be the area of the overlapping portion itself. The area of the overlapping portion may also be the number of pixels in the overlapping portion.
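
 The degree of overlap defined here (overlap area divided by the reference tooth area) can be sketched as follows (a hypothetical illustration; the masks are assumed to have already been brought into registration by the feature-point alignment described above):

```python
def overlap_degree(ref_mask, after_mask):
    """Degree of overlap: overlapping tooth pixels divided by the
    reference image's tooth pixels (one definition given in the text).

    Both masks are same-sized 2D lists of 0/1 values marking the
    natural tooth region after alignment.
    """
    overlap = sum(
        r & a
        for ref_row, after_row in zip(ref_mask, after_mask)
        for r, a in zip(ref_row, after_row)
    )
    ref_area = sum(sum(row) for row in ref_mask)
    return overlap / ref_area

# Reference tooth region: 4 pixels; the After region is shifted by one
# column, so only 2 of the 4 reference pixels overlap -> degree 0.5
ref = [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
after = [[0, 0, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1], [0, 0, 0, 0]]
deg = overlap_degree(ref, after)  # 0.5
```

 With the alternative definition mentioned above, the raw overlap pixel count itself could be returned instead of the normalized value.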

 なお、図7Aの(a)及び図7Bの(a)に示す画像における歯牙領域は第1歯牙領域の一例であり、図7Aの(b)及び図7Bの(b)に示す画像における歯牙領域は第2歯牙領域の一例である。 Note that the tooth region in the images shown in Figures 7A(a) and 7B(a) is an example of a first tooth region, and the tooth region in the images shown in Figures 7A(b) and 7B(b) is an example of a second tooth region.

 図6を再び参照して、次に、選択部53は、重なり度が第2所定値以上であるAfter画像を比較画像に決定する(S25)。つまり、選択部53は、1以上のAfter画像の中から、重なり度が第2所定値以上であるAfter画像を比較画像として選択する。 Referring again to FIG. 6, next, the selection unit 53 determines, as the comparison image, an After image whose degree of overlap is equal to or greater than a second predetermined value (S25). In other words, the selection unit 53 selects, from one or more After images, an After image whose degree of overlap is equal to or greater than a second predetermined value as the comparison image.

 このような基準画像及び比較画像は、映っている歯牙領域における共通部分が多い類似した画像であり、歯牙の状態を比較しやすい画像となり得る。選択部53は、例えば1以上のAfter画像から、特定の歯牙の基準画像との類似度が高いAfter画像を比較画像として選択するとも言える。類似度とは、例えば、歯牙領域の重なり度に応じた値である。重なり度が高い場合、類似度も高くなると想定される。重なり度は、基準画像の歯牙領域とAfter画像の歯牙領域との重なり部分の面積に基づく値の一例である。面積に基づく値は、重なり度に限定されず、例えば歯牙領域の面積であってもよい。 Such a reference image and comparison image are similar images sharing a large common portion of the depicted tooth region, and can be images in which the state of the teeth is easy to compare. It can also be said that the selection unit 53 selects, as the comparison image, an After image having a high degree of similarity to the reference image of the specific tooth from, for example, one or more After images. The degree of similarity is, for example, a value corresponding to the degree of overlap of the tooth regions; when the degree of overlap is high, the degree of similarity is assumed to be high as well. The degree of overlap is an example of a value based on the area of the overlapping portion between the tooth region of the reference image and the tooth region of the After image. The value based on the area is not limited to the degree of overlap, and may be, for example, the area of the tooth region.

 選択部53は、比較画像を基準画像と対応付けて記憶部に記憶させる。なお、選択部53は、1つのAfter画像を比較画像に決定するが、例えば、重なり度が第2所定値以上である2以上のAfter画像を比較画像に決定してもよい。このように、選択部53は、1以上のAfter画像から比較画像を選択するための第2選択部として機能する。 The selection unit 53 associates the comparison image with the reference image and stores it in the storage unit. Note that the selection unit 53 determines one After image as the comparison image, but may also determine, for example, two or more After images whose degree of overlap is equal to or greater than a second predetermined value as the comparison images. In this way, the selection unit 53 functions as a second selection unit for selecting a comparison image from one or more After images.

 また、選択部53は、制御部55によりAfter画像が複数ないと判定された場合(S22でNo)、当該After画像の歯牙面積割合が第1所定値以上であるか否かを判定し(S26)、歯牙面積割合が第1所定値以上であると判定した場合(S26でYes)、ステップS24に進む。 Furthermore, if the control unit 55 determines that there are not multiple after images (No in S22), the selection unit 53 determines whether the tooth area ratio of the after image is equal to or greater than a first predetermined value (S26), and if it determines that the tooth area ratio is equal to or greater than the first predetermined value (Yes in S26), the selection unit 53 proceeds to step S24.

 また、制御部55は、選択部53により、重なり度が第2所定値未満であると判定された場合(S24でNo)、歯牙面積割合が第1所定値以上の画像がないと判定された場合(S23でNo)、又は、歯牙面積割合が第1所定値以上ではないと判定された場合(S26でNo)、比較画像として用いることができるAfter画像がないので、口腔内の再撮影を通知する(S27、S28)。 Furthermore, if the selection unit 53 determines that the degree of overlap is less than the second predetermined value (No in S24), if it determines that there are no images with a tooth area ratio equal to or greater than the first predetermined value (No in S23), or if it determines that the tooth area ratio is not equal to or greater than the first predetermined value (No in S26), the control unit 55 issues a notification to re-photograph the oral cavity (S27, S28) because there are no after images that can be used as comparison images.

 なお、選択部53は、ステップS24において、After画像における歯牙領域の割合と、基準画像における歯牙領域の割合との差異が第3所定値以内であるか否かを判定し、当該差異が第3所定値以内であると判定したAfter画像を比較画像として選択してもよい。第3所定値は、予め設定され、記憶部に記憶されている。 In step S24, the selection unit 53 may determine whether the difference between the proportion of tooth regions in the After image and the proportion of tooth regions in the reference image is within a third predetermined value, and select the After image for which it is determined that the difference is within the third predetermined value as the comparison image. The third predetermined value is set in advance and stored in the storage unit.

 続いて、上記で決定された基準画像と比較画像とを用いて口腔内の状態の評価を行う動作について、図8を参照しながら説明する。図8は、本実施の形態に係る口腔ケア状態の評価を行う動作(情報処理方法)を示すフローチャートである。 Next, the operation of evaluating the condition of the oral cavity using the reference image and comparison image determined above will be described with reference to Figure 8. Figure 8 is a flowchart showing the operation (information processing method) of evaluating the oral care condition according to this embodiment.

 図8に示すように、検出部52は、基準画像及び比較画像の歯垢領域を検出する(S41)。検出部52は、基準画像及び比較画像それぞれにおける歯垢内のバクテリアが産出する物質であるポルフィリンが赤みを帯びたピンク色に蛍光(励起蛍光)している部分を歯垢領域として検出する。基準画像に基づく特定の歯牙の歯垢領域は第1歯垢領域の一例であり、比較画像に基づく特定の歯牙の歯垢領域は第2歯垢領域の一例である。 As shown in FIG. 8, the detection unit 52 detects plaque regions in the reference image and the comparison image (S41). The detection unit 52 detects areas in the reference image and the comparison image where porphyrin, a substance produced by bacteria in plaque, fluoresces reddish pink (excited fluorescence) as plaque regions. The plaque region of a specific tooth based on the reference image is an example of a first plaque region, and the plaque region of a specific tooth based on the comparison image is an example of a second plaque region.

 次に、スコア算出部54は、基準画像及び比較画像の歯垢スコアを算出する(S42)。スコア算出部54は、基準画像に対して、第1歯牙領域と、第1歯垢領域とに基づいて、歯牙の状態を示す歯垢スコア(第1歯垢スコア)を算出し、比較画像に対して、第2歯牙領域と、第2歯垢領域とに基づいて歯垢スコア(第2歯垢スコア)を算出する。例えば、スコア算出部54は、基準画像における天然歯牙領域の画素数に対する第1歯垢領域の画素数の比に基づいて、第1歯垢スコアを算出し、比較画像における天然歯牙領域の画素数に対する第2歯垢領域の画素数の比に基づいて、第2歯垢スコアを算出する。第1歯垢スコア及び第2歯垢スコアは、同じ歯牙に対する歯垢スコアである。 Next, the score calculation unit 54 calculates plaque scores for the reference image and the comparison image (S42). The score calculation unit 54 calculates a plaque score (first plaque score) indicating the condition of the teeth based on the first tooth region and the first plaque region for the reference image, and calculates a plaque score (second plaque score) based on the second tooth region and the second plaque region for the comparison image. For example, the score calculation unit 54 calculates the first plaque score based on the ratio of the number of pixels in the first plaque region to the number of pixels in the natural tooth region in the reference image, and calculates the second plaque score based on the ratio of the number of pixels in the second plaque region to the number of pixels in the natural tooth region in the comparison image. The first plaque score and the second plaque score are plaque scores for the same tooth.

 次に、制御部55は、基準画像及び比較画像それぞれの歯垢スコアに基づいて、ユーザの口腔ケア状態を評価する(S43)。基準画像及び比較画像が口腔ケア処理前後(例えば、歯磨き前後)の画像である場合、制御部55は、第1歯垢スコアと第2歯垢スコアとに基づいて、口腔ケア処理の効果を評価してもよい。例えば、制御部55は、歯牙ごとに歯垢スコアを算出することで、どの歯牙において磨き残しが多いかなどを特定してもよい。 Next, the control unit 55 evaluates the user's oral care state based on the plaque scores of the reference image and the comparison image (S43). If the reference image and the comparison image are images taken before and after oral care treatment (e.g., before and after brushing), the control unit 55 may evaluate the effectiveness of the oral care treatment based on the first plaque score and the second plaque score. For example, by calculating a plaque score for each tooth, the control unit 55 may identify which teeth have the most areas left unbrushed.
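One conceivable way to quantify the effectiveness evaluated in step S43 is the relative reduction from the first plaque score to the second; this particular formula is an assumption for illustration, not the stated method.

```python
def care_effect_percent(first_score: float, second_score: float) -> float:
    """Plaque reduction rate (%) from the Before score to the After score."""
    if first_score <= 0.0:
        return 0.0  # nothing to remove; treat as no measurable effect
    return max(0.0, (first_score - second_score) / first_score * 100.0)

# A drop from 0.4 to 0.1 corresponds to a 75% reduction.
print(round(care_effect_percent(0.4, 0.1), 1))  # 75.0
```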

 制御部55は、第1歯垢スコアと第2歯垢スコアとに基づいて、口腔ケア処理前後の口腔ケア状態を比較するとも言える。このように、制御部55は、口腔ケア状態を比較する比較部として機能してもよい。 The control unit 55 can also be said to compare the oral care state before and after the oral care treatment based on the first plaque score and the second plaque score. In this way, the control unit 55 may function as a comparison unit that compares the oral care state.

 次に、制御部55は、口腔ケア状態の評価結果を表示部56に表示させる(S44)。制御部55は、第1歯垢スコア及び第2歯垢スコアを表示部56に表示させてもよいし、第1歯垢スコア及び第2歯垢スコアに基づく口腔ケア処理の効果を示す数値又はレベル等を表示部56に表示させてもよい。 Next, the control unit 55 causes the display unit 56 to display the evaluation results of the oral care state (S44). The control unit 55 may cause the display unit 56 to display the first plaque score and the second plaque score, or may cause the display unit 56 to display a numerical value or level indicating the effectiveness of the oral care treatment based on the first plaque score and the second plaque score.

 (実施の形態の変形例1)
 以下では、本変形例に係る情報処理システムについて、図9及び図10を参照しながら説明する。なお、以下では、実施の形態との相違点を中心に説明し、実施の形態と同一又は類似の内容については説明を省略又は簡略化する。本変形例に係る情報処理システムの構成は実施の形態に係る情報処理システムと同様であってもよく、説明を省略する。また、以下において、実施の形態に係る情報処理システムの符号を用いて説明する。
(First Modification of the Embodiment)
The information processing system according to this modification will be described below with reference to Figures 9 and 10. The following description will focus on differences from the embodiment, and descriptions of content that is the same as or similar to the embodiment will be omitted or simplified. The configuration of the information processing system according to this modification may be the same as the information processing system according to the embodiment, and description thereof will be omitted. The following description will also use the reference numerals of the information processing system according to the embodiment.

 図9は、本変形例に係る情報処理システムにおける撮影ガイドを説明するための図である。本変形例では、After画像を撮影する際に、撮影ガイド画像を表示部56に表示する点において、実施の形態に係る情報処理システムと相違する。 FIG. 9 is a diagram illustrating the imaging guide in the information processing system according to this modified example. This modified example differs from the information processing system according to the embodiment in that an imaging guide image is displayed on the display unit 56 when capturing an After image.

 図9の(a)は、基準画像を示し、図9の(b)は、基準画像に基づく撮影ガイド用の撮影ガイド画像を示す。撮影ガイド画像は、基準画像に含まれる特定の歯牙のシルエット、輪郭又は半透明の表示のいずれかを撮影ガイドとして含む画像である。図9の(b)では、基準画像に含まれる歯牙及びその周囲の輪郭を破線で示している。 (a) in Figure 9 shows a reference image, and (b) in Figure 9 shows a photography guide image for photography guidance based on the reference image. The photography guide image is an image that includes, as a photography guide, either a silhouette, outline, or semi-transparent display of a specific tooth included in the reference image. In (b) in Figure 9, the tooth included in the reference image and its surrounding outline are shown with dashed lines.

 図9の(c)は、撮影画像に撮影ガイド画像を重畳した画像を示す。図9の(c)では、撮影部21が受光した光により表示される撮影画像を実線で示し、当該撮影画像に重畳された撮影ガイド画像を破線で示している。ここでの撮影画像は、まだ画像の捕捉を行っていないが、撮影操作を行った場合に表示部56に表示される画像である。なお、捕捉された画像、つまりAfter画像に撮影ガイドは含まれない。 (c) in Figure 9 shows an image in which the imaging guide image is superimposed on the captured image. In (c) in Figure 9, the captured image displayed based on the light received by the imaging unit 21 is shown by a solid line, and the imaging guide image superimposed on the captured image is shown by a dashed line. The captured image here is an image that has not yet been captured, but is the image displayed on the display unit 56 when an imaging operation is performed. Note that the captured image, i.e., the After image, does not include the imaging guide.

 ユーザは、口腔内カメラ10で口腔ケア処理後の口腔内の画像(After画像)を撮影する。その際、ユーザは、表示部56に表示された、撮影部21が受光した光により表示される撮影画像を確認しながら、撮影位置などを調整する。本変形例では、撮影位置などを調整するための撮影画像に、撮影ガイド画像を重畳して表示する。 The user uses the intraoral camera 10 to capture an image of the oral cavity after oral care treatment (After image). At that time, the user adjusts the imaging position and the like while checking the captured image displayed on the display unit 56 based on the light received by the imaging unit 21. In this modified example, an imaging guide image is superimposed on the captured image used for adjusting the imaging position and the like.

 これにより、ユーザが、撮影ガイドを参考に、例えば、撮影ガイド画像における撮影ガイドと撮影画像における歯牙との位置を確認しながら、口腔内カメラ10の姿勢(位置及び向き)などを調整することができる。例えば、撮影ガイド画像における撮影ガイドと撮影画像における歯牙とが一致するか又は所定以上近づいた場合において撮影が行われることで、基準画像と近い位置で撮影されたAfter画像が取得され得る。 This allows the user to refer to the imaging guide and, for example, adjust the posture (position and orientation) of the intraoral camera 10 while checking the position of the imaging guide in the imaging guide image and the teeth in the captured image. For example, by capturing an image when the imaging guide in the imaging guide image and the teeth in the captured image match or are closer than a predetermined distance, an After image captured at a position close to the reference image can be obtained.
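The condition that the guide and the tooth "match or come closer than a predetermined degree" presupposes some numeric overlap measure between the guide's tooth region and the tooth region in the live image. Intersection-over-union is one plausible choice; the text does not commit to a specific definition of the overlap degree.

```python
import numpy as np

def overlap_degree(region_a: np.ndarray, region_b: np.ndarray) -> float:
    """Intersection-over-union of two boolean tooth-region masks."""
    inter = int((region_a & region_b).sum())
    union = int((region_a | region_b).sum())
    return inter / union if union else 0.0

# Two 1x4 masks sharing 2 of 4 occupied pixels -> IoU 0.5.
a = np.array([[True, True, True, False]])
b = np.array([[False, True, True, True]])
print(overlap_degree(a, b))  # 0.5
```

Capture could then be permitted, for example, once `overlap_degree` exceeds a chosen threshold.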

 なお、表示部56への撮影ガイド画像の表示は、例えば、制御部55により実行される。制御部55は、例えば、ユーザが口腔内カメラ10を用いてAfter画像を撮影する際、撮影ガイドを、口腔内カメラ10から取得され表示部56に表示された画像(撮影画像)に重畳表示させる。 The display of the imaging guide image on the display unit 56 is executed, for example, by the control unit 55. For example, when the user takes an after image using the intraoral camera 10, the control unit 55 superimposes the imaging guide on the image (taken image) acquired from the intraoral camera 10 and displayed on the display unit 56.
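The superimposed display performed by the control unit 55 can be realized, for example, by plain alpha blending of the guide over the live frame; the blend ratio, and the use of alpha blending itself, are assumptions for illustration.

```python
import numpy as np

def superimpose_guide(frame: np.ndarray, guide: np.ndarray,
                      alpha: float = 0.4) -> np.ndarray:
    """Blend a semi-transparent guide image over the live camera frame."""
    blended = (1.0 - alpha) * frame.astype(np.float64) \
              + alpha * guide.astype(np.float64)
    return blended.clip(0, 255).astype(np.uint8)

frame = np.zeros((2, 2, 3), dtype=np.uint8)      # dark live view
guide = np.full((2, 2, 3), 255, dtype=np.uint8)  # white guide lines
print(superimpose_guide(frame, guide)[0, 0, 0])  # 102
```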

 続いて、本変形例に係る情報処理システムの動作について、図10を参照しながら説明する。図10は、本変形例に係る情報処理システムの動作(情報処理方法)を示すフローチャートである。 Next, the operation of the information processing system according to this modified example will be described with reference to FIG. 10. FIG. 10 is a flowchart showing the operation (information processing method) of the information processing system according to this modified example.

 図10に示すように、情報処理システムは、図6に示すフローチャートに加えて、ステップS31を実行する。 As shown in Figure 10, the information processing system executes step S31 in addition to the flowchart shown in Figure 6.

 制御部55は、口腔内カメラ10がAfter画像を撮影する前に、撮影ガイドを表示部56の表示に重畳表示させる(S31)。制御部55は、例えば、ユーザからAfter画像を撮影することを示す操作を受け付けると、表示部56に撮影ガイドを表示させてもよい。また、撮影ガイドの表示のオン及びオフの設定が可能であり、制御部55は、例えば、撮影ガイドの表示の設定がオンになっている場合のみ、撮影ガイドを表示部56に表示させてもよい。また、制御部55は、ステップS27又はS28の通知が行われ再撮影が行われる場合に、表示部56に撮影ガイドを表示させてもよい。 The control unit 55 superimposes the imaging guide on the display of the display unit 56 before the intraoral camera 10 captures an After image (S31). For example, the control unit 55 may display the imaging guide on the display unit 56 when it receives an operation from the user indicating that an After image is to be captured. The display of the imaging guide can also be set on or off, and the control unit 55 may, for example, display the imaging guide on the display unit 56 only when the imaging-guide display setting is on. The control unit 55 may also display the imaging guide on the display unit 56 when the notification in step S27 or S28 is given and the image is recaptured.

 (実施の形態の変形例2)
 以下では、本変形例に係る情報処理システムについて、図11を参照しながら説明する。なお、以下では、実施の形態との相違点を中心に説明し、実施の形態と同一又は類似の内容については説明を省略又は簡略化する。本変形例に係る情報処理システムの構成は実施の形態に係る情報処理システムと同様であってもよく、説明を省略する。また、以下において、実施の形態に係る情報処理システムの符号を用いて説明する。
(Modification 2 of the embodiment)
The information processing system according to this modification will be described below with reference to FIG. 11 . The following description will focus on differences from the embodiment, and descriptions of content that is the same as or similar to the embodiment will be omitted or simplified. The configuration of the information processing system according to this modification may be the same as the information processing system according to the embodiment, and description thereof will be omitted. The following description will also use the reference numerals of the information processing system according to the embodiment.

 図11は、本変形例に係る撮影対象の歯牙を選択する画面を示す図である。図11では、ユーザが撮影したRGB画像(例えば、基準画像、After画像)を、どの歯を撮影した画像であるかを示す情報と対応付けて記憶する例について説明する。 FIG. 11 shows a screen for selecting the tooth to be photographed in this modified example. FIG. 11 illustrates an example in which an RGB image photographed by the user (e.g., the reference image or the After image) is stored in association with information indicating which tooth the image shows.

 図11に示すように、携帯端末50の表示部56は、ユーザがこれから撮影するRGB画像を選択するための表示を行う。表示部56は、例えば、上顎及び下顎それぞれの歯列の表示を行う。歯列の表示は、画像であってもよいし、イラストであってもよい。 As shown in FIG. 11, the display unit 56 of the mobile terminal 50 displays a screen that allows the user to select the RGB image to be captured. The display unit 56 displays, for example, the rows of teeth of the upper and lower jaws. The display of the rows of teeth may be an image or an illustration.

 ユーザは、次に撮影する歯牙の選択を行う。図11では、下顎の左側の第1大臼歯が選択されている例を示している。 The user selects the next tooth to photograph. Figure 11 shows an example in which the first molar on the left side of the lower jaw has been selected.

 制御部55は、ユーザの選択を受け付けた後に口腔内カメラ10からRGB画像を取得すると、取得したRGB画像と下顎の左側の第1大臼歯を示す情報とを対応付けて記憶させる。例えば、制御部55は、基準画像と下顎の左側の第1大臼歯を示す情報とを対応付けて記憶させる。 After receiving the user's selection, the control unit 55 acquires an RGB image from the intraoral camera 10, and stores the acquired RGB image in association with information indicating the first molar on the left side of the mandible. For example, the control unit 55 stores the reference image in association with information indicating the first molar on the left side of the mandible.

 なお、歯牙の選択はRGB画像を撮影する前に行われてもよいし、撮影の後に行われてもよい。RGB画像と、歯牙の種類又は位置とが対応付けて記憶されていればよい。 Note that tooth selection may be performed before or after capturing the RGB image. It is sufficient that the RGB image and the tooth type or position are stored in association with each other.

 なお、RGB画像が当該歯牙を頬側から撮影したか舌側から撮影したか噛み合わせ面側から撮影したかを示す撮影方向情報がさらにRGB画像に対応付けて記憶されてもよい。撮影方向情報は、例えば、ユーザにより入力されてもよい。 In addition, imaging direction information indicating whether the RGB image was taken from the cheek side, tongue side, or occlusal side of the tooth may also be stored in association with the RGB image. The imaging direction information may be input by the user, for example.
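The association between a captured RGB image, the tooth it shows, and the imaging direction can be modeled with a small record type. All field names and example values below are illustrative assumptions, not taken from the text:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ToothImageRecord:
    """Links one stored RGB image with the tooth it shows."""
    image_id: str                    # identifier of the stored RGB image
    tooth: str                       # e.g. "lower-left first molar"
    role: str                        # "reference" or "after"
    direction: Optional[str] = None  # "buccal" / "lingual" / "occlusal"

records: List[ToothImageRecord] = [
    ToothImageRecord("img_001", "lower-left first molar", "reference", "buccal"),
    ToothImageRecord("img_002", "lower-left first molar", "after", "buccal"),
]

# Look up the reference image for a given tooth.
ref = next(r for r in records
           if r.tooth == "lower-left first molar" and r.role == "reference")
print(ref.image_id)  # img_001
```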

 また、After画像を撮影する際、ユーザにより基準となる基準画像が選択されると、表示部56は、基準画像がどの歯牙を撮影した画像であるかを示す情報を表示してもよい。例えば、表示部56は、図11に示すような口腔内の歯列表示において、基準画像がどの歯牙の画像であるかを指し示してもよいし、文字情報を表示してもよい。 Furthermore, when an After image is captured, if the user selects a reference image to serve as a reference, the display unit 56 may display information indicating which tooth the reference image is an image of. For example, in a display of the rows of teeth in the oral cavity as shown in FIG. 11, the display unit 56 may indicate which tooth the reference image is an image of, or may display text information.

 そして、このように撮影された基準画像とAfter画像とに対する重なり度が判定されてもよい。 The degree of overlap between the reference image and the after image captured in this manner may then be determined.

 また、制御部55は、歯牙の種類と、歯垢量とを対応付けて記憶部に記憶させてもよい。 Furthermore, the control unit 55 may store the tooth type and the amount of plaque in association with each other in the storage unit.

 また、制御部55は、基準画像およびAfter画像を記憶部から読み出し表示部56に表示させる際、歯牙の種類又は歯牙の位置を当該画像に対して重畳表示してもよい。 Furthermore, when reading the reference image and the After image from the storage unit and displaying them on the display unit 56, the control unit 55 may superimpose the tooth type or tooth position on the image.

 (その他の実施の形態)
 以上、本開示の実施の形態等に係る情報処理システムについて説明したが、本開示は、この実施の形態等に限定されるものではない。
(Other embodiments)
Although the information processing system according to the embodiments of the present disclosure has been described above, the present disclosure is not limited to the embodiments.

 例えば、上記実施の形態等では、歯牙を撮影することを主目的とした口腔内カメラ10を用いる例を説明したが、口腔内カメラ10は、カメラを備える口腔内ケア機器であってもよい。例えば、口腔内カメラ10は、カメラを備える口腔内洗浄機等であってもよい。 For example, in the above embodiments, an example has been described in which an intraoral camera 10 is used whose main purpose is to photograph teeth, but the intraoral camera 10 may also be an oral care device equipped with a camera. For example, the intraoral camera 10 may also be an oral irrigator equipped with a camera.

 また、上記実施の形態等では、情報処理装置として携帯端末50を例示したが、情報処理装置は、据え置き型の情報端末であってもよいし、サーバ装置であってもよい。 Furthermore, in the above embodiments, a mobile terminal 50 is used as an example of an information processing device, but the information processing device may also be a stationary information terminal or a server device.

 また、上記実施の形態等では、重なり度合いの一例として重なり度を例示したが、例えば、重なり度合いは、「高」、「中」、「低」などの段階で示されてもよい。 Furthermore, in the above embodiments, a numerical overlap degree is given as an example of the degree of overlap, but the degree of overlap may also be expressed in levels such as "high," "medium," or "low."

 また、上記実施の形態等では、情報処理装置(実施の形態等では、携帯端末50)が表示部を備える例について説明したがこれに限定されず、情報処理装置と別体の装置として、情報処理装置と通信可能な表示装置が設けられてもよい。 Furthermore, in the above embodiments, an example has been described in which the information processing device (in the embodiments, the mobile terminal 50) is equipped with a display unit, but this is not limited to this, and a display device capable of communicating with the information processing device may also be provided as a device separate from the information processing device.

 また、上記実施の形態等に係る情報処理システムに含まれる各処理部は典型的には集積回路であるLSIとして実現される。これらは個別に1チップ化されてもよいし、一部又は全てを含むように1チップ化されてもよい。 Furthermore, each processing unit included in the information processing system according to the above-described embodiments is typically realized as an LSI, which is an integrated circuit. These may be individually implemented as single chips, or some or all of them may be integrated into a single chip.

 また、集積回路化はLSIに限るものではなく、専用回路又は汎用プロセッサで実現してもよい。LSI製造後にプログラムすることが可能なFPGA(Field Programmable Gate Array)、又はLSI内部の回路セルの接続や設定を再構成可能なリコンフィギュラブル・プロセッサを利用してもよい。 Furthermore, integrated circuits are not limited to LSIs, but may be realized using dedicated circuits or general-purpose processors. FPGAs (Field Programmable Gate Arrays), which can be programmed after LSI manufacturing, or reconfigurable processors, which allow the connections and settings of circuit cells within the LSI to be reconfigured, may also be used.

 また、上記各実施の形態等において、各構成要素は、専用のハードウェアで構成されるか、各構成要素に適したソフトウェアプログラムを実行することによって実現されてもよい。各構成要素は、CPU又はプロセッサなどのプログラム実行部が、ハードディスク又は半導体メモリなどの記録媒体に記録されたソフトウェアプログラムを読み出して実行することによって実現されてもよい。 Furthermore, in each of the above embodiments, each component may be configured with dedicated hardware, or may be realized by executing a software program appropriate for that component. Each component may also be realized by a program execution unit such as a CPU or processor reading and executing a software program recorded on a recording medium such as a hard disk or semiconductor memory.

 また、ブロック図における機能ブロックの分割は一例であり、複数の機能ブロックを一つの機能ブロックとして実現したり、一つの機能ブロックを複数に分割したり、一部の機能を他の機能ブロックに移してもよい。また、類似する機能を有する複数の機能ブロックの機能を単一のハードウェア又はソフトウェアが並列又は時分割に処理してもよい。 Furthermore, the division of functional blocks in the block diagram is one example; multiple functional blocks may be realized as a single functional block, one functional block may be divided into multiple blocks, or some functions may be moved to other functional blocks. Furthermore, the functions of multiple functional blocks with similar functions may be processed in parallel or time-shared by a single piece of hardware or software.

 また、上記実施の形態等に係る情報処理装置(例えば、携帯端末50)は、単一の装置として実現されてもよいし、複数の装置により実現されてもよい。情報処理装置が複数の装置によって実現される場合、当該情報処理装置が有する各構成要素は、複数の装置にどのように振り分けられてもよい。例えば、情報処理装置の機能のうち、少なくとも一部の機能は口腔内カメラ10(例えば、信号処理部30)により実現されてもよい。情報処理装置が複数の装置で実現される場合、当該複数の装置間の通信方法は、特に限定されず、無線通信であってもよいし、有線通信であってもよい。また、装置間では、無線通信及び有線通信が組み合わされてもよい。 Furthermore, the information processing device (e.g., mobile terminal 50) according to the above-described embodiments may be realized as a single device, or may be realized by multiple devices. When an information processing device is realized by multiple devices, the components of the information processing device may be distributed in any manner among the multiple devices. For example, at least some of the functions of the information processing device may be realized by the intraoral camera 10 (e.g., signal processing unit 30). When an information processing device is realized by multiple devices, the communication method between the multiple devices is not particularly limited, and may be wireless communication or wired communication. Furthermore, wireless communication and wired communication may be combined between the devices.

 また、本開示は、情報処理システムにより実行される情報処理方法として実現されてもよい。また、本開示は、情報処理システムに含まれる口腔内カメラ、携帯端末、又はクラウドサーバとして実現されてもよい。 The present disclosure may also be realized as an information processing method executed by an information processing system. The present disclosure may also be realized as an intraoral camera, mobile terminal, or cloud server included in the information processing system.

 また、シーケンス図における各ステップが実行される順序は、本開示を具体的に説明するために例示するためのものであり、上記以外の順序であってもよい。また、上記ステップの一部が、他のステップと同時(並列)に実行されてもよい。 Furthermore, the order in which each step in the sequence diagram is executed is merely an example to specifically explain the present disclosure, and orders other than those described above may also be used. Furthermore, some of the steps may be executed simultaneously (in parallel) with other steps.

 また、本開示の一態様は、図4、図6及び図8のいずれかに示される情報処理方法に含まれる特徴的な各ステップをコンピュータに実行させるコンピュータプログラムであってもよい。 Furthermore, one aspect of the present disclosure may be a computer program that causes a computer to execute each of the characteristic steps included in the information processing method shown in any of Figures 4, 6, and 8.

 また、例えば、プログラムは、コンピュータに実行させるためのプログラムであってもよい。また、本開示の一態様は、そのようなプログラムが記録された、コンピュータ読み取り可能な非一時的な記録媒体であってもよい。例えば、そのようなプログラムを記録媒体に記録して頒布又は流通させてもよい。例えば、頒布されたプログラムを、他のプロセッサを有する装置にインストールして、そのプログラムをそのプロセッサに実行させることで、その装置に、上記各処理を行わせることが可能となる。 Furthermore, for example, the program may be a program to be executed by a computer. Furthermore, one aspect of the present disclosure may be a computer-readable non-transitory recording medium on which such a program is recorded. For example, such a program may be recorded on a recording medium and distributed or circulated. For example, by installing the distributed program in a device having another processor and having that processor execute the program, it becomes possible to cause that device to perform each of the above processes.

 以上、一つ又は複数の態様に係る情報処理システム等について、実施の形態等に基づいて説明したが、本開示は、この実施の形態等に限定されるものではない。本開示の趣旨を逸脱しない限り、当業者が思いつく各種変形を本実施の形態に施したものや、異なる実施の形態における構成要素を組み合わせて構築される形態も、一つ又は複数の態様の範囲内に含まれてもよい。 The information processing system, etc. relating to one or more aspects has been described above based on the embodiments, etc., but the present disclosure is not limited to these embodiments, etc. As long as it does not deviate from the spirit of the present disclosure, various modifications that a person skilled in the art could conceive of to this embodiment, or forms constructed by combining components from different embodiments, may also be included within the scope of one or more aspects.

 本開示は、口腔内の状態を確認するための情報処理システムに適用できる。 This disclosure can be applied to an information processing system for checking the condition of the oral cavity.

 10  口腔内カメラ
 10a  ヘッド部
 10b  ハンドル部
 20  ハード部
 21  撮影部(撮影装置)
 22  センサ部
 23  照明部
 23A  第1のLED
 23B  第2のLED
 23C  第3のLED
 23D  第4のLED
 24  操作部
 30  信号処理部
 31  カメラ制御部
 32  画像処理部
 33  制御部
 34  照明制御部
 35  メモリ部
 40  通信部
 50  携帯端末(情報処理装置)
 51  通信部(取得部)
 52  検出部
 53  選択部(第1選択部、第2選択部)
 54  スコア算出部
 55  制御部(比較部)
 56  表示部(表示装置)
10 Intraoral camera
10a Head part
10b Handle part
20 Hardware part
21 Imaging unit (imaging device)
22 Sensor unit
23 Illumination unit
23A First LED
23B Second LED
23C Third LED
23D Fourth LED
24 Operation unit
30 Signal processing unit
31 Camera control unit
32 Image processing unit
33 Control unit
34 Illumination control unit
35 Memory unit
40 Communication unit
50 Mobile terminal (information processing device)
51 Communication unit (acquisition unit)
52 Detection unit
53 Selection unit (first selection unit, second selection unit)
54 Score calculation unit
55 Control unit (comparison unit)
56 Display unit (display device)

Claims (11)

 口腔内を撮影することで得られる特定の歯牙を含む第1画像を取得する取得部と、
 前記第1画像から前記特定の歯牙の第1歯牙領域を検出する検出部と、
 前記第1画像における前記第1歯牙領域の割合が所定値以上である場合、当該第1画像を前記特定の歯牙の基準画像として選択する選択部と、を備える
 情報処理装置。
An information processing device comprising:
an acquisition unit that acquires a first image including a specific tooth by photographing the inside of the oral cavity;
a detection unit that detects a first tooth region of the specific tooth from the first image; and
a selection unit that selects the first image as a reference image of the specific tooth when a proportion of the first tooth region in the first image is equal to or greater than a predetermined value.
 前記取得部は、さらに、前記口腔内を撮影することで得られる前記特定の歯牙を含む第2画像を取得し、
 前記検出部は、前記第2画像から前記特定の歯牙の第2歯牙領域を検出し、
 前記選択部は、前記第1画像の前記第1歯牙領域と類似する第2歯牙領域を有する第2画像を、前記基準画像の比較対象である比較画像として選択する
 請求項1に記載の情報処理装置。
The acquisition unit further acquires a second image including the specific tooth obtained by photographing the oral cavity,
the detection unit detects a second tooth region of the specific tooth from the second image;
The information processing device according to claim 1 , wherein the selection unit selects a second image having a second tooth region similar to the first tooth region of the first image as a comparison image to be compared with the reference image.
 前記取得部は、1以上の前記第2画像を取得し、
 前記選択部は、1以上の前記第2画像のうち、当該第2画像における前記第2歯牙領域と、前記基準画像における前記第1歯牙領域との重なり度合いが所定度合い以上である第2画像を前記比較画像として選択する
 請求項2に記載の情報処理装置。
the acquisition unit acquires one or more second images;
The information processing device according to claim 2, wherein the selection unit selects, from among one or more of the second images, a second image in which the degree of overlap between the second tooth region in the second image and the first tooth region in the reference image is equal to or greater than a predetermined degree as the comparison image.
 さらに、前記口腔内を撮影する撮影装置が前記第2画像を撮影する際、前記基準画像に基づく撮影ガイドを、前記撮影装置から取得され表示装置に表示された画像に重畳表示させる制御部を備える
 請求項2又は3に記載の情報処理装置。
The information processing device according to claim 2 or 3, further comprising a control unit that, when an imaging device that photographs the oral cavity photographs the second image, superimposes an imaging guide based on the reference image on an image acquired from the imaging device and displayed on a display device.
 前記撮影ガイドは、前記基準画像に含まれる前記特定の歯牙のシルエット、輪郭、又は半透明の表示を含む
 請求項4に記載の情報処理装置。
The information processing device according to claim 4 , wherein the imaging guide includes a silhouette, an outline, or a semi-transparent display of the specific tooth included in the reference image.
 前記検出部は、さらに、前記基準画像に基づく前記特定の歯牙の第1歯垢領域と、前記比較画像に基づく前記特定の歯牙の第2歯垢領域とを検出し、
 さらに、
 前記基準画像において、前記第1歯牙領域と前記第1歯垢領域とに基づいて、第1歯垢スコアを算出し、かつ、前記比較画像において、前記第2歯牙領域と前記第2歯垢領域とに基づいて、第2歯垢スコアを算出するスコア算出部と、
 前記第1歯垢スコアと前記第2歯垢スコアとに基づいて、口腔ケア状態を比較する比較部とを備える
 請求項2又は3に記載の情報処理装置。
the detection unit further detects a first plaque region of the specific tooth based on the reference image and a second plaque region of the specific tooth based on the comparison image;
moreover,
a score calculation unit that calculates a first plaque score based on the first tooth region and the first plaque region in the reference image, and calculates a second plaque score based on the second tooth region and the second plaque region in the comparison image;
The information processing device according to claim 2 or 3, further comprising a comparison unit that compares an oral care state based on the first plaque score and the second plaque score.
 前記スコア算出部は、
 前記基準画像における天然歯牙領域の画素数に対する前記第1歯垢領域の画素数の比に基づいて、前記第1歯垢スコアを算出し、
 前記比較画像における天然歯牙領域の画素数に対する前記第2歯垢領域の画素数の比に基づいて、前記第2歯垢スコアを算出する
 請求項6に記載の情報処理装置。
The score calculation unit
calculating the first plaque score based on a ratio of the number of pixels in the first plaque region to the number of pixels in the natural tooth region in the reference image;
The information processing device according to claim 6 , wherein the second plaque score is calculated based on a ratio of the number of pixels in the second plaque region to the number of pixels in the natural tooth region in the comparison image.
 前記選択部は、前記第1画像の全画素数に対する前記第1歯牙領域の画素数の割合が前記所定値以上である場合、当該第1画像を前記基準画像として選択する
 請求項1~3のいずれか1項に記載の情報処理装置。
The information processing device according to any one of claims 1 to 3, wherein the selection unit selects the first image as the reference image when a ratio of the number of pixels of the first tooth region to the total number of pixels of the first image is equal to or greater than the predetermined value.
 前記第1画像は、ユーザが口腔ケア処理を行う前の口腔内を撮影した画像であり、
 前記第2画像は、前記ユーザが口腔ケア処理を行った後の口腔内を撮影した画像である
 請求項2又は3に記載の情報処理装置。
the first image is an image of the inside of the oral cavity taken before the user performs oral care treatment,
The information processing device according to claim 2 or 3, wherein the second image is an image of the inside of the oral cavity after the user has performed oral care treatment.
 口腔内を撮影することで得られる特定の歯牙を含む画像を取得し、
 前記画像から前記特定の歯牙の歯牙領域を検出し、
 前記画像における前記歯牙領域の割合が所定値以上である場合、当該画像を前記特定の歯牙の基準画像として選択する
 情報処理方法。
An information processing method comprising:
acquiring an image including a specific tooth by photographing the inside of the oral cavity;
detecting a tooth region of the specific tooth from the image; and
selecting the image as a reference image of the specific tooth when a proportion of the tooth region in the image is equal to or greater than a predetermined value.
 請求項10に記載の情報処理方法をコンピュータに実行させるためのプログラム。 A program for causing a computer to execute the information processing method described in claim 10.
PCT/JP2025/003101 2024-02-15 2025-01-30 Information processing device, information processing method, and program Pending WO2025173550A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2024-021355 2024-02-15
JP2024021355 2024-02-15

Publications (1)

Publication Number Publication Date
WO2025173550A1 true WO2025173550A1 (en) 2025-08-21

Family

ID=96773009

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2025/003101 Pending WO2025173550A1 (en) 2024-02-15 2025-01-30 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2025173550A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011182993A (en) * 2010-03-09 2011-09-22 Panasonic Electric Works Co Ltd Dental plaque detector
JP2016140760A (en) * 2015-01-30 2016-08-08 デンタル・イメージング・テクノロジーズ・コーポレーション Intra-oral image acquisition alignment
JP2018134418A (en) * 2017-02-23 2018-08-30 正樹 神原 Photographing evaluation/detection unit and optical device
WO2022176942A1 (en) * 2021-02-22 2022-08-25 パナソニックIpマネジメント株式会社 Intraoral camera system and photography operation determination method


Similar Documents

Publication Publication Date Title
JP5968944B2 (en) Endoscope system, processor device, light source device, operation method of endoscope system, operation method of processor device, operation method of light source device
JP4088313B2 (en) Image processing system, hospital processing system
CN103796566B (en) Endoscopic system and method for displaying image
JPWO2020036121A1 (en) Endoscope system
JPWO2019220848A1 (en) Endoscope device, endoscope operation method, and program
CN106714651B (en) Evaluation value calculation device and electronic endoscope system
JP2018134418A (en) Photographing evaluation/detection unit and optical device
EP4620433A1 (en) Image processing method, image processing device, and program
JP2002529122A (en) System and method for analyzing tooth shade
CN112004454B (en) endoscope system
TWM503883U (en) Dental plaque detector with camera
US11547304B2 (en) Composite device for medical image capturing
JP7784630B2 (en) Plaque detection device, plaque detection method and program
JPWO2018043726A1 (en) Endoscope system
WO2025173550A1 (en) Information processing device, information processing method, and program
WO2025173551A1 (en) Information processing device, information processing method, and program
WO2022113995A1 (en) Dentition image capturing system and dentition image capturing method
JP6120758B2 (en) Medical system
JP7675374B2 (en) Intraoral camera, lighting control device, and lighting control method
US20160242678A1 (en) Organ image photographing apparatus
JP2013074929A (en) Oral cavity interior observation device and oral cavity interior observation system
CN111712178A (en) Endoscope system and its working method
WO2025197661A1 (en) Periodontal disease detection system, periodontal disease detection method, and program
WO2025197670A1 (en) Periodontal disease detection system, periodontal disease detection method, and program
JP4831962B2 (en) Imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25754876

Country of ref document: EP

Kind code of ref document: A1