
WO2024260743A1 - Intraoral scanning system for determining an infrared signal - Google Patents


Info

Publication number
WO2024260743A1
Authority
WO
WIPO (PCT)
Prior art keywords
visible light
light
infrared
signals
power level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/065626
Other languages
English (en)
Inventor
Philip Grabow WESTERGAARD
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
3Shape AS
Original Assignee
3Shape AS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3Shape AS filed Critical 3Shape AS
Publication of WO2024260743A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0088 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1079 Measuring physical dimensions, e.g. size of the entire body or parts thereof using optical or photographic means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C9/00 Impression cups, i.e. impression trays; Impression methods
    • A61C9/004 Means or methods for taking digitized impressions
    • A61C9/0046 Data acquisition means or methods
    • A61C9/0053 Optical means or methods, e.g. scanning the teeth by a laser or light beam
    • A61C9/006 Optical means or methods, e.g. scanning the teeth by a laser or light beam projecting one or more stripes or patterns on the teeth
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2509 Color coding

Definitions

  • the disclosure relates to an intraoral scanning system. More specifically, the disclosure relates to one or more processors of the system that are configured to acquire infrared images by changing a power level of the emitted infrared light.
  • ionizing radiation e.g., X-rays
  • X-ray bitewing radiographs are often used to provide non-quantitative images of the teeth's internal structures.
  • Such images are typically limited in their ability to show early tooth mineralization changes (e.g. initial caries), resulting in underestimation of the demineralization depth; they are unable to assess the presence or absence of micro-cavitation; and they frequently result in overlap of the approximal tooth surfaces, which requires repetition of the radiograph acquisition and thus may involve a lengthy and expensive procedure.
  • NIR near-infrared
  • the cycling between different light sources may be provided by switching the different light sources on and off, or by applying a mechanical filter in front of the different light sources, which provides equivalent on/off switching by blocking the unwanted emitted wavelengths.
  • the mechanical filter may be a filter wheel with different blocking filters.
  • slower movement of the handheld intraoral scanner is needed to obtain optimal image quality for both surface information and inner-region information when generating a three-dimensional model of the scanned dental object that includes both surface and inner-region information.
  • the on-off switching creates unwanted transients on the emitted light pulses, and additionally, it creates timing issues between the light sources.
  • mechanical filtering results in a bulky solution and thus a larger handheld intraoral scanner, which ultimately makes the scanning experience uncomfortable for the patient.
  • an intraoral scanning system may be configured to determine 3D data of a dental object in an oral cavity.
  • the intraoral scanning system may comprise a handheld intraoral scanner that includes a projector unit configured to emit visible light and infrared light during a scan sequence.
  • the system may comprise an image sensor unit configured to acquire the visible light signals and infrared signals from at least the dental object caused by the emitted visible light and the emitted infrared light, respectively.
  • the image sensor unit may be a high-speed camera with a frame rate above 60 frames per second.
  • the image sensor unit may be a very high-speed camera with a frame rate above 500 frames per second.
  • the system may include one or more processors configured to control a power level of the emitted infrared light to a first power level during a first time period of the scan sequence and, during a second time period, to switch the power level of the emitted infrared light between the first power level and a second power level, wherein the first power level is lower than the second power level.
  • the one or more processors are further configured to determine 3D data based on the visible light signals acquired during the first time period and to determine an inner region of the dental object based on the infrared signals acquired during the second time period.
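  • As a hedged illustration of this control scheme (not taken from the disclosure; the power levels, period lengths and names below are assumptions), a processor could schedule the infrared power level and route the acquired frames roughly as follows:

P_IR_LOW = 0.1          # first power level (fraction of full drive), assumed value
P_IR_HIGH = 1.0         # second power level, assumed value
FIRST_PERIOD_MS = 200   # first time period: surface / 3D acquisition
SECOND_PERIOD_MS = 40   # second time period: fluorescence + infrared acquisition

def ir_power_level(t_ms, visible_pulse_on):
    """Return the IR drive level at time t_ms within one repetition of the sequence."""
    t = t_ms % (FIRST_PERIOD_MS + SECOND_PERIOD_MS)
    if t < FIRST_PERIOD_MS:
        return P_IR_LOW                  # constant low level while 3D frames are captured
    # second time period: toggle between the two levels, high only while no
    # visible pulse is on, so the infrared frames are not contaminated
    return P_IR_LOW if visible_pulse_on else P_IR_HIGH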
  • One or more of the plurality of single-color channels may be configured to transmit infrared light and block visible light.
  • the power of the projector unit which is controlled by the one or more processors may be a supply power to the projector unit.
  • the emitted visible light may include wavelengths between 350 nm and 750 nm.
  • the emitted infrared light may include wavelengths between 800 nm and 1200 nm.
  • the image sensor unit may include multiple cameras, such as high-speed cameras.
  • the multiple cameras may be arranged around the projector unit or next to the projector unit.
  • An inner region of the dental object may be determined by the one or more processors based on the infrared signal.
  • the inner region may include information about dental features that are arranged within the dental object.
  • the dental feature may be one or more of an anatomy feature, a disease feature and a mechanical feature.
  • the anatomy feature may be an enamel, a dentine, or a pulp.
  • the disease feature may be plaque, a crack or caries.
  • the mechanical feature may be a filling and/or a composite restoration.
  • the power difference between the first power level and the second power level may be determined by a power ratio.
  • the power ratio may be between 1/4 and 1/2, 1/8 and 1/2, 1/10 and 1/2, 1/20 and 1/2, or 1/10 and 1/4.
  • the one or more processors are configured to ramp the power level of the infrared light up and down along a power slope between the first power level and the second power level.
  • the power slope is determined such that no transients appear on the emitted infrared light.
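  • A minimal sketch of such a slope, assuming a simple linear ramp and an illustrative ramp duration (neither value is specified in the disclosure):

import numpy as np

def ir_ramp(p_from, p_to, ramp_ms, dt_ms=0.01):
    """Linear ramp of the IR drive level; ramp_ms is chosen long enough to avoid
    the step discontinuity that would otherwise cause switching transients."""
    steps = max(int(ramp_ms / dt_ms), 2)
    return np.linspace(p_from, p_to, steps)

up = ir_ramp(0.1, 1.0, ramp_ms=0.2)     # ramp up to the second power level
down = ir_ramp(1.0, 0.1, ramp_ms=0.2)   # ramp back down to the first power level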
  • the projector unit may include multiple light sources that are configured to emit one or more color lights and the infrared light.
  • the multiple light sources may be arranged within a single module that includes multiple Light Emitting Diodes (LED) that are configured to emit different wavelengths within the visible and non-visible wavelength ranges.
  • LED Light Emitting Diodes
  • the light source, i.e. one or more LEDs, that is configured to emit infrared light may be arranged separately from the light source that is configured to emit the visible light.
  • the one or more processors may be configured to determine a sub-inner region information based on the infrared signals and to enhance 3D data by the combination of the visible light signals and the sub-inner region information.
  • the sub-inner region information is subtracted from the visible light signals, and the result is enhanced 3D data in which noise created in the 3D data by the infrared light emitted during the first time period is removed.
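  • A hedged sketch of that subtraction (the array names and dtypes are assumptions; the frames are assumed to be co-registered 2D intensity images):

import numpy as np

def enhance_3d_frame(visible_frame, sub_inner_frame):
    """Remove the infrared contribution that leaks into a visible-light frame."""
    cleaned = visible_frame.astype(np.float32) - sub_inner_frame.astype(np.float32)
    return np.clip(cleaned, 0, None)    # enhanced 3D input with IR-induced noise removed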
  • the one or more processors may be configured to determine inner region information based on the infrared signals and to determine a composed image based on the infrared signals and the visible light signals, and wherein the composed image includes enhanced inner region information.
  • the composed image may include a subtraction of the visible light signals from the infrared signals.
  • the first power level of the emitted infrared light during the first time period may be below a noise floor level of the image sensor unit.
  • the one or more processors may be configured to control the projector unit during the scan sequence, and during the first time period the emitted visible light includes a first visible light that is turned on at a constant power level while the infrared light is turned on constant at the first power level.
  • the emitted visible light includes a second visible light that is turned on and off with a second pulse repetition rate, and the first visible light is turned on and off asynchronously to the on/off switching of the second visible light and with a first pulse repetition rate, and wherein the power level of the infrared light is turned up when the first and the second visible light are turned off, and the infrared light is turned down when the first or the second visible light is turned on, and wherein the power level of the infrared light is turned up and down between the first power level and the second power level.
  • the first time period and the second time period are repeated throughout the scan sequence, and wherein the second time period may be repeated within 100 ms, within 200 ms or within 500 ms.
  • the second time period may be between 30 ms and 50 ms, between 30 ms and 40 ms, about 30 ms, about 40 ms or about 50 ms.
  • the first visible light may include wavelengths that correspond to white light, and wherein the second visible light includes wavelengths that correspond to blue light.
  • the emitted blue light excites fluorescence information from the dental object, and the fluorescence information may include green fluorescence information and red fluorescence information.
  • the projector unit may be configured to emit during the scan sequence the visible light that includes white light and blue light, wherein a first pulse repetition rate of the white light is different from a second pulse repetition rate of the blue light, and wherein the infrared light is constantly on at one power level during the first time period and switches between two power levels during the second time period.
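  • An illustrative timing sketch of such a second time period (the slot duration, rates and duty pattern below are assumptions chosen to match the pulse numbers discussed later in this description, not values taken from the disclosure):

import numpy as np

SLOT_MS = 0.4                       # assumed pulse duration per slot
slots = np.arange(100)              # 100 slots of 0.4 ms = 40 ms second time period
white_on = (slots % 2 == 0)         # white in every other slot  -> 1.25 kHz
blue_on = (slots % 4 == 1)          # blue only every fourth slot -> 625 Hz, asynchronous to white
ir_high = ~(white_on | blue_on)     # IR raised to the second power level only when both are off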
  • the one or more processors may be configured to adjust the first and the second pulse repetition rate based on a scan mode of the handheld intraoral scanner. For example, in a scan mode where the projector unit emits both the visible light and the infrared light, the pulse repetition rate of the visible light is increased in relation to a scan sequence where the infrared light is turned on constantly during the scan sequence.
  • the first pulse repetition rate is faster than the second pulse repetition rate, and in this example, the higher pulse repetition rate is necessary to provide 3D data with the needed quality to determine a 3D model.
  • the one or more processors may be configured to change the first and/or the second pulse repetition rate during the second time period based on a scan mode. For example, a low second pulse repetition rate may result in the infrared signals being acquired every time the second visible light is turned off during the second time period.
  • the intraoral scanning system includes a filter unit that is configured to output filtered visible light signals and combined filtered light signals, wherein the combined filtered light signals include infrared signals and a first color, and wherein the filtered visible light signals include at least the first color.
  • the one or more processors may then be configured to determine the infrared signal by subtracting the first color of the filtered visible light signals from the combined filtered light signals. When the second pulse repetition rate is doubled in relation to the low second pulse repetition rate, the one or more processors may thus be configured to acquire the second visible light signals and the infrared signals at the same time due to the filter unit.
  • the one or more processors are configured to acquire the second visible light signals and the infrared signals at different time periods in the scan sequence. When the second pulse repetition rate is doubled in relation to the low second pulse repetition rate, it would not be possible to acquire the second visible light signals and the infrared signals at different time periods because of the width/size of the second time period.
  • the filter unit thus allows the one or more processors to acquire the second visible light signals and the infrared signals at the same time.
  • the intraoral scanner may comprise the filter unit that is configured to receive the visible light signals and the infrared signals from at least the dental object, and wherein the filter unit may be configured to transmit filtered visible light signals and combined filtered light signals, wherein the combined filtered light signals include a combination of the first color light of the visible light signals and infrared light of the infrared signals.
  • the filter unit may comprise a plurality of single-color channels configured to output the filtered visible light signals, and a plurality of combined-color channels configured to output the combined filtered light signals.
  • the image sensor unit may be configured to acquire the filtered visible light signals and the combined filtered light signals.
  • the one or more processors may be configured to determine an infrared signal by subtracting one or more color lights of the filtered visible light signals from the combined filtered light signals, and to determine the 3D data of the dental object based on at least the filtered visible light signals.
  • the one or more processors may be configured to determine fluorescence information of the dental object from the visible light signals.
  • the fluorescence information is excited by the emitted visible light that includes blue wavelengths between 350 nm and 500 nm.
  • the inner region may be determined by composed scan information that includes a difference between the infrared signals and the visible light signals.
  • the fluorescence information may include green fluorescence information and/or red fluorescence information, wherein the one or more processors may be configured to determine a first difference between the infrared signal and the green fluorescence information and a second difference between the infrared signal and the red fluorescence information, and wherein the composed scan information includes a summation of the first difference and the second difference.
  • the contrast between the dental features within the inner region of the dental object is improved significantly.
  • the composed scan information may include a summation of the infrared signal, the green fluorescence information and the red fluorescence information.
  • the composed scan information may include a subtraction of the visible light signal, including white wavelengths or green wavelengths, from the infrared signal.
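  • As a hedged numerical sketch of the composed scan information described in the preceding items (per-pixel arithmetic on co-registered images; the function and array names are assumptions):

import numpy as np

def composed_scan(ir, fluor_green, fluor_red):
    """Sum of the two differences: (IR - green fluorescence) + (IR - red fluorescence)."""
    first_diff = ir - fluor_green
    second_diff = ir - fluor_red
    return first_diff + second_diff     # enhanced contrast between inner dental features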
  • the one or more processors may be configured to determine the 3D data based on one or more colors of the visible light signals, such as white, red and/or green wavelengths.
  • the scan sequence may be a full scan of at least an upper jaw and/or a lower jaw of the oral cavity.
  • the emitted visible light may include visible light pulses, and the emitted infrared light may not be a pulse.
  • the filter unit may include a pixel-pattern filter that includes the plurality of single-color channels, and an infrared filter that is configured to block or partly block the infrared light signals for a first group of the plurality of single-color channels.
  • the infrared filter may be further configured to transmit the infrared light signals for a second group of the plurality of single-color channels, and wherein the second group of the plurality of single-color channels corresponds to the plurality of combined filter channels.
  • the filter unit may be constructed in a way that allows it to be aligned and arranged in front of the image sensor unit, which results in a compact filter unit.
  • the pixel-pattern filter may include a first pixel-pattern surface and a second pixel-pattern surface that is opposite to the first pixel-pattern surface.
  • the infrared filter may be arranged on or in vicinity to the first pixel-pattern surface and the image sensor may be arranged on or in vicinity to the second pixel-pattern surface.
  • the infrared filter may include a first infrared surface and a second infrared surface that is opposite to the first infrared surface, and wherein the pixel-pattern filter is arranged on or in vicinity to the first infrared surface, and the image sensor unit may be arranged on or in vicinity to the second infrared surface.
  • the pixel-pattern filter may be a Bayer filter.
  • the 3D data includes 3D geometry data of the dental object and/or color data of the dental object.
  • the 3D data may include a plurality of sub-scans that have been acquired by the handheld intraoral scanner, and the plurality of sub-scans have been stitched together to form the 3D data.
  • the 3D geometry data may include the 3D shape of the teeth and gingiva, and the color data includes the color of the teeth and the gingiva. Furthermore, the 3D data may include shade values of the teeth.
  • the one or more processors may be configured to determine in parallel the 3D data and the inner region of the dental object based on the infrared signal, and/or a combination of the infrared signal and the visible light signals, and/or a combination of the infrared signal, the white visible light signal, the green fluorescence and the red fluorescence.
  • the one or more processors may be configured to receive the infrared signal and the filtered visible light signals sequentially from the image sensor unit and to process the received signals in parallel, but with a slight delay between the initiation of the processing of the two signals caused by the sequential reception of the infrared signals and the filtered visible light signals.
  • the image sensor unit is configured to output the filtered visible light signals and the combined filtered light signals in parallel such that no delay will appear between the processing of the two signals.
  • the filtered visible light signals for determining the 3D data may include a summation of red, green and blue wavelengths, and the first color light which is combined with the reflected infrared light in the combined filtered light signals may be green.
  • the one or more processors is configured to determine a virtual visible light signal from each of the combined filtered light signals, wherein the virtual visible light signal includes green wavelengths.
  • the virtual visible light signal is determined by interpolating between the filtered visible light signals of neighbouring single-color channels in relation to each of the combined-color channels, and wherein the filtered visible light signals of the neighbouring single-color channels include green wavelengths.
  • the virtual visible light signal corresponds to the first color light.
  • the first color light of the combined filtered light signals may be determined by interpolating between the first group of the plurality of single-color channels.
  • the one or more processors may then be configured to combine the virtual visible light signal of each of the combined filtered light signals with the filtered visible light signals to determine the 3D data.
  • all pixels that are aligned with the plurality of single-color channels and the plurality of combined-color channels are used for determining the 3D data. Thereby, no reduction in the resolution of the 3D data occurs in comparison to the example where the one or more processors disregard the plurality of combined filtered light signals for determining the 3D data.
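  • A hedged sketch of such an interpolation (the mosaic layout, the NaN marking of combined channels and the neighbour choice are assumptions; edges wrap for brevity):

import numpy as np

def virtual_green(green_plane, combined_mask):
    """Estimate a virtual green value at each combined (green + IR) channel position
    by averaging the green readings of neighbouring single-color channels, so that
    every pixel can contribute to the 3D data."""
    g = green_plane.copy()              # green values, NaN where the channel is combined
    neighbours = np.stack([np.roll(g, 1, axis=0), np.roll(g, -1, axis=0),
                           np.roll(g, 1, axis=1), np.roll(g, -1, axis=1)])
    est = np.nanmean(neighbours, axis=0)
    g[combined_mask] = est[combined_mask]
    return g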
  • the one or more processors may be configured to change a pulse repetition rate of the emitted visible light from the projector unit based on the wavelength of the emitted visible light.
  • the one or more processors may be configured to improve the signal-to-noise ratio of the filtered visible light signal by increasing a pulse repetition rate of the emitted visible light from the projector unit. For example, in a situation where the emitted light of the projector unit includes wavelengths between 350 nm and 500 nm, the one or more processors are configured to increase the pulse repetition rate of the emitted visible light. Since the infrared light is constantly on throughout a scanning sequence of a patient, no pulse repetition rate is defined for the emitted infrared light.
  • the pulse duration of emitted visible light pulses may be around 0.4 ms.
  • the switching period between the two colors may be around 0.8 ms, corresponding to a pulse repetition rate (PRR) of 1.25 kHz. This will be the PRR of the white visible light pulses.
  • PRR pulse repetition rate
  • the blue visible light pulses occur only every fourth pulse, and the PRR is then reduced to 625 Hz for the blue pulses.
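  • The numbers work out as follows if one assumes pulse slots spaced by the 0.4 ms pulse duration (an assumption used only for this illustration):

slot_rate = 1 / 0.4e-3      # 2500 pulse slots per second
prr_white = 1 / 0.8e-3      # white in every other slot -> 1250 Hz = 1.25 kHz
prr_blue = slot_rate / 4    # blue only every fourth slot -> 625 Hz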
  • the one or more processors may be configured to change a gain level of the image sensor unit based on a wavelength and/or a power level of the emitted light from the projector unit. To further improve the signal-to-noise ratio of the filtered visible light signal, the one or more processors are configured to increase the gain level of the image sensor unit.
  • the projector unit may be configured to constantly emit the infrared light while emitting the visible light.
  • the on-off switching of the emitted infrared light is avoided, and thereby, unwanted transients on the emitted infrared light are avoided.
  • the determination of the infrared signal is performed by the one or more processors.
  • the one or more processors may be configured to generate a color light magnitude signal from the filtered visible light signals, and wherein the color light magnitude signal may be one or more of red light, green light and blue light. If red, green and blue are combined, the color light magnitude signal corresponds to white light. The one or more processors may then be configured to generate a combined light magnitude signal from a combined filtered signal of the combined filtered light signals, wherein the combined light magnitude signal may be a sum of infrared light and the first color that includes one or more colors of the color light magnitude signal. The one or more processors may then be configured to determine the infrared signal by subtracting the color light magnitude signal from the combined light magnitude signal.
  • the color light magnitude signal may be a sum of red light, green light and blue light, and wherein the combined light magnitude signal is a sum of infrared light and red light, green light and blue light.
  • the color light magnitude signal includes green light, and wherein the combined light magnitude signal is a sum of infrared light and green light.
  • the color light magnitude may be generated based on filtered visible light signals from a first group of the plurality of single-color channels, and the combined light magnitude signal may be generated based on a single combined-color channel of the plurality of combined-color channels, and wherein the first group of the plurality of single-color channels is arranged in vicinity to the single combined-color channel.
  • the first group may be arranged as neighbour channels to the single combined-color channel.
  • the first group of single-color channels constitutes a color filter-channel neighbourhood.
  • the color filter-channel neighbourhood is one of the following: a 4-neighbourhood color filter channel arrangement, a diagonal neighbourhood color filter channel arrangement, or an 8-neighbourhood color filter channel arrangement.
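  • A hedged sketch of recovering the infrared magnitude at a single combined-color pixel using a 4-neighbourhood of single-color channels (it is assumed here, for illustration only, that the four neighbours carry the first color, e.g. green):

import numpy as np

def ir_at_combined_pixel(raw, r, c):
    """Estimate the first color from the 4-neighbourhood and subtract it from the
    combined (color + IR) reading to obtain the infrared signal at (r, c)."""
    neighbours = [raw[r - 1, c], raw[r + 1, c], raw[r, c - 1], raw[r, c + 1]]
    color_estimate = np.mean(neighbours)
    return raw[r, c] - color_estimate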
  • a 2D infrared image or a composed scan image may be determined based on the infrared signal, and to obtain an optimal resolution quality of the 2D infrared image the ratio between the plurality of combined-color channels and the plurality of single-color channels is determined to be between 1/16 and 1/4.
  • the handheld intraoral scanner may be configured to scan a patient during a scan sequence, and during the scan sequence, the projector unit may be configured to emit visible light and infrared light such that a 3D model is determined and displayed in real time based on the 3D data. Furthermore, the infrared signal may also be determined in real time.
  • the one or more processors may be configured to determine and display a 2D or a 3D infrared image based on the infrared signal, or, to determine and display a composed scan image based on the infrared signal.
  • the one or more processors is configured to determine a 3D model in real-time that includes one or more of following information of a dental object or a dentition: 3D geometry, color information, shade information and inner region information determined from at least the infrared signal.
  • the one or more processors may be configured to control the projector unit during the scan sequence, and wherein the scan sequence includes at least a first time period and a second time period, wherein during the first time period the emitted visible light includes a first visible light that is turned on at a constant power level while the infrared light is turned on at a constant power level, and during the second time period the emitted visible light includes a second visible light that is turned on and off with a second pulse repetition rate and the first visible light is turned on and off asynchronously to the on/off switching of the second visible light and with a first pulse repetition rate, and wherein the power level of the infrared light is turned up when the first and the second visible light are turned off and turned down when the first or the second visible light is turned on, and wherein the power level of the infrared light is turned up and down between the first power level and the second power level.
  • the first visible light may include white, green or red wavelengths.
  • the second visible light may include blue wavelengths for the purpose of exciting susceptible molecules on the dental object to excite red and green fluorescence signals.
  • the one or more processors may be configured to determine a composed scan image based on the infrared light, the first visible light and/or the second visible light.
  • the one or more processors may be configured to change the signal-to-noise ratio of the filtered visible light signal by changing the pulse repetition rate during a time period of the emitted visible light.
  • the one or more processors may be configured to change the signal-to-noise ratio of the filtered visible light signal by changing the first pulse repetition rate during the second time period based on a scan mode. For example, in a scan mode where the projector unit emits both the visible light and the infrared light, the pulse repetition rate of the visible light is increased in relation to a scan sequence where the infrared light is turned on constantly during the scan sequence.
  • the scan sequence may include a repetition of the first time period and the second time period, and wherein the first time period and the second time period are repeated throughout the scan sequence, and wherein the second time period is repeated within 100 ms, 200 ms or 500 ms.
  • the first time period may be longer than the second time period.
  • the first time period may be between 100 ms and 500 ms, or between 100 ms and 250 ms.
  • the second time period may be between 30 ms and 50 ms, between 30 ms and 40 ms, about 30 ms, about 40 ms or about 50 ms.
  • the scan sequence would provide a sufficient amount of visible light signals and infrared signals for the one or more processors to determine the 3D data and the infrared signals, which would result in a 3D model with the necessary quality for the user to investigate the dentition for plaque, caries, cracks and other dental features within and on a dental object of the dentition.
  • the scan sequence may be a full scan of at least an upper jaw and/or a lower jaw of the oral cavity of the patient.
  • the projector unit may be configured to emit during the scan sequence a first visible light that includes white light and a second visible light that includes blue light, and wherein a first pulse repetition rate of the first visible light is different from a second pulse repetition rate of the second visible light, and wherein the infrared light is emitted constantly during the scan sequence.
  • the one or more processors may be configured to adjust the first and the second repetition rate based on a scan mode of the handheld intraoral scanner. The first pulse repetition rate may be faster than the second pulse repetition rate.
  • the one or more processors may be configured to determine fluorescence information of the dental object from the filtered visible light signals.
  • the filtered visible light signals may include excited fluorescence red light and fluorescence green light that have been excited by the emitted blue light.
  • the one or more processors may be configured to determine the 3D data based on two or more different colors of the filtered visible light signals.
  • the 3D data may be determined based on white light, which may include a summation of red light, blue light and green light, or the 3D data may be determined based on blue light, green light or red light, or a combination of blue light, green light and red light.
  • a power level of the infrared light may be constant throughout a scan sequence of at least an upper jaw and/or a lower jaw of the oral cavity.
  • the power level of the infrared light may be set to a first power level during a primary time period and to a second power level during a secondary time period, wherein the first power level is lower than the second power level.
  • the primary time period may correspond to the first time period of the scan sequence, which mainly implies the capturing of visible light signals for determining the 3D data.
  • the secondary time period corresponds to the second time period of the scan sequence which implies mainly the capturing of scattered visible light for determining fluorescence information and scattered infrared light.
  • the first power level may be between 50% and 90% of the second power level, between 10% and 50%, or between 5% and 40%.
  • the first power level may provide a signal above a noise floor level of the image sensor unit.
  • the emitted visible light may include visible light pulses, and wherein the emitted infrared light is not pulsed, as it is constantly on throughout the complete scan sequence of a jaw of the patient.
  • the one or more processors may be configured to determine an infrared signal by subtracting the one or more color lights of the filtered visible light signals from the combined filtered light signals, and the inner region is determined by composed scan information that includes a difference between the infrared signal and the one or more color lights of the filtered visible light signals.
  • the composed scan information includes enhanced contrast between the dental features, wherein the enhanced dental features may be depicted on or in a 3D model determined by the 3D data and the composed scan information.
  • the one or more processors may be configured to determine a first difference between the infrared signal and the green fluorescence information and a second difference between the infrared signal and the red fluorescence information, and wherein the composed scan information includes a summation of the first difference and/or the second difference.
  • the contrast between the dental features would be improved even more.
  • the composed scan information may include a summation of the infrared signal, the green fluorescence information and the red fluorescence information, and in this example, the dentin-enamel junction is more enhanced, and thereby, easier to distinguish from other dental features of the dental object.
  • the composed scan information provides an enhanced visualization of caries of teeth in such a manner that it becomes easier for the dentist to identify and treat caries.
  • the intraoral scanning system may be configured to enhance the visualization of restorations of teeth in such a manner that it becomes easier for the dentist to identify and treat second order caries, and, to identify the type of restoration, such as type of fillings, inlays, onlays, crowns and/or sealants.
  • the intraoral scanning system may be configured to enhance the visualization of the "dentin-enamel junction" (DEJ) of teeth in such a manner that it becomes easier for the dentist to identify the dentin-enamel junction for clinical assessment.
  • the intraoral scanning system may be configured to generate or update a three-dimensional (3D) model while determining composed scan information with the use of different wavelength modalities.
  • yet a further aspect of the present disclosure is to detect incipient caries at stages where preventive measures are likely to effect remineralization, repairing damage done by the caries infection at a stage well before more complex restorative measures are necessary.
  • the disclosure can be accurate at an earlier stage of caries infection than has been exhibited using existing fluorescence or near-infrared approaches.
  • the intraoral scanning system may be configured to provide a composed scan data based on captured visible light and the determined infrared signal.
  • the infrared signal may include mainly reflection from inside a tooth, i.e. internal region of a dental object, and significantly less of surface reflection of a tooth.
  • the visible light signals may include mainly surface reflection of a tooth and significantly less reflection from inside a tooth, i.e. internal region of a dental object.
  • the infrared signal may include information about dental condition from within a tooth and not just superficial.
  • composed scan information does not include an overlay of two 2D images where each of the two 2D images relates to different emitted wavelengths from the projector unit.
  • the composed scan information may be a combination of intensity levels of each pixel of the 2D images that relate to different wavelengths. For example, at a first time period, the image sensor unit captures light information that relates to visible wavelengths, and at a second time period, the image sensor unit captures light information that relates to infrared wavelengths, and the intensity levels of the two time periods are recorded and combined for enhancing a dental condition, i.e. a dental feature such as dentine, enamel, dentine-enamel junction, plaque, caries, cracks etc.
  • a dental condition i.e. a dental feature such as dentine, enamel, dentine-enamel junction, plaque, caries, cracks etc.
  • the combination of the intensity levels may be done digitally by subtraction and/or addition of the intensity levels.
  • the intensity levels may be captured and recorded during at least three time periods for at least three different wavelengths, such as white-coloured wavelength, blue-coloured wavelength and infrared wavelength.
  • the infrared wavelength may be between 750 nm and 1500 nm.
  • the one or more processors may be configured to display the composed scan information and the 3D model on a displaying unit of the system.
  • the one or more processors may be configured to utilize the filtered visible light signals for both generating or updating the 3D model and for determining the composed scan information that includes enhanced internal structures in the form of enhanced contrast between the dental features, such as a caries lesion, and tooth structures.
  • the surface of a dental object is mainly provided by the visible light information, and the inner region of the dental object is mainly provided by the composed scan information or the infrared signal determined by the one or more processors.
  • if a handheld intraoral scanning device first performed a scan for generating or updating a 3D model and only afterwards performed another scan for determining composed scan information, it would result in a more complicated way of aligning the position of the composed scan information on the 3D model in comparison to the present disclosure.
  • the intraoral scanning system may comprise an intraoral scanner that includes the projector unit and the image sensor unit.
  • the system may comprise one or more processors that are configured to control a power level of the emitted visible light to a first power level during a first time period of the scan sequence, and during a second time period, the power level of the emitted visible light switches between the first power level and a second power level, and wherein the first power level is lower than the second power level, and wherein the emitted infrared light is modulated at a given frequency.
  • the modulation of the emitted infrared light may be based on on-off switching.
  • the advantage of having a constant on white light is that the temperature of the white light source is more constant and closer to an idle temperature of the white light source.
  • an image that is generated based on the acquired infrared signals would have more sub-surface scattering from the dental object. That will result in an infrared image or a composed scan image where the tooth is more easily seen. That has notably turned out to be preferred by the dentists.
  • the addition of the white light to the infrared image or the composed scan image may be done digitally; however, the above solution is done in the analog (optical) domain, which means less computational usage of the one or more processors.
  • the first power level may be between 0.5% and 10%, between 1% and 5%, or preferably around 1%.
  • the advantage of having emitted visible light, e.g. white light, at the first power level while emitting UV light or infrared light is that the composed scan information will include additional sub-surface reflections caused by the emitted visible light. The additional sub-surface reflections will further enhance the contrast of a proximal lesion or a dentin-enamel junction in a dental object.
  • FIGS. 1A, 1B, 1C, and 1D illustrate different examples of the intraoral scanning system.
  • FIGS. 2A, 2B and 2C illustrate different examples of a filter unit.
  • FIGS. 3A, 3B, 3C and 3D illustrate different examples of composed scan information.
  • FIG. 4 illustrates another example of the intraoral scanning system.
  • the electronic hardware may include microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), gated logic, discrete hardware circuits, and other suitable hardware configured to perform the various functionality described throughout this disclosure.
  • Computer program shall be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, etc., whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.
  • a scanning for providing intra-oral scan data may be performed by a dental scanning system that may include an intraoral scanning device such as the TRIOS series scanners from 3Shape A/S.
  • the dental scanning system may include a wireless capability as provided by a wireless network unit.
  • the scanning device may employ a scanning principle such as triangulation-based scanning, confocal scanning, focus scanning, ultrasound scanning, x-ray scanning, stereo vision, structure from motion, optical coherence tomography (OCT), or any other scanning principle.
  • the scanning device is capable of obtaining surface information by projecting a pattern, translating a focus plane along an optical axis of the scanning device, and capturing a plurality of 2D images at different focus plane positions, such that each series of captured 2D images corresponding to each focus plane forms a stack of 2D images.
  • the acquired 2D images are also referred to herein as raw 2D images, wherein raw in this context means that the images have not been subject to image processing.
  • the focus plane position is preferably shifted along the optical axis of the scanning system, such that 2D images captured at a number of focus plane positions along the optical axis form said stack of 2D images (also referred to herein as a sub-scan) for a given view of the object, i.e.
  • the scanning device is generally moved and angled relative to the dentition during a scanning session, such that at least some sets of sub-scans overlap at least partially, in order to enable reconstruction of the digital dental 3D model by stitching overlapping sub-scans together in real time and displaying the progress of the virtual 3D model on a display as feedback to the user.
  • the result of stitching is the digital 3D representation of a surface larger than that which can be captured by a single sub-scan, i.e. which is larger than the field of view of the 3D scanning device.
  • Stitching, also known as registration and fusion, works by identifying overlapping regions of the 3D surface in various sub-scans and transforming the sub-scans to a common coordinate system such that the overlapping regions match, finally yielding the digital 3D model.
  • An Iterative Closest Point (ICP) algorithm may be used for this purpose.
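  • A minimal point-to-point ICP iteration is sketched below for illustration (a generic textbook formulation, not the product implementation; 'src' and 'dst' are assumed to be Nx3 and Mx3 arrays of points from two overlapping sub-scans):

import numpy as np
from scipy.spatial import cKDTree

def icp_step(src, dst):
    """One ICP iteration: match each src point to its nearest dst point, then
    compute and apply the best-fit rigid transform (Kabsch/SVD)."""
    _, idx = cKDTree(dst).query(src)          # nearest-neighbour correspondences
    matched = dst[idx]
    mu_s, mu_d = src.mean(axis=0), matched.mean(axis=0)
    H = (src - mu_s).T @ (matched - mu_d)     # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return src @ R.T + t                      # transformed sub-scan; iterate until converged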
  • Another example of a scanning device is a triangulation scanner, where a time varying pattern is projected onto the dental arch and a sequence of images of the different pattern configurations are acquired by one or more cameras located at an angle relative to the projector unit.
  • Color texture of the dental arch may be acquired by illuminating the object using different monochromatic colors such as individual red, green and blue colors, or by illuminating the object using multichromatic light such as white light.
  • a 2D image may be acquired during a flash of white light.
  • the process of obtaining surface information in real time of a dental arch to be scanned requires the scanning device to illuminate the surface and acquire a high number of 2D images.
  • a high-speed camera is used with a frame rate of 300-2000 2D frames per second, depending on the technology and 2D image resolution.
  • the high amount of image data needs to be handled by the scanning device, either by directly forwarding the raw image data stream to an external processing device or by performing some image processing before transmitting the data to an external device or display. This process requires that multiple electronic components inside the scanner operate with a high workload, thus requiring a high current.
  • the scanning device comprises one or more light projectors configured to generate an illumination pattern to be projected on a three-dimensional dental arch during a scanning session.
  • the light projector(s) preferably comprises a light source, a mask having a spatial pattern, and one or more lenses such as collimation lenses or projection lenses.
  • the light source may be configured to generate light of a single wavelength or a combination of wavelengths (mono- or polychromatic). The combination of wavelengths may be produced by using a light source configured to produce light (such as white light) comprising different wavelengths.
  • the light projector(s) may comprise multiple light sources such as LEDs individually producing light of different wavelengths (such as red, green, and blue) that may be combined to form light comprising the different wavelengths.
  • the light produced by the light source may be defined by a wavelength defining a specific color, or a range of different wavelengths defining a combination of colors such as white light.
  • the scanning device comprises a light source configured for exciting fluorescent material of the teeth to obtain fluorescence data from the dental arch.
  • a light source may be configured to produce a narrow range of wavelengths.
  • the light from the light source is infrared (IR) light, which is capable of penetrating dental tissue.
  • the light projector(s) may be DLP projectors using a micro mirror array for generating a time varying pattern, or a diffractive optical element (DOE), or back-lit mask projectors, wherein the light source is placed behind a mask having a spatial pattern, whereby the light projected on the surface of the dental arch is patterned.
  • the back-lit mask projector may comprise a collimation lens for collimating the light from the light source, said collimation lens being placed between the light source and the mask.
  • the mask may have a checkerboard pattern, such that the generated illumination pattern is a checkerboard pattern. Alternatively, the mask may feature other patterns such as lines or dots, etc.
  • the scanning device preferably further comprises optical components for directing the light from the light source to the surface of the dental arch.
  • the specific arrangement of the optical components depends on whether the scanning device is a focus scanning apparatus, a scanning device using triangulation, or any other type of scanning device.
  • a focus scanning apparatus is further described in EP 2 442 720 B1 by the same applicant, which is incorporated herein in its entirety.
  • the light reflected from the dental arch in response to the illumination of the dental arch is directed, using optical components of the scanning device, towards the image sensor(s).
  • the image sensor(s) are configured to generate a plurality of images based on the incoming light received from the illuminated dental arch.
  • the image sensor unit may be a high-speed image sensor such as an image sensor configured for acquiring images with exposures of less than 1/1000 second or frame rates in excess of 250 frames per second (fps).
  • the image sensor may be a rolling shutter sensor or a global shutter sensor (e.g. CMOS or CCD).
  • the image sensor(s) may be a monochrome sensor including a color filter array such as a Bayer filter and/or additional filters that may be configured to substantially remove one or more color components from the reflected light and retain only the other non-removed components prior to conversion of the reflected light into an electrical signal.
  • additional filters may be used to remove a certain part of a white light spectrum, such as a blue component, and retain only red and green components from a signal generated in response to exciting fluorescent material of the teeth.
  • the network unit may be configured to connect the dental scanning system to a network comprising a plurality of network elements including at least one network element configured to receive the processed data.
  • the network unit may include a wireless network unit or a wired network unit.
  • the wireless network unit is configured to wirelessly connect the dental scanning system to the network comprising the plurality of network elements including the at least one network element configured to receive the processed data.
  • the wired network unit is configured to establish a wired connection between the dental scanning system and the network comprising the plurality of network elements including the at least one network element configured to receive the processed data.
  • the dental scanning system preferably further comprises a processor configured to generate scan data (such as extra-oral scan data and/or intra-oral scan data) by processing the two-dimensional (2D) images acquired by the scanning device.
  • the processor may be part of the scanning device.
  • the processor may comprise a field-programmable gate array (FPGA) and/or an Advanced RISC Machines (ARM) processor located on the scanning device.
  • the scan data comprises information relating to the three-dimensional dental arch.
  • the scan data may comprise any of: 2D images, 3D point clouds, depth data, texture data, intensity data, color data, and/or combinations thereof.
  • the scan data may comprise one or more point clouds, wherein each point cloud comprises a set of 3D points describing the three-dimensional dental arch.
  • the scan data may comprise images, each image comprising image data e.g.
  • the image sensor(s) of the scanning device may acquire a plurality of raw 2D images of the dental arch in response to illuminating said object using the one or more light projectors.
  • the plurality of raw 2D images may also be referred to herein as a stack of 2D images.
  • the 2D images may subsequently be provided as input to the processor, which processes the 2D images to generate scan data.
  • the processing of the 2D images may comprise the step of determining which part of each of the 2D images is in focus in order to deduce/generate depth information from the images.
  • the internal depth information may be used to generate 3D point clouds comprising a set of 3D points in space, e.g., described by cartesian coordinates (x, y, z).
  • the 3D point clouds may be generated by the processor or by another processing unit.
  • Each 2D/3D point may furthermore comprise a timestamp that indicates when the 2D/3D point was recorded, i.e., from which image in the stack of 2D images the point originates.
  • the timestamp is correlated with the z-coordinate of the 3D points, i.e., the z-coordinate may be inferred from the timestamp.
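  • A hedged sketch of this focus-stack processing (a generic depth-from-focus formulation with an assumed gradient-energy focus measure, not the scanner's actual algorithm):

import numpy as np

def depth_from_focus_stack(stack):
    """stack: (n_planes, H, W) raw 2D images captured along the optical axis.
    For each pixel, pick the focus-plane index where a local sharpness measure
    peaks; that index (or its timestamp) maps to the z-coordinate."""
    gy, gx = np.gradient(stack.astype(np.float32), axis=(1, 2))
    sharpness = gy ** 2 + gx ** 2               # simple gradient-energy focus measure
    best_plane = np.argmax(sharpness, axis=0)   # per-pixel in-focus plane index
    return best_plane                           # convert to z via the focus-plane positions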
  • the output of the processor is the scan data, and the scan data may comprise image data and/or depth data, e.g.
  • the scanning device may be configured to transmit other types of data in addition to the scan data.
  • Examples of data include 3D information, texture information such as infra-red (IR) images, fluorescence images, reflectance color images, x-ray images, and/or combinations thereof.
  • IR infra-red
  • the examples illustrated in the below figures may be performed by one or more processors.
  • FIGS. 1A, 1B, 1C, and 1D illustrate examples of the intraoral scanning system 1.
  • the system 1 includes a handheld intraoral scanner 10 that can be held in the hand and used for scanning a patient’s mouth.
  • the handheld intraoral scanner may include a projector unit 3 that is configured to emit 8 visible light (51A, 51B) and infrared light 52 during a scan sequence 50.
  • the handheld intraoral scanner 10 may include an image sensor unit 4 configured to acquire the visible light signals 9 and infrared signals 9 from at least the dental object caused by the emitted 8 visible light (51A, 51B) and the emitted 8 infrared light 52, respectively.
  • the system 1 includes one or more processors 2 configured to control a power of the projector unit 3 such that a first power level 52A of the emitted 8 infrared light 52 during a first time period 56 of the scan sequence 50 is lower than a second power level 52B of the emitted 8 infrared light 52 during a second time period 55 of the scan sequence 50. Furthermore, the emitted 8 visible light 51A during the first time period 56 is constantly on at a constant power level, and during the first time period 56, the emitted visible light 51A includes white wavelengths. During the second time period 55, the emitted visible light includes two modalities (51A, 51B): white wavelengths 51A and blue wavelengths 51B.
  • the one or more processors 2 are configured to turn on and off the two modalities (51A, 51B) asynchronously.
  • the power of the infrared light 52 is turned up to the second power level 52B when the first and the second visible light are turned off, and down to the first power level 52A when the first or the second visible light is turned on.
  • the first time period 56 and the second time period 55 are repeated multiple times during the scan sequence 50.
  • the first power level 52A of the emitted 8 infrared light 52 is below a noise floor 102 of the image sensor unit 4, and during the second time period 55, the one or more processors 2 is configured to turn up and down the power level of the emitted infrared light 52 between the first power level 52A and the second power level 52B.
  • the one or more processors 2 is configured to switch between the first power level 52A and the second power level 52B during the second time period 55.
  • the second time period may be repeated within 100 ms, within 200 ms, or within 500 ms, and the second time period may be between 30 ms and 50 ms, between 30 ms and 40 ms, about 30 ms, about 40 ms or about 50 ms.
  • the one or more processors 2 is further configured to determine 3D data based on the acquired 9 visible light signals during the first time period 56 and to determine an inner region of the dental object based on the acquired infrared signals during the second time period 55.
  • FIG. 1B illustrates a similar example to FIG. 1A; however, in this example, the one or more processors 2 are configured to change the first and/or the second pulse repetition rate between the different second time periods (55A, 55B).
  • the projector unit 5 emits a single modality of visible light 51 at a first pulse repetition rate along with the infrared light 52.
  • the projector unit emits two modalities of visible light (51A, 51B) along with the infrared light 52.
  • the one or more processors has changed the first pulse repetition rate of the modality of visible light 51 A that is turned on during both of the second time periods (55A,55B).
  • the one or more processors 2 is configured to emit only white wavelengths 51A along with the infrared light 52 during the first 55 A of the two second timer periods (55A,55B), and then in the another second time period 55B the one or more processors 2 is configured to emit both white and blue wavelengths (51 A, 5 IB) asynchronously.
  • the first pulse repetition rate has increased in the another second time period 55B as the width of the second time period may not change in comparison to the width of the first second time period 55A when adding more visible light modalities to the scan sequence 50.
  • FIG. 1C illustrates a similar example as FIG. 1A; however, in this example the first power level 52A is above the noise floor 102 of the image sensor unit 4.
  • a power ratio between the first power level and the second power level may be between 1/4 and 1/2, 1/8 and 1/2, 1/10 and 1/2, 1/20 and 1/2, or 1/10 and 1/4.
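For the ratio ranges mentioned above, a trivial calculation shows where the first power level would lie for a given second power level; the numbers below are arbitrary examples, not values taken from the disclosure.

```python
# Illustrative arithmetic only: with a chosen ratio band of 1/10 to 1/2, the first
# (low) infrared power level 52A lies in this range relative to the second level 52B.
second_power = 1.0                           # arbitrary units
ratio_lo, ratio_hi = 1 / 10, 1 / 2
first_power_range = (second_power * ratio_lo, second_power * ratio_hi)
print(first_power_range)                     # (0.1, 0.5)
```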
  • FIG. 1D illustrates an example where the pulse repetition rate of the visible light 51 in the second time period 55 has increased to an extent that, in order not to increase the width of the second time period 55, the projector unit 5 has to emit both the visible light 51 and the infrared light 52 at the same time, so that the one or more processors 2 can acquire both the visible light signals and the infrared light signals at the same time during the second time period 55.
  • the handheld intraoral scanner 10 includes a filter unit 3 arranged in front of the image sensor unit 4.
  • the filter unit 3 is configured to transmit filtered visible light signals and combined filtered light signals, wherein the combined filtered light signals include a combination of a first color light of the visible light signals and infrared light of the infrared light signals.
  • the filter unit may comprise a plurality of single-color channels 11 configured to output the filtered visible light signals, and a plurality of combined-color channels (12A,12B) configured to output the combined filtered light signals.
  • the image sensor unit 4 is configured to acquire the filtered visible light signals by pixels 14 that are aligned 16 to one or more of the single-color channels 11 of the plurality of single-color channels, and to acquire the combined filtered light signals by pixels 15 that are aligned 16 to one or more of the combined-color channels (12A,12B) of the plurality of combined-color channels.
  • each of the plurality of single-color channels 11 and each of the plurality of combined-color channels (12A, 12B) are aligned 16 to each of the pixels (14, 15) of the image sensor unit 4.
  • the one or more processors 2 receives the filtered visible light signals and the combined filtered light signals, wherein the one or more processors 2 is configured to determine the infrared signal based on a subtraction of the combined filtered light signals with one or more color lights of the acquired filtered visible light signals.
  • the one or more processors 2 is further configured to determine an inner region of the dental object based on the infrared signal.
  • the one or more processors is configured to determine the 3D data of the dental object based on at least the filtered visible light signals.
  • the one or more processors 2 is then configured to determine a 3D model that includes one or more of the following information about a dental object or a dentition: 3D geometry, color information, shade information, and inner region information determined based on at least the infrared signal.
  • the one or more processors 2 is configured to determine in parallel the 3D data and an inner region of the dental object based on the infrared signal.
  • the 3D model and the inner region may be determined in parallel.
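The subtraction used to recover the infrared signal from a combined colour-plus-infrared channel can be sketched in a few lines. This is a minimal editorial illustration assuming the combined channel passes white light plus infrared; the array names, sizes and values are placeholders, not the claimed implementation.

```python
import numpy as np

# Minimal sketch (assumption): the combined channel is taken to pass white light plus
# infrared, so the colour contribution is approximated by the sum of the red, green and
# blue readings from nearby single-color channels, and the infrared signal is obtained
# by subtraction.
rng = np.random.default_rng(0)
red, green, blue = rng.random((3, 4, 4))      # single-color channel readings
ir_true = 0.3 * rng.random((4, 4))            # unknown infrared contribution
combined = red + green + blue + ir_true       # combined-channel reading (white + IR)

color_magnitude = red + green + blue          # colour light magnitude signal
ir_estimate = combined - color_magnitude      # infrared signal by subtraction
assert np.allclose(ir_estimate, ir_true)
```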
  • FIGS. 2A, 2B, and 2C illustrate different examples of the filter unit 5.
  • the filter unit 5 includes a plurality of combined-color channels (12A, 12B), which in this specific example includes two combined-color channels (12A, 12B).
  • the filter unit 5 includes a plurality of single-color channels (11, 11A, 11B).
  • the combined-color channels (12A, 12B) are configured to output combined filtered light signals that include a combination of green light and infrared light, where wavelengths within the blue and red spectrum are blocked from being output through the combined-color channels (12A, 12B).
  • the first color light of the combined-color channels (12A,12B) includes green wavelengths.
  • the plurality of single-color channels 11 are configured to output filtered visible light signals that include blue, green and red light.
  • the one or more processors 2 is configured to determine the infrared signal by subtracting the green light outputted by the neighbouring single-color channels (11A, 11B) in relation to the respective combined-color channels (12A, 12B) from the combined filtered light signals.
  • the combined-color channels (12A, 12B) are configured to output combined filtered light signals that include a combination of white light and infrared light.
  • the one or more processors 2 is then configured to determine the infrared signal by subtracting the green light, red light and blue light outputted by the neighbouring single-color channels (11A, 11B) in relation to the respective combined-color channels (12A, 12B) from the combined filtered light signals.
  • the two combined-color channels (12A, 12B) share two single-color channels (11A, 11B).
  • the one or more processors 2 is configured to generate a color light magnitude signal (11A, 11B) from the filtered visible light signals, and wherein the color light magnitude signal (11A, 11B) is one or more of red light, green light and blue light.
  • the color light magnitude signal (11A, 11B) corresponds to green light.
  • the color light magnitude signal (11A, 11B) corresponds to a summation of red, green and blue light.
  • the one or more processors is configured to generate a combined light magnitude signal from a combined filtered light signal of the combined filtered light signals, and wherein the combined light magnitude signal is a sum of infrared light and the first color light that includes one or more colors of the color light magnitude signal.
  • the first color light is green.
  • the first color is white.
  • the one or more processors is further configured to determine the infrared signal by subtracting the color light magnitude signal from the combined light magnitude signal.
  • the color light magnitude signal is generated based on filtered visible light signals from a first group (11A, 11B) of the plurality of single-color channels 11, and the combined light magnitude signal is generated based on a single combined-color channel (12A, 12B) of the plurality of combined-color channels, and wherein the first group (11A, 11B) of the plurality of single-color channels 11 is arranged in the vicinity of the single combined-color channel (12A, 12B).
  • the one or more processors 2 is configured to determine the color light magnitude signal by averaging the filtered visible light signals from the first group (11A, 11B) or by interpolation between the filtered visible light signals from the first group (11A, 11B).
  • the determined color light magnitude signal is an approximation of the first color that is combined with the infrared light in the plurality of combined-color channels.
  • FIG. 2C illustrates different examples of how the first group (11A, 11B) of the plurality of single-color channels 11 is arranged relative to each of the combined-color channels (12A, 12B).
  • the first group (11A, 11B) of single-color channels 11 constitutes a color filter-channel neighbourhood, and the color filter-channel neighbourhood is as follows:
  • the color filter-channel neighbourhood 11A is a four-neighbourhood color filter channel arrangement, wherein each of the single filter channels 11A of the four-neighbourhood outputs green light, which also corresponds to the first color of the combined filtered light signal.
  • the color filter-channel neighbourhood 11A is a diagonal-neighbourhood color filter channel arrangement.
  • the color filter-channel neighbourhood 11A is an eight-neighbourhood color filter channel arrangement.
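As an illustration of the four-neighbourhood arrangement, the green contribution at a combined green-plus-infrared channel can be approximated by averaging the four orthogonal green neighbours and then subtracted. The sketch below is an editorial assumption; the toy readout values are arbitrary.

```python
import numpy as np

# Sketch (assumption): estimate the first colour (green) at a combined green+IR channel
# from its four orthogonal green neighbours, then subtract it to approximate the
# infrared signal at that position.
def ir_at(raw: np.ndarray, r: int, c: int) -> float:
    """raw: sensor readout; (r, c): position of a combined green+IR channel."""
    neighbours = [raw[r - 1, c], raw[r + 1, c], raw[r, c - 1], raw[r, c + 1]]
    green_estimate = float(np.mean(neighbours))   # colour light magnitude from the first group
    return float(raw[r, c]) - green_estimate      # combined reading minus estimated green

raw = np.ones((5, 5))        # toy readout: surrounding green channels all read 1.0
raw[2, 2] = 1.0 + 0.4        # combined channel reads green (~1.0) plus infrared (0.4)
print(ir_at(raw, 2, 2))      # ~0.4
```

A diagonal or eight-neighbourhood arrangement would simply change which positions are averaged.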
  • FIGS. 3A, 3B, 3C, and 3D illustrate examples of a composed scan image 20 determined based on the infrared signal 24 and the visible light signals (51A, 51B).
  • the visible light signals 22 (e.g. the filtered visible light signals) include surface reflection, i.e. surface information, provided by white coloured light 51A emitted by the handheld intraoral scanning device 10, and the infrared signal 24 is provided by the emitted infrared light 52.
  • the composed scan information 20 includes a subtraction of the infrared signal 24 with visible light signals 22, and wherein the visible light signals 22 are used for determining a 3D surface model 29 of the dentition 80.
  • the composed scan information 20 includes enhanced internal structure that is represented by a restoration 26A that is not seen in the infrared signal 24 but can easily be identified in the composed scan information 20.
  • the composed scan information 20 may be mapped onto the 3D surface model 29 by the one or more processors 2, such that the composed scan information provides three-dimensional information regarding the inner region of the dentition 80.
  • the visible light signals 22 include excited fluorescence information that is provided by blue coloured light 51B emitted by the handheld intraoral scanning device 10, and in this example, the composed scan information 20 includes a subtraction of the infrared signal 24 with the visible light signals 22, and wherein the visible light signals 22 are used for applying fluorescence information onto the 3D surface model 29 of the dentition 80.
  • in the infrared image 24 the caries lesion 27 can barely be seen; however, in the composed image 20 the visibility of the caries lesion has improved.
  • FIG. 3C illustrates a similar example as FIG. 3B, however, the composed scan information 20 in FIG. 3C is composed differently.
  • the composed scan information 20 includes a summation of the green coloured fluorescence information 22 and the infrared signals 24.
  • the composed scan information 20 includes enhanced textural information about the enamel 28A and dentin 28B, and thereby the Dentin-Enamel Junction (DEJ) has become easier to see due to an improved contrast in the composed scan information image 20. The enamel 28A and the dentin 28B are more clearly seen in the composed scan information image 20 than in the infrared signal 24.
  • DEJ: Dentin-Enamel Junction
  • the composed scan information includes a summation of the infrared signals 24, the green fluorescence information 22 and the red fluorescence information 22, and this composition has also been shown to improve the visibility of the DEJ in relation to regular or enhanced fluorescence information, or in relation to the infrared signal 24.
  • FIG. 3B illustrates an example where the projector unit 5 emits visible light that includes blue wavelengths 51B and white wavelengths 51A, and non-visible light that includes infrared wavelengths.
  • the emitted blue wavelengths 51B excite green-coloured fluorescence information 22B and red-coloured fluorescence information 22C.
  • the visible light signals (22A,22B, 22C) include surface information provided by the emitted white wavelengths 22A and green 22B and red 22C fluorescence provided by the emitted blue wavelengths.
  • the surface information is used for generating or updating the 3D model 29, and the fluorescence information (21A, 21C) is used for determining a composed scan information 20.
  • the one or more processors 2 is configured to determine a first difference between the infrared signals 24 and the green fluorescence information 22B and a second difference between the infrared signals and the red fluorescence information 22C, and the composed scan information 20 includes a summation of the first difference and the second difference.
  • the enhanced internal structure relates to a caries lesion that has become more visible in comparison to the infrared signal 24 and to the example illustrated in FIG. 3B.
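The compositions described for FIGS. 3B-3D amount to simple per-pixel sums and differences, which the following sketch illustrates; the arrays are random stand-ins for registered scan data, and the whole snippet is an editorial assumption rather than the disclosed processing pipeline.

```python
import numpy as np

# Editorial sketch (assumption) of the compositions described above. The arrays stand in
# for the registered infrared signal 24 and the fluorescence information 22B/22C.
rng = np.random.default_rng(1)
ir = rng.random((8, 8))                # infrared signal 24
green_fluo = rng.random((8, 8))        # green fluorescence information 22B
red_fluo = rng.random((8, 8))          # red fluorescence information 22C

# FIG. 3B-style composition: summation of two differences against the fluorescence channels.
composed_diff = (ir - green_fluo) + (ir - red_fluo)

# FIG. 3C/3D-style composition: summation of the fluorescence information and the
# infrared signal, reported to improve visibility of the dentin-enamel junction.
composed_sum = ir + green_fluo + red_fluo
print(composed_diff.shape, composed_sum.shape)
```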
  • FIG. 4 illustrates an example of a scan sequence 50.
  • the emitted visible light 51A is, e.g., white light.
  • the emitted visible light includes two modalities (51A, 51B), e.g. white light 51A and ultra-violet light 51B.
  • the one or more processors 2 is configured to turn the ultra-violet light 51B and the infrared light 52 on and off asynchronously while the white light 51A is constantly on at two different power levels, i.e. a first power level 51A’ and a second power level 51A”.
  • the white light 51A, i.e. the first visible light 51A, is emitted at the second power level 51A” during the first time period 56.
  • the white light, i.e. the first visible light 51A, is emitted at a first power level 51A’ that is well below the second power level 51A”.
  • a third power level of the white light 51A may be introduced.
  • the white light 51A is emitted at the third power level while the infrared light 52 is emitted by the projector unit, and the third power level is larger than the first power level 51A’ but lower than the second power level 51A”.
  • the varying low power levels, i.e. the first and third power levels, provide the benefit of being able to specifically adjust the level of white light for generating fluorescence information and/or infrared signals with a level of sub-surface scattering that ruins neither the fluorescence information nor the infrared information, but instead provides better visibility of the actual dental object, e.g. a tooth, in combination with the fluorescence information and/or the infrared information.
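A small configuration sketch can make the relationship between the three white-light power levels concrete; the numeric values and the pairing of each level with an illumination purpose are illustrative assumptions only.

```python
# Sketch (assumption) of the three white-light power levels of the first visible light 51A:
# a low first level 51A' used together with blue/UV excitation, an intermediate third level
# used while the infrared light 52 is emitted, and a high second level 51A'' used for
# surface imaging during the first time period 56. Values are illustrative only.
WHITE_POWER = {
    "first (51A')": 0.05,
    "third": 0.15,
    "second (51A'')": 1.00,
}
assert WHITE_POWER["first (51A')"] < WHITE_POWER["third"] < WHITE_POWER["second (51A'')"]
```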
  • “connected” or “coupled” as used herein may include wirelessly connected or coupled.
  • the term “and/or” includes any and all combinations of one or more of the associated listed items. The steps of any disclosed method are not limited to the exact order stated herein, unless expressly stated otherwise.
  • An intraoral scanning system configured to determine 3D data of a dental object in an oral cavity, the intraoral scanning system comprising:
  • an intraoral scanner including: o a projector unit configured to emit visible light and infrared light during a scan sequence, o an image sensor unit configured to acquire the visible light signals and infrared signals from at least the dental object caused by the emitted visible light and the emitted infrared light, respectively, and
  • processors configured to: o control a power of the projector unit such that a first power level of the emitted infrared light during a first time period of the scan sequence is lower than a second power level of the emitted infrared light during a second time period of the scan sequence, and o determine 3D data based on the acquired visible light signals during the first time period and to determine an inner region of the dental object based on the acquired infrared signals during the second time period.
  • a power ratio between the first power level and the second power level is between 1/4 to 1/2, 1/8 to 1/2, 1/10 to 1/2, 1/20 to 1/2 or 1/10 to 1/4.
  • the one or more processors is configured to determine a sub-inner region information based on the infrared signals and to enhance 3D data by the combination of the visible light signals and the sub-inner region information.
  • the one or more processors is configured to determine inner region information based on the infrared signals and to determine a composed image based on the infrared signals and the visible light signals, and wherein the composed image includes enhanced inner region information.
  • the first power level of the emitted infrared light during the first time period is below a noise floor level of the image sensor unit.
  • the one or more processors is configured to control the projector unit during the scan sequence, such that during:
  • the first time period, the emitted visible light includes a first visible light that is turned on at a constant power level while the infrared light is constantly on at the first power level, and
  • the second time period, the emitted visible light includes a second visible light that is turned on and off with a first pulse repetition rate while the infrared light is constantly on at the second power level, and the first visible light is turned on and off asynchronously to the on/off switching of the second visible light and with a second pulse repetition rate.
  • the intraoral scanning system according to any of items 7 to 10, wherein the first visible light includes wavelengths that correspond to white light, and wherein the second visible light includes wavelengths that correspond to blue light.
  • the projector unit is configured to emit during the scan sequence the visible light that includes white light and blue light, wherein a first pulse repetition rate of the white light is different from a second pulse repetition rate of the blue light, and wherein the infrared light is constant on during the first time period and the second time period.
  • an intraoral scanner comprising:
  • a filter unit configured to receive the visible light signals and the infrared signals from at least the dental object, and wherein the filter unit is configured to transmit filtered visible light signals and combined filtered light signals, wherein the combined filtered light signals include a combination of a first color light of the visible light signals and infrared light of the infrared signals, and wherein the filter unit comprises: o a plurality of single-color channels configured to output the filtered visible light signals, and o a plurality of combined-color channels configured to output the combined filtered light signals,
  • an image sensor unit configured to acquire the filtered visible light signals and the combined filtered light signals, and
  • processors configured to: o determine an infrared signal based on a subtraction of the combined filtered light signals with one or more color lights of the filtered visible light signals, and o determine the 3D data of the dental object based on at least the filtered visible light signals.
  • the one or more processors is configured to determine fluorescence information of the dental object from the visible light signals.
  • the inner region is determined by composed scan information that includes a difference between the infrared signals and the visible light signals.
  • fluorescence information includes green fluorescence information and/or red fluorescence information.
  • the one or more processors is configured to determine a first difference between the infrared signal and the green fluorescence information and a second difference between the infrared signal and the red fluorescence information, and wherein the composed scan information includes a summation of the first difference and the second difference.
  • the intraoral scanning system according to any of the previous items, wherein the one or more processors is configured to determine the 3D data based on one or more colors of the visible light signals.
  • the scan sequence is a full scan of at least an upper jaw and/or a lower jaw of the oral cavity.
  • a power level of the infrared light is set to a first power level during a primary time period and set to a second power level during a secondary time period, wherein the first power level is lower than the second power level.
  • the emitted visible light includes visible light pulses, and wherein the emitted infrared light is not a pulse.
  • An intraoral scanning system configured to determine 3D data of a dental object in an oral cavity, the intraoral scanning system comprising:
  • an intraoral scanner including: o a projector unit configured to emit visible light and infrared light during a scan sequence, o an image sensor unit configured to acquire the visible light signals and infrared signals from at least the dental object caused by the emitted visible light and the emitted infrared light, respectively, and
  • processors configured to: o control a power level of the emitted visible light to a first power level during a first time period of the scan sequence, and during a second time period, the power level of the emitted visible light switches between the first power level and a second power level, and wherein the first power level is lower than the second power level, and wherein the emitted infrared light is modulated at a given frequency, o determine 3D data based on the acquired visible light signals during the first time period and to determine an inner region of the dental object based on the acquired infrared signals during the second time period.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Epidemiology (AREA)
  • Optics & Photonics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
  • Endoscopes (AREA)

Abstract

The present disclosure relates to an intraoral scanning system configured to determine 3D data of a dental object in an oral cavity. The intraoral scanning system may comprise an intraoral scanner comprising: a projector unit configured to emit visible light and infrared light during a scan sequence, an image sensor unit configured to acquire the visible light signals and infrared signals from at least the dental object caused by the emitted visible light and the emitted infrared light, respectively, and one or more processors configured to: control a power of the projector unit such that a first power level of the emitted infrared light during a first time period of the scan sequence is lower than a second power level of the emitted infrared light during a second time period of the scan sequence, and determine 3D data based on the visible light signals acquired during the first time period and determine an inner region of the dental object based on the infrared signals acquired during the second time period.
PCT/EP2024/065626 2023-06-19 2024-06-06 Système de balayage intra-buccal pour déterminer un signal infrarouge Pending WO2024260743A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DKPA202370305 2023-06-19
DKPA202370305 2023-06-19

Publications (1)

Publication Number Publication Date
WO2024260743A1 true WO2024260743A1 (fr) 2024-12-26

Family

ID=91465153

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/065626 Pending WO2024260743A1 (fr) 2023-06-19 2024-06-06 Système de balayage intra-buccal pour déterminer un signal infrarouge

Country Status (1)

Country Link
WO (1) WO2024260743A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9103666B2 (en) * 2007-05-17 2015-08-11 Technion Research And Development Foundation, Ltd. Compact 3D scanner with fixed pattern projector and dual band image sensor
EP2442720B1 (fr) 2009-06-17 2016-08-24 3Shape A/S Appareil d'exploration à focalisation
EP3598061A1 (fr) * 2017-03-13 2020-01-22 J. Morita MFG. Corp. Scanner tridimensionnel et sonde
US10585958B2 (en) * 2016-07-27 2020-03-10 Align Technology, Inc. Intraoral scanner with dental diagnostics capabilities
US20230181020A1 (en) * 2018-04-25 2023-06-15 Dentlytec G.P.L. Ltd. Properties measurement device

Similar Documents

Publication Publication Date Title
US11612326B2 (en) Estimating a surface texture of a tooth
JP6487580B2 (ja) テクスチャ的特徴を用いる物体の3dモデル化方法
JP5276006B2 (ja) 口腔内測定装置及び口腔内測定システム
JP6430934B2 (ja) 蛍光を測定する口腔内3dスキャナ
JP5305929B2 (ja) 診療審美兼用口中撮像装置
JP6045676B2 (ja) 口腔内撮像装置
JP2013034569A (ja) 口腔内検査装置、口腔内検査装置の作動方法
JP2019523064A5 (fr)
CN113499160A (zh) 具有牙科诊断能力的口内扫描仪
WO2020185806A1 (fr) Dispositif de balayage intraoral avec tomographie par cohérence optique intégrée (oct)
EP4272630A1 (fr) Système et procédé pour fournir une rétroaction dynamique pendant le balayage d'un objet dentaire
WO2024260743A1 (fr) Système de balayage intra-buccal pour déterminer un signal infrarouge
WO2024260907A1 (fr) Système de balayage intrabuccal pour déterminer un signal de couleur visible et un signal infrarouge
WO2025202067A1 (fr) Système de balayage intrabuccal à programmes de séquence de balayage améliorés
EP4646137A1 (fr) Système de balayage intrabuccal pour déterminer des informations de balayage composites
WO2025125551A1 (fr) Système de balayage intrabuccal à images focalisées alignées sur un modèle de surface 3d
WO2025202066A1 (fr) Système de balayage intrabuccal permettant de fournir un signal de rétroaction qui comporte un niveau de qualité d'un modèle tridimensionnel
CN114786561A (zh) 用于确定健康状况的扫描系统
WO2024260906A1 (fr) Mesures volumétriques d'une région interne d'un objet dentaire
KR20250164786A (ko) 3d 모델 상에 2d 이미지를 중첩시키기 위한 구강내 스캐너 시스템 및 방법
CN121039699A (zh) 用于在三维模型上叠加二维图像的口内扫描仪系统和方法
CN119816241A (zh) 具有led的卫生实现方式的口内扫描装置
WO2025202064A1 (fr) Système de balayage intrabuccal doté d'un boîtier de pointe configuré pour transmettre une lumière infrarouge
CN121038736A (zh) 口腔外扫描仪系统
FI20235483A1 (en) Device and method for imaging teeth

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24731948

Country of ref document: EP

Kind code of ref document: A1