WO2024260906A1 - Volumetric measurements of an internal region of a dental object - Google Patents
- Publication number
- WO2024260906A1 (PCT/EP2024/066771)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dental
- dental feature
- feature boundary
- boundary
- processors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0033—Features or image-related aspects of imaging apparatus, e.g. for MRI, optical tomography or impedance tomography apparatus; Arrangements of imaging apparatus in a room
- A61B5/0035—Features or image-related aspects of imaging apparatus, e.g. for MRI, optical tomography or impedance tomography apparatus; Arrangements of imaging apparatus in a room adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0088—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/00—Impression cups, i.e. impression trays; Impression methods
- A61C9/004—Means or methods for taking digitized impressions
- A61C9/0046—Data acquisition means or methods
- A61C9/0053—Optical means or methods, e.g. scanning the teeth by a laser or light beam
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/50—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
Definitions
- the disclosure relates to an intraoral scanning system. More specifically, the disclosure relates to a system that is configured to determine an internal measurement of an inner region of a dental object, and wherein the system includes a handheld intraoral scanner.
- ionizing radiation, e.g., X-rays
- X-ray bitewing radiographs are often used to provide non-quantitative images of the teeth's internal structures.
- such images are typically limited in their ability to show early tooth mineralization changes (e.g., initial caries), resulting in underestimation of the demineralization depth; they are unable to assess the presence or absence of micro-cavitation; and they frequently show overlap of the approximal tooth surfaces, which requires repeated radiograph acquisition and may thus involve a lengthy and expensive procedure.
- NIR (near-infrared)
- an intraoral scanning system may be configured to determine an internal measurement of an inner region or volumetric segment of a dental object.
- the system may comprise a handheld intraoral scanner that may be configured to provide non-visible 2D images, visible 2D images, and 3D sub-scans of a dental object.
- the non-visible 2D images and the visible sub-scans may be provided when one or more light sources of the handheld intraoral scanner emit light that scatters off the dental object, and the scattered light is acquired by one or more image sensors of the handheld intraoral scanner.
- the visible sub-scans may correspond to emitted light that includes wavelengths between 350 nm and 750 nm, and the non-visible 2D images may correspond to emitted light that includes wavelengths between 800 nm and 1100 nm.
- the system may include one or more processors configured to determine point clouds of the visible sub-scans, determine a three-dimensional (3D) finite element mesh, and align the point clouds to the 3D finite element mesh.
- Each of the point clouds may include a set of points with three-dimensional coordinates that corresponds to a part of the dental object. The point clouds may then be aligned to the 3D finite element mesh for the purpose of determining a volumetric representation of the dental object.
- the point clouds may represent a 3D model of the dental object in the 3D finite element mesh, wherein the point clouds include a set of points with three-dimensional cartesian coordinates.
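The alignment step described above can be sketched as a nearest-vertex assignment: each point of a point cloud is associated with the closest vertex of the 3D finite element mesh. This is a minimal illustration under that simplifying assumption, not the patented registration procedure; the function names and sample coordinates are hypothetical.

```python
import math

def nearest_vertex(point, vertices):
    """Index of the finite-element-mesh vertex closest to a 3D point."""
    return min(range(len(vertices)),
               key=lambda i: math.dist(point, vertices[i]))

def align_points_to_mesh(points, vertices):
    """Assign each point-cloud point to its nearest mesh vertex."""
    return [nearest_vertex(p, vertices) for p in points]

mesh_vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
cloud = [(0.1, 0.1, 0.0), (0.9, 0.1, 0.0)]
print(align_points_to_mesh(cloud, mesh_vertices))  # → [0, 1]
```

A production system would use a spatial index (e.g. a k-d tree) rather than this linear scan.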
- the 3D finite element mesh includes a plurality of finite elements, and wherein each of the plurality of finite elements includes multiple vertices.
- Each of the plurality of finite elements may be one or more of the following element types: tetrahedra, tetrahedra combined with octahedra, parallelepipeds (skewed voxels), or voxels/cubes.
- the evaluation by the one or more processors may include determining one or more optical coefficients for each of the multiple vertices, wherein the one or more optical coefficients correspond to a dental feature represented by the aligned point clouds.
- the one or more optical coefficients may include one or more of the following optical indices: a refraction index, a light absorption index, and a light scattering index, wherein the one or more optical coefficients correspond to either the non-visible 2D images or the visible sub-scans.
- the one or more optical coefficients can be used to extract a dental feature or volumetric segment represented by the non-visible 2D images (i.e. the non-visible sub-scans).
- the dental feature or volumetric segment may be an anatomical feature (an enamel, a dentine, or a pulp), a disease feature (a crack or a caries lesion), or a mechanical feature (a filling and/or a composite restoration).
- the one or more processors may be configured to determine a first dental feature boundary of a volumetric segment in form of a surface represented by the 3D finite element mesh.
- the 3D finite element mesh includes a plurality of finite elements, where each of the plurality of finite elements represents a volumetric segment.
- the first dental feature boundary (or the dental feature boundaries) corresponding to volumetric segments are determined in the following way.
- the non-visible 2D image is used to estimate optical coefficients of the vertices of the 3D finite element mesh. These coefficients are fed into a marching finite element processor that refines or subdivides the 3D finite element mesh in such a way that each of the plurality of finite elements belongs to a single volumetric segment.
- the marching finite element processor generates 3D surface meshes that mark the first dental feature boundary (or dental feature boundaries) between separate volumetric segments of the 3D finite element mesh.
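The core step of a marching-cubes/marching-tetrahedra style boundary extraction, as described above, is to find on each mesh edge the point where a vertex optical coefficient crosses the threshold separating two volumetric segments. The sketch below is a hedged simplification (a single scalar coefficient, a single threshold, and hypothetical names and values), not the patent's implementation:

```python
def boundary_crossings(edges, coeff, threshold):
    """For each mesh edge (i, j), emit an interpolated crossing point
    where the vertex optical coefficient passes the segment threshold."""
    crossings = []
    for i, j in edges:
        a, b = coeff[i], coeff[j]
        if (a < threshold) != (b < threshold):   # edge straddles the boundary
            t = (threshold - a) / (b - a)        # linear interpolation parameter
            crossings.append((i, j, t))
    return crossings

coeff = {0: 0.25, 1: 0.75, 2: 0.5}               # per-vertex scattering index
edges = [(0, 1), (1, 2), (0, 2)]
print(boundary_crossings(edges, coeff, 0.5))     # → [(0, 1, 0.5), (0, 2, 1.0)]
```

The crossing points of all straddling edges are then stitched into the 3D surface meshes that mark the feature boundaries.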
- the first dental feature boundary of a first dental feature is determined based on the one or more optical coefficients, and wherein the first dental feature boundary corresponds to the non-visible 2D images, and wherein the one or more processors may be configured to determine an internal measurement of the first dental feature boundary.
- the first dental feature boundary may be represented within the 3D finite element mesh or on a volumetric model of the dental object that is determined based on the 3D finite element mesh.
- the first dental feature boundary corresponds to a circumference of a dental feature that is represented within the 3D finite element mesh or the volumetric model of the dental object.
- the dental feature boundary, i.e. the volumetric segments of the dental feature boundary, determines an outer contour or shape of a dental feature within the dental object, and the outer contour or shape allows the one or more processors to determine the size of the dental feature, wherein the size may be a distance between two points on the outer contour or shape.
- the size may correspond to a volume of the dental feature.
- the internal measurements on a volumetric segment may be determined between a first part of the first dental feature boundary and a second part of the first dental feature boundary.
- the internal measurement may be a distance between the first part and the second part, and the distance may be a minimum or a maximum distance.
- the dental feature may be the enamel, the first part may correspond to an upper part of the enamel, and the second part may correspond to a bottom part of the enamel, wherein the upper part is opposite the bottom part.
- the minimum distance corresponds to the thinnest part of the enamel, and the maximum distance corresponds to the thickest part of the enamel.
- the dentist would thereby be able to identify teeth whose enamel has a critical thickness, i.e. where a potential caries would have a critically short travel distance from the surface to the dentine of the dental object.
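The minimum/maximum distance between two parts of a boundary, e.g. the upper and bottom parts of the enamel, can be illustrated as a brute-force pairwise distance search over boundary sample points. The coordinates below are hypothetical; a real system would work on the mesh representation rather than raw point lists:

```python
import math

def part_distances(part_a, part_b):
    """All pairwise distances between sample points of two boundary parts."""
    return [math.dist(p, q) for p in part_a for q in part_b]

upper_enamel = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.5)]    # hypothetical samples
bottom_enamel = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
d = part_distances(upper_enamel, bottom_enamel)
print(min(d), round(max(d), 3))  # → 2.0 2.693 (thinnest and thickest spans)
```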
- the one or more processors may be configured to fit a shape object to a volumetric segment represented by the first dental feature boundary and determine multiple internal measurements based on the fitted shape object.
- the one or more processors may be configured to determine a minimum and/or a maximum internal measurement of the multiple internal measurements.
- the minimum internal measurement may correspond to the minimum distance determined between the two parts, i.e. the first part and the second part, of the dental feature boundary.
- the maximum internal measurement may correspond to the maximum distance determined between the two parts, i.e. the first part and the second part, of the dental feature boundary.
- the internal measurement may be a distance along a normal vector of the first part of the first dental feature boundary.
- the normal vector may extend between the first part and the second part of the first dental feature boundary.
- the dental feature may be a caries lesion
- the internal measurement may be a distance measure that indicates the size of the caries lesion along a longitudinal axis that extends between the occlusal and the furcation of the dental object.
- the distance measure may indicate the size of the caries lesion towards the enamel or the pulp chamber of the dental object.
- the one or more processors may be configured to perform a fitting of a shape object to a geometry of the first dental feature boundary, and wherein the internal measurement is determined based on the fitted shape object.
- the geometry may be an outer contour or shape of the dental feature boundary.
- the shape object may be an ellipse, a sphere, an ellipsoid, or a bounding box.
- the fitting may be based on a principal component analysis.
- the fitting to the geometry of the dental feature boundary, such as the first dental feature boundary, would result in an implicit function that corresponds to the geometry of the dental feature boundary.
- the implicit function would improve the resolution of the geometry of the dental feature boundary.
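The PCA-based fitting mentioned above can be sketched by taking the eigenvectors of the covariance matrix of the boundary points as the axes of a fitted ellipsoid, with semi-axis lengths scaling with the square roots of the eigenvalues. This is a generic PCA sketch with hypothetical sample points, not the patent's specific fitting procedure:

```python
import numpy as np

def fit_ellipsoid_pca(points):
    """PCA fit: ellipsoid axes are the eigenvectors of the covariance of
    the boundary points; semi-axis lengths scale with sqrt(eigenvalues)."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    cov = np.cov(pts - center, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    return center, np.sqrt(np.maximum(eigvals, 0.0)), eigvecs

boundary = [(1, 0, 0), (-1, 0, 0), (0, 0.5, 0),
            (0, -0.5, 0), (0, 0, 0.2), (0, 0, -0.2)]
center, semi_axes, axes = fit_ellipsoid_pca(boundary)
print(center)  # → [0. 0. 0.]
```

Internal measurements such as the longest and shortest extents of the feature then follow directly from the semi-axis lengths.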
- the one or more processors may be configured to determine multiple internal measurements of the first dental feature boundary or of a shape object fitted to the geometry of the first dental feature boundary, wherein the internal measurement includes a volumetric measurement that may be determined based on the multiple internal measurements.
- the volumetric measurement may include a volume size of the dental feature that corresponds to the first dental feature boundary.
- the dental feature may be a caries lesion and it would be of an advantage for the dentist to measure the volume of the caries lesion as that would give an improved indication of how the size of the caries lesion evolves over a period of time.
- Monitoring a distance between two parts of the caries would provide a measure of the size in a single dimension, whereas monitoring the volume of the caries would provide a measure of the size in multiple dimensions. Monitoring in multiple dimensions would result in improved monitoring of the caries, since the caries may not evolve in the single dimension being monitored but instead in another dimension.
- the volume of the caries would change no matter in which of the dimensions the caries evolves.
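A volumetric measurement over a finite element mesh reduces to summing element volumes; for tetrahedral elements each volume is one sixth of a scalar triple product. A minimal sketch (the example tetrahedron is arbitrary, not patent data):

```python
def tet_volume(a, b, c, d):
    """Signed volume of a tetrahedron: one sixth of a scalar triple product."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    w = [d[i] - a[i] for i in range(3)]
    det = (u[0] * (v[1] * w[2] - v[2] * w[1])
           - u[1] * (v[0] * w[2] - v[2] * w[0])
           + u[2] * (v[0] * w[1] - v[1] * w[0]))
    return det / 6.0

def feature_volume(tetrahedra):
    """Total feature volume as the sum over its tetrahedral elements."""
    return sum(abs(tet_volume(*t)) for t in tetrahedra)

unit_corner = ((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1))
print(feature_volume([unit_corner]))  # → 0.16666666666666666
```

Tracking this sum per scan session gives the multi-dimensional size monitoring described above.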
- the internal measurement may be a sectional curvature measure of a part of the first dental feature boundary or of a shape object fitted to the geometry of the first dental feature boundary.
- the sectional curvature measure may be based on a mean or a Gaussian curvature measure.
- the ratio of the enamel and/or the dentine that is affected may be expressed as a percentage of the enamel thickness or the dentine thickness.
- the one or more processors may be configured to determine a plurality of dental feature boundaries, and wherein the plurality of dental feature boundaries includes the first dental feature boundary and at least a second dental feature boundary, and wherein the at least second dental feature boundary corresponds to a second dental feature; and wherein the internal measurement may be determined between the first dental feature boundary and the at least second dental feature boundary.
- the first dental feature may be a disease feature, such as a crack or a caries, and the second dental feature may be an anatomy feature, such as an enamel, a dentine, or a pulp chamber.
- the first dental feature boundary may represent a disease feature, and the second dental feature boundary may represent an anatomy feature.
- the one or more processors may be configured to group a plurality of dental feature boundaries into multiple dental feature boundary groups, and wherein the plurality of dental feature boundaries includes the first dental feature boundary and at least a second dental feature boundary, and wherein each of the multiple dental feature boundary groups corresponds to different dental features.
- a scanning situation that implies scanning both a lower and an upper jaw of a dentition would result in the plurality of dental feature boundaries for both jaws.
- the internal measurement may be determined between a first part of the first dental feature boundary and a second part of the second dental feature boundary.
- the internal measurement may be a distance between the first part and the second part, and the distance may be a minimum or a maximum distance.
- the first dental feature may be a caries, and the first part may correspond to a bottom part of the enamel, and the second part may correspond to a bottom part of the caries, and wherein the bottom part of the enamel is opposite to the bottom part of the caries.
- the sign of the internal measurement, i.e. whether it is positive or negative, would determine whether the caries is extending through the enamel and into the dentine of the dental object.
- the level of the internal measurement corresponds to the depth of the caries into the enamel.
- the bottom part of the caries and the bottom part of the enamel may be determined along a longitudinal axis of the dental object that extends between occlusal and the furcation of the dental object.
- the system may be configured to provide an alert signal if the distance between the bottom parts is below a certain minimum distance threshold, wherein the alert signal indicates that the caries is close to entering the dentine area of the dental object.
- the system may be configured to monitor an area of the dental object that has a distance between the two bottom parts that is below a maximum distance threshold and above a minimum distance threshold.
- the system may be configured not to monitor the area of the dental object when the distance between the two bottom parts is above the maximum distance threshold.
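The threshold logic above amounts to a three-way classification of an area by its caries-to-dentine distance. The millimetre thresholds in this sketch are purely illustrative assumptions, not values taken from the patent:

```python
def caries_status(distance_to_dentine, min_thr=0.3, max_thr=1.5):
    """Classify an area by the distance (mm) between the bottom of the
    caries and the bottom of the enamel; thresholds are illustrative."""
    if distance_to_dentine < min_thr:
        return "alert"    # caries close to entering the dentine
    if distance_to_dentine <= max_thr:
        return "monitor"  # inside the monitoring band
    return "ignore"       # above the maximum distance threshold

print(caries_status(0.2), caries_status(1.0), caries_status(2.0))
# → alert monitor ignore
```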
- the internal measurement may be a distance between the first dental feature boundary and the at least second dental feature boundary or an overlap ratio between the first dental feature boundary and the second dental feature boundary.
- the overlap ratio may be a percentage of how much of the dentin, enamel or the pulp chamber is affected by a caries or a crack.
- the overlap ratio may be determined as follows: • determine a second distance of the second dental feature, wherein the second distance is determined between two parts of the second dental feature,
- the one or more processors may be configured to determine multiple sub-internal measurements that include a first sub-internal measurement, a second sub-internal measurement, and a third sub-internal measurement.
- the first sub-internal measurement may correspond to a distance or volumetric measurement of the first dental feature that may correspond to the first dental feature boundary.
- the second sub-internal measurement may correspond to a distance or volumetric measurement of the second dental feature that may correspond to the second dental feature boundary.
- the third sub-internal measurement may be a distance measure or a volumetric measure of an overlap between the first dental feature boundary and the at least second dental feature boundary, and the internal measurement may be a ratio between the third sub-internal measurement and the first sub-internal measurement or the second sub-internal measurement.
- the ratio indicates how much of the first dental feature or the second dental feature is overlapped by the other dental feature.
- the ratio may be expressed as a percentage; in this example, the dentist would know what percentage of, for example, the dentine or the enamel is affected by caries or cracks.
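As an illustration of such an overlap ratio, the extents of two feature boundaries can be projected onto the longitudinal axis and the 1D overlap expressed as a percentage of the first feature's extent. The extents below are hypothetical; the full method would use the volumetric sub-internal measurements described above:

```python
def interval_overlap(a, b):
    """Length of overlap between two 1D extents on the longitudinal axis."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return max(0.0, hi - lo)

caries_extent = (1.0, 3.0)    # hypothetical extent of the caries boundary (mm)
dentine_extent = (2.0, 6.0)   # hypothetical extent of the dentine boundary
overlap = interval_overlap(caries_extent, dentine_extent)
ratio = 100.0 * overlap / (caries_extent[1] - caries_extent[0])
print(ratio)  # → 50.0 (half of the caries extent overlaps the dentine)
```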
- the one or more processors may be configured to determine a progress time, i.e. the time over which the first dental feature boundary progresses into or partly overlaps the second dental feature boundary.
- the progress time may be determined based on the distance between the first dental feature boundary and the at least second dental feature boundary.
- the progress time may be determined based on the distance between the first dental feature boundary and the at least second dental feature boundary and a progress time algorithm that has been trained to output a progress time based on the distance between two dental feature boundaries.
- the training of the progress time algorithm may be based on non-visible 2D images performed on different patients, wherein the progression of dental features identified in the non-visible 2D images has been monitored over a period of time and registered within the progress time algorithm.
- the trained progress time algorithm may be configured to determine the progress time by correlating the distance to the monitored progression of a similar dental feature that corresponds to the first dental feature boundary, such as a caries.
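One simple way to realize such a trained mapping, assuming roughly linear progression, is to fit a least-squares rate to monitored distance-versus-time data and extrapolate the remaining time. The data and the linear model are illustrative assumptions, not the patent's trained algorithm:

```python
def fit_rate(history):
    """Least-squares slope of boundary distance vs time, from monitoring
    data given as (time_months, distance_mm) pairs."""
    n = len(history)
    mt = sum(t for t, _ in history) / n
    md = sum(d for _, d in history) / n
    return (sum((t - mt) * (d - md) for t, d in history)
            / sum((t - mt) ** 2 for t, _ in history))

def progress_time(distance_mm, rate):
    """Months until the first boundary reaches the second (rate < 0)."""
    return distance_mm / -rate

history = [(0, 1.0), (6, 0.7), (12, 0.4)]   # hypothetical monitoring data
rate = fit_rate(history)                    # ≈ -0.05 mm per month
print(round(progress_time(0.4, rate), 2))   # → 8.0
```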
- the plurality of dental feature boundaries may be determined by a marching finite element algorithm.
- the marching finite element algorithm may be configured to select a set of the one or more optical coefficients that are nearest a set of the multiple vertices for constructing a shape function that represents the first dental feature boundary.
- the algorithm may be configured to construct multiple shape functions for multiple dental feature boundaries, such as the first dental feature boundary and the second dental feature boundary.
- the marching finite element algorithm may be a three-dimensional finite element algorithm, or more precisely, a marching cubes or marching tetrahedra algorithm.
- the determined point clouds of the non-visible 2D images and the visible sub-scans may include a first set of point clouds that corresponds to the non-visible 2D images and a second set of point clouds that corresponds to the visible sub-scans. Both the first set of point clouds and the second set of point clouds are aligned to the 3D finite element mesh.
- the one or more processors may be configured to assign each of the point clouds to an optical coefficient of the one or more optical coefficients.
- the marching finite element algorithm may then select a set of point clouds that are nearest a set of the multiple vertices for constructing a shape function that represents the first dental feature boundary.
- the algorithm may be configured to construct multiple shape functions for multiple dental feature boundaries, such as the first dental feature boundary and the second dental feature boundary.
- the one or more optical coefficients may be determined based on a neural radiance field model, wherein the neural radiance field model is configured to determine the one or more optical coefficients based on spatial location information and viewing angle information of the handheld intraoral scanner while capturing the non-visible sub-scans and the visible sub-scans.
- Each of the multiple vertices may correspond to a pixel of an image sensor of the handheld intraoral scanner or a group of the multiple vertices may correspond to a pixel of an image sensor of the handheld intraoral scanner.
- the one or more processors may be configured to determine for each of the multiple vertices a relative position between the handheld intraoral scanner, i.e. each pixel of the image sensor, and the dental object, wherein the relative position includes spatial location information and viewing angle information for a corresponding casting object relative to each of the multiple vertices.
- the casting object may correspond to a view along a ray or a view along a volume of a pixel of the image sensor towards the dental object.
- the one or more processors may be further configured to determine the one or more optical coefficients for the multiple vertices by a neural radiance field model, wherein the neural radiance field model is configured to determine the one or more optical coefficients based on the spatial location information and the viewing angle information.
- the neural radiance field model may be configured to be trained using a plurality of non-visible 2D images, and wherein the casting object may be a ray or a cone.
- the ray corresponds to the viewing along the ray of the pixel and the cone corresponds to the viewing volume of the pixel.
- Each pixel of the image sensor is configured to receive non-visible 2D images which are then turned into synthetic pixel values for the corresponding pixels of the image sensor by the one or more processors.
- the received non-visible 2D images include internal scatterings from within an inner region of the dental object, which means that each of the synthetic pixel values includes an average of internal scatterings from different planes that are distributed along a casting object of the corresponding pixel.
- Each of the synthetic pixel values may correspond to each of the multiple vertices of a primary 3D finite element mesh, and wherein the one or more processors is configured to determine the one or more optical coefficients for each of the multiple vertices of the primary 3D finite element mesh.
- the internal scatterings of the received non-visible 2D images may be represented by one or more optical coefficients of multiple vertices of a sub 3D finite element mesh, and since the received non-visible 2D images include internal scatterings captured from different planes, the neural radiance field model is configured to determine a sub 3D finite element mesh for each of the different planes.
- the sub 3D finite element meshes are arranged along the casting object, such that a casting object starting from a vertex of the primary 3D finite element mesh may intersect the corresponding vertex of each of the sub 3D finite element meshes.
- the one or more optical coefficients of the vertex of the primary 3D finite element mesh may be an average of the one or more optical coefficients of the corresponding vertex of the sub 3D finite element meshes which the casting object intersects.
- the neural radiance field model may be trained by generating, based on the set of input parameters, an optical coefficient for each of the one or more coordinates which represents a density value for absorption of light, scattering of light, or refractive index.
- the neural radiance field model may be trained by determining the synthetic pixel value for the casting object based on the corresponding determined optical coefficient for each of the sub 3D finite element meshes that represent the internal scatterings from within the dental object. Further, the neural radiance field model may be trained by minimizing a loss function between the synthetic pixel value and a corresponding true pixel value of the plurality of pixels of the non-visible 2D images.
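The training loop described above can be sketched for a single casting ray: the synthetic pixel is the average of the per-plane optical coefficients, and gradient descent on a squared loss pulls the coefficients toward the true pixel value. This toy sketch replaces the neural radiance field network with directly optimized coefficients; the learning rate and values are arbitrary assumptions:

```python
def train_step(coeffs, true_pixel, lr=0.1):
    """One gradient step on the squared loss between the synthetic pixel
    (mean of per-plane coefficients along the casting ray) and the true
    pixel value recorded by the image sensor."""
    n = len(coeffs)
    err = sum(coeffs) / n - true_pixel
    # d(loss)/d(coeff_i) = 2 * err / n for the squared loss
    return [c - lr * 2.0 * err / n for c in coeffs], err ** 2

coeffs = [0.2, 0.4, 0.9]          # per-plane optical coefficients (arbitrary)
for _ in range(200):
    coeffs, loss = train_step(coeffs, true_pixel=0.3)
print(round(sum(coeffs) / len(coeffs), 3))  # → 0.3
```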
- the non-visible sub-scans include 2D infrared image data or 3D infrared image data.
- the one or more processors may be further configured to determine a plurality of grid points within the 3D surface model and determine the 3D inner geometry by arranging at least one of the synthetic pixel values or the one or more optical coefficients at each of the plurality of grid points.
- the one or more processors may be configured to perform a covariance analysis of the one or more optical coefficients for the multiple vertices, determine a plurality of descriptive parameters based on the covariance analysis, and wherein a number of the plurality of descriptive parameters is smaller than a number of the one or more optical coefficients, and wherein the first dental feature boundary may be determined by providing the plurality of descriptive parameters into a marching finite element algorithm of the intraoral scanning system.
- the one or more processors would thereby determine the first dental feature boundary faster than in the situation where the marching finite element algorithm receives the one or more optical coefficients for each of the multiple vertices.
- the covariance analysis may be a principal component analysis or an autoencoder deep learning algorithm, wherein the autoencoder deep learning algorithm includes a neural network.
- the neural network may be configured to receive the non-visible sub-scans, i.e. non-visible 2D images, and determine the plurality of descriptive parameters from the one or more optical coefficients of the multiple vertices.
- the one or more processors may be configured to train the neural network by receiving one or more CBCT scans of one or more dental objects, determining one or more training optical coefficients for each of the one or more CBCT scans, receiving one or more non-visible sub-scans (2D images) of the one or more dental objects from a handheld intraoral scanner, and training the neural network by mapping the one or more training optical coefficients onto a three-dimensional finite element mesh of the one or more non-visible sub-scans.
- the neural network may be trained to classify specific dental features based on the one or more training optical coefficients, and wherein the neural network may be configured to receive the one or more optical coefficients of the multiple vertices that corresponds to the non-visible 2D image and determine a plurality of descriptive parameters for a specific dental feature based on the trained optical coefficients and the received one or more optical coefficients.
- the non-visible 2D images may include a plurality of 2D infrared images where the position and orientation of each of the plurality of 2D infrared images is known.
- the plurality of 2D infrared images may be supported by colour 2D images and fluorescent 2D images.
- the system may receive a CBCT scan of dental objects, from which an accurate voxel model of the dental objects is created.
- the accurate voxel model may be segmented into layers, which are used to create a parametric voxel or a hexahedral mesh.
- Each of the plurality of 2D infrared images of the dental objects is mapped onto the accurate voxel model of the dental objects based on the position and orientation of each of the plurality of 2D infrared images.
- the accurate voxel model, including the mapped plurality of 2D infrared images, is fed into the neural network for training the neural network.
- a parametric model can be generated that describes a dental object based on 2D IR images.
- the parametric model includes the multiple vertices and the corresponding one or more optical coefficients.
- the one or more processors may be configured to determine a plurality of normal lengths from a second dental feature in a direction towards the first dental feature boundary and along a longitudinal axis of the dental object in the 3D finite element mesh and determine a maximum normal length of the plurality of normal lengths.
- the one or more processors may further be configured to determine a penetration depth of the maximum normal length that penetrates the first dental feature, and wherein the internal measurement includes the penetration depth.
- the penetration depth may be a distance measured in millimetres or micrometres.
- a penetration depth ratio may be determined by the one or more processors, and wherein the penetration depth ratio is the ratio between the penetration depth and a maximum normal length of the first dental feature boundary along the longitudinal axis.
- a sign of each of the plurality of normal lengths determines whether a normal length of the plurality of normal lengths is within the first dental feature. For example, if the sign of the normal length is negative then it indicates that the normal length is within the first dental feature, and if positive then it indicates that the normal length is outside the first dental feature.
- the one or more processors may be configured to select a group of normal lengths of the plurality of normal lengths, wherein each of the normal lengths of the group has a first sign that indicates that the group of normal lengths is within the first dental feature and determine a volume measurement based on the group of normal lengths, and wherein the internal measurement is the volume measurement.
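The signed normal-length measurements above can be sketched as follows. This is an illustrative sketch, not the disclosed implementation: the function names, the sample lengths, and the prism-style volume approximation are assumptions for this example only.

```python
# Illustrative sketch of internal measurements from signed normal
# lengths; a negative sign marks a normal length inside the first
# dental feature (convention taken from the description above).

def penetration_depth(normal_lengths):
    """Deepest penetration into the first dental feature, in the same
    unit as the inputs (e.g. millimetres)."""
    inside = [-n for n in normal_lengths if n < 0]
    return max(inside, default=0.0)

def penetration_depth_ratio(normal_lengths, max_feature_length):
    """Ratio between the penetration depth and the maximum normal
    length of the first dental feature boundary along the
    longitudinal axis."""
    return penetration_depth(normal_lengths) / max_feature_length

def volume_estimate(normal_lengths, patch_area):
    """Coarse volume from the group of inside normals: each is treated
    as a prism over a small boundary patch of known area (an assumed
    approximation, for illustration only)."""
    return sum(-n * patch_area for n in normal_lengths if n < 0)

lengths_mm = [0.4, 0.1, -0.2, -0.5, -0.3, 0.2]  # signed normal lengths
print(penetration_depth(lengths_mm))            # 0.5
```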
- the one or more processors may be configured to determine a notification signal when the internal measurement is above a measurement threshold, and wherein the notification signal is displayed on a user interface of the system. Thereby, the dentist would be warned if any critical internal measurements are provided by the system.
- the intraoral scanning system may include a user interface that may be configured to display a 3D model of the dental object and a 2D cross-section of the dental object, and wherein the internal measurement is displayed on the 2D cross-section of the dental object. Via the 3D model, it may be possible for the user of the system to select which parts of the 3D model the user wants to have displayed as a 2D cross-section.
- the internal measurement or a plurality of internal measurements may be displayed on the 2D cross section together with or without the dental feature boundaries.
- the internal measurement of a dental feature or between dental features may be displayed together with an arrow or a line that indicates where the internal measurement has been performed.
- the dental feature which the volume measure corresponds to may be coloured to enhance the contrast of the dental feature, and the internal measurement may be displayed on the dental feature or next to it.
- the dentist may be able to add two or more measurement points on the 3D model, a 2D image of a section of the 3D model, or the 2D cross section image, and the one or more processors is configured to perform the internal measurement between the two or more measurement points.
- the two or more measurement points may be added via a cursor on the user interface, wherein the cursor may be moved around via a motion sensor of the handheld intraoral scanner or by a mouse.
- the one or more processors is configured to assign each of the two or more measurement points to the closest vertex of the multiple vertices, and thereby, the one or more processors may be configured to determine the internal measurement between the two or more measurement points.
- a first measurement point of the two or more measurement points may correspond to the first part of the dental feature boundary, and a second measurement point of the two or more measurement points may correspond to the second part of the dental feature boundary.
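A minimal sketch of snapping user-placed measurement points to the closest mesh vertices and measuring between them, as described above; the vertex coordinates and point positions are invented for illustration.

```python
# Hypothetical sketch: assign each user-selected measurement point to
# the closest vertex of the multiple vertices, then measure between
# the assigned vertices.
import math

def closest_vertex(point, vertices):
    """Vertex of the mesh nearest to a user-selected point."""
    return min(vertices, key=lambda v: math.dist(v, point))

def internal_measurement(p1, p2, vertices):
    """Distance between the mesh vertices closest to the two
    measurement points."""
    return math.dist(closest_vertex(p1, vertices), closest_vertex(p2, vertices))

# Invented example mesh vertices (3D coordinates).
mesh_vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 2.0, 0.0)]
# First point near the first part of the boundary, second near the second part.
d = internal_measurement((0.1, -0.1, 0.0), (1.1, 2.1, 0.1), mesh_vertices)
```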
- the dentist may be able to add a contour line on the 3D model that encompasses a dental feature depicted on the 3D model, a 2D image of a section of the 3D model, or the 2D cross section image, and the one or more processors may be configured to perform the internal measurement on the encompassed dental feature.
- the internal measurement may be a volume measurement of the encompassed dental feature.
- the contour line on the 3D model may encompass an overlap between the first and the second dental feature boundary, i.e. between the first and the second dental feature.
- the contour line may be added via a cursor on the user interface, wherein the cursor may be moved around via a motion sensor of the handheld intraoral scanner or by a mouse.
- the one or more processors is configured to assign the contour line to the closest vertices of the multiple vertices, and thereby, the one or more processors may be configured to determine the internal measurement of the encompassed overlap or dental feature.
- the dental object in the displayed 2D cross-section may be selected by a user of the system via a marker on the displayed three-dimensional model of the dental object.
- the marker may be a window which makes it easier for the user to see which dental object is being selected on the 3D model to be viewed as a 2D cross-section.
- the first dental feature boundary may be displayed on the 3D model of the dental object and/or the 2D cross-section of the dental object.
- the dental feature boundaries may be displayed on the dental object as the actual dental feature that they correspond to, such as enamel, dentine, a pulp chamber, a crack, caries, a filling or a composite restoration, etc.
- the one or more processors may be configured to change a viewing angle of the 2D cross-section based on a position of a marker on the 3D model.
- the marker may be a line that indicates the cross-section that is displayed on the 2D cross-section. The user can rotate the marker, and the cross-section displayed on the 2D cross-section is also rotated symmetrically with the rotation of the marker.
- the one or more processors may be configured to change a viewing depth into a dental object of the 3D model. By increasing the viewing depth, the user can see further into the dental object, and when decreasing the viewing depth, the user is able to see less into the dental object.
- the viewing depth may be adjusted by a slider on the user interface or by a button press on the handheld intraoral scanner and a movement of the handheld intraoral scanner that is detected by a motion sensor that is arranged within the handheld intraoral scanner.
- the user interface may include a transparency unit that may be configured to change the transparency of the first dental feature boundary.
- the user may be able to change the transparency of selected dental feature boundaries.
- the user may not be interested in seeing any disease features but only the anatomy features, and in this example, the user would increase the transparency of the disease features to an extent that they cannot be seen by the user on the user interface.
- the user may be able to remove the unwanted dental features on the dental object by selecting them and pressing a remove button on a keyboard or a graphical remove button on the user interface. Removing or reducing the transparency of any dental features would also remove or reduce the transparency of corresponding internal measurements that may be applied on the same dental object.
- the user interface may include a selector that may be configured to select one or more of a plurality of dental feature boundaries to be displayed, and wherein the plurality of dental feature boundaries includes the first dental feature boundary and at least a second dental feature boundary.
- the transparency unit may be configured to change the transparency of the selected one or more of the plurality of dental feature boundaries.
- the user interface may be configured to display a plurality of dental feature boundaries, and wherein the plurality of dental feature boundaries includes the first dental feature boundary and at least a second dental feature boundary, and wherein the plurality of dental feature boundaries may have different colors for the purpose of improving the contrast between the dental features on the dental object, and thereby, improving the visualization of the dental features.
- the one or more processors may be configured to determine the first dental feature that corresponds to the first dental feature boundary, and wherein the first dental feature may be determined based on the one or more optical coefficients that correspond to the first dental feature boundary and a dental feature algorithm.
- the dental feature algorithm includes one or more of the following: an anatomy feature determiner, a disease feature determiner, and a mechanical feature determiner.
- the anatomy feature determiner may be configured to determine that the one or more optical coefficients corresponds to an anatomy feature when the one or more optical coefficients is within an anatomy feature coefficient range.
- the disease feature determiner may be configured to determine that the one or more optical coefficients corresponds to a disease feature when the one or more optical coefficients is within a disease feature coefficient range.
- the mechanical feature determiner may be configured to determine that the one or more optical coefficients corresponds to a mechanical feature when the one or more optical coefficients is within a mechanical feature coefficient range.
- the anatomy feature coefficient range, the disease feature coefficient range and the mechanical feature coefficient range may be predetermined and stored in a memory of the system.
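As an illustrative sketch of the dental feature algorithm described above: classify a vertex's optical coefficient against predetermined coefficient ranges. The range values and names below are invented placeholders, not clinical values from this disclosure.

```python
# Hypothetical predetermined coefficient ranges, as would be stored in
# a memory of the system (values are illustrative only).
FEATURE_RANGES = {
    "anatomy": (0.0, 2.0),     # anatomy feature coefficient range
    "disease": (2.0, 4.0),     # disease feature coefficient range
    "mechanical": (4.0, 6.0),  # mechanical feature coefficient range
}

def determine_feature(coefficient):
    """Return the feature category whose range contains the optical
    coefficient, or None when no range matches."""
    for feature, (low, high) in FEATURE_RANGES.items():
        if low <= coefficient < high:
            return feature
    return None

print(determine_feature(1.3))  # anatomy
print(determine_feature(2.5))  # disease
```

The ranges could then be updated in place when a user re-categorizes a feature via the graphical user interface.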
- the ranges may be occasionally updated based on manual input by a user.
- the user may identify dental features on a graphical user interface and categorize the identified dental features into an anatomy feature, a disease feature or a mechanical feature.
- the manual input may be provided by different users of the system, and the memory may be a cloud server or a server that is connected to the intraoral scanning system, which may include multiple handheld intraoral scanners and user interfaces connected via a wireless network.
- the one or more processors may be configured to align the non-visible 2D images to the visible sub-scans and determine the point clouds of the aligned sub-scans.
- the alignment may be based on positions and viewing angles of the handheld intraoral scanner while capturing the sub-scans. The positions and the viewing angles may be determined by a motion sensor that is arranged within the handheld intraoral scanner.
- the one or more processors may be configured to align the non-visible 2D images to the visible sub-scans based on common dental features between the non-visible 2D images and the visible sub-scans.
- the position of the handheld intraoral scanner may be determined by identifying a dental object in the visible sub-scans and the same dental object in the non-visible 2D images based on common dental features, such as the geometry and/or a shade color of the dental object or one or more of the dental features.
- the one or more processors may be configured to determine the viewing angle to the dental object of the visible sub-scans and the non-visible 2D images based on triangulation between one or more light sources of the handheld intraoral scanner, one or more image sensors of the handheld intraoral scanner, and the dental object or the one or more dental features.
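A minimal triangulation sketch for the step above. The geometry assumed here (a straight baseline between light source and image sensor, observation angles measured from that baseline) is an illustration of the principle, not the scanner's actual optical layout.

```python
# Classic triangulation: a point seen from both ends of a known
# baseline at angles a and b lies at z = baseline / (cot(a) + cot(b))
# perpendicular to the baseline.
import math

def triangulate_depth(baseline_mm, angle_source_rad, angle_sensor_rad):
    """Perpendicular distance from the baseline to the illuminated
    point on the dental object."""
    cot = lambda a: 1.0 / math.tan(a)
    return baseline_mm / (cot(angle_source_rad) + cot(angle_sensor_rad))

# Symmetric 45-degree rays over a 10 mm baseline meet 5 mm away.
print(round(triangulate_depth(10.0, math.radians(45), math.radians(45)), 6))  # 5.0
```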
- the one or more processors may be configured to determine point clouds of the visible sub-scans and a composition of the non-visible 2D images and the visible sub-scans.
- the one or more processors may be configured to determine, in real time, surface information from the visible sub-scans for generating or updating a three-dimensional (3D) model of the dental object.
- the one or more processors may be further configured to determine a plurality of dental feature boundaries, including the first dental feature boundary and/or the second dental feature boundary, based on the composition or on the non-visible 2D images.
- the non-visible 2D images mainly include scattering from inside a dental object, i.e. the internal region of a dental object, and significantly less surface reflection of a tooth.
- the visible sub-scans mainly include surface reflection of a tooth and significantly less reflection from inside a tooth, i.e. the internal region of a dental object.
- a composed scan information does not include an overlay of two 2D images where each of the two 2D images relates to different emitted wavelengths from the projector unit.
- the composed scan information may be a combination of intensity levels of each pixel of the image sensor unit that relates to different wavelengths. For example, at a first time period, the image sensor unit may capture light information that relates to a visible wavelength, and at a second time period, the image sensor unit may capture light information that relates to an infrared or near-infrared wavelength, and the intensity levels of the two time periods are recorded and combined for enhancing a dental feature.
- the combination of the intensity levels may be done digitally by subtraction and/or addition of the intensity levels.
- the intensity levels may be captured and recorded during at least three time periods for at least three different wavelengths, such as white-coloured wavelength, blue-coloured wavelength and near-infrared wavelength.
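The digital combination by subtraction and addition can be sketched as below. The specific weighting (adding the tissue-penetrating NIR signal and subtracting the blue surface reflection from the white reference) is an illustrative choice, not the disclosed scheme.

```python
# Pixel-wise combination of intensity images captured in separate time
# periods (white, blue, near-infrared); images are plain row-major
# lists of intensity values for this sketch.
def enhance_feature(white, blue, nir):
    """Combine three intensity images pixel by pixel: add the NIR
    signal and subtract the blue reflection from the white reference
    (an assumed enhancement recipe)."""
    return [
        [w + n - b for w, b, n in zip(wrow, brow, nrow)]
        for wrow, brow, nrow in zip(white, blue, nir)
    ]

white = [[10, 12], [11, 13]]
blue = [[3, 4], [2, 5]]
nir = [[7, 1], [6, 2]]
print(enhance_feature(white, blue, nir))  # [[14, 9], [15, 10]]
```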
- the one or more processors may be configured to display the composed scan information and the 3D model on a displaying unit of the system, and wherein the dental feature boundaries may be added to the composed scan information and/or the 3D model.
- FIG. 1 illustrates an intraoral scanning system
- FIG. 2 illustrates the dental object defined as a 3D finite element mesh
- FIGS. 3A and 3B illustrate an example of one or more processors
- FIG. 4 illustrates an example of an internal measurement
- FIGs. 5A, 5B and 5C illustrate different examples of the internal measurement
- FIGs. 6A and 6B illustrate further examples of the internal measurement
- FIGs. 7A, 7B and 7C illustrate different examples of determining a plurality of dental feature boundaries
- FIGS. 8A and 8B illustrate examples of the one or more processors that is configured to determine one or more optical coefficients
- FIG. 9 illustrates an example of the one or more processors that is configured to perform a covariance analysis
- FIGs. 10A and 10B illustrate examples on how to train a neural network for determining a plurality of descriptive parameters
- FIG. 11 illustrates an example of a user interface of the system.
- the electronic hardware may include microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), gated logic, discrete hardware circuits, and other suitable hardware configured to perform the various functionality described throughout this disclosure.
- Computer program shall be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, etc., whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.
- a scanning for providing intra-oral scan data may be performed by a dental scanning system that may include a handheld intraoral scanning device such as the TRIOS series scanners from 3Shape A/S.
- the dental scanning system may include a wireless capability as provided by a wireless network unit.
- the scanning device may employ a scanning principle such as triangulation-based scanning, confocal scanning, focus scanning, ultrasound scanning, x-ray scanning, stereo vision, structure from motion, optical coherence tomography (OCT), or any other scanning principle.
- the scanning device is capable of obtaining surface information by projecting a pattern, translating a focus plane along an optical axis of the scanning device, and capturing a plurality of 2D images at different focus plane positions, such that the series of 2D images captured across the focus plane positions forms a stack of 2D images.
- the acquired 2D images are also referred to herein as raw 2D images, wherein raw in this context means that the images have not been subject to image processing.
- the focus plane position is preferably shifted along the optical axis of the scanning system, such that 2D images captured at a number of focus plane positions along the optical axis form said stack of 2D images (also referred to herein as a sub-scan) for a given view of the object.
- the scanning device is generally moved and angled relative to the dentition during a scanning session, such that at least some sets of sub-scans overlap at least partially, in order to enable reconstruction of the digital dental 3D model by stitching overlapping sub-scans together in real time and displaying the progress of the virtual 3D model on a display as feedback to the user.
- the result of stitching is the digital 3D representation of a surface larger than that which can be captured by a single sub-scan, i.e. which is larger than the field of view of the 3D scanning device.
- Stitching, also known as registration and fusion, works by identifying overlapping regions of 3D surface in various sub-scans and transforming the sub-scans to a common coordinate system such that the overlapping regions match, finally yielding the digital 3D model.
- An Iterative Closest Point (ICP) algorithm may be used for this purpose.
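The stitching idea can be illustrated with a toy 2D ICP: match each point to its nearest neighbour in the other sub-scan, then solve the best rigid rotation and translation in closed form. This is a didactic sketch under simplified assumptions; real sub-scan registration operates on 3D surfaces and is considerably more involved.

```python
# Toy 2D Iterative Closest Point (ICP) sketch of sub-scan stitching.
import math

def best_rigid_transform(src, dst):
    """Closed-form 2D rotation + translation minimising squared error
    between paired points (2D Kabsch)."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    sxx = sxy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        ax, ay = x - csx, y - csy
        bx, by = u - cdx, v - cdy
        sxx += ax * bx + ay * by  # cosine accumulator
        sxy += ax * by - ay * bx  # sine accumulator
    theta = math.atan2(sxy, sxx)
    c, s = math.cos(theta), math.sin(theta)
    return theta, cdx - (c * csx - s * csy), cdy - (s * csx + c * csy)

def transform(t, pts):
    """Apply a (theta, tx, ty) rigid transform to a list of 2D points."""
    theta, tx, ty = t
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in pts]

def icp(src, dst, iterations=10):
    """Repeatedly match nearest neighbours and refit the transform."""
    for _ in range(iterations):
        pairs = [min(dst, key=lambda d: math.dist(p, d)) for p in src]
        src = transform(best_rigid_transform(src, pairs), src)
    return src

scan_a = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
scan_b = transform((math.radians(10), 0.5, -0.2), scan_a)  # moved copy
aligned = icp(scan_a, scan_b)  # aligned now overlaps scan_b
```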
- Another example of a scanning device is a triangulation scanner, where a time varying pattern is projected onto the dental arch and a sequence of images of the different pattern configurations are acquired by one or more cameras located at an angle relative to the projector unit.
- the handheld intraoral scanner may comprise one or more light projectors configured to generate an illumination pattern to be projected on a three-dimensional dental object during a scanning session.
- the light projector(s) may comprise a light source, a mask having a spatial pattern, and one or more lenses such as collimation lenses or projection lenses.
- the light source may be configured to generate light of a single wavelength or a combination of wavelengths (mono- or polychromatic). The combination of wavelengths may be produced by using a light source configured to produce light (such as white light) comprising different wavelengths.
- the light projector(s) may comprise multiple light sources such as LEDs individually producing light of different wavelengths (such as red, green, and blue) that may be combined to form light comprising the different wavelengths.
- the light produced by the light source may be defined by a wavelength defining a specific colour, or a range of different wavelengths defining a combination of colours such as white light.
- the scanning device comprises a light source configured for exciting fluorescent material of the teeth to obtain fluorescence data from the dental object.
- a light source may be configured to produce a narrow range of wavelengths.
- the light from the light source is infrared (IR) light, which is capable of penetrating dental tissue.
- the light projector(s) may be DLP projectors using a micro mirror array for generating a time varying pattern, a diffractive optical element (DOE), or back-lit mask projectors, wherein the light source is placed behind a mask having a spatial pattern, whereby the light projected on the surface of the dental object is patterned.
- the back-lit mask projector may comprise a collimation lens for collimating the light from the light source, said collimation lens being placed between the light source and the mask.
- the mask may have a checkerboard pattern, such that the generated illumination pattern is a checkerboard pattern. Alternatively, the mask may feature other patterns such as lines or dots, etc.
- Color texture of the dental arch may be acquired by illuminating the object using different monochromatic colors such as individual red, green and blue colors or by illuminating the object using multichromatic light such as white light.
- a 2D image may be acquired during a flash of white light.
- the process of obtaining surface information in real time of a dental arch to be scanned requires the scanning device to illuminate the surface and acquire a high number of 2D images.
- a high-speed camera is used with a frame rate of 300-2000 2D frames per second, depending on the technology and the 2D image resolution.
- the high amount of image data needs to be handled by the scanning device, which either directly forwards the raw image data stream to an external processing device or performs some image processing before transmitting the data to an external device or display. This process requires that multiple electronic components inside the scanner operate with a high workload, thus demanding a high current.
- the scanning device preferably further comprises optical components for directing the light from the light source to the surface of the dental arch.
- the specific arrangement of the optical components depends on whether the scanning device is a focus scanning apparatus, a scanning device using triangulation, or any other type of scanning device.
- a focus scanning apparatus is further described in EP 2 442 720 B1 by the same applicant, which is incorporated herein in its entirety.
- the light reflected from the dental arch in response to the illumination of the dental arch is directed, using optical components of the scanning device, towards the image sensor(s).
- the image sensor(s) are configured to generate a plurality of images based on the incoming light received from the illuminated dental arch.
- the image sensor unit may be a high-speed image sensor such as an image sensor configured for acquiring images with exposures of less than 1/1000 second or frame rates in excess of 250 frames per second (fps).
- the image sensor may be a rolling shutter sensor (e.g. CMOS) or a global shutter sensor (e.g. CCD).
- the image sensor(s) may be a monochrome sensor including a color filter array such as a Bayer filter and/or additional filters that may be configured to substantially remove one or more color components from the reflected light and retain only the other non-removed components prior to conversion of the reflected light into an electrical signal.
- additional filters may be used to remove a certain part of a white light spectrum, such as a blue component, and retain only red and green components from a signal generated in response to exciting fluorescent material of the teeth.
- the network unit may be configured to connect the dental scanning system to a network comprising a plurality of network elements including at least one network element configured to receive the processed data.
- the network unit may include a wireless network unit or a wired network unit.
- the wireless network unit is configured to wirelessly connect the dental scanning system to the network comprising the plurality of network elements including the at least one network element configured to receive the processed data.
- the wired network unit is configured to establish a wired connection between the dental scanning system and the network comprising the plurality of network elements including the at least one network element configured to receive the processed data.
- the dental scanning system preferably further comprises a processor configured to generate scan data (such as extra-oral scan data and/or intra-oral scan data) by processing the two-dimensional (2D) images acquired by the scanning device.
- the processor may be part of the scanning device.
- the processor may comprise a Field- programmable gate array (FPGA) and/or an Advanced RISC Machines (ARM) processor located on the scanning device.
- the scan data comprises information relating to the three- dimensional dental arch.
- the scan data may comprise any of: 2D images, 3D point clouds, depth data, texture data, intensity data, color data, and/or combinations thereof.
- the scan data may comprise one or more point clouds, wherein each point cloud comprises a set of 3D points describing the three-dimensional dental arch.
- the scan data may comprise images, each image comprising image data e.g. described by image coordinates and a timestamp (x, y, t), wherein depth information can be inferred from the timestamp.
- the image sensor(s) of the scanning device may acquire a plurality of raw 2D images of the dental arch in response to illuminating said object using the one or more light projectors.
- the plurality of raw 2D images may also be referred to herein as a stack of 2D images.
- the 2D images may subsequently be provided as input to the processor, which processes the 2D images to generate scan data.
- the processing of the 2D images may comprise the step of determining which part of each of the 2D images are in focus in order to deduce/generate depth information from the images.
- the depth information may be used to generate 3D point clouds comprising a set of 3D points in space, e.g., described by cartesian coordinates (x, y, z).
- the 3D point clouds may be generated by the processor or by another processing unit.
- Each 2D/3D point may furthermore comprise a timestamp that indicates when the 2D/3D point was recorded, i.e., from which image in the stack of 2D images the point originates.
- the timestamp is correlated with the z-coordinate of the 3D points, i.e., the z-coordinate may be inferred from the timestamp.
- the output of the processor is the scan data, and the scan data may comprise image data and/or depth data, e.g. described by image coordinates and a timestamp (x, y, t) or alternatively described as (x, y, z).
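The (x, y, t) to (x, y, z) relationship above can be sketched as follows, assuming the focus plane is swept linearly along the optical axis during a sub-scan; the sweep start and speed are invented example parameters, not values from this disclosure.

```python
# Infer the z-coordinate from the timestamp: the focus plane position
# is a known (here assumed linear) function of time during the sweep.
def timestamp_to_z(t_s, sweep_start_mm=0.0, sweep_speed_mm_per_s=50.0):
    """Focus plane position, and hence z, at time t_s (seconds)."""
    return sweep_start_mm + sweep_speed_mm_per_s * t_s

def to_point_cloud(scan_data):
    """Turn (x, y, t) scan data into (x, y, z) 3D points."""
    return [(x, y, timestamp_to_z(t)) for x, y, t in scan_data]

scan = [(0.1, 0.2, 0.00), (0.3, 0.2, 0.02), (0.5, 0.4, 0.04)]
points = to_point_cloud(scan)  # z grows with the timestamp
```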
- the scanning device may be configured to transmit other types of data in addition to the scan data. Examples of data include 3D information, texture information such as infra-red (IR) images, fluorescence images, reflectance color images, x-ray images, and/or combinations thereof.
- FIG. 1 illustrates an intraoral scanning system 1 that is configured to determine 8 an internal measurement of an inner region of a dental object 2.
- the system comprises a handheld intraoral scanner 10 that is configured to provide 12 non-visible 2D images and visible sub-scans of a dental object.
- the system includes one or more processors 13 configured to determine 14 point clouds of the non-visible 2D images and the visible sub-scans, determine 15 a three-dimensional (3D) finite element mesh and align the point clouds to the 3D finite element mesh, and wherein the 3D finite element mesh includes a plurality of finite elements, and wherein each of the plurality of finite elements includes multiple vertices, determine 16 one or more optical coefficients for each of the multiple vertices based on the non-visible 2D images, and wherein the one or more optical coefficients correspond to a dental feature represented by the aligned point clouds; determine 17 a first dental feature boundary of a first dental feature based on the determined one or more optical coefficients, and wherein the first dental feature boundary corresponds to the non-visible 2D images, and determine 18 an internal measurement of the first dental feature boundary.
- FIG. 2 illustrates the dental object 2 that includes information about the surface 5 and the inner region 3 of the dental object 2.
- the dental object 2 is defined as a 3D finite element mesh 2 wherein each of the multiple vertices has three-dimensional coordinates 20.
- the dental object includes dental features, such as a pulp chamber 4, dentine 6 and enamel 8.
- the surfaces of the dental object, including the dental features, are represented by triangles, and the volume of the dental object, including the dental features, is represented by the plurality of finite elements, which in this example includes tetrahedra.
- FIGs. 3A and 3B illustrate an example wherein the one or more processors 13 determines the one or more optical coefficients 32 for each of the multiple vertices 33 based on the non-visible 2D images and determines the dental feature boundaries (31A, 31B) based on the one or more optical coefficients 32.
- the dental object 2 is yet again represented by the 3D finite element mesh 2, but the one or more optical coefficients are illustrated for a part 2A of the 3D finite element mesh 2.
- the one or more processors has determined a disease feature represented by the value of 5, and a first dental feature boundary 31A is applied to the mesh 2A enclosing the disease feature, and a second dental feature boundary 31B is applied to the mesh 2A representing a boundary between the enamel, represented by the value of 3, and the dentine, represented by the value of 2.
- the one or more processors 13 is configured to determine the first dental feature that corresponds to the first dental feature boundary 31A, and wherein the first dental feature is determined based on the one or more optical coefficients 32 that correspond to the first dental feature boundary 31A and a dental feature algorithm 35.
- the dental feature algorithm 35 includes one or more of the following: an anatomy feature determiner 35A that is configured to determine that the one or more optical coefficients 32 corresponds to an anatomy feature when the one or more optical coefficients is within an anatomy feature coefficient range; a disease feature determiner 35B that is configured to determine that the one or more optical coefficients 32 corresponds to a disease feature when the one or more optical coefficients 32 is within a disease feature coefficient range; and a mechanical feature determiner 35C that is configured to determine that the one or more optical coefficients 32 corresponds to a mechanical feature when the one or more optical coefficients 32 is within a mechanical feature coefficient range.
- FIG. 4 illustrates the dental object 2 that includes a first dental feature boundary 31A representing a caries 40A, a second dental feature boundary 31B representing the boundary between enamel and dentine, and a third dental feature boundary 31C representing the pulp chamber.
- the one or more processors 13 is configured to determine the internal measurement 44 between a first part 41A of the first dental feature boundary 31A and a second part 41B of the first dental feature boundary 31A.
- the one or more processors 13 is configured to perform a fitting of a shape object 43 to a geometry of the first dental feature boundary 31A, and wherein the internal measurement 44 is determined based on the fitted shape object 43.
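One possible way to fit a shape object 43 to the boundary geometry is a least-squares sphere fit, with the internal measurement 44 taken as the fitted diameter. This is only a sketch under assumed choices: the disclosure does not fix the shape type or the fitting method.

```python
import numpy as np

def fit_sphere(points: np.ndarray):
    """Least-squares sphere fit to boundary points (n x 3).
    ||p - c||^2 = r^2 is linearised as 2 p.c + (r^2 - c.c) = p.p,
    which is linear in the unknowns c and d = r^2 - c.c."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    solution, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre, d = solution[:3], solution[3]
    radius = float(np.sqrt(d + centre @ centre))
    return centre, radius

def internal_measurement(points: np.ndarray) -> float:
    """Internal measurement taken as the diameter of the fitted sphere."""
    _, radius = fit_sphere(points)
    return 2.0 * radius
```

For elongated lesions an ellipsoid fit would be the natural extension, yielding three diameters instead of one.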
- FIGs. 5A, 5B, and 5C illustrate different examples of the internal measurement.
- FIGs. 5A and 5B illustrate the one or more processors 13 that is configured to determine multiple internal measurements 44A of the first dental feature boundary (31A,41) or of a shape object (43,42) fitted to the geometry of the first dental feature boundary 31A, and wherein the one or more processors 13 is configured to determine a minimum and/or a maximum internal measurement 44 of the multiple internal measurements 44A.
- the one or more processors 13 is configured to determine a volumetric measurement of the dental feature that corresponds to the dental feature boundary (31A), and the determined volumetric measurement is based on the multiple internal measurements 44A.
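The minimum/maximum selection and one possible volumetric estimate can be sketched as follows. The ellipsoid formula is a hypothetical way to turn three orthogonal internal measurements into a volume; the disclosure does not prescribe it.

```python
import math

def min_max_measurements(measurements):
    """Minimum and maximum of the multiple internal measurements (44A)."""
    return min(measurements), max(measurements)

def ellipsoid_volume(d1: float, d2: float, d3: float) -> float:
    """Hypothetical volumetric measurement treating three orthogonal internal
    measurements as ellipsoid diameters: V = (4/3) * pi * (d1/2)(d2/2)(d3/2)."""
    return (4.0 / 3.0) * math.pi * (d1 / 2.0) * (d2 / 2.0) * (d3 / 2.0)
```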
- FIGs. 6A and 6B illustrate different examples of the internal measurement 44.
- the internal measurement 44 is a sectional curvature measure of a part 41A of the first dental feature boundary (31A,41) or of a shape object (43,42) fitted to the geometry of the first dental feature boundary 31A.
- FIGs. 7A, 7B and 7C illustrate the one or more processors 13 configured to determine a plurality of dental feature boundaries (31A, 31B, 31C, 31D), wherein the plurality of dental feature boundaries (31A, 31B, 31C, 31D) includes the first dental feature boundary 31A, a second dental feature boundary 31B, a third dental feature boundary 31C and a fourth dental feature boundary 31D.
- the first dental feature boundary corresponds to a disease feature 40A
- the second dental feature boundary 31B corresponds to enamel 40B
- the third dental feature boundary corresponds to dentine 40C
- the fourth dental feature boundary corresponds to the pulp chamber 40D.
- the one or more processors 13 is configured to determine an internal measurement 44A between the second dental feature boundary 31B and the third dental feature boundary 31C, and wherein the internal measurement 44A corresponds to the thickness of the enamel.
- the one or more processors 13 is further configured to determine an internal measurement 44B between the second dental feature boundary 31B and the first dental feature boundary 31A.
- the one or more processors 13 is configured to determine the thickness of the enamel 40B at the maximum internal measurement 44B, i.e. distance, between the second dental feature boundary 31B and the first dental feature boundary 31A. If the maximum internal measurement is larger than the thickness of the enamel 40B at the same area of the dental object 2, then the disease feature 40A is penetrating the dentine 40C.
- the one or more processors 13 is configured to generate a notification signal to a user interface for notifying the user of the system 1 about the disease feature 40A penetrating into the dentine.
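The enamel-thickness comparison and the resulting notification can be sketched as a single check. The message text is illustrative, not taken from the disclosure.

```python
from typing import Optional

def check_dentine_penetration(max_measurement: float,
                              enamel_thickness: float) -> Optional[str]:
    """Compare the maximum internal measurement (44B) between the enamel
    boundary (31B) and the lesion boundary (31A) with the local enamel
    thickness (40B); return a notification when the lesion reaches dentine."""
    if max_measurement > enamel_thickness:
        return "disease feature penetrating into the dentine"
    return None
```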
- the one or more processors 13 is configured to determine a first internal measurement 44A between the third dental feature boundary 31C and the first dental feature boundary 31A in a first direction along the longitudinal axis 51 of the dental object, and a second internal measurement 44B between the third dental feature boundary 31C and the first dental feature boundary 31A in a second direction along the longitudinal axis 51, and wherein the second direction is opposite to the first direction.
- the sign of the internal measurement (44A,44B) determines whether the internal measurement (44A,44B) includes a measure of a penetration depth of the first dental feature 40A into the third dental feature 40C.
- the second internal measurement 44B corresponds to the penetration depth of the first dental feature 40A into the third dental feature 40C.
- the notification signal includes an information about the penetration depth, and that the penetration depth is into the dentine 40C.
- the one or more processors 13 is configured to determine a plurality of normal lengths (44A,44B) from a second dental feature boundary 31B in a direction towards the first dental feature boundary 31A and along a longitudinal axis 51 of the dental object 2 in the 3D finite element mesh 2A.
- the signs of the plurality of normal lengths (44A,44B) are the same.
- the sign of the plurality of normal lengths (44A,44B) indicates that the first dental feature 40A is penetrating the second dental feature 40B.
- the one or more processors 13 is configured to determine a maximum normal length 44B of the plurality of normal lengths (44A,44B), and determine a penetration depth 44B that is the maximum normal length 44B.
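The signed normal lengths and the maximum-length penetration depth described above can be sketched as follows. The data layout (point arrays and unit normals per reference-boundary vertex) is an assumption for illustration, not the disclosed representation.

```python
import numpy as np

def penetration_depth(lesion_points: np.ndarray,
                      reference_points: np.ndarray,
                      reference_normals: np.ndarray) -> float:
    """Maximum signed normal length from a reference boundary (e.g. 31B)
    towards the lesion boundary (e.g. 31A). A positive sign means the lesion
    lies beyond the reference boundary; 0.0 means no penetration."""
    depths = []
    for point, normal in zip(reference_points, reference_normals):
        # Signed distances of all lesion points along this unit normal.
        signed = (lesion_points - point) @ normal
        depths.append(signed.max())
    max_depth = max(depths)
    return float(max_depth) if max_depth > 0.0 else 0.0
```

A common sign convention would make the normals point from the reference boundary into the deeper tissue, so that positive lengths directly read as penetration.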
- the one or more processors is configured to determine a notification signal when the internal measurement 44B, i.e. the penetration depth, is above a measurement threshold, and wherein the notification signal is displayed on a user interface of the system 1.
- Each of the multiple vertices may correspond to a pixel of an image sensor of the handheld intraoral scanner 10, or a group of the multiple vertices may correspond to a pixel of an image sensor of the handheld intraoral scanner 10.
- FIGs. 8A and 8B illustrate an example of the one or more processors that is configured to determine the one or more optical coefficients 32.
- the image sensor includes at least five pixels (81A,81B,81C,81D,81E), wherein each of the pixels includes a casting object (80A,80B,80C,80D, 80E) that intersects a primary 3D finite element mesh 2A and multiple sub 3D finite element meshes (2B,2C,2D,2E).
- the primary 3D finite element mesh 2A includes the one or more optical coefficients 32 for each of the multiple vertices, and the one or more optical coefficients 32 corresponds to what a pixel (81A,81B,81C,81D,81E) is viewing.
- the sub 3D finite element meshes (2B,2C,2D,2E) correspond to the internal scatterings. In this specific example, each of the casting objects (80A,80B,80C,80D,80E) is a ray that corresponds to the viewing field of the corresponding pixel (81A,81B,81C,81D,81E).
- Each pixel (81A,81B,81C,81D,81E) is configured to receive non-visible 2D images which are then turned into synthetic pixel values.
- the received non-visible 2D images include internal scatterings from within an inner region of the dental object, which means that each of the synthetic pixel values includes an average of internal scatterings from different planes that are distributed along a casting object (80A,80B,80C,80D,80E) of the corresponding pixel (81A,81B,81C,81D,81E).
- Each of the synthetic pixel values may correspond to each of the multiple vertices 33 of a primary 3D finite element mesh 2A, and wherein the one or more processors is configured to determine the one or more optical coefficients 32 for each of the multiple vertices of the primary 3D finite element mesh based on the one or more optical coefficients 32A of the sub 3D finite element meshes (2B,2C,2D,2E).
- the internal scatterings of the received non-visible 2D images may be represented by one or more optical coefficients 32A of multiple vertices of a sub 3D finite element mesh (2B,2C,2D,2E), and since the received non-visible 2D images include internal scatterings captured from different planes, the neural radiance field model is configured to determine a sub 3D finite element mesh (2B,2C,2D,2E) for each of the different planes.
- the sub 3D finite element meshes (2B,2C,2D,2E) are arranged along the casting object (80A,80B,80C,80D,80E), such that a casting object (80A,80B,80C,80D,80E) that starts at a vertex of the primary 3D finite element mesh 2A may intersect the corresponding vertex of each of the sub 3D finite element meshes (2B,2C,2D,2E).
- the one or more optical coefficients 32 of the vertex of the primary 3D finite element mesh 2A may be an average of the one or more optical coefficients 32A of the corresponding vertex of the sub 3D finite element meshes (2B,2C,2D,2E) which the casting object (80A,80B,80C,80D,80E) intersects.
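The averaging along the casting object can be sketched as a mean over the sub-mesh coefficients. The array layout is an assumption for illustration.

```python
import numpy as np

def primary_mesh_coefficients(sub_mesh_coefficients: np.ndarray) -> np.ndarray:
    """sub_mesh_coefficients has shape (n_sub_meshes, n_vertices, n_coefficients):
    one coefficient set per vertex for each sub 3D finite element mesh (2B-2E).
    The primary mesh (2A) value at each vertex is the average over the sub
    meshes intersected by that vertex's casting object."""
    return sub_mesh_coefficients.mean(axis=0)
```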
- FIG. 9 illustrates an example wherein the one or more processors 13 is configured to perform a covariance analysis of the one or more optical coefficients 32 for the purpose of reducing the number of one or more optical coefficients 32 to be processed in order to determine the dental feature boundaries (31A, 31B).
- the one or more processors 13 is configured to perform a covariance analysis 90 of the one or more optical coefficients 32 for the multiple vertices and to determine a plurality of descriptive parameters 32B based on the covariance analysis 90, and wherein a number of the plurality of descriptive parameters 32B is smaller than a number of the one or more optical coefficients 32.
- the first and/or the second dental feature boundary (31A, 31B) is determined by providing the plurality of descriptive parameters 32B into a marching finite element algorithm 91 of the intraoral scanning system 1.
- the covariance analysis 90 is a principal component analysis or an autoencoder deep learning algorithm, wherein the autoencoder deep learning algorithm includes a neural network.
- the neural network is configured to receive the one or more optical coefficients 32 of the multiple vertices that corresponds to the non-visible 2D images and determine the plurality of descriptive parameters 32B.
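For the principal component analysis variant of the covariance analysis 90, a minimal sketch via an SVD: rows are vertices, columns are optical coefficients, and the descriptive parameters 32B are the projections onto the leading principal axes, so that fewer values are passed on (e.g. to the marching finite element algorithm 91). This illustrates the dimensionality reduction only, not the autoencoder variant.

```python
import numpy as np

def descriptive_parameters(coefficients: np.ndarray, n_parameters: int) -> np.ndarray:
    """PCA reduction of per-vertex optical coefficients (n_vertices x n_coeffs).
    Returns the projections onto the first n_parameters principal axes."""
    centred = coefficients - coefficients.mean(axis=0)
    # Right singular vectors of the centred data are the principal axes.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:n_parameters].T
```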
- FIGs. 10A and 10B illustrate examples on how to train the neural network for determining the plurality of descriptive parameters 32B.
- the one or more processors 13 is configured to train the neural network by receiving one or more CBCT scans 100A of one or more dental objects, determining 100C one or more training optical coefficients for each of the one or more CBCT scans, receiving 100B one or more non-visible sub scans from a handheld intraoral scanner of the one or more dental objects, and training the neural network by mapping 100D the one or more training optical coefficients onto the one or more non-visible sub scans.
- the neural network is trained to classify 100E specific dental features 40 based on the one or more training optical coefficients, and wherein the neural network is configured to receive the one or more optical coefficients 32 of the multiple vertices that corresponds to the non-visible 2D images, and determine a plurality of descriptive parameters 32B for a specific dental feature 40 based on the trained optical coefficients and the received one or more optical coefficients.
- FIG. 11 illustrates an example of a user interface 110 of the system 1 that displays a three-dimensional (3D) model 111 of the dental object 2 and a two-dimensional (2D) cross-section 112 of the dental object 2, and wherein the internal measurement 44 is displayed on the 2D cross-section 112 of the dental object 2.
- the dental object 2 to be displayed on the 2D cross-section 112 is selected by a marker 113, and the dental feature boundaries (31A, 31B, 31C) are displayed on the 2D cross-section 112.
- the terms “connected” or “coupled” as used herein may include wirelessly connected or coupled.
- the term “and/or” includes any and all combinations of one or more of the associated listed items. The steps of any disclosed method are not limited to the exact order stated herein, unless expressly stated otherwise.
- An intraoral scanning system configured to determine an internal measurement of an inner region of a dental object, the system comprising:
- a handheld intraoral scanner configured to provide non-visible 2D images and visible sub-scans of a dental object
- one or more processors configured to:
  - determine point clouds of the visible sub-scans,
  - determine a three-dimensional (3D) finite element mesh and align the point clouds to the 3D finite element mesh, and wherein the 3D finite element mesh includes a plurality of finite elements, and wherein each of the plurality of finite elements includes multiple vertices,
  - determine one or more optical coefficients for each of the multiple vertices, and wherein the one or more optical coefficients correspond to a dental feature represented by the aligned point clouds;
  - determine a first dental feature boundary of a first dental feature based on the determined one or more optical coefficients, and wherein the first dental feature boundary corresponds to the non-visible 2D images, and
  - determine an internal measurement of the first dental feature boundary.
- the one or more processors is configured to determine multiple internal measurements of the first dental feature boundary or of a shape object fitted to the geometry of the first dental feature boundary, and wherein the one or more processors is configured to determine a minimum and/or a maximum internal measurement of the multiple internal measurements.
- the one or more processors is configured to determine multiple internal measurements of the first dental feature boundary or of a shape object fitted to the geometry of the first dental feature boundary, and wherein the internal measurement includes a volumetric measurement that is determined based on the multiple internal measurements.
- the internal measurement is a sectional curvature measure of a part of the first dental feature boundary or of a shape object fitted to the geometry of the first dental feature boundary.
- the one or more processors is configured to determine a plurality of dental feature boundaries, and wherein the plurality of dental feature boundaries includes the first dental feature boundary and at least a second dental feature boundary, and wherein the at least second dental feature boundary corresponds to a second dental feature; and wherein the internal measurement is determined between the first dental feature boundary and the at least second dental feature boundary.
- the one or more processors is configured to determine multiple sub-internal measurements that include a first sub-internal measurement, a second sub-internal measurement, and a third sub-internal measurement, and wherein:
- the first sub-internal measurement corresponds to a distance or volumetric measurement of the first dental feature that corresponds to the first dental feature boundary
- the second sub-internal measurement corresponds to a distance or volumetric measurement of the second dental feature that corresponds to the second dental feature boundary
- the third sub-internal measurement is a distance measure or a volumetric measure of an overlap between the first dental feature boundary and the at least second dental feature boundary
- the internal measurement is a ratio between the third sub-internal measurement and the first sub-internal measurement or the second sub-internal measurement.
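The ratio-based internal measurement above can be sketched directly; the function and parameter names are illustrative, not from the disclosure.

```python
def overlap_fraction(feature_measure: float, overlap_measure: float) -> float:
    """Internal measurement as the ratio between the third sub-internal
    measurement (the overlap) and a feature's own sub-internal measurement,
    e.g. the fraction of a lesion volume lying inside a second boundary."""
    if feature_measure <= 0.0:
        raise ValueError("feature measurement must be positive")
    return overlap_measure / feature_measure
```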
- the neural radiance field model is configured to be trained using a plurality of non-visible 2D images, and wherein the casting object is a ray or a cone.
- the covariance analysis is a principal component analysis or an autoencoder deep learning algorithm
- the autoencoder deep learning algorithm includes a neural network
- the neural network is configured to receive the one or more optical coefficients of the multiple vertices that corresponds to the non-visible 2D images and determine the plurality of descriptive parameters.
- the neural network is trained to classify specific dental features based on the one or more training optical coefficients, and wherein the neural network is configured to receive the one or more optical coefficients of the multiple vertices that corresponds to the non-visible 2D images, and determine a plurality of descriptive parameters for a specific dental feature.
- an anatomy feature being an enamel, a dentine, or a pulp chamber
- a disease feature being a crack or a caries
- a mechanical feature being a filling and/or a composite restoration.
- the intraoral scanning system according to any of the previous items, and wherein the one or more processors is configured to determine a notification signal when the internal measurement is above a measurement threshold, and wherein the notification signal is displayed on a user interface of the system.
- the intraoral scanning system comprising a user interface configured to display a three-dimensional (3D) model of the dental object and a two-dimensional (2D) cross-section of the dental object, and wherein the internal measurement is displayed on the 2D cross-section of the dental object.
- the user interface includes a selector that is configured to select one or more of a plurality of dental feature boundaries to be displayed, and wherein the plurality of dental feature boundaries includes the first dental feature boundary and at least a second dental feature boundary, and wherein the transparency unit is configured to change the transparency of the selected one or more of the plurality of dental feature boundaries.
- the intraoral scanning system comprising a user interface configured to display a plurality of dental feature boundaries, and wherein the plurality of dental feature boundaries includes the first dental feature boundary and at least a second dental feature boundary, and wherein the plurality of dental feature boundaries has different colors.
- the one or more processors is configured to group a plurality of dental feature boundaries into multiple dental feature boundary groups, and wherein the plurality of dental feature boundaries includes the first dental feature boundary and at least a second dental feature boundary, and wherein each of the multiple dental feature boundary groups corresponds to different dental features.
- the one or more processors is configured to determine the first dental feature that corresponds to the first dental feature boundary, and wherein the first dental feature is determined based on the one or more optical coefficients that correspond to the first dental feature boundary and a dental feature algorithm, and wherein the dental feature algorithm includes one or more of following:
- an anatomy feature determiner that is configured to determine that the one or more optical coefficients corresponds to an anatomy feature when the one or more optical coefficients is within an anatomy feature coefficient range
- a disease feature determiner that is configured to determine that the one or more optical coefficients corresponds to a disease feature when the one or more optical coefficients is within a disease feature coefficient range
- a mechanical feature determiner that is configured to determine that the one or more optical coefficients corresponds to a mechanical feature when the one or more optical coefficients is within a mechanical feature coefficient range.
- the one or more processors is configured to align the non-visible 2D images to the visible sub-scans and determine the point clouds of the aligned sub-scans.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Epidemiology (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Primary Health Care (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Dentistry (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Optics & Photonics (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
Abstract
The present invention relates to an intraoral scanning system for providing improved internal structure information of teeth by determining compound scan information. The intraoral scanning system may be configured to provide compound scan data based on captured visible light information and internal light information. The system may comprise a handheld intraoral scanner that includes a projector unit configured to emit light at different wavelengths during periods onto at least one dental arch, the different wavelengths comprising at least one near-infrared wavelength and at least one visible wavelength. The handheld intraoral scanner may further comprise an image sensor unit configured to capture visible light information and internal light information from at least the dental arch caused by the light emitted at the different wavelengths. The system may further comprise one or more processors operatively connected to the image sensor unit, the processors being configured to receive the visible light information and the internal light information, determine, in real time, surface information from the visible light information to generate or update a three-dimensional (3D) model of the dental arch using the surface information, determine internal structure information of the dental arch from the internal light information, and determine compound scan information comprising a composition of the internal structure information and the visible light information.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DKPA202370303 | 2023-06-19 | ||
| DKPA202370303 | 2023-06-19 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024260906A1 true WO2024260906A1 (fr) | 2024-12-26 |
Family
ID=91586214
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2024/066771 Pending WO2024260906A1 (fr) | 2023-06-19 | 2024-06-17 | Mesures volumétriques d'une région interne d'un objet dentaire |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024260906A1 (fr) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150164335A1 (en) * | 2012-06-27 | 2015-06-18 | 3Shape A/S | 3d intraoral scanner measuring fluorescence |
| EP2442720B1 (fr) | 2009-06-17 | 2016-08-24 | 3Shape A/S | Appareil d'exploration à focalisation |
| KR20200000704A (ko) * | 2018-06-25 | 2020-01-03 | 오스템임플란트 주식회사 | 치과용 서지컬 가이드 설계 방법, 이를 위한 장치, 및 이를 기록한 기록매체 |
| US20200170760A1 (en) * | 2017-05-27 | 2020-06-04 | Medicim Nv | Method for intraoral scanning directed to a method of processing and filtering scan data gathered from an intraoral scanner |
| CA3130463A1 (fr) * | 2019-03-08 | 2020-09-17 | Align Technology, Inc. | Identification d'objet etranger et augmentation et/ou filtrage d'image pour balayage intrabuccal |
| US20210353152A1 (en) * | 2020-04-15 | 2021-11-18 | Align Technology, Inc. | Automatic generation of orthodontic or prosthodontic prescription |
| AU2022291491A1 (en) * | 2016-07-27 | 2023-02-02 | Align Technology, Inc. | Intraoral scanner with dental diagnostics capabilities |
| US20230149135A1 (en) * | 2020-07-21 | 2023-05-18 | Get-Grin Inc. | Systems and methods for modeling dental structures |
- 2024-06-17 WO PCT/EP2024/066771 patent/WO2024260906A1/fr active Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7657276B2 (ja) | 歯科診断機能を有する口腔内スキャナ | |
| JP6487580B2 (ja) | テクスチャ的特徴を用いる物体の3dモデル化方法 | |
| US11628046B2 (en) | Methods and apparatuses for forming a model of a subject's teeth | |
| WO2024260906A1 (fr) | Mesures volumétriques d'une région interne d'un objet dentaire | |
| WO2025078622A1 (fr) | Système et procédé de balayage intrabuccal | |
| WO2024261153A1 (fr) | Représentation graphique d'un volume interne d'un objet dentaire | |
| CN120092269A (zh) | 牙科口腔健康的3d数字可视化、注释和沟通 | |
| US20250387075A1 (en) | Method for determining optical parameters to be displayed on a three-dimensional model | |
| WO2024260907A1 (fr) | Système de balayage intrabuccal pour déterminer un signal de couleur visible et un signal infrarouge | |
| WO2025125551A1 (fr) | Système de balayage intrabuccal à images focalisées alignées sur un modèle de surface 3d | |
| WO2025202067A1 (fr) | Système de balayage intrabuccal à programmes de séquence de balayage améliorés | |
| WO2025202066A1 (fr) | Système de balayage intrabuccal permettant de fournir un signal de rétroaction qui comporte un niveau de qualité d'un modèle tridimensionnel | |
| WO2024260743A1 (fr) | Système de balayage intra-buccal pour déterminer un signal infrarouge | |
| EP4646137A1 (fr) | Système de balayage intrabuccal pour déterminer des informations de balayage composites | |
| KR20250164786A (ko) | 3d 모델 상에 2d 이미지를 중첩시키기 위한 구강내 스캐너 시스템 및 방법 | |
| WO2025202065A1 (fr) | Système de balayage intrabuccal pour améliorer des informations de balayage composées |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24733930 Country of ref document: EP Kind code of ref document: A1 |