
WO2024200543A1 - Intraoral 3D optical scanner with improved precision - Google Patents


Info

Publication number
WO2024200543A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
projector
scanner
camera
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/058301
Other languages
English (en)
Inventor
Rasmus Kjær
Mads Grønlund Andersen
Karsten Bjerrum DIDERIKSEN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
3Shape AS
Original Assignee
3Shape AS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3Shape AS filed Critical 3Shape AS
Priority to CN202480029554.6A (published as CN121057922A)
Priority to KR1020257035987A (published as KR20250166297A)
Publication of WO2024200543A1
Legal status: Pending


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2513: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, with several lines being projected in more than one direction, e.g. grids, patterns
    • G01B 11/2518: Projection by scanning of the object
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00064: Constructional details of the endoscope body
    • A61B 1/00071: Insertion part of the endoscope body
    • A61B 1/0008: Insertion part of the endoscope body characterised by distal tip features
    • A61B 1/00096: Optical elements
    • A61B 1/24: Instruments for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0082: Measuring for diagnostic purposes using light, adapted for particular medical purposes
    • A61B 5/0088: Measuring for diagnostic purposes using light, adapted for oral or dental tissue
    • A61C: DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C 9/00: Impression cups, i.e. impression trays; Impression methods
    • A61C 9/004: Means or methods for taking digitized impressions
    • A61C 9/0046: Data acquisition means or methods
    • A61C 9/0053: Optical means or methods, e.g. scanning the teeth by a laser or light beam
    • A61C 9/006: Optical means or methods projecting one or more stripes or patterns on the teeth
    • A61C 9/0066: Depth determination through adaptive focusing

Definitions

  • the present disclosure relates to an intraoral 3D scanner for generating a three-dimensional representation of a scanned object.
  • the disclosure further relates to a scan unit for an intraoral 3D scanner, wherein the scan unit comprises multiple camera units.
  • the disclosure further relates to a method for generating a three-dimensional representation.
  • the pattern features are in focus in a wide focus range. It is generally the case for any optical system that a given point or feature projected through the optical system will have a certain spread (blurring) when imaged by the optical system.
  • the degree of spreading of the point can be described by a point spread function (PSF).
  • the resolution of the 3D representation is limited by the spreading of the features described by the PSF, since the features need to be sharp in the images in order to accurately determine the 3D points for generating the 3D representation.
  • the minimum feature size in the pattern is limited by the imaging resolving power of the optics of the scanner.
  • the imaging resolution is limited primarily by three effects: defocus, lens aberrations, and diffraction.
  • the aperture of the projector unit or camera(s) influences the imaging resolution primarily via the aforementioned three effects.
  • a small aperture is advantageous for minimizing the negative effects of defocus and lens aberrations, since a camera or projector having a small aperture is highly tolerant of defocus.
  • the extreme example is a pinhole camera, in which case all objects are in focus almost regardless of their distance from the pinhole aperture.
  • a small aperture causes more diffraction, which negatively affects the imaging resolution.
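To make this defocus/diffraction tradeoff concrete, the sketch below combines the two blur contributions in a simplified thin-lens, object-space model. This is illustrative only and not from the patent; the 550 nm wavelength, 30 mm working distance, and 5 mm of defocus are assumed numbers.

```python
# Simplified object-space blur model vs. pupil diameter (assumptions:
# thin lens, 550 nm light, 30 mm working distance, 5 mm of defocus).
import math

wavelength = 550e-9   # m, assumed green light
wd = 30e-3            # m, assumed working distance
defocus = 5e-3        # m, assumed distance of the surface from best focus

for d_mm in (0.1, 0.2, 0.3, 0.5, 0.7, 1.0, 2.0):
    d = d_mm * 1e-3
    geometric = d * defocus / wd              # defocus blur diameter (grows with aperture)
    diffraction = 2.44 * wavelength * wd / d  # Airy disk diameter (shrinks with aperture)
    total = math.hypot(geometric, diffraction)
    print(f"pupil {d_mm:3.1f} mm: defocus {geometric*1e6:5.0f} um, "
          f"diffraction {diffraction*1e6:5.0f} um, combined {total*1e6:5.0f} um")
```

Under these assumed numbers the combined blur is smallest for a pupil of roughly 0.5 mm; larger pupils are dominated by defocus, smaller ones by diffraction.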
  • Existing 3D scanners typically utilize either dynamic patterns that change in time or structured light patterns of lower density due to the difficulties associated with the projection of high-density patterns.
  • a short working distance is typically desired for dental scanning applications, since it is desired to be able to scan the patient’s teeth from very close proximity.
  • a larger working distance either requires more space inside the scanner or requires the operator to hold the scanner at some predetermined distance above the teeth in order to have the projected pattern in focus; both of which are undesired.
  • a large depth of focus is desired in order to capture all detail e.g. of steep surfaces and cavities inside the patient’s mouth.
  • a large depth of focus is more tolerant to changes in the height over the teeth in which the scanner is held.
  • an optical system for a triangulation-based intraoral scanner comprising:
  • a pattern generating element configured for generating a pattern of light to be projected on a surface of an object;
  • an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm;
  • each camera unit comprising:
  • an image sensor for acquiring one or more image(s), wherein the resolution of the image sensor is at least 0.25 megapixel;
  • the camera focus lenses define a camera optical axis; wherein the projector optical axis and the camera optical axis are non-parallel, wherein the working distance of the projector unit and/or a given camera unit is shorter than 50 mm, and wherein the numerical aperture of the projector unit and/or a given camera unit is between 0.0035 and 0.015.
  • an intraoral scanner comprising:
  • a pattern generating element configured for generating a pattern of light to be projected on a surface of an object
  • each camera unit comprising:
  • one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, wherein the camera focus lenses define a camera optical axis.
  • the scanner is a triangulation-based intraoral scanner comprising:
  • a pattern generating element configured for generating a pattern of light to be projected on a surface of an object
  • an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm;
  • each camera unit comprising:
  • an image sensor for acquiring one or more image(s); and one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, wherein the camera focus lenses define a camera optical axis; wherein the projector optical axis and the camera optical axis are non-parallel, wherein the working distance of the projector unit and/or a given camera unit is between 15 mm and 50 mm, and wherein the numerical aperture of the projector unit and/or a given camera unit is between 0.0035 and 0.015.
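As a rough consistency check on the claimed ranges (a paraxial sketch, not the patent's own computation), the object-space numerical aperture follows from pupil diameter and working distance as NA ≈ (D/2)/WD:

```python
# Paraxial relation NA ~ (pupil radius) / (working distance) for small apertures.
def numerical_aperture(pupil_diameter_mm: float, working_distance_mm: float) -> float:
    return (pupil_diameter_mm / 2.0) / working_distance_mm

# Corners of the stated ranges: pupil 0.2-0.7 mm, working distance 15-50 mm.
for pupil in (0.2, 0.7):
    for wd in (15.0, 50.0):
        print(f"pupil {pupil} mm at WD {wd:4.1f} mm -> NA ~ {numerical_aperture(pupil, wd):.4f}")
```

The extreme corners of the two ranges fall outside the 0.0035-0.015 band, which suggests that pupil diameter and working distance are chosen jointly rather than independently.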
  • the present disclosure further relates to a 3D scanner system for generating a three- dimensional representation of an object, the 3D scanner system comprising:
  • processors configured for generating a three-dimensional representation of the object based on image(s) obtained by the camera unit(s).
  • the 3D scanner system comprises:
  • an intraoral scanner comprising:
  • at least one projector unit configured for projecting a pattern on a surface of the object, wherein the pattern comprises a plurality of pattern features;
  • two or more camera units configured for acquiring a set of images comprising at least one image from each camera unit, wherein each image includes at least a portion of the projected pattern, wherein the images within the set of images are acquired simultaneously;
  • processors configured for:
  • the 3D scanner system may comprise an optical system and/or intraoral scanner according to any of the embodiments disclosed herein.
  • the 3D scanner system may further comprise a display configured for displaying the three-dimensional representation of the object.
  • the present disclosure further relates to a method for generating a three-dimensional representation of an object using the intraoral scanner disclosed herein, the method comprising the steps of:
  • the set of images is acquired by multiple camera units of the intraoral scanner, wherein the number of images in the set of images corresponds to the number of camera units, wherein each camera unit contributes one image to the set of images, wherein images within the set of images are acquired simultaneously by the camera units;
  • Fig. 1 shows a scan unit according to the present disclosure.
  • Fig. 2 shows a cross-section through the scan unit of figure 1.
  • Fig. 3 shows an exploded view of a scan unit according to the present disclosure.
  • Fig. 4 shows the embodiment according to figure 3, wherein the units, i.e. the projector unit and the four camera units, are inserted and mounted/fixed in the fixation unit.
  • Fig. 5 shows a lens mount configured for receiving and mounting the projector or camera lens stack.
  • Fig. 6 shows a fixation unit according to the present disclosure.
  • Fig. 7 shows a camera lens stack, a lens mount, an image sensor, and a flexible printed circuit board, according to the present disclosure.
  • Fig. 8 shows an embodiment of a scan unit, wherein one or more lens mounts are integrated in the fixation unit.
  • Fig. 9 shows an exploded view of a projector unit according to the present disclosure.
  • Fig. 10 shows a cross-sectional view of a scan unit according to the present disclosure.
  • Fig. 11 shows a cross-sectional view of a scan unit according to the present disclosure, wherein the focus lens or lens stack comprises an outer thread.
  • Fig. 13 shows two scan units according to the present disclosure.
  • Fig. 14 shows a cross-section through an intraoral scanner according to the present disclosure.
  • the three-dimensional (3D) object may be a dental object.
  • dental objects include any one or more of: tooth/teeth, gingiva, implant(s), dental restoration(s), dental prostheses, edentulous ridge(s), and/or combinations thereof.
  • the dental object may be a gypsum model or a plastic model representing a subject’s teeth.
  • the three- dimensional (3D) object may comprise teeth and/or gingiva of a subject.
  • the dental object may only be a part of the subject’s teeth and/or oral cavity, since the entire set of teeth of the subject is not necessarily scanned during a scanning session.
  • a scanning session may be understood herein as a period of time during which data (such as images) of the 3D object is obtained.
  • the scanner disclosed herein may be an intraoral scanner for acquiring images within an intraoral cavity of a subject.
  • the scanner may be a handheld scanner, i.e. a device configured for being held with a human hand.
  • the scanner may employ any suitable scanning principle such as triangulation-based scanning, stereo vision, structure from motion, confocal scanning, or other scanning principles.
  • the scanner employs a triangulation-based scanning principle.
  • the scanner may comprise a projector unit and one or more camera units for determining points in 3D space based on triangulation.
  • the scanner comprises a projector unit and two or more camera units, wherein the camera units are configured to image the scanned object from separate views, i.e. from different directions.
  • the camera units may be configured to acquire a set of images, wherein a correspondence problem is solved within said set of images based on triangulation. The images within the set of images may be acquired by separate camera units of the scanner.
  • the images within the set of images are preferably acquired simultaneously, i.e. at the same moment in time, wherein each camera contributes at least one image to the set of images.
  • the images within the set of images preferably capture substantially the same region of the dental object.
  • the images may comprise a plurality of image features corresponding to pattern features in a pattern of light projected on the surface of the dental object.
  • the correspondence problem may generally refer to the problem of ascertaining which parts, or image features, of one image correspond to which parts of another image within the set of images. Specifically, in this context, the correspondence problem may refer to the task of associating each image feature with a projector ray emanating from the projector unit.
  • the problem can also be stated as the task of associating points in the images with points in the projector plane of the projector unit.
  • a system and method for solving the correspondence problem is further described in PCT/EP2022/086763 and PA 2023 70115 by the same applicant, which are herein incorporated by reference in their entirety.
  • the projector unit may be configured to project a plurality of projector rays, which are projected onto a surface of the dental object.
  • Solving the correspondence problem may include the steps of determining image features in the images within a set of images, and further associating said image features with a specific projector ray.
  • the correspondence problem is solved jointly for groups of projector rays, as opposed to e.g. solving the correspondence problem projector ray by projector ray.
  • the inventors have found that by solving the correspondence problem jointly for groups or collections of projector rays, a particularly reliable and robust solution can be obtained, consequently leading to a more accurate 3D representation. Subsequently, the depth of each projector ray may be computed, whereby a 3D representation of the scanned object may be generated.
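The per-ray depth computation can be pictured as intersecting a projector ray with the corresponding camera ray once the correspondence is known. The sketch below uses the standard midpoint method for two skew rays; the geometry is made up for illustration, and the actual solver in the cited applications operates jointly on groups of rays:

```python
# Midpoint triangulation of one projector-ray / camera-ray pair (illustrative only).
import numpy as np

def triangulate(p0, u, q0, v):
    """Return the midpoint of the shortest segment between rays p0 + s*u and q0 + t*v."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    w0 = p0 - q0
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w0, v @ w0
    denom = a * c - b * b  # approaches 0 for (near-)parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((p0 + s * u) + (q0 + t * v))

# Hypothetical geometry (units: mm): projector at the origin looking along z,
# camera 4 mm to the side, axes converging at roughly 9 degrees.
p = triangulate(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                np.array([4.0, 0.0, 0.0]), np.array([-4.0, 0.0, 25.0]))
print(p)  # ~[0, 0, 25]: the matched feature lies 25 mm in front of the projector
```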
  • the scanner may comprise one or more scan units, wherein each scan unit comprises a projector unit and one or more camera units.
  • the scanner may comprise one scan unit comprising one projector unit and at least two camera units.
  • the scanner may comprise one scan unit comprising one projector unit and four camera units.
  • the scanner may comprise at least two scan units, wherein each scan unit comprises a projector unit and two or more camera units.
  • the scanner may comprise at least two scan units, wherein each scan unit comprises a projector unit and four camera units.
  • the scanner has a field of view of at least 18 × 18 mm², such as at least 20 × 20 mm², at a given working distance, such as at a working distance between 15 mm and 50 mm.
  • the scanner may further comprise a reflecting element, such as a mirror, arranged in combination with a given scan unit.
  • the reflecting element is preferably configured to reflect light from the projector unit of the scan unit, and/or light from the surface of the dental object, onto the image sensor(s) of each camera unit of the scan unit associated with the reflecting element.
  • the scanner comprises or constitutes an elongated probe, which defines a longitudinal axis of the scanner.
  • the height of the mirror as seen along the projector optical axis is from about 13 mm to about 20 mm.
  • the mirror allows for part of the optical path to be folded or redirected inside the scanner, such that the scanner may accommodate an optical system, e.g. a scan unit, having a working distance longer than the intended height of the probe or tip of the scanner. Consequently, the tip or probe can be made smaller, in particular the height of the probe, such that it may easily enter e.g. an oral cavity of a patient. A smaller probe also more easily captures data in the back of the mouth of the patient.
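A small numeric sketch of this folded-path budget (the 30 mm working distance and 18 mm in-probe leg are assumed figures, not taken from the disclosure):

```python
# Folded-path budget with a 45-degree mirror (hypothetical numbers, units: mm).
working_distance = 30.0  # assumed focus distance of the scan unit
in_probe_leg = 18.0      # assumed horizontal distance from scan unit to mirror
post_mirror_leg = working_distance - in_probe_leg
print(f"Pattern is in focus {post_mirror_leg:.0f} mm beyond the probe window,")
print("so the probe height is set by the mirror, not by the full working distance.")
```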
  • a projector unit may be understood herein as a device configured for projecting light onto a surface, such as the surface of a three-dimensional object.
  • the projector unit is configured to project a pattern of light onto the surface of a dental object.
  • the projector unit is configured to project a pattern of light such that the pattern of light is in focus at a predefined focus distance, or focus range, measured along a projector optical axis.
  • the projector unit is configured to project the pattern of light such that the pattern of light is defocused at the opening of the probe of the scanner and/or at a surface of an optical window in said probe.
  • the projector unit may be configured to project unpolarized light.
  • the projector unit may comprise a Digital Light Processing (DLP) projector using a micro-mirror array for generating a time-varying pattern, a diffractive optical element (DOE), a front-lit reflective mask projector, a micro-LED projector, a liquid crystal on silicon (LCoS) projector, or a back-lit mask projector, wherein a light source is placed behind a mask having a spatial pattern, whereby the light projected on the surface of the dental object is patterned.
  • the pattern may be dynamic, i.e. such that the pattern changes over time, or the pattern may be static in time, i.e. such that the pattern remains the same over time.
  • An advantage of projecting a static pattern is that it allows the capture of all the image data simultaneously, thus preventing warping due to movement between the scanner and the object.
  • the projector unit may comprise one or more collimation lenses for collimating the light from the light source.
  • the collimation lens(es) may be placed between the light source and the mask.
  • the one or more collimation lenses are Fresnel lenses.
  • the projector unit may further comprise one or more focus lenses, or lens elements, configured for focusing the light at a predefined working distance.
  • the projector unit comprises a projector lens stack comprising a plurality of lens elements.
  • the projector lens stack may define the projector optical axis.
  • the lens elements of the projector lens stack are attached together to form a single unit.
  • the projector unit of the scanner comprises at least one light source and a pattern generating element for defining a pattern of light.
  • the pattern generating element is preferably configured for generating a light pattern to be projected on a surface of a dental object.
  • the pattern generating element may be a mask having a spatial pattern.
  • the projector unit may comprise a mask configured to define a pattern of light.
  • the mask may be placed between the light source of the projector unit and the one or more focus lenses, such that light transmitted through the mask is patterned into a light pattern.
  • the mask may define a polygonal pattern comprising a plurality of polygons, such as a checkerboard pattern.
  • the projector unit may further comprise one or more lenses such as collimation lenses or projection lenses.
  • the pattern generating element is based on diffraction and/or refraction to generate the light pattern, such as a pattern comprising an array of discrete unconnected dots.
  • the projector unit is configured to generate a predefined static pattern, which may be projected onto a surface.
  • the projector unit may be configured to generate a dynamic pattern, which changes in time.
  • the projector unit may be associated with its own projector plane, which is determined by the projector optics.
  • the projector plane may be understood as the plane wherein the mask is contained.
  • the projector plane comprises a plurality of pattern features of the projected pattern.
  • the camera units and projector unit are arranged such that the image sensors and the projector plane, e.g. defined by the mask, are in the same plane.
  • the projector unit may define a projector optical axis.
  • An optical axis may be understood as a line along which there is some degree of rotational symmetry in the optical system such as a camera lens or a projector unit.
  • the projector optical axis of the projector unit is substantially parallel with the longitudinal axis of the scanner.
  • the projector optical axis of the scan unit defines an angle, such as at least 45° or at least 75°, with the longitudinal axis of the scanner.
  • the projector optical axis of the projector unit is substantially orthogonal to the longitudinal axis of the scanner.
  • the projector unit may comprise an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm, such as between 0.3 mm and 0.6 mm.
  • a pupil diameter of between 0.2 mm and 0.7 mm was found to be particularly useful because it provided a projected pattern in particularly good focus in a large focus range, e.g. a focus range of between 16 mm and 22 mm.
  • a pupil diameter from about 0.3 mm to about 0.5 mm was found to provide a good compromise between the imaging resolution, e.g. the resolution of the pattern, and the depth of focus, i.e. the focus range.
  • Depth of focus may in some cases be understood as the maximum range where the object appears to be in acceptable focus, e.g. within a given predetermined tolerance.
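One way to see the coupling between pupil diameter and depth of focus is a back-of-envelope sketch: a point defocused by Δz grows a geometric blur of roughly D·Δz/WD, so the tolerated defocus range scales inversely with pupil diameter. The 100 µm blur tolerance (the Airy-disk bound cited later) and the 30 mm working distance are assumptions here:

```python
# Depth of focus ~ 2 * c * WD / D: tolerated blur c, working distance WD, pupil D.
def depth_of_focus_mm(blur_tolerance_um: float, wd_mm: float, pupil_mm: float) -> float:
    return 2.0 * (blur_tolerance_um / 1000.0) * wd_mm / pupil_mm

for pupil in (0.2, 0.3, 0.5, 0.7):  # mm, spanning the stated range
    print(f"pupil {pupil} mm -> depth of focus ~ {depth_of_focus_mm(100.0, 30.0, pupil):.0f} mm")
```

For a 0.3-0.5 mm pupil this crude estimate lands at roughly 12-20 mm, the same order as the 16-22 mm focus range mentioned above.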
  • the scanner has a working distance of between 10 mm and 100 mm.
  • a working distance of the projector unit of between 10 mm and 70 mm, such as between 15 mm and 50 mm, is particularly useful, since the optics, e.g. the scan unit(s), take up less space inside the scanner, and also since it is desired to be able to scan objects very close to the scanner. Since the optical system then takes up less space inside the scanner, it also allows for multiple scan units to be placed in succession inside the scanner.
  • the scanner is able to project a pattern in focus at the exit of the tip of the scanner.
  • the working distance may be understood as the object to lens distance where the image is at its sharpest focus.
  • the working distance may also, or alternatively, be understood as the distance from the object to a front lens, e.g. a front lens of the projector unit.
  • the front lens may be the one or more focus lenses of the projector unit.
  • the choice of aperture and working distance results in a numerical aperture of the projector unit of between 0.0035 and 0.015, which was found to provide a good imaging resolution, i.e. a pattern with pattern features in good focus in a given focus range, and further a good compromise in terms of defocus, lens aberrations, and diffraction.
  • a numerical aperture of the projector unit of between 0.005 and 0.009 was found to provide an ideal compromise between imaging resolution and depth of focus.
  • a numerical aperture in this range was found to be the best balance between mitigating the negative effects on resolution caused by defocus, lens aberrations, and diffraction.
  • the numerical aperture may be the same for the projector unit and the camera unit(s).
  • the numerical aperture may be understood as the object-space numerical aperture.
  • the projector unit may comprise one or more light sources.
  • the projector unit may be configured to project a pattern of light defined by a plurality of projector rays when the light source(s) are on/active.
  • the projector unit may be configured for sequentially turning the light source on and off at a predetermined frequency, wherein the light source is on for a predetermined time period.
  • the light source(s) may be configured to generate light of a single wavelength or a combination of wavelengths (mono- or polychromatic).
  • the combination of wavelengths may be produced by a light source configured to produce light comprising different wavelengths, or a range of wavelengths (such as white light).
  • the light source may be configured to generate unpolarized light, such as unpolarized white light.
  • each projector unit comprises a light source for generating white light.
  • white light enables the scanner to acquire data or information relating to the surface geometry and to the surface color simultaneously. Consequently, the same set of images can be used to provide both geometry of the object, e.g. in terms of 3D data / a 3D representation, and color of the object. Hence, there is no need for an alignment of data relating to the recorded surface geometry and data relating to the recorded surface color in order to generate a digital 3D representation of the object expressing both color and geometry of the object.
  • the projector unit may comprise multiple light sources such as LEDs individually producing light of different wavelengths (such as red, green, and blue) that may be combined to form light comprising different wavelengths.
  • the light produced by the light source(s) may be defined by a wavelength defining a specific color, or a range of different wavelengths defining a combination of colors such as white light.
  • the light source is a diode, such as a white light diode, or a laser diode.
  • the projector unit comprises a laser, such as a blue or green laser diode for generating blue or green light, respectively.
  • the scanner comprises a light source configured for exciting fluorescent material to obtain fluorescence data from the dental object such as from teeth. Such a light source may be configured to produce a narrow range of wavelengths.
  • the scanner comprises an infrared light source, which is configured to generate wavelengths in the infrared range, such as between 700 nm and 1.5 µm.
  • the scanner comprises one or more light sources selected from the group of: infrared (IR) light source, near-infrared (NIR) light source, blue light source, violet light source, ultraviolet (UV) light source, and/or combinations thereof.
  • the scanner comprises a first light source forming part of the projector unit, and one or more second light sources.
  • a scanner configured for detecting fluorescence is further described in WO 2014/000745 A1 by the same applicant, and is herein incorporated by reference in its entirety.
  • the projector unit is configured for sequentially turning the light source on and off at a predetermined frequency, wherein the light source is on for a predetermined time period.
  • the time period may be between 3 milliseconds (ms) and 10 milliseconds (ms), such as between 4 ms and 8 ms.
  • the predetermined frequency for turning the light source on and off may be between 25 Hz and 35 Hz, such as approximately 30 Hz.
  • the projector unit may be configured to project a pattern of light defined by a plurality of projector rays when a light source of the projector unit is turned on.
  • the terms ‘illumination pattern’, ‘pattern of light’, ‘spatial pattern’, and ‘pattern’ are used herein interchangeably.
  • the pattern may be generated using a pattern generating element, e.g. located in the projector unit.
  • the pattern generating element may be a mask, such as a transparency or transmission mask, having a spatial pattern.
  • the mask may be a chrome photomask.
  • the pattern generating element is configured to utilize diffraction and/or refraction to generate a light pattern.
  • the use of a pattern of light may lead to a correspondence problem, where a correspondence between points in the light pattern and points seen by the camera unit(s) viewing the pattern needs to be determined.
  • the correspondence problem is solved jointly for groups of projector rays emanating from the projector unit.
  • the spatial pattern may be a polygonal pattern comprising a plurality of polygons.
  • the polygons may be selected from the group of: triangles, rectangles, squares, pentagons, hexagons, and/or combinations thereof. Other polygons can also be envisioned. In general, the polygons are composed of edges and corners.
  • the polygons are repeated in the pattern in a predefined manner.
  • the pattern may comprise a plurality of repeating units, wherein each repeating unit comprises a predefined number of polygons, wherein the repeating units are repeated throughout the pattern.
  • the pattern may comprise a predefined arrangement comprising any of stripes, squares, dots, triangles, rectangles, and/or combinations thereof.
  • the pattern is noncoded, such that no part of the pattern is unique.
  • the generated pattern of light is a polygonal pattern, such as a checkerboard pattern comprising a plurality of checkers. Similar to a common checkerboard, the checkers in the pattern may have alternating dark and bright areas corresponding to areas of low light intensity (dark) and areas of high(er) light intensity (bright).
  • the pattern of light is a checkerboard pattern comprising alternating squares of dark and bright light. In some embodiments, each square in the checkerboard pattern has a side length of between 100 µm and 200 µm.
  • the pattern comprises at least 100 x 100 squares arranged in a checkerboard pattern, e.g. of the size mentioned above.
  • such a pattern has a high number of pattern features, e.g. wherein the corners of the squares constitute features. Consequently, such a pattern places high demands on the optical system of the scanner.
  • the pattern of light may resemble a checkerboard pattern with alternating squares of different intensity in light.
  • the light pattern comprises a distribution of discrete unconnected spots of light.
  • the pattern preferably comprises a plurality of pattern features.
  • the pattern features may be arranged in a regular grid. In some embodiments of the presently disclosed scanner, the total number of pattern features in the pattern is at least 1000, preferably at least 3000, more preferably at least 10000, even more preferably at least 15000.
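These feature counts follow from simple arithmetic on the checkerboard figures given above: an N × N checkerboard has (N - 1)^2 interior corners. The 150 µm square size below is one value within the stated range:

```python
# Corner count and physical size for the checkerboard described above.
squares_per_side = 100    # "at least 100 x 100 squares"
square_size_um = 150      # within the stated 100-200 um range
corners = (squares_per_side - 1) ** 2
side_mm = squares_per_side * square_size_um / 1000
print(f"{corners} interior corners on a {side_mm:.0f} x {side_mm:.0f} mm pattern")
# -> 9801 interior corners on a 15 x 15 mm pattern, i.e. well above 3000 features
```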
  • the acquired images of the object will similarly comprise a plurality of image features corresponding to the pattern features.
  • a pattern/image feature may be understood as an individual well-defined location in the pattern/image. Examples of image/pattern features include corners, edges, vertices, points, transitions, dots, stripes, etc.
  • the image/pattern features comprise the corners of checkers in a checkerboard pattern. In other embodiments, the image/pattern features comprise corners in a polygon pattern such as a triangular pattern.
  • the projector unit is configured for projecting a high-density pattern.
  • a high-density pattern may be understood as a pattern comprising more than 3000 pattern features.
  • a dense light pattern leads to a more complex correspondence problem since there is a large number of features for which to solve the correspondence problem.
  • a high-density pattern is more difficult to resolve due to the small features, which consequently sets a high requirement on the optics of the scanner, as discussed below.
  • the inventors have found that a pattern comprising more than 3000 pattern features provides a very good resolution of the corresponding 3D representation of the scanned object, since the high number of features provides for a high number of 3D points.
  • a scanner for projecting a high-density pattern is further described in EP 22183907.9 by the same applicant, and is herein incorporated by reference in its entirety.
  • the projected features will not be imaged by the scanner as ideal points; rather, they will have a certain spread (blurring) when imaged by the scanner.
  • the degree of spreading of the feature can be described by a point spread function (PSF).
  • the resolution of the 3D representation is limited by the spreading of the features described by the PSF, since the features need to be sharp in the images in order to accurately determine the 3D points for generating the 3D representation.
  • the features are described by a PSF having an Airy disk radius equal to or less than 100 µm, such as equal to or less than 50 µm.
  • the minimum feature size in the pattern is limited by the imaging resolving power of the optics of the scanner. As mentioned herein above, the imaging resolution is limited primarily by three effects: defocus, lens aberrations, and diffraction.
  • an optical system, e.g. comprising the projector unit and/or the camera unit(s) as disclosed herein, having a numerical aperture of between 0.0035 and 0.015 makes it possible to resolve very fine details of between 50 µm and 200 µm in size in a focus range between 10 mm and 36 mm.
  • the optical system is configured to have a working distance of between 15 mm and 50 mm, e.g. the working distance of the projector unit and/or the camera unit(s).
  • the working distance can be longer than 50 mm, e.g. in case the scanner comprises a mirror arranged in the distal end of the scanner.
  • the working distance may in some cases be shorter than 15 mm if the scan unit is provided without a mirror in the scanner.
  • a numerical aperture of between 0.0035 and 0.015 may correspond to apertures providing a pupil diameter of between 0.2 mm and 0.7 mm. Accordingly, the technical effect of the choice of numerical aperture is that it provides the ability to project a high-density pattern, wherein the pattern is in focus in a relatively wide focus range in close proximity to the scanner, and wherein the blurring of the pattern features is below a given tolerance, e.g. given by the Airy disk mentioned previously. Consequently, a more accurate 3D representation may be generated, since the position of the 3D points can be determined more accurately, i.e. with less uncertainty, and since the smaller features allow for more features to be present in the pattern, thereby leading to a 3D representation comprising more 3D points.
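The Airy-disk bound and the numerical-aperture range can be checked against each other with the standard diffraction formula r = 0.61·λ/NA (an object-space sketch; the 550 nm wavelength is an assumption, since the disclosure does not tie these figures to a specific wavelength):

```python
# Object-space Airy radius r = 0.61 * wavelength / NA across the claimed NA range.
wavelength_um = 0.55  # assumed green light
for na in (0.0035, 0.005, 0.009, 0.015):
    print(f"NA {na:.4f} -> Airy radius ~ {0.61 * wavelength_um / na:5.1f} um")
```

Across the full claimed NA range the Airy radius stays at or below roughly 100 µm, and the preferred 0.005-0.009 band approaches the 50 µm figure, consistent with resolving details of 50-200 µm.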
  • a camera unit may be understood herein as a device for capturing an image of an object.
  • Each camera unit may comprise an image sensor for generating an image based on incoming light e.g. received from an illuminated 3D object.
  • the image sensor may be an electronic image sensor such as a charge-coupled device (CCD) or an active-pixel sensor (CMOS sensor).
  • the image sensor is a global shutter sensor configured to expose the entire image area (all pixels) simultaneously and generate an image at a single point in time.
  • the image sensor(s) may have a frame rate of at least 30 frames per second, such as at least 60 frames per second, or even at least 75 frames per second.
  • the camera units may capture images, or sets of images, at a frame rate of at least 30 frames per second, e.g. at a frame rate of at least 60 frames per second, e.g. at least 75 frames per second.
  • the number of 3D frames generated by the scanner per second may correspond to, or be less than, the above-indicated frame rates of the image sensors.
  • the image sensor is a rolling shutter sensor.
  • An advantage of utilizing a rolling shutter sensor is that the sensor can be made smaller for a given pixel array size compared to e.g. a global shutter sensor, which typically comprises more electronics per pixel leading to a sensor having a larger area or volume footprint, thus taking up more space.
  • a rolling shutter sensor is advantageous for applications with restricted space, such as for intraoral scanners, in particular for realizing a compact intraoral scanner.
  • the rolling shutter sensor may be configured to expose individual rows of pixels with a time lag and output an image based on that. In that case, the image frames may overlap in time because pixel(s) within one frame have been exposed to light at different times.
  • the imaged object may have moved during the exposure.
  • the light source of the projector unit is configured to flash during a time period, such that all pixels of the image sensor(s) are exposed simultaneously, effectively meaning that the pixels are exposed globally.
  • the global exposure is controlled by the light source of the projector unit as opposed to electronically on the image sensor(s).
  • the exposure time of the image sensor is below 15 milliseconds (ms), such as below 10 ms, such as between 4 and 8 ms.
  • these indicated exposure times preferably correspond to the time period of the flash of the light source of the projector unit as described above.
  • an advantage of configuring the light source to flash during a time period as indicated above is that blurring due to relative movement between the scanner and the object being scanned is minimized. This kind of blurring is also referred to as motion blur.
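A timing sketch of this flash-based global exposure (the row count and per-row readout offset below are hypothetical sensor numbers, not from the disclosure): the flash must fit inside the window during which every row of the rolling shutter sensor is integrating at the same time.

```python
# Window in which all rows of a rolling-shutter sensor integrate simultaneously.
rows = 800            # hypothetical number of pixel rows
row_offset_us = 10.0  # hypothetical row-to-row readout offset
exposure_ms = 15.0    # per-row electronic exposure time
rolling_ms = rows * row_offset_us / 1000.0
print(f"readout skew {rolling_ms:.0f} ms -> flash must fit in {exposure_ms - rolling_ms:.0f} ms")
# With these assumed numbers, a flash of up to ~7 ms is seen identically by all rows.
```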
  • the image sensor(s) may comprise an array of pixels, wherein each pixel is associated with a corresponding camera ray.
  • the array of pixels may be a two-dimensional (2D) array.
  • Each pixel may be covered by a micro lens.
  • the resolution of the image sensor may be from about 0.25 megapixels to about 2.5 megapixels.
  • the image sensor is a CMOS sensor comprising an analog-to-digital converter (ADC) for each column of pixels, making conversion time significantly faster and allowing each camera unit to benefit from greater speed.
  • Each image sensor may define an image plane, which is the plane that contains the object’s projected image.
  • Each image obtained by the image sensor(s) may comprise a plurality of image features, wherein each image feature originates from a pattern feature of the projected pattern.
  • one or more of the camera units comprise a light field camera.
  • each camera unit defines a camera optical axis.
  • the camera units may further comprise one or more focus lenses for focusing light.
  • the image sensor is a monochrome image sensor, wherein each pixel is associated with a single color channel, e.g. a grayscale channel, wherein the value of each pixel represents only an amount of light.
  • the image sensor is a color image sensor or an image sensor comprising a color filter array on the array of pixels.
  • the color filter array may be a Bayer filter employing an arrangement of four color filters: Red (R), Green (G), Green (G), and Blue (B).
  • the Bayer filter may also be referred to as a RGGB filter.
  • color pixels may be combined into monochrome pixels of 2 × 2 color pixels for 3D depth reconstruction. In this case, the resolution of the 3D depth reconstruction is only half the resolution of the image sensor in each direction.
  • the full native resolution is preferably utilized (with color filtered pixels).
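A minimal sketch of the 2 × 2 combination described above (NumPy, assuming an RGGB mosaic as stated): each 2 × 2 cell of color-filtered pixels is collapsed into one monochrome value, halving the resolution in each direction.

```python
# Collapse an RGGB Bayer mosaic into monochrome by 2 x 2 binning.
import numpy as np

def bayer_to_mono(raw: np.ndarray) -> np.ndarray:
    """Sum each 2 x 2 RGGB cell; output is half the sensor resolution per axis."""
    h, w = raw.shape
    return (raw[0:h:2, 0:w:2].astype(np.uint32)   # R
            + raw[0:h:2, 1:w:2]                   # G
            + raw[1:h:2, 0:w:2]                   # G
            + raw[1:h:2, 1:w:2])                  # B

mosaic = np.random.randint(0, 1024, size=(480, 640), dtype=np.uint16)  # fake 10-bit raw
print(mosaic.shape, "->", bayer_to_mono(mosaic).shape)  # (480, 640) -> (240, 320)
```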
  • the projector optical axis and the camera optical axis, or axes, are non-parallel.
  • the projector optical axis and the camera optical axis of at least one camera unit may define a camera-projector angle of approximately 5 to 15 degrees, preferably 5 to 10 degrees, even more preferably 8 to 10 degrees. All of the camera units may be angled similarly with respect to the projector unit, such that each camera optical axis defines approximately the same angle with the projector optical axis.
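The choice of camera-projector angle trades depth sensitivity against packaging and occlusion. A small triangulation sketch (the 5 µm feature-localization error is an assumed figure) shows how a lateral feature error δx maps to a depth error of roughly δx / tan(θ):

```python
# Depth uncertainty vs. camera-projector angle for a fixed lateral feature error.
import math

lateral_error_um = 5.0  # assumed feature-localization error in object space
for angle_deg in (5, 8, 10, 15):
    dz = lateral_error_um / math.tan(math.radians(angle_deg))
    print(f"{angle_deg:2d} deg -> depth uncertainty ~ {dz:4.0f} um")
```

Smaller angles ease packaging and reduce shadowing but amplify depth noise, which is one way to read the stated preference for roughly 8 to 10 degrees.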
  • the camera units are defocused at the opening of the probe of the scanner and/or at the surface of an optical window in said probe.
  • each camera unit and projector unit of a given scan unit are focused at the same distance.
  • each camera unit has a field of view of 50-115 degrees, such as 65-100 degrees, such as 65-75 degrees. In other embodiments, each camera unit has a field of view of 80-90 degrees.
  • Each camera unit may comprise one or more focus lenses for focusing light onto the image sensor of the given camera unit.
  • each camera unit comprises two or more lenses, or lens elements, assembled in a camera lens stack.
  • each camera unit may comprise a camera lens stack comprising a plurality of lens elements.
  • the purpose of the focus lens(es) or camera lens stack may be to define or ensure a predetermined focus distance, or working distance, of the camera unit.
  • the camera lens stack may further define the camera optical axis.
  • the lens elements of the camera lens stack are attached together to form a single unit.
  • the projector lens stack and the camera lens stack(s) are similar, such that similar lens elements are used in the lens stacks of the scan unit. Utilizing a similar lens design of the camera units and projector unit has the benefit of lowering production costs compared to developing different lens designs.
  • the camera unit(s) may comprise an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm, such as between 0.3 mm and 0.6 mm.
  • a pupil diameter of between 0.2 mm and 0.7 mm was found to be particularly useful because it provided an imaged pattern in particularly good focus in a large focus range, e.g. a focus range of between 16 mm and 22 mm.
  • a pupil diameter from about 0.3 mm to about 0.5 mm was found to provide a good compromise between the imaging resolution, e.g. the resolution of the pattern, and the depth of focus, i.e. the focus range.
  • a working distance of the camera unit(s) of between 10 mm and 70 mm, such as between 15 mm and 50 mm, is particularly useful for the same reasons as stated in relation to the projector unit.
  • the projector unit and the camera unit(s) have the same working distance.
  • the images formed by each camera unit remain in focus for all object distances between 10 mm and 50 mm, e.g. between 12 mm and 40 mm, e.g. between 15 mm and 36 mm, from the lens that is farthest from the image sensor.
  • the choice of aperture and working distance results in a numerical aperture of each camera unit of between 0.0035 and 0.015, which was found to provide a good imaging resolution, i.e. a pattern with pattern features in good focus in a given focus range, and further a good compromise in terms of defocus, lens aberrations, and diffraction.
  • a numerical aperture of the camera unit(s) of between 0.005 and 0.009 was found to provide an ideal compromise between imaging resolution and depth of focus.
  • a numerical aperture in this range was found to be the best balance between mitigating the negative effects on resolution caused by defocus, lens aberrations, and diffraction.
  • the numerical aperture may be the same for the projector unit and the camera unit(s).
  • Each camera unit may further comprise a lens mount configured for receiving and mounting the camera lens stack.
  • the lens mount may comprise a cylindrically shaped section adapted to receive the camera lens stack, such that said lens stack can be fixedly mounted therein.
  • the lens mount may further comprise a flange adapted to interface with a fixation unit to ensure correct placement of the lens mount within the fixation unit in at least one direction.
  • the flange comprises one or more flat surfaces for interfacing with the fixation unit to fix the position of a given lens mount in the fixation unit in at least two directions.
  • the flat surfaces of the lens mount and fixation unit are shown in figures 5-6.
  • the camera units are symmetrically arranged around the projector unit, wherein the distance between the projector unit and a given camera unit is between 2 mm and 6 mm, such as between 3 mm and 4 mm.
  • the scanner comprises two or more camera units configured for acquiring a set of images comprising at least one image from each camera unit, wherein each image includes at least a portion of the projected pattern.
  • the images within the set of images are acquired simultaneously.
  • the number of images in the set of images may preferably correspond to the number of camera units, wherein each camera unit contributes one image to the set of images.
  • the 3D scanner system comprises one or more processors configured to generate a 3D representation based on the set of images, e.g. by identifying image features in the set of images and determining points in 3D space based on triangulation.
  • the processors may be located on the scanner or they may be located on an external computer.
  • the 3D representation may be generated continuously during a scanning session, and/or it may be generated in real-time.
  • the scanner system may further comprise a display for displaying the 3D representation. The rendering of the 3D representation and the display of said representation may further occur in real-time, or in perceived real-time for the user, e.g. with time lags below 50 ms.
Scan unit
  • a scan unit may be understood herein as a unit comprising at least one projector unit and one or more camera units.
  • each scan unit comprises at least two camera units having at least partly overlapping fields of view along different camera optical axes.
  • each scan unit comprises at least four camera units having at least partly overlapping fields of view along different camera optical axes.
  • the at least one projector unit and one or more camera units of the scan unit may be provided as modular units for being inserted into a fixation unit of the scan unit, as shown in figs. 3-4. This has the benefit of providing an easier and more intuitive assembly method for the scan unit. It further has the benefit of fixing the projector unit and camera unit(s) in a rigid structure, such that the geometric relationship between said units is fixed and maintained.
  • Each unit, i.e. each camera unit or projector unit, may be connected to its own flexible printed circuit board (PCB).
  • the projector unit and camera units may be fixedly mounted inside the fixation unit, e.g. using an adhesive.
  • the scan unit itself may further be considered a modular unit in the sense that it provides a complete optical system with projector unit and one or more camera units.
  • the scanner is adapted for receiving several such scan units, such that the field of view of the scanner may be extended or enlarged.
  • An example of such an embodiment is shown in figure 14, which shows a scanner comprising two such scan units placed in series in a tip or distal end of the scanner.
  • each scan unit is arranged in combination with a mirror to redirect the projected light, e.g. toward a dental object, such as the dentition or dental arch of a subject.
  • a scanner having an extended field of view is further described in EP 23158001.0 by the same applicant, which is herein incorporated by reference in its entirety.
  • the scan unit may comprise a fixation unit configured for receiving and mounting the projector unit and the camera units in the scan unit.
  • the fixation unit is preferably further configured such that each camera optical axis forms a predefined angle with the projector axis when mounted in the fixation unit.
  • the fixation unit is configured to accommodate at least one projector unit and two or more camera units, such as four camera units.
  • the projector lens stack and/or the camera lens stack(s) protrude from the fixation unit as seen in figs. 1 and 4.
  • the fixation unit may comprise one or more openings for receiving and mounting each lens mount.
  • the openings of the fixation unit may be provided with one or more flat surfaces for interfacing with the flat surfaces of the lens mounts.
  • the flat surfaces are useful for ensuring proper placement of the lens mounts inside the fixation unit, such that they can ideally only move in one dimension during insertion.
  • the openings of the fixation unit may be shaped to fit the lens mounts, e.g. the openings may comprise a cylindrically shaped section to receive and mount a cylindrical section of the lens mounts.
  • the lens mounts may be fixedly mounted inside the fixation unit, e.g. using an adhesive. In some embodiments, the lens mounts are physically integrated in the fixation unit as shown in fig. 8.
  • the disclosure further relates to an optical system for an intraoral scanner, said optical system comprising at least one projector unit as described herein and one or more camera units as described herein.
  • the optical system may comprise any of the optical components disclosed herein, and it may be embodied in several different ways as suggested herein.
  • a reflecting element may be understood herein as an element configured to change the direction of light rays incident on the surface of said reflecting element or being transmitted through said reflecting element, e.g. in case of a prism.
  • the reflecting element is preferably configured to change the direction of a center beam of the projected light from a projector unit from a direction substantially parallel to the longitudinal axis of the scanner to a direction substantially orthogonal to said longitudinal axis.
  • a surface normal of each reflecting element defines an angle with respect to the projector optical axis of approximately 40-50 degrees, preferably approximately 45 degrees.
  • the reflecting element may be selected from the group of: mirrors, prisms, and/or combinations thereof.
  • the reflecting element is configured to reflect light from the projector unit of the scan unit and/or reflect light from the surface of the object being scanned and onto the image sensors of the scan unit.
  • the scanner comprises a mirror as the reflecting element.
  • the scanner comprises a prism as the reflecting element.
  • Some embodiments feature a combination of mirror(s) and prism(s).
  • the prism is preferably configured to change the direction of a center beam of the projected pattern of light from substantially parallel to the longitudinal axis of the scanner to a direction having an angle of at least 45 degrees with respect to said longitudinal axis. Even more preferably, the prism is configured to change the direction of a center beam of the projected pattern of light from substantially parallel to the longitudinal axis of the scanner to a direction having an angle of approximately 90 degrees with respect to said longitudinal axis.
  • the scanner comprises a scan unit, wherein the scanner further comprises a reflecting element positioned on the projector optical axis of the projector unit of said scan unit.
  • the scanner comprises at least two scan units, wherein the scanner further comprises a reflecting element arranged in combination with each scan unit.
  • the reflecting element is then configured to reflect light projected from the projector unit of said scan unit.
  • the reflecting element is preferably further arranged to reflect light from the object being scanned and onto the image sensor(s) of each camera unit of the scan unit.
  • the reflecting element of each scan unit is positioned on the projector optical axis.
  • the projector optical axis may in some embodiments coincide with the longitudinal axis of the scanner.
  • the reflecting element(s) may be shaped in a variety of ways. As an example, each reflecting element may be substantially rectangular. In some embodiments, the reflecting element(s) comprise a plurality of corners, wherein at least some of the corners are rounded.
  • the scanner comprises one or more flexible printed circuit boards (PCBs).
  • Each of the flexible printed circuit boards may connect one of the camera units to a main printed circuit board (PCB).
  • a flexible PCB is connected to each camera unit, wherein each PCB comprises a plurality of wires or circuits connected to the image sensor of a given camera unit.
  • each flexible PCB is bent at a first radius of curvature, wherein the first radius of curvature lies in a first plane.
  • Each PCB may be further bent at a second radius of curvature, wherein the second radius of curvature lies in a second plane.
  • each PCB is a flexible PCB manufactured in a predetermined shape, such that each flexible PCB contains a first section which is straight, and a second section which includes one or more turns.
  • the second section includes three turns, which may be right-angle turns; however, the turns may have rounded corners.
  • the first plane may be substantially parallel to the longitudinal axis of the scanner, and the second plane may be perpendicular to the first plane.
  • the manufactured flexible PCB may be bent only along the first radius of curvature, such that the entirety of the second section lies in the second plane.
  • the first radius of curvature is selected from about 0.5 mm to about 5 mm.
  • a first section of each PCB may be substantially parallel to the longitudinal axis of the scanner, and a second section of each PCB may lie in the second plane. Furthermore, the PCBs may overlap each other along a section, wherein the PCBs are bent at the first radius of curvature.
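For layout purposes, the flex length consumed by each bend follows from the arc length s = θ·r. A small helper, assuming only the 0.5-5 mm radius range stated above (the function name is illustrative):

```python
import math

def bend_arc_length_mm(radius_mm: float, angle_deg: float) -> float:
    """Flex-PCB length consumed by a bend of the given radius and angle."""
    if not 0.5 <= radius_mm <= 5.0:
        raise ValueError("first bend radius is selected from about 0.5 mm to about 5 mm")
    return math.radians(angle_deg) * radius_mm

print(round(bend_arc_length_mm(1.0, 90.0), 3))  # ~1.571 mm of flex for a 90 degree bend
```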
  • the scanner comprises one or more processors.
  • the scanner may comprise a first processor configured for determining image features in the acquired images.
  • the first processor may be selected from the group of: central processing units (CPU), accelerators (offload engines), general-purpose microprocessors, graphics processing units (GPU), neural processing units (NPU), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA), dedicated logic circuitry, dedicated artificial intelligence processor units, or combinations thereof.
  • the first processor may be a field-programmable gate array (FPGA).
  • the first processor may be a neural processing unit (NPU).
  • the NPU may be configured to execute one or more machine learning algorithms.
  • a neural processing unit may be understood herein as a circuit configured to implement control and arithmetic logic necessary to execute machine learning algorithms, such as a neural network.
• the scanner may further comprise a second processor configured for carrying out a computer-implemented method for generating a digital representation of a three-dimensional (3D) object.
  • the second processor may be configured for running a tracking algorithm configured for solving the correspondence problem, e.g. for determining corresponding image features in the obtained images. It may further be configured for determining 3D points based on the determined image features and triangulation.
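The disclosure does not spell out the triangulation routine itself; the sketch below is a generic least-squares (DLT) two-view triangulation of the kind such a second processor might run, assuming calibrated 3x4 projection matrices obtained elsewhere. All names and numbers are illustrative.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Least-squares (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 projection matrices of two calibrated camera units.
    x1, x2: corresponding pixel coordinates (u, v) in the two images.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)   # homogeneous least-squares solution
    X = vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point with a 3x4 matrix; returns pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic check: two cameras 4 mm apart looking down the z-axis.
K = np.diag([800.0, 800.0, 1.0])                       # toy intrinsics
P1 = K @ np.hstack([np.eye(3), [[-2.0], [0.0], [0.0]]])
P2 = K @ np.hstack([np.eye(3), [[2.0], [0.0], [0.0]]])
X_true = np.array([0.0, 0.0, 15.0])                    # 15 mm in front
print(triangulate(P1, P2, project(P1, X_true), project(P2, X_true)))
# -> approximately [0. 0. 15.]
```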
  • the scanner may further comprise computer memory for storing instructions, which when executed, causes the first processor to carry out the step of determining image features in the set(s) of images.
  • the computer memory may further store instructions, which when executed, causes the second processor to carry out the method of generating a digital representation of a three-dimensional (3D) object.
  • the second processor may be a central processing unit (CPU) such as an ARM processor or another suitable microprocessor.
  • the second processor may comprise computer memory.
  • the processor(s), such as the first and second processor, may both be located on the scanner, and they may be operatively connected such that the first processor provides input to the second processor.
  • the first processor may be located on the scanner, and the second processor may be located on the computer system described herein.
  • the first processor may be configured to determine image features in the images, and subsequently provide data related to the determined image features to the second processor.
  • the data may comprise image feature coordinates as well as other attributes such as a camera index or a predefined property, such as the phase, of the image feature(s).
  • the second processor may then be configured to generate the digital representation of the 3D object, e.g. in the form of a point cloud.
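The exact record handed from the first to the second processor is not specified; a minimal sketch of such a per-feature packet, with field names assumed for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ImageFeature:
    """One detected pattern feature passed from the first to the second processor."""
    u: float            # image x-coordinate in pixels
    v: float            # image y-coordinate in pixels
    camera_index: int   # which camera unit acquired the image
    phase: float        # predefined property of the feature, e.g. its phase

features = [ImageFeature(412.3, 377.9, camera_index=0, phase=0.25)]
```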
  • the scanner may be further configured to provide the digital representation to a computer system for rendering the representation.
• the computer system may further process the digital representation, e.g. by stitching scan data, such as 3D representations or point clouds, received from the scanner and/or by fitting one or more surfaces to the stitched scan data / point clouds. This further processing by the computer system may also be referred to herein as reconstruction.
  • the output of the reconstruction is a digital 3D model of the scanned object.
  • the digital 3D model may be rendered and displayed on a display, e.g. connected to the computer system.
  • the rendering and/or display of the 3D model may occur in real-time.
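Stitching presupposes estimating a rigid transform between overlapping point clouds. The disclosure does not prescribe a method; a standard Kabsch/SVD best-fit alignment over already-matched point pairs is one representative sketch:

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t mapping src onto dst (Kabsch/SVD).

    src, dst: (N, 3) arrays of matched 3D points from two overlapping scans.
    Returns R (3x3) and t (3,) such that dst ~= (R @ src.T).T + t.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Usage: R, t = rigid_align(new_cloud, reference_cloud)
#        stitched = (R @ new_cloud.T).T + t
```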
  • the scanner preferably comprises a module for transmitting data, such as images or point clouds, to one or more external devices, such as a computer system.
  • the module may be a wireless module configured to wirelessly transfer data from the scanner to the computer system.
  • the wireless module may be configured to perform various functions required for the scanner to wirelessly communicate with a computer network.
• the wireless module may utilize one or more of the IEEE 802.11 Wi-Fi protocols and an integrated TCP/IP protocol stack that allows the scanner to access the computer network.
• the wireless module may include a system-on-chip having different types of built-in network connectivity technologies. These may include commonly used wireless protocols such as Bluetooth, ZigBee, Wi-Fi, WiGig (also known as 60 GHz Wi-Fi), etc.
• the scanner may further (or alternatively) be configured to transmit data using a wired connection, such as an Ethernet cable or a USB cable.
  • the scanner comprises a wireless module configured to wirelessly transfer data from the scanner to the computer system.
• the scanner may be configured to continuously transfer the data, e.g. scan data or image data, during a scanning session. It may further be configured to transfer said data in real-time, such that the user receives immediate visual feedback while scanning.
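The transport format for this continuous transfer is not disclosed; a minimal length-prefixed TCP streaming sketch, in which host, port, and framing are assumptions:

```python
import socket
import struct

def stream_frames(frames, host="192.168.1.50", port=9000):
    """Send binary frames (e.g. serialized point clouds) length-prefixed over TCP."""
    with socket.create_connection((host, port)) as sock:
        for payload in frames:
            # 4-byte big-endian length header, then the frame itself.
            sock.sendall(struct.pack("!I", len(payload)) + payload)
```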
  • a computer system may be understood as an electronic processing device for carrying out sequences of arithmetic or logical operations.
  • a computer system refers to one or more devices comprising at least one processor, such as a central processing unit (CPU), along with some type of computer memory.
  • Examples of computer systems falling within this definition include desktop computers, laptop computers, computer clusters, servers, cloud computers, quantum computers, mobile devices such as smartphones and tablet computers, and/or combinations thereof.
  • the computer system may comprise hardware such as one or more central processing units (CPU), graphics processing units (GPU), and computer memory such as random-access memory (RAM) or read-only memory (ROM).
  • the computer system may comprise a CPU, which is configured to read and execute instructions stored in the computer memory e.g. in the form of random-access memory.
  • the computer memory is configured to store instructions for execution by the CPU and data used by those instructions.
  • the memory may store instructions, which when executed by the CPU, cause the computer system to perform, wholly or partly, any of the computer-implemented methods disclosed herein.
  • the computer system may further comprise a graphics processing unit (GPU).
  • the GPU may be configured to perform a variety of tasks such as video decoding and encoding, rendering of the digital representation, and other image processing tasks.
  • the computer system may further comprise non-volatile storage in the form of a hard disc drive.
  • the computer system preferably further comprises an I/O interface configured to connect peripheral devices used in connection with the computer system. More particularly, a display may be connected and configured to display output from the computer system. The display may for example display a 2D rendering of the generated digital 3D representation.
  • Input devices may also be connected to the I/O interface. Examples of such input devices include a keyboard and a mouse, which allow user interaction with the computer system.
  • a network interface may further be part of the computer system in order to allow it to be connected to an appropriate computer network so as to receive and transmit data (such as scan data and/or images) from and to other computing devices.
  • the scan data may comprise or constitute 3D data, such as depth maps or point clouds, or it may comprise 2D data, such as images.
  • the CPU, volatile memory, hard disc drive, I/O interface, and network interface may be connected together by a bus.
  • the computer system is preferably configured for receiving data from the scanner, either directly from the scanner or via a computer network such as a wireless network.
  • the data may comprise images, processed images, point clouds, sets of data points, or other types of data.
  • the data may be transmitted/received using a wireless connection, a wired connection, and/or combinations thereof.
  • the computer system is configured for generating a digital representation of a three-dimensional (3D) object as described herein.
• the computer system is configured for receiving data, such as point clouds, from the scanner and subsequently performing the steps of reconstruction and rendering of a digital representation of a three-dimensional (3D) object. Rendering may be understood as the process of generating one or more images from three-dimensional data.
  • the computer system may comprise computer memory for storing a computer program, said computer program comprising computer-executable instructions, which when executed, causes the computer system to carry out the method of generating a digital representation of a three-dimensional (3D) object.
  • the scanner is configured to acquire images of a three-dimensional (3D) object.
  • the images are preferably acquired using a scanner comprising one or more scan units, wherein each scan unit comprises a projector unit and one or more camera units.
  • the scanner may be an intraoral scanner for acquiring images inside the oral cavity of a subject.
  • the projector unit of the scanner is preferably configured for projecting a predefined pattern of light, such as a static pattern, onto a surface, e.g. onto the surface of the three-dimensional object. Once projected on the surface, some light will be reflected from the surface, which may then enter the camera unit(s) of the scanner, whereby images of the 3D object can be acquired.
  • the images are preferably acquired using one or more camera units per projector unit, such as at least two camera units or at least four camera units for each projector unit.
  • each scan unit of the scanner comprises a projector unit and four camera units.
  • the images may be processed by a processor located on the scanner, and then subsequently transmitted to the computer system.
  • the images may also be transmitted, without any processing, to the computer system.
  • both raw images and processed images are transmitted by the scanner to a computer system.
  • a processor located on the scanner receives the images as input and provides one or more point clouds.
  • Fig. 1 shows a scan unit 100 according to the present disclosure.
  • the scan unit 100 comprises a projector unit 300 and four camera units 200, wherein the projector unit 300 is arranged in the center of the four camera units 200.
  • the projector unit 300 may comprise a projector lens stack 310 defining a projector optical axis.
  • the projector lens stack 310 may comprise a plurality of lens elements, or focus lenses, attached together to form a single unit.
• the scan unit 100 may further comprise a fixation unit 400 configured for mounting the projector unit 300 and the camera unit(s) 200 in the scan unit 100.
• the projector lens stack 310 and/or the camera lens stack(s) 210 may protrude from the fixation unit 400.
  • each camera unit 200 comprises a camera lens stack 210 and an image sensor 220 for acquiring one or more image(s).
  • the camera lens stack 210 may comprise a plurality of lens elements, or focus lenses, attached together to form a single unit, wherein each camera lens stack 210 defines a camera optical axis.
  • the camera optical axis may define an angle with respect to the projector optical axis.
  • each camera unit 200 is arranged such that it forms a predefined angle with respect to the projector optical axis.
  • the camera units 200 may be configured to have at least partly overlapping fields of view along the different camera optical axes.
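Tilting each camera axis toward the projector axis determines where the views converge: for a camera-to-projector baseline b and tilt angle θ, the axes cross at distance d = b / tan(θ) along the projector axis. An illustrative calculation using the baseline and angle ranges given in later items (2-5 mm, 5-10 degrees):

```python
import math

def convergence_distance_mm(baseline_mm: float, tilt_deg: float) -> float:
    """Distance along the projector axis where a tilted camera axis crosses it."""
    return baseline_mm / math.tan(math.radians(tilt_deg))

print(round(convergence_distance_mm(3.0, 7.0), 1))  # ~24.4 mm in front of the unit
```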
  • the projector unit 300 may further comprise a light source 330 for emitting light, such as a white light source 330, and one or more collimation lenses 312 for collimating light emitted by the light source 330 and transmitted through said collimation lenses 312.
  • the projector unit 300 may further comprise a mask 320 for defining a spatial pattern of light projected through the mask 320.
  • the projector unit 300 and/or camera unit(s) 200 may further comprise a lens mount 240 configured for receiving and mounting the projector lens stack 310 or a camera lens stack 210.
• the projector lens stack 310 and the camera lens stack(s) 210 may be similar, such that similar lens elements are used in the lens stacks.
  • Fig. 3 shows an exploded view of a scan unit 100 according to the present disclosure.
• the scan unit 100 comprises a projector unit 300, four camera units 200, and a fixation unit 400 configured for receiving and mounting the projector unit 300 and the camera unit(s) 200 in the scan unit 100 such that each camera optical axis forms a predefined angle with the projector optical axis when mounted in the fixation unit 400.
  • the projector unit 300 and the four camera units 200 each comprise a lens mount 240 for accommodating the projector lens stack 310 and the camera lens stack 210, respectively.
  • the fixation unit 400 may be configured to receive the units including lens mount 240, such that the units may be fixed in the fixation unit 400.
• Fig. 4 shows the embodiment according to figure 3, wherein the units, i.e. the projector unit 300 and the camera units 200, are inserted and mounted/fixed in the fixation unit 400.
  • the projector and camera units 200 may be attached to the fixation unit 400, e.g. by adhesive bonding, such that the units are fixed in the fixation unit 400.
  • each projector unit 300 and/or camera unit 200 may comprise a lens mount 240 for mounting a lens stack associated with the projector unit 300 and/or camera unit.
  • the lens mount 240 comprises a cylindrically shaped section adapted to receive the projector or camera lens stack 210.
  • the lens mount 240 may comprise a flange adapted to interface with the fixation unit 400 to ensure correct placement of the lens mount 240 within the fixation unit 400 in at least one direction.
  • the flange may comprise one or more flat surfaces for interfacing with the fixation unit 400 to fix the position of a given lens mount 240 in the fixation unit 400 in at least two directions.
  • Fig. 6 shows a fixation unit 400 according to the present disclosure.
  • the fixation unit 400 may be rigid and made in one piece. An advantage hereof is that the fixation unit 400 may ensure a fixed geometric relationship between the projector unit 300 and the camera unit(s) 200.
  • the fixation unit 400 is configured to accommodate a projector unit 300 and four camera units 200.
  • the projector unit 300 and/or camera unit(s) 200 may further comprise a lens mount 240 configured for receiving and mounting the projector or camera lens stack 210.
  • the fixation unit 400 may comprise one or more openings for receiving and mounting each lens mount 240 in the fixation unit 400.
  • the lens mount 240 is integrated in the fixation unit 400.
• Fig. 7 shows a camera lens stack 210, a lens mount 240, an image sensor 220, and a flexible printed circuit board 420, according to the present disclosure.
• the lens mount 240 may be configured to be placed on top of the flexible printed circuit board 420, such that the lens mount 240 may at least partly accommodate the image sensor 220 of a given camera unit.
  • the lens stack may be inserted and fixed in the lens mount 240. Once assembled, each camera unit 200 may be inserted into the fixation unit 400 as shown in figure 4.
  • Fig. 8 shows an embodiment of a scan unit 100, wherein the lens mount 240 is integrated in the fixation unit 400, such that there is a lens mount 240 for each camera unit 200 and/or projector unit 300.
  • the fixation unit 400 comprises five openings 402, wherein a lens mount 240 is physically integrated in each opening 402.
  • Each lens mount 240 is then configured for receiving and mounting a projector lens stack 310 or a camera lens stack 210.
  • Fig. 9 shows an exploded view of a projector unit 300 according to the present disclosure.
  • the projector unit 300 comprises: a projector lens stack 310 comprising a plurality of lens elements, or focus lenses, attached together to form a single unit, a lens mount 240 configured for receiving and mounting the projector lens stack 310, a light source 330 for emitting light, one or more collimation lenses 312 for collimating light emitted from the light source 330, an illumination mount configured for accommodating the collimation lenses 312, a mask 320 configured for defining a spatial pattern of light projected through the mask 320.
  • Fig. 10 shows a cross-sectional view of a scan unit 100 according to the present disclosure.
  • the camera units 200 are positioned in parallel with the projector unit 300, i.e. such that the camera optical axes are parallel to the projector optical axis.
• the image sensor 220 of each camera unit 200 is surface-mounted on a printed circuit board (PCB) 420, preferably a rigid PCB.
  • the embodiment further comprises a pattern generating element, e.g. a mask 320, as well as a lens mount 240, which may be attached to the PCB.
• the scan unit 100 may further comprise a lens stack, e.g. a projector lens stack 310 and/or one or more camera lens stacks 210.
  • the scan unit 100 may further comprise a light source 330 and one or more collimation lenses 312.
• the scan unit 100 comprises four camera units 200 symmetrically arranged around the projector unit 300; however, only two of them are visible in the figure due to the cross-sectional view.
  • Fig. 11 shows a cross-sectional view of a scan unit 100 according to the present disclosure.
  • This embodiment is largely similar to the embodiment shown in fig. 10, except that in this embodiment the focus lenses or lens stacks comprise an outer thread configured for threaded engagement with the lens mount 240.
• the outer thread serves to adjust the focus of a given unit (projector and/or camera): screwing the lens stack in or out changes its position along the respective optical axis.
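The axial focus travel per revolution equals the thread pitch, Δz = pitch × turns; a one-line illustration with an assumed pitch:

```python
def focus_shift_mm(pitch_mm: float, turns: float) -> float:
    """Axial displacement of a threaded lens stack after 'turns' rotations."""
    return pitch_mm * turns

print(focus_shift_mm(0.35, 0.5))  # 0.175 mm shift for a half turn (pitch assumed)
```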
• Fig. 12 shows a cross-sectional view of a scan unit 100 according to the present disclosure. Similar to the embodiments shown in figs. 10-11, this embodiment comprises surface-mounted image sensors 220 on a PCB, such as a rigid PCB.
  • the scan unit 100 similarly comprises a light source 330 and one or more collimation lenses 312.
  • each camera unit 200 comprises a housing in the shape of a rectangular cuboid (i.e. a hexahedron with 6 rectangles as faces, wherein adjacent faces meet at right angles).
• the camera unit(s) 200 may comprise a backside-illuminated CMOS image sensor.
  • the camera unit(s) 200 may comprise wafer-level optical elements constituting a lens stack.
  • the camera units 200 are provided as packaged modular units with both image sensor 220 and lens stack inside a rectangular cuboid housing.
  • Fig. 13 shows two scan units 100 according to the present disclosure.
• the scan units 100 are configured to be part of an intraoral scanner. This figure further shows a frame 530 configured to accommodate the scan units 100.
  • the frame may further be configured such that one or more reflecting elements can be inserted and mounted in the frame. The reflecting elements may protrude from the frame, when mounted.
  • the frame may comprise one or more openings for inserting the scan units 100.
  • the frame may be made in one piece, and is preferably manufactured to be rigid.
  • the frame may be configured such that the scan units 100 can be inserted in the frame from opposite directions, e.g. a first scan unit 100 from above the frame, and a second scan unit 100 from below the frame. Accordingly, the scanner may be assembled by inserting the first scan unit 100 into the frame through a first opening located in a first surface of the frame; and inserting the second scan unit 100 into the frame through a second opening located in a second surface of the frame.
  • Fig. 14 shows a cross-section through an intraoral scanner 500 according to the present disclosure.
  • the scanner comprises two scan units 100 arranged in series along a longitudinal axis of the scanner in order to increase the field of view of the scanner 500. Having a larger field of view enables large smooth features, such as the overall curve of a given tooth, to appear in each image, which improves the accuracy of a subsequent stitching of respective 3D scan data, such as 3D surfaces, obtained from different sets of images.
  • Each scan unit 100 is arranged in combination with a reflecting element 520, such as a mirror, wherein the reflecting element is configured to alter the direction of light projected by a given scan unit 100.
  • the intraoral scanner may further comprise a housing 540 for accommodating the scan units 100 and the frame 530.
  • the housing 540 may comprise an optical window 510 arranged in a distal end of the intraoral scanner 500.
• the optical window 510 may be made of a polymer, such as poly(methyl methacrylate) (PMMA), or it may be made of a glass, such as sapphire glass.
  • An intraoral scanner comprising:
  • a pattern generating element 320 configured for generating a pattern of light to be projected on a surface of an object
• one or more camera units 200, each camera unit 200 comprising:
  • an image sensor 220 for acquiring one or more image(s);
• one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor 220, wherein the camera focus lenses define a camera optical axis.
• the projector unit 300 comprises an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm.
  • the projector unit 300 is configured for sequentially turning the light source 330 on and off at a predetermined frequency, wherein the light source 330 is on for a predetermined time period.
  • the image sensor 220 comprises an array of pixels in a two-dimensional (2D) array, wherein the array comprises at least 1200 pixels times 1200 pixels.
  • the pattern generating element is a mask 320, such as a chrome-on-glass mask 320, or a diffractive optical element (DOE).
  • the pattern of light resembles a distribution of discrete unconnected spots of light.
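Such a spot pattern can be modelled as a binary mask; the sketch below generates a jittered grid of small, unconnected spots. All dimensions are illustrative and not taken from the disclosure:

```python
import numpy as np

def spot_pattern(height=240, width=240, spacing=12, radius=2, seed=0):
    """Binary mask of discrete, unconnected spots on a jittered grid."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((height, width), dtype=bool)
    yy, xx = np.mgrid[0:height, 0:width]
    for cy in range(spacing // 2, height, spacing):
        for cx in range(spacing // 2, width, spacing):
            jy, jx = rng.integers(-2, 3, size=2)  # small jitter keeps spots apart
            mask |= (yy - (cy + jy)) ** 2 + (xx - (cx + jx)) ** 2 <= radius ** 2
    return mask

print(spot_pattern().sum(), "spot pixels in the mask")
```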
  • the scanner comprises two or more camera units 200, wherein the camera units 200 are configured to acquire a set of images comprising a plurality of images.
  • the scanner comprises four or more camera units 200, wherein the camera units 200 are configured to acquire a set of images comprising a plurality of images.
• the scanner according to any of the preceding items, wherein the camera units 200 are synchronized such that images within the set of images are acquired simultaneously by the camera units 200.
• the field of view of each camera unit 200 is between 65° and 75°.
• the scanner according to any of the preceding items, wherein the fields of view of the camera units 200 are at least partially overlapping.
• each camera optical axis defines an angle with the projector optical axis of between 5° and 10°.
• the camera units 200 are symmetrically arranged around the projector unit 300, wherein the distance between the projector unit 300 and a given camera unit 200 is between 2 mm and 5 mm.
• the scanner further comprises one or more second light sources 330 for emitting light at a second wavelength or a second range of wavelengths.
  • the projector unit 300 further comprises one or more collimation lenses 312 for collimating light from the light source 330.
• the scanner further comprises a mirror arranged in a distal end of the scanner such that light projected from the projector unit 300 is redirected, e.g. onto the surface of the object.
• the height of the mirror, as seen along the projector optical axis, is between about 13 mm and about 20 mm.
  • the scanner comprises an optical window arranged in a distal end of the scanner, wherein the optical window is transparent to light projected by the projector unit 300.
  • the optical window is made of a polymer, such as poly(methyl methacrylate) (PMMA).
  • a 3D scanner system comprising:
  • An intraoral scanner comprising one or more scan units 100, each scan unit 100 comprising:
  • a projector unit 300 comprising:
  • a pattern generating element configured for generating a light pattern to be projected on a surface of a dental object
  • a projector lens stack 310 comprising a plurality of lens elements, the projector lens stack 310 defining a projector optical axis
• one or more camera units 200, each camera unit 200 comprising:
  • a camera lens stack 210 comprising a plurality of lens elements, the camera lens stack 210 defining a camera optical axis;
  • an image sensor 220 for acquiring one or more image(s);
• a fixation unit 400 configured for receiving and mounting the projector unit 300 and the camera unit(s) 200 in the scan unit 100 such that each camera optical axis forms a predefined angle with the projector optical axis when mounted in the fixation unit 400.
• the intraoral scanner according to any of the preceding items wherein the projector lens stack 310 and the camera lens stack(s) 210 are similar, such that similar lens elements are used in the lens stacks.
  • each lens mount 240 comprises a cylindrically shaped section adapted to receive the projector or camera lens stack 210.
  • each lens mount 240 comprises a flange adapted to interface with the fixation unit 400 to ensure correct placement of the lens mount 240 within the fixation unit 400 in at least one direction.
  • the flange comprises one or more flat surfaces for interfacing with the fixation unit 400 to fix the position of a given lens mount 240 in the fixation unit 400 in at least two directions.
• the fixation unit 400 comprises one or more openings for receiving and mounting each lens mount 240.
  • the projector unit 300 further comprises an illumination mount configured for accommodating the one or more collimation lenses 312.
  • the image sensor 220 is a rolling shutter sensor comprising an array of pixels.
• the intraoral scanner according to the preceding item, wherein the projector unit 300 comprises a light source 330 configured to flash during a predefined time period such that effectively all pixels on the image sensor 220 are exposed globally.
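With a rolling shutter, all rows integrate simultaneously only during a window of length exposure - rows × row-readout time; flashing inside that window exposes every pixel under the same illumination, mimicking a global shutter. A small feasibility check with illustrative timings and the 1200-row sensor mentioned above:

```python
def global_flash_window_us(exposure_us: float, n_rows: int, row_time_us: float) -> float:
    """Length of the interval during which every row of a rolling-shutter
    sensor is integrating at the same time. The flash must fit inside it."""
    return exposure_us - n_rows * row_time_us

window = global_flash_window_us(exposure_us=8000, n_rows=1200, row_time_us=5.0)
print(window)  # 2000 us available for the flash; negative would mean no overlap
```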
  • each camera lens stack 210 is mounted directly on the image sensor 220 of a given camera unit.
• the fixation unit 400 is configured to accommodate at least one projector unit 300 and two or more camera units 200.
  • each scan unit 100 comprises at least one projector unit 300 and four or more camera units 200.
  • the intraoral scanner according to any of the preceding items wherein the scanner is configured to acquire a set of images, wherein the number of images within the set of images corresponds to the number of camera units 200.
  • the object is a dental object, such as at least a part of the dentition or teeth of a subject, or at least a part of a dental arch.


Abstract

The present disclosure relates to an optical system for an intraoral scanner, comprising: at least one projector unit comprising: a light source for generating light; a pattern generating element configured for generating a pattern of light to be projected onto a surface of an object; one or more projector focus lenses for focusing the pattern of light, the projector focus lenses defining a projector optical axis; and an aperture having a predetermined size. The optical system further comprises one or more camera units, each camera unit comprising: an image sensor for acquiring one or more images, the resolution of the image sensor being at least 0.25 megapixels; and one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, the camera focus lenses defining a camera optical axis; wherein the projector optical axis and the camera optical axis are non-parallel. The present disclosure further relates to an intraoral scanner comprising such an optical system.
PCT/EP2024/058301 (WO2024200543A1): Optical intraoral 3D scanner with improved accuracy. Priority date 2023-03-31, filing date 2024-03-27. Status: Pending.

Priority Applications (2)

CN202480029554.6A, priority date 2023-03-31, filing date 2024-03-27: Optical intraoral 3D scanner with improved accuracy
KR1020257035987, priority date 2023-03-31, filing date 2024-03-27: Optical 3D scanner with improved accuracy

Applications Claiming Priority (2)

DKPA202370162A, priority date 2023-03-31, filing date 2023-03-31: Optical 3D scanner with improved accuracy
DKPA202370162, priority date 2023-03-31

Publications (1)

Publication Number Publication Date
WO2024200543A1 (fr)

Family

ID=90716892

Family Applications (1)

PCT/EP2024/058301 (WO2024200543A1): Optical intraoral 3D scanner with improved accuracy. Status: Pending.

Country Status (4)

Country Link
KR (1) KR20250166297A (fr)
CN (1) CN121057922A (fr)
DK (1) DK202370162A1 (fr)
WO (1) WO2024200543A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
WO2014000745A1, 3Shape A/S, priority 2012-06-27, published 2014-01-03: 3D intraoral scanner measuring fluorescence
WO2020032572A1 *, Medit Corp., priority 2018-08-07, published 2020-02-13: Three-dimensional intraoral scanner
AU2020308562A1 *, Align Technology, Inc., priority 2019-06-24, published 2022-02-10: Intraoral 3D scanner employing multiple miniature cameras and multiple miniature pattern projectors
WO2023187181A1 *, 3Shape A/S, priority 2022-03-31, published 2023-10-05: Intraoral 3D scanning device for projecting a high-density light pattern
WO2023194460A1 *, 3Shape A/S, priority 2022-04-08, published 2023-10-12: Intraoral scanning device with extended field of view

Family Cites Families (2)

* Cited by examiner, † Cited by third party
US10966614B2 *, Dentlytec G.P.L. Ltd., priority 2015-01-18, published 2021-04-06: Intraoral scanner
US12076200B2 *, Align Technology, Inc., priority 2019-11-12, published 2024-09-03: Digital 3D models of dental arches with accurate arch width


Also Published As

Publication number Publication date
CN121057922A (zh) 2025-12-02
KR20250166297A (ko) 2025-11-27
DK202370162A1 (en) 2024-10-25


Legal Events

Code 121 (EP): The EPO has been informed by WIPO that EP was designated in this application. Ref document number 24715767; country of ref document: EP; kind code of ref document: A1.
Code WWE: WIPO information, entry into national phase. Ref document number KR1020257035987; country of ref document: KR.
Code WWE: WIPO information, entry into national phase. Ref document number 2024715767; country of ref document: EP.
Code NENP: Non-entry into the national phase. Ref country code: DE.
Code ENP: Entry into the national phase. Ref document number 2024715767; country of ref document: EP; effective date: 20251031.