EP4543362A1 - Procédé de détermination de caractéristiques dentaires à partir d'une image dentaire - Google Patents
Procédé de détermination de caractéristiques dentaires à partir d'une image dentaire
- Publication number
- EP4543362A1 (application EP23734225.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- tooth
- processing model
- image
- learning
- data set
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C19/00—Dental auxiliary appliances
- A61C19/04—Measuring instruments specially adapted for dentistry
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4538—Evaluating a particular part of the musculoskeletal system or a particular medical condition
- A61B5/4542—Evaluating the mouth, e.g. the jaw
- A61B5/4547—Evaluating teeth
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C9/00—Impression cups, i.e. impression trays; Impression methods
- A61C9/004—Means or methods for taking digitized impressions
- A61C9/0046—Data acquisition means or methods
- A61C9/0053—Optical means or methods, e.g. scanning the teeth by a laser or light beam
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
Definitions
- the present invention relates to a method for determining tooth characteristics from a tooth image.
- Biofilm is a film that, if not removed, can cause dental diseases such as tooth decay. Regular removal, for example with a toothbrush, is therefore strongly recommended.
- the contrast agent fluorescein is known to fluoresce when illuminated by, for example, UV/blue light.
- German patent application DE 10 2022 102 045.2, which has not yet been published, discloses a dental dirt detection device that solves this problem by having the camera record only a specific color spectrum range. This significantly improves the signal-to-noise ratio, so that a biofilm becomes very clearly visible.
- US 2020/0201266 A1 shows a cleaning device for household use.
- This cleaning device can be designed for a wide variety of applications, such as cleaning the floor of a building, shaving a human body or cleaning teeth.
- This cleaning device can include a neural network that determines various properties from an image of the object to be cleaned, such as the color of teeth, in order to influence the cleaning process.
- US 2020/0146794 A1 describes an intelligent toothbrush that has a camera with which images of the teeth to be brushed can be captured. Using a neural network, the images can be evaluated to determine whether the teeth have plaque or tartar or whether there is inflammation in the oral cavity. The cleaning process with the toothbrush is adjusted accordingly.
- US 2020/0179089 A1 discloses an oral hygiene monitoring system that monitors the movement and orientation of oral hygiene devices, such as a toothbrush, when they are used. This is done using one or more cameras that monitor the movement of the toothbrush from outside the body of the person cleaning their teeth.
- US 2021/0393026 A1 describes an oral hygiene system that is a type of intelligent toothbrush that has an optical sensor to optically scan the interior of the mouth.
- the sensor data can be analyzed using a machine learning system, such as a neural network.
- the invention is based on the object of creating a method for recognizing teeth simply, reliably and automatically.
- This object is achieved by the subject matter of the independent claims.
- Advantageous developments and preferred embodiments form the subject matter of the dependent claims.
- a method for generating tooth characteristics includes providing a processing model, capturing at least one tooth image of a tooth, and calculating the tooth characteristics from the tooth images using the processing model.
- the processing model was trained using a data set.
- the data set includes at least one learning tooth image and one set of learning tooth characteristics, which are linked to one another.
- the tooth characteristics include at least the boundaries of one or more teeth in the tooth image (a minimal inference sketch of this step follows below).
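The following is a minimal, non-authoritative inference sketch in Python of this step. It assumes the processing model is a PyTorch segmentation network; the function name, input handling and 0.5 threshold are illustrative assumptions, not part of the disclosure. The trained model maps a captured tooth image to a per-pixel tooth mask whose contour gives the tooth boundaries.

```python
import torch
import torchvision.transforms.functional as TF
from PIL import Image

def predict_tooth_mask(model: torch.nn.Module, image_path: str) -> torch.Tensor:
    """Return a binary mask (1 = tooth, 0 = background) for one captured tooth image."""
    image = Image.open(image_path).convert("RGB")
    x = TF.to_tensor(image).unsqueeze(0)       # (1, 3, H, W) input batch
    model.eval()
    with torch.no_grad():
        logits = model(x)                      # assumed output shape: (1, 1, H, W)
        mask = (torch.sigmoid(logits) > 0.5).squeeze()
    return mask.to(torch.uint8)                # tooth characteristics as a binary image
```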
- Tooth boundaries have a similar appearance across multiple tooth images. This makes them particularly suitable for machine learning.
- the trained processing model has learned where the boundaries of the teeth are typically located in the image.
- the processing model can be based on certain markers in the image, such as individual teeth that are visible with greater contrast in the original tooth image.
- even the finest brightness differences in the gray levels are evaluated by the processing model as a boundary, provided they fit into the overall boundary contour of the teeth.
- teeth, especially teeth of the same type, are similar across multiple tooth images.
- one user's incisor, for example, is similar to another user's incisor.
- the respective differences can then be recognized very easily using the machine learning processing model.
- tooth characteristics can also be recognized from tooth images, even if no specialist personnel evaluate the images.
- the method is very suitable for machine learning.
- the inventors have recognized, unlike the prior art discussed above in which machine learning systems are used to analyze images of an oral cavity for plaque, gingivitis and the like, that due to the similar shape of teeth in different people, a machine learning system can recognize the boundary between tooth and gum very precisely and reliably, even if the optical conditions imposed by the system are not optimal. Clear identification of the boundary between the teeth and gums significantly increases the quality of oral cleaning.
- Tooth characteristics are data that describe certain features of a tooth.
- tooth characteristics are an image file with the same dimensions as the tooth image, containing only binary data, for example a 0 indicating that there is no tooth at that location and a 1 indicating that there is a tooth there. Depending on the presentation, such an image would appear as a black contour drawing of the tooth.
- the generated image of the tooth characteristics corresponds to the tooth image.
- the tooth characteristics are position data or vector lines that run along the boundary of the tooth (see the contour-extraction sketch below).
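As a sketch of this representation (assuming OpenCV is available; the function and variable names are illustrative), a binary tooth mask can be converted into vector lines running along the tooth boundary:

```python
import cv2
import numpy as np

def mask_to_boundary_polylines(mask: np.ndarray) -> list[np.ndarray]:
    """mask: uint8 array with 1 where a tooth is present; returns one polyline per tooth."""
    contours, _ = cv2.findContours(
        mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE
    )
    # Each contour is an (N, 1, 2) array of x/y positions along one tooth boundary.
    return [c.reshape(-1, 2) for c in contours]
```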
- Biofilm, plaque, dental plaque and dental dirt are synonymous within the scope of this application. They describe a substance that adheres to the teeth and usually contains saliva, bacteria and food particles.
- the tooth images are binary images.
- the tooth images are reconstructed before being input into the processing model.
- the reconstruction may include at least one of the following features:
- the tooth characteristics preferably represent boundaries of the tooth in the tooth image, in particular in relation to the gums and/or the tongue.
- the recognition of tooth characteristics may include a method for matching and/or for classification or categorization of color aid programs. According to a preferred development, when the tooth characteristics are recognized by a machine learning algorithm, a segmentation model in particular is used for area recognition of the tooth.
- a model can be used for object recognition. This includes a bounding box model and/or a model for tooth coordinate recognition.
- an algorithm for threshold value determination, in particular an HSV, RGB, YCbCr or LAB threshold value determination, is used. Sections of the image can be highlighted using different colors (a thresholding sketch follows below).
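A minimal HSV-thresholding sketch is shown below; the threshold values and the red highlight color are illustrative assumptions, not values from the disclosure. Pixels falling inside the range are marked so that sections of the image stand out.

```python
import cv2
import numpy as np

def highlight_by_hsv_threshold(image_bgr: np.ndarray,
                               lower=(35, 40, 40), upper=(95, 255, 255)) -> np.ndarray:
    """Highlight all pixels whose HSV values fall inside [lower, upper]."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    in_range = cv2.inRange(hsv,
                           np.array(lower, dtype=np.uint8),
                           np.array(upper, dtype=np.uint8))
    highlighted = image_bgr.copy()
    highlighted[in_range > 0] = (0, 0, 255)    # mark matching pixels in red (BGR)
    return highlighted
```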
- a segmentation model could also be used to assign features or characteristics to areas.
- Area detection is carried out here. This is useful, for example, when the characteristics have clearly defined boundaries, as is the case with tooth decay.
- the tooth characteristics include features of the teeth, such as tooth discoloration, implants, de-mineralized areas, caries, etc.
- the tooth characteristics can be combined with the tooth image, which represents dental plaque, to create a map of the tooth showing not only the dental plaque but also the boundaries of the teeth. This allows the plaque to be localized very precisely on the individual teeth and a cleaning process to be controlled accordingly.
- a tooth map with the boundary and the dental plaque can be used, for example, to appropriately control a tooth cleaning device for cleaning the teeth, as described in the German patent application DE 10 2022 102 045.2 (a simple tooth-map sketch follows below).
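A simple tooth-map sketch under stated assumptions (per-image binary tooth and plaque masks are available; SciPy is used here to separate individual teeth, which is only one possible approach) could localize plaque per tooth region as follows:

```python
import numpy as np
from scipy import ndimage

def build_tooth_map(tooth_mask: np.ndarray, plaque_mask: np.ndarray) -> dict[int, float]:
    """Return, per connected tooth region, the fraction of its pixels covered by plaque."""
    labels, n_teeth = ndimage.label(tooth_mask > 0)   # separate individual tooth regions
    coverage = {}
    for tooth_id in range(1, n_teeth + 1):
        region = labels == tooth_id
        coverage[tooth_id] = float((plaque_mask[region] > 0).mean())
    return coverage
```

Such a per-tooth coverage value could, for example, be used to prioritize regions during the cleaning process.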
- the tooth image is generated using a tooth dirt detection device.
- the contrast agent fluorescein has a fluorescence with the strongest intensity at 520 to 530 nm when excited with light with a wavelength of 465 to 500 nm.
- the close spectral proximity of the exciting and the emitted light leads to a signal-to-noise ratio that is too low in standard intraoral camera units of a dental dirt detection device to ensure reliable detection. Reflections caused by focused lighting solutions and blurry images from the camera unit are currently compensated for in intraoral cameras on the market only by a greater distance between the tooth and the sensor unit.
- the dental dirt detection device preferably has a light filter that allows light with wavelengths between 480 nm and 530 nm to pass.
- an optical long-pass filter is preferably placed directly in front of the camera of the dental dirt detection device, which ideally has a cutoff wavelength of 480 nm to 530 nm and in particular about 510 nm and cuts off signals below this. Additionally, a bandpass or shortpass filter can be placed in front of the LEDs that illuminate the area to be detected in order to limit/focus the wavelength spectrum of the LEDs.
- a circular polarizer can also be used.
- an additional parameter can be taken into account to determine the tooth characteristics.
- the accuracy in determining the tooth characteristics can be increased.
- the processing model can generate better tooth characteristics if, for example, the tooth type (such as an incisor) is known.
- An incisor tooth differs in shape from, for example, a molar.
- the additional parameters preferably include at least one of the following parameters:
- an area with generalized tartar in the tooth image may have characteristics similar to the background. If such an area is known, the position of the tooth boundary can be calculated more accurately (a sketch of passing such an additional parameter to the model follows below).
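One possible way of passing such an additional parameter to the processing model is sketched below. Encoding the tooth type as one-hot image channels is an assumption for illustration, not the method prescribed by the disclosure.

```python
import torch

def add_tooth_type_channels(image: torch.Tensor, tooth_type: int,
                            num_types: int = 4) -> torch.Tensor:
    """image: (3, H, W) tensor; returns a (3 + num_types, H, W) tensor for the model input."""
    _, h, w = image.shape
    one_hot = torch.zeros(num_types, h, w)
    one_hot[tooth_type] = 1.0                  # broadcast the known tooth type over the image
    return torch.cat([image, one_hot], dim=0)
```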
- the machine learning is supervised machine learning.
- the processing model is learned from given pairs of inputs and outputs. These inputs and outputs represent prepared tooth images and tooth characteristics. The correct outputs for the tooth images are provided manually at input. After several rounds with different inputs and outputs, the ability to create the associations is trained.
- the processing model is improved in the application phase through independent machine learning by a user manually adjusting the tooth characteristics generated by the processing model.
- the processing model is additionally trained using a data set consisting of the tooth image and the adapted tooth characteristics.
- the processing model is trained on a data set by a machine learning algorithm.
- the data set includes at least one learning tooth image and at least one set of learning tooth characteristics, which are linked to one another.
- the processing model is improved by the processing model generating target tooth characteristics from a learning tooth image and then using a target algorithm to determine a measure of how much the target tooth characteristics and the learning tooth characteristics differ.
- the processing model is adjusted based on the determined measure (a minimal training-loop sketch follows below).
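A minimal supervised training-loop sketch is given below, under the following assumptions: PyTorch, a segmentation network `model`, and a dataset yielding pairs of learning tooth images and learning tooth characteristics. The target algorithm is represented here by a pixel-wise binary cross-entropy loss, which may differ from the measure actually used.

```python
import torch
from torch.utils.data import DataLoader

def train_processing_model(model, dataset, epochs: int = 10, lr: float = 1e-4):
    loader = DataLoader(dataset, batch_size=4, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.BCEWithLogitsLoss()     # measure of mask disagreement
    model.train()
    for _ in range(epochs):
        for images, learning_characteristics in loader:
            target_characteristics = model(images)            # model output for the learning image
            loss = loss_fn(target_characteristics,
                           learning_characteristics.float())  # how much target and learning characteristics differ
            optimizer.zero_grad()
            loss.backward()                                   # adjust the model based on the measure
            optimizer.step()
    return model
```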
- a computer program product includes instructions that, when the program is executed by a computer, cause it to carry out the method described above.
- the computer is a computing unit.
- the computing unit or computer can also be a digital user device, such as a smartphone, a server, a microcontroller, a laptop, a tablet computer, a PDA or another intelligent system, as well as a cloud-based system.
- a further aspect of the present invention is a method for automatically cleaning teeth, wherein at least one tooth characteristic is determined according to a method explained above and the cleaning process is controlled in accordance with the tooth characteristic thus determined.
- the cleaning process is carried out, for example, using an automatically controlled toothbrush.
- Figure 2 shows a block diagram of a system for determining tooth characteristics from a tooth image.
- Figure 3a shows a block diagram of a system for forming a processing model.
- Figure 3b shows a block diagram of a system for the use of a processing model.
- Figure 4 shows a flowchart of a method for determining tooth characteristics from a tooth image.
- a system 1 for determining tooth characteristics includes a learning unit 2, an execution unit 3 and a tooth dirt detection device 4 for recording tooth images 5.
- the learning unit 2 includes a machine learning module 6, which is designed to use assignments 7 of tooth images 5 to learning tooth characteristics 8 in order to generate a processing model 9 therefrom.
- the execution unit 3 includes an application module 10, which uses the processing model 9 to automatically determine tooth characteristics 11 from tooth images 5.
- the learning unit 2 and the execution unit 3 are typically computing units, such as computers.
- the machine learning module 6 and the application module 10 are software applications that can be executed on these computers.
- the learning unit 2 and the execution unit 3 are two different computers connected to each other via a computer network to exchange the processing model 9.
- the learning unit 2 and the execution unit 3 are implemented by the same computer.
- the tooth dirt detection device 4 preferably communicates wirelessly with the execution unit 3, for example via WLAN. However, wire-based communication is also conceivable.
- The tooth dirt detection device 4 is described in detail in the unpublished German patent application DE 10 2022 102 045.2, which is incorporated by reference in its entirety.
- the tooth dirt detection device 4 is designed such that a U-shaped section of the tooth dirt detection device 4 is introduced into the mouth of a user.
- the U-shaped section of the dental dirt detection device 4 has a sensor arrangement.
- the U-shaped section is placed on the teeth so that all tooth surfaces can be detected by a sensor.
- a detection liquid is introduced into the oral cavity before the detection process.
- the detection fluid interacts with the biofilm and causes the biofilm to glow at another predetermined wavelength under the influence of light with a predetermined wavelength.
- the detection liquid is arranged in a detection capsule which can be inserted into the tooth dirt detection device 4. The detection device then withdraws the detection liquid from the capsule and pumps it onto the teeth.
- the tooth dirt detection device 4 consists of a handpiece, which can have a display on the side facing away from the user. On the side facing the user there is a mouthpiece that is inserted into the oral cavity and guides the sensor unit over the teeth.
- the mouthpiece has at least one camera unit with a camera.
- the camera unit alone has dimensions of 1 x 1 x 2.7 mm, and the entire sensor unit, including a protective glass, PCB (printed circuit board), filter and camera holder, has a diameter of 8 mm and a height of 3.8 mm. With these dimensions, the unit can be easily guided into an oral cavity.
- PCB printed circuit board
- an optical long-pass filter is preferably placed directly in front of the camera, which ideally has a cutoff wavelength of around 510 nm and cuts off signals below that.
- a bandpass or shortpass filter can be placed in front of the LEDs that illuminate the area to be detected in order to limit the wavelength spectrum of the LEDs.
- step S1 (Figure 4).
- Tooth images 5 are manually assigned to tooth characteristics 11. Tooth images 5 are images of teeth that were recorded by a tooth dirt detection device 4. They typically show a strong signal-to-noise ratio for dental plaque, but the demarcation of the teeth, here the tooth characteristics 11, is very difficult to recognize. However, specialist personnel are able to determine these boundaries manually.
- an empty processing model 9 is initially used, which here consists of a neural network.
- empty means that the neural network has not yet been trained with any data, but contains the necessary basics and is ready to learn.
- learning tooth images 5 are read into the processing model 9, which then outputs target tooth characteristics 11 (Figure 3a).
- an image of the teeth is recorded with a camera module of a tooth dirt detection device 4, without using a contrast agent or, where applicable, color filters. This creates an image of the tooth in which a biofilm is not easily visible, but the boundaries of the tooth are clearly visible. This boundary can be calculated from the tooth image 5 using an algorithm or determined manually, and then represents the learning tooth characteristics.
- a target algorithm 12 determines how much the target characteristics differ from the learning tooth characteristics assigned to the learning tooth images 5.
- the processing model 9 is automatically improved based on this measure. This learning step is then repeated with the same learning tooth image and/or a different learning tooth image.
- step S4 in which the processing model 9 is transferred from the learning unit 2 to the execution unit 3.
- step S5 new tooth images 5 are also recorded by the tooth dirt detection device 4.
- the new tooth images 5 are transmitted from the tooth dirt detection device 4 to the execution unit 3.
- the tooth surfaces visible to the camera are measured.
- the heights, depths and angles of inclination of the tooth can be determined in order to determine the surface of the tooth.
- teeth can be tracked, i.e. followed, across several recorded tooth images. This involves identifying the same teeth in different tooth images. Furthermore, tooth movements can be determined and the camera movement can be calculated from them (a simple tracking sketch follows below).
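A simple tracking sketch under stated assumptions: per-image tooth masks are available (for example from the segmentation step above), and teeth are associated by greedy intersection-over-union matching, which is only one of many possible association strategies.

```python
import numpy as np

def iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection over union of two binary masks."""
    union = np.logical_or(a, b).sum()
    return float(np.logical_and(a, b).sum()) / float(union) if union else 0.0

def match_teeth(prev_masks: list[np.ndarray], curr_masks: list[np.ndarray],
                min_iou: float = 0.3) -> dict[int, int]:
    """Map tooth indices in the previous image to tooth indices in the current image."""
    matches = {}
    for i, prev in enumerate(prev_masks):
        scores = [iou(prev, curr) for curr in curr_masks]
        if scores and max(scores) >= min_iou:
            matches[i] = int(np.argmax(scores))
    return matches
```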
- Reference symbol list
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Epidemiology (AREA)
- Physics & Mathematics (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Data Mining & Analysis (AREA)
- Primary Health Care (AREA)
- Artificial Intelligence (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Databases & Information Systems (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Physiology (AREA)
- Signal Processing (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Rheumatology (AREA)
- Psychiatry (AREA)
- Evolutionary Computation (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Physical Education & Sports Medicine (AREA)
- Optics & Photonics (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
- Image Analysis (AREA)
- Endoscopes (AREA)
Abstract
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102022115398.3A DE102022115398A1 (de) | 2022-06-21 | 2022-06-21 | Verfahren zum Bestimmen von Zahncharakteristika aus einem Zahnbild |
| PCT/EP2023/066684 WO2023247565A1 (fr) | 2022-06-21 | 2023-06-20 | Procédé de détermination de caractéristiques dentaires à partir d'une image dentaire |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP4543362A1 (fr) | 2025-04-30 |
Family
ID=87003087
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP23734225.8A Pending EP4543362A1 (fr) | Procédé de détermination de caractéristiques dentaires à partir d'une image dentaire | 2022-06-21 | 2023-06-20 |
Country Status (4)
| Country | Link |
|---|---|
| EP (1) | EP4543362A1 (fr) |
| JP (1) | JP2025520657A (fr) |
| DE (1) | DE102022115398A1 (fr) |
| WO (1) | WO2023247565A1 (fr) |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN119423483A (zh) | 2016-08-22 | 2025-02-14 | 科利布里有限公司 | 用于依从性监测的口腔卫生系统及远程-牙科系统 |
| KR102171837B1 (ko) | 2018-11-08 | 2020-10-29 | 이상근 | 스마트 칫솔을 이용한 구강정보 관리 시스템 |
| US11468561B2 (en) | 2018-12-21 | 2022-10-11 | The Procter & Gamble Company | Apparatus and method for operating a personal grooming appliance or household cleaning appliance |
| WO2021262528A1 (fr) | 2020-06-22 | 2021-12-30 | Colgate-Palmolive Company | Système de soin buccodentaire et procédé pour favoriser l'hygiène buccodentaire |
| US12033742B2 (en) * | 2020-12-11 | 2024-07-09 | Align Technology, Inc. | Noninvasive multimodal oral assessment and disease diagnoses apparatus and method |
| DE102022102045B4 (de) | 2022-01-28 | 2023-10-26 | epitome GmbH | Vorrichtung und Verfahren zur Erfassung von Biofilm im Mundraum |
- 2022
  - 2022-06-21: DE application DE102022115398.3A, published as DE102022115398A1 (de), status: active, Pending
- 2023
  - 2023-06-20: EP application EP23734225.8A, published as EP4543362A1 (fr), status: active, Pending
  - 2023-06-20: WO application PCT/EP2023/066684, published as WO2023247565A1 (fr), status: not active, Ceased
  - 2023-06-20: JP application JP2024575266A, published as JP2025520657A (ja), status: active, Pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2023247565A1 (fr) | 2023-12-28 |
| DE102022115398A1 (de) | 2023-12-21 |
| JP2025520657A (ja) | 2025-07-03 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| DE69918549T2 (de) | Gerät zur feststellung von zahnbelag mittels fluoreszenzmessung | |
| US9870613B2 (en) | Detection of tooth condition using reflectance images with red and green fluorescence | |
| DE112012004064B4 (de) | Diagnosesystem | |
| EP2914201B1 (fr) | Procédé permettant d'obtenir au moins une image individuelle pertinente d'un objet dentaire | |
| DE202017007142U1 (de) | lntraoraler Scanner mit Zahndiagnosefähigkeiten | |
| CH680187A5 (fr) | ||
| EP2786696A1 (fr) | Système de caméra dentaire | |
| US20240138665A1 (en) | Dental imaging system and image analysis | |
| DE602004009875T2 (de) | Verfahren zur Bildverarbeitung für Profilbestimmung mittels strukturiertem Lichts | |
| DE102022102045B4 (de) | Vorrichtung und Verfahren zur Erfassung von Biofilm im Mundraum | |
| DE102007014413B4 (de) | Verfahren zum Auswerten von Fluoreszenzbildsätzen und Vorrichtung zu seiner Durchführung | |
| DE102009023952A1 (de) | Verfahren und Vorrichtung zur Bestimmung von Zahnfarben | |
| DE60132701T2 (de) | System zur farbabstimmung | |
| WO2023247565A1 (fr) | Procédé de détermination de caractéristiques dentaires à partir d'une image dentaire | |
| DE20209441U1 (de) | Vorrichtung zum Erkennen von bakteriellem Befall an Zähnen | |
| DE112020004617T5 (de) | Endoskopsystem | |
| DE102010043796A1 (de) | Zahnärztliches System zum Transilluminieren von Zähnen | |
| DE102019113283B4 (de) | Vorrichtung zur Abbildungserzeugung von Hautläsionen | |
| DE19724421C2 (de) | Verfahren und Vorrichtung zur quantitativen Bestimmung von dentaler Plaque | |
| EP4186469A1 (fr) | Appareil portable périodontal | |
| EP1269909A1 (fr) | Procédé et dispositif de reconnaissance des changements pathologiques sur la surface des tissus, notamment des dents | |
| EP4477136A1 (fr) | Procédé et système de détection de plaque dentaire | |
| DE102022115396A1 (de) | Verfahren zum Reinigen von Zähnen und Reinigungssystem | |
| EP3620100A1 (fr) | Pièce à main de caméra dentaire permettant de réaliser des enregistrements intrabuccaux | |
| DE10232682B4 (de) | Verfahren und Vorrichtung zum nichtmagnetischen Identifizieren von Lebewesen sowie Verwendung derselben |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| | 17P | Request for examination filed | Effective date: 20241227 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | DAV | Request for validation of the european patent (deleted) | |
| | DAX | Request for extension of the european patent (deleted) | |
| | REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40125698; Country of ref document: HK |