WO2021054700A1 - Method for providing tooth lesion information and device using same
- Publication number
- WO2021054700A1 (PCT/KR2020/012448)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tooth
- lesion
- information
- image
- providing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7275—Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Definitions
- The present invention relates to a method for providing tooth lesion information and an apparatus using the same.
- X-ray images acquired in various ways are widely used for dental diagnosis and treatment.
- Panoramic X-ray images are widely used when establishing treatment plans because they show the oral structure as a whole and reveal the anatomical structure of the oral cavity clearly.
- In dental practice, lesion areas are commonly predicted by reading a panoramic X-ray image with the naked eye, and techniques for automatically identifying lesions have been developed to overcome the limitations of such visual reading.
- Research on establishing lesion candidate regions in X-ray images, however, remains at the level of exploiting contrast changes in the image.
- An object of the present invention is to provide a means for effectively determining a lesion area in a tooth image by providing, for each tooth, information on whether a lesion has occurred together with identification information of that tooth.
- Another object of the present invention is to provide visualized lesion information through a map that visualizes, for each region of the input tooth image, the probability that the region corresponds to a lesion area.
- A further object of the present invention is to provide a readout model capable of detecting a lesion area more accurately by using a loss function that takes the correlation between teeth into account when training the readout model.
- A characteristic configuration of the present invention for achieving the above objects and realizing the characteristic effects described later is as follows.
- A method for providing tooth lesion information performed by a computing device includes the steps of: (a) receiving a tooth image of a subject; (b) detecting a lesion area included in the tooth image through a pre-trained readout model, or supporting another device linked to the computing device so that it detects the lesion area; (c) generating tooth lesion information on the lesion area, or supporting another device linked to the computing device so that it generates the tooth lesion information; and (d) providing the tooth lesion information to an external entity.
- Step (c) may include the steps of (c1) generating a map that visualizes the lesion area in the tooth image; and (c2) generating the tooth lesion information through visualization information in which the lesion area is visualized in the tooth image based on the map.
- Step (c) may also include the steps of (c3) generating identification information identifying the position or order of the individual teeth included in the tooth image; (c4) generating matching information by matching the identification information with information on whether a lesion has occurred in the individual tooth corresponding to that identification information; and (c5) generating the tooth lesion information based on the matching information.
- The readout model is pre-trained in a direction that minimizes the value of a loss function, and the loss function may be determined based on correlation information between individual teeth.
- A computing device that provides tooth lesion information includes: a communication unit that receives a tooth image of a subject; and a processor that generates tooth lesion information for the tooth image, wherein the processor detects a lesion area included in the tooth image through a pre-trained readout model and generates tooth lesion information on it, or supports another device linked to the computing device so that it generates the tooth lesion information, and provides the tooth lesion information to an external entity.
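- To make the flow of steps (a)-(d) concrete, the following minimal Python sketch illustrates one way the sequence could be wired together; the `readout_model` object, its `predict` method, the image source, and the output channel are hypothetical placeholders for illustration, not the patented implementation.

```python
import numpy as np

def provide_tooth_lesion_information(tooth_image: np.ndarray, readout_model, send_to_external_entity):
    """Illustrative pipeline for steps (a)-(d); every callable here is an assumed placeholder."""
    # (b) detect lesion areas with a pre-trained readout model (per-tooth probabilities assumed)
    per_tooth_probabilities = readout_model.predict(tooth_image[np.newaxis, ...])[0]
    # (c) generate tooth lesion information for the detected lesion areas
    tooth_lesion_information = {f"tooth_{j}": float(p) for j, p in enumerate(per_tooth_probabilities)}
    # (d) provide the tooth lesion information to an external entity
    send_to_external_entity(tooth_lesion_information)
    return tooth_lesion_information
```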
- From a single tooth image, the invention can thus provide a means of reading whether a lesion has occurred in each individual tooth.
- Because medical images already used in hospitals can be used as they are, the method of the present invention is not dependent on a specific type of image or platform.
- FIG. 1 is a conceptual diagram schematically showing an exemplary configuration of a computing device that performs a method for providing lesion information according to the present invention.
- FIG. 2 is an exemplary block diagram showing hardware or software components of a computing device that performs a method for providing lesion information according to the present invention.
- FIG. 3 is a flowchart illustrating a method of providing tooth lesion information according to an exemplary embodiment.
- FIG. 4 is a diagram illustrating an example in which a method of providing lesion information according to an exemplary embodiment is performed.
- FIG. 5 is a diagram illustrating an example in which a Pearson correlation matrix used in a process of determining a loss function is visualized.
- The term 'image' refers to multidimensional data composed of discrete image elements (e.g., pixels in a 2D image and voxels in a 3D image).
- An image may be a medical image of a subject collected by X-ray imaging, (cone-beam) computed tomography, magnetic resonance imaging (MRI), ultrasound, or any other medical imaging system known in the art. An image may also be provided in a non-medical context, for example by a remote sensing system, electron microscopy, and the like.
- The term 'image' refers both to an image that can be seen by the eye (e.g., displayed on a video screen) and to a digital representation of an image (e.g., a file corresponding to the pixel output of a CT or MRI detector).
- The 'DICOM' (Digital Imaging and Communications in Medicine) standard is the collective term for the various standards used for digital imaging and communication between medical devices, published by the committee formed by the ACR and NEMA.
- ACR American College of Radiology
- NEMA National Electrical Manufacturers Association
- PACS picture archiving and communication system
- A PACS is a system that stores, processes, and transmits images from digital medical imaging equipment in accordance with the DICOM standard.
- The terms 'learning' or 'training' refer to performing machine learning through procedural computing; those of ordinary skill in the art will appreciate that they are not intended to refer to mental processes such as human educational activity.
- The present invention covers all possible combinations of the embodiments indicated herein. It should be understood that the various embodiments of the present invention are different from one another but need not be mutually exclusive. For example, specific shapes, structures, and characteristics described herein in relation to one embodiment may be implemented in other embodiments without departing from the spirit and scope of the present invention. In addition, it should be understood that the location or arrangement of individual components in each disclosed embodiment may be changed without departing from the spirit and scope of the present invention. Accordingly, the detailed description below is not intended to be taken in a limiting sense, and the scope of the present invention, properly interpreted, is limited only by the appended claims together with the full range of equivalents to which the claims are entitled. Like reference numerals in the drawings refer to the same or similar functions across several aspects.
- FIG. 1 is a conceptual diagram schematically showing an exemplary configuration of a computing device that performs a method for providing lesion information according to the present invention.
- A computing device 100 includes a communication unit 110 and a processor 120, and may communicate directly or indirectly with an external computing device (not shown) through the communication unit 110.
- The computing device 100 may achieve the desired system performance by using a combination of typical computer hardware (e.g., devices that may include a computer processor, memory, storage, input devices and output devices, and other components of conventional computing devices; electronic communication devices such as routers and switches; electronic information storage systems such as network-attached storage (NAS) and storage area networks (SAN)) and computer software (i.e., instructions that allow the computing device to function in a specific way).
- NAS network-attached storage
- SAN storage area network
- The communication unit 110 of such a computing device can transmit and receive requests and responses to and from other computing devices with which it is linked.
- As an example, such requests and responses may be made over the same transmission control protocol (TCP) session.
- TCP transmission control protocol
- However, the present invention is not limited thereto; they may, for example, be transmitted and received as user datagram protocol (UDP) datagrams.
- In a broad sense, the communication unit 110 may also include a keyboard, a mouse, and other external input devices for receiving commands or instructions, as well as printers, displays, and other external output devices.
- The processor 120 of the computing device may include a hardware configuration such as a micro processing unit (MPU), a central processing unit (CPU), a graphics processing unit (GPU) or a tensor processing unit (TPU), a cache memory, and a data bus.
- MPU micro processing unit
- CPU central processing unit
- GPU graphics processing unit
- TPU tensor processing unit
- it may further include an operating system and a software configuration of an application that performs a specific purpose.
- The image of the present invention is a tooth image.
- The tooth image is exemplified here as a panoramic tooth X-ray image, but the scope of the present invention is not limited thereto; those of ordinary skill in the art will readily understand that the invention can be applied to tooth images of any general form and to the process of detecting an arbitrary lesion area included in a target image.
- FIG. 2 is an exemplary block diagram showing hardware or software components of a computing device that performs a method for providing lesion information according to the present invention.
- Those skilled in the art will understand that the individual modules shown in FIG. 2 may be implemented, for example, by the communication unit 110 or the processor 120 included in the computing device 100, or by the communication unit 110 and the processor 120 operating together.
- the computing device 100 may include an image acquisition module 210 as its constituent element.
- the image acquisition module 210 may acquire a tooth image of a subject, which is previously stored in a database or obtained from a device dedicated to photographing an image.
- The tooth image may be an X-ray panoramic image of the subject's teeth captured through a dedicated photographing device built into the computing device 100 or an external dedicated photographing device.
- the tooth image acquired through the image acquisition module 210 may be transmitted to the tooth lesion region detection module 220.
- The tooth lesion region detection module 220 may detect a lesion region included in the tooth image through a readout model.
- the readout model may be an artificial neural network that has been trained in advance to detect a lesion area in a tooth image.
- The readout model included in the tooth lesion region detection module 220 may be a deep-learning-based artificial neural network trained to output, from an input tooth image, at least one of the location of the lesion region, the shape of the lesion region, and the probability of corresponding to the lesion region.
- For example, the artificial neural network included in the tooth lesion region detection module 220 may determine, based on similarity, whether a region included in the input tooth image matches the detection target region (e.g., a bone loss region), and output the determination result.
- DNN Deep Neural Network
- CNN Convolutional Neural Network
- DCNN Deep Convolution Neural Network
- RNN Recurrent Neural Network
- RBM Restricted Boltzmann Machine
- DBN Deep Belief Network
- SSD Single Shot Detector
- YOLO You Only Look Once
- Those of ordinary skill in the art will understand that the type of artificial neural network that can be used in the tooth lesion region detection module 220 is not limited to the presented examples and may include any artificial neural network that can be trained to recognize the lesion region in a tooth image based on labeled training data.
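- As an illustration only, the following sketch shows a small PyTorch convolutional network of the kind that could serve as such a readout model, mapping a grayscale panoramic image to per-tooth lesion probabilities; the architecture, layer sizes, and the assumption of 32 output teeth are not taken from the patent.

```python
import torch
import torch.nn as nn

class ToothLesionReadoutModel(nn.Module):
    """Assumed example architecture: CNN features + sigmoid head giving one lesion probability per tooth."""
    def __init__(self, num_teeth: int = 32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_teeth)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) grayscale panoramic X-ray
        h = self.features(x).flatten(1)
        return torch.sigmoid(self.classifier(h))  # per-tooth lesion probabilities in [0, 1]
```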
- The detection result produced by the tooth lesion region detection module 220 may be transmitted to the visualization module 230, and the visualization module 230 may generate a map for visualizing the lesion region in the tooth image.
- For example, the visualization module 230 may generate a heat map in which the color of each region changes according to the probability that the region corresponds to a lesion region (e.g., a bone loss region).
- The way the visualization module 230 generates the map is not limited to the method described above; it will be apparent to those of ordinary skill in the art that any method capable of visually expressing the probability that each region of the tooth image corresponds to a lesion region may be used (e.g., the probability may be expressed numerically, or the contrast may be adjusted according to the probability of corresponding to the lesion area).
- The detection result produced by the tooth lesion region detection module 220 may also be transmitted to the classification module 240, and the classification module 240 may determine identification information of the tooth corresponding to the detected lesion region.
- The classification module 240 may generate identification information identifying the position or order of the individual teeth included in the tooth image, and may determine the identification information of the tooth in which the lesion area is detected based on the generated identification information and the detection result.
- The identification information for individual teeth may be determined based on the FDI World Dental Federation notation, the tooth numbering scheme most widely used by dentists around the world, but is not limited thereto and may be determined based on any scheme by which individual teeth can be identified.
- The classification module 240 determines, among the identification numbers generated for the individual teeth, the identification number of the tooth in which a lesion is detected, and based on the determination result generates matching information by matching each identification number with information on whether a lesion has occurred in that tooth.
- Depending on the implementation, the matching information may be generated, for example, as a list of the identification information of teeth determined to have lesions, or by matching the identification information of all teeth with whether a lesion has occurred in each.
- the tooth lesion information generation and transmission module 250 may generate tooth lesion information based on a map generated by the visualization module 230 or matching information generated through the classification module 240.
- the tooth lesion information may be determined based on at least one of visualization information generated by applying the map to a tooth image or information on a tooth list in which a lesion is detected based on the matching information.
- the tooth lesion information generated through the tooth lesion information generation and transmission module 250 may be stored in a database or may be provided to an external entity.
- The tooth lesion information generation and transmission module 250 may provide an output image reflecting the tooth lesion information to the external entity using a predetermined display device or the like, or through a provided communication unit.
- The external entity includes a user of the computing device 100, an administrator, and the medical professional in charge of the subject, but it should be understood that, beyond these, any entity that needs tooth lesion information for the input tooth image is included.
- the external entity may be an external AI device including a separate artificial intelligence (AI) hardware and/or software module that utilizes the tooth lesion information.
- AI artificial intelligence
- This is intended to suggest that the tooth lesion information, which is the result of the procedure performed by that hardware and/or software module, may be used as input data for other methods. In other words, the external entity may also be the computing device 100 itself.
- Although the components shown in FIG. 2 are illustrated as being realized in one computing device for convenience of description, it will be understood that a plurality of computing devices 100 performing the method of the present invention may be configured to interwork with each other.
- FIG. 3 is a flowchart illustrating a method of providing tooth lesion information according to an exemplary embodiment.
- the computing device may receive a tooth image of a subject through a communication unit in step S100.
- The tooth image may be a panoramic tooth X-ray image, but the present invention is not limited thereto and may include any form of image capturing the subject's teeth.
- The computing device may detect a lesion area included in the tooth image through the readout model in step S200.
- The readout model may include an artificial neural network pre-trained to detect a lesion area in a tooth image.
- The loss function for training the readout model may be determined based on a maxillary loss function for the maxillary teeth and a mandibular loss function for the mandibular teeth, in addition to the cross entropy function commonly used as a loss function.
- Here, the maxilla means the upper jaw included in the tooth image, and the mandible means the lower jaw included in the tooth image.
- The maxillary loss function means the loss function determined for the maxilla, and the mandibular loss function means the loss function determined for the mandible.
- individual loss functions for calculating the maxillary loss function and the mandibular loss function may be determined based on Equation 1.
- In Equation 1, L denotes the individual loss function; y is a value indicating whether a tooth lesion has occurred, with y_ij indicating whether a lesion of the j-th tooth of the i-th data sample has occurred; ŷ is the predicted value of the readout model for the occurrence of a tooth lesion, with ŷ_ij the predicted value for the j-th tooth of the i-th data sample; R is the Pearson correlation matrix, whose element R_jj' is the correlation coefficient between the j-th tooth and the j'-th tooth; c is the correlation parameter determined based on the Pearson correlation matrix, with c_ij the correlation parameter calculated for the j-th tooth of the i-th data sample; k is the number of teeth; N is the number of data samples; and σ(·) represents a function that normalizes a value to between 0 and 1.
- The correlation coefficient R_ab between the a-th tooth and the b-th tooth included in the Pearson correlation matrix of Equation 1 may be determined based on Equation 2 below.
- In Equation 2, R_ab is the correlation coefficient between the a-th tooth and the b-th tooth, and y_ia is a value indicating whether a lesion of the a-th tooth has occurred in the i-th training sample.
- Using the individual loss function of Equation 1, the maxillary loss function and the mandibular loss function may be calculated as shown in Equations 3 and 4.
- The final loss function used for training the readout model may then be calculated as in Equation 5.
- In Equation 5, L_CE denotes the cross entropy function and λ denotes an adjustable parameter.
- That is, the final loss function is determined based on the maxillary loss function and the mandibular loss function, each of which is determined based on the correlation between the teeth it covers. In practice, dental lesions often occur together in adjacent teeth, so the correlation between adjacent teeth with respect to lesion occurrence can be very high. Since the final loss function reflects the correlation between the teeth included in the maxilla and the mandible, a readout model trained with it can predict lesion areas more accurately than a model trained with a loss function that does not consider this correlation.
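- Since Equations 1-5 themselves are not reproduced in the text above, the following sketch shows only one plausible reading of the idea: a Pearson correlation matrix computed from per-tooth lesion labels, a per-jaw loss in which per-tooth binary cross entropy is re-weighted by a correlation-derived parameter normalized to [0, 1], and a final loss that adds the two jaw losses to a plain cross entropy term with an adjustable weight. The function names, normalization scheme, and exact weighting are assumptions, not the patented equations.

```python
import numpy as np
import torch
import torch.nn.functional as F

def pearson_correlation_matrix(labels: np.ndarray) -> np.ndarray:
    """Equation-2-style Pearson correlation between per-tooth lesion labels.
    labels: (num_samples, num_teeth) binary matrix of lesion occurrence."""
    return np.corrcoef(labels, rowvar=False)

def jaw_loss(pred: torch.Tensor, target: torch.Tensor, corr: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    """Per-jaw loss: per-tooth BCE re-weighted by an assumed correlation parameter.
    pred, target: (batch, num_teeth); corr: (num_teeth, num_teeth) correlation matrix as a tensor."""
    corr_param = torch.matmul(target, corr.abs())        # aggregate correlation with lesioned teeth
    corr_param = corr_param / (corr_param.max() + eps)   # normalize to [0, 1]
    bce = F.binary_cross_entropy(pred, target, reduction="none")
    return (corr_param * bce).mean()

def final_loss(pred_max, y_max, pred_man, y_man, corr_max, corr_man, alpha: float = 1.0) -> torch.Tensor:
    """Equation-5-style total: cross entropy plus weighted maxillary and mandibular jaw losses."""
    ce = F.binary_cross_entropy(torch.cat([pred_max, pred_man], dim=1),
                                torch.cat([y_max, y_man], dim=1))
    return ce + alpha * (jaw_loss(pred_max, y_max, corr_max) + jaw_loss(pred_man, y_man, corr_man))
```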
- the computing device may generate tooth lesion information on the detected lesion area in step S300 or support another device interlocked with the computing device to generate tooth lesion information.
- The tooth lesion information may be determined by at least one of visualization information in which the lesion area is visualized in the tooth image, and matching information in which the identification information of individual teeth is matched with whether or not a lesion has occurred.
- the computing device may generate a map for visualizing the lesion area detected in the tooth image based on the detection result in step S200.
- The computing device may determine a weight for each region of the tooth image based on the output of the last layer of the readout model, and generate a map that represents different visual elements according to the determined weights. For example, the computing device may create a heat map in which a region with a high weight in the output of the final layer, i.e., a region with a high probability of corresponding to a lesion area, is expressed in a color close to red, and a region with a low weight, i.e., a low probability of corresponding to a lesion area, is expressed in a color close to green.
- The computing device may then apply the generated heat map to the input tooth image to generate visualization information for the tooth image, in which regions with a high probability of corresponding to a lesion area are visualized in a color close to red and regions with a low probability are visualized in a color close to green.
- The map used to generate the visualization information is not limited to the heat map and color scheme given as examples; those of ordinary skill in the art will readily understand that any manner of visualizing the probability that each region corresponds to a lesion area may be implemented.
- For example, the computing device may visualize the lesion area of the input tooth image by rendering regions with a high probability of corresponding to a lesion area in dark contrast and regions with a low probability in light contrast.
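- A rough sketch of this visualization step follows, assuming a CAM-like activation map taken from the final layer of the model; the OpenCV colormap used here only approximates the red-for-likely, green-for-unlikely scheme described above, and the blending factor is an arbitrary choice.

```python
import cv2
import numpy as np

def overlay_lesion_heatmap(tooth_image: np.ndarray, activation_map: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """tooth_image: (H, W) grayscale X-ray; activation_map: (h, w) final-layer weights,
    higher where a lesion is assumed more likely. Returns a BGR overlay image."""
    heat = cv2.resize(activation_map.astype(np.float32),
                      (tooth_image.shape[1], tooth_image.shape[0]))
    heat = (heat - heat.min()) / (heat.max() - heat.min() + 1e-7)          # normalize to [0, 1]
    heat_color = cv2.applyColorMap((heat * 255).astype(np.uint8), cv2.COLORMAP_JET)
    base = cv2.cvtColor(tooth_image.astype(np.uint8), cv2.COLOR_GRAY2BGR)
    return cv2.addWeighted(heat_color, alpha, base, 1.0 - alpha, 0)        # blended visualization
```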
- In addition, the computing device may generate identification information identifying the position or order of the individual teeth included in the tooth image, and generate matching information that matches the identification information with information on whether a lesion has occurred in the individual tooth corresponding to that identification information.
- The identification information may be an identification number for each individual tooth according to the FDI World Dental Federation notation, and the computing device may generate matching information that matches information on whether a lesion has occurred in a tooth, based on the detection result of step S200, with the identification number of that tooth.
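- The matching step can be pictured with the small sketch below, which pairs each tooth's FDI two-digit identification number with a lesion flag derived from per-tooth probabilities; the 0.5 threshold and the ordering of the probabilities are assumptions for illustration.

```python
# Standard FDI World Dental Federation two-digit numbers, maxillary then mandibular.
FDI_NUMBERS = [
    18, 17, 16, 15, 14, 13, 12, 11, 21, 22, 23, 24, 25, 26, 27, 28,   # maxillary teeth
    48, 47, 46, 45, 44, 43, 42, 41, 31, 32, 33, 34, 35, 36, 37, 38,   # mandibular teeth
]

def build_matching_information(lesion_probabilities, threshold: float = 0.5) -> dict:
    """lesion_probabilities: iterable of per-tooth probabilities in the FDI order above."""
    return {fdi: prob >= threshold for fdi, prob in zip(FDI_NUMBERS, lesion_probabilities)}
```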
- the computing device may provide tooth lesion information to an external entity in step S400.
- FIG. 4 is a diagram illustrating an example in which a method of providing lesion information according to an exemplary embodiment is performed.
- The computing device 420 may provide tooth lesion information including visualization information 430 and matching information 440 for the input tooth image 410. To this end, the computing device 420 may detect lesion areas in the tooth image 410 through a pre-trained readout model. The computing device may provide the tooth lesion information by generating the visualization information 430, to which heat maps 431, 432, and 433 visualizing the detected lesion areas are applied, or the matching information 440, which matches the identification information of each tooth with whether a lesion has occurred, and by providing these to an external entity. Each of the heat maps 431, 432, and 433 may be expressed in different colors or different shades according to the probability of corresponding to a lesion area. The identification numbers of the teeth included in the matching information 440 may be determined according to the FDI World Dental Federation notation shown in the identification information 441.
- FIG. 5 is a diagram illustrating an example in which a Pearson correlation matrix is visualized according to an embodiment.
- The present invention can generate a lesion prediction model with higher accuracy by using the final loss function corresponding to Equation 5, in which the correlation between adjacent teeth is reflected, in the process of training the artificial neural network corresponding to the readout model.
- The matrices 510 and 520 are examples of visualizing the Pearson correlation matrix of the maxillary teeth and the Pearson correlation matrix of the mandibular teeth, respectively, calculated based on Equation 2. Each element of a matrix corresponds to the degree of correlation between a pair of teeth: an element visualized with a darker color (or darker shade) corresponds to a higher correlation, and an element visualized with a lighter color (or lighter shade) corresponds to a lower correlation. Referring to the matrices 510 and 520, it can be seen that the correlation between adjacent teeth is high, consistent with the observation above.
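- As an assumed sketch only, the Pearson correlation matrices of FIG. 5 could be visualized along the following lines, with darker cells marking more strongly correlated tooth pairs (for example, using a matrix computed by the pearson_correlation_matrix helper sketched earlier).

```python
import matplotlib.pyplot as plt
import numpy as np

def plot_correlation_matrix(corr: np.ndarray, title: str):
    """Render a per-jaw correlation matrix; darker cells indicate higher correlation."""
    fig, ax = plt.subplots()
    im = ax.imshow(np.abs(corr), cmap="Greys", vmin=0.0, vmax=1.0)
    ax.set_title(title)
    ax.set_xlabel("tooth index")
    ax.set_ylabel("tooth index")
    fig.colorbar(im, ax=ax)
    return fig
```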
- the hardware may include a general-purpose computer and/or a dedicated computing device, or a specific computing device or special features or components of a specific computing device.
- the processes may be realized by one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors or other programmable devices, with internal and/or external memory.
- Alternatively or in addition, the processes may be realized using an application specific integrated circuit (ASIC), a programmable gate array, programmable array logic (PAL), or any other device that can be configured to process electronic signals.
- ASICs application specific integrated circuits
- PAL programmable array logic
- the machine-readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
- the program instructions recorded on the machine-readable recording medium may be specially designed and configured for the present invention, or may be known to and usable by a person skilled in the computer software field.
- Examples of machine-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROM, DVD, and Blu-ray; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- Program instructions may be stored, compiled, or interpreted so as to be executed on any one of the aforementioned devices, as well as on heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software.
- the method and combinations of methods may be implemented as executable code that performs the respective steps.
- the method may be implemented as systems that perform the steps, and the methods may be distributed in several ways across devices or all functions may be integrated into one dedicated, standalone device or other hardware.
- the means for performing the steps associated with the processes described above may include any hardware and/or software described above. All such sequential combinations and combinations are intended to be within the scope of this disclosure.
- the hardware device may be configured to operate as one or more software modules to perform the processing according to the present invention, and vice versa.
- the hardware device may include a processor such as an MPU, CPU, GPU, or TPU that is combined with a memory such as ROM/RAM for storing program instructions and configured to execute instructions stored in the memory, and external devices and signals It may include a communication unit that can send and receive.
- the hardware device may include a keyboard, a mouse, and other external input devices for receiving commands written by developers.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- High Energy & Nuclear Physics (AREA)
- Radiology & Medical Imaging (AREA)
- Databases & Information Systems (AREA)
- Artificial Intelligence (AREA)
- Pulmonology (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physiology (AREA)
- Psychiatry (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
The present invention relates to a method for providing tooth lesion information performed by a computing device. The method for providing tooth lesion information, performed by a computing device, may comprise the steps of: (a) receiving an image of a tooth of a subject; (b) detecting, through a pre-trained readout model, a lesion area included in the tooth image, or supporting another device, linked to the computing device, so that it detects the lesion area; (c) generating tooth lesion information for the lesion area, or supporting another device, linked to the computing device, so that it generates the tooth lesion information; and (d) providing the lesion information to an external entity.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020190114578A KR102186709B1 (ko) | 2019-09-18 | 2019-09-18 | Method for providing tooth lesion information and device using the same |
| KR10-2019-0114578 | 2019-09-18 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021054700A1 true WO2021054700A1 (fr) | 2021-03-25 |
Family
ID=73776712
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2020/012448 Ceased WO2021054700A1 (fr) | 2020-09-15 | Method for providing tooth lesion information and device using the same |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR102186709B1 (fr) |
| WO (1) | WO2021054700A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN116725721A (zh) * | 2023-06-25 | 2023-09-12 | 先临三维科技股份有限公司 | Oral cavity detection method, apparatus, and device |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102568247B1 (ko) * | 2021-05-17 | 2023-08-21 | 오스템임플란트 주식회사 | Simulation method and apparatus for dental treatment, computer-readable recording medium, and computer program |
| CN115100210B (zh) * | 2022-08-29 | 2022-11-18 | 山东艾克赛尔机械制造有限公司 | Anti-counterfeiting identification method based on automobile parts |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20140091176A (ko) * | 2013-01-10 | 2014-07-21 | 삼성전자주식회사 | Lesion diagnosis apparatus and method |
| KR20160112493A (ko) * | 2015-03-19 | 2016-09-28 | (주)바텍이우홀딩스 | Method and system for efficiently predicting and detecting the occurrence of oral lesions in CT |
| KR20180045551A (ko) * | 2016-10-26 | 2018-05-04 | 고려대학교 산학협력단 | System and method for diagnosing oral lesions |
| KR101943011B1 (ko) * | 2018-01-22 | 2019-01-28 | 주식회사 뷰노 | Method for supporting reading of a medical image of a subject and apparatus using the same |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130122468A1 (en) * | 2010-05-13 | 2013-05-16 | Quantum Dental Technologies, Inc. | Method of processing and displaying oral health diagnostic data |
| KR101903424B1 (ko) | 2017-01-10 | 2018-11-13 | 한국광기술원 | Optical tomography imaging system-based 3D oral scanner and tooth condition diagnosis method using the same |
-
2019
- 2019-09-18 KR KR1020190114578A patent/KR102186709B1/ko active Active
-
2020
- 2020-09-15 WO PCT/KR2020/012448 patent/WO2021054700A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20140091176A (ko) * | 2013-01-10 | 2014-07-21 | 삼성전자주식회사 | Lesion diagnosis apparatus and method |
| KR20160112493A (ko) * | 2015-03-19 | 2016-09-28 | (주)바텍이우홀딩스 | Method and system for efficiently predicting and detecting the occurrence of oral lesions in CT |
| KR20180045551A (ko) * | 2016-10-26 | 2018-05-04 | 고려대학교 산학협력단 | System and method for diagnosing oral lesions |
| KR101943011B1 (ko) * | 2018-01-22 | 2019-01-28 | 주식회사 뷰노 | Method for supporting reading of a medical image of a subject and apparatus using the same |
Non-Patent Citations (1)
| Title |
|---|
| ZANELLA-CALZADA LAURA, GALVAN-TEJADA CARLOS, CHAVEZ-LAMAS NUBIA, RIVAS-GUTIERREZ JESUS, MAGALLANES-QUINTANAR RAFAEL, CELAYA-PADILL: "Deep Artificial Neural Networks for the Diagnostic of Caries Using Socioeconomic and Nutritional Features as Determinants: Data from NHANES 2013–2014", BIOENGINEERING, vol. 5, no. 2, 47, 18 June 2018 (2018-06-18), pages 1 - 20, XP055793073, ISSN: 2306-5354, DOI: 10.3390/bioengineering5020047 * |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN116725721A (zh) * | 2023-06-25 | 2023-09-12 | 先临三维科技股份有限公司 | Oral cavity detection method, apparatus, and device |
Also Published As
| Publication number | Publication date |
|---|---|
| KR102186709B1 (ko) | 2020-12-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR101898575B1 (ko) | Method for predicting the future state of a progressive lesion and apparatus using the same | |
| WO2019103440A1 (fr) | Method for supporting the reading of a medical image of a subject and device using the same | |
| WO2019143177A1 (fr) | Method for reconstructing a series of slice images and apparatus using the same | |
| KR102290799B1 (ko) | Method for providing tooth lesion information and apparatus using the same | |
| WO2023229415A1 (fr) | Method for providing an augmented reality image and augmented reality (AR) image providing device | |
| CN113516639B (zh) | Training method and apparatus for an oral abnormality detection model based on panoramic X-ray images | |
| WO2019208848A1 (fr) | Three-dimensional eyeball movement measurement method and automatic deep-learning-based dizziness diagnosis system | |
| WO2021054700A1 (fr) | Method for providing tooth lesion information and device using the same | |
| WO2021157966A1 (fr) | Method for providing orthodontic information using a deep learning artificial intelligence algorithm, and device using the same | |
| WO2021034138A1 (fr) | Dementia evaluation method and apparatus using the same | |
| WO2019143021A1 (fr) | Method for supporting image visualization and apparatus using the same | |
| WO2019143179A1 (fr) | Method for automatically detecting the same regions of interest between images of the same object taken at a time interval, and apparatus using the same | |
| Zhao et al. | Recognition and segmentation of teeth and mandibular nerve canals in panoramic dental X-rays by Mask RCNN | |
| WO2019231104A1 (fr) | Method for classifying images by means of a deep neural network and apparatus using said method | |
| WO2023027248A1 (fr) | Data generation method, and training method and apparatus using the same | |
| Xiong et al. | Simultaneous detection of dental caries and fissure sealant in intraoral photos by deep learning: a pilot study | |
| JPWO2020153471A1 (ja) | Estimation device, learning model, method for generating a learning model, and computer program | |
| WO2022131642A1 (fr) | Apparatus and method for determining disease severity on the basis of medical images | |
| CN103892802B (zh) | A novel intelligent tongue-image auxiliary device | |
| WO2019098415A1 (fr) | Method for determining whether a subject has developed cervical cancer, and device using said method | |
| WO2021145607A1 (fr) | Dental medical record device and dental medical record method therefor | |
| WO2019164277A1 (fr) | Method and device for evaluating bleeding by using a surgical image | |
| Kang et al. | Diagnostic accuracy of dental caries detection using ensemble techniques in deep learning with intraoral camera images | |
| WO2022231329A1 (fr) | Method and device for displaying tissue of a biological image | |
| WO2019124836A1 (fr) | Method for mapping a region of interest of a first medical image onto a second medical image, and device using the same | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20865841 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 20865841 Country of ref document: EP Kind code of ref document: A1 |