
US20130173175A1 - Method and apparatus for measuring biometrics of object - Google Patents


Info

Publication number
US20130173175A1
US20130173175A1 (U.S. application Ser. No. 13/734,217)
Authority
US
United States
Prior art keywords
biometrics
measuring
modeling
result
CRL
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13/734,217
Other versions
US9881125B2 (en)
Inventor
Hae-kyung JUNG
Hee-chul YOON
Hyun-taek LEE
Yong-Je Kim
Jae-hyun Kim
Myung-jin Eom
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: EOM, MYUNG-JIN, JUNG, HAE-KYUNG, KIM, JAE-HYUN, KIM, YONG-JE, LEE, HYUN-TAEK, YOON, Hee-chul
Publication of US20130173175A1 publication Critical patent/US20130173175A1/en
Application granted granted Critical
Publication of US9881125B2 publication Critical patent/US9881125B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G06F 19/26
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A61B 8/14: Echo-tomography
    • A61B 8/08: Clinical applications
    • A61B 8/0833: Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B 8/085: Clinical applications for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/0858: Clinical applications involving measuring tissue layers, e.g. skin, interfaces
    • A61B 8/0866: Clinical applications involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
    • A61B 8/46: Diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/467: Devices characterised by special input means
    • A61B 8/469: Special input means for selection of a region of interest
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Devices involving processing of medical diagnostic data
    • A61B 8/5223: Devices for extracting a diagnostic or physiological parameter from medical diagnostic data
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16B: BIOINFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
    • G16B 45/00: ICT specially adapted for bioinformatics-related data visualisation, e.g. displaying of maps or networks
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30: ICT for calculating health indices; for individual health risk assessment

Definitions

  • Apparatuses and methods consistent with exemplary embodiments relate to measuring biometrics of an object from an ultrasound image of the object.
  • Ultrasound systems have noninvasive and nondestructive characteristics and thus have been widely used in the medical field to obtain information about the internal portions of an object.
  • Ultrasound systems provide high-resolution images of an object in real time without requiring a surgical operation, and have thus drawn much attention in the medical field.
  • Ultrasound images have been used for early diagnosis to determine whether a fetus has a chromosomal or nervous-system defect, e.g., Down syndrome.
  • In order for a diagnostician to accurately measure biometrics of the fetus and diagnose its state by determining the location of the fetus with the naked eye, an image of the mid-sagittal plane of the fetus is detected, and a fetal crown-rump length (CRL), a nuchal translucency (NT), and an intracranial translucency (IT) of the fetus are measured based on the image.
  • A relative difference between the NT and the CRL, or between the IT and the CRL, i.e., a value calculated based on at least two biometrics, is used to diagnose the state of the fetus.
  • Exemplary embodiments may address at least the above problems and/or disadvantages and other disadvantages not described above. Also, exemplary embodiments are not required to overcome the disadvantages described above, and an exemplary embodiment may not overcome any of the problems described above.
  • One or more exemplary embodiments provide a method and apparatus for automatically measuring biometrics of an object by using an ultrasound image of the object.
  • According to an aspect of an exemplary embodiment, there is provided a method of measuring biometrics of an object, the method including receiving an image of the object, modeling the object such that at least one part of the object is identified, and measuring biometrics of the object based on a result of modeling the object.
  • The modeling of the object may include displaying a result of modeling the at least one part of the object in an oval shape, including a circular shape, and modifying the result of modeling the object based on a user input signal.
  • The measuring of the biometrics of the object may include determining whether the measured biometrics fall within a normal range; when it is determined that they do not, modeling the object again by estimating a case where the biometrics fall within the normal range; and measuring the biometrics of the object again based on the estimated modeling result.
  • The measuring of the biometrics of the object may include detecting a region-of-interest (ROI) for measuring the biometrics, based on the result of modeling the object, and measuring the biometrics in the ROI.
  • The detecting of the ROI may include displaying the detected ROI so that it is differentiated from the other parts of the object, displaying a region for measuring the biometrics in the ROI, and modifying the ROI or the region for measuring the biometrics according to a user input signal.
  • The modeling of the object may include modeling the object such that a body and a head of the object are identified.
  • The method may further include detecting at least one characteristic point on the head of the object, setting a central axis by using the at least one characteristic point and then displaying the central axis, measuring an angle between the body and head of the object with respect to the central axis, determining whether the angle falls within a normal range, and measuring a crown-rump length (CRL) of the object based on a result of the determining.
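The central-axis and angle steps above can be sketched in Python. This is an illustrative reconstruction, not the patent's disclosed algorithm: the landmark names (`crown`, `nose_tip`), the choice of vectors, and the 60-120 degree normal range are all assumptions made for the example.

```python
import math

def central_axis_angle(crown, nose_tip, body_center, head_center):
    """Estimate the head-body angle from modeled landmarks.

    All inputs are (x, y) tuples. `crown` and `nose_tip` are
    characteristic points on the head used to set the central axis;
    the body direction is taken from the head center to the body
    center. Returns the angle in degrees between the central axis
    and the head-to-body direction.
    """
    ax, ay = nose_tip[0] - crown[0], nose_tip[1] - crown[1]  # central axis
    bx, by = body_center[0] - head_center[0], body_center[1] - head_center[1]
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    # Clamp to [-1, 1] to guard against floating-point drift in acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def crl_is_reliable(angle_deg, normal_range=(60.0, 120.0)):
    """Hypothetical normal-range check applied before measuring the CRL."""
    lo, hi = normal_range
    return lo <= angle_deg <= hi
```

With a vertical central axis and a horizontal head-to-body direction, the angle comes out to 90 degrees and passes the assumed range check.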
  • The measuring of the CRL may include: measuring the CRL of the object based on the result of modeling the object when the angle falls within the normal range; when the angle does not fall within the normal range, estimating a result of modeling the object for the case where the angle falls within the normal range and then measuring the CRL of the object based on the estimated modeling result; and displaying the estimated modeling result and the measured CRL.
  • The measuring of the biometrics may include measuring a crown-rump length (CRL) of the object, measuring at least one of a nuchal translucency (NT) and an intracranial translucency (IT) of the object, calculating a relative difference between the CRL and the NT or the IT, and displaying the measured CRL, NT, and IT.
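The relative-difference calculation above reduces to simple ratios. A minimal sketch, assuming all lengths are in the same unit and that NT and IT are each optional (the text allows measuring either or both):

```python
def translucency_ratios(crl, nt=None, it=None):
    """Relative differences used for screening, e.g. NT/CRL and IT/CRL.

    `crl`, `nt`, and `it` are lengths in the same unit (e.g. mm).
    Returns only the ratios for which a measurement was supplied.
    """
    if crl <= 0:
        raise ValueError("CRL must be positive")
    ratios = {}
    if nt is not None:
        ratios["NT/CRL"] = nt / crl
    if it is not None:
        ratios["IT/CRL"] = it / crl
    return ratios
```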
  • According to an aspect of another exemplary embodiment, there is provided a terminal apparatus for measuring biometrics of an object, the terminal apparatus including a storage for storing an image of the object, and a controller.
  • The controller includes a modeler for modeling the object such that at least one part of the object is identified in the image of the object, and a measurer for measuring biometrics of the object based on a result of modeling the object.
  • The terminal apparatus may further include an input device for receiving a user input.
  • The modeler may modify the result of modeling the object based on a user input signal.
  • The storage may store biometrics data including information about a normal range of at least one of the biometrics. If the measured biometrics do not fall within the normal range, the modeler may model the object again by estimating a case where the biometrics fall within the normal range, and the measurer may measure the biometrics of the object again based on the estimated modeling result.
  • The controller may further include a calculator for calculating an error rate between a result of measuring the biometrics again and the biometrics measured using the previous result of modeling the object.
  • The measurer may detect a region-of-interest (ROI) for measuring the biometrics, based on the result of modeling the object, and measure the biometrics in the ROI.
  • The terminal apparatus may further include an output device for outputting the detected ROI so that it is differentiated from the other parts of the object, and for outputting a region for measuring the biometrics in the ROI.
  • The terminal apparatus may further include an input device for receiving a user input.
  • The measurer may modify the ROI according to a user input signal.
  • The modeler may model the object such that a body and a head of the object are identified.
  • The modeler may detect at least one characteristic point on the head of the object, and set a central axis by using the at least one characteristic point.
  • The storage may store biometrics data including information about a normal range of at least one of the biometrics.
  • The measurer may measure an angle between the body and head of the object, determine whether the angle falls within a normal range, and measure a crown-rump length (CRL) of the object based on a result of the determining.
  • The terminal apparatus may further include an output device for outputting a result of estimating a result of modeling the object for the case where the angle falls within the normal range, and a result of measuring the CRL of the object.
  • If the angle falls within the normal range, the measurer may measure the CRL of the object based on the result of modeling the object. If the angle does not fall within the normal range, the measurer may estimate a result of modeling the object for the case where the angle falls within the normal range, and measure the CRL of the object based on the estimated modeling result.
  • The measurer may measure the CRL of the object, and measure at least one of a nuchal translucency (NT) and an intracranial translucency (IT) of the object.
  • The controller may further include a calculator for calculating a relative difference between the CRL and the NT or IT.
  • The controller may provide a user interface via which, after modeling the object or extracting a region for measuring the biometrics of the object is performed automatically, whether the result of modeling the object or the extracted region is to be verified by a user is set according to a user input signal.
  • The controller may provide a user interface via which whether at least one of modeling the object, extracting a region for measuring the biometrics of the object, and estimating a result of modeling the object is to be performed automatically or manually is set according to a user input signal.
  • FIG. 1 is a block diagram of a terminal apparatus that measures biometrics of an object, according to an exemplary embodiment
  • FIG. 2 is a block diagram of a terminal apparatus that measures biometrics of an object, according to an exemplary embodiment
  • FIG. 3 is a flowchart illustrating a method of measuring biometrics of an object, according to an exemplary embodiment
  • FIG. 4 is a flowchart illustrating a method of measuring biometrics of an object, according to an exemplary embodiment
  • FIG. 5 is a flowchart illustrating a method of measuring a crown-rump length (CRL) of an object, according to an exemplary embodiment
  • FIG. 6 is a flowchart illustrating a method of measuring a nuchal translucency (NT) or an intracranial translucency (IT) of an object, according to an exemplary embodiment
  • FIG. 7 is a block diagram of a system that measures biometrics of an object, according to an exemplary embodiment
  • FIG. 8 is a block diagram of a service apparatus included in a system that measures biometrics of an object, according to an exemplary embodiment
  • FIG. 9 is a block diagram of a service apparatus included in a system that measures biometrics of an object, according to an exemplary embodiment
  • FIG. 10 is a flowchart illustrating a method of measuring biometrics of an object, according to an exemplary embodiment
  • FIG. 11 is a flowchart illustrating a method of measuring biometrics of an object, according to an exemplary embodiment
  • FIG. 12 is a flowchart illustrating a method of measuring a CRL of an object, according to an exemplary embodiment
  • FIG. 13 is a flowchart illustrating a method of measuring an NT or an IT of an object, according to an exemplary embodiment
  • FIGS. 14A and 14B illustrate examples of an ultrasound image of an object transmitted to a terminal apparatus or a service apparatus according to an exemplary embodiment
  • FIGS. 15A, 15B, and 15C illustrate images each showing a result of modeling an object and a result of measuring a CRL, IT, and NT of the object, according to an exemplary embodiment
  • FIG. 16 illustrates a user interface screen, according to an exemplary embodiment.
  • FIG. 1 is a block diagram of a terminal apparatus 100 that measures biometrics of an object, according to an exemplary embodiment.
  • The terminal apparatus 100 of FIG. 1 may be similar to a terminal apparatus 200 of FIG. 2, which will be described below.
  • Biometrics may include length information of a human body, for example, a crown-rump length (CRL), an intracranial translucency (IT), and a nuchal translucency (NT) of a fetus.
  • A state of an object may be diagnosed by measuring biometrics of the object, based on an image of the object.
  • The terminal apparatus 100 may include a storage 110 and a controller 120.
  • The terminal apparatus 100 may be included as an element of an image analysis apparatus included in a medical image diagnosis apparatus, e.g., an X-ray apparatus, an ultrasound apparatus, a computed tomography (CT) apparatus, or a magnetic resonance imaging (MRI) apparatus. Alternatively, the terminal apparatus 100 may be any of various apparatuses that a user uses, e.g., a personal computer (PC), a notebook computer, a mobile phone, a tablet PC, a navigation system, a smart phone, a personal digital assistant (PDA), a smart television (TV), a portable multimedia player (PMP), or a digital broadcasting receiver. In addition, the terminal apparatus 100 should be understood as encompassing all other such apparatuses that are currently available on the market or that are to be developed in the future.
  • The storage 110 stores data or a program for operating the terminal apparatus 100.
  • The storage 110 may store an operating system (OS) of the terminal apparatus 100, at least one application program, and an image of the object.
  • The image of the object may include an internal or external image of the object for measuring biometrics of the object, such as an ultrasound image, an MRI image, a CT image, or an X-ray image.
  • The storage 110 may be any of various storage media, e.g., a random access memory (RAM), a read-only memory (ROM), a hard disk drive (HDD), a flash memory, a compact disc (CD)-ROM, and a digital versatile disc (DVD).
  • The controller 120 controls operations of the terminal apparatus 100. Basically, the controller 120 operates based on the OS stored in the storage 110 to build a basic platform environment of the terminal apparatus 100, and runs an application program to provide a desired function according to a user's selection.
  • The controller 120 may perform control such that an image of the object is received from an external device (not shown) or the storage 110, the object is modeled to identify respective object regions based on the image of the object, biometrics of the object are measured based on a result of modeling the object, and the measured biometrics and the result of modeling the object are then output to an external display unit (not shown) or an output device (not shown) included in the terminal apparatus 100.
  • The controller 120 may include a modeler 121 and a measurer 122.
  • The modeler 121 models the object such that the respective regions of the object may be identified, based on the image of the object.
  • The object may be modeled in an oval shape, including a circular shape, but is not limited thereto. If the object is a fetus, the head and body of the fetus may be modeled in circular or oval shapes so as to be differentiated from each other, and a result of modeling the object may be output via an output device.
  • When the modeler 121 has modeled the object such that the respective regions of the object are identified, the measurer 122 measures biometrics of the object based on the modeling result. If the object is a fetus, a central axis may be set on the circular or oval shape with which the object is modeled, based on characteristic points of the head and body of the fetus. The CRL, NT, and IT of the fetus may then be measured based on the set central axis.
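The circular/oval modeling of a region can be sketched with a generic moments-based ellipse fit. This is a minimal illustration under assumed inputs (a list of (x, y) pixel coordinates belonging to a segmented head or body region); the patent does not disclose its actual fitting method.

```python
import math

def fit_ellipse(points):
    """Fit an oval (ellipse) to a set of (x, y) region points using
    second-order moments: centroid plus covariance eigen-decomposition.

    Returns (center, semi_major, semi_minor, angle_radians). For a
    uniformly filled ellipse, twice the square root of a covariance
    eigenvalue approximates the corresponding semi-axis length.
    """
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points) / n
    syy = sum((p[1] - cy) ** 2 for p in points) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    # Eigenvalues of the 2x2 covariance matrix via trace/determinant.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    angle = 0.5 * math.atan2(2 * sxy, sxx - syy)  # major-axis orientation
    return (cx, cy), 2 * math.sqrt(l1), 2 * math.sqrt(max(l2, 0.0)), angle
```

Displaying the fitted ellipses for the head and body, and letting the user drag their parameters, would correspond to the "display and modify the modeling result" steps described above.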
  • FIG. 2 is a block diagram of a terminal apparatus 200 that measures biometrics of an object, according to an exemplary embodiment.
  • The terminal apparatus 200 may include a storage 210, a controller 220, an input device 230, and an output device 240.
  • The storage 210 and the controller 220 correspond to the storage 110 and the controller 120, respectively, and are not described again here.
  • The present exemplary embodiment provides a method of measuring biometrics that can increase measurement accuracy by determining whether the biometrics fall within a normal range, i.e., a range pre-specified by a user based on certain criteria.
  • The storage 210 may store an image 211 of the object, and biometrics data 212.
  • The storage 210 may store the biometrics data 212, which includes information about the normal range of the biometrics, in order to determine whether measured biometrics fall within the normal range.
  • The controller 220 may include a modeler 221, a measurer 222, and a calculator 223.
  • If the measured biometrics do not fall within the normal range, the modeler 221 may model the object again by estimating a new model such that the biometrics fall within the normal range.
  • The measurer 222 then measures biometrics of the object again, based on the estimated modeling result.
  • The calculator 223 may calculate an error rate between the biometrics measured again by the measurer 222 and the previously measured biometrics.
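The error rate computed by the calculator 223 is not defined precisely in the text; a common convention, assumed here, is the relative error of the re-measured value against the original measurement, expressed as a percentage:

```python
def error_rate(remeasured, original):
    """Error rate between a re-measured biometric and the value
    measured from the previous modeling result, as a percentage.

    Relative error against the original measurement is assumed;
    the patent does not specify the exact definition.
    """
    if original == 0:
        raise ValueError("original measurement must be non-zero")
    return abs(remeasured - original) / abs(original) * 100.0
```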
  • The input device 230 is a unit that generates a user input signal for controlling or operating the terminal apparatus 200 under a user's manipulation, and may include various input devices.
  • The input device 230 may include at least one of a key input device, a touch input device, a gesture input device, a voice input device, and the like.
  • The key input device generates a signal corresponding to a key when the key is manipulated, and may be a keypad or a keyboard.
  • The touch input device recognizes a user input by sensing the user's touch on a particular part, and may be a touch pad, a touch screen, or a touch sensor.
  • The gesture input device senses a user's predetermined motion, e.g., shaking or moving a terminal, approaching the terminal, or blinking of the user's eyes, as a particular input signal, and may include at least one of a terrestrial magnetism sensor, an acceleration sensor, a camera, an altimeter, a gyro sensor, and a proximity sensor.
  • The output device 240 outputs a user interface for providing biometrics and a measurement result to a screen (not shown) of the terminal apparatus 200.
  • The output device 240 may be one of a liquid crystal display (LCD), a thin film transistor-LCD (TFT-LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, an active matrix organic light-emitting diode (AMOLED) display, a flexible display, and a three-dimensional (3D) display.
  • FIG. 3 is a flowchart illustrating a method 300 of measuring biometrics of an object, according to an exemplary embodiment.
  • The method 300 of FIG. 3 may be performed by the terminal apparatus 100 of FIG. 1 or the terminal apparatus 200 of FIG. 2.
  • The terminal apparatus 100 or 200 may receive an image of an object from an external storage device, or may read an image stored in the storage 110, to measure biometrics of the object according to a request from a user or a control signal (operation S301).
  • The receiving or reading of the image in operation S301 may be performed by the controller 120 or 220.
  • The object is modeled such that at least one part of the object is identified, based on the image of the object (operation S303). Operation S303 may be performed by the modeler 121 or 221.
  • Biometrics of the object may be measured based on a result of the modeling performed in operation S303 (operation S305).
  • Operation S305 may be performed by the measurer 122 or 222.
  • The result of modeling the object may be output to a user, and the user may view and modify the result of modeling the object.
  • FIG. 4 is a flowchart illustrating a method 400 of measuring biometrics of an object, according to an exemplary embodiment.
  • The method 400 of FIG. 4 may be performed by the terminal apparatus 200 of FIG. 2, as described in detail below.
  • The terminal apparatus 200 may receive an image of an object from an external storage device, or read the image 211 of the object stored in the storage 210, to measure biometrics of the object according to a request from a user or a control signal (operation S401).
  • The receiving of the image or the reading of the image 211 may be performed by the controller 220.
  • The object may be modeled such that at least one part of the object may be identified, based on the image 211 of the object (operation S403).
  • Operation S403 may be performed by the modeler 221.
  • A result of modeling the object may be output to the user via the output device 240, and the user may view and modify the result of modeling the object, i.e., the object model. If it is determined that the user has checked the result of modeling the object and requests a modification (operation S405), the result of modeling the object may be modified as requested by the user (operation S407).
  • The request to modify the result of modeling the object in operation S405 may be received via the input device 230, and operation S407 may be performed by the controller 220.
  • Biometrics of the object may then be measured based on the result of modeling the object (operation S410).
  • It is determined whether the measured biometrics fall within a normal range. If they do, the measured biometrics are output (operation S423). If the measured biometrics do not fall within the normal range, the object may be modeled again by estimating a case where the biometrics fall within the normal range (operation S415). Biometrics of the object are re-measured based on the estimated modeling result (operation S417). The biometrics measured again and the biometrics measured based on the previous result of modeling the object are compared to calculate an error rate therebetween (operation S420).
  • The terminal apparatus 200 may calculate and output data for diagnosing a state of the object, based on the measured biometrics. If the measured biometrics do not fall within the normal range, the accuracy of the measured biometrics may be determined to be low; thus, the data for diagnosing the state of the object may be calculated and output based on the biometrics measured from the estimated modeling result.
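The measure / range-check / re-estimate flow just described can be sketched as a small driver function. The callables, the dictionary keys, and the relative-error definition are illustrative assumptions; the operation numbers in the comments refer to the flow above.

```python
def measure_with_validation(model, measure, in_normal_range, reestimate):
    """Measure biometrics, check the normal range, and if the check
    fails, re-model by estimating an in-range case, re-measure, and
    report the error rate between the two measurements.

    `measure`, `in_normal_range`, and `reestimate` are caller-supplied
    callables; all names here are illustrative, not from the patent.
    """
    value = measure(model)
    if in_normal_range(value):
        return {"value": value, "estimated": False, "error_rate": 0.0}
    new_model = reestimate(model)            # re-model in-range (operation S415)
    new_value = measure(new_model)           # re-measure (operation S417)
    err = abs(new_value - value) / abs(value) * 100.0  # error rate (operation S420)
    return {"value": new_value, "estimated": True, "error_rate": err}
```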
  • FIG. 5 is a flowchart illustrating a method 500 of measuring a CRL of an object, according to an exemplary embodiment
  • To measure the CRL of the object, which is one of the biometrics, the object is modeled such that a head and a body of the object may be identified. According to whether an angle between the head and body of the object falls within a normal range, the modeling of the object may be estimated and performed again to increase the accuracy of the measured biometrics.
  • The terminal apparatus 200 of FIG. 2 may receive an image of an object from an external device, or may read the image 211 of the object stored in the storage 210, to measure biometrics of the object according to a request from a user or a control signal (operation S501).
  • The object may be modeled such that the head and body of the object may be identified, based on the image 211 of the object (operation S503).
  • A result of modeling the object may be output to a user via the output device 240, and the user may check the result of modeling the object.
  • In operation S505, it is determined whether the user requests to modify the result of modeling the object; if so, the result of modeling the object may be modified as requested by the user (operation S507).
  • A CRL of the object may be measured based on the result of modeling the object.
  • Characteristic points of the head and body of the object may be extracted, and a central axis may be set on the figure obtained by modeling the object, based on the extracted characteristic points.
  • An angle between the head and body of the object may be measured based on the central axis (operation S510).
  • The characteristic points may represent predetermined portions of the object, including at least one of the crown of the head, the palatine bones, and the tip of the nose of the object.
  • Depending on this angle, i.e., on the posture of the fetus, the CRL of the object may or may not be accurately measured.
  • The CRL may be measured to be too small when the fetus crouches to a large extent, and may be measured to be too large when the fetus stretches.
  • In such cases the measured CRL may not be appropriate for calculating a gestational age (GA), which is a value used for diagnosing a state of the fetus.
  • Whether the angle falls within the normal range may be determined based on information about the normal range of the angle included in the biometrics data 212 stored in the storage 210 of the terminal apparatus 200, thereby enabling the CRL to be measured accurately.
  • If the angle falls within the normal range, the CRL is measured using the result of modeling the object (operation S523). Otherwise, if the angle does not fall within the normal range, the object is modeled again by estimating a case where the angle falls within the normal range (operation S515).
  • The modeling of the object may be estimated and performed again by adjusting the figure obtained by modeling the object such that the central axis on the head or body is moved to one side.
  • The CRL is re-measured based on the estimated modeling result (operation S517).
  • A result of re-measuring the CRL based on the estimated modeling result and a result of measuring the CRL based on the previous result of modeling the object are compared to calculate an error rate therebetween (operation S520).
  • A GA, which is a value used for diagnosing a state of the fetus, may be calculated based on the CRL (operation S525), and may then be output via the output device 240 (operation S527).
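The patent does not state which CRL-to-GA relationship is used. One widely cited published formula is Robinson's, GA (days) = 8.052 × √(CRL in mm) + 23.73, used here purely as an example:

```python
import math

def gestational_age_days(crl_mm):
    """Estimate gestational age in days from CRL in millimetres.

    Uses the widely cited Robinson formula as an example; the patent
    itself does not specify a formula.
    """
    return 8.052 * math.sqrt(crl_mm) + 23.73
```

For a CRL of 55 mm this gives roughly 83 days, i.e. just under 12 weeks, which is within the usual first-trimester screening window.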
  • FIG. 6 is a flowchart illustrating a method 600 of measuring an NT or an IT of an object, according to an exemplary embodiment.
  • the IT or the NT of the object may be measured based on a result of modeling the object.
  • the object may be modeled such that a head and body of the object may be identified as described above, and the IT or NT of the object may then be measured.
  • measuring of the CRL may be optional.
  • a model of the object may be obtained as a result of operation S 507 and/or S 505 of FIG. 5 .
  • the location of the NT or IT of the object may be estimated based on the result of modeling the object.
  • the NT is located at the nape and may thus be estimated as a region in which the head and body intersect.
  • the IT is located in the skull and may thus be estimated to be located in a region in which a central point and a central axis on the head intersect.
  • a region-of-interest (ROI) in which the NT or IT may be measured may be indicated.
  • a region in which the NT or IT may be measured, i.e., the ROI, may be detected and output based on the result of modeling the object (operation S 607 ).
  • a user may check the output ROI and request to modify the ROI.
  • the controller 220 may measure the NT or IT in the ROI (operation S 613 ).
  • the controller 220 may modify the ROI based on the request from the user and may measure the NT or IT in the modified ROI (operation S 615 ).
  • the NT and IT are measured as lengths and may thus be displayed in the form of a line, together with the ROI.
  • a relative difference between the NT or IT and the CRL of the object is calculated (operation S 617 ).
  • An abnormality probability of the object may be calculated and output, based on the relative difference (operation S 620 ).
  • the relative difference may be expressed as NT/CRL or IT/CRL.
  • the CRL has to be measured to calculate the relative difference between the CRL and the NT or IT.
  • the CRL may be measured as described above.
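The relative difference described above (expressed as NT/CRL or IT/CRL) can be sketched as a simple ratio. The abnormality threshold below is purely illustrative, since the document does not specify how the abnormality probability is derived from the ratio:

```python
def relative_difference(measure_mm: float, crl_mm: float) -> float:
    """Relative difference between a biometric and the CRL (NT/CRL or IT/CRL)."""
    if crl_mm <= 0:
        raise ValueError("CRL must be positive")
    return measure_mm / crl_mm

def abnormality_flag(nt_mm: float, crl_mm: float, threshold: float = 0.05) -> bool:
    """Flag a measurement whose NT/CRL ratio exceeds a threshold.

    The 0.05 threshold is an assumption for illustration; the document
    does not say how the abnormality probability is computed.
    """
    return relative_difference(nt_mm, crl_mm) > threshold

# NT of 2.0 mm against a CRL of 60.0 mm gives a ratio of about 0.033.
print(relative_difference(2.0, 60.0), abnormality_flag(2.0, 60.0))
```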
  • FIG. 7 is a block diagram of a system that measures biometrics of an object, according to an exemplary embodiment.
  • the system may include a service apparatus 710 , a network 720 , and a terminal apparatus 730 .
  • biometrics of an object may be measured and a state of the object may be diagnosed in a computer-based manner: a device connected to the terminal apparatus 730 via the network 720 measures the biometrics of the object and diagnoses the state of the object, while only information is input to or output from the terminal apparatus 730 .
  • a device that measures the biometrics of the object and diagnoses the state of the object in response to a request from the terminal apparatus 730 may hereinafter be referred to as the service apparatus 710 .
  • the service apparatus 710 measures the biometrics of the object based on an image of the object received via the network 720 , and provides the terminal apparatus 730 with a result of the measuring and a result of diagnosing the state of the object based on the result of the measuring. More specifically, the object may be modeled such that at least one object portion may be identified in the image of the object, the biometrics of the object may be measured based on a result of modeling the object, a state of the object may be diagnosed according to a result of the measuring, and a result of the diagnosing may be provided to the terminal apparatus 730 .
  • the service apparatus 710 may provide a user interface via which the result of modeling the object and the measured biometrics may be provided to the terminal apparatus 730 so that a user may view, check, verify, and/or modify a process of measuring the biometrics of the object.
  • the service apparatus 710 may operate based on server-client computing or cloud computing and may include computer resources for measuring the biometrics of the object and diagnosing the state of the object, such as, for example, at least one of hardware and software.
  • the network 720 provides a path for exchanging data between the service apparatus 710 and the terminal apparatus 730 .
  • the network 720 is an Internet protocol (IP) network via which a data service and a service for transmitting/receiving large amounts of data are provided by using an IP.
  • the network 720 may be an all-IP network that is an IP network structure obtained by integrating different networks based on an IP.
  • the network 720 may include at least one of a wired network, a 3G mobile network including a wireless broadband (WiBro) network and a wideband code division multiple access (WCDMA) network, a 3.5G mobile network including a high-speed downlink packet access (HSDPA) network and a long-term evolution (LTE) network, a 4G mobile network including LTE-Advanced, a satellite network, and a wireless local area network (LAN) including a Wi-Fi network.
  • the terminal apparatus 730 performs an operation of outputting the result of measuring the biometrics of the object and the result of diagnosing the state of the object, performed by the service apparatus 710 , as described in detail below.
  • FIG. 8 is a block diagram of a service apparatus 800 included in a system that measures biometrics of an object, according to an exemplary embodiment.
  • the service apparatus 800 of FIG. 8 may be similar to the service apparatus 710 of FIG. 7 or to a service apparatus 900 of FIG. 9 .
  • the service apparatus 800 may include a communicator 810 , a storage 820 , and a service provider 830 .
  • the communicator 810 exchanges data with the terminal apparatus 730 of FIG. 7 via the network 720 of FIG. 7 .
  • the storage 820 stores data and a program for operating the service apparatus 800 .
  • the storage 820 may store an image of an object.
  • the image of the object may include an internal or external image of the object for measuring biometrics of the object, e.g., an ultrasound image, an MRI image, a CT image, or an X-ray image of the object.
  • the storage 820 may include various storage media, such as a RAM, a ROM, an HDD, a flash memory, a CD-ROM, and/or a DVD.
  • the service provider 830 may control the image of the object to be received from an external device (not shown) or the storage 820 .
  • the object may be modeled to identify at least one portion of the object, based on the image of the object.
  • Biometrics of the object may be measured based on a result of modeling the object, and the measured biometrics may then be output to an external display unit (not shown) or an output device.
  • the service provider 830 may include a modeler 831 and a measurer 832 .
  • the modeler 831 models the object such that respective regions of the object may be identified, based on the image of the object.
  • the object may be modeled in an oval shape including a circular shape, but is not limited thereto.
  • the fetus may be approximately divided into a head and a body, and the head and body of the fetus may be modeled in a circular or oval shape and then be provided to an output device (not shown).
  • the measurer 832 measures the biometrics of the object based on a result of modeling the object. If the object is a fetus, a central axis may be set on the circular or oval shape by using characteristic points of the head and body, and a CRL, NT, and IT of the fetus, which are biometrics, may be measured based on the set central axis.
  • FIG. 9 is a block diagram of a service apparatus 900 included in a system that measures biometrics of an object, according to an exemplary embodiment.
  • the service apparatus 900 may include a communicator 910 , a storage 920 , and a service provider 930 .
  • the communicator 910 , the storage 920 , and the service provider 930 correspond to the communicator 810 , the storage 820 , and the service provider 830 of FIG. 8 , respectively, and thus, repeated descriptions are not provided again.
  • the service apparatus 900 may provide a method of measuring biometrics of an object, which is capable of increasing the accuracy of biometrics by determining whether the biometrics fall within a normal range.
  • the storage 920 may store an image 921 and biometrics data 922 of an object.
  • the storage 920 stores the biometrics data 922 including information about a normal range of at least one biometric, thereby making it possible to determine whether measured biometrics fall within the normal range.
  • the service provider 930 may include a modeler 931 , a measurer 932 , and a calculator 933 .
  • if measured biometrics do not fall within the normal range, the modeler 931 models the object again such that the biometrics of the object may fall within the normal range.
  • the measurer 932 measures biometrics of the object again, based on a result of modeling the object again, performed by the modeler 931 .
  • the calculator 933 calculates an error rate between the biometrics measured again by the measurer 932 and the previously measured biometrics and provides the error rate to a user so that the user may determine the precision of the previously measured biometrics.
  • FIG. 10 is a flowchart illustrating a method 1000 of measuring biometrics of an object, according to an exemplary embodiment.
  • a terminal apparatus 730 may receive an image of an object from an external device or may read an image stored in a storage to measure biometrics of the object, according to a request from a user or a control signal (operation S 1001 ).
  • the terminal apparatus 730 may transmit the image of the object to a service apparatus 800 to request to measure biometrics of the object (operation S 1003 ).
  • the image of the object may be stored in the service apparatus 800 .
  • the terminal apparatus 730 may request the service apparatus 800 to measure the biometrics of the object stored in the service apparatus 800 and provide the terminal apparatus 730 with a result of the measuring.
  • the service apparatus 800 may model the object such that at least one part of the object may be identified, based on the image of the object (operation S 1005 ).
  • the service apparatus 800 may measure biometrics of the object based on a result of modeling the object (operation S 1007 ).
  • the result of modeling the object and the measured biometrics may be transmitted to the terminal apparatus 730 (operation S 1010 ).
  • the result of modeling the object and the measured biometrics may be output to the user via the terminal apparatus 730 (operation S 1013 ).
  • the user may view and modify the result of modeling the object and the measured biometrics.
  • FIG. 11 is a flowchart illustrating a method 1100 of measuring biometrics of an object, according to an exemplary embodiment.
  • a terminal apparatus 730 may receive an image of an object from an external device or may read an image stored in a storage to measure biometrics of the object, according to a request from a user or a control signal (operation S 1101 ).
  • the terminal apparatus 730 may transmit the image to a service apparatus 900 to request the service apparatus 900 to model the object in order to measure biometrics of the object (operation S 1103 ).
  • the image of the object may be stored in the service apparatus 900 .
  • the terminal apparatus 730 may request the service apparatus 900 to model the image of the object stored in the service apparatus 900 .
  • the object may be modeled such that at least one part of the object may be identified, based on the image (operation S 1105 ).
  • a result of modeling the object may be transmitted to the terminal apparatus 730 (operation S 1107 ).
  • the result of modeling the object may be output to the user via the output device (not shown) in the terminal apparatus 730 (operation S 1110 ).
  • When the user views the result of modeling the object and requests the service apparatus 900 to modify the result of modeling the object, via an input device (not shown) of the terminal apparatus 730 (operations S 1113 and S 1115 ), the result of modeling the object may be modified as requested by the user (operation S 1117 ).
  • the service apparatus 900 may be requested to measure biometrics of the object (operation S 1120 ). Then, the service apparatus 900 may measure biometrics of the object, based on the result of modeling the object (operation S 1123 ).
  • Whether the measured biometrics fall within a normal range may be determined based on the biometrics data 922 stored in the storage 920 of the service apparatus 900 (operation S 1125 ). If the measured biometrics do not fall within the normal range, the precision of the measured biometrics may be determined to be low.
  • the measured biometrics may be transmitted to the terminal apparatus 730 (operation S 1127 ).
  • the measured biometrics may be output to the user via the output device (not shown) in the terminal apparatus 730 (operation S 1140 ).
  • the object is modeled again by estimating a case where biometrics of the object fall within the normal range (operation S 1130 ).
  • the biometrics of the object are measured again based on a result of modeling the object again (operation S 1133 ).
  • the measured biometrics and the biometrics measured based on the previous result of modeling the object may be compared to calculate an error rate therebetween (operation S 1135 ).
  • the service apparatus 900 may calculate data for diagnosing a state of the object from the measured biometrics and provide the data to the terminal apparatus 730 . However, if the measured biometrics do not fall within the normal range, the precision of the measured biometrics may be determined to be low. Thus, the data for diagnosing the state of the object may be calculated from the biometrics measured based on the estimated modeling result, and then be provided to the terminal apparatus 730 .
  • data related to the biometrics of the object including the result of modeling the object, the measured biometrics, the error rate, and the like, may be transmitted to the terminal apparatus 730 (operation S 1137 ).
  • the data may be controlled to be output by the terminal apparatus 730 (operation S 1140 ).
  • FIG. 12 is a flowchart illustrating a method 1200 of measuring a CRL of an object, according to an exemplary embodiment.
  • In order to measure a CRL, which is one of the biometrics of an object, the object may be modeled such that a body and head of the object may be identified, and may be modeled again according to whether an angle between the body and head falls within a normal range.
  • a terminal apparatus 730 may receive an image of an object from an external device or may read an image from a storage to measure biometrics of the object, according to a request from a user or a control signal (operation S 1201 ).
  • the terminal apparatus 730 may transmit the image to the service apparatus 900 to request the service apparatus 900 to model the object to measure biometrics of the object (operation S 1203 ).
  • the image of the object may be stored in the service apparatus 900 .
  • the terminal apparatus 730 may request the service apparatus 900 to model the image of the object stored in the service apparatus 900 and provide a result of modeling the object.
  • the object may be modeled such that a body and head of the object may be identified, based on the image of the object (operation S 1205 ).
  • a result of modeling the object may be transmitted to the terminal apparatus 730 (operation S 1207 ).
  • the result of modeling the object may be output to a user via the output device of the terminal apparatus 730 (operation S 1210 ).
  • When the user views the result of modeling the object and requests the service apparatus 900 to modify the result of modeling the object, via the input device of the terminal apparatus 730 (operations S 1213 and S 1215 ), the result of modeling the object may be modified as requested by the user (operation S 1217 ).
  • the service apparatus 900 may measure biometrics of the object, based on the result of modeling the object (operation S 1223 ).
  • a CRL of the object may be measured based on the result of modeling the object.
  • a GA may be calculated from the CRL.
  • characteristic points on the head and body of the object may be extracted, a central axis may be set on a figure, i.e., an object model, obtained by modeling the object, based on the extracted characteristic points, and then, biometrics of the object may be measured.
  • an angle between the body and head of the object and the CRL of the object may be measured based on the central axis.
  • the CRL of the object may be accurately measured.
  • the CRL may be measured to be too small when the fetus crouches down to a large extent, and too large when the fetus stretches.
  • the measured CRL may not be appropriate for calculating a GA, which is a value for diagnosing a state of the fetus.
  • whether the angle between the head and body of the object falls within the normal range may be determined based on information about the normal range of this angle, included in the biometrics data 922 stored in the storage 920 of the service apparatus 900 (operation S 1225 ), thereby enabling the CRL to be accurately measured.
  • the object is modeled again by estimating a case where the angle falls within the normal range (operation S 1227 ).
  • modeling of the object may be estimated and performed again by controlling a figure obtained by modeling the object such that the central axis on the head or body may be moved to a side and the angle may thus fall within the normal range.
  • a CRL of the object may be measured again based on a result of modeling the object again, and the measured CRL and the CRL measured based on the previous result of modeling the object may be compared to calculate an error rate therebetween (operation S 1230 ).
  • a GA for diagnosing a state of a fetus may be calculated from the CRL (operation S 1233 ).
  • the CRL and/or GA may be transmitted to the terminal apparatus 730 (operation S 1235 ) and may be output via the output device of the terminal apparatus 730 (operation S 1237 ).
  • FIG. 13 is a flowchart illustrating a method 1300 of measuring an NT or an IT of an object, according to an exemplary embodiment.
  • an IT or NT of an object may be measured based on a result of modeling the object.
  • the object may be modeled such that the head and body of the object may be identified according to an exemplary embodiment as described above, and the IT or NT of the object may then be measured based on a result of modeling the object.
  • Measuring a CRL may be optionally performed. For example, in operation S 1337 , the CRL may be received as an output of the operation S 1237 of FIG. 12 .
  • locations of the NT or IT of the object may be estimated based on a result of modeling the object.
  • the NT is located at the nape and may thus be estimated as a region in which the head and body intersect.
  • the IT is located in the skull and may thus be estimated to be located in a region in which a central point on the head and a central axis on the head intersect.
  • An ROI in which the NT or IT may be measured may be indicated.
  • a terminal apparatus 730 requests a service apparatus 900 to measure an NT or IT of an object and provide a result of the measuring, according to a request from a user of the terminal apparatus 730 or a control signal (operation S 1301 ).
  • the service apparatus 900 sets a region in which the NT or IT is to be measured, i.e., an ROI, based on a result of modeling the object (operation S 1303 ).
  • information about the set ROI may be transmitted to the terminal apparatus 730 (operation S 1305 ).
  • the ROI may be displayed on the terminal apparatus 730 (operation S 1307 ).
  • when the user requests a modification, the ROI is modified as requested by the user (operation S 1315 ).
  • the NT or IT may be requested to be measured in the modified ROI (operation S 1317 ), and then be measured in the modified ROI (operation S 1320 ).
  • the NT and IT are measured as lengths and may thus be displayed in the form of a line, together with the ROI.
  • a relative difference between the NT or IT and the CRL may be calculated (operation S 1323 ).
  • An abnormality probability of the object may be calculated using the relative difference and then be provided to the terminal apparatus 730 (operation S 1325 ).
  • the relative difference may be expressed as NT/CRL or IT/CRL.
  • the CRL has to be measured to calculate the relative difference between the CRL and the NT or IT.
  • the measured NT and/or IT, and the relative difference between the NT or IT and the CRL may be transmitted to the terminal apparatus 730 (operation S 1327 ), and may then be output via the output device of the terminal apparatus 730 (operation S 1330 ).
  • FIGS. 14A and 14B illustrate examples of an ultrasound image of an object transmitted to a terminal apparatus or a service apparatus according to an exemplary embodiment.
  • FIG. 14A illustrates an example of an ultrasound image 1 of a fetus, received, for example, by the controller 120 or 220 of FIG. 1 or 2 or the service provider 830 or 930 of FIG. 8 or 9 .
  • the ultrasound image 1 includes a cross-section of the fetus, based on which biometrics of the fetus may be measured.
  • biometrics of the fetus may be measured by extracting portions 3 and 4 of an ultrasound image 2 of the fetus.
  • An object illustrated in FIG. 15 may be the same as the ultrasound image of FIG. 14A or 14B or may be obtained by extracting a part of the ultrasound image 1 or 2 of FIG. 14A or 14B.
  • FIGS. 15A to 15C illustrate examples of modeling an object and measuring a CRL, IT, and NT of the object, according to exemplary embodiments
  • FIG. 15A illustrates an example of a result of modeling the fetus and a result of measuring a CRL of the fetus displayed on a screen 88 .
  • the fetus may be modeled such that the head and body are identified as a circular shape 10 and an oval shape 20 , respectively; characteristic points on the head and body may be extracted; and central axes 11 , 12 , and 21 may then be set and indicated based on the characteristic points.
  • a CRL 30 may be automatically measured and displayed between points 84 and 86 of the object model, based on the central axes 11 , 12 , and 21 .
  • a user may select desired points 90 , 92 of the fetus to be measured and may manually measure a CRL 40 between the points 90 , 92 .
  • An angle 50 between the head and body may also be measured with respect to the central axes 11 , 12 , and 21 . Whether the angle 50 falls within a normal range may be determined.
  • FIG. 15B illustrates an example of a result of modeling the fetus and a result of measuring a CRL of the fetus when an angle between the head and body does not fall within a normal range.
  • the object is modeled by estimating a case where the angle between the head and body falls within the normal range. Referring to FIG. 15B , if the angle falls outside the normal range, the result of modeling the head is moved toward the result of modeling the body (as indicated by a line 15 ) so that the angle between the head and body falls within the normal range, and a CRL of the fetus (as indicated by a line 60 ) is measured using the adjusted model, in which the angle 51 falls within the normal range.
  • FIG. 15C illustrates an example of a result of modeling the fetus and a result of measuring an IT and NT of the fetus.
  • the NT and IT may be measured by setting regions on portions of the modeled object as ROIs.
  • a region around a central point 98 on the head may be set as a first ROI 70 of the IT.
  • a region 99 in which the head and body contact each other may be set as a second ROI 80 of the NT.
  • the first ROI 70 and the second ROI 80 may be displayed as expanded regions.
  • parts of the ROIs 70 and 80 in which the IT and NT are to be measured may be displayed as lines 71 and 81 .
  • a user may view displayed information and may directly modify the ROIs 70 and 80 or the parts of the ROIs 70 and 80 in which the IT and NT are to be measured.
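The ROI placement described above can be sketched as axis-aligned boxes around the estimated IT and NT locations in the object model; the coordinates and box sizes below are illustrative assumptions:

```python
def roi_box(center, half_width, half_height):
    """Axis-aligned ROI as (x_min, y_min, x_max, y_max) around a center point."""
    cx, cy = center
    return (cx - half_width, cy - half_height, cx + half_width, cy + half_height)

# Assumed model points: the IT ROI sits around the head's central point,
# and the NT ROI around the point where the head and body models meet.
head_center = (0.0, 5.0)
nape_point = (0.0, 0.0)

it_roi = roi_box(head_center, 2.0, 2.0)
nt_roi = roi_box(nape_point, 3.0, 1.5)
print(it_roi, nt_roi)
```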
  • FIG. 16 illustrates a screen image via which it may be set whether an object is to be modeled, whether biometrics of the object are to be automatically measured, and whether the measured biometrics are to be verified by a user after the measurement.
  • a user interface 1400 may be provided via which the biometrics of the object may be set to be measured after a user verifies a result of modeling the object, an NT measuring region, and an IT measuring region. If the biometrics are set to be measured after such verification, the result of modeling the object, the NT measuring region, and the IT measuring region may be automatically set and displayed; whether these displayed items are to be modified may be determined according to the user's verification; and the biometrics of the object may then be measured.
  • the user interface of FIG. 16 may be provided so that a user may determine whether biometrics of the object, such as a CRL, an NT, and an IT, are to be automatically or manually measured.
  • if the biometrics of the object are determined to be automatically measured, the biometrics are measured by setting measuring regions based on a result of modeling the object.
  • if the biometrics of the object are determined to be manually measured, the biometrics are measured by the user manually modeling the object or setting the measuring regions. For example, when the biometrics are measured as lengths of portions of the object, the user may place dots on an image of the object to measure the lengths.
  • Exemplary embodiments can be embodied as software codes that may be read by a computer (including various devices capable of processing information), in a computer-readable recording medium.
  • the computer-readable recording medium may be any recording apparatus capable of storing data that is read by a computer system, e.g., a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and so on.


Abstract

A method for measuring biometrics of an object includes receiving an image of the object, modeling the object to identify a portion of the object, and measuring biometrics of the object based on a result of modeling the object.

Description

    CROSS-REFERENCE TO RELATED PATENT APPLICATION
  • This application claims priority from Korean Patent Application No. 10-2012-0001150, filed on Jan. 4, 2012, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
  • BACKGROUND
  • 1. Field
  • Apparatuses and methods consistent with exemplary embodiments relate to measuring biometrics of an object from an ultrasound image of the object.
  • 2. Description of the Related Art
  • Ultrasound systems have noninvasive and nondestructive characteristics and thus have been widely used in the medical field to obtain information about the internal portions of an object. The ultrasound systems provide a high-resolution image of an object in real time without a need to perform a surgical operation. Thus, the ultrasound systems have drawn much attention in the medical field.
  • Ultrasound images have been used for early diagnosis to determine whether a fetus has a defect in its chromosomes or nervous system, e.g., Down syndrome. In order for a diagnostician to accurately measure biometrics of the fetus and diagnose a state of the fetus by determining the location of the fetus with the naked eye, an image of a mid-sagittal plane of the fetus is detected, and a fetal crown-rump length (CRL), a nuchal translucency (NT), and an intracranial translucency (IT) of the fetus are measured based on the image.
  • Although biometrics, such as the CRL, NT, and IT, are individually measured and output, a relative difference between the NT and the CRL or the IT and the CRL, i.e., a value calculated based on at least two biometrics is used to diagnose the state of the fetus. Thus, there is a need to automatically provide a user with a value calculated based on a result of integrating the biometrics, such as the CRL, NT, and IT, and a result of diagnosing the fetus based on the calculated value so that the user may easily diagnose and determine the state of the fetus.
  • SUMMARY
  • Exemplary embodiments may address at least the above problems and/or disadvantages and other disadvantages not described above. Also, exemplary embodiments are not required to overcome the disadvantages described above, and an exemplary embodiment may not overcome any of the problems described above.
  • One or more exemplary embodiments provide a method and apparatus for automatically measuring biometrics of an object by using an ultrasound image of the object.
  • According to an aspect of an exemplary embodiment, there is provided a method of measuring biometrics of an object, the method including receiving an image of the object, modeling the object such that at least one part of the object is identified, and measuring biometrics of the object, based on a result of modeling the object.
  • The modeling of the object may include displaying a result of modeling the at least one part of the object in an oval shape including a circular shape, and modifying the result of modeling the object, based on a user input signal.
  • The measuring of the biometrics of the object may include determining whether the measured biometrics fall within a normal range, modeling the object again by estimating a case where the biometrics fall within the normal range when it is determined that the measured biometrics do not fall within the normal range, and measuring the biometrics of the object again, based on the estimated modeling result.
  • The measuring of the biometrics of the object may include detecting a region-of-interest (ROI) for measuring the biometrics, based on the result of modeling the object; and measuring the biometrics in the ROI.
  • The detecting of the ROI may include displaying the detected ROI to be differentiated from the other parts of the object, displaying a region for measuring the biometrics, in the ROI, and modifying the ROI or the region for measuring the biometrics, according to a user input signal.
  • The modeling of the object may include modeling the object such that a body and head of the object are identified.
• After the modeling of the object, the method may further include detecting at least one characteristic point on the head of the object, setting a central axis by using the at least one characteristic point and then displaying the central axis, measuring an angle between the body and head of the object with respect to the central axis, determining whether the angle falls within a normal range, and measuring a crown-rump length (CRL) of the object, based on a result of the determining.
• The measuring of the CRL may include measuring the CRL of the object based on the result of modeling the object when the angle falls within the normal range; when the angle does not fall within the normal range, estimating a result of modeling the object in a case where the angle falls within the normal range and then measuring the CRL of the object based on the estimated modeling result; and displaying the estimated modeling result and the measured CRL.
  • The measuring of the biometrics may include measuring a crown-rump length (CRL) of the object, measuring at least one among a nuchal translucency (NT) and an intracranial translucency (IT) of the object, calculating a relative difference between the CRL and the NT or the IT, and displaying the measured CRL, NT, and IT.
  • According to another aspect of an exemplary embodiment, there is provided a terminal apparatus for measuring biometrics of an object, the terminal apparatus including a storage for storing an image of the object, and a controller. The controller includes a modeler for modeling the object such that at least one part of the object is identified in the image of the object; and a measurer for measuring biometrics of the object, based on a result of modeling the object.
  • The terminal apparatus may further include an input device for receiving a user input. The modeler may modify the result of modeling the object, based on a user input signal.
• The storage may store biometrics data including information about a normal range of at least one of the biometrics. If the measured biometrics do not fall within the normal range, the modeler may model the object again by estimating a case where the biometrics fall within the normal range, and the measurer may measure the biometrics of the object again, based on the estimated modeling result.
  • The controller may further include a calculator for calculating an error rate between a result of measuring the biometrics again and the biometrics measured using the result of modeling the object.
• The measurer may detect a region-of-interest (ROI) for measuring the biometrics, based on the result of modeling the object, and measure the biometrics in the ROI. The terminal apparatus may further include an output device for outputting the detected ROI to be differentiated from the other parts of the object, and outputting a region for measuring the biometrics in the ROI.
  • The terminal apparatus may further include an input device for receiving a user input. The measurer may modify the ROI according to a user input signal.
  • The modeler may model the object such that a body and head of the object are identified.
  • The modeler may detect at least one characteristic point on the head of the object, and set a central axis by using the at least one characteristic point.
• The storage may store biometrics data including information about a normal range of at least one of the biometrics. The measurer may measure an angle between the body and head of the object, determine whether the angle falls within a normal range, and measure a crown-rump length (CRL) of the object, based on a result of the determining. The terminal apparatus may further include an output device for outputting a result of estimating a result of modeling the object when the angle falls within the normal range and a result of measuring the CRL of the object.
  • If the angle falls within the normal range, the measurer may measure the CRL of the object based on the result of modeling the object. If the angle does not fall within the normal range, the measurer may estimate a result of modeling the object when the angle falls within the normal range, and measure the CRL of the object based on the estimated modeling result.
  • The measurer may measure the CRL of the object, and measure at least one among a nuchal translucency (NT) and an intracranial translucency (IT) of the object. The controller may further include a calculator for calculating a relative difference between the CRL and the NT or IT.
• The controller may provide a user interface via which, after the modeling of the object or the extraction of a region for measuring the biometrics of the object is automatically performed, whether the result of modeling the object or the extracted region is to be verified by a user is set according to a user input signal.
  • The controller may provide a user interface via which whether at least one among modeling the object, extracting a region for measuring the biometrics of the object, and estimating a result of modeling the object is to be performed automatically or manually, is set according to a user input signal.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and/or other aspects will become more apparent by describing certain exemplary embodiments with reference to the accompanying drawings, in which:
• FIG. 1 is a block diagram of a terminal apparatus that measures biometrics of an object, according to an exemplary embodiment;
  • FIG. 2 is a block diagram of a terminal apparatus that measures biometrics of an object, according to an exemplary embodiment;
  • FIG. 3 is a flowchart illustrating a method of measuring biometrics of an object, according to an exemplary embodiment;
  • FIG. 4 is a flowchart illustrating a method of measuring biometrics of an object, according to an exemplary embodiment;
  • FIG. 5 is a flowchart illustrating a method of measuring a crown-rump length (CRL) of an object, according to an exemplary embodiment;
  • FIG. 6 is a flowchart illustrating a method of measuring a nuchal translucency (NT) or an intracranial translucency (IT) of an object, according to an exemplary embodiment;
  • FIG. 7 is a block diagram of a system that measures biometrics of an object, according to an exemplary embodiment;
  • FIG. 8 is a block diagram of a service apparatus included in a system that measures biometrics of an object, according to an exemplary embodiment;
  • FIG. 9 is a block diagram of a service apparatus included in a system that measures biometrics of an object, according to an exemplary embodiment;
  • FIG. 10 is a flowchart illustrating a method of measuring biometrics of an object, according to an exemplary embodiment;
  • FIG. 11 is a flowchart illustrating a method of measuring biometrics of an object, according to an exemplary embodiment;
  • FIG. 12 is a flowchart illustrating a method of measuring a CRL of an object, according to an exemplary embodiment;
  • FIG. 13 is a flowchart illustrating a method of measuring an NT or an IT of an object, according to an exemplary embodiment;
  • FIGS. 14A and 14B illustrate examples of an ultrasound image of an object transmitted to a terminal apparatus or a service apparatus according to an exemplary embodiment;
  • FIGS. 15A, 15B, and 15C illustrate images each showing a result of modeling an object and a result of measuring a CRL, IT, and NT of the object, according to an exemplary embodiment; and
  • FIG. 16 illustrates a user interface screen, according to an exemplary embodiment.
  • DETAILED DESCRIPTION
  • Certain exemplary embodiments are described in greater detail below, with reference to the accompanying drawings.
  • In the following description, like drawing reference numerals are used for the like elements, even in different drawings. The matters defined in the description, such as detailed construction and elements, are provided to assist in a comprehensive understanding of exemplary embodiments. However, exemplary embodiments can be practiced without those specifically defined matters. Also, well-known functions or constructions are not described in detail since they would obscure the application with unnecessary detail.
  • Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein. Although few exemplary embodiments are described, it would be appreciated by those of ordinary skill in the art that changes may be made in these exemplary embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.
• As used herein, ‘at least one of,’ when preceding a list of elements, modifies the entire list of elements and does not modify the individual elements of the list.
  • FIG. 1 is a block diagram of a terminal apparatus 100 that measures biometrics of an object, according to an exemplary embodiment. The terminal apparatus 100 of FIG. 1 may be similar to a terminal apparatus 200 of FIG. 2 which will be described below.
  • Biometrics may include length information of a human body, for example, a crown-rump length (CRL), an intracranial translucency (IT), and a nuchal translucency (NT) of a fetus. According to an exemplary embodiment, a state of an object may be diagnosed by measuring biometrics of the object, based on an image of the object.
  • Referring to FIG. 1, the terminal apparatus 100 according to an exemplary embodiment may include a storage 110 and a controller 120.
• The terminal apparatus 100 may be included as an element of an image analysis apparatus included in a medical image diagnosis apparatus, e.g., an X-ray apparatus, an ultrasound apparatus, a computed tomography (CT) apparatus, or a magnetic resonance imaging (MRI) apparatus. Alternatively, the terminal apparatus 100 may be any of various apparatuses that a user uses, e.g., a personal computer (PC), a notebook computer, a mobile phone, a tablet PC, a navigation system, a smart phone, a personal digital assistant (PDA), a smart television (TV), a portable multimedia player (PMP), and a digital broadcasting receiver. In addition, the terminal apparatus 100 should be understood as a concept including all other apparatuses that are currently developed and placed on the market or that are to be developed in the future.
  • According to an exemplary embodiment, the storage 110 stores data or a program for operating the terminal apparatus 100. Basically, the storage 110 may store an operating system (OS) of the terminal apparatus 100, at least one application program, and an image of the object. The image of the object may include an internal or external image of the object for measuring biometrics of the object, such as an ultrasound image, an MRI image, a CT image, or an X-ray image. The storage 110 may be any of various storage media, e.g., a random access memory (RAM), a read-only memory (ROM), a hard disk drive (HDD), a flash memory, a compact disc (CD)-ROM, and a digital versatile disc (DVD).
  • The controller 120 controls operations of the terminal apparatus 100. Basically, the controller 120 operates based on the OS stored in the storage 110 to build a basic platform environment of the terminal apparatus 100, and runs an application program to provide a desired function according to a user's selection.
  • Specifically, the controller 120 may control such that an image of the object is received from an external device (not shown) or the storage 110, the object is modeled to identify respective object regions based on the image of the object, biometrics of the object are measured based on a result of modeling the object, and the measured biometrics and the result of modeling the object are then output to an external display unit (not shown) or an output device (not shown) included in the terminal apparatus 100.
  • According to an exemplary embodiment, the controller 120 may include a modeler 121 and a measurer 122.
  • The modeler 121 models the object such that the respective regions of the object may be identified, based on the image of the object. The object may be modeled in an oval shape including a circular shape, but is not limited thereto. If the object is a fetus, the head and body of the fetus may be modeled in a circular or oval shape to be differentiated from each other and a result of modeling the object may be output via an output device.
  • The measurer 122 measures biometrics of the object based on the result of modeling the object when the modeler 121 models the object such that the respective regions of the object are identified. If the object is a fetus, then a central axis may be set on the circular or oval shape with which the object is modeled, based on characteristic points of the head and body of the fetus. The CRL, NT, and IT biometrics of the fetus may be measured based on the set central axis.
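As an illustration only (the application discloses no concrete geometry), a CRL measurement from a two-ellipse head/body model might be sketched as follows; the `Ellipse` type and the `crl_from_model` helper are hypothetical, not part of the disclosed apparatus:

```python
import math
from dataclasses import dataclass

@dataclass
class Ellipse:
    cx: float  # center x
    cy: float  # center y
    a: float   # semi-axis along the head-body axis
    b: float   # semi-axis perpendicular to it

def crl_from_model(head: Ellipse, body: Ellipse) -> float:
    """Hypothetical sketch: take the crown as the far end of the head ellipse
    along the head-body axis, the rump as the far end of the body ellipse,
    and return the crown-rump distance."""
    dx, dy = body.cx - head.cx, body.cy - head.cy
    d = math.hypot(dx, dy)
    ux, uy = dx / d, dy / d                      # unit vector from head to body
    crown = (head.cx - ux * head.a, head.cy - uy * head.a)
    rump = (body.cx + ux * body.a, body.cy + uy * body.a)
    return math.hypot(rump[0] - crown[0], rump[1] - crown[1])
```

For collinear ellipses this reduces to the center-to-center distance plus the two semi-axes along the axis.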
  • FIG. 2 is a block diagram of a terminal apparatus 200 that measures biometrics of an object, according to an exemplary embodiment.
  • Referring to FIG. 2, the terminal apparatus 200 according to an exemplary embodiment may include a storage 210, a controller 220, an input device 230, and an output device 240. The storage 210 and the controller 220 correspond to the storage 110 and the controller 120, respectively, and are not described again here.
  • According to an exemplary embodiment, there is provided a method of measuring biometrics, which is capable of increasing the accuracy of biometrics by determining whether the biometrics fall within a normal range, i.e., a range pre-specified by a user based on certain criteria.
  • According to an exemplary embodiment, the storage 210 may store an image 211 of the object, and biometrics data 212. The storage 210 may store the biometrics data 212 including information about the normal range of the biometrics to determine whether measured biometrics fall within the normal range.
  • According to an exemplary embodiment, the controller 220 may include a modeler 221, a measurer 222, and a calculator 223.
  • When biometrics measured by the measurer 222 do not fall within the normal range, the modeler 221 may calculate a model of the object by estimating a new model so that the biometrics fall within the normal range.
  • When the biometrics measured by the measurer 222 do not fall within the normal range, the measurer 222 measures biometrics of the object, based on the estimated modeling result.
  • When the biometrics measured by the measurer 222 do not fall within the normal range and the object is modeled by estimating a case where the biometrics fall within the normal range, the calculator 223 may calculate an error rate between biometrics measured again by the measurer 222 and the previously measured biometrics.
  • The input device 230 is a unit that generates a user input signal for controlling or operating the terminal apparatus 200, under a user's manipulation, and may include various input devices. For example, the input device 230 may include at least one among a key input device, a touch input device, a gesture input device, a voice input device, and the like. The key input device may generate a signal corresponding to a key when the key is manipulated, and may be a keypad or a keyboard. The touch input device may recognize a user input by sensing a user's touch on a particular part, and may be a touch pad, a touch screen, or a touch sensor. The gesture input device senses a user's predetermined motion, e.g., shaking or moving a terminal, accessing the terminal, or blinking of the user's eyes, as a particular input signal, and may include at least one among a terrestrial magnetism sensor, an acceleration sensor, a camera, an altimeter, a gyro sensor, and a proximity sensor.
  • The output device 240 outputs a user interface for providing biometrics and a result of measuring to a screen (not shown) of the terminal apparatus 200. For example, the output device 240 may be one of a liquid crystal display (LCD), a thin film transistor-LCD (TFT-LCD), light-emitting diodes (LEDs), organic light-emitting diodes (OLEDs), active matrix organic light-emitting diodes (AMOLED), a flexible display, and a three-dimensional (3D) display.
  • FIG. 3 is a flowchart illustrating a method 300 of measuring biometrics of an object, according to an exemplary embodiment.
  • The method 300 of FIG. 3 may be performed by the terminal apparatus 100 of FIG. 1 or the terminal apparatus 200 of FIG. 2.
  • The method 300 performed by the terminal apparatus 100 or 200 will now be described in detail.
  • The terminal apparatus 100 or 200 may receive an image of an object from an external storage device or may read an image stored in the storage 110 to measure biometrics of the object, according to a request from a user or a control signal (operation S301). The receiving or reading of the image in operation S301 may be performed by the controller 120 or 220.
  • The object is modeled based on the image received or read in operation S301 such that at least one part of the object may be identified (operation S303). Operation S303 may be performed by the modeler 121 or 221.
• Biometrics of the object may be measured based on a result of modeling the object performed in operation S303 (operation S305). Operation S305 may be performed by the measurer 122 or 222.
  • The result of modeling the object may be output to a user, and the user may view and modify the result of modeling the object.
  • FIG. 4 is a flowchart illustrating a method 400 of measuring biometrics of an object, according to an exemplary embodiment.
  • The method 400 of FIG. 4 may be performed by the terminal apparatus 200 of FIG. 2 as described in detail below.
  • The terminal apparatus 200 may receive an image of an object from an external storage device or read the image 211 of the object stored in the storage 210 to measure biometrics of the object, according to a request from a user or a control signal (operation S401). In operation S401, the receiving of the image or the reading of the image 211 may be performed by the controller 220.
  • Then, the object may be modeled such that at least one part of the object may be identified, based on the image 211 of the object (operation S403). Operation S403 may be performed by the modeler 221.
• A result of modeling the object may be output to a user via the output device 240, and the user may view and modify the result of modeling the object, i.e., the object model. If it is determined that the user has checked the result of modeling the object and requests to modify it (operation S405), the result of modeling the object may be modified as requested by the user (operation S407). The request to modify the result of modeling the object in operation S405 may be received via the input device 230, and operation S407 may be performed by the controller 220.
• If it is determined that the user does not request to modify the result of modeling the object (operation S405), biometrics of the object may be measured based on the result of modeling the object (operation S410). It is then determined whether the measured biometrics fall within a normal range, based on the biometrics data 212 stored in the storage 210 of the terminal apparatus 200 (operation S413). If the measured biometrics do not fall within the normal range, the accuracy of the measured biometrics may be determined to be low.
• If the measured biometrics fall within the normal range, the measured biometrics are output (operation S423). If they do not fall within the normal range, the object may be modeled again by estimating a case where the biometrics fall within the normal range (operation S415), and biometrics of the object are re-measured based on the estimated modeling result (operation S417). The re-measured biometrics and the biometrics measured based on the previous result of modeling the object are then compared to calculate an error rate therebetween (operation S420).
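The range-check loop of operations S410 through S420 can be sketched as follows; the `measure` and `estimate_model` callables and the relative-error definition are assumptions for illustration, not taken from the application:

```python
def measure_with_range_check(model, measure, normal_range, estimate_model):
    """Measure a biometric; if it falls outside the normal range, re-model by
    estimating an in-range configuration, re-measure, and report an error rate
    between the two measurements (operations S410-S420, sketched)."""
    value = measure(model)                       # operation S410
    lo, hi = normal_range
    if lo <= value <= hi:                        # operation S413
        return value, None                       # in range: output as-is (S423)
    new_model = estimate_model(model)            # operation S415
    new_value = measure(new_model)               # operation S417
    error_rate = abs(new_value - value) / value  # operation S420 (relative error, assumed)
    return new_value, error_rate
```

An in-range measurement is returned unchanged with no error rate; an out-of-range one triggers re-modeling and is returned together with the error rate between the two measurements.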
  • The terminal apparatus 200 may calculate and output data for diagnosing a state of the object, based on the measured biometrics. If the measured biometrics do not fall within the normal range, the accuracy of the measured biometrics may be determined to be low. Thus, the data for diagnosing the state of the object may be calculated and output, based on the biometrics measured based on the estimated modeling result.
• FIG. 5 is a flowchart illustrating a method 500 of measuring a CRL of an object, according to an exemplary embodiment.
  • According to an exemplary embodiment, the object is modeled such that a head and a body of the object may be identified to measure the CRL of the object, which is one of biometrics. Modeling of the object may be estimated and performed again to increase the accuracy of measured biometrics, according to whether an angle between the head and body of the object falls within a normal range.
  • The terminal apparatus 200 of FIG. 2 may receive an image of an object from an external device or may read the image 211 of the object stored in the storage 210 to measure biometrics of the object, according to a request from a user or a control signal (operation S501).
  • The object may be modeled such that the head and body of the object may be identified, based on the image 211 of the object (operation S503). A result of modeling the object may be output to a user via the output device 240, and the user may check the result of modeling the object. In operation S505, it is determined whether the user requests to modify the result of modeling the object, and the result of modeling the object may be modified as requested by the user (operation S507).
• A CRL of the object may be measured based on the result of modeling the object. First, characteristic points of the head and body of the object may be extracted, and a central axis may be set on a figure obtained by modeling the object, based on the extracted characteristic points. An angle between the head and body of the object may be measured based on the central axis (operation S510). Here, the characteristic points may represent predetermined portions of the object, including at least one of the crown of the head, the palatine bones, and the end of the nose of the object.
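As a sketch of the angle measurement in operation S510, the angle between the head and body axes can be computed from their direction vectors; representing each central axis as a 2-D vector is an assumption for illustration:

```python
import math

def axis_angle_deg(head_axis, body_axis):
    """Angle in degrees between two 2-D direction vectors representing the
    central axes of the head and body."""
    hx, hy = head_axis
    bx, by = body_axis
    cos_theta = (hx * bx + hy * by) / (math.hypot(hx, hy) * math.hypot(bx, by))
    # Clamp to [-1, 1] to guard against floating-point drift before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))
```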
• If the angle between the head and body of the object falls within the normal range, the CRL of the object may be accurately measured. In the case of a fetus, for example, the CRL may be measured to be too small when the fetus crouches down to a large extent, and may be measured to be too large when the fetus stretches. In such cases, the measured CRL may not be appropriate for calculating a gestational age (GA), which is a value used to diagnose a state of the fetus.
• In operation S513, whether the angle falls within the normal range may be determined based on the information about the normal range of the angle included in the biometrics data 212 stored in the storage 210 of the terminal apparatus 200, thereby enabling the CRL to be accurately measured.
  • If the angle falls within the normal range, the CRL is measured using the result of modeling the object (operation S523). Otherwise, if the angle does not fall within the normal range, the object is modeled again by estimating a case where the angle falls within the normal range (operation S515). In the case of a fetus, for example, when the angle between the head and body of the object does not fall within the normal range since the fetus crouches down to a large extent, modeling of the object may be estimated and performed again by controlling the figure obtained by modeling the object such that the central axis on the head or body may be moved to a side.
  • The CRL is re-measured based on the estimated modeling result (operation S517). A result of re-measuring the CRL based on the estimated modeling result and a result of measuring the CRL based on the previous result of modeling the object are compared to calculate an error rate therebetween (operation S520).
  • Thereafter, a GA, which is a value for diagnosing a state of the fetus, may be calculated based on the CRL (operation S525), and may then be output via the output device 240 (operation S527).
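The application does not specify which regression is used to derive the GA from the CRL in operation S525; one widely published formula (the Robinson-Fleming regression) is sketched here purely for illustration:

```python
import math

def gestational_age_days(crl_mm: float) -> float:
    """Robinson-Fleming regression (illustrative, not from the application):
    GA in days is approximately 8.052 * sqrt(CRL in mm) + 23.73."""
    return 8.052 * math.sqrt(crl_mm) + 23.73
```

For a 45 mm CRL this gives roughly 78 days, i.e., about 11 weeks of gestation.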
  • FIG. 6 is a flowchart illustrating a method 600 of measuring an NT or an IT of an object, according to an exemplary embodiment.
  • In an exemplary embodiment, the IT or the NT of the object may be measured based on a result of modeling the object. The object may be modeled such that a head and body of the object may be identified as described above, and the IT or NT of the object may then be measured. In this case, measuring of the CRL may be optional. Thus, in operation S605, a model of the object may be obtained as a result of operation S507 and/or S505 of FIG. 5.
• Referring to FIG. 6, the location of the NT or IT of the object may be estimated based on the result of modeling the object. In the case of a fetus, the NT is located at the nape of the neck and may thus be estimated as a region in which the head and body intersect. The IT is located in the skull and may thus be estimated to be located in a region in which a central point and a central axis on the head intersect. A region-of-interest (ROI) in which the NT or IT may be measured may then be indicated.
  • A region in which the NT or IT may be measured, i.e., the ROI, may be detected and output based on the result of modeling the object (operation S607). A user may check the output ROI and request to modify the ROI.
  • In operation S610, if it is determined that the request to modify the ROI from the user is not received, the controller 220 may measure the NT or IT in the ROI (operation S613).
  • Otherwise, if the controller 220 receives the request to modify the ROI from the user, the controller 220 may modify the ROI based on the request from the user and may measure the NT or IT in the modified ROI (operation S615).
  • The NT and IT are measured as lengths and may thus be displayed in the form of a line, together with the ROI.
  • When the NT or IT is measured, a relative difference between the NT or IT and the CRL of the object is calculated (operation S617). An abnormality probability of the object may be calculated and output, based on the relative difference (operation S620). The relative difference may be expressed as NT/CRL or IT/CRL. The CRL has to be measured to calculate the relative difference between the CRL and the NT or IT. The CRL may be measured as described above.
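The relative differences of operation S617 are plain ratios; a minimal sketch follows, where the `relative_difference` helper and the use of millimeter inputs are assumptions for illustration:

```python
def relative_difference(crl_mm, nt_mm=None, it_mm=None):
    """Return the NT/CRL and/or IT/CRL ratios of operation S617 (sketched).
    The ratios are dimensionless, so any consistent length unit works."""
    ratios = {}
    if nt_mm is not None:
        ratios["NT/CRL"] = nt_mm / crl_mm
    if it_mm is not None:
        ratios["IT/CRL"] = it_mm / crl_mm
    return ratios
```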
  • FIG. 7 is a block diagram of a system that measures biometrics of an object, according to an exemplary embodiment.
  • Referring to FIG. 7, the system may include a service apparatus 710, a network 720, and a terminal apparatus 730.
• According to an exemplary embodiment, biometrics of an object may be measured and a state of the object may be diagnosed according to a computer-based method in which a device connected to the terminal apparatus 730 via the network 720 measures the biometrics of the object and diagnoses the state of the object, while the terminal apparatus 730 only receives and outputs information. For convenience of explanation, a device that measures the biometrics of the object and diagnoses the state of the object in response to a request from the terminal apparatus 730 according to exemplary embodiments is hereinafter referred to as the service apparatus 710.
  • The service apparatus 710 measures the biometrics of the object based on an image of the object received via the network 720, and provides the terminal apparatus 730 with a result of the measuring and a result of diagnosing the state of the object based on the result of the measuring. More specifically, the object may be modeled such that at least one object portion may be identified in the image of the object, the biometrics of the object may be measured based on a result of modeling the object, a state of the object may be diagnosed according to a result of the measuring, and a result of the diagnosing may be provided to the terminal apparatus 730. The service apparatus 710 may provide a user interface via which the result of modeling the object and the measured biometrics may be provided to the terminal apparatus 730 so that a user may view, check, verify, and/or modify a process of measuring the biometrics of the object.
• The service apparatus 710 may operate based on server-client computing or cloud computing and may include computer resources for measuring the biometrics of the object and diagnosing the state of the object, such as, for example, at least one of hardware and software.
• The network 720 provides a path for exchanging data between the service apparatus 710 and the terminal apparatus 730. The network 720 is an internet protocol (IP) network via which a service for transmitting/receiving a large amount of data and a data service are provided by using an IP, and may be an all-IP network obtained by integrating different networks based on an IP. Also, the network 720 may include at least one of a 3G mobile network including a wired network, a wireless broadband (WiBro) network, and a wideband code division multiple access (WCDMA) network; a 3.5G mobile network including a high-speed downlink packet access (HSDPA) network and a long-term evolution (LTE) network; a 4G mobile network including LTE-Advanced; and a wireless local area network (LAN) including a satellite network and a Wi-Fi network.
  • According to exemplary embodiments, the terminal apparatus 730 performs an operation of outputting the result of measuring the biometrics of the object and the result of diagnosing the state of the object, performed by the service apparatus 710, as described in detail below.
  • FIG. 8 is a block diagram of a service apparatus 800 included in a system that measures biometrics of an object, according to an exemplary embodiment. The service apparatus 800 of FIG. 8 may be similar to the service apparatus 710 of FIG. 7 or to a service apparatus 900 of FIG. 9.
  • Referring to FIG. 8, the service apparatus 800 may include a communicator 810, a storage 820, and a service provider 830.
  • The communicator 810 exchanges data with the terminal apparatus 730 of FIG. 7 via the network 720 of FIG. 7.
  • The storage 820 stores data and a program for operating the service apparatus 800. In an exemplary embodiment, the storage 820 may store an image of an object. The image of the object may include an internal or external image of the object for measuring biometrics of the object, e.g., an ultrasound image, an MRI image, a CT image, or an X-ray image of the object. The storage 820 may include various storage media, such as a RAM, a ROM, an HDD, a flash memory, a CD-ROM, and/or a DVD.
  • The service provider 830 may control the image of the object to be received from an external device (not shown) or the storage 820. The object may be modeled to identify at least one portion of the object, based on the image of the object. Biometrics of the object may be measured based on a result of modeling the object, and the measured biometrics may be then output to an external display unit (not shown) or an output device.
  • According to an exemplary embodiment, the service provider 830 may include a modeler 831 and a measurer 832.
  • The modeler 831 models the object such that respective regions of the object may be identified, based on the image of the object. The object may be modeled in an oval shape including a circular shape, but is not limited thereto. When the object is a fetus, the fetus may be approximately divided into a head and a body, and the head and body of the fetus may be modeled in a circular or oval shape and then be provided to an output device (not shown).
  • When the modeler 831 models the object to identify the respective object regions, the measurer 832 measures the biometrics of the object based on a result of modeling the object. If the object is a fetus, a central axis may be set on the circular or oval shape by using characteristic points of the head and body, and a crown-rump length (CRL), nuchal translucency (NT), and intracranial translucency (IT) of the fetus, which are biometrics, may be measured based on the set central axis.
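The circle-and-oval modeling described above can be illustrated with a short sketch. The geometry here is a hypothetical assumption — the crown is taken as the point of the head circle farthest from the body, and the rump as the far end of the body ellipse's major axis; the patent does not prescribe these formulas.

```python
import math

def crl_from_model(head_center, head_radius, body_center,
                   body_semi_major, body_angle):
    """Approximate the crown-rump length from a circle/ellipse model.

    head_center, body_center: (x, y) in image coordinates.
    body_angle: orientation of the body's major axis, in radians.
    """
    hx, hy = head_center
    bx, by = body_center
    # Crown: point on the head circle farthest from the body center.
    d = math.hypot(hx - bx, hy - by)
    crown = (hx + (hx - bx) / d * head_radius,
             hy + (hy - by) / d * head_radius)
    # Rump: end of the body ellipse's major axis pointing away from the head.
    ux, uy = math.cos(body_angle), math.sin(body_angle)
    if ux * (hx - bx) + uy * (hy - by) > 0:  # flip so the axis points away
        ux, uy = -ux, -uy
    rump = (bx + ux * body_semi_major, by + uy * body_semi_major)
    return math.hypot(crown[0] - rump[0], crown[1] - rump[1])
```

With a head circle of radius 2 centered at (0, 10) and a body ellipse of semi-major axis 5 centered at the origin with a vertical major axis, the crown falls at (0, 12) and the rump at (0, -5), giving a CRL of 17.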
  • FIG. 9 is a block diagram of a service apparatus 900 included in a system that measures biometrics of an object, according to an exemplary embodiment.
  • Referring to FIG. 9, the service apparatus 900 according to an exemplary embodiment may include a communicator 910, a storage 920, and a service provider 930. The communicator 910, the storage 920, and the service provider 930 correspond to the communicator 810, the storage 820, and the service provider 830 of FIG. 8, respectively, and thus, repeated descriptions are not provided again.
  • According to an exemplary embodiment, the service apparatus 900 may provide a method of measuring biometrics of an object, which is capable of increasing the accuracy of biometrics by determining whether the biometrics fall within a normal range.
  • According to an exemplary embodiment, the storage 920 may store an image 921 and biometrics data 922 of an object. The storage 920 stores the biometrics data 922 including information about a normal range of at least one biometric, thereby enabling a determination of whether measured biometrics fall within the normal range.
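As an illustration of the normal-range check that the biometrics data 922 enables, a simple per-measurement lookup table can be assumed. The range values below are illustrative assumptions (loosely based on common first-trimester screening windows), not values taken from the patent.

```python
# Assumed format for the stored normal-range data; the patent does not
# specify how the biometrics data 922 is organized.
NORMAL_RANGES = {
    "CRL": (45.0, 84.0),  # mm (illustrative window for NT screening)
    "NT":  (0.0, 3.5),    # mm (illustrative)
    "IT":  (1.5, 3.0),    # mm (illustrative)
}

def within_normal_range(name, value, ranges=NORMAL_RANGES):
    """Return True if a measured biometric falls within its stored range."""
    low, high = ranges[name]
    return low <= value <= high
```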
  • According to an exemplary embodiment, the service provider 930 may include a modeler 931, a measurer 932, and a calculator 933.
  • If biometrics measured by the measurer 932 do not fall within the normal range, the modeler 931 models the object again such that biometrics of the object may fall within the normal range.
  • If the measured biometrics do not fall within the normal range, the measurer 932 measures biometrics of the object again, based on a result of modeling the object again, performed by the modeler 931.
  • If the biometrics measured by the measurer 932 do not fall within the normal range, the object is modeled again by estimating a case where biometrics of the object fall within the normal range. The calculator 933 then calculates an error rate between the biometrics measured again by the measurer 932 and the previously measured biometrics, and provides the error rate to a user so that the user may determine the precision of the previously measured biometrics.
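The text does not define the formula the calculator 933 uses; a minimal sketch, assuming a simple relative difference against the previous measurement, could look like:

```python
def error_rate(remeasured, previous):
    """Relative error between a re-measured biometric and the previous value.

    Assumption: the patent does not specify the formula; a relative
    difference normalized by the previous measurement is used here.
    """
    if previous == 0:
        raise ValueError("previous measurement must be non-zero")
    return abs(remeasured - previous) / abs(previous)
```

For example, a CRL re-measured as 55 mm against a previous 50 mm gives an error rate of 0.1 (10%).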
  • FIG. 10 is a flowchart illustrating a method 1000 of measuring biometrics of an object, according to an exemplary embodiment.
  • Referring to FIG. 10, a terminal apparatus 730 may receive an image of an object from an external device or may read an image stored in a storage to measure biometrics of the object, according to a request from a user or a control signal (operation S1001). The terminal apparatus 730 may transmit the image of the object to a service apparatus 800 to request measurement of biometrics of the object (operation S1003). Alternatively, the image of the object may be stored in the service apparatus 800; in this case, the terminal apparatus 730 may request the service apparatus 800 to measure the biometrics of the object stored in the service apparatus 800 and provide the terminal apparatus 730 with a result of the measuring.
  • The service apparatus 800 may model the object such that at least one part of the object may be identified, based on the image of the object (operation S1005). The service apparatus 800 may measure biometrics of the object based on a result of modeling the object (operation S1007).
  • The result of modeling the object and the measured biometrics may be transmitted to the terminal apparatus 730 (operation S1010). The result of modeling the object and the measured biometrics may be output to the user via the terminal apparatus 730 (operation S1013). Thus, the user may view and modify the result of modeling the object and the measured biometrics.
  • FIG. 11 is a flowchart illustrating a method 1100 of measuring biometrics of an object, according to an exemplary embodiment.
  • Referring to FIG. 11, a terminal apparatus 730 may receive an image of an object from an external device or may read an image stored in a storage to measure biometrics of the object, according to a request from a user or a control signal (operation S1101). The terminal apparatus 730 may transmit the image to a service apparatus 900 to request the service apparatus 900 to model the object in order to measure biometrics of the object (operation S1103). The image of the object may be stored in the service apparatus 900. The terminal apparatus 730 may request the service apparatus 900 to model the image of the object stored in the service apparatus 900.
  • The object may be modeled such that at least one part of the object may be identified, based on the image (operation S1105). A result of modeling the object may be transmitted to the terminal apparatus 730 (operation S1107). The result of modeling the object may be output to the user via the output device (not shown) in the terminal apparatus 730 (operation S1110). When the user views the result of modeling the object and requests the service apparatus 900 to modify the result of modeling the object, via an input device (not shown) of the terminal apparatus 730 (operations S1113 and S1115), the result of modeling the object may be modified as requested by the user (operation S1117).
  • When a request to modify the result of modeling the object is not received from the user, the service apparatus 900 may be requested to measure biometrics of the object (operation S1120). Then, the service apparatus 900 may measure biometrics of the object, based on the result of modeling the object (operation S1123).
  • Whether the measured biometrics fall within a normal range may be determined based on the biometrics data 922 stored in the storage 920 of the service apparatus 900 (operation S1125). If the measured biometrics do not fall within the normal range, the precision of the measured biometrics may be determined to be low.
  • Otherwise, if the measured biometrics fall within the normal range, the measured biometrics may be transmitted to the terminal apparatus 730 (operation S1127). The measured biometrics may be output to the user via the output device (not shown) in the terminal apparatus 730 (operation S1140). If the measured biometrics do not fall within the normal range, the object is modeled again by estimating a case where biometrics of the object fall within the normal range (operation S1130). The biometrics of the object are measured again based on a result of modeling the object again (operation S1133). The measured biometrics and the biometrics measured based on the previous result of modeling the object may be compared to calculate an error rate therebetween (operation S1135).
  • The service apparatus 900 may calculate data for diagnosing a state of the object from the measured biometrics and provide the data to the terminal apparatus 730. However, if the measured biometrics do not fall within the normal range, the precision of the measured biometrics may be determined to be low. Thus, the data for diagnosing the state of the object may be calculated from the biometrics measured based on the estimated modeling result, and then be provided to the terminal apparatus 730.
  • Thereafter, data related to the biometrics of the object, including the result of modeling the object, the measured biometrics, the error rate, and the like, may be transmitted to the terminal apparatus 730 (operation S1137). The data may be controlled to be output by the terminal apparatus 730 (operation S1140).
  • FIG. 12 is a flowchart illustrating a method 1200 of measuring a CRL of an object, according to an exemplary embodiment.
  • According to an exemplary embodiment, in order to measure a CRL, which is one of biometrics of an object, the object may be modeled such that a body and head of the object may be identified, and may be modeled again according to whether an angle between the body and head falls within a normal range.
  • Referring to FIG. 12, a terminal apparatus 730 may receive an image of an object from an external device or may read an image from a storage to measure biometrics of the object, according to a request from a user or a control signal (operation S1201). The terminal apparatus 730 may transmit the image to the service apparatus 900 to request the service apparatus 900 to model the object to measure biometrics of the object (operation S1203). The image of the object may be stored in the service apparatus 900. The terminal apparatus 730 may request the service apparatus 900 to model the image of the object stored in the service apparatus 900 and provide a result of modeling the object.
  • The object may be modeled such that a body and head of the object may be identified, based on the image of the object (operation S1205). A result of modeling the object may be transmitted to the terminal apparatus 730 (operation S1207). The result of modeling the object may be output to a user via the output device of the terminal apparatus 730 (operation S1210). When the user views the result of modeling the object and requests the service apparatus 900 to modify the result of modeling the object, via the input device of the terminal apparatus 730 (operations S1213 and S1215), the result of modeling the object may be modified as requested by the user (operation S1217).
  • If there is no request from the user to modify the result of modeling the object and the terminal apparatus 730 requests the service apparatus 900 to provide biometrics of the object (operation S1220), then the service apparatus 900 may measure biometrics of the object, based on the result of modeling the object (operation S1223).
  • For example, a CRL of the object may be measured based on the result of modeling the object, and a gestational age (GA) may be calculated from the CRL. First, characteristic points on the head and body of the object may be extracted, a central axis may be set on a figure, i.e., an object model, obtained by modeling the object, based on the extracted characteristic points, and then the biometrics of the object may be measured. In operation S1223, an angle between the body and head of the object and the CRL of the object may be measured based on the central axis.
  • When the angle between the head and body of the object falls within the normal range, the CRL of the object may be accurately measured. In the case of a fetus, the CRL may be measured to be too small when the fetus crouches down to a large extent and to be too large when the fetus stretches. Thus, the measured CRL may not be appropriate for calculating a GA, which is a value for diagnosing a state of the fetus.
  • Thus, whether the angle between the head and body of the object falls within the normal range may be determined based on information about the normal range of this angle, included in the biometrics data 922 stored in the storage 920 of the service apparatus 900 (operation S1225), thereby enabling the CRL to be accurately measured.
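A minimal sketch of the angle determination, assuming each central axis is represented by a direction vector (the text only states that the angle is measured with respect to the central axes):

```python
import math

def axis_angle(head_axis, body_axis):
    """Angle, in degrees, between the head and body central axes.

    Each axis is given as a direction vector (dx, dy). This vector-based
    formulation is an assumption for illustration.
    """
    (ax, ay), (bx, by) = head_axis, body_axis
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    # Clamp to guard against floating-point drift outside acos's domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

Perpendicular axes yield 90 degrees; the result can then be checked against the normal range stored in the biometrics data 922.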
  • If the angle between the head and body of the object does not fall within the normal range, the object is modeled again by estimating a case where the angle falls within the normal range (operation S1227). In the case of a fetus, if the angle does not fall within the normal range because the fetus crouches down to a large extent, the modeling may be estimated and performed again by adjusting the figure obtained by modeling the object, e.g., by moving the central axis on the head or body to a side, such that the angle falls within the normal range.
  • A CRL of the object may be measured again based on a result of modeling the object again, and the measured CRL and the CRL measured based on the previous result of modeling the object may be compared to calculate an error rate therebetween (operation S1230).
  • A GA for diagnosing a state of a fetus may be calculated from the CRL (operation S1233). The CRL and/or GA may be transmitted to the terminal apparatus 730 (operation S1235) and may be output via the output device of the terminal apparatus 730 (operation S1237).
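The text does not specify how the GA is derived from the CRL; as one possibility, the widely used Robinson-Fleming regression for first-trimester dating can be sketched:

```python
import math

def gestational_age_days(crl_mm):
    """Estimate gestational age in days from a CRL in millimetres.

    Uses the Robinson-Fleming regression (GA = 8.052 * sqrt(CRL) + 23.73);
    this particular formula is an assumption, not stated in the patent.
    """
    return 8.052 * math.sqrt(crl_mm) + 23.73
```

For example, a CRL of 49 mm corresponds to roughly 80 days, i.e., about 11.5 weeks of gestation.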
  • FIG. 13 is a flowchart illustrating a method 1300 of measuring an NT or an IT of an object, according to an exemplary embodiment.
  • According to an exemplary embodiment, an IT or NT of an object may be measured based on a result of modeling the object. Thus, the object may be modeled such that the head and body of the object may be identified according to an exemplary embodiment as described above, and the IT or NT of the object may then be measured based on a result of modeling the object. Measuring a CRL may be optionally performed. For example, in operation S1337, the CRL may be received as an output of the operation S1237 of FIG. 12.
  • Referring to FIG. 13, locations of the NT or IT of the object may be estimated based on a result of modeling the object. In the case of a fetus, the NT is located at the nape of the neck and may thus be estimated as a region in which the head and body intersect, and the IT is located inside the skull and may thus be estimated to be located in a region in which a central point on the head and a central axis on the head intersect. A region of interest (ROI) in which the NT or IT may be measured may be indicated.
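The ROI estimation above can be sketched with hypothetical geometry: the NT ROI centered at the point of the head boundary facing the body (approximating the nape), and the IT ROI around the head's central point. The ROI size and box representation are assumptions for illustration.

```python
import math

def estimate_rois(head_center, head_radius, body_center, roi_size=32):
    """Estimate NT and IT ROIs as axis-aligned boxes (x0, y0, x1, y1)."""
    hx, hy = head_center
    bx, by = body_center
    d = math.hypot(bx - hx, by - hy)
    # Nape: point on the head boundary in the direction of the body.
    nape = (hx + (bx - hx) / d * head_radius,
            hy + (by - hy) / d * head_radius)
    half = roi_size / 2

    def box(cx, cy):
        return (cx - half, cy - half, cx + half, cy + half)

    return {"NT": box(*nape), "IT": box(hx, hy)}
```

With the head at (0, 10), radius 2, and the body at the origin, the nape is estimated at (0, 8) and the NT ROI spans (-16, -8) to (16, 24).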
  • First, a terminal apparatus 730 requests a service apparatus 900 to measure an NT or IT of an object and provide a result of the measuring, according to a request from a user of the terminal apparatus 730 or a control signal (operation S1301). The service apparatus 900 sets a region in which the NT or IT is to be measured, i.e., an ROI, based on a result of modeling the object (operation S1303). When the set ROI is to be verified by a user, information about the set ROI may be transmitted to the terminal apparatus 730 (operation S1305). The ROI may be displayed on the terminal apparatus 730 (operation S1307).
  • When the user views the displayed ROI and requests to modify the ROI (operations S1310 and S1313), the ROI is modified as requested by the user (operation S1315). The NT or IT may be requested to be measured in the modified ROI (operation S1317), and then be measured in the modified ROI (operation S1320).
  • The NT and IT are measured as lengths and may thus be displayed in the form of a line, together with the ROI.
  • After the NT or IT is measured, a relative difference between the NT or IT and the CRL may be calculated (operation S1323). An abnormality probability of the object may be calculated using the relative difference and then be provided to the terminal apparatus 730 (operation S1325). In this case, the relative difference may be expressed as NT/CRL or IT/CRL. The CRL has to be measured to calculate the relative difference between the CRL and the NT or IT.
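A minimal sketch of operation S1323, computing the NT/CRL and IT/CRL ratios described above (the abnormality-probability calculation that would use these ratios is not specified in the text and is omitted here):

```python
def relative_differences(crl, nt=None, it=None):
    """Return the NT/CRL and IT/CRL ratios for whichever values are given.

    Assumption: "relative difference" is taken literally as the ratios
    NT/CRL and IT/CRL, as stated in the surrounding description.
    """
    if crl <= 0:
        raise ValueError("CRL must be positive")
    out = {}
    if nt is not None:
        out["NT/CRL"] = nt / crl
    if it is not None:
        out["IT/CRL"] = it / crl
    return out
```

For example, an NT of 2.5 mm against a CRL of 50 mm yields an NT/CRL ratio of 0.05.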
  • The measured NT and/or IT, and the relative difference between the NT or IT and the CRL may be transmitted to the terminal apparatus 730 (operation S1327), and may then be output via the output device of the terminal apparatus 730 (operation S1330).
  • FIGS. 14A and 14B illustrate examples of an ultrasound image of an object transmitted to a terminal apparatus or a service apparatus according to an exemplary embodiment.
  • Specifically, FIG. 14A illustrates an example of an ultrasound image 1 of a fetus, received, for example, by the controller 120 or 220 of FIG. 1 or 2 or the service provider 830 or 930 of FIG. 8 or 9. The ultrasound image 1 includes a cross-section of the fetus, based on which biometrics of the fetus may be measured.
  • Referring to FIG. 14B, biometrics of the fetus may be measured by extracting portions 3 and 4 of an ultrasound image 2 of the fetus.
  • An object illustrated in FIG. 15 may be the same as the ultrasound image of FIG. 14A or 14B or may be obtained by extracting a part of the ultrasound image 1 or 2 of FIG. 14A or 14B.
  • FIGS. 15A to 15C illustrate examples of modeling an object and measuring a CRL, IT, and NT of the object, according to exemplary embodiments.
  • FIG. 15A illustrates an example of a result of modeling the fetus and a result of measuring a CRL of the fetus displayed on a screen 88.
  • Referring to FIG. 15A, the fetus may be modeled such that the head and body are identified in a circular shape 10 and an oval shape 20, respectively, characteristic points on the head and body may be extracted, and central axes 11, 12, and 21 are then set and indicated based on the characteristic points.
  • A CRL 30 may be automatically displayed and measured between points 84 and 86 of the object model, based on the central axes 11, 12, and 21. A user may select desired points 90, 92 of the fetus to be measured and may manually measure a CRL 40 between the points 90, 92.
  • An angle 50 between the head and body may also be measured with respect to the central axes 11, 12, and 21. Whether the angle 50 falls within a normal range may be determined.
  • FIG. 15B illustrates an example of a result of modeling the fetus and a result of measuring a CRL of the fetus when an angle between the head and body does not fall within a normal range.
  • If the angle between the head and body of the object does not fall within the normal range, the object is modeled by estimating a case where the angle between the head and body falls within the normal range. Referring to FIG. 15B, if the angle falls outside the normal range, a result of modeling the head is moved toward a result of modeling the body (as indicated by a line 15) in order to adjust the angle between the head and body to fall within the normal range. A CRL of the fetus (as indicated by a line 60) is then measured using the adjusted modeling result, including the angle 51 falling within the normal range.
  • FIG. 15C illustrates an example of a result of modeling the fetus and a result of measuring an IT and NT of the fetus.
  • The NT and IT may be measured by setting regions on the portions of the object that is modeled as ROIs.
  • Referring to FIG. 15C, a region around a central point 98 on the head may be set as a first ROI 70 of the IT. A region 99 in which the head and body contact each other may be set as a second ROI 80 of the NT. The first ROI 70 and the second ROI 80 may be displayed as expanded regions. Also, parts of the ROIs 70 and 80 in which the IT and NT are to be measured may be displayed as lines 71 and 81. A user may view displayed information and may directly modify the ROIs 70 and 80 or the parts of the ROIs 70 and 80 in which the IT and NT are to be measured.
  • FIG. 16 illustrates a screen image via which it may be set whether an object is to be modeled, whether biometrics of the object are to be automatically measured, and whether the measured biometrics are to be verified by a user after the measurement of the biometrics.
  • Referring to FIG. 16, a user interface 1400 may be provided via which the biometrics of the object may be set to be measured after a user verifies a result of modeling the object, an NT measuring region, and an IT measuring region. If the biometrics are set to be measured after this verification, the result of modeling the object, the NT measuring region, and the IT measuring region may be automatically set and displayed, whether these displayed items are to be modified may be determined according to the user's verification, and the biometrics of the object may then be measured.
  • Also, after the result of modeling the object, the NT measuring region, and the IT measuring region are determined, the user interface of FIG. 16 may be provided so that a user may determine whether biometrics of the object, such as a CRL, an NT, and an IT, are to be automatically or manually measured. If the biometrics of the object are determined to be automatically measured, the biometrics are measured by setting measuring regions based on a result of modeling the object. If the biometrics of the object are determined to be manually measured, the biometrics are measured by the user manually modeling the object or setting the measuring regions. For example, when the biometrics are measured as lengths of portions of the object, the user may be allowed to mark points on an image of the object to measure the lengths.
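The automatic/manual options described for the user interface of FIG. 16 can be sketched as a small configuration object; the field names below are assumptions for illustration, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class MeasurementSettings:
    """Hypothetical settings exposed by a UI such as FIG. 16's interface 1400."""
    auto_model: bool = True            # model the object automatically
    auto_measure: bool = True          # measure CRL/NT/IT automatically
    verify_before_measure: bool = True  # user verifies model and ROIs first
    verify_after_measure: bool = False  # user verifies the measured values
```

A workflow could then branch on these flags, e.g., presenting the modeling result for confirmation when `verify_before_measure` is set, or collecting user-marked points when `auto_measure` is cleared.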
  • Exemplary embodiments can be embodied as computer-readable code on a computer-readable recording medium, to be read by a computer (including various devices capable of processing information). Here, the computer-readable recording medium may be any recording apparatus capable of storing data that is read by a computer system, e.g., a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and so on.
  • The above-described exemplary embodiments and advantages are merely exemplary and are not to be construed as limiting. The present teaching can be readily applied to other types of apparatuses. The description of exemplary embodiments is intended to be illustrative, and not to limit the scope of the claims, and many alternatives, modifications, and variations will be apparent to those skilled in the art. Descriptions of features or aspects within each embodiment should typically be considered as available for other similar features or aspects in other embodiments.

Claims (27)

What is claimed is:
1. A method of measuring biometrics of an object, the method comprising:
receiving an image of the object;
modeling the object to identify a portion of the object; and
measuring biometrics of the object, based on a modeling result.
2. The method of claim 1, wherein the modeling the object comprises:
displaying the modeling result of the identified portion of the object in an oval shape; and
modifying the modeling result, based on a user input signal.
3. The method of claim 1, wherein the measuring the biometrics of the object comprises:
determining whether the measured biometrics are within a normal range;
if it is determined that the measured biometrics are outside the normal range, modifying the modeling result so that the biometrics become within the normal range; and
re-measuring the biometrics of the object, based on the modified modeling result.
4. The method of claim 1, wherein the measuring the biometrics of the object comprises:
detecting a region-of-interest (ROI) for measuring the biometrics, based on the modeling result; and
measuring the biometrics in the ROI.
5. The method of claim 4, wherein the detecting the ROI comprises:
displaying the detected ROI to be differentiated from other portions of the object;
displaying a region for measuring the biometrics, in the ROI; and
modifying the ROI or the region for measuring the biometrics, according to a user input signal.
6. The method of claim 1, wherein the modeling the object comprises modeling the object to identify a body and a head of the object.
7. The method of claim 6, further comprising:
detecting a characteristic point on the head of the object;
setting a first axis through the characteristic point;
setting a second axis through the body;
measuring an angle between the body and the head of the object as an angle between the first axis and the second axis;
determining whether the angle is within a normal range; and
measuring a crown-rump length (CRL) of the object, based on a result of the determining.
8. The method of claim 7, wherein the measuring the CRL comprises:
if the angle is within the normal range, measuring the CRL of the object based on the modeling result;
if the angle is outside the normal range, modifying the modeling result so that the angle becomes within the normal range, and measuring the CRL of the object, based on the modified modeling result; and
displaying the modified modeling result and the measured CRL.
9. The method of claim 6, wherein the measuring the biometrics comprises:
measuring a crown-rump length (CRL) of the object;
measuring at least one among a nuchal translucency (NT) and an intracranial translucency (IT) of the object;
calculating a relative difference between one of the CRL and NT and the CRL and IT;
and displaying the measured CRL, NT, and IT.
10. A terminal apparatus for measuring biometrics of an object, the terminal apparatus comprising:
a storage which stores an image of the object; and
a controller comprising:
a modeler which models the object to identify a portion of the object, in the image of the object; and
a measurer which measures biometrics of the object, based on a modeling result.
11. The terminal apparatus of claim 10, further comprising an input device configured to receive a user input,
wherein the modeler modifies the modeling result, based on a user input signal.
12. The terminal apparatus of claim 10, wherein the storage stores biometrics data including information about a normal range of at least one biometric measurement value,
if the measured biometrics are outside the normal range, the modeler modifies the modeling result of the object so that the biometrics become within the normal range, and
the measurer re-measures the biometrics of the object, based on the modified modeling result.
13. The terminal apparatus of claim 12, wherein the controller further comprises a calculator which calculates an error rate between the biometrics measured based on the modified modeling result and the biometrics measured using the modeling result.
14. The terminal apparatus of claim 10, wherein the measurer detects a region-of-interest (ROI) for measuring the biometrics, based on the modeling result, and measures the biometrics in the ROI, and
the terminal apparatus further comprises an output device which outputs the detected ROI to be differentiated from other portions of the object, and outputs a region for measuring the biometrics in the ROI.
15. The terminal apparatus of claim 14, further comprising an input device configured to receive a user input,
wherein the measurer modifies the ROI according to a user input signal.
16. The terminal apparatus of claim 10, wherein the modeler models the object to identify a body and a head of the object.
17. The terminal apparatus of claim 16, wherein the modeler detects a characteristic point on the head of the object, and sets an axis through the characteristic point.
18. The terminal apparatus of claim 16, wherein the storage stores biometrics data including information about a normal range of at least one biometric measurement value,
the measurer measures an angle between the body and the head of the object, determines whether the angle is within a normal range, and measures a crown-rump length (CRL) of the object, based on a result of the determining, and
the terminal apparatus further comprises an output device which outputs a result of measuring the CRL of the object.
19. The terminal apparatus of claim 18, wherein, if the angle is within the normal range, the measurer measures the CRL of the object based on the modeling result, and
if the angle is outside of the normal range, the measurer modifies the modeling result so that the angle becomes within the normal range, and measures the CRL of the object based on the modified modeling result.
20. The terminal apparatus of claim 18, wherein the measurer measures the CRL of the object, and at least one among a nuchal translucency (NT) and an intracranial translucency (IT) of the object, and
the controller further comprises a calculator which calculates a relative difference between one of the CRL and NT and the CRL and IT.
21. The terminal apparatus of claim 10, wherein the measurer detects a region-of-interest (ROI) for measuring the biometrics, based on the modeling result, and
the controller provides a user interface which receives a user input signal indicating whether the modeling result or the detected ROI is to be viewed by a user, after the modeling the object or the detecting the ROI is automatically performed.
22. The terminal apparatus of claim 10, wherein the controller provides a user interface which receives a user input signal indicating whether at least one among the modeling the object, extracting a region for measuring the biometrics of the object, and modifying the modeling result is to be performed automatically or manually.
23. A method comprising:
receiving an image of an object;
segmenting the object into portions;
modeling the segmented portions based on an object model to represent prominent portions of the object; and
measuring biometrics of the object, based on the prominent portions.
24. The method of claim 23, further comprising:
displaying a modeling result of the prominent portions;
determining a first measuring result based on the prominent portions;
comparing the first measuring result with a predetermined range;
modifying the modeling result of the prominent portions so that the first measuring result becomes within the predetermined range, when the first measuring result is outside the predetermined range; and
re-measuring the biometrics of the object, based on the modified modeling result.
25. The method of claim 24, wherein the displaying the modeling result comprises displaying a body of the object and a head of the object; and
the determining the first measuring result comprises measuring an angle between the head and the body of the object.
26. The method of claim 25, wherein a crown-rump length (CRL) of the object is measured when the angle between the head and the body of the object is within the predetermined range.
27. The method of claim 26, further comprising:
measuring at least one among a nuchal translucency (NT) and an intracranial translucency (IT) of the object; and
calculating a relative difference between one of the CRL and NT and the CRL and IT.
US13/734,217 2012-01-04 2013-01-04 Ultrasound measurement of biometrics of fetus Active 2034-06-04 US9881125B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2012-0001150 2012-01-04
KR1020120001150A KR101971622B1 (en) 2012-01-04 2012-01-04 The method and apparatus for measuring biometrics of object

Publications (2)

Publication Number Publication Date
US20130173175A1 true US20130173175A1 (en) 2013-07-04
US9881125B2 US9881125B2 (en) 2018-01-30

Family

ID=47845702

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/734,217 Active 2034-06-04 US9881125B2 (en) 2012-01-04 2013-01-04 Ultrasound measurement of biometrics of fetus

Country Status (9)

Country Link
US (1) US9881125B2 (en)
EP (1) EP2612596B1 (en)
JP (1) JP5607713B2 (en)
KR (1) KR101971622B1 (en)
CN (1) CN103211615B (en)
BR (1) BR102013000020B1 (en)
CA (1) CA2800419C (en)
MX (1) MX337277B (en)
TW (1) TWI581766B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2965693A1 (en) * 2014-07-11 2016-01-13 Samsung Medison Co., Ltd. Imaging apparatus and control method thereof
US20160154464A1 (en) * 2014-12-01 2016-06-02 Logitech Europe S.A. Keyboard with touch sensitive element
WO2018042008A1 (en) 2016-09-01 2018-03-08 Koninklijke Philips N.V. Ultrasound diagnosis apparatus
JP2018157961A (en) * 2017-03-23 2018-10-11 株式会社日立製作所 Ultrasonic image processing device and method
US20230414202A1 (en) * 2020-12-11 2023-12-28 Alpinion Medical Systems Co., Ltd. Medical indicator measuring method and ultrasound diagnostic device therefor

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
DE102014202893A1 (en) * 2014-02-18 2015-08-20 Siemens Aktiengesellschaft Method of operating an imaging modality and imaging modality
KR102312270B1 (en) 2014-08-25 2021-10-14 삼성메디슨 주식회사 Untrasound dianognosis apparatus, method and computer-readable storage medium
CN104765558A (en) * 2015-03-24 2015-07-08 苏州佳世达电通有限公司 Ultrasonic wave device and control method thereof
JP6486493B2 (en) * 2015-10-30 2019-03-20 株式会社日立製作所 Ultrasonic diagnostic apparatus and method
CN110507358B (en) * 2018-05-21 2022-01-11 珠海艾博罗生物技术股份有限公司 Image processing method and system for measuring thickness of fetal nuchal transparency from ultrasonic image

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
US5605166A (en) 1995-11-13 1997-02-25 Chou; Kuo-Hua Hair clip
US5605155A (en) * 1996-03-29 1997-02-25 University Of Washington Ultrasound system for automatically measuring fetal head size
JP3295631B2 (en) * 1997-11-17 2002-06-24 ジーイー横河メディカルシステム株式会社 Ultrasound diagnostic apparatus, cursor display method, and measuring apparatus
JP4614548B2 (en) * 2001-01-31 2011-01-19 パナソニック株式会社 Ultrasonic diagnostic equipment
JP4085635B2 (en) * 2002-01-22 2008-05-14 凸版印刷株式会社 Outline extracting method and apparatus and program thereof
JP4758351B2 (en) * 2003-10-17 2011-08-24 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Manual tool for model-based image segmentation
JP4704094B2 (en) * 2005-04-14 2011-06-15 パナソニック株式会社 Ultrasonic diagnostic equipment
JP2008183063A (en) * 2007-01-26 2008-08-14 Toshiba Corp Medical image diagnostic apparatus, medical image display apparatus, and program
WO2009136332A2 (en) 2008-05-09 2009-11-12 Koninklijke Philips Electronics N.V. Automatic ultrasonic measurement of nuchal fold translucency
JP5366586B2 (en) * 2009-02-19 2013-12-11 株式会社東芝 Ultrasonic diagnostic equipment
EP2387949A1 (en) * 2010-05-17 2011-11-23 Samsung Medison Co., Ltd. Ultrasound system for measuring image using figure template and method for operating ultrasound system

Non-Patent Citations (3)

Title
Jardim, "Segmentation of fetal ultrasound images," Ultrasound in medicine & biology, vol. 31(2), p. 243-250, 2005 *
Snijders, "UK multicentre project on assessment of risk of trisomy 21 by maternal age and fetal nuchal-translucency thickness at 10-14 weeks of gestation," Lancet, vol. 352, p. 343-346, 1998 *
Whitlow, "The effect of fetal neck position on nuchal translucency measurement," British Journal of Obstetrics and Gynaecology, vol. 105, p. 872-876, 1998 *

Cited By (15)

Publication number Priority date Publication date Assignee Title
US10298849B2 (en) 2014-07-11 2019-05-21 Samsung Medison Co., Ltd. Imaging apparatus and control method thereof
EP2965693A1 (en) * 2014-07-11 2016-01-13 Samsung Medison Co., Ltd. Imaging apparatus and control method thereof
US20160154464A1 (en) * 2014-12-01 2016-06-02 Logitech Europe S.A. Keyboard with touch sensitive element
US9612664B2 (en) * 2014-12-01 2017-04-04 Logitech Europe S.A. Keyboard with touch sensitive element
US10528153B2 (en) 2014-12-01 2020-01-07 Logitech Europe S.A. Keyboard with touch sensitive element
WO2018042008A1 (en) 2016-09-01 2018-03-08 Koninklijke Philips N.V. Ultrasound diagnosis apparatus
CN109640831A (en) * 2016-09-01 2019-04-16 皇家飞利浦有限公司 Ultrasound diagnosis apparatus
JP2019526357A (en) * 2016-09-01 2019-09-19 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Ultrasonic diagnostic equipment
US11246564B2 (en) 2016-09-01 2022-02-15 Koninklijke Philips N.V. Ultrasound diagnosis apparatus
JP7107918B2 (en) 2016-09-01 2022-07-27 コーニンクレッカ フィリップス エヌ ヴェ ultrasound diagnostic equipment
JP2022111140A (en) * 2016-09-01 2022-07-29 コーニンクレッカ フィリップス エヌ ヴェ Ultrasound diagnosis apparatus
JP7333448B2 (en) 2016-09-01 2023-08-24 コーニンクレッカ フィリップス エヌ ヴェ ultrasound diagnostic equipment
JP2018157961A (en) * 2017-03-23 2018-10-11 株式会社日立製作所 Ultrasonic image processing device and method
US20230414202A1 (en) * 2020-12-11 2023-12-28 Alpinion Medical Systems Co., Ltd. Medical indicator measuring method and ultrasound diagnostic device therefor
US12383237B2 (en) * 2020-12-11 2025-08-12 Alpinion Medical Systems Co., Ltd. Ultrasound diagnostic device and method for extracting characteristic points from acquired ultrasound image data using a neural network

Also Published As

Publication number Publication date
BR102013000020B1 (en) 2021-11-16
TWI581766B (en) 2017-05-11
KR101971622B1 (en) 2019-08-13
JP2013138869A (en) 2013-07-18
CA2800419A1 (en) 2013-07-04
MX2013000151A (en) 2013-07-16
CA2800419C (en) 2017-02-28
JP5607713B2 (en) 2014-10-15
TW201332520A (en) 2013-08-16
EP2612596A1 (en) 2013-07-10
US9881125B2 (en) 2018-01-30
BR102013000020A2 (en) 2015-07-14
KR20130080312A (en) 2013-07-12
CN103211615A (en) 2013-07-24
MX337277B (en) 2016-02-23
EP2612596B1 (en) 2017-09-13
CN103211615B (en) 2016-02-24

Similar Documents

Publication Publication Date Title
US9881125B2 (en) Ultrasound measurement of biometrics of fetus
US8498459B2 (en) System and method for verifying registration accuracy in digital medical images
US10424067B2 (en) Image processing apparatus, image processing method and storage medium
US20190142374A1 (en) 2017-11-10 2019-05-16 Koninklijke Philips N.V. Inertial device tracking system and method of operation thereof
US20140371591A1 (en) Method for automatically detecting mid-sagittal plane by using ultrasound image and apparatus thereof
CN107003721A (en) Improvement for eyes tracking system is calibrated
CN110246135A (en) Monitor Follicles method, apparatus, system and storage medium
US20130169674A1 (en) Method and apparatus for displaying medical image
US20250272844A1 (en) Echocardiography guide method and echocardiography guide device using same
WO2016061802A1 (en) Method and apparatus for displaying region of interest in current ultrasonic image
US11883186B2 (en) Methods and systems for continuous measurement of anomalies for dysmorphology analysis
KR102808518B1 (en) Method for segmenting medical image information and apparatus for performing the same
US9020231B2 (en) Method and apparatus for measuring captured object using brightness information and magnified image of captured image
KR102878371B1 (en) Method for providing information of joint space and device using the same
US20250295377A1 (en) Method for providing guideline for cardiac ultrasound image and device for providing guideline for cardiac ultrasound image using the same
KR102809592B1 (en) Method for segmenting medical image and apparatus for performing the same
US20250295381A1 (en) Method for providing user interface for cardiac ultrasound imaging guideline and device for providing user interface using the same
CN111179257A (en) Evaluation method and device, electronic equipment and storage medium
KR20250141628A (en) Method for providing guidelines of echocardiography images and device usinng the same
KR20250102794A (en) Method for evaluating spinal alignment condition and device for evaluating spinal alignment condition using the same
KR20230110986A (en) Method for providing information of the cervix and device using the same
JP2025525064A (en) Method for predicting risk of brain disease and method for training a brain disease risk analysis model

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JUNG, HAE-KYUNG;YOON, HEE-CHUL;LEE, HYUN-TAEK;AND OTHERS;REEL/FRAME:029568/0316

Effective date: 20130103

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8