WO2023113285A1 - Method for managing body images and apparatus using same - Google Patents
Method for managing body images and apparatus using same
- Publication number
- WO2023113285A1 (application PCT/KR2022/018743)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- area
- computing device
- body image
- photographing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H70/00—ICT specially adapted for the handling or processing of medical references
Definitions
- The body image management method and the apparatus using the same disclosed in the present disclosure relate to a method for managing medical images and, more specifically, to a management method for photographing medical images and retaining them in a predetermined storage.
- Japanese Patent Registration No. 2551476 discloses a database construction method in a medical image management system.
- Medical institutions store and retain medical images of patients acquired using medical imaging devices in a medical image information system.
- In the medical image information system, medical images are stored together with personal information for identifying the patient and information described by a doctor about the area or region where the medical image was obtained.
- The present disclosure solves the above-mentioned problems of the prior art by providing a 3D body model on which information about the body part being captured can be designated when a medical image is taken, so that the surface area of the patient's body contained in the medical image can be designated on the 3D body model, and so that the medical images of a specific patient can later be found intuitively on the 3D body model when they are viewed.
- The characteristic configuration of the present invention for achieving the above object and realizing the characteristic effects described later is as follows.
- According to one aspect of the present disclosure, there is provided a body image management method performed by a computing device that obtains a body image, which is an image in which at least a part of the body is captured, from a photographing device or from another device that includes a photographing device or holds a previously photographed image.
- The method comprises a capturing area selection step, in which the computing device provides a first user interface area that includes a virtual three-dimensional body model divided into a predetermined number of surface areas and allows at least one of the surface areas to be selected by manipulating the three-dimensional body model, so that a capturing area, which is at least one surface area associated with the body image, is input to the computing device or received from another interworking device provided with the first user interface area.
- The method further comprises a capturing area storage step, in which the computing device stores the obtained body image together with the information on the capturing area in a predetermined storage, or supports another device in storing it.
- In the capturing area storage step, the computing device stores the body image in an image standard that includes metadata or in the DICOM standard, and the information on the capturing area is written together in a reserved field or an extra (blank) field of the image standard or the DICOM standard.
- The information on the capturing area includes at least one of a classification index of the capturing area, a classification name of the capturing area, 3D coordinates on the 3D body model, and 2D coordinates processed to correspond to the 3D coordinates.
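For illustration only, the capturing-area information enumerated above can be pictured as a small record serialized next to each image. The sketch below is a minimal Python example; every field name in it is a hypothetical choice, since the disclosure fixes no schema:

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional, Tuple

@dataclass
class CaptureAreaInfo:
    """Capturing-area metadata stored alongside a body image.

    Field names are illustrative; the disclosure only requires that at
    least one of these items accompany the image.
    """
    class_index: Optional[int] = None                       # classification index of the area
    class_name: Optional[str] = None                        # classification name, e.g. "Chest, lower"
    coords_3d: Optional[Tuple[float, float, float]] = None  # point on the 3D body model
    coords_2d: Optional[Tuple[float, float]] = None         # 2D coords corresponding to coords_3d

    def to_json(self) -> str:
        # Serialized form suitable for writing into a reserved or
        # blank metadata field of an image standard.
        return json.dumps(asdict(self))

info = CaptureAreaInfo(class_index=17, class_name="Chest, lower",
                       coords_3d=(0.12, 1.05, 0.33))
print(info.to_json())
```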
- In one embodiment, before the capturing area selection step, the computing device performs a body image acquisition step of obtaining a body image in which at least a part of the body is captured, from another device that includes a photographing device or holds a pre-captured image, from a photographing device interworking with the computing device, or from a photographing device included in the computing device.
- In another embodiment, the body image acquisition step is performed after the capturing area selection step; the computing device obtains a body image in which at least a part of the body is captured from another device that includes a photographing device or holds a pre-captured image, from a photographing device interworking with the computing device, or from a photographing device included in the computing device.
- In one embodiment, the computing device further provides a second user interface area that includes a virtual three-dimensional body model divided into a predetermined number of surface areas and enables observation of the surface areas through the user's observation manipulation of the three-dimensional body model, and displays on the second user interface area at least one of the presence, location, and number of body images associated with each of the surface areas, or thumbnails of the associated body images.
- In one embodiment, the method further comprises providing, in response to a request to view a specific body image, a third user interface area in which the computing device displays the surface area associated with the specific body image on the virtual three-dimensional body model while providing the specific body image.
- a computer program comprising instructions implemented to perform the methods according to the present invention is also provided.
- a computer program may be stored on a machine-readable non-transitory recording medium.
- According to another aspect of the present disclosure, there is provided a computing device that acquires and manages a body image in which at least a part of the body is captured. The computing device comprises a communication unit that interworks with a device including a photographing device or holding a previously photographed image, or with a photographing device included in the computing device.
- The computing device further comprises a processor that performs a capturing area selection process and a capturing area storage process. In the capturing area selection process, a first user interface area that includes a virtual 3D body model divided into a predetermined number of surface areas and allows at least one of the surface areas to be selected by manipulating the 3D body model is provided, and a capturing area, which is at least one surface area associated with the body image, is received directly or through the communication unit from another interworking device provided with the first user interface area. In the capturing area storage process, the obtained body image is stored together with the information on the capturing area in a predetermined storage, or another device is supported, through the communication unit, in storing it.
- According to the present disclosure, when a medical image is captured, the body part being captured can be intuitively selected and designated using a 3D body model, and information on that body part can be stored together with the medical image in a format such as the DICOM standard. As a result, when the medical images are later viewed, users such as medical staff can intuitively find them on a virtual three-dimensional body model corresponding to the patient, which improves the convenience of the medical staff in taking and viewing medical images.
- FIG. 1 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs a body image management method according to the present disclosure.
- FIG. 2 is an exemplary block diagram illustrating hardware or software components of a computing device that performs a body image management method according to the present disclosure.
- FIG. 3A is an exemplary flowchart illustrating a method for managing a body image according to an embodiment of the present disclosure
- FIG. 3B is an exemplary flowchart illustrating a method for managing a body image according to another embodiment of the present disclosure.
- FIG. 4 is a diagram showing a user interface provided in the body image management method of the present disclosure by way of example.
- FIG. 5A is a diagram illustrating, by way of example, a first user interface area provided in the body image management method of the present disclosure.
- FIGS. 5B to 5D are diagrams illustrating, by way of example, a second user interface area provided in the body image management method of the present disclosure.
- Terms such as 'first' or 'second' may be used to describe various components, but such terms are to be interpreted solely for the purpose of distinguishing one component from another and do not imply any order. For example, a first element may be termed a second element, and similarly, a second element may be termed a first element.
- 'image' is a term referring to an image that can be seen by the eye or a digital representation of an image (eg, displayed on a video screen).
- 'surface images' or 'skin images' refer to dermatology images.
- 'metadata' is a term that refers to data that describes other data.
- For example, an image may include, as metadata, information on the device that took it, the time of shooting, the exposure, whether the flash was used, the resolution, the image size, and the like.
- The term 'DICOM' (Digital Imaging and Communications in Medicine) refers to the standard for digital imaging and communications in medicine published by the ACR (American College of Radiology) and NEMA (National Electrical Manufacturers Association).
- The term 'medical image storage and transmission system' refers to a system that stores, processes, and transmits images in accordance with the DICOM standard. Medical images obtained using digital medical imaging equipment such as X-ray, CT, and MRI are stored in a standard format and can be transmitted to terminals inside and outside medical institutions through a network, and reading results and medical records can be added to them.
- The term 'learning' or 'training' refers to performing machine learning through procedural computing; as will be appreciated by those skilled in the art, it is not intended to refer to a mental function such as human educational activity.
- The present invention covers all possible combinations of the embodiments presented in this disclosure. It should be understood that the various embodiments of the present invention are different from one another but need not be mutually exclusive. For example, specific shapes, structures, and characteristics described herein in connection with one embodiment may be implemented in another embodiment without departing from the spirit and scope of the invention. Additionally, it should be understood that the location or arrangement of individual components within each disclosed embodiment may be changed without departing from the spirit and scope of the invention. Accordingly, the detailed description that follows is not intended to be taken in a limiting sense.
- FIG. 1 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs a body image management method according to the present disclosure.
- Referring to FIG. 1, a computing device 100 includes a communication unit 110 and a processor 120, and may communicate directly or indirectly with an external computing device (not shown) through the communication unit 110.
- The computing device 100 may include typical computer hardware (e.g., a computer, processor, memory, storage, input and output devices, and the other components of conventional computing devices; electronic communication devices such as routers and switches; and electronic information storage systems such as network-attached storage (NAS) and storage area networks (SAN)) together with computer software (i.e., instructions that enable the computing device to function in a specific way), and may achieve the desired system performance through such a combination.
- the storage may include a storage device such as a hard disk or a universal serial bus (USB) memory as well as a storage device based on a network connection such as a cloud server.
- The communication unit 110 of such a computing device may transmit and receive requests and responses to and from other interworking computing devices, for example, a dedicated storage such as a database server.
- As an example, such requests and responses may be carried over the same Transmission Control Protocol (TCP) session, but are not limited thereto and may also be transmitted and received as User Datagram Protocol (UDP) datagrams, for example.
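To make the UDP alternative mentioned above concrete, the sketch below performs one request/response round trip over a loopback datagram socket in Python; the port, payload, and reply format are all arbitrary illustrations, not an interface the disclosure defines:

```python
import socket

# Server socket: receives a request datagram and answers it.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))            # let the OS pick a free port
server_addr = server.getsockname()

# Client socket: sends the request and waits for the response.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"GET body_image 42", server_addr)

request, client_addr = server.recvfrom(1024)
server.sendto(b"OK " + request.split()[-1], client_addr)

response, _ = client.recvfrom(1024)
print(response)                           # b'OK 42'
client.close(); server.close()
```

Unlike a TCP session, each exchange here is a self-contained datagram with no connection state, which is the trade-off the paragraph above alludes to.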
- the communication unit 110 may be implemented in the form of a communication module including a communication interface.
- Such communication interfaces include Wireless LAN (WLAN), Wireless Fidelity (WiFi) Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), and High Speed Downlink Packet Access (HSDPA), as well as short-range technologies such as Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, and Near Field Communication (NFC).
- the communication unit 110 may transmit/receive data from another computing device through an appropriate communication interface.
- the communication unit 110 may include a keyboard, a mouse, other external input devices, a printing device, a display, and other external output devices for receiving commands or instructions, or may be interlocked with them.
- A display unit may be embedded in the computing device 100, or the computing device 100 may interwork with an external display device through the communication unit 110.
- such a display unit or display device may be a touch screen capable of touch input.
- the processor 120 of the computing device may include one or more micro processing units (MPUs), central processing units (CPUs), and graphics processing units (GPUs) having internal memory such as cache memory and/or external memory.
- The processor 120 may also include a microprocessor such as a neural processing unit (NPU) or a tensor processing unit (TPU), a controller such as a microcontroller, an embedded microcontroller, or a microcomputer, an arithmetic logic unit (ALU), a digital signal processor such as a programmable digital signal processor, or another programmable device.
- The computing device may further include a software configuration comprising an operating system and an application that serves a specific purpose.
- FIG. 2 is an exemplary block diagram illustrating hardware or software components of a computing device that performs a body image management method according to the present disclosure
- FIG. 3A is an exemplary flowchart illustrating a first embodiment of the body image management method according to the present disclosure.
- Referring to FIGS. 2 and 3A, the body image acquisition module 210 implemented by the computing device 200 acquires, from a source of the body image, for example, a photographing device (e.g., a camera) 205a included in the computing device 200, a photographing device 205b interworking with the computing device 200, a photographing device 305 included in another device 300, or another device 300 holding pre-captured images, a body image, which is an image in which at least a part of the body of the subject (or patient) is photographed (body image acquisition step; S100).
- For example, the computing device 200 may be a personal computer used by medical staff, and the other device 300 may be a portable terminal including a camera.
- FIG. 4 is a diagram showing a user interface provided in the body image management method of the present disclosure by way of example.
- Referring to FIG. 4, an exemplary configuration of a user interface 400 provided by the computing device 200, or provided through another device 300 interworking with the computing device 200, is shown. For the convenience of the user, information on the subject (patient), including at least one of the subject's name 422, gender 424, age (not shown), identification number (patient ID; 426), and the name of the person in charge 428, may be provided to the user.
- A list 430 of a plurality of subjects may be provided together, and a predetermined interface element, such as the button exemplified by reference numeral 410 in FIG. 4, may also be provided.
- Next, the capturing area selection module 220 implemented by the computing device 200 provides a first user interface area on a predetermined display unit included in the computing device 200, or supports another device 300 interworking with the computing device 200 in providing the first user interface area, whereby the capturing area, which is the surface area associated with the body image, is input or received (capturing area selection step; S200).
- 5A is a diagram illustrating a first user interface area provided in the body image management method of the present disclosure by way of example.
- the first user interface area 500 includes a virtual 3D body model 520, and the 3D body model 520 is divided into a predetermined number of surface areas 540.
- The surface areas may be regions segmented with reference to surface anatomy relevant to plastic surgery or dermatology, which is particularly useful when the body image is, for example, a dermoscopy image.
- The user may select at least one 550 of the surface areas 540 by manipulating the first user interface area 500, in particular the 3D body model 520; the selected surface area becomes the capturing area 550, a surface area designated as being associated with the body image.
- The 3D body model 520, a 3D object on the user interface, may be rotated, enlarged, reduced, or translated by various manipulations, such as clicks of the left, right, and middle buttons of a mouse (not shown) interworking with the computing device 200 as an input device, up-down scrolling of a wheel, taps on a touch screen, pinches, and the like.
- Since methods for constructing user interfaces for handling 3D objects are well known to those skilled in the art of computer hardware and software, it is needless to say that the manipulation is not limited to the methods described above.
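The manipulations described above ultimately reduce to applying a transform to the model's vertices before rendering. As a toolkit-independent sketch (assuming, as an illustration, that the y-axis is the model's vertical axis), a left-right drag and a wheel zoom can be expressed as:

```python
import math

def yaw_rotate(vertex, angle_rad):
    """Rotate an (x, y, z) vertex around the vertical y-axis."""
    x, y, z = vertex
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)

def zoom(vertex, factor):
    """Uniformly scale a vertex about the origin (wheel up/down)."""
    return tuple(factor * v for v in vertex)

# A drag might map pixels moved to radians; here, a quarter turn
# about the vertical axis followed by a 2x zoom.
v = (1.0, 0.5, 0.0)
v = yaw_rotate(v, math.pi / 2)
v = zoom(v, 2.0)
print(v)
```

A real UI would apply such transforms to every vertex of the mesh (or, equivalently, to the camera) on each input event.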
- the interface area 500 may be configured to simultaneously designate several surface areas.
- A specific point 560 may also be designated; the specific point 560 may indicate the center point of the captured body image.
- A user interface element 590 indicating the classification name of the selected capturing area 550 may be displayed on the first user interface area.
- 'Chest, lower' is exemplified as a classification name for the imaging area 550 in the user interface element 590 of FIG. 5A.
- Next, the capturing area storage module 230 implemented by the computing device 200 stores the body image together with the information on the capturing area 550 and the specific point 560 in a predetermined storage 240, or supports another device 300' in storing it in a storage 340' (capturing area storage step; S300).
- The other device 300' may or may not be the same device as the other device 300.
- In the capturing area storage step (S300), personal information for identifying the subject of the body image, such as the subject's name, age, and gender, may be handled together with the body image.
- The body image may be stored in an image standard including metadata or in the DICOM standard, and the information on the capturing area 550 may also be written in a reserved field or an extra (blank) field of the image standard or the DICOM standard.
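To make the idea of such an extra field concrete, the sketch below hand-encodes one DICOM data element (explicit VR, little endian) carrying a capturing-area name. The group/element numbers are placeholders chosen for illustration; a real implementation would use a DICOM library and a properly reserved private block rather than this hand-rolled encoding:

```python
import struct

def encode_lo_element(group, element, text):
    """Encode one explicit-VR little-endian data element with VR 'LO'
    (Long String), following the DICOM encoding rules: tag, VR,
    16-bit value length, then the (even-length) value."""
    value = text.encode("ascii")
    if len(value) % 2:                 # DICOM values must have even length
        value += b" "                  # pad Long Strings with a space
    # group (2 bytes) | element (2 bytes) | VR (2 chars) | length (2 bytes)
    header = struct.pack("<HH2sH", group, element, b"LO", len(value))
    return header + value

# Hypothetical private tag carrying the capturing-area classification name.
elem = encode_lo_element(0x7777, 0x0101, "Chest, lower")
print(elem.hex())
```

Appended to a DICOM dataset, such an element travels with the image wherever the file goes, which is what lets viewers recover the capturing area later.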
- The information on the capturing area 550 may be one or more of a classification index identifying the capturing area 550 among the surface areas 540, the classification name of the capturing area, the 3D coordinates of the specific point 560 on the 3D body model 520, and 2D coordinates processed to correspond to those 3D coordinates. It is well known that a specific point on the body surface can be made to correspond to 2D coordinates using various projection techniques.
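As one concrete (and deliberately crude) instance of such a projection technique, a cylindrical unwrapping maps a 3D point on the body surface to normalized 2D coordinates. The function below is an illustrative assumption only, since the disclosure does not prescribe any particular projection:

```python
import math

def cylindrical_project(point, y_min=0.0, y_max=1.8):
    """Map a 3D point on the body surface to 2D (u, v) in [0, 1).

    u: angle around the vertical axis, normalized to [0, 1)
    v: height along the body, normalized between y_min and y_max
    (a crude model; a real system would use the mesh's UV unwrapping)
    """
    x, y, z = point
    u = (math.atan2(z, x) / (2 * math.pi)) % 1.0
    v = (y - y_min) / (y_max - y_min)
    return (u, v)

# A point on the front of the chest, roughly 1.3 m up the body:
print(cylindrical_project((0.0, 1.3, 0.15)))
```

Storing both the 3D coordinates and their 2D projection lets a viewer place the point on either the 3D model or a flat body chart.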
- A second embodiment of the body image management method of the present disclosure, which differs from the first embodiment only in the order of steps S100 and S200, is as follows.
- FIG. 3B is an exemplary flowchart illustrating a second embodiment of the body image management method according to the present disclosure.
- In the second embodiment, the body image acquisition step (S100') is performed after the capturing area selection step (S200').
- Embodiments of the body image management method according to the present disclosure may perform an additional function of assisting a user in viewing the capturing areas, and FIGS. 5B to 5D are exemplary drawings illustrating a second user interface area provided for this purpose.
- Referring to FIGS. 5B to 5D, the capturing area viewing assistance module 250 implemented by the computing device 200 provides a second user interface area 500' that, like the above-mentioned 3D body model 520, includes a virtual 3D body model 520' representing the body of the subject (patient) and enables observation of the subject's surface areas through the user's observation manipulation of the 3D body model 520', and displays on the second user interface area 500' at least one of the presence or absence of a body image associated with each of the surface areas, the locations 560a, 560b, and 560c thereof, and the numbers 570a, 570b, and 570c thereof (capturing area viewing assistance step; S400).
- The presence or absence of a body image associated with each of the surface areas may be displayed in such a way that surface areas 544a and 544b with no associated body image are distinguished from surface areas 524a, 524b, and 524c with associated body images; although this is shown in a color-coded manner in FIG. 5B, it is, of course, not limited thereto.
- In step S400, the capturing area viewing assistance module 250 may also display thumbnails 580a, 580b, and 580c of the associated body images on the second user interface area 500'.
- The user's observation manipulation on the second user interface area 500' may be any of various manipulations that rotate, enlarge, reduce, or translate the 3D body model 520', such as clicking the left, right, and middle buttons of a mouse (not shown) and scrolling a wheel up and down; as described above, it is, of course, not limited thereto.
- The computing device 200 may provide a user interface area including a list 440 of at least one associated body image so that the user can view the associated body images, as illustrated by reference numerals 440 and 460 in FIG. 4.
- In addition, in response to a request to view a specific body image 442, the capturing area viewing assistance module 250 implemented by the computing device 200 may further provide a third user interface area 480 that displays the surface area 484 associated with the specific body image 442 on a virtual 3D body model 482 while providing the specific body image in a predetermined interface area 460 (S500).
- The advantage of the technology described in this disclosure through the above embodiments is that, when taking and viewing medical images, the target body part can be intuitively selected and designated using a 3D body model, and information on that body part can be stored together with the medical images in a format such as the DICOM standard, so that users such as medical staff can later intuitively find the medical images on a virtual 3D body model corresponding to the patient, which improves the convenience of the medical staff.
- the method disclosed in this disclosure may be performed not only once but also repeatedly, intermittently or continuously according to the user's request or need.
- The hardware device may include a general-purpose computer and/or a dedicated computing device, or a particular aspect or component of a specific computing device.
- The processes may be realized by a processor as described above, combined with a memory such as ROM/RAM for storing program instructions and configured to execute the instructions stored in the memory. Additionally or alternatively, the processes may be realized by an application-specific integrated circuit (ASIC), a programmable gate array such as a field-programmable gate array (FPGA), a programmable logic unit (PLU) or programmable array logic (PAL), or any other device or combination of devices capable of executing and responding to instructions and configured to process electronic signals.
- a processing device may run an operating system and one or more software applications running on the operating system.
- a processing device may also access, store, manipulate, process, and generate data in response to execution of software.
- The processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
- a processing device may include a plurality of processors or a processor and a controller. Other processing configurations are also possible, such as parallel processors.
- the hardware device may also include a communication unit as described above capable of exchanging signals with an external device.
- Software may include a computer program, code, instructions, or a combination of one or more of the foregoing, and may independently or collectively instruct or configure a processing device to operate as desired.
- Software and/or data may be permanently or temporarily embodied in any tangible machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, so as to be interpreted by a processing device or to provide instructions or data to a processing device.
- Software may be distributed on networked computer systems and stored or executed in a distributed manner.
- Software and data may be stored on one or more machine-readable recording media.
- Machine-readable media may include program instructions, data files, data structures, etc. alone or in combination.
- Program instructions recorded on a machine-readable recording medium may be specially designed and configured for the embodiment or may be known and usable to those skilled in the art of computer software.
- Examples of machine-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs, DVDs, and Blu-ray discs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- Program instructions may be stored and compiled or interpreted for execution on any one of the foregoing devices, as well as on heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software, or on any other machine capable of executing program instructions.
- Program instructions include not only machine code, such as that created with a structured programming language such as C, an object-oriented programming language such as C++, or another high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies), and bytecode, but also high-level language code that can be executed by a computer using an interpreter or the like.
- the methods and combinations of methods may be implemented as executable code that performs each step.
- the method may be implemented as systems performing the steps, the methods may be distributed in several ways across devices or all functions may be integrated into one dedicated, stand-alone device or other hardware.
- the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such sequential combinations and combinations are intended to fall within the scope of this disclosure.
Abstract
Description
λ³Έ κ°μμμ κ°μλλ μ 체 μμ κ΄λ¦¬ λ° μ΄λ₯Ό μ΄μ©ν μ₯μΉλ μλ£ μμμ κ΄λ¦¬ λ°©λ²μ κ΄ν κ²μ΄λ©°, λ ꡬ체μ μΌλ‘λ μλ£ μμμ 촬μνμ¬ μ΄λ₯Ό μμ μ μ μ₯μμ 보μ νλ κ΄λ¦¬ λ°©λ²μ κ΄ν κ²μ΄λ€.Body image management and an apparatus using the body image management disclosed in the present disclosure relate to a method for managing medical images, and more specifically, to a management method for photographing medical images and retaining them in a predetermined storage.
There are various methods of constructing a medical image database for managing medical images such as body images. For example, Japanese Patent Registration No. 2551476 discloses a database construction method in a medical image management system.
Medical institutions store and retain patients' medical images, acquired using medical imaging devices, in a medical image information system. In such a system, medical images are stored together with personal information identifying the patient and information written by a doctor about the area or region where each image was obtained.
However, even when medical images taken with such conventional medical imaging equipment are stored and retained in a system under an image standard, for example the DICOM standard, they do not include structured data that displays at a glance which part of the body was photographed so that medical staff can grasp it intuitively, and there is a limit to viewing mutually related medical images together by body position.
Accordingly, the present disclosure aims to solve the above problems of the prior art by providing a 3D body model on which information about the photographed body part can be designated at the time a medical image is captured, so that the surface area of the patient's body shown in the medical image can be designated on the 3D body model and, when the medical images of a specific patient are later viewed, those images can be found intuitively on the 3D body model. The disclosure also aims to provide an apparatus using this method.
The characteristic configuration of the present invention for achieving the above objects and realizing the characteristic effects of the invention described later is as follows.
According to one aspect of the present disclosure, there is provided a body image management method performed by a computing device that obtains a body image, i.e., an image in which at least a part of the body is captured, from a photographing device or from a device that includes a photographing device or holds a previously captured image. The method includes: a photographing area selection step in which the computing device, which includes a virtual three-dimensional body model divided into a predetermined number of surface areas, provides a first user interface area that allows at least one of the surface areas to be selected by manipulating the 3D body model, and thereby receives as input the photographing area, i.e., at least one surface area associated with the body image, or receives the photographing area from another device, interworking with the computing device, on which the first user interface area is provided; and a photographing area storage step in which the computing device stores the obtained body image, together with information on the photographing area, in a predetermined storage, or supports another device in storing it.
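The two claimed steps can be sketched in code. This is a minimal illustration, not the disclosed implementation: the function names, the integer area indices, and the sidecar JSON file standing in for a metadata field are all assumptions.

```python
import json
from pathlib import Path

# Hypothetical catalogue of surface areas of the 3D body model
# (indices and names invented for illustration).
SURFACE_AREAS = {12: "Chest, upper", 13: "Chest, lower", 27: "Forearm, left"}

def select_photographing_area(area_index: int) -> dict:
    """Photographing area selection step: validate a surface area chosen
    in the first user interface area and return its description."""
    if area_index not in SURFACE_AREAS:
        raise ValueError(f"unknown surface area: {area_index}")
    return {"index": area_index, "name": SURFACE_AREAS[area_index]}

def store_body_image(image_bytes: bytes, area: dict, storage_dir: Path) -> Path:
    """Photographing area storage step: keep the body image together with
    the photographing-area information in a predetermined storage."""
    storage_dir.mkdir(parents=True, exist_ok=True)
    image_path = storage_dir / f"body_{area['index']}.img"
    image_path.write_bytes(image_bytes)
    # Sidecar file standing in for the metadata field of an image standard.
    (storage_dir / f"body_{area['index']}.json").write_text(json.dumps(area))
    return image_path
```

In a real system the area information would travel inside the image file's own metadata rather than a sidecar file, as described below for the DICOM case.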
Advantageously, in the photographing area storage step, the computing device stores the body image under an image standard that includes metadata, or under the DICOM standard, and records the information on the photographing area together in a reserved field or a blank (spare) field of that image standard or DICOM standard.
Preferably, the information on the photographing area includes at least one of: a classification index of the photographing area, a classification name of the photographing area, 3D coordinates on the 3D body model, and 2D coordinates processed to correspond to the 3D coordinates.
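Such a record might be modeled as a small data structure. The sketch below is an assumption for illustration; in particular, producing the 2D coordinates by dropping the depth axis (a frontal orthographic projection) is only one possible reading of "processed to correspond to the 3D coordinates".

```python
from dataclasses import dataclass, asdict

@dataclass
class PhotographingAreaInfo:
    index: int                       # classification index of the area
    name: str                        # classification name, e.g. "Chest, lower"
    xyz: tuple                       # 3D coordinates on the body model
    uv: tuple                        # 2D coordinates derived from xyz

def orthographic_uv(xyz):
    """One possible 'processing' of 3D coordinates into 2D coordinates:
    discard the depth axis (frontal orthographic projection)."""
    x, y, z = xyz
    return (x, y)

point = (0.12, -0.40, 0.95)  # invented coordinates
info = PhotographingAreaInfo(13, "Chest, lower", point, orthographic_uv(point))
```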
In one embodiment of the method, before the photographing area selection step, the computing device performs a body image acquisition step of obtaining a body image, in which at least a part of the body is captured, from another device that includes a photographing device or holds a previously captured image and interworks with the computing device, or from a photographing device included in the computing device.
In another embodiment of the method, after the photographing area selection step, the computing device performs a body image acquisition step of obtaining a body image, in which at least a part of the body is captured, from another device that includes a photographing device or holds a previously captured image and interworks with the computing device, or from a photographing device included in the computing device.
Preferably, the method further includes a photographing area viewing assistance step in which the computing device, which includes a virtual three-dimensional body model divided into a predetermined number of surface areas, provides a second user interface area that allows the surface areas to be observed through the user's observation manipulation of the 3D body model, and displays on the second user interface area at least one of the presence, location, and number of the body images associated with each of the surface areas, or displays thumbnails of the associated body images.
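The per-area summary that such a second user interface area would render could be computed along the following lines; the record layout and file names are invented for illustration.

```python
from collections import defaultdict

# Hypothetical store: each record links a stored body image
# to the index of the surface area it is associated with.
records = [
    {"file": "img_001.dcm", "area_index": 13},
    {"file": "img_002.dcm", "area_index": 13},
    {"file": "img_003.dcm", "area_index": 27},
]

def images_per_area(records):
    """Summary the second user interface area could overlay on each surface
    region: presence, count, and thumbnail list of associated body images."""
    by_area = defaultdict(list)
    for r in records:
        by_area[r["area_index"]].append(r["file"])
    return {idx: {"count": len(files), "thumbnails": files}
            for idx, files in by_area.items()}

summary = images_per_area(records)
```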
Advantageously, the method further includes a step in which, in response to a request to view a specific body image, the computing device provides the specific body image while also providing a third user interface area that displays, on the virtual 3D body model, the surface area associated with that image.
According to another aspect of the present invention, there is also provided a computer program comprising instructions implemented to perform the methods according to the present invention. For example, such a computer program may be stored on a machine-readable non-transitory recording medium.
According to yet another aspect of the present invention, there is provided a computing device that acquires and manages a body image in which at least a part of the body is captured. The computing device includes: a communication unit that interworks with a device including a photographing device or holding a previously captured image, or with a photographing device included in the computing device; and a processor that performs a photographing area selection process and a photographing area storage process. In the photographing area selection process, the processor, which includes a virtual 3D body model divided into a predetermined number of surface areas, provides, through the communication unit to a predetermined display device, a first user interface area that allows at least one of the surface areas to be selected by manipulating the 3D body model, and thereby receives as input the photographing area, i.e., at least one surface area associated with the body image, or receives the photographing area through the communication unit from another interworking device on which the first user interface area is provided. In the photographing area storage process, the processor stores the obtained body image, together with information on the photographing area, in a predetermined storage, or supports another device, through the communication unit, in storing it.
According to the invention of the present disclosure, when a medical image is captured, the photographed body part can be intuitively selected and designated using a 3D body model, and the information on that body part can be stored together with the medical image in a format such as the DICOM standard. As a result, when viewing medical images later, users such as medical staff can intuitively find the images on a virtual 3D body model corresponding to the patient, which improves the convenience of the medical staff in capturing and viewing medical images.
The accompanying drawings used to describe the embodiments of the present invention are only some of those embodiments, and a person of ordinary skill in the art (hereinafter, "skilled person") may obtain other drawings based on these drawings without inventive effort.
FIG. 1 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs a body image management method according to the present disclosure.
FIG. 2 is an exemplary block diagram illustrating hardware or software components of a computing device that performs a body image management method according to the present disclosure.
FIG. 3A is an exemplary flowchart illustrating one embodiment of a body image management method according to the present disclosure, and FIG. 3B is an exemplary flowchart illustrating another embodiment of the method.
FIG. 4 is a diagram showing, by way of example, a user interface provided in the body image management method of the present disclosure.
FIG. 5A is a diagram illustrating, by way of example, a first user interface area provided in the body image management method of the present disclosure.
FIGS. 5B to 5D are diagrams illustrating, by way of example, a second user interface area provided in the body image management method of the present disclosure.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS The following detailed description of the present invention refers to the accompanying drawings, which illustrate specific embodiments in which the invention may be practiced, in order to make its objects, technical solutions, and advantages clear. These embodiments are described in sufficient detail to enable a skilled person to practice the invention. In the description referring to the accompanying drawings, the same reference numerals are given to the same components regardless of the drawing, and duplicate descriptions thereof are omitted. Like reference numbers in the drawings indicate the same or similar functions throughout the several views.
Terms such as "first" or "second" may be used to describe various components, but such terms are to be interpreted solely as distinguishing one component from another; no order is implied. For example, a first element may be termed a second element, and similarly, a second element may be termed a first element.
When an element is referred to as being "connected" to another element, it may be directly connected or coupled to that element, but it should be understood that other elements may exist in between.
Singular expressions include plural expressions unless the context clearly dictates otherwise. In this disclosure, terms such as "comprise" or "have" are intended to indicate that the described feature, number, step, operation, component, part, or combination thereof exists, and should not be understood to preclude in advance the existence or possible addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
Also, in this disclosure, "image" refers to an image that can be seen by the eye or to a digital representation of such an image (e.g., displayed on a video screen). Throughout this disclosure, "surface image" or "skin image" refers to a dermatology image.
In this disclosure, "metadata" refers to data that describes other data. For example, an image may have as metadata information about the device that captured it, the time of capture, the exposure, whether the flash was used, the resolution, the image size, and so on.
In this disclosure, the DICOM (Digital Imaging and Communications in Medicine) standard is a term that collectively refers to the various standards used for digital image representation and communication in medical devices; the DICOM standard is published by a joint committee formed by the American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA).
And in this disclosure, "picture archiving and communication system (PACS)" refers to a system that stores, processes, and transmits images in accordance with imaging and communication standards such as the DICOM standard. Medical images obtained using digital medical imaging equipment such as X-ray, CT, and MRI are stored in a standard format and can be transmitted to terminals inside and outside medical institutions over a network, and reading results and medical records can be added to them.
Also, in this disclosure, "learning" or "training" refers to performing machine learning through procedural computing; a skilled person will understand that it is not intended to refer to a mental act such as human educational activity.
Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by a person of ordinary skill in the art. Terms such as those defined in commonly used dictionaries should be interpreted as having meanings consistent with their meaning in the context of the related art, and, unless explicitly defined in the present disclosure, are not to be interpreted in an idealized or excessively formal sense.
Moreover, the present invention covers all possible combinations of the embodiments presented in this disclosure. It should be understood that the various embodiments of the present invention are different from one another but need not be mutually exclusive. For example, specific shapes, structures, and characteristics described herein in relation to one embodiment may be implemented in other embodiments without departing from the spirit and scope of the invention. Additionally, it should be understood that the position or arrangement of individual components within each disclosed embodiment may be changed without departing from the spirit and scope of the invention. Accordingly, the detailed description that follows is not intended to be taken in a limiting sense.
In this disclosure, unless otherwise indicated or clearly contradicted by context, items referred to in the singular encompass the plural unless the context requires otherwise. In addition, in describing the present invention, detailed descriptions of related known configurations or functions are omitted when it is judged that they could obscure the gist of the invention.
Hereinafter, preferred embodiments of the present invention are described in detail with reference to the accompanying drawings so that a skilled person can easily practice the invention.
FIG. 1 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs a body image management method according to the present disclosure.
Referring to FIG. 1, a computing device 100 according to an embodiment of the present disclosure includes a communication unit 110 and a processor 120, and can communicate directly or indirectly with an external computing device (not shown) through the communication unit 110.
Specifically, the computing device 100 may achieve the desired system performance using a combination of typical computer hardware (e.g., devices that may include a computer, a processor, memory, storage, input and output devices, and other components of conventional computing devices; electronic communication devices such as routers and switches; and electronic information storage systems such as network-attached storage (NAS) and storage area networks (SAN)) and computer software (i.e., instructions that cause the computing device to function in a particular way). The storage may take the form not only of memory devices such as hard disks and USB (Universal Serial Bus) memory but also of network-connection-based storage such as a cloud server.
The communication unit 110 of such a computing device can transmit and receive requests and responses to and from another interworking computing device, for example a dedicated storage such as a database server. As one example, such requests and responses may be made within the same TCP (Transmission Control Protocol) session, but they are not limited thereto; for example, they may also be transmitted and received as UDP (User Datagram Protocol) datagrams.
Specifically, the communication unit 110 may be implemented in the form of a communication module including a communication interface. For example, the communication interface may include wireless Internet interfaces such as WLAN (Wireless LAN), WiFi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMax (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access), and short-range communication interfaces such as Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra-WideBand), ZigBee, and NFC (Near Field Communication). In addition, the communication interface may represent any interface (for example, various buses such as a data bus, or wired and wireless interfaces) through which the processor can communicate with the outside.
For example, the communication unit 110 can transmit and receive data to and from another computing device through a suitable communication interface as described above. Additionally, in a broad sense, the communication unit 110 may include, or interwork with, a keyboard, a mouse, or other external input devices for receiving commands or instructions, and a printer, a display, or other external output devices. It is known that, in order to enable interaction with a user by displaying and providing a suitable user interface to the user of a user terminal such as a handheld terminal or a personal computer (PC), or of a server, the computing device 100 may have a built-in display unit or may interwork with an external display device through the communication unit 110. For example, such a display unit or display device may be a touchscreen capable of receiving touch input.
In addition, the processor 120 of the computing device may include hardware components such as one or more microprocessors with internal and/or external memory (for example, a cache memory), such as an MPU (micro processing unit), CPU (central processing unit), GPU (graphics processing unit), NPU (neural processing unit), or TPU (tensor processing unit); a controller, such as a microcontroller, an embedded microcontroller, or a microcomputer; an ALU (arithmetic logic unit); or a digital signal processor, such as a programmable digital signal processor or other programmable device. It may further include software components such as an operating system and applications that perform specific purposes.
Now, the body image management method according to the present disclosure will be described in detail with reference to FIGS. 2 to 5B.
FIG. 2 is an exemplary block diagram illustrating hardware or software components of a computing device that performs a body image management method according to the present disclosure, and FIG. 3A is an exemplary flowchart illustrating a first embodiment of a body image management method according to the present disclosure.
Referring to FIGS. 2 and 3A, the first embodiment of the body image management method of the present disclosure includes a step (body image acquisition step; S100) in which a body image acquisition module 210, implemented by the computing device 200, obtains, from a source of body images, a body image in which at least a part of the body of a subject (or patient) is captured: for example, from a photographing device (e.g., a camera) 205a included in the computing device 200, from a photographing device 205b interworking with the computing device 200 or a photographing device 305 included in another device 300, or from another device 300 holding a previously captured image.
For example, if the computing device 200 is a personal computer used by medical staff, the other device 300 may be a portable terminal that includes a camera.
FIG. 4 is a diagram showing, by way of example, a user interface provided in the body image management method of the present disclosure.
Referring to FIG. 4, an exemplary configuration of a user interface 400, provided by the computing device 200 or provided via another interworking device 300, in embodiments of the body image management method according to the present disclosure is shown. For the user's convenience, information on the subject (patient), including at least one of the subject's name 422, sex 424, age (not shown), patient ID 426, and name of the physician in charge 428, may be provided to the user. In addition, a list 430 of multiple subjects may be provided together, and an interface element such as the button indicated by reference numeral 410 in FIG. 4 may be provided so that the body image acquisition step (S100) can be performed with a simple operation.
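The subject information shown in FIG. 4 amounts to a simple record per patient; a sketch of such a record and of how the list 430 could be rendered from it follows (the field names, sample entries, and label format are invented for illustration).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Subject:
    name: str                # 422
    sex: str                 # 424
    patient_id: str          # 426
    physician: str           # 428
    age: Optional[int] = None  # age is marked "not shown" in FIG. 4

# Hypothetical entries backing the subject list 430.
subjects = [
    Subject("Hong Gildong", "M", "P-0001", "Dr. Kim", 45),
    Subject("Kim Younghee", "F", "P-0002", "Dr. Lee"),
]

# One line of the list per subject, as the UI might display it.
labels = [f"{s.patient_id}: {s.name} ({s.sex})" for s in subjects]
```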
Following the body image acquisition step (S100), the first embodiment of the body image management method of the present disclosure further includes a step (photographing area selection step; S200) in which a photographing area selection module 220, implemented by the computing device 200, receives as input, or is passed, the photographing area, i.e., the surface area associated with the body image, by providing a first user interface area on a display unit included in the computing device 200, or by supporting another device 300 interworking with the computing device 200 in providing the first user interface area.
FIG. 5A is a diagram illustrating, by way of example, a first user interface area provided in the body image management method of the present disclosure.
Referring to FIG. 5A, the first user interface area 500 includes a virtual 3D body model 520, which is divided into a predetermined number of surface areas 540. The surface areas may be divided with reference to the surface anatomy relevant to plastic surgery and dermatology, which is particularly useful when, for example, the body image is a dermoscopy image.
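One minimal way to represent such a partition, assuming each triangle of the model mesh is labeled with the surface area it belongs to (the face ids, area indices, and names below are invented for illustration):

```python
# Each mesh face (triangle id) of the 3D body model is labeled with the
# index of the surface area it belongs to.
FACE_TO_AREA = {0: 12, 1: 12, 2: 13, 3: 13, 4: 13, 5: 27}
AREA_NAMES = {12: "Chest, upper", 13: "Chest, lower", 27: "Forearm, left"}

def area_of_face(face_id: int) -> str:
    """Resolve the surface area that a picked mesh face belongs to."""
    return AREA_NAMES[FACE_TO_AREA[face_id]]

def faces_of_area(area_index: int) -> list:
    """All faces making up one surface area (e.g. for highlighting it)."""
    return [f for f, a in FACE_TO_AREA.items() if a == area_index]
```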
Through manipulation of the first user interface area 500, in particular of the 3D body model 520, the user can select at least one surface area 550 from among the surface areas 540; the selected surface area 550 is the photographing area 550, i.e., the surface area designated as being associated with the body image.
For example, various manipulation schemes are known that allow the user to rotate, enlarge, reduce, or pan a three-dimensional object on the user interface such as the 3D body model 520: clicks of the left, right, and middle buttons of a mouse (not shown), an input device interworking with the computing device 200, up-down scrolling of the wheel, and gesture manipulations such as a tap or pinch on a touchscreen. Of course, the manipulation is not limited to these. Since schemes for configuring user interfaces that handle three-dimensional objects are well known to persons of ordinary skill in computer hardware and software, excessive detail on them is omitted here.
As a variation not shown, there are cases in which several surface areas need to be photographed simultaneously, depending on the device serving as the source of the body image, for example the photographing device (205a, 205b, 305, etc.); for this purpose, the first user interface area 500 may be configured so that several surface areas can be designated at the same time.
In the capturing area selection step (S200), not only the capturing area 550 but also a specific point 560 belonging to that capturing area 550 may be selected together; typically, the specific point 560 indicates the center point of the captured body image.
In the capturing area selection step (S200), when the capturing area 550 is selected on the first user interface area 500, a user interface element 590 indicating the designation of the selected capturing area 550 may further be provided on the first user interface area 500; in the user interface element 590 of FIG. 5A, 'Chest, lower' is shown as the designation of the capturing area 550.
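The designation shown by element 590 can come from a simple mapping from each surface area's distinguishing index to its anatomical name. A sketch of such a lookup (the indices and all labels other than 'Chest, lower' are hypothetical):

```python
# Hypothetical index-to-designation table for the surface areas 540.
AREA_DESIGNATIONS = {
    7: "Chest, upper",
    8: "Chest, lower",   # the designation shown in FIG. 5A
    12: "Abdomen",
}

def designation_of(area_index):
    """Return the label to display in UI element 590 for a selected area."""
    return AREA_DESIGNATIONS.get(area_index, "Unknown area")
```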
Following the capturing area selection step (S200), the first embodiment of the body image management method of the present disclosure further includes a capturing area storage step (S300), in which a capturing area storage module 230 implemented by the computing device 200 stores the body image, together with the information on the capturing area 550 and, more specifically, together with the information on the specific point 560, in its own storage 240, or supports a linked device 300' in storing it in a storage 340'. The linked device 300' may or may not be the same device as the linked device 300.
In the capturing area storage step (S300), personal information for identifying the subject of the body image, such as the subject's name, age, and gender, may be handled together with the body image.
In addition, in the capturing area storage step (S300), the body image may be stored in an image standard that includes metadata, or in the DICOM standard, and the information on the capturing area 550 may be recorded in a reserved field or a blank field of that image standard or of the DICOM standard.
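One way to realize this storage scheme is to serialize the capturing-area record and write the resulting string into a reserved or private header field. A stdlib-only Python sketch (the field names are illustrative assumptions; a real implementation would use a DICOM library's private-tag mechanism rather than raw JSON):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CapturingAreaInfo:
    area_index: int        # distinguishing index among the surface areas 540
    designation: str       # e.g. "Chest, lower"
    point_3d: tuple        # 3D coordinates of the specific point 560
    point_2d: tuple        # 2D coordinates projected from point_3d

def to_reserved_field(info: CapturingAreaInfo) -> str:
    """Serialize the record for writing into a reserved/blank field
    of an image header or a DICOM private tag."""
    return json.dumps(asdict(info))
```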
For example, the information on the capturing area 550 may be one or more of: a distinguishing index that identifies the capturing area 550 among the surface areas 540, the designation of the capturing area, the three-dimensional coordinates of the specific point 560 on the three-dimensional body model 520, and two-dimensional coordinates derived to correspond to those three-dimensional coordinates. It is well known that a specific point on the body surface can be mapped to two-dimensional coordinates using any of several projection techniques.
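As one instance of the projection techniques alluded to, a cylindrical unwrap maps a point on the body surface to 2D by taking the azimuth around the body's vertical axis and the height along it. A sketch (the choice of axis and the coordinate convention are assumptions):

```python
import math

def cylindrical_projection(x, y, z):
    """Map a 3D surface point (e.g. the specific point 560) to 2D (u, v):
    u is the azimuth around the vertical (y) axis, v is the height."""
    u = math.atan2(z, x)   # radians in (-pi, pi]
    v = y
    return (u, v)
```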
A second embodiment of the body image management method of the present disclosure, which differs from the first embodiment only in the order of steps S100 and S200, is as follows.
FIG. 3B is an exemplary flowchart illustrating the second embodiment of the body image management method according to the present disclosure. Referring to FIGS. 2 and 3B, in the second embodiment of the body image management method according to the present disclosure, the body image acquisition step (S100') is performed after the capturing area selection step (S200').
Embodiments of the body image management method according to the present disclosure may perform an additional function of assisting the user in viewing capturing areas; FIGS. 5B to 5D are diagrams illustrating, by way of example, a second user interface area provided for this purpose.
First, referring to FIGS. 5B and 5C, the body image management method according to the present disclosure may further include a capturing area viewing assistance step (S400), in which a capturing area viewing assistance module 250 implemented by the computing device 200 provides a second user interface area 500' that, like the three-dimensional body model 520 described above, includes a virtual three-dimensional body model 520' representing the subject's (patient's) body and that makes the subject's surface areas observable through the user's observation manipulation of the three-dimensional body model 520', and in which at least one of the presence or absence of an associated body image, its location (560a, 560b, 560c), and its count (570a, 570b, 570c) is displayed for each of the surface areas on the second user interface area 500'.
For example, the presence or absence of an associated body image may be displayed by distinguishing the surface areas that have an associated body image (524a, 524b, 524c) from those that do not (544a, 544b); in FIG. 5B they are distinguished by color, although the display is of course not limited to this.
Alternatively, referring to FIG. 5D, the capturing area viewing assistance module 250 may, in step S400, display thumbnails (580a, 580b, 580c) of the associated body images on the second user interface area 500'.
Here, the user's observation manipulation may be any of various manipulations that allow the three-dimensional body model 520' on the second user interface area 500' to be rotated, enlarged, reduced, or translated, for example, clicks of the left, right, and middle buttons of a mouse (not shown) or up-down operation of its wheel; as explained above, the manipulation is of course not limited to these.
In addition, in response to the user's manipulation, for example, a click or a tap, on the display of the presence or absence of an associated body image (524a, 524b, 524c), the display of its location (560a, 560b, 560c), the display of its count (570a, 570b, 570c), or the thumbnails (580a, 580b, 580c) for each of the surface areas 540' on the second user interface area 500', the computing device 200 may also provide a user interface area that allows the user to view the associated body images, for example, one including a list 440 of the associated at least one body image, as illustrated by reference numerals 440, 460, and the like in FIG. 4.
Referring back to FIG. 4, additionally or alternatively, the body image management method according to the present disclosure may further include a step (S500) in which, in response to the user's viewing request for a specific body image 442, for example, a manipulation in which the user clicks the specific body image 442 in an image list 440 containing the history of several body images, the capturing area viewing assistance module 250 implemented by the computing device 200 causes the computing device 200 to provide the specific body image in an image-viewing interface area 460 and, at the same time, to provide a third user interface area 480 in which the surface area 484 associated with the specific body image 442 is displayed on a virtual three-dimensional body model 482.
The advantage of the technology described in this disclosure through the above embodiments is that, when capturing and viewing medical images, the targeted body part can be intuitively selected and designated using a three-dimensional body model, and the information on that body part can be stored together with the image in a format such as the DICOM standard for medical images, so that later, when viewing the medical images, users such as medical staff can intuitively locate them on a virtual three-dimensional body model corresponding to the patient, improving the convenience of the medical staff.
Of course, the method disclosed herein may be performed not only once but also repeatedly, intermittently, or continuously according to the user's request or need.
Based on the above description of the various embodiments of the present disclosure, a person of ordinary skill in the art can clearly understand that the methods and/or processes of the present invention, and their steps, may be implemented in hardware, software, or any combination of hardware and software suitable for a particular application.
Moreover, a person of ordinary skill reading this disclosure today is familiar with the various user interfaces that can be provided through computing devices such as workstations, personal computers, and portable terminals; regardless of whether the individual steps of the methods described in this disclosure are accompanied by a detailed description of the user interface configuration, such a person will readily be able to envision the various user interfaces, enabling the several user manipulations, that the methods of the present disclosure require.
The hardware device may include a general-purpose computer and/or a dedicated computing device, a specific computing device, or a particular aspect or component of a specific computing device. The processes may be realized by one or more processors, as described above, coupled with memory such as ROM/RAM for storing program instructions and configured to execute the instructions stored in that memory. Additionally or alternatively, the processes may be embodied in an application-specific integrated circuit (ASIC), a programmable gate array such as a field-programmable gate array (FPGA), a programmable logic unit (PLU), or programmable array logic (PAL), or in any other device, or combination of devices, capable of executing and responding to instructions and configurable to process electronic signals. A processing device may run an operating system and one or more software applications executed on that operating system, and may access, store, manipulate, process, and generate data in response to the execution of software. For convenience of understanding, a single processing device is sometimes described as being used, but a person of ordinary skill in the art will recognize that a processing device may include a plurality of processing elements and/or a plurality of types of processing elements; for example, it may include a plurality of processors, or a processor and a controller. Other processing configurations, such as parallel processors, are also possible. The hardware device may also include a communication unit, as described above, capable of exchanging signals with an external device.
Software may include a computer program, code, instructions, or a combination of one or more of these, and may configure a processing device to operate as desired or may command a processing device independently or collectively. Software and/or data may be embodied, permanently or temporarily, in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, so as to be interpreted by, or to provide instructions or data to, a processing device. Software may be distributed over networked computer systems and stored or executed in a distributed manner. Software and data may be stored on one or more machine-readable recording media.
Furthermore, the subject matter of the technical solution of the present invention, or the parts thereof that contribute over the prior art, may be implemented in the form of program instructions executable through various computer components and recorded on a machine-readable medium. The machine-readable medium may contain program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the machine-readable recording medium may be specially designed and configured for the embodiments, or may be known to and usable by those skilled in the art of computer software. Examples of machine-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM, DVD, and Blu-ray; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine code and bytecode but also high-level language code executable by a computer using an interpreter or the like; they may be written in a structured programming language such as C, an object-oriented programming language such as C++, or a high-level or low-level programming language (assembly languages, hardware description languages, and database programming languages and technologies), and may be stored and compiled or interpreted for execution on any of the aforementioned devices, on heterogeneous combinations of processors, processor architectures, or combinations of hardware and software, or on any other machine capable of executing program instructions.
Therefore, in one aspect according to the present invention, when the methods described above and combinations thereof are performed by one or more computing devices, the methods and combinations of methods may be implemented as executable code performing each of the steps. In another aspect, the methods may be implemented as systems performing the steps; the methods may be distributed across devices in several ways, or all of the functions may be integrated into a single dedicated, stand-alone device or other hardware. In yet another aspect, the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such sequential combinations and permutations are intended to fall within the scope of this disclosure.
Although the present invention has been described above with specific details such as particular components and with limited embodiments and drawings, the specific structural and functional descriptions of those embodiments are provided only for illustrative purposes and to aid a more general understanding of the present invention; the present invention is not limited to the disclosed embodiments, and a person of ordinary skill in the art to which the present invention pertains can derive various modifications and variations from these descriptions.
Therefore, the technical spirit of the present invention should not be determined as limited to the embodiments described above; not only the claims appended to this disclosure but also everything modified equally or equivalently to those claims, including changes and substitutes falling within the technical spirit, belongs to the scope of the spirit of the present invention. For example, appropriate results can be achieved even if the described techniques are performed in an order different from the described method, and/or the components of the described systems, structures, devices, circuits, and the like are coupled or combined in a form different from the described method, or are replaced or substituted by other components or equivalents.
Such equal or equivalent modifications include, for example, logically equivalent methods capable of producing the same results as those produced by performing the method according to the present invention; the spirit and scope of the present invention are not limited by the examples described above and should be understood in the broadest sense permitted by law.
Claims (8)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020210180572A KR102429640B1 (en) | 2021-12-16 | 2021-12-16 | Method for managing body images, and apparatus using the same |
| KR10-2021-0180572 | 2021-12-16 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023113285A1 true WO2023113285A1 (en) | 2023-06-22 |
Family
ID=82826753
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2022/018743 Ceased WO2023113285A1 (en) | 2021-12-16 | 2022-11-24 | Method for managing body images and apparatus using same |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR102429640B1 (en) |
| WO (1) | WO2023113285A1 (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102429640B1 (en) * | 2021-12-16 | 2022-08-05 | μ£Όμνμ¬ μνμ€λννΈλμ€ | Method for managing body images, and apparatus using the same |
| KR102856049B1 (en) * | 2024-12-10 | 2025-09-05 | μ£Όμνμ¬ μ»΄ν¬λ©μ€ | Apparatus for 3D Human Body Scan Data Management for Selective Utilization of Human Body Part and Driving Method Thereof |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002282215A (en) * | 2001-03-27 | 2002-10-02 | Mitsubishi Space Software Kk | Image filing equipment |
| KR20140047268A (en) * | 2012-10-12 | 2014-04-22 | μ£Όμνμ¬ μΈνΌλνΈν¬μ€μΌμ΄ | Medical image display method using virtual patient model and apparatus thereof |
| US20200327661A1 (en) * | 2019-04-12 | 2020-10-15 | Zebra Medical Vision Ltd | Systems and methods for processing 3d anatomical volumes based on localization of 2d slices thereof |
| KR20210021818A (en) * | 2019-08-19 | 2021-03-02 | μ£Όμνμ¬ μ λΉμΌμ΄ | Method and system for serching medical images |
| KR102429640B1 (en) * | 2021-12-16 | 2022-08-05 | μ£Όμνμ¬ μνμ€λννΈλμ€ | Method for managing body images, and apparatus using the same |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2551476Y2 (en) | 1992-01-21 | 1997-10-22 | γγ·γγ³ζ ͺεΌδΌη€Ύ | Constant voltage compatible connector |
2021
- 2021-12-16 KR KR1020210180572A patent/KR102429640B1/en active Active
2022
- 2022-11-24 WO PCT/KR2022/018743 patent/WO2023113285A1/en not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002282215A (en) * | 2001-03-27 | 2002-10-02 | Mitsubishi Space Software Kk | Image filing equipment |
| KR20140047268A (en) * | 2012-10-12 | 2014-04-22 | μ£Όμνμ¬ μΈνΌλνΈν¬μ€μΌμ΄ | Medical image display method using virtual patient model and apparatus thereof |
| US20200327661A1 (en) * | 2019-04-12 | 2020-10-15 | Zebra Medical Vision Ltd | Systems and methods for processing 3d anatomical volumes based on localization of 2d slices thereof |
| KR20210021818A (en) * | 2019-08-19 | 2021-03-02 | μ£Όμνμ¬ μ λΉμΌμ΄ | Method and system for serching medical images |
| KR102429640B1 (en) * | 2021-12-16 | 2022-08-05 | μ£Όμνμ¬ μνμ€λννΈλμ€ | Method for managing body images, and apparatus using the same |
Non-Patent Citations (1)
| Title |
|---|
| PARK SANG KYU, KIM BO KYUN, SHIN DONGSUN: "Semi-automatic segmentation and surface reconstruction of CT images by using rotoscoping and warping techniques", FOLIA MORPHOLOGICA, WYDAWNICTWO VIA MEDICA, PL, vol. 79, no. 1, 1 January 2020 (2020-01-01), PL , pages 156 - 161, XP093072141, ISSN: 0015-5659, DOI: 10.5603/FM.a2019.0045 * |
Also Published As
| Publication number | Publication date |
|---|---|
| KR102429640B1 (en) | 2022-08-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5628927B2 (en) | MEDICAL INFORMATION DISPLAY DEVICE AND METHOD, AND PROGRAM | |
| WO2016125978A1 (en) | Method and apparatus for displaying medical image | |
| US20090307328A1 (en) | Remote management interface for a medical device | |
| WO2023113285A1 (en) | Method for managing body images and apparatus using same | |
| US20130123603A1 (en) | Medical device and method for displaying medical image using the same | |
| CN102959579A (en) | Medical information display apparatus, operation method and program | |
| WO2019143021A1 (en) | Method for supporting viewing of images and apparatus using same | |
| WO2021034138A1 (en) | Dementia evaluation method and apparatus using same | |
| US10810758B2 (en) | Method and system using augmentated reality for positioning of ECG electrodes | |
| WO2019230302A1 (en) | Training data collecting device, training data collecting method and program, training system, trained model, and endoscope image processing device | |
| WO2017142223A1 (en) | Remote image transmission system, display apparatus, and guide displaying method thereof | |
| WO2021006472A1 (en) | Multiple bone density displaying method for establishing implant procedure plan, and image processing device therefor | |
| CN111261265A (en) | Medical image system based on virtual intelligent medical platform | |
| KR102222509B1 (en) | Method for assisting determination on medical images and apparatus using the same | |
| WO2019164277A1 (en) | Method and device for evaluating bleeding by using surgical image | |
| WO2020231007A2 (en) | Medical equipment learning system | |
| WO2010128818A2 (en) | Medical image processing system and processing method | |
| WO2013172685A1 (en) | Apparatus and method for reconfiguring panoramic x-ray image | |
| WO2022231329A1 (en) | Method and device for displaying bio-image tissue | |
| WO2021054700A1 (en) | Method for providing tooth lesion information, and device using same | |
| WO2019124836A1 (en) | Method for mapping region of interest of first medical image onto second medical image, and device using same | |
| WO2020130349A1 (en) | Method and apparatus for recording treatment plan of 3d medical image | |
| WO2023121051A1 (en) | Patient information provision method, patient information provision apparatus, and computer-readable recording medium | |
| JP2007052699A (en) | Medical information processing system | |
| WO2019164273A1 (en) | Method and device for predicting surgery time on basis of surgery image |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22907754 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 22907754 Country of ref document: EP Kind code of ref document: A1 |
|