
WO2023113285A1 - Method for managing body images and apparatus using same - Google Patents


Info

Publication number
WO2023113285A1
WO2023113285A1 (PCT/KR2022/018743, KR 2022018743 W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
area
computing device
body image
photographing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2022/018743
Other languages
French (fr)
Korean (ko)
Inventor
μž₯ν˜„μž¬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
F&d Partners Inc
Original Assignee
F&d Partners Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by F&d Partners Inc
Publication of WO2023113285A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 - Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/11 - Region-based segmentation
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 70/00 - ICT specially adapted for the handling or processing of medical references

Definitions

  • The body image management method and the apparatus using the same disclosed in the present disclosure relate to a method for managing medical images, and more specifically, to a management method for photographing medical images and retaining them in a predetermined storage.
  • Japanese Patent Registration No. 2551476 discloses a database construction method in a medical image management system.
  • Medical institutions store and retain medical images of patients acquired using medical imaging devices in a medical image information system.
  • Conventionally, medical images are stored together with personal information for identifying a patient and with information, described by a doctor, about the area or region where the medical image was obtained.
  • The present disclosure solves the above-mentioned problems of the prior art by providing a 3D body model capable of designating information on the body part to be captured when a medical image is taken, thereby allowing the medical image and the information on the body part contained in the medical image to be managed together.
  • The characteristic configuration of the present invention for achieving the objects of the present invention described above and realizing the characteristic effects of the present invention described later is as follows.
  • According to one aspect of the present disclosure, a body image management method is provided, performed by a computing device that obtains a body image (an image in which at least a part of the body is captured) from a photographing device, or from a device including a photographing device or holding a previously photographed image.
  • In the method, the computing device provides a first user interface area that includes a virtual three-dimensional body model divided into a predetermined number of surface areas and that enables selection of at least one of the surface areas by manipulating the three-dimensional body model.
  • The method includes a capturing area selection step, in which a capturing area (at least one surface area associated with the body image, among the surface areas) is input to the computing device through the first user interface area, or is received from another interworking device provided with the first user interface area; and a capturing area storage step, in which the computing device stores the obtained body image together with information of the capturing area in a predetermined storage, or supports another device in storing the acquired body image.
  • In one embodiment, the computing device stores the body image in an image standard including metadata or in the DICOM standard, with the information of the capturing area written in a reserved field or an extra (blank) field of the image standard or the DICOM standard.
  • The information of the capturing area includes at least one of a classification index of the capturing area, a classification name of the capturing area, 3D coordinates on the 3D body model, and 2D coordinates processed to correspond to the 3D coordinates.
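The capturing-area information enumerated above can be pictured as a small metadata record attached to each stored image. The following Python sketch is illustrative only and is not part of the disclosure: the record and field names are hypothetical, and a JSON string stands in for the payload that would be written into a reserved or blank field of the image or DICOM standard.

```python
import json
from dataclasses import dataclass, asdict
from typing import Tuple

@dataclass
class CaptureAreaInfo:
    """Hypothetical record of the capturing-area metadata named in the disclosure."""
    region_index: int                      # classification index of the capturing area
    region_name: str                       # classification name, e.g. "Chest, lower"
    point_3d: Tuple[float, float, float]   # 3D coordinates on the 3D body model
    point_2d: Tuple[float, float]          # 2D coordinates corresponding to the 3D point

def serialize_capture_area(info: CaptureAreaInfo) -> str:
    """Serialize to a JSON string, standing in for a reserved/blank metadata field."""
    return json.dumps(asdict(info))

def deserialize_capture_area(payload: str) -> CaptureAreaInfo:
    """Restore the record when the stored image is viewed later."""
    d = json.loads(payload)
    return CaptureAreaInfo(d["region_index"], d["region_name"],
                           tuple(d["point_3d"]), tuple(d["point_2d"]))

info = CaptureAreaInfo(42, "Chest, lower", (0.1, 1.2, 0.3), (120.5, 340.0))
restored = deserialize_capture_area(serialize_capture_area(info))
```

Round-tripping the record like this shows that all four pieces of information survive storage intact; a real system would write the payload into the chosen metadata field rather than keep it as a standalone string.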
  • In one embodiment, before the capturing area selection step, a body image acquisition step is performed, in which the computing device obtains a body image in which at least a part of the body is captured, from another device that includes a photographing device or holds a pre-captured image, from a device interworking with the computing device, or from a photographing device included in the computing device.
  • In another embodiment, this body image acquisition step is performed after the capturing area selection step, with the body image likewise obtained from another device that includes a photographing device or holds a pre-captured image, from a device interworking with the computing device, or from a photographing device included in the computing device.
  • In one embodiment, the computing device further provides a second user interface area that includes a virtual three-dimensional body model divided into a predetermined number of surface areas and that enables observation of the surface areas through the user's observation manipulation of the three-dimensional body model.
  • On the second user interface area, at least one of the presence, location, and number of body images associated with each of the surface areas can be displayed, or thumbnails of the associated body images can be displayed.
  • In one embodiment, the method further includes providing a third user interface area, in which, in response to a request to view a specific body image, the computing device displays the surface area associated with the specific body image on a virtual three-dimensional body model while providing the specific body image.
  • A computer program comprising instructions implemented to perform the methods according to the present invention is also provided.
  • Such a computer program may be stored on a machine-readable non-transitory recording medium.
  • According to another aspect of the present disclosure, a computing device that acquires and manages a body image in which at least a part of the body is captured is provided.
  • The computing device includes a communication unit that interworks with a photographing device or a device holding a previously photographed image, or with a photographing device included in the computing device; and a processor.
  • The processor performs a capturing area selection process of providing a first user interface area, which includes a virtual 3D body model divided into a predetermined number of surface areas and enables selection of at least one of the surface areas by manipulating the 3D body model, and of receiving a capturing area, which is at least one surface area associated with the body image among the surface areas, either directly or through the communication unit from another interworking device provided with the first user interface area.
  • The processor also performs a capturing area storage process of storing the obtained body image together with information on the capturing area in a predetermined storage, or of supporting, through the communication unit, another device in storing the captured body image.
  • According to the invention of the present disclosure, when a medical image is captured, the body part to be captured can be intuitively selected and designated using a 3D body model, and information on the body part can be stored together with the medical image in a format such as the DICOM standard for medical images. As a result, when viewing medical images later, users such as medical staff can intuitively find the medical images on a virtual three-dimensional body model corresponding to the patient, which improves the convenience of the medical staff in taking and viewing medical images.
  • FIG. 1 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs a body image management method according to the present disclosure.
  • FIG. 2 is an exemplary block diagram illustrating hardware or software components of a computing device that performs a body image management method according to the present disclosure.
  • FIG. 3A is an exemplary flowchart illustrating a method for managing a body image according to an embodiment of the present disclosure
  • FIG. 3B is an exemplary flowchart illustrating a method for managing a body image according to another embodiment of the present disclosure.
  • FIG. 4 is a diagram showing a user interface provided in the body image management method of the present disclosure by way of example.
  • FIG. 5A is a diagram illustrating a first user interface area provided in the body image management method of the present disclosure by way of example.
  • FIGs. 5B to 5D are diagrams illustrating a second user interface area provided in the body image management method of the present disclosure by way of example.
  • Terms such as first or second may be used to describe various components, but such terms are to be interpreted solely for the purpose of distinguishing one component from another, and do not imply any order. For example, a first element may be termed a second element, and similarly, a second element may be termed a first element.
  • 'Image' is a term referring to an image that can be seen by the eye or a digital representation of an image (e.g., as displayed on a video screen).
  • 'surface images' or 'skin images' refer to dermatology images.
  • 'metadata' is a term that refers to data that describes other data.
  • For an 'image', information such as the device that captured the image, the time of capture, the exposure, whether the flash was used, the resolution, and the image size may be included as metadata.
  • 'DICOM' (Digital Imaging and Communications in Medicine) is a term referring to the standard for digital imaging and communications in medicine, published by the ACR (American College of Radiology) and the NEMA (National Electrical Manufacturers Association).
  • A 'medical image storage and transmission system' (PACS; Picture Archiving and Communication System) is a term referring to a system that stores, processes, and transmits images in accordance with the DICOM communication standard. Medical images obtained using digital medical imaging equipment such as X-ray, CT, and MRI are stored in a standard format and can be transmitted to terminals inside and outside medical institutions through a network, and reading results and medical records can be added thereto.
  • 'Learning' or 'training' is a term referring to performing machine learning through procedural computing; it will be appreciated by those skilled in the art that it is not intended to refer to a mental function such as human educational activity.
  • The present invention covers all possible combinations of the embodiments presented in this disclosure. It should be understood that the various embodiments of the present invention are different from one another, but need not be mutually exclusive. For example, specific shapes, structures, and characteristics described herein in connection with one embodiment may be implemented in other embodiments without departing from the spirit and scope of the invention. Additionally, it should be understood that the location or arrangement of individual components within each disclosed embodiment may be changed without departing from the spirit and scope of the invention. Accordingly, the detailed description that follows is not intended to be taken in a limiting sense.
  • FIG. 1 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs a body image management method according to the present disclosure.
  • Referring to FIG. 1, a computing device 100 according to the present disclosure includes a communication unit 110 and a processor 120, and may communicate directly or indirectly with an external computing device (not shown) through the communication unit 110.
  • The computing device 100 may achieve the desired system performance by using a combination of typical computer hardware (e.g., a computer, processor, memory, storage, input and output devices, and other components of conventional computing devices; electronic communication devices such as routers and switches; and electronic information storage systems such as network-attached storage (NAS) and storage area networks (SAN)) and computer software (i.e., instructions that enable the computing device to function in a particular way).
  • the storage may include a storage device such as a hard disk or a universal serial bus (USB) memory as well as a storage device based on a network connection such as a cloud server.
  • The communication unit 110 of such a computing device may transmit and receive requests and responses to and from other interworking computing devices, for example, a dedicated storage such as a database server.
  • Such requests and responses may be carried over the same TCP (Transmission Control Protocol) session, but are not limited thereto; for example, they may be transmitted and received as UDP (User Datagram Protocol) datagrams.
  • the communication unit 110 may be implemented in the form of a communication module including a communication interface.
  • Such communication interfaces may include wireless interfaces such as WLAN (Wireless LAN), WiFi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMax (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access), as well as short-range interfaces such as Bluetooth™, RFID (Radio Frequency IDentification), IrDA (Infrared Data Association), UWB (Ultra-WideBand), ZigBee, and NFC (Near Field Communication).
  • the communication unit 110 may transmit/receive data from another computing device through an appropriate communication interface.
  • The communication unit 110 may include, or be interlocked with, a keyboard, a mouse, and other external input devices for receiving commands or instructions, as well as a printing device, a display, and other external output devices.
  • In particular, the computing device 100 may be provided with a display unit, which can be embedded in the computing device or interlocked as an external display device through the communication unit 110.
  • Such a display unit or display device may be a touch screen capable of touch input.
  • The processor 120 of the computing device may include one or more MPUs (micro processing units), CPUs (central processing units), or GPUs (graphics processing units) having internal memory such as cache memory and/or external memory; a microprocessor such as an NPU (neural processing unit) or a TPU (tensor processing unit); or a controller such as a microcontroller, an embedded microcontroller, a microcomputer, an arithmetic logic unit (ALU), a digital signal processor (e.g., a programmable digital signal processor), or another programmable device.
  • In addition, the computing device may further include a software configuration of an operating system and an application that serves a specific purpose.
  • FIG. 2 is an exemplary block diagram illustrating hardware or software components of a computing device that performs a body image management method according to the present disclosure, and FIG. 3A is an exemplary flowchart illustrating a first embodiment of the body image management method according to the present disclosure.
  • Referring to FIGs. 2 and 3A, the body image acquisition module 210 implemented by the computing device 200 acquires, from a source of the body image, for example, a photographing device (e.g., a camera) 205a included in the computing device 200, a photographing device 205b interworking with the computing device 200, a photographing device 305 included in another device 300, or another device 300 holding pre-captured images, a body image, which is an image in which at least a part of the body of the subject (or patient) is photographed (body image acquisition step; S100).
  • For example, the computing device 200 may be a personal computer used by medical staff, and the other device 300 may be a portable terminal including a camera.
  • FIG. 4 is a diagram showing a user interface provided in the body image management method of the present disclosure by way of example.
  • Referring to FIG. 4, in an exemplary configuration of a user interface 400 provided by the computing device 200, or provided through another device 300 interworking with the computing device 200, information on the subject (patient), including at least one of the subject's name 422, gender 424, age (not shown), identification number (patient ID) 426, and the name of the person in charge 428, may be provided for the user's convenience.
  • In addition, a list 430 related to a plurality of subjects may be provided together, and a predetermined interface element, such as the button exemplified by reference numeral 410 in FIG. 4, may be provided.
  • Referring again to FIGs. 2 and 3A, the capturing area selection module 220 implemented by the computing device 200 provides a first user interface area on a predetermined display unit included in the computing device 200, or supports another device 300 that interworks with the computing device 200 in providing the first user interface area, so as to receive an input or selection of a capturing area, which is the surface area associated with the body image (capturing area selection step; S200).
  • 5A is a diagram illustrating a first user interface area provided in the body image management method of the present disclosure by way of example.
  • the first user interface area 500 includes a virtual 3D body model 520, and the 3D body model 520 is divided into a predetermined number of surface areas 540.
  • The surface areas may be areas segmented with reference to surface anatomy related to plastic surgery or dermatology, which is particularly useful when the body image is, for example, a dermoscopy image.
  • The user may select at least one 550 of the surface areas 540 by manipulating the first user interface area 500, in particular the 3D body model 520; the selected at least one surface area 550 is the capturing area 550, that is, the surface area designated as being associated with the body image.
  • The manipulation may be any of various manipulations that rotate, enlarge, reduce, or translate the 3D body model 520, which is a 3D object on the user interface, such as clicks of the left, right, and middle buttons of a mouse (not shown) interworking with the computing device 200, up-down scrolling of a wheel, a tap on a touch screen, a pinch, and the like.
  • Since methods for constructing user interfaces for handling 3D objects are well known to those skilled in the art of computer hardware and software, it is needless to say that the manipulation is not limited to the examples above.
  • the interface area 500 may be configured to simultaneously designate several surface areas.
  • The specific point 560 may indicate the center point of the captured body image.
  • In addition, a user interface element 590 indicating the classification name of the selected capturing area 550 may be displayed in the first user interface area.
  • 'Chest, lower' is exemplified as the classification name of the capturing area 550 in the user interface element 590 of FIG. 5A.
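One simple way to resolve a manipulation of the 3D body model into one of the predetermined surface areas is a nearest-centroid lookup. The sketch below is a hypothetical illustration rather than the disclosed implementation (a real model would typically carry a per-face region labelling); the region names and centroid coordinates are invented for the example.

```python
import math
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]

# Hypothetical centroids for a few surface areas of the 3D body model.
REGION_CENTROIDS: Dict[str, Point3D] = {
    "Chest, upper": (0.0, 1.40, 0.10),
    "Chest, lower": (0.0, 1.20, 0.10),
    "Abdomen":      (0.0, 1.00, 0.08),
}

def pick_region(clicked: Point3D) -> str:
    """Return the surface area whose centroid lies nearest to the point
    the user selected on the model's surface."""
    return min(REGION_CENTROIDS,
               key=lambda name: math.dist(REGION_CENTROIDS[name], clicked))

region = pick_region((0.02, 1.22, 0.09))  # a point just off the lower chest
```

Because the selected point (0.02, 1.22, 0.09) lies closest to the "Chest, lower" centroid, that region would be designated as the capturing area in this toy setup.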
  • Referring again to FIGs. 2 and 3A, the capturing area storage module 230 implemented by the computing device 200 stores the body image together with the information of the capturing area 550 and the specific point 560 in the predetermined storage 240, or supports another device 300' in storing them in its storage 340' (capturing area storage step; S300).
  • The other device 300' may or may not be the same device as the other device 300.
  • In the capturing area storage step (S300), personal information for identifying the subject of the body image, such as the subject's name, age, and gender, may be handled together with the body image.
  • Specifically, the body image may be stored in an image standard including metadata or in the DICOM standard, and information on the capturing area 550 may also be written in a reserved field or an extra (blank) field of the image standard or the DICOM standard.
  • The information on the capturing area 550 may be one or more of a classification index for identifying the capturing area 550 among the surface areas 540, a classification name of the capturing area, 3D coordinates of the specific point 560 on the 3D body model 520, and 2D coordinates processed to correspond to the 3D coordinates. It is well known that a specific point on the body surface can be made to correspond to two-dimensional coordinates using various projection techniques.
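As the paragraph above notes, a point on the body surface can be mapped to two-dimensional coordinates by various projection techniques. The following is a minimal sketch of one such technique, a cylindrical projection around the body's vertical axis; the axis convention (y up) and the normalization to [0, 1) are assumptions made for the example and are not specified by the disclosure.

```python
import math
from typing import Tuple

def project_cylindrical(p: Tuple[float, float, float]) -> Tuple[float, float]:
    """Map a 3D body-surface point to 2D (u, v): u is the angle around the
    vertical (y) axis, normalized to [0, 1); v is the height coordinate."""
    x, y, z = p
    u = (math.atan2(z, x) % (2 * math.pi)) / (2 * math.pi)
    return (u, y)

u, v = project_cylindrical((1.0, 1.2, 0.0))  # point on the +x side at height 1.2
```

The 2D pair (u, v) could then be stored alongside the 3D coordinates as the "2D coordinates processed to correspond to the 3D coordinates" mentioned above; an equirectangular or orthographic projection would serve equally well for this purpose.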
  • a second embodiment of the body image management method of the present disclosure in which only the order of steps S100 and S200 is different from the first embodiment is as follows.
  • FIG. 3B is an exemplary flowchart illustrating a second embodiment of the body image management method according to the present disclosure.
  • In the second embodiment, a body image acquisition step (S100') is performed after the capturing area selection step (S200').
  • Embodiments of the body image management method according to the present disclosure may perform an additional function of assisting a user in viewing a capturing area, and FIGs. 5B to 5D are exemplary diagrams illustrating a second user interface area provided for this purpose.
  • Referring to FIG. 2, the photographing area viewing assistance module 250 implemented by the computing device 200 provides a second user interface area 500' that, like the above-mentioned 3D body model 520, includes a virtual 3D body model 520' representing the body of the subject (patient) and enables observation of the subject's surface areas through the user's observation manipulation of the 3D body model 520'.
  • The module displays, on the second user interface area 500', at least one of the presence or absence, the locations (560a, 560b, 560c), and the numbers (570a, 570b, 570c) of the body images associated with each of the surface areas (capturing area viewing assistance step; S400).
  • The presence or absence of a body image associated with each of the surface areas may be displayed in such a way that surface areas 544a and 544b with no associated body image are distinguished from surface areas 524a, 524b, and 524c with associated body images; although this is shown in a color-coded manner in FIG. 5B, it is, of course, not limited thereto.
  • In addition, in step S400, the photographing area viewing assistance module 250 may also display thumbnails (580a, 580b, 580c) of the associated body images on the second user interface area 500'.
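The per-region presence, counts, and thumbnails displayed in step S400 can be derived by grouping the stored images on the region information saved at capture time (step S300). The sketch below is illustrative only; the record layout, region names, and file names are hypothetical.

```python
from collections import defaultdict
from typing import Dict, List

# Hypothetical stored records: each body image carries the capturing-area
# name written alongside it when it was stored.
stored_images = [
    {"file": "img_001.dcm", "region": "Chest, lower"},
    {"file": "img_002.dcm", "region": "Chest, lower"},
    {"file": "img_003.dcm", "region": "Abdomen"},
]

def images_by_region(images: List[dict]) -> Dict[str, List[str]]:
    """Group stored image files by their associated surface area, so a UI
    can show presence, counts, and thumbnail lists per region."""
    grouped: Dict[str, List[str]] = defaultdict(list)
    for img in images:
        grouped[img["region"]].append(img["file"])
    return dict(grouped)

grouped = images_by_region(stored_images)
counts = {region: len(files) for region, files in grouped.items()}
```

A region absent from `grouped` has no associated images (and would be rendered in the "no image" style), while the lists themselves supply the thumbnails to draw for each region.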
  • The user's observation manipulation performed on the second user interface area 500' may be any of various manipulations that rotate, enlarge, reduce, or translate the 3D body model 520', such as clicking the left and right buttons of a mouse (not shown), clicking the middle button, and up-down scrolling of a wheel; as described above, it is, of course, not limited thereto.
  • When a surface area with associated body images is selected, the computing device 200 may provide a user interface area including a list 440 of at least one associated body image so that the user can view the associated body images, as illustrated by reference numerals 440 and 460 in FIG. 4.
  • In response to a request to view a specific body image 442, the photographing area viewing assistance module 250 implemented by the computing device 200 may cause the computing device 200 to further provide a third user interface area 480, which displays the surface area 484 associated with the specific body image 442 on a virtual 3D body model 482 while providing the specific body image in a predetermined interface area 460 (S500).
  • As described in the above embodiments, the advantage of the technology described in this disclosure is that, when taking and viewing medical images, the target body part can be intuitively selected and designated using a 3D body model, and information about the body part can be stored together with the medical image in a format such as the DICOM standard for medical images, so that users such as medical staff can intuitively find the medical images on a virtual 3D body model corresponding to the patient when viewing them later. This improves the convenience of the medical staff.
  • the method disclosed in this disclosure may be performed not only once but also repeatedly, intermittently or continuously according to the user's request or need.
  • The hardware device may include a general-purpose computer and/or a dedicated computing device, a specific computing device, or a particular aspect or component of a specific computing device.
  • The processes may be realized by a processor as described above, combined with a memory such as ROM/RAM for storing program instructions and configured to execute the instructions stored in the memory. Additionally or alternatively, the processes may use an application-specific integrated circuit (ASIC), a programmable gate array such as a field programmable gate array (FPGA), a programmable logic unit (PLU), or programmable array logic (PAL), or any other device or combination of devices capable of executing and responding to instructions and configured to process electronic signals.
  • a processing device may run an operating system and one or more software applications running on the operating system.
  • a processing device may also access, store, manipulate, process, and generate data in response to execution of software.
  • It will be appreciated that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • a processing device may include a plurality of processors or a processor and a controller. Other processing configurations are also possible, such as parallel processors.
  • the hardware device may also include a communication unit as described above capable of exchanging signals with an external device.
  • Software may include a computer program, code, instructions, or a combination of one or more of these, and may configure a processing device to operate as desired, or may command the processing device independently or collectively.
  • Software and/or data may be embodied, permanently or temporarily, in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device.
  • Software may be distributed on networked computer systems and stored or executed in a distributed manner.
  • Software and data may be stored on one or more machine-readable recording media.
  • Machine-readable media may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on a machine-readable recording medium may be specially designed and configured for the embodiment or may be known and usable to those skilled in the art of computer software.
  • Examples of machine-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs, DVDs, and Blu-rays; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include machine code and bytecode, as well as high-level language code that can be executed by a computer using an interpreter or the like. Such code may be created using a structured programming language such as C, an object-oriented programming language such as C++, or a high-level or low-level programming language (assembly languages, hardware description languages, and database programming languages and technologies), and may be stored and compiled or interpreted for execution not only on any one of the foregoing devices, but also on a heterogeneous combination of processors, processor architectures, or combinations of different hardware and software, or any other machine capable of executing program instructions.
  • the methods and combinations of methods may be implemented as executable code that performs each step.
  • Alternatively, the methods may be implemented as systems that perform the steps; such systems may be distributed across several devices, or all functions may be integrated into one dedicated, stand-alone device or other hardware.
  • The means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of this disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Theoretical Computer Science (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Quality & Reliability (AREA)

Abstract

The present disclosure provides a method for managing body images and an apparatus using the same. According to an embodiment, a computing device acquires a body image, that is, an image obtained by photographing at least a part of a body, from an image capturing device or from a device that includes an image capturing device or holds a previously captured image. The computing device provides a first user interface area that includes a virtual three-dimensional body model divided into a predetermined number of surface areas and enables selection of at least one of those surface areas by operating on the model. Through this interface, the computing device receives an image capturing area, i.e., the at least one surface area associated with the body image, either directly or from another device that is linked to the computing device and on which the first user interface area is provided. The computing device then stores the acquired body image and the image capturing area information in predetermined storage, or supports the other device in storing them.

Description

신체 μ˜μƒμ˜ 관리 방법 및 이λ₯Ό μ΄μš©ν•œ μž₯치Body image management method and device using the same

λ³Έ κ°œμ‹œμ„œμ— κ°œμ‹œλ˜λŠ” 신체 μ˜μƒ 관리 및 이λ₯Ό μ΄μš©ν•œ μž₯μΉ˜λŠ” 의료 μ˜μƒμ˜ 관리 방법에 κ΄€ν•œ 것이며, 더 κ΅¬μ²΄μ μœΌλ‘œλŠ” 의료 μ˜μƒμ„ μ΄¬μ˜ν•˜μ—¬ 이λ₯Ό μ†Œμ •μ˜ μ €μž₯μ†Œμ— λ³΄μœ ν•˜λŠ” 관리 방법에 κ΄€ν•œ 것이닀.Body image management and an apparatus using the body image management disclosed in the present disclosure relate to a method for managing medical images, and more specifically, to a management method for photographing medical images and retaining them in a predetermined storage.

신체 μ˜μƒ λ“± 의료 μ˜μƒμ„ κ΄€λ¦¬ν•˜λŠ” λ‹€μ–‘ν•œ 의료 μ˜μƒ λ°μ΄ν„°λ² μ΄μŠ€ ꡬ좕 방법듀이 μžˆλ‹€. 예λ₯Ό λ“€μ–΄, 일본 λ“±λ‘νŠΉν—ˆ 제2551476ν˜Έμ—λŠ” 의료 μ˜μƒ 관리 μ‹œμŠ€ν…œμ— μžˆμ–΄μ„œμ˜ λ°μ΄ν„°λ² μ΄μŠ€ ꡬ좕 방법이 κ°œμ‹œλ˜μ–΄ μžˆλ‹€.There are various methods of constructing a medical image database for managing medical images such as body images. For example, Japanese Patent Registration No. 2551476 discloses a database construction method in a medical image management system.

의료 κΈ°κ΄€μ—μ„œλŠ” 의료 μ˜μƒ μž₯μΉ˜λ“€μ„ μ΄μš©ν•˜μ—¬ νšλ“λœ ν™˜μžμ˜ 의료 μ˜μƒμ„ 의료 μ˜μƒ 정보 μ‹œμŠ€ν…œμ— μ €μž₯ν•˜μ—¬ λ³΄μœ ν•œλ‹€. 의료 μ˜μƒ 정보 μ‹œμŠ€ν…œμ—λŠ” 의료 μ˜μƒμ΄ ν™˜μžλ₯Ό 식별할 수 μžˆλŠ” 개인 정보와, 의료 μ˜μƒμ΄ νšλ“λœ μ˜μ—­, λΆ€μœ„ 등에 κ΄€ν•˜μ—¬ μ˜μ‚¬κ°€ μ„œμˆ ν•œ 정보와 ν•¨κ»˜ μ €μž₯되고 μžˆλ‹€.Medical institutions store and retain medical images of patients acquired using medical imaging devices in a medical image information system. In the medical image information system, medical images are stored together with personal information for identifying a patient and information described by a doctor about an area or region where a medical image was obtained.

However, even when medical images captured with various medical imaging equipment are stored and held in such a system according to an image standard, for example the DICOM standard, they conventionally do not include structured data that shows at a glance which part of the body was photographed, so medical staff cannot intuitively tell where an image was taken. There is also a limit to viewing together medical images that are correlated by body position.

λ”°λΌμ„œ λ³Έ κ°œμ‹œμ„œμ—μ„œλŠ” μ „μˆ ν•œ μ’…λž˜ 기술의 λ¬Έμ œμ μ„ ν•΄κ²°ν•˜μ—¬, 의료 μ˜μƒμ˜ 촬영 μ‹œμ— κ·Έ 촬영이 λ˜λŠ” 신체 λΆ€μœ„μ— κ΄€ν•œ 정보λ₯Ό μ§€μ •ν•  수 μžˆλŠ” 3차원 신체 λͺ¨λΈμ„ μ œκ³΅ν•¨μœΌλ‘œμ¨ 의료 μ˜μƒκ³Ό κ·Έ 의료 μ˜μƒμ΄ λ‹΄κ³  μžˆλŠ” ν™˜μž μ‹ μ²΄μ˜ ν‘œλ©΄ μ˜μ—­μ„ 3차원 신체 λͺ¨λΈ μƒμ—μ„œ μ§€μ •ν•  수 μžˆλ„λ‘ 함과 λ™μ‹œμ—, μΆ”ν›„ νŠΉμ • ν™˜μžμ˜ 의료 μ˜μƒλ“€μ„ μ—΄λžŒν•  λ•Œ 3차원 신체 λͺ¨λΈ μƒμ—μ„œ κ·Έ 의료 μ˜μƒλ“€μ„ μ§κ΄€μ μœΌλ‘œ μ°Ύμ•„λ³Ό 수 있게 ν•˜λŠ” 방법 및 이λ₯Ό μ΄μš©ν•œ μž₯치λ₯Ό μ œκ³΅ν•˜λŠ” 것을 λͺ©μ μœΌλ‘œ ν•œλ‹€.Therefore, the present disclosure solves the above-mentioned problems of the prior art, and provides a 3D body model capable of designating information on a body part to be captured when a medical image is captured, thereby providing a medical image and a medical image contained in the medical image. A method for designating the surface area of a patient's body on a 3D body model and at the same time intuitively finding the medical images of a specific patient on the 3D body model when viewing later, and a device using the same intended to provide

μƒκΈ°ν•œ 바와 같은 λ³Έ 발λͺ…μ˜ λͺ©μ μ„ λ‹¬μ„±ν•˜κ³ , ν›„μˆ ν•˜λŠ” λ³Έ 발λͺ…μ˜ νŠΉμ§•μ μΈ 효과λ₯Ό μ‹€ν˜„ν•˜κΈ° μœ„ν•œ λ³Έ 발λͺ…μ˜ νŠΉμ§•μ μΈ ꡬ성은 ν•˜κΈ°μ™€ κ°™λ‹€.The characteristic configuration of the present invention for achieving the object of the present invention as described above and realizing the characteristic effects of the present invention described later is as follows.

λ³Έ κ°œμ‹œμ„œμ˜ 일 νƒœμ–‘μ— λ”°λ₯΄λ©΄, 촬영 μž₯치λ₯Ό ν¬ν•¨ν•˜κ±°λ‚˜ 미리 촬영된 μ˜μƒμ„ λ³΄μœ ν•œ μž₯μΉ˜λ‘œλΆ€ν„°, λ˜λŠ” 촬영 μž₯μΉ˜λ‘œλΆ€ν„° μ‹ μ²΄μ˜ 적어도 일뢀가 촬영된 μ˜μƒμΈ 신체 μ˜μƒμ„ νšλ“ν•˜λŠ” μ»΄ν“¨νŒ… μž₯μΉ˜μ— μ˜ν•΄ μˆ˜ν–‰λ˜λŠ” 신체 μ˜μƒ 관리 방법이 μ œκ³΅λ˜λŠ”λ°”, κ·Έ 방법은, 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, μ†Œμ • 개수의 ν‘œλ©΄ μ˜μ—­λ“€λ‘œ λΆ„ν• λœ κ°€μƒμ˜ 3차원 신체 λͺ¨λΈμ„ ν¬ν•¨ν•˜κ³ , 상기 3차원 신체 λͺ¨λΈμ— λŒ€ν•œ μ‘°μž‘μ— μ˜ν•΄ 상기 ν‘œλ©΄ μ˜μ—­λ“€ 쀑 적어도 ν•˜λ‚˜λ₯Ό 선택 κ°€λŠ₯ν•˜κ²Œ ν•˜λŠ” 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ œκ³΅ν•¨μœΌλ‘œμ¨, 상기 ν‘œλ©΄ μ˜μ—­λ“€ 쀑 상기 신체 μ˜μƒμ— κ²°λΆ€λœ 적어도 ν•˜λ‚˜μ˜ ν‘œλ©΄ μ˜μ—­μΈ 촬영 μ˜μ—­μ„ μž…λ ₯λ°›κ±°λ‚˜ 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— μ—°λ™ν•˜λ˜ 상기 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ΄ μ œκ³΅λ˜λŠ” 타 μž₯μΉ˜λ‘œλΆ€ν„° 상기 촬영 μ˜μ—­μ„ μ „λ‹¬λ°›λŠ” 단계인 촬영 μ˜μ—­ 선택 단계; 및 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, νšλ“λœ 상기 신체 μ˜μƒμ„ 상기 촬영 μ˜μ—­μ˜ 정보와 ν•¨κ»˜ μ†Œμ •μ˜ μ €μž₯μ†Œμ— μ €μž₯ν•˜κ±°λ‚˜ 타 μž₯치둜 ν•˜μ—¬κΈˆ μ €μž₯ν•˜λ„λ‘ μ§€μ›ν•˜λŠ” 단계인 촬영 μ˜μ—­ μ €μž₯ 단계λ₯Ό ν¬ν•¨ν•œλ‹€.According to one aspect of the present disclosure, body image management performed by a computing device that obtains a body image, which is an image in which at least a part of the body is captured, from a device including a photographing device or having a previously photographed image, or from a photographing device. A method is provided, wherein the computing device includes a virtual three-dimensional body model divided into a predetermined number of surface regions, and by operating the three-dimensional body model, at least one of the surface regions is provided. 
By providing a first user interface area that allows one to be selected, a photographing area, which is at least one surface area associated with the body image, among the surface areas is input or linked to the computing device, but the first user interface area is a capturing area selection step that is a step of receiving the capturing area from another provided device; and a capturing area storage step, which is a step of allowing the computing device to store the obtained body image together with information of the capturing area in a predetermined storage or to assist another device to store the acquired body image.
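The two steps above can be sketched in code. This is a minimal, hedged illustration only; all function and field names here are hypothetical and are not taken from the disclosure.

```python
# Hypothetical sketch of the two-step flow described above: a user selects one
# or more surface regions on the 3D body model (capturing area selection), and
# the body image is then stored together with that capturing area (capturing
# area storage). All names here are illustrative, not from the disclosure.
def select_capture_area(selected_region_indices):
    if not selected_region_indices:
        raise ValueError("at least one surface region must be selected")
    # De-duplicate and order the selected region indices.
    return sorted(set(selected_region_indices))

def store_body_image(storage, image_id, image_bytes, capture_area):
    # Keep the image and its capturing-area information in one record.
    storage[image_id] = {"image": image_bytes, "capture_area": capture_area}
    return storage[image_id]

storage = {}
area = select_capture_area([17, 42, 17])   # regions picked on the 3D model
entry = store_body_image(storage, "img_001", b"...", area)
print(entry["capture_area"])  # [17, 42]
```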

Advantageously, in the capturing area storage step, the computing device stores the body image according to an image standard, or the DICOM standard, that includes metadata, and writes the capturing area information into a reserved field or a blank field of that image standard or DICOM standard.
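As a hedged illustration of writing the capturing area into a reserved metadata field, the sketch below serializes it into one field of a plain metadata record, analogous to a reserved field of an image standard or a DICOM private tag. The field names are illustrative and not taken from either standard.

```python
import json

# Hypothetical sketch: the capturing-area information is serialized into a
# single reserved/blank metadata field of an image record, analogous to
# writing into a reserved field of an image standard or a DICOM private tag.
# All field names here are illustrative, not taken from either standard.
def attach_capture_area(image_metadata, capture_area):
    record = dict(image_metadata)          # leave the original metadata intact
    record["ReservedCaptureArea"] = json.dumps(capture_area)
    return record

meta = {"PatientID": "P-001", "Modality": "XC"}   # XC: external-camera photo
stored = attach_capture_area(meta, {"index": 17, "name": "left forearm"})
print(json.loads(stored["ReservedCaptureArea"])["name"])  # left forearm
```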

λ°”λžŒμ§ν•˜κ²ŒλŠ”, 상기 촬영 μ˜μ—­μ˜ μ •λ³΄λŠ”, 상기 촬영 μ˜μ—­μ˜ ꡬ뢄 인덱슀(index), 상기 촬영 μ˜μ—­μ˜ ꡬ뢄 λͺ…μΉ­, 상기 3차원 신체 λͺ¨λΈ μƒμ˜ 3차원 μ’Œν‘œ, 및 상기 3차원 μ’Œν‘œμ— λŒ€μ‘λ˜λ„λ‘ κ°€κ³΅λœ 2차원 μ’Œν‘œ 쀑 적어도 ν•˜λ‚˜λ₯Ό ν¬ν•¨ν•œλ‹€.Preferably, the information of the imaging area includes a classification index of the imaging area, a classification name of the imaging area, 3D coordinates on the 3D body model, and 2D information processed to correspond to the 3D coordinates. contains at least one of the coordinates.

상기 λ°©λ²•μ˜ 일 μ‹€μ‹œ μ˜ˆμ—μ„œλŠ”, 상기 촬영 μ˜μ—­ 선택 단계 전에, 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, 촬영 μž₯치λ₯Ό ν¬ν•¨ν•˜κ±°λ‚˜ 미리 촬영된 μ˜μƒμ„ λ³΄μœ ν•œ 타 μž₯μΉ˜λ‘œμ„œ 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— μ—°λ™ν•˜λŠ” μž₯치 λ˜λŠ” 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— ν¬ν•¨λœ 촬영 μž₯μΉ˜λ‘œλΆ€ν„° μ‹ μ²΄μ˜ 적어도 일뢀가 촬영된 μ˜μƒμΈ 신체 μ˜μƒμ„ νšλ“ν•˜λŠ” 단계인 신체 μ˜μƒ νšλ“ 단계λ₯Ό μˆ˜ν–‰ν•œλ‹€.In an embodiment of the method, before the capturing area selection step, the computing device is another device including a photographing device or having a pre-captured image, and a device interworking with the computing device or a photographing device included in the computing device. A body image acquisition step, which is a step of obtaining a body image in which at least a part of the body is captured, is performed.

상기 λ°©λ²•μ˜ λ‹€λ₯Έ μ‹€μ‹œ μ˜ˆμ—μ„œλŠ”, 상기 촬영 μ˜μ—­ 선택 단계 후에, 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, 촬영 μž₯치λ₯Ό ν¬ν•¨ν•˜κ±°λ‚˜ 미리 촬영된 μ˜μƒμ„ λ³΄μœ ν•œ 타 μž₯μΉ˜λ‘œμ„œ 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— μ—°λ™ν•˜λŠ” μž₯치 λ˜λŠ” 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— ν¬ν•¨λœ 촬영 μž₯μΉ˜λ‘œλΆ€ν„° μ‹ μ²΄μ˜ 적어도 일뢀가 촬영된 μ˜μƒμΈ 신체 μ˜μƒμ„ νšλ“ν•˜λŠ” 단계인 신체 μ˜μƒ νšλ“ 단계λ₯Ό μˆ˜ν–‰ν•œλ‹€.In another embodiment of the method, after the capturing area selection step, the computing device is another device including a photographing device or holding a pre-captured image, and a device interworking with the computing device or a photographing device included in the computing device. A body image acquisition step, which is a step of obtaining a body image in which at least a part of the body is captured, is performed.

λ°”λžŒμ§ν•˜κ²Œ, 상기 방법은, 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, μ†Œμ • 개수의 ν‘œλ©΄ μ˜μ—­λ“€λ‘œ λΆ„ν• λœ κ°€μƒμ˜ 3차원 신체 λͺ¨λΈμ„ ν¬ν•¨ν•˜κ³ , 상기 3차원 신체 λͺ¨λΈμ— λŒ€ν•œ μ‚¬μš©μžμ˜ κ΄€μ°° μ‘°μž‘μ— μ˜ν•΄ 상기 ν‘œλ©΄ μ˜μ—­λ“€μ˜ 관찰을 κ°€λŠ₯ν•˜κ²Œ ν•˜λŠ” 제2 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ œκ³΅ν•¨μœΌλ‘œμ¨, 상기 제2 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­ 상에 상기 ν‘œλ©΄ μ˜μ—­λ“€ 각각에 κ²°λΆ€λœ 신체 μ˜μƒμ˜ 유무, μœ„μΉ˜ 및 개수 쀑 적어도 ν•˜λ‚˜λ₯Ό ν‘œμ‹œν•˜κ±°λ‚˜ 상기 κ²°λΆ€λœ 신체 μ˜μƒμ˜ 섬넀일(thumbnail)을 ν‘œμ‹œν•˜λŠ” 촬영 μ˜μ—­ μ—΄λžŒ 보쑰 단계λ₯Ό 더 ν¬ν•¨ν•œλ‹€.Preferably, in the method, the computing device includes a virtual three-dimensional body model divided into a predetermined number of surface regions, and observation of the surface regions is performed by a user's observation manipulation of the three-dimensional body model. By providing a second user interface area capable of displaying at least one of the presence, location, and number of body images associated with each of the surface areas on the second user interface area, or displaying thumbnails of the associated body images ( thumbnail) is further included.

Advantageously, the method further includes a step in which, in response to a request to view a specific body image, the computing device provides the specific body image together with a third user interface area in which the surface area associated with that body image is displayed on the virtual three-dimensional body model.

λ³Έ 발λͺ…μ˜ λ‹€λ₯Έ νƒœμ–‘μ— λ”°λ₯΄λ©΄, λ³Έ 발λͺ…에 λ”°λ₯Έ 방법듀을 μˆ˜ν–‰ν•˜λ„λ‘ κ΅¬ν˜„λœ λͺ…λ Ήμ–΄λ“€(instructions)을 ν¬ν•¨ν•˜λŠ” 컴퓨터 ν”„λ‘œκ·Έλž¨λ„ μ œκ³΅λœλ‹€. 예λ₯Ό λ“€μ–΄, κ·ΈλŸ¬ν•œ 컴퓨터 ν”„λ‘œκ·Έλž¨μ€ 기계 νŒλ… κ°€λŠ₯ν•œ λΉ„μΌμ‹œμ  기둝 맀체에 μ €μž₯될 수 μžˆλ‹€.According to another aspect of the present invention, a computer program comprising instructions implemented to perform the methods according to the present invention is also provided. For example, such a computer program may be stored on a machine-readable non-transitory recording medium.

λ³Έ 발λͺ…μ˜ 또 λ‹€λ₯Έ νƒœμ–‘μ— λ”°λ₯΄λ©΄, μ‹ μ²΄μ˜ 적어도 일뢀가 촬영된 μ˜μƒμΈ 신체 μ˜μƒμ„ νšλ“ν•˜μ—¬ κ΄€λ¦¬ν•˜λŠ” μ»΄ν“¨νŒ… μž₯μΉ˜κ°€ μ œκ³΅λ˜λŠ”λ°”, κ·Έ μ»΄ν“¨νŒ… μž₯μΉ˜λŠ”, 촬영 μž₯치λ₯Ό ν¬ν•¨ν•˜κ±°λ‚˜ 미리 촬영된 μ˜μƒμ„ λ³΄μœ ν•œ μž₯μΉ˜μ™€ μ—°λ™ν•˜κ±°λ‚˜ 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— ν¬ν•¨λœ 촬영 μž₯μΉ˜μ™€ μ—°λ™ν•˜λŠ” 톡신뢀; 및 μ†Œμ • 개수의 ν‘œλ©΄ μ˜μ—­λ“€λ‘œ λΆ„ν• λœ κ°€μƒμ˜ 3차원 신체 λͺ¨λΈμ„ ν¬ν•¨ν•˜κ³ , 상기 3차원 신체 λͺ¨λΈμ— λŒ€ν•œ μ‘°μž‘μ— μ˜ν•΄ 상기 ν‘œλ©΄ μ˜μ—­λ“€ 쀑 적어도 ν•˜λ‚˜λ₯Ό 선택 κ°€λŠ₯ν•˜κ²Œ ν•˜λŠ” 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ 상기 톡신뢀λ₯Ό ν†΅ν•˜μ—¬ μ†Œμ •μ˜ λ””μŠ€ν”Œλ ˆμ΄ μž₯μΉ˜μ— μ œκ³΅ν•¨μœΌλ‘œμ¨, 상기 ν‘œλ©΄ μ˜μ—­λ“€ 쀑 상기 신체 μ˜μƒμ— κ²°λΆ€λœ 적어도 ν•˜λ‚˜μ˜ ν‘œλ©΄ μ˜μ—­μΈ 촬영 μ˜μ—­μ„ μž…λ ₯λ°›κ±°λ‚˜ 상기 톡신뢀λ₯Ό ν†΅ν•˜μ—¬ μ—°λ™ν•˜λ˜ 상기 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ΄ μ œκ³΅λ˜λŠ” 타 μž₯μΉ˜λ‘œλΆ€ν„° 상기 촬영 μ˜μ—­μ„ μ „λ‹¬λ°›λŠ” 촬영 μ˜μ—­ 선택 ν”„λ‘œμ„ΈμŠ€, 및 νšλ“λœ 상기 신체 μ˜μƒμ„ 상기 촬영 μ˜μ—­μ˜ 정보와 ν•¨κ»˜ μ†Œμ •μ˜ μ €μž₯μ†Œμ— μ €μž₯ν•˜κ±°λ‚˜ 상기 톡신뢀λ₯Ό ν†΅ν•˜μ—¬ 타 μž₯치둜 ν•˜μ—¬κΈˆ μ €μž₯ν•˜λ„λ‘ μ§€μ›ν•˜λŠ” 촬영 μ˜μ—­ μ €μž₯ ν”„λ‘œμ„ΈμŠ€λ₯Ό μˆ˜ν–‰ν•˜λŠ” ν”„λ‘œμ„Έμ„œλ₯Ό ν¬ν•¨ν•œλ‹€.According to another aspect of the present invention, there is provided a computing device that acquires and manages a body image in which at least a part of the body is captured, and the computing device includes a photographing device or a device holding a previously photographed image. a communication unit that interworks or interworks with a photographing device included in the computing device; and a virtual 3D body model divided into a predetermined number of surface areas, and a first user interface area capable of selecting at least one of the surface areas by manipulating the 3D body model is provided to the communication unit. 
By providing it to a predetermined display device through, a photographing area that is at least one surface area associated with the body image among the surface areas is received or interlocked through the communication unit from another device provided with the first user interface area. Performing a capture area selection process for receiving the capture area and a capture area storage process for storing the obtained body image together with information on the capture area in a predetermined storage or supporting another device to store the captured body image through the communication unit contains the processor.

λ³Έ κ°œμ‹œμ„œμ˜ 발λͺ…에 μ˜ν•˜λ©΄, 의료 μ˜μƒμ˜ 촬영 μ‹œμ— κ·Έ 촬영이 λ˜λŠ” 신체 λΆ€μœ„λ₯Ό 3차원 신체 λͺ¨λΈμ„ μ΄μš©ν•˜μ—¬ μ§κ΄€μ μœΌλ‘œ μ„ νƒν•˜μ—¬ μ§€μ •ν•  수 있고, κ·Έ 신체 λΆ€μœ„μ— κ΄€ν•œ 정보λ₯Ό 의료 μ˜μƒμ— κ΄€ν•œ DICOM ν‘œμ€€ λ“±μ˜ ν˜•μ‹μœΌλ‘œ ν•¨κ»˜ μ €μž₯ν•΄λ‘˜ 수 μžˆμ–΄, μΆ”ν›„ 의료 μ˜μƒμ˜ μ—΄λžŒ μ‹œμ— μ˜λ£Œμ§„ λ“±μ˜ μ‚¬μš©μžκ°€ ν™˜μžμ— λŒ€μ‘ν•˜λŠ” κ°€μƒμ˜ 3차원 신체 λͺ¨λΈ μƒμ—μ„œ κ·Έ 의료 μ˜μƒλ“€μ„ μ§κ΄€μ μœΌλ‘œ μ°Ύμ•„λ³Ό 수 μžˆλŠ”λ°”, 의료 μ˜μƒμ˜ 촬영 및 μ—΄λžŒμ— κ΄€ν•œ μ˜λ£Œμ§„μ˜ νŽΈμ˜μ„±μ΄ ν–₯μƒλ˜λŠ” νš¨κ³Όκ°€ μžˆλ‹€.According to the invention of the present disclosure, when a medical image is captured, it is possible to intuitively select and designate a body part to be captured using a 3D body model, and information on the body part can be stored in the DICOM standard for medical images, etc. It can be stored together in the format of, so that when viewing medical images later, users such as medical staff can intuitively find the medical images on a virtual 3-dimensional body model corresponding to the patient, taking and viewing medical images. There is an effect of improving the convenience of the medical staff regarding.

λ³Έ 발λͺ…μ˜ μ‹€μ‹œ 예의 μ„€λͺ…에 이용되기 μœ„ν•˜μ—¬ μ²¨λΆ€λœ μ•„λž˜ 도면듀은 λ³Έ 발λͺ…μ˜ μ‹€μ‹œ μ˜ˆλ“€ 쀑 단지 일뢀일 뿐이며, λ³Έ 발λͺ…이 μ†ν•œ κΈ°μˆ λΆ„μ•Όμ—μ„œ ν†΅μƒμ˜ 지식을 κ°€μ§„ μ‚¬λžŒ(μ΄ν•˜ "ν†΅μƒμ˜ 기술자"라 함)은 λ³„κ°œμ˜ 발λͺ…에 이λ₯Ό μ •λ„μ˜ λ…Έλ ₯ 없이 이 도면듀에 κΈ°μ΄ˆν•˜μ—¬ λ‹€λ₯Έ 도면듀을 얻을 수 μžˆλ‹€.The accompanying drawings for use in describing the embodiments of the present invention are only some of the embodiments of the present invention, and those of ordinary skill in the art (hereinafter referred to as "ordinary technicians") Other figures may be obtained on the basis of these figures without any effort to the extent of inventing the invention.

FIG. 1 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs the body image management method according to the present disclosure.

FIG. 2 is an exemplary block diagram illustrating hardware or software components of a computing device that performs the body image management method according to the present disclosure.

FIG. 3A is an exemplary flowchart illustrating one embodiment of the body image management method according to the present disclosure, and FIG. 3B is an exemplary flowchart illustrating another embodiment of the method.

도 4λŠ” λ³Έ κ°œμ‹œμ„œμ˜ 신체 μ˜μƒ 관리 λ°©λ²•μ—μ„œ μ œκ³΅λ˜λŠ” μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€λ₯Ό μ˜ˆμ‹œμ μœΌλ‘œ λ‚˜νƒ€λ‚Έ 도면이닀.4 is a diagram showing a user interface provided in the body image management method of the present disclosure by way of example.

도 5aλŠ” λ³Έ κ°œμ‹œμ„œμ˜ 신체 μ˜μƒ 관리 λ°©λ²•μ—μ„œ μ œκ³΅λ˜λŠ” 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ˜ˆμ‹œμ μœΌλ‘œ λ‚˜νƒ€λ‚Έ 도면이닀.5A is a diagram illustrating a first user interface area provided in the body image management method of the present disclosure by way of example.

FIGS. 5B to 5D are diagrams showing, by way of example, the second user interface area provided in the body image management method of the present disclosure.

The following detailed description of the present invention refers to the accompanying drawings, which illustrate, by way of example, specific embodiments in which the invention may be practiced, in order to make clear the objects, technical solutions, and advantages of the invention. These embodiments are described in sufficient detail to enable a person skilled in the art to practice the invention. In the description referring to the accompanying drawings, the same components are given the same reference numerals regardless of the figure, and redundant descriptions of them are omitted. Like reference numerals in the drawings indicate the same or similar functions across the various aspects.

"제1" λ˜λŠ” "제2" λ“±μ˜ μš©μ–΄λ₯Ό λ‹€μ–‘ν•œ κ΅¬μ„±μš”μ†Œλ“€μ„ μ„€λͺ…ν•˜λŠ”λ° μ‚¬μš©λ  수 μžˆμ§€λ§Œ, 이런 μš©μ–΄λ“€μ€ ν•˜λ‚˜μ˜ κ΅¬μ„±μš”μ†Œλ₯Ό λ‹€λ₯Έ κ΅¬μ„±μš”μ†Œλ‘œλΆ€ν„° κ΅¬λ³„ν•˜λŠ” λͺ©μ μœΌλ‘œλ§Œ ν•΄μ„λ˜μ–΄μ•Ό ν•˜λŠ”λ°”, μ–΄λ– ν•œ μˆœμ„œλ„ μ‹œμ‚¬ν•˜κ³  μžˆμ§€ μ•ŠκΈ° λ•Œλ¬Έμ΄λ‹€. 예λ₯Ό λ“€μ–΄, 제1 κ΅¬μ„±μš”μ†ŒλŠ” 제2 κ΅¬μ„±μš”μ†Œλ‘œ λͺ…λͺ…될 수 있고, μœ μ‚¬ν•˜κ²Œ 제2 κ΅¬μ„±μš”μ†ŒλŠ” 제1 κ΅¬μ„±μš”μ†Œλ‘œλ„ λͺ…λͺ…될 수 μžˆλ‹€.Terms such as "first" or "second" may be used to describe various components, but such terms are to be interpreted solely for the purpose of distinguishing one component from another, and no order is implied. because it doesn't For example, a first element may be termed a second element, and similarly, a second element may be termed a first element.

When a component is referred to as being "connected" to another component, it should be understood that it may be directly connected or coupled to the other component, or that intervening components may exist.

λ‹¨μˆ˜μ˜ ν‘œν˜„μ€ λ¬Έλ§₯상 λͺ…λ°±ν•˜κ²Œ λ‹€λ₯΄κ²Œ λœ»ν•˜μ§€ μ•ŠλŠ” ν•œ, 볡수의 ν‘œν˜„μ„ ν¬ν•¨ν•œλ‹€. λ³Έ κ°œμ‹œμ„œμ—μ„œ, "ν¬ν•¨ν•˜λ‹€" λ˜λŠ” "κ°€μ§€λ‹€" λ“±μ˜ μš©μ–΄λŠ” 기재된 νŠΉμ§•, 숫자, 단계, λ™μž‘, κ΅¬μ„±μš”μ†Œ, λΆ€λΆ„ν’ˆ λ˜λŠ” 이듀을 μ‘°ν•©ν•œ 것이 μ‘΄μž¬ν•¨μœΌλ‘œ μ§€μ •ν•˜λ €λŠ” 것이지, ν•˜λ‚˜ λ˜λŠ” κ·Έ μ΄μƒμ˜ λ‹€λ₯Έ νŠΉμ§•λ“€μ΄λ‚˜ 숫자, 단계, λ™μž‘, κ΅¬μ„±μš”μ†Œ, λΆ€λΆ„ν’ˆ λ˜λŠ” 이듀을 μ‘°ν•©ν•œ κ²ƒλ“€μ˜ 쑴재 λ˜λŠ” λΆ€κ°€ κ°€λŠ₯성을 미리 λ°°μ œν•˜μ§€ μ•ŠλŠ” κ²ƒμœΌλ‘œ μ΄ν•΄λ˜μ–΄μ•Ό ν•œλ‹€.Singular expressions include plural expressions unless the context clearly dictates otherwise. In this disclosure, terms such as "comprise" or "have" are intended to designate that the described feature, number, step, operation, component, part, or combination thereof exists, but one or more other features or numbers, It should be understood that the presence or addition of steps, operations, components, parts, or combinations thereof is not precluded.

λ˜ν•œ, λ³Έ κ°œμ‹œμ„œμ—μ„œ 'μ˜μƒ'은 (μ˜ˆμ»¨λŒ€, λΉ„λ””μ˜€ 화면에 ν‘œμ‹œλœ) 눈으둜 λ³Ό 수 μžˆλŠ” μ˜μƒ λ˜λŠ” μ˜μƒμ˜ λ””μ§€ν„Έ ν‘œν˜„λ¬Όμ„ μ§€μΉ­ν•˜λŠ” μš©μ–΄μ΄λ‹€. λ³Έ κ°œμ‹œμ„œμ— 걸쳐 'ν‘œλ©΄ μ˜μƒ' λ˜λŠ” 'ν”ΌλΆ€ μ˜μƒ'은 ν”ΌλΆ€κ³Ό μ˜μƒ(dermatology image)λ₯Ό μ§€μΉ­ν•œλ‹€.Also, in this disclosure, 'image' is a term referring to an image that can be seen by the eye or a digital representation of an image (eg, displayed on a video screen). Throughout this disclosure 'surface images' or 'skin images' refer to dermatology images.

λ³Έ κ°œμ‹œμ„œμ—μ„œ '메타데이터'λŠ” λ‹€λ₯Έ 데이터λ₯Ό μ„€λͺ…ν•΄μ£ΌλŠ” 데이터λ₯Ό μ§€μΉ­ν•˜λŠ” μš©μ–΄λ‘œμ„œ, 예λ₯Ό λ“€μ–΄, '이미지'λŠ” κ·Έ 이미지λ₯Ό μ΄¬μ˜ν•œ μž₯치의 정보, 촬영 λ‹Ήμ‹œμ˜ μ‹œκ°„, λ…ΈμΆœ, ν”Œλž˜μ‹œ μ‚¬μš© μ—¬λΆ€, 해상도, 이미지 크기 λ“±μ˜ 정보λ₯Ό λ©”νƒ€λ°μ΄ν„°λ‘œ κ°€μ§ˆ 수 μžˆλ‹€.In this disclosure, 'metadata' is a term that refers to data that describes other data. For example, 'image' refers to the information of the device that took the image, the time at the time of shooting, exposure, whether or not the flash was used, and the resolution. , image size, etc. may be included as metadata.

*λ³Έ κ°œμ‹œμ„œμ—μ„œ 'DICOM(Digital Imaging and Communications in Medicine; 의료용 λ””μ§€ν„Έ μ˜μƒ 및 톡신)' ν‘œμ€€μ€ 의료용 κΈ°κΈ°μ—μ„œ λ””μ§€ν„Έ μ˜μƒ ν‘œν˜„κ³Ό 톡신에 μ΄μš©λ˜λŠ” μ—¬λŸ¬ κ°€μ§€ ν‘œμ€€μ„ μ΄μΉ­ν•˜λŠ” μš©μ–΄μΈλ°”, DICOM ν‘œμ€€μ€ λ―Έκ΅­ 방사선 μ˜ν•™νšŒ(ACR)와 λ―Έκ΅­ μ „κΈ° κ³΅μ—…νšŒ(NEMA)μ—μ„œ κ΅¬μ„±ν•œ μ—°ν•© μœ„μ›νšŒμ—μ„œ λ°œν‘œν•œλ‹€.*In this disclosure, the 'DICOM (Digital Imaging and Communications in Medicine)' standard is a term that collectively refers to various standards used for digital image expression and communication in medical devices. (ACR) and the National Electrical Manufacturers Association (NEMA).

그리고 λ³Έ κ°œμ‹œμ„œμ—μ„œ '의료 μ˜μƒ μ €μž₯ 전솑 μ‹œμŠ€ν…œ(PACS)'은, μ˜μƒ 및 톡신 ν‘œμ€€, μ˜ˆμ»¨λŒ€, DICOM ν‘œμ€€μ— 맞게 μ €μž₯, 가곡, μ „μ†‘ν•˜λŠ” μ‹œμŠ€ν…œμ„ μ§€μΉ­ν•˜λŠ” μš©μ–΄μ΄λ©°, Xμ„ , CT, MRI와 같은 λ””μ§€ν„Έ 의료 μ˜μƒ μž₯λΉ„λ₯Ό μ΄μš©ν•˜μ—¬ νšλ“λœ 의료 μ˜μƒ μ΄λ―Έμ§€λŠ” ν‘œμ€€ ν˜•μ‹μœΌλ‘œ μ €μž₯되고 λ„€νŠΈμ›Œν¬λ₯Ό ν†΅ν•˜μ—¬ 의료 κΈ°κ΄€ λ‚΄μ™Έμ˜ λ‹¨λ§λ‘œ 전솑이 κ°€λŠ₯ν•˜λ©°, μ΄μ—λŠ” νŒλ… κ²°κ³Ό 및 μ§„λ£Œ 기둝이 좔가될 수 μžˆλ‹€.And in the present disclosure, 'medical image storage and transmission system (PACS)' is a term that refers to a system that stores, processes, and transmits images and communication standards in accordance with the DICOM standard, and digital such as X-ray, CT, and MRI. Medical imaging images obtained using medical imaging equipment are stored in a standard format and can be transmitted to terminals inside and outside medical institutions through a network, and reading results and medical records can be added thereto.

λ˜ν•œ, λ³Έ κ°œμ‹œμ„œμ—μ„œ 'ν•™μŠ΅', 'ν›ˆλ ¨', ν˜Ήμ€ 'λŸ¬λ‹'은 μ ˆμ°¨μ— λ”°λ₯Έ μ»΄ν“¨νŒ…(computing)을 ν†΅ν•˜μ—¬ 기계 ν•™μŠ΅(machine learning)을 μˆ˜ν–‰ν•¨μ„ μΌμ»«λŠ” μš©μ–΄μΈλ°”, μΈκ°„μ˜ ꡐ윑 ν™œλ™κ³Ό 같은 정신적 μž‘μš©μ„ μ§€μΉ­ν•˜λ„λ‘ μ˜λ„λœ 것이 μ•„λ‹˜μ„ ν†΅μƒμ˜ κΈ°μˆ μžλŠ” 이해할 수 μžˆμ„ 것이닀.In addition, in the present disclosure, 'learning', 'training', or 'learning' is a term referring to performing machine learning through procedural computing, which is a mental function such as human educational activity. It will be appreciated by those skilled in the art that it is not intended to refer to.

λ‹€λ₯΄κ²Œ μ •μ˜λ˜μ§€ μ•ŠλŠ” ν•œ, κΈ°μˆ μ μ΄κ±°λ‚˜ 과학적인 μš©μ–΄λ₯Ό ν¬ν•¨ν•΄μ„œ μ—¬κΈ°μ„œ μ‚¬μš©λ˜λŠ” λͺ¨λ“  μš©μ–΄λ“€μ€ ν•΄λ‹Ή 기술 λΆ„μ•Όμ—μ„œ ν†΅μƒμ˜ 지식을 κ°€μ§„ μžμ— μ˜ν•΄ 일반적으둜 μ΄ν•΄λ˜λŠ” 것과 λ™μΌν•œ 의미λ₯Ό κ°€μ§„λ‹€. 일반적으둜 μ‚¬μš©λ˜λŠ” 사전에 μ •μ˜λ˜μ–΄ μžˆλŠ” 것과 같은 μš©μ–΄λ“€μ€ κ΄€λ ¨ 기술의 λ¬Έλ§₯상 κ°€μ§€λŠ” μ˜λ―Έμ™€ μΌμΉ˜ν•˜λŠ” 의미λ₯Ό κ°–λŠ” κ²ƒμœΌλ‘œ ν•΄μ„λ˜μ–΄μ•Ό ν•˜λ©°, λ³Έ κ°œμ‹œμ„œμ—μ„œ λͺ…λ°±ν•˜κ²Œ μ •μ˜ν•˜μ§€ μ•ŠλŠ” ν•œ, μ΄μƒμ μ΄κ±°λ‚˜ κ³Όλ„ν•˜κ²Œ ν˜•μ‹μ μΈ 의미둜 ν•΄μ„λ˜μ§€ μ•ŠλŠ”λ‹€.Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art. Terms such as those defined in commonly used dictionaries should be interpreted as having a meaning consistent with the meaning in the context of the related art, and unless explicitly defined in the present disclosure, interpreted in an ideal or excessively formal meaning. It doesn't work.

λ”μš±μ΄ λ³Έ 발λͺ…은 λ³Έ κ°œμ‹œμ„œμ— ν‘œμ‹œλœ μ‹€μ‹œ μ˜ˆλ“€μ˜ λͺ¨λ“  κ°€λŠ₯ν•œ 쑰합듀을 λ§λΌν•œλ‹€. λ³Έ 발λͺ…μ˜ λ‹€μ–‘ν•œ μ‹€μ‹œ μ˜ˆλŠ” μ„œλ‘œ λ‹€λ₯΄μ§€λ§Œ μƒν˜Έ 배타적일 ν•„μš”λŠ” μ—†μŒμ΄ μ΄ν•΄λ˜μ–΄μ•Ό ν•œλ‹€. 예λ₯Ό λ“€μ–΄, 여기에 κΈ°μž¬λ˜μ–΄ μžˆλŠ” νŠΉμ • ν˜•μƒ, ꡬ쑰 및 νŠΉμ„±μ€ 일 μ‹€μ‹œ μ˜ˆμ— κ΄€λ ¨ν•˜μ—¬ λ³Έ 발λͺ…μ˜ 사상 및 λ²”μœ„λ₯Ό λ²—μ–΄λ‚˜μ§€ μ•ŠμœΌλ©΄μ„œ λ‹€λ₯Έ μ‹€μ‹œ 예둜 κ΅¬ν˜„λ  수 μžˆλ‹€. λ˜ν•œ, 각각의 κ°œμ‹œλœ μ‹€μ‹œ 예 λ‚΄μ˜ κ°œλ³„ κ΅¬μ„±μš”μ†Œμ˜ μœ„μΉ˜ λ˜λŠ” λ°°μΉ˜λŠ” λ³Έ 발λͺ…μ˜ 사상 및 λ²”μœ„λ₯Ό λ²—μ–΄λ‚˜μ§€ μ•ŠμœΌλ©΄μ„œ 변경될 수 있음이 μ΄ν•΄λ˜μ–΄μ•Ό ν•œλ‹€. λ”°λΌμ„œ, ν›„μˆ ν•˜λŠ” μƒμ„Έν•œ μ„€λͺ…은 ν•œμ •μ μΈ μ˜λ―Έλ‘œμ„œ μ·¨ν•˜λ €λŠ” 것이 μ•„λ‹ˆλ‹€. Moreover, the present invention covers all possible combinations of the embodiments presented in this disclosure. It should be understood that the various embodiments of the present invention are different, but need not be mutually exclusive. For example, specific shapes, structures, and characteristics described herein may be implemented in one embodiment in another embodiment without departing from the spirit and scope of the invention. Additionally, it should be understood that the location or arrangement of individual components within each disclosed embodiment may be changed without departing from the spirit and scope of the invention. Accordingly, the detailed description that follows is not intended to be taken in a limiting sense.

λ³Έ κ°œμ‹œμ„œμ—μ„œ 달리 ν‘œμ‹œλ˜κ±°λ‚˜ λΆ„λͺ…νžˆ λ¬Έλ§₯에 λͺ¨μˆœλ˜μ§€ μ•ŠλŠ” ν•œ, λ‹¨μˆ˜λ‘œ μ§€μΉ­λœ ν•­λͺ©μ€, κ·Έ λ¬Έλ§₯μ—μ„œ 달리 μš”κ΅¬λ˜μ§€ μ•ŠλŠ” ν•œ, 볡수의 것을 μ•„μš°λ₯Έλ‹€. λ˜ν•œ, λ³Έ 발λͺ…을 μ„€λͺ…함에 μžˆμ–΄, κ΄€λ ¨λœ 곡지 ꡬ성 λ˜λŠ” κΈ°λŠ₯에 λŒ€ν•œ ꡬ체적인 μ„€λͺ…이 λ³Έ 발λͺ…μ˜ μš”μ§€λ₯Ό 흐릴 수 μžˆλ‹€κ³  νŒλ‹¨λ˜λŠ” κ²½μš°μ—λŠ” κ·Έ μƒμ„Έν•œ μ„€λͺ…은 μƒλž΅ν•œλ‹€.In this disclosure, unless otherwise indicated or clearly contradicted by context, terms referred to in the singular encompass the plural unless the context requires otherwise. In addition, in describing the present invention, if it is determined that a detailed description of a related known configuration or function may obscure the gist of the present invention, the detailed description will be omitted.

μ΄ν•˜, ν†΅μƒμ˜ κΈ°μˆ μžκ°€ λ³Έ 발λͺ…을 μš©μ΄ν•˜κ²Œ μ‹€μ‹œν•  수 μžˆλ„λ‘ ν•˜κΈ° μœ„ν•˜μ—¬, λ³Έ 발λͺ…μ˜ λ°”λžŒμ§ν•œ μ‹€μ‹œ μ˜ˆλ“€μ— κ΄€ν•˜μ—¬ μ²¨λΆ€λœ 도면을 μ°Έμ‘°ν•˜μ—¬ μƒμ„Ένžˆ μ„€λͺ…ν•˜κΈ°λ‘œ ν•œλ‹€.Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily practice the present invention.

FIG. 1 is a conceptual diagram schematically illustrating an exemplary configuration of a computing device that performs the body image management method according to the present disclosure.

Referring to FIG. 1, a computing device 100 according to an embodiment of the present disclosure includes a communication unit 110 and a processor 120, and can communicate directly or indirectly with an external computing device (not shown) through the communication unit 110.

Specifically, the computing device 100 may achieve the desired system performance using a combination of typical computer hardware (e.g., devices that may include a computer, a processor, memory, storage, input and output devices, and other components of conventional computing devices; electronic communication devices such as routers and switches; and electronic information storage systems such as network-attached storage (NAS) and storage area networks (SAN)) and computer software (i.e., instructions that cause the computing device to function in a particular way). The storage may take the form of a memory device such as a hard disk or USB (Universal Serial Bus) memory, as well as a network-connected storage device such as a cloud server.

The communication unit 110 of such a computing device may transmit and receive requests and responses to and from other interworking computing devices, for example a dedicated repository such as a database server. As one example, such requests and responses may be carried over the same Transmission Control Protocol (TCP) session, but they are not limited thereto and may, for example, be transmitted and received as User Datagram Protocol (UDP) datagrams.

Specifically, the communication unit 110 may be implemented in the form of a communication module including a communication interface. For example, the communication interface may include wireless Internet interfaces such as WLAN (Wireless LAN), WiFi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMax (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access), as well as short-range communication interfaces such as Bluetooth™, RFID (Radio Frequency IDentification), IrDA (Infrared Data Association), UWB (Ultra-WideBand), ZigBee, and NFC (Near Field Communication). Moreover, the communication interface may represent any interface through which the processor can communicate with the outside (e.g., various buses such as a data bus, and wired/wireless interfaces).

For example, the communication unit 110 may transmit and receive data to and from another computing device through such an appropriate communication interface. In a broader sense, the communication unit 110 may include, or be interlocked with, a keyboard, a mouse, or other external input devices for receiving commands or instructions, as well as a printing device, a display, or other external output devices. It is known that, in order to enable interaction with a user by displaying a user interface suitable for the user of a computing device (for example, a user terminal such as a portable terminal or a personal computer (PC), or a server), the computing device 100 may have a built-in display unit or may be interlocked with an external display device through the communication unit 110. For example, such a display unit or display device may be a touch screen capable of touch input.

λ˜ν•œ, μ»΄ν“¨νŒ… μž₯치의 ν”„λ‘œμ„Έμ„œ(120)λŠ” μΊμ‹œ λ©”λͺ¨λ¦¬(cache memory)와 같은 λ‚΄λΆ€ λ©”λͺ¨λ¦¬ 및/λ˜λŠ” μ™ΈλΆ€ λ©”λͺ¨λ¦¬λ₯Ό κ°€μ§€λŠ” ν•˜λ‚˜ μ΄μƒμ˜, MPU(micro processing unit), CPU(central processing unit), GPU(graphics processing unit), NPU(neural processing unit) λ˜λŠ” TPU(tensor processing unit)와 같은 λ§ˆμ΄ν¬λ‘œν”„λ‘œμ„Έμ„œ, 컨트둀러, μ˜ˆμ»¨λŒ€, 마이크둜컨트둀러, μž„λ² λ””λ“œ 마이크둜컨트둀러, λ§ˆμ΄ν¬λ‘œμ»΄ν“¨ν„°, ALU(arithmetic logic unit), λ””μ§€ν„Έ μ‹ ν˜Έ ν”„λ‘œμ„Έμ„œ(digital signal processor), μ˜ˆμ»¨λŒ€, ν”„λ‘œκ·Έλž˜λ¨ΈλΈ” λ””μ§€ν„Έ μ‹ ν˜Έ ν”„λ‘œμ„Έμ„œ λ˜λŠ” 기타 ν”„λ‘œκ·Έλž˜λ¨ΈλΈ” μž₯치 λ“±μ˜ ν•˜λ“œμ›¨μ–΄ ꡬ성을 포함할 수 μžˆλ‹€. λ˜ν•œ, 운영체제, νŠΉμ • λͺ©μ μ„ μˆ˜ν–‰ν•˜λŠ” μ• ν”Œλ¦¬μΌ€μ΄μ…˜μ˜ μ†Œν”„νŠΈμ›¨μ–΄ ꡬ성을 더 포함할 μˆ˜λ„ μžˆλ‹€.In addition, the processor 120 of the computing device may include one or more micro processing units (MPUs), central processing units (CPUs), and graphics processing units (GPUs) having internal memory such as cache memory and/or external memory. , a microprocessor such as a neural processing unit (NPU) or a tensor processing unit (TPU), a controller such as a microcontroller, an embedded microcontroller, a microcomputer, an arithmetic logic unit (ALU), a digital signal processor, such as , a programmable digital signal processor or other programmable device. In addition, it may further include a software configuration of an operating system and an application that performs a specific purpose.

Now, the body image management method according to the present disclosure will be described in detail with reference to FIGS. 2 to 5B.

FIG. 2 is an exemplary block diagram illustrating hardware or software components of a computing device that performs the body image management method according to the present disclosure, and FIG. 3A is an exemplary flowchart illustrating a first embodiment of the body image management method according to the present disclosure.

Referring to FIGS. 2 and 3A, the first embodiment of the body image management method of the present disclosure includes a step (body image acquisition step; S100) in which a body image acquisition module 210 implemented by the computing device 200 acquires, from a source of body images — for example, from a photographing device (e.g., a camera) 205a included in the computing device 200, a photographing device 205b interworking with the computing device 200, a photographing device 305 included in another device 300, or another device 300 holding previously captured images — a body image, that is, an image in which at least a part of the body of a subject (or patient) is photographed.

For example, if the computing device 200 is a personal computer used by medical staff, the other device 300 may be a portable terminal including a camera.

FIG. 4 is a diagram exemplarily showing a user interface provided in the body image management method of the present disclosure.

Referring to FIG. 4, an exemplary configuration of the user interface 400 provided by the computing device 200, or via another device 300 interlocked therewith, in embodiments of the body image management method according to the present disclosure is shown. For the user's convenience, information on the subject (patient), including at least one of the subject's name 422, sex 424, age (not shown), identification number (patient ID) 426, and the name of the person in charge 428, may be provided to the user. In addition, a list 430 of multiple subjects may be provided together, and a predetermined interface element, such as the button exemplified by reference numeral 410 in FIG. 4, may be provided so that the body image acquisition step (S100) can be performed with a simple operation.

신체 μ˜μƒ νšλ“ 단계(S100) λ‹€μŒμœΌλ‘œ, λ³Έ κ°œμ‹œμ„œμ˜ 신체 μ˜μƒ 관리 λ°©λ²•μ˜ 제1 μ‹€μ‹œ μ˜ˆλŠ”, μ»΄ν“¨νŒ… μž₯치(200)에 μ˜ν•˜μ—¬ κ΅¬ν˜„λ˜λŠ” 촬영 μ˜μ—­ 선택 λͺ¨λ“ˆ(220)이, μ»΄ν“¨νŒ… μž₯치(200)에 ν¬ν•¨λœ μ†Œμ •μ˜ λ””μŠ€ν”Œλ ˆμ΄λΆ€ 상에 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ œκ³΅ν•˜κ±°λ‚˜ μ»΄ν“¨νŒ… μž₯치(200)에 μ—°λ™ν•˜λŠ” 타 μž₯치(300)둜 ν•˜μ—¬κΈˆ 상기 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ œκ³΅ν•˜λ„λ‘ μ§€μ›ν•¨μœΌλ‘œμ¨ 상기 신체 μ˜μƒμ— κ²°λΆ€λœ ν‘œλ©΄ μ˜μ—­μΈ 촬영 μ˜μ—­μ„ μž…λ ₯ λ˜λŠ” μ „λ‹¬λ°›λŠ” 단계(촬영 μ˜μ—­ 선택 단계; S200)λ₯Ό 더 ν¬ν•¨ν•œλ‹€.Next to the body image acquisition step (S100), in the first embodiment of the body image management method of the present disclosure, the capturing area selection module 220 implemented by the computing device 200 is included in the computing device 200. The surface area associated with the body image by providing a first user interface area on a predetermined display unit or supporting another device 300 that works with the computing device 200 to provide the first user interface area. A step of inputting or receiving a capture area (selection of a capture area; S200) is further included.

FIG. 5A is a diagram exemplarily showing the first user interface area provided in the body image management method of the present disclosure.

Referring to FIG. 5A, the first user interface area 500 includes a virtual three-dimensional (3D) body model 520, and the 3D body model 520 is divided into a predetermined number of surface areas 540. The surface areas may be areas segmented with reference to surface anatomy relevant to plastic surgery or dermatology, which is particularly useful when the body image is, for example, a dermoscopy image.

μ‚¬μš©μžλŠ” 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­(500), 특히, 3차원 신체 λͺ¨λΈ (520)에 λŒ€ν•œ μ‘°μž‘μ— μ˜ν•˜μ—¬ 상기 ν‘œλ©΄ μ˜μ—­λ“€(540) 쀑 적어도 ν•˜λ‚˜(550)λ₯Ό 선택할 수 μžˆλŠ”λ°”, κ·Έ μ„ νƒλœ 적어도 ν•˜λ‚˜μ˜ ν‘œλ©΄ μ˜μ—­(550)은 상기 신체 μ˜μƒμ— κ²°λΆ€λœ κ²ƒμœΌλ‘œ μ§€μ •λœ ν‘œλ©΄ μ˜μ—­μΈ 촬영 μ˜μ—­(550)이닀.The user may select at least one 550 of the surface areas 540 by manipulating the first user interface area 500, in particular, the 3D body model 520, and the selected at least one surface Area 550 is an imaging area 550, which is a surface area designated as being associated with the body image.

예λ₯Ό λ“€μ–΄, μ‚¬μš©μžλ‘œ ν•˜μ—¬κΈˆ μ»΄ν“¨νŒ… μž₯치(200)에 μ—°λ™ν•˜λŠ” μž…λ ₯ μž₯치인 마우슀(λ―Έλ„μ‹œ)의 μ™Όμͺ½, 였λ₯Έμͺ½ λ²„νŠΌμ˜ 클릭 및 κ°€μš΄λ° λ²„νŠΌμ˜ 클릭, 휠(wheel)의 μ—…λ‹€μš΄(up-down) λ“±μ˜ μ‘°μž‘, ν„°μΉ˜μŠ€ν¬λ¦° μƒμ˜ νƒ­(tab), ν•€μΉ˜(pinch)와 같은 제슀처의 μ‘°μž‘ λ“±μœΌλ‘œ μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μƒμ˜ 3차원 λŒ€μƒλ¬ΌμΈ 3차원 신체 λͺ¨λΈ(520)을 νšŒμ „, ν™•λŒ€, μΆ•μ†Œ, λ˜λŠ” ν‰ν–‰μ΄λ™μ‹œν‚¬ 수 있게 ν•˜λŠ” μ—¬λŸ¬ κ°€μ§€ μ‘°μž‘μ˜ 방식이 μ•Œλ €μ Έ μžˆμœΌλ‚˜ 이에 ν•œμ •λ˜μ§€ μ•ŠμŒμ€ 물둠이닀. 3차원 λŒ€μƒλ¬Όμ„ λ‹€λ£¨λŠ” μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ ꡬ성에 κ΄€ν•œ λ°©μ•ˆμ€ 컴퓨터 ν•˜λ“œμ›¨μ–΄ 및 μ†Œν”„νŠΈμ›¨μ–΄μ— κ΄€ν•œ κΈ°μˆ λΆ„μ•Όμ—μ„œ ν†΅μƒμ˜ 지식을 κ°€μ§„ μ‚¬λžŒμ—κ²Œ 잘 μ•Œλ €μ Έ μžˆμœΌλ―€λ‘œ 이에 κ΄€ν•œ μ§€λ‚˜μΉœ μ„ΈλΆ€ μ„€λͺ…은 μƒλž΅ν•˜κΈ°λ‘œ ν•œλ‹€.For example, manipulations such as clicks of the left and right buttons and middle buttons of a mouse (not shown), which is an input device linked to the computing device 200, and up-down of a wheel, etc. , Tap on the touch screen (tab), various manipulations such as pinch (pinch), etc. to rotate, enlarge, reduce, or parallel move the 3D body model 520, which is a 3D object on the user interface. Although the method of manipulation is known, it is needless to say that it is not limited thereto. Since methods for constructing user interfaces for handling 3D objects are well known to those skilled in the art related to computer hardware and software, excessive detailed descriptions thereof will be omitted.

λ―Έλ„μ‹œλœ 일 λ³€ν˜•λ‘€λ‘œμ„œ, 상기 신체 μ˜μƒμ˜ μ›μ²œμœΌλ‘œ κΈ°λŠ₯ν•˜λŠ” μž₯치, μ˜ˆμ»¨λŒ€, 촬영 μž₯치(205a, 205b, 305 λ“±)에 따라 μ—¬λŸ¬ ν‘œλ©΄ μ˜μ—­λ“€μ„ λ™μ‹œμ— μ΄¬μ˜ν•΄μ•Ό ν•  κ²½μš°λ„ μžˆλŠ”λ°”, 이λ₯Ό μœ„ν•˜μ—¬ 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­(500)은 μ—¬λŸ¬ ν‘œλ©΄ μ˜μ—­λ“€μ„ λ™μ‹œμ— μ§€μ •ν•  수 μžˆλ„λ‘ ꡬ성될 μˆ˜λ„ μžˆμ„ 것이닀.As a variation not shown, there are cases in which several surface areas need to be simultaneously captured according to a device serving as a source of the body image, for example, a photographing device 205a, 205b, 305, etc., for which the first user The interface area 500 may be configured to simultaneously designate several surface areas.

촬영 μ˜μ—­ 선택 단계(S200)μ—μ„œλŠ” 촬영 μ˜μ—­(550)뿐만 μ•„λ‹ˆλΌ κ·Έ 촬영 μ˜μ—­(550)에 μ†ν•œ νŠΉμ • 지점(560)에 κ΄€ν•œ 선택도 ν•¨κ»˜ μ΄λ£¨μ–΄μ§ˆ 수 μžˆλŠ”λ°”, 주둜 κ·Έ νŠΉμ • 지점(560)은 촬영된 신체 μ˜μƒμ˜ 쀑심점을 κ°€λ¦¬ν‚€λŠ” 것일 수 μžˆλ‹€.In the capturing area selection step (S200), not only the capturing area 550 but also a specific point 560 belonging to the capturing area 550 can be selected. Mainly, the specific point 560 is the captured body image. It may point to the center point of

촬영 μ˜μ—­ 선택 단계(S200)μ—μ„œ, 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­(500)μ—μ„œ 촬영 μ˜μ—­(550)의 선택이 이루어지면, κ·Έ μ„ νƒλœ 촬영 μ˜μ—­(550)의 ꡬ뢄 λͺ…칭을 λ‚˜νƒ€λ‚΄λŠ” μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μš”μ†Œ(590)κ°€ 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­(500) 상에 더 제곡될 수 μžˆλŠ”λ°”, 도 5a의 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μš”μ†Œ(590)μ—λŠ” 'Chest, lower'(ν•˜λΆ€ 흉뢀)κ°€ 촬영 μ˜μ—­(550)의 ꡬ뢄 λͺ…μΉ­μœΌλ‘œ μ˜ˆμ‹œλ˜μ—ˆλ‹€.In the capturing area selection step (S200), when the capturing area 550 is selected in the first user interface area 500, a user interface element 590 indicating a distinction name of the selected capturing area 550 is displayed as the first user interface element 590. As it may be further provided on the user interface area 500, 'Chest, lower' is exemplified as a classification name for the imaging area 550 in the user interface element 590 of FIG. 5A.

촬영 μ˜μ—­ 선택 단계(S200) λ‹€μŒμœΌλ‘œ, λ³Έ κ°œμ‹œμ„œμ˜ 신체 μ˜μƒ 관리 λ°©λ²•μ˜ 제1 μ‹€μ‹œ μ˜ˆλŠ”, μ»΄ν“¨νŒ… μž₯치(200)에 μ˜ν•˜μ—¬ κ΅¬ν˜„λ˜λŠ” 촬영 μ˜μ—­ μ €μž₯ λͺ¨λ“ˆ(230)이, 상기 신체 μ˜μƒμ„ 촬영 μ˜μ—­(550)의 정보와 ν•¨κ»˜, 더 ꡬ체적으둜 νŠΉμ • 지점(560)의 정보와 ν•¨κ»˜ μ†Œμ •μ˜ μ €μž₯μ†Œ(240)에 μ €μž₯ν•˜κ±°λ‚˜ 타 μž₯치(300')둜 ν•˜μ—¬κΈˆ μ €μž₯μ†Œ(340')에 μ €μž₯ν•˜λ„λ‘ μ§€μ›ν•˜λŠ” 단계(촬영 μ˜μ—­ μ €μž₯ 단계; S300)λ₯Ό 더 ν¬ν•¨ν•œλ‹€. 타 μž₯치(300')λŠ” 타 μž₯치(300)와 λ™μΌν•œ μž₯치일 μˆ˜λ„ 있고, κ·Έλ ‡μ§€ μ•Šμ„ μˆ˜λ„ μžˆλ‹€.Next, in the capturing region selection step (S200), in the first embodiment of the body image management method of the present disclosure, the capturing region storage module 230 implemented by the computing device 200 stores the body image in the capturing region ( 550) together with the information of the specific point 560 in the predetermined storage 240 or supporting another device 300' to store the information in the storage 340' (photographing area A storage step; S300) is further included. The other device 300' may be the same device as the other device 300 or not.

촬영 μ˜μ—­ μ €μž₯ 단계(S300)μ—μ„œ 상기 신체 μ˜μƒμ˜ 피검체λ₯Ό μ‹λ³„ν•˜κΈ° μœ„ν•œ ν”Όκ²€μ²΄μ˜ 이름, μ—°λ Ή, 성별 λ“±κ³Ό 같은 개인 정보가 상기 신체 μ˜μƒκ³Ό ν•¨κ»˜ 취급될 수 μžˆλ‹€.In the capturing region storage step (S300), personal information such as the name, age, and gender of the subject for identifying the subject of the body image may be handled together with the body image.

λ˜ν•œ, 촬영 μ˜μ—­ μ €μž₯ 단계(S300)μ—μ„œλŠ” 상기 신체 μ˜μƒμ„ 메타데이터(metadata)λ₯Ό ν¬ν•¨ν•˜λŠ” 이미지 ν‘œμ€€ λ˜λŠ” DICOM ν‘œμ€€μœΌλ‘œ μ €μž₯ν•  수 있으며, 상기 이미지 ν‘œμ€€ λ˜λŠ” 상기 DICOM ν‘œμ€€μ˜ μ˜ˆμ•½λœ ν•„λ“œ(reserved field) λ˜λŠ” μ—¬λΆ„μ˜ ν•„λ“œ(blank field)에 촬영 μ˜μ—­(550)의 정보λ₯Ό ν•¨κ»˜ κΈ°μž…ν•  수 μžˆλ‹€.In addition, in the capturing region storage step (S300), the body image may be stored in an image standard including metadata or DICOM standard, and a reserved field of the image standard or the DICOM standard or an extra Information on the capturing area 550 may also be written in the blank field.

μ˜ˆμ»¨λŒ€, 촬영 μ˜μ—­(550)의 μ •λ³΄λŠ”, ν‘œλ©΄ μ˜μ—­λ“€(540) κ°€μš΄λ°μ„œ 촬영 μ˜μ—­(550)을 식별할 수 있게 ν•˜λŠ” ꡬ뢄 인덱슀(index), 상기 촬영 μ˜μ—­μ˜ ꡬ뢄 λͺ…μΉ­, 3차원 신체 λͺ¨λΈ(520) μƒμ˜ νŠΉμ • 지점(560)의 3차원 μ’Œν‘œ, κ·Έ 3차원 μ’Œν‘œμ— λŒ€μ‘ν•˜λ„λ‘ κ°€κ³΅λœ 2차원 μ’Œν‘œ 쀑 ν•˜λ‚˜ 이상일 수 μžˆλ‹€. 신체 ν‘œλ©΄μ˜ νŠΉμ • 지점은 μ—¬λŸ¬ μ‚¬μ˜(projection) 기법을 μ΄μš©ν•˜μ—¬ 2차원 μ’Œν‘œμ— λŒ€μ‘λ  수 μžˆμŒμ€ 잘 μ•Œλ €μ Έ μžˆλ‹€.For example, the information on the imaging area 550 includes a classification index for identifying the imaging area 550 among the surface areas 540, a classification name of the imaging area, and an image of the 3D body model 520. It may be one or more of 3D coordinates of a specific point 560 and 2D coordinates processed to correspond to the 3D coordinates. It is well known that a specific point on the body surface can correspond to a two-dimensional coordinate using various projection techniques.

A second embodiment of the body image management method of the present disclosure, which differs from the first embodiment only in the order of steps S100 and S200, is as follows.

FIG. 3B is an exemplary flowchart illustrating the second embodiment of the body image management method according to the present disclosure. Referring to FIGS. 2 and 3B, in the second embodiment of the body image management method according to the present disclosure, the body image acquisition step (S100') is performed after the photographing area selection step (S200').

λ³Έ κ°œμ‹œμ„œμ— λ”°λ₯Έ 신체 μ˜μƒ 관리 λ°©λ²•μ˜ μ‹€μ‹œ μ˜ˆλ“€μ€, 촬영 μ˜μ—­μ— κ΄€ν•œ μ‚¬μš©μžμ˜ μ—΄λžŒμ„ λ³΄μ‘°ν•˜λŠ” 좔가적인 κΈ°λŠ₯을 μˆ˜ν–‰ν•  수 μžˆλŠ”λ°”, 도 5b λ‚΄μ§€ 도 5dλŠ” 이λ₯Ό μœ„ν•΄ μ œκ³΅λ˜λŠ” 제2 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ˜ˆμ‹œμ μœΌλ‘œ λ‚˜νƒ€λ‚Έ 도면이닀.Embodiments of the body image management method according to the present disclosure may perform an additional function of assisting a user in viewing a photographing area, and FIGS. 5B to 5D illustrate a second user interface area provided for this purpose. It is an enemy drawing.

First, referring to FIGS. 5B and 5C, the body image management method according to the present disclosure may further include a step (photographing area browsing assistance step; S400) in which a photographing area browsing assistance module 250 implemented by the computing device 200 provides a second user interface area 500' that includes a virtual 3D body model 520' representing the body of the subject (patient), like the aforementioned 3D body model 520, and that enables observation of the subject's surface areas through the user's observation manipulation of the 3D body model 520', and displays, on the second user interface area 500', at least one of the presence or absence, the locations 560a, 560b, 560c, and the counts 570a, 570b, 570c of the body images associated with each of the surface areas.

For example, the presence or absence of a body image associated with each surface area may be displayed in such a way as to distinguish the surface areas 524a, 524b, 524c having an associated body image from the surface areas 544a, 544b having none; in FIG. 5B they are distinguished by color, but the display is of course not limited thereto.
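The per-region display state described above can be sketched as a lookup over an index of stored images, as below; the region identifiers, file names, and color labels are all hypothetical, and in the disclosure this index would be derived from the images stored with their photographing-area information in step S300.

```python
# Hypothetical region-to-image index built from stored body images.
images_by_region = {
    "chest_lower": ["img_001.dcm", "img_007.dcm"],
    "forearm_left": ["img_003.dcm"],
}

ALL_REGIONS = ["chest_lower", "forearm_left", "abdomen", "back_upper"]

def region_display_state(region):
    """Return (has_images, count, color) for rendering one surface area."""
    images = images_by_region.get(region, [])
    color = "highlight" if images else "neutral"  # color-coding as in FIG. 5B
    return bool(images), len(images), color

states = {r: region_display_state(r) for r in ALL_REGIONS}
```

The same lookup also serves the click/tap interaction described later: selecting a region marker simply returns `images_by_region[region]` as the list of images to present.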

As an alternative, referring to FIG. 5D, the photographing area browsing assistance module 250 may, in step S400, display thumbnails 580a, 580b, 580c of the associated body images on the second user interface area 500'.

μ—¬κΈ°μ—μ„œ μ‚¬μš©μžμ˜ κ΄€μ°° μ‘°μž‘μ€, μ˜ˆμ»¨λŒ€, 마우슀(λ―Έλ„μ‹œ)의 μ™Όμͺ½, 였λ₯Έμͺ½ λ²„νŠΌμ˜ 클릭 및 κ°€μš΄λ° λ²„νŠΌμ˜ 클릭, 휠(wheel)의 μ—…λ‹€μš΄(up-down) λ“± 제2 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­(500') μƒμ˜ 3차원 신체 λͺ¨λΈ(520')을 νšŒμ „, ν™•λŒ€, μΆ•μ†Œ, λ˜λŠ” ν‰ν–‰μ΄λ™μ‹œν‚¬ 수 있게 ν•˜λŠ” μ—¬λŸ¬ κ°€μ§€ μ‘°μž‘μΌ 수 μžˆμœΌλ‚˜, μ•žμ„œ μ„€λͺ…ν–ˆλ˜ 것과 λ§ˆμ°¬κ°€μ§€λ‘œ 이에 ν•œμ •λ˜μ§€ μ•ŠμŒμ€ 물둠이닀.Here, the user's observation manipulation is performed on the second user interface area 500', such as clicking the left and right buttons of a mouse (not shown), clicking the middle button, and clicking a wheel up-down. It may be various manipulations that allow the 3D body model 520' to be rotated, enlarged, reduced, or translated in parallel, but, as described above, it is of course not limited thereto.

λ˜ν•œ, 제2 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­(500') 상에 ν‘œλ©΄ μ˜μ—­λ“€(540') 각각에 κ²°λΆ€λœ 신체 μ˜μƒμ˜ 유무의 ν‘œμ‹œ(524a, 524b, 524c), μœ„μΉ˜μ˜ ν‘œμ‹œ(560a, 560b, 560c), 개수의 ν‘œμ‹œ(570a, 570b, 570c) λ˜λŠ” 섬넀일(580a, 580b, 580c)의 ν‘œμ‹œμ— λŒ€ν•œ μ‚¬μš©μžμ˜ μ‘°μž‘, μ˜ˆμ»¨λŒ€, 클릭(click) λ˜λŠ” νƒ­(tab)에 μ‘ν•˜μ—¬, μ»΄ν“¨νŒ… μž₯치(200)λŠ”, μ‚¬μš©μžλ‘œ ν•˜μ—¬κΈˆ κ·Έ κ²°λΆ€λœ 신체 μ˜μƒμ„ μ—΄λžŒν•  수 μžˆλ„λ‘, μ˜ˆμ»¨λŒ€, κ·Έ κ²°λΆ€λœ 적어도 ν•˜λ‚˜μ˜ 신체 μ˜μƒμ˜ λͺ©λ‘(440)을 ν¬ν•¨ν•˜λŠ” μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ œκ³΅ν•  μˆ˜λ„ μžˆλŠ”λ°”, 도 4에 μ°Έμ‘°λΆ€ν˜Έ 440, 460 λ“±μœΌλ‘œ μ˜ˆμ‹œλœ 바와 κ°™λ‹€.In addition, on the second user interface area 500', the presence/absence of a body image associated with each of the surface areas 540' is displayed (524a, 524b, 524c), the location is displayed (560a, 560b, 560c), and the number of body images is displayed. In response to a user's manipulation, for example, a click or a tap, on the display of the thumbnails 570a, 570b, 570c or the thumbnails 580a, 580b, 580c, the computing device 200 causes the user to For example, a user interface area including a list 440 of at least one associated body image may be provided so that the associated body image can be viewed, as illustrated by reference numerals 440 and 460 in FIG. as it has been

Referring again to FIG. 4, additionally or alternatively, the body image management method according to the present disclosure may further include a step (S500) in which, in response to the user's request to view a specific body image 442 — for example, in response to the user clicking the specific body image 442 in an image list 440 containing the history of multiple body images — the photographing area browsing assistance module 250 implemented by the computing device 200 causes the computing device 200 to provide the specific body image in a predetermined interface area 460 and, at the same time, to provide a third user interface area 480 in which the surface area 484 associated with the specific body image 442 is displayed on a virtual 3D body model 482.

상기 μ‹€μ‹œ μ˜ˆλ“€λ‘œμ¨ λ³Έ κ°œμ‹œμ„œμ—μ„œ μ„€λͺ…λœ 기술의 이점은, 의료 μ˜μƒμ˜ 촬영 및 μ—΄λžŒ μ‹œμ— κ·Έ λŒ€μƒμ΄ λ˜λŠ” 신체 λΆ€μœ„λ₯Ό 3차원 신체 λͺ¨λΈμ„ μ΄μš©ν•˜μ—¬ μ§κ΄€μ μœΌλ‘œ μ„ νƒν•˜μ—¬ μ§€μ •ν•  수 있으며, κ·Έ 신체 λΆ€μœ„μ— κ΄€ν•œ 정보λ₯Ό 의료 μ˜μƒμ— κ΄€ν•œ DICOM ν‘œμ€€ λ“±μ˜ ν˜•μ‹μœΌλ‘œ ν•¨κ»˜ μ €μž₯ν•΄λ‘˜ 수 μžˆμ–΄, μΆ”ν›„ 의료 μ˜μƒμ˜ μ—΄λžŒ μ‹œμ— μ˜λ£Œμ§„ λ“±μ˜ μ‚¬μš©μžκ°€ ν™˜μžμ— λŒ€μ‘ν•˜λŠ” κ°€μƒμ˜ 3차원 신체 λͺ¨λΈ μƒμ—μ„œ κ·Έ 의료 μ˜μƒλ“€μ„ μ§κ΄€μ μœΌλ‘œ μ°Ύμ•„λ³Ό 수 μžˆμ–΄ μ˜λ£Œμ§„μ˜ νŽΈμ˜μ„±μ΄ ν–₯μƒλœλ‹€λŠ” 점이닀.The advantage of the technology described in this disclosure as the above embodiments is that, when taking and viewing medical images, a body part that is the target can be intuitively selected and designated using a 3D body model, and information about the body part can be selected. Information can be stored together in a format such as the DICOM standard for medical images, so that users such as medical staff can intuitively find the medical images on a virtual 3D body model corresponding to the patient when viewing the medical images later. This improves the convenience of the medical staff.

λ³Έ κ°œμ‹œμ„œμ— κ°œμ‹œλœ 방법은 1회 μˆ˜ν–‰λ  수 μžˆμ„ 뿐만 μ•„λ‹ˆλΌ μ‚¬μš©μžμ˜ μš”κ΅¬ λ˜λŠ” ν•„μš”μ— λ”°λΌμ„œ 반볡적으둜, 간헐적 λ˜λŠ” μ§€μ†μ μœΌλ‘œ μˆ˜ν–‰λ  μˆ˜λ„ μžˆμŒμ€ 물둠이닀.Of course, the method disclosed in this disclosure may be performed not only once but also repeatedly, intermittently or continuously according to the user's request or need.

Based on the above description of the various embodiments of the present disclosure, a person of ordinary skill in the art can clearly understand that the methods and/or processes of the present invention, and their steps, may be realized in hardware, software, or any combination of hardware and software suitable for a particular application.

λ˜ν•œ, λ³Έ κ°œμ‹œμ„œλ₯Ό 읽은 μ˜€λŠ˜λ‚ μ˜ ν†΅μƒμ˜ κΈ°μˆ μžλŠ” μ»΄ν“¨νŒ… μž₯치, 예λ₯Ό λ“€μ–΄, μ›Œν¬μŠ€ν…Œμ΄μ…˜, 개인용 컴퓨터, νœ΄λŒ€ 단말 등을 ν†΅ν•˜μ—¬ 제곡될 수 μžˆλŠ” λ‹€μ–‘ν•œ μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€μ— μ΅μˆ™ν•œλ°”, λ³Έ κ°œμ‹œμ„œμ—μ„œ μ„€λͺ…λœ 방법듀을 μ΄λ£¨λŠ” κ°œλ³„ λ‹¨κ³„λ“€μ—μ„œ μ–ΈκΈ‰λœ μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ ꡬ성에 κ΄€ν•œ μƒμ„Έν•œ μ„€λͺ…μ˜ μœ λ¬΄μ—λ„ λΆˆκ΅¬ν•˜κ³  ν†΅μƒμ˜ κΈ°μˆ μžλŠ” λ³Έ κ°œμ‹œμ„œμ˜ λ°©λ²•μ—μ„œ ν•„μš”ν•œ, μ‚¬μš©μžμ˜ μ—¬λŸ¬ μ‘°μž‘μ„ κ°€λŠ₯ν•˜κ²Œ ν•˜λŠ” λ‹€μ–‘ν•œ μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€λ₯Ό μ‰½κ²Œ 상정할 수 μžˆμ„ 것이닀.In addition, today's skilled person reading this disclosure is familiar with various user interfaces that may be provided through a computing device, for example, a workstation, a personal computer, a portable terminal, etc., making the methods described in this disclosure Regardless of the presence or absence of a detailed description of the user interface configuration mentioned in the individual steps, a person skilled in the art can easily assume various user interfaces necessary for the method of the present disclosure, enabling various manipulations by the user.

The hardware device may include a general-purpose computer and/or a dedicated computing device, a specific computing device, or a particular aspect or component of a specific computing device. The processes may be realized by a processor as described above, combined with memory such as ROM/RAM for storing program instructions and configured to execute the instructions stored in that memory. Additionally, or alternatively, the processes may be implemented with an application-specific integrated circuit (ASIC), a programmable gate array, e.g., a field-programmable gate array (FPGA), a programmable logic unit (PLU), or programmable array logic (PAL), or any other device capable of executing and responding to instructions, or any other device or combination of devices that can be configured to process electronic signals. A processing device may run an operating system and one or more software applications executed on that operating system. A processing device may also access, store, manipulate, process, and generate data in response to the execution of software. Although a single processing device is sometimes described for convenience of understanding, a person of ordinary skill in the art will recognize that a processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, a processing device may include a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible. The hardware device may also include a communication unit as described above, capable of exchanging signals with an external device.

μ†Œν”„νŠΈμ›¨μ–΄λŠ” 컴퓨터 ν”„λ‘œκ·Έλž¨(computer program), μ½”λ“œ(code), λͺ…λ Ήμ–΄(instruction; μΈμŠ€νŠΈλŸ­μ…˜), λ˜λŠ” 이듀 쀑 ν•˜λ‚˜ μ΄μƒμ˜ 쑰합을 포함할 수 있으며, μ›ν•˜λŠ” λŒ€λ‘œ λ™μž‘ν•˜λ„λ‘ 처리 μž₯치λ₯Ό κ΅¬μ„±ν•˜κ±°λ‚˜ λ…λ¦½μ μœΌλ‘œ λ˜λŠ” κ²°ν•©μ μœΌλ‘œ(collectively) 처리 μž₯μΉ˜μ— λͺ…λ Ήν•  수 μžˆλ‹€. μ†Œν”„νŠΈμ›¨μ–΄ 및/λ˜λŠ” λ°μ΄ν„°λŠ”, 처리 μž₯μΉ˜μ— μ˜ν•˜μ—¬ ν•΄μ„λ˜κ±°λ‚˜ 처리 μž₯μΉ˜μ— λͺ…λ Ήμ–΄ λ˜λŠ” 데이터λ₯Ό μ œκ³΅ν•˜κΈ° μœ„ν•˜μ—¬, μ–΄λ–€ μœ ν˜•μ˜ 기계, κ΅¬μ„±μš”μ†Œ(component), 물리적 μž₯치, 가상 μž₯치(virtual equipment), 컴퓨터 μ €μž₯ 맀체 λ˜λŠ” μž₯치, λ˜λŠ” μ „μ†‘λ˜λŠ” μ‹ ν˜Έ 파(signal wave)에 영ꡬ적으둜, λ˜λŠ” μΌμ‹œμ μœΌλ‘œ ꡬ체화(embody)될 수 μžˆλ‹€. μ†Œν”„νŠΈμ›¨μ–΄λŠ” λ„€νŠΈμ›Œν¬λ‘œ μ—°κ²°λœ 컴퓨터 μ‹œμŠ€ν…œ 상에 λΆ„μ‚°λ˜μ–΄μ„œ, λΆ„μ‚°λœ λ°©λ²•μœΌλ‘œ μ €μž₯λ˜κ±°λ‚˜ 싀행될 μˆ˜λ„ μžˆλ‹€. μ†Œν”„νŠΈμ›¨μ–΄ 및 λ°μ΄ν„°λŠ” ν•˜λ‚˜ μ΄μƒμ˜ 기계 νŒλ… κ°€λŠ₯ 기둝 맀체에 μ €μž₯될 수 μžˆλ‹€.Software may include a computer program, code, instructions, or a combination of one or more of these, and may configure a processing device to operate as desired, or may command the processing device, independently or collectively. Software and/or data may be embodied, permanently or temporarily, in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, in order to be interpreted by the processing device or to provide instructions or data to the processing device. Software may be distributed over networked computer systems so as to be stored or executed in a distributed manner. Software and data may be stored on one or more machine-readable recording media.

λ”μš±μ΄ λ³Έ 발λͺ…μ˜ 기술적 ν•΄λ²•μ˜ λŒ€μƒλ¬Ό λ˜λŠ” μ„ ν–‰ κΈ°μˆ λ“€μ— κΈ°μ—¬ν•˜λŠ” 뢀뢄듀은 λ‹€μ–‘ν•œ 컴퓨터 κ΅¬μ„±μš”μ†Œλ₯Ό ν†΅ν•˜μ—¬ μˆ˜ν–‰λ  수 μžˆλŠ” ν”„λ‘œκ·Έλž¨ λͺ…λ Ήμ–΄μ˜ ν˜•νƒœλ‘œ κ΅¬ν˜„λ˜μ–΄ 기계 νŒλ… κ°€λŠ₯ 맀체에 기둝될 수 μžˆλ‹€. 기계 νŒλ… κ°€λŠ₯ λ§€μ²΄λŠ” ν”„λ‘œκ·Έλž¨ λͺ…λ Ήμ–΄, 데이터 파일, 데이터 ꡬ쑰 등을 λ‹¨λ…μœΌλ‘œ λ˜λŠ” μ‘°ν•©ν•˜μ—¬ 포함할 수 μžˆλ‹€. 기계 νŒλ… κ°€λŠ₯ν•œ 기둝 맀체에 κΈ°λ‘λ˜λŠ” ν”„λ‘œκ·Έλž¨ λͺ…λ Ήμ–΄λŠ” μ‹€μ‹œ 예λ₯Ό μœ„ν•˜μ—¬ νŠΉλ³„νžˆ μ„€κ³„λ˜κ³  κ΅¬μ„±λœ κ²ƒλ“€μ΄κ±°λ‚˜ 컴퓨터 μ†Œν”„νŠΈμ›¨μ–΄ λΆ„μ•Όμ˜ ν†΅μƒμ˜ κΈ°μˆ μžμ—κ²Œ κ³΅μ§€λ˜μ–΄ μ‚¬μš© κ°€λŠ₯ν•œ 것일 μˆ˜λ„ μžˆλ‹€. 기계 νŒλ… κ°€λŠ₯ 기둝 맀체의 μ˜ˆμ—λŠ” ν•˜λ“œ λ””μŠ€ν¬, ν”Œλ‘œν”Ό λ””μŠ€ν¬ 및 자기 ν…Œμ΄ν”„μ™€ 같은 자기 맀체(magnetic media), CD-ROM, DVD, Blu-ray와 같은 광기둝 맀체(optical media), ν”Œλ‘­ν‹°μ»¬ λ””μŠ€ν¬(floptical disk)와 같은 자기-κ΄‘ 맀체(magneto-optical media), 및 둬(ROM), 램(RAM), ν”Œλž˜μ‹œ λ©”λͺ¨λ¦¬ λ“±κ³Ό 같은 ν”„λ‘œκ·Έλž¨ λͺ…λ Ήμ–΄λ₯Ό μ €μž₯ν•˜κ³  μˆ˜ν–‰ν•˜λ„λ‘ νŠΉλ³„νžˆ κ΅¬μ„±λœ ν•˜λ“œμ›¨μ–΄ μž₯μΉ˜κ°€ ν¬ν•¨λœλ‹€. ν”„λ‘œκ·Έλž¨ λͺ…λ Ήμ–΄μ˜ μ˜ˆμ—λŠ”, μ „μˆ ν•œ μž₯μΉ˜λ“€ 쀑 μ–΄λŠ ν•˜λ‚˜λΏλ§Œ μ•„λ‹ˆλΌ ν”„λ‘œμ„Έμ„œ, ν”„λ‘œμ„Έμ„œ μ•„ν‚€ν…μ²˜ λ˜λŠ” μƒμ΄ν•œ ν•˜λ“œμ›¨μ–΄ 및 μ†Œν”„νŠΈμ›¨μ–΄μ˜ μ‘°ν•©λ“€μ˜ 이쒅 μ‘°ν•©, λ˜λŠ” λ‹€λ₯Έ μ–΄λ–€ ν”„λ‘œκ·Έλž¨ λͺ…령어듀을 μ‹€ν–‰ν•  수 μžˆλŠ” 기계 μƒμ—μ„œ μ‹€ν–‰λ˜κΈ° μœ„ν•˜μ—¬ μ €μž₯ 및 컴파일 λ˜λŠ” μΈν„°ν”„λ¦¬νŠΈλ  수 μžˆλŠ”, C와 같은 ꡬ쑰적 ν”„λ‘œκ·Έλž˜λ° μ–Έμ–΄, C++ 같은 객체지ν–₯적 ν”„λ‘œκ·Έλž˜λ° μ–Έμ–΄ λ˜λŠ” κ³ κΈ‰ λ˜λŠ” μ €κΈ‰ ν”„λ‘œκ·Έλž˜λ° μ–Έμ–΄(μ–΄μ…ˆλΈ”λ¦¬μ–΄, ν•˜λ“œμ›¨μ–΄ 기술 μ–Έμ–΄λ“€ 및 λ°μ΄ν„°λ² μ΄μŠ€ ν”„λ‘œκ·Έλž˜λ° μ–Έμ–΄ 및 κΈ°μˆ λ“€)λ₯Ό μ‚¬μš©ν•˜μ—¬ λ§Œλ“€μ–΄μ§ˆ 수 μžˆλŠ” λ°”, 기계어 μ½”λ“œ, λ°”μ΄νŠΈμ½”λ“œλΏλ§Œ μ•„λ‹ˆλΌ 인터프리터 등을 μ‚¬μš©ν•΄μ„œ 컴퓨터에 μ˜ν•΄μ„œ 싀행될 수 μžˆλŠ” κ³ κΈ‰ μ–Έμ–΄ μ½”λ“œλ„ 이에 ν¬ν•¨λœλ‹€. 
Furthermore, the subject matter of the technical solution of the present invention, or the parts of it that contribute over the prior art, may be implemented in the form of program instructions executable through various computer components and recorded on a machine-readable medium. The machine-readable medium may contain program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the machine-readable recording medium may be specially designed and constructed for the embodiments, or may be known to and usable by those skilled in the field of computer software. Examples of machine-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM, DVD, and Blu-ray; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine code and bytecode, but also high-level language code executable by a computer using an interpreter or the like; such instructions may be created using a structured programming language such as C, an object-oriented programming language such as C++, or another high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies), and may be stored and compiled or interpreted for execution on any one of the devices mentioned above, on heterogeneous combinations of processors, processor architectures, or combinations of hardware and software, or on any other machine capable of executing program instructions.

λ”°λΌμ„œ λ³Έ 발λͺ…에 λ”°λ₯Έ 일 νƒœμ–‘μ—μ„œλŠ”, μ•žμ„œ μ„€λͺ…λœ 방법 및 κ·Έ 쑰합듀이 ν•˜λ‚˜ μ΄μƒμ˜ μ»΄ν“¨νŒ… μž₯μΉ˜λ“€μ— μ˜ν•˜μ—¬ μˆ˜ν–‰λ  λ•Œ, κ·Έ 방법 및 λ°©λ²•μ˜ 쑰합듀이 각 단계듀을 μˆ˜ν–‰ν•˜λŠ” μ‹€ν–‰ κ°€λŠ₯ν•œ μ½”λ“œλ‘œμ„œ μ‹€μ‹œλ  수 μžˆλ‹€. λ‹€λ₯Έ 일 νƒœμ–‘μ—μ„œλŠ”, 상기 방법은 상기 단계듀을 μˆ˜ν–‰ν•˜λŠ” μ‹œμŠ€ν…œλ“€λ‘œμ„œ μ‹€μ‹œλ  수 있고, 방법듀은 μž₯μΉ˜λ“€μ— 걸쳐 μ—¬λŸ¬ κ°€μ§€ λ°©λ²•μœΌλ‘œ λΆ„μ‚°λ˜κ±°λ‚˜ λͺ¨λ“  κΈ°λŠ₯듀이 ν•˜λ‚˜μ˜ μ „μš©, λ…λ¦½ν˜• μž₯치 λ˜λŠ” λ‹€λ₯Έ ν•˜λ“œμ›¨μ–΄μ— 톡합될 수 μžˆλ‹€. 또 λ‹€λ₯Έ 일 νƒœμ–‘μ—μ„œλŠ”, μœ„μ—μ„œ μ„€λͺ…ν•œ ν”„λ‘œμ„ΈμŠ€λ“€κ³Ό μ—°κ΄€λœ 단계듀을 μˆ˜ν–‰ν•˜λŠ” μˆ˜λ‹¨λ“€μ€ μ•žμ„œ μ„€λͺ…ν•œ μž„μ˜μ˜ ν•˜λ“œμ›¨μ–΄ 및/λ˜λŠ” μ†Œν”„νŠΈμ›¨μ–΄λ₯Ό 포함할 수 μžˆλ‹€. κ·ΈλŸ¬ν•œ λͺ¨λ“  순차 κ²°ν•© 및 쑰합듀은 λ³Έ κ°œμ‹œμ„œμ˜ λ²”μœ„ 내에 μ†ν•˜λ„λ‘ μ˜λ„λœ 것이닀.Therefore, in one aspect according to the present invention, when the above-described methods and combinations thereof are performed by one or more computing devices, the methods and combinations of methods may be implemented as executable code that performs each step. In another aspect, the method may be implemented as systems performing the steps, the methods may be distributed in several ways across devices or all functions may be integrated into one dedicated, stand-alone device or other hardware. In another aspect, the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such sequential combinations and combinations are intended to fall within the scope of this disclosure.

μ΄μƒμ—μ„œ λ³Έ 발λͺ…이 ꡬ체적인 κ΅¬μ„±μš”μ†Œ λ“±κ³Ό 같은 νŠΉμ • 사항듀과 ν•œμ •λœ μ‹€μ‹œ μ˜ˆλ“€ 및 도면에 μ˜ν•΄ μ„€λͺ…λ˜μ—ˆμœΌλ‚˜, κ·Έ μ‹€μ‹œ μ˜ˆλ“€μ˜ νŠΉμ •ν•œ ꡬ쑰적 λ˜λŠ” κΈ°λŠ₯적 μ„€λͺ…은 μ˜ˆμ‹œλ₯Ό μœ„ν•œ λͺ©μ μœΌλ‘œ λ³Έ 발λͺ…μ˜ 보닀 μ „λ°˜μ μΈ 이해λ₯Ό 돕기 μœ„ν•΄μ„œ 제곡된 것일 뿐, κ°œμ‹œλœ μ‹€μ‹œ μ˜ˆλ“€μ— λ³Έ 발λͺ…이 ν•œμ •λ˜λŠ” 것은 μ•„λ‹ˆλ©°, λ³Έ 발λͺ…이 μ†ν•˜λŠ” κΈ°μˆ λΆ„μ•Όμ—μ„œ 톡상적인 지식을 κ°€μ§„ μ‚¬λžŒμ΄λΌλ©΄ μ΄λŸ¬ν•œ κΈ°μž¬λ‘œλΆ€ν„° λ‹€μ–‘ν•œ ν˜•νƒœμ˜ μˆ˜μ • 및 λ³€ν˜•μ„ κΎ€ν•  수 μžˆλ‹€.Although the present invention has been described above with specific details such as specific components and limited embodiments and drawings, specific structural or functional descriptions of the embodiments are provided for purposes of illustration and to help a more general understanding of the present invention. However, the present invention is not limited to the disclosed embodiments, and those skilled in the art can make various modifications and variations from these descriptions.

λ”°λΌμ„œ, λ³Έ 발λͺ…μ˜ 기술적 사상은 상기 μ„€λͺ…λœ μ‹€μ‹œ μ˜ˆμ— κ΅­ν•œλ˜μ–΄ μ •ν•΄μ Έμ„œλŠ” μ•„λ‹ˆλ˜λ©°, κ·Έ 기술적 사상에 ν¬ν•¨λ˜λŠ” λ³€κ²½ λ˜λŠ” λŒ€μ²΄λ¬Όμ„ λΉ„λ‘―ν•˜μ—¬ λ³Έ κ°œμ‹œμ„œμ— μ²¨λΆ€λœ νŠΉν—ˆμ²­κ΅¬λ²”μœ„λΏλ§Œ μ•„λ‹ˆλΌ 이 νŠΉν—ˆμ²­κ΅¬λ²”μœ„μ™€ κ· λ“±ν•˜κ²Œ λ˜λŠ” λ“±κ°€μ μœΌλ‘œ λ³€ν˜•λœ λͺ¨λ“  것듀이 λ³Έ 발λͺ…μ˜ μ‚¬μƒμ˜ 범주에 μ†ν•œλ‹€κ³  ν•  것이닀. 예λ₯Ό λ“€μ–΄, μ„€λͺ…λœ κΈ°μˆ λ“€μ΄ μ„€λͺ…λœ 방법과 λ‹€λ₯Έ μˆœμ„œλ‘œ μˆ˜ν–‰λ˜κ±°λ‚˜, 및/λ˜λŠ” μ„€λͺ…λœ μ‹œμŠ€ν…œ, ꡬ쑰, μž₯치, 회둜 λ“±μ˜ κ΅¬μ„±μš”μ†Œλ“€μ΄ μ„€λͺ…λœ 방법과 λ‹€λ₯Έ ν˜•νƒœλ‘œ κ²°ν•© λ˜λŠ” μ‘°ν•©λ˜κ±°λ‚˜, λ‹€λ₯Έ κ΅¬μ„±μš”μ†Œ λ˜λŠ” 균등물에 μ˜ν•˜μ—¬ λŒ€μΉ˜λ˜κ±°λ‚˜ μΉ˜ν™˜λ˜λ”λΌλ„ μ μ ˆν•œ κ²°κ³Όκ°€ 달성될 수 μžˆλ‹€.Therefore, the technical idea of the present invention should not be limited to the above-described embodiments and should not be determined, and the claims attached to this disclosure, including changes or substitutes included in the technical idea, as well as the scope of this patent claim Or equivalently, all modifications will fall within the scope of the spirit of the present invention. For example, the described techniques may be performed in an order different from the method described, and/or components of the described system, structure, device, circuit, etc. may be combined or combined in a different form than the method described, or other components may be used. Or even if it is replaced or substituted by equivalents, appropriate results can be achieved.

그와 같이 κ· λ“±ν•˜κ²Œ λ˜λŠ” λ“±κ°€μ μœΌλ‘œ λ³€ν˜•λœ κ²ƒμ—λŠ”, μ˜ˆμ»¨λŒ€ λ³Έ 발λͺ…에 λ”°λ₯Έ 방법을 μ‹€μ‹œν•œ 것과 λ™μΌν•œ κ²°κ³Όλ₯Ό λ‚Ό 수 μžˆλŠ”, λ…Όλ¦¬μ μœΌλ‘œ λ™μΉ˜(logically equivalent)인 방법이 포함될 것인 λ°”, λ³Έ 발λͺ…μ˜ μ§„μ˜ 및 λ²”μœ„λŠ” μ „μˆ ν•œ μ˜ˆμ‹œλ“€μ— μ˜ν•˜μ—¬ μ œν•œλ˜μ–΄μ„œλŠ” μ•„λ‹ˆλ˜λ©°, 법λ₯ μ— μ˜ν•˜μ—¬ ν—ˆμš© κ°€λŠ₯ν•œ κ°€μž₯ 넓은 의미둜 μ΄ν•΄λ˜μ–΄μ•Ό ν•œλ‹€.Such equal or equivalent modifications will include, for example, logically equivalent methods capable of producing the same results as the practice of the method according to the present invention; accordingly, the spirit and scope of the present invention should not be limited by the examples described above, but should be understood in the broadest sense permitted by law.

Claims (8)

촬영 μž₯치λ₯Ό ν¬ν•¨ν•˜κ±°λ‚˜ 미리 촬영된 μ˜μƒμ„ λ³΄μœ ν•œ μž₯μΉ˜λ‘œλΆ€ν„° λ˜λŠ” 촬영 μž₯μΉ˜λ‘œλΆ€ν„° μ‹ μ²΄μ˜ 적어도 μΌλΆ€μ˜ μ™Έν˜•μ΄ 촬영된 ν˜•μƒ λ˜λŠ” ν”ΌλΆ€μ˜ μ˜μƒμ„ ν¬ν•¨ν•˜λŠ” 신체 μ˜μƒμ„ νšλ“ν•˜λŠ” μ»΄ν“¨νŒ… μž₯μΉ˜μ— μ˜ν•΄ μˆ˜ν–‰λ˜λŠ” 신체 μ˜μƒ 관리 λ°©λ²•μœΌλ‘œμ„œ,A body image management method performed by a computing device that acquires, from a device including a photographing device or holding a previously photographed image, or from a photographing device, a body image including an image of the skin or of a shape in which at least part of the external appearance of a body is photographed, the method comprising: 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, 상기 μ‹ μ²΄μ˜ 전체 ν˜•μƒ λ˜λŠ” 전체 ν‘œλ©΄μ„ 총망라(totally covering)ν•˜λŠ” μ†Œμ • 개수의 ν‘œλ©΄ μ˜μ—­λ“€λ‘œ λΆ„ν• λ˜λŠ” κ°€μƒμ˜ 3차원 신체 λͺ¨λΈμ„ ν¬ν•¨ν•˜κ³ , 상기 3차원 신체 λͺ¨λΈμ— λŒ€ν•œ μ‘°μž‘μ— μ˜ν•΄ 상기 ν‘œλ©΄ μ˜μ—­λ“€ 쀑 적어도 ν•˜λ‚˜λ₯Ό 선택 κ°€λŠ₯ν•˜κ²Œ ν•˜λŠ” 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ œκ³΅ν•¨μœΌλ‘œμ¨, 상기 ν‘œλ©΄ μ˜μ—­λ“€ 쀑 상기 신체 μ˜μƒμ— κ²°λΆ€λœ 적어도 ν•˜λ‚˜μ˜ ν˜•μƒ λ˜λŠ” ν‘œλ©΄μ˜ μ˜μ—­μΈ 촬영 μ˜μ—­μ„ μž…λ ₯λ°›κ±°λ‚˜ 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— μ—°λ™ν•˜λ˜ 상기 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ΄ μ œκ³΅λ˜λŠ” 타 μž₯μΉ˜λ‘œλΆ€ν„° 상기 촬영 μ˜μ—­μ„ μ „λ‹¬λ°›λŠ” 단계인 촬영 μ˜μ—­ 선택 단계; 및 a capture-region selection step in which the computing device, by providing a first user interface area that includes a virtual three-dimensional body model divided into a predetermined number of surface regions totally covering the entire shape or entire surface of the body and that makes at least one of the surface regions selectable through manipulation of the three-dimensional body model, receives as input a capture region, which is a region of at least one shape or surface among the surface regions that is associated with the body image, or receives the capture region from another device that interworks with the computing device and on which the first user interface area is provided; and
상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, νšλ“λœ 상기 신체 μ˜μƒμ„ 상기 촬영 μ˜μ—­μ˜ 정보와 ν•¨κ»˜ μ†Œμ •μ˜ μ €μž₯μ†Œμ— μ €μž₯ν•˜κ±°λ‚˜ 타 μž₯치둜 ν•˜μ—¬κΈˆ μ €μž₯ν•˜λ„λ‘ μ§€μ›ν•˜λŠ” 단계인 촬영 μ˜μ—­ μ €μž₯ 단계 a capture-region storage step in which the computing device stores the acquired body image, together with the information of the capture region, in a predetermined storage, or supports another device in storing it, λ₯Ό ν¬ν•¨ν•˜κ³ , the method including the above steps, 상기 촬영 μ˜μ—­μ˜ μ •λ³΄λŠ”, 상기 촬영 μ˜μ—­μ˜ ꡬ뢄 인덱슀(index), 상기 촬영 μ˜μ—­μ˜ ꡬ뢄 λͺ…μΉ­, 상기 3차원 신체 λͺ¨λΈ μƒμ˜ 3차원 μ’Œν‘œ, 및 상기 3차원 μ’Œν‘œμ— λŒ€μ‘λ˜λ„λ‘ κ°€κ³΅λœ 2차원 μ’Œν‘œ 쀑 적어도 ν•˜λ‚˜λ₯Ό ν¬ν•¨ν•˜λŠ”, 신체 μ˜μƒ 관리 방법.wherein the information of the capture region includes at least one of a classification index of the capture region, a classification name of the capture region, three-dimensional coordinates on the three-dimensional body model, and two-dimensional coordinates processed to correspond to the three-dimensional coordinates; a body image management method. 제1항에 μžˆμ–΄μ„œ, According to claim 1, 상기 촬영 μ˜μ—­ μ €μž₯ λ‹¨κ³„λŠ”, wherein, in the capture-region storage step, 상기 신체 μ˜μƒμ„ 메타데이터(metadata)λ₯Ό ν¬ν•¨ν•˜λŠ” 이미지 ν‘œμ€€ λ˜λŠ” DICOM ν‘œμ€€μœΌλ‘œ μ €μž₯ν•˜λ˜, 상기 이미지 ν‘œμ€€ λ˜λŠ” DICOM ν‘œμ€€μ˜ μ˜ˆμ•½λœ ν•„λ“œ(reserved field) λ˜λŠ” μ—¬λΆ„μ˜ ν•„λ“œ(blank field)에 상기 촬영 μ˜μ—­μ˜ 정보λ₯Ό ν•¨κ»˜ κΈ°μž…ν•˜λŠ” 것을 νŠΉμ§•μœΌλ‘œ ν•˜λŠ”, 신체 μ˜μƒ 관리 방법.the body image is stored according to an image standard that includes metadata or according to the DICOM standard, and the information of the capture region is written together into a reserved field or a blank field of that image standard or DICOM standard; a body image management method.
제1항에 μžˆμ–΄μ„œ, According to claim 1, 상기 방법은, the method, 상기 촬영 μ˜μ—­ 선택 단계 전에, before the capture-region selection step, 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, 촬영 μž₯치λ₯Ό ν¬ν•¨ν•˜κ±°λ‚˜ 미리 촬영된 μ˜μƒμ„ λ³΄μœ ν•œ 타 μž₯μΉ˜λ‘œμ„œ 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— μ—°λ™ν•˜λŠ” μž₯치 λ˜λŠ” 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— ν¬ν•¨λœ 촬영 μž₯μΉ˜λ‘œλΆ€ν„° μ‹ μ²΄μ˜ 적어도 일뢀가 촬영된 μ˜μƒμΈ 신체 μ˜μƒμ„ νšλ“ν•˜λŠ” 단계인 신체 μ˜μƒ νšλ“ 단계λ₯Ό ν¬ν•¨ν•˜λŠ”, 신체 μ˜μƒ 관리 방법.further comprises a body image acquisition step in which the computing device acquires a body image, which is an image in which at least part of a body is photographed, from another device that includes a photographing device or holds a previously photographed image and interworks with the computing device, or from a photographing device included in the computing device; a body image management method. 제1항에 μžˆμ–΄μ„œ, According to claim 1, 상기 방법은, the method, 상기 촬영 μ˜μ—­ 선택 단계 후에, after the capture-region selection step, 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, 촬영 μž₯치λ₯Ό ν¬ν•¨ν•˜κ±°λ‚˜ 미리 촬영된 μ˜μƒμ„ λ³΄μœ ν•œ 타 μž₯μΉ˜λ‘œμ„œ 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— μ—°λ™ν•˜λŠ” μž₯치 λ˜λŠ” 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— ν¬ν•¨λœ 촬영 μž₯μΉ˜λ‘œλΆ€ν„° μ‹ μ²΄μ˜ 적어도 일뢀가 촬영된 μ˜μƒμΈ 신체 μ˜μƒμ„ νšλ“ν•˜λŠ” 단계인 신체 μ˜μƒ νšλ“ 단계λ₯Ό ν¬ν•¨ν•˜λŠ”, 신체 μ˜μƒ 관리 방법.further comprises a body image acquisition step in which the computing device acquires a body image, which is an image in which at least part of a body is photographed, from another device that includes a photographing device or holds a previously photographed image and interworks with the computing device, or from a photographing device included in the computing device; a body image management method.
제1항에 μžˆμ–΄μ„œ, According to claim 1, 상기 방법은, the method, 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, μ†Œμ • 개수의 ν‘œλ©΄ μ˜μ—­λ“€λ‘œ λΆ„ν• λœ κ°€μƒμ˜ 3차원 신체 λͺ¨λΈμ„ ν¬ν•¨ν•˜κ³ , 상기 3차원 신체 λͺ¨λΈμ— λŒ€ν•œ μ‚¬μš©μžμ˜ κ΄€μ°° μ‘°μž‘μ— μ˜ν•΄ 상기 ν‘œλ©΄ μ˜μ—­λ“€μ˜ 관찰을 κ°€λŠ₯ν•˜κ²Œ ν•˜λŠ” 제2 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ œκ³΅ν•¨μœΌλ‘œμ¨, 상기 제2 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­ 상에 상기 ν‘œλ©΄ μ˜μ—­λ“€ 각각에 κ²°λΆ€λœ 신체 μ˜μƒμ˜ 유무, μœ„μΉ˜ 및 개수 쀑 적어도 ν•˜λ‚˜λ₯Ό ν‘œμ‹œν•˜κ±°λ‚˜ 상기 κ²°λΆ€λœ 신체 μ˜μƒμ˜ 섬넀일(thumbnail)을 ν‘œμ‹œν•˜λŠ” 촬영 μ˜μ—­ μ—΄λžŒ 보쑰 단계λ₯Ό 더 ν¬ν•¨ν•˜λŠ”, further comprises a capture-region viewing assistance step in which the computing device, by providing a second user interface area that includes a virtual three-dimensional body model divided into a predetermined number of surface regions and that enables observation of the surface regions through a user's observation manipulation of the three-dimensional body model, displays on the second user interface area at least one of the presence, position, and number of the body images associated with each of the surface regions, or displays thumbnails of the associated body images; 신체 μ˜μƒ 관리 방법.a body image management method. 제1항에 μžˆμ–΄μ„œ, According to claim 1, 상기 방법은, the method, νŠΉμ • 신체 μ˜μƒμ— λŒ€ν•œ μ—΄λžŒ μš”μ²­μ— μ‘ν•˜μ—¬, 상기 μ»΄ν“¨νŒ… μž₯μΉ˜κ°€, 상기 νŠΉμ • 신체 μ˜μƒμ„ μ œκ³΅ν•˜λ©΄μ„œ, 상기 νŠΉμ • 신체 μ˜μƒμ— κ²°λΆ€λœ ν‘œλ©΄ μ˜μ—­μ„ κ°€μƒμ˜ 3차원 신체 λͺ¨λΈ 상에 ν‘œμ‹œν•œ 제3 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ μ œκ³΅ν•˜λŠ” 단계λ₯Ό 더 ν¬ν•¨ν•˜λŠ”, 신체 μ˜μƒ 관리 방법.further comprises a step in which, in response to a viewing request for a specific body image, the computing device provides the specific body image while providing a third user interface area in which the surface region associated with the specific body image is displayed on a virtual three-dimensional body model; a body image management method.
μ»΄ν“¨νŒ… μž₯치둜 ν•˜μ—¬κΈˆ, 제1ν•­ λ‚΄μ§€ 제6ν•­ 쀑 μ–΄λŠ ν•œ ν•­μ˜ 방법을 μˆ˜ν–‰ν•˜λ„λ‘ κ΅¬ν˜„λœ λͺ…λ Ήμ–΄(instructions)λ₯Ό ν¬ν•¨ν•˜λŠ”, 기계 νŒλ… κ°€λŠ₯ν•œ λΉ„μΌμ‹œμ  기둝 맀체에 μ €μž₯된, 컴퓨터 ν”„λ‘œκ·Έλž¨.A computer program, stored on a machine-readable non-transitory recording medium, comprising instructions implemented to cause a computing device to perform the method of any one of claims 1 to 6. μ‹ μ²΄μ˜ 적어도 μΌλΆ€μ˜ μ™Έν˜•μ΄ 촬영된 ν˜•μƒ λ˜λŠ” ν”ΌλΆ€μ˜ μ˜μƒμ„ ν¬ν•¨ν•˜λŠ” 신체 μ˜μƒμ„ νšλ“ν•˜μ—¬ κ΄€λ¦¬ν•˜λŠ” μ»΄ν“¨νŒ… μž₯μΉ˜λ‘œμ„œ,A computing device that acquires and manages a body image including an image of the skin or of a shape in which at least part of the external appearance of a body is photographed, the computing device comprising: 촬영 μž₯치λ₯Ό ν¬ν•¨ν•˜κ±°λ‚˜ 미리 촬영된 μ˜μƒμ„ λ³΄μœ ν•œ μž₯μΉ˜μ™€ μ—°λ™ν•˜κ±°λ‚˜ 상기 μ»΄ν“¨νŒ… μž₯μΉ˜μ— ν¬ν•¨λœ 촬영 μž₯μΉ˜μ™€ μ—°λ™ν•˜λŠ” 톡신뢀; 및 a communication unit that interworks with a device including a photographing device or holding a previously photographed image, or with a photographing device included in the computing device; and 상기 μ‹ μ²΄μ˜ 전체 ν˜•μƒ λ˜λŠ” 전체 ν‘œλ©΄μ„ 총망라(totally covering)ν•˜λŠ” μ†Œμ • 개수의 ν‘œλ©΄ μ˜μ—­λ“€λ‘œ λΆ„ν• λ˜λŠ” κ°€μƒμ˜ 3차원 신체 λͺ¨λΈμ„ ν¬ν•¨ν•˜κ³ , 상기 3차원 신체 λͺ¨λΈμ— λŒ€ν•œ μ‘°μž‘μ— μ˜ν•΄ 상기 ν‘œλ©΄ μ˜μ—­λ“€ 쀑 적어도 ν•˜λ‚˜λ₯Ό 선택 κ°€λŠ₯ν•˜κ²Œ ν•˜λŠ” 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ„ 상기 톡신뢀λ₯Ό ν†΅ν•˜μ—¬ μ†Œμ •μ˜ λ””μŠ€ν”Œλ ˆμ΄ μž₯μΉ˜μ— μ œκ³΅ν•¨μœΌλ‘œμ¨, 상기 ν‘œλ©΄ μ˜μ—­λ“€ 쀑 상기 신체 μ˜μƒμ— κ²°λΆ€λœ 적어도 ν•˜λ‚˜μ˜ ν˜•μƒ λ˜λŠ” ν‘œλ©΄μ˜ μ˜μ—­μΈ 촬영 μ˜μ—­μ„ μž…λ ₯λ°›κ±°λ‚˜ 상기 톡신뢀λ₯Ό ν†΅ν•˜μ—¬ μ—°λ™ν•˜λ˜ 상기 제1 μ‚¬μš©μž μΈν„°νŽ˜μ΄μŠ€ μ˜μ—­μ΄ μ œκ³΅λ˜λŠ” 타 μž₯μΉ˜λ‘œλΆ€ν„° 상기 촬영 μ˜μ—­μ„ μ „λ‹¬λ°›λŠ” 촬영 μ˜μ—­ 선택 ν”„λ‘œμ„ΈμŠ€, 및 νšλ“λœ 상기 신체 μ˜μƒμ„ 상기 촬영 μ˜μ—­μ˜ 정보와 ν•¨κ»˜ μ†Œμ •μ˜ μ €μž₯μ†Œμ— μ €μž₯ν•˜κ±°λ‚˜ 상기 톡신뢀λ₯Ό ν†΅ν•˜μ—¬ 타 μž₯치둜 ν•˜μ—¬κΈˆ μ €μž₯ν•˜λ„λ‘ μ§€μ›ν•˜λŠ” 촬영 μ˜μ—­ μ €μž₯ ν”„λ‘œμ„ΈμŠ€λ₯Ό μˆ˜ν–‰ν•˜λŠ” ν”„λ‘œμ„Έμ„œa processor that performs a capture-region selection process and a capture-region storage process: in the capture-region selection process, by providing, through the communication unit, to a predetermined display device a first user interface area that includes a virtual three-dimensional body model divided into a predetermined number of surface regions totally covering the entire shape or entire surface of the body and that makes at least one of the surface regions selectable through manipulation of the three-dimensional body model, the processor receives as input a capture region, which is a region of at least one shape or surface among the surface regions that is associated with the body image, or receives the capture region from another device interworking through the communication unit on which the first user interface area is provided; and in the capture-region storage process, the processor stores the acquired body image, together with the information of the capture region, in a predetermined storage, or supports another device in storing it through the communication unit
λ₯Ό ν¬ν•¨ν•˜κ³ , the device including the above, 상기 촬영 μ˜μ—­μ˜ μ •λ³΄λŠ”, 상기 촬영 μ˜μ—­μ˜ ꡬ뢄 인덱슀(index), 상기 3차원 신체 λͺ¨λΈ μƒμ˜ 3차원 μ’Œν‘œ, 및 상기 3차원 μ’Œν‘œμ— λŒ€μ‘λ˜λ„λ‘ κ°€κ³΅λœ 2차원 μ’Œν‘œ 쀑 적어도 ν•˜λ‚˜λ₯Ό ν¬ν•¨ν•˜λŠ”, 신체 μ˜μƒ 관리 μž₯치.wherein the information of the capture region includes at least one of a classification index of the capture region, three-dimensional coordinates on the three-dimensional body model, and two-dimensional coordinates processed to correspond to the three-dimensional coordinates; a body image management device.
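To illustrate the capture-region information recited in claim 1 (a classification index, a classification name, 3D coordinates on the body model, and 2D coordinates derived from them), the following is a minimal sketch of one way such a record could be structured and serialized alongside a stored image. The field names, the `CaptureRegion` type, and the cylindrical projection are illustrative assumptions for this sketch, not part of the claimed method or any disclosed implementation:

```python
import json
import math
from dataclasses import dataclass, asdict


@dataclass
class CaptureRegion:
    """Capture-region info attached to a body image (hypothetical layout)."""
    index: int   # classification index of the surface region
    name: str    # classification name of the surface region
    xyz: tuple   # 3D coordinates on the virtual body model
    uv: tuple    # 2D coordinates derived from the 3D coordinates


def project_uv(x: float, y: float, z: float) -> tuple:
    # One possible 3D-to-2D mapping: a cylindrical unwrap around the
    # vertical (y) axis; u is the normalized angle, v is the height.
    u = (math.atan2(x, z) / (2.0 * math.pi)) % 1.0
    return (round(u, 4), y)


def make_record(image_id: str, region: CaptureRegion) -> dict:
    # Bundle an image identifier with its capture-region metadata,
    # ready to be serialized next to the stored body image.
    return {"image": image_id, "region": asdict(region)}


xyz = (0.3, 1.1, 0.1)
region = CaptureRegion(index=12, name="left forearm", xyz=xyz,
                       uv=project_uv(*xyz))
record = make_record("img_0001", region)
print(json.dumps(record))
```

In an actual DICOM-based workflow along the lines of claim 2, a serialized record like this could be written into a reserved or unused metadata field of the stored file rather than kept as a separate JSON document.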
PCT/KR2022/018743 2021-12-16 2022-11-24 Method for managing body images and apparatus using same Ceased WO2023113285A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210180572A KR102429640B1 (en) 2021-12-16 2021-12-16 Method for managing body images, and apparatus using the same
KR10-2021-0180572 2021-12-16

Publications (1)

Publication Number Publication Date
WO2023113285A1 true WO2023113285A1 (en) 2023-06-22

Family

ID=82826753

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/018743 Ceased WO2023113285A1 (en) 2021-12-16 2022-11-24 Method for managing body images and apparatus using same

Country Status (2)

Country Link
KR (1) KR102429640B1 (en)
WO (1) WO2023113285A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102429640B1 (en) * 2021-12-16 2022-08-05 μ£Όμ‹νšŒμ‚¬ μ—ν”„μ•€λ””νŒŒνŠΈλ„ˆμŠ€ Method for managing body images, and apparatus using the same
KR102856049B1 (en) * 2024-12-10 2025-09-05 μ£Όμ‹νšŒμ‚¬ 컴포랩슀 Apparatus for 3D Human Body Scan Data Management for Selective Utilization of Human Body Part and Driving Method Thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002282215A (en) * 2001-03-27 2002-10-02 Mitsubishi Space Software Kk Image filing equipment
KR20140047268A (en) * 2012-10-12 2014-04-22 μ£Όμ‹νšŒμ‚¬ μΈν”Όλ‹ˆνŠΈν—¬μŠ€μΌ€μ–΄ Medical image display method using virtual patient model and apparatus thereof
US20200327661A1 (en) * 2019-04-12 2020-10-15 Zebra Medical Vision Ltd Systems and methods for processing 3d anatomical volumes based on localization of 2d slices thereof
KR20210021818A (en) * 2019-08-19 2021-03-02 μ£Όμ‹νšŒμ‚¬ μœ λΉ„μΌ€μ–΄ Method and system for serching medical images
KR102429640B1 (en) * 2021-12-16 2022-08-05 μ£Όμ‹νšŒμ‚¬ μ—ν”„μ•€λ””νŒŒνŠΈλ„ˆμŠ€ Method for managing body images, and apparatus using the same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2551476Y2 (en) 1992-01-21 1997-10-22 ホシデンζ ͺ式会瀾 Constant voltage compatible connector

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002282215A (en) * 2001-03-27 2002-10-02 Mitsubishi Space Software Kk Image filing equipment
KR20140047268A (en) * 2012-10-12 2014-04-22 μ£Όμ‹νšŒμ‚¬ μΈν”Όλ‹ˆνŠΈν—¬μŠ€μΌ€μ–΄ Medical image display method using virtual patient model and apparatus thereof
US20200327661A1 (en) * 2019-04-12 2020-10-15 Zebra Medical Vision Ltd Systems and methods for processing 3d anatomical volumes based on localization of 2d slices thereof
KR20210021818A (en) * 2019-08-19 2021-03-02 μ£Όμ‹νšŒμ‚¬ μœ λΉ„μΌ€μ–΄ Method and system for serching medical images
KR102429640B1 (en) * 2021-12-16 2022-08-05 μ£Όμ‹νšŒμ‚¬ μ—ν”„μ•€λ””νŒŒνŠΈλ„ˆμŠ€ Method for managing body images, and apparatus using the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PARK SANG KYU, KIM BO KYUN, SHIN DONGSUN: "Semi-automatic segmentation and surface reconstruction of CT images by using rotoscoping and warping techniques", FOLIA MORPHOLOGICA, WYDAWNICTWO VIA MEDICA, PL, vol. 79, no. 1, 1 January 2020 (2020-01-01), PL , pages 156 - 161, XP093072141, ISSN: 0015-5659, DOI: 10.5603/FM.a2019.0045 *

Also Published As

Publication number Publication date
KR102429640B1 (en) 2022-08-05

Similar Documents

Publication Publication Date Title
JP5628927B2 (en) MEDICAL INFORMATION DISPLAY DEVICE AND METHOD, AND PROGRAM
WO2016125978A1 (en) Method and apparatus for displaying medical image
US20090307328A1 (en) Remote management interface for a medical device
WO2023113285A1 (en) Method for managing body images and apparatus using same
US20130123603A1 (en) Medical device and method for displaying medical image using the same
CN102959579A (en) Medical information display apparatus, operation method and program
WO2019143021A1 (en) Method for supporting viewing of images and apparatus using same
WO2021034138A1 (en) Dementia evaluation method and apparatus using same
US10810758B2 (en) Method and system using augmentated reality for positioning of ECG electrodes
WO2019230302A1 (en) Training data collecting device, training data collecting method and program, training system, trained model, and endoscope image processing device
WO2017142223A1 (en) Remote image transmission system, display apparatus, and guide displaying method thereof
WO2021006472A1 (en) Multiple bone density displaying method for establishing implant procedure plan, and image processing device therefor
CN111261265A (en) Medical image system based on virtual intelligent medical platform
KR102222509B1 (en) Method for assisting determination on medical images and apparatus using the same
WO2019164277A1 (en) Method and device for evaluating bleeding by using surgical image
WO2020231007A2 (en) Medical equipment learning system
WO2010128818A2 (en) Medical image processing system and processing method
WO2013172685A1 (en) Apparatus and method for reconfiguring panoramic x-ray image
WO2022231329A1 (en) Method and device for displaying bio-image tissue
WO2021054700A1 (en) Method for providing tooth lesion information, and device using same
WO2019124836A1 (en) Method for mapping region of interest of first medical image onto second medical image, and device using same
WO2020130349A1 (en) Method and apparatus for recording treatment plan of 3d medical image
WO2023121051A1 (en) Patient information provision method, patient information provision apparatus, and computer-readable recording medium
JP2007052699A (en) Medical information processing system
WO2019164273A1 (en) Method and device for predicting surgery time on basis of surgery image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22907754

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22907754

Country of ref document: EP

Kind code of ref document: A1
