WO2020130579A1 - Image processing method and associated electronic device - Google Patents
Image processing method and associated electronic device
- Publication number
- WO2020130579A1 (PCT/KR2019/017895)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- landmark
- image
- distance
- landmarks
- ratio
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/60—Memory management
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/18—Image warping, e.g. rearranging pixels individually
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- Various embodiments of the present invention relate to an image processing method and an electronic device thereof.
- In general, an electronic device includes a camera, and an image may be provided to the user through the camera.
- The electronic device may support an image processing technique for correcting distortion of a visual object (e.g., a face) included in an image captured through the camera.
- Various embodiments of the present invention may provide an electronic device and a method for correcting image distortion more accurately by taking into account various elements of a visual object included in an image acquired through a camera.
- An electronic device according to various embodiments includes at least one camera, a processor operatively connected to the at least one camera, and a memory operatively connected to the processor. The memory stores instructions that, when executed, cause the processor to: acquire an image including a visual object corresponding to a face using the at least one camera; identify a plurality of landmarks corresponding to a facial component from the visual object; and, in response to identifying that a ratio of a distance of a first set of landmarks among the plurality of landmarks to a distance of a second set of landmarks among the plurality of landmarks is included in a first range, obtain a first image corrected based on the image.
- An image processing method in an electronic device according to various embodiments may include: obtaining an image including a visual object corresponding to a face; identifying a plurality of landmarks corresponding to a facial component from the visual object; and, in response to identifying that a ratio of a distance of a first set of landmarks among the plurality of landmarks to a distance of a second set of landmarks among the plurality of landmarks is included in a first range, obtaining a first image corrected based on the image.
- An electronic device and a method according to various embodiments of the present disclosure may correct image distortion more accurately by correcting image distortion in consideration of various elements of a visual object included in an image acquired through a camera.
- An electronic device and a method thereof may provide a natural image to a user by correcting image distortion.
- FIG. 1 is a diagram for explaining a cause of image distortion in an electronic device according to various embodiments of the present disclosure.
- FIG. 2 is an exemplary diagram for describing a distorted image according to various embodiments of the present disclosure.
- FIG. 3 is a block diagram of an electronic device in a network environment according to various embodiments.
- FIG. 4 is a block diagram of an electronic device according to various embodiments of the present disclosure.
- FIG. 5 is a flowchart illustrating a method of correcting a distorted image in an electronic device according to various embodiments of the present disclosure.
- FIG. 6 is an exemplary diagram illustrating a method of correcting a distorted image in an electronic device according to various embodiments of the present disclosure.
- FIGS. 7A and 7B are flowcharts illustrating a method of correcting a distorted image based on a distance ratio between landmarks corresponding to a face element in an electronic device according to various embodiments of the present disclosure.
- FIG. 8 is an exemplary diagram for describing a method of correcting a distorted image based on a distance ratio between landmarks corresponding to a face element in an electronic device according to various embodiments of the present disclosure.
- FIG. 9 is a diagram illustrating a method of correcting a distorted image based on a position and a length associated with a face region in an electronic device according to various embodiments of the present disclosure.
- FIG. 10 is an exemplary diagram illustrating a method of correcting a distorted image based on a position and a length associated with a face region in an electronic device according to various embodiments of the present disclosure.
- FIG. 11 is a diagram illustrating an algorithm for calculating image warping coordinates according to various embodiments.
- FIG. 1 is a diagram for explaining a cause of image distortion in an electronic device according to various embodiments of the present disclosure.
- Referring to FIG. 1, the cause of image distortion will be explained based on at least one of the relative position of an object (e.g., a face) with respect to a camera of an electronic device, the distance of the object (e.g., focal length or depth), or the relative direction of the object with respect to the camera.
- the degree to which an image is distorted may be changed according to a relative position of an object with respect to a camera (or a relative position of a subject within a camera angle of view).
- For example, the lengths of the paths of the lights 113 and 115 that pass through the lens 130 from the contour of the object 110 to the surface 140 forming the image may be the same.
- the object image 111 may not be distorted or the degree of distortion may be small.
- the object image 111 may be located at the center of the entire image acquired using the camera.
- In contrast, the lengths of the paths of the lights 123 and 125 that pass through the lens 130 from the contour of the object 120 to the surface 140 forming the image may be different.
- the object image 121 may be distorted.
- Even if the object 110 and the object 120 are the same, and the distance x between the object 110 and the surface 140 forming the image is the same as the distance x' between the object 120 and the surface 140 forming the image, the length d2 of the object image 121 may be different from the length d1 of the object image 111.
- For example, distortion may occur in which the length of the object image is increased along one axis of the object image and reduced along another axis.
- The degree of distortion may increase as the distance y of the object image from the center of the image acquired through the camera increases.
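- The geometric effect described above can be illustrated with a simple pinhole-projection sketch (this is not from the patent; the focal length, distances, and angles are arbitrary illustrative values): an object of fixed physical height at a fixed distance from the camera projects to a longer image the farther it sits from the optical axis.

```python
import numpy as np

F = 1.0  # focal length, arbitrary units (illustrative assumption)

def projected_height(angle_deg, r=1.0, h=0.02):
    """Projected height of a small vertical object of physical height h,
    placed at distance r from the pinhole, at angle_deg off the optical
    axis. Pinhole model: (X, Y, Z) maps to (F*X/Z, F*Y/Z)."""
    a = np.radians(angle_deg)
    top = np.array([r * np.sin(a),  h / 2, r * np.cos(a)])
    bot = np.array([r * np.sin(a), -h / 2, r * np.cos(a)])
    p_top = F * top[:2] / top[2]
    p_bot = F * bot[:2] / bot[2]
    return float(np.linalg.norm(p_top - p_bot))

d1 = projected_height(0.0)   # object imaged at the center (like object 110)
d2 = projected_height(40.0)  # same object imaged off-center (like object 120)
# d2 > d1: the off-axis object image is stretched even though r is unchanged.
```

Under this idealized model d2/d1 equals 1/cos(40°), roughly 1.31, so the same object would appear about 30% longer near the edge of the frame.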
- The degree of distortion of an object image may vary according to the distance between the camera and the object. For example, as the distance between the camera and the object increases, the degree of distortion of the object image may increase. For example, when comparing the object 120 and the object 141, the angle formed by the light passing through the lens 130 from the contour of the object 120 is large compared to the angle formed by the light passing through the lens 130 from the contour of the object 141, so a larger distortion may occur in the object image 121 than in the object image for the object 141.
- the degree of distortion of the object image may vary depending on the relative direction (or angle) of the object with respect to the camera.
- For example, depending on the relative direction of the object with respect to the camera, the distance between each of a plurality of landmarks (e.g., eyebrows, eyes, nose, or mouth) and the camera may vary.
- For example, the length of the path of light passing through the lens 130 from a point 155 of the contour of the object 110 may be different from the length of the path of light passing through the lens 130 from another point 151.
- In this case, the portion of the object image centered on the point where the length of the light path is reduced may be enlarged, and the portion centered on the point where the length of the light path is increased may be reduced.
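- The local enlargement and reduction described above amounts to a nonuniform remapping of image coordinates. As a toy 1-D illustration (not the patent's warping algorithm of FIG. 11; the anchor positions and scale factors are assumed), distances from an anchor point can be scaled up to enlarge its neighborhood or scaled down to shrink it:

```python
# Illustrative 1-D warp: remap y-coordinates so a region is locally
# enlarged around one anchor (scale > 1) or reduced (scale < 1).

def warp_y(y, anchor, scale):
    """Scale the distance of coordinate y from `anchor` by `scale`."""
    return anchor + (y - anchor) * scale

enlarged = [warp_y(y, anchor=100.0, scale=1.2) for y in (90.0, 100.0, 110.0)]
reduced  = [warp_y(y, anchor=100.0, scale=0.8) for y in (90.0, 100.0, 110.0)]
```

A full 2-D warp would apply such a mapping per pixel and resample the image, but the local-scaling idea is the same.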
- FIG. 2 is an exemplary diagram for describing a distorted image according to various embodiments of the present disclosure.
- The images 201 to 209 may be images captured of the same subject while changing the state between the subject and the camera.
- The image 201 may be an image captured in a state 201-1 in which the axis 201-3 of the subject 201-2 and the axis 201-5 of the camera 201-4 are parallel, and the subject 201-2 and the camera 201-4 are positioned such that the face region 201-6 of the subject 201-2 is located at the center of the image 201.
- The face region 201-6, including at least a part of the visual object included in the image 201, may be located in the central portion of the image 201. Accordingly, distortion may not occur in the face region 201-6, or the degree of distortion may be small.
- The ratio of the distance 201-7 of the first set of landmarks (e.g., eyebrows and nose) included in the face region 201-6 to the distance 201-8 of the second set of landmarks (e.g., eyes and mouth) may not be included in the first range (e.g., less than 0.95 or greater than 1.15).
- The ratio of the distance 201-9 of the third set of landmarks (e.g., eyebrows and eyes) to the distance 201-10 of the fourth set of landmarks (e.g., eyes and nose) may not be included in the second range (e.g., less than 1).
- The ratio of the distance 201-9 of the third set of landmarks to the distance 201-10 of the fourth set of landmarks may be equal to the ratio of the distance 201-9 of the third set of landmarks to the distance 201-11 of the fifth set of landmarks (e.g., nose and mouth).
- The image 203 may be an image captured in a state 203-1 in which the axis 203-3 of the subject 203-2 and the axis 203-5 of the camera 203-4 are parallel, and the subject 203-2 and the camera 203-4 are positioned such that the face region 203-6 of the subject 203-2 is located at the lower portion of the image 203.
- The face region 203-6 of the subject 203-2 included in the image 203 may be located in the lower portion of the image 203. Accordingly, distortion may occur in the face region 203-6.
- The ratio of the distance 203-7 of the first set of landmarks (e.g., eyebrows and nose) to the distance 203-8 of the second set of landmarks (e.g., eyes and mouth) may be included in the first range (e.g., less than 0.95 or greater than 1.15).
- The image 205 may be an image captured in a state 205-1 in which the axis 205-3 of the subject 205-2 and the axis 205-5 of the camera 205-4 are parallel, and the subject 205-2 and the camera 205-4 are positioned such that the face region 205-6 of the subject 205-2 is located at the upper portion of the image 205.
- The face region 205-6 of the subject 205-2 included in the image 205 may be located in the upper portion of the image 205. Accordingly, distortion may occur in the face region 205-6.
- The ratio of the distance 205-7 of the first set of landmarks (e.g., eyebrows and nose) included in the face region 205-6 to the distance 205-8 of the second set of landmarks (e.g., eyes and mouth) may be included in the first range (e.g., less than 0.95 or greater than 1.15).
- The image 207 may be an image captured in a state 207-1 in which the axis 207-3 of the subject 207-2 and the axis 207-5 of the camera 207-4 are not parallel, and the subject 207-2 and the camera 207-4 are positioned such that the face region 207-6 of the subject 207-2 is located at the upper portion of the image 207.
- The face region 207-6 of the subject 207-2 included in the image 207 may be located in the upper portion of the image 207. Accordingly, distortion may occur in the face region 207-6.
- The ratio of the distance 207-7 of the first set of landmarks (e.g., eyebrows and nose) to the distance 207-8 of the second set of landmarks (e.g., eyes and mouth) may not be included in the first range (e.g., less than 0.95 or greater than 1.15).
- The ratio of the distance 207-9 of the third set of landmarks (e.g., eyebrows and eyes) to the distance 207-10 of the fourth set of landmarks (e.g., eyes and nose) may not be included in the second range (e.g., less than 1).
- The ratio of the distance 207-9 of the third set of landmarks to the distance 207-10 of the fourth set of landmarks may be greater than the ratio of the distance 207-9 of the third set of landmarks to the distance 207-11 of the fifth set of landmarks (e.g., nose and mouth).
- The image 209 may be an image captured in a state 209-1 in which the axis 209-3 of the subject 209-2 and the axis 209-5 of the camera 209-4 are not parallel, and the subject 209-2 and the camera 209-4 are positioned such that the face region 209-6 of the subject 209-2 is located at the lower portion of the image 209.
- The face region 209-6 of the subject 209-2 included in the image 209 may be located in the lower portion of the image 209. Accordingly, distortion may occur in the face region 209-6.
- The ratio of the distance 209-7 of the first set of landmarks (e.g., eyebrows and nose) to the distance 209-8 of the second set of landmarks (e.g., eyes and mouth) may not be included in the first range (e.g., less than 0.95 or greater than 1.15).
- The ratio of the distance 209-9 of the third set of landmarks (e.g., eyebrows and eyes) to the distance 209-10 of the fourth set of landmarks (e.g., eyes and nose) may be included in the second range (e.g., less than 1).
- the distance of the landmarks may include a distance (or difference) between the y-axis coordinates of the landmarks.
- the distance of the first set of landmarks may include the distance (or difference) between the y-axis coordinates of the eyebrows identified on the image and the y-axis coordinates of the nose identified on the image.
- the distance of the second set of landmarks may include the distance (or difference) between the y-axis coordinates of the eye identified on the image and the y-axis coordinates of the mouth identified on the image.
- the distance of the third set of landmarks may include the distance (or difference) between the y-axis coordinates of the eyebrows identified on the image and the y-axis coordinates of the eye identified on the image.
- The distance of the fourth set of landmarks may include the distance (or difference) between the y-axis coordinates of the eye identified on the image and the y-axis coordinates of the nose identified on the image.
- the distance of the fifth set of landmarks may include the distance (or difference) of the y-axis coordinates of the nose identified on the image and the y-axis coordinates of the mouth identified on the image.
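- The five distances and the ratios built from them can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the landmark names and example coordinates are assumptions for demonstration.

```python
def landmark_ratios(landmarks):
    """landmarks: dict mapping 'eyebrow', 'eye', 'nose', 'mouth' to (x, y)
    image coordinates. Returns the three ratios discussed in the text."""
    y = {name: pt[1] for name, pt in landmarks.items()}
    d1 = abs(y['eyebrow'] - y['nose'])   # first set:  eyebrows and nose
    d2 = abs(y['eye']     - y['mouth'])  # second set: eyes and mouth
    d3 = abs(y['eyebrow'] - y['eye'])    # third set:  eyebrows and eyes
    d4 = abs(y['eye']     - y['nose'])   # fourth set: eyes and nose
    d5 = abs(y['nose']    - y['mouth'])  # fifth set:  nose and mouth
    return d1 / d2, d3 / d4, d3 / d5

# Illustrative coordinates (y grows downward, as in image coordinates).
r12, r34, r35 = landmark_ratios({'eyebrow': (50, 40), 'eye': (50, 55),
                                 'nose': (50, 80), 'mouth': (50, 100)})
```

Only the y-axis differences enter the distances, matching the definitions above; the x-coordinates are carried along solely to show that the landmarks are 2-D points.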
- FIG. 3 is a block diagram of an electronic device 301 in a network environment 300 according to various embodiments.
- Referring to FIG. 3, in the network environment 300, the electronic device 301 may communicate with the electronic device 302 through the first network 398 (e.g., a short-range wireless communication network), or may communicate with the electronic device 304 or the server 308 through the second network 399 (e.g., a long-range wireless communication network).
- the electronic device 301 may communicate with the electronic device 304 through the server 308.
- The electronic device 301 may include a processor 320, a memory 330, an input device 350, a sound output device 355, a display device 360, an audio module 370, a sensor module 376, an interface 377, a haptic module 379, a camera module 380, a power management module 388, a battery 389, a communication module 390, a subscriber identification module 396, or an antenna module 397.
- In some embodiments, at least one of these components (e.g., the display device 360 or the camera module 380) may be omitted from the electronic device 301, or one or more other components may be added to the electronic device 301.
- In some embodiments, the sensor module 376 may be implemented while embedded in the display device 360 (e.g., a display).
- The processor 320 may, for example, execute software (e.g., the program 340) to control at least one other component (e.g., a hardware or software component) of the electronic device 301 connected to the processor 320, and may perform various data processing or operations. According to an embodiment, as at least a part of the data processing or operations, the processor 320 may load instructions or data received from another component (e.g., the sensor module 376 or the communication module 390) into the volatile memory 332, process the instructions or data stored in the volatile memory 332, and store the result data in the non-volatile memory 334.
- According to an embodiment, the processor 320 may include a main processor 321 (e.g., a central processing unit or an application processor) and an auxiliary processor 323 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that may operate independently of, or together with, the main processor 321. Additionally or alternatively, the auxiliary processor 323 may be set to use less power than the main processor 321, or to be specialized for a designated function. The auxiliary processor 323 may be implemented separately from, or as a part of, the main processor 321.
- The auxiliary processor 323 may control at least some of the functions or states associated with at least one of the components of the electronic device 301 (e.g., the display device 360, the sensor module 376, or the communication module 390), for example, on behalf of the main processor 321 while the main processor 321 is in an inactive (e.g., sleep) state, or together with the main processor 321 while the main processor 321 is in an active (e.g., application execution) state. According to an embodiment, the auxiliary processor 323 (e.g., an image signal processor or a communication processor) may be implemented as a part of another functionally related component (e.g., the camera module 380 or the communication module 390).
- The memory 330 may store various data used by at least one component of the electronic device 301 (e.g., the processor 320 or the sensor module 376).
- the data may include, for example, software (eg, the program 340) and input data or output data for commands related thereto.
- the memory 330 may include a volatile memory 332 or a non-volatile memory 334.
- the program 340 may be stored as software in the memory 330, and may include, for example, an operating system 342, middleware 344, or an application 346.
- the input device 350 may receive commands or data to be used for components (eg, the processor 320) of the electronic device 301 from the outside (eg, a user) of the electronic device 301.
- the input device 350 may include, for example, a microphone, mouse, keyboard, or digital pen (eg, a stylus pen).
- the sound output device 355 may output sound signals to the outside of the electronic device 301.
- The sound output device 355 may include, for example, a speaker or a receiver.
- The speaker may be used for general purposes, such as multimedia playback or recording playback, and the receiver may be used to receive incoming calls. According to an embodiment, the receiver may be implemented separately from, or as a part of, the speaker.
- the display device 360 may visually provide information to the outside of the electronic device 301 (eg, a user).
- the display device 360 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
- The display device 360 may include touch circuitry configured to sense a touch, or a sensor circuit (e.g., a pressure sensor) configured to measure the strength of a force generated by the touch.
- The audio module 370 may convert sound into an electrical signal, or vice versa. According to an embodiment, the audio module 370 may acquire sound through the input device 350, or may output sound through the sound output device 355 or an external electronic device (e.g., the electronic device 302, such as a speaker or headphones) directly or wirelessly connected to the electronic device 301.
- The sensor module 376 may detect an operating state (e.g., power or temperature) of the electronic device 301, or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
- The sensor module 376 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- The interface 377 may support one or more designated protocols that may be used for the electronic device 301 to connect directly or wirelessly to an external electronic device (e.g., the electronic device 302).
- the interface 377 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
- The connection terminal 378 may include a connector through which the electronic device 301 can be physically connected to an external electronic device (e.g., the electronic device 302).
- the connection terminal 378 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
- the haptic module 379 may convert electrical signals into mechanical stimuli (eg, vibration or movement) or electrical stimuli that the user can perceive through tactile or motor sensations.
- the haptic module 379 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
- the camera module 380 may capture still images and videos. According to one embodiment, the camera module 380 may include one or more lenses, image sensors, image signal processors, or flashes.
- the power management module 388 may manage power supplied to the electronic device 301. According to an embodiment, the power management module 388 may be implemented, for example, as at least a part of a power management integrated circuit (PMIC).
- the battery 389 may supply power to at least one component of the electronic device 301.
- the battery 389 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
- The communication module 390 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 301 and an external electronic device (e.g., the electronic device 302, the electronic device 304, or the server 308), and performing communication through the established communication channel.
- The communication module 390 may include one or more communication processors that operate independently of the processor 320 (e.g., an application processor) and support direct (e.g., wired) communication or wireless communication.
- According to an embodiment, the communication module 390 may include a wireless communication module 392 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 394 (e.g., a local area network (LAN) communication module or a power line communication module).
- A corresponding communication module among these communication modules may communicate with an external electronic device through the first network 398 (e.g., a short-range communication network such as Bluetooth, WiFi direct, or infrared data association (IrDA)) or the second network 399 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
- The wireless communication module 392 may identify and authenticate the electronic device 301 within a communication network, such as the first network 398 or the second network 399, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 396.
- the antenna module 397 may transmit a signal or power to the outside (eg, an external electronic device) or receive it from the outside.
- the antenna module may include a single antenna including a conductor formed on a substrate (eg, a PCB) or a radiator made of a conductive pattern.
- According to an embodiment, the antenna module 397 may include a plurality of antennas. In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 398 or the second network 399, may be selected from the plurality of antennas by, for example, the communication module 390.
- the signal or power may be transmitted or received between the communication module 390 and an external electronic device through the at least one selected antenna.
- According to some embodiments, a component other than the radiator (e.g., an RFIC) may be additionally formed as a part of the antenna module 397.
- At least some of the components may be connected to each other and exchange signals (e.g., commands or data) with each other through a communication method between peripheral devices (e.g., a bus, a general purpose input and output (GPIO), a serial peripheral interface (SPI), or a mobile industry processor interface (MIPI)).
- the command or data may be transmitted or received between the electronic device 301 and the external electronic device 304 through the server 308 connected to the second network 399.
- Each of the electronic devices 302 and 304 may be the same or a different type of device from the electronic device 301.
- All or some of the operations performed in the electronic device 301 may be executed in one or more external devices among the external electronic devices 302, 304, or 308. For example, when the electronic device 301 needs to perform a function or service automatically or in response to a request from a user or another device, the electronic device 301 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a portion of the function or service.
- the one or more external electronic devices receiving the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and deliver the result of the execution to the electronic device 301.
- the electronic device 301 may process the result, as it is or additionally, and provide it as at least part of a response to the request.
- cloud computing, distributed computing, or client-server computing technology can be used.
- FIG. 4 is a block diagram of an electronic device according to various embodiments of the present disclosure.
- The electronic device 400 (e.g., the electronic device 301 of FIG. 3) may include a processor 401 (e.g., the processor 320 of FIG. 3), a memory 403 (e.g., the memory 330 of FIG. 3), a camera 405 (e.g., the camera module 380 of FIG. 3), or a display 407 (e.g., the display device 360 of FIG. 3).
- The electronic device 400 may further include a communication circuit (e.g., the communication module 390 of FIG. 3) and/or at least one sensor (e.g., the sensor module 376 of FIG. 3).
- the processor 401 may control the overall operation of the electronic device 400.
- The processor 401 may be operatively connected with other components of the electronic device 400, such as the memory 403, the camera 405, or the display 407, to control the overall operation of the electronic device 400.
- the processor 401 may receive instructions of other components, interpret the received instructions, and perform calculations or process data according to the interpreted instructions. For example, the processor 401 may request the memory 403 for instructions, data, or signals. The processor 401 may record (or store) or update instructions, data, or signals in the memory 403 to control the electronic device 400 or control other components in the electronic device 400.
- the processor 401 may interpret and process messages, data, instructions, or signals received from the memory 403, camera 405, or display 407.
- the processor 401 may generate a new message, data, instruction, or signal based on the received message, data, instruction, or signal.
- the processor 401 may provide a processed or generated message, data, instruction, or signal to the memory 403, camera 405, or display 407.
- the memory 403 may store commands for controlling the electronic device 400, control command codes, control information, or user data.
- the memory 403 may store one or more of applications, operating systems, middleware, or device drivers.
- The camera 405 may include one or more lenses, image sensors, image signal processors, or flashes to obtain a still image or a dynamic image. According to an embodiment, the camera may be disposed to be exposed through at least a portion of a first surface (e.g., the front) of the housing of the electronic device 400, or may be disposed to be exposed through at least a portion of a second surface (e.g., the back) of the housing facing away from the first surface.
- the display 407 may visually provide information to the outside of the electronic device 400.
- the display 407 may include touch circuitry configured to sense a touch, or a sensor circuit (eg, a pressure sensor) configured to measure the intensity of the force generated by a touch.
- the processor 401 may acquire a plurality of landmarks corresponding to a facial component from an image acquired through the camera 405. For example, the processor 401 may perform a face detection function on an image acquired through the camera 405 and, when a face region is detected in the image, obtain a plurality of landmarks corresponding to the facial component from the detected face region.
- the plurality of landmarks corresponding to the facial component may include at least one of an eyebrow, eye, nose, or mouth.
- the processor 401 may identify whether distortion has occurred in the image based on distance information between a plurality of landmarks corresponding to a facial component obtained from the image.
- the processor 401 may determine whether distortion occurs in the image based on at least one of: the distance of a first set of landmarks (eg, eyebrow and nose), the distance of a second set of landmarks (eg, eye and mouth), the distance of a third set of landmarks (eg, eyebrow and eye), the distance of a fourth set of landmarks (eg, eye and nose), or the distance of a fifth set of landmarks (eg, nose and mouth).
- when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is included in a first range (for example, less than 0.95 or greater than 1.15), the processor 401 may determine that distortion occurs in the face region 203-6 or 205-6 of the image, as in image 203 or 205 of FIG. 2.
- when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is not included in the first range, but the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is included in a second range (for example, less than 1), or the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks, the processor 401 may determine that distortion occurs in the face region of the image.
- when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is not included in the first range, the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is not included in the second range (for example, less than 1), and the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is smaller than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks, the processor 401 may determine that distortion is not generated in the face region 201-6 of the image, as in image 201 of FIG. 2.
- the processor 401 may correct the image when distortion occurs in the face region of the image. For example, when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is included in the first range (for example, less than 0.95 or greater than 1.15), the processor 401 may correct the image based on a first correction method using a position and a length associated with the face region.
- when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is not included in the first range, but the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is included in the second range (for example, less than 1), or the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks, the processor 401 may correct the image based on a second correction method using the position and length associated with the face region.
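The three-branch decision described above can be sketched as a small function. The five set distances (eyebrow–nose, eye–mouth, eyebrow–eye, eye–nose, nose–mouth) and the thresholds 0.95, 1.15, and 1 follow the ranges named in the text; the function and argument names are illustrative, not from the patent.

```python
def select_correction(d1, d2, d3, d4, d5):
    """Classify distortion from the five landmark-set distances.

    d1: eyebrow-nose, d2: eye-mouth, d3: eyebrow-eye,
    d4: eye-nose, d5: nose-mouth (all measured along the y axis).
    """
    # First range: the face is vertically offset in the camera's field of view.
    if d1 / d2 < 0.95 or d1 / d2 >= 1.15:
        return "first"
    # Second range, or third/fourth ratio exceeding third/fifth ratio:
    # the face is rotated (tilted) relative to the camera.
    if d3 / d4 < 1 or d3 / d4 > d3 / d5:
        return "second"
    # Otherwise no distortion is detected and the image is kept as-is.
    return "none"
```

For example, distances of (65, 75, 20, 45, 30) give a first-set/second-set ratio of about 0.87, which falls inside the first range, so the first correction method would be selected.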
- the processor 401 may store the corrected image in the memory 403.
- the processor 401 may output the corrected image as a preview image through the display 407.
- the processor 401 may determine a distance between landmarks based on the y-axis coordinates of the landmarks. For example, the processor 401 may determine the distance of the first set of landmarks by identifying the distance (or difference) between the y-axis coordinate of the eyebrow identified on the image and the y-axis coordinate of the nose identified on the image. For another example, the processor 401 may determine the distance of the second set of landmarks by identifying the distance (or difference) between the y-axis coordinate of the eye identified on the image and the y-axis coordinate of the mouth identified on the image.
- the processor 401 may determine the distance of the third set of landmarks by identifying the distance (or difference) between the y-axis coordinate of the eyebrow identified on the image and the y-axis coordinate of the eye identified on the image. For another example, the processor 401 may determine the distance of the fourth set of landmarks by identifying the distance (or difference) between the y-axis coordinate of the eye and the y-axis coordinate of the nose. For another example, the processor 401 may determine the distance of the fifth set of landmarks by identifying the distance (or difference) between the y-axis coordinate of the nose and the y-axis coordinate of the mouth.
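The y-coordinate differences just described can be illustrated concretely. The landmark names and pixel values below are hypothetical, chosen only to show the computation:

```python
def landmark_distances(y):
    """Compute the five set distances from per-landmark y-axis coordinates.

    y is a dict with keys 'eyebrow', 'eye', 'nose', 'mouth'; the values
    are illustrative pixel coordinates, not data from the document.
    """
    return {
        "first": abs(y["eyebrow"] - y["nose"]),   # eyebrow-nose
        "second": abs(y["eye"] - y["mouth"]),     # eye-mouth
        "third": abs(y["eyebrow"] - y["eye"]),    # eyebrow-eye
        "fourth": abs(y["eye"] - y["nose"]),      # eye-nose
        "fifth": abs(y["nose"] - y["mouth"]),     # nose-mouth
    }

d = landmark_distances({"eyebrow": 120, "eye": 140, "nose": 185, "mouth": 215})
# d["first"] == 65, d["second"] == 75
```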
- the first range may refer to data for determining whether an image is distorted according to a relative position of a subject with respect to a camera (or a relative position of a subject within a camera angle of view).
- the first range may be used to determine how much the subject is moved in the vertical direction from the center of view angle of the camera.
- the processor 401 may determine that the subject is located at a position moved more than a certain distance in the vertical direction from the central portion of the camera's field of view, and may thus determine that image distortion has occurred.
- the second range may refer to data for determining whether an image is distorted according to a relative direction (or angle) of a subject with respect to a camera.
- the second range may be used to determine how much the direction of the subject relative to the camera is rotated.
- when the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is included in the second range, the processor 401 may determine that the image was taken in the state 209-1, and thus may determine that the image is distorted.
- the comparison between the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks and the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks may be used to determine whether the image is distorted according to the relative direction (or angle) of the subject with respect to the camera.
- when the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks, the processor 401 may determine that the image was taken in the state 207-1 and that distortion has occurred in the image.
- the electronic device may include at least one camera (eg, the camera module 380 of FIG. 3 or the camera 405 of FIG. 4), a processor operatively connected to the at least one camera (eg, the processor 320 of FIG. 3 or the processor 401 of FIG. 4), and a memory operatively connected to the processor (eg, the memory 330 of FIG. 3 or the memory 403 of FIG. 4).
- the memory may store instructions that, when executed, cause the processor to acquire, using the at least one camera, an image containing a visual object corresponding to a face, identify a plurality of landmarks corresponding to a facial component from the visual object, and obtain a corrected first image based on the image in response to identifying that the ratio of the distance of a first set of landmarks among the plurality of landmarks to the distance of a second set of landmarks among the plurality of landmarks is included in a first range.
- the first set of landmarks includes a first landmark and a second landmark
- the second set of landmarks includes a third landmark and a fourth landmark
- the instructions may cause the processor, in response to identifying that the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is not included in the first range, to determine whether the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the third landmark falls within a second range, and, when the ratio falls within the second range, to obtain a corrected second image based on the image.
- when the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the third landmark is not included in the second range, the instructions may cause the processor to determine whether the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the third landmark is greater than the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the fourth landmark, and, when it is greater, to obtain the corrected second image based on the image.
- when the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the third landmark is not greater than the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the fourth landmark, the instructions may cause the processor to maintain the image.
- the instructions may cause the processor to determine a weight for image correction based on location information associated with a face region including at least a portion of the visual object, determine a first correction factor for image correction based on the weight and length information associated with the face region, perform image warping based on the first correction factor, and obtain the first image based on at least a portion of the warped image.
- the instructions may cause the processor to determine the weight for image correction based on the location information associated with the face region, determine a second correction factor for image correction based on the weight and the length information associated with the face region, perform image warping based on the second correction factor, and obtain the second image based on at least a portion of the image on which the image warping is performed.
- the location information associated with the face region may include a center coordinate of the face region, and the length information associated with the face region may include the length of one side of the face region.
- the electronic device further includes a display operatively connected to the processor (eg, the display device 360 of FIG. 3 or the display 407 of FIG. 4), and the instructions may cause the processor to display the image, the first image, or the second image as a preview image through the display, and store the image, the first image, or the second image in the memory.
- the electronic device further includes a display operatively connected to the processor, and the instructions may cause the processor to acquire the first image or the second image while displaying the image as a preview image through the display, and store the acquired image in the memory.
- when the processor does not identify a plurality of landmarks corresponding to the facial component from the visual object, the instructions may cause the processor to identify a property of the visual object, generate a distortion map based on the property of the visual object, calculate a correction parameter for correcting the distortion map, and correct the image based on the correction parameter. The property of the visual object may include at least one of the position or the size of the visual object.
- FIG. 5 is a flowchart illustrating a method of correcting a distorted image in an electronic device according to various embodiments of the present disclosure.
- FIG. 6 is an exemplary diagram illustrating a method of correcting a distorted image in an electronic device according to various embodiments of the present disclosure.
- a processor (eg, the processor 320 of FIG. 3 or the processor 401 of FIG. 4) of the electronic device (eg, the electronic device 301 of FIG. 3 or the electronic device 400 of FIG. 4) may acquire an image including a visual object corresponding to a face through a camera (eg, the camera module 380 of FIG. 3 or the camera 405 of FIG. 4).
- in response to identifying that an input for acquiring an image (eg, an input to a shooting icon or a shooting button) is received while the camera application is running, the processor may obtain, through the camera, an image 601 including the visual object 601-1 corresponding to the face, as shown in FIG. 6.
- the image 601 may be an image photographed in a state 601-3 in which the subject and the camera are positioned such that the axis 601-4 of the camera and the axis 601-5 of the electronic device are parallel and the visual object 601-1 is located at the lower portion of the image 601.
- the processor may acquire an image to be used as a preview image displayed through a display (eg, the display device 360 of FIG. 3 or the display 407 of FIG. 4) while the camera application is running.
- the camera may be disposed to be exposed through at least a portion of a first side (eg, the front) of the housing of the electronic device, or may be disposed to be exposed through at least a portion of a second side (eg, the rear) facing away from the first side of the housing.
- the processor may identify a plurality of landmarks corresponding to a facial component from the image acquired through the camera. For example, as shown in FIG. 6, the processor may perform a face detection function on the image acquired through the camera, identify through the face detection function a face region 603-1 including at least a part (eg, a face) of the visual object included in the image, and identify a plurality of landmarks 603-3 to 603-9 corresponding to the facial component from the identified face region.
- the processor may identify, from the face region 603-1, a first landmark 603-3 (eg, eyebrow), a second landmark 603-5 (eg, eye), a third landmark 603-7 (eg, nose), and a fourth landmark 603-9 (eg, mouth).
- the processor may perform an operation of identifying the plurality of landmarks corresponding to the facial component again.
- the processor may output an indicator (eg, a guidance message, vibration, sound, or LED flashing) to induce the user to re-capture an image in which the plurality of landmarks corresponding to the facial component can be identified.
- the shape of the face region 603-1 may include a square shape having the same length and width.
- the processor may obtain a corrected first image based on the image in response to identifying that the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks among the plurality of landmarks is included in the first range.
- the processor may identify a distance between at least some of the plurality of landmarks.
- based on the identified distances, the processor may identify the ratio of the distance of the first set of landmarks (eg, the first landmark 603-3 and the third landmark 603-7) to the distance of the second set of landmarks (eg, the second landmark 603-5 and the fourth landmark 603-9).
- when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is included in the first range (for example, less than 0.95 or greater than 1.15), the processor may determine that distortion has occurred in the visual object 601-1 and correct the image. For example, the processor may obtain the distortion-corrected first image 605 shown in FIG. 6 by correcting the image using a first correction method that uses a position (the center coordinates of the face region) and a length (the length of one side of the face region) associated with the face region.
- when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is not included in the first range (eg, less than 0.95 or greater than 1.15), but the ratio of the distance of the third set of landmarks (eg, the first landmark 603-3 and the second landmark 603-5) to the distance of the fourth set of landmarks (eg, the second landmark 603-5 and the third landmark 603-7) is included in the second range (for example, less than 1), or the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks (eg, the third landmark 603-7 and the fourth landmark 603-9), the processor may determine that distortion has occurred in the visual object. In this case, the processor may obtain a second image by correcting the image using a second correction method that uses the position (the center coordinates of the face region) and the length (the length of one side of the face region) associated with the face region.
- when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is not included in the first range (eg, less than 0.95 or greater than 1.15), the ratio of the distance of the third set of landmarks (eg, the first landmark 603-3 and the second landmark 603-5) to the distance of the fourth set of landmarks (eg, the second landmark 603-5 and the third landmark 603-7) is not included in the second range (for example, less than 1), and the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is smaller than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks (eg, the third landmark 603-7 and the fourth landmark 603-9), the processor may determine that distortion has not occurred in the visual object, and may not perform an operation for correcting the image.
- the processor of the electronic device may correct the distorted image even when a plurality of landmarks corresponding to the facial component are not detected from the face region of the image. For example, if a plurality of landmarks corresponding to the facial component are not detected from the face region of the image, the processor may identify a property (eg, position or size) of the visual object corresponding to the face and correct the image based on the identified property.
- the processor may also correct the distorted image by determining a distortion map based on the property of the visual object corresponding to the face, calculating a correction parameter for correcting the determined distortion map, and correcting the image based on the calculated correction parameter.
- the distortion map is a map expressing the degree of distortion of the image, and the contour of the distortion map may have a shape that is curved in a direction toward the center of the image.
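The document does not give a formula for the distortion map; the sketch below merely assumes that the distortion magnitude grows with the normalized distance from the image centre, which yields equal-magnitude contours that curve around the centre as described. All names and values are illustrative.

```python
import math

def distortion_map(width, height):
    """Hypothetical distortion map: a per-pixel magnitude in [0, 1] that
    grows with normalized distance from the image centre, so that
    equal-magnitude contours curve around the centre of the image."""
    cx, cy = (width - 1) / 2, (height - 1) / 2
    max_r = math.hypot(cx, cy)  # distance from centre to a corner
    return [[math.hypot(x - cx, y - cy) / max_r for x in range(width)]
            for y in range(height)]
```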
- FIGS. 7A and 7B are flowcharts illustrating a method of correcting a distorted image based on a distance ratio between landmarks corresponding to a facial component in an electronic device according to various embodiments of the present disclosure.
- FIG. 8 is an exemplary diagram for describing a method of correcting a distorted image based on a distance ratio between landmarks corresponding to a facial component in an electronic device according to various embodiments of the present disclosure. The following description may be a detailed description of the operation of acquiring the first image in operation 505 of FIG. 5.
- the objects 801-13, 803-13, and 805-13 included in the images 801 to 807 of FIG. 8 represent the positional relationship between the subject and the camera when each image was shot, and do not represent information included in the image.
- a processor (eg, the processor 320 of FIG. 3 or the processor 401 of FIG. 4) of an electronic device (eg, the electronic device 301 of FIG. 3 or the electronic device 400 of FIG. 4) may identify a distance between at least some of the plurality of landmarks identified from the image.
- as shown in FIG. 8, the processor may identify, in the face region 801-1 included in the image 801, the distance 801-3 of the first set of landmarks (eg, eyebrow and nose), the distance 801-5 of the second set of landmarks (eg, eye and mouth), the distance of the third set of landmarks (eg, eyebrow and eye), the distance of the fourth set of landmarks (eg, eye and nose), and the distance of the fifth set of landmarks (eg, nose and mouth).
- the shape of the face region may include a square shape having the same width and length.
- the processor may determine whether the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is included in the first range. When the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is less than 0.95 or greater than or equal to 1.15, the processor may determine that the ratio is included in the first range.
- the processor may perform operation 705 when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is included in the first range, and may perform operation 707 when the ratio is not included in the first range.
- when the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is included in the first range, the processor may obtain the first image based on a first correction method using a position and a length associated with the face region.
- for example, the processor may determine a weight for image correction based on the center coordinates of the face region 801-1 of the image 801, determine a correction factor for image correction based on the determined weight and the length of one side of the face region 801-1, and obtain the distortion-corrected image 807 by performing image warping based on the correction factor.
- the processor may store the distortion-corrected image 807 in a memory (eg, the memory 330 of FIG. 3 or the memory 403 of FIG. 4 ).
- the processor may provide the image 801 as a preview image through a display (eg, the display device 360 of FIG. 3 or the display 407 of FIG. 4).
- the processor may provide a distortion-corrected image 807 as a preview image through a display.
- the processor may determine whether the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks falls within the second range. When the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is less than 1, the processor may determine that the ratio is included in the second range.
- the processor may perform operation 711 when the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is included in the second range, and may perform operation 713 when the ratio is not included in the second range.
- when the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is included in the second range, the processor may obtain a second image based on a second correction method using the position and the length associated with the face region.
- for example, the processor may determine a weight for image correction based on the center coordinates of the face region 803-1 of the image 803, determine a correction factor for image correction based on the determined weight and the length of one side of the face region 803-1, and obtain the distortion-corrected image 809 from the image 803 by performing image warping based on the correction factor.
- the function used to determine the weight and the function used to determine the correction coefficient in operation 711 may be at least partially different from the function used to determine the weight and the function used to determine the correction coefficient in operation 705.
- the processor may store the distortion-corrected image 809 in the memory.
- the processor may provide the image 801 as a preview image through the display while generating and storing the distortion-corrected image 809. According to an embodiment, the processor may provide a distortion-corrected image 809 as a preview image through a display.
- the processor may determine whether the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks.
- the processor may perform operation 711 when the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks, and may terminate the algorithm when the ratio is not greater.
- when the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks, as in the image 805, the processor may determine a weight for image correction based on the center coordinates of the face region 805-1 of the image 805, determine a correction factor for image correction based on the determined weight and the length of one side of the face region 805-1, perform image warping based on the correction factor, and obtain the distortion-corrected image 811 from the image 805 by cropping and resizing at least a portion of the image on which the image warping was performed.
- the processor may provide the image 801 as a preview image through the display while generating and storing the distortion-corrected image 811.
- the processor may provide a distortion-corrected image 811 as a preview image through a display.
- although the electronic device is described as performing operation 709 and then performing operation 713, according to various embodiments of the present invention, the electronic device may perform operation 713 first and then perform operation 709, or may perform operation 709 and operation 713 in parallel.
- FIG. 9 is a diagram illustrating a method of correcting a distorted image based on a position and a length associated with a face region in an electronic device according to various embodiments of the present disclosure.
- FIG. 10 is an exemplary diagram illustrating a method of correcting a distorted image based on a position and a length associated with a face region in an electronic device according to various embodiments of the present disclosure.
- FIG. 11 illustrates an algorithm for calculating image warping coordinates according to various embodiments. The following description may be a detailed description of acquiring the first image in operation 705 of FIG. 7A or acquiring the second image in operation 711 of FIG. 7B.
- a processor (eg, the processor 320 of FIG. 3 or the processor 401 of FIG. 4) of an electronic device (eg, the electronic device 301 of FIG. 3 or the electronic device 400 of FIG. 4) may determine a weight for image correction based on the location associated with the face region. For example, when the ratio of the distance of the first set of landmarks (eg, eyebrow and nose) included in the face region 1001-1 of the image 1001 to the distance of the second set of landmarks (eg, eye and mouth) is included in the first range, the weight for image correction may be determined using <Equation 1> below.
- in <Equation 1>, ω_v represents a correction weight based on the vertical position of the face region, f_y represents a weight value corresponding to the vertical coordinate of the central portion of the face region, and height represents the height of the image.
- ω_h denotes a correction weight based on the horizontal position of the face region, f_x denotes a weight value corresponding to the horizontal coordinate of the central portion of the face region, and width denotes the width of the image.
- the values corresponding to f_y and f_x may be determined based on a table or map in which weight values are mapped to each horizontal or vertical coordinate of the image 1001.
- the table or map to which the weight value is mapped may be updated based on information received from an external electronic device (eg, the server 308 of FIG. 3 or the electronic device 302 or 304).
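The per-coordinate weight table can be sketched as a lookup with linear interpolation. The breakpoints and weight values below are invented for illustration (the actual table is not given in the text); the sketch follows only the stated idea that the weight depends on the face-centre coordinate normalized by the image height or width.

```python
def lookup_weight(coord, size, table=((0.0, 1.0), (0.5, 0.0), (1.0, 1.0))):
    """Interpolate a weight for a face-centre coordinate.

    coord / size is the normalized centre position (f_y / height or
    f_x / width). The (position, weight) breakpoints are hypothetical:
    zero correction when the face sits at the image centre, growing
    toward the edges of the field of view.
    """
    t = coord / size
    for (p0, w0), (p1, w1) in zip(table, table[1:]):
        if p0 <= t <= p1:
            # Linear interpolation between adjacent table entries.
            return w0 + (w1 - w0) * (t - p0) / (p1 - p0)
    return table[-1][1]
```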
- when the ratio of the distance of the first set of landmarks included in the face region of the image 1001 to the distance of the second set of landmarks is not included in the first range, but the ratio of the distance of the third set of landmarks (eg, eyebrow and eye) to the distance of the fourth set of landmarks (eg, eye and nose) is included in the second range (eg, less than 1), or the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks (eg, nose and mouth), the above <Equation 1> may also be used to determine the weight for image correction.
- the shape of the face region may include a square shape having the same length and width.
- the processor may determine a correction factor for image correction based on the weight and the length associated with the face region. For example, when the ratio of the distance of the first set of landmarks included in the face region 1001-1 of the image 1001 to the distance of the second set of landmarks is included in the first range, the processor may determine a correction coefficient for image correction by substituting the weight determined based on <Equation 1> into <Equation 2> below.
- C_v denotes a correction coefficient based on the vertical position of the face region.
- α_c denotes a constant value for determining the correction strength of the image.
- f_width denotes the length of one side of the face region.
- C_h may represent a correction factor based on the horizontal position of the face region.
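Since <Equation 2> itself is reproduced only as a figure in the published application, the sketch below merely illustrates one hypothetical way a correction coefficient could combine the weight, the strength constant, and the face-region side length named in the definitions above; the functional form is an assumption:

```python
def correction_coefficient(weight, alpha_c, f_width, image_len):
    """Hypothetical correction coefficient: scales with the position
    weight, a strength constant, and the face-region side length
    relative to the image dimension. Returns 1.0 (no correction)
    when the weight is zero, i.e., for a centered face."""
    return 1.0 + alpha_c * weight * (f_width / image_len)
```

Under this sketch, a larger face region or a face farther from the image center yields a coefficient farther from 1.0, i.e., a stronger warp.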
- when the ratio of the distance of the first set of landmarks included in the face region to the distance of the second set of landmarks is not included in the first range, but the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is included in the second range (for example, less than 1), or the ratio of the distance of the third set of landmarks to the distance of the fourth set of landmarks is greater than the ratio of the distance of the third set of landmarks to the distance of the fifth set of landmarks, the processor may determine a correction coefficient for image correction by substituting the weight determined based on <Equation 1> into <Equation 3> below.
- in <Equation 3>, C_v denotes a correction coefficient based on the vertical position of the face region, α_c and α_g denote constant values for determining the correction strength of the image, f_width denotes the length of one side of the face region, and C_h denotes a correction coefficient based on the horizontal position of the face region.
- the processor may perform image warping based on the correction coefficient.
- the processor may determine, through the algorithm shown in FIG. 11 below, coordinates for transforming at least a part of the image 1001 based on the correction coefficient value determined based on <Equation 2> or <Equation 3>.
- the processor may transform at least some form of the image based on the determined coordinates.
- the processor may obtain an image 1003 by extending at least a portion of the image 1001 such that a first portion (P_tl) of the image 1001 is located at a third position (P'_tl) calculated by the algorithm shown in FIG. 11 and a second portion (P_tr) of the image 1001 is located at a fourth position (P'_tr) calculated by the algorithm shown in FIG. 11.
- the processor may acquire another image in which image distortion is corrected based on at least a part of the image on which image warping is performed. For example, the processor may obtain the distortion-corrected image 1005 by cropping a partial area 1003-1 corresponding to the image 1001 from the warped image 1003 and resizing the cropped area to correspond to the size of the image 1001.
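This final crop-and-resize step can be sketched in pure NumPy with nearest-neighbor sampling. The coordinates are illustrative stand-ins for the area 1003-1; a production pipeline would typically use a library resampler with interpolation:

```python
import numpy as np

def crop_and_resize(warped, x0, y0, x1, y1, out_h, out_w):
    """Crop the region of the warped image that corresponds to the
    original frame, then resize it back to the original size using
    nearest-neighbor sampling."""
    crop = warped[y0:y1, x0:x1]
    ch, cw = crop.shape[:2]
    # Map each output pixel back to a source pixel in the crop.
    rows = np.arange(out_h) * ch // out_h
    cols = np.arange(out_w) * cw // out_w
    return crop[rows[:, None], cols]
```

The integer-array indexing at the end broadcasts the row and column index vectors into an (out_h, out_w) grid, so the same code works for grayscale and multi-channel images alike.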
- an image processing method in an electronic device may include an operation of obtaining an image including a visual object corresponding to a face, an operation of identifying a plurality of landmarks corresponding to facial components from the visual object, and an operation of obtaining a corrected first image based on the image in response to identifying that a ratio of a distance of a first set of landmarks among the plurality of landmarks to a distance of a second set of landmarks among the plurality of landmarks is included in a first range.
- the first set of landmarks includes a first landmark and a second landmark
- the second set of landmarks includes a third landmark and a fourth landmark
- the image processing method may further include, in response to identifying that the ratio of the distance of the first set of landmarks to the distance of the second set of landmarks is not included in the first range, an operation of determining whether a ratio of a distance between the first landmark and the third landmark to a distance between the second landmark and the third landmark is included in a second range, and an operation of obtaining a corrected second image based on the image.
- when the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the third landmark is not included in the second range, the method may include an operation of determining whether the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the third landmark is greater than the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the fourth landmark, and an operation of acquiring the corrected second image based on the image when it is greater.
- if the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the third landmark is not greater than the ratio of the distance between the first landmark and the third landmark to the distance between the second landmark and the fourth landmark, an operation of maintaining the image may be further included.
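The branch structure of the method (the first range yields the corrected first image; the second range or a greater ratio yields the corrected second image; otherwise the image is maintained) can be sketched as follows, with hypothetical distances and placeholder ranges:

```python
def select_correction(d12, d34, d13, d23, d24,
                      first_range=(0.9, 1.1), second_range=(0.0, 1.0)):
    """Return which output the method would produce.

    d12/d34 are the distances of the first/second landmark sets;
    d13, d23, d24 are distances between individual landmarks
    (first-third, second-third, second-fourth). Range values are
    illustrative placeholders, not figures from the application.
    """
    # First check: ratio of the two landmark-set distances.
    if first_range[0] <= d12 / d34 <= first_range[1]:
        return "first_image"
    # Fallback: ratio of individual landmark distances.
    r = d13 / d23
    if second_range[0] <= r <= second_range[1]:
        return "second_image"
    # Outside both ranges: compare against the d13/d24 ratio.
    if r > d13 / d24:
        return "second_image"
    return "unchanged"
```

Only when every check fails is the image left as captured, which matches the claim language about "maintaining the image."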
- the operation of obtaining the first image may include an operation of determining a weight for image correction based on location information associated with a face region including at least a portion of the visual object, an operation of determining a first correction factor for image correction based on the weight and length information associated with the face region, an operation of performing image warping based on the first correction factor, and an operation of acquiring the first image based on at least a portion of the image on which the image warping was performed.
- the operation of acquiring the second image may include an operation of determining the weight for image correction based on the location information associated with the face region, an operation of determining a second correction coefficient for image correction based on the weight and the length information associated with the face region, an operation of performing image warping based on the second correction coefficient, and an operation of obtaining the second image based on at least a portion of the image on which the image warping was performed.
- the location information associated with the face region may include a center coordinate of the face region, and the length information associated with the face region may include the length of one side of the face region.
- the image processing method may further include an operation of displaying the image, the first image, or the second image as a preview image, and an operation of storing the image, the first image, or the second image.
- the first image or the second image is obtained and stored while the image is displayed as a preview image.
- the image processing method may further include an operation of determining an attribute of the visual object, an operation of generating a distortion map based on the attribute of the visual object and the image, an operation of calculating a correction parameter for correcting the distortion map, and an operation of correcting the image based on the correction parameter.
- the attribute may include at least one of the position or size of the visual object.
- An electronic device may be various types of devices.
- the electronic device may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
- phrases such as "at least one of A, B, or C" may include any one of the items listed together in the corresponding phrase, or all possible combinations thereof.
- Terms such as "first" and "second" may be used simply to distinguish a component from other components, and do not limit the components in other aspects (e.g., importance or order).
- when any (e.g., first) component is referred to as being "coupled" or "connected" to another (e.g., second) component, with or without the term "functionally" or "communicatively", it means that the component may be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
- the term "module" used in this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
- the module may be an integrally configured component, or a minimal unit that performs one or more functions, or a part thereof.
- the module may be implemented in the form of an application-specific integrated circuit (ASIC).
- Various embodiments of the present disclosure may be implemented as software (e.g., program 340) that includes one or more instructions stored in a storage medium (e.g., internal memory 336 or external memory 330) readable by a machine (e.g., electronic device 301).
- for example, a processor (e.g., processor 320) of the machine may call at least one of the one or more stored instructions from the storage medium and execute it.
- the one or more instructions may include code generated by a compiler or code executable by an interpreter.
- the storage medium readable by the device may be provided in the form of a non-transitory storage medium. Here, "non-transitory" only means that the storage medium is a tangible device and does not include a signal (e.g., electromagnetic waves); the term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
- a method according to various embodiments disclosed in this document may be provided as being included in a computer program product.
- Computer program products can be traded between sellers and buyers as products.
- the computer program product may be distributed in the form of a device-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
- a portion of the computer program product may be stored at least temporarily on a storage medium readable by a device such as a memory of a manufacturer's server, an application store's server, or a relay server, or may be temporarily generated.
- each component (eg, module or program) of the above-described components may include a singular or a plurality of entities.
- one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
- according to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components in the same or a similar manner as performed by the corresponding component among the plurality of components prior to the integration.
- operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
Abstract
The electronic device according to various embodiments comprises at least one camera, a processor operatively connected to the at least one camera, and a memory operatively connected to the processor, wherein the memory may store instructions that, when executed, cause the processor to: acquire an image including a visual object corresponding to a face using the at least one camera; identify, from the visual object, a plurality of landmarks corresponding to facial components; and acquire a corrected first image based on the image, in response to identifying that the ratio between the distances of a first set of landmarks among the plurality of landmarks and the distances of a second set of landmarks among the plurality of landmarks falls within a first range. Other embodiments are also possible.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020180163558A KR102607789B1 (ko) | 2018-12-17 | 2018-12-17 | 이미지 처리 방법 및 그 전자 장치 |
| KR10-2018-0163558 | 2018-12-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020130579A1 true WO2020130579A1 (fr) | 2020-06-25 |
Family
ID=71101486
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2019/017895 Ceased WO2020130579A1 (fr) | 2018-12-17 | 2019-12-17 | Procédé de traitement d'image, et dispositif électronique associé |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR102607789B1 (fr) |
| WO (1) | WO2020130579A1 (fr) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102416554B1 (ko) * | 2020-10-08 | 2022-07-05 | 주식회사 써머캣 | 이미지에 포함된 얼굴 윤곽을 보정하기 위한 장치 및 이의 작동 방법 |
| WO2024111943A1 (fr) * | 2022-11-21 | 2024-05-30 | 삼성전자주식회사 | Dispositif électronique, procédé et support de stockage lisible par ordinateur pour identifier des objets visuels correspondant à des informations de code à l'aide d'une pluralité de caméras |
| WO2025147255A1 (fr) * | 2024-01-04 | 2025-07-10 | Canon Kabushiki Kaisha | Appareil et procédé de correction pendant un retrait d'affichage de monture de tête |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20070061404A (ko) * | 2005-12-09 | 2007-06-13 | 가시오 히타치 모바일 커뮤니케이션즈 컴퍼니 리미티드 | 촬상 장치, 촬상 영상처리방법, 및 컴퓨터 판독가능기록매체 |
| KR20120055598A (ko) * | 2009-08-04 | 2012-05-31 | 베잘리스 | 기준 이미지에 따라 타켓 이미지를 수정하는 이미지 처리방법 및 이미지 처리장치 |
| US20140185931A1 (en) * | 2011-06-07 | 2014-07-03 | Omron Corporation | Image processing device, image processing method, and computer readable medium |
| KR101460130B1 (ko) * | 2007-12-11 | 2014-11-10 | 삼성전자주식회사 | 휴대 단말기의 화상 통화 방법 및 장치 |
| US20160253791A1 (en) * | 2015-02-27 | 2016-09-01 | Sony Corporation | Optical distortion compensation |
- 2018-12-17 KR KR1020180163558A patent/KR102607789B1/ko active Active
- 2019-12-17 WO PCT/KR2019/017895 patent/WO2020130579A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| KR102607789B1 (ko) | 2023-11-30 |
| KR20200074780A (ko) | 2020-06-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2020171553A1 (fr) | Dispositif électronique appliquant un effet bokeh à une image, et procédé de commande associé | |
| WO2020080845A1 (fr) | Dispositif électronique et procédé pour obtenir des images | |
| WO2020185029A1 (fr) | Dispositif électronique et procédé d'affichage des informations de partage sur la base de la réalité augmentée | |
| WO2019221464A1 (fr) | Appareil et procédé de reconnaissance d'un objet dans un dispositif électronique | |
| WO2019066373A1 (fr) | Procédé de correction d'image sur la base de catégorie et de taux de reconnaissance d'objet inclus dans l'image et dispositif électronique mettant en œuvre celui-ci | |
| WO2019142997A1 (fr) | Appareil et procédé pour compenser un changement d'image provoqué par un mouvement de stabilisation d'image optique (sio) | |
| WO2019194455A1 (fr) | Appareil et procédé de reconnaissance d'objet dans une image | |
| WO2019156308A1 (fr) | Appareil et procédé d'estimation de mouvement de stabilisation d'image optique | |
| WO2020116844A1 (fr) | Dispositif électronique et procédé d'acquisition d'informations de profondeur à l'aide au moins de caméras ou d'un capteur de profondeur | |
| WO2019139404A1 (fr) | Dispositif électronique et procédé de traitement d'image correspondante | |
| WO2021080307A1 (fr) | Procédé de commande de caméra et dispositif électronique correspondant | |
| WO2020032383A1 (fr) | Dispositif électronique permettant de fournir un résultat de reconnaissance d'un objet externe à l'aide des informations de reconnaissance concernant une image, des informations de reconnaissance similaires associées à des informations de reconnaissance, et des informations de hiérarchie, et son procédé d'utilisation | |
| WO2022080869A1 (fr) | Procédé de mise à jour d'une carte tridimensionnelle au moyen d'une image et dispositif électronique prenant en charge ledit procédé | |
| WO2021230568A1 (fr) | Dispositif électronique permettant de fournir un service de réalité augmentée et son procédé de fonctionnement | |
| WO2020130579A1 (fr) | Procédé de traitement d'image, et dispositif électronique associé | |
| WO2020197070A1 (fr) | Dispositif électronique effectuant une fonction selon une entrée de geste et son procédé de fonctionnement | |
| WO2021149938A1 (fr) | Dispositif électronique et procédé de commande de robot | |
| WO2021157996A1 (fr) | Dispositif électronique et son procédé de traitement d'image | |
| WO2019172577A1 (fr) | Dispositif et procédé de traitement d'images d'un dispositif électronique | |
| WO2021235884A1 (fr) | Dispositif électronique et procédé de génération d'image par réalisation d'un awb | |
| WO2021162263A1 (fr) | Procédé de génération d'image et dispositif électronique associé | |
| WO2020190008A1 (fr) | Dispositif électronique pour fonction de focalisation auto, et procédé de commande correspondant | |
| WO2020085718A1 (fr) | Procédé et dispositif de génération d'avatar sur la base d'une image corrigée | |
| WO2019172723A1 (fr) | Interface connectée à un capteur d'image et dispositif électronique comprenant des interfaces connectées parmi une pluralité de processeurs | |
| WO2022173164A1 (fr) | Procédé et dispositif électronique d'affichage d'un objet de réalité augmentée |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19899337 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19899337 Country of ref document: EP Kind code of ref document: A1 |