
WO2022250305A1 - Method for image stabilization during capture, and associated electronic device - Google Patents

Method for image stabilization during capture, and associated electronic device

Info

Publication number
WO2022250305A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
electronic device
photodiode
amount
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2022/005963
Other languages
English (en)
Korean (ko)
Inventor
이정원
박재형
송원석
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of WO2022250305A1
Legal status: Ceased

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682 Vibration or motion blur correction
    • H04N23/685 Vibration or motion blur correction performed by mechanical compensation
    • H04N23/687 Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00 Adjustment of optical system relative to image or object surface other than for focusing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2205/00 Adjustment of optical system relative to image or object surface other than for focusing
    • G03B2205/0007 Movement of one or more optical elements for control of motion blur

Definitions

  • Embodiments disclosed in this document relate to an electronic device and method for compensating for shake during camera shooting.
  • The shake correction function is essential for obtaining clear pictures.
  • Shake correction methods include optical image stabilization (OIS) and digital image stabilization (DIS): the optical method reduces shake by moving a lens or an image sensor, and the digital method reduces shake through digital processing of the captured image.
  • Portable terminals typically drive optical shake correction and digital shake correction together to perform shake correction over a wide range.
  • For shake correction to be performed smoothly, optical image stabilization (OIS) and digital image stabilization (DIS) must be synchronized.
  • To achieve this synchronization, a portable terminal must employ an image sensor capable of transmitting image frames, and motion data (e.g., OIS data and/or gyro data) associated with those frames, to a processor.
  • For an electronic device to output shake information (e.g., OIS data and/or gyro data) synchronized with a captured image frame, the hardware configuration becomes complicated and costly.
  • For example, the electronic device may provide the processor with shake information synchronized to the vertical synchronization signal (Vsync) of each image frame, but this also requires a separate hardware configuration (e.g., a dedicated OIS micro controller unit (MCU)) to transmit the shake information synchronized with the image frame.
  • Various embodiments of the present disclosure may provide a method and an electronic device for obtaining OIS information synchronized with an image by calculating a lens movement amount through an image sensor composed of multiple photodiodes (i.e., a multi-PD image sensor).
  • An electronic device according to an embodiment may include a lens assembly including a lens; an OIS actuator for moving the lens assembly in a direction perpendicular to an optical axis; an image sensor including a micro lens and at least two photodiodes corresponding to the micro lens; at least one motion sensor; a memory; and at least one processor operatively connected with the lens assembly, the OIS actuator, the image sensor, the at least one motion sensor, and the memory. The at least one processor may acquire at least one image frame through the image sensor, identify a first position of the lens based on a first amount of light obtained through a first photodiode and a second amount of light obtained through a second photodiode, determine a movement amount of the lens moved by the OIS actuator based on a reference position of the lens and the first position of the lens, and perform digital image stabilization on the at least one image frame based on the movement amount of the lens.
  • A method of operating an electronic device according to an embodiment may include acquiring at least one image frame through an image sensor; identifying a first position of the lens based on a first amount of light acquired through a first photodiode and a second amount of light acquired through a second photodiode; identifying a movement amount of the lens moved by the OIS actuator based on a reference position of the lens and the first position of the lens; and performing digital image stabilization (DIS) on the at least one image frame based on the movement amount of the lens.
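  • As a rough illustration of this claimed flow (a sketch only, not the claim language; the function names, the NSD form, and the linear NSD-to-shift model are assumptions introduced here), a frame-by-frame pipeline might look like the following:

```python
import numpy as np

def stabilize_frame(frame, left_pd, right_pd, ref_nsd, gyro_motion_px, gain_px):
    """Hypothetical sketch: estimate how far the OIS actuator has shifted the
    lens from the multi-PD light amounts, then use that shift in DIS."""
    # Normalized shading difference between the two photodiodes
    # (assumed (Y1 - Y2) / (Y1 + Y2) form).
    nsd = (left_pd - right_pd) / (left_pd + right_pd + 1e-9)

    # Deviation of the shading profile from the lens reference position is
    # taken as proportional to the OIS lens shift (assumed linear model).
    lens_shift_px = gain_px * float(np.mean(nsd - ref_nsd))

    # DIS corrects only the motion that OIS did not already remove.
    residual_px = gyro_motion_px - lens_shift_px
    return np.roll(frame, int(round(residual_px)), axis=1)  # crude 1-D shift
```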
  • An electronic device according to an embodiment may include a lens assembly including a lens; an OIS actuator for moving the lens assembly in a direction perpendicular to an optical axis; an image sensor including a micro lens and at least two photodiodes corresponding to the micro lens; at least one motion sensor; a memory; and at least one processor operatively connected with the lens assembly, the OIS actuator, the image sensor, the at least one motion sensor, and the memory. The at least one processor may acquire at least one image frame through the image sensor, acquire data for a first light amount difference based on a first light amount obtained through a first photodiode and a second light amount obtained through a second photodiode, acquire data for a second light amount difference based on a third light amount obtained through the first photodiode and a fourth light amount obtained through the second photodiode in response to shaking of the electronic device, confirm the movement amount of the lens moved by the OIS actuator based on the data for the first light amount difference and the data for the second light amount difference, and perform shake correction on the at least one image frame based on the confirmed lens movement amount and the movement of the electronic device acquired through the at least one motion sensor.
  • According to various embodiments, OIS driving information may be obtained from the input image itself.
  • Cost can be reduced because no separate hardware configuration is required to synchronize OIS and VDIS.
  • Because the OIS movement amount is determined directly from the captured image, errors caused by external factors such as temperature change and the passage of time, which occur when separate hardware is used, can be prevented.
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments.
  • FIG. 2 is a block diagram illustrating a camera module according to various embodiments.
  • FIG. 3 is a block diagram illustrating a camera module according to an embodiment.
  • FIG. 4 is a diagram illustrating photodiodes included in pixels of an image sensor according to an exemplary embodiment.
  • FIG. 5 is a diagram illustrating arrangement of photodiodes corresponding to one pixel according to an exemplary embodiment.
  • FIG. 6 is a diagram illustrating an amount of light incident on pixels according to an exemplary embodiment.
  • FIG. 7 is a two-dimensional diagram illustrating a change in an optical center due to an OIS operating to compensate for an effect caused by shaking of an electronic device according to an exemplary embodiment.
  • FIG. 8 is a 3D diagram illustrating a change in an optical center due to shaking of the electronic device in an electronic device according to an exemplary embodiment.
  • FIG. 9A is a flowchart illustrating a process in which an electronic device according to an embodiment analyzes optical characteristics of a lens using multiple photodiodes and performs shake correction based on the analysis.
  • FIG. 9B shows a graph of the per-pixel average luminance value and the normalized shading difference (NSD) for an image obtained by photographing a subject without a pattern, and a graph of the per-pixel average luminance value and NSD for an image obtained by photographing a subject with a pattern, in an electronic device according to an embodiment.
  • FIG. 10A shows an image obtained when a subject without a pattern is photographed.
  • FIG. 10B shows an image obtained when a subject with a pattern is photographed.
  • FIG. 11A is a graph illustrating luminance values obtained for each pixel when the image of FIG. 10A is captured in an electronic device according to an embodiment.
  • FIG. 11B is a graph illustrating luminance values obtained for each pixel when the image of FIG. 10B is captured in an electronic device according to an embodiment.
  • FIG. 12A is a graph illustrating average values of normalized shading differences (NSDs) when the image of FIG. 10A is captured in an electronic device according to an embodiment.
  • FIG. 12B is a graph illustrating average values of normalized shading differences (NSDs) when the image of FIG. 10B is captured in an electronic device according to an embodiment.
  • FIG. 13A is a graph illustrating an OIS movement amount detected by an electronic device according to an embodiment.
  • FIG. 13B is a graph illustrating initial NSD data acquired by an electronic device according to an embodiment.
  • FIG. 13C is a graph showing NSD data calculated by correcting for the OIS movement amount in an electronic device according to an embodiment.
  • FIG. 14 is a diagram for explaining an operation of compensating for shake based on first motion information acquired from a motion sensor and second motion information generated by OIS driving in an electronic device according to an embodiment.
  • FIG. 15 is a diagram illustrating an operation of correcting an image frame line by line in an electronic device according to an embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 within a network environment 100 according to various embodiments.
  • an electronic device 101 may communicate with an electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with an electronic device 104 or a server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • in some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added.
  • in some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
  • the processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor. The auxiliary processor 123 may be implemented separately from, or as part of, the main processor 121.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), in place of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-running) state.
  • according to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • AI models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself where artificial intelligence is performed, or may be performed through a separate server (eg, the server 108).
  • the learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to these examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • Artificial neural networks include deep neural networks (DNNs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), restricted boltzmann machines (RBMs), deep belief networks (DBNs), bidirectional recurrent deep neural networks (BRDNNs), It may be one of deep Q-networks or a combination of two or more of the foregoing, but is not limited to the foregoing examples.
  • the artificial intelligence model may include, in addition or alternatively, software structures in addition to hardware structures.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, program 140) and commands related thereto.
  • the memory 130 may include volatile memory 132 or non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used by a component (eg, the processor 120) of the electronic device 101 from the outside of the electronic device 101 (eg, a user).
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • a receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch or a pressure sensor configured to measure the intensity of force generated by the touch.
  • the audio module 170 may convert sound into an electrical signal, or vice versa. According to an embodiment, the audio module 170 may acquire sound through the input module 150, or output sound through the sound output module 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a bio sensor, a temperature sensor, a humidity sensor, or a light sensor.
  • the interface 177 may support one or more designated protocols that may be used to directly or wirelessly connect the electronic device 101 to an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 may be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert electrical signals into mechanical stimuli (eg, vibration or motion) or electrical stimuli that a user may perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to one embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as at least part of a power management integrated circuit (PMIC), for example.
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • a corresponding one of these communication modules may communicate with an external electronic device via the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, the Internet, or a computer network).
  • the wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the wireless communication module 192 may support a 5G network, after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology. NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or ultra-reliable and low-latency communications (URLLC).
  • the wireless communication module 192 may support a high frequency band (eg, mmWave band) to achieve a high data rate, for example.
  • the wireless communication module 192 may support various technologies for securing performance in a high frequency band, such as beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
  • the wireless communication module 192 may support various requirements defined for the electronic device 101, an external electronic device (eg, the electronic device 104), or a network system (eg, the second network 199).
  • the wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), loss coverage for realizing mMTC (e.g., 164 dB or less), or U-plane latency for realizing URLLC (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or 1 ms or less round trip).
  • the antenna module 197 may transmit or receive signals or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may then be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • according to some embodiments, a component other than the radiator (e.g., a radio frequency integrated circuit (RFIC)) may additionally be formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • the mmWave antenna module may include a printed circuit board; an RFIC disposed on, or adjacent to, a first surface (e.g., the lower surface) of the printed circuit board and capable of supporting a designated high frequency band (e.g., the mmWave band); and a plurality of antennas (e.g., an array antenna) disposed on, or adjacent to, a second surface (e.g., the top surface or a side surface) of the printed circuit board and capable of transmitting or receiving signals of the designated high frequency band.
  • at least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or part of operations executed in the electronic device 101 may be executed in one or more external electronic devices among the external electronic devices 102 , 104 , or 108 .
  • when the electronic device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may request one or more external electronic devices to perform the function or at least part of the service, instead of executing the function or service by itself.
  • One or more external electronic devices receiving the request may execute at least a part of the requested function or service or an additional function or service related to the request, and deliver the execution result to the electronic device 101 .
  • the electronic device 101 may provide the result, as it is or after additional processing, as at least part of a response to the request.
  • to this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks. According to an embodiment, the external electronic device 104 or server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to intelligent services (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • FIG. 2 is a block diagram 200 illustrating a camera module 180 according to various embodiments.
  • the camera module 180 may include a lens assembly 210, a flash 220, an image sensor 230, an image stabilizer 240, a memory 250 (e.g., a buffer memory), or an image signal processor 260.
  • the lens assembly 210 may collect light emitted from a subject that is an image capturing target.
  • the lens assembly 210 may include one or more lenses.
  • the camera module 180 may include a plurality of lens assemblies 210. In this case, the camera module 180 may form, for example, a dual camera, a 360-degree camera, or a spherical camera.
  • Some of the plurality of lens assemblies 210 may have the same lens properties (e.g., angle of view, focal length, auto focus, f number, or optical zoom), or at least one lens assembly may have one or more lens properties different from those of the other lens assemblies.
  • the lens assembly 210 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 220 may emit light used to enhance light emitted or reflected from a subject.
  • the flash 220 may include one or more light emitting diodes (eg, a red-green-blue (RGB) LED, a white LED, an infrared LED, or an ultraviolet LED), or a xenon lamp.
  • the image sensor 230 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 210 into an electrical signal.
  • the image sensor 230 may include, for example, one image sensor selected from among image sensors having different properties, such as an RGB sensor, a black and white (BW) sensor, an IR sensor, or a UV sensor; a plurality of image sensors having the same property; or a plurality of image sensors having different properties.
  • Each image sensor included in the image sensor 230 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • the image stabilizer 240 may move at least one lens included in the lens assembly 210, or the image sensor 230, in a specific direction, or control operating characteristics of the image sensor 230 (e.g., adjust read-out timing), in response to movement of the camera module 180 or of the electronic device 101 including it. This makes it possible to compensate for at least part of the negative effect of the movement on the image being captured.
  • according to an embodiment, the image stabilizer 240 may detect such movement of the camera module 180 or the electronic device 101 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 180.
  • the image stabilizer 240 may be implemented as, for example, an optical image stabilizer.
  • the memory 250 may at least temporarily store at least a portion of an image acquired through the image sensor 230 for a subsequent image processing task. For example, when image acquisition is delayed by the shutter, or a plurality of images are acquired at high speed, the acquired original image (e.g., a Bayer-patterned image or a high-resolution image) may be stored in the memory 250, and a corresponding copy image (e.g., a low-resolution image) may be previewed through the display module 160. Thereafter, when a specified condition is satisfied (e.g., a user input or a system command), at least part of the original image stored in the memory 250 may be obtained and processed by, for example, the image signal processor 260. According to one embodiment, the memory 250 may be configured as at least part of the memory 130, or as a separate memory operated independently of it.
  • the image signal processor 260 may perform one or more image processes on an image acquired through the image sensor 230 or an image stored in the memory 250 .
  • the one or more image processes may include, for example, depth map generation, 3D modeling, panorama generation, feature point extraction, image synthesis, or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening).
  • additionally or alternatively, the image signal processor 260 may control at least one of the components included in the camera module 180 (e.g., the image sensor 230), for example, by controlling exposure time or read-out timing.
  • the image processed by the image signal processor 260 may be stored back in the memory 250 for further processing.
  • the image signal processor 260 may be configured as at least a part of the processor 120 or may be configured as a separate processor that operates independently of the processor 120.
  • when the image signal processor 260 is configured as a separate processor from the processor 120, at least one image processed by the image signal processor 260 may be displayed through the display module 160 as it is, or after additional image processing by the processor 120.
  • the electronic device 101 may include a plurality of camera modules 180 each having different properties or functions.
  • at least one of the plurality of camera modules 180 may be a wide-angle camera, and at least the other may be a telephoto camera.
  • at least one of the plurality of camera modules 180 may be a front camera, and at least another one may be a rear camera.
  • FIG. 3 is a block diagram illustrating a camera module 300 according to an embodiment.
  • the camera module 300 may correspond to the camera module 180 of FIGS. 1 and 2 .
  • the camera module 300 may include an image acquisition module 301, an image signal processor (ISP) 303, a memory 250, or a combination thereof.
  • the image signal processor 303 may correspond to the image signal processor 260 of FIG. 2 .
  • the memory 250 may store a plurality of reference luminance ratios respectively corresponding to different optical characteristics (eg, an F value, a position of the lens 320, or a tilt degree of the lens 320).
  • the reference luminance ratio may be a ratio of luminance values between at least two photodiodes with respect to a previously obtained reference image having a predetermined optical characteristic. For example, when the electronic device 101 acquires a first image frame and a second image frame obtained before the first image frame, the reference luminance ratio may be understood as the luminance ratio corresponding to the second image frame.
  • the image acquisition module 301 may include an image sensor 230, an aperture 310, a lens 320, an actuator 330, or a combination thereof.
  • although the diaphragm 310 is illustrated as being located in front of the lens 320 (e.g., in the direction relatively closer to the incident light), this is merely an example, and the position of the diaphragm is not limited thereto.
  • the diaphragm 310 may be positioned between the lens 320 and the image sensor 230 .
  • lens 320 may consist of one or more lenses.
  • the image sensor 230 may correspond to the image sensor 230 of FIG. 2 .
  • the lens 320 may be included in the lens assembly 210 of FIG. 2 .
  • the image signal processor 303 may include an image processing unit 340, a pre-processing unit 350, noise reduction units 361, 362, 363, 364, 366, and 367, ratio calculation units 371 and 373, a movement amount detector 381, or a combination thereof.
  • the image processing unit 340, the pre-processing unit 350, the noise reduction units 361, 362, 363, 364, 366, and 367, the ratio calculation units 371 and 373, the movement amount detector 381, or a combination thereof may be implemented as a hardware circuit.
  • alternatively, the image processing unit 340, the pre-processing unit 350, the noise reduction units 361, 362, 363, 364, 366, and 367, the ratio calculation units 371 and 373, the movement amount detector 381, or a combination thereof may be implemented as software that runs on the image signal processor 303.
  • at least one operation performed by the image signal processor 303 may be performed by an application processor (AP).
  • in this case, data (e.g., image data, motion data, and/or the lens movement amount due to OIS) may be transmitted to the application processor (AP), which may compensate for shake (e.g., by VDIS, video digital image stabilization).
  • the diaphragm 310 may adjust the size of the entrance pupil under the control of the processor 120 or the image signal processor 303. In one embodiment, when the size of the entrance pupil of the diaphragm 310 increases, the amount of light reaching the image sensor 230 may increase. In one embodiment, when the size of the entrance pupil of the diaphragm 310 decreases, the amount of light reaching the image sensor 230 may decrease.
  • the lens 320 may condense light incident through the diaphragm 310 .
  • the actuator 330 may move the lens 320 to one of different positions 331 to 339 under the control of the image signal processor 303 .
  • the image sensor 230 may detect a signal corresponding to light passing through the diaphragm 310 and the lens 320.
  • Light information of a subject incident through the lens assembly 210 may be converted into an electrical signal by the image sensor 230 and input to the image signal processor 303 .
  • the image sensor 230 may be a multi-PD sensor.
  • the image sensor 230 may include a plurality of pixels, and each of the plurality of pixels may include at least two or more photodiodes.
  • the image sensor 230 may be understood or referred to as a multiple photodiode sensor such as a dual photodiode sensor or a triple photodiode sensor according to the number of photodiodes included in each pixel.
  • in an embodiment, a first pixel included in the image sensor 230 (e.g., the first pixel 410 of FIG. 4) may include a first photodiode (e.g., the first photodiode 415L of FIG. 4) and a second photodiode (e.g., the second photodiode 415R of FIG. 4).
  • in an embodiment, the first pixel 410 included in the image sensor 230 may include a first photodiode (e.g., the first photodiode 555L of FIG. 5), a second photodiode (e.g., the second photodiode 555M of FIG. 5), and a third photodiode (e.g., the third photodiode 555R of FIG. 5).
  • the electronic device 101 may control the movement of the lens 320 and/or the lens assembly 210 through the actuator 330 .
  • the actuator 330 may include an auto focus (AF) actuator and/or an optical image stabilization (OIS) actuator.
  • at least one processor (e.g., the processor 120 of FIG. 1) may generate an OIS control value corresponding to the shaking of the electronic device 101 (e.g., hand shake) in order to compensate for it.
  • the electronic device 101 may perform shake correction by transmitting an electrical signal corresponding to the generated OIS control value to at least one coil of the OIS actuator.
  • at least one processor (e.g., the processor 120 of FIG. 1) may implement AF by transmitting an electrical signal to at least one coil of the AF actuator.
  • the preprocessor 350 may provide the first pixel value and the second pixel value to the noise reducers 361 , 362 , 363 , or 364 .
  • the first pixel value may include output values of the first photodiode and the second photodiode.
  • the second pixel value may include output values of the third photodiode and the fourth photodiode.
  • the preprocessor 350 may provide the first pixel value to the noise reducers 361 and 362 .
  • the preprocessor 350 may provide the second pixel value to the noise reducers 363 and 364 .
  • the number of noise reduction units may be determined by the number or combination of photodiodes included in a pixel. For example, when there are two photodiodes included in one pixel, since there are two photodiode output values, two corresponding noise reduction units may exist. For example, when there are three photodiodes included in one pixel, there are three photodiode output values, and thus three noise reduction units corresponding thereto may exist.
  • each of the noise reduction units 361, 362, 363, and 364 may reduce noise of input data (eg, a first pixel value and/or a second pixel value).
  • each of the noise reduction units 361 , 362 , 363 , and 364 may provide noise-reduced data to the ratio calculation units 371 and 373 .
  • the noise reduction units 361, 362, 363, and 364 may reduce noise by attenuating high-frequency components of the input data using binning, averaging, interpolation, low-pass filtering, or a combination thereof.
  • the noise reduction units 361, 362, 363, and 364 may reduce noise according to the phase by correcting the phase difference of the input data.
  • the noise reducers 361 and 362 may reduce phase difference noise of light sensed by a first photodiode and a second photodiode forming a multi-photodiode structure with the first photodiode.
  • the noise reducers 363 and 364 may reduce phase difference noise of light sensed by the third photodiode and the fourth photodiode forming a multi-photodiode structure with the third photodiode.
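  • One plausible reading of this noise-reduction stage, combining the binning and low-pass filtering named above (the bin size, kernel width, and function names are assumptions, not the patented implementation), is sketched below:

```python
import numpy as np

def reduce_noise(pd_plane, bin_factor=2, kernel=5):
    """Attenuate high-frequency components of one photodiode luminance plane,
    in the spirit of units 361-364: block-average binning followed by a
    separable moving-average (box) low-pass filter."""
    h, w = pd_plane.shape
    h2, w2 = h // bin_factor * bin_factor, w // bin_factor * bin_factor
    # Binning: average non-overlapping bin_factor x bin_factor blocks.
    binned = pd_plane[:h2, :w2].reshape(
        h2 // bin_factor, bin_factor, w2 // bin_factor, bin_factor
    ).mean(axis=(1, 3))
    # Separable box filter: smooth rows, then columns.
    k = np.ones(kernel) / kernel
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, binned)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)
    return out
```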
  • the ratio calculators 371 and 373 may calculate a luminance ratio from the data whose noise (e.g., phase difference noise) has been reduced by the noise reduction units 361, 362, 363, and 364 (e.g., noise-reduced pixel luminance values and/or noise-reduced photodiode luminance values).
  • the first ratio calculator 371 may calculate a first luminance ratio based on the noise-reduced (e.g., phase-difference-noise-reduced) data from the first noise reducer 361 and the second noise reducer 362.
  • the second ratio calculator 373 may calculate a second luminance ratio based on the noise-reduced (e.g., phase-difference-noise-reduced) data from the third noise reducer 363 and the fourth noise reducer 364.
  • the ratio calculators 371 and 373 may transmit the first luminance ratio and the second luminance ratio to the noise reducers 366 and 367 .
  • the luminance ratio may be understood or referred to hereinafter as the normalized shading difference (NSD) or ΔS.
  • the noise reduction units 366 and 367 may reduce noise of the NSD obtained from the ratio calculation units 371 and 373 .
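  • A minimal sketch of the NSD computation described here; the normalized-difference form (Y1 - Y2) / (Y1 + Y2) and the patch size are assumptions consistent with the luminance-ratio description, since the exact formula is not spelled out in this passage:

```python
import numpy as np

def normalized_shading_difference(y1, y2, eps=1e-9):
    """NSD (Delta-S) between two noise-reduced photodiode luminance planes.
    The normalized-difference form is an assumption, not a quoted formula."""
    return (y1 - y2) / (y1 + y2 + eps)

def patch_mean_nsd(nsd, k=16):
    """Average the NSD over k x k patches (cf. the patch W_k mentioned later
    in the text); the patch size k is an assumption."""
    h, w = nsd.shape
    h2, w2 = h // k * k, w // k * k
    return nsd[:h2, :w2].reshape(h2 // k, k, w2 // k, k).mean(axis=(1, 3))

# The ratio calculators 371/373 would each produce one such NSD plane, e.g.:
# nsd_lr = normalized_shading_difference(y_left, y_right)
```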
  • the movement amount detector 381 may obtain data for detecting the movement amount of the lens 320 by OIS driving from the noise reducers 366 and 367 and the memory 250 .
  • the movement amount detection unit 381 may obtain previously stored information about the reference luminance ratio, the reference NSD value, and the position of the reference lens 320 from the memory 250 .
  • the reference luminance ratio, the reference NSD value, and the position of the reference lens 320 may be data corresponding to an image frame acquired before the currently acquired image frame or data corresponding to a viewpoint of a previously acquired image frame.
  • based on the data acquired from the memory and the data (e.g., lens movement amount, NSD value, luminance ratio) corresponding to the image frame currently being acquired, the movement amount detector 381 may detect an OIS movement amount corresponding to the image frame currently being acquired.
  • the OIS movement amount may be understood as a movement amount of the lens 320 moved by OIS driving.
  • the movement amount detector 381 may transmit the detected OIS movement amount to the VDIS module 390.
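  • One way to picture the movement amount detector 381 (the calibration table and linear interpolation below are assumptions; the text only says that reference data stored in memory is compared against data from the frame currently being acquired):

```python
import numpy as np

def detect_ois_movement(current_nsd, calib_shifts_um, calib_nsds):
    """Estimate the OIS lens shift of the current frame by comparing its mean
    NSD against reference NSD values stored in memory for known lens
    positions (a hypothetical calibration table, sorted by NSD)."""
    mean_nsd = float(np.mean(current_nsd))
    # Interpolate the lens shift from the NSD -> shift calibration curve.
    return float(np.interp(mean_nsd, calib_nsds, calib_shifts_um))

# Illustration only (fabricated calibration points):
# shift_um = detect_ois_movement(nsd_plane, [-50.0, 0.0, 50.0], [-0.02, 0.0, 0.02])
```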
  • the motion sensor 383 may include an acceleration sensor, a gyroscope sensor, or a magnetic sensor.
  • the acceleration sensor is a sensor configured to measure acceleration acting on three axes (e.g., the X, Y, and Z axes) of the electronic device 101; forces applied to the electronic device 101 may be measured, estimated, and/or sensed using the measured acceleration.
  • the gyro sensor is a sensor configured to measure angular velocities about three axes (e.g., the X, Y, and Z axes) of the electronic device 101; the amount of rotation of the electronic device 101 about each axis may be measured and/or sensed using the measured angular velocity information for each axis.
  • the motion sensor 383 is illustrated as being included in a block of the image signal processor 303, but may be separate hardware from the image signal processor 303. In one embodiment, the motion sensor 383 may transmit acquired motion data to the VDIS module 390.
  • the VDIS module 390 may determine a correction value to be additionally corrected through data obtained from the movement amount detector 381 and the motion sensor 383 .
  • the VDIS module 390 may perform VDIS by applying the correction value to the image frame obtained from the image processing unit 340 .
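  • A sketch of how the VDIS module 390 might combine its two inputs; the subtractive model (VDIS corrects only the motion that OIS has not already removed) and the crop-shift warp are assumptions introduced here:

```python
import numpy as np

def vdis_correction(gyro_shift_px, ois_shift_px):
    """Residual correction: total image motion measured via the motion
    sensor 383, minus the motion already removed optically as reported by
    the movement amount detector 381 (subtractive model is an assumption)."""
    return gyro_shift_px - ois_shift_px

def apply_vdis(frame, dx_px, dy_px):
    """Minimal stand-in for the VDIS warp: integer shift of the frame. A
    real VDIS would warp with sub-pixel interpolation and crop margins."""
    return np.roll(frame, (int(round(dy_px)), int(round(dx_px))), axis=(0, 1))
```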
  • FIG. 4 is a diagram illustrating photodiodes included in a pixel 410 of an image sensor 230 according to an exemplary embodiment.
  • the image sensor 230 may include a pixel array 401 .
  • the pixel array 401 may be referred to or understood as an array of photodiodes.
  • the pixel array 401 may include a plurality of pixels.
  • the pixel array 401 may include an array of individual pixels such as the first pixel 410 (eg, an m ⁇ n array, where m and n are natural numbers).
  • the pixel array 401 may be located on a plane perpendicular to a Z-axis (eg, an optical axis) corresponding to a direction in which light is incident.
  • a first direction (eg, an X-axis direction) of the pixel array 401 may be perpendicular to a second direction (eg, a Y-axis direction) of the pixel array 401 .
  • the first direction (eg, the X-axis direction) and the second direction (eg, the Y-axis direction) may be perpendicular to the Z-axis direction.
  • the pixel array 401 includes at least one micro lens, an infrared cut filter (IR cut filter), a color filter array (CFA), and at least one antireflection film.
  • the first pixel 410 may include a micro lens 411, an infrared cut filter 412, a color filter 413, an antireflection film 414, at least two photodiodes 415L and 415R, or a combination thereof.
  • a photodiode may also be referred to as a light receiving element.
  • a configuration in which at least two photodiodes (e.g., 415L and 415R) are included in one pixel (e.g., the first pixel 410) may be referred to as a multi-photodiode (multi-PD) structure.
  • the micro lens 411 may condense light incident on the micro lens 411 . In an embodiment, the micro lens 411 may adjust a path of light incident on the micro lens 411 so that the light reaches the first photodiode 415L and the second photodiode 415R.
  • the infrared cut filter 412 may block at least some infrared rays among the light incident through the micro lens 411 .
  • the infrared cut filter 412 may be disposed between the micro lens 411 and the color filter 413, but is not limited thereto and may be disposed in various positions.
  • the color filter 413 may pass light of a predetermined color (or color channel).
  • the color filter 413 of each of the plurality of pixels (e.g., the first pixel 410 of FIG. 4) may pass light of one predetermined color (e.g., red, blue, or green) according to a predetermined pattern (e.g., a Bayer pattern).
  • the color filter 413 may block light of a color other than a pre-designated color (or color channel).
  • the anti-reflection film 414 may prevent light incident through the micro lens 411 from being reflected to the outside.
  • the first sub photodiode 415L and the second sub photodiode 415R may output values corresponding to incident light. In an embodiment, the first sub photodiode 415L and the second sub photodiode 415R may output values corresponding to incident light based on the photoelectric effect. In an embodiment, the first sub photodiode 415L and the second sub photodiode 415R may output a value corresponding to the intensity (or illuminance) of incident light based on the photoelectric effect.
  • the first sub photodiode 415L and the second sub photodiode 415R may generate charges according to the intensity (or illuminance) of incident light based on the photoelectric effect. In an embodiment, the first sub photodiode 415L and the second sub photodiode 415R may output current according to the amount of generated charge. In an embodiment, a value output by the first sub photodiode 415L may also be referred to as a first detection value. In an embodiment, a value output by the second sub photodiode 415R may also be referred to as a second sensing value.
  • One pixel (eg, the first pixel 410) may include two or more sub photodiodes at different positions.
  • the image sensor 230 may provide a value corresponding to the detected light intensity to the image signal processor 303 .
  • in an embodiment, the image sensor 230 may provide the image signal processor 303 with the first detection value output by the first photodiode 415L and the second detection value output by the second photodiode 415R of each pixel (e.g., the first pixel 410 of FIG. 4).
  • in an embodiment, the image sensor 230 may instead provide the image signal processor 303 with the sum of the first detection value output by the first photodiode 415L and the second detection value output by the second photodiode 415R of each pixel (e.g., the first pixel 410 of FIG. 4).
  • the image sensor 230 may also provide a low-resolution luminance signal to the image signal processor 303.
  • the low-resolution luminance signal may correspond to a value obtained by summing outputs of adjacent pixels among pixels (eg, the first pixel 410 of FIG. 4 ).
  • the adjacent pixels may be pixels to which different colors are assigned among neighboring pixels.
  • the pre-processor 350 may output luminance signals based on the data from the image sensor 230 (e.g., the first and second detection values, or a sum value and a low-resolution luminance signal).
  • the preprocessor 350 may output a first luminance signal corresponding to a first detection value output by the first photodiode 415L of the pixel 410 .
  • the preprocessor 350 may output a second luminance signal corresponding to the second detection value output by the second photodiode 415R of the pixel 410 .
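  • To make the pre-processing step concrete, the sketch below splits a dual-PD readout into the two luminance planes described above; the interleaved column layout is an assumption, since actual sensor readout formats vary:

```python
import numpy as np

def split_dual_pd(raw):
    """Split a dual-PD raw readout into first/second luminance planes,
    assuming the two photodiode values of each pixel are interleaved along
    the row axis (even columns: 415L, odd columns: 415R). This layout is
    only illustrative; real sensors expose various readout formats."""
    y1 = raw[:, 0::2].astype(np.float64)  # first detection values
    y2 = raw[:, 1::2].astype(np.float64)  # second detection values
    return y1, y2
```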
  • FIG. 5 is a diagram illustrating arrangement of photodiodes corresponding to one pixel according to an exemplary embodiment.
  • a pixel 510 may include a first photodiode 515L disposed on the left side (e.g., the -x direction) and a second photodiode 515R disposed on the right side (e.g., the +x direction).
  • light having different phases and/or different intensities (or illuminances) may be incident to the left and right photodiodes 515L and 515R in a horizontal direction.
  • a pixel 520 may include a first photodiode 525T disposed on the upper side (e.g., the +y direction) and a second photodiode 525B disposed on the lower side (e.g., the -y direction).
  • light having different phases and/or different intensities (or illuminances) may be incident in a vertical direction to the sub photodiodes 525T and 525B disposed vertically.
  • a pixel 530 may include a first photodiode 535T disposed on the upper side (e.g., the +y direction), a second photodiode 535B disposed on the lower side (e.g., the -y direction), a third photodiode 535L disposed on the left side (e.g., the -x direction), and a fourth photodiode 535R disposed on the right side (e.g., the +x direction).
  • Light having different phases and/or different intensities (or illuminances) may be incident to the four photodiodes 535T, 535B, 535L, and 535R in a horizontal direction and/or a vertical direction.
  • in an embodiment, among the four photodiodes 535T, 535B, 535L, and 535R, light having different phases and/or different intensities (or illuminances) may be incident on the pair of photodiodes 535T and 535L and the pair of photodiodes 535B and 535R along a first diagonal direction (e.g., 135 degrees clockwise from the x-axis). In an embodiment, light having different phases and/or different intensities (or illuminances) may be incident on the pair of photodiodes 535B and 535L and the pair of photodiodes 535T and 535R along a second diagonal direction (e.g., 45 degrees counterclockwise from the x-axis). In one embodiment, the first diagonal direction and the second diagonal direction may be perpendicular to each other.
  • a pixel 540 may include a first photodiode 545LT disposed at the upper left (e.g., 135 degrees clockwise from the x-axis), a second photodiode 545RT disposed at the upper right (e.g., 45 degrees clockwise from the x-axis), a third photodiode 545LB disposed at the lower left (e.g., 135 degrees counterclockwise from the x-axis), and a fourth photodiode 545RB disposed at the lower right (e.g., 45 degrees counterclockwise from the x-axis).
  • Light having different phases and/or different intensities may be incident to the four photodiodes 545LT, 545RT, 545LB, and 545RB in a horizontal direction and/or a vertical direction.
  • among the four photodiodes 545LT, 545RT, 545LB, and 545RB, the diagonally positioned photodiodes 545LT and 545RB may receive light having different phases and/or different intensities (or illuminances) along a third diagonal direction.
  • among the four photodiodes 545LT, 545RT, 545LB, and 545RB, the diagonally positioned photodiodes 545RT and 545LB may receive light having different phases and/or different intensities (or illuminances) along a fourth diagonal direction.
  • the third diagonal direction and the fourth diagonal direction may be perpendicular to each other.
  • the number of NSDs may be set in various ways according to the number of photodiodes included in one pixel and the summing method. For example, when one pixel includes four photodiodes, four NSDs may be set; when an NSD is calculated by summing two photodiodes at a time, three NSDs may be set.
  • FIG. 6 is a diagram illustrating the amount of light incident on the pixels 610, 620, and 630 according to an exemplary embodiment.
  • light may pass through a lens 320 and then be incident on pixels 610 , 620 , and 630 .
  • the intensity (or illuminance) of light incident on the pixels 610 , 620 , and 630 varies according to positions in the lens 320 through which the light passes.
  • one or more pixels 641 may be disposed between the pixels 610 and 620 .
  • one or more pixels 645 may be disposed between the pixels 620 and 630 .
  • different amounts of light may be incident to the photodiodes 611 and 615 of the pixel 610 .
  • for example, referring to FIG. 6, the area over which light passing through the right side of the lens 320 is incident on the photodiode 615 of the pixel 610 may be smaller than the area over which it is incident on the photodiode 611 of the pixel 610. Also, for example, the area over which light passing through the left side of the lens 320 is incident on the photodiode 631 of the pixel 630 may be smaller than the area over which it is incident on the photodiode 635 of the pixel 630.
  • the intensity of light incident on the respective photodiodes 611, 615, 621, 625, 631, and 635 may differ according to the positions of the pixels 610, 620, and 630. In an embodiment, as the intensities of light incident on the photodiodes 611 and 615 of the pixel 610 differ, the photodiodes 611 and 615 may have different outputs. Likewise, the photodiodes 621 and 625 of the pixel 620, and the photodiodes 631 and 635 of the pixel 630, may have different outputs as the intensities of incident light differ.
  • FIG. 7 is a two-dimensional view illustrating a change in an optical center due to an OIS operating to compensate for an effect caused by shaking of the electronic device 101 in the electronic device 101 according to an exemplary embodiment.
  • FIG. 8 is a three-dimensional view illustrating a change in an optical center due to shaking of the electronic device 101 in the electronic device 101 according to an exemplary embodiment.
  • the optical center changes due to shaking of the electronic device 101 (eg, shaking due to hand shaking).
• In an embodiment, the change in the optical center can be understood as a change in the focal position of the lens 320 or in the center position of the lens 320, caused by the relative change in the position of the lens 320 within the camera module 180 due to the shaking.
• θ_i^o is the angle of the i-th pixel (P_i) from the center of the first optical axis (O_o) with respect to the optical axis (eg, the z-axis), and θ_i^c is the angle of the i-th pixel (P_i) from the center of the second optical axis (O_c) with respect to the optical axis (eg, the z-axis).
• θ_j^o is the angle of the j-th pixel (P_j) from the center of the first optical axis (O_o) with respect to the optical axis (eg, the z-axis), and θ_j^c is the angle of the j-th pixel (P_j) from the center of the second optical axis (O_c) with respect to the optical axis (eg, the z-axis).
• θ_k^o is the angle of the k-th pixel (P_k) from the center of the first optical axis (O_o) with respect to the optical axis (eg, the z-axis), and θ_k^c is the angle of the k-th pixel (P_k) from the center of the second optical axis (O_c) with respect to the optical axis (eg, the z-axis).
  • the center of the first optical axis O o may be understood as a center of an optical axis corresponding to an initial optical axis.
  • the center of the second optical axis Oc may be understood as the center of an optical axis corresponding to the current optical axis.
  • the initial optical axis may refer to an optical axis at a point before the current optical axis.
• ΔS may be a normalized shading difference (NSD).
• ΔS may be referred to as a luminance ratio or a light quantity ratio between a first photodiode (eg, the first photodiode 415L of FIG. 4) and a second photodiode (eg, the second photodiode 415R of FIG. 4).
• Y1 and Y2 may be luminance values (or brightness values, or amounts of light) obtained through the multiple photodiodes: Y1 may be a luminance value of a first photodiode (eg, 415L) constituting the multiple photodiodes, and Y2 may be a luminance value of a second photodiode (eg, 415R) constituting the multiple photodiodes.
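Taken together, the definitions above suggest an NSD of the following form (a reconstruction offered for readability, since the equation itself is not reproduced in this text; the exact normalization used in the disclosure may differ):

```latex
\Delta S = \frac{Y_1 - Y_2}{Y_1 + Y_2}
```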
• In an embodiment, the NSD value according to the optical axis movement may be calculated using the NSD value in the patch W_k.
• In an embodiment, when the optical axis is moved, a chief ray angle (CRA) of each pixel may be determined to correspond to its new position. For example, a relation such as Equation 2 below may be established for the angles determined for two pixels among P_i, P_k, and P_j when the optical axis is moved.
• ΔS_i^o may indicate an NSD value calculated through the i-th pixel at the center of the first optical axis (O_o).
• ΔS_k^o may indicate an NSD value calculated through the k-th pixel at the center of the first optical axis (O_o).
• ΔS_i^c may indicate an NSD value calculated through the i-th pixel at the center of the second optical axis (O_c).
• ΔS_k^c may indicate an NSD value calculated through the k-th pixel at the center of the second optical axis (O_c).
• In an embodiment, the NSD (ΔS^o) at the center of the first optical axis (O_o) and the NSD (ΔS^e) calculated at the new center of the second optical axis (O_c) have a relationship such as Equation 4 in terms of the angular change amounts Δθ and Δφ.
• In Equation 4, if the movement Δx and Δy produced by the OIS at a fixed position on the optical axis (z-axis) is a relatively small amount, the new NSD can be obtained by a parallel translation of the reference NSD.
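A plausible reading of this parallel-translation property (again a reconstruction, since Equation 4 is not reproduced here) is that the NSD surface measured after the shift is the reference surface translated by the angular change amounts:

```latex
\Delta S^{e}(\theta, \phi) \approx \Delta S^{o}(\theta - \Delta\theta,\ \phi - \Delta\phi)
```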
• ΔS^o is a reference NSD, and ΔS^e may be a newly calculated NSD.
• In an embodiment, the electronic device 101 may detect the OIS movement amount by calculating the parallel translation amount of the current NSD (ΔS^e) with respect to the reference NSD (ΔS^o).
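The following is a minimal sketch of such detection, assuming the reference NSD and the current NSD are available as 2D maps sampled over the sensor and that the translation is found by brute-force search; the map contents, search range, and function name are hypothetical, not from the disclosure.

```python
import numpy as np

def detect_ois_shift(nsd_ref: np.ndarray, nsd_cur: np.ndarray,
                     max_shift: int = 8) -> tuple[int, int]:
    """Estimate the (dx, dy) translation, in map samples, that best aligns
    the reference NSD map with the currently measured NSD map."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(nsd_ref, dy, axis=0), dx, axis=1)
            # Compare only the interior so wrapped-around samples are ignored.
            m = max_shift
            err = np.mean((shifted[m:-m, m:-m] - nsd_cur[m:-m, m:-m]) ** 2)
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

# Toy data: a smooth reference NSD map and a copy shifted by (3, -2) samples.
y, x = np.mgrid[0:64, 0:64]
ref = (x - 32) / 64.0 * (1 + 0.1 * ((y - 32) / 64.0))
cur = np.roll(np.roll(ref, -2, axis=0), 3, axis=1)
print(detect_ois_shift(ref, cur))  # expected: (3, -2)
```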
  • FIG. 9A is a flowchart illustrating a process of analyzing optical characteristics of a lens using multiple photodiodes and performing shake correction based on the analysis in an electronic device according to an embodiment.
• The operation subject of the flowchart illustrated in FIG. 9A may be understood to be an electronic device (eg, the electronic device 101 of FIG. 1), a processor (eg, the processor 120 of FIG. 1), or an image signal processor (eg, the image signal processor 260 of FIG. 2 or the image signal processor 303 of FIG. 3).
  • the electronic device 101 may acquire at least one image frame through an image sensor (eg, the image sensor 230 of FIG. 2 ).
  • the electronic device 101 may drive a camera (eg, the camera module 180 of FIG. 2 ) in response to an input for executing a camera application.
  • the electronic device 101 may sequentially acquire a plurality of image frames output from the image sensor 230 in response to the camera being driven.
  • shaking may occur in the electronic device 101 while the camera is being driven.
  • a plurality of image frames reflecting the shaking of the electronic device 101 may be acquired.
  • the electronic device 101 may obtain an image frame having a field of view (FOV) changed by the shaking.
  • the electronic device 101 may primarily compensate for the shaking. In other words, the electronic device 101 may optically correct the shake through the OIS actuator. The electronic device 101 may acquire at least one image frame in which the shake is optically corrected through OIS driving.
• In an embodiment, the electronic device 101 may identify a first position of the lens (eg, the lens 320 of FIG. 3) based on the first amount of light acquired through the first photodiode (eg, the first photodiode 415L of FIG. 4) and the second amount of light acquired through the second photodiode (eg, the second photodiode 415R of FIG. 4).
• In an embodiment, the first position may be the position of the lens 320 corresponding to a first time point at which a first image frame is obtained using the first photodiode and the second photodiode.
• In an embodiment, the electronic device 101 may determine a ratio between the first amount of light obtained through the first photodiode 415L and the second amount of light obtained through the second photodiode 415R.
• In an embodiment, the electronic device 101 may determine the luminance ratio based on a first luminance value obtained through the first photodiode 415L and a second luminance value obtained through the second photodiode 415R.
  • the electronic device 101 may calculate a normalized shading difference (NSD) value based on the first amount of light and the second amount of light.
• In an embodiment, the NSD may be a value obtained by normalizing, by luminance, the difference in the amount of light entering each photodiode. More specifically, the electronic device 101 may remove phase difference noise from first luminance data obtained through the first photodiode 415L and second luminance data obtained through the second photodiode 415R.
  • the electronic device 101 may calculate the NSD value by determining a luminance ratio based on the luminance data from which the phase difference noise is removed. In an embodiment, the electronic device 101 may check the position of the lens corresponding to the calculated NSD value.
• In an embodiment, the electronic device 101 may identify the movement amount of the lens 320 moved by the OIS actuator (eg, the actuator 330 of FIG. 3) based on the reference position of the lens 320 and the first position of the lens 320.
  • the electronic device 101 may calculate an NSD value corresponding to the reference position of the lens 320 and an NSD value corresponding to the first position of the lens 320 .
• In an embodiment, the reference position may be the position of the lens 320 corresponding to a second time point at which a second image frame, a frame previous to the first image frame, was acquired.
• In an embodiment, the reference position may be a previously stored position of the lens 320 in a state without disturbance (eg, shake). Since the operation of identifying the movement amount of the lens 320 is described with reference to FIGS. 7 and 8, it is not repeated here.
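One way the NSD-to-position lookup described above might be realized is a small calibration table, measured in advance at known lens positions and interpolated at run time. The table values and names below are hypothetical, given only to make the flow concrete.

```python
import numpy as np

# Hypothetical calibration: NSD values measured at known lens positions
# (eg, in um of OIS travel along one axis) on a flat, patternless chart.
calib_positions = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])   # lens position
calib_nsd = np.array([-0.12, -0.05, 0.00, 0.06, 0.13])        # measured NSD

def lens_position_from_nsd(nsd_value: float) -> float:
    """Interpolate the lens position corresponding to a measured NSD value.
    Assumes NSD increases monotonically with lens position along this axis."""
    return float(np.interp(nsd_value, calib_nsd, calib_positions))

first_position = lens_position_from_nsd(0.03)   # current frame
reference_pos = lens_position_from_nsd(0.00)    # stored reference
movement = first_position - reference_pos       # OIS movement amount
print(movement)
```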
  • the electronic device 101 may perform digital image stabilization (DIS) on at least one image frame based on the movement amount of the lens 320 .
  • Digital shake correction may be performed on at least one image frame based on the correction amount of the OIS of the electronic device 101 .
• In an embodiment, the electronic device 101 may obtain an image frame through the image sensor 230, motion data from the motion sensor 383, and the movement amount of the lens 320 detected through a movement amount detector (eg, the movement amount detector 381 of FIG. 3).
  • the electronic device 101 may perform shake correction (eg, video digital image stabilization (VDIS)) in consideration of the motion data and the amount of movement of the lens 320 with respect to the image frame.
• In an embodiment, the VDIS module (eg, the VDIS module 390 of FIG. 3) may obtain an image frame from an image processing unit (eg, the image processing unit 340 of FIG. 3).
  • the VDIS module 390 may obtain motion data (eg, gyro data and/or acceleration data) from the motion sensor 383 .
• In an embodiment, the VDIS module 390 may perform VDIS based on the acquired image frame and motion data. The VDIS is further described with reference to FIGS. 14 and 15 below.
• FIG. 9B shows a graph of the luminance value obtained for each pixel and the averaged normalized shading difference (NSD) for an image obtained by photographing a subject without a pattern, and a graph of the luminance value obtained for each pixel and the averaged NSD for an image obtained by photographing a subject with a pattern, in an electronic device according to an embodiment.
  • the electronic device 101 may obtain a first image 945 by capturing a subject having no pattern and uniform brightness.
  • the electronic device 101 according to an embodiment may obtain a second image 950 by photographing a subject having a pattern.
• (c) of FIG. 9B is a graph showing the luminance distribution according to positions on the first image 945.
  • (d) of FIG. 9B is a graph showing luminance distribution according to positions on the second image 950 .
• In the case of the second image 950, it may be difficult to detect shading characteristics using the luminance distribution due to the influence of the subject. Other graphs of FIG. 9B show average values of normalized shading differences (NSDs) obtained by the electronic device 101 when the first image 945 and the second image 950 are captured, respectively.
• In an embodiment, a movement amount in the Y-axis direction can be detected using the averaged NSD values.
  • FIG. 10A shows an image 1010 obtained when a subject without a pattern is photographed.
• FIG. 10B shows an image 1020 obtained when a subject with a pattern is photographed.
  • FIG. 11A is a graph illustrating luminance values acquired for each pixel when the image 1010 of FIG. 10A is captured in the electronic device 101 according to an embodiment.
  • FIG. 11B is a graph showing luminance values obtained for each pixel when the image 1020 of FIG. 10B is captured by the electronic device 101 according to an embodiment.
  • the horizontal axis of FIGS. 11A and 11B may represent the x-axis (eg, the x-axis of FIG. 4 ), and the vertical axis may represent normalized illuminance.
  • the normalized illuminance may be understood as a luminance value obtained or output by each pixel and/or each photodiode.
• The graphs show data 1111 and 1121 for luminance values obtained through the first photodiodes (eg, the first photodiode 415L of FIG. 4) of a plurality of pixels (eg, the first pixel 410 of FIG. 4), data 1113 and 1123 for luminance values obtained through the second photodiodes (eg, the second photodiode 415R of FIG. 4) of the plurality of pixels, and data 1115 and 1125 for luminance values obtained through both the first photodiode and the second photodiode of the plurality of pixels.
• Referring to FIG. 11A, optical characteristics (eg, shading characteristics) of the lens 320 can be identified. For example, since the lens 320 is curved, the amount of light reaching the image sensor 230 may be greatest near the center and may decrease with distance from the center, according to the curvature. It can be seen that the luminance value acquired by a pixel or photodiode decreases from the center of the image sensor 230 (eg, field 0) toward the edge of the image sensor 230 (eg, field 1). Referring to FIG. 11B, when a subject with a pattern is photographed, it is difficult to identify the optical characteristics (eg, shading characteristics) of the lens 320 because the luminance values acquired by each photodiode are not consistent.
  • FIG. 12A is a graph showing average values of normalized shading differences (NSDs) in a vertical direction when the image 1010 of FIG. 10A is captured by the electronic device 101 according to an embodiment.
  • FIG. 12B is a graph showing average values of normalized shading differences (NSDs) when the image 1020 of FIG. 10B is captured by the electronic device 101 according to an embodiment.
  • the horizontal axes of FIGS. 12A and 12B represent the x-axis (eg, the x-axis of FIG. 4 ), and the vertical axis may represent a value obtained by averaging NSDs along lines in a vertical direction (eg, the y-axis direction of FIG. 4 ).
• Referring to FIGS. 12A and 12B, unlike the luminance values of FIGS. 11A and 11B, optical characteristics that are not affected by the subject can be obtained; the graphs show data for the mean of ΔS according to the lens focus position.
• Near indicates a case where the distance to the subject is short, Far indicates a case where the distance to the subject is long, and Center indicates a case where the subject is located between Near and Far.
  • data 1211 and 1221 when the focus position is close can be identified.
  • data 1213 and 1223 when the focus position is in the middle can be identified.
  • data 1215 and 1225 when the focus position is far can be identified.
• In an embodiment, the mean of ΔS corresponding to the x-axis may have a greater variation.
• FIG. 13A is a graph illustrating an OIS movement amount detected by the electronic device 101 according to an embodiment.
  • the horizontal axis of FIG. 13A may represent the x-axis (eg, the x-axis of FIG. 4 ), and the vertical axis may represent the y-axis (eg, the y-axis of FIG. 4 ).
• FIG. 13B is a graph showing data on the initial NSD acquired by the electronic device 101 according to an embodiment.
  • FIG. 13C is a graph showing data on NSD calculated by correcting by the amount of OIS movement in the electronic device 101 according to an embodiment.
• In an embodiment, the horizontal axis of FIGS. 13B and 13C may indicate the x-axis (eg, the x-axis of FIG. 4), and the vertical axis may indicate the y-axis (eg, the y-axis of FIG. 4).
  • the OIS movement amount may be understood as a movement amount of the lens 320 moved by OIS driving.
  • the OIS movement amount may be referred to as or understood as a movement amount of the lens 320 and/or an OIS correction amount.
• Referring to FIG. 13A, the path along which the lens 320 was moved by OIS driving on the x-axis-y-axis plane, in response to shaking of the electronic device 101 (eg, shaking due to hand shake), can be identified.
  • FIG. 13B shows a graph of NSD calculated based on multiple sample data acquired through multiple shakes.
  • FIG. 13C shows NSD calculated by correcting the OIS movement amount detected in FIG. 13A.
• Since the sample data are matched and the range of variation in the graph of FIG. 13C is small, it can be seen that detecting the OIS movement amount through the multi-photodiode sensor is effective.
• FIG. 14 is a diagram for explaining an operation of correcting shake based on first motion information acquired from the motion sensor 383 and second motion information generated by OIS driving, in the electronic device 101 according to an embodiment. FIG. 14 illustrates an example of an operation of correcting an image based on the first motion information and the second motion information according to various embodiments.
  • the electronic device 101 may shake in the direction of reference number 1410 due to various causes such as a user's hand shake.
  • the processor 120 may determine first motion information for the direction of reference number 1410 by detecting the motion of the electronic device 101 through the motion sensor (eg, gyro sensor) 383 .
• In response to the electronic device 101 being shaken in the direction of reference number 1410, the actuator (eg, OIS actuator) 330 may control the lens assembly 210 and/or the image sensor 230 to move in the direction of reference number 1420, which is opposite to the direction of reference number 1410. Due to physical limitations, such as the range of angles over which the lens assembly 210 and/or the image sensor 230 can move, the length of reference number 1420 may be shorter than that of reference number 1410.
  • the processor 120 may acquire an image frame on which OIS correction is performed from the camera module 180 .
  • An image frame on which OIS correction is performed may have a residual shake by reference number 1430.
• The processor 120 may estimate second motion information in the direction of reference number 1420 from the first motion information in the direction of reference number 1410, and may determine third motion information obtained by removing the second motion information from the first motion information.
  • the third motion information may refer to motion information of reference number 1430.
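Stated as a formula (a restatement of the relationship just described; the symbols are introduced here for readability and are not from the original):

```latex
m_{3}(t) = m_{1}(t) - m_{2}(t)
```

Here m_1 is the device motion measured by the motion sensor (reference number 1410), m_2 is the motion compensated optically by the OIS (reference number 1420), and m_3 is the residual motion (reference number 1430) that remains for digital correction.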
• FIG. 15 is a diagram illustrating an operation of correcting an image frame line by line in the electronic device 101 according to an exemplary embodiment.
  • the image sensor 230 may sequentially read out light from the first line 1520 to the last line 1523 of the image frame 1510 .
• An operation of the electronic device 101 reading out light line by line may be referred to as a rolling shutter operation. Since the time point at which light reflected from a subject enters the image sensor 230 differs for each line, distortion may occur in an image due to this time difference. In this document, distortion caused by the rolling shutter operation may be referred to as rolling shutter distortion or the jello effect.
  • the frequency of motion data obtained through the motion sensor 383 may be higher than the frequency of image frames obtained through the camera module 180 during the same time period.
  • the electronic device 101 may acquire the image frame 1510 every 30 ms and motion data every 10 ms.
  • the electronic device 101 may correct the image frame 1510 line by line using two or more pieces of motion data.
  • the electronic device 101 may correct rolling shutter distortion line by line within the image frame 1510 based on two or more motion data (eg, gyro data).
• In an embodiment, two or more pieces of first motion information may be obtained for the lines constituting the image frame 1510 (eg, lines 1520, 1521, 1522, and 1523).
  • the electronic device 101 may obtain motion data for at least two or more of the lines.
  • the electronic device 101 may obtain motion data for a first line 1520 and motion data for a second line 1521 within the image frame 1510 .
  • the electronic device 101 may estimate motion data for the remaining lines for which motion data is not acquired by using the two obtained motion data and interpolation.
  • the electronic device 101 may generate motion data for each line within the image frame 1510 based on the obtained motion data and the estimated motion data, and correct rolling shutter distortion line by line.
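A minimal sketch of this per-line estimation, assuming motion data every 10 ms, a 30 ms frame, and known line timestamps; the function name and timing values are illustrative, not from the disclosure.

```python
import numpy as np

def per_line_motion(gyro_t: np.ndarray, gyro_angle: np.ndarray,
                    frame_start: float, line_period: float,
                    n_lines: int) -> np.ndarray:
    """Estimate a motion (angle) value for every readout line of one frame
    by linearly interpolating between the surrounding gyro samples."""
    line_times = frame_start + np.arange(n_lines) * line_period
    return np.interp(line_times, gyro_t, gyro_angle)

# Integrated gyro angle samples every 10 ms (arbitrary units).
gyro_t = np.array([0.0, 10.0, 20.0, 30.0, 40.0])        # ms
gyro_angle = np.array([0.00, 0.02, 0.05, 0.04, 0.01])   # rad

# A 30 ms frame whose 1080 lines are read out over the frame period.
angles = per_line_motion(gyro_t, gyro_angle,
                         frame_start=5.0, line_period=30.0 / 1080,
                         n_lines=1080)
# Each line can now be warped (shifted) by its own angle to undo
# rolling shutter distortion.
print(angles[0], angles[539], angles[-1])
```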
• According to an embodiment, an electronic device (eg, the electronic device 101 of FIG. 1) may include a lens assembly including a lens (eg, the lens assembly 210 of FIG. 2); an OIS actuator (eg, the actuator 330 of FIG. 3) that moves the lens assembly in a direction perpendicular to an optical axis; an image sensor (eg, the image sensor 230 of FIG. 2) including a micro lens (eg, the micro lens 411 of FIG. 4) and at least two photodiodes corresponding to the micro lens, the at least two photodiodes including a first photodiode and a second photodiode; at least one motion sensor (eg, the motion sensor 383 of FIG. 3); a memory (eg, the memory 250 of FIG. 2); and at least one processor operatively connected with the lens assembly, the OIS actuator, the image sensor, the at least one motion sensor, and the memory. The at least one processor may acquire at least one image frame through the image sensor, identify a first position of the lens based on a first amount of light obtained through the first photodiode and a second amount of light obtained through the second photodiode, identify a movement amount of the lens moved by the OIS actuator based on a reference position of the lens and the first position of the lens, and perform digital image stabilization (DIS) on the at least one image frame based on the movement amount of the lens.
• According to an embodiment, the at least one processor may calculate a first luminance ratio based on the first light amount and the second light amount, and may identify the movement amount of the lens based on the first luminance ratio and a second luminance ratio previously stored in the memory.
• According to an embodiment, the at least one processor may calculate a first light amount difference based on the first light amount and the second light amount, and may calculate a first normalized shading difference (NSD) value obtained by normalizing the first light amount difference by luminance.
  • the movement amount of the lens moved by the OIS actuator may be determined by comparing the first NSD value with a reference NSD value previously stored in the memory.
• According to an embodiment, the first position of the lens may be a position corresponding to the first NSD value, and the reference position may be a position corresponding to the reference NSD value.
• According to an embodiment, in response to shaking of the electronic device, the at least one processor may identify a second position of the lens based on a third amount of light acquired through the first photodiode and a fourth amount of light acquired through the second photodiode, identify the movement amount of the lens moved by the OIS actuator based on the first position and the second position, and perform digital image stabilization (DIS) on the at least one image frame based on the identified movement amount of the lens.
• According to an embodiment, the at least one processor may calculate a first light amount difference based on the first light amount and the second light amount, calculate a first normalized shading difference (NSD) value obtained by normalizing the first light amount difference by luminance, calculate a second light amount difference based on the third light amount and the fourth light amount, and calculate a second NSD value obtained by normalizing the second light amount difference by luminance.
• According to an embodiment, the first position of the lens may correspond to a first time point at which a first image frame is obtained, and the reference position may correspond to a second time point prior to the first time point.
• According to an embodiment, the image sensor may include the first photodiode, the second photodiode, and a third photodiode corresponding to the micro lens. The at least one processor may calculate a plurality of normalized shading difference (NSD) values based on an output value of the first photodiode, an output value of the second photodiode, and an output value of the third photodiode, and may identify the first position of the lens based on the plurality of NSD values.
• According to an embodiment, the at least one processor may perform digital image stabilization (DIS) on the at least one image frame based on the identified movement amount of the lens and the motion of the electronic device obtained through the at least one motion sensor.
• According to an embodiment, an operating method of an electronic device may include an operation of acquiring at least one image frame through an image sensor; an operation of identifying a first position of a lens based on a first amount of light acquired through a first photodiode and a second amount of light acquired through a second photodiode; an operation of identifying a movement amount of the lens moved by an OIS actuator based on a reference position of the lens and the first position of the lens; and an operation of performing digital image stabilization (DIS) on the at least one image frame based on the movement amount of the lens.
• According to an embodiment, the operating method of the electronic device may include an operation of calculating a first light amount difference based on the first light amount and the second light amount, and an operation of calculating a first normalized shading difference (NSD) value obtained by normalizing the first light amount difference by luminance.
• According to an embodiment, the operating method of the electronic device may include an operation of identifying the movement amount of the lens moved by the OIS actuator by comparing the first NSD value with a reference NSD value previously stored in a memory.
• According to an embodiment, the operating method of the electronic device may include, in response to shaking of the electronic device, an operation of identifying a second position of the lens based on a third amount of light acquired through the first photodiode and a fourth amount of light acquired through the second photodiode; an operation of identifying the movement amount of the lens moved by the OIS actuator based on the first position and the second position; and an operation of performing digital image stabilization (DIS) on the at least one image frame based on the identified movement amount of the lens.
• According to an embodiment, the operating method of the electronic device may include an operation of calculating a first light amount difference based on the first light amount and the second light amount; an operation of calculating a first normalized shading difference (NSD) value obtained by normalizing the first light amount difference by luminance; an operation of calculating a second light amount difference based on the third light amount and the fourth light amount; and an operation of calculating a second NSD value obtained by normalizing the second light amount difference by luminance.
• According to an embodiment, an electronic device (eg, the electronic device 101 of FIG. 1) may include a lens assembly including a lens (eg, the lens assembly 210 of FIG. 2); an OIS actuator (eg, the actuator 330 of FIG. 3) that moves the lens assembly in a direction perpendicular to an optical axis; an image sensor (eg, the image sensor 230 of FIG. 2) including a micro lens (eg, the micro lens 411 of FIG. 4) and at least two photodiodes corresponding to the micro lens, the at least two photodiodes including a first photodiode and a second photodiode; at least one motion sensor (eg, the motion sensor 383 of FIG. 3); a memory (eg, the memory 250 of FIG. 2); and at least one processor operatively connected with the lens assembly, the OIS actuator, the image sensor, the at least one motion sensor, and the memory. The at least one processor may acquire at least one image frame through the image sensor, and may obtain a first amount of light through the first photodiode and a second amount of light through the second photodiode.
• According to an embodiment, the at least one processor may calculate a first normalized shading difference (NSD) value obtained by normalizing data for a first light amount difference by luminance, and a second NSD value obtained by normalizing data for a second light amount difference by luminance.
• According to an embodiment, the at least one processor may identify a first position of the lens corresponding to the first NSD value, identify a second position of the lens corresponding to the second NSD value, and identify the movement amount of the lens based on the first position and the second position.
• According to an embodiment, the first position of the lens may correspond to a first time point at which a first image frame is acquired, and the second position of the lens may correspond to a second time point at which a second image frame is acquired after the first image frame.
• According to an embodiment, the at least one processor may transfer the identified movement amount of the lens and the motion of the electronic device acquired through the at least one motion sensor to a video digital image stabilization (VDIS) module, and may perform shake correction on the at least one image frame.
  • Electronic devices may be devices of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
• Terms such as "first" and "second", or "primary" and "secondary", may simply be used to distinguish a component from other corresponding components, and do not limit the components in other respects (eg, importance or order). When a (eg, first) component is referred to, with or without the terms "functionally" or "communicatively", as being "coupled" or "connected" to another (eg, second) component, it means that the component may be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
• The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally constructed component or a minimal unit of components or a portion thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
• Various embodiments of this document may be implemented as software including one or more instructions stored in a storage medium (eg, internal memory 136 or external memory 138) readable by a machine (eg, the electronic device 101). For example, a processor (eg, the processor 120) of the machine (eg, the electronic device 101) may call at least one of the one or more stored instructions from the storage medium and execute it. The one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
• Here, the term "non-transitory" only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is stored semi-permanently in the storage medium and a case where it is stored temporarily.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
• A computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smart phones).
• In the case of online distribution, at least part of the computer program product may be temporarily stored or temporarily created in a device-readable storage medium, such as a memory of a manufacturer's server, an application store server, or a relay server.
• According to various embodiments, each of the above-described components (eg, a module or a program) may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the aforementioned components may be omitted, or one or more other components or operations may be added.
• According to various embodiments, a plurality of components (eg, modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to how they were performed by the corresponding component among the plurality of components prior to the integration.
• According to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

At least one processor included in an electronic device may acquire at least one image frame through an image sensor, identify a first position of a lens based on a first amount of light acquired through a first photodiode and a second amount of light acquired through a second photodiode, identify, based on a reference position of the lens and the first position of the lens, a movement amount of the lens moved by an OIS actuator, and perform digital image stabilization (DIS) on the at least one image frame based on the movement amount of the lens. Various other embodiments identified in the description are also possible.
PCT/KR2022/005963 2021-05-25 2022-04-26 Method for image stabilization during capture, and electronic device therefor Ceased WO2022250305A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0067118 2021-05-25
KR1020210067118A KR20220159133A (ko) Method for correcting shake during photographing, and electronic device therefor

Publications (1)

Publication Number Publication Date
WO2022250305A1 (fr)

Family

ID=84229975

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/005963 Ceased WO2022250305A1 (fr) 2021-05-25 2022-04-26 Procédé de stabilisation d'image pendant la capture, et dispositif électronique associé

Country Status (2)

Country Link
KR (1) KR20220159133A (fr)
WO (1) WO2022250305A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024242518A1 * 2023-05-19 2024-11-28 Samsung Electronics Co., Ltd. Image processing device and image processing method
WO2025105718A1 * 2023-11-13 2025-05-22 Samsung Electronics Co., Ltd. Image sensor, electronic device including image sensor, and operating method thereof


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100067406A * 2008-12-11 2010-06-21 Samsung Electronics Co., Ltd. Method and apparatus for shake correction of a digital photographing apparatus
KR20150053495A * 2013-11-08 2015-05-18 Samsung Electro-Mechanics Co., Ltd. Apparatus and method for image correction of a photographing apparatus
KR20190095795A * 2018-02-07 2019-08-16 Samsung Electronics Co., Ltd. Apparatus and method for estimating optical image stabilization motion
JP2019186631A * 2018-04-03 2019-10-24 Olympus Corporation Imaging device and imaging method
JP2018174542A * 2018-06-07 2018-11-08 Nikon Corporation Imaging element and imaging device

Also Published As

Publication number Publication date
KR20220159133A (ko) 2022-12-02


Legal Events

Date | Code | Title | Description
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22811498; Country of ref document: EP; Kind code of ref document: A1
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 22811498; Country of ref document: EP; Kind code of ref document: A1