WO2022025630A1 - Electronic device comprising a distance sensor and autofocus method - Google Patents
Electronic device comprising a distance sensor and autofocus method
- Publication number
- WO2022025630A1 (PCT/KR2021/009817)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- electronic device
- calibration data
- processor
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/18—Focusing aids
- G03B13/20—Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
Definitions
- Various embodiments disclosed in this document relate to an electronic device including a distance sensor and a method for performing autofocus using the distance sensor.
- An electronic device such as a smartphone or a tablet PC may capture an image using a camera.
- The electronic device may perform auto focus (hereinafter, AF) on an external object.
- The electronic device may calculate a distance to the object using a distance sensor (e.g., a time of flight (ToF) sensor) and perform AF using the calculated distance.
- the distance sensor may include a light emitting unit and a light receiving unit.
- The light emitting unit may output an infrared (IR) pulse, and the light receiving unit may obtain the light of the IR pulse reflected by an object.
- The electronic device may calculate the distance to the object by measuring the time taken for the light output from the light emitting unit to be reflected by the object and return to the light receiving unit.
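- As an illustration only (not part of the original disclosure), the round-trip measurement described above can be sketched as follows; the distance is half of the pulse speed multiplied by the measured time. The function name and units are assumptions.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # propagation speed of the IR pulse

def tof_distance_m(round_trip_time_s: float) -> float:
    """One-way distance to the object from the measured round-trip time.

    The pulse travels to the object and back, so the distance is
    speed * time / 2.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Example: a round trip of about 6.67 ns corresponds to roughly 1 m.
print(tof_distance_m(6.67e-9))  # ~1.0
```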
- the electronic device may quickly determine a position of a lens for auto focus (AF) based on the calculated distance.
- The position of the lens for AF corresponding to each distance to the object may be determined in advance and stored.
- The electronic device may perform calibration of the distance sensor during the manufacturing process, or in a settings procedure after manufacturing, to increase the accuracy of distance measurement to an object.
- The electronic device may perform the offset calibration and the crosstalk calibration simultaneously at a distance that is beyond the influence of the crosstalk, for example, at a distance of 30 cm or more from the distance sensor. In this case, at a short distance (e.g., less than 30 cm), the error in distance measurement to an object may be large and AF accuracy may be reduced.
- Various embodiments of the present disclosure provide an electronic device that increases the accuracy of distance measurement to an object by performing the offset calibration and the crosstalk calibration of the distance sensor at different distances.
- An electronic device according to an embodiment includes a camera module including a lens unit and an image sensor, a distance sensor including a light emitting unit and a light receiving unit, a processor, and a memory. The processor may acquire first calibration data related to distance-measurement characteristics of the distance sensor at a first distance shorter than a reference distance and store it in the memory, acquire second calibration data related to crosstalk at a second distance equal to or greater than the reference distance and store it in the memory, execute the camera module, calculate an object distance to an external object based on the first calibration data or the second calibration data, and determine the position of the lens unit for auto focus of the camera module based on the object distance.
- The electronic device according to various embodiments performs the offset calibration (correcting a difference between an actual distance and a measured distance) and the crosstalk calibration (correcting an error due to crosstalk) of the distance sensor at different distances, so that the accuracy of distance measurement can be increased at both short and long range.
- the electronic device may output a user interface to allow a user to remove an obstacle when the crosstalk is at a level that affects object distance measurement.
- FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure.
- FIG. 2 illustrates an electronic device according to various embodiments.
- FIG. 3 illustrates a distance sensor according to various embodiments.
- FIG. 4 illustrates a calibration operation of a distance sensor according to various embodiments.
- FIG. 5 is a flowchart illustrating preferential storage of first calibration data according to various embodiments of the present disclosure.
- FIG. 6 is a flowchart illustrating a method of performing auto focus (AF) according to various embodiments of the present disclosure.
- FIG. 7 illustrates an error rate as a function of distance according to various embodiments.
- FIG. 8 illustrates a UI display according to an error in object distance measurement according to various embodiments of the present disclosure.
- FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments of the present disclosure.
- The electronic device may be one of various types of devices.
- The electronic device may include, for example, at least one of a portable communication device (e.g., a smartphone), a computer device (e.g., a personal digital assistant, a tablet PC, a laptop PC, a desktop PC, a workstation, or a server), a portable multimedia device (e.g., an e-book reader or an MP3 player), a portable medical device (e.g., a heart rate, blood sugar, blood pressure, or body temperature measuring device), a camera, or a wearable device.
- The electronic device may include, for example, at least one of a television, a digital video disk (DVD) player, an audio device, an audio accessory device (e.g., a speaker, headphones, or a headset), a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air purifier, a set-top box, a home automation control panel, a security control panel, a game console, an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.
- The electronic device may include, for example, at least one of a navigation device, a global navigation satellite system (GNSS) device, an event data recorder (EDR) (e.g., a black box for a vehicle, vessel, or aircraft), an automotive infotainment device (e.g., a head-up display for a vehicle), an industrial or home robot, a drone, an automated teller machine (ATM), a point-of-sales (POS) device, a metering device (e.g., a water, electricity, or gas metering device), or an Internet-of-Things (IoT) device (e.g., a light bulb, a sprinkler device, a fire alarm, a thermostat, or a street lamp).
- The electronic device is not limited to the above-described devices, and may provide the functions of a plurality of devices in a combined manner, as in the case of, for example, a smartphone equipped with a function of measuring personal biometric information (e.g., heart rate or blood sugar).
- The term "user" may refer to a person who uses the electronic device or a device (e.g., an artificial intelligence electronic device) that uses the electronic device.
- Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network).
- the electronic device 101 may communicate with the electronic device 104 through the server 108 .
- The electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
- In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
- In some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into a single component.
- The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in a volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in a non-volatile memory 134.
- According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121.
- The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
- According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
- the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
- the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
- the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
- the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
- the input device 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
- the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
- the sound output device 155 may output a sound signal to the outside of the electronic device 101 .
- the sound output device 155 may include, for example, a speaker or a receiver.
- the speaker can be used for general purposes such as multimedia playback or recording playback.
- the receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
- the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
- the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
- the display device 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
- The audio module 170 may convert a sound into an electrical signal or, conversely, convert an electrical signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input device 150, or may output a sound through the sound output device 155 or through an external electronic device 102 (e.g., a speaker or headphones) connected directly or wirelessly with the electronic device 101.
- The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
- The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
- the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
- the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
- the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
- the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
- the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
- the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
- the power management module 188 may manage power supplied to the electronic device 101 .
- the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
- the battery 189 may supply power to at least one component of the electronic device 101 .
- battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
- The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
- The communication module 190 may include one or more communication processors that operate independently of the processor 120 (e.g., an application processor) and support direct (e.g., wired) communication or wireless communication.
- According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
- A corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network such as a LAN or a WAN).
- The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
- the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
- the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
- The antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
- According to an embodiment, another component (e.g., a radio frequency integrated circuit (RFIC)) other than the radiator may be additionally formed as part of the antenna module 197.
- At least some of the above-described components may be connected to each other through a communication scheme between peripheral devices (e.g., a bus, general purpose input and output (GPIO), a serial peripheral interface (SPI), or a mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
- the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
- Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
- all or a part of operations executed in the electronic device 101 may be executed in one or more external electronic devices 102 , 104 , or 108 .
- For example, instead of executing a function or a service by itself, or in addition thereto, the electronic device 101 may request one or more external electronic devices to perform at least a part of the function or the service.
- One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
- the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
- To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
- FIG. 2 illustrates an electronic device according to various embodiments.
- Referring to FIG. 2, an electronic device 201 (e.g., the electronic device 101 of FIG. 1) may include a processor 210, a memory 220, a display 230, a distance sensor 240, and/or a camera module 250.
- FIG. 2 focuses on the configuration related to measuring a distance to an object or capturing an image, but the electronic device 201 is not limited thereto.
- the processor 210 may perform various operations necessary for the operation of the electronic device 201 .
- the processor 210 may calculate the distance to the object (hereinafter, object distance) by driving the distance sensor 240 .
- the processor 210 may drive the camera module 250 to perform AF on an object and capture an image.
- the memory 220 may store various information used for driving the electronic device 201 .
- the memory 220 may store calibration information of the distance sensor 240 , reference data for controlling the distance sensor 240 according to noise, or data regarding the power of the distance sensor 240 .
- The memory 220 may store information about the position of the lens unit of the camera module 250 according to the object distance for performing AF, or information related to capturing an image (e.g., data for recognizing a light source, data related to the intensity and color temperature of a light source, or data for detecting noise of a light source).
- Display 230 may display content such as images, icons, user interfaces, or text.
- the display 230 may display an image based on image data acquired through the camera module 250 .
- the display 230 may display a UI indicating a focused object according to the AF function.
- the distance sensor 240 may be used to calculate an object distance to an external object.
- the distance sensor 240 may output light (eg, IR) (hereinafter, transmitted light) of a specified wavelength and collect light reflected from an external object (hereinafter, received light).
- the control circuit of the processor 210 or the distance sensor 240 may calculate the object distance based on a time from an output time of the transmitted light to an arrival time of the received light. Additional information regarding the distance sensor 240 may be provided through FIG. 3 .
- The camera module (or the camera device or the imaging device) 250 may acquire image data (e.g., RGB data).
- the camera module 250 may include a lens unit, an image sensor, or an image processing unit. According to an embodiment, the camera module 250 may perform AF by moving the lens unit to a position corresponding to the object distance.
- FIG. 3 illustrates a distance sensor according to various embodiments.
- the distance sensor 240 may include a light emitting unit 310 , a light receiving unit 320 , an auxiliary light receiving unit 325 , or a surface unit 330 .
- the distance sensor 240 may further include a control circuit for controlling the light emitting unit 310 , the light receiving unit 320 , or the auxiliary light receiving unit 325 or processing data.
- The processor 210 or the control circuit of the distance sensor 240 may calculate the distance (object distance) between the distance sensor 240 and the object 350 based on the time from the output time of the transmitted light to the arrival time of the reflected light.
- the light emitting unit 310 may output transmission light (eg, IR) of a specified wavelength.
- the light receiving unit 320 may detect the received light in which the transmitted light is reflected by the object 350 .
- the auxiliary light receiving unit 325 may be disposed in an area adjacent to the light emitting unit 310 .
- the auxiliary light receiving unit 325 may be used to detect the effect of crosstalk.
- the surface part 330 may be disposed on the front surface of the light emitting part 310 and the light receiving part 320 .
- the surface portion 330 may include glass or a polymer. For example, at least a portion of the surface portion 330 may be treated using a paint or film so that the light emitting unit 310 or the light receiving unit 320 is not visible.
- the surface part 330 may be included in a housing included in an electronic device (eg, the electronic device 201 of FIG. 2 ).
- crosstalk may occur during operation of the distance sensor 240 .
- the crosstalk may be light flowing from the light emitting unit 310 to the light receiving unit 320 through a separate path other than the path reflected from the object 350 .
- For example, crosstalk may occur in the structure to which the light emitting unit 310 and the light receiving unit 320 are fixed, in the space between the light emitting unit 310 (or the light receiving unit 320) and the surface unit 330, and in the inner space of the surface unit 330 (a waveguide effect).
- When a foreign substance is present on the surface unit 330, the crosstalk may further increase.
- The crosstalk may interfere with recognizing the object distance to the object 350, and may be excluded when calculating the object distance by using pre-stored calibration data related to the crosstalk.
- the graph 360 represents the number of photons flowing into the light receiving unit 320 over time.
- The light receiving unit 320 may obtain data on first photons 361 arriving around a first time t1 and second photons 362 arriving around a second time t2.
- the data for the first photons 361 may be generated by crosstalk, and the data for the second photons 362 may be generated by the received light Rx reflected through the object 350 .
- Data by the first photons 361 by crosstalk may be removed through calibration data.
- When the object 350 is at a short distance, the interval between the first time t1 and the second time t2 may be small, and it may therefore be difficult to distinguish the first photons 361 caused by the crosstalk from the second photons 362 reflected by the object 350. For this reason, it may be difficult to effectively remove the crosstalk.
- the processor 210 or the control circuit of the distance sensor 240 may store calibration data by performing crosstalk calibration at a second distance greater than or equal to a specified reference distance (eg, about 30 cm).
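- As an illustration only (not part of the original disclosure), one way to use the stored crosstalk data is to subtract it from the measured photon-count histogram before locating the return peak. The histogram representation and bin width below are assumptions.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0
BIN_WIDTH_S = 1e-10  # assumed histogram bin width (100 ps)

def corrected_peak_bin(measured_hist, crosstalk_hist):
    """Subtract the stored crosstalk histogram (second calibration data)
    from the measured histogram and return the bin index of the
    remaining peak, i.e. the return reflected by the object."""
    corrected = [max(m - c, 0) for m, c in zip(measured_hist, crosstalk_hist)]
    return max(range(len(corrected)), key=lambda i: corrected[i])

def bin_to_distance_m(bin_index: int) -> float:
    # Convert the round-trip time of the bin to a one-way distance.
    return SPEED_OF_LIGHT_M_S * (bin_index * BIN_WIDTH_S) / 2.0
```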
- FIG. 4 illustrates a calibration operation of a distance sensor according to various embodiments.
- the processor 210 may start calibration of the distance sensor 240 .
- The processor 210 may perform calibration of the distance sensor 240 in at least one of a manufacturing process of the distance sensor 240, a setting process of the distance sensor 240, or an application execution process related to the distance sensor 240.
- The processor 210 may determine whether the calibration is to be performed at a distance within a reference distance (e.g., 30 cm).
- the reference distance may be set differently according to the level of occurrence of crosstalk.
- the processor 210 may perform the offset calibration at a first distance (eg, 10 cm) that is smaller than the set reference distance (eg, about 30 cm).
- Offset calibration may be an operation or process of correcting a distance measurement value according to manufacturing characteristics of the distance sensor 240 . For example, when the actual distance to the object is 100 cm and the object distance measured using the distance sensor 240 is 98 cm, the processor 210 may store the offset by offset calibration as 2 cm.
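- For illustration only, the offset calibration described above amounts to storing the difference between a known target distance and the raw measurement; the values below are hypothetical.

```python
def offset_calibration_cm(actual_distance_cm: float, measured_distance_cm: float) -> float:
    """First calibration data: offset between the known target distance
    and the raw measurement (e.g., 100 cm - 98 cm = 2 cm)."""
    return actual_distance_cm - measured_distance_cm

# Performed at a first distance (e.g., 10 cm) shorter than the reference distance;
# the raw reading of 9.8 cm is a made-up example.
first_calibration_data = offset_calibration_cm(10.0, 9.8)  # -> about 0.2 cm
```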
- the processor 210 may store first calibration data according to the offset calibration performed at the first distance in the memory 220 .
- the processor 210 may perform crosstalk calibration at a second distance greater than or equal to the reference distance (eg, about 30 cm).
- the processor 210 may store second calibration data according to the crosstalk calibration performed at the second distance in the memory 220 .
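- Purely as a sketch of the two-stage flow of FIG. 4 (not the claimed implementation), the two calibrations can be run and stored separately; the sensor and storage interfaces below are assumptions.

```python
REFERENCE_DISTANCE_CM = 30.0  # e.g., about 30 cm
FIRST_DISTANCE_CM = 10.0      # first distance, shorter than the reference distance

def calibrate_distance_sensor(sensor, storage):
    """Offset calibration at the first (short) distance, then crosstalk
    calibration at a second distance >= the reference distance."""
    # Offset calibration: target placed at the first distance.
    measured_cm = sensor.measure_cm(target_cm=FIRST_DISTANCE_CM)           # assumed API
    storage["first_calibration_data"] = FIRST_DISTANCE_CM - measured_cm

    # Crosstalk calibration: target placed at or beyond the reference distance.
    crosstalk = sensor.measure_crosstalk(target_cm=REFERENCE_DISTANCE_CM)  # assumed API
    storage["second_calibration_data"] = crosstalk
```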
- FIG. 5 is a flowchart illustrating preferential storage of first calibration data according to various embodiments of the present disclosure.
- First calibration data obtained by the offset calibration may be stored in the memory 220 or in a storage element included in the distance sensor 240.
- The first calibration data may be transmitted from a manufacturer of the distance sensor 240 to a manufacturer of the electronic device 201.
- The first calibration data may be stored in the memory 220 during the manufacturing process of the electronic device 201.
- The first calibration data may be obtained by performing the offset calibration at a first distance (e.g., 10 cm) shorter than the reference distance (e.g., 30 cm) and then stored.
- the first calibration data may be stored in a storage element included in the distance sensor 240 at the manufacturing time of the distance sensor 240 .
- the distance sensor 240 may be mounted on the electronic device 201 to manufacture the electronic device 201 .
- the electronic device 201 may load and use the first calibration data from a storage element included in the distance sensor 240 .
- the processor 210 may perform crosstalk calibration at a second distance greater than or equal to a reference distance (eg, about 30 cm).
- the processor 210 may store second calibration data according to the crosstalk calibration performed at the second distance in the memory 220 .
- FIG. 6 is a flowchart illustrating a method of performing auto focus (AF) according to various embodiments of the present disclosure.
- the processor 210 may execute a camera application using the camera module 180 . After the camera application is executed, the processor 210 may display a preview image on the display 230 based on the image data acquired through the camera module 180 .
- the processor 210 may primarily calculate a distance to an external object (hereinafter, a measurement distance) using the distance sensor 240 .
- The processor 210 may output the transmitted light of the IR pulse through the light emitting unit of the distance sensor 240 (e.g., the light emitting unit 310 of FIG. 3), and may collect, through the light receiving unit of the distance sensor 240 (e.g., the light receiving unit 320 of FIG. 3), data about the received light of the IR pulse reflected by the object.
- The processor 210 may calculate the measurement distance based on the speed of the IR pulse and the travel time (from the output time of the transmitted light to the reception time of the received light).
- the processor 210 may calculate the object distance by correcting the measurement distance based on the first calibration data or the second calibration data.
- the first calibration data may be correction data related to a distance offset according to a manufacturing characteristic of the distance sensor 240 .
- the second calibration data may be correction data related to crosstalk.
- When the measurement distance is within the reference distance (e.g., 30 cm), the processor 210 may correct the measurement distance using the first calibration data and may not use the second calibration data.
- When the measurement distance is greater than or equal to the reference distance (e.g., 30 cm), the processor 210 may correct the measurement distance using the second calibration data and may not use the first calibration data.
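- As a non-authoritative illustration of this selection, the correction applied to the raw measurement depends on which side of the reference distance it falls; representing the second calibration data as a simple distance correction is a simplification.

```python
REFERENCE_DISTANCE_CM = 30.0

def object_distance_cm(measured_cm: float,
                       offset_cm: float,
                       crosstalk_correction_cm: float) -> float:
    """Apply the first calibration data (offset) below the reference
    distance, and the second (crosstalk-related) correction at or
    beyond it."""
    if measured_cm < REFERENCE_DISTANCE_CM:
        return measured_cm + offset_cm            # first calibration data only
    return measured_cm + crosstalk_correction_cm  # second calibration data only
```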
- According to an embodiment, before using the distance sensor 240, the processor 210 may first calculate the object distance using a phase detector, and when the calculated distance to the object is within a specified range, may secondarily calculate the object distance using the distance sensor 240.
- the processor 210 may determine whether the object distance calculated using the distance sensor 240 is equal to or greater than (or greater than) a specified reliability level.
- the reliability may be stored in advance based on the shooting environment, the performance of the distance sensor 240, and data obtained through other sensors.
- When the reliability is lower than the specified level (No in operation 630), the processor 210 may not use the object distance calculated using the distance sensor 240 for the AF operation.
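- For illustration only, the reliability check described above can be sketched as a simple gate; how the reliability value itself is derived (from the shooting environment, sensor performance, or other sensor data) is not specified here, so it is taken as an input.

```python
def use_tof_distance_for_af(reliability: float, reliability_threshold: float) -> bool:
    """Use the object distance from the distance sensor for AF only when
    its reliability is at or above the specified level (operation 630)."""
    return reliability >= reliability_threshold
```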
- the processor 210 may move the lens unit of the camera module 250 using the object distance.
- the memory 220 may store a table in which the object distance and the position of the lens unit are matched.
- the processor 210 may determine a position of the lens unit corresponding to the calculated object distance with reference to the table.
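- As an illustration only, the table lookup described above might interpolate between stored entries; the table values below are made up.

```python
# Hypothetical table: (object distance in cm, lens position code).
LENS_POSITION_TABLE = [(10, 520), (30, 410), (50, 360), (100, 310), (300, 280)]

def lens_position_for(object_distance_cm: float) -> int:
    """Return the lens position matched to the object distance,
    interpolating linearly between stored table entries."""
    table = LENS_POSITION_TABLE
    if object_distance_cm <= table[0][0]:
        return table[0][1]
    for (d0, p0), (d1, p1) in zip(table, table[1:]):
        if object_distance_cm <= d1:
            ratio = (object_distance_cm - d0) / (d1 - d0)
            return round(p0 + ratio * (p1 - p0))
    return table[-1][1]
```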
- the processor 210 may display the AF UI on the display 230 . According to an embodiment, the processor 210 may fine-tune the AF UI based on state information (movement, rotation) of the electronic device.
- FIG. 7 illustrates an error rate as a function of distance according to various embodiments. FIG. 7 is illustrative, and the disclosure is not limited thereto.
- a first graph 701 shows an error rate for each distance of the distance sensor 240 that has performed both offset calibration and crosstalk calibration within a reference distance Ls (eg, 30 cm).
- In the first graph 701, the error rate of the object distance calculated at a distance within the reference distance Ls may be relatively large (e.g., about 20%), while the error rate of the object distance calculated at a distance greater than the reference distance Ls may be relatively small (e.g., about 5%).
- A second graph 702 shows an error rate for each distance of the distance sensor 240 that has performed the offset calibration at a first distance (e.g., about 10 cm) within the reference distance Ls (e.g., 30 cm) and the crosstalk calibration at a second distance equal to or greater than the reference distance Ls (e.g., 30 cm).
- When the measurement distance is greater than or equal to the reference distance Ls, the processor 210 may correct the measurement distance using the second calibration data obtained by the crosstalk calibration.
- In the second graph 702, both the error rate of the object distance calculated at a distance within the reference distance Ls and the error rate of the object distance calculated at a distance greater than the reference distance Ls may be relatively small (e.g., about 5%).
- FIG. 8 illustrates a UI display according to an error in object distance measurement according to various embodiments of the present disclosure.
- the processor 210 may determine whether the crosstalk deviates from a reference value and an error in distance measurement occurs.
- For example, when a foreign substance is present on the surface of the distance sensor 240, the crosstalk may deviate from the reference value.
- the reference value may be determined based on the number of photons detected by the distance sensor 240 .
- the reference value may be one of about 100 kcps (kilo counts per second) to about 120 kcps.
- When the crosstalk exceeds the reference value, the processor 210 may display a UI 810 on the display 230.
- The UI 810 may be a pop-up window that prompts the user to remove foreign substances from the surface of the distance sensor 240.
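- As a hedged illustration only, the check described above can be sketched as follows; the kcps units come from the reference value mentioned above, while the callback for showing a pop-up is an assumption.

```python
CROSSTALK_REFERENCE_KCPS = 100.0  # e.g., a value in the roughly 100-120 kcps range

def check_crosstalk(crosstalk_kcps: float, show_popup) -> bool:
    """If the measured crosstalk exceeds the reference value, prompt the
    user to clean the sensor surface and report that distance
    measurement may be unreliable."""
    if crosstalk_kcps > CROSSTALK_REFERENCE_KCPS:
        show_popup("Please wipe foreign substances off the distance sensor window.")
        return True
    return False
```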
- An electronic device (e.g., the electronic device 101 of FIG. 1 or the electronic device 201 of FIG. 2) according to various embodiments of the present disclosure includes a camera module (e.g., the camera module 180 of FIG. 1 or the camera module 250 of FIG. 2) including a lens unit and an image sensor, a distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2) including a light emitting unit and a light receiving unit, a processor (e.g., the processor 120 of FIG. 1 or the processor 210 of FIG. 2), and a memory (e.g., the memory 130 of FIG. 1 or the memory 220 of FIG. 2). The processor may acquire first calibration data related to distance-measurement characteristics of the distance sensor at a first distance shorter than a reference distance and store it in the memory, acquire second calibration data related to crosstalk at a second distance equal to or greater than the reference distance and store it in the memory, execute the camera module, calculate an object distance to an external object based on the first calibration data or the second calibration data, and determine the position of the lens unit for auto focus of the camera module based on the object distance.
- According to various embodiments, the processor may calculate the object distance by correcting the distance calculated using the distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2) with the first calibration data or the second calibration data.
- According to various embodiments, the processor (e.g., the processor 120 of FIG. 1 or the processor 210 of FIG. 2) may correct the distance based on the first calibration data when the distance is within the reference distance.
- the processor may correct the distance based on the second calibration data when the distance is greater than or equal to the reference distance.
- the reference distance may be determined based on a crosstalk characteristic of the distance sensor (eg, the sensor module 176 of FIG. 1 and the distance sensor 240 of FIG. 2 ).
- According to various embodiments, the electronic device may further include a display (e.g., the display device 160 of FIG. 1 or the display 230 of FIG. 2), and the processor (e.g., the processor 120 of FIG. 1 or the processor 210 of FIG. 2) may display a user interface corresponding to the auto focus on the display.
- According to various embodiments, the electronic device may further include a display (e.g., the display device 160 of FIG. 1 or the display 230 of FIG. 2), and when data by the crosstalk measured by the distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2) exceeds a reference value, the processor may display, on the display, a user interface for removing foreign substances from the surface of the distance sensor.
- According to various embodiments, the reference distance may be 30 cm from the surface of the distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2), and the first distance may be 10 cm from the surface of the distance sensor.
- the first calibration data may be stored in a storage element included in the distance sensor (eg, the sensor module 176 of FIG. 1 and the distance sensor 240 of FIG. 2 ).
- According to various embodiments, the first calibration data and the second calibration data may be calculated in a manufacturing process of the electronic device (e.g., the electronic device 101 of FIG. 1 or the electronic device 201 of FIG. 2) and stored in the memory (e.g., the memory 130 of FIG. 1 or the memory 220 of FIG. 2).
- An auto focus execution method according to various embodiments is performed in an electronic device (e.g., the electronic device 101 of FIG. 1 or the electronic device 201 of FIG. 2), and may include acquiring first calibration data related to distance-measurement characteristics of a distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2) of the electronic device at a first distance shorter than a reference distance and storing the first calibration data, acquiring second calibration data related to crosstalk at a second distance equal to or greater than the reference distance and storing the second calibration data, calculating a distance to an external object using the distance sensor, and determining, based on the distance, a position of a lens unit of a camera module (e.g., the camera module 180 of FIG. 1 or the camera module 250 of FIG. 2) for auto focus of the camera module.
- According to various embodiments, the determining of the position of the lens unit may include calculating the object distance by correcting the distance calculated using the distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2) with the first calibration data or the second calibration data.
- the operation of determining the position of the lens unit may further include correcting the distance based on the first calibration data when the distance is within the reference distance.
- the determining of the position of the lens unit may further include correcting the distance based on the second calibration data when the distance is equal to or greater than the reference distance.
- the reference distance may be determined based on a crosstalk characteristic of the distance sensor (eg, the sensor module 176 of FIG. 1 and the distance sensor 240 of FIG. 2 ).
- According to various embodiments, the method may further include displaying a user interface corresponding to the auto focus on a display (e.g., the display device 160 of FIG. 1 or the display 230 of FIG. 2) of the electronic device (e.g., the electronic device 101 of FIG. 1 or the electronic device 201 of FIG. 2).
- According to various embodiments, the method may further include, when data by crosstalk measured by the distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2) exceeds a reference value, displaying a user interface for removing foreign substances from the surface of the distance sensor on a display (e.g., the display device 160 of FIG. 1 or the display 230 of FIG. 2) of the electronic device.
- According to various embodiments, the reference distance may be 30 cm from the surface of the distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2), and the first distance may be 10 cm from the surface of the distance sensor.
- According to various embodiments, the operation of acquiring and storing the first calibration data may include loading the first calibration data stored in a storage element included in the distance sensor (e.g., the sensor module 176 of FIG. 1 or the distance sensor 240 of FIG. 2).
- According to various embodiments, the first calibration data and the second calibration data may be calculated in a manufacturing step of the electronic device (e.g., the electronic device 101 of FIG. 1 or the electronic device 201 of FIG. 2) and stored in a memory (e.g., the memory 130 of FIG. 1 or the memory 220 of FIG. 2) of the electronic device.
- the electronic device may have various types of devices.
- the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
- Terms such as "first" and "second" may be used simply to distinguish a corresponding component from another, and do not limit the components in other aspects (e.g., importance or order). When one (e.g., a first) component is referred to, with or without the term "functionally" or "communicatively", as "coupled" or "connected" to another (e.g., a second) component, it means that the one component may be connected to the other component directly (e.g., by wire), wirelessly, or via a third component.
- module may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit.
- a module may be an integrally formed part or a minimum unit or a part of the part that performs one or more functions.
- the module may be implemented in the form of an application-specific integrated circuit (ASIC).
- Various embodiments as set forth herein may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium readable by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the machine may invoke at least one of the one or more instructions stored in the storage medium and execute it. The one or more instructions may include code generated by a compiler or code executable by an interpreter.
- the device-readable storage medium may be provided in the form of a non-transitory storage medium.
- 'Non-transitory' only means that the storage medium is a tangible device and does not include a signal (e.g., an electromagnetic wave); this term does not distinguish between a case in which data is semi-permanently stored in the storage medium and a case in which data is temporarily stored in the storage medium.
- the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
- Computer program products may be traded between sellers and buyers as commodities.
- The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed online (e.g., downloaded or uploaded) via an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
- a part of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
- each component eg, a module or a program of the above-described components may include a singular or a plurality of entities.
- one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
- According to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into a single component.
- In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
- According to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
An electronic device according to an embodiment disclosed in this document comprises: a camera module comprising a lens unit and an image sensor; a distance sensor; a processor; and a memory, wherein the processor can acquire first calibration data relating to distance-measurement characteristics of the distance sensor at a first distance and store it in the memory, acquire second calibration data relating to crosstalk at a second distance and store it in the memory, operate the camera module, calculate an object distance to an external object on the basis of the first calibration data or the second calibration data, and determine, on the basis of the object distance, the position of the lens unit for auto focus of the camera module.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020200096189A KR20220015752A (ko) | 2020-07-31 | 2020-07-31 | 거리 센서를 포함하는 전자 장치 및 오토 포커스 수행 방법 |
| KR10-2020-0096189 | 2020-07-31 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022025630A1 true WO2022025630A1 (fr) | 2022-02-03 |
Family
ID=80035860
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2021/009817 Ceased WO2022025630A1 (fr) | 2020-07-31 | 2021-07-28 | Dispositif électronique comprenant un capteur de distance et procédé de mise au point automatique |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR20220015752A (fr) |
| WO (1) | WO2022025630A1 (fr) |
-
2020
- 2020-07-31 KR KR1020200096189A patent/KR20220015752A/ko active Pending
-
2021
- 2021-07-28 WO PCT/KR2021/009817 patent/WO2022025630A1/fr not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20090023874A (ko) * | 2007-09-03 | 2009-03-06 | 삼성전자주식회사 | 카메라의 자동 초점 조절 장치 및 방법 |
| KR20110052993A (ko) * | 2009-11-13 | 2011-05-19 | 삼성전자주식회사 | 영상 보정 장치 및 영상 보정 방법 |
| KR20140123302A (ko) * | 2013-04-12 | 2014-10-22 | 삼성전자주식회사 | 영상 처리 장치 및 그 제어방법 |
| KR20170005312A (ko) * | 2015-07-03 | 2017-01-12 | 전자부품연구원 | 카메라와 거리 센서의 동시 캘리브레이션 시스템 및 방법 |
| WO2020084955A1 (fr) * | 2018-10-24 | 2020-04-30 | ソニーセミコンダクタソリューションズ株式会社 | Capteur de mesure de distance, capteur de détection, procédé de mesure de distance, et dispositif électronique |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2024043491A1 (fr) * | 2022-08-26 | 2024-02-29 | 삼성전자 주식회사 | Dispositif électronique permettant de fournir un service basé sur un emplacement et son procédé de fonctionnement |
Also Published As
| Publication number | Publication date |
|---|---|
| KR20220015752A (ko) | 2022-02-08 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21849176 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 21849176 Country of ref document: EP Kind code of ref document: A1 |