
HK1262336A1 - Control method, microprocessor, computer readable storage medium and computer equipment - Google Patents


Info

Publication number
HK1262336A1
Authority
HK
Hong Kong
Prior art keywords
microprocessor
light
control method
current brightness
structured light
Application number
HK19122468.2A
Other languages
Chinese (zh)
Inventor
周海涛
惠方方
欧锦荣
郭子青
谭筱
Original Assignee
Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Publication of HK1262336A1


Description

Control method, microprocessor, computer-readable storage medium, and computer device
Technical Field
The present invention relates to the field of three-dimensional imaging technologies, and in particular, to a control method, a microprocessor, a computer-readable storage medium, and a computer device.
Background
Structured light depth cameras typically include a structured light projector and an image collector. The structured light projector projects a diffracted laser pattern into the target space, the image collector captures the laser pattern after it has been modulated by objects in the target space, and the depth information of those objects is obtained from the captured laser pattern and a reference pattern. The light emitted by the structured light projector is typically infrared laser light. Ambient light also contains an infrared component, so when the structured light depth camera operates, the ambient infrared component interferes with the image collector's capture of the laser pattern and thus degrades the acquisition precision of the depth information.
Disclosure of Invention
Embodiments of the invention provide a control method, a microprocessor, a computer-readable storage medium, and a computer device.
The control method of the embodiments of the invention is used for a structured light projector and comprises the following steps:
acquiring the current brightness of a scene;
determining a luminous power of the structured light projector according to the current brightness; and
controlling the structured light projector to emit light according to the luminous power.
The microprocessor of the embodiments of the present invention is electrically connected to a structured light depth camera, the structured light depth camera comprising a structured light projector, and the microprocessor is configured to:
acquire the current brightness of a scene;
determine a luminous power of the structured light projector according to the current brightness; and
control the structured light projector to emit light according to the luminous power.
One or more non-transitory computer-readable storage media embody computer-executable instructions that, when executed by one or more processors, cause the processors to perform the control method described above.
The computer device comprises a memory and a processor, wherein the memory stores computer readable instructions, and the instructions, when executed by the processor, cause the processor to execute the control method.
The control method, the microprocessor, the computer-readable storage medium, and the computer device of the embodiments of the invention can adjust the luminous power of the structured light projector according to the current brightness of the scene, thereby improving the acquisition precision of the depth image and helping to reduce the power consumption of the computer device.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of a control method according to some embodiments of the present invention.
FIG. 2 is a partial block diagram of a computer device in accordance with certain embodiments of the invention.
FIG. 3 is a flow chart illustrating a control method according to some embodiments of the present invention.
FIG. 4 is a flow chart illustrating a control method according to some embodiments of the present invention.
FIG. 5 is a schematic diagram of a light source of a computer device according to some embodiments of the invention.
FIG. 6 is a schematic diagram of the architecture of a computer device according to some embodiments of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or to elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary, are intended to illustrate the invention, and are not to be construed as limiting the invention.
Referring to fig. 1, the present invention provides a control method for a structured light projector 11. The control method comprises the following steps:
01: acquiring the current brightness of a scene;
02: determining the luminous power of the structured light projector 11 from the current brightness; and
03: the structured light projector 11 is controlled to emit light in accordance with the light emission power.
Referring to FIG. 2, a microprocessor 20 is also provided. Microprocessor 20 is electrically connected to structured light depth camera 10. Structured light depth camera 10 includes a structured light projector 11 and an image collector 12. Step 01, step 02 and step 03 may all be implemented by the microprocessor 20. That is, the microprocessor 20 may be configured to obtain a current brightness of the scene, determine a light emission power of the structured light projector 11 based on the current brightness, and control the structured light projector 11 to emit light according to the light emission power.
The microprocessor 20 of the present embodiment may be applied in a computer device 100 (shown in FIG. 6). The computer device 100 may be a smartphone, a tablet computer, a notebook computer, a desktop computer, a smart wearable device (such as a smart helmet, smart glasses, a smart watch, or a smart band), a virtual reality device, or the like.
The structured light depth camera 10 can acquire depth information of objects in a scene. Specifically, the structured light projector 11 projects a laser pattern, diffracted by a diffractive optical element, into the scene; the image collector 12 collects the laser pattern after it has been modulated by the objects; and a depth image, which represents the depth information of the objects in the scene, can be calculated from the reference pattern and the modulated laser pattern. The laser pattern is typically an infrared laser pattern, and the image collector 12 is typically an infrared camera. Because ambient light also contains an infrared component, the infrared light sensed by the image collector 12 while the structured light depth camera 10 is in use includes not only the infrared light projected by the structured light projector 11 but also the infrared component of the ambient light; the ambient infrared component therefore interferes with the collection of the laser pattern by the image collector 12 and degrades the accuracy of the acquired depth information. In particular, when the scene is bright, the ambient infrared component accounts for a large proportion of the total infrared light sensed by the image collector 12, its influence on the collection of the laser pattern is large, and the acquisition accuracy of the depth information is low.
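The patent itself does not state the depth formula. Purely as a point of reference, a common triangulation relation for a projector-camera pair with baseline b, camera focal length f (in pixels), and a reference pattern recorded at a known distance Z0 is sketched below; d is the pixel disparity of a speckle between the captured and reference patterns, and its sign depends on the shift convention. This is a generic structured-light relation, not a quotation from the embodiments.

```latex
% Hedged sketch: generic structured-light triangulation, not taken from the patent text.
% d   : disparity of a speckle between captured and reference patterns (pixels)
% f   : camera focal length (pixels)
% b   : projector-camera baseline
% Z_0 : distance of the reference plane
d = f\,b\left(\frac{1}{Z} - \frac{1}{Z_0}\right)
\quad\Longrightarrow\quad
Z = \frac{f\,b\,Z_0}{f\,b + d\,Z_0}
```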
In the control method of the embodiments of the invention, before the structured light projector 11 is turned on, the current brightness of the scene is detected, the luminous power of the structured light projector 11 is determined according to the current brightness, and the microprocessor 20 then controls the structured light projector 11 to emit light at the determined luminous power. The luminous power increases with the current brightness: the brighter the scene, the higher the luminous power; the darker the scene, the lower the luminous power. It can be understood that when the current brightness is higher, the ambient infrared component sensed by the image collector 12 is also larger; raising the luminous power of the structured light projector 11 accordingly increases the projected infrared component sensed by the image collector 12, so the proportion of ambient infrared light in the total infrared light sensed by the image collector 12 is reduced, its influence on the collection of the laser pattern is reduced, and the depth information is acquired more accurately. In addition, when the current brightness is low, the luminous power is reduced accordingly, which saves power consumption of the computer device 100.
Of course, it should be noted that the luminous power has a lower power threshold: the structured light projector 11 should emit light at a power greater than or equal to this threshold, so that the image collector 12 can collect a laser pattern of sufficient brightness, which benefits the depth information calculation based on the laser pattern and preserves the acquisition precision of the depth information.
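The following is a minimal sketch of this control flow in Python. The function names, the linear brightness-to-power mapping, and the numeric floor and ceiling are illustrative assumptions rather than values from the patent; read_scene_brightness and set_projector_power stand in for whatever platform hooks the microprocessor 20 actually exposes.

```python
# Minimal sketch of the described control flow; all names and numbers are assumptions.

POWER_FLOOR_MW = 50.0    # hypothetical lower power threshold
POWER_MAX_MW = 300.0     # hypothetical maximum projector power

def determine_emission_power(current_brightness, max_brightness=1023.0):
    """Brighter scene -> higher power, clamped so the projector never
    falls below the lower power threshold."""
    scaled = (current_brightness / max_brightness) * POWER_MAX_MW
    return max(POWER_FLOOR_MW, min(scaled, POWER_MAX_MW))

def run_control_method(read_scene_brightness, set_projector_power):
    """One pass of the control method: steps 01, 02 and 03."""
    brightness = read_scene_brightness()           # 01: acquire current brightness
    power = determine_emission_power(brightness)   # 02: determine luminous power
    set_projector_power(power)                     # 03: emit light at that power
```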
Referring to fig. 2 and 6, a computer device 100 according to an embodiment of the invention includes a structured light depth camera 10, a microprocessor 20, and an application processor 30. The structured light projector 11 is electrically connected to the application processor 30, and the application processor 30 can provide an enable signal to turn the structured light projector 11 on and off. The structured light projector 11 is also electrically connected to the microprocessor 20; it can be connected to a pulse width modulation interface 73 of the microprocessor 20, the microprocessor 20 provides a pulse signal that drives the structured light projector 11 to emit light, and the luminous power of the structured light projector 11 is adjusted by modulating the width of the pulse signal. The image collector 12 is electrically connected to the application processor 30, and the application processor 30 can power the image collector 12 on or off, shut it down, or reset it. The image collector 12 is also electrically connected to the microprocessor 20 through an Inter-Integrated Circuit (I2C) bus 71; the microprocessor 20 can provide the image collector 12 with a clock signal for collecting the laser pattern, and the laser pattern collected by the image collector 12 can be transmitted to the microprocessor 20 through a Mobile Industry Processor Interface (MIPI) 72. In the embodiment of the present invention, the computer device 100 further includes an infrared fill light 60, which can emit uniform infrared light outward; the infrared light is reflected by objects in the scene and then received by the image collector 12 to obtain an infrared image. The infrared fill light 60 can also be connected to the application processor 30 through the integrated circuit bus 71, and the application processor 30 can provide an enable signal to turn the infrared fill light on and off. The infrared fill light 60 can also be electrically connected to the microprocessor 20 through the pulse width modulation interface 73 of the microprocessor 20, and the microprocessor 20 provides a pulse signal that drives the infrared fill light 60 to emit light.
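As a rough illustration of the duty-cycle control described above, the sketch below assumes a generic PWM driver object; the PwmChannel class, its set_duty_cycle() method, and the full-scale power value are hypothetical stand-ins, not an actual driver API of the microprocessor 20.

```python
# Illustrative only: PwmChannel is an assumed stand-in for the real PWM driver.

class PwmChannel:
    def __init__(self, frequency_hz):
        self.frequency_hz = frequency_hz
        self.duty_cycle = 0.0   # fraction of each period the laser driver is enabled

    def set_duty_cycle(self, duty):
        # A real driver would write the compare register here; this just clamps and stores.
        self.duty_cycle = max(0.0, min(duty, 1.0))

def apply_emission_power(pwm, power_mw, full_scale_mw=300.0):
    """Approximate the target average optical power by scaling the duty cycle,
    assuming emitted power is roughly proportional to the duty cycle."""
    pwm.set_duty_cycle(power_mw / full_scale_mw)

# Example: drive the projector channel at half of the assumed full-scale power.
projector_pwm = PwmChannel(frequency_hz=30_000)
apply_emission_power(projector_pwm, power_mw=150.0)
```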
The microprocessor 20 may be a processing chip and is electrically connected to the application processor 30. In particular, the application processor 30 may be used to reset the microprocessor 20, wake the microprocessor 20, correct errors in the microprocessor 20, and so on. The microprocessor 20 may be connected to the application processor 30 via the mobile industry processor interface 72. Specifically, the application processor 30 includes a trusted execution environment (TEE) 31 and an untrusted execution environment (REE) 32; the code and memory area in the trusted execution environment 31 are controlled by an access control unit and cannot be accessed by programs in the untrusted execution environment 32. The microprocessor 20 is connected to the trusted execution environment 31 of the application processor 30 through the mobile industry processor interface 72, so that data in the microprocessor 20 can be transferred directly into the trusted execution environment 31 for storage.
The data in trusted execution environment 31 includes reference patterns, laser patterns and infrared images collected by image collector 12, etc.
The reference pattern is pre-stored in the trusted execution environment 31 before the computer device 100 is shipped from the factory.
The microprocessor 20 controls the structured light projector 11 to project a laser pattern into the scene and controls the image collector 12 to collect the laser pattern modulated by objects in the scene; the microprocessor 20 receives the laser pattern through the mobile industry processor interface 72 and transmits it into the trusted execution environment 31 of the application processor 30 through the mobile industry processor interface 72 connected to the application processor 30. The application processor 30 may calculate a depth image based on the reference pattern and the laser pattern. Some depth images may serve as depth templates, based on which authentication of the user may be performed. After the user passes authentication, the corresponding operation permissions of the computer device 100, for example screen unlocking or payment, can be granted. Depth images may also be used for three-dimensional scene modeling and the like.
The microprocessor 20 may further control the infrared fill light 60 to project uniform infrared light into the scene and control the image collector 12 to collect an infrared image; the microprocessor 20 receives the infrared image through the mobile industry processor interface 72 and transmits it to the trusted execution environment 31 of the application processor 30 through the mobile industry processor interface 72 connected to the application processor 30. Some infrared images can be used as infrared templates; for example, an infrared image containing the user's face can be used as a face infrared template, and two-dimensional face verification and the like can be performed based on the face infrared template.
In summary, the control method and the microprocessor 20 according to the embodiments of the present invention can adjust the light emitting power of the structured light projector 11 according to the current brightness of the scene, so as to improve the accuracy of obtaining the depth image and reduce the power consumption of the computer device 100.
In some embodiments, the microprocessor 20 also includes a trusted execution environment. The data in the trusted execution environment of the microprocessor 20 includes the reference pattern, the laser patterns and infrared images collected by the image collector 12, and the like. The reference pattern is pre-stored in the trusted execution environment of the microprocessor 20 before the computer device 100 leaves the factory. After the microprocessor 20 receives the laser pattern from the image collector 12, the laser pattern is stored in the trusted execution environment of the microprocessor 20. The microprocessor 20 may calculate a depth image based on the reference pattern and the laser pattern. A depth image containing the depth information of a face may serve as a depth template, which the microprocessor 20 may transfer to the trusted execution environment 31 of the application processor 30 for storage via the mobile industry processor interface 72. During subsequent authentication, the microprocessor 20 transmits the calculated depth image to the application processor 30, which compares the depth image with the depth template and schedules the various processes requiring authentication based on the comparison result. Similarly, the microprocessor 20 receives the infrared image from the image collector 12 and stores it in the trusted execution environment of the microprocessor 20. An infrared image containing a human face may be used as an infrared template, which the microprocessor 20 may transfer to the trusted execution environment 31 of the application processor 30 for storage via the mobile industry processor interface 72. During subsequent authentication, the microprocessor 20 transmits the acquired infrared image to the application processor 30, which compares the infrared image with the infrared template and schedules the various processes requiring authentication based on the comparison result.
In some embodiments, the reference pattern is stored in the microprocessor 20. After receiving the laser pattern from the image collector 12, the microprocessor 20 may transmit the reference pattern and the laser pattern to the trusted execution environment 31 of the application processor 30 for storage via the mobile industry processor interface 72, and the application processor 30 may calculate the depth image based on the reference pattern and the laser pattern. The application processor 30 performs the depth image calculation in the trusted execution environment 31, and the calculated depth image is also stored in the trusted execution environment 31. Some depth images stored in the trusted execution environment 31 of the application processor 30 may serve as depth templates. During subsequent authentication, the application processor 30 calculates the depth image, compares it with the depth template, and schedules the various processes requiring authentication based on the comparison result. Similarly, after receiving the infrared image from the image collector 12, the microprocessor 20 may transmit the infrared image to the trusted execution environment 31 of the application processor 30 for storage via the mobile industry processor interface 72. Some of the infrared images stored in the trusted execution environment 31 of the application processor 30 may serve as infrared templates. During subsequent authentication, the application processor 30 compares the infrared image received from the microprocessor 20 with the infrared template and schedules the various processes requiring authentication based on the comparison result.
Referring to fig. 2, in some embodiments, the current brightness of the scene may be detected by a light sensor 50. The light sensor 50 is electrically connected to the microprocessor 20 as an external device, specifically through the integrated circuit bus 71. The light sensor 50 is also electrically connected to the application processor 30, specifically via the integrated circuit bus 71, and the application processor 30 can provide an enable signal to turn the light sensor 50 on and off. The light sensor 50 consists of two components, a light projector and a light receiver: the light projector focuses light through a lens, the light travels to the lens of the light receiver and is finally received by the sensing element, which converts the received light signal into an electrical signal. The electrical signal is transmitted to the microprocessor 20, which determines the current brightness of the scene from the magnitude of the electrical signal and finally determines the luminous power of the structured light projector 11 based on the current brightness.
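A small sketch of that last conversion is shown below. The raw-reading hook, the ADC resolution, and the lux-per-count calibration factor are assumptions; a real ambient light sensor on the I2C bus would expose its own register map and conversion.

```python
# Hypothetical conversion of a light-sensor reading into a scene brightness value.

ADC_FULL_SCALE = 4095    # assumed 12-bit converter
LUX_PER_COUNT = 0.25     # hypothetical calibration factor

def current_scene_brightness(read_light_sensor_raw):
    """Read the sensor's electrical signal and map its magnitude to a brightness value."""
    raw = read_light_sensor_raw()                 # electrical signal from the sensor
    return min(raw, ADC_FULL_SCALE) * LUX_PER_COUNT
```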
Referring to fig. 3, in some embodiments, obtaining the current brightness of the scene in step 01 includes:
011: acquiring a captured image of the scene; and
012: calculating the current brightness from the captured image.
Referring to FIG. 2, in some embodiments, step 011 and step 012 can both be implemented by microprocessor 20. That is, the microprocessor 20 may also be used to acquire a captured image of a scene and calculate the current brightness from the captured image.
The captured image of the scene may be captured by the image collector 12, in which case it is a grayscale image. The pixel values of the grayscale image reflect the brightness at the various locations in the scene, and the microprocessor 20 can calculate the current brightness of the scene from these pixel values, for example by summing the pixel values of the entire grayscale image and then averaging. Calculating the current brightness of the scene from the captured image eliminates the need for a light sensor and reduces the number of peripherals of the computer device 100.
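A minimal sketch of this averaging follows, assuming the grayscale capture arrives as a NumPy array of 8-bit intensities; the array format is an assumption, not something the patent specifies.

```python
import numpy as np

def brightness_from_grayscale(image: np.ndarray) -> float:
    """Sum the pixel values of the whole grayscale image and average them;
    the result is used as the scene's current brightness."""
    return float(np.mean(image.astype(np.float64)))
```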
In some embodiments, the computer device 100 further includes a visible light camera 40. The visible light camera 40 is connected to the application processor 30, specifically through the integrated circuit bus 71. The application processor 30 may provide an enable signal to turn the visible light camera 40 on or off, or to reset it. The visible light camera 40 is also electrically connected to the microprocessor 20, specifically through the integrated circuit bus 71; the microprocessor 20 may provide the visible light camera 40 with a clock signal for capturing visible light images, and the visible light image captured by the visible light camera 40 may be transmitted to the microprocessor 20 through the mobile industry processor interface 72. The microprocessor 20 may further transmit the visible light image to the untrusted execution environment 32 of the application processor 30 through the mobile industry processor interface 72. The application processor 30 may perform three-dimensional modeling of the scene from the depth image and the visible light image to obtain a three-dimensional color model of the scene, or perform face beautification based on the depth image and the visible light image; specifically, the application processor 30 can use the depth image to identify more accurately the pixels corresponding to a human face in the visible light image and apply beautification to the face, improving the beautification effect. When the microprocessor 20 also includes trusted and untrusted execution environments, the three-dimensional scene modeling, face beautification, and the like described above may likewise be performed by the microprocessor 20.
The captured image may also be a visible light image captured by the visible light camera 40. The pixel values of a visible light image are typically in RGB format. Upon receiving the visible light image, the microprocessor 20 first computes the value of the luminance component Y of the YCrCb format for each pixel from the RGB pixel values using the formula Y = 0.257 × R + 0.564 × G + 0.098 × B. The microprocessor 20 may then calculate the current brightness of the scene from the Y values of the entire visible light image, for example by summing and averaging them and using the result as the current brightness of the scene.
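Sketched below as an illustration, again assuming a NumPy H x W x 3 array with channels in R, G, B order; the data layout is an assumption rather than something stated in the patent.

```python
import numpy as np

def brightness_from_rgb(image: np.ndarray) -> float:
    """Convert each pixel to the YCrCb luminance component with
    Y = 0.257*R + 0.564*G + 0.098*B, then average over the whole image."""
    r = image[..., 0].astype(np.float64)
    g = image[..., 1].astype(np.float64)
    b = image[..., 2].astype(np.float64)
    y = 0.257 * r + 0.564 * g + 0.098 * b
    return float(np.mean(y))
```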
Referring to fig. 4, in some embodiments, a plurality of luminous powers correspond one-to-one to a plurality of preset brightness ranges. Determining the luminous power of the structured light projector 11 from the current brightness in step 02 includes:
021: determining the preset brightness range in which the current brightness is located; and
022: determining, according to the preset brightness range, the luminous power corresponding to the preset brightness range.
Referring to FIG. 2, in some embodiments, step 021 and step 022 can both be implemented by microprocessor 20. That is, the microprocessor 20 may be further configured to determine a preset luminance range in which the current luminance is located, and determine the light emitting power corresponding to the preset luminance range according to the preset luminance range.
Specifically, the microprocessor 20 stores in advance a correspondence table of preset brightness ranges and luminous powers, each preset brightness range corresponding to one luminous power. After calculating the current brightness, the microprocessor 20 first determines which preset brightness range the current brightness falls into, then looks up in the table the luminous power corresponding to that preset brightness range, and controls the structured light projector 11 to emit light at the determined luminous power. The correspondence table between the preset brightness ranges and the luminous powers is obtained by experimental calibration during manufacture of the computer device 100.
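A sketch of such a lookup follows; the brightness boundaries and power values in the table are placeholders, since the patent only says the real table comes from experimental calibration.

```python
# Placeholder correspondence table: (lower bound inclusive, upper bound exclusive, power in mW).
POWER_TABLE = [
    (0.0,   200.0,  60.0),
    (200.0, 500.0, 120.0),
    (500.0, 800.0, 200.0),
    (800.0, float("inf"), 300.0),
]

def lookup_emission_power(current_brightness):
    """Find the preset brightness range containing the current brightness and
    return the luminous power assigned to that range."""
    for low, high, power in POWER_TABLE:
        if low <= current_brightness < high:
            return power
    return POWER_TABLE[-1][2]   # fall back to the highest power for out-of-range input
```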
In the control method according to the embodiments of the present invention, by dividing the brightness into a plurality of preset ranges and assigning each preset range its own luminous power, the microprocessor 20 can, when the structured light depth camera 10 operates, look up in the correspondence table the luminous power best suited to the current brightness; this improves the acquisition accuracy of the depth image on the one hand and reduces the power consumption of the computer device 100 on the other.
In some embodiments, the luminous power may also be controlled by zone control of the light source 111 of the structured light projector 11, in addition to varying the duty cycle and amplitude of the pulses. In this case the light source 111 is divided into a plurality of light emitting regions 1112, each light emitting region 1112 includes a plurality of point light sources 1111, and each light emitting region 1112 can be controlled independently. After the luminous power is determined, the number of light emitting regions 1112 to turn on and their positions are determined from that power. Taking fig. 5 as an example, the light source 111 is divided into eight light emitting regions 1112; if the determined luminous power requires four light emitting regions 1112 to be turned on, the four regions shown in fig. 5 are chosen so that they are distributed with central symmetry. This improves the uniformity of the brightness of the laser pattern projected by the structured light projector 11 into the scene and thus further improves the acquisition accuracy of the depth image.
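An illustrative way to pick centrally symmetric regions is sketched below; enable_region() is an assumed per-zone driver hook, and the even spacing is just one way to satisfy the symmetry described above (it guarantees symmetry when the number of enabled regions divides the total evenly).

```python
# Illustrative zone selection for a source split into fan-shaped regions, as in fig. 5.

def select_symmetric_regions(total_regions, regions_to_enable):
    """Spread the enabled regions evenly around the source; with 8 regions and
    4 to enable this yields [0, 2, 4, 6], i.e. alternating, centrally symmetric sectors."""
    step = total_regions / regions_to_enable
    return sorted({round(i * step) % total_regions for i in range(regions_to_enable)})

def apply_zone_power(enable_region, total_regions, regions_to_enable):
    """Turn on the selected regions and turn off the rest via the assumed driver hook."""
    lit = set(select_symmetric_regions(total_regions, regions_to_enable))
    for idx in range(total_regions):
        enable_region(idx, idx in lit)

# Example: select_symmetric_regions(8, 4) -> [0, 2, 4, 6]
```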
It should be noted that, besides the circle shown in fig. 5, the shape of the light source 111 may be a triangle, a rectangle, a square, a parallelogram, another polygon, and so on, which is not limited herein. Likewise, besides the fan-shaped distribution shown in fig. 5, the light emitting regions 1112 may be arranged in a ring, a zigzag, or another layout, which is not limited herein.
Referring to fig. 6, the present invention further provides a computer apparatus 100. The computer device 100 includes a memory 80 and a processor 90. The memory 80 has stored therein computer readable instructions 81. The instructions, when executed by the processor 90, cause the processor 90 to perform the control method of any of the above embodiments. The processor 90 in the computer apparatus 100 may be the microprocessor 20 described above.
For example, when the instructions are executed by the processor 90, the processor 90 may perform the steps of:
01: acquiring the current brightness of a scene;
02: determining the luminous power of the structured light projector 11 from the current brightness; and
03: the structured light projector 11 is controlled to emit light in accordance with the light emission power.
As another example, when the instructions are executed by the processor 90, the processor may further perform the steps of:
021: determining the preset brightness range in which the current brightness is located; and
022: determining, according to the preset brightness range, the luminous power corresponding to the preset brightness range.
The present invention also provides one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors 90, cause the processors 90 to perform the control method of any one of the above embodiments. The processor 90 may be the microprocessor 20 described above.
For example, when the computer-executable instructions are executed by the one or more processors 90, the processors 90 may perform the steps of:
01: acquiring the current brightness of a scene;
02: determining the luminous power of the structured light projector 11 from the current brightness; and
03: the structured light projector 11 is controlled to emit light in accordance with the light emission power.
As another example, when the computer-executable instructions are executed by the one or more processors 90, the processors 90 may further perform the steps of:
021: determining the preset brightness range in which the current brightness is located; and
022: determining, according to the preset brightness range, the luminous power corresponding to the preset brightness range.
In the description herein, references to the terms "one embodiment," "some embodiments," "an example," "a specific example," "some examples," and the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic uses of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, the various embodiments or examples described in this specification, and features of different embodiments or examples, can be combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (12)

1. A control method for a structured light projector, the control method comprising:
acquiring the current brightness of a scene;
determining a luminous power of the structured light projector according to the current brightness; and
controlling the structured light projector to emit light according to the luminous power.
2. The control method according to claim 1, wherein the current brightness is detected by a light sensor.
3. The control method according to claim 1, wherein the step of obtaining the current brightness of the scene comprises:
acquiring a captured image of the scene; and
calculating the current brightness according to the captured image.
4. The control method of claim 1, wherein a plurality of the luminous powers correspond one-to-one to preset brightness ranges, and the step of determining the luminous power of the structured light projector according to the current brightness comprises:
determining a preset brightness range in which the current brightness is located; and
determining, according to the preset brightness range, the luminous power corresponding to the preset brightness range.
5. The control method according to claim 1, wherein the higher the current brightness, the higher the luminous power.
6. A microprocessor electrically connected to a structured light depth camera comprising a structured light projector, the microprocessor being configured to:
acquire the current brightness of a scene;
determine a luminous power of the structured light projector according to the current brightness; and
control the structured light projector to emit light according to the luminous power.
7. The microprocessor of claim 6, wherein the current brightness is detectable by a light sensor.
8. The microprocessor of claim 6, wherein the microprocessor is further configured to:
acquire a captured image of the scene; and
calculate the current brightness according to the captured image.
9. The microprocessor of claim 6, wherein a plurality of the luminous powers correspond to a plurality of preset brightness ranges, and wherein the microprocessor is further configured to:
determine the preset brightness range in which the current brightness is located; and
determine, according to the preset brightness range, the luminous power corresponding to the preset brightness range.
10. The microprocessor of claim 6, wherein the higher the current brightness, the higher the luminous power.
11. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the control method of any one of claims 1 to 5.
12. A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, cause the processor to carry out the control method of any one of claims 1 to 5.
HK19122468.2A 2019-04-16 Control method, microprocessor, computer readable storage medium and computer equipment HK1262336A1 (en)

Publications (1)

Publication Number: HK1262336A1 (en)
Publication Date: 2020-01-10

