WO2024240509A1 - Sensor arrangement (Agencement de capteur) - Google Patents
Sensor arrangement
- Publication number
- WO2024240509A1 (PCT/EP2024/062842)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- depth
- arrangement
- image sensor
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4817—Constructional features, e.g. arrangements of optical elements relating to scanning
- G01S7/497—Means for monitoring or calibrating
Definitions
- The present invention relates to a sensor arrangement for mapping depth information to colour information, and to a method for mapping depth information to colour information with such a sensor arrangement.
- Optical sensor arrangements such as time-of-flight (ToF) sensors face a challenge in merging data from image and depth sensors, because this requires their intrinsic and extrinsic calibration.
- This calibration helps convert depth pixels into image pixels for mapping purposes.
- The Kinect RGB-D sensor is an example of this process.
- Stereo calibration is performed by capturing a series of images of a known pattern. Calibration can become a bottleneck for large-scale production and requires repetition for each sensor. Autocalibration algorithms are necessary because calibration may change during the sensor lifetime. Joint calibration of sensors can be done in a simultaneous localization and mapping (SLAM) framework, but this adds hardware complexity and power consumption. A minimal sketch of such pattern-based calibration follows.
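For illustration only, a minimal sketch of the pattern-based calibration referenced above (Zhang's method, as implemented in OpenCV); the chessboard size and the `images` sequence are assumptions of this example, not details from the patent:

```python
import cv2
import numpy as np

# Assumed input: `images`, a list of BGR views of a known 9x6 chessboard.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for img in images:
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Intrinsics K and distortion coefficients; this must be repeated for
# every sensor, which is exactly the production bottleneck noted above.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```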
- Mapping dead spots can arise from differences in the baseline and field-of-view (FoV) of the sensors. Additionally, there can be inaccuracies due to differing viewpoints of objects between the sensors. High-resolution RGB-D sensors are not always practical, so machine learning is used to merge data from low-resolution depth imagers and high-resolution RGB sensors.
- The document "A Flexible New Technique for Camera Calibration" discloses the acquisition of a known planar calibration pattern in multiple poses viewed by both sensors.
- Non-linear optimization of a projection/deprojection model based on the common features detected on both cameras allows for the desired mapping.
- This technique is limited by the ability to capture an intensity image of suitable quality from the depth sensor.
- For sparse dToF or iToF depth sensors it can become challenging and end up in suboptimal solutions.
- For frequency-modulated SMI signals it is not even clear how to obtain such intensity images.
- The method implies a parallax between the cameras, which is desirable for stereo imaging but detrimental for colour mapping.
- The document US 9,210,288 B2 discloses the registration of a point cloud and a colour image in a parallax-free strategy, with the depth scanner being an amplitude-modulated laser scanner with a rotary mirror and base, and a line-scan colour camera.
- One objective of the present invention is to provide an improved sparse range sensor arrangement for mapping depth information to colour information.
- This is achieved by a sensor arrangement for mapping depth information to colour information.
- Such a sensor arrangement comprises at least one image sensor, preferably an array of image sensors, configured to capture colour information of a scene; at least one sparse range sensor as depth sensor, preferably an array of depth sensors, configured to capture depth information of the scene; a lens system configured to position the fields of view of the depth sensor and the image sensor to substantially overlap; and a processor configured to associate a depth pixel with a corresponding image pixel.
- The present invention thus provides a micro-scan concept for an optically overlaid high-resolution image sensor and a sparse depth sensor.
- The present invention thereby achieves a straightforward mapping between depth sensor data and image sensor data (texture), particularly avoiding parallax or dead spots.
- In particular, a dense-from-sparse depth array is achieved with smaller deflection angles thanks to the micro-scan.
- The present invention solves multiple inherent problems via an optical setup that aligns the fields of view of the depth and image sensors, particularly in embodiments comprising a dichroic mirror. While both sensors usually suffer from the parallax effect, the present invention advantageously removes parallax.
- The present inventors thereby surprisingly found that although the idea of parallax-free imaging is not new, it can be advantageously applied to registering a depth and a texture image.
- The present invention is also applicable to 2D focal plane arrays, particularly a high-resolution image sensor and a potentially sparse depth sensor. Thereby, the present invention further provides a micro-scan concept combined with the optical overlay of the sensors. Lastly, the proposed calibration procedure is unique and can be practically implemented for mass production.
- The image sensor is a colour camera, for example a regular CCD/CMOS array.
- "Sparse range sensor" preferably refers to a type of range sensor that operates by capturing distance measurements at a limited number of discrete points in a scene. Unlike dense range sensors, which capture a depth map with continuous measurements at each pixel, sparse range sensors preferably capture only a small number of depth measurements, resulting in a lower overall data density. The sparse range sensor may use various techniques to capture these distance measurements, such as time-of-flight or structured light.
- The image sensor and/or the depth sensor comprises a light source and a light detector.
- The sparse range sensor emits and receives light in the form of laser beams.
- The image sensor is a high-resolution image sensor.
- The sensor arrangement of the present invention comprises an array of depth sensors.
- The depth sensor of the present invention emits frequency-modulated light with self-mixing interferometry, or uses pulse-based or amplitude-modulated ToF; the range relations behind these three modalities are illustrated in the sketch below.
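A back-of-envelope sketch of the standard range relations for the three modalities just named (textbook formulas with illustrative numbers; none of the parameter values are taken from the patent):

```python
import numpy as np

c = 3.0e8  # speed of light in m/s

# Frequency-modulated (SMI/FMCW): a sweep of bandwidth B over period T
# reflected from range R yields a beat frequency f_b = 2*B*R / (c*T).
B, T, f_beat = 1.0e9, 1e-3, 200e3
print(f_beat * c * T / (2 * B))         # 30.0 m

# Pulse-based dToF: range from the round-trip time of a pulse.
t_flight = 20e-9
print(c * t_flight / 2)                 # 3.0 m

# Amplitude-modulated iToF: range from the phase difference at f_mod.
f_mod, phase = 20e6, np.pi / 2
print(c * phase / (4 * np.pi * f_mod))  # 1.875 m
```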
- The image sensor and the depth sensor are arranged such that the fields of view of the depth sensor and the image sensor are aligned.
- The depth sensor and the image sensor are positioned orthogonally to each other.
- In that case, the main directions of emittance, and thus the directions of the light beams emitted by the image sensor and the depth sensor, respectively, are perpendicular to each other.
- Alternatively, the depth sensor and the image sensor are positioned in a plane side-by-side to each other.
- In that case, the main directions of emittance, and thus the directions of the light beams emitted by the image sensor and the depth sensor, respectively, are parallel to each other.
- The sensor arrangement further comprises a mirror arrangement.
- Such a mirror arrangement preferably comprises at least one first mirror.
- The present sensor arrangement may comprise more than one mirror, particularly two mirrors, namely a first mirror and a second mirror.
- The mirrors are arranged such that the fields of view of the depth sensor and the image sensor are aligned.
- The arrangement comprises at least a first mirror, wherein said first mirror is a dichroic mirror.
- "Dichroic mirror" preferably refers to an optical component that selectively reflects and transmits light of different wavelengths or colours.
- A dichroic mirror as known in the art may comprise a thin coating with optical properties that allow reflection of certain wavelengths of light while transmitting others.
- Such dichroic mirrors are usually made of glass or another transparent material and coated with a thin layer of metal or dielectric material that has different reflective properties for different wavelengths of light.
- A dichroic mirror can be advantageously arranged such that the fields of view of the depth sensor and the image sensor are aligned.
- The depth sensor and the image sensor may be positioned at an angle, particularly orthogonally, to each other.
- The dichroic mirror may be arranged accordingly to align the fields of view of the depth sensor and the image sensor.
- The arrangement further comprises a micro-scan device.
- A micro-scan device is advantageously selected from the group comprising a micro-electro-mechanical system (MEMS), a voice coil motor (VCM), a steerable lens comprising a metalens and a MEMS, or an optical phased array (OPA).
- The micro-scan device comprises a micro-electro-mechanical system (MEMS) configured to actuate one of the mirrors.
- A MEMS is arranged at the first and/or the second mirror such that it can actuate the first and/or the second mirror, respectively.
- A MEMS may be arranged to actuate a dichroic mirror or any other mirror.
- The mirrors of the sensor arrangement of the present invention can be arranged in various embodiments such that the fields of view of the depth sensor and the image sensor are aligned.
- "MEMS" is preferably understood to be a microelectromechanical system. It may particularly refer to a technology of miniaturized mechanical and electro-mechanical elements that are fabricated using microfabrication techniques. MEMS devices are typically made of components that are smaller than 100 micrometres, and they are commonly used in a wide range of applications, such as sensors, actuators, and microfluidic systems. It is known in the art that MEMS technology enables the creation of highly precise and efficient devices that can perform complex functions with high reliability and accuracy.
- The micro-scan device comprises a voice coil motor (VCM) configured to translate the lens system.
- VCMs are used in the art for various electronic devices, including smartphones and cameras.
- The VCM may particularly comprise a coil that moves inside a magnetic field, generating a force that is used to move the component.
- By controlling the amount of current supplied to the coil, the position of the component can be accurately controlled.
- VCMs are known for their precision, speed, and ability to provide high force in a small form factor.
- The micro-scan device comprises a metalens and a MEMS configured to create a steerable lens.
- "Metalens" preferably refers to a type of lens made of an array of nanostructures.
- Such a metalens is typically made of metals or dielectrics arranged in a periodic pattern. These structures may be designed such that the phase, amplitude, and/or polarization of the light may be altered, allowing the metalens to focus or shape light in specific ways.
- Metalenses may be made extremely thin and lightweight, making them an ideal technology in the context of the present invention for a wide range of applications, such as high-resolution imaging, optical communications, and augmented reality.
- The micro-scan device comprises an optical phased array (OPA).
- The OPA may be configured to directly steer the laser beams emitted by the depth sensor.
- The term "OPA" or "optical phased array" preferably refers to a device that uses multiple small, individually controllable light sources to produce a focused beam of light. Each light source produces a small portion of the overall beam, and by controlling the phase of each source, the individual beams can be made to interfere constructively, resulting in a focused beam that can be steered in different directions.
- OPAs can be used in a variety of applications of the inventive sensor arrangement, comprising lidar systems for autonomous vehicles, optical communications, and laser machining.
- The depth sensor is a sparse range sensor emitting IR light. This is advantageous, particularly when using a dichroic filter arranged such that the fields of view of the depth sensor and the image sensor are aligned.
- The image sensor comprises a VIS and/or IR sensor.
- "VIS" stands for "visible" and refers to the range of wavelengths of electromagnetic radiation that are visible to the human eye.
- The visible spectrum refers to the spectrum between about 400 nanometres and about 700 nanometres. This includes the colours of the rainbow, from violet to red.
- The "IR" (infrared) spectrum refers to the spectrum between about 700 nanometres and 1 millimetre. This type of radiation is not visible to the human eye, but can be felt as heat, and is commonly used in various imaging applications.
- The depth sensor is a sparse range sensor comprising a sensor array selected from the group comprising a Vertical-Cavity Surface-Emitting Laser (VCSEL) and/or a PD/ASIC arrangement, a SPAD array and a flood or dot projector.
- VCSELs are known to have several advantages over traditional edge-emitting lasers, including low power consumption, high efficiency, and the ability to be fabricated in large arrays. They are used in a variety of applications, including optical communication, sensing, and 3D sensing in mobile devices.
- "PD" refers to "photodiode" and the term "ASIC" stands for "application-specific integrated circuit".
- A photodiode or "PD", as used herein, preferably refers to a semiconductor device that converts light into an electrical current. It works by absorbing photons from incoming light and generating a flow of electrons in response. Photodiodes are commonly used in applications such as optical communications, sensing, and imaging.
- The ASIC is preferably used for processing the electrical signals generated by the photodiode and for controlling the operation of the VCSEL.
- The sensor arrangement further comprises a dot detector.
- The term "dot detector" as used herein preferably refers to an algorithm capable of detecting the position of laser dots or spots on an image taken by the image sensor. It preferably works by identifying local maxima in the intensity of the image and using quadratic interpolation to estimate the precise location of the laser dot.
- The dot detector preferably is used to extract the xy coordinates of the laser spots in the image and store them in a table for each mechanical position of the mirror or lens. This information preferably is then used at runtime to generate a 3D point cloud of the scene.
- The dot detector thus may play a crucial role in the proposed hardware solution according to the present invention, as it enables the association of depth information with image information for dense RGB-D reconstruction from a sparse depth sensor in the sensor arrangement according to the present invention.
- A dot detector in the sensor arrangement according to the present invention preferably is configured to extract the locations of the laser spots on the image sensor.
- A dot detector in the sensor arrangement according to the present invention preferably is for extracting the xy projections of the laser spots onto the high-resolution imager into one table per mechanical position of a mirror or lens.
- A dot detector in the sensor arrangement according to the present invention preferably is based on local maxima and quadratic interpolation, as sketched below.
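A minimal sketch of such a dot detector; the function name, threshold value, and 3x3 neighbourhood are assumptions of this example, not specifics from the patent:

```python
import numpy as np

def detect_dots(image, threshold=50.0):
    """Detect laser spots as local intensity maxima, refined to
    sub-pixel accuracy by quadratic (parabolic) interpolation."""
    img = image.astype(float)
    h, w = img.shape
    spots = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = img[y, x]
            if v < threshold or v < img[y - 1:y + 2, x - 1:x + 2].max():
                continue  # not a local maximum above threshold
            # The vertex of a parabola through three samples gives the
            # sub-pixel offset of the true peak along each axis.
            den_x = img[y, x - 1] - 2 * v + img[y, x + 1]
            den_y = img[y - 1, x] - 2 * v + img[y + 1, x]
            dx = 0.5 * (img[y, x - 1] - img[y, x + 1]) / den_x if den_x else 0.0
            dy = 0.5 * (img[y - 1, x] - img[y + 1, x]) / den_y if den_y else 0.0
            spots.append((x + dx, y + dy))
    return spots
```

Per the text above, the resulting xy coordinates would then be stored in one table per mechanical position of the mirror or lens.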
- The processor is configured to generate a 3D point cloud of the scene.
- A 3D point cloud of the scene is preferably generated by the processor by tracing rays from the stored xy locations through the optical centre of the image sensor out to the estimated depth values, thereby producing a 3D point cloud of the scene, as sketched below.
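A sketch of this deprojection step under a standard pinhole model; the intrinsic matrix `K` comes from the geometric calibration of the image sensor, and treating each depth value as the Z coordinate of the point is an assumption of this example:

```python
import numpy as np

def deproject(K, spots_xy, depths):
    """Trace a ray from each stored xy spot location through the
    optical centre and scale it out to the measured depth,
    yielding one 3D point per depth measurement."""
    K_inv = np.linalg.inv(K)
    points = []
    for (x, y), z in zip(spots_xy, depths):
        ray = K_inv @ np.array([x, y, 1.0])  # pixel ray direction
        points.append(ray * (z / ray[2]))    # scale so Z equals the depth
    return np.array(points)
```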
- The processor is configured to attach an intensity or colour to each 3D point.
- The processor may take either the value of the closest corresponding image pixel and/or a linear combination of neighbouring pixels weighted by their distance to the xy location (see the sketch below).
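A sketch of the two colour-attachment options just named: the nearest pixel, or a distance-weighted blend; restricting the blend to the four neighbouring pixels is an assumption of this example:

```python
import numpy as np

def attach_colour(image, x, y, blend=True):
    """Return a colour for the 3D point projected at sub-pixel (x, y).
    Bounds checks are omitted for brevity."""
    if not blend:
        return image[int(round(y)), int(round(x))]  # closest pixel
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    colours, weights = [], []
    for yy in (y0, y0 + 1):
        for xx in (x0, x0 + 1):
            w = 1.0 / (np.hypot(x - xx, y - yy) + 1e-6)  # inverse distance
            colours.append(image[yy, xx].astype(float))
            weights.append(w)
    weights = np.asarray(weights) / np.sum(weights)
    return sum(w * c for w, c in zip(weights, colours))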
- The sensor arrangement according to the present invention may further comprise an RGB image sensor.
- The RGB image sensor advantageously may be configured to enable auto-calibration of the setup by allowing continuous visibility of the dots.
- The processor is configured to perform dense RGB-D reconstruction from the sparse depth sensor.
- The processor may be configured to require only the geometric camera calibration of the high-resolution image sensor for camera calibration.
- The processor may be configured to store the association between depth sensor data and image sensor data in look-up tables, preferably thereby reducing computational complexity, as sketched below.
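How such look-up tables might be used, combining the sketches above; `set_scan_position` and `capture_image` are hypothetical stand-ins for the actuator and camera interfaces (which the patent does not name), and `scan_positions`, `current_pos`, `depths`, `K`, and `rgb_image` are assumed inputs:

```python
# Offline calibration: one table of spot locations per scan position.
lut = {}
for pos in scan_positions:                   # mechanical mirror/lens positions
    set_scan_position(pos)                   # hypothetical actuator call
    lut[pos] = detect_dots(capture_image())  # hypothetical capture call

# Runtime: no per-frame deprojection/projection search, just a look-up.
spots = lut[current_pos]
points = deproject(K, spots, depths)
rgbd = [(p, attach_colour(rgb_image, x, y))
        for p, (x, y) in zip(points, spots)]
```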
- A feature, embodiment, effect or advantage described herein in connection with the inventive sensor arrangement may also be a feature, embodiment, effect or advantage of the inventive method, respectively, and vice versa.
- Such a method comprises at least the following steps: a step of capturing depth information of a scene using at least one sparse range sensor as depth sensor; a step of capturing colour information of the scene using at least one image sensor; optionally, a step of aligning the fields of view of the depth sensor and the image sensor to substantially overlap using a dichroic mirror and a lens system; and a step of associating a depth pixel with a corresponding image pixel using a processor.
- The sensor arrangement is the sensor arrangement according to the present invention.
- The method further comprises a step of extracting the locations of the laser spots on the image sensor using a dot detector.
- Typical fields of application for a sensor arrangement and/or the method according to the invention comprise but are not limited to:
- FIG. 1 illustrates schematically a sensor arrangement and the respective field of illumination and field of view according to a first embodiment of the present invention, wherein the depth sensor and the image sensor are positioned orthogonally to each other;
- FIG. 2 illustrates schematically a sensor arrangement and the respective field of illumination and field of view according to a second embodiment of the present invention, wherein the depth sensor and the image sensor are positioned in a plane side-by-side to each other;
- FIG. 3A and 3B illustrate schematically a sensor arrangement and the respective field of illumination and field of view according to a third and a fourth embodiment of the present invention, wherein the sensor arrangement comprises a micro-scan device in the form of a micro-electro-mechanical system (MEMS);
- FIG. 4 illustrates schematically a sensor arrangement and the respective field of illumination and field of view according to a fifth embodiment of the present invention, wherein the sensor arrangement comprises a micro-scan device in the form of a VCM;
- FIG. 5 illustrates schematically a sensor arrangement and the respective field of illumination and field of view according to a sixth embodiment of the present invention, wherein the sensor arrangement comprises a micro-scan device in the form of a steerable metalens;
- FIG. 6 illustrates schematically a sensor arrangement and the respective field of illumination and field of view according to a further embodiment of the present invention, wherein the sensor arrangement comprises a micro-scan device in the form of an OPA;
- FIG. 7 illustrates schematically a sensor arrangement and the respective field of illumination and field of view according to a seventh embodiment of the present invention, wherein the depth sensor and the image sensor are positioned in a plane side-by-side to each other and the fields of view of the sensors do not overlap.
- The sensor arrangement 1 is suitable for mapping depth information to colour information, and comprises an image sensor 2 configured to capture colour information of a scene and a sparse range sensor as a depth sensor 3 that is configured to capture depth information of the scene.
- A lens system 4, 5 is provided that is configured to position the fields of view of the depth sensor 3 and the image sensor 2, respectively, to substantially overlap.
- The image sensor 2 and the first lens 4 are arranged adjacent to each other, such that the light emitted by the image sensor 2 passes lens 4.
- The depth sensor 3 is arranged adjacent to a second lens 5, such that the light emitted by the depth sensor 3 passes second lens 5.
- A processor 6 is provided that is configured to associate a depth pixel as sensor data from the depth sensor 3 with a corresponding image pixel as sensor data from the image sensor 2.
- In FIG. 1, a sensor arrangement and the respective field of illumination and field of view according to a first embodiment of the present invention are schematically shown.
- Here, the depth sensor 3 and the image sensor 2 are positioned orthogonally to each other.
- The depth sensor 3 and the image sensor 2 are arranged such that the fields of view of the depth sensor 3 and the image sensor 2 are aligned.
- The sensor arrangement 1 further comprises a mirror arrangement in the form of the first mirror 7, which is a dichroic mirror arranged to align the fields of view of the depth sensor 3 and the image sensor 2.
- The light of the image sensor 2 passes mirror 7, while the light emitted by the depth sensor 3 is reflected by the dichroic mirror 7.
- The dichroic mirror 7 thus aligns the fields of view of the depth sensor 3 and the image sensor 2.
- The receiver planes of the image sensor 2 and the depth sensor 3 are placed orthogonal to each other and are then aligned by the single dichroic mirror 7 (see FIG. 1).
- The present invention advantageously provides a unique association of a depth pixel to an image pixel, regardless of the object distances.
- The geometric camera calibration of the high-resolution image sensor 2 directly provides the mapping between 1D range measurements and a 3D point cloud by deprojection of the pixel rays.
- The image sensor 2 may be a high-resolution colour sensor, e.g., a 3264x2448 CMOS image sensor.
- The depth sensor 3 may comprise a sparse VCSEL array, e.g., of 15x20 elements.
- The depth sensor 3 is associated with its corresponding lens 5.
- The receiver side of the depth sensor may comprise a sparse photodiode array 3, e.g., of 15x20 elements.
- As filter 7, a dichroic filter (e.g., with a transmission band between 400 and 730 nm and a reflection band between 770 and 1100 nm) is provided at a 45 degree angle relative to the main emission direction of the image sensor 2 and the depth sensor 3, respectively.
- The VCSEL/PD arrays 3 are placed at an angle of 90 degrees with respect to the image sensor 2.
- The dichroic filter 7 is placed in front of the image sensor lens 4 at an angle of 45 degrees.
- The image sensor 2 may alternatively be a monochrome or infrared camera with a lower resolution, e.g., VGA.
- The VCSEL/PD arrays 3 implement self-mixing interferometry (SMI) with frequency modulation to measure absolute distances and Doppler speeds.
- The photodiode array 3 may be integrated directly into the VCSELs 3, or may not be needed in case the SMI signal is read from the junction voltages.
- The VCSEL/PD arrays 3 may be replaced by a SPAD array 3 and a flood or dot projector 3. In this case, the depth sensor 3 estimates the time of flight directly. Additionally or alternatively, in an embodiment of the invention, amplitude modulation may be used and the difference of phases (indirect ToF) may be measured.
- In FIG. 2, a sensor arrangement 1 and the respective field of illumination and field of view according to a second embodiment of the present invention are shown.
- The depth sensor 3 and the image sensor 2 are positioned in a plane side-by-side to each other. Accordingly, the receiver planes of the image sensor and the depth sensor are arranged next to each other and combined with two respective mirrors 7 and 8, respectively (see FIG. 2).
- The sensor arrangement 1 comprises a mirror arrangement 7, 8, with one first mirror 7 and one second mirror 8, both arranged to align the fields of view of the depth sensor 3 and the image sensor 2.
- The second mirror 8 reflects the laser beams emitted by the depth sensor 3 onto the dichroic first mirror 7, as in the embodiment shown in FIG. 1.
- This dichroic mirror 7 may act in transmission or reflection depending on the wavelength.
- The mirrors 7 and 8 are positioned in such a way that the fields of view of both sensors 2 and 3 match, respectively.
- The second mirror 8 may be a fully reflective mirror that reflects the laser beams emitted by the depth sensor 3 onto the dichroic filter 7.
- This setup may be of particular advantage for deployment in mobile or AR/VR use cases.
- A third and a fourth embodiment of the present invention are shown in FIG. 3A and 3B, respectively.
- Here, the sensor arrangement 1 comprises a micro-scan device 9 in the form of a micro-electro-mechanical system (MEMS).
- The basic setup is based on the second embodiment shown in FIG. 2, comprising one second mirror 8 that reflects the laser beams emitted by the depth sensor 3 onto the dichroic first mirror 7.
- The micro-scan of the embodiments shown in FIG. 3A and 3B necessitates only small deflection angles (a couple of degrees, as compared to approximately 20 degrees), which puts less stress on the mechanical design.
- The MEMS actuator 9 steers the respective mirror 7 and/or 8 by up to 3.2 degrees in steps of 0.1 degree along two orthogonal axes.
- The original depth resolution of, for example, 15x20 may thereby be extended up to 480x640 pixels at a rate of 30 fps, using a capture time of 33 microseconds per pixel array; these numbers are checked in the sketch below.
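A quick consistency check of the quoted numbers (arithmetic only; the exact mapping of angle steps to scan positions is an assumption of this example):

```python
steps_per_axis = round(3.2 / 0.1)       # 32 micro-scan offsets per axis
positions = steps_per_axis ** 2         # 1024 scan positions in total
print(15 * steps_per_axis, 20 * steps_per_axis)  # 480 640 effective pixels
frame_time = positions * 33e-6          # 33 us capture per pixel array
print(round(1 / frame_time, 1))         # 29.6 fps, i.e. about 30 fps
```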
- In FIG. 4, the sensor arrangement 1 comprises a micro-scan device 9 in the form of a VCM 9.
- The VCM may generate a force that is used to move the lens 5, whose position may thus be accurately controlled.
- In FIG. 5, a steerable metalens 9 is provided as the micro-scan device 9, preferably alternatively or in addition to a MEMS micro-scan device 9.
- The metalens may be provided in addition to or, preferably, as a replacement of lens 5.
- In FIG. 6, an OPA 9 is provided as the micro-scan device 9.
- Such an optical phased array (OPA) 9 advantageously allows the laser beams to be steered directly (FIG. 6).
- A sensor array may be used, which advantageously may improve the depth resolution at low cost by small and configurable mechanical motions.
- Given an intrinsic camera matrix for the high-resolution sensor obtained by standard calibration methods, the present invention allows for a straightforward mapping of depth to texture pixels and hence the creation of a coloured 3D point cloud.
- The mapping may also be created in an offline phase by pointing the RGB-D device at a wall made of material sensitive to infrared light, such as Thorlabs VRC5, which is sensitive from 700 to 1400 nm with a peak relative sensitivity at about 1.0 µm wavelength.
- In FIG. 7, a sensor arrangement 1 and the respective field of illumination and field of view according to a seventh embodiment of the present invention are schematically shown.
- The optical principles may apply as depicted in FIG. 2 of the article by Freundlich et al., "Controlling a Robotic Stereo Camera Under Image Quantization Noise", The International Journal of Robotics Research, June 2017.
- Here, the depth sensor 3 and the image sensor 2 are positioned in a plane side-by-side to each other and the fields of view of the sensors do not overlap.
- An alternative way of solving the problem using a stereo-like setup is thus shown within the scope of the present invention. It is immediately apparent from FIG. 7 that in such an embodiment the fields of view of the sensors 2 and 3 do not overlap. Also, without further mechanical elements such as mirrors, a single image is sufficient. Whenever MEMS or VCM are used, one image per angle step is required.
- Since the laser beams appear as spots on the texture image at fixed locations at any distance for a given setup, their locations may be advantageously extracted using a dot detector based on local maxima and quadratic interpolation.
- The xy projections of the laser spots onto the high-resolution imager are stored into one table per mechanical position of the mirror or lens.
- Rays are then simply traced from the stored xy locations through the optical centre of the image sensor 2 out to the estimated depth values. This generates a 3D point cloud of the scene.
- An intensity or colour may be attached to each 3D point by taking either the value of the closest corresponding image pixel or a linear combination of neighbouring pixels weighted by their distance to the xy location.
- The present invention provides a lower computational complexity, in that the association between depth and texture remains fixed and can be stored in look-up tables, in contrast to classic solutions requiring deprojection/projection;
- the present invention provides a full matching of the image and depth sensor fields of view, particularly without dead spots, i.e. all depth pixels can be associated to a texture pixel;
- the present invention provides a rather simple camera calibration, in that the inventive method requires only the geometric camera calibration of the high-resolution image sensor, and enables dense RGB-D reconstruction from a sparse depth sensor;
- the present invention is suitable also for small deflection angles of the steerable mirror or lens;
- the present invention requires only a small amount of computational power for depth reconstruction.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to a sensor arrangement (1) for mapping depth information to colour information, comprising: an image sensor (2) configured to capture colour information of a scene; a depth sensor (3) configured to capture depth information of the scene; a lens system (4, 5) configured to position the fields of view of the depth sensor (3) and the image sensor (2) so that they substantially overlap; and a processor (6) configured to associate a depth pixel with a corresponding image pixel.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102023113359 | 2023-05-22 | ||
| DE102023113359.4 | 2023-05-22 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024240509A1 (fr) | 2024-11-28 |
Family
ID=91334538
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2024/062842 Pending WO2024240509A1 (fr) | 2023-05-22 | 2024-05-08 | Sensor arrangement |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024240509A1 (fr) |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8698939B2 (en) | 2008-02-18 | 2014-04-15 | The Board Of Regents For Oklahoma State University | Dual beam optic with dichroic filter |
| US9210288B2 (en) | 2009-11-20 | 2015-12-08 | Faro Technologies, Inc. | Three-dimensional scanner with dichroic beam splitters to capture a variety of signals |
| DE102017222614A1 (de) * | 2017-12-13 | 2019-06-13 | Robert Bosch Gmbh | Vorrichtung zur Umgebungserfassung sowie Verfahren zu dessen Betrieb |
| US20210208262A1 (en) * | 2018-09-16 | 2021-07-08 | Apple Inc. | Calibration of a depth sensing array using color image data |
| WO2022165650A1 (fr) * | 2021-02-02 | 2022-08-11 | 华为技术有限公司 | Dispositif de détection, procédé de commande, système de détection par fusion et terminal |
| EP4276495A1 (fr) * | 2021-02-02 | 2023-11-15 | Huawei Technologies Co., Ltd. | Dispositif de détection, procédé de commande, système de détection par fusion et terminal |
Non-Patent Citations (4)
| Title |
|---|
| FREUNDLICH ET AL.: "Controlling a Robotic Stereo Camera Under Image Quantization Noise", THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, June 2017 (2017-06-01) |
| LOVEDAY, M.; BRECKON, T.P.: "On the Impact of Parallax Free Colour and Infrared Image Co-Registration to Fused Illumination Invariant Adaptive Background Modelling", CVPR WORKSHOP, 2018 |
| WANG, MEMS MIRRORS FOR LIDAR: A REVIEW, 2020 |
| ZHANG, A FLEXIBLE NEW TECHNIQUE FOR CAMERA CALIBRATION, 1998 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12105202B2 (en) | Optoelectronic sensor and method of detecting objects | |
| AU2021200905B2 (en) | Synchronized spinning lidar and rolling shutter camera system | |
| US11425359B2 (en) | Apparatus and method for generating three-dimensional image | |
| KR101762525B1 (ko) | Apparatus and method for depth scanning using multiple emitters | |
| US10267915B2 (en) | Optical system for object detection and location | |
| GB2579689A (en) | Improved 3D sensing | |
| US10353074B2 (en) | Agile navigation and guidance enabled by LIDAR (ANGEL) | |
| CN108718406A (zh) | Variable-focus 3D depth camera and imaging method thereof | |
| CN115023627A (zh) | Efficient algorithm for projecting world points onto rolling shutter images | |
| US12066574B2 (en) | Optical system for object detection and location using a Micro-Electro-Mechanical System (MEMS) Micro-Mirror Array (MMA) beamsteering device | |
| US11747481B2 (en) | High performance three dimensional light detection and ranging (LIDAR) system for drone obstacle avoidance | |
| WO2024240509A1 (fr) | Sensor arrangement | |
| US20210382150A1 (en) | Wide fov lidar and vehicle with multiple galvanometer scanners | |
| Choudhury et al. | Simultaneous enhancement of scanning area and imaging speed for a MEMS mirror based high resolution LiDAR | |
| EP4260111B1 (fr) | Systèmes d'imagerie à vision d'ambiance | |
| US12399278B1 (en) | Hybrid LIDAR with optically enhanced scanned laser | |
| WO2023048691A1 (fr) | Lidar rapide et à haute résolution doté d'un dispositif de balayage polygonal multi-axe et multi-surface |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24729965; Country of ref document: EP; Kind code of ref document: A1 |