
WO2024241321A1 - Micro-optics on detection path of lidar systems - Google Patents


Info

Publication number
WO2024241321A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
micro
optic
lidar system
reflected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2024/050511
Other languages
French (fr)
Inventor
Yonatan Korner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Innoviz Technologies Ltd
Original Assignee
Innoviz Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Innoviz Technologies Ltd filed Critical Innoviz Technologies Ltd
Publication of WO2024241321A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements relating to scanning
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10 Scanning systems
    • G02B26/101 Scanning systems with both horizontal and vertical deflecting means, e.g. raster or XY scanners

Definitions

  • the present disclosure relates to Light Detection and Ranging (LIDAR) technology for scanning a surrounding environment, and, more specifically, but not exclusively, to increasing noise immunity of LIDAR systems deployed for scanning a surrounding environment.
  • LIDAR Light Detection and Ranging
  • RADAR Radio Detection and Ranging
  • camera-based systems and/or the like operating alone, in conjunction, and/or in a redundant manner.
  • LIDAR based object detection and surroundings mapping has proved to be highly efficient, reliable, and robust compared to other detection technologies.
  • While LIDAR based detection systems may be extremely efficient, their performance, whether employing pulsed or continuous wave illumination, may be affected, and possibly significantly degraded, due to environmental interference such as, for example, noise (e.g., ambient light, crosstalk, stray light, etc.), excessive light reflection, parasitic reflections, external light sources, and interference at the component and electrical circuit level, to name just a few.
  • an imaging system for receiving light reflected from a field of view (FOV) illuminated by a LIDAR system, comprising a focusing unit configured to receive reflected light from a FOV illuminated by a LIDAR system and focus a plurality of portions of the reflected light on a focal plane of the focusing unit, a sensor array comprising a plurality of sensing elements configured to detect the reflected light, and a micro-optic array comprising a plurality of micro-optic elements each associated with a respective one of the plurality of sensing elements.
  • FOV field of view
  • Each micro-optic element comprises a light entrance coincident with the focal plane and configured for receiving a respective portion of the plurality of portions of reflected light from the focusing unit, a light exit through which the respective portion of the reflected light is transmitted to the associated sensing element, and a light blocking exterior disposed between the light entrance and the light exit, the light blocking exterior is configured to prevent transmission of the reflected light.
  • a method of distributing light reflected from a field of view (FOV) illuminated by a LIDAR system on sensing elements of the LIDAR system comprising receiving, via an optical system of a LIDAR system, light reflected from an FOV illuminated by the LIDAR system, wherein the optical system is configured to focus a plurality of portions of the reflected light on a focal plane of the optical system, transmitting the reflected light via a micro-optic array to a sensor array comprising a plurality of sensing elements configured to detect the reflected light.
  • the micro-optic array comprises a plurality of micro-optic elements each associated with a respective one of the plurality of sensing elements.
  • each micro-optic element comprises: a light entrance coincident with the focal plane and configured for receiving a respective portion of the plurality of portions of reflected light from the focusing unit, a light exit through which the respective portion of the reflected light is transmitted to the associated sensing element, and a light blocking exterior disposed between the light entrance and the light exit, the light blocking exterior is configured to prevent transmission of the reflected light.
  • each of the plurality of sensing elements comprises an array of light detecting elements.
  • Each micro-optic element is configured to disperse light received via the light entrance of the respective micro-optic element over a sensing surface of the array of light detecting elements of the associated sensing element, wherein a surface area of the sensing surface is larger than the surface area of a cross section of the light entrance.
  • a cross section of a surface of the light exit has a larger surface area than a cross section of a surface of the light entrance by a ratio in a range of 25/1 to 4/1.
  • the light entrance and the light exit of each micro-optic element are transparent to the reflected light.
  • each micro-optic element is shaped to have a curvature configured to disperse the respective portion of reflected light over a sensing surface of the associated sensing element.
  • each micro-optic element is associated with one or more respective lenses configured to disperse the respective portion of reflected light over a sensing surface of the associated sensing element.
  • the light entrance and the light exit of each micro-optic element are spaced apart and optically coupled to each other via the volume of the respective micro-optic element between the light entrance and the light exit.
  • each micro-optic element is shaped to prevent transmission of light beams having an angle of incidence (AOI) with a surface of the light entrance outside a predefined angle range.
  • AOI angle of incidence
  • each micro-optic element comprises a front-end conduit geometrically shaped to direct light beams having an AOI with the surface of the light entrance outside the predefined angle range away from the light exit.
  • the conduit has a uniform cross section.
  • the conduit has a varying cross section.
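The acceptance-angle behavior described in the bullets above can be sketched numerically. The following Python snippet is an illustrative model only, not the patented conduit geometry: it simply classifies an incoming ray as accepted or rejected by comparing its angle of incidence at the light entrance against a hypothetical acceptance half-angle.

```python
import math

def accepted(ray_dir, surface_normal, max_aoi_deg):
    """Return True if the ray's angle of incidence at the entrance surface
    is within the acceptance half-angle (a hypothetical limit)."""
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    d, n = norm(ray_dir), norm(surface_normal)
    # AOI is conventionally measured from the surface normal.
    cos_aoi = abs(sum(a * b for a, b in zip(d, n)))
    aoi_deg = math.degrees(math.acos(min(1.0, cos_aoi)))
    return aoi_deg <= max_aoi_deg

# A ray arriving 10 degrees off-normal passes a 15-degree acceptance limit;
# a ray arriving 40 degrees off-normal is rejected (directed away from the exit).
print(accepted((0.0, math.sin(math.radians(10)), -math.cos(math.radians(10))), (0, 0, 1), 15.0))  # True
print(accepted((0.0, math.sin(math.radians(40)), -math.cos(math.radians(40))), (0, 0, 1), 15.0))  # False
```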
  • the light blocking exterior is configured to absorb incident light external to the micro-optic element.
  • the light blocking exterior is configured to reflect incident light external to the micro-optic element.
  • the light blocking exterior is configured to reflect incident light inside the micro-optic element.
  • each micro-optic element is a monolithic component.
  • each micro-optic element is shaped as a truncated pyramid wherein a truncated top facet of the pyramid constitutes the light entrance, a base facet of the pyramid constitutes the light exit, and a plurality of side facets of the pyramid constitute the light blocking exterior.
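The truncated-pyramid shape lends itself to a quick sanity check of the claimed exit-to-entrance area ratio (25/1 to 4/1). The facet side lengths below are hypothetical placeholders; only the ratio bound comes from the text above.

```python
# Sketch of the truncated-pyramid micro-optic element: the truncated top
# facet is the light entrance, the base facet is the light exit.
# Side lengths are hypothetical; the claim only bounds the area ratio.
entrance_side_um = 20.0   # truncated top facet (light entrance), assumed
exit_side_um = 80.0       # base facet (light exit), assumed

entrance_area = entrance_side_um ** 2
exit_area = exit_side_um ** 2
ratio = exit_area / entrance_area

print(f"exit/entrance area ratio: {ratio:.0f}/1")  # 16/1
assert 4.0 <= ratio <= 25.0, "outside the claimed 25/1 to 4/1 range"
```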
  • the micro-optic array is a monolithic component.
  • the imaging system further comprises a plurality of optical filters disposed between the plurality of micro-optic elements of the micro-optic array to prevent transmission of light not received via the light entrance of the plurality of micro-optic elements.
  • the LIDAR system comprises a plurality of light sources configured to transmit a plurality of light beams toward at least part of a FOV of the LIDAR system, each of the plurality of portions of the reflected light corresponds to a respective one of the plurality of light beams.
  • the LIDAR system comprises one or more light sources configured to transmit a single elongated light beam toward a FOV of the LIDAR system, the reflected light corresponding to the single elongated light beam is divided into the plurality of portions of the reflected light.
  • non-transitory computer-readable storage media may store program instructions, which are executed by at least one processor and perform any of the methods described herein.
  • FIG. 1A and FIG. 1B are schematic illustrations of an exemplary LIDAR system, in accordance with embodiments of the present disclosure
  • FIG. 2 illustrates graph charts of exemplary light emission patterns projected by a LIDAR system, in accordance with embodiments of the present disclosure
  • FIG. 3 is a schematic illustration of an exemplary micro-optic array of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure
  • FIG. 4A and FIG. 4B are cross section views of exemplary micro-optic elements of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensing element of the LIDAR system, in accordance with embodiments of the present disclosure;
  • FIG. 5A and FIG. 5B are cross section views of exemplary micro-optic elements shaped to deflect light reflected from objects illuminated by the LIDAR system away from a sensing element of the LIDAR system, in accordance with embodiments of the present disclosure;
  • FIG. 6 depicts various views of an exemplary micro-optic array of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure;
  • FIG. 7A and FIG. 7B are schematic illustrations of exemplary micro-optic arrays of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system and filtering out spatial light noise, in accordance with embodiments of the present disclosure
  • FIG. 8 is a flow chart of an exemplary process of using micro-optic elements for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure.
  • the present disclosure relates to LIDAR technology for scanning a surrounding environment, and, more specifically, but not exclusively, to increasing noise immunity of LIDAR systems deployed for scanning a surrounding environment.
  • LIDAR systems may suffer degradation in their performance, for example, reduced detection accuracy, reliability, consistency, and/or robustness, in terms of reduced detection range, reduced resolution, reduced Signal to Noise Ratio (SNR), reduced confidence of detection, and/or the like due to noise effects originating from one or more sources, for example, crosstalk between sensing elements of the LIDAR system, stray light within the LIDAR system, ambient light, and more.
  • SNR Signal to Noise Ratio
  • LIDAR sensors may typically comprise an array of light detectors, for example, Avalanche Photodiodes (APDs), Single Photon Avalanche Diodes (SPADs), and/or the like which may trigger in response to capturing light (photons), specifically light reflected from objects illuminated by the LIDAR system.
  • APDs Avalanche Photodiodes
  • SPADs Single Photon Avalanche Diodes
  • Dynamic range relates to the range of light energy levels the sensors may be capable of measuring.
  • Each of the light detectors may be triggered by a certain energy level of received light, which is typically small (e.g., a single photon for SPADs), and the sensor may therefore aggregate (e.g., sum, combine, etc.) the outputs of the plurality of light detectors of its array and/or part thereof to indicate the energy level of light captured by the sensor.
  • The size of the sensor, defined by the number of light detectors, may therefore define the dynamic range of the sensor.
  • transmitting the reflected light in a focused beam, i.e., a beam having a significantly small cross section, onto the sensor may result in a small effective sensing surface of the sensor, i.e., the number of light detectors which may receive the reflected light may be significantly small, which may significantly reduce the dynamic range of the sensor.
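The dynamic-range argument above can be made concrete with a toy model. Each single-photon detector fires at most once per measurement window, so a sensing element can register at most as many photon detections as it has illuminated detectors. The photon and detector counts below are hypothetical, and the model deliberately ignores Poisson pile-up on individual detectors.

```python
# Toy model: why the number of illuminated single-photon detectors (SPADs)
# bounds the dynamic range of a sensing element. Counts are hypothetical.

def detected_count(photons, illuminated_spads):
    """Each SPAD saturates at one trigger per window; excess photons are lost."""
    return min(photons, illuminated_spads)

incoming_photons = 500
focused = detected_count(incoming_photons, illuminated_spads=25)     # tight focal spot
dispersed = detected_count(incoming_photons, illuminated_spads=900)  # spread by micro-optic

# The focused spot clips at 25 counts; the dispersed spot resolves all 500.
print(focused, dispersed)  # 25 500
```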
  • components, devices, systems, and methods for using micro-optic elements to increase noise immunity of LIDAR systems and/or for increasing the dynamic range of the LIDAR sensors are provided.
  • An imaging system of a LIDAR system may include a micro-optic array comprising a plurality of micro-optic elements each associated with a respective one of a plurality of sensing elements (sensors) of the LIDAR system.
  • the micro-optic array may be configured to transmit the light reflected from one or more objects in a field of view (FOV) of the LIDAR system which are illuminated by light projected by the LIDAR system.
  • the reflected light may comprise a plurality of light portions and each of the micro-optic elements may be configured to transmit a respective portion of the reflected light to its respective associated sensing element.
  • the plurality of portions of the reflected light may relate, for example, to a plurality of light beams projected by the LIDAR system such that each projected light beam may be associated with a respective sensing element which may receive, via its associated micro-optic element, at least some light reflected from objects in the FOV illuminated by the respective light beam.
  • In another example, the plurality of portions may relate to light reflected in response to a line scan, in which an elongated wide beam projected by the LIDAR system may be divided (split, segmented) into a plurality of light portions to increase resolution of the scanned area.
  • micro-optic elements may be deployed on the optical path of the imaging system between a focusing unit of the LIDAR system and the sensor array.
  • Each of the micro-optic elements constructed of one or more light transferring materials may comprise a light entrance coincident with the focal plane (i.e., at the focal length) of the focusing unit for receiving the respective portion of reflected light from the focusing unit and a light exit, optically coupled to the light entrance, through which the reflected light portion may be transmitted to the respective associated sensing element.
  • Each of the micro-optic elements may further include a light blocking exterior disposed between the light entrance and the light exit of the respective micro-optic element.
  • the light blocking exterior may be configured to reduce and potentially prevent transmission of light to the respective associated sensing element.
  • the light blocking exterior of each micro-optic element may be configured to prevent transmission of light not received via the light entrance of the respective micro-optic element to the respective associated sensing element.
  • the light blocking exterior employing one or more light blocking structures, architectures, and/or compositions, for example, light absorbing, light reflecting, light deflecting, and/or a combination thereof, may prevent transmission of light originating from one or more sources, for example, stray light travelling inside the LIDAR system, crosstalk (between sensing elements, and/or between micro-optic elements), ambient light, and/or the like.
  • the light blocking exterior may thus ensure that all or at least most of the light transmitted (distributed) to the sensing element associated with each micro-optic element is only, or at least mostly, light that enters the light entrance of the respective micro-optic element.
  • Each of the sensing elements of the sensor array of the LIDAR system may typically consist of an array of light detecting elements (light detectors), for example, APDs, SPADs, and/or the like, each capable of triggering upon reception of photon(s) of reflected light, with the outputs of the array summed together to form an output of the respective sensing element.
  • each of the micro-optic elements may be further configured to disperse (i.e., spread, expand, distribute, etc.) the received portion of reflected light, which may be significantly focused, over the sensing area of the associated sensing element which may be significantly large (increased), which in turn enables more detecting elements (SPADs) in each sensing element.
  • the light exit of each micro-optic element may be configured to have a cross-section which is significantly larger, i.e., having a larger surface area, than the cross section (surface) of the light entrance of the respective micro-optic element.
  • the cross section of the light exit of each micro-optic element may have a surface area larger by a ratio of 25/1 to 4/1 than the surface area of the cross section of the light entrance.
  • the cross section of the portion of reflected light received at the light entrance of each micro-optic element from the focusing unit may be smaller by the certain ratio (e.g., 1/25 to 1/4) compared to the cross section of the reflected light dispersed through the light exit surface on the sensing surface of the sensing element.
  • the micro-optic elements may employ one or more structures, architectures, compositions, and/or a combination thereof to expand the portion of reflected light transmitted to its respective associated sensing element.
  • the micro-optic elements may comprise one or more optical elements such as, for example, a lens, a prism, and/or the like, a curved surface, and/or the like.
  • Deploying a micro-optic array comprising a plurality of micro-optic elements on the optical path of the LIDAR system to its sensing elements may present major benefits and advantages over currently existing LIDAR systems.
  • disposing the micro-optic array between the focusing unit and the sensor array may significantly increase separation and/or isolation between the sensing elements of the LIDAR system since each sensing element may receive only or at least mostly light transmitted via its associated micro-optic element while light originating from other sources including light directed to other sensing elements is significantly reduced, and potentially prevented.
  • Each of the sensing elements may be associated with a respective pixel in the image map (depth map), for example, a point cloud generated based on output of the plurality of sensors to map distances of objects in the surrounding environment of the LIDAR system based on timing of and energy of light reflected from the environment and captured by the sensors.
  • Increasing isolation between the sensing elements of the LIDAR system may therefore significantly increase pixel separation which may significantly increase accuracy of each individual pixel and sharpness of the image map generated by the LIDAR system.
  • increasing isolation between the sensing elements via the use of the micro-optic array configured to reduce and potentially entirely prevent leakage of light between sensing elements may allow reducing the distance between adjacent sensing elements, which may significantly reduce the size of the sensor array and thus consume reduced space in the LIDAR system, which may reduce size, complexity, and/or cost of the LIDAR system.
  • the structure of the micro-optic elements which are each associated with a respective sensing element of the LIDAR system may significantly increase the energy of light received at each sensing element from the light entrance of its associated micro-optic element while transmission of light originating from other sources toward the associated sensing element may be significantly reduced and potentially completely prevented.
  • Sensing performance of each sensing element may be therefore significantly improved since each sensing element may be projected only or at least mainly with the respective portion of reflected light directed to the respective sensing element while noise, crosstalk, parasitic, and/or stray light may be significantly blocked.
  • Increasing the light detection performance of the sensing elements may significantly increase the detection performance of the LIDAR system, for example, detection range, detection resolution, accuracy, reliability, confidence of detection, detection robustness and/or the like.
  • transmitting (distributing) each portion of reflected light through the micro-optic element over a larger sensing surface may allow using larger sensors, i.e., sensing elements having a larger sensing surface, for example, significantly increasing the number of light detectors which may significantly increase the dynamic range of each sensing element compared to transmitting the focused reflected light received from the focusing unit directly on a much smaller sensing element.
  • Using the micro-optic elements may allow an increase in the sensing surface of each sensing element by 1-2 orders of magnitude which translates to an increase of the dynamic range by 1-2 orders of magnitude.
  • Increasing the dynamic range of the sensing elements may, in turn, significantly increase the detection performance of the LIDAR system.
  • Increasing the dynamic range may be specifically effective and advantageous for LIDAR short range detection where a large amount of light may be reflected from one or more objects located in close proximity to the LIDAR system (e.g., up to 10, 20, 30, 40, 50, 60 meters, etc.).
  • Reference is now made to FIG. 1A and FIG. 1B, illustrating an exemplary LIDAR system 100, in accordance with embodiments of the present disclosure.
  • the LIDAR system 100 may be used, for example, in one or more ground autonomous or semi-autonomous vehicles 110, for example, road-vehicles such as, for example, cars, buses, vans, trucks and any other terrestrial vehicle.
  • Autonomous ground vehicles 110 equipped with the LIDAR system 100 may scan their environment and drive to a destination with reduced and potentially without human intervention.
  • the LIDAR system 100 may be used in one or more autonomous/semi-autonomous aerial-vehicles such as, for example, Unmanned Aerial Vehicles (UAV), drones, quadcopters, and/or any other airborne vehicle or device.
  • the LIDAR system 100 may be used in one or more autonomous or semi-autonomous water vessels such as, for example, boats, ships, hovercrafts, submarines, and/or the like. Autonomous aerial-vehicles and watercraft with LIDAR system 100 may scan their environment and navigate to a destination autonomously or under remote human operation.
  • the LIDAR system 100 or any of its components may be used together with any of the example embodiments and methods disclosed herein.
  • While the LIDAR system 100 may be described herein with respect to an exemplary vehicle-based LIDAR platform, the LIDAR system 100, any of its components, or any of the processes described herein may be applicable to one or more LIDAR systems of other platform types.
  • LIDAR systems such as the LIDAR system 100 may be installed, mounted, integrated, and/or otherwise deployed, in dynamic and/or stationary deployment for one or more other applications, for example, a surveillance system, a security system, a monitoring system, and/or the like.
  • Such LIDAR systems 100 may be configured to scan their environment in order to detect objects according to their respective application needs, criteria, requirements, and/or definitions.
  • the LIDAR system 100 may be configured to detect tangible objects in an environment of the LIDAR system 100, specifically in a scene contained in an FOV 120 of the LIDAR system 100 based on reflected light, and more specifically, based on light projected by the LIDAR system 100 and reflected by objects in the FOV 120.
  • the scene may include some or all objects within the FOV 120, in their relative positions and in their current states, for example, ground elements (e.g., earth, roads, grass, sidewalks, road surface marking, etc.), sky, man-made objects (e.g., vehicles, buildings, signs, etc.), vegetation, people, animals, light projecting elements (e.g., flashlights, sun, other LIDAR systems, etc.), and/or the like.
  • An object refers to a finite composition of matter that may reflect light from at least a portion thereof.
  • An object may be at least partially solid (e.g., car, tree, etc.), at least partially liquid (e.g., puddles on a road, rain, etc.), at least partly gaseous (e.g., fumes, clouds, etc.), made of a multitude of distinct particles (e.g., sandstorm, fog, spray, etc.), and/or a combination thereof.
  • An object may be of one or more scales of magnitude, such as, for example, ~1 millimeter (mm), ~5 mm, ~10 mm, ~50 mm, ~100 mm, ~500 mm, ~1 meter (m), ~5 m, ~10 m, ~50 m, ~100 m, and so on.
  • the LIDAR system 100 may be configured to detect objects by scanning the environment of the LIDAR system 100, i.e., illuminating at least part of the FOV 120 of the LIDAR system 100 and collecting and/or receiving light reflected from the illuminated part(s) of the FOV 120.
  • the LIDAR system 100 may scan the FOV 120 and/or part thereof in a plurality of scanning cycles (frames) conducted at one or more frequencies and/or frame rates, for example, 5 Frames per Second (fps), 10 fps, 15 fps, 20 fps, and/or the like.
  • the LIDAR system 100 may apply one or more scanning mechanisms, methods, and/or implementations for scanning the environment.
  • the LIDAR system 100 may scan the environment by moving and/or pivoting one or more deflectors of the LIDAR system 100 to deflect light emitted from the LIDAR system 100 in differing directions toward distinct parts of the FOV 120.
  • the LIDAR system 100 may scan the environment by changing positioning (i.e., location and/or orientation) of one or more sensors associated with the LIDAR system 100 with respect to the FOV 120.
  • the FOV 120 scanned by the LIDAR system 100 may include an extent of the observable environment of LIDAR system 100 in which objects may be detected.
  • the extent of the FOV 120 may be defined by a horizontal range (e.g., 50°, 120°, 360°, etc.), and a vertical elevation (e.g., ±20°, +40° to -20°, ±90°, 0° to -90°, etc.).
  • the FOV 120 may also be defined within a certain range, for example, up to a certain depth/distance (e.g., 100 m, 200 m, 300 m, etc.), and up to a certain vertical distance (e.g., 10 m, 25 m, 50 m, etc.).
  • the FOV 120 may be divided (segmented) into a plurality of portions 122 (segments), also designated FOV pixels, having uniform and/or different sizes.
  • the FOV 120 may be divided into a plurality of portions 122 arranged in the form of a two-dimensional array of rows and columns.
  • the LIDAR system 100 may scan an instantaneous FOV which comprises a respective portion 122.
  • the portion 122 scanned during each instantaneous FOV may be narrower than the entire FOV 120, and the LIDAR system 100 may thus move the instantaneous FOV within the FOV 120 in order to scan the entire FOV 120.
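The division of the FOV 120 into a rows-by-columns grid of portions 122, with the instantaneous FOV stepped across them, can be sketched as follows. The grid dimensions and angular extents are illustrative assumptions, not values from the disclosure.

```python
# Sketch: dividing a hypothetical 120° x 30° FOV into a rows-by-columns grid
# of portions and listing the center angles the instantaneous FOV would be
# steered to. All numbers below are assumed for illustration.
H_FOV, V_FOV = 120.0, 30.0   # horizontal range and vertical elevation, degrees
COLS, ROWS = 8, 4            # grid of FOV portions ("FOV pixels")

centers = [
    (-H_FOV / 2 + (c + 0.5) * H_FOV / COLS,   # azimuth of portion center
     -V_FOV / 2 + (r + 0.5) * V_FOV / ROWS)   # elevation of portion center
    for r in range(ROWS) for c in range(COLS)
]

print(len(centers))  # 32 portions to be scanned per full-FOV cycle
print(centers[0])    # (-52.5, -11.25)
```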
  • Detecting an object may broadly refer to determining an existence of the object in the FOV 120 of the LIDAR system 100 which reflects light emitted by the LIDAR system 100 toward one or more sensors, interchangeably designated detectors, associated with the LIDAR system 100.
  • detecting an object may refer to determining one or more physical parameters relating to the object and generating information indicative of the determined physical parameters, for example, a distance between the object and one or more other objects (e.g., the LIDAR system 100, another object in the FOV 120, ground (earth), etc.), a kinematic parameter of the object (e.g., relative velocity, absolute velocity, movement direction, expansion of the object, etc.), a reflectivity (level) of the object, and/or the like.
  • the LIDAR system 100 may detect objects by processing detection results based on sensory data received from the sensor(s) which may comprise temporal information indicative of a period of time between the emission of a light signal by the light source(s) of the LIDAR system 100 and the time of detection of reflected light by the sensor(s) associated with the LIDAR system 100.
  • the LIDAR system 100 may employ one or more detection technologies.
  • the LIDAR system 100 may employ Time of Flight (ToF) detection where the light signal emitted by the LIDAR system 100 may comprise one or more short pulses, whose rise and/or fall time may be detected upon reception of the emitted light after it is reflected by one or more objects in the FOV 120.
  • the LIDAR system 100 may employ continuous wave detection, for example, Frequency Modulated Continuous Wave (FMCW), phase-shift continuous wave, and/or the like.
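In ToF detection, the measured round-trip time maps to distance through the speed of light, with a factor of two because the light travels to the object and back. A minimal sketch, with an illustrative function name not taken from the disclosure:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance(round_trip_time_s: float) -> float:
    """Convert a measured round-trip time-of-flight to target distance.

    Assumes a direct (single-bounce) return; the factor of 2 accounts
    for the light traveling to the object and back."""
    return C * round_trip_time_s / 2.0
```

For example, a round trip of 2 microseconds corresponds to a target at roughly 300 m, i.e., near the upper end of the depth ranges mentioned above.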
  • the LIDAR system 100 may detect only part of one or more objects present in the FOV 120. For example, light may be reflected from only some sides of an object, for example, typically only the side opposing the LIDAR system 100 may be detected by the LIDAR system 100. In another example, light emitted by the LIDAR system 100 may be projected on only part of an object, for example, a laser beam projected onto a road or a building. In another example, an object may be partly blocked by another object between the LIDAR system 100 and the detected object. In another example, ambient light and/or one or more other interferences may interfere with detection of one or more portions of an object.
  • detecting an object by the LIDAR system 100 may further refer to identifying the object, for example, classifying a type of the object (e.g., car, person, tree, road, traffic light, etc.), recognizing a specific object (e.g., natural site, structure, monument, etc.), determining a text value of the object (e.g., license plate number, road sign markings, etc.), determining a composition of the object (e.g., solid, liquid, transparent, semitransparent, etc.), and/or the like.
  • the LIDAR system 100 may comprise a projecting unit 102, a scanning unit 104, a sensing unit 106, and a processing unit 108. According to some embodiments, the LIDAR system 100 may be mountable on a vehicle 110.
  • Optionally, the LIDAR system 100 may include one or more optical windows 124 for transmitting outgoing light projected toward the FOV 120 and/or for receiving incoming light reflected from objects in the FOV 120.
  • the optical window(s) 124, for example, an opening, a flat window, a lens, or any other type of optical window, may be used for one or more purposes, for example, collimating the projected light, focusing the reflected light, and/or the like.
  • the LIDAR system 100 may be contained in a single housing and/or divided among a plurality of housings connected to each other via one or more communication channels, for example, a wired channel, a fiber optics cable, and/or the like deployed between the first and second housings, a wireless connection (e.g., RF connection), and/or any combination thereof.
  • the light related components of the LIDAR system 100 i.e., the projecting unit 102, the scanning unit 104, and the sensing unit 106 may be deployed and/or contained in a first housing while the processing unit 108 may be deployed and/or contained in a second housing.
  • the processing unit 108 may communicate with the projecting unit 102, the scanning unit 104, and/or the sensing unit 106 via the communication channel(s) connecting the separate housings for controlling of the scanning unit 104 and/or for receiving from the sensing unit 106 sensory information indicative of light reflected from the scanned scene.
  • the LIDAR system 100 may employ one or more designs, architectures, and/or configurations for the optical path of outbound light (transmission path TX) projected by the projecting unit 102 toward the scene, i.e., to the FOV 120 of the LIDAR system 100, and of inbound light (reception path RX) reflected from objects in the scene and directed to the sensing unit 106.
  • the LIDAR system 100 may employ a bi-static configuration in which the outbound light, projected by the projecting unit 102 and exiting the LIDAR system 100, and the inbound light, reflected from the scene and entering the LIDAR system 100, pass through substantially different optical paths comprising optical components, for example, windows, apertures, lenses, mirrors, beam splitters, and/or the like.
  • the LIDAR system 100 may employ a monostatic configuration in which the outbound light and the inbound light share substantially the same optical path, i.e., the light 204 projected by the projecting unit 102 and exiting the LIDAR system 100 and the light 206 reflected from the scene and entering the LIDAR system 100 pass through substantially similar optical paths and share most if not all of the optical components on the shared optical path.
  • the projecting unit 102 may include one or more light sources 112 configured to emit light in one or more light forms, for example, a laser diode, a solid-state laser, a high-power laser, an edge emitting laser, a Vertical-Cavity Surface-Emitting Laser (VCSEL), an External Cavity Diode Laser (ECDL), A distributed Bragg reflector (DBR) laser, a laser array, and/or the like.
  • the light source(s) 112 may be configured and/or operated, for example, by the processing unit 108, to emit light according to one or more light emission patterns defined by one or more light emission parameters, for example, lighting mode (e.g., pulsed, Continuous Wave (CW), quasi-CW, etc.), light format (e.g., angular dispersion, polarization, etc.), spectral range (wavelength), energy/power (e.g., average power, maximum power, power intensity, instantaneous power, etc.), timing (e.g., pulse width (duration), pulse repetition rate, pulse sequence, pulse duty cycle, etc.), and/or the like.
  • the projecting unit 102 may further comprise one or more optical elements associated with one or more of the light source(s) 112, for example, a lens, an aperture, a window, a light filter, a waveplate, a beam splitter, and/or the like for adjusting the light emitted by the light source(s) 112, for example, collimating, focusing, polarizing, and/or the like the emitted light beams.
  • the projecting unit 102 may include a plurality of light sources 112 configured to emit a plurality of light beams, typically simultaneously, such that each of the light beams illuminates a respective portion, section, and/or segment of the instantaneous FOV scanned by the LIDAR system 100 at any given moment.
  • the scanning unit 104 may be configured to illuminate the FOV 120 and/or part thereof with projected light 204 by projecting the light emitted from the light source(s) 112 toward the scene thus serving as a steering element on the outbound path, i.e., the transmission path TX, of the LIDAR system 100 for directing the light emitted by the light source(s) 112 toward the scene.
  • the scanning unit 104 may be further used on the inbound path of the LIDAR system 100, i.e., the reception path RX, for directing the light (photons) 206 reflected from one or more objects in at least part of the FOV 120 toward the sensing unit 106.
  • the scanning unit 104 may therefore optionally include one or more optical elements, for example, a lens, a telephoto, a prism, and/or the like configured to direct the reflected light 206 toward the sensing unit 106.
  • the projecting unit 102 may be configured to emit a plurality of light beams, on the transmission path TX (outbound path) of the LIDAR system 100, the scanning unit 104 may be configured to project the plurality of light beams for illuminating the FOV 120 and/or part thereof.
  • the projecting unit 102 may direct the reflected light 206 from one or more objects in at least part of the FOV 120 toward the sensing unit 106.
  • the scanning unit 104 may include one or more light deflectors 114 configured to deflect the light from the light source(s) 112 for scanning the FOV 120.
  • the light deflector(s) 114 may include one or more scanning mechanisms, modules, devices, and/or elements configured to cause the emitted light to deviate from its original path, for example, a mirror, a prism, a controllable lens, a mechanical mirror, a mechanical scanning polygon, an active diffraction element (e.g., controllable LCD), Risley prisms, a non-mechanical electro-optical beam steering element (such as made, for example, by Vescent), a polarization grating (such as offered, for example, by Boulder Non-Linear Systems), an Optical Phase Array (OPA), and/or the like.
  • the deflector(s) 114 may comprise one or more scanning polygons, interchangeably designated polygon scanners, having a plurality of facets, for example, three, four, five, six, and/or the like, configured as mirrors and/or prisms to deflect light projected onto the facet(s) of the polygon.
  • the deflector(s) 114 may comprise one or more Micro Electro-Mechanical Systems (MEMS) mirrors configured to move by actuation of a plurality of benders connected to the mirror.
  • the scanning unit 104 may include one or more non-mechanical deflectors 114, for example, a non-mechanical electro-optical beam steering element such as, for example, an OPA, which does not require any moving components or internal movements for changing the deflection angles of the light but is rather controlled by steering, through phase array means, a light projection angle of the light source(s) 112 to a desired projection angle.
  • the deflector(s) 114 may be positioned in a respective instantaneous position defining a respective location, position and/or orientation in space.
  • each instantaneous position of the deflector(s) 114 may correspond to a respective portion 122 of the FOV 120.
  • the deflector(s) 114 may scan a respective one of the plurality of portions 122 of the FOV 120, i.e., project light 204 toward the respective portion 122 and/or direct light (photons) reflected from the respective portion 122 toward the sensing unit 106.
  • the scanning unit 104 may be configured and/or operated to scan the FOV 120 and/or part thereof, on the outbound path and/or on the inbound path, at one or more scales of scanning.
  • the scanning unit 104 may be configured to scan the entire FOV 120.
  • the scanning unit 104 may be configured to scan one or more ROIs which cover, for example, 10% or 25% of the FOV 120.
  • the scanning unit 104 may dynamically adjust the scanning scale, i.e., the scanned area, either between different scanning cycles and/or during the same scanning cycle.
  • the scanning unit 104 may further comprise one or more optical elements associated with the deflector(s) 114, for example, a lens, an aperture, a window, a light filter, a waveplate, a beam splitter, and/or the like for adjusting the light emitted by the light source(s) 112 and/or for adjusting the light reflected from the scene, for example, collimate the projected light 204, focus the reflected light 206, and/or the like.
  • the sensing unit 106 may include one or more sensors 116 configured to receive and sample light reflected from the surroundings of LIDAR system 100, specifically from the scene, i.e., the FOV 120, and generate reflection signals, interchangeably designated trace signals or trace data, indicative of light captured by the sensor(s) 116 which may include light reflected from one or more objects in the FOV 120.
  • the sensor(s) 116 may include one or more devices, elements, and/or systems capable of measuring properties of electromagnetic waves, specifically light, for example, energy/power, intensity, frequency, phase, timing, duration, and/or the like and generate output signals indicative of the measured properties.
  • the sensor(s) 116 may be configured and/or operated to sample incoming light according to one or more operation modes, for example, continuous sampling, periodic sampling, sampling according to one or more timing schemes, and/or sampling instructions.
  • the sensing unit 106 may include a sensor array comprising a plurality of sensors 116 wherein each of the sensors 116 corresponds to a receptive pixel of a plurality of pixels mapping a portion of the FOV 120 scanned at any given moment.
  • each of the plurality of sensing elements 116 of the sensor array may be associated with a respective one of the plurality of light beams, i.e., each sensor 116 may be configured to receive light reflected from one or more objects in the FOV 120 illuminated by the respective light beam.
  • when the projecting unit 102 is configured to project a single elongated light beam (scan line), the light reflected by one or more objects in the FOV 120 responsive to being illuminated by the elongated light beam may be divided into a plurality of portions, each directed (transmitted) to a respective one of the plurality of sensors 116.
  • each FOV pixel which corresponds to a respective portion 122, i.e., an instantaneous FOV scanned at a certain point in time, may be mapped by the plurality of sensing pixels captured during the instantaneous point in time.
  • Each sensor 116 may include one or more light sensors of one or more types having differing parameters, for example, sensitivity, size, recovery time, and/or the like.
  • the sensor(s) 116 may include a plurality of light sensors of a single type, or of multiple types selected according to their characteristics to comply with one or more detection requirements of the LIDAR system 100, for example, reliable and/or accurate detection over a span of ranges (e.g., maximum range, close range, etc.), dynamic range, temporal response, robustness against varying environmental conditions (e.g., temperature, rain, illumination, etc.), and/or the like.
  • the sensor(s) 116 may include one or more light detectors constructed from a plurality of detecting elements 220, for example, an Avalanche Photodiode (APD), Single Photon Avalanche Diode (SPAD), and/or the like serving as detection elements 220 on a common silicon substrate configured for detecting photons reflected back from the FOV 120.
  • the detecting elements 220 of each sensor 116 may be typically arranged as an array in one or more arrangements over a detection area of the sensor 116, for example, a rectangular arrangement, for example, as shown in FIG.
  • the detecting elements 220 may be arranged in a plurality of regions which jointly cover the detection area of the sensor 116.
  • Each of the plurality of regions may comprise a plurality of detecting elements 220, for example, SPADs having their outputs connected together to form a common output signal of the respective region.
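Connecting the outputs of the detecting elements of a region into a common output signal amounts, at its simplest, to summing the per-element photon counts per region. The sketch below is a hypothetical illustration under that assumption; the function name and flat-list representation are not taken from the disclosure.

```python
def region_outputs(element_counts, region_map):
    """Combine per-element photon counts into one common output per region.

    element_counts: photon count per detecting element (e.g., per SPAD).
    region_map:     region index assigned to each detecting element.
    Returns a dict mapping region index -> summed count for that region."""
    totals = {}
    for count, region in zip(element_counts, region_map):
        totals[region] = totals.get(region, 0) + count
    return totals
```

In hardware this summation is done by wiring the element outputs together per region; the sketch only models the resulting per-region signal.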
  • Each of the light detection elements 220 is configured to cause an electric current to flow when light (photons) passes through an outer surface of the respective detection element 220.
  • the processing unit 108 may include one or more processors 118, homogenous or heterogeneous, comprising one or more processing nodes and/or cores optionally arranged for parallel processing, as clusters and/or as one or more multi core processor(s).
  • the processor(s) 118 may execute one or more software modules such as, for example, a process, a script, an application, a (device) driver, an agent, a utility, a tool, an Operating System (OS), a plug-in, an add-on, and/or the like each comprising a plurality of program instructions stored in a non-transitory medium (program store) of the LIDAR system 100 and executed by one or more processors such as the processor(s) 118.
  • the non-transitory medium may include, for example, persistent memory (e.g., ROM, Flash, SSD, NVRAM, etc.), volatile memory (e.g., RAM, cache, etc.), and/or the like.
  • the processor(s) 118 may optionally integrate, utilize and/or facilitate one or more hardware elements (modules), for example, a circuit, a component, an Integrated Circuit (IC), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), a Graphics Processing Unit (GPU), an Artificial Intelligence (AI) accelerator, and/or the like.
  • the processor(s) 118 may therefore execute one or more functional modules implemented using one or more software modules, one or more of the hardware modules and/or combination thereof.
  • the processor(s) 118 may therefore execute one or more functional modules to control functionality of the LIDAR system 100, for example, configuration, operation, coordination, and/or the like of one or more of the functional elements of the LIDAR system 100, for example, the projecting unit 102, the scanning unit 104, and/or the sensing unit 106. While the functional module(s) are executed by the processor(s) 118, for brevity and clarity, the processing unit 108 comprising the processor(s) 118 is described hereinafter to control functionality of the LIDAR system 100.
  • the processing unit 108 may communicate with the functional elements of the LIDAR system 100 via one or more channels, interconnects, and/or networks deployed in the LIDAR system 100, for example, a bus (e.g., PCIe, etc.), a switch fabric, a network, a vehicle network, and/or the like.
  • the processing unit 108 may control the scanning unit 104 to scan the environment of the LIDAR system 100 according to one or more scanning schemes and/or scanning parameters, for example, extent (e.g., angular extent) of the FOV 120, extent (e.g., angular extent) of one or more regions of interest (ROI) within the FOV 120, maximal range within the FOV 120, maximal range within each ROI, maximal range within each region of non-interest, resolution (e.g., vertical angular resolution, horizontal angular resolution, etc.) within the FOV 120, resolution within each ROI, resolution within each region of non-interest, scanning mode (e.g., raster, alternating pixels, etc.), scanning speed, scanning cycle timing (e.g., cycle time, frame rate), and/or the like.
  • the processor(s) 118 may be configured to coordinate operation of the light source(s) 112 with movement of the deflector(s) 114 for scanning the FOV 120 and/or part thereof. In another example, the processor(s) 118 may be configured to configure and/or operate the light source(s) 112 to project light according to one or more light emission patterns. In another example, the processor(s) 118 may be configured to coordinate operation of the sensor(s) 116 with movement of the deflector(s) 114 to activate one or more selected sensor(s) 116 and/or pixels according to the scanned portion of the FOV 120.
  • the processor(s) 118 may be configured to receive the reflection signals generated by the sensor(s) 116, which are indicative of the light captured by the sensor(s) 116, including light reflected from the scene, specifically from one or more objects in the scanned FOV 120 and/or part thereof.
  • the processor(s) 118 may be configured to analyze the trace signals (reflection signals) received from the sensor(s) 116 in order to detect one or more objects, conditions, and/or the like in the environment of the LIDAR system 100, specifically in the scanned FOV 120 and/or part thereof.
  • Analyzing the trace data indicative of the reflected light 206 may include, for example, determining a ToF of the reflected light 206 based on the timing of the reflection signals, specifically with respect to the transmission timing of the projected light 204, for example, light pulses, corresponding to the respective reflected light 206.
  • analyzing the trace data may include determining a power of the reflected light, for example, average power across an entire return pulse, and a photon distribution/signal may be determined over the return pulse period (“pulse shape”).
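The ToF and pulse-power analysis described above can be sketched for a sampled return trace. This is a simplified illustration assuming a uniformly sampled power trace and a plain threshold-crossing pulse detector; the function name and the analysis details are assumptions, not the disclosed method.

```python
def analyze_trace(samples, dt, threshold):
    """Estimate ToF and average return power from a sampled reflection trace.

    samples:   light-power samples of the return signal (uniformly spaced).
    dt:        sampling interval in seconds.
    threshold: power level marking the pulse rise.
    Returns (tof_seconds, avg_power_over_pulse), or (None, 0.0) if no
    sample crosses the threshold."""
    above = [i for i, s in enumerate(samples) if s >= threshold]
    if not above:
        return None, 0.0
    start, end = above[0], above[-1]       # rise and fall sample indices
    pulse = samples[start:end + 1]         # the return pulse ("pulse shape")
    return start * dt, sum(pulse) / len(pulse)
```

A fuller implementation would also record the per-sample photon distribution over the pulse period rather than just its average.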
  • FIG. 2 illustrates graph charts of exemplary light emission patterns projected by a LIDAR system such as the LIDAR system 100, in accordance with embodiments of the present disclosure.
  • Graph charts 202, 204, 206, and 208 depict several light emission patterns which may be emitted by one or more light sources such as the light source 112 of a projecting unit such as the projecting unit 102 of the LIDAR system 100.
  • the light source(s) 112 may emit light according to the light patterns under control of a processing unit such as the processing unit 108 of the LIDAR system 100.
  • the processing unit 108 may control the light source(s) 112, for example, a pulsed-light light source, to project toward the portion 122 one or more initial pulses according to an initial light emission pattern, also designated pilot pulses.
  • the processing unit 108 may analyze pilot information received from one or more sensors, such as the sensor 116, which is indicative of light reflections associated with the pilot pulses and, based on the analysis, may determine one or more light emission patterns according to which the light source(s) 112 may transmit subsequent light pulses during the frame time of the present frame and/or during one or more subsequent frames.
  • the processing unit 108 may control the light source(s) 112 to project toward the portion 122 light pulses according to a light emission pattern defining a plurality of pulses having gradually increasing intensities.
  • the processing unit 108 may control the light source(s) 112 to project toward the portion 122 light pulses according to different light emission patterns in different frames, i.e., in different scanning cycles, for example, a different number of pulses, pulses having different pulse duration, pulses having different intensity, and/or the like.
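A light emission pattern with gradually increasing pulse intensities, as described above, can be sketched as a simple parameterized pulse train. The representation (a list of dicts) and the function name are illustrative assumptions only.

```python
def increasing_pulse_pattern(n_pulses, base_intensity, step,
                             pulse_width_s, period_s):
    """Build an emission pattern of n_pulses with gradually increasing
    intensities: pulse k starts at k * period_s with intensity
    base_intensity + k * step and duration pulse_width_s."""
    return [{"t_start": k * period_s,
             "width": pulse_width_s,
             "intensity": base_intensity + k * step}
            for k in range(n_pulses)]
```

Varying `n_pulses`, `step`, or `pulse_width_s` between frames corresponds to the different per-frame emission patterns mentioned above.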
  • the processing unit 108 may control the light source(s) 112, for example, a continuous-wave light source (e.g., FMCW), to project toward the portion 122 light according to one or more light emission patterns.
  • Such an exemplary light emission pattern may include, for example, projecting continuous light during the entire frame time.
  • the light emission pattern may define one or more discontinuities, i.e., time periods during which the light source(s) 112 do not emit light.
  • the light emission pattern may define emission of a continuous light having a constant intensity, or alternatively emission of a continuous light having varying intensity over time.
  • the processing unit 108 may be configured to analyze the trace data, i.e., the reflection signals received from the sensor(s) 116 which are indicative of light reflected from the scene, including at least part of the light emitted by the LIDAR system 100. Based on the analysis of the trace data, the processing unit 108 may extract depth data relating to the scene, i.e., the FOV 120 and/or part thereof, and may derive and/or determine one or more attributes of one or more objects detected in the scene based on the light reflected from these objects.
  • Such object attributes may include, for example, a distance between the LIDAR system 100 and the respective object, a reflectivity of the respective object, a spatial location of the respective object, for example, with respect to one or more coordinate systems (e.g., Cartesian (X, Y, Z), Polar (r, θ, φ)), and/or the like.
  • the processing unit 108 may therefore map the reflecting objects in the environment of the LIDAR system 100.
  • the processing unit 108 may combine, join, merge, fuse, and/or otherwise aggregate information, for example, depth data pertaining to different objects, and/or different features of objects detected in the scene.
  • the processing unit 108 may be configured to generate and/or reconstruct one or more 3D models, interchangeably designated depth maps herein, of the environment of the LIDAR system 100, i.e., of objects scanned in the scene included in the FOV 120 and/or part thereof.
  • the data resolution associated with the depth map representation(s) of the FOV 120 which may depend on the operational parameters of the LIDAR system 100 may be defined by horizontal and/or vertical resolution, for example, 0.1° x 0.1°, 0.3° x 0.3°, 0.1° x 0.5° of the FOV 120, and/or the like.
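The relation between angular resolution and depth-map size follows directly from the FOV extents. A small sketch under the assumption of uniform angular resolution across the FOV; the function name is illustrative only.

```python
def depth_map_size(h_fov_deg, v_fov_deg, h_res_deg, v_res_deg):
    """Number of depth-map pixels (rows, cols) for a FOV scanned at a
    given horizontal x vertical angular resolution."""
    cols = round(h_fov_deg / h_res_deg)
    rows = round(v_fov_deg / v_res_deg)
    return rows, cols
```

For example, a hypothetical 120° x 25° FOV at 0.1° x 0.1° resolution yields a 250 x 1200 depth map, i.e., 300,000 points per frame.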
  • the processing unit 108 may generate depth map(s) of one or more forms, formats and/or types, for example, a point cloud model, a polygon mesh, a depth image holding depth information for each pixel of a 2D image and/or array, and/or any other type of 3D model of the scene.
  • a point cloud model (also known as a point cloud) may include a set of spatially located data points which represent the scanned scene in some coordinate system, i.e., having identifiable locations in a space described by a coordinate system, for example, Cartesian, Polar, and/or the like. Each point in the point cloud may be dimensionless, or a miniature cellular space, whose location may be described by the point cloud model using the set of coordinates.
  • the point cloud may further include additional information for one or more and optionally all of its points, for example, reflectivity (e.g., energy of reflected light, etc.), color information, angle information, and/or the like.
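A point cloud point carrying per-point data such as reflectivity, and its construction from a polar (range, azimuth, elevation) measurement, can be sketched as follows. The type name, field layout, and angle convention (elevation measured from the horizontal plane) are assumptions for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class CloudPoint:
    """One point of a point cloud: Cartesian location plus optional
    per-point data such as reflectivity."""
    x: float
    y: float
    z: float
    reflectivity: float = 0.0

def from_polar(r, azimuth_rad, elevation_rad, reflectivity=0.0):
    """Convert a (range, azimuth, elevation) measurement to a Cartesian
    cloud point, with elevation measured from the horizontal plane."""
    horiz = r * math.cos(elevation_rad)  # projection onto the XY plane
    return CloudPoint(horiz * math.cos(azimuth_rad),
                      horiz * math.sin(azimuth_rad),
                      r * math.sin(elevation_rad),
                      reflectivity)
```

Additional per-point fields (color, angle information, timestamp) would extend the dataclass in the same way as `reflectivity`.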
  • a polygon mesh or triangle mesh may include, among other data, a set of vertices, edges and faces that define the shape of one or more 3D objects (polyhedral object) detected in the scanned scene.
  • the processing unit 108 may further generate a sequence of depth maps over time, i.e., a temporal sequence of depth maps, for example, each depth map in the sequence may be associated with a respective scanning cycle (frame). In another example, the processing unit 108 may update one or more depth maps over time based on depth data received and analyzed in each frame.
  • the processing unit 108 may control the light projection scheme of the light emitted to the environment of the LIDAR system 100, for example, adapt, and/or adjust the light emission pattern and/or the scanning pattern, to improve mapping of the environment of the LIDAR system 100.
  • the processing unit 108 may control the light projection scheme such to illuminate differently different portions 122 across the FOV 120 in order to differentiate between reflected light relating to different portions 122.
  • the processing unit 108 may apply a first light projection scheme for one or more first areas in the FOV 120, for example, an ROI and a second light projection scheme for one or more other parts of the FOV 120.
  • the processing unit 108 may adjust the light projection scheme between scanning cycles (frames) such that a different light projection scheme may be applied in different frames
  • the processing unit 108 may adjust the light projection scheme based on detection of reflected light, either during the same scanning cycle (e.g., the initial emission) and/or between different frames (e.g., successive frames), thus making the LIDAR system 100 extremely dynamic.
  • the LIDAR system 100 may include a communication interface 214 comprising one or more wired and/or wireless communication channels and/or network links, for example, PCIe, Local Area Network (LAN), Gigabit Multimedia Serial Link (GMSL), vehicle network, InfiniBand, wireless LAN (WLAN), cellular network, and/or the like.
  • the LIDAR system 100, specifically the processing unit 108, may transfer data and/or communicate with one or more external systems, for example, a host system 210, interchangeably designated host herein.
  • the host 210 may include any computing environment comprising one or more processors 218, such as the processor 118, which may interface with the LIDAR system 100.
  • the host 210 may include one or more systems deployed and/or located in the vehicle 110 such as, for example, an ADAS, a vehicle control system, a vehicle safety system, a client device (e.g., laptop, smartphone, etc.), and/or the like.
  • the host 210 may include one or more remote systems, for example, a security system, a surveillance system, a traffic control system, an urban modelling system, and/or other systems configured to monitor their surroundings.
  • the host 210 may include one or more remote cloud systems, services, and/or platforms configured to collect data from vehicles 110 for one or more monitoring, analysis, and/or control applications.
  • the host 210 may include one or more external systems, for example, a testing system, a monitoring system, a calibration system, and/or the like.
  • the host 210 may be configured to interact and communicate with the LIDAR system 100 for one or more purposes, and/or actions, for example, configure the LIDAR system 100, control the LIDAR system 100, analyze data received from the LIDAR system 100, and/or the like.
  • the host 210 may generate one or more depth maps and/or 3D models based on trace data, and/or depth data received from the LIDAR system 100.
  • the host 210 may configure one or more operation modes, and/or parameters of the LIDAR system 100, for example, define an ROI, define an illumination pattern, define a scanning pattern, and/or the like.
  • the host 210 may dynamically adjust in real-time one or more operation modes and/or parameters of the LIDAR system 100.
  • the LIDAR system 100 may include a micro-optic array comprising a plurality of micro-optic elements deployed in the optical path of the LIDAR system 100 and configured for transmitting the reflected light 206 to a sensor array of the LIDAR system 100 comprising a plurality of the sensing elements 116.
  • each of the plurality of micro-optic elements may be associated with a respective one of the plurality of sensing elements 116 and configured for transmitting a respective portion of the reflected light 206 to the respective associated sensing element 116.
  • FIG. 3 is a schematic illustration of an exemplary micro-optic array of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure.
  • An exemplary imaging system 300 of a LIDAR system such as the LIDAR system 100 may utilize one or more detection channels of the LIDAR system 100, for example, a primary object detector, a short range detector, and/or the like.
  • Short range detection is typically characterized by high light levels (amounts of light) reflected from close-by objects (e.g., at distances of up to 10, 20, 30, 40, 50, 60 meters, etc.) illuminated by the LIDAR system 100, which may saturate the sensors 116 of the LIDAR system 100.
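To illustrate why close-by objects may saturate the sensors, the radiometric inverse-square relation can be sketched as follows. This is a simplified, illustrative model; the 200 meter reference range is an assumption for illustration and not a figure from this disclosure:

```python
def relative_return_power(range_m: float, ref_range_m: float = 200.0) -> float:
    """Relative received signal from the same target at two ranges.

    For a given target and fixed optics, the received power scales as 1/R^2,
    so a target at 10 m returns ~400x the signal of the same target at 200 m,
    which can drive the sensing elements into saturation.
    """
    return (ref_range_m / range_m) ** 2

# Example: a close-by object at 10 m vs. the 200 m reference range
print(relative_return_power(10.0))  # -> 400.0
```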
  • the imaging system 300 may comprise a focusing unit 302 configured to receive light such as the reflected light 206 reflected from a scene, specifically from at least part of an FOV such as the FOV 120 of the LIDAR system 100 by one or more objects illuminated with light such as the light 204 projected by one or more light sources such as the light source 112 of the LIDAR system 100.
  • the focusing unit 302 may be configured to direct the reflected light 206 toward a sensor array 316 comprising a plurality of sensing elements such as the sensing element 116.
  • the focusing unit 302 may include one or more optical components, for example, a lens, a telephoto lens, a prism, and/or the like configured to focus the reflected light 206 on a focal plane 304 of the focusing unit 302.
  • the focusing unit 302 may be part of a scanning unit such as the scanning unit 104 of the LIDAR system 100 configured to scan the FOV 120 and/or part thereof with light such as the projected light 204 projected by the light source(s) 112 on the outbound path and, on the inbound path, direct the reflected light 206 toward the sensor array 316.
  • the focusing unit 302 may include the scanning unit 104 and/or part thereof.
  • the imaging system 300 may further include a micro-optic array 320 comprising a plurality of micro-optic elements 310 configured to transmit the reflected light 206 to the sensor array 316.
  • Each of the micro-optic elements 310 may be associated with a respective sensing element 116 of the sensor array 316.
  • the reflected light 206 may comprise a plurality of portions and each micro-optic element 310 may be configured to transmit a respective portion of the reflected light 206 to the respective associated sensing element 116.
  • Each of the sensing elements 116 may therefore correspond to a respective pixel of a plurality of pixels mapping a portion of the FOV 120 scanned at any given moment.
  • a projecting unit of the LIDAR system 100 such as the projecting unit 102 may comprise one or more light sources 112 configured to project (emit) a plurality of light beams, meaning that the projected light 204 comprises a plurality of light beams.
  • each of the plurality of portions of the reflected light 206 may correspond to a respective one of the light beams meaning that the respective portion comprises light reflected by one or more objects in the scanned FOV 120 in response to being illuminated by the respective light beam.
  • each of the projected light beams may be associated with a respective sensing element 116 which may receive, via its associated micro-optic element 310, at least some light reflected from one or more objects in the FOV illuminated by the respective light beam.
  • each of the sensing elements 116 of the sensor array 316 may constitute a respective pixel in the image map (range or depth map) created based on the reflection data (trace data) generated by each of the sensing element 116 during one or more scanning cycles of the FOV 120 by the LIDAR system 100 which is indicative of the light captured by the respective sensing element 116 via its associated micro-optic element 310.
  • the projecting unit 102 may comprise one or more light sources 112 configured to project (emit) a single wide and/or elongated light beam (scan line).
  • the reflected light 206 reflected by one or more objects in the FOV 120 responsive to being illuminated by the elongated light beam may be divided (segmented) into the plurality of portions, for example, by the focusing unit 302 and/or the scanning unit 104.
  • Each of the micro-optic elements 310 may include a front end, i.e., a light entrance through which the respective micro-optic element 310 may receive light, for example, the respective portion of the reflected light 206 received from the focusing unit 302, and a rear end, i.e., a light exit through which the respective portion of the reflected light 206 is transmitted to the respective associated sensing element 116.
  • the plurality of micro-optic elements 310 may be disposed on the optical path between the focusing unit 302 and the sensor array 316 such that each of the micro-optic elements 310 may be optically coupled, i.e., in optical communication, to its respective associated sensing element 116.
  • the micro-optic elements 310 may be disposed to have their light entrance (surface) coinciding with the focal plane 304 of the focusing unit 302 and their light exit oriented, positioned, and/or shaped to transmit the reflected light 206 toward a detector plane 306 coincident with a sensing surface of the sensing elements 116.
  • the light entrance may comprise an input surface coincident with the focal plane 304 through which the light received from the focusing unit 302 may enter the respective micro-optic element 310.
  • the light input surface may be flat, curved, jagged, bent, and/or a combination thereof.
  • the light exit of each micro-optic element 310 may comprise an exit surface coincident with the sensing surface of the associated sensing element 116 through which light from the light entrance may be transmitted to the sensing element 116.
  • each micro-optic element 310 may include a void such that light is transferred from the light entrance of the respective micro-optic element 310 through the void to the light exit of the respective micro-optic element 310.
  • each of the micro-optic elements 310 may be transparent to the reflected light 206 such that the reflected light 206 may enter and exit the micro-optic elements 310.
  • the reflected light 206 is in a certain wavelength range (spectral range), for example, between 650 nanometer (nm) and 1150 nm
  • the micro-optic elements 310 may be configured, and/or adapted to have their light entrances and light exit surfaces configured to be transparent to light in the certain wavelength range.
  • Each of the plurality of micro-optic elements 310 may include a light blocking exterior disposed between the light entrance and the light exit of the respective micro-optic element 310.
  • the light blocking exterior may be configured to prevent transmission of the reflected light 206.
  • the light blocking exterior of each micro-optic element 310 may be configured to prevent transmission of external light, i.e., light incident on the micro-optic element 310 from outside, toward the respective sensing element 116 associated with the respective micro-optic element 310.
  • the light blocking exterior may block, i.e., prevent transmission of, light originating from one or more sources, for example, stray light travelling inside the LIDAR system 100, crosstalk, i.e., light propagating between micro-optic elements 310, for example, adjacent micro-optic elements 310, noise, and/or the like.
  • the light blocking exterior may employ one or more structures, architectures, compositions, and/or implementations for preventing transmission of external light toward the sensing elements 116.
  • the light blocking exterior may be configured to absorb incident light, i.e., light incident (impinging) on each surface of the light blocking exterior.
  • an exemplary absorptive light blocking exterior may be constructed of one or more materials configured to absorb light in one or more wavelength and/or wavelength ranges (spectral ranges), specifically wavelengths of the light used by the LIDAR system 100, for example, a wavelength range between 650 nm and 1150 nm, or specifically about 905 nm.
  • the light blocking exterior of the micro-optic elements 310 may be configured to reflect and/or deflect the incident light away from the sensing elements 116.
  • an exemplary reflective light blocking exterior may be shaped to have a geometric structure configured to reflect the incident light, optionally toward one or more light traps configured to trap the light reflected from the light blocking exterior.
  • each micro-optic element 310 may be further configured to increase reflection of internal light within the respective micro-optic element 310 in order to increase energy of light transmitted toward the respective associated sensing element 116.
  • the light blocking exterior may be geometrically shaped to have internal reflective surfaces adapted to reflect light incident on the internal reflective surfaces in order to direct the light toward the associated sensing element 116.
  • each micro-optic element 310 may be shaped and/or constructed such that its light blocking exterior may have an angle in a certain range, for example, 60-90 degrees with the light exit of the respective micro-optic element 310 such that light transmitted from the light entrance may hit the light blocking exterior at a large angle with respect to the normal to the surface of the light blocking exterior and be reflected toward the light exit.
  • the light blocking exterior may be constructed of one or more materials having a refractive index selected and/or adapted to prevent refraction and transmission of the internal light outside of the micro-optic element 310 and reflect it back toward the associated sensing element 116.
  • each micro-optic element 310 may be shaped to have the angle between its light blocking exterior and its light exit in the certain range and be constructed of one or more materials having a selected refractive index which combined with the base angle may increase reflection of internal light beams toward the light exit.
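The interplay between the base angle and the refractive index described above can be sketched with Snell's law. This is an illustrative model only; the refractive index of 1.5 is an assumed glass-like value, not a figure from this disclosure:

```python
import math

def critical_angle_deg(n_core: float, n_outside: float = 1.0) -> float:
    """Critical angle from Snell's law: internal rays hitting the light blocking
    exterior at an incidence angle (measured from the wall normal) above this
    value undergo total internal reflection instead of refracting out."""
    return math.degrees(math.asin(n_outside / n_core))

def is_totally_reflected(incidence_deg: float, n_core: float) -> bool:
    """True if an internal ray is reflected back toward the light exit."""
    return incidence_deg > critical_angle_deg(n_core)

# Example: a glass-like element (n ~ 1.5) in air
print(round(critical_angle_deg(1.5), 1))  # -> 41.8
```

With a base angle in the 60-90 degree range, internal rays tend to strike the exterior at large angles from the wall normal, i.e., above this critical angle, and are redirected toward the light exit.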
  • An exemplary imaging system such as the imaging system 300 configured to facilitate and/or support one or more of the detection channels of the LIDAR system 100, for example, a short range detection channel, may be characterized by some exemplary properties.
  • the focal length of the focusing unit 302 may be in a range between 10-30 mm, for example 15 mm, 20 mm, and/or the like.
  • An FOV of the focusing unit 302 may be, for example, about 7 degrees.
  • An FOV per pixel of the imaging system 300 may be, for example, in a range between 0.4-0.7 degrees.
  • A size of the light entrance surface of each micro-optic element 310 may be in a range of 0.03-0.3 mm, for example 0.11 mm, and/or the like, and the area of the light exit surface of each micro-optic element 310 may be in a range between 0.5-1.5 mm, for example 0.7 mm, 1.1 mm, and/or the like.
  • This structure of each micro-optic element 310 may significantly reduce, and potentially completely prevent, transmission of light toward the respective associated sensing element 116, other than the portion of reflected light 206 entering the respective micro-optic element 310 from its light entrance.
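Under the exemplary properties above, the FOV per pixel follows from the focal length and the entrance pitch. The sketch below assumes the quoted 0.11 mm figure is the linear light-entrance pitch and that roughly 16 such pixels tile the detector; both assumptions are illustrative and not figures from this disclosure:

```python
import math

def fov_per_pixel_deg(entrance_pitch_mm: float, focal_length_mm: float) -> float:
    """Angular extent of the scene imaged onto one micro-optic entrance,
    using the pinhole relation FOV = 2 * atan(pitch / (2 * focal_length))."""
    return math.degrees(2.0 * math.atan(entrance_pitch_mm / (2.0 * focal_length_mm)))

per_pixel = fov_per_pixel_deg(0.11, 15.0)  # ~0.42 degrees, within the 0.4-0.7 range
total_fov = 16 * per_pixel                 # ~6.7 degrees, i.e., "about 7 degrees"
```

This consistency check ties together the quoted focal length (15 mm example), the FOV per pixel (0.4-0.7 degrees), and the focusing unit FOV (about 7 degrees).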
  • Reducing and moreover preventing transmission of undesired light, i.e., light other than the portion of reflected light 206 entering each micro-optic element 310 through its respective light entrance, toward the array of sensing elements 116, may significantly increase the light sensing performance of each sensing element 116 as it may capture only or at least mainly the respective portion of reflected light 206 directed to the respective sensing element 116 via the respective associated micro-optic element 310 while noise, crosstalk, and/or stray light are significantly blocked.
  • Increasing the light detection performance of the sensing elements 116 may significantly increase the detection performance of the LIDAR system 100, for example, resolution, accuracy, reliability and/or robustness.
  • each sensing element 116 may comprise an array of light detecting elements such as the light detecting elements 220, for example, APDs, SPADs, and/or the like. While each of the light detecting elements 220 may trigger upon reception (detection) of one or more photons, the output of the array of light detecting elements 220 of each sensing element 116 may be summed together to form the output of the respective sensing element.
  • Each of the micro-optic elements 310 may be configured to map a cross section area of the surface of its light entrance to a sensing surface of the array of light detecting elements 220 of the associated sensing element 116 which is typically larger than the light entrance cross section. This means that the portion of reflected light 206, which may be significantly focused (on the focal plane 304), may enter each micro-optic element 310 at a significantly small light entrance surface and be dispersed through the light exit of the respective micro-optic element 310 on a larger surface matching the sensing surface of the array of light detecting elements 220 of the respective associated sensing element 116.
  • Dispersing each portion of reflected light 206, through the micro-optic element 310, over the large sensing surface of each of the sensing elements 116, specifically over the large sensing surface of the array of light detecting elements 220, may significantly increase the dynamic range of each sensing element 116 compared to transmitting the focused reflected light 206 received from the focusing unit 302.
  • the micro-optic element 310 may be shaped and/or configured to uniformly distribute the incoming portion of reflected light 206 on the sensing surface of the associated sensing element 116.
  • Increasing the sensing surface utilized by each sensing element 116 may directly translate to an increase in the dynamic range. Therefore, configuring the micro-optic elements 310 to have a light entrance to light exit area ratio of 1/25 to 1/4 may increase the dynamic range by a factor of 4 up to 25, i.e., by 1-2 orders of magnitude.
  • Increasing the dynamic range of the sensing elements 116 may significantly increase the detection performance of the LIDAR system 100, for example, resolution, accuracy, reliability and/or robustness. Increasing the dynamic range may be specifically advantageous for short range detection where large amount of reflected light 206 may be reflected from one or more objects located close to the LIDAR system 100, for example, below 10, 20, 30, 40, 50, 60 meters, and/or the like.
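The relation between the entrance/exit area ratio and the dynamic range gain can be sketched as a first-order radiometric model: spreading the same optical power over a larger area lowers the per-area irradiance, so the detector saturates at a proportionally higher input level. The area values below are illustrative, chosen only to reproduce the quoted 1/25 and 1/4 ratios:

```python
def dynamic_range_gain(entrance_area: float, exit_area: float) -> float:
    """First-order gain in dynamic range from dispersing a focused spot that
    enters through `entrance_area` over the larger `exit_area` (same units)."""
    return exit_area / entrance_area

# Entrance-to-exit area ratios of 1/25 and 1/4 give gains of 25x and 4x
print(dynamic_range_gain(1.0, 25.0), dynamic_range_gain(1.0, 4.0))  # -> 25.0 4.0
```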
  • the light exit of each micro-optic element 310 may be larger than the light entrance of the respective micro-optic element 310.
  • a cross section of a surface of the light exit of each micro-optic element 310 may have a larger surface area than a cross section of a surface of the light entrance of the micro-optic element 310.
  • the cross section of the light exit surface of each micro-optic element 310 may be larger than the cross section of the light entrance surface of the respective micro-optic element 310 by a ratio in a certain range, for example, a range of 25/1 to 4/1.
  • the cross section of the portion of reflected light 206 transmitted to each sensing element 116 via the light exit (surface) of the respective associated micro-optic element 310 may be larger than the cross section of the portion of reflected light 206 received at the light entrance (surface) of the respective micro-optic element 310 by a ratio in the certain range.
  • each micro-optic element 310 may be constructed, fabricated, shaped, and/or implemented to receive, from the focusing unit 302, a respective portion of the reflected light 206, which is typically significantly focused, and transmit the received light portion to the associated sensing element 116, typically having a large sensing surface, while preventing or at least significantly reducing (blocking, attenuating, etc.) other light from transmission to the respective associated sensing element 116.
  • the micro-optic elements 310 may therefore be constructed, shaped, and structured to disperse (spread, expand) the significantly focused reflected light 206 received at the light entrance of the respective micro-optic element 310 over the significantly larger sensing surface of the respective associated sensing element 116. It should be noted that each of the micro-optic elements 310 may employ a single such architecture, structure, and/or implementation, and/or a combination of two or more such architectures, structures, and/or implementations.
  • disperse and its variants relate to spreading, expanding, and/or distributing light according to a desired distribution, typically a wide distribution.
  • This means dispersing received light may be interpreted to mean that a surface area of the dispersed light may be larger than the surface area of the received light, more specifically a cross section of the dispersed (spread) light (beam) may have a larger surface area than the cross section of the received light (beam).
  • each micro-optic element 310 may be shaped to have a curvature configured to disperse the respective portion of reflected light 206 over a sensing surface of the respective associated sensing element 116.
  • the light entrance and/or exit may be configured to have a convex curved surface which may disperse the significantly focused reflected light 206 received at the light entrance of the respective micro-optic element 310 over the significantly larger sensing surface of the respective associated sensing element 116.
  • the curvature may be, for example, part of the micro-optic element 310, for example, formed in the light entrance of each micro-optic element 310 which may be constructed of one or more materials having a refractive index selected and/or adapted to transmit the reflected light 206 toward the light exit of the respective micro-optic element 310, specifically disperse the respective portion of reflected light over the sensing surface of the associated sensing element 116.
  • each micro-optic element 310 may be associated with one or more respective lenses configured to disperse the respective portion of reflected light 206 over the sensing surface of the respective associated sensing element 116.
  • the light entrance and/or exit of each micro-optic element 310 may comprise a dispersing lens configured to disperse the significantly focused reflected light 206 received at the light entrance of the respective micro-optic element 310 over the significantly larger sensing surface of the respective associated sensing element 116.
  • each micro-optic element 310 may be structured to have its light entrance and its light exit spaced apart.
  • each micro-optic element 310 may comprise an optical path configured to optically couple the light exit of the respective micro-optic element 310 to the light entrance of the micro-optic element 310.
  • each micro-optic element 310 may be a monolithic component constructed of one or more materials, for example, glass, a polymer, and/or the like configured to transfer light from the light entrance of the respective micro-optic element 310 to the light exit of the micro-optic element 310.
  • each micro-optic element 310 may facilitate an open air light transmission path from the light entrance of the respective micro-optic element 310 to the light exit of the micro-optic element 310.
  • each micro-optic element 310 may comprise one or more optic fibers configured to direct light from the light entrance of the respective micro-optic element 310 to the light exit of the micro-optic element 310.
  • FIG. 4A and FIG. 4B are cross section views of exemplary micro-optic elements of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensing element of the LIDAR system, in accordance with embodiments of the present disclosure.
  • an exemplary micro-optic element 310A such as the micro-optic element 310 may be constructed as a monolithic component, comprise a light input, for example, an input surface 402A through which a portion of incoming reflected light such as the reflected light 206 may enter the micro-optic element 310A, and a light exit, for example, an output surface 404A through which light may be transmitted to a sensing element such as the sensing element 116 associated with the micro-optic element 310A.
  • the input surface 402A may coincide with the focal plane 304 of a focusing unit such as the focusing unit 302 such that the portion of reflected light 206 is focused on the input surface 402A.
  • the output surface 404A which is spaced apart from the input surface 402A may coincide with the detector plane 306, i.e., with the sensing surface of the associated sensing element 116.
  • the micro-optic element 310A may be shaped and/or constructed such that the light blocking exterior 408A may have a base angle 420 with the light exit of the micro-optic element 310A in a certain range, for example, 60-90 degrees. As such, light transmitted inside the micro-optic element 310A from the light entrance 402A may hit the light blocking exterior 408A at a large angle with respect to the normal to the surface of the light blocking exterior 408A and be reflected back toward the light exit 404A.
  • the micro-optic element 310A may be shaped to have the angle 420 within the certain range and have its light blocking exterior 408A constructed of one or more materials having a selected refractive index which, combined with the base angle 420, may reduce refraction of the internal light and increase reflection of internal light beams toward the light exit 404A.
  • the micro-optic element 310A may comprise one or more optical elements, for example, a prism shaped and/or constructed to optically couple the output surface 404A to the input surface 402A for distributing (expanding) the incoming reflected light 206 and transmitting distributed light 406 to the associated sensing element 116.
  • the distributed light 406 may have an increased (expanded) cross section larger than the cross section of the incoming reflected light 206 and may thus disperse (distribute) over the entire sensing surface having a larger cross section than the cross section of the incoming reflected light 206.
  • another exemplary micro-optic element 310B such as the micro-optic element 310 may comprise a lens 410, for example, a convex lens configured to disperse, i.e., distribute, the portion of incoming reflected light 206 received via an input surface 402B of the lens 410 according to a desired distribution, typically a wide distribution, and transmit the distributed light 406 via a lens output surface 412 of the lens 410 to the sensing element 116 associated with the micro-optic element 310B.
  • the input surface 402B may coincide with the focal plane 304 of the focusing unit 302 such that the portion of reflected light 206 is focused on the input surface 402B.
  • the micro-optic element 310B may include a void between the lens output surface 412 and the detector plane 306.
  • since the lens output surface 412 does not coincide with the detector plane 306, the light exit of the micro-optic element 310B is considered to comprise an output surface 404B coincident with the detector plane 306 through which the distributed light 406 received from the lens output surface 412 is transmitted to the associated sensing element 116 at the detector plane 306.
  • the micro-optic element 310B may further comprise a light blocking exterior 408B which may extend from the lens output surface 412 to the detector plane 306, i.e., to the sensing surface of the sensing element 116.
  • the light blocking exterior 408B may be constructed, for example, as a diaphragm encircling the lens 410 configured to prevent, at least partially, transmission of incident light to the sensing element 116.
  • the light blocking exterior 408B may be constructed of one or more light absorptive materials configured to absorb incident light.
  • the light blocking exterior 408B may be geometrically shaped to reflect and/or deflect light away from the detector plane 306.
  • the micro-optic element 310B may transmit (transfer) to the sensing element 116 only or at least mostly the portion of reflected light 206 entering the micro-optic element 310B through the input surface 402B while reducing and potentially preventing transmission of other light (e.g., crosstalk, noise, stray light, etc.) to the sensing element 116.
  • the blocking exterior 408B encircling the lens 410 may be solid, i.e., its interior space may be constructed of one or more materials which optically couple the detector plane 306 to the lens output surface 412 such that the distributed light 406 may be transmitted to the sensing surface of the sensing element 116.
  • the blocking exterior 408B encircling the lens 410 may comprise a void forming an open air optical path from the output surface 404B of the lens 410 to the detector plane 306.
  • the micro-optic elements 310 may be configured to employ angular filtering by preventing transmission of incident light beams having an Angle of Incidence (AOI) with respect to the light entrance of the micro-optic element 310 which is outside a predefined angle range.
  • the AOI of incident light beams relating to each micro-optic element 310 may be expressed with respect to a normal to the light entrance of the respective micro-optic element 310 at the point of incidence.
  • the normal may be uniform across the light entrance of each micro-optic element 310.
  • the normal may be specific to each point of the light entrance of each micro-optic element 310.
  • incident light beams having an AOI which is within the angle range may be transmitted via the micro-optic element 310 to the light exit of the micro-optic element 310 while incident light beams having an AOI outside the angle range may be rejected, i.e., prevented from transmitting to the light exit.
  • each micro-optic elements 310 may reject light beams having an AOI outside the predefined angle range, i.e., prevent their transmission to the associated sensing element 116, while transmitting to the associated sensing element 116 light beams having an AOI within the angle range.
  • the micro-optic elements 310 may employ one or more structures, architectures, and/or compositions to facilitate the angular filtering.
  • each of the micro-optic elements 310 may be geometrically shaped to form a geometric angular light trap adapted to reflect and/or deflect away light beams outside the predefined angle range.
  • the micro-optic elements 310 may be shaped and/or constructed to include a front-end conduit adapted to filter out light beams having an AOI outside the predefined angle range.
  • the conduit may have one or more light transmission, reflection, refraction, and/or rejection properties similar to those of the light blocking exterior 408.
  • the conduit may be part of the light blocking exterior 408.
  • FIG. 5A and FIG. 5B are cross section views of exemplary micro-optic elements shaped to deflect light reflected from objects illuminated by the LIDAR system away from a sensing element of the LIDAR system, in accordance with embodiments of the present disclosure.
  • an exemplary micro-optic element 310C comprising a light entrance 402C, a light exit 404C and a light blocking exterior 408C may further comprise a conduit 502A at its front-end, specifically at its light entrance, for example, at an input surface 402C of the micro-optic element 310C.
  • the conduit 502A may be configured to convey and/or transmit the incoming reflected light 206 and increase angular rejection by rejecting light beams having an AOI with respect to the input surface 402C that is outside a predefined angle range.
  • the conduit 502A may be constructed of one or more materials having a refractive index adapted to transfer light beams having AOI angles with the normal of the input surface 402C which are within a predefined angle range between angles 510A and 510B, for example, ±45 degrees, while deflecting away light beams having AOI angles which are outside the predefined angle range. As seen, the conduit 502A may have a uniform cross section across its entire height.
  • the AOI angles of light beams 206B and 206C with respect to the normal to the input surface 402C may be, for example, +45 degrees and -45 degrees respectively, which are within the predefined angle range defined between angles 510A and 510B, and these light beams may therefore transfer through the micro-optic element 310C toward the sensing element 116.
  • Light beams 206A and 206D may be deflected away from the sensing element 116 since their AOI angles with respect to the normal to the input surface 402C, for example, +55 degrees and -65 degrees respectively, are outside the predefined angle range.
  • the conduit 502A may be further configured to support internal reflection of light beams incident on the internal surfaces of the conduit 502A to increase reflection of light inside the micro-optic element 310C, i.e., light transmitted through the micro-optic element 310C toward the sensing element 116.
  • the conduit 502A may be constructed of one or more light transparent materials having a refractive index which may refract light beams having AOI angles outside a certain angle range and transmit them out of the micro-optic element 310C while reflecting light beams having AOI angles which are within the certain angle range toward the light exit 404C and through it to the sensing surface of the sensing element 116.
  • another exemplary micro-optic element 310D comprising a light entrance 402D, a light exit 404D and a light blocking exterior 408D may further include a conduit 502B shaped and/or constructed to have a varying cross section across its height.
  • the conduit 502B may be shaped to have inward inclining side surfaces (side walls). This inward inclination may alter a refraction angle out of the micro-optic element 310D for light beams having an AOI outside the predefined angle range since the normal to the inward inclined surfaces is tilted inward.
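The effect of the inclined side walls can be sketched with a simplified 2D Snell's-law model. All numeric values here (refractive index 1.5, 20 degree wall tilt, a single wall hit) are illustrative assumptions, not figures from this disclosure:

```python
import math

def rejected_by_inclined_wall(aoi_deg: float, n: float = 1.5,
                              wall_tilt_deg: float = 20.0) -> bool:
    """Simplified 2D model of a conduit such as 502B: the ray refracts at the
    entrance, hits an inward-inclined side wall once, and escapes (is rejected)
    when its incidence angle at the wall falls below the critical angle."""
    theta_inside = math.degrees(math.asin(math.sin(math.radians(aoi_deg)) / n))
    wall_incidence = 90.0 - (theta_inside + wall_tilt_deg)  # from the wall normal
    critical = math.degrees(math.asin(1.0 / n))
    return wall_incidence < critical  # below critical -> refracts out of the element

# A steep ray is rejected while a shallow ray is kept by total internal reflection
print(rejected_by_inclined_wall(65.0), rejected_by_inclined_wall(30.0))  # -> True False
```

The inward tilt effectively lowers the wall incidence angle of steep rays, which is why the inclined conduit 502B can reject angles that a straight-walled conduit would keep.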
  • each of the micro-optic elements 310 may facilitate the angular filtering using material angular light filtering.
  • Such micro-optic elements 310 may be composed of one or more materials having a refractive index selected to prevent transmission of incident light beams having an AOI with respect to the light entrance which is outside the predefined angle range while transferring incident light beams having an AOI within the predefined angle range.
  • an exemplary micro-optic element 310 may be constructed of one or more materials characterized by a refractive index such that light beams incident on the micro-optic element 310 with an AOI that is outside the predefined angle range are refracted and transmitted away from the light exit of the micro-optic element 310 and the sensing surface of the respective associated sensing element 116.
  • the refractive index of the selected materials may cause light beams incident on the micro-optic elements 310 with an AOI with respect to the light entrance that is within the predefined angle range to be directed and transmitted toward the light exit of the micro-optic element 310 and the sensing surface of the respective associated sensing element 116.
  • each micro-optic element 310 may be shaped as a truncated pyramid, for example, a square pyramid, a triangular pyramid, and/or the like comprising a truncated top facet, a base facet, and a plurality of side facets.
  • the truncated top facet of the pyramid may constitute a light entrance such as the light entrance
  • the base facet of the pyramid may constitute a light exit such as the light exit 404
  • the plurality of side facets of the pyramid may constitute a light blocking exterior such as the light blocking exterior 408.
  • an exemplary micro-optic array 320 may be shaped as an array of truncated pyramids, i.e., where each of the truncated pyramids constitutes a respective one of the plurality of the micro-optic elements 310.
  • FIG. 6 depicts various views of an exemplary micro-optic array of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure.
  • An exemplary micro-optic array 320E such as the micro-optic array 320 of an imaging system such as the imaging system 300 of a LIDAR system such as the LIDAR system 100 may be constructed, shaped, produced, and/or configured to have a “Toblerone” shape comprising a plurality of truncated pyramids each constituting a respective micro-optic element of a plurality of micro-optic elements 310E such as the micro-optic elements 310.
  • Each micro-optic element 310E(i) may be associated with a respective sensing element 116(i) such as the sensing element 116 of a sensor array such as the sensor array 316.
  • each of the micro-optic elements 310E(i) has a respective light entrance, for example, an input surface 402E formed by the truncated top facet of the pyramid constituting the respective micro-optic element 310E(i), and a respective light exit, for example, an exit surface 404E formed by the base facet of the pyramid constituting the respective micro-optic element 310E(i).
  • the micro-optic array 320 comprising the plurality of micro-optic elements 310 may be a monolithic component, i.e., a single component constructed of one or more materials, specifically light transferring materials, for example, a polymer, glass, and/or the like, such that each micro-optic element 310 may optically couple its respective light exit to its respective light entrance for transmitting the portion of incoming reflected light received from a focusing unit such as the focusing unit 302 to a respective sensing element 116 associated with the respective micro-optic element 310.
  • the micro-optic array 320 comprising the plurality of micro-optic elements 310 may be constructed and/or fabricated of multiple disparate components, elements, and/or devices, for example, an open air optical element, an optic fiber, and/or the like which optically couple the respective light entrance of each of the micro-optic elements 310 to the respective light exit of the respective micro-optic element 310.
  • the imaging system 300, specifically the micro-optic array 320, may further comprise a plurality of optical filters disposed between the plurality of micro-optic elements 310 of the micro-optic array 320. These optical filters may be adapted, configured, shaped, and/or fabricated to absorb light not received via the light entrances of the plurality of micro-optic elements 310.
  • FIG. 7A and FIG. 7B are schematic illustrations of exemplary micro-optic arrays of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system and filtering out spatial light noise, in accordance with embodiments of the present disclosure.
  • a plurality of optical filters 702 may be associated with an exemplary micro-optic array 320 of an imaging system such as the imaging system 300 of a LIDAR system such as the LIDAR system 100.
  • a spatial filter 702(i) may be disposed between each pair of micro-optic elements 310(i) and 310(i+1) such as the micro-optic elements 310 of the micro-optic array 320 associated with respective sensing elements 116(i) and 116(i+1) of a sensor array such as the sensor array 316 of the LIDAR system 100.
  • the optical filters 702 may be configured to prevent spatial light, i.e., light coming in from the direction of a focusing unit such as the focusing unit 302 of the imaging system 300, from being transmitted toward the sensor array 316. Moreover, one or more optical filters 702 may be disposed in front of the first micro-optic element 310 of the micro-optic array 320 and behind the last micro-optic element 310 of the micro-optic array 320.
  • the spatial filter 702(i) may be shaped, arranged, and/or disposed in one or more shapes, constructions, and/or configurations.
  • a plurality of spatial filters 702A(i) may be disposed between each pair of micro-optic elements 310(i) and 310(i+1) along a significant stretch and possibly the entire length (height) of the lateral surfaces of the micro-optic elements 310, for example, adjacent to the surface of the light blocking exterior of the micro-optic elements 310.
  • the internal surfaces of the micro-optic elements 310 may be coated with one or more reflective materials adapted to increase reflection of light beams back into the micro-optic element 310 and toward the associated sensing element 116. This may increase the energy level of light received at the sensing element 116 which may increase light detection of the sensing element 116 and thus increase detection performance of the LIDAR system 100.
  • a plurality of spatial filters 702B(i) may be disposed between each pair of micro-optic elements 310(i) and 310(i+1) next to the light entrances of the micro-optic elements 310.
  • the spatial filters 702B(i) may be significantly small (short) and may not stretch along the lateral surfaces of the micro-optic elements 310.
  • this arrangement, which leaves an air gap between the micro-optic elements 310, coupled with selection of the material(s) composing the micro-optic element(s) 310 to have an appropriate refractive index, may increase reflection of light beams inside the micro-optic element 310 by preventing refraction of these light beams out of the micro-optic element(s) 310 and thus reflecting them back into the micro-optic element 310 and toward the associated sensing element 116.
  • the reflected light may increase the energy level of light received at the sensing element 116 which may increase light detection of the sensing element 116 and thus increase detection performance of the LIDAR system 100.
  • the optical filters 702 serving as spatial light traps may function as the light blocking exterior 408 of the micro-optic elements 310 for reducing and potentially preventing spatial light not entering the micro-optic elements 310 via their light entrances from being transmitted to the sensing elements 116 of the sensor array 316.
  • one or more materials may be disposed between the sensing surface of the sensing elements 116, and the light exit of the micro-optic elements 310 to prevent reflection of light beams from the sensing elements 116 back toward the micro-optic elements 310, and optionally to affix the micro-optic array 320 to the sensor array 316.
  • one or more optical pastes, for example, an optical glue, an optical paste, and/or the like, may be spread over the sensing surface of the sensing elements 116.
  • the optical paste(s) may be selected and/or configured to have a matched refractive index to prevent reflection of incident light back toward the light entrance of the micro-optic elements 310.
  • FIG. 8 is a flow chart of an exemplary process of using micro-optic elements for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure.
  • An exemplary process 800 may be executed using a micro-optic array such as the micro-optic array 320 of an imaging system such as the imaging system 300 of a LIDAR system such as the LIDAR system 100.
  • light such as the reflected light 206 may be received by the imaging system 300, specifically, the reflected light 206 may be received via a focusing unit of the imaging system 300 such as the focusing unit 302.
  • the light 206 may be reflected by one or more objects illuminated by light such as the projected light 204 emitted by one or more light sources such as the light source 112 of the LIDAR system and projected to an FOV such as the FOV 120 of the LIDAR system 100 by a scanning unit such as the scanning unit 104.
  • the reflected light 206 may be divided into a plurality of light portions each directed to a respective one of a plurality of sensing elements such as the sensing elements 116 of a sensor array such as the sensor array 316.
  • each of a plurality of micro-optic elements such as the micro-optic elements 310 of a micro-optic array such as the micro-optic array 320 associated with a respective sensing element 116 may receive a respective portion of the reflected light 206 from the focusing unit 302.
  • each micro-optic element 310 may receive the respective portion of the reflected light 206 via its light entrance.
  • each micro-optic element 310 may transmit the respective portion of reflected light 206 to the respective associated sensing element 116 while reducing and potentially preventing transmission of light not received via the light entrance of the respective micro-optic element 310 such that all, or at least most, of the light transmitted to the respective associated sensing element 116 is light received via the light entrance of the respective micro-optic element 310.
  • aspects of the present disclosure may be embodied as a system, method and/or computer program product. As such, aspects of the disclosed embodiments may be provided in the form of an entirely hardware embodiment, an entirely software embodiment, or a combination thereof.
  • Although aspects of the disclosed embodiments are described as being stored in memory, one skilled in the art will appreciate that these aspects can also be stored on other types of computer readable media, such as secondary storage devices, for example, hard disks or CD-ROM, or other forms of RAM or ROM, USB media, DVD, Blu-ray, or other optical drive media.
  • Programs and computer program products based on the written description and disclosed methods are within the skill of an experienced developer.
  • the various programs or program modules can be created using any of the techniques known to one skilled in the art or can be designed in connection with existing software.
  • program sections or program modules can be designed in or by means of .Net Framework, .Net Compact Framework (and related languages, such as Visual Basic, C, etc.), Java, C++, Objective-C, HTML, HTML/AJAX combinations, or HTML with included Java applets.
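The per-element detection flow recited in process 800 above (dividing the reflected light 206 into portions and transmitting each portion to its associated sensing element 116 while blocking light not received via the light entrance) can be sketched as a toy numerical model. The function names, the equal division of energy, and all numeric values below are hypothetical illustrations, not the disclosed implementation.

```python
# Toy model of the detection flow of process 800; all names and values
# are hypothetical illustrations, not the disclosed implementation.

def divide_reflected_light(total_energy: float, n_portions: int) -> list:
    """Divide the incoming reflected light into equal portions, one per
    sensing element (equal split assumed for simplicity)."""
    return [total_energy / n_portions] * n_portions

def micro_optic_transmit(energy: float, via_entrance: bool) -> float:
    """A micro-optic element transmits only light received via its light
    entrance; the light blocking exterior blocks everything else."""
    return energy if via_entrance else 0.0

def sense(portions, stray_light: float):
    """Each sensing element receives its own portion; stray light arriving
    at the exterior contributes nothing to any sensing element."""
    signals = [micro_optic_transmit(p, via_entrance=True) for p in portions]
    blocked = micro_optic_transmit(stray_light, via_entrance=False)
    return signals, blocked

portions = divide_reflected_light(total_energy=1.0, n_portions=4)
signals, blocked = sense(portions, stray_light=0.3)
```

In this sketch, the stray-light contribution to every sensing element is zero, mirroring the role of the light blocking exterior, while the full received energy reaches the sensor array through the light entrances.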

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

An imaging system for receiving light reflected from a Field Of View (FOV) illuminated by a LIDAR system, comprising a focusing unit configured to receive reflected light from a FOV illuminated by a LIDAR system and focus a plurality of portions of the reflected light on a focal plane of the focusing unit, a sensor array comprising a plurality of sensing elements configured to detect the reflected light, and a micro-optic array comprising a plurality of micro-optic elements each associated with a respective sensing element. Each micro-optic element comprises a light entrance coincident with the focal plane and configured for receiving a respective portion of the reflected light from the focusing unit, a light exit through which the respective portion of the reflected light is transmitted to the associated sensing element, and a light blocking exterior disposed between the light entrance and the light exit and configured to prevent transmission of the reflected light.

Description

APPLICATION FOR PATENT
Title: MICRO-OPTICS ON DETECTION PATH OF LIDAR SYSTEMS
TECHNICAL FIELD
[0001] The present disclosure relates to Light Detection and Ranging (LIDAR) technology for scanning a surrounding environment, and, more specifically, but not exclusively, to increasing noise immunity of LIDAR systems deployed for scanning a surrounding environment.
RELATED APPLICATIONS
[0002] The present application claims the benefit of priority of U.S. provisional patent application No. 63/504,207 filed on May 25, 2023, and of U.S. provisional patent application No. 63/508,112 filed on June 14, 2023, the content of each of which is incorporated herein by reference in its entirety.
BACKGROUND
[0003] With the advent of driver assist systems and autonomous vehicles, automobiles are equipped with systems capable of reliably sensing and interpreting their surroundings, including identifying obstacles, hazards, objects, and other physical parameters that might impact navigation of the vehicle. To this end, various technologies are currently used, for example, Radio Detection and Ranging (RADAR), LIDAR, camera-based systems, and/or the like operating alone, in conjunction, and/or in a redundant manner.
[0004] LIDAR based object detection and surroundings mapping has proved to be highly efficient, reliable, and robust compared to other detection technologies. However, while such LIDAR based detection systems may be extremely efficient, their performance, whether employing pulsed or continuous wave illumination, may be affected, and possibly significantly degraded due to environmental interference such as, for example, noise (e.g., ambient light, crosstalk, stray light, etc.), excessive light reflection, parasitic reflections, external light sources, and interferences at the component and electrical circuits level, to name just a few.
SUMMARY
[0005] It is an object of the present disclosure to provide methods, systems and/or software program products for improving performance of LIDAR systems by distributing light reflected from objects in a scene illuminated by the LIDAR system on sensing elements of the LIDAR system to reduce noise reaching the sensors and/or to improve dynamic range of the LIDAR system. This objective is achieved by the features of the independent claims. Further implementation forms are apparent from the dependent claims, the description, and the figures. It should be noted that multiple such implementation forms may be combined together in any single embodiment.
[0006] According to a first aspect of embodiments disclosed herein, there is provided an imaging system for receiving light reflected from a field of view (FOV) illuminated by a LIDAR system, comprising a focusing unit configured to receive reflected light from a FOV illuminated by a LIDAR system and focus a plurality of portions of the reflected light on a focal plane of the focusing unit, a sensor array comprising a plurality of sensing elements configured to detect the reflected light, and a micro-optic array comprising a plurality of micro-optic elements each associated with a respective one of the plurality of sensing elements. Each micro-optic element comprises a light entrance coincident with the focal plane and configured for receiving a respective portion of the plurality of portions of reflected light from the focusing unit, a light exit through which the respective portion of the reflected light is transmitted to the associated sensing element, and a light blocking exterior disposed between the light entrance and the light exit, the light blocking exterior is configured to prevent transmission of the reflected light.
[0007] According to a second aspect of embodiments disclosed herein, there is provided a method of distributing light reflected from a field of view (FOV) illuminated by a LIDAR system on sensing elements of the LIDAR system, comprising receiving, via an optical system of a LIDAR system, light reflected from an FOV illuminated by the LIDAR system, wherein the optical system is configured to focus a plurality of portions of the reflected light on a focal plane of the optical system, transmitting the reflected light via a micro-optic array to a sensor array comprising a plurality of sensing elements configured to detect the reflected light. The micro-optic array comprises a plurality of micro-optic elements each associated with a respective one of the plurality of sensing elements. Wherein each micro-optic element comprises: a light entrance coincident with the focal plane and configured for receiving a respective portion of the plurality of portions of reflected light from the focusing unit, a light exit through which the respective portion of the reflected light is transmitted to the associated sensing element, and a light blocking exterior disposed between the light entrance and the light exit, the light blocking exterior is configured to prevent transmission of the reflected light.
[0008] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, each of the plurality of sensing elements comprises an array of light detecting elements. Each micro-optic element is configured to disperse light received via the light entrance of the respective micro-optic element over a sensing surface of the array of light detecting elements of the associated sensing element. Wherein a surface area of the sensing surface is larger than the surface area of a cross section of the light entrance.
[0009] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, a cross section of a surface of the light exit has a larger surface area than a cross section of a surface of the light entrance by a ratio in a range of 25/1 to 4/1.
[0010] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the light entrance and the light exit of each micro-optic element are transparent to the reflected light.
[0011] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the light entrance of each micro-optic element is shaped to have a curvature configured to disperse the respective portion of reflected light over a sensing surface of the associated sensing element.
[0012] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, each micro-optic element is associated with one or more respective lenses configured to disperse the respective portion of reflected light over a sensing surface of the associated sensing element.
[0013] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the light entrance and the light exit of each micro-optic element are spaced apart and optically coupled to each other via the volume of the respective micro-optic element between the light entrance and the light exit.
[0014] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, each micro-optic element is shaped to prevent transmission of light beams having an angle of incidence (AOI) with a surface of the light entrance outside a predefined angle range.
[0015] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, each micro-optic element comprises a front-end conduit geometrically shaped to direct away from the light exit light beams having an AOI with the surface of the light entrance outside the predefined angle range.
[0016] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the conduit has a uniform cross section.
[0017] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the conduit has a varying cross section.
[0018] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the light blocking exterior is configured to absorb incident light external to the micro-optic element.
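The AOI-based angular filtering of the implementation forms above can be illustrated with a minimal geometric-optics sketch. The refractive index (1.5) is an assumed value for illustration only, and the ±45 degree acceptance range follows the example given elsewhere in the description; neither is a specified material property of the disclosed system.

```python
import math

# Minimal geometric-optics sketch of AOI-based angular filtering.
# N_CONDUIT = 1.5 and the +/-45 degree acceptance range are assumed
# illustrative values, not specified material properties.
N_AIR = 1.0
N_CONDUIT = 1.5
ACCEPTANCE_DEG = 45.0  # predefined angle range about the entrance normal

def is_transmitted(aoi_deg: float) -> bool:
    """True if a beam's angle of incidence (AOI) with respect to the normal
    of the light entrance falls within the predefined angle range."""
    return abs(aoi_deg) <= ACCEPTANCE_DEG

def refraction_angle_deg(aoi_deg: float) -> float:
    """Refraction angle inside the conduit per Snell's law:
    n_air * sin(aoi) = n_conduit * sin(theta)."""
    s = N_AIR * math.sin(math.radians(aoi_deg)) / N_CONDUIT
    return math.degrees(math.asin(s))
```

Under these assumed values, the example beams of the description behave as recited: AOIs of +45 and -45 degrees are transmitted toward the light exit, while +55 and -65 degrees fall outside the acceptance range and are rejected.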
[0019] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the light blocking exterior is configured to reflect incident light external to the micro-optic element.
[0020] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the light blocking exterior is configured to reflect incident light inside the micro-optic element.
[0021] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, each micro-optic element is a monolithic component.
[0022] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, each micro-optic element is shaped as a truncated pyramid wherein a truncated top facet of the pyramid constitutes the light entrance, a base facet of the pyramid constitutes the light exit, and a plurality of side facets of the pyramid constitute the light blocking exterior.
[0023] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the micro-optic array is a monolithic component.
[0024] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the imaging system further comprises a plurality of optical filters disposed between the plurality of micro-optic elements of the micro-optic array to prevent transmission of light not received via the light entrance of the plurality of micro-optic elements.
[0025] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the LIDAR system comprises a plurality of light sources configured to transmit a plurality of light beams toward at least part of a FOV of the LIDAR system, each of the plurality of portions of the reflected light corresponds to a respective one of the plurality of light beams.
[0026] In a further implementation form of the first and/or second aspects optionally together with one or more of their related implementation forms, the LIDAR system comprises one or more light sources configured to transmit a single elongated light beam toward a FOV of the LIDAR system, the reflected light corresponding to the single elongated light beam is divided into the plurality of portions of the reflected light.
[0027] Consistent with other disclosed embodiments, non-transitory computer-readable storage media may store program instructions, which are executed by at least one processor and perform any of the methods described herein.
[0028] The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various disclosed embodiments by way of example only. With specific reference now to the drawings in detail, it is stressed that the particulars are shown by way of example and for purposes of illustrative discussion of embodiments disclosed herein. In this regard, the description taken with the drawings makes apparent to those skilled in the art how disclosed embodiments may be practiced.
[0030] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various disclosed embodiments.
[0031] In the drawings:
[0032] FIG. 1A and FIG. 1B are schematic illustrations of an exemplary LIDAR system, in accordance with embodiments of the present disclosure;
[0033] FIG. 2 illustrates graph charts of exemplary light emission patterns projected by a LIDAR system, in accordance with embodiments of the present disclosure;
[0034] FIG. 3 is a schematic illustration of an exemplary micro-optic array of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure;
[0035] FIG. 4A and FIG. 4B are cross section views of exemplary micro-optic elements of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensing element of the LIDAR system, in accordance with embodiments of the present disclosure;
[0036] FIG. 5A and FIG. 5B are cross section views of exemplary micro-optic elements shaped to deflect light reflected from objects illuminated by the LIDAR system away from a sensing element of the LIDAR system, in accordance with embodiments of the present disclosure;
[0037] FIG. 6 depicts various views of an exemplary micro-optic array of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure;
[0038] FIG. 7A and FIG. 7B are schematic illustrations of exemplary micro-optic arrays of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system and filtering out spatial light noise, in accordance with embodiments of the present disclosure; and
[0039] FIG. 8 is a flow chart of an exemplary process of using micro-optic elements for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[0040] The present disclosure relates to LIDAR technology for scanning a surrounding environment, and, more specifically, but not exclusively, to increasing noise immunity of LIDAR systems deployed for scanning a surrounding environment.
[0041] LIDAR systems may suffer degradation in their performance, for example, reduced detection accuracy, reliability, consistency, and/or robustness, in terms of reduced detection range, reduced resolution, reduced Signal to Noise Ratio (SNR), reduced confidence of detection, and/or the like due to noise effects originating from one or more sources, for example, crosstalk between sensing elements of the LIDAR system, stray light within the LIDAR system, ambient light, and more.
[0042] Another limitation which may reduce the LIDAR systems’ performance may be traced to a limited dynamic range of the LIDAR sensors. LIDAR sensors may typically comprise an array of light detectors, for example, Avalanche Photodiodes (APDs), Single Photon Avalanche Diodes (SPADs), and/or the like which may trigger in response to capturing light (photons), specifically light reflected from objects illuminated by the LIDAR system.
[0043] Dynamic range relates to the range of light energy levels the sensors may be capable of measuring. Each of the light detectors may trigger at a certain energy level of received light, which is typically small (e.g., a single photon for SPADs), and the sensor may therefore aggregate (e.g., sum, combine, etc.) the outputs of the plurality of light detectors of its array and/or part thereof to indicate the energy level of light captured by the sensor. The size of the sensor, defined by the number of light detectors, may therefore define the dynamic range of the sensor. Therefore, transmitting the reflected light in a focused beam, i.e., a beam having a significantly small cross section, onto the sensor may result in a small effective sensing surface of the sensor, i.e., the number of light detectors which may receive the reflected light may be significantly small, which may significantly reduce the dynamic range of the sensor.
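The dynamic-range argument of [0043] can be made concrete with a small sketch: if each SPAD contributes at most one count per detection cycle, the sensor's summed output clips at the number of illuminated detectors, so a tightly focused spot saturates far earlier than a dispersed one. The photon counts and array size below are assumed for illustration only.

```python
# Sketch of SPAD-array saturation: the aggregate sensor output clips at the
# number of illuminated detectors. All numbers are illustrative assumptions.

def aggregate_counts(incident_photons: int, illuminated_spads: int) -> int:
    """Spread incoming photons over the illuminated SPADs; each SPAD fires at
    most once per cycle, so the summed output saturates at illuminated_spads."""
    return min(incident_photons, illuminated_spads)

SENSOR_SPADS = 100  # light detectors per sensing element (assumed)

focused = aggregate_counts(60, illuminated_spads=4)               # clips at 4
dispersed = aggregate_counts(60, illuminated_spads=SENSOR_SPADS)  # reads 60
```

With only 4 illuminated detectors the sensor cannot distinguish 60 photons from 4, whereas dispersing the same light over the full array preserves the measurement, which is the motivation for spreading the reflected light portion over a larger sensing surface.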
[0044] According to some embodiments of the present disclosure, there are provided components, devices, systems, and methods for using micro-optic elements to increase noise immunity of LIDAR systems and/or for increasing the dynamic range of the LIDAR sensors.
[0045] An imaging system of a LIDAR system may include a micro-optic array comprising a plurality of micro-optic elements each associated with a respective one of a plurality of sensing elements (sensors) of the LIDAR system. The micro-optic array may be configured to transmit the light reflected from one or more objects in a Field of View (FOV) of the LIDAR system which are illuminated by light projected by the LIDAR system.
[0046] In particular, the reflected light may comprise a plurality of light portions and each of the micro-optic elements may be configured to transmit a respective portion of the reflected light to its respective associated sensing element. The plurality of portions of the reflected light may relate, for example, to a plurality of light beams projected by the LIDAR system such that each projected light beam may be associated with a respective sensing element which may receive, via its associated micro-optic element, at least some light reflected from objects in the FOV illuminated by the respective light beam. In another example, light reflected in response to a line scan, in which an elongated wide beam is projected by the LIDAR system, may be divided (split, segmented) into a plurality of light portions to increase resolution of the scanned area.
[0047] The micro-optic elements may be deployed on the optical path of the imaging system between a focusing unit of the LIDAR system and the sensor array.
[0048] Each of the micro-optic elements, constructed of one or more light transferring materials, for example, glass, a polymer, and/or the like, may comprise a light entrance coincident with the focal plane (i.e., at the focal length) of the focusing unit for receiving the respective portion of reflected light from the focusing unit and a light exit, optically coupled to the light entrance, through which the reflected light portion may be transmitted to the respective associated sensing element.
[0049] Each of the micro-optic elements may further include a light blocking exterior disposed between the light entrance and the light exit of the respective micro-optic element. The light blocking exterior may be configured to reduce and potentially prevent transmission of light to the respective associated sensing element. Specifically, the light blocking exterior of each micro-optic element may be configured to prevent transmission of light not received via the light entrance of the respective micro-optic element to the respective associated sensing element. The light blocking exterior, employing one or more light blocking structures, architectures, and/or compositions, for example, light absorbing, light reflecting, light deflecting, and/or a combination thereof, may prevent transmission of light originating from one or more sources, for example, stray light travelling inside the LIDAR system, crosstalk (between sensing elements and/or between micro-optic elements), ambient light, and/or the like. The light blocking exterior may thus ensure that all, or at least most, of the light transmitted (distributed) to the sensing element associated with each micro-optic element is light that entered through the light entrance of the respective micro-optic element.
[0050] Each of the sensing elements of the sensor array of the LIDAR system may typically consist of an array of light detecting elements (light detectors), for example, APDs, SPADs, and/or the like, each capable of triggering upon reception of photon(s) of reflected light, with the outputs of the array summed together to form the output of the respective sensing element.
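The summed-output behavior described above can be sketched as follows. This is an illustrative model only, not taken from the disclosure; the function name and the detector count of 64 are assumptions.

```python
# Illustrative sketch: a sensing element modeled as an array of binary
# detecting elements (e.g., SPADs) whose triggered outputs are summed.

def sensing_element_output(photon_hits, num_detectors=64):
    """Sum the binary triggers of the detecting elements.

    photon_hits: indices of detectors that received at least one photon
    during the sampling window. Each detector fires at most once, so
    repeated hits on the same detector count only one trigger.
    """
    triggered = {i for i in photon_hits if 0 <= i < num_detectors}
    # Each triggered detector contributes one count; the summed counts
    # form the output signal of the sensing element.
    return len(triggered)

# Example: 5 photons land on 4 distinct detectors -> output of 4 counts.
print(sensing_element_output([3, 7, 7, 12, 40]))  # -> 4
```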
[0051] Since the dynamic range of the LIDAR system is directly related to the surface area of the sensing surface of each sensing element, each of the micro-optic elements may be further configured to disperse (i.e., spread, expand, distribute, etc.) the received portion of reflected light, which may be significantly focused, over the significantly larger sensing area of the associated sensing element, which in turn enables more detecting elements (SPADs) in each sensing element. To this end, the light exit of each micro-optic element may be configured to have a cross section which is significantly larger, i.e., having a larger surface area, than the cross section of the light entrance of the respective micro-optic element. For example, the cross section of the light exit of each micro-optic element may have a surface area larger by a ratio of 25/1 to 4/1 than the cross section of the light entrance. As such, the cross section of the portion of reflected light received at the light entrance of each micro-optic element from the focusing unit may be smaller by the corresponding ratio (e.g., 1/25 to 1/4) compared to the cross section of the reflected light dispersed through the light exit surface onto the sensing surface of the sensing element.
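The entrance-to-exit ratio described above can be illustrated with a short sketch. The function name and the specific areas are hypothetical; only the 25/1 ratio comes from the text.

```python
# Hedged sketch: the same optical power spread over a larger exit surface
# lowers the per-area light concentration on the sensing surface.

def irradiance_reduction(entrance_area, exit_area):
    """Factor by which per-area intensity drops when the received light
    portion is dispersed from the entrance onto the exit surface."""
    return exit_area / entrance_area

# Exit surface 25x larger than the entrance (the upper ratio, 25/1):
print(irradiance_reduction(entrance_area=1.0, exit_area=25.0))  # -> 25.0
```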
[0052] The micro-optic elements may employ one or more structures, architectures, compositions, and/or a combination thereof to expand the portion of reflected light transmitted to the respective associated sensing element. For example, the micro-optic elements may comprise one or more optical elements such as, for example, a lens, a prism, a curved surface, and/or the like.
[0053] Employing the micro-optic array comprising a plurality of micro-optic elements on the optical path of the LIDAR system to its sensing elements may present major benefits and advantages over currently existing LIDAR systems.

[0054] First, disposing the micro-optic array between the focusing unit and the sensor array may significantly increase separation and/or isolation between the sensing elements of the LIDAR system, since each sensing element may receive only, or at least mostly, light transmitted via its associated micro-optic element, while light originating from other sources, including light directed to other sensing elements, is significantly reduced and potentially prevented. Each of the sensing elements may be associated with a respective pixel in the image map (depth map), for example, a point cloud generated based on the output of the plurality of sensors to map distances of objects in the surrounding environment of the LIDAR system based on the timing and energy of light reflected from the environment and captured by the sensors. Increasing isolation between the sensing elements of the LIDAR system may therefore significantly increase pixel separation, which may significantly increase the accuracy of each individual pixel and the sharpness of the image map generated by the LIDAR system.
[0055] In addition, increasing isolation between the sensing elements via the micro-optic array, which is configured to reduce and potentially entirely prevent leakage of light between sensing elements, may allow reducing the distance between adjacent sensing elements. This may significantly reduce the size of the sensor array, consuming less space in the LIDAR system, which may reduce the size, complexity, and/or cost of the LIDAR system.
[0056] Moreover, the structure of the micro-optic elements, each associated with a respective sensing element of the LIDAR system, may significantly increase the energy of light received at each sensing element from the light entrance of its associated micro-optic element, while transmission of light originating from other sources toward the associated sensing element may be significantly reduced and potentially completely prevented. The sensing performance of each sensing element may therefore be significantly improved, since each sensing element may receive only, or at least mainly, the respective portion of reflected light directed to it, while noise, crosstalk, parasitic light, and/or stray light may be significantly blocked.
[0057] Increasing the light detection performance of the sensing elements may significantly increase the detection performance of the LIDAR system, for example, detection range, detection resolution, accuracy, reliability, confidence of detection, detection robustness and/or the like.
[0058] In addition, transmitting (distributing) each portion of reflected light through the micro-optic element over a larger sensing surface may allow using larger sensors, i.e., sensing elements having a larger sensing surface, for example, by significantly increasing the number of light detectors, which may significantly increase the dynamic range of each sensing element compared to transmitting the focused reflected light received from the focusing unit directly onto a much smaller sensing element. Using the micro-optic elements may allow an increase in the sensing surface of each sensing element by 1-2 orders of magnitude, which translates to an increase of the dynamic range by 1-2 orders of magnitude.
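As a rough sketch of the scaling noted above, the dynamic range of a sensing element can be treated as roughly proportional to its number of one-shot light detectors. This proportionality is an assumption for illustration, and the detector counts are hypothetical.

```python
import math

# Hedged sketch: more one-shot detectors -> more distinguishable signal
# levels before saturation, i.e., a proportionally larger dynamic range.

def dynamic_range_gain(detectors_small, detectors_large):
    """Ratio of distinguishable signal levels between two sensing elements."""
    return detectors_large / detectors_small

# Growing the sensing surface from 16 to 1600 detectors:
gain = dynamic_range_gain(16, 1600)
print(gain, math.log10(gain))  # -> 100.0 2.0 (i.e., 2 orders of magnitude)
```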
[0059] Increasing the dynamic range of the sensing elements may, in turn, significantly increase the detection performance of the LIDAR system. Increasing the dynamic range may be specifically effective and advantageous for LIDAR short range detection, where a large amount of light may be reflected from one or more objects located in close proximity to the LIDAR system (e.g., up to 10, 20, 30, 40, 50, 60 meters, etc.). This is because the large amount of light reflected by these close proximity objects may be distributed over large sensing elements comprising an extremely large number of light detectors, as opposed to existing LIDAR systems in which the large light amount may be distributed over relatively small sensing elements comprising a small number of light detectors, which may quickly saturate and go “blind”, thus having a significantly limited dynamic range.

[0060] The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar parts.
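The short-range saturation behavior can be illustrated with a simple one-shot detector model, a common first-order approximation for SiPM-like arrays. The model and all numbers here are assumptions for illustration, not taken from the disclosure.

```python
import math

# Hedged sketch: each binary detector fires at most once per sampling
# window, so a strong return saturates a small array but stays within the
# near-linear response of a much larger one.

def fired_detectors(num_photons, num_detectors):
    """Expected number of triggered detectors when num_photons photons are
    spread uniformly over num_detectors one-shot detectors."""
    return num_detectors * (1.0 - math.exp(-num_photons / num_detectors))

# A strong close-range return of 1000 photons:
small = fired_detectors(1000, 64)     # nearly all 64 fire -> "blind"
large = fired_detectors(1000, 6400)   # response still near-linear
print(round(small), round(large))
```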
[0061] While illustrative embodiments are described herein, it is to be understood that these are not necessarily limited in their application to the details of construction and/or arrangement of the components, systems, or methods, since modifications, adaptations and other implementations are possible. For example, as may be appreciated by one skilled in the art, substitutions, additions, and/or modifications may be made to the components illustrated in the drawings, and the illustrative methods described herein may be modified by substituting, reordering, removing, or adding steps to the disclosed methods.
[0062] Accordingly, the following detailed description is not limited to the disclosed embodiments and examples. Instead, the proper scope is defined by the appended claims.
[0063] Referring now to the drawings, FIG. 1A and FIG. 1B illustrate an exemplary LIDAR system 100, in accordance with embodiments of the present disclosure. The LIDAR system 100 may be used, for example, in one or more ground autonomous or semi-autonomous vehicles 110, for example, road-vehicles such as cars, buses, vans, trucks, and any other terrestrial vehicle. Autonomous ground vehicles 110 equipped with the LIDAR system 100 may scan their environment and drive to a destination with reduced, and potentially without, human intervention. In another example, the LIDAR system 100 may be used in one or more autonomous/semi-autonomous aerial-vehicles such as, for example, Unmanned Aerial Vehicles (UAV), drones, quadcopters, and/or any other airborne vehicle or device. In another example, the LIDAR system 100 may be used in one or more autonomous or semi-autonomous water vessels such as, for example, boats, ships, hovercrafts, submarines, and/or the like. Autonomous aerial-vehicles and watercrafts with the LIDAR system 100 may scan their environment and navigate to a destination autonomously or under remote human operation.

[0064] It should be noted that the LIDAR system 100 or any of its components may be used together with any of the example embodiments and methods disclosed herein. Moreover, while aspects of the LIDAR system 100 may be described herein with respect to an exemplary vehicle-based LIDAR platform, the LIDAR system 100, any of its components, or any of the processes described herein may be applicable to one or more LIDAR systems of other platform types. As such, LIDAR systems such as the LIDAR system 100 may be installed, mounted, integrated, and/or otherwise deployed, in dynamic and/or stationary deployment, for one or more other applications, for example, a surveillance system, a security system, a monitoring system, and/or the like. Such LIDAR systems 100 may be configured to scan their environment in order to detect objects according to their respective application needs, criteria, requirements, and/or definitions.
[0065] The LIDAR system 100 may be configured to detect tangible objects in an environment of the LIDAR system 100, specifically in a scene contained in an FOV 120 of the LIDAR system 100, based on reflected light, and more specifically, based on light projected by the LIDAR system 100 and reflected by objects in the FOV 120. The scene may include some or all objects within the FOV 120, in their relative positions and in their current states, for example, ground elements (e.g., earth, roads, grass, sidewalks, road surface marking, etc.), sky, man-made objects (e.g., vehicles, buildings, signs, etc.), vegetation, people, animals, light projecting elements (e.g., flashlights, sun, other LIDAR systems, etc.), and/or the like. An object refers to a finite composition of matter that may reflect light from at least a portion thereof. An object may be at least partially solid (e.g., car, tree, etc.), at least partially liquid (e.g., puddles on a road, rain, etc.), at least partly gaseous (e.g., fumes, clouds, etc.), made of a multitude of distinct particles (e.g., sandstorm, fog, spray, etc.), and/or a combination thereof. An object may be of one or more scales of magnitude, such as, for example, ~1 millimeter (mm), ~5 mm, ~10 mm, ~50 mm, ~100 mm, ~500 mm, ~1 meter (m), ~5 m, ~10 m, ~50 m, ~100 m, and so on.
[0066] The LIDAR system 100 may be configured to detect objects by scanning the environment of the LIDAR system 100, i.e., illuminating at least part of the FOV 120 of the LIDAR system 100 and collecting and/or receiving light reflected from the illuminated part(s) of the FOV 120. The LIDAR system 100 may scan the FOV 120 and/or part thereof in a plurality of scanning cycles (frames) conducted at one or more frequencies and/or frame rates, for example, 5 Frames per Second (fps), 10 fps, 15 fps, 20 fps, and/or the like.
[0067] The LIDAR system 100 may apply one or more scanning mechanisms, methods, and/or implementations for scanning the environment. For example, the LIDAR system 100 may scan the environment by moving and/or pivoting one or more deflectors of the LIDAR system 100 to deflect light emitted from the LIDAR system 100 in differing directions toward distinct parts of the FOV 120. In another example, the LIDAR system 100 may scan the environment by changing the positioning (i.e., location and/or orientation) of one or more sensors associated with the LIDAR system 100 with respect to the FOV 120. In another example, the LIDAR system 100 may scan the environment by changing the positioning (i.e., location and/or orientation) of one or more light sources associated with the LIDAR system 100 with respect to the FOV 120. In another example, the LIDAR system 100 may scan the environment by changing the positioning of one or more sensors and one or more light sources associated with the LIDAR system 100 with respect to the FOV 120.
[0068] The FOV 120 scanned by the LIDAR system 100, i.e., the environment in which the LIDAR system 100 may detect objects, may include an extent of the observable environment of the LIDAR system 100 in which objects may be detected. The extent of the FOV 120 may be defined by a horizontal range (e.g., 50°, 120°, 360°, etc.) and a vertical elevation (e.g., ±20°, +40° to -20°, ±90°, 0° to -90°, etc.). The FOV 120 may also be defined within a certain range, for example, up to a certain depth/distance (e.g., 100 m, 200 m, 300 m, etc.) and up to a certain vertical distance (e.g., 10 m, 25 m, 50 m, etc.).
[0069] The FOV 120 may be divided (segmented) into a plurality of portions 122 (segments), also designated FOV pixels, having uniform and/or different sizes. In some embodiments, as illustrated in FIG. 1A, the FOV 120 may be divided into a plurality of portions 122 arranged in the form of a two-dimensional array of rows and columns. At any given time during a scan of the FOV 120, the LIDAR system 100 may scan an instantaneous FOV which comprises a respective portion 122. Obviously, the portion 122 scanned during each instantaneous FOV may be narrower than the entire FOV 120, and the LIDAR system 100 may thus move the instantaneous FOV within the FOV 120 in order to scan the entire FOV 120.
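The row-and-column division of the FOV into portions 122 described above can be sketched as a raster iteration over the instantaneous FOVs. The grid dimensions and function name are illustrative assumptions.

```python
# Illustrative sketch: an FOV divided into a 2D grid of portions (FOV
# pixels), scanned one instantaneous FOV at a time in raster order.

def scan_order(rows, cols):
    """Yield the (row, col) index of each portion 122 in scan order."""
    for r in range(rows):
        for c in range(cols):
            yield (r, c)

# A small 2x3 grid of portions covering the whole FOV:
portions = list(scan_order(rows=2, cols=3))
print(portions)  # -> [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
```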
[0070] Detecting an object may broadly refer to determining an existence of the object in the FOV 120 of the LIDAR system 100 which reflects light emitted by the LIDAR system 100 toward one or more sensors, interchangeably designated detectors, associated with the LIDAR system 100. Additionally, or alternatively, detecting an object may refer to determining one or more physical parameters relating to the object and generating information indicative of the determined physical parameters, for example, a distance between the object and one or more other objects (e.g., the LIDAR system 100, another object in the FOV 120, ground (earth), etc.), a kinematic parameter of the object (e.g., relative velocity, absolute velocity, movement direction, expansion of the object, etc.), a reflectivity (level) of the object, and/or the like.
[0071] The LIDAR system 100 may detect objects by processing detection results based on sensory data received from the sensor(s) which may comprise temporal information indicative of a period of time between the emission of a light signal by the light source(s) of the LIDAR system 100 and the time of detection of reflected light by the sensor(s) associated with the LIDAR system 100.
[0072] The LIDAR system 100 may employ one or more detection technologies. For example, the LIDAR system 100 may employ Time of Flight (ToF) detection, where the light signal emitted by the LIDAR system 100 may comprise one or more short pulses whose rise and/or fall time may be detected in reception of the emitted light after being reflected by one or more objects in the FOV 120. In another example, the LIDAR system 100 may employ continuous wave detection, for example, Frequency Modulated Continuous Wave (FMCW), phase-shift continuous wave, and/or the like.
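The ToF principle above reduces to a one-line computation: the round-trip delay between pulse emission and detection, multiplied by the speed of light, covers twice the target distance. The timing values below are illustrative.

```python
# Minimal time-of-flight sketch: distance from round-trip pulse delay.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(emit_time_s, detect_time_s):
    """The round trip covers twice the target distance."""
    return C * (detect_time_s - emit_time_s) / 2.0

# A reflection detected ~667 ns after emission -> target ~100 m away.
print(round(tof_distance(0.0, 667e-9), 1))
```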
[0073] For various reasons, the LIDAR system 100 may detect only part of one or more objects present in the FOV 120. For example, light may be reflected from only some sides of an object; typically, only the side facing the LIDAR system 100 may be detected by the LIDAR system 100. In another example, light emitted by the LIDAR system 100 may be projected on only part of an object, for example, a laser beam projected onto a road or a building. In another example, an object may be partly blocked by another object between the LIDAR system 100 and the detected object. In another example, ambient light and/or one or more other interferences may interfere with detection of one or more portions of an object.
[0074] Optionally, detecting an object by the LIDAR system 100 may further refer to identifying the object, for example, classifying a type of the object (e.g., car, person, tree, road, traffic light, etc.), recognizing a specific object (e.g., natural site, structure, monument, etc.), determining a text value of the object (e.g., license plate number, road sign markings, etc.), determining a composition of the object (e.g., solid, liquid, transparent, semitransparent, etc.), and/or the like.
[0075] The LIDAR system 100 may comprise a projecting unit 102, a scanning unit 104, a sensing unit 106, and a processing unit 108. According to some embodiments, the LIDAR system 100 may be mountable on a vehicle 110.

[0076] Optionally, the LIDAR system 100 may include one or more optical windows 124 for transmitting outgoing light projected toward the FOV 120 and/or for receiving incoming light reflected from objects in the FOV 120. The optical window(s) 124, for example, an opening, a flat window, a lens, or any other type of optical window, may be used for one or more purposes, for example, collimating the projected light, focusing the reflected light, and/or the like.
[0077] The LIDAR system 100 may be contained in a single housing and/or divided among a plurality of housings connected to each other via one or more communication channels, for example, a wired channel, a fiber optics cable, a wireless connection (e.g., RF connection), and/or any combination thereof deployed between the housings. For example, the light related components of the LIDAR system 100, i.e., the projecting unit 102, the scanning unit 104, and the sensing unit 106, may be deployed and/or contained in a first housing while the processing unit 108 may be deployed and/or contained in a second housing. In such a case, the processing unit 108 may communicate with the projecting unit 102, the scanning unit 104, and/or the sensing unit 106 via the communication channel(s) connecting the separate housings for controlling the scanning unit 104 and/or for receiving from the sensing unit 106 sensory information indicative of light reflected from the scanned scene.
[0078] The LIDAR system 100 may employ one or more designs, architectures, and/or configurations for the optical path of outbound light (transmission path TX) projected by the projecting unit 102 toward the scene, i.e., to the FOV 120 of the LIDAR system 100, and of inbound light (reception path RX) reflected from objects in the scene and directed to the sensing unit 106. For example, the LIDAR system 100 may employ a bi-static configuration in which the outbound light projected from the projecting unit 102 and exiting the LIDAR system 100 and the inbound light reflected from the scene and entering the LIDAR system 100 pass through substantially different optical paths comprising optical components, for example, windows, apertures, lenses, mirrors, beam splitters, and/or the like. In another example, as shown in FIG. 1B, the LIDAR system 100 may employ a monostatic configuration in which the outbound light and the inbound light share substantially the same optical path, i.e., the light 204 projected by the projecting unit 102 and exiting from the LIDAR system 100 and the light 206 reflected from the scene and entering the LIDAR system 100 pass through substantially similar optical paths and share most if not all of the optical components on the shared optical path.
[0079] The projecting unit 102 may include one or more light sources 112 configured to emit light in one or more light forms, for example, a laser diode, a solid-state laser, a high-power laser, an edge emitting laser, a Vertical-Cavity Surface-Emitting Laser (VCSEL), an External Cavity Diode Laser (ECDL), a Distributed Bragg Reflector (DBR) laser, a laser array, and/or the like.
[0080] The light source(s) 112 may be configured and/or operated, for example, by the processing unit 108, to emit light according to one or more light emission patterns defined by one or more light emission parameters, for example, lighting mode (e.g., pulsed, Continuous Wave (CW), quasi-CW, etc.), light format (e.g., angular dispersion, polarization, etc.), spectral range (wavelength), energy/power (e.g., average power, maximum power, power intensity, instantaneous power, etc.), timing (e.g., pulse width (duration), pulse repetition rate, pulse sequence, pulse duty cycle, etc.), and/or the like. Optionally, the projecting unit 102 may further comprise one or more optical elements associated with one or more of the light source(s) 112, for example, a lens, an aperture, a window, a light filter, a waveplate, a beam splitter, and/or the like for adjusting the light emitted by the light source(s) 112, for example, collimating, focusing, polarizing, and/or the like the emitted light beams.
[0081] Moreover, the projecting unit 102 may include a plurality of light sources 112 configured to emit a plurality of light beams, typically simultaneously, such that each of the light beams illuminates a respective portion, section, and/or segment of the instantaneous FOV scanned by the LIDAR system 100 at any given moment.
[0082] The scanning unit 104 may be configured to illuminate the FOV 120 and/or part thereof with projected light 204 by projecting the light emitted from the light source(s) 112 toward the scene thus serving as a steering element on the outbound path, i.e., the transmission path TX, of the LIDAR system 100 for directing the light emitted by the light source(s) 112 toward the scene.
[0083] As described hereinbefore, the scanning unit 104 may be further used on the inbound path of the LIDAR system 100, i.e., the reception path RX, for directing the light (photons) 206 reflected from one or more objects in at least part of the FOV 120 toward the sensing unit 106. The scanning unit 104 may therefore optionally include one or more optical elements, for example, a lens, a telephoto lens, a prism, and/or the like configured to direct the reflected light 206 toward the sensing unit 106.
[0084] Moreover, since the projecting unit 102 may be configured to emit a plurality of light beams, on the transmission path TX (outbound path) of the LIDAR system 100 the scanning unit 104 may be configured to project the plurality of light beams for illuminating the FOV 120 and/or part thereof. Complementarily, on the reception path RX (inbound path) of the LIDAR system 100, the scanning unit 104 may direct the reflected light 206 from one or more objects in at least part of the FOV 120 toward the sensing unit 106.
[0085] The scanning unit 104 may include one or more light deflectors 114 configured to deflect the light from the light source(s) 112 for scanning the FOV 120. The light deflector(s) 114 may include one or more scanning mechanisms, modules, devices, and/or elements configured to cause the emitted light to deviate from its original path, for example, a mirror, a prism, a controllable lens, a mechanical mirror, a mechanical scanning polygon, an active diffraction element (e.g., controllable LCD), Risley prisms, a non-mechanical electro-optical beam steering device (such as made, for example, by Vescent), a polarization grating (such as offered, for example, by Boulder Non-Linear Systems), an Optical Phase Array (OPA), and/or the like. For example, the deflector(s) 114 may comprise one or more scanning polygons, interchangeably designated polygon scanners, having a plurality of facets, for example, three, four, five, six, and/or the like, configured as mirrors and/or prisms to deflect light projected onto the facet(s) of the polygon. In another example, the deflector(s) 114 may comprise one or more Micro Electro-Mechanical Systems (MEMS) mirrors configured to move by actuation of a plurality of benders connected to the mirror. In another example, the scanning unit 104 may include one or more non-mechanical deflectors 114, for example, a non-mechanical electro-optical beam steering device such as, for example, an OPA which does not require any moving components or internal movements for changing the deflection angles of the light but is rather controlled by steering, through phase array means, a light projection angle of the light source(s) 112 to a desired projection angle. It is noted that any discussion relating to moving or pivoting the light deflector(s) 114 is also applicable, mutatis mutandis, to controlling any type of light deflector 114 such that it changes its deflection behavior.
[0086] At any given time, i.e., at any instantaneous point in time, during each scan cycle of the FOV 120 and/or part thereof, the deflector(s) 114 may be positioned in a respective instantaneous position defining a respective location, position and/or orientation in space. In particular, each instantaneous position of the deflector(s) 114 may correspond to a respective portion 122 of the FOV 120. This means that while positioned in each of a plurality of instantaneous positions during each scan cycle of the FOV 120 and/or part thereof, the deflector(s) 114 may scan a respective one of the plurality of portions 122 of the FOV 120, i.e., project light 204 toward the respective portion 122 and/or direct light (photons) reflected from the respective portion 122 toward the sensing unit 106.
[0087] The scanning unit 104 may be configured and/or operated to scan the FOV 120 and/or part thereof, on the outbound path and/or on the inbound path, at one or more scales of scanning. For example, the scanning unit 104 may be configured to scan the entire FOV 120. In another example, the scanning unit 104 may be configured to scan one or more Regions of Interest (ROIs) which cover, for example, 10% or 25% of the FOV 120. Optionally, the scanning unit 104 may dynamically adjust the scanning scale, i.e., the scanned area, either between different scanning cycles and/or during the same scanning cycle.
[0088] Optionally, the scanning unit 104 may further comprise one or more optical elements associated with the deflector(s) 114, for example, a lens, an aperture, a window, a light filter, a waveplate, a beam splitter, and/or the like for adjusting the light emitted by the light source(s) 112 and/or for adjusting the light reflected from the scene, for example, collimating the projected light 204, focusing the reflected light 206, and/or the like.
[0089] The sensing unit 106 may include one or more sensors 116 configured to receive and sample light reflected from the surroundings of LIDAR system 100, specifically from the scene, i.e., the FOV 120, and generate reflection signals, interchangeably designated trace signals or trace data, indicative of light captured by the sensor(s) 116 which may include light reflected from one or more objects in the FOV 120. The sensor(s) 116 may include one or more devices, elements, and/or systems capable of measuring properties of electromagnetic waves, specifically light, for example, energy/power, intensity, frequency, phase, timing, duration, and/or the like and generate output signals indicative of the measured properties. The sensor(s) 116 may be configured and/or operated to sample incoming light according to one or more operation modes, for example, continuous sampling, periodic sampling, sampling according to one or more timing schemes, and/or sampling instructions.
[0090] The sensing unit 106 may include a sensor array comprising a plurality of sensors 116, wherein each of the sensors 116 corresponds to a respective pixel of a plurality of pixels mapping a portion of the FOV 120 scanned at any given moment. For example, assuming the projecting unit 102 is configured to project a plurality of light beams, each of the plurality of sensing elements 116 of the sensor array may be associated with a respective one of the plurality of light beams, i.e., each sensor 116 may be configured to receive light reflected from one or more objects in the FOV 120 illuminated by the respective light beam. In another example, assuming the projecting unit 102 is configured to project a single elongated light beam (scan line), the light reflected by one or more objects in the FOV 120 responsive to being illuminated by the elongated light beam may be divided into a plurality of portions, each directed (transmitted) to a respective one of the plurality of sensors 116.
[0091] These pixels, relating to the sensing elements 116 and thus interchangeably designated sensing pixels, may typically correspond to non-overlapping regions in the FOV 120. The sensing pixels should not be confused with the FOV pixels. Rather, each FOV pixel, which corresponds to a respective portion 122, i.e., an instantaneous FOV scanned at a certain point in time, may be mapped by the plurality of sensing pixels captured during the instantaneous point in time.
[0092] Each sensor 116 may include one or more light sensors of one or more types having differing parameters, for example, sensitivity, size, recovery time, and/or the like. The sensor(s) 116 may include a plurality of light sensors of a single type, or of multiple types selected according to their characteristics to comply with one or more detection requirements of the LIDAR system 100, for example, reliable and/or accurate detection over a span of ranges (e.g., maximum range, close range, etc.), dynamic range, temporal response, robustness against varying environmental conditions (e.g., temperature, rain, illumination, etc.), and/or the like.

[0093] For example, as seen in FIG. 1B, the sensor(s) 116, for example, a Silicon Photomultiplier (SiPM), a non-silicon photomultiplier, and/or the like, may include one or more light detectors constructed from a plurality of detecting elements 220, for example, an Avalanche Photodiode (APD), a Single Photon Avalanche Diode (SPAD), and/or the like, serving as detection elements 220 on a common silicon substrate configured for detecting photons reflected back from the FOV 120. The detecting elements 220 of each sensor 116 may typically be arranged as an array in one or more arrangements over a detection area of the sensor 116, for example, a rectangular arrangement as shown in FIG. 1B, a square arrangement, an alternating rows arrangement, and/or the like. Optionally, the detecting elements 220 may be arranged in a plurality of regions which jointly cover the detection area of the sensor 116. Each of the plurality of regions may comprise a plurality of detecting elements 220, for example, SPADs, having their outputs connected together to form a common output signal of the respective region.
[0094] Each of the light detection elements 220 is configured to cause an electric current to flow when light (photons) passes through an outer surface of the respective detection element 220.
[0095] The processing unit 108 may include one or more processors 118, homogenous or heterogeneous, comprising one or more processing nodes and/or cores optionally arranged for parallel processing, as clusters and/or as one or more multi-core processor(s). The processor(s) 118 may execute one or more software modules such as, for example, a process, a script, an application, a (device) driver, an agent, a utility, a tool, an Operating System (OS), a plug-in, an add-on, and/or the like, each comprising a plurality of program instructions stored in a non-transitory medium (program store) of the LIDAR system 100 and executed by one or more processors such as the processor(s) 118. The non-transitory medium may include, for example, persistent memory (e.g., ROM, Flash, SSD, NVRAM, etc.), volatile memory (e.g., RAM, cache, etc.), and/or the like, such as the storage 234. The processor(s) 118 may optionally integrate, utilize and/or facilitate one or more hardware elements (modules), for example, a circuit, a component, an Integrated Circuit (IC), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), a Graphics Processing Unit (GPU), an Artificial Intelligence (AI) accelerator, and/or the like. The processor(s) 118 may therefore execute one or more functional modules implemented using one or more software modules, one or more of the hardware modules, and/or a combination thereof.
[0096] The processor(s) 118 may therefore execute one or more functional modules to control functionality of the LIDAR system 100, for example, configuration, operation, coordination, and/or the like of one or more of the functional elements of the LIDAR system 100, for example, the projecting unit 102, the scanning unit 104, and/or the sensing unit 106. While the functional module(s) are executed by the processor(s) 118, for brevity and clarity, the processing unit 108 comprising the processor(s) 118 is described hereinafter to control functionality of the LIDAR system 100.
[0097] The processing unit 108 may communicate with the functional elements of the LIDAR system 100 via one or more channels, interconnects, and/or networks deployed in the LIDAR system 100, for example, a bus (e.g., PCIe, etc.), a switch fabric, a network, a vehicle network, and/or the like.
[0098] For example, the processing unit 108 may control the scanning unit 104 to scan the environment of the LIDAR system 100 according to one or more scanning schemes and/or scanning parameters, for example, extent (e.g., angular extent) of the FOV 120, extent (e.g., angular extent) of one or more regions of interest (ROI) within the FOV 120, maximal range within the FOV 120, maximal range within each ROI, maximal range within each region of non-interest, resolution (e.g., vertical angular resolution, horizontal angular resolution, etc.) within the FOV 120, resolution within each ROI, resolution within each region of non-interest, scanning mode (e.g., raster, alternating pixels, etc.), scanning speed, scanning cycle timing (e.g., cycle time, frame rate), and/or the like.
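For illustration only, the scanning-scheme parameters enumerated above may be grouped into a single configuration record. The following Python sketch is hypothetical: the ScanScheme structure, its field names, and its default values are assumptions and do not appear in the present disclosure.

```python
from dataclasses import dataclass

# Hypothetical grouping of the scanning-scheme parameters listed above.
# All names and default values are illustrative assumptions only.
@dataclass
class ScanScheme:
    fov_extent_deg: tuple[float, float] = (120.0, 25.0)  # horizontal, vertical angular extent
    max_range_m: float = 250.0                           # maximal range within the FOV
    h_resolution_deg: float = 0.1                        # horizontal angular resolution
    v_resolution_deg: float = 0.1                        # vertical angular resolution
    scan_mode: str = "raster"                            # e.g., "raster", "alternating_pixels"
    frame_rate_hz: float = 20.0                          # scanning cycle timing

scheme = ScanScheme()
# Number of FOV pixels per frame implied by the chosen extent and resolution
pixels_per_frame = (scheme.fov_extent_deg[0] / scheme.h_resolution_deg) * \
                   (scheme.fov_extent_deg[1] / scheme.v_resolution_deg)
```

An ROI could then be represented, for instance, as a second such record with finer resolution fields applied only to that region.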
[0099] In another example, the processor(s) 118 may be configured to coordinate operation of the light source(s) 112 with movement of the deflector(s) 114 for scanning the FOV 120 and/or part thereof. In another example, the processor(s) 118 may be configured to configure and/or operate the light source(s) 112 to project light according to one or more light emission patterns. In another example, the processor(s) 118 may be configured to coordinate operation of the sensor(s) 116 with movement of the deflector(s) 114 to activate one or more selected sensor(s) 116 and/or pixels according to the scanned portion of the FOV 120.
[0100] In another example, the processor(s) 118 may be configured to receive the reflection signals generated by the sensor(s) 116, which are indicative of light captured by the sensor(s) 116, which may include light reflected from the scene, specifically light reflected from one or more objects in the scanned FOV 120 and/or part thereof. In another example, the processor(s) 118 may be configured to analyze the trace signals (reflection signals) received from the sensor(s) 116 in order to detect one or more objects, conditions, and/or the like in the environment of the LIDAR system 100, specifically in the scanned FOV 120 and/or part thereof. Analyzing the trace data indicative of the reflected light 206 may include, for example, determining a ToF of the reflected light 206 based on timing of the reflection signals, specifically with respect to the transmission timing of the projected light 204, for example, light pulses, corresponding to the respective reflected light 206. In another example, analyzing the trace data may include determining a power of the reflected light, for example, average power across an entire return pulse, and a photon distribution/signal may be determined over the return pulse period ("pulse shape").
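The ToF analysis described above reduces to simple arithmetic: the distance to a reflecting object is half the round-trip time multiplied by the speed of light. The following Python sketch is illustrative only; the function names are assumptions and not part of the disclosure.

```python
C = 299_792_458.0  # speed of light in vacuum, meters per second

def tof_to_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Distance to the reflecting object from emission/reception timing of a pulse."""
    tof = t_receive_s - t_emit_s  # round-trip time of flight, seconds
    return C * tof / 2.0          # halved: the light travels out and back

def average_return_power(samples_w: list[float]) -> float:
    """Average power across an entire return pulse (used in 'pulse shape' analysis)."""
    return sum(samples_w) / len(samples_w)

# Example: a reflection arriving 200 nanoseconds after emission lies ~30 m away
distance_m = tof_to_distance(0.0, 200e-9)
```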
[0101] Reference is now made to FIG. 2, which illustrates graph charts of exemplary light emission patterns projected by a LIDAR system such as the LIDAR system 100, in accordance with embodiments of the present disclosure. Graph charts 202, 204, 206, and 208 depict several light emission patterns which may be emitted by one or more light sources such as the light source 112 of a projecting unit such as the projecting unit 102 of the LIDAR system 100. In particular, the light source(s) 112 may emit light according to the light patterns under control of a processing unit such as the processing unit 108 of the LIDAR system 100. The graph charts 202, 204, 206, and 208, expressing the light emission patterns as power (intensity) over time, illustrate emission patterns of light projected in a single frame (frame-time) for a single portion such as the portion 122 of an FOV such as the FOV 120 of the LIDAR system 100 which, as discussed herein before, is associated with an instantaneous position of one or more deflectors such as the deflector 114 of the LIDAR system 100.
[0102] As seen in graph chart 202, the processing unit 108 may control the light source(s) 112, for example, a pulsed-light light source, to project toward the portion 122 one or more initial pulses according to an initial light emission pattern, also designated pilot pulses. The processing unit 108 may analyze pilot information received from one or more sensors, such as the sensor 116, which is indicative of light reflections associated with the pilot pulses and, based on the analysis, may determine one or more light emission patterns according to which the light source(s) 112 may transmit subsequent light pulses during the frame time of the present frame and/or during one or more subsequent frames. As seen in graph chart 204, the processing unit 108 may control the light source(s) 112 to project toward the portion 122 light pulses according to a light emission pattern defining a plurality of pulses having gradually increasing intensities. As seen in graph chart 206, the processing unit 108 may control the light source(s) 112 to project toward the portion 122 light pulses according to different light emission patterns in different frames, i.e., in different scanning cycles, for example, a different number of pulses, pulses having different pulse duration, pulses having different intensity, and/or the like. As seen in graph chart 208, the processing unit 108 may control the light source(s) 112, for example, a continuous-wave light source (e.g., FMCW), to project toward the portion 122 light according to one or more light emission patterns. Such an exemplary light emission pattern may include, for example, projecting continuous light during the entire frame time. In another example, the light emission pattern may define one or more discontinuities, i.e., time periods during which the light source(s) 112 do not emit light.
In another example, the light emission pattern may define emission of continuous light having a constant intensity, or alternatively emission of continuous light having varying intensity over time.
[0103] The processing unit 108 may be configured to analyze the trace data, i.e., the reflection signals received from the sensor(s) 116 which are indicative of light reflected from the scene, including at least part of the light emitted by the LIDAR system 100. Based on analysis of the trace data, the processing unit 108 may extract depth data relating to the scene, i.e., the FOV 120 and/or part thereof, and may derive and/or determine one or more attributes of one or more objects detected in the scene based on the light reflected from these objects. Such object attributes may include, for example, a distance between the LIDAR system 100 and the respective object, a reflectivity of the respective object, a spatial location of the respective object, for example, with respect to one or more coordinate systems (e.g., Cartesian (X, Y, Z), Polar (r, θ, φ), etc.), and/or the like. Based on the trace data coupled with the scanning scheme of the scanning unit 104, i.e., the illuminated portion 122 of the FOV 120 to which the trace data relates, the processing unit 108 may therefore map the reflecting objects in the environment of the LIDAR system 100.
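As a worked example of the coordinate systems mentioned above, a spatial location reported in polar form (r, θ, φ) may be converted to Cartesian (X, Y, Z). This Python sketch assumes the physics convention for spherical coordinates, which the disclosure does not fix; it is illustrative only.

```python
import math

def polar_to_cartesian(r: float, theta: float, phi: float) -> tuple[float, float, float]:
    """Convert a polar (spherical) location (r, theta, phi) to Cartesian (x, y, z).
    theta is taken as the polar angle from the Z axis, phi as the azimuth
    (an assumed convention, not specified in the disclosure)."""
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return (x, y, z)

# An object 10 m away, on the horizontal plane (theta = 90 deg), straight ahead
location = polar_to_cartesian(10.0, math.pi / 2, 0.0)
```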
[0104] The processing unit 108 may combine, join, merge, fuse, and/or otherwise aggregate information, for example, depth data pertaining to different objects and/or different features of objects detected in the scene. For example, the processing unit 108 may be configured to generate and/or reconstruct one or more 3D models, interchangeably designated depth maps herein, of the environment of the LIDAR system 100, i.e., of objects scanned in the scene included in the FOV 120 and/or part thereof. The data resolution associated with the depth map representation(s) of the FOV 120, which may depend on the operational parameters of the LIDAR system 100, may be defined by horizontal and/or vertical resolution, for example, 0.1° x 0.1°, 0.3° x 0.3°, 0.1° x 0.5° of the FOV 120, and/or the like.
[0105] The processing unit 108 may generate depth map(s) of one or more forms, formats, and/or types, for example, a point cloud model, a polygon mesh, a depth image holding depth information for each pixel of a 2D image and/or array, and/or any other type of 3D model of the scene. A point cloud model (also known as a point cloud) may include a set of spatially located data points which represent the scanned scene in some coordinate system, i.e., having identifiable locations in a space described by a coordinate system, for example, Cartesian, Polar, and/or the like. Each point in the point cloud may be dimensionless, or a miniature cellular space, whose location may be described by the point cloud model using the set of coordinates.
[0106] The point cloud may further include additional information for one or more and optionally all of its points, for example, reflectivity (e.g., energy of reflected light, etc.), color information, angle information, and/or the like. A polygon mesh or triangle mesh may include, among other data, a set of vertices, edges and faces that define the shape of one or more 3D objects (polyhedral object) detected in the scanned scene. The processing unit 108 may further generate a sequence of depth maps over time, i.e., a temporal sequence of depth maps, for example, each depth map in the sequence may be associated with a respective scanning cycle (frame). In another example, the processing unit 108 may update one or more depth maps over time based on depth data received and analyzed in each frame.
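The point cloud and frame-sequence concepts above may be sketched, for illustration only, as a minimal data structure. The class and field names below are hypothetical assumptions and are not taken from the disclosure.

```python
from dataclasses import dataclass, field

# Minimal, hypothetical sketch of a point-cloud depth map as described above.
@dataclass
class CloudPoint:
    x: float
    y: float
    z: float
    reflectivity: float = 0.0  # optional per-point attribute (e.g., reflected energy)

@dataclass
class PointCloudFrame:
    scan_cycle: int                             # the frame (scanning cycle) this map belongs to
    points: list = field(default_factory=list)  # the spatially located data points

# A temporal sequence of depth maps: one frame per scanning cycle
sequence = [PointCloudFrame(scan_cycle=i) for i in range(3)]
sequence[0].points.append(CloudPoint(1.0, 2.0, 30.0, reflectivity=0.4))
```

Updating a depth map over time, as described above, would then amount to mutating an existing frame's point list rather than appending a new frame.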
[0107] Optionally, the processing unit 108 may control the light projection scheme of the light emitted to the environment of the LIDAR system 100, for example, adapt and/or adjust the light emission pattern and/or the scanning pattern, to improve mapping of the environment of the LIDAR system 100. For example, the processing unit 108 may control the light projection scheme so as to illuminate different portions 122 across the FOV 120 differently in order to differentiate between reflected light relating to different portions 122. In another example, the processing unit 108 may apply a first light projection scheme for one or more first areas in the FOV 120, for example, an ROI, and a second light projection scheme for one or more other parts of the FOV 120. In another example, the processing unit 108 may adjust the light projection scheme between scanning cycles (frames) such that a different light projection scheme may be applied in different frames. In another example, the processing unit 108 may adjust the light projection scheme based on detection of reflected light, either during the same scanning cycle (e.g., following the initial emission) and/or between different frames (e.g., successive frames), thus making the LIDAR system 100 extremely dynamic.
[0108] Optionally, the LIDAR system 100 may include a communication interface 214 comprising one or more wired and/or wireless communication channels and/or network links, for example, PCIe, Local Area Network (LAN), Gigabit Multimedia Serial Link (GMSL), vehicle network, InfiniBand, wireless LAN (WLAN), cellular network, and/or the like. Via the communication interface 214, the LIDAR system 100, specifically the processing unit 108, may transfer data and/or communicate with one or more external systems, for example, a host system 210, interchangeably designated host herein.
[0109] The host 210 may include any computing environment comprising one or more processors 218, such as the processor(s) 118, which may interface with the LIDAR system 100. For example, the host 210 may include one or more systems deployed and/or located in the vehicle 110 such as, for example, an ADAS, a vehicle control system, a vehicle safety system, a client device (e.g., laptop, smartphone, etc.), and/or the like. In another example, the host 210 may include one or more remote systems, for example, a security system, a surveillance system, a traffic control system, an urban modelling system, and/or other systems configured to monitor their surroundings. In another example, the host 210 may include one or more remote cloud systems, services, and/or platforms configured to collect data from vehicles 110 for one or more monitoring, analysis, and/or control applications. In another example, the host 210 may include one or more external systems, for example, a testing system, a monitoring system, a calibration system, and/or the like.
[0110] The host 210 may be configured to interact and communicate with the LIDAR system 100 for one or more purposes, and/or actions, for example, configure the LIDAR system 100, control the LIDAR system 100, analyze data received from the LIDAR system 100, and/or the like. For example, the host 210 may generate one or more depth maps and/or 3D models based on trace data, and/or depth data received from the LIDAR system 100. In another example, the host 210 may configure one or more operation modes, and/or parameters of the LIDAR system 100, for example, define an ROI, define an illumination pattern, define a scanning pattern, and/or the like. In another example, the host 210 may dynamically adjust in real-time one or more operation modes and/or parameters of the LIDAR system 100.
[0111] According to some embodiments disclosed herein, the LIDAR system 100 may include a micro-optic array comprising a plurality of micro-optic elements deployed in the optical path of the LIDAR system 100 and configured for transmitting the reflected light 206 to a sensor array of the LIDAR system 100 comprising a plurality of the sensing elements 116. In particular, each of the plurality of micro-optic elements may be associated with a respective one of the plurality of sensing elements 116 and configured for transmitting a respective portion of the reflected light 206 to the respective associated sensing element 116.
[0112] Reference is now made to FIG. 3, which is a schematic illustration of an exemplary micro-optic array of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure.
[0113] An exemplary imaging system 300 of a LIDAR system such as the LIDAR system 100 may utilize one or more detection channels of the LIDAR system 100, for example, a primary object detector, a short range detector, and/or the like. Short range detection is typically characterized by high light levels (amounts of light) reflected from close-by objects (e.g., at distances of up to 10, 20, 30, 40, 50, 60 meters, etc.) illuminated by the LIDAR system 100, which may saturate the sensors 116 of the LIDAR system 100.
[0114] The imaging system 300 may comprise a focusing unit 302 configured to receive light such as the reflected light 206 reflected from a scene, specifically from at least part of an FOV such as the FOV 120 of the LIDAR system 100, by one or more objects illuminated with light such as the light 204 projected by one or more light sources such as the light source 112 of the LIDAR system 100. In particular, the focusing unit 302 may be configured to direct the reflected light 206 toward a sensor array 316 comprising a plurality of sensing elements such as the sensor 116.
[0115] The focusing unit 302 may include one or more optical components, for example, a lens, a telephoto lens, a prism, and/or the like, configured to focus the reflected light 206 on a focal plane 304 of the focusing unit 302. In some exemplary embodiments, the focusing unit 302 may be part of a scanning unit such as the scanning unit 104 of the LIDAR system 100 configured to scan the FOV 120 and/or part thereof with light such as the projected light 204 projected by the light source(s) 112 on the outbound path and, on the inbound path, direct the reflected light 206 toward the sensor array 316. In some exemplary embodiments, the focusing unit 302 may include the scanning unit 104 and/or part thereof.
[0116] The imaging system 300 may further include a micro-optic array 320 comprising a plurality of micro-optic elements 310 configured to transmit the reflected light 206 to the sensor array 316. Each of the micro-optic elements 310 may be associated with a respective sensing element 116 of the sensor array 316. Moreover, the reflected light 206 may comprise a plurality of portions and each micro-optic element 310 may be configured to transmit a respective portion of the reflected light 206 to the respective associated sensing element 116. Each of the sensing elements 116 may therefore correspond to a respective pixel of a plurality of pixels mapping a portion of the FOV 120 scanned at any given moment.
[0117] For example, a projecting unit of the LIDAR system 100 such as the projecting unit 102 may comprise one or more light sources 112 configured to project (emit) a plurality of light beams, meaning that the projected light 204 comprises a plurality of light beams. In such case, each of the plurality of portions of the reflected light 206 may correspond to a respective one of the light beams, meaning that the respective portion comprises light reflected by one or more objects in the scanned FOV 120 in response to being illuminated by the respective light beam. As such, each of the projected light beams may be associated with a respective sensing element 116 which may receive, via its associated micro-optic element 310, at least some light reflected from one or more objects in the FOV illuminated by the respective light beam. Moreover, as described herein before, each of the sensing elements 116 of the sensor array 316 may constitute a respective pixel in the image map (range or depth map) created based on the reflection data (trace data) generated by each of the sensing elements 116 during one or more scanning cycles of the FOV 120 by the LIDAR system 100, which is indicative of the light captured by the respective sensing element 116 via its associated micro-optic element 310.
[0118] In another example, the projecting unit 102 may comprise one or more light sources 112 configured to project (emit) a single wide and/or elongated light beam (scan line). In such case, the reflected light 206 reflected by one or more objects in the FOV 120 responsive to being illuminated by the elongated light beam may be divided (segmented) into the plurality of portions, for example, by the focusing unit 302 and/or the scanning unit 104.
[0119] Each of the micro-optic elements 310 may include a front end, i.e., a light entrance through which the respective micro-optic element 310 may receive light, for example, the respective portion of the reflected light 206 received from the focusing unit 302, and a rear end, i.e., a light exit through which the respective portion of the reflected light 206 is transmitted to the respective associated sensing element 116.
[0120] In order to transmit the reflected light 206 to the sensor array 316, the plurality of micro-optic elements 310 may be disposed on the optical path between the focusing unit 302 and the sensor array 316 such that each of the micro-optic elements 310 may be optically coupled, i.e., in optical communication, to its respective associated sensing element 116. In particular, the micro-optic elements 310 may be disposed to have their light entrance (surface) coinciding with the focal plane 304 of the focusing unit 302 and their light exit oriented, positioned, and/or shaped to transmit the reflected light 206 toward a detector plane 306 coincident with a sensing surface of the sensing elements 116.
[0121] The light entrance may comprise an input surface coincident with the focal plane 304 through which the light received from the focusing unit 302 may enter the respective micro-optic element 310. The light input surface may be flat, curved, jagged, bent, and/or a combination thereof. In some embodiments, the light exit of each micro-optic element 310 may comprise an exit surface coincident with the sensing surface of the associated sensing element 116 through which light from the light entrance may be transmitted to the sensing element 116. However, in some embodiments, each micro-optic element 310 may include a void such that light is transferred from the light entrance of the respective micro-optic element 310 through the void to the light exit of the respective micro-optic element 310.
[0122] The light entrance and the light exit of each of the micro-optic elements 310 may be transparent to the reflected light 206 such that the reflected light 206 may enter and exit the micro-optic elements 310. For example, assuming the reflected light 206 is in a certain wavelength range (spectral range), for example, between 650 nanometers (nm) and 1150 nm, the micro-optic elements 310 may be configured and/or adapted such that their light entrance and light exit surfaces are transparent to light in the certain wavelength range.
[0123] Each of the plurality of micro-optic elements 310 may include a light blocking exterior disposed between the light entrance and the light exit of the respective micro-optic element 310. The light blocking exterior may be configured to prevent transmission of the reflected light 206. Specifically, the light blocking exterior of each micro-optic element 310 may be configured to prevent transmission of external light, i.e., light incident on the micro-optic element 310 from outside, toward the respective sensing element 116 associated with the respective micro-optic element 310. The light blocking exterior may block, i.e., prevent transmission of, light originating from one or more sources, for example, stray light travelling inside the LIDAR system 100, crosstalk, i.e., light propagating between micro-optic elements 310, for example, adjacent micro-optic elements 310, noise, and/or the like.
[0124] The light blocking exterior may employ one or more structures, architectures, compositions, and/or implementations for preventing transmission of external light toward the sensing elements 116. For example, the light blocking exterior may be configured to absorb incident light, i.e., light incident (impinging) on each surface of the light blocking exterior. For example, an exemplary absorptive light blocking exterior may be constructed of one or more materials configured to absorb light in one or more wavelengths and/or wavelength ranges (spectral ranges), specifically wavelengths of the light used by the LIDAR system 100, for example, a wavelength range between 650 nm and 1150 nm, or specifically about 905 nm. In another example, the light blocking exterior of the micro-optic elements 310 may be configured to reflect and/or deflect the incident light away from the sensing elements 116. For example, an exemplary reflective light blocking exterior may be shaped to have a geometric structure configured to reflect the incident light, optionally toward one or more light traps configured to trap the light reflected from the light blocking exterior.
[0125] The light blocking exterior of each micro-optic element 310 may be further configured to increase reflection of internal light within the respective micro-optic element 310 in order to increase the energy of light transmitted toward the respective associated sensing element 116. For example, the light blocking exterior may be geometrically shaped to have internal reflective surfaces adapted to reflect light incident on the internal reflective surfaces in order to direct the light toward the associated sensing element 116. For example, each micro-optic element 310 may be shaped and/or constructed such that its light blocking exterior may form an angle in a certain range, for example, 60-90 degrees, with the light exit of the respective micro-optic element 310, such that light transmitted from the light entrance may hit the light blocking exterior at a large angle with respect to the normal to the surface of the light blocking exterior and be reflected toward the light exit. In another example, the light blocking exterior may be constructed of one or more materials having a refractive index selected and/or adapted to prevent refraction and transmission of the internal light outside of the micro-optic element 310 and reflect it back toward the associated sensing element 116. In particular, each micro-optic element 310 may be shaped to have the angle between its light blocking exterior and its light exit in the certain range and be constructed of one or more materials having a selected refractive index which, combined with this base angle, may increase reflection of internal light beams toward the light exit.
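The refractive-index mechanism described above is total internal reflection: light hitting the blocking exterior from inside stays inside when its angle of incidence, measured from the surface normal, exceeds the critical angle given by Snell's law. The following Python sketch illustrates the relation; the specific refractive index value is an assumed example, not a value from the disclosure.

```python
import math

def critical_angle_deg(n_inside: float, n_outside: float = 1.0) -> float:
    """Critical angle (degrees) above which internal light is totally reflected,
    from Snell's law: sin(theta_c) = n_outside / n_inside (requires n_inside > n_outside)."""
    return math.degrees(math.asin(n_outside / n_inside))

# e.g., an assumed polymer micro-optic (n ~ 1.5) surrounded by air:
theta_c = critical_angle_deg(1.5)  # internal rays beyond ~41.8 deg are reflected back
```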
[0126] An exemplary imaging system such as the imaging system 300, configured to facilitate and/or support one or more of the detection channels of the LIDAR system 100, for example, a short range detection channel, may be characterized by some exemplary properties. For example, the focal length of the focusing unit 302 may be in a range between 10-30 mm, for example, 15 mm, 20 mm, and/or the like. An FOV of the focusing unit 302 may be, for example, about 7 degrees. An FOV per pixel of the imaging system 300 may be, for example, in a range between 0.4-0.7 degrees. The area of the light entrance surface of each micro-optic element 310 may be in a range of 0.03-0.3 mm², for example, 0.11 mm², and/or the like, and the area of the light exit surface of each micro-optic element 310 may be in a range between 0.5-1.5 mm², for example, 0.7 mm², 1.1 mm², and/or the like.
[0127] This structure of each micro-optic element 310 may significantly reduce, and potentially completely prevent, transmission of light toward the respective associated sensing element 116, other than the portion of reflected light 206 entering the respective micro-optic element 310 from its light entrance. Reducing, and moreover preventing, transmission of undesired light, i.e., light other than the portion of reflected light 206 entering each micro-optic element 310 through its respective light entrance, toward the array of sensing elements 116, may significantly increase the light sensing performance of each sensing element 116 as it may capture only, or at least mainly, the respective portion of reflected light 206 directed to the respective sensing element 116 via the respective associated micro-optic element 310 while noise, crosstalk, and/or stray light are significantly blocked. Increasing the light detection performance of the sensing elements 116 may significantly increase the detection performance of the LIDAR system 100, for example, resolution, accuracy, reliability and/or robustness.
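The exemplary numbers in paragraph [0126] above are mutually consistent under simple first-order optics: the angular FOV subtended by one pixel is set by the light entrance width and the focal length of the focusing unit. The following Python sketch illustrates this relation; the specific entrance width used is an assumed value, not one stated in the disclosure.

```python
import math

def fov_per_pixel_deg(entrance_width_mm: float, focal_length_mm: float) -> float:
    """Angular FOV mapped onto one pixel by a focusing unit of the given focal length."""
    return math.degrees(2.0 * math.atan(entrance_width_mm / (2.0 * focal_length_mm)))

# An assumed ~0.17 mm entrance width (~0.03 mm^2 area) with a 20 mm focal length
ifov = fov_per_pixel_deg(0.17, 20.0)  # within the stated 0.4-0.7 degree range
```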
[0128] As described herein before, each sensing element 116 may comprise an array of light detecting elements such as the light detecting elements 220, for example, APDs, SPADs, and/or the like. While each of the light detecting elements 220 may trigger upon reception (detection) of one or more photons, the outputs of the array of light detecting elements 220 of each sensing element 116 may be summed together to form the output of the respective sensing element.
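The summation described above can be sketched in a few lines of Python; the function and variable names are illustrative assumptions only, not part of the disclosure.

```python
def sensing_element_output(detector_outputs: list[int]) -> int:
    """Common output of one sensing element: the summed outputs of its
    individual light detecting elements (e.g., SPAD trigger counts)."""
    return sum(detector_outputs)

# e.g., a hypothetical 4x4 SPAD array in which 5 SPADs fired in the current window
spad_counts = [0, 1, 0, 0,
               1, 0, 0, 1,
               0, 0, 0, 1,
               0, 0, 1, 0]
total = sensing_element_output(spad_counts)
```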
[0129] Each of the micro-optic elements 310 may be configured to map a cross section area of the surface of its light entrance to a sensing surface of the array of light detecting elements 220 of the associated sensing element 116 which is typically larger than the light entrance cross section. This means that the portion of reflected light 206, which may be significantly focused (on the focal plane 304), may enter each micro-optic element 310 at a significantly small light entrance surface and be dispersed through the light exit of the respective micro-optic element 310 on a larger surface matching the sensing surface of the array of light detecting elements 220 of the respective associated sensing element 116.
[0130] Distributing each portion of reflected light 206, through the micro-optic element 310, over a large sensing surface of each of the sensing elements 116, specifically over the large sensing surface of the array of light detecting elements 220 may significantly increase the dynamic range of each sensing element 116 compared to transmitting the focused reflected light 206 received from the focusing unit 302. Ideally, the micro-optic element 310 may be shaped and/or configured to uniformly distribute the incoming portion of reflected light 206 on the sensing surface of the associated sensing element 116.
[0131] The increase of the sensing surface of each sensing element 116 may directly translate to an increase in the dynamic range. Therefore, configuring the micro-optic elements 310 to have their light exit larger than their light entrance, at an entrance-to-exit area ratio of 1/25 to 1/4, may increase the dynamic range by a factor of 4 up to 25, i.e., by 1-2 orders of magnitude.
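As a back-of-envelope check of the ratio stated above, the dynamic-range gain is simply the ratio of the light exit area to the light entrance area. The sketch below interprets the exemplary areas of paragraph [0126] as 0.11 mm² and 1.1 mm² (an assumption; the disclosure gives the figures without explicit area units):

```python
def dynamic_range_gain(entrance_area_mm2: float, exit_area_mm2: float) -> float:
    """Dynamic-range increase factor: ratio of light exit area to light entrance area."""
    return exit_area_mm2 / entrance_area_mm2

# Exemplary areas interpreted from paragraph [0126]: 0.11 mm^2 entrance, 1.1 mm^2 exit
gain = dynamic_range_gain(0.11, 1.1)  # 10x, within the stated 4x-25x range
```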
[0132] Increasing the dynamic range of the sensing elements 116 may significantly increase the detection performance of the LIDAR system 100, for example, resolution, accuracy, reliability and/or robustness. Increasing the dynamic range may be specifically advantageous for short range detection where large amount of reflected light 206 may be reflected from one or more objects located close to the LIDAR system 100, for example, below 10, 20, 30, 40, 50, 60 meters, and/or the like.
[0133] In order to transmit (distribute) the reflected light 206, which may be significantly focused and thus have a small cross section surface area, to the sensing surface of each sensing element 116, which has a significantly large surface area, the light exit of each micro-optic element 310 may be larger than the light entrance of the respective micro-optic element 310. In particular, a cross section of a surface of the light exit of each micro-optic element 310 may have a larger surface area than a cross section of a surface of the light entrance of the micro-optic element 310. For example, the cross section of the light exit surface of each micro-optic element 310 may be larger than the cross section of the light entrance surface of the respective micro-optic element 310 by a ratio in a certain range, for example, a range of 25/1 to 4/1. As such, the cross section of the portion of reflected light 206 transmitted to each sensing element 116 via the light exit (surface) of the respective associated micro-optic element 310 may be larger than the cross section of the portion of reflected light 206 received at the light entrance (surface) of the respective micro-optic element 310 by a ratio in the certain range.
[0134] The micro-optic elements 310 may employ one or more architectures, structures, and/or implementations adapted for transmitting the reflected light 206 to the array of sensing elements 116, in particular for dispersing the reflected light 206 on the sensing surface of the sensing elements 116 of the array 316. For example, each micro-optic element 310 may be a monolithic component, i.e., a single component constructed of one or more materials, specifically light transferring materials, for example, a polymer, glass, and/or the like, for optically coupling the light exit of the micro-optic element 310 to its light entrance. In another example, each micro-optic element 310 may be constructed and/or fabricated of multiple disparate components, elements, and/or devices which are optically coupled to each other.
[0135] Specifically, as described herein before, regardless of their monolithic or disparate structure, each micro-optic element 310 may be constructed, fabricated, shaped, and/or implemented to receive, from the focusing unit 302, a respective portion of the reflected light 206, which is typically significantly focused, and transmit the received light portion to the associated sensing element 116, typically having a large sensing surface, while preventing or at least significantly reducing (blocking, attenuating, etc.) other light from transmission to the respective associated sensing element 116.
[0136] The micro-optic elements 310 may therefore be constructed, shaped, and structured to disperse (spread, expand) the significantly focused reflected light 206 received at the light entrance of the respective micro-optic element 310 over the significantly larger sensing surface of the respective associated sensing element 116. It should be noted that each of the micro-optic elements 310 may employ a single such architecture, structure, and/or implementation, and/or a combination of two or more such architectures, structures, and/or implementations.
[0137] The term disperse and its variants (dispersing, dispersed, dispersion, etc.) as used herein relates to spreading, expanding, and/or distributing light according to a desired distribution, typically a wide distribution. Dispersing received light thus means that a surface area of the dispersed light may be larger than the surface area of the received light; more specifically, a cross section of the dispersed (spread) light (beam) may have a larger surface area than the cross section of the received light (beam).
[0138] For example, the light entrance and/or the light exit of each micro-optic element 310 may be shaped to have a curvature configured to disperse the respective portion of reflected light 206 over a sensing surface of the respective associated sensing element 116. For example, the light entrance and/or exit may be configured to have a convex curved surface which may disperse the significantly focused reflected light 206 received at the light entrance of the respective micro-optic element 310 over the significantly larger sensing surface of the respective associated sensing element 116. The curvature may be, for example, part of the micro-optic element 310, for example, formed in the light entrance of each micro-optic element 310, which may be constructed of one or more materials having a refractive index selected and/or adapted to transmit the reflected light 206 toward the light exit of the respective micro-optic element 310, specifically to disperse the respective portion of reflected light over the sensing surface of the associated sensing element 116.
[0139] In another example, each micro-optic element 310 may be associated with one or more respective lenses configured to disperse the respective portion of reflected light 206 over the sensing surface of the respective associated sensing element 116. For example, the light entrance and/or exit of each micro-optic element 310 may comprise a dispersing lens configured to disperse the significantly focused reflected light 206 received at the light entrance of the respective micro-optic element 310 over the significantly larger sensing surface of the respective associated sensing element 116. [0140] According to some embodiments, each micro-optic element 310 may be structured to have its light entrance and its light exit spaced apart. In such case, each micro-optic element 310 may comprise an optical path configured to optically couple the light exit of the respective micro-optic element 310 to the light entrance of the micro-optic element 310. For example, each micro-optic element 310 may be a monolithic component constructed of one or more materials, for example, glass, a polymer, and/or the like, configured to transfer light from the light entrance of the respective micro-optic element 310 to the light exit of the micro-optic element 310. In another example, each micro-optic element 310 may facilitate an open air light transmission path from the light entrance of the respective micro-optic element 310 to the light exit of the micro-optic element 310. In another example, each micro-optic element 310 may comprise one or more optic fibers configured to direct light from the light entrance of the respective micro-optic element 310 to the light exit of the micro-optic element 310.
[0141] Reference is now made to FIG. 4A and FIG. 4B, which are cross section views of exemplary micro-optic elements of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensing element of the LIDAR system, in accordance with embodiments of the present disclosure.
[0142] As seen in FIG. 4A, an exemplary micro-optic element 310A such as the micro-optic element 310 may be constructed as a monolithic component comprising a light entrance, for example, an input surface 402A through which a portion of incoming reflected light such as the reflected light 206 may enter the micro-optic element 310A, and a light exit, for example, an output surface 404A through which light may be transmitted to a sensing element such as the sensing element 116 associated with the micro-optic element 310A.
[0143] As seen, the input surface 402A may coincide with the focal plane 304 of a focusing unit such as the focusing unit 302 such that the portion of reflected light 206 is focused on the input surface 402A. The output surface 404A, which is spaced apart from the input surface 402A, may coincide with the detector plane 306, i.e., with the sensing surface of the associated sensing element 116.
[0144] The micro-optic element 310A may comprise a light blocking exterior 408A disposed between the spaced apart input surface 402A and the output surface 404A. The light blocking exterior 408A, having one or more surfaces adjoining the input surface 402A and the output surface 404A, may be configured to prevent, at least partially, transmission of external incident light to the sensing element 116. For example, the light blocking exterior 408A may be constructed of one or more light absorptive materials configured to absorb incident light. In another example, the light blocking exterior 408A may be geometrically shaped to reflect and/or deflect light away from the detector plane 306. As such, the micro-optic element 310A may transmit to the sensing element 116 only or at least mostly the portion of reflected light 206 entering the micro-optic element 310A through the input surface 402A while reducing and potentially preventing transmission of other light (e.g., crosstalk, noise, stray light, etc.) to the sensing element 116.
[0145] As seen, the micro-optic element 310A may be shaped and/or constructed such that the light blocking exterior 408A has a base angle 420 with the light exit of the respective micro-optic element 310 in a certain range, for example, 60-90 degrees. As such, light transmitted inside the micro-optic element 310A from the light entrance 402A may hit the light blocking exterior 408A at an obtuse angle with respect to the surface of the light blocking exterior 408A and be reflected back toward the light exit 404A. Moreover, the micro-optic element 310A may be shaped to have the base angle 420 within the certain range combined with the light blocking exterior 408A constructed of one or more materials having a selected refractive index which, combined with the base angle 420, may reduce refraction of the internal light out of the micro-optic element 310A and increase reflection of internal light beams toward the light exit 404A.
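The interplay of refractive index and angle of incidence described above can be sketched with Snell's law. The following is illustrative only and not part of the disclosed embodiments; the refractive-index value of 1.5 is a hypothetical glass-like example, and the sketch assumes total internal reflection governs whether an internal ray is returned toward the light exit.

```python
import math

# Illustrative sketch (not from the disclosure): whether a ray striking the
# light blocking exterior is totally internally reflected back toward the
# light exit depends on the exterior material's refractive index.
def critical_angle_deg(n_inside: float, n_outside: float = 1.0) -> float:
    """Smallest angle of incidence (from the surface normal) at which
    total internal reflection occurs at the inside/outside interface."""
    return math.degrees(math.asin(n_outside / n_inside))

def is_totally_reflected(aoi_deg: float, n_inside: float) -> bool:
    return aoi_deg > critical_angle_deg(n_inside)

# For a glass-like index of 1.5 the critical angle is ~41.8 degrees, so a
# ray hitting the exterior at 60 degrees is reflected toward the exit.
print(round(critical_angle_deg(1.5), 1))  # 41.8
print(is_totally_reflected(60.0, 1.5))    # True
```

This illustrates why a steeper base angle combined with a suitable refractive index keeps internal rays above the critical angle and hence reflected rather than refracted out.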
[0146] Moreover, the micro-optic element 310A may comprise one or more optical elements, for example, a prism shaped and/or constructed to optically couple the output surface 404A to the input surface 402A for distributing (expanding) the incoming reflected light 206 and transmitting distributed light 406 to the associated sensing element 116. As such, the distributed light 406 may have an increased (expanded) cross section larger than the cross section of the incoming reflected light 206 and may thus disperse (distribute) over the entire sensing surface having a larger cross section than the cross section of the incoming reflected light 206. [0147] In another example, illustrated in FIG. 4B, another exemplary micro-optic element 310B such as the micro-optic element 310 may comprise a lens 410, for example, a convex lens configured to disperse, i.e., distribute the portion of incoming reflected light 206 received via an input surface 402B of the lens 410 according to a desired distribution, typically a wide distribution, and transmit the distributed light 406 via a lens output surface 412 of the lens 410 to the sensing element 116 associated with the micro-optic element 310B. As seen, the input surface 402B may coincide with the focal plane 304 of the focusing unit 302 such that the portion of reflected light 206 is focused on the input surface 402B. However, in contrast to the embodiment of the micro-optic element 310A, the micro-optic element 310B may include a void between the lens output surface 412 and the detector plane 306. However, while the lens output surface 412 does not coincide with the detector plane 306, the light exit of the micro-optic element 310B is considered to comprise an output surface 404B coincident with the detector plane 306 through which the distributed light 406 received from the lens output surface 412 is transmitted to the associated sensing element 116 at the detector plane 306.
[0148] The micro-optic element 310B may further comprise a light blocking exterior 408B which may extend from the output surface 404B to the detector plane 306, i.e., to the sensing surface of the sensing element 116. The light blocking exterior 408B may be constructed, for example, as a diaphragm encircling the lens 410 configured to prevent, at least partially, transmission of incident light to the sensing element 116. For example, the light blocking exterior 408B may be constructed of one or more light absorptive materials configured to absorb incident light. In another example, the light blocking exterior 408B may be geometrically shaped to reflect and/or deflect light away from the detector plane 306. As such, the micro-optic element 310B may transmit (transfer) to the sensing element 116 only or at least mostly the portion of reflected light 206 entering the micro-optic element 310B through the input surface 402B while reducing and potentially preventing transmission of other light (e.g., crosstalk, noise, stray light, etc.) to the sensing element 116.
[0149] The light blocking exterior 408B encircling the lens 410 may be solid, for example, i.e., its interior space may be constructed of one or more materials which optically couple the detector plane 306 to the lens output surface 412 such that the distributed light 406 may be transmitted to the sensing surface of the sensing element 116. In another example, the light blocking exterior 408B encircling the lens 410 may comprise a void forming an open air optical path from the lens output surface 412 to the detector plane 306.
[0150] Optionally, the micro-optic elements 310 may be configured to employ angular filtering by preventing transmission of incident light beams having an Angle of Incidence (AOI) with the light entrance of the micro-optic element 310 which is outside a predefined angle range. In particular, as known in the art, the AOI of incident light beams relating to each micro-optic element 310 may be expressed with respect to a normal to the light entrance of the respective micro-optic element 310 at the point of incidence. In case the light entrance of the micro-optic elements 310 comprises a flat surface, the normal may be uniform across the light entrance of each micro-optic element 310. However, in case the light entrance of the micro-optic elements 310 comprises a non-flat surface, for example, a curved surface, the normal may be specific to each point of the light entrance of each micro-optic element 310.
[0151] As such, incident light beams having an AOI which is within the angle range, for example, ±30 degrees, ±45 degrees, and/or the like with respect to the normal may be transmitted via the micro-optic element 310 to the light exit of the micro-optic element 310 while incident light beams having an AOI outside the angle range may be rejected, i.e., prevented from transmitting to the light exit. As a result, each micro-optic element 310 may reject light beams having an AOI outside the predefined angle range, i.e., prevent their transmission to the associated sensing element 116, while transmitting to the associated sensing element 116 light beams having an AOI within the angle range.
[0152] The micro-optic elements 310 may employ one or more structures, architectures, and/or compositions to facilitate the angular filtering. For example, each of the micro-optic elements 310 may be geometrically shaped to form a geometric angular light trap adapted to reflect and/or deflect away light beams outside the predefined angle range. For example, the micro-optic elements 310 may be shaped and/or constructed to include a front-end conduit adapted to filter out light beams outside the predefined angle range. The conduit may have one or more light transmission, reflection, refraction, and/or rejection properties similar to those of the light blocking exterior 408. Optionally, the conduit may be part of the light blocking exterior 408.
[0153] Reference is now made to FIG. 5A and FIG. 5B, which are cross section views of exemplary micro-optic elements shaped to deflect light reflected from objects illuminated by the LIDAR system away from a sensing element of the LIDAR system, in accordance with embodiments of the present disclosure.
[0154] For example, as seen in FIG. 5A, an exemplary micro-optic element 310C comprising a light entrance 402C, a light exit 404C, and a light blocking exterior 408C may further comprise a conduit 502A at its front-end, specifically at its light entrance, for example, at an input surface 402C of the micro-optic element 310C. The conduit 502A may be configured to convey and/or transmit the incoming reflected light 206 and increase angular rejection by rejecting light beams having an AOI with respect to the input surface 402C that is outside a predefined angle range. The conduit 502A may be constructed of one or more materials having a refractive index adapted to transfer light beams having AOI angles with the normal of the input surface 402C which are within a predefined angle range between angles 510A and 510B, for example, ±45 degrees, while deflecting away light beams having AOI angles which are outside the predefined angle range. As seen, the conduit 502A may have a uniform cross section across its entire height.
[0155] For example, the AOI angles of light beams 206B and 206C with respect to the normal to the input surface 402C may be, for example, +45 degrees and -45 degrees respectively, which are within the predefined angle range defined between angles 510A and 510B, and these light beams may therefore transfer through the micro-optic element 310C toward the sensing element 116. Light beams 206A and 206D, on the other hand, may be deflected away from the sensing element 116 since their AOI angles with respect to the normal to the input surface 402C, for example, +55 degrees and -65 degrees respectively, are outside the predefined angle range.
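The angular acceptance test applied to the example beams above can be sketched as a simple predicate. This is illustrative only and not part of the disclosed embodiments; it assumes a symmetric ±45 degree acceptance range and uses the beam labels and AOI values from the example.

```python
# Illustrative sketch of the angular filtering described above: beams whose
# AOI (with respect to the normal of input surface 402C) lies within the
# predefined range pass toward the sensing element; others are rejected.
def passes_angular_filter(aoi_deg: float, half_range_deg: float = 45.0) -> bool:
    return abs(aoi_deg) <= half_range_deg

# Beams 206B (+45) and 206C (-45) pass; 206A (+55) and 206D (-65) are rejected.
beams = {"206A": 55.0, "206B": 45.0, "206C": -45.0, "206D": -65.0}
print({name: passes_angular_filter(aoi) for name, aoi in beams.items()})
# {'206A': False, '206B': True, '206C': True, '206D': False}
```

The predicate reproduces the worked example: only beams 206B and 206C reach the sensing element 116.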
[0156] The conduit 502A may be further configured to support internal reflection of light beams incident on the internal surfaces of the conduit 502A to increase reflection of light inside the micro-optic element 310C, i.e., light transmitted through the micro-optic element 310C toward the sensing element 116. For example, the conduit 502A may be constructed of one or more light transparent materials having a refractive index which may refract light beams having AOI angles outside a certain angle range and transmit them out of the micro-optic element 310C while reflecting light beams having AOI angles which are within the certain angle range toward the light exit 404C and through it to the sensing surface of the sensing element 116.
[0157] As seen in FIG. 5B, another exemplary micro-optic element 310D comprising a light entrance 402D, a light exit 404D, and a light blocking exterior 408D may further include a conduit 502B shaped and/or constructed to have a varying cross section across its height. Specifically, the conduit 502B may be shaped to have inward inclining side surfaces (side walls). This inward inclination may alter a refraction angle out of the micro-optic element 310D for light beams having an AOI outside the predefined angle range since the normal to the inward inclined surfaces is tilted inward. This may enable increased control of the light trap utilized by the conduit 502B, for example, to refract such out-of-range light beams, for example, the light beams 206A and 206D, further away from the sensing elements 116 and/or toward one or more other light traps, absorptive elements, and/or the like, thus further increasing immunity of the sensing elements 116 to light not received via the light entrance of their associated micro-optic elements 310D.
[0158] In another example, each of the micro-optic elements 310 may facilitate the angular filtering using material angular light filtering. Such micro-optic elements 310 may be composed of one or more materials having a refractive index selected to prevent transmission of incident light beams having an AOI with respect to the light entrance which is outside the predefined angle range while transferring incident light beams having an AOI within the predefined angle range. For example, an exemplary micro-optic element 310, optionally comprising a conduit such as the conduit 502, may be constructed of one or more materials characterized by a refractive index such that light beams incident on the micro-optic element 310 with an AOI outside the predefined angle range are refracted and transmitted away from the light exit of the micro-optic element 310 and the sensing surface of the respective associated sensing element 116. However, the refractive index of the selected materials may cause light beams incident on the micro-optic elements 310 with an AOI with respect to the light entrance that is within the predefined angle range to be directed and transmitted toward the light exit of the micro-optic element 310 and the sensing surface of the respective associated sensing element 116.
[0159] According to some embodiments, each micro-optic element 310 may be shaped as a truncated pyramid, for example, a square pyramid, a triangular pyramid, and/or the like, comprising a truncated top facet, a base facet, and a plurality of side facets. The truncated top facet of the pyramid may constitute a light entrance, the base facet of the pyramid may constitute a light exit such as the light exit 404, and the plurality of side facets of the pyramid may constitute a light blocking exterior such as the light blocking exterior 408. [0160] Moreover, in some embodiments, an exemplary micro-optic array 320 may be shaped as an array of truncated pyramids, i.e., where each of the truncated pyramids constitutes a respective one of the plurality of the micro-optic elements 310.
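For a truncated square pyramid, the entrance-to-exit area ratio discussed earlier reduces to the ratio of the top and base facet areas. The following sketch is illustrative only and not part of the disclosed embodiments; the edge lengths are hypothetical values chosen to reproduce a 1/25 ratio.

```python
# Illustrative sketch (hypothetical dimensions): for a truncated square
# pyramid, the top facet (light entrance) and base facet (light exit)
# areas determine the entrance-to-exit cross-section ratio.
def square_facet_area(side: float) -> float:
    return side * side

top_side, base_side = 1.0, 5.0          # truncated top vs. base edge length
entrance = square_facet_area(top_side)  # light entrance area
exit_ = square_facet_area(base_side)    # light exit area
print(entrance / exit_)                 # 0.04, i.e., a 1/25 ratio
```

A 1:5 edge ratio thus yields the 1/25 area ratio at one end of the range stated in paragraph [0133].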
[0161] Reference is now made to FIG. 6, depicting various views of an exemplary micro-optic array of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure.
[0162] An exemplary micro-optic array 320E such as the micro-optic array 320 of an imaging system such as the imaging system 300 of a LIDAR system such as the LIDAR system 100 may be constructed, shaped, produced, and/or configured to have a “Toblerone” shape comprising a plurality of truncated pyramids each constituting a respective micro-optic element of a plurality of micro-optic elements 310E such as the micro-optic elements 310. Each micro-optic element 310E(i) may be associated with a respective sensing element 116(i) such as the sensing element 116 of a sensor array such as the sensor array 316.
[0163] As seen in the perspective view of the micro-optic array 320E, each of the micro-optic elements 310E(i) has a respective light entrance, for example, an input surface 402E formed by the truncated top facet of the pyramid constituting the respective micro-optic element 310E(i), and a respective light exit, for example, an exit surface 404E formed by the base facet of the pyramid constituting the respective micro-optic element 310E(i).
[0164] The micro-optic array 320 comprising the plurality of micro-optic elements 310 may be a monolithic component, i.e., a single component constructed of one or more materials, specifically light transferring materials, for example, a polymer, glass, and/or the like, such that each micro-optic element 310 may optically couple its respective light exit to its respective light entrance for transmitting the portion of incoming reflected light received from a focusing unit such as the focusing unit 302 to a respective sensing element 116 associated with the respective micro-optic element 310. In another example, the micro-optic array 320 comprising the plurality of micro-optic elements 310 may be constructed and/or fabricated of multiple disparate components, elements, and/or devices, for example, an open air optical element, an optic fiber, and/or the like, which optically couple the respective light entrance of each of the micro-optic elements 310 to the respective light exit of the respective micro-optic element 310. [0165] Optionally, the imaging system 300, specifically the micro-optic array 320, may further comprise a plurality of optical filters disposed between the plurality of micro-optic elements 310 of the micro-optic array 320. These optical filters may be adapted, configured, shaped, and/or fabricated to absorb light not received via the light entrances of the plurality of micro-optic elements 310.
[0166] Reference is now made to FIG. 7A and FIG. 7B, which are schematic illustrations of exemplary micro-optic arrays of a LIDAR system configured for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system and filtering out spatial light noise, in accordance with embodiments of the present disclosure.
[0167] As seen in FIG. 7A and FIG. 7B, a plurality of optical filters 702 may be associated with an exemplary micro-optic array 320 of an imaging system such as the imaging system 300 of a LIDAR system such as the LIDAR system 100. For example, a spatial filter 702(i) may be disposed between each pair of micro-optic elements 310(i) and 310(i+1) such as the micro-optic elements 310 of the micro-optic array 320 associated with respective sensing elements 116(i) and 116(i+1) of a sensor array such as the sensor array 316 of the LIDAR system 100.
[0168] The optical filters 702 may be configured to prevent spatial light, i.e., light coming in from the direction of a focusing unit such as the focusing unit 302 of the imaging system 300, from being transmitted toward the sensor array 316. Moreover, one or more optical filters 702 may be disposed in front of the first micro-optic element 310 of the micro-optic array 320 and behind the last micro-optic element 310 of the micro-optic array 320.
[0169] The spatial filters 702(i) may be shaped, arranged, and/or disposed in one or more shapes, constructions, and/or configurations. For example, as seen in FIG. 7A, a plurality of spatial filters 702A(i) may be disposed between each pair of micro-optic elements 310(i) and 310(i+1) along a significant stretch and possibly the entire length (height) of the lateral surfaces of the micro-optic elements 310, for example, adjacent to the surface of the light blocking exterior of the micro-optic elements 310. Optionally, the internal surfaces of the micro-optic elements 310 may be coated with one or more reflective materials adapted to reflect light beams back into the micro-optic elements 310 and toward the associated sensing element 116. This may increase the energy level of light received at the sensing element 116, which may increase light detection of the sensing element 116 and thus increase detection performance of the LIDAR system 100.
[0170] In another example, as seen in FIG. 7B, a plurality of spatial filters 702B(i) may be disposed between each pair of micro-optic elements 310(i) and 310(i+1) next to the light entrances of the micro-optic elements 310. As seen, the spatial filters 702B(i) may be significantly small (short) and may not stretch along the lateral surfaces of the micro-optic elements 310. Using this arrangement, which leaves an air gap between the micro-optic elements 310, coupled with selection of the material(s) composing the micro-optic element(s) 310 to have an appropriate refractive index, may increase reflection of light beams inside the micro-optic element 310 by preventing refraction of these light beams out of the micro-optic element(s) 310 and thus reflecting them back into the micro-optic element 310 and toward the associated sensing element 116. The reflected light may increase the energy level of light received at the sensing element 116, which may increase light detection of the sensing element 116 and thus increase detection performance of the LIDAR system 100.
[0171] Optionally, the optical filters 702 serving as spatial light traps, for example, as seen in FIG. 7A, may utilize and/or perform as the light blocking exterior 408 of the micro-optic elements 310 for reducing and potentially preventing spatial light not entering the micro-optic elements 310 via their light entrances from being transmitted to the sensing elements 116 of the sensor array 316.
[0172] Optionally, one or more materials may be disposed between the sensing surface of the sensing elements 116 and the light exit of the micro-optic elements 310 to prevent reflection of light beams from the sensing elements 116 back toward the micro-optic elements 310, and optionally to affix the micro-optic array 320 to the sensor array 316. For example, one or more optical adhesives, for example, an optical glue, an optical paste, and/or the like, may be spread over the sensing surface of the sensing elements 116. The optical paste(s) may be selected and/or configured to have a matched refractive index to prevent reflection of incident light back toward the light entrance of the micro-optic elements 310.
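The benefit of index matching can be sketched with the Fresnel reflectance at normal incidence. This is illustrative only and not part of the disclosed embodiments; the refractive-index values are hypothetical, glass-like examples.

```python
# Illustrative sketch: index matching between the micro-optic exit and the
# sensing surface suppresses back-reflection. At normal incidence the
# Fresnel reflectance is ((n1 - n2) / (n1 + n2)) ** 2.
def normal_reflectance(n1: float, n2: float) -> float:
    return ((n1 - n2) / (n1 + n2)) ** 2

# A glass-to-air gap reflects ~4% of the light back toward the micro-optics;
# a matched optical paste (n2 equal to n1) reflects essentially nothing.
print(round(normal_reflectance(1.5, 1.0), 3))  # 0.04
print(round(normal_reflectance(1.5, 1.5), 3))  # 0.0
```

Under these assumed indices, replacing an air gap with a matched optical paste removes the roughly 4% back-reflection toward the micro-optic elements.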
[0173] Reference is now made to FIG. 8, which is a flow chart of an exemplary process of using micro-optic elements for transmitting light reflected from objects illuminated by the LIDAR system to a sensor array of the LIDAR system, in accordance with embodiments of the present disclosure. [0174] An exemplary process 800 may be executed using a micro-optic array such as the micro-optic array 320 of an imaging system such as the imaging system 300 of a LIDAR system such as the LIDAR system 100.
[0175] As shown at 802, light such as the reflected light 206 may be received by the imaging system 300; specifically, the reflected light 206 may be received via a focusing unit of the imaging system 300 such as the focusing unit 302.
[0176] As described herein before, the light 206 may be reflected by one or more objects illuminated by light such as the projected light 204 emitted by one or more light sources such as the light source 112 of the LIDAR system and projected to an FOV such as the FOV 120 of the LIDAR system 100 by a scanning unit such as the scanning unit 104.
[0177] The reflected light 206 may be divided into a plurality of light portions, each directed to a respective one of a plurality of sensing elements such as the sensing elements 116 of a sensor array such as the sensor array 316.
[0178] As shown at 804, each of a plurality of micro-optic elements such as the micro-optic elements 310 of a micro-optic array such as the micro-optic array 320 associated with a respective sensing element 116 may receive a respective portion of the reflected light 206 from the focusing unit 302. Specifically, each micro-optic element 310 may receive the respective portion of the reflected light 206 via its light entrance.
[0179] As shown at 806, each micro-optic element 310 may transmit the respective portion of reflected light 206 to the respective associated sensing element 116 while reducing and potentially preventing transmission of light not received via the light entrance of the respective micro-optic element 310, such that only or at least most of the light transmitted to the respective associated sensing element 116 is light received via the light entrance of the respective micro-optic element 310.
[0180] The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations will be apparent to those skilled in the art from consideration of the specification and practice of the disclosed embodiments.
[0181] Moreover, aspects of the present disclosure may be embodied as a system, method and/or computer program product. As such, aspects of the disclosed embodiments may be provided in the form of an entirely hardware embodiment, an entirely software embodiment, or a combination thereof.
[0182] Additionally, although aspects of the disclosed embodiments are described as being stored in memory, one skilled in the art will appreciate that these aspects can also be stored on other types of computer readable media, such as secondary storage devices, for example, hard disks or CD ROM, or other forms of RAM or ROM, USB media, DVD, Blu-ray, or other optical drive media.
[0183] Computer programs and computer program products based on the written description and disclosed methods are within the skill of an experienced developer. The various programs or program modules can be created using any of the techniques known to one skilled in the art or can be designed in connection with existing software. For example, program sections or program modules can be designed in or by means of .Net Framework, .Net Compact Framework (and related languages, such as Visual Basic, C, etc.), Java, C++, Objective-C, HTML, HTML/AJAX combinations, or HTML with included Java applets.
[0184] Moreover, while illustrative embodiments have been described herein, the scope of the present disclosure includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations and/or alterations as would be appreciated by those skilled in the art based on the present disclosure.
[0185] It is expected that during the life of a patent maturing from this application many relevant systems, methods and computer programs will be developed and the scope of the terms LIDAR systems, light projection technologies, light sensing technologies, scanning mechanisms and focusing units are intended to include all such new technologies a priori.
[0186] The terms "comprise", "comprising", "include", "including", "having" and their conjugates mean "including but not limited to". These terms encompass the terms "consisting of" and "consisting essentially of", which mean that the composition or method may include additional ingredients and/or steps only if the additional ingredients and/or steps do not materially alter the novel characteristics of the claimed composition or method.
[0187] As used herein the term “about” refers to ±5%.
[0188] Throughout this disclosure, various embodiments may be presented in a range format. Description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be construed to include all the possible subranges as well as individual numerical values within that range.
[0189] It is appreciated that certain features of embodiments disclosed herein, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Also, features described in combination in the context of a single embodiment may also be provided separately or in suitable sub-combinations in other embodiments described herein.
[0190] Publications, patents, and patent applications referred to in this disclosure are incorporated into the specification in their entirety by reference, as if each individual publication, patent, or patent application were specifically and individually included in the disclosure. However, indication and/or identification of any such referenced document should not be construed as an admission that the referenced document is available as prior art to embodiments disclosed herein.
[0191] The limitations in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. The examples are to be construed as non-exclusive. Furthermore, the steps of the disclosed methods may be modified in any manner, including by reordering steps and/or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as illustrative only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.

Claims

WHAT IS CLAIMED IS:
1. An imaging system for receiving light reflected from a field of view (FOV) illuminated by a LIDAR system, comprising:
a focusing unit configured to receive reflected light from a FOV illuminated by a LIDAR system and to focus a plurality of portions of the reflected light on a focal plane of the focusing unit;
a sensor array comprising a plurality of sensing elements configured to detect the reflected light; and
a micro-optic array comprising a plurality of micro-optic elements, each associated with a respective one of the plurality of sensing elements, wherein each micro-optic element comprises:
a light entrance coincident with the focal plane and configured to receive a respective portion of the plurality of portions of the reflected light from the focusing unit,
a light exit through which the respective portion of the reflected light is transmitted to the associated sensing element, and
a light blocking exterior disposed between the light entrance and the light exit, the light blocking exterior being configured to prevent transmission of the reflected light.
2. The imaging system of claim 1, wherein each of the plurality of sensing elements comprises an array of light detecting elements, and each micro-optic element is configured to disperse light received via the light entrance of the respective micro-optic element over a sensing surface of the array of light detecting elements of the associated sensing element, wherein a surface area of the sensing surface is larger than a surface area of a cross section of the light entrance.
3. The imaging system of any one of the previous claims, wherein a cross section of a surface of the light exit has a larger surface area than a cross section of a surface of the light entrance by a ratio in a range of 4/1 to 25/1.
4. The imaging system of any one of the previous claims, wherein the light entrance and the light exit of each micro-optic element are transparent to the reflected light.
5. The imaging system of any one of the previous claims, wherein the light entrance of each micro-optic element is shaped to have a curvature configured to disperse the respective portion of reflected light over a sensing surface of the associated sensing element.
6. The imaging system of any one of the previous claims, wherein each micro-optic element is associated with at least one respective lens configured to disperse the respective portion of reflected light over a sensing surface of the associated sensing element.
7. The imaging system of any one of the previous claims, wherein the light entrance and the light exit of each micro-optic element are spaced apart and optically coupled to each other via the volume of the respective micro-optic element between the light entrance and the light exit.
8. The imaging system of any one of the previous claims, wherein each micro-optic element is shaped to prevent transmission of light beams having an angle of incidence (AOI) with a surface of the light entrance outside a predefined angle range.
9. The imaging system of any one of the previous claims, wherein each micro-optic element comprises a front-end conduit geometrically shaped to direct, away from the light exit, light beams having an AOI with the surface of the light entrance outside the predefined angle range.
10. The imaging system of claim 9, wherein the front-end conduit has a uniform cross section.
11. The imaging system of claim 9, wherein the front-end conduit has a varying cross section.
12. The imaging system of any one of the previous claims, wherein the light blocking exterior is configured to absorb incident light external to the micro-optic element.
13. The imaging system of any one of previous claims 1 to 11, wherein the light blocking exterior is configured to reflect incident light external to the micro-optic element.
14. The imaging system of any one of the previous claims, wherein the light blocking exterior is configured to reflect incident light inside the micro-optic element.
15. The imaging system of any one of the previous claims, wherein each micro-optic element is a monolithic component.
16. The imaging system of any one of the previous claims, wherein each micro-optic element is shaped as a truncated pyramid wherein a truncated top facet of the pyramid constitutes the light entrance, a base facet of the pyramid constitutes the light exit, and a plurality of side facets of the pyramid constitute the light blocking exterior.
17. The imaging system of any one of the previous claims, wherein the micro-optic array is a monolithic component.
18. The imaging system of any one of the previous claims, wherein the imaging system further comprises a plurality of optical filters disposed between the plurality of micro-optic elements of the micro-optic array to prevent transmission of light not received via the light entrance of the plurality of micro-optic elements.
19. The imaging system of any one of the previous claims, wherein the LIDAR system comprises a plurality of light sources configured to transmit a plurality of light beams toward at least part of a FOV of the LIDAR system, and each of the plurality of portions of the reflected light corresponds to a respective one of the plurality of light beams.
20. The imaging system of any one of previous claims 1 to 18, wherein the LIDAR system comprises at least one light source configured to transmit a single elongated light beam toward a FOV of the LIDAR system, and the reflected light corresponding to the single elongated light beam is divided into the plurality of portions of the reflected light.
21. A method of distributing light reflected from a field of view (FOV) illuminated by a LIDAR system on sensing elements of the LIDAR system, comprising:
receiving, via an optical system of a LIDAR system, light reflected from a FOV illuminated by the LIDAR system, wherein the optical system is configured to focus a plurality of portions of the reflected light on a focal plane of the optical system; and
transmitting the reflected light via a micro-optic array to a sensor array comprising a plurality of sensing elements configured to detect the reflected light, the micro-optic array comprising a plurality of micro-optic elements each associated with a respective one of the plurality of sensing elements,
wherein each micro-optic element comprises:
a light entrance coincident with the focal plane and configured to receive a respective portion of the plurality of portions of the reflected light from the optical system,
a light exit through which the respective portion of the reflected light is transmitted to the associated sensing element, and
a light blocking exterior disposed between the light entrance and the light exit, the light blocking exterior being configured to prevent transmission of the reflected light.
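The geometric relationships recited in the claims can be illustrated with a small sketch. The following Python model is purely illustrative: the class, field names, and example dimensions are hypothetical and are not taken from the disclosure. It merely checks the exit-to-entrance facet area ratio of a truncated-pyramid micro-optic element (claims 3 and 16) and a simple angle-of-incidence acceptance test (claim 8).

```python
from dataclasses import dataclass

@dataclass
class MicroOpticElement:
    # Hypothetical model of one truncated-pyramid micro-optic element:
    # the square top facet is the light entrance, the square base facet
    # is the light exit (all names and values are illustrative only).
    entrance_side: float   # side length of the entrance facet (arbitrary units)
    exit_side: float       # side length of the exit facet (same units)
    max_aoi_deg: float     # largest accepted angle of incidence, in degrees

    def area_ratio(self) -> float:
        # Exit-to-entrance surface-area ratio of the two square facets.
        return (self.exit_side / self.entrance_side) ** 2

    def ratio_in_claimed_range(self) -> bool:
        # Claim 3 recites an exit/entrance area ratio in the range 4/1 to 25/1.
        return 4.0 <= self.area_ratio() <= 25.0

    def accepts(self, aoi_deg: float) -> bool:
        # Claim 8: beams whose AOI with the entrance surface falls outside
        # a predefined angle range are not transmitted to the light exit.
        return abs(aoi_deg) <= self.max_aoi_deg

elem = MicroOpticElement(entrance_side=1.0, exit_side=3.0, max_aoi_deg=20.0)
print(elem.area_ratio())              # → 9.0 (within the 4/1 to 25/1 range)
print(elem.ratio_in_claimed_range())  # → True
print(elem.accepts(15.0), elem.accepts(35.0))  # → True False
```

A wider base relative to the entrance both spreads the received portion of light over a larger sensing surface and leaves room for the light blocking exterior between adjacent elements; the 4/1 to 25/1 range corresponds to exit-to-entrance side ratios between 2:1 and 5:1.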
PCT/IL2024/050511 2023-05-25 2024-05-23 Micro-optics on detection path of lidar systems Pending WO2024241321A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202363504207P 2023-05-25 2023-05-25
US63/504,207 2023-05-25
US202363508112P 2023-06-14 2023-06-14
US63/508,112 2023-06-14

Publications (1)

Publication Number Publication Date
WO2024241321A1 2024-11-28

Family

ID=93589058

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2024/050511 Pending WO2024241321A1 (en) 2023-05-25 2024-05-23 Micro-optics on detection path of lidar systems

Country Status (1)

Country Link
WO (1) WO2024241321A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180113200A1 (en) * 2016-09-20 2018-04-26 Innoviz Technologies Ltd. Variable flux allocation within a lidar fov to improve detection in a region
US20180292532A1 (en) * 2017-04-07 2018-10-11 General Electric Company Lidar system and method
US20190011556A1 (en) * 2017-07-05 2019-01-10 Ouster, Inc. Light ranging device with electronically scanned emitter array and synchronized sensor array
US20190227175A1 (en) * 2018-01-23 2019-07-25 Innoviz Technologies Ltd. Distributed lidar systems and methods thereof
WO2019234503A2 (en) * 2018-06-05 2019-12-12 Innoviz Technologies Ltd. Mems mirror with resistor for determining a position of the mirror
US20210025997A1 (en) * 2018-04-09 2021-01-28 Innoviz Technologies Ltd. Lidar systems and methods with internal light calibration
WO2022053874A2 (en) * 2020-09-14 2022-03-17 Innoviz Technologies Ltd. Lidar system with variable resolution multi-beam scanning


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24810603

Country of ref document: EP

Kind code of ref document: A1