
WO2024136622A1 - System, method, and computer program product for a dynamic detection threshold for a lidar of an autonomous vehicle - Google Patents


Info

Publication number
WO2024136622A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
light
threshold
digital output
aspects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2023/021613
Other languages
English (en)
Inventor
Ryan Thomas Davis
Dane Bennington
Mohamed SEGHILANI
Bayard G. Gardineer
Christopher John Trowbridge
Ying Xiang
Richard SLOCUM
Martin JAN TAUC
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Innotek Co Ltd
Original Assignee
LG Innotek Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US18/087,950 (US12344272B2)
Priority claimed from US18/088,846 (US12039945B1)
Priority claimed from US18/153,209 (US20230356146A1)
Application filed by LG Innotek Co Ltd
Priority to CN202380094495.6A (CN120731123A)
Priority to EP23907905.6A (EP4637968A1)
Priority to KR1020257021381A (KR20250123820A)
Publication of WO2024136622A1
Anticipated expiration (legal status: Critical)
Ceased (legal status: Critical, Current)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815 Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • G01S7/483 Details of pulse systems
    • G01S7/484 Transmitters
    • G01S7/486 Receivers
    • G01S7/4861 Circuits for detection, sampling, integration or read-out
    • G01S7/487 Extracting wanted echo signals, e.g. pulse detection
    • G01S7/4873 Extracting wanted echo signals, e.g. pulse detection by deriving and controlling a threshold value
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408 Radar; Laser, e.g. lidar

Definitions

  • This disclosed subject matter relates generally to systems, methods, and computer program products for object detection with LiDAR and, in some non-limiting embodiments or aspects, to a system, method, and computer program product for a dynamic detection threshold for a sensor (e.g., LiDAR) of an autonomous vehicle.
  • Autonomous vehicles rely on various sensors that operate to gather information about an environment in which the vehicle is operating and/or traveling.
  • autonomous vehicles may rely on one or more cameras, one or more ray casting systems (e.g., a LiDAR and/or the like), and/or the like to detect objects.
  • Certain LiDAR systems operate on a time-of-flight principle by measuring the time difference between emission of a signal and detection with a sensor of the signal returning after being reflected by an object.
  • Setting detection criteria may reduce false detections, but selecting an appropriate threshold can be difficult. For example, if the threshold is too low, false detections may occur (e.g., due to the factors described above). If the threshold is too high, certain objects (e.g., objects with low reflectivity, distant objects, and/or the like) may not be detected (e.g., because the returned signal may not have sufficient power to cause the output of the sensor to exceed the threshold).
  • Certain sources of noise, such as solar radiation and/or light from other light sources, may vary throughout the day, e.g., based on the time of day, the position of the sun, the location of shade and/or shadows, the state of other light sources being on or off, weather conditions, and/or other environmental factors. As such, setting a threshold that is appropriate for all times of day and environmental conditions can be difficult.
  • In addition, detection based on setting a threshold discards information about the returned signal (e.g., the amplitude of the returned signal, the amplitude of the output of the sensor, the amount by which the returned signal exceeds the threshold, and/or the like).
  • Amplitude estimation techniques such as time over threshold (TOT) are not precise and suffer from issues, such as pulse pileup (e.g., superimposition of multiple pulses and/or the like).
  • Digitization (e.g., with high-speed analog-to-digital converters (ADCs) and/or the like) may be very expensive and may generate large amounts of data, most of which is not useful; depending on the optical technology applied, digitization may also require an enormous dynamic range.
  • Faster photodetectors, such as Silicon Photomultipliers (SiPMs), produce extremely fast and short signal profiles that require extremely high-speed, expensive ADCs to digitize.
  • Some non-limiting embodiments or aspects of the disclosure further relate to determining a characteristic of a sensor based on a display of an e-ink display device.
  • a system may include a LiDAR system of an autonomous vehicle.
  • the LiDAR system may include at least one light emitter configured to emit pulses of light and at least one light detector configured to receive reflected pulses of light and generate analog output signals based on the reflected pulses of light.
  • the reflected pulses of light may include the pulses of light reflected back to the at least one light detector.
  • a comparator may be configured to receive the analog output signals from the light detector and generate digital output signals based on the analog output signals and a threshold.
  • a controller may be configured to receive a first digital output signal of the digital output signals from the comparator based on the threshold, adjust the threshold, receive at least one further digital output signal of the digital output signals from the comparator based on the threshold as adjusted, and/or determine at least one aggregation based on the first digital output signal and the at least one further digital output signal.
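  • As an illustrative sketch only (not the patent's implementation), the following Python snippet shows the general control flow described above: the threshold is adjusted between pulses, one digital output is read per pulse, and the (threshold, output) pairs are aggregated. The helpers set_threshold(), fire_pulse(), and read_comparator() are hypothetical stand-ins for the DAC write, the emitter trigger, and the comparator's latched output.

```python
# Sketch of the controller loop; hardware access is abstracted behind the
# hypothetical callables passed in as arguments.

def collect_aggregation(threshold_values, set_threshold, fire_pulse, read_comparator):
    """Return a list of (threshold, digital_output) pairs, one per pulse."""
    samples = []
    for threshold in threshold_values:
        set_threshold(threshold)            # adjust the comparator threshold (e.g., via a DAC)
        fire_pulse()                        # emit one pulse of light
        digital_output = read_comparator()  # 1 if the return exceeded the threshold, else 0
        samples.append((threshold, digital_output))
    return samples

def detection_count(samples):
    """A simple aggregation: the number of thresholds at which a detection occurred."""
    return sum(out for _, out in samples)
```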
  • the at least one light emitter may include a plurality of light emitters. Additionally or alternatively, the at least one light detector may include a plurality of light detectors.
  • the controller may be further configured to detect at least one object in an environment surrounding the autonomous vehicle based on the at least one aggregation.
  • the controller may be further configured to issue at least one command to cause the autonomous vehicle to perform at least one autonomous driving operation based on detecting the at least one object.
  • the controller may be further configured to issue at least one command to cause the autonomous vehicle to perform at least one autonomous driving operation based on the at least one aggregation.
  • the pulses of light may include a first pulse of light associated with the first digital output signal and at least one further pulse of light associated with the at least one further digital output signal.
  • the LiDAR system may be configured to rotate the at least one light emitter and the at least one light detector.
  • a field of view of the LiDAR system may rotate as the at least one light emitter and the at least one light detector rotate.
  • a pulse repetition rate of the pulses of light may be sufficiently high that the field of view when emitting the first pulse of light at least partially overlaps with the field of view when emitting the at least one further pulse of light.
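  • As a rough numeric illustration of this overlap condition (the spin rate, pulse repetition rate, and per-channel field-of-view width below are assumed example values, not figures from the disclosure):

```python
# With an assumed 10 Hz spin rate and 20 kHz pulse repetition rate, consecutive
# pulses are separated by 360 * 10 / 20000 = 0.18 degrees; if each channel's
# field of view is about 1 degree wide, successive fields of view overlap by
# roughly 80%, so returns from successive pulses sample nearly the same region.

rotation_rate_hz = 10.0               # assumed spin rate
pulse_repetition_rate_hz = 20_000.0   # assumed pulse repetition rate
fov_width_deg = 1.0                   # assumed per-channel field-of-view width

angular_step_deg = 360.0 * rotation_rate_hz / pulse_repetition_rate_hz
overlap_fraction = max(0.0, 1.0 - angular_step_deg / fov_width_deg)
print(f"angular step = {angular_step_deg:.2f} deg, overlap = {overlap_fraction:.0%}")
```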
  • the at least one further digital output signal may include a plurality of further digital output signals. Additionally or alternatively, adjusting the threshold and receiving the at least one further digital output signal may include repeatedly adjusting the threshold and receiving a respective further digital output signal of the plurality of further digital output signals based on the threshold as adjusted.
  • repeatedly adjusting the threshold and receiving the respective further digital output signal of the plurality of further digital output signals based on the threshold as adjusted may include adjusting the threshold according to at least one of a linear search, a low-to-high search, a high-to-low search, a binary search, a sawtooth search, or any combination thereof.
  • repeatedly adjusting the threshold and receiving the respective further digital output signal of the plurality of further digital output signals based on the threshold as adjusted may include repeatedly adjusting the threshold according to a first linear search within a first range and/or repeatedly adjusting the threshold according to a second linear search within a second range.
  • the second range may be based on a first threshold value within the first range for which the respective further digital output signal is associated with detection of an object and a second threshold value within the first range for which the respective further digital output signal is associated with not detecting the object.
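  • A minimal sketch of such a coarse-then-fine linear search is shown below; detect_at(threshold) is a hypothetical callback that fires a pulse with the given threshold and returns the comparator's digital output, and the step counts are illustrative.

```python
def coarse_then_fine_search(detect_at, v_min, v_max, coarse_steps=8, fine_steps=8):
    """Two linear sweeps: a coarse sweep over [v_min, v_max], then a fine sweep
    over the sub-range bounded by the highest detecting threshold and the lowest
    non-detecting threshold found in the coarse sweep."""
    coarse = [v_min + i * (v_max - v_min) / (coarse_steps - 1) for i in range(coarse_steps)]
    results = [(t, detect_at(t)) for t in coarse]

    detected = [t for t, hit in results if hit]
    missed = [t for t, hit in results if not hit]
    if not detected or not missed:
        return results  # no detect/no-detect boundary found in the coarse range

    lo, hi = max(detected), min(missed)  # bounds of the second (fine) range
    fine = [lo + i * (hi - lo) / (fine_steps - 1) for i in range(fine_steps)]
    results.extend((t, detect_at(t)) for t in fine)
    return results
```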
  • the threshold may include at least one of a linear value of voltage above a noise voltage level, an exponential value of voltage above the noise voltage level, a value of full width at half maximum, a signal-to-noise ratio (SNR), or any combination thereof.
  • the controller may be further configured to determine an approximate amplitude of the analog output signals based on the at least one aggregation.
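  • For example, assuming the aggregation is a list of (threshold, digital_output) pairs as in the earlier sketches, one simple (illustrative) way to approximate the amplitude is to bracket it between the highest threshold the return exceeded and the lowest threshold it did not:

```python
def approximate_amplitude(samples):
    """Approximate the analog return amplitude from (threshold, digital_output) pairs."""
    exceeded = [t for t, out in samples if out]    # thresholds the return exceeded
    missed = [t for t, out in samples if not out]  # thresholds the return did not exceed
    if not exceeded:
        return None                                # no detection at any threshold
    lower = max(exceeded)                          # lower bound on the peak amplitude
    upper = min(missed) if missed else None        # upper bound, if any threshold was missed
    return (lower + upper) / 2 if upper is not None else lower
```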
  • the system may further include a time-to-digital converter (TDC) configured to determine at least one time of flight (TOF) based on at least one pulse of light of the pulses of light and at least one reflected pulse of light of the reflected pulses of light.
  • the TDC may be configured to receive the at least one aggregation and/or determining the at least one TOF may include determining the at least one TOF based on the at least one aggregation.
  • the controller may be further configured to determine a target threshold based on the at least one aggregation.
  • the target threshold may include at least one of an optimal threshold value, a threshold value that increases a signal-to-noise ratio (SNR), or any combination thereof.
  • the system may further include a digital-to-analog converter (DAC).
  • the DAC may be connected to the controller.
  • an output of the at least one light detector may be connected to a first comparator input of the comparator, and/or the DAC may be connected to a second comparator input of the comparator.
  • the controller may be configured to adjust the threshold by controlling the DAC to adjust a voltage at the second comparator input of the comparator.
  • a method for a dynamic detection threshold for a sensor of an autonomous vehicle may include emitting, with at least one light emitter of a LiDAR system of an autonomous vehicle, at least one first pulse of light.
  • At least one light detector of the LiDAR system of the autonomous vehicle may receive at least one first reflected pulse of light including the at least one first pulse of light reflected back to the at least one light detector.
  • the at least one light detector may generate at least one first analog output signal based on the at least one first reflected pulse of light.
  • At least one comparator may receive the at least one first analog output signal from the at least one light detector.
  • the at least one comparator may generate at least one first digital output signal based on the at least one first analog output signal and a threshold.
  • At least one controller may receive the at least one first digital output signal from the comparator.
  • the at least one controller may adjust the threshold.
  • the at least one light emitter may emit at least one further pulse of light.
  • the at least one light detector may receive at least one further reflected pulse of light including the at least one further pulse of light reflected back to the at least one light detector.
  • the at least one light detector may generate at least one further analog output signal based on the at least one further reflected pulse of light.
  • the at least one comparator may receive the at least one further analog output signal from the at least one light detector.
  • the at least one comparator may generate at least one further digital output signal based on the at least one further analog output signal and the threshold as adjusted.
  • the at least one controller may receive the at least one further digital output signal from the comparator.
  • the at least one controller may determine at least one aggregation based on the at least one first digital output signal and the at least one further digital output signal.
  • the computer program product may include at least one non-transitory computer-readable medium including one or more instructions that, when executed by at least one processor, cause the at least one processor to receive at least one first digital output signal from a comparator, the at least one first digital output signal based on a threshold and at least one first analog output signal of at least one light detector of a LiDAR system of an autonomous vehicle.
  • the instructions when executed by the at least one processor, may further cause the at least one processor to adjust the threshold.
  • the instructions when executed by the at least one processor, may further cause the at least one processor to receive at least one further digital output signal from the comparator, the at least one further digital output signal based on the threshold as adjusted and at least one further analog output signal of the at least one light detector.
  • the instructions when executed by the at least one processor, may further cause the at least one processor to determine at least one aggregation based on the at least one first digital output signal and the at least one further digital output signal.
  • a system for a dynamic detection threshold for a sensor of an autonomous vehicle comprising: a LiDAR system of an autonomous vehicle, the LiDAR system comprising at least one light emitter configured to emit pulses of light and at least one light detector configured to receive reflected pulses of light and generate analog output signals based on the reflected pulses of light, the reflected pulses of light comprising the pulses of light reflected back to the at least one light detector; a comparator configured to receive the analog output signals from the light detector and generate digital output signals based on the analog output signals and a threshold; and a controller configured to: receive a first digital output signal of the digital output signals from the comparator based on the threshold; adjust the threshold; receive at least one further digital output signal of the digital output signals from the comparator based on the threshold as adjusted; and determine at least one aggregation based on the first digital output signal and the at least one further digital output signal.
  • Clause 2 The system of clause 1, wherein the at least one light emitter comprises a plurality of light emitters, and wherein the at least one light detector comprises a plurality of light detectors.
  • Clause 3 The system of any preceding clause, wherein the controller is further configured to detect at least one object in an environment surrounding the autonomous vehicle based on the at least one aggregation.
  • Clause 4 The system of any preceding clause, wherein the controller is further configured to issue at least one command to cause the autonomous vehicle to perform at least one autonomous driving operation based on detecting the at least one object.
  • Clause 5 The system of any preceding clause, wherein the controller is further configured to issue at least one command to cause the autonomous vehicle to perform at least one autonomous driving operation based on the at least one aggregation.
  • Clause 7 The system of any preceding clause, wherein the LiDAR system is configured to rotate the at least one light emitter and the at least one light detector, wherein a field of view of the LiDAR system rotates as the at least one light emitter and the at least one light detector rotate, and wherein a pulse repetition rate of the pulses of light is sufficiently high that the field of view when emitting the first pulse of light at least partially overlaps with the field of view when emitting the at least one further pulse of light.
  • Clause 8 The system of any preceding clause, wherein the at least one further digital output signal comprises a plurality of further digital output signals, and wherein adjusting the threshold and receiving the at least one further digital output signal comprises: repeatedly adjusting the threshold and receiving a respective further digital output signal of the plurality of further digital output signals based on the threshold as adjusted.
  • Clause 11 The system of any preceding clause, wherein the second range is based on a first threshold value within the first range for which the respective further digital output signal is associated with detection of an object and a second threshold value within the first range for which the respective further digital output signal is associated with not detecting the object.
  • the threshold comprises at least one of: a linear value of voltage above a noise voltage level; an exponential value of voltage above the noise voltage level; a value of full width at half maximum; a signal-to-noise ratio (SNR); or any combination thereof.
  • Clause 13 The system of any preceding clause, wherein the controller is further configured to determine an approximate amplitude of the analog output signals based on the at least one aggregation.
  • Clause 14 The system of any preceding clause, further comprising a time-to-digital converter (TDC) configured to determine at least one time of flight (TOF) based on at least one pulse of light of the pulses of light and at least one reflected pulse of light of the reflected pulses of light.
  • Clause 15 The system of any preceding clause, wherein the TDC is configured to receive the at least one aggregation and wherein determining the at least one TOF comprises determining the at least one TOF based on the at least one aggregation.
  • Clause 16 The system of any preceding clause, wherein the controller is further configured to determine a target threshold based on the at least one aggregation.
  • the target threshold comprises at least one of: an optimal threshold value; a threshold value that increases a signal-to-noise ratio (SNR); or any combination thereof.
  • Clause 18 The system of any preceding clause, further comprising a digital-to-analog converter (DAC), wherein the DAC is connected to the controller, wherein an output of the at least one light detector is connected to a first comparator input of the comparator, wherein the DAC is connected to a second comparator input of the comparator, and wherein the controller is configured to adjust the threshold by controlling the DAC to adjust a voltage at the second comparator input of the comparator.
  • a method for a dynamic detection threshold for a sensor of an autonomous vehicle comprising: emitting, with at least one light emitter of a LiDAR system of an autonomous vehicle, at least one first pulse of light; receiving, with at least one light detector of the LiDAR system of the autonomous vehicle, at least one first reflected pulse of light comprising the at least one first pulse of light reflected back to the at least one light detector; generating, with the at least one light detector, at least one first analog output signal based on the at least one first reflected pulse of light; receiving, with at least one comparator, the at least one first analog output signal from the at least one light detector; generating, with the at least one comparator, at least one first digital output signal based on the at least one first analog output signal and a threshold; receiving, with at least one controller, the at least one first digital output signal from the comparator; adjusting, with the at least one controller, the threshold; emitting, with the at least one light emitter, at least one further pulse of light; receiving, with the at least one light detector, at least one further reflected pulse of light comprising the at least one further pulse of light reflected back to the at least one light detector; generating, with the at least one light detector, at least one further analog output signal based on the at least one further reflected pulse of light; receiving, with the at least one comparator, the at least one further analog output signal from the at least one light detector; generating, with the at least one comparator, at least one further digital output signal based on the at least one further analog output signal and the threshold as adjusted; receiving, with the at least one controller, the at least one further digital output signal from the comparator; and determining, with the at least one controller, at least one aggregation based on the at least one first digital output signal and the at least one further digital output signal.
  • a computer program product for a dynamic detection threshold for a sensor of an autonomous vehicle comprising at least one non-transitory computer-readable medium comprising one or more instructions that, when executed by at least one processor, cause the at least one processor to: receive at least one first digital output signal from a comparator, the at least one first digital output signal based on a threshold and at least one first analog output signal of at least one light detector of a LiDAR system of an autonomous vehicle; adjust the threshold; receive at least one further digital output signal from the comparator, the at least one further digital output signal based on the threshold as adjusted and at least one further analog output signal of the at least one light detector; and determine at least one aggregation based on the at least one first digital output signal and the at least one further digital output signal.
  • a system comprising a memory; and at least one processor coupled to the memory and configured to: receive data associated with a first display of an e-ink display device and data associated with a second display of the e-ink display device, wherein the data associated with the first display of the e-ink display device is based on a first reading of the e-ink display device by a sensor system and the data associated with the second display of the e-ink display device is based on a second reading of the e-ink display device by the sensor system; process the data associated with the first display of the e-ink display device and the data associated with the second display of the e-ink display device to provide a quantitative result; and determine a characteristic of the sensor system based on the quantitative result.
  • a computer program product comprising at least one non-transitory computer-readable medium comprising one or more instructions that, when executed by at least one processor, cause the at least one processor to: receive data associated with a first display of an e-ink display device and data associated with a second display of the e-ink display device, wherein the data associated with the first display of the e-ink display device is based on a first reading of the e-ink display device by a sensor system and the data associated with the second display of the e-ink display device is based on a second reading of the e-ink display device by the sensor system; process the data associated with the first display of the e-ink display device and the data associated with the second display of the e-ink display device to provide a quantitative result; and determine a characteristic of the sensor system based on the quantitative result.
  • a method comprising: receiving, with at least one processor, data associated with a first display of an e-ink display device and data associated with a second display of the e-ink display device, wherein the data associated with the first display of the e-ink display device is based on a first reading of the e-ink display device by a sensor system and the data associated with the second display of the e-ink display device is based on a second reading of the e-ink display device by the sensor system; processing, with at least one processor, the data associated with the first display of the e-ink display device and the data associated with the second display of the e-ink display device to provide a quantitative result; and determining, with at least one processor, a characteristic of the sensor system based on the quantitative result.
  • a desiccant assembly within a sensor housing includes a desiccant chamber configured to hold a desiccant element; a transfer window positioned between the desiccant chamber and a sensor chamber of the sensor housing; and a permeable membrane covering the transfer window and configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber.
  • a lidar system includes a sensor housing including a sensor chamber; a sensor disposed within the sensor chamber; and a desiccant assembly disposed within the sensor housing, the desiccant assembly comprising a desiccant chamber configured to hold a desiccant element; and a permeable membrane positioned between the desiccant chamber and the sensor chamber and configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber.
  • an equipment housing includes an equipment chamber configured to hold electronic equipment; and a desiccant assembly, comprising a desiccant chamber configured to hold a desiccant element; and a transfer assembly positioned between the desiccant chamber and the equipment chamber, wherein the transfer assembly is configured to allow water vapor to transfer from the equipment chamber to the desiccant chamber and to prevent particulate matter from transferring from the desiccant chamber to the equipment chamber.
  • a method of manufacturing a sensor housing includes providing a desiccant chamber configured to hold a desiccant element; positioning a transfer window between the desiccant chamber and a sensor chamber of the sensor housing; and disposing a permeable membrane over the transfer window, wherein the permeable membrane is configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber.
  • FIG. 1 is a diagram of an exemplary system for a dynamic detection threshold for a sensor (e.g., LiDAR) of an autonomous vehicle, according to some non-limiting embodiments or aspects of the presently disclosed subject matter;
  • FIG. 2 is an illustration of an illustrative architecture for a vehicle, according to some non-limiting embodiments or aspects of the presently disclosed subject matter;
  • FIG. 3 is an illustration of an illustrative architecture for a LiDAR system, according to the principles of the presently disclosed subject matter
  • FIG. 4 is an illustration of an illustrative computer system, according to some non-limiting embodiments or aspects of the presently disclosed subject matter
  • FIG. 5 is a flowchart of an exemplary process for a dynamic detection threshold for a sensor (e.g., LiDAR) of an autonomous vehicle, according to some non-limiting embodiments or aspects of the presently disclosed subject matter;
  • FIGS. 6A-6C are diagrams of exemplary implementations for a dynamic detection threshold for a sensor (e.g., LiDAR) of an autonomous vehicle, according to some non-limiting embodiments or aspects of the presently disclosed subject matter;
  • FIG. 7 is an exemplary graph of detection threshold and false detection probability, according to some non-limiting embodiments or aspects of the presently disclosed subject matter.
  • FIG. 8 is an exemplary graph of detection probability and false detection probability, according to some non-limiting embodiments or aspects of the presently disclosed subject matter.
  • FIG. 9 is a diagram of a non-limiting embodiment of an environment in which systems, methods, and/or computer program products, described herein, may be implemented;
  • FIG. 10 is a diagram of a non-limiting embodiment of a computing device
  • FIG. 11 is a flowchart of a non-limiting embodiment of a process for determining a characteristic of a sensor based on a display of an e-ink display device;
  • FIG. 12 is a diagram of a non-limiting embodiment of an implementation of a process for determining a characteristic of a sensor based on a display of an e-ink display device.
  • FIGS. 13A-13B are graphs showing a relationship between reflectivity of an e-ink display device and wavelength of light, and reflectivity of the e-ink display and refractive index with regard to angle of incidence.
  • Fig. 14 is a diagram of an example environment in which an autonomous vehicle may operate, in accordance with some aspects of the disclosure.
  • Fig. 15 is a diagram of an example on-board system of an autonomous vehicle, in accordance with some aspects of the disclosure.
  • Fig. 16 is a diagram of an example lidar system, in accordance with some aspects of the disclosure.
  • Fig. 17A is a perspective diagram of an example sensor housing, in accordance with some aspects of the disclosure.
  • Fig. 17B is an exploded perspective diagram of the example sensor housing, with the housing shell removed, in accordance with some aspects of the disclosure.
  • Fig. 17C is another exploded perspective diagram of the example sensor housing, with the housing shell removed, in accordance with some aspects of the disclosure.
  • Fig. 17D is another exploded perspective diagram of the example sensor housing, with the housing shell removed, in accordance with some aspects of the disclosure.
  • Fig. 17E is a perspective diagram of an example desiccant assembly body, in accordance with some aspects of the disclosure.
  • Fig. 17F is another perspective diagram of the example desiccant assembly body, in accordance with some aspects of the disclosure.
  • Fig. 17G is a top plan diagram of the example desiccant assembly body, in accordance with some aspects of the disclosure.
  • Fig. 18 is a flowchart of an example method associated with manufacturing a sensor housing, in accordance with some aspects of the disclosure.
  • the terms “communication” and “communicate” may refer to the reception, receipt, transmission, transfer, provision, and/or the like of information (e.g., data, signals, messages, instructions, commands, and/or the like).
  • For one unit (e.g., a device, a system, a component of a device or system, combinations thereof, and/or the like) to be in communication with another unit means that the one unit is able to directly or indirectly receive information from and/or transmit information to the other unit.
  • This may refer to a direct or indirect connection (e.g., a direct communication connection, an indirect communication connection, and/or the like) that is wired and/or wireless in nature.
  • two units may be in communication with each other even though the information transmitted may be modified, processed, relayed, and/or routed between the first and second unit.
  • a first unit may be in communication with a second unit even though the first unit passively receives information and does not actively transmit information to the second unit.
  • a first unit may be in communication with a second unit if at least one intermediary unit (e.g., a third unit located between the first unit and the second unit) processes information received from the first unit and communicates the processed information to the second unit.
  • a message may refer to a network packet (e.g., a data packet and/or the like) that includes data. It will be appreciated that numerous other arrangements are possible.
  • The term "vehicle" refers to any moving form of conveyance that is capable of carrying one or more human occupants and/or cargo and is powered by any form of energy.
  • The term "vehicle" includes, but is not limited to, cars, trucks, vans, trains, autonomous vehicles, aircraft, aerial drones, and/or the like.
  • An “autonomous vehicle” is a vehicle having a processor, programming instructions, and drivetrain components that are controllable by the processor without requiring a human operator.
  • An autonomous vehicle may be fully autonomous in that it does not require a human operator for most or all driving conditions and functions, or it may be semi-autonomous in that a human operator may be required in certain conditions or for certain operations, or that a human operator may override the vehicle's autonomous system and may take control of the vehicle.
  • computing device may refer to one or more electronic devices configured to process data.
  • a computing device may, in some examples, include the necessary components to receive, process, and output data, such as a processor, a display, a memory, an input device, a network interface, and/or the like.
  • a computing device may be a mobile device.
  • a mobile device may include a cellular phone (e.g., a smartphone or standard cellular phone), a portable computer, a wearable device (e.g., watches, glasses, lenses, clothing, and/or the like), a personal digital assistant (PDA), and/or other like devices.
  • a computing device may also be a desktop computer or other form of non-mobile computer.
  • server may refer to one or more computing devices (e.g., processors, storage devices, similar computer components, and/or the like) that communicate with client devices and/or other computing devices over a network (e.g., a public network, the Internet, a private network, and/or the like) and, in some examples, facilitate communication among other servers and/or client devices.
  • system may refer to one or more computing devices or combinations of computing devices (e.g., processors, servers, client devices, software applications, components of such, and/or the like).
  • references to "a device,” “a server,” “a processor,” and/or the like, as used herein, may refer to a previously-recited device, server, or processor that is recited as performing a previous step or function, a different server or processor, and/or a combination of servers and/or processors.
  • a first server or a first processor that is recited as performing a first step or a first function may refer to the same or different server or the same or different processor recited as performing a second step or a second function.
  • satisfying a threshold may refer to a value being greater than the threshold, more than the threshold, higher than the threshold, greater than or equal to the threshold, less than the threshold, fewer than the threshold, lower than the threshold, less than or equal to the threshold, equal to the threshold, etc.
  • Non-limiting embodiments or aspects of the disclosed subject matter are directed to systems and methods for a dynamic detection threshold for a sensor (e.g., LiDAR) of an autonomous vehicle.
  • For example, provided is a LiDAR system including at least one light emitter configured to emit pulses of light and at least one light detector configured to receive reflected pulses of light (e.g., the pulses of light reflected back to the light detector) and generate analog output signals based on the reflected pulses of light, a comparator configured to receive the analog output signals from the light detector and generate digital output signals based on the analog output signals and a threshold, and a controller.
  • the controller may be configured to receive a first digital output signal from the comparator based on the threshold, adjust the threshold, receive at least one further digital output signal from the comparator based on the threshold as adjusted, and determine at least one aggregation based on the first digital output signal and the further digital output signal(s).
  • Such embodiments or aspects enable accurate detection of the returned signal (e.g., the reflected pulses corresponding to the emitted pulses) by aggregating multiple digital output signals obtained at different threshold values (e.g., with a threshold that is dynamically adjusted between at least some of the pulses).
  • Aggregating multiple digital output signals obtained at such different threshold values may reduce (e.g., decrease, minimize, eliminate, and/or the like) false detections caused by sources of noise, such as solar radiation, light from other light sources, electrical noise, and/or the like. For example, if one or a few digital output signals obtained when the threshold is relatively low erroneously indicate detection of an object, but other digital output signals obtained when the threshold is higher indicate that an object is not detected, the aggregation(s) may indicate that an object is not detected and/or a low confidence (e.g., probability, confidence score, and/or the like) that an object is detected.
  • Similarly, the disclosed embodiments or aspects may reduce the risk of failing to detect certain objects (e.g., objects with low reflectivity, distant objects, and/or the like). For example, even if one or a few digital output signals obtained when the threshold is relatively high erroneously indicate that an object is not detected, other digital output signals (e.g., multiple separate digital output signals) obtained when the threshold is relatively low may indicate that an object is detected. In that case, the aggregation(s) may indicate that an object is detected and/or a relatively higher confidence that an object is detected (even though the object was only detected at relatively low threshold values), because multiple digital outputs indicated detection of the same object (as opposed to a single digital output at a single threshold value).
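  • One very simple way to turn such an aggregation into a detection confidence, sketched here with the same (threshold, digital_output) representation used above (the scoring rule itself is illustrative, not specified by the disclosure), is the fraction of pulses whose return exceeded the threshold in effect for that pulse:

```python
def detection_confidence(samples):
    """Illustrative confidence score from (threshold, digital_output) pairs.

    A single spurious hit at a low threshold among many non-detections scores
    low, while consistent hits across several pulses (even at low thresholds)
    score noticeably higher."""
    if not samples:
        return 0.0
    return sum(out for _, out in samples) / len(samples)
```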
  • the disclosed embodiments or aspects may reduce the effects of certain sources of noise, such as solar radiation and/or light from other light sources, that may vary throughout the day by using a dynamic threshold with multiple, changing threshold values (e.g., rather than setting a single value for the threshold to be used throughout the day). Additionally or alternatively, non-limiting embodiments or aspects of the disclosed subject matter provide determining an approximate amplitude of the analog output signals based on the aggregation(s).
  • Such embodiments or aspects enable accurately and/or precisely estimating the returned power (e.g., the power of the reflected pulses as represented by the analog output signals) without the need for high-speed analog-to-digital converters (ADCs), e.g., by using a dynamically adjusted threshold and a comparator. This approach also reduces the effects of issues that cause inaccuracy and/or imprecision in other amplitude estimation techniques (e.g., pulse pileup, noise, and/or the like), for example, because multiple digital outputs are aggregated (rather than relying on a single digital output from what appears to be a single reflected pulse).
  • FIG. 1 is a diagram of an exemplary system 100 for a dynamic detection threshold for a sensor (e.g., LiDAR) of an autonomous vehicle, according to some non-limiting embodiments or aspects of the presently disclosed subject matter.
  • system 100 may include autonomous vehicle 102, LiDAR system 104, light emitter 106, light detector 108, comparator 110, controller 120, remote system 130, and/or communication network 190.
  • Autonomous vehicle 102 may include a vehicle, as described herein.
  • autonomous vehicle 102 may include one or more devices (e.g., controller 120, a vehicle on-board computing device, and/or the like) capable of receiving information from and/or communicating information to remote system 130 (e.g., directly, indirectly via communication network 190, and/or any other suitable communication technique).
  • each autonomous vehicle 102 may include a device (e.g., controller 120, a vehicle on-board computing device, and/or the like) capable of receiving information from and/or communicating information to other autonomous vehicles 102 (e.g., directly, indirectly via communication network 190, and/or any other suitable communication technique).
  • autonomous vehicle 102 may include at least one controller 120, such as a vehicle on-board computing device, a portable and/or handheld device (e.g., a computer, a laptop, a personal digital assistant (PDA), a smartphone, a tablet, and/or the like), a field programmable gate array (FPGA), a microcontroller, an application specific integrated circuit (ASIC), a server, and/or other like devices.
  • autonomous vehicle 102 may include at least one computing device (e.g., controller 120, a vehicle on-board computing device, and/or the like) and at least one sensor, such as an image capture system (e.g., a camera and/or the like), a ray casting system (e.g., LiDAR system 104, light emitter 106, light detector 108, a laser scanner, a radar, any combination thereof, and/or the like), any combination thereof, and/or the like, as described herein.
  • autonomous vehicle 102 may be configured to generate map data, image data, object detection data, and/or the like based on the sensor(s) (e.g., LiDAR system 104, light emitter 106, light detector 108, and/or the like).
  • autonomous vehicle 102 may use data from the sensor(s) (e.g., LiDAR system 104, light emitter 106, light detector 108, and/or the like) to facilitate at least one autonomous driving operation of the autonomous vehicle 102, as described herein.
  • autonomous vehicle 102 may detect at least one object using the sensor(s) onboard the vehicle (e.g., using LiDAR system 104, light emitter 106, light detector 108, and/or the like).
  • LiDAR system 104 may include at least one LiDAR system, as described herein.
  • LiDAR system 104 may be the same as or substantially similar to LiDAR system 264 of FIG. 2, LiDAR system 300 of FIG. 3, and/or the like.
  • LiDAR system 104 may include at least one light emitter 106 (e.g., a plurality of light emitters 106) and/or at least one light detector 108 (e.g., a plurality of light detectors 108), as described herein.
  • LiDAR system 104 may include at least one of comparator 110 and/or controller 120.
  • At least one of (e.g., both of) comparator 110 and/or controller 120 may be separate from and/or connected to (e.g., in communication with) LiDAR system 104.
  • LiDAR system 104 may be part of autonomous vehicle 102.
  • LiDAR system 104 may include a device capable of receiving information from and/or communicating information to other sensors, as described herein.
  • Light emitter 106 may include at least one light emitter configured and positioned to generate and emit pulses of light, as described herein.
  • light emitter 106 may be the same as or substantially similar to emitter system 304 of FIG. 3 and/or the like.
  • light emitter 106 may be part of LiDAR system 104 and/or autonomous vehicle 102.
  • LiDAR system 104 may include any number of light emitters 106 (e.g., 8 emitters, 64 emitters, 128 emitters, etc.).
  • Light detector 108 may include at least one light detector (e.g., a photodetector and/or the like) positioned and configured to receive light reflected back into the system, as described herein.
  • light detector 108 may be the same as or substantially similar to light detector 308 of FIG. 3 and/or the like.
  • light detector 108 may be part of LiDAR system 104 and/or autonomous vehicle 102.
  • LiDAR system 104 may include any number of light detectors 108.
  • light detector 108 may include at least one photodetector, such as a Silicon Photomultiplier (SiPM), an avalanche photodiode (APD), a single-photon avalanche diode (SPAD), a photodiode, and/or the like.
  • light detector 108 may include a SiPM, which may produce a fast and short signal profile (e.g., upon receiving a reflected pulse of light).
  • light detector 108 may be configured to generate analog output signals based on receiving light (e.g., reflected pulses of light).
  • Comparator 110 may include at least one comparator, such as a differential comparator, a differential amplifier, an operational amplifier (op-amp), and/or the like.
  • comparator 110 may be configured to receive the outputs (e.g., analog output signals) from light detector 108.
  • an output of light detector 108 may be connected to a first comparator input (e.g., a positive comparator input, a negative comparator input, and/or the like) of comparator 110.
  • comparator 110 may be configured to generate digital output signals based on the analog output signals and a threshold.
  • a second comparator input (e.g., a negative comparator input, a positive comparator input, and/or the like) may be connected (e.g., directly, indirectly via a digital-to-analog converter (DAC), and/or the like) to controller 120, and a signal (e.g., a threshold signal, such as a voltage, a current, and/or the like) provided at the second comparator input (e.g., from controller 120, the DAC, and/or the like) may be associated with the threshold.
  • Comparator 110 may compare the signals (e.g., the analog output signal at the first comparator input and the threshold signal at the second comparator input) and generate the digital output signal based thereon. For example, comparator 110 may generate a high digital output signal (e.g., 1) if the voltage, current, and/or the like of the analog output signal at the first comparator input is greater than that of the threshold signal at the second comparator input, and a low digital output signal (e.g., 0) if the voltage, current, and/or the like of the analog output signal at the first comparator input is less than that of the threshold signal at the second comparator input.
  • controller 120 may adjust the threshold by adjusting (and/or causing the DAC to adjust) the threshold signal at the second comparator input.
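  • A minimal software model of this comparator/DAC arrangement is sketched below; the 12-bit resolution and 3.3 V reference are assumed example values, not parameters from the disclosure.

```python
DAC_BITS = 12     # assumed DAC resolution
DAC_VREF = 3.3    # assumed DAC reference voltage, in volts

def dac_code_for(threshold_volts):
    """Convert a desired threshold voltage into the nearest DAC code."""
    code = round(threshold_volts / DAC_VREF * (2 ** DAC_BITS - 1))
    return max(0, min(2 ** DAC_BITS - 1, code))

def comparator_output(analog_input_volts, threshold_volts):
    """1 if the light detector's analog output exceeds the threshold, else 0."""
    return 1 if analog_input_volts > threshold_volts else 0
```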
  • comparator 110 may be part of at least one of LiDAR system 104 and/or controller 120. In some non-limiting embodiments or aspects, comparator 110 may be separate from (and connected to) at least one of (e.g., both of) LiDAR system 104 and/or controller 120.
  • comparator 110 may be part of autonomous vehicle 102. In some non-limiting embodiments or aspects, a plurality of comparators 110 may be included.
  • autonomous vehicle 102 and/or LiDAR system 104 may include a respective comparator 110 for each respective light detector 108.
  • Controller 120 may include one or more devices capable of receiving information from and/or communicating information to autonomous vehicle 102, LiDAR system 104, light emitter 106, light detector 108, comparator 110, and/or remote system 130 (e.g., directly, indirectly via communication network 190, and/or any other suitable communication technique).
  • controller 120 may include at least one computing device, such as a vehicle on-board computing device, an FPGA, a microcontroller, an ASIC, a portable and/or handheld device (e.g., a computer, a laptop, a personal digital assistant (PDA), a smartphone, a tablet, and/or the like), a server, and/or other like devices.
  • controller 120 may include a vehicle on-board computing device, as described herein. Additionally or alternatively, controller 120 may include at least one of a processor, an FPGA, a microcontroller, an ASIC, any combination thereof, and/or the like (e.g., connected to and/or in communication with the vehicle on-board computing device of autonomous vehicle 102). In some non-limiting embodiments or aspects, controller 120 may be part of autonomous vehicle 102. In some non-limiting embodiments or aspects, controller 120 may be part of LiDAR system 104. In some non-limiting embodiments or aspects, controller 120 may be separate from, connected to, and/or in communication with LiDAR system 104.
  • Remote system 130 may include one or more devices capable of receiving information from and/or communicating information to autonomous vehicle 102 (e.g., a computing device thereof, controller 120 thereof, and/or the like) and/or controller 120 (e.g., directly, indirectly via communication network 190, and/or any other suitable communication technique).
  • remote system 130 may include at least one computing device, such as a server, a group of servers, a portable and/or handheld device (e.g., a computer, a laptop, a personal digital assistant (PDA), a smartphone, a tablet, and/or the like), and/or other like devices.
  • remote system 130 may include at least one of a remote guidance system, a mapping system, a tracking system tracking the location of one or more autonomous vehicles, a logging system maintaining records for potential liability, and/or the like.
  • Communication network 190 may include one or more wired and/or wireless networks.
  • communication network 190 may include a cellular network (e.g., a long-term evolution (LTE) network, a third generation (3G) network, a fourth generation (4G) network, a fifth generation (5G) network, a code division multiple access (CDMA) network, and/or the like), a public land mobile network (PLMN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network (e.g., the public switched telephone network (PSTN)), a private network (e.g., a private network associated with a transaction service provider), an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, and/or the like, and/or a combination of these or other types of networks.
  • LiDAR system 104 may include at least one light emitter 106 and at least one light detector 108.
  • light emitter 106 may be configured to emit pulses of light, as described herein
  • light detector 108 may be configured to receive reflected pulses of light (e.g., the pulses of light from light emitter 106 reflected back to light detector 108), as described herein.
  • Light detector 108 may be configured to generate analog output signals based on the reflected pulses of light, as described herein.
  • Comparator 110 may be configured to receive the analog output signals from light detector 108 and generate digital output signals based on the analog output signals and a threshold, as described herein.
  • Controller 120 may be configured to repeatedly (e.g., continuously, periodically, and/or the like) receive digital output signals from comparator 110 based on the threshold and/or adjust the threshold, as described herein. For example, controller 120 may receive at least one digital output signal from comparator 110 based on the threshold, adjust the threshold, and receive at least one further digital output signal from comparator 110 based on the threshold as adjusted. Controller 120 may be further configured to determine at least one aggregation based on the digital output signals based on the different thresholds (e.g., the digital output signal(s) based on the initial threshold value, the further digital output signal(s) based on the adjusted threshold value, etc.), as described herein. For example, the aggregation(s) may include at least one statistic based on the digital output signals (e.g., for the different thresholds).
  • a plurality of light emitters 106 may be included. Additionally or alternatively, a plurality of light detectors 108 may be included.
  • controller 120 may be further configured to detect at least one object in an environment surrounding autonomous vehicle 102 based on the aggregation(s). In some non-limiting embodiments or aspects, controller 120 may be further configured to issue at least one command to cause autonomous vehicle 102 to perform at least one autonomous driving operation (e.g., brake, steer, accelerate, and/or the like) based on at least one of the aggregation(s) and/or detecting the object(s).
  • the pulses of light from light emitter 106 may include a first pulse of light associated with a first digital output from comparator 110 (e.g., a first pulse of light being emitted by light emitter 106, being reflected back to light detector 108, causing generation of a first analog output signal from light detector 108, and causing generation of the first digital output from comparator 110). Additionally or alternatively, at least one further pulse of light from emitter 106 may be associated with the at least one further digital output from comparator 110.
  • LiDAR system 104 may be configured to rotate light emitter 106 and light detector 108 so that a field of view of LiDAR system 104 rotates as light emitter 106 and light detector 108 rotate.
  • a pulse repetition rate of the pulses of light from light emitter 106 may be sufficiently high that the field of view when emitting the first pulse of light at least partially overlaps with the field of view when emitting the further pulse(s) of light.
  • controller 120 may repeatedly (e.g., dynamically, periodically, and/or the like) adjust the threshold and receive one or more respective digital output signal(s) based on the threshold as adjusted (e.g., for each threshold value). For example, controller 120 may adjust the threshold according to at least one of a linear search, a low-to-high search, a high-to-low search, a binary search, a sawtooth search, a biased search algorithm, any combination thereof, any other suitable search algorithm or search pattern, and/or the like.
  • a range may include a range from 0 volts (V) (or the minimum output voltage of light detector 108) to a maximum voltage of light detector 108.
  • a linear search may include sequentially adjusting the threshold to each possible value (or a subset of discrete values) in the range (e.g., 10% of the maximum voltage, 20% of the maximum voltage, 30% of the maximum voltage, 40% of the maximum voltage, 50% of the maximum voltage, 60% of the maximum voltage, 70% of the maximum voltage, 80% of the maximum voltage, 90% of the maximum voltage, and/or the like) and receiving at least one digital output signal at each of the aforementioned threshold values.
  • a low-to-high search may include a linear search starting from the lowest value in the range (and/or the lowest discrete value in the subset of discrete values in the range) and sequentially increasing the threshold to each value until reaching the highest value.
  • a high-to-low search may include a linear search starting from the highest value in the range (and/or the highest discrete value in the subset of discrete values in the range) and sequentially decreasing the threshold to each value until reaching the lowest value.
  • a binary search may include starting from a value in a middle of the range (e.g., 50% of the maximum voltage), receiving at least one digital output signal at that threshold value, eliminating half the range based on the digital output signal (e.g., eliminating the lower half of the range if the digital output signal is high (e.g., 1) or eliminating the upper half of the range if the digital output signal is low (e.g., 0)), and repeatedly adjusting the threshold to the center of the remaining portion of the range and receiving at least one digital output signal at that adjusted threshold to repeatedly discard half of the range until a termination condition is satisfied (e.g., a predetermined number of repetitions, a threshold value above a maximum target value, a threshold value below a minimum target value, and/or the like).
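  • As a non-authoritative sketch of the binary search described above, assuming a hypothetical measure(threshold) callable that performs one emit/receive cycle and returns the comparator's digital output, and a fixed number of repetitions as the termination condition:

    def binary_threshold_search(measure, low, high, repetitions=8):
        # Bracket the largest threshold value that still yields a detection.
        # measure(threshold) -> 1 for a high digital output, 0 for a low one.
        for _ in range(repetitions):              # termination: predetermined number of repetitions
            mid = (low + high) / 2.0
            if measure(mid):                      # detection: discard the lower half of the range
                low = mid
            else:                                 # no detection: discard the upper half of the range
                high = mid
        return low, high                          # the echo amplitude lies between these bounds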
  • a sawtooth search may be similar to a linear search (e.g., low-to-high search or high-to-low search), but after reaching the end of the range, the search starts again from the beginning of the range.
  • a biased search algorithm may account for false positives (e.g., at low threshold values and/or at low signal-to-noise ratio (SNR) levels).
  • controller 120 may repeatedly adjust the threshold according to a first linear search within a first range, as described herein. Thereafter, controller 120 may repeatedly adjust the threshold according to a second linear search within a second range.
  • the second range may be based on a first threshold value within the first range for which the respective digital output signal is associated with detection of an object (e.g., a high digital output signal) and a second threshold value within the first range for which the respective further digital output signal is associated with not detecting an object (e.g., a low digital output signal).
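  • A minimal sketch of that two-stage search, assuming the same hypothetical measure(threshold) callable; the ten-step granularity is an illustrative choice, not a value taken from this disclosure:

    def coarse_then_fine_search(measure, v_max, coarse_steps=10, fine_steps=10):
        # First linear search over the full range.
        coarse = [v_max * i / coarse_steps for i in range(1, coarse_steps + 1)]
        responses = {t: measure(t) for t in coarse}
        detected = [t for t, hit in responses.items() if hit]
        missed = [t for t, hit in responses.items() if not hit]
        if not detected or not missed:
            return None                           # no detection/no-detection transition found
        first = max(detected)                     # highest coarse threshold still detecting
        second = min(missed)                      # lowest coarse threshold not detecting
        # Second linear search over the sub-range bracketed by those two values.
        fine = [first + (second - first) * i / fine_steps for i in range(fine_steps + 1)]
        fine_hits = [t for t in fine if measure(t)]
        return max(fine_hits) if fine_hits else first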
  • the threshold may include at least one of a voltage value, a current value, a linear value of voltage above a noise voltage level, an exponential value of voltage above the noise level, a value of full width at half maximum (e.g., of the analog output signal), an SNR, a peak intensity (e.g., peak voltage intensity, peak current intensity, and/or the like), a pulse energy, any combination thereof, and/or the like.
  • the aggregation(s) may include at least one of a maximum threshold value for which the respective digital output signal is associated with detection of an object (e.g., a high digital output signal, such as 1), a minimum threshold value for which the respective digital output is associated with not detecting an object (e.g., a low digital output signal, such as 0), an average value of the respective digital outputs for each threshold value, an average time of flight (TOF) for each threshold value, an average time over threshold (TOT) for each threshold value, a time-based aggregation (e.g., a sum of values from previous pulses for a window of time and the value for the current pulse, a rolling window of values (e.g., a sum thereof), a running sum of values, a running attenuated sum of values, and/or the like), a time-domain aggregation (e.g., pseudo-binary time-of-detection, value-of-threshold, and/or the like), any combination thereof, and/or the like.
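  • Two of the time-based aggregations listed above (a rolling-window sum and a running attenuated sum) might be maintained per pulse as sketched below; the window size and decay factor are illustrative assumptions, not values given by this disclosure:

    from collections import deque

    class PulseAggregator:
        # Time-based aggregations over successive digital output signals.
        def __init__(self, window_size=16, decay=0.9):
            self.window = deque(maxlen=window_size)   # rolling window of recent outputs
            self.decay = decay                        # attenuation applied before each new pulse
            self.attenuated_sum = 0.0

        def update(self, digital_output):
            self.window.append(digital_output)
            self.attenuated_sum = self.attenuated_sum * self.decay + digital_output
            return {
                "rolling_sum": sum(self.window),
                "running_attenuated_sum": self.attenuated_sum,
            }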
  • controller 120 may be further configured to determine an approximate amplitude of the analog output signals based on the aggregation(s). For example, controller 120 may determine an approximate amplitude of the analog output signals based on at least one of a maximum threshold value for which the respective digital output signal is associated with detection of an object (e.g., a high digital output signal, such as 1), a minimum threshold value for which the respective digital output is associated with not detecting an object (e.g., a low digital output signal, such as 0), a TOT (e.g., for at least one of the aforementioned threshold values, such as the maximum threshold value for which the respective digital output signal is associated with a detection), any combination thereof, and/or the like.
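  • As an illustrative assumption (midpoint interpolation is not a formula stated by this disclosure), the bracketing threshold values could be combined into an amplitude estimate as follows:

    def approximate_amplitude(max_threshold_detected, min_threshold_missed):
        # The analog peak lies at or above the highest threshold that still produced a
        # detection and below the lowest threshold that did not; the midpoint between
        # those two bracketing values is one simple estimate of it.
        return 0.5 * (max_threshold_detected + min_threshold_missed)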
  • a time-to-digital converter (TDC) may be configured to determine at least one TOF based on at least one pulse of light of the pulses of light and at least one reflected pulse of light of the reflected pulses of light.
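  • For context, a TOF reported by such a converter is commonly turned into a range using the standard round-trip relation below; this is general physics rather than a method specific to this disclosure:

    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def tof_to_range_m(tof_seconds):
        # Convert a round-trip time of flight into a one-way range in meters.
        return SPEED_OF_LIGHT_M_PER_S * tof_seconds / 2.0

    # Example: a 200 ns round trip corresponds to roughly 30 m.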
  • the TDC may be configured to receive the aggregation(s), and determining the TOF may include determining the TOF based on the aggregation(s).
  • the TDC may be implemented by (e.g., part of) controller 120.
  • the TDC may be separate from controller 120 and/or implemented (e.g., completely, partially, and/or the like) by another computing device separate from or including controller 120, such as a vehicle on-board computing device of autonomous vehicle 102 and/or the like.
  • controller 120 may be further configured to determine a target threshold based on the aggregation(s).
  • the target threshold may include at least one of an optimal threshold value, a threshold value that increases an SNR, a threshold value that reduces false detections, a threshold that increases detections of certain objects (e.g., low-reflectivity objects), any combination thereof, and/or the like.
  • a digital-to-analog converter may be included.
  • the DAC may be implemented by (e.g., part of) controller 120.
  • the DAC may be separate from controller 120 and/or implemented (e.g., completely, partially, and/or the like) by another computing device separate from or including controller 120, such as a vehicle on-board computing device of autonomous vehicle 102 and/or the like.
  • the DAC may be connected to controller 120.
  • an output of light detector 108 may be connected to a first comparator input of comparator 110, and the DAC may be connected to a second comparator input of comparator 110.
  • Controller 120 may be configured to adjust the threshold by controlling the DAC to adjust a voltage at the second comparator input of the comparator 110.
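  • A minimal sketch of that DAC control, assuming (purely for illustration) a 12-bit DAC with a 3.3 V reference; the resolution and reference voltage are not values given by this disclosure:

    def threshold_to_dac_code(threshold_volts, v_ref=3.3, resolution_bits=12):
        # Quantize a desired comparator threshold voltage to an integer DAC code.
        full_scale = (1 << resolution_bits) - 1
        code = round(threshold_volts / v_ref * full_scale)
        return max(0, min(full_scale, code))      # clamp to the DAC's valid code range

    # Example: threshold_to_dac_code(1.0) == 1241 for a 3.3 V, 12-bit DAC.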
  • The number and arrangement of systems, devices, sensors, components, and/or networks shown in FIG. 1 are provided as an example. There may be additional systems, devices, sensors, components, and/or networks; fewer systems, devices, sensors, components, and/or networks; different systems, devices, sensors, components, and/or networks; and/or differently arranged systems, devices, sensors, components, and/or networks than those shown in FIG. 1. Furthermore, two or more systems, devices, sensors, or components shown in FIG. 1 may be implemented within a single system, device, sensor, or component, or a single system, device, sensor, or component shown in FIG. 1 may be implemented as multiple, distributed systems, devices, sensors, or components.
  • Additionally or alternatively, a set of systems (e.g., one or more systems), a set of devices (e.g., one or more devices), a set of sensors (e.g., one or more sensors), or a set of components (e.g., one or more components) of system 100 may perform one or more functions described as being performed by another set of systems, another set of devices, another set of sensors, or another set of components of system 100.
  • FIG. 2 is an illustration of an illustrative system architecture 200 for a vehicle, according to some non-limiting embodiments or aspects of the presently disclosed subject matter.
  • autonomous vehicle 102 may include a same or similar system architecture as that of system architecture 200 shown in FIG. 2.
  • comparator 110 and/or controller 120 may be the same as, similar to, or part of vehicle on-board computing device 220.
  • light emitter 106, light detector 108, comparator 110, and/or controller 120 may be part of or connected to LiDAR 264.
  • LiDAR system 104 may be the same as or similar to LiDAR 264.
  • system architecture 200 may include additional components, fewer components, different components, or differently arranged components than those shown in FIG. 2. Additionally, or alternatively, a set of components (e.g., one or more components) of system architecture 200 may perform one or more functions described as being performed by another set of components of system architecture 200.
  • system architecture 200 may include engine or motor 202 and various sensors 204-218 for measuring various parameters of the vehicle.
  • the sensors may include, for example, engine temperature sensor 204, battery voltage sensor 206, engine rotations per minute (RPM) sensor 208, and/or throttle position sensor 210.
  • the vehicle may have an electric motor 202, and may have sensors such as battery monitoring sensor 212 (e.g., to measure current, voltage, and/or temperature of the battery), motor current sensor 214, motor voltage sensor 216, and/or motor position sensors 218, such as resolvers and encoders.
  • System architecture 200 may include operational parameter sensors, which may be common to both types of vehicles, and may include, for example: position sensor 236 such as an accelerometer, gyroscope, and/or inertial measurement unit; speed sensor 238; and/or odometer sensor 240.
  • System architecture 200 may include clock 242 that the system architecture 200 uses to determine vehicle time during operation.
  • Clock 242 may be encoded into the vehicle on-board computing device 220, may be a separate device, or multiple clocks may be available.
  • System architecture 200 may include various sensors that operate to gather information about an environment in which the vehicle is operating and/or traveling. These sensors may include, for example: location sensor 260 (e.g., a Global Positioning System (GPS) device); object detection sensors such as one or more cameras 262; LiDAR sensor system 264; and/or radar and/or sonar system 266.
  • the sensors may include environmental sensors 268, such as a precipitation sensor and/or ambient temperature sensor.
  • the object detection sensors may enable the system architecture 200 to detect objects that are within a given distance range of the vehicle in any direction, and the environmental sensors 268 may collect data about environmental conditions within an area of operation and/or travel of the vehicle.
  • vehicle on-board computing device 220 analyzes the data captured by the sensors and optionally controls operations of the vehicle based on results of the analysis. For example, vehicle on-board computing device 220 may control: braking via a brake controller 222; direction via steering controller 224; speed and acceleration via throttle controller 226 (e.g., in a gas-powered vehicle) or motor speed controller 228 such as a current level controller (e.g., in an electric vehicle); differential gear controller 230 (e.g., in vehicles with transmissions); and/or other controllers such as auxiliary device controller 254.
  • Geographic location information may be communicated from location sensor 260 to vehicle on-board computing device 220, which may access a map of the environment including map data that corresponds to the location information to determine known fixed features of the environment, such as streets, buildings, stop signs, and/or stop/go signals.
  • Captured images from cameras 262 and/or object detection information captured from sensors, such as LiDAR sensor system 264 and/or radar and/or sonar system 266, are communicated from those sensors to vehicle on-board computing device 220.
  • the object detection information and/or captured images are processed by on-board computing device 220 to detect objects in proximity to the vehicle. Any known or to be known techniques for making an object detection based on sensor data and/or captured images can be used in the embodiments or aspects disclosed in this document.
  • Vehicle on-board computing device 220 may generate new map data (e.g., based on object detection data captured from sensors such as LiDAR 264, captured images from cameras 262, the map data, and/or the like). Additionally or alternatively, vehicle on-board computing device 220 may communicate sensor data (e.g., object detection data captured from sensors such as LiDAR 264, captured images from cameras 262, and/or the like) to a remote system (e.g., remote system 119), which may generate new map data based on the sensor data.
  • FIG. 3 is an illustration of an illustrative LiDAR system 300.
  • LiDAR system 104 of FIG. 1 and/or LiDAR 264 of FIG. 2 may be the same as or substantially similar to LiDAR system 300.
  • light emitter 106 and/or light detector 108 may be the same as or similar to light emitter system 304 and/or light detector 308, respectively.
  • LiDAR system 300 may include housing 306, which may be rotatable 360° about a central axis, such as hub or axle 316.
  • Housing 306 may include an emitter/receiver aperture 312 made of a material transparent to light.
  • Although FIG. 3 shows a single emitter/receiver aperture, non-limiting embodiments or aspects of the present disclosure are not limited in this regard. In other scenarios, multiple apertures for emitting and/or receiving light may be provided. Either way, LiDAR system 300 can emit light through one or more of aperture(s) 312 and receive reflected light back toward one or more of aperture(s) 312 as housing 306 rotates around the internal components.
  • the outer shell of housing 306 may be a stationary dome, at least partially made of a material that is transparent to light, with rotatable components inside of housing 306.
  • Inside the rotating shell or stationary dome is light emitter system 304, which is configured and positioned to generate and emit pulses of light through aperture 312 or through the transparent dome of housing 306 via one or more laser emitter chips or other light emitting devices.
  • Light emitter system 304 may include any number of individual emitters (e.g., 8 emitters, 64 emitters, 128 emitters, etc.). The emitters may emit light of substantially the same intensity or of varying intensities.
  • the individual beams emitted by light emitter system 304 may have a well-defined state of polarization that is not the same across the entire array. As an example, some beams may have vertical polarization and other beams may have horizontal polarization.
  • LiDAR system 300 may include light detector 308 containing a photodetector or array of photodetectors positioned and configured to receive light reflected back into the system.
  • Light emitter system 304 and light detector 308 may rotate with the rotating shell, or light emitter system 304 and light detector 308 may rotate inside the stationary dome of housing 306.
  • One or more optical element structures 310 may be positioned in front of light emitter system 304 and/or light detector 308 to serve as one or more lenses and/or waveplates that focus and direct light that is passed through optical element structure 310.
  • One or more optical element structures 310 may be positioned in front of a mirror to focus and direct light that is passed through optical element structure 310.
  • LiDAR system 300 may include optical element structure 310 positioned in front of a mirror and connected to the rotating elements of LiDAR system 300 so that optical element structure 310 rotates with the mirror.
  • optical element structure 310 may include multiple such structures (e.g., lenses, waveplates, etc.).
  • multiple optical element structures 310 may be arranged in an array on or integral with the shell portion of housing 306.
  • each optical element structure 310 may include a beam splitter that separates light that the system receives from light that the system generates.
  • the beam splitter may include, for example, a quarter-wave or half-wave waveplate to perform the separation and ensure that received light is directed to the receiver unit rather than to the emitter system (which could occur without such a waveplate as the emitted light and received light should exhibit the same or similar polarizations).
  • LiDAR system 300 may include power unit 318 to power the light emitter system 304, motor 316, and electronic components.
  • LiDAR system 300 may include an analyzer 314 with elements such as processor 322 and non-transitory computer-readable memory 320 containing programming instructions that are configured to enable the LiDAR system 300 to receive data collected by the light detector unit, analyze the data to measure characteristics of the light received, and generate information that a connected system can use to make decisions about operating in an environment from which the data was collected.
  • Analyzer 314 may be integral with the LiDAR system 300 as shown, or some or all of analyzer 314 may be external to LiDAR system 300 and communicatively connected to LiDAR system 300 via a wired and/or wireless communication network or link.
  • Computer system 400 can correspond to one or more devices of (e.g., one or more devices of a system of) autonomous vehicle 102, controller 120, and/or remote system 130.
  • one or more devices of (e.g., one or more devices of a system of) autonomous vehicle 102, controller 120, and/or remote system 130 may include at least one computer system 400 and/or at least one component of computer system 400.
  • Computer system 400 can be any computer capable of performing the functions described herein.
  • Computer system 400 may include one or more processors (also called central processing units, or CPUs), such as a processor 404.
  • Processor 404 is connected to a communication infrastructure or bus 406.
  • One or more processors 404 may each be a graphics processing unit (GPU).
  • a GPU may include a processor that is a specialized electronic circuit designed to process mathematically intensive applications.
  • the GPU may have a parallel structure that is efficient for parallel processing of large blocks of data, such as mathematically intensive data common to computer graphics applications, images, videos, etc.
  • one or more processors 404 (e.g., CPU, GPU, and/or the like) may include a hardware accelerator.
  • a hardware accelerator may include an artificial intelligence (AI) accelerator.
  • Computer system 400 also may include input/output device(s) 403 (e.g., user input/output device(s), such as monitors, keyboards, pointing devices, etc.), that communicate with communication infrastructure 406 through input/output interface(s) 402 (e.g., user input/output interface(s)).
  • Computer system 400 also may include a main or primary memory 408, such as random access memory (RAM).
  • Main memory 408 may include one or more levels of cache.
  • Main memory 408 may have stored therein control logic (i.e., computer software) and/or data.
  • Computer system 400 may also include one or more secondary storage devices or memory 410.
  • Secondary memory 410 may include, for example, a hard disk drive 412 and/or a removable storage device or drive 414.
  • Removable storage drive 414 may be a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, a tape backup device, and/or any other storage device/drive.
  • Removable storage drive 414 may interact with a removable storage unit 418.
  • Removable storage unit 418 may include a computer usable or readable storage device having stored thereon computer software (control logic) and/or data.
  • Removable storage unit 418 may be a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, removable solid state drive (SSD), removable hard disk drive, and/or any other computer data storage device.
  • Removable storage drive 414 reads from and/or writes to removable storage unit 418 in any suitable manner.
  • secondary memory 410 may include other means, instrumentalities, or other approaches for allowing computer programs and/or other instructions and/or data to be accessed by computer system 400.
  • Such means, instrumentalities, or other approaches may include, for example, a removable storage unit 422 and an interface 420.
  • the removable storage unit 422 and the interface 420 may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM or PROM) and associated socket, a memory stick and USB port, a memory card and associated memory card slot, and/or any other removable storage unit and associated interface.
  • Computer system 400 may further include a communication or network interface 424.
  • Communication interface 424 may enable computer system 400 to communicate and interact with any combination of remote devices, remote networks, remote entities, etc. (individually and collectively referenced by reference number 428).
  • communication interface 424 may allow computer system 400 to communicate with remote devices 428 over communications path 426, which may be wired and/or wireless, and which may include any combination of LANs, WANs, the Internet, etc. Control logic and/or data may be transmitted to and from computer system 400 via communication path 426.
  • a tangible, non-transitory apparatus or article of manufacture comprising a tangible, non-transitory computer useable or readable medium having control logic (software) stored thereon also may be referred to herein as a computer program product or program storage device.
  • This may include, but is not limited to, computer system 400, main memory 408, secondary memory 410, and removable storage units 418 and 422, as well as tangible articles of manufacture embodying any combination of the foregoing.
  • control logic when executed by one or more data processing devices (such as computer system 400), may cause such data processing devices to operate as described herein.
  • computer system 400 may include additional components, fewer components, different components, or differently arranged components than those shown in FIG. 4. Additionally, or alternatively, a set of components (e.g., one or more components) of computer system 400 may perform one or more functions described as being performed by another set of components of computer system 400.
  • FIG. 5 is a flowchart of a non-limiting embodiment or aspect of a process 500 for a dynamic detection threshold for a sensor (e.g., LiDAR) of an autonomous vehicle, according to some non-limiting embodiments or aspects of the presently disclosed subject matter.
  • one or more of the steps of process 500 may be performed (e.g., completely, partially, and/or the like) by controller 120.
  • one or more of the steps of process 500 may be performed (e.g., completely, partially, and/or the like) by another system, another device, another group of systems, or another group of devices, separate from or including controller 120, such as LiDAR system 104, light emitter 106, light detector 108, comparator 110, remote system 130, and/or the like.
  • process 500 may include additional steps, fewer steps, different steps, or differently arranged steps than those shown in FIG. 5.
  • process 500 may include emitting at least one pulse.
  • light emitter 106 may emit at least one pulse of light, as described herein.
  • controller 120 may control light emitter 106 to emit the pulse(s) of light.
  • a plurality of light emitters 106 may emit a plurality of pulses of light (e.g., simultaneously, sequentially, independently, and/or the like), as described herein.
  • process 500 may include receiving at least one reflected pulse and/or generating at least one analog output signal based on the reflected pulse(s).
  • light detector 108 may receive at least one reflected pulse of light, as described herein. Additionally or alternatively, light detector 108 may generate at least one analog output signal based on the reflected pulse(s) of light, as described herein.
  • the reflected pulse(s) of light may include the pulses of light (e.g., emitted from light emitter 106) reflected back to LiDAR system 104 and/or light detector 108.
  • a plurality of light detectors 108 may receive at least one reflected pulse of light (e.g., a plurality of reflected pulses of light), as described herein. In some non-limiting embodiments or aspects, the light detectors 108 may generate at least one analog output signal (e.g., a plurality of analog output signals) based on the reflected pulse(s) of light.
  • process 500 may include receiving at least one analog output signal and/or generating at least one digital output signal based on the analog output signal(s) and a threshold.
  • comparator 110 may receive the analog output signal(s) from light detector 108, as described herein. Additionally or alternatively, comparator 110 may generate at least one digital output signal based on the analog output signal(s) and a threshold, as described herein.
  • a plurality of comparators 110 may receive a plurality of analog output signals from a plurality of light detectors 108. Additionally or alternatively, each comparator 110 may generate a respective digital output signal based on the respective analog output signal and a respective threshold, as described herein.
  • process 500 may include receiving at least one digital output signal.
  • controller 120 may receive at least one digital output signal (e.g., based on the threshold) from comparator 110.
  • controller 120 may receive a plurality of digital output signals from a plurality of comparators 110. In some non-limiting embodiments or aspects, a plurality of controllers 120 may be included, and each controller 120 may receive at least one digital output signal from at least one comparator 110 (e.g., a controller 120 for each comparator 110 and/or a controller 120 for each subset of a plurality of comparators 110).
  • steps 502-508 may be repeated.
  • steps 502-508 may be repeated a predetermined number of times (e.g., to receive a predetermined number of digital output signals associated with a predetermined number of pulses of light).
  • controller 120 may determine whether to repeat steps 502-508.
  • process 500 may include adjusting the threshold.
  • controller 120 may adjust the threshold of comparator 110, as described herein.
  • controller 120 may adjust the threshold of comparator 110 according to a search algorithm or search pattern, as described herein.
  • a plurality of comparators 110 may be included, and controller 120 may adjust the threshold of at least one of (e.g., each of) comparators 110.
  • controller 120 may adjust the threshold of each comparator 110 to be the same threshold value as the other comparators 110. Additionally or alternatively, controller 120 may adjust the threshold of each of comparator 110 independently, such that at least some comparators 110 may have a threshold value different than at least some other comparators 110.
  • steps 502-510 may be repeated.
  • steps 502-510 may be repeated a number of times based on the search algorithm and/or search pattern according to which controller 120 is adjusting the threshold. Additionally or alternatively, controller 120 may determine whether to repeat steps 502-510.
  • process 500 may include determining at least one aggregation.
  • controller 120 may determine at least one aggregation based on the digital output signal(s), as described herein.
  • controller 120 may adjust the threshold of comparator 110 based on the aggregation(s) (e.g., step 510 may follow step 512), as described herein.
  • steps 502-508 and 512 may be repeated or steps 502-512 may be repeated.
  • controller 120 may determine to repeat steps 502-508 and 512 or steps 502-512 based on the aggregation(s). Additionally or alternatively, process 500 may be repeated continuously (e.g., during operation of autonomous vehicle 102).
  • FIGS. 6A-6C are diagrams of exemplary implementations 600a-600c of a system for a dynamic detection threshold for a sensor (e.g., LiDAR) of an autonomous vehicle, according to some non-limiting embodiments or aspects of the presently disclosed subject matter.
  • implementations 600a-600c may include LiDAR system 604, light emitter 606, light detector 608, comparator 610, DAC 612, and/or controller 620.
  • LiDAR system 604 may be the same as or similar to LiDAR system 104.
  • light emitter 606 may be the same as or similar to light emitter 106.
  • light detector 608 may be the same as or similar to light detector 108.
  • comparator 610 may be the same as or similar to comparator 110.
  • controller 620 may be the same as or similar to controller 120.
  • the number and arrangement of components shown in FIGS. 6A-6C are provided as an example.
  • implementations 600a-600c may include additional components, fewer components, different components, or differently arranged components than those shown in FIGS. 6A-6C. Additionally or alternatively, a set of components (e.g., one or more components) of implementations 600a-600c may perform one or more functions described as being performed by another set of components of implementations 600a-600c.
  • At least one of (e.g., all of) light emitter 606, light detector 608, comparator 610, DAC 612, and/or controller 620 may be part of LiDAR system 604. In some non-limiting embodiments or aspects, at least one of light emitter 606, light detector 608, comparator 610, DAC 612, and/or controller 620 may be separate from LiDAR system 604. For example, as shown in FIG. 6A, light emitter 606, light detector 608, comparator 610, and controller 620 may be part of LiDAR system 604. As shown in FIG. 6B, light emitter 606, light detector 608, comparator 610, and DAC 612 may be part of LiDAR system 604, and controller 620 may be separate from LiDAR system 604. As shown in FIG. 6C, light emitter 606 and light detector 608 may be part of LiDAR system 604, and comparator 610, DAC 612, and controller 620 may be separate from LiDAR system 604.
  • LiDAR system 604 may include at least one light emitter 606 and at least one light detector 608, as described herein.
  • light emitter 606 may be configured to emit pulses of light, as described herein
  • light detector 608 may be configured to receive reflected pulses of light (e.g., the pulses of light from light emitter 606 reflected back to light detector 608), as described herein.
  • Light detector 608 may be configured to generate analog output signals based on the reflected pulses of light, as described herein.
  • Comparator 610 may be configured to receive the analog output signals from light detector 608 and generate digital output signals based on the analog output signals and a threshold, as described herein. For example, an output of light detector 608 may be connected to a first comparator input of comparator 610, as described herein. Additionally or alternatively, controller 620 and/or DAC 612 may be connected to a second comparator input of comparator 610, as described herein. For example, DAC 612 may be connected to controller 620, and DAC 612 may be connected to the second comparator input of comparator 610, as described herein. Additionally or alternatively, controller 620 may include DAC 612 (e.g., DAC 612 may be part of and/or integrated with controller 620).
  • controller 620 may be configured to adjust the threshold, as described herein.
  • controller 620 may adjust the threshold by adjusting and/or controlling DAC 612 to adjust a voltage at the second comparator input of comparator 610.
  • controller 620 may be configured to repeatedly (e.g., continuously, periodically, and/or the like) receive digital output signals from comparator 610 based on the threshold and/or to repeatedly adjust the threshold, as described herein.
  • FIG. 7 is an exemplary graph 700 of detection threshold and false detection probability, according to some non-limiting embodiments or aspects of the presently disclosed subject matter.
  • graph 700 may include a vertical axis associated with a detection threshold above a noise level (e.g., in volts (V)) and a horizontal axis associated with a probability of a false detection (e.g., false alarm).
  • increasing the detection threshold may reduce the probability of a false detection. Additionally or alternatively, decreasing the detection threshold may increase the probability of a false detection.
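  • The qualitative trend shown in graph 700 can be illustrated with the standard false-alarm model for a threshold detector in Gaussian noise, in which the false-detection probability is the Q-function of the threshold expressed in noise standard deviations; this model is an illustrative assumption, not a curve taken from FIG. 7:

    import math

    def false_alarm_probability(threshold_above_noise, noise_sigma):
        # Probability that Gaussian noise alone exceeds the detection threshold.
        z = threshold_above_noise / noise_sigma
        return 0.5 * math.erfc(z / math.sqrt(2.0))    # Q-function expressed via erfc

    # Example: raising the threshold from 2 to 3 noise sigmas lowers the
    # single-sample false-detection probability from about 2.3% to about 0.13%.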
  • FIG. 8 is an exemplary graph 800 of detection probability and false detection probability, according to some non-limiting embodiments or aspects of the presently disclosed subject matter.
  • graph 800 may include a vertical axis associated with a probability of a detection (e.g., true and/or accurate detection of an object) and a horizontal axis associated with a probability of a false detection (e.g., false alarm).
  • graph 800 may include multiple curves 801-805 for multiple different signal-to-noise ratios.
  • graph 800 may include a first curve 801 for a signal-to-noise ratio (SNR) of 0, a second curve 802 for an SNR of 1, a third curve 803 for an SNR of 2, a fourth curve 804 for an SNR of 3, and a fifth curve 805 for an SNR of 6.
  • the SNR associated with each curve may be the threshold, as described herein.
  • As shown in graph 800, an increased threshold (e.g., a higher value for SNR) may be associated with a decreased probability of a false detection and a decreased probability of detection. Additionally or alternatively, a decreased threshold (e.g., a lower value for SNR) may be associated with an increased probability of a false detection and an increased probability of detection.
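  • One common closed-form approximation for such detection-versus-false-detection curves (a Gaussian signal-plus-noise model) is Pd = Q(Q^-1(Pfa) - SNR); the sketch below is offered only to illustrate why higher-SNR curves sit above lower-SNR curves and is not the model used to generate FIG. 8:

    import math

    def q_function(x):
        return 0.5 * math.erfc(x / math.sqrt(2.0))

    def q_inverse(p, lo=-10.0, hi=10.0):
        # Numerically invert the (monotonically decreasing) Q-function by bisection.
        for _ in range(100):
            mid = (lo + hi) / 2.0
            if q_function(mid) > p:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    def detection_probability(p_false_alarm, snr):
        # Gaussian approximation of one point on the curve for a given SNR (in sigma units).
        return q_function(q_inverse(p_false_alarm) - snr)

    # Example: at a false-detection probability of 1e-3, an SNR of 3 gives a detection
    # probability near 0.46, while an SNR of 6 gives a detection probability near 0.998.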
  • characterization of the performance of a sensor may be performed to ensure that the sensor is functioning correctly.
  • LiDAR characterization can be performed on targets of varying reflectivity to quantify sensor performance.
  • LiDAR characterization can be performed by painting targets using specific paint that represents discrete reflectivity values. These targets can be placed in the LiDAR field of view and the reflectivity of the targets can be used when quantifying sensor performance.
  • target objects painted with different reflectivity paint in geometric patterns may be required to calibrate a LiDAR sensor's intrinsic parameters. Because the target object is often designed based on the spatial resolution of the sensor, each different LiDAR sensor could require a different geometric pattern. Further, due to the differences in geometric patterns, calibrating a LiDAR sensor may require multiple iterations, and even then, the calibration may not be accurate. In addition, calibrating a LiDAR sensor may require the use of multiple geometric patterns and/or orientations of the LiDAR sensor, which may be time-consuming.
  • the present disclosure provides systems, methods, and computer program products that determine a characteristic of a sensor, such as a LiDAR sensor, based on a display of an e-ink display device.
  • the present disclosure includes a sensor analysis system that includes a memory and at least one processor coupled to the memory and configured to receive data associated with a first display of an e-ink display device and data associated with a second display of the e-ink display device, process the data associated with the first display of the e-ink display and the data associated with the second display of the e-ink display device to provide a quantitative result, and determine a characteristic of a sensor system based on the quantitative result.
  • the sensor system is a LiDAR sensor system.
  • the sensor analysis system when processing the data associated with the first display of the e-ink display and the data associated with the second display of the e-ink display device to provide the quantitative result, is configured to compare the data associated with the first display of the e-ink display and the data associated with the second display of the e-ink display device and determine a metric associated with a difference between the data associated with the first display of the e-ink display and the data associated with the second display of the e-ink display device. In some non-limiting embodiments, the characteristic of the sensor system is based on the metric.
  • the sensor analysis system is further configured to control the e-ink display device to provide the first display of an e-ink display device and control the e-ink display device to provide the second display of the e-ink display device.
  • the first display of the e-ink display device includes a first pattern associated with a first object positioned at a first distance from the sensor system and the second display of the e-ink display device includes a second pattern associated with a second object positioned at a second distance from the sensor system.
  • the first pattern associated with the first object positioned at the first distance from the sensor system includes a first pattern having a first value of reflectivity and the second pattern associated with the second object positioned at the second distance from the sensor system includes a second pattern having a second value of reflectivity.
  • the sensor analysis system is further configured to determine a calibration setting associated with a sensor of a sensor system, such as a perception component of an autonomous vehicle and/or a robotic device. In some non-limiting embodiments, the sensor analysis system is further configured to adjust the calibration setting associated with the sensor.
  • the sensor analysis system eliminates the need for the use of static targets and may provide a more accurate procedure for determining a characteristic of a sensor, such as a LiDAR sensor, based on a display of an e-ink display device.
  • the sensor analysis system may control the e-ink display device and/or a sensor system to reduce an amount of time and/or processing resources required to determine a characteristic of a sensor.
  • the sensor analysis system may provide more robust, faster, and more accurate calibration, qualification, and/or commissioning methodologies.
  • Such a system may lead to improved sensor performance and accuracy, improved AV solutions, and further to expedited vehicle commissioning, enabling increased scalability by deploying more AVs in a shorter period of time.
  • FIG. 9 is a diagram of an example environment 1100 in which systems, methods, products, apparatuses, and/or devices described herein, may be implemented.
  • environment 1100 may include sensor analysis system 1102, sensor system 1104, e-ink display device 1106, and communication network 1108.
  • Sensor analysis system 1102 may include one or more devices capable of communicating with sensor system 1104 and/or e-ink display device 1106 via communication network 1108.
  • sensor analysis system 1102 may include a computing device, such as a server, a group of servers, and/or other like devices.
  • sensor analysis system 1102 may communicate with sensor system 1104 via an application (e.g., a mobile application) stored on sensor analysis system 1102 and/or sensor system 1104.
  • Sensor system 1104 may include one or more devices capable of communicating with sensor analysis system 1102 via communication network 1108.
  • sensor system 1104 may include a computing device, such as a mobile device, a desktop computer, and/or other like devices.
  • sensor system 1104 may include one or more sensors, such as a LiDAR sensor, a light sensor, an image sensor (e.g., an image capture device, such as a camera), a laser sensor, a barcode reader, an audio sensor, and/or the like.
  • sensor analysis system 1102 may be a component of sensor system 1104.
  • sensor system 1104 may include an optical remote sensing system.
  • E-ink display device 1106 may include one or more devices capable of communicating with sensor analysis system 1102 and/or sensor system 1104 via communication network 1108.
  • e-ink display device 1106 may include a computing device, such as a server, a group of servers, and/or other like devices.
  • e-ink display device 1106 may include an electrophoretic display (e.g., an e-ink display).
  • An electrophoretic display may be a display device that is configured to mimic the appearance of ordinary ink on paper by reflecting ambient light in the same way as paper.
  • e-ink display device 1106 may include microcapsules, which may vary (e.g., digitally vary) the reflectivity of e-ink display device 1106 (e.g., a screen, a panel, etc. of e-ink display device 1106).
  • e-ink display device 1106 can be used to create a display, in the form of a target for a sensor, having variable reflectivity.
  • the target may be used for characterization of the performance of a sensor, digital signal processing (DSP) tuning for development of a sensor, and/or for intrinsic calibration of a sensor.
  • a display on a screen of e-ink display device 1106 may be able to be detected by an image capture device (e.g., a camera, such as a Red, Green, Blue (RGB) camera, a LiDAR sensor, and/or other sensor modalities), and sensor analysis system 1102 may use e-ink display device 1106 for computing a relative orientation between multiple sensor modalities.
  • Communication network 1108 may include one or more wired and/or wireless networks.
  • communication network 1108 may include a cellular network (e.g., a long-term evolution (LTE) network, a third generation (3G) network, a fourth generation (4G) network, a fifth generation (5G) network, a code division multiple access (CDMA) network, etc.), a public land mobile network (PLMN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network (e.g., the public switched telephone network (PSTN)), a private network, an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, and/or the like, and/or a combination of these or other types of networks.
  • an autonomous vehicle may incorporate the functionality of sensor analysis system 1102 such that the autonomous vehicle can operate without communication to or from sensor analysis system 1102.
  • a set of devices and/or systems (e.g., one or more devices or systems) of environment 1100 may perform one or more functions described as being performed by another set of devices and/or systems of environment 1100.
  • FIG. 10 is a diagram of an architecture for a computing device 1400.
  • Computing device 1400 can correspond to sensor analysis system 1102 (e.g., one or more devices of sensor analysis system 1102), sensor system 1104 (e.g., one or more devices of sensor system 1104), and/or e-ink display device 1106.
  • one or more devices of (e.g., one or more devices of a system of) sensor analysis system 1102, sensor system 1104, and/or e-ink display device 1106 (e.g., one or more devices of system architecture 200 in FIG. 2, etc.) can include at least one computing device 1400 and/or at least one component of computing device 1400.
  • an autonomous vehicle can include at least one computing device 1400 and/or at least one component of computing device 1400.
  • computing device 1400 may include additional components, fewer components, different components, or differently arranged components than those shown in FIG. 10. Additionally or alternatively, a set of components (e.g., one or more components) of computing device 1400 may perform one or more functions described as being performed by another set of components of computing device 1400.
  • computing device 1400 comprises user interface 1402, Central Processing Unit (CPU) 1406, system bus 1410, memory 1412 connected to and accessible by other portions of computing device 1400 through system bus 1410, system interface 1460, and hardware entities 1414 connected to system bus 1410.
  • User interface 1402 can include input devices and output devices, which facilitate user-software interactions for controlling operations of computing device 1400.
  • the input devices may include, but are not limited to, physical and/or touch keyboard 1450.
  • the input devices can be connected to computing device 1400 via a wired and/or wireless connection (e.g., a Bluetooth® connection).
  • the output devices may include, but are not limited to, speaker 1452, display 1454, and/or light emitting diodes 1456.
  • System interface 1460 is configured to facilitate wired and/or wireless communications to and from external devices (e.g., network nodes, such as access points, etc.).
  • Hardware entities 1414 may perform actions involving access to and use of memory 1412, which can be a random access memory (RAM), a disk drive, flash memory, a compact disc read only memory (CD-ROM) and/or another hardware device that is capable of storing instructions and data.
  • Hardware entities 1414 can include disk drive unit 1416 comprising computer-readable storage medium 1418 on which is stored one or more sets of instructions 1420 (e.g., software code) configured to implement one or more of the methodologies, procedures, or functions described herein.
  • Instructions 1420, application(s) 1424, and/or parameter(s) 1426 can also reside, completely or at least partially, within memory 1412 and/or within CPU 1406 during execution and/or use thereof by computing device 1400.
  • Memory 1412 and CPU 1406 may include machine-readable media (e.g., non-transitory computer-readable media).
  • machine-readable media may refer to a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions 1420.
  • machine-readable media may refer to any medium that is capable of storing, encoding, or carrying a set of instructions 1420 for execution by computing device 1400 and that cause computing device 1400 to perform any one or more of the methodologies of the present disclosure.
  • FIG. 11 is a flowchart of non-limiting embodiments of a process 1500 for determining a characteristic of a sensor, such as a LiDAR sensor, based on a display of an e-ink display device.
  • one or more of the steps of process 1500 may be performed (e.g., completely, partially, etc.) with sensor analysis system 1102 (e.g., one or more devices of sensor analysis system 1102, etc.).
  • one or more of the steps of process 1500 may be performed (e.g., completely, partially, etc.) with another device or a group of devices separate from or including sensor analysis system 1102, such as sensor system 1104 and/or e-ink display device 1106.
  • one or more of the steps of process 1500 may be performed with an autonomous vehicle (e.g., system architecture 200 of an autonomous vehicle, etc.).
  • process 1500 includes receiving data associated with a first display of an e-ink display device and data associated with a second display of the e-ink display device.
  • sensor analysis system 1102 may receive data associated with a first display of e-ink display device 1106 and/or data associated with a second display of the e-ink display device 1106.
  • sensor analysis system 1102 may receive the data associated with a display of e-ink display device 1106 based on a sensor of sensor system 1104 reading e-ink display device 1106.
  • e-ink display device 1106 may provide a display (e.g., information in the form of a pattern) on a screen of e-ink display device 1106.
  • Sensor system 1104 may read the display on the screen of e-ink display device 1106 and provide data associated with the display on the screen of e-ink display device 1106 to sensor analysis system 1102.
  • Sensor analysis system 1102 may receive the data associated with the display on the screen of e-ink display device 1106 based on sensor system 1104 providing (e.g., transmitting) the data associated with the display on the screen of e-ink display device 1106.
  • a display of e-ink display device 1106 may include one or more patterns that are displayed for enhancing a calibration of a sensor (e.g., a sensor of sensor system 1104) and/or a characterization of data provided by the sensor.
  • the data associated with the display of e-ink display device 1106 may also include data for use in calibrating and commissioning a sensor device (e.g., preconfigured calibration patterns and the like).
  • a display (e.g., a first display, a second display, etc.) of e-ink display device 1106 may include information that is provided (e.g., shown, displayed, output, projected, etc.) on a screen of e-ink display device 1106.
  • the display of e-ink display device 1106 may include a pattern of information (e.g., a pattern of shapes, a pattern of alternating colors, such as black and white colors, a pattern of shades of colors, etc.) that is to be read by sensor system 1104 (e.g., a sensor of sensor system 1104).
  • a first display of e-ink display device 1106 may be different from a second display of e-ink display device 1106.
  • the first display of e-ink display device 1106 may include a first pattern of information that is different from a second pattern of information included in the second display of e-ink display device 1106.
  • the first display of e-ink display device 1106 and/or the second display of e-ink display device 1106 may include a pattern of information that is designed to allow for testing of a characteristic of a sensor (e.g., a sensor of sensor system 1104).
  • a characteristic of a sensor may include a characteristic associated with direction at which the sensor will detect an object (e.g., a pointing direction, a pointing angle, etc.), a characteristic associated with range accuracy of the sensor, a characteristic associated with a standard deviation of range measurements of the sensor, a characteristic associated with reflectivity accuracy of the sensor, and/or the like.
  • the characteristic may include a characteristic of a LiDAR sensor of sensor system 1104.
  • the data associated with the first display of e-ink display device 1106 is based on a first reading of e-ink display device 1106 by sensor system 1104 and the data associated with the second display of e-ink display device 1106 is based on a second reading of e-ink display device 1106 by sensor system 1104.
  • the first display of e-ink display device 1106 may include a first pattern associated with a representation of a first object positioned at a first distance (e.g., a first distance from sensor system 1104), and the second display of e-ink display device 1106 comprises a second pattern associated with a representation of a second object positioned at a second distance from the sensor system.
  • the first pattern associated with the representation of the first object positioned at the first distance from the sensor system may include a first pattern having a first value of reflectivity (e.g., reflectivity of e-ink display device 1106) and the second pattern associated with the representation of the second object positioned at the second distance from the sensor system comprises a second pattern having a second value of reflectivity.
  • data associated with the display of e-ink display device 1106 may include data associated with a reading (e.g., a measurement, a recording, a detected aspect, etc.) of the display of e-ink display device 1106.
  • data associated with the display of e-ink display device 1106 may include data that is generated by sensor system 1104 (e.g., a sensor of sensor system 1104) based on sensor system 1104 sensing (e.g., reading, detecting, measuring, etc.) the display of e-ink display device 1106.
  • the data associated with the display of e-ink display device 1106 may include data associated with a representation of an object (e.g., a target object of a sensor, an object that may occur in an environment of an autonomous vehicle, such as a person, a traffic sign, a vehicle, etc.) that is provided on a screen of e-ink display device 1106.
  • the data associated with the display of e-ink display device 1106 may include data associated with a physical property (e.g., a value of reflectivity, a position, a distance, a shape, a height, a width, a color, a velocity, a rate of acceleration, a direction of movement, etc.) of the representation of the object as detected by sensor system 1104.
  • sensor analysis system 1102 may generate the data associated with the display of e-ink display device 1106 based on data received from sensor system 1104. For example, sensor analysis system 1102 may receive an output signal from a sensor of sensor system 1104, and sensor analysis system 1102 may generate the data associated with the display of e-ink display device 1106 based on the output signal.
  • sensor analysis system 1102 may store the data associated with the display of e-ink display device 1106.
  • sensor analysis system 1102 may store the data associated with the display of e-ink display device 1106 in a data structure (e.g., a database, a linked list, a tree, and/or the like).
  • the data structure may be located within sensor analysis system 1102 or external to (e.g., remote from) sensor analysis system 1102.
  • sensor analysis system 1102 may control another device.
  • sensor analysis system 1102 may control e-ink display device 1106 to provide a first display of e-ink display device 1106 and control e-ink display device 1106 to provide a second display of e-ink display device 1106.
  • sensor analysis system 1102 may control sensor system 1104 to read (e.g., to obtain a reading of) the first display and/or the second display of e-ink display device 1106.
  • the first display and the second display of e-ink display device 1106 may be displayed simultaneously at different portions of e-ink display device 1106.
  • the first display and the second display of e-ink display device 1106 may be displayed sequentially based on a characterization function and/or calibration function being initiated by sensor analysis system 1102.
  • process 1500 includes processing the data associated with the first display of the e-ink display device and the data associated with the second display of the e-ink display device.
  • sensor analysis system 1102 may process data associated with the first display of e-ink display device 1106 and data associated with the second display of e-ink display device 1106 to provide a quantitative result.
  • sensor analysis system 1102 may compare the data associated with the first display of e-ink display device 1106 and the data associated with the second display of e-ink display device 1106 to determine a characteristic of sensor system 1104 (e.g., a sensor of sensor system 1104).
  • sensor analysis system 1102 may receive the data associated with the first display of e-ink display device 1106 based on a first reading of a screen of e-ink display device 1106 by sensor system 1104, and sensor analysis system 1102 may receive the data associated with the second display of e-ink display device 1106 based on a second reading of the screen of e-ink display device 1106 by sensor system 1104. In such an example, sensor analysis system 1102 may compare the data associated with the first display of e-ink display device 1106 and the data associated with the second display of e-ink display device 1106 based on receiving the data associated with the displays of e-ink display device 1106.
  • sensor analysis system 1102 may use a comparison of the data associated with the first display of e-ink display device 1106 and the data associated with the second display of e-ink display device 1106 to determine the characteristic of a sensor of sensor system 1104 involved in reading the screen of e-ink display device 1106.
  • sensor analysis system 1102 may compare the data associated with the first display of e-ink display device 1106 and the data associated with the second display of e-ink display device 1106 and determine a quantitative result, where the quantitative result is a metric (e.g., a metric used in calibrating a sensor).
  • the metric may be a metric associated with a difference between the data associated with the first display of e-ink display device 1106 and the data associated with the second display of e-ink display device 1106.
  • the metric may be a metric associated with an error value (e.g., an error value based on a parameter measured by the sensor, such as reflectivity).
  • a characteristic of sensor system 1104 is based on the metric. For example, the characteristic of sensor system 1104 may be determined using the metric.
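As a non-limiting illustration of how such a metric might be computed, the following Python sketch compares hypothetical reflectivity readings taken from two displays against their expected reflectivity values and reports a mean absolute error; the function name, readings, and expected values are assumptions made for the example and are not part of the disclosed system.

```python
# Illustrative sketch only: computes an error metric from two sets of
# hypothetical reflectivity readings, as one way a comparison of data
# associated with a first display and a second display could be quantified.

def reflectivity_error_metric(readings, expected):
    """Mean absolute error between measured and expected reflectivity values."""
    if not readings or len(readings) != len(expected):
        raise ValueError("readings and expected must be non-empty and equal length")
    return sum(abs(r - e) for r, e in zip(readings, expected)) / len(readings)

# Hypothetical readings of a first display (expected reflectivity 0.80)
# and a second display (expected reflectivity 0.20).
first_display_readings = [0.78, 0.81, 0.79]
second_display_readings = [0.22, 0.19, 0.21]

metric = reflectivity_error_metric(
    first_display_readings + second_display_readings,
    [0.80] * 3 + [0.20] * 3,
)
print(f"reflectivity error metric: {metric:.3f}")
```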
  • process 1500 includes determining a characteristic of a sensor.
  • sensor analysis system 1102 may determine a characteristic of a sensor of sensor system 1104.
  • sensor analysis system 1102 may determine the characteristic of the sensor based on processing data associated with the first display of e-ink display device 1106 and data associated with the second display of e-ink display device 1106.
  • the characteristic of the sensor may be directly related to a display (e.g., pattern) provided by e-ink display device 1106 and/or a condition, such as an intensity level, at which the display is provided.
  • sensor analysis system 1102 may select a display (e.g., of a plurality of displays) to be provided by e-ink display device 1106 and/or a condition (e.g., of a plurality of conditions) at which the display is provided, based on the characteristic of the sensor of sensor system 1104.
  • sensor analysis system 1102 may determine the characteristic of the sensor by determining whether a result (e.g., a quantitative result that includes a metric, such as a metric associated with an error value of sensor system 1104, a quantitative result that includes a plot of values of distance and/or reflectivity versus an error value of sensor system 1104, etc.) of comparing data associated with a first display of e-ink display device 1106 and data associated with a second display of e-ink display device 1106 satisfies a threshold (e.g., a threshold value of accuracy).
  • if sensor analysis system 1102 determines that the result of comparing the data associated with the first display of e-ink display device 1106 and the data associated with the second display of e-ink display device 1106 satisfies the threshold, sensor analysis system 1102 may determine the characteristic of the sensor. In some non-limiting embodiments, if sensor analysis system 1102 determines that the result of comparing the data associated with the first display of e-ink display device 1106 and the data associated with the second display of e-ink display device 1106 does not satisfy the threshold, sensor analysis system 1102 may forego determining the characteristic of the sensor.
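A minimal sketch of the threshold gate described above, assuming a hypothetical threshold value of accuracy and treating a smaller error metric as satisfying the threshold (consistent with the note elsewhere herein that "satisfying a threshold" is context dependent):

```python
# Illustrative sketch only: decide whether to determine a sensor
# characteristic based on whether a comparison result satisfies a threshold.

ACCURACY_THRESHOLD = 0.05  # hypothetical threshold value of accuracy

def maybe_determine_characteristic(error_metric, threshold=ACCURACY_THRESHOLD):
    """Return a characteristic when the result satisfies the threshold,
    otherwise forgo the determination (return None)."""
    if error_metric <= threshold:  # here, "satisfies" means at or below
        return {"reflectivity_accuracy": error_metric}
    return None  # forgo determining the characteristic

print(maybe_determine_characteristic(0.02))  # {'reflectivity_accuracy': 0.02}
print(maybe_determine_characteristic(0.12))  # None
```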
  • sensor analysis system 1102 may perform an action based on a characteristic of a sensor. For example, sensor analysis system 1102 may adjust a threshold (e.g., a threshold value of acceptable risk behavior) associated with a perception component (e.g., a component of a perception stack) of an autonomous vehicle. In some non-limiting embodiments, sensor analysis system 1102 may determine a calibration setting associated with sensor system 1104 based on the characteristic.
  • sensor analysis system 1102 may determine an extrinsic calibration setting associated with a sensor of sensor system 1104 (e.g., a calibration setting associated with external aspects of a sensor, such as directions at which a sensor is pointed) and/or an intrinsic calibration setting associated with a sensor of sensor system 1104 (e.g., a calibration setting associated with internal aspects of a sensor, such as directions at which a beam of light is pointed) based on the characteristic.
  • sensor analysis system 1102 may determine a calibration setting associated with a perception component (e.g., a perception component of an autonomous vehicle, a perception component of a robotic device, etc.) and adjust the calibration setting associated with the perception component.
  • sensor analysis system 1102 may provide an indication that the sensor is to be replaced and/or adjusted.
  • sensor analysis system 1102 may adjust a position (e.g., an orientation, such as a direction from which a reading is to be taken, etc.) of the sensor.
  • sensor analysis system 1102 may perform an action with an autonomous vehicle.
  • sensor analysis system 1102 may control an operation of the autonomous vehicle in a real-time environment.
  • sensor analysis system 1102 may control an operation of the autonomous vehicle in a real-time environment based on a characteristic of a sensor (e.g., a characteristic of a sensor determined by sensor analysis system 1102).
  • sensor analysis system 1102 may transmit a control signal to the autonomous vehicle to control an operational characteristic (e.g., velocity, acceleration, deceleration, etc.) of the autonomous vehicle.
  • FIG. 12 is a diagram of a non-limiting embodiment of an implementation of process 1600 (e.g., process 1500) for determining a characteristic of a sensor.
  • the implementation of process 1600 may include sensor analysis system 1602, LiDAR sensor system 1604, e-ink display device 1606, and autonomous vehicle 1608.
  • sensor analysis system 1602 may be the same as or similar to sensor analysis system 1102.
  • LiDAR sensor system 1604 may be the same as or similar to sensor system 1104.
  • e-ink display device 1606 may be the same as or similar to e-ink display device 1106.
  • autonomous vehicle 1608 may be the same as or similar to an autonomous vehicle as described herein.
  • sensor analysis system 1602 may receive data associated with a first display of e-ink display device 1606 and data associated with a second display of e-ink display device 1606.
  • the data associated with the first display and/or the second display of e-ink display device 1606 may include data associated with a representation of an object (e.g., a target object of a sensor, an object that may occur in an environment of an autonomous vehicle, such as a person, a traffic sign, a vehicle, etc.) that is provided on a screen of e-ink display device 1606.
  • the first display of e-ink display device 1606 may include a first pattern associated with a representation of a first object positioned at a first distance from LiDAR sensor system 1604, and the second display of e-ink display device 1606 comprises a second pattern associated with a representation of a second object positioned at a second distance from LiDAR sensor system 1604.
  • the first pattern has a first value of reflectivity and the second pattern has a second value of reflectivity.
  • sensor analysis system 1602 may receive and/or generate the data associated with the first and second displays of e-ink display device 1606 based on data received from LiDAR sensor system 1604. For example, for each of the first and second displays of e-ink display device 1606, LiDAR sensor system 1604 (e.g., a LiDAR sensor of LiDAR sensor system 1604) may emit a light pulse and receive light (e.g., an amount of light, one or more wavelengths of light, a pattern of light, etc.) reflected by e-ink display device 1606 (e.g., reflected based on the first display of e-ink display device 1606, reflected based on the second display of e-ink display device 1606, etc.).
  • LiDAR sensor system 1604 may generate an output signal based on the light reflected by e-ink display device 1606.
  • Sensor analysis system 1602 may receive the output signal from LiDAR sensor system 1604, and sensor analysis system 1602 may generate the data associated with a respective display of e-ink display device 1606 based on the output signal.
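The sketch below illustrates one simplified way the data associated with a display might be derived from such an output signal: the peak return intensity is normalized against a reference return from a target of known reflectivity at the same range. The normalization model, signal samples, and reference values are assumptions made for the example.

```python
# Illustrative sketch only: estimate apparent reflectivity of a display from
# the peak intensity of a LiDAR return, normalized by a reference return
# captured from a target of known reflectivity at the same distance.

def estimate_reflectivity(return_intensity, reference_intensity, reference_reflectivity):
    """Scale the measured return by a known reference (hypothetical model)."""
    if reference_intensity <= 0:
        raise ValueError("reference_intensity must be positive")
    return reference_reflectivity * (return_intensity / reference_intensity)

# Hypothetical samples of an analog output signal (arbitrary units).
output_signal = [0.01, 0.02, 0.35, 0.90, 0.40, 0.03]
peak = max(output_signal)

# Reference: a 50%-reflective target produced a peak of 1.2 at the same range.
print(f"estimated reflectivity: {estimate_reflectivity(peak, 1.2, 0.50):.2f}")
```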
  • sensor analysis system 1602 may process the data associated with the first display of e-ink display device 1606 and the data associated with the second display of e-ink display device 1606. For example, sensor analysis system 1602 may compare the data associated with the first display of e-ink display device 1606 and the data associated with the second display of e-ink display device 1606 and determine a quantitative result, where the quantitative result is a metric (e.g., a metric used in calibrating a sensor). In some non-limiting embodiments, the metric may be a metric associated with a difference between the data associated with the first display of e-ink display device 1606 and the data associated with the second display of e-ink display device 1606.
  • the metric may be a metric associated with an error value (e.g., an error value based on a parameter measured by LiDAR sensor system 1604, such as reflectivity).
  • a characteristic of LiDAR sensor system 1604 is based on the metric.
  • the characteristic of LiDAR sensor system 1604 may be determined using the metric.
  • sensor analysis system 1602 may determine a characteristic of a LiDAR sensor of LiDAR sensor system 1604.
  • sensor analysis system 1602 may determine the characteristic of the LiDAR sensor based on processing data associated with the first display of e-ink display device 1606 and data associated with the second display of e-ink display device 1606.
  • the characteristic may include a characteristic associated with direction at which the sensor will detect an object (e.g., a pointing direction, a pointing angle, etc.), a characteristic associated with range accuracy of the sensor, a characteristic associated with a standard deviation of range measurements of the sensor, a characteristic associated with reflectivity accuracy of the sensor, and/or the like.
  • FIG. 13A is a graph 1710 showing a relationship between reflectivity of an e-ink display device (e.g., e-ink display device 1106, e-ink display device 1606, etc.) and wavelength of light, with regard to varying e-ink values.
  • FIG. 13B is a graph 1730 showing a relationship between reflectivity of the e-ink display device and angle of incidence at a wavelength of light of 940 nm, with regard to the varying e-ink values.
  • line 1712 has an e-ink value (e.g., an e-ink value provided as a digital number (DN)) of 100
  • line 1714 has an e-ink value of 75
  • line 1716 has an e-ink value of 50
  • line 1718 has an e-ink value of 25
  • line 1720 has an e-ink value of 0.
  • a value of reflectivity (e.g., as a percentage of light reflected by an e-ink display device) generally decreases for each of lines 1712, 1714, 1716, 1718, 1720 as wavelength increases.
  • line 1732 represents an angle of incidence (AOI) of 10 degrees
  • line 1734 represents an AOI of 30 degrees
  • line 1736 represents an AOI of 60 degrees.
  • a value of reflectivity (e.g., as a percentage of light reflected by an e-ink display device) is shown in graph 1730 for each AOI (e.g., 10 degrees, 30 degrees, and 60 degrees).
  • a sensor assembly may include a desiccant assembly within a sensor housing.
  • the desiccant assembly may include a desiccant chamber configured to hold at least one desiccant element (e.g., one or more desiccant blocks) and may include a transfer window positioned between the desiccant chamber and a sensor chamber of the sensor housing.
  • a permeable membrane may cover the transfer window and may be configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber, while preventing liquid water and particulate matter from transferring from the desiccant chamber to the sensor chamber.
  • the permeable membrane and/or a set of dimensions of the at least one transfer window may be configured so that a leak rate corresponding to a transfer of water vapor from the sensor chamber to the desiccant chamber causes prevention of condensation of water within the sensor chamber.
  • Fig. 14 is a diagram of an example environment 2100 in which an autonomous vehicle may operate, in accordance with some aspects of the disclosure.
  • the environment 2100 may include, for example, a vehicle 2102, an on-board system 2104 of the vehicle 2102, a remote computing device 2106, and/or a network 2108.
  • the environment 2100 may include one or more objects 2110 that the vehicle 2102 is configured to detect.
  • the vehicle 2102 may include any moving form of conveyance that is capable of carrying one or more human occupants and/or cargo and that is powered by any form of energy.
  • the vehicle 2102 may include, for example, a land vehicle (e.g., a car, a truck, a van, or a train), an aircraft (e.g., an unmanned aerial vehicle or a drone), or a watercraft.
  • in the example of Fig. 14, the vehicle 2102 is a land vehicle, and is shown as a car.
  • the vehicle 2102 is an autonomous vehicle in the example of Fig. 14.
  • An autonomous vehicle is a vehicle having a processor, programming instructions, and drivetrain components that are controllable by the processor without requiring a human operator.
  • An autonomous vehicle may be fully autonomous in that the autonomous vehicle does not require a human operator for most or all driving conditions and functions, or an autonomous vehicle may be semi-autonomous in that a human operator may be required in certain conditions or for certain operations, or that a human operator may override the autonomous vehicle's autonomous system and may take control of the autonomous vehicle.
  • the vehicle 2102 may include an on-board system 2104 that is integrated into and/or coupled with the vehicle 2102.
  • the on-board system 2104 may be used to control the vehicle 2102, to sense information about the vehicle 2102 and/or an environment in which the vehicle 2102 operates, to detect one or more objects 2110 in a proximity of the vehicle, to provide output to or receive input from an occupant of the vehicle 2102, and/or to communicate with one or more devices remote from the vehicle 2102, such as another vehicle and/or the remote computing device 2106.
  • the on-board system 2104 is described in more detail below in connection with Fig. 15.
  • the vehicle 2102 may travel along a road in a semi-autonomous or autonomous manner.
  • the vehicle 2102 may be configured to detect objects 2110 in a proximity of the vehicle 2102.
  • An object 2110 may include, for example, another vehicle (e.g., an autonomous vehicle or a non-autonomous vehicle that requires a human operator for most or all driving conditions and functions), a cyclist (e.g., a rider of a bicycle, electric scooter, or motorcycle), a pedestrian, a road feature (e.g., a roadway boundary, a lane marker, a sidewalk, a median, a guard rail, a barricade, a sign, a traffic signal, a railroad crossing, or a bike path), and/or another object that may be on a roadway or in proximity of a roadway, such as a tree or an animal.
  • the vehicle 2102 may be equipped with one or more sensors, such as a lidar system, as described in more detail elsewhere herein.
  • the lidar system may be configured to transmit a light pulse 2112 to detect objects 2110 located within a distance or range of distances of the vehicle 2102.
  • the light pulse 2112 may be incident on an object 2110 and may be reflected back to the lidar system as a reflected light pulse 2114.
  • the reflected light pulse 2114 may be incident on the lidar system and may be processed to determine a distance between the object 2110 and the vehicle 2102.
  • the reflected light pulse 2114 may be detected using, for example, a photodetector or an array of photodetectors positioned and configured to receive the reflected light pulse 2114.
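For illustration, the distance determination described above can be expressed as a time-of-flight calculation, range = c * t / 2; the sketch below uses a hypothetical round-trip time.

```python
# Illustrative sketch only: convert a measured round-trip time of a light
# pulse into a distance estimate (range = c * t / 2).

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_round_trip_time(round_trip_time_s):
    """Distance to the reflecting object for a given round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A hypothetical round-trip time of 400 nanoseconds corresponds to roughly 60 m.
print(f"{range_from_round_trip_time(400e-9):.1f} m")
```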
  • a lidar system may be included in another system other than a vehicle 2102, such as a robot, a satellite, and/or a traffic light, or may be used as a standalone system.
  • implementations described herein are not limited to autonomous vehicle applications and may be used in other applications, such as robotic applications, radar system applications, metric applications, and/or system performance applications.
  • the lidar system may provide lidar data, such as information about a detected object 2110 (e.g., information about a distance to the object 2110, a speed of the object 2110, and/or a direction of movement of the object 2110), to one or more other components of the on-board system 2104. Additionally, or alternatively, the vehicle 2102 may transmit lidar data to the remote computing device 2106 (e.g., a server, a cloud computing system, and/or a database) via the network 2108. The remote computing device 2106 may be configured to process the lidar data and/or to transmit a result of processing the lidar data to the vehicle 2102 via the network 2108.
  • the network 2108 may include one or more wired and/or wireless networks.
  • the network 2108 may include a wireless wide area network (e.g., a cellular network or a public land mobile network), a local area network (e.g., a wired local area network or a wireless local area network (WLAN), such as a Wi-Fi network), a personal area network (e.g., a Bluetooth network), a near-field communication network, a telephone network, a private network, the Internet, and/or a combination of these or other types of networks.
  • the network 2108 enables communication among the devices of environment 2100.
  • Fig. 14 is provided as an example. Other examples may differ from what is described with regard to Fig. 14.
  • the number and arrangement of devices shown in Fig. 14 are provided as an example. In practice, there may be additional devices, fewer devices, different devices, or differently arranged devices than those shown in Fig. 14.
  • two or more devices shown in Fig. 14 may be implemented within a single device, or a single device shown in Fig. 14 may be implemented as multiple, distributed devices.
  • a set of devices (e.g., one or more devices) shown in Fig. 14 may perform one or more functions described as being performed by another set of devices shown in Fig. 14.
  • Fig. 15 is a diagram of an example on-board system 2200 of an autonomous vehicle, in accordance with some aspects of the disclosure.
  • the on-board system 2200 may correspond to the on-board system 2104 included in the vehicle 2102, as described above in connection with Fig. 14.
  • the on-board system 2200 may include one or more of the illustrated components 2202-2256.
  • the components of the on-board system 2200 may include, for example, a power system 2202, one or more sensors 2204, one or more controllers 2206, and/or an on-board computing device 2208.
  • the components of the on-board system 2200 may communicate via a bus (e.g., one or more wired and/or wireless connections), such as a controller area network (CAN) bus.
  • the power system 2202 may be configured to generate mechanical energy for the vehicle 2102 to move the vehicle 2102.
  • the power system 2202 may include an engine that converts fuel to mechanical energy (e.g., via combustion) and/or a motor that converts electrical energy to mechanical energy.
  • the one or more sensors 2204 may be configured to detect operational parameters of the vehicle 2102 and/or environmental conditions of an environment in which the vehicle 2102 operates.
  • the one or more sensors 2204 may include an engine temperature sensor 2210, a battery voltage sensor 2212, an engine rotations per minute (RPM) sensor 2214, a throttle position sensor 2216, a battery sensor 2218 (to measure current, voltage, and/or temperature of a battery), a motor current sensor 2220, a motor voltage sensor 2222, a motor position sensor 2224 (e.g., a resolver and/or encoder), a motion sensor 2226 (e.g., an accelerometer, gyroscope and/or inertial measurement unit), a speed sensor 2228, an odometer sensor 2230, a clock 2232, a position sensor 2234 (e.g., a global navigation satellite system (GNSS) sensor and/or a global positioning system (GPS) sensor), one or more cameras 2236, a lidar system 2238, and/or one or more other ranging systems 2240 (e.g., a radar system and/or a sonar system), among other examples.
  • the one or more controllers 2206 may be configured to control operation of the vehicle 2102.
  • the one or more controllers 2206 may include a brake controller 2244 to control braking of the vehicle 2102, a steering controller 2246 to control steering and/or direction of the vehicle 2102, a throttle controller 2248 and/or a speed controller 2250 to control speed and/or acceleration of the vehicle 2102, a gear controller 2252 to control gear shifting of the vehicle 2102, a routing controller 2254 to control navigation and/or routing of the vehicle 2102 (e.g., using map data), and/or an auxiliary device controller 2256 to control one or more auxiliary devices associated with the vehicle 2102, such as a testing device, an auxiliary sensor, and/or a mobile device transported by the vehicle 2102.
  • the on-board computing device 2208 may be configured to receive sensor data from one or more sensors 2204 and/or to provide commands to one or more controllers 2206. For example, the on-board computing device 2208 may control operation of the vehicle 2102 by providing a command to a controller 2206 based on sensor data received from a sensor 2204. In some implementations, the on-board computing device 2208 may be configured to process sensor data to generate a command.
  • the on-board computing device 2208 may include memory, one or more processors, an input component, an output component, and/or a communication component, as described in more detail elsewhere herein.
  • the on-board computing device 2208 may receive navigation data, such as information associated with a navigation route from a start location of the vehicle 2102 to a destination location for the vehicle 2102.
  • the navigation data is accessed and/or generated by the routing controller 2254.
  • the routing controller 2254 may access map data and identify possible routes and/or road segments that the vehicle 2102 can travel to move from the start location to the destination location.
  • the routing controller 2254 may identify a preferred route, such as by scoring multiple possible routes, applying one or more routing techniques (e.g., minimum Euclidean distance, Dijkstra's algorithm, and/or Bellman-Ford algorithm), accounting for traffic data, and/or receiving a user selection of a route, among other examples.
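For illustration, the sketch below applies Dijkstra's algorithm (one of the routing techniques mentioned above) to a hypothetical road-segment graph whose edge weights could represent distance, travel time, or a traffic-adjusted cost; the graph itself is an assumption made for the example.

```python
# Illustrative sketch only: score routes over a hypothetical road-segment
# graph using Dijkstra's algorithm (edge weights could be distance, travel
# time, or a traffic-adjusted cost).
import heapq

def dijkstra(graph, start, goal):
    """Return (cost, path) of the lowest-cost route from start to goal."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical road segments between intersections A-D (weights in minutes).
road_graph = {
    "A": {"B": 4, "C": 2},
    "B": {"D": 5},
    "C": {"B": 1, "D": 8},
    "D": {},
}
print(dijkstra(road_graph, "A", "D"))  # (8.0, ['A', 'C', 'B', 'D'])
```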
  • the on-board computing device 2208 may use the navigation data to control operation of the vehicle 2102.
  • the on-board computing device 2208 may receive sensor data from various sensors 2204.
  • the position sensor 2234 may provide geographic location information to the on-board computing device 2208, which may then access a map associated with the geographic location information to determine known fixed features associated with the geographic location, such as streets, buildings, stop signs, and/or traffic signals, which may be used to control operation of the vehicle 2102.
  • the on-board computing device 2208 may receive one or more images captured by one or more cameras 2236, may analyze the one or more images (e.g., to detect object data), and may control operation of the vehicle 2102 based on analyzing the images (e.g., to avoid detected objects). Additionally, or alternatively, the on-board computing device 2208 may receive object data associated with one or more objects detected in a vicinity of the vehicle 2102 and/or may generate object data based on sensor data.
  • the object data may indicate the presence or absence of an object, a location of the object, a distance between the object and the vehicle 2102, a speed of the object, a direction of movement of the object, an acceleration of the object, a trajectory (e.g., a heading) of the object, a shape of the object, a size of the object, a footprint of the object, and/or a type of the object (e.g., a vehicle, a pedestrian, a cyclist, a stationary object, or a moving object).
  • the object data may be detected by, for example, one or more cameras 2236 (e.g., as image data), the lidar system 2238 (e.g., as lidar data) and/or one or more other ranging systems 2240 (e.g., as radar data or sonar data).
  • the on-board computing device 2208 may process the object data to detect objects in a proximity of the vehicle 2102 and/or to control operation of the vehicle 2102 based on the object data (e.g., to avoid detected objects).
  • the on-board computing device 2208 may use the object data (e.g., current object data) to predict future object data for one or more objects. For example, the on-board computing device 2208 may predict a future location of an object, a future distance between the object and the vehicle 2102, a future speed of the object, a future direction of movement of the object, a future acceleration of the object, and/or a future trajectory (e.g., a future heading) of the object. For example, if an object is a vehicle and map data indicates that the vehicle is at an intersection, then the on-board computing device 2208 may predict whether the object will likely move straight or turn. As another example, if the sensor data and/or the map data indicates that the intersection does not have a traffic light, then the on-board computing device 2208 may predict whether the object will stop prior to entering the intersection.
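One simple way such a prediction could be made, shown here purely for illustration, is a constant-velocity extrapolation of the current object data; prediction components may use richer models, and the values below are hypothetical.

```python
# Illustrative sketch only: predict a future object location from its current
# location and velocity using a constant-velocity model.

def predict_location(x_m, y_m, vx_m_per_s, vy_m_per_s, horizon_s):
    """Extrapolate position assuming constant velocity over the horizon."""
    return x_m + vx_m_per_s * horizon_s, y_m + vy_m_per_s * horizon_s

# A hypothetical object 20 m ahead and 3 m to the left, moving at 5 m/s forward.
print(predict_location(20.0, 3.0, 5.0, 0.0, horizon_s=2.0))  # (30.0, 3.0)
```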
  • the on-board computing device 2208 may generate a motion plan for the vehicle 2102 based on sensor data, navigation data, and/or object data (e.g., current object data and/or future object data). For example, based on current locations of objects and/or predicted future locations of objects, the on-board computing device 2208 may generate a motion plan to move the vehicle 2102 along a surface and avoid collision with other objects.
  • the motion plan may include, for one or more points in time, a speed of the vehicle 2102, a direction of the vehicle 2102, and/or an acceleration of the vehicle 2102. Additionally, or alternatively, the motion plan may indicate one or more actions with respect to a detected object, such as whether to overtake the object, yield to the object, pass the object, or the like.
  • the on-board computing device 2208 may generate one or more commands or instructions based on the motion plan, and may provide those command(s) to one or more controllers 2206 for execution.
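To illustrate how a motion plan might be mapped to commands for the controllers, the sketch below converts one step of a planned speed and heading into hypothetical throttle, brake, and steering commands; the controller interface and gains are assumptions made for the example.

```python
# Illustrative sketch only: convert one step of a motion plan into simple
# commands for speed and steering controllers (hypothetical interfaces).

def commands_from_plan(current_speed_mps, planned_speed_mps,
                       current_heading_deg, planned_heading_deg,
                       speed_gain=0.5, steer_gain=0.1):
    """Proportional mapping from plan-versus-state error to command values."""
    speed_error = planned_speed_mps - current_speed_mps
    heading_error = planned_heading_deg - current_heading_deg
    return {
        "throttle": max(0.0, speed_gain * speed_error),   # accelerate if too slow
        "brake": max(0.0, -speed_gain * speed_error),     # brake if too fast
        "steering": steer_gain * heading_error,           # steer toward heading
    }

print(commands_from_plan(current_speed_mps=10.0, planned_speed_mps=12.0,
                         current_heading_deg=0.0, planned_heading_deg=5.0))
```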
  • Fig. 15 is provided as an example. Other examples may differ from what is described with regard to Fig. 15.
  • the number and arrangement of components shown in Fig. 15 are provided as an example. In practice, there may be additional components, fewer components, different components, or differently arranged components than those shown in Fig. 15.
  • two or more components shown in Fig. 15 may be implemented within a single component, or a single component shown in Fig. 15 may be implemented as multiple, distributed components.
  • a set of components (e.g., one or more components) shown in Fig. 15 may perform one or more functions described as being performed by another set of components shown in Fig. 15.
  • an on-board system of an aircraft may not include the brake controller 2244 and/or the gear controller 2252, but may include an altitude sensor.
  • an on-board system of a watercraft may include a depth sensor.
  • Fig. 16 is a diagram of an example lidar system 2300, in accordance with some aspects of the disclosure.
  • the lidar system 2300 may correspond to the lidar system 2238 of Fig. 15.
  • the lidar system may include a housing 2302, a light emitter system 2304, a light detector system 2306, an optical element structure 2308, a motor 2310, and an analysis device 2312.
  • the housing 2302 may be rotatable (e.g., by 360 degrees) around an axle 2314 (or hub) of the motor 2310.
  • the housing 2302 may include an aperture 2316 (e.g., an emitter and/or receiver aperture) made of a material transparent to light. Although a single aperture 2316 is shown in Fig. 16, the housing 2302 may include multiple apertures 2316 in some implementations.
  • the lidar system 2300 may emit light through one or more apertures 2316 and may receive reflected light back through one or more apertures 2316 as the housing 2302 rotates around components housed within the housing 2302.
  • the housing 2302 may be a stationary structure (e.g., that does not rotate), at least partially made of a material that is transparent to light, with rotatable components inside of the housing 2302.
  • the housing 2302 may house the light emitter system 2304, the light detector system 2306, and/or the optical element structure 2308.
  • the light emitter system 2304 may be configured and/or positioned to generate and emit pulses of light through the aperture 2316 and/or through a transparent material of the housing 2302.
  • the light emitter system 2304 may include one or more light emitters, such as laser emitter chips or other light emitting devices.
  • the light emitter system 2304 may include any number of individual light emitters (e.g., 8 emitters, 64 emitters, or 128 emitters), which may emit light at substantially the same intensity or of varying intensities.
  • the light detector system 2306 may include a photodetector or an array of photodetectors configured and/or positioned to receive light reflected back through the housing 2302 and/or the aperture 2316.
  • the optical element structure 2308 may be positioned between the light emitter system 2304 and the housing 2302, and/or may be positioned between the light detector system 2306 and the housing 2302.
  • the optical element structure 2308 may include one or more lenses, waveplates, and/or mirrors that focus and direct light that passes through the optical element structure 2308.
  • the light emitter system 2304, the light detector system 2306, and/or the optical element structure 2308 may rotate with a rotatable housing 2302 or may rotate inside of a stationary housing 2302.
  • the analysis device 2312 may be configured to receive (e.g., via one or more wired and/or wireless connections) sensor data collected by the light detector system 2306, analyze the sensor data to measure characteristics of the received light, and generate output data based on the sensor data.
  • the analysis device 2312 may provide the output data to another system that can control operations and/or provide recommendations with respect to an environment from which the sensor data was collected.
  • the analysis device 2312 may provide the output data to the on-board system 2104 (e.g., the on-board computing device 2208) of the vehicle 2102 to enable the on-board system 2104 to process the output data and/or use the output data (or the processed output data) to control operation of the vehicle 2102.
  • the analysis device 2312 may be integrated into the lidar system 2300 or may be external from the lidar system 2300 and communicatively connected to the lidar system 2300 via a network.
  • the analysis device 2312 may include memory, one or more processors, an input component, an output component, and/or a communication component, as described in more detail elsewhere herein.
  • Fig. 16 is provided as an example. Other examples may differ from what is described with regard to Fig. 16.
  • the number and arrangement of components shown in Fig. 16 are provided as an example. In practice, there may be additional components, fewer components, different components, or differently arranged components than those shown in Fig. 16.
  • two or more components shown in Fig. 16 may be implemented within a single component, or a single component shown in Fig. 16 may be implemented as multiple, distributed components.
  • a set of components (e.g., one or more components) shown in Fig. 16 may perform one or more functions described as being performed by another set of components shown in Fig. 16.
  • Figs. 17A-17G are diagrams of an example sensor housing 2400, in accordance with some aspects of the disclosure.
  • the sensor housing 2400 may be, be similar to, include, or be included in, the housing 2302 of the lidar system 2300 depicted in Fig. 16.
  • the sensor housing 2400 may be a housing associated with any other type of sensor such as, for example, an imaging device (e.g., a camera and/or a video camera), a radar device, and/or a motion sensor, among other examples.
  • Fig. 17A is a perspective diagram of the example sensor housing 2400, in accordance with some aspects of the disclosure.
  • the sensor housing 2400 may include a sensor chamber 2402 enclosed by a housing shell 2404.
  • the housing shell 2404 may include a perimeter wall 2406 having a number of apertures 2408.
  • the housing shell 2404 may include an upper wall 2410 that is coupled with the perimeter wall 2406 via a beveled edge 2412.
  • the interface between the upper wall 2410 and the perimeter wall 2406 may be at least approximately perpendicular, for example, without including the beveled edge 2412.
  • a top plate 2416 may be removably attachable to the upper wall 2410 using fasteners 2418 (e.g., screws).
  • Figs. 17B-17D are exploded perspective diagrams of the example sensor housing 2400, with the housing shell 2404 removed, in accordance with some aspects of the disclosure.
  • a desiccant assembly 2414 may be disposed within the sensor housing 2400.
  • the desiccant assembly 2414 may include a desiccant assembly body 2420.
  • the desiccant assembly body 2420 may include a surface 2422 within which a pocket 2424 is defined to form a desiccant chamber 2426.
  • the desiccant assembly body 2420 may be configured to separate the desiccant chamber 2426 from the sensor chamber 2402.
  • the pocket 2424 may be a recess defined in the surface 2422 of the desiccant assembly body 2420.
  • the pocket 2424 may be configured to receive at least one desiccant element 2428.
  • the at least one desiccant element 2428 includes two desiccant blocks 2428.
  • the at least one desiccant element 2428 may be removable.
  • the at least one desiccant element 2428 may be fixed.
  • the at least one desiccant element 2428 may include an adhesive material adhered to at least one side of a transfer wall 2430.
  • the at least one desiccant element 2428 may be either configured to be replaceable at a predetermined service interval, or configured to be permanently integrated and operable for the duration of the expected life of the sensor.
  • a replaceable desiccant element 2428 may be placed in the pocket 2424 in a location that is accessible for a maintenance technician to perform a replacement service.
  • the at least one desiccant element 2428 may be permanently integrated within a sensor (e.g., lidar) assembly and may be placed at any location that advantageously leverages size, weight, space considerations as well as mass balance considerations in cases where the sensor is a mechanical (e.g., spinning or rotating) sensor such as a mechanical lidar.
  • the at least one desiccant element 2428 may be shaped in any number of ways to leverage space and performance considerations of the sensor.
  • the at least one desiccant element 2428 may be made of a molecular sieve powder mixed with a polymer binder and formed into the desired shape.
  • Figs. 17E and 17F are perspective diagrams of the example desiccant assembly body 2420, in accordance with some aspects of the disclosure.
  • Fig. 17G is a top plan diagram of the example desiccant assembly body 2420, in accordance with some aspects of the disclosure.
  • the pocket 2424 includes a transfer wall 2430 having at least one transfer window 2432 defined therein.
  • the at least one transfer window 2432 may be positioned between the desiccant chamber 2426 and the sensor chamber 2402.
  • a permeable membrane 2434 may be disposed over the at least one transfer window 2432 and may be configured to allow water vapor to transfer from the sensor chamber 2402 to the desiccant chamber 2426.
  • the permeable membrane 2434 may be further configured to prevent liquid water and/or particulate matter from transferring from the desiccant chamber 2426 to the sensor chamber 2402.
  • the permeable membrane 2434 and/or a set of dimensions (e.g., area, length, width) of the at least one transfer window 2432 may be configured so that a leak rate corresponding to a transfer of water vapor from the sensor chamber 2402 to the desiccant chamber 2426 causes prevention of condensation of water within the sensor chamber 2402.
  • the permeable membrane 2434 may include a material selected so that a leak rate corresponding to a transfer of water vapor from the sensor chamber 2402 to the desiccant chamber 2426 facilitates maintaining a relative humidity level within the sensor chamber 2402 at or below a humidity threshold.
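To illustrate the relationship between relative humidity, a humidity threshold, and condensation risk within the sensor chamber, the sketch below uses the Magnus approximation for dew point; it is a simplified check with hypothetical values, not a model of the membrane's actual leak-rate behavior.

```python
# Illustrative sketch only: check whether a chamber humidity level stays at or
# below a threshold and whether the coldest expected surface would fall below
# the dew point (i.e., whether condensation could occur).
import math

def dew_point_c(temp_c, relative_humidity_pct):
    """Magnus approximation of the dew point temperature in degrees Celsius."""
    a, b = 17.62, 243.12
    alpha = math.log(relative_humidity_pct / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * alpha) / (a - alpha)

HUMIDITY_THRESHOLD_PCT = 40.0  # hypothetical humidity threshold

chamber_temp_c = 25.0      # hypothetical chamber temperature
chamber_rh_pct = 30.0      # hypothetical chamber relative humidity
coldest_surface_c = 5.0    # hypothetical coldest surface in the chamber

print("humidity within threshold:", chamber_rh_pct <= HUMIDITY_THRESHOLD_PCT)
print("condensation risk:", coldest_surface_c <= dew_point_c(chamber_temp_c, chamber_rh_pct))
```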
  • the permeable membrane 2434 may include a polymer material.
  • the polymer material may include expanded polytetrafluoroethylene (ePTFE).
  • the permeable membrane 2434 may be an adhesive material adhered to the transfer wall 2430.
  • the permeable membrane 2434 may be adhered to an upper surface (e.g., a surface facing the desiccant chamber 2426) of the transfer wall 2430 and/or to a lower surface (e.g., a surface facing the sensor chamber 2402) of the transfer wall 2430.
  • the desiccant assembly 2414 may include an access component 2436 configured to isolate the desiccant chamber 2426 from an environment external to the desiccant chamber 2426.
  • the access component 2436 may include a removable chamber cover 2438 configured to be fastened to the surface 2422 of the desiccant assembly body 2420.
  • the access component 2436 may include a seal 2440 (e.g., a gasket and/or O-ring) configured to be engaged by the removable chamber cover 2438 to seal the desiccant chamber 2426 from the outside environment.
  • the top plate 2416 may serve as the removable chamber cover 2438.
  • the desiccant assembly 2414 may be disposed within the sensor housing 2400.
  • the desiccant assembly 2414 may be disposed within the sensor housing 2400 at a location that is selected so that a mass balance associated with the sensor housing 2400 facilitates a mechanical operation of a sensor within the sensor housing 2400 and/or a mechanical operation of the sensor housing 2400 itself.
  • the desiccant assembly 2414 is located at the top of the sensor housing 2400.
  • the desiccant assembly 2414 may be located at a bottom of the sensor housing, at a side of the sensor housing 2400, and/or at any other location that may be selected to facilitate operation of the sensor.
  • the desiccant assembly 2414 may be coupled to a sensor housing frame 2442.
  • the desiccant assembly 2414 may be integrated into, or coupled with, the sensor housing shell 2404.
  • the desiccant assembly 2414 may include one or more mechanical assemblies configured to open and close, and/or partially open and partially close, the at least one transfer window 2432.
  • a mechanical flapper or slider may be configured to cover the at least one transfer window 2432 in response to actuation by an actuator.
  • the actuator may be communicatively coupled with the on-board computing device 2208 depicted in Fig. 15.
  • the on-board computing device 2208 may obtain measurements from one or more environmental sensors (e.g., temperature sensors, humidity sensors, and/or pressure sensors, among other examples), and may cause actuation of the actuator to at least partially open and/or close the at least one transfer window 2432 to maintain, based on the measurements, a leak rate corresponding to a transfer of water vapor from the sensor chamber 2402 to the desiccant chamber 2426 that facilitates maintaining a relative humidity level within the sensor chamber 2402 at or below a humidity threshold.
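A minimal sketch of that control behavior, assuming hypothetical humidity measurements and a hypothetical actuator interface in which the transfer window opening is commanded as a fraction between fully closed and fully open:

```python
# Illustrative sketch only: adjust how far the transfer window is opened based
# on measured relative humidity (hypothetical readings and actuator interface).

HUMIDITY_THRESHOLD_PCT = 40.0  # hypothetical humidity threshold

def transfer_window_opening(relative_humidity_pct, threshold_pct=HUMIDITY_THRESHOLD_PCT):
    """Commanded opening fraction in [0.0, 1.0]: fully open at or above the
    threshold, proportionally open as humidity approaches the threshold."""
    if relative_humidity_pct >= threshold_pct:
        return 1.0
    return max(0.0, relative_humidity_pct / threshold_pct)

for rh in (10.0, 35.0, 55.0):  # hypothetical humidity sensor measurements
    print(f"RH {rh:.0f}% -> window opening {transfer_window_opening(rh):.2f}")
```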
  • Figs. 17A-17G are provided as an example. Other examples may differ from what is described with regard to Figs. 17A-17G.
  • the number and arrangement of components shown in Figs. 17A-17G are provided as an example. In practice, there may be additional components, fewer components, different components, or differently arranged components than those shown in Figs. 17A-17G.
  • two or more components shown in Figs. 17A-17G may be implemented within a single component, or a single component shown in Figs. 17A-17G may be implemented as multiple, distributed components.
  • a set of components (e.g., one or more components) shown in Figs. 17A-17G may perform one or more functions described as being performed by another set of components shown in Figs. 17A-17G.
  • an equipment housing may include an equipment chamber configured to hold electronic equipment; and a desiccant assembly.
  • the desiccant assembly may include, as described herein, a desiccant chamber configured to hold a desiccant element; and a transfer assembly positioned between the desiccant chamber and the equipment chamber.
  • the transfer assembly may include at least one transfer window and at least one permeable membrane and may be configured to allow water vapor to transfer from the equipment chamber to the desiccant chamber and to prevent particulate matter from transferring from the desiccant chamber to the equipment chamber.
  • Fig. 18 is a flowchart of an example method 2500 associated with manufacturing a sensor housing.
  • the method 2500 may include providing a desiccant chamber configured to hold a desiccant element (block 2510). As further shown in Fig. 18, the method 2500 may include positioning a transfer window between the desiccant chamber and a sensor chamber of the sensor housing (block 2520). As further shown in Fig. 18, the method 2500 may include disposing a permeable membrane over the transfer window, wherein the permeable membrane is configured to allow water vapor to transfer from the sensor chamber to the desiccant chamber (block 2530).
  • the desiccant chamber comprises a pocket defined in a surface of a desiccant assembly body, the desiccant assembly body configured to separate the desiccant chamber from the sensor chamber.
  • the pocket comprises a recess defined in the surface of the desiccant assembly body, the recess comprising a transfer wall within which the transfer window is defined.
  • the recess is configured to receive the desiccant element.
  • the desiccant element comprises an adhesive material adhered to at least one side of the transfer wall.
  • the desiccant chamber is configured to hold at least one additional desiccant element.
  • the method 2500 includes positioning at least one additional transfer window between the desiccant chamber and the sensor chamber.
  • the method 2500 includes providing an access component configured to isolate the desiccant chamber from an environment external to the sensor chamber.
  • providing the access component comprises removably attaching a chamber cover to a surface of the desiccant assembly body, the desiccant assembly body configured to separate the desiccant chamber from the sensor chamber.
  • the method 2500 includes selecting the permeable membrane so that a leak rate corresponding to a transfer of water vapor from the sensor chamber to the desiccant chamber causes prevention of condensation of water within the sensor chamber.
  • the method 2500 includes configuring the permeable membrane to prevent a transfer of liquid water and particulate matter from the desiccant chamber to the sensor chamber.
  • the permeable membrane comprises a material selected so that a leak rate corresponding to a transfer of water vapor from the sensor chamber to the desiccant chamber facilitates maintaining a relative humidity level within the sensor chamber at or below a humidity threshold.
  • the method 2500 includes configuring a set of dimensions of the transfer window so that a leak rate corresponding to a transfer of water vapor from the sensor chamber to the desiccant chamber facilitates maintaining a relative humidity level within the sensor chamber at or below a humidity threshold.
  • the permeable membrane comprises a polymer material.
  • the polymer material comprises expanded polytetrafluoroethylene.
  • the desiccant element is removable.
  • the method 2500 includes disposing the desiccant assembly within the sensor housing at a location that is selected so that a mass balance associated with the sensor housing facilitates a mechanical operation of a sensor within the sensor housing.
  • Fig. 18 shows example blocks of the method 2500.
  • the method 2500 may include additional blocks, fewer blocks, different blocks, or differently arranged blocks than those depicted in Fig. 18. Additionally, or alternatively, two or more of the blocks of the method 2500 may be performed in parallel.
  • the method 2500 is an example of one method that may be performed by one or more devices described herein. These one or more devices may perform or may be configured to perform one or more other methods based on operations described herein.
  • the term "component” is intended to be broadly construed as hardware, firmware, or a combination of hardware and software. It will be apparent that systems and/or methods described herein may be implemented in different forms of hardware, firmware, and/or a combination of hardware and software. The hardware and/or software code described herein for implementing aspects of the disclosure should not be construed as limiting the scope of the disclosure. Thus, the operation and behavior of the systems and/or methods are described herein without reference to specific software code - it being understood that software and hardware can be used to implement the systems and/or methods based on the description herein.
  • satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, greater than or equal to the threshold, less than the threshold, less than or equal to the threshold, equal to the threshold, not equal to the threshold, or the like.
  • a, b, or c is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiple of the same item.
  • the term "and/or” used to connect items in a list refers to any combination and any permutation of those items, including single members (e.g., an individual item in the list).
  • "a, b, and/or c" is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c.
  • the terms “has,” “have,” “having,” or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Also, as used herein, the term “or” is intended to be inclusive when used in a series and may be used interchangeably with “and/or,” unless explicitly stated otherwise (e.g., if used in combination with “either” or “only one of”).
  • Coupled can also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

Systems, methods, and computer program products for a dynamic detection threshold for a sensor of an autonomous vehicle are disclosed. A system may include a LiDAR system of an autonomous vehicle. The LiDAR system may include at least one light emitter and at least one light detector to generate analog output signals based on reflected light pulses. A comparator may receive the analog output signals from the light detector and generate digital output signals based on the analog output signals and a threshold. A controller may receive a first digital output signal of the digital output signals from the comparator based on the threshold, adjust the threshold, receive at least one other digital output signal of the digital output signals from the comparator based on the threshold as adjusted, and/or determine at least one aggregation based on the first digital output signal and the at least one other digital output signal.
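As a purely illustrative sketch of the flow summarized in the abstract, the following code models a comparator that digitizes analog output samples against a threshold, a controller that adjusts the threshold and obtains another digital output signal, and an aggregation of the resulting digital output signals; the signal values and the aggregation rule are assumptions made for the example.

```python
# Illustrative sketch only: a comparator digitizes analog output samples
# against a detection threshold, the controller adjusts the threshold, and the
# digital output signals are aggregated (signal values and the aggregation
# rule are hypothetical).

def comparator(analog_samples, threshold):
    """Digital output signal: 1 where the analog sample exceeds the threshold."""
    return [1 if sample > threshold else 0 for sample in analog_samples]

analog_output = [0.2, 0.9, 0.4, 1.3, 0.1, 0.8]  # hypothetical analog output signal

threshold = 1.0
first_digital_output = comparator(analog_output, threshold)

threshold = 0.5  # controller adjusts the threshold
second_digital_output = comparator(analog_output, threshold)

# One possible aggregation: element-wise OR of the digital output signals.
aggregation = [a | b for a, b in zip(first_digital_output, second_digital_output)]
print(first_digital_output)   # [0, 0, 0, 1, 0, 0]
print(second_digital_output)  # [0, 1, 0, 1, 0, 1]
print(aggregation)            # [0, 1, 0, 1, 0, 1]
```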
PCT/KR2023/021613 2022-12-23 2023-12-26 Système, procédé et produit programme informatique pour seuil de détection dynamique pour lidar d'un véhicule autonome Ceased WO2024136622A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202380094495.6A CN120731123A (zh) 2022-12-23 2023-12-26 用于自动驾驶交通工具的LiDAR的动态检测阈值的系统、方法和计算机程序产品
EP23907905.6A EP4637968A1 (fr) 2022-12-23 2023-12-26 Système, procédé et produit programme informatique pour seuil de détection dynamique pour lidar d'un véhicule autonome
KR1020257021381A KR20250123820A (ko) 2022-12-23 2023-12-26 자율주행 차량의 라이다 동적 탐지 임계값을 위한 시스템, 방법 및 컴퓨터 프로그램 제품

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US18/087,950 US12344272B2 (en) 2022-12-23 2022-12-23 System, method, and computer program product for dynamic detection threshold for LiDAR of an autonomous vehicle
US18/087,950 2022-12-23
US18/088,846 2022-12-27
US18/088,846 US12039945B1 (en) 2022-12-27 2022-12-27 Methods and systems for determining characteristics of sensors using e-ink display devices
US18/153,209 US20230356146A1 (en) 2022-08-18 2023-01-11 Desiccant assembly for humidity control within a sensor housing
US18/153,209 2023-01-11

Publications (1)

Publication Number Publication Date
WO2024136622A1 true WO2024136622A1 (fr) 2024-06-27

Family

ID=91589580

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/021613 Ceased WO2024136622A1 (fr) 2022-12-23 2023-12-26 Système, procédé et produit programme informatique pour seuil de détection dynamique pour lidar d'un véhicule autonome

Country Status (4)

Country Link
EP (1) EP4637968A1 (fr)
KR (1) KR20250123820A (fr)
CN (1) CN120731123A (fr)
WO (1) WO2024136622A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170094197A * 2014-12-09 2017-08-17 BASF SE Optical detector
JP2019512704A * 2016-03-21 2019-05-16 Velodyne Lidar, Inc. LiDAR-based three-dimensional imaging with variable pulse repetition rate
US20200150232A1 (en) * 2017-03-29 2020-05-14 SZ DJI Technology Co., Ltd. Light detecting and ranging (lidar) signal processing circuitry
KR20200069678A * 2018-12-07 2020-06-17 Hyundai Autron Co., Ltd. Apparatus and method for increasing LiDAR detection range
US20220342072A1 (en) * 2020-10-19 2022-10-27 Aeva, Inc. Techniques for detection threshold adjustment for known target detection

Also Published As

Publication number Publication date
CN120731123A (zh) 2025-09-30
EP4637968A1 (fr) 2025-10-29
KR20250123820A (ko) 2025-08-18

Similar Documents

Publication Publication Date Title
CN114424086B (zh) Processing system for LiDAR measurement
WO2016153233A1 (fr) Lidar device
EP3187895B1 (fr) Variable resolution flight radar system
US9069060B1 (en) Circuit architecture for optical receiver with increased dynamic range
WO2020050498A1 (fr) Method and device for detecting a surrounding environment using image segmentation
WO2014175502A2 (fr) Method and apparatus for measuring distance using an optical signal
WO2017196062A1 (fr) Distance sensor, and calibration method performed by a device and system comprising the distance sensor
US11693098B2 (en) Shared sample and convert capacitor architecture
WO2018003227A1 (fr) Distance measuring device and distance measuring method
KR20200069678A (ko) Apparatus and method for increasing LiDAR detection range
WO2019039728A1 (fr) Extremely small three-dimensional scanning LiDAR sensor
WO2016186319A1 (fr) Vehicle driving assistance device and vehicle
WO2024080801A1 (fr) LiDAR system and method of operating the LiDAR system
WO2024136622A1 (fr) System, method, and computer program product for dynamic detection threshold for LiDAR of an autonomous vehicle
US20200081094A1 (en) Light scanning device and control method
WO2021177752A1 (fr) Micro-LiDAR sensor
WO2017057801A1 (fr) Radar apparatus for vehicle, vehicle driving assistance system, and vehicle
US20240241236A1 (en) Dynamic Alignment and Optical Stabilization of Optical Path in an Automotive-Grade LIDAR
CN119096163A (zh) Increasing the signal-to-noise ratio of pixels of a LiDAR system
WO2024039237A1 (fr) Sensor device, LiDAR device, and control method therefor
EP4411414A1 (fr) Systems and methods for retrieving point cloud data from a cache in response to trigger events
WO2017204459A1 (fr) LiDAR optical apparatus having an improved structure
JP7284979B2 (ja) Positioning system and related method
WO2013002448A1 (fr) Distance measuring apparatus for measuring distance and coordinates using a laser
WO2023234755A1 (fr) Systems and methods for breakdown voltage correction in Geiger-mode avalanche photodiode (APD) focal plane arrays (FPA)

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 23907905; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2025536979; Country of ref document: JP; Kind code of ref document: A)
WWE Wipo information: entry into national phase (Ref document number: 2025536979; Country of ref document: JP)
WWE Wipo information: entry into national phase (Ref document number: 2023907905; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)
WWP Wipo information: published in national office (Ref document number: 1020257021381; Country of ref document: KR)
WWE Wipo information: entry into national phase (Ref document number: 202380094495.6; Country of ref document: CN)
WWP Wipo information: published in national office (Ref document number: 202380094495.6; Country of ref document: CN)
WWP Wipo information: published in national office (Ref document number: 2023907905; Country of ref document: EP)