US11094191B2 - Distributed safety infrastructure for autonomous vehicles and methods of use - Google Patents
- Publication number
- US11094191B2 (application US16/396,660 / US201916396660A)
- Authority
- US
- United States
- Prior art keywords
- street
- street object
- poles
- sensor
- wireless transmission
- Prior art date: 2018-04-27
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
- G08G1/091—Traffic information broadcasting
- G06K9/00785
- G06V20/54—Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
- G08G1/0116—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
- G08G1/0133—Traffic data processing for classifying traffic situation
- G08G1/0141—Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
- G08G1/052—Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
- G08G1/093—Data selection, e.g. prioritizing information, managing message queues, selecting the information to be output
- G08G1/164—Centralised systems, e.g. external to vehicles
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
- H04N5/247
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
- H04W4/44—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits, where the received information does not generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits, where the received information generates an automatic action on the vehicle control
- G08G1/09675—Systems involving transmission of highway information, e.g. weather, speed limits, where a selection from the received information takes place in the vehicle
- G08G1/096783—Systems involving transmission of highway information, e.g. weather, speed limits, where the origin of the information is a roadside individual element
Definitions
- The present disclosure relates generally to safety infrastructures that support and enable safe autonomous vehicle operations.
- Self-driving cars have challenges in identifying pedestrians, especially pedestrians outside the field of view of their sensors.
- If a pedestrian appears in front of the car and the view of that pedestrian is obstructed by, for example, a different car, the autonomous car does not possess any inherent advantage over a human driver.
- A system for assisting navigation of traffic participants may include a plurality of poles located adjacent to a street and having at least one sensor, an object data processing unit, and a wireless transmission unit.
- The at least one sensor can be configured to collect street object data associated with at least one street object within a range of the at least one sensor.
- The object data processing unit can be configured to analyze, in real time, the street object data.
- The object data processing unit can be further configured to generate street object metadata corresponding to the at least one street object.
- The wireless transmission unit may be associated with a unique identification number and configured to broadcast the street object metadata to at least one traffic participant within the range.
- The street object metadata may be used to provide at least one warning to the at least one traffic participant of traffic conditions to allow the at least one traffic participant to take at least one proactive action.
- A method for assisting navigation of traffic participants may commence with collecting street object data by at least one sensor installed on a plurality of poles adjacent to a street.
- The street object data may be associated with at least one street object within a range of the at least one sensor.
- The method may continue with analyzing, by an object data processing unit, the street object data in real time.
- The method may further include generating, by the object data processing unit, street object metadata corresponding to the at least one street object.
- The method may continue with broadcasting, by a wireless transmission unit, the street object metadata to at least one traffic participant within the range.
- The street object metadata may be used to provide at least one warning to the at least one traffic participant of traffic conditions to allow the at least one traffic participant to take at least one proactive action.
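- As an illustration of this sequence of operations, the sketch below strings the collect-analyze-generate-broadcast steps into a single loop. It is a minimal sketch only: the `sensor` and `classifier` objects, the metadata field names, and the UDP broadcast port are hypothetical stand-ins, not elements defined by the present disclosure.

```python
import json
import socket
import time

BROADCAST_ADDR = ("255.255.255.255", 40400)  # hypothetical low-power broadcast channel

def run_pole(pole_id: str, sensor, classifier) -> None:
    """Collect street object data, analyze it in real time, generate street
    object metadata, and broadcast it to traffic participants within range."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    while True:
        frame = sensor.capture()            # collect street object data
        objects = classifier.detect(frame)  # analyze the data in real time
        metadata = [
            {
                "pole_id": pole_id,         # unique identification number of the pole
                "class": obj.label,         # e.g., "pedestrian", "car", "delivery_robot"
                "location": obj.location,   # position within the sensor's range
                "heading_deg": obj.heading, # direction of movement
                "speed_mps": obj.speed,
            }
            for obj in objects
        ]                                   # generate street object metadata
        sock.sendto(json.dumps(metadata).encode(), BROADCAST_ADDR)  # broadcast it
        time.sleep(0.1)                     # illustrative ~10 Hz update rate
```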
- FIG. 1 illustrates an environment within which systems and methods for assisting navigation of traffic participants can be implemented, in accordance with an exemplary embodiment.
- FIG. 2 is a block diagram showing various modules of an object data processing unit, in accordance with an exemplary embodiment.
- FIG. 3 is a block diagram showing various modules of a system for assisting navigation of traffic participants, in accordance with an exemplary embodiment.
- FIG. 4 is a flow chart illustrating a method for assisting navigation of traffic participants, in accordance with an exemplary embodiment.
- FIG. 5 shows a diagrammatic representation of a computing device for a machine in the exemplary electronic form of a computer system, within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein can be executed.
- Smart city features should include support for autonomous driving, covering not just cars that transport people but also autonomous cargo vehicles. A large city can be compared to a warehouse with distractions.
- A typical street may have pedestrians, on foot or riding personal transportation equipment such as bicycles, scooters, and the like; autonomously driven cars; computer-assisted cars; human-driven cars; delivery robots; animals; stationary objects; and lighting or utility poles. Many traffic lights already have integrated cameras, but they are focused on cars.
- The infrastructures of the present disclosure contemplate the use of additional cameras mounted on other poles or other types of street-adjacent structures. These cameras are configured to focus on both street surfaces and adjacent pedestrian areas.
- A smart module at each pole may have Internet connectivity and a real-time, low-latency, low-power (short-distance) radio broadcasting module with low bandwidth requirements.
- The camera can generate a real-time video feed to be processed into a low-fidelity stream that includes associated data, such as the location, direction, and speed of each object in the field of view of the camera, predictions, other metadata, and so forth, as well as, optionally, a map of stationary objects.
- The sensors may include a hyper-spectral imaging system. While hyper-spectral imaging is typically of lower resolution than visible-light imaging, it provides additional information about materials, objects, and processes through the output of an image cube in which each pixel holds a vector of values, one per wavelength band of the hyper-spectrometer. Hyper-spectral imaging can help better differentiate between living and non-living objects and provide better information about weather conditions, among other uses. The processing unit on the pole may overlay the hyper-spectral image on the information from other sensors to provide additional information not otherwise easily available to the imaging sensor, even with computational post-processing. Hyper-spectral information can provide additional information about street conditions and help cars make appropriate adjustments to their behavior and/or speed.
- Another use of hyper-spectral imaging is similar to precision agriculture: the hyper-spectral image contains information about the needs of trees, plants, or flowers growing along the sidewalk, including watering requirements, weeds, and diseases. This can enable automated, robot-based precision maintenance of greenery in cities.
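- To make the image-cube output concrete, the sketch below treats one hyper-spectral frame as a height x width x bands NumPy array and computes a normalized difference vegetation index (NDVI), a standard red/near-infrared contrast that helps separate living vegetation from non-living surfaces. The band indices and cube size are assumptions for a hypothetical hyper-spectrometer, not parameters from the present disclosure.

```python
import numpy as np

def ndvi(cube: np.ndarray, red_band: int, nir_band: int) -> np.ndarray:
    """Compute per-pixel NDVI from a hyper-spectral image cube.

    cube: array of shape (height, width, bands); each pixel holds a vector
    of values, one per wavelength band of the hyper-spectrometer.
    """
    red = cube[:, :, red_band].astype(float)
    nir = cube[:, :, nir_band].astype(float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero

# Hypothetical usage: band 30 near 660 nm (red), band 90 near 860 nm (near-IR).
cube = np.random.rand(64, 64, 128)                       # stand-in for a captured frame
vegetation = ndvi(cube, red_band=30, nir_band=90) > 0.3  # mask of likely living plants
```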
- The sensors may also include a radar.
- A radar can be installed on the pole. Imaging sensors work best in good weather conditions; in adverse weather (rain, snow) their performance degrades, even when the sensor's protective surface is unaffected thanks to the height of the sensor and its downward orientation.
- A radar can provide an additional layer of information to help offset degradation of the imaging sensors and still provide the most important information to the cars and other signal recipients. Additionally, stereo vision can help produce more accurate spatial, size, and dimension information about non-human objects. While radar performance varies and depends on multiple factors, including the radio frequency, radars are typically not affected by rain or snow as much as passive or active visible-spectrum imaging systems.
- A depth-measuring device can utilize the parallax between two cameras to add stereo vision and help build a skeleton model of pedestrians.
- The skeleton model can be used to improve predictions, as well as to help differentiate situation-aware people on the side of a street from distracted ones.
- Stereo vision can be achieved from a single pole or with overlapping information from multiple poles.
- The parallax can be either horizontal or vertical (cameras located at different heights on the pole). Additionally, greenery-maintenance robots can rely on pole-enabled vision to improve the precision of maintenance jobs by augmenting their on-board imaging systems.
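- A minimal sketch of the parallax-based depth estimate, assuming two calibrated cameras with a known baseline (horizontal, or vertical when the cameras sit at different heights on the pole): depth follows from the standard pinhole relation depth = focal length x baseline / disparity. The numbers below are illustrative only.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate the distance to a point seen by two cameras.

    focal_px:     focal length expressed in pixels
    baseline_m:   separation between the two cameras, in meters
    disparity_px: shift of the point between the two images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both views with positive disparity")
    return focal_px * baseline_m / disparity_px

# Illustrative: 1400 px focal length, cameras 0.5 m apart on the pole, and a
# pedestrian feature shifted 35 px between the two views -> about 20 m away.
print(depth_from_disparity(1400.0, 0.5, 35.0))  # 20.0
```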
- A drone can be added to the autonomous vehicle such that, when the autonomous vehicle leaves the area covered by the smart city infrastructure, the drone can be dispatched from the autonomous vehicle to fly above and in front of it and guide it from above.
- The information transmitted by the drone can be shared between different cars, thereby creating a "herd immunity" effect.
- The cars can share the signal received from the drones with other cars, so that cars that are not using drones can take advantage of the same signal.
- Peer-to-peer (e.g., pole-to-pole) communication can be implemented.
- Direct communication between poles can be implemented, thereby creating a mesh network.
- This mesh network can send advance notices to other poles even when cellular infrastructure is down or unavailable. Examples of the information that can be transmitted between poles include an accident that is outside the view of a car and/or other poles, and information about a fast-moving object passing between the fields of view of poles (similar to a hand-off procedure between air traffic controllers tracking an airplane).
- The same information can also be transmitted through the cloud using cellular networks.
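- One way to picture such a pole-to-pole hand-off is as a small message a pole forwards to a mesh neighbor when a fast-moving object is about to leave its field of view, so the neighbor can prioritize that region before its own sensors confirm the object. The message fields and UDP transport below are assumptions for illustration; the present disclosure does not define a mesh protocol.

```python
import json
import socket

def hand_off(neighbor_addr: tuple, pole_id: str, track: dict) -> None:
    """Forward a tracked object to an adjacent pole over the mesh network.

    Because the message travels pole-to-pole, the advance notice works even
    when cellular infrastructure is down or unavailable.
    """
    notice = {
        "type": "hand_off",
        "from_pole": pole_id,             # unique identification number of the sender
        "object_class": track["class"],   # e.g., "car"
        "location": track["location"],
        "speed_mps": track["speed"],
        "heading_deg": track["heading"],
    }
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(json.dumps(notice).encode(), neighbor_addr)

# Hypothetical usage: a pole warns its northern neighbor about a speeding car.
hand_off(("10.0.0.32", 40401), "pole-1F",
         {"class": "car", "location": (37.77, -122.42), "speed": 22.0, "heading": 15.0})
```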
- Predictions can be made by observing objects for a period of time.
- For objects such as humans, behavioral predictions can be made and communicated alongside location information.
- This can also include the level of distraction: a human talking on a cellphone has limited attention and vision with which to notice an approaching car, compared to an alert human.
- Observing a human for several seconds or even minutes can produce a model that can predict the next actions of that specific human, with or without a skeleton model.
- Only actions within the next few seconds need to be predicted, and this is one of the key differentiators of the pole-based system from the on-board-only system of a fast-moving car.
- At 30 miles per hour, a car moves 44 feet in 1 second. This translates to a sub-second window in which the car must detect any moving object in its field of vision, which is often inadequate to make a good prediction.
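- The arithmetic behind this point, worked as a quick check (the speeds and sight distances are illustrative): 44 feet per second corresponds to 30 miles per hour, so an object emerging from behind an obstruction only a few car lengths ahead leaves around a second, or less, for detection, prediction, and reaction.

```python
MPH_TO_FTPS = 5280 / 3600  # feet per mile divided by seconds per hour, ~1.467

def available_time_s(sight_distance_ft: float, speed_mph: float) -> float:
    """Time between first possible sighting of an object and reaching it."""
    return sight_distance_ft / (speed_mph * MPH_TO_FTPS)

print(available_time_s(44.0, 30.0))  # 1.00 s  (30 mph = 44 ft/s)
print(available_time_s(30.0, 30.0))  # ~0.68 s: sub-second, as noted above
print(available_time_s(66.0, 45.0))  # 1.00 s even at a greater sight distance
```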
- FIG. 1 illustrates an environment 100 (e.g., infrastructure) within which methods and systems for assisting navigation of traffic participants can be implemented.
- The environment 100 may include a system for assisting navigation of traffic participants, also referred to as a system 300, pedestrians 118, cars 120, specialized equipment 122, and a data network 124.
- The data network 124 may include the Internet, a computing cloud, and any other network capable of communicating data between devices. Suitable networks may include or interface with any one or more of, for instance, a local intranet, a Personal Area Network, a Local Area Network, a Wide Area Network, a Metropolitan Area Network, a virtual private network, a storage area network, a frame relay connection, an Advanced Intelligent Network connection, a synchronous optical network connection, a digital T1, T3, E1 or E3 line, Digital Data Service connection, Digital Subscriber Line connection, an Ethernet connection, an Integrated Services Digital Network line, a dial-up port such as a V.90, V.34 or V.34bis analog modem connection, a cable modem, an Asynchronous Transfer Mode connection, or a Fiber Distributed Data Interface or Copper Distributed Data Interface connection.
- Communications may also include links to any of a variety of wireless networks, including Wireless Application Protocol, General Packet Radio Service, Global System for Mobile Communication, Code Division Multiple Access or Time Division Multiple Access, cellular phone networks, Global Positioning System, cellular digital packet data, Research in Motion, Limited duplex paging network, Bluetooth radio, or an IEEE 802.11-based radio frequency network.
- The data network can further include or interface with any one or more of a Recommended Standard 232 (RS-232) serial connection, an IEEE-1394 (FireWire) connection, a Fiber Channel connection, an IrDA (infrared) port, a Small Computer Systems Interface connection, a Universal Serial Bus connection or other wired or wireless, digital or analog interface or connection, mesh or Digi® networking.
- The data network may include a network of data processing nodes, also referred to as network nodes, that are interconnected for the purpose of data communication.
- The system 300 may include a plurality of poles 102, a safety data gathering and processing apparatus, also referred to herein as an object data processing unit 104, and a wireless interface, also referred to herein as a wireless transmission unit 112.
- The object data processing unit 104 and the wireless transmission unit 112 can be disposed in the pole 102 and, optionally, placed into a weatherproof housing 116.
- The wireless transmission unit 112 can be a component of the object data processing unit 104.
- The object data processing unit 104 is shown in further detail in FIG. 2.
- The object data processing unit 104 may include a camera 106 (such as a high-resolution camera), an optional infrared (IR) camera 108, an image/video processing unit, also referred to herein as an image processing unit 110, a wireless transmission unit 112, and a plurality of environmental sensors 114A-N.
- The sensors 114A-N can also be hyper-spectral image sensors or radars.
- The image processing unit 110 and the wireless transmission unit 112 can be enclosed in the weatherproof housing 116.
- The object data processing unit 104 on the pole 102 can have the camera 106, which can include a high-resolution camera operating in the visible spectrum.
- The IR camera 108 can be used to generate IR images that can be processed to correctly classify objects, differentiate living objects from non-living objects, and help during nighttime when artificial lighting is not turned on or is insufficient for the visible-light imaging sensor.
- The image processing unit 110 can perform image segmentation, classification, and movement vector calculation in real time.
- The image processing unit 110 may be a specifically programmed computing device, such as an integrated microprocessor, and may, in some embodiments, include an application-specific integrated circuit or other similar computing device.
- The image processing unit 110 can include any of the features of the computer system 500 of FIG. 5 as well.
- The object data processing unit 104 mounted on the pole 102 can include the wireless transmission unit 112 for Internet connectivity, used to synchronize with the data network 124, such as a cloud, for updates and for use as a control channel if needed.
- The wireless transmission unit 112 can also provide wireless broadcasting of processed information, such as location information that can be derived from hardwired components or determined from Global Positioning System (GPS)-based information.
- The wireless transmission unit 112 can also broadcast information that may assist in triangulating the location of receivers near the street. This can be especially important in downtowns with high-rises, where the GPS signal is blocked or distorted through reflection off buildings.
- Triangulation with beacons can provide precision within a very short distance, unobstructed by clouds and buildings.
- Each pole 102 and/or the components mounted on the pole 102 can be identified using a unique identification number, which can be used for data addressing, determining the location where the data were collected by the pole, triangulation, and so forth.
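- A receiver that hears beacons from several uniquely identified poles at known positions can estimate its own location even where GPS is blocked. Below is a minimal least-squares trilateration sketch, assuming the receiver has already converted beacon timing into distances; the pole coordinates are illustrative.

```python
import numpy as np

def trilaterate(poles: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Estimate a receiver's 2-D position from ranges to three or more poles.

    Linearizes the circle equations |x - p_i|^2 = d_i^2 by subtracting the
    first equation, then solves the resulting linear system in the
    least-squares sense.
    """
    p0, d0 = poles[0], distances[0]
    A = 2.0 * (poles[1:] - p0)
    b = (d0**2 - distances[1:]**2
         + np.sum(poles[1:]**2, axis=1) - np.sum(p0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Illustrative: three poles at street corners (meters), receiver at (12, 5).
poles = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 40.0]])
receiver = np.array([12.0, 5.0])
distances = np.linalg.norm(poles - receiver, axis=1)
print(trilaterate(poles, distances))  # ~[12.  5.]
```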
- All components disclosed for use on the pole 102 can be enclosed in the weatherproof housing 116.
- FIG. 3 is a block diagram showing various modules of a system 300 for assisting navigation of traffic participants, in accordance with certain embodiments.
- The system 300 may include a plurality of poles 102 adjacent to a street, an object data processing unit 104, a wireless transmission unit 112, and, optionally, a weatherproof housing 116 and a drone 310.
- One or more sensors may be mounted on the plurality of poles 102.
- The weatherproof housing 116 may be configured to hold the object data processing unit 104 and the wireless transmission unit 112.
- Each of the object data processing unit 104 and the wireless transmission unit 112 may include a programmable processor, such as a microcontroller, a central processing unit, and so forth.
- Each of the object data processing unit 104 and the wireless transmission unit 112 may also include an application-specific integrated circuit or programmable logic array designed to implement the functions performed by the system 300. The operations performed by components of the system 300 are further described in detail with reference to FIG. 4.
- FIG. 4 is a process flow diagram showing a method 400 for assisting navigation of traffic participants, according to an exemplary embodiment.
- The operations may be combined, performed in parallel, or performed in a different order.
- The method 400 may also include additional or fewer operations than those illustrated.
- The method 400 may be performed by processing logic that comprises hardware (e.g., decision-making logic, dedicated logic, programmable logic, and microcode), software (such as software run on a general-purpose computer system or a dedicated machine), or a combination of both.
- The method 400 may commence with collecting, by at least one sensor installed on a plurality of poles adjacent to a street, street object data at operation 402.
- The at least one sensor may include one or more of the following: a camera, an infrared camera, a high-resolution camera, a hyper-spectral image sensor, a radar, an image/video processing unit, a wireless interface, an environmental sensor, a depth estimation sensor, a parallax-based sensor, and so forth.
- The infrared camera and the hyper-spectral image sensor can be configured to assist in differentiating between living objects and non-living objects and to provide information about weather conditions.
- The at least one sensor may be pointed at an area of a possible collision.
- The collected street object data may be associated with at least one street object within a range of the at least one sensor.
- The at least one street object may include an autonomous vehicle, a semi-autonomous vehicle, a human-operated vehicle, a delivery robot, a drone, a pedestrian, a human riding a bicycle, a scooter, or any other type of light personal transportation equipment, specialized equipment, a garbage can, automated garbage collection equipment, an animal, a plant, automated equipment for irrigating the plant, and so forth.
- The method 400 may continue with analyzing, by an object data processing unit, the street object data in real time at operation 404.
- The object data processing unit may include an image processing unit.
- The image processing unit may be configured to process at least one image included in the street object data.
- The image processing unit may include an integrated microprocessor or specially designed dedicated circuitry.
- The method 400 may further include, at operation 406, generating street object metadata corresponding to the at least one street object.
- The street object metadata may be generated based on the collected street object data and the analysis of the street object data.
- The street object metadata may include one or more of the following: a classification of the at least one street object based on a machine learning technique trained on a set of predetermined objects, an object location based on location services associated with the wireless transmission unit or on triangulation, an object direction, an object velocity, and so forth.
- The object data processing unit may store the street object metadata to a cloud, synchronize the street object metadata with the cloud for updates, or use the street object metadata to establish a control channel using the wireless interface of the wireless transmission unit.
- The method 400 may further include broadcasting, by the wireless transmission unit, the street object metadata to at least one traffic participant within the range at operation 408.
- The wireless transmission unit may be configured to broadcast the street object metadata using real-time, low-latency, low-power radio transmitting capabilities with low bandwidth requirements.
- The wireless transmission unit may be associated with a unique identification number. In particular, a unique identification number may be assigned to the wireless transmission unit of each pole of the plurality of poles.
- The street object metadata may be used to provide at least one warning to the at least one traffic participant of traffic conditions to allow the at least one traffic participant to take at least one proactive action.
- The at least one warning may be provided by generating an audible signal via a hearing aid or a smartphone, via cellular or mobile data, by sending an alarm to a first responder, and so forth.
- Traffic participants can use the broadcast signal from adjacent poles to see beyond objects separating the street and the sidewalks, around a corner where a building blocks the view, or to "see" in front of parked vehicles.
- The pole may continuously track all objects in its field of view in real time and broadcast their positions and metadata about them, including an estimate of their mass/size, their vector of movement, and a prediction of their next moves, when such a prediction can be made.
- The autonomous vehicle can process this information in a number of ways, including, but not limited to, slowing down in advance of a possible collision, focusing sensors and cameras in the direction of a possible collision, and the like.
- The autonomous vehicle may be configured to receive notifications from adjacent poles through encrypted or open communication.
- Warning human drivers about approaching pedestrians can also be accomplished with this signal.
- The car can also perform proactive braking or other course-correction actions in addition to issuing a warning.
- The same technology that informs autonomous vehicles about pedestrians can be used to inform pedestrians about cars in their vicinity.
- Hearing aids used by people with reduced hearing can be equipped to receive signals broadcast by the pole over the wireless transmission unit and alert their wearers to approaching cars, including interpreting additional information about their speed, direction, or any other street object metadata that would be valuable for a pedestrian.
- The wireless transmission unit of the pole can broadcast information capable of being received by a smartphone or an enabled hearing aid.
- Audible, visual, or tactile messages such as beeps, light flashes, vibrations, or natural-language commands can be provided to the wearer.
- Humans can also ride bicycles, scooters, and other light personal transportation equipment, either human-powered or electric-powered.
- Such equipment can be configured with receivers for the signal broadcast from the poles and can warn the human about an imminent threat of a collision.
- This light personal transportation equipment can also act proactively upon the warning, by braking or taking other collision avoidance actions.
- The street objects can be observed for a period of time and, based on the data collected during the observation, the future behavior of the at least one street object can be predicted.
- The predicted behavior may be used to generate street object metadata, based on which a warning to the at least one traffic participant of traffic conditions can be provided.
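- As a minimal illustration of turning a short observation window into a forward prediction, the sketch below fits a constant-velocity model to a few seconds of position samples and extrapolates ahead. A deployed system would use richer behavioral models (and skeleton data) as described above; the data here are invented for the example.

```python
import numpy as np

def predict_position(times_s: np.ndarray, positions: np.ndarray,
                     horizon_s: float) -> np.ndarray:
    """Fit position = p0 + v * t per coordinate and extrapolate horizon_s
    seconds past the last observation."""
    coeffs = np.polyfit(times_s, positions, deg=1)  # rows: [velocity, intercept]
    t_future = times_s[-1] + horizon_s
    return coeffs[0] * t_future + coeffs[1]

# A pedestrian observed for 3 seconds, drifting toward the curb.
t = np.array([0.0, 1.0, 2.0, 3.0])
xy = np.array([[0.0, 0.0], [1.1, 0.1], [2.0, 0.2], [3.1, 0.3]])
print(predict_position(t, xy, horizon_s=2.0))  # ~[5.12 0.5]
```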
- The plurality of poles may share the street object metadata to create a mesh network and send advance notices to further poles when cellular infrastructure is unavailable.
- The method 400 may further include following, with a drone, the at least one street object outside the range of the at least one sensor.
- The drone may collect the street object data outside the range of the at least one sensor.
- The collected data may be provided by the drone to the object data processing unit.
- A mobile device processing the signal broadcast by the poles can react by alerting its user at street crossings to approaching cars, other pedestrians, or other obstacles in their path (for example, poles).
- A smartphone can receive a message and display its contents directly on the smartphone, informing the user of surrounding traffic.
- The same pole broadcasting infrastructure can be used for emergency alerting, including weather emergencies and AMBER alerts.
- The poles can also be equipped to accept an emergency signal (911) for reporting an emergency, via a cellular wireless signal or a separate, low-power radio protocol.
- A plurality of beacons may be used to triangulate an object location when GPS is obstructed by clouds or buildings. Meanwhile, even if equipped with beacons, some street objects, such as garbage cans, can still present a challenge for automated pickup. Combining on-board cameras with the pole's view from above can create a robust method of operating autonomous garbage-collection vehicles in smart cities. This can also assist in locating misplaced garbage cans.
- Last-mile delivery may use delivery robots navigating pedestrian walkways. While cameras can be used by these robots to avoid collisions with humans, additional information received from the poles may help choose the path with the least probability of a collision and provide a feedback loop for the on-board cameras, as the robot can continuously match visible moving obstacles with information received from the poles. Location-based information can also augment the location determined with GPS.
- The collected street object data and the generated street object metadata can be stored at the pole for a limited time for processing by law enforcement and retrieved with specialized equipment when this information is not streamed to the cloud.
- This information can help law enforcement correctly attribute the cause of accidents and provide ground-truth information for autonomous vehicle algorithms for their future improvement and further accident reduction.
- For privacy reasons, it may be important to broadcast only metadata.
- The metadata only includes location, child/adult classifier information, moving direction, speed, and prediction information. Since the signal is localized and low-power, no information is broadcast that is not already available to a camera or to any human on the street. Real imagery can still be sent to the cloud, but this is optional, and the imagery may not be broadcast.
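- A sketch of what such a metadata-only broadcast payload might look like. The field names are assumptions for illustration (the present disclosure contemplates a standardized message format but does not define one); note that the message carries no imagery, only facts observable by any camera or bystander on the street.

```python
import json

# Hypothetical metadata-only payload: location, child/adult classifier,
# moving direction, speed, and prediction information -- no raw imagery.
message = {
    "pole_id": "pole-1F",
    "timestamp": "2019-04-27T12:00:00Z",
    "objects": [
        {
            "class": "pedestrian",
            "age_group": "child",              # child/adult classifier information
            "location": [37.77490, -122.41940],
            "heading_deg": 90.0,               # moving east
            "speed_mps": 1.4,
            "predicted_location_2s": [37.77490, -122.41937],
        }
    ],
}
payload = json.dumps(message).encode()  # compact enough for low-bandwidth broadcast
```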
- Beacon and main-signal transmission power can take weather conditions into consideration and increase under rain or snow, as sensed by any of the integrated environmental sensors (such as sensors 114A-N shown in FIG. 1) or as reported by a weather service that provides weather data over the wireless transmission unit.
- A camera installed on the pole, because of its downward view, can be better protected against rain, snow, or dirt, especially compared to car sensors. It has been reported that self-driving cars cannot be cleaned with automated car washes and must be cleaned by hand, due to the sensitivity of instruments that cannot be completely isolated from the weather when located on a car. In some cities with high levels of dust pollution, autonomous vehicles would be blinded and unable to navigate.
- The pole-mounted systems for assisting navigation of traffic participants may be protected against spoofing attacks (transmitters pretending to be poles) by implementing public-key cryptography or cryptographically signed data.
- Adversarial spoofing can still be attempted with cardboard cutouts of vehicles and other objects intended to affect the processing of the camera, but using a dual visible-spectrum/infrared-spectrum system can significantly reduce such mistakes. Even if this "visual" spoofing occurs, the spoofed object should still be avoided by the car, thus reducing the negative consequences of this form of input spoofing.
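- A minimal sketch of the cryptographic signing defense, assuming the Python `cryptography` package and one Ed25519 key pair provisioned per pole (key distribution and management are outside this sketch): the pole signs every broadcast, and receivers verify against the pole's published public key, discarding messages from impostors.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Provisioning, done once per pole; the public key is published alongside
# the pole's unique identification number.
pole_key = Ed25519PrivateKey.generate()
pole_pub = pole_key.public_key()

def sign_broadcast(payload: bytes) -> bytes:
    """Pole side: append the 64-byte Ed25519 signature to the message."""
    return payload + pole_key.sign(payload)

def verify_broadcast(blob: bytes, pub: Ed25519PublicKey) -> bytes:
    """Receiver side: raise InvalidSignature unless the claimed pole signed it."""
    payload, signature = blob[:-64], blob[-64:]
    pub.verify(signature, payload)  # raises on spoofed or tampered data
    return payload

blob = sign_broadcast(b'{"objects": []}')
try:
    print(verify_broadcast(blob, pole_pub))   # authentic broadcast passes
except InvalidSignature:
    print("dropped spoofed broadcast")
```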
- A message format used between the pole-mounted system for assisting navigation of traffic participants and devices receiving broadcasts from the poles can be standardized so that both pole equipment and receiver equipment of various manufacturers can be interoperable.
- Pole-mounted systems for assisting navigation of traffic participants can be used to automatically report incidents involving people. Based on predicted data, such as behavioral predictions, and a skeleton model, the object data processing unit may conclude that a human has had an accident (e.g., fallen to the ground) and needs help. Based on this conclusion, the system for assisting navigation of traffic participants may generate a warning to an appropriate person or entity, e.g., a first responder, medical personnel, security personnel, and so forth. This can be useful in playgrounds, retirement communities, hospital territories, college campuses, and the like. Recorded collisions can be automatically saved with geolocation and other meta-information, timestamped and cryptographically signed to prevent evidence tampering.
- FIG. 5 shows a diagrammatic representation of a machine in the exemplary electronic form of a computer system 500, within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
- The machine may operate as a standalone device or be connected (e.g., networked) to other machines.
- The machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- The machine may be an embedded computer, a personal computer (PC), a tablet PC, a set-top box, a cellular telephone, a portable music player (e.g., a portable hard drive audio device such as a Moving Picture Experts Group Audio Layer 3 player), a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
- The term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
- The exemplary computer system 500 includes a processor or multiple processors 502 (e.g., a central processing unit, a graphics processing unit, or both), a main memory 504, and a static memory 506, which communicate with each other via a bus 508.
- The computer system 500 may further include a video display unit 510 (e.g., a liquid crystal display or a cathode ray tube).
- The computer system 500 may also include an alphanumeric input device 512 (e.g., a keyboard), a cursor control device 514 (e.g., a mouse), a disk drive unit 516, a signal generation device 518 (e.g., a speaker), and a network interface device 520, or the capability to connect these peripheral devices for maintenance purposes.
- The disk drive unit 516 includes a non-transitory computer-readable medium 522, on which is stored one or more sets of instructions and data structures (e.g., instructions 524) embodying or utilized by any one or more of the methodologies or functions described herein.
- The instructions 524 may also reside, completely or at least partially, within the main memory 504 and/or within the processors 502 during execution thereof by the computer system 500.
- The main memory 504 and the processors 502 may also constitute machine-readable media.
- The instructions 524 may further be transmitted or received over a network 526 via the network interface device 520 utilizing any one of a number of well-known transfer protocols (e.g., Hyper Text Transfer Protocol).
- While the computer-readable medium 522 is shown in an example embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions.
- The term "computer-readable medium" shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present application, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions.
- The term "computer-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Such media may also include, without limitation, hard disks, floppy disks, flash memory cards, digital video disks, random access memory, read-only memory, and the like.
- The exemplary embodiments described herein may be implemented in an operating environment comprising software installed on a computer, in hardware, or in a combination of software and hardware.
- The computer system 500 may be implemented as a cloud-based computing environment, such as a virtual machine operating within a computing cloud.
- The computer system 500 may itself include a cloud-based computing environment, where the functionalities of the computer system 500 are executed in a distributed fashion.
- The computer system 500, when configured as a computing cloud, may include pluralities of computing devices in various forms, as will be described in greater detail below.
- A cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors (such as within web servers) and/or combines the storage capacity of a large grouping of computer memories or storage devices.
- Systems that provide cloud-based resources may be utilized exclusively by their owners or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Computer Networks & Wireless Communication (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/396,660 US11094191B2 (en) | 2018-04-27 | 2019-04-27 | Distributed safety infrastructure for autonomous vehicles and methods of use |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862664014P | 2018-04-27 | 2018-04-27 | |
| US16/396,660 US11094191B2 (en) | 2018-04-27 | 2019-04-27 | Distributed safety infrastructure for autonomous vehicles and methods of use |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20190333378A1 US20190333378A1 (en) | 2019-10-31 |
| US11094191B2 true US11094191B2 (en) | 2021-08-17 |
Family
ID=68292736
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/396,660 Expired - Fee Related US11094191B2 (en) | 2018-04-27 | 2019-04-27 | Distributed safety infrastructure for autonomous vehicles and methods of use |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US11094191B2 (en) |
Families Citing this family (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11891284B2 (en) * | 2018-03-28 | 2024-02-06 | The Heil Co. | Camera safety system for aerial device |
| US11100329B1 (en) * | 2019-08-14 | 2021-08-24 | Lytx, Inc. | Ranging system data utilization for marking of video data of interest |
| US11610486B1 (en) * | 2019-08-14 | 2023-03-21 | Traffic & Parking Control Co., Inc. | Connected-vehicle interface module and method of use |
| FR3107493B1 (en) * | 2020-02-20 | 2022-05-13 | Renault Sas | Driving assistance device at an intersection |
| US10992401B1 (en) | 2020-03-05 | 2021-04-27 | Rovi Guides, Inc. | Systems and methods for generating playlist for a vehicle |
| US10972206B1 (en) * | 2020-03-05 | 2021-04-06 | Rovi Guides, Inc. | Systems and methods for generating playlist for a vehicle |
| US11805160B2 (en) | 2020-03-23 | 2023-10-31 | Rovi Guides, Inc. | Systems and methods for concurrent content presentation |
| US11790364B2 (en) | 2020-06-26 | 2023-10-17 | Rovi Guides, Inc. | Systems and methods for providing multi-factor authentication for vehicle transactions |
| US11599880B2 (en) | 2020-06-26 | 2023-03-07 | Rovi Guides, Inc. | Systems and methods for providing multi-factor authentication for vehicle transactions |
| CN111739344B (en) * | 2020-06-29 | 2022-11-08 | 北京百度网讯科技有限公司 | Early warning method and device and electronic equipment |
| US12211061B2 (en) | 2020-07-31 | 2025-01-28 | Adeia Guides Inc. | Systems and methods for providing an offer based on calendar data mining |
| WO2022044125A1 (en) * | 2020-08-25 | 2022-03-03 | 日本電気株式会社 | Information provision device, information provision method, and program |
| CN112101156B (en) * | 2020-09-02 | 2024-08-27 | 杭州海康威视数字技术股份有限公司 | A method, device and electronic device for target recognition |
| US12405615B2 (en) * | 2021-06-04 | 2025-09-02 | Rhoman Aerospace Corporation | Cloud and hybrid-cloud flight vehicle and robotic control system AI and ML enabled cloud-based software and data system method for the optimization and distribution of flight control and robotic system solutions and capabilities |
| KR102721015B1 (en) * | 2023-09-25 | 2024-10-24 | 한국건설기술연구원 | Cooperative Perception System And Cooperative Perception Method |
- 2019-04-27: US application US16/396,660 filed; granted as US11094191B2 (en); status: not active, Expired - Fee Related
Patent Citations (23)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5999877A (en) * | 1996-05-15 | 1999-12-07 | Hitachi, Ltd. | Traffic flow monitor apparatus |
| US20060274917A1 (en) * | 1999-11-03 | 2006-12-07 | Cet Technologies Pte Ltd | Image processing techniques for a video based traffic monitoring system and methods therefor |
| US20050169500A1 (en) * | 2004-01-30 | 2005-08-04 | Fujitsu Limited | Method of and apparatus for setting image-capturing conditions, and computer program |
| US20060095199A1 (en) * | 2004-11-03 | 2006-05-04 | Lagassey Paul J | Modular intelligent transportation system |
| US20070067410A1 (en) * | 2005-09-20 | 2007-03-22 | Mulligan Bryan P | Method and apparatus for the surveillance, monitoring, management and control of vehicular traffic |
| US20080285803A1 (en) * | 2007-05-15 | 2008-11-20 | Jai Inc., Usa. | Modulated light trigger for license plate recognition cameras |
| US20090041302A1 (en) * | 2007-08-07 | 2009-02-12 | Honda Motor Co., Ltd. | Object type determination apparatus, vehicle, object type determination method, and program for determining object type |
| US20110001626A1 (en) * | 2008-02-22 | 2011-01-06 | Tri-Concept Technology Limited | Apparatus and system for led street lamp monitoring and control |
| US20100063736A1 (en) * | 2008-09-05 | 2010-03-11 | Robert Bosch Gmbh | Collision avoidance system and method |
| US20100271497A1 (en) * | 2009-04-28 | 2010-10-28 | Monsive Jr Michael G | Portable traffic monitoring system and methods for use |
| US20130141576A1 (en) * | 2011-12-01 | 2013-06-06 | Richard T. Lord | Determining threats based on information from road-based devices in a transportation-related context |
| US20130144490A1 (en) * | 2011-12-01 | 2013-06-06 | Richard T. Lord | Presentation of shared threat information in a transportation-related context |
| US10521665B2 (en) * | 2012-08-06 | 2019-12-31 | Cloudparc, Inc. | Tracking a vehicle using an unmanned aerial vehicle |
| US20150035685A1 (en) * | 2013-08-02 | 2015-02-05 | Honda Patent & Technologies North America, LLC | Vehicle to pedestrian communication system and method |
| US20170023945A1 (en) * | 2014-04-04 | 2017-01-26 | Koninklijke Philips N.V. | System and methods to support autonomous vehicles via environmental perception and sensor calibration and verification |
| US20160125246A1 (en) * | 2014-10-30 | 2016-05-05 | Kent W. Ryhorchuk | System and method for parking and traffic analysis |
| US10178430B2 (en) * | 2015-01-26 | 2019-01-08 | Hangzhou Hikvision Digital Technology Co., Ltd. | Intelligent processing method and system for video data |
| US20170061791A1 (en) * | 2015-09-02 | 2017-03-02 | Constructron, Inc. | Automated traffic control system for use in construction and work zones |
| US20170301220A1 (en) * | 2016-04-19 | 2017-10-19 | Navio International, Inc. | Modular approach for smart and customizable security solutions and other applications for a smart city |
| US10147315B2 (en) * | 2016-07-27 | 2018-12-04 | Here Global B.V. | Method and apparatus for determining split lane traffic conditions utilizing both multimedia data and probe data |
| US20180096595A1 (en) * | 2016-10-04 | 2018-04-05 | Street Simplified, LLC | Traffic Control Systems and Methods |
| US10516858B2 (en) * | 2016-12-28 | 2019-12-24 | Panasonic Intellectual Property Management Co., Ltd. | Monitoring system, monitoring method, and program |
| US9952594B1 (en) * | 2017-04-07 | 2018-04-24 | TuSimple | System and method for traffic data collection using unmanned aerial vehicles (UAVs) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12148285B1 (en) | 2020-09-02 | 2024-11-19 | Wm Intellectual Property Holdings L.L.C. | System and method for oncoming vehicle detection and alerts for a waste collection vehicle |
| US12394317B2 (en) | 2023-10-16 | 2025-08-19 | Plusai, Inc. | Automatic event capturing for autonomous vehicle driving |
| US12424104B2 (en) * | 2023-10-16 | 2025-09-23 | Plusai, Inc. | Motion planning for autonomous vehicle driving using vehicle-to-infrastructure communication |
Also Published As
| Publication number | Publication date |
|---|---|
| US20190333378A1 (en) | 2019-10-31 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11094191B2 (en) | Distributed safety infrastructure for autonomous vehicles and methods of use | |
| US11928149B2 (en) | Systems and methods for querying a distributed inventory of visual data | |
| US11475774B2 (en) | Systems and methods for machine learning based collision avoidance | |
| US11430331B2 (en) | Power and thermal management systems and methods for autonomous vehicles | |
| EP4198944A1 (en) | Roadside sensing system and traffic control method | |
| KR20230092673A (en) | Multi-modal segmentation network for enhanced semantic labeling in mapping | |
| KR20230051035A (en) | Object detection using radar and lidar fusion | |
| JP7420734B2 (en) | Data distribution systems, sensor devices and servers | |
| CN116552511A (en) | Detection of traffic dynamics and road changes in autonomous driving | |
| US20220020271A1 (en) | Method and system for vehicle navigation using information from smart node | |
| US12183061B2 (en) | Identifying new classes of objects in environments of vehicles | |
| US20240005666A1 (en) | Managing vehicle resources based on scenarios | |
| Fatima et al. | Mobile crowdsensing with energy efficiency to control road congestion in internet cloud of vehicles: a review | |
| CN111369760A (en) | A nighttime pedestrian safety early warning device and method based on UAV | |
| CN115240470A (en) | NR-V2X-based weak traffic participant collision early warning system and method | |
| Agarwal et al. | Federated Learning in Intelligent Traffic Management System | |
| KR102789279B1 (en) | Methods and systems for agent prioritization | |
| US20240126268A1 (en) | Track refinement networks | |
| US12267569B2 (en) | Plenoptic sensor devices, systems, and methods | |
| US20240048853A1 (en) | Pulsed-Light Optical Imaging Systems for Autonomous Vehicles | |
| WO2024081259A1 (en) | Region of interest detection for image signal processing | |
| EP4605908A1 (en) | Identifying new classes of objects in environments of vehicles | |
| KR20230140517A (en) | Predicting and controlling object crossings on vehicle routes | |
| KR20230034819A (en) | Location based parameters for an image sensor | |
| CN207133979U (en) | A kind of traffic events based on car networking intelligently put to the proof device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | FEPP | Fee payment procedure | ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| | STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | PATENTED CASE |
| | FEPP | Fee payment procedure | MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | LAPS | Lapse for failure to pay maintenance fees | PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | STCH | Information on status: patent discontinuation | PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| 2025-08-17 | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20250817 |