
WO2023049461A1 - System and method for traffic near miss/collision detection - Google Patents

System and method for traffic near miss/collision detection

Info

Publication number
WO2023049461A1
Authority
WO
WIPO (PCT)
Prior art keywords
objects
group
collision
near miss
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2022/044748
Other languages
English (en)
Inventor
Nicholas D'ANDRE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gridmatrix Inc
Original Assignee
Gridmatrix Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/861,411 external-priority patent/US11955001B2/en
Application filed by Gridmatrix Inc filed Critical Gridmatrix Inc
Publication of WO2023049461A1 publication Critical patent/WO2023049461A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems
    • G08G 1/164: Centralised systems, e.g. external to vehicles
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/54: Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/0104: Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0108: Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G 1/0116: Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/0104: Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0125: Traffic data processing
    • G08G 1/0133: Traffic data processing for classifying traffic situation
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/0104: Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0137: Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G08G 1/0145: Measuring and analyzing of parameters relative to traffic conditions for specific applications for active traffic flow control
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/015: Detecting movement of traffic to be counted or controlled with provision for distinguishing between two or more types of vehicles, e.g. between motor-cars and cycles
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/04: Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/052: Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/07: Controlling traffic signals
    • G08G 1/087: Override of traffic control, e.g. by signal transmitted by an emergency vehicle
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems
    • G08G 1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Definitions

  • This application claims priority to U.S. Provisional Patent Application No. 63/318,442, filed March 10, 2022, titled “Traffic Near Miss Detection;” and to U.S. Provisional Patent Application No. 63/320,010, filed March 15, 2022, titled “Traffic Near Miss/Collision Detection;” the contents of which are incorporated herein by reference as if fully disclosed herein.
  • the described embodiments relate generally to traffic analysis. More particularly, the present embodiments relate to traffic near miss/collision detection.
  • Traffic may be motorized, non-motorized, and so on. Traffic may include cars, trucks, pedestrians, scooters, bicycles, and so on. Traffic appears only to increase as the population of the world continues to grow.
  • Some population areas, such as cities, use cameras and/or other traffic monitoring devices to capture data about traffic. This data may be used to evaluate congestion, traffic signal configurations, road layout, and so on.
  • Traffic data may be obtained, such as video from intersection cameras, point cloud data from Light Detection and Ranging (or “LiDAR”) sensors, and so on.
  • Object detection and classification may be performed using the data, and structured data may be determined and/or output using the detected and classified objects.
  • structured data may be obtained that has been generated by performing such object detection and classification on traffic data (such as point cloud data from LiDAR sensors).
  • Metrics may be calculated using the structured data. For each frame of traffic data, the metrics may be analyzed to detect whether a near miss/collision occurs between each object in the frame (such as motorized or non-motorized vehicles, pedestrians, and so on) and each of the other objects in the frame.
  • these metrics may be analyzed to evaluate whether or not a group of conditions are met. If the group of conditions are met, a near miss/collision may be detected. This may be recorded in the metrics for the objects involved. In some implementations, one or more indicators may be added to the traffic data and/or to one or more visualizations generated using the metrics, the traffic data, the structured data, and so on.
  • a system includes a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate a near miss/collision detection service.
  • the near miss/collision detection service detects objects in a frame, analyzes pairs of the objects, determines that a group of conditions are met, and determines that a near miss/collision has occurred.
  • the group of conditions includes that a distance between an object pair of the pairs of the objects is less than a distance threshold.
  • the distance threshold is approximately 7 feet. By way of illustration, approximately 7 feet may be within 6-8 feet.
  • the group of conditions includes that a speed of an object of an object pair of the pairs of the objects is greater than a speed threshold.
  • the speed threshold is approximately zero miles per hour. By way of illustration, approximately zero may be within 1 mile per hour of zero.
  • the group of conditions includes that an angle between an object pair of the pairs of the objects is higher than an angle threshold.
  • the angle threshold is approximately 12 degrees. By way of illustration, approximately 12 degrees may be within 11-13 degrees.
  • the group of conditions includes that an object pair of the pairs of the objects are not both coming from a same approach. In some implementations, the group of conditions includes that one of an object pair of the pairs of the objects was not previously determined to be involved in another near miss/collision. In a number of implementations, the group of conditions includes that a sum of previous speeds is higher than zero for both of an object pair of the pairs of the objects.
  • the group of conditions includes a first group of conditions for a first object pair of the pairs of the objects that are both vehicles and a second group of conditions for a second object pair of the pairs of the objects that include a vehicle and a pedestrian.
  • the second group of conditions includes a lower distance threshold than the first group of conditions.
  • the second group of conditions includes no condition related to an angle between the vehicle and the pedestrian.
  • the second group of conditions includes a higher speed threshold than the first group of conditions and the second group of conditions evaluates the vehicle according to the higher speed threshold.
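  • By way of illustration, the two condition groups described above might be parameterized as in the following minimal sketch. It uses the example thresholds given in this disclosure (7 feet, 0 miles per hour, and 12 degrees for vehicle pairs; a 5 foot distance and a 0.3 mile per hour vehicle speed, with no angle condition, for vehicle/pedestrian pairs, per the detailed description below); all type and field names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ConditionGroup:
    """One group of near miss/collision conditions (hypothetical layout)."""
    distance_threshold_ft: float          # pair must be closer than this
    speed_threshold_mph: float            # object(s) must be faster than this
    angle_threshold_deg: Optional[float]  # None disables the angle condition

# Illustrative thresholds from this disclosure: vehicle/vehicle pairs use a
# 7 ft distance, 0 mph speed, and 12 degree angle threshold; vehicle/
# pedestrian pairs use a lower distance threshold (5 ft), a higher speed
# threshold applied to the vehicle (0.3 mph), and no angle condition.
VEHICLE_VEHICLE = ConditionGroup(7.0, 0.0, 12.0)
VEHICLE_PEDESTRIAN = ConditionGroup(5.0, 0.3, None)
```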
  • the near miss/collision detection service determines a conversion factor between pixels and a speed measurement.
  • the speed measurement may be in miles per hour.
  • the system further includes adding a near miss/collision indicator to the frame.
  • a method of near miss/collision detection includes obtaining traffic data, analyzing the traffic data, determining that a near miss/collision occurred, and responding to the near miss/collision.
  • responding to the near miss/collision includes determining that the near miss/collision is a collision. In a number of implementations of such examples, the method further includes transmitting an alert regarding the collision.
  • FIG. 1 depicts an example system for traffic near miss/collision detection.
  • FIG. 2 depicts a flow chart illustrating a first example method for traffic near miss/collision detection. This method may be performed by the system of FIG. 1.
  • FIG. 3 depicts a flow chart illustrating a second example method for traffic near miss/collision detection. This method may be performed by the system of FIG. 1.
  • FIG. 4 depicts a flow chart illustrating a third example method for traffic near miss/collision detection. This method may be performed by the system of FIG. 1.
  • FIG. 5A depicts a first frame of traffic data video.
  • FIG. 5B depicts a second frame of traffic data video.
  • FIG. 5C depicts a third frame of traffic data video.
  • FIG. 5D depicts a fourth frame of traffic data video.
  • FIG. 5E depicts a fifth frame of traffic data video.
  • a near miss may be when two objects (such as motorized or non-motorized vehicles, pedestrians, and so on) in traffic almost collide.
  • a collision may be when the two objects actually collide.
  • Near misses/collisions may signal traffic problems that may need to be addressed. Further, near misses may be more challenging to track than actual collisions, as near misses may not be reported to insurance providers, law enforcement, and/or other entities. Data regarding detected near misses/collisions may be useful to visualize various metrics about the traffic, enable adaptive traffic signal control, predict traffic congestion and/or accidents, and aggregate data from multiple population areas for various uses, such as in the auto insurance industry, rideshare industry, logistics industry, autonomous vehicle original equipment manufacturer industry, and so on.
  • the system may be able to perform near miss/collision detection and/or various other actions based thereon that the system would not previously have been able to perform absent the technology disclosed herein. This may enable the system to operate more efficiently while consuming fewer hardware and/or software resources as more resource consuming techniques could be omitted. Further, a variety of components may be omitted while still enabling traffic near miss detection, reducing unnecessary hardware and/or software components, and providing greater system flexibility.
  • FIG. 1 depicts an example system 100 for traffic near miss/collision detection.
  • the system 100 may include one or more analysis devices 101 (which may be implemented using one or more cloud computing arrangements) and/or one or more traffic monitoring devices 102 (such as one or more intersection and/or other still image and/or video cameras, LiDAR sensors, loops, radar, weather data, Internet of Things sensors, fleet vehicles, traffic controllers and/or other city and/or other population area supplied data devices, navigation app data, connected vehicles, and so on).
  • the analysis device 101 and/or one or more other devices may obtain traffic data (such as video from intersection cameras, point cloud data from Light Detection and Ranging (or “LiDAR”) sensors, and so on) from the traffic monitoring device.
  • the analysis device 101 and/or one or more other devices may perform object detection and classification using the data and may determine an output structured data using the detected and classified objects.
  • the analysis device 101 and/or one or more other devices may obtain structured data that has been generated by performing such object detection and classification on traffic data (such as point cloud data from LiDAR sensors).
  • the analysis device 101 and/or one or more other devices may calculate one or more metrics using the structured data.
  • the analysis device 101 and/or one or more other devices may analyze the metrics to detect whether a near miss/collision occurs between each object in the frame (such as motorized or non-motorized vehicles, pedestrians, and so on) and each of the other objects in the frame.
  • the analysis device 101 and/or one or more other devices may analyze the metrics to evaluate whether or not a group of conditions are met. If the group of conditions are met, the analysis device 101 and/or one or more other devices may determine that a near miss/collision is detected.
  • the analysis device 101 and/or one or more other devices may record the near miss/collision and/or other data based thereon in the metrics for the objects involved.
  • analysis device 101 and/or one or more other devices may add one or more indicators to the traffic data and/or to one or more visualizations generated using the metrics, the traffic data, the structured data, and so on.
  • the group of conditions may include a variety of factors, such as the distance between the objects, the direction of movement of the objects, the current speed of the objects, and so on.
  • a near miss/collision may be detected between a pair of objects when the distance between them is below a distance threshold, the speed of at least one of the objects is higher than a speed threshold, the angle between the vehicles is higher than an angle threshold, the objects are not coming from the same approach (intersection entrance), at least one of the vehicles was not detected in a near miss already in the same traffic data, and the sum of previous speeds of each of the objects is higher than zero (avoiding parked and/or otherwise stationary objects).
  • Near misses/collisions may be calculated in a metrics module and analyzed for every frame.
  • a near miss/collision detection process may include detecting objects in a given frame, making a list of all objects in that frame, analyzing all possible pairs of objects in that frame and, if the group of conditions above are met, determining that a near miss/collision is detected. After a near miss/collision is detected, both objects may be marked as having been involved in a near miss/collision.
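  • A sketch of this per-frame procedure follows. It assumes hypothetical object records (dicts with "id" and "in_near_miss" fields) and a `conditions_met` callable implementing a group of conditions such as those described above; the disclosure does not prescribe this layout:

```python
from itertools import combinations

def detect_near_misses(frame_objects, conditions_met):
    """Analyze every possible pair of objects detected in a frame and mark
    both members of a pair when the group of conditions is met."""
    near_misses = []
    for a, b in combinations(frame_objects, 2):
        # One of the described conditions: at least one object of the pair
        # must not already have been involved in a near miss/collision.
        if a["in_near_miss"] and b["in_near_miss"]:
            continue
        if conditions_met(a, b):
            a["in_near_miss"] = True
            b["in_near_miss"] = True
            near_misses.append((a["id"], b["id"]))
    return near_misses
```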
  • near miss/collision detection analysis may be restricted to objects within intersections of opposing directions, objects involved in left turns, and so on.
  • Accuracy of near miss/collision detection may depend on the traffic monitoring device 102, the angle to the intersection, and so on.
  • Performance of near miss/collision detection may be evaluated using a near miss/collision data set (traffic data with a near miss/collision) and a number of different performance measurements. These performance measurements may include recall (the number of near misses/collisions detected over the total number of near misses/collisions), precision (the number of near misses/collisions over the total number of possible pairs of objects in the data set), and so on.
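  • Using the definitions given here (note that the stated precision measurement differs from the conventional true-positive ratio), these measurements might be computed as follows; the names are illustrative:

```python
def recall(detected_near_misses: int, total_near_misses: int) -> float:
    """Near misses/collisions detected over the total number of near
    misses/collisions in the data set."""
    return detected_near_misses / total_near_misses

def precision(near_misses: int, total_object_pairs: int) -> float:
    """The number of near misses/collisions over the total number of
    possible pairs of objects in the data set, as phrased above."""
    return near_misses / total_object_pairs
```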
  • Near misses/collisions may be detected and data stored for such at the frame level.
  • near misses/collisions may be detected and data stored for such at a second level above the frame level, such as a group of frames. This may reduce the amount of data that may be stored.
  • Latency of near miss/collision detection may be approximately equal to the latency of the system 100. This may relate to the time that passes between ingestion of traffic data and processing of a frame.
  • a near miss/collision detection procedure may be run every frame, along with other relevant metrics.
  • Such a near miss/collision detection procedure may involve the following steps.
  • the near miss/collision detection procedure may start with object detection and tracking in the frame under analysis.
  • the output of this process may be a list of record keys identifying each object, bounding boxes pointing to the object’s position, class, the total number of frames that the object has appeared in so far (age), the number of frames that have passed since the object has been detected for the last time (time_since_update), and so on.
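  • One possible shape for such a record, mirroring the fields listed above (the field names are hypothetical; the disclosure does not prescribe a layout):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TrackedObject:
    """Hypothetical record for one tracked object output by detection
    and tracking."""
    record_key: str                   # identifies the object
    bbox: Tuple[int, int, int, int]   # x min, y min, x max, y max (pixels)
    object_class: str                 # e.g. "car", "truck", "person"
    age: int                          # frames the object has appeared in so far
    time_since_update: int            # frames since the object was last detected
```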
  • the second step may involve going through this list of objects and processing a group of metrics calculations that are involved in near miss/collision detection. These metrics may be calculated on a per-object basis.
  • direction angle: the direction of the object
  • conversion factor: between pixels and miles
  • speed: in miles per hour or kilometers per hour and/or another measure
  • approach ID: an identifier of the intersection entrance or where the object is coming from.
  • direction angle: changes in position and the previous value of the direction angle may be used to calculate the direction of the object and to exclude outlier values (such as erroneous values).
  • conversion factor: the direction angle of the object within its bounding box may be used to obtain the longitudinal size of the object in pixels. The average size of the object class (such as vehicle class) may then be used to get a conversion factor between pixels and miles (or other measurement) for that object in the frame. By way of illustration, cars in the United States measure on average 14.7 feet long.
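  • A minimal sketch of this calculation, assuming an axis-aligned bounding box and using a simple projection onto the direction of travel to approximate the longitudinal pixel size (the disclosure does not specify the exact geometry):

```python
import math

AVG_CAR_LENGTH_FT = 14.7  # average U.S. car length cited above
FEET_PER_MILE = 5280.0

def miles_per_pixel(bbox_w_px: float, bbox_h_px: float, direction_deg: float) -> float:
    """Estimate the object's longitudinal size in pixels from its bounding
    box and direction angle, then derive a miles-per-pixel conversion
    factor from the average size of the object class."""
    theta = math.radians(direction_deg)
    longitudinal_px = abs(bbox_w_px * math.cos(theta)) + abs(bbox_h_px * math.sin(theta))
    return (AVG_CAR_LENGTH_FT / FEET_PER_MILE) / longitudinal_px
```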
  • speed: the distance in pixels travelled between the current and last frame may be measured and multiplied by the conversion factor to get the distance in miles. The distance may then be divided by the time between frames to get the speed in miles per hour (or another measure, such as kilometers per hour).
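  • A sketch of this speed calculation, assuming object center points in pixel coordinates and a known time between frames (both hypothetical inputs):

```python
import math

def speed_mph(prev_center_px, curr_center_px, miles_per_px: float, frame_dt_s: float) -> float:
    """Pixel distance travelled between the last and current frame,
    converted to miles via the conversion factor and divided by the
    time between frames."""
    dx = curr_center_px[0] - prev_center_px[0]
    dy = curr_center_px[1] - prev_center_px[1]
    distance_miles = math.hypot(dx, dy) * miles_per_px
    return distance_miles / (frame_dt_s / 3600.0)
```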
  • approach ID: when an object appears in the traffic data, its distance to all approach center points may be calculated and the closest approach may be attributed to the object.
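  • A minimal sketch of this assignment, assuming the approach center points are known in pixel coordinates (the mapping name and structure are hypothetical):

```python
import math

def assign_approach(object_center_px, approach_centers_px):
    """Attribute a newly appeared object to the closest approach.
    `approach_centers_px` maps an approach ID to its center point
    in pixel coordinates."""
    return min(
        approach_centers_px,
        key=lambda aid: math.dist(object_center_px, approach_centers_px[aid]),
    )
```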
  • a near miss detection determination may be applied as follows. For each object, the distance in pixels to all other objects may be measured and transformed into miles (and/or other measurement) using the conversion factor calculated previously. The conversion factor may only allow calculation of distance in miles correctly for objects that are close to each other, as near miss analysis targets are. For each object, the angle with all other objects may be calculated using the direction angle. Then, a group of conditions may be evaluated for every possible pair with the object under analysis. If the group of conditions are all met, the case may be determined to be a near miss. The groups of conditions may be different for different objects, such as between two vehicles, between a vehicle and a pedestrian, and so on.
  • the group of conditions for detecting a near miss/collision between two vehicles may be as follows.
  • the second vehicle may be required to be at a distance lower than a threshold distance, such as 7 feet.
  • Both vehicles may be required to have speed values higher than a speed threshold, such as 0 miles per hour.
  • the angle between the vehicles may be required to be higher than an angle threshold, such as 12 degrees.
  • the second vehicle may be required to be coming from a different approach than the vehicle under analysis.
  • the group of conditions to detect a near miss/collision between a vehicle and a pedestrian may be the same as the above with the following exceptions.
  • the person may need to be at a distance lower than a pedestrian distance threshold, such as 5 feet.
  • the vehicle may be required to have a speed higher than a vehicle speed threshold, such as 0.3 miles per hour.
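  • Putting the two condition groups together, a near miss/collision evaluation for a single pair might look like the following sketch. The object fields, and the assumption that the pair's distance and angle are precomputed as described above, are hypothetical:

```python
def is_near_miss(obj_a, obj_b, distance_ft: float, angle_deg: float) -> bool:
    """Evaluate the vehicle/vehicle and vehicle/pedestrian condition groups
    described above for one pair of objects (dicts with "class",
    "speed_mph", "approach", "prev_speed_sum", and "in_near_miss" fields)."""
    if "person" in (obj_a["class"], obj_b["class"]):
        # Vehicle/pedestrian: lower distance threshold, higher speed
        # threshold applied to the vehicle, and no angle condition.
        vehicle = obj_b if obj_a["class"] == "person" else obj_a
        if distance_ft >= 5.0 or vehicle["speed_mph"] <= 0.3:
            return False
    else:
        # Vehicle/vehicle: 7 ft distance, 0 mph speed, 12 degree angle.
        if distance_ft >= 7.0 or angle_deg <= 12.0:
            return False
        if obj_a["speed_mph"] <= 0.0 or obj_b["speed_mph"] <= 0.0:
            return False
    if obj_a["approach"] == obj_b["approach"]:
        return False  # both coming from the same approach (intersection entrance)
    if obj_a["in_near_miss"] and obj_b["in_near_miss"]:
        return False  # at least one must not already be in a near miss
    # Exclude parked and/or otherwise stationary objects.
    return obj_a["prev_speed_sum"] > 0 and obj_b["prev_speed_sum"] > 0
```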
  • the analysis device 101 and/or one or more other devices may perform object detection and classification using the data. For example, objects may be detected and classified as cars, trucks, buses, pedestrians, light vehicles, heavy vehicles, non-motor vehicles, and so on. Objects may be assigned individual identifiers, identifiers by type, and so on.
  • the analysis device 101 and/or one or more other devices may determine and/or output structured data using the detected and classified objects.
  • the analysis device 101 and/or one or more other devices may calculate one or more metrics using the structured data.
  • the metrics may involve vehicle volume, vehicle volume by vehicle type, average speed, movement status, distance travelled, queue length, pedestrian volume, non-motor volume, light status on arrival, arrival phase, route through intersection, light times, near misses, longitude, latitude, city, state, country, and/or any other metrics that may be calculated using the structured data.
  • the analysis device 101 may be any kind of electronic device. Examples of such devices include, but are not limited to, one or more desktop computing devices, laptop computing devices, server computing devices, mobile computing devices, tablet computing devices, set top boxes, digital video recorders, televisions, displays, wearable devices, smart phones, digital media players, and so on.
  • the analysis device 101 may include one or more processors 103 and/or other processing units and/or controllers, one or more non-transitory storage media 104 (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on), one or more communication units 105, one or more input and/or output devices 106 (such as one or more displays, speakers, keyboards, mice, track pads, touch pads, touch screens, sensors, printers, and so on), and/or other components.
  • the processor 103 may execute instructions stored in the non-transitory storage medium to perform various functions.
  • the analysis device 101 may involve one or more memory allocations configured to store at least one executable asset and one or more processor allocations configured to access the one or more memory allocations and execute the at least one executable asset to instantiate one or more processes and/or services, such as one or more near miss and/or collision detection and/or response services, and so on.
  • the traffic monitoring device 102 may be any kind of electronic device.
  • Various configurations are possible and contemplated without departing from the scope of the present disclosure.
  • As used herein, “computing resource” refers to any physical and/or virtual electronic device or machine component, or set or group of interconnected and/or communicably coupled physical and/or virtual electronic devices or machine components, suitable to execute or cause to be executed one or more arithmetic or logical operations on digital data.
  • Example computing resources contemplated herein include, but are not limited to: single or multi-core processors; single or multi-thread processors; purpose-configured coprocessors (e.g., graphics processing units, motion processing units, sensor processing units, and the like); volatile or non-volatile memory; application-specific integrated circuits; field-programmable gate arrays; input/output devices and systems and components thereof (e.g., keyboards, mice, trackpads, generic human interface devices, video cameras, microphones, speakers, and the like); networking appliances and systems and components thereof (e.g., routers, switches, firewalls, packet shapers, content filters, network interface controllers or cards, access points, modems, and the like); embedded devices and systems and components thereof (e.g., system(s)-on-chip, Internet-of-Things devices, and the like); industrial control or automation devices and systems and components thereof (e.g., programmable logic controllers, programmable relays, supervisory control and data acquisition controllers, discrete
  • Example information can include, but may not be limited to: personal identification information (e.g., names, social security numbers, telephone numbers, email addresses, physical addresses, driver’s license information, passport numbers, and so on); identity documents (e.g., driver’s licenses, passports, government identification cards or credentials, and so on); protected health information (e.g., medical records, dental records, and so on); financial, banking, credit, or debt information; third-party service account information (e.g., usernames, passwords, social media handles, and so on); encrypted or unencrypted files; database files; network connection logs; shell history; filesystem files; libraries, frameworks, and binaries; registry entries; settings files; executing processes; hardware vendors, versions, and/or information associated with the compromised computing resource; installed applications or services; password hashes; idle time, uptime, and/or last login time; document files; product renderings; presentation files; image files; customer information; configuration files; passwords; and so on.
  • a person of skill in the art may appreciate that the various functions and operations of a system such as described herein can be implemented in a number of suitable ways, developed leveraging any number of suitable libraries, frameworks, first or third-party APIs, local or remote databases (whether relational, NoSQL, or other architectures, or a combination thereof), programming languages, software design techniques (e.g., procedural, asynchronous, event-driven, and so on or any combination thereof), and so on.
  • the various functions described herein can be implemented in the same manner (as one example, leveraging a common language and/or design), or in different ways.
  • functions of a system described herein are implemented as discrete microservices, which may be containerized or executed/instantiated leveraging a discrete virtual machine, that are only responsive to authenticated API requests from other microservices of the same system.
  • each microservice may be configured to provide data output and receive data input across an encrypted data channel.
  • each microservice may be configured to store its own data in a dedicated encrypted database; in others, microservices can store encrypted data in a common database; whether such data is stored in tables shared by multiple microservices or whether microservices may leverage independent and separate tables/schemas can vary from embodiment to embodiment.
  • As used herein, “processor” refers to any software- and/or hardware-implemented data processing device or circuit physically and/or structurally configured to instantiate one or more classes or objects that are purpose-configured to perform specific transformations of data including operations represented as code and/or instructions included in a program that can be stored within, and accessed from, a memory.
  • This term is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, analog or digital circuits, or other suitably configured computing element or combination of elements.
  • Although the system 100 is illustrated and described as including particular components arranged in a particular configuration, it is understood that this is an example. In a number of implementations, various configurations of various components may be used without departing from the scope of the present disclosure.
  • the system 100 is illustrated and described as including the traffic monitoring device 102. However, it is understood that this is an example. In various implementations, the traffic monitoring device 102 may not be part of the system 100. The system 100 may instead communicate with the traffic monitoring device 102. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
  • data that has already been detected and classified may be obtained.
  • Various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
  • FIG. 2 depicts a flow chart illustrating a first example method 200 for traffic near miss/collision detection. This method 200 may be performed by the system 100 of FIG. 1.
  • an electronic device (such as the analysis device 101 of FIG. 1) may obtain traffic data, such as video from one or more intersection cameras.
  • the electronic device may analyze the traffic data.
  • the electronic device may determine whether or not a near miss/collision occurred. If not, the flow may proceed to operation 240 and end. Otherwise, at operation 250, the electronic device may respond to the detected near miss/collision.
  • This may include recording data regarding the near miss/collision, marking the traffic data with one or more near miss/collision indicators, transmitting one or more notifications (such as to one or more cities and/or other municipalities or authorities, emergency responders, and so on for the purpose of summoning emergency services, tracking near misses/collisions, and so on), and so on.
  • responding to the detected near miss/collision may include one or more automatic and/or other alerts.
  • the electronic device may determine within a confidence level, threshold, or similar mechanism that a detected near miss/collision is a collision.
  • the electronic device may automatically and/or otherwise send an alert, such as to a 911 operator and/or other emergency and/or other vehicle dispatcher, emergency and/or other vehicle, vehicle controller, vehicle navigation device, and so on via one or more mechanisms such as cellular and/or other communication network.
  • the collision detection may be external to the vehicle dispatched to render aid and/or perform other actions related to the collision.
  • this may add functions to vehicle collision detection systems, add redundancy to vehicle collision detection systems, and so on.
  • the electronic device may also utilize traffic data and/or control other devices, such as to determine the fastest and/or most efficient route to the collision, control traffic signals to prioritize traffic to the collision (such as creating an empty corridor to the collision), and so on.
  • the electronic device may record the near miss with other traffic data and/or otherwise analyze such as part of analyzing traffic data.
  • this example method 200 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or by one or more computing devices, such as the analysis device 101 of FIG. 1.
  • Although the example method 200 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
  • the method 200 is illustrated and described as including the operation 240. However, it is understood that this is an example. In some implementations, operation 240 may be omitted and the electronic device may instead return to operation 220 and continue analyzing the traffic data. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
  • Although the above is described in the context of intersection cameras, it is understood that this is an example. In various implementations, other data sources beyond data extracted from intersection video feeds may be used. These may include weather data, Internet of Things sensors, LiDAR sensors, fleet vehicles, city-supplied data (e.g., traffic controllers), navigation app data, connected vehicle data, and so on.
  • Although the above illustrates and describes the performance of functions like detection and classification, determination of structured data, and so on, it is understood that this is an example. In various implementations, one or more such functions may be omitted without departing from the scope of the present disclosure.
  • FIG. 3 depicts a flow chart illustrating a second example method 300 for traffic near miss/collision detection. This method 300 may be performed by the system 100 of FIG. 1.
  • an electronic device may detect all objects in a frame, such as in frames of a video from one or more intersection cameras.
  • the electronic device may analyze all pairs of objects.
  • the electronic device may determine whether or not a group of conditions are met (such as one or more of the groups of conditions discussed with respect to FIG. 1 above). If not, the flow may proceed to operation 340 and end. Otherwise, at operation 350, the electronic device may determine that a near miss has occurred.
  • this example method 300 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or by one or more computing devices, such as the analysis device 101 of FIG. 1.
  • Although the example method 300 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
  • the method 300 is illustrated and described as determining that a near miss/collision has occurred. However, it is understood that this is an example.
  • the electronic device may perform one or more actions in response to determining that a near miss/collision has occurred. This may include recording data regarding the near miss/collision, marking the traffic data with one or more near miss/collision indicators, and so on. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
  • FIG. 4 depicts a flow chart illustrating a third example method 400 for traffic near miss/collision detection. This method 400 may be performed by the system 100 of FIG. 1.
  • an electronic device may analyze traffic video.
  • the electronic device may determine that a near miss/collision has occurred.
  • the electronic device may add a near miss/collision indicator to the traffic video.
  • this example method 400 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or by one or more computing devices, such as the analysis device 101 of FIG. 1.
  • Although the example method 400 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
  • the method 400 is illustrated and described as analyzing traffic video. However, it is understood that this is an example. In some implementations, other traffic data, such as point cloud data from one or more LIDAR sensors, may instead be analyzed. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
  • FIG. 5A depicts a first frame 500A of traffic data video.
  • the objects 551A-551D, 551F in the first frame 500A each include an indicator 552A-552D, 552F that indicates that each of the objects 551A-551D, 551F has not been involved in a near miss/collision.
  • the indicators 552A-552D, 552F may be in dashed lines to show that each of the objects 551A-551D, 551F has not been involved in a near miss/collision. However, it is understood that this is an example.
  • FIG. 5B depicts a second frame 500B of traffic data video.
  • the second frame 500B may be a subsequent frame to the first frame 500A of FIG. 5A.
  • the objects 551A-551F in the second frame 500B each include an indicator 552A-552F that indicates that each of the objects 551A-551F has not been involved in a near miss/collision.
  • FIG. 5C depicts a third frame 500C of traffic data video.
  • the third frame 500C may be a subsequent frame to the second frame 500B of FIG. 5B.
  • the objects 551A-551D, 551F-551I in the third frame 500C each include an indicator 552A-552D, 552F-552I that indicates that each of the objects 551A-551D, 551F-551I has not been involved in a near miss/collision.
  • two of the objects 551G, 551H in the intersection are about to have a near miss.
  • FIG. 5D depicts a fourth frame 500D of traffic data video.
  • the fourth frame 500D may be a subsequent frame to the third frame 500C of FIG. 5C.
  • the two objects 551G, 551H in the intersection have proceeded and are involved in a near miss.
  • the indicators 552G, 552H for those two objects 551G, 551H that previously indicated that the two objects 551G, 551H had not been involved in a near miss have been modified to indicate that the two objects 551G, 551H have been involved in a near miss.
  • However, it is understood that this is an example.
  • In other implementations, the indicators 552G, 552H for those two objects 551G, 551H that previously indicated that the two objects 551G, 551H had not been involved in a near miss may instead be removed and other indicators 552G, 552H indicating that the two objects 551G, 551H have been involved in a near miss may be added.
  • Various configurations are possible and contemplated without departing from the scope of the present disclosure.
  • the indicators 552G, 552H may be in solid lines to show that the objects 551G, 551H have been involved in a near miss. However, it is understood that this is an example. In other examples, the indicators 552G, 552H may be red to show that the objects 551G, 551H have been involved in a near miss/collision. In still other examples, other indicators 552G, 552H may be used without departing from the scope of the present disclosure.
  • FIG. 5E depicts a fifth frame 500E of traffic data video.
  • the fifth frame 500E may be a subsequent frame to the fourth frame 500D of FIG. 5D.
  • the indicators 552G, 552H for the two objects 551G, 551H that were involved in the near miss in FIG. 5D still indicate that the two objects 551G, 551H were involved in a near miss.
  • Although FIGs. 5A-5E are illustrated and discussed above with respect to a near miss, it is understood that this is an example.
  • the present disclosure may detect and respond to collisions between objects 551A-551D, 551F-551I instead of and/or in addition to detecting and responding to near misses. Collisions may be detected and responded to similarly to how near misses are detected and responded to in the present disclosure. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
  • Although FIGs. 5A-5E are illustrated and discussed above with respect to particular traffic data, objects, frames, and indicators, it is understood that these are examples. In various implementations, other traffic data, objects, frames, indicators, and so on may be used without departing from the scope of the present disclosure.
  • frames of a raw, real-time video feed from an intersection camera and/or other traffic data may be obtained (though it is understood that this is an example and that in other examples other data, such as point cloud LiDAR data, may be obtained and used).
  • Detection and classification may be performed on each frame to identify and classify the objects in the frame. Structured data may then be determined for the objects detected.
  • a frame number may be determined for a frame
  • an intersection identifier may be determined for a frame
  • a unique tracker identifier may be assigned to each object detected
  • the class of the object may be determined (such as person, car, truck, bus, motorbike, bicycle, and so on)
  • coordinates of the object detected in the frame may be determined (which may be determined with reference to known coordinates of the intersection and/or the intersection camera, such as camera longitude, latitude, city, state, country, and so on) (such as the minimum and maximum x positions of the object, the minimum and maximum y positions of the object, and so on), and the like.
  • a bounding box may be calculated for the object based on one or more x and/or y positions for the object.
  • one or more geometric centers of the object’s bounding box may be calculated for the object in the x and/or y coordinate (such as an x min, a y min, and so on).
  • an intersection approach that the object is currently on may be calculated, such as based on a position of the object and a position of the center of the intersection.
  • other structured data may be determined from the frames.
  • one or more time stamps associated with frames may be determined and/or associated with other structured data, such as to determine a time at which an object was at a determined x and/or y position.
  • a light phase for the frame may be determined (such as whether a traffic light in the frame is green, red, and so on), though this may instead be determined by means other than image analysis (such as time-stamped traffic light data that may be correlated to a frame time stamp). This may be used to determine the traffic light phase when an object arrived at the intersection, such as by correlating a traffic light phase determined for a frame along with a determination that an object arrived at the intersection in the frame.
  • data for an approach and/or intersection associated with a frame may be determined (such as based on a uniform resource locator of the video feed and/or any other intersection camera identifier associated with the frame, an approach identifier associated with the frame, an intersection identifier associated with the frame, and so on).
  • the structured data determined for an object in a frame may be used with the structured data determined for the object in other frames to calculate various metrics. For example, the difference between one or more x and/or y positions for the object (such as the difference and/or distance between x or y midpoints of the object’s bounding box) in different frames (such as in a current and a previous frame) may be calculated. Such difference in position between frames, along with times respectively associated with the frames (such as from one or more time stamps) may be used to calculate one or more metrics associated with the speed of the object (such as an average speed of the object during the video feed (such as in miles per hour and/or other units), cumulative speed, and so on).
  • Such difference in position between frames may also be used to calculate various metrics about the travel of the object (such as the direction of travel between frames, how the object left an intersection, whether or not the object made a right on red, and so on).
  • structured data from multiple frames may be used to determine a status of the object (such as an approach associated with the object, how an object moved through an intersection, an approach an object used to enter an intersection, the approach an object used to exit an intersection, and so on), a time or number of frames since the object was last detected (and/or since first detected and so on), whether or not the object is moving, and so on.
  • Structured data and/or metrics for individual detected objects and/or other data may be used together to calculate various metrics, such as metrics associated with approaches.
  • structured data and/or metrics for individual detected objects associated with an approach identifier may be aggregated and analyzed to determine one or more approach volumes (such as a number of vehicles (cars, motorbikes, trucks, buses, and so on) in a particular approach, a number of light vehicles (such as cars, motorbikes, and so on) in a particular approach, a number of heavy vehicles (such as trucks, buses, and so on) in a particular approach, a number of cars in a particular approach, a number of trucks in a particular approach, a number of buses in a particular approach, a number of pedestrians in a particular approach, a number of non-motor vehicles in a particular approach, a number of bicycles in a particular approach, and so on), an average queue length (such as in feet and/or another unit of measurement) of a particular approach, and so on.
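  • A sketch of this aggregation, assuming per-object records with hypothetical "approach" and "class" fields and the class groupings given above:

```python
from collections import Counter

LIGHT_VEHICLES = {"car", "motorbike"}
HEAVY_VEHICLES = {"truck", "bus"}

def approach_volumes(objects, approach_id):
    """Aggregate per-object structured data sharing an approach identifier
    into the approach volumes described above."""
    classes = Counter(o["class"] for o in objects if o["approach"] == approach_id)
    return {
        "vehicles": sum(classes[c] for c in LIGHT_VEHICLES | HEAVY_VEHICLES),
        "light_vehicles": sum(classes[c] for c in LIGHT_VEHICLES),
        "heavy_vehicles": sum(classes[c] for c in HEAVY_VEHICLES),
        "pedestrians": classes["person"],
        "bicycles": classes["bicycle"],
    }
```

  • Summing such per-approach volumes across all approaches of a frame yields the per-intersection volumes discussed below.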
  • light status in one or more frames may be tracked and/or correlated with other information to determine a light status, an effective green time (such as a length of time that objects are moving through a particular intersection), an effective red time (such as a length of time that objects are stopped at a particular intersection), a cycle time (such as a length of time that a light is green determined by comparing the light phase across multiple frames), a number of cars that arrived while a traffic light is green, a number of cars that arrived while a traffic light is red, a measure of individual phase progression performance derived from a percentage of vehicle volume arrivals on green, and so on.
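  • For example, the measure of individual phase progression performance might be derived as follows; a minimal sketch assuming the per-frame arrival counts are already tracked:

```python
def percent_arrivals_on_green(arrivals_on_green: int, arrivals_on_red: int) -> float:
    """Percentage of vehicle volume arrivals on green, one measure of
    individual phase progression performance as described above."""
    total = arrivals_on_green + arrivals_on_red
    return 100.0 * arrivals_on_green / total if total else 0.0
```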
  • a last stop time may be calculated based on a last time stamp that an object stopped at an approach.
  • a last start time may be calculated based on a last time stamp that an object moved into the intersection at a particular approach.
  • an approach identifier for a particular approach may be determined, coordinates for a camera associated with a particular intersection may be determined, a number of lanes associated with a particular approach may be determined, and so on.
  • Structured data and/or metrics for individual detected objects and/or other data may also be used together to calculate various metrics associated with intersections.
  • a vehicle volume for a particular intersection may be determined by summing objects (such as cars, motorbikes, trucks, buses, and so on) in all approaches of a frame associated with the intersection
  • a light vehicle volume for a particular intersection may be determined by summing objects (such as cars, motorbikes, and so on) in all approaches of a frame associated with the intersection
  • a heavy vehicle volume for a particular intersection may be determined by summing objects (such as trucks, buses, and so on) in all approaches of a frame associated with the intersection
  • a car volume for a particular intersection may be determined by summing cars in all approaches of a frame associated with an intersection
  • a truck volume for a particular intersection may be determined by summing trucks in all approaches of a frame associated with an intersection
  • a bus volume for a particular intersection may be determined by summing buses in all approaches of a frame associated with an intersection
  • a person volume for a particular intersection may be determined by summing people in all approaches of a frame associated with an intersection
  • Other information for an intersection may be determined using the video feed, frames, and/or other structured data and/or metrics. For example, an identifier for a camera associated with an intersection may be determined, identifiers for frames of one or more video feeds associated with the intersection may be determined, observation times associated with an intersection may be determined (such as a time stamp based on ingestion time when other metadata from a stream or other video feed is not available), a cumulative time (such as from the start of processing of the video feed) may be determined, and so on.
  • the above raw data, structured data, metrics, and so on may be used to detect one or more near misses/collisions. Detection of such near misses/collisions may be performed using one or more of the methods and/or procedures discussed above.
  • any structured data and/or metrics relating to one or more vehicles and/or other objects, approaches, intersections, and so on may be determined and calculated from the objects detected in one or more frames of one or more video feeds of one or more intersection cameras and/or other traffic data without departing from the scope of the present disclosure.
  • connected vehicle data may be obtained and used.
  • structured data and/or metrics may be determined and/or calculated using a combination of connected vehicle data and data from one or more video feeds from one or more intersection cameras and/or other traffic data.
  • a visualization dashboard may visualize connected vehicle data along with structured data and/or metrics determined and/or calculated from one or more video feeds from one or more intersection cameras and/or other traffic data.
  • A raw, real-time video feed from an intersection camera and/or other traffic data may be obtained.
  • Objects in frames of the video feed may be detected and classified. Positions of the objects at various times in the frames of the video feed may be determined, as well as information such as light statuses related to the objects. Differences between the objects in different frames may be used to determine behavior of the objects over time.
  • Such calculated object metrics may be stored, such as in one or more vehicle tables.
  • Such calculated object metrics for objects that are associated with a particular approach may be aggregated in order to determine various approach object volumes and/or other metrics related to the approach, which may then be stored, such as in one or more approach tables.
  • object metrics for objects that are associated with a particular intersection may be aggregated in order to determine various intersection object volumes and/or other metrics related to the intersection, which may then be stored, such as in one or more intersection tables.
  • structured data and/or metrics related to one or more vehicles and/or other objects, approaches, intersections, and so on discussed above may then be processed and/or otherwise prepared for visualization and/or one or more other purposes, such as near miss/collision detection.
  • structured data and/or metrics related to one or more vehicles and/or other objects may be stored in one or more vehicle tables
  • structured data and/or metrics related to one or more intersections may be stored in one or more intersection tables
  • structured data and/or metrics related to one or more approaches may be stored in one or more approach tables, and so on.
  • Such tables may then be used for visualization and/or one or more other purposes.
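A minimal sketch of the vehicle-to-approach-to-intersection rollup described above, using pandas purely for illustration; the table names, columns, and aggregation choices are assumptions rather than a required layout.

```python
import pandas as pd

# Hypothetical vehicle table: one row of calculated metrics per tracked object
vehicle_table = pd.DataFrame([
    {"intersection": "5th-and-Main", "approach": "NB", "object_id": 1, "speed_mph": 22.0},
    {"intersection": "5th-and-Main", "approach": "NB", "object_id": 2, "speed_mph": 31.5},
    {"intersection": "5th-and-Main", "approach": "EB", "object_id": 3, "speed_mph": 18.2},
])

# Approach table: aggregate object metrics per approach
approach_table = (
    vehicle_table.groupby(["intersection", "approach"])
    .agg(volume=("object_id", "nunique"), mean_speed_mph=("speed_mph", "mean"))
    .reset_index()
)

# Intersection table: aggregate once more across all approaches
intersection_table = (
    approach_table.groupby("intersection")
    .agg(volume=("volume", "sum"))
    .reset_index()
)

print(approach_table)
print(intersection_table)
```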
  • LiDAR sensors may be operable to determine data, such as ranges (variable distance), by targeting an object with elements, such as one or more lasers, and measuring the time for the reflected light to return to one or more receivers.
  • LiDAR sensors may generate point cloud data that may be used for the analysis discussed herein instead of frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
  • functions similar to those described above performed on frames of a raw, real-time video feed from an intersection camera and/or other traffic data may be performed on the LiDAR sensor data, such as detection and classification, determination of structured data, near miss/collision detection, and so on (see the clustering sketch below).
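One plausible analog of frame-based detection for point clouds is spatial clustering of LiDAR returns into candidate objects. The sketch below uses scikit-learn's DBSCAN; the clustering parameters and output fields are assumptions for illustration, not the disclosure's method.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_objects_in_point_cloud(points, eps_m=1.0, min_points=10):
    """Group LiDAR returns (an N x 3 array of x, y, z in meters) into
    candidate objects, producing centroids and extents that could feed
    the same structured-data pipeline used for video frames."""
    labels = DBSCAN(eps=eps_m, min_samples=min_points).fit_predict(points)
    objects = []
    for label in set(labels) - {-1}:  # label -1 marks noise returns
        cluster = points[labels == label]
        objects.append({
            "centroid": cluster.mean(axis=0),
            "extent": cluster.max(axis=0) - cluster.min(axis=0),
            "num_points": len(cluster),
        })
    return objects

# Example: two well-separated clusters of synthetic returns
pts = np.vstack([np.random.normal([0, 0, 1], 0.2, (50, 3)),
                 np.random.normal([10, 5, 1], 0.2, (60, 3))])
print(len(detect_objects_in_point_cloud(pts)))  # -> 2
```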
  • structured data generated from LiDAR cloud data that has already been detected and classified may be obtained and various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
  • LiDAR sensor data may have a number of advantages over frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
  • point cloud data from one or more LiDAR sensors may not have the same privacy issues as frames of a raw, real-time video feed from an intersection camera and/or other traffic data as facial and/or other similar images may not be captured.
  • LiDAR sensor data may not be dependent on lighting and thus may provide more reliable data over all times of day and night as compared to frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
  • LiDAR sensor data may provide data in three-dimensional space as opposed to the two-dimensional data from frames of a raw, real-time video feed from an intersection camera and/or other traffic data and thus may provide depth, which may not be provided via frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
  • a determination may be made of the size of an average vehicle in pixels. This may be used with the LiDAR sensor data to determine the pixels from the center of a vehicle represented in the LiDAR sensor data and then infer the speed of the vehicle. Compared to approaches using frames of a raw, real-time video feed from an intersection camera and/or other traffic data, an assumption may not have to be made about object speed. This may be more accurate, and it may also improve the processing speed of the computing devices processing the data: the functions otherwise performed on video frames to determine speed can be omitted, since that information may already be represented in the LiDAR sensor data. A sketch of such a LiDAR-based speed inference follows.
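Because LiDAR returns already carry real-world scale, speed can be inferred directly from centroid displacement between sweeps. A minimal sketch, assuming centroids in meters and a known sweep interval:

```python
import math

def lidar_speed_mph(centroid_prev, centroid_curr, dt_s):
    """Speed from the 3-D displacement of an object's centroid between
    two LiDAR sweeps taken dt_s seconds apart."""
    displacement_m = math.dist(centroid_prev, centroid_curr)
    return (displacement_m / dt_s) * 2.23694  # meters/second to mph

# Example: a centroid moving 1 m between sweeps 0.1 s apart (~22.4 mph)
print(lidar_speed_mph([0.0, 0.0, 0.5], [1.0, 0.0, 0.5], 0.1))
```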
  • a system may include a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate a near miss/collision detection service.
  • the near miss/collision detection service may detect objects in a frame, analyze pairs of the objects, determine that a group of conditions are met, and determine that a near miss/collision has occurred.
  • the group of conditions may include that a distance between an object pair of the pairs of the objects is less than a distance threshold.
  • the distance threshold may be approximately 7 feet. By way of illustration, approximately 7 feet may be within 6-8 feet.
  • the group of conditions may include that a speed of an object of an object pair of the pairs of the objects is greater than a speed threshold.
  • the speed threshold may be approximately zero miles per hour. By way of illustration, approximately zero may be within 1 mile per hour of zero.
  • the group of conditions may include that an angle between an object pair of the pairs of the objects is higher than an angle threshold.
  • the angle threshold may be approximately 12 degrees. By way of illustration, approximately 12 degrees may be within 11-13 degrees.
  • the group of conditions may include that the two objects of an object pair of the pairs of the objects are not both coming from the same approach. In some implementations, the group of conditions may include that one of an object pair of the pairs of the objects was not previously determined to be involved in another near miss/collision. In a number of implementations, the group of conditions may include that a sum of previous speeds is higher than zero for both objects of an object pair of the pairs of the objects.
  • the group of conditions may include a first group of conditions for a first object pair of the pairs of the objects that are both vehicles and a second group of conditions for a second object pair of the pairs of the objects that include a vehicle and a pedestrian.
  • the second group of conditions may include a lower distance threshold than the first group of conditions.
  • the second group of conditions may include no condition related to an angle between the vehicle and the pedestrian.
  • the second group of conditions may include a higher speed threshold than the first group of conditions, with the vehicle evaluated according to that higher speed threshold; a sketch combining both groups of conditions follows.
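The sketch below combines the conditions above into a single pair check. The roughly 7 foot, 0 mph, and 12 degree thresholds come from the text; the pedestrian-case values (5 feet, 5 mph), the field names, and the reading of the speed condition as "at least one object moving" are assumptions for illustration, not the disclosure's exact logic.

```python
import math

VEHICLE_VEHICLE = {"max_dist_ft": 7.0, "min_speed_mph": 0.0, "min_angle_deg": 12.0}
VEHICLE_PEDESTRIAN = {"max_dist_ft": 5.0, "min_vehicle_speed_mph": 5.0}  # assumed

def heading_difference(a_deg, b_deg):
    """Smallest absolute angle between two headings, in degrees."""
    diff = abs(a_deg - b_deg) % 360.0
    return min(diff, 360.0 - diff)

def near_miss(a, b):
    """Evaluate the group of conditions for one object pair; a and b are
    dicts with position_ft, speed_mph, heading_deg, approach, kind,
    prior_near_miss, and speed_history_mph fields (illustrative schema)."""
    if a["prior_near_miss"] or b["prior_near_miss"]:
        return False  # not previously involved in another near miss/collision
    if sum(a["speed_history_mph"]) <= 0.0 or sum(b["speed_history_mph"]) <= 0.0:
        return False  # sum of previous speeds must be above zero for both
    dist_ft = math.dist(a["position_ft"], b["position_ft"])
    kinds = {a["kind"], b["kind"]}
    if kinds == {"vehicle"}:
        c = VEHICLE_VEHICLE
        return (dist_ft < c["max_dist_ft"]
                and max(a["speed_mph"], b["speed_mph"]) > c["min_speed_mph"]
                and heading_difference(a["heading_deg"], b["heading_deg"]) > c["min_angle_deg"]
                and a["approach"] != b["approach"])
    if kinds == {"vehicle", "pedestrian"}:
        vehicle = a if a["kind"] == "vehicle" else b
        c = VEHICLE_PEDESTRIAN
        # Lower distance threshold, no angle condition, and only the
        # vehicle is evaluated against the higher speed threshold.
        return (dist_ft < c["max_dist_ft"]
                and vehicle["speed_mph"] > c["min_vehicle_speed_mph"])
    return False
```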
  • the near miss/collision detection service may determine a conversion factor between pixels and a speed measurement. In various such examples, the speed measurement may be in miles per hour.
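A minimal sketch of such a conversion factor, calibrated from an average vehicle length; the 14.7 foot figure and the 30 frames-per-second example are assumptions for illustration.

```python
AVG_CAR_LENGTH_FT = 14.7  # assumed average car length used for calibration

def mph_per_pixel_per_frame(avg_car_length_px, fps):
    """Conversion factor so that an object's pixel displacement per frame
    can be read directly in miles per hour."""
    feet_per_pixel = AVG_CAR_LENGTH_FT / avg_car_length_px
    feet_per_second = feet_per_pixel * fps  # per one pixel of motion per frame
    return feet_per_second * 3600.0 / 5280.0

# A car spanning 60 px in a 30 fps feed: each pixel/frame is about 5 mph
print(mph_per_pixel_per_frame(60, 30))
```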
  • the system may further include adding a near miss/collision indicator to the frame.
  • a method of near miss/collision detection may include obtaining traffic data, analyzing the traffic data, determining that a near miss/collision occurred, and responding to the near miss/collision.
  • responding to the near miss/collision may include determining that the near miss/collision is a collision.
  • the method may further include transmitting an alert regarding the collision.
  • Traffic data may be obtained, such as video from intersection cameras, point cloud data from LiDAR sensors, and so on.
  • Object detection and classification may be performed using the data, and structured data may be determined and/or output using the detected and classified objects.
  • structured data may be obtained that has been generated by performing such object detection and classification on traffic data (such as point cloud data from LiDAR sensors).
  • Metrics may be calculated using the structured data. For each frame of traffic data, the metrics may be analyzed to detect whether a near miss/collision occurs between each object in the frame (such as motorized or non-motorized vehicles, pedestrians, and so on) and each of the other objects in the frame.
  • these metrics may be analyzed to evaluate whether or not a group of conditions are met. If the group of conditions are met, a near miss/collision may be detected. This may be recorded in the metrics for the objects involved. In some implementations, one or more indicators may be added to the traffic data and/or to one or more visualizations generated using the metrics, the traffic data, the structured data, and so on.
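Tying the pieces together, a hedged sketch of the per-frame pairwise sweep: every object is checked against every other object, detections are recorded on the objects' metrics, and an event list is returned so that indicators can be added to the traffic data or visualizations. The pair_check argument is a predicate such as the illustrative near_miss function sketched earlier.

```python
from itertools import combinations

def detect_near_misses(frames, pair_check):
    """frames: list of frames, each a list of per-object metric dicts.
    pair_check: predicate over two objects, e.g. the near_miss sketch."""
    events = []
    for frame_idx, objects in enumerate(frames):
        for a, b in combinations(objects, 2):
            if pair_check(a, b):
                # Record the detection on both objects' metrics so later
                # frames can honor the "not previously involved" condition.
                a["prior_near_miss"] = b["prior_near_miss"] = True
                events.append({"frame": frame_idx, "pair": (a["id"], b["id"])})
    return events
```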
  • the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed is an example of a sample approach. In other embodiments, the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter.
  • the accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
  • the described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure.
  • a non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer).
  • the non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)

Abstract

One or more devices may obtain traffic data, such as video from intersection cameras, point cloud data from light detection and ranging ("LiDAR") sensors, and so on. Metrics may be calculated from the traffic data. For each frame, the metrics may be analyzed to detect whether a near miss/collision occurs between each object in the frame (such as motorized or non-motorized vehicles, pedestrians, and so on) and each of the other objects in the frame. These metrics may be analyzed to evaluate whether or not a group of conditions is met. If the group of conditions is met, a near miss/collision may be detected. This may be recorded in the metrics for the objects involved. In some embodiments, one or more indicators may be added to the traffic data and/or to one or more visualizations generated using the metrics, the traffic data, the structured data, and so on.
PCT/US2022/044748 2021-09-27 2022-09-26 System and method for near miss/collision detection for traffic Ceased WO2023049461A1 (fr)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US202163248948P 2021-09-27 2021-09-27
US63/248,948 2021-09-27
US202263315200P 2022-03-01 2022-03-01
US63/315,200 2022-03-01
US202263318442P 2022-03-10 2022-03-10
US63/318,442 2022-03-10
US202263320010P 2022-03-15 2022-03-15
US63/320,010 2022-03-15
US17/861,411 2022-07-11
US17/861,411 US11955001B2 (en) 2021-09-27 2022-07-11 Traffic near miss collision detection

Publications (1)

Publication Number Publication Date
WO2023049461A1 true WO2023049461A1 (fr) 2023-03-30

Family

ID=84047734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/044748 2021-09-27 2022-09-26 System and method for near miss/collision detection for traffic Ceased WO2023049461A1 (fr)

Country Status (1)

Country Link
WO (1) WO2023049461A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6810132B1 (en) * 2000-02-04 2004-10-26 Fujitsu Limited Traffic monitoring apparatus
US20150145701A1 (en) * 2010-07-27 2015-05-28 Ryan P. Beggs Methods and apparatus to detect and warn proximate entities of interest
US20200388150A1 (en) * 2019-06-06 2020-12-10 Verizon Patent And Licensing Inc. Monitoring a scene to analyze an event using a plurality of image streams
WO2022040610A1 * 2020-08-21 2022-02-24 Ubicquia Iq Llc Node-based near miss detection

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6810132B1 (en) * 2000-02-04 2004-10-26 Fujitsu Limited Traffic monitoring apparatus
US20150145701A1 (en) * 2010-07-27 2015-05-28 Ryan P. Beggs Methods and apparatus to detect and warn proximate entities of interest
US20200388150A1 (en) * 2019-06-06 2020-12-10 Verizon Patent And Licensing Inc. Monitoring a scene to analyze an event using a plurality of image streams
WO2022040610A1 * 2020-08-21 2022-02-24 Ubicquia Iq Llc Node-based near miss detection

Similar Documents

Publication Publication Date Title
US11955001B2 (en) Traffic near miss collision detection
US9583000B2 (en) Vehicle-based abnormal travel event detecting and reporting
CN113240909B (zh) Vehicle monitoring method, device, cloud control platform, and vehicle-road collaboration system
US10165231B2 (en) Visualization of navigation information for connected autonomous vehicles
CN106935039A (zh) Platform for acquiring driver behavior data
US11651685B2 (en) Traffic data analysis and traffic jam prediction
US20240161621A1 (en) Systems that predict accidents and ameliorate predicted accidents
US11995154B2 (en) Method of determining state of target object, electronic device, and storage medium
CN112818792A (zh) Lane line detection method, apparatus, electronic device, and computer storage medium
KR20130108928A (ko) Vehicle accident information collection method, apparatus therefor, and vehicle accident information collection system
Perafan-Villota et al. Fast and precise: parallel processing of vehicle traffic videos using big data analytics
CN111932046A (zh) Method, computer device, and storage medium for handling risk in a service scenario
JP2022137267A (ja) Image recognition method, apparatus, electronic device, storage medium, and program
CN115146539B (zh) Method and apparatus for determining vehicle safety evaluation parameters, and vehicle
Cho et al. Big data pre-processing methods with vehicle driving data using MapReduce techniques
US20200128371A1 (en) System and method for reporting observed events/objects from smart vehicles
Park et al. Opportunities for preventing rear-end crashes: findings from the analysis of actual freeway crash data
WO2023049461A1 (fr) System and method for near miss/collision detection for traffic
US20210217510A1 (en) Correlating driving behavior and user conduct
WO2023049453A1 (fr) Traffic monitoring, analysis, and prediction
US20200250970A1 (en) Information processing apparatus, information processing method and program
CN111709665A (zh) Vehicle safety evaluation method and apparatus
Böddeker et al. Automated driving safety-The art of conscious risk taking-minimum lateral distances to pedestrians
US20250046091A1 (en) Traffic image sensor movement detection and handling
CN111653124B (zh) Data processing method, apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22798399; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 22798399; Country of ref document: EP; Kind code of ref document: A1)