WO2023049453A1 - Traffic monitoring, analysis, and prediction - Google Patents
Traffic monitoring, analysis, and prediction
- Publication number: WO2023049453A1 (PCT application PCT/US2022/044733)
- Authority: WIPO (PCT)
- Prior art keywords: data, traffic, metrics, intersection, frame
- Legal status: Ceased (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- All of the following classifications fall under G—PHYSICS; G08—SIGNALLING; G08G—TRAFFIC CONTROL SYSTEMS; G08G1/00—Traffic control systems for road vehicles:
- G08G1/0116—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
- G08G1/0129—Traffic data processing for creating historical data or processing based on historical data
- G08G1/0133—Traffic data processing for classifying traffic situation
- G08G1/0141—Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
- G08G1/015—Detecting movement of traffic to be counted or controlled with provision for distinguishing between two or more types of vehicles, e.g. between motor-cars and cycles
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
Definitions
- Provisional Patent Application 63/318,442 filed March 10, 2022, titled “Traffic Near Miss Detection;” and U.S. Provisional Patent Application No. 63/320,010, filed March 15, 2022, titled “Traffic Near Miss/Collision Detection;” the contents of which are incorporated herein by reference as if fully disclosed herein.
- the described embodiments relate generally to traffic monitoring. More particularly, the present embodiments relate to traffic monitoring, analysis, and prediction.
- Traffic may be motorized, non-motorized, and so on. Traffic may include cars, trucks, pedestrians, scooters, bicycles, and so on. Traffic appears only to increase as the world's population continues to grow.
- Some population areas, such as cities, use cameras and/or other traffic monitoring devices to capture data about traffic. This data may be used to evaluate congestion, traffic signal configurations, road layout, and so on.
- Traffic data may be obtained, such as video from intersection cameras.
- Object detection and classification may be performed using the data.
- Structured data may be determined and/or output using the detected and classified objects.
- Metrics may be calculated using the structured data.
- Processed data may be prepared for visualization and/or other uses. The prepared processed data may be presented via one or more dashboards and/or the prepared processed data may be otherwise used.
- a system for traffic monitoring, analysis, and prediction includes a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate at least one service.
- the at least one service obtains traffic data, performs object detection and classification, determines structured data, calculates metrics using the structured data, prepares processed data for visualization from the metrics, and presents the prepared processed data via at least one dashboard.
- the at least one service determines the structured data by determining a frame number for a frame of video, determining an intersection identifier for the frame of video, assigning a unique tracker identifier to each object detected in the frame of video, and determining coordinates of each object detected in the frame of video. In a number of implementations of such examples, the at least one service further determines the structured data by determining a class of each object detected in the frame of video.
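- The structured-data fields described above can be illustrated with a minimal Python sketch; the field names and example values below are illustrative assumptions rather than the actual schema of this disclosure.

```python
# Minimal sketch of a per-object structured data record; field names and
# values are illustrative assumptions, not the actual schema of this disclosure.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    frame_number: int      # frame of video in which the object was detected
    intersection_id: str   # identifier of the monitored intersection
    tracker_id: int        # unique tracker identifier assigned to the object
    object_class: str      # e.g., "car", "truck", "bus", "person", "bicycle"
    x_min: float           # minimum x position of the object's bounding box
    x_max: float           # maximum x position of the object's bounding box
    y_min: float           # minimum y position of the object's bounding box
    y_max: float           # maximum y position of the object's bounding box

record = DetectedObject(
    frame_number=1042,
    intersection_id="intersection_001",
    tracker_id=17,
    object_class="car",
    x_min=312.0, x_max=388.0, y_min=540.0, y_max=602.0,
)
```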
- the at least one service calculates the metrics using the structured data by calculating a difference between one or more x or y positions for an object in different frames of video. In some implementations of such examples, the at least one service uses the difference along with times respectively associated with the different frames to calculate at least one of the metrics that is associated with a speed of the object. In various implementations of such examples, the speed is an average speed of the object during the video or a cumulative speed of the object.
- the at least one service calculates the metrics using the structured data by correlating a traffic light phase determined for a frame of video along with a determination that an object arrived at an intersection in the frame.
- a system for traffic monitoring, analysis, and prediction includes a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate at least one service.
- the at least one service retrieves structured data determined from point cloud data from LiDAR sensors used to monitor traffic, calculates metrics using the structured data, prepares processed data for visualization from the metrics, and presents the prepared processed data via at least one dashboard.
- the metrics include at least one of vehicle volume, average speed, distance travelled, pedestrian volume, non-motor volume, light status on arrival, arrival phase, a route through an intersection, or a light time.
- the at least one service summons at least one vehicle using at least one of the metrics or the processed data.
- the at least one service tracks near misses/collisions using at least one of the metrics or the processed data.
- the at least one service determines a fastest route using at least one of the metrics or the processed data.
- the at least one service controls traffic signals to prioritize traffic using at least one of the metrics or the processed data.
- the at least one service determines a most efficient route using at least one of the metrics or the processed data.
- a system for traffic monitoring, analysis, and prediction includes a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate at least one service.
- the at least one service constructs a digital twin of an area of interest, retrieves structured data determined from traffic data for the area of interest, calculates metrics using the structured data, prepares processed data for visualization from the metrics, and presents the prepared processed data in a context of the digital twin via at least one dashboard that displays the digital twin.
- the at least one service simulates traffic via the at least one dashboard using the processed data. In some implementations of such examples, the at least one service simulates how a change affects traffic patterns. In various implementations of such examples, the change alters at least one of a simulation of the traffic, a traffic signal, or a traffic condition.
- the digital twin includes multiple intersections.
- the at least one dashboard includes indicators selectable to display information for each of the multiple intersections.
- FIG. 1 depicts an example system for traffic monitoring, analysis, and prediction.
- FIG. 2 depicts a flow chart illustrating an example method for traffic monitoring, analysis, and prediction. This method may be performed by the system of FIG. 1.
- FIGs. 3A and 3B depict a first example data pipeline structure that may be used for traffic monitoring, analysis, and prediction.
- FIG. 3C depicts a second example data pipeline structure that may be used for traffic monitoring, analysis, and prediction.
- FIG. 3D depicts a third example data pipeline structure that may be used for traffic monitoring, analysis, and prediction.
- FIG. 4 depicts an example of traffic monitoring data.
- FIG. 5 depicts the example of traffic monitoring data of FIG. 4 after object detection and classification.
- FIG. 6 depicts an example of structured data that may be determined from the detected and classified objects depicted in FIG. 5.
- FIGs. 7A-1 through 7A-4 depict a first portion of a list of metrics that may be used in traffic monitoring, analysis, and prediction.
- FIGs. 7B-1 through 7B-4 depict a second portion of a list of metrics that may be used in traffic monitoring, analysis, and prediction.
- FIGs. 7C-1 and 7C-2 depict a third portion of a list of metrics that may be used in traffic monitoring, analysis, and prediction.
- FIG. 8 depicts an example intersection table.
- FIG. 9 depicts an example vehicle table.
- FIG. 10 depicts an example approaches table.
- FIG. 11 depicts a first example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 12 depicts a second example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 13 depicts a third example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 14 depicts a fourth example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 15 depicts a fifth example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 16 depicts a sixth example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 17 depicts a seventh example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 18 depicts an eighth example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 19 depicts a ninth example dashboard display that may be used to visualize LiDAR traffic monitoring.
- FIGs. 20A and 20B depict an example of building a digital twin using a Dashboard and OpenStreetMap where FIG. 20A depicts a dashboard with 9 cameras in Bellevue and FIG. 20B depicts a network built with OSM and imported in SUMO (red circles for the camera locations).
- FIG. 21 depicts a visualization of the network with the speed (MPH).
- FIGs. 22A and 22B depict a visualization of the outputs where FIG. 22A depicts a graph of the NOx (Nitrogen Oxide) emissions against time (in seconds) and FIG. 22B depicts a graph of speed against time (in seconds).
- FIG. 23 depicts an example system for traffic near miss/collision detection.
- FIG. 24 depicts a flow chart illustrating a first example method for traffic near miss/collision detection. This method may be performed by the system of FIG. 23.
- FIG. 25 depicts a flow chart illustrating a second example method for traffic near miss/collision detection. This method may be performed by the system of FIG. 23.
- FIG. 26 depicts a flow chart illustrating a third example method for traffic near miss/collision detection. This method may be performed by the system of FIG. 23.
- FIG. 27A depicts a first frame of traffic data video.
- FIG. 27B depicts a second frame of traffic data video.
- FIG. 27C depicts a third frame of traffic data video.
- FIG. 27D depicts a fourth frame of traffic data video.
- FIG. 27E depicts a fifth frame of traffic data video.
- the raw data from one or more traffic devices may only be able to provide so much insight into the traffic. Processing the data may be more useful, providing the ability to visualize various metrics about the traffic, enable adaptive traffic signal control, predict traffic congestion and/or accidents, and aggregate data from multiple population areas for various uses, such as in the auto insurance, rideshare, logistics, and autonomous vehicle original equipment manufacturer industries.
- Traffic data may be obtained, such as video from intersection cameras.
- Object detection and classification may be performed using the data.
- Structured data may be determined and/or output using the detected and classified objects.
- Metrics may be calculated using the structured data.
- Processed data may be prepared for visualization and/or other uses. The prepared processed data may be presented via one or more dashboards and/or the prepared processed data may be otherwise used.
- the system may be able to perform traffic monitoring, analysis, and prediction that the system would not previously have been able to perform absent the technology disclosed herein. This may enable the system to operate more efficiently while consuming fewer hardware and/or software resources as more resource consuming techniques could be omitted. Further, a variety of components may be omitted while still enabling traffic monitoring, analysis, and prediction, reducing unnecessary hardware and/or software components, and providing greater system flexibility.
- FIG. 1 depicts an example system 100 for traffic monitoring, analysis, and prediction.
- the system may include one or more data processing pipeline devices 101 (which may be implemented using one or more cloud computing arrangements), security and/or other gateways 102, dashboard presenting devices 103, and so on.
- the system may perform traffic monitoring, analysis, and prediction.
- Traffic data may be obtained, such as via the gateway from one or more traffic monitoring devices 104 (such as one or more intersection and/or other still image and/or video cameras, Light Detection and Ranging sensors (or “LiDAR”), loops, radar, weather data, Internet of Things sensors, fleet vehicles, traffic controllers and/or other city and/or other population area supplied data devices, navigation app data, connected vehicles, and so on).
- the one or more data processing pipeline devices 101 may perform object detection and classification using the data. For example, objects may be detected and classified as cars, trucks, buses, pedestrians, light vehicles, heavy vehicles, non-motor vehicles, and so on.
- Objects may be assigned individual identifiers, identifiers by type, and so on.
- the one or more data processing pipeline devices 101 may determine and/or output structured data using the detected and classified objects.
- the one or more data processing pipeline devices 101 may calculate one or more metrics using the structured data.
- the metrics may involve vehicle volume, vehicle volume by vehicle type, average speed, movement status, distance travelled, queue length, pedestrian volume, non-motor volume, light status on arrival, arrival phase, route through intersection, light times, near misses, longitude, latitude, city, state, country, and/or any other metrics that may be calculated using the structured data.
- the one or more data processing pipeline devices 101 may prepare the processed data for visualization and/or other uses.
- the one or more data processing pipeline devices 101 may present the prepared processed data via one or more dashboards, such as via the dashboard presenting device 103, and/or otherwise use the prepared processed data.
- the data processing pipeline device 101 may be any kind of electronic device. Examples of such devices include, but are not limited to, one or more desktop computing devices, laptop computing devices, server computing devices, mobile computing devices, tablet computing devices, set top boxes, digital video recorders, televisions, displays, wearable devices, smart phones, digital media players, and so on.
- the data processing pipeline device 101 may include one or more processing units and/or other processors and/or controllers, one or more non-transitory storage media (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on), one or more communication units, and/or other components.
- the processing unit may execute instructions stored in the non-transitory storage medium to perform various functions.
- the data processing pipeline device 101 may involve one or more memory allocations configured to store at least one executable asset and one or more processor allocations configured to access the one or more memory allocations and execute the at least one executable asset to instantiate one or more processes and/or services, such as one or more services, and so on.
- gateway 102 may be any kind of electronic device.
- dashboard presenting device 103 may be any kind of electronic device.
- traffic monitoring device 104 may be any kind of electronic device.
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- computing resource refers to any physical and/or virtual electronic device or machine component, or set or group of interconnected and/or communicably coupled physical and/or virtual electronic devices or machine components, suitable to execute or cause to be executed one or more arithmetic or logical operations on digital data.
- Example computing resources contemplated herein include, but are not limited to: single or multi-core processors; single or multi-thread processors; purpose-configured coprocessors (e.g., graphics processing units, motion processing units, sensor processing units, and the like); volatile or non-volatile memory; application-specific integrated circuits; field-programmable gate arrays; input/output devices and systems and components thereof (e.g., keyboards, mice, trackpads, generic human interface devices, video cameras, microphones, speakers, and the like); networking appliances and systems and components thereof (e.g., routers, switches, firewalls, packet shapers, content filters, network interface controllers or cards, access points, modems, and the like); embedded devices and systems and components thereof (e.g., system(s)-on-chip, Internet-of-Things devices, and the like); industrial control or automation devices and systems and components thereof (e.g., programmable logic controllers, programmable relays, supervisory control and data acquisition controllers, discrete
- Example information can include, but may not be limited to: personal identification information (e.g., names, social security numbers, telephone numbers, email addresses, physical addresses, driver’s license information, passport numbers, and so on); identity documents (e.g., driver’s licenses, passports, government identification cards or credentials, and so on); protected health information (e.g., medical records, dental records, and so on); financial, banking, credit, or debt information; third-party service account information (e.g., usernames, passwords, social media handles, and so on); encrypted or unencrypted files; database files; network connection logs; shell history; filesystem files; libraries, frameworks, and binaries; registry entries; settings files; executing processes; hardware vendors, versions, and/or information associated with the compromised computing resource; installed applications or services; password hashes; idle time, uptime, and/or last login time; document files; product renderings; presentation files; image files; customer information; configuration files; passwords; and so on. It may be appreciated that the foregoing examples are not
- each microservice may be configured to provide data output and receive data input across an encrypted data channel.
- each microservice may be configured to store its own data in a dedicated encrypted database; in others, microservices can store encrypted data in a common database; whether such data is stored in tables shared by multiple microservices or whether microservices may leverage independent and separate tables/schemas can vary from embodiment to embodiment.
- processor refers to any software and/or hardware- implemented data processing device or circuit physically and/or structurally configured to instantiate one or more classes or objects that are purpose-configured to perform specific transformations of data including operations represented as code and/or instructions included in a program that can be stored within, and accessed from, a memory.
- This term is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, analog or digital circuits, or other suitably configured computing element or combination of elements.
- Although the system 100 is illustrated and described as including particular components arranged in a particular configuration, it is understood that this is an example. In a number of implementations, various configurations of various components may be used without departing from the scope of the present disclosure.
- the system 100 is illustrated and described as including the gateway 102. However, it is understood that this is an example. In various implementations, the gateway 102 may be omitted.
- the system 100 is illustrated and described as including the traffic monitoring device 104. However, it is understood that this is an example. In various implementations, the traffic monitoring device 104 may not be part of the system 100. The system 100 may instead communicate with the traffic monitoring device 104.
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- data that has already been detected and classified may be obtained.
- Various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
- FIG. 2 depicts a flow chart illustrating a first example method 200 for traffic monitoring, analysis, and prediction. This method may be performed by the system of FIG. 1.
- traffic data may be obtained.
- object detection and classification may be performed.
- structured data may be determined and/or output.
- one or more metrics may be calculated.
- processed data may be prepared for visualization and/or other use.
- the prepared processed data may be presented via one or more dashboards and/or otherwise used.
- this example method 200 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or by one or more computing devices, such as the data processing pipeline device 101 of FIG. 1.
- Although the example method 200 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
- a data pipeline may begin with a raw, real-time video feed from an intersection camera that is in use by a city department of transportation. This video may then be passed through a secure gateway to a cloud based processing pipeline (such as Amazon Web Services™ and/or any other cloud vendor, service, and/or implementation).
- the pipeline's first instance may allow for rapid development of machine learning and computer vision applications within the cloud provider's on-demand infrastructure. It may run object detection and classification deep learning models on the video. Examples of such video detection and classification algorithms include, but are not limited to, YOLOv4 + DeepSORT, YOLOv4 + DeepMOT, and so on.
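- As a rough illustration of how such a detection and classification stage might be driven, the Python sketch below reads frames from a video feed with OpenCV and hands each frame to a detector and tracker; the detector and tracker objects are hypothetical wrapper interfaces standing in for whichever YOLOv4 + DeepSORT (or DeepMOT) implementation is actually deployed.

```python
# Illustrative per-frame detection/tracking loop. Only the OpenCV calls below
# reflect a real API; the detector and tracker parameters are hypothetical
# wrappers standing in for whichever YOLOv4 + DeepSORT (or DeepMOT)
# implementation is actually deployed.
import cv2

def process_stream(stream_url, detector, tracker):
    capture = cv2.VideoCapture(stream_url)  # raw, real-time intersection feed
    frame_number = 0
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        frame_number += 1
        detections = detector.detect(frame)          # class + bounding box per object (hypothetical)
        tracked = tracker.update(detections, frame)  # persistent tracker IDs across frames (hypothetical)
        yield frame_number, tracked
    capture.release()
```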
- the pipeline may output structured data, such as the position, trajectory, count, and type of motorized and nonmotorized road users.
- the structured data from this module may be stored in a cloud instance and then be passed to a second instance that may calculate the intersection metrics.
- These metrics may be stored in three tables to minimize latency and storage size. These tables may include an intersection table, a vehicle table, an approaches table, and so on. Examples of these tables are included below, particularly with respect to FIGs. 8-10.
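- A hedged sketch of what rows in the three tables might look like is shown below; the actual columns are those shown in FIGs. 8-10, so the field names here are illustrative only, drawn from metrics named elsewhere in this disclosure.

```python
# Hedged sketch of rows in the intersection, vehicle, and approaches tables.
# The real columns are those shown in FIGs. 8-10; the fields below are
# illustrative, drawn from metrics named elsewhere in this disclosure.
intersection_row = {
    "intersection_id": "int_001",
    "observation_time": "2022-03-15T08:00:00Z",
    "vehicle_volume": 42,
    "pedestrian_volume": 7,
}

vehicle_row = {
    "tracker_id": 17,
    "intersection_id": "int_001",
    "object_class": "car",
    "average_speed_mph": 23.4,
    "arrival_phase": "green",
}

approach_row = {
    "approach_id": 3,
    "intersection_id": "int_001",
    "approach_volume": 11,
    "average_queue_length_ft": 54.0,
}
```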
- This processed data may be held in an additional cloud instance, and then exported (such as in JSON files) to a data warehouse where it may be further optimized for visualization.
- the processed data may be written to a live model (such as SiSense™, an enterprise dashboard provider) that may allow for the data to be visualized in real time and/or a SiSense™ ElastiCube™ for later retrieval and visualization of time-based metrics.
- a dashboard may be presented, which may show camera locations in a city.
- a user may click in (and/or otherwise select) and see specific metrics about an intersection’s health and performance.
- such data processing may be used to support adaptive traffic signal control, predicting traffic congestion and accidents, as well as productizing aggregated data from multiple cities for private sector use in the auto insurance, rideshare, logistics, autonomous vehicle original equipment manufacturer spaces, and so on.
- data sources in addition to intersection video feeds may be used. These may include weather data, Internet of Things sensors, LiDAR sensors, fleet vehicles, city-supplied data (e.g., traffic controllers), navigation app data, connected vehicle data, and so on.
- data that has already been detected and classified may be obtained.
- Various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
- FIGs. 3A and 3B depict a first example data pipeline structure that may be used for traffic monitoring, analysis, and prediction.
- the example data pipeline structure includes a camera stage, a security layer stage, an object detection and classification stage, a video pipe stage, a metric calculation stage, a data bucket 2 stage, a data warehouse stage, a live model 1 stage, a SiSense ElastiCube stage, a dashboard visual stage, a live model 2 stage, and a connected vehicle data API (application programming interface) stage.
- the camera stage is connected to the security layer stage
- the security layer stage is connected to the object detection and classification stage
- the object detection and classification stage is connected to the video pipe stage
- the video pipe stage is connected to the metric calculation stage
- the metric calculation stage is connected to the data bucket 2 stage
- the data bucket 2 stage is connected to the data warehouse stage
- the data warehouse stage is connected to the live model 1 stage
- the live model 1 stage is connected to the SiSense ElastiCube stage and the dashboard visual stage
- the SiSense ElastiCube stage and the dashboard visual stage are connected to the live model 2 stage
- the live model 2 stage is connected to the connected vehicle data API stage.
- In this first example data pipeline structure, the stages may operate as follows:
- The camera stage is where video footage may be collected at intersections (this stage may be hosted by cities or other population centers).
- The security layer stage and the object detection and classification stage may run one or more algorithms for object detection and classification.
- The video pipe stage may stream data from one or more algorithm processes run by the object detection and classification stage and store it for a specified amount of time (such as ten seconds, three minutes, two days, and so on).
- The metric calculation stage may include an AI (artificial intelligence) instance that may perform a metric calculation (such as by running a Python script) and/or process object detection and classification content into traffic metrics.
- The data bucket 2 stage may capture processed metrics and/or output (such as in one or more JSON files).
- The data warehouse stage may optimize the data format for visualization.
- The live model 1 stage may stream video (such as within an adjustable time window) based on metrics.
- The SiSense ElastiCube stage may store data and/or cache historical data.
- The dashboard visual stage may display data for one or more end users.
- The live model 2 stage may stream data (such as within an adjustable time window) to the connected vehicle data API stage.
- data that has already been detected and classified may be obtained.
- Various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
- FIG. 3C depicts a second example data pipeline structure that may be used for traffic monitoring, analysis, and prediction.
- Stream Ingestion & Processing Module: Besides handling camera discovery, registry, and video stream health check, this module may be responsible for starting, ending, and restarting processes of stream data. After ingesting the stream, it may handle the object detection and tracking processes.
- Metrics Processing Module: Once the stream data is processed, the tracked objects data may go through the metrics processing module, which may output the desired metrics.
- Stream ingestion and processing code may be bundled into a single container and deployed as an ECS task inside an ECS service.
- ECS is short for Elastic Container Service, which is an AWS (Amazon™ Web Services) proprietary container orchestration service.
- the ECS service may be deployed in an EC2 GPU instance group inside a private network (VPC).
- a container per stream may be deployed, and camera metadata as well as stream properties may be stored inside a DynamoDB collection.
- the DynamoDB table may allow the tracking and synchronization of the status and state of all different containers.
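- As an illustration only, the sketch below records per-stream container state in a DynamoDB table using boto3; the table name, key, and attributes are assumptions rather than the deployed schema.

```python
# Hedged sketch: recording per-stream container state in DynamoDB with boto3
# so container status can be tracked and synchronized. The table name, key,
# and attributes are assumptions, not the deployed schema.
import boto3

dynamodb = boto3.resource("dynamodb")
status_table = dynamodb.Table("stream_container_status")  # hypothetical table name

def record_container_status(camera_id, stream_url, state):
    status_table.put_item(
        Item={
            "camera_id": camera_id,   # assumed partition key
            "stream_url": stream_url,
            "state": state,           # e.g., "starting", "running", "restarting"
        }
    )
```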
- Kinesis Data Streams may assure the stream of data between the stream ingestion and processing module and the metrics processing service.
- the metrics processing code may be deployed on a Fargate cluster - a serverless deployment option of the ECS service.
- Kinesis Client Library may assist the task of consuming and processing data from the stream and a DynamoDB collection may be used to keep track of relevant stream-related metadata.
- S3 may enable permanent storage for the output of the metrics processing module. With a lifecycle policy, stored data may be moved to different storage tiers for cost effectiveness.
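- One way such a lifecycle policy could be expressed with boto3 is sketched below; the bucket name, prefix, and 90-day threshold are assumptions chosen for illustration.

```python
# Hedged sketch of an S3 lifecycle rule that moves metric output to a cheaper
# archive tier after a retention window; the bucket name, prefix, and 90-day
# threshold are assumptions chosen for illustration.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="traffic-metrics-output",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-metrics",
                "Status": "Enabled",
                "Filter": {"Prefix": "metrics/"},
                "Transitions": [
                    {"Days": 90, "StorageClass": "DEEP_ARCHIVE"}  # Glacier Deep Archive tier
                ],
            }
        ]
    },
)
```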
- VPC: All of the services may be deployed in a private network to control the flow of traffic between the systems.
- ECS Cluster: An ECS service may be deployed to group the ECS tasks and to handle the scaling of the tasks. Containers may be deployed on an EC2 instance.
- DynamoDB Cluster: May be used to store any kind of metadata.
- Kinesis Data Streams: A provisioned Kinesis data stream may be deployed for streaming the tracked object data and to act as a buffer between the video processing module and the metric module.
- ECS Cluster (Fargate): A Fargate launch type ECS cluster may be deployed for metric calculation.
- S3 Bucket: May be used to store all the data resulting from the metrics processing module. May be a standard tier bucket configured with a lifecycle policy to move the data after a certain period to deep archive (more accurately, Glacier).
- FIG. 3D depicts a third example data pipeline structure that may be used for traffic monitoring, analysis, and prediction.
- Stream Ingestion Module: Besides handling camera discovery, registry, and video stream health check, this module may be responsible for starting, ending, and restarting processes of stream data.
- the stream processing module may handle the object detection and tracking processes.
- the tracked objects data may go through the metrics processing module which may output the desired metrics.
- the stream ingestion and stream processing code may be bundled into a single container and deployed as an ECS task inside an ECS service.
- ECS is short for Elastic Container Service which is an AWS proprietary container orchestration service.
- the ECS service may be deployed in an auto scaling EC2 GPU instance group inside a private network (VPC).
- Deployed in front of the ECS service will be an Elastic Load Balancer (ELB) which may distribute the traffic across the different containers, one for each camera. Once the configuration files for the cameras are read, a container per camera may be deployed, and the camera metadata may be stored inside a DynamoDB collection. Network endpoints may be created to connect the Load Balancer, the output data stream, and the DynamoDB instance from the private network.
- All of the metrics processing code may be deployed on a Fargate cluster which is a serverless deployment option of the ECS service.
- S3 may be used as a permanent storage where all of the metric data may be stored and a lifecycle policy may be configured where data may be moved to different storage tiers for cost effectiveness.
- VPC All of the services may be deployed in a private network to control the flow of traffic between the systems.
- NAT gateway: A NAT gateway may be deployed in the private subnets to route internet traffic.
- ELB: An application load balancer may be deployed to distribute traffic across the containers and also may be used for service discovery.
- ECS Cluster: An ECS service may be deployed to group the ECS tasks and to handle the scaling of the tasks.
- EC2 launch type: Containers may be deployed on an EC2 GPU instance.
- Kinesis Data Stream: A provisioned Kinesis data stream may be deployed for streaming the tracked object data and to act as a buffer between the video processing module and the metric module.
- ECS Cluster (Fargate): A Fargate launch type ECS cluster may be deployed for metric calculation.
- S3 Bucket: May be used to store all the metric data. May be a standard tier bucket configured with a lifecycle policy to move the data after a certain period to deep archive (more accurately, Glacier).
- Other services may be added once the initial deployment of the system is complete. Those services may include an EC2 instance to act as a front-facing control module, CloudWatch alarms set to track the container cluster and data stream metrics, and a notification service deployed for CloudWatch events and email notifications.
- the above may have several benefits. These may include: resiliency (the architecture may be tolerant to fault), scalability (the system may be able to scale easily, without changes in the architecture), and modularization (this may make problem solving within each module much easier as well as latency and cost optimization).
- FIG. 4 depicts an example of traffic monitoring data.
- the traffic monitoring data includes a still image of an intersection including traffic.
- the traffic monitoring data may take a variety of different forms. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- FIG. 5 depicts the example of traffic monitoring data of FIG. 4 after object detection and classification.
- the different objects in the example of traffic monitoring data of FIG. 4 may be detected and classified, such as by identifying the objects as one or more different cars that may be associated with an identifier, tracked, and so on.
- FIG. 6 depicts an example of structured data that may be determined from the detected and classified objects depicted in FIG. 5. Various behaviors about the identified and/or tracked objects may be evaluated and recorded.
- FIGs. 7A-1 through 7A-4 depict a first portion of a list of metrics that may be used in traffic monitoring, analysis, and prediction.
- FIGs. 7B-1 through 7B-4 depict a second portion of a list of metrics that may be used in traffic monitoring, analysis, and prediction.
- FIGs. 7C-1 and 7C-2 depict a third portion of a list of metrics that may be used in traffic monitoring, analysis, and prediction. This list of metrics may also be referred to below as the "Data Dictionary."
- FIG. 8 depicts an example intersection table.
- FIG. 9 depicts an example vehicle table.
- FIG. 10 depicts an example approaches table.
- FIG. 11 depicts a first example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- This first example dashboard display may be a city view where the camera icons represent individual intersections that may be selected to switch to a dashboard display focusing on that intersection, which may present specific metrics about that intersection’s health, performance, and so on.
- FIG. 12 depicts a second example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- This second example dashboard display may be an intersection view that presents various specific metrics for an intersection, which may indicate various information about the intersection’s health, performance, and so on.
- this second example dashboard display may depict arrival on green over a period of time (which may be adjustable) versus arrival on red over a period of time (which may be adjustable), average speed over a period of time (which may be adjustable), arrival phase, and so on.
- FIG. 13 depicts a third example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- This third example dashboard display may also be an intersection view that presents various specific metrics for an intersection, which may indicate various information about the intersection's health, performance, and so on.
- This third example dashboard display may depict short term traffic intensity; a feed of the intersection; the distribution of travel left, right, or through over a period of time (which may be adjustable); an approach-exit distribution, and so on.
- FIG. 14 depicts a fourth example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- the top boxes and the bottom left may include animations that may be supported by processed traffic data, connected vehicle data, and so on.
- the data may include latitude and longitude for each unique vehicle over time.
- the top left box may be a GIS (Geographic Information System) layer with long trails.
- the top right box may be a GIS layer with points.
- the bottom left box may be a GIS layer that may be rotated with space.
- the bottom right box may show individual intersections with cameras/sublayers that may be highlighted and may be selectable.
- total vehicle volume may be inferred using traffic data volume data and a sample percentage variable, such as a total percentage of all vehicles that are connected and/or represented by the connected vehicle data.
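- A minimal sketch of that inference is shown below; the 12% penetration rate is purely illustrative.

```python
# Minimal sketch of inferring total vehicle volume from connected vehicle
# counts; the 12% penetration rate is purely illustrative.
def infer_total_volume(connected_vehicle_count, sample_percentage):
    """Estimate total vehicles given the share of traffic that is connected."""
    return connected_vehicle_count / sample_percentage

estimated_total = infer_total_volume(connected_vehicle_count=36, sample_percentage=0.12)
# 36 connected vehicles at a 12% penetration rate implies roughly 300 vehicles.
```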
- FIG. 15 depicts a fifth example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 16 depicts a sixth example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 17 depicts a seventh example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 16 may be a magnified “pop out” of FIG. 15.
- FIG. 17 may be a magnified pop out of FIG. 16.
- FIG. 15 may include a variable selection menu. In some examples, up to 2 may be selected. Selected variables may be contrasted on the X & Y axes of 7.
- FIG. 15 may also include an interface to export selected data as a .csv or .xlsx file.
- FIG. 15 may also include a time series graph. The time series graph may show up to 2 variables selected from 5. The time series graph may be updated in real time. The time window shown in the time series graph may be chosen by a user.
- FIG. 15 may show “point in time values” for a current intersection. The point in time values may update in real time.
- FIG. 15 may also depict a live camera feed. The live camera feed may be streamed from a selected intersection camera.
- FIG. 15 may also include an interface zoomed in view of a “primary layer” view (such as one or more of the views of FIG. 14) of the intersection. This view may show vehicles and such moving through the intersection, such as they would in the primary layer. This may be a top-down “T” view.
- FIG. 15 may also include an interface to return to the primary layer city grid view.
- FIG. 15 may also include an interface to initiate a deep dive into a playback of the intersection.
- FIGs. 16 and 17 may include interfaces that may be selectable to expand.
- FIG. 18 depicts an eighth example dashboard display that may be used to visualize the results of data processed traffic monitoring.
- FIG. 18 may include an interface to select metrics and a time window.
- FIG. 18 may also include an interface that enables export of selected data as a .csv or .xlsx file.
- FIG. 18 may display a time series according to one or more selected variables and a time window, which may be 1x/graph.
- FIG. 18 may include an interface that enables navigation back to a secondary level (such as FIG. 15).
- FIG. 18 may depict vehicles and such moving through an intersection during a time interval.
- FIG. 18 may include a time slider. The time slider may enable scrolling through a time interval to see vehicles and such moving through the intersection.
- This example dashboard may include the examples shown in FIGs. 15-18.
- the objective of the dashboard in this example may be to visualize traffic data in real time, which may use SiSense’s enterprise dashboard platform. These visualizations may be based on a stream of data generated from live traffic camera video feeds and may constitute the dashboard.
- This example dashboard may involve user interface formatting, features, functionality, and so on.
- This dashboard may consist of three layers: a primary, secondary, and tertiary layer. Each layer, as well as its design, display, and performance expectations, is detailed in its respective section below. References will be made throughout each layer's section to the Data Dictionary, which is shown in FIGs. 7A-1 through 7C-2. Data Dictionary metrics may be referred to by their "key" names under each layer-feature that they support.
- the primary layer may be to display a map of a selected city, real time traffic flow, and provide a click-through gateway to the secondary layer.
- FIG. 14 may show an example of the primary layer.
- the primary layer may consist of the following features:
- A. May show the geo-spatial layout of a city’s road network
- E. May show the real time location of cars as they move through a city’s road network
- the secondary layer may be to provide users with single-intersection level data.
- FIGs. 15-17 may show examples of the secondary layer.
- FIG. 16 may be a magnified “pop out” of FIG. 15.
- FIG. 17 may be a magnified pop out of FIG. 16.
- the secondary layer may consist of the following features:
- Class variables may be characters
- Class variables may modify both metric and category variables
- Category variables may be characters
- Metric variables that cannot be filtered may already assume category and/or class variables
- bus_volume (filtered metric variable)
- This feature may allow for the raw data corresponding to a user’s selections to be exported to a .csv format
- C. May display a time series of selected metric variables
- This time series may display up to two selected metric variables
- This time series graph may continuously update in real time with new data
- This time series may allow users to adjust desired periodicity
- Pilot periodicity may cover 1 week worth of time
- This time series may be interactive, and may allow users to see percent increase and decrease between two points
- Time series may display absolute values when hovered over
- This layer may be clicked in order to create a magnified pop out
- This feature may be a replicated, “zoomed-in” version of the Dashboard’s Primary Layer
- G. May allow the user to navigate back to the Primary Layer
- the tertiary layer's purpose may be to provide users with intersection playback functionality (as compared with a continuously updating real-time stream of data as in the secondary layer).
- FIG. 18 may show an example of the tertiary layer.
- the tertiary layer may consist of the following features:
- C. May display a time series of intersection level metrics selected by the user
- F. May have a “play-back” slider to allow a user to toggle through a selected time period
- the periodicity may correspond to a user’s selection in the Tertiary Layer
- Geometric center of the detected object's bounding box in the x coordinate.
- Geometric center of the detected object's bounding box in the y coordinate.
- l2_distance_units: L2 distance between the geometric centers of the object's bounding box in the current and previous frame.
- motion_status: Can be:
- Miles per hour: Current speed of the vehicle. This may involve checking the direction of a vehicle within a bounding box. This direction may be used to calculate the length of the vehicle in pixels. The average size of vehicles in the United States (or other places) may be used to get a pixels-to-miles conversion factor. The miles conversion factor may be used to convert to speed_in_units. In other examples, other units may be used instead of miles per hour.
- Total time a vehicle spends in the intersection. May be measured in seconds or other units. Determination may use the age of the object and fps to convert to time units.
- Difference between the actual travel time for a vehicle to move through the intersection and the reference travel time. May be measured in seconds or other units. Determination may involve subtracting the reference travel time from the actual travel time.
- Time during which a vehicle is stationary at an intersection approach. May be measured in seconds or other units. Determination may involve counting each frame in which the vehicle is stopped and using fps to get the time value.
- approach_id: Integer that identifies the approach of a specific intersection. Is the same as the key in the JSON.
- platoon_ratio: Measure of individual phase progression performance derived from the percentage of arrivals on green.
- Necessary green time for all vehicles to not stop on arrival within a certain time window. May be measured in seconds or other units. Determination may involve (# of vehicles per hour per approach) * (average time per movement). May be based on observed use: (# of cars/hour/approach) * (average time/movement).
- Difference between green_time_demand and green_time for a given interval. May be measured in seconds or other units.
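- The two green-time calculations above can be expressed as a short worked sketch; the example numbers are illustrative only.

```python
# Worked sketch of the green time calculations described above; the example
# numbers are illustrative only.
def green_time_demand(vehicles_per_hour_per_approach, avg_time_per_movement_s):
    # Necessary green time for arriving vehicles not to stop within the window.
    return vehicles_per_hour_per_approach * avg_time_per_movement_s

def green_time_gap(demand_s, green_time_s):
    # Difference between green_time_demand and green_time for a given interval.
    return demand_s - green_time_s

demand = green_time_demand(vehicles_per_hour_per_approach=120, avg_time_per_movement_s=2.5)  # 300.0 s
gap = green_time_gap(demand, green_time_s=240.0)                                             # 60.0 s
```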
- frames of a raw, real-time video feed from an intersection camera and/or other traffic data may be obtained (though it is understood that this is an example and that in other examples other data, such as point cloud LiDAR data, may be obtained and used).
- Detection and classification may be performed on each frame to identify and classify the objects in the frame. Structured data may then be determined for the objects detected.
- a frame number may be determined for a frame
- an intersection identifier may be determined for a frame
- a unique tracker identifier may be assigned to each object detected
- the class of the object may be determined (such as person, car, truck, bus, motorbike, bicycle, and so on)
- coordinates of the object detected in the frame may be determined (which may be determined with reference to known coordinates of the intersection and/or the intersection camera, such as camera longitude, latitude, city, state, country, and so on) (such as the minimum and maximum x positions of the object, the minimum and maximum y positions of the object, and so on), and the like.
- An example of such information is shown in FIG. 6.
- a bounding box may be calculated for the object based on one or more x and/or y positions for the object.
- one or more geometric centers of the object’s bounding box may be calculated for the object in the x and/or y coordinate (such as an x min, a y min, and so on).
- an intersection approach that the object is currently on may be calculated, such as based on a position of the object and a position of the center of the intersection.
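- A minimal sketch of the bounding-box center and a nearest-approach assignment is shown below; treating an object's approach as the closest of several reference points is one plausible interpretation of the position-based determination described above, and the coordinates used are assumptions.

```python
# Hedged sketch of the bounding-box center and a nearest-approach assignment;
# treating an object's approach as the closest of several reference points is
# one plausible interpretation, and the coordinates below are assumptions.
import math

def bbox_center(x_min, y_min, x_max, y_max):
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)

def assign_approach(object_xy, approach_points):
    """Pick the approach whose reference point is closest to the object."""
    return min(approach_points, key=lambda a: math.dist(object_xy, approach_points[a]))

center = bbox_center(312, 540, 388, 602)  # (350.0, 571.0)
approach = assign_approach(center, {1: (350, 100), 2: (600, 571), 3: (350, 900), 4: (90, 571)})
# approach == 2 for this illustrative layout.
```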
- other structured data may be determined from the frames.
- one or more time stamps associated with frames may be determined and/or associated with other structured data, such as to determine a time at which an object was at a determined x and/or y position.
- a light phase for the frame may be determined (such as whether a traffic light in the frame is green, red, and so on), though this may instead be determined by means other than image analysis (such as time-stamped traffic light data that may be correlated to a frame time stamp). This may be used to determine the traffic light phase when an object arrived at the intersection, such as by correlating a traffic light phase determined for a frame along with a determination that an object arrived at the intersection in the frame.
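- A hedged sketch of correlating time-stamped traffic light data with a frame time stamp is shown below; it assumes the light data is available as sorted (timestamp, phase) pairs.

```python
# Hedged sketch of correlating time-stamped traffic light data with the frame
# time stamp at which an object arrived at the intersection. Assumes the light
# data is available as sorted (timestamp_seconds, phase) pairs.
import bisect

def phase_at(timestamp, phase_changes):
    """Return the light phase in effect at `timestamp`."""
    change_times = [t for t, _ in phase_changes]
    index = bisect.bisect_right(change_times, timestamp) - 1
    return phase_changes[index][1] if index >= 0 else None

phase_changes = [(100.0, "green"), (145.0, "yellow"), (150.0, "red")]
arrival_phase = phase_at(147.2, phase_changes)  # "yellow": the phase when the object arrived
```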
- data for an approach and/or intersection associated with a frame may be determined (such as based on a uniform resource locator of the video feed and/or any other intersection camera identifier associated with the frame, an approach identifier associated with the frame, an intersection identifier associated with the frame, and so on).
- the structured data determined for an object in a frame may be used with the structured data determined for the object in other frames to calculate various metrics. For example, the difference between one or more x and/or y positions for the object (such as the difference and/or distance between x or y midpoints of the object’s bounding box) in different frames (such as in a current and a previous frame) may be calculated. Such difference in position between frames, along with times respectively associated with the frames (such as from one or more time-stamps) may be used to calculate one or more metrics associated with the speed of the object (such as an average speed of the object during the video feed (such as in miles per hour and/or other units), cumulative speed, and so on).
- Such difference in position between frames may also be used to calculate various metrics about the travel of the object (such as the direction of travel between frames, how the object left an intersection, whether or not the object made a right on red, and so on).
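- A minimal sketch of such a speed estimate is shown below; the pixels-to-miles factor would in practice come from the assumed average vehicle length described elsewhere in this disclosure, and the values used here are illustrative.

```python
# Minimal sketch of a speed estimate from the difference between bounding-box
# centers in consecutive frames. The pixels-to-miles factor would in practice
# come from an assumed average vehicle length, as described above; the values
# here are illustrative.
def speed_mph(center_prev, center_curr, t_prev_s, t_curr_s, miles_per_pixel):
    dx = center_curr[0] - center_prev[0]
    dy = center_curr[1] - center_prev[1]
    pixel_distance = (dx ** 2 + dy ** 2) ** 0.5
    elapsed_hours = (t_curr_s - t_prev_s) / 3600.0
    return (pixel_distance * miles_per_pixel) / elapsed_hours if elapsed_hours > 0 else 0.0

mph = speed_mph((350.0, 571.0), (362.0, 571.0), t_prev_s=10.0, t_curr_s=10.0333, miles_per_pixel=2.6e-5)
# About 12 pixels in roughly 1/30 of a second at ~2.6e-5 miles/pixel is roughly 34 mph.
```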
- structured data from multiple frames may be used to determine a status of the object (such as an approach associated with the object, how an object moved through an intersection, an approach an object used to enter an intersection, the approach an object used to exit an intersection, and so on), a time or number of frames since the object was last detected (and/or since first detected and so on), whether or not the object is moving, and so on.
- Structured data and/or metrics for individual detected objects and/or other data may be used together to calculate various metrics, such as metrics associated with approaches.
- structured data and/or metrics for individual detected objects associated with an approach identifier may be aggregated and analyzed to determine one or more approach volumes (such as a number of vehicles (cars, motorbikes, trucks, buses, and so on)) in a particular approach, a number of light vehicles (such as cars, motorbikes, and so on) in a particular approach, a number of heavy vehicles (such as trucks, buses, and so on) in a particular approach, a number of cars in a particular approach, a number of trucks in a particular approach, a number of buses in a particular approach, a number of pedestrians in a particular approach, a number of non-motor vehicles in a particular approach, a number of bicycles in a particular approach, and so on), an average queue length (such as in feet and/or another unit of measurement) of a particular approach, and so on.
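- A hedged sketch of aggregating per-object structured data into approach volumes by class is shown below; the class groupings and field names are assumptions that mirror the description above.

```python
# Hedged sketch of aggregating per-object structured data into approach
# volumes by class; the class groupings and field names are assumptions that
# mirror the description above.
from collections import Counter

LIGHT_VEHICLES = {"car", "motorbike"}
HEAVY_VEHICLES = {"truck", "bus"}

def approach_volumes(objects, approach_id):
    """Count objects of each class observed on a given approach."""
    classes = Counter(o["object_class"] for o in objects if o["approach_id"] == approach_id)
    return {
        "vehicle_volume": sum(classes[c] for c in LIGHT_VEHICLES | HEAVY_VEHICLES),
        "light_vehicle_volume": sum(classes[c] for c in LIGHT_VEHICLES),
        "heavy_vehicle_volume": sum(classes[c] for c in HEAVY_VEHICLES),
        "pedestrian_volume": classes["person"],
        "bicycle_volume": classes["bicycle"],
    }
```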
- light status in one or more frames may be tracked and/or correlated with other information to determine a light status, an effective green time (such as a length of time that objects are moving through a particular intersection), an effective red time (such as a length of time that objects are stopped at a particular intersection), a cycle time (such as a length of time that a light is green determined by comparing the light phase across multiple frames), a number of cars that arrived while a traffic light is green, a number of cars that arrived while a traffic light is red, a measure of individual phase progression performance derived from a percentage of vehicle volume arrivals on green, and so on.
- a last stop time may be calculated based on a last time-stamp that an object stopped at an approach.
- a last start time may be calculated based on a last time-stamp that an object moved into the intersection at a particular approach.
- an approach identifier for a particular approach may be determined, coordinates for a camera associated with a particular intersection may be determined, a number of lanes associated with a particular approach may be determined, and so on.
- Structured data and/or metrics for individual detected objects and/or other data may be also used together to calculate various metrics associated with intersections.
- a vehicle volume for a particular intersection may be determined by summing objects (such as cars, motorbikes, trucks, buses, and so on) in all approaches of a frame associated with the intersection
- a light vehicle volume for a particular intersection may be determined by summing objects (such as cars, motorbikes, and so on) in all approaches of a frame associated with the intersection
- a heavy vehicle volume for a particular intersection may be determined by summing objects (such as trucks, buses, and so on) in all approaches of a frame associated with the intersection
- a car volume for a particular intersection may be determined by summing cars in all approaches of a frame associated with an intersection
- a truck volume for a particular intersection may be determined by summing trucks in all approaches of a frame associated with an intersection
- a bus volume for a particular intersection may be determined by summing buses in all approaches of a frame associated with an intersection
- a person volume for a particular intersection may be determined by summing people in all approaches of a frame associated with an intersection
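- A minimal sketch of this summation, reusing the assumed approach-level counts from the earlier sketch, might look as follows.

```python
# Intersection-level volumes obtained by summing approach-level counts for a frame.
def intersection_volumes(approach_metrics):
    totals = {"vehicles": 0, "light_vehicles": 0, "heavy_vehicles": 0,
              "car": 0, "truck": 0, "bus": 0, "person": 0}
    for per_class in approach_metrics.values():
        for key in totals:
            totals[key] += per_class.get(key, 0)
    return totals
```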
- Other information for an intersection may be determined using the video feed, frames, and/or other structured data and/or metrics. For example, an identifier for a camera associated with an intersection may be determined, identifiers for frames of one or more video feeds associated with the intersection may be determined, observation times associated with an intersection may be determined (such as a time-stamp based on ingestion time when other metadata from a stream or other video feed is not available), a cumulative time (such as from the start of processing of the video feed) may be determined, and so on.
- any structured data and/or metrics (such as those discussed above with relation to FIGs. 7A1 - 7C-2) relating to one or more vehicles and/or other objects, approaches, intersections, and so on may be determined and calculated from the objects detected in one or more frames of one or more video feeds of one or more intersection cameras and/or other traffic data without departing from the scope of the present disclosure.
- connected vehicle data may be obtained and used.
- structured data and/or metrics may be determined and/or calculated using a combination of connected vehicle data and data from one or more video feeds from one or more intersection cameras and/or other traffic data.
- a visualization dashboard may visualize connected vehicle data along with structured data and/or metrics determined and/or calculated from one or more video feeds from one or more intersection cameras and/or other traffic data.
- real-time video feed from an intersection camera and/or other traffic data may be obtained.
- Objects in frames of the video feed may be detected and classified. Positions of the objects at various times in the frames of the video feed may be determined, as well as information such as light statuses related to the objects. Differences between the objects in different frames may be used to determine behavior of the objects over time.
- Such calculated object metrics may be stored, such as in one or more vehicle tables.
- Such calculated object metrics for objects that are associated with a particular approach may be aggregated in order to determine various approach object volumes and/or other metrics related to the approach, which may then be stored, such as in one or more approach tables.
- object metrics for objects that are associated with a particular intersection may be aggregated in order to determine various intersection object volumes and/or other metrics related to the intersection, which may then be stored, such as in one or more intersection tables.
- structured data and/or metrics related to one or more vehicles and/or other objects, approaches, intersections, and so on discussed above may then be processed and/or otherwise prepared for visualization (such as one or more of the example dashboard displays of FIGs. 11-18) and/or one or more other purposes.
- structured data and/or metrics related to one or more vehicles and/or other objects may be stored in one or more vehicle tables (such as the example vehicle table of FIG. 9)
- structured data and/or metrics related to one or more intersections may be stored in one or more intersection tables (such as the example intersection table of FIG. 8)
- structured data and/or metrics related to one or more approaches may be stored in one or more approach tables (such as the example approach table of FIG. 10), and so on.
- Such tables may then be used for visualization (such as one or more of the example dashboard displays of FIGs. 11-18) and/or one or more other purposes.
- a visualization dashboard may include a graphical model generated of a city or other area.
- the graphical model may include one or more intersections and may visualize various of the structured data and/or metrics related to the depicted intersections, and so on.
- one or more intersections depicted by the graphical model may be selected to present various information related to the structured data and/or metrics associated with the intersection (such as arrival phase over an interval, average speed over an interval, various object volumes (such as right turn volume, left turn volume, through volume, and so on), approach data related to the intersection, how objects proceeded through the intersection, a current and/or historical video feed associated with the intersection, and so on).
- Various controls may be provided that enable a user to select which information is displayed, export data related to the information, playback historic data, and so on.
- the structured data and/or metrics may be used for purposes other than visualization.
- Example uses include, but are not limited to, adaptive traffic signal control, predicting traffic congestion and accidents, productizing aggregated data from multiple cities for private sector use (such as in the auto insurance, rideshare, logistics, autonomous vehicle original equipment manufacturer spaces, and so on), routing (such as for rideshare, logistics, autonomous vehicle control, and so on), simulating traffic, using structured data and/or metrics to simulate how changes to traffic (and/or traffic signals, traffic conditions, and so on) will change traffic patterns, and so on.
- LiDAR sensors may be operable to determine data, such as ranges (variable distance), by targeting an object with elements, such as one or more lasers, and measuring the time for the reflected light to return to one or more receivers.
- LiDAR sensors may generate point cloud data that may be used for the analysis discussed herein instead of frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
- functions similar to those described above performed on frames of a raw, real-time video feed from an intersection camera and/or other traffic data may be performed on the LiDAR sensor data.
- structured data generated from LiDAR cloud data that has already been detected and classified may be obtained and various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
- LiDAR sensor data may have a number of advantages over frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
- point cloud data from one or more LiDAR sensors may not have the same privacy issues as frames of a raw, real-time video feed from an intersection camera and/or other traffic data as facial and/or other similar images may not be captured.
- LiDAR sensor data may not be dependent on lighting and thus may provide more reliable data over all times of day and night as compared to frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
- LiDAR sensor data may provide data in three- dimensional space as opposed to the two-dimensional data from frames of a raw, real-time video feed from an intersection camera and/or other traffic data and thus may provide depth, which may not be provided via frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
- a determination may be made about the size of an average vehicle in pixels. This may be used with the LiDAR sensor data to determine the pixels from the center of a vehicle represented in the LiDAR sensor data and then infer the speed of the vehicle. Compared to approaches using frames of a raw, real-time video feed from an intersection camera and/or other traffic data, an assumption may not have to be made about object speed. This may be more accurate, but also may improve the processing speed of computing devices processing the data as functions performed on frames of a raw, real-time video feed from an intersection camera and/or other traffic data to determine speed may not need to be performed and can be omitted since this information may already be represented in LiDAR sensor data.
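- As a hedged illustration of the general idea (and not the specific pixel-based procedure described above), a vehicle centroid tracked across LiDAR sweeps can yield a speed directly from metric displacement over time, since the point cloud already carries real-world scale.

```python
# Simplified sketch: speed from LiDAR centroids in meters; assumes each sweep
# provides a clustered vehicle centroid and a time-stamp in seconds.
import numpy as np

def lidar_speed_mph(centroid_prev, centroid_curr, t_prev, t_curr):
    displacement_m = np.linalg.norm(np.asarray(centroid_curr, dtype=float)
                                    - np.asarray(centroid_prev, dtype=float))
    dt = t_curr - t_prev
    if dt <= 0:
        return 0.0
    return (displacement_m / dt) * 2.23694  # meters per second to miles per hour
```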
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- pedestrian and/or other traffic through one or more parking lots, walkways, and/or other areas related to one or more events and/or event venues may be monitored, analyzed, directed, controlled, simulated, and so on.
- cargo in one or more container trucks moving in relation to one or more ports may be monitored, analyzed, directed, controlled, simulated, and so on.
- cargo truck queues related to one or more ports may be monitored, analyzed, directed, controlled, simulated, and so on.
- various airport traffic (such as pedestrians and/or vehicles approaching an airport, moving through an airport, and so on) may be monitored, analyzed, directed, controlled, simulated, and so on.
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- FIG. 19 depicts a ninth example dashboard display that may be used to visualize LiDAR traffic monitoring.
- the ninth example dashboard display may include a map showing approach density and/or an average speed tab including an indication of average speed, an indication of speed by approach, and/or an indicator of average platoon ratio.
- the ninth example dashboard display may include an object volume tab that may be selected to show one or more indicators related to object volume, an arrival phase tab that may be selected to show one or more indicators related to arrival phase, and/or an average queue tab that may be selected to show one or more indicators related to average queue.
- the ninth example dashboard display may also include a feed illustrating point cloud LiDAR data on an image and/or other representation of an intersection.
- simulation of a digital twin intersection may be performed with real data. This will now be discussed in detail.
- FIGs. 20A and 20B depict an example of building a digital twin using a Dashboard and OpenStreetMap where FIG. 20A depicts a dashboard with 9 cameras in Bellevue and FIG. 20B depicts a network built with OSM and imported in SUMO (red circles for the camera locations).
- FIG. 21 depicts a visualization of the network with the speed (MPH).
- FIGs. 22A and 22B depict a visualization of the outputs where FIG. 22A depicts a graph of the NOx (Nitrogen Oxide) emissions against time (in seconds) and FIG. 22B depicts a graph of speed against time (in seconds).
- SUMO (Simulation of Urban Mobility) is an open source, portable, microscopic, and continuous multi-modal traffic simulation package designed to handle large networks.
- SUMO is developed by the German Aerospace Center and community users.
- any traffic scenario may be simulated, whether it’s building a new unique network of roads or creating a digital twin of an intersection of interest.
- A digital twin is defined as a virtual representation that serves as the real-time digital counterpart of a physical object or process.
- the context may involve intelligent traffic solutions. Products may be built for cities to help them economically achieve their transportation objectives, including reductions in traffic congestion, accidents, and associated emissions.
- Products may use intersection cameras with a machine learning stack to provide insights on that intersection, whether a count of classified vehicles, speed, or emissions. Using this information, products may be able to pinpoint areas of interest so that a traffic engineer may be able to easily work to reduce congestion, cut emissions, etc.
- a digital twin of the intersections may first be created using OSM (OpenStreetMap).
- SUMO provides a tool that works with OSM, allowing the user to crop out a section of the map to be simulated in SUMO and edit it accordingly.
- the parameters of the simulation may be adjusted.
- SUMO currently allows one to adjust the count of cars, trucks, buses, motorcycles, bicycles, and pedestrians. There are other options such as trams, urban trains, and trains but those may be excluded due to the nature of the simulation.
- Another file that SUMO may provide is the emission data from the simulation. Using this dataset, the output of CO2, CO, HC, NOx, PMx, fuel consumption, and electricity consumption may be seen. Using this information, insight into how the intersections are performing may be obtained, and it may then be decided whether the traffic light signals need adjusting or whether more or fewer lanes are needed.
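- By way of illustration only, the per-timestep emission output could be post-processed along the following lines; this sketch assumes the usual timestep/vehicle layout of SUMO's emission output file, and the file name is hypothetical.

```python
# Summing NOx per simulation second from a SUMO emission-output file (assumed layout).
import xml.etree.ElementTree as ET
from collections import defaultdict

def nox_per_timestep(emission_file="emissions.xml"):
    totals = defaultdict(float)                      # simulation time (s) -> NOx (mg)
    for _, elem in ET.iterparse(emission_file):
        if elem.tag != "timestep":
            continue
        t = float(elem.get("time"))
        for veh in elem.findall("vehicle"):
            totals[t] += float(veh.get("NOx", 0.0))
        elem.clear()                                 # keep memory use flat on long runs
    return dict(sorted(totals.items()))
```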
- this kind of data may yield more robust and full-bodied output to a traffic engineer.
- This kind of information may be able to give traffic engineers a full understanding of the intersections that they are working with, alongside the products discussed herein.
- a near miss may be when two objects (such as motorized or non-motorized vehicles, pedestrians, and so on) in traffic almost collide.
- a collision may be when the two objects actually collide.
- Near misses/collisions may signal traffic problems that may need to be addressed. Further, near misses may be more challenging to track than actual collisions as near misses may not be reported to insurance providers, law enforcement, and/or other entities. Data regarding detected near misses/collisions may be useful in the ability to visualize various metrics about the traffic, enable adaptive traffic signal control, predict traffic congestion and/or accidents, and aggregate data from multiple population areas for various uses, such as in the auto insurance industry, rideshare industry, logistics industry, autonomous vehicle original equipment manufacturer industry, and so on.
- Traffic data may be obtained, such as video from intersection cameras, point cloud data from LiDAR sensors, and so on.
- Object detection and classification may be performed using the data, and structured data may be determined and/or output using the detected and classified objects.
- structured data may be obtained that has been generated by performing such object detection and classification on traffic data (such as point cloud data from LiDAR sensors).
- Metrics may be calculated using the structured data. For each frame of traffic data, the metrics may be analyzed to detect whether a near miss/collision occurs between each object in the frame (such as motorized or non-motorized vehicles, pedestrians, and so on) and each of the other objects in the frame.
- these metrics may be analyzed to evaluate whether or not a group of conditions are met. If the group of conditions are met, a near miss/collision may be detected. This may be recorded in the metrics for the objects involved. In some implementations, one or more indicators may be added to the traffic data and/or to one or more visualizations generated using the metrics, the traffic data, the structured data, and so on.
- the system may be able to perform near miss/collision detection and/or various other actions based thereon that the system would not previously have been able to perform absent the technology disclosed herein. This may enable the system to operate more efficiently while consuming fewer hardware and/or software resources as more resource consuming techniques could be omitted. Further, a variety of components may be omitted while still enabling traffic near miss detection, reducing unnecessary hardware and/or software components, and providing greater system flexibility.
- FIG. 23 depicts an example system 2300 for traffic near miss/collision detection.
- the system 2300 may include one or more analysis devices 2301 (which may be implemented using one or more cloud computing arrangements) and/or one or more traffic monitoring devices 2302 (such as one or more intersection and/or other still image and/or video cameras, LiDAR sensors, loops, radar, weather data, Internet of Things sensors, fleet vehicles, traffic controllers and/or other city and/or other population area supplied data devices, navigation app data, connected vehicles, and so on).
- the analysis device 2301 and/or one or more other devices may obtain traffic data (such as video from intersection cameras, point cloud data from Light Detection and Ranging (or “LiDAR”) sensors, and so on) from the traffic monitoring device.
- the analysis device 2301 and/or one or more other devices may perform object detection and classification using the data and may determine and/or output structured data using the detected and classified objects.
- the analysis device 2301 and/or one or more other devices may obtain structured data that has been generated by performing such object detection and classification on traffic data (such as point cloud data from LiDAR sensors).
- the analysis device 2301 and/or one or more other devices may calculate one or more metrics using the structured data.
- the analysis device 2301 and/or one or more other devices may analyze the metrics to detect whether a near miss/collision occurs between each object in the frame (such as motorized or non-motorized vehicles, pedestrians, and so on) and each of the other objects in the frame.
- the analysis device 2301 and/or one or more other devices may analyze the metrics to evaluate whether or not a group of conditions are met.
- the analysis device 2301 and/or one or more other devices may determine that a near miss/collision is detected.
- the analysis device 2301 and/or one or more other devices may record the near miss/collision and/or other data based thereon in the metrics for the objects involved.
- analysis device 2301 and/or one or more other devices may add one or more indicators to the traffic data and/or to one or more visualizations generated using the metrics, the traffic data, the structured data, and so on.
- the group of conditions may include a variety of factors, such as the distance between the objects, the direction of movement of the objects, the current speed of the objects, and so on.
- a near miss/collision may be detected between a pair of objects when the distance between them is below a distance threshold, the speed of at least one of the objects is higher than a speed threshold, the angle between the vehicles is higher than an angle threshold, the objects are not coming from the same approach (intersection entrance), at least one of the vehicles was not detected in a near miss already in the same traffic data, and the sum of previous speeds of each of the objects is higher than zero (avoiding parked and/or otherwise stationary objects).
- Near misses/collisions may be calculated in a metrics module and analyzed for every frame.
- a near miss/collision detection process may include detecting objects in a given frame, making a list of all objects in that frame, analyzing all possible pairs of objects in that frame and, if the group of conditions above are met, determining that a near miss/collision is detected. After a near miss/collision is detected, both objects may be marked as having been involved in a near miss/collision.
- In various implementations, only portions of the traffic data may be analyzed for near misses/collisions. For example, most near misses/collisions may happen within actual intersections between objects of opposing directions. Further, 40% of near misses/collisions may happen with left turns, either a left turn or an exit approach. As such, near miss/collision detection analysis may be restricted to objects within intersections of opposing directions, objects involved in left turns, and so on.
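- A frame-level sketch of that process is shown below; conditions_met() stands in for the per-pair condition group detailed later in this section, and the record fields are assumptions for the example.

```python
# Detect near misses/collisions among all object pairs in one frame (sketch).
from itertools import combinations

def detect_near_misses(frame_objects, conditions_met):
    events = []
    for a, b in combinations(frame_objects, 2):
        # per the conditions above, at least one object must not already be flagged
        if a.get("near_miss") and b.get("near_miss"):
            continue
        if conditions_met(a, b):
            a["near_miss"] = b["near_miss"] = True   # mark both objects as involved
            events.append((a["object_id"], b["object_id"]))
    return events
```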
- Accuracy of near miss/collision detection may depend on the traffic monitoring device 2302, the angle to the intersection, and so on. Performance of near miss/collision detection may be evaluated using a near miss/collision data set (traffic data with a near miss/collision) and a number of different performance measurements. These performance measurements may include recall (the number of near misses/collisions detected over the total number of near misses/collisions), precision (the number of near misses/collisions over the total number of possible pairs of objects in the data set), and so on.
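- Following the measures described above, an evaluation over a labeled data set might be sketched as follows; the pair representation is an assumption for the example.

```python
# Recall and precision for near miss/collision detection, as described above.
def evaluate_detection(detected_pairs, labeled_pairs, candidate_pair_count):
    detected = {frozenset(p) for p in detected_pairs}
    labeled = {frozenset(p) for p in labeled_pairs}
    recall = len(detected & labeled) / len(labeled) if labeled else 0.0
    # precision here follows the description above: detections over all possible pairs
    precision = len(detected) / candidate_pair_count if candidate_pair_count else 0.0
    return {"recall": recall, "precision": precision}
```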
- Near misses/collisions may be detected and data stored for such at the frame level.
- near misses/collisions may be detected and data stored for such at a second level above the frame level, such as a group of frames. This may reduce the amount of data that may be stored.
- Latency of near miss/collision detection may be approximately equal to the latency of the system 2300. This may relate to the time that passes between ingestion of traffic data and processing of a frame.
- a near miss/collision detection procedure may be run every frame, along with other relevant metrics.
- Such a near miss/collision detection procedure may involve the following steps.
- the near miss/collision detection procedure may start with object detection and tracking in the frame under analysis.
- the output of this process may be a list of record keys identifying each object, bounding boxes pointing to the object’s position, class, the total number of frames that the object has appeared in so far (age), the number of frames that have passed since the object has been detected for the last time (time_since_update), and so on.
- the second step may involve going through this list of objects and processing a group of metrics calculations that involve the near miss/collision detection. These metrics may be calculated on the base of each object.
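- One hypothetical shape for such a per-object record is sketched below; the field names mirror the description above, but the exact schema is an assumption.

```python
# Hypothetical record for one tracked object output by detection and tracking.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TrackedObject:
    record_key: str                                   # unique tracker identifier
    bounding_box: Tuple[float, float, float, float]   # x_min, y_min, x_max, y_max
    object_class: str                                 # e.g. "car", "truck", "bus", "person"
    age: int                                          # total frames the object has appeared in
    time_since_update: int                            # frames since the object was last detected
```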
- the following metrics may be calculated that may play a role in the detection of near misses/collisions: direction angle (direction of the object’s movement), conversion factor between pixels and miles, speed in miles per hour (or kilometers per hour and/or other measure), and approach ID (intersection entrance, or where the object is coming from).
- direction angle: changes in position and the previous value of the direction angle may be used to calculate the direction of the object and exclude outlier values (such as erroneous values).
- conversion factor: the direction angle of the object within its bounding box may be used to obtain the longitudinal size of the object in pixels. Then, the average size of the object class (such as vehicle class) may be used to get a conversion factor between pixels and miles (or other measurement) for that object in the frame.
- cars in the United States measure on average 14.7 feet long.
- speed: the distance in pixels travelled between the current and last frame may be measured. Then this may be multiplied by the conversion factor to get the distance in miles. Finally, the distance may be divided by the time between frames to get the speed in miles per hour.
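- As a worked sketch of this calculation, using the 14.7-foot average car length mentioned above, the conversion factor and speed might be computed as follows; vehicle_length_px is the assumed longitudinal bounding-box size in pixels.

```python
# Pixel-to-miles conversion and speed estimate per the description above (sketch).
FEET_PER_MILE = 5280.0
AVERAGE_CAR_LENGTH_FT = 14.7

def miles_per_pixel(vehicle_length_px, average_length_ft=AVERAGE_CAR_LENGTH_FT):
    return (average_length_ft / FEET_PER_MILE) / vehicle_length_px

def speed_mph(pixels_travelled, vehicle_length_px, seconds_between_frames):
    miles = pixels_travelled * miles_per_pixel(vehicle_length_px)
    hours = seconds_between_frames / 3600.0
    return miles / hours if hours > 0 else 0.0
```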
- approach ID: when an object appears in the traffic data, its distance to all approach center points may be calculated and the closest approach may be attributed to the object.
- a near miss detection determination may be applied as follows. For each object, the distance in pixels to all other objects may be measured and transformed into miles (and/or other measurement) using the conversion factor calculated previously.
- the conversion factor may only allow calculation of distance in miles correctly for objects that are close to each other, as near miss analysis targets are.
- the angle with all other objects may be calculated using the direction angle.
- a group of conditions may be evaluated for every possible pair with the object under analysis. If the group of conditions are all met, the case may be determined to be a near miss.
- the groups of conditions may be different for different objects, such as between two vehicles, between a vehicle and a pedestrian, and so on.
- the group of conditions for detecting a near miss/collision between two vehicles may be as follows.
- the second vehicle may be required to be at a distance lower than a threshold distance, such as 7 feet.
- Both vehicles may be required to have speed values higher than a speed threshold, such as 0 miles per hour.
- the angle between the vehicles may be higher than an angle threshold, such as 12 degrees.
- the second vehicle may be required to be coming from a different approach than the vehicle under analysis.
- the group of conditions to detect a near miss/collision between a vehicle and a pedestrian may be the same as the above with the following exceptions.
- the person may need to be at a distance lower than a pedestrian distance threshold, such as 5 feet.
- the vehicle may be required to have a speed higher than a vehicle speed threshold, such as 0.3 miles per hour.
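- Putting the thresholds above together, a per-pair condition check might be sketched as follows; the record fields (x_ft and y_ft positions already converted to feet, speed_mph, direction_angle, approach_id, sum_previous_speeds) are assumptions for the example rather than the actual data model.

```python
# Condition group for a near miss/collision between an object pair (sketch):
# 7 ft / 0 mph / 12 degrees for two vehicles; 5 ft / 0.3 mph and no angle test
# when a pedestrian is involved.
import math

def conditions_met(a, b):
    pedestrian_involved = "person" in (a["object_class"], b["object_class"])
    distance_threshold_ft = 5.0 if pedestrian_involved else 7.0

    distance_ft = math.hypot(a["x_ft"] - b["x_ft"], a["y_ft"] - b["y_ft"])
    if distance_ft >= distance_threshold_ft:
        return False
    if pedestrian_involved:
        vehicle = a if b["object_class"] == "person" else b
        if vehicle["speed_mph"] <= 0.3:               # vehicle speed threshold
            return False
    else:
        if a["speed_mph"] <= 0.0 or b["speed_mph"] <= 0.0:
            return False
        angle = abs(a["direction_angle"] - b["direction_angle"]) % 360.0
        angle = min(angle, 360.0 - angle)
        if angle <= 12.0:                             # angle between directions of travel
            return False
    if a["approach_id"] == b["approach_id"]:
        return False                                  # must come from different approaches
    if a["sum_previous_speeds"] <= 0 or b["sum_previous_speeds"] <= 0:
        return False                                  # excludes parked/stationary objects
    return True
```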
- the analysis device 2301 and/or one or more other devices may perform object detection and classification using the data. For example, objects may be detected and classified as cars, trucks, buses, pedestrians, light vehicles, heavy vehicles, non-motor vehicles, and so on. Objects may be assigned individual identifiers, identifiers by type, and so on.
- the analysis device 2301 and/or one or more other devices may determine and/or output structured data using the detected and classified objects.
- the analysis device 2301 and/or one or more other devices may calculate one or more metrics using the structured data.
- the metrics may involve vehicle volume, vehicle volume by vehicle type, average speed, movement status, distance travelled, queue length, pedestrian volume, non-motor volume, light status on arrival, arrival phase, route through intersection, light times, near misses, longitude, latitude, city, state, country, and/or any other metrics that may be calculated using the structured data.
- the analysis device 2301 may be any kind of electronic device. Examples of such devices include, but are not limited to, one or more desktop computing devices, laptop computing devices, server computing devices, mobile computing devices, tablet computing devices, set top boxes, digital video recorders, televisions, displays, wearable devices, smart phones, digital media players, and so on.
- the analysis device 2301 may include one or more processors 2303 and/or other processing units and/or controllers, one or more non-transitory storage media 2304 (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on), one or more communication units 2305, one or more input and/or output devices 2306 (such as one or more displays, speakers, keyboards, mice, track pads, touch pads, touch screens, sensors, printers, and so on), and/or other components.
- the processor 2303 may execute instructions stored in the non-transitory storage medium to perform various functions.
- the analysis device 2301 may involve one or more memory allocations configured to store at least one executable asset and one or more processor allocations configured to access the one or more memory allocations and execute the at least one executable asset to instantiate one or more processes and/or services, such as one or more near miss and/or collision detection and/or response services, and so on.
- the traffic monitoring device 2302 may be any kind of electronic device. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- Although the system 2300 is illustrated and described as including particular components arranged in a particular configuration, it is understood that this is an example. In a number of implementations, various configurations of various components may be used without departing from the scope of the present disclosure.
- the system 2300 is illustrated and described as including the traffic monitoring device 2302. However, it is understood that this is an example. In various implementations, the traffic monitoring device 2302 may not be part of the system 2300.
- the system 2300 may instead communicate with the traffic monitoring device 2302.
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- data that has already been detected and classified may be obtained.
- Various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
- FIG. 24 depicts a flow chart illustrating a first example method 2400 for traffic near miss/collision detection. This method 2400 may be performed by the system 2300 of FIG. 23.
- an electronic device may obtain traffic data, such as video from one or more intersection cameras.
- the electronic device may analyze the traffic data.
- the electronic device may determine whether or not a near miss/collision occurred. If not, the flow may proceed to operation 2440 and end. Otherwise, at operation 2450, the electronic device may respond to the detected near miss/collision.
- This may include recording data regarding the near miss/collision, marking the traffic data with one or more near miss/collision indicators, transmitting one or more notifications (such as to one or more cities and/or other municipalities or authorities, emergency responders, and so on for the purpose of summoning emergency services, tracking near misses/collisions, and so on), and so on.
- responding to the detected near miss/collision may include one or more automatic and/or other alerts.
- the electronic device may determine within a confidence level, threshold, or similar mechanism that a detected near miss/collision is a collision.
- the electronic device may automatically and/or otherwise send an alert, such as to a 911 operator and/or other emergency and/or other vehicle dispatcher, emergency and/or other vehicle, vehicle controller, vehicle navigation device, and so on via one or more mechanisms such as cellular and/or other communication network.
- the collision detection may be external to the vehicle dispatched to render aid and/or perform other actions related to the collision.
- this may add functions to vehicle collision detection systems, add redundancy to vehicle collision detection systems, and so on.
- the electronic device may also utilize traffic data and/or control other devices, such as to determine the fastest and/or most efficient route to the collision, control traffic signals to prioritize traffic to the collision (such as creating an empty corridor to the collision), and so on.
- the electronic device may record the near miss with other traffic data and/or otherwise analyze such as part of analyzing traffic data.
- this example method 2400 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or by one or more computing devices, such as the analysis device 2301 of FIG. 23.
- Although the example method 2400 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
- the method 2400 is illustrated and described as including the operation 2440. However, it is understood that this is an example. In some implementations, operation 2440 may be omitted and the electronic device may instead return to operation 2420 and continue analyzing the traffic data. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- other traffic data in addition to intersection video feeds may be used. This may include weather, Internet of Things sensors, LiDAR sensors, fleet vehicles, city-supplied data (e.g., traffic controllers), navigation app data, connected vehicle data, and so on.
- data that has already been detected and classified may be obtained.
- Various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
- FIG. 25 depicts a flow chart illustrating a second example method 2500 for traffic near miss/collision detection. This method 2500 may be performed by the system 2300 of FIG. 23.
- an electronic device may detect all objects in a frame, such as in frames of a video from one or more intersection cameras.
- the electronic device may analyze all pairs of objects.
- the electronic device may determine whether or not a group of conditions are met (such as one or more of the groups of conditions discussed with respect to FIG. 23 above). If not, the flow may proceed to operation 2540 and end. Otherwise, at operation 2550, the electronic device may determine that a near miss has occurred.
- this example method 2500 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or by one or more computing devices, such as the analysis device 2301 of FIG. 23.
- Although the example method 2500 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
- the method 2500 is illustrated and described as determining that a near miss/collision has occurred. However, it is understood that this is an example.
- the electronic device may perform one or more actions in response to determining that a near miss/collision has occurred. This may include recording data regarding the near miss/collision, marking the traffic data with one or more near miss/collision indicators, and so on.
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- FIG. 26 depicts a flow chart illustrating a third example method 2600 for traffic near miss/collision detection. This method 2600 may be performed by the system 2300 of FIG. 23.
- an electronic device may analyze traffic video.
- the electronic device may determine that a near miss/collision has occurred.
- the electronic device may add a near miss/collision indicator to the traffic video.
- this example method 2600 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or by one or more computing devices, such as the analysis device 2301 of FIG. 23.
- Although the example method 2600 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
- the method 2600 is illustrated and described as analyzing traffic video. However, it is understood that this is an example. In some implementations, other traffic data, such as point cloud data from one or more LiDAR sensors, may instead be analyzed. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- FIG. 27A depicts a first frame 2700A of traffic data video.
- the objects 2751A-2751D, 2751F in the first frame 2700A each include an indicator 2752A-2752D, 2752F that indicates that each of the objects 2751A-2751D, 2751F has not been involved in a near miss/collision.
- the indicators 2752A-2752D, 2752F may be in dashed lines to show that each of the objects 2751A-2751D, 2751F has not been involved in a near miss/collision. However, it is understood that this is an example.
- FIG. 27B depicts a second frame 2700B of traffic data video.
- the second frame 2700B may be a subsequent frame to the first frame 2700A of FIG. 27A.
- the objects 2751A-2751F in the second frame 2700B each include an indicator 2752A-2752F that indicates that each of the objects 2751A-2751F has not been involved in a near miss/collision.
- FIG. 27C depicts a third frame 2700C of traffic data video.
- the third frame 2700C may be a subsequent frame to the second frame 2700B of FIG. 27B.
- the objects 2751A-2751D, 2751F-2751I in the third frame 2700C each include an indicator 2752A-2752D, 2752F-2752I that indicates that each of the objects 2751A-2751D, 2751F-2751I has not been involved in a near miss/collision.
- FIG. 27D depicts a fourth frame 2700D of traffic data video.
- the fourth frame 2700D may be a subsequent frame to the third frame 2700C of FIG. 27C.
- the two objects 2751G, 2751H in the intersection have proceeded and are involved in a near miss.
- the indicators 2752G, 2752H for those two objects 2751G, 2751H that previously indicated that the two objects 2751G, 2751H had not been involved in a near miss have been modified to indicate that the two objects 2751G, 2751H have been involved in a near miss.
- this is an example.
- the indicators 2752G, 2752H for those two objects 2751G, 2751H that previously indicated that the two objects 2751G, 2751H had not been involved in a near miss may instead be removed and other indicators 2752G, 2752H indicating that the two objects 2751G, 2751H have been involved in a near miss may be added.
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- the indicators 2752G, 2752H may be in solid lines to show that the objects 2751G, 2751H have been involved in a near miss. However, it is understood that this is an example. In other examples, the indicators 2752G, 2752H may be red to show that the objects 2751G, 2751H have been involved in a near miss/collision. In still other examples, other indicators 2752G, 2752H may be used without departing from the scope of the present disclosure.
- FIG. 27E depicts a fifth frame 2700E of traffic data video. The fifth frame 2700E may be a subsequent frame to the fourth frame 2700D of FIG. 27D. As shown, the indicators 2752G, 2752H for the two objects 2751G, 2751H that were involved in the near miss in FIG. 27D still indicate that the two objects 2751G, 2751H were involved in a near miss.
- Although FIGs. 27A-27E are illustrated and discussed above with respect to a near miss, it is understood that this is an example.
- the present disclosure may detect and respond to collisions between objects 2751A-2751D, 2751F instead of and/or in addition to detecting and responding to near misses. Collisions may be detected and responded to similarly to how near misses are detected and responded to in the present disclosure.
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
- Although FIGs. 27A-27E are illustrated and discussed above with respect to particular traffic data, objects, frames, and indicators, it is understood that these are examples. In various implementations, other traffic data, objects, frames, indicators, and so on may be used without departing from the scope of the present disclosure.
- frames of a raw, real-time video feed from an intersection camera and/or other traffic data may be obtained (though it is understood that this is an example and that in other examples other data, such as point cloud LiDAR data, may be obtained and used).
- Detection and classification may be performed on each frame to identify and classify the objects in the frame. Structured data may then be determined for the objects detected.
- a frame number may be determined for a frame
- an intersection identifier may be determined for a frame
- a unique tracker identifier may be assigned to each object detected
- the class of the object may be determined (such as person, car, truck, bus, motorbike, bicycle, and so on)
- coordinates of the object detected in the frame may be determined (which may be determined with reference to known coordinates of the intersection and/or the intersection camera, such as camera longitude, latitude, city, state, country, and so on) (such as the minimum and maximum x positions of the object, the minimum and maximum y positions of the object, and so on), and the like.
- a bounding box may be calculated for the object based on one or more x and/or y positions for the object.
- one or more geometric centers of the object’s bounding box may be calculated for the object in the x and/or y coordinate (such as an x min, a y min, and so on).
- an intersection approach that the object is currently on may be calculated, such as based on a position of the object and a position of the center of the intersection.
- other structured data may be determined from the frames.
- one or more time stamps associated with frames may be determined and/or associated with other structured data, such as to determine a time at which an object was at a determined x and/or y position.
- a light phase for the frame may be determined (such as whether a traffic light in the frame is green, red, and so on), though this may instead be determined by means other than image analysis (such as time-stamped traffic light data that may be correlated to a frame time stamp). This may be used to determine the traffic light phase when an object arrived at the intersection, such as by correlating a traffic light phase determined for a frame along with a determination that an object arrived at the intersection in the frame.
- data for an approach and/or intersection associated with a frame may be determined (such as based on a uniform resource locator of the video feed and/or any other intersection camera identifier associated with the frame, an approach identifier associated with the frame, an intersection identifier associated with the frame, and so on).
- the structured data determined for an object in a frame may be used with the structured data determined for the object in other frames to calculate various metrics. For example, the difference between one or more x and/or y positions for the object (such as the difference and/or distance between x or y midpoints of the object’s bounding box) in different frames (such as in a current and a previous frame) may be calculated. Such difference in position between frames, along with times respectively associated with the frames (such as from one or more time stamps) may be used to calculate one or more metrics associated with the speed of the object (such as an average speed of the object during the video feed (such as in miles per hour and/or other units), cumulative speed, and so on).
- Such difference in position between frames may also be used to calculate various metrics about the travel of the object (such as the direction of travel between frames, how the object left an intersection, whether or not the object made a right on red, and so on).
- structured data from multiple frames may be used to determine a status of the object (such as an approach associated with the object, how an object moved through an intersection, an approach an object used to enter an intersection, the approach an object used to exit an intersection, and so on), a time or number of frames since the object was last detected (and/or since first detected and so on), whether or not the object is moving, and so on.
- Structured data and/or metrics for individual detected objects and/or other data may be used together to calculate various metrics, such as metrics associated with approaches.
- structured data and/or metrics for individual detected objects associated with an approach identifier may be aggregated and analyzed to determine one or more approach volumes (such as a number of vehicles (cars, motorbikes, trucks, buses, and so on) in a particular approach, a number of light vehicles (such as cars, motorbikes, and so on) in a particular approach, a number of heavy vehicles (such as trucks, buses, and so on) in a particular approach, a number of cars in a particular approach, a number of trucks in a particular approach, a number of buses in a particular approach, a number of pedestrians in a particular approach, a number of non-motor vehicles in a particular approach, a number of bicycles in a particular approach, and so on), an average queue length (such as in feet and/or another unit of measurement) of a particular approach, and so on.
- light status in one or more frames may be tracked and/or correlated with other information to determine a light status, an effective green time (such as a length of time that objects are moving through a particular intersection), an effective red time (such as a length of time that objects are stopped at a particular intersection), a cycle time (such as a length of time that a light is green determined by comparing the light phase across multiple frames), a number of cars that arrived while a traffic light is green, a number of cars that arrived while a traffic light is red, a measure of individual phase progression performance derived from a percentage of vehicle volume arrivals on green, and so on.
- a last stop time may be calculated based on a last time stamp that an object stopped at an approach.
- a last start time may be calculated based on a last time stamp that an object moved into the intersection at a particular approach.
- an approach identifier for a particular approach may be determined, coordinates for a camera associated with a particular intersection may be determined, a number of lanes associated with a particular approach may be determined, and so on.
- Structured data and/or metrics for individual detected objects and/or other data may be also used together to calculate various metrics associated with intersections.
- a vehicle volume for a particular intersection may be determined by summing objects (such as cars, motorbikes, trucks, buses, and so on) in all approaches of a frame associated with the intersection
- a light vehicle volume for a particular intersection may be determined by summing objects (such as cars, motorbikes, and so on) in all approaches of a frame associated with the intersection
- a heavy vehicle volume for a particular intersection may be determined by summing objects (such as trucks, buses, and so on) in all approaches of a frame associated with the intersection
- a car volume for a particular intersection may be determined by summing cars in all approaches of a frame associated with an intersection
- a truck volume for a particular intersection may be determined by summing trucks in all approaches of a frame associated with an intersection
- a bus volume for a particular intersection may be determined by summing buses in all approaches of a frame associated with an intersection
- a person volume for a particular intersection may be determined by summing people in all approaches of a frame associated with an intersection
- Other information for an intersection may be determined using the video feed, frames, and/or other structured data and/or metrics. For example, an identifier for a camera associated with an intersection may be determined, identifiers for frames of one or more video feeds associated with the intersection may be determined, observation times associated with an intersection may be determined (such as a time stamp based on ingestion time when other metadata from a stream or other video feed is not available), a cumulative time (such as from the start of processing of the video feed) may be determined, and so on.
- the above raw data, structured data, metrics, and so on may be used to detect one or more near misses/collisions. Detection of such near misses/collisions may be performed using one or more of the methods and/or procedures discussed above.
- any structured data and/or metrics relating to one or more vehicles and/or other objects, approaches, intersections, and so on may be determined and calculated from the objects detected in one or more frames of one or more video feeds of one or more intersection cameras and/or other traffic data without departing from the scope of the present disclosure.
- connected vehicle data may be obtained and used.
- structured data and/or metrics may be determined and/or calculated using a combination of connected vehicle data and data from one or more video feeds from one or more intersection cameras and/or other traffic data.
- a visualization dashboard may visualize connected vehicle data along with structured data and/or metrics determined and/or calculated from one or more video feeds from one or more intersection cameras and/or other traffic data.
- real-time video feed from an intersection camera and/or other traffic data may be obtained.
- Objects in frames of the video feed may be detected and classified. Positions of the objects at various times in the frames of the video feed may be determined, as well as information such as light statuses related to the objects. Differences between the objects in different frames may be used to determine behavior of the objects over time.
- Such calculated object metrics may be stored, such as in one or more vehicle tables.
- Such calculated object metrics for objects that are associated with a particular approach may be aggregated in order to determine various approach object volumes and/or other metrics related to the approach, which may then be stored, such as in one or more approach tables.
- object metrics for objects that are associated with a particular intersection may be aggregated in order to determine various intersection object volumes and/or other metrics related to the intersection, which may then be stored, such as in one or more intersection tables.
- structured data and/or metrics related to one or more vehicles and/or other objects, approaches, intersections, and so on discussed above may then be processed and/or otherwise prepared for visualization and/or one or more other purposes, such as near miss/collision detection.
- structured data and/or metrics related to one or more vehicles and/or other objects may be stored in one or more vehicle tables
- structured data and/or metrics related to one or more intersections may be stored in one or more intersection tables
- structured data and/or metrics related to one or more approaches may be stored in one or more approach tables, and so on.
- Such tables may then be used for visualization and/or one or more other purposes.
- LiDAR sensors may be operable to determine data, such as ranges (variable distance), by targeting an object with elements, such as one or more lasers, and measuring the time for the reflected light to return to one or more receivers.
- LiDAR sensors may generate point cloud data that may be used for the analysis discussed herein instead of frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
- functions similar to those described above performed on frames of a raw, real-time video feed from an intersection camera and/or other traffic data (such as detection and classification, determination of structured data, near miss/collision detection, and so on) may be performed on the LiDAR sensor data.
- structured data generated from LiDAR cloud data that has already been detected and classified may be obtained and various metrics may be calculated from such, similar to above, which may then be prepared for visualization and/or visualized and/or otherwise used similar to above.
- LiDAR sensor data may have a number of advantages over frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
- point cloud data from one or more LiDAR sensors may not have the same privacy issues as frames of a raw, real-time video feed from an intersection camera and/or other traffic data as facial and/or other similar images may not be captured.
- LiDAR sensor data may not be dependent on lighting and thus may provide more reliable data over all times of day and night as compared to frames of a raw, real-time video feed from an intersection camera and/or other traffic data.
- LiDAR sensor data may provide data in three-dimensional space, as opposed to the two-dimensional data from frames of a raw, real-time video feed from an intersection camera and/or other traffic data, and thus may provide depth, which such frames may not provide.
- a determination may be made about the size of an average vehicle in pixels. This may be used with the LiDAR sensor data to determine the pixel positions of the center of a vehicle represented in the LiDAR sensor data and then infer the speed of the vehicle. Compared to approaches using frames of a raw, real-time video feed from an intersection camera and/or other traffic data, an assumption may not have to be made about object speed.
- This may not only be more accurate, but may also improve the processing speed of computing devices processing the data, as functions otherwise performed on frames of a raw, real-time video feed from an intersection camera and/or other traffic data to determine speed may be omitted since this information may already be represented in the LiDAR sensor data.
- Various configurations are possible and contemplated without departing from the scope of the present disclosure.
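Purely as an illustrative sketch of the LiDAR-based speed inference described above, and not the implementation of the disclosure, the following Python example assumes an average vehicle length in feet and in pixels (both hypothetical values) to derive a feet-per-pixel scale, then infers speed from the displacement of a vehicle's center between two samples:

```python
import math

# Assumed calibration values for illustration only.
AVG_VEHICLE_LENGTH_FEET = 15.0    # typical passenger-vehicle length
AVG_VEHICLE_LENGTH_PIXELS = 60.0  # average vehicle size, in pixels, in the projected LiDAR data
FEET_PER_PIXEL = AVG_VEHICLE_LENGTH_FEET / AVG_VEHICLE_LENGTH_PIXELS

def infer_speed_mph(center_a, center_b, dt_seconds):
    """Infer speed from the displacement of a vehicle's center between two samples.

    center_a, center_b: (x, y) pixel coordinates of the vehicle center in consecutive samples.
    dt_seconds: time elapsed between the two samples.
    """
    displacement_pixels = math.dist(center_a, center_b)
    displacement_feet = displacement_pixels * FEET_PER_PIXEL
    feet_per_second = displacement_feet / dt_seconds
    return feet_per_second * 3600.0 / 5280.0  # convert feet per second to miles per hour

# Example: the center moves 40 pixels in 0.5 s -> 10 feet in 0.5 s -> about 13.6 mph.
print(round(infer_speed_mph((100, 200), (140, 200), 0.5), 1))
```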
- a system may include a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate a near miss/collision detection service.
- the near miss/collision detection service may detect objects in a frame, analyze pairs of the objects, determine that a group of conditions are met, and determine that a near miss/collision has occurred.
- the group of conditions may include that a distance between an object pair of the pairs of the objects is less than a distance threshold.
- the distance threshold may be approximately 7 feet. By way of illustration, approximately 7 feet may be within 6-8 feet.
- the group of conditions may include that a speed of an object of an object pair of the pairs of the objects is greater than a speed threshold.
- the speed threshold may be approximately zero miles per hour. By way of illustration, approximately zero may be within 1 mile per hour of zero.
- the group of conditions may include that an angle between an object pair of the pairs of the objects is higher than an angle threshold.
- the angle threshold may be approximately 12 degrees. By way of illustration, approximately 12 degrees may be within 11-13 degrees.
- the group of conditions may include that the objects of an object pair of the pairs of the objects are not both coming from the same approach. In some implementations, the group of conditions may include that one of an object pair of the pairs of the objects was not previously determined to be involved in another near miss/collision. In a number of implementations, the group of conditions may include that a sum of previous speeds is higher than zero for both of an object pair of the pairs of the objects.
- the group of conditions may include a first group of conditions for a first object pair of the pairs of the objects that are both vehicles and a second group of conditions for a second object pair of the pairs of the objects that include a vehicle and a pedestrian.
- the second group of conditions may include a lower distance threshold than the first group of conditions.
- the second group of conditions may include no condition related to an angle between the vehicle and the pedestrian.
- the second group of conditions may include a higher speed threshold than the first group of conditions, and the second group of conditions may evaluate the vehicle according to the higher speed threshold.
- the near miss/collision detection service may determine a conversion factor between pixels and a speed measurement.
- the speed measurement may be in miles per hour.
- the system may further include adding a near miss/collision indicator to the frame.
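A minimal sketch of the pairwise condition checks described above is shown below. The object fields (speed_mph, heading_degrees, approach, previous_speeds, in_near_miss), the caller-supplied distance function, the halved distance threshold for the vehicle-pedestrian case, and the 5 mph vehicle speed threshold are assumptions for illustration; the threshold constants reflect the example values given above (approximately 7 feet, approximately zero miles per hour, approximately 12 degrees).

```python
from dataclasses import dataclass, field
from itertools import combinations

DISTANCE_THRESHOLD_FEET = 7.0
SPEED_THRESHOLD_MPH = 0.0
ANGLE_THRESHOLD_DEGREES = 12.0

@dataclass
class TrackedObject:
    tracker_id: int
    cls: str                    # e.g. "vehicle" or "pedestrian"
    approach: str               # approach the object is associated with
    speed_mph: float            # current speed
    heading_degrees: float      # current heading
    previous_speeds: list = field(default_factory=list)
    in_near_miss: bool = False  # previously determined to be involved in another near miss/collision

def vehicle_vehicle_near_miss(a, b, distance_feet):
    # Angle between the two objects' headings, folded into [0, 180].
    angle = abs(a.heading_degrees - b.heading_degrees) % 360
    angle = min(angle, 360 - angle)
    return (
        distance_feet < DISTANCE_THRESHOLD_FEET
        and max(a.speed_mph, b.speed_mph) > SPEED_THRESHOLD_MPH  # at least one object is moving
        and angle > ANGLE_THRESHOLD_DEGREES
        and a.approach != b.approach                             # not both coming from the same approach
        and not (a.in_near_miss or b.in_near_miss)               # not already in another near miss/collision
        and sum(a.previous_speeds) > 0
        and sum(b.previous_speeds) > 0
    )

def vehicle_pedestrian_near_miss(vehicle, pedestrian, distance_feet):
    # Lower distance threshold, no angle condition, higher speed threshold applied to the vehicle.
    return (
        distance_feet < DISTANCE_THRESHOLD_FEET / 2
        and vehicle.speed_mph > 5.0
        and not (vehicle.in_near_miss or pedestrian.in_near_miss)
    )

def detect_near_misses(objects, distance_fn):
    """Analyze every pair of objects detected in a frame and flag near misses/collisions."""
    flagged = []
    for a, b in combinations(objects, 2):
        d = distance_fn(a, b)
        kinds = {a.cls, b.cls}
        if kinds == {"vehicle"} and vehicle_vehicle_near_miss(a, b, d):
            flagged.append((a.tracker_id, b.tracker_id))
        elif kinds == {"vehicle", "pedestrian"}:
            vehicle, ped = (a, b) if a.cls == "vehicle" else (b, a)
            if vehicle_pedestrian_near_miss(vehicle, ped, d):
                flagged.append((a.tracker_id, b.tracker_id))
    return flagged
```

Each flagged pair could then be recorded and a near miss/collision indicator added to the frame, as noted above.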
- a method of near miss/collision detection may include obtaining traffic data, analyzing the traffic data, determining that a near miss/collision occurred, and responding to the near miss/collision.
- responding to the near miss/collision may include determining that the near miss/collision is a collision.
- the method may further include transmitting an alert regarding the collision.
- a system for traffic monitoring, analysis, and prediction may include a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate at least one service.
- the at least one service may obtain traffic data, perform object detection and classification, determine structured data, calculate metrics using the structured data, prepare processed data for visualization from the metrics, and present the prepared processed data via at least one dashboard.
- the at least one service may determine the structured data by determining a frame number for a frame of video, determining an intersection identifier for the frame of video, assigning a unique tracker identifier to each object detected in the frame of video, and determining coordinates of each object detected in the frame of video. In a number of such examples, the at least one service further may determine the structured data by determining the class of each object detected in the frame of video.
- the at least one service may calculate the metrics using the structured data by calculating a difference between one or more x or y positions for an object in different frames of video. In some such examples, the at least one service may use the difference along with times respectively associated with the different frames to calculate at least one of the metrics that is associated with a speed of the object. In various such examples, the speed may be an average speed of the object during the video or a cumulative speed of the object.
- the at least one service may calculate the metrics using the structured data by correlating a traffic light phase determined for a frame of video with a determination that an object arrived at the intersection in the frame.
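The following Python sketch illustrates one plausible shape for the structured data and for the speed and light-status calculations described above. The field names, the pixels-per-foot conversion factor, and the frame rate are assumptions for illustration, not values prescribed by the disclosure.

```python
from dataclasses import dataclass

# Hypothetical conversion factor between pixels and distance; in practice such a factor
# (and the resulting speed measurement in miles per hour) would be calibrated per camera.
PIXELS_PER_FOOT = 4.0

@dataclass
class StructuredRecord:
    frame_number: int
    intersection_id: str
    tracker_id: int     # unique tracker identifier assigned to a detected object
    x: float            # object coordinates in the frame (pixels)
    y: float
    cls: str            # object class, e.g. "car", "pedestrian"
    light_phase: str    # traffic light phase determined for this frame

def average_speed_mph(records, fps=30.0):
    """Average speed of one tracked object, from x/y differences between frames and frame times."""
    records = sorted(records, key=lambda r: r.frame_number)
    total_feet, total_seconds = 0.0, 0.0
    for prev, curr in zip(records, records[1:]):
        pixels = ((curr.x - prev.x) ** 2 + (curr.y - prev.y) ** 2) ** 0.5
        total_feet += pixels / PIXELS_PER_FOOT
        total_seconds += (curr.frame_number - prev.frame_number) / fps
    return (total_feet / total_seconds) * 3600.0 / 5280.0 if total_seconds else 0.0

def light_status_on_arrival(records, arrival_frame):
    """Correlate the traffic light phase with the frame in which the object arrived."""
    for r in records:
        if r.frame_number == arrival_frame:
            return r.light_phase
    return None
```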
- a system for traffic monitoring, analysis, and prediction may include a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate at least one service.
- the at least one service may retrieve structured data determined from point cloud data from LiDAR sensors used to monitor traffic, calculate metrics using the structured data, prepare processed data for visualization from the metrics, and present the prepared processed data via at least one dashboard.
- the metrics may include at least one of vehicle volume, average speed, distance travelled, pedestrian volume, non-motor volume, light status on arrival, arrival phase, a route through an intersection, or a light time.
- the at least one service may summon at least one vehicle using at least one of the metrics or the processed data.
- the at least one service may track near misses/collisions using at least one of the metrics or the processed data.
- the at least one service may determine a fastest route using at least one of the metrics or the processed data.
- the at least one service may control traffic signals to prioritize traffic using at least one of the metrics or the processed data.
- the at least one service may determine a most efficient route using at least one of the metrics or the processed data.
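As one hypothetical way to determine a fastest or most efficient route from such metrics, travel times between intersections (derived, for example, from average speeds, distances travelled, and light times) could feed a standard shortest-path search; the graph and travel times below are purely illustrative.

```python
import heapq

def fastest_route(graph, start, goal):
    """Dijkstra's shortest-path search over a graph of intersections.

    graph: {intersection: {neighbor: travel_time_seconds}}.
    """
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, travel_time in graph.get(node, {}).items():
            if neighbor not in seen:
                heapq.heappush(queue, (cost + travel_time, neighbor, path + [neighbor]))
    return float("inf"), []

# Illustrative travel times (seconds) between monitored intersections.
graph = {
    "INT-01": {"INT-02": 45.0, "INT-03": 60.0},
    "INT-02": {"INT-04": 50.0},
    "INT-03": {"INT-04": 30.0},
    "INT-04": {},
}
print(fastest_route(graph, "INT-01", "INT-04"))  # (90.0, ['INT-01', 'INT-03', 'INT-04'])
```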
- a system for traffic monitoring, analysis, and prediction may include a memory allocation configured to store at least one executable asset and a processor allocation configured to access the memory allocation and execute the at least one executable asset to instantiate at least one service.
- the at least one service may construct a digital twin of an area of interest, retrieve structured data determined from traffic data for the area of interest, calculate metrics using the structured data, prepare processed data for visualization from the metrics, and present the prepared processed data in the context of the digital twin via at least one dashboard that displays the digital twin.
- the at least one service may simulate traffic via the at least one dashboard using the processed data.
- the at least one service may simulate how a change affects traffic patterns.
- the change may alter at least one of a simulation of the traffic, a traffic signal, or a traffic condition.
- the digital twin may include multiple intersections.
- the at least one dashboard may include indicators selectable to display information for each of the multiple intersections.
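Purely as a toy illustration of simulating how a change (here, a longer green phase) affects a traffic condition (average delay), and not as a description of the digital twin itself, which would rely on far richer models and the real structured data discussed above:

```python
import random

def simulate_average_delay(green_seconds, cycle_seconds=90, arrivals_per_hour=600, hours=1, seed=0):
    """Toy fixed-time signal model: vehicles arriving during red wait until the next green."""
    random.seed(seed)
    total_delay, count = 0.0, 0
    for _ in range(int(arrivals_per_hour * hours)):
        t = random.uniform(0, 3600 * hours)
        phase = t % cycle_seconds
        delay = 0.0 if phase < green_seconds else cycle_seconds - phase
        total_delay += delay
        count += 1
    return total_delay / count

print(round(simulate_average_delay(green_seconds=40), 1))
print(round(simulate_average_delay(green_seconds=55), 1))  # longer green phase -> lower average delay
```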
- Traffic data may be obtained, such as video from intersection cameras.
- Object detection and classification may be performed using the data.
- Structured data may be determined and/or output using the detected and classified objects.
- Metrics may be calculated using the structured data.
- Processed data may be prepared for visualization and/or other uses. The prepared processed data may be presented via one or more dashboards and/or the prepared processed data may be otherwise used.
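A compact, runnable sketch of that end-to-end flow appears below; every function is a trivial placeholder standing in for the corresponding stage described above, not an actual API of the disclosed system.

```python
# Minimal stand-in stages so the sketch runs end to end; real stages would be far richer.
def obtain_traffic_data(source):
    return [{"frame": 1, "source": source}]

def detect_and_classify(frames):
    return [{"frame": f["frame"], "tracker_id": 1, "cls": "car", "x": 10, "y": 20} for f in frames]

def determine_structured_data(detections):
    return detections

def calculate_metrics(structured):
    return {"vehicle_volume": sum(1 for r in structured if r["cls"] == "car")}

def prepare_for_visualization(metrics):
    return {"summary": metrics}

def run_pipeline(video_source, dashboards):
    traffic_data = obtain_traffic_data(video_source)     # obtain traffic data, e.g. intersection video
    detections = detect_and_classify(traffic_data)       # object detection and classification
    structured = determine_structured_data(detections)   # structured data from detected/classified objects
    metrics = calculate_metrics(structured)               # metrics calculated from the structured data
    processed = prepare_for_visualization(metrics)        # processed data prepared for visualization
    for dashboard in dashboards:
        dashboard(processed)                              # present via one or more dashboards
    return processed

run_pipeline("intersection-camera-01", dashboards=[print])
```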
- the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed is an example of a sample approach. In other embodiments, the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
- the described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure.
- a non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer).
- the non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read-only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Traffic Control Systems (AREA)
Abstract
One or more devices obtain traffic data, such as video from intersection cameras. The one or more devices perform object detection and classification using the data. The one or more devices determine and/or output structured data using the detected and classified objects. The one or more devices calculate metrics using the structured data. The one or more devices may prepare processed data for visualization and/or other uses. The one or more devices may present the prepared processed data via one or more dashboards and/or otherwise use the prepared processed data.
Applications Claiming Priority (10)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202163248948P | 2021-09-27 | 2021-09-27 | |
| US63/248,948 | 2021-09-27 | ||
| US202263315200P | 2022-03-01 | 2022-03-01 | |
| US63/315,200 | 2022-03-01 | ||
| US202263318442P | 2022-03-10 | 2022-03-10 | |
| US63/318,442 | 2022-03-10 | ||
| US202263320010P | 2022-03-15 | 2022-03-15 | |
| US63/320,010 | 2022-03-15 | ||
| US17/952,068 US20230097373A1 (en) | 2021-09-27 | 2022-09-23 | Traffic monitoring, analysis, and prediction |
| US17/952,068 | 2022-09-23 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023049453A1 (fr) | 2023-03-30 |
Family
ID=83900319
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2022/044733 Ceased WO2023049453A1 (fr) | 2021-09-27 | 2022-09-26 | Surveillance, analyse et prédiction de trafic |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2023049453A1 (fr) |
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180253973A1 (en) * | 2017-03-03 | 2018-09-06 | Kennesaw State University Research And Service Foundation, Inc. | Real-time video analytics for traffic conflict detection and quantification |
| US20200388156A1 (en) * | 2018-03-19 | 2020-12-10 | Derq Inc. | Early warning and collision avoidance |
| WO2021061488A1 (fr) * | 2019-09-27 | 2021-04-01 | Zoox, Inc. | Cadriciel d'analyse de sécurité |
| WO2021073716A1 (fr) * | 2019-10-14 | 2021-04-22 | Huawei Technologies Co., Ltd. | Raisonneur de trafic |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| NL2036060A (en) * | 2023-10-17 | 2024-01-25 | Hebei Transp Investment Group Company Limited | Laser Radar-based Adaptive Traffic Flow Measurement System and Method |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230097373A1 (en) | Traffic monitoring, analysis, and prediction | |
| US12450959B2 (en) | Systems and methods for managing vehicle data | |
| US11941887B2 (en) | Scenario recreation through object detection and 3D visualization in a multi-sensor environment | |
| US9583000B2 (en) | Vehicle-based abnormal travel event detecting and reporting | |
| CN114174949B (zh) | Determining a state of infrastructure in an area of interest | |
| WO2022046469A1 (fr) | Systèmes et procédés de gestion de données de véhicule | |
| US11024169B2 (en) | Methods and systems for utilizing vehicles to investigate events | |
| US10733541B2 (en) | System, method, and recording medium for geolocation data discovery in streaming texts | |
| US20250328979A1 (en) | Providing dynamic alternate location transportation modes and user interfaces within multi-pickup-location area geofences | |
| US20240161621A1 (en) | Systems that predict accidents and ameliorate predicted accidents | |
| Iqbal et al. | An efficient traffic incident detection and classification framework by leveraging the efficacy of model stacking | |
| US11657100B2 (en) | Cognitively rendered event timeline display | |
| Ma et al. | Vehicle-based machine vision approaches in intelligent connected system | |
| US12094041B2 (en) | Restoration of a kinetic event using video | |
| WO2023049453A1 (fr) | Traffic monitoring, analysis, and prediction | |
| Shukla et al. | Real-time parking space detection and management with artificial intelligence and deep learning system | |
| US10652708B1 (en) | System and method for reporting observed events/objects from smart vehicles | |
| CN116975120A (zh) | Data processing method and apparatus, computer device, and storage medium | |
| EP2093999A1 (fr) | Integration of video information | |
| Parygin et al. | Management of Information from Surveillance Cameras at the Infrastructure Facility | |
| Schicktanz et al. | Detection and analysis of critical interactions in illegal u-turns at an urban signalized intersection | |
| Huang et al. | Technical and economic feasibility assessment of a cloud-enabled traffic video analysis framework | |
| EP4567766A1 (fr) | System and method for providing consolidated events in a traffic management command center using situational awareness | |
| WO2023049461A1 (fr) | Near miss/collision detection system and method for traffic | |
| CN119271915B (zh) | Cloud-native distributed architecture-based CIM platform and platform management method | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22793015; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 22793015; Country of ref document: EP; Kind code of ref document: A1 |