WO2020232732A1 - Systems and methods for predicting traffic information - Google Patents
Systems and methods for predicting traffic information
- Publication number
- WO2020232732A1 (PCT/CN2019/088956)
- Authority
- WO
- WIPO (PCT)
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/0442—Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0495—Quantised networks; Sparse networks; Compressed networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0129—Traffic data processing for creating historical data or processing based on historical data
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0133—Traffic data processing for classifying traffic situation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0141—Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/065—Traffic control systems for road vehicles by counting the vehicles in a section of the road or in a parking area, i.e. comparing incoming count with outgoing count
Definitions
- the present disclosure generally relates to systems and methods for predicting traffic information of road links, and in particular, to systems and methods for predicting the traffic information of the road links based on a trained recurrent neural network model.
- a traffic prediction platform can predict traffic information (i.e., traffic conditions, e.g., traffic volume, traffic congestion) of one or more road links at a target time point.
- the traffic prediction platform usually uses a linear model or a tree model to predict the traffic conditions based on historical traffic data (e.g., historical traffic volume, historical traffic congestion) .
- the traffic conditions at different time points may be interdependent. For example, if a link is congested at 8:58 a.m., it is likely that the link will be congested at 9:00 a.m., and was congested at 8:56 a.m., etc.
- the traffic prediction platform can also use an Autoregressive Integrated Moving Average (ARIMA) model to predict the traffic conditions based on historical time series traffic data.
- the ARIMA model can only predict the traffic conditions based on a single type of historical traffic data, but it cannot concatenate different types of historical traffic data to predict comprehensive traffic conditions, which may result in unreliable prediction results.
- the ARIMA model can only predict traffic conditions based on historical traffic volume data associated with a link, but cannot predict the traffic conditions based on historical traffic volume data associated with a link and historical average speed data of vehicles associated with the link. Therefore, it is desirable to provide systems and methods for predicting traffic information with a high level of precision, efficiency, and comprehensiveness, e.g., by using a neural network model based on different types of data.
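- To make the contrast concrete, the sketch below (illustrative only, with made-up numbers) shows how two different types of historical traffic data can be stacked into one multivariate input matrix that a recurrent model can consume, whereas a univariate ARIMA model would accept only one of the series at a time.

```python
# Illustrative sketch (not from the patent): stacking several historical
# series into a single (time_steps, features) matrix. The values below are
# hypothetical placeholder data.
import numpy as np

historical_volume = np.array([120, 135, 150, 160, 155])          # vehicles per interval
historical_avg_speed = np.array([45.0, 42.5, 38.0, 30.0, 33.5])  # km/h

# Each row is one reference time point; each column is one dynamic information item.
multivariate_input = np.stack([historical_volume, historical_avg_speed], axis=1)
print(multivariate_input.shape)  # (5, 2) -> 5 time points, 2 feature types
```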
- a system may include at least one storage medium and at least one processor in communication with the at least one storage medium.
- the at least one storage medium may include a set of instructions.
- the at least one processor may be directed to: determine a target link having a linkID and a target future time point; obtain a plurality of reference information packages corresponding to a plurality of reference time points associated with the target link, wherein each of the plurality of reference information packages corresponds to a reference time point and includes at least one static information item and at least one dynamic information item related to the reference time point; and determine target traffic information of the target link at the target future time point using a trained recurrent neural network model based on the plurality of reference information packages.
- the at least one static information item may include one of: the linkID of the target link, a speed limit of the target link, a link type of the target link, a length of the target link, a width of the target link, a count of lanes of the target link, or whether the target link is near to an intersection.
- the at least one dynamic information item may include one of: weather information of the target link at the reference time point, traffic volume of the target link at the reference time point, an average driving speed of vehicles along the target link at the reference time point, a traffic congestion condition of the target link at the reference time point, or an average passing time of vehicles along the target link at the reference time point.
- the plurality of reference time points may include one or more time points in a current day of the target future time point and one or more corresponding time points in at least one day prior to the target future time point.
- the target traffic information of the target link at the target future time point may include one of: a target average driving speed of vehicles along the target link at the target future time point, a target traffic congestion condition of the target link at the target future time point, target traffic volume of the target link at the target future time point or a target average passing time of vehicles along the target link at the target future time point.
- the at least one processor may be further directed to: supplement at least one of the plurality of reference information packages if at least one information item is missing from the reference information packages.
- the trained recurrent neural network model may be generated with a training process, and the training process may include: obtaining a plurality of sets of historical reference information packages corresponding to a plurality of first historical time points associated with a plurality of reference links, wherein each set of the historical reference information packages includes one or more historical reference information packages corresponding to one or more second historical time points associated with a reference link, each of the one or more historical reference information packages corresponds to a second historical time point associated with the reference link and includes a plurality of historical information items, including at least one historical static information item and at least one historical dynamic information item related to the historical time point; arranging the plurality of sets of historical reference information packages in a temporal order; and training a preliminary neural network model based on the historical information items to generate the trained recurrent neural network model.
- the at least one historical static information item may include at least one of: a linkID of the reference link, a speed limit of the reference link, a link type of the reference link, a length of the reference link, a width of the reference link, a count of lanes of the reference link, or whether the reference link was near to an intersection.
- the at least one historical dynamic information item may include at least one of: historical weather information of the reference link at the second historical time point, historical traffic volume of the reference link at the second historical time point, a historical average driving speed of vehicles along the reference link at the second historical time point, a historical traffic congestion condition of the reference link at the second historical time point, or a historical average passing time of vehicles along the reference link at the second historical time point.
- training a preliminary neural network model based on the historical information items to generate the trained recurrent neural network model may include: determining training traffic information corresponding to the each set of the historical reference information packages based on the preliminary neural network model and the historical information items; determining whether the training traffic information satisfies a predetermined condition; and designating the preliminary neural network model as the trained recurrent neural network model in response to a determination that the training traffic information satisfies the predetermined condition.
- the preliminary neural network model may at least include an input layer, a long short-term memory layer, a fully connected layer, an output layer, and an embedding layer; and determining training traffic information comprises: sparsing the at least one historical static information item for each of the plurality of sets of historical reference information packages; dividing the plurality of sets of historical reference information packages into one or more groups, and for each of the one or more groups, inputting the historical dynamic information items of the group into the input layer; embedding the sparsed historical static information items of the group into the embedding layer; and outputting the training traffic information corresponding to the each of the plurality of sets of historical reference information packages.
- the preliminary neural network model may at least include an input layer, a long short-term memory layer, a fully connected layer, an output layer, and an embedding layer; and determining training traffic information comprises: sparsing the at least one historical static information item for the each of the plurality of sets of historical reference information packages; dividing the plurality of sets of historical reference information packages into one or more groups, and for each of the one or more groups, concatenating the sparsed historical static information items of the group and the historical dynamic information items of the group; inputting the concatenated historical information items into the input layer; embedding the sparsed historical static information items into the embedding layer; and outputting the training traffic information corresponding to the each of the plurality of sets of historical reference information packages.
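- As a rough illustration of the layer arrangement described above (input, embedding, long short-term memory, fully connected, and output layers), the following PyTorch sketch embeds a sparsed static item (the linkID) and feeds the dynamic items through an LSTM. The dimensions, vocabulary size, and the way the embedding is combined with the recurrent state are assumptions for the example, not the patent's exact design.

```python
# A minimal sketch of an LSTM-based traffic model with an embedding for a
# static item (linkID) and a recurrent path for dynamic items.
import torch
import torch.nn as nn

class TrafficRNN(nn.Module):
    def __init__(self, num_links, static_embed_dim=16, dynamic_dim=5, hidden_dim=64):
        super().__init__()
        # "Sparsed" static items (e.g., the linkID) are mapped to dense vectors.
        self.link_embedding = nn.Embedding(num_links, static_embed_dim)
        # Dynamic items of each reference time point feed the recurrent layer.
        self.lstm = nn.LSTM(input_size=dynamic_dim, hidden_size=hidden_dim,
                            batch_first=True)
        # Fully connected layer combines the recurrent state with the embedding.
        self.fc = nn.Linear(hidden_dim + static_embed_dim, 32)
        self.output = nn.Linear(32, 1)  # e.g., predicted average speed

    def forward(self, dynamic_seq, link_ids):
        # dynamic_seq: (batch, time_steps, dynamic_dim); link_ids: (batch,)
        _, (h_n, _) = self.lstm(dynamic_seq)
        static_vec = self.link_embedding(link_ids)
        combined = torch.cat([h_n[-1], static_vec], dim=1)
        return self.output(torch.relu(self.fc(combined)))

model = TrafficRNN(num_links=1000)
pred = model(torch.randn(8, 30, 5), torch.randint(0, 1000, (8,)))
print(pred.shape)  # torch.Size([8, 1])
```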
- the at least one processor may be further directed to: gather target traffic information of a plurality of target links at the target future time point, perform route planning, determine Estimated Times of Arrival (ETAs), perform transportation scheduling, or determine a travelling price of a travelling route based on the target traffic information of the plurality of target links.
- the at least one processor may be further directed to: transmit signals to a terminal device, the signals directing the terminal device to display the target traffic information of the plurality of target links, a planned route, the Estimated Times of Arrival (ETAs), the travelling price, or scheduled transportations.
- a method may be implemented on a computing device having at least one processor, at least one storage medium, and a communication platform connected to a network.
- the method may include determining a target link having a linkID and a target future time point; obtaining a plurality of reference information packages corresponding to a plurality of reference time points associated with the target link, wherein each of the plurality of reference information packages corresponds to a reference time point and includes at least one static information item and at least one dynamic information item related to the reference time point; and determining target traffic information of the target link at the target future time point using a trained recurrent neural network model based on the plurality of reference information packages.
- the at least one static information item may include one of: the linkID of the target link, a speed limit of the target link, a link type of the target link, a length of the target link, a width of the target link, a count of lanes of the target link, or whether the target link is near to an intersection.
- the at least one dynamic information item may include one of: weather information of the target link at the reference time point, traffic volume of the target link at the reference time point, an average driving speed of vehicles along the target link at the reference time point, a traffic congestion condition of the target link at the reference time point, or an average passing time of vehicles along the target link at the reference time point.
- the plurality of reference time points may include one or more time points in a current day of the target future time point and one or more corresponding time points in at least one day prior to the target future time point.
- the target traffic information of the target link at the target future time point may include one of: a target average driving speed of vehicles along the target link at the target future time point, a target traffic congestion condition of the target link at the target future time point, target traffic volume of the target link at the target future time point or a target average passing time of vehicles along the target link at the target future time point.
- the method may further include: supplementing at least one of the plurality of reference information packages if at least one information item is missing from the reference information packages.
- the trained recurrent neural network model may be generated with a training process, and the training process may include: obtaining a plurality of sets of historical reference information packages corresponding to a plurality of first historical time points associated with a plurality of reference links, wherein each set of the historical reference information packages includes one or more historical reference information packages corresponding to one or more second historical time points associated with a reference link, each of the one or more historical reference information packages corresponds to a second historical time point associated with the reference link and includes a plurality of historical information items, including at least one historical static information item and at least one historical dynamic information item related to the historical time point; arranging the plurality of sets of historical reference information packages in a temporal order; and training a preliminary neural network model based on the historical information items to generate the trained recurrent neural network model.
- the at least one historical static information item may include at least one of: a linkID of the reference link, a speed limit of the reference link, a link type of the reference link, a length of the reference link, a width of the reference link, a count of lanes of the reference link, or whether the reference link was near to an intersection.
- the at least one historical dynamic information item may include at least one of: historical weather information of the reference link at the second historical time point, historical traffic volume of the reference link at the second historical time point, a historical average driving speed of vehicles along the reference link at the second historical time point, a historical traffic congestion condition of the reference link at the second historical time point, or a historical average passing time of vehicles along the reference link at the second historical time point.
- training a preliminary neural network model based on the historical information items to generate the trained recurrent neural network model may include: determining training traffic information corresponding to the each set of the historical reference information packages based on the preliminary neural network model and the historical information items; determining whether the training traffic information satisfies a predetermined condition; and designating the preliminary neural network model as the trained recurrent neural network model in response to a determination that the training traffic information satisfies the predetermined condition.
- the preliminary neural network model may at least include an input layer, a long short-term memory layer, a fully connected layer, an output layer, and an embedding layer; and determining training traffic information may include: sparsing the at least one historical static information item for each of the plurality of sets of historical reference information packages; dividing the plurality of sets of historical reference information packages into one or more groups, and for each of the one or more groups, inputting the historical dynamic information items of the group into the input layer; embedding the sparsed historical static information items of the group into the embedding layer; and outputting the training traffic information corresponding to the each of the plurality of sets of historical reference information packages.
- the preliminary neural network model may at least include an input layer, a long short-term memory layer, a fully connected layer, an output layer, and an embedding layer; and determining training traffic information may include: sparsing the at least one historical static information item for the each of the plurality of sets of historical reference information packages; dividing the plurality of sets of historical reference information packages into one or more groups, and for each of the one or more groups, concatenating the sparsed historical static information items of the group and the historical dynamic information items of the group; inputting the concatenated historical reference information items into the input layer; embedding the sparsed historical static information items into the embedding layer; and outputting the training traffic information corresponding to the each of the plurality of sets of historical reference information packages.
- the method may further include: gathering target traffic information of a plurality of target links at the target future time point, performing route planning, determining Estimated Times of Arrival (ETAs), performing transportation scheduling, or determining a travelling price of a travelling route based on the target traffic information of the plurality of target links.
- the method may further include transmitting signals to a terminal device, the signals directing the terminal device to display the target traffic information of the plurality of target links, a planned route, the Estimated Times of Arrival (ETAs), the travelling price, or scheduled transportations.
- a non-transitory computer readable medium including executable instructions that, when executed by at least one processor, may direct the at least one processor to perform a method.
- the method may include: determining a target link having a linkID and a target future time point; obtaining a plurality of reference information packages corresponding to a plurality of reference time points associated with the target link, wherein each of the plurality of reference information packages corresponds to a reference time point and includes at least one static information item and at least one dynamic information item related to the reference time point; and determining target traffic information of the target link at the target future time point using a trained recurrent neural network model based on the plurality of reference information packages.
- FIG. 1 is a schematic diagram illustrating an exemplary traffic information prediction system according to some embodiments of the present disclosure
- FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device according to some embodiments of the present disclosure
- FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device according to some embodiments of the present disclosure
- FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure
- FIG. 5 is a flowchart illustrating an exemplary process for determining target traffic information of a target link at a target future time point according to some embodiments of the present disclosure
- FIG. 6 is a flowchart illustrating an exemplary process for training a preliminary neural network model according to some embodiments of the present disclosure
- FIG. 7 is a flowchart illustrating an exemplary process for determining a trained recurrent neural network model according to some embodiments of the present disclosure
- FIG. 8 is a flowchart illustrating an exemplary process for determining training traffic information of a set of reference information packages according to some embodiments of the present disclosure.
- FIG. 9 is a flowchart illustrating an exemplary process for determining training traffic information of a set of reference information packages according to some embodiments of the present disclosure.
- the flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts may be implemented out of order. Conversely, the operations may be implemented in inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts. One or more operations may be removed from the flowcharts.
- the system or method of the present disclosure may be applied to transportation systems of different environments including land, ocean, aerospace, or the like, or any combination thereof.
- the vehicle of the transportation systems may include a taxi, a private car, a hitch, a bus, a train, a bullet train, a high-speed rail, a subway, a vessel, an aircraft, a spaceship, a hot-air balloon, a driverless vehicle, or the like, or any combination thereof.
- the transportation system may also include any transportation system for management and/or distribution, for example, a system for sending and/or receiving an express.
- the application of the system or method of the present disclosure may include a webpage, a plug-in of a browser, a client terminal, a custom system, an internal analysis system, an artificial intelligence robot, or the like, or any combination thereof.
- the positioning technology used in the present disclosure may be based on a global positioning system (GPS), a global navigation satellite system (GLONASS), a compass navigation system (COMPASS), a Galileo positioning system, a quasi-zenith satellite system (QZSS), a wireless fidelity (WiFi) positioning technology, or the like, or any combination thereof.
- An aspect of the present disclosure relates to systems and methods for predicting traffic information of a target link at a target future time point.
- the system may obtain a plurality of reference information packages corresponding to a plurality of reference time points associated with the target link.
- Each of the plurality of reference information packages may correspond to a reference time point and may include a plurality of information items (e.g., at least one static information item and at least one dynamic information item related to the reference time point) .
- the system may determine target traffic information of the target link at the target future time point using a trained recurrent neural network model based on the plurality of reference information packages.
- the plurality of information items included in the each of the reference information packages may be input into the trained recurrent neural network model in a temporal sequence.
- the trained recurrent neural network model may be trained based on a plurality of sets of historical reference information packages corresponding to a plurality of first historical time points associated with a plurality of reference links.
- the trained recurrent neural network model can use different types of information items to predict the target traffic information accurately and efficiently.
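- A hedged, high-level sketch of this prediction flow is given below. `DummyModel`, the package fields, and the returned values are placeholders standing in for the trained recurrent neural network model and its inputs; they are not defined by the patent.

```python
# High-level sketch of the prediction flow with stand-in data structures.
from datetime import datetime

class DummyModel:
    def predict(self, ordered_packages):
        # Placeholder: a real model would run the packages through an RNN.
        return {"avg_speed_kmh": 42.0, "congestion": "flowing"}

def predict_target_traffic(link_id, target_time, model, packages):
    # target_time would condition a real model; the stub ignores it.
    # Packages are fed to the model in temporal order (earliest first).
    ordered = sorted(packages, key=lambda p: p["time"])
    return model.predict(ordered)

packages = [
    {"time": datetime(2019, 5, 20, 9, 0), "speed_limit": 60, "volume": 120},
    {"time": datetime(2019, 5, 20, 8, 58), "speed_limit": 60, "volume": 110},
]
print(predict_target_traffic("L42", datetime(2019, 5, 20, 9, 30), DummyModel(), packages))
```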
- FIG. 1 is a schematic diagram illustrating an exemplary traffic information prediction system according to some embodiments of the present disclosure.
- the traffic information prediction system 100 may include a server 110, a network 120, a user terminal 130, and a storage 140.
- the server 110 may be a single server, or a server group.
- the server group may be centralized, or distributed (e.g., server 110 may be a distributed system) .
- the server 110 may be local or remote.
- the server 110 may access information and/or data stored in the user terminal 130 and/or the storage 140 via the network 120.
- the server 110 may be directly connected to the user terminal 130 and/or the storage 140 to access stored information and/or data.
- the server 110 may be implemented on a cloud platform.
- the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
- the server 110 may be implemented on a computing device 200 having one or more components illustrated in FIG. 2.
- the server 110 may include a processing engine 112.
- the processing engine 112 may process information and/or data to perform one or more functions described in the present disclosure. For example, the processing engine 112 may determine target traffic information of a target link at a target future time point using a trained recurrent neural network model.
- the processing engine 112 may include one or more processing engines (e.g., single-core processing engine(s) or multi-core processor(s)).
- the processing engine 112 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction-set computer (RISC), a microprocessor, or the like, or any combination thereof.
- the network 120 may facilitate exchange of information and/or data.
- one or more components of the traffic information prediction system 100 (e.g., the server 110, the user terminal 130, or the storage 140) may exchange information and/or data with other components of the traffic information prediction system 100 via the network 120.
- the server 110 may obtain a plurality of sets of historical reference information packages corresponding to a plurality of first historical time points associated with a plurality of reference links from the storage 140 via the network 120.
- the server 110 may use the plurality of sets of historical reference information packages to train a preliminary neural network model.
- the network 120 may be any type of wired or wireless network, or any combination thereof.
- the network 120 may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination thereof.
- the network 120 may include one or more network access points.
- the network 120 may include wired or wireless network access points such as base stations and/or internet exchange points 120-1, 120-2, ..., through which one or more components of the traffic information prediction system 100 may be connected to the network 120 to exchange data and/or information.
- the user terminal 130 may be associated with users (e.g., drivers, passengers, meal-delivery men, couriers) of the traffic information prediction system 100.
- the user terminal 130 may include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, a built-in device in a vehicle 130-4, or the like, or any combination thereof.
- the mobile device 130-1 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
- the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
- the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, a smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
- the smart mobile device may include a smartphone, a personal digital assistant (PDA), a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof.
- the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof.
- the virtual reality device and/or the augmented reality device may include a Google Glass™, an Oculus Rift™, a Hololens™, a Gear VR™, etc.
- a built-in device in the vehicle 130-4 may include an onboard computer, an onboard television, etc.
- the user terminal 130 may be a device with positioning technology for locating the location of the user and/or the user terminal 130.
- the storage 140 may store data and/or instructions. In some embodiments, the storage 140 may store data obtained from the user terminal 130. In some embodiments, the storage 140 may store data and/or instructions that the server 110 may execute or use to perform exemplary methods described in the present disclosure. In some embodiments, the storage 140 may include a mass storage, a removable storage, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. Exemplary mass storage may include a magnetic disk, an optical disk, a solid-state drive, etc. Exemplary removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
- Exemplary volatile read-and-write memory may include a random access memory (RAM) .
- RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), and a zero-capacitor RAM (Z-RAM), etc.
- Exemplary ROM may include a mask ROM (MROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a compact disk ROM (CD-ROM), and a digital versatile disk ROM, etc.
- the storage 140 may be implemented on a cloud platform.
- the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
- the storage 140 may be connected to the network 120 to communicate with one or more components of the traffic information prediction system 100 (e.g., the server 110, the user terminal 130) .
- One or more components of the traffic information prediction system 100 may access the data and/or instructions stored in the storage 140 via the network 120.
- the storage 140 may be directly connected to or communicate with one or more components of the traffic information prediction system 100 (e.g., the server 110, the user terminal 130) .
- the storage 140 may be part of the server 110.
- FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device 200 according to some embodiments of the present disclosure.
- the server 110 and/or the user terminal 130 may be implemented on the computing device 200.
- the processing engine 112 may be implemented on the computing device 200 and configured to perform functions of the processing engine 112 disclosed in this disclosure.
- the computing device 200 may be used to implement any component of the traffic information prediction system 100 as described herein.
- the processing engine 112 may be implemented on the computing device 200, via its hardware, software program, firmware, or a combination thereof. Although only one such computer is shown, for convenience, the computer functions as described herein may be implemented in a distributed fashion on a number of similar platforms to distribute the processing load.
- the computing device 200 may include COM ports 250 connected to and from a network connected thereto to facilitate data communications.
- the computing device 200 may also include a processor 220, in the form of one or more processors (e.g., logic circuits) , for executing program instructions.
- the processor 220 may include interface circuits and processing circuits therein.
- the interface circuits may be configured to receive electronic signals from a bus 210, wherein the electronic signals encode structured data and/or instructions for the processing circuits to process.
- the processing circuits may conduct logic calculations, and then determine a conclusion, a result, and/or an instruction encoded as electronic signals. Then the interface circuits may send out the electronic signals from the processing circuits via the bus 210.
- the computing device 200 may further include program storage and data storage of different forms including, for example, a disk 270, and a read only memory (ROM) 230, or a random access memory (RAM) 240, for various data files to be processed and/or transmitted by the computing device.
- the exemplary computer platform may also include program instructions stored in the ROM 230, RAM 240, and/or other type of non-transitory storage medium to be executed by the processor 220.
- the methods and/or processes of the present disclosure may be implemented as the program instructions.
- the computing device 200 may also include an I/O component 260, supporting input/output between the computer and other components.
- the computing device 200 may also receive programming and data via network communications.
- step A and step B may also be performed by two different CPUs and/or processors jointly or separately in the computing device 200 (e.g., the first processor executes step A and the second processor executes step B, or the first and second processors jointly execute steps A and B) .
- FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device 300 on which the user terminal 130 may be implemented according to some embodiments of the present disclosure.
- the mobile device 300 may include a communication platform 310, a display 320, a graphic processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, a mobile operating system (OS) 370, and a storage 390.
- any other suitable component including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300.
- the mobile operating system 370 (e.g., iOS™, Android™, Windows Phone™) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
- the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information from the traffic information prediction system 100.
- User interactions with the information stream may be achieved via the I/O 350 and provided to the processing engine 112 and/or other components of the traffic information prediction system 100 via the network 120.
- FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure.
- the processing engine 112 may include a link determination module 410, a reference information obtaining module 420, a traffic information determination module 430, and a model training module 440.
- the link determination module 410 may be configured to determine a target link having a linkID and a target future time point.
- a link may refer to a segment of a road or part of a segment of a road that has a predetermined direction. In certain embodiments, if a segment of a road includes two directions, it may include two links each having a different direction.
- the linkID may refer to a link identification code of a link.
- the linkID may be unique for each link. For example, the traffic information prediction system 100 may allocate a unique linkID to each link in a road network of a city.
- the linkID can be in any form, such as but not limited to characters, letters, digits, symbols, images, codes, or a combination thereof.
- the target link may be a link for which a user of the traffic information prediction system 100 wants to predict traffic information.
- the traffic information may indicate traffic conditions of a link, and the traffic conditions may include a number of features that reflect how the link is being used by vehicles, pedestrians, and others.
- the target future time point may be any time point in the future.
- the target future time point may be a time point when the user wants to predict traffic information of a link.
- a difference between the target future time point and a current time point may be a first predetermined time threshold, e.g., 10 minutes, 15 minutes, 30 minutes. For example, if the current time point is 10:00 a.m. and the first predetermined time threshold is 30 minutes, the target future time point may be 10:30 a.m.
- the first predetermined time threshold may be a default setting of the traffic information prediction system 100, or may be adjusted under different situations.
- the reference information obtaining module 420 may be configured to obtain a plurality of reference information packages corresponding to a plurality of reference time points associated with the target link.
- the reference information obtaining module 420 may obtain the plurality of reference information packages from a storage device (e.g., the storage) or an external device described elsewhere in the present disclosure.
- the storage device or the external device may collect the plurality of reference information packages from the user terminals 130, a third party (e.g., a traffic control department), etc.
- Each of the plurality of reference information packages may correspond to a reference time point.
- Each reference information package may include a plurality of information items.
- An information item refers to a feature and the value of the feature that is associated with a link, and in some cases (e.g., for dynamic information items) also associated with a particular time point.
- the plurality of information items may include at least one static information item and at least one dynamic information item related to the reference time point.
- the reference time points may be historical time points, future time points, or the current time point (the time point when the prediction is made) .
- the reference time points may include the target time point.
- the reference time points may include time points after the target time point.
- if the processing engine 112 obtains at least one information item of the target link at a future time point (e.g., predicted weather information of a future day obtained from a third party such as a weather monitoring system), the obtained information item can be used for the prediction of the traffic information.
- the plurality of reference time points may be sequential time points in at least one day.
- a difference between two sequential time points of the reference time points in a day may be a second predetermined time threshold, e.g., 2 minutes.
- the plurality of reference time points may include 9:00 a.m., 9:02 a.m., 9:04 a.m., ... in a current day, 9:00 a.m., 9:02 a.m., 9:04 a.m., ... in a day prior to the current day, and 9:00 a.m., 9:02 a.m., 9:04 a.m., ... in a day subsequent to the current day.
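- The snippet below is a minimal sketch of how such a set of reference time points could be enumerated; the window length, step, and number of prior days are assumptions for illustration.

```python
# Enumerate reference time points: 2-minute steps inside a one-hour window,
# repeated for the current day and the two preceding days.
from datetime import datetime, timedelta

def reference_time_points(window_end, window_minutes=60, step_minutes=2, days_back=2):
    points = []
    for day in range(days_back + 1):                  # current day + prior days
        end = window_end - timedelta(days=day)
        t = end - timedelta(minutes=window_minutes)
        while t <= end:
            points.append(t)
            t += timedelta(minutes=step_minutes)
    return sorted(points)

pts = reference_time_points(datetime(2019, 5, 20, 10, 0))
print(len(pts), pts[0], pts[-1])  # 93 time points across 3 days
```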
- the reference information obtaining module 420 may supplement the at least one reference information package. In some embodiments, the reference information obtaining module 420 may supplement the at least one reference information package based on a data filling technique (e.g., a data mining model, a smooth filling model, etc.). In some embodiments, the reference information obtaining module 420 may use candidate information items at candidate time points that are close to the reference time point to supplement the missing information item. For example, if a traffic congestion condition of a target link at 9:00 a.m. is missing, the reference information obtaining module 420 may obtain a traffic congestion condition of the target link at 8:58 a.m. and use it to supplement the traffic congestion condition at 9:00 a.m.
- the reference information obtaining module 420 may determine a candidate information package including a plurality of candidate information items of the target link based on initial information items for a historical time period (e.g., last month, last three months, last six months) .
- the processing engine 112 may determine an average or a mode of each of the initial information items and designate the calculated values as the candidate information items to supplement the reference information package.
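- The following sketch illustrates both supplementation strategies mentioned above, nearest-time-point filling and historical averaging; the function and field names are illustrative, not part of the disclosure.

```python
# Supplement a missing information item: prefer the closest available package,
# otherwise fall back to an average over a historical period.
from datetime import datetime
from statistics import mean

def supplement(packages, time_point, item, historical_values):
    # 1) Nearest-neighbour fill: use the closest package that has the item.
    candidates = [p for p in packages if item in p]
    if candidates:
        nearest = min(candidates,
                      key=lambda p: abs((p["time"] - time_point).total_seconds()))
        return nearest[item]
    # 2) Historical fill: average of the item over, e.g., the last month.
    return mean(historical_values) if historical_values else None

pkgs = [{"time": datetime(2019, 5, 20, 8, 58), "volume": 110}]
print(supplement(pkgs, datetime(2019, 5, 20, 9, 0), "volume", [100, 120]))  # 110
```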
- the traffic information determination module 430 may be configured to determine target traffic information of the target link at the target future time point using a trained recurrent neural network model based on the plurality of reference information packages.
- the target traffic information may include a target average driving speed of vehicles along the target link at the target future time point, a target traffic congestion condition (e.g., open, flowing, or congested) of the target link at the target future time point, target traffic volume of the target link at the target future time point, a target average passing time of vehicles along the target link at the target future time point, or the like, or any combination thereof.
- the trained recurrent neural network model may be configured to predict traffic information of a link at a future time point.
- the traffic information determination module 430 may obtain the trained recurrent neural network model from a storage device (e.g., the storage) or an external device described elsewhere in the present disclosure.
- the trained recurrent neural network model may be predetermined based on a plurality of sets of historical reference information packages by the traffic information prediction system 100 or the external device, and may be stored in the storage device or the external device.
- the trained recurrent neural network model may be trained after the reference information packages have been obtained.
- the traffic information determination module 430 may further predict traffic information of a plurality of target links at the target future time point. Further, the processing engine 112 may display the target traffic information on a digital map, and forecast the traffic information at the target future time point. In some embodiments, the processing engine 112 may perform route planning, perform transportation scheduling, determine Estimated Times of Arrival (ETAs), or determine a travelling price of a route based on the traffic information.
- the model training module 440 may be configured to train a preliminary neural network model to generate a trained recurrent neural network model.
- the model training module 440 may obtain a plurality of sets of historical reference information packages (also referred to as “training dataset” ) corresponding to a plurality of first historical time points associated with a plurality of reference links.
- Each set of the historical reference information packages (also referred to as “training sample” ) may include one or more historical reference information packages corresponding to one or more second historical time points associated with a reference link.
- Each of the one or more historical reference information packages may correspond to a second historical time point associated with the reference link.
- the first historical time point may be different from the corresponding second historical time points.
- the first historical time point may be a time point when traffic information of a reference link is to be predicted, and may be similar to the target future time point.
- the corresponding second historical time points may be similar to the plurality of reference time points.
- the model training module 440 may arrange the plurality of sets of historical reference information packages in a temporal order, i.e., based on time (e.g., from an earlier time to a later time, sequentially; or from a later time to an earlier time, backward sequentially).
- the model training module 440 may be configured to train a preliminary neural network model based on the arranged historical information items to generate the trained recurrent neural network model.
- the model training module 440 may update the trained recurrent neural network model at a certain time interval (e.g., per week, per month, per two months) based on a plurality of newly obtained sets of historical reference information packages.
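- A hedged sketch of such a training loop is shown below, reusing the TrafficRNN-style network sketched earlier. The loss function, optimizer, and the loss-threshold stopping rule are assumptions standing in for the "predetermined condition"; sample shapes are illustrative.

```python
# Train a preliminary model on temporally ordered samples until an assumed
# loss threshold (stand-in for the predetermined condition) is met.
import torch
import torch.nn as nn

def train(model, samples, epochs=50, loss_threshold=0.01, lr=1e-3):
    # samples: list of (dynamic_seq, link_id, target) tensors, already sorted by time
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        total = 0.0
        for dynamic_seq, link_id, target in samples:
            optimizer.zero_grad()
            pred = model(dynamic_seq.unsqueeze(0), link_id.unsqueeze(0))
            loss = loss_fn(pred.squeeze(), target)
            loss.backward()
            optimizer.step()
            total += loss.item()
        if total / max(len(samples), 1) < loss_threshold:  # predetermined condition
            break
    return model
```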
- FIG. 5 is a flowchart illustrating an exemplary process for determining target traffic information of a target link at a target future time point according to some embodiments of the present disclosure.
- the process 500 may be implemented as a set of instructions (e.g., an application) stored in the ROM 230 or the RAM 240.
- the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 500.
- the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed herein. Additionally, the order of the operations of the process as illustrated in FIG. 5 and described below is not intended to be limiting.
- the processing engine 112 may determine a target link having a linkID and a target future time point.
- a link may refer to a segment of a road or part of a segment of a road that has a predetermined direction. In certain embodiments, if a segment of a road includes two directions, it may include two links each having a different direction. A route, whether or not planned, normally includes multiple links.
- the length of a link may be any value, e.g., within a range of 100 m to 500 m, or shorter than 500 m, 200 m, or 100 m.
- the linkID may refer to a link identification code of a link.
- the linkID may be unique for each link.
- the traffic information prediction system 100 may allocate a unique linkID to each link in a road network of a city.
- the linkID can be in any form, such as but not limited to characters, letters, digits, symbols, images, codes, or a combination thereof.
- the target link may be a link for which a user of the traffic information prediction system 100 wants to predict traffic information.
- the traffic information may indicate traffic conditions of a link, and the traffic conditions may include a number of features that reflect how the link is being used by vehicles, pedestrians, and others.
- the traffic information may include an average driving speed of vehicles along the link, a traffic congestion condition (e.g., open, flowing, or congested) of the link, traffic volume of the link or an average passing time of vehicles along the link, etc.
- the target future time point may be any time point in the future.
- the target future time point may be a time point when the user wants to predict traffic information of a link.
- a difference between the target future time point and a current time point may be a predetermined time threshold, e.g., 10 minutes, 15 minutes, 30 minutes. For example, if the current time point is 10:00 a.m., and the first predetermined time threshold is 30 minutes, the target future time point may be 10:30 a.m.
- the first predetermined time threshold may be a default setting of the traffic information prediction system 100, or may be adjusted under different situations.
- the processing engine 112 may obtain a plurality of reference information packages corresponding to a plurality of reference time points associated with the target link.
- the processing engine 112 may obtain the plurality of reference information packages from a storage device (e.g., the storage) or an external device described elsewhere in the present disclosure.
- the storage device or the external device may collect the plurality of reference information packages from the user terminals 130, a third party (e.g., a traffic control department), etc.
- Each of the plurality of reference information packages may correspond to a reference time point.
- Each reference information package may include a plurality of information items.
- An information item refers to a feature and the value of the feature that is associated with a link, and in some cases (e.g., for dynamic information items) also associated with a particular time point.
- the plurality of information items may include at least one static information item and at least one dynamic information item related to the reference time point.
- each of the at least one static information item may be time-independent, i.e., each of the at least one static information item rarely changes with time.
- the static information item may include the linkID of the target link, a speed limit of the target link (e.g., 60 km/h, 80 km/h), a link type of the target link (e.g., a highway, an avenue, a tunnel), a length of the target link, a width of the target link, a direction of the target link, a count of lanes of the target link, whether the target link is near to an intersection, or the like, or any combination thereof.
- for example, if a distance between the target link and an intersection is less than a distance threshold, the processing engine 112 may determine that the target link is near to the intersection.
- each of the at least one dynamic information item may be time-dependent, i.e., each of the at least one dynamic information item changes with time.
- the dynamic information item may include weather information of the target link at the reference time point, traffic volume of the target link at the reference time point, an average driving speed of vehicles along the target link at the reference time point, a traffic congestion condition (e.g., open, flowing, or congested) of the target link at the reference time point, an average passing time of vehicles along the target link at the reference time point, or the like, or any combination thereof.
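- As an illustration, the sketch below encodes one reference information package into numeric static and dynamic feature lists; the category vocabularies and field names are assumptions for the example, not values defined by the patent.

```python
# Encode a reference information package into numeric features.
LINK_TYPES = {"highway": 0, "avenue": 1, "tunnel": 2}
WEATHER = {"sunny": 0, "rainy": 1, "snowy": 2}
CONGESTION = {"open": 0, "flowing": 1, "congested": 2}

def encode_package(pkg):
    static = [
        pkg["speed_limit"],                    # km/h
        LINK_TYPES[pkg["link_type"]],
        pkg["length_m"],
        pkg["lanes"],
        1 if pkg["near_intersection"] else 0,
    ]
    dynamic = [
        WEATHER[pkg["weather"]],
        pkg["traffic_volume"],
        pkg["avg_speed_kmh"],
        CONGESTION[pkg["congestion"]],
        pkg["avg_passing_time_s"],
    ]
    return static, dynamic

print(encode_package({
    "speed_limit": 60, "link_type": "avenue", "length_m": 300, "lanes": 3,
    "near_intersection": True, "weather": "rainy", "traffic_volume": 120,
    "avg_speed_kmh": 35.0, "congestion": "flowing", "avg_passing_time_s": 31,
}))
```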
- each of the information items may have an effect on reference traffic information of the target link at the reference time point.
- for example, if a linkID of a link indicates that the link is on a busy road (e.g., Houchang Village Road, West 2nd Banner in Beijing, with a plurality of technology companies), the link may have relatively large traffic volume.
- as another example, if an average driving speed of vehicles along a link at a time point is only 10 km/h, the link may be congested; if the average driving speed of vehicles along the link at the time point is 80 km/h, the link may be open.
- because traffic information of a link at different time points may be interdependent when a difference between the time points is less than a certain time interval, the target traffic information of the target link may be related to the reference traffic information of the target link at the reference time points.
- the reference time points may be any time point that can be used to facilitate the prediction of the traffic information.
- the plurality of reference time points may include one or more first time points in a current day of the target future time point and one or more second corresponding time points in one or more days prior to the target future time point.
- the one or more first time points may be within a predetermined time period in the current day. For example, if a current time point is 10:00 a.m., the one or more first time points may be between 9:00 a.m. and 10:00 a.m.
- the one or more second corresponding time points may be within the same predetermined time period in the one or more days prior to the current day.
- For example, the second corresponding time points may be between 9:00 a.m. and 10:00 a.m. one day prior to the current day, between 9:00 a.m. and 10:00 a.m. two days prior to the current day, etc.
- the reference time points may be historical time points, future time points, or the current time point (the time point when the prediction is made) .
- the reference time points may include the target time point.
- the reference time points may include time points after the target time point.
- if the processing engine 112 obtains at least one information item of the target link at a future time point (e.g., predicted weather information of a future day obtained from a third party such as a weather monitoring system), the obtained information item can be used for the prediction of the traffic information.
- the processing engine 112 can take into consideration the influence of traffic information items associated with a future reference time point in predicting traffic information. For example, rain or snow may increase the likelihood of congestion.
- the processing engine 112 may take into consideration people’s anticipated response to information items associated with a future reference time point. For example, a storm forecast may result in people trying to avoid the road, causing a reduced likelihood of congestion at target time points before the storm arrives.
- the plurality of reference time points may be sequential time points in at least one day.
- a difference between two sequential reference time points in a day may be a second predetermined time threshold, e.g., 2 minutes.
- For example, the plurality of reference time points may include 9:00 a.m., 9:02 a.m., 9:04 a.m., ... in a current day; 9:00 a.m., 9:02 a.m., 9:04 a.m., ... in a day prior to the current day; and 9:00 a.m., 9:02 a.m., 9:04 a.m., ... in a day subsequent to the current day.
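- A minimal sketch (illustrative only) of enumerating such reference time points; the one-hour window, the 2-minute spacing, and the two prior days are assumed values, not requirements of the disclosure:

```python
from datetime import datetime, timedelta

def reference_time_points(current_time, window_minutes=60,
                          step_minutes=2, prior_days=2):
    """Enumerate sequential time points in the current day and the
    corresponding time points in each of the preceding days."""
    points = []
    window_start = current_time - timedelta(minutes=window_minutes)
    for day_offset in range(prior_days + 1):        # 0 = current day
        t = window_start - timedelta(days=day_offset)
        end = current_time - timedelta(days=day_offset)
        while t <= end:
            points.append(t)
            t += timedelta(minutes=step_minutes)
    return sorted(points)

# e.g., 9:00 a.m. to 10:00 a.m. today and on the two prior days, every 2 minutes
pts = reference_time_points(datetime(2019, 5, 23, 10, 0))
```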
- a first portion of the information items may be instantaneous, i.e., each of the first portion of the information items may be real-time data corresponding to the reference time point. For example, if a target link is rainy at 9:02 a.m., the weather information of the target link at 9:02 a.m. may be rainy.
- a second portion of the information items may be an average value of real-time data corresponding to a time interval (e.g., 2 minutes) associated with the reference time point.
- an average driving speed of vehicles along the target link at 9:02 a.m. may be an average value of driving speeds of vehicles along the target link within 9:00 a.m.-9:02 a.m. or 9:02 a.m.-9:04 a.m.
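- For the second portion of information items, the 2-minute average could be computed along the following lines (a sketch using pandas; the column name and sample values are hypothetical):

```python
import pandas as pd

# Raw per-vehicle speed reports for one link (hypothetical sample data)
reports = pd.DataFrame(
    {"speed_kmh": [42.0, 38.5, 40.2, 36.8, 44.1]},
    index=pd.to_datetime([
        "2019-05-23 09:00:20", "2019-05-23 09:01:10",
        "2019-05-23 09:01:50", "2019-05-23 09:02:40",
        "2019-05-23 09:03:30",
    ]),
)

# Average driving speed per 2-minute interval (e.g., 9:00-9:02, 9:02-9:04)
avg_speed = reports["speed_kmh"].resample("2min").mean()
print(avg_speed)
```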
- the processing engine 112 may supplement the at least one reference information package. In some embodiments, the processing engine 112 may supplement the at least one reference information package based on a data filling technique (e.g., a data mining model, a smooth filling model, etc.). In some embodiments, the processing engine 112 may use candidate information items at candidate time points that are close to the reference time point to supplement a missing information item. For example, if a traffic congestion condition of a target link at 9:00 a.m. is missing, the processing engine 112 may obtain a traffic congestion condition of the target link at 8:58 a.m. and use it to supplement the traffic congestion condition at 9:00 a.m.
- the processing engine 112 may determine a candidate information package including a plurality of candidate information items of the target link based on initial information items for a historical time period (e.g., last month, last three months, last six months) . Specifically, for each of the candidate information items, the processing engine 112 may determine an average or a mode of each of the initial information items and designate the calculated values as the candidate information items to supplement the reference information package.
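- A minimal sketch of the supplementing logic described above, assuming a nearest-available candidate time point is preferred and a historical average (or mode for categorical items) is used as a fallback; the function and parameter names are hypothetical:

```python
from statistics import mean, mode

def supplement(item_by_time: dict, target_time, history: list,
               max_gap_minutes: int = 4, categorical: bool = False):
    """Fill a missing information item at target_time.

    item_by_time : {datetime: value} observations already available
    history      : values of the same item over a historical period
    """
    if target_time in item_by_time:
        return item_by_time[target_time]
    # 1) try candidate time points close to the reference time point
    candidates = [t for t in item_by_time
                  if abs((t - target_time).total_seconds()) <= max_gap_minutes * 60]
    if candidates:
        nearest = min(candidates,
                      key=lambda t: abs((t - target_time).total_seconds()))
        return item_by_time[nearest]
    # 2) fall back to a mode (categorical) or mean (numeric) of historical values
    if not history:
        return None
    return mode(history) if categorical else mean(history)
```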
- the processing engine 112 may determine target traffic information of the target link at the target future time point using a trained recurrent neural network model based on the plurality of reference information packages.
- the target traffic information may include a target average driving speed of vehicles along the target link at the target future time point, a target traffic congestion condition (e.g., open, flowing, or congested) of the target link at the target future time point, target traffic volume of the target link at the target future time point, a target average passing time of vehicles along the target link at the target future time point, or the like, or any combination thereof.
- the trained recurrent neural network model may be configured to predict traffic information of a link at a future time point.
- the processing engine 112 may obtain the trained recurrent neural network model from a storage device (e.g., the storage) or an external device described elsewhere in the present disclosure.
- the trained recurrent neural network model may be predetermined based on a plurality of sets of historical reference information packages by the traffic information prediction system 100 or the external device, and may be stored in the storage device or the external device.
- the trained recurrent neural network model may be trained after the reference information packages have been obtained. More detailed description of determining the trained recurrent neural network model can be found elsewhere in the present disclosure, e.g., FIGs. 6-9, and the descriptions thereof.
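- Conceptually, the prediction step could resemble the following sketch; PyTorch is used only as an example framework, and the tensor layout and model interface are assumptions rather than the disclosed implementation:

```python
import torch

def predict_traffic(model, reference_packages):
    """Predict target traffic information from a time-ordered list of
    reference information packages, each already encoded as a feature vector."""
    model.eval()
    # shape: (batch=1, n reference time points, feature dimension)
    x = torch.tensor(list(reference_packages), dtype=torch.float32).unsqueeze(0)
    with torch.no_grad():
        return model(x)   # e.g., a predicted average speed or congestion class score
```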
- the processing engine 112 may further predict traffic information of a plurality of target links at the target future time point. Further, the processing engine 112 may display the target traffic information on a digital map, and forecast the traffic information at the target future time point.
- the processing engine 112 may perform route planning based on the traffic information. In an application scenario, if there are a plurality of candidate routes between two locations, and the processing engine 112 determines that, although not congested at a current time point, at least one route will probably be congested based on the predicted traffic information at a future time point, the processing engine 112 may remind a user of the traffic information prediction system 100 of the possibility of congestion for that route, and recommend the user to select another route to avoid traffic congestion at the future time point.
- the processing engine 112 may perform transportation scheduling based on the traffic information.
- the processing engine 112 may determine that one or more links will become congested based on the predicted traffic information if there is no intervention.
- a user (e.g., the traffic control department) of the traffic information prediction system 100 may direct vehicles heading toward the congested links to a detour to reduce the likelihood of congestion of those links.
- the user of the traffic information prediction system 100 may reduce the likelihood of congestion of the links by adjusting traffic light cycles of intersections leading up to and/or near the link.
- the user of the traffic information prediction system 100 may adjust a ratio of the green light cycle to the red light cycle leading up to the link and/or along the link to increase the duration of the green lights, thus reducing the likelihood of congestion.
- In this way, the likelihood of congestion along those links can be reduced.
- the processing engine 112 may determine Estimated Times of Arrival (ETAs) based on the traffic information.
- the ETA may refer to a driving time along a route including a plurality of links (e.g., from one location to another location), and may be related to a length of the route, an average passing time of vehicles along each of the plurality of links, a traffic congestion condition of each of the plurality of links included in the traffic information of the links, etc.
- the processing engine 112 may determine the ETA based on the predicted traffic information. Further, the processing engine 112 may determine a travelling price of the route at least partly based on the ETA and the traffic information.
- the processing engine 112 may send signals to one or more user terminals to direct the user terminals to display the ETAs and the traveling price.
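- The ETA and price determination described above could be sketched as follows; the linear pricing formula and its coefficients are purely illustrative assumptions:

```python
def estimate_eta_seconds(route_links, predicted_passing_time_s):
    """Sum the predicted average passing time of every link along the route."""
    return sum(predicted_passing_time_s[link_id] for link_id in route_links)

def travelling_price(route_length_km, eta_seconds,
                     base_fare=10.0, per_km=2.0, per_minute=0.5):
    """Price based partly on the ETA and partly on the route length."""
    return base_fare + per_km * route_length_km + per_minute * eta_seconds / 60.0

route = ["link_a", "link_b", "link_c"]                      # hypothetical link IDs
passing_time = {"link_a": 90.0, "link_b": 140.0, "link_c": 60.0}
eta = estimate_eta_seconds(route, passing_time)             # 290 s
price = travelling_price(route_length_km=3.2, eta_seconds=eta)
```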
- the processing engine 112 may store information (e.g., the target link, the target future time point, the target traffic information) in a storage device (e.g., the storage 140) described elsewhere in the present disclosure.
- FIG. 6 is a flowchart illustrating an exemplary process for training a preliminary neural network model according to some embodiments of the present disclosure.
- the process 600 may be implemented as a set of instructions (e.g., an application) stored in the storage, ROM 230, or RAM 240.
- the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 600.
- the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 600 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed herein. Additionally, the order of the operations of the process as illustrated in FIG. 6 and described below is not intended to be limiting.
- the processing engine 112 may obtain a plurality of sets of historical reference information packages (also referred to as “training dataset” ) corresponding to a plurality of first historical time points associated with a plurality of reference links.
- Each set of the historical reference information packages (also referred to as “training sample” ) may include one or more historical reference information packages corresponding to one or more second historical time points associated with a reference link.
- Each of the one or more historical reference information packages may correspond to a second historical time point associated with the reference link.
- the first historical time point may be different from the corresponding second historical time points.
- the first historical time point may be a time point when traffic information of a reference link is to be predicted, and may be similar to the target future time point described in FIG. 5.
- the corresponding second historical time points may be similar to the plurality of reference time points.
- Each of the one or more historical reference information packages may include a plurality of historical information items. Similar to the reference information items included in the reference information package described in FIG. 5, the plurality of historical information items may include at least one historical static information item and at least one historical dynamic information item related to a second historical time point.
- the at least one historical static information item may include a linkID of the reference link, a speed limit (e.g., 60 Km/h, 80 Km/h) of the reference link, a link type of the reference link (e.g., a highway, an avenue, a tunnel) , a length of the reference link, a width of the reference link, a direction of the reference link, a count of lanes of the reference link, whether the reference link was near to an intersection, or the like, or any combination thereof.
- the at least one historical dynamic information item may include historical weather information of the reference link at the second historical time point, historical traffic volume of the reference link at the second historical time point, a historical average driving speed of vehicles along the reference link at the second historical time point, a historical traffic congestion condition (e.g., open, flowing, or congested) of the reference link at the second historical time point, a historical average passing time of vehicles along the reference link at the second historical time point, or the like, or any combination thereof.
- the processing engine 112 may arrange the plurality of sets of historical reference information packages in a temporal order, i.e., based on time (e.g. from an earlier time to a later time, sequentially; or from a later time to an earlier time, backward sequentially) .
- the processing engine 112 may use a matrix to represent the historical information items.
- the processing engine 112 may arrange the matrices in the temporal order.
- the processing engine 112 may arrange the historical information items included in the set of historical reference information packages as a combined matrix illustrated below:

  (x 1 , x 2 , ..., x i , ..., x n )      Equation (1)

- where (1, 2, ..., i, ..., n) refer to the second historical time points in chronological order, and x i refers to a matrix representing the historical information items included in the historical reference information package corresponding to the i th second historical time point.
- the second historical time point i may be earlier than the second historical time point (i+1) and later than the second historical time point (i-1).
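- Arranging one training sample in this temporal order might look like the following sketch, assuming each historical reference information package carries a time point and an already-encoded feature vector:

```python
import numpy as np

def build_combined_matrix(historical_packages):
    """Stack the feature vectors of one set of historical reference
    information packages in chronological order: rows x_1, ..., x_n."""
    ordered = sorted(historical_packages, key=lambda p: p["time_point"])
    return np.stack([p["features"] for p in ordered])   # shape (n, feature_dim)
```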
- the processing engine 112 may train a preliminary neural network model based on the historical information items to generate the trained recurrent neural network model.
- the preliminary neural network model may include an input layer, a hidden layer and an output layer.
- the processing engine 112 may input the historical information items corresponding to the plurality of sets of historical reference information packages through the input layer.
- the hidden layer may be configured to process the historical information items, and the output layer may output a result (e.g., training traffic information) determined by the preliminary neural network model. More detailed descriptions of training the preliminary neural network model and the structure of the preliminary neural network model can be found elsewhere in the present disclosure, e.g., FIGs. 7-9 and the descriptions thereof.
- FIG. 7 is a flowchart illustrating an exemplary process for determining a trained recurrent neural network model according to some embodiments of the present disclosure.
- the process 700 may be implemented as a set of instructions (e.g., an application) stored in the storage, ROM 230, or RAM 240.
- the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 700.
- the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed herein. Additionally, the order of the operations of the process as illustrated in FIG. 7 and described below is not intended to be limiting. In some embodiments, operation 630 of the process 600 may be performed based on the process 700.
- the processing engine 112 may determine the preliminary neural network model (e.g., a recurrent neural network model) .
- the processing engine 112 may set preliminary parameters (e.g., weight values, bias values) for the preliminary neural network model.
- the processing engine 112 may determine training traffic information corresponding to each set of the historical reference information packages based on the preliminary neural network model and the historical information items. Similar to the target traffic information described in FIG. 5, the training traffic information may include a training average driving speed of vehicles along the reference link at a first historical time point, a training traffic congestion condition (e.g., open, flowing, or congested) of the reference link at the first historical time point, training traffic volume of the reference link at the first historical time point, or a training average passing time of vehicles along the reference link at the first historical time point.
- the preliminary neural network model may include the input layer, the hidden layer, and the output layer.
- the hidden layer may include a long short-term memory (LSTM) layer.
- the LSTM layer may include three gates including an input gate, a forget gate, and an output gate.
- the processing engine 112 may arrange the historical information items included in a set associated with a first historical time point as (x 1 , x 2 , ..., x i , ..., x n ) as described in 620.
- the LSTM layer may utilize the three gates to determine first short-term traffic information (e.g., traffic information corresponding to the 1 st second historical time point) and first long-term traffic information (e.g., traffic information corresponding to the first historical time point) based on x 1 .
- the LSTM layer may utilize the three gates to determine second short-term traffic information (e.g., traffic information corresponding to the 2 nd second historical time point) and second long-term traffic information (e.g., traffic information corresponding to the first historical time point) based on x 2 , the first short-term traffic information, and the first long-term traffic information.
- similarly, the LSTM layer may utilize the three gates to determine n th short-term traffic information (e.g., traffic information corresponding to the n th second historical time point) and n th long-term traffic information (e.g., traffic information corresponding to the first historical time point) based on x n , the (n-1) th short-term traffic information, and the (n-1) th long-term traffic information.
- the output layer may determine the training traffic information at least based on the n th long-term traffic information.
- the LSTM layer may determine the n th long-term traffic information based on Equation (2) as illustrated below:

  y n = f (x n , y n-1 )      Equation (2)

- where y n refers to the n th long-term traffic information, x n refers to the historical information items corresponding to the n th second historical time point, y n-1 refers to an output (e.g., the (n-1) th short-term traffic information and the (n-1) th long-term traffic information) of the historical information items corresponding to the (n-1) th second historical time point, and f denotes the operation performed by the LSTM layer.
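- The step-by-step recurrence described above corresponds to a standard LSTM cell with input, forget, and output gates; a minimal PyTorch sketch is shown below. PyTorch is an assumed framework, and the mapping of the hidden state h and cell state c onto the "short-term" and "long-term" information above is an interpretation, not something stated by the disclosure:

```python
import torch
import torch.nn as nn

feature_dim, hidden_dim = 16, 32
cell = nn.LSTMCell(feature_dim, hidden_dim)   # contains the input, forget, and output gates

x = torch.randn(10, 1, feature_dim)           # x_1 ... x_n for one training sample (n = 10)
h = torch.zeros(1, hidden_dim)                # hidden state, updated at every step
c = torch.zeros(1, hidden_dim)                # cell state, carrying longer-term memory
for x_i in x:                                 # each step uses x_i and the previous states
    h, c = cell(x_i, (h, c))
# after the loop, h (and c) summarize the whole sequence and can feed the output layer
```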
- the processing engine 112 may determine whether the training traffic information of the plurality of sets of historical reference information packages satisfies a predetermined condition.
- the processing engine 112 may determine an objective function (e.g., a loss function, a Root Mean Square Error (RMSE) function, a Mean Absolute Error (MAE) function) of the preliminary neural network model and determine a value thereof based on the training traffic information. Further, the processing engine 112 may determine whether the value of the objective function is less than a first predetermined value threshold or whether the value of the objective function is at a minimum.
- the first predetermined value threshold may be a default setting of the traffic information prediction system 100, or may be adjustable under different situations.
- in response to a determination that the training traffic information satisfies the predetermined condition, the processing engine 112 may designate the preliminary neural network model as the trained recurrent neural network model in 740.
- in response to a determination that the training traffic information does not satisfy the predetermined condition, the processing engine 112 may execute the process 700 to return to 710 to update the preliminary neural network model.
- the processing engine 112 may update one or more preliminary parameters (e.g., weight values, bias values) of the preliminary neural network model to generate an updated neural network model.
- the processing engine 112 may determine updated training traffic information corresponding to the plurality of sets of historical reference information packages based on the updated neural network model.
- the processing engine 112 may determine whether the updated training traffic information satisfies the predetermined condition. In response to the determination that the updated training traffic information satisfies the predetermined condition, the processing engine 112 may designate the updated neural network model as the trained recurrent neural network model in 740. In response to the determination that the updated training traffic information does not satisfy the predetermined condition, the processing engine 112 may still execute the process 700 to return to 710 to update the updated neural network model until the updated training traffic information satisfies the predetermined condition.
- the processing engine 112 may obtain a plurality of second sets of historical reference information packages (also referred to as validation dataset) to validate the updated neural network model.
- the updated neural network model may not be designated as the trained recurrent neural network model until a validation result associated with the validation dataset satisfies a second predetermined condition.
- the second predetermined condition may include a value of an objective function (e.g., a loss function, a Root Mean Square Error function, a Mean Absolute Error function) of the validation dataset being less than a second predetermined value threshold, or the value of the objective function being at a minimum.
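- A minimal training-loop sketch consistent with the iteration and validation described above; PyTorch, the Adam optimizer, RMSE as the objective function, and the stopping thresholds are all illustrative assumptions:

```python
import torch
import torch.nn as nn

def train(model, train_loader, val_loader, loss_threshold=0.05,
          max_epochs=100, lr=1e-3):
    """Update the model until the objective (approximate RMSE) on the
    validation data falls below a predetermined threshold."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    mse = nn.MSELoss()
    for epoch in range(max_epochs):
        model.train()
        for x, y in train_loader:                 # groups of e.g. 32/64/128 samples
            optimizer.zero_grad()
            loss = torch.sqrt(mse(model(x), y))   # RMSE as the objective function
            loss.backward()
            optimizer.step()
        model.eval()
        with torch.no_grad():
            val_rmse = torch.sqrt(torch.stack(
                [mse(model(x), y) for x, y in val_loader]).mean())
        if val_rmse < loss_threshold:             # predetermined condition satisfied
            break
    return model
```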
- the processing engine 112 may update the trained recurrent neural network model at a certain time interval (e.g., per week, per month, per two months) based on a plurality of newly obtained sets of historical reference information packages.
- FIG. 8 is a flowchart illustrating an exemplary process for determining training traffic information of a set of reference information packages according to some embodiments of the present disclosure.
- the process 800 may be implemented as a set of instructions (e.g., an application) stored in the storage, ROM 230, or RAM 240.
- the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 800.
- the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 800 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed herein. Additionally, the order of the operations of the process as illustrated in FIG. 8 and described below is not intended to be limiting. In some embodiments, operation 720 of the process 700 may be performed based on the process 800.
- the processing engine 112 may sparse at least one historical static information item for each of the plurality of sets of historical reference information packages. As described in 620, for historical information items included in a historical reference information package, the processing engine 112 may use a matrix to represent the historical information items. Since at least one vector corresponding to the at least one historical static information item may be sparse (i.e., there may be a number of zero elements in the vectors), the processing engine 112 may perform sparsity processing for the vectors and reduce the dimensions of the vectors.
- the preliminary neural network model may include the input layer, the LSTM layer, and the output layer. In some embodiments, the preliminary neural network model may further include an embedding layer and a fully connected layer. In some embodiments, the processing engine 112 may embed the at least one historical static information item to perform the sparsity in the embedding layer.
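- Embedding a sparse categorical static item (such as the link type) into a low-dimensional dense vector could be sketched as follows; the vocabulary and embedding dimension are assumed values:

```python
import torch
import torch.nn as nn

# Hypothetical categorical static items: link type mapped to integer ids
link_types = {"highway": 0, "avenue": 1, "tunnel": 2}
embed_link_type = nn.Embedding(num_embeddings=len(link_types), embedding_dim=4)

ids = torch.tensor([link_types["highway"], link_types["tunnel"]])  # two links
dense_static = embed_link_type(ids)   # shape (2, 4): low-dimensional, dense vectors
```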
- the processing engine 112 may divide the plurality of sets of historical reference information packages (also referred to as “training samples” ) into one or more groups.
- a count of sets of historical reference information packages in each of the one or more groups may be a predetermined value, e.g., 32, 64, 128, etc.
- the processing engine 112 may input historical dynamic information items in each of the one or more groups into the input layer in 830.
- the LSTM layer may filter out a portion of the historical dynamic information items, where the portion of the historical dynamic information items may be determined to have less effect on the output (e.g., the training traffic information) of the preliminary neural network model during the training process.
- in the fully connected layer, the processing engine 112 (e.g., the model training module 440 and/or the interface circuits of the processor 220) may extract combined information from the input layer, the LSTM layer, and the embedding layer.
- the processing engine 112 (e.g., the model training module 440 and/or the interface circuits of the processor 220) may then determine the training traffic information based on the combined information through the output layer.
- FIG. 9 is a flowchart illustrating an exemplary process for determining training traffic information of a set of reference information packages according to some embodiments of the present disclosure.
- the process 900 may be implemented as a set of instructions (e.g., an application) stored in the storage, ROM 230, or RAM 240.
- the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 900.
- the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 900 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed herein. Additionally, the order of the operations of the process as illustrated in FIG. 9 and described below is not intended to be limiting. In some embodiments, operation 720 of the process 700 may be performed based on the process 900.
- the processing engine 112 may sparse at least one historical static information item for each of the plurality of sets of historical reference information packages. As described in 620 or 810, for historical information items included in a historical reference information package, the processing engine 112 may use a matrix to represent the historical information items. Since at least one vector corresponding to the at least one historical static information item may be sparse (i.e., there may be a number of zero elements in the vectors), the processing engine 112 may perform sparsity processing for the vectors and reduce the dimensions of the vectors.
- the processing engine 112 may embed the at least one historical static information item to perform the sparsity in the embedding layer.
- the processing engine 112 may divide the plurality of sets of historical reference information packages into one or more groups.
- a count of sets of historical reference information packages in each of the one or more groups may be a predetermined value, e.g., 32, 64, 128, etc.
- the processing engine 112 may concatenate the sparsed historical static information items and historical dynamic information items for each of the one or more groups.
- the processing engine 112 may concatenate the sparsed historical static information items and historical dynamic information items for each historical reference information package, and form a combined matrix for each set of the historical reference information packages (also referred to as “training sample” ) .
- the processing engine 112 may input the concatenated historical information items of the each of the one or more groups into the input layer in 940.
- the LSTM layer may filter out at least a portion of the concatenated historical information items, where the at least a portion of the concatenated historical information items may be determined to have less effect on the output (e.g., the training traffic information) of the preliminary neural network model during the training process.
- in the fully connected layer, the processing engine 112 (e.g., the model training module 440 and/or the interface circuits of the processor 220) may extract combined information based on the concatenated historical information items and the sparsed historical static information items.
- the processing engine 112 (e.g., the model training module 440 and/or the interface circuits of the processor 220) may then determine the training traffic information based on the combined information through the output layer.
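- Putting the layers of process 900 together (embedding of the static item, concatenation with the dynamic items, LSTM layer, fully connected layer, and output layer) might look like the following sketch; the dimensions, the single regression output, and the layer sizes are assumptions for illustration only:

```python
import torch
import torch.nn as nn

class TrafficRNN(nn.Module):
    """Illustrative embedding + LSTM + fully connected architecture."""
    def __init__(self, num_link_types=3, embed_dim=4,
                 dynamic_dim=8, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(num_link_types, embed_dim)   # sparse static item
        self.lstm = nn.LSTM(embed_dim + dynamic_dim, hidden_dim,
                            batch_first=True)                  # LSTM layer
        self.fc = nn.Linear(hidden_dim, 16)                    # fully connected layer
        self.out = nn.Linear(16, 1)                            # output layer

    def forward(self, link_type_id, dynamic_seq):
        # link_type_id: (batch,)  dynamic_seq: (batch, n time points, dynamic_dim)
        n = dynamic_seq.size(1)
        static = self.embed(link_type_id).unsqueeze(1).expand(-1, n, -1)
        x = torch.cat([static, dynamic_seq], dim=-1)   # concatenate static + dynamic
        _, (h_n, _) = self.lstm(x)                     # h_n: (1, batch, hidden_dim)
        return self.out(torch.relu(self.fc(h_n[-1])))  # predicted traffic information

model = TrafficRNN()
pred = model(torch.tensor([0, 2]), torch.randn(2, 30, 8))   # two samples, 30 time points
```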
- aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in an implementation combining software and hardware that may all generally be referred to herein as a “unit, ” “module, ” or “system. ” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
- a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof.
- a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
- Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Data Mining & Analysis (AREA)
- Computing Systems (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Life Sciences & Earth Sciences (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Business, Economics & Management (AREA)
- Strategic Management (AREA)
- Human Resources & Organizations (AREA)
- Economics (AREA)
- Game Theory and Decision Science (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Marketing (AREA)
- Entrepreneurship & Innovation (AREA)
- Development Economics (AREA)
- Traffic Control Systems (AREA)
Abstract
The present disclosure relates to systems and methods for predicting traffic information of a link. The system may determine a target link having a linkID and a target future time point. The system may also obtain a plurality of reference information packages corresponding to a plurality of reference time points associated with the target link. Each of the plurality of reference information packages may correspond to a reference time point and include at least one static information item and at least one dynamic information item related to the reference time point. Further, the system may determine target traffic information of the target link at the target future time point using a trained recurrent neural network model based on the plurality of reference information packages.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910433966.7 | 2019-05-23 | ||
| CN201910433966.7A CN110675621B (zh) | 2019-05-23 | 2019-05-23 | 预测交通信息的系统和方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020232732A1 true WO2020232732A1 (fr) | 2020-11-26 |
Family
ID=69068651
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2019/088956 Ceased WO2020232732A1 (fr) | 2019-05-23 | 2019-05-29 | Systèmes et procédés de prévision d'informations de circulation |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN110675621B (fr) |
| WO (1) | WO2020232732A1 (fr) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112820103A (zh) * | 2020-12-31 | 2021-05-18 | 山东奥邦交通设施工程有限公司 | 一种一体化动静结合标志管控方法及系统 |
| CN112907953A (zh) * | 2021-01-27 | 2021-06-04 | 吉林大学 | 一种基于稀疏gps数据的公交行程时间预测方法 |
| CN113570867A (zh) * | 2021-09-26 | 2021-10-29 | 西南交通大学 | 一种城市交通状态预测方法、装置、设备及可读存储介质 |
| CN115440038A (zh) * | 2022-08-31 | 2022-12-06 | 青岛海信网络科技股份有限公司 | 一种交通信息确定方法以及电子设备 |
| CN115909748A (zh) * | 2023-01-07 | 2023-04-04 | 深圳市城市交通规划设计研究中心股份有限公司 | 节假日公路交通量预测方法、电子设备及存储介质 |
| US11754410B1 (en) | 2022-07-11 | 2023-09-12 | Chengdu Qinchuan Iot Technology Co., Ltd. | Methods and internet of things systems for determining government traffic routes in smart cities |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113344233A (zh) * | 2020-02-18 | 2021-09-03 | 阿里巴巴集团控股有限公司 | 通行速度预测模型的训练方法、装置和存储介质 |
| CN111489553B (zh) * | 2020-04-26 | 2022-02-25 | 百度在线网络技术(北京)有限公司 | 一种路线规划方法、装置、设备和计算机存储介质 |
| CN111833605B (zh) * | 2020-07-10 | 2022-04-26 | 北京嘀嘀无限科技发展有限公司 | 路况预测方法、路况预测模型训练方法、装置及存储介质 |
| CN111986490A (zh) * | 2020-09-18 | 2020-11-24 | 北京百度网讯科技有限公司 | 路况预测方法、装置、电子设备和存储介质 |
| CN113780606B (zh) * | 2020-09-23 | 2024-09-24 | 京东城市(北京)数字科技有限公司 | 模型训练方法、预测方法、装置、系统及存储介质 |
| CN113112795B (zh) * | 2021-04-06 | 2022-01-21 | 中移(上海)信息通信科技有限公司 | 一种路况预测方法、装置及设备 |
| CN113470356B (zh) * | 2021-06-28 | 2022-11-04 | 青岛海信网络科技股份有限公司 | 电子设备及区域路况预测方法 |
| CN115909711A (zh) * | 2021-08-05 | 2023-04-04 | 中移智行网络科技有限公司 | 交通拥堵指数预测方法、装置及相关设备 |
| CN114154711A (zh) * | 2021-11-30 | 2022-03-08 | 北京世纪新运交通运输科技应用研究所 | 交通信息推荐方法、装置、电子设备及计算机存储介质 |
| CN114550453B (zh) * | 2022-02-23 | 2023-09-26 | 阿里巴巴(中国)有限公司 | 模型训练方法、确定方法、电子设备及计算机存储介质 |
| CN114781243A (zh) * | 2022-03-22 | 2022-07-22 | 高德软件有限公司 | Eta预测及模型训练方法、设备、介质及产品 |
| CN116246460A (zh) * | 2022-12-27 | 2023-06-09 | 北京百度网讯科技有限公司 | 路段通行时长的确定方法、装置、电子设备和介质 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101438334A (zh) * | 2006-03-03 | 2009-05-20 | 因瑞克斯有限公司 | 未来交通状况的动态时序预测 |
| WO2018141403A1 (fr) * | 2017-02-03 | 2018-08-09 | Siemens Aktiengesellschaft | Système, dispositif et procédé de gestion de la circulation dans un lieu géographique |
| CN108734614A (zh) * | 2017-04-13 | 2018-11-02 | 腾讯科技(深圳)有限公司 | 交通拥堵预测方法及装置、存储介质 |
| CN109118014A (zh) * | 2018-08-30 | 2019-01-01 | 浙江工业大学 | 一种基于时间递归神经网络的交通流速度预测方法 |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6317686B1 (en) * | 2000-07-21 | 2001-11-13 | Bin Ran | Method of providing travel time |
| CN107230351B (zh) * | 2017-07-18 | 2019-08-09 | 福州大学 | 一种基于深度学习的短时交通流预测方法 |
| CN109000676B (zh) * | 2018-06-22 | 2021-11-09 | 东华大学 | 一种vanet环境下结合预测信息的路径规划方法 |
| CN109285346B (zh) * | 2018-09-07 | 2020-05-05 | 北京航空航天大学 | 一种基于关键路段的城市路网交通状态预测方法 |
| CN109300309A (zh) * | 2018-10-29 | 2019-02-01 | 讯飞智元信息科技有限公司 | 路况预测方法及装置 |
| CN109544911B (zh) * | 2018-10-30 | 2021-10-01 | 中山大学 | 一种基于lstm-cnn的城市路网交通状态预测方法 |
| CN109697852B (zh) * | 2019-01-23 | 2021-04-02 | 吉林大学 | 基于时序交通事件的城市道路拥堵程度预测方法 |
- 2019
- 2019-05-23 CN CN201910433966.7A patent/CN110675621B/zh active Active
- 2019-05-29 WO PCT/CN2019/088956 patent/WO2020232732A1/fr not_active Ceased
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112820103A (zh) * | 2020-12-31 | 2021-05-18 | 山东奥邦交通设施工程有限公司 | 一种一体化动静结合标志管控方法及系统 |
| CN112907953A (zh) * | 2021-01-27 | 2021-06-04 | 吉林大学 | 一种基于稀疏gps数据的公交行程时间预测方法 |
| CN112907953B (zh) * | 2021-01-27 | 2022-01-28 | 吉林大学 | 一种基于稀疏gps数据的公交行程时间预测方法 |
| CN113570867A (zh) * | 2021-09-26 | 2021-10-29 | 西南交通大学 | 一种城市交通状态预测方法、装置、设备及可读存储介质 |
| CN113570867B (zh) * | 2021-09-26 | 2021-12-07 | 西南交通大学 | 一种城市交通状态预测方法、装置、设备及可读存储介质 |
| US11754410B1 (en) | 2022-07-11 | 2023-09-12 | Chengdu Qinchuan Iot Technology Co., Ltd. | Methods and internet of things systems for determining government traffic routes in smart cities |
| CN115440038A (zh) * | 2022-08-31 | 2022-12-06 | 青岛海信网络科技股份有限公司 | 一种交通信息确定方法以及电子设备 |
| CN115440038B (zh) * | 2022-08-31 | 2023-11-03 | 青岛海信网络科技股份有限公司 | 一种交通信息确定方法以及电子设备 |
| CN115909748A (zh) * | 2023-01-07 | 2023-04-04 | 深圳市城市交通规划设计研究中心股份有限公司 | 节假日公路交通量预测方法、电子设备及存储介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN110675621A (zh) | 2020-01-10 |
| CN110675621B (zh) | 2021-01-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2020232732A1 (fr) | Systèmes et procédés de prévision d'informations de circulation | |
| US11024163B2 (en) | Systems and methods for monitoring traffic congestion | |
| US11003677B2 (en) | Systems and methods for location recommendation | |
| US11085792B2 (en) | Systems and methods for determining estimated time of arrival | |
| US20200011692A1 (en) | Systems and methods for recommending an estimated time of arrival | |
| CN111862585B (zh) | 用于交通预测的系统和方法 | |
| US11069247B2 (en) | Systems and methods for distributing a service request for an on-demand service | |
| US11398002B2 (en) | Systems and methods for determining an estimated time of arrival | |
| WO2018214361A1 (fr) | Systèmes et procédés pour l'amélioration de la prédiction d'indices et de la construction de modèles | |
| US11676485B2 (en) | Systems and methods for determining traffic information of a region | |
| US20190107405A1 (en) | Systems and methods for route planning | |
| EP3704645A1 (fr) | Systèmes et procédés permettant de déterminer une heure d'arrivée prévue pour des services en ligne à hors ligne | |
| US11004335B2 (en) | Systems and methods for speed prediction | |
| US20210088342A1 (en) | Systems and methods for navigation services | |
| US20220214185A1 (en) | Systems and methods for recommendation and display of point of interest | |
| US20200141741A1 (en) | Systems and methods for determining recommended information of a service request | |
| US20200158522A1 (en) | Systems and methods for determining a new route in a map | |
| WO2020019237A1 (fr) | Systèmes et procédés de distribution de fournisseurs de services | |
| WO2020093385A1 (fr) | Systèmes et procédés de détermination de relations de topologie de liaisons | |
| WO2020243963A1 (fr) | Systèmes et procédés de détermination d'informations recommandées de demande de service | |
| CN114041129A (zh) | 确定上车点名称的系统和方法 | |
| US11029166B2 (en) | Systems and methods for reserving a carpooling service | |
| US20210065548A1 (en) | Systems and methods for navigation based on intersection coding | |
| WO2021022487A1 (fr) | Systèmes et procédés de détermination d'une heure d'arrivée estimée | |
| WO2022126354A1 (fr) | Systèmes et procédés pour obtenir un horaire d'arrivée estimé dans des services en ligne à hors ligne |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19929273; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19929273; Country of ref document: EP; Kind code of ref document: A1 |