CN116168468A - Vehicle collision recognition method and device - Google Patents
- Publication number
- CN116168468A (application number CN202111403344.3A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- data
- collision
- time
- battery pack
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0808—Diagnosing performance data
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/60—Other road transportation technologies with climate change mitigation effect
- Y02T10/70—Energy storage systems for electromobility, e.g. batteries
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electric Propulsion And Braking For Vehicles (AREA)
- Traffic Control Systems (AREA)
Abstract
A vehicle collision recognition method and apparatus are disclosed. The method includes: receiving vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low-voltage battery voltage data, and battery pack relay status data; determining, based on the battery pack relay status data, a point in time at which a collision is likely to have occurred; acquiring vehicle data within a predetermined period that includes the determined point in time; and determining, based on the acquired vehicle data, whether the vehicle has collided by using a deep learning model. If a collision is determined, the point in time at which the collision is likely to have occurred is determined as the collision time point.
Description
Technical Field
The present disclosure relates to a vehicle collision recognition method and apparatus.
Background
Current vehicle collision recognition methods install an on-board central processing unit and a sensing module in the vehicle: the sensing module collects sensing data, and the on-board central processing unit runs a collision recognition process on the collected data, typically using a method such as a fixed threshold.
Disclosure of Invention
According to a first aspect of the present disclosure, there is provided a vehicle collision recognition method, the method comprising: receiving vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low-voltage battery voltage data, and battery pack relay status data; determining, based on the battery pack relay status data, a point in time at which a collision is likely to have occurred; acquiring vehicle data within a predetermined period that includes the determined point in time; and determining, based on the acquired vehicle data, whether the vehicle has collided by using a deep learning model, wherein, if a collision is determined, the point in time at which the collision is likely to have occurred is determined as the collision time point.
According to a second aspect of the present disclosure, there is provided an apparatus for vehicle collision recognition, the apparatus comprising: means for performing the vehicle collision recognition method according to the first aspect.
According to a third aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the vehicle collision recognition method according to the first aspect to be performed.
According to a fourth aspect of the present disclosure, there is provided a computer program product storing instructions that, when executed by a processor, cause the vehicle collision recognition method according to the first aspect to be performed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain, without limitation, the principles of the disclosure. Like reference numerals are used to denote like items throughout the various figures.
Fig. 1 is a block diagram of an exemplary vehicle collision recognition device according to some embodiments of the present disclosure.
Fig. 2 is a flowchart illustrating an exemplary vehicle collision recognition method according to some embodiments of the present disclosure.
Fig. 3 is a flowchart illustrating an exemplary process of determining a potential collision time point according to some embodiments of the present disclosure.
Fig. 4 is a diagram illustrating a general hardware environment in which the present disclosure may be applied, according to some embodiments of the present disclosure.
Detailed Description
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the described exemplary embodiments. It will be apparent, however, to one skilled in the art that the described embodiments may be practiced without some or all of these specific details. In the described exemplary embodiments, well known structures or processing steps have not been described in detail in order to avoid unnecessarily obscuring the concepts of the present disclosure.
The blocks within each block diagram shown below may be implemented by hardware, software, firmware, or any combination thereof to implement the principles of the present disclosure. It will be appreciated by those skilled in the art that the blocks described in each block diagram may be combined or divided into sub-blocks to implement the principles of the present disclosure.
The steps of the methods presented in this disclosure are intended to be illustrative. In some embodiments, the method may be accomplished with one or more additional steps not described and/or without one or more of the steps discussed.
In addition, in the description of the present disclosure, the terms "first," "second," "third," etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or order. Similarly, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous.
Fig. 1 is a block diagram of an exemplary vehicle collision recognition device according to some embodiments of the present disclosure.
As shown in fig. 1, the apparatus 100 may include: a receiving component 110 configured to receive vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low-voltage battery voltage data, and battery pack relay status data; a potential collision time point determining component 120 configured to determine, based on the battery pack relay status data, a point in time at which a collision is likely to have occurred; a deep learning identification component 130 configured to acquire vehicle data within a predetermined period that includes the determined point in time and to determine, based on the acquired data, whether the vehicle has collided by using a deep learning model, the point in time at which the collision is likely to have occurred being determined as the collision time point when a collision is determined; and a collision recognition result providing component 140 configured to provide an after-market vehicle service provider with a collision recognition result that includes information indicating that a collision has occurred and the collision time point.
The operation of the various components shown in fig. 1 will be described in further detail below.
The method 200 illustrated in fig. 2 begins at step S210, where the receiving component 110 receives vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low-voltage battery voltage data, and battery pack relay status data. In this disclosure, "vehicle" refers to an electric vehicle.
At least a portion of the vehicle data, such as accelerator pedal position data, vehicle speed data, low voltage battery voltage data, and battery pack relay status data, may be collected using on-board sensors, while driver demand torque data may be obtained by calculation.
More specifically: accelerator pedal position data, indicating the position of the accelerator pedal, may be acquired by a sensor such as an accelerator pedal position sensor; vehicle speed data, indicating the speed of the vehicle, may be obtained by a sensor such as a speed sensor; low-voltage battery voltage data may be obtained by a sensor such as a voltage sensor, where the low-voltage battery typically powers low-power systems in the vehicle such as the door control system, the vehicle starting system, the lamp system, and the instrument display system; battery pack relay status data may be obtained through a relay status sensor and indicates whether the main positive relay or the main negative relay in the battery pack is open or closed; and driver demand torque data, indicating the torque demanded by the driver, may be calculated from, for example, accelerator pedal position data and vehicle speed data. Note that at the same accelerator pedal position, different drivers may demand different torque.
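The text above notes that driver demand torque may be calculated from accelerator pedal position and vehicle speed. A common way to do this in practice is a pedal-map lookup with interpolation; the sketch below illustrates the idea. The map values, axis breakpoints, and function names are illustrative assumptions, not values from this disclosure.

```python
import bisect

def _interp(axis, values, x):
    """Piecewise-linear interpolation over sorted breakpoints, clamped at the ends."""
    if x <= axis[0]:
        return values[0]
    if x >= axis[-1]:
        return values[-1]
    j = bisect.bisect_right(axis, x)
    t = (x - axis[j - 1]) / (axis[j] - axis[j - 1])
    return values[j - 1] + t * (values[j] - values[j - 1])

def demand_torque(pedal_pct, speed_kmh):
    """Bilinear pedal-map lookup: pedal position (%) and speed (km/h) to torque (Nm).
    The map below is purely illustrative."""
    pedal_axis = [0.0, 50.0, 100.0]
    speed_axis = [0.0, 60.0, 120.0]
    torque_map = [
        [0.0, 0.0, 0.0],        # pedal 0 %
        [150.0, 120.0, 90.0],   # pedal 50 %
        [300.0, 250.0, 180.0],  # pedal 100 %
    ]
    # Interpolate each pedal row along the speed axis, then along the pedal axis.
    row_values = [_interp(speed_axis, row, speed_kmh) for row in torque_map]
    return _interp(pedal_axis, row_values, pedal_pct)

print(demand_torque(75.0, 0.0))  # 225.0
```

A production controller would calibrate such a map per vehicle variant and typically add filtering and torque limiting on top of the raw lookup.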
In some embodiments, the time interval for collecting vehicle data is 2 seconds. In other embodiments, the time interval for collecting vehicle data is 5 seconds. The time interval at which the vehicle data is acquired is not limited thereto, and may be changed according to factors such as the in-vehicle sensor performance, the vehicle data transmission speed, and the like.
In some embodiments, the receiving component 110 may receive vehicle data from a vehicle via, for example, cellular internet of vehicles (LTE-V2X). In this case, the apparatus as shown in fig. 1 may be implemented by a server in the cloud. In other embodiments, the receiving component 110 may receive vehicle data from the vehicle via a data transmission line within the vehicle. In this case, the apparatus as shown in fig. 1 may be implemented by a processor on board the vehicle.
Next, the method 200 proceeds to step S220, at which step S220 the potential collision time point determining section 120 determines a time point at which a collision is likely to occur based on the battery pack relay state data.
The processing at step S220 is described below with reference to fig. 3, which illustrates a flowchart of an exemplary process of determining a potential collision time point. At step S310, the potential collision time point determining component 120 determines, based on the battery pack relay status data (which consists of a plurality of data samples), a first dynamic representation over a first time period before a target time point and a second dynamic representation over a second time period after the target time point. The first time period is the time required to collect a predetermined number of data samples before the target time point, and the second time period is the time required to collect the predetermined number of data samples after the target time point; both dynamic representations indicate how the battery pack relay state changes over time.
At step S320, a point in time at which a collision is likely to have occurred is determined based on the first dynamic representation and the second dynamic representation. This includes determining whether the following preset conditions are all satisfied and, if they are, determining the first time point within the first time period at which the battery pack relay state is open as the point in time at which a collision is likely to have occurred:

the data sample at the target time point indicates that the battery pack relay state is open;

the first dynamic representation indicates that the data samples representing an open battery pack relay state account for more than 80% of the total data samples in the first time period; and

the second dynamic representation indicates that the data samples representing an open battery pack relay state account for more than 60% of the total data samples in the second time period, and the vehicle speed at the target time point is less than a preset vehicle speed.
In one specific example, consider the following data samples of the battery pack relay status data: "1, -1, -1, -1, -1, -1, -1, -1, -1, -1, …", where "1" indicates that the battery pack relay state is closed (on) and "-1" indicates that it is open (off).
The first dynamic representation and the second dynamic representation in this particular example may be determined by the following expressions (1) and (2):

OFF_i = m_{i-k+1} + m_{i-k+2} + … + m_i    (1)

ON_i = m_i + m_{i+1} + … + m_{i+k-1}    (2)

where m_j denotes the value of the j-th data sample of the battery pack relay status data (j an integer greater than zero), k denotes the predetermined number (here, k = 5), OFF_i represents the first dynamic representation, and ON_i represents the second dynamic representation.
When the target time point is the one corresponding to the 1st data sample, there are not yet k = 5 data samples in the first time period, although there are k data samples in the second time period; OFF_i therefore cannot be obtained, while ON_i equals -3. Similarly, when the target time point corresponds to the 2nd, 3rd, or 4th data sample, OFF_i cannot be obtained, and ON_i equals -5 in each case.

When the target time point is the one corresponding to the 5th data sample, there are k data samples in both the first time period and the second time period, so both values can be obtained: OFF_i equals -3 and ON_i equals -5.

When the target time point corresponds to the 6th, 7th, 8th, or a later data sample, the situation is analogous to the above and is not described again.
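Expressions (1) and (2) appear to sum a window of k samples ending at, and starting at, the target sample respectively; under that reading (an assumption consistent with the OFF_i = -3 and ON_i = -5 values in the example), the walkthrough can be reproduced as follows:

```python
# Sketch of the first/second dynamic representations as window sums over the
# battery pack relay samples (1 = relay closed, -1 = relay open). The exact
# window boundaries are an assumption consistent with the example's numbers.

def window_sums(samples, i, k=5):
    """Return (OFF_i, ON_i) for the 1-indexed target sample i; a window that
    does not yet contain k samples yields None for that value."""
    off = sum(samples[i - k:i]) if i >= k else None  # samples i-k+1 .. i
    on = sum(samples[i - 1:i - 1 + k]) if i - 1 + k <= len(samples) else None  # samples i .. i+k-1
    return off, on

samples = [1, -1, -1, -1, -1, -1, -1, -1, -1, -1]

print(window_sums(samples, 1))  # (None, -3): OFF_1 unavailable, as in the text
print(window_sums(samples, 5))  # (-3, -5)
```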
Based on the calculated OFF_i and ON_i, it may be determined whether the following conditions are satisfied; if they are, the first time point within the first time period at which the battery pack relay state is open is determined as the point in time at which a collision is likely to have occurred:

a) the data sample at the target time point indicates that the battery pack relay state is open;

b) OFF_i < -2;

c) ON_i < 0, and the actual vehicle speed is less than the preset vehicle speed, which here may be, for example, 20 m/s.
When the target time point is the one corresponding to the 5th data sample, conditions a) through c) are all satisfied, so the time point corresponding to the 2nd data sample, which is the first time point within the first time period at which the battery pack relay state is open, is determined as the point in time at which a collision is likely to have occurred.
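The screening step can be sketched end to end: scan the targets for which both windows exist, test conditions a) through c), and report the first open sample in the first window. The window boundaries and the per-sample speed list are simplifying assumptions; the disclosure leaves those details to the implementation.

```python
# Hedged sketch of steps S310-S320: find the potential collision time point.
# samples: relay states (1 = closed, -1 = open); speeds: vehicle speed in m/s
# at each sample time. Returns the 1-indexed sample of the potential collision
# time point, or None if no target satisfies conditions a)-c).

def find_potential_collision(samples, speeds, k=5, speed_limit=20.0):
    n = len(samples)
    for i in range(k, n - k + 2):                  # 1-indexed targets with full windows
        off_i = sum(samples[i - k:i])              # first dynamic representation
        on_i = sum(samples[i - 1:i - 1 + k])       # second dynamic representation
        if (samples[i - 1] == -1                   # a) target sample shows relay open
                and off_i < -2                     # b)
                and on_i < 0                       # c), first part
                and speeds[i - 1] < speed_limit):  # c), second part
            first_window = samples[i - k:i]
            return i - k + first_window.index(-1) + 1  # first open sample, 1-indexed
    return None

samples = [1, -1, -1, -1, -1, -1, -1, -1, -1, -1]
speeds = [5.0] * len(samples)                      # all below the 20 m/s example limit
print(find_potential_collision(samples, speeds))   # 2, matching the example
```

As in the worked example, the 5th sample satisfies all three conditions, and the 2nd sample, the first open state within its first window, is reported as the potential collision time point.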
It should be appreciated that the first time period may include, or be adjacent to, the target time point; the same holds for the second time period. In this example, when the time interval for collecting vehicle data is 2 seconds, the first time period and the second time period are each 10 seconds; when the interval is 5 seconds, they are each 25 seconds.
It should be understood that while in this example the battery pack relay state value is-1 when the battery pack relay state is off and 1 when the battery pack relay state is on, the disclosure is not limited thereto and the battery pack relay state value may be represented by other values.
It should be understood that, although the first and second dynamic representations are obtained here by the summations in expressions (1) and (2), the disclosure is not limited thereto; they may also be obtained by other calculation methods such as weighted summation or integration.
It should be appreciated that while in this example the first time period is the same as the second time period in length, the present disclosure is not so limited and the first time period and the second time period may be different in length. Although the predetermined number is 5 in this example, it is not limited thereto, and the predetermined number may be an integer value between 4 and 20.
It will be appreciated that the above condition c) excludes the relay state transitions that occur at vehicle start-up.
It should be appreciated that the point in time at which a collision may occur may be determined in the onboard central processor, or vehicle data may be received via, for example, a cellular internet of vehicles to determine the point in time at which a collision may occur at a server in the cloud.
Step S220 (i.e., step S310 and step S320) determines a time point at which a collision is likely to occur based on the battery pack relay status data, and can screen out data related to a potential collision time point from a huge amount of input data, thereby reducing the computational load of the subsequent machine learning process.
Next, the method 200 proceeds to step S230, at which the deep learning identification component 130 acquires the vehicle data (accelerator pedal position data, vehicle speed data, driver demand torque data, low-voltage battery voltage data, and battery pack relay status data) within a predetermined period that includes the determined point in time and determines, based on the acquired data, whether the vehicle has collided by using a deep learning model. If a collision is determined, the point in time at which the collision is likely to have occurred is determined as the collision time point.

Specifically, the input to the deep learning model is the vehicle data listed above for the predetermined period including the determined point in time; the output is a recognition result indicating whether the vehicle has collided and, when a collision is determined, the collision time point.
The deep learning model may be, for example, an attention-based bidirectional long short-term memory (Bi-LSTM) network, a convolutional neural network (CNN), or a generative adversarial network (GAN).
In some embodiments, the predetermined period may extend from 1 minute before the point in time at which the collision is likely to have occurred to 1 minute after it. The present disclosure is not limited thereto, and the predetermined period may be changed according to actual needs.
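As a concrete sketch of how the model input described above might be assembled: the five signals over the window from 1 minute before to 1 minute after the potential collision point, sampled every 2 seconds, stack into a (timesteps, features) array. The signal names and array layout are assumptions for illustration; the disclosure does not fix them.

```python
import numpy as np

def build_model_input(signals, t_collision, interval_s=2, window_s=60):
    """Stack the five vehicle signals over a +/- window_s window around the
    potential collision sample into a (timesteps, 5) array.
    signals: dict of name -> 1-D array sampled every interval_s seconds."""
    half = window_s // interval_s                 # samples on each side of the point
    names = ["pedal", "speed", "torque", "lv_voltage", "relay"]  # assumed order
    lo, hi = t_collision - half, t_collision + half + 1
    return np.stack([signals[n][lo:hi] for n in names], axis=1)

# Dummy signals just to show the resulting input shape.
signals = {n: np.zeros(200) for n in ["pedal", "speed", "torque", "lv_voltage", "relay"]}
x = build_model_input(signals, t_collision=100)
print(x.shape)  # (61, 5): 30 samples before, the point itself, 30 samples after
```

With a 2 s sampling interval, the 1-minute-each-side window yields 61 timesteps, a natural sequence length for a Bi-LSTM or CNN classifier.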
It should be appreciated that the deep learning model described above is pre-trained. Training of deep learning models is known and will not be described in detail herein.
It should be appreciated that vehicle data may be received via, for example, a cellular internet of vehicles to determine whether a vehicle is crashed using a deep learning model at a server in the cloud.
Compared with methods such as a fixed threshold, using a deep learning model can significantly improve the accuracy of vehicle collision recognition. In addition, performing the recognition processing in the cloud not only enables high-accuracy recognition but also eliminates the need to install a processor with high processing power on the vehicle.
Next, the method 200 proceeds to step S240, and at step S240, the collision recognition result providing section 140 provides the collision recognition result including the information indicating that the collision has occurred and the point of time of the collision to the after-market vehicle service provider. The after-market vehicle services may include maintenance, rescue, insurance, etc.
In some embodiments, the collision recognition result providing component 140 exposes the collision recognition result, which includes information indicating that a collision has occurred and the collision time point, to the after-market vehicle service provider through a representational state transfer application programming interface (RESTful API).

Alternatively, the collision recognition result may be exposed to the after-market vehicle service provider through, for example, a Simple Object Access Protocol application programming interface (SOAP API).
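A minimal sketch of what the RESTful result payload might look like, assuming JSON serialization: the field names, the vehicle identifier, and the timestamp format are illustrative, since the disclosure only specifies that the result carries a collision indication and the collision time point.

```python
import json

# Hypothetical payload for the collision recognition result; field names
# and values are illustrative assumptions, not defined by the disclosure.
result = {
    "vehicle_id": "VIN-EXAMPLE",                 # hypothetical identifier
    "collision_detected": True,                  # information indicating a collision
    "collision_time": "2021-11-24T08:30:14Z",    # the determined collision time point
}
payload = json.dumps(result)

print(json.loads(payload)["collision_detected"])  # True
```

An after-market service provider consuming such an endpoint could poll it or subscribe to it and trigger maintenance, rescue, or insurance workflows from the two required fields.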
Providing the after-market vehicle service provider with a collision recognition result that contains both the collision indication and the collision time point allows services such as vehicle maintenance, rescue, and insurance to be offered promptly, improving the experience of the vehicle's driver.
Exemplary vehicle collision recognition methods and apparatus according to the present disclosure have been described above with reference to figs. 1, 2, and 3. The disclosed method can screen the data related to potential collision time points out of a massive amount of input data, reducing the computational load of the machine learning processing; performing the recognition processing in the cloud removes the need for a high-performance on-board processor; and sharing the recognition result with the after-market service provider promptly after a collision allows services such as maintenance, rescue, and insurance to be provided in time, improving the driver's experience.
Hardware implementation
Fig. 4 illustrates a general hardware environment 400 in which the present disclosure may be applied, according to an exemplary embodiment of the present disclosure.
With reference to fig. 4, a computing device 400 will now be described as an example of a hardware device applicable to aspects of the present disclosure. Computing device 400 may be any machine configured to perform processes and/or calculations and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a smart phone, a portable camera, or any combination thereof. The apparatus 100 described above may be implemented in whole or at least in part by a computing device 400 or similar device or system.
The software elements may reside in a working memory 414 including, but not limited to, an operating system 416, one or more application programs 418, drivers, and/or other data and code. Instructions for performing the above-described methods and steps may be included in one or more applications 418, and components of the apparatus 100 described above may be implemented by the processor 404 reading and executing the instructions of the one or more applications 418. More specifically, the receiving component 110 may be implemented, for example, by the processor 404 upon execution of the application 418 having instructions to perform step S210. The potential collision time point determining section 120 may be implemented, for example, by the processor 404 when executing the application 418 having instructions to perform step S220 (or steps S310 and S320). The deep learning identification component 130 may be implemented, for example, by the processor 404 upon execution of the application 418 having instructions to perform step S230. The collision recognition result providing section 140 may be implemented, for example, by the processor 404 when executing the application 418 having instructions to perform step S240. Executable code or source code of instructions of the software elements may be stored in a non-transitory computer readable storage medium, such as the storage device(s) 410 described above, and may be read into working memory 414, possibly compiled and/or installed. Executable code or source code for the instructions of the software elements may also be downloaded from a remote location.
From the above embodiments, it is apparent to those skilled in the art that the present disclosure may be implemented by software together with the necessary hardware, or by hardware, firmware, and the like. Based on this understanding, embodiments of the present disclosure may be implemented, in part, in software. The computer software may be stored in a computer-readable storage medium such as a floppy disk, hard disk, optical disc, or flash memory, and includes a series of instructions that cause a computer (for example, a personal computer, a server, or a network terminal) to perform a method according to the various embodiments of the present disclosure, or a portion thereof.
Having thus described the present disclosure, it is clear that the present disclosure can be varied in a number of ways. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure, and all such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the following claims.
Claims (11)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202111403344.3A CN116168468B (en) | 2021-11-24 | 2021-11-24 | Vehicle collision identification method and device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN116168468A true CN116168468A (en) | 2023-05-26 |
| CN116168468B CN116168468B (en) | 2024-12-10 |
Family
ID=86418699
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202111403344.3A Active CN116168468B (en) | 2021-11-24 | 2021-11-24 | Vehicle collision identification method and device |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN116168468B (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109591603A (en) * | 2019-01-04 | 2019-04-09 | 北京长城华冠汽车科技股份有限公司 | Control method and system after electric car and its collision |
| CN110775057A (en) * | 2019-08-29 | 2020-02-11 | 浙江零跑科技有限公司 | Lane assisting method for analyzing and controlling steering torque based on vehicle-mounted blind zone visual scene |
| US20210089938A1 (en) * | 2019-09-24 | 2021-03-25 | Ford Global Technologies, Llc | Vehicle-to-everything (v2x)-based real-time vehicular incident risk prediction |
| CN112749210A (en) * | 2021-01-18 | 2021-05-04 | 优必爱信息技术(北京)有限公司 | Vehicle collision recognition method and system based on deep learning |
| CN112959895A (en) * | 2021-03-28 | 2021-06-15 | 大运汽车股份有限公司 | Finished automobile control method of pure electric commercial vehicle |
- 2021-11-24: application CN202111403344.3A granted as CN116168468B (en), status Active
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN118536027A (en) * | 2024-04-30 | 2024-08-23 | 明觉科技(北京)有限公司 | Vehicle collision accident detection method, device, system and computer readable medium |
| CN118536027B (en) * | 2024-04-30 | 2025-02-11 | 明觉科技(北京)有限公司 | Vehicle collision accident detection method, device, system and computer readable medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN110929799B (en) | Method, electronic device, and computer-readable medium for detecting abnormal user | |
| JP2021509978A (en) | Driving behavior evaluation method, device and computer-readable storage medium | |
| CN108959247B (en) | Data processing method, server and computer readable medium | |
| CN111862945A (en) | A speech recognition method, device, electronic device and storage medium | |
| US20190147540A1 (en) | Method and apparatus for outputting information | |
| CN111222051B (en) | Training method and device for trend prediction model | |
| CN113222050A (en) | Image classification method and device, readable medium and electronic equipment | |
| CN114103944B (en) | Method, device and equipment for adjusting time interval between workshops | |
| CN110069997B (en) | Scene classification method and device and electronic equipment | |
| CN116168468B (en) | Vehicle collision identification method and device | |
| CN110555861B (en) | Optical flow calculation method and device and electronic equipment | |
| CN113111692A (en) | Target detection method and device, computer readable storage medium and electronic equipment | |
| CN110060477B (en) | Method and device for pushing information | |
| CN116645956A (en) | Speech synthesis method, speech synthesis system, electronic device and storage medium | |
| CN114462502B (en) | A method and device for training a core recommendation model | |
| CN115171718A (en) | Specific bird identification method and device and storage medium | |
| CN112036519B (en) | Multi-bit sigmoid-based classification processing method and device and electronic equipment | |
| CN112487876B (en) | Intelligent pen character recognition method and device and electronic equipment | |
| CN110378936B (en) | Optical flow calculation method and device and electronic equipment | |
| CN111651686B (en) | Test processing method and device, electronic equipment and storage medium | |
| CN107888652A (en) | Result abnormal detector, detection program, detection method and moving body | |
| JP2025528640A (en) | Image dataset processing method, device, equipment and storage medium | |
| CN111798591B (en) | Method and device for determining total mileage of vehicle, computer equipment and storage medium | |
| CN111738311A (en) | Multitask-oriented feature extraction method and device and electronic equipment | |
| CN114969224B (en) | Vehicle test driving reservation method and device, electronic equipment and medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||