
CN116168468A - Vehicle collision recognition method and device - Google Patents


Info

Publication number
CN116168468A
CN116168468A (application CN202111403344.3A)
Authority
CN
China
Prior art keywords
vehicle
data
collision
time
battery pack
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111403344.3A
Other languages
Chinese (zh)
Other versions
CN116168468B (en)
Inventor
陆顺
王峰
赵龙刚
钱兵
项超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Telecom Corp Ltd
Original Assignee
China Telecom Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Telecom Corp Ltd filed Critical China Telecom Corp Ltd
Priority to CN202111403344.3A priority Critical patent/CN116168468B/en
Publication of CN116168468A publication Critical patent/CN116168468A/en
Application granted granted Critical
Publication of CN116168468B publication Critical patent/CN116168468B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00Registering or indicating the working of vehicles
    • G07C5/08Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0808Diagnosing performance data
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/60Other road transportation technologies with climate change mitigation effect
    • Y02T10/70Energy storage systems for electromobility, e.g. batteries

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electric Propulsion And Braking For Vehicles (AREA)
  • Traffic Control Systems (AREA)

Abstract

A vehicle collision recognition method and apparatus are disclosed. The vehicle collision recognition method includes: receiving vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low voltage battery voltage data, and battery pack relay status data; determining a point in time at which a collision is likely to occur based on the battery pack relay status data; vehicle data within a predetermined period of time including the determined point in time is acquired, and based on the acquired vehicle data, whether the vehicle collides is determined by using a deep learning model, wherein in the case where the collision is determined, the point in time at which the collision is likely to occur is determined as a collision point in time.

Description

Vehicle collision recognition method and device
Technical Field
The present disclosure relates to a vehicle collision recognition method and apparatus.
Background
In current vehicle collision recognition methods, an on-board central processing unit and a sensing module are installed in the vehicle. The sensing module collects sensing data, and the on-board central processing unit performs a vehicle collision recognition process on the collected sensing data, more specifically, using a method such as a fixed-threshold method.
Disclosure of Invention
According to a first aspect of the present disclosure, there is provided a vehicle collision recognition method, the method comprising: receiving vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low voltage battery voltage data, and battery pack relay status data; determining a point in time at which a collision is likely to occur based on the battery pack relay status data; vehicle data within a predetermined period of time including the determined point in time is acquired, and based on the acquired vehicle data, whether the vehicle collides is determined by using a deep learning model, wherein in the case where the collision is determined, the point in time at which the collision is likely to occur is determined as a collision point in time.
According to a second aspect of the present disclosure, there is provided an apparatus for vehicle collision recognition, the apparatus comprising: means for performing the vehicle collision recognition method according to the first aspect.
According to a third aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the vehicle collision recognition method according to the first aspect to be performed.
According to a fourth aspect of the present disclosure, there is provided a computer program product storing instructions that, when executed by a processor, cause the vehicle collision recognition method according to the first aspect to be performed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain, without limitation, the principles of the disclosure. Like reference numerals are used to denote like items throughout the various figures.
Fig. 1 is a block diagram of an exemplary vehicle collision recognition device according to some embodiments of the present disclosure.
Fig. 2 is a flowchart illustrating an exemplary vehicle collision recognition method according to some embodiments of the present disclosure.
Fig. 3 is a flowchart illustrating an exemplary process of determining a potential collision time point according to some embodiments of the present disclosure.
Fig. 4 is a diagram illustrating a general hardware environment in which the present disclosure may be applied, according to some embodiments of the present disclosure.
Detailed Description
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the described exemplary embodiments. It will be apparent, however, to one skilled in the art that the described embodiments may be practiced without some or all of these specific details. In the described exemplary embodiments, well known structures or processing steps have not been described in detail in order to avoid unnecessarily obscuring the concepts of the present disclosure.
The blocks within each block diagram shown below may be implemented by hardware, software, firmware, or any combination thereof to implement the principles of the present disclosure. It will be appreciated by those skilled in the art that the blocks described in each block diagram may be combined or divided into sub-blocks to implement the principles of the present disclosure.
The steps of the methods presented in this disclosure are intended to be illustrative. In some embodiments, the method may be accomplished with one or more additional steps not described and/or without one or more of the steps discussed.
In addition, in the description of the present disclosure, the terms "first," "second," "third," etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or order. Similarly, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous.
Fig. 1 is a block diagram of an exemplary vehicle collision recognition device according to some embodiments of the present disclosure.
As shown in fig. 1, the apparatus 100 may include: a receiving component 110 configured to receive vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low voltage battery voltage data, and battery pack relay status data; a potential collision time point determining section 120 configured to determine a time point at which a collision is likely to occur based on the battery pack relay status data; a deep learning identification section 130 configured to acquire vehicle data within a predetermined time period including the determined time point and to determine, based on the acquired vehicle data and by using a deep learning model, whether the vehicle has collided, wherein, in a case where a collision is determined, the time point at which the collision is likely to occur is determined as the collision time point; and a collision recognition result providing section 140 configured to provide an after-market vehicle service provider with a collision recognition result including information indicating that a collision has occurred and the collision time point.
The operation of the various components shown in fig. 1 will be described in further detail below.
The method 200 illustrated in fig. 2 begins at step S210, where the receiving component 110 receives vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low voltage battery voltage data, and battery pack relay status data. In this disclosure, a vehicle refers to an electric vehicle.
At least a portion of the vehicle data, such as accelerator pedal position data, vehicle speed data, low voltage battery voltage data, and battery pack relay status data, may be collected using on-board sensors, while driver demand torque data may be obtained by calculation.
More specifically, accelerator pedal position data indicating the position of the accelerator pedal may be acquired by a sensor such as an accelerator pedal position sensor; vehicle speed data, which indicates the speed of the vehicle, may be obtained by a sensor such as a speed sensor; low voltage battery voltage data may be obtained by a sensor such as a voltage sensor, where the low voltage battery is typically used for low-power systems in the vehicle, such as the door control system, vehicle starting system, lamp system, and instrument display system; battery pack relay status data may be obtained by a relay status sensor and indicates whether the main positive relay or main negative relay in the battery pack is open or closed; driver demand torque data, which indicates the torque demanded by the driver, may be calculated from, for example, the accelerator pedal position data and the vehicle speed data, noting that different drivers may demand different torque at, for example, the same accelerator pedal position.
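The driver demand torque calculation mentioned above can be sketched as a two-dimensional lookup over a pedal map with bilinear interpolation — a common technique, though the patent does not specify the exact method. All axis values and torque figures below are invented for illustration; real maps are calibrated per vehicle.

```python
import bisect

# Hypothetical pedal map: rows are accelerator pedal positions (%), columns
# are vehicle speeds (m/s); cell values are demanded torque in Nm.
PEDAL_AXIS = [0.0, 50.0, 100.0]
SPEED_AXIS = [0.0, 15.0, 30.0]
TORQUE_MAP = [
    [0.0,   0.0,   0.0],    # pedal released
    [120.0, 100.0, 80.0],   # half pedal
    [250.0, 220.0, 180.0],  # full pedal
]

def _interp(axis, x):
    """Clamp x to the axis range; return (lower cell index, fraction in cell)."""
    x = min(max(x, axis[0]), axis[-1])
    j = min(bisect.bisect_right(axis, x) - 1, len(axis) - 2)
    t = (x - axis[j]) / (axis[j + 1] - axis[j])
    return j, t

def driver_demand_torque(pedal_pct, speed):
    """Bilinear interpolation of the torque map at (pedal_pct, speed)."""
    i, u = _interp(PEDAL_AXIS, pedal_pct)
    j, v = _interp(SPEED_AXIS, speed)
    top = TORQUE_MAP[i][j] * (1 - v) + TORQUE_MAP[i][j + 1] * v
    bot = TORQUE_MAP[i + 1][j] * (1 - v) + TORQUE_MAP[i + 1][j + 1] * v
    return top * (1 - u) + bot * u

print(driver_demand_torque(50.0, 15.0))  # 100.0, a grid point of the map
```

This also illustrates why different operating points yield different torque for the same pedal position: the map's columns vary with vehicle speed.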
In some embodiments, the time interval for collecting vehicle data is 2 seconds. In other embodiments, the time interval for collecting vehicle data is 5 seconds. The time interval at which the vehicle data is acquired is not limited thereto, and may be changed according to factors such as the in-vehicle sensor performance, the vehicle data transmission speed, and the like.
In some embodiments, the receiving component 110 may receive vehicle data from a vehicle via, for example, cellular internet of vehicles (LTE-V2X). In this case, the apparatus as shown in fig. 1 may be implemented by a server in the cloud. In other embodiments, the receiving component 110 may receive vehicle data from the vehicle via a data transmission line within the vehicle. In this case, the apparatus as shown in fig. 1 may be implemented by a processor on board the vehicle.
Next, the method 200 proceeds to step S220, at which step S220 the potential collision time point determining section 120 determines a time point at which a collision is likely to occur based on the battery pack relay state data.
The processing at step S220 is described below with reference to fig. 3, which illustrates a flowchart of an exemplary process of determining a potential collision time point. At step S310, the potential collision time point determining part 120 determines, based on the battery pack relay state data, a first dynamic representation in a first time period before the target time point and a second dynamic representation in a second time period after the target time point. The battery pack relay state data includes a plurality of data samples; the first time period indicates the time required to collect a predetermined number of data samples before the target time point, and the second time period indicates the time required to collect a predetermined number of data samples after the target time point. The first dynamic representation and the second dynamic representation indicate dynamic changes in the battery pack relay state.
At step S320, a time point at which a collision is likely to occur is determined based on the first dynamic representation and the second dynamic representation. The determination includes judging whether the following preset conditions are satisfied and, if they are, determining the first time point at which the battery pack relay state is open within the first time period as the time point at which a collision is likely to occur:
a) the data sample at the target time point indicates that the battery pack relay status is open;
b) the first dynamic representation indicates that the data samples representing an open battery pack relay status during the first time period account for more than 80% of the total data samples in the first time period; and
c) the second dynamic representation indicates that the data samples representing an open battery pack relay status during the second time period account for more than 60% of the total data samples in the second time period, and the vehicle speed at the target time point is less than a preset vehicle speed.
In one specific example, consider the following data samples of the battery pack relay state data: "1, -1, -1, -1, -1, -1, -1, -1, -1, -1, …", where "1" indicates that the battery pack relay state is on (closed) and "-1" indicates that the battery pack relay state is off (open).
The first dynamic representation and the second dynamic representation in this particular example may be determined by the following expressions (1) and (2):

OFF_i = \sum_{j=i-k+1}^{i} m_j    (1)

ON_i = \sum_{j=i}^{i+k-1} m_j    (2)

where m_j denotes the value of the j-th data sample of the battery pack relay state data, j being an integer greater than zero; k denotes the predetermined number, here k = 5; OFF_i denotes the first dynamic representation at target time point i, and ON_i denotes the second dynamic representation at target time point i.
When the target time point is the time point corresponding to the 1st data sample, since the predetermined number is 5, there is not a predetermined number of data samples in the first time period while there is a predetermined number in the second time period; OFF_1 therefore cannot be obtained, while ON_1 is -3. Similarly, when the target time point corresponds to the 2nd, 3rd, or 4th data sample, OFF_i cannot be obtained within the first time period, while ON_i is -5 in each case.
When the target time point is the time point corresponding to the 5th data sample, there is a predetermined number of data samples in both the first and the second time period, so both OFF_i and ON_i can be obtained: OFF_5 is -3 and ON_5 is -5.
When the target time point corresponds to the 6th, 7th, 8th, etc. data sample, the situation is similar to the above and the description is omitted.
Based on the calculated OFF_i and ON_i, it may be determined whether the following conditions are satisfied; if they are, the first time point at which the battery pack relay state is open within the first time period is determined as the time point at which a collision is likely to occur:
a) The data samples at the target time points indicate that the battery pack relay status is open;
b) OFF_i < -2;
c) ON_i < 0, and the actual vehicle speed is less than the preset vehicle speed, which here may be, for example, 20 m/s.
When the target time point is the time point corresponding to the 5 th data sample, the above conditions a) -c) are satisfied, and thus the time point corresponding to the second data sample, which is the first time point when the battery pack relay state is off in the first period, is determined as the time point at which collision is likely to occur.
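The worked example above can be sketched in Python. This is a minimal illustration rather than the patented implementation: the data samples, k = 5, and the thresholds in conditions b) and c) follow the example, while the vehicle speeds are assumed values below the 20 m/s preset.

```python
def dynamic_representations(samples, i, k=5):
    """Return (OFF_i, ON_i) for 1-based target index i.

    OFF_i sums the k samples ending at the target point (first time period);
    ON_i sums the k samples starting at the target point (second time period).
    Either value is None when the respective period lacks k samples.
    """
    off = sum(samples[i - k:i]) if i >= k else None              # samples i-k+1 .. i
    on = sum(samples[i - 1:i - 1 + k]) if i - 1 + k <= len(samples) else None  # samples i .. i+k-1
    return off, on

def potential_collision_point(samples, speeds, k=5, preset_speed=20.0):
    """Return the 1-based index of the first open-relay sample within the
    first time period of the first target point satisfying a)-c), else None."""
    for i in range(1, len(samples) + 1):
        off, on = dynamic_representations(samples, i, k)
        if off is None or on is None:
            continue
        if (samples[i - 1] == -1           # a) relay open at the target point
                and off < -2               # b) >80% of first-period samples open
                and on < 0                 # c) >60% of second-period samples open
                and speeds[i - 1] < preset_speed):
            # first open-relay sample within the first time period
            for j in range(i - k, i):
                if samples[j] == -1:
                    return j + 1
    return None

samples = [1, -1, -1, -1, -1, -1, -1, -1, -1, -1]
speeds = [5.0] * len(samples)                    # assumed speeds
print(dynamic_representations(samples, 5))       # (-3, -5), as in the example
print(potential_collision_point(samples, speeds))  # 2: the 2nd data sample
```

Note that both windows include the target sample itself, which is what reproduces the OFF_5 = -3 and ON_1 = -3 values of the worked example.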
It should be appreciated that the first time period may include or be adjacent to a target point in time. Similarly, the second time period may also include or be adjacent to the target time point.
It should be appreciated that in this example, in the case where the time interval for collecting the vehicle data is 2 seconds, the first period and the second period may be 10 seconds, respectively; in the case when the time interval for collecting the vehicle data is 5 seconds, the first period and the second period may be 25 seconds, respectively.
It should be understood that while in this example the battery pack relay state value is -1 when the battery pack relay state is off and 1 when the battery pack relay state is on, the disclosure is not limited thereto, and the battery pack relay state may be represented by other values.
It should be understood that, although the first dynamic representation and the second dynamic representation are obtained by using the summation in the expressions (1) and (2), the first dynamic representation and the second dynamic representation are not limited thereto, and may be obtained by using a calculation method such as weighted summation, integration, or the like, for example.
It should be appreciated that while in this example the first time period is the same as the second time period in length, the present disclosure is not so limited and the first time period and the second time period may be different in length. Although the predetermined number is 5 in this example, it is not limited thereto, and the predetermined number may be an integer value between 4 and 20.
It will be appreciated that situations arising at vehicle start-up can be excluded by the above condition c).
It should be appreciated that the point in time at which a collision may occur may be determined in the onboard central processor, or vehicle data may be received via, for example, a cellular internet of vehicles to determine the point in time at which a collision may occur at a server in the cloud.
Step S220 (i.e., step S310 and step S320) determines a time point at which a collision is likely to occur based on the battery pack relay status data, and can screen out data related to a potential collision time point from a huge amount of input data, thereby reducing the computational load of the subsequent machine learning process.
Next, the method 200 proceeds to step S230, at which the deep learning identification means 130 acquires vehicle data, including accelerator pedal position data, vehicle speed data, driver demand torque data, low-voltage battery voltage data, and battery pack relay state data, within a predetermined time period including the determined time point, and determines, based on the acquired vehicle data and by using a deep learning model, whether the vehicle has collided. In a case where it is determined that a collision has occurred, the time point at which the collision is likely to occur is determined as the collision time point.
Specifically, the input data of the deep learning model is vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low-voltage battery voltage data, and battery pack relay state data for a predetermined period of time including the determined point in time; the output data of the deep learning model is a vehicle collision recognition result of whether the vehicle collides or not, and a collision time point determined in the case where the collision is determined.
The deep learning model may include, for example, an attention-based bidirectional long short-term memory (Bi-LSTM) network, a convolutional neural network (CNN), a generative adversarial network (GAN), or the like.
In some embodiments, the predetermined time period may span from 1 minute before the time point at which the collision is likely to occur to 1 minute after that time point. The present disclosure is not limited thereto, and the predetermined time period may be changed according to actual needs.
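The selection of vehicle data within such a ±1 minute window can be sketched as follows. This is a minimal illustration; the record structure (a `"t"` timestamp in seconds plus arbitrary signal fields) is an assumption, not the patent's data format.

```python
def window_around(records, t_collision, before=60.0, after=60.0):
    """Select the vehicle-data records whose timestamp falls within
    [t_collision - before, t_collision + after] (seconds)."""
    lo, hi = t_collision - before, t_collision + after
    return [r for r in records if lo <= r["t"] <= hi]

# Records collected every 2 seconds (the shorter interval from the text),
# covering a 5-minute drive; signal values are placeholders.
records = [{"t": float(t), "speed": 10.0} for t in range(0, 301, 2)]
window = window_around(records, t_collision=150.0)
print(len(window))  # 61 records: 150 s +/- 60 s at a 2 s interval
```

At a 5-second collection interval the same window would contain fewer records, which is why the predetermined period may need adjusting to the actual data rate.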
It should be appreciated that the deep learning model described above is pre-trained. Training of deep learning models is known and will not be described in detail herein.
It should be appreciated that vehicle data may be received via, for example, a cellular internet of vehicles to determine whether a vehicle is crashed using a deep learning model at a server in the cloud.
Compared with methods such as a fixed-threshold method, using a deep learning model for collision recognition can significantly improve the accuracy of vehicle collision recognition. Moreover, performing the vehicle collision recognition processing in the cloud not only realizes high-accuracy recognition but also eliminates the need to install a processor with high processing power on the vehicle.
Next, the method 200 proceeds to step S240, and at step S240, the collision recognition result providing section 140 provides the collision recognition result including the information indicating that the collision has occurred and the point of time of the collision to the after-market vehicle service provider. The after-market vehicle services may include maintenance, rescue, insurance, etc.
In some embodiments, the collision recognition result providing section 140 exposes the collision recognition result, including the information indicating that a collision has occurred and the collision time point, to the after-market vehicle service provider through a representational state transfer (RESTful) application programming interface (API).
Alternatively, the collision recognition result containing the information indicating that a collision has occurred and the collision time point may be exposed to the after-market vehicle service provider through, for example, a simple object access protocol (SOAP) API.
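The payload such a RESTful endpoint might return can be sketched with the standard library alone. The endpoint path and JSON field names below are assumptions for illustration; the patent does not specify a schema.

```python
import json

def collision_result_payload(vehicle_id, collided, collision_time=None):
    """Build the JSON body a hypothetical endpoint such as
    GET /v1/vehicles/{id}/collision might return."""
    body = {"vehicleId": vehicle_id, "collisionDetected": collided}
    if collided:
        body["collisionTime"] = collision_time  # e.g. an ISO 8601 string
    return json.dumps(body)

print(collision_result_payload("VIN123", True, "2021-11-24T08:30:00Z"))
```

Serving this body from a RESTful or SOAP interface is then a matter of the chosen web framework; the recognition result itself stays framework-independent.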
Providing the after-sales vehicle service provider with a collision recognition result that contains the information indicating a collision and the collision time point enables the provider to offer services such as vehicle maintenance, rescue, and insurance in a timely manner, improving the user experience for the vehicle's driver.
Exemplary vehicle collision recognition methods and apparatus according to the present disclosure are described above with reference to fig. 1, 2, and 3. It can be appreciated that the method in the disclosure can screen out data related to potential collision time points from massive input data, so that the computational load of machine learning processing can be reduced; by performing the vehicle collision recognition processing at the cloud, the need to install a processor with high processing capability on the vehicle can be eliminated; in addition, the vehicle collision recognition result is timely shared to the after-sale service provider after the vehicle collision occurs, so that services such as vehicle maintenance, rescue and insurance can be timely provided, and the user experience of a vehicle driver is improved.
Hardware implementation
Fig. 4 illustrates a general hardware environment 400 in which the present disclosure may be applied, according to an exemplary embodiment of the present disclosure.
With reference to fig. 4, a computing device 400 will now be described as an example of a hardware device applicable to aspects of the present disclosure. Computing device 400 may be any machine configured to perform processes and/or calculations and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a smart phone, a portable camera, or any combination thereof. The apparatus 100 described above may be implemented in whole or at least in part by a computing device 400 or similar device or system.
Computing device 400 may include elements capable of connecting with bus 402 or communicating with bus 402 via one or more interfaces. For example, computing device 400 may include a bus 402, one or more processors 404, one or more input devices 406, and one or more output devices 408. The one or more processors 404 may be any type of processor and may include, but are not limited to, one or more general purpose processors and/or one or more special purpose processors (such as special purpose processing chips). Input device 406 may be any type of device capable of inputting information to a computing device and may include, but is not limited to, a mouse, a keyboard, a touch screen, a microphone, and/or a remote control. Output device 408 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, video/audio output terminals, and/or a printer. Computing device 400 may also include, or be connected to, a non-transitory storage device 410. The non-transitory storage device 410 may implement a data repository and may include, but is not limited to, a disk drive, an optical storage device, solid-state storage, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, a compact disc or any other optical medium, ROM (read-only memory), RAM (random-access memory), cache memory, and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions, and/or code. The non-transitory storage device 410 may be detachable from the interface and may have data/instructions/code for implementing the methods and steps described above. Computing device 400 may also include a communication device 412.
The communication device 412 may be any type of device or system capable of communicating with external apparatus and/or with a network and may include, but is not limited to, a modem, a network card, an infrared communication device, wireless communication equipment, and/or a device such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a chipset of a cellular communication facility, etc.
Bus 402 can include, but is not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Computing device 400 may also include a working memory 414, where working memory 414 may be any type of working memory that may store instructions and/or data useful for the operation of processor 404, and may include, but is not limited to, random access memory and/or read-only memory devices.
The software elements may reside in a working memory 414 including, but not limited to, an operating system 416, one or more application programs 418, drivers, and/or other data and code. Instructions for performing the above-described methods and steps may be included in one or more applications 418, and components of the apparatus 100 described above may be implemented by the processor 404 reading and executing the instructions of the one or more applications 418. More specifically, the receiving component 110 may be implemented, for example, by the processor 404 upon execution of the application 418 having instructions to perform step S210. The potential collision time point determining section 120 may be implemented, for example, by the processor 404 when executing the application 418 having instructions to perform step S220 (or steps S310 and S320). The deep learning identification component 130 may be implemented, for example, by the processor 404 upon execution of the application 418 having instructions to perform step S230. The collision recognition result providing section 140 may be implemented, for example, by the processor 404 when executing the application 418 having instructions to perform step S240. Executable code or source code of instructions of the software elements may be stored in a non-transitory computer readable storage medium, such as the storage device(s) 410 described above, and may be read into working memory 414, possibly compiled and/or installed. Executable code or source code for the instructions of the software elements may also be downloaded from a remote location.
From the above embodiments, it is apparent to those skilled in the art that the present disclosure may be implemented by software and necessary hardware, or may be implemented by hardware, firmware, etc. Based on this understanding, embodiments of the present disclosure may be implemented, in part, in software. The computer software may be stored in a computer readable storage medium, such as a floppy disk, hard disk, optical disk, or flash memory. The computer software includes a series of instructions that cause a computer (e.g., a personal computer, a service station, or a network terminal) to perform a method according to various embodiments of the present disclosure, or a portion thereof.
Having thus described the present disclosure, it is clear that the present disclosure can be varied in a number of ways. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure, and all such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the following claims.

Claims (11)

1. A vehicle collision recognition method, comprising:
receiving vehicle data from a vehicle, the vehicle data including accelerator pedal position data, vehicle speed data, driver demand torque data, low-voltage battery voltage data, and battery pack relay state data;
determining, based on the battery pack relay state data, a time point at which a collision may have occurred; and
acquiring vehicle data within a predetermined time period that contains the determined time point, and determining, based on the acquired vehicle data and by using a deep learning model, whether the vehicle has collided, wherein, in a case where it is determined that a collision has occurred, the time point at which a collision may have occurred is determined as the collision time point.

2. The vehicle collision recognition method according to claim 1, wherein determining the time point at which a collision may have occurred based on the battery pack relay state data comprises:
determining, based on the battery pack relay state data, a first dynamic representation for a first time period before a target time point and a second dynamic representation for a second time period after the target time point, the first dynamic representation and the second dynamic representation indicating dynamic changes of the battery pack relay state, wherein the battery pack relay state data includes a plurality of data samples; and
determining the time point at which a collision may have occurred based on the first dynamic representation and the second dynamic representation, the determining including judging whether the following conditions are satisfied, and, in a case where the following conditions are satisfied, determining the first time point within the first time period at which the battery pack relay state is open as the time point at which a collision may have occurred:
the data sample at the target time point indicates that the battery pack relay state is open;
the first dynamic representation indicates that the data samples representing an open battery pack relay state within the first time period account for more than 80% of the total data samples within the first time period;
the second dynamic representation indicates that the data samples representing an open battery pack relay state within the second time period account for more than 60% of the total data samples within the second time period; and
the vehicle speed at the target time point is less than a preset vehicle speed.

3. The vehicle collision recognition method according to claim 2, wherein the first time period and the second time period have the same length and correspond to a predetermined number of data samples, the predetermined number being an integer between 4 and 20.

4. The vehicle collision recognition method according to claim 1, further comprising: providing a collision recognition result, which contains information indicating that a collision has occurred and the collision time point, to a vehicle after-sales service provider.

5. The vehicle collision recognition method according to claim 1, wherein the deep learning model comprises a bidirectional long short-term memory network based on an attention mechanism.

6. The vehicle collision recognition method according to claim 1, wherein at least a part of the vehicle data is collected by using on-board sensors.

7. The vehicle collision recognition method according to claim 1, wherein the vehicle data is received via cellular vehicle-to-everything (C-V2X) communication.

8. A vehicle collision recognition apparatus, comprising: means for performing the method according to any one of claims 1-7.

9. A vehicle collision recognition apparatus, comprising:
at least one processor; and
at least one storage device, the at least one storage device storing instructions which, when executed by the at least one processor, cause the at least one processor to perform the method according to any one of claims 1-7.

10. A non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause the method according to any one of claims 1-7 to be performed.

11. A computer program product storing instructions which, when executed by a processor, cause the method according to any one of claims 1-7 to be performed.
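As an illustration only, the decision rule recited in claims 1-3 can be sketched in a few lines of Python. This is a hypothetical reading of the claims, not code from the patent: the function and field names, the 10-sample default window, and the 5 km/h preset speed threshold are assumptions; only the open-relay checks, the >80% / >60% window ratios, and the 4-20 sample bound on the window size come from the claims themselves.

```python
# Hypothetical sketch of the candidate-collision-time rule in claims 1-3.
# Assumed (not in the claims): the names below, the 10-sample default window,
# and the 5 km/h preset speed. From the claims: the open-relay conditions,
# the >80% / >60% window ratios, and the 4-20 sample window bound.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Sample:
    t: int               # sample timestamp (e.g. in seconds)
    relay_open: bool     # battery pack relay state: True = open
    speed_kmh: float     # vehicle speed at this sample


def candidate_collision_time(samples: List[Sample],
                             target_idx: int,
                             window: int = 10,
                             preset_speed_kmh: float = 5.0) -> Optional[int]:
    """Return the candidate collision time point per claim 2, or None.

    If every condition holds, the timestamp of the *first* open-relay
    sample in the window before the target is the candidate time point.
    """
    if not (4 <= window <= 20):                       # claim 3 bound
        raise ValueError("window must correspond to 4-20 data samples")
    before = samples[max(0, target_idx - window):target_idx]
    after = samples[target_idx + 1:target_idx + 1 + window]
    if not before or not after:
        return None                                   # windows incomplete
    target = samples[target_idx]
    if not target.relay_open:                         # relay open at target
        return None
    open_before = [s for s in before if s.relay_open]
    if len(open_before) <= 0.8 * len(before):         # >80% open before target
        return None
    if sum(s.relay_open for s in after) <= 0.6 * len(after):  # >60% open after
        return None
    if target.speed_kmh >= preset_speed_kmh:          # speed below preset
        return None
    return open_before[0].t
```

A positive result here would then trigger the second stage of claim 1: the vehicle data (accelerator pedal position, speed, driver demand torque, low-voltage battery voltage, relay state) in a window around the candidate time point is passed to the deep learning model — per claim 5, an attention-based bidirectional LSTM — for the final collision decision.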
CN202111403344.3A 2021-11-24 2021-11-24 Vehicle collision identification method and device Active CN116168468B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111403344.3A CN116168468B (en) 2021-11-24 2021-11-24 Vehicle collision identification method and device


Publications (2)

Publication Number Publication Date
CN116168468A true CN116168468A (en) 2023-05-26
CN116168468B CN116168468B (en) 2024-12-10

Family

ID=86418699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111403344.3A Active CN116168468B (en) 2021-11-24 2021-11-24 Vehicle collision identification method and device

Country Status (1)

Country Link
CN (1) CN116168468B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109591603A (en) * 2019-01-04 2019-04-09 北京长城华冠汽车科技股份有限公司 Control method and system after electric car and its collision
CN110775057A (en) * 2019-08-29 2020-02-11 浙江零跑科技有限公司 Lane assisting method for analyzing and controlling steering torque based on vehicle-mounted blind zone visual scene
US20210089938A1 (en) * 2019-09-24 2021-03-25 Ford Global Technologies, Llc Vehicle-to-everything (v2x)-based real-time vehicular incident risk prediction
CN112749210A (en) * 2021-01-18 2021-05-04 优必爱信息技术(北京)有限公司 Vehicle collision recognition method and system based on deep learning
CN112959895A (en) * 2021-03-28 2021-06-15 大运汽车股份有限公司 Finished automobile control method of pure electric commercial vehicle


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118536027A (en) * 2024-04-30 2024-08-23 明觉科技(北京)有限公司 Vehicle collision accident detection method, device, system and computer readable medium
CN118536027B (en) * 2024-04-30 2025-02-11 明觉科技(北京)有限公司 Vehicle collision accident detection method, device, system and computer readable medium

Also Published As

Publication number Publication date
CN116168468B (en) 2024-12-10

Similar Documents

Publication Publication Date Title
CN110929799B (en) Method, electronic device, and computer-readable medium for detecting abnormal user
JP2021509978A (en) Driving behavior evaluation method, device and computer-readable storage medium
CN108959247B (en) Data processing method, server and computer readable medium
CN111862945A (en) A speech recognition method, device, electronic device and storage medium
US20190147540A1 (en) Method and apparatus for outputting information
CN111222051B (en) Training method and device for trend prediction model
CN113222050A (en) Image classification method and device, readable medium and electronic equipment
CN114103944B (en) Method, device and equipment for adjusting time interval between workshops
CN110069997B (en) Scene classification method and device and electronic equipment
CN116168468B (en) Vehicle collision identification method and device
CN110555861B (en) Optical flow calculation method and device and electronic equipment
CN113111692A (en) Target detection method and device, computer readable storage medium and electronic equipment
CN110060477B (en) Method and device for pushing information
CN116645956A (en) Speech synthesis method, speech synthesis system, electronic device and storage medium
CN114462502B (en) A method and device for training a core recommendation model
CN115171718A (en) Specific bird identification method and device and storage medium
CN112036519B (en) Multi-bit sigmoid-based classification processing method and device and electronic equipment
CN112487876B (en) Intelligent pen character recognition method and device and electronic equipment
CN110378936B (en) Optical flow calculation method and device and electronic equipment
CN111651686B (en) Test processing method and device, electronic equipment and storage medium
CN107888652A (en) Result abnormal detector, detection program, detection method and moving body
JP2025528640A (en) Image dataset processing method, device, equipment and storage medium
CN111798591B (en) Method and device for determining total mileage of vehicle, computer equipment and storage medium
CN111738311A (en) Multitask-oriented feature extraction method and device and electronic equipment
CN114969224B (en) Vehicle test driving reservation method and device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant