WO2022061702A1 - Method, apparatus and system for driving alerts - Google Patents
Method, apparatus and system for driving alerts
- Publication number
- WO2022061702A1 (application PCT/CN2020/117687)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- vehicle
- reminder
- distance
- driver
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters related to vehicle motion
- B60W40/105—Speed
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
- B60W2050/146—Display means
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0015—Planning or execution of driving tasks specially adapted for safety
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2420/408—Radar; Laser, e.g. lidar
Description
- the present application relates to the field of automatic driving, and more particularly, to a driving reminder method, device and system.
- Artificial intelligence is a theory, method, technology, and application system that uses digital computers, or machines controlled by digital computers, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain optimal results.
- Artificial intelligence is a branch of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that responds in a way similar to human intelligence.
- Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning, and decision-making.
- Research in the field of artificial intelligence includes robotics, natural language processing, computer vision, decision-making and reasoning, human-computer interaction, recommendation and search, and basic AI theory.
- Autonomous driving is a mainstream application in the field of artificial intelligence.
- Autonomous driving technology relies on the cooperation of computer vision, radar, monitoring devices and global positioning systems, etc., so that motor vehicles can achieve autonomous driving without the need for human active operation.
- Autonomous vehicles use various computing systems to help transport passengers or cargo from one location to another. Some autonomous vehicles may require some initial or continuous input from an operator, such as a pilot, driver, or passenger.
- An autonomous vehicle allows the operator to switch from a manual mode of operation to an autonomous driving mode or a mode in between. Since automatic driving technology does not require humans to drive motor vehicles, it can theoretically effectively avoid human driving errors, reduce the occurrence of traffic accidents, and improve the efficiency of highway transportation. Therefore, autonomous driving technology is getting more and more attention.
- Driving reminders include, for example, blind spot detection reminders, collision avoidance reminders, lane departure reminders, and fatigue driving reminders.
- Driving assistance technology can analyze the situation inside and outside the vehicle and warn the driver in advance of possible dangerous situations, so as to improve driving safety.
- However, the accuracy of current driving reminders is not high.
- the present application provides a driving reminder method, device and system, which are used to improve the accuracy of the driving reminder and the safety of driving.
- In a first aspect, a driving reminder method is provided, comprising:
- acquiring first data from a first sensor of the vehicle and second data from a second sensor of the vehicle, the first data including data of traffic elements surrounding the vehicle, the second data including data of the driver of the vehicle; and
- sending reminder information when a first distance is greater than or equal to a first threshold and a first line of sight direction is not directed at the traffic element, wherein the first distance refers to the distance between the vehicle and the traffic element determined according to the first data, and the first line of sight direction refers to the driver's line of sight direction determined according to the second data.
- In this way, both the distance between the vehicle and the traffic element and the direction of the driver's line of sight are used to determine whether to send reminder information, which can improve the accuracy of the driving reminder and thereby improve the user experience.
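The judgment described above can be reduced to a small sketch. The 30 m threshold, the 15° attention cone, and the yaw-only gaze comparison are illustrative assumptions, not values taken from the application:

```python
def should_remind(distance_m: float, gaze_yaw_deg: float,
                  element_bearing_deg: float,
                  distance_threshold_m: float = 30.0,
                  attention_cone_deg: float = 15.0) -> bool:
    """Send reminder information when the first distance is greater than or
    equal to the first threshold AND the driver's line of sight direction
    is not directed at the traffic element."""
    # The driver is treated as attending to the element when the angular
    # offset between gaze and element bearing falls inside a small cone.
    looking_at_element = abs(gaze_yaw_deg - element_bearing_deg) <= attention_cone_deg
    return distance_m >= distance_threshold_m and not looking_at_element
```

Both conditions must hold: a reminder is suppressed either when the driver is already looking at the element or when the distance condition is not met.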
- the traffic elements may include: pedestrians, animals, vehicles, street lights, guardrails and other objects around the vehicle.
- the first sensor may include at least one of a global positioning system, an inertial measurement unit, a radar, a laser rangefinder, and a camera, and may be used to collect data such as current speed, acceleration, current position, and contour.
- the second sensor may comprise a camera or other sensor, and the second sensor may be used to collect data on the driver of the vehicle.
- the second sensor may be used to collect the head posture of the driver, such as roll, pitch and yaw of the driver's head.
- the line-of-sight direction of the driver may be determined according to the head posture of the driver.
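Deriving the line-of-sight direction from the head posture can be sketched as below. Treating the gaze as aligned with the head's forward axis (so roll has no effect) is a simplifying assumption for illustration:

```python
import math

def gaze_direction_from_head_pose(yaw_deg: float, pitch_deg: float):
    """Unit gaze vector from head yaw and pitch.
    Axes: x forward, y left, z up; roll rotates about the forward
    axis and therefore does not change this direction."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))
```

A production system would instead use eye landmarks when available and fall back to head pose, but the geometric mapping is the same.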
- When the distance ranges to which the first distance belongs are different, the reminder levels of the reminder information are different.
- Because the reminder level varies with the distance range, the reminder information can be made more accurate, which helps further improve the accuracy of driving reminders.
- Reminder information of different reminder levels conveys different degrees of importance or urgency to the driver.
- At least one of the number of reminders, the reminder frequency, and the reminder intensity differs between reminder information of different reminder levels.
- Distinguishing reminder levels by at least one of the number of reminders, the reminder frequency, and the reminder intensity can make the reminder information more accurate, which helps further improve the user experience.
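One way to realize different numbers, frequencies, and intensities per level is a simple profile table. The specific counts, intervals, and volumes below are hypothetical values chosen for illustration:

```python
# Hypothetical reminder profiles: higher reminder levels repeat more
# times, at a higher frequency, and at a higher intensity.
REMINDER_PROFILES = {
    1: {"count": 1, "interval_s": 5.0, "volume": 0.3},  # low importance
    2: {"count": 2, "interval_s": 2.0, "volume": 0.6},
    3: {"count": 4, "interval_s": 0.5, "volume": 1.0},  # high urgency
}

def reminder_profile(level: int) -> dict:
    """Clamp the level to the supported range and return its profile."""
    return REMINDER_PROFILES[max(1, min(3, level))]
```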
- the acquiring of first data from a first sensor of the vehicle and second data from a second sensor of the vehicle includes: acquiring the first data and the second data when the speed of the vehicle is less than or equal to a preset speed.
- the data collected by the sensor is acquired only when the speed of the vehicle satisfies the preset condition, which can reduce the power consumption of the vehicle.
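The speed-gated acquisition can be sketched as follows; the 8.3 m/s (about 30 km/h) preset speed is an assumed example value, and the sensor readers are stand-in callables:

```python
def acquire_if_slow(speed_mps: float, read_first_sensor, read_second_sensor,
                    preset_speed_mps: float = 8.3):
    """Read both sensors only when the vehicle's speed is at or below
    the preset speed; otherwise skip acquisition to reduce power use."""
    if speed_mps <= preset_speed_mps:
        return read_first_sensor(), read_second_sensor()
    return None  # sensors left idle above the preset speed
```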
- In a second aspect, a driving reminder method is provided, comprising:
- acquiring first data from a first sensor of the vehicle and second data from a second sensor of the vehicle, the first data including data of traffic elements around the vehicle, the second data including data of the driver of the vehicle; and sending reminder information to the driver according to the first data and the second data, the reminder level of the reminder information being related to the distance range to which a first distance belongs and to the driver's fatigue driving level determined according to the second data, wherein the first distance refers to the distance between the vehicle and the traffic element determined according to the first data.
- Determining the reminder level of the reminder information according to the distance between the vehicle and the traffic element and the driver's fatigue driving level can make the reminder information more accurate, which helps further improve the accuracy of driving reminders.
- Reminder information of different reminder levels conveys different degrees of importance or urgency to the driver.
- the traffic elements may include: pedestrians, animals, vehicles, street lights, guardrails and other objects around the vehicle.
- the first sensor may include at least one of a global positioning system, an inertial measurement unit, a radar, a laser rangefinder, and a camera, and may be used to collect data such as current speed, acceleration, current position, and contour.
- the second sensor may comprise a camera or other sensor, and the second sensor may be used to collect data on the driver of the vehicle.
- the second sensor may also be used to collect the proportion of time the driver's eyes are closed within a certain time interval, for example, the driver's PERCLOS (percentage of eyelid closure over the pupil over time) measure.
- the driver's fatigue driving level may be determined according to the proportion of time the driver's eyes are closed within a certain time interval.
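Computing PERCLOS over an interval of per-frame eye-closure flags and mapping it to a fatigue driving level might look like the sketch below; the 0.15 and 0.30 thresholds and the three-level scale are illustrative assumptions, not values from the application:

```python
def perclos(eye_closed_flags) -> float:
    """Proportion of frames in the interval during which the eyes are
    closed (flags: 1 = closed, 0 = open)."""
    flags = list(eye_closed_flags)
    return sum(flags) / len(flags) if flags else 0.0

def fatigue_level(perclos_value: float) -> int:
    """Map a PERCLOS value to a discrete fatigue driving level."""
    if perclos_value < 0.15:
        return 0  # alert
    if perclos_value < 0.30:
        return 1  # mildly fatigued
    return 2      # severely fatigued
```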
- When the distance ranges to which the first distance belongs are the same and the driver's fatigue driving levels determined according to the second data are different, the reminder levels of the reminder information are different.
- This makes the reminder information more accurate, which helps further improve the accuracy of driving reminders.
- When the driver's fatigue driving levels determined according to the second data are the same and the distance ranges to which the first distance belongs are different, the reminder levels of the reminder information are different.
- This makes the reminder information more accurate, which helps further improve the accuracy of driving reminders.
- At least one of the number of reminders, the reminder frequency, and the reminder intensity differs between reminder information of different reminder levels.
- Distinguishing reminder levels by at least one of the number of reminders, the reminder frequency, and the reminder intensity can make the reminder information more accurate, which helps further improve the user experience.
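One way to combine the distance range and the fatigue level into a reminder level is an additive scheme like the sketch below; the distance bands and the additive combination are assumptions made for illustration, not the application's actual rule:

```python
def reminder_level(distance_m: float, fatigue: int) -> int:
    """Reminder level from the distance range containing the first
    distance plus the driver's fatigue driving level (0..2).
    A different fatigue level at the same distance range, or a
    different distance range at the same fatigue level, yields a
    different reminder level."""
    if distance_m < 10.0:
        distance_band = 2  # near: most urgent band
    elif distance_m < 30.0:
        distance_band = 1  # mid
    else:
        distance_band = 0  # far
    return distance_band + fatigue  # 0 (lowest) .. 4 (highest)
```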
- In a third aspect, a driving reminder device is provided, including:
- an acquisition unit configured to acquire first data from a first sensor of the vehicle and second data from a second sensor of the vehicle, the first data including data of traffic elements around the vehicle, the second data including data of the driver of the vehicle; and a sending unit configured to send reminder information when a first distance is greater than or equal to a first threshold and a first line of sight direction is not directed at the traffic element, wherein the first distance refers to the distance between the vehicle and the traffic element determined according to the first data, and the first line of sight direction refers to the driver's line of sight direction determined according to the second data.
- In this way, both the distance between the vehicle and the traffic element and the direction of the driver's line of sight are used to determine whether to send reminder information, which can improve the accuracy of the driving reminder and thereby improve the user experience.
- the traffic elements may include: pedestrians, animals, vehicles, street lights, guardrails and other objects around the vehicle.
- the first sensor may include at least one of a global positioning system, an inertial measurement unit, a radar, a laser rangefinder, and a camera, and may be used to collect data such as current speed, acceleration, current position, and contour.
- the second sensor may comprise a camera or other sensor, and the second sensor may be used to collect data on the driver of the vehicle.
- the second sensor may be used to collect the head posture of the driver, such as roll, pitch and yaw of the driver's head.
- the line-of-sight direction of the driver may be determined according to the head posture of the driver.
- When the distance ranges to which the first distance belongs are different, the reminder levels of the reminder information are different.
- Because the reminder level varies with the distance range, the reminder information can be made more accurate, which helps further improve the accuracy of driving reminders.
- Reminder information of different reminder levels conveys different degrees of importance or urgency to the driver.
- At least one of the number of reminders, the reminder frequency, and the reminder intensity differs between reminder information of different reminder levels.
- Distinguishing reminder levels by at least one of the number of reminders, the reminder frequency, and the reminder intensity can make the reminder information more accurate, which helps further improve the user experience.
- the acquisition unit is specifically configured to acquire the first data and the second data when the speed of the vehicle is less than or equal to a preset speed.
- the data collected by the sensor is acquired only when the speed of the vehicle satisfies the preset condition, which can reduce the power consumption of the vehicle.
- In a fourth aspect, a driving reminder device is provided, including:
- an acquisition unit configured to acquire first data from a first sensor of the vehicle and second data from a second sensor of the vehicle, the first data including data of traffic elements around the vehicle, the second data including data of the driver of the vehicle;
- a sending unit configured to send reminder information to the driver according to the first data and the second data, the reminder level of the reminder information being related to the distance range to which a first distance belongs and to the driver's fatigue driving level determined according to the second data, wherein the first distance refers to the distance between the vehicle and the traffic element determined according to the first data.
- Determining the reminder level of the reminder information according to the distance between the vehicle and the traffic element and the driver's fatigue driving level can make the reminder information more accurate, which helps further improve the accuracy of driving reminders.
- Reminder information of different reminder levels conveys different degrees of importance or urgency to the driver.
- the traffic elements may include: pedestrians, animals, vehicles, street lights, guardrails and other objects around the vehicle.
- the first sensor may include at least one of a global positioning system, an inertial measurement unit, a radar, a laser rangefinder, and a camera, and may be used to collect data such as current speed, acceleration, current position, and contour.
- the second sensor may comprise a camera or other sensor, and the second sensor may be used to collect data on the driver of the vehicle.
- the second sensor may also be used to collect the proportion of time the driver's eyes are closed within a certain time interval, for example, the driver's PERCLOS (percentage of eyelid closure over the pupil over time) measure.
- the driver's fatigue driving level may be determined according to the proportion of time the driver's eyes are closed within a certain time interval.
- When the distance ranges to which the first distance belongs are the same and the driver's fatigue driving levels determined according to the second data are different, the reminder levels of the reminder information are different.
- This makes the reminder information more accurate, which helps further improve the accuracy of driving reminders.
- When the driver's fatigue driving levels determined according to the second data are the same and the distance ranges to which the first distance belongs are different, the reminder levels of the reminder information are different.
- This makes the reminder information more accurate, which helps further improve the accuracy of driving reminders.
- At least one of the number of reminders, the reminder frequency, and the reminder intensity differs between reminder information of different reminder levels.
- Distinguishing reminder levels by at least one of the number of reminders, the reminder frequency, and the reminder intensity can make the reminder information more accurate, which helps further improve the user experience.
- In a fifth aspect, a driving reminder device is provided, including a storage medium and a central processing unit. The storage medium may be a non-volatile storage medium in which a computer-executable program is stored; the central processing unit is connected to the non-volatile storage medium and executes the computer-executable program to implement the method in the first aspect or any possible implementation of the first aspect.
- In a sixth aspect, a driving reminder device is provided, including a storage medium and a central processing unit. The storage medium may be a non-volatile storage medium in which a computer-executable program is stored; the central processing unit is connected to the non-volatile storage medium and executes the computer-executable program to implement the method in the second aspect or any possible implementation of the second aspect.
- In a seventh aspect, a chip is provided. The chip includes a processor and a data interface; the processor reads instructions stored in a memory through the data interface and executes the method in the first aspect or any possible implementation of the first aspect.
- Optionally, the chip may further include a memory in which instructions are stored; the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to perform the method in the first aspect or any possible implementation of the first aspect.
- In an eighth aspect, a chip is provided. The chip includes a processor and a data interface; the processor reads instructions stored in a memory through the data interface and executes the method in the second aspect or any possible implementation of the second aspect.
- Optionally, the chip may further include a memory in which instructions are stored; the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to perform the method in the second aspect or any possible implementation of the second aspect.
- A computer-readable storage medium is provided that stores program code for device execution, the program code including instructions for performing the method in the first aspect or any possible implementation of the first aspect.
- A computer-readable storage medium is provided that stores program code for device execution, the program code including instructions for performing the method in the second aspect or any possible implementation of the second aspect.
- A driving reminder system is provided, comprising the driving reminder device according to the third aspect, the fourth aspect, or the fifth aspect.
- In a twelfth aspect, an automobile is provided, which includes the driving reminder device according to the third aspect, the fourth aspect, or the fifth aspect.
- In the embodiments of the present application, both the distance between the vehicle and the traffic element and the direction of the driver's line of sight are used to determine whether to send reminder information, which can improve the accuracy of the driving reminder and thereby improve the user experience.
- FIG. 1 is a schematic structural diagram of an automatic driving vehicle according to an embodiment of the present application.
- FIG. 2 is a schematic structural diagram of an automatic driving system according to an embodiment of the present application.
- FIG. 3 is a schematic structural diagram of a system architecture to which an embodiment of the present application is applicable;
- FIG. 4 is a schematic block diagram of a driving reminder method provided by an embodiment of the present application.
- FIG. 5 is a schematic block diagram of a driving reminder method provided by another embodiment of the present application.
- FIG. 6 is a schematic block diagram of a driving reminder method provided by another embodiment of the present application.
- FIG. 7 is a schematic block diagram of a driving reminder method provided by another embodiment of the present application.
- FIG. 8 is a schematic block diagram of a driving reminder device provided by an embodiment of the present application.
- FIG. 9 is a schematic block diagram of a driving reminder device provided by another embodiment of the present application.
- FIG. 10 is a schematic block diagram of a driving reminder device provided by another embodiment of the present application.
- The technical solutions of the embodiments of the present application can be applied to various vehicles. The vehicle may be a diesel locomotive, an intelligent electric vehicle, or a hybrid vehicle, or may be a vehicle of another power type.
- The technical solutions can also be applied to various other vehicles, such as airplanes and ships, which are not limited in the embodiments of the present application.
- the vehicle in the embodiment of the present application may be an automatic driving vehicle.
- the automatic driving vehicle may be configured with an automatic driving mode, and the automatic driving mode may be a fully automatic driving mode, or may also be a partial automatic driving mode.
- the embodiment is not limited to this.
- The vehicle in this embodiment of the present application may also be configured with other driving modes, which may include one or more of a sport mode, an economy mode, a standard mode, a snow mode, a hill-climbing mode, and the like.
- The automatic driving vehicle can switch between the automatic driving mode and the above-mentioned driving modes (in which the driver drives the vehicle), which is not limited in the embodiment of the present application.
- FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
- the vehicle 100 is configured in a fully or partially autonomous driving mode.
- While in an autonomous driving mode, the vehicle 100 may control itself, determine the current state of the vehicle and its surrounding environment, determine the likely behavior of at least one other vehicle in the surrounding environment, and determine a confidence level corresponding to the likelihood that the other vehicle will perform the possible behavior.
- The vehicle 100 is then controlled based on the determined information.
- The vehicle 100 may be permitted to operate without human interaction.
- Vehicle 100 may include various subsystems, such as travel system 102 , sensor system 104 , control system 106 , one or more peripherals 108 and power supply 110 , computer system 112 and user interface 116 .
- vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. Additionally, each of the subsystems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
- the travel system 102 may include components that provide powered motion for the vehicle 100 .
- The travel system 102 may include engine 118, energy source 119, transmission 120, and wheels/tires 121.
- The engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of engine types, such as a hybrid of a gasoline engine and an electric motor, or a hybrid of an internal combustion engine and an air compression engine.
- Engine 118 converts energy source 119 into mechanical energy.
- Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
- the energy source 119 may also provide energy to other systems of the vehicle 100 .
- Transmission 120 may transmit mechanical power from engine 118 to wheels 121 .
- Transmission 120 may include a gearbox, a differential, and a driveshaft.
- transmission 120 may also include other devices, such as clutches.
- the drive shaft may include one or more axles that may be coupled to one or more wheels 121 .
- the sensor system 104 may include several sensors that sense information about the environment surrounding the vehicle 100 .
- the sensor system 104 may include a positioning system 122 (which may be a GPS system, a Beidou system or other positioning system), an inertial measurement unit (IMU) 124, a radar 126, a laser rangefinder 128, and camera 130.
- The sensor system 104 may also include sensors that monitor internal systems of the vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a critical function for the safe operation of the autonomous vehicle 100.
- the positioning system 122 may be used to estimate the geographic location of the vehicle 100 .
- the IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration.
- IMU 124 may be a combination of an accelerometer and a gyroscope.
- Radar 126 may utilize radio signals to sense objects within the surrounding environment of vehicle 100 . In some embodiments, in addition to sensing objects, radar 126 may be used to sense the speed and/or heading of objects.
- the laser rangefinder 128 may utilize laser light to sense objects in the environment in which the vehicle 100 is located.
- the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
- Camera 130 may be used to capture multiple images of the surrounding environment of vehicle 100 .
- Camera 130 may be a still camera or a video camera.
- Control system 106 controls the operation of the vehicle 100 and its components.
- Control system 106 may include various elements including steering system 132 , throttle 134 , braking unit 136 , sensor fusion algorithms 138 , computer vision system 140 , route control system 142 , and obstacle avoidance system 144 .
- the steering system 132 is operable to adjust the heading of the vehicle 100 .
- it may be a steering wheel system.
- the throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100 .
- the braking unit 136 is used to control the deceleration of the vehicle 100 .
- the braking unit 136 may use friction to slow the wheels 121 .
- the braking unit 136 may convert the kinetic energy of the wheels 121 into electrical current.
- the braking unit 136 may also take other forms to slow the wheels 121 to control the speed of the vehicle 100 .
- Computer vision system 140 may be operable to process and analyze images captured by camera 130 in order to identify objects and/or features in the environment surrounding vehicle 100 .
- the objects and/or features may include traffic signals, road boundaries and obstacles.
- Computer vision system 140 may use object recognition algorithms, Structure from Motion (SFM) algorithms, video tracking, and other computer vision techniques.
- the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and the like.
- the route control system 142 is used to determine the travel route of the vehicle 100 .
- the route control system 142 may combine data from the sensors 138 , the GPS 122 , and one or more predetermined maps to determine a driving route for the vehicle 100 .
- the obstacle avoidance system 144 is used to identify, evaluate, and avoid or otherwise traverse potential obstacles in the environment of the vehicle 100 .
- control system 106 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be omitted.
- Peripherals 108 may include a wireless communication system 146 , an onboard computer 148 , a microphone 150 and/or a speaker 152 .
- peripherals 108 provide a means for a user of vehicle 100 to interact with user interface 116 .
- the onboard computer 148 may provide information to the user of the vehicle 100 .
- User interface 116 may also operate on-board computer 148 to receive user input.
- the onboard computer 148 can be operated via a touch screen.
- peripheral devices 108 may provide a means for vehicle 100 to communicate with other devices located within the vehicle.
- microphone 150 may receive audio (eg, voice commands or other audio input) from a user of vehicle 100 .
- speakers 152 may output audio to a user of vehicle 100 .
- Wireless communication system 146 may wirelessly communicate with one or more devices, either directly or via a communication network.
- wireless communication system 146 may use 3G cellular communications, such as CDMA, EVDO, or GSM/GPRS; 4G cellular communications, such as LTE; or 5G cellular communications.
- the wireless communication system 146 may communicate with a wireless local area network (WLAN) using WiFi.
- the wireless communication system 146 may communicate directly with the device using an infrared link, Bluetooth, or ZigBee.
- Other wireless protocols may also be used, such as various vehicle communication systems; for example, wireless communication system 146 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
- the power supply 110 may provide power to various components of the vehicle 100 .
- the power source 110 may be a rechargeable lithium-ion or lead-acid battery.
- One or more battery packs of such a battery may be configured as a power source to provide power to various components of the vehicle 100 .
- power source 110 and energy source 119 may be implemented together, such as in some all-electric vehicles.
- Computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer-readable medium such as data storage device 114 .
- Computer system 112 may also be multiple computing devices that control individual components or subsystems of vehicle 100 in a distributed fashion.
- the processor 113 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor may be a dedicated device such as an ASIC or other hardware-based processor.
- although FIG. 1 functionally illustrates the processor, memory, and other elements of the computer 110 in the same block, one of ordinary skill in the art will understand that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be located within the same physical housing.
- the memory may be a hard drive or other storage medium located within an enclosure other than computer 110 .
- reference to a processor or computer will be understood to include reference to a collection of processors or computers or memories that may or may not operate in parallel.
- some components such as the steering and deceleration components may each have their own processor that only performs computations related to component-specific functions .
- a processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle while others are performed by a remote processor, including taking steps necessary to perform a single maneuver.
- data storage 114 may include instructions 115 (eg, program logic) executable by processor 113 to perform various functions of vehicle 100 , including those described above.
- Data storage 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or send control commands to one or more of propulsion system 102 , sensor system 104 , control system 106 , and peripherals 108 , and/or to perform data processing on them.
- the data storage device 114 may store data such as road maps, route information, the vehicle's position, direction, speed, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 112 during operation of the vehicle 100 in autonomous, semi-autonomous and/or manual modes.
- a user interface 116 for providing information to or receiving information from a user of the vehicle 100 .
- the user interface 116 may include one or more input/output devices within the set of peripheral devices 108 , such as a wireless communication system 146 , an onboard computer 148 , a microphone 150 and a speaker 152 .
- Computer system 112 may control functions of vehicle 100 based on input received from various subsystems (e.g., travel system 102, sensor system 104, and control system 106) and from user interface 116. For example, computer system 112 may utilize input from control system 106 in order to control steering unit 132 to avoid obstacles detected by sensor system 104 and obstacle avoidance system 144 . In some embodiments, computer system 112 is operable to provide control of various aspects of vehicle 100 and its subsystems.
- one or more of these components described above may be installed or associated with the vehicle 100 separately.
- data storage device 114 may exist partially or completely separate from vehicle 100 .
- the above-described components may be communicatively coupled together in a wired and/or wireless manner.
- FIG. 1 should not be construed as a limitation on the embodiments of the present application.
- An autonomous vehicle traveling on a road can recognize objects within its surroundings to determine adjustments to current speed.
- the objects may be other vehicles, traffic control equipment, or other types of objects.
- each identified object may be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and distance from the vehicle, may be used to determine the speed to which the autonomous vehicle is to adjust.
- the vehicle 100 or a computing device associated with the vehicle 100 may predict the behavior of the identified object based on the characteristics of the identified object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.).
- the behavior of each identified object depends on the behavior of the other objects, so it is also possible to predict the behavior of a single identified object by considering all identified objects together.
- the vehicle 100 can adjust its speed based on the predicted behavior of the identified object.
- the autonomous vehicle can determine what steady state the vehicle will need to adjust to (eg, accelerate, decelerate, or stop) based on the predicted behavior of the object.
- other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road being traveled, the curvature of the road, the proximity of static and dynamic objects, and the like.
- the computing device may also provide instructions to modify the steering angle of the vehicle 100 so that the autonomous vehicle follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on the road).
- the above-mentioned vehicle 100 can be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, a playground vehicle, construction equipment, a tram, a golf cart, a train, a cart, etc.
- the embodiment is not particularly limited.
- FIG. 2 is a schematic diagram of an automatic driving system provided by an embodiment of the present application.
- the automatic driving system shown in FIG. 2 includes a computer system 101 , wherein the computer system 101 includes a processor 103 , and the processor 103 is coupled with a system bus 105 .
- the processor 103 may be one or more processors, each of which may include one or more processor cores.
- a video adapter 107 , which can drive a display 109 , is coupled to the system bus 105 .
- System bus 105 is coupled to input-output (I/O) bus 113 through bus bridge 111 .
- I/O interface 115 is coupled to the I/O bus. I/O interface 115 communicates with various I/O devices, such as an input device 117 (e.g., keyboard, mouse, touch screen, etc.), a media tray 121 (e.g., CD-ROM, multimedia interface, etc.), a transceiver 123 (which can send and/or receive radio communication signals), a camera 155 (which can capture still and moving digital video images), and an external USB interface 125 .
- the interface connected to the I/O interface 115 may be a USB interface.
- the processor 103 may be any conventional processor, including a reduced instruction set computing (reduced instruction set computer, RISC) processor, a complex instruction set computing (complex instruction set computer, CISC) processor or a combination of the above.
- the processor may be a dedicated device such as an application specific integrated circuit (ASIC).
- the processor 103 may be a neural network processor or a combination of a neural network processor and the above-mentioned conventional processors.
- computer system 101 may be located remotely from the autonomous vehicle (eg, computer system 101 may be located in a cloud or server) and may communicate wirelessly with the autonomous vehicle.
- some of the processes described herein are performed on a processor disposed within the autonomous vehicle, others are performed by a remote processor, including taking actions required to perform a single maneuver.
- Network interface 129 is a hardware network interface, such as a network card.
- the network 127 may be an external network, such as the Internet, or an internal network, such as an Ethernet network or a virtual private network (VPN).
- the network 127 may also be a wireless network, such as a WiFi network, a cellular network, and the like.
- the hard disk drive interface is coupled to the system bus 105 .
- the hard drive interface is connected to the hard drive.
- System memory 135 is coupled to system bus 105 . Data running in system memory 135 may include operating system 137 and application programs 143 of computer 101 .
- the operating system includes a shell 139 and a kernel 141 .
- the shell is an interface between the user and the kernel of the operating system.
- the shell is the outermost layer of the operating system.
- the shell manages the interaction between the user and the operating system: waiting for user input, interpreting user input to the operating system, and processing various operating system output.
- Kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. The kernel interacts directly with the hardware; it usually runs processes, provides inter-process communication, and provides CPU time-slice management, interrupt handling, memory management, I/O management, and more.
- the application 143 includes programs related to driving reminders, for example: acquiring data collected by sensors of the vehicle (for example, the collected data may include data of traffic elements around the vehicle and data of the driver of the vehicle), processing the collected data to obtain processing results (for example, the processing results can be used to indicate the driving status of the vehicle and the driving status of the driver of the vehicle), and sending reminder information to the driver in combination with the processing results.
- Application 143 may also exist on the system of a software deployment server 149 (deploying server).
- the computer system 101 may download the application program 143 from the software deployment server 149 when the application program 143 needs to be executed.
- Sensor 153 is associated with computer system 101 .
- the sensor 153 is used to detect the environment around the computer 101 .
- the sensor 153 can detect animals, cars, obstacles, pedestrian crossings, etc. Further, the sensor can also detect the environment around the above-mentioned animals, cars, obstacles, and pedestrian crossings, such as other animals appearing around an animal, weather conditions, ambient light levels, etc.
- the sensors 153 may also be used to obtain status information of the vehicle.
- the sensor 153 may detect vehicle status information such as the position of the vehicle, the speed of the vehicle, the acceleration of the vehicle, and the attitude of the vehicle.
- the sensors may be cameras, infrared sensors, chemical detectors, microphones, and the like.
- the application program 143 may process the data of the traffic elements around the vehicle and the data of the driver of the vehicle collected by the sensor 153, obtain the processing result, and send reminder information to the driver in combination with the processing result. At this time, the reminder information is sent to the driver in combination with the processing result, so that the safety of the automatic driving of the vehicle can be improved.
- the processing result may be used to indicate the driving state of the vehicle and the driving state of the driver of the vehicle.
- the driving state of the vehicle may include the distance between the vehicle and the traffic element.
- the driving state of the driver may include the line of sight direction of the driver and/or the driver's fatigue driving level.
- FIG. 3 is a schematic structural diagram of a driving reminder system to which an embodiment of the present application is applied. It should be understood that the architecture 300 shown in FIG. 3 is only an example and not a limitation, and the architecture 300 may include more or fewer modules, which is not limited in this embodiment of the present application.
- the architecture 300 may include: an internal camera, a camera control unit for the rear-view and side-view cameras (TRSVC), a central gateway, a head unit (HU), a combination instrument (KOMBI), a camera-based driving assistance system (KAFAS), an exterior camera, a central display, etc.
- the inner camera can be connected to the gateway through TRSVC, the outer camera can be connected to the gateway through KAFAS, and the gateway is then connected to the HU.
- the HU can be used to execute the driving reminder method in the embodiments of this application; the HU can process the images captured by the cameras and send reminder information to the driver in combination with the processing results.
- the HU may include an inner camera data processing unit, an outer camera data processing unit, and a fusion processing unit.
- the inner camera data processing unit may be used to process images captured by the inner camera
- the outer camera data processing unit may be used to process images captured by the outer camera.
- the fusion processing unit may be configured to combine the processing results of the inner camera data (images captured by the inner camera) and the processing results of the outer camera data (images captured by the outer cameras) to send reminder information to the driver.
- the central display screen may be a display screen in a vehicle information terminal system (car information device, CID).
- the central display screen may be connected to the HU and used as the main display screen of the vehicle to output the reminder information in the embodiments of the present application, that is, to send a reminder to the driver.
- KOMBI can have an independent system; KOMBI can transmit data with the HU through APIX (automotive pixel link), and can also transmit images with the HU through LVDS (low voltage differential signaling). KOMBI can also include a sound module that can emit sound; therefore, KOMBI can output the reminder information in the embodiment of the present application through sound information, that is, send the reminder information to the driver.
- the gateway can be used to receive data from various modules and then transmit it to the HU;
- TRSVC can include driving recorders and other automatic driving assistance functions;
- KAFAS can be used to detect the driver's state and behavior.
- architecture 300 and each module in the architecture 300 are only examples and not limitations, and are not limited in the embodiments of the present application.
- the inner camera and the outer camera can also be other sensors.
- the inner camera data processing unit and the outer camera data processing unit can be the data processing units of other sensors; the TRSVC can also be a control unit or processing unit based on other sensors; and KAFAS can also be a driver assistance system based on other sensors (or, a control unit or processing unit based on other sensors).
- FIG. 4 is a schematic flowchart of a driving reminder method 400 provided by an embodiment of the present application.
- the method 400 shown in FIG. 4 may include steps 410 and 420. It should be understood that the method 400 shown in FIG. 4 is only an example and not a limitation, and the method 400 may include more or fewer steps, which is not limited in the embodiments of the present application; the steps are described in detail below.
- the method 400 shown in FIG. 4 may be performed by the processor 113 in the vehicle 100 in FIG. 1 , or the method 400 may also be performed by the processor 103 in the autonomous driving system in FIG. 2 , or the method 400 may also be performed by the head unit (HU) in FIG. 3 (e.g., by a fusion processing unit in the head unit).
- S410 Acquire first data from a first sensor of the vehicle and second data from a second sensor of the vehicle.
- the first data may include data of traffic elements around the vehicle
- the second data may include data of a driver of the vehicle.
- the first sensor may comprise a sensor in the sensor system 104 of FIG. 1, which may be used to collect data on traffic elements surrounding the vehicle.
- the first sensor may be used to collect the current speed, acceleration, current position, contour, and the like of the traffic element.
- the first sensor may include at least one of a global positioning system 122 , an inertial measurement unit 124 , a radar 126 , a laser rangefinder 128 and a camera 130 .
- the first sensor may include one or more laser rangefinders for collecting ranging data of the vehicle relative to its surrounding traffic elements.
- the first sensor may include one or more cameras for capturing images of traffic elements surrounding the vehicle.
- traffic elements may include pedestrians, animals, vehicles, street lights, guardrails, and other objects around the vehicle.
- the second sensor which may include the camera 130 of FIG. 1 or other sensors, may be used to collect data on the driver of the vehicle.
- the second sensor may be used to collect the head posture of the driver, such as roll, pitch and yaw of the driver's head.
- the line-of-sight direction of the driver may be determined according to the head posture of the driver.
- the acquiring the first data from the first sensor of the vehicle and the second data from the second sensor of the vehicle may include:
- the first data and the second data are acquired when the speed of the vehicle is less than or equal to a preset speed.
- the first data collected by the first sensor and the second data collected by the second sensor are acquired only when a preset condition (for example, the speed of the vehicle is less than or equal to the preset speed) is satisfied; in other cases, the first sensor and the second sensor can temporarily stop data collection, thereby reducing the power consumption of the vehicle.
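- The speed-gated acquisition described above can be sketched as follows; the 10 km/h preset speed and the function names are illustrative assumptions, not values specified by the embodiment.

```python
def maybe_acquire(speed_kmh, read_first_sensor, read_second_sensor,
                  preset_speed_kmh=10.0):
    """Acquire the first and second data only when the vehicle speed is at or
    below a preset speed; otherwise skip collection to save power.
    The threshold value is an illustrative assumption."""
    if speed_kmh <= preset_speed_kmh:
        # preset condition satisfied: read both sensors
        return read_first_sensor(), read_second_sensor()
    # sensors may temporarily stop data collection
    return None
```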
- the first distance may refer to the distance between the vehicle and the traffic element determined according to the first data
- the first sight direction may refer to the line-of-sight direction of the driver determined according to the second data.
- the reminder information may be sound information, image information, or other information.
- the reminder information may be sound information emitted by a speaker inside the vehicle; alternatively, the reminder information may also be image information displayed on the central display screen of the vehicle (which may also be referred to as a central control display), on the instrument screen of the vehicle, or on other devices such as the head-up display of the vehicle; or, the reminder information may also be vibration information from the steering wheel or seat, etc. The specific form of the reminder information is not limited in the embodiments of the present application.
- in this case, a reminder should be sent to the driver.
- the reminder levels of the reminder information may be different.
- the reminder information of different reminder levels makes the driver feel different levels of importance or urgency.
- At least one of reminder times, reminder frequencies and reminder strengths corresponding to the reminder information of different reminder levels is different.
- for example, the reminder information may be sound information from a speaker inside the vehicle. If the distance between the vehicle and the vehicle in front is greater than or equal to 5 meters and the driver is not looking straight ahead (that is, the driver does not pay attention to the preceding vehicle in the direction of the driver's line of sight), a voice message is sent to the driver; if the distance between the vehicle and the vehicle in front is greater than or equal to 6 meters and the driver has still not looked straight ahead, a voice message with an increased volume is sent to the driver.
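- The escalating voice reminder in the example above can be sketched as follows; the 5 m and 6 m thresholds follow the example, while the base volume of 8 and the +4 increase are illustrative assumptions.

```python
def reminder_volume(distance_m, looking_ahead, base_volume=8):
    """Return a speaker volume for the reminder, or None if no reminder is
    needed. Distance thresholds follow the example in the text; the volume
    values are illustrative assumptions."""
    if looking_ahead:
        # driver is paying attention to the preceding vehicle: no reminder
        return None
    if distance_m >= 6:
        # second-stage reminder with increased volume
        return base_volume + 4
    if distance_m >= 5:
        # first-stage reminder at the base volume
        return base_volume
    return None
```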
- FIG. 5 is a schematic flowchart of a driving reminder method 500 provided by an embodiment of the present application.
- the method 500 shown in FIG. 5 may include steps 510, 520, 530, 540, 550 and 560. It should be understood that the method 500 shown in FIG. 5 is only an example and not a limitation, and the method 500 may include more or fewer steps, which is not limited in the embodiments of the present application; these steps are described in detail below.
- the method 500 shown in FIG. 5 may be performed by the processor 113 in the vehicle 100 in FIG. 1 , or the method 500 may also be performed by the processor 103 in the autonomous driving system in FIG. 2 , or the method 500 may also be performed by the head unit (HU) in FIG. 3 (e.g., by a fusion processing unit in the head unit).
- the following describes the method 500, taking the scenario of a start reminder in which the method 500 is executed by the HU as an example.
- the HU can obtain the speed information of the vehicle in real time, and activate the preceding vehicle identification function when the speed of the vehicle meets the preset conditions.
- the preceding vehicle identification can identify the position, shape, direction, speed, image, etc. of the preceding vehicle.
- the HU can obtain the image of the preceding vehicle through the front camera, compare the image obtained by the front camera with a model library, and identify the preceding vehicle (that is, another vehicle in front of the aforementioned vehicle).
- the size of the image obtained by the front camera can be 1028*578.
- the HU can obtain an image with a size of 640*300, and input the image with a size of 640*300 into the model library for comparison.
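- One way to obtain a 640*300 image from the 1028*578 front-camera frame is a center crop, sketched below; whether the embodiment actually uses a center crop (rather than scaling or a region of interest) is an assumption, only the sizes come from the text.

```python
def center_crop(image, out_w=640, out_h=300):
    """Center-crop a row-major image (a list of rows) to out_w x out_h,
    e.g. from a 1028x578 frame to the 640x300 input used for the model
    library comparison. The choice of a center crop is an assumption."""
    h, w = len(image), len(image[0])
    top = (h - out_h) // 2    # rows discarded above the crop
    left = (w - out_w) // 2   # columns discarded left of the crop
    return [row[left:left + out_w] for row in image[top:top + out_h]]
```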
- HU can also identify other vehicles, pedestrians, and non-motor vehicles.
- the distance L1 between the vehicle and the vehicle in front is identified, and the change in the inter-vehicle distance (i.e., the distance between the vehicle and the vehicle in front) is monitored in real time.
- the distance L1 between the vehicle and the preceding vehicle may be identified based on a monocular vision method (for example, the method described in "Vision-based ACC with a single camera: bounds on range and range rate accuracy").
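- Single-camera range estimation of the kind cited above is commonly based on a flat-road pinhole model; a minimal sketch follows, assuming the camera focal length, mounting height, and horizon row are known. All parameter values here are illustrative, not from the embodiment.

```python
def range_from_bottom_edge(y_bottom_px, focal_px, cam_height_m, horizon_y_px):
    """Estimate the range Z to the preceding vehicle from the image row of
    its bottom edge, using the flat-road pinhole model Z = f * H / (y - y0).
    This is a sketch of the geometry used in single-camera ACC literature,
    not the embodiment's actual method."""
    if y_bottom_px <= horizon_y_px:
        raise ValueError("bottom edge must lie below the horizon row")
    return focal_px * cam_height_m / (y_bottom_px - horizon_y_px)
```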
- the head posture of the driver may be detected when the preceding vehicle enters the starting state.
- for example, L2 = L1 + 1 meter.
- the roll, pitch, and yaw angles of the driver's head can be obtained, and the obtained roll, pitch, and yaw angles can be processed to detect the driver's head posture.
- if the driver's head posture is not within the range of the forward-facing posture, it may be determined that the driver is not facing the vehicle in front, and S550 may be performed; if the driver's head posture is within the range of the forward-facing posture, it may be determined that the driver is facing the vehicle in front, and accordingly, S560 may be performed.
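- A minimal sketch of the forward-facing decision from the head-pose angles follows; the symmetric check and the 15-degree tolerance are illustrative assumptions, since the embodiment does not specify the posture range.

```python
def is_facing_forward(roll_deg, pitch_deg, yaw_deg, tol_deg=15.0):
    """Decide whether the driver faces the vehicle in front from the head's
    roll, pitch, and yaw angles. The +/-15 degree tolerance is an
    illustrative assumption."""
    # all three angles must stay within the tolerance of the neutral pose
    return all(abs(a) <= tol_deg for a in (roll_deg, pitch_deg, yaw_deg))
```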
- a reminder can be sent to the driver through the sound information and the image on the central control screen (that is, a reminder information is sent to the driver).
- the reminder information may be sent to the driver again.
- the reminder information can last for a certain period of time each time the reminder is performed, and if the driver immediately faces the vehicle in front within the duration of the reminder information, the reminder information can be stopped immediately.
- FIG. 6 is a schematic flowchart of a driving reminder method 600 provided by an embodiment of the present application.
- the method 600 shown in FIG. 6 may include steps 610 and 620. It should be understood that the method 600 shown in FIG. 6 is only an example and not a limitation, and the method 600 may include more or fewer steps, which is not limited in the embodiments of the present application; these steps are described in detail below.
- the method 600 shown in FIG. 6 may be performed by the processor 113 in the vehicle 100 in FIG. 1 , or the method 600 may be performed by the processor 103 in the autonomous driving system in FIG. 2 , or the method 600 may also be performed by the head unit (HU) in FIG. 3 (e.g., by a fusion processing unit in the head unit).
- S610 Acquire first data from a first sensor of the vehicle and second data from a second sensor of the vehicle.
- the first data may include data of traffic elements around the vehicle
- the second data may include data of a driver of the vehicle.
- the first sensor may comprise a sensor in the sensor system 104 of FIG. 1, which may be used to collect data on traffic elements surrounding the vehicle.
- the first sensor may be used to collect the current speed, acceleration, current position, contour, and the like of the traffic element.
- the first sensor may include at least one of a global positioning system 122 , an inertial measurement unit 124 , a radar 126 , a laser rangefinder 128 and a camera 130 .
- the first sensor may include one or more laser rangefinders for collecting ranging data of the vehicle relative to its surrounding traffic elements.
- the first sensor may include one or more cameras for capturing images of traffic elements surrounding the vehicle.
- traffic elements may include pedestrians, animals, vehicles, street lights, guardrails, and other objects around the vehicle.
- the second sensor which may include the camera 130 of FIG. 1 or other sensors, may be used to collect data on the driver of the vehicle.
- the second sensor may also be used to collect the percentage of the driver's eye-closed time within a certain time interval, for example, the driver's PERCLOS (percentage of eyelid closure over the pupil over time) physical quantity.
- the driver's fatigue driving level may be determined according to the proportion of the driver's eyes closed time within a certain time interval.
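- The PERCLOS-based fatigue level above can be sketched as follows; the mapping thresholds (0.15 and 0.4) and the three-level scale are illustrative assumptions, not values from the embodiment.

```python
def perclos(eye_closed_samples):
    """PERCLOS: fraction of samples within a time window in which the eyes
    are closed (1 = closed, 0 = open)."""
    return sum(eye_closed_samples) / len(eye_closed_samples)

def fatigue_level(p):
    """Map a PERCLOS value to a fatigue driving level.
    The thresholds are illustrative assumptions."""
    if p >= 0.4:
        return 2   # severe fatigue
    if p >= 0.15:
        return 1   # mild fatigue
    return 0       # alert
```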
- S620 Send reminder information to the driver according to the first data and the second data.
- the reminder level of the reminder information may be related to the distance range to which the first distance belongs and the driver's fatigue driving level determined according to the second data.
- the first distance may refer to the distance between the vehicle and the traffic element determined according to the first data.
- the reminder level of the reminder information may be determined in combination with the distance between the vehicle and the traffic element and the driver's fatigue driving level.
- the reminder information of different reminder levels makes the driver feel different levels of importance or urgency.
- the reminder information may be sound information, image information or other information.
- the reminder information may be sound information emitted by a speaker inside the vehicle; image information displayed on the central display screen of the vehicle (which may also be referred to as the central control display), on the instrument screen of the vehicle, or on other devices such as the head-up display of the vehicle; or vibration information from the steering wheel or seat. The specific form of the reminder information is not limited in the embodiments of the present application.
- In one example, the reminder levels of the reminder information may be different. For example, the reminder level of the reminder information is level 1, that is, the reminder information (that is, the sound information) is played at volume 8 with one beep; the reminder information may also be determined to be level 2, that is, the sound information is played at volume 8 with two beeps.
- In another example, the reminder level of the reminder information is level 1, that is, the sound information at volume 8 with one beep; the reminder level may then be adjusted to level 2, that is, the sound information at volume 12 with two beeps.
- At least one of reminder times, reminder frequencies and reminder strengths corresponding to the reminder information of different reminder levels is different.
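The examples above vary reminder times (beep count) and reminder strength (volume) with the reminder level. A minimal sketch of such a lookup, using the volume/beep values from the second example in the text (the table itself is an illustrative assumption, not a structure from the patent):

```python
# Map each reminder level to its sound parameters: "volume" is the
# reminder strength and "beeps" is the number of reminder times.
REMINDER_PROFILES = {
    1: {"volume": 8, "beeps": 1},
    2: {"volume": 12, "beeps": 2},
    3: {"volume": 16, "beeps": 3},
}

def reminder_sound(level: int) -> dict:
    """Return the sound parameters for a given reminder level."""
    return REMINDER_PROFILES[level]
```

Any modality (image, vibration) could use an analogous per-level table.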
- FIG. 7 is a schematic flowchart of a driving reminder method 700 provided by an embodiment of the present application.
- the method 700 shown in FIG. 7 may include steps 710, 720, 730 and 740. It should be understood that the method 700 shown in FIG. 7 is only an example and not a limitation; the method 700 may include more or fewer steps, which is not limited in the embodiments of the present application. The steps are described in detail below.
- the method 700 shown in FIG. 7 may be performed by the processor 113 in the vehicle 100 in FIG. 1, or the method 700 may also be performed by the processor 103 in the autonomous driving system in FIG. 2, or the method 700 may also be performed by the head unit (HU) in FIG. 3 (for example, by a fusion processing unit in the head unit).
- whether the driver is fatigued and the driver's fatigue driving level may be determined according to the proportion of time the driver's eyes are closed within a certain time interval.
- eye images may be collected at a frequency of 4 fps, and a vertical integral projection of the eye region may be computed to judge the eye-closure state, thereby determining the driver's fatigue driving level.
- the fatigue driving level may be determined according to the proportion of eyes closed time, and fatigue driving may be defined to include three levels.
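The PERCLOS computation and the three-level mapping described above can be sketched as follows. The thresholds `t1..t3` are illustrative assumptions; the text only states that fatigue driving may include three levels determined from the eyes-closed proportion:

```python
def perclos(closed_flags):
    """PERCLOS: fraction of sampled frames in which the eyes are closed.

    `closed_flags` is one boolean per frame (the text mentions sampling
    at 4 fps and judging closure via a vertical integral projection of
    the eye region).
    """
    if not closed_flags:
        return 0.0
    return sum(closed_flags) / len(closed_flags)

def fatigue_level(p, t1=0.15, t2=0.3, t3=0.5):
    """Map a PERCLOS value to a fatigue level (0 = alert, 1..3 = fatigued).

    The threshold values are hypothetical, chosen only for illustration.
    """
    if p >= t3:
        return 3
    if p >= t2:
        return 2
    if p >= t1:
        return 1
    return 0
```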
- the distance between the vehicle and the preceding vehicle may be estimated with a single camera using a vision-based method (for example, as described in "Vision-based ACC with a single camera: bounds on range and range rate accuracy").
- the distance between the vehicle and the preceding vehicle may be classified into multiple levels.
- a distance between the vehicle and the preceding vehicle of 100 meters may be defined as the first level, 80 meters as the second level, and 50 meters as the third level.
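Single-camera range estimation of the kind cited above commonly uses the pinhole ground-plane model, Z = f·H / (y − y_horizon), where H is the camera height and y is the image row of the preceding vehicle's ground-contact point. A sketch of that geometry plus the three-level distance classification from the text (the model and all parameter names are illustrative assumptions, not the patent's exact method; the reading of 100/80/50 m as level boundaries is also an assumption):

```python
def range_from_ground_contact(f_px, cam_height_m, y_contact_px, y_horizon_px):
    """Estimate distance (m) to a preceding vehicle from a single camera.

    Pinhole ground-plane model: Z = f * H / (y - y_horizon), where y is
    the image row of the vehicle's ground-contact point (pixels).
    """
    dy = y_contact_px - y_horizon_px
    if dy <= 0:
        raise ValueError("contact point must be below the horizon")
    return f_px * cam_height_m / dy

def distance_level(z_m):
    """Classify distance as in the text: up to 100 m -> level 1,
    up to 80 m -> level 2, up to 50 m -> level 3 (closer = higher)."""
    if z_m <= 50:
        return 3
    if z_m <= 80:
        return 2
    if z_m <= 100:
        return 1
    return 0
```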
- the level of the fatigue driving reminder can be adjusted according to the distance between the vehicle and the preceding vehicle.
- when the distance belongs to the first level, the fatigue driving reminder information is determined to be the first level;
- when the distance belongs to the second level, the fatigue driving reminder information is adjusted to the second-level reminder;
- when the distance belongs to the third level, the fatigue driving reminder information is adjusted to the third level;
- when the fatigue driving reminder information is already at the third level and the distance continues to decrease, the third-level fatigue driving reminder information is sent to the driver again.
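The escalation-and-resend behaviour described above can be sketched as a small decision function. The 100/80/50 m thresholds follow the example in the text; the exact re-send condition at the third level is an assumption made for illustration:

```python
def next_reminder(distance_m, last_level):
    """Decide which fatigue driving reminder (if any) to send next.

    Escalates as the preceding vehicle gets closer; once at the third
    level, the third-level reminder is sent again if the vehicle is
    still within the third-level distance range. Returns the level to
    send, or None if no new reminder is needed.
    """
    level = 3 if distance_m <= 50 else 2 if distance_m <= 80 else 1
    if level > last_level:
        return level          # escalate to a more urgent reminder
    if level == 3 and last_level == 3:
        return 3              # repeat the third-level reminder
    return None               # no new reminder
```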
- the fatigue driving reminder information is sound information and the fatigue driving reminder information includes three levels
- different levels of the fatigue driving reminder information may correspond to different reminder modes respectively.
- the volume of the sound information corresponding to the first-level fatigue driving reminder information may be 8, with one beep; the volume of the sound information corresponding to the second-level fatigue driving reminder information may be 12, with two continuous beeps; and the volume of the sound information corresponding to the third-level fatigue driving reminder information may be 16, with three consecutive beeps.
- the reminder information may also be image information, vibration information, or other information, etc., which is not limited in this embodiment of the present application.
- FIG. 8 is a schematic block diagram of a driving reminder device 800 provided by an embodiment of the present application. It should be understood that the driving reminder device 800 shown in FIG. 8 is only an example, and the device 800 in this embodiment of the present application may further include other modules or units.
- the apparatus 800 can perform various steps in the methods of FIGS. 4 and 5 .
- the acquiring unit 810 can be used to perform S410 in the method 400, and the sending unit 820 can be used to perform S420 in the method 400; the apparatus 800 can also be used to perform S540, S550 and S560 in the method 500.
- the apparatus 800 may be specifically as follows:
- the obtaining unit 810 is configured to obtain first data from a first sensor of the vehicle and second data from a second sensor of the vehicle, where the first data includes data of traffic elements around the vehicle, and the second data includes data of the driver of the vehicle;
- the sending unit 820 is configured to send reminder information when the first distance is greater than or equal to the first threshold and the first line of sight direction does not pay attention to the traffic element.
- the first distance refers to the distance between the vehicle and the traffic element determined according to the first data
- the first line of sight direction refers to the line of sight direction of the driver determined according to the second data.
- the reminder levels of the reminder information are different.
- At least one of reminder times, reminder frequencies and reminder strengths corresponding to the reminder information of different reminder levels is different.
- the acquiring unit 810 is specifically configured to acquire the first data and the second data when the speed of the vehicle is less than or equal to a preset speed.
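The apparatus-800 trigger described above (acquire data only at or below a preset speed; remind when the first distance is greater than or equal to the first threshold while the driver's line of sight is not on the traffic element) can be sketched as follows. The numeric defaults are hypothetical values, not thresholds from the patent:

```python
def should_remind(speed_mps, first_distance_m, gaze_on_element,
                  preset_speed_mps=8.0, first_threshold_m=30.0):
    """Sketch of the reminder trigger for apparatus 800.

    - data are only evaluated when vehicle speed <= the preset speed;
    - a reminder is sent when the first distance >= the first threshold
      and the first line of sight direction does not pay attention to
      the traffic element.
    """
    if speed_mps > preset_speed_mps:
        return False  # first/second data are not acquired above the preset speed
    return first_distance_m >= first_threshold_m and not gaze_on_element
```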
- the driving reminder device 800 here is embodied in the form of functional modules.
- the term “module” here can be implemented in the form of software and/or hardware, which is not specifically limited.
- a “module” may be a software program, a hardware circuit, or a combination of the two that implement the above-mentioned functions.
- the hardware circuits may include application-specific integrated circuits (ASICs), electronic circuits, processors (for example, shared processors, dedicated processors, or group processors) and memory for executing one or more software or firmware programs, merged logic circuits, and/or other suitable components that support the described functions.
- the driving reminder device 800 provided in this embodiment of the present application may be a processor in an automatic driving system, or may be a head unit in an autonomous vehicle, or may be a chip configured in the head unit, so as to execute the method described in the embodiments of the present application.
- FIG. 9 is a schematic block diagram of a driving reminder device 900 provided by an embodiment of the present application. It should be understood that the driving reminder device 900 shown in FIG. 9 is only an example, and the device 900 in this embodiment of the present application may further include other modules or units.
- the apparatus 900 can perform various steps in the methods of FIGS. 6 and 7 .
- the acquiring unit 910 can be used to perform S610 in the method 600, and the sending unit 920 can be used to perform S620 in the method 600; the apparatus 900 can also be used to perform S730 and S740 in the method 700.
- the apparatus 900 may be specifically as follows:
- the obtaining unit 910 is configured to obtain first data from a first sensor of the vehicle and second data from a second sensor of the vehicle, where the first data includes data of traffic elements around the vehicle, and the second data includes data of the driver of the vehicle;
- the sending unit 920 is configured to send reminder information to the driver according to the first data and the second data, where the reminder level of the reminder information is related to the distance range to which the first distance belongs and to the driver's fatigue driving level determined according to the second data.
- the first distance refers to the distance between the vehicle and the traffic element determined according to the first data.
- the reminder levels of the reminder information are different.
- At least one of reminder times, reminder frequencies and reminder strengths corresponding to the reminder information of different reminder levels is different.
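The patent states only that the reminder level of apparatus 900 is related both to the distance range and to the fatigue driving level. One hypothetical combination rule, shown purely for illustration (the max-based escalation is an assumption, not the patent's rule):

```python
def combined_reminder_level(distance_range: int, fatigue_level: int) -> int:
    """Combine distance range (1 = far .. 3 = near) and fatigue level
    (1..3) into a single reminder level by taking the more severe of
    the two inputs."""
    return max(distance_range, fatigue_level)
```

For example, a far preceding vehicle combined with severe fatigue would still produce a third-level reminder.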
- the driving reminder device 900 here is embodied in the form of functional modules.
- the term “module” here can be implemented in the form of software and/or hardware, which is not specifically limited.
- a “module” may be a software program, a hardware circuit, or a combination of the two that implement the above-mentioned functions.
- the hardware circuits may include application-specific integrated circuits (ASICs), electronic circuits, processors (for example, shared processors, dedicated processors, or group processors) and memory for executing one or more software or firmware programs, merged logic circuits, and/or other suitable components that support the described functions.
- the driving reminder device 900 provided in this embodiment of the present application may be a processor in an automatic driving system, or may be a head unit in an autonomous vehicle, or may be a chip configured in the head unit, so as to execute the method described in the embodiments of the present application.
- FIG. 10 is a schematic block diagram of a driving reminder device 1000 according to an embodiment of the present application.
- the apparatus 1000 shown in FIG. 10 includes a memory 1001 , a processor 1002 , a communication interface 1003 and a bus 1004 .
- the memory 1001, the processor 1002, and the communication interface 1003 are connected to each other through the bus 1004 for communication.
- the apparatus 1000 may be an exemplary structure of the sending unit in FIG. 8, or may be an exemplary structure of a chip that can be applied in the sending unit. In this example, the apparatus 1000 may be used to perform the steps or operations performed by the sending unit of the apparatus shown in FIG. 8.
- the apparatus 1000 may be an exemplary structure of the sending unit in FIG. 9 , or may be an exemplary structure of a chip that can be applied in the sending unit. In this example, the apparatus 1000 may be configured to perform the steps or operations performed by the sending unit in the method shown in FIG. 9 .
- the memory 1001 may be a read only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM).
- the memory 1001 may store a program, and when the program stored in the memory 1001 is executed by the processor 1002, the processor 1002 is used to execute each step of the driving reminder method of the embodiments of the present application, for example, each step of the embodiments shown in FIG. 4, FIG. 5, FIG. 6 and FIG. 7.
- the processor 1002 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for executing related programs, so as to implement the driving reminder method of the method embodiments of the present application.
- the processor 1002 may also be an integrated circuit chip with signal processing capability.
- each step of the driving reminder method in the embodiment of the present application may be completed by an integrated logic circuit of hardware in the processor 1002 or an instruction in the form of software.
- the above-mentioned processor 1002 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
- a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
- the steps of the method disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
- the software modules may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
- the storage medium is located in the memory 1001, and the processor 1002 reads the information in the memory 1001 and, in combination with its hardware, completes the functions required to be performed by the units included in the driving reminder device in the embodiments of the present application, or executes the driving reminder method of the method embodiments of the present application, for example, each step/function of the embodiments shown in FIG. 4, FIG. 5, FIG. 6 and FIG. 7.
- the communication interface 1003 may use, but is not limited to, a transceiver apparatus such as a transceiver to implement communication between the apparatus 1000 and other devices or a communication network.
- the bus 1004 may include a pathway for communicating information between the various components of the apparatus 1000 (eg, the memory 1001, the processor 1002, the communication interface 1003).
- the apparatus 1000 shown in the embodiments of the present application may be a processor in an automatic driving system, or may be a head unit in an autonomous vehicle, or may be a chip configured in the head unit.
- the processor in the embodiments of the present application may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
- a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
- the memory in the embodiments of the present application may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
- the non-volatile memory may be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory.
- Volatile memory may be random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
- the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
- the above-described embodiments may be implemented in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions or computer programs. When the computer instructions or computer programs are loaded or executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
- the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
- the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired or wireless (for example, infrared, radio, or microwave) manner.
- the computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or a data center, that contains one or more sets of available media.
- the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media.
- the semiconductor medium may be a solid state drive.
- "At least one" means one or more, and "plurality" means two or more.
- "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of single items or plural items.
- for example, at least one item of a, b, or c can represent: a, b, c, a-b, a-c, b-c, or a-b-c, where each of a, b, and c can be singular or plural.
- the sequence numbers of the above-mentioned processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
- the disclosed systems, devices and methods may be implemented in other manners.
- the apparatus embodiments described above are only illustrative.
- the division of the units is only a logical function division; in actual implementation, there may be other division methods.
- multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the shown or discussed mutual coupling, direct coupling, or communication connection may be implemented through some interfaces as an indirect coupling or communication connection between devices or units, and may be in electrical, mechanical, or other forms.
- the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
- each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
- the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
- the technical solutions of the present application essentially, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the various embodiments of the present application.
- the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Human Computer Interaction (AREA)
- Traffic Control Systems (AREA)
Abstract
The present invention relates to a method, apparatus and system for driving alerts, applicable to intelligent vehicles in the field of autonomous driving. The method comprises: acquiring first data from a first sensor of a vehicle and second data from a second sensor of the vehicle, the first data comprising data of traffic elements around the vehicle and the second data comprising data of the driver of the vehicle; and sending alert information when a first distance is greater than or equal to a first threshold and a first line-of-sight direction is not focused on a traffic element, the first distance referring to the distance between the vehicle and the traffic element determined on the basis of the first data, and the first line-of-sight direction referring to the driver's line-of-sight direction determined on the basis of the second data. The method in the embodiments of the present invention helps to increase the accuracy of driving alerts.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202080004885.6A CN112654547A (zh) | 2020-09-25 | 2020-09-25 | 驾驶提醒的方法、装置及系统 |
| PCT/CN2020/117687 WO2022061702A1 (fr) | 2020-09-25 | 2020-09-25 | Procédé, appareil et système pour des alertes de conduite |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2020/117687 WO2022061702A1 (fr) | 2020-09-25 | 2020-09-25 | Procédé, appareil et système pour des alertes de conduite |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022061702A1 true WO2022061702A1 (fr) | 2022-03-31 |
Family
ID=75368407
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2020/117687 Ceased WO2022061702A1 (fr) | 2020-09-25 | 2020-09-25 | Procédé, appareil et système pour des alertes de conduite |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN112654547A (fr) |
| WO (1) | WO2022061702A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114880262A (zh) * | 2022-05-31 | 2022-08-09 | 国汽智控(北京)科技有限公司 | 自动驾驶功能适配方法、可扩展多板卡系统、装置及设备 |
| CN115366907A (zh) * | 2022-08-12 | 2022-11-22 | 重庆长安汽车股份有限公司 | 驾驶员的状态异常提醒方法、装置、车辆及存储介质 |
| CN115416678A (zh) * | 2022-07-08 | 2022-12-02 | 重庆长安汽车股份有限公司 | 一种车辆控制方法及装置、电子设备、存储介质 |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113895453A (zh) * | 2021-10-22 | 2022-01-07 | 青岛海尔智能技术研发有限公司 | 车辆控制方法、装置、车辆和存储介质 |
| CN115675504A (zh) * | 2022-10-31 | 2023-02-03 | 华为技术有限公司 | 一种车辆告警方法以及相关设备 |
| CN116279556B (zh) * | 2023-03-03 | 2024-04-02 | 北京辉羲智能科技有限公司 | 一种提醒驾驶员接管的安全智能驾驶系统 |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2004078510A1 (fr) * | 2003-03-04 | 2004-09-16 | Angelo Gilardi | Dispositif de securite permettant de produire des vibrations dans le siege afin d'avertir les conducteurs de vehicules a moteur lorsque leur attention se relache |
| CN104408879A (zh) * | 2014-11-19 | 2015-03-11 | 湖南工学院 | 疲劳驾驶预警处理方法、装置及系统 |
| CN106295583A (zh) * | 2016-08-15 | 2017-01-04 | 深圳市华宝电子科技有限公司 | 一种提醒用户驾驶方式的方法及装置 |
| CN106274693A (zh) * | 2016-09-26 | 2017-01-04 | 惠州Tcl移动通信有限公司 | 一种驾车行驶中进行提醒的方法、系统及电子设备 |
| CN106408878A (zh) * | 2016-12-16 | 2017-02-15 | 苏州清研微视电子科技有限公司 | 一种考虑驾驶人疲劳状态及反应能力的车辆防撞预警系统 |
| CN108928294A (zh) * | 2018-06-04 | 2018-12-04 | Oppo(重庆)智能科技有限公司 | 行车危险的提醒方法、装置、终端及计算机可读存储介质 |
| CN109035718A (zh) * | 2018-07-31 | 2018-12-18 | 苏州清研微视电子科技有限公司 | 多因素融合的危险驾驶行为分级预警方法 |
| CN110271561A (zh) * | 2019-06-06 | 2019-09-24 | 浙江吉利控股集团有限公司 | 一种自动驾驶警示方法、装置及车辆 |
| CN110303883A (zh) * | 2018-03-27 | 2019-10-08 | 厦门歌乐电子企业有限公司 | 一种检测提醒装置、方法及辅助驾驶设备 |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109801511B (zh) * | 2017-11-16 | 2021-01-05 | 华为技术有限公司 | 一种碰撞预警方法及装置 |
| CN209176597U (zh) * | 2018-10-10 | 2019-07-30 | 深圳市国脉畅行科技股份有限公司 | 安全驾驶提醒装置 |
| CN110281923A (zh) * | 2019-06-28 | 2019-09-27 | 信利光电股份有限公司 | 一种车辆辅助变道方法、装置及系统 |
| CN110638474A (zh) * | 2019-09-25 | 2020-01-03 | 中控智慧科技股份有限公司 | 一种驾驶状态检测的方法、系统、设备及可读存储介质 |
| CN110934600A (zh) * | 2020-01-09 | 2020-03-31 | 河南省安信科技发展有限公司 | 一种基于脑电波监测的防疲劳驾驶预警装置及监测方法 |
| CN111634288A (zh) * | 2020-04-30 | 2020-09-08 | 长城汽车股份有限公司 | 疲劳驾驶监测方法、系统及智能识别系统 |
- 2020-09-25 CN CN202080004885.6A patent/CN112654547A/zh active Pending
- 2020-09-25 WO PCT/CN2020/117687 patent/WO2022061702A1/fr not_active Ceased
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2004078510A1 (fr) * | 2003-03-04 | 2004-09-16 | Angelo Gilardi | Dispositif de securite permettant de produire des vibrations dans le siege afin d'avertir les conducteurs de vehicules a moteur lorsque leur attention se relache |
| CN104408879A (zh) * | 2014-11-19 | 2015-03-11 | 湖南工学院 | 疲劳驾驶预警处理方法、装置及系统 |
| CN106295583A (zh) * | 2016-08-15 | 2017-01-04 | 深圳市华宝电子科技有限公司 | 一种提醒用户驾驶方式的方法及装置 |
| CN106274693A (zh) * | 2016-09-26 | 2017-01-04 | 惠州Tcl移动通信有限公司 | 一种驾车行驶中进行提醒的方法、系统及电子设备 |
| CN106408878A (zh) * | 2016-12-16 | 2017-02-15 | 苏州清研微视电子科技有限公司 | 一种考虑驾驶人疲劳状态及反应能力的车辆防撞预警系统 |
| CN110303883A (zh) * | 2018-03-27 | 2019-10-08 | 厦门歌乐电子企业有限公司 | 一种检测提醒装置、方法及辅助驾驶设备 |
| CN108928294A (zh) * | 2018-06-04 | 2018-12-04 | Oppo(重庆)智能科技有限公司 | 行车危险的提醒方法、装置、终端及计算机可读存储介质 |
| CN109035718A (zh) * | 2018-07-31 | 2018-12-18 | 苏州清研微视电子科技有限公司 | 多因素融合的危险驾驶行为分级预警方法 |
| CN110271561A (zh) * | 2019-06-06 | 2019-09-24 | 浙江吉利控股集团有限公司 | 一种自动驾驶警示方法、装置及车辆 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114880262A (zh) * | 2022-05-31 | 2022-08-09 | 国汽智控(北京)科技有限公司 | 自动驾驶功能适配方法、可扩展多板卡系统、装置及设备 |
| CN115416678A (zh) * | 2022-07-08 | 2022-12-02 | 重庆长安汽车股份有限公司 | 一种车辆控制方法及装置、电子设备、存储介质 |
| CN115366907A (zh) * | 2022-08-12 | 2022-11-22 | 重庆长安汽车股份有限公司 | 驾驶员的状态异常提醒方法、装置、车辆及存储介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN112654547A (zh) | 2021-04-13 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN113968216B (zh) | 一种车辆碰撞检测方法、装置及计算机可读存储介质 | |
| US20250013231A1 (en) | Remote Assistance for an Autonomous Vehicle in Low Confidence Situations | |
| CN110550029B (zh) | 障碍物避让方法及装置 | |
| CN112230642B (zh) | 道路可行驶区域推理方法及装置 | |
| JP7620784B2 (ja) | 車両運転モード切り替えを制御する方法及び装置 | |
| US20200033877A1 (en) | Assisted Perception For Autonomous Vehicles | |
| CN114056346B (zh) | 一种自动驾驶行车控制方法及装置 | |
| WO2021102955A1 (fr) | Procédé et appareil de planification de trajet pour véhicule | |
| US20230222914A1 (en) | Vehicle reminding method and system, and related device | |
| CN112672942B (zh) | 一种车辆换道方法及相关设备 | |
| WO2022061702A1 (fr) | Procédé, appareil et système pour des alertes de conduite | |
| WO2021212379A1 (fr) | Procédé et appareil de détection de ligne de délimitation de voie | |
| CN110471411A (zh) | 自动驾驶方法和自动驾驶装置 | |
| WO2022062825A1 (fr) | Procédé, dispositif de commande de véhicule et véhicule | |
| CN110077410A (zh) | 对在预定情况中的自主车辆的远程协助 | |
| CN112654546B (zh) | 用户感兴趣对象的识别方法以及识别装置 | |
| WO2022001432A1 (fr) | Procédé d'inférence de voie et procédé et appareil d'entraînement de modèle d'inférence de voie | |
| CN110789533A (zh) | 一种数据呈现的方法及终端设备 | |
| JP7554937B2 (ja) | 制御方法および制御装置 | |
| WO2023015510A1 (fr) | Procédé d'évitement de collision et appareil de commande | |
| CN113859265A (zh) | 一种驾驶过程中的提醒方法及设备 | |
| WO2022062582A1 (fr) | Procédé et appareil de commande de temps d'apport de lumière d'un module de caméra | |
| WO2022041820A1 (fr) | Procédé et appareil pour planification de trajectoire de changement de file | |
| CN115675504A (zh) | 一种车辆告警方法以及相关设备 | |
| CN116547732A (zh) | 确定泊出方向的方法和装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20954554 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 20954554 Country of ref document: EP Kind code of ref document: A1 |