
WO2025062624A1 - Driving control method and driving control device - Google Patents

Driving control method and driving control device

Info

Publication number
WO2025062624A1
Authority
WO
WIPO (PCT)
Prior art keywords
avoidance
detection information
vehicle
camera
driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/034501
Other languages
French (fr)
Japanese (ja)
Inventor
幸熙 小泉
明 森本
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Priority to PCT/JP2023/034501
Publication of WO2025062624A1
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10Path keeping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • a driving assistance technology is known that offsets the target course of the vehicle in a direction away from the other vehicle when it is determined that the vehicle and another vehicle will be side-by-side.
  • the problem that this invention aims to solve is to prevent the vehicle's behavior from being disrupted by erroneous detection by the sensor.
  • the present invention solves the above problem by controlling the execution of avoidance driving that separates the host vehicle from a rear side vehicle when a predetermined avoidance condition is satisfied: avoidance driving is made easier to execute when the detection information acquired from both a camera and a radar device satisfies the avoidance condition than when only the detection information acquired from the radar device satisfies it, and easier to execute when only the detection information acquired from the radar device satisfies the avoidance condition than when only the detection information acquired from the camera satisfies it.
  • the present invention can prevent the vehicle's behavior from being disrupted by erroneous detection by the sensor.
  • FIG. 1 is a block diagram showing the hardware configuration of the driving control system.
  • FIG. 2 is a flowchart showing the procedure of avoidance driving control.
  • FIG. 3 is a diagram for explaining avoidance driving.
  • FIG. 4 is a diagram for explaining the ease of executing avoidance driving.
  • FIG. 1 shows the configuration of a driving control system 100 equipped with a vehicle driving control device 1 according to this embodiment.
  • This driving control method is implemented using each device (hardware) of the perception system, judgment and control system, and information system of the driving control system 100, which includes the processor 10 of the driving control device 1.
  • the driving control system 100 is equipped with one or more sensors 2, an own vehicle information acquisition device 3, an other vehicle information acquisition device 4, and a rear vehicle recognition device 5.
  • the processor 10 acquires observed values from each of the perception system devices, and uses known methods to determine the state and changes in the position, attitude, and motion (speed, acceleration, etc.) of objects including the own vehicle and other vehicles.
  • a plurality of sensors 2 are provided in the vehicle, forming a sensor group that cooperates with each other.
  • the sensor 2 detects the presence or absence of objects, including other vehicles, in the entire periphery of the vehicle, the distance to the objects, and the relative speed and relative acceleration of the objects.
  • the detection information acquired by the sensor 2 is provided to the processor 10.
  • the sensor 2 includes a single or multiple cameras 21 arranged on the vehicle.
  • the single or multiple cameras 21 capture images of the surroundings of the vehicle in all directions.
  • the cameras 21 include an image sensor equipped with an imaging element such as a CCD, an ultrasonic camera, and an infrared camera.
  • the cameras 21 include at least a front camera that captures images of the front of the vehicle, a rear camera that captures images of the rear or rear sides of the vehicle, and left and right side cameras that capture images of the left and right sides of the vehicle, and the front and rear of the left and right sides.
  • the form of the camera 21 is not limited as long as images can be captured in all directions around the vehicle.
  • a single camera 21 attached to a base having a rotation mechanism may be used, or this may be used in combination with multiple other cameras 21.
  • the sensor 2 includes a radar device 22 that detects (measures) the presence of an object around the vehicle, the object's position, and a change in position.
  • the radar device 22 is a device that measures the distance and direction to the object by emitting electromagnetic waves toward the object and measuring the reflected waves.
  • the radar device 22 includes a laser radar, a millimeter wave radar (LRF), a LiDAR (light detection and ranging) unit, an ultrasonic radar, and a sonar.
  • the sensor 2 includes a GPS (Global Positioning System) unit, a gyro sensor, a vehicle speed sensor, and the like, and detects the position of the vehicle at each timing. Each sensor 2 can also acquire information from an in-vehicle device and an external device according to its respective function.
  • Each sensor 2 transmits the acquired detection information to the host vehicle information acquisition device 3, the other vehicle information acquisition device 4, the rear vehicle recognition device 5, or the processor 10 in response to a request or command.
  • the processor 10 may acquire the detection information directly from the camera 21 and the radar device 22, or may acquire the detection information via the host vehicle information acquisition device 3, the other vehicle information acquisition device 4, or the rear vehicle recognition device 5, which will be described later.
  • the host vehicle information acquisition device 3 calculates the host vehicle's current position, attitude, speed, acceleration, behavior and direction of travel based on the detection information acquired from the sensor 2, and provides the calculated information to the processor 10.
  • the other vehicle information acquisition device 4 calculates the position, attitude, speed, acceleration, behavior and direction of travel of objects including other vehicles around the host vehicle based on the detection information acquired from the sensor 2, and provides the calculated information to the processor 10.
  • the rear side vehicle recognition device 5 recognizes other vehicles traveling in lanes adjacent to the host vehicle's lane behind the host vehicle based on the detection information acquired from the sensor 2, and recognizes these as "rear side vehicles.” Rear side vehicles are located to the rear side of the host vehicle. Only other vehicles that exist within a predetermined distance range that the host vehicle should recognize may be recognized as rear side vehicles.
  • the recognition results of the rear side vehicles are provided to the processor 10 as rear side vehicle information.
  • the rear vehicle information includes one or more of the following: the presence of the rear vehicle, the position of the rear vehicle, the lateral position of the rear vehicle in the lane (position in the road width direction), the lateral position of the rear vehicle relative to the lane mark, the speed of the rear vehicle, the acceleration of the rear vehicle, and the vehicle type of the rear vehicle.
  • the vehicle type of the rear vehicle is a type such as a saddle-type vehicle such as a two-wheeled vehicle or a three-wheeled vehicle, a large vehicle, a regular car, or a truck.
  • the type of the rear vehicle can be determined from the shape characteristics such as the width and/or height and size of the rear vehicle.
  • the above-mentioned host vehicle information, other vehicle information, and rear vehicle information may be determined by the processor 10 based on the detection information of the sensor 2.
  • the driving control system 100 has map information 6 and lane information 61 as information system devices.
  • the map information 6 and lane information 61 are recorded in a storage device accessible to the processor 10 and/or in an external server accessible via the communication device 30.
  • the map information 6 is high-precision map information that includes lane information 61 that is referenced when performing autonomous lane change control.
  • the lane information 61 includes identification information that identifies each of the multiple lanes that belong to a road.
  • the navigation device 7 refers to the map information 6 and calculates a route to a set destination. This route includes a target trajectory. The calculated route and target trajectory are provided to the vehicle controller 200 and used for autonomous driving control.
  • the vehicle controller 200 includes a steering control device 210 and a drive control device 220.
  • the vehicle controller 200 acquires command values for autonomous driving control according to a driving plan proposed by the processor 10 of the driving control device 1, and drives the vehicle along a route to the destination.
  • the route is composed of multiple consecutive target trajectories to which command values are associated.
  • the target trajectory includes an avoidance trajectory for avoidance driving.
  • the avoidance trajectory is calculated based on a target lateral position for avoiding a rear vehicle.
  • the command values for driving control are generated by the vehicle controller 200 or the processor 10.
  • the command values include a set speed when driving the vehicle, and the vehicle controller 200 drives the vehicle according to the set speed.
  • the driving control system 100 has a driving control device 1 and a vehicle controller 200.
  • the driving control device 1 controls autonomous driving to make the vehicle travel along a target trajectory.
  • the target trajectory includes an avoidance trajectory for avoidance driving.
  • the processor 10 provided in the driving control device 1 has a ROM (Read Only Memory) 12 that stores a program for controlling the autonomous driving, a CPU (Central Processing Unit) 11 that executes the program stored in this ROM 12, and a RAM (Random Access Memory) 13 that functions as an accessible storage device.
  • the processor 10 implements this driving control method using each piece of hardware of the driving control system 100.
  • the processor 10 executes each function by working in cooperation with the software and each piece of hardware shown in Figure 1 to realize at least a function for determining whether the avoidance conditions are satisfied or not, a function for determining the ease of executing avoidance driving, and an automatic driving function for executing driving control including avoidance driving according to the ease of execution.
  • Avoidance driving in this embodiment is performed in order for the host vehicle to avoid approaching the rear vehicle, which is another vehicle traveling behind the host vehicle on a lane adjacent to the host vehicle's driving lane.
  • the rear vehicle may change its lateral position to approach the host vehicle.
  • the driving control device 1 executes avoidance driving to avoid the rear vehicle by autonomous control.
  • This avoidance driving includes lateral movement of the host vehicle along the width direction of the driving lane.
  • the processor 10 acquires detection information from the sensor 2 including the camera 21 and the radar device 22 (S1).
  • the detection information includes detection information based on imaging information from the camera 21 and detection information based on observation information from the radar device 22.
  • Each piece of detection information includes an identifier for distinguishing the sensor 2 that performed the detection.
  • the processor 10 distinguishes between the detection information from the camera 21 and the detection information from the radar device 22 by referring to the identifier.
  • the processor 10 acquires vehicle information such as the current position and speed of the vehicle from the vehicle information acquisition device 3 as necessary (S2).
  • the processor 10 acquires lane information of the current lane of the vehicle as one of the vehicle information by referring to the detection information of the sensor 2 or the lane information 61 of the map information 6 (S2).
  • the lane is included in the route to the destination, and this route may be acquired from the navigation device 7.
  • the position of the lane of the vehicle and the position of the lane marker may be acquired from the detection information of the sensor 2.
  • the processor 10 acquires other vehicle information such as the presence or absence, position (distance), relative speed, and relative acceleration of other vehicles traveling around the vehicle (S3).
  • the other vehicles include a rear side vehicle traveling in a lane adjacent to the lane of the vehicle behind the vehicle.
  • the processor 10 acquires a judgment of whether or not a rear side vehicle traveling in a lane adjacent to the lane of the vehicle exists, that is, whether or not a rear side vehicle has been detected, from the rear side vehicle recognition device 5 (S4).
  • the rear vehicle is a vehicle to be avoided in the driving control of the own vehicle, and may be a vehicle whose distance or time to collision (hereinafter referred to as TTC: Time-To-Collision) from the own vehicle is within a predetermined range.
  • the processor 10 observes objects on the rear side based on the detection information of the sensor 2 until the rear vehicle is detected (NO in S4).
  • the recognition process of the rear vehicle may be performed by the processor 10.
  • the processor 10 reads a predefined avoidance condition and judges whether or not the detection information of the sensor 2 satisfies the avoidance condition (S5).
  • the avoidance condition is a condition for judging whether or not avoidance driving to separate the host vehicle from the rear vehicle can be performed.
  • the avoidance condition is predefined based on the proximity between the rear vehicle and the host vehicle.
  • the proximity is a risk index that indicates the degree of proximity between the host vehicle and the rear vehicle. The higher the proximity, the closer the host vehicle and the rear vehicle are, and the lower the proximity, the farther the host vehicle and the rear vehicle are.
  • the avoidance condition is determined to be satisfied when the approach risk indicated by the approach degree is equal to or greater than a predetermined threshold (high approach risk state).
  • the approach degree may be defined using the lateral distance, lateral speed, and lateral acceleration.
  • the approach degree can be expressed using index values known at the time of filing, such as the distance between the host vehicle and the rear vehicle, the lateral distance along the lane width direction, the TTC value, and their reciprocals and derivatives, compared against thresholds defined for those values (a minimal code sketch of such a check appears after this list).
  • the TTC is an index value based on the time until the rear vehicle and the host vehicle would come into contact, assuming that the current relative speed is maintained.
  • the processor 10 analyzes the type of sensor 2 that provided the detection information that satisfied the avoidance conditions. From the detection information that satisfied the avoidance conditions, the processor 10 extracts detection information based on the imaging information of the camera 21, and also extracts detection information based on the observation information of the radar device 22, and analyzes whether the detection information that satisfied the avoidance conditions is detection information from the camera 21, the radar device 22, or both.
  • the processor 10 classifies the determination result of whether the avoidance condition is satisfied into the following three patterns. The classification is performed from the viewpoint of whether the sensor 2 that outputs the detection information that is determined to satisfy the avoidance condition is only the camera 21, only the radar device 22, or both the camera 21 and the radar device 22.
  • the avoidance condition includes an avoidance condition related to the imaging information of the camera 21 and an avoidance condition related to the observation information of the radar device 22.
  • the processor 10 determines whether the event that the avoidance condition is satisfied is an event in which the detection information of the camera 21 and the radar device 22 satisfy each avoidance condition, an event in which only the detection information of the radar device 22 satisfies the avoidance condition related to the observation information of the radar device 22, or an event in which only the detection information of the camera 21 satisfies the avoidance condition related to the imaging information of the camera 21.
  • the patterns P1, P2, and P3 of each event are described below (an illustrative code sketch of this classification appears after this list).
  • <Pattern P1: satisfied by the detection information of both the camera 21 and the radar device 22>
  • In pattern P1, the judgment result is that the detection information acquired from the camera 21 satisfies the avoidance condition and the detection information acquired from the radar device 22 also satisfies the avoidance condition.
  • That is, the approach degree based on the imaging information of the camera 21 is equal to or greater than the threshold set for the imaging information, and the approach degree based on the observation information of the radar device 22 is equal to or greater than the threshold set for the observation information. This is the case where the avoidance condition is satisfied by both the detection result of the camera 21 and the detection result of the radar device 22.
  • <Pattern P2: satisfied only by the detection information of the radar device 22>
  • In pattern P2, the judgment result is that only the detection information acquired from the radar device 22 satisfies the avoidance condition.
  • the detection information acquired from the camera 21 does not satisfy the avoidance condition.
  • the approach degree based on the imaging information of the camera 21 is less than the threshold set for the imaging information (not satisfied), and the approach degree based on the observation information of the radar device 22 is equal to or greater than the threshold set for the observation information (satisfied).
  • <Pattern P3: satisfied only by the detection information of the camera 21> In pattern P3, the judgment result is that only the detection information acquired from the camera 21 satisfies the avoidance condition.
  • the detection information acquired from the radar device 22 does not satisfy the avoidance condition.
  • the approach degree based on the imaging information of the camera 21 is equal to or greater than the threshold set for the imaging information (satisfied), and the approach degree based on the observation information of the radar device 22 is less than the threshold set for the observation information (not satisfied).
  • the processor 10 sets different "ease of execution” for the avoidance driving depending on the analyzed patterns P1, P2, P3 of the determination result of the satisfaction of the avoidance conditions.
  • the ease of execution is the ease with which the avoidance driving is executed, and is defined as the level of ease of the avoidance driving.
  • the "ease of execution” uses the speed of starting the avoidance driving and the speed of the execution process of the avoidance driving (the shortness of time required for the avoidance driving) as evaluation indexes.
  • When the detection information of both the camera 21 and the radar device 22 satisfies the avoidance condition (pattern P1), the processor 10 sets the ease of execution of the avoidance driving to level E1 (S8) and proceeds to S11. If the detection information of the camera 21 does not satisfy the avoidance condition (NO in S6) and only the detection information of the radar device 22 satisfies the avoidance condition (YES in S7'), the processor 10 sets the ease of execution of the avoidance driving to level E2 (< E1) (S9) and proceeds to S11.
  • When only the detection information of the camera 21 satisfies the avoidance condition (pattern P3), the processor 10 sets the ease of execution of the avoidance driving to level E3 (< E2) (S10) and proceeds to S11. If it is determined that the avoidance conditions are met (YES in S5) but neither the detection information of the camera 21 nor that of the radar device 22 is found to satisfy the avoidance conditions (NO in S6, NO in S7'), the process returns to S1 and the determination is made again.
  • When the event is pattern P1, the ease of execution of level E1 is associated; when it is pattern P2, the ease of execution of level E2 is associated; and when it is pattern P3, the ease of execution of level E3 is associated (a code sketch of this association and of the related timing parameters appears after this list).
  • the respective ease-of-execution levels E have the relationship level E1 > level E2 > level E3.
  • the ease of execution of avoidance maneuver at level E1 is the highest, and avoidance maneuver is most likely to be executed.
  • the ease of execution of avoidance maneuver at level E3 is the lowest, and avoidance maneuver is least likely to be executed.
  • the ease of execution of avoidance maneuver at level E2 is a medium degree, lower than level E1 and higher than level E3.
  • When it is determined that the avoidance conditions are satisfied based on the detection information acquired from both the camera 21 and the radar device 22 (pattern P1), a level E1 (> level E2) is set at which evasive maneuvering is more likely to be performed than when it is determined that the avoidance conditions are satisfied based only on the detection information acquired from the radar device 22 (pattern P2).
  • When it is determined that the avoidance conditions are satisfied based only on the detection information acquired from the radar device 22 (pattern P2), a level E2 (> level E3) is set at which evasive maneuvering is more likely to be performed than when it is determined that the avoidance conditions are satisfied based only on the detection information acquired from the camera 21 (pattern P3).
  • FIG. 3 shows the movements of the host vehicle V1 and the rear vehicle V2 from the decision to perform evasive maneuvering to the completion of its execution.
  • the direction indicated by the arrow Y is the vertical direction along the traveling direction of the host vehicle V1
  • the direction indicated by the arrow X is the horizontal direction along the width direction or road width direction of the host vehicle V1.
  • FIG. 3(a) shows the host vehicle V1 (T1) and the rear vehicle V2 (T1) at timing T1.
  • the host vehicle V1 (T1) is traveling in lane L3, and the rear vehicle V2 (T1) is traveling behind the host vehicle V1 in the adjacent lane L2.
  • FIG. 3(b) shows the host vehicle V1 (T2) and the rear vehicle V2 (T2) at timing T2.
  • the processor 10 calculates the approach degree between the host vehicle V1 (T2) and the rear vehicle V2 (T2) to determine whether the avoidance condition is satisfied or not satisfied.
  • the approach degree indicates the risk of the host vehicle V1 and the rear vehicle V2 approaching each other, and is a requirement for defining the avoidance condition.
  • the approach degree is calculated based on the distance D, the lateral distance DX, or the vertical distance DY between the host vehicle V1 (T2) and the rear vehicle V2 (T2).
  • the processor 10 compares the threshold of the approach degree, defined in terms of distance, with the actually measured distance to determine whether the avoidance condition is satisfied or not satisfied.
  • the approach degree may be calculated as a risk assessment value using the TTC, which takes into account the relative speed between the host vehicle V1 (T2) and the rear vehicle V2 (T2).
  • the processor 10 may perform evasive maneuvering to shift the lateral position V1X2 of the host vehicle V1 (T2) to a lateral position V11X3 within the lane L3 in which the host vehicle V1 (T2) is traveling, or may perform evasive maneuvering to move the host vehicle V1 (T2) from the lateral position V1X2 in the lane L3 in which the host vehicle V1 (T2) is traveling to a lateral position V12X3 in a lane L4 adjacent to the lane L3.
  • the timing T2 when the detection information satisfying the avoidance condition is acquired is the timing when the execution of the avoidance operation is decided.
  • the evasive maneuver is started at a timing TS after the waiting time TR2 has elapsed.
  • the waiting time TR2 is set as a time for confirmation, rather than starting the movement of the host vehicle immediately after the decision.
  • the execution time TR3 from the timing TS at which the evasive maneuver is started to the timing T3 at which the evasive maneuver is completed is the time for the host vehicle V1 to move to the destination calculated to avoid the rear vehicle V2.
  • the start timing of the evasive maneuver can be determined by the start of a lateral position change to separate the host vehicle V1 from the rear vehicle V2, the start of steering control, a change in the angle of the wheels, etc.
  • the completion timing of the evasive maneuver can be determined by the completion of the movement of the targeted lateral position, the completion of steering control, and the direction of the wheels becoming the vehicle length direction.
  • the start timing of an evasive maneuver involving a lane change can be determined by the above criteria and, in addition, by one or both front wheels crossing the lane boundary.
  • the completion timing of an evasive maneuver involving a lane change can be determined by the following: one or both rear wheels have crossed the lane boundary, the reference position of the vehicle body (such as the center of gravity) has entered the new lane, all or part of the vehicle body belongs to the new lane, the orientation of the vehicle body has returned from the left or right direction (when turning) to a straight direction, or the posture of the vehicle body after steering matches the direction of the lane.
  • the ease of performing evasive maneuvering, that is, the ease of execution, will now be explained in detail.
  • when evasive maneuvering is easy to perform (high ease of execution), it means that the time spent on evasive maneuvering is relatively short.
  • when evasive maneuvering is difficult to perform (low ease of execution), it means that the time spent on evasive maneuvering is relatively long.
  • the time spent on evasive maneuvering refers to the total time TR1 from the timing T2 when detection information that satisfies the avoidance conditions shown in Figure 4 is acquired to the timing T3 when evasive maneuvering is completed.
  • when evasive maneuvering is easy to perform (high ease of execution), it means that evasive maneuvering is started relatively early (swiftly) and that evasive maneuvering is performed relatively quickly (fast).
  • when evasive maneuvering is performed early, it means that the waiting time TR2 from the timing T2, when detection information that satisfies the avoidance conditions in FIG. 4 is acquired, to the timing TS, when evasive maneuvering begins, is short.
  • that the avoidance operation is difficult to execute (low execution ease) means that the avoidance operation is executed relatively late and that the avoidance operation is executed relatively slowly.
  • that the avoidance operation is executed late means that the waiting time TR2 from the timing T2, when the information satisfying the avoidance condition in FIG. 4 is acquired, to the timing TS, when the avoidance operation starts, is long.
  • the waiting time TR2 is a time provided for a process to check the reliability of the detection information, such as whether noise is included in the detection information, before the avoidance operation starts.
  • the avoidance operation is executed quickly means that the execution time TR3 from the timing TS when the avoidance operation starts to the timing T3 when the avoidance operation is completed in FIG. 4 is short.
  • An avoidance operation in which any of the total time TR1, the waiting time TR2, or the execution time TR3 is relatively short is evaluated as being easy to execute (high execution ease), and an avoidance operation in which any of the total time TR1, the waiting time TR2, or the execution time TR3 is relatively long is judged as being difficult to execute (low execution ease).
  • An avoidance operation that is easy to execute (high execution ease) means that the avoidance operation is executed early (the waiting time TR2 is short) or that the avoidance operation is executed quickly (the execution time TR3 is short).
  • an avoidance operation that is difficult to execute (low execution ease) means that the execution of the avoidance operation is delayed (the waiting time TR2 is long) or that the avoidance operation is executed slowly (the execution time TR3 is long).
  • An avoidance operation that is easy to execute means that the total time TR1 required to complete the execution of the avoidance operation is relatively short. Conversely, an avoidance operation that is difficult to execute (low execution ease) means that the total time TR1 is relatively long.
  • by shortening the waiting time TR2 and/or the execution time TR3, the overall time TR1 can be shortened. Note that, in cases where careful confirmation or cautious driving is required, it is preferable to make avoidance maneuvers difficult to execute (to lower the ease of execution).
  • the processor 10 acquires the details of the evasive maneuver associated with the execution ease levels E1, E2, and E3 set according to the patterns P1, P2, and P3 of the event that the avoidance condition is satisfied.
  • in the evasive maneuver of level E1 set for pattern P1, the total time TR1(1) from the timing T2 at which the information on the rear vehicle V2 that satisfies the avoidance condition is acquired to the timing T3 at which the execution of the evasive maneuver is completed is shorter than the total time TR1(2) in the evasive maneuver of level E2 set for pattern P2.
  • the total time TR1(2) in the evasive maneuver of level E2 set for pattern P2 is shorter than the total time TR1(3) in the evasive maneuver of level E3 set for pattern P3.
  • the relationship of the total times TR1 is TR1(1) < TR1(2) < TR1(3).
  • the total time TR1 includes the waiting time TR2 and the execution time TR3. Reducing the waiting time TR2 and/or the execution time TR3 reduces the total time TR1.
  • in the avoidance maneuver of level E1 set for pattern P1, the waiting time TR2(1) from the timing T2 when the information on the rear vehicle V2 that satisfies the avoidance condition is acquired to the timing TS when the execution of the avoidance maneuver starts is shorter than the waiting time TR2(2) in the avoidance maneuver of level E2 set for pattern P2.
  • the waiting time TR2(2) in the avoidance maneuver of level E2 set for pattern P2 is shorter than the waiting time TR2(3) in the avoidance maneuver of level E3 set for pattern P3.
  • the relationship of the waiting times TR2 is TR2(1) < TR2(2) < TR2(3).
  • the execution time TR3(1) from the timing TS when the execution of the avoidance operation of the level E1 set in the pattern P1 starts to the timing T3 when the execution of the avoidance operation is completed is shorter than the execution time TR3(2) of the avoidance operation of the level E2 set in the pattern P2.
  • the execution time TR3(2) of the avoidance operation of the level E2 set in the pattern P2 is shorter than the execution time TR3(3) of the avoidance operation of the level E3 set in the pattern P3.
  • the relationship of the execution times TR3 is TR3(1) < TR3(2) < TR3(3).
  • the execution time TR3 in the avoidance driving can be shortened by increasing the moving speed of the vehicle V1, particularly the moving speed (lateral speed) along the lateral direction, which is the road width direction in which the rear vehicle V2 traveling in the adjacent lane approaches the vehicle V1.
  • the lateral moving speed can be improved by increasing the speed, turning speed, and turning acceleration, and the vehicle V1 can move to the target avoidance position in a short time. Therefore, the lateral movement speed VL(1) of the vehicle V1 defined in the execution of the avoidance driving in the avoidance driving of the level E1 set in the pattern P1 is higher than the lateral movement speed VL(2) in the avoidance driving of the level E2 set in the pattern P2.
  • the lateral movement speed VL(2) of the level E2 set in the pattern P2 is higher than the lateral movement speed VL(3) of the level E3 set in the pattern P3.
  • the lateral movement speed VL has a relationship of VL(1)>VL(2)>VL(3).
  • the driving control device 1 of this embodiment includes a camera 21 and a radar device 22 as the sensor 2.
  • the camera 21 and the radar device 22 detect and measure distance to an object based on their respective detection characteristics. Even if the camera 21 and the radar device 22 have high performance, false detection is inevitable depending on the detection environment.
  • the detection information of the camera 21 has a characteristic that it is more susceptible to noise than the detection information of the radar device 22 and has a relatively low measurement accuracy of the vertical distance. According to an experiment by the inventors, the distance measurement performance of the camera 21 in the automatic driving control tends to be relatively inferior to that of the radar device 22. In particular, the radar device 22 tends to have a better distance measurement performance than the camera 21 for distant objects.
  • the inventors focus on the detection characteristics of the camera 21 and the radar device 22, and analyze a pattern P of events that satisfy the avoidance conditions from the standpoint of whether the sensor 2 that provided detection information that satisfies the avoidance conditions is both the camera 21 and the radar device 22, only the radar device 22, or only the camera 21. They then associate each pattern P with a level E that indicates the ease of evasive driving (ease of execution).
  • when the detection information of both the camera 21 and the radar device 22 satisfies the avoidance condition, the reliability is higher than when only the detection information of the camera 21 satisfies the avoidance condition, and the ease of execution of the avoidance operation is set relatively high.
  • the reliability of the detection information is evaluated in stages based on the type (camera 21 and/or radar device 22) of the sensor 2 that outputs the detection information that satisfies the avoidance condition and the combination thereof, and the avoidance operation can be performed at different levels of ease of execution E depending on the evaluation result.
  • the event of satisfying the avoidance condition is classified into a plurality of patterns P, the reliability of the event is evaluated for each pattern P, and the level of ease of execution E is set in stages according to the reliability of each event.
  • the vehicle can thus exhibit agile and quick movement.
  • when the satisfaction of the avoidance conditions is determined based only on the detection information of the camera 21, which has a relatively high possibility of false detection, the ease of execution of the avoidance driving is lowered, so that the vehicle is prevented from being moved and avoidance driving is prevented from being performed on the basis of a false detection by the camera 21.
  • the detection results are analyzed based on whether the avoidance conditions are satisfied by the detection information of the camera 21 and whether they are satisfied by the detection information of the radar device 22, the results are associated with the patterns P1, P2, and P3, and the order of the ease of execution according to the patterns P1-P3 is defined, so that avoidance driving control can be performed taking into account the detection characteristics of the camera 21 and the radar device 22 used in combination. Simply confirming and supplementing the detection information of the camera 21 with the detection information of the radar device 22 would be expected to result in the avoidance driving being performed in the same manner.
  • the conclusion that the avoidance conditions are satisfied is divided into patterns P1, P2, and P3, and the ease-of-execution levels E1, E2, and E3 are associated with these patterns P1, P2, and P3 in stages, thereby reducing the number of situations in which the start of the avoidance driving is delayed or the movement speed is suppressed.
  • driving that includes lateral movement, such as avoidance driving, requires accurate judgment and quick movement.
  • that the avoidance driving is likely to be performed means that the avoidance driving is started early and completed in a short time.
  • when the avoidance conditions are satisfied by the detection information of both the camera 21 and the radar device 22 (pattern P1), the waiting time TR2 is shortened compared to when the avoidance conditions are satisfied only by the detection information of the camera 21 or only by that of the radar device 22, so that the avoidance driving can be started promptly without missing an opportunity for lateral movement.
  • the avoidance driving can be performed promptly by shortening the execution time TR3.
  • the execution time TR3 can be shortened by increasing the speed of lateral movement during the avoidance driving. As a result, the overall time TR1 is shortened, and the host vehicle V1 can respond sensitively to the detection result and perform the avoidance driving with smooth and prompt movement without hesitation.
  • when the avoidance conditions are satisfied only by the detection information of the camera 21 (pattern P3), the total time TR1, the waiting time TR2, and the execution time TR3 can be set relatively longer than when the avoidance conditions are satisfied by the detection information of the radar device 22 (patterns P1 and P2).
  • in this case, careful avoidance driving can be performed by checking the detection information in advance and taking time to perform the avoidance driving at a low speed.
  • in pattern P2, the total time TR1, the waiting time TR2, and the execution time TR3 are set longer than the respective times for pattern P1 and shorter than the respective times for pattern P3, thereby making it possible to realize avoidance driving whose ease of execution varies in stages according to the reliability.
  • the processor 10 performs a correction process for the content of the evasive maneuver (a sketch of such a correction in code appears after this list).
  • in this correction process, the content of the evasive maneuver is changed according to the level. Specifically, the content of the evasive maneuver is corrected by extending or shortening the total time TR1, the waiting time TR2, and the execution time TR3 of the evasive maneuver, or by increasing or decreasing the avoidance speed or the degree of avoidance turning.
  • the processor 10 corrects the content of the evasive maneuver related to the ease of executing the evasive maneuver (ease of execution) taking into account the weather conditions and/or the vehicle condition of the rear vehicle V2.
  • the processor 10 calculates the reliability of the detection result by the sensor 2 including the camera 21 and the radar device 22, and when the calculated reliability is relatively low, the processor 10 lengthens the total time TR1 from the timing T2 at which the information of the rear vehicle V2 that satisfies the avoidance conditions is acquired to the timing T3 at which the execution of the evasive maneuver is completed, compared to when the reliability is relatively high. Specifically, the processor 10 lengthens the waiting time TR2 and/or execution time TR3 included in the total time TR1. In this process, the lower the calculated reliability, the longer the total time TR1, the waiting time TR2 and/or execution time TR3 included therein may be.
  • the detection information of the camera 21 is prone to noise, and has a characteristic that the measurement accuracy of distant positions is relatively low.
  • the detection accuracy of the camera 21 and the radar device 22 may be affected by weather.
  • the image captured by the camera 21 may have whiteout due to the effects of high-intensity sunlight, its reflected light, and snowfall, and object recognition and position measurement may not be accurate.
  • the irradiation and reception of electromagnetic waves by the radar device 22 may be affected by rain and snowfall, and object recognition and position measurement may not be accurate.
  • if the rear vehicle V2 is a small vehicle such as a motorcycle, the camera 21 may not be able to capture the target accurately, and the radar device 22 may not be able to receive sufficient reflected waves and may not be able to accurately recognize and measure the distance to the target. If the rear vehicle V2 is a truck with a large width and length, it may not fit within the field of view of the camera 21 and only a portion of it may be recognized, making it impossible to capture the image accurately.
  • the radar device 22 may also not be able to receive reflected waves from the object boundary and may not be able to accurately recognize and measure the distance to the entire object.
  • the detection accuracy of the sensor 2 differs depending on the environment, and also differs depending on the type of sensor 2 (camera 21, radar device 22).
  • the environmental conditions (weather conditions) and/or the vehicle type of the rear vehicle V2 that affect the detection accuracy of the camera 21 and the radar device 22 are defined in advance.
  • the processor 10 calculates the reliability of the detection information when an influencing environment is detected.
  • the weather conditions can be acquired from an external server via the communication device 30.
  • Rain and snowfall can be detected by the operation of the wipers equipped in the host vehicle V1.
  • Sunlight and reflected light can be acquired from a light meter equipped in the host vehicle V1.
  • Whether the rear vehicle V2 is a motorcycle can be determined by a pattern matching method based on an image of the object.
  • Whether the rear vehicle V2 is a truck can be determined by learning image features such as the object boundary not being within the angle of view of the camera 21 and the object boundary based on the received waves of the radar device 22 not being continuous.
  • the content of the avoidance driving can be corrected according to the reliability of the detection information.
  • when the influence of such external factors is not detected, the overall time TR1, the waiting time TR2, and/or the execution time TR3 can be shortened, so that avoidance driving free of the influence of external factors is performed.
  • the processor 10 corrects the content of the avoidance driving related to the ease of executing the avoidance driving (execution ease) based on the relative distance between the host vehicle V1 and the rear vehicle V2 (a combined code sketch of these camera-related corrections appears after this list). Both the camera 21 and the radar device 22 have a limit to the detectable distance, and the detection accuracy decreases as the object becomes farther away. For this reason, the processor 10 evaluates the reliability of the detection information according to the relative distance between the host vehicle V1 and the rear vehicle V2, and corrects the content of the avoidance driving by changing the ease of executing the avoidance driving (execution ease).
  • when the relative distance between the host vehicle V1 and the rear vehicle V2 is long, the processor 10 lengthens the total time TR1 from the timing T2 at which the information of the rear vehicle V2 that satisfies the avoidance condition is acquired to the timing T3 at which the execution of the avoidance driving is completed, and the waiting time TR2 and/or the execution time TR3 included therein, compared to when the relative distance is short.
  • the content of this correction includes making the total time TR1, and the waiting time TR2 and/or the execution time TR3 included therein, shorter when the relative distance between the host vehicle V1 and the rear vehicle V2 is short than when the relative distance is long.
  • the detection accuracy of the camera 21 tends to be lower for distant objects than for nearby objects. Experiments have shown that the detection accuracy for distant objects is lower for the camera 21 than for the radar device 22.
  • the above correction process may be executed only when it is determined that the avoidance conditions are met based only on the information acquired from the camera 21 (pattern P3).
  • the length of the waiting time TR2 can be adjusted according to the relative distance between the host vehicle V1 and the rear vehicle V2.
  • processor 10 corrects the content of the avoidance maneuver related to the ease of execution (ease of execution) of the avoidance maneuver, taking into consideration whether the vertical approach or the lateral approach in the detection information of camera 21 met the avoidance conditions. If only the detection information of camera 21 meets the avoidance conditions, processor 10 extracts vertical distance information and lateral distance information from the detection information.
  • the approach thresholds set for the detection information of camera 21 include a threshold for vertical information and a threshold for lateral information, respectively.
  • Processor 10 determines whether the lateral (vehicle width) approach in the detection information of camera 21 met the avoidance conditions based on the lateral threshold, or whether the vertical approach met the avoidance conditions based on the vertical threshold. Then, when the degree of approach in the vertical direction (vehicle length direction) in the detection information of the camera 21 satisfies the avoidance condition based on the vertical threshold, the processor 10 lengthens the total time TR1, the waiting time TR2 and/or the execution time TR3 included therein, from the timing T2 when the information of the rear vehicle V2 that satisfies the avoidance condition is acquired to the timing T3 when the execution of the avoidance maneuver is completed, compared to when the degree of approach in the horizontal direction (vehicle width direction) in the detection information of the camera 21 satisfies the avoidance condition based on the horizontal threshold.
  • the content of this correction includes shortening the total time TR1, the waiting time TR2 and/or the execution time TR3 included therein, when the degree of approach in the horizontal direction (vehicle width direction) satisfies the avoidance condition based on the horizontal threshold, compared to when the avoidance condition based on the vertical threshold is satisfied.
  • the detection accuracy of the camera 21 in the horizontal direction tends to be higher than its detection accuracy in the vertical direction.
  • the detection accuracy of the angle (existence direction) at which an object exists based on the detection information of the camera 21 is high, the detection accuracy of the distance to the object cannot be said to be high.
  • the detection information in the vertical direction and the detection information in the horizontal direction are extracted from the detection information of the camera 21, and a relative evaluation is given to the reliability of the detection information in each direction.
  • the detection information in the horizontal direction is evaluated to be more reliable than the detection information in the vertical direction, and the content of the avoidance driving related to the ease of execution is changed based on this evaluation.
  • the content of the avoidance operation can be changed according to the direction of the detection information (vertical direction/horizontal direction) that affects the reliability of the detection information of the camera 21.
  • when the detection information of the camera 21 that satisfies the avoidance condition is horizontal information, the total time TR1, and the waiting time TR2 and/or the execution time TR3 included therein, are shortened; when the detection information of the camera 21 that satisfies the avoidance condition is vertical information, they are lengthened.
  • the processor 10 causes the vehicle controller 200 to perform autonomous avoidance driving based on the determined content.
  • 100... Driving control system, 1... Driving control device, 10... Processor, 11... CPU, 12... ROM, 13... RAM, 20... Input/output device, 30... Communication device, 2... Sensor, 21... Camera, 22... Radar device, 3... Vehicle information acquisition device, 4... Other vehicle information acquisition device, 5... Rear vehicle recognition device, 6... Map information, 7... Navigation device, 200... Vehicle controller, 210... Steering control device, 220... Drive control device
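
The following is a minimal Python sketch of the avoidance-condition check described above, expressing the approach degree through the lateral gap and the TTC. The function names, parameter names, and threshold values are illustrative assumptions, not values taken from the patent.

```python
# Minimal sketch of the avoidance-condition check (approach degree via the
# lateral gap and the TTC). All names and thresholds are assumptions.

def time_to_collision(longitudinal_gap_m: float, closing_speed_mps: float) -> float:
    """TTC assuming the current relative (closing) speed is maintained."""
    if closing_speed_mps <= 0.0:        # not closing in: no finite TTC
        return float("inf")
    return longitudinal_gap_m / closing_speed_mps


def avoidance_condition_met(lateral_gap_m: float,
                            longitudinal_gap_m: float,
                            closing_speed_mps: float,
                            lateral_gap_threshold_m: float = 1.0,
                            ttc_threshold_s: float = 4.0) -> bool:
    """The condition is met when the approach risk is at or above a threshold,
    modeled here as a small lateral gap or a short TTC."""
    ttc = time_to_collision(longitudinal_gap_m, closing_speed_mps)
    return lateral_gap_m <= lateral_gap_threshold_m or ttc <= ttc_threshold_s
```

In practice this check would be evaluated separately for the camera's detection information and for the radar device's detection information, each against its own thresholds.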
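
Next, a sketch of the three-way classification of the judgment result (patterns P1, P2, P3) by the type of sensor whose detection information satisfied the avoidance condition. The enum and function names are assumptions for illustration.

```python
from enum import Enum
from typing import Optional


class Pattern(Enum):
    P1 = "camera and radar both satisfy the avoidance condition"
    P2 = "only the radar device satisfies the avoidance condition"
    P3 = "only the camera satisfies the avoidance condition"


def classify_event(camera_satisfied: bool, radar_satisfied: bool) -> Optional[Pattern]:
    """Return the pattern of the event, or None when neither sensor's
    detection information satisfies the avoidance condition."""
    if camera_satisfied and radar_satisfied:
        return Pattern.P1
    if radar_satisfied:
        return Pattern.P2
    if camera_satisfied:
        return Pattern.P3
    return None
```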
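
A sketch of how each pattern could be associated with an ease-of-execution level and with the timing and speed parameters of the avoidance driving. The numerical values are placeholders chosen only to respect the orderings stated in the text (TR2(1) < TR2(2) < TR2(3), TR3(1) < TR3(2) < TR3(3), hence TR1(1) < TR1(2) < TR1(3), and VL(1) > VL(2) > VL(3)); they are not values from the patent.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AvoidancePlan:
    level: str                # ease-of-execution level E1, E2, or E3
    waiting_time_s: float     # TR2: from decision timing T2 to start timing TS
    execution_time_s: float   # TR3: from start timing TS to completion timing T3
    lateral_speed_mps: float  # VL: lateral movement speed during the maneuver

    @property
    def total_time_s(self) -> float:
        # TR1 = TR2 + TR3
        return self.waiting_time_s + self.execution_time_s


# Placeholder parameters; only the relative orderings matter here.
PLAN_BY_PATTERN = {
    "P1": AvoidancePlan("E1", waiting_time_s=0.2, execution_time_s=1.5, lateral_speed_mps=1.0),
    "P2": AvoidancePlan("E2", waiting_time_s=0.5, execution_time_s=2.0, lateral_speed_mps=0.7),
    "P3": AvoidancePlan("E3", waiting_time_s=1.0, execution_time_s=3.0, lateral_speed_mps=0.5),
}
```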
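
A sketch of the reliability-based correction of the maneuver timing: the lower the calculated reliability of the detection result (for example under adverse weather, or for vehicle types that are hard to detect), the longer the waiting time TR2 and the execution time TR3, and hence the total time TR1. The scaling rule and the reliability scale are assumptions for illustration.

```python
def corrected_times(waiting_time_s: float,
                    execution_time_s: float,
                    reliability: float) -> tuple:
    """Lengthen TR2 and TR3 (and therefore TR1 = TR2 + TR3) as the reliability
    of the detection result decreases; reliability is assumed to lie in [0, 1]."""
    reliability = min(max(reliability, 0.0), 1.0)
    stretch = 1.0 + (1.0 - reliability)   # lower reliability -> longer times
    return waiting_time_s * stretch, execution_time_s * stretch
```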
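
Finally, a combined sketch of the camera-only (pattern P3) corrections: the times are lengthened when the rear side vehicle is far away, where the camera's distance measurement is least accurate, and when the avoidance condition was met by the vertical (vehicle-length) approach rather than the lateral (vehicle-width) approach. The distance threshold and stretch factors are illustrative assumptions.

```python
def corrected_times_for_camera_only(waiting_time_s: float,
                                    execution_time_s: float,
                                    relative_distance_m: float,
                                    met_by_lateral_approach: bool,
                                    far_distance_m: float = 50.0) -> tuple:
    """Stretch TR2 and TR3 when the camera's detection is least trustworthy."""
    stretch = 1.0
    if relative_distance_m >= far_distance_m:
        stretch *= 1.5   # camera ranging degrades for distant objects
    if not met_by_lateral_approach:
        stretch *= 1.3   # vertical (longitudinal) ranging is less accurate than lateral
    return waiting_time_s * stretch, execution_time_s * stretch
```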

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A driving control method executed using a driving control device provided with a processor 10, wherein the processor 10: causes a host vehicle V1 to perform avoidance driving when a prescribed avoidance condition is satisfied; when it is determined that the avoidance condition is satisfied on the basis of detection information acquired from a camera 21 and detection information acquired from a radar device 22, makes it easier to perform the avoidance driving than when it is determined that the avoidance condition is satisfied on the basis of only the detection information acquired from the radar device 22; and when it is determined that the avoidance condition is satisfied on the basis of only the detection information acquired from the radar device 22, makes it easier to perform the avoidance driving than when it is determined that the avoidance condition is satisfied on the basis of only the detection information acquired from the camera 21.

Description

Driving control method and driving control device

The present invention relates to a driving control method and a driving control device for a vehicle.

A driving assistance technology is known that offsets the target course of the host vehicle in a direction away from another vehicle when it is determined that the host vehicle and the other vehicle will travel side by side.

JP 2020-197828 A

However, with the conventional technology, the behavior of the host vehicle may be disturbed by erroneous detection by a sensor.

The problem to be solved by the present invention is to prevent the behavior of the host vehicle from being disturbed by erroneous detection by a sensor.

The present invention solves the above problem by controlling the execution of avoidance driving that separates the host vehicle from a rear side vehicle when a predetermined avoidance condition is satisfied: avoidance driving is made easier to execute when the detection information acquired from both a camera and a radar device satisfies the avoidance condition than when only the detection information acquired from the radar device satisfies it, and easier to execute when only the detection information acquired from the radar device satisfies the avoidance condition than when only the detection information acquired from the camera satisfies it.

According to the present invention, the behavior of the host vehicle can be prevented from being disturbed by erroneous detection by a sensor.

FIG. 1 is a block diagram showing the hardware configuration of the driving control system.
FIG. 2 is a flowchart showing the procedure of avoidance driving control.
FIG. 3 is a diagram for explaining avoidance driving.
FIG. 4 is a diagram for explaining the ease of executing avoidance driving.

 図1に本実施形態に係る車両の運転制御装置1を備えた運転制御システム100の構成を示す。本運転制御方法は、運転制御装置1のプロセッサ10を含む運転制御システム100の知覚系、判断・制御系、及び情報系の各機器(ハードウェア)を使用して実施される。 FIG. 1 shows the configuration of a driving control system 100 equipped with a vehicle driving control device 1 according to this embodiment. This driving control method is implemented using each device (hardware) of the perception system, judgment and control system, and information system of the driving control system 100, which includes the processor 10 of the driving control device 1.

 知覚系の機器として、運転制御システム100は、一又は複数のセンサ2、自車情報取得装置3、他車情報取得装置4、及び後側車認識装置5を備える。プロセッサ10は、各知覚系の機器から観測値を取得し、既知の手法を用いて、自車及び他車を含む物体の位置、姿勢、運動(速度、加速度など)の状態及びその変化を判断する。 As the perception system devices, the driving control system 100 is equipped with one or more sensors 2, an own vehicle information acquisition device 3, an other vehicle information acquisition device 4, and a rear vehicle recognition device 5. The processor 10 acquires observed values from each of the perception system devices, and uses known methods to determine the state and changes in the position, attitude, and motion (speed, acceleration, etc.) of objects including the own vehicle and other vehicles.

A plurality of sensors 2 are provided on the vehicle and form a sensor group that operates in cooperation. The sensors 2 detect the presence or absence of objects, including other vehicles, around the entire periphery of the vehicle, the distance to the objects, and the relative speed and relative acceleration of the objects. The detection information acquired by the sensors 2 is provided to the processor 10.
The sensors 2 include a single camera 21 or a plurality of cameras 21 arranged on the vehicle. The camera or cameras 21 capture images of the surroundings of the vehicle in all directions. The cameras 21 include an image sensor with an imaging element such as a CCD, an ultrasonic camera, and an infrared camera. The cameras 21 include at least a front camera that captures images of the area ahead of the vehicle, a rear camera that captures images of the area behind and to the rear sides of the vehicle, and left and right side cameras that capture images of the left and right sides of the vehicle and the areas ahead of and behind the left and right sides. The form of the camera 21 is not limited as long as it can capture images in all directions around the vehicle. A single camera 21 mounted on a base having a rotation mechanism may be used, and it may also be used in combination with a plurality of other cameras 21.
The sensors 2 include a radar device 22 that detects (ranges) the presence of objects around the host vehicle and their positions and changes in position. The radar device 22 measures the distance and direction to an object by emitting electromagnetic waves toward the object and measuring the reflected waves. The radar device 22 includes a laser radar, a millimeter-wave radar (LRF), a LiDAR (light detection and ranging) unit, an ultrasonic radar, and a sonar.
The sensors 2 also include a GPS (Global Positioning System) unit, a gyro sensor, a vehicle speed sensor, and the like, and detect the position of the host vehicle at each timing.
Each sensor 2 can also acquire information from in-vehicle devices and external devices according to its function. In response to a request or command, each sensor 2 sends the acquired detection information to the host vehicle information acquisition device 3, the other-vehicle information acquisition device 4, the rear-side vehicle recognition device 5, or the processor 10. The processor 10 may acquire the detection information directly from the camera 21 and the radar device 22, or may acquire it via the host vehicle information acquisition device 3, the other-vehicle information acquisition device 4, or the rear-side vehicle recognition device 5, which are described below.

The host vehicle information acquisition device 3 calculates the current position, attitude, speed, acceleration, behavior, and traveling direction of the host vehicle based on the detection information acquired from the sensors 2, and provides them to the processor 10. The other-vehicle information acquisition device 4 calculates the position, attitude, speed, acceleration, behavior, and traveling direction of objects, including other vehicles around the host vehicle, based on the detection information acquired from the sensors 2, and provides them to the processor 10. The rear-side vehicle recognition device 5 recognizes, based on the detection information acquired from the sensors 2, another vehicle that is behind the host vehicle and traveling in a lane adjacent to the host vehicle's travel lane, and recognizes it as a "rear-side vehicle". A rear-side vehicle is located to the rear side of the host vehicle. Only other vehicles present within a predetermined distance range that the host vehicle should recognize may be recognized as rear-side vehicles. The recognition result of the rear-side vehicle is provided to the processor 10 as rear-side vehicle information. The rear-side vehicle information includes one or more of the presence of the rear-side vehicle, its position, its lateral position in its lane (its position in the road-width direction), its lateral position relative to the lane mark, its speed, its acceleration, and its vehicle type. The vehicle type of the rear-side vehicle is a class such as a straddle-type vehicle (a two-wheeled or three-wheeled vehicle), a large vehicle, a standard passenger car, or a truck. The type of the rear-side vehicle can be determined from shape features such as its width and/or height and its size. The host vehicle information, other-vehicle information, and rear-side vehicle information described above may instead be determined by the processor 10 based on the detection information of the sensors 2.
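As an illustration of the rear-side vehicle information listed above, the following is a minimal Python sketch of a possible data container. The field names, types, and units are assumptions made for illustration; the specification enumerates only the items of information, not a data format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RearSideVehicleInfo:
    present: bool                          # whether a rear-side vehicle was recognized
    position: tuple[float, float]          # longitudinal/lateral position relative to the host vehicle [m]
    lateral_in_lane: Optional[float]       # lateral position within its lane (road-width direction) [m]
    offset_to_lane_mark: Optional[float]   # lateral position relative to the lane mark [m]
    speed: Optional[float]                 # [m/s]
    acceleration: Optional[float]          # [m/s^2]
    vehicle_class: Optional[str]           # e.g. "motorcycle", "standard car", "truck"
```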

As information-system resources, the driving control system 100 has map information 6 and lane information 61. The map information 6 and the lane information 61 are recorded in a storage device accessible to the processor 10 via the communication device 30, or in an external server. The map information 6 is high-precision map information including the lane information 61 that is referenced when autonomous lane-change control is executed. The lane information 61 includes identification information that identifies each of the plurality of lanes belonging to a road. The navigation device 7 refers to the map information 6 and calculates a route to a set destination. This route includes target trajectories. The calculated route and target trajectories are provided to the vehicle controller 200 and used for autonomous driving control.

The vehicle controller 200 includes a steering control device 210 and a drive control device 220. The vehicle controller 200 acquires command values for autonomous driving control according to a driving plan prepared by the processor 10 of the driving control device 1, and causes the host vehicle to travel along a route to the destination. The route is composed of a plurality of consecutive target trajectories with which command values are associated. The target trajectories include an avoidance trajectory for avoidance driving. The avoidance trajectory is calculated based on a target lateral position for avoiding the rear-side vehicle. The command values for driving control are generated by the vehicle controller 200 or the processor 10. The command values include a set speed at which the vehicle is driven, and the vehicle controller 200 drives the host vehicle according to the set speed. The set speed may be set automatically according to a predetermined criterion, on the premise of legal regulations, based on information detected by the sensors 2 such as the distance to a preceding vehicle and the relative speed and relative acceleration with respect to the preceding vehicle, or it may be set by the driver via the input/output device 20. Based on the command values, the vehicle controller 200 inputs the longitudinal force and lateral force that control the travel position of the host vehicle. In accordance with these inputs, the behavior of the vehicle body and the wheels is controlled so that the host vehicle autonomously travels along the route to the destination. Based on this control, at least one of the drive actuator and the braking actuator of the drive mechanism of the vehicle body controlled by the drive control device 220, together with the steering actuator of the steering control device 210 activated as necessary, operates autonomously, and autonomous driving control is executed to cause the vehicle to travel autonomously along the target trajectory. Of course, the vehicle controller 200 can also execute driving according to command values based on the driver's manual operation input via the input/output device 20.

As judgment/control-system devices, the driving control system 100 includes the driving control device 1 and the vehicle controller 200. The driving control device 1 controls autonomous driving in which the host vehicle travels along a target trajectory. The target trajectory includes an avoidance trajectory for avoidance driving. The processor 10 of the driving control device 1 includes a ROM (Read Only Memory) 12 that stores a program for controlling autonomous driving, a CPU (Central Processing Unit) 11 that executes the program stored in the ROM 12, and a RAM (Random Access Memory) 13 that functions as an accessible storage device. The processor 10 implements the present driving control method using the hardware of the driving control system 100.

The processor 10 executes, through cooperation between software and the hardware shown in FIG. 1, at least a function of determining whether the avoidance condition is satisfied, a function of determining the ease of executing avoidance driving, and an automated driving function of executing driving control, including avoidance driving, according to the ease of execution.

The processing of autonomous avoidance driving executed in autonomous driving control will be described with reference to the flowchart of FIG. 2. Avoidance driving in the present embodiment is performed, in a situation where a rear-side vehicle approaches the host vehicle, so that the host vehicle avoids coming close to the rear-side vehicle, that is, another vehicle traveling behind the host vehicle in a lane adjacent to the host vehicle's travel lane. The rear-side vehicle may change its lateral position and approach the host vehicle. Undesirable as it is, the rear-side vehicle may change lanes from the adjacent lane into the host vehicle's travel lane without a sufficient lane-change space being secured, or may change its lateral position or change lanes without noticing the presence of the host vehicle. In such cases, the driving control device 1 executes avoidance driving for avoiding the rear-side vehicle under autonomous control. This avoidance driving includes lateral movement of the host vehicle along the width direction of the travel lane.

The processor 10 acquires detection information from the sensors 2, including the camera 21 and the radar device 22 (S1). The detection information includes detection information based on imaging information of the camera 21 and detection information based on observation information of the radar device 22. Each piece of detection information includes an identifier that distinguishes the sensor 2 that performed the detection. By referring to the identifier, the processor 10 distinguishes the detection information of the camera 21 from the detection information of the radar device 22.
The processor 10 acquires, as necessary, host vehicle information such as the current position and speed of the host vehicle from the host vehicle information acquisition device 3 (S2). The processor 10 refers to the detection information of the sensors 2 or the lane information 61 of the map information 6 and acquires, as part of the host vehicle information, lane information of the lane in which the host vehicle is currently traveling (S2). The travel lane is included in the route to the destination, and this route may be obtained from the navigation device 7. The position of the travel lane and the positions of the lane markers may be obtained from the detection information of the sensors 2. The processor 10 acquires other-vehicle information such as the presence or absence, position (distance), relative speed, and relative acceleration of other vehicles traveling around the host vehicle (S3). The other vehicles include a rear-side vehicle that is behind the host vehicle and traveling in a lane adjacent to the host vehicle's travel lane. The processor 10 acquires from the rear-side vehicle recognition device 5 a determination of whether a rear-side vehicle traveling in the lane adjacent to the host vehicle's travel lane exists, that is, whether a rear-side vehicle has been detected (S4). Since the rear-side vehicle is another vehicle to be avoided in the driving control of the host vehicle, it may be limited to another vehicle whose distance from the host vehicle or whose time-to-collision margin (hereinafter, TTC: Time-To-Collision) is within a predetermined range. The processor 10 observes objects at the rear side based on the detection information of the sensors 2 until a rear-side vehicle is detected (NO in S4). The rear-side vehicle recognition processing may instead be performed by the processor 10.
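To make the flow of steps S1 to S4 concrete, the following is a minimal Python sketch. The perception interface (the hypothetical perception object and its methods), the Detection fields, and the track_id association are assumptions introduced for illustration; only the step numbering and the camera/radar identifier follow the description above.

```python
from dataclasses import dataclass
from enum import Enum, auto

class SensorType(Enum):
    CAMERA = auto()   # detection based on the imaging information of camera 21
    RADAR = auto()    # detection based on the observation information of radar device 22

@dataclass
class Detection:
    sensor: SensorType         # identifier distinguishing the originating sensor 2 (S1)
    track_id: int              # identifier of the detected object (assumed for illustration)
    distance_m: float          # distance to the object
    lateral_distance_m: float  # distance along the road-width direction
    closing_speed_mps: float   # relative speed, positive when the object is approaching

def wait_for_rear_side_vehicle(perception) -> list[Detection]:
    """Loop over S1-S4 until a rear-side vehicle in the adjacent lane is recognized."""
    while True:
        detections = perception.get_detections()         # S1: camera 21 and radar device 22
        host_info = perception.get_host_vehicle_info()   # S2: position, speed, lane information
        other_info = perception.get_other_vehicle_info() # S3: surrounding vehicles
        rear_side = perception.get_rear_side_vehicle()   # S4: adjacent lane, behind the host vehicle
        if rear_side is not None:                        # S4 YES -> proceed to S5
            return [d for d in detections if d.track_id == rear_side.track_id]
```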

When a rear-side vehicle is detected (YES in S4), the processor 10 reads a predefined avoidance condition and determines whether the detection information of the sensors 2 satisfies the avoidance condition (S5). The avoidance condition is a condition for determining whether avoidance driving, which moves the host vehicle away from the rear-side vehicle, is to be executed. The avoidance condition is defined in advance based on the degree of approach between the rear-side vehicle and the host vehicle. The degree of approach is a risk index indicating how close the host vehicle and the rear-side vehicle are to each other. The higher the degree of approach, the closer the host vehicle and the rear-side vehicle are; the lower the degree of approach, the farther apart they are.
When the avoidance condition is defined by the degree of approach, it is defined such that the avoidance condition is determined to be satisfied when the approach risk indicated by the degree of approach is equal to or greater than a predetermined threshold (a high approach-risk state). Since a rear-side vehicle traveling in the adjacent lane approaches the host vehicle while moving laterally, the degree of approach may be defined using the lateral distance, the lateral speed, and the lateral acceleration. The degree of approach can be expressed using a threshold defined by index values known at the time of filing, such as the distance between the host vehicle and the rear-side vehicle, the lateral distance along the lane width, the TTC value, or their reciprocals or derivatives. The TTC is an index value based on the time until the rear-side vehicle and the host vehicle would come into contact, under the assumption that the current relative speed is maintained.
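As one concrete way to express the avoidance condition described above, the following Python sketch defines the degree of approach through the TTC and compares it against a threshold. Using TTC alone, and any numerical threshold, are assumptions for illustration; the specification also allows distance, lateral distance, lateral speed, their reciprocals or derivatives, and per-sensor thresholds.

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """TTC assuming the current relative (closing) speed is maintained."""
    if closing_speed_mps <= 0.0:
        return float("inf")  # not closing, so no contact is expected
    return distance_m / closing_speed_mps

def avoidance_condition_satisfied(distance_m: float, closing_speed_mps: float,
                                  ttc_threshold_s: float) -> bool:
    """S5: the approach risk is high enough when the TTC falls to or below the
    threshold set for the sensor (camera 21 or radar device 22)."""
    return time_to_collision(distance_m, closing_speed_mps) <= ttc_threshold_s
```

In such a sketch, the camera-based and radar-based detection information would each be evaluated with its own threshold, which, as noted later for FIG. 3(b), may be equal or different.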

When it is determined that the avoidance condition is satisfied (YES in S5), the processor 10 analyzes the type of sensor 2 that provided the detection information satisfying the avoidance condition. From the detection information that satisfies the avoidance condition, the processor 10 extracts the detection information based on the imaging information of the camera 21 and the detection information based on the observation information of the radar device 22, and analyzes whether the detection information satisfying the avoidance condition is the detection information of the camera 21, the detection information of the radar device 22, or both.

The processor 10 classifies the determination result regarding satisfaction of the avoidance condition into the following three patterns. The classification is made from the viewpoint of whether the sensor 2 that output the detection information determined to satisfy the avoidance condition is only the camera 21, only the radar device 22, or both the camera 21 and the radar device 22. The avoidance condition includes an avoidance condition for the imaging information of the camera 21 and an avoidance condition for the observation information of the radar device 22. In other words, the processor 10 determines whether the event in which the avoidance condition is satisfied is an event in which the detection information of both the camera 21 and the radar device 22 satisfies the respective avoidance conditions, an event in which only the detection information of the radar device 22 satisfies the avoidance condition for the observation information of the radar device 22, or an event in which only the detection information of the camera 21 satisfies the avoidance condition for the imaging information of the camera 21. The patterns P1, P2, and P3 of these events are described below.
<Pattern P1: satisfied by the detection information of both the camera 21 and the radar device 22>
In pattern P1, the detection information acquired from the camera 21 satisfies the avoidance condition, and the detection information acquired from the radar device 22 also satisfies the avoidance condition. That is, the degree of approach based on the imaging information of the camera 21 is equal to or greater than the threshold set for imaging information, and the degree of approach based on the observation information of the radar device 22 is equal to or greater than the threshold set for observation information. The avoidance condition is satisfied in both the detection result of the camera 21 and the detection result of the radar device 22.
<Pattern P2: satisfied only by the detection information of the radar device 22>
In pattern P2, only the detection information acquired from the radar device 22 satisfies the avoidance condition. The detection information acquired from the camera 21 does not satisfy the avoidance condition. The degree of approach based on the imaging information of the camera 21 is below the threshold set for imaging information (not satisfied), and the degree of approach based on the observation information of the radar device 22 is equal to or greater than the threshold set for observation information (satisfied).
<Pattern P3: satisfied only by the detection information of the camera 21>
In pattern P3, only the detection information acquired from the camera 21 satisfies the avoidance condition. The detection information acquired from the radar device 22 does not satisfy the avoidance condition. The degree of approach based on the imaging information of the camera 21 is equal to or greater than the threshold set for imaging information (satisfied), and the degree of approach based on the observation information of the radar device 22 is below the threshold set for observation information (not satisfied).

The processor 10 sets a different "ease of execution" for the avoidance driving according to the analyzed pattern P1, P2, or P3 of the determination result. The ease of execution is how readily the avoidance driving is executed, and is defined as a level of how easily the avoidance driving is carried out. Considering the sequence in which satisfaction of the avoidance condition triggers generation of an execution command and the avoidance driving is then executed and completed, the "ease of execution" uses, as its evaluation indices, how early the avoidance driving is started and how quickly the avoidance driving is carried out (how short the time required for it is).
When the detection information of the camera 21 satisfies the avoidance condition (YES in S6) and the detection information of the radar device 22 also satisfies the avoidance condition (YES in S7), the processor 10 sets the ease of execution of the avoidance driving to level E1 (S8) and proceeds to S11. When the detection information of the camera 21 does not satisfy the avoidance condition (NO in S6) and only the detection information of the radar device 22 satisfies the avoidance condition (YES in S7'), the processor 10 sets the ease of execution of the avoidance driving to level E2 (<E1) (S9) and proceeds to S11. When the detection information of the camera 21 satisfies the avoidance condition (YES in S6) and the detection information of the radar device 22 does not satisfy the avoidance condition (NO in S7), the processor 10 sets the ease of execution of the avoidance driving to level E3 (<E2) (S10) and proceeds to S11. When it has been determined that the avoidance condition is satisfied (YES in S5) but satisfaction of the avoidance condition cannot be identified in the detection information of either the camera 21 or the radar device 22 (NO in S6 and NO in S7'), the process returns to S1 and the determination is performed again.
By this processing, the ease-of-execution level E1 is associated with pattern P1, level E2 with pattern P2, and level E3 with pattern P3. The ease-of-execution levels E have the relationship level E1 > level E2 > level E3. At level E1 the ease of execution is highest and avoidance driving is most readily executed. At level E3 the ease of execution is lowest and avoidance driving is least readily executed. At level E2 the ease of execution is intermediate, lower than level E1 and higher than level E3.
In other words, when the avoidance condition is determined to be satisfied based on the detection information acquired from both the camera 21 and the radar device 22 (pattern P1), level E1 is set, at which avoidance driving is executed more readily than at level E2, which is set when the avoidance condition is determined to be satisfied based only on the detection information acquired from the radar device 22 (pattern P2); level E2 in turn makes avoidance driving more readily executed than level E3, which is set when the avoidance condition is determined to be satisfied based only on the detection information acquired from the camera 21 (pattern P3).
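The branch from S6 to S10 can be summarized by the following Python sketch, which maps the pair of per-sensor results to the ease-of-execution level. The EaseLevel enumeration and the function name are illustrative assumptions; only the ordering E1 > E2 > E3 and the pattern-to-level assignment follow the description above.

```python
from enum import Enum

class EaseLevel(Enum):
    E1 = 3  # highest ease of execution: avoidance driving most readily executed
    E2 = 2  # intermediate ease of execution
    E3 = 1  # lowest ease of execution: avoidance driving least readily executed

def set_ease_of_execution(camera_satisfied: bool, radar_satisfied: bool):
    """S6-S10: map which sensor(s) satisfied the avoidance condition to a level."""
    if camera_satisfied and radar_satisfied:
        return EaseLevel.E1   # pattern P1: camera 21 and radar device 22
    if radar_satisfied:
        return EaseLevel.E2   # pattern P2: radar device 22 only
    if camera_satisfied:
        return EaseLevel.E3   # pattern P3: camera 21 only
    return None               # satisfaction not identified: return to S1
```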

FIG. 3 shows the movements of the host vehicle V1 and the rear-side vehicle V2 from the determination to execute avoidance driving to its completion. The direction of arrow Y is the longitudinal direction along the traveling direction of the host vehicle V1, and the direction of arrow X is the lateral direction along the width direction of the host vehicle V1, that is, the road-width direction.
FIG. 3(a) shows the host vehicle V1(T1) and the rear-side vehicle V2(T1) at timing T1. The host vehicle V1(T1) travels in lane L3, and the rear-side vehicle V2(T1) travels behind the host vehicle V1 in the adjacent lane L2.
FIG. 3(b) shows the host vehicle V1(T2) and the rear-side vehicle V2(T2) at timing T2. To determine whether the avoidance condition is satisfied, the processor 10 calculates the degree of approach between the host vehicle V1(T2) and the rear-side vehicle V2(T2). The degree of approach indicates the risk of the host vehicle V1 and the rear-side vehicle V2 coming close to each other, and is a requirement that defines the avoidance condition. The degree of approach is calculated based on the distance D, the lateral distance DX, or the longitudinal distance DY between the host vehicle V1(T2) and the rear-side vehicle V2(T2). The processor 10 compares the measured distance with an approach-degree threshold defined as a distance to determine whether the avoidance condition is satisfied. The degree of approach may also be calculated as a risk evaluation value using the TTC, which takes into account the relative speed between the host vehicle V1(T2) and the rear-side vehicle V2(T2). In that case, the processor 10 compares an approach-degree threshold defined using the TTC with a TTC-based index value derived from measured values to determine whether the avoidance condition is satisfied. The approach-degree threshold can be set according to the type of sensor 2. The degree of approach is not limited to these and may be calculated based on the distance from the center position of the host vehicle V1's travel lane to the rear-side vehicle V2 (the amount of deviation from the host vehicle V1's travel lane) or the distance from the center position of the rear-side vehicle V2's travel lane to the center position of the rear-side vehicle V2 (the amount of deviation of the rear-side vehicle V2 from its travel lane). Acquisition of detection information that satisfies this avoidance condition triggers execution of the avoidance driving. In the avoidance condition, the approach-degree threshold for the detection information of the camera 21 may be the same as or different from the approach-degree threshold for the detection information of the radar device 22.
FIG. 3(c) shows a situation in which avoidance driving has been executed to move the host vehicle V1(T2) away from the rear-side vehicle V2(T2). Since avoidance driving is movement away from the rear-side vehicle V2, it includes a change of lateral position. Avoidance driving includes lateral movement within the same lane, lateral movement that changes lanes to an adjacent lane, and lateral movement to the road shoulder. FIG. 3(c) shows the host vehicle V11(T3) after completion of avoidance driving control with lateral movement within the travel lane L3, and the host vehicle V12(T3) after completion of avoidance driving control with lateral movement to the adjacent lane L4. The processor 10 may execute avoidance driving that shifts the lateral position V1X2 of the host vehicle V1(T2) to the lateral position V11X3 within the lane L3 in which the host vehicle V1(T2) is traveling, or avoidance driving that moves the host vehicle from the lateral position V1X2 in the lane L3 to the lateral position V12X3 in the lane L4 adjacent to the lane L3.
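The avoidance trajectory itself is specified only as a shift to a target lateral position (within lane L3 or into the adjacent lane L4). The following Python sketch shows one possible way to generate such a lateral shift; the cosine-shaped profile, the time step, and the function name are assumptions, not part of the specification.

```python
import math

def plan_lateral_shift(current_lateral_m: float, target_lateral_m: float,
                       execution_time_s: float, dt_s: float = 0.1) -> list[float]:
    """Sample a smooth lateral shift from the current lateral position to the
    target lateral position, completed over the execution time. A cosine-shaped
    ramp is used so that the lateral speed is zero at both ends."""
    steps = max(1, int(execution_time_s / dt_s))
    shift = target_lateral_m - current_lateral_m
    return [current_lateral_m + shift * 0.5 * (1.0 - math.cos(math.pi * (i + 1) / steps))
            for i in range(steps)]
```

A shorter execution time with the same shift implies a higher lateral speed, which is the lever used in the level-dependent control described below.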

FIG. 4 shows a time chart from acquisition of information satisfying the avoidance condition to completion of the avoidance driving. Data-processing time in the processor 10 is ignored here. Timing T2, at which detection information satisfying the avoidance condition is acquired, is the timing at which execution of the avoidance driving is decided.
After execution of the avoidance driving is decided, the avoidance driving is started at timing TS, after the waiting time TR2 has elapsed. The waiting time TR2 is set as time for confirmation, rather than starting the movement of the host vehicle immediately after the decision. The execution time TR3 from timing TS, at which the avoidance driving is started, to timing T3, at which the avoidance driving is completed, is the time required for the host vehicle V1 to move to the movement destination calculated for avoiding the rear-side vehicle V2. The start timing of the avoidance driving can be determined from the start of the lateral position change for moving the host vehicle V1 away from the rear-side vehicle V2, the start of steering control, a change in wheel angle, and the like. The completion timing of the avoidance driving can be determined from completion of the movement to the target lateral position, completion of the steering control, or the wheels returning to the vehicle's longitudinal direction. The start timing of avoidance driving involving a lane change can additionally be determined from one or both front wheels crossing the lane line. The completion timing of avoidance driving involving a lane change can additionally be determined from one or both rear wheels crossing the lane line, the reference position of the vehicle body (such as the center of gravity) entering the destination lane, all or part of the vehicle body belonging to the destination lane, the orientation of the vehicle body returning from the lateral direction (during turning) to the straight-ahead direction, or the attitude of the vehicle body after steering coinciding with the direction of the lane.

The ease of executing avoidance driving, that is, the ease of execution, is described here in detail. Avoidance driving being easy to execute (high ease of execution) means that the time spent on the avoidance driving is relatively short; avoidance driving being difficult to execute (low ease of execution) means that the time spent on it is relatively long. The time spent on avoidance driving means the total time TR1 from timing T2, at which detection information satisfying the avoidance condition shown in FIG. 4 is acquired, to timing T3, at which the avoidance driving is completed. Avoidance driving being easy to execute (high ease of execution) also means that the avoidance driving is executed relatively early (swiftly) and relatively quickly (fast). The avoidance driving being executed early means that the waiting time TR2 from timing T2, at which detection information satisfying the avoidance condition in FIG. 4 is acquired, to timing TS, at which the avoidance driving is started, is short. Conversely, avoidance driving being difficult to execute (low ease of execution) means that the avoidance driving is executed relatively late and relatively slowly. The avoidance driving being executed late means that the waiting time TR2 from timing T2 to the start timing TS is long. The waiting time TR2 is provided for processing that confirms, before the avoidance driving starts, the reliability of the detection information, for example whether it contains noise. Shortening the waiting time TR2 allows the host vehicle to begin moving earlier in the avoidance driving. The avoidance driving being executed quickly means that the execution time TR3 from the start timing TS to the completion timing T3 in FIG. 4 is short.
Avoidance driving in which any of the total time TR1, the waiting time TR2, or the execution time TR3 is relatively short is evaluated as easy to execute (high ease of execution), and avoidance driving in which any of these is relatively long is evaluated as difficult to execute (low ease of execution). Avoidance driving being readily executed (high ease of execution) means that the avoidance driving is started early (the waiting time TR2 is short) and carried out quickly (the execution time TR3 is short). Conversely, avoidance driving being difficult to execute (low ease of execution) means that the start of the avoidance driving is delayed (the waiting time TR2 is long) or that the avoidance driving is carried out slowly (the execution time TR3 is long). Avoidance driving being readily executed also means that the total time TR1 required until execution of the avoidance driving is completed is relatively short; conversely, avoidance driving being difficult to execute means that the total time TR1 is relatively long. Shortening the waiting time TR2 and/or the execution time TR3 necessarily shortens the total time TR1. When careful confirmation or cautious driving is required, it is preferable to make the avoidance driving less readily executed (to lower the ease of execution).

Returning to S11 of FIG. 2, the processor 10 acquires the content of the avoidance driving associated with the ease-of-execution level E1, E2, or E3 set according to the pattern P1, P2, or P3 of the event in which the avoidance condition is satisfied. In the avoidance driving of level E1 set for pattern P1, the total time TR1(1) from timing T2, at which information on the rear-side vehicle V2 satisfying the avoidance condition is acquired, to timing T3, at which execution of the avoidance driving is completed, is shorter than the total time TR1(2) in the avoidance driving of level E2 set for pattern P2. The total time TR1(2) in the avoidance driving of level E2 set for pattern P2 is shorter than the total time TR1(3) in the avoidance driving of level E3 set for pattern P3. The total times TR1 thus have the relationship TR1(1) < TR1(2) < TR1(3).

As shown in FIG. 4, the total time TR1 includes the waiting time TR2 and the execution time TR3. Shortening the waiting time TR2 and/or the execution time TR3 shortens the total time TR1. In the avoidance driving of level E1 set for pattern P1, the waiting time TR2(1) from timing T2, at which information on the rear-side vehicle V2 satisfying the avoidance condition is acquired, to timing TS, at which execution of the avoidance driving is started, is shorter than the waiting time TR2(2) in the avoidance driving of level E2 set for pattern P2. In addition, the waiting time TR2(2) in the avoidance driving of level E2 set for pattern P2 is shorter than the waiting time TR2(3) in the avoidance driving of level E3 set for pattern P3. That is, the waiting times TR2 have the relationship TR2(1) < TR2(2) < TR2(3).

Similarly, in the avoidance driving of level E1 set for pattern P1, the execution time TR3(1) from timing TS, at which execution of the avoidance driving is started, to timing T3, at which execution is completed, is shorter than the execution time TR3(2) in the avoidance driving of level E2 set for pattern P2. The execution time TR3(2) in the avoidance driving of level E2 set for pattern P2 is shorter than the execution time TR3(3) in the avoidance driving of level E3 set for pattern P3. The execution times TR3 have the relationship TR3(1) < TR3(2) < TR3(3).
The execution time TR3 of the avoidance driving can be shortened by increasing the movement speed of the host vehicle V1, in particular the movement speed (lateral speed) along the lateral direction, that is, the road-width direction in which the rear-side vehicle V2 traveling in the adjacent lane approaches the host vehicle V1. Specifically, by increasing the speed, the turning speed, and the turning acceleration, the lateral movement speed can be increased and the host vehicle V1 can move to the intended avoidance position in a short time. Accordingly, the lateral movement speed VL(1) of the host vehicle V1 defined for execution of the avoidance driving of level E1 set for pattern P1 is higher than the lateral movement speed VL(2) in the avoidance driving of level E2 set for pattern P2. In addition, the lateral movement speed VL(2) of level E2 set for pattern P2 is higher than the lateral movement speed VL(3) of level E3 set for pattern P3. That is, the lateral movement speeds VL have the relationship VL(1) > VL(2) > VL(3).
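The relationships TR2(1) < TR2(2) < TR2(3), TR3(1) < TR3(2) < TR3(3), and VL(1) > VL(2) > VL(3) can be captured as a per-level parameter table, as in the following Python sketch. The numerical values are placeholders chosen only to respect those orderings; the specification does not give concrete numbers.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AvoidanceProfile:
    waiting_time_s: float     # TR2: from the decision (T2) to the start (TS)
    execution_time_s: float   # TR3: from the start (TS) to completion (T3)
    lateral_speed_mps: float  # VL: lateral movement speed during the maneuver

# Placeholder values; only the orderings across E1, E2, and E3 are prescribed.
PROFILES = {
    "E1": AvoidanceProfile(waiting_time_s=0.3, execution_time_s=2.0, lateral_speed_mps=1.0),
    "E2": AvoidanceProfile(waiting_time_s=0.6, execution_time_s=3.0, lateral_speed_mps=0.7),
    "E3": AvoidanceProfile(waiting_time_s=1.0, execution_time_s=4.5, lateral_speed_mps=0.4),
}
```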

Incidentally, the driving control device 1 of the present embodiment includes a camera 21 and a radar device 22 as the sensors 2. The camera 21 and the radar device 22 detect objects and measure distances based on their respective detection characteristics. Even if the camera 21 and the radar device 22 have high performance, erroneous detection cannot be avoided depending on the detection environment. In addition, the detection information of the camera 21 is more susceptible to noise than the detection information of the radar device 22, and its measurement accuracy for longitudinal distance is relatively low. According to experiments by the inventors, the ranging performance of the camera 21 in automated driving control tended to be relatively inferior to that of the radar device 22. In particular, the radar device 22 tends to be superior to the camera 21 in ranging performance for distant objects.
Focusing on the detection characteristics of the camera 21 and the radar device 22, the inventors analyze the pattern P of the event in which the avoidance condition is satisfied from the viewpoint of whether the sensor 2 that provided the detection information satisfying the avoidance condition is both the camera 21 and the radar device 22, only the radar device 22, or only the camera 21, and associate each pattern P with a level E indicating the ease of executing avoidance driving (ease of execution).

In the present invention, when the detection information of both the camera 21 and the radar device 22 satisfies the avoidance condition, the reliability is judged to be higher than when only the detection information of the radar device 22 satisfies the avoidance condition, and the highest ease of execution of avoidance driving is associated with this case. When only the detection information of the radar device 22 satisfies the avoidance condition, the reliability is judged to be higher than when only the detection information of the camera 21 satisfies the avoidance condition, and the ease of execution of avoidance driving is set relatively high. In this way, the reliability of the detection information is evaluated in stages based on the type of sensor 2 (camera 21 and/or radar device 22) that output the detection information satisfying the avoidance condition and on the combination thereof, and avoidance driving can be performed at different ease-of-execution levels E according to the evaluation result. In the present invention, the event of satisfying the avoidance condition is classified into a plurality of patterns P, the reliability of the event is evaluated for each pattern P, and the ease-of-execution level E is set in stages according to the reliability of each event.
When both the detection information of the camera 21 and the detection information of the radar device 22 satisfy the avoidance condition, the ease of execution of the avoidance driving is set high, so the host vehicle V1 can be made to perform avoidance driving with good responsiveness and agile motion. In particular, when the avoidance driving involves a lane change with a large amount of lateral movement, the occupant can experience nimble, quick movement. On the other hand, when satisfaction of the avoidance condition is determined based only on the detection information of the camera 21, for which the possibility of erroneous detection is relatively high, lowering the ease of execution of the avoidance driving suppresses moving the vehicle and executing avoidance driving on the basis of erroneous detection by the camera 21. In this way, by analyzing the detection results based on whether the avoidance condition is satisfied according to the detection information of the camera 21 and whether it is satisfied according to the detection information of the radar device 22, associating the results with patterns P1, P2, and P3, and defining an order of ease of execution according to patterns P1 to P3, avoidance driving control that takes into account the detection characteristics of the camera 21 and the radar device 22 used in combination can be executed.
If the detection information of the camera 21 were simply confirmed and supplemented by the detection information of the radar device 22, the avoidance driving would be expected to be performed in the same manner in every case. In the present invention, the conclusion that the avoidance condition is satisfied is distinguished into patterns P1, P2, and P3, and the ease-of-execution levels E1, E2, and E3 are associated with these patterns in stages, which reduces the situations in which the start of avoidance driving is delayed or the movement speed is restrained. In driving that includes lateral movement, such as avoidance driving, prior processing that confirms the space at the movement destination and waits before starting the movement is essential, so avoidance driving requires both accurate judgment and agile motion. According to the present invention, the frequency with which agile avoidance driving is executed can be improved while the configuration of the sensors 2 is maintained.

Avoidance driving being readily executed means that the avoidance driving is started early and completed in a short time. When the avoidance condition is satisfied by the detection information of both the camera 21 and the radar device 22, the waiting time TR2 is shortened compared with when the avoidance condition is satisfied by the detection information of only the camera 21 or only the radar device 22, so the avoidance driving can be started promptly without missing the opportunity for lateral movement. After the maneuver starts, the avoidance driving can be carried out briskly by shortening the execution time TR3, and the execution time TR3 can be shortened by increasing the lateral movement speed during the avoidance driving. As a result, the total time TR1 is shortened, and the host vehicle V1 reacts sensitively to the detection result and can execute the avoidance driving with smooth, agile motion and without sluggishness.
On the other hand, when the avoidance condition is satisfied only by the detection information of the camera 21, the total time TR1, the waiting time TR2, and the execution time TR3 can be set relatively longer than when the detection information of the radar device 22 satisfies the avoidance condition (patterns P1 and P2). When the reliability is relatively low, careful avoidance driving can be performed by confirming the detection information beforehand and carrying out the avoidance driving slowly, taking more time.
Furthermore, when the avoidance condition is satisfied only by the detection information of the radar device 22 (pattern P2), setting the total time TR1, the waiting time TR2, and the execution time TR3 longer than the respective times for pattern P1 and shorter than the respective times for pattern P3 realizes avoidance driving with a stepwise ease of execution according to the reliability.

In S12 of FIG. 2, the processor 10 performs correction processing on the content of the avoidance driving. In this correction processing, the content of the avoidance driving corresponding to the level is changed. Specifically, the content of the avoidance driving is corrected by extending or shortening the total time TR1, the waiting time TR2, or the execution time TR3 of the avoidance driving, or by increasing or decreasing the avoidance speed or the degree of avoidance turning.

In the first correction processing, the processor 10 corrects the content of the avoidance driving related to the ease of execution, taking into account weather conditions and/or the vehicle type of the rear-side vehicle V2. The processor 10 calculates the reliability of the detection results of the sensors 2, including the camera 21 and the radar device 22, and when the calculated reliability is relatively low, makes the total time TR1 from timing T2, at which information on the rear-side vehicle V2 satisfying the avoidance condition is acquired, to timing T3, at which execution of the avoidance driving is completed, longer than when the reliability is relatively high. Specifically, the waiting time TR2 and/or the execution time TR3 included in the total time TR1 are lengthened. In this processing, the lower the calculated reliability, the longer the total time TR1, and the waiting time TR2 and/or the execution time TR3 included in it, may be made.

Even if the camera 21 and the radar device 22 have high performance, false detections are unavoidable in some detection environments. The detection information of the camera 21 is prone to noise, and its measurement accuracy for distant positions is relatively low. The detection accuracy of the camera 21 and the radar device 22 can also be affected by the weather. Images captured by the camera 21 may be washed out by intense sunlight, its reflections, or snow cover, so that objects cannot be recognized or located accurately. The emission and reception of electromagnetic waves by the radar device 22 are affected by rain and snowfall, which can likewise prevent accurate object recognition and position measurement. When the rear vehicle V2 is a narrow straddle-type vehicle such as a motorcycle, the camera 21 may fail to capture the target accurately, and the radar device 22 may not receive enough reflected waves to recognize and range the target correctly. When the rear vehicle V2 is a truck with a large width and length, it may not fit within the angle of view of the camera 21, so that only part of it is recognized and its image cannot be captured accurately, and the radar device 22 may fail to receive reflected waves from the object boundary and thus cannot accurately recognize and range the entire object.
In this way, the detection accuracy of the sensor 2 varies with the environment, and it also differs between sensor types (the camera 21 and the radar device 22).
The environmental conditions that affect the detection accuracy of the camera 21 and the radar device 22 (weather conditions and/or the vehicle type of the rear vehicle V2) are defined in advance. When such an influencing environment is detected, the processor 10 calculates the reliability of the detection information. Weather conditions can be acquired from an external server via the communication device 30. Rain and snowfall can be detected from the operation of the wipers of the host vehicle V1. Sunlight and reflected light can be measured with an illuminance meter mounted on the host vehicle V1. Whether the rear vehicle V2 is a motorcycle can be determined by a pattern-matching method based on an image of the object. Whether the rear vehicle V2 is a truck can be determined by learning image features such as an object boundary that does not fit within the angle of view of the camera 21, or an object boundary derived from the waves received by the radar device 22 that is not continuous.
When the detection accuracy is expected to drop because of external factors such as the weather or the vehicle type of the rear vehicle V2, the content of the avoidance maneuver can be corrected according to the reliability of the detection information. When an external factor lowers the reliability, the total time TR1 and the waiting time TR2 and/or the execution time TR3 included in it are adjusted (lengthened in this embodiment), so that the avoidance maneuver is performed while suppressing the influence of the external factor.
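The following sketch is a hypothetical example of how the predefined environmental conditions above (wiper state as a proxy for rain or snow, illuminance, and the recognized class of the rear vehicle V2) might be folded into a per-sensor reliability score. The inputs, thresholds, and penalty values are assumptions for illustration, not values from the specification.

```python
# Hypothetical per-sensor reliability score from environmental conditions.
from dataclasses import dataclass


@dataclass
class Environment:
    wipers_on: bool          # proxy for rain / snowfall
    illuminance_lux: float   # from the on-board illuminance meter
    rear_vehicle_class: str  # e.g. "car", "motorcycle", "truck" from pattern matching


def sensor_reliability(env: Environment, sensor: str) -> float:
    """Return a score in [0, 1]; 'sensor' is 'camera' or 'radar'."""
    score = 1.0
    if sensor == "camera":
        if env.illuminance_lux > 80_000:   # strong sunlight / reflections -> whiteout risk
            score -= 0.3
        if env.wipers_on:                  # rain or snow on the lens
            score -= 0.2
    elif sensor == "radar":
        if env.wipers_on:                  # rain/snow attenuates the returned waves
            score -= 0.3
    if env.rear_vehicle_class in ("motorcycle", "truck"):
        score -= 0.2                       # very small or over-sized targets are harder for both sensors
    return max(score, 0.0)


# Example: a motorcycle behind the host vehicle in rain lowers the camera score to 0.6.
print(sensor_reliability(Environment(True, 20_000.0, "motorcycle"), "camera"))
```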

In the second correction process, the processor 10 corrects the content of the avoidance maneuver related to its ease of execution based on the relative distance between the host vehicle V1 and the rear vehicle V2. Both the camera 21 and the radar device 22 have a limited detection range, and their detection accuracy decreases as the target becomes more distant. The processor 10 therefore evaluates the reliability of the detection information according to the relative distance between the host vehicle V1 and the rear vehicle V2, and corrects the content of the avoidance maneuver by changing its ease of execution. Specifically, when the relative distance between the host vehicle V1 and the rear vehicle V2 is long, the processor 10 makes the total time TR1 from the timing T2 at which the information on the rear vehicle V2 satisfying the avoidance condition is acquired to the timing T3 at which execution of the avoidance maneuver is completed, and the waiting time TR2 and/or the execution time TR3 included in it, longer than when the relative distance is short. Stated the other way, when the relative distance between the host vehicle V1 and the rear vehicle V2 is short, the total time TR1 and the waiting time TR2 and/or the execution time TR3 included in it are made shorter than when the relative distance is long.
As a result, when the detection accuracy is expected to drop because of the characteristics of the camera 21 or the radar device 22, the ease of execution of the avoidance maneuver can be corrected according to the distance, which affects the reliability of their detection information. When the reliability of the detection information of the camera 21 or the radar device 22 is expected to be affected by distance, adjusting the total time TR1 according to the measured relative distance allows the avoidance maneuver to be performed while suppressing the influence of distance on the detection information.
In this correction process, the total time TR1 and the waiting time TR2 and/or the execution time TR3 included in it may be made longer as the relative distance measured from the detection information of the camera 21 or the radar device 22 becomes longer, and may be made shorter as the relative distance becomes shorter.
Incidentally, the detection accuracy of the camera 21 tends to be lower for distant targets than for nearby ones. According to experiments, the detection accuracy for distant targets is lower for the camera 21 than for the radar device 22. For this reason, the above correction process may be executed only when it is determined that the avoidance condition is satisfied based solely on the information acquired from the camera 21 (pattern P3). When only the detection information of the camera 21 satisfies the avoidance condition, the length of the waiting time TR2 can be adjusted according to the relative distance between the host vehicle V1 and the rear vehicle V2.
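A minimal sketch of this distance-dependent correction, restricted to the camera-only pattern P3 as suggested above, might look as follows. The reference distance and the linear scaling are assumed values chosen only to make the example concrete.

```python
# Hypothetical distance-dependent correction of the waiting time TR2,
# applied only when the avoidance condition is met by the camera alone (pattern P3).

def correct_wait_for_distance(base_tr2: float, relative_distance_m: float,
                              pattern: str,
                              ref_distance_m: float = 30.0) -> float:
    """Lengthen TR2 for distant rear vehicles detected only by the camera."""
    if pattern != "P3":      # P1 (camera + radar) and P2 (radar only) keep the base waiting time
        return base_tr2
    # Farther than the reference distance -> longer wait; nearer -> shorter wait.
    return base_tr2 * (relative_distance_m / ref_distance_m)


print(correct_wait_for_distance(1.0, 60.0, "P3"))  # distant target -> 2.0 s
print(correct_wait_for_distance(1.0, 15.0, "P3"))  # nearby target  -> 0.5 s
```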

In the third correction process, when it is determined that the avoidance condition is satisfied based only on the information acquired from the camera 21 (pattern P3), the processor 10 corrects the content of the avoidance maneuver related to its ease of execution, taking into account whether the longitudinal or the lateral degree of approach in the detection information of the camera 21 satisfied the avoidance condition. When only the detection information of the camera 21 satisfies the avoidance condition, the processor 10 extracts longitudinal distance information and lateral distance information from that detection information. In the avoidance condition, the approach-degree thresholds set for the detection information of the camera 21 include a threshold for the longitudinal information and a threshold for the lateral information. The processor 10 determines whether the lateral (vehicle-width direction) degree of approach in the detection information of the camera 21 satisfied the avoidance condition based on the lateral threshold, or the longitudinal degree of approach satisfied the avoidance condition based on the longitudinal threshold. When the longitudinal (vehicle-length direction) degree of approach in the detection information of the camera 21 satisfies the avoidance condition based on the longitudinal threshold, the processor 10 makes the total time TR1 from the timing T2 at which the information on the rear vehicle V2 satisfying the avoidance condition is acquired to the timing T3 at which execution of the avoidance maneuver is completed, and the waiting time TR2 and/or the execution time TR3 included in it, longer than when the lateral (vehicle-width direction) degree of approach satisfies the avoidance condition based on the lateral threshold. Stated the other way, when the lateral (vehicle-width direction) degree of approach satisfies the avoidance condition based on the lateral threshold, the total time TR1 and the waiting time TR2 and/or the execution time TR3 included in it are made shorter than when the avoidance condition based on the longitudinal threshold is satisfied.

According to experiments by the inventors, the lateral detection accuracy of the camera 21 tends to be higher than its longitudinal detection accuracy. The angle (direction of presence) of a target derived from the detection information of the camera 21 is detected with high accuracy, whereas the distance to the target is not. For this reason, longitudinal detection information and lateral detection information are extracted from the detection information of the camera 21, and the reliability of the detection information in each direction is evaluated relative to the other. The lateral detection information is evaluated as more reliable than the longitudinal detection information, and the content of the avoidance maneuver related to its ease of execution is changed based on this evaluation.
Accordingly, when the longitudinal detection accuracy of the camera 21 is expected to be lower than its lateral detection accuracy, the content of the avoidance maneuver can be changed according to the direction (longitudinal or lateral) of the detection information, which affects the reliability of the detection information of the camera 21. When the detection information of the camera 21 that satisfies the avoidance condition is lateral information, the total time TR1 and the waiting time TR2 and/or the execution time TR3 included in it are shortened; when it is longitudinal information, they are lengthened. This makes it possible to perform the avoidance maneuver while suppressing the influence of the measurement direction on the detection information of the camera 21.
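For illustration only, the sketch below shows one possible form of this third correction: when only the camera satisfies the avoidance condition, the direction (longitudinal or lateral) whose threshold was crossed decides whether the total time is lengthened or shortened. The thresholds and scale factors are hypothetical and not taken from the specification.

```python
# Hypothetical third correction for the camera-only pattern P3:
# scale the overall time TR1 depending on which directional threshold was crossed.

def correct_times_for_direction(base_tr1: float,
                                longitudinal_gap_m: float, lateral_gap_m: float,
                                long_threshold_m: float = 10.0,
                                lat_threshold_m: float = 1.0) -> float:
    """Return a corrected overall time TR1 for the camera-only pattern P3."""
    lateral_hit = lateral_gap_m <= lat_threshold_m
    longitudinal_hit = longitudinal_gap_m <= long_threshold_m
    if lateral_hit:
        # Lateral (vehicle-width direction) camera measurements are the more reliable,
        # so the avoidance maneuver can be carried out sooner.
        return base_tr1 * 0.8
    if longitudinal_hit:
        # Longitudinal (vehicle-length direction) distance is less reliable,
        # so allow more time before and while avoiding.
        return base_tr1 * 1.5
    return base_tr1


# Only the longitudinal threshold is crossed here, so TR1 grows from 3.0 s to 4.5 s.
print(correct_times_for_direction(3.0, longitudinal_gap_m=8.0, lateral_gap_m=1.5))
```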

In S13 of FIG. 2, the processor 10 causes the vehicle controller 200 to execute the autonomous avoidance maneuver with the determined content.

100... Driving control system, 1... Driving control device, 10... Processor, 11... CPU, 12... ROM, 13... RAM, 20... Input/output device, 30... Communication device, 2... Sensor, 21... Camera, 22... Radar device, 3... Host vehicle information acquisition device, 4... Other vehicle information acquisition device, 5... Rear vehicle recognition device, 6... Map information, 7... Navigation device, 200... Vehicle controller, 210... Steering control device, 220... Drive control device

Claims (9)

1. A driving control method for controlling driving of a host vehicle, used in a driving control device comprising a processor and a sensor including a camera and a radar device, wherein the processor: acquires, using the sensor, detection information on a rear vehicle traveling behind the host vehicle in an adjacent lane; causes the host vehicle to execute an avoidance maneuver that separates the host vehicle from the rear vehicle when the detection information satisfies a predetermined avoidance condition; makes the avoidance maneuver more likely to be executed when the detection information acquired from the camera satisfies the avoidance condition and the detection information acquired from the radar device satisfies the avoidance condition than when only the detection information acquired from the radar device satisfies the avoidance condition; and makes the avoidance maneuver more likely to be executed when only the detection information acquired from the radar device satisfies the avoidance condition than when only the detection information acquired from the camera satisfies the avoidance condition.

2. The driving control method according to claim 1, wherein the processor makes the avoidance maneuver more likely to be executed by shortening the time from when the detection information satisfying the avoidance condition is acquired to when execution of the avoidance maneuver is completed.

3. The driving control method according to claim 1 or 2, wherein the processor makes the avoidance maneuver more likely to be executed by shortening the time from when the detection information satisfying the avoidance condition is acquired to when execution of the avoidance maneuver is started.

4. The driving control method according to any one of claims 1 to 3, wherein the processor makes the avoidance maneuver more likely to be executed by shortening the time from the start to the completion of execution of the avoidance maneuver.

5. The driving control method according to claim 4, wherein the processor increases the lateral movement speed of the host vehicle during execution of the avoidance maneuver.

6. The driving control method according to any one of claims 1 to 5, wherein the processor calculates a reliability of the detection information acquired from the sensor based on weather conditions and/or a vehicle type condition of the rear vehicle, and makes the avoidance maneuver more likely to be executed when the reliability is relatively high than when the reliability is relatively low.

7. The driving control method according to any one of claims 1 to 6, wherein the processor makes the avoidance maneuver less likely to be executed when the distance between the host vehicle and the rear vehicle is relatively long than when the distance between the host vehicle and the rear vehicle is relatively short.

8. The driving control method according to any one of claims 1 to 7, wherein the avoidance condition is defined based on a degree of approach between the rear vehicle and the host vehicle, and wherein, when only the detection information acquired from the camera satisfies the avoidance condition, the processor makes the avoidance maneuver less likely to be executed when the longitudinal degree of approach in the detection information acquired from the camera satisfies the avoidance condition based on a longitudinal threshold than when the lateral degree of approach in the detection information acquired from the camera satisfies the avoidance condition based on a lateral threshold.

9. A driving control device comprising a processor and a sensor including a camera and a radar device, the driving control device controlling driving of a host vehicle, wherein the processor: acquires, using the sensor, detection information on a rear vehicle traveling behind the host vehicle in an adjacent lane; causes the host vehicle to execute an avoidance maneuver that separates the host vehicle from the rear vehicle when the detection information satisfies a predetermined avoidance condition; makes the avoidance maneuver more likely to be executed when the detection information acquired from the camera satisfies the avoidance condition and the detection information acquired from the radar device satisfies the avoidance condition than when only the detection information acquired from the radar device satisfies the avoidance condition; and makes the avoidance maneuver more likely to be executed when only the detection information acquired from the radar device satisfies the avoidance condition than when only the detection information acquired from the camera satisfies the avoidance condition.
PCT/JP2023/034501 2023-09-22 2023-09-22 Driving control method and driving control device Pending WO2025062624A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/034501 WO2025062624A1 (en) 2023-09-22 2023-09-22 Driving control method and driving control device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/034501 WO2025062624A1 (en) 2023-09-22 2023-09-22 Driving control method and driving control device

Publications (1)

Publication Number Publication Date
WO2025062624A1 true WO2025062624A1 (en) 2025-03-27

Family

ID=95072617

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/034501 Pending WO2025062624A1 (en) 2023-09-22 2023-09-22 Driving control method and driving control device

Country Status (1)

Country Link
WO (1) WO2025062624A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0719882A (en) * 1993-06-30 1995-01-20 Mazda Motor Corp Traveling area recognition apparatus of vehicle and safety apparatus with the device
JP2013037601A (en) * 2011-08-10 2013-02-21 Suzuki Motor Corp Operation support device
JP2022088809A (en) * 2020-12-03 2022-06-15 本田技研工業株式会社 Vehicle control devices, vehicle control methods, and programs
JP2023062506A (en) * 2021-10-21 2023-05-08 トヨタ自動車株式会社 Device, method, and program for controlling vehicle

Similar Documents

Publication Publication Date Title
CN110406533B (en) Lane keeping assist system and method for improving safety of longitudinal control of front vehicle follower
JP6404722B2 (en) Vehicle travel control device
CN107251127B (en) Vehicle travel control device and travel control method
US9688272B2 (en) Surroundings monitoring apparatus and drive assistance apparatus
US11260859B2 (en) Vehicle control system, vehicle control method, and storage medium
JP6507839B2 (en) Vehicle travel control device
JP6363516B2 (en) Vehicle travel control device
CN107004367A (en) Travel controlling system, travel control method and the traveling control program of vehicle
US20180012083A1 (en) Demarcation line recognition apparatus
US20190168758A1 (en) Cruise control device
US12258011B2 (en) Vehicle controller and method for controlling vehicle
US20200098126A1 (en) Object detection apparatus
CN109689459B (en) Vehicle travel control method and travel control device
US11161505B2 (en) Vehicle control device, vehicle control method, and non-transitory computer-readable medium storing a program
JP7607743B2 (en) Method for determining an avoidance trajectory for a vehicle - Patents.com
WO2025062624A1 (en) Driving control method and driving control device
WO2025062620A1 (en) Driving control method and driving control device
WO2025062618A1 (en) Driving control method and driving control device
WO2025062617A1 (en) Driving control method and driving control device
US20250304060A1 (en) Mobile object control device, mobile object control method, and storage medium
US20200114906A1 (en) Vehicle control system and vehicle control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23952446

Country of ref document: EP

Kind code of ref document: A1