
WO2019130483A1 - Vehicle control device, vehicle control method, and program - Google Patents


Info

Publication number
WO2019130483A1
WO2019130483A1 (application PCT/JP2017/046959)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
unit
driver
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/046959
Other languages
French (fr)
Japanese (ja)
Inventor
三浦 弘
石川 誠
成光 土屋
浩司 川邊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Priority to PCT/JP2017/046959
Publication of WO2019130483A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present invention relates to a vehicle control device, a vehicle control method, and a program.
  • A known technique captures an image of another vehicle with a camera, determines from the captured image whether the driver of the other vehicle is inattentive to the road ahead, and, if so, issues an alert to that driver (see, for example, Patent Document 1).
  • The present invention has been made in consideration of such circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a program capable of realizing more flexible automatic driving while communicating with the drivers of other vehicles.
  • The vehicle control device includes: an acquisition unit that acquires information indicating the direction of the face or line of sight of an occupant of another vehicle; a determination unit that determines whether the direction indicated by the information acquired by the acquisition unit is a direction facing the host vehicle as viewed from the other vehicle; and a driving control unit that controls one or both of the steering and the acceleration/deceleration of the host vehicle and performs predetermined control when the determination unit determines that the indicated direction faces the host vehicle.
  • In the vehicle control device, the predetermined control includes at least one of decelerating the host vehicle, changing the lane of the host vehicle, and moving the host vehicle backward.
  • The vehicle control device further includes a detection unit that detects a steering wheel installed in the other vehicle and detects the direction of the face or line of sight of the occupant closest to the detected steering wheel, and the acquisition unit acquires the detection result of the detection unit as the information indicating the direction of the face or line of sight of the occupant of the other vehicle.
  • The vehicle control device further includes a communication unit that performs inter-vehicle communication with the other vehicle, and a communication control unit that, when the determination unit determines that the direction indicated by the information is a direction facing the host vehicle as viewed from the other vehicle, causes the communication unit to start inter-vehicle communication and transmit predetermined information to the other vehicle.
  • When the determination unit determines that the direction indicated by the information is a direction facing the host vehicle as viewed from the other vehicle, the driving control unit decelerates the host vehicle or changes its lane to another lane.
  • In the vehicle control method, the acquisition unit acquires information indicating the direction of the face or line of sight of the occupant of the other vehicle, the determination unit determines whether the direction indicated by the acquired information is a direction facing the host vehicle as viewed from the other vehicle, and the driving control unit controls one or both of the steering and the acceleration/deceleration of the host vehicle and performs predetermined control when the determination unit determines that the indicated direction faces the host vehicle.
  • The program causes a computer to acquire information indicating the direction of the face or line of sight of the occupant of another vehicle, determine whether the direction indicated by the acquired information is a direction facing the host vehicle as viewed from the other vehicle, control one or both of the steering and the acceleration/deceleration of the host vehicle, and perform predetermined control when it is determined that the indicated direction faces the host vehicle.
  • FIG. 2 is a functional configuration diagram of a first control unit 120 and a second control unit 160. FIG. 3 is a diagram showing how the target trajectory T is generated based on the recommended lane.
  • In the following, the vehicle control device will be described as applied to a vehicle capable of automatic driving (autonomous driving).
  • In automatic driving, for example, one or both of the steering and the acceleration/deceleration of the vehicle are controlled so that the vehicle travels without depending on the operation of an occupant on board.
  • Automatic driving may include driving support such as ACC (Adaptive Cruise Control) and LKAS (Lane Keeping Assist System).
  • FIG. 1 is a block diagram of a vehicle system 1 using the vehicle control device according to the first embodiment.
  • The vehicle on which the vehicle system 1 is mounted (hereinafter, the host vehicle M) is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle whose drive source is an internal combustion engine such as a diesel or gasoline engine, an electric motor, or a combination of these.
  • When an electric motor is provided, it operates using power generated by a generator connected to the internal combustion engine, or the discharge power of a secondary battery or fuel cell.
  • The vehicle system 1 includes, for example, a camera 10, a radar device 12, a finder 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, a vehicle sensor 40, a navigation device 50, an MPU (Map Positioning Unit) 60, a driving operator 80, a lamp group 90, a horn 92, an automatic driving control device 100, a traveling driving force output device 200, a brake device 210, and a steering device 220. These devices are mutually connected by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like.
  • The camera 10 is, for example, a digital camera using a solid-state imaging device such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) sensor.
  • One or more cameras 10 are attached to any part of the host vehicle M on which the vehicle system 1 is mounted.
  • When imaging the area ahead, the camera 10 is attached to the top of the front windshield, the back of the rearview mirror, or the like.
  • the camera 10 periodically and repeatedly captures the periphery of the vehicle M.
  • the camera 10 may be a stereo camera.
  • The radar device 12 emits radio waves such as millimeter waves around the host vehicle M and detects radio waves (reflected waves) reflected by an object to detect at least the position (distance and direction) of the object.
  • One or more of the radar devices 12 are attached to any part of the host vehicle M.
  • The radar device 12 may detect the position and velocity of an object by an FM-CW (frequency-modulated continuous-wave) method.
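  • As an illustrative aside (not part of the patent disclosure), the FM-CW ranging principle can be sketched as follows: for a linear chirp of bandwidth B swept over duration T, a measured beat frequency f_b between the transmitted and received signals implies a range R = c·T·f_b / (2B). The function name and parameter values below are our own assumptions.

```python
C = 299_792_458.0  # speed of light [m/s]

def fmcw_range(beat_hz: float, chirp_s: float, bandwidth_hz: float) -> float:
    """Range [m] implied by a beat frequency for a linear FM-CW chirp.

    R = c * T * f_b / (2 * B): the beat frequency is proportional to the
    round-trip delay of the reflected wave.
    """
    return C * chirp_s * beat_hz / (2.0 * bandwidth_hz)

# A 50 kHz beat on a 1 ms chirp over 150 MHz of bandwidth corresponds to
# roughly a 50 m range.
```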
  • The finder 14 is a LIDAR (Light Detection and Ranging) device.
  • the finder 14 irradiates light around the host vehicle M and measures scattered light.
  • the finder 14 detects the distance to the object based on the time from light emission to light reception.
  • the light to be irradiated is, for example, pulsed laser light.
  • One or more finders 14 are attached to any part of the host vehicle M.
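  • The time-of-flight relation described above (distance from the emission-to-reception time) is simply d = c·t/2, since the light travels to the object and back. A minimal sketch, with names of our own choosing:

```python
C = 299_792_458.0  # speed of light [m/s]

def lidar_distance(round_trip_s: float) -> float:
    """Distance [m] to an object from the pulse's emission-to-reception time."""
    return C * round_trip_s / 2.0  # halve: the pulse travels out and back

# A 1 microsecond round trip corresponds to roughly 150 m.
```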
  • The object recognition device 16 performs sensor fusion processing on the detection results of some or all of the camera 10, the radar device 12, and the finder 14 to recognize the position, type, speed, and the like of objects.
  • the object recognition device 16 outputs the recognition result to the automatic driving control device 100.
  • The object recognition device 16 may output the detection results of the camera 10, the radar device 12, and the finder 14 as they are to the automatic driving control device 100, as necessary.
  • The communication device 20 communicates with other vehicles m present around the host vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), or DSRC (Dedicated Short Range Communications), or communicates with various server devices via a wireless base station.
  • the other vehicle m may be, for example, a vehicle on which automatic driving is performed as in the case of the host vehicle M, or may be a vehicle on which manual driving is performed, and there is no particular limitation.
  • Manual driving means that the acceleration/deceleration and the steering of the host vehicle M are controlled in accordance with the occupant's operation of the driving operator 80.
  • the communication device 20 is an example of a “communication unit”.
  • the HMI 30 presents various information to the occupant of the host vehicle M, and accepts input operation by the occupant.
  • the HMI 30 includes various display devices, speakers, a buzzer, a touch panel, switches, keys, and the like.
  • the vehicle sensor 40 includes a vehicle speed sensor that detects the speed of the host vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects an angular velocity around the vertical axis, and an azimuth sensor that detects the direction of the host vehicle M.
  • The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53, and holds the first map information 54 in a storage device such as an HDD (Hard Disk Drive) or flash memory.
  • the GNSS receiver 51 specifies the position of the host vehicle M based on the signal received from the GNSS satellite. The position of the host vehicle M may be identified or supplemented by an INS (Inertial Navigation System) using the output of the vehicle sensor 40.
  • the navigation HMI 52 includes a display device, a speaker, a touch panel, keys and the like. The navigation HMI 52 may be partially or entirely shared with the above-described HMI 30.
  • The route determination unit 53 determines, for example, a route from the position of the host vehicle M specified by the GNSS receiver 51 (or any input position) to a destination input by the occupant using the navigation HMI 52 (hereinafter, the on-map route), with reference to the first map information 54.
  • The first map information 54 is, for example, information in which road shapes are represented by links indicating roads and nodes connected by the links.
  • the first map information 54 may include road curvature, POI (Point Of Interest) information, and the like.
  • the on-map route determined by the route determination unit 53 is output to the MPU 60.
  • the navigation device 50 may also perform route guidance using the navigation HMI 52 based on the on-map route determined by the route determination unit 53.
  • the navigation device 50 may be realized by, for example, the function of a terminal device such as a smartphone or a tablet terminal owned by a passenger.
  • the navigation device 50 may transmit the current position and the destination to the navigation server via the communication device 20, and acquire the on-map route returned from the navigation server.
  • the MPU 60 functions as, for example, a recommended lane determination unit 61, and holds the second map information 62 in a storage device such as an HDD or a flash memory.
  • The recommended lane determination unit 61 divides the route provided by the navigation device 50 into a plurality of blocks (for example, in units of 100 [m] in the vehicle traveling direction) and determines a recommended lane for each block with reference to the second map information 62.
  • The recommended lane determination unit 61 determines, for example, which lane from the left to travel in.
  • the recommended lane determination unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for traveling to a branch destination when a branch point, a junction point, or the like exists in the route.
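  • As a sketch of the block division described above (the function name and data layout are hypothetical, not taken from the patent), a route can be cut into fixed-length blocks of, say, 100 m, with one recommended lane then chosen per block:

```python
def split_into_blocks(route_length_m: float, block_m: float = 100.0):
    """Return (start, end) intervals [m] covering the route in block_m pieces."""
    blocks = []
    start = 0.0
    while start < route_length_m:
        end = min(start + block_m, route_length_m)  # final block may be shorter
        blocks.append((start, end))
        start = end
    return blocks

# A 250 m route yields blocks (0, 100), (100, 200), (200, 250); a recommended
# lane would then be determined per block against the second map information 62.
```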
  • the second map information 62 is map information that is more accurate than the first map information 54.
  • the second map information 62 includes, for example, information on the center of the lane or information on the boundary of the lane. Further, the second map information 62 may include road information, traffic regulation information, address information (address / zip code), facility information, telephone number information, and the like.
  • the second map information 62 may be updated as needed by accessing another device using the communication device 20.
  • The driving operator 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, a modified steering wheel, a joystick, and other operating elements.
  • A sensor that detects the amount of operation or the presence or absence of operation is attached to the driving operator 80, and the detection result is output to the automatic driving control device 100, or to some or all of the traveling driving force output device 200, the brake device 210, and the steering device 220.
  • the lamp group 90 includes, for example, a head lamp installed at the front end of the vehicle body of the host vehicle M, a rear lamp installed at the rear end, a hazard lamp, and the like.
  • the horn 92 makes a sound.
  • the automatic driving control device 100 includes, for example, a first control unit 120 and a second control unit 160.
  • Each of the first control unit 120 and the second control unit 160 is realized, for example, by a hardware processor such as a CPU (Central Processing Unit) executing a program (software).
  • Some or all of these components may be realized by hardware (circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or by cooperation of software and hardware.
  • FIG. 2 is a functional block diagram of the first control unit 120 and the second control unit 160.
  • the first control unit 120 includes, for example, a recognition unit 130 and an action plan generation unit 140.
  • the recognition unit 130 includes, for example, another vehicle occupant gaze direction recognition unit 132.
  • The action plan generation unit 140 includes, for example, an other vehicle encounter time determination unit 142. The combination of the camera 10, the object recognition device 16, and the recognition unit 130 is an example of a "detection unit."
  • The first control unit 120 implements, for example, a function based on artificial intelligence (AI) and a function based on a predetermined model in parallel. For example, the "recognize an intersection" function is realized by executing, in parallel, recognition of an intersection by deep learning or the like and recognition based on predetermined conditions (signals, road markings, and the like that permit pattern matching), scoring both, and evaluating them comprehensively. This ensures the reliability of automatic driving.
  • The recognition unit 130 recognizes states such as the position, velocity, and acceleration of objects in the vicinity of the host vehicle M based on information input from the camera 10, the radar device 12, and the finder 14 via the object recognition device 16.
  • the objects include other vehicles m and stationary obstacles.
  • The position of an object is recognized, for example, as a position on absolute coordinates whose origin is a representative point of the host vehicle M (such as the center of gravity or the center of the drive axle), and is used for control.
  • The position of an object may be represented by a representative point such as the center of gravity or a corner of the object, or may be represented by a region.
  • The "state" of an object may include the acceleration or jerk of the object, or its "action state" (for example, whether it is changing lanes or about to change lanes).
  • The recognition unit 130 recognizes the shape of a curve that the host vehicle M is about to pass through based on the captured image of the camera 10.
  • The recognition unit 130 converts the shape of the curve from the captured image of the camera 10 into a real plane and outputs, to the action plan generation unit 140, information indicating the shape of the curve expressed, for example, using two-dimensional point sequence information or a model equivalent thereto.
  • the recognition unit 130 also recognizes, for example, a lane in which the host vehicle M is traveling (traveling lane). For example, the recognition unit 130 may use a pattern of road division lines obtained from the second map information 62 (for example, an array of solid lines and broken lines) and road division lines around the host vehicle M recognized from an image captured by the camera 10 The traveling lane is recognized by comparing with the pattern of.
  • The recognition unit 130 may recognize the traveling lane by recognizing runway boundaries (road boundaries) including not only road division lines but also road shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the host vehicle M acquired from the navigation device 50 or the processing result of the INS may be taken into account.
  • the recognition unit 130 also recognizes road markings drawn on a road surface such as a temporary stop line, road signs, obstacles, red lights, toll plazas, and other road events.
  • the recognition unit 130 recognizes the position and orientation of the host vehicle M with respect to the traveling lane when recognizing the traveling lane.
  • For example, the recognition unit 130 may recognize, as the relative position and attitude of the host vehicle M with respect to the traveling lane, the deviation of a reference point of the host vehicle M from the lane center and the angle formed between the traveling direction of the host vehicle M and a line connecting the lane centers. Alternatively, the recognition unit 130 may recognize, as the relative position of the host vehicle M with respect to the traveling lane, the position of the reference point of the host vehicle M with respect to either side end (road division line or road boundary) of the traveling lane.
  • the recognition unit 130 may derive recognition accuracy in the above-described recognition processing, and output the recognition accuracy to the action plan generation unit 140 as recognition accuracy information.
  • the recognition unit 130 generates recognition accuracy information based on the frequency at which a road marking can be recognized in a fixed period.
  • The other vehicle occupant gaze direction recognition unit 132 of the recognition unit 130 identifies the other vehicle m from among the recognized objects and recognizes an occupant riding in the other vehicle m. The other vehicle occupant gaze direction recognition unit 132 then recognizes the direction of the face or line of sight of the recognized occupant of the other vehicle m, for example, based on the positions of the occupant's pupils or irises relative to the eyes, skin, hair, and the like.
  • The action plan generation unit 140 basically causes the host vehicle M to travel in the recommended lane determined by the recommended lane determination unit 61, and further determines events to be sequentially executed in automatic driving so as to respond to the surrounding situation of the host vehicle M.
  • Events include, for example, a constant-speed traveling event for traveling in the same lane at a constant speed, a following traveling event for following a preceding vehicle, an overtaking event for passing a preceding vehicle, an avoidance event for braking and/or steering to avoid approaching an obstacle, a curve traveling event for traveling on a curve, a passing event for passing a predetermined point such as an intersection, pedestrian crossing, or railroad crossing, a lane change event, a merging event, a branching event, an automatic stop event, and a takeover event for ending automatic driving and switching to manual driving.
  • Here, "following" is a traveling mode in which the speed is controlled so as to keep the inter-vehicle distance (relative distance) between the preceding vehicle and the host vehicle M constant.
  • the action plan generation unit 140 generates a target trajectory T on which the vehicle M travels in the future, in accordance with the activated event.
  • the target trajectory T includes, for example, a velocity component.
  • the target trajectory T is expressed as a sequence of points (track points) to be reached by the vehicle M.
  • A track point is a point that the host vehicle M should reach for every predetermined traveling distance (for example, every several meters) of road distance; separately from this, a target velocity and a target acceleration for every predetermined sampling time (for example, a few tenths of a second) are generated as part of the target trajectory T.
  • Alternatively, a track point may be a position that the host vehicle M should reach at each predetermined sampling time. In this case, the information on the target velocity and target acceleration is expressed by the interval between the track points.
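  • The two representations of the target trajectory T described above are interchangeable: if track points are placed at a fixed sampling interval, the spacing between consecutive points encodes the target speed. An illustrative sketch (the function name is our own):

```python
import math

def implied_speeds(points, dt):
    """Target speed [m/s] between consecutive track points sampled every dt seconds."""
    return [math.dist(points[i], points[i + 1]) / dt
            for i in range(len(points) - 1)]

# Points spaced 1 m and then 2 m apart at a 0.5 s sampling time imply
# target speeds of 2 m/s and then 4 m/s.
```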
  • FIG. 3 is a diagram showing how the target track T is generated based on the recommended lane.
  • the recommended lanes are set to be convenient to travel along the route to the destination.
  • When the host vehicle M comes within a predetermined distance of a switching point of the recommended lane (the distance may be determined according to the type of event), the action plan generation unit 140 activates a passing event, a lane change event, a branching event, a merging event, or the like. When it becomes necessary to avoid an obstacle during execution of an event, an avoidance trajectory is generated as illustrated.
  • The other vehicle encounter time determination unit 142 of the action plan generation unit 140 acquires, from the recognition unit 130 as a recognition result, information indicating the direction of the face or line of sight of the occupant of the other vehicle m (hereinafter, gaze direction information). The other vehicle encounter time determination unit 142 then refers to the acquired gaze direction information and determines, for example, whether the direction indicated by the information is a predetermined direction.
  • the other vehicle encounter time determination unit 142 is an example of the “acquisition unit” in the claims.
  • When the other vehicle encounter time determination unit 142 determines that the direction of the occupant's face or line of sight is the predetermined direction, the action plan generation unit 140 generates a new target trajectory T and outputs it to the second control unit 160.
  • the second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166.
  • the combination of the action plan generation unit 140, the speed control unit 164, and the steering control unit 166 is an example of the “operation control unit”.
  • The acquisition unit 162 acquires information on the target trajectory T (track points) generated by the action plan generation unit 140 and stores the information in a memory.
  • The speed control unit 164 and the steering control unit 166 control the traveling driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes along the target trajectory T generated by the action plan generation unit 140 at the scheduled times.
  • the speed control unit 164 controls the traveling drive power output device 200 or the brake device 210 based on the speed component associated with the target track T stored in the memory.
  • the steering control unit 166 controls the steering device 220 according to the degree of curvature (for example, curvature) of the target track T stored in the memory.
  • the processing of the speed control unit 164 and the steering control unit 166 is realized by, for example, a combination of feedforward control and feedback control.
  • For example, the steering control unit 166 executes a combination of feedforward control according to the curvature of the road ahead of the host vehicle M and feedback control based on the deviation from the target trajectory T.
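  • A minimal sketch of such a combination (the gain value and the kinematic feedforward term are our own assumptions, not the patent's):

```python
def steering_command(curvature: float, wheelbase_m: float,
                     lateral_error_m: float, k_fb: float = 0.5) -> float:
    """Steering angle [rad]: feedforward from road curvature plus
    proportional feedback on the lateral deviation from the target trajectory."""
    feedforward = wheelbase_m * curvature  # kinematic term for the curve ahead
    feedback = -k_fb * lateral_error_m     # steer back toward the target track
    return feedforward + feedback
```

In this sketch the feedforward term alone steers the vehicle along the curve ahead, while the feedback term corrects any drift away from the target trajectory T.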
  • the device control unit 170 acquires the determination result of the other vehicle encounter time determination unit 142, outputs a command to the communication device 20 according to the determination result, and causes the communication device 20 to perform inter-vehicle communication with the other vehicle m.
  • the device control unit 170 may operate the lamp group 90 or the horn 92 according to the determination result by the other vehicle encounter time determination unit 142.
  • the device control unit 170 is an example of a “communication control unit”.
  • the traveling driving force output device 200 outputs traveling driving force (torque) for the vehicle to travel to the driving wheels.
  • the traveling driving force output device 200 includes, for example, a combination of an internal combustion engine, a motor, and a transmission, and an ECU (Electronic Control Unit) that controls these.
  • the ECU controls the above configuration in accordance with the information input from the second control unit 160 or the information input from the drive operator 80.
  • the brake device 210 includes, for example, a brake caliper, a cylinder that transmits hydraulic pressure to the brake caliper, an electric motor that generates hydraulic pressure in the cylinder, and a brake ECU.
  • the brake ECU controls the electric motor in accordance with the information input from the second control unit 160 or the information input from the drive operator 80 so that the brake torque corresponding to the braking operation is output to each wheel.
  • the brake device 210 may include, as a backup, a mechanism for transmitting the hydraulic pressure generated by the operation of the brake pedal included in the drive operator 80 to the cylinder via the master cylinder.
  • The brake device 210 is not limited to the configuration described above, and may be an electronically controlled hydraulic brake device that controls an actuator in accordance with information input from the second control unit 160 to transmit the hydraulic pressure of the master cylinder to the cylinder.
  • the steering device 220 includes, for example, a steering ECU and an electric motor.
  • the electric motor for example, applies a force to the rack and pinion mechanism to change the direction of the steered wheels.
  • the steering ECU drives the electric motor to change the direction of the steered wheels in accordance with the information input from the second control unit 160 or the information input from the drive operator 80.
  • FIG. 4 is a flowchart showing an example of processing executed by the automatic driving control apparatus 100 according to the first embodiment.
  • the processing of this flowchart is executed, for example, when the recognition unit 130 recognizes the other vehicle m. Also, the processing of this flowchart may be repeatedly executed, for example, at a predetermined cycle.
  • First, the other vehicle occupant gaze direction recognition unit 132 recognizes the driver of the recognized other vehicle m (step S100).
  • the driver is, for example, an occupant seated in a seat closest to the steering wheel among one or more occupants who board the vehicle.
  • For example, the other vehicle occupant gaze direction recognition unit 132 determines whether a steering wheel is present in front of the seat in which a recognized occupant sits. When it is determined that a steering wheel is present in front of the seat in which the recognized occupant is seated, the other vehicle occupant gaze direction recognition unit 132 recognizes that occupant as the driver of the other vehicle m.
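  • The driver-identification step described above reduces to a nearest-neighbor test against the detected steering wheel position. A hedged sketch (the coordinates and the occupant list are hypothetical, not from the patent):

```python
import math

def identify_driver(steering_wheel_xy, occupants_xy):
    """Index of the occupant seated closest to the detected steering wheel."""
    return min(range(len(occupants_xy)),
               key=lambda i: math.dist(steering_wheel_xy, occupants_xy[i]))

# With the steering wheel at (0, 0), the occupant at (0.4, 0.1) is recognized
# as the driver in preference to those at (2.0, 0.0) and (1.0, 1.0).
```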
  • the other vehicle occupant's gaze direction recognition unit 132 recognizes the direction D of the recognized driver's face or gaze (step S102).
  • FIGS. 5 and 6 are diagrams schematically showing the direction D of the face or line of sight of the driver of the other vehicle m.
  • In the figures, ST represents a steering wheel, and DP represents a display device described later.
  • When the driver of the other vehicle m monitors the surroundings of the vehicle through a side window, at least a part of the driver's face can be seen from outside the vehicle.
  • the other vehicle encounter time determination unit 142 acquires gaze direction information as a recognition result from the other vehicle occupant gaze direction recognition unit 132, and determines, based on the gaze direction information, whether the direction D of the face or line of sight of the driver of the other vehicle m is a predetermined direction (step S104).
  • the predetermined direction is, for example, a direction facing the own vehicle M as viewed from the other vehicle m.
  • FIG. 7 is a diagram for describing a predetermined direction.
  • when the own vehicle M exists within a range of a predetermined angle width θ centered on the direction D of the face or line of sight of the driver of the other vehicle m, the other vehicle encounter time determination unit 142 determines that the direction D is the predetermined direction; when the own vehicle M does not exist within the angle width θ, it determines that the direction D of the face or line of sight of the driver of the other vehicle m is not the predetermined direction.
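The angle-width test of FIG. 7 can be expressed as a simple planar geometry check. This is a sketch under assumed conventions: positions and the direction D are 2-D vectors, and the angle width θ is treated as a total width centered on D.

```python
import math

def is_predetermined_direction(other_pos, gaze_dir, host_pos, angle_width_deg):
    """Return True if the host vehicle M lies within the angle width theta
    centered on the driver's face/gaze direction D.

    other_pos, host_pos: (x, y) positions of the other vehicle m and the
    host vehicle M (assumed 2-D representation).
    gaze_dir: (dx, dy) vector for the direction D.
    angle_width_deg: total angle width theta in degrees.
    """
    to_host = (host_pos[0] - other_pos[0], host_pos[1] - other_pos[1])
    gaze_angle = math.atan2(gaze_dir[1], gaze_dir[0])
    host_angle = math.atan2(to_host[1], to_host[0])
    # Smallest absolute difference between the two headings, wrapped to [0, pi].
    diff = abs((host_angle - gaze_angle + math.pi) % (2 * math.pi) - math.pi)
    return diff <= math.radians(angle_width_deg) / 2.0
```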
  • FIG. 8 is a view showing an example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m may be a predetermined direction.
  • the example in the figure shows a merging section viewed from above; the main-line lanes are the lanes LN1 and LN2, and the merging lane is the lane LN3.
  • a driver who is about to merge often turns his or her face toward vehicles traveling on the main line LN1 to signal the intention to merge. As a result, the driver of the other vehicle m becomes recognizable as seen from the host vehicle M.
  • in addition, the driver of the other vehicle m necessarily gazes at vehicles (including the own vehicle M) going straight from behind on the main line LN1 in order to determine the timing of the lane change to the main line LN1. Therefore, the direction D of the face or line of sight of the driver of the other vehicle m can easily be determined to be the predetermined direction.
  • when the driver of the other vehicle m is recognized, the device control unit 170 causes the communication device 20 to start inter-vehicle communication with the other vehicle m (step S106).
  • the device control unit 170 controls the communication device 20 to transmit predetermined information to another vehicle m which is a communication partner of inter-vehicle communication.
  • the predetermined information is, for example, information for notifying what kind of action is to be taken as a result of the own vehicle M recognizing the direction D of the face or the line of sight of the driver of the other vehicle m.
  • the other vehicle m which has received the predetermined information outputs the predetermined information via the display device DP or the speaker installed in the vehicle compartment.
  • the device control unit 170 may omit the process of transmitting predetermined information when inter-vehicle communication is not established with the other vehicle m.
  • FIG. 9 is a view showing an example of a screen displayed on the display device DP of another vehicle m.
  • the screen of the display device DP displays, as the predetermined information, information indicating that the host vehicle M will allow the other vehicle m to cut in ahead.
  • the driver of the other vehicle m can drive the vehicle in anticipation of the action of the host vehicle M.
  • in addition, the device control unit 170 may blink or turn on the headlamps of the lamp group 90, switch them to high beam, or blink or turn on the hazard lamps.
  • the device control unit 170 may sound the horn 92 when it is determined by the other vehicle encounter time determination unit 142 that the direction D of the driver's face or gaze of the other vehicle m is a predetermined direction. By such control, the driver of the other vehicle m is notified of predetermined information.
  • when the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the action plan generation unit 140 determines that this represents an intention of the other vehicle m to merge into the main line LN1, which is the host vehicle's own lane, and generates a target trajectory T for realizing deceleration or a lane change in order to allow the other vehicle m to cut in ahead (step S108). Then, one or both of the speed control unit 164 and the steering control unit 166 control their respective control targets based on the target trajectory T newly generated by the action plan generation unit 140. The control for decelerating the host vehicle M or changing its lane is an example of the "predetermined control".
  • FIG. 10 is a view showing an example of a scene in which the host vehicle M is decelerated.
  • the action plan generation unit 140 generates a new target trajectory T including a target speed VM# that is smaller than the target speed VM defined as the speed element of the previously generated target trajectory T. "Previously" means, for example, a time before the recognition unit 130 recognized the driver of the other vehicle m.
  • the speed control unit 164 controls the traveling driving force output device 200 or the brake device 210 based on the target speed VM# included as a speed element in the newly generated target trajectory, and decelerates the host vehicle M.
  • as a result, a space is formed in front of the host vehicle M, and the other vehicle m, whose driver is looking toward the host vehicle M, can be encouraged to merge.
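Reducing the speed element of the previously generated target trajectory, as in FIG. 10, might look like the following sketch. The trajectory representation (a list of (point, speed) pairs) and the fixed reduction factor are assumptions, not values from the patent.

```python
def decelerated_trajectory(target_trajectory, reduction=0.5):
    """Return a new target trajectory T whose speed elements VM# are
    smaller than the previous target speeds VM.

    target_trajectory: list of (trajectory_point, target_speed) pairs
    (assumed representation).
    reduction: factor (< 1.0) applied to each speed element; illustrative.
    """
    # Keep the trajectory points, scale down every speed element.
    return [(point, speed * reduction) for point, speed in target_trajectory]
```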
  • FIG. 11 is a view showing an example of a scene in which the host vehicle M changes lanes.
  • the action plan generation unit 140 generates a target trajectory T that leads from the lane LN1, which is adjacent to the merging lane LN3, to the lane LN2, which is not adjacent to the merging lane LN3.
  • the steering control unit 166 controls the steering device 220 based on the degree of curvature of the newly generated target track T, and changes the lane of the host vehicle M to the lane LN2.
  • as a result, a space at least large enough for the host vehicle M is formed in the main line LN1, and the other vehicle m, whose driver is looking toward the host vehicle M, can be encouraged to merge.
  • on the other hand, when the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is not the predetermined direction, the action plan generation unit 140 generates the same target trajectory T as the previously generated one, and the target trajectory T referred to by the speed control unit 164 and the steering control unit 166 is not replaced with a new one (step S110). "The same as the previously generated target trajectory T" means, for example, that the target speed determined as the speed element and the curvature of the trajectory itself (the relative positional relationship between trajectory points) are substantially the same. In response, the speed control unit 164 controls the traveling driving force output device 200 or the brake device 210 as before, and the steering control unit 166 controls the steering device 220 as before. As a result, the host vehicle M travels while staying in the lane LN1.
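Taken together, steps S100 through S110 of FIG. 4 amount to the following decision cycle. This is an illustrative sketch only: the callback names are invented, and the real units (recognition unit 130, determination unit 142, action plan generation unit 140) are far more involved.

```python
def first_embodiment_step(other_vehicle, ops):
    """One decision cycle of the FIG. 4 flow (steps S100-S110).

    ops is a dict of callbacks (illustrative names, not from the patent):
    recognize_driver, recognize_gaze_direction, is_predetermined_direction,
    start_v2v_and_notify, generate_yield_trajectory, keep_current_trajectory.
    """
    driver = ops["recognize_driver"](other_vehicle)        # S100
    direction = ops["recognize_gaze_direction"](driver)    # S102
    if ops["is_predetermined_direction"](direction):       # S104
        ops["start_v2v_and_notify"](other_vehicle)         # S106: notify via V2V
        return ops["generate_yield_trajectory"]()          # S108: decelerate or change lanes
    return ops["keep_current_trajectory"]()                # S110: keep previous trajectory
```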
  • the action plan generation unit 140 may perform the process of S110.
  • although the merging-section scene has been described as an example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m can be the predetermined direction, the present invention is not limited thereto; the scene may be a three-way junction such as a T-junction, or another traffic scene.
  • FIG. 12 and FIG. 13 are diagrams showing other examples of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m may be a predetermined direction.
  • the figure shows a T-junction viewed from above, in which the other vehicle m is about to turn right.
  • the lane LN3 in which the other vehicle m is present is a lane that dead-ends into the lane LN1 in which the host vehicle M is present, and a stop regulation may be applied as a traffic rule.
  • in this case, the other vehicle m scheduled to make a right turn must stop at the stop line SL and wait for the host vehicle M, traveling in the priority-road lane LN1, to pass.
  • the automatic driving control device 100 of the own vehicle M recognizes the direction D of the face or the line of sight of the driver of the other vehicle m.
  • the automatic driving control apparatus 100 determines whether the direction D of the recognized driver's face or line of sight is a predetermined direction, and if the direction D is a predetermined direction, transmits predetermined information to the other vehicle m.
  • when the direction D is the predetermined direction, the host vehicle M is decelerated. This enables more flexible automatic driving while communicating with the occupant of the other vehicle m.
  • in the example described above, the automatic driving control apparatus 100 decelerates the host vehicle M when the other vehicle m turns right at the three-way junction into the lane LN2 of the priority road; when the other vehicle m turns left into the lane LN1 of the priority road, the host vehicle M may instead be made to change lanes to the lane LN2 adjacent to the lane LN1, as in the merging-section scene.
  • the scene in which the direction D of the face or line of sight of the driver of the other vehicle m can be a predetermined direction may be a scene in which the host vehicle M follows the other vehicle m (a scene where a following travel event is activated) .
  • FIGS. 14 to 16 are diagrams showing other examples of a scene in which the direction D of the face or sight line of the driver of the other vehicle m may be a predetermined direction.
  • the host vehicle M follows the other vehicle m, which is the leading vehicle, so as to maintain a constant distance between the host vehicle M and the other vehicle m.
  • at this time, the host vehicle M may enter an area behind the other vehicle m that makes its driver uncomfortable, and the driver of the other vehicle m may turn his or her face or line of sight backward.
  • the other vehicle encounter time determination unit 142 determines that the direction D of the driver's face or gaze of the other vehicle m is a predetermined direction.
  • in this case, the device control unit 170 notifies the driver of the other vehicle m of predetermined information, and the action plan generation unit 140 generates a target trajectory T for decelerating the own vehicle M (see FIG. 15) or a target trajectory T for changing the host vehicle M to an adjacent lane (see FIG. 16). This enables more flexible automatic driving while communicating with the occupant of the other vehicle m.
  • in the embodiments described above, the direction D of the face or line of sight of the driver of the other vehicle m is recognized by the host vehicle M, but the present invention is not limited thereto; the direction D of the face or line of sight may be recognized by the other vehicle m itself.
  • a camera for monitoring an occupant may be installed in the compartment of another vehicle m. The occupant monitoring camera is installed to keep the driver of the other vehicle m within the angle of view.
  • the device control unit 170 controls the communication device 20 to receive, from the other vehicle m, an image captured by the occupant monitoring camera.
  • the recognition unit 130 performs image processing on the image received from the other vehicle m, and recognizes the direction D of the face or the line of sight of the driver of the other vehicle m. Further, on the other vehicle m side, the direction D of the driver's face or line of sight may be recognized from the image captured by the occupant monitoring camera. In this case, gaze direction information is transmitted from the other vehicle m to the host vehicle M.
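Where the other vehicle m recognizes the direction D itself and transmits gaze direction information to the host vehicle M, the host side only needs to decode that message. A minimal sketch, assuming a hypothetical JSON payload (the `face_dir` field name is invented for illustration; the patent does not specify a message format):

```python
import json

def parse_gaze_message(payload: bytes):
    """Parse gaze direction information received via inter-vehicle
    communication (hypothetical format: {"face_dir": [dx, dy]}).

    Returns the direction D as a (dx, dy) tuple of floats.
    """
    msg = json.loads(payload.decode("utf-8"))
    dx, dy = msg["face_dir"]
    return float(dx), float(dy)
```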
  • when the direction D is the predetermined direction, the probability that the driver of the other vehicle m is looking at the host vehicle M is high. In this case, since the driver of the other vehicle m is considered to have the intention of merging or changing lanes, decelerating the host vehicle M or changing to another lane secures a space for giving way to the other vehicle m. Conversely, when the direction D is not the predetermined direction, the driver of the other vehicle m is highly likely not looking at the host vehicle M. In this case, the automatic driving control apparatus 100 may determine that the driver of the other vehicle m, though willing to merge or change lanes, has not recognized the host vehicle M, and may decelerate the host vehicle M or change to another lane to secure a space for giving way to the other vehicle m.
  • as described above, according to the first embodiment, the vehicle control device includes: the other vehicle occupant gaze direction recognition unit 132, which acquires gaze direction information indicating the direction D of the face or line of sight of the occupant (driver) of the other vehicle m; the other vehicle encounter time determination unit 142, which determines whether the direction D indicated by the acquired gaze direction information is the predetermined direction; and the action plan generation unit 140, which, when the other vehicle encounter time determination unit 142 determines that the direction D indicated by the gaze direction information is the predetermined direction, generates a target trajectory T for decelerating the host vehicle M or changing its lane, the traveling driving force output device 200 and the brake device 210 being controlled based on the target trajectory T generated by the action plan generation unit 140.
  • Second Embodiment: the second embodiment will be described below.
  • the second embodiment differs from the above-described first embodiment in that the host vehicle M is retracted when the direction D of the driver's face or line of sight of the other vehicle m is a predetermined direction.
  • differences from the first embodiment will be mainly described, and descriptions of functions and the like common to the first embodiment will be omitted.
  • FIG. 17 is a flowchart showing an example of processing executed by the automatic driving control apparatus 100 according to the second embodiment.
  • the processing of this flowchart is executed, for example, when the recognition unit 130 recognizes the other vehicle m. Also, the processing of this flowchart may be repeatedly executed, for example, at a predetermined cycle.
  • the other vehicle occupant gaze direction recognition unit 132 identifies the driver of the recognized other vehicle m (step S200). Next, it recognizes the direction D of the identified driver's face or line of sight (step S202).
  • the other vehicle encounter time determination unit 142 acquires gaze direction information as a recognition result from the other vehicle occupant gaze direction recognition unit 132, and determines, based on the gaze direction information, whether the direction D of the face or line of sight of the driver of the other vehicle m is a predetermined direction (step S204).
  • the action plan generation unit 140 generates the same target trajectory T as the previously generated one, and the target trajectory T referred to by the speed control unit 164 and the steering control unit 166 is not replaced with a new one (step S206).
  • the device control unit 170 causes the communication device 20 to start inter-vehicle communication with the other vehicle m whose driver has been recognized (step S210), and transmits the predetermined information to the other vehicle m.
  • the action plan generation unit 140 generates a target trajectory T for realizing deceleration or lane change (step S212).
  • the device control unit 170 causes the communication device 20 to start inter-vehicle communication with the other vehicle m whose driver has been recognized (step S214), and transmits the predetermined information to the other vehicle m.
  • the action plan generation unit 140 generates a target trajectory T for moving the host vehicle M backward (step S216). Then, one or both of the speed control unit 164 and the steering control unit 166 control each control target based on the target trajectory T newly generated by the action plan generation unit 140.
  • in such a scene, the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction.
  • in this case, the device control unit 170 notifies, as the predetermined information, that a space behind the other vehicle m will be secured by the backward movement of the own vehicle M.
  • the action plan generation unit 140 newly generates a target trajectory T for moving the host vehicle M backward. This makes it possible to form a space that allows the other vehicle m to move backward, and to urge the other vehicle m to move backward.
  • as described above, according to the second embodiment, since a target trajectory T for moving the own vehicle M backward is newly generated, a space that allows the other vehicle m to move backward can be formed, and the other vehicle m can be encouraged to move backward.
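A target trajectory for moving the host vehicle M backward (step S216) can be sketched as trajectory points laid out behind the current position. The point spacing, reverse distance, speed value, and the (point, speed) pair format are illustrative assumptions, not values from the patent.

```python
import math

def backward_trajectory(current_pos, heading_vec, distance=5.0, step=0.5, speed=1.0):
    """Generate target trajectory points behind current_pos, opposite to
    heading_vec, each paired with a low target speed (assumed format).

    current_pos: (x, y) position of the host vehicle M.
    heading_vec: (dx, dy) forward heading of the host vehicle M.
    """
    norm = math.hypot(heading_vec[0], heading_vec[1])
    ux, uy = heading_vec[0] / norm, heading_vec[1] / norm
    points = []
    d = step
    while d <= distance:
        # Each point lies a distance d behind the current position.
        points.append(((current_pos[0] - ux * d, current_pos[1] - uy * d), speed))
        d += step
    return points
```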
  • more flexible automatic driving can be realized while attempting to communicate with the driver of the other vehicle.
  • in the first and second embodiments described above, the host vehicle M is decelerated, changed in lane, or moved backward when the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction. The third embodiment differs from the first and second embodiments in that, when the direction D is not the predetermined direction, vehicle control different from the vehicle control performed when the direction D is the predetermined direction is carried out.
  • differences from the first and second embodiments will be mainly described, and descriptions of functions and the like common to the first and second embodiments will be omitted.
  • the other vehicle occupant gaze direction recognition unit 132 identifies the driver of the recognized other vehicle m (step S300). Next, it recognizes the direction D of the identified driver's face or line of sight (step S302).
  • the other vehicle encounter time determination unit 142 acquires gaze direction information as a recognition result from the other vehicle occupant gaze direction recognition unit 132, and determines, based on the gaze direction information, whether the direction D of the face or line of sight of the driver of the other vehicle m is a predetermined direction (step S304).
  • when the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the action plan generation unit 140 generates the same target trajectory T as the previously generated one, and the target trajectory T referred to by the speed control unit 164 and the steering control unit 166 is not replaced with a new one (step S306).
  • on the other hand, when it is determined that the direction D is not the predetermined direction, the action plan generation unit 140 newly generates a target trajectory T for causing the own vehicle M to follow the other vehicle m while keeping the inter-vehicle distance constant (step S308). Then, one or both of the speed control unit 164 and the steering control unit 166 control each control target based on the target trajectory T newly generated by the action plan generation unit 140.
  • FIG. 21 and FIG. 22 are diagrams showing an example of a situation where the direction D of the face or line of sight of the driver of the other vehicle m does not become a predetermined direction.
  • when the other vehicle m is a vehicle capable of automatic driving, it is assumed that the driver of the other vehicle m, for example, releases his or her hands from the steering wheel ST and views content such as a television program displayed on the display device DP. In this case, the other vehicle encounter time determination unit 142 determines that the driver of the other vehicle m is not operating the steering wheel ST and that the direction D of the driver's face or line of sight is not the predetermined direction.
  • in this case, the action plan generation unit 140 newly generates a target trajectory T for changing the lane of the host vehicle M to the lane LN1 in which the other vehicle m exists. After the host vehicle M moves to the lane LN1, the target speed is set to be substantially the same as the speed of the other vehicle m, and a target trajectory T including this target speed as a speed element is generated. As a result, the own vehicle M can be made to follow the other vehicle m.
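Following the other vehicle m while keeping the inter-vehicle distance constant (step S308) is, in essence, a gap-keeping speed controller. A minimal proportional-control sketch; the gain, target gap, and speed limit are assumptions, not values from the patent:

```python
def follow_speed(lead_speed, gap, target_gap=20.0, k=0.5, max_speed=30.0):
    """Return the host vehicle's commanded speed (m/s) for following the
    leading vehicle at a constant inter-vehicle distance.

    lead_speed: speed of the leading vehicle m.
    gap: current inter-vehicle distance; target_gap: desired distance.
    """
    error = gap - target_gap          # positive: host is too far behind
    speed = lead_speed + k * error    # speed up to close the gap, slow to open it
    return max(0.0, min(speed, max_speed))
```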
  • when the host vehicle M follows the other vehicle m, the other vehicle m, as the leading vehicle, bears a part of the function of the recognition unit 130 of the host vehicle M, so the processing load on the host vehicle M side can be reduced. Further, since the own vehicle M follows the other vehicle m, the air resistance of the own vehicle M decreases, and the fuel efficiency of the own vehicle M can be improved.
  • FIG. 23 is a diagram illustrating an example of a hardware configuration of the automatic driving control device 100 according to the embodiment.
  • the automatic driving control apparatus 100 has a configuration in which a communication controller 100-1, a CPU 100-2, a RAM (Random Access Memory) 100-3, a ROM (Read Only Memory) 100-4, a secondary storage device 100-5 such as a flash memory or an HDD (Hard Disk Drive), and a drive device 100-6 are mutually connected by an internal bus or a dedicated communication line.
  • a portable storage medium such as an optical disk is attached to the drive device 100-6.
  • the program 100-5a stored in the secondary storage device 100-5 is loaded into the RAM 100-3 by a DMA controller (not shown) or the like and executed by the CPU 100-2, whereby the first control unit 120 and the second control unit 160 are realized. Further, the program referred to by the CPU 100-2 may be stored in a portable storage medium attached to the drive device 100-6, or may be downloaded from another device via the network NW.
  • although embodiments for carrying out the present invention have been described above, the present invention is not limited at all by such embodiments, and various modifications and substitutions can be made without departing from the gist of the present invention.
  • the vehicle system 1 of the embodiment described above may be applied to a system that provides driving assistance such as ACC or LKAS.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided is a vehicle control device comprising: an acquisition part for acquiring information indicating the orientation of the face or gaze of an occupant in another vehicle; an assessment part for assessing whether the orientation indicated by the information acquired by the acquisition part corresponds to an orientation that is directed toward a host vehicle (on which the vehicle control device is mounted) as viewed from the other vehicle; and a drive control part for controlling the steering and/or the acceleration of the host vehicle, said drive control part being designed to carry out prescribed control when it has been assessed by the assessment part that the orientation indicated by the information corresponds to the orientation directed toward the host vehicle as viewed from the other vehicle.

Description

Vehicle control device, vehicle control method, and program

 The present invention relates to a vehicle control device, a vehicle control method, and a program.

 A technique is known in which another vehicle is imaged by a camera, it is determined from the captured image whether the driver of the other vehicle is inattentive to the road ahead, and, if so, a warning is issued to the driver of the other vehicle (see, for example, Patent Document 1).

JP 2017-84157 A

 However, the conventional technique has not considered automatically controlling the host vehicle while communicating with the driver of another vehicle.

 The present invention has been made in consideration of such circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a program capable of realizing more flexible automatic driving while communicating with the driver of another vehicle.

 (1): A vehicle control device includes: an acquisition unit that acquires information indicating the direction of the face or line of sight of an occupant of another vehicle; a determination unit that determines whether the direction indicated by the information acquired by the acquisition unit is a direction facing the host vehicle as viewed from the other vehicle; and a driving control unit that controls one or both of steering and acceleration/deceleration of the host vehicle and that performs predetermined control when the determination unit determines that the direction indicated by the information is the direction facing the host vehicle as viewed from the other vehicle.

 (2): In the vehicle control device according to (1), the predetermined control includes at least one of decelerating the host vehicle, changing the lane of the host vehicle, and moving the host vehicle backward.

 (3): The vehicle control device according to (1) further includes a detection unit that detects a steering wheel installed in the other vehicle and detects the direction of the face or line of sight of the occupant closest to the detected steering wheel, and the acquisition unit acquires the detection result of the detection unit as the information indicating the direction of the face or line of sight of the occupant of the other vehicle.

 (4): The vehicle control device according to (1) further includes: a communication unit that performs inter-vehicle communication with the other vehicle; and a communication control unit that, when the determination unit determines that the direction indicated by the information is the direction facing the host vehicle as viewed from the other vehicle, causes the communication unit to start the inter-vehicle communication and transmit predetermined information to the other vehicle.

 (5): In the vehicle control device according to (1), when the determination unit determines that the direction indicated by the information is the direction facing the host vehicle as viewed from the other vehicle, the driving control unit decelerates the host vehicle or changes its lane to another lane.

 (6): A vehicle control method in which an acquisition unit acquires information indicating the direction of the face or line of sight of an occupant of another vehicle; a determination unit determines whether the direction indicated by the acquired information is a direction facing the host vehicle as viewed from the other vehicle; and a driving control unit controls one or both of steering and acceleration/deceleration of the host vehicle, and performs predetermined control when the determination unit determines that the direction indicated by the information is the direction facing the host vehicle as viewed from the other vehicle.

 (7): A program that causes a computer to: acquire information indicating the direction of the face or line of sight of an occupant of another vehicle; determine whether the direction indicated by the acquired information is a direction facing the host vehicle as viewed from the other vehicle; control one or both of steering and acceleration/deceleration of the host vehicle; and perform predetermined control when it is determined that the direction indicated by the information is the direction facing the host vehicle as viewed from the other vehicle.

 According to (1) to (7), more flexible automatic driving can be realized while communicating with the driver of another vehicle.

FIG. 1 is a configuration diagram of a vehicle system 1 using the vehicle control device according to the first embodiment.
FIG. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160.
FIG. 3 is a diagram showing how a target trajectory T is generated based on a recommended lane.
FIG. 4 is a flowchart showing an example of processing executed by the automatic driving control apparatus 100 of the first embodiment.
FIGS. 5 and 6 are diagrams schematically showing the direction D of the face or line of sight of the driver of the other vehicle m.
FIG. 7 is a diagram for explaining the predetermined direction.
FIG. 8 is a diagram showing an example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m may be the predetermined direction.
FIG. 9 is a diagram showing an example of a screen displayed on the display device DP of the other vehicle m.
FIG. 10 is a diagram showing an example of a scene in which the host vehicle M is decelerated.
FIG. 11 is a diagram showing an example of a scene in which the host vehicle M changes lanes.
FIGS. 12 to 16 are diagrams showing other examples of scenes in which the direction D of the face or line of sight of the driver of the other vehicle m may be the predetermined direction.
FIG. 17 is a flowchart showing an example of processing executed by the automatic driving control apparatus 100 of the second embodiment.
FIGS. 18 and 19 are diagrams showing other examples of scenes in which the direction D of the face or line of sight of the driver of the other vehicle m may be the predetermined direction.
FIG. 20 is a flowchart showing an example of processing executed by the automatic driving control apparatus 100 of the third embodiment.
FIGS. 21 and 22 are diagrams showing an example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m is not the predetermined direction.
FIG. 23 is a diagram showing an example of the hardware configuration of the automatic driving control apparatus 100 of the embodiment.

 Hereinafter, embodiments of a vehicle control device, a vehicle control method, and a program according to the present invention will be described with reference to the drawings. In the following embodiments, the vehicle control device is described as being applied to a vehicle capable of automated driving (autonomous driving). Automated driving is, for example, a mode in which the vehicle travels by controlling one or both of its steering and acceleration/deceleration without depending on operations by an occupant on board. Automated driving may include driving support such as ACC (Adaptive Cruise Control) and LKAS (Lane Keeping Assist).

 <First Embodiment>
 [Overall Configuration]
 FIG. 1 is a configuration diagram of the vehicle system 1 using the vehicle control device according to the first embodiment. The vehicle on which the vehicle system 1 is mounted (hereinafter referred to as the host vehicle M) is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its drive source is an internal combustion engine such as a diesel or gasoline engine, an electric motor, or a combination of these. When an electric motor is provided, the motor operates using power generated by a generator coupled to the internal combustion engine, or discharge power of a secondary battery or a fuel cell.

 The vehicle system 1 includes, for example, a camera 10, a radar device 12, a finder 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, vehicle sensors 40, a navigation device 50, an MPU (Map Positioning Unit) 60, driving operators 80, a lamp group 90, a horn (klaxon) 92, an automatic driving control device 100, a traveling driving force output device 200, a brake device 210, and a steering device 220. These devices and pieces of equipment are connected to one another by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in FIG. 1 is merely an example; a part of the configuration may be omitted, and other components may be added.

 The camera 10 is, for example, a digital camera using a solid-state imaging element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). One or more cameras 10 are attached to arbitrary locations on the vehicle on which the vehicle system 1 is mounted (hereinafter referred to as the host vehicle M). When imaging the area ahead, the camera 10 is attached to the upper part of the front windshield, the back surface of the rear-view mirror, or the like. The camera 10, for example, periodically and repeatedly images the surroundings of the host vehicle M. The camera 10 may be a stereo camera.

 The radar device 12 radiates radio waves such as millimeter waves around the host vehicle M, and detects radio waves (reflected waves) reflected by an object to detect at least the position (distance and azimuth) of the object. One or more radar devices 12 are attached to arbitrary locations on the host vehicle M. The radar device 12 may detect the position and speed of an object by an FM-CW (Frequency Modulated Continuous Wave) method.
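The document does not detail the FM-CW computation; as background, a minimal sketch of how a triangular-sweep FM-CW radar recovers range and relative speed from the up-ramp and down-ramp beat frequencies might look as follows (the parameter names and the sign convention are illustrative assumptions, not taken from the document):

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def fmcw_range_velocity(f_beat_up, f_beat_down, bandwidth, sweep_time, carrier):
    """Recover range and radial velocity from FM-CW beat frequencies.

    f_beat_up / f_beat_down: beat frequencies on the up- and down-ramp [Hz]
    bandwidth: frequency sweep bandwidth B [Hz]
    sweep_time: duration of one ramp [s]
    carrier: carrier frequency [Hz]
    Returns (range [m], velocity [m/s]); positive velocity = approaching.
    """
    f_range = (f_beat_up + f_beat_down) / 2.0    # range-induced component
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler component
    rng = C * sweep_time * f_range / (2.0 * bandwidth)
    vel = C * f_doppler / (2.0 * carrier)
    return rng, vel
```

Splitting the two beat frequencies into a sum and a difference is what lets a single sweep yield both quantities, which is why the text says the FM-CW method detects "the position and speed" of an object.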

 The finder 14 is a LIDAR (Light Detection and Ranging) device. The finder 14 irradiates light around the host vehicle M and measures the scattered light. The finder 14 detects the distance to a target based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. One or more finders 14 are attached to arbitrary locations on the host vehicle M.

 The object recognition device 16 performs sensor fusion processing on detection results from some or all of the camera 10, the radar device 12, and the finder 14 to recognize the position, type, speed, and the like of an object. The object recognition device 16 outputs the recognition result to the automatic driving control device 100. The object recognition device 16 may also output the detection results of the camera 10, the radar device 12, and the finder 14 to the automatic driving control device 100 as they are, as necessary.

 The communication device 20 communicates with other vehicles m present around the host vehicle M using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), or DSRC (Dedicated Short Range Communication), or communicates with various server devices via a wireless base station. The other vehicle m may be, like the host vehicle M, a vehicle on which automated driving is performed, or a vehicle that is driven manually; there is no particular restriction. Manual driving, unlike the automated driving described above, means that the acceleration/deceleration and steering of the host vehicle M are controlled according to the occupant's operations on the driving operators 80. The communication device 20 is an example of a "communication unit".

 The HMI 30 presents various types of information to occupants of the host vehicle M and accepts input operations by the occupants. The HMI 30 includes various display devices, speakers, buzzers, touch panels, switches, keys, and the like.

 The vehicle sensors 40 include a vehicle speed sensor that detects the speed of the host vehicle M, an acceleration sensor that detects acceleration, a yaw-rate sensor that detects angular velocity about a vertical axis, an azimuth sensor that detects the orientation of the host vehicle M, and the like.

 The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53, and holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or flash memory. The GNSS receiver 51 identifies the position of the host vehicle M based on signals received from GNSS satellites. The position of the host vehicle M may be identified or supplemented by an INS (Inertial Navigation System) that uses the output of the vehicle sensors 40. The navigation HMI 52 includes a display device, speakers, a touch panel, keys, and the like. The navigation HMI 52 may be partly or wholly shared with the HMI 30 described above. The route determination unit 53 determines, with reference to the first map information 54, a route (hereinafter, on-map route) from the position of the host vehicle M identified by the GNSS receiver 51 (or any input position) to a destination input by an occupant using the navigation HMI 52.
 The first map information 54 is, for example, information in which road shapes are expressed by links representing roads and nodes connected by the links. The first map information 54 may include road curvature, POI (Point Of Interest) information, and the like. The on-map route determined by the route determination unit 53 is output to the MPU 60. The navigation device 50 may also perform route guidance using the navigation HMI 52 based on the on-map route determined by the route determination unit 53. The navigation device 50 may be realized by, for example, the functions of a terminal device such as a smartphone or tablet owned by an occupant. The navigation device 50 may also transmit its current position and destination to a navigation server via the communication device 20 and acquire an on-map route returned from the navigation server.

 The MPU 60 functions as, for example, a recommended lane determination unit 61, and holds second map information 62 in a storage device such as an HDD or flash memory. The recommended lane determination unit 61 divides the route provided from the navigation device 50 into a plurality of blocks (for example, into 100 [m] segments in the vehicle traveling direction), and determines a recommended lane for each block with reference to the second map information 62. The recommended lane determination unit 61 determines, for example, which lane from the left to travel in. When a branch point, a merge point, or the like exists on the route, the recommended lane determination unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for proceeding to the branch destination.
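As an illustration of the block division described above (100 m is the example granularity given in the text; the function name is hypothetical), a minimal sketch might be:

```python
def split_route_into_blocks(route_length_m: float, block_m: float = 100.0):
    """Divide a route of the given length into consecutive blocks.

    Returns (start, end) pairs in meters along the route; the last
    block may be shorter than block_m.
    """
    blocks = []
    start = 0.0
    while start < route_length_m:
        end = min(start + block_m, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks
```

A recommended lane would then be assigned to each block with reference to the second map information 62.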

 The second map information 62 is map information more accurate than the first map information 54. The second map information 62 includes, for example, information on the centers of lanes or information on lane boundaries. The second map information 62 may also include road information, traffic regulation information, address information (addresses and postal codes), facility information, telephone number information, and the like. The second map information 62 may be updated at any time by accessing another device using the communication device 20.

 The driving operators 80 include, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, an odd-shaped steering member, a joystick, and other operators. Sensors that detect the amount of operation, or the presence or absence of operation, are attached to the driving operators 80, and their detection results are output to the automatic driving control device 100, or to some or all of the traveling driving force output device 200, the brake device 210, and the steering device 220.

 The lamp group 90 includes, for example, headlamps installed at the front end of the body of the host vehicle M, rear lamps installed at the rear end, hazard lamps, and the like. The horn 92 emits a sound.

 The automatic driving control device 100 includes, for example, a first control unit 120 and a second control unit 160. Each of the first control unit 120 and the second control unit 160 is realized by, for example, a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (circuitry) such as an LSI (Large Scale Integration) circuit, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation of software and hardware.

 FIG. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130 and an action plan generation unit 140. The recognition unit 130 includes, for example, an other-vehicle occupant gaze direction recognition unit 132. The action plan generation unit 140 includes, for example, an other-vehicle encounter determination unit 142. The combination of the camera 10, the object recognition device 16, and the recognition unit 130 is an example of a "detection unit".

 The first control unit 120 realizes, for example, a function based on AI (Artificial Intelligence) and a function based on a model given in advance, in parallel. For example, the function of "recognizing an intersection" is realized by executing, in parallel, recognition of an intersection by deep learning or the like and recognition based on conditions given in advance (signals that allow pattern matching, road markings, and so on), scoring both, and evaluating them comprehensively. This ensures the reliability of automated driving.
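The document does not specify how the two recognition paths are fused; one simple reading of "scoring both and evaluating comprehensively", sketched below, is a weighted sum of the two confidences against a threshold (the weights, threshold, and function names are illustrative assumptions):

```python
def combined_score(dl_score: float, rule_score: float,
                   w_dl: float = 0.5, w_rule: float = 0.5) -> float:
    """Fuse a deep-learning confidence and a rule-based confidence (both in [0, 1])."""
    return w_dl * dl_score + w_rule * rule_score

def intersection_recognized(dl_score: float, rule_score: float,
                            threshold: float = 0.6) -> bool:
    """Accept the 'intersection' hypothesis only if the fused score clears a threshold."""
    return combined_score(dl_score, rule_score) >= threshold
```

Because both paths must contribute to a high fused score, a spurious detection by either path alone is less likely to trigger an action, which matches the stated reliability goal.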

 The recognition unit 130 recognizes the positions of objects in the vicinity of the host vehicle M, and their states such as speed and acceleration, based on information input from the camera 10, the radar device 12, and the finder 14 via the object recognition device 16. Objects include other vehicles m, stationary obstacles, and the like. The position of an object is recognized, for example, as a position in an absolute coordinate system whose origin is a representative point of the host vehicle M (the center of gravity, the center of the drive axle, or the like), and is used for control. The position of an object may be represented by a representative point such as the center of gravity or a corner of the object, or by an expressed region. The "state" of an object may include its acceleration or jerk, or its "action state" (for example, whether it is changing, or about to change, lanes). The recognition unit 130 also recognizes the shape of a curve that the host vehicle M is about to pass through, based on the image captured by the camera 10. The recognition unit 130 converts the shape of the curve from the captured image of the camera 10 into a real plane and outputs, for example, two-dimensional point sequence information, or information expressed using a model equivalent thereto, to the action plan generation unit 140 as information indicating the shape of the curve.

 The recognition unit 130 also recognizes, for example, the lane in which the host vehicle M is traveling (the traveling lane). For example, the recognition unit 130 recognizes the traveling lane by comparing a pattern of road lane markings obtained from the second map information 62 (for example, an arrangement of solid and broken lines) with a pattern of road lane markings around the host vehicle M recognized from the image captured by the camera 10. The recognition unit 130 is not limited to road lane markings, and may recognize the traveling lane by recognizing runway boundaries (road boundaries) including road lane markings, road shoulders, curbs, median strips, guardrails, and the like. In this recognition, the position of the host vehicle M acquired from the navigation device 50 and the processing result of the INS may be taken into account. The recognition unit 130 also recognizes road markings drawn on the road surface such as stop lines, as well as road signs, obstacles, red traffic lights, tollgates, and other road events.

 When recognizing the traveling lane, the recognition unit 130 recognizes the position and posture of the host vehicle M with respect to the traveling lane. The recognition unit 130 may recognize, for example, the deviation of a reference point of the host vehicle M from the lane center, and the angle formed with a line connecting the lane centers in the traveling direction of the host vehicle M, as the relative position and posture of the host vehicle M with respect to the traveling lane. Alternatively, the recognition unit 130 may recognize the position of the reference point of the host vehicle M with respect to either side end of the traveling lane (a road lane marking or a road boundary), or the like, as the relative position of the host vehicle M with respect to the traveling lane.
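The lateral deviation and angle described above can be illustrated geometrically. The sketch below assumes the lane center is locally approximated by a nearest point and a heading; all names are hypothetical:

```python
import math

def lane_relative_pose(vehicle_xy, vehicle_heading, center_xy, center_heading):
    """Return (lateral_offset, heading_error) of the vehicle w.r.t. the lane center.

    lateral_offset: signed distance from the lane-center line [m],
                    positive to the left of the direction of travel.
    heading_error:  vehicle heading minus lane heading, wrapped to [-pi, pi).
    """
    dx = vehicle_xy[0] - center_xy[0]
    dy = vehicle_xy[1] - center_xy[1]
    # Project the offset onto the lane's left-pointing normal vector
    lateral = -dx * math.sin(center_heading) + dy * math.cos(center_heading)
    err = (vehicle_heading - center_heading + math.pi) % (2.0 * math.pi) - math.pi
    return lateral, err
```

These two scalars are exactly the kind of "relative position and posture" a downstream controller can drive toward zero.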

 In the above recognition processing, the recognition unit 130 may also derive a recognition accuracy and output it to the action plan generation unit 140 as recognition accuracy information. For example, the recognition unit 130 generates the recognition accuracy information based on the frequency with which road lane markings could be recognized over a fixed period.
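One simple way to turn "frequency of successful recognition over a fixed period" into a number is a rolling window of per-frame results; the window size and class name below are illustrative assumptions:

```python
from collections import deque

class LaneMarkingRecognitionMonitor:
    """Track how often lane markings were recognized over the last N frames."""

    def __init__(self, window: int = 100):
        self._results = deque(maxlen=window)  # old frames drop out automatically

    def update(self, recognized: bool) -> None:
        self._results.append(recognized)

    def accuracy(self) -> float:
        """Fraction of recent frames in which markings were recognized."""
        if not self._results:
            return 0.0
        return sum(self._results) / len(self._results)
```

The resulting fraction is a natural candidate for the recognition accuracy information passed to the action plan generation unit 140.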

 The other-vehicle occupant gaze direction recognition unit 132 of the recognition unit 130 identifies an other vehicle m from among the recognized objects and recognizes an occupant on board that other vehicle m. The other-vehicle occupant gaze direction recognition unit 132 then recognizes the direction of the face, or the direction of the line of sight, of the recognized occupant of the other vehicle m. For example, the other-vehicle occupant gaze direction recognition unit 132 recognizes the direction of the occupant's face or line of sight based on the positions of the occupant's pupils, irises, and inner eye corners, the regions of skin and hair, and the like.

 The action plan generation unit 140, in principle, travels in the recommended lane determined by the recommended lane determination unit 61 and, furthermore, determines events to be sequentially executed in automated driving so as to respond to the surrounding conditions of the host vehicle M. The events include, for example, a constant-speed traveling event for traveling in the same lane at a constant speed, a following traveling event for following a preceding vehicle, an overtaking event for overtaking a preceding vehicle, an avoidance event for braking and/or steering to avoid approaching an obstacle, a curve traveling event for traveling around a curve, a passing event for passing a predetermined point such as an intersection, a pedestrian crossing, or a railroad crossing, a lane change event, a merging event, a branching event, an automatic stop event, and a takeover event for ending automated driving and switching to manual driving. "Following" is a traveling mode in which the speed is controlled so as to keep the inter-vehicle distance (relative distance) between the preceding vehicle and the host vehicle M constant.

 The action plan generation unit 140 generates a target trajectory T along which the host vehicle M will travel in the future, according to the activated event. The target trajectory T contains, for example, a speed element. For example, the target trajectory T is expressed as a sequence of points (trajectory points) to be reached by the host vehicle M. A trajectory point is a point to be reached by the host vehicle M at every predetermined traveling distance along the road (for example, on the order of several meters); separately from this, a target speed and a target acceleration for every predetermined sampling time (for example, a few tenths of a second) are generated as part of the target trajectory T. A trajectory point may also be a position to be reached by the host vehicle M at each sampling time for every predetermined sampling time. In this case, the information on the target speed and target acceleration is expressed by the interval between trajectory points.
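Representing a trajectory point and its accompanying speed element as data, a minimal sketch might look as follows (constant-acceleration sampling is only an illustration; all names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class TrajectoryPoint:
    s: float  # distance along the path to be reached [m]
    v: float  # target speed at this point [m/s]
    a: float  # target acceleration at this point [m/s^2]

def sample_trajectory(v0: float, accel: float, dt: float, n: int):
    """Generate n trajectory points sampled every dt seconds under constant acceleration."""
    points, s, v = [], 0.0, v0
    for _ in range(n):
        points.append(TrajectoryPoint(s, v, accel))
        s += v * dt + 0.5 * accel * dt * dt  # kinematic position update
        v += accel * dt
    return points
```

In the time-sampled variant described in the text, the speed information would instead be implicit in the spacing between consecutive points.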

 FIG. 3 is a diagram showing how the target trajectory T is generated based on the recommended lane. As shown, the recommended lane is set so as to be convenient for traveling along the route to the destination. When the host vehicle M comes within a predetermined distance (which may be determined according to the type of event) of a recommended-lane switching point, the action plan generation unit 140 activates a passing event, a lane change event, a branching event, a merging event, or the like. If it becomes necessary to avoid an obstacle during execution of an event, an avoidance trajectory is generated as shown in the figure.

 The other-vehicle encounter determination unit 142 of the action plan generation unit 140 acquires, as a recognition result from the recognition unit 130, information indicating the direction of the face or line of sight of the occupant of the other vehicle m (hereinafter, gaze direction information). Then, the other-vehicle encounter determination unit 142, for example, refers to the acquired gaze direction information and determines whether the direction indicated by that information is a predetermined direction. The other-vehicle encounter determination unit 142 is an example of the "acquisition unit" in the claims.
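The excerpt leaves the "predetermined direction" test abstract (it is explained with reference to FIG. 7). One plausible geometric reading, sketched here purely for illustration, is to check whether the recognized gaze angle D points toward the host vehicle within some tolerance; the tolerance value and all names are assumptions:

```python
import math

def gaze_points_at_host(host_xy, driver_xy, gaze_angle,
                        tolerance=math.radians(15.0)) -> bool:
    """True if gaze direction D is within `tolerance` of the bearing to the host vehicle.

    host_xy / driver_xy: positions in a common planar frame [m]
    gaze_angle: recognized face/line-of-sight direction D [rad]
    """
    bearing = math.atan2(host_xy[1] - driver_xy[1], host_xy[0] - driver_xy[0])
    diff = (gaze_angle - bearing + math.pi) % (2.0 * math.pi) - math.pi  # wrap to [-pi, pi)
    return abs(diff) <= tolerance
```

Under this reading, the determination unit would treat "the other driver has (or has not) noticed the host vehicle" as a binary input to the planning step that follows.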

 When the other-vehicle encounter determination unit 142 determines that the direction of the occupant's face or line of sight is the predetermined direction, the action plan generation unit 140 generates a new target trajectory T and outputs it to the second control unit 160.

 Returning to the description of FIG. 2, the second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The combination of the action plan generation unit 140, the speed control unit 164, and the steering control unit 166 is an example of a "driving control unit".

 The acquisition unit 162 acquires the information on the target trajectory T (trajectory points) generated by the action plan generation unit 140 and stores it in a memory.

 The speed control unit 164 and the steering control unit 166 control the traveling driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes along the target trajectory T generated by the action plan generation unit 140 at the scheduled times.

 For example, the speed control unit 164 controls the traveling driving force output device 200 or the brake device 210 based on the speed element accompanying the target trajectory T stored in the memory. The steering control unit 166 controls the steering device 220 according to the degree of bending (for example, the curvature) of the target trajectory T stored in the memory. The processing of the speed control unit 164 and the steering control unit 166 is realized by, for example, a combination of feedforward control and feedback control. As an example, the steering control unit 166 executes a combination of feedforward control according to the curvature of the road ahead of the host vehicle M and feedback control based on the deviation from the target trajectory T.
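A minimal sketch of such a steering law, combining a curvature feedforward term (kinematic bicycle model) with proportional feedback on the deviation from the target trajectory, might look as follows; the wheelbase value and the gains are illustrative assumptions, not taken from the document:

```python
import math

def steering_angle(curvature: float, lateral_error: float, heading_error: float,
                   wheelbase: float = 2.7, k_lat: float = 0.3,
                   k_head: float = 0.8) -> float:
    """Front-wheel steering angle [rad].

    curvature: curvature of the target trajectory ahead [1/m]
    lateral_error: signed offset from the target trajectory [m]
    heading_error: heading difference from the trajectory tangent [rad]
    """
    feedforward = math.atan(wheelbase * curvature)               # tracks path curvature
    feedback = -k_lat * lateral_error - k_head * heading_error   # corrects deviation
    return feedforward + feedback
```

The feedforward term does most of the work on a known curve, while the feedback term absorbs disturbances and model error, which is the usual motivation for combining the two.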

 The device control unit 170 acquires the determination result of the other-vehicle encounter determination unit 142 and, according to this determination result, outputs a command to the communication device 20 to cause the communication device 20 to perform vehicle-to-vehicle communication with the other vehicle m. The device control unit 170 may also operate the lamp group 90 or the horn 92 according to the determination result of the other-vehicle encounter determination unit 142. The device control unit 170 is an example of a "communication control unit".

 The traveling driving force output device 200 outputs a traveling driving force (torque) for the vehicle to travel to the drive wheels. The traveling driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, and a transmission, and an ECU (Electronic Control Unit) that controls them. The ECU controls the above configuration according to information input from the second control unit 160 or information input from the driving operators 80.

 The brake device 210 includes, for example, a brake caliper, a cylinder that transmits hydraulic pressure to the brake caliper, an electric motor that generates hydraulic pressure in the cylinder, and a brake ECU. The brake ECU controls the electric motor according to information input from the second control unit 160 or information input from the driving operators 80 so that a brake torque corresponding to the braking operation is output to each wheel. The brake device 210 may include, as a backup, a mechanism that transmits hydraulic pressure generated by operation of the brake pedal included in the driving operators 80 to the cylinder via a master cylinder. The brake device 210 is not limited to the configuration described above, and may be an electronically controlled hydraulic brake device that controls an actuator according to information input from the second control unit 160 and transmits the hydraulic pressure of the master cylinder to the cylinder.

 The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor changes the direction of the steered wheels by, for example, applying force to a rack-and-pinion mechanism. The steering ECU drives the electric motor according to information input from the second control unit 160 or information input from the driving operators 80 to change the direction of the steered wheels.

 [処理フロー]
 図4は、第1実施形態の自動運転制御装置100により実行される処理の一例を示すフローチャートである。本フローチャートの処理は、例えば、認識部130によって他車両mが認識されると実行される。また、本フローチャートの処理は、例えば、所定の周期で繰り返し実行されてよい。
Processing flow
FIG. 4 is a flowchart showing an example of processing executed by the automatic driving control apparatus 100 according to the first embodiment. The processing of this flowchart is executed, for example, when the recognition unit 130 recognizes the other vehicle m. Also, the processing of this flowchart may be repeatedly executed, for example, at a predetermined cycle.

 まず、他車両乗員注視方向認識部132は、認識した他車両mの運転者を認識する(ステップS100)。運転者とは、例えば、車両に搭乗した一人以上の乗員のうち、ステアリングホイールに最も近い座席に着座した乗員である。 First, the other vehicle occupant gaze direction recognition unit 132 recognizes the driver of the recognized other vehicle m (step S100). The driver is, for example, the occupant seated in the seat closest to the steering wheel among the one or more occupants aboard the vehicle.

 例えば、他車両乗員注視方向認識部132は、認識した他車両mのウィンドウ越しに乗員を認識した場合、その乗員が着座する座席の前にステアリングホイールが存在しているか否かを判定する。他車両乗員注視方向認識部132は、認識した乗員が着座している座席の前にステアリングホイールが存在していると判定した場合、その乗員を他車両mの運転者として認識する。 For example, when the other vehicle occupant gaze direction recognition unit 132 recognizes an occupant through a window of the recognized other vehicle m, it determines whether a steering wheel is present in front of the seat in which that occupant sits. When it determines that a steering wheel is present in front of the seat in which the recognized occupant is seated, the other vehicle occupant gaze direction recognition unit 132 recognizes that occupant as the driver of the other vehicle m.
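The driver-identification rule of step S100 described above can be sketched as follows. This is a minimal sketch; the embodiment does not specify an implementation, and the coordinate representation, tolerance value, and all names below are illustrative assumptions.

```python
def identify_driver(occupants, steering_wheel_pos, lateral_tol=0.5):
    """Pick as the driver the occupant seated directly behind the detected
    steering wheel (step S100). Positions are (x_lateral, y_longitudinal)
    tuples in the other vehicle's frame, with larger y toward the front;
    occupants is a list of dicts with a 'pos' key. The frame convention and
    tolerance are hypothetical."""
    candidates = [o for o in occupants
                  if abs(o["pos"][0] - steering_wheel_pos[0]) <= lateral_tol
                  and o["pos"][1] < steering_wheel_pos[1]]  # seated behind the wheel
    if not candidates:
        return None
    # the occupant nearest the wheel among the candidates is taken as the driver
    return min(candidates, key=lambda o: steering_wheel_pos[1] - o["pos"][1])
```

In practice the positions would come from the recognition unit's image processing; this sketch only shows the seat-versus-wheel geometry test.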

 次に、他車両乗員注視方向認識部132は、認識した運転者の顔または視線の向きDを認識する(ステップS102)。 Next, the other vehicle occupant's gaze direction recognition unit 132 recognizes the direction D of the recognized driver's face or gaze (step S102).

 図5および図6は、他車両mの運転者の顔または視線の向きDを模式的に示す図である。図中STは、ステアリングホイールを表し、DPは、後述する表示装置を表している。図示の例のように、他車両mの運転者が、サイドウィンドウ越しに車両の周囲を監視する場合、その運転者の顔の少なくとも一部は車外から視認することができるようになる。 FIGS. 5 and 6 are diagrams schematically showing the direction D of the face or line of sight of the driver of the other vehicle m. In the figures, ST represents a steering wheel, and DP represents a display device described later. As in the illustrated example, when the driver of the other vehicle m monitors the surroundings of the vehicle through a side window, at least a part of the driver's face becomes visible from outside the vehicle.

 次に、他車両遭遇時判定部142は、他車両乗員注視方向認識部132から、認識結果として注視方向情報を取得し、その注視方向情報を基に、他車両mの運転者の顔または視線の向きDが所定の方向であるか否かを判定する(ステップS104)。所定の方向とは、例えば、他車両mから見て自車両Mの方を向いている方向である。 Next, the other vehicle encounter time determination unit 142 acquires gaze direction information as a recognition result from the other vehicle occupant gaze direction recognition unit 132, and determines, based on the gaze direction information, whether the direction D of the face or line of sight of the driver of the other vehicle m is a predetermined direction (step S104). The predetermined direction is, for example, a direction facing the host vehicle M as viewed from the other vehicle m.

 図7は、所定の方向について説明するための図である。例えば、各車両を上から見たときの仮想的な二次元空間において、他車両mの運転者の顔または視線の向きDを基準とした所定角度幅θの範囲内に自車両Mが存在する場合、他車両遭遇時判定部142は、他車両mの運転者の顔または視線の向きDが所定の方向であると判定し、所定角度幅θの範囲内に自車両Mが存在しない場合、他車両mの運転者の顔または視線の向きDが所定の方向でないと判定する。 FIG. 7 is a diagram for describing the predetermined direction. For example, in a virtual two-dimensional space in which each vehicle is viewed from above, when the host vehicle M exists within a range of a predetermined angle width θ centered on the direction D of the face or line of sight of the driver of the other vehicle m, the other vehicle encounter time determination unit 142 determines that the direction D is the predetermined direction; when the host vehicle M does not exist within the range of the predetermined angle width θ, it determines that the direction D is not the predetermined direction.
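The geometric test of step S104 described above can be sketched as follows. This is a minimal sketch under the top-down two-dimensional representation of FIG. 7; the function name, angle conventions, and use of Python are illustrative assumptions, not part of the disclosure.

```python
import math

def is_gaze_toward_own_vehicle(driver_pos, gaze_dir_rad, own_pos, theta_rad):
    """Return True when the host vehicle M lies within the angular range of
    width theta centered on the gaze direction D, in a top-down 2D frame.
    driver_pos / own_pos are (x, y) positions; gaze_dir_rad is the direction
    D in radians measured in the same frame."""
    dx = own_pos[0] - driver_pos[0]
    dy = own_pos[1] - driver_pos[1]
    bearing = math.atan2(dy, dx)  # direction from the other vehicle m toward M
    diff = bearing - gaze_dir_rad
    diff = math.atan2(math.sin(diff), math.cos(diff))  # wrap to [-pi, pi]
    return abs(diff) <= theta_rad / 2.0
```

The angle wrap via `atan2` keeps the comparison correct even when the two directions straddle the ±π boundary.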

 図8は、他車両mの運転者の顔または視線の向きDが所定の方向となり得る場面の一例を示す図である。図の例では、合流路を上から見たときの場面を表しており、合流路における本線は車線LN1およびLN2であり、合流車線は車線LN3である。例えば、合流路LN3を走行している他車両mは、合流路LN3から本線LN1へと車線変更する必要があるため、車体を車幅方向に傾けることで、本線LN1を走行している車両に合流する意思をアピールすることがしばしば行われる。この結果、自車両Mから見て他車両mの運転者を認識することが可能となる。このとき、他車両mの運転者は、本線LN1への車線変更のタイミングを決定するために、必然的に、本線LN1において、後方から直進してくる他車両(自車両Mも含む)を注視する。従って、他車両mの運転者の顔や視線の向きDは所定の方向であると判定され易くなる。 FIG. 8 is a diagram showing an example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m can be the predetermined direction. The illustrated example shows a merging section viewed from above; the main line consists of lanes LN1 and LN2, and the merging lane is lane LN3. For example, since the other vehicle m traveling in the merging lane LN3 needs to change lanes from the merging lane LN3 into the main line LN1, its driver often tilts the vehicle body in the vehicle width direction to signal, to vehicles traveling on the main line LN1, an intention to merge. As a result, the driver of the other vehicle m becomes recognizable from the host vehicle M. At this time, in order to decide the timing of the lane change to the main line LN1, the driver of the other vehicle m inevitably gazes at vehicles (including the host vehicle M) coming straight from behind on the main line LN1. Therefore, the direction D of the face or line of sight of the driver of the other vehicle m is likely to be determined to be the predetermined direction.

 他車両遭遇時判定部142によって、他車両mの運転者の顔または視線の向きDが所定の方向であると判定された場合、機器制御部170は、通信装置20に、運転者が認識された他車両mと車車間通信を開始させる(ステップS106)。 When the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the device control unit 170 causes the communication device 20 to start inter-vehicle communication with the other vehicle m whose driver was recognized (step S106).

 例えば、機器制御部170は、通信装置20を制御して、車車間通信の通信相手である他車両mに、所定の情報を送信する。所定の情報は、例えば、自車両Mが、他車両mの運転者の顔または視線の向きDを認識した結果、どういう行動を起こす予定であるのかを通知するための情報である。例えば、所定の情報を受けた他車両mは、車室内に設置された表示装置DPやスピーカを介して所定の情報を出力する。なお、機器制御部170は、他車両mとの間で車車間通信が確立しない場合、所定の情報を送信する処理を省略してよい。 For example, the device control unit 170 controls the communication device 20 to transmit predetermined information to another vehicle m which is a communication partner of inter-vehicle communication. The predetermined information is, for example, information for notifying what kind of action is to be taken as a result of the own vehicle M recognizing the direction D of the face or the line of sight of the driver of the other vehicle m. For example, the other vehicle m which has received the predetermined information outputs the predetermined information via the display device DP or the speaker installed in the vehicle compartment. The device control unit 170 may omit the process of transmitting predetermined information when inter-vehicle communication is not established with the other vehicle m.
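The "predetermined information" transmitted in step S106 is not given a concrete format in the embodiment. A hedged sketch of how such a notification might be encoded for inter-vehicle communication is shown below; the JSON schema, field names, and action labels are all hypothetical examples.

```python
import json

def build_notification(action, own_vehicle_id):
    """Encode the predetermined information (e.g. that the host vehicle M
    will accept the other vehicle m cutting in ahead) as a JSON message for
    inter-vehicle communication. The schema here is an assumption; the
    embodiment only requires that the other vehicle can present the
    information on its display device DP or speaker."""
    return json.dumps({
        "sender": own_vehicle_id,
        "type": "intent_notification",
        "action": action,  # e.g. "yield_merge", "decelerate", "reverse"
    })
```

On the receiving side, the other vehicle m would decode such a message and render it on the display device DP, as in FIG. 9.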

 図9は、他車両mの表示装置DPに表示させる画面の一例を示す図である。例えば、自車両M側において、他車両mの運転者の顔または視線の向きDが所定の方向であると判定された場合、すなわち、他車両mが合流の意思を示しているものと判断された場合、図示の例のように、表示装置DPの画面には、所定の情報として、自車両Mが、他車両mによる前方への割込みを受け入れていることを示す情報が表示される。これによって、他車両mの運転者は、自車両Mの行動を予期した上で車両を運転することができる。 FIG. 9 is a diagram showing an example of a screen displayed on the display device DP of the other vehicle m. For example, when it is determined on the host vehicle M side that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, that is, when it is determined that the other vehicle m is indicating an intention to merge, information indicating that the host vehicle M accepts the other vehicle m cutting in ahead of it is displayed on the screen of the display device DP as the predetermined information, as in the illustrated example. This allows the driver of the other vehicle m to drive while anticipating the behavior of the host vehicle M.

 また、機器制御部170は、他車両遭遇時判定部142によって、他車両mの運転者の顔または視線の向きDが所定の方向であると判定された場合、ランプ群90のヘッドランプを点滅させたり、ハイビームにして点灯させたり、或いはハザードランプを点滅または点灯させたりする。また、機器制御部170は、他車両遭遇時判定部142によって、他車両mの運転者の顔または視線の向きDが所定の方向であると判定された場合、クラクション92を鳴らしてもよい。このような制御によって、他車両mの運転者に所定の情報を通知する。 Further, when the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the device control unit 170 flashes the headlamps of the lamp group 90, turns them on in high beam, or flashes or turns on the hazard lamps. The device control unit 170 may also sound the horn 92 when the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction. Through such control, the driver of the other vehicle m is notified of the predetermined information.

 次に、行動計画生成部140は、他車両遭遇時判定部142により他車両mの運転者の顔または視線の向きDが所定の方向であると判定された場合、他車両mが自車線である本線LN1に合流する意思を表していると判断し、他車両mによる前方への割込みを受け入れるために、減速または車線変更を実現するための目標軌道Tを生成する(ステップS108)。そして、速度制御部164および操舵制御部166の一方または双方は、行動計画生成部140により新たに生成された目標軌道Tに基づいて、各制御対象を制御する。自車両Mを減速または車線変更させる制御は、「所定の制御」の一例である。 Next, when the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the action plan generation unit 140 determines that the other vehicle m is expressing an intention to merge into the main line LN1, which is the host vehicle's lane, and generates a target trajectory T for realizing deceleration or a lane change in order to accept the other vehicle m cutting in ahead (step S108). Then, one or both of the speed control unit 164 and the steering control unit 166 control the respective control targets based on the target trajectory T newly generated by the action plan generation unit 140. The control of decelerating the host vehicle M or causing it to change lanes is an example of the "predetermined control."

 図10は、自車両Mを減速させる場面の一例を示す図である。例えば、他車両mの運転者の顔または視線の向きDが所定の方向である場合、行動計画生成部140は、以前に生成した目標軌道Tの速度要素として定められた目標速度VMよりも小さい目標速度VM#を含む目標軌道Tを新たに生成する。「以前」とは、例えば、認識部130によって他車両mの運転者が認識されるよりも前のタイミングである。これを受けて、速度制御部164は、新たに生成された目標軌道に速度要素として含まれる目標速度VM#に基づいて、走行駆動力出力装置200またはブレーキ装置210を制御し、自車両Mを減速させる。これによって、自車両Mの前方にスペースが形成され、自車両Mの方向を見ている他車両mに合流を促すことができる。 FIG. 10 is a diagram showing an example of a scene in which the host vehicle M is decelerated. For example, when the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the action plan generation unit 140 newly generates a target trajectory T including a target speed VM# that is lower than the target speed VM defined as the speed element of the previously generated target trajectory T. "Previously" means, for example, a timing before the recognition unit 130 recognized the driver of the other vehicle m. In response, the speed control unit 164 controls the traveling driving force output device 200 or the brake device 210 based on the target speed VM# included as the speed element of the newly generated target trajectory, thereby decelerating the host vehicle M. As a result, a space is formed in front of the host vehicle M, which can prompt the other vehicle m, whose driver is looking toward the host vehicle M, to merge.
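The regeneration of the target trajectory with a reduced speed element (VM# < VM) can be sketched as follows. The list-of-tuples trajectory representation and the reduction factor are illustrative assumptions; the embodiment only specifies that the new trajectory's speed element is smaller than before while the geometry is preserved.

```python
def decelerated_trajectory(trajectory, factor=0.7):
    """Return a new target trajectory T whose speed element at every
    trajectory point is reduced (VM# < VM), leaving the trajectory
    geometry unchanged. A trajectory is modeled here, as an assumption,
    as a list of (x, y, target_speed) tuples; 'factor' is an arbitrary
    example value in (0, 1)."""
    assert 0.0 < factor < 1.0
    return [(x, y, v * factor) for (x, y, v) in trajectory]
```

The speed control unit would then track the reduced speed element while the steering control unit follows the unchanged geometry.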

 図11は、自車両Mを車線変更させる場面の一例を示す図である。例えば、他車両mの運転者の顔または視線の向きDが所定の方向である場合、行動計画生成部140は、合流車線LN3に隣接する車線LN1から、合流車線LN3に隣接せず車線LN1に隣接する車線LN2へと至る目標軌道Tを生成する。そして、操舵制御部166は、新たに生成された目標軌道Tの曲がり具合に基づいて、ステアリング装置220を制御し、自車両Mを車線LN2へと車線変更させる。これによって、少なくとも自車両M分のスペースが本線LN1に形成されるため、自車両Mの方向を見ている他車両mに合流を促すことができる。 FIG. 11 is a diagram showing an example of a scene in which the host vehicle M changes lanes. For example, when the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the action plan generation unit 140 generates a target trajectory T leading from lane LN1, which is adjacent to the merging lane LN3, to lane LN2, which is not adjacent to the merging lane LN3 but is adjacent to lane LN1. Then, the steering control unit 166 controls the steering device 220 based on the curvature of the newly generated target trajectory T, causing the host vehicle M to change lanes to lane LN2. As a result, a space at least as large as the host vehicle M is formed on the main line LN1, which can prompt the other vehicle m, whose driver is looking toward the host vehicle M, to merge.

 一方、行動計画生成部140は、他車両遭遇時判定部142によって、他車両mの運転者の顔または視線の向きDが所定の方向でないと判定された場合、以前に生成した目標軌道Tと同じ目標軌道Tを生成し、速度制御部164および操舵制御部166に参照させる目標軌道Tを新たなものに変更しない(ステップS110)。「以前に生成した目標軌道Tと同じ」とは、例えば、速度要素として定められた目標速度と、目標軌道そのものの曲がり具合(軌道点同士の相対的な位置関係)とが略同じことである。これを受けて、速度制御部164は、以前と同じように走行駆動力出力装置200またはブレーキ装置210を制御し、操舵制御部166は、以前と同じようにステアリング装置220を制御する。これによって、自車両Mは、車線LN1を維持しながら走行する。 On the other hand, when the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is not the predetermined direction, the action plan generation unit 140 generates the same target trajectory T as the previously generated target trajectory T, and does not change the target trajectory T referred to by the speed control unit 164 and the steering control unit 166 to a new one (step S110). "The same as the previously generated target trajectory T" means, for example, that the target speed defined as the speed element and the curvature of the trajectory itself (the relative positional relationship between trajectory points) are substantially the same. In response, the speed control unit 164 controls the traveling driving force output device 200 or the brake device 210 as before, and the steering control unit 166 controls the steering device 220 as before. As a result, the host vehicle M travels while keeping to lane LN1.

 なお、S102の処理において、認識部130が、他車両mの運転者の顔または視線の向きDを認識できなかった場合、行動計画生成部140は、S110の処理を行ってよい。これによって、本フローチャートの処理が終了する。 In the process of S102, when the recognition unit 130 cannot recognize the direction D of the face or line of sight of the driver of the other vehicle m, the action plan generation unit 140 may perform the process of S110. The processing of this flowchart then ends.

 なお、上述した第1実施形態では、他車両mの運転者の顔または視線の向きDが所定の方向となり得る場面の一例として、合流路の場面を説明したがこれに限られず、例えば、丁字路のような三叉路交差点(3-WAY JUNCTION)やその他の交通場面であってよい。 In the first embodiment described above, the merging-section scene was described as an example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m can be the predetermined direction; however, the present invention is not limited thereto, and the scene may be, for example, a three-way junction such as a T-junction, or another traffic scene.

 図12および図13は、他車両mの運転者の顔または視線の向きDが所定の方向となり得る場面の他の例を示す図である。図示の例では、丁字路を上から見たときの場面を表しており、他車両mが右折しようと試みている。例えば、他車両mが存在する車線LN3は、自車両Mが存在する車線LN1に対して突き当たる車線であり、一時停止規制が交通ルールとして適用される場合がある。この場合、右折を行う予定の他車両mは、一時停止線SLにて一時停止し、優先道路である車線LN1を走行する自車両Mが通過するのを待機する必要がある。一方で、自車両Mの自動運転制御装置100は、他車線LN3において他車両mを認識すると、この他車両mの運転者の顔または視線の向きDを認識する。自動運転制御装置100は、認識した運転者の顔または視線の向きDが所定の方向であるか否かを判定し、向きDが所定の方向であれば、他車両mに所定の情報を送信すると共に、自車両Mを減速させる。これによって、他車両mの乗員とコミュニケーションを取りながら、より柔軟に自動運転を行うことができる。 FIGS. 12 and 13 are diagrams showing another example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m can be the predetermined direction. The illustrated example shows a T-junction viewed from above, in which the other vehicle m is attempting to turn right. For example, lane LN3, in which the other vehicle m is present, is a lane that terminates at lane LN1, in which the host vehicle M is present, and a stop regulation may apply there as a traffic rule. In this case, the other vehicle m, which plans to turn right, must stop at the stop line SL and wait for the host vehicle M traveling in lane LN1, which is a priority road, to pass. Meanwhile, when the automatic driving control apparatus 100 of the host vehicle M recognizes the other vehicle m in lane LN3, it recognizes the direction D of the face or line of sight of the driver of the other vehicle m. The automatic driving control apparatus 100 determines whether the recognized direction D of the driver's face or line of sight is the predetermined direction and, if it is, transmits the predetermined information to the other vehicle m and decelerates the host vehicle M. This enables more flexible automated driving while communicating with the occupant of the other vehicle m.

 なお、上述した例では、丁字路において、自動運転制御装置100は、他車両mが右折して優先道路の車線LN2に車線変更する場合、自車両Mを減速させるものとして説明したが、他車両mが左折して優先道路の車線LN1に車線変更する場合、合流路の場面のときと同様に、自車両Mを車線LN1に隣接する車線LN2に車線変更させてもよい。 In the example described above, the automatic driving control apparatus 100 was described as decelerating the host vehicle M when, at the T-junction, the other vehicle m turns right and changes lanes into lane LN2 of the priority road; however, when the other vehicle m turns left and changes lanes into lane LN1 of the priority road, the host vehicle M may be made to change lanes into lane LN2 adjacent to lane LN1, as in the merging-section scene.

 また、他車両mの運転者の顔または視線の向きDが所定の方向となり得る場面は、自車両Mが他車両mに追従する場面(追従走行イベントが起動される場面)であってもよい。 A scene in which the direction D of the face or line of sight of the driver of the other vehicle m can be the predetermined direction may also be a scene in which the host vehicle M follows the other vehicle m (a scene in which a following travel event is activated).

 図14から図16は、他車両mの運転者の顔または視線の向きDが所定の方向となり得る場面の他の例を示す図である。図示の例では、自車両Mが前走車両である他車両mとの車両間距離を一定に保つように他車両mに追従している。このような追従走行下では、他車両mの運転者が不快に感じる車両後方の領域に自車両Mが進入する場合があり、他車両mの運転者が後方に顔または視線を向ける場合がある。この場合、他車両遭遇時判定部142は、他車両mの運転者の顔または視線の向きDが所定の方向であると判定する。この場合、機器制御部170が他車両mの運転者に所定の情報を通知すると共に、行動計画生成部140が、自車両Mを減速させるための目標軌道Tを生成したり(図15参照)、自車両Mを隣接車線に車線変更させるための目標軌道Tを生成したりする(図16参照)。これによって、他車両mの乗員とコミュニケーションを取りながら、より柔軟に自動運転を行うことができる。 FIGS. 14 to 16 are diagrams showing another example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m can be the predetermined direction. In the illustrated example, the host vehicle M follows the other vehicle m, which is the preceding vehicle, so as to keep the inter-vehicle distance constant. During such following travel, the host vehicle M may enter a region behind the other vehicle m that its driver finds uncomfortable, and the driver of the other vehicle m may turn his or her face or line of sight rearward. In this case, the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction. The device control unit 170 then notifies the driver of the other vehicle m of the predetermined information, and the action plan generation unit 140 generates a target trajectory T for decelerating the host vehicle M (see FIG. 15) or a target trajectory T for causing the host vehicle M to change lanes to an adjacent lane (see FIG. 16). This enables more flexible automated driving while communicating with the occupant of the other vehicle m.

 また、上述した第1実施形態では、他車両mの運転者の顔または視線の向きDが自車両M側で認識されるものとして説明したがこれに限られず、他車両mの運転者の顔または視線の向きDは他車両m側で認識されてもよい。例えば、他車両mの車室内には、乗員監視用のカメラが設置される場合がある。この乗員監視用のカメラは、他車両mの運転者を画角に収めるように設置される。このような場合、機器制御部170は、通信装置20を制御して、他車両mから、乗員監視用のカメラによって撮像された画像を受信する。そして、認識部130は、他車両mから受信した画像に対して画像処理を行って、他車両mの運転者の顔または視線の向きDを認識する。また、他車両m側において、乗員監視用のカメラによって撮像された画像から運転者の顔または視線の向きDが認識されてもよい。この場合、注視方向情報が他車両mから自車両Mに送信される。 In the first embodiment described above, the direction D of the face or line of sight of the driver of the other vehicle m was described as being recognized on the host vehicle M side; however, the present invention is not limited thereto, and the direction D may be recognized on the other vehicle m side. For example, a camera for monitoring occupants may be installed in the cabin of the other vehicle m, positioned so that the driver of the other vehicle m is within its angle of view. In such a case, the device control unit 170 controls the communication device 20 to receive, from the other vehicle m, an image captured by the occupant monitoring camera. The recognition unit 130 then performs image processing on the image received from the other vehicle m to recognize the direction D of the face or line of sight of the driver of the other vehicle m. Alternatively, the direction D of the driver's face or line of sight may be recognized on the other vehicle m side from the image captured by the occupant monitoring camera; in this case, the gaze direction information is transmitted from the other vehicle m to the host vehicle M.

 また、上述した第1実施形態では、他車両mの運転者の顔または視線の向きDが所定の方向である場合、他車両mの運転者が自車両Mを見ている蓋然性が高いため、他車両mの運転者に合流や車線変更の意思があるものと見做して、自車両Mを減速させたり他車線に車線変更させたりすることで他車両mに道を譲るためにスペースを確保するものとして説明し、他車両mの運転者の顔または視線の向きDが所定の方向でない場合、他車両mの運転者が自車両Mを見ていない蓋然性が高いため、他車両mの運転者に合流や車線変更の意思がないものと見做して、現状の自車両Mの走行を維持させるものとして説明したがこれに限られない。 In the first embodiment described above, when the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the driver of the other vehicle m is highly likely to be looking at the host vehicle M; the driver is therefore deemed to intend to merge or change lanes, and the host vehicle M secures space to yield to the other vehicle m by decelerating or changing lanes. When the direction D is not the predetermined direction, the driver of the other vehicle m is highly likely not to be looking at the host vehicle M; the driver is therefore deemed not to intend to merge or change lanes, and the host vehicle M maintains its current travel. However, the present invention is not limited to this.

 例えば、自動運転制御装置100は、他車両mの運転者の顔または視線の向きDが所定の方向である場合、他車両mの運転者が合流や車線変更の意思があり、更に自車両Mを認識していると判断し、他車両mに道を譲るためのスペースを確保せずに、現状の自車両Mの走行を維持させてもよい。一方で、自動運転制御装置100は、他車両mの運転者の顔または視線の向きDが所定の方向でない場合、他車両mの運転者が合流や車線変更の意思があるものの自車両Mを認識していないと判断し、自車両Mを減速させたり他車線に車線変更させたりすることで他車両mに道を譲るためにスペースを確保してもよい。 For example, when the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the automatic driving control apparatus 100 may determine that the driver of the other vehicle m intends to merge or change lanes and, moreover, has recognized the host vehicle M, and may maintain the current travel of the host vehicle M without securing space to yield to the other vehicle m. Conversely, when the direction D is not the predetermined direction, the automatic driving control apparatus 100 may determine that the driver of the other vehicle m intends to merge or change lanes but has not recognized the host vehicle M, and may secure space to yield to the other vehicle m by decelerating the host vehicle M or causing it to change lanes to another lane.

 以上説明した第1実施形態によれば、他車両mの乗員(運転者)の顔または視線の向きDを示す注視方向情報を取得し、取得した注視方向情報によって示される向きDが、所定の方向であるか否かを判定する他車両遭遇時判定部142と、他車両遭遇時判定部142により、注視方向情報によって示される向きDが所定の方向であると判定された場合、自車両Mを減速させたり、車線変更させたりするための目標軌道Tを生成する行動計画生成部140と、行動計画生成部140により生成された目標軌道Tに基づいて、走行駆動力出力装置200およびブレーキ装置210を制御する速度制御部164と、上記の目標軌道Tに基づいて、ステアリング装置220を制御する操舵制御部166と、を備えることにより、他車両の運転者と意思の疎通を図りながら、より柔軟な自動運転を実現することができる。 According to the first embodiment described above, the following are provided: the other vehicle encounter time determination unit 142, which acquires gaze direction information indicating the direction D of the face or line of sight of the occupant (driver) of the other vehicle m and determines whether the direction D indicated by the acquired gaze direction information is the predetermined direction; the action plan generation unit 140, which generates a target trajectory T for decelerating the host vehicle M or causing it to change lanes when the other vehicle encounter time determination unit 142 determines that the direction D indicated by the gaze direction information is the predetermined direction; the speed control unit 164, which controls the traveling driving force output device 200 and the brake device 210 based on the target trajectory T generated by the action plan generation unit 140; and the steering control unit 166, which controls the steering device 220 based on the target trajectory T. This makes it possible to realize more flexible automated driving while communicating with the driver of the other vehicle.

 <第2実施形態>
 以下、第2実施形態について説明する。上述した第1実施形態では、他車両mの運転者の顔または視線の向きDが所定の方向である場合、自車両Mを減速または車線変更させるものとして説明した。これに対して、第2実施形態では、他車両mの運転者の顔または視線の向きDが所定の方向である場合、自車両Mを後退させる点で、上述した第1実施形態と異なる。以下、第1実施形態との相違点を中心に説明し、第1実施形態と共通する機能等についての説明は省略する。
Second Embodiment
The second embodiment will be described below. In the first embodiment described above, the host vehicle M was decelerated or made to change lanes when the direction D of the face or line of sight of the driver of the other vehicle m was the predetermined direction. In contrast, the second embodiment differs from the first embodiment in that the host vehicle M is moved backward when the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction. The following description focuses on differences from the first embodiment, and descriptions of functions and the like common to the first embodiment are omitted.

 図17は、第2実施形態の自動運転制御装置100により実行される処理の一例を示すフローチャートである。本フローチャートの処理は、例えば、認識部130によって他車両mが認識されると実行される。また、本フローチャートの処理は、例えば、所定の周期で繰り返し実行されてよい。 FIG. 17 is a flowchart showing an example of processing executed by the automatic driving control apparatus 100 according to the second embodiment. The processing of this flowchart is executed, for example, when the recognition unit 130 recognizes the other vehicle m. Also, the processing of this flowchart may be repeatedly executed, for example, at a predetermined cycle.

 まず、他車両乗員注視方向認識部132は、認識した他車両mの運転者を認識する(ステップS200)。次に、他車両乗員注視方向認識部132は、認識した運転者の顔または視線の向きDを認識する(ステップS202)。 First, the other vehicle occupant gaze direction recognition unit 132 recognizes the driver of the recognized other vehicle m (step S200). Next, the other vehicle occupant gaze direction recognition unit 132 recognizes the direction D of the face or line of sight of the recognized driver (step S202).

 次に、他車両遭遇時判定部142は、他車両乗員注視方向認識部132から、認識結果として注視方向情報を取得し、その注視方向情報を基に、他車両mの運転者の顔または視線の向きDが所定の方向であるか否かを判定する(ステップS204)。 Next, the other vehicle encounter time determination unit 142 acquires gaze direction information as a recognition result from the other vehicle occupant gaze direction recognition unit 132, and determines, based on the gaze direction information, whether the direction D of the face or line of sight of the driver of the other vehicle m is a predetermined direction (step S204).

 行動計画生成部140は、他車両遭遇時判定部142によって、他車両mの運転者の顔または視線の向きDが所定の方向でないと判定された場合、以前に生成した目標軌道Tと同じ目標軌道Tを生成し、速度制御部164および操舵制御部166に参照させる目標軌道Tを新たなものに変更しない(ステップS206)。 When the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is not the predetermined direction, the action plan generation unit 140 generates the same target trajectory T as the previously generated target trajectory T, and does not change the target trajectory T referred to by the speed control unit 164 and the steering control unit 166 to a new one (step S206).

 一方、他車両遭遇時判定部142は、他車両mの運転者の顔または視線の向きDが所定の方向であると判定した場合、更に、他車両mが踏切に到達したか否かを判定する(ステップS208)。 On the other hand, when the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, it further determines whether the other vehicle m has reached a railroad crossing (step S208).

 他車両遭遇時判定部142によって、他車両mが踏切に到達していないと判定された場合、機器制御部170は、通信装置20に、運転者が認識された他車両mと車車間通信を開始させ(ステップS210)、所定の情報を他車両mに送信させる。次に、行動計画生成部140は、減速または車線変更を実現するための目標軌道Tを生成する(ステップS212)。 When the other vehicle encounter time determination unit 142 determines that the other vehicle m has not reached the railroad crossing, the device control unit 170 causes the communication device 20 to start inter-vehicle communication with the other vehicle m whose driver was recognized (step S210) and to transmit the predetermined information to the other vehicle m. Next, the action plan generation unit 140 generates a target trajectory T for realizing deceleration or a lane change (step S212).

 一方、他車両遭遇時判定部142によって、他車両mが踏切に到達したと判定された場合、機器制御部170は、通信装置20に、運転者が認識された他車両mと車車間通信を開始させ(ステップS214)、所定の情報を他車両mに送信させる。次に、行動計画生成部140は、自車両Mを後退させるための目標軌道Tを生成する(ステップS216)。そして、速度制御部164および操舵制御部166の一方または双方は、行動計画生成部140により新たに生成された目標軌道Tに基づいて、各制御対象を制御する。 On the other hand, when the other vehicle encounter time determination unit 142 determines that the other vehicle m has reached the railroad crossing, the device control unit 170 causes the communication device 20 to start inter-vehicle communication with the other vehicle m whose driver was recognized (step S214) and to transmit the predetermined information to the other vehicle m. Next, the action plan generation unit 140 generates a target trajectory T for moving the host vehicle M backward (step S216). Then, one or both of the speed control unit 164 and the steering control unit 166 control the respective control targets based on the target trajectory T newly generated by the action plan generation unit 140.
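The branching of the second embodiment's flow (steps S204 to S216 of FIG. 17) can be sketched as follows. The function and the returned labels are illustrative assumptions used only to make the control flow explicit.

```python
def decide_action(gaze_is_predetermined, other_vehicle_at_crossing):
    """Branching of the second embodiment's flow (steps S204-S216):
    keep the current target trajectory unless the other driver's gaze
    direction D is the predetermined direction; in that case, notify the
    other vehicle and reverse when it is caught at a railroad crossing,
    and otherwise notify it and decelerate or change lanes."""
    if not gaze_is_predetermined:
        return "keep_trajectory"      # S206: same target trajectory T as before
    if other_vehicle_at_crossing:
        return "notify_and_reverse"   # S214, S216: reverse to open backing space
    return "notify_and_yield"         # S210, S212: decelerate or change lanes
```

In the actual apparatus, the first argument would come from the other vehicle encounter time determination unit 142 and the second from the recognition unit 130.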

 図18および図19は、他車両mの運転者の顔または視線の向きDが所定の方向となり得る場面の他の例を示す図である。図示の例では、自車両Mの直前に他車両mが存在し、その他車両mの車体の一部が踏切内に入り込んでいる。このような場面では、例えば、他車両mの運転者が、踏切内から出るために、後方確認を行ってから他車両mを後退させようと試みることが想定される。この場合、他車両mの運転者が後方確認の際に自車両Mを視界に捉える蓋然性が高いため、他車両遭遇時判定部142は、他車両mの運転者の顔または視線の向きDが所定の方向であると判定することになる。これを受けて、機器制御部170は、例えば、自車両Mが後退することで他車両mの後退スペースを確保する予定であることを所定の情報として通知する。また、行動計画生成部140は、認識部130によって後続車両が認識されていない場合、自車両Mを後退させるための目標軌道Tを新たに生成する。これによって、他車両mが後退できる程度のスペースを形成することができ、他車両mに後退を促すことができる。 FIGS. 18 and 19 are diagrams showing another example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m can be the predetermined direction. In the illustrated example, the other vehicle m is present immediately ahead of the host vehicle M, and part of the body of the other vehicle m has entered a railroad crossing. In such a scene, it is assumed, for example, that the driver of the other vehicle m checks behind and then attempts to back the other vehicle m out of the crossing. In this case, since the driver of the other vehicle m is highly likely to catch sight of the host vehicle M while checking behind, the other vehicle encounter time determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction. In response, the device control unit 170 notifies, as the predetermined information, that the host vehicle M plans to move backward to secure backing space for the other vehicle m. Further, when the recognition unit 130 does not recognize a following vehicle, the action plan generation unit 140 newly generates a target trajectory T for moving the host vehicle M backward. This makes it possible to form a space large enough for the other vehicle m to back up, prompting the other vehicle m to move backward.

 以上説明した第2実施形態によれば、自車両Mの前走車両である他車両mの運転者の顔または視線の向きDが所定の方向である場合、自車両Mを後退させるための目標軌道Tを新たに生成するため、他車両mが後退できる程度のスペースを形成することができ、他車両mに後退を促すことができる。この結果、上述した第1実施形態と同様に、他車両の運転者と意思の疎通を図りながら、より柔軟な自動運転を実現することができる。 According to the second embodiment described above, when the direction D of the face or line of sight of the driver of the other vehicle m, which is the vehicle ahead of the host vehicle M, is the predetermined direction, a target trajectory T for moving the host vehicle M backward is newly generated, so that a space large enough for the other vehicle m to back up can be formed and the other vehicle m can be prompted to move backward. As a result, as in the first embodiment described above, more flexible automated driving can be realized while communicating with the driver of the other vehicle.

 <Third Embodiment>
 The third embodiment will be described below. In the first and second embodiments described above, examples were described in which the host vehicle M is decelerated, made to change lanes, or reversed when the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction. In contrast, the third embodiment differs from the first and second embodiments in that, when the direction D of the face or line of sight of the driver of the other vehicle m is not the predetermined direction, vehicle control different from that performed when the direction D is the predetermined direction is carried out. The following description focuses on the differences from the first and second embodiments, and descriptions of functions common to them are omitted.

 FIG. 20 is a flowchart showing an example of processing executed by the automatic driving control device 100 of the third embodiment. The processing of this flowchart is executed, for example, when the recognition unit 130 recognizes the other vehicle m, and may be executed repeatedly at a predetermined cycle.

 First, the other-vehicle occupant gaze direction recognition unit 132 recognizes the driver of the recognized other vehicle m (step S300). Next, the other-vehicle occupant gaze direction recognition unit 132 recognizes the direction D of the face or line of sight of the recognized driver (step S302).

 Next, the other-vehicle encounter determination unit 142 acquires gaze direction information as a recognition result from the other-vehicle occupant gaze direction recognition unit 132, and determines, based on the gaze direction information, whether the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction (step S304).

 When the other-vehicle encounter determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is the predetermined direction, the action plan generation unit 140 generates the same target trajectory T as previously generated, and does not change the target trajectory T referred to by the speed control unit 164 and the steering control unit 166 to a new one (step S306).

 On the other hand, when the other-vehicle encounter determination unit 142 determines that the direction D of the face or line of sight of the driver of the other vehicle m is not the predetermined direction, the action plan generation unit 140 newly generates a target trajectory T for causing the host vehicle M to follow the other vehicle m while keeping the inter-vehicle distance between the host vehicle M and the other vehicle m constant (step S308). Then, one or both of the speed control unit 164 and the steering control unit 166 control the respective control targets based on the target trajectory T newly generated by the action plan generation unit 140.
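The branch in steps S300 through S308 can be reduced to a minimal sketch. This is a simplification of the Fig. 20 flow under the assumption that steps S300 to S304 are summarized by a single boolean, and the string return values merely stand in for the target trajectory T:

```python
def third_embodiment_cycle(gaze_is_predetermined: bool, previous_trajectory: str) -> str:
    """One cycle of the Fig. 20 flowchart, reduced to its decision branch.

    S300/S302 (recognizing the driver and direction D) and S304 (the
    direction test) are assumed to happen upstream and are summarized
    here by the boolean argument.
    """
    if gaze_is_predetermined:
        # S306: keep the previously generated target trajectory T unchanged
        return previous_trajectory
    # S308: newly generate a trajectory that follows the other vehicle m
    # while keeping the inter-vehicle distance constant
    return "follow trajectory (constant inter-vehicle distance)"
```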

 FIGS. 21 and 22 are diagrams showing an example of a scene in which the direction D of the face or line of sight of the driver of the other vehicle m is not the predetermined direction. For example, when the other vehicle m is a vehicle capable of automated driving, it is assumed that the driver of the other vehicle m takes his or her hands off the steering wheel ST and watches content such as a television program displayed on the display device DP. In this case, the other-vehicle encounter determination unit 142 determines that the driver of the other vehicle m is not operating the steering wheel ST and that the direction D of the driver's face or line of sight is not the predetermined direction. The action plan generation unit 140 then newly generates a target trajectory T for changing the lane of the host vehicle M into the lane LN1 in which the other vehicle m is present; after the host vehicle M has moved into the lane LN1, it sets the target speed to substantially the same speed as that of the other vehicle m, now the preceding vehicle, and generates a target trajectory T including this target speed as a speed element. The host vehicle M can thereby be made to follow the other vehicle m.
 In this way, when it is found, based on the direction D of the face or line of sight of the driver of the other vehicle m, that the other vehicle m is highly likely to be an automatically driven vehicle, having the host vehicle M follow the other vehicle m lets the preceding other vehicle m take over part of the function of the recognition unit 130 of the host vehicle M, reducing the processing load on the host vehicle M side. In addition, because the host vehicle M follows the other vehicle m, the air resistance of the host vehicle M decreases, and the fuel efficiency of the host vehicle M can be improved.
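The two-stage follow maneuver (change into the other vehicle's lane, then match its speed) can be sketched as an ordered plan. The step labels and lane identifiers below are illustrative assumptions, not part of the actual trajectory representation:

```python
from typing import List, Tuple

def follow_maneuver(host_lane: str, other_lane: str,
                    other_speed: float) -> List[Tuple[str, object]]:
    """Build the two-stage plan: move into the other vehicle's lane, then
    adopt substantially the same target speed as the now-preceding vehicle."""
    plan: List[Tuple[str, object]] = []
    if host_lane != other_lane:
        plan.append(("lane_change", other_lane))    # new target trajectory T into LN1
    plan.append(("set_target_speed", other_speed))  # speed element of the trajectory
    return plan
```

Matching the lead vehicle's speed (rather than a fixed cruise speed) is what keeps the inter-vehicle distance roughly constant once the lane change completes.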

 [Hardware configuration]
 The automatic driving control device 100 of the embodiments described above is realized, for example, by a hardware configuration such as that shown in FIG. 23. FIG. 23 is a diagram showing an example of the hardware configuration of the automatic driving control device 100 of the embodiments.

 In the automatic driving control device 100, a communication controller 100-1, a CPU 100-2, a RAM (Random Access Memory) 100-3, a ROM (Read Only Memory) 100-4, a secondary storage device 100-5 such as a flash memory or an HDD (Hard Disk Drive), and a drive device 100-6 are connected to one another by an internal bus or a dedicated communication line. A portable storage medium such as an optical disc is mounted in the drive device 100-6. A program 100-5a stored in the secondary storage device 100-5 is loaded into the RAM 100-3 by a DMA controller (not shown) or the like and executed by the CPU 100-2, whereby the first control unit 120 and the second control unit 160 are realized. The program referred to by the CPU 100-2 may be stored in a portable storage medium mounted in the drive device 100-6, or may be downloaded from another device via a network NW.

 The above embodiments can be expressed as follows.
 A vehicle control device comprising:
 a storage storing a program; and
 a processor,
 wherein the processor, by executing the program:
 acquires information indicating a direction of a face or a line of sight of an occupant of another vehicle;
 determines whether the direction indicated by the information is a direction facing a host vehicle as viewed from the other vehicle;
 controls one or both of steering and acceleration/deceleration of the host vehicle; and
 performs predetermined control when it is determined that the direction indicated by the information is a direction facing the host vehicle as viewed from the other vehicle.

 Although modes for carrying out the present invention have been described above using embodiments, the present invention is in no way limited to these embodiments, and various modifications and substitutions can be made without departing from the gist of the present invention. For example, the vehicle system 1 of the embodiments described above may be applied to a system that provides driving assistance such as ACC (Adaptive Cruise Control) or LKAS (Lane Keeping Assist System).

 Description of reference signs: 1: vehicle system; 10: camera; 12: radar device; 14: finder; 16: object recognition device; 20: communication device; 30: HMI; 40: vehicle sensor; 50: navigation device; 60: MPU; 80: driving operators; 90: lamp group; 92: horn; 100: automatic driving control device; 120: first control unit; 130: recognition unit; 132: other-vehicle occupant gaze direction recognition unit; 140: action plan generation unit; 142: other-vehicle encounter determination unit; 160: second control unit; 162: acquisition unit; 164: speed control unit; 166: steering control unit; 200: traveling driving force output device; 210: brake device; 220: steering device; M: host vehicle; m: other vehicle

Claims (7)

 A vehicle control device comprising:
 an acquisition unit that acquires information indicating a direction of a face or a line of sight of an occupant of another vehicle;
 a determination unit that determines whether the direction indicated by the information acquired by the acquisition unit is a direction facing a host vehicle as viewed from the other vehicle; and
 a driving control unit that controls one or both of steering and acceleration/deceleration of the host vehicle, and that performs predetermined control when the determination unit determines that the direction indicated by the information is a direction facing the host vehicle as viewed from the other vehicle.
 The vehicle control device according to claim 1, wherein the predetermined control includes at least one of decelerating the host vehicle, causing the host vehicle to change lanes, and reversing the host vehicle.
 The vehicle control device according to claim 1, further comprising a detection unit that detects a steering wheel installed in the other vehicle and detects a direction of a face or a line of sight of the occupant closest to the detected steering wheel,
 wherein the acquisition unit acquires a detection result of the detection unit as the information indicating the direction of the face or line of sight of the occupant of the other vehicle.
 The vehicle control device according to claim 1, further comprising:
 a communication unit that performs vehicle-to-vehicle communication with the other vehicle; and
 a communication control unit that, when the determination unit determines that the direction indicated by the information is a direction facing the host vehicle as viewed from the other vehicle, causes the communication unit to start the vehicle-to-vehicle communication and to transmit predetermined information to the other vehicle.
 The vehicle control device according to claim 1, wherein the driving control unit decelerates the host vehicle or causes the host vehicle to change into another lane when the determination unit determines that the direction indicated by the information is a direction facing the host vehicle as viewed from the other vehicle.
 A vehicle control method comprising:
 acquiring, by an acquisition unit, information indicating a direction of a face or a line of sight of an occupant of another vehicle;
 determining, by a determination unit, whether the direction indicated by the acquired information is a direction facing a host vehicle as viewed from the other vehicle; and
 controlling, by a driving control unit, one or both of steering and acceleration/deceleration of the host vehicle, and performing predetermined control when the determination unit determines that the direction indicated by the information is a direction facing the host vehicle as viewed from the other vehicle.
 A program causing a computer to:
 acquire information indicating a direction of a face or a line of sight of an occupant of another vehicle;
 determine whether the direction indicated by the acquired information is a direction facing a host vehicle as viewed from the other vehicle; and
 control one or both of steering and acceleration/deceleration of the host vehicle, and perform predetermined control when it is determined that the direction indicated by the information is a direction facing the host vehicle as viewed from the other vehicle.
PCT/JP2017/046959 2017-12-27 2017-12-27 Vehicle control device, vehicle control method, and program Ceased WO2019130483A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/046959 WO2019130483A1 (en) 2017-12-27 2017-12-27 Vehicle control device, vehicle control method, and program

Publications (1)

Publication Number Publication Date
WO2019130483A1 true WO2019130483A1 (en) 2019-07-04

Family

ID=67066849

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/046959 Ceased WO2019130483A1 (en) 2017-12-27 2017-12-27 Vehicle control device, vehicle control method, and program

Country Status (1)

Country Link
WO (1) WO2019130483A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11328597A (en) * 1998-05-15 1999-11-30 Fujitsu Ten Ltd Vehicle driving controller
JP2007200052A (en) * 2006-01-27 2007-08-09 Nissan Motor Co Ltd Driving support device at intersection and driving support method at intersection
JP2007241729A (en) * 2006-03-09 2007-09-20 Toyota Central Res & Dev Lab Inc Driving support device and driving support system
JP2007304880A (en) * 2006-05-11 2007-11-22 Toyota Motor Corp Vehicle driving support device
JP2008158588A (en) * 2006-12-20 2008-07-10 Toyota Central R&D Labs Inc Inter-vehicle communication device and inter-vehicle communication system
JP2009298193A (en) * 2008-06-10 2009-12-24 Fuji Heavy Ind Ltd Driving support device for vehicle
US20140118523A1 (en) * 2012-10-25 2014-05-01 Po Yiu Pauline Li Devices, systems and methods for identifying potentially dangerous oncoming cars
JP2014178971A (en) * 2013-03-15 2014-09-25 Denso Corp Vehicular collision warning device
WO2017056401A1 (en) * 2015-09-30 2017-04-06 ソニー株式会社 Control device, control method, and program

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110543193A (en) * 2019-08-30 2019-12-06 中国人民解放军国防科技大学 An online acceleration and deceleration control method, system and medium for a pointing mechanism
CN110543193B (en) * 2019-08-30 2022-04-15 中国人民解放军国防科技大学 An online acceleration and deceleration control method, system and medium for pointing mechanism
CN110606084A (en) * 2019-09-19 2019-12-24 中国第一汽车股份有限公司 Cruise control method, cruise control device, vehicle and storage medium
CN112519883A (en) * 2019-09-19 2021-03-19 本田技研工业株式会社 Vehicle control system
CN112519883B (en) * 2019-09-19 2023-02-21 本田技研工业株式会社 vehicle control system
US20220332318A1 (en) * 2020-03-31 2022-10-20 Hitachi, Ltd. Mobile object control system
US12005898B2 (en) * 2020-03-31 2024-06-11 Hitachi, Ltd. Mobile object platoon control system that calculates longitudinal acceleration of the mobile objects by setting a gain of an arithmetic expression
JP2022121103A (en) * 2021-02-08 2022-08-19 パナソニックIpマネジメント株式会社 Merging Support Device, Merging Support System, and Merging Support Method
JP7627538B2 (en) 2021-02-08 2025-02-06 パナソニックオートモーティブシステムズ株式会社 Merging support device, merging support system, and merging support method

Similar Documents

Publication Publication Date Title
CN111771234B (en) Vehicle control system, vehicle control method, and storage medium
JP6704890B2 (en) Vehicle control device, vehicle control method, and program
CN111727145B (en) Vehicle control system, vehicle control method and storage medium
CN110239549B (en) Vehicle control device, vehicle control method, and storage medium
CN111762113A (en) Vehicle control device, vehicle control method, and storage medium
JP6676025B2 (en) Vehicle control device, vehicle control method, and program
JP6586685B2 (en) Vehicle control device, vehicle control method, and program
CN110001634A (en) Controller of vehicle, control method for vehicle and storage medium
JP2019108103A (en) Vehicle control device, vehicle control method, and program
WO2019073511A1 (en) Vehicle control device, vehicle control method, and program
JP2019048570A (en) Vehicle control device, vehicle control method, and program
CN110194153B (en) Vehicle control device, vehicle control method, and storage medium
JP7029322B2 (en) Vehicle control devices, vehicle control methods, and programs
JP2019128612A (en) Vehicle control device, vehicle control method, and program
WO2019069425A1 (en) Vehicle control device, vehicle control method, and program
CN111511621B (en) Vehicle control device, vehicle control method, and storage medium
WO2019069347A1 (en) Vehicle control apparatus, vehicle control method, and program
JP2019137189A (en) Vehicle control system, vehicle control method, and program
JP2022011545A (en) Control device, control method, and program
CN111231961A (en) Vehicle control device, vehicle control method, and storage medium
JP2019137185A (en) Vehicle control system, vehicle control method, and program
WO2019130483A1 (en) Vehicle control device, vehicle control method, and program
JP2019067295A (en) Vehicle control device, vehicle control method, and program
JP2021068016A (en) Vehicle controller, vehicle control method and program
CN115716473A (en) Vehicle control device, vehicle control method, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17936432

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17936432

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP