WO2021193059A1 - Information Processing Method and Information Processing System - Google Patents
- Publication number
- WO2021193059A1 (application PCT/JP2021/009486)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- emergency stop
- situation
- moving body
- information processing
- control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0015—Planning or execution of driving tasks specially adapted for safety
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/02—Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
- B60W50/0205—Diagnosing or detecting failures; Failure detection models
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/215—Selection or confirmation of options
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/10—Historical data
Definitions
- This disclosure relates to an information processing method and the like that are executed when an autonomously traveling mobile body makes an emergency stop.
- Patent Document 1 neither discloses nor suggests confirming the reason for the emergency stop.
- The display method for a mobile body described in Patent Document 2 is useful for a remote operator to understand the processing status of the automatic driving system that led to a sudden operation. However, it is left to the operator to judge whether there is actually a problem with resuming travel of the moving body, so it is difficult to reliably improve the safety of resuming travel.
- The present disclosure provides an information processing method and an information processing system that can more reliably improve the safety of resuming travel when an autonomously traveling moving body makes an emergency stop to avoid a collision with an obstacle.
- The information processing method according to one aspect of the present disclosure is an information processing method executed by a processor, which determines whether or not an emergency stop has occurred in order to avoid a collision of an autonomously traveling moving body with an obstacle. When it is determined that the emergency stop has occurred, safety control, that is, autonomous driving control under which the moving body travels autonomously so as to avoid the collision with the obstacle without performing the emergency stop, is simulated based on the traveling status history of the moving body. A log of the automatic driving system that executes the autonomous driving control of the moving body is acquired, and based on the autonomous driving control indicated by the log and the simulated safety control, the process that caused the emergency stop is identified from among the processes for autonomous driving control shown in the log. Situation candidates corresponding to the identified process are acquired from a database of situations in which the emergency stop can occur. An interface for having an operator input the situation candidate corresponding to the situation indicated by the log is presented, and whether or not the autonomous driving control can be resumed by the automatic driving system is determined according to the input situation candidate.
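The sequence of steps above can be sketched as follows. All function and field names (simulate_safety_control, identify_cause, "resumable", and so on) are illustrative assumptions, not identifiers from the patent, and the stub bodies merely stand in for the real simulation and comparison logic.

```python
# Hypothetical sketch of the claimed flow; not an implementation of the patent.

def simulate_safety_control(travel_history):
    # Stand-in: a real system would replay the travel history and compute the
    # autonomous-driving control that avoids the obstacle without an emergency stop.
    return {"safety_control_time": travel_history[0]["time"]}

def identify_cause(process_log, safety):
    # Stand-in: compare the logged processing with the simulated safety control
    # and return which process (detection / prediction / determination) failed.
    t0 = safety["safety_control_time"]
    for entry in process_log:
        if entry["time"] <= t0 and entry["output"] is None:
            return entry["process"]
    return "determination"

def handle_emergency_stop(log, situation_db, operator_choice):
    """Decide whether autonomous driving may be resumed after an emergency stop."""
    if not log["emergency_stop_occurred"]:
        return None  # nothing to do
    safety = simulate_safety_control(log["travel_history"])   # 1. simulate
    cause = identify_cause(log["process_log"], safety)        # 2. identify cause
    candidates = situation_db.get(cause, [])                  # 3. fetch candidates
    # 4. the operator picks the candidate matching the logged situation
    chosen = next((c for c in candidates if c["name"] == operator_choice), None)
    # 5. resumability follows from the chosen situation candidate
    return chosen is not None and chosen["resumable"]
```

The dictionary-based log and database shapes are placeholders; the patent leaves the concrete data representation open.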
- The information processing system according to one aspect of the present disclosure is an information processing system including one or more computers equipped with a processor. The processor determines whether or not an emergency stop has occurred in order to avoid a collision of an autonomously traveling moving body with an obstacle. When it is determined that the emergency stop has occurred, the processor simulates, based on the traveling status history of the moving body, safety control, that is, autonomous driving control under which the moving body travels autonomously so as to avoid the collision with the obstacle without performing the emergency stop. The processor acquires a log of the automatic driving system that executes the autonomous driving control of the moving body and, based on the autonomous driving control indicated by the log and the simulated safety control, identifies the process that caused the emergency stop from among the processes for autonomous driving control shown in the log. The processor acquires situation candidates corresponding to the identified process from a database of situations in which the emergency stop can occur, presents the operator with an interface for inputting the situation candidate corresponding to the situation indicated by the log, and determines, according to the input situation candidate, whether or not the autonomous driving control can be resumed by the automatic driving system.
- These general or specific aspects may be realized by a device, a system, an integrated circuit, a method, a computer program, or a computer-readable recording medium such as a CD-ROM, or by any combination of a device, a system, an integrated circuit, a method, a computer program, and a recording medium.
- FIG. 1 is an overall configuration diagram of a mechanism including a mobile body including an automatic driving system and a remote server according to the embodiment.
- FIG. 2A is a schematic diagram for explaining an image generated by the remote server when a process by the detection unit is specified as a cause of an emergency stop.
- FIG. 2B is a schematic diagram for explaining an image generated by the remote server when a process by the detection unit is specified as a cause of the emergency stop of the moving body.
- FIG. 3A is an image generated by the remote server when the processing by the detection unit is specified as the cause of the emergency stop of the moving body.
- FIG. 3B is a schematic diagram for explaining an image generated by the remote server when a process by the detection unit is specified as a cause of the emergency stop of the moving body.
- FIG. 4A is an example of an image generated by the remote server when the processing by the prediction unit is specified as the cause of the emergency stop of the moving body.
- FIG. 4B is a schematic diagram for explaining an image generated by the remote server when the processing by the prediction unit is specified as the cause of the emergency stop of the moving body.
- FIG. 5A is a schematic diagram for explaining an image generated by the remote server when the processing by the prediction unit is specified as the cause of the emergency stop of the moving body.
- FIG. 5B is a schematic diagram for explaining an image generated by the image generation unit when the processing by the prediction unit is specified as the cause of the emergency stop of the moving body.
- FIG. 6A is a schematic diagram for explaining an image generated by the remote server when a process by the determination unit is specified as the cause of the emergency stop of the moving body.
- FIG. 6B is a schematic diagram for explaining an image generated by the remote server when a process by the determination unit is specified as the cause of the emergency stop of the moving body.
- FIG. 7A is a schematic diagram for explaining an image generated by the remote server when a process by the determination unit is specified as the cause of the emergency stop of the moving body.
- FIG. 7B is a schematic diagram for explaining an image generated by the remote server when a process by the determination unit is specified as the cause of the emergency stop of the moving body.
- FIG. 8 is an example of a UI controlled by a user interface (hereinafter referred to as UI) control unit on the remote server and presented to the operator.
- FIG. 9 is an example of a UI presented in response to input by an operator.
- FIG. 10 is an example of a UI presented in response to input by an operator.
- FIG. 11 is a flowchart for explaining a series of flows of the information processing method according to the embodiment.
- UI: user interface
- The information processing method according to one aspect of the present disclosure is an information processing method executed by a processor, which determines whether or not an emergency stop has occurred for an autonomously traveling moving body to avoid a collision with an obstacle. When the emergency stop has occurred, safety control, that is, autonomous driving control for traveling autonomously so as to avoid the collision without performing the emergency stop, is simulated based on the traveling status history of the moving body, and a log of the automatic driving system is acquired. The process that caused the emergency stop is identified from among the processes for autonomous driving control indicated by the log, and the situation candidate corresponding to the identified process is acquired from a database of situations in which the emergency stop may occur. The operator is presented with an interface for inputting the situation candidate corresponding to the situation indicated by the log, and whether or not the autonomous driving control by the automatic driving system can be resumed is determined according to the input situation candidate.
- For example, the safety control may be compared with the autonomous driving control indicated by the log to obtain the safety control time, which is the start time for executing the safety control in the autonomous driving control of the moving body, and the cause may be identified based on the result of determining whether or not the result of the process required for executing the safety control was output before the safety control time.
- For example, the process required to execute the safety control may be at least one of a detection process for acquiring a detection result indicating the surrounding situation of the moving body, a prediction process for predicting the surrounding situation of the moving body using the detection result, and a determination process for determining the traveling route according to the predicted surrounding situation.
- For example, the interface may include an image generated using the output of the identified process and the observation data input to the automatic driving system. Further, the interface may include an image generated by additionally using the output at the safety control time in the safety control. Further, the interface may be an interface for inputting the situation candidate corresponding to the situation indicated by the log for a time point after the safety control time.
- For example, whether or not the autonomous driving control can be resumed may be determined based on the value of a parameter that indicates the safety level or the risk level of executing the autonomous driving control and is set for the input situation candidate. Further, the parameter may be based on the length of the delay of the output of the identified process from the safety control time. Further, the parameter may be set for the time to which the input situation candidate corresponds, and the situation candidate may be input for the period up to the safety control time.
- For example, the safety control may be control for stopping the moving body at the stop position of the emergency stop or at a position short of it. Alternatively, for example, the safety control may be control for causing the moving body to travel while avoiding the obstacle.
- The information processing system according to one aspect of the present disclosure is an information processing system including one or more computers equipped with a processor. The processor determines whether or not an emergency stop has occurred for an autonomously traveling moving body to avoid a collision with an obstacle. When the emergency stop has occurred, the processor simulates, based on the traveling status history of the moving body, safety control, that is, autonomous driving control under which the moving body travels autonomously so as to avoid the collision with the obstacle without performing the emergency stop, and acquires a log of the automatic driving system that executes the autonomous driving control of the moving body. The processor identifies the process that caused the emergency stop from among the processes for autonomous driving control indicated by the log, acquires the situation candidate corresponding to the identified process from a database of situations in which the emergency stop can occur, presents the operator with an interface for inputting the situation candidate corresponding to the situation indicated by the log, and determines, according to the input situation candidate, whether or not the autonomous driving control can be resumed by the automatic driving system.
- These general or specific aspects may be realized by a device, a system, an integrated circuit, a method, a computer program, or a computer-readable recording medium such as a CD-ROM, or by any combination of a device, a system, an integrated circuit, a method, a computer program, and a recording medium.
- FIG. 1 is an overall configuration diagram of a mechanism including a mobile body provided with an automatic driving system and a remote server according to the present embodiment.
- the information processing method according to the present disclosure is executed by a processor in a mechanism including a mobile body 100 that autonomously travels as shown in FIG. 1 and a remote server 200 that is wirelessly connected to the mobile body 100.
- The moving body 100 includes an automatic driving system 110, a receiving unit 120, an observation unit 11, and a moving body control unit 12.
- the observation unit 11 observes by sensing the surroundings of the moving body 100.
- The observation unit 11 is realized by using, for example, a camera, a radar (radio detection and ranging) sensor, or a LiDAR (light detection and ranging, or laser imaging detection and ranging) sensor. Observation data indicating the result of this observation is provided from the observation unit 11 to the automatic driving system 110.
- the moving body control unit 12 controls the running of the moving body 100 according to an instruction regarding operation control from the automatic driving system 110.
- the moving body control unit 12 is realized by using a circuit that controls the steering angle and acceleration / deceleration of the moving body. Further, information indicating the content of this control (hereinafter, also referred to as steering / acceleration / deceleration information) is provided from the moving body control unit 12 to the automatic driving system 110.
- The automatic driving system 110 includes a detection unit 111, a prediction unit 112, and a determination unit 113 as functional components.
- The detection unit 111 detects an obstacle by processing the observation data, such as an image or a point cloud, acquired by the observation unit 11, and acquires and outputs information on the type and size of the obstacle, its relative position with respect to the moving body 100, and its relative speed.
- The prediction unit 112 predicts and outputs the movement route of the detected obstacle by processing the information acquired from the detection unit 111 using a behavior prediction technique.
- The determination unit 113 dynamically plans the route of the moving body 100 by processing the predicted movement route of the obstacle acquired from the prediction unit 112 and the steering and acceleration/deceleration information of the moving body 100 acquired from the moving body control unit 12. The determination unit 113 then determines the content of the next operation control, including at least one of steering control and acceleration/deceleration control, for causing the moving body 100 to travel along the planned route, and outputs an instruction including the content of the operation control to the moving body control unit 12.
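The detection, prediction, and determination stages described above can be sketched as a minimal pipeline. The data shapes, the linear extrapolation in the prediction step, and the distance-based braking rule are illustrative assumptions, not the techniques specified in the patent.

```python
# Hypothetical one-dimensional sketch of the three-stage pipeline.

def detect(observation):
    # Detection unit 111: extract the obstacle's state from observation data.
    return {"pos": observation["obstacle_pos"], "vel": observation["obstacle_vel"]}

def predict(detected, horizon=3):
    # Prediction unit 112: here, linearly extrapolate the obstacle's movement.
    x, v = detected["pos"], detected["vel"]
    return [x + v * t for t in range(1, horizon + 1)]

def decide(predicted_path, ego_pos, safe_gap=5.0):
    # Determination unit 113: instruct braking if the predicted path comes
    # closer to the moving body than the safe gap.
    too_close = any(abs(p - ego_pos) < safe_gap for p in predicted_path)
    return {"brake": too_close}
```

A usage example: `decide(predict(detect(obs)), ego_pos=5.0)` chains the three stages just as the instruction flow in FIG. 1 chains the three units.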
- the steering and acceleration / deceleration control amounts instructed by the determination unit 113 have a predetermined upper limit as a normal traveling limit.
- When a collision cannot be avoided within these limits, the determination unit 113 instructs braking for an emergency stop.
- After the emergency stop, the moving body control unit 12 does not execute operation control in response to instructions from the determination unit 113 until a predetermined procedure involving the remote server 200 has been performed.
- the receiving unit 120 receives the travel resumption availability signal by wireless communication with the transmitting unit 240 included in the remote server 200.
- When the travel resumption availability signal received by the receiving unit 120 indicates that travel can be resumed, the moving body 100 returns to a state in which the moving body control unit 12 executes operation control in accordance with instructions from the determination unit 113.
- The moving body control unit 12, the automatic driving system 110, and the receiving unit 120 can be realized by one or more ECUs (Electronic Control Units) included in the various operation control systems of the in-vehicle network system constructed on the in-vehicle network of the moving body 100.
- the ECU is, for example, a device including a processor (microprocessor), a digital circuit such as a memory, an analog circuit, a communication circuit, and the like.
- the memory is a ROM (Read-Only Memory) and a RAM (Random Access Memory), and can store a control program (computer program) executed by the processor.
- the ECU provides the functions of each of the above-mentioned components by operating the processor according to the control program.
- a computer program is a combination of a plurality of instruction codes for a processor in order to realize a predetermined function.
- the remote server 200 is an example of an information processing system including one or more computers including a processor.
- The remote server 200 provides the operator with information usable for determining whether the emergency stop occurred due to an abnormality in the automatic driving system 110 or was unavoidable under the circumstances. Further, the remote server 200 determines whether or not autonomous driving can be resumed in response to input from the operator, and transmits a signal indicating the result of this determination to the moving body 100.
- the remote server 200 is a device including, for example, a processor, a memory, a communication circuit, and the like.
- the memory is a ROM and a RAM, and can store a control program (computer program) executed by the processor.
- the remote server 200 includes a log recording unit 210, a first information processing unit 220, a second information processing unit 230, a transmission unit 240, an output device 21, and an input device 22.
- An information processing system may be configured by arranging each component of the remote server 200 in a distributed manner on a plurality of servers.
- the output device 21 outputs information to the outside of the remote server 200.
- Examples of the output device 21 include a liquid crystal display and a speaker.
- the input device 22 receives an input from the outside of the remote server 200. Examples of the input device 22 include a keyboard, a mouse and a microphone. Further, the output device 21 and the input device 22 may be provided by a touch screen.
- The log recording unit 210 acquires and records the sensor log 211, the travel log 212, and the processing log 213 by wireless communication with the automatic driving system 110.
- the sensor log 211 is a history of observation data indicating an image, a point cloud, or the like input to the detection unit 111.
- the travel log 212 is a history of speed or position information of the moving body 100 input to the determination unit 113.
- the travel log 212 may include an operation history of an actuator (for example, a brake, an accelerator, or a steering actuator) related to the travel of the moving body 100.
- the processing log 213 is a history of processing results of the detection unit 111, the prediction unit 112, and the determination unit 113. Regarding the determination unit 113, the history of the planned route is also included in the processing log 213.
- The first information processing unit 220 includes an operation detection unit 221, a time calculation unit 222, and a cause identification unit 223 as functional components. These components are realized by a processor or the like.
- The operation detection unit 221 monitors, with reference to the processing log 213, the issuance of an emergency stop instruction for the moving body 100 by the determination unit 113 of the automatic driving system 110, and detects the occurrence of an emergency stop. When the occurrence of an emergency stop is detected, this detection result is output as a trigger for subsequent processing by the time calculation unit 222 and the other components.
- The emergency stop instruction may be indicated by, for example, a flag attached to the instruction, or may be judged from the content of the braking control instruction (the magnitude of the braking force, the braking force distribution, and the like). If the moving body 100 is equipped with an emergency stop means, such as a collision damage mitigation brake, that operates separately from the emergency stop instructed by the determination unit 113, the operation detection unit 221 may detect the operation of that emergency stop means with reference to the travel log 212 and output the detection result as the trigger.
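A flag-based scan of the processing log, one of the two options mentioned above, can be sketched as follows. The "emergency_stop" field name and the list-of-dicts log format are assumptions for illustration.

```python
# Sketch of how the operation detection unit 221 might detect an emergency stop
# when the instruction carries a flag; the log format is a placeholder.

def detect_emergency_stop(processing_log):
    """Return the first log entry carrying an emergency-stop flag, or None."""
    for entry in processing_log:
        if entry.get("emergency_stop"):
            return entry
    return None
```

The alternative option, judging from the braking content, would instead inspect braking-force fields of each entry against a threshold.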
- the time calculation unit 222 calculates the safety control limit time from the travel log 212.
- The safety control in the present embodiment means control under which the moving body 100 stops, by normal autonomous driving control (that is, without making an emergency stop), at the position where it actually stopped due to the emergency stop; the time calculation unit 222 simulates this control of the moving body 100.
- Δt indicates the processing delay time from when the automatic driving system 110 outputs the moving body control for the emergency stop until the moving body 100 actually starts the emergency stop.
- distance(l_k, l_{k+1}) indicates the distance from position l_k to position l_{k+1}; for example, the Euclidean distance can be used.
- the safety control limit time calculated as described above is an example of the safety control time which is the start time of the safety control for executing the safety control in the autonomous driving control of the moving body 100.
- the safety control limit time can be said to be the latest time in which the safety control can be started as a feasible control in the autonomous driving control of the moving body 100.
- In the following, the safety control limit time is used for explanation, but the safety control start time does not necessarily have to be the latest possible time (that is, the safety control limit time) and may be earlier than that time.
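The extract does not reproduce the patent's formula for the safety control limit time, so the sketch below is a plausible reconstruction from the surrounding definitions (the delay Δt, the Euclidean distance(l_k, l_{k+1}), and stopping at or before the emergency-stop position). The travel-log shape, the constant-deceleration braking model, and the acceptance criterion are all assumptions.

```python
import math

# Hypothetical reconstruction: the latest time from which braking at the normal
# maximum deceleration a_max, after the actuation delay delta_t, still stops the
# moving body at or before the position where the emergency stop ended.

def euclid(p, q):
    return math.dist(p, q)  # Euclidean distance, as the text suggests

def safety_control_limit_time(travel_log, a_max, delta_t):
    """travel_log: list of (time, (x, y) position, speed); last entry is the
    emergency-stop position. Returns the latest qualifying time, or None."""
    stop_pos = travel_log[-1][1]
    latest = None
    for time, pos, speed in travel_log[:-1]:
        remaining = euclid(pos, stop_pos)
        # distance covered during the delay plus the braking distance v^2 / (2a)
        needed = speed * delta_t + speed ** 2 / (2 * a_max)
        if needed <= remaining:
            latest = time
    return latest
```

Under this model, later entries in the travel log have less remaining distance, so the returned value is the last time at which a normal stop was still feasible.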
- The cause identification unit 223 uses the time T_0 calculated by the time calculation unit 222 and the processing log 213 to identify the cause of the emergency stop from among the processes for autonomous driving control executed in the automatic driving system 110. Specifically, based on the period from the time T_0 to the time T at which the moving body 100 stopped due to the emergency stop, the cause identification unit 223 identifies the process that caused the emergency stop from among the process by the detection unit 111 (hereinafter also referred to as the detection process), the process by the prediction unit 112 (hereinafter also referred to as the prediction process), and the process by the determination unit 113 (hereinafter also referred to as the determination process). This processing is described below.
- The emergency stop is executed to avoid a collision with an obstacle, and at least the obstacle related to the emergency stop has been detected by the automatic driving system 110 at the time T′ when the emergency stop was started.
- An object tracking technique that associates an obstacle across a time series is used for detecting the speed of an obstacle and predicting its movement path.
- The automatic driving system 110 according to the present embodiment also adopts this object tracking technology and, using it, equates the obstacle related to the emergency stop with the obstacle detected at another time, based on the information acquired by the prediction unit 112 or information acquired from the outside.
- the obstacle related to the emergency stop will be referred to as a target.
- Whether the detection process, the prediction process, or the judgment process is the cause of the emergency stop is determined from the result of the process for the target.
- The position of the target at time t output by the detection unit 111 is represented by detection(target, t). When the detection unit 111 has not output the position of the target at time t, detection(target, t) = NULL. When detection(target, t) = NULL at the times up to the safety control limit time, that is, when the detection result required to start the safety control was not output in time, the cause identification unit 223 identifies the process by the detection unit 111 as the cause leading to the emergency stop.
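The detection-cause check can be sketched as below, with NULL modeled as Python's None. The dict-based log format and the "no output at any time up to T_0" criterion are assumptions consistent with the text, not the patent's exact equation.

```python
# Sketch of the detection-cause check: detection(target, t) = NULL (None) for
# every logged time up to the safety control limit time t0 means the detection
# process is identified as the cause of the emergency stop.

def detection_is_cause(detection_log, t0):
    """detection_log: dict mapping time -> target position, or None for NULL."""
    times = [t for t in detection_log if t <= t0]
    return bool(times) and all(detection_log[t] is None for t in times)
```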
- the process by the prediction unit 112 may be the cause of the emergency stop.
- The probability that the target at time t, as output by the prediction unit 112, moves to the position of the target at the time T′ when the emergency stop was started is represented by prediction(target, t).
- When the prediction unit 112 has not output prediction(target, t), or its value is too low, up to the safety control limit time, the cause identification unit 223 identifies the process by the prediction unit 112 as the cause of the emergency stop of the moving body 100. To obtain the position of the target at the time T′, the output detection(target, T′) of the detection unit 111, which is the processing in the preceding stage, can be used.
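The prediction-cause check can be sketched as follows, again with NULL as None. The probability threshold and the log format are assumptions; the extract does not reproduce the patent's actual condition on prediction(target, t).

```python
# Sketch of the prediction-cause check: the prediction process is identified as
# the cause when, up to the safety control limit time t0, every output
# prediction(target, t) was NULL (None) or below a confidence threshold.

def prediction_is_cause(prediction_log, t0, threshold=0.5):
    """prediction_log: dict mapping time -> probability in [0, 1], or None."""
    probs = [p for t, p in prediction_log.items() if t <= t0]
    return bool(probs) and all(p is None or p < threshold for p in probs)
```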
- the processing by the determination unit 113 may be the cause of the emergency stop.
- If the position of the target at the time T′ when the emergency stop was started could be predicted, the moving body 100 could have decelerated at or below the maximum deceleration of normal autonomous driving control and stopped without making an emergency stop. Therefore, whether or not the processing by the determination unit 113 is the cause of the emergency stop is determined by whether or not the determination unit 113 output an instruction for deceleration at a deceleration that satisfies this condition.
- The deceleration of the moving body 100 at time t output by the determination unit 113 is represented by a_t.
- If a_t = NULL, that is, no deceleration instruction was output, or if Equation 2 is satisfied,
- the cause identification unit 223 identifies the process by the determination unit 113 as the cause leading to the emergency stop. Equation 2 indicates that the stopping distance of the moving body 100 at deceleration a_t is greater than the distance to the position where the moving body 100 stops by the emergency stop.
- Conversely, when Equation 2 is not satisfied, it means that the determination unit 113 of the automatic driving system 110 output a braking instruction at a deceleration that would have stopped the moving body 100 before the actual stop position without executing the emergency stop. Nevertheless, the fact that the emergency stop was executed means that the cause of the emergency stop lies outside the processing by the automatic driving system 110. In other words, it can be said that the moving body 100 is in a situation where it is impossible to continue traveling under driving control by the automatic driving system 110. In the present embodiment, the moving body 100 in such a situation is treated as if a system error has occurred, and the cause identification unit 223 outputs information indicating "autonomous driving restart impossible" to the transmission unit 240.
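Equation 2 itself is not reproduced in this text; under the usual constant-deceleration model, the stopping distance from speed v at deceleration a_t is v² / (2·a_t). The sketch below applies the criterion described above under that assumption; all names are illustrative, not from the embodiment:

```python
def determination_is_cause(a_t, v, d):
    """Hypothetical check of the determination-unit criterion.
    a_t : deceleration instructed by the determination unit at time t
          (None corresponds to NULL, i.e. no deceleration instruction)
    v   : speed of the moving body 100 at time t
    d   : distance to the position where the moving body 100 stops by
          the emergency stop
    Returns True when the determination process is identified as the
    cause: either no instruction was output, or the stopping distance
    v**2 / (2 * a_t) exceeds d (one plausible reading of Equation 2).
    Returning False corresponds to the system-error case in the text:
    the instructed braking would have sufficed, yet the stop occurred."""
    if a_t is None:
        return True
    stopping_distance = v ** 2 / (2 * a_t)
    return stopping_distance > d
```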
- As described above, the first information processing unit 220 detects the occurrence of an emergency stop and identifies, among the processes for autonomous driving control executed by the functional components of the automatic driving system 110, the process that caused the emergency stop. The identified process is notified from the first information processing unit 220 to the second information processing unit 230.
- The second information processing unit 230 has, as functional components, an image generation unit 231, a situation database (also referred to as situation DB in FIG. 11 described later) 232, a UI control unit 233, and a travel restartability determination unit 234. These components are realized by a processor or the like.
- The image generation unit 231 generates an image according to the process identified by the first information processing unit 220 as the cause of the emergency stop. For example, the image generation unit 231 uses information extracted from the log recording unit 210 to generate an image showing the situation in which the moving body 100 reached the emergency stop. The image generation unit 231 generates such images corresponding to, for example, the period from the safety control limit time to the time when the moving body stops due to the emergency stop. The generated images are presented, via the output device 21 described above, to the operator involved in determining whether or not the autonomous travel of the moving body 100 can be resumed. These images will be described later with reference to FIGS. 2A to 7B.
- The following are examples of candidates for the above situation regarding processing by the detection unit 111.
- Candidates for the situation are listed in advance by the developer or the servicer and registered in the situation database 232.
- When a large number of candidates are registered in the situation database 232, it becomes difficult for the operator to make an appropriate selection.
- Conversely, when there are few candidates or there are omissions in the list prepared in advance, situations that do not correspond to any of the candidates may be selected many times.
- Therefore, narrowing down of candidates or provisional selection of candidates, as described later, may be executed.
- For example, when the number of candidates is a first threshold value or more, when the number of candidates is less than a second threshold value, when the number of selections of "not applicable" (a situation that does not correspond to any candidate) becomes a third threshold value or more, or when there is a candidate whose number of selections within a predetermined period is less than a fourth threshold value, an alert may be notified to the operator.
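The four threshold checks for maintaining the situation database can be sketched as follows; the function name, the alert texts, and all default threshold values are illustrative assumptions, not values from the embodiment:

```python
def candidate_alerts(candidates, selection_counts, not_applicable_count,
                     th1=50, th2=3, th3=10, th4=1):
    """Return alert messages for maintaining the situation database 232.
    candidates          : list of registered situation candidates
    selection_counts    : dict candidate -> number of selections within
                          a predetermined period
    not_applicable_count: number of "not applicable" selections"""
    alerts = []
    if len(candidates) >= th1:          # first threshold: too many
        alerts.append("too many candidates: selection is difficult")
    if len(candidates) < th2:           # second threshold: too few
        alerts.append("too few candidates: list may have omissions")
    if not_applicable_count >= th3:     # third threshold
        alerts.append('"not applicable" chosen often: add candidates')
    for c in candidates:                # fourth threshold per candidate
        if selection_counts.get(c, 0) < th4:
            alerts.append(f"rarely selected candidate: {c}")
    return alerts
```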
- The video or system log of cases in which "not applicable" was selected may be stored so as not to be deleted, for use in updating the candidates. Further, if possible, a free-form description by the operator or the like may also be stored. By using the information stored in this way, the selection of new candidates can be made more efficient.
- The automatic driving system may also be improved based on the stored information. This makes it possible for developers or service designers to understand the scenes in which emergency stops occur frequently, so that the automatic driving system can be updated to avoid emergency stops, the service route can be changed to one on which emergency stops are less likely to occur, or the speed limit can be reset to a speed at which emergency stops are unlikely to occur. As a result, improvement of services by the automatic driving system can be expected.
- The UI control unit 233 uses the output device 21 to present to the operator the processing in the automatic driving system 110 identified as the cause of the emergency stop. The details of the UI will be described later using an example.
- The travel restartability determination unit 234 determines whether or not travel can be resumed based on the operator's input for the candidates of situations in which an emergency stop may occur.
- The travel restartability determination unit 234 is realized by, for example, an AI (artificial intelligence) model that receives the occurrence status of an emergency stop as input and outputs a determination result as to whether travel by autonomous driving of the autonomous vehicle can be resumed. The method of determining whether or not travel can be resumed will be described later using an example.
- The transmission unit 240 transmits a travel restartability signal by wireless communication to the reception unit 120 included in the moving body 100.
- FIGS. 2A to 7B are schematic views of image examples generated according to the type of processing identified as the cause of the emergency stop by the first information processing unit 220.
- FIGS. 2A and 2B correspond to the case where the detection processing is identified as the cause that led to the emergency stop, and relate to the period from the safety control limit time T0 to the time Ta at which the detection unit 111 starts detection of the target, using data generated by the observation unit 11.
- FIG. 2A is a schematic diagram of an image generated by the image generation unit 231 based on data output by the vehicle-mounted camera serving as the observation unit 11 of the moving body 100 (the same applies to FIGS. 3A, 4A, 5A, 6A, and 7A described later).
- FIG. 2B is a schematic diagram of an image generated by the image generation unit 231 based on data (after object extraction processing) output by the LiDAR serving as the observation unit 11 of the moving body 100 (the same applies to FIGS. 3B, 4B, 5B, 6B, and 7B described later).
- FIGS. 2A and 2B are based on the data acquired by the observation unit 11 at the safety control limit time T 0.
- FIG. 2B is drawn from a viewpoint above the moving body 100 (the same applies to FIGS. 3B, 4B, 5B, 6B, and 7B described later).
- The semi-transparent humanoid image with the dotted outline in FIGS. 2A and 2B is an image of the target (a human in this example) sensed at time Ta, superimposed by the image generation unit 231 based on the sensor log 211 in the log recording unit 210.
- In FIG. 2A, in the area to the left of the image, the figure of a person on the sidewalk, as viewed from the moving body 100 (observation unit 11), is almost entirely hidden by a columnar object ("pillar" in the figure).
- In FIG. 2B, it can be seen that this person on the sidewalk does not appear in the data output by the LiDAR.
- The operator refers to these images in chronological order, including the safety control limit time T0, in order to understand the situation in which the moving body 100 reached the emergency stop. For example, by looking at these images, the operator understands that a situation occurred in which the detection unit 111 could not acquire the position of the target by the safety control limit time T0 because, before the emergency stop of the moving body 100, the target was hidden as viewed from the moving body 100.
- FIGS. 3A and 3B are schematic diagrams for explaining images generated by the image generation unit 231 using logs of data generated by the observation unit 11 and data generated by the automatic driving system 110 from the time Ta at which the detection unit 111 started detection of the target to the time T at which the moving body 100 stopped due to the emergency stop.
- The humanoid image in FIGS. 3A and 3B is an image of the person that is the target of the detection process at time Ta, which in FIGS. 2A and 2B was represented only by a dotted outline.
- An outline of an alternate long and short dash line is superimposed on this image of the target by the image generation unit 231, indicating that it matches the image on which the determination unit 113 based its decision to execute the emergency stop.
- An operator who sees the images of FIGS. 3A and 3B as a continuation of the images of FIGS. 2A and 2B can understand that, in the automatic driving system 110, the emergency stop of the moving body 100 was executed for the target whose detection the detection unit 111 started at a time later than time T0.
- FIGS. 4A and 4B correspond to the case where the prediction processing is identified as the cause that led to the emergency stop, and relate to the period from the time Ta at which the detection unit 111 started detection of the target, through the safety control limit time T0, to the time Tb at which the prediction unit 112 output the predetermined prediction result.
- Of the humanoid images appearing in FIGS. 4A and 4B, only the leftmost one is an image that was actually captured, represented in the observation data acquired by the observation unit 11, for example an image of the target at time Ta.
- The other two humanoid images are superimposed by the image generation unit 231 based on information acquired from the log recording unit 210.
- The humanoid image with the dotted outline, located at a place off the sidewalk to the right, is an image indicating the position of the target after a predetermined time, output by the prediction unit 112 at time Tb.
- The image located at the origin of the arrow indicates the position of the target at a time after Ta and before the time Tb at which the prediction unit 112 predicted the position after the predetermined time. That is, the arrow indicates the direction and magnitude of the movement of the target predicted by the prediction unit 112 from the image located at its starting point.
- An operator looking at the images of FIGS. 4A and 4B can determine whether or not there is a problem in the prediction process by the prediction unit 112, taking into account, for example, the surroundings of the moving body 100 and the target grasped from the images, or the time difference from the safety control limit time T0 and the predicted position of the target. If it is determined that there is a problem in the prediction process by the prediction unit 112, the operator may determine that the travel of the moving body 100 cannot be restarted by autonomous driving.
- FIGS. 5A and 5B are schematic diagrams for explaining images generated by the image generation unit 231 using logs of data generated by the observation unit 11 and data generated by the automatic driving system 110 from the time Tb at which the prediction unit 112 output the prediction result related to the emergency stop to the time T at which the moving body 100 stopped due to the emergency stop.
- the humanoid image shown by the solid line in FIGS. 5A and 5B is the image of the target represented by the observation data acquired by the observation unit 11 at time T b.
- the arrow indicates the direction and magnitude of the movement of the target predicted by the prediction unit 112 from this image located at the starting point.
- The humanoid image with the outline of an alternate long and short dash line, located off the sidewalk to the right, indicates the position of the target related to this prediction, and is also the image on which the determination unit 113 based its decision to execute the emergency stop.
- For example, in the continuation of the images of FIGS. 4A and 4B, the image of the target, the arrow, and the broken-line humanoid move over time, and for the humanoid image on which the decision to execute the emergency stop was based, the broken line changes to an alternate long and short dash line.
- An operator who sees the images of FIGS. 5A and 5B as a continuation of the images of FIGS. 4A and 4B can easily grasp the time difference between the time corresponding to the image at which the broken line changed to the alternate long and short dash line and the safety control limit time T0.
- FIGS. 6A and 6B are schematic diagrams for explaining images generated by the image generation unit 231 using logs of data generated by the observation unit 11 and data generated by the automatic driving system 110 from the time Tb at which the prediction unit 112 output the prediction result related to the emergency stop to the time Tc at which the determination unit 113 output the instruction for control based on the determination result related to the emergency stop.
- Each of the images of FIGS. 6A and 6B includes, based on data acquired by the observation unit 11 at time Tb, an image at time Tb of a moving body ("other moving body" in the figure) that is a target located in front of the moving body 100.
- The broken-line arrow and the dotted line segment seen at the lower right of the image of the other moving body in FIGS. 6A and 6B are superimposed by the image generation unit 231 based on information acquired from the log recording unit 210.
- the broken line arrow indicates the route of the moving body 100 planned by the determination unit 113 by the time T b.
- the dotted line segment indicates the planned stop position related to the emergency stop instruction output by the determination unit 113 at time T c.
- the operator who sees the images of FIGS. 6A and 6B can confirm, for example, the situation around the planned stop position, the moving body 100, and the target grasped from the images.
- FIGS. 7A and 7B are schematic diagrams for explaining images generated by the image generation unit 231 using logs of data generated by the observation unit 11 and data generated by the automatic driving system 110 from the time Tc at which the determination unit 113 output the instruction for control based on the determination result related to the emergency stop to the time T at which the moving body 100 stopped due to the emergency stop.
- Each of the images of FIGS. 7A and 7B includes, based on data acquired by the observation unit 11 at time Tc, images at time Tc of two moving bodies ("other moving body A" and "other moving body B" in the figure) that are targets located in front of the moving body 100.
- The band and line segment of the alternate long and short dash line seen at the lower right of the image of the other moving body A in FIGS. 7A and 7B, and the arrow seen under the other moving body B, are superimposed by the image generation unit 231 based on information acquired from the log recording unit 210.
- The band and line segment of the alternate long and short dash line indicate the route and stop position of the moving body 100 planned by the determination unit 113 in the determination regarding the emergency stop, which also coincide with the route the moving body 100 actually traveled up to time T and its stop position at time T. That is, the plan by the determination unit 113 and the control result of the moving body 100 by the moving body control unit 12 are in agreement.
- The arrow indicates the predicted path of the other moving body B output by the prediction unit 112, which is related to the control instruction output by the determination unit 113 by time Tc.
- An operator looking at the images of FIGS. 7A and 7B can, for example, compare the route and stop position planned for the emergency stop with the travel path and stop position of the moving body 100 as the control result, and confirm the situation around the target at the time the control instruction for the emergency stop was output.
- UI configuration: the configuration of the UI provided by the UI control unit 233 to the operator via the output device 21 and the input device 22 will be described below with reference to an example.
- the UI includes a component that presents to the operator the processing in the automatic driving system 110 identified by the cause identification unit 223 as the cause of the moving body 100 reaching the emergency stop.
- The UI further includes a component that presents to the operator candidates, obtained from the situation database 232, for situations in which an emergency stop may occur according to the above process identified as the cause of the emergency stop.
- The UI also includes a component that presents the image generated by the image generation unit 231, which allows the operator to confirm the actual situation in which the emergency stop occurred in order to select an appropriate candidate from the above candidates.
- The UI further includes a component that presents to the operator the determination by the travel restartability determination unit 234 as to whether or not the autonomous travel of the moving body 100 can be resumed.
- The UI presented to the operator when the process by the detection unit 111 is identified as the cause of the moving body 100 reaching the emergency stop is as shown in FIGS. 8 to 10.
- These UIs are a series of screens including examples of each of the components listed above.
- the operator can operate the seek bar in the middle of the screen to see an image of the surroundings of the moving body 100 before and after the time when the moving body 100 reaches an emergency stop.
- This image is an image generated by the image generation unit 231.
- FIG. 8 is a screen that presents this image and candidates for situations in which an emergency stop may occur as options. On this screen, the operator has not yet entered a selection of the situation.
- The two images arranged side by side above the sentence indicating the identified cause are examples of images generated by the image generation unit 231 from the logs held in the log recording unit 210 and presented to the operator for confirmation of the circumstances of the emergency stop.
- The images presented for confirmation of the occurrence of this emergency stop include the images illustrated in FIGS. 2A and 2B.
- Next, the result of the determination by the travel restartability determination unit 234 as to whether or not travel can be resumed is entered. This determination is based on the operator's input regarding the situation in which the emergency stop occurred.
- The wording "travel restart OK", indicating the result of the determination by the travel restartability determination unit 234 as to whether or not travel can be resumed, is displayed.
- The phrase "emergency stop has worked normally" is also displayed to indicate that the automatic driving system 110 is normal.
- Alternatively, the wording "travel restart NG", indicating the result of the determination by the travel restartability determination unit 234 as to whether or not travel can be resumed, is displayed.
- In that case, the phrase "abnormal operation of emergency stop. Cannot resume running." is also displayed to indicate that the automatic driving system 110 may be abnormal.
- The UI also includes buttons that accept operator input.
- A travel restartability signal indicating whether or not travel is to be resumed is transmitted from the transmission unit 240 to the moving body 100 according to the result of this determination.
- In this example, the operator only confirms the determination result, but the present invention is not limited to this.
- the UI may include a component that allows the operator to select approval or denial of the determination result (and redo the situation selection).
- The travel restartability determination unit 234 determines whether or not the travel of the moving body 100 can be resumed by autonomous driving in response to the operator's input of the situation selected from the candidates for situations in which an emergency stop may occur.
- An example of the method by which the travel restartability determination unit 234 determines whether or not travel can be resumed, for each process of the automatic driving system 110 identified as having an abnormality, will be described below.
- For the situation selected and input by the operator as the applicable situation, a parameter indicating the safety level or risk level of resuming the execution of autonomous driving control is used.
- For example, a parameter is used that takes a value of 1 when restart is safe and a value of 0 when restart is unsafe or dangerous.
- Each candidate situation presented to the operator may be given a parameter value in advance in the situation database.
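As a minimal sketch, the situation database can hold the 0/1 safety parameter alongside each candidate, and the travel restartability determination can be derived from the operator's selections. The candidate names and parameter values below are hypothetical examples for illustration, not entries from the embodiment:

```python
# Hypothetical pre-registered parameter values in the situation
# database 232 (1: safe to restart, 0: unsafe or dangerous to restart).
SITUATION_PARAMETERS = {
    "obstacle started moving": 1,
    "obstacle was moving but stopped": 1,
    "obstacle changed direction while moving": 1,
    "obstacle moved predictably in plain view": 0,
}

def can_resume_travel(selected_situations):
    """Travel may be resumed only if every situation selected by the
    operator carries parameter value 1; any 0 (or an unregistered
    situation) blocks the restart."""
    return all(SITUATION_PARAMETERS.get(s, 0) == 1
               for s in selected_situations)
```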
- For example, a candidate "assimilated with the background" is newly registered in the situation database 232. If the detection process can detect up to the class of the object, measures such as registering separate candidates according to the class can be considered.
- The server determines whether or not the overlap area between the obstacle and another obstacle or a blind spot area exceeds a threshold value; if the threshold is exceeded, information indicating the overlap with the other obstacle or the blind spot area, or information indicating the degree of the overlap, may be superimposed on the video at that time, and the superimposed image may be presented to the operator. In this way, ambiguity can be efficiently eliminated by having the operator confirm an image on which information useful for the judgment is superimposed.
- the candidates may be narrowed down in advance and the narrowed down candidates may be presented.
- Alternatively, the result of a provisional selection by the server may be presented and approved by the operator. For example, based on the state of overlap between obstacles and other obstacles or blind spot areas in the past and the situations selected by the operator at those times, the candidates may be narrowed down to two or more situations, and the narrowed-down candidates may be presented to the operator. Further, instead of narrowing down, one situation candidate may be tentatively selected, and the tentatively selected situation candidate may be presented to the operator.
- Suppose the operator selects and inputs one or more situations stating that the obstacle related to the emergency stop "started moving", "was moving but stopped", or "changed its direction of travel while moving". In this case, the obstacle is difficult to predict because it suddenly started or stopped moving, or its orientation changed in a way that is difficult to predict. Therefore, in autonomous driving by the automatic driving system 110, the occurrence of the emergency stop is unavoidable; that is, it is regarded as safe to restart driving rather than as an abnormality of the automatic driving system 110, and 1 is given as the parameter value.
- Such changes in movement can be expressed numerically by associating the detection results of the positions of the obstacle in chronological order. Therefore, the server may detect acceleration or deceleration of movement exceeding a threshold value, or a change in direction, superimpose the detection result on the image at that time, and present the superimposed image to the operator. In this way, ambiguity can be efficiently eliminated by having the operator confirm an image on which information useful for the judgment is superimposed.
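The chronological association just described can be sketched as follows: finite differences over a time series of detected positions give speed and heading, and changes exceeding thresholds flag the sudden start, stop, or turn. All names, the fixed sampling interval, and the threshold values are illustrative assumptions:

```python
import math

def sudden_motion_events(track, dt=0.1, accel_th=2.0,
                         heading_th=math.radians(45)):
    """track: list of (x, y) obstacle positions detected at a fixed
    interval dt seconds, associated in chronological order.
    Flags sample indices where the per-step speed change exceeds
    accel_th * dt, or the heading change exceeds heading_th."""
    events = []
    vels = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        vels.append(((x1 - x0) / dt, (y1 - y0) / dt))  # velocity estimate
    for i, ((vx0, vy0), (vx1, vy1)) in enumerate(zip(vels, vels[1:]),
                                                 start=1):
        s0, s1 = math.hypot(vx0, vy0), math.hypot(vx1, vy1)
        if abs(s1 - s0) > accel_th * dt:
            events.append((i, "accel/decel"))
        elif s0 > 0 and s1 > 0:
            h0, h1 = math.atan2(vy0, vx0), math.atan2(vy1, vx1)
            # wrap the heading difference into (-pi, pi]
            dh = abs(math.atan2(math.sin(h1 - h0), math.cos(h1 - h0)))
            if dh > heading_th:
                events.append((i, "direction change"))
    return events
```

The flagged indices could then be used to pick the video frames on which the server superimposes the detection result for the operator.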
- the candidates may be narrowed down in advance and the narrowed down candidates may be presented.
- Alternatively, a separate process may be performed on the server side to make a selection in advance, and the operator may then confirm the selection result. For example, based on the acceleration/deceleration or direction changes of obstacle movement in the past and the situations selected by the operator at those times, the candidates may be narrowed down to two or more situations, and the narrowed-down candidates may be presented to the operator.
- Further, instead of narrowing down, one situation candidate may be tentatively selected, and the tentatively selected situation candidate may be presented to the operator.
- The above applies when the processing result of the candidate selection on the server side can be sufficiently trusted. This is because it is considered that the server can estimate the situation based on what the operator has confirmed in past occurrences of this kind.
- Conversely, when the operator's input indicates that the movement of the obstacle was predictable, a decision to decelerate so as not to get too close to the obstacle ahead should have been made and deceleration control executed, and no emergency stop should have occurred. That is, there is a possibility that the automatic driving system 110 has an abnormality, and it is unsafe to resume driving. Therefore, 0 is given as the parameter value.
- Also in this case, the server may detect acceleration or deceleration of movement exceeding a threshold value, or a change in direction, superimpose the detection result on the image at that time, and present the superimposed image to the operator. In this way, ambiguity can be efficiently eliminated by having the operator confirm an image on which information useful for the judgment is superimposed.
- Alternatively, a separate process may be performed on the server side to make a selection in advance, and the operator may then confirm the selection result. For example, based on the acceleration/deceleration or direction changes of obstacle movement in the past and the situations selected by the operator at those times, a situation candidate may be tentatively selected, and the tentatively selected situation candidate may be presented to the operator.
- Alternatively, situation candidates may be registered for each obstacle class, and based on the acceleration/deceleration or direction changes of obstacle movement for each class in the past and the situations selected by the operator at those times, the candidates may be narrowed down to two or more situations, and the narrowed-down candidates may be presented to the operator. Further, instead of narrowing down, one situation candidate may be tentatively selected, and the tentatively selected situation candidate may be presented to the operator.
- When the cause identification unit 223 identifies "another obstacle" as a cause attributable to the processing by the detection unit 111 or the prediction unit 112, the parameter value may be given by having the operator, in each case, again select and input from the candidates of the above situations.
- The travel restartability determination unit 234 determines that travel can be resumed when the value given to the parameter for the operator's input is 1, and that travel cannot be resumed when it is 0.
- The determination result is output to the UI control unit 233.
- The UI control unit 233 controls the display in the "AI automatic determination" column according to the determination result input from the travel restartability determination unit 234.
- the emergency stop may be unavoidable.
- The image generation unit 231 acquires, from the identification result of the cause identification unit 223, how much time difference from the safety control limit time T0 the output of the identified process is expected to have, and generates images accordingly, as described above.
- Via the input device 22, the operator follows the images generated by the image generation unit 231 in chronological order and inputs which situation candidate the image at each time corresponds to.
- The travel restartability determination unit 234 can determine whether or not travel can be resumed depending on whether or not the time from the change of the situation until the obstacle is detected exceeds the allowable value of the processing delay. Note that the time from when the situation changes (or from when the obstacle is detected after the situation changes) until the position of the obstacle is predicted, or the time from when the situation changes (or from when the position of the obstacle is predicted after the situation changes) until the emergency stop is determined, may instead be compared with the allowable value.
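The delay comparison just described can be sketched as follows. The timestamp names and the idea of one allowable value per stage are assumptions for illustration; the embodiment only requires that some such delay be compared with an allowable value:

```python
def delay_within_allowance(t_change, t_detect, t_predict, t_decide,
                           allow_detect, allow_predict, allow_decide):
    """Compare each stage's processing delay against its allowable value.
    t_change : time at which the situation changed
    t_detect : time at which the obstacle was then detected
    t_predict: time at which the obstacle's position was predicted
    t_decide : time at which the emergency stop was determined
    Returns per-stage booleans; travel may be resumed only when every
    stage is within its allowance."""
    result = {
        "detection": (t_detect - t_change) <= allow_detect,
        "prediction": (t_predict - t_detect) <= allow_predict,
        "determination": (t_decide - t_predict) <= allow_decide,
    }
    result["can_resume"] = all(result.values())
    return result
```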
- FIG. 11 is a flowchart showing a procedure example of the information processing method according to the present embodiment.
- When the remote server 200 acquires any of the sensor log 211, the travel log 212, and the processing log 213 transmitted from the automatic driving system 110 of the moving body 100, it records these logs in the log recording unit 210 (step S10).
- Next, the operation detection unit 221 of the first information processing unit 220 refers to the processing log 213 or the travel log 212 to detect the occurrence of an emergency stop in the moving body 100 (step S11). If no emergency stop has occurred (No in step S11), the processing for the logs acquired in step S10 ends.
- When the occurrence of an emergency stop is detected (Yes in step S11), the time calculation unit 222 and the cause identification unit 223 of the first information processing unit 220 determine whether the cause of the emergency stop is identified as a process by any of the detection unit 111, the prediction unit 112, and the determination unit 113 of the automatic driving system 110, or whether it is a system error (step S12). If it is determined that the cause of the emergency stop is a system error (No in step S12), a travel restartability signal indicating that autonomous driving cannot be resumed is transmitted from the cause identification unit 223 through the transmission unit 240 to the moving body 100 (step S18).
- When the cause of the emergency stop is a process in the automatic driving system 110 (Yes in step S12), the UI control unit 233 of the second information processing unit 230 acquires, from the situation database 232, candidates for situations in which an emergency stop may occur, according to the process identified as the cause of the emergency stop (step S13).
- Next, the UI control unit 233 presents to the operator, via the output device 21, a UI including an image, generated by the image generation unit 231 using information extracted from the log recording unit 210, showing the actual situation in which the moving body 100 reached the emergency stop, and the situation candidates acquired in step S13 (step S14).
- The UI presented here is, for example, a screen as shown in FIG. 8, on which the operator selects and inputs the appropriate candidate as an explanation of the situation leading to the emergency stop.
- The travel restartability determination unit 234 determines whether or not the travel of the moving body 100 can be resumed by autonomous driving according to the situation that the operator, via the input device 22, selected from the situation candidates as corresponding to the actual situation (step S15).
- The determination here may be a tentative determination, and the final determination may be made after further input regarding the situation is obtained from the operator. If there is a UI for requesting further input about the situation from the operator (Yes in step S16), the UI control unit 233 repeats the procedure from step S14.
- Otherwise (No in step S16), the UI control unit 233 presents to the operator, via the output device 21, a UI reflecting the determination result of the travel restartability determination unit 234, and requests approval (or confirmation) (step S17). For example, a screen as shown in FIG. 9 or FIG. 10 is presented.
- Step S19: When the UI control unit 233 detects that the operator has approved, via the input device 22, the determination result on whether travel can be resumed (Yes in step S17), a travel resumption availability signal reflecting the approved determination result is transmitted from the transmission unit 240 to the moving body 100 (step S19). If the operator declines to approve the determination result (No in step S17), the procedure returns to step S14 in this example. In step S14 and the subsequent steps in this case, the UI presented earlier may be presented again, for example, to have the operator reconfirm and reselect the corresponding situation.
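The flow of steps S11 to S19 above can be pictured as a small control loop. The following Python sketch is illustrative only: the function names, the dictionary return value, and the modeling of operator input as callbacks are assumptions for explanation, not the disclosed implementation.

```python
# Hypothetical sketch of the remote-server procedure (steps S11-S19).
SYSTEM_ERROR = "system_error"

# S13: situation candidates per identified process (cf. situation database 232)
SITUATION_DB = {
    "detection": ["hidden in a blind spot", "partially/temporarily hidden"],
    "prediction": ["started moving", "was moving but stopped",
                   "changed direction while moving"],
    "judgment": ["another obstacle unexpectedly entered the planned route"],
}

def handle_emergency_stop(cause, select_situation, approve):
    """cause: result of S12; select_situation/approve model the operator's input."""
    if cause == SYSTEM_ERROR:                       # No in S12
        return {"resume": False, "reason": "system error"}        # S18
    candidates = SITUATION_DB[cause]                # S13
    while True:
        selected = select_situation(candidates)     # S14: operator picks a candidate
        resume = selected in candidates             # S15: unavoidable situation -> safe
        if approve(resume):                         # S17: operator approves the result
            return {"resume": resume, "reason": selected}         # S19
        # No in S17: loop back to S14 for reconfirmation/reselection

# Example: detection-caused stop; the operator selects the blind-spot situation.
result = handle_emergency_stop(
    "detection",
    select_situation=lambda cands: cands[0],
    approve=lambda decision: True,
)
print(result["resume"])  # True: travel may be resumed
```

The loop mirrors the "No in step S17" branch: an unapproved result simply re-presents the UI.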
- The safety control has been described above as a control that stops the vehicle, under normal autonomous driving control, at the position where it stopped by the emergency stop, but the mode of the safety control is not limited to this.
- The safety control may, for example, steer and accelerate or decelerate as necessary, and may be a control that keeps the vehicle driving while avoiding the obstacle related to the emergency stop in the actual travel.
- In the above, the value of the parameter indicating the safety level or risk level is determined according to the situation that the operator selected from the candidates as the corresponding situation and input, but the determination is not limited to this.
- The value of the above parameter may be related to the time, within the period up to the safety control limit time, to which the input situation candidate applies. More specifically, the closer the unexpected course entry of another moving object occurred to the safety control limit time, the less likely the emergency stop is to indicate an abnormality in the operation of the autonomous driving system, and the higher the safety level.
- As another example, the parameter value may include a safety level or risk level based on the length of the delay, from the safety control limit time, of the output of the process identified as the cause of the emergency stop.
- Further, since frequent emergency stops under a specific candidate situation can be regarded as dangerous driving, the value indicating the safety level or risk level may be updated each time a certain number of inputs have been made for that candidate.
- The values of the parameters listed here are not limited to binary values such as 0 and 1; values in more steps may also be used.
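As one way to picture these variations, the sketch below derives a multi-step safety parameter from the delay of the causal process's output past the safety control limit time, and lowers it as inputs accumulate for a candidate. The thresholds, step sizes, and update interval are assumptions for illustration, not values from the disclosure.

```python
# Hypothetical multi-step safety parameter (0.0-1.0). Thresholds and the
# frequency-based update rule are illustrative assumptions only.

def safety_from_delay(delay_s: float) -> float:
    """Longer delay of the causal process's output past the safety control
    limit time -> lower safety level."""
    if delay_s <= 0.0:
        return 1.0          # output in time: the emergency stop was not abnormal
    if delay_s < 0.5:
        return 0.7
    if delay_s < 2.0:
        return 0.3
    return 0.0              # long delay: likely abnormal operation

def update_on_frequency(value: float, count: int, every: int = 10,
                        step: float = 0.1) -> float:
    """Lower the safety value each time a candidate accumulates `every` inputs,
    since frequent stops under one situation suggest dangerous driving."""
    return max(0.0, value - step * (count // every))

print(safety_from_delay(0.0))        # 1.0
print(update_on_frequency(1.0, 25))  # 0.8
```

The point of the multi-step scale is simply that resumption need not be an all-or-nothing decision on a 0/1 parameter.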
- Some or all of the functional components included in each of the information processing systems described above may be implemented as one system LSI (Large Scale Integration).
- A system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on one chip, and is specifically a computer system including a microprocessor, a ROM, a RAM, and the like.
- a computer program is stored in the ROM. When the microprocessor operates according to this computer program, the system LSI achieves the function of each component.
- Although the term system LSI is used here, it may also be called an IC, an LSI, a super LSI, or an ultra LSI depending on the degree of integration. Further, the method of circuit integration is not limited to LSI and may be realized by a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array), which can be programmed after the LSI is manufactured, or a reconfigurable processor, in which the connections and settings of the circuit cells inside the LSI can be reconfigured, may also be used.
- One aspect of the present disclosure is not limited to the information processing method described above with reference to the flowchart of FIG. 11; it may be a program executed by a computer, or an information processing system including the computer. One aspect of the present disclosure may also be a computer-readable, non-transitory recording medium on which such a computer program is recorded.
- The technology according to the present disclosure can be used as an information processing method executed when an emergency stop of an autonomously traveling moving body occurs, as an information processing system that executes such a method, or as a program for causing a computer to execute such information processing.
Abstract
Description
FIG. 1 is an overall configuration diagram of a system according to the present embodiment, including a moving body equipped with an automatic operation system and a remote server.
The moving body 100 includes an automatic operation system 110, a receiving unit 120, an observation unit 11, and a moving body control unit 12.
The remote server 200 is an example of an information processing system including one or more computers each provided with a processor. When an emergency stop occurs in the autonomously traveling moving body 100, the remote server 200 provides the operator with information that can be used to judge whether the emergency stop was caused by an abnormality of the automatic operation system 110 or was unavoidable under the circumstances. The remote server 200 also determines, in response to input from the operator, whether autonomous travel can be resumed, and transmits a signal indicating the result of this determination to the moving body 100.
Let detection(target, t) = NULL.
Let at = NULL.
"The obstacle related to the emergency stop was partially/temporarily hidden by another obstacle"
The following are examples of the above situation candidates concerning processing by the prediction unit 112.
"The obstacle related to the emergency stop was moving but stopped"
"The obstacle related to the emergency stop changed its direction of travel while moving"
The following are examples of the above situation candidates concerning processing by the determination unit 113.
The situation candidates are listed in advance by the developer or servicer based on the processing capability of the automatic operation system 110 and registered in the situation database 232. If a large number of candidates are registered, it becomes difficult for the operator to select appropriately. On the other hand, if there are few candidates, or if the advance listing has omissions, situations that correspond to none of the candidates may occur frequently.
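The registration described above can be modeled as a simple mapping from the identifiable causal process to its pre-listed candidates. The entries below merely echo the examples given in this description; the dictionary shape is an assumed illustration, not the actual schema of the situation database 232.

```python
# Hypothetical shape of the situation database 232: situation candidates,
# listed in advance by the developer or servicer, keyed by the process that
# can be identified as the cause of an emergency stop.

SITUATION_DATABASE = {
    "detection": [
        "obstacle was hidden in a blind spot",
        "obstacle was partially/temporarily hidden by another obstacle",
    ],
    "prediction": [
        "obstacle started moving",
        "obstacle was moving but stopped",
        "obstacle changed its direction of travel while moving",
    ],
    "judgment": [
        "another obstacle unexpectedly entered the planned travel route",
    ],
}

def candidates_for(process: str) -> list:
    """Return the pre-registered candidates for the identified process; an
    empty list models the 'no matching candidate' case the text warns about."""
    return SITUATION_DATABASE.get(process, [])

print(len(candidates_for("prediction")))  # 3
```

Keeping each per-process list short reflects the trade-off stated above: too many candidates make selection hard, too few make omissions likely.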
Next, the screens generated by the image generation unit 231 and presented to the operator as a UI will be described with reference to FIGS. 2A to 7B. The images shown in FIGS. 2A to 7B are schematic diagrams of example images generated according to the type of process identified by the first information processing unit 220 as the cause of the emergency stop.
FIGS. 2A and 2B are schematic diagrams for explaining images that the image generation unit 231 generates using logs of the data generated by the observation unit 11 and by the automatic operation system 110 during the period from the safety control limit time T0 to time Ta, at which the detection unit 111 started detecting the target. More specifically, FIG. 2A is a schematic diagram of an image generated by the image generation unit 231 based on data output by an in-vehicle camera serving as the observation unit 11 of the moving body 100 (the same applies to FIGS. 3A, 4A, 5A, 6A, and 7A described later). FIG. 2B is a schematic diagram of an image generated by the image generation unit 231 based on data (after object extraction processing) output by a LiDAR serving as the observation unit 11 of the moving body 100 (the same applies to FIGS. 3B, 4B, 5B, 6B, and 7B described later).
FIGS. 4A and 4B are schematic diagrams for explaining images that the image generation unit 231 generates using data acquired by the observation unit 11 during the period from time Ta, at which the detection unit 111 started detecting the target, through the safety control limit time T0, to time Tb, at which the prediction unit 112 output a prediction result for the position of the target related to the emergency stop with a probability equal to or higher than a predetermined threshold.
FIGS. 6A and 6B are schematic diagrams for explaining images that the image generation unit 231 generates using logs of the data generated by the observation unit 11 and by the automatic operation system 110 during the period from time Tb, at which the prediction unit 112 output the prediction result related to the emergency stop, to time Tc, at which the determination unit 113 output an instruction for control based on the determination result related to the emergency stop. Each of the images in FIGS. 6A and 6B includes, based on the data acquired by the observation unit 11 at time Tb, the image at time Tb of the moving body that is the target located ahead of the moving body 100 ("other moving body" in the figures). The dashed arrow and the dotted line segment that appear at the lower right of the image of the other moving body in FIGS. 6A and 6B are superimposed by the image generation unit 231 based on information acquired from the log recording unit 210. The dashed arrow indicates the route of the moving body 100 planned by the determination unit 113 up to time Tb. The dotted line segment indicates the planned stop position according to the emergency stop instruction output by the determination unit 113 at time Tc.
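The time-windowed image generation described for FIGS. 2A to 7B can be thought of as slicing the recorded logs by the characteristic times T0 (safety control limit time), Ta (detection start), Tb (prediction output), and Tc (judgment output). The sketch below is an assumed illustration of that slicing only; the relative order of the four times varies with which process was delayed, and the numeric values here are arbitrary.

```python
# Hypothetical slicing of log records by the characteristic times used for
# the FIG. 2A-7B images. Record shape and time values are illustrative.

def slice_log(records, start, end):
    """Return records whose timestamp t satisfies start <= t <= end."""
    return [r for r in records if start <= r["t"] <= end]

log = [{"t": t, "src": "camera"} for t in range(0, 10)]
T0, Ta, Tb, Tc = 1, 3, 6, 8   # arbitrary ordering for this example

print(len(slice_log(log, T0, Ta)))  # 3: basis for the FIG. 2A/2B-style window
print(len(slice_log(log, Tb, Tc)))  # 3: basis for the FIG. 6A/6B-style window
```

Each figure pair then renders one such window, from camera data (the "A" figures) or object-extracted LiDAR data (the "B" figures), with planned routes and stop positions superimposed from the log.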
The configuration of the UI that the UI control unit 233 provides to the operator via the output device 21 and the input device 22 is described below using examples.
The travel resumption determination unit 234 determines, in response to the input of the situation that the operator selected from the candidates for situations in which an emergency stop may occur, whether travel of the moving body 100 can be resumed by autonomous driving. Examples of this determination method are described below for each process of the automatic operation system 110 identified as having an abnormality.
Suppose the situation input as the operator's selection is that the obstacle related to the emergency stop was "hidden in a blind spot" or "partially/temporarily hidden" (see FIG. 9). In these situations, the obstacle is unlikely to appear in the observation data from the observation unit 11, such as a camera or LiDAR, so detection is difficult. Accordingly, the emergency stop was unavoidable under autonomous travel by the automatic operation system 110; that is, it was not an abnormality of the automatic operation system 110, resumption of travel is safe, and a value of 1 is given as the parameter value.
Suppose the operator selects and inputs one or more of the situations in which the obstacle related to the emergency stop "started moving", "was moving but stopped", or "changed its direction of travel while moving". In this case, the obstacle abruptly started or ended its movement, or its heading changed in a way that is hard to predict, making prediction difficult. Accordingly, the emergency stop was unavoidable under autonomous travel by the automatic operation system 110; that is, it was not an abnormality of the automatic operation system 110, resumption of travel is safe, and a value of 1 is given as the parameter value.
Suppose the operator selects and inputs the situation "another obstacle unexpectedly entered the planned travel route of the own moving body". In this case, it is difficult to make the determination to control the speed by deceleration so as not to approach this obstacle too closely. Accordingly, the emergency stop was unavoidable under autonomous travel by the automatic operation system 110; that is, it was not an abnormality of the automatic operation system 110, resumption of travel is safe, and a value of 1 is given as the parameter value.
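The three cases above all resolve to the same outcome, which could be sketched as a lookup of a per-situation safety parameter. The parameter values of 1 follow the cases in this description; the "none of the above" entry, its value of 0, and the threshold are assumed details added for illustration.

```python
# Hypothetical determination by the travel resumption determination unit 234:
# each situation candidate carries a safety parameter. The 1-values follow
# the cases in the text (unavoidable situations -> resumption is safe);
# the 0-valued fallback entry is an assumption.

SAFETY_PARAMETER = {
    "hidden in a blind spot": 1,
    "partially/temporarily hidden": 1,
    "started moving": 1,
    "was moving but stopped": 1,
    "changed direction of travel while moving": 1,
    "unexpected entry onto planned travel route": 1,
    "none of the above": 0,   # assumed: unexplained stop -> do not resume
}

def can_resume(selected, threshold=1.0):
    """Allow resumption only if every selected situation meets the threshold."""
    return all(SAFETY_PARAMETER.get(s, 0) >= threshold for s in selected)

print(can_resume(["hidden in a blind spot"]))  # True
print(can_resume(["none of the above"]))       # False
```

With multi-step parameter values (see the modifications described above), the same lookup works unchanged; only the stored values and the threshold change.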
Next, the procedure of the information processing method according to the present embodiment, executed by the remote server 200 that communicates with the moving body 100, will be described. FIG. 11 is a flowchart showing an example of the procedure of the information processing method according to the present embodiment.
The information processing method according to one or more aspects of the present disclosure is not limited to the description of the above embodiment. Forms obtained by applying various modifications conceivable by those skilled in the art to the above embodiment are also included in the aspects of the present disclosure, as long as they do not depart from the gist of the present disclosure. Examples of such modifications, and other supplementary notes to the description of the embodiment, are given below.
12 moving body control unit
21 output device
22 input device
100 moving body
110 automatic operation system
111 detection unit
112 prediction unit
113 determination unit
120 receiving unit
200 remote server
210 log recording unit
211 sensor log
212 travel log
213 processing log
220 first information processing unit
221 activation detection unit
222 time calculation unit
223 cause identification unit
230 second information processing unit
231 image generation unit
232 situation database
233 UI control unit
234 travel resumption determination unit
240 transmission unit
Claims (12)
- An information processing method executed by a processor, the method comprising:
determining whether an emergency stop for avoiding a collision with an obstacle has occurred in an autonomously traveling moving body;
when it is determined that the emergency stop has occurred, simulating, based on a travel situation history of the moving body, safety control, which is autonomous travel control for causing the moving body to travel autonomously so as to avoid the collision with the obstacle without performing the emergency stop;
obtaining a log of an automatic operation system that executes the autonomous travel control of the moving body, and identifying, based on the autonomous travel control indicated by the log and the simulated safety control, the process that caused the emergency stop from among the processes for the autonomous travel control indicated by the log;
obtaining, from a database of situations in which an emergency stop may occur, situation candidates corresponding to the identified process, and presenting an interface for causing an operator to input the situation candidate that corresponds to the situation indicated by the log; and
determining, according to the input situation candidate, whether the autonomous travel control by the automatic operation system can be resumed.
- The information processing method according to claim 1, wherein the process that caused the emergency stop is identified by:
comparing the safety control with the autonomous travel control indicated by the log to obtain a safety control time, which is the start time of the safety control for executing the safety control within the autonomous travel control of the moving body; and
using the result of determining whether the result of a process required for executing the safety control was output before the safety control time.
- The information processing method according to claim 2, wherein the process required for executing the safety control is at least one of a detection process for obtaining a detection result indicating the surrounding situation of the moving body, a prediction process for predicting the surrounding situation of the moving body using the detection result, and a determination process for deciding a travel route according to the predicted surrounding situation.
- The information processing method according to claim 2 or 3, wherein the interface includes an image generated using the output of the identified process and the observation data input to the automatic operation system.
- The information processing method according to claim 4, wherein the interface includes an image generated by further using the output of the safety control at the safety control time.
- The information processing method according to claim 5, wherein the interface is an interface for causing the situation candidate corresponding to the situation indicated by the log to be input for a point in time at or after the safety control time.
- The information processing method according to any one of claims 2 to 6, wherein whether the autonomous travel control can be resumed is decided based on the value of a parameter, set for the input situation candidate, indicating the safety level or risk level of executing autonomous travel control.
- The information processing method according to claim 7, wherein the parameter is based on the length of the delay of the output of the identified process from the safety control time.
- The information processing method according to claim 7, wherein the parameter relates to the time to which the input situation candidate applies, and the situation candidate is input for the time up to the safety control time.
- The information processing method according to any one of claims 1 to 9, wherein the safety control is control that stops the moving body at the stop position resulting from the emergency stop or at a position before the stop position.
- The information processing method according to any one of claims 1 to 9, wherein the safety control is control that causes the moving body to travel while avoiding the obstacle.
- An information processing system including one or more computers each provided with a processor, wherein the processor:
determines whether an emergency stop for avoiding a collision with an obstacle has occurred in an autonomously traveling moving body;
when it is determined that the emergency stop has occurred, simulates, based on a travel situation history of the moving body, safety control, which is autonomous travel control for causing the moving body to travel autonomously so as to avoid the collision with the obstacle without performing the emergency stop;
obtains a log of an automatic operation system that executes the autonomous travel control of the moving body, and identifies, based on the autonomous travel control indicated by the log and the simulated safety control, the process that caused the emergency stop from among the processes for the autonomous travel control indicated by the log;
obtains, from a database of situations in which an emergency stop may occur, situation candidates corresponding to the identified process, and presents an interface for causing an operator to input the situation candidate that corresponds to the situation indicated by the log; and
determines, according to the input situation candidate, whether the autonomous travel control by the automatic operation system can be resumed.
An information processing system.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP21775655.0A EP4131201B1 (en) | 2020-03-26 | 2021-03-10 | Information processing method and information processing system |
| CN202180006829.0A CN114762022B (zh) | 2020-03-26 | 2021-03-10 | Information processing method and information processing system |
| JP2021543438A JP7555030B2 (ja) | 2020-03-26 | 2021-03-10 | Information processing method and information processing system |
| US17/846,346 US12110039B2 (en) | 2020-03-26 | 2022-06-22 | Information processing method and information processing system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020055906 | 2020-03-26 | ||
| JP2020-055906 | 2020-03-26 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/846,346 Continuation US12110039B2 (en) | 2020-03-26 | 2022-06-22 | Information processing method and information processing system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021193059A1 true WO2021193059A1 (ja) | 2021-09-30 |
Family
ID=77891484
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/009486 Ceased WO2021193059A1 (ja) | 2021-03-10 | Information processing method and information processing system |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US12110039B2 (ja) |
| EP (1) | EP4131201B1 (ja) |
| JP (1) | JP7555030B2 (ja) |
| CN (1) | CN114762022B (ja) |
| WO (1) | WO2021193059A1 (ja) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12157502B2 (en) * | 2022-03-01 | 2024-12-03 | Mitsubishi Electric Research Laboratories, Inc. | System and method for parking an autonomous ego-vehicle in a dynamic environment of a parking area |
| CN120429308B (zh) * | 2025-07-07 | 2025-10-03 | Zhidao Network Technology (Beijing) Co., Ltd. | Autonomous driving data collection method and device, electronic equipment, and readable medium |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4861158B2 | 2006-12-27 | 2012-01-25 | Secom Co., Ltd. | Remote monitoring system |
| JP2017187839A | 2016-04-01 | 2017-10-12 | Nissan Motor Co., Ltd. | Vehicle display method and vehicle display device |
| JP2018113938A (ja) * | 2017-01-20 | 2018-07-26 | Kubota Corporation | Travel control device |
| WO2018155159A1 (ja) * | 2017-02-24 | 2018-08-30 | Panasonic Intellectual Property Management Co., Ltd. | Remote video output system and remote video output device |
| JP2019087015A (ja) * | 2017-11-07 | 2019-06-06 | Toyota Motor Corporation | Remote monitoring system, autonomous traveling vehicle, and remote monitoring method |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS4861158U (ja) | 1971-11-17 | 1973-08-03 | ||
| US9997077B2 (en) * | 2014-09-04 | 2018-06-12 | Honda Motor Co., Ltd. | Vehicle operation assistance |
| JP6870270B2 (ja) * | 2016-10-14 | 2021-05-12 | Nissan Motor Co., Ltd. | Remote operation method and remote operation device for unmanned driving system |
| JP6958243B2 (ja) * | 2017-11-01 | 2021-11-02 | Toyota Motor Corporation | Autonomous driving vehicle |
| JP7324716B2 (ja) * | 2018-02-05 | 2023-08-10 | Sony Group Corporation | Information processing device, mobile device, method, and program |
- 2021
- 2021-03-10 EP EP21775655.0A patent/EP4131201B1/en active Active
- 2021-03-10 JP JP2021543438A patent/JP7555030B2/ja active Active
- 2021-03-10 CN CN202180006829.0A patent/CN114762022B/zh active Active
- 2021-03-10 WO PCT/JP2021/009486 patent/WO2021193059A1/ja not_active Ceased
- 2022
- 2022-06-22 US US17/846,346 patent/US12110039B2/en active Active
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP4131201A4 |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2023145366A (ja) * | 2022-03-28 | 2023-10-11 | SoftBank Corp. | Remote monitoring device |
| JP2024105473A (ja) * | 2022-03-28 | 2024-08-06 | SoftBank Corp. | Remote monitoring device |
| JP7634750B2 (ja) | 2022-03-28 | 2025-02-21 | SoftBank Corp. | Remote monitoring device |
| JP7446526B1 (ja) * | 2023-01-25 | 2024-03-08 | Mitsubishi Electric Corporation | Remote operation support device |
| WO2024157377A1 (ja) * | 2023-01-25 | 2024-08-02 | Mitsubishi Electric Corporation | Remote operation support device |
Also Published As
| Publication number | Publication date |
|---|---|
| US20220324487A1 (en) | 2022-10-13 |
| JP7555030B2 (ja) | 2024-09-24 |
| CN114762022A (zh) | 2022-07-15 |
| US12110039B2 (en) | 2024-10-08 |
| CN114762022B (zh) | 2025-04-08 |
| JPWO2021193059A1 (ja) | 2021-09-30 |
| EP4131201A4 (en) | 2023-09-06 |
| EP4131201B1 (en) | 2025-01-29 |
| EP4131201A1 (en) | 2023-02-08 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| ENP | Entry into the national phase |
Ref document number: 2021543438 Country of ref document: JP Kind code of ref document: A |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21775655 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2021775655 Country of ref document: EP Effective date: 20221026 |
|
| WWG | Wipo information: grant in national office |
Ref document number: 202180006829.0 Country of ref document: CN |