WO2021131785A1 - Information processing method and information processing system - Google Patents
Information processing method and information processing system
- Publication number
- WO2021131785A1 (PCT/JP2020/046256)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensing
- task
- execution
- information processing
- moving body
- Prior art date
Classifications
- B — PERFORMING OPERATIONS; TRANSPORTING
- B60 — VEHICLES IN GENERAL
- B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/0017 — Planning or execution of driving tasks specially adapted for safety of other traffic participants
- B60W60/0053 — Handover processes from vehicle to occupant
- B60W30/00 — Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W40/02 — Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
- B60W50/14 — Means for informing the driver, warning the driver or prompting a driver intervention
- B60W60/001 — Planning or execution of driving tasks
- B60W60/0015 — Planning or execution of driving tasks specially adapted for safety
- B60W2552/50 — Input parameters relating to infrastructure: barriers
- B60W2556/45 — External transmission of data to or from the vehicle
- G — PHYSICS
- G08 — SIGNALLING
- G08G — TRAFFIC CONTROL SYSTEMS
- G08G1/0112 — Measuring and analyzing of parameters relative to traffic conditions based on data from the vehicle, e.g. floating car data [FCD]
- G08G1/0125 — Traffic data processing
- G08G1/0137 — Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/09 — Arrangements for giving variable traffic instructions
- G08G1/16 — Anti-collision systems
- G08G1/166 — Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
Definitions
- This disclosure relates to information processing methods and information processing systems.
- Patent Document 1 discloses a driving support device that receives, from a peripheral body, an obstacle map as an external map. The obstacle map shows obstacles that may collide with the moving body within a range specified from the braking distance according to the moving speed of the moving body, the braking time (the time required to stop), and the traveling path of the moving body.
- In this disclosure, a task related to traveling is referred to as a traveling task.
- However, depending on the traveling specifications of the moving body, the moving body may not be able to travel safely even if the traveling task is executed based on the external map.
- Accordingly, an object of the present disclosure is to provide an information processing method and an information processing system that can ensure the safe execution of a traveling task for moving bodies having various traveling specifications.
- The information processing method according to one aspect of the present disclosure is an information processing method executed by a computer. The method acquires a task related to traveling to be executed by a moving body, first sensing data output by a first sensor that is mounted on the moving body and senses the outside of the moving body, and specifications related to the traveling of the moving body. A sensing requirement is calculated based on the task and the specifications, and a first sensing result is calculated based on the first sensing data output by the first sensor. Based on the sensing requirement and the first sensing result, it is determined whether or not to restrict execution of the task, and when it is determined that execution of the task is to be restricted, an instruction for restricting the execution of the task is output to the moving body.
- These general or specific aspects may be realized as a system, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM, or by any combination of systems, methods, integrated circuits, computer programs, and recording media.
- FIG. 1 is a block diagram showing an information processing system according to an embodiment.
- FIG. 2A is a diagram showing an example of a traveling task of the information processing system according to the embodiment.
- FIG. 2B is a diagram showing an example of travel plan information of the information processing system according to the embodiment.
- FIG. 2C is a diagram showing an example of vehicle specification information of the information processing system according to the embodiment.
- FIG. 2D is a diagram showing an example of travel point information of the information processing system according to the embodiment.
- FIG. 2E is a diagram showing an example of safety requirement information of the information processing system according to the embodiment.
- FIG. 2F is a diagram showing an example of a required sensing region of the information processing system according to the embodiment.
- FIG. 3 is a flowchart showing the operation of the information processing system according to the embodiment.
- FIG. 4A is a flowchart showing the detailed operation of the information processing system according to the embodiment.
- FIG. 4B is a diagram showing an example of an actual sensing region of the information processing system according to the embodiment.
- FIG. 5 is a flowchart showing a process of calculating the required sensing distance.
- FIG. 6 is a diagram showing an example of a required sensing region and a required sensing distance.
- FIG. 7 is a flowchart showing a process of searching the target area lane.
- FIG. 8A is a flowchart showing a process of calculating the actual sensing region.
- FIG. 8B is a flowchart showing a process of calculating the actual sensing distance for each sensor.
- FIG. 9 is a diagram showing an example of the relationship between the first sensing region and the second sensing region.
- FIG. 10 is a diagram showing an example of the relationship between the required sensing region and the actual sensing region.
- FIG. 11 is a schematic diagram showing an information processing system in a modified example.
- FIG. 12A is a diagram showing an example of a traveling task of the information processing system in the modified example 5.
- FIG. 12B is a diagram showing an example of travel plan information of the information processing system in the modified example 5.
- FIG. 12C is a diagram showing an example of robot spec information of the information processing system in the modified example 5.
- FIG. 12D is a diagram showing an example of travel point information of the information processing system in the modified example 5.
- FIG. 12E is a diagram showing an example of safety requirement information of the information processing system in the modified example 5.
- FIG. 12F is a diagram showing an example of a required sensing region of the information processing system in the modified example 5.
- FIG. 13 is a diagram showing an example of a required sensing region and a required sensing distance.
- an autonomous vehicle is designed to be optimized for driving in a specific driving scene (specific area, environment, time zone, etc.).
- An autonomous vehicle cannot always judge whether it is suited to the current driving scene. Therefore, when an autonomous vehicle travels in a driving scene for which it has not been optimized, the combination of its driving specifications and sensor specifications may not suit that scene; in other words, safety may not be ensured because the sensing performance is insufficient relative to the driving performance.
- In Patent Document 1, driving support using an external map is provided; however, if the driving specifications are not suitable for the driving scene, the driving safety of the moving body cannot always be guaranteed even when the external map is used.
- That is, the information processing method according to one aspect of the present disclosure is an information processing method executed by a computer. The method acquires a task related to traveling to be executed by a moving body, first sensing data output by a first sensor that is mounted on the moving body and senses the outside of the moving body, and specifications related to the traveling of the moving body. A sensing requirement is calculated based on the task and the specifications, and a first sensing result is calculated based on the first sensing data. Whether or not to restrict execution of the task is determined based on the sensing requirement and the first sensing result, and when it is determined that execution of the task is to be restricted, an instruction for restricting the execution of the task is output to the moving body.
- According to this, execution of the traveling task can be restricted depending on the sensing requirement derived from the traveling specifications of the moving body and on the sensing result. That is, when the driving specifications and the sensor specifications are not suited to the driving scene, execution of the driving task can be restricted. Therefore, safe execution of a traveling task can be provided for moving bodies having various traveling specifications.
- For example, when the traveling specifications and sensor specifications of the moving body are not suited to the traveling scene and the traveling task cannot be executed or completed, that is, when the moving body cannot travel, execution of that traveling task is stopped. As a result, accidents or incidents caused by executing the traveling task can be suppressed. Further, for example, when the traveling specifications and sensor specifications are not suited to the traveling scene and the traveling task cannot be executed safely, that is, when the moving body cannot travel safely, the execution content of the traveling task is changed. As a result, even if the moving body does not fully meet the driving safety conditions, it can continue to travel safely by executing a traveling task with restricted content instead of the original traveling task.
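The acquire-calculate-determine-output flow described above can be sketched in a few lines of Python. Everything below is an illustrative assumption, not the patent's actual implementation: the sensing requirement is reduced to a single required sensing distance computed from a simple stopping-distance model (reaction distance plus braking distance), and the restriction is either a speed cap (a change of execution content) or a prohibition.

```python
import math

def required_sensing_distance(speed, response_time, max_decel):
    """Stopping distance in metres: reaction distance plus braking distance."""
    return speed * response_time + speed ** 2 / (2 * max_decel)

def safe_speed(sensed_distance, response_time, max_decel):
    """Largest speed whose stopping distance fits in the sensed distance
    (positive root of v*t + v^2/(2*a) = d)."""
    a, t, d = max_decel, response_time, sensed_distance
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

def decide_restriction(task, specs, sensed_distance):
    """Return the task unrestricted, with changed content, or prohibited."""
    need = required_sensing_distance(
        specs["speed"], specs["response_time"], specs["max_decel"])
    if sensed_distance >= need:
        return {"task": task, "restriction": None}
    v = safe_speed(sensed_distance, specs["response_time"], specs["max_decel"])
    if v > 0:
        # Change the execution content: cap the speed so the stopping
        # distance again fits inside the actually sensed distance.
        return {"task": task, "restriction": "change", "speed_limit": v}
    return {"task": task, "restriction": "prohibit"}
```

With this toy model, a shortfall in sensed distance first degrades to a slower but still-safe task, and only a total loss of sensing prohibits the task outright, mirroring the two forms of restriction described above.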
- Further, the information processing system according to one aspect of the present disclosure includes: a first acquisition unit that acquires a task related to traveling to be executed by a moving body; a second acquisition unit that acquires first sensing data output by a first sensor mounted on the moving body that senses the outside of the moving body; a third acquisition unit that acquires specifications related to the traveling of the moving body; a first calculation unit that calculates a sensing requirement based on the task and the specifications; a second calculation unit that calculates a first sensing result based on the first sensing data output by the first sensor; a determination unit that determines whether or not to restrict execution of the task based on the sensing requirement and the first sensing result; and an output unit that outputs, to the moving body, an instruction for restricting execution of the task when it is determined that execution of the task is to be restricted.
- This information processing system also has the same effects as described above.
- Further, for example, the sensing requirement includes a required sensing region, which is a region where sensing is required, and the first sensing result includes a first sensing region calculated based on the first sensing data. In the determination, whether or not to restrict execution of the task is determined based on the required sensing region and the first sensing region.
- In order to avoid an accident or incident at the destination, the moving body must be able to sense the destination and its surroundings. For example, it must be able to sense the region into which it will move and the regions where objects moving toward that destination can exist.
- the execution of the traveling task can be restricted according to the comparison result between the required sensing area and the sensing area of the moving body. For example, when the moving body cannot sense the required sensing area, the execution of the traveling task can be restricted. Therefore, it is possible to provide safety in executing a traveling task.
- Further, for example, in the determination, the restriction is determined according to the overlap between the required sensing region and the first sensing region.
- the overlap between the required sensing area and the first sensing area affects the safety of execution of the driving task. Therefore, by restricting the execution of the traveling task according to the overlap, the moving body can safely execute the traveling task. That is, the moving body can travel more safely.
- Further, for example, the determination is made according to the degree of overlap between the required sensing region and the first sensing region.
- the degree of overlap between the required sensing area and the first sensing area is related to the degree of safety in executing the driving task. Therefore, by limiting the execution of the traveling task according to the degree of overlap, the moving body can safely execute the traveling task. That is, the moving body can travel more safely.
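As one hedged illustration of such an overlap-degree check, the regions can be approximated by axis-aligned rectangles and the determination tied to the fraction of the required region that the first sensing region covers. The rectangle model and the 90% threshold are assumptions made for this sketch, not values taken from the disclosure.

```python
def overlap_ratio(required, actual):
    """Fraction of the required sensing region covered by the actual region.
    Each region is an axis-aligned rectangle (x1, y1, x2, y2)."""
    rx1, ry1, rx2, ry2 = required
    ax1, ay1, ax2, ay2 = actual
    ix = max(0.0, min(rx2, ax2) - max(rx1, ax1))   # overlap width
    iy = max(0.0, min(ry2, ay2) - max(ry1, ay1))   # overlap height
    req_area = (rx2 - rx1) * (ry2 - ry1)
    return (ix * iy) / req_area if req_area > 0 else 0.0

def restrict_task(required, actual, threshold=0.9):
    """Restrict execution when coverage falls below the threshold."""
    return overlap_ratio(required, actual) < threshold
```

A vehicle whose first sensing region covers only half the required region would be restricted, while full coverage lets the task proceed.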
- Further, for example, the determination is made according to the region where the required sensing region and the first sensing region do not overlap.
- Even when the required sensing region and the first sensing region do not fully overlap, the safety of executing the driving task may not be significantly reduced. For areas of low importance for driving safety, such as the area behind the moving body or the area opposite to its traveling direction, the safety risk is low even if sensing is insufficient.
- Conversely, a region where the required sensing region and the first sensing region do not overlap may reduce the safety of executing the traveling task. A region of high importance for driving safety, such as a region near the moving body or a region in its traveling direction, carries an increased safety risk when sensing is insufficient. Therefore, by limiting execution of the traveling task according to the non-overlapping regions (for example, according to their importance), the moving body can travel safely while traveling efficiency is improved.
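One way to express this importance weighting is to score each uncovered part of the required region by an importance weight and restrict the task only when the weighted total exceeds a budget. The weights, areas, and risk budget below are illustrative assumptions for the sketch.

```python
def gap_risk(gaps):
    """gaps: list of (importance_weight, uncovered_area) pairs, one per part
    of the required sensing region the moving body cannot sense. Regions
    ahead of or near the body would carry high weights; regions behind it,
    low weights."""
    return sum(weight * area for weight, area in gaps)

def should_restrict(gaps, risk_budget=1.0):
    """Restrict the task only when the weighted uncovered risk is too large."""
    return gap_risk(gaps) > risk_budget
```

An uncovered patch behind the vehicle is tolerated, while the same patch plus an uncovered patch in the traveling direction triggers a restriction, which matches the asymmetry described above.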
- the limitation is prohibition of execution of the task.
- the limitation is a change in the execution content of the task.
- the driving task can be safely executed by reducing the speed, changing the intersection where the vehicle turns right, changing the stop position, or delaying the start timing.
- the content of the change of the task is determined based on the overlap between the required sensing area and the first sensing area.
- the overlap between the required sensing area and the first sensing area affects the safety of execution of the driving task. Therefore, by changing the content of the traveling task according to the overlap, the traveling task can be changed to a highly safe content.
- Further, for example, the information processing method further acquires second sensing data output by a second sensor installed along the movement path of the moving body, calculates a second sensing result based on the second sensing data, and makes the determination also based on the second sensing result.
- the sensing area can be expanded even if the sensing performance of the first sensor of the moving body is low. By expanding the sensing area that overlaps with the required sensing area, it becomes easier to safely execute the driving task. That is, the moving body can easily travel safely.
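This expansion of coverage by an infrastructure-side second sensor can be sketched with the regions discretized into grid cells; the grid representation is an assumption made for illustration. The union of the vehicle's and the roadside sensor's cells is compared against the required cells.

```python
def coverage(required_cells, *sensor_regions):
    """Fraction of required grid cells covered by the union of one or more
    sensor regions. Each region is a set of (x, y) cell coordinates."""
    covered = set().union(*sensor_regions) if sensor_regions else set()
    return len(required_cells & covered) / len(required_cells)
```

For a required strip of ten cells, a vehicle sensor seeing only the first six reaches 60% coverage on its own, but together with a roadside sensor seeing the last six the union covers the full required region.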
- the information processing method further adds the moving body to the monitoring target or raises the monitoring priority of the moving body when it is determined that the execution of the task is restricted.
- A moving body whose execution of traveling tasks is restricted is more likely to cause an accident or incident than other moving bodies. Therefore, the occurrence of an accident or incident can be suppressed by, for example, making such a moving body a monitoring target or raising its monitoring priority. In addition, even if an accident or incident occurs, the observer can respond promptly.
- the information processing method further notifies the manager or passenger of the mobile body that the execution of the task is restricted when it is determined that the execution of the task is restricted.
- the manager or the passenger can grasp that the execution of the traveling task of the moving body is restricted. For example, when the manager is a watcher, it is possible to prevent the watcher from overlooking a mobile body that is more likely to cause an accident or incident than other mobile bodies. Further, since the moving body may be monitored preferentially, the burden of monitoring the moving body by the observer can be reduced. In addition, it is possible to reduce passengers' anxiety about the behavior of the moving body on which they are riding.
- Further, for example, the sensing requirement includes a required sensing target, which is a target that must be sensed, and the first sensing result includes a first sensing target calculated based on the first sensing data. In the determination, whether or not to restrict execution of the task is determined based on the required sensing target and the first sensing target.
- In order to avoid an accident or incident at the destination, the moving body must be able to sense targets that may cause an accident or incident. For example, it must be able to sense obstacles at the destination, the condition of the road surface there, and the like.
- the execution of the traveling task can be restricted according to the comparison result between the required sensing target and the target sensed by the moving body. Therefore, it is possible to provide safety in executing a traveling task.
- the information processing method determines according to the degree of sufficiency or the degree of agreement between the required sensing target and the first sensing target.
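The sufficiency of sensing targets can be sketched as a set comparison between required and actually sensed target classes; the class names and the all-or-nothing threshold are illustrative assumptions.

```python
def target_sufficiency(required_targets, sensed_targets):
    """Fraction of required sensing targets the first sensor can sense."""
    return len(required_targets & sensed_targets) / len(required_targets)

def restrict_by_targets(required_targets, sensed_targets, threshold=1.0):
    """Restrict unless every required target class can be sensed."""
    return target_sufficiency(required_targets, sensed_targets) < threshold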
- Further, for example, the sensing requirement includes a required sensing performance, and the first sensing result includes a first sensing performance calculated based on the first sensing data. In the determination, whether or not to restrict execution of the task is determined based on the required sensing performance and the first sensing performance.
- In order to avoid an accident or incident at the destination, the sensing performance of the moving body must be sufficient. For example, the precision, accuracy, resolution, processing cycle, and the like of sensing must be sufficient.
- the execution of the traveling task can be restricted according to the comparison result between the required sensing performance and the sensing performance of the moving body. Therefore, it is possible to provide safety in executing a traveling task.
- Further, for example, the determination is made according to whether or not the first sensing performance meets or exceeds the required sensing performance.
- the moving body can safely execute the traveling task. For example, if the sensing performance of the moving object is lower than the required sensing performance, the execution of the traveling task can be restricted.
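The performance comparison can be sketched as a metric-by-metric check. The metric names, and the rule that metrics named in milliseconds (such as the processing cycle) are better when smaller while the rest are better when larger, are assumptions for this sketch.

```python
def meets_requirements(required, actual):
    """True when every required sensing metric is met. Metrics whose names
    end in '_ms' (e.g. the processing cycle) are better when smaller; the
    rest (range, resolution, accuracy) are better when larger."""
    for name, req in required.items():
        if name not in actual:
            return False          # an unreported metric cannot be verified
        if name.endswith("_ms"):
            if actual[name] > req:
                return False      # too slow
        elif actual[name] < req:
            return False          # too weak
    return True
```

A sensor that falls short on any single metric, or that does not report one, fails the requirement, which is the conservative reading of "meets or exceeds".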
- FIG. 1 is a block diagram showing an information processing system 1 according to an embodiment.
- the information processing system 1 includes an automatic operation device 2, an operation control device 3, an infrastructure device 4, and a determination device 5.
- For example, only the automatic driving device 2 may be provided in the moving body, or both the automatic driving device 2 and the operation control device 3 may be provided in the moving body.
- the automatic driving device 2 is mounted on the moving body, senses the periphery of the moving body, and controls the traveling of the moving body based on the sensing result.
- the moving body is a vehicle, an aircraft, a ship, or the like.
- the self-driving car 6 will be described below as a moving body.
- the automatic driving device 2 has a first sensing unit 21 and a traveling determination unit 22.
- the first sensing unit 21 outputs the first sensing data to the determination device 5 as the first sensor.
- The first sensing unit 21 is, for example, a sensor or sensor module such as a LiDAR (Light Detection and Ranging) sensor or an image pickup device, and senses the outside of the autonomous driving vehicle 6.
- the first sensing unit 21 generates the first sensing data which is the result of sensing.
- the first sensing data is, for example, point cloud information or an image.
- The travel determination unit 22 acquires a task related to traveling (hereinafter also referred to as a travel task) for the autonomous driving vehicle 6 generated by the travel task generation unit 31, and a travel task permission output by the travel task restriction unit 57.
- the travel determination unit 22 determines whether or not to execute the travel task based on the acquired travel task and the travel task permission, and executes the determined travel task.
- the travel determination unit 22 outputs a travel instruction according to the travel task to the automatic driving vehicle 6.
- the operation control device 3 has a travel task generation unit 31, a first storage unit 32, and a travel plan change unit 33.
- the travel task generation unit 31 generates a travel task based on the travel plan information acquired from the first storage unit 32.
- The traveling task is an upper-level, more abstract traveling control, as opposed to the lower-level traveling control that directly controls the actuators.
- the lower driving control is control of speed, acceleration, deceleration, steering angle, etc.
- The upper-level driving control is control of the autonomous driving vehicle 6 such as going straight, turning right, turning left, avoiding obstacles, parking, changing lanes, merging, starting, or stopping.
- the traveling task includes a traveling task name, a traveling task type, and a point name, as shown in FIG. 2A.
- FIG. 2A is a diagram showing an example of a traveling task of the information processing system 1 according to the embodiment.
- the travel plan information includes the route and the point where the travel task on the route is executed.
- the points are the starting point, the destination, the right turn point, the left turn point, the stop point, and the like.
- FIG. 2B is a diagram showing an example of travel plan information of the information processing system 1 according to the embodiment.
- the travel task generation unit 31 outputs the generated travel task to the determination device 5.
- the first storage unit 32 stores a travel plan information database showing a travel plan of the autonomous driving vehicle 6.
- the first storage unit 32 outputs the travel plan information in response to the request of the travel task generation unit 31. Further, when the travel plan change unit 33 changes the travel plan information, the first storage unit 32 updates the travel plan information after the change.
- When it becomes necessary to change the travel plan information based on an instruction for restricting execution of the travel task (hereinafter, a restriction), the travel plan change unit 33 changes and updates the travel plan information stored in the first storage unit 32. Further, when the travel task permission output by the travel task restriction unit 57 is acquired, the travel plan change unit 33 may record in the first storage unit 32 that the travel plan information is permitted. The user may also manually change the travel plan information to a desired plan.
- the limitation is prohibition of execution of the traveling task or change of the execution content of the traveling task.
- prohibition of execution of a traveling task is prohibition of traveling, prohibition of turning right or left, prohibition of stopping, and the like.
- the change in the execution content of the running task is a change in speed or acceleration, a change in the running lane, or the like.
- For example, when the acquired restriction prohibits turning right, the traveling plan changing unit 33 changes the travel plan by deleting the corresponding traveling task and adding a new traveling task.
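The delete-and-add plan change can be sketched with a task record holding the three fields named for FIG. 2A (task name, task type, point name). The restriction label and the choice of a straight-ahead substitute task are illustrative assumptions, not the disclosed behavior.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class TravelTask:
    name: str        # traveling task name
    task_type: str   # traveling task type, e.g. "right_turn", "straight"
    point: str       # name of the point where the task is executed

def apply_restriction(plan, restriction):
    """Rewrite the plan when a restriction prohibits right turns: each
    right-turn task is deleted and a substitute task is added in its place."""
    if restriction != "prohibit_right_turn":
        return list(plan)
    return [replace(t, name=t.name + "_alt", task_type="straight")
            if t.task_type == "right_turn" else t
            for t in plan]
```

Tasks unaffected by the restriction pass through unchanged, so the rest of the travel plan information stays intact.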
- The infrastructure device 4 is installed on infrastructure such as a road or a traffic light.
- the infrastructure device 4 has a second sensing unit 41.
- The second sensing unit 41 is a sensor installed along the movement path of the autonomous driving vehicle 6; it senses its own surroundings and generates second sensing data.
- the second sensing unit 41 outputs the generated second sensing data to the determination device 5.
- the second sensing unit 41 may be an example of the second sensor.
- the determination device 5 includes a travel task acquisition unit 51, a second storage unit 52, a third storage unit 53, a fourth storage unit 54, a condition acquisition unit 55, a determination unit 56, and a travel task restriction unit 57.
- the travel task acquisition unit 51 acquires the travel task output by the travel task generation unit 31 of the operation control device 3.
- the travel task acquisition unit 51 outputs the acquired travel task to the determination unit 56.
- the traveling task acquisition unit 51 is an example of the first acquisition unit.
- the second storage unit 52 stores a database of vehicle spec information (hereinafter, vehicle spec information database) indicating the vehicle specs of the autonomous driving vehicle 6.
- the second storage unit 52 outputs vehicle spec information in response to a request from the condition acquisition unit 55.
- the vehicle specifications are specifications related to the running of the autonomous driving vehicle 6. Specifically, as shown in FIG. 2C, the vehicle spec information includes the vehicle name, maximum acceleration, maximum deceleration, maximum speed, vehicle response time, and the like.
- the vehicle response time is the time from when an instruction is given to the autonomous driving vehicle 6 to when the operation corresponding to the instruction is actually executed or the operation is completed. For example, the vehicle response time is the time required for the autonomous driving vehicle 6 to apply the brake when a stop instruction is given to the autonomous driving vehicle 6.
- FIG. 2C is a diagram showing an example of vehicle spec information of the information processing system 1 according to the embodiment.
- the third storage unit 53 stores a database of travel point information (hereinafter, travel point information database) showing various information of travel points represented by a map.
- the third storage unit 53 outputs travel point information in response to a request from the condition acquisition unit 55.
- the travel point information is map information of the travel route and traffic environment information.
- the traveling point information includes map information such as the point name, the point number assigned to the point name, the type of road, the presence or absence of a traffic light, the left-turn route length, the right-turn route length, and the straight-ahead route length, and traffic environment information such as the speed limit, the assumed maximum speed of other vehicles, and the approach lanes.
- FIG. 2D is a diagram showing an example of travel point information of the information processing system 1 according to the embodiment.
- the fourth storage unit 54 stores a database of safety requirement information (hereinafter, safety requirement information database) indicating safety requirements for driving tasks.
- the fourth storage unit 54 outputs the safety requirement information in response to the request of the condition acquisition unit 55.
- the safety requirement information is a requirement set in advance by the business operator or the like in order for the autonomous driving vehicle 6 to travel safely for the traveling task.
- the safety requirement information relates to the execution condition of the driving task.
- the safety requirement information includes the traveling task name, the target area type, the road type, the required sensing range inside the intersection, the required sensing range of the approach lane, the required sensing range calculation input, and the like.
- FIG. 2E is a diagram showing an example of safety requirement information of the information processing system 1 according to the embodiment.
- the safety requirement information may include information other than the information related to the above sensing requirements.
- the safety requirement information includes safety requirements regarding the driving environment such as obstacles, blind spot areas, weather, road surface conditions, and illuminance.
- the safety requirements for the driving environment are used to calculate the driving environment requirements.
- the target area type indicates the type of area in which the autonomous driving vehicle 6 may collide with another object. Specifically, in the example of FIG. 2E, the target area type shows the inside of the intersection and the approach lane other than the lane in which the autonomous driving vehicle 6 exists as the type of the above area.
- the required sensing range inside the intersection is the required sensing range of the type area inside the intersection.
- the required sensing range inside the intersection is set to the entire area.
- the required sensing range of the approach lane is the required sensing range of the area of the type called the approach lane. For example, since a moving body in a region far away along the approach lane is unlikely to cause a collision, the required sensing range of the approach lane is set to the region from the intersection to the required sensing distance.
- the required sensing range calculation input is information input for calculating the required sensing distance. Therefore, it may be said that the information is associated with the required sensing range of the target area.
- the required sensing range calculation inputs are, for example, the maximum speed of the autonomous driving vehicle 6, the speed limit, the maximum acceleration, the vehicle response time, the right turn route length, the assumed maximum speed of other vehicles, and the like.
- the condition acquisition unit 55 acquires vehicle spec information from the second storage unit 52 and travel point information from the third storage unit 53. Further, the condition acquisition unit 55 acquires a travel task from the travel task acquisition unit 51 via the determination unit 56.
- the condition acquisition unit 55 is an example of the first calculation unit and also an example of the third acquisition unit. Further, the condition acquisition unit 55 further acquires safety requirement information from the fourth storage unit 54.
- the condition acquisition unit 55 calculates the execution condition of the running task.
- the execution condition of the traveling task is a condition for determining whether or not the normal execution determination of the traveling task is possible, that is, whether or not the sensing for executing the traveling task is sufficient.
- the execution condition of the traveling task is an execution condition of the traveling task according to at least one of the autonomous driving vehicle 6 and the traveling point.
- the execution condition of the driving task is a condition for determining whether or not a driving task such as going straight, turning right, turning left, avoiding an obstacle, parking, changing lanes, merging, starting, or stopping can be executed by the autonomous driving vehicle 6. Therefore, the execution condition includes the sensing requirement as one element.
- the execution condition includes the driving environment requirement as another factor. Factors of driving environment requirements include obstacles, blind spot areas, weather, road surface conditions, and illuminance. Execution conditions are calculated based on safety requirement information.
- the condition acquisition unit 55 calculates and acquires the sensing requirement based on the vehicle spec information and the traveling task acquired via the determination unit 56. Specifically, the condition acquisition unit 55 calculates a required sensing region, which is a region where sensing is required, based on the vehicle spec information and the traveling task. That is, the condition acquisition unit 55 calculates and acquires the required sensing area that the autonomous driving vehicle 6 having the vehicle specifications needs in order to execute the driving task, based on the traveling point information matching the point indicated in the traveling task and the vehicle spec information. More specifically, the condition acquisition unit 55 calculates the required sensing area based on the vehicle spec information, the traveling task, and the safety requirement information.
- the condition acquisition unit 55 acquires travel point information and safety requirement information of the corresponding travel point from the travel task.
- the condition acquisition unit 55 acquires the target area type, the required sensing range for each target area type, and the required sensing range calculation input from the acquired safety requirement information.
- the condition acquisition unit 55 acquires the information shown in the acquired request sensing range calculation input from the vehicle spec information and the travel point information.
- the condition acquisition unit 55 calculates the required sensing range for each target area in the point using the acquired information.
- the sensing range of the target region calculated in this way is the required sensing region.
- the required sensing area includes the vehicle name, the traveling task name, the target area ID, and the required sensing range for each area type.
- the required sensing range is, for example, the required sensing distance in the entire area or the target area.
- the entire area is set for the target area A0, and the required sensing distance of 43 m is set for each of the target areas A1, A3, and A7.
- FIG. 2F is a diagram showing an example of a request sensing region of the information processing system 1 according to the embodiment.
- the condition acquisition unit 55 mainly acquires the execution condition including the request sensing area before the autonomous driving vehicle 6 travels, but the execution condition may be acquired while the autonomous driving vehicle 6 is traveling.
- the condition acquisition unit 55 outputs the acquired execution condition to the determination unit 56.
- the determination unit 56 acquires the first sensing data from the first sensing unit 21, acquires the second sensing data from the second sensing unit 41, and acquires the execution condition including the sensing requirement from the condition acquisition unit 55. Further, the determination unit 56 acquires a travel task from the travel task acquisition unit 51.
- the determination unit 56 calculates the first sensing result based on the first sensing data.
- the first sensing result is the first sensing region indicating the region sensed by the first sensing unit 21.
- the determination unit 56 calculates the second sensing result based on the second sensing data.
- the second sensing result is a second sensing region indicating a region sensed by the second sensing unit 41.
- the first sensing result and the second sensing result may include other information. For example, the presence / absence of an obstacle, the type of the obstacle, the position of the obstacle, the size of the obstacle, the blind spot area due to the obstacle, the weather, the condition of the road surface, the illuminance, and the like may be included.
- the first sensing region and the second sensing region are actual sensing regions.
- the determination unit 56 determines whether or not to restrict the execution of the traveling task based on the first sensing result and the sensing requirement. Specifically, the determination unit 56 determines whether or not to restrict the execution of the traveling task by the autonomous driving vehicle 6 based on the first sensing region and the required sensing region of the execution condition. That is, the determination unit 56 determines whether or not the self-driving car 6 can normally determine whether or not the traveling task may be executed.
- the fact that the execution of the traveling task can be normally determined means that the first sensing result is sufficient, that is, the first sensing result satisfies the sensing requirement. For example, the determination unit 56 determines whether or not to limit the execution of the traveling task according to the overlap between the request sensing region and the first sensing region.
- the determination unit 56 determines according to the degree of overlap between the required sensing region and the first sensing region. For example, when the entire required sensing region, or a portion equal to or greater than a predetermined ratio of it, overlaps with the first sensing region, the determination unit 56 does not restrict the execution of the traveling task. Conversely, when no portion of the required sensing region overlaps with the first sensing region, or the overlapping portion is less than the predetermined ratio, the determination unit 56 restricts the execution of the traveling task.
- the determination unit 56 determines whether or not to restrict the execution of the traveling task based on the second sensing region. For example, the determination unit 56 determines whether or not to limit the execution of the traveling task according to the overlap between the combined sensing region and the required sensing region, which is the combination of the first sensing region and the second sensing region.
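The overlap determination described in the preceding items can be sketched as follows. This is an illustrative assumption, not the patented implementation: sensing regions are modeled here as sets of grid cells, and the function name, arguments, and the 90% threshold are all hypothetical.

```python
# Hypothetical sketch: restrict the travel task when the combined
# (first + second) sensing region covers less than a predetermined
# ratio of the required sensing region. Regions are sets of grid cells.

def restrict_travel_task(required_cells, first_cells, second_cells,
                         min_overlap_ratio=0.9):
    """Return True when execution of the travel task should be restricted."""
    if not required_cells:
        return False  # nothing needs to be sensed, so no restriction
    combined = first_cells | second_cells  # synthetic sensing region
    overlap = len(required_cells & combined) / len(required_cells)
    return overlap < min_overlap_ratio

# Example: 8 of 10 required cells are covered -> 0.8 < 0.9 -> restricted.
required = {(x, 0) for x in range(10)}
first = {(x, 0) for x in range(6)}        # cells sensed by the vehicle
second = {(x, 0) for x in range(5, 8)}    # cells sensed by the infrastructure
print(restrict_travel_task(required, first, second))  # True
```

In practice the regions would be geometric shapes rather than cell sets, but the same coverage-ratio test applies.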
- the determination unit 56 may make a determination according to an area where the required sensing area and the first sensing area do not overlap. Specifically, the determination unit 56 determines whether the portion of the required sensing region that does not overlap with the first sensing region is a region that affects the driving safety of the autonomous driving vehicle 6. Areas that affect safety are, for example, an area close to the autonomous driving vehicle 6, an area in the traveling direction of the autonomous driving vehicle 6, an area on a planned traveling route, a sidewalk, an area where a traffic light is located, and the like. When the region affecting the running safety is the non-overlapping portion, the determination unit 56 limits the execution of the running task.
- the determination unit 56 determines whether or not to restrict the execution of the traveling task based on other requirements of the execution condition. Specifically, the determination unit 56 determines whether or not to restrict the execution of the travel task based on the travel environment requirement and the first sensing data. For example, when fog is generated on the route or the road surface on the route is frozen, the determination unit 56 determines that the execution of the traveling task is to be restricted. Further, when the number of pedestrians exceeds the threshold value or an accident occurs at a predetermined point on the route, the determination unit 56 determines that the execution of the traveling task is to be restricted. Further, when an animal enters the route, the determination unit 56 determines that the execution of the traveling task is to be restricted. Further, when the communication state between the observer terminal for monitoring the autonomous driving vehicle 6 and the autonomous driving vehicle 6 is poor, the determination unit 56 determines that the execution of the traveling task is to be restricted.
- the determination unit 56 outputs the determination result to the traveling task restriction unit 57.
- the travel task restriction unit 57 restricts the execution of the travel task according to the determination result. Specifically, according to the determination result of the sensing requirement by the determination unit 56, the travel task restriction unit 57 generates a restriction item which is an instruction for restricting the execution of the travel task. Specifically, the traveling task limiting unit 57 generates, as a limiting item, prohibition of execution of the traveling task according to the determination result of the request sensing region by the determination unit 56. For example, when the traveling task is a right turn, if the portion of the required sensing region corresponding to the right turn direction does not overlap with the first sensing region, the determination unit 56 determines that the execution of the traveling task is restricted.
- the traveling task restriction unit 57 then generates a restriction item indicating that a right turn is prohibited. Further, the traveling task restriction unit 57 determines the content of a change of the traveling task as a restriction item according to the determination result of the required sensing region by the determination unit 56. Specifically, the travel task restriction unit 57 determines the content of the travel task change based on the overlap between the first sensing region and the required sensing region (for example, the presence or absence of overlap, the degree of overlap, etc.). For example, when the traveling task is traveling straight and the overlap between the portion of the required sensing region corresponding to the traveling direction and the first sensing region is less than a predetermined ratio, the determination unit 56 determines that the execution of the traveling task is to be restricted. Therefore, the traveling task restriction unit 57 generates a restriction item indicating a speed (in other words, a speed limit) such that the overlapping region becomes equal to or more than the predetermined ratio.
- the traveling task restriction unit 57 generates restriction items according to the determination result of other requirements by the determination unit 56. Specifically, the traveling task restriction unit 57 generates restriction items according to the determination result of the traveling environment requirement. For example, when fog is generated on the route or the road surface on the route is frozen, the traveling task restriction unit 57 generates a restriction item for reducing the traveling speed. In addition, when the number of pedestrians is equal to or greater than the threshold value at a predetermined point on the route, or when an accident occurs there, the traveling task restriction unit 57 generates a restriction item for prohibiting entry into the predetermined point. In addition, when an animal enters the route, the traveling task restriction unit 57 generates a restriction item for prohibiting travel on the route or for changing the route. Further, when the communication state between the observer terminal for monitoring the autonomous driving vehicle 6 and the autonomous driving vehicle 6 is poor, or when an emergency vehicle approaches, the traveling task restriction unit 57 generates a restriction item for stopping the traveling.
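One way to picture the restriction-item generation above is a small rule function. This is a hedged sketch only: the thresholds, the field names, and the mapping from overlap ratio to a speed limit are assumptions for illustration, not the patented logic.

```python
# Illustrative sketch of restriction-item generation from the overlap
# determination. A zero overlap prohibits the task; a partial overlap
# changes the execution content (here, a proportional speed limit).

def generate_restriction(task, overlap_ratio, current_speed,
                         required_ratio=0.9):
    if overlap_ratio >= required_ratio:
        return None  # no restriction: a travel task permission is issued instead
    if overlap_ratio == 0.0:
        # no overlap at all: prohibit execution (e.g. prohibit a right turn)
        return {"task": task, "restriction": "prohibit"}
    # partial overlap: change the execution content, e.g. lower the speed
    # in proportion to the sensed fraction of the required region
    return {"task": task, "restriction": "speed_limit",
            "speed": current_speed * overlap_ratio / required_ratio}

print(generate_restriction("right_turn", 0.0, 40.0))
print(generate_restriction("straight", 0.45, 40.0))
print(generate_restriction("straight", 0.95, 40.0))  # None
```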
- the restrictions on the driving task may be generated based on the operation of the administrator.
- when the administrator selects a restriction item, the traveling task restriction unit 57 generates the restriction item based on the selection result.
- the traveling task limiting unit 57 is an example of an output unit.
- the administrator is, for example, a manager of the information processing system 1 or a part of its devices, an owner of the self-driving car 6, or a monitor of the self-driving car 6.
- the travel task restriction unit 57 outputs the generated restriction items to the travel determination unit 22 and the travel plan change unit 33. Further, the travel task restriction unit 57 outputs a travel task permission indicating permission for execution of the travel task that is not determined to be restricted to the travel determination unit 22. The traveling task limiting unit 57 may output the traveling task permission to the traveling plan changing unit 33.
- FIG. 3 is a flowchart showing the processing of the information processing system 1 in the embodiment.
- FIG. 3 describes an overall outline of the processing of the information processing system 1.
- before the autonomous driving vehicle 6 travels, the condition acquisition unit 55 calculates and acquires the execution condition of the traveling task for each traveling point according to the autonomous driving vehicle 6 based on the vehicle spec information and the traveling point information (S11).
- the process of step S11 may be executed before the self-driving car 6 travels.
- the running task acquisition unit 51 acquires the running task output by the running task generation unit 31 (S12).
- the determination unit 56 acquires the first sensing data from the first sensing unit 21, the second sensing data from the second sensing unit 41, the execution condition from the condition acquisition unit 55, and the travel task from the travel task acquisition unit 51.
- the determination unit 56 determines whether or not the autonomous driving vehicle 6 can normally determine the execution of the traveling task based on the acquired information (S13). That is, the determination unit 56 determines whether or not to restrict the execution of the traveling task based on the acquired information.
- when the determination unit 56 determines that the execution of the travel task is not restricted (YES in S13), it outputs the travel task permission to the travel determination unit 22 (S17).
- when the travel determination unit 22 obtains the travel task permission, it executes the travel task indicated by the travel task permission.
- the autonomous driving vehicle 6 is controlled to travel according to the traveling task.
- the travel determination unit 22 determines whether or not to end the travel (S18). That is, the travel determination unit 22 determines whether or not the vehicle has arrived at the destination.
- when the travel determination unit 22 determines that the travel is completed (YES in S18), it terminates the travel of the autonomous driving vehicle 6, and the information processing system 1 ends the process.
- when the determination unit 56 determines that the execution of the travel task is restricted (NO in S13), the travel task restriction unit 57 generates a restriction item for the travel task determined to be restricted, and outputs the restriction item to the travel plan change unit 33 and the travel determination unit 22 (S14).
- the travel plan change unit 33 changes the travel plan information of the travel task stored in the first storage unit 32 based on the restrictions (S15). That is, the travel plan change unit 33 changes the travel plan by deleting or changing the travel point regarding the restriction from the travel plan.
- the travel determination unit 22 determines whether or not the travel can be continued based on the restrictions (S16). Specifically, the travel determination unit 22 cancels the execution of the travel task or changes the execution content based on the restrictions. The traveling determination unit 22 determines whether or not the traveling of the autonomous driving vehicle 6 can be continued after the execution of the traveling task is stopped or the execution content is changed.
- when the running determination unit 22 determines that running cannot be continued (NO in S16), it ends the running of the self-driving car 6, and the information processing system 1 ends the process.
- when the travel determination unit 22 determines that the travel can be continued (YES in S16), the process returns to step S13.
- the traveling determination unit 22 causes the autonomous driving vehicle 6 to travel by acquiring a plurality of traveling tasks until the autonomous driving vehicle 6 arrives at the destination.
- FIG. 4A is a flowchart showing detailed processing of the determination device 5 in the embodiment. In FIG. 4A, it is assumed that the process is executed after the start of traveling of the autonomous driving vehicle 6.
- the condition acquisition unit 55 calculates and acquires the request sensing area. Specifically, the condition acquisition unit 55 calculates and acquires the required sensing area based on the vehicle spec information and the traveling task acquired via the determination unit 56. The details of the calculation process of the request sensing region will be described later.
- when the traveling task acquisition unit 51 acquires the traveling task from the operation control device 3, it outputs the traveling task to the determination unit 56, and the determination unit 56 acquires the traveling task (S21).
- the determination unit 56 acquires the first sensing data from the automatic driving device 2 (S22).
- the determination unit 56 acquires the second sensing data from the infrastructure device 4 (S23).
- the determination unit 56 calculates and acquires the actual sensing area based on the first sensing data and the second sensing data (S24).
- the actual sensing area includes the name of the sensing device and the actual sensing range for each target area.
- the actual sensing range is indicated by the entire area or the actual sensing distance.
- the actual sensing range of the target area A0 is the entire area, and the actual sensing ranges of the target areas A1 to A8 are numerical values of the actual sensing distances.
- FIG. 4B is a diagram showing an example of an actual sensing region of the information processing system 1 according to the embodiment.
- the actual sensing region is a region defined based on the first sensing region and the second sensing region, and is, for example, a region in which the first sensing region and the second sensing region are combined (in other words, a synthetic sensing region).
- the actual sensing distance is the distance actually sensed by the first sensing unit 21 (that is, the autonomous driving vehicle 6) or the second sensing unit 41 (that is, the infrastructure device 4).
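The actual sensing region per target area (as in FIG. 4B) could be combined from the two sensors as sketched below. Note the combination rule is an assumption: the text only says the first and second sensing regions are "combined", so taking the larger distance per target area is one plausible reading, and all names are illustrative.

```python
# Hypothetical sketch: combine the vehicle's and the infrastructure's
# actual sensing ranges per target area by taking the larger distance.

FULL_AREA = float("inf")  # stand-in for "entire area"

def combine_actual_ranges(vehicle_ranges, infra_ranges):
    areas = set(vehicle_ranges) | set(infra_ranges)
    return {a: max(vehicle_ranges.get(a, 0.0), infra_ranges.get(a, 0.0))
            for a in areas}

vehicle = {"A0": FULL_AREA, "A1": 30.0, "A3": 20.0}
infra = {"A1": 45.0, "A7": 50.0}
print(combine_actual_ranges(vehicle, infra))
# A1 becomes 45.0 m because the infrastructure sensor sees farther there
```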
- the determination unit 56 determines whether the synthetic sensing region is sufficient (S25). Specifically, the determination unit 56 determines whether or not to limit the execution of the traveling task based on the first sensing region or the second sensing region and the required sensing region.
- when the determination unit 56 determines that the synthetic sensing area is sufficient (YES in S25), it outputs the travel task permission to the travel determination unit 22 (S31).
- the determination device 5 determines whether or not the running of the self-driving car 6 has been completed (S32). Specifically, the determination device 5 determines whether or not the travel determination unit 22 has completed the travel of the autonomous driving vehicle 6.
- when the determination device 5 determines that the running of the autonomous driving vehicle 6 has been completed (YES in S32), it ends the process. Further, when the determination device 5 determines that the traveling of the automatic driving vehicle 6 has not been completed (NO in S32), the determination device 5 returns the process to step S21.
- when the synthetic sensing area is not sufficient (NO in S25), the determination unit 56 determines whether or not the insufficiency can be solved by adjusting the infrastructure device 4 (that is, the second sensing unit 41) (S26). Specifically, the determination unit 56 determines whether or not the synthetic sensing area becomes sufficient by adjusting the installation position of the infrastructure device 4, the sensing range of the second sensing unit 41, the accuracy, and the like.
- the solution may be the addition of a sensor, a change of infrastructure, or the like. In addition, adjustable items may be preset as solutions, and the selected solution may be notified to the administrator.
- when the determination unit 56 determines that the problem can be solved by adjusting the infrastructure device 4 (YES in S26), the infrastructure device 4 is adjusted (S33), and the process returns to step S23. Since the processes of steps S26 and S33 are not essential, they may be omitted.
- when the determination unit 56 determines that the problem cannot be solved by adjusting the infrastructure device 4 (NO in S26), the travel task restriction unit 57 restricts the execution of the travel task by generating a restriction item for the travel task for which the synthetic sensing area is determined to be insufficient (S27).
- the traveling task restriction unit 57 outputs the generated restriction item to the traveling plan change unit 33 to cause the traveling plan change unit 33 to change the traveling plan information (including the acquired traveling task) (S28).
- the travel plan change unit 33 changes the travel plan information stored in the first storage unit 32 based on the restrictions.
- the traveling task limiting unit 57 determines whether or not the autonomous driving vehicle 6 can continue traveling based on the changed traveling plan information (S29).
- the travel task restriction unit 57 acquires the determination result by the travel determination unit 22 as to whether or not the travel can be continued based on the changed travel task together with the travel plan information.
- when the traveling task restriction unit 57 determines that the traveling cannot be continued (NO in S29), it outputs an instruction to end the traveling of the autonomous driving vehicle 6 to the traveling determination unit 22 (S30).
- the self-driving car 6 ends traveling.
- the determination device 5 ends the process.
- the required sensing distance is, for example, a predetermined distance from the intersection region when the autonomous driving vehicle 6 enters the intersection.
- the predetermined distance is a distance for sensing a moving body entering the intersection.
- FIG. 5 is a flowchart showing a process of calculating the required sensing distance.
- the condition acquisition unit 55 acquires safety requirement information from the fourth storage unit 54 (S41). For example, when the traveling environment requirement of the safety requirement information is set so that other moving objects do not enter the intersection, the autonomous driving vehicle 6 can enter the intersection safely. In addition, when the sensing requirement of the safety requirement information is set so that other moving objects in and around the intersection can be sensed, the possibility that the autonomous driving vehicle 6 overlooks other moving objects is reduced; that is, the occurrence of accidents and incidents can be suppressed.
- the condition acquisition unit 55 acquires travel point information from the third storage unit 53.
- the condition acquisition unit 55 searches for a sensing target area from the acquired travel point information (S42). Specifically, the condition acquisition unit 55 determines a road lane and an area as a target area to be sensed for each traveling task from the traveling point information.
- FIG. 6 is a diagram showing an example of the required sensing region R1 and the required sensing distance.
- the condition acquisition unit 55 calculates the required sensing distance (S43). Specifically, the condition acquisition unit 55 calculates the required sensing distance for each driving task using the required sensing distance calculation inputs (the maximum speed, the speed limit, the approach speed to the intersection, the maximum acceleration, the vehicle response time, the right-turn route length, the assumed maximum speed of other vehicles, and the like).
- the condition acquisition unit 55 substitutes the maximum speed v_max of the self-driving car 6, the approach speed v_min to the intersection, and the maximum acceleration a_max into equation (1) to calculate the time t_max_velocity required for the self-driving car 6 to accelerate to the maximum speed. For example, when the self-driving car 6 departs from a stop, v_min is set to 0.
- the condition acquisition unit 55 substitutes the maximum speed v_max of the self-driving car 6, the approach speed v_min to the intersection, and the time t_max_velocity into equation (2) to calculate the distance l_max_velocity required for the self-driving car 6 to accelerate to the maximum speed.
- the condition acquisition unit 55 substitutes the maximum speed v_max, the maximum acceleration a_max, the vehicle response time t_response, the right-turn route length l_task, the time t_max_velocity, and the distance l_max_velocity of the autonomous driving vehicle 6 into equation (3) to calculate the time t_task required for the autonomous driving vehicle 6 to complete the traveling task.
- the condition acquisition unit 55 substitutes the time t task and the assumed maximum speed v other of other vehicles into equation (4) to calculate the maximum distance l move other that another moving body can travel before the autonomous driving vehicle 6 completes the traveling task.
- the condition acquisition unit 55 sets the calculated distance l move other as the required sensing distance.
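Equations (1) through (4) themselves are not reproduced in this text, so the following is a minimal sketch of the calculation in step S43, reconstructed from standard constant-acceleration kinematics; the function name and the exact forms of the equations are assumptions, not the publication's verbatim formulas.

```python
# Assumed reconstruction of the required-sensing-distance calculation.
# The equation forms below are inferred from the surrounding description
# (constant-acceleration kinematics), not quoted from the publication.

def required_sensing_distance(v_max, v_min, a_max, t_response, l_task, v_other):
    # (1) time to accelerate from the approach speed v_min to v_max
    t_max_velocity = (v_max - v_min) / a_max
    # (2) distance covered while accelerating to v_max
    l_max_velocity = v_min * t_max_velocity + 0.5 * a_max * t_max_velocity ** 2
    # (3) total time to complete the traveling task: vehicle response time,
    #     the acceleration phase, then the rest of the route at v_max
    t_task = t_response + t_max_velocity + max(l_task - l_max_velocity, 0.0) / v_max
    # (4) farthest another moving body could travel in that time; this is
    #     taken as the required sensing distance
    return v_other * t_task

# e.g. v_max = 10 m/s from standstill, a_max = 2 m/s^2, 0.5 s response,
# 50 m right-turn route, other vehicles assumed at 15 m/s
print(required_sensing_distance(10.0, 0.0, 2.0, 0.5, 50.0, 15.0))  # 120.0
```

A longer right-turn route or a faster assumed speed for other vehicles enlarges the required sensing distance accordingly.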
- in this way, the information processing system 1 can calculate the required sensing distance. For example, in FIG. 6, when the self-driving car 6 turns right at the intersection area A0, the required sensing distance from the intersection area A0 is calculated for each of the lanes A1, A3, and A7. As a result, a sensing requirement (in other words, a safety requirement) can be set so as to detect that no other moving body will intrude before the autonomous driving vehicle 6 completes the right turn.
- FIG. 7 is a flowchart showing the process of searching the target area lane.
- among the approach lanes A1, A3, A5, and A7 into the intersection area A0 that the autonomous driving vehicle 6 is about to enter, the condition acquisition unit 55 determines lanes A1, A3, and A7 as lanes from which a moving body other than the autonomous driving vehicle 6 may enter (S51).
- condition acquisition unit 55 determines whether or not a traffic light exists in the intersection region A0 (S52).
- when no traffic light exists (NO in S52), the condition acquisition unit 55 determines all approach lanes A1, A3, and A7 other than lane A5, in which the autonomous driving vehicle 6 exists, as the target area lanes (S53).
- when a traffic light exists (YES in S52), the condition acquisition unit 55 determines the approach lane A1 of oncoming vehicles with respect to the autonomous driving vehicle 6 as the target area lane (S54).
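Steps S51 to S54 above can be sketched as follows; the lane identifiers and the traffic-light flag are illustrative placeholders, not names from the publication.

```python
def target_area_lanes(approach_lanes, own_lane, has_traffic_light, oncoming_lane):
    # S51: approach lanes other than the vehicle's own lane are lanes
    # from which another moving body may enter the intersection
    candidates = [lane for lane in approach_lanes if lane != own_lane]
    # S52: does a traffic light exist in the intersection area?
    if has_traffic_light:
        # S54: only the oncoming vehicles' approach lane needs sensing
        return [lane for lane in candidates if lane == oncoming_lane]
    # S53: without a traffic light, every other approach lane is a target
    return candidates

print(target_area_lanes(["A1", "A3", "A5", "A7"], "A5", False, "A1"))  # ['A1', 'A3', 'A7']
print(target_area_lanes(["A1", "A3", "A5", "A7"], "A5", True, "A1"))   # ['A1']
```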
- FIG. 8A is a flowchart showing a process of calculating the actual sensing region.
- the determination unit 56 calculates the actual sensing area for each sensor (first sensing unit 21 and second sensing unit 41) (S61).
- the calculation of the actual sensing area for each sensor will be described with reference to FIG. 8B.
- FIG. 8B is a flowchart showing a process of calculating the actual sensing area for each sensor.
- the determination unit 56 acquires the maximum sensing distance of the first sensing unit 21 (S61a). For example, the determination unit 56 acquires the maximum sensing distance of the first sensing unit 21 from the vehicle spec information stored in the second storage unit 52.
- the determination unit 56 calculates the distance from the first sensing unit 21 (that is, the autonomous driving vehicle 6) to the object in each direction based on the first sensing data (S61b).
- the determination unit 56 sets the smaller of the maximum sensing distance and the distance to the object in each direction as the actual sensing distance in that direction (S61c). For example, in FIG. 9, the distance from the autonomous vehicle 6 in lane A5 to the obstacle 7 is shorter than the maximum sensing distance. Therefore, the determination unit 56 sets the distance from the autonomous driving vehicle 6 to the obstacle 7 as the actual sensing distance.
- the side of the obstacle 7 opposite to the self-driving car 6 is a blind spot of the self-driving car 6.
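A minimal sketch of step S61c, assuming per-direction object distances are available as a mapping; a direction with no detected object can carry infinity so the sensor's maximum range wins.

```python
import math

def actual_sensing_distances(max_sensing_distance, obstacle_distances):
    # S61c: per direction, the actual reach is whichever is smaller:
    # the sensor's maximum range or the distance to the first object
    # (everything beyond an obstacle is a blind spot)
    return {direction: min(max_sensing_distance, distance)
            for direction, distance in obstacle_distances.items()}

# obstacle at 20 m to the front, nothing detected to the left
print(actual_sensing_distances(50.0, {"front": 20.0, "left": math.inf}))
# {'front': 20.0, 'left': 50.0}
```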
- FIG. 9 is a diagram showing an example of the relationship between the first sensing region K1 and the second sensing region K2.
- the determination unit 56 superimposes the first sensing region, which is the actual sensing region, and the second sensing region (S62).
- the first sensing region K1 is shown by hatching of diagonal lines in a grid pattern
- the second sensing region K2 is shown by hatching in a dot shape.
- the determination unit 56 calculates an actual sensing region, that is, a synthetic sensing region, used for determining whether or not to limit the execution of the traveling task, based on the result of superimposing the first sensing region K1 and the second sensing region K2 (S63).
- the determination unit 56 determines a region covered by either the first sensing region or the second sensing region as the synthetic sensing region.
- that is, the union of the first sensing region and the second sensing region is determined as the synthetic sensing region.
- in FIG. 9, the region with either the grid-pattern diagonal hatching or the dotted hatching is determined as the synthetic sensing region.
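As a sketch, if each sensing region is represented as a set of occupancy-grid cells (a representation assumed here; the publication does not fix one), the synthetic sensing region of step S63 is simply the set union:

```python
def synthetic_sensing_region(first_region, second_region):
    # a cell is covered when either the in-vehicle sensor (K1) or the
    # infrastructure sensor (K2) covers it, i.e. the set union
    return first_region | second_region

k1 = {(0, 0), (0, 1), (1, 1)}
k2 = {(1, 1), (2, 1), (2, 2)}
print(sorted(synthetic_sensing_region(k1, k2)))
# [(0, 0), (0, 1), (1, 1), (2, 1), (2, 2)]
```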
- step S25 of FIG. 4A is performed.
- An example of the determination process based on the required sensing region and the actual sensing region will be described with reference to FIG.
- FIG. 10 is a diagram showing an example of the relationship between the required sensing region and the actual sensing region.
- the union of the first sensing region K1 and the second sensing region K2 is shown as the actual sensing region K3 with diagonal-line hatching. Further, among the required sensing regions, the region R2 that overlaps the actual sensing region K3 is indicated by vertical-line hatching, and the region R3 that does not overlap the actual sensing region K3 is indicated by finely dotted hatching.
- because the region R3, a part of the required sensing region, does not overlap the actual sensing region K3, the determination unit 56 determines that the execution of the traveling task, that is, the right turn, is to be restricted.
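Assuming regions are again represented as sets of grid cells (an illustrative choice, not fixed by the publication), the determination reduces to a coverage check: the task is restricted whenever some required cell lies outside the actual sensing region.

```python
def should_restrict(required_region, actual_region):
    # cells that must be sensed but are not actually covered; this set
    # corresponds to the non-overlapping region R3 in FIG. 10
    uncovered = required_region - actual_region
    return len(uncovered) > 0

required = {(0, 0), (1, 0), (2, 0)}
actual = {(0, 0), (1, 0), (1, 1)}
print(should_restrict(required, actual))  # True: cell (2, 0) is unsensed
print(should_restrict({(0, 0)}, actual))  # False: fully covered
```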
- in this way, the execution of the traveling task can be restricted depending on the traveling specifications of the moving body. That is, when the traveling specifications and the sensor specifications are not suited to the traveling scene, the execution of the traveling task can be restricted. Therefore, safety in executing a traveling task can be provided for moving bodies having various traveling specifications.
- for example, when the traveling specifications and the sensor specifications of the moving body are not suited to the traveling scene and the execution of the traveling task cannot be completed, or the traveling task cannot be executed at all, that is, when the moving body cannot travel, the execution of the traveling task is stopped. As a result, the occurrence of an accident or incident due to the execution of the traveling task can be suppressed. Further, for example, when the traveling specifications and the sensor specifications of the moving body are not suited to the traveling scene and the traveling task cannot be executed safely, that is, when the moving body cannot travel safely, the content of the execution of the traveling task is changed. As a result, even if the moving body does not completely satisfy the conditions for safe traveling, it can continue to travel safely by executing a traveling task with limited content instead of the original traveling task.
- (Modification 1) In the above embodiment, the information processing method and the information processing system 1 have been described as an example, but the present disclosure is not limited to this.
- the terminal device 80 and the observer terminal 90 may be communicably connected to the information processing system 1.
- hereinafter, this case will be described as Modification 1, focusing on the differences from the above-described embodiment.
- FIG. 11 is a schematic diagram showing the information processing system 1 in the modified example 1.
- as shown in FIG. 11, the traveling task restriction unit 57 determines the monitoring mode of the autonomous driving vehicle 6 by the observer according to the result of the determination of whether or not to restrict the execution of the traveling task. Specifically, when the determination unit 56 determines that the execution of the traveling task is to be restricted, the traveling task restriction unit 57 adds the autonomous driving vehicle 6 to the monitoring targets or raises the monitoring priority of the autonomous driving vehicle 6.
- further, the traveling task restriction unit 57 outputs, to the terminal device 80 and the observer terminal 90 described later, the fact that the execution of the traveling task is restricted. As a result, the administrator or passenger of the self-driving car 6 is notified that the execution of the traveling task is restricted.
- the traveling task limiting unit 57 may output to the terminal device 80 and the observer terminal 90 that the autonomous driving vehicle 6 is added to the monitoring target or that the monitoring priority of the autonomous driving vehicle 6 is raised.
- the terminal device 80 is a car navigation device, a personal computer, a smartphone, a tablet terminal, or the like that is communicably connected to the information processing system 1.
- the terminal device 80 notifies the owner or passenger of the self-driving car 6 of at least one of: the autonomous driving vehicle 6 added to the monitoring targets, the autonomous driving vehicle 6 whose monitoring priority has been raised, the autonomous driving vehicle 6 whose execution of a traveling task is restricted, and the fact that the execution of the traveling task is restricted.
- the notification may be realized by a display by a display device such as a display, an audio output by an audio device such as a speaker, or the like.
- the owner is an example of an administrator.
- the observer terminal 90 is a personal computer, a smartphone, a tablet terminal, or the like that is communicably connected to the information processing system 1.
- the observer terminal 90 acquires, from the traveling task restriction unit 57, at least one of: the autonomous driving vehicle 6 added to the monitoring targets, the autonomous driving vehicle 6 whose monitoring priority has been raised, the autonomous driving vehicle 6 whose execution of the traveling task is restricted, and the fact that the execution of the traveling task has been restricted, and notifies the observer.
- the notification may be realized by a display by a display device such as a display, an audio output by an audio device such as a speaker, or the like.
- the observer is an example of an administrator.
- the occurrence of an accident or an incident can be suppressed.
- the observer will be able to respond promptly.
- the manager or the passenger can grasp that the execution of the traveling task of the moving body is restricted.
- the burden of monitoring the moving body by the observer can be reduced.
- (Modification 2) In the above embodiment, an example in which the sensing requirement is the required sensing region has been described, but the sensing requirement may be another requirement.
- specifically, the sensing requirement may be a required sensing target, that is, a target that requires sensing; the first sensing result is then a first sensing target calculated based on the first sensing data, and the determination unit 56 determines whether or not to restrict the execution of the traveling task based on the required sensing target and the first sensing target.
- similarly, the second sensing result is a second sensing target and may be used in the above determination.
- for example, the determination unit 56 restricts the execution of the traveling task according to the sufficiency of the first sensing target with respect to the required sensing target (that is, whether or not the required sensing target is included in the first sensing target).
- for example, the required sensing target is the type and number of geographically fixed objects such as traffic lights, road signs, curbs, or road markings.
- alternatively, the required sensing target is a geographical scene such as an intersection, a curve, or a bridge. For items that may be required sensing targets, information is added to the travel point information and the safety requirement information.
- the condition acquisition unit 55 calculates the required sensing target as an execution condition of the traveling task based on the traveling task, the vehicle spec information, the travel point information, and the safety requirement information. For example, for the traveling task of turning right at point C, the condition acquisition unit 55 acquires the arrangement of geographically fixed objects or the geographical scene at point C from the travel point information, and acquires the speed and acceleration from the vehicle specs. The condition acquisition unit 55 calculates, from the acquired information, the geographically fixed objects or geographical scenes requiring sensing that are specified in the safety requirement information. For example, a traffic light and a signboard in the traveling direction of the autonomous driving vehicle 6 at an intersection a predetermined distance or more ahead are calculated as required sensing targets.
- the determination unit 56 calculates the first sensing target from the first sensing data and the second sensing target from the second sensing data. For example, the determination unit 56 calculates an object (for example, a signboard) located in the traveling direction of the autonomous driving vehicle 6 as the first and second sensing targets from the first and second sensing data, such as image data or point cloud data, respectively.
- the determination unit 56 determines whether or not to restrict the execution of the traveling task based on the required sensing target, the first sensing target, and the second sensing target. For example, the determination unit 56 determines whether or not the objects calculated as the first and second sensing targets include the objects calculated as the required sensing targets, for example, the traffic light and the signboard. When it is determined that the objects calculated as the first and second sensing targets do not include at least one of the traffic light and the signboard calculated as the required sensing targets, the determination unit 56 determines that the execution of the traveling task is to be restricted. Otherwise, it determines that the execution of the traveling task is not to be restricted.
- the determination unit 56 may also determine whether or not to restrict the execution of the traveling task according to the degree of coincidence between the required sensing target and the first sensing target. For example, when the required sensing targets are a traffic light and a signboard and the first sensing targets are a traffic light and a pedestrian crossing, the required sensing targets and the first sensing targets do not coincide, so the execution of the traveling task is restricted. Further, even when the first sensing targets are a traffic light, a signboard, and a pedestrian crossing, the execution of the traveling task is restricted because the required sensing targets and the first sensing targets do not exactly coincide.
- the determination unit 56 may further determine whether or not to restrict the execution of the traveling task according to whether the first sensing target includes the state of the required sensing target (that is, the degree of coincidence or sufficiency between the state of the required sensing target and the state of the first sensing target). Even if the sensing targets coincide, the safety of executing the traveling task changes according to the state of the targets. Therefore, by restricting the execution of the traveling task according to the degree of coincidence of the states of the sensing targets, the moving body can be made to execute the traveling task more safely. For example, when the state of the required sensing target differs from the state of the target sensed by the moving body, the execution of the traveling task can be restricted.
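The sufficiency and coincidence checks described above can be sketched with set operations; the target names are the publication's examples, while the function and flag names are illustrative.

```python
def restrict_by_targets(required, sensed, exact_match=False):
    # coincidence: the sensed targets must equal the required targets exactly
    if exact_match:
        return sensed != required
    # sufficiency: every required target must appear among the sensed targets
    return not required <= sensed

required = {"traffic light", "signboard"}
print(restrict_by_targets(required, {"traffic light", "crosswalk"}))               # True
print(restrict_by_targets(required, {"traffic light", "signboard", "crosswalk"}))  # False
print(restrict_by_targets(required, {"traffic light", "signboard", "crosswalk"},
                          exact_match=True))                                       # True
```

The first call restricts the task because the signboard is missing; the second passes under sufficiency; the third restricts again under exact coincidence, since the superset does not match the required set.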
- (Modification 3) As in Modification 2 above, the sensing requirement may be yet another requirement.
- specifically, the sensing requirement may be a required sensing performance; the first sensing result is then a first sensing performance calculated based on the first sensing data, and the determination unit 56 determines whether or not to restrict the execution of the traveling task based on the required sensing performance and the first sensing performance.
- similarly, the second sensing result is a second sensing performance and may be used in the above determination.
- the determination unit 56 determines whether or not to limit the execution of the traveling task according to whether or not the first sensing performance exceeds the required sensing performance.
- for example, the required sensing performance is sensing precision, accuracy, resolution, processing cycle, or the like. For items that may be required sensing performances, information is added to the safety requirement information.
- the condition acquisition unit 55 calculates the required sensing performance as an execution condition of the traveling task based on the traveling task, the vehicle spec information, the travel point information, and the safety requirement information. For example, for the traveling task of turning right at point C, the condition acquisition unit 55 acquires the speed limit and the assumed maximum speed of other vehicles from the travel point information, and acquires the speed and acceleration from the vehicle specs. The condition acquisition unit 55 calculates, from the acquired information, the sensing accuracy specified in the safety requirement information. For example, a processing cycle that can secure enough time to avoid another vehicle even if it tries to enter the intersection is calculated as the required sensing performance.
- the determination unit 56 calculates the first sensing performance from the first sensing data and the second sensing performance from the second sensing data. For example, the determination unit 56 calculates the processing cycles of the first sensing unit 21 and the second sensing unit 41 as the first and second sensing performances from the first and second sensing data such as image data or point cloud data, respectively.
- the determination unit 56 determines whether or not to restrict the execution of the traveling task based on the required sensing performance, the first sensing performance, and the second sensing performance. For example, the determination unit 56 determines whether or not the processing cycles calculated as the first and second sensing performances are shorter than the processing cycle calculated as the required sensing performance. When it is determined that either of the processing cycles calculated as the first and second sensing performances is equal to or longer than the processing cycle calculated as the required sensing performance, the determination unit 56 determines that the execution of the traveling task is to be restricted. Otherwise, it determines that the execution of the traveling task is not to be restricted.
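A minimal sketch of this processing-cycle check, assuming cycles are given in seconds; the function name is illustrative.

```python
def restrict_by_cycle(required_cycle, first_cycle, second_cycle):
    # both sensors' processing cycles must be strictly shorter than the
    # required cycle; otherwise the traveling task is restricted
    return first_cycle >= required_cycle or second_cycle >= required_cycle

print(restrict_by_cycle(0.10, 0.05, 0.20))  # True: second sensor too slow
print(restrict_by_cycle(0.10, 0.05, 0.08))  # False: both fast enough
```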
- (Modification 4) In the above embodiment, the determination unit 56 calculates the first and second sensing results, but the present disclosure is not limited to this.
- the first and second sensing results may be calculated by the first sensing unit 21 and the second sensing unit 41, respectively.
- in this case, the first and second sensing results are output to the determination device 5, that is, to the determination unit 56.
- (Modification 5) In the above embodiment, the moving body is an autonomous driving vehicle, but the present disclosure is not limited to this.
- the moving body may be an autonomous mobile robot.
- for example, a robot that is used inside a building and moves along passages is assumed.
- of course, the moving body may also be a robot that is used outside a building and travels on roads.
- hereinafter, an example in which the moving body is the autonomous mobile robot 8 will be described. Descriptions of configurations or processes substantially identical to those of the above-described embodiment are omitted.
- the travel task generation unit 31 generates a travel task based on the travel plan information of the autonomous mobile robot 8 acquired from the first storage unit 32.
- the travel task includes a travel task name, a travel task type, and a point name, as shown in FIG. 12A.
- FIG. 12A is a diagram showing an example of a traveling task of the information processing system 1 in the modified example 5.
- the travel plan information includes the route and the point where the travel task on the route is executed.
- the points include the starting point, the destination, the right turn point, the left turn point, the stop point, and the like, as well as the points where equipment in the building such as an elevator is installed.
- FIG. 12B is a diagram showing an example of travel plan information of the information processing system 1 in the modified example 5.
- the first storage unit 32 stores a travel plan information database showing a travel plan of the autonomous mobile robot 8.
- the second storage unit 52 stores a database of robot spec information (hereinafter, robot spec information database) indicating the specifications of the autonomous mobile robot 8.
- the second storage unit 52 outputs the robot spec information in response to the request of the condition acquisition unit 55.
- the robot specifications are specifications related to the running of the autonomous mobile robot 8.
- the robot spec information includes a robot name, a movement method, a maximum acceleration, a maximum deceleration, a maximum speed, a response time, and the like.
- FIG. 12C is a diagram showing an example of robot spec information of the information processing system 1 in the modified example 5.
- the third storage unit 53 stores a travel point information database.
- the travel point information includes map information such as a point name, a point number assigned to the point name, a passage type, and traffic environment information such as an approach lane.
- the traffic environment information may include information such as the presence / absence of equipment in the building or the type of equipment.
- FIG. 12D is a diagram showing an example of travel point information of the information processing system 1 in the modified example 5.
- the fourth storage unit 54 stores the safety requirement information database.
- the safety requirement information includes a traveling task name, a target area type, a passage type, a required sensing range within a point, a required sensing range of a connecting passage, a required sensing range calculation input, and the like.
- the target area type, the required sensing range within the point, and the required sensing range of the connecting passage are safety requirements related to the sensing requirement.
- FIG. 12E is a diagram showing an example of safety requirement information of the information processing system 1 in the modified example 5.
- the target area type indicates the type of area in which the autonomous mobile robot 8 may collide with another object. Specifically, in the example of FIG. 12E, the target area type shows the inside of the point and the passage connecting to the point as the type of the above area.
- the required sensing range inside the point is the required sensing range of the type area inside the point.
- the required sensing range inside the point is set to the entire area in the example of FIG. 12E.
- the required sensing range of the connecting passage is the required sensing range of the area of the type called the connecting passage.
- because a region far from the connecting portion with the point has a low possibility of collision, the required sensing range of the connecting passage is set to the region from the connecting portion up to the required sensing distance.
- the required sensing range calculation input is information input for calculating the required sensing distance.
- the required sensing range calculation input is the maximum speed, maximum acceleration, response time, etc. of the autonomous mobile robot 8.
- the condition acquisition unit 55 acquires robot spec information from the second storage unit 52 and travel point information from the third storage unit 53. Further, the condition acquisition unit 55 acquires a travel task from the travel task acquisition unit 51 via the determination unit 56. Further, the condition acquisition unit 55 further acquires safety requirement information from the fourth storage unit 54.
- the condition acquisition unit 55 calculates the execution condition of the running task.
- the execution condition of the travel task is an execution condition of the travel task according to at least one of the autonomous mobile robot 8 and the travel point.
- the condition acquisition unit 55 calculates the sensing requirement based on the robot spec information, the traveling task, and the safety requirement information. Specifically, the condition acquisition unit 55 calculates the required sensing region based on the robot spec information, the traveling task, and the safety requirement information.
- the required sensing area includes the robot name, the traveling task name, the target area ID, and the required sensing range for each area type.
- the required sensing range is, for example, the required sensing distance in the entire area or the target area.
- the entire area is set for the target area B0, and the required sensing distance of 3 m is set for each of the target areas B1, B2, and B3.
- FIG. 12F is a diagram showing an example of the required sensing region of the information processing system 1 in the modified example 5.
- the condition acquisition unit 55 searches the target area for sensing based on the robot spec information, the traveling task, and the safety requirement information. Specifically, the condition acquisition unit 55 determines an area and a passage connected to the area as a target area to be sensed for each traveling task based on the traveling point information specified from the safety requirement information.
- the passages connected to the intersection area B0 are the passages B1, B2, and B3 as shown by the hatching of vertical lines.
- the condition acquisition unit 55 refers to the target area type specified in the safety requirement information and searches for the area corresponding to the type.
- the intersection area B0 and the passages B1, B2, and B3 correspond to the target areas, they are determined as the target areas, respectively.
- a part of each passage that is a target area becomes the required sensing region R1.
- FIG. 13 is a diagram showing an example of the required sensing region R1 and the required sensing distance.
- the determination unit 56 acquires the first sensing data from the first sensing unit 21, acquires the second sensing data from the second sensing unit 41, and acquires the execution condition including the sensing requirement from the condition acquisition unit 55. Further, the determination unit 56 acquires a travel task from the travel task acquisition unit 51. For the processing of the determination unit 56 and the traveling task restriction unit 57, refer to the description of the above embodiment.
- in this way, the configuration according to the embodiment of the present disclosure can also be applied to an autonomous mobile robot.
- the information processing method and the information processing system according to the above-described embodiment and modification are realized by a program using a computer, and such a program may be stored in a storage device.
- each processing unit included in the information processing method and the information processing system according to the above-described embodiment and modifications is typically realized as an LSI, which is an integrated circuit. These may be individually formed into single chips, or a single chip may be formed so as to include some or all of them.
- the integrated circuit is not limited to the LSI, and may be realized by a dedicated circuit or a general-purpose processor.
- an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may be used.
- each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
- Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
- the division of functional blocks in the block diagram is an example; a plurality of functional blocks may be realized as one functional block, one functional block may be divided into a plurality of functional blocks, and some functions may be transferred to other functional blocks. Further, the functions of a plurality of functional blocks having similar functions may be processed by a single piece of hardware or software in parallel or in a time-division manner.
- the order in which the steps in the flowcharts are executed is given as an example to describe the present disclosure in detail, and may be an order other than the above. Further, some of the above steps may be executed simultaneously (in parallel) with other steps.
- the present disclosure can be applied to an autonomous vehicle, a device for remotely controlling an autonomous vehicle, an autonomous mobile robot, or a system including these.
Abstract
Description
<Configuration: Information processing system 1>
FIG. 1 is a block diagram showing the information processing system 1 in the embodiment.
The automatic driving device 2 is mounted on a moving body, senses the surroundings of the moving body, and controls the traveling of the moving body based on the sensing result. The moving body is a vehicle, an aircraft, a ship, or the like. In the present embodiment, the description below assumes the self-driving car 6 as the moving body.
The operation control device 3 includes a traveling task generation unit 31, a first storage unit 32, and a traveling plan change unit 33.
The infrastructure device 4 is installed in infrastructure such as a road or a traffic light. The infrastructure device 4 includes a second sensing unit 41.
The determination device 5 includes a traveling task acquisition unit 51, a second storage unit 52, a third storage unit 53, a fourth storage unit 54, a condition acquisition unit 55, a determination unit 56, and a traveling task restriction unit 57.
The processing of the information processing system 1 configured as described above will be described.
Next, the operation and effects of the information processing method and the information processing system 1 in the present embodiment will be described.
Although the present disclosure has been described above based on the embodiment and modifications, the present disclosure is not limited to these embodiments and modifications.
6 self-driving car (moving body)
8 autonomous mobile robot (moving body)
21 first sensing unit (first sensor)
41 second sensing unit (second sensor)
51 traveling task acquisition unit (first acquisition unit)
55 condition acquisition unit (third acquisition unit, first calculation unit)
56 determination unit (second acquisition unit, second calculation unit, determination unit)
57 traveling task restriction unit (output unit)
Claims (16)
- An information processing method executed by a computer, the method comprising:
acquiring a task related to travel to be executed in a moving body, first sensing data output by a first sensor that is mounted on the moving body and senses outside of the moving body, and a spec related to travel of the moving body;
calculating a sensing requirement based on the task and the spec;
calculating a first sensing result based on the first sensing data output by the first sensor;
determining whether to restrict execution of the task based on the sensing requirement and the first sensing result; and
when it is determined that execution of the task is to be restricted, outputting an instruction to restrict execution of the task to the moving body.
- The information processing method according to claim 1, wherein the sensing requirement includes a required sensing region that is a region for which sensing is required, the first sensing result includes a first sensing region calculated based on the first sensing data, and in the determining, whether to restrict execution of the task is determined based on the required sensing region and the first sensing region.
- The information processing method according to claim 2, wherein the determining is performed according to an overlap between the required sensing region and the first sensing region.
- The information processing method according to claim 3, wherein the determining is performed according to a degree of overlap between the required sensing region and the first sensing region.
- The information processing method according to claim 3, wherein the determining is performed according to a region where the required sensing region and the first sensing region do not overlap.
- The information processing method according to any one of claims 2 to 5, wherein the restriction is prohibition of execution of the task.
- The information processing method according to any one of claims 2 to 5, wherein the restriction is a change of execution content of the task.
- The information processing method according to claim 7, wherein content of the change of the task is determined based on the overlap between the required sensing region and the first sensing region.
- The information processing method according to any one of claims 1 to 8, further comprising: acquiring second sensing data output by a second sensor installed on a movement route of the moving body; and calculating a second sensing result based on the second sensing data, wherein the determining is performed further based on the second sensing result.
- The information processing method according to any one of claims 1 to 9, further comprising, when it is determined that execution of the task is to be restricted, adding the moving body to monitoring targets or raising a monitoring priority of the moving body.
- The information processing method according to any one of claims 1 to 10, further comprising, when it is determined that execution of the task is to be restricted, notifying an administrator or a passenger of the moving body that execution of the task is restricted.
- The information processing method according to claim 1, wherein the sensing requirement includes a required sensing target that is a target for which sensing is required, the first sensing result includes a first sensing target calculated based on the first sensing data, and in the determining, whether to restrict execution of the task is determined based on the required sensing target and the first sensing target.
- The information processing method according to claim 12, wherein the determining is performed according to a sufficiency or a degree of match between the required sensing target and the first sensing target.
- The information processing method according to claim 1, wherein the sensing requirement includes a required sensing performance that is a performance required for sensing, the first sensing result includes a first sensing performance calculated based on the first sensing data, and in the determining, whether to restrict execution of the task is determined based on the required sensing performance and the first sensing performance.
- The information processing method according to claim 14, wherein the determining is performed according to whether the first sensing performance exceeds the required sensing performance.
- An information processing system comprising: a first acquisition unit that acquires a task related to travel to be executed in a moving body; a second acquisition unit that acquires first sensing data output by a first sensor that is mounted on the moving body and senses outside of the moving body; a third acquisition unit that acquires a spec related to travel of the moving body; a first calculation unit that calculates a sensing requirement based on the task and the spec; a second calculation unit that calculates a first sensing result based on the first sensing data output by the first sensor; a determination unit that determines whether to restrict execution of the task based on the sensing requirement and the first sensing result; and an output unit that, when it is determined that execution of the task is to be restricted, outputs an instruction to restrict execution of the task to the moving body.
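As a rough illustration of the claimed flow, the region-based determination of claims 1 to 5 can be sketched with axis-aligned rectangles and a coverage threshold. Both the rectangle representation and the 0.95 threshold are assumptions made for this example; the claims do not prescribe a region representation or a specific criterion:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    """Axis-aligned region in the moving body's frame (metres)."""
    x0: float
    y0: float
    x1: float
    y1: float

    def area(self) -> float:
        # Clamp to zero so an empty intersection has zero area.
        return max(0.0, self.x1 - self.x0) * max(0.0, self.y1 - self.y0)

    def intersect(self, other: "Rect") -> "Rect":
        return Rect(max(self.x0, other.x0), max(self.y0, other.y0),
                    min(self.x1, other.x1), min(self.y1, other.y1))

def restrict_task(required: Rect, sensed: Rect,
                  min_coverage: float = 0.95) -> bool:
    """Restrict execution of the task when the first sensing region covers
    less than `min_coverage` of the required sensing region."""
    if required.area() == 0.0:
        return False
    coverage = required.intersect(sensed).area() / required.area()
    return coverage < min_coverage
```

When `restrict_task` returns true, the output step of claim 1 would send a restriction instruction (prohibition or a change of execution content) to the moving body.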
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202080065981.1A CN114423664A (zh) | 2019-12-26 | 2020-12-11 | 信息处理方法以及信息处理系统 |
JP2021567234A JPWO2021131785A1 (ja) | 2019-12-26 | 2020-12-11 | |
US17/691,711 US20220194408A1 (en) | 2019-12-26 | 2022-03-10 | Information processing method and information processing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-236880 | 2019-12-26 | ||
JP2019236880 | 2019-12-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/691,711 Continuation US20220194408A1 (en) | 2019-12-26 | 2022-03-10 | Information processing method and information processing system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021131785A1 true WO2021131785A1 (ja) | 2021-07-01 |
Family
ID=76575237
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/046256 WO2021131785A1 (ja) | 2019-12-26 | 2020-12-11 | 情報処理方法及び情報処理システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20220194408A1 (ja) |
JP (1) | JPWO2021131785A1 (ja) |
CN (1) | CN114423664A (ja) |
WO (1) | WO2021131785A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6312944B2 | 1980-03-20 | 1988-03-23 | Pulp and Paper Research Institute of Canada | |
JPH11345396A * | 1998-06-02 | 1999-12-14 | Public Works Research Institute, Ministry of Construction | Main-line traffic flow prediction method in a merging control system of a driving support road system |
JP2000078566A * | 1998-08-31 | 2000-03-14 | Aisin Seiki Co Ltd | Parking assist device |
WO2016194134A1 * | 2015-06-02 | 2016-12-08 | Nissan Motor Co., Ltd. | Vehicle control device and vehicle control method |
JP2019011055A * | 2018-10-01 | 2019-01-24 | Denso Corporation | Driving support device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5053776B2 * | 2007-09-14 | 2012-10-17 | Denso Corporation | Vehicle visibility support system, in-vehicle device, and information distribution device |
JP5494332B2 * | 2010-07-27 | 2014-05-14 | Toyota Motor Corporation | Vehicle control system |
US9505413B2 * | 2015-03-20 | 2016-11-29 | Harman International Industries, Incorporated | Systems and methods for prioritized driver alerts |
JP6550994B2 * | 2015-07-15 | 2019-07-31 | Nissan Motor Co., Ltd. | Control method of travel control device and travel control device |
US11945454B2 * | 2019-04-17 | 2024-04-02 | Paccar Inc | Vehicle maximum speed limiter bypass system |
Also Published As
Publication number | Publication date |
---|---|
CN114423664A (zh) | 2022-04-29 |
JPWO2021131785A1 (ja) | 2021-07-01 |
US20220194408A1 (en) | 2022-06-23 |
Legal Events
Code | Title | Description |
---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 20908290; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2021567234; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 2020908290; Country of ref document: EP; Effective date: 20220726 |
122 | EP: PCT application non-entry in European phase | Ref document number: 20908290; Country of ref document: EP; Kind code of ref document: A1 |