WO2020147360A1 - Control method and device for unmanned vehicle (用于无人车的控制方法和装置) - Google Patents

Control method and device for unmanned vehicle (用于无人车的控制方法和装置)

Info

Publication number
WO2020147360A1
WO2020147360A1 (application PCT/CN2019/112541)
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
obstacle
driving state
unmanned vehicle
Prior art date
Application number
PCT/CN2019/112541
Other languages
English (en)
French (fr)
Inventor
王月 (Wang Yue)
闵芮豪 (Min Ruihao)
薛晶晶 (Xue Jingjing)
刘颖楠 (Liu Yingnan)
慎东辉 (Shen Donghui)
程烈 (Cheng Lie)
Original Assignee
北京百度网讯科技有限公司 (Beijing Baidu Netcom Science and Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京百度网讯科技有限公司 (Beijing Baidu Netcom Science and Technology Co., Ltd.)
Priority to JP2020568740A (published as JP2021532009A)
Priority to EP19910877.0A (published as EP3812866A4)
Publication of WO2020147360A1
Priority to US17/118,590 (published as US20210132614A1)

Classifications

    • G05D 1/0214: Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • B60Q 1/143: Automatic dimming circuits (switching between high beam and low beam due to change of ambient light or light level in road traffic) combined with another condition, e.g. using vehicle recognition from camera images or activation of wipers
    • B60W 30/09: Taking automatic action to avoid collision, e.g. braking and steering
    • B60W 30/143: Adaptive cruise control: speed control
    • B60W 60/001: Drive control systems specially adapted for autonomous road vehicles: planning or execution of driving tasks
    • G05D 1/0231: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means
    • G05D 1/0276: Control of position or course in two dimensions, specially adapted to land vehicles, using signals provided by a source external to the vehicle
    • B60Q 2300/114: Vehicle acceleration or deceleration
    • B60Q 2300/314: Ambient light
    • B60W 2050/0067: Confirmation by the driver (manual parameter input using buttons or a keyboard connected to the on-board processor)
    • B60W 2540/00: Input parameters relating to occupants
    • B60W 2554/00: Input parameters relating to objects
    • B60W 2555/20: Ambient conditions, e.g. wind or rain
    • B60W 2556/45: External transmission of data to or from the vehicle

Definitions

  • the embodiments of the present application relate to the field of computer technology, in particular to a control method and device for an unmanned vehicle.
  • An unmanned vehicle is a smart car that senses the road environment through the on-board sensor system, automatically plans the driving route and controls the vehicle to reach the predetermined target.
  • the embodiment of the application proposes a control method and device for an unmanned vehicle.
  • an embodiment of the present application provides a control method for an unmanned vehicle.
  • The method includes: acquiring vehicle driving environment information; determining the intended driving state of the unmanned vehicle based on the vehicle driving environment information; and, in response to detecting the target user's interactive operation on the intended driving state, generating a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
  • the method further includes: in response to receiving environment perception information sent by an external device communicatively connected with the unmanned vehicle, controlling the unmanned vehicle to adjust the driving state according to the environment perception information.
  • the method further includes: presenting at least one of the following on a preset terminal display screen: vehicle driving environment information, a driving state to be adopted, and environment perception information.
  • The method further includes: receiving a set of vehicle travel route information sent by a communicatively connected cloud server; and selecting vehicle travel route information from the set to control the unmanned vehicle to drive along the road indicated by the selected vehicle travel route information.
  • The method further includes: in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, presenting obstacle information describing the obstacle; and, in response to detecting the target user's obstacle confirmation operation on the obstacle information, generating an obstacle elimination instruction corresponding to the confirmation operation to control the unmanned vehicle to adjust its driving state.
  • the vehicle driving environment information includes at least one of the following: vehicle location information, environment image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
  • an embodiment of the present application provides a control device for an unmanned vehicle.
  • The device includes: an information acquiring unit configured to acquire vehicle driving environment information; a state determining unit configured to determine the intended driving state of the unmanned vehicle based on the vehicle driving environment information; and a first adjustment unit configured to, in response to detecting the target user's interactive operation on the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
  • The device further includes: a second adjustment unit configured to, in response to receiving environment perception information sent by an external device communicatively connected with the unmanned vehicle, control the unmanned vehicle to adjust its driving state according to the environment perception information.
  • the device further includes: an information display unit configured to present at least one of the following on a preset terminal display screen: vehicle driving environment information, a driving state to be adopted, and environment perception information.
  • The device further includes: a route receiving unit configured to receive a set of vehicle travel route information sent by a communicatively connected cloud server; and a route selection unit configured to select vehicle travel route information from the set to control the unmanned vehicle to drive along the road indicated by the selected vehicle travel route information.
  • The device further includes: an information determination unit configured to, in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, present obstacle information describing the obstacle; and a third adjustment unit configured to, in response to detecting the target user's obstacle confirmation operation on the obstacle information, generate an obstacle elimination instruction corresponding to the confirmation operation to control the unmanned vehicle to adjust its driving state.
  • the vehicle driving environment information includes at least one of the following: vehicle location information, environmental image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
  • An embodiment of the present application provides a controller, which includes: one or more processors; and a memory on which one or more programs are stored. When the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method described in any implementation of the first aspect.
  • an embodiment of the present application provides an unmanned vehicle, which includes the controller as described in the third aspect.
  • an embodiment of the present application provides a computer-readable medium on which a computer program is stored, and when the program is executed by a processor, the method as described in any implementation manner in the first aspect is implemented.
  • The control method and device for unmanned vehicles provided in the embodiments of the present application first acquire vehicle driving environment information, then determine the intended driving state of the unmanned vehicle according to that information, and finally, in response to detecting the target user's interactive operation on the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
  • The method and device of this embodiment can detect the target user's interactive operation on the intended driving state and control the unmanned vehicle to adjust its driving state based on that operation, which helps improve the flexibility of controlling the unmanned vehicle.
  • Fig. 1 is an exemplary system architecture diagram to which an embodiment of the present application can be applied;
  • Fig. 2 is a flowchart of an embodiment of a control method for an unmanned vehicle according to the present application
  • Fig. 3 is a schematic diagram of an application scenario of the control method for unmanned vehicles according to the present application.
  • Fig. 5 is a schematic structural diagram of an embodiment of a control device for an unmanned vehicle according to the present application.
  • Fig. 6 is a schematic structural diagram of a computer system suitable for implementing the controller of the embodiment of the present application.
  • FIG. 1 shows an exemplary system architecture 100 of a control method for an unmanned vehicle or a control device for an unmanned vehicle to which embodiments of the present application can be applied.
  • the system architecture 100 may include a control device 101, an unmanned vehicle 102 and a network 103.
  • the network 103 is used as a medium for providing a communication link between the control device 101 and the unmanned vehicle 102.
  • the network 103 may include various connection types, such as wired, wireless communication links, or fiber optic cables.
  • the control device 101 and the unmanned vehicle 102 can interact through the network 103 to receive or send messages and so on.
  • The control device 101 may be hardware or software. When it is hardware, it may be a processor with computing capability that controls the unmanned vehicle 102. Note that the control device 101 may be integrated in the unmanned vehicle 102 or may exist separately from it. When the control device 101 is software, it can be installed in the processors listed above and implemented as multiple pieces of software or software modules (for example, to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
  • the unmanned vehicle 102 can interact with the control device 101 through the network 103 to receive or send messages and so on.
  • Various communication client applications may be installed on the unmanned vehicle 102, such as instant messaging tools, email clients, etc.
  • the unmanned vehicle 102 may be an unmanned vehicle that can realize automatic driving.
  • The control method for unmanned vehicles provided by the embodiments of the present application is generally executed by the control device 101; correspondingly, the control device for unmanned vehicles is generally provided in the control device 101.
  • The numbers of control devices, unmanned vehicles, and networks in FIG. 1 are merely illustrative; there may be any number of control devices, unmanned vehicles, and networks as the implementation requires.
  • the control method for unmanned vehicles includes the following steps:
  • Step 201 Acquire vehicle driving environment information.
  • The execution subject of the control method for the unmanned vehicle can obtain, through a wired or wireless connection, the vehicle driving environment information collected by the information collection device of the unmanned vehicle. The information collection device may be any of various devices that collect information; as an example, it may be a camera, a speed sensor, a position sensor, and the like.
  • the vehicle driving environment information can be various information used to describe the current environment of the unmanned vehicle.
  • the aforementioned vehicle driving environment information may include, but is not limited to, at least one of the following: vehicle location information, environmental image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
  • the vehicle location information can be various information used to describe the current location of the unmanned vehicle.
  • the foregoing vehicle location information may be GPS (Global Positioning System) coordinate values of the current location of the unmanned vehicle.
  • the foregoing environmental image information may be various image information used to describe the environment in which the unmanned vehicle is currently located.
  • the foregoing environmental image information may be image information used to describe a certain designated device in the unmanned vehicle, or image information used to describe the road currently being driven.
  • the above-mentioned in-vehicle device switch information may be various information used to describe the switch state of the in-vehicle device.
  • the in-vehicle equipment includes, but is not limited to, car lights, car air conditioners, car audio, car windows, etc.
  • As an example, the above-mentioned in-vehicle device switch information may be the character string "light-1", used to indicate that the vehicle's lights are on.
  • the energy consumption information of the in-vehicle equipment may be various information used to describe the energy consumption of the in-vehicle equipment.
  • As an example, the foregoing energy consumption information of in-vehicle equipment may be the character string "air conditioner-10%", used to indicate that the vehicle air conditioner accounts for 10% of total energy consumption.
  • The above-mentioned unmanned vehicle may be any of various unmanned cars, or another kind of unmanned vehicle such as an airplane or a ship.
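The "at least one of" structure of the vehicle driving environment information described above can be sketched as a record with optional fields. This is an illustrative sketch, not the patent's implementation; all field names and example values are assumptions:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class VehicleDrivingEnvironmentInfo:
    # GPS coordinate value (latitude, longitude) of the current location, if available.
    vehicle_location: Optional[Tuple[float, float]] = None
    # Identifier of the latest environment image frame, if available.
    environment_image: Optional[str] = None
    # In-vehicle device switch information, e.g. {"light": 1} for the string "light-1".
    device_switches: Dict[str, int] = field(default_factory=dict)
    # In-vehicle device energy consumption shares, e.g. 0.10 for "air conditioner-10%".
    device_energy: Dict[str, float] = field(default_factory=dict)

info = VehicleDrivingEnvironmentInfo(
    vehicle_location=(39.9042, 116.4074),
    device_switches={"light": 1},
    device_energy={"air conditioner": 0.10},
)
```

Leaving every field optional mirrors the patent's "may include, but is not limited to, at least one of the following" wording.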
  • Step 202 Determine the intended driving state of the unmanned vehicle based on the vehicle driving environment information.
  • The aforementioned intended driving state includes at least one of the following: advancing at a constant speed, accelerating forward, decelerating forward, stopping, whistling, turning on a vehicle light, and so on.
  • the execution subject can analyze the acquired vehicle driving environment information to determine the intended driving state of the unmanned vehicle, so that the unmanned vehicle can drive according to the determined intended driving state.
  • As an example, when the distance measurement information obtained from the ranging sensor indicates an obstacle ahead, the intended driving state of the unmanned vehicle may be set to: slow down and turn right.
  • As another example, the execution subject obtains exterior brightness information from the brightness detector, and when the exterior brightness information indicates that the exterior brightness is lower than a set brightness threshold, the intended driving state of the unmanned vehicle is set to: turn on the exterior lights.
  • the vehicle driving environment information can usually include multiple pieces of information at the same time.
  • the vehicle driving environment information includes both the distance measurement information obtained from the ranging sensor and the outside brightness information obtained from the brightness detector.
  • In this case, the execution subject can analyze both pieces of information in the vehicle driving environment information at the same time, and the intended driving state of the unmanned vehicle can be: slow down and turn right, and turn on the exterior lights.
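The rule-based determination in the examples above (ranging information implying "slow down and turn right", low exterior brightness implying "turn on the exterior lights") can be sketched as independent rules over optional inputs. The threshold values, parameter names, and default state below are illustrative assumptions, not values from the patent:

```python
def determine_intended_state(distance_m=None, brightness=None,
                             distance_threshold=10.0, brightness_threshold=50.0):
    # Each input is optional: the environment information contains "at least one of" them,
    # and the rules are evaluated independently and combined.
    state = []
    if distance_m is not None and distance_m < distance_threshold:
        state.append("slow down and turn right")
    if brightness is not None and brightness < brightness_threshold:
        state.append("turn on the exterior lights")
    # Default when no rule fires (an assumption; the patent does not specify one).
    return state or ["advance at a constant speed"]
```

With both inputs present and both rules firing, the combined intended state matches the two-part example in the text.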
  • Step 203 In response to detecting the target user's interactive operation for the driving state to be adopted, a driving state adjustment instruction corresponding to the interactive operation is generated to control the unmanned vehicle to adjust the driving state.
  • the target user may be a user who has control authority over the unmanned vehicle.
  • the execution subject may present the intended driving state determined in step 202 to the target user through the terminal display screen.
  • the description information of the driving state to be adopted can be displayed on the terminal screen.
  • the description information may be "accelerate forward".
  • the target user can perform interactive operations by manually selecting the description information of the driving state to be adopted.
  • the target user can select the description information of the driving state to be adopted through touch or button mode to perform interactive operations.
  • the execution subject can detect the aforementioned interactive operations through changes in touch information or changes in key information.
  • The specific interactive operation process may be as follows: the target user selects the description information of the determined intended driving state, and then selects, from the description information of a plurality of candidate driving states associated with that state, the description information of the desired candidate driving state. As an example, the target user can click on the description information of the determined intended driving state and then select "advance at a constant speed" from the associated candidates "accelerate forward", "decelerate forward", and "advance at a constant speed".
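The candidate-selection interaction described above can be sketched as presenting, for the determined intended state, its associated candidate states and accepting one of them. The candidate table and the `choose` callback (standing in for the target user's touch or button selection) are assumptions for illustration:

```python
# Hypothetical association of an intended state with its candidate driving states,
# mirroring the "accelerate forward" example in the text.
CANDIDATES = {
    "accelerate forward": ["accelerate forward", "decelerate forward",
                           "advance at a constant speed"],
}

def interact(intended_state, choose):
    # Present the candidates associated with the intended state; `choose` models
    # the target user's selection via touch or button.
    options = CANDIDATES.get(intended_state, [intended_state])
    selected = choose(options)
    if selected not in options:
        raise ValueError("selection must be one of the presented candidates")
    # The resulting driving state adjustment instruction (format assumed).
    return {"adjust_to": selected}
```

A state with no associated candidates simply offers itself, so confirming the determined state is always possible.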
  • the execution subject may also broadcast the intended driving state determined in step 202 to the target user through a voice playback device.
  • the voice playback device can play the description voice of the driving state to be adopted.
  • the description voice may be "accelerate forward".
  • the target user can perform interactive operations by selecting the description information of the driving state to be adopted by voice.
  • the target user can perform interactive operations by emitting a control voice corresponding to the voice described above.
  • the control voice includes a description voice.
  • the description voice is "accelerate forward”
  • the control voice can be "accelerate forward, please drive slowly.” In this way, the execution subject can detect the above-mentioned interactive operations through voice changes.
  • the execution subject may also present the intended driving state determined in step 202 to the target user through the terminal display screen, and broadcast the intended driving state determined in step 202 to the target user through a voice playback device.
  • The target user can perform the interactive operation using either or both of the two methods: manually selecting the description information of the intended driving state, or selecting it by voice.
  • After detecting the target user's interactive operation on the intended driving state, the execution subject can generate a driving state adjustment instruction corresponding to the interactive operation and send it to the corresponding execution device of the unmanned vehicle, thereby controlling the unmanned vehicle to adjust its driving state.
  • the above-mentioned execution device may be a motor, a window controller, and the like.
  • The aforementioned driving state adjustment instruction corresponding to the interactive operation is usually an instruction for adjusting the determined intended driving state to the driving state desired by the target user.
  • As an example, the aforementioned driving state adjustment instruction corresponding to the interactive operation may be an instruction for adjusting "open the window" to "close the window".
  • Then the execution subject sends the driving state adjustment instruction to the window controller of the unmanned vehicle.
  • The aforementioned driving state adjustment instruction corresponding to the interactive operation may also be an instruction for adjusting "accelerate forward" to "advance at a constant speed".
  • Then the execution subject sends the driving state adjustment instruction to the motor of the unmanned vehicle.
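Dispatching a driving state adjustment instruction to an execution device (motor, window controller) as described above can be sketched as routing by device name. The device classes and instruction format here are illustrative stand-ins, not interfaces from the patent:

```python
class Motor:
    """Stand-in for the motor execution device."""
    def __init__(self):
        self.mode = "accelerate forward"
    def apply(self, mode):
        self.mode = mode

class WindowController:
    """Stand-in for the window-controller execution device."""
    def __init__(self):
        self.window_open = True
    def apply(self, mode):
        self.window_open = (mode == "open the window")

def dispatch(instruction, devices):
    # Route the adjustment instruction to the execution device it names.
    devices[instruction["device"]].apply(instruction["mode"])

devices = {"motor": Motor(), "window": WindowController()}
# The two examples from the text: "accelerate forward" -> "advance at a constant
# speed" goes to the motor; "open the window" -> "close the window" goes to the
# window controller.
dispatch({"device": "motor", "mode": "advance at a constant speed"}, devices)
dispatch({"device": "window", "mode": "close the window"}, devices)
```

Keyed routing keeps the instruction generator independent of which execution devices the vehicle actually has.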
  • The control method for the unmanned vehicle may further include the following step: in response to receiving environment perception information sent by an external device communicatively connected with the unmanned vehicle, controlling the unmanned vehicle to adjust its driving state according to the environment perception information.
  • the above-mentioned external devices are usually various devices that are communicatively connected with the unmanned vehicle and can send out environmental perception information.
  • the aforementioned external device may be a vehicle, a traffic light, or the like.
  • the aforementioned environmental perception information may be various information used to describe the environment in which the external device is located.
  • the aforementioned environmental perception information may include, but is not limited to, device location information and device environment image information.
  • the above-mentioned device location information may be various information used to describe the current location of the external device.
  • the aforementioned device location information may be the GPS coordinate value of the current location of the external device.
  • the foregoing device environment image information may be various image information used to describe the environment in which the external device is currently located.
  • the execution subject can analyze the received environmental perception information to generate corresponding control instructions to control the unmanned vehicle to perform state adjustment.
  • As an example, the execution subject may first calculate the relative distance between A and B (here A may be taken to be the unmanned vehicle and B the external vehicle). If the relative distance is greater than the first distance threshold, a control instruction for controlling the unmanned vehicle to adjust its driving state to accelerate forward may be generated. If the relative distance is less than the first distance threshold and greater than the second distance threshold, a control instruction for controlling the unmanned vehicle to adjust its driving state to advance at a constant speed may be generated.
  • If the relative distance is less than the second distance threshold, a control instruction for controlling the unmanned vehicle to adjust its driving state to decelerate forward may be generated.
  • the above-mentioned first distance threshold and second distance threshold may be data values preset by technicians, which are not limited here. It should be pointed out that the way that the execution subject controls the unmanned vehicle to adjust the driving state according to the control command is basically the same as the way of controlling the unmanned vehicle to adjust the driving state according to the driving state adjustment command in step 203, and will not be repeated here.
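The two-threshold logic above can be sketched directly; the threshold values and the use of planar coordinates for the two positions are illustrative assumptions:

```python
import math

def control_from_perception(ego_pos, other_pos, d1=50.0, d2=20.0):
    # d1 is the first distance threshold, d2 the second, with d1 > d2
    # (both preset by technicians; the values here are placeholders).
    distance = math.dist(ego_pos, other_pos)
    if distance > d1:
        return "accelerate forward"
    if distance > d2:
        return "advance at a constant speed"
    return "decelerate forward"
```

The three return values correspond one-to-one to the three cases in the text: far apart, in between, and close.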
  • The control method for the unmanned vehicle may further include the following step: presenting at least one of the following on a preset terminal display screen: the vehicle driving environment information, the intended driving state, and the environment perception information.
  • the aforementioned preset terminal display screens may be various terminal display screens pre-installed in the unmanned vehicle by the technicians.
  • the aforementioned preset terminal display screen may be a touch-sensitive tablet computer.
  • the use of a preset terminal display screen to present information can further improve the flexibility of the target user's interaction with the unmanned vehicle.
  • control method for unmanned vehicles may further include the following steps:
  • in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, obstacle information describing the obstacle is presented.
  • the obstacle information may be various information used to describe the obstacle.
  • the execution subject can determine whether there are obstacles on the road that is traveling by acquiring and analyzing the images collected by the camera.
  • the execution subject can also determine whether there are obstacles on the road by acquiring and analyzing the distance information collected by the ranging sensor.
  • when the execution subject determines that there is an obstacle on the road being traveled, the obstacle information describing the obstacle may be presented.
  • the execution subject may present the above-mentioned obstacle information in display form through a terminal display screen, or in voice form through a voice playback device.
  • in response to detecting the target user's obstacle confirmation operation on the obstacle information, an obstacle elimination instruction corresponding to the obstacle confirmation operation is generated, and the unmanned vehicle is controlled to adjust the driving state.
  • the execution subject can present the obstacle information to the target user through the terminal display screen.
  • the target user can perform obstacle confirmation operations by manually selecting obstacle information.
  • the target user can select obstacle information through touch or key buttons to perform obstacle confirmation operations.
  • the execution subject can detect the obstacle confirmation operation described above through a change in touch information or a change in key information.
  • the obstacle confirmation operation may be a confirmation operation of the above-mentioned obstacle information by the target user.
  • the target user can click on the obstacle information and then select "This obstacle is invalid" from the options "This obstacle is valid" and "This obstacle is invalid" associated with the obstacle information.
  • "This obstacle is invalid" is used to indicate that the obstacle determined by the execution subject is wrong.
  • "This obstacle is valid" is used to indicate that the obstacle determined by the execution subject is correct.
  • if the target user selects "This obstacle is invalid", an obstacle elimination instruction is generated for adjusting the intended driving state planned for the obstacle to the obstacle-free driving state.
  • if the target user selects "This obstacle is valid", an obstacle elimination instruction is generated for continuing to execute the intended driving state planned for the obstacle. It should be pointed out that detecting obstacle confirmation operations helps the unmanned vehicle drive better, and can further improve the flexibility of the target user's interaction with the unmanned vehicle.
  • the execution subject can also broadcast the obstacle information to the target user through a voice playback device.
  • the voice playback device can play obstacle information.
  • the target user can perform obstacle confirmation operations by selecting obstacle information by voice.
  • the target user may perform an obstacle confirmation operation by emitting an obstacle removal control voice corresponding to the above obstacle information.
  • the obstacle removal control voice includes the obstacle information. As an example, if the obstacle information is "There is a large obstacle 2 meters directly ahead", the obstacle removal control voice can be "There is a large obstacle 2 meters directly ahead, please ignore it". In this way, the execution subject can detect the aforementioned obstacle confirmation operation through a voice change.
  • the execution subject can also both present the obstacle information to the target user through the terminal display screen and broadcast it to the target user through a voice playback device.
  • the target user can then choose either or both of the two methods, manually selecting obstacle information or selecting it by voice, to perform the obstacle confirmation operation.
  • after detecting the target user's obstacle confirmation operation on the obstacle information, the execution subject can generate an obstacle elimination instruction corresponding to the operation and send it to the corresponding execution equipment of the unmanned vehicle, thereby controlling the unmanned vehicle to adjust its driving state.
  • the way the execution subject controls the unmanned vehicle to adjust its driving state according to the obstacle elimination instruction is basically the same as the way of controlling the unmanned vehicle to adjust its driving state according to the driving state adjustment instruction in step 203, and will not be repeated here.
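The obstacle confirmation flow described above (touch selection of "valid"/"invalid", or a removal control voice that contains the obstacle information) can be sketched as follows. The function names, state labels, and instruction format are hypothetical illustrations, not part of the application.

```python
def handle_obstacle_confirmation(planned_state, confirmation):
    """Generate an obstacle elimination instruction from the target user's
    confirmation. "invalid" means the user dismisses the detection as wrong;
    "valid" means the user upholds it. State names are assumptions.
    """
    if confirmation == "invalid":
        # Detection was wrong: resume the obstacle-free driving state.
        return {"instruction": "eliminate_obstacle", "state": "cruise"}
    # Detection was right: keep the driving state planned for the obstacle.
    return {"instruction": "eliminate_obstacle", "state": planned_state}

def voice_confirms_obstacle(obstacle_info, utterance):
    """Per the description, the removal control voice must contain the
    obstacle information; a simple containment check models that."""
    return obstacle_info in utterance
```

The resulting instruction would then be forwarded to the execution equipment of the vehicle, as in step 203.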
  • Fig. 3 is a schematic diagram of an application scenario of the control method for an unmanned vehicle according to this embodiment.
  • in the application scenario, the control device 302 first obtains the vehicle driving environment information collected by the information collection device of the unmanned vehicle 301 (for example, the brightness information outside the vehicle). Then, it determines the unmanned vehicle's intended driving state according to the vehicle driving environment information: the control device 302 may compare the brightness indicated by the vehicle-exterior brightness information with a preset brightness threshold, and if the brightness is less than the preset brightness threshold, it determines that the intended driving state is: turn on the exterior lights.
  • in response to detecting the target user 303 clicking "Turn on exterior lights", a driving state adjustment instruction corresponding to the interactive operation is generated and sent to the exterior-light power supply.
  • if the target user 303 finds that the determined driving state of turning on the exterior lights does not match the actual environment, or the target user 303 does not want to turn on the exterior lights at the moment, the user can click "Turn on exterior lights" and then select "Do not turn on exterior lights" displayed in association with "Turn on exterior lights".
  • the control device can generate a driving state adjustment instruction based on the interactive operation of the target user 303, and control the power supply of the exterior lights of the unmanned vehicle to be turned off.
  • the control method for unmanned vehicles provided by the foregoing embodiments of the present application can obtain vehicle driving environment information, then determine the unmanned vehicle's intended driving state according to the vehicle driving environment information, and finally, in response to detecting the target user's interactive operation for the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust the driving state.
  • the method of this embodiment can detect the interactive operation of the target user in the driving state, thereby controlling the unmanned vehicle to adjust the driving state based on the interactive operation, which helps to improve the flexible control of the unmanned vehicle.
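The acquire/determine/adjust loop summarized above can be sketched as follows; the rule set, field names, and brightness threshold in `determine_intended_state` are illustrative assumptions standing in for the analysis the application describes.

```python
def determine_intended_state(env_info):
    """Step-202 analog: map driving-environment information to intended
    driving states. The rules and the 0.3 threshold are hypothetical."""
    states = []
    if env_info.get("obstacle_distance_m", float("inf")) <= 10:
        states.append("decelerate_turn_right")
    if env_info.get("outside_brightness", 1.0) < 0.3:
        states.append("turn_on_exterior_lights")
    return states or ["constant_speed"]

def control_step(env_info, user_override=None):
    """Steps 201-203 analog: determine the intended state, then let a
    detected user interaction override it with an adjustment instruction."""
    intended = determine_intended_state(env_info)
    if user_override is not None:
        return {"adjust_to": user_override}
    return {"adjust_to": intended}
```

When the environment information carries several pieces of information at once, the rules simply produce several states, matching the "decelerate and turn right, and turn on the exterior lights" example in the description.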
  • FIG. 4 shows a flow 400 of another embodiment of a control method for an unmanned vehicle.
  • the process 400 of the control method for unmanned vehicles includes the following steps:
  • Step 401 Receive a vehicle driving route information set sent by a cloud server connected in communication.
  • the execution subject may receive the vehicle driving route information set from the cloud server through a wired or wireless connection.
  • the vehicle driving route information includes road information. It should be pointed out that by directly receiving the vehicle driving route information set from the cloud server to determine the driving route, the execution subject can save the time spent determining a route, which helps improve the control efficiency of the unmanned vehicle.
  • Step 402 Select vehicle travel route information from the vehicle travel route information set to control the unmanned vehicle to travel along the road indicated by the selected vehicle travel route information.
  • the execution subject may randomly select one vehicle travel route information from the vehicle travel route information set, or select the vehicle travel route information corresponding to the shortest route from the vehicle travel route information set.
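The two selection strategies mentioned above (random choice, or the route corresponding to the shortest length) can be sketched as follows; the `length_km` field name is an assumption, since the application only says that route information includes road information.

```python
import random

def select_route(route_infos, strategy="shortest"):
    """Pick one vehicle driving route record from the set received from
    the cloud server. Each record is assumed to carry a 'length_km'
    field (an illustrative field name).
    """
    if not route_infos:
        raise ValueError("empty route information set")
    if strategy == "shortest":
        return min(route_infos, key=lambda r: r["length_km"])
    # The application also allows a random choice from the set.
    return random.choice(route_infos)
```

The vehicle would then be controlled to travel along the road indicated by the selected record.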
  • Step 403 Acquire vehicle driving environment information.
  • Step 404 Determine the intended driving state of the unmanned vehicle based on the vehicle driving environment information.
  • Step 405 In response to detecting the target user's interactive operation for the intended driving state, a driving state adjustment instruction corresponding to the interactive operation is generated to control the unmanned vehicle to adjust the driving state.
  • steps 403-405 are basically the same as the operations of steps 201-203 in the embodiment shown in FIG. 2, and will not be repeated here.
  • compared with the embodiment corresponding to Fig. 2, the process 400 of the control method for unmanned vehicles in this embodiment embodies the step of receiving the vehicle driving route information set sent by the cloud server and the step of selecting vehicle driving route information from that set. Therefore, the solution described in this embodiment can save the time the execution subject spends determining a route and helps improve the control efficiency of the unmanned vehicle.
  • this application provides an embodiment of a control device for an unmanned vehicle.
  • the device embodiment corresponds to the method embodiment shown in FIG. 2.
  • the device can be applied to various electronic equipment.
  • the control device 500 for an unmanned vehicle in this embodiment includes: an information acquiring unit 501 configured to acquire vehicle driving environment information; a state determining unit 502 configured to determine the unmanned vehicle's intended driving state according to the vehicle driving environment information; and a first adjustment unit 503 configured to, in response to detecting the target user's interactive operation for the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
  • the device may further include a second adjustment unit (not shown in the figure).
  • the second adjustment unit may be configured to control the unmanned vehicle to adjust the driving state according to the environmental perception information in response to receiving the environmental perception information sent by the external device communicatively connected with the unmanned vehicle.
  • the device may further include an information display unit (not shown in the figure).
  • the information display unit may be configured to present at least one of the following on a preset terminal display screen: vehicle driving environment information, the intended driving state, and environment perception information.
  • the device may further include a route receiving unit and a route selection unit (not shown in the figure).
  • the route receiving unit may be configured to receive the vehicle travel route information set sent by the cloud server connected in communication.
  • the route selection unit may be configured to select vehicle travel route information from the vehicle travel route information set to control the unmanned vehicle to travel along the road indicated by the selected vehicle travel route information.
  • the vehicle driving environment information may include at least one of the following: vehicle location information, environmental image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
  • the device may further include an information determination unit and a third adjustment unit (not shown in the figure).
  • the information determining unit may be configured to present obstacle information describing the obstacle in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling.
  • the third adjustment unit may be configured to generate an obstacle elimination instruction corresponding to the obstacle confirmation operation in response to detecting the obstacle confirmation operation of the target user with respect to the obstacle information to control the unmanned vehicle to adjust the driving state.
  • the information acquiring unit 501 acquires vehicle driving environment information. Then, the state determination unit 502 determines the intended driving state of the unmanned vehicle based on the vehicle driving environment information. Finally, in response to detecting the target user's interactive operation for the intended driving state, the first adjustment unit 503 generates a driving state adjustment instruction corresponding to the interactive operation and controls the unmanned vehicle to adjust the driving state.
  • the device of this embodiment can detect the interactive operation of the target user in the driving state, thereby controlling the unmanned vehicle to adjust the driving state based on the interactive operation, which helps to improve the flexible control of the unmanned vehicle.
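The unit composition described above can be sketched as a simple class; the callables standing in for units 501-503 and the instruction format are hypothetical stand-ins, not the application's implementation.

```python
class UnmannedVehicleControlDevice:
    """Sketch of the Fig. 5 device: an information acquiring unit, a state
    determining unit, and a first adjustment unit wired together."""

    def __init__(self, acquire_info, determine_state, detect_interaction):
        self.acquire_info = acquire_info            # information acquiring unit 501
        self.determine_state = determine_state      # state determining unit 502
        self.detect_interaction = detect_interaction  # feeds adjustment unit 503

    def run_once(self):
        """One pass of acquire -> determine -> (optionally) adjust."""
        env_info = self.acquire_info()
        intended = self.determine_state(env_info)
        interaction = self.detect_interaction(intended)
        if interaction is not None:                 # first adjustment unit 503
            return {"adjustment_instruction": interaction}
        return {"adjustment_instruction": intended}
```

The optional units (second/third adjustment, display, route receiving and selection) would hang off the same object in the same callable style.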
  • FIG. 6 shows a schematic structural diagram of a computer system 600 suitable for implementing the controller of the embodiment of the present application.
  • the controller shown in FIG. 6 is only an example, and should not bring any limitation to the function and scope of use of the embodiments of the present application.
  • the computer system 600 may include a central processing unit (CPU) 601, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 602 or a program loaded from the storage part 606 into a random access memory (RAM) 603.
  • the RAM 603 also stores various programs and data required for the operation of the system 600.
  • the CPU 601, ROM 602, and RAM 603 are connected to each other through a bus 604.
  • An input/output (I/O) interface 605 is also connected to the bus 604.
  • the following components are connected to the I/O interface 605: a storage section 606 including a hard disk and the like; and a communication section 607 including a network interface card such as a LAN card, a modem, and the like.
  • the communication section 607 performs communication processing via a network such as the Internet.
  • the above-mentioned controller can exist alone or can be installed in an unmanned vehicle.
  • the above functions can also be integrated into the processor of the control system of the unmanned vehicle.
  • input parts such as cameras, sensors, and radars, output parts such as liquid crystal displays (LCD) and speakers, and motor drivers may also be connected to the aforementioned I/O interface 605 as needed.
  • the motor driver can drive the mobile device to complete the movement of the unmanned vehicle according to the control information sent by the CPU.
  • the drive is also connected to the I/O interface 605 as needed.
  • removable media such as magnetic disks, optical disks, magneto-optical disks, and semiconductor memories can be mounted on the drive as needed, so that a computer program read from them can be installed into the storage part 606.
  • when the central processing unit (CPU) 601 calls the above-mentioned computer program to execute the function of controlling the unmanned vehicle, it can control the input part to obtain the vehicle driving environment information of the unmanned vehicle from the outside.
  • the process described above with reference to the flowchart can be implemented as a computer software program.
  • the embodiments of the present application include a computer program product, which includes a computer program carried on a computer-readable medium, and the computer program includes program code for executing the method shown in the flowchart.
  • the computer program can be downloaded and installed from the network through the communication part 607.
  • when the computer program is executed by the central processing unit (CPU) 601, the above-mentioned functions defined in the method of the present application are executed.
  • the computer-readable medium of the present application may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • the computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, and a computer-readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
  • such a computer-readable medium may send, propagate, or transmit the program for use by or in combination with the instruction execution system, apparatus, or device.
  • the program code contained on the computer-readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, optical cable, RF, etc., or any suitable combination of the foregoing.
  • each block in the flowchart or block diagram may represent a module, program segment, or part of code that contains one or more executable instructions for implementing specified logic functions.
  • the functions marked in the block may also occur in a different order from the order marked in the drawings. For example, two blocks shown in succession can actually be executed substantially in parallel, or they can sometimes be executed in the reverse order, depending on the functions involved.
  • each block in the block diagrams and/or flowcharts, and combinations of blocks therein, can be implemented with a dedicated hardware-based system that performs the specified functions or operations, or with a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments described in the present application can be implemented in software or hardware.
  • the described unit may also be provided in the processor.
  • for example, a processor may be described as including an information acquisition unit, a state determination unit, and a first adjustment unit.
  • the names of these units do not constitute a limitation on the unit itself under certain circumstances.
  • the information acquisition unit can also be described as a "unit for acquiring vehicle driving environment information.”
  • the present application also provides a computer-readable medium.
  • the computer-readable medium may be included in the device described in the foregoing embodiment; or it may exist alone without being assembled into the device.
  • the above-mentioned computer-readable medium carries one or more programs.
  • when the one or more programs are executed by the device, the device: obtains the vehicle driving environment information; determines the unmanned vehicle's intended driving state according to the vehicle driving environment information; and, in response to detecting the target user's interactive operation for the intended driving state, generates a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust the driving state.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Transportation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A control method and device for an unmanned vehicle. The method includes: acquiring vehicle driving environment information (201); determining, according to the vehicle driving environment information, an intended driving state of the unmanned vehicle (202); and in response to detecting a target user's interactive operation for the intended driving state, generating a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state (203). The method improves the flexibility of controlling the unmanned vehicle.

Description

Control Method and Device for Unmanned Vehicle
This patent application claims priority to Chinese patent application No. 201910037537.8, filed on January 15, 2019 by the applicant Beijing Baidu Netcom Science and Technology Co., Ltd. and entitled "Control Method and Device for Unmanned Vehicle", the entire content of which is incorporated herein by reference.
Technical Field
Embodiments of the present application relate to the field of computer technology, and in particular to a control method and device for an unmanned vehicle.
Background
An unmanned vehicle is an intelligent vehicle that perceives the road environment through an on-board sensing system, automatically plans a driving route, and controls the vehicle to reach a predetermined destination.
In the related art, there is a need for interaction between users and unmanned vehicles.
Summary
Embodiments of the present application propose a control method and device for an unmanned vehicle.
In a first aspect, an embodiment of the present application provides a control method for an unmanned vehicle, the method including: acquiring vehicle driving environment information; determining, according to the vehicle driving environment information, an intended driving state of the unmanned vehicle; and in response to detecting a target user's interactive operation for the intended driving state, generating a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
In some embodiments, the method further includes: in response to receiving environmental perception information sent by an external device communicatively connected with the unmanned vehicle, controlling the unmanned vehicle to adjust its driving state according to the environmental perception information.
In some embodiments, the method further includes: presenting at least one of the following on a preset terminal display screen: the vehicle driving environment information, the intended driving state, and the environmental perception information.
In some embodiments, the method further includes: receiving a vehicle driving route information set sent by a communicatively connected cloud server; and selecting vehicle driving route information from the vehicle driving route information set to control the unmanned vehicle to travel along the road indicated by the selected vehicle driving route information.
In some embodiments, the method further includes: in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, presenting obstacle information describing the obstacle; and in response to detecting the target user's obstacle confirmation operation on the obstacle information, generating an obstacle elimination instruction corresponding to the obstacle confirmation operation to control the unmanned vehicle to adjust its driving state.
In some embodiments, the vehicle driving environment information includes at least one of the following: vehicle location information, environmental image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
In a second aspect, an embodiment of the present application provides a control device for an unmanned vehicle, the device including: an information acquiring unit configured to acquire vehicle driving environment information; a state determining unit configured to determine, according to the vehicle driving environment information, an intended driving state of the unmanned vehicle; and a first adjustment unit configured to, in response to detecting a target user's interactive operation for the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
In some embodiments, the device further includes: a second adjustment unit configured to, in response to receiving environmental perception information sent by an external device communicatively connected with the unmanned vehicle, control the unmanned vehicle to adjust its driving state according to the environmental perception information.
In some embodiments, the device further includes: an information display unit configured to present at least one of the following on a preset terminal display screen: the vehicle driving environment information, the intended driving state, and the environmental perception information.
In some embodiments, the device further includes: a route receiving unit configured to receive a vehicle driving route information set sent by a communicatively connected cloud server; and a route selection unit configured to select vehicle driving route information from the vehicle driving route information set to control the unmanned vehicle to travel along the road indicated by the selected vehicle driving route information.
In some embodiments, the device further includes: an information determining unit configured to, in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, present obstacle information describing the obstacle; and a third adjustment unit configured to, in response to detecting the target user's obstacle confirmation operation on the obstacle information, generate an obstacle elimination instruction corresponding to the obstacle confirmation operation to control the unmanned vehicle to adjust its driving state.
In some embodiments, the vehicle driving environment information includes at least one of the following: vehicle location information, environmental image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
In a third aspect, an embodiment of the present application provides a controller, including: one or more processors; and a memory storing one or more programs, where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
In a fourth aspect, an embodiment of the present application provides an unmanned vehicle including the controller described in the third aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable medium on which a computer program is stored, where the program, when executed by a processor, implements the method described in any implementation of the first aspect.
The control method and device for an unmanned vehicle provided by the embodiments of the present application can acquire vehicle driving environment information, then determine the intended driving state of the unmanned vehicle according to the vehicle driving environment information, and finally, in response to detecting a target user's interactive operation for the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state. The method and device of these embodiments can detect the target user's interactive operation for the driving state and thereby control the unmanned vehicle to adjust its driving state based on the interactive operation, which helps improve the flexible control of the unmanned vehicle.
Brief Description of the Drawings
Other features, objects, and advantages of the present application will become more apparent upon reading the detailed description of non-limiting embodiments made with reference to the following drawings:
Fig. 1 is an exemplary system architecture diagram to which an embodiment of the present application may be applied;
Fig. 2 is a flowchart of an embodiment of a control method for an unmanned vehicle according to the present application;
Fig. 3 is a schematic diagram of an application scenario of the control method for an unmanned vehicle according to the present application;
Fig. 4 is a flowchart of another embodiment of the control method for an unmanned vehicle according to the present application;
Fig. 5 is a schematic structural diagram of an embodiment of a control device for an unmanned vehicle according to the present application;
Fig. 6 is a schematic structural diagram of a computer system suitable for implementing the controller of the embodiments of the present application.
Detailed Description
The present application will be further described in detail below with reference to the drawings and embodiments. It can be understood that the specific embodiments described here are only used to explain the relevant invention, rather than to limit it. It should also be noted that, for ease of description, only the parts related to the relevant invention are shown in the drawings.
It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the drawings and in combination with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which the control method for an unmanned vehicle or the control device for an unmanned vehicle of the embodiments of the present application may be applied.
As shown in Fig. 1, the system architecture 100 may include a control device 101, an unmanned vehicle 102, and a network 103. The network 103 serves as a medium for providing a communication link between the control device 101 and the unmanned vehicle 102, and may include various connection types, such as wired or wireless communication links, or fiber-optic cables.
The control device 101 and the unmanned vehicle 102 may interact through the network 103 to receive or send messages. The control device 101 may be hardware or software. When the control device is hardware, it may be a processor with computing capability, and the processor can control the unmanned vehicle 102. It should be noted that the control device 101 may be integrated in the unmanned vehicle 102 or may exist separately from it. When the control device 101 is software, it may be installed in the processor listed above, and may be implemented as multiple pieces of software or software modules (for example, for providing distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
The unmanned vehicle 102 may interact with the control device 101 through the network 103 to receive or send messages. Various communication client applications, such as instant messaging tools and email clients, may be installed on the unmanned vehicle 102. The unmanned vehicle 102 may be an unmanned vehicle capable of autonomous driving.
It should be noted that the control method for an unmanned vehicle provided by the embodiments of the present application is generally executed by the control device 101; accordingly, the control device for an unmanned vehicle is generally provided in the control device 101.
It should be understood that the numbers of control devices, unmanned vehicles, and networks in Fig. 1 are merely illustrative. There may be any number of control devices, unmanned vehicles, and networks according to implementation needs.
With continued reference to Fig. 2, a flow 200 of an embodiment of a control method for an unmanned vehicle according to the present application is shown. The control method for an unmanned vehicle includes the following steps:
Step 201: Acquire vehicle driving environment information.
In this embodiment, the execution subject of the control method for an unmanned vehicle (for example, the control device 101 shown in Fig. 1) may acquire, through a wired or wireless connection, the vehicle driving environment information collected by the information collection device of the unmanned vehicle. The information collection device may be any of various devices that collect information, such as a camera, a speed sensor, or a position sensor. The vehicle driving environment information may be various information used to describe the environment in which the unmanned vehicle is currently located. Optionally, the vehicle driving environment information may include, but is not limited to, at least one of the following: vehicle location information, environmental image information, in-vehicle device switch information, and in-vehicle device energy consumption information. The vehicle location information may be various information used to describe the current location of the unmanned vehicle, for example, the GPS (Global Positioning System) coordinate value of the current location. The environmental image information may be various image information used to describe the environment in which the unmanned vehicle is currently located, for example, image information describing a specified device inside the vehicle or describing the road currently being traveled. The in-vehicle device switch information may be various information used to describe the switch states of in-vehicle devices, which include but are not limited to lights, the on-board air conditioner, the on-board sound system, and windows; as an example, the character group "light-1" may describe that the lights are on. The in-vehicle device energy consumption information may be various information used to describe the energy consumption of in-vehicle devices; as an example, the character group "air conditioner-10%" may describe that the on-board air conditioner accounts for 10% of total energy consumption.
It should be pointed out that, in the embodiments of the present application, the above unmanned vehicle may be any of various unmanned vehicles, or various other means of transport, such as aircraft and ships.
Step 202: Determine the intended driving state of the unmanned vehicle according to the vehicle driving environment information.
Optionally, the intended driving state includes at least one of the following: advancing at a constant speed, accelerating, decelerating, stopping, honking, turning on the lights, and so on. In this embodiment, the execution subject may analyze the acquired vehicle driving environment information to determine the intended driving state of the unmanned vehicle, so that the unmanned vehicle can travel according to the determined intended driving state. As an example, if in step 201 the ranging information acquired by the execution subject from the ranging sensor indicates an obstacle 10 meters ahead, the intended driving state of the unmanned vehicle may be set to: decelerate and turn right. As another example, if in step 201 the vehicle-exterior brightness information acquired by the execution subject from the brightness detector indicates that the brightness outside the vehicle is lower than a set brightness threshold, the intended driving state may be set to: turn on the exterior lights. It should be pointed out that, since an unmanned vehicle usually uses multiple information collection devices, the vehicle driving environment information may contain multiple pieces of information at the same time. When it does, for example when it includes both the ranging information from the ranging sensor and the vehicle-exterior brightness information from the brightness detector, the execution subject may analyze the two pieces of information simultaneously, and the intended driving state of the unmanned vehicle may be: decelerate and turn right, and turn on the exterior lights.
Step 203: In response to detecting the target user's interactive operation for the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
The target user may be a user with control authority over the unmanned vehicle.
In this embodiment, the execution subject may present the intended driving state determined in step 202 to the target user through a terminal display screen, which may display description information of the intended driving state; as an example, the description information may be "accelerate". The target user can then perform the interactive operation by manually selecting the description information of the intended driving state, for example through touch or key buttons, and the execution subject can detect the interactive operation through a change in touch information or key information. Specifically, the interactive operation may proceed as follows: the target user selects the description information of the determined intended driving state, and then selects the description information of a desired candidate driving state from the description information of multiple candidate driving states associated with the determined intended driving state. As an example, the target user may click the description information of the determined intended driving state and then select "advance at a constant speed" from "accelerate", "decelerate", and "advance at a constant speed" associated with it.
In addition, the execution subject may also play the intended driving state determined in step 202 to the target user through a voice playback device, which may play description voice of the intended driving state; as an example, the description voice may be "accelerate". The target user can then perform the interactive operation by selecting the description information of the intended driving state by voice, for example by uttering a control voice corresponding to the description voice, where the control voice includes the description voice. As an example, if the description voice is "accelerate", the control voice may be "accelerate, please drive more slowly". In this way, the execution subject can detect the interactive operation through a voice change.
In addition, the execution subject may both present the intended driving state determined in step 202 to the target user through the terminal display screen and play it to the target user through the voice playback device. The target user can then choose either or both of the manual selection method and the voice selection method to perform the interactive operation.
In this embodiment, after detecting the target user's interactive operation for the intended driving state, the execution subject may generate a driving state adjustment instruction corresponding to the interactive operation and send the instruction to the corresponding execution equipment of the unmanned vehicle, thereby controlling the unmanned vehicle to adjust its driving state. The execution equipment may be a motor, a window controller, or the like. The driving state adjustment instruction corresponding to the interactive operation is usually an instruction for adjusting the determined driving state to the driving state desired by the target user. As an example, it may be an instruction for adjusting opening the window to closing the window, in which case the execution subject sends the instruction to the window controller of the unmanned vehicle; as another example, it may be an instruction for adjusting accelerating to advancing at a constant speed, in which case the execution subject sends the instruction to the motor of the unmanned vehicle.
In some optional implementations of this embodiment, the control method for an unmanned vehicle may further include the following step: in response to receiving environmental perception information sent by an external device communicatively connected with the unmanned vehicle, controlling the unmanned vehicle to adjust its driving state according to the environmental perception information.
The external device is generally any of various devices that are communicatively connected with the unmanned vehicle and can send environmental perception information, such as a vehicle or a traffic light. The environmental perception information may be various information used to describe the environment in which the external device is located, and may include, but is not limited to, device location information and device environment image information. The device location information may be various information used to describe the current location of the external device, for example, the GPS coordinate value of the current location of the external device. The device environment image information may be various image information used to describe the environment in which the external device is currently located.
Here, after receiving the environmental perception information sent by the external device, the execution subject may analyze the received environmental perception information and generate corresponding control instructions to control the unmanned vehicle to adjust its state. As an example, if the current location of the execution subject is A and the environmental perception information of an external device indicates that the device's current location is B, the execution subject may first calculate the relative distance between A and B. If the relative distance is greater than a first distance threshold, a control instruction for controlling the unmanned vehicle to adjust its driving state to accelerating may be generated; if the relative distance is less than the first distance threshold and greater than a second distance threshold, a control instruction for adjusting the driving state to advancing at a constant speed may be generated; if the relative distance is less than the second distance threshold, a control instruction for adjusting the driving state to decelerating may be generated. The first distance threshold and the second distance threshold may be data values preset by technicians and are not limited here. It should be pointed out that the way the execution subject controls the unmanned vehicle to adjust its driving state according to the control instruction is basically the same as the way of controlling the unmanned vehicle to adjust its driving state according to the driving state adjustment instruction in step 203, and will not be repeated here.
In some optional implementations of this embodiment, the control method for an unmanned vehicle may further include the following step: presenting at least one of the following on a preset terminal display screen: the vehicle driving environment information, the intended driving state, and the environmental perception information.
Here, the preset terminal display screen may be any of various terminal display screens pre-installed in the unmanned vehicle by technicians, for example a touch-sensitive tablet computer. Using a preset terminal display screen to present information can further improve the flexibility of the target user's interaction with the unmanned vehicle.
In some optional implementations of this embodiment, the control method for an unmanned vehicle may further include the following steps:
In response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, presenting obstacle information describing the obstacle. The obstacle information may be various information used to describe the obstacle. Here, the execution subject may determine whether there is an obstacle on the road being traveled by acquiring and analyzing images collected by the camera, or by acquiring and analyzing distance information collected by the ranging sensor. When the execution subject determines that there is an obstacle on the road being traveled, it may present the obstacle information describing the obstacle, either in display form through the terminal display screen or in voice form through the voice playback device.
In response to detecting the target user's obstacle confirmation operation on the obstacle information, generating an obstacle elimination instruction corresponding to the obstacle confirmation operation to control the unmanned vehicle to adjust its driving state. Here, the execution subject may present the obstacle information to the target user through the terminal display screen, so that the target user can perform the obstacle confirmation operation by manually selecting the obstacle information, for example through touch or key buttons; the execution subject can then detect the obstacle confirmation operation through a change in touch information or key information. The obstacle confirmation operation may be the target user's confirmation of the obstacle information. For example, the target user may click the obstacle information and then select "This obstacle is invalid" from "This obstacle is valid" and "This obstacle is invalid" associated with the obstacle information, where "This obstacle is invalid" indicates that the obstacle determined by the execution subject is wrong and "This obstacle is valid" indicates that it is correct. Here, if the target user selects "This obstacle is invalid", an obstacle elimination instruction is generated for adjusting the intended driving state planned for the obstacle to the obstacle-free driving state; if the target user selects "This obstacle is valid", an obstacle elimination instruction is generated for continuing to execute the intended driving state planned for the obstacle. It should be pointed out that detecting the obstacle confirmation operation helps the unmanned vehicle drive better, and can further improve the flexibility of the target user's interaction with the unmanned vehicle.
In addition, the execution subject may also play the obstacle information to the target user through the voice playback device. The target user can then perform the obstacle confirmation operation by selecting the obstacle information by voice, for example by uttering an obstacle removal control voice corresponding to the obstacle information, where the obstacle removal control voice includes the obstacle information. As an example, if the obstacle information is "There is a large obstacle 2 meters directly ahead", the obstacle removal control voice may be "There is a large obstacle 2 meters directly ahead, please ignore it". In this way, the execution subject can detect the obstacle confirmation operation through a voice change.
In addition, the execution subject may both present the obstacle information to the target user through the terminal display screen and play it through the voice playback device. The target user can then choose either or both of the manual selection method and the voice selection method to perform the obstacle confirmation operation.
In this embodiment, after detecting the target user's obstacle confirmation operation on the obstacle information, the execution subject may generate an obstacle elimination instruction corresponding to the obstacle confirmation operation and send it to the corresponding execution equipment of the unmanned vehicle, thereby controlling the unmanned vehicle to adjust its driving state. It should be pointed out that the way the execution subject controls the unmanned vehicle to adjust its driving state according to the obstacle elimination instruction is basically the same as the way of controlling the unmanned vehicle to adjust its driving state according to the driving state adjustment instruction in step 203, and will not be repeated here.
With continued reference to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the control method for an unmanned vehicle according to this embodiment. In the application scenario of Fig. 3, the control device 302 first acquires the vehicle driving environment information (for example, the brightness information outside the vehicle) collected by the information collection device of the unmanned vehicle 301, and then determines the intended driving state of the unmanned vehicle according to the vehicle driving environment information. Here, the control device 302 may compare the brightness indicated by the vehicle-exterior brightness information with a preset brightness threshold, and if the brightness is less than the preset brightness threshold, determine that the intended driving state is: turn on the exterior lights. Finally, in response to detecting the target user 303 clicking "Turn on exterior lights", a driving state adjustment instruction corresponding to the interactive operation is generated and sent to the exterior-light power supply. Specifically, if the target user 303 finds that the determined intended driving state of turning on the exterior lights does not match the actual environment, or does not want to turn on the exterior lights at the moment, the user can click "Turn on exterior lights" and then select "Do not turn on exterior lights" displayed in association with it. In this way, the control device can generate a driving state adjustment instruction based on the interactive operation of the target user 303 and keep the exterior-light power supply of the unmanned vehicle de-energized.
The control method for an unmanned vehicle provided by the above embodiment of the present application can acquire vehicle driving environment information, then determine the intended driving state of the unmanned vehicle according to the vehicle driving environment information, and finally, in response to detecting the target user's interactive operation for the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state. The method of this embodiment can detect the target user's interactive operation for the driving state and thereby control the unmanned vehicle to adjust its driving state based on the interactive operation, which helps improve the flexible control of the unmanned vehicle.
With further reference to Fig. 4, a flow 400 of another embodiment of the control method for an unmanned vehicle is shown. The flow 400 of the control method for an unmanned vehicle includes the following steps:
Step 401: Receive a vehicle driving route information set sent by a communicatively connected cloud server.
In this embodiment, the execution subject may receive the vehicle driving route information set from the cloud server through a wired or wireless connection. The vehicle driving route information includes road information. It should be pointed out that by directly receiving the vehicle driving route information set from the cloud server to determine the driving route, the execution subject can save the time spent determining a route, which helps improve the control efficiency of the unmanned vehicle.
Step 402: Select vehicle driving route information from the vehicle driving route information set to control the unmanned vehicle to travel along the road indicated by the selected vehicle driving route information.
In this embodiment, the execution subject may randomly select one piece of vehicle driving route information from the set, or select the vehicle driving route information corresponding to the shortest route.
Step 403: Acquire vehicle driving environment information.
Step 404: Determine the intended driving state of the unmanned vehicle according to the vehicle driving environment information.
Step 405: In response to detecting the target user's interactive operation for the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
In this embodiment, the specific operations of steps 403-405 are basically the same as those of steps 201-203 in the embodiment shown in Fig. 2, and will not be repeated here.
As can be seen from Fig. 4, compared with the embodiment corresponding to Fig. 2, the flow 400 of the control method for an unmanned vehicle in this embodiment embodies the step of receiving the vehicle driving route information set sent by the cloud server and the step of selecting vehicle driving route information from the set. Therefore, the solution described in this embodiment can save the time the execution subject spends determining a route, which helps improve the control efficiency of the unmanned vehicle.
With further reference to Fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of a control device for an unmanned vehicle. The device embodiment corresponds to the method embodiment shown in Fig. 2, and the device can be applied to various electronic devices.
As shown in Fig. 5, the control device 500 for an unmanned vehicle of this embodiment includes: an information acquiring unit 501 configured to acquire vehicle driving environment information; a state determining unit 502 configured to determine, according to the vehicle driving environment information, an intended driving state of the unmanned vehicle; and a first adjustment unit 503 configured to, in response to detecting a target user's interactive operation for the intended driving state, generate a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state.
In some optional implementations of this embodiment, the device may further include a second adjustment unit (not shown in the figure), which may be configured to, in response to receiving environmental perception information sent by an external device communicatively connected with the unmanned vehicle, control the unmanned vehicle to adjust its driving state according to the environmental perception information.
In some optional implementations of this embodiment, the device may further include an information display unit (not shown in the figure), which may be configured to present at least one of the following on a preset terminal display screen: the vehicle driving environment information, the intended driving state, and the environmental perception information.
In some optional implementations of this embodiment, the device may further include a route receiving unit and a route selection unit (not shown in the figure). The route receiving unit may be configured to receive a vehicle driving route information set sent by a communicatively connected cloud server. The route selection unit may be configured to select vehicle driving route information from the vehicle driving route information set to control the unmanned vehicle to travel along the road indicated by the selected vehicle driving route information.
In some optional implementations of this embodiment, the vehicle driving environment information may include at least one of the following: vehicle location information, environmental image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
In some optional implementations of this embodiment, the device may further include an information determining unit and a third adjustment unit (not shown in the figure). The information determining unit may be configured to, in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, present obstacle information describing the obstacle. The third adjustment unit may be configured to, in response to detecting the target user's obstacle confirmation operation on the obstacle information, generate an obstacle elimination instruction corresponding to the obstacle confirmation operation to control the unmanned vehicle to adjust its driving state.
In the device provided by the above embodiment of the present application, the information acquiring unit 501 acquires vehicle driving environment information; the state determining unit 502 then determines the intended driving state of the unmanned vehicle according to the vehicle driving environment information; finally, in response to detecting the target user's interactive operation for the intended driving state, the first adjustment unit 503 generates a driving state adjustment instruction corresponding to the interactive operation to control the unmanned vehicle to adjust its driving state. The device of this embodiment can detect the target user's interactive operation for the driving state and thereby control the unmanned vehicle to adjust its driving state based on the interactive operation, which helps improve the flexible control of the unmanned vehicle.
下面参考图6,其示出了适于用来实现本申请实施例的控制器的计算机***600的结构示意图。图6示出的控制器仅仅是一个示例,不应对本申请实施例的功能和使用范围带来任何限制。
如图6所示,计算机***600可以包括中央处理单元(CPU)601,其可以根据存储在只读存储器(ROM)602中的程序或者从存储部分606加载到随机访问存储器(RAM)603中的程序而执行各种适当的动作和处理。在RAM 603中,还存储有***600操作所需的各种程序和数据。CPU 601、ROM 602以及RAM 603通过总线604彼此相连。输入/输出(I/O)接口605也连接至总线604。
以下部件连接至I/O接口605:包括硬盘等的存储部分606;以及包括诸如LAN卡、调制解调器等的网络接口卡的通信部分607。通信部分607经由诸如因特网的网络执行通信处理。
需要说明的是,上述控制器可以单独存在,也可以安装在无人车中。当上述控制器安装在无人车中的时候,也可以将上述功能集成在 无人车的控制***的处理器中。此时,包括诸如摄像头、传感器、雷达等的输入部分,包括诸如液晶显示器(LCD)、扬声器等的输出部分,以及电机驱动器也可以根据需要连接至上述I/O接口605。电机驱动器可以根据CPU发送的控制信息,带动移动装置完成无人车的移动。驱动器也根据需要连接至I/O接口605。诸如磁盘、光盘、磁光盘、半导体存储器等等的可拆卸介质,可以根据需要安装在驱动器上,以便于从其上读出的计算机程序根据需要被安装入存储部分606。由此,中央处理单元(CPU)601在调用上述计算机程序执行控制无人车的功能时,可以控制输入部分从外部获取无人车的车辆行驶环境信息。
特别地,根据本申请的实施例,上文参考流程图描述的过程可以被实现为计算机软件程序。例如,本申请的实施例包括一种计算机程序产品,其包括承载在计算机可读介质上的计算机程序,该计算机程序包含用于执行流程图所示的方法的程序代码。在这样的实施例中,该计算机程序可以通过通信部分607从网络上被下载和安装。在该计算机程序被中央处理单元(CPU)601执行时,执行本申请的方法中限定的上述功能。
It should be noted that the computer-readable medium of the present application may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium containing or storing a program that can be used by or in combination with an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any appropriate medium, including but not limited to: wireless, wire, optical cable, RF, or any suitable combination of the above.
The flowcharts and block diagrams in the drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a portion of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor; for example, a processor may be described as including an information acquisition unit, a state determination unit, and a first adjustment unit. The names of these units do not, in some cases, constitute a limitation on the units themselves; for example, the information acquisition unit may also be described as "a unit for acquiring vehicle driving environment information".
As another aspect, the present application further provides a computer-readable medium. The computer-readable medium may be included in the apparatus described in the above embodiments, or may exist alone without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire vehicle driving environment information; determine, according to the vehicle driving environment information, a driving state to be adopted by the unmanned vehicle; and, in response to detecting an interactive operation by a target user on the driving state to be adopted, generate a driving state adjustment instruction corresponding to the interactive operation, so as to control the unmanned vehicle to adjust its driving state.
The above description is merely a preferred embodiment of the present application and an explanation of the applied technical principles. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, but should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the above inventive concept, for example, technical solutions formed by mutually replacing the above features with (but not limited to) technical features having similar functions disclosed in the present application.

Claims (15)

  1. A control method for an unmanned vehicle, comprising:
    acquiring vehicle driving environment information;
    determining, according to the vehicle driving environment information, a driving state to be adopted by the unmanned vehicle; and
    in response to detecting an interactive operation by a target user on the driving state to be adopted, generating a driving state adjustment instruction corresponding to the interactive operation, so as to control the unmanned vehicle to adjust its driving state.
  2. The method according to claim 1, wherein the method further comprises:
    in response to receiving environment perception information sent by an external device communicatively connected to the unmanned vehicle, controlling the unmanned vehicle to adjust its driving state according to the environment perception information.
  3. The method according to claim 2, wherein the method further comprises:
    presenting at least one of the following on a preset terminal display screen: the vehicle driving environment information, the driving state to be adopted, and the environment perception information.
  4. The method according to claim 1, wherein the method further comprises:
    receiving a set of vehicle driving route information sent by a communicatively connected cloud server; and
    selecting vehicle driving route information from the set of vehicle driving route information, so as to control the unmanned vehicle to travel along the road indicated by the selected vehicle driving route information.
  5. The method according to any one of claims 1-4, wherein the method further comprises:
    in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, presenting obstacle information describing the obstacle; and
    in response to detecting an obstacle confirmation operation by the target user on the obstacle information, generating an obstacle elimination instruction corresponding to the obstacle confirmation operation, so as to control the unmanned vehicle to adjust its driving state.
  6. The method according to any one of claims 1-4, wherein the vehicle driving environment information comprises at least one of the following: vehicle position information, environment image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
  7. A control apparatus for an unmanned vehicle, comprising:
    an information acquisition unit configured to acquire vehicle driving environment information;
    a state determination unit configured to determine, according to the vehicle driving environment information, a driving state to be adopted by the unmanned vehicle; and
    a first adjustment unit configured to, in response to detecting an interactive operation by a target user on the driving state to be adopted, generate a driving state adjustment instruction corresponding to the interactive operation, so as to control the unmanned vehicle to adjust its driving state.
  8. The apparatus according to claim 7, wherein the apparatus further comprises:
    a second adjustment unit configured to, in response to receiving environment perception information sent by an external device communicatively connected to the unmanned vehicle, control the unmanned vehicle to adjust its driving state according to the environment perception information.
  9. The apparatus according to claim 8, wherein the apparatus further comprises:
    an information display unit configured to present at least one of the following on a preset terminal display screen: the vehicle driving environment information, the driving state to be adopted, and the environment perception information.
  10. The apparatus according to claim 7, wherein the apparatus further comprises:
    a route receiving unit configured to receive a set of vehicle driving route information sent by a communicatively connected cloud server; and
    a route selection unit configured to select vehicle driving route information from the set of vehicle driving route information, so as to control the unmanned vehicle to travel along the road indicated by the selected vehicle driving route information.
  11. The apparatus according to any one of claims 7-10, wherein the apparatus further comprises:
    an information determination unit configured to, in response to determining that there is an obstacle on the road on which the unmanned vehicle is traveling, present obstacle information describing the obstacle; and
    a third adjustment unit configured to, in response to detecting an obstacle confirmation operation by the target user on the obstacle information, generate an obstacle elimination instruction corresponding to the obstacle confirmation operation, so as to control the unmanned vehicle to adjust its driving state.
  12. The apparatus according to any one of claims 7-10, wherein the vehicle driving environment information comprises at least one of the following: vehicle position information, environment image information, in-vehicle device switch information, and in-vehicle device energy consumption information.
  13. A controller, comprising:
    one or more processors; and
    a storage device on which one or more programs are stored,
    wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-6.
  14. An unmanned vehicle, comprising the controller according to claim 13.
  15. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1-6.
PCT/CN2019/112541 2019-01-15 2019-10-22 用于无人车的控制方法和装置 WO2020147360A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2020568740A JP2021532009A (ja) 2019-01-15 2019-10-22 無人車の制御方法および制御装置、コントローラ、無人車、コンピュータ可読記憶媒体並びにコンピュータプログラム
EP19910877.0A EP3812866A4 (en) 2019-01-15 2019-10-22 CONTROL METHOD AND DEVICE FOR DRIVERLESS VEHICLE
US17/118,590 US20210132614A1 (en) 2019-01-15 2020-12-10 Control method and apparatus for autonomous vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910037537.8 2019-01-15
CN201910037537.8A CN109709966B (zh) 2019-01-15 2019-01-15 用于无人车的控制方法和装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/118,590 Continuation US20210132614A1 (en) 2019-01-15 2020-12-10 Control method and apparatus for autonomous vehicle

Publications (1)

Publication Number Publication Date
WO2020147360A1 (zh)

Family

ID=66260114

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/112541 WO2020147360A1 (zh) 2019-01-15 2019-10-22 用于无人车的控制方法和装置

Country Status (5)

Country Link
US (1) US20210132614A1 (zh)
EP (1) EP3812866A4 (zh)
JP (1) JP2021532009A (zh)
CN (1) CN109709966B (zh)
WO (1) WO2020147360A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112017667A (zh) * 2020-09-04 2020-12-01 华人运通(上海)云计算科技有限公司 语音交互方法、车辆和计算机存储介质
CN114435383A (zh) * 2022-01-28 2022-05-06 中国第一汽车股份有限公司 一种控制方法、装置、设备及存储介质

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
CN109709966B (zh) * 2019-01-15 2021-12-07 阿波罗智能技术(北京)有限公司 用于无人车的控制方法和装置
CN110320911A (zh) * 2019-07-01 2019-10-11 百度在线网络技术(北京)有限公司 无人车控制方法、装置、无人车及存储介质
CN112141111B (zh) * 2020-09-02 2022-01-11 新石器慧义知行智驰(北京)科技有限公司 无人车行驶控制方法、装置及无人车
CN113479192B (zh) * 2021-07-06 2023-03-24 阿波罗智能技术(北京)有限公司 车辆泊出方法、车辆泊入方法、装置、设备以及存储介质
CN113692030A (zh) * 2021-08-25 2021-11-23 北京三快在线科技有限公司 无人车通信方法、装置、存储介质及电子设备
CN113895458B (zh) * 2021-10-26 2023-06-30 上海集度汽车有限公司 车辆驾驶行为的管理方法、装置、车辆及存储介质
CN114827470B (zh) * 2022-04-28 2024-05-28 新石器慧通(北京)科技有限公司 基于云台角度调整的无人车控制方法及装置
CN116279259A (zh) * 2023-02-27 2023-06-23 成都赛力斯科技有限公司 车辆控制***、方法以及智能车辆

Citations (6)

Publication number Priority date Publication date Assignee Title
US5318143A (en) * 1992-06-22 1994-06-07 The Texas A & M University System Method and apparatus for lane sensing for automatic vehicle steering
WO2014091611A1 (ja) * 2012-12-13 2014-06-19 株式会社日立製作所 自律走行装置
CN104571101A (zh) * 2013-10-17 2015-04-29 厦门英拓通讯科技有限公司 一种可实现车辆任意位置移动的***
CN107298021A (zh) * 2016-04-15 2017-10-27 松下电器(美国)知识产权公司 信息提示控制装置、自动驾驶车及其驾驶辅助***
CN107709127A (zh) * 2015-04-21 2018-02-16 松下知识产权经营株式会社 驾驶辅助方法以及利用了该驾驶辅助方法的驾驶辅助装置、自动驾驶控制装置、车辆、驾驶辅助程序
CN109709966A (zh) * 2019-01-15 2019-05-03 北京百度网讯科技有限公司 用于无人车的控制方法和装置

Family Cites Families (19)

Publication number Priority date Publication date Assignee Title
DE102013213039A1 (de) * 2013-07-03 2015-01-08 Continental Automotive Gmbh Assistenzsystem und Assistenzverfahren zur Unterstützung bei der Steuerung eines Kraftfahrzeugs
US9573592B2 (en) * 2014-12-23 2017-02-21 Toyota Motor Engineering & Manufacturing North America, Inc. Risk mitigation for autonomous vehicles relative to oncoming objects
US9975472B2 (en) * 2015-04-30 2018-05-22 Tyri International, Inc. Controllable lighting arrangement for a vehicle
DE102015012723B3 (de) * 2015-10-01 2017-01-12 Audi Ag Verfahren zum Koppeln von Betriebszuständen eines Kraftfahrzeugs und einer fahrzeugexternen Vorrichtung sowie Servereinrichtung
EP4180893A1 (en) * 2015-11-04 2023-05-17 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
DE102015225161A1 (de) * 2015-12-14 2017-06-14 Robert Bosch Gmbh Verfahren und Vorrichtung zum Empfangen von Datenwerten und zum Betreiben eines Fahrzeugs
US20170276504A1 (en) * 2016-03-22 2017-09-28 Toyota Jidosha Kabushiki Kaisha Vehicular Traffic Assistance Based on Traffic Management Decisions
US9827986B2 (en) * 2016-03-29 2017-11-28 Ford Global Technologies, Llc System and methods for adaptive cruise control based on user defined parameters
CN105741595B (zh) * 2016-04-27 2018-02-27 常州加美科技有限公司 一种基于云端数据库的无人驾驶车辆导航行车方法
KR101876968B1 (ko) * 2016-10-21 2018-07-12 네이버 주식회사 실내 자율 주행 로봇을 위한 제어 방법 및 시스템
CN106502248B (zh) * 2016-11-15 2019-11-05 百度在线网络技术(北京)有限公司 一种启动智能车辆的方法及装置
CN108422949B (zh) * 2017-02-15 2019-06-28 百度在线网络技术(北京)有限公司 用于无人驾驶车辆的信息共享方法、装置、***及设备
KR102181196B1 (ko) * 2017-06-23 2020-11-23 닛산 지도우샤 가부시키가이샤 주차 제어 방법 및 주차 제어 장치
EP3936966B1 (en) * 2017-07-07 2023-05-03 Zoox, Inc. Interactions between vehicle and teleoperations system
CN111465524A (zh) * 2017-12-07 2020-07-28 福特全球技术公司 动态车辆充电
CN108073174A (zh) * 2017-12-21 2018-05-25 重庆鲁班机器人技术研究院有限公司 无人车控制***和方法
CN108225364B (zh) * 2018-01-04 2021-07-06 吉林大学 一种无人驾驶汽车驾驶任务决策***及方法
US10611384B1 (en) * 2018-07-27 2020-04-07 Uatc, Llc Systems and methods for autonomous vehicle operator vigilance management
US10831207B1 (en) * 2018-08-22 2020-11-10 BlueOwl, LLC System and method for evaluating the performance of a vehicle operated by a driving automation system


Non-Patent Citations (1)

Title
See also references of EP3812866A4

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN112017667A (zh) * 2020-09-04 2020-12-01 华人运通(上海)云计算科技有限公司 语音交互方法、车辆和计算机存储介质
CN112017667B (zh) * 2020-09-04 2024-03-15 华人运通(上海)云计算科技有限公司 语音交互方法、车辆和计算机存储介质
CN114435383A (zh) * 2022-01-28 2022-05-06 中国第一汽车股份有限公司 一种控制方法、装置、设备及存储介质

Also Published As

Publication number Publication date
EP3812866A4 (en) 2022-01-26
CN109709966B (zh) 2021-12-07
CN109709966A (zh) 2019-05-03
US20210132614A1 (en) 2021-05-06
EP3812866A1 (en) 2021-04-28
JP2021532009A (ja) 2021-11-25

Similar Documents

Publication Publication Date Title
WO2020147360A1 (zh) 用于无人车的控制方法和装置
CN108090603B (zh) 为自动驾驶车辆管理车辆组的方法和***
WO2020107974A1 (zh) 用于无人驾驶车的避障方法和装置
JP5945999B1 (ja) 運転支援装置、運転支援システム、運転支援方法、運転支援プログラム及び自動運転車両
WO2017022199A1 (ja) 運転支援装置、運転支援システム、運転支援方法及び自動運転車両
CN112590813B (zh) 自动驾驶车辆信息生成方法、装置、电子设备和介质
CN109189054A (zh) 用于控制无人驾驶车执行路线验证的方法和装置
JP2020095481A (ja) 車両の制御装置及び自動運転システム
WO2020258602A1 (zh) 智能汽车的控制方法、装置及存储介质
KR102359497B1 (ko) 단일 차량 동작용으로 설계된 자율 주행 시스템에 따른 차량 플래툰 구현
CN114148341A (zh) 用于车辆的控制设备、方法及车辆
JP2019174992A (ja) 情報処理装置及びプログラム
US20220004187A1 (en) Display control device and display control method
JP6575915B2 (ja) 運転支援装置、運転支援システム、運転支援方法、運転支援プログラム及び自動運転車両
WO2021196985A1 (zh) 自动驾驶方法及装置
WO2023241564A1 (zh) 泊车方法、装置、车辆及存储介质
JP2022153363A (ja) サーバ装置及び情報処理方法並びにサーバプログラム
WO2022224311A1 (ja) 経路案内装置、経路案内方法、及び、経路案内プログラム
JP2018205294A (ja) 情報処理装置及び情報処理方法
US20230003535A1 (en) Rendezvous assistance system and rendezvous assistance method
JP6671019B2 (ja) 運転支援装置、運転支援システム、運転支援方法、運転支援プログラム、自動運転車両及び自動運転制御システム
CN114217613B (zh) 一种远程控制方法、装置和远程驾驶***
US20240198938A1 (en) Computing Systems And Methods For Generating User-Specific Automated Vehicle Actions
KR102332513B1 (ko) 군집 주차 제어 장치 및 그 방법
US11215472B2 (en) Information providing device and in-vehicle device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19910877

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020568740

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2019910877

Country of ref document: EP

Effective date: 20201209

NENP Non-entry into the national phase

Ref country code: DE