CN107515607A - Control method and device for unmanned vehicle - Google Patents

Control method and device for unmanned vehicle

Info

Publication number
CN107515607A
CN107515607A (application number CN201710791661.4A)
Authority
CN
China
Prior art keywords
avoidance
unmanned vehicle
parameter
deep learning
learning model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710791661.4A
Other languages
Chinese (zh)
Inventor
郑超
郁浩
闫泳杉
唐坤
张云飞
姜雨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201710791661.4A priority Critical patent/CN107515607A/en
Publication of CN107515607A publication Critical patent/CN107515607A/en
Priority to PCT/CN2018/098630 priority patent/WO2019047643A1/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

This application discloses a control method and device for an unmanned vehicle. One embodiment of the method includes: obtaining data collected by at least two sensors; inputting the obtained data into a pre-trained avoidance deep learning model, where the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle; and obtaining the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters. The embodiments of the present application obtain avoidance parameters through an avoidance deep learning model, thereby controlling the driving of the unmanned vehicle.

Description

Control method and device for unmanned vehicle
Technical field
The present application relates to the field of computer technology, in particular to the field of Internet technology, and more particularly to a control method and device for an unmanned vehicle.
Background art
An unmanned vehicle is a kind of intelligent automobile, also referred to as a wheeled mobile robot, which relies mainly on an in-vehicle intelligent driving system, centered on a computer system, to achieve unmanned driving.
An unmanned vehicle can detect road conditions through sensors while driving. In the prior art, however, detection is performed with a single sensor, so the detection result is easily affected by the surrounding environment and its stability is poor.
Summary of the invention
The purpose of the present application is to propose an improved control method and device for an unmanned vehicle, so as to solve the technical problems mentioned in the background section above.
In a first aspect, an embodiment of the present application provides a control method for an unmanned vehicle, where the unmanned vehicle is provided with at least two sensors, and the method includes: obtaining data collected by the at least two sensors; inputting the obtained data into a pre-trained avoidance deep learning model, where the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle; and obtaining the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters.
In some embodiments, the avoidance parameters include braking parameters and/or steering parameters.
In some embodiments, the at least two sensors include a camera, a lidar and a millimeter-wave radar.
In some embodiments, the avoidance deep learning model is trained in an end-to-end manner.
In some embodiments, before obtaining the data collected by the at least two sensors, the method further includes: obtaining data collected by the at least two sensors and obtaining current avoidance parameters of the unmanned vehicle, where the avoidance parameters are generated by the driving behavior of a user; and using the obtained data and the current avoidance parameters as the input and output of the avoidance deep learning model, so as to train the avoidance deep learning model.
In a second aspect, an embodiment of the present application provides a control device for an unmanned vehicle, where the unmanned vehicle is provided with at least two sensors, and the device includes: an acquiring unit configured to obtain data collected by the at least two sensors; an input unit configured to input the obtained data into a pre-trained avoidance deep learning model, where the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle; and a control unit configured to obtain the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters.
In some embodiments, the avoidance parameters include braking parameters and/or steering parameters.
In some embodiments, the at least two sensors include a camera, a lidar and a millimeter-wave radar.
In some embodiments, the avoidance deep learning model is trained in an end-to-end manner.
In some embodiments, the device further includes: a parameter acquiring unit configured to obtain data collected by the at least two sensors and to obtain current avoidance parameters of the unmanned vehicle, where the avoidance parameters are generated by the driving behavior of a user; and a training unit configured to use the obtained data and the current avoidance parameters as the input and output of the avoidance deep learning model, so as to train the avoidance deep learning model.
In a third aspect, an embodiment of the present application provides an unmanned vehicle, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of any embodiment of the control method for an unmanned vehicle.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, where the program, when executed by a processor, implements the method of any embodiment of the control method for an unmanned vehicle.
In the control method and device for an unmanned vehicle provided by the embodiments of the present application, the unmanned vehicle is provided with at least two sensors, and the method includes: first obtaining data collected by the at least two sensors; then inputting the obtained data into a pre-trained avoidance deep learning model, where the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle; and finally obtaining the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters. The embodiments of the present application obtain avoidance parameters through an avoidance deep learning model, thereby controlling the driving of the unmanned vehicle.
Brief description of the drawings
Other features, objects and advantages of the present application will become more apparent by reading the following detailed description of non-limiting embodiments made with reference to the accompanying drawings:
Fig. 1 is an exemplary system architecture diagram to which the present application may be applied;
Fig. 2 is a flowchart of one embodiment of the control method for an unmanned vehicle according to the present application;
Fig. 3 is a schematic diagram of an application scenario of the control method for an unmanned vehicle according to the present application;
Fig. 4 is a flowchart of another embodiment of the control method for an unmanned vehicle according to the present application;
Fig. 5 is a schematic structural diagram of one embodiment of the control device for an unmanned vehicle according to the present application;
Fig. 6 is a schematic structural diagram of a computer system suitable for implementing the unmanned vehicle of the embodiments of the present application.
Detailed description of the embodiments
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the related invention, rather than to limit the invention. It should also be noted that, for ease of description, only the parts related to the invention are shown in the drawings.
It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the control method for an unmanned vehicle or the control device for an unmanned vehicle of the present application may be applied.
As shown in Fig. 1, the system architecture 100 may include an unmanned vehicle 101, a network 102 and a server 103. The network 102 serves as a medium providing a communication link between the unmanned vehicle 101 and the server 103. The network 102 may include various connection types, such as wired or wireless communication links, or fiber optic cables.
A user may use the unmanned vehicle 101 to interact with the server 103 through the network 102 to receive or send messages. Various telecommunication client applications may be installed on the unmanned vehicle 101.
The unmanned vehicle 101 may be any of various electronic devices that support image acquisition and image processing, and may be an unmanned vehicle or the like.
The server 103 may be a server providing various services. The server 103 may perform processing such as analysis and feed the processing result back to the unmanned vehicle.
It should be noted that the image processing method for an unmanned vehicle provided by the embodiments of the present application is generally performed by the unmanned vehicle 101, and accordingly the image processing device for an unmanned vehicle is generally disposed in the unmanned vehicle 101.
It should be understood that the numbers of terminal devices, networks and servers in Fig. 1 are merely illustrative. There may be any number of terminal devices, networks and servers according to implementation needs.
With continued reference to Fig. 2, a flow 200 of one embodiment of the control method for an unmanned vehicle according to the present application is shown. The control method for an unmanned vehicle includes the following steps:
Step 201: obtain data collected by at least two sensors.
In this embodiment, the unmanned vehicle is provided with at least two sensors, and the unmanned vehicle on which the control method for an unmanned vehicle runs may obtain the data collected by the at least two sensors locally or from other electronic devices through a wired or wireless connection. Here, the number of each kind of sensor may be one, two or more.
In some optional implementations of this embodiment, the at least two sensors may include a camera, a lidar and a millimeter-wave radar.
Specifically, the camera may collect image data or video stream data. The lidar performs detection with laser light, and the collected return data are also laser signals. The millimeter-wave radar performs detection with millimeter waves, and the collected return data are millimeter-wave signals.
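By way of a non-limiting illustration (not part of the original disclosure), the multi-sensor readings described above might be bundled as follows in Python; the sensor driver objects and their read() methods are assumptions introduced only for this sketch.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorFrame:
    """One synchronized snapshot of the data collected by the sensors."""
    image: np.ndarray          # H x W x 3 camera frame
    lidar_points: np.ndarray   # N x 4 point cloud (x, y, z, intensity)
    radar_returns: np.ndarray  # M x 4 millimeter-wave returns (range, azimuth, velocity, rcs)

def acquire_frame(camera, lidar, radar) -> SensorFrame:
    # The driver objects and their read() methods are assumptions for illustration;
    # a real vehicle would use its own sensor SDKs and time-synchronize the readings.
    return SensorFrame(
        image=camera.read(),
        lidar_points=lidar.read(),
        radar_returns=radar.read(),
    )
```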
Step 202: input the obtained data into a pre-trained avoidance deep learning model.
In this embodiment, the unmanned vehicle may input the obtained data into a pre-trained avoidance deep learning model, so that the avoidance deep learning model produces an output according to the input data. The avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle. The avoidance parameters are the parameters involved when the unmanned vehicle avoids an obstacle, and are data that the unmanned vehicle uses immediately.
The above avoidance deep learning model may be obtained by training a classifier model such as a support vector machine (SVM) or a naive Bayesian model (NBM). In addition, the above model may also be formed by pre-training based on certain classification functions (such as the softmax function).
In some optional implementations of this embodiment, the avoidance deep learning model is trained in an end-to-end manner.
In this embodiment, a model trained in an end-to-end manner may take the data collected by the sensors as input and output the avoidance parameters to be used by the unmanned vehicle.
The avoidance deep learning model is a kind of deep neural network that can directly generate the vehicle's avoidance parameters from the collected data. Specifically, whichever kinds of data are used as input and output during training of the model, the corresponding output data can then be obtained from that kind of input data during application of the model.
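As a non-limiting illustration of such end-to-end inference (the checkpoint name, input tensor shapes and the [braking, steering] output layout are assumptions, not the disclosure's specification), a minimal PyTorch-style sketch might look like this:

```python
import torch

# Hypothetical inference pass: sensor data in, avoidance parameters out.
model = torch.jit.load("avoidance_model.pt")  # a pre-trained, scripted avoidance model
model.eval()

image_t = torch.rand(1, 3, 224, 224)   # camera frame after preprocessing
lidar_t = torch.rand(1, 1024)          # lidar features after preprocessing
radar_t = torch.rand(1, 128)           # millimeter-wave radar features after preprocessing

with torch.no_grad():
    avoidance = model(image_t, lidar_t, radar_t)  # assumed shape (1, 2)

braking_cmd, steering_cmd = avoidance.squeeze(0).tolist()
```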
In some optional implementations of this embodiment, the avoidance deep learning model includes:
a feature extraction component that extracts image features from the image collected by the camera, extracts first data features from the data collected by the lidar, and extracts second data features from the data collected by the millimeter-wave radar.
In this embodiment, feature extraction may be performed in the following manner: image features are extracted from the image collected by the camera, first data features are extracted from the laser data collected by the lidar, and second data features are extracted from the millimeter-wave data collected by the millimeter-wave radar.
It should be noted that "first" and "second" here carry no meaning of ranking; they merely distinguish between the data features.
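A minimal sketch of a fused network with one feature-extraction branch per sensor is given below for illustration only; all layer sizes and the two-value output (braking, steering) are assumptions rather than the disclosure's architecture.

```python
import torch
import torch.nn as nn

class AvoidanceNet(nn.Module):
    """Illustrative end-to-end avoidance model with one feature branch per sensor."""
    def __init__(self):
        super().__init__()
        # Image feature extraction (camera)
        self.image_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 64),
        )
        # First data features (lidar) and second data features (millimeter-wave radar)
        self.lidar_branch = nn.Sequential(nn.Linear(1024, 64), nn.ReLU())
        self.radar_branch = nn.Sequential(nn.Linear(128, 64), nn.ReLU())
        # Fusion head producing the avoidance parameters (braking, steering)
        self.head = nn.Sequential(nn.Linear(64 * 3, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, image, lidar, radar):
        features = torch.cat(
            [self.image_branch(image), self.lidar_branch(lidar), self.radar_branch(radar)],
            dim=1,
        )
        return self.head(features)
```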
Step 203: obtain the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters.
In this embodiment, the unmanned vehicle obtains the avoidance parameters that the avoidance deep learning model outputs according to the input data, so as to control the unmanned vehicle based on the avoidance parameters.
In some optional implementations of this embodiment, the avoidance parameters include braking parameters and/or steering parameters.
In this embodiment, the braking parameters are the parameters used for braking and may include the magnitude of the unmanned vehicle's deceleration as well as its direction. As a rule, the direction of the unmanned vehicle's deceleration is opposite to the direction of travel. The driving speed can be controlled through the braking parameters. The steering parameters are the parameters with which the unmanned vehicle steers and may be steering angles, such as the steering wheel angle. The driving direction can be controlled through the steering parameters. The unmanned vehicle may perform an avoidance operation by limiting or adjusting either of the above two avoidance parameters.
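For illustration only, applying such avoidance parameters to the vehicle might look like the following sketch; the actuator interface (set_brake, set_steering_angle) and the clamping limits are assumptions, not the disclosure's interface.

```python
def apply_avoidance(vehicle, braking: float, steering: float) -> None:
    # Illustrative actuator dispatch; vehicle.set_brake / set_steering_angle are
    # assumed interfaces, and the limits below are placeholder values.
    braking = max(0.0, min(braking, 1.0))       # normalized deceleration command
    steering = max(-30.0, min(steering, 30.0))  # steering-wheel angle in degrees
    vehicle.set_brake(braking)
    vehicle.set_steering_angle(steering)
```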
With continued reference to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the control method for an unmanned vehicle according to this embodiment. In the application scenario of Fig. 3, the unmanned vehicle 301 obtains the data 302 collected by the at least two sensors installed on the unmanned vehicle. Afterwards, the unmanned vehicle inputs the obtained data into a pre-trained avoidance deep learning model, where the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle. The unmanned vehicle then obtains the avoidance parameters 303 output by the avoidance deep learning model, so as to control 304 the unmanned vehicle based on the avoidance parameters.
The method provided by the above embodiment of the present application obtains avoidance parameters through an avoidance deep learning model, thereby controlling the driving of the unmanned vehicle.
With further reference to Fig. 4, a flow 400 of another embodiment of the control method for an unmanned vehicle is shown. The flow 400 of the control method for an unmanned vehicle includes the following steps:
Step 401: obtain data collected by at least two sensors, and obtain the current avoidance parameters of the unmanned vehicle.
In this embodiment, the model may be trained before it is applied. The unmanned vehicle may obtain the data collected by the at least two sensors and obtain the current avoidance parameters. The above avoidance parameters are generated by the driving behavior of a user, that is, they are the vehicle's avoidance parameters produced by the braking actions the user takes while driving the unmanned vehicle. The unmanned vehicle here may be a designated unmanned vehicle or any unmanned vehicle capable of carrying out model training. The at least two sensors here are installed on that unmanned vehicle.
Step 402: use the obtained data and the current avoidance parameters as the input and output of the avoidance deep learning model, so as to train the avoidance deep learning model.
In this embodiment, the unmanned vehicle uses the data obtained in step 401 and the current avoidance parameters as the input and output of the above avoidance deep learning model, so as to train the avoidance deep learning model.
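A non-limiting sketch of this supervised training step, assuming the recorded sensor data and the driver-generated avoidance parameters are paired as model inputs and targets (the optimizer, loss and batch layout are illustrative choices, not the disclosure's specification):

```python
import torch
import torch.nn as nn

def train_avoidance_model(model, loader, epochs=10, lr=1e-4):
    """loader yields (image, lidar, radar, avoidance_target) batches, where the
    targets are the avoidance parameters generated by the user's driving behavior."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.MSELoss()
    model.train()
    for _ in range(epochs):
        for image, lidar, radar, target in loader:
            optimizer.zero_grad()
            prediction = model(image, lidar, radar)  # predicted avoidance parameters
            loss = criterion(prediction, target)     # compare with driver behavior
            loss.backward()
            optimizer.step()
    return model
```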
Step 403: obtain data collected by the at least two sensors.
In this embodiment, the unmanned vehicle is provided with at least two sensors, and the unmanned vehicle on which the control method for an unmanned vehicle runs may obtain the data collected by the at least two sensors locally or from other electronic devices through a wired or wireless connection. Here, the number of each kind of sensor may be one, two or more.
In some optional implementations of this embodiment, the at least two sensors may include a camera, a lidar and a millimeter-wave radar.
Specifically, the camera may collect image data or video stream data. The lidar performs detection with laser light, and the collected return data are also laser signals. The millimeter-wave radar performs detection with millimeter waves, and the collected return data are millimeter-wave signals.
Step 404: input the obtained data into the pre-trained avoidance deep learning model.
In this embodiment, the unmanned vehicle may input the obtained data into the pre-trained avoidance deep learning model, so that the avoidance deep learning model produces an output according to the input data. The avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle. The avoidance parameters are the parameters involved when the unmanned vehicle avoids an obstacle, and are data that the unmanned vehicle uses immediately.
The above avoidance deep learning model may be obtained by training a classifier model such as a support vector machine (SVM) or a naive Bayesian model (NBM). In addition, the above model may also be formed by pre-training based on certain classification functions (such as the softmax function).
In some optional implementations of this embodiment, the avoidance deep learning model is trained in an end-to-end manner.
In this embodiment, a model trained in an end-to-end manner may take the data collected by the sensors as input and output the avoidance parameters to be used by the unmanned vehicle.
The avoidance deep learning model is a kind of deep neural network that can directly generate the vehicle's avoidance parameters from the collected data. Specifically, whichever kinds of data are used as input and output during training of the model, the corresponding output data can then be obtained from that kind of input data during application of the model.
Step 405: obtain the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters.
In this embodiment, the unmanned vehicle obtains the avoidance parameters that the avoidance deep learning model outputs according to the input data, so as to control the unmanned vehicle based on the avoidance parameters.
In some optional implementations of this embodiment, the avoidance parameters include braking parameters and/or steering parameters.
In this embodiment, the braking parameters are the parameters used for braking and may include the magnitude of the unmanned vehicle's deceleration as well as its direction. As a rule, the direction of the unmanned vehicle's deceleration is opposite to the direction of travel. The steering parameters are the parameters with which the unmanned vehicle steers and may be steering angles, such as the steering wheel angle. The unmanned vehicle may perform an avoidance operation by limiting or adjusting either of the above two avoidance parameters.
In this embodiment, by training the avoidance deep learning model end to end, the avoidance parameters can be obtained accurately when the model is applied.
With further reference to Fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of a control device for an unmanned vehicle. The device embodiment corresponds to the method embodiment shown in Fig. 2, and the device may specifically be applied to various electronic devices.
As shown in Fig. 5, the control device 500 for an unmanned vehicle of this embodiment includes an acquiring unit 501, an input unit 502 and a control unit 503. The acquiring unit 501 is configured to obtain data collected by at least two sensors; the input unit 502 is configured to input the obtained data into a pre-trained avoidance deep learning model, where the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle; and the control unit 503 is configured to obtain the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters.
In this embodiment, the acquiring unit 501 may obtain the data collected by the above at least two sensors locally or from other electronic devices through a wired or wireless connection. Here, the number of each kind of sensor may be one, two or more.
In this embodiment, the input unit 502 may input the obtained data into the pre-trained avoidance deep learning model, so that the avoidance deep learning model produces an output according to the input data. The avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle. The avoidance parameters are the parameters involved when the unmanned vehicle avoids an obstacle, and are data that the unmanned vehicle uses immediately.
In this embodiment, the control unit 503 obtains the avoidance parameters that the avoidance deep learning model outputs according to the input data, so as to control the unmanned vehicle based on the avoidance parameters.
In some optional implementations of this embodiment, the avoidance parameters include braking parameters and/or steering parameters.
In some optional implementations of this embodiment, the at least two sensors include a camera, a lidar and a millimeter-wave radar.
In some optional implementations of this embodiment, the avoidance deep learning model is trained in an end-to-end manner.
In some optional implementations of this embodiment, the device further includes: a parameter acquiring unit configured to obtain data collected by the at least two sensors and to obtain the current avoidance parameters of the unmanned vehicle, where the avoidance parameters are generated by the driving behavior of a user; and a training unit configured to use the obtained data and the current avoidance parameters as the input and output of the avoidance deep learning model, so as to train the avoidance deep learning model.
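For illustration only, the units of such a device could be organized as in the following sketch; the class and method names are assumptions rather than the disclosure's naming.

```python
class AvoidanceControlDevice:
    """Illustrative grouping of the acquiring, input and control units."""
    def __init__(self, sensors, model, vehicle):
        self.sensors = sensors  # at least two sensors installed on the unmanned vehicle
        self.model = model      # pre-trained avoidance deep learning model
        self.vehicle = vehicle  # assumed actuator interface of the unmanned vehicle

    def acquire(self):
        # Acquiring unit: obtain the data collected by the at least two sensors.
        return [sensor.read() for sensor in self.sensors]

    def infer(self, data):
        # Input unit: feed the obtained data into the avoidance deep learning model.
        return self.model(*data)

    def control(self):
        # Control unit: obtain the output avoidance parameters and control the vehicle.
        braking, steering = self.infer(self.acquire())
        self.vehicle.set_brake(braking)
        self.vehicle.set_steering_angle(steering)
```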
Referring now to Fig. 6, a schematic structural diagram of a computer system 600 suitable for implementing the unmanned vehicle of the embodiments of the present application is shown. The unmanned vehicle shown in Fig. 6 is only an example and should not impose any restriction on the functions or scope of use of the embodiments of the present application.
As shown in Fig. 6, the computer system 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage section 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data required for the operation of the system 600. The CPU 601, the ROM 602 and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse and the like; an output section 607 including a cathode ray tube (CRT), a liquid crystal display (LCD) and the like, as well as a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A driver 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disc, a magneto-optical disc or a semiconductor memory, is mounted on the driver 610 as needed, so that a computer program read from it can be installed into the storage section 608 as needed.
In particular, according to embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product, which includes a computer program carried on a computer-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. When the computer program is executed by the central processing unit (CPU) 601, the above functions defined in the method of the present application are performed. It should be noted that the computer-readable medium of the present application may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium that contains or stores a program which can be used by or in connection with an instruction execution system, apparatus or device. In the present application, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and such a medium may send, propagate or transmit a program for use by or in connection with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium may be transmitted by any appropriate medium, including but not limited to wireless means, electric wires, optical cables, RF, or any suitable combination of the above.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions and operations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, a program segment or a portion of code, which contains one or more executable instructions for implementing specified logical functions. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by special-purpose hardware-based systems that perform the specified functions or operations, or by combinations of special-purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor; for example, a processor may be described as including an acquiring unit, an input unit and a control unit. The names of these units do not, in some cases, constitute a limitation on the units themselves; for example, the acquiring unit may also be described as "a unit for obtaining data collected by at least two sensors".
As another aspect, the present application also provides a computer-readable medium, which may be included in the device described in the above embodiments, or may exist separately without being assembled into the device. The above computer-readable medium carries one or more programs which, when executed by the device, cause the device to: obtain data collected by at least two sensors; input the obtained data into a pre-trained avoidance deep learning model, where the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle; and obtain the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters.
The above description is only a preferred embodiment of the present application and an explanation of the applied technical principles. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the particular combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the present application.

Claims (12)

1. A control method for an unmanned vehicle, characterized in that the unmanned vehicle is provided with at least two sensors, and the method comprises:
obtaining data collected by the at least two sensors;
inputting the obtained data into a pre-trained avoidance deep learning model, wherein the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle; and
obtaining the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters.
2. The control method for an unmanned vehicle according to claim 1, characterized in that the avoidance parameters comprise braking parameters and/or steering parameters.
3. The control method for an unmanned vehicle according to claim 1, characterized in that the at least two sensors comprise a camera, a lidar and a millimeter-wave radar.
4. The control method for an unmanned vehicle according to claim 1, characterized in that the avoidance deep learning model is trained in an end-to-end manner.
5. The control method for an unmanned vehicle according to claim 1, characterized in that, before obtaining the data collected by the at least two sensors, the method further comprises:
obtaining data collected by the at least two sensors, and obtaining current avoidance parameters of the unmanned vehicle, the avoidance parameters being generated by the driving behavior of a user; and
using the obtained data and the current avoidance parameters as the input and output of the avoidance deep learning model, so as to train the avoidance deep learning model.
6. A control device for an unmanned vehicle, characterized in that the unmanned vehicle is provided with at least two sensors, and the device comprises:
an acquiring unit configured to obtain data collected by the at least two sensors;
an input unit configured to input the obtained data into a pre-trained avoidance deep learning model, wherein the avoidance deep learning model is used to characterize the correspondence between the data collected by the sensors and the avoidance parameters of the unmanned vehicle; and
a control unit configured to obtain the avoidance parameters of the unmanned vehicle output by the avoidance deep learning model, so as to control the unmanned vehicle based on the avoidance parameters.
7. The control device for an unmanned vehicle according to claim 6, characterized in that the avoidance parameters comprise braking parameters and/or steering parameters.
8. The control device for an unmanned vehicle according to claim 6, characterized in that the at least two sensors comprise a camera, a lidar and a millimeter-wave radar.
9. The control device for an unmanned vehicle according to claim 6, characterized in that the avoidance deep learning model is trained in an end-to-end manner.
10. The control device for an unmanned vehicle according to claim 6, characterized in that the device further comprises:
a parameter acquiring unit configured to obtain data collected by the at least two sensors and to obtain current avoidance parameters of the unmanned vehicle, the avoidance parameters being generated by the driving behavior of a user; and
a training unit configured to use the obtained data and the current avoidance parameters as the input and output of the avoidance deep learning model, so as to train the avoidance deep learning model.
11. An unmanned vehicle, comprising:
one or more processors; and
a storage device for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1-5.
12. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the method according to any one of claims 1-5.
CN201710791661.4A 2017-09-05 2017-09-05 Control method and device for unmanned vehicle Pending CN107515607A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710791661.4A CN107515607A (en) 2017-09-05 2017-09-05 Control method and device for unmanned vehicle
PCT/CN2018/098630 WO2019047643A1 (en) 2017-09-05 2018-08-03 Control method and device for unmanned vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710791661.4A CN107515607A (en) 2017-09-05 2017-09-05 Control method and device for unmanned vehicle

Publications (1)

Publication Number Publication Date
CN107515607A true CN107515607A (en) 2017-12-26

Family

ID=60725124

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710791661.4A Pending CN107515607A (en) 2017-09-05 2017-09-05 Control method and device for unmanned vehicle

Country Status (2)

Country Link
CN (1) CN107515607A (en)
WO (1) WO2019047643A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109141911A (en) * 2018-06-26 2019-01-04 百度在线网络技术(北京)有限公司 The acquisition methods and device of the control amount of unmanned vehicle performance test
CN109324608A (en) * 2018-08-31 2019-02-12 百度在线网络技术(北京)有限公司 Unmanned vehicle control method, device, equipment and storage medium
WO2019047643A1 (en) * 2017-09-05 2019-03-14 百度在线网络技术(北京)有限公司 Control method and device for unmanned vehicle
CN109693672A (en) * 2018-12-28 2019-04-30 百度在线网络技术(北京)有限公司 Method and apparatus for controlling pilotless automobile
WO2019179094A1 (en) * 2018-03-23 2019-09-26 广州汽车集团股份有限公司 Method and apparatus for maintaining driverless driveway, computer device, and storage medium
CN110967991A (en) * 2018-09-30 2020-04-07 百度(美国)有限责任公司 Method and device for determining vehicle control parameters, vehicle-mounted controller and unmanned vehicle
CN113705381A (en) * 2021-08-11 2021-11-26 北京百度网讯科技有限公司 Target detection method and device in foggy days, electronic equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106292704A (en) * 2016-09-07 2017-01-04 四川天辰智创科技有限公司 The method and device of avoiding barrier
CN106394555A (en) * 2016-08-29 2017-02-15 无锡卓信信息科技股份有限公司 Unmanned automobile obstacle avoidance system and method based on 3D camera
CN106742717A (en) * 2016-11-15 2017-05-31 江苏智石科技有限公司 A kind of intelligent magazine transport vehicle based on 3D cameras
CN106873566A (en) * 2017-03-14 2017-06-20 东北大学 A kind of unmanned logistic car based on deep learning
CN106950964A (en) * 2017-04-26 2017-07-14 北京理工大学 Nobody electronic university student's equation motorcycle race and its control method
CN107065890A (en) * 2017-06-02 2017-08-18 北京航空航天大学 A kind of unmanned vehicle intelligent barrier avoiding method and system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009089369A1 (en) * 2008-01-08 2009-07-16 Raytheon Sarcos, Llc Point and go navigation system and method
CN105843229B (en) * 2016-05-17 2018-12-18 中外合资沃得重工(中国)有限公司 Unmanned intelligent carriage and control method
CN106080590B (en) * 2016-06-12 2018-04-03 百度在线网络技术(北京)有限公司 The acquisition methods and device of control method for vehicle and device and decision model
CN106292666A (en) * 2016-08-29 2017-01-04 无锡卓信信息科技股份有限公司 Pilotless automobile barrier-avoiding method based on ultrasonic distance detection and system
CN206231471U (en) * 2016-10-11 2017-06-09 深圳市招科智控科技有限公司 A kind of unmanned bus of taxi pattern
CN106515728A (en) * 2016-12-22 2017-03-22 深圳市招科智控科技有限公司 System and method for avoiding collision and obstacle for a driverless bus
CN107515607A (en) * 2017-09-05 2017-12-26 百度在线网络技术(北京)有限公司 Control method and device for unmanned vehicle

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106394555A (en) * 2016-08-29 2017-02-15 无锡卓信信息科技股份有限公司 Unmanned automobile obstacle avoidance system and method based on 3D camera
CN106292704A (en) * 2016-09-07 2017-01-04 四川天辰智创科技有限公司 The method and device of avoiding barrier
CN106742717A (en) * 2016-11-15 2017-05-31 江苏智石科技有限公司 A kind of intelligent magazine transport vehicle based on 3D cameras
CN106873566A (en) * 2017-03-14 2017-06-20 东北大学 A kind of unmanned logistic car based on deep learning
CN106950964A (en) * 2017-04-26 2017-07-14 北京理工大学 Nobody electronic university student's equation motorcycle race and its control method
CN107065890A (en) * 2017-06-02 2017-08-18 北京航空航天大学 A kind of unmanned vehicle intelligent barrier avoiding method and system

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019047643A1 (en) * 2017-09-05 2019-03-14 百度在线网络技术(北京)有限公司 Control method and device for unmanned vehicle
WO2019179094A1 (en) * 2018-03-23 2019-09-26 广州汽车集团股份有限公司 Method and apparatus for maintaining driverless driveway, computer device, and storage medium
US11505187B2 (en) 2018-03-23 2022-11-22 Guangzhou Automobile Group Co., Ltd. Unmanned lane keeping method and device, computer device, and storage medium
CN109141911A (en) * 2018-06-26 2019-01-04 百度在线网络技术(北京)有限公司 The acquisition methods and device of the control amount of unmanned vehicle performance test
US11148674B2 (en) 2018-06-26 2021-10-19 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for acquiring control amount for performance test of unmanned vehicle
CN109324608A (en) * 2018-08-31 2019-02-12 百度在线网络技术(北京)有限公司 Unmanned vehicle control method, device, equipment and storage medium
US11320818B2 (en) 2018-08-31 2022-05-03 Apollo Intelligent Driving Technology (Beijing) Co., Ltd. Method, apparatus, device and storage medium for controlling unmanned vehicle
CN110967991A (en) * 2018-09-30 2020-04-07 百度(美国)有限责任公司 Method and device for determining vehicle control parameters, vehicle-mounted controller and unmanned vehicle
CN109693672A (en) * 2018-12-28 2019-04-30 百度在线网络技术(北京)有限公司 Method and apparatus for controlling pilotless automobile
CN109693672B (en) * 2018-12-28 2020-11-06 百度在线网络技术(北京)有限公司 Method and device for controlling an unmanned vehicle
CN113705381A (en) * 2021-08-11 2021-11-26 北京百度网讯科技有限公司 Target detection method and device in foggy days, electronic equipment and storage medium
CN113705381B (en) * 2021-08-11 2024-02-02 北京百度网讯科技有限公司 Target detection method and device for foggy days, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2019047643A1 (en) 2019-03-14

Similar Documents

Publication Publication Date Title
CN107515607A (en) Control method and device for unmanned vehicle
US11427215B2 (en) Systems and methods for generating a task offloading strategy for a vehicular edge-computing environment
EP3451230A1 (en) Method and apparatus for recognizing object
CN107063711A (en) Method and apparatus for testing unmanned vehicle
CN110654381B (en) Method and device for controlling a vehicle
CN112001287B (en) Point cloud information generation method and device for obstacle, electronic equipment and medium
CN109455180A (en) Method and apparatus for controlling unmanned vehicle
CN110390237A (en) Processing Method of Point-clouds and system
CN111598006B (en) Method and device for labeling objects
CN109878512A (en) Automatic Pilot control method, device, equipment and computer readable storage medium
US20210089792A1 (en) Method and apparatus for outputting information
CN107521500A (en) Information acquisition method and device
CN107830869A (en) Information output method and device for vehicle
CN115761702B (en) Vehicle track generation method, device, electronic equipment and computer readable medium
CN110696826A (en) Method and device for controlling a vehicle
CN112622923B (en) Method and device for controlling a vehicle
CN107527074A (en) Image processing method and device for vehicle
CN107622241A (en) Display methods and device for mobile device
CN110654380A (en) Method and device for controlling a vehicle
CN112649011B (en) Vehicle obstacle avoidance method, device, equipment and computer readable medium
CN109784129A (en) Information output method and device
CN110321854B (en) Method and apparatus for detecting target object
CN117218187A (en) Pedestrian position information generation method, device, equipment and computer readable medium
CN114523985B (en) Unmanned vehicle motion decision method and device based on sensing result of sensor
CN114724116B (en) Vehicle traffic information generation method, device, equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20171226